Stop Spiders From Crawling Your Site on Shabbat, Including GoogleBot

Sep 9, 2009 • 8:57 am | comments (4) | Filed Under SEO - Search Engine Optimization

A Google Webmaster Help thread has an interesting discussion around blocking your site from both visitors and search engine crawlers on Shabbat (the Jewish Saturday). This is not a new topic; we have discussed using cloaking for religious Shabbat purposes in the past.

In short, some observant Jews do not want their site to be accessible on Shabbat, which runs from sundown Friday night to nightfall Saturday night. The issue on the SEO front is: if you turn off your site, what happens to the search engine crawlers? Do they get 404 pages, and does your site drop from the search index?

Phil Payne posted an answer on how one can handle this, which Googler JohnMu said was a good answer. Phil said:

Yes - a 503 is the correct server response for "We're closed". If you substitute a normal HTML page saying "We're closed" and serve a 200 it's very likely to get indexed by Google.

If you give the Googlebot a 503, it will just go away and come back later without indexing what you give it.

For humans, you can serve a custom 503 page that explains the situation. Are there no other Orthodox sites you can ask, to see how they do it?
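The advice in the answer above can be sketched in code. Below is a minimal WSGI example in Python that returns a 503 with a Retry-After header during the window, and normal content otherwise. The fixed hours are an assumption for illustration only: real sundown and nightfall times vary by date and location, so a production version would compute them from a zmanim or astronomy library.

```python
from datetime import datetime

# datetime.weekday(): Monday is 0, so Friday is 4 and Saturday is 5.
FRIDAY, SATURDAY = 4, 5

def is_shabbat(now: datetime) -> bool:
    """Rough placeholder check: Friday 18:00 through Saturday 20:00.

    Real sundown/nightfall times depend on the date and location;
    these fixed hours are assumptions for the sake of the sketch.
    """
    if now.weekday() == FRIDAY and now.hour >= 18:
        return True
    if now.weekday() == SATURDAY and now.hour < 20:
        return True
    return False

def wsgi_app(environ, start_response):
    """Minimal WSGI app: answer 503 during the window, 200 otherwise."""
    if is_shabbat(datetime.now()):
        # Retry-After (in seconds) hints to crawlers when to come back,
        # so Googlebot goes away without indexing the closed page.
        start_response("503 Service Unavailable",
                       [("Content-Type", "text/html"),
                        ("Retry-After", "86400")])
        return [b"<h1>We're closed for Shabbat.</h1>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Welcome!</h1>"]
```

The key point, per Phil's answer, is the status code: the same "We're closed" page served with a 200 risks getting indexed, while a 503 tells the crawler this is temporary.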

Now, Friday night here is not the same as Friday night where you are. So detecting the location of a visitor is key here. There are services like Saturday Guard that do this for you, but I am not sure how they handle search bots.
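One way to approach the location problem is to map the visitor's IP address to a timezone and then evaluate the closing window against their local clock. This is only a sketch: the `ip_to_timezone` lookup here is a hypothetical stand-in (with made-up documentation-range IPs) for a real GeoIP database or service.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def ip_to_timezone(ip: str) -> str:
    """Hypothetical stand-in for a real IP-to-location lookup.

    A production site would query a GeoIP database here; the IPs
    below are reserved documentation addresses, not real visitors.
    """
    lookup = {"203.0.113.7": "Asia/Jerusalem",
              "198.51.100.9": "America/New_York"}
    return lookup.get(ip, "UTC")

def visitor_local_time(ip: str, utc_now: datetime) -> datetime:
    """Convert the server's UTC clock to the visitor's wall-clock time."""
    return utc_now.astimezone(ZoneInfo(ip_to_timezone(ip)))
```

With the visitor's local time in hand, the same Friday-sundown-to-Saturday-nightfall check can be applied per visitor instead of using the server's clock.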

Technically, the issue, as far as I understand it (I am not a Rabbi, but I am an observant Jew), is that they do not want to earn money on Shabbat or Jewish holidays. Some hold that since the money doesn't transfer from the merchant account to the bank that day, no money is technically earned that day. But others do not hold that way, or want to be extra careful. If it is only a matter of money, then just turn off the "add to cart" and shopping cart features for the site.
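The turn-off-the-cart approach can be as simple as a template flag that hides the buy button while leaving the rest of the site up. A minimal sketch, assuming a placeholder `is_shabbat()` check with fixed illustrative hours (real times would come from sundown calculations) and a hypothetical `render_product_page` template function:

```python
from datetime import datetime

def is_shabbat(now: datetime) -> bool:
    # Placeholder window: Friday 18:00 through Saturday 20:00.
    # Real sundown/nightfall times vary by date and location.
    return ((now.weekday() == 4 and now.hour >= 18) or
            (now.weekday() == 5 and now.hour < 20))

def render_product_page(name: str, now: datetime) -> str:
    """Hypothetical template: swap the buy button for a notice."""
    if is_shabbat(now):
        button = "<p>Purchasing is paused for Shabbat.</p>"
    else:
        button = "<button>Add to cart</button>"
    return f"<h1>{name}</h1>{button}"
```

The site stays browsable and indexable around the clock; only the commerce feature is disabled during the window.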

If they do not want any activity on their site by potential customers, then I guess a 503 is a good answer. But are search engine bots customers? No. I suspect most Rabbis would be okay with spiders or automated crawlers accessing the site on Shabbat. The question then is whether you are allowed to serve a 503 page to a visitor but not to a crawler - that might be against Google's terms of service and fall within its policies against cloaking.

If the issue is the server itself working on Shabbat, then a 503 cannot really be served at all: you would technically need to power down the server, and without a server to send the 503 response code, you've got nothing.

This is a complex issue that I personally have never had to deal with on sites we have built. But it would be interesting to see what happens in the case of turning off a web server entirely. There isn't much Google can do here.

Forum discussion at Google Webmaster Help.
