Keeping Those Pesky Search Bots Away from Test Pages

Nov 10, 2005 • 10:24 am | Filed Under SEO - Search Engine Optimization

A Search Engine Watch Forum thread that was just featured (it has been sitting in my "topics to discuss" bookmarks since yesterday) discusses Prevent Indexing While Testing New Site.

The solutions given seem, to me, to be geared toward a lower budget:

  • Set up my site with the host provider as an IP address and, after testing, change it to my domain name.
  • Add a robots.txt file to my site's root folder (I read on the Google site that this file is recognized and Google won't index the site. Not sure if other SEs will honor this file?)
  • Have my host provider set up a password on the default home page file (index.htm, default, etc.)
  • Run the test site at www.domain.com/testurl/ and block it with robots.txt (see the example after this list)
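
For reference, the robots.txt route only takes a couple of lines. A minimal sketch, assuming the test site lives under www.domain.com/testurl/ as in the last option (the directory name is just a placeholder):

    # robots.txt in the site root: asks all crawlers to stay out of the test directory
    User-agent: *
    Disallow: /testurl/

To keep crawlers off the whole site while testing, change the Disallow line to "Disallow: /". Keep in mind robots.txt is only a request; the major engines honor it, but it won't stop anyone who types the URL in directly.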

I do it a different way. I have test servers located locally at my company. We develop on those test servers, which are accessible via a private test URL. The only way in is with a user/pass provided to you through my client management system. When the work on the test server meets customer approval, we move it to the main server, which is accessible to those pesky spiders. :)
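
The post doesn't say what server software is involved, but for anyone who wants the password route without a client management system, HTTP Basic Auth on Apache is the usual low-budget equivalent. A sketch with assumed file paths and names:

    # .htaccess in the test site's root directory
    AuthType Basic
    AuthName "Test Site"
    # AuthUserFile must be an absolute path; keep it outside the web root
    AuthUserFile /home/user/.htpasswd
    Require valid-user

Create the password file once with "htpasswd -c /home/user/.htpasswd someuser", and Apache will prompt for credentials before serving anything in that directory, so spiders get a 401 instead of your test pages.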

