Keeping Those Pesky Search Bots Away from Test Pages

Nov 10, 2005 - 10:24 am 0 by

A Search Engine Watch Forums thread that was just featured (it had been in my "topics to discuss" bookmarks since yesterday) discusses Prevent Indexing While Testing New Site.

The solutions given seem, to me, to be geared toward a lower budget:

  • Set up my site with the host provider under an IP address and, after testing, change it over to my domain name.
  • Add a robots.txt file to my site root folder (I read on the Google site that this file is recognized and Google won't index the site. Not sure if other SEs will honor this file?)
  • Have my host provider set up a password on the default home page file (index.htm, default, etc.)
  • Run the test site at www.domain.com/testurl/ and block it with robots.txt
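For the robots.txt options above, a minimal file at the site root might look like the following. Note this is advisory only: the major engines honor it, but it does not actually protect the pages, and the blocked URLs can still show up as bare listings if other sites link to them.

```
# robots.txt at the site root -- asks all compliant crawlers
# to stay out of the entire site while it is under test
User-agent: *
Disallow: /

# Or, to block only a test directory (the last bullet above):
# User-agent: *
# Disallow: /testurl/
```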

I do it a different way. I have test servers located locally at my company. We develop on those test servers, which are accessible via a private test URL. The only way to access them is with a user/pass provided to you within my client management system. When the work on the test server meets customer approval, we move it to the main server, which is accessible to those pesky spiders. :)
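The password approach (whether on a host-provided test URL or the default home page, as in the third bullet) is the only one of these that actually keeps spiders out rather than politely asking. On an Apache server it is a few lines of config; this is a sketch assuming Apache with basic auth enabled, and the file paths are placeholders:

```
# .htaccess in the test site's document root (assumes Apache)
# Create the password file first with:  htpasswd -c /path/to/.htpasswd username
AuthType Basic
AuthName "Test Site - Authorized Users Only"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Crawlers that hit the test URL get a 401 and move on, so nothing gets indexed even if the URL leaks.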
