Many industries rely on simultaneous innovation in order to grow and prosper. New ideas are scrutinized, and ways to improve them fill the minds of competitors nearly from the concept stage as word leaks out. Search engine optimization works this way, as most expert SEO practitioners constantly monitor discussions in industry forums and the collective research that takes place when algorithms change or new engines appear. Many would argue that people who push the limits and fail to follow strict guidelines (read: "black hats") consistently help those who abide by search engine-approved best practices, and vice versa.
A new thread at Search Engine Watch forums explores methods for marketers to keep an eye on actual SEO performed on the websites of competitors. The original poster decided to start a discussion about the potential value of performing the following:
Analyze your top 3 competitors' source code and robots.txt file (if it's not been hidden)

There are just a few responses so far, but this has the makings of a great discussion highlighting some important things that can be monitored competition-wise. One member posits that competitive analysis is paramount to success:
By understanding the necessary benchmarks for a particular industry or market, one can calculate the financial barrier to entry, undertake strategic niche planning, create an effective SEO plan, set specific goals and much more. By carefully studying the playing field you learn IF you can compete and HOW to compete. You can set specific goals and run evaluations.

Certainly the thread should yield some recommendations for valuable analysis tools. One thing to keep in mind is that in many industries, people know they are being watched. In fact, one of our engineers, Brian, was telling me the other day that he had friends who would create fake META tags and ridiculous keyword lists, only to find them in competitors' source code within days!
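For readers who want to try this kind of analysis themselves, here is a minimal sketch in Python of the two checks mentioned above: pulling the META tags out of a page's source and listing the Disallow rules from a robots.txt file. It uses only the standard library; the sample markup and robots.txt content are hypothetical stand-ins for whatever a real competitor's site returns.

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collects <meta name="..." content="..."> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name:
                self.meta[name] = attrs.get("content", "")

def disallowed_paths(robots_txt):
    """Returns the paths listed on Disallow: lines of a robots.txt file."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

# Hypothetical competitor page and robots.txt content:
html = ('<html><head>'
        '<meta name="keywords" content="widgets, widget store">'
        '<meta name="description" content="Buy widgets online">'
        '</head><body></body></html>')
parser = MetaTagExtractor()
parser.feed(html)
print(parser.meta["keywords"])   # -> widgets, widget store

robots = "User-agent: *\nDisallow: /private/\nDisallow: /tmp/\n"
print(disallowed_paths(robots))  # -> ['/private/', '/tmp/']
```

In practice you would fetch the live documents (e.g. with `urllib.request`) and run them through the same parsing, keeping in mind the caveat above: some sites seed their tags with decoys precisely because they know competitors are watching.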
Share your secrets at Search Engine Watch Forums.