Fun With Dynamic Sites

Apr 26, 2006 • 3:33 pm | comments (2) by | Filed Under Search Engine Strategies 2006 Toronto

Chris Sherman moderates this session.

Mikkel deMib Svendsen is up first, sporting his bright red suit. He explains that search engines want your content. SEO basics: indexing, ranking, traffic, actions. With dynamic sites, the problems are normally with getting the site indexed; on the ranking side of things, you can easily make changes to templates to increase or tweak rankings. Dynamic Web site architecture: he explains how a dynamic site works, and that you can deploy URL rewrites, static replication and so on. What is not a problem? Content in a database is not an issue. A question mark in the URL is not an issue. Server side includes are also not an issue. Extension names are not an issue (i.e., .php, .cfm, .asp, .html). Indexing barriers include: long and ugly URLs, duplicate content (session IDs, click IDs, time-stamped URLs), spider traps (infinite loops), and server downtime and slow response. There are also indirect issues with dynamic sites: required support for cookies, JavaScript, Flash, etc.; geo-targeting and personalization; and form (POST method) based navigation. Some non-related issues: robots.txt problems and password protection. Solutions that work — there are many ways to solve the problem: (1) fix your system, (2) add a "bridge layer," and if that isn't possible, (3) replicate your content. One fix he calls the "one parameter Web site," which makes sure all URLs are limited to a single parameter. Identifying spiders: identify them on a global level (for session IDs, geo-targeting, spider traps) and look for the generic part of the agent name (Googlebot, msnbot, Slurp, etc.). Also think about building static Web pages (limit the use of dynamic pages, use dynamic objects on hard-coded pages, create a site map). You can also use paid inclusion, directories and PPC.
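Mikkel's spider-identification advice — match on the generic part of the agent name, then skip things like session IDs for those visitors — can be sketched roughly like this (the function name and example agent strings are my own illustration, not from the session):

```python
# Sketch of user-agent based spider detection, per Mikkel's advice:
# match on the generic part of the agent string so version changes
# (e.g. "Googlebot/2.1") don't break detection. The bot tokens below
# are the ones he named in the talk.
KNOWN_BOTS = ("googlebot", "msnbot", "slurp")

def is_search_spider(user_agent: str) -> bool:
    """Return True if the User-Agent contains a known crawler token."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

# A dynamic site would consult this before appending a session ID,
# so crawlers always see clean, duplicate-free URLs:
if is_search_spider("Mozilla/5.0 (compatible; Googlebot/2.1)"):
    pass  # serve URLs without session IDs or geo-targeted redirects
```

Matching a substring rather than the full agent string is the point here: the full string varies by version and platform, while the generic token stays stable.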

Jake Baillie from TrueLocal opens with a Dr. Phil slide. Common dynamic site issues: (1) Circular navigation — two different links that go to the same place. (2) Print-friendly pages — to fix, block them in robots.txt or generate them with CSS/JavaScript. (3) The canonical URL problem (what is my homepage: index.asp, default.html, / or what?). (4) Looks don't count — just add content. (5) Badly implemented mod_rewrite code and DNS errors with multiple domains — make sure to 301 redirect them. (6) Don't use a poorly written cloaking script. To prevent duplicate content, the same content should not be accessible via multiple URLs. What are rewritten URLs? He gives examples. If your site is already fully indexed by the search engines, don't switch to URL rewriting. He shows some of his "tasty tips," which I am not writing here.
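Jake's canonical-URL and 301 advice boils down to: pick one hostname and one homepage URL, and permanently redirect every variant to it so the same content is never reachable at multiple addresses. A minimal, framework-free sketch (the hostname and duplicate paths are hypothetical examples, not from the talk):

```python
# Minimal sketch of canonical-URL normalization via 301 redirect.
# CANONICAL_HOST and the duplicate homepage paths are hypothetical
# examples; a real site would substitute its own.
CANONICAL_HOST = "www.example.com"
DUPLICATE_HOMEPAGES = {"/index.asp", "/default.html", "/index.html"}

def canonical_redirect(host: str, path: str):
    """Return (301, location) if the request should be redirected to the
    canonical URL, or None if it is already canonical."""
    target_host = CANONICAL_HOST
    target_path = "/" if path in DUPLICATE_HOMEPAGES else path
    if (target_host, target_path) != (host, path):
        return 301, f"https://{target_host}{target_path}"
    return None  # already canonical, serve the page normally

# e.g. a request for example.com/index.asp gets a permanent redirect
# to https://www.example.com/ — one homepage URL for the engines.
```

Using a 301 (permanent) rather than a 302 is what lets the engines consolidate the duplicate URLs onto the canonical one, which is why Jake stresses it for the multiple-domain case too.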
