Below is live coverage of the Technical SEO Issues For Developers session from the SMX West 2009 conference.
This coverage is provided by Keri Morgret of Morgret Designs.
We are using a live blogging tool to provide the real-time coverage. We will publish the archive below after the session is completed. In addition, you can interact with us while we are live blogging, so feel free to ask questions as we blog.
|Keri Morgret: Starting with Maile from Google talking about URL structures (starting a bit late).|
|Keri Morgret: Things to disallow for crawling: contact us forms.|
|Keri Morgret: Avoid maverick coding practices. Discourage alternative encodings. Don't use things like QQ instead of &. |
|Keri Morgret: Eliminate positional encoding. Shows slide of someone that uses a bunch of ones and zeros to expand and collapse lots of stuff, leads to infinite URLs. Fix: Limit category expansion to one.|
|Keri Morgret: Remove session IDs from paths or positions.|
|Keri Morgret: In general, make things easy for Google. Don't make them try to figure out the pattern of your website by using your own special way of doing your URLs.|
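The session-ID advice above can be sketched in code. A minimal, hypothetical example (the parameter names like PHPSESSID are assumptions; real sites vary) that strips session-ID parameters so every visitor and crawler sees one stable URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical session-ID parameter names; substitute your own.
SESSION_PARAMS = {"PHPSESSID", "jsessionid", "sid"}

def strip_session_id(url):
    """Drop session-ID query parameters so the same page
    always lives at one stable, crawlable URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_session_id("http://example.com/page?id=3&PHPSESSID=abc123"))
# http://example.com/page?id=3
```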
|Keri Morgret: Maile is working on making the web prettier. She's looking at some common CMSs and figuring out the ways they create infinite crawlable URL spaces that you don't want.|
|Keri Morgret: Join Webmaster Tools! The Message Center will give you notification when they find infinite crawl spaces. Visit code.google.com/doctype. |
|Keri Morgret: Patrick Bennett from Modern Blue is up next.|
|Keri Morgret: A new acronym! SUMIA. Sitemaps - URLs - Meta tags - Infrastructure - Analytics.|
|Keri Morgret: We want sitemaps so that Google can know about what pages we have, especially when we create a new site. It can help you get exposed, but doesn't necessarily improve ranking.|
You need both human sitemaps and XML sitemaps.
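For reference, an XML sitemap is just a list of URLs in the sitemaps.org format. A minimal hypothetical file (URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-02-10</lastmod>
  </url>
</urlset>
```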
|Keri Morgret: U = URLs.|
Canonicalization. To use www or not www, that is the question. Whatever you decide, keep it consistent. But what happens if someone links to you the wrong way? Use a redirect. Search for URL canonicalization and you'll find examples of what to do in an .htaccess file.
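A common sketch of such an .htaccess rule, assuming Apache with mod_rewrite and that you've chosen the www version (the non-www case just reverses the pattern):

```apache
RewriteEngine On
# 301-redirect non-www requests to the www hostname.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```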
|Keri Morgret: URL Return codes (header status codes). Need to make sure you have the right code!|
200 OK (everything is fine)
301 Moved Permanently
302 Temporary redirect
404 Not Found
He gives an example of what his .htaccess file looks like.
|Keri Morgret: Be sure to not have a soft 404 -- don't report a 200 OK when the page really isn't found. Google Webmaster Central Blog has detailed information about this.|
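A quick way to verify the codes above is to request a URL and inspect the status yourself. A self-contained Python sketch (it spins up a throwaway local server purely so the example is runnable; note that urlopen follows redirects, so a 301 reports the final code):

```python
import http.server
import threading
import urllib.request
import urllib.error

def check_status(url):
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# Throwaway local server: "/" returns 200, everything else 404.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_error(404)
    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

ok = check_status(f"http://127.0.0.1:{port}/")
missing = check_status(f"http://127.0.0.1:{port}/missing")
print(ok, missing)  # 200 404
server.shutdown()
```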
|Keri Morgret: Meta and title tags. Have them, be sure to be able to change them.|
|Keri Morgret: I = Infrastructure. He's joining the Keep it Clean movement. |
His suggestions for code:
Keep it clean: XHTML/CSS
Use necessary tags for important content:
|[Comment From Prashant]|
What website/tool do you suggest using for checking the status code?
|Keri Morgret: @prashant There are firefox extensions that can check this, and xenu link sleuth does so as well. |
|Keri Morgret: A = Analytics. You CANNOT live without analytics. Learn the program you're using. Google Analytics has a lot you can use for free. You need this to know how your site is doing.|
|Keri Morgret: Arnab from Yahoo! is up next.|
|Keri Morgret: Great picture of the OPPOSITE of simplicity!|
|Keri Morgret: Be simple. Follow standards. Use static HTML, meaningful page titles, clear anchor text, and don't link to spam.|
|Keri Morgret: Brevity: less is more for URLs. Use clean URLs without session IDs and with few query parameters; be simple. |
Have the URLs scream COPY ME!
|Keri Morgret: Have stable URLs, don't use several URLs for same page. This leads to fragmented anchor text and link popularity, and wasted crawler and website bandwidth.|
|Keri Morgret: He talks about open standards and search monkey and microformats.|
|[Comment From Guest]|
Is there an allowed way or method to do URL tracking?
|Keri Morgret: Improve your crawler discovery by leveraging Sitemaps and Robots.txt/meta tag exclusion.|
|Keri Morgret: Do be sure to use robots.txt exclusion only if required and fully understood. People do screw this up.|
|Keri Morgret: Validate what you use! Google has a way to validate your robots.txt. |
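You can also sanity-check robots.txt rules locally with Python's standard-library parser. A small sketch with hypothetical paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block the contact form and search results.
rules = """\
User-agent: *
Disallow: /contact-form/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/contact-form/thanks"))  # False
print(rp.can_fetch("*", "http://example.com/products/widget"))      # True
```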
|Keri Morgret: Use Site Explorer a lot when you're developing a site. Find out what's indexed, how people link to your site, submit your site maps, dealing with dynamic URLs and deleting URLs.|
Site Explorer allows you to tell Yahoo! what your parameters are so they don't have to figure it out. Tell them what your session IDs, source trackers, and format modifiers are.
|Keri Morgret: Michael Gray is up next.|
|[Comment From Vanessa Fox]|
For URL tracking, check out http://janeandrobot.com/post/URL-Referrer-Tracking.aspx
|Keri Morgret: Thanks Vanessa!|
|Keri Morgret: Michael is going to walk us through a case study.|
They started with analytics -- they hadn't been running any analytics at all. They were getting 200-500 uniques a day, and about 250,000 URLs, and Google only had about 10,000 of them in the index.
|Keri Morgret: He shows the poor URL structures that were in effect with lots of parameters, long ids. Lots of stuff you carried around in the URL. This was bad and confused the spiders.|
|Keri Morgret: Their fix: stripped it down as small as they could:|
301s from old IDs to new
Put stuff into cookies instead of URL
And some other wonderful magic things.
|Keri Morgret: He improved their page title structure, put the important information first.|
|Keri Morgret: Set up mini-sitemaps, made them interconnected. Once they did this, they added in a breadcrumb trail at the top of the mini-sitemaps. Helps spiders to find new pages. Helps establish a hierarchy of data.|
|Keri Morgret: Fixed their anchor text. The typical "click here" and "more information" anchor text had been in place.|
|Keri Morgret: What happened? Traffic crashed! Lots of nasty emails and calls. Client was not happy. But patience paid off, then they started getting up to 5,000 uniques a day, then 60,000 uniques a day, now at 90,000 uniques a day. From a few hundred uniques a day in June to 90,000 uniques today.|
|Keri Morgret: Takeaways:|
Make it easy to crawl.
Make URLs search friendly.
Titles and content closer to the top.
Be whitehat, no tricks.
|Keri Morgret: Yes, that was Michael Gray saying that he used all white hat.|
|Keri Morgret: The Q&A session is starting and my computer may be crashing.|
|Keri Morgret: Maile: Talking about iFrames. They're good for gadgets, but not the home page.|
|Keri Morgret: Vanessa comments on Michael's suggestion about title tags. She shows an eyetracking study of how people look towards the left of the screen; you want the keywords at the left, where they look first.|
|Keri Morgret: She suggests using Live HTTP headers extension.|
Look at janeandrobot.com/post/URL-Referrer-Tracking.aspx
|Keri Morgret: Michael 301s URL tracking parameters to a cookie then gives them to the proper URL. |
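A sketch of that idea, with hypothetical parameter names: the tracking values are split off the URL (to be stored in a cookie by the redirect handler) and the request is then 301-redirected to the clean URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking-parameter names; real sites define their own.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "src"}

def split_tracking(url):
    """Return (canonical_url, tracking_dict): tracking parameters move
    into the dict (e.g. to be set as a cookie before issuing a 301),
    and the URL keeps only content parameters."""
    parts = urlsplit(url)
    kept, tracking = [], {}
    for key, value in parse_qsl(parts.query):
        if key in TRACKING_PARAMS:
            tracking[key] = value
        else:
            kept.append((key, value))
    clean = urlunsplit(parts._replace(query=urlencode(kept)))
    return clean, tracking

print(split_tracking("http://example.com/p?id=7&utm_source=news"))
# ('http://example.com/p?id=7', {'utm_source': 'news'})
```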
|Keri Morgret: Actual filename vs. directory (e.g., for a contact page). Either way is fine for a search engine, though Michael points out that having it go to a directory is easier if you later change technologies and go from .html to .asp extensions, for example.|
|Keri Morgret: Question about international encodings in search. Google doesn't have detailed recommendations yet about international encodings, but has a post coming soon.|
|Keri Morgret: Absolute URLs vs relative URLs. Absolute is preferred by panelists. Michael points out how absolute helps with being scraped (gets you a link), and canonicalization.|
|Keri Morgret: Will having partial sitemap (leaving out some URLs from XML sitemaps) be a problem?|
|Keri Morgret: Ending coverage, my computer is crashing.|