Successful Site Architecture | Wed. April 11, 2007, 4:45pm Session | Fundamentals Track
Moderator: Alex Bennert
Speakers: Matt Bailey, SiteLogic Marketing; Derrick Wheeler, Acxiom Digital
Walked into a busy room. The speakers from the previous talk are still here, meeting audience members and answering their questions. Quick visit with Matt Bailey, one of the speakers, who says he has new material for this presentation. Alex Bennert came up to say hello. Waved to Anne Kennedy. A Cre8asiteforums member recognized me, walked up, and introduced herself. That was fun. Another female SEO. It's so fun to have a face to go with a forums avatar or name. The room is, again, absolutely packed. Chris Winfield and his new wife, Danielle, are here somewhere. Saw them come in and did the wave thing. The room is warm and alive with chatter.
Derrick leads off with: what is successful site architecture? First, he talks about SE spiders. A search engine spider requests pages from websites. It must be able to follow links to your pages. It takes the pages back and stores them in an index; somebody searches, finds them, and comes and buys something. Funny diagram. SE spiders discover URLs, add them to a queue, and collect content elements to use. A search engine index is a scaled-down database. They boil your page down to the smallest amount of info that is relevant. They also use off-page factors. The algorithm asks: which page should I show from my index? There are different versions of the index, depending on where you are and the time of day. Always updating.
The six steps to success:
1. SE crawls the entire site.
2. SE indexes the entire site.
3. Users perform targeted queries.
4. SE ranks the appropriate pages.
5. Users click on the ranked listings.
6. Users take action or interact with the website.
Any change will impact one of these. It's a balance between users and search engines. He asks how many in the audience have a new site, are doing a redesign, or are tweaking an existing site.
Mastering the basics: figure out where you are today and measure over time so you know the impact of changes. Keep a list of the domains and subdomains you own and what they're doing. If you have other domains that are duplicates, your SEO needs to know this. Monitor your log files, not just analytics. Log files show what is making the request; you want to know which pages the SEs are requesting. You need to know if something is blocking your website from the SEs.
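To make the log-file point concrete, here is a small sketch of my own (not from the talk) that counts which pages the major spiders requested. It assumes your server writes the common Apache/NGINX "combined" log format and matches on the bot user-agent strings of the day; adjust both for your setup.

```python
import re
from collections import Counter

# Matches the request line, status, and user agent in "combined" log format.
# Assumption: your server logs in that format; adjust the regex otherwise.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# Yahoo!'s crawler identifies itself as "Slurp".
BOTS = ("Googlebot", "Slurp", "msnbot")

def spider_requests(lines):
    """Count (bot, path) pairs for requests made by search engine spiders."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        for bot in BOTS:
            if bot in m.group("agent"):
                hits[(bot, m.group("path"))] += 1
    return hits
```

Run it over a day's log and the pages the SEs never request (or request and get errors on) jump out immediately.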
Track your rankings for business-critical keywords at the major SEs. Collect data from different sources: Yahoo!, Google, MSN. Use tools like Omniture, Wordtracker, etc. WebPosition shows rankings. Keep monthly reports for your websites to spot problems earlier.
Footer links: he recommends using them. Link to the most important pages on your site, not all your partner sites or links pages. Keep it short and simple; fewer links means more weight for each one on the page. Short URLs are less complex and easier to follow. Pages many levels down are interpreted as less important on your site. The higher up the link, the more "important" you are saying the page is.
HTTP request/response cycle: the request carries the referring URL, user agent name, IP address, cookies for the domain, and more. The HTTP response carries a 3-digit status code (200, 301, 302, 404), HTML code, the location of a redirect, cookies, and more. Every request is met with a response. Some URLs have moved, for example. There are cookie communications in both directions.
200 OK - all is well, and here is your HTML. Don't serve a custom error page with a 200 OK code.
301 - permanent move of a page; redirect to the appropriate page.
302 - temporary move.
404 - not found; serve a custom error page.
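The "custom error page with a 200 OK" mistake is easy to make, so here is a minimal sketch of my own (a toy WSGI app, not from the talk) showing the distinction: the friendly error page is fine, as long as the status code underneath it is an honest 404.

```python
def app(environ, start_response):
    """Toy WSGI app: friendly custom error page, honest status code."""
    pages = {"/": b"<h1>Home</h1>"}
    path = environ.get("PATH_INFO", "/")
    if path in pages:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [pages[path]]
    # The custom error page is fine -- just never pair it with "200 OK",
    # or spiders will index every broken URL as a real page.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Sorry, we couldn't find that page.</h1>"]
```

Same pretty page for the user either way; what changes is what the spider is told.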
The circle of death: do not block your entire site with the robots.txt file; don't let a "Disallow: /" slip in. Don't require a cookie to access the site; SEs don't accept cookies. Don't force cookies to see a particular country's version. These can cause huge problems. Every URL should have unique content.
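For reference, the one-character difference he's warning about looks like this (my example, hypothetical paths):

```text
# robots.txt -- block one private section, crawl everything else
User-agent: *
Disallow: /checkout/

# The "circle of death" version: a bare slash blocks the entire site
# User-agent: *
# Disallow: /
```

Worth checking after every deploy, since a robots.txt meant for a staging server has a way of following the site to production.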
Breadcrumbs show paths. Related products and the navigation between them cause problems due to the breadcrumb setup: two links may end up at the same product, but the URL is different. He's talking about spider traps. Shows the T-Mobile site, clicks on View All results, and shows how clicking on two links extends the page URL with dis=true code. Session IDs are another cause of accidental duplicate pages: the same page ends up with many different URLs linking to it. Duplicate pages can also happen with http and https absolute URLs; SEs will think it's two different sites. Use relative links to be safe.
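A sketch of one way to fight this kind of accidental duplication (my own example; the parameter names like `sessionid` and `dis` are stand-ins for whatever your platform appends): normalize every URL before linking or logging, stripping the junk parameters and collapsing http/https onto one scheme.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameter names; substitute whatever your platform appends.
JUNK_PARAMS = {"sessionid", "sid", "dis"}

def canonicalize(url):
    """Map the many accidental variants of a page onto one canonical URL."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k.lower() not in JUNK_PARAMS]
    # Collapse http/https duplicates onto a single scheme as well.
    return urlunsplit(("http", netloc, path, urlencode(kept), ""))
```

Parameters that actually change the content (a color, a page number) survive; session and display cruft does not.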
Alex introduces Matt Bailey.
Architecture. Thanks everyone for hanging in there. Wants to make it simple. Shows a picture of Prince: we're gonna party like it's 1999. Looks at site architecture from an accessibility standpoint. Describes the Target lawsuit. The site had no alt attributes and made heavy use of image maps that you had to see visually; it forced you to use a mouse, which excludes keyboard-only and voice-only users. SEs want to index your website. Offer a sitemap. Use text links. Create a useful, information-rich site. Google has suggestions for how to be ranked, but Target didn't want to follow those guidelines. If you make the site accessible, it is crawlable. SEs are handicapped: they can't see, click, or eat cookies. If the site is accessible to special-needs users, SEs will get it too.
Alt attributes: shows the Target website without images. Empty pages; no content was visible. Screen readers can't use it. Selecting a country first? You have to hit a continue button, but SEs can't do that. Flash blocks SEs too. Cluttered URLs are too long and force line breaks; rewrite them to make sense to the user. A recent test showed that rewriting URLs to be shorter increased SE visits. It removes all the wildcards. People want to know the location, the directory, and where they are. Favicons put your branding on someone's browser.

When you redesign, carefully rewrite and redirect your pages and links. You have to redirect your traffic. Who is linking to your deep pages? You need to take the time to move and redirect so you don't lose traffic and links. Study your popular pages. Match the keyword traffic that is unique to the primary URL. Describes 301-redirecting an old directory to a new directory. Does not recommend meta redirects. MSN will describe how to set up 301s.
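The old-directory-to-new-directory 301 he describes looks roughly like this on Apache (my own hypothetical paths and domain, using mod_alias; other servers have equivalents):

```apache
# Redesign redirects: send the old directory to its new home with a 301
# so existing links and traffic follow. Paths are made up for illustration.
RedirectPermanent /old-catalog http://www.example.com/products

# Or per page, when popular deep pages moved individually:
Redirect 301 /old-catalog/widgets.html http://www.example.com/products/widgets.html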
CSS and standards. Does validated code rank better, or do sites using CSS rank higher? CSS separates content from markup and keeps the presentation external, reducing page clutter. CSS vs. tables: shows an example of how the engine reads the page top to bottom, stacking the tables for easier reading, so the navigation always comes first in the source with the content after it. Tables aren't "bad"; SEs just see the code differently. CSS eliminates this stacking of tables, so the focus is on the content. Validation can uncover coding errors and assures that spiders can index the content. It's not about rank; it's about making pages accessible to SEs.
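To illustrate the stacking point with my own minimal markup (not from the slides): in a table layout the navigation cell comes first in the source, so a spider reading top to bottom hits it before any content; with CSS the content can come first in the source while the stylesheet positions the nav visually.

```html
<!-- Table layout: nav cell is first in source order -->
<table><tr>
  <td><a href="/a">Nav</a> <a href="/b">More nav</a></td>
  <td><h1>Actual content</h1></td>
</tr></table>

<!-- CSS layout: content first in source, nav positioned into place -->
<style>
  body { margin-left: 160px; }  /* leave room for the nav column */
  #nav { position: absolute; left: 0; top: 0; width: 150px; }
</style>
<div id="content"><h1>Actual content</h1></div>
<div id="nav"><a href="/a">Nav</a> <a href="/b">More nav</a></div>
```

Visually the two pages look the same; what changes is what the spider reads first.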
Mobile phones are another factor. You want pages to gracefully degrade (graceful degradation); sometimes there are different style sheets for different browsers. Progressive enhancement starts at the lowest common denominator, so anyone on any device can access the content at a basic level, with additional functionality added in layers for richer experiences and newer technologies. Use Google's Webmaster Central for "great reports"; it's free. If you have accidentally disallowed your site, it will tell you. How many pages have been crawled? How often, what sections, what time? See the external links to your site. The Sitemaps.org format is accepted by MSN, Yahoo!, and Google.
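For reference, a minimal sitemap in the Sitemaps.org 0.9 format looks like this (the URL and values are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-04-11</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; the other three tags are optional hints to the engines.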