Successful Site Architecture

Dec 6, 2005 - 6:33 pm
Filed Under SES Chicago 2005


Moderated by Barbara Coll, WebMama.com Inc.

Derrick Wheeler, Marketleap

Came in late due to speaking in the previous session…as I entered, Derrick was talking about making sure that you do not accidentally get a site that is still under development indexed. If your development server is not password protected, you should ensure that a robots.txt exclusion (or robots exclusion tags in the site's pages) is in place, just in case Google or another search engine were somehow to find it. Session IDs work the other way around: you should turn session ID requirements off so that you do not dissuade the spider from indexing your content. Be careful with cookies; they sometimes also attach session IDs, which will hurt. Some sites require cookies to be turned on in order to view content; if this is the case, the search engines will not be able to see that content. Derrick shows quite a few examples of sites that were well ranked before session IDs/cookies were enabled/required…the "after" results showed a tremendous loss of rankings and traffic.
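As a concrete illustration (a minimal sketch, not from the session, using a hypothetical dev.example.com development host): a "block everything" robots.txt on the development server can be sanity-checked with Python's standard library robots.txt parser.

from urllib.robotparser import RobotFileParser

# A "block everything" robots.txt for a development server (hypothetical host):
dev_robots_txt = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(dev_robots_txt.splitlines())

# Every path should come back blocked for every well-behaved crawler.
for path in ["/", "/catalog/", "/product?sessionid=abc123"]:
    allowed = rp.can_fetch("*", "https://dev.example.com" + path)
    print(path, "allowed" if allowed else "blocked")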

The use of "weird redirects" can also inhibit your ability to rank/be indexed. He uses www.omahasteaks.com as an example, suggesting that you type it directly into the browser's address bar: each time you visit, you will see a different URL in the browser bar. Another issue is JavaScript requirements, which can cause problems with search engines; they will index "we're sorry, but you need to install JavaScript…" and so on. Another issue that can cause seemingly duplicate content is how you link within your site. All of your links on secure pages should use the full URL instead of a relative link, otherwise search engines may assume that all pages live under the https secure URL and are therefore not "indexable." He also mentions that you should use descriptive anchor text in your links, to help the search engines identify the content relevant to the link.
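To see why relative links on secure pages cause the problem described here, a small sketch (hypothetical URLs, Python standard library only) shows how a relative link found on an https page resolves to an https URL, while an absolute link pins the scheme explicitly:

from urllib.parse import urljoin

# A relative link encountered on a secure checkout page...
secure_page = "https://www.example.com/checkout/basket.html"
print(urljoin(secure_page, "../products/widget.html"))
# -> https://www.example.com/products/widget.html  (crawler sees an https duplicate)

# The same link written out in full leaves no ambiguity about scheme and host:
print(urljoin(secure_page, "http://www.example.com/products/widget.html"))
# -> http://www.example.com/products/widget.html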

It is good to have sitemaps, but if improperly created, they can cause trouble. He gives an example of a site that uses a JavaScript link to its sitemap, which is bad. URL structure has three main factors. The first is the number of parameters within your URL: one or two parameters, or even three, is fine, but you should be consistent; otherwise you risk a duplicate content misidentification. Second, shorter URLs are easier for people to link to, remember, and virally distribute. Finally, every unique URL should have specifically unique content. He shows an example of a jewelry site that serves duplicate content on different URLs, which is bad.
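A quick way to audit the parameter-count advice (a rough sketch with made-up URLs, not a tool shown in the session):

from urllib.parse import urlparse, parse_qs

def parameter_count(url):
    """Count the query-string parameters in a URL."""
    return len(parse_qs(urlparse(url).query))

urls = [
    "http://www.example.com/rings/gold-band",                                # 0: easy to link to and remember
    "http://www.example.com/item?cat=rings&id=42",                           # 2: still reasonable
    "http://www.example.com/item?cat=rings&id=42&sort=price&sessionid=abc",  # 4: flag for review
]
for url in urls:
    print(parameter_count(url), "parameter(s):", url)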

When selecting a domain, do a "lemon check" to ensure that no prior owner of the domain was penalized. Also, always think unique: if you have two or more domains indexed with the exact same content, this is bad; most major search engines specifically recommend against the practice. Derrick uses a 302 redirect instead of a 301 redirect because he has had better luck maintaining all the current links. Also to avoid: invisible text, small text, and link farming. Each page should have unique content, down to the META information. He suggests using CSS to present content. Note, added after the speakers were done: Barbara Coll highly recommends the free tools available for research purposes at the Marketleap website.
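For reference, here is a bare-bones sketch of what issuing such a redirect looks like at the HTTP level (Python standard library, hypothetical target domain); switching between the 302 and 301 behaviors discussed above is just a matter of the status code sent.

from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "http://www.example.com"   # hypothetical canonical domain
STATUS = 302                        # 302 = temporary; use 301 for a permanent move

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Preserve the requested path while pointing the client (or crawler)
        # at the canonical domain.
        self.send_response(STATUS)
        self.send_header("Location", TARGET + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()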

James Jeude, Ask Jeeves

He will focus on some things he has learned from his own experience. Improving your chances of being picked by users: visual relevance is critical in order to "get clicked." Ensure that the important keywords you want to highlight are surrounded by helpful words, so they can be used in the organic search result "snippet" description. Also, don't forget to organize and tag all of your images. Is spelling a weak suit for your audience? Decide on a strategy if you want to go after a lot of keywords: if you feel that a particular keyword will often be misspelled, try to find some way to place the misspelling in the content. He keeps it short because many of the topics were already covered by Derrick.

Rajat Mukherjee, Yahoo!

He will discuss a few new developments that Yahoo! has in store. Why is it important to be in the Yahoo! search index? Because the Y! network is the largest online audience in the world…yada yada (to quote Barry). Spidering and indexing are sequential processes, and you should optimize for both: the spider crawls, and the indexer removes duplicates and spam. Ensure your site is open to Slurp, the Y! crawler, through robots.txt. Navigation: always link back to your home page. Use unique content and avoid spam. What gets crawled: static URLs, and dynamic pages with in-links from static pages. You can also use feeds to send pages with dynamic content to Yahoo! for inclusion in its index.
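A production robots.txt, unlike the development one sketched earlier, should leave the site open to Slurp; the same standard-library parser can confirm this (hypothetical file and paths):

from urllib.robotparser import RobotFileParser

# Hypothetical production robots.txt: open to all crawlers except a private area.
robots_txt = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/", "/products/", "/private/reports.html"]:
    ok = rp.can_fetch("Slurp", "http://www.example.com" + path)
    print(path, "crawlable by Slurp" if ok else "blocked for Slurp")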

Site Explorer was released at SES San Jose, and more new features are being added. It was created specifically for webmasters, and it is incumbent on all developers to go "give it a shot." It is a set of tools and interfaces that lets webmasters explore a URL from the point of view of Yahoo!: it will show you which pages are indexed, your in-links, and many more "neat" things. He is very excited to announce that you can now submit URLs using various feeds (see today's announcement on the Yahoo! Search Blog). The tool can also now filter in-links by specific domain. They have also simplified the support process with a new URL: help.yahoo.com/search.
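The post does not detail the feed formats, but the simplest kind of URL feed is just a plain text file with one fully qualified URL per line; a throwaway sketch (hypothetical URLs and filename):

# Write a plain-text URL feed, one fully qualified URL per line.
urls = [
    "http://www.example.com/",
    "http://www.example.com/products/",
    "http://www.example.com/products/widget.html",
]

with open("urllist.txt", "w") as feed:
    feed.write("\n".join(urls) + "\n")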

The session then moved into specific questions about URLs. I will not blog these comments…come to the next SES and you too can participate and gain knowledge about your site from the panel of experts.

 
