Search Engineers Q&A

Feb 27, 2008 - 8:41 pm
Filed Under SMX West 2008

Moderated by Danny Sullivan, the conference chairman. This will be primarily Q&A, so not everything will get covered, as this format is more difficult to live blog than a traditional session.

Nathan Buggia from Live Search makes a short presentation announcing a new improvement to sitemaps. How they work today: strictly hierarchy based. He uses MSN as an example and shows how the engines have come together to extend the protocol (sitemaps.org) to allow cross-linking between sitemaps on different domains. Now he is getting really technical; someone nerdier than I probably should have covered this session. :)
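If I followed the cross-submission piece correctly, the host domain authorizes another domain to serve its sitemap through a robots.txt reference. A rough sketch, with example.com and example-hosting.com as placeholder domains:

    # robots.txt on www.example.com
    # Pointing to a sitemap hosted on another domain doubles as proof
    # that example.com consents to the cross-submission.
    Sitemap: http://www.example-hosting.com/sitemaps/example-com.xml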

Sitemap files cannot be larger than 10MB uncompressed. Tips: reference your sitemaps in your robots.txt file, submit them via Webmaster Tools and Site Explorer, and use the ping protocol to let the engines know of updates.
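As a rough illustration of the ping part, here is a minimal Python sketch that notifies Google of an updated sitemap (the sitemap URL is a placeholder, and each engine documents its own ping endpoint):

    import urllib.parse
    import urllib.request

    # Placeholder sitemap URL -- substitute your own.
    sitemap_url = "http://www.example.com/sitemap.xml"

    # Google's sitemap ping endpoint; the other engines offer similar ones.
    ping_url = ("http://www.google.com/ping?sitemap="
                + urllib.parse.quote(sitemap_url, safe=""))

    # A 200 response means the ping was received, not that the sitemap is valid.
    urllib.request.urlopen(ping_url)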

Q: Is there a way for people with multiple sites that address different regions and languages to map them properly? Evan Roseman from Google says this can be done in the Webmaster Tools area. Peter Linsley from Ask.com says there are standards for declaring what language a page is in, but people do not always get this right, so the engines need a way to confirm that what the tag says is accurate. Danny also pulls up an article he wrote on this at Search Engine Land in October 2007.
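For reference, the standard way to declare a page's language is in the markup itself; a minimal example (fr, for French, is just a placeholder):

    <!-- Declare the document language on the root element -->
    <html lang="fr">

    <!-- Older alternative: an HTTP-equiv meta header -->
    <meta http-equiv="Content-Language" content="fr">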

Q: Do you feel at a high level that SEO is good or bad? Sean Suchter from Yahoo feels that the community as a whole is doing fantastic things. Nathan feels one problem is that there is unequal access to SEO depending on what kind of company you are. Their goal is to strive to find the best content out there regardless of how optimized pages are. Tools, blogs, and forums are a great example of how SEO is a “big deal” and a positive thing. Evan says that SEO is also good because you are making the site accessible for search engines, which is also good for users with disabilities, for example. Peter agrees it is a good thing.

Q: Back to sitemaps: is it still worth spending the time to create an HTML sitemap? The engineers agree that the HTML sitemap is still important for human users. Where you used to have to include every page, now maybe you can just list the 100 most important for human navigators, Danny suggests. The key thing is that if you are making the HTML sitemap for the search engines, it may now be a waste of resources when you can use the more powerful XML version (paraphrased).

Q: Are links in DIVs hidden by CSS or JavaScript crawled? Evan says that Google does crawl them. Sean thinks Yahoo crawls them too, but cautions that they rank links based on value; if a link is invisible to the end user, then there is a question there. Nathan says that Live Search looks at all links on the page, but anything not visible is not considered. If it is apparently an attempt to manipulate, then they may take these into account in a negative way.
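To make the question concrete, this is the kind of markup being discussed (a made-up example):

    <!-- Link inside a container hidden with inline CSS -->
    <div style="display: none;">
      <a href="http://www.example.com/target-page.html">keyword-rich anchor text</a>
    </div>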

Q: If you cannot put a sitemap at the root level of the domain, what are the other options? Best practice is to put the sitemap file anywhere you want and use the robots.txt file to show the engines where it is. Evan states you should also use Webmaster Tools to let them know. The WordPress example is interesting, according to Sean, because it is a case where it is hard to confirm the real owner of the site. Only WordPress actually knows who the owner of the site is…they would need to let the engines know that it is a legitimate change being requested by the owner.
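For example, a single robots.txt line can point crawlers at a sitemap living in a subdirectory rather than at the root (the path is a placeholder):

    # robots.txt at http://www.example.com/robots.txt
    # The Sitemap directive may point anywhere on the site, not just the root.
    Sitemap: http://www.example.com/files/sitemaps/sitemap.xml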

Q: Do you follow links on 404 pages? Danny is curious why the questioner cares. Peter says no, Evan says no, Nathan says no. Sean draws a distinction between a “hard 404” and a “soft 404.” Please use the hard 404 (not returning a 200) so the engines can know for sure what the human is seeing. Nathan agrees that returning a 200 can lead to duplicate content issues.
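In other words, the difference is in the HTTP status line, not in the error page the visitor sees. A sketch of the two responses for a missing URL:

    HTTP/1.1 404 Not Found    <- hard 404: crawlers know the page is gone

    HTTP/1.1 200 OK           <- soft 404: a friendly error page served as a
                                 success, which crawlers may index as real content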

Q: What are some strategies to do better in mobile search, and do links make a difference? For Yahoo, the things that are important are very similar to web search. Mobile is a bit unique in that some sites use WAP, while others are designed for mobile using lightweight HTML. Use the same strategies and you should be OK. One of the big problems, according to Nathan, is non-standard sites. Live Search, like the others, uses a “mobilizer” to help present content on a mobile device, and non-standard code can make this end up looking really bad.

Q: Are HTML links still important? Yes. Nathan describes that you may not want to link to every single page, but instead make more links to category pages, which in turn link to other pages.

Q: Why can't we get search results without scraping? Evan agrees that scraping is bad, mkay. Sean says there is some access through Webmaster Tools and some limited ability through the Yahoo API. Nathan says Live Search has done really well with this.

Battery is dying so I am going to sign off. Very interesting panel.

Note this is live coverage of SMX West 2008, and there may exist grammatical or typographical errors in this post. Please share your thoughts in the comments!
