CSS, AJAX, Web 2.0 & Search Engines

Apr 13, 2007 • 1:22 pm | Filed Under Search Engine Strategies 2007 New York
 

CSS, AJAX, Web 2.0 & Search Engines, Friday April 13, 2007 12:30 pm Organic Track

Moderator: Danny Sullivan

Speakers: Shari Thurow, GrantasticDesigns.com; Jim McFadyen, Critical Mass; Ryan Johnston, Critical Mass; Dan Crow, Google; Amit Kumar, Yahoo! Search

This is the final session for me today and one of the last of the conference. Again, it is being held in a ballroom-sized room and it's cold in here (but better than being in some of the hot rooms). I didn't have info in advance on who would be moderating, but Danny Sullivan just appeared, and he'll be the one moderating this session. Two minutes to go and the room is starting to buzz and fill up.

Danny is at the podium and cracking jokes. Last session of the last day. This will be the "best session," he jokes. The web has evolved as more people make use of CSS and AJAX. What issues does that raise for SEO? Shari leads off.

Shari:

I think everybody who attends the last session deserves a reward. CSS is an addition to HTML that lets webmasters control design, fonts, link appearance, and more. A stylesheet is a text file, and search engines can read it. It decreases page download time, makes it easier to control the elements on a page, communicates visited versus unvisited links, and gives you control over the look of a site. Search engines monitor for hidden links.

Disadvantages: end users have to have the fonts you call for in your stylesheet, and people often prefer odd typefaces found in print that are not commonly installed on all computers. Styled hyperlinks can clutter a page, and you sometimes get unusual text wrapping when a stylesheet changes, for example when font sizes change. CSS can also be used to hide text on a page. Search engines don't use alt text to determine relevancy, so as a workaround some people turn a lot of content into h1 tags and then shrink those headings with CSS. CSS layer coordinates are something search engines can detect; some SEOs try to hide content at negative coordinates, and because CSS makes it easy to stack layers on top of each other, it is easy to hide text that way. The myth is that you can use CSS to hide things from search engines. Drop-down menus are not considered spam, because the text and the links are meant to be read by humans.
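To make the hidden-text point concrete, here is a rough sketch (not from the session; class names and values are made up) of the kind of CSS pattern being described, where a heading and a layer are styled so visitors never see them:

    <!-- Illustrative only: hidden-text CSS of the sort Shari warns against. -->
    <style type="text/css">
      /* heading pushed off-screen with negative coordinates */
      h1.offscreen { position: absolute; left: -9999px; top: -9999px; }
      /* layer hidden entirely, or stacked underneath another element */
      div.hidden-layer { display: none; }
    </style>
    <h1 class="offscreen">keyword keyword keyword</h1>
    <div class="hidden-layer">Text that visitors never see.</div>

Her point is that the stylesheet is a plain text file the engines can read, so exactly this sort of thing is detectable.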

Put all your CSS into a separate directory. She recommends making a different design for mobile, not just changing the CSS. Should you exclude CSS with robots.txt? No; search engines don't want you to hide CSS or JavaScript with robots.txt. She highly recommends CSS because it improves page load times. Make sure your websites display properly across browsers. Not every element needs to be CSS; some things are fine as images rather than being done with CSS.
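As a simple illustration of the separate-directory advice (paths here are just examples), the pages would reference external stylesheets like this, and that /css/ directory would be left crawlable rather than blocked in robots.txt:

    <!-- Example only: stylesheets kept in their own directory and linked from every page. -->
    <link rel="stylesheet" type="text/css" href="/css/site.css" media="screen" />
    <link rel="stylesheet" type="text/css" href="/css/print.css" media="print" />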

Ryan and Jim co-present:

We use tools like AJAX all the time, including for some high-end clients. The technology has been around for about seven years. AJAX stands for Asynchronous JavaScript and XML; the X is the data format. AJAX is not a programming language, there is nothing to install or download, and all browsers are enabled for it. But AJAX is not supported by search engines.

Full, partial, and none are the three groups of support. Search engines and AJAX don't mix because of the reliance on JavaScript, which makes content hard to locate and index. If AJAX is what delivers your content, that is the problem. Every page needs to exist as an HTML, PHP, or ASPX page that search engines can find and index, every page must have its content actually on the page, and all links must be in HTML. Test by turning off JavaScript: if the pages are still there, search engines will find them.
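A minimal sketch of that baseline (URLs and markup invented for illustration): the content lives at real HTML pages and the navigation uses plain HTML links, so everything survives with JavaScript turned off:

    <!-- Baseline markup: real pages, real links, nothing dependent on JavaScript. -->
    <ul id="nav">
      <li><a href="/products/watches.html">Watches</a></li>
      <li><a href="/products/jewelry.html">Jewelry</a></li>
    </ul>
    <div id="content">
      <!-- each page also serves its full content here in plain HTML -->
    </div>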

AJAX enhances the user experience. Engineers then come in and change the anchors on the page so they function as AJAX calls instead. Ensure your baseline application supports non-AJAX users, including spiders. AJAX can make a site more interesting for users, make it run faster, and offer assistance, like Google Suggest.
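Here is a hedged sketch of what "changing anchors into AJAX calls" might look like on top of the baseline markup above (function and element names are hypothetical, and a real build would likely fetch a content fragment rather than a whole page):

    <script type="text/javascript">
      // Fetch a URL asynchronously and drop the response into the page.
      function loadPage(url) {
        var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                        : new ActiveXObject("Microsoft.XMLHTTP");
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("content").innerHTML = xhr.responseText;
          }
        };
        xhr.open("GET", url, true);
        xhr.send(null);
      }

      // Hijack the plain HTML links: AJAX for JavaScript-enabled visitors,
      // normal page loads for spiders and users with JavaScript off.
      var navLinks = document.getElementById("nav").getElementsByTagName("a");
      for (var i = 0; i < navLinks.length; i++) {
        navLinks[i].onclick = function () {
          loadPage(this.href);
          return false; // cancel the normal navigation
        };
      }
    </script>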

Example: Rolex.com. There is a lot of copy in the navigation, and they wanted the nav accessible from every page, but they didn't want all of that content indexed by search engines. The solution was AJAX.

AJAX breaks the normal browser refresh model, which means the content on screen does not always correspond to the URL. There is no history and no back button, which is a major usability issue. Their approach is to use JavaScript to update the URL with a fragment, which doesn't refresh the page but fakes an entry into the browser's history. He advises not to cloak. Their research suggests duplicate content should not be an issue, since spiders don't index past the # sign.
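A rough sketch of that fragment technique, reusing the made-up names from the earlier sketch (older browsers have no hashchange event, so a timer watches for back/forward navigation):

    <script type="text/javascript">
      var lastHash = window.location.hash;

      // Called from the AJAX link handlers instead of a normal page load:
      // setting location.hash adds a history entry without refreshing the page.
      function navigateTo(hash) {
        window.location.hash = hash;
      }

      // Watch for back/forward clicks and reload the matching content,
      // using the hypothetical loadPage() from the sketch above.
      setInterval(function () {
        if (window.location.hash !== lastHash) {
          lastHash = window.location.hash;
          loadPage("/ajax/" + lastHash.replace("#", "") + ".html");
        }
      }, 200);
    </script>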

[This is a very techy presentation. He is having trouble showing his examples from a live site because of Flash. It's hard to take notes on this session because he is showing AJAX solutions in use.]

AJAX is used to accommodate URL updates and handle deep linking. He looks at gucci.com. It's a pretty site and easy to move around, but if you remove the JavaScript there is nothing on the pages: no content. It breaks every rule for SEO; it's all images, all driven by JavaScript.

Amazon is viewed next, showing how AJAX handles the interface and menus, with Amazon's Diamond Search as the example. The site works without JavaScript.

Panel input:

Dan Crow, Google: Says Google is moving toward indexing CSS, Flash, AJAX, and JavaScript. There is no change in the present state, but expect a major shift in the future; they're interested in this technology. If you assume everything hidden behind JavaScript stays hidden, someday that will no longer be true. Be cautious about the assumptions behind how you build your websites, because they are making changes.

Amit, Yahoo!: It's our fault we can't index these technologies; we don't want to stop you from designing for your users. He states that what is built for accessibility is also built for search engines. They would like CSS and JavaScript not to be hidden by robots.txt; if you have a problem with this, let Yahoo know. If the technology you use requires clicks to work a form, that makes it hard for search engines; they look at content and at references from other sites. AJAX that doesn't allow URLs to change is a problem for search engines and for bookmarking, because descriptions live in those URLs, and search engines don't know exactly which URL an inbound link is actually referring to.

[Note: This session was about the conflicts, or lack of them, between CSS and AJAX and search engines. It was a little hard to follow if you don't know what AJAX is and what it is used for. It was interesting to hear what the search engine reps had to say. I think as AJAX solutions become more popular, we'll be hearing more about this and getting more details on actual applications.]

 

Comments:

Ryan Johnston

04/14/2007 06:50 am

Thanks for the write up! I'd love to hear some feedback on how to make things less techie - we had lots to cover in less than 20 min. Thanks again, ryanj http://www.ryanj.org

Kim Krause Berg (cre8pc)

04/15/2007 02:07 am

Ryan, your passion spilled over in your talk and that was fun to see. I think you needed more time, or a way to show examples of what AJAX can do. I also think there were folks who didn't understand enough about AJAX solutions to be able to follow along well. I actually understood much more after talking to my husband, who was with me and heard your talk, and who knows more about AJAX. He helped fill in the gaps for me. A reporter, who is a programmer, would have done the session better justice than I did :)

Jim McFadyen

04/24/2007 05:28 am

Hi Kim, We have really worked on this presentation to make it less techy, but by nature of the topic, a JavaScript Object, it is 100% code! Any feedback please let me know. It is tough to balance this out, as some of the feedback from Chicago was show more examples and what AJAX can do and less basics. So it is a fine line to walk. I just hope most people got at least something out of it. Thanks, Jim

Craig

05/15/2007 09:06 am

Of all the issues with using AJAX and staying search-engine friendly, the one remaining problem to be overcome is the case where a visitor uses the URI shown in the address bar to create a link on their own site to the page in question. It is simple enough to provide URLs to statically generated pages that search engines and non-JavaScript browsers can use to arrive at the correct content, while a JavaScript-enabled browser could load the static page initially and then have all future requests processed via AJAX. Even supporting the proper function of the forward and back buttons has a number of solutions, so internally a site can function as one would expect.

But when a "page" is AJAX-generated, it will more than likely be identified by the fragment/hash which, while it allows one to return to exactly the same page and can even be bookmarked, will more than likely be seen by search engines as nothing more than a link to the site's entrance page. Although it may seem like a minor point in that PageRank should still be credited to the site correctly, it more than likely won't be properly associated with the page/content the link actually points to. One could provide "permalink" anchors on each AJAX-generated page, but they have to actually be used to be of any value, and since the dominant method is just to copy and paste from the address bar, the benefit would more than likely be hit and miss.

Search engines eventually being able to natively navigate AJAX-generated content will be a web revolution in the making, but at the same time, if and when it comes, the complexity of defining what should and shouldn't be indexed will likely stretch the capabilities of our friendly neighborhood robots.txt file to breaking. In any event, the future will be, if nothing else, exciting and interesting.
