CSS, AJAX, Web 2.0 & Search Engines

Mar 19, 2008 • 7:04 pm | Filed Under Search Engine Strategies 2008 New York
 

As the web moves into its second generation, sites are making more use of CSS, AJAX and other advanced, interactive design techniques. But how are the largely Web 1.0 search engines reacting to these, from an SEO perspective? This session explores issues and solutions.

Moderator:
• Jon Myers, Head of Search, MediaVest

Speakers:
• Jonathan Ashton, VP of SEO & Web Analytics, Agency.com
• Ben D'Angelo, Software Engineer, Google
• Chris Humber, Director of SEO, 360i

Jonathan Ashton

I would like to talk about the issues of standards in web development, and ultimately presenting the right experience to all web users.

You guys remember when Flash came along and brought the whole idea of life into the static environments of the web. Ren and Stimpy was one of the first animated cartoons built in Flash. When Flash emerged, it was something that gave us an added layer to bill our clients extra for! But today, we have so many amazing tools that our clients are now paying us for our ability to manage proper use of these elements.

Usability standards for optimizing in web 2.0:

AOL still has 9 million-plus users on dial-up! You have to realize that if you are going to be a good internet citizen, you cannot leave these people behind. And 35% of people using Firefox explore the web without JavaScript! If your intention is to reach every human, you will also reach every search engine bot.

If a person who is blind visits a web page, they can't tell that an image shows a horse unless it has alt text. So by stepping out of the perspective of someone who is marketing, and into the shoes of someone trying to create a good community, you will benefit.
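The same alt text that serves a screen reader also serves a search engine spider. A minimal sketch (the filename and description here are placeholders):

```html
<!-- Without alt text, a screen reader or a search bot gets nothing useful. -->
<img src="horse.jpg">

<!-- With alt text, both get a description of what the image shows. -->
<img src="horse.jpg" alt="A brown horse grazing in a field">
```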

I have met SEOs who have not taken the time to read Google's guidelines and recommendations. If you are spending any time working on your site trying to get more traffic, please take the time to read this stuff! Google is showing you as much of their hand as they are willing to show.

Google tells you that certain technologies are not crawlable. It also suggests you download an old-school text browser and look at your site in it. We need to help our clients achieve that level of accessibility alongside interactivity.

What does indexed actually mean? Just because it's indexable does not mean it's going to win for anything meaningful. I know search engines are focusing on more content in these dynamic environments, but indexability does not mean winnability! So leave semantics behind: you need the layer underneath for the non-Flash-enabled user, or the spider.

Information architecture is core to usability and required for a usable site. Optimizers should get involved early and often. If the content needs to be indexed, don't hide it. As optimizers we need to bring this level of rationality to the IA process.

Is dynamic content really required? "Everyone else is doing it" is not a valid reason to do it. If there is a valid reason, go for it. But if you can accomplish everything without using the newest technology, then great: don't use it.

So what's the A in Ajax? It stands for asynchronous! It may look cool, but it's ultimately a challenge to index.

Validate your HTML and CSS. Careful development means good optimization. A browser is designed to interpret what it sees and is forgiving of mistakes, but what a search engine sees is a much more literal engagement.

So, how do you finish first? Develop for the highest common denominator and the lowest. Make sure your tools are still 2.0 plus, in a 1.0 environment.

Ben D’Angelo

A lot of content is already easily accessed by search engines. Blogs, wikis etc. use HTML markup. It becomes more challenging introducing other ways of interaction. The 2 main technologies I will talk about are Ajax and Flash.

What is 2.0 about? It’s about richer and more complex systems relating to the management and interdependence of content, presentation and navigation.

With Ajax, content and navigation may be affected. With Flash, all three of these are tightly coupled.

Most people have Flash enabled, so why should I worry about the tiny fraction who don't? You could have made a similar argument about images back in the day. Of course, now we know images are great, but at the same time you add alt text, etc., so they're much more accessible. It's a similar argument with Flash.

When you think about accessibility for all users, your content becomes much more available to search engines. If it's usable for a blind reader, great. Some tech-savvy people have plug-ins that disable Flash. Cell phones and low-bandwidth devices also don't support Flash, and that is a market you likely want to target. Bookmarking is something you might not think about, but it's important: it's good for your site to attract links, but can people link to your site if it's all Flash? Can I link to a cool game I played if the entire site is in Ajax? If a user can bookmark it, it will be accessible to search engines.

A simple thing: make static links, and they will automatically be recognizable by search engines.
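The idea can be sketched in markup: give navigation a real `href` as the baseline, with script only as an enhancement. (`loadPage` here is a hypothetical Ajax function, not anything from the session.)

```html
<!-- Crawlable: a plain anchor with a real href works everywhere. -->
<a href="/products/widgets.html">Widgets</a>

<!-- Not crawlable: navigation that exists only in script. -->
<span onclick="loadPage('widgets')">Widgets</span>

<!-- Best of both: the href is the fallback, the script is the enhancement.
     Users with JavaScript get the Ajax behavior; spiders follow the link. -->
<a href="/products/widgets.html"
   onclick="loadPage('widgets'); return false;">Widgets</a>
```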

CSS allows you to isolate content from presentation. Try turning off CSS to see if your site still reads reasonably. And avoid abusive techniques like hiding text with CSS.
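A small sketch of that separation (class name and styles are illustrative only):

```html
<!-- Presentation mixed into content: with no stylesheet concept at all,
     there is nothing to turn off and nothing semantic to read. -->
<p><font color="red"><b>Sale ends Friday</b></font></p>

<!-- Content in HTML, presentation in CSS: with the stylesheet disabled,
     the paragraph still reads as a plain, meaningful paragraph. -->
<style>
  .notice { color: red; font-weight: bold; }
</style>
<p class="notice">Sale ends Friday</p>
```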

Start with traditional HTML, then add embellishments like rich media elements. YouTube is a good example.

From an Ajax perspective: URL parameters vs. fragments. Googlebot can ignore fragments in a URL. If you want to use some Ajax, use it together with HTML.
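The distinction matters because the fragment (everything after `#`) is never sent to the server, so a crawler has nothing distinct to fetch. The URLs below are hypothetical illustrations:

```html
<!-- The fragment stays on the client; to a crawler, this URL is
     indistinguishable from plain http://example.com/catalog -->
<a href="http://example.com/catalog#page=2">Page 2 (Ajax-only state)</a>

<!-- A query parameter is part of the URL itself, so a crawler can
     request and index a distinct page for it. -->
<a href="http://example.com/catalog?page=2">Page 2 (crawlable)</a>
```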

Flash: Google does try to read some of it, including URLs, but not all of it. So use regular HTML for primary content and navigation, then complement it with Flash elements.

A little more advanced technique is sIFR, which takes content in HTML elements and replaces it with a small Flash movie; its primary use is custom fonts. If a user has Flash installed and enabled, they will see it; if not, they will see the regular HTML.

Useful links: the Google Webmaster Central blog, the Webmaster Help Center, and the webmaster discussion group.

Chris Humber

Flash is a restrictive technology. Why? Because the content is invisible to the spider, and spiders can't navigate it. I am personally waiting for the day when Google can incorporate Flash and I can find a great Flash article in the results. Unfortunately, the engines still operate in a 1.0 space.

TheBar.com is a great website where the bartender will respond to your questions, and they have a great margarita recipe. But the site is built entirely in Flash. This is great information that would be extremely useful in the search engines; unfortunately, when you search for a margarita recipe, it cannot be found. The user gets a rich interface, but the spiders get nothing.

Some best practices to incorporate if you must use Flash:

Adobe Search Engine SDK: extracts text and links from a SWF file. It's a direct text output of the Flash file; never use it alone, as that raw output can't be meaningfully indexed.

SWFAddress is a code library that allows you to create URLs in a Flash environment.

SWFObject is a great way to embed Flash into your HTML code, and it's widely compatible; plenty of sites use it. It allows for content integration: a DIV layer lets you provide static text in a Flash environment.
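A minimal sketch of the SWFObject approach, assuming SWFObject 2's `embedSWF` call; the file names, element ID, dimensions, and fallback copy are all placeholders:

```html
<div id="flash-content">
  <!-- Static fallback: this is what non-Flash users and spiders see. -->
  <h2>Classic Margarita Recipe</h2>
  <p>Tequila, lime juice, and orange liqueur, shaken over ice.</p>
</div>

<script src="swfobject.js"></script>
<script>
  // If Flash 9+ is available, SWFObject replaces the div above with the
  // movie; otherwise the indexable HTML fallback remains in place.
  swfobject.embedSWF("recipes.swf", "flash-content", "550", "400", "9.0.0");
</script>
```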

You have to provide the navigation if you use SWFAddress or SWFObject; the spiders need enough navigation to find the content. Also think about inbound linking. Otherwise you won't rank very well.

sIFR is for short text blocks, page headers, and carousels, and it ensures the content stays accessible. It uses a combination of CSS, JavaScript, and Flash. ABC News uses it on their website. It's very useful if you have a dynamic lead on the site.

If you apply the above best practices, you should see an improvement in search visibility and increased traffic via natural search, even in a Flash-based environment.

This session is provided by Sheara Wilensky of Promedia Corp.

 

Comments:

Jonathan Ashton

03/20/2008 07:02 pm

Thanks for the notes and I hope you enjoyed the discussion.
