AJAX & Search Engine Optimization (SEO)

Feb 20, 2007 • 8:13 am | comments (14) | Filed Under SEO - Search Engine Optimization

I can make this quick: AJAX and SEO do not mix. Search engines cannot read AJAX-loaded content, because most crawlers do not execute JavaScript. So when you implement AJAX, make sure to give search engines an alternative path to the same content that the AJAX interface exposes.

There was a nice presentation at SES Chicago 2006, "CSS, AJAX, Web 2.0 & Search Engines," in which Jim McFadyen of Critical Mass gave some nice tips and examples of AJAX and alternative solutions implemented both right and wrong.

He loves Amazon's Diamond Search feature.

JavaScript turned on, i.e. the AJAX version: [screenshot]

JavaScript turned off, i.e. the non-AJAX version: [screenshot]


This shows how Amazon offers its users an alternative to JavaScript. But a search engine still won't be able to crawl and index Amazon's non-AJAX version, because it relies on image maps, forms and so on.

A good alternative would be to design a link-based version that offers some of these same filtering options.
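As a minimal sketch of what such a link-based alternative might look like (the `/diamonds` path and the filter names here are made up for illustration, not from Amazon's actual site): each filter combination becomes an ordinary query-string URL, which the page can render as plain `<a href>` links that any spider can follow.

```javascript
// Build a crawlable URL for a set of filter options, e.g.
//   buildFilterUrl('/diamonds', { shape: 'round', carat: '1-2' })
//   -> '/diamonds?shape=round&carat=1-2'
// Each such URL can be emitted as a plain <a href> link, so a spider
// can reach every filtered view without executing any JavaScript.
function buildFilterUrl(basePath, filters) {
  var pairs = [];
  for (var key in filters) {
    if (filters.hasOwnProperty(key)) {
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(filters[key]));
    }
  }
  return pairs.length ? basePath + '?' + pairs.join('&') : basePath;
}
```

JavaScript-enabled browsers can then intercept clicks on those links and swap the results in via AJAX, while crawlers and JS-off users get the same content through the ordinary URLs.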

Forum discussion at High Rankings Forum.




02/20/2007 01:57 pm

Can I use Ajax on my website <a href="http://www.ajaxlines.com" rel="nofollow">Ajaxlines</a> and still have Google crawl it?


02/20/2007 02:15 pm

Link is broken to amazon... missing http://


02/20/2007 02:17 pm

Not only for SEO, but also with regard to usability, there should always be an alternative to JavaScript/AJAX. A good example of why SEO is 50% about making websites better, the way they should be. The other 50%, of course, is rocket science ;)

Barry Schwartz

02/20/2007 02:54 pm

Fixed link Gabs...


02/20/2007 03:18 pm

Just a quick note -- this is often referred to as "Hijax" or "progressive enhancement." Essentially, you take a fully working, plain HTML page, and then use Javascript to add additional features, "hijack" the functionality of existing form controls, etc. Not only is this great for SEO -- allowing non-Javascript crawlers to spider the site -- but it's also great for accessibility -- allowing non-Javascript browsers to actually use the site.
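A rough sketch of that Hijax pattern (the form structure and the `renderResults` callback here are illustrative, not from any particular site): the page starts as a normal HTML form that submits to a server-rendered results page, and script, where available, hijacks the submit and fetches the same URL via XHR.

```javascript
// Turn a form's fields into a query string (simplified: named fields only).
function serializeForm(form) {
  var pairs = [];
  for (var i = 0; i < form.elements.length; i++) {
    var el = form.elements[i];
    if (el.name) {
      pairs.push(encodeURIComponent(el.name) + '=' + encodeURIComponent(el.value));
    }
  }
  return pairs.join('&');
}

// "Hijack" a working HTML form: reuse its own action URL for the XHR,
// so JS-off browsers and crawlers hit exactly the same server-side page.
function hijaxForm(form, renderResults) {
  form.onsubmit = function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', form.action + '?' + serializeForm(form), true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        renderResults(xhr.responseText);
      }
    };
    xhr.send(null);
    return false; // cancel the full-page submit for JS-enabled browsers
  };
}
```

Because the enhancement only ever adds behavior on top of working HTML, removing the script leaves a fully functional, fully spiderable page.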

Aaron Shear

02/20/2007 05:57 pm

You can always layer your AJAX with DHTML and replicate the content so that browsers that cannot load AJAX can still see it. I have used this on multiple sites and it works like a charm.

Alister Cameron, Blog Consulta

02/21/2007 05:31 am

I would say the exact opposite: AJAX and SEO make WONDERFUL bedfellows, but for the exact opposite reason you discuss here. I have done some "advanced" SEO work on my site that very deliberately takes advantage of the fact that AJAX content isn't visible to search engines, as a kind of "kosher cloaking" technique. I've explained what I do here: http://www.alistercameron.com/2007/02/05/advanced-search-engine-optimization-seo-for-wordpress/ It all makes sense when you get my full explanation. So anyway, my suggestion is to consider how useful AJAX is for separating the content on a page you want search engines to see from what you don't want them to see! - Alister

Kenny Hyder

02/23/2007 12:55 am

What you are saying is MOSTLY true, but not completely. There is a way to use JavaScript and AJAX so that engines can spider it just fine: you do this by modifying the DOM. You can append to a node using a reference to text located elsewhere on your server, and have it attach objects on the fly using XMLHttpRequest (XHR). Also, if you write unobtrusive JavaScript, there should be no problem with spiders at all. Unobtrusive meaning: if you turn JS off and the page still works (even if it looks different), you should be fine. Check out <a href="http://www.kennyhyder.com/webdev" rel="nofollow">this example</a>. It is fully spiderable, but is all JavaScript. Do a test by turning off JS in your browser and you will see what I mean by unobtrusive JavaScript.
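To make that unobtrusive idea concrete, here is a small sketch (the element ids and the /reviews.html URL are hypothetical): the link is plain HTML that a spider or JS-off browser can follow normally, and the script only hijacks the click to load the same URL into the current page.

```javascript
// Assumes markup like:
//   <a id="more" href="/reviews.html">All reviews</a>
//   <div id="reviews"></div>
// The link works on its own; enhanceLink just adds the XHR behavior.
function enhanceLink(link, container, createXhr) {
  link.onclick = function () {
    var xhr = createXhr();
    xhr.open('GET', link.href, true); // same URL the spider indexes
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        container.innerHTML = xhr.responseText; // insert instead of navigating
      }
    };
    xhr.send(null);
    return false; // JS-off users simply follow the link as a normal page load
  };
}

// Wire it up in a browser (skipped when no DOM is available).
if (typeof window !== 'undefined') {
  window.onload = function () {
    enhanceLink(document.getElementById('more'),
                document.getElementById('reviews'),
                function () { return new XMLHttpRequest(); });
  };
}
```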


11/28/2007 09:22 pm

What I have been working on for the past 3 days is an <a href="http://www.ajaxoptimize" rel="nofollow">SEO friendly AJAX site</a>. Still working on it, but it is intended to make SEO for AJAX sites easier to do. You can check it out now. It can load dynamic AJAX content without page reloading, it pulls the data from a database, but content and links can still be crawled and followed. So far I am pretty stoked just looking at what I've done, but it is still a work in progress.

Alex Choo

12/07/2007 05:58 pm

My site, 109things.com, uses Yahoo's User Interface library. I've tried to view it using Lynx, and it seems to look ok, although not as pretty as the AJAX version. But I think search engines are still able to crawl it too.

Design Wannabe

08/15/2009 06:23 am

Is it ok to use Ajax if you have a sitemap?

Q. Rafiq

01/09/2010 11:12 am

While it initially seems difficult to incorporate SEO into Ajax-enabled websites, it can always be done with properly structured Ajax calls: make all of the data available to search engines at page-ready state. Calls made after the ready state are the critical factor for SEO, and that is where the alternative (non-JavaScript) methods come in, to make search engines aware of the content.


05/22/2010 09:12 am

I've just been playing with Google's crawlable AJAX proposal. My <a href="http://seo-website-designer.com/SEO-Ajax-Google-Solution/" rel="nofollow">Crawlable Javascript Example</a> seems to work very well. I can already see that two AJAX pages have been indexed using my chosen title, description and content.
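For context, the core of Google's proposal is a URL convention: a fragment beginning with `#!` ("hash-bang") in a "pretty" URL is rewritten by the crawler into an `_escaped_fragment_` query parameter, and the server answers that "ugly" URL with a static HTML snapshot of the AJAX state. A small sketch of that mapping (the example URLs are hypothetical):

```javascript
// Translate a pretty #! URL into the form the crawler actually requests, e.g.
//   'http://example.com/page#!key=value'
//   -> 'http://example.com/page?_escaped_fragment_=key%3Dvalue'
// The fragment value is percent-encoded, per the proposal.
function toCrawlerUrl(prettyUrl) {
  var i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl; // no hash-bang: nothing to translate
  var base = prettyUrl.slice(0, i);
  var fragment = prettyUrl.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

The server's job is then to detect `_escaped_fragment_` in incoming requests and return the same content a browser would see after running the AJAX for that fragment.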


05/01/2011 01:13 pm

Hello! I have a question! SEO has problems with duplicate content. Can I use AJAX to avoid this problem? For example, my home page presents projects with a short description and a URL such as /app/category/project, which leads to the project page (also containing the same description). If I use AJAX on the home page to retrieve the project description, will it still count as duplicate content for SEO?
