After Months Of Warnings, Google Kills Their AJAX Crawling Scheme

Oct 15, 2015 - 8:29 am

Last night, Google announced they are deprecating their AJAX crawling proposal from 2009.

This does NOT mean they won't crawl your AJAX web pages. They will still crawl AJAX sites; Google is simply saying you no longer need to hack in special stuff to help Googlebot understand your AJAX website. Instead, Google suggests you use standard best practices, such as progressive enhancement, when building these sites.
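To make that concrete, here is a minimal sketch of progressive enhancement (the element IDs and the "X-Partial" header are hypothetical): the content arrives as plain server-rendered HTML, and JavaScript only upgrades navigation on top of it, using real URLs instead of #! fragments.

    // Progressive enhancement sketch: the article content is already in the
    // server-rendered HTML, so the page works with JavaScript disabled.
    // The script below only upgrades navigation.
    document.addEventListener("DOMContentLoaded", () => {
      const nav = document.querySelector<HTMLElement>("#site-nav");
      const body = document.querySelector<HTMLElement>("#article-body");
      if (!nav || !body) return; // the plain HTML keeps working without JS

      nav.addEventListener("click", async (event) => {
        const link = (event.target as Element).closest("a");
        if (!link) return;
        event.preventDefault();

        // Fetch the next page's content fragment, but keep a real,
        // crawlable URL via the History API instead of a #! fragment.
        const response = await fetch(link.href, { headers: { "X-Partial": "1" } });
        body.innerHTML = await response.text();
        history.pushState({}, "", link.href); // no _escaped_fragment_ needed
      });
    });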

As you know, Google told us back in March that this would happen. It didn't happen right away, so Google reiterated in late April that it still had plans to deprecate the scheme, and then repeated that again in August. So this should come as no surprise to anyone who reads this site.

We have dozens of stories here about Google's advancements in crawling JavaScript - and Google has gotten good enough at it that it no longer needs this special AJAX scheme.

Here is some Q&A with Google's Kazushi Nagayama:

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you've deprecated your recommendation?

A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you're making the next update for your site. Instead of the _escaped_fragment_ URLs, we'll generally crawl, render, and index the #! URLs.
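For context on why the #! URLs matter: under the 2009 proposal, the crawler translated a #! URL into an _escaped_fragment_ query parameter and fetched an HTML snapshot from that address; the deprecation means Googlebot now renders the #! URL itself. A rough sketch of the old mapping (the helper function is mine, not part of any Google API):

    // Sketch of the deprecated scheme's URL translation. Under the 2009
    // proposal, https://example.com/#!/products/42 was fetched by the
    // crawler as https://example.com/?_escaped_fragment_=/products/42
    // (with the fragment percent-encoded).
    function toEscapedFragmentUrl(url: string): string {
      const [base, fragment = ""] = url.split("#!");
      const separator = base.includes("?") ? "&" : "?";
      return base + separator + "_escaped_fragment_=" + encodeURIComponent(fragment);
    }

    console.log(toEscapedFragmentUrl("https://example.com/#!/products/42"));
    // -> https://example.com/?_escaped_fragment_=%2Fproducts%2F42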

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?

A: If your current setup is working fine, you should not have to immediately change anything. If you're building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.
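If you do restructure and retire old _escaped_fragment_ URLs, the usual site-move advice applies: 301-redirect the old addresses to their clean equivalents. A minimal sketch using Node's built-in http module (the one-to-one path mapping is an assumption about your site, not something Google prescribes):

    import { createServer } from "node:http";

    // Minimal sketch: permanently redirect legacy _escaped_fragment_
    // requests to the equivalent clean URL. Assumes the fragment maps
    // one-to-one onto a real path on the restructured site.
    createServer((req, res) => {
      const url = new URL(req.url ?? "/", "https://example.com");
      const fragment = url.searchParams.get("_escaped_fragment_");

      if (fragment !== null) {
        res.writeHead(301, { Location: fragment || "/" });
        res.end();
        return;
      }

      res.writeHead(200, { "Content-Type": "text/html" });
      res.end("<!doctype html><p>Regular, crawlable page</p>");
    }).listen(8080);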

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?

A: In general, websites shouldn't pre-render pages only for Google -- we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user's experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
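The cloaking risk in that last answer comes from branching on who is asking. A pre-rendering setup stays safe by serving the same snapshot to every client rather than sniffing for Googlebot; a sketch, again with Node's http module (renderPage stands in for whatever headless browser or server-side renderer you actually use):

    import { createServer } from "node:http";

    // Stand-in for a real pre-renderer (a headless browser, or your JS
    // framework's server-side rendering).
    function renderPage(path: string): string {
      return "<!doctype html><title>Example</title><p>Pre-rendered: " + path + "</p>";
    }

    createServer((req, res) => {
      // No User-Agent check: Googlebot and regular visitors receive the
      // exact same pre-rendered HTML, so nothing here counts as cloaking.
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(renderPage(req.url ?? "/"));
    }).listen(8080);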

Forum discussion at Google+ and Twitter.