Google Gives Update On Crawling JavaScript Sites & Progressive Web Apps

Mar 7, 2016 - 7:42 am


John Mueller of Google posted a detailed update on how Google currently handles JavaScript sites (including AJAX) and Progressive Web Apps in its index, crawling, and ranking. He posted this on Google+ and titled it "An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps in Google Search."

I am just giving it more daylight by referencing it here:

  • Don't cloak to Googlebot. Use "feature detection" & "progressive enhancement" [2] techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed (see the feature-detection sketch after this list). The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
  • Use rel=canonical [3] when serving content from multiple URLs is required.
  • Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. [4]
  • Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation (see the History API sketch after this list).
  • Use Search Console's Fetch and Render tool [5] to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.
  • Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool will list blocked resources discovered. If resources are uncontrollably blocked by robots.txt (e.g., 3rd-party APIs) or otherwise temporarily unavailable, ensure that your client-side code fails gracefully (see the graceful-fallback sketch after this list).
  • Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts & rendering without these resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.
  • Google supports the use of JavaScript to provide titles, description & robots meta tags, structured data, and other meta-data (see the meta tag sketch after this list). When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.
  • Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.
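
To make the first point concrete, here is a minimal feature-detection sketch with a safe fallback, in the spirit of John's advice. The polyfill path and the Service Worker script name are assumptions for illustration, not anything from his post:

    // Feature-detect Promises and load a polyfill only when support is missing,
    // so clients without them (including Googlebot at the time) still work.
    if (!('Promise' in window)) {
      var script = document.createElement('script');
      script.src = '/assets/promise-polyfill.js'; // hypothetical polyfill path
      document.head.appendChild(script);
    }

    // Progressive enhancement: register a Service Worker only where supported.
    // Googlebot skips this branch, so the page must render fine without it.
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js'); // hypothetical worker script
    }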
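For the point about avoiding "#" URLs, a History API sketch might look like the following. Here renderRoute is a hypothetical client-side renderer standing in for whatever your framework provides, and data-internal is a hypothetical attribute marking internal links:

    // Intercept internal link clicks and update the URL with pushState,
    // so navigation produces "normal", indexable path URLs instead of "#" fragments.
    document.addEventListener('click', function (event) {
      var link = event.target.closest('a[data-internal]');
      if (!link) return;
      event.preventDefault();
      history.pushState({}, '', link.getAttribute('href'));
      renderRoute(location.pathname);
    });

    // Keep the browser's back/forward buttons working.
    window.addEventListener('popstate', function () {
      renderRoute(location.pathname);
    });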
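For the robots.txt point, the graceful-fallback sketch below shows one way client-side code can degrade when a 3rd-party API is blocked or unavailable; the endpoint and fallback text are hypothetical. Note it uses XMLHttpRequest rather than the Fetch API, which John lists as unsupported by Googlebot:

    // Load an optional widget without letting a failed request break the page.
    function loadWidget() {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', 'https://api.example.com/widget'); // hypothetical 3rd-party API
      xhr.onload = function () {
        document.getElementById('widget').textContent = xhr.responseText;
      };
      xhr.onerror = function () {
        // The core content of the page must not depend on this call succeeding.
        document.getElementById('widget').textContent = 'Widget unavailable.';
      };
      xhr.send();
    }
    loadWidget();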
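And for the point on JavaScript-provided meta-data, here is a minimal meta tag sketch; the setMeta helper and the values are illustrative, not part of John's post:

    // Set the title, and create or update description/robots meta tags from
    // JavaScript, which Google says it supports for rendered pages.
    document.title = 'Example Product - Example Store';

    function setMeta(name, content) {
      var tag = document.querySelector('meta[name="' + name + '"]');
      if (!tag) {
        tag = document.createElement('meta');
        tag.setAttribute('name', name);
        document.head.appendChild(tag);
      }
      tag.setAttribute('content', content);
    }

    setMeta('description', 'An example description rendered client-side.');
    setMeta('robots', 'index, follow');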

John ended by saying:

Looking at this list, none of these recommendations are completely new & limited to today -- and they'll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!

Forum discussion at Google+.

 
