I am just giving it more daylight by referencing it over here:
- Don't cloak to Googlebot. Use "feature detection" and "progressive enhancement" techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed. The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
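A minimal sketch of what feature detection with a safe fallback might look like, assuming a page that loads content over HTTP; the helper names (`loadData`, `onDone`) are illustrative, not from John's post:

```javascript
// Feature-detect the APIs Googlebot may not support, and fall back
// instead of redirecting to an "unsupported browser" page.

function supportsFetch() {
  return typeof fetch === 'function';
}

function supportsPromises() {
  return typeof Promise !== 'undefined';
}

// Illustrative loader: modern path when Fetch + Promises exist,
// plain XMLHttpRequest otherwise.
function loadData(url, onDone) {
  if (supportsFetch() && supportsPromises()) {
    fetch(url)
      .then(function (res) { return res.text(); })
      .then(onDone);
  } else {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) onDone(xhr.responseText);
    };
    xhr.open('GET', url);
    xhr.send();
  }
}
```

The same idea applies to requestAnimationFrame (fall back to setTimeout) and Service Workers (treat them as an optional enhancement): detect, then degrade gracefully.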
- Use rel=canonical when serving content from multiple URLs is required.
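For example, if the same article is reachable at several URLs, the canonical hint goes in the head of each duplicate (the URL here is illustrative):

```html
<!-- On every duplicate URL, point at the preferred version. -->
<link rel="canonical" href="https://www.example.com/article">
```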
- Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. 
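The "meta fragment" tag in question looks like this; it is shown here only so it can be recognized and removed during migration:

```html
<!-- Remove this when migrating off the AJAX-Crawling scheme: -->
<meta name="fragment" content="!">
```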
- Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation.
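A sketch of History API navigation, assuming a client-rendered page; `renderView` is an illustrative placeholder for whatever your app does to draw a view:

```javascript
// Use real paths in the address bar instead of "#/section" fragments,
// so each view has a "normal", indexable URL.

function renderView(path) {
  // Application-specific rendering would go here (placeholder).
  console.log('rendering ' + path);
}

function navigate(path) {
  // pushState updates the URL without a full page load.
  history.pushState({ path: path }, '', path);
  renderView(path);
}

// Re-render when the user presses Back/Forward.
if (typeof window !== 'undefined') {
  window.addEventListener('popstate', function (event) {
    renderView(event.state ? event.state.path : location.pathname);
  });
}
```

For this to help crawling, the server should also return meaningful content for those paths directly, so they remain fetchable as ordinary URLs.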
- Use Search Console's Fetch and Render tool to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.
John ended by saying:
Forum discussion at Google+.