What Changed In The Google Webmaster Guidelines

Jan 28, 2016 - 2:49 pm


Patrick Sexton asked if he could publish the new changes and additions made in the Google Webmaster Guidelines update I mentioned this morning. So here is his take on it. But first, Patrick Sexton is the author of Varvy.com and the co-founder of getlisted.org (acquired by Moz, now known as Moz Local).

Occasionally over the years the Google webmaster guidelines get updated.

Rarely is there a significant change in the goals behind the Google guidelines; the updates tend to be clarifications rather than big news.

This update is different. It includes entirely new guidelines, several clarifications, separations of former guidelines, combinations of former guidelines, and complete removals of guidelines. The highlights are that secure sites (HTTPS), mobile SEO, and accessibility are now formally part of the Google webmaster guidelines.

Let's look at the changes.

Entirely New Guidelines:

"If possible, secure your site's connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web."

Having a secure website using HTTPS is now a Google webmaster guideline. This was somewhat inevitable. Officially adding it to the guidelines signals a significant event: Google very rarely adds new webmaster guidelines, and when it does, a great deal of thought and energy goes into the decision. Websites should clearly go HTTPS at this point. There is no gray area anymore.
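In practice, going HTTPS usually means serving every page over HTTPS and permanently redirecting the old HTTP URLs. As a minimal illustrative sketch (assuming an nginx server and a hypothetical example.com domain, with the certificate and HTTPS server block configured separately), that redirect could look like this:

    # Redirect all HTTP requests to the HTTPS version of the same URL (nginx)
    # example.com is a placeholder domain
    server {
        listen 80;
        server_name example.com www.example.com;
        return 301 https://example.com$request_uri;
    }

The 301 makes the redirect permanent, which is what you want when consolidating everything onto the HTTPS URLs.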

"Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed."

Mobile SEO is now formally a Google webmaster guideline. If you have been waiting to go mobile, you have waited too long. This is actually the first time the Google webmaster guidelines have specifically mentioned mobile.
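Designing for all device types and sizes typically starts with a viewport declaration and CSS that adapts to screen width. A minimal sketch (the .sidebar class and breakpoint are hypothetical, not anything from the guidelines):

    <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop layout -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Hypothetical two-column layout that collapses to one column on small screens */
      .sidebar { float: right; width: 300px; }
      @media (max-width: 600px) {
        .sidebar { float: none; width: 100%; }
      }
    </style>

The mobile friendly testing tool mentioned in the guideline will then tell you whether the page actually works at phone sizes.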

"Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader."

Accessibility has long been absent from the Google guidelines. This new guideline leaves no doubt that accessibility will now receive the attention it deserves. Overall this is my favorite new guideline.
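In practice, a lot of screen-reader friendliness comes from basic HTML hygiene: alt text on meaningful images and labels tied to form controls. A small illustrative sketch (the file names and field names are hypothetical):

    <!-- Descriptive alt text is read aloud by screen readers -->
    <img src="/img/store-front.jpg" alt="Front entrance of our downtown store">

    <!-- A label explicitly associated with its input via the for/id pair -->
    <label for="email">Email address</label>
    <input type="email" id="email" name="email">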

"Make your site's important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections, however we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view."

This guideline clears up the whole "what about content in tabs" debate. Google clearly considers the initially visible content to be more important than content found in tabs or other elements that are not visible by default.
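To make that concrete, here is a hedged sketch of the pattern the guideline is talking about (the class names and IDs are hypothetical): the first panel is visible in the default page view, while the second is only revealed when its tab is clicked. Per the new guideline, your most important content belongs in the panel that is visible by default.

    <div class="tabs">
      <!-- Visible in the default page view: put your most important content here -->
      <div class="tab-panel" id="overview">Product overview and key details...</div>

      <!-- Hidden until the user clicks the tab: Google can crawl it,
           but considers it less accessible to users -->
      <div class="tab-panel" id="specs" style="display: none;">Full specifications...</div>
    </div>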

New Guidelines That Are Changed Versions Of Already Existing Guidelines:

"Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page."

This guideline used to read:

"Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link."

There are two differences here. The first is a more human, understandable description that highlights the importance of anchor text and alt attributes and uses the word "findable" rather than the term "static link". This is a natural update because static links are not the only way Google discovers content anymore, nor is "static link" a friendly term for non-technical content providers.

The second difference is that this guideline has been split in two. The previous guideline mentioned a "clear hierarchy", which is now removed from this guideline and upgraded to its own separate guideline (more on this below).
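As a quick illustration of the "text or, for images, an alt attribute" requirement (the URLs and file names are hypothetical):

    <!-- A text link whose anchor text describes the target page -->
    <a href="/guides/https-setup">How to set up HTTPS</a>

    <!-- An image link: the alt attribute stands in for anchor text -->
    <a href="/guides/https-setup">
      <img src="/img/https-guide.png" alt="How to set up HTTPS">
    </a>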

"Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)."

This guideline is a combination of two prior guidelines. The previous version of the Google guidelines had two separate guidelines that read...

"Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages."

and...

"Submit a Sitemap using Google Search Console. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages."

The new version of the Google guidelines still covers both kinds of sitemaps (one for humans, one for search engines), but interestingly it folds the sitemap file for search engines into the same guideline as the human-readable site map page.
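For reference, the sitemap file for search engines is the standard XML sitemap format, while the sitemap for humans is just an ordinary page of links. A minimal sketch of each (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml: the machine-readable sitemap file -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
      </url>
      <url>
        <loc>https://example.com/products/</loc>
      </url>
    </urlset>

    <!-- A human-readable site map page: just a list of links -->
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/products/">Products</a></li>
    </ul>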

"Limit the number of links on a page to a reasonable number (a few thousand at most)."

Years ago the Google guidelines recommended only 100 links per page; later that guideline was updated to say "a reasonable amount of links". Today's update states "a few thousand at most".

"Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool."

This guideline previously stated...

"Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Search Console."

The changes are basically the removal (oddly) of "don't accidentally block Googlebot" and the introduction of the term "crawl budget".
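A minimal robots.txt sketch of what the new guideline describes, keeping crawlers out of infinite spaces such as internal search result pages (the paths are hypothetical):

    # robots.txt - block crawling of internal search result pages
    User-agent: *
    Disallow: /search
    Disallow: /*?q=

    # Point crawlers at the sitemap file
    Sitemap: https://example.com/sitemap.xml

The robots.txt testing tool in Search Console will confirm whether these rules block the URLs you intend and nothing else.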

"Ensure that your title elements and alt attributes are descriptive, specific, and accurate."

This guideline now adds the word "specific" to the older version of the guideline which stated:

"Make sure that your title elements and ALT attributes are descriptive and accurate."

"Follow our recommended best practices for images, video, and structured data."

The only change to this guideline is that it now explicitly says to "follow" these best practices rather than to "review" them.
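Structured data in practice usually means schema.org markup, most commonly as JSON-LD. A hedged sketch of what following that best practice looks like for an article (the values are illustrative placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "What Changed In The Google Webmaster Guidelines",
      "datePublished": "2016-01-28",
      "author": { "@type": "Person", "name": "Patrick Sexton" }
    }
    </script>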

"When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl."

This guideline formerly stated...

"If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl."

The main change is the inclusion of specific platforms, namely WordPress and Wix.

"To help Google fully understand your site's contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets that Googlebot cannot crawl, or to debug directives in your robots.txt file, use the blocked resources report in Search Console and the Fetch as Google and robots.txt Tester tools."

The main changes to this guideline are nuances that highlight the importance of CSS and JavaScript files that "significantly affect page rendering"; the former version made no such distinction. It also adds a mention of the blocked resources report available in Google Search Console.
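The practical takeaway is to make sure robots.txt is not blocking the CSS and JavaScript your pages need to render. A small before/after sketch (the directory names are hypothetical):

    # Before: rendering assets blocked, so Googlebot cannot see the page as users do
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # After: rendering assets left crawlable; only genuinely private areas blocked
    User-agent: *
    Disallow: /admin/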

"Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel="nofollow" to prevent advertisement links from being followed by a crawler."

This is the famous, and formerly cryptic paid links guideline. The former version of this guideline read...

"Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file."

I would say this is a somewhat significant change. First off, the new guideline is much clearer than the old one, and it is also the first mention of nofollow in the Google webmaster guidelines.
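Concretely, the rel="nofollow" option the new guideline mentions looks like this on an advertisement link (the URL is a placeholder):

    <!-- Advertisement link marked so crawlers do not follow it or pass ranking signals -->
    <a href="https://advertiser.example.com/landing-page" rel="nofollow">Sponsored: Try Acme Widgets</a>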

"Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text."

The main change to this guideline is stronger language about alt text. The previous version only asked that you "consider" using alt text; the new guideline says to "use the alt attribute to include a few words of descriptive text".

"Ensure that all links go to live web pages. Use valid HTML."

The former version of this guideline read:

"Check for broken links and correct HTML."

One change is simply stronger language stating to use valid HTML. The other is that rather than say "check for broken links", it states "ensure that all links go to live web pages", which is just a simpler, less technical way of saying the same thing.

"Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page."

This guideline was simply updated to remove references to outdated tools that Google formerly provided but no longer supports.

Guidelines That Were Separated (Upgraded) From Former Guidelines:

"Design your site to have a clear conceptual page hierarchy."

This guideline was formerly half of the old Google guideline that stated...

"Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link."

The addition of "conceptual" was interesting.
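One common way to express a clear conceptual hierarchy is through navigation and URL paths that move from general to specific. A hypothetical sketch:

    example.com/                                   (home)
    example.com/products/                          (category)
    example.com/products/boots/                    (subcategory)
    example.com/products/boots/waterproof-hiker    (individual page)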

Guidelines That No Longer Exist:

"If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few."

This guideline was removed, likely because dynamic pages are pretty standard now.

"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."

This guideline was removed. The new robots.txt guideline hints at it, but this guideline as a whole is now gone.

Forum discussion continued at Google+.

 
