Google: SEOs Can Help Shape Client Policies & Decisions On AI Bots

Oct 3, 2024 - 7:31 am
Filed Under Google


Google's John Mueller said that SEOs are in a great place because they understand how crawlers and their controls work, and they can help their clients shape their AI policies and decisions as they navigate this new era of AI bots.

John Mueller wrote on LinkedIn, "This intersection of AI & SEO puts you all (technical SEOs!) into a great place to help shape policies / decisions for & with your clients." He added, "You know how these control mechanisms work, you can choose to use them, and help folks to decide what makes sense for them."

I like how he worded this next line, saying, "The robots.txt gives you a lot of control (over the reasonable crawlers / uses -- for unreasonable ones, you might need to dig deeper into, or use a CDN/hoster that lets you block them by request type), you can even make your robots.txt disallow all by default if you want." I mean, he did not say "full control" but "a lot of control." Because, no, it does not give you full control. In some cases, if you want to block AI Overviews, you need to block all of Google Search. There are other AI bots and crawlers unrelated to Googlebot. And then there are the countless up-and-coming AI engines with bots all over the place.
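To put that in concrete terms, here is a minimal robots.txt sketch that blocks several of the commonly documented AI crawler tokens while leaving regular Google Search crawling alone. Treat the token list as illustrative rather than complete (it changes often) and check each vendor's documentation; also note that Google-Extended only controls whether your content is used for Gemini training -- it does not pull you out of AI Overviews or Search:

    # Block common AI crawlers (list is illustrative, verify each token with the vendor)
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: PerplexityBot
    Disallow: /

    # Opts out of Gemini training only; Googlebot and AI Overviews are unaffected
    User-agent: Google-Extended
    Disallow: /

    # Regular search crawling stays open
    User-agent: *
    Allow: /

Compliant crawlers follow the most specific user-agent group that matches them, so the catch-all group at the end does not undo the blocks above it.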

John wrote more; here is his full set of comments:

This intersection of AI & SEO puts you all (technical SEOs!) into a great place to help shape policies / decisions for & with your clients. You know how these control mechanisms work, you can choose to use them, and help folks to decide what makes sense for them.

The robots.txt gives you a lot of control (over the reasonable crawlers / uses -- for unreasonable ones, you might need to dig deeper into, or use a CDN/hoster that lets you block them by request type), you can even make your robots.txt disallow all by default if you want. Help the person running the site to make a decision (this is the hard part), and implement it properly (you definitely know how to do this).

These new systems access the web in a way similar to search engines, which you (I assume) know how it works & how to guide it. The controls are similar (sometimes the same) to those for search engines, which you know how they work & can use thoughtfully. What these new systems do with the data is sometimes very different, but it's learnable (also, it changes quickly). You know what you want from search engines ("why do SEO? XYZ is why"), you can extrapolate from there if the new systems give you something comparable, and use that to decide how you interact with them. You're (as a technical SEO in particular) in a good position to help make these decisions, and you're definitely the right person to implement them. (And of course, your clean technical SEO foundation will make anything that these new systems do easier, crawling, internal links, clean URLs, clean HTML, etc -- if you choose to go down that route.)

And finally, you hopefully have a lot of practice saying "it depends", which is the basis of all technical decision making.
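To illustrate the "disallow all by default" option he mentions, a robots.txt can block every compliant crawler and then explicitly allow only the bots you decide to keep -- as above, the named groups take precedence over the catch-all block. The allowed agents below are just an example policy, not a recommendation:

    User-agent: *
    Disallow: /

    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /

And for the "unreasonable" crawlers that ignore robots.txt, the blocking has to happen at the server or CDN layer, as he notes. A rough sketch of what that can look like inside an nginx server block, assuming the user-agent patterns are ones you have actually verified in your own logs:

    # Refuse requests from user-agents you have chosen not to serve (patterns are illustrative)
    if ($http_user_agent ~* "(Bytespider|ExampleBot)") {
        return 403;
    }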

Are clients coming to you and asking how to deal with this?

Forum discussion at LinkedIn.

Note: This was pre-written and scheduled to be posted today; I am currently offline for Rosh Hashanah.

 
