Google: Don't Crawl Your Site To Build A Sitemap File

Oct 30, 2019 - 8:59 am


I never understood why people would use a crawler to build their XML sitemap file. I suppose you might do that if you can't read your CMS's database directly, but it seems so inefficient. John Mueller of Google said as much on Reddit.

John Mueller said, "Automate it on your backend (generate the files based on your local database). That way you can ping sitemap files immediately when something changes, and you have an exact last-modification date. Don't crawl your own site, Google already does that."

So make sure you rebuild your XML sitemap file from the data your database already holds. Do not crawl your site to make your sitemap file, because (1) you may miss pages and (2) it puts unnecessary load on your server.
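To illustrate the backend approach Mueller describes, here is a minimal sketch in Python that renders a sitemap straight from database rows. The `pages` list, URLs, and dates are all hypothetical stand-ins for what a real CMS query would return; the point is that the exact last-modification date comes from the database, not from a crawl.

```python
from datetime import date
from xml.etree import ElementTree as ET

# Hypothetical rows as they might come from a CMS database:
# (URL, last-modified date). In a real backend these would be the
# result of a query over the pages table, not a hardcoded list.
pages = [
    ("https://example.com/", date(2019, 10, 30)),
    ("https://example.com/about", date(2019, 10, 1)),
]

def build_sitemap(rows):
    """Render an XML sitemap string from (url, lastmod) pairs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in rows:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # Exact last-modification date straight from the database --
        # the detail a crawler-built sitemap cannot know precisely.
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(pages))
```

Because the sitemap is rebuilt whenever a record changes, you can regenerate and notify search engines immediately instead of waiting for a scheduled crawl of your own site.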

Be efficient and smart with your resources.

The good thing is that most CMS platforms these days already do this.

Forum discussion at Reddit.
