Bing: You Can Have Up To 125 Trillion URLs In Sitemaps

Jun 11, 2014 • 8:23 am | Filed Under Bing SEO

Fabrice Canel, the Principal Program Manager of the Bing Index Generation team, posted Bing's sitemaps best practices guide for large web sites.

Bing says it can support up to 125 trillion links through nested XML sitemap files. A single sitemap file can list up to 50,000 URLs, and a sitemap index file can reference up to 50,000 sitemap files, which brings you to 50,000 x 50,000 = 2,500,000,000 links (2.5 billion). If you need more URLs, Bing allows a second level of sitemap index files, i.e. index files that point to other index files, which multiplies that by 50,000 again and leads to the 125 trillion number.
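For reference, here is roughly what that nesting looks like in sitemap XML; the example.com file names are hypothetical, and the index-files-pointing-to-index-files layer is the Bing capability described above, not part of the core sitemaps.org protocol:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- top-level index: can reference up to 50,000 child files -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-index-1.xml</loc>
        <lastmod>2014-06-11</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-index-2.xml</loc>
      </sitemap>
    </sitemapindex>

Each child index file in turn references up to 50,000 plain sitemap files, and each of those lists up to 50,000 URLs.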

Bing, however, recommends against listing that many URLs. Bing will rarely index them all, so list only the URLs that are important to the site.

The total size of a site's sitemap XML files can exceed 100GB. For really large sites, Bing recommends you take things slow:

To mitigate these issues, a best practice to help ensure that search engines discover all the links of your very large web site is that you manage two sets of sitemaps files: update sitemap set A on day one, update sitemap set B on day two, and continue iterating between A and B. Use a sitemap index file to link to Sitemaps A and Sitemaps B or have 2 sitemap index files one for A and one for B. This method will give enough time (24 hours) for search engines to download a set of sitemaps not modified and so will help ensure that search engines have discovered all your sites URLs in the past 24 to 48 hours.
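One way to read that recommendation is to keep both sets behind a single top-level sitemap index and only regenerate one set's files on any given day. A minimal sketch, again with hypothetical example.com file names:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- set A: updated on day one, day three, ... -->
      <sitemap>
        <loc>http://www.example.com/sitemap-index-a.xml</loc>
        <lastmod>2014-06-10</lastmod>
      </sitemap>
      <!-- set B: updated on day two, day four, ... -->
      <sitemap>
        <loc>http://www.example.com/sitemap-index-b.xml</loc>
        <lastmod>2014-06-11</lastmod>
      </sitemap>
    </sitemapindex>

Whichever set wasn't modified today stays stable for at least 24 hours, giving crawlers a full day to fetch it before it changes again.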

Check out the post for all the details, but keep in mind that Google's best practices may differ on some of these points, and it isn't always recommended to create search-engine-specific sitemap files.

Forum discussion at WebmasterWorld.

Comments:

Yo Mamma

06/11/2014 01:04 pm

I can see Bing partnering with Apple and others like Elon Musk of SpaceX and Tesla against Google. In fact Tesla cars specifically don't use Google maps. Apple and Tesla are cool products Google was cool for about 5 years about 15 years ago

Alexander Hemedinger

06/11/2014 01:10 pm

Honestly I just want to be able to feel self-worth on doing seo efforts and focusing on other search engines than just Google. Hopefully soon enough perhaps the search volume will expand to other sources..... who knows?

F1 Steve

06/11/2014 04:18 pm

@barry matt cuts twitter been hacked??

F1 Steve

06/11/2014 04:21 pm

nevermind I clicked the wrong one, google "matt cutts twitter" https://twitter.com/rnattcutts

F1 Steve

06/11/2014 04:22 pm

Is that new? I have never seen that before? Right underneath matts twitter feed in the serps!

wertwert

06/11/2014 07:50 pm

@Bing... this isn't what you need to compete... this feature is a Football Bat... undercut Google's PPC cost by a lot. Support webmasters with honest transparency. Make your campaign management tools better. Let organic results have quality above the fold placement. If you give something worthy to webmasters then webmasters become your evangelists. Google forgets how it came to power. The key to competing is recognizing what Google has forgotten. Stop the website abuse that Google started and do something engaging for the webmaster community to foster growth and success. Let google play bad site whack-a-mole while you make new lasting relationships with webmasters. Searchers don't make content. Searchers don't have ad spends... websites do... two thirds of your model doesn't need to be neglected or treated like shit. Promote freedom online... let people link how they may... you may not count all links but no links should ever incur punishment. punishable links are weapons... disarm the blackhats by not punishing the bad... reward the good instead and feel good doing it.
