Google Panda Impacting Your Mega Site? Use Sitemaps To Recover?

Dec 11, 2013 • 8:26 am | comments (10) | Filed Under Google Search Engine Optimization
 

A WebmasterWorld thread discusses how to determine which sections or categories of your web site are impacted by Google's Panda algorithm.

Panda has not gone away; sites are still suffering from poor Google rankings after being hit by the algorithm. New sites are hit monthly, and some sites are released from its grip as well.

The thread describes one technique large sites can use to determine which sections of their sites are impacted by Panda.

The concept is to use XML sitemaps, breaking the sitemap files along the logical structure of the web site. Then, once Google processes all the files, Webmaster Tools will quickly show you how many pages within each sitemap file were indexed and how many were not.
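To make the technique concrete, here is a minimal sketch (mine, not from the thread) of generating one sitemap per site section, capped at 100 URLs per file as the webmaster quoted below did. The urls.txt input, the file naming, and the first-path-segment section rule are all illustrative assumptions:

# A minimal sketch of the diagnostic: split a site's URLs into one
# XML sitemap per section so Webmaster Tools reports indexed counts
# per section. The urls.txt input, file names, and section_of() rule
# are assumptions for illustration, not from the thread.
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

def section_of(url):
    """Assume the first path segment identifies the site section."""
    path = urlparse(url).path.strip("/")
    return path.split("/")[0] or "root"

def write_sitemap(filename, urls):
    with open(filename, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

sections = defaultdict(list)
with open("urls.txt") as f:  # one URL per line
    for line in f:
        url = line.strip()
        if url:
            sections[section_of(url)].append(url)

for name, urls in sections.items():
    # Cap each file at 100 URLs so indexed-vs-submitted gaps
    # are easy to spot per file.
    for i in range(0, len(urls), 100):
        write_sitemap(f"sitemap-{name}-{i // 100 + 1}.xml", urls[i:i + 100])

Submit the resulting files in Webmaster Tools (individually or via a sitemap index) and compare the submitted versus indexed counts for each; sections with unusually low ratios are the ones to investigate.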

One webmaster explained how he went about this technique:

The sector I was performing in allowed me to create dozens of sitemaps of 100 pages each. There was no reason why any of the pages should not be indexed. I found some sitemaps with 0 pages listed, and others ranging from 25 up to the full 100. I then discovered trends, i.e., pages with similar title tags and URLs. (The on-page content was considerably different, which is why I did not remove them initially.)

I then ran different experiments on each sitemap group until I saw a recovery, then applied the solutions across the board.

The question I have, and I am not certain of the answer: I thought that with sites impacted by Panda, the pages are indexed but simply don't rank as well. Meaning, if a page is not indexed, that is more of a crawl budget and PageRank issue (or a server setup issue) than a Panda issue. With Panda, the content has to be indexed for Google to know not to rank it well. Am I wrong?

Forum discussion at WebmasterWorld.

Image credit to BigStockPhoto for abstract image

Comments:

Giuseppe Pastore

12/11/2013 02:08 pm

Definitely not a Panda issue. Having unindexed pages on a big site is an architecture-related problem: maybe the site is too flat, with PageRank flowing from the home page to too many pages, or it has too many levels with too little internal PR flowing down, so the final leaves remain unspidered because of very low scheduling priority, or have not yet been found and added to the scheduling process.

Michael Martinez

12/11/2013 02:52 pm

Panda downgrades directly impact crawl, according to Matt Cutts (you may have quoted him on that yourself at some point, either here or at Search Engine Land). The method described above is crude, but it seems like it would be effective to some degree, at least on a large site. Panda does look at duplicate content signals, although those may not be the only signals it looks at.

JustProduct

12/11/2013 03:17 pm

" I thought sites impacted by Panda, the pages are indexed but they don't rank as well. " Absolutely. Obviously the guy from WMW is SEO or reads too much of SEO stuff )

Nicholas Chimonas

12/11/2013 03:47 pm

If the issues he repaired were title tags, URL structure, and other duplicated on-page fundamentals, I'd rather just crawl the site with Screaming Frog and sort by titles, meta descriptions, etc. Discovering that by creating several XML sitemaps and waiting for Webmaster Tools crawl reports sounds much more tedious. Again, if index status is an issue, I'd lean towards internal architecture mishaps and sort pages by the number of links pointing to them. I've seen Panda hit pages which were still indexed, but simply lowered in ranking. If you create proper segmentation in Google Analytics, you can correlate the exact date of the Panda update which affected the pages in question. A much simpler and more succinct solution, in my opinion.
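For anyone who wants to script the duplicate-title check Nicholas describes, here is a quick sketch. It assumes a Screaming Frog crawl export named internal_all.csv with "Address" and "Title 1" columns; column names vary by version, so adjust to your export:

# A quick sketch of the duplicate-title check described above,
# assuming a Screaming Frog CSV export with "Address" and "Title 1"
# columns (an assumption; adjust to your export's actual headers).
import pandas as pd

crawl = pd.read_csv("internal_all.csv")
# Flag every row whose title appears more than once in the crawl.
dupes = crawl[crawl.duplicated(subset=["Title 1"], keep=False)]
print(dupes.sort_values("Title 1")[["Address", "Title 1"]])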

anon

12/11/2013 05:40 pm

Cool tip, but this doesn't sound like a Panda thing. It sounds like a general site quality thing. Google has had site quality evaluation and link evaluation algorithms before and after Panda and Penguin; there's no reason why we need to always attribute phenomena to those algorithms.

xoxo

12/11/2013 09:14 pm

This is the quality of "site quality evaluation and link evaluation algorithms" :)

Jack

12/12/2013 05:02 am

Most of the pages from my site are indexed properly with no errors. Though I have optimized my content again after the Panda algorithm and have also disavowed most of the spam links, my website is still suffering. I think Penguin is the major factor behind this issue. But thanks for the above tips; I will try this out too.

Dharmik

12/12/2013 05:04 am

So what do we have to do in that case? I mean, if we have a site that is not ranking well even though it has content and is regularly cached by Google, and the ranking still doesn't improve, what can we do?

Guest

12/12/2013 01:13 pm

No offence, but when you are hit by Panda, your WHOLE website is hit, not just a part of it. You will have the occasional random article still doing well, but that's more due to Google being dumb than to any segment of the website thriving.

mathewmakio

12/12/2013 01:14 pm

Experience: 3 years on my damn resume!
