Google Can Tell Us If Our Sites Are Impacted By Panda Or Penguin But They Don't

Nov 20, 2013 • 9:10 am | Filed Under Google Search Engine Optimization

I spotted a very interesting comment by Google's head of search spam, Matt Cutts, at Hacker News. The obvious takeaway is that Matt Cutts outs thecupcakeblog.com as being negatively impacted by the Panda algorithm.

More shocking to me, based on my conversations with Google's search quality people, is that Google clearly knows whether or not a site is impacted by an algorithm.

In August, Google launched the manual action viewer, mostly to appease webmasters who want to know if their site has a penalty or action. But this only covers manual penalties, issued by Google representatives with the click of a button.

It does not include details on whether the site was hurt by an algorithm such as Panda, Penguin or others.

I want Google to release an "automated action viewer" to show how much your site is impacted by algorithmic updates. I had imagined that would be hard to build.

I thought that all sites are impacted by all algorithms on some level, just some more than others. But when it comes to Penguin or Panda, it doesn't appear to be a matter of degree; it appears to be all or nothing.

When Matt Cutts says, "looking at the site in question, it has been affected by Google's Panda," it makes me scratch my head. Does Google have a backend tool for themselves to see this? If so, can they dumb it down a bit and add it to Webmaster Tools?

It pains me to see so many sites struggling to figure out what is hurting their sites in Google when nothing shows up in the manual actions viewer.

Don't get me wrong... Google's transparency over the years has grown tremendously. But this one thing would be gold for most small webmasters who are lost and being told by "SEO experts" or companies things that may not be true. I see so many webmasters chasing their tails - it pains me.

Forum discussion at Hacker News.

Comments:

StevenLockey

11/20/2013 02:24 pm

I think it's a sliding scale. Showing the exact rating might be a bad idea, but if a rating is really low, at least letting the webmaster know that wouldn't be a bad idea, and it probably wouldn't help the people gaming the SERPs much.

Michael Martinez

11/20/2013 02:34 pm

"Does Google have a backend tool for themselves to see this? If so, can their dumb it down a bit to add to Webmaster Tools?" They can look at all the factors that affect a site's performance in their search results. In 2007 the New York Times reported they have a program they call "Debug" that "shows how its computers evaluate each query and each Web page." Even if "Debug" is no longer in use, they would only have replaced it with something better.

Greg Fowler

11/20/2013 02:48 pm

I would believe that if I had an algo which produced results like Google's top ten, then sure, I would have data to determine a site's performance. Let's see: Google stores every single query indefinitely, keeps track of all links indefinitely, and has cached results of probably every single website ever. The cupcake site has nice pictures of cupcakes, but it certainly doesn't take a PhD in machine learning to realize that its 200-word descriptions, and the fact that almost every single URL has "cupcake" in it, would dampen the user experience, and that Google would decide, "Hey, we have another cupcake site which we think is better."

Durant Imboden

11/20/2013 03:07 pm

The real question has never been "Have I been hurt by [name of animal]?", it's been "Why, to what extent, and what can I do about it?"

ethalon

11/20/2013 03:11 pm

There is a delicate balance between transparency for those who are just going about their business trying to follow the guidelines, and handing out details about why a site was penalized algorithmically to everyone regardless. Why would Google want to give black hats and spammers more insight into why a specific site was hit algorithmically? It would just make the battle that much harder and faster moving.

mastersat

11/20/2013 03:17 pm

Many webmasters are abandoning their sites because of unknown Google updates.

Marie Haynes

11/20/2013 03:19 pm

This doesn't necessarily mean that Google has a hidden sign on sites saying, "Panda" or "Penguin". It's pretty obvious that this site had Panda issues. When I read the thread yesterday and saw that the site consisted only of short posts that just copied other people's content my first thought was that it was prime Panda material. If you look on SemRush.com you'll see that there is a big drop that is quite typical of Panda hit sites. It happens after February 2012, so the site was probably impacted in the Feb 27, 2012 Panda update.
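Marie Haynes' diagnostic method here — find the sharpest drop in a site's traffic history and line it up against the public list of known update dates — is simple enough to sketch in code. This is an illustrative sketch only, not anything Google or SemRush provides: the update dates below are a small hand-picked subset of the publicly reported Panda history, and the 14-day matching window is an arbitrary assumption.

```python
from datetime import date

# A small illustrative subset of publicly reported Panda update dates;
# the real list is much longer.
PANDA_UPDATES = [date(2011, 2, 23), date(2012, 2, 27), date(2012, 4, 19)]

def biggest_drop(traffic):
    """traffic: list of (date, weekly_visits) tuples sorted by date.
    Returns (date, fractional_drop) for the sharpest week-over-week decline."""
    worst = (None, 0.0)
    for (d1, v1), (d2, v2) in zip(traffic, traffic[1:]):
        if v1 > 0:
            drop = (v1 - v2) / v1
            if drop > worst[1]:
                worst = (d2, drop)
    return worst

def nearest_update(drop_date, updates=PANDA_UPDATES, window_days=14):
    """Return the known update date closest to the drop, if within the window."""
    best = min(updates, key=lambda u: abs((u - drop_date).days))
    return best if abs((best - drop_date).days) <= window_days else None

# A weekly traffic series that collapses right after Feb 27, 2012 would
# point at that Panda refresh, matching the pattern described above.
series = [(date(2012, 2, 13), 1000), (date(2012, 2, 20), 980),
          (date(2012, 2, 27), 990), (date(2012, 3, 5), 400)]
when, frac = biggest_drop(series)
print(when, round(frac, 2), nearest_update(when))
```

Of course, this only suggests correlation; as the commenters below note, it tells you nothing about *which* signal caused the drop or how to fix it.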

JustConsumer

11/20/2013 03:35 pm

"Does Google have a backend tool for themselves to see this?" Absolutely. "Can they dumb it down a bit to add to Webmaster Tools?" Absolutely not. I use an algo to interact with contributors, and I can tell you for sure that there will always be those who want to reverse engineer your algo. The best way to avoid it is to keep the algo details secret. I can understand Google in this case. By the way, WebmasterWorld was the main venue where Google's algo updates were cracked on a regular basis. As far as I remember, it took about two months to crack Florida. Now imagine, Google released updates once every 3-4 months, but it took

Michael Davis

11/20/2013 03:38 pm

This is surprising, and not surprising at the same time. If Google truly wanted people to create the best possible website with the best possible content, why wouldn't they tell them if they're doing something wrong? For those of us who read blogs like this and stay up to date with changes in search, it might seem completely obvious why a page has been slammed by an algo update (though sometimes not), but the majority of website owners wouldn't have a clue. This type of inconsistent messaging is what drives me nuts about Google. Why would they let people know if there was a manual penalty on their site but not an algorithmic one? Help make the web a better place and stop playing games! I for one would love it if Google put an end to the forum post: "I think my site was hit by Panda/Penguin!" Ethalon: You can say the same about manual penalties.

ethalon

11/20/2013 03:47 pm

I don't think you can say the same about manual penalties. A manual penalty would imply that they have taken a look at the site and can decide if they want to push a notification forward. I would be surprised if the web spam team sends out a notification to an obvious churn and burn spam site. It's just my opinion, so I may be completely off base on that.

xoxo

11/20/2013 04:07 pm

It's a very good idea, but I don't think it will happen. Google is now only about preventing competition and $$$.

Pixelrage

11/20/2013 04:12 pm

This is of little concern to Google for one major reason: the biggest 'demographic' hurt by these algorithms are small businesses and at-home webmasters, which are also the demographic that spends the least (if that much) on AdWords and relies entirely on organic. You're not hearing Walmart, eBay, Amazon or Target complain about Panda or Penguin (all of the above being sites primarily with thin content or content scraped from manufacturer sites)

Durant Imboden

11/20/2013 04:19 pm

Google probably isn't as concerned with having people create the best possible Web sites as it is with returning the best possible search results. If wallys-widgets dot com gets knocked to the bottom of the SERPs by Panda or Penguin, how does that hurt Google or Google's searchers? From Google's point of view, a better site will--or at least should--rise to take the victim's place. Still, your comment about "inconsistent messaging" makes sense. As a non-Googler, I can't help wondering why Google would provide messages and tools for spammers (people who have earned manual penalties) while leaving possibly innocent Panda and Penguin victims in a state of ignorance. Google seems to be going halfway, but it's offering assistance to the wrong half.

Chase Anderson

11/20/2013 04:19 pm

Plenty of much larger sites were impacted by Panda and Penguin, although none you listed. With that said, the mom and pop shops that were hit were hit because they've obviously been trying to manipulate the search results. Sites were hit because of their ignorance, but that doesn't make Google the bad guy; it means people should get educated or stop playing games with their businesses. It's just like a mom and pop restaurant: if you don't adhere to policies in the food industry, you're going to go out of business.

Pixelrage

11/20/2013 04:30 pm

When you say "manipulate search results," I'd think you're talking about major corporations who buy dofollow links and hire armies of article-writing monkeys producing fluff copy with backlinks. That's my definition of 'manipulating search results,' not innocent webmasters who lost their rankings (and their BUSINESS) overnight because they weren't big enough. It's insane to even say that everyone who got hit by these algorithms was manipulating search engines -- the vast majority who were hit couldn't even scratch the surface of manipulation at the level the big corps do it (and continue to get away with it). I will never understand the mentality of anyone who refuses to admit that there is major, unmistakable brand bias on Google that has been going on since 2011.

Chris Gedge

11/20/2013 04:55 pm

Why would Google want to help you to fix your organic penalty instead of buying Adwords ads?

StevenLockey

11/20/2013 04:57 pm

Because it's pure BS. Quite a few of our sites outrank major brands. Why? Because we made sure they had high-quality content. The only reason some brands do very well is that their websites have a lot of high-quality content and a lot of inbound links because of that content. They spent the money to generate that content, money a small business may not have, but it's still about the quality of the site regardless of how it was generated. The small businesses who can't compete with the brands in the SERPs can't do so simply because their content isn't as good. The only way what you call the 'brand bias' goes away is if Google starts rewarding low-quality content, which they aren't going to do!

ethalon

11/20/2013 05:07 pm

You list WalMart, Ebay, and Target... do you really think they need to hire hordes of writers to rank? No, they don't. Any writers they hire are there to saturate the market (be it brick and mortar or online) with their name. I would go as far as to say they rank well because their sites are engaging and funnel users around the site and not out of it. Their interior pages rank for queries because Google has seen that users are happy to enter a WalMart result and spend time there. WalMart will rank because it is a brand people search for. I wouldn't be surprised if the vast majority of their traffic is direct, or from e-mail blasts, or from people searching on "garden hose walmart". The same applies to Ebay and Target. Do they buy a ton of AdWords? Probably... but that is because they buy a ton of all kinds of advertising: glossy newspaper fliers, radio spots, TV ads, product/brand placement in film and television... and online advertising. This isn't some 'Google favors brands because brands can buy the most AdWords' scheme... if AdWords never existed, WalMart, Target, and Ebay would still dominate the SERPs more and more as more customers get comfortable browsing and shopping online. Stop thinking that ecommerce is the same as it was 10 or even five years ago; it isn't. Continuing to moan about big brands ranking well is just as effective as a hardware store moaning that WalMart moved into town and took business away. Don't like the situation? Refuse to buy from WalMart and frequent another retailer. Surprise: it is the people who choose to shop there who put them on top (I try to avoid WalMart because I don't agree with how I feel they abuse the social welfare state to pay their employees less than a living wage)... until everyone does the same, get used to seeing them in the SERPs.

ethalon

11/20/2013 05:08 pm

Do they really offer insights into the obviously spammy churn and burn sites? That is an assumption you can't know the answer to without working on the web spam team, and it sort of shapes your entire argument.

Chase Anderson

11/20/2013 05:17 pm

It's not just about 'good content,' but the point is well said. You're not going to outrank Walmart for 'walmart' or any other terms they have a major hold on. But you can beat Walmart on longer-tail terms for products, even if you're a small mom and pop. As a small business competing for long-tail terms, you should be profitable with only a few long-tail terms at a time. If you need more than that, find other ways to attract visitors outside of Google. You'll be better off in the long run anyway.

Korenl

11/20/2013 06:08 pm

Only if the rules are clear and all the participants are treated the same way, which is not the case with Google. So yes, it is a bad guy who only cares about its own business. Why should we care about Google? Check the new results, where empty pages from Amazon and similar services fill the top spots. Last week I wanted to find a doctor, purely as a user, and I had to switch to Bing because the Google results were either irrelevant, or relevant but showing only sites that funnel traffic to the doctor's site I was looking for. In Bing I got 5 doctors in the top 5, which were all good results for my query.

Juggernart

11/20/2013 06:21 pm

What's so shocking and/or surprising? It would be pretty stupid to develop incredibly complex algorithms for Panda and Penguin without having 100% control over them. Google surely has backends to see what's going on with a site. How else would the many manual actions be triggered (automatic flags go red --> human reviews)? Google knows all about your site.

Anonymous

11/20/2013 06:53 pm

Pedantic anonymous comment alert: 1. "I spotted a very interested comment" 2. " Does Google have a backend tool for themselves to see this? If so, can their dumb it down a bit to add to Webmaster Tools?" Furthermore, I'm not sure why you find it in the slightest bit surprising. You have sat in on numerous hangouts with JM where he has done exactly what you have alluded to in this article - i.e. looking up a site and seeing via some sort of backend that a site was being hurt by Panda/Penguin. As someone who also sat in on a few of those, the content of this article is more "welcome to 6 months ago Barry" than in any way shocking.

Nick

11/20/2013 07:19 pm

I couldn't agree more with this article. I can accept the fact that in order to make an omelet you have to break some eggs. In Google's case, it's like asking an electrician to fix a broken socket and having him bring down your whole wall to do so.

Durant Imboden

11/20/2013 07:36 pm

I didn't say anything about "churn and burn" sites specifically. I referred to "people who have earned manual penalties." Notification of such penalties through "Manual Actions" in Webmaster Tools is more information than targets of Panda and Penguin are receiving.

ethalon

11/20/2013 08:38 pm

My point was, don't you think the manual spam action team (I hope they have a better way of self-identifying than what I just supplied) would have a pretty good idea as to which people get the description pushed to them? Some manual actions may be simple things where bringing them to the attention of the site owner is no big deal; for others, they may want to keep the specifics to themselves. It may be completely automated and I may be completely off base, but you would think they have a good idea about what information they are giving to whom.

Mike Pannell (Dallas Realtor)

11/20/2013 10:22 pm

AMEN... I've been trying to recover since the first Penguin. It's kind of getting old, and it is hurting our business.

incrediblehelp

11/20/2013 10:46 pm

"secret secrets are no fun, secret secrets hurt someone"

Chase Anderson

11/20/2013 10:57 pm

How are the rules unclear? The number of significant changes to Google's Webmaster Guidelines in the last 2-3 years I could count on one hand. No one forced you to care about Google but yourself.

Durant Imboden

11/20/2013 11:59 pm

You raise some interesting points. Also, I wonder how many "churn and burn" sites are listed by their owners in Google Webmaster Tools? I'd imagine that most of the "churn and burn" crowd would rather stay below Google's radar.

Mark Melgie

11/21/2013 12:50 am

Same here, Mike... I think this will take forever...

Gautam Jain

11/21/2013 05:21 am

Google may be able to say if it was Panda or Penguin, but I think it is very hard for them to tell which exact signal is causing your site to rank low. Google uses more than 200 signals. One signal may be working in your favor while another may run completely contrary to your efforts.

Sam

11/21/2013 06:38 am

Does that mean that when a Google rep helps on the Google forum and says a site has thin content, it might not be speculation but rather from the backend? Could they have access too?

Rahul Trivedi

11/21/2013 06:46 am

Plenty of websites were hit by Penguin and Panda, but only 5% of them got the manual penalty message.

romanUK

11/21/2013 06:59 am

Not all reps, but some for sure. I know of a case where a webmaster was struggling to lift his penalty (penguin) and it finally took JohnMu to pinpoint the exact link to help solve the riddle.

Gaurav Srivastava

11/21/2013 07:52 am

Is this the revenge policy of Google?? lol

Gaurav Srivastava

11/21/2013 07:53 am

Point.

Jorge Gonzalez

11/21/2013 08:50 am

Where did "Don't be evil" go?

Craig Hamilton-Parker

11/21/2013 09:21 am

I got hit by the first Panda and am still getting nowhere. I've been working full time on trying to fix it ever since; such a waste of time and energy that could go into something more worthwhile. It's probably something simple, but I'd get straighter answers from the Sphinx than from Google's official advice.

Jitendra Vaswani

11/21/2013 10:23 am

Is there any tool to find out which algo hit a site?

David Whitehouse

11/21/2013 10:27 am

Great idea Barry, I love it. Hopefully as a result of this post we'll see something similar to what you are suggesting in the near future. It would certainly help if Google were a bit more transparent.

David

11/21/2013 10:33 am

Is it true?

Simon Allen

11/21/2013 11:55 am

I'm as pained and frustrated as I think you are Barry, we have lots of micro businesses using our platform; some of the recent updates and the new algorithm have seen some of them lose 90% of their sales. It's difficult to give advice other than to do more of everything else in their marketing mix - it's a tall order for most 'one man band' webmasters.

Hazel

11/21/2013 12:56 pm

Rahul Trivedi SEO Expert is back :-D

Gregory Smith

11/21/2013 01:50 pm

I would love to see this happen Barry. It's a much needed item...

Gregory Smith

11/21/2013 02:02 pm

Either way, my guess would be that it would still be coined as "debug"..

Jérôme Verstrynge

11/21/2013 03:30 pm

Squeezing lemons does not produce more lemons. Google is putting more effort into fighting spammers than into empowering webmasters. We need proper feedback from Google to create the gold they need to feed their business model. Right now, they are killing all possibilities of creating win-win deals. They need to own this. I see two possibilities: i) they get smart about it, that is, they get over their excessive information-retention reflex and relax the muscle; ii) webmasters give up and stop producing the quality Google needs to serve to users, Google feels the pain and is forced to review its position. The current situation is unsustainable in the long term, for everyone. Let's hope someone at the top with enough vision understands this...

Mozalami

11/21/2013 03:59 pm

I agree: "It pains me to see so many sites struggling to figure out what is hurting their sites in Google when nothing shows up in the manual actions viewer." Even tools such as Barracuda Digital's cannot keep up, and the "Chartelligence" extension for GA is out of date. I would love to hear about tools like this that stay up to date, something like the Moz change history but interactive with GA data.

Judith

11/21/2013 04:05 pm

Chris, you nailed it! g##### couldn't care less about helping -- especially all the small Mom and Pop shops that don't have four figures to hire a pro to find that "one" link that is wreaking havoc. Nothing they did was so heinous as to warrant being penalized like they have been. Nor do they have the AdWords budgets to compete with the big boxes who muck up the system with irrelevant ads that link to irrelevant pages and put the bids out of reach. I know of many small Mom and Pop sites that were exiled to oblivion that have great looking sites, never bought links, have excellent reputations with their customers, create great content and are secure. But no matter. g##### didn't like something about those sites (something no one can seem to explain), nor will they (or their minions) admit collateral damage. Not only is SEO dead as those of us who have been around from the start knew it; so is "do no evil".

tomshark

11/21/2013 04:10 pm

I see this as another indicator that online marketers are over dependent on Google.

StevenLockey

11/21/2013 04:54 pm

Well, you aren't going to outrank them on their name, but on many other strong, short keywords you can, regardless of how much of a brand they are, if your website is better than theirs.

Ashish Ahuja

11/21/2013 07:15 pm

No "SEO expert" in his user ID now, though. Back to reality.

Mark Hammersley

11/21/2013 09:03 pm

Very interesting. They must have a penalty flag against a URL so they can see what Panda is doing. It makes sense.

xoxo

11/22/2013 01:09 am

Google scroogled themselves :)

Sandy Albaytar Jr.

11/22/2013 05:21 am

If your SERP position for a specific KW suddenly drops, how long should you wait before you consider it a long-term drop? I mean, when can you say that it is not a false alarm? I'm probably asking a stupid question because I have a feeling the answer is obvious: you can't figure it out. Our web traffic is OK, but I should probably brace for a drop, seeing that a page we created for a star KW went suddenly missing from the SERPs today (not even in the top 100, when previously it was on the first page). Instead, there's our homepage and another, different page on page 2. And yet when I search for the same KW while restricting results to only pages of our site, that missing page comes out first. So confusing. How are we supposed to set things right when we don't even know what went wrong in the first place? Oh yeah... sure... go figure!

Gracious Store

11/22/2013 06:27 am

I wish Google could feel the pain of the webmasters who are closing down their sites out of sheer frustration.

Spook SEO

01/11/2014 03:51 pm

If only Google understood the way webmasters feel about what they do. Yes, definitely, they know when a page has been hit by Panda or Penguin, but they won't tell. We all know Google; they have a lot of tricks :)
