Google's Matt Cutts On Telling If Your Site Was Hit By Algorithm

Mar 25, 2014 • 8:23 am | comments (42) | Filed Under Google Search Engine Optimization
 

The truth is, for an experienced SEO, this video sheds no new light on the question of how to determine whether your site was hit by an algorithm.

In short, the best way to tell if you were hit by a Google algorithm such as Panda or Penguin is to check your analytics and see if you had a major dive in traffic from Google on a specific day. If so, write down that date, go to our Google updates section here, and see if the date corresponds with anything reported there. If not, then you are out of luck. Well, not exactly.
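
To make that analytics check concrete, here is a minimal sketch in Python, assuming a CSV export of daily Google organic sessions (the file name, the column names, the 30% threshold, and the two-entry update list are all illustrative stand-ins; a real list would be filled in from an update-tracking source such as the Google updates section mentioned above):

```python
import pandas as pd

# Partial, hand-maintained list of confirmed update dates; extend it
# from an update-tracking source.
KNOWN_UPDATES = {
    "2012-04-24": "Penguin 1.0",
    "2013-10-04": "Penguin 2.1",
}

# Hypothetical export of daily Google organic traffic with columns
# "date" and "sessions".
df = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")

# Compare each day against the average of the previous seven days.
df["baseline"] = df["sessions"].rolling(7).mean().shift(1)
df["change"] = (df["sessions"] - df["baseline"]) / df["baseline"]

# Flag one-day dives of 30% or more (the threshold is a judgment call).
dives = df[df["change"] <= -0.30]

for _, row in dives.iterrows():
    day = row["date"].strftime("%Y-%m-%d")
    label = KNOWN_UPDATES.get(day, "no known update on this date")
    print(f"{day}: {row['change']:.0%} vs. trailing week -- {label}")
```

A flagged date with no matching update is a cue to look at manual actions and crawl errors, the other two causes Matt lists below.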

Matt describes three reasons why a ranking drop might occur:

(1) Manual Actions
(2) Crawling Errors or Issues
(3) Algorithmic Penalty

(1) Manual actions show a notification in Google Webmaster Tools, so it is clear cut, Matt said.

(2) Crawl errors are also likely to show up in Google Webmaster Tools, so they are often clear cut as well.

(3) Algorithmic penalties are not really thought of as penalties; they are ranking algorithms, and general quality signals determine rankings. So it is hard to tell if an algorithm is hurting you. But Google will communicate large-scale algorithm changes, such as Panda or Penguin, and tell you on what date they ran; that way you can check the date and see if that algorithm had an impact on your site.

But as you improve your site and the algorithms run, your rankings can improve.

Here is the video:

At WebmasterWorld, GoodROI, the administrator, said:

People (especially newbies) having trouble making money online should remember that most things are interconnected. For example, if you publish poor content, it will lead to weak link development because no one likes linking to poor content. There are ripple effects when working on different parts of your site.

Forum discussion at WebmasterWorld, Google+ & Twitter.

Comments:

David Beart

03/25/2014 12:46 pm

It would be nice if "Algorithmic Penalties" were treated like manual penalties, at least this way one would know what penalty they were fighting and what needed to be fixed. A lot of time and energy could be saved if Google would just tell us straight out what they 'don't like' about a site; this would enable webmasters to make the changes and continue with building out their sites. The current situation where we have to guess makes no sense.

IDoRant

03/25/2014 01:44 pm

Once you fillet a fish, you want to watch it fry?

David Beart

03/25/2014 01:59 pm

All we would like to see is clarity. If Google can give manual and algorithmic penalties, they have the ability to provide us reasons why the penalties were given (which links/pages were bad) and how to fix the problems. Right now the only ones that benefit from all the guessing are SEOs... people get frustrated after trying to fix the problems... then resort to hiring an SEO to fix the problem. Unfortunately, often it is SEOs that caused the problems in the first place.

Adam Heaton

03/25/2014 02:07 pm

I like how they are open about these things, but it's nothing we didn't already know. If they did what you proposed previously, Barry, by adding a viewer in Webmaster Tools to indicate which algorithm has hurt the website, that would help massively in determining what efforts should be made.

Spam Cutts

03/25/2014 02:25 pm

Why do you think they would want to help you recover your rankings? That's what AdWords is there for: to give the sites that don't rank a way to gain exposure.

James

03/25/2014 02:47 pm

yawn, got anything new?

Durant Imboden

03/25/2014 02:55 pm

Good summary. It would be nice if Matt Cutts's Webmaster videos were accompanied by transcripts for people who already know the basics and just want to see if a video contains anything new. ("Talking head" videos aren't the most efficient way to transmit information.)

Spam Cutts

03/25/2014 03:06 pm

I don't understand why so many people think they have the right to know why / how Google does these things. Their algorithm is a trade secret, always has been.

Adam Heaton

03/25/2014 03:07 pm

I'm not asking for them to tell me what I've done wrong specifically; what I'm asking for is what many people want: to know whether it was the Panda or Penguin algorithm. It means time is spent on the right areas instead of the wrong ones.

Nick Ker

03/25/2014 03:11 pm

YouTube does have those automatic transcripts. But those are usually inaccurate enough to be more valuable as comedy than anything else.

Nick Ker

03/25/2014 03:12 pm

Something remotely true would be refreshing too.

Nick Ker

03/25/2014 03:13 pm

If you think it could be either Panda or Penguin, you probably have good reason to believe the site is "eligible" for either one. It is not as if Google just randomly applies those algorithms to sites without at least some reason for it. So why not clean up issues that could trigger them both?

Nick Ker

03/25/2014 03:14 pm

Google telling webmasters exactly what the problem is would be good - if there weren't so many who would use that information to continue to try to cheat their way to the top. Even more time and energy would be saved if people would just read the webmaster guidelines BEFORE they go out and buy links, add filler content (panda food), keyword-stuff, join blog networks, etc. Having done a few dozen audits of penalized sites, I can say most of their owners claimed they were trying to do what Google liked or wanted. The problem is they were not basing that on what Google has actually said, but on what a bunch of "gurus" and "ninjas" told them. If these webmasters had actually read & followed the guidelines, they would know that much of what they were doing was actually the opposite of what Google likes. Improving the quality of the site's code, structure & content is what Google wanted them to do. Instead, they chased after various types and quantities of links, overused keywords, added useless pages, and didn't bother checking the site for things like duplicate content, broken links, redundant titles, etc... So I think Google views it like this: if Google were to tell every webmaster exactly what they did wrong, some would figure out that they should be focusing on things that actually improve the site rather than on more of those mistakes. Unfortunately, too many would instead know exactly which tactics to stop using, and would just change the way they spam.

Nick Ker

03/25/2014 03:20 pm

It is kind of like hiring an auto mechanic. Some just don't know what they are doing or are intentionally ripping people off. If that happens, don't go back to the same one. I think Google has an attitude like "You know what you did, and we do too. Fix it." In most cases, this is appropriate. Most webmasters know when they hired a shady SEO, or when they bought links, tried some SEO-Gurunator plugin, or pulled some other shenanigans. If a webmaster has done several things that violated the guidelines, they should fix them all. Google isn't about to prioritize the cleanup tasks for anyone. Those few webmasters who have absolutely no clue should read the webmaster guidelines. Chances are, spending 15 minutes reading that would shed a lot of light on exactly what is wrong.

Durant Imboden

03/25/2014 03:30 pm

Yep, like those transcribed phone messages that I get from Google Voice.

Eemes

03/25/2014 04:11 pm

Exactly what I was looking for. Google could put something like a tab of recommendations in Webmaster Tools for sites that have been affected by Penguin or Panda, through which users could try to recover rankings. Matt does say you can recover rankings when they re-crawl or reprocess, so that's good news in terms of algorithmic penalties!

Adam Heaton

03/25/2014 04:34 pm

The problem, Nick, is that it can take a long time to clean up backlinks or improve the content. If there is nothing wrong with either, why spend time working on them when that was never the issue?

Adam Heaton

03/25/2014 04:35 pm

I find myself agreeing with you a lot on here, Durant; you're one of the only people I actually bother paying attention to in these comments.

Durant Imboden

03/25/2014 04:47 pm

"Google telling webmasters exactly what the problem is would be good - if there weren't so many who would use that information to continue to try to cheat their way to the top. " In the case of Panda, "cheating" would mean improving quality (or at least whatever Google judges to be quality). I don't think Google would mind if site owners invested money in improving the quality of their Web sites, as opposed to spending money on links, guest posts, or even SEO. And while it certainly wouldn't make sense for Google to "tell webmasters exactly what the problem is" (which isn't likely to be practical anyway with an algorithm, as opposed to a simple manual penalty), it's hard to think of a reason why Google wouldn't be able to include a quality score or graph in Webmaster Tools without revealing the recipe for the "secret sauce."

magnaromagna

03/25/2014 04:47 pm

Hi, good outline, but it's missing an important note: the dates of updates are for English-language websites, not for other languages, right? I mean, for a site here in Italy or in another "foreign" language, I think the updates come later... It is more difficult to understand which mistake the webmaster made. Do you know how to find the dates of Google updates for a single country? Thanks

JoyceGrayy

03/25/2014 05:33 pm

So it is hard to tell if an algorithm is hurting you. But Google will communicate large scale algorithm changes, such as Panda or Penguin. They will tell you on what date they run, this way you can check the date and see if that algorithm had an impact on your site. http://num.to/817039484145

Durant Imboden

03/25/2014 05:59 pm

Not all "algorithmic penalties" are the result of violating Google's Webmaster Guidelines. Panda, for example, is based on content quality (as determined by Google). A site can be in compliance with Google's guidelines and still be hurt by Panda. To make matters even more confusing, site quality is only one of the factors involved in Panda. The assumptions built into the algorithm can affect a site's ranking, too. If Google tends to favor link authority, site size, or other factors that work to the advantage of big brands or megasites (think Amazon, Wikipedia, About.com, or Tripadvisor), a site's average rankings can go up or down with each Panda update even if the actual quality of its content is unchanged. If Google provided "site quality" and "page quality" scores in Webmaster Tools (along with all the other data that it shares in WMT), site owners would at least know whether they'd screwed up in Google's eyes or Google had simply changed its assumptions in the latest Panda update.

Nick Ker

03/25/2014 06:09 pm

Think about it: if you believe you could have a Panda problem, clean up Panda-related issues. If you think you might have a Penguin problem, deal with the links. Or do you want to only clean up the things that Google caught for sure, and hope they don't catch the other issues that could also trigger the algorithms? If the problems need to be fixed, they are problems that need to be fixed whether or not Google gives you a list of the specific items or tells you which algorithm was triggered. A re-read of the Webmaster Guidelines, followed by an honest look at the site, will usually reveal the problems that Google would have with the site or the links. Why should Google make it any easier for people who violated those terms? I think Google's attitude is kind of like a tough-love parent: you know what you did. If you had the time to make the links, copy content, keyword stuff, or whatever else may be the problem - then you certainly have the time to clean up the mess.

Morgan Akchehirlian

03/25/2014 06:18 pm

Sorry, the URL that you have added redirects to some Amazon Blu-ray movie? What is it? I opened it expecting an important source of reading, but it redirects to Amazon.

Nick Ker

03/25/2014 06:31 pm

I agree that Panda is a bit more complex and different than other parts of the main algorithm or updates, in that few specifics are given as to what "quality" means. But my point is that Webmaster Tools, the Guidelines, and Google in general already give plenty of information that would indicate what is wrong. Basic quality issues like duplicate content, site speed, which pages are indexed or not, and markup issues are all readily visible in WMT. The Guidelines, the SEO starter guide, most of Matt's videos, and just about every other public statement from Google regarding search mention "quality". I think they are serious about that, and it is high time that anyone who cares how their website ranks starts taking it seriously too. As for fluctuations that come from Google revising Panda or any other part of the algorithm, there is no way Google is going to give indicators of all of those factors just so people can try to micromanage rankings by selectively cleaning up problems only when Google says they must. The solution to a drop from something like that is still the same: improve. Improve the site's function, improve its usefulness, improve the quality of the content, improve the focus of the content, improve the marketing (stop building spam links, etc.)...

Nick Ker

03/25/2014 06:38 pm

A page quality score would not be much help. As Matt explained in the video, it is too complex. Search results are query-dependent. Sometimes a page that is pretty crappy in many ways - slow, bad grammar, poorly written, whatever - still happens to be a pretty good result for a query. And sometimes a page that would score a 10 still is not the best search result for some other reason.

Stuart David

03/25/2014 08:13 pm

A new spam technique is being used: they extract text from the article, post it, and stick their shitty link in.

roachdawg

03/26/2014 12:25 am

There's the problem...going by that screenshot pic for the story, Matt's cranky because he's thinking about hamburgers.

Allen Chris

03/26/2014 03:24 am

Trying to be funny?

roachdawg

03/26/2014 03:27 am

Trying?

Adam Heaton

03/26/2014 09:53 am

He's in fact making a link sign... link building is good to go again!

Rhonda Erdey

03/26/2014 11:46 am

Nick, thank you...

SEO animal

03/26/2014 11:46 am

Shut up Durant, no one cares what you have to say...

Arun Kallarackal

03/26/2014 03:17 pm

Well, making good and regular use of Google Webmaster Tools ensures that we get to know about manual actions and crawl error issues. A drop in rank and a timely notification will help webmasters start the 'repair' work :) And Google announcing the latest major algorithm updates is also helpful, especially looking up the date the update was rolled out against the date the rankings dropped. But what about the many small updates, which are rolled out without any announcement? In that case, webmasters have no means of knowing what actually caused the drop in ranking. Anyway, nice information. I found this post thanks to Kingged! Arun

Winston

03/26/2014 03:58 pm

If you are still just guessing after reading the Webmaster Guidelines, you should probably just follow the main rule - make your site good for users in how it functions and its content. Don't try to do anything because you think it is what Google wants. If you don't know, don't try. That includes trying to use keywords enough times, building links, or doing anything else in the name of rankings that really does not benefit the users of the site. As for "bad pages" - make sure they are all original, well written, not duplicated on your site, and hopefully say something more or better than all the other pages in the world that are about the same topic. Google doesn't just hand out penalties for no reason. Don't give them any reasons and you will be OK. SEO is not simple. Even Google can't give millions of webmasters free advice on individual pages. For what you pay for Webmaster Tools (which is free), you get just about everything you would need to know about what Google thinks of your site. If you would share the URL of the site in question, I am sure many people here will help. Sometimes it is difficult to look at your own website objectively.
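
On the "not duplicated on your site" point, an exact-duplicate spot check is easy to script. A rough sketch, assuming a hand-listed set of URLs (a real check would read the sitemap); it catches only exact matches after crude tag-stripping, and near-duplicates would need shingling or similar:

```python
import hashlib
import re
from collections import defaultdict

import requests

# Hypothetical page list; a real check would read the site's sitemap.
urls = [
    "http://example.com/page-a",
    "http://example.com/page-b",
]

pages_by_digest = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    # Crude normalization: strip tags and collapse whitespace before hashing.
    text = re.sub(r"<[^>]+>", " ", html)
    text = re.sub(r"\s+", " ", text).strip().lower()
    digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
    pages_by_digest[digest].append(url)

for group in pages_by_digest.values():
    if len(group) > 1:
        print("Possible duplicates:", ", ".join(group))
```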

David Beart

03/26/2014 04:25 pm

A 'quality score' would be nice; however, it still does not address favoritism toward larger 'branded' sites. The other big issue now is links themselves. If a NEW site links to us that has no PageRank, no authority, and few inbound links, we have almost no choice but to disavow it. This makes it more and more difficult for new sites and a bigger advantage for large sites. Who wouldn't want a link from a big, well-established site over one that is not... and as Google has a lot of us scared about 'incoming' links... why would we want links from newbies?
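
On the disavow point: for anyone who does conclude a link is harmful, the file Google's Disavow Links tool accepts is plain UTF-8 text with one entry per line. A small sample (the domain and URL below are placeholders):

```
# Lines starting with "#" are comments and are ignored.
# Disavow every link from an entire domain:
domain:spammy-directory-example.com

# Disavow links from one specific page only:
http://link-farm-example.net/page-linking-to-us.html
```

Whether a given link belongs in that file at all is exactly the judgment call being debated in this thread.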

Nick Ker

03/26/2014 04:59 pm

There is no favoritism toward larger branded sites just because of their brand. There are, however, a lot of people claiming there is such a bias because they can't imagine why their own site is not number one for everything, while a well-established, well-known brand that has lots of customers/users and a variety of other good things going for it DOES rank well. You only need to disavow something if you have reason to believe it is causing harm, not just because you don't know what it is or it isn't as good a link as you would like. A link from a new site could turn into one of your best incoming links, should that new site turn into a popular one. I urge you to read Google's webmaster guidelines & the SEO starter guide, watch Matt Cutts' videos, and stop believing every paranoid rant you find in comments and forums. There are a LOT of people who have no idea what they are talking about but still insist things like "Google favors big brands just because they are big brands" or "Google hates all links". Google is not out to get you. Google is trying to reduce spam and undeserving sites in its search results. Follow the rules and don't try to trick Google and you will be extremely unlikely to run into trouble.

David Beart

03/26/2014 05:58 pm

I have read the rules many times... removed links to our site, removed outbound links, cleaned up content, narrowed our focus, paid for link removal, added "nofollow" attributes to outbound links, and invested countless hours trying to "guess" what we can do to get back in Google's good books. I could have spent all that time and money creating a better site, but why buy more articles and spend money on programming when you don't know if old mistakes are behind you? If Google said "remove these 4 articles"... or these 100... I would do so in a heartbeat. As for improving content... 95% of our 2,500 original articles ARE over 900 words. We haven't built a single link in 1 1/2 years!

Gimel

03/27/2014 03:24 am

My biggest concern here is that links have been "weaponized" by Google. It would be nice to know that the Hummingbird algo does not penalize, but just decreases ranking over time. However, Penguin does penalize, and competitors know it. Therefore, if you start to gain traction, oftentimes black hats will knock you back with a "nuke" blast, and if your link profile is not incredibly strong, or if you ever lost trust with Google, you are hit by an unnatural link penalty. It would be nice if Penguin could be integrated into Hummingbird and just ignore spammy links instead of penalizing for them. On a side note, links have always been a utopian concept. In a perfect world, citation would convey trust and authority; however, once you assign monetary value to citation, it corrupts the integrity of citation as an effective means of discerning authority with any reasonable confidence. Therefore, I really hope that all the white knights with their impeccable link profiles are ready for what is sure to be a major change in business as usual, as big brands take over search when links no longer matter!

Nick Ker

03/27/2014 03:39 am

Your last two sentences are kind of revealing. 900 words is how you measure quality? Numbers are for measuring quantities. By improving, I meant things like readability, uniqueness (in subject matter, not just different words for the same thing), usefulness, entertainment value or any number of things that would make that content worthy of ranking above most other pages that are about the same topic. Is it truly the best page about that topic and deserves to be in that number one spot? If not, what can you do to make it better? And when you say you haven't built a single link in 1.5 years - does that mean you didn't earn any truly natural links to all that content? If so, either the content is not very good, or you aren't doing anything to get the attention of people who would link to it. Either way, I think you need to consider that maybe you don't have a penalty problem but a quality problem. Was your penalty a manual action which Google told you about in WMT, an obvious Penguin or Panda hit that coincided with the date of an update, or are you just assuming that you have a penalty because of a change in rankings? Not all ranking drops are the result of a penalty.

Georgi Georgiev

04/26/2014 08:26 am

"In short, the best way to tell if you were hit by a Google algorithm such as Panda or Penguin is to check your analytics and see if you had a major dive in traffic from Google on a specific day." You and Matt make it sound easier than it is :-) In reality, it takes a lot more than that if you want to do a proper job of detecting a Google algorithm update's impact on your traffic. I've seen time and again even very experienced SEOs make huge mistakes in estimating whether a particular update hit a site or not. Most of them simply lack the web analytics and statistics background required for the task. I, on the other hand, have spent the past 10 years balancing between SEO and web analytics, and seeing the proportions of the problem, I wanted to share my expertise with the guild. Thus this tool was born: https://www.analytics-toolkit.com/google-algorithm-updates/ . It does all the hard statistical work for you, reports only significant results, and basically saves you many sleepless nights :-) It has a built-in calendar of updates (built with the help of Barry's update tracking efforts!) so you don't need to worry about that either. The tool is free to try and you can see a preview below.

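Georgi's statistics point can be illustrated with a minimal sketch: compare daily organic sessions before and after a suspected update date and ask whether the drop exceeds day-to-day noise. The CSV and column names are the same hypothetical ones as in the example near the top of the post; a plain Welch's t-test ignores trend, weekly seasonality, and autocorrelation, so treat the p-value as a rough screen rather than a verdict:

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")
update_date = pd.Timestamp("2013-10-04")  # e.g., Penguin 2.1

# Four weeks of daily sessions on each side of the suspected update.
before = df.loc[df["date"] < update_date, "sessions"].tail(28)
after = df.loc[df["date"] >= update_date, "sessions"].head(28)

# Welch's t-test: is the after-update mean different beyond daily noise?
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
print(f"mean before: {before.mean():.0f}  mean after: {after.mean():.0f}")
print(f"p-value: {p_value:.4f}")
if p_value < 0.05 and after.mean() < before.mean():
    print("Drop looks larger than day-to-day noise; the update is a suspect.")
```

Even a significant result here only says the traffic changed around that date, not that the update caused it; the manual cross-check against known update dates still applies.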