Didn't Google Release A Scraper Algorithm? Google Wants Scraper Sites To Be Reported

Aug 29, 2011 • 9:13 am | Filed Under Google Search Engine Optimization

On Friday, Matt Cutts of Google tweeted that he wants users, searchers, webmasters and SEOs to report scraper sites via a special Google Docs form.

Matt said:

Scrapers getting you down? Tell us about blog scrapers you see: http://goo.gl/S2hIh We need datapoints for testing.

As you may remember, before the Panda update, Google released an algorithm, or filter, that went after scraper sites. It has been live since late January, so presumably Google is seeking feedback on the quality of that algorithm and looking to improve it.

It also shows that Google is not happy with the current state of scraper sites in the index and will likely do something about it in the next six months. So be prepared.

Forum discussion at WebmasterWorld and Google Webmaster Help.


Comments:

Admin

08/29/2011 02:23 pm

First thing I submitted... APPSpot Proxy scrapers... Would be interesting if Google actually policed their own products.

Nick Stamoulis

08/29/2011 02:39 pm

Google is always looking for a way to better their product, the search results. No algorithm update is perfect, and you have to at least applaud Google for recognizing that. Giving site owners the ability to report sites does put some power back in the hands of the user as well.

Michael Martinez

08/29/2011 05:19 pm

I think the Panda algorithm revealed more than a few flaws in the scraper algorithm they rolled out early this year.  It could be they're now looking to roll scraper identification into the Panda process.

Search Chronicle

08/29/2011 10:04 pm

That would be nice, if it works. For a while after the April 11 Panda update, the algo seemed to favor scraper sites. Hopefully they don't mess it up again.

Mae Loraine Jacobs

08/30/2011 02:59 am

Hopefully, this is justice. I've met several small business owners who are suffering from extremely low credibility because of scraper sites. The reporting process will also make website owners more alert and responsible with the protection of their content.

Amelia Stevenson

08/31/2011 01:32 am

Algorithms are unreliable. They might seem to work at first but fail in a matter of hours. I think Google understands this, which is why it's enlisting help from users. Everyone hates scraper sites, so it's certain that a lot of forms will be filled out. Great thinking by Google!

Scott Ludtke

09/17/2011 10:56 am

The link for the form does not work.  Here is the form URL: https://docs.google.com/spreadsheet/viewform?formkey=dGM4TXhIOFd3c1hZR2NHUDN1NmllU0E6MQ&theme=0AX42CRMsmRFbUy1mYzJkYmE4MS04Mzc4LTQ0ZGMtYjFlYi03NjU4MjkyMjIwMWY&ifq

Google Arrogance

01/20/2012 11:05 pm

Oh, you mean like how GOOGLE SCRAPES millions of websites with their "site preview" BS etc. that company IS FREAKING OUT OF CONTROL AND ULTRA ARROGANT let-alone changing things for no reason and forcing cpu-intensive mouseover UI "inventions" everywhere on by default that can barely be turned off or CAN'T be turned off. ARROGANCE. 

Hashim121

02/09/2012 10:42 am

If any Google administrator can send me his e-mail address, I will forward the e-mail. Thanks, HASHIM

Esoteric Articles

03/02/2012 08:01 pm

Very true, not to mention that Google punishes websites for being top heavy on advertisements (fair enough) but then goes on to be top heavy with AdWords campaigns. Bing + Facebook is very quickly becoming the new way to search the web.
