Interview with Google's Adam Lasnik on Paid Links, Duplicate Content, and Other Topics

May 4, 2007 • 7:01 am | Filed Under Other Google Topics

A DigitalPoint Forums post links to an interview Eric Enge had with Google's Search Evangelist, Adam Lasnik. In the interview, Adam discusses paid links, nofollow links, duplicate content and crawling, validated pages and how they are handled by the algorithm, and more.

Some interesting points he mentions are:

On buying links:

...our goal is not to catch one hundred percent of paid links. It's to try to address the egregious behavior of buying and selling the links that focus on the passing of PageRank

On duplicate content:

So, my core suggestion to webmasters would be to use Noindex and robots.txt to help us know what pages you'd prefer to not have indexed. In the context of duplicate content, penalties tend to be relatively rare. In the majority of cases it is innocent and unintentional. But, in cases where it's very extreme, there can be penalties applied.
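Adam's suggestion can be illustrated with a short sketch. A robots.txt rule blocks crawling of a path entirely, while a robots meta tag lets a page be fetched but kept out of the index (the /print/ directory below is a hypothetical example of duplicate printer-friendly pages, not something from the interview):

```
# robots.txt — served from the site root
# Block crawling of a hypothetical directory of duplicate pages
User-agent: *
Disallow: /print/
```

For individual pages that should still be crawlable (so links on them can be followed) but excluded from the index, the alternative is a `<meta name="robots" content="noindex">` tag in the page's head section.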

The interview is extremely informative, and I suggest you give it a read.

Discussion continues at DigitalPoint Forums.




05/04/2007 12:24 pm

While Adam was being interviewed, Matt posted about overdone reciprocal linking. Merge both pages and you get a somewhat complete Google policy on "linkspam".

Michael Martinez

05/04/2007 04:16 pm

It's a great interview, filled with a lot of information that many people may not have read before. Certainly Adam speaks with more authority on these topics than the typical SEO guru blogger.

One of the most important takeaways is that Google does NOT score on the basis of HTML validation. So while Adam acknowledges there are many good reasons for Web sites to be W3C compliant, he has made it clear that Google is not factoring compliance into its algorithm. Hopefully, that SEO myth will now die out.

The duplicate content discussion was interesting, but it doesn't do much more than recap points already made elsewhere. And it way overemphasizes the impact that duplicate content has on search results. Does Eric equate duplicate content with Supplemental Results? That misidentification is very popular with the SEO community, and it may take another six months to get everyone straightened out on that issue. How many times do Googlers like Matt Cutts, Adam Lasnik, and Vanessa Fox have to tell people that you go Supplemental for LACK OF (INTERNAL) PAGERANK, and not because of duplicate content, before the SEO community gets the message?
