Does "Freshness" Mean More in Google?

Apr 3, 2007 • 7:43 am | comments (8) | Filed Under Google Search Engine Optimization
 

Google, like most major search engines, constantly tweaks its ranking algorithms to provide the most relevant content to its users. Search engine optimization experts are tasked with going beyond the relatively simple best practices recommended at the onset of an engagement in order to keep up with these trends. It may not be rocket science at the start, but over the long run, the most creative SEO teams, the ones that can recommend and implement site changes fastest, will win out for competitive terms.

One of the founders of WebmasterWorld, known as Tedster (hint: his real name starts with a "T"), has revived a month-old thread that dealt with some commonly perceived changes to Google's algorithm in March. The new thread, split off by Tedster from the old, is titled April 2007 Google SERP Changes. Don't think this is just a continuation of the March discussion, however, as Tedster has posed an interesting theory for discussion:

In the past, Google seemed to forgive relatively unchanging content for certain types of sites, though certainly not for others. Now I'm thinking the algo has shifted, at least a bit.

He believes "freshness" (of content and probably linking) has a lot to do with being able to command the top spots at Google these days, and people seem to agree with him:
My gut feel is that the recognition of "freshness" is playing a greater part in Google's algo, where it can.

In short, I agree that updated content seems to be making a move, even against some of the well-entrenched .gov and .edu TLDs. We are seeing some "refreshed" branded sites ranking very quickly for competitive terms.

Lots more great discussion follows at WebmasterWorld. Share your thoughts there and also feel free to comment below.

 

Comments:

Andy Powers

04/03/2007 01:43 pm

I haven't seen examples of this online, but I think it would make sense based on how algorithms improve. The basic goal is improving relevancy (an accurate result for the query) and precision (the proportion of "accurate" vs. "inaccurate" results) for user queries. I think the following situation may be reasonable: for two sites on topic A, if site #1 publishes new content and receives many new links regularly, while the older, legitimate site #2 receives almost no new links, a search for topic A may find the first site more relevant. I think "freshness" could be a reasonable improvement to information retrieval algos. I'll watch for examples, too.
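
Andy's two-site scenario can be sketched as a toy scoring function. To be clear, everything below is hypothetical: the half-life decay, the weights, and the `freshness_boost` helper are invented for illustration and are not anything Google has published. The sketch only shows how a steady stream of recent links could tip the scales between two otherwise similar sites:

```python
from datetime import date

def freshness_boost(link_dates, today, half_life_days=90):
    """Toy decay: each inbound link counts for less the older it is."""
    boost = 0.0
    for d in link_dates:
        age = (today - d).days
        boost += 0.5 ** (age / half_life_days)  # halves every ~90 days
    return boost

today = date(2007, 4, 3)
# Site #1: a steady stream of recent links; site #2: older, mostly stale links.
site1_links = [date(2007, 3, 1), date(2007, 3, 20), date(2007, 4, 1)]
site2_links = [date(2005, 6, 1), date(2006, 1, 15), date(2006, 3, 10)]

score1 = freshness_boost(site1_links, today)
score2 = freshness_boost(site2_links, today)
assert score1 > score2  # the actively linked site wins on this toy signal
```

Under this (made-up) decay, site #2's links from 2005-2006 contribute almost nothing, so even a modest trickle of fresh links would dominate, which matches the behavior Andy describes.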

Skitzzo

04/03/2007 03:49 pm

I've been saying this for over a year now. http://www.seorefugee.com/seoblog/2005/12/21/googles-irregular-content-filter-a-theory-of-googles-expectations/ However, as blogging continues to take hold, I think it's become even more emphasized. http://www.seorefugee.com/seoblog/2007/03/13/blogging-metamucil-the-importance-of-being-regular/ I think in the not too distant future, if you're not writing, and writing fairly often, you're going to suffer for it.

Johan a.k.a. T0PS3O

04/03/2007 04:07 pm

I've been experiencing this for some time and I've been calling it the "newsworthiness" factor: http://forums.digitalpoint.com/showpost.php?p=1917358&postcount=2 http://forums.digitalpoint.com/showpost.php?p=1930954&postcount=24 http://forums.digitalpoint.com/showpost.php?p=2146175&postcount=6 Get fresh links from places that make the content look topical (PayPerPost in my examples, getting bloggers to mass-jump on something) and up go the rankings. Then it tails off later.

Michael Martinez

04/03/2007 05:03 pm

It's too soon for any of us to know exactly what Google changed in the past few weeks, but I think the freshness factors are adequately dealt with through their RSS crawls, particularly for BlogSearch and News Search, where freshness really matters. Web search seems to be more affected by the tighter trust requirements they have placed on Web content. The sites most likely to earn that trust also happen to be updated often and probably garner more referencing links from other trusted sites. One recent hypothesis I have seen circulated in SEO circles is that the ratio of deep links to root URL links should now favor deep links, and I have to agree that is starting to make more sense. If all your inbound links point to your root URL, but you have 2,000 pages of content, that would look kind of odd to me. So how should it look to a search engine that is evaluating linking pattern anomalies?

Chris Boggs

04/03/2007 07:08 pm

Andy and Michael, thanks for your comments! Andy, I agree with you that a big part of freshness is fresh content feeding fresh links. Michael, good point about the RSS crawls being a factor for many "freshness-oriented" sites. Would that, in your view, be a reason to test RSS with even "normal" (non-news) content? I also feel you're definitely onto something with the link hypothesis, although I won't comment further on that since it takes this thread a bit off course. I would hope you report that in a more link-focused article, like the one from Ben later this morning?

Michael Martinez

04/06/2007 05:55 pm

Chris, I've been pushing out RSS feeds for my Web content since 1998, back when MyNetscape and similar start-page services made it a viable means of building visibility. Those feeds are still grabbed by a lot of robots today; I've blocked a fair number of what I feel are suspect robots, but I think the major search engines all grab them.

Marek

04/09/2007 03:36 pm

Chris, I agree that there is something going on with this freshness and G ranking. My website had a PR 4 for 2-3 years, with not much changing on it. Then, not long ago, it was demoted to PR 1 :(. I increased my blogging frequency 2 months ago, and have seen the new pages/posts show up fairly quickly. I guess time will tell. Thanks for this post. Best, Marek

Janet Martin

05/24/2007 12:02 am

I'm definitely noticing a change. I'm seeing one client's blog posts appear in Google SERPs often on the same day they're published. The RSS feeds are set up to ping all the usual blog search engines, including Google Blog Search. I imagine that's doing the trick. With my projects, it's still too early to tell if RSS and frequent updates will provide a ranking advantage in the long term, but it sure is a great way to get the latest content into the engine as soon as possible. I'm hopeful.
