I am sorry, but I simply could not miss this session, so you will get my coverage as well as Ben's on this single session. On the panel, moderated by Danny Sullivan, are Greg Boser, Jill Whalen, Mikkel Svendsen, Shari Thurow, David Naylor and Jeff Watts. This is kind of a white hat versus black hat issue, but the session is really more about where you should put your resources.
Jeff Watts from National Instruments gave a quick intro. They think of the search problem as something that can be represented on a graph: the Y axis is better rankings, the X axis is time. Even if all things are equal for them, it does not mean all things are equal for his competitors. For those who reverse engineer the algorithm, he showed how they have a short period where they rank above you, but their rankings are not sustained. His company uses "smarter content processes" to help increase and stabilize rankings. Web search is like a black box: a query comes in and the relevant content comes out. In the future, success with black hat tactics will not be as achievable.
David Naylor (DaveN, SEW/WMW mod) said that for seven years he has chased the algorithm, and he catches changes to it every day. If you don't have the content-writing skills of someone like Danny, he said, you need other skills (programming skills). Build something that is totally focused on one purpose: to rank number one. He said if you have any questions about chasing the algo, he will answer them honestly.
Shari Thurow from Grantastic Designs started off by saying she is someone with programming skills who chooses to write content. Is there such a thing as natural SEO? She calls one camp "Algoholic SEOs": their goal is positioning, they exploit web search engines, they don't look at the big picture, and they meet only partial business goals while delivering a poor search experience. On the flip side, you have "Usability Experts": their goal is conversions, they are confused about search engines, they also don't look at the big picture, but they balance business and users. She likes to think of the balance between the two as "natural" search optimization, where the goals are to increase quality traffic, page views, conversions and so on; search optimization is part of design, the big-picture approach. She showed a small case study: page views increased 110%, page views per visitor increased 38%, and web search engine referrals increased 350%, just by making the site more query friendly and more browser friendly. So use your resources this way.
Mikkel Svendsen was next up, and he said he is an Algoholic. In his opinion, broad-based reverse engineering is dead now; search engines are evolving. He monitors, researches and studies overall trends in search engine algorithms, produces a large number of sites and pages and monitors the results, and zooms in on specific verticals and competitors. For most sites outside the engines' core languages, all you need is basic SEO skills. If you throw enough dirt at the wall, some is bound to stick.
Jill Whalen is often referred to as a white hat, but she doesn't know where that label came from. She agrees with what Mikkel said, but we need to step back and discuss what is meant by algorithm chasing. It doesn't mean you are a spammer or black hat if you chase the algorithm. When you run into problems, and you look at your rankings every day and see them go up or down, it doesn't always mean something; you can't read into that too much, or it will drive you crazy. It is very difficult to tell cause and effect these days.
Greg Boser of WebGuerrilla was the last one up. He says he is also an Algoholic, and has been for a long time. In the old days it was easy to do very well; much of this is hit or miss. They know stuff that has worked well for several years. They have clients on both the white hat side and the black hat side, so he plays on both sides of the fence; the aggressive stuff helps you find things for the white hat work as well. In this day and age it's a lot different, not as exciting and mathematical as it once was. But don't be afraid to go down that road. He said the algo chasers do care about conversions; it's not true that they don't.
Q & A: Q: An attendee noticed that when Yahoo! & MSN focused on search, they pretty much copied Google. How did they do it? A: Danny said that the primary signal for rankings is link popularity, but there is little overlap in search results; search engine results are really not the same, though the quality is overall better. Mikkel said, "I don't think Yahoo! copied Google." He said they all copied earlier search engines like Fast and AltaVista; they are all copycats. Greg said Google was not the first company to analyze links. The real differences you will see are in the number of filters used to patch these holes (MSN has the fewest, since it is the newest). Jill said they all want the same thing: the most relevant results. David said the biggest difference he notices is that Google skews more toward informational results, and Yahoo! more toward purchase intent. He said Yahoo! has a lot to worry about since it increased its index size.
Q: Question for Greg: you seem to be neutral, with clients that go both ways. Do you find a distinguishing factor between B2B versus B2C? A: He said there is a lot of aggressive stuff going on in the B2B space; Thomas Register did some very aggressive stuff in the past. Also, in the auto space, you do a search and get different sites all from the same one company. Shari said she turned in Thomas B2B and they have not gotten back in; Thomas is not always on the up and up. Greg said to wait for version two. Shari responded, wait until I get my hands on it.
Q: Where do you get more information about the competitors in your industry? (The industry in question was California real estate.) A: Greg said you just need to buy tons and tons of links; some of the craftiest link building schemes have been in the real estate business. Mikkel said the keyword list in that market is the same for everyone; there are very few undiscovered niche keywords. Try innovative ways to build unique content (forums, blogs, other sources). David said that if it's one client, it's easy: just build up that client. But if you have several clients, what he does is build his own site and then distribute leads on a percentage basis to the various clients.
Q: An online retailer asked about one big site versus multiple sites. A: Danny said if it was working before, don't change it. Greg said he likes to break content up into multiple sites; that is why many sites use subdomain structures. David said he would be really careful with subdomains.
Q: Five items in the algo we should focus on? A: Greg said: links, internal and external; anchor text; content; and titles. Jill said it depends on the site; it's not a formula. Do keyword research, choose three or so keywords for every page, make sure you write content around those phrases, use them in titles, and use them in links. Mikkel said you need to look at it in terms of "site dynamics": the site needs to grow, with content growth, keyword growth, and link growth. Shari said to use words and phrases people search on, then tell visitors what you need them to do; you also need the link component, and the last part is link development. David jokingly said he agrees totally with Shari. He said you need internal links, external links and inbound links. If you link out to good places, that is good; don't be afraid to link out. Jeff Watts said he would add: think about adding value to the customer. When linking out, ask whether it adds value.
Q: Your thoughts on the recent Google patent? A: David said it's a "smoke screen," and everyone else agreed.
Forum discussion at Search Engine Watch Forums.