Better Ways

Jun 5, 2007 - 6:01 pm


Moderator: Danny Sullivan, Editor-in-Chief, Search Engine Land

Speakers:
Alex Bennert, Director of Client Services, Beyond Ink
Greg Boser, Search Engine Marketing Consultant, WebGuerrilla
Jim Boykin, CEO, We Build Pages
Christine Churchill, President, Key Relevance
Todd Friesen, Director of Search Engine Optimization, Range Online
Cameron Olthuis, Director of Marketing and Design, ACS
Aaron Wall, Author, SEO Book

Better Ways at Search Marketing Expo

Danny introduces the session and says that it's really called "Better Ways to Do Boring Stuff." You can think of this session as an SEO technique clinic.

This is a question-and-answer session instead of a presentation session.

Q: We've seen great results with social media for small clients, but our bigger clients are hesitant to get into that space. They have difficulty launching their first blog - it took six months to show them that there's value in it. Do you have recommendations on how to ease them into it?

Cameron: We run into the same thing. A lot of education is involved. Reinforce the idea that you'll be doing a lot of reputation management and watching it closely. You can watch the fire and catch it before it gets out of control.

Aaron: You can reach out to brand evangelists if they don't want to do it on their own site.

Greg: The corporate web world is so slow. Even if you get approval, it takes six months and 400 meetings, plus meetings about the meetings.

Alex: We have a client that is a large corporation. They wanted to start a blog, but it couldn't get through legal. The blog ended up being sponsored through an affiliated company. There is a good halo effect without having to worry about the legal issues, because technically it's not their blog; they're just sponsoring it.

Q: I have an open-ended question: how would you scale getting linkbait? What tricks or tactics would actually get a lot of backlinks without having to do much?

Aaron: Find the key ideas for your topic. When you search for your topics on Google, look for things that are non-commercial that people are interested in. Then write about them. Get really good content -- spend a few grand -- and run an ad campaign for keyword permutations that would match it.

Christine: If you happen to be an authority in a certain area, you can give out awards. An example is SEOmoz's Web 2.0 Awards. These have longevity and can continue to build links. Try to look for something that has continuous link-building ability.

Todd: We take a different approach. We have a lot of interns who do directory submissions, and we also do link buying. We feel it is media placement and relevant. We have also done widget building: when the widget's HTML is generated, we embed a keyword link with one of the phrases we're trying to target (a sketch of that embed follows below).

Jim: Our approach is "no pain, no gain." We send it via email and do it by hand. You need to write something where you'll expect a response. The days of pressing buttons are over.

Alex: We also try to build it into another part of the business. We have a matching service and offer incentives (referral fees) for links.

Cameron: Use brand evangelists - give them the content and let them do the linkbait or viral stuff for you.
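Todd's widget tactic lends itself to a quick illustration. Here is a minimal Python sketch, assuming a hypothetical generate_widget_embed helper and made-up URLs; the point is simply that the embed snippet users paste onto their own sites carries a keyword-rich anchor back to the target page.

    # Minimal sketch of the widget tactic. All names and URLs here are
    # illustrative, not from the panel.
    from html import escape

    def generate_widget_embed(widget_js_url, target_url, anchor_text):
        """Build the copy-paste embed snippet for a widget.

        The <script> tag loads the widget itself; the plain <a> tag
        beneath it is the keyword link that remains even without
        JavaScript.
        """
        return (
            f'<script src="{escape(widget_js_url)}"></script>\n'
            f'<a href="{escape(target_url)}">{escape(anchor_text)}</a>'
        )

    print(generate_widget_embed(
        "http://example.com/widget.js",            # hypothetical widget host
        "http://example.com/mortgage-calculator",  # page being targeted
        "mortgage calculator",                     # keyword phrase as anchor text
    ))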

Q: Has anybody got a tool or a way of logging into social networks very quickly?

Cameron: I think anything worth doing is worth putting a little time into. Pay your dues. (Hurray!)

Danny: How long does it take to log in to them?!

Todd: Use RoboForm, which logs you in.

Christine: There are scripts that you can add to the toolbar that keep you logged in.

Q: How about developing an accurate keyword research tool? All the ones out there suck.

Todd: Have you seen Microsoft's keyword tool? It's a fantastic tool. They have a lot of data, including demographic data. They launched it a few weeks ago and they now have an API.

Greg: Why do you think they suck?

Christine: Most of the keyword tools pull information from ISPs, so you can get some skewing. Others, like Wordtracker, pull data from metasearch engines and then have a filtering system which throws out aberrations in the search trends. That's why the search queries in Wordtracker are lower than in some of the comparable tools. Keyword Discovery recently made a change in the way they present their data - the default used to be ISP data, but they recently changed the default to user toolbar data, so it's reasonably accurate. They still have the ISP database; they just made the toolbar data the default. A good thing to do is to use a couple of keyword tools and compare the order of the phrases - and ignore the actual search numbers (a sketch of this follows below). If I see the same phrase ranked above another one across three different keyword tools, I can assume it's the more popular keyword.

Alex: You gauge the relative disparity across different tools, e.g. children's furniture vs. kids' furniture.

Greg: The relationships of the words do pan out very accurately, even if the numbers are different.

Alex: The most accurate source for keyword data is your referral data.

Followup: Our referral data shows different results from those keyword tools.

Christine: Use a PPC campaign. There are a lot of different ways to determine this.
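Christine's "compare the order, ignore the numbers" advice is easy to mechanize. A minimal Python sketch with made-up tool names and volumes:

    # Compare the *ranking* of phrases across keyword tools rather than
    # the raw volumes, which differ wildly by data source. All numbers
    # are invented for illustration.
    def rank_order(volumes):
        """Phrases sorted from most to least searched for one tool."""
        return sorted(volumes, key=volumes.get, reverse=True)

    tools = {
        "wordtracker":       {"children's furniture": 900,  "kids furniture": 1400},
        "keyword_discovery": {"children's furniture": 3100, "kids furniture": 5200},
        "overture":          {"children's furniture": 8000, "kids furniture": 12000},
    }

    orders = {name: rank_order(vols) for name, vols in tools.items()}
    for name, order in orders.items():
        print(name, "->", order)

    # If every tool agrees on the ordering, the relative popularity is
    # trustworthy even though the absolute counts disagree.
    print("tools agree:", len({tuple(o) for o in orders.values()}) == 1)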

Danny polls the audience on which keyword tools are being used. Christine says that Overture is still good for brainstorming, even though there isn't a large show of hands for it.

Q: I have a question about linkbait. Some pages have a lot of icons that add clutter to the page to submit to Digg, Delicious, Reddit, etc. Has anyone done any research on "how much is too much?"

Cameron: I prefer not to include those at all. Generally, a lot of stories get submitted to Digg and provide value to the Digg community, but if the Diggers see that your site is continually being submitted to Digg, they will equate your site with spam. You should add only the ones that are most relevant to your userbase (sketch below).
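On the markup side, Cameron's "only the relevant services" point is a one-line filter. A hedged Python sketch - the submit URL patterns below are recalled from the 2007-era services and are assumptions, not documented APIs:

    # Render submit links only for the services your audience actually
    # uses. URL patterns are assumptions based on the era's services.
    from urllib.parse import urlencode

    SUBMIT_URLS = {
        "digg":      "http://digg.com/submit?",
        "delicious": "http://del.icio.us/post?",
        "reddit":    "http://reddit.com/submit?",
    }

    def submit_links(page_url, title, services):
        """Build submit links for only the chosen services."""
        params = urlencode({"url": page_url, "title": title})
        return [SUBMIT_URLS[s] + params for s in services if s in SUBMIT_URLS]

    # A tech-audience page might only warrant Digg and Reddit:
    for link in submit_links("http://example.com/post", "Example Post",
                             ["digg", "reddit"]):
        print(link)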

Q: In a corporate culture, how do you explain to corporate clients - people who don't get it - that you will be getting links from unrelated sites, unrelated topics, etc.?

Cameron: I think the links are pretty relevant. The blogs that link are generally on topic with what you've written. There will be a few irrelevant links, but it's not like they're going to hurt you.

Aaron: A lot of links are polluted anyway across the web.

Todd: It will spread through your group of interest. That's how it spreads. The vast majority of your links will be on target.

Christine: You should have a balance in your links. When you do link-building, you don't want only Digg links, or only blog links. You need links from a variety of sources.

Greg: We don't have control over anchor text and who links to you. However, it helps develop trust for your site, which helps it rank better. You should still look for focused, targeted links.

Cameron: You can't control that anchor text, but you can influence it by having an applicable title, description, etc.

Q: How many targeted links would you go after per month without penalty?

Aaron: It's all relative to what others are doing and the field you're playing in. Some people go for thousands, and others go for fewer.

Followup: Let's say it's under control and it's not viral.

Greg: The rate at which we add links really depends on the space and where your site is in the level of trust. What's good for one might not be good for another. Older sites can get crappy links and it will help; newer sites have problems with that.

Jim: I think a lot of sites are too concerned about the number of backlinks. It also depends on the quality of the link. If a subpage has a thousand links pointing to it and you get a link from that page, it's worth a lot more.

Todd: In regards to Greg's comment about fitting into your space, I once got 65,000 links overnight by pressing a button. The top competitor had 3,000 links. Eventually they found me out.

Q: Regarding getting large companies to make the shift, I run into challenges with editorial writers (locked into an old style of writing) whose copy isn't SEO friendly. Also, there are graphic designers who want to build Flash sites that can't be discovered by spiders. Have you had any success stories on getting them to change their minds?

Todd: We have a lot of clients like that. It's a long process of education. Search engines need to understand what the page is about. If you show someone a description of a product without the actual image of the product, people don't necessarily know what product you're describing. You need to make sure it's understandable. Add one word to the writing and make it as simple as possible. That's going to get their attention. A lot of corporate clients are thinking about this as a resource issue of how to get it done.

Greg: If they give me grief, I'll just fire them. It's like pulling teeth. Lawyers ruin everything; everything hits legal and comes to a grinding halt. I don't work at that level anymore, simply because I can't work with clients unless they prove to me that they'll follow through. I like to win.

Alex: In terms of designers, Danny wrote an article that clicks for some of them. Make sure that any designer has a design that is cross-browser compatible. If you think of the bot as a browser, that's how you can help them see how to work on the content.

Christine: Some clients will launch that Flash page regardless of what you tell them. They will learn by being burned. A lot of times, you can then convince them.

Todd: We have clients that have a lot of Flash pages. There's absolutely no way in the world that they will change them. We built out an HTML version of the site and served it to the bots with user-agent delivery (a sketch of that follows below). Call it cloaking if you like, but it's the same page for the search engines.

Greg: We used to use cloaking to prove our case for some of these clients. I would build a bot version of that model and cloak it. All of a sudden, they would believe it.

Danny: You can have a great headline that is keyword rich. Newspapers are struggling with content behind paid walls; 25% of visitors to newspapers come from search engines. I think that's because they are learning that they need to change the way they write. At Search Engine Land, I talk about the "third browser" - everybody uses a search engine. Designers have to design for IE, Firefox, Safari, etc., but search engines are more popular than all those browsers combined. That idea should resonate with the developers.

Followup: I wanted to follow up on that question. I've trained journalists to write for search. You shouldn't mention SEO at all. You should emphasize that you're writing for users who use search engines.
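The user-agent delivery Todd describes comes down to a template switch on the request's User-Agent header. A minimal Python sketch with an assumed bot list and hypothetical template paths - this illustrates the mechanics the panel is debating, not a recommendation:

    # Serve the HTML build of a Flash page to crawlers and the Flash
    # build to everyone else. Bot tokens and paths are illustrative.
    BOT_TOKENS = ("googlebot", "slurp", "msnbot")  # assumed 2007-era crawlers

    def choose_template(user_agent):
        """HTML version for bots, Flash version for browsers."""
        ua = (user_agent or "").lower()
        if any(token in ua for token in BOT_TOKENS):
            return "templates/product_html.html"
        return "templates/product_flash.html"

    print(choose_template("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # HTML build
    print(choose_template("Mozilla/5.0 (Windows; U; en-US)"))          # Flash build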

Q: I was wondering how relevant page freshness is to ranking. Do we need to reoptimize pages every few months? Sometimes we see pages that rank higher which are crap but are fresher.

Danny: You've never seen this page before and it comes up?

Aaron: You probably see that these newer pages are featured more prominently in their site's page structure. You need to make sure that your older pages continue to have that prominence, and that your internal link weight focuses on those types of pages. You should assume that since Google News is now integrated into search, Google is also focusing on newer content.

Jim: Google will try to feed in a fresh page that may or may not last through time. I think there are a lot of people who say that content needs to constantly change and be fresh. Search engines look at this. If you wrote a great page in 1996 and many people linked to it and you suddenly decided to change that content, I believe the search engines see those older links as not having as much value; newer links, however, would have more value.

Todd: I know a site that has 500,000 links to it, and Google says it has 8. You need to use other tools like Yahoo. It might be a freshness issue - it might be something entirely different. Google is constantly tweaking the algorithm. Today, something might have changed. Always make minor adjustments to maintain your position.

Greg: Check your header messages - Google supports the Last-Modified date (a quick check is sketched below).

Christine: I have pages that I optimized 8 years ago that still rank, so it might not be a freshness issue at all.

Danny shows a Google Blogoscoped blog post that mentions that Google has a QDF value - "query deserves freshness."
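Greg's header tip can be verified in a few lines. A stdlib Python sketch against a placeholder URL: a well-configured server sends Last-Modified and answers an If-Modified-Since revalidation with 304 Not Modified.

    # Check that a page sends Last-Modified and honors If-Modified-Since.
    # The URL is a placeholder; some servers may not support HEAD.
    import urllib.error
    import urllib.request

    url = "http://example.com/some-page"

    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        last_modified = resp.headers.get("Last-Modified")
        print("Last-Modified:", last_modified)

    if last_modified:
        revalidate = urllib.request.Request(
            url, method="HEAD", headers={"If-Modified-Since": last_modified}
        )
        try:
            with urllib.request.urlopen(revalidate) as resp:
                print("Status:", resp.status)  # 200: the page was re-sent
        except urllib.error.HTTPError as err:
            print("Status:", err.code)         # urllib raises on 304 Not Modified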

Q: Do you have tips on how a retailer can optimize in Google Base to appear above the organic results?

Nobody on the panel uses it. Someone in the audience says that it doesn't do much for him.

Danny: I'm not actively using Base, but they are taking in more and more database data. They are doing outreach to real estate agencies. Base doesn't yet have that momentum and still needs to be experimented with. They may downplay it, but it is going to be Google Real Estate or Google Classifieds down the line.

Q: Regarding Google Base, we have a real estate company that updates the feed for Google Base (a sample of such a feed is sketched after this exchange). We see that people are scraping the MLS off the feed.

Greg: The playing field is never even. Google gets a lot of duplication in the MLS. Individual agents send their own stuff and then brokers send their own stuff, so I think they're taking only from brokerages, and that will hopefully address the spam issue. You always want to hope that the guidelines are evenly enforced and policed. The guidelines say don't do it, but Google doesn't allocate the manpower to hit the people who are cheating. Those that understand that push the envelope. So now you face your own decision - do I wait for Google, or do I compete?

Todd: About a year ago at another conference, Tim Mayer of Yahoo was looking at spam (pills, mortgage, etc.), and we concluded that you can't take a sword to a gunfight. Otherwise, you'll miss the opportunity entirely.
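For readers wondering what such a feed looks like: Google Base accepted bulk uploads as tab-delimited files (as well as RSS/Atom variants). A hedged Python sketch - the attribute names below are from memory, not Base's documentation, and should be checked against the housing item type before use:

    # Write a tab-delimited Google Base bulk upload for listings.
    # Attribute names are assumptions; verify against Base's item types.
    import csv

    listings = [
        {"title": "3BR ranch", "url": "http://example.com/mls/1234",
         "price": "249000 usd", "listing type": "sale", "bedrooms": "3"},
    ]

    with open("base_feed.txt", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=listings[0].keys(), delimiter="\t")
        writer.writeheader()
        writer.writerows(listings)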

Comment from someone in the audience: Eight months ago, I added a Google Base application and we saw some of the results going into a OneBox.

Danny then reviews Google Universal Search and shows how the Google OneBox is becoming more prevalent: news results, maps, pictures, and Google Video are replacing traditional organic results.

Q: I wanted to know if you have a favorite tool that you use on a regular basis for anything.

Greg: All our stuff is internal tools that we built ourselves.

Todd: We use a lot of internal tools, but my favorite Firefox tool is SearchStatus. The Web Developer toolbar is also brilliant.

Cameron: We have developed a tool in-house called Serph, which is a reputation management tool that checks many social media sites.

Jim: We have a We Build Pages Top 10 Analysis tool.

Todd: I also use Aaron's SEO for Firefox extension. We also use Xenu Link Sleuth, which comes back with a list of URLs on your site, etc. (He adds: It has a multi-threaded crawl. You can run it on your competitors.)

Alex: Xenu is great and lets you export the entire crawl to a file (a rough sketch of that kind of crawl follows).
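Xenu itself is a Windows desktop tool, but the multi-threaded crawl and file export the panel mentions are easy to picture. A rough stdlib Python sketch with a placeholder start URL - no politeness delays or robots.txt handling, so treat it as an illustration only:

    # Fetch pages concurrently, collect same-site links, export the crawl.
    from concurrent.futures import ThreadPoolExecutor
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import urllib.request

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def fetch(url):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                parser = LinkParser()
                parser.feed(resp.read().decode("utf-8", errors="replace"))
                return url, "ok", [urljoin(url, link) for link in parser.links]
        except Exception as err:
            return url, str(err), []

    start = "http://example.com/"                    # placeholder site
    site = urlparse(start).netloc
    seen, frontier, report = {start}, [start], {}

    with ThreadPoolExecutor(max_workers=8) as pool:  # the multi-threaded part
        while frontier:
            results = list(pool.map(fetch, frontier))
            frontier = []
            for url, status, links in results:
                report[url] = status
                for link in links:
                    if urlparse(link).netloc == site and link not in seen:
                        seen.add(link)
                        frontier.append(link)

    with open("crawl_export.txt", "w") as f:         # Xenu's "export to file"
        for url in sorted(seen):
            f.write(url + "\t" + report.get(url, "") + "\n")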

 
