SEO 101 - The Timeless and Classic Hits

Dec 4, 2007 - 3:28 pm
Filed Under PubCon 2007

This is a Q&A recap of the Intro to SEO 101 session with Jill Whalen, Bruce Clay, Jake Baillie, Bill Slawski, and Ash Nallawalla.

Q: How important are on page factors now relative to off page factors?

A: Bruce: If you do the wrong thing it will cause problems. On-page, follow best practices and keep pages spiderable; if you do all pages correctly, you get good jumps in ranking.

Jill: Linking is extremely important, but both work together, and on-page comes in handy. First get the on-page stuff in order, then focus on linking. So many people look for links before they even have decent title tags.

Ash: If you're working on competitive sites, off-page is more important; if not competitive, on-page is more important. Treat on-page as a formula and use it as a guide.

Bill: Getting into the competition is important. Being there without paying attention to content and without doing things well means you're not part of the game. A new thing people are not paying attention to: search engines are indexing parts of pages instead of whole pages, so you need to think about how pages are laid out. If a page covers more than one topic, you can use more than one heading on it.

Jake: If you had to prioritize on-page factors, which single one is most important to optimize first?

Bill: Make sure content, title tags, meta descriptions, etc. are unique, otherwise pages get lost. Crawl the site to make sure there are no issues with pages getting indexed. If search engines can't find your pages, nothing else matters.

Which is most important?

Bill: Probably the title.

Ash: Title, and get the architecture right.

Jill: Title tags, optimized for the proper keywords.

Bruce: Title tags carry the most weight, with body text a close second. You must have unique content on every page; the title is more important. You also need a site map.

Q: What is the worst SEO factor you’ve seen?

Bruce: All-Flash sites with no site map.

Jill: Content embedded in graphics is a big mistake.

Ash: A whole website in one file, and splash pages.

Bill: Saw a website where one page was indexed 15,000 times. The page had 28 widgets, and every time one was clicked the URL changed. It took hours to figure out why and minutes to fix with JavaScript.
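A plausible reading of that fix (a minimal sketch only; the markup and function name are hypothetical, not Bill's actual code): have widget clicks handled by JavaScript instead of links that append parameters, so crawlers only ever see one stable URL.

    <!-- Before: each widget click produced a new crawlable URL -->
    <!-- <a href="page.html?widget=3&state=open">Toggle widget</a> -->

    <!-- After: JavaScript handles the click and the URL never changes -->
    <a href="#" onclick="toggleWidget(3); return false;">Toggle widget</a>

    <script type="text/javascript">
    function toggleWidget(id) {
      // Show or hide the widget in place; no new URL is created
      var el = document.getElementById('widget-' + id);
      el.style.display = (el.style.display === 'none') ? '' : 'none';
    }
    </script>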

Jake: dairyqueen.com – its home page was indexed as "cookies required"!

Q: What is the biggest problem these days? Any tips on avoiding duplicate content problems and related content issues?

Bill: Start by crawling the site with Xenu's Link Sleuth. Find where the problem is – where the CMS is messing up. Xenu Link Sleuth rocks! Make sure robots.txt doesn't disallow Xenu.

Ash: Robots.txt is a good way to solve duplicate content issues. A good example is vBulletin forum software, which easily causes duplicate content. WordPress has a good plugin (forgot the name) that puts a nofollow on the duplicate links. The SearchStatus Firefox extension by quirk.biz lets you view nofollow links: the individual posts keep followed links, but all the duplicate links to the dupe page have nofollow.
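For anyone unfamiliar with the attribute, this is what that pattern looks like in markup (a sketch with hypothetical vBulletin-style URLs): the canonical link stays followable while the duplicate views are nofollowed.

    <!-- Canonical link to the thread: left followable -->
    <a href="/forum/showthread.php?t=123">Thread title</a>

    <!-- Duplicate views of the same content: nofollowed -->
    <a rel="nofollow" href="/forum/printthread.php?t=123">Printable version</a>
    <a rel="nofollow" href="/forum/showthread.php?t=123&amp;mode=linear">Linear view</a>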

Jill: Don't use session IDs; feed the engines clean URLs. If you're rewriting URLs to clean them up, exclude the ugly URLs. You don't need to mod rewrite much these days.
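One common way to exclude the ugly versions is robots.txt. A minimal sketch (the parameter names and paths are hypothetical; note that wildcards in Disallow paths are an extension honored by the major engines, not part of the original robots.txt standard):

    User-agent: *
    # Keep session-ID and other parameter-laden URLs out of the index
    Disallow: /*?PHPSESSID=
    Disallow: /*&sessionid=
    Disallow: /cgi-bin/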

Bruce: www vs. non-www is a duplicate content issue, and https: can also cause a problem. Content syndication can cause problems; you have to address that. The biggest problem is multiple pathways to the same content because of the architecture. Pick the pages you want indexed and de-index the ones you don't. Getting spidered is great, but the engines might interpret what they find as duplicate content. The easiest way to spot issues is to do a site: search in Google and look for duplicate page titles. If your site doesn't have duplicate page titles yet you see them in the index, that's the big problem.
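The www vs. non-www issue Bruce mentions is usually fixed with a single 301 redirect at the server. A minimal Apache .htaccess sketch (example.com is a placeholder; assumes mod_rewrite is enabled):

    RewriteEngine On
    # Send all non-www requests to the www host with a permanent redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]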

Jake: How many people here represent large brands that everyone's heard of? Can you tell us how you approach those sites?

Bruce: We have a different approach for every site – it's a consulting solution, not page by page. You need a consulting agreement.

Jill: There is good and bad in working with each, large sites and small. Large ones can be good. Sometimes you're doing training for people in-house so they get excited about it – that can be good; otherwise it's tough. It's easier to work with a marketing manager who knows what SEO is. Large sites are great because they already have many links, so you rarely need to do link building for them – that can be exciting.

Ash: I work with a large Australian online company with many divisions. The problem with a large company is that everyone is an expert in the online world. They have a lot of power to engage external consultants, and sometimes the business ends up working with multiple SEO consultants. Some companies acquire technology for reasons other than SEO – security might be the main priority. Different divisions may choose different CMSs, and the site gets built by multiple crews.

Bill: Both large and small sites have unique challenges, and you need a strategy for each. You often find yourself making a business case when presenting SEO. Work on templates instead of individual pages. Often you're not the first person there, which is a challenge – you play SEO archaeology. One site had 2,000 301 redirects from previous campaigns; I pity the search engines trying to figure out what's going on. Small sites can do investigative marketing research, finding niches that large companies cannot take advantage of – no board meetings and committees. Both are fun, just different.

Jake: Let's open it up to the floor. Any questions?

Q: We are in the process of moving into the world of CSS. What should we do?

Bruce: My site had been done the same way for 12 years; I redid it with pure style sheets and decided to make it W3C compliant. The result was more content relative to code, and rankings went up. CSS won't cause any negatives – style sheets work well.

Jill: If you're leaving the content the same, no problems should arise from style sheets. Keep the URLs the same; if you can't, 301 redirect them.

Jake: Long ago the concept of a text-to-code ratio was promoted. Does cutting down the code relative to the text help?

Bill: Microsoft looked at on-page content in spam fighting; they spoke about code-to-content ratios and text-to-style ratios. I'm not sure there is a magical percentage, other than in the spam area. If there's a lot of code on a page, it's open to errors. I like CSS because it eliminates some types of duplicate content. You can also use style sheets to let people view the site on wireless devices, which is growing fast. If you're not thinking about mobile and how CSS can help, you're missing opportunities – for mobile search, it will help.

Ash: If you have a large website, introduce the changes gradually.
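To make Bill's point concrete: moving presentation into external style sheets keeps the HTML lean, and a media-specific sheet can target handheld devices (a sketch; the file names are hypothetical):

    <!-- Presentation lives in external files, keeping the markup lean -->
    <link rel="stylesheet" type="text/css" media="screen" href="/css/screen.css" />
    <!-- A separate sheet for wireless/handheld devices -->
    <link rel="stylesheet" type="text/css" media="handheld" href="/css/mobile.css" />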

Q: One of our products is a barcode scanner. People search for "bar code" as two words and as one word, and most people search with the technically wrong form. What do you recommend?

Jill: Dedicate a page to the misspelling. I don't recommend using both versions on the same page.

Bruce: Sometimes the "did you mean" prompt disappears because the index is getting smarter. If you do a search and it asks "did you mean", the other form has prominence. Given that, I would want to rank for the way most people write it.

Ash: Don't use misspellings in visible text. Try microsites built around the misspelling and try to get traffic from them.

Bill: This is an issue that search engines should handle better – it's less of a problem in English than in German. As Bruce said, the engines mine query logs to understand how people search, including the treatment and handling of merged words. Look at the SERPs when doing keyword research. I've tried using alternate versions, but not too much.

Jake: Write a page that asks – which is correct, "Bar Code" or "Barcode"!

Q: Talk a little bit about sitemaps – dynamic vs. manual, and Google Sitemaps. How important are they, and what's the best way to create them?

Bruce: You need two – a spiderable HTML one and an XML one. Follow the standard at sitemaps.org. You can use a Sitemap: directive in robots.txt to point spiders to the sitemap – including Ask. You can daisy-chain large numbers of sitemaps with a sitemap index. Beware of dupe content!
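Per the sitemaps.org standard Bruce cites, the pieces look like this (all URLs are placeholders). The robots.txt autodiscovery line:

    Sitemap: http://www.example.com/sitemap.xml

A minimal XML sitemap:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-12-04</lastmod>
      </url>
    </urlset>

And the daisy-chaining is done with a sitemap index file, which can point at up to 50,000 child sitemaps:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-articles.xml</loc></sitemap>
    </sitemapindex>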

Jill: If your site is made correctly, you don't really need one. You don't get an advantage unless you have a massive site. Without a good internal link structure, you're not going to succeed anyway.

Ash: There are lots of free tools and scripts – use them for large websites. I agree with Jill.

Bill: I agree with Jill – it's valuable for larger sites. It doesn't hurt to do it and see what happens. Look at the Webmaster Central tools to see indexing errors and make sure everything is there.

Q: How do I use mod_rewrite, and how do I find people who know how to do it?

Jake: Webmasterworld is great for that.

Ash: If you look for an .htaccess cheat sheet, it will help you.

Jake: There is a session covering it.

Jill: Look at forums.
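As a minimal illustration of the kind of .htaccess rewrite rule being discussed (Apache mod_rewrite; the paths and script name are hypothetical):

    RewriteEngine On
    # Serve the clean URL /widgets/blue-scanner from the real dynamic script
    RewriteRule ^widgets/([a-z0-9-]+)$ /product.php?item=$1 [L]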

Q: How much of one's online budget should go to SEO?

Bruce jokes: All of it (chuckles). Go with an ROI approach – if it works, spend more.

Jill: Agreed. If you're making more on PPC, use PPC. It's all about ROI and conversions.

Q: Is there any relevance of the domain name to ranking?

Bruce: Matt Cutts said zero, one, or two hyphens is the max in a domain name; try not to use hyphens in the domain at all. In file names the max is 14 hyphens, but the ideal is just a few. In Google, an underscore counts as a character and does not separate words.

Bill: Just to add, on subdomains: Microsoft did a paper on spam and found that subdomains using lots of hyphens get flagged.

Q: On search engine friendly pages and query strings – is the only benefit the keywords? Can you help me with that?

Jake: It's a pet peeve of mine. Parameters won't cause you not to be indexed – that's not your problem. It's how you use them and how they create duplicate content. Largely, the issue is not the parameters themselves.

Jill: Jake said it all.

Bill: Are keywords in the URL part of the question? Yes. Search engines use lots of ranking signals – is there special importance to having the keyword in the URL, such as in the query string? Not much, but there might be some.

Jill: In many cases, don't take pages that are already indexed and then rewrite their URLs. That could be SEO suicide.

Jake: Shorter URLs are good for usability. The shorter the better.

Q: How do you deal with feed-based sites?

Bill: Do something better – build better descriptions. When submitting feeds, make your website as crawlable as possible and optimize the feed. Submit both. A feed-based site might not get crawled as deeply.
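On "optimize the feed", a sketch of what that can mean in a standard RSS 2.0 item (the product and URLs are hypothetical): unique, descriptive titles and full descriptions rather than truncated stubs.

    <item>
      <title>Acme 9000 Barcode Scanner - USB, 200 scans/sec</title>
      <link>http://www.example.com/products/acme-9000</link>
      <description>Full, unique product copy here, not a one-line stub.</description>
    </item>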

Bruce: A shopping-related query vs. an intent-based query needs an appropriate site. Mindset.research.yahoo.com is a great tool for analyzing intent. You can optimize around that; you need to know what the search engines are reporting.

Jill: If you do it and it makes you a positive ROI, keep doing it.

Q: If you maintain the #1 position, what housekeeping should be done?

Bruce: If it ain't broke, don't fix it. As a searcher I'll go to the top three sites, and ranking doesn't always mean conversions – I might go to site #2, then site #3, and then buy from site #3. If you are number 1, congrats, you're doing well. Don't focus on the SEO – don't break it – focus on your conversions and your bounce rate.

Jill: Make sure you are not optimizing for one phrase – you are optimizing for thousands of phrases or more. Stop looking at the rankings and look at conversions. Start thinking about what other phrases will bring traffic.

Ash: Keep a close eye on the competition.

Bill: Broaden the range of people that can find your site. Sometimes rankings are the wrong thing to look at – long-tail terms might be better. It's not just about being number one. You can't set it and forget it.

Q: A follow-up for Bruce – what can you do dynamically to improve conversions on organic landing pages? How can you change the content based on the incoming query string?

Bruce: A/B testing – improving copy so that the bounce rate diminishes. It's a science and an industry, and there are good tools for multivariate testing. When I search, I want to see my keyword at the top of the page – not "welcome to my site". Once someone comes to your site, that person is on a mission – you want to keep them there, answer their question, and convert them. If you can't convince them, it does you no good. Bounce rates under 20% are good. Your question is a session in itself, but it must be done.
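One simple way to do what the questioner asks – not necessarily Bruce's method, and the element id and copy here are hypothetical – is to read the searcher's query out of the referring SERP URL and echo it into the headline:

    <h1 id="headline">Barcode scanners</h1>
    <script type="text/javascript">
    // Sketch only: pull q= (Google) or p= (Yahoo) from the referrer,
    // e.g. http://www.google.com/search?q=barcode+scanner
    var m = /[?&][pq]=([^&]+)/.exec(document.referrer);
    if (m) {
      var kw = decodeURIComponent(m[1].replace(/\+/g, ' '));
      // Append as a text node so the query can't inject markup
      document.getElementById('headline').appendChild(
          document.createTextNode(' – ' + kw));
    }
    </script>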

Q: What if a product is out of stock?

Bruce: It matters if there are links to the page. You might want to 301 redirect it to pass the PR.
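That 301 is a one-liner in Apache (a sketch; the product paths are hypothetical):

    # Pass the old product page's link equity to its replacement
    Redirect 301 /products/acme-8000.html http://www.example.com/products/acme-9000.html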

Bill: Create a new page and continue to get traffic to that page – replace the product or add a new model.

Q: This is a duplicate content question. We have good navigation for the user experience, and every page has the same navigation. Should I put a nofollow on the navigation?

Jill: That’s a typical way a site is built. No penalty.

Ash: One guy I know uses iframes for navigation. I haven't tested it, but it's worth thinking about.

Bill: Doesn’t hurt you to have the same navigation structure on every page.

Bruce: In my case my navigation is Flash, just to reduce the HTML, with footer links as well. In many cases there are too many links – keep them under 100, or use iframes; you must look at the architecture of the page and make sure the search engines see it as content. Under normal circumstances, you don't have to worry about it.

Q: Spiders change their behavior, as do the engines. How do you prepare for that?

Jill: Even though little things change to fight spammers, in my opinion the search engines always want the same thing – the best, most relevant page. That's your long-term goal. You don't have to worry about these loophole-closing changes aimed at spammers.

Bruce: My approach is more specific. I agree with Jill: the search engines want the same thing, and if you understand and use normal architecture, chances are those items are not going to change. Play by the rules and the search engines won't penalize you.

Bill: On the subject of crawling, there's a paper written in the late '90s by a Stanford professor who helped create Google. For example, how far a page is from the root directory matters: search engines prefer crawling many homepages versus many subpages. It's based on architecture, links, PageRank – things you control, like trailing slashes in URLs. A good paper worth looking at.

Q: If a website's content is largely built in a subdirectory, should we move it to its own domain?

Bruce: Happens all the time. Split them up and use 301s – not hard, but tedious.

Jill: A brand-new domain needs an aging delay. 301s will help, but sometimes a new domain won't have the same equity as the old directory.
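The tedious-but-not-hard part Bruce describes can be a single pattern rule on the old site (a sketch; the directory and new domain are hypothetical):

    # Send everything under /recipes/ to the new dedicated domain
    RedirectMatch 301 ^/recipes/(.*)$ http://www.example-recipes.com/$1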

Q: Is there a correlation between inbound links and the number of pages indexed? Would you start with the important pages in a sitemap?

Bruce: No correlation. I think the searcher learns how to be more specific.

Jill: If you have the content, put up the sitemap.

Ash: I did a test using sitemaps: Google indexed new pages the quickest, then Yahoo, then MSN.

Bill: I had a site with a million pages; it took 4-5 months for the first couple of pages to get indexed. Search engines identify massive increases in links and pages and see something out of the ordinary. Visitors will give you natural links, and other marketing besides SEO will help.

Contributed by Avi Wilensky, a search engine marketing specialist and owner of Promediacorp.

 
