Search Engine Strategies – Chicago, December 9th – Day One

Dec 9, 2003 • 11:14 pm | comments (1) | Filed Under Search Engine Strategies 2003 Chicago
 

The Search Engine Strategies (SES) December conference is hosted by Danny Sullivan of SearchEngineWatch.com. This is the first time the SES conference is in Chicago, and it spans three days (many SES conferences span two days, while the largest, in NY and CA, span four). There are probably 750 – 1,000 people attending, including exhibitor-only attendees. Since this is my first SES conference, I cannot compare past to present conferences, individual tracks, or speakers. I will be providing my view of the speakers and sessions I attend and reporting on them at the end of each day. In the future, I hope to provide a historical perspective on the conference. As a side note, I am really tired, so I hope this reads well.

Please continue reading for the extended entry...

Session One: Unfortunately I missed the entire first session, which ran from 9:00-10:30AM. This session had three tracks: “Introduction to Search Engine Marketing” presented by Danny Sullivan; “Organic Listing Forum” presented by Mike Grehan (author of an SEM book), Anthony Muller (Range Online Media), and Brett Tabke (Editor of WebmasterWorld); and “Search Engine Advertising Forum” presented by Ron Belanger (Carat I.) and Dana Todd (SiteLab). I cannot report on these because I did not attend them.

Session Two: This session ran from 11:00-12:15PM and the three tracks included: “Search Engine Friendly Design” presented by Shari Thurow of GrantasticDesigns; “Advanced Link Building Forum” presented by Greg Boser (WebGuerilla), Paul Gardi (Ask Jeeves), and Debra Mastaler (Alliance-Link.com); and “Search Engines and Affiliates” presented by Jamie Crouthamel (Performics), Todd Friesen (Oilman), and Lisa Riolo (CJ).

I chose the Search Engine Friendly Design track because (1) I am more interested in the organic sessions than the paid sessions and (2) I wanted to hear Shari live, and this was a full session of just her talking. Her presentation was very well organized and she spoke very well. What I did not like was that everything she covered, and much more, is available in her book (Search Engine Visibility). I read the book and it is an excellent read for beginners (I strongly recommend it); however, I was hoping she would add more in the live presentation, but she did not. The one minor change was her thoughts on DHTML navigation. She said (based on internal data she has collected) that Google has been crawling DHTML code over the past few months. Another thing I did not know was that Teoma considers links and resources pages on your site important. So for Teoma, unlike other search engines, having relevant external links on a resources or links page is a relevant factor in your overall site's ranking. Overall, I recommend this session for beginners attending the SES conference. If you cannot attend, then buy the book – you will get a lot more for your buck.

Session Three: This session ran from 1:30 – 3:00PM and also included three tracks. The first track, “Search Term Research,” was presented by Andy Beal (KeywordRanking), Chris Copeland (Outrider) and Dan Theis (SEO Research Labs). The second track, “Search Engines and Web Server Issues,” was presented by Greg Boser (WebGuerilla), Bruce Clay, and John Heard. The third track, “Contextual Ads,” was presented by Brad Byrd (NewGate), David Jakubowski (Quigo) and Joshua Stylman (Reprise Media).

My choice for this session was Search Term Research, because I was not particularly interested in contextual ads and I know enough about web server issues in relation to search engines. This track helped me reinforce the basic principles of search term research. Most of the information covered in this topic can be found in my article Keyword Selection Strategy, but it did contain some information that is not covered there.

Let's start with Andy Beal: 78% of people use search terms of 1 – 3 words in length. He also said that “generic names dominate” the searches performed, in contrast to brand searches (i.e., wireless telephone versus Panasonic wireless telephone). One thing he said that disturbed me was “search engines look for themes within sites” – that is not 100% correct; search engines look for themes within pages, not sites, and trust me, he knew he was saying sites. He is a strong advocate of WordTracker (many speakers are), and he also recommended ClickTracks for Web stats on small budgets. For what it's worth, the Overture search suggestion tool does a bad job of differentiating between singular and plural terms, and it ignores punctuation (WordTracker does not). Since I do not use WordTracker, it was nice to see it in action – I will probably buy a subscription to it now.

The next speaker, Chris Copeland, was a much better public speaker overall. He said a few things that caught my attention, including that 92% of the time a searcher will click through to you if you have both a paid and an organic listing on the SERP. That is about all I wrote down for his presentation.

The final speaker, Dan Theis, provided a deeper understanding of what to look for. His company does only search term research and nothing more (so that is cool). One point he made was that rank checking can provide skewed results: if you are ranking well for a search term that does nothing for you in revenue, then who cares! He has a point – many SEOs want to rank well for keyword phrases but do not conduct the appropriate keyword research to rank for keywords that drive revenue. Also, you cannot simply determine the competitiveness of a keyword with WordTracker's KEI feature or with the total results found in Google; neither is an exact measure. He uses a much more detailed process to determine competitiveness, including reviewing the value of search terms in the pay-per-click model – the higher the dollars per click, the more competitive the term – and running a paid feed and then researching the results over a two-week period. Finally, he said that a complex search term is rarely used compared to a simpler term (we know that, but he took the time to say it).

Session Four: This session ran from 3:30 – 5:00PM and included three tracks: “Writing for Search Engines” by Heather Lloyd Martin (SuccessWorks) and Jill Whalen (HighRankings); “Cleaning Up the Mess” by Matt Bailey (Kacher Group), Anne Kennedy (Beyond Ink) and Shari Thurow; and “Audit Paid Listings” by Kevin Lee (Did It), John Lustina (Intrapromote) and Jessie Stricchiola (Alchemist Media). I was really tempted to attend the Cleaning Up the Mess session, since that is what most SEOs spend their time doing, but I wanted to hear Jill Whalen and see how good she really is. The Audit Paid Listings session did not interest me…

The Writing for Search Engines track began with Heather Martin. Let me tell you, she is full of energy and she really knows how to get the audience into it. Excellent expression – the best presenter I have seen so far at the conference. Notes taken include: use 2 – 3 keyphrases per page; if you are editing a large site, start with the top 20 pages in terms of top products and then work your way to the other pages; too many links (anchor text) make a site's usability extremely poor; and repeat the keyword phrases 3 – 4 times within a 250-word page. Both speakers were clear that you should not look at “keyword density” when writing copy, because (1) the content will come out unclear and urge the visitor to hit the back button, and (2) with the current Google changes (the Florida update), Jill feels that Google might be filtering out “over-optimized” sites. I personally disagree with that, but who am I :)?

Now it's Jill's turn. She was a far less impressive speaker, but she did go directly after the best speaker of the day. After I got over her speaking skills, I listened up for her SEO wisdom. Jill said right out, “don't worry, they [Google] will fix things,” when she discussed certain terms such as chicago real estate. Jill repeatedly said that her clients' results were not affected, but she also brought examples of her clients' terms that were affected, so I am not too sure about that. She said “don't use formulas” – just write with keywords in mind and keep the percentages and templates out of SEO copywriting. She then went on to say that “floating keywords around” is why Google kicked out the sites that have disappeared. Another point she made was that if you have words like e-mail/email or t-shirt/t shirt, do not optimize for both on the same page – use different pages for each variation, because your Web visitor will think you are crazy for using different spellings on the same page (they might not notice if it's spelled differently on a different page). And for the big statement: Google no longer reads alternative text tags since the last update.

Session Five: This ran from 6-7PM with Danny Sullivan. This session was just an open forum on anything search engine related. My overall impression of Danny Sullivan is that he holds a wealth of knowledge of both the history of the search engine industry and search engine practices. I guess that is a must for the “search engine guru.” The first question was obviously about the Google Florida update; surprisingly, there were a large number of people who did not know about it (goes to show you who attends these conferences – fyi, not meant as a negative remark). Danny Sullivan believes that Google is “testing the new algorithm” by applying it to certain results and not others. He believes this is the case because Google allowed for a method of showing the previous algorithm's results by using the –ksdjsdf trick. I tend to disagree with this theory, simply because I would think Google would “test” a new algorithm at a better time and not right before the big holiday season – but again, Google has no obligation to site owners. I also disagree with this theory because, if it were the case, why would Google try to block Scroogle.org? One can argue that it goes against Google's TOS, but why did they try to close it down so quickly? Why not go after WebPositionGold or something?

He then went on to more questions and covered the history of search engines and the big consolidation of search engines and directories (i.e., Yahoo, Overture, Inktomi, AltaVista, AlltheWeb). He said the reasons Yahoo has not yet consolidated them all into one service may be (1) contractual agreements, (2) internal political issues – which technology do I use for X, Y and Z – and (3) the complicated process of getting everything integrated. Yahoo! switching from Google to Inktomi will not only bring down Google's market share but also affect us SEOs. Finally, he believes that search engines in the future will be more paid oriented and less organic oriented, because that is the nature of advertisers and this industry is still in its infancy.

That wraps up my day at the conference. Tomorrow is day two, and I will post my thoughts on it tomorrow night.

Thanks all!

 

Comments:

Marc

04/11/2007 01:43 pm

How do you know Google is crawling DHTML? I used to have text pop-ups using DHTML and they were not being crawled. I originally thought DHTML was the fix to uncrawlable Flash with Google.
