The Search Engine Strategies (SES) December conference is hosted by Danny Sullivan of SearchEngineWatch.com. I am providing my take on the speakers and sessions I attend and reporting on them at the end of each day. I missed the last session because I had to catch my flight back to New York. It's amazing how the place emptied out on the last day; I'd bet that 75% of the people from day one were already gone by the second session of the last day.
For day three, please read on.

Session One: Keynote with Steve Berkowitz, the President of Ask Jeeves. This half-hour session was basically a presentation on the search engine industry and where Ask Jeeves fits in. I showed up 10 minutes late due to some technical issues getting the SEMPO blog up. He described how the Web has grown and why search is the number one activity and destination on the Web. He also explained that other sites such as Amazon, eBay, MapQuest and others are, in a sense, search engines: they all push and utilize search to help their users find what they need, even though these sites do not consider themselves search engines. He gave a lot of nice figures and statistics on the market, including that experienced users rely on search more than new users do; for example, new users use search 40% of the time to find new sites, whereas experienced users use search 85% of the time.
Steve then went on to explain how search is growing and that Google is not the only search engine out there. He quoted an October 22nd Traffick.com blog post by Andrew discussing Ask Jeeves and why they are still a player. In reality, Google is behind Yahoo! in terms of where site searches are conducted. On average, people use 2.8 search engines per week, and brand awareness for all the major search engines is significantly high. He also made a point of saying that less than 1% of searchers use the tabs within search engines, backing Danny's "invisible tabs" theory and promoting his own engine as the one that handles that theory best to date. Also, Ask Jeeves searchers have been coming back more than ever since Teoma search was added. A Q&A session took place afterwards; nothing too interesting to report there.
Session Two: To keep things short, take a look at the conference agenda for day 3 for the speakers and what was covered. I opted not to go with the "Converting Visitors to Buyers" track or the "Doing It In House Forum" and went to the "Meet The Crawlers" track instead. The speakers for this track were Jon Glick, senior editor at Yahoo! Search; Steve Gemignani of LookSmart; Craig Nevill-Manning of Google; and Paul Gardi of Ask Jeeves.
First up was Jon Glick from Yahoo!, who gave a quick overview of what's new at Yahoo!. One thing you can now do is access all your paid inclusions for Inktomi, AV, etc. in one place: Overture. Yahoo! Shopping came out with SmartSort, a new and more intuitive way to filter for what you are really looking for in Yahoo! Shopping (a pretty neat interface). He also mentioned that if you do an Inktomi feed, you can be included automatically (at no extra charge above the feed) in Yahoo! Shopping.
Next up was Steve Gemignani from LookSmart/WiseNut, who started off comparing LookSmart to the Little Engine that Could. LookSmart's network includes WiseNut, Zeal, and Grub. I am pretty sure they will be promoting and utilizing Grub as a major player in their network, but it was hard to hear that comment clearly. They are also working on an interesting new method for crawling updated pages: instead of crawling pages at set intervals like many crawlers do, they will look at how often an individual page is updated and determine from that how often to send their robot back to it.
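The adaptive recrawl idea described above can be sketched roughly like this. Note this is a minimal illustration of the general technique, not LookSmart's actual algorithm; the adjustment factors and interval bounds are my own assumptions.

```python
# Hypothetical sketch of adaptive recrawl scheduling: revisit a page
# sooner when it changed since the last crawl, and back off when it
# did not. All constants here are illustrative assumptions.

MIN_INTERVAL_DAYS = 1.0   # never recrawl more often than daily
MAX_INTERVAL_DAYS = 90.0  # never wait longer than ~3 months

def next_interval(current_interval: float, page_changed: bool) -> float:
    """Return the number of days to wait before recrawling a page."""
    if page_changed:
        new_interval = current_interval / 2.0   # page is active: revisit sooner
    else:
        new_interval = current_interval * 1.5   # page is static: back off
    # Clamp to sane bounds so the scheduler never thrashes or stalls.
    return max(MIN_INTERVAL_DAYS, min(MAX_INTERVAL_DAYS, new_interval))
```

Run against a page that keeps changing, the interval converges toward the minimum; against a static page, it drifts out toward the maximum, which is the behavior described in the talk.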
Craig Nevill-Manning from Google was the next speaker. He started off by asking Danny Sullivan to enter a weird alphanumeric code and click go. Google returned a UPS result; Google now recognizes that you typed in a UPS tracking number and gives you that information. Google also does this for FedEx, airplane reservations, patent numbers and more. I thought that was pretty neat. Other new features were discussed, like Froogle and the stuff going on at labs.google.com. He ended with that, and we were able to ask some questions later.
Paul Gardi from Ask Jeeves spoke next about how unique and different their search (i.e. Teoma) is from Google and the others. He said that while others try to emulate Google, Teoma takes its own approach. Based on research, Teoma's results only overlapped with Google's by around 10%, whereas the other search engines overlap a lot more. I discussed how Teoma's technology works in the Link Building track on Day Two. Paul asked Danny to type "quote askj" into ask.com and see what it returned. Nothing came back at all; how embarrassing! After some work, they reworded the phrase to "Askj quote" and it brought back the stock information for Ask Jeeves. Paul was quick to say they are still working on ironing out the bugs.
Questions and Answers: The first question was directed at Google and was about, yes, you guessed it, the Google Florida Update. Craig was quick to make the sarcastic remark that he was surprised to get that question. To quote him, Google is "not perfect" and "a work in progress." He also made it clear that the update was done for quality purposes and was in no way intended to increase AdWords revenue. He was clear that taking a revenue-driven path for organic search would be counterproductive and would quickly cost Google market share, ultimately leading to less revenue. Google's search and AdWords departments are separate and do not influence each other in this sense. Then he gave the generic response that Google strives to provide the most relevant results for the user and will continue to work toward that. They appreciate feedback and recommend you report yours.
The next question was about personalization and how the search engines will use a person's location and demographic information to enhance search. Basically, the concept is: since I know who you are and what you searched for in the past, I can provide results tailored to you rather than to the guy sitting in the next cubicle. They all said they do track this information if you have their toolbars installed, but they would never sell or give it to a third party. Yahoo! said we can expect to see this as an OPTION in a year or two, but it will not be required. Most of the others said the same, except for Google, which said they have no plans to personalize results per user; they said that would just make things more confusing and is not the best way to approach search.
How many people and resources does each search engine put toward spam reports? All but Yahoo! said they did not have any specific team or individuals responsible for spam. Yahoo! said they have three full-time engineers who handle spam; send your spam reports to either firstname.lastname@example.org or email@example.com. Google said they will rarely (almost never) block or ban a specific page or site; instead, they look at the spam and build filters into their algorithm to catch those types of sites. Teoma basically came out and said that they do not have spam, thanks to their unique method of ranking sites. That was a bold statement and I liked it; I am starting to like Teoma.
The next two sessions I attended had little to do with optimization strategies and more to do with conversion tactics and analysis.
Session Three: To keep things short, take a look at the conference agenda for day 3 for the speakers and what was covered. I opted not to go with the "Dealing with Directories" track or the "Outsource SEM Business Forum" and went to the "Measuring Success" track instead. The reason I selected Measuring Success over the others is that the majority of my business is developing sites and figuring out how to convert the sale; SEO is really just a prerequisite. The speakers for this track were Bryan Eisenberg from Future Now and Laura Thieme from Bizresearch.
Bryan started off by saying that you must define each page's action (each and every page). That means: what is the action you want a person to take? Who has to be persuaded to take that action? What do they need in order to take it? Conversion rate is a measure of your ability to persuade your visitors to take some sort of action. You obtain actionable data by (1) defining your metrics, (2) using your own benchmarks, not your competition's, and (3) watching how the metrics trend. He said the number of people who leave from your homepage will always be a little inflated, because people by nature like to go back to your homepage before leaving your site; I found that interesting because it makes sense and I had never heard it before. He said an important figure to track is latency, and many of the new Web analytics programs can help you do this. He wrote an article that covers more of this information, which can be found at http://www.clickz.com/sales/traffic/article.php/2174241.
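Conversion rate as Bryan defines it boils down to simple arithmetic: the share of visitors who took the action you defined for the page. A minimal sketch (the function name and zero-visitor handling are my own choices):

```python
def conversion_rate(actions_taken: int, total_visitors: int) -> float:
    """Share of visitors who took the page's defined action."""
    if total_visitors == 0:
        return 0.0  # no traffic yet: treat as zero rather than divide by zero
    return actions_taken / total_visitors

# e.g. 42 completed actions out of 1,400 visitors
rate = conversion_rate(42, 1400)  # 0.03, i.e. a 3% conversion rate
```

The point of defining the action per page is that the numerator differs page by page (a newsletter signup on one page, a purchase on another), while the arithmetic stays the same.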
Laura Thieme was next, and I took very few notes because she basically discussed keyword rank reporting in relation to traffic. All pretty basic stuff, but they do it in detail, so I am sure Bizresearch is a quality company. My only issue was that she regularly uses WebPosition Gold to conduct keyword rankings. I asked her later why she uses it when it is explicitly against Google's TOS, and she said she runs it twice a month, at off hours. Danny jumped in to say that if you go ask a Google rep, they will pretty much be OK with it as long as it's not abused. I then asked why not use the Google API. Danny said because there are other search engines; this from the same Danny who said that Google powers 77% of all Internet searches. I understand that number will drop, but Overture has an API as well. Google and the other search engines say not to run automated queries, and I say don't do it. Not because it might get you banned (I know it would not get a site banned), but because the search engines are an SEO/SEM's friend and we should not abuse them. There are so many SEO/SEM companies using WPG and other programs that don't use the API, and if they run these reports twice a month for each client, that is a huge load on the search engines' servers. The answers I got to this question disturbed me. Then someone else asked for an alternative program, and there was no mention of any of the three programs (free and paid) that utilize the API. Something is up with that, but I won't go beyond it.
The final session I was able to attend was the Measuring Success Case Studies, which went over some of the popular Web analytics programs: ClickTracks, HitBox, WebTrends and Urchin. I use Urchin, but they all seem about the same to me. I need to upgrade all my clients to the new Urchin, and I will soon.
I was unable to attend the very last session, but that did not bother me; it seemed like the 1,200 attendees had all left by then, with just a few stragglers remaining.
Overall the conference went well. I have more of an appreciation for the industry, and it's nice to see people wanting to learn about it. It was nice to hear from the engines and from some of the famous SEO/SEMs out there. I recommend the conference to all newbies and to anyone with a deep devotion to SEM/SEO.