Ad Testing: Research and Findings
Moderated by Andrew Goodman
Came in a couple of minutes late during Anton Konikoff of Acronym Media’s presentation. He is speaking about advanced topics in ad testing, starting with the fundamentals: where can you test? Google and MSN both allow multiple ad creatives to be tested, whereas Yahoo has not in the past; Yahoo! Panama, however, will allow for this. He said that on Yahoo you would have to run ads over a certain period of time and then change them. He discussed key metrics: Click-Through Rate (CTR) tells you how attractive the ad or offering is and which messaging is more effective at driving traffic. Conversion rate is a more desirable metric and should be tracked by keyword.
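The two metrics he contrasted can be sketched quickly; the per-keyword numbers below are hypothetical, purely to show the arithmetic:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks / impressions."""
    return clicks / impressions if impressions else 0.0

def conversion_rate(conversions, clicks):
    """Conversion rate: conversions / clicks, tracked per keyword."""
    return conversions / clicks if clicks else 0.0

# Hypothetical per-keyword stats: (impressions, clicks, conversions)
keywords = {
    "las vegas hotel": (10_000, 250, 10),
    "bellagio rooms":  (2_000, 120, 9),
}

for kw, (imp, clk, conv) in keywords.items():
    print(f"{kw}: CTR={ctr(clk, imp):.2%}, CR={conversion_rate(conv, clk):.2%}")
```

Note how the second keyword can win on conversion rate while losing on raw traffic, which is why he says to track conversions by keyword rather than campaign-wide.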
How many ads are enough to test? Rules of thumb: with a small budget, try 3-4 ads; with larger budgets, 5+. You need to allow enough time and spend. Title lines are very easy to test; you can use keyword insertion or a multitude of titles. When testing descriptions, keep the title consistent. Use a themed approach to create a variety of description lines, such as price points, “official site,” time-sensitive offers, promotional offers, and even language variations. He says that using the keyword in the title doesn’t always work, even though most people call that a “best practice.” He went over a short case study for Klutz Toys. They tested multiple themes, from “Award Winning” to “How-to” to “Unique,” and found that “award winning” performed best.
More granular ad testing strategies include punctuation use, capitalization, proper case versus sentence case, accents versus none, and keyword insertion versus none. Another short case study, for Sirius: they tested two offers, a $30 instant rebate and a “free trial.” The $30 off offer performed better, which surprised the client. Misconceptions and myths: start with the most straightforward test, not with difficult multivariate testing. Don’t rely on testing alone to come up with good ad copy. Try a simple A/B test in the early stages. He suggests trying radical hypotheses: subtle changes will yield subtle results. Keep up a testing regimen, then test, test, test! The best testing is continuous; don’t just draw a conclusion from one test and stop. You should set aside a specific budget for testing. Many clients don’t like this because they want results from every single dollar spent.
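A simple A/B test like the rebate-versus-free-trial comparison is typically judged by whether the difference in rates is statistically significant. The speaker didn’t specify a statistical method, and the impression and click counts below are made up, but a minimal two-proportion z-test sketch might look like:

```python
import math

def two_proportion_z(clicks_a, imp_a, clicks_b, imp_b):
    """Two-proportion z-score for comparing the CTRs of ads A and B."""
    p_a, p_b = clicks_a / imp_a, clicks_b / imp_b
    p = (clicks_a + clicks_b) / (imp_a + imp_b)            # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / imp_a + 1 / imp_b))  # standard error
    return (p_a - p_b) / se

# Hypothetical numbers for a "$30 instant rebate" vs "free trial" style test
z = two_proportion_z(180, 5000, 130, 5000)
print(f"z = {z:.2f}")  # |z| > 1.96 → significant at roughly 95% confidence
```

This is also why he stresses allowing enough time and spend: with small impression counts, the standard error swamps any real difference between the ads.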
Jonathan Mendez from OTTO Digital discussed how they set up multivariate testing (MVT) for search ads. There are four elements to an ad: title, description line 1, description line 2, and URL. They try to use three variations of each: one control and two new copies tested against it. They call this a 4 by 3 MVT and use the Taguchi methodology. In Google, they added tracking parameters to the click-through URL. They are not making any changes to the pages themselves; they want to know whether the ads are impacting conversions.
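The “4 by 3” Taguchi setup he describes maps naturally onto a standard L9 orthogonal array: nine ads cover four elements at three levels each, instead of all 3^4 = 81 combinations. A sketch, with hypothetical element copy and a hypothetical tracking-parameter name:

```python
# Standard Taguchi L9(3^4) orthogonal array, 0-indexed:
# rows = ads, columns = which variation (0 = control, 1-2 = challengers)
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

# Hypothetical copy for the four ad elements (variation 0 is the control)
elements = {
    "title": ["Title A (control)", "Title B", "Title C"],
    "desc1": ["Desc1 A (control)", "Desc1 B", "Desc1 C"],
    "desc2": ["Desc2 A (control)", "Desc2 B", "Desc2 C"],
    "url":   ["example.com/a", "example.com/b", "example.com/c"],
}

names = list(elements)
ads = []
for i, row in enumerate(L9):
    ad = {names[j]: elements[names[j]][v] for j, v in enumerate(row)}
    # Tag the click-through URL so conversions can be attributed per ad,
    # without touching the landing pages themselves (as described).
    ad["tracking"] = f"https://{ad['url']}?variant=ad{i}"
    ads.append(ad)
    print(ad)
```

The array’s balance is the point: every variation of every element appears three times, and every pair of variations across two elements appears exactly once, so each element’s effect can be estimated independently from just nine ads.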
First case study: a “major product aggregator” (some would refer to them as a vertical search engine) that wanted to increase user registration. He went over their MVT test array with the variations. They found that one ad greatly outperformed the rest, with a 36% lift over the control. Lots of numbers, and I could not get them all. They looked at the influence of the keyword in the title: conversion was not as good, but those ads had the highest CTR. Quality score does not make this easy to solve, since CTR is very important for that score. Case study #2, for the same client, looked at titles across impression levels. Case study #3 was with a major services brand. It was an interesting test because the landing page had more than one option, so they were able to segment the groups and see how behavior might differ based on the ads. They looked at the segments separately and then combined. The brand-plus-services combo title had the highest CTR but the lowest revenue per visitor. Summing up: titles matter more for CTR than for conversions, description lines do matter, and the URL matters as well.
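For reference, “lift over control” is just the relative improvement in a rate; the rates below are hypothetical numbers chosen to illustrate a 36% lift like the one reported:

```python
def lift(variant_rate, control_rate):
    """Percent lift of a variant's rate over the control's."""
    return (variant_rate - control_rate) / control_rate

# e.g. a hypothetical variant converting at 4.76% vs a 3.5% control
print(f"{lift(0.0476, 0.035):.0%}")  # ~36% lift
```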
Hugh Bernham from Rare Method Capital Corporation will go over a case study of some CPC ads they ran for a client (OnAir.ca). He is pretty funny, telling people to “get jiggy with it” about testing, “if he can be so fly to say jiggy.” (laughs) They audited an existing CPC campaign and found it was basically solid but being run entirely on broad match. The impressions were huge, but the quality of the visits was low, which caused their ad costs to shoot up. They found the gems in the keyword list and then built on them using plurals and variations. Don’t reinvent the wheel. There is a good chance that you know more about Paid Search than the ad agency that may be working with the client. Is the client giving enough credence to the value of Paid Search? He wonders about clients who feel fine spending $15K on a half-page print ad but wouldn’t dedicate that to a year of Paid Search.
They then built the strategy from the audit. They wanted to switch all broad match to exact phrases. They raised the daily budget, created a more diverse approach to ad copy, and created ad groups for general terms as well as industry-specific terms. They shut off the campaigns on weekends to save money and expanded the keyword list, as described above, with plurals and like terms. Results were outstanding: impressions were up 233%, visits to the site increased 358%, and the average ad position went from 4.3 to 2.1 (solely on improved CTR). All this was done in just 16 hours of billable time! As a result of the success of this campaign, his company is now doing SEO for the client and has had some early success with that as well.
“And now a man who needs no introduction” (Andrew says he has always wanted to say this): Gord Hotchkiss from Enquiro, who is also Chairman of SEMPO. He laughs about being from Canada and enjoying coming down south for the warm weather, and says there will be a lake swim in the morning. He introduces his thoughts by saying that people need to figure out five basic things: the right message, the right place, the right time, the right person, and the right experience. It is tough to figure out how to make these five work together, but if you do enough research you will gain important understandings.
They have done a fair amount of research into how people scan web pages. He discusses how they have found that people scan from the top down and look for relevance clues before making a click decision. So how does customer intent impact searching behavior? Let’s say you want to stay at the Bellagio. He shows an MSN Live SERP for “bellagio las vegas.” Test participants were given one of two types of intent: either they already know they want to book a room, or they are still researching whether they want to stay there. If the intent is to book a room, there is probably a fairly high chance of people clicking on the Paid listings. In the research case, however, people are more likely to skip down to the organic results.
They hypothesized that intent should impact scanning behavior. But you should never simply ask people how they search, because how they think they search often does not match how they actually do; you have to observe them. He discusses Enquiro’s famous eye-tracking studies and the golden triangle results. They found that people scan from the top down and then laterally, with more lateral scanning in the top listings and less further down. Some interesting results: searchers looking to actually book a room spent three times longer on the SERP than those looking for research information. They also spent longer in the sponsored listings area than the research-focused searchers did. In the purchase group, clicks split about 50/50 between Paid and organic. In the case of the researchers, however, 100% clicked on organic results.