Landing Page Testing & Tuning

Aug 9, 2005 • 6:14 pm | Filed Under Search Engine Strategies 2005 San Jose

Moderator for this session is Allan Dick. There are three speakers: Tim Ash, James Roche and Scott Miller. There is new technology being used in this session by Offermatica: the presenter asks questions, we in the audience reply via a radio keypad, and the responses are shown on the projector in real time. Pretty cool.

First up is Tim Ash from SiteTuners.com.

Why should I tune? CPA = CPC / CR, and CPC is rising roughly 2% per month. The only thing you can control is your CR, your conversion rate. Say your marketing program's ROI is 25%: how much do your profits improve if you increase conversion rate by 10%? Your profits would increase by 50%.

What should I tune? The price of your product or service, or landing pages with trackable actions (lead form, buy, etc.). Price tuning basics: at the low end of your price range you have low margins, and at the high end you have no buyers. Where is the profit sweet spot (somewhere in the middle)? They test a few prices and then build a model to predict the sweet spot for your prices.

Common tuning elements: headline, page layout, navigation, color scheme, offers, form layout, button text, sales copy, special offers, call to action. There are no universal truths, so they test each and every component. In one test they varied 13 elements with 42 different values, yielding over 2 million possible Web site variations.

How do I tune? A/B split testing: test one variable at a time, send equal traffic to all versions; very simple to implement. Multivariate analysis: test several variables at once; tries to predict the best setting for each variable. SiteTuners tuning engine: proprietary math for internet marketing, designed for large tests, handles complex interactions. Outsourcing considerations: size of test, services offered (tools, consulting, and hands-off), business model (rent tools, ppf, etc.).

What mistakes should I avoid? (1) Ignoring your baseline: always devote some bandwidth to your current version, and measure results relative to the baseline, not in absolute terms. (2) Not collecting enough data: do not make decisions based on too little data; understand basic error bars and confidence intervals. (3) Not considering delayed conversions. (4) Assuming that testing has no costs. (5) Ignoring complex interactions.
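Tim's profit arithmetic is worth checking. A quick script using the session's illustrative numbers (the 25% ROI and the 10% conversion-rate lift); the $100 spend is my own assumption for the example:

```python
# Checking Tim Ash's back-of-envelope math. The 25% ROI and 10% CR lift
# are the session's illustrative numbers; the $100 spend is an assumption.

cost = 100.0                  # hypothetical ad spend
roi = 0.25                    # "marketing program ROI is 25%" (profit/cost)

profit = cost * roi           # 25.0
revenue = cost + profit       # 125.0

# Holding clicks and CPC fixed, revenue scales with conversion rate,
# so a 10% CR lift means 10% more revenue against the same spend.
new_revenue = revenue * 1.10  # 137.5
new_profit = new_revenue - cost

lift = (new_profit - profit) / profit
print(f"profit lift: {lift:.0%}")   # → 50%
```

The point of the exercise: because all of the extra revenue falls straight to profit, a modest conversion lift is multiplied several times over in the bottom line.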

Scott Miller from Vertster is now up to give case studies. Test: link text; objective: increase clickthroughs. They tested three different link texts, and the best option, believe it or not, was "Learn More". The next example was wilsonweb.com; the objective was to improve stickiness, and they tested inclusion of a SitePal avatar. Did the SitePal avatar help? The audience voted: 46% said it helped, and the actual result was that it did help. This is kind of fun. Next, a DVD lead-generation system running PPC campaigns; the objective was a DVD request, and they ran a copy test: one personalized version versus one institutional version. Most in the audience picked the personalized version, but the institutional version actually won (last year the same study went the other way). Takeaways: even small changes can help, conversion lift can come from unexpected places, and retest periodically - results do change. More examples and insight at vertster.com/blog/
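Scott's link-text result raises the question of when a winner is real rather than noise - the "not enough data" mistake Tim warned about. A minimal two-proportion z-test is the standard way to check; the visitor and conversion counts below are hypothetical, not Vertster's data:

```python
import math

# Minimal two-proportion z-test: is variant B's conversion rate really
# better than A's, or is the gap within sampling noise?
# Counts below are hypothetical, not from the session.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# e.g. "Click Here" (120/5000) vs. "Learn More" (150/5000):
z = two_proportion_z(120, 5000, 150, 5000)
print(round(z, 2))  # compare |z| to 1.96 for ~95% confidence
```

With these counts the z-score lands below 1.96, so a 25% apparent lift on 5,000 visitors per arm is still not conclusive at 95% confidence - which is exactly why small tests keep "flipping" results from year to year.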

James Roche from Offermatica is the last one up. They do dynamic marketing and landing page optimization: they took complicated statistical analysis and made it easy for marketers. He explained that they set this up in two weeks for Allan Dick's site Vintage Tub.

Test 1: category landing page optimization. Objective: increase conversion and gross sales. Approach: strengthen category imagery, clarify subcategory navigation, reinforce "free shipping." They were not able to touch the text, so they actually deployed a form of cloaking on this site: they segmented out three sections of the page to overlay on the page, then tested the two versions against each other. The new version was more visual and more promotional. Which won? The new version. Which element contributed the most: the imagery, the link text, or free shipping? Free shipping was the major attribute; the picture did help; the links did not do much. They saw a 60% lift with this test. Takeaways: reinforce your promotion, review keyword-density strategies versus conversion impact, and plan for periodic champion/challenger testing.

Test 2: page setup. Objective: increase purchase intent. Approach: strengthen benefit images. In this case they did not run a multivariate test; they ran an A/B test. They developed four stories, four alternatives to the content they had at the top of the page; the conversion metric was whether visitors clicked in deeper. The baby version just beat out the simple tub version, and the "hide from your kids" version lost out to the "indulge yourself" version. Overall the winner was the default version. Takeaways: reinforce your promotion, consider splitting traffic by segment when testing, and test dramatic changes iteratively.

Testing roadmap: retest A/C/D for revenue per visitor (RPV), retest A/C/D by source and by new/returning visitors, strengthen the "free shipping" message on alternative versions, and test integration of link density and branding options.
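The champion/challenger setup James describes can be sketched in a few lines: hold back a slice of traffic for the current page (the champion) and split the rest evenly across a full-factorial grid of challenger recipes. The element names and the 10% holdback below are my illustrative assumptions, not Offermatica's actual configuration:

```python
import hashlib
import itertools

# Sketch of champion/challenger traffic splitting with a full-factorial
# challenger grid, in the spirit of the Vintage Tub test. Element names
# and the 10% holdback are illustrative assumptions.

ELEMENTS = {
    "imagery":      ["current", "stronger category photo"],
    "shipping_msg": ["none", "free-shipping banner"],
    "link_text":    ["dense", "trimmed"],
}

# Every combination of element values is one challenger recipe (2^3 = 8).
CHALLENGERS = [dict(zip(ELEMENTS, combo))
               for combo in itertools.product(*ELEMENTS.values())]

def assign(visitor_id, holdback=0.10):
    """Deterministically bucket a visitor: champion holdback first,
    then an even split across the challenger recipes."""
    h = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16)
    if (h % 1000) / 1000 < holdback:
        return {"variant": "champion"}  # current page, kept as the baseline
    return {"variant": "challenger", **CHALLENGERS[h % len(CHALLENGERS)]}

print(assign("visitor-42"))
```

Hashing the visitor ID (rather than picking randomly per pageview) keeps each visitor in the same bucket across sessions, and the always-on champion slice is exactly Tim's point about never abandoning your baseline.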
Key features/differentiators: an all-web marketer interface (upload content, plan and run tests - advanced stats without a PhD), campaign (not just page) testing and optimization (multi-page, multi-session, multiple conversion points), customer segment targeting (target by source, category, email, geo), and analytics with real-time reporting (instant response, integration with Coremetrics).
