Auditing Paid Listings and Click Fraud Issues
Moderated by Jeff Rohrs, President of Optiem, LLC. He introduces the panelists and explains that this will be a presentation with all panelists sharing one PowerPoint deck, which will allow for interesting discussion of examples (added: it sure did; this was honestly the best session I have ever attended at SES). We are joined by, in this corner, John Slade from Yahoo!, Shuman Ghosemajumder from Google, and Paul Valez from Ask. In the other corner: Jessie Stricchiola from Alchemist Media, Lori Weiman from KeywordMax, and Tom Cuthbert from Click Forensics.
First off, following up from the NYC sessions, Jeff recaps the news headlines regarding click fraud from the past few months: Lane's Gifts settled for $90M, Click Forensics reports click fraud at 14.1%, and Google makes invalid click data available.
The lawsuits: the Lane's Gifts (AR) class action against Google is settled and approved; the Yahoo case is ongoing. First covered by Lori ("also a recovering lawyer," laughs), who explains that the Lane's Gifts settlement can still go through appeal. Two other suits: AIT and Checkmate. Yahoo settled with Checkmate; AIT is ongoing. A third suit was also covered, but I didn't catch the name. Going into detail: to get anything from the Lane's Gifts settlement, you had to file by August 4th or opt out by June 19. Your share is prorated based on ad spending between 2003 and August 2004. The value is $60M in credits, not cash, and only 50% can be used in any given month.
The Yahoo settlement with Checkmate is a little different. There is no cap, and you can get cash back, but you have to prove there was fraud. A claim must be filed, and a Special Master (a court-appointed judge) will review appeals of Yahoo's findings. The period covered is January 1, 2004 to July 31, 2006. Tom Cuthbert adds that these suits do not do much going forward; there is still a need for accurate third-party verification of clicks. Jessie has had clients decide to participate in the class, mostly those with mid-range ad spend. The larger clients are "holding off" to possibly sue in the future. Lori agrees; it's the same with her clients.
A question is posed to the search engine reps: are there any changes relative to the settlements? John says that Yahoo! has long wanted to be able to have these dialogues, and the settlement has enabled them to start actually communicating in public. Yahoo is trying to do some of the things the industry says it wants. For example: how do you really measure click fraud accurately? The industry is coming together on standards for what actually counts as a click, and other such definitions. As part of the settlement, they invited in the plaintiffs' lawyers, who have said Yahoo is generally doing a good job.
Yahoo will start bringing in panels of customers to comment on methodology, and is also creating an internal position to serve as a click fraud advocate for advertisers. Once again, for the entire industry, the settlement is a positive step. Shuman says Google is pleased with the outcome; they found that the study supported that they are "doing their best." The report is available on the Google blog and contains "more details than we have ever been able to release previously" (about their click fraud methodology).
Paul from Ask: they have not been affected by the past lawsuits, but have been working with their legal team to understand the issue and how it relates to Ask. They were actually sued initially, but were able to show that they were not in the market during the relevant time frame. Still, they are happy to watch, learn, and join the dialogue with the other engines to help.
Discussing stats. Lori, on a small sample of clients: 7% click fraud on the low side (the majority from 8-10%), all the way up to 20% on the high side. Tom is asked how Click Forensics gathered their data and came up with the 14.1% overall average click fraud number. They have used data (up to 100,000 clicks for free) donated by advertisers, collecting technical data (IP address, etc.) and behavioral data. They have also had outside experts come in to verify their methodology.
Shuman is asked for his insights into the methodologies used by the third parties. Shuman states that Google has always dedicated significant resources to combating invalid clicks. They are "concerned that some of the numbers by 3rd party estimates are grossly exaggerated." They did a study of some of the studies and found some to be very weak, without any published methodology. He discusses the "Outsell report" and how it has been misnamed as a measurement of click fraud: it was not a measurement but an opinion survey, analogous to asking the residents of a city what "they think the crime rate is" and publishing those numbers as the actual crime rate. (Wow, the boxing gloves are off early.) They frequently receive requests regarding "fictitious clicks" (clicks that never even occurred), as well as clicks that were never charged for (thanks to automatic fraud prevention). They have a 17-page study on how click fraud auditing firms have "consistently inflated" their numbers.
Jeff asks Shuman whether this seems like a methodology error on the part of the 3rd parties or an actual attempt to defraud the engines. One big problem Google has found is that additional page views are counted as invalid clicks; the auditors cannot disambiguate between original and additional clicks. And because those five additional page views look like paid clicks, in such cases there are now "five fraudulent clicks per each misidentified one." The second area: actual conflation of traffic from different ad networks. They have found clicks made on Yahoo ads in reports delivered to Google. There are numerous instances that have resulted in implied click fraud rates for the advertisers that are flawed, in some cases even greater than 100%. We do not know why "basic sanity checks" have not occurred on the part of the 3rd party auditing systems.
The advertisers on the panel and Tom seem a little shell-shocked, and Lori asks why they were not given the data (the study) prior to the session. Shuman says they have made this information publicly available. Lori, however, says that it is very simple to not report repeated clicks, paid reloads, and clicks that happen on the same website: "We do not track or report on those, in our system at least. The other problem, of the wrong engine being attributed, is also something that doesn't happen. It is interesting that you bring all this up, and the methodologies that need to be created are important. We all agree that there are no standards today; we need to settle on what the definition is." Shuman replies that they have covered some of the major 3rd party tools in their research, AdWatcher being one of the most popular. Continued questioning of the methodology follows.
Tom starts with "I don't even know what disambiguate or conflation mean; I didn't go to MIT" (laughs). He is really looking forward to reading the report. He draws a David and Goliath picture: let's agree that we will never agree on methodology and facts. What we should agree on is that advertisers can send you data, and you can look at it. They launched a site this morning called "reasonableisnotenough.com." (One note from me: Tom is obviously very passionate about this subject, which I think is great for the industry. I feel he was honestly slightly enraged by Shuman's comments, judging by his demeanor. He "calmed down," though, and provided excellent further comments and good answers in the Q&A.)
John says that part of the problem is that the methodologies vary: it's comparing apples to Buicks. Different people are all trying to do the right thing; he doesn't think that Tom and the advertisers sit down in Texas and try to come up with fake numbers. He feels that the IAB effort is of utmost importance. He talks about how, after the movie "Quiz Show," Congress actually created a body (the Media Rating Council) that "audits the auditors." They want to do things the right way. The biggest news of the whole panel is the fact that the engines are working together; this should end the methodology wars.
Shuman adds that what they have covered in the report released today does not say that the methodology is flawed, but that the basic accounting is flawed. He says this is a completely different issue from criticizing methodology.
Jessie: it is important to recognize, given the infancy of this issue, that none of this has happened without the threat of lawsuits; the engines didn't wake up one day and decide to help the advertisers. "The gloves seem to be off at this point, so I have to address a couple of issues." In regard to the report: up until March of 2005, Google was counting the "double click" on every single ad click. She describes people who double-click instead of single-clicking. She says that until that March, this counting of double clicks was also a "basic accounting" problem, and reminds us to keep that in context for the rest of the discussion.
Shuman "thanks her for the positive comment" (laughs). He says we are talking about click fraud, not invalid clicks; he defines click fraud as only those clicks made with "bad intent." There are certain categories of clicks, such as double clicks, that are detected and filtered out. He wants to ensure that they do not explain too much, so that the actual "fraudsters" do not get more data. He discusses the recent addition of the Invalid Clicks report in the AdWords report center. Jeff asks if the system tells you why a click was labeled invalid, or just the numbers. Just numbers; they added the report because of the greatly exaggerated numbers being disseminated through the media. Almost all invalid clicks are detected and filtered out by the system in real time. Once again, he lambastes the 3rd party reports, saying he has never identified a vulnerability in the system from a 3rd party report.
Jessie: "Do you agree, Shuman, that the system needs to include both sides of the data?" "Yes." "For what percentage of accounts do you analyze this data?" "We do not publicize the data." "Assuming that you do not have the required full dataset with conversion data, what are you doing in order to incorporate that data?" "We provide free conversion tracking tools for advertisers to share the data." (Some laughs in the audience.)
Paul speaks up and says he sees why Ask was seated in the middle of the table (laughs: John and Shuman are on the right side from the audience's view, and Tom, Lori, and Jessie are on the left). He offers more insight into how Ask provides methods for its advertisers to give them as much data as they want. They appreciate the third parties and can "learn a little something from them." John agrees: they need to find a way to make the data interchange possible in order to make the best decisions. Tom agrees too. Shuman adds that these issues are addressable, and that they have recommended solutions to implement within the reports.
No coverage of the Q&A. Catch the continuation of this highly heated discussion in Chicago!