Greg Boser is the moderator, and he introduced the panel. First up was Jim Stobb from PositionTech. He explained what paid inclusion gets you: (1) new sites are usually indexed within 72 hours and (2) existing pages are recrawled every 48 hours. There are two programs: (1) Direct Submit (pay per page, with a flat 12-month fee plus a CPC) and (2) Trusted Feed (pay per click, designed for sites with 200 or more URLs). Trusted Feed's key features are: (1) it's trusted because the pages come from a feed and are reviewed by the engines, (2) you get ultimate control over your site's content, with updates every 48 hours and the ability to geo-target, and (3) click-through reporting. Trusted feeds show up in the natural results, but they are not for everyone. Large sites and commercial sites are good candidates for trusted feeds, as are database-driven sites, CMS sites, and Flash/multimedia sites. He showed an example of staples.com's search terms and clicks for its trusted feed program (hope he got approval). When producing the feed, the destination URL, product name, manufacturer, product description, part #, and tracking URL are all important to capture. He showed a screen capture of a Staples product page and highlighted the data on the page that is being requested in the trusted feed. I wonder if people understand that feeds are normally in CSV or XML format, so a bot doesn't have to try to figure out what the page is talking about. I think the point is getting across; most people do not look puzzled. He then discussed the value of choosing PositionTech to manage your trusted feeds, went into how they use the data dump and the crawling process, and showed a sample final feed format.
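To make the feed idea concrete, here's a minimal sketch of what assembling a CSV trusted-feed record from the fields mentioned above might look like. The field names, order, and sample data are my own assumptions for illustration, not PositionTech's actual feed spec:

```python
import csv
import io

# Fields mentioned in the talk; names and order are assumptions, not the
# official spec.
FIELDS = ["destination_url", "product_name", "manufacturer",
          "description", "part_number", "tracking_url"]

def build_feed(products):
    """Serialize a list of product dicts into a CSV feed string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for product in products:
        # Missing fields are written as empty strings.
        writer.writerow({field: product.get(field, "") for field in FIELDS})
    return buf.getvalue()

sample = [{
    "destination_url": "http://www.example.com/products/12345",
    "product_name": "Laser Printer Paper, 500 Sheets",
    "manufacturer": "ExampleBrand",
    "description": "Bright white 20 lb paper for laser printers.",
    "part_number": "EB-12345",
    "tracking_url": "http://www.example.com/track?src=feed&id=12345",
}]
print(build_feed(sample))
```

The point of the structured feed is exactly this: the engine reads labeled columns instead of guessing at page content.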
Next up was Tim Mayer from Yahoo; Greg introduced him as the first search rep to post at a forum under his real name. Tim's task is to support the much-debated Overture PFI programs. He reasons that PFI gives the webmaster a support line when it comes to ranking issues. The next slide was named "Why do we need a feed program?" Less than 1% of the index is PFI content; this is a premium service. He said the more Yahoo crawls, the less unique content they find (maybe they are looking in the wrong place - just kidding). This program, he admits, is somewhat controversial, but it has led to a dialog between SEOs and the engines. PFI offers higher redundancy - it ensures your content is always in the index even if your site goes down. He then discussed Site Match and Site Match Xchange (which is for larger sites). Pros include content inclusion across all networks, frequent refresh, quality review and interaction with the engines, and detailed reporting (ROI stuff). Tim clarified that PFI doesn't mean your rankings will improve (or at least that is how I interpreted it).
Joe Laratro from MoreVisibility.com was next up; he said he was from Florida (good to know). He defined PFI and XML feeds - PFI is a pull of data and an XML trusted feed is a push of data (a good explanation to start off with). He then went over the guidelines: subject to editorial review, subject to strict algorithms, typical requirements (title under 70 characters, description under 180, 3 to 5 keywords, body text), category CPC card rates, and the ability to use three feed types (RLD, category, and product). He went over the benefits of using trusted feeds, all covered earlier. Pitfalls to avoid: duplicate content, keyword stuffing, artificial geo-targeting, repetitive titles, keyword duplication in titles (search engine optimization, search engine placement and search engine marketing - you repeated "search engine" three times), and product-not-found pages. Candidates for PFI: sites with 20+ pages, dynamic sites, sites that aren't crawlable, sites not in the index, and anyone who wants a guarantee of being indexed.
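The character limits and the repetitive-title pitfall above lend themselves to a simple pre-submission sanity check. Here's a rough sketch; the limits come from the talk, but the function and the repeated-word heuristic are my own illustration, not an official validator:

```python
def check_record(title, description, keywords):
    """Return a list of guideline violations for one feed record.

    Limits per the session: title under 70 characters, description
    under 180, and 3 to 5 keywords.
    """
    problems = []
    if len(title) >= 70:
        problems.append("title must be under 70 characters")
    if len(description) >= 180:
        problems.append("description must be under 180 characters")
    if not 3 <= len(keywords) <= 5:
        problems.append("use 3 to 5 keywords")
    # Repetitive titles were called out as a pitfall: flag any word that
    # appears more than twice (an assumed threshold, not a stated rule).
    words = title.lower().split()
    repeated = sorted({w for w in words if words.count(w) > 2})
    if repeated:
        problems.append("word repeated too often in title: " + ", ".join(repeated))
    return problems

# A title like "search engine optimization, search engine placement,
# search engine marketing" would trip both the length and repetition checks.
```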
Dave Roth is from Razorfish, a pretty big and very well-known Web design company. Dave discussed how they manage and optimize these feeds. They call themselves the largest SEM firm in the industry now. They target all types of companies - no niches here. They deal with a lot of e-commerce customers, and those sites are lacking optimization. One of the biggest problems clients have is that there is not enough data to make an informed decision, which results in negative ROI. You need to know price, margins, and profitability targets, and to track them at all levels. They apply forecasting models to clients' data to see what ROI they should expect, at a very detailed level. He then went through the pyramid effect of keyword targeting and the conversions based on it (generic keywords have higher volume but fewer conversions; specific keywords have lower volume but higher conversions). He showed a sample Excel document with the formulas used to analyze a feed and build a forecasting model. He said, don't try this at home - basically, call Razorfish (or another firm) to do this.
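The pyramid effect is easy to see with a toy version of the forecasting idea: estimate per-keyword profit and ROI from clicks, conversion rate, order value, margin, and CPC. All the numbers and the formula details below are illustrative assumptions on my part, not Razorfish's actual model:

```python
def forecast_roi(clicks, conversion_rate, avg_order_value, margin, cpc):
    """Return (profit, roi) for one keyword under simple assumptions."""
    cost = clicks * cpc                               # what you pay per period
    revenue = clicks * conversion_rate * avg_order_value
    profit = revenue * margin - cost                  # gross margin minus spend
    roi = profit / cost if cost else 0.0
    return profit, roi

# Generic head term: lots of clicks, weak conversion.
generic = forecast_roi(clicks=10000, conversion_rate=0.005,
                       avg_order_value=80.0, margin=0.30, cpc=0.25)
# Specific product term: few clicks, strong conversion.
specific = forecast_roi(clicks=300, conversion_rate=0.06,
                        avg_order_value=80.0, margin=0.30, cpc=0.25)
```

With these made-up inputs the generic term loses money while the specific term is solidly profitable, which is exactly the volume-versus-conversion trade-off the pyramid describes.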
Some Q & A:
Q: Can you put price in the feed? A: In the description, yes; not in the title. They also don't recommend including the part # in the title (that seems weird to me); they explained that it often brings back weird results.
Q: The banning issue... Greg Boser asked this. A: Tim said Yahoo will check to make sure your content is not banned. They are very focused on comprehensiveness. They feel it's important to rank relevant sites first in the SERPs. PositionTech said that if you submit, it will enable Yahoo to re-review the site. Most of the time these penalties can be corrected and the site put back in Yahoo.
Tim said most of the WMW stickies he gets about being banned are actually not cases of the site being banned, but other issues such as SEO design problems.
Q: Greg then brought up click fraud. He said he knows Google disallows certain proxies for AdSense ads but doesn't apply that disallow list to AdWords. In addition, he knows, based on 'click frauding' his own stuff, that the companies do not find and refund all fraud. A: Razorfish says you probably have to rely on third-party tools to track it and then bring that data to Overture, Google, etc. Tim said they do, and they are working on preventing click fraud. As always, it's a constant battle between spammers and the content providers. It's an area they need to focus on.