Ad copy testing is one of my favorite aspects of PPC. Where else can you get so much great data about what people respond to, so quickly?
The challenge in ad copy testing isn’t so much writing the ads; it’s deciding how to pick a winner. And there are many different opinions on how ad tests should be decided. If it were up to Google, they’d pick click-through rate as the deciding metric – as evidenced by their default “optimize” option:
(I hope all my readers know that “optimize for clicks” is NOT the ideal setting for most advertisers!)
Others may say total conversions should decide the winner in PPC ad testing; still others may vote for conversion rate or cost per conversion.
Arguments can be made for all of these options. An argument can even be made for CTR, if traffic rather than conversions is your goal. I’ve optimized accounts using most of these metrics, and I’ve often seen the ad with the worst CTR have the best conversion rate. Sometimes you actually want to discourage clicks from the wrong users, so a low CTR can be a good thing.
But my favorite way to analyze ad tests is using conversions per impression.
I first learned about conversions per impression from Brad Geddes in a blog post several years ago. It’s a revolutionary concept, and one that works especially well for lead generation. (If you’re doing ecommerce and measuring ROAS, you may want to stick with ROAS as your testing metric instead of conversions per impression, as Brad outlines in this post.)
The conversions-per-impression metric often serves as the tie-breaker when one ad has a higher CTR and the other has a higher conversion rate. Here’s an example:
This image is from AdAlysis, another great tool that Brad developed. AdAlysis highlights winning metrics that meet a certain confidence level. In this test, you can see that one ad is winning for CTR at 99% confidence, but the other metrics are less than 90%.
I added green highlighting for the metric that’s higher in each of the conversion columns. Although these aren’t winners yet, you can see that the first ad has a higher conversion rate. If you were measuring conversion rate alone, you might be tempted to pick the first ad as your winner.
But look at conversions per impression and cost per conversion. Both of these numbers are better for the second ad. If you had picked the first ad based on conversion rate alone, you might have picked a loser.
At a minimum, you’ll want to wait until the conversion metrics reach statistical significance, especially since impressions are pretty different between the two ads (the first ad has half the impressions of the second, even though they launched on the same date). But it looks like the second ad is in the lead here.
Here’s another example:
This is an interesting one. These ads have nearly identical impression numbers. The first ad is winning for CTR, but the second is winning for both conversion rate and cost per conversion. You may look at this and say “well, I bet the second ad is doing a better job of weeding out unqualified visitors. I’ll pick that as my winner.”
You’d be wrong. Look at conversions per impression here. The numbers are nearly identical, but the top ad’s number is slightly higher. This ad may end up doing better because of the higher CTR.
Remember, every impression is an opportunity for a click, and every click is an opportunity for a conversion. Conversions per impression takes both CTR and conversion rate into account, allowing you to maximize conversions per opportunity. That’s why I like to use this metric, especially for lead gen, where revenue doesn’t play into the equation.
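To see why this metric balances the two rates, note that conversions per impression is simply CTR multiplied by conversion rate. Here’s a quick sketch of the arithmetic (all numbers are hypothetical, not taken from the screenshots above):

```python
# Hypothetical ad stats -- illustrative numbers only
impressions = 10_000
clicks = 400
conversions = 20

ctr = clicks / impressions                  # 400 / 10,000 = 4% click-through rate
conv_rate = conversions / clicks            # 20 / 400 = 5% conversion rate
conv_per_impr = conversions / impressions   # 20 / 10,000 = 0.2%

# Conversions per impression factors into CTR x conversion rate,
# which is why it rewards ads that balance both metrics
assert abs(conv_per_impr - ctr * conv_rate) < 1e-12
```

This is why an ad with a higher CTR can beat an ad with a higher conversion rate: what matters is the product of the two, not either one alone.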
If you’re not using AdAlysis, you can calculate conversions per impression manually – just divide total conversions by total impressions, and run the numbers through a statistical significance calculator. (And if you’re not using AdAlysis for ad testing, what are you waiting for? It’s inexpensive, and worth every penny.)
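If you’d rather run the significance check yourself too, a two-proportion z-test on conversions per impression is a reasonable stand-in for an online significance calculator. A minimal sketch, with hypothetical traffic numbers (not from the tests above):

```python
import math

def z_test_conv_per_impression(conv_a, impr_a, conv_b, impr_b):
    """Two-proportion z-test comparing conversions per impression for two ads.

    Returns (z statistic, two-tailed p-value). A p-value below 0.05
    corresponds to roughly 95% confidence that the rates really differ.
    """
    rate_a = conv_a / impr_a
    rate_b = conv_b / impr_b
    # Pooled rate under the null hypothesis that both ads convert equally
    pooled = (conv_a + conv_b) / (impr_a + impr_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impr_a + 1 / impr_b))
    z = (rate_a - rate_b) / se
    # Two-tailed p-value via the complementary error function:
    # 2 * (1 - normal_cdf(|z|)) == erfc(|z| / sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical test: ad A has half the impressions but a slightly higher rate
z, p = z_test_conv_per_impression(conv_a=30, impr_a=10_000,
                                  conv_b=55, impr_b=20_000)
# With rates this close, p stays well above 0.05 -- keep the test running
```

In this made-up example the p-value is nowhere near significance, which is exactly the “keep waiting” situation from the first screenshot above.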
Whatever metric you use to decide your ad tests, make sure you follow best practices when picking the winning ad. Don’t guess. The stakes are too high to risk a wrong guess!
What metric or metrics do you like to use to evaluate PPC tests? Have you tried conversions per impression? Share in the comments!