In a recent study, comScore and Pretarget analyzed 263 million display ad impressions over nine months across 18 advertisers in numerous verticals. After collecting the data, Pretarget ran a correlation analysis, including variables such as gross impressions, views (75% of the ad within the screen, either above the fold or after scrolling), time in view, hovers/engagements and total hover/engagement time, clicks and conversions. The results are astonishing. For example, they show that…
… ad hover/interaction (correlation = 0.49) and viewable impressions (correlation = 0.35) had highest correlation with conversion, while gross impressions (correlation = 0.17) was significantly lower. Perhaps most interestingly, clicks (correlation = 0.01) had the lowest correlation with conversion, far under-performing all other metrics analyzed in the study. These findings suggest that advertisers and media planners ought to break their addiction to clicks and instead look to more meaningful metrics for evaluating campaign performance.
(Emphasis mine)
Interesting, isn’t it? Hovering over the ad for a longer time span, e.g. to read mouse-over info, and the viewable ad impression both outperform clicks in predicting sales. Now, if you look for a formal study with further information to download, you’ll do so in vain. There is none, as Pretarget CEO Keith Pieper noted. He also added that the results might be biased towards view-based conversions.
Measuring opening length in email marketing
I’d be especially interested in where time in view ranked, which was also part of the analysis, and whether the results could be applied to email marketing. As you might know, email senders can track not only opens (= impressions), but also how long an email stays open, individually or on average. Services like Litmus, CampaignCog or LiveClicker let you easily augment your standard reports with additional opening duration measures. In addition, several email service providers have lately started adding such measures to their reports, too. So if there really is a connection between opening length and conversions, this could become an interesting new metric in the near future.
Btw: wouldn’t Pearson get a heart attack?
Let’s get back to the comScore/Pretarget study once again. Can we really make something out of this obscure “Pearson correlation analysis”?
In the above example, I quickly modeled four scatterplots with the corresponding Pearson correlations from the study. In principle, these plots could therefore represent Pretarget’s results. Hovers & interactions (green, top left) spread much more tightly around the gray regression line than clicks, for instance (red, bottom right). So one could assume a stronger relationship between hovers and conversions than between clicks and conversions. (Note that if the dependency were perfect, indicated by a correlation coefficient of 1, all observations would lie on a straight line.) Is this what Pretarget did?
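Since the study’s raw data isn’t public, such scatterplots can only be simulated. Here’s a minimal sketch of how I’d do it: the four correlation values are the ones quoted from the study, but every data point is invented, drawn so that the population correlation matches the quoted figure.

```python
import math
import random

def pearson(xs, ys):
    """Pearson's correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def simulate(rho, n=500, seed=1):
    """Draw n (x, y) pairs whose population correlation is rho."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0, 1)
        # standard trick: mix x with independent noise to hit the target rho
        y = rho * x + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
        xs.append(x)
        ys.append(y)
    return xs, ys

# correlations quoted in the study; the data points themselves are made up
for label, rho in [("hovers/interactions", 0.49), ("viewable impressions", 0.35),
                   ("gross impressions", 0.17), ("clicks", 0.01)]:
    xs, ys = simulate(rho)
    print(f"{label}: sample r = {pearson(xs, ys):.2f}")
```

The sample correlations come out close to, but not exactly at, the target values, since each plot is just one random draw of 500 points.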
To show you how misleading such an analysis can be, I plotted hovers & interactions versus conversions again (right figure). Only this time, I added one outlier. This single outlier drops Pearson’s correlation coefficient from 0.49 to 0.17. Maybe that’s part of the reason why clicks failed? Or are there other misconceptions? No one knows. It’s always the same with such reports: without further information about the underlying data distributions, they are practically useless and look more like a PR exercise. Everyone should keep that in mind when citing them.
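The outlier effect is easy to reproduce. This toy reconstruction uses invented data: a cloud of 200 points with a correlation around 0.49, plus one fabricated extreme point. The exact size of the drop depends on where the outlier sits, but a single point is enough to cut the coefficient to a fraction of its value.

```python
import math
import random

def pearson(xs, ys):
    """Pearson's correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# invented "hovers vs. conversions" cloud with population correlation 0.49
rng = random.Random(42)
xs, ys = [], []
for _ in range(200):
    x = rng.gauss(0, 1)
    xs.append(x)
    ys.append(0.49 * x + math.sqrt(1 - 0.49 ** 2) * rng.gauss(0, 1))

r_before = pearson(xs, ys)

# add one extreme point far out on the x-axis that carries no conversion signal;
# it inflates the x standard deviation and dilutes the coefficient
r_after = pearson(xs + [40.0], ys + [0.0])

print(f"r without outlier: {r_before:.2f}")
print(f"r with one outlier: {r_after:.2f}")
```

This is why a correlation coefficient without a look at the scatterplot (or at least at the distributions behind it) tells you very little.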