The blind spot: Why testing creative often fails

We’re all familiar with the original promise of digital: to deliver the right message to the right audience in the right place at the right moment. A tsunami of data and analytics has brought us closer to that dream – but there’s still a lot of work to do. To date, our focus has been on applying data-driven thinking to the media side of the equation: the sites, strategies, and placements. We now need to look beyond the medium and focus on the message itself – the creative strategy, execution, concept and version.

The medium and the message should work together, like the two cutting edges of a saw blade, to achieve the advertiser’s goal. A saw that only cuts in one direction isn’t very efficient. Solid testing will show you whether the relationship between the two is working.

Time to evolve our metrics
To correctly measure both the medium and the message, you need to rethink your approach to analytics. For most advertisers, this is the blind spot: they still comb through outdated media metrics to assess their creative.

Let’s start with the dependence on the click-through rate as an indication of performance. Think about it: Do you really expect an intelligent, time-starved customer to stop whatever they’re doing and respond to a banner ad? If not, why is that the key performance metric for creative? The purpose of display media is to create brand awareness and consideration. It is not a direct response vehicle. A display ad that gets zero clicks can still be very effective. You just have to measure it correctly.

Then there’s the reliance on the last touch (post-view or post-click) as a KPI. Giving 100 percent credit to the last impression served makes very little sense in light of the marketer’s real objectives. It assumes that only the last ad matters, penalizing every interaction that preceded it.

It’s a bit like awarding the only medal to the fourth swimmer on a winning relay team. If you have just one medal to give, cut it up and allocate the pieces based on each swimmer’s lap time: the fastest leg gets the biggest piece. This is the basis for algorithmic, fractional attribution, and it’s just as logical an approach to measuring creative as it is to measuring your media.
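To make the analogy concrete, here is a minimal sketch of that allocation in Python. The lap times are hypothetical, and in practice the contribution scores would come from an algorithmic attribution model rather than a stopwatch:

```python
def fractional_credit(lap_times):
    """Split one conversion's credit across touchpoints.

    lap_times: seconds per relay leg (or, by analogy, a contribution
    score per ad impression). A faster leg earns a larger share, so
    weights are proportional to speed, the inverse of time.
    """
    speeds = [1.0 / t for t in lap_times]
    total = sum(speeds)
    return [s / total for s in speeds]

# Four hypothetical legs on the winning relay:
shares = fractional_credit([52.1, 49.8, 50.5, 48.2])
for leg, share in enumerate(shares, start=1):
    print(f"Leg {leg}: {share:.1%} of the medal")
```

Last-touch attribution is the degenerate case where leg four gets 100 percent of the medal, no matter how the race was actually won.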

Checking the math
An agency client of ours was concerned. Their last-touch attribution showed that static ads were outperforming the more expensive dynamic ads. To discover why, Encore by Flashtalking completed an analysis of static versus dynamic creative for one of their campaigns.

We found that static ads were more prevalent low in the funnel, whereas dynamic ads were served higher in the funnel. Additionally, almost 40 percent of last-touch conversions from static ads were preceded by dynamic ads, which the machine-learning attribution models scored much higher.

Under fractional attribution, the dynamic ads actually had a 25 percent lower CPA than the static ads. That is to say, dynamic generated 33 percent more conversions for the same budget.
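Those two figures are the same fact stated two ways: if CPA falls by 25 percent, the same budget buys 1/0.75 ≈ 1.33 times as many conversions. A quick check in Python (the budget and baseline CPA are illustrative, not the client’s actual numbers):

```python
budget = 100_000                    # illustrative spend
static_cpa = 40.0                   # illustrative baseline CPA
dynamic_cpa = static_cpa * 0.75     # 25% lower, per fractional attribution

static_conversions = budget / static_cpa    # 2,500
dynamic_conversions = budget / dynamic_cpa  # ~3,333

lift = dynamic_conversions / static_conversions - 1
print(f"Conversion lift at equal budget: {lift:.0%}")  # 33%
```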

When testing your creative, (sample) size matters
Misperceptions like these often arise when relying on results from a small control group. The rules don’t change when testing creative: You need enough impressions within the control and test groups to achieve representative results. Allocating 5 percent of impressions to the control group (and 95 percent to test) is not likely to yield a meaningful comparison. The control group will not have enough reach and frequency, setting you up to chase outliers.
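How big is big enough? The standard two-proportion sample-size approximation gives a feel for the scale involved. This sketch hardcodes the usual z-values for a two-sided 5 percent significance level and 80 percent power; the 0.2 percent conversion rate and 10 percent relative lift are hypothetical placeholders:

```python
from math import ceil, sqrt

def min_impressions_per_group(p_control, relative_lift,
                              z_alpha=1.96, z_power=0.84):
    """Approximate impressions needed in EACH group to detect a
    relative lift in conversion rate (alpha=0.05 two-sided, power=0.80)."""
    p_test = p_control * (1 + relative_lift)
    p_bar = (p_control + p_test) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p_control * (1 - p_control)
                                  + p_test * (1 - p_test))) ** 2
    return ceil(numerator / (p_control - p_test) ** 2)

# Detecting a 10% relative lift on a 0.2% conversion rate:
print(f"~{min_impressions_per_group(0.002, 0.10):,} impressions per group")
```

At those rates, the answer runs to hundreds of thousands of impressions per group, which is exactly why a thin 5 percent control slice so often ends up chasing noise.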

But even with large, representative groups, the test is often rigged from the start, because creative performance is heavily influenced by the quality of the media placement. Bad placement trumps good creative: A great ad served to the wrong audience (or buried at the bottom of a page) is unlikely to perform well, no matter how inspired the creative is. Conversely, mediocre creative in a great placement can perform very, very well.

As advertisers start to personalize creative to match the context of the placement, results become harder to compare, and it becomes harder still to determine whether it was the creative or the media that drove the outcome.

ABT: Always be testing
To measure creative effectively, you need to construct good tests: test and control groups with a consistent and representative mix of sites, strategies and placements. You need to have an “always testing” approach and be willing to invest a small part of your budget to learn.
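One simple way to keep that mix consistent is a deterministic, hash-based split: each user always lands in the same group, and at scale every site, strategy, and placement sees the same test/control ratio. A minimal sketch (the user-ID field and the split ratio are illustrative):

```python
import hashlib

def assign_group(user_id: str, test_share: float = 0.9) -> str:
    """Deterministic split: the same user always gets the same group,
    and each placement sees roughly the same test/control mix."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 10_000
    return "test" if bucket < test_share * 10_000 else "control"

print(assign_group("user-123"))  # stable across calls and placements
```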

When all is said and done, successful advertising is achieved by marrying media and creative for each audience. By properly measuring and optimizing both sides of the equation, advertisers can eliminate that blind spot in their advertising program and do a better job of delivering on the promise of digital.

Learn more about creative analytics.
