The problem with Web branding surveys

With the click universally acknowledged to be a poor way of measuring online ads, the brand-lift survey has become the method of choice. But is the survey actually any better at measuring brand lift?

Advertisers and their agencies (and sometimes publishers, too) pay big money for surveys and use them to inform campaign decisions. But there are still plenty of problems.

Ben Winkler, chief digital officer at OMD, said surveys are far better than click-through rates in evaluating ad campaigns. Even the one-question survey, while a blunt tool, can help establish what really matters in a campaign. But he also said surveys have their limitations. “Even after they weed out the professional survey takers,” he said, “it doesn’t mean the demographics of the people who are filling the survey out match the people you’re looking for.”

Even the survey companies acknowledge there’s still work to be done. One agency exec recalled running an online survey for a packaged-goods company after its teaser campaign. Because the ads never mentioned the brand name, the survey registered zero brand lift.

“The majority of the marketers out there are still relying on click-through rates or conversions,” said Marc Ryan, co-CEO of one of the big survey outfits, InsightExpress. “It’s unfair to say [marketers] are not doing the best thing for brands, because that’s the data that’s available. We need to scale. We need higher response rates, and in real time.”

Apparently, not much has changed since a scathing 2010 report by the IAB about survey effectiveness. The report identified several problems: Pop-up surveys that randomly intercept Web surfers have extremely low response rates. Other surveys that rely on panelists who have opted in to take surveys don’t reach the target population of the ad campaign. Most surveys aren’t designed to use an equivalent control group when testing the response to an ad. Cookie deletion can also distort a control group. And so on.

The IAB concluded that “one cannot be confident whether the findings of most IAE studies are right or wrong” and followed up with best practices for improving ad-effectiveness studies.

InsightExpress moved almost entirely away from pop-up surveys to panel-based ones around the time of the IAB report. The response rate for these surveys is much higher — about 9 percent, versus less than 1 percent for pop-up surveys, said Ryan. As for the “professional survey takers” that can skew survey results, Ryan contends the company can identify and weed them out.

No method is perfect, though. Another big survey company, Nielsen’s Vizu, took a different approach to the low-response problem of pop-up surveys. As it described in a response to the IAB report, Vizu intercepts online users with a single question, thereby avoiding the steep drop-off in respondents that multi-question surveys suffer. While InsightExpress has to reward its online panelists to get a suitable response rate, such intercept surveys are cheap and easy to run.

The trade-off is that a single question can’t deliver the breadth of a multi-question survey or the ability to compare responses across demographic groups. You can capture people’s immediate reaction to a campaign, but that doesn’t necessarily reflect real-life behavior.

There’s another dimension to the limits of surveys with the growing popularity of native ads. Publisher metrics (pageviews, uniques) are still the dominant way marketers are measuring native ads, according to a survey by Contently. But there’s a yearning to measure brand awareness, brand perception and purchase intent.

Surveys can be well suited to evaluating native advertising because they measure exactly what native is trying to achieve. But the scaling difficulty that challenges native advertisers extends to measurement, too. Because native ad formats aren’t standardized, surveys have to be custom-built. And because native ads typically reach small audiences, it can be hard to get enough respondents who actually saw the ad. Companies are also still figuring out what, exactly, they’re measuring.

Ultimately, with the rise of social networking, marketers may find other ways to answer questions about brand lift. “As consumers are becoming more vocal in response to ad campaigns via social engagement, we are finding more ways to use alternate means of gathering similar types of reactions and responses,” said Jessica Sanfilippo, group media director at 360i.
