Digitas’ data chief on attention as a metric: ‘All measurement is imperfect’
Consumers today undeniably have a staggering number of distractions at their fingertips, from social media feeds to a glut of streaming services.
With so much content available, it's harder for people to maintain focus and interest, and there is intense competition for consumers' attention, as well as for data that tries to measure that focus and interest. All of this also makes engagement harder to gauge: is it impressions, clicks or shares? One measure that seems to be gaining ground in recent years is attention metrics, and more specifically how to define them, track them and apply that data.
Jen Faraci, chief data officer for Digitas North America, spoke with Digiday about the quality of attention and how to effectively turn it into a return on investment. While tracking attention is not entirely new to the industry, technology and increased data sources have enabled firms to create new standards for using these metrics.
“It’s not enough to be viewable,” Faraci told Digiday. “We need to know that people are paying attention to it. So how do we define what attention actually means? How do we create standards around it?”
It wasn’t that long ago — perhaps in the last few decades — that people consumed a handful of channels that used a few different ad formats. And people had to see one another or call and get updates on each other, as opposed to passively scanning strangers’ feeds on social media.
Today, there are more than 20 media types and hundreds of ad formats, potentially exposing consumers to an ad roughly 50,000 times per day, according to Digitas and Publicis Media research. Back then, consumers had the opportunity to see an ad roughly 2,000 times per day. Meanwhile, as Faraci noted, the average attention span has dropped to just 2.5 seconds, from 12 seconds about three years ago.
“I do think [attention] actually already is starting to be integrated,” Faraci said. “I know some of the pioneers of measuring attention from an advertiser perspective are starting to use it as a lever in figuring out investment models as well as investment breakdowns.”
IPG’s Mediahub, for example, is another media agency using attention metrics to improve results for its clients, including the NBA’s campaign during playoff season. After injecting attention metrics into the automated bidding process, the league saw a 20% increase in attention score and a 20% lower cost per attention on the online video it purchased.
Faraci said she believes using attention as a performance indicator starts with defining it and creating more standardization, then measuring and testing across platforms, since attention differs across channels and partners. This testing and analysis will gradually allow agencies to achieve long-term impact rather than just short-term business outcomes, she added.
This interview has been edited for space and clarity.
When did we start looking at attention as a metric and why?
Two years ago, I would say, is when [we started using] the word attention as a way to define that [key performance indicator]. If you think about the history of media measurement and how it’s evolved, particularly as the internet became a thing, we’ve always been striving for the next best metric that is an indicator of behavior. Way back when, it was all about clicks, right? Somebody clicked on something; that means they were paying attention to it. You use that for a while, and then you find … that’s actually not a good indicator that someone’s going to behave a certain way.
I think of attention as a metric as the next iteration of what viewability was five to 10 years ago. Viewability was sort of the big thing around — well, if someone can’t see your ad, then it’s not going to have an impact … Now we’re in that early stage of doing the same thing with attention. So it’s not enough to be viewable. We need to know that people are paying attention to [ads].
Do you see attention as one part of the larger picture in measuring engagement?
Absolutely, I think attention is one — one thing in a toolbox of many things. I am a big proponent of the phrase that all measurement is imperfect. We shouldn’t rely on just one form of measurement to decide what we should and shouldn’t be doing. We need to triangulate to the best possible outcomes using multiple approaches.
Why is it important to start with defining what attention means and create standards?
If you don’t define it consistently, you end up with a whole bunch of metrics that you can’t actually compare and that don’t tell you what you asked them to. … It allows us to create consistent measurement, which gives us things like trends. It gives us the ability to import the results into our modeling to understand the long-term impact of attention.
So how do you use analysis and testing on attention to drive long-term results?
One is to get the definition right, so that you’re measuring the same way every single time. That consistency is what will allow you to be confident in the results of your testing [and] in the results of your modeling over the long run. The second thing would be to measure longitudinally, rather than with a one-off read that just tells you something worked or didn’t work. Longitudinal measurement will allow you to start to tease out the nuances of impact. For example, if we wanted to build attention metrics into a marketing mix model to understand their sales impact over time, we would need them to be measured the same way, and measured against the same taxonomy and metadata that we collect over that same period of time, in order to start to tease out what attention does in different periods and across different aspects of the plans.
What are you learning about attention measurement across different platforms?
One [thing] is that the technology available to measure attention is very different [across] platforms. In linear TV, where you have the Adelaides of the world, it’s all about biometrics, versus in more digital channels like social, where you have a combined set of metrics that then results in an attention-score type of approach. You’re also finding that some platforms are better at garnering attention: your social platforms, where it’s a single-screen experience six inches from your face and people are very focused on it, versus linear TV, where people can be multitasking or leave the room.
How can the industry keep up with tracking and reaching people without being invasive?
I think where that line is drawn is up to the end consumer. Especially with a lot of the biometrics … some of it comes with incentives, which you can argue biases the data or not; it depends. But the consumer draws that line. I choose to put an Alexa in my living room. I choose to agree to be on a panel with Adelaide for the sensory tracking in [my] living room. People make that choice. What’s on us is to be very clear about what that actually means and what we’re doing with that data.