New York Magazine is getting real with its advertisers, telling them the website can’t guarantee the viewability of their ads unless those ads consume less data.
“If the advertiser provides creative that does not load in a certain time, then there cannot be viewability conversation,” said Ron Stokes, executive director of client advertising solutions.
The website will not offer viewability guarantees, which advertisers are increasingly demanding, unless they adhere to its data limits.
The magazine works with advertisers before campaigns run to make sure video and rich-media ads meet the required specifications, which for video is a maximum of two megabytes. For perspective, that’s about the equivalent of two minutes of streaming music.
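The two-minutes-of-music comparison can be sanity-checked with a back-of-the-envelope calculation. This sketch assumes a 128 kbps audio stream, a common bitrate for compressed streaming music; the bitrate is an assumption, not a figure from the article.

```python
# Rough check: how long can a 128 kbps music stream run on 2 MB of data?
# The 128 kbps bitrate is an assumed typical value, not from the article.
AD_SIZE_BYTES = 2_000_000        # 2 megabytes (decimal)
STREAM_BITRATE_BPS = 128_000     # assumed 128 kbps audio stream

seconds = AD_SIZE_BYTES * 8 / STREAM_BITRATE_BPS
print(round(seconds))  # 125 seconds, i.e. just over two minutes
```

At that bitrate, 2 MB works out to about 125 seconds of audio, which matches the article’s rough equivalence.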
Stokes said the magazine is just being frank with advertisers that if their big ads can’t render on the site, then they can’t expect people to see them or demand viewability guarantees.
New York Magazine is not alone in pushing back, either. One top publishing source, who didn’t want to be named, said many publishers are looking to third-party monitors to help get a handle on ad sizes.
“If ads are not within the specs required, then guarantees are off. They audit us, we audit them,” the source said.
It’s a role advertisers might not be so familiar with, because they are usually the ones monitoring publishers to ensure the validity of traffic and other metrics.
Advertising data hogs are a growing concern throughout the publishing world. The file size of ads is considered one of the factors pushing more people to install ad blockers, as the ads slow the page and come loaded with trackers that weigh them down even more and raise privacy concerns.
A study released today found that, on average, ads doubled page-load times and accounted for half of data usage. The research was conducted by SecretMedia, which works with publishers on how to address ad blocking.
The study looked at 25 top websites, including The New York Times, The Huffington Post and The Washington Post, and found on average ads took up 9 percent of web pages graphically but used 55 percent of the bandwidth, accounting for 54 percent of the load time.
In the most extreme cases, ads caused websites to load five times slower than they would have with no ads, and in some cases ads used more than six times the bandwidth of ad-free pages.
“You can’t have 10 tracking pixels attached to an ad, and an ad that’s twice the file size of a standard ad, and also expect 100 percent viewability. That’s an oxymoron,” said Jason Kint, CEO of Digital Content Next, an industry trade group.
The question is, what can publishers do about it? SecretMedia is working with Clarity Ad on technology that could detect troublesome ads, but only after they run, according to CEO Frédéric Montagnon.
“Unfortunately for the publishers, it’s impossible for them to have the visibility on every ad delivery in real time, which means that they just can see abusive ad campaigns after the campaign is distributed. This is an aspect we are working on,” Montagnon said.