5 ways publishers’ tech choices come back to haunt them
Written by Tom Herman, CEO, DashBid
Publishers can cause themselves irreparable harm when it comes to the technology they choose to run on their pages. Many of these tech “solutions” don’t add real value and instead clutter up the user experience, slow down pages, and drive people away.
The ad tech industry deserves a lot of the blame. We are the ones hawking these products, often measuring success only in short term revenue. So we – the ad tech community – need to create products that put the user experience first and help publishers find more sustainable KPIs. Otherwise, we stand to lose a lot more.
Some of the issues, like scripting overload, have been discussed time and time again. Others are more hidden. But none have been solved. Here’s our list of five ways publishers are losing control of their pages, followed by our solution.
Publishers’ pages are weighed down by scripts
I’m not the first to observe that far too many pieces of executable code run on publishers’ pages. That glut of scripts causes problems most publishers have yet to grapple with.
At the recent Clean Ads I/O conference, IAB CEO Randall Rothenberg cited a study that found “90 tags from a single advertising asset” on just one page. Unsurprisingly, these scripts periodically crashed the sites on which they appeared.
Ninety sounds like a big number, but at DashBid we commonly see more than 500 servers called from a single publisher page. Some of those scripts are doing crucial work — everything from enabling web analytics to calling ads from approved providers.
Most, however, do nothing that benefits the publisher. At best they merely leak data; at worst they degrade the consumer’s privacy and experience.
Many of the scripts are from parts unknown
Not only do those scripts not benefit the publisher, but they’re also often from unknown parties who are un-vetted and unapproved.
For example, a publisher may connect with an SSP to help find the best revenue opportunity for an available ad spot. The SSP sends the spot to auction, but before anyone bids, scripts make their way onto the page to ensure the “avail” meets the requirements of various potential bidders, such as viewability levels, a lack of non-human traffic or certain targeting parameters.
Each tool or script can call and load other scripts, and each of those calls reaches more exchanges, data aggregators and potential partners, who may also test and bid. The cycle continues, ballooning out of control.
Launching a single auction or installing just one application can kick off a daisy chain leading to dozens of unknown players placing script after script after script.
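As a back-of-the-envelope illustration of that ballooning, the fan-out compounds geometrically. The branching factor and depth below are hypothetical parameters chosen for illustration, not measured figures:

```typescript
// Illustrative model: each script on the page calls some number of
// additional partners, each of which may call more. The total number of
// script calls grows geometrically with the depth of the daisy chain.
// Both parameters are hypothetical, for illustration only.
function totalScripts(branching: number, depth: number): number {
  // 1 initial tag + branching + branching^2 + ... + branching^depth
  let total = 0;
  for (let level = 0; level <= depth; level++) {
    total += Math.pow(branching, level);
  }
  return total;
}

// A single ad tag that fans out to just 4 partners per hop, 4 hops deep,
// already accounts for hundreds of script calls:
console.log(totalScripts(4, 4)); // 1 + 4 + 16 + 64 + 256 = 341
```

Under these toy assumptions, even modest per-hop fan-out quickly reaches the hundreds of server calls we observe on real publisher pages.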
The scripts are stealing publisher data
Many of these scripts have a more nefarious purpose: They’re put on publishers’ pages by exchanges or data gatherers such as data management platforms (DMPs) with no intention of buying an ad. They simply wish to drop cookies, collect device IDs, track users, siphon off data and cross-reference it with other information they can package to lure advertising to their own platforms. That data has real, tangible value.
Data brokers will pay a fee (such as a $0.10 CPM) for data gathered on users that can be matched with or mapped to other data sets. Publishers see none of the money made from their own data.
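To put a rough number on that leakage, here is the arithmetic. The $0.10 CPM is the figure cited above; the monthly pageview count is a hypothetical example:

```typescript
// Sketch of the revenue a publisher forgoes when third parties resell
// data gathered on its pages. CPM is the cost per 1,000 impressions.
// The pageview volume below is a hypothetical example.
function forgoneDataRevenue(cpmUsd: number, pageviews: number): number {
  return (cpmUsd / 1000) * pageviews;
}

// A mid-sized site with 10M monthly pageviews leaves roughly $1,000/month
// on the table per data buyer at a $0.10 CPM:
console.log(forgoneDataRevenue(0.10, 10_000_000).toFixed(2)); // "1000.00"
```

And that is per buyer; with many data gatherers on the same page, the forgone revenue multiplies accordingly.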
The scripts may violate users’ privacy
Furthermore, publishers may unwittingly be violating their visitors’ privacy by allowing scripts to grab or use data without consent.
Users are dropping off
Page scripts that introduce latency delay the delivery of content, turning impatient users away.
The longer the load times, the steeper the “user decay” curve, a term we use to describe the pattern of page abandonment caused by latency.
User decay information should influence how a publisher handles ad partners on their pages, such as by limiting how many scripts are allowed and setting optimal floor prices for auctioned ads.
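The shape of that curve can be sketched with a simple exponential abandonment model. Both the functional form and the decay rate here are illustrative assumptions, not DashBid’s actual user-decay methodology:

```typescript
// Illustrative "user decay" model: the share of visitors still waiting
// after t seconds of load time, assuming exponential abandonment.
// The decay rate (0.3 per second here) is a hypothetical parameter.
function retainedShare(loadSeconds: number, decayRate: number = 0.3): number {
  return Math.exp(-decayRate * loadSeconds);
}

// Each extra second of latency compounds: under this model, going from
// a 2-second to an 8-second load costs most of the audience.
for (const t of [2, 4, 8]) {
  console.log(`${t}s load -> ${(retainedShare(t) * 100).toFixed(0)}% retained`);
}
```

A curve like this makes the trade-off concrete: every additional script that adds latency can be weighed against the audience it drives away.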
The solution? Publishers need to take back control.
Most publishers aren’t intentionally overloading a page with scripts to create a horrid user experience, goading their precious audiences into leaving or spurring them to install ad blockers. But once a user has taken either step, the publisher has lost control.
The publisher needs to act before this ever happens, deciding which scripts to block and how to optimize, improve user experience, maximize revenue and help their business partners.
Publishers need to separate revenue opportunities and useful applications from the unwanted elements that tag along for the ride. They need the ability to block unwanted scripts that cause latencies or come from unrecognized vendors who may be stealing data or worse. They need to do so on an individual basis, without throwing out the baby with the bathwater.
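One concrete mechanism for that individual, script-by-script control is a vendor allowlist: check each script’s source host against a list of approved partners before letting it load. The sketch below shows only the allowlist check itself, with hypothetical vendor domains; actual enforcement in the browser would additionally rely on something like a Content-Security-Policy `script-src` directive or tag-manager controls:

```typescript
// Minimal allowlist check: permit a script only if its host is an
// approved vendor or a subdomain of one. Vendor domains are hypothetical.
const approvedVendors = ["trusted-analytics.example", "approved-ssp.example"];

function isAllowedScript(
  src: string,
  allowlist: string[] = approvedVendors
): boolean {
  let host: string;
  try {
    host = new URL(src).hostname;
  } catch {
    return false; // malformed URL: block by default
  }
  return allowlist.some(
    (vendor) => host === vendor || host.endsWith("." + vendor)
  );
}

console.log(isAllowedScript("https://cdn.trusted-analytics.example/tag.js")); // true
console.log(isAllowedScript("https://unknown-tracker.example/sync.js"));      // false
```

Blocking by default and admitting vendors one at a time is what lets a publisher keep the valuable partners without throwing out the baby with the bathwater.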
When publishers can learn what’s slipping onto their pages and keep those pages working smoothly, they’ll have the control to give users the experience they deserve, give true advertising partners valuable access to those users, and earn their rightful revenue in return.