RTB’s fatal flaw is that it’s too slow

Jonathan Mendez is CEO of Yieldbot, an intent-based ad platform.

I have some bad news for real-time bidding. The Web is getting faster, and RTB is about to be left behind.

Today, 120 milliseconds is becoming too long for the computations that must happen before page load, the very computations many of today’s systems have been built around. Google is making the Web faster because computational speed gives it a technology advantage; it is likely the No. 1 advantage Google has. So if we want to look to the future of technology, we should investigate why speed is a competitive advantage.

Not everything really smart can be done in real time. In fact, the most intelligent things require too much computing power to be calculated in real time. Think about the years of training that went into the models used in IBM’s Watson. Advanced decision systems like Watson, the type that will fuel the next generation of ad serving, will require model building with pre-calculation done at the ad-slot level. Incidentally, this is very similar to how Google figures out what ad to show in search.
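To make the pre-calculation idea concrete, here is a minimal sketch (all names hypothetical) of the split between an expensive offline scoring pass and a cheap online lookup at the ad-slot level:

```python
# Hypothetical sketch: precompute heavy model decisions offline,
# serve them with a constant-time lookup online.

def score_ad_for_slot(ad_id: str, slot_id: str) -> float:
    # Stand-in for expensive model inference (feature joins, training
    # artifacts, etc.) that would never fit in a real-time bid window.
    return (hash((ad_id, slot_id)) % 1000) / 1000.0

ADS = ["ad_a", "ad_b", "ad_c"]
SLOTS = ["homepage_top", "article_sidebar"]

# Offline: pick the best ad per slot ahead of time.
precomputed = {
    slot: max(ADS, key=lambda ad: score_ad_for_slot(ad, slot))
    for slot in SLOTS
}

# Online: serving is just a dictionary lookup, well inside any latency budget.
def serve(slot_id: str) -> str:
    return precomputed[slot_id]
```

The design point is that no model evaluation happens on the request path; the slow work is done before the impression ever arrives.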

The quality of the decisions these systems make relies on: 1) more static data sets with a higher signal-to-noise ratio to help train the models and 2) the ability to learn based on outcomes. That means the most intelligent systems must use first-party data and backend data that is looped back in real time to aid in decision intelligence.

None of these features are part of RTB or programmatic display today. To date, RTB simply moves too slowly. It relies on a latent data model based on previous actions or behaviors. Most RTB systems are built to determine whether to bid on the same impression that’s available on five separate exchanges; they are being built to compete against their own decisions. This is crazy.
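The self-competition problem can be sketched in a few lines (all names and prices hypothetical): one underlying impression is syndicated to several exchanges, and a naive bidder that answers each request independently ends up bidding against itself, whereas keying on the underlying impression collapses the duplicates:

```python
# Hypothetical sketch: the same impression offered via five exchanges.
impression = {"user_id": "u123", "page": "example.com/article", "slot": "top"}
exchanges = ["exchange_1", "exchange_2", "exchange_3", "exchange_4", "exchange_5"]

bid_requests = [dict(impression, exchange=ex) for ex in exchanges]

# Naive bidder: one bid per request, so five live bids on one impression.
naive_bids = [{"exchange": req["exchange"], "price": 1.50} for req in bid_requests]

# Deduplicate by the underlying impression instead of the request.
seen = set()
deduped_bids = []
for req in bid_requests:
    key = (req["user_id"], req["page"], req["slot"])
    if key not in seen:
        seen.add(key)
        deduped_bids.append({"exchange": req["exchange"], "price": 1.50})
```

With deduplication the bidder places one bid for the one real impression instead of competing against its own four other bids.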

Real-time systems of the future are going to more closely resemble site-side optimization technology (worth noting: that is what Criteo was built on) that does things like landing-page optimization, content recommendation and cross-sell/upsell than what makes up the majority of the Lumascape.

First-party data is the key here because the analytics and the deep learning needed for high performance require that the source and timing of the data collection be put in context. Did this person come from Google, Facebook, Twitter or Pinterest? What is their local time? What pageview are they on? What happened to the last person who clicked this ad from this page? Without that, the data cannot be applied in the right context or at the right time.
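The questions above amount to a context record captured at collection time. A minimal sketch, with hypothetical field names, of what such a first-party record might hold:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical first-party context record: each field is one of the
# questions a site-side system can answer at the moment of the visit,
# rather than inferring it later from a stale third-party profile.
@dataclass
class VisitContext:
    referrer: str            # Google, Facebook, Twitter, Pinterest, ...
    local_time: datetime     # the visitor's local time
    pageview_number: int     # how deep into the session they are
    last_click_outcome: str  # what happened to the last person who clicked this ad here

ctx = VisitContext(
    referrer="google",
    local_time=datetime(2013, 5, 1, 12, 30),
    pageview_number=3,
    last_click_outcome="converted",
)
```

Because these fields come from the publisher's own serving path, they carry the source and timing context the article argues third-party behavioral data lacks.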

These ad-serving decisions are very much user-controlled, like the medium itself; thus they are more relevant and ultimately more valuable to all parties: the consumer, the publisher and the marketer.

My favorite representative example of this is one you may not expect: Waze. Your timing and context are pulled in real time, and Waze processes them to respond with helpful, useful information depending on your location, your speed and other rules. It even shows ads when you are not moving, which is proof there can be innovation here. But in the long run, Google may just keep these systems cornered, since they have been and will always remain the core of Web monetization.

