WTF is Retrieval Augmented Generation for AI chatbots and large language models?

This article is a WTF explainer, in which we break down media and marketing’s most confusing terms.
AI-powered chatbots are scary smart, thanks to the large language models undergirding them. But they’re only as smart as the data they’re trained on.
Two major factors can limit the AI chatbots being built by brands and publishers: the timeliness of the bots’ training data and the inclusion of proprietary information in that data set. In the first case, a company needs a way to pass information to the LLM powering its chatbot in real time. In the second, the company needs to safeguard that information so the LLM is provided only the context necessary to respond to someone’s prompt. In both cases, a system called “retrieval-augmented generation” provides a framework for bridging the gap, as explained in the video below.
“There is a difference between being trained on public and licensed data and knowing what sits in the cloud environment of a brand,” said Hugo Loriot, head of data and technology integration at The Brandtech Group. He added, “All of the data that is very specific to a brand, this is off-limits for a [large language] model — unless you have a way to feed that data into the model, which is exactly what RAG does.”
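To make the idea concrete, here is a minimal sketch of the retrieve-then-generate loop that RAG describes. It is not any particular company’s implementation: the document store, the word-overlap scoring and the placeholder `call_llm` function are all stand-ins for the vector database and model API a brand would actually use.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Hypothetical example: DOCUMENTS, the overlap-based retriever and
# call_llm() stand in for a real vector index and a real LLM API.

# Proprietary brand data the base model was never trained on.
DOCUMENTS = [
    "Spring campaign brief: hero product is the EcoLite water bottle.",
    "Q3 pricing sheet: EcoLite retails at $24.99 in North America.",
    "Brand voice guide: friendly tone, no exclamation points.",
]

def retrieve(question: str, docs: list, k: int = 2) -> list:
    """Rank documents by simple word overlap with the question.
    A production system would use embeddings and a vector index instead."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for whichever LLM API the company uses."""
    return f"[model response to a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    # 1. Retrieve: pull the proprietary context relevant to this prompt.
    context = retrieve(question, DOCUMENTS)
    # 2. Augment: fold that context into the prompt itself.
    prompt = (
        "Answer using only the context below.\n\n"
        "Context:\n" + "\n".join(f"- {c}" for c in context) +
        f"\n\nQuestion: {question}"
    )
    # 3. Generate: the model sees the brand data only at query time.
    return call_llm(prompt)

print(answer("How much does the EcoLite cost?"))
```

The point of the pattern is that the brand’s data never becomes part of the model’s weights. It is fetched and passed along as context only when a specific prompt needs it, which keeps the information both private and as current as the document store it is pulled from.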