Why the Tribeca Film Festival embraced AI movies with OpenAI and Runway
Movies have always offered a kaleidoscopic lens on life: blockbusters provide escape, documentaries bring us closer to unfamiliar people and places, and dramas elicit emotions. But what happens when humans aren’t behind the camera?
The 2024 Tribeca Film Festival introduced new dialogue about generative AI, from AI-generated films to feature-length documentaries about AI’s risks and rewards. The annual New York festival, now about more than just film, gave filmmakers, moviegoers and marketers new ways to see and hear about AI’s growing role in entertainment. And as Hollywood debates AI’s impact, others question whether the tech deserves such a bright spotlight.
On Father’s Day weekend, Tribeca and OpenAI screened a new series of short films created with Sora, the AI model that lets people generate hyper-realistic video with just text-based prompts. The series, “Sora Shorts,” featured five commissioned films made in a mere three weeks. And while filmmakers experimented with an AI platform still inaccessible to the general public, the collaboration also gave OpenAI a way to reach an audience that might or might not be open to AI.
Tribeca Enterprises CEO Jane Rosenthal, who co-founded the festival with Robert De Niro in 2002, said it’s important to work with companies like OpenAI and Runway rather than avoiding or fighting AI. That way the industry can try new tools, foster conversations and collaborate on new standards. And with so many new generative AI platforms being released, Rosenthal said it was natural to find ways of incorporating them into this year’s festival.
“There’s been so much fear and questioning about it,” Rosenthal told Digiday. “The only way to eliminate that is to ask the questions. Then you also understand what you need to have, whether it’s governmental restrictions, industry restrictions [or] legal restrictions. You have to be able to understand what it is to be able to work together to make it the best that it can be for humanity.”
As a producer on dozens of films like “Rent,” “The Irishman” and “Meet The Parents,” Rosenthal has seen decades of evolution in how Hollywood adopts new ways of bringing stories to screens. More recently, she’s tested AI image platforms like Midjourney, tried other AI tools to upgrade old film resolution and aspect ratios, and used Sora to learn about the prompting process. “It’s hard to get AI-generated videos that iterate on a very specific image without changing it completely, which can create challenges for filmmakers,” she explained.
Screening AI-generated shorts also sparked some controversy, with both the festival and the filmmakers it commissioned drawing criticism for partnering with OpenAI. Posts on X and Reddit complained about Tribeca’s screening, raising concerns about IP issues and accusing the festival of taking advantage of artists — though others, online and in attendance, praised the premiere.
“Ethical AI and being able to train on ethical models and on clean models is key,” Rosenthal said. “And that’s where having conversations where you’re partnering and working with OpenAI, you’re talking to DeepMind, you’re talking to Runway AI, that’s where you’re going to solve [issues].”
At the premiere, OpenAI COO Brad Lightcap described Sora Shorts as a way to learn from filmmakers. He also noted realistic AI video models didn’t seem possible until recently: “To have not only built these tools, but to be working collaboratively with artists to explore how these tools can bring benefit to creators and to fans is really surreal for us.”
After the screening, filmmakers said Sora was different from other types of movies, comparing it to poetry or describing a dream. Beyond being a creative tool, they think it could help make future films more efficiently, freeing them from traditional workflows. Filmmaker Nikyatu Jusu wants to use Sora to make images of set pieces in an upcoming pitch for a traditional movie. Ellie Foumbi hopes it will let department heads collaborate in new ways. Iranian director-actor Reza Sixo Safai said Sora could be a valuable tool in countries with censorship: “What’s really cool is to be able to generate actors that can’t be arrested. I think AI could be used to do that.”
Michaela Ternasky-Holland, a filmmaker who specializes in storytelling with immersive tech like virtual and augmented reality, said Sora helped her find and refine creative ideas by balancing two modes: writing specific prompts to elicit certain visuals, and knowing when to cede control and collaborate with the model. In an interview with Digiday at the screening, she said her rapid-prototyping approach required finding what landed emotionally for her as a director, but also visually for Sora.
Collaborating with Sora was at times like “working with an infant with unlimited computing power,” Ternasky-Holland said, and at other times like a Las Vegas slot machine where you “sit there all night and hit nothing.” She also compared it to a genie that grants wishes — but the key is knowing what to wish for. The end result was a process of dozens of macro- and micro-level prompts to build a world and maintain consistency based on the visuals Sora made.
“You can tell an infant to write a story about a telephone, and the infant won’t think of maybe a regular telephone,” Ternasky-Holland said. “Sora might think of seven other different versions of a telephone. So what it also forces you to do is get really clear with how you’re writing and recognize when things you’re writing might be convoluted, or when things you’re writing have a social construct.”
OpenAI’s shorts weren’t the only place AI showed up at the festival, which also included the premiere of two documentaries about AI. One, starring DeepMind co-founder and CEO Demis Hassabis, included a post-screening interview with Hassabis and filmmaker Darren Aronofsky. Another, “How I Faked My Life With AI,” examined the darker sides of AI and questioned how AI-generated content can damage trust in personal relationships and society. Tribeca 2024 also included a partnership with the newly opened Mercer Labs, a technology and art museum in lower Manhattan, where some exhibits use AI to create immersive visual and audio experiences.
Tribeca also screened a series of AI short films in partnership with Runway ML, an AI video startup that hosted its own AI film festivals in 2023 and 2024. Partnering with Tribeca was a “natural fit,” according to Runway head of creative Jamie Umpherson, who noted submissions grew from 300 last year to 3,000 this year, ranging from narrative films and music videos to stop-motion. Runway, which titled its series “Human Powered,” also just released its new Gen-3 Alpha video model this week.
“It was a really great example of how different artists are embracing the tools in different ways, and getting really unique outcomes that we hadn’t really seen before,” Umpherson said. “We build the tools and we get them into the hands of creatives. But it’s often once we see it in the hands of creatives where we really begin to understand the full potential that they bring to the creative process.”
Tribeca has a long history of exploring new tech. Since 2013, the annual Tribeca Immersive program has curated fiction and non-fiction stories told with virtual reality, mixed reality and spatial audio. Last year, sponsor OKX promoted its crypto exchange by giving attendees a way to make AI-generated NFTs.
This year’s TribecaX conference during the festival had several talks about AI’s role in storytelling and innovation, with featured speakers including top marketing execs from Publicis Media, IPG, Accenture, Adidas, General Motors and Molson Coors. During one onstage talk, BBDO Worldwide CEO Andrew Robertson said there’d been three “game-changing technological shifts” in the past 20 years: the introduction of the internet, the arrival of smartphones and now the adoption of generative AI.
“You just don’t know what’s being worked on, where you cannot really attempt to get out in front,” Robertson said. “What you do have to do is keep an eye on everything. And when something moves, when things move to millions, when they start to become significant, that’s when you need to be able to leap on them fast and learn fast.”
The filmmakers commissioned by OpenAI are well aware of — and sensitive to — the ongoing ethical issues of using generative AI. Ternasky-Holland likened current discussions to how documentary filmmakers once overlooked certain ethical issues when filming around the world, adding that “things need to get a little uncomfortable for people to find a new comfort with a new paradigm.”
“Our paradigm is constantly shifting around ethics,” Ternasky-Holland said. “That’s the same thing happening in emerging tech. But ethics isn’t something that is a one-and-done thing. Technology is very similar to a nuclear weapon. People can make a really good, amazing destructive thing, and they can make a really good, amazing non-destructive thing.”