What Lionsgate’s Partnership Deal With Runway Means

Photo Illustration: Variety VIP+; "Saw X," "The Hunger Games: The Ballad of Songbirds & Snakes," "Expend4bles" courtesy of Lionsgate

In this article

  • Lionsgate and other Hollywood studios are beginning to fine-tune video generation models with their archival content
  • Studios intend to use fine-tuning initially for previsualization and complex visual effects, such as simulations
  • Fine-tuning a video generation model still isn’t immune from the copyrightability and infringement concerns that surround AI models

In the first such deal to become public, Lionsgate has struck a partnership with AI firm Runway to develop an exclusive video model trained on the studio’s library of film and TV content, as first reported by the Hollywood Reporter.

This training refers to fine-tuning video generation models using the studio’s owned film and TV content as training data. In July, soon after Runway launched the alpha version of its latest video model, Gen-3, VIP+ discussed early-stage interest among media companies and the downstream implications of fine-tuning video models on studio content.

VIP+ Analysis: Fine-Tuning AI Video Models Getting Early Interest From Film & TV Studios

There has been a flurry of licensing deals over the past year between publishers and AI companies for training data, which VIP+ has indexed. Importantly, however, the Lionsgate-Runway deal is not a data licensing deal. Nor does it mean Lionsgate data is being used to train any of Runway’s own general video models, such as Gen-3, which power the tools available to Runway’s other users.

VIP+ Analysis: Why Studios Still Haven’t Licensed Movies and TV Shows to Train AI

While exact terms couldn’t be shared, the Lionsgate-Runway deal is structured as a partnership. Lionsgate is giving Runway access to some of its film and TV archives to create a jointly managed custom model that only Lionsgate and its filmmakers, directors and other creatives can use. Lionsgate production teams will also be able to work with Runway to help them use the technology to the greatest effect.

“We are partnering with Lionsgate to create custom models and tools they can then use in whatever their pre- or post-production lifecycle of a movie or a show,” Runway CEO and co-founder Cristóbal Valenzuela told VIP+.

Eventually, Runway and Lionsgate may also develop templates for smaller, hyperspecialized custom video models that could be licensed out as fit-for-purpose tools for outside creative teams to train and use with their own IP on their own productions. Though still only hypothetical, these tools might be capable of creating specific pre-production and post-production materials, including certain special effects.

“We have different algorithms for different types of things,” said Valenzuela. “The way we train models is based on use cases by first understanding what directors and creatives might want to use it for. For example, if you need explosions and simulations of fires or liquids, those require different sets of parameters, datasets, tuning and ways of training.”

Fine-tuning exclusive video models on owned content IP is likely proceeding across the industry, as each studio races to gain production efficiencies from gen AI. Six major Hollywood studios are pursuing video model fine-tuning, a source told VIP+. At least two studios are now negotiating with OpenAI to license Sora and train it on the corpus of their intellectual property for internal use, as VIP+ has previously written, citing a different source.

Hints of such deals have also appeared in news accounts in recent months, including a report that Disney was discussing a deal with Microsoft to create a private generative AI tool trained on the studio’s library of content and other data, per the Wall Street Journal in May.

Yet the exact capabilities and production uses of these fine-tuned models are still largely unknown. Discovering use cases will be an ongoing process of testing and learning for filmmakers within studios. As variously described, immediate use cases would be in previsualization, such as creating storyboards or animatics, and in post-production, such as generating backgrounds or environments into which actors are later composited, or complex special effects like explosions that are expensive and time-intensive to create physically or with traditional VFX.

Replicating an actor’s likeness with a model isn’t on the table, said a source with knowledge of the matter. While SAG-AFTRA’s 2023 agreement is silent on AI training, informed consent would still be required if a studio intended to use a model to generate visuals of a specific actor on a new production. California’s newly signed assembly bill AB 2602 adds more teeth to protect actors from unfair practices, making a talent contract provision unenforceable when a digital replica is used in place of work the actor could have performed in person, the use isn’t clearly described in the contract, and the performer wasn’t represented by a lawyer or labor union when entering the deal.

It might be more likely for a fine-tuned model to be tasked with creating non-actor visuals, such as virtual backgrounds or expensive CGI shots that would normally fall to VFX, as VIP+ previously argued in our July piece on the implications of fine-tuning.

The true north of the model is to give filmmakers an “efficiency tool” that allows them to, quite simply, get movies made under budget constraints.

Importantly, studios likely see fine-tuning an AI model exclusively for internal use as a preferable alternative to licensing their IP for general model training, as VIP+ has also written.

In contrast to licensing, studios regard fine-tuning as more data-secure and as a competitive advantage: it allows them to retain their IP and any production-efficiency gains without improving an AI company’s general video model capabilities, and model use stays contained and controlled by the studio. Any benefits thus remain in-house rather than flowing into a developer’s widely distributed commercial product.

While fine-tuning is not entirely free of potential legal concerns, it’s considerably less contentious for a studio than a data licensing deal for AI training could become, since talent or other third parties might interpret such a deal as a breach of old contracts, sources told VIP+ in August.

Still, a studio’s use of a fine-tuned model isn’t immune from copyright considerations. The copyrightability of AI output and the infringement status of using models trained on copyrighted works remain open questions. Valenzuela couldn’t say which specific model Lionsgate or other potential partners were using as the underlying model to then customize with their own data.
