16.04.26
In the wake of Shy Girl being pulled from shelves and the rise of Suno in the music industry, AI is an ever-pressing concern for anyone working in the culture sector. In a new series of posts on AI, we will take stock of recent developments, asking where theatre sits in the current landscape and what we can expect to come next.
Sam Goodman (Partner) and Shay Kennedy (Associate) work across Theatre and Digital at Lee & Thompson, one of the UK's leading media and entertainment law firms, and write for the blog on the legal implications of using AI in the arts.
AI has disrupted sectors such as publishing, journalism and music relatively quickly. Theatre, by contrast, has been slower to feel the impact. Part of that is structural: as Lucinda Everett at The Guardian notes, theatre provides “real human connection”. Live performance and shared experience are inherently difficult to replicate technologically, putting a natural limit on the application of artificial automation, intelligent or otherwise.
The effects of AI are nonetheless beginning to be felt within the sector. Its current use is largely concentrated in lower-risk areas such as captioning and accessibility (including speech-to-text and translation), marketing and audience analytics, and early-stage design work, where it can be used to generate visual concepts quickly and cost-effectively.
There have also been more visible experiments. The Young Vic’s AI (2021) used GPT-3 to generate a script live during performances, while AI: When a Robot Writes a Play (Švanda Theatre / Czech Centre, 2021) explored similar ideas. These productions are often cited as examples of what the technology can do, but they also illustrate its limits: AI has so far functioned more as a subject of interest for theatre makers than as an existential threat to them.
The more immediate impact is legal. The UK does not yet have an AI-specific legislative framework, leaving the sector to rely on existing bodies of law such as copyright, performers’ rights and contract. None of those regimes was designed with AI training or digital replication in mind.
The government’s recent approach illustrates the difficulty. A 2024 consultation on Copyright and Artificial Intelligence proposed a broad “text and data mining” exception, which would have allowed AI developers to use copyrighted works unless rights holders opted out. The proposal attracted significant opposition and was not taken forward. A House of Lords report in March 2026 warned that such an approach risked “further harm” to the creative industries, and as of April 2026 the government has confirmed it has no preferred alternative.
The underlying issue therefore remains unresolved: training AI systems requires "copying", in some sense, which is precisely what copyright law restricts. The law has yet to provide a clear answer as to how that natural friction should be resolved.
Contracts are filling the gap. AI clauses are now increasingly common in theatre production agreements, although the drafting is not yet standardised. Broadly, three themes are emerging: restrictions on the use of scripts, recordings and other materials for training purposes; greater control by performers, particularly in relation to digital replication or AI-driven manipulation of voice and likeness; and a growing emphasis on transparency, including obligations to disclose AI use, label outputs and maintain human oversight. Industry bodies such as Equity and the Society of London Theatre have called for stronger protections around consent and remuneration.
Against that backdrop, theatre remains relatively AI-resistant compared to other creative sectors. AI can generate music, text and images, but can it deliver a compelling live performance of Hamlet without humans? That is the question (the answer is no). The risk is therefore less one of replacement and more one of control: who is entitled to use creative material in AI, or to generate creative material using AI, and on what terms? That is also the question (the answer is TBD).
Looking ahead, two developments seem likely. The first is a greater reliance on licensing models for AI training, particularly if the legal position on unlicensed use remains uncertain. Theatre may complicate that process given the number of rights holders involved in a single production. The second is increasing pressure for clearer rights over image and voice. UK law currently relies on a combination of performers’ rights, privacy and passing off, rather than a single, unified right. As AI replication improves, that position is likely to come under increasing strain.
For now, however, the key decisions sit in contracts rather than legislation: determining what can be used, how it can be used, and who bears the risk. The law will catch up. Until then, big tech will continue to move fast and break things; theatre makers will hope it's only a leg.
If you'd like to keep up to date with all our blog posts, important and interesting stories in the worlds of theatre, arts and media, plus job ads and opportunities from our industry friends, sign up to our daily media briefing at this link.