AI can resurrect voices from the past — but without clear laws and ethics, we risk burying fair use.
Eleven Labs' licensing of deceased celebrity voices raises critical questions about intellectual property and fair use in AI. The intersection of voice-cloning technology with existing copyright frameworks is one of the more consequential legal and ethical frontiers we face — and it's arriving faster than our institutions are prepared to handle.
The IP Gap
Current intellectual property law doesn't adequately address machine-generated content. Frameworks like the Berne Convention presuppose human authorship; they were drafted in a world where machines couldn't create at anything approaching human levels, and they simply don't contemplate a machine as an author. Voice cloning changes the calculus entirely. Complicating matters further, a voice itself isn't copyrightable subject matter at all: in the United States it falls under state right-of-publicity law, which varies widely and extends postmortem protection only inconsistently. So when an AI can reproduce a celebrity's voice with near-perfect fidelity from a few minutes of sample audio, the question of ownership is not just urgent but genuinely unresolved.
Two Models Worth Considering
Two models are emerging for how this might be handled. The first is an "AI moat" inspired by the DMCA's notice-and-takedown regime: a filtering layer that detects and blocks unauthorized reproductions before AI-generated content ships. The second is blanket licensing modeled on how radio stations license music, the ASCAP/BMI approach: an entity obtains a single license covering a broad catalog of works rather than negotiating individually with each rights holder.
Both have precedent. Both have weaknesses. Notice-and-takedown has proven difficult to enforce even against human infringers, and extending it to AI-generated content adds layers of technical and legal complexity. The blanket licensing model works well for music but hasn't been stress-tested against the specific challenges of voice identity and likeness rights, where what's being licensed is a person, not a work.
Eleven Labs as an Industry Standard
Eleven Labs' approach — securing ethical agreements with celebrity estates before deploying their voices — demonstrates responsible industry leadership. It's a template others in the space should be studying. The company isn't waiting for regulators to force the issue; it's establishing norms proactively. That matters, because the industry's self-governance track record in AI has been, charitably, mixed.
What Comes Next
Meaningful solutions will require tech companies, legal experts, and content creators to jointly establish standards that balance innovation against intellectual property protection and ethical obligations. None of these parties can solve this alone. The law will lag the technology; that's almost guaranteed. Which means the responsibility falls, for now, on the companies building these tools to set standards worth keeping.
The stakes go beyond celebrity estates. Every person has a voice. Every voice is, in some sense, a form of identity. The norms being established right now around celebrity voice licensing will shape how we think about the digital rights of ordinary people for decades to come.