OpenAI has rolled out stronger protections for voice and likeness use in Sora 2 after Breaking Bad star Bryan Cranston found his voice and image could be replicated without his consent. The discovery triggered immediate talks with SAG-AFTRA and leading Hollywood talent agencies.
Hollywood Steps In
According to trader Andrew Curran, OpenAI moved quickly to tighten its copyright and likeness safeguards after Cranston raised concerns that early Sora 2 generations mimicked his voice and appearance. The situation prompted fast-moving negotiations between OpenAI, SAG-AFTRA, Cranston himself, and major industry players, including United Talent Agency (UTA), Creative Artists Agency (CAA), and the Association of Talent Agents (ATA).

In a joint statement, these organizations said their collaboration resulted in stronger guardrails around how Sora 2 handles individual replication. They called the discussions "productive" and highlighted OpenAI's commitment to requiring opt-in consent for any personality-based content generation.
What OpenAI Changed
The problem surfaced during Sora 2's invite-only testing phase, when some users generated voice and image outputs resembling Cranston without authorization or compensation. While OpenAI maintained that its policy had always required explicit consent, the company acknowledged the unauthorized generations and rolled out technical updates to prevent future incidents. The new safeguards are designed to block likeness generation unless explicit permission has been granted, establishing clearer ethical boundaries for synthetic media.
What This Means for AI and Entertainment
The incident underscores the mounting tension between AI development and creative rights as generative models grow capable of producing realistic digital doubles with convincing voices. SAG-AFTRA's involvement, fresh off negotiating AI clauses in major studio deals, shows how seriously Hollywood is taking the issue. Industry observers see the partnership as a precedent for consent-based AI systems in entertainment, signaling that AI developers will need to work directly with talent representatives and unions to prevent unauthorized replication.