In May 2024, OpenAI released a voice assistant named Sky. Within hours, the internet lit up: it sounded eerily like Scarlett Johansson’s character in the film Her.

The problem? Johansson had been asked twice to lend her voice, and she had explicitly said no both times.

This wasn’t just a tech mishap. It was a collision of AI ambition, consent, and corporate accountability.

The Timeline

  • Sept 2023: OpenAI CEO Sam Altman first approached Johansson to license her voice. She declined.

  • May 10, 2024: Altman contacted her again. She refused a second time.

  • May 13, 2024: OpenAI launched GPT-4o with the “Sky” voice.

  • May 15, 2024: Altman tweeted a single word: “her.”

  • May 20, 2024: Johansson issued a statement: “I was shocked, angered, and in disbelief.”

Within days, OpenAI suspended the voice. But the damage—to trust, reputation, and the ethical debate—was done.

Why This Matters for Filmmakers

The Johansson saga isn’t about one voice. It’s about:

  • Consent: What does “no” mean in the age of AI?

  • Corporate Culture: Is “ask forgiveness, not permission” an acceptable strategy?

  • Creative Integrity: When an actor’s voice can be cloned, where does artistry end and exploitation begin?

For filmmakers, this case is a warning: if consent frameworks aren’t built into AI, careers and reputations are at risk.

The Ethical Fault Lines

  1. Consent Failures
    Johansson’s explicit refusal highlights why written, revocable, and auditable consent is essential.

  2. Corporate Accountability
    OpenAI’s pattern—ship fast, apologize later—mirrors a tech ethos ill-suited for creative industries where rights and legacies are at stake.

  3. Public Trust
    The backlash wasn’t just legal—it was cultural. Audiences sided with Johansson, demanding safeguards for actors across the industry.

Mini Case Example: The Ripple Effect

After the Johansson controversy, SAG-AFTRA doubled down on consent clauses, requiring “clear and conspicuous written consent” for digital replicas.

Studios began reviewing contracts for likeness rights, and lawmakers cited the case in drafting federal proposals like the NO FAKES Act.

In other words: one refusal reshaped policy discussions across Hollywood and Washington.

Quick Wins Checklist for Creatives & Studios

  • Add explicit consent clauses covering voice, likeness, and digital replicas.

  • Use blockchain consent registries for immutable proof.

  • Establish corporate AI ethics boards to review high-risk uses.

  • Audit AI vendors for compliance with guild rules.

  • Communicate AI use openly in marketing and credits.

  • Train teams on consent, rights, and ethical obligations.
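"Immutable proof" of consent doesn't necessarily require a full blockchain. A minimal sketch of the same idea, in Python, is an append-only ledger where each record is hashed together with the previous record's hash, so any tampering with past entries breaks the chain. (All names and fields here are hypothetical, for illustration only; this is not production software or legal advice.)

```python
import hashlib
import json
from datetime import datetime, timezone

def record_consent(ledger, performer, scope, granted):
    """Append a tamper-evident consent entry to an in-memory ledger.

    Each entry is hashed together with the previous entry's hash,
    so altering any past record invalidates every hash after it.
    """
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "performer": performer,
        "scope": scope,        # e.g. "voice replica", "digital likeness"
        "granted": granted,    # False records an explicit refusal, too
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    ledger.append(entry)
    return entry

def verify_chain(ledger):
    """Recompute every hash; return False if any record was altered."""
    prev_hash = "0" * 64
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
# Note: a refusal is recorded just like a grant -- "no" is part of the audit trail.
record_consent(ledger, "Performer A", "voice replica", granted=False)
record_consent(ledger, "Performer A", "voice replica", granted=False)
print(verify_chain(ledger))  # True -- chain is intact
```

The point of the sketch: a refusal is an auditable record, not an absence of paperwork. If anyone later edits a past entry (say, flipping `granted` to `True`), every subsequent hash stops matching and `verify_chain` returns `False`.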

Risks & Pitfalls if Ignored

  • Legal Battles: Violating publicity rights invites litigation; Johansson retained legal counsel over the Sky voice.

  • Reputational Fallout: Public sentiment can turn overnight against “unethical” studios.

  • Union Backlash: WGA and SAG-AFTRA are prepared to strike again if protections fail.

Closing Thought

Johansson’s refusal wasn’t just personal—it was precedent-setting.

Question for you:
If a studio cloned your voice without consent, what would you do?

👉 Drop your thoughts in the comments.
👉 Follow me for more insights from the Ethical AI in Filmmaking Playbook.
