Scarlett Johansson vs. OpenAI: Who Owns Your Voice in the Age of AI?

What happens when a celebrity’s voice is used in AI without permission?

In the recent clash between Scarlett Johansson and OpenAI, that very question leaped from the realm of science fiction right into our headlines. And I had to talk about it. Not just because it involves a Hollywood A-lister, but because it hits at something much deeper:

Where do we draw the line when it comes to human likeness in AI?

And here’s the real question I keep coming back to: Who owns a voice?

Let’s break this down.

The Voice Behind ChatGPT: A Hollywood Twist

OpenAI recently introduced a range of voice options for ChatGPT. One of them, named “Sky,” raised eyebrows instantly. People noticed how eerily similar it sounded to Scarlett Johansson. That wasn’t just a coincidence.

According to Johansson, Sam Altman (OpenAI’s co-founder) had asked her, twice, to lend her voice to the project. She said no. And yet, when “Sky” launched, the resemblance was unmistakable. She noticed. And so did everyone else.

Altman didn’t deny the connection. In fact, he referenced Her, the 2013 film where Johansson voices a hyper-realistic AI assistant. That film was clearly a creative inspiration for ChatGPT’s voice experience.

But here’s the real dilemma: Was it homage or imitation?

Inspiration vs. Imitation: Where’s the Line?

This isn’t just about Scarlett Johansson. It’s about the future of voice, identity, and AI.

We’re in a gray area. Drawing “inspiration” from someone’s voice may seem innocent, but when the result is nearly indistinguishable, where do you draw the ethical and legal boundaries? Especially when the voice in question is instantly recognizable to millions?

And this isn’t the first time we’ve seen something like this.

The Bette Midler Case: A Legal Wake-Up Call

We’ve seen this play out before. In the 1980s, Bette Midler sued Ford for using a soundalike in a commercial after she declined to participate. She won. That case established a powerful precedent: your voice is part of your identity, and you can protect it legally.

Scarlett Johansson may be preparing to walk a similar legal path.

Altman’s Her Inspiration and AI’s Human Touch

I get it. Her was a cinematic masterpiece. I’ve referenced it too when talking about AI and emotional engagement. And I understand why Sam Altman is fascinated by the idea of an AI that feels human, one that connects on an emotional level. It’s powerful.

But that emotional layer? It cuts both ways.

Because when that voice, especially one like Scarlett Johansson’s, starts managing your calendar, setting timers, or reading your emails, it can feel unsettling. Even creepy.

That discomfort matters. It shows that voice isn’t just a feature. It’s a feeling.

OpenAI Removes "Sky" Voice, But Not Without Repercussions

In response to the controversy, OpenAI pulled the “Sky” voice from ChatGPT. That’s a smart move, from both a PR and a legal standpoint. But it doesn’t erase the issue. In fact, it opens up even more questions:

  • Should AI companies need explicit consent to use or replicate vocal styles?

  • How do we define “copying” when it comes to voice?

  • What legal protections do individuals have over their vocal likeness?

These aren’t theoretical questions anymore. They’re urgent — and they’re going to shape the future of AI development.

Celebrity Voices + AI: A Legal and Ethical Minefield

This case also opens the floodgates. If OpenAI can (even accidentally) create a voice that mimics Johansson, what’s stopping companies from building AIs that sound like Morgan Freeman, Beyoncé, or Ryan Reynolds?

We’re entering a world where celebrity voices could become digital assets and legal battlegrounds. Expect more lawsuits. Expect tighter regulations. And expect celebrities to start locking down their vocal likenesses just like they do their image rights.

Voice Design Isn’t Just Technical, It’s Emotional

During this controversy, I saw something subtle but powerful: a sense of emotional discomfort. People didn’t just question the legality; they felt uneasy hearing a voice that sounded like Scarlett Johansson representing them in casual interactions.

This shows us that voice AI isn’t just technical; it’s deeply emotional. That emotional layer impacts trust, adoption, and ultimately how we connect with technology.

What’s Next for Voice AI? Consent, Licensing, and Clear Boundaries

Here’s the bigger picture: this isn’t just about Scarlett Johansson. It’s about all of us. AI is learning how to replicate our voices, our faces, our mannerisms. That creates amazing possibilities—and massive ethical landmines.

As AI gets more personal, more human, we’ll need:

  • Clear consent frameworks

  • Licensing agreements for vocal likeness

  • Restrictions on unauthorized mimicry

Otherwise, we risk turning people into data points without their knowledge or permission.

Final Thoughts: Where Does AI Voice Go From Here?

The Scarlett Johansson vs. OpenAI story is more than a headline; it’s a preview of what’s to come. As AI continues to humanize its interactions, we’ll need to rethink what “identity” means in a digital world.

Voices aren’t just sound. They’re emotion, memory, and identity.

And whether you’re a Hollywood star or a regular user, your voice deserves to be respected.

The “Sky” voice may have been removed from ChatGPT, but the ripple effect is just beginning.

About the Author:

Shawn Kanungo is a globally recognized disruption strategist and keynote speaker who helps organizations adapt to change and leverage disruptive thinking. Named one of the "Best New Speakers" by the National Speakers Bureau, Shawn has spoken at some of the world's most innovative organizations, including IBM, Walmart, and 3M. His expertise in digital disruption strategies helps leaders navigate transformation and build resilience in an increasingly uncertain business environment.

