Copying a person’s likeness has become easy with AI. Is it legal?

By: Verbit Editorial


On May 13, OpenAI demoed GPT-4o, the latest version of its popular chatbot. The presenters showed off ChatGPT’s ability to help a child with their math homework, translate speech between Spanish and English in real time and even provide fashion advice for a disheveled-looking man who said he was about to go into a job interview.

Shortly after the demo showcasing the chatbot's impressive and unprecedented abilities, OpenAI CEO Sam Altman posted the word "her" on X (formerly known as Twitter), seemingly an allusion to the 2013 film featuring an artificially intelligent assistant voiced by Scarlett Johansson. Before long, OpenAI heard from Johansson's lawyers.

“Sky,” the voice featured in their tech demo, bore an eerie resemblance to Johansson’s. Altman, a fan of the movie “Her,” had for months tried to get the actor to voice his company’s chatbot. Repeatedly, he was turned down.

OpenAI has since removed Sky from ChatGPT. In a blog post published May 19, the company wrote that they flew in their five chosen voice actors for recording sessions in June and July of 2023 and did not contact Johansson until months later on Sept. 11, 2023, as a possible sixth voice.

The dispute between OpenAI and Johansson offers a glimpse into the complex legal landscape surrounding the use of artificial intelligence. Verbit explored this issue to determine which legal precedents offer insights into the current case and how future legislation might regulate the use of people's work and likeness with AI.

A black and white image of entertainer Bette Midler

When it comes to rights to voice and likeness, there are parallels to the past

Scarlett Johansson’s case is not without precedent. Consider the Midler v. Ford lawsuit in the 1980s. The Ford Motor Company’s ad agency attempted to hire famed singer and actor Bette Midler to sing for a television commercial. When she declined, the company went on to hire one of Midler’s former backup singers, instructing her to sound as much like the “Do You Want to Dance” singer as possible. The resulting piece was so convincing that many viewers thought that Midler herself was singing in the commercial. Midler sued Ford, and eventually, a court ruled that imitating a famous singer’s voice without their consent was unlawful.

A similar lawsuit between Frito-Lay and singer Tom Waits took place in the early 1990s when the company tried to hire Waits to sing a jingle advertising their new Salsa Rio Doritos. Instead, the advertising agency hired by Frito-Lay found a professional musician to record a jingle similar to “Step Right Up,” a song by Waits featuring his distinctive deep, gravelly vocals. Waits sued the ad agency and Frito-Lay for violating his right of publicity and for false endorsement. Witnesses during the trial testified that they believed Waits sang the Frito-Lay jingle, when in fact he did not. The jury eventually awarded Waits $2,475,000 in damages.


According to an analysis by Casetext, neither Midler nor Waits would have won their cases had they sued for copyright infringement, since Ford obtained the right to use Midler's song and Waits did not own the rights to his. But as public figures with distinctive voices, they found it comparatively easier to argue that their "rights of publicity" had been violated. Johansson, a major celebrity who has done extensive voice-over work, likewise has a voice considered distinctive; she, too, could build a compelling case against OpenAI if she were to pursue legal action.

This right of publicity is a legal doctrine that gives individuals the exclusive right to control the commercial use of their image. It means no one can use an individual's likeness to falsely claim that they endorse a product. However, only around half of U.S. states recognize this right, according to the Legal Information Institute at Cornell Law School.

When it comes to unauthorized impersonations, the rest of us probably do not need to worry about major companies duplicating our voices for their commercials. Deepfakes of everyday people are much more likely to be used for mischief. According to Kristelia García, a law professor at Georgetown University, it is illegal to use someone else’s voice without their consent. But the law gets murky when it comes to AI replicas. Some legal jurisdictions have stronger laws on this issue than others.

Four officials, three men and one woman, stand at a table, holding up their right hands, preparing to testify

What comes next for copyright laws and AI safeguards?

Such concerns have not escaped lawmakers’ attention. In January, the House of Representatives introduced the “No AI Fraud Act,” a bill designed to safeguard against the nonconsensual use of AI replicas by giving people property rights to their voice and likeness at the federal level. Meanwhile, the Senate is proposing the “NO FAKES Act,” which would offer people similar protections.

These laws come at a sensitive time for Hollywood. In November last year, the Screen Actors Guild-American Federation of Television and Radio Artists ended a 118-day strike. One of the key points of contention was related to the use of AI. In its deal with the studios, SAG-AFTRA agreed to a number of rules surrounding the use of AI in TV and film to protect performers’ voices and likenesses. Studios, for instance, will need an actor’s consent if they want to use their likeness. They will also have to pay an actor if they make an AI clone of them, just as they would if they had employed them to act directly.

The explosion of AI systems in recent years has raised a seemingly endless number of legal questions. Some companies have attempted to put guardrails in place. For instance, Meta-owned social media platforms recently announced plans to label AI-generated images, but the rollout will likely take time.

As the technology advances, lawmakers will have to strike a careful balance between freedom of expression and the rights to privacy and ownership of personal likeness.

Current laws are vague and differ greatly from state to state. Many states already prohibit deepfakes, but clarity will likely come from the passing of new laws and the settlement of new lawsuits. These laws will need to clearly protect everyday people, not just celebrities.

Written by Wade Zhou
