Why deepfakes aren’t just a problem for celebrities

American actress Scarlett Johansson at an Asteroid City photo shoot during the 2023 Cannes Film Festival in Cannes, France, May 24, 2023.

Mondadori Portfolio | Getty Images

Movie star Scarlett Johansson is taking legal action against an AI app that used her name and an AI-generated version of her voice in an ad without her permission, according to Variety.

The 22-second ad was posted on X, formerly Twitter, on Oct. 28 by AI image-generating app Lisa AI: 90s Yearbook & Avatar, according to Variety. The ad featured images of Johansson and an AI-generated voice similar to hers promoting the app. However, fine print displayed below the ad indicates that the AI-generated content “has nothing to do with this person”.

Representatives for Johansson confirmed to Variety that she is not a spokesperson for the app, and her attorney told the publication that legal action is underway. The ad appears to have since been taken down, and CNBC has not seen it. Lisa AI and a representative for Johansson did not respond to CNBC Make It’s request for comment.

Although celebrities are frequent targets of deepfakes, the technology can also cause problems for ordinary people. Here’s what you need to know.

The word deepfake comes from the concept of “deep learning,” which falls under the broader umbrella of machine learning. In deep learning, algorithms are trained to identify patterns in large data sets; they can then apply that pattern recognition to new data, or generate output that resembles the original data set.

Here’s a simple example: An AI model can be fed audio clips of a person speaking and learn to identify their speech patterns, tonality and other unique aspects of their voice. The model can then create a synthetic version of that voice.
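To make that concrete, here is a toy sketch in Python of the pattern-recognition step described above. It does not synthesize speech; it simply learns a statistical “fingerprint” of one “speaker” and scores how closely new audio matches it. The synthetic tone “recordings,” the spectral features and the Gaussian mixture model are all illustrative stand-ins of our own choosing; real voice deepfakes use deep neural networks trained on genuine recordings.

```python
# A minimal, illustrative sketch: learn a statistical "fingerprint" of one
# speaker's audio, then score how closely new audio matches it. This is a toy
# analogy for the pattern-recognition step in voice cloning, not a deepfake.
import numpy as np
from sklearn.mixture import GaussianMixture

SAMPLE_RATE = 16_000  # samples per second

def fake_recording(pitch_hz: float, seconds: float = 2.0) -> np.ndarray:
    """Stand-in for a real voice clip: a noisy tone at a speaker-specific pitch."""
    t = np.linspace(0.0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return np.sin(2 * np.pi * pitch_hz * t) + 0.1 * np.random.randn(t.size)

def spectral_features(audio: np.ndarray, frame: int = 512) -> np.ndarray:
    """Chop audio into frames and use log-magnitude spectra as features."""
    n_frames = len(audio) // frame
    frames = audio[: n_frames * frame].reshape(n_frames, frame)
    return np.log1p(np.abs(np.fft.rfft(frames, axis=1)))

# "Training": fit a mixture model to the spectra of one speaker's clips.
speaker_clips = spectral_features(fake_recording(pitch_hz=120.0, seconds=4.0))
voice_model = GaussianMixture(n_components=4, covariance_type="diag",
                              random_state=0).fit(speaker_clips)

# Scoring new audio: a higher average log-likelihood means a closer match.
print("same voice:     ", voice_model.score(spectral_features(fake_recording(120.0))))
print("different voice:", voice_model.score(spectral_features(fake_recording(300.0))))
```

The takeaway is the workflow rather than the math: a model fits the patterns in known samples of a voice, and anything that scores as a close match can then be imitated, which is exactly what makes the technique ripe for abuse.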

The problem is that the technology can be used in harmful ways, says Jamyn Edis, an adjunct professor at New York University with more than 25 years of experience in the technology and media industries.

“Deepfakes are simply a new vector for impersonation and fraud, and as such can be used in similarly malicious ways whether someone is a celebrity or not,” he told CNBC Make It. “Examples could be your likeness – or that of your loved ones – being used to generate pornography or used for extortion or to bypass security by stealing your identity.”

What’s even more troubling is that, as deepfake technology rapidly advances, it’s getting harder to tell the difference between what’s real and what’s fake, Edis says.

There are a few things you can do if you’re wondering whether something you’re looking at might be a deepfake.

First, ask yourself whether the images you’re seeing look true to reality, Edis says. And since celebrities are required to disclose when they are paid to promote products, if you see an ad featuring a celebrity pushing a product, it’s a good idea to check their other social media accounts for a disclosure.

Big tech companies including Meta, Google and Microsoft are also developing tools to help people spot deepfakes.

President Biden recently announced the first executive order on artificial intelligence, which calls for watermarking to clearly identify AI-generated content, among other safety measures.
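As a hedged illustration of what such labeling could look like at its very simplest, the Python sketch below tags a PNG with a provenance note in its metadata using the Pillow library. The tag names and model name are hypothetical, and a plain metadata label is trivially stripped; real provenance schemes, such as cryptographically signed content credentials or watermarks embedded in the pixels themselves, are designed to be much harder to remove.

```python
# A toy illustration of labeling AI-generated images via metadata, in the
# spirit of the watermarking the executive order calls for. Real systems bind
# provenance to the image cryptographically or in the pixels; a metadata tag
# like this is shown only to make the idea concrete.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Pretend this image came out of a generative model.
image = Image.new("RGB", (256, 256), color="gray")

# Attach a provenance label as PNG text chunks.
metadata = PngInfo()
metadata.add_text("ai_generated", "true")           # hypothetical tag name
metadata.add_text("generator", "example-model-v1")  # hypothetical model name
image.save("generated.png", pnginfo=metadata)

# A viewer or platform could read the label back before displaying the file.
with Image.open("generated.png") as reloaded:
    print(reloaded.text)  # {'ai_generated': 'true', 'generator': 'example-model-v1'}
```

A platform could visibly label or refuse to serve any image carrying such a tag; the hard part, which this toy sidesteps entirely, is making the label survive edits, screenshots and re-encoding.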

However, the technology has historically stayed one step ahead of regulation and attempts to police it, Edis says.

“Over time, social norms and legal regulations tend to correct humanity’s worst instincts,” he says.
“Until then, we will continue to see the weaponization of deepfake technology for negative results.”

DON’T MISS: Want to be smarter and more successful with your money, work and life? Sign up for our new newsletter!

CNBC will host its Your Money virtual event on November 9 at 12pm ET with experts including Jim Cramer, Ben McKenzie and Farnoosh Torabi. Learn how to grow your finances, invest for the future and reduce risk amid record high inflation. Register for free here.

CHECK OUT: This new tool lets artists ‘poison’ their artwork to stop AI companies from scraping it
