Critics Say Sweeping Artificial Intelligence Regulations Could Target Parody, Satire Such as ‘South Park,’ ‘Family Guy’

As lawmakers seek to regulate uncharted AI territory, a debate is emerging about whether broad definitions could unintentionally chill the speech of parody and cartoon creators.

People reflected in a window of a hotel on the Davos Promenade bearing a slogan about AI, alongside the World Economic Forum at Davos, Switzerland. AP/Markus Schreiber

With the artificial intelligence boom well underway, lawmakers are racing to crack down on an influx of deepfake pornography, fraud, and phony celebrity endorsements circulating on social media and the internet at large.

Deepfakes and digitally altered images have been wreaking havoc as the rise of artificial intelligence makes it easier than ever to spread misleading information. Last year, a fake photo purporting to show an attack on the Pentagon caused a stock market dip within minutes.

A growing number of videos on social media, some drawing millions of views, show AI versions of celebrities endorsing products; targets of the fake videos have included Elon Musk, Tom Hanks, and, recently, Taylor Swift. As of 2020, more than 100,000 women had been targeted with fake nude photos created without their knowledge or consent, a Department of Homeland Security report notes.

Yet as lawmakers move into uncharted territory, a debate is emerging about how to protect people from fake portrayals of themselves without trampling on the First Amendment or stifling the beneficial side of AI innovation.

One recently introduced bill, the “No AI Fraud Act,” states that “every individual has a property right in their own likeness and voice” and would give individuals a legal basis to sue to protect it.

“It’s just not workable,” a fellow at the R Street Institute, Shoshana Weissmann, tells the Sun. Although AI impersonation “is a problem” and fraud laws should protect against it, that’s not what this law would do, she says. 

The bill defines likeness as the “actual or simulated image or likeness of an individual, regardless of the means of creation, that is readily identifiable” by virtue of “face, likeness, or other distinguishing characteristic.” It defines voice as “any medium containing the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital technology, service, or device” to the extent that an individual is “readily identifiable” from the sound of it. 

“There’s no exception for parody, and basically, the way they define digital creations is just so broad, it would cover cartoons,” Ms. Weissmann says, adding that the bill would extend to shows such as “South Park” and “Family Guy,” which both do impersonations of real people.

“It’s understood that this isn’t the real celebrity. When South Park made fun of Ben Affleck, it wasn’t really Ben Affleck. And they even used his picture at one point, but it was clear they were making fun of him. But under the pure text of this law, that would be unlawful,” she says. 

If the bill were enacted, “someone would sue immediately,” she says, adding that it would not survive First Amendment scrutiny.

Lawmakers should be more careful to ensure these regulations don’t “run afoul” of the Constitution, she says, but “instead, they have haphazard legislation like this that just doesn’t make any functional sense.”

While the bill does include a section addressing a First Amendment defense, Ms. Weissmann says it amounts to telling defendants that “after you’re sued under our bill, you can use the First Amendment as a defense. But you can do that anyway under the bill. That doesn’t change that.”

Because of the threat of being “dragged into court” and spending “thousands of dollars on lawyers,” the bill would effectively be “chilling speech,” she notes. 

Among the harms defined in the bill is “severe emotional distress of any person whose voice or likeness is used without consent.”

“Let’s say Ben Affleck said he had severe emotional distress because South Park parodied him,” Ms. Weissmann says. “He could sue under this law. That’s insane, absolutely insane.”

The bill would be more workable if it were made more “specific and narrow to actual harms, and also made sure that people couldn’t sue over very obvious parodies,” she says. As drafted, however, it is “going to apply to a lot more than they intended,” she adds.

A backer of the bill, Representative Maria Salazar of Florida, could not be reached for comment. In a press release, she said it is time for “bad actors using AI to face the music.”

“This bill plugs a hole in the law and gives artists and U.S. citizens the power to protect their rights, their creative work, and their fundamental individuality online,” she said in the statement.
