On Wednesday, January 10, 2024, lawmakers in the House of Representatives announced the introduction of a new discussion draft of a bill called the No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act (No AI FRAUD Act). The draft bill follows the Senate’s own proposal three months prior and was introduced by Reps. María Elvira Salazar (R-FL), Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Joe Morelle (D-NY) and Rob Wittman (R-VA). The central purpose of the No AI FRAUD Act is to prevent the creation and use of AI-generated replicas of an individual’s likeness, voice, or other personal characteristics without that individual’s consent.
Although many states have already begun passing their own AI regulations and laws, there is presently no enacted federal-level right of publicity, much less one that applies to this powerful technology. In October 2023, the U.S. Senate took the first step toward filling this gap by introducing its own discussion draft bill called the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023 (NO FAKES Act). Together, both chambers of Congress are now proposing to provide a nationwide set of rules to address the rising use of AI to generate unauthorized replicas of individuals’ and artists’ voices and likenesses.
The House’s No AI FRAUD Act specifically cites past instances in which generative AI has raised concerns about the power and danger of the technology, including the AI-generated “Heart on My Sleeve” song that mimicked the vocals and styles of Drake and The Weeknd, false AI-generated endorsements featuring Tom Hanks, and AI-generated nonconsensual explicit images of high school girls.
Key portions of the proposed No AI FRAUD Act include the following:
- The Act provides that anyone who “in a manner affecting interstate or foreign commerce” and “without consent” (A) “distributes, transmits or otherwise makes available to the public a personalized cloning service”; (B) “publishes, performs, distributes, transmits or otherwise makes available to the public a digital voice replica or digital depiction with knowledge that the digital voice replica or digital depiction was not authorized”; or (C) “materially contributes to, directs, or otherwise facilitates” such conduct shall be liable for damages. Available remedies include statutory damages starting at $5,000 or $50,000 depending on the violation, or actual damages if greater, as well as punitive damages and reasonable attorneys’ fees.
- Notably, the proposed No AI FRAUD Act provides that the use of a disclaimer that the rights owner did not authorize or participate in the use “shall not be a defense.” But the Act does provide for a First Amendment defense subject to factors such as commerciality, the necessity and relevance of the use of the voice or likeness to the “primary expressive purpose of the work,” and whether the use “competes with or otherwise adversely affects” the value of works owned by rights holders.
- The Act contains an exception where the harm caused is “negligible,” but provides for per se harm for “[a]ny digital depiction or digital voice replica which includes child sexual abuse material, is sexually explicit, or includes intimate images.”
With federal bills to regulate the creation and use of digital replicas having been proposed in both houses of Congress, momentum seems to be gathering for such a new federal right, and it bears watching how the draft bills are modified and amended through the legislative process.
Developments are not limited to the federal realm, however. On the same day that the No AI FRAUD Act was announced, Tennessee lawmakers introduced the Ensuring Likeness Voice and Image Security (ELVIS) Act, “a bill updating Tennessee’s Protection of Personal Rights law to include protections for songwriters, performers, and music industry professionals’ voice from the misuse of artificial intelligence (AI).” As we noted in October 2023 when writing about the NO FAKES draft bill, it is time for artists, musicians, actors and other stakeholders to watch closely and prepare themselves for the possible establishment in the near future of an AI-centric federal right of publicity, as well as similar new state laws.