
Locke Lord QuickStudy: The NO FAKES Act: With Proposed Bill, Congress Set to Protect Against Unauthorized Digital Replicas of Faces, Names and Voices

Locke Lord LLP
October 16, 2023

Congress has taken its first swing at generative AI. On October 12, 2023, a bipartisan group of senators (Senators Blackburn, Coons, Klobuchar, and Tillis) announced a draft federal bill aimed at establishing guardrails around the creation and use of digital replicas of faces, names, and voices. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act (the “NO FAKES Act”) addresses a concern most can agree on: preventing one party from co-opting another’s identity without the latter’s consent.[1] We previously discussed the interplay between generative AI and the right of publicity; the NO FAKES Act would create a uniform set of laws substantially expanding that right.

Expanding the Right of Publicity

To date, the right of publicity has been codified exclusively in state statutes[2] and common law. The most notable difference between the proposed federal NO FAKES Act and current right-of-publicity jurisprudence is that the Act would allow virtually anyone to bring a cause of action. Under the traditional doctrine, an individual could sue only if the defendant used his or her “persona,” that is, the commercially valuable public image of the individual. Similarly, monetary damages in traditional right-of-publicity cases are measured by the plaintiff’s loss or the defendant’s gain[3] rather than by a statutory “per violation” amount. Because the common law protected only identities with commercial value, my Aunt Jean, however identifiable, had little chance of winning a right-of-publicity case. Under the proposed Act, she potentially could, and the monetary stakes would be higher.

The proposed bill flips this traditional approach in two ways. First, under the NO FAKES Act, there is no requirement that the offender exploit the “commercial value” of the victim’s identity, or even that the victim’s “identity” or “persona” have commercial value. It is simply unlawful to produce a digital replica of an individual without that individual’s consent,[4] regardless of the commercial value of the identity or of the use.[5] Second, damages are not limited to the defendant’s gains or the plaintiff’s losses. The proposed bill allows $5,000.00 in damages per violation (or traditional monetary damages, if greater). Thus, every creation or single distribution of an unauthorized “likeness” could cost the offender $5,000.00. Taken together with the potential for punitive damages, § (d)(4)(B), and discretionary attorneys’ fees, § (d)(4)(C), these provisions expand the right of publicity and create a strong disincentive for unauthorized digital replication.

It is easy to imagine the broad application of this proposed bill. A student who bullies a classmate by creating a “digital replica” of his target and spreading it through school would likely be free from any right-of-publicity suit under current law, as the victim’s “identity” lacks commercial value and, in any event, the damages would be negligible or difficult to prove.[6] The NO FAKES Act, however, would prohibit this conduct and stack statutory damages in multiples of $5,000.00, not just for the creation of the unauthorized digital replica but also for each unauthorized distribution; a single replica forwarded to one hundred classmates, for example, could represent over $500,000.00 in statutory exposure. The Act would also make clear that anyone forwarding or hosting that content could be liable. This damages formula would provide certainty in place of otherwise speculative “causation” damages and spare plaintiffs from having to prove the highly personal and difficult-to-quantify harms required in right-of-privacy actions.

Secondary Liability

The NO FAKES Act applies not just to the individual who creates the digital replica (i.e., the person who enters the prompt); the proposed bill also prohibits the “publication, distribution, or transmission of . . . an unauthorized digital replica” if the offender has knowledge that the replica was not authorized.[7] This could reach both individuals sharing already-created content and websites hosting that content.

Further, the proposed bill arguably targets AI models themselves. The bill specifically provides that it is not a defense to a claim that the offender “did not participate in the creation, development, distribution, or dissemination of the applicable digital replica.”[8] The proposed bill thus appears to preemptively foreclose an argument that AI model owners would otherwise make: that they do not themselves create, publish, or transmit the digital replica. Stated differently, an AI company could not avoid liability by arguing that it did not “create” anything, i.e., bring something new into existence; simply providing the software that enables a user to input a prompt and generate an unauthorized replica may be enough.

Other Notable Takeaways

This potential new federal right of action would survive the death of the individual,[9] providing uniformity to disparate state treatment. The NO FAKES Act also includes general First Amendment exceptions: the proposed bill does not apply to works of parody or satire, news pieces or documentaries, and other similar categories. There is a large body of law on these topics in the copyright context, but the sheer volume of AI-created content will test and likely refine these judicial boundaries.

Finally, the bill explicitly permits individuals to license their digital replica rights. Courts have already held that the right of publicity, like other property rights, can be licensed and alienated.[10] Notably, however, the proposed bill requires[11] that an attorney represent the individual for the license to be valid.[12] This could prevent companies from obtaining rights to another’s publicity through form, unilateral adhesion contracts. It would preclude, as a timely example, movie studios from requiring background actors to sign away their personal likenesses “in perpetuity,” a hotly contested point in the recent SAG-AFTRA strike.

Conclusion

The draft bill is only that: a draft. It remains to be seen what any final legislation will contain. However, the draft bill’s bipartisan support suggests that both sides of the aisle agree that generative AI and the rise of “fakes,” “generative AI mash-ups,” and other “digital replicas” require new rules going forward. As with any game-changing technology, the law will need time to catch up with the technological advances. This is only the beginning.

---

[1] The proposed bill defines “digital replica” as: “a newly-created, computer-generated, electronic representation of the image, voice, or visual likeness of an individual that— (A) is [nearly indistinguishable] from the actual image, voice, or visual likeness of that individual; and (B) is fixed in a sound recording or audiovisual work in which that individual did not actually perform or appear.”

[2] Thirty-six states have recognized some form of the right-of-publicity tort, and no state has outright rejected the cause of action.

[3] McCarthy & Schechter, The Rights of Publicity and Privacy § 11:31 (2019 ed.).

[4] NO FAKES Act § (c)(2)(A).

[5] The proposed bill also prohibits publishing or distributing an unauthorized digital replica “if the person engaging in that activity has knowledge that the digital replica was not authorized[.]” § (c)(2)(B).

[6] There are potentially other causes of action at play here, including traditional right-to-privacy actions and the various state-law privacy torts. However, the former generally redresses only mental pain and suffering, which can be both difficult and invasive to prove. The latter generally presents similar difficulties of proof and is not as expansive as the proposed bill.

[7] NO FAKES Act § (c)(2)(B).

[8] NO FAKES Act § (d)(3)(B).

[9] NO FAKES Act § (b)(2)(A)(ii).

[10] See McCarthy & Schechter, The Rights of Publicity and Privacy § 1:7 (2019 ed.).

[11] NO FAKES Act § (b)(2)(B)(i).

[12] The proposed bill also allows for licenses that are covered by a collective bargaining agreement. NO FAKES Act § (b)(2)(B)(ii).
