NO FAKES Act: Unpacking the New Bipartisan Bill on Digital Replicas

Senators aim to rein in digital replicas with the “NO FAKES” Act, which proposes a limited federal right to control one’s likeness, borrowing some DMCA-like notice-and-takedown elements.

Guest post by Professor Justin Hughes

This week, Senators Blackburn, Coons, Klobuchar, and Tillis introduced the bipartisan “NO FAKES” Act in Congress, a bill that has been under discussion for months and is intended to provide centerpiece legislation addressing the problem of digital replicas.  The recording industry (RIAA) and the actors’ union (SAG-AFTRA) have been the leading proponents of such a law.  Senate Judiciary staff led a process with those groups, and with the Motion Picture Association (MPA), that went through a long series of drafts.  AI companies were also part of the drafting process.

The bill is substantively complex and structurally complicated, partly the result of so many cooks in the kitchen.  What follows are only the bill’s basics, along with some concerns.

The bill defines a “digital replica” as a “computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual” and then gives that individual the exclusive “right to authorize the use of the voice or visual likeness of the individual in a digital replica.”

The individual’s exclusive right applies to the “production, publication, reproduction, display, distribution, transmission of, or otherwise making available to the public, a digital replica,” at least where the activity in question affects interstate commerce.  But there is an important caveat: liability attaches only when the exclusive right is violated with knowledge both that the thing used was a digital replica and that the replica was unauthorized.

Post-mortem rights

The NO FAKES digital replica right survives the individual for a minimum of life+10 and a maximum of life+70.  How long the descendible right lasts beyond the initial 10-year period depends on continued “authorized public use of the voice or visual likeness of the individual.”  The bill proposes that the Copyright Office will maintain a registry/database of these post-mortem rights.
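
To make the term structure concrete, here is a minimal sketch in Python of the outer bounds described above.  It is purely illustrative: the bill conditions survival beyond the first 10 years on continued “authorized public use,” and the year-by-year renewal check below is my own simplifying assumption, not the bill’s actual renewal mechanism.

    # Illustrative only -- not statutory text.  Models a guaranteed 10-year
    # post-mortem term, with extension contingent on continued authorized
    # public use (assumed here to be checked annually), capped at life + 70.
    def postmortem_term_end(death_year: int, authorized_use_years: set[int]) -> int:
        end = death_year + 10                      # guaranteed minimum term
        while end < death_year + 70 and end in authorized_use_years:
            end += 1                               # use that year extends the term
        return end

    # No authorized use after death: the right lapses at life + 10.
    assert postmortem_term_end(2030, set()) == 2040
    # Continuous authorized use: the right runs to the life + 70 cap.
    assert postmortem_term_end(2030, set(range(2030, 2101))) == 2100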

Protecting the individual from bad deals

During the individual’s lifetime, the digital replica right cannot be assigned (it is inalienable), but it can be licensed; such a license must be in writing and signed by the individual, must “include[] a reasonably specific description of the intended uses of the applicable digital replica,” and cannot have a term of more than 10 years.  A license for a minor’s digital replica can have a term of no more than 5 years and must terminate when the person turns 18.  Those requirements do not apply “if the license is governed by a collective bargaining agreement that addresses digital replicas,” a nod to the deal that ended SAG-AFTRA’s 2023 strike against the film studios.
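
Rendered as a checklist, those formalities look something like the following Python sketch.  The field and function names are mine, for illustration only; they are not terms from the bill.

    from dataclasses import dataclass

    @dataclass
    class ReplicaLicense:
        in_writing: bool
        signed_by_individual: bool
        describes_intended_uses: bool      # "reasonably specific description"
        term_years: int
        licensor_is_minor: bool
        years_until_licensor_turns_18: int
        governed_by_qualifying_cba: bool   # CBA that addresses digital replicas

    def satisfies_formalities(lic: ReplicaLicense) -> bool:
        if lic.governed_by_qualifying_cba:
            return True                    # the statutory formalities do not apply
        if not (lic.in_writing and lic.signed_by_individual
                and lic.describes_intended_uses):
            return False
        if lic.licensor_is_minor:
            # Minor: maximum 5-year term that must also end by age 18.
            return (lic.term_years <= 5
                    and lic.term_years <= lic.years_until_licensor_turns_18)
        return lic.term_years <= 10        # adult: maximum 10-year term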

What about the First Amendment?

As with more general rights of publicity and privacy, the drafters were keenly aware of the difficult problem of balancing the legitimate interests of individuals in their own likenesses with others’ freedom of expression.  The present bill has exceptions to the exclusive right for using a digital replica in:

  • “a bona fide news, public affairs, or sports broadcast or account”;
  • “a documentary or in a historical or biographical manner, including some degree of fictionalization”;
  • “bona fide commentary, criticism, scholarship, satire, or parody”; or
  • “fleeting or negligible” usage.

For myself, I’m most concerned that only the “documentary . . . historical or biographical manner” exception is conditioned on the requirement that the usage not “create[] the false impression that the work is an authentic sound recording, image, transmission, or audiovisual work in which the individual participated.”  The presence of this limitation in one exception but not the others could be interpreted by courts to mean that the use of a digital replica in “commentary,” “satire,” or a “news broadcast” is permitted to create the false impression that the individual participated.  Given that courts have recognized protecting consumers from deception as a legitimate basis to restrict free expression, I would think it better to condition all the exceptions on not confusing, misleading, or deceiving consumers.

What performers, record labels, and online platforms get

One criticism of this bill will surely be that the victim of a digital replica already has all sorts of causes of action available, but folks who say that are missing what this bill really aims to do.  The NO FAKES Act introduces a takedown system in which “online service” providers have a safe harbor from liability if they disable access to an unauthorized digital replica after receipt of a notice meeting requirements similar to the DMCA’s; the online service must remove or disable access “as soon as is technically and practically feasible,” language that reflects some bad experiences content owners have had with the DMCA’s “expeditiously” requirement.
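
A rough sketch of that notice-and-takedown flow might look like this.  The notice fields and the disable_access hook are hypothetical, loosely modeled on DMCA-style notice elements; the bill’s actual notice requirements may differ.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ReplicaNotice:
        # Illustrative notice elements only, not the bill's actual list.
        claimant: str
        identified_individual: str
        replica_url: str
        statement_of_unauthorized_use: str
        received_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

    def handle_notice(notice: ReplicaNotice, disable_access) -> None:
        # Safe-harbor sketch: on receipt of a notice, the service removes or
        # disables access "as soon as is technically and practically
        # feasible," modeled here as an immediate call to a takedown hook.
        disable_access(notice.replica_url)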

“Online service” is given a broad definition to include user-generated content platforms, social media, and digital music providers, but seems to exclude transmission ISPs that would qualify for the DMCA’s 512(a) safe harbor.

Indeed, since liability is triggered only when someone has knowledge of the unauthorized digital replica, this bill is really directed at the online services that will receive these notices.

What AI companies get

One troubling part of this bill is that “products and services capable of producing digital replicas” seem to get a carte blanche shield from secondary liability, without even the limited obligations that ISPs must undertake to enjoy the DMCA’s safe harbors.  Given that we know AI companies can and do use “guardrails” to prevent the generation of at least some copyright-infringing materials, it is disappointing that the drafters have not imposed even minimal requirements to enjoy the safe harbor, i.e., that companies deploy measures to prevent the generation of digital replicas of individuals for whom the companies have received notices, as well as of individuals listed in the registry that the Copyright Office will maintain.
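
For what it is worth, the kind of minimal guardrail I have in mind could be as simple as the following sketch: refuse a generation request that targets an individual who has sent a notice or who appears in the (future) Copyright Office registry.  The names here are mine, and real guardrails would need far more robust identity matching than plain string comparison.

    def generation_allowed(requested_likeness: str,
                           notice_list: set[str],
                           registry: set[str]) -> bool:
        # Refuse generation when the requested voice or likeness matches
        # someone who has sent a notice or is listed in the post-mortem
        # rights registry.  Purely illustrative matching logic.
        target = requested_likeness.strip().lower()
        blocked = {name.lower() for name in notice_list | registry}
        return target not in blocked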

Fortunately, the bill denies this liability shield to products or services intended to produce digital replicas and deepfakes, using a framework similar to 17 U.S.C. 1201(a)(2).  So some future service like clone-glen-powell.com or taylorswiftserenadesyou.net won’t get an automatic hall pass.

What about state laws?

Existing state laws on digital replicas are not preempted, including the new laws that will come online January 1 in California, New York, and Illinois.  State laws addressing sexual deepfakes and election-related misinformation are also not preempted.  The bill does preempt new state laws “for the protection of an individual’s voice and visual likeness rights in connection with a digital replica . . . in an expressive work,” but for practical purposes the NO FAKES Act will produce a regime like trademark and trade secrecy, where there may be overlapping but distinct state and federal claims.

Why is this happening now?

Normally you might expect the record labels, as major content owners, to have their interests more aligned with the motion picture studios than with the actors’ union.  A keen observer might ask: what is going on?

The answer is simple.  We’re in a replay of the early days of the internet.  In those days, music was the canary in the coal mine for online digital piracy, simply because it was far easier to reproduce and distribute .mp3 files than full-length television shows and feature films.  We’re at a similar moment now, when AI-generated sound recordings are passable as music; at least one music AI developer, Suno, has admitted that its system can produce outputs that replicate real artists’ vocals.  Meanwhile, actors are fighting abusive uses of digital replicas in everything from deepfake porn to fairly mainstream advertising.

Is this bill perfect?  No, far from it.  But the takedown system it envisages could go a long way toward suppressing the market for what the FBI calls “synthetic content” that deceives consumers and replaces creative professionals.  That alone might make AI development a little less like the digital Wild West.  To most of us, that would be a good thing.

# # # #

Justin Hughes is on the faculty at Loyola Law School, Loyola Marymount University, and a Visiting Professor on the Law Faculty, Oxford University.  He provided advice on some of the drafting of the bill.
