A bipartisan group of senators has introduced the NO FAKES Act, a bill that would make it illegal to create digital replicas of a person’s voice or likeness without their consent.
This legislation, called the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024, is spearheaded by Senators Chris Coons, Marsha Blackburn, Amy Klobuchar, and Thom Tillis.
The proposed law aims to give individuals the right to seek damages if their voice, face, or body is digitally copied without consent. Both individuals and companies could be held liable for producing, hosting, or sharing these unauthorized replicas, including those created using generative AI.
Celebrities have repeatedly discovered digital lookalikes of themselves being used without permission. A fake “Taylor Swift,” for example, was used to lure people into a fraudulent Le Creuset cookware giveaway, and a voice resembling Scarlett Johansson’s was featured in a ChatGPT voice demo.
Political figures have also been targeted; Kamala Harris was recently depicted making false statements via AI. Deepfakes can affect anyone, not just famous individuals.
The senators behind this act believe everyone should have the right to control and protect their voice and likeness. They emphasize that while AI can foster creativity, it should not exploit individuals’ identities without consent.
The NO FAKES Act is an attempt to close the gap between legislation and fast-moving technology. It follows the Senate’s recent passage of the DEFIANCE Act, which allows victims of sexually explicit deepfakes to sue for damages.
OpenAI, a prominent AI company, has endorsed the NO FAKES Act. Anna Makanju, OpenAI’s vice president of global affairs, said the company supports the bill, highlighting the need to protect creators and artists from unauthorized digital replicas of their voices and likenesses. In her statement, she framed the legislation as a way to safeguard individuals from improper impersonation and exploitation through thoughtful federal regulation.
The bill has also received support from entertainment organizations such as SAG-AFTRA, the RIAA, the Motion Picture Association, and the Recording Academy, which have been actively seeking protections against unauthorized AI recreations. Recently, SAG-AFTRA went on strike against several game publishers to secure a union agreement covering the use of performers’ likenesses in video games.