House Lawmakers Unveil No AI FRAUD Act in Push for Federal Protections for Voice, Likeness
Written by djfrosty on January 10, 2024
A bipartisan group of U.S. House lawmakers announced a new bill on Wednesday (Jan. 10) that would regulate the use of AI to clone voices and likenesses. Called the No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2023 (“No AI FRAUD” Act), the bill aims to establish a federal framework for protecting one’s voice and likeness and to lay out First Amendment protections.
More federal and state legislation regulating artificial intelligence is expected to be announced later today, including a bill from Gov. Bill Lee of Tennessee that also addresses AI voice and likeness cloning. On Jan. 5, Gov. Lee hinted at the subject of his forthcoming legislation: “As the technology landscape evolves with artificial intelligence, we’re proud to lead the nation in proposing legal protection for our best-in-class artists and songwriters.”
The No AI FRAUD Act was introduced by Rep. María Elvira Salazar (R-FL), the lead Republican sponsor of the bill, alongside Reps. Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Joe Morelle (D-NY) and Rob Wittman (R-VA). It is said to be based on the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (“NO FAKES” Act), a Senate discussion draft announced last October.
“It’s time for bad actors using AI to face the music,” said Rep. Salazar. “This bill plugs a hole in the law and gives artists and U.S. citizens the power to protect their rights, their creative work, and their fundamental individuality online.”
AI voice synthesis technology presents both a new opportunity and a new problem for recording artists. While some laud it as a novel marketing, creative or fan engagement tool, it also leaves artists vulnerable to uncanny impersonations that could confuse, scam or mislead the public.
An artist’s voice, image or likeness may be covered by “right of publicity” laws, which protect against commercial exploitation without authorization, but the scope of that right varies state by state. The No AI FRAUD Act aims to establish a harmonized federal baseline of protection. Still, if an artist lives in a state with an even stronger right of publicity law than the No AI FRAUD Act, that state protection remains available and may be easier to assert in court.
This bill is in keeping with the regulations that a number of music business executives, including those at Sony, ASCAP and UMG, have called for in recent months, following incidents like the viral fake-Drake song “Heart On My Sleeve.”
Mitch Glazier, chairman and CEO of the Recording Industry Association of America (RIAA), released a statement in support of the No AI FRAUD Act: “The No AI FRAUD Act is a meaningful step towards building a safe, responsible and ethical AI ecosystem, and the RIAA applauds Representatives Salazar, Dean, Moran, Morelle, and Wittman for leading in this important area. To be clear, we embrace the use of AI to offer artists and fans new creative tools that support human creativity. But putting in place guardrails like the No AI FRAUD Act is a necessary step to protect individual rights, preserve and promote the creative arts, and ensure the integrity and trustworthiness of generative AI. As decades of innovation have shown, when Congress establishes strong IP rights that foster market-led solutions, it results in both driving innovation and supporting human expression and partnerships that create American culture.”
Lucian Grainge, chairman and CEO of Universal Music Group, also shared his praise for the new bill in a statement: “Universal Music Group strongly supports the ‘No AI FRAUD Act’ because no one should be permitted to steal someone else’s image, likeness or voice. While we have an industry-leading track record of enabling AI in the service of artists and creativity, AI that uses their voice or identity without authorization is unacceptable and immoral. We call upon Congress to help put an end to nefarious deepfakes by enacting this federal right of publicity and ensuring that all Americans are protected from such harm.”