NO FAKES Act

The NO FAKES Act was reintroduced to the U.S. House of Representatives and Senate on Wednesday (April 9) with the help of country legend Randy Travis, his wife Mary Travis and Warner Music Group CEO Robert Kyncl.
The reintroduction of the bill, designed to protect artists against unauthorized AI deepfake impersonations, was part of the Recording Academy’s annual GRAMMYs on the Hill initiative, in which the organization visits D.C. to meet with elected officials and advocate for a variety of music-related causes. On Wednesday, the GRAMMYs on the Hill Awards celebrated Travis, along with U.S. Representatives Linda Sánchez (D-CA) and Ron Estes (R-KS), for their dedication and advocacy for the rights of music creators.

Introduced by Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC) and Amy Klobuchar (D-MN) and Representatives María Elvira Salazar (R-FL-27), Madeleine Dean (D-PA-4), Nathaniel Moran (R-TX-1) and Becca Balint (D-VT-At Large), the NO FAKES Act has also found new supporters in an unlikely place: the tech industry. The bill is now backed by tech giants like YouTube, OpenAI, IBM and Adobe, a rare moment of solidarity between artists and big tech in the AI age.

The NO FAKES Act was first circulated as a draft bill in 2023 and formally introduced in the Senate in the summer of 2024. If passed, the legislation would for the first time create a federal intellectual property protection for the so-called right of publicity, which restricts how someone’s name, image, likeness and voice can be used without consent. Currently, these rights are protected only at the state level, leading to a patchwork of varying laws around the nation.

Unlike some of the patchwork state publicity rights laws, the federal right the NO FAKES Act would create would not expire at death and could be controlled by a person’s heirs for 70 years after their passing. There are, however, specific carve-outs for replicas used in news, parody, historical works and criticism to ensure that the First Amendment right to free speech remains protected.

Over the last few years, as AI voice models have continued to develop, many artists have found themselves on the receiving end of AI deepfakes. In 2023, the AI music craze kicked off with the so-called “fake Drake” song “Heart on My Sleeve,” which featured unauthorized AI imitations of the voices of Drake and the Weeknd. Last year, Taylor Swift was the subject of a number of sexually explicit AI deepfakes, and the late Tupac Shakur’s voice was deepfaked by Drake in his Kendrick Lamar diss track “Taylor Made Freestyle,” which was posted, and then deleted, on social media.

Even President Donald Trump participated in the deepfake trend, posting an unauthorized AI-generated image of Swift appearing to endorse him during his campaign to return to the White House.

“Recently, I was made aware that [an] AI [image] of ‘me’ falsely endorsing Donald Trump’s presidential run was posted to his site. It really conjured up my fears around AI, and the dangers of spreading misinformation,” Swift wrote in an Instagram post soon after. “It brought me to the conclusion that I need to be very transparent about my actual plans for this election as a voter. The simplest way to combat misinformation is with the truth.”

Overall, the bill has seen widespread support across the entertainment industry establishment. According to a press release about the bill’s reintroduction, it is backed by Sony Music, Warner Music Group, Universal Music Group, the Recording Industry Association of America, the Recording Academy, SAG-AFTRA, the Human Artistry Campaign, the Motion Picture Association and more.

Mitch Glazier, chairman and CEO of the RIAA, praised the bipartisan effort, saying “this bill proves that we can prioritize the growth of AI and protecting American creativity at the same time.”

Harvey Mason jr., CEO of the Recording Academy, added: “The Academy is proud to represent and serve creators, and for decades, GRAMMYs on the Hill has brought music makers to our nation’s capital to elevate the policy issues affecting our industry. Today’s reintroduction of the NO FAKES Act underscores our members’ commitment to advocating for the music community, and as we enter a new era of technology, we must create guardrails around AI and ensure it enhances – not replaces – human creativity.”

Dennis Kooker, president of global digital business at Sony Music Entertainment, represented the music business at Sen. Chuck Schumer’s (D-NY) seventh artificial intelligence insight forum in Washington, D.C. on Wednesday (Nov. 29). In his statement, Kooker implored the government to act on new legislation to protect copyright holders to ensure the development of “responsible and ethical generative AI.”

The executive revealed that Sony has already sent “close to 10,000 takedowns to a variety of platforms hosting unauthorized deepfakes that SME artists asked us to take down.” He said these platforms, including streamers and social media sites, are “quick to point to the loopholes in the law as an excuse to drag their feet or to not take the deepfakes down when requested.”

Presently, there is no federal law that explicitly requires platforms to take down songs that impersonate an artist’s voice. Platforms are only obligated to do this when a copyright (in a sound recording or a musical work) is infringed, as stipulated by the Digital Millennium Copyright Act (DMCA). Interest in using AI to clone the voices of famous artists has grown rapidly since a song with AI impersonations of Drake and the Weeknd went viral earlier this year. The track, called “Heart on My Sleeve,” became one of the most prominent examples of music-related AI.

A celebrity’s voice and likeness can be protected by “right of publicity” laws that safeguard it from unauthorized exploitation, but this right is limited. Its protections vary state-to-state and are even more limited post-mortem. In May, Billboard reported that the major labels — Sony, Universal Music Group and Warner Music Group — had been in talks with Spotify, Apple Music and Amazon Music to create a voluntary system for takedowns of right of publicity violations, much like the one laid out by the DMCA, according to sources at all three majors. It is unclear from Kooker’s remarks if the platforms that are dragging their feet on voice clone removals include the three streaming services that previously took part in these discussions.

In his statement, Kooker asked the Senate forum to create a federal right of publicity to establish stronger and more uniform protection for artists. “Creators and consumers need a clear unified right that sets a floor across all fifty states,” he said. This echoes what UMG general counsel/executive vp of business and legal affairs Jeffery Harleston asked of the Senate during a July AI hearing.

Kooker expressed his “sincere gratitude” to Sens. Chris Coons, Marsha Blackburn, Amy Klobuchar and Thom Tillis for releasing a draft bill called the NO FAKES (“Nurture Originals, Foster Art, and Keep Entertainment Safe”) Act in October, which would create a federal property right for one’s voice or likeness and protect against unauthorized AI impersonations. At its announcement, the NO FAKES Act drew resounding praise from music business organizations, including the RIAA and the American Association of Independent Music.

Kooker also stated that in this early stage many available generative AI products today are “not expanding the business model or enhancing human creativity.” He pointed to a “deluge of 100,000 new recordings delivered to [digital service providers] every day” and said that some of these songs are “generated using generative AI content creation tools.” He added, “These works flood the current music ecosystem and compete directly with human artists…. They reduce and diminish the earnings of human artists.”

“We have every reason to believe that various elements of AI will become routine in the creative process… [as well as] other aspects of our business,” such as marketing and royalty accounting, Kooker continued. He said Sony Music has already started “active conversations” with “roughly 200” different AI companies about potential partnerships.

Still, he stressed five key issues remain that need to be addressed to “assure a thriving marketplace for AI and music.” Read his five points, as written in his prepared statement, below:

1. Assure Consent, Compensation, and Credit. New products and businesses built with music must be developed with the consent of the owner and appropriate compensation and credit. It is essential to understand why the training of AI models is being done, what products will be developed as a result, and what the business model is that will monetize the use of the artist’s work. Congress and the agencies should assure that creators’ rights are recognized and respected.

2. Confirm That Copying Music to Train AI Models is Not Fair Use. Even worse are those that argue that copyrighted content should automatically be considered fair use so that protected works are never compensated for usage and creators have no say in the products or business models that are developed around them and their work. Congress should assure and agencies should presume that reproducing music to train AI models, in itself, is not a fair use.

3. Prevent the Cloning of Artists’ Voices and Likenesses Without Express Permission. We cannot allow an artist’s voice or likeness to be cloned for use without the express permission of the artist. This is a very personal decision for the artist. Congress should pass into law effective federal protections for name, image, and likeness.

4. Incentivize Accurate Record-Keeping. Correct attribution will be a critical element to artists being paid fairly and correctly for new works that are created. In addition, rights can only be enforced around the training of AI when there are accurate records about what is being copied. Otherwise, the inability to enforce rights in the AI marketplace equates to a lack of rights at all, producing a dangerous imbalance that prevents a thriving ecosystem. This requires strong and accurate record keeping by the generative AI platforms, a requirement that urgently needs legislative support to ensure incentives are in place so that it happens consistently and correctly.

5. Assure Transparency for Consumers and Artists. Transparency is necessary to clearly distinguish human-created works from AI-created works. The public should know, when they are listening to music, whether that music was created by a human being or a machine.