Over the past week, the feud between Kendrick Lamar and Drake has entered a more modern realm than any rap beef before it: AI.
As the back and forth has escalated, and fans wait to see what each of the hip-hop heavyweights will say next, a number of fan-fabricated diss tracks have begun circulating on social media, using AI voices to mimic the emcees. While some were obviously not real — and, thankfully, were voluntarily labeled as AI by their authors — others were more convincing, leading to widespread confusion.

People questioned whether Drake’s “Push Ups” was real (it was) and whether Lamar’s supposed reply, “1 Shot 1 Kill,” was real, too (it wasn’t). YouTube is rife with more AI replications, and some are drawing big audiences, including one called “To Kill A Butterfly,” which has amassed 508,000 views to date. To make matters even more convoluted, Drake himself took part in the trend, employing AI to replicate, without their permission, the voices of West Coast legends Tupac Shakur and Snoop Dogg on his diss “Taylor Made,” released on X and Instagram on Friday, prompting Shakur’s estate to send him a cease-and-desist letter.

The phenomenon has illustrated the sizable impact that AI has already had on modern fandom, as impatient fans use generative AI tools to fill in gaps in the conversation and imagine further storylines with a type of uncanny accuracy that was never before possible. And for better or for worse, it has become the most prominent use case of generative AI in the music industry to date.

This trend in AI use has its origins with Ghostwriter, the controversial TikTok user who deepfaked Drake and The Weeknd’s voices on his song “Heart On My Sleeve” one year ago, in April 2023. In a cover story for Billboard, Ghostwriter and his manager first compared AI voice filters to a form of “fan fiction — a fan-generated genre of music,” as the manager put it.

Traditional, written fan fiction has been a way for fans to engage with their favorite media for decades — whether that’s franchises like Star Wars, Marvel or Twilight, or the music of stars like Drake and Lamar. In it, fans can expand on details that were never fully fleshed out in the original work and write their own storylines and endings. AI fan creations inspired by Drake and Lamar’s beef are doing something similar, letting music fans imagine each artist’s next move and picture collaborations that haven’t happened yet.

Historically, fan fiction is great for the original artist from a marketing point of view. It is one of many forms of user-generated content (UGC) on the internet today that can engage superfans further with the original project without its author having to lift a finger.

But with traditional fan fiction, fans could easily tell where the official canon started and ended, and the writing was often relegated to superfan hot spots like Wattpad, Discord, Reddit or fan zines. This new form of “AI fan fiction” makes that distinction a lot less obvious and spreads the work much wider. For now, trained ears can still tell when AI voices are used like this, given the slight glitchiness in the audio quality, but soon these models will be so good that discerning AI from reality will be virtually impossible.

There is still no reliable way to confidently determine which songs use AI and which do not, and to make matters worse, these fan-made songs are more commonly posted to general social media platforms than written fan fiction ever was. Searching for this rap beef on X or YouTube, listeners are likely to run into a few AI fan tracks along the way, and many lack the expertise of a superfan to sniff out what’s real and what’s fake.

In a time when fans demand nonstop connection to and content from their favorite talents, it is especially common for fans of elusive artists to take matters into their own hands with AI tools — including voices as well as other generative works like images, videos and text. In the absence of a Kendrick response to Drake last week, for example, “1 Shot 1 Kill” was produced by a 23-year-old fan who goes by Sy The Rapper. In an interview with Complex, Sy said he used the tool Voicify to imagine Lamar on the track. (Notably, the RIAA recently reported Voicify to the U.S. government’s piracy watch list.)

Followers of famously elusive artist Frank Ocean also had fun with generative AI in the last year, with one fan, @tannerchauct, showing others on X how to create their own alternative versions of Ocean’s album artwork using DALL-E 2, an image generator. A Cardi B fan, @iYagamiLight, even dreamed up the creative direction for an entirely fictional Cardi B project with AI, earning them thousands of retweets in October. The user’s cover art rendered Cardi B in a bedazzled corset, posing in a clawfoot bathtub with peacock feathers fanning out around her. They also created a fake tracklist and release date.

The downside of fan-made works has always been the same: they have the tendency to infringe on the artist’s copyrights, to use an artist’s name, image, voice or likeness without permission, or to generally profit from the artist’s work without sharing the spoils. This new age of AI fan fiction and UGC makes all of these pre-existing problems exponentially harder to police.

The Cardi B fan, for example, did not disclose that their work was AI-generated or fictional, and instead paired their creative direction with the misleading caption “Cardi B just announced her long awaited sophomore album ‘Mayura’ coming out Friday 12th January 2024!”

At a recent music law conference at Vanderbilt University, Colin Rushing, general counsel of the Digital Media Association (DiMA), downplayed the commercial impact of AI in music so far, saying that, since Ghostwriter, “one of the things we really haven’t seen in the [last] year is an epidemic of ‘fake-Drakes’ climbing the charts. We’re not seeing popular examples of this in the commercial marketplace.”

Rushing is right — that hasn’t happened yet. Even Drake’s own AI-assisted song is not on streaming services, and thus is not eligible for the charts. (And if the lawyer for Tupac’s estate has his way, it will soon be removed from the internet entirely.) But this rap feud has revealed that while AI hasn’t impacted the charts or the “commercial marketplace” all that much, it has impacted something possibly even more important to an artist today: fandoms.

Only one word really describes Drake’s shift from objecting to an AI impersonation of him to using similar technology to add imitations of 2Pac and Snoop Dogg to his Kendrick Lamar diss track “Taylor Made Freestyle”: Chutzpah. (Drake had a bar-mitzvah-themed 31st birthday party, so he probably knows the term.) Last year around this time, the infamous “Heart on My Sleeve,” which featured AI vocal imitations of Drake and The Weeknd, shifted the debate about music and AI into high gear. Ever since, industry lobbyists and artists’ rights groups have been pushing legislation to regulate generative AI technology based on concepts of rights and permissions. Now Drake goes and blatantly breaks the main principle involved. It’s like something out of a political attack ad: He was against this before he was for it!
To me, using artists’ voices without their permission is wrong, and it’s even more wrong — creepier — if the artist in question died relatively recently. The legal situation around this, and AI in general, is in flux, though. Tennessee’s ELVIS Act just passed, and a few federal bills have significant support. But the main point of the ELVIS Act and most of the recently proposed legislation is to impose penalties for exactly the kind of thing Drake did.

And Drake, who must know these laws are necessary because a year ago they would have helped him, just made it harder to pass them. Imagine you’re a music business lobbyist who spent the last year explaining to members of Congress how important it is to protect the unique sounds of particular performers, and then suddenly one of the biggest artists in the world goes ahead and violates every principle you’ve been discussing. Forget about Lamar — where’s the diss track from the RIAA?

It’s hard to say for sure whether what Drake did was illegal because laws vary by state — that’s why we need federal legislation in the first place. But Drake seems to have released the recording without the involvement of his label, Republic Records, a subsidiary of Universal Music Group, which could indicate some concerns. (A representative for Drake declined to comment and Universal did not respond to requests for comment.) And Tupac Shakur’s estate has threatened to sue if Drake doesn’t take the track offline. (Snoop Dogg’s reaction — “They did what? When? How? Are you sure?” followed by a weary sigh — is a work of art in itself. 10/10, no notes.) Litigation could be complicated, though. The Shakur estate threatened to sue for a violation of Shakur’s right of publicity, as well as for copyright infringement, which may be harder to prove but comes with high statutory damages.

Howard King, the lawyer for Shakur’s estate, lays out the issue in his cease-and-desist letter to Drake. “Not only is the record a flagrant violation of Tupac’s publicity and the estate’s legal rights,” King writes, “it is also a blatant abuse of the legacy of one of the greatest hip-hop artists of all time. The Estate would never have given its approval for this use.” The use of 2Pac’s voice was especially inappropriate, King suggests, since Lamar is “a good friend to the Estate who has given nothing but respect to Tupac.”

In music critic terms, Drake is using simulacra of 2Pac and Snoop to call out Lamar by implying that he’s unworthy of their legacy. In legal terms, this might violate Shakur’s and Snoop Dogg’s rights of publicity, also known as likeness rights, and there are precedents that would suggest it does — Tom Waits and Bette Midler each won a case about (human) vocal imitation. In moral terms, this feels so wrong because it forces Shakur and Snoop to say something they would never have said in real life. In hip-hop, reputation is everything — you own your words in both senses of the term — and Snoop and Shakur have every right to guard theirs.

This might seem like an awfully pretentious way to talk about what will almost certainly be remembered as a minor track from a major artist. Are reputations really at stake? Doesn’t anyone with even the slightest interest in pop music know that Drake used AI?

That’s a very current way of thinking about a technology that’s evolving really fast. What happens when millions of hobbyist producers release thousands of songs with imitations of hundreds of artists? (There are fan-made AI tracks out there already.) Who’s to know who dissed whom, let alone who favors what politician or endorses which product? For that matter, what happens when this comes for politicians? You can’t regulate digital technology with the legal equivalent of an umbrella — you need to prepare for a flood.

The ELVIS Act and the EU AI legislation represent a good start for that preparation, and most of the federal legislation under discussion seems solid. Hopefully, by the time the flood hits, we’ll remember “Heart on My Sleeve” as the beginning of an important debate and “Taylor Made Freestyle” as an amusing aside.

Tupac Shakur’s estate is threatening to sue Drake over a recent diss track against Kendrick Lamar that featured an AI-generated version of the late rapper’s voice, calling it “a flagrant violation” of the law and a “blatant abuse” of his legacy.
In a Wednesday cease-and-desist letter obtained exclusively by Billboard, litigator Howard King told Drake (Aubrey Drake Graham) that he must confirm that he will pull down his “Taylor Made Freestyle” in less than 24 hours or the estate would “pursue all of its legal remedies” against him.

“The Estate is deeply dismayed and disappointed by your unauthorized use of Tupac’s voice and personality,” King wrote in the letter. “Not only is the record a flagrant violation of Tupac’s publicity and the estate’s legal rights, it is also a blatant abuse of the legacy of one of the greatest hip-hop artists of all time. The Estate would never have given its approval for this use.”

Drake released “Taylor Made” on Friday, marking the latest chapter in a back-and-forth war of words between the Canadian rapper and Lamar. Beyond taking shots at both Kendrick and Taylor Swift, the track made headlines because of its prominent use of artificial intelligence technology to create fake verses from Tupac and Snoop Dogg – two West Coast legends idolized by the LA-based Lamar.

“Kendrick, we need ya, the West Coast savior/ Engraving your name in some hip-hop history,” the AI-generated Tupac raps in Drake’s song. “If you deal with this viciously/ You seem a little nervous about all the publicity.”

In Wednesday’s letter, Tupac’s estate warned Drake that the use of his voice clearly violated Tupac’s so-called publicity rights – the legal power to control how your image or likeness is used by others. And the estate took particular exception to the use of his voice to take shots at Lamar.

“The unauthorized, equally dismaying use of Tupac’s voice against Kendrick Lamar, a good friend to the Estate who has given nothing but respect to Tupac and his legacy publicly and privately, compounds the insult,” King wrote.

A rep for Drake declined to comment on the demands of the Shakur estate.

It’s unclear if Snoop Dogg, whose voice was also featured on “Taylor Made,” is planning to raise similar legal objections to Drake’s track. On Saturday, he posted a video to social media in which he seemed to be learning of the song for the first time: “They did what? When? How? Are you sure?” A rep for Snoop Dogg did not return a request for comment.

The unauthorized use of voice cloning technology has become one of the music industry’s thorniest legal subjects, as AI-powered tools have made it easier than ever to convincingly mimic real artists.

The issue exploded onto the scene last year, when an unknown artist named Ghostwriter released a track called “Heart On My Sleeve” that featured – ironically – fake verses in Drake’s voice. Since then, as voice cloning has proliferated on the internet, industry groups, legal experts and lawmakers have wrangled over how best to crack down on it.

It’s not as simple as it might seem. Federal copyrights are difficult to directly apply, since cloned vocals usually feature new words and music that are distinct from existing copyrighted songs. The publicity rights cited by the estate are a better fit because they protect someone’s likeness itself, but they have historically been used to sue over advertisements, rather than over creative works like songs.

Faced with that legal uncertainty, the recording industry and top artists have pushed for new legislation to address the problem. Last month, Tennessee passed a statute called the ELVIS Act that aims to crack down on voice cloning by expanding the state’s publicity rights laws beyond just advertisements. Lawmakers in Washington, D.C., are also considering similar bills that would create new, broader publicity rights at a federal level.

In Wednesday’s letter, however, the estate said that California’s existing publicity rights laws clearly outlaw something as blatant as Drake’s use of Tupac’s voice in “Taylor Made.” King argued that the song had caused “substantial economic and reputational harm” by creating the “false impression that the estate and Tupac promote or endorse the lyrics for the sound-alike.”

The estate also argued that the song was likely created using an AI model that violated the estate’s copyrights by “training” on existing recordings of Tupac’s music. The legality of using copyrighted “inputs” is another difficult legal issue that’s currently being tested in several closely-watched lawsuits against AI developers, including one filed by major music publishers.

“It is hard to believe that [Tupac’s record label]’s intellectual property was not scraped to create the fake Tupac AI on the Record,” King wrote, before demanding that Drake also provide “a detailed explanation for how the sound-alike was created and the persons or company that created it, including all recordings and other data ‘scraped’ or used.”

Wednesday’s letter also pointedly highlighted that Drake himself has made previous objections to the use of his own likeness by others. In addition to last year’s incident surrounding “Heart on My Sleeve” — which was quickly pulled down from the internet — King pointed to a lesser-known federal lawsuit in which Drake’s attorneys accused a website of using his image without authorization.

“The [“Taylor Made Freestyle”] has generated well more than one million streams at this point and has been widely reported in the general national press and popular entertainment websites and publications,” the estate wrote. “Without question, it is exponentially more serious and damaging than a picture of you with some other people on a low volume website.”

In its closing paragraphs, the letter demanded written confirmation by noon Pacific on Thursday that Drake’s representatives were “expeditiously taking all steps necessary to have it removed.”

“If you comply, the estate will consider whether an informal negotiation to resolve this matter makes sense,” King wrote. “If you do not comply, our client has authorized this firm to pursue all of its legal remedies including, but not limited to, an action for violation of … the estate’s copyright, publicity and personality rights and the resulting damages, injunctive relief, and punitive damages and attorneys’ fees.”

Amazon Music has announced a new AI-powered playlist feature that allows users to turn text prompts into entire playlists. Called Maestro, the offering is still in beta and available only to a small number of Amazon Music users on all tiers in the United States on iOS and Android. It can be found on the […]

Artificial intelligence and user-generated content music tool company Mayk has announced the launch of its latest product, popstarz.ai. With the promise of helping anyone playfully assume the identity of a popstar and sing their favorite song, the company hopes to revolutionize karaoke and strengthen the artist-fan relationship. […]

SAG-AFTRA, the union representing roughly 160,000 actors, dancers, singers, recording artists and other media professionals, and all three major music companies reached a tentative multiyear agreement last week that includes guardrails for the use of artificial intelligence technology across the industry.
A successor to the SAG-AFTRA National Code of Fair Practice for Sound Recordings, the new deal received unanimous approval from the guild’s executive committee and, if ratified by member vote, will cover the period beginning Jan. 1, 2021 through Dec. 31, 2026. Participating labels include Sony Music Entertainment, Universal Music Group and Warner Music Group, as well as Disney Music Group.

The AI guidelines require that the use of terms such as “artist,” “singer” and “royalty artist” only refer to actual humans, plus the deal calls for clear consent, minimum compensation and other stipulations prior to the release of a sound recording using a digital replication of a real artist’s voice.

The tentative contract also includes increased minimums, health and retirement improvements, and an increase in the percentage of streaming revenue to be covered by contributions.

“This agreement ensures that our members are protected,” said Duncan Crabtree-Ireland, SAG-AFTRA national executive director. “SAG-AFTRA stands firm in the belief that while technology can enhance the creative process, the essence of music must always be rooted in genuine human expression and experience. We look forward to working alongside our industry partners to foster an environment where innovation serves to elevate, not diminish, the unique value of each artist’s contribution to our rich cultural tapestry.”

The Record Label Negotiating Committee said, “Together, we’ll chart a successful course forward, embracing new opportunities and facing our common challenges, strengthened by our shared values and commitment to human artistry.”

A new law in Tennessee aimed at protecting artists from AI-powered voice mimicry has won widespread acclaim from the music industry, but some legal experts are worried such laws might be an “overreaction” that could have unintended consequences.  
Less than a year after a fake Drake song created using new artificial intelligence tools took the music world by storm, Tennessee lawmakers enacted first-in-the-nation legislation last month aimed at preventing exactly that scenario — the use of a person’s voice without their permission. The ELVIS Act (Ensuring Likeness, Voice and Image Security Act) does that by expanding the state’s protections against the unauthorized use of a person’s likeness, known as publicity rights.

The passage of the new law was hailed across the music business. Mitch Glazier of the Recording Industry Association of America called it an “incredible result.” Harvey Mason Jr. of the Recording Academy described it as a “groundbreaking achievement.” David Israelite of the National Music Publishers’ Association called it “an important step forward.” Any musical artist who has had their voice used without permission likely shares those sentiments.  

But legal experts are more divided. Jennifer Rothman, a law professor at the University of Pennsylvania and one of the country’s top experts on publicity rights, rang alarm bells last week at a panel discussion in Nashville, warning that Tennessee’s new statute had not been necessary and had been “rushed” into law.  

“We don’t want a momentary overreaction to lead to the passage of laws that would make things worse, which is currently what is happening,” Rothman told her fellow panel members and the audience. “The ELVIS Act has a number of significant concerns that are raised, particularly with the broad sweep of liability and restrictions on speech.”  

In an effort to combat AI voice cloning, the ELVIS Act makes a number of key changes to the law. Most directly, it expands the state’s existing publicity rights protections to explicitly include someone’s voice as part of their likeness. But it also expands the law in ways that have received less attention, including a broader definition of who can be sued and for what.

According to Joseph Fishman, a law professor at Vanderbilt University who has been closely tracking the legislation, that broader wording “sweeps in innocuous behavior that no one seriously thinks is a problem that needs solving” — potentially including tribute bands, interpolations, or even just sharing a photo that a celebrity didn’t authorize. 

“The range of acts that trigger liability is vast,” Fishman tells Billboard. “All the press around this law is focused on deepfakes and digital replicas — and those would indeed be covered — but the law as written goes so much further.”  

Here’s why: Historically, publicity rights in the U.S. have been mostly limited to commercial contexts — like advertisements that use a celebrity’s likeness to make it appear they’re endorsing a product. The singer Bette Midler once famously sued the Ford Motor Co. over a series of commercials featuring vocals by a Midler impersonator.

The new law effectively gets rid of that commercial limitation; under the ELVIS Act, anyone who knowingly “makes available” someone’s likeness without authorization can face a lawsuit. It also broadly defines protected voices as any sound that’s “readily identifiable and attributable to a particular individual.”

Those are great changes if you’re a musical artist trying to sue over a song that’s using a fake version of your voice, since the old conception of publicity rights likely wouldn’t apply to that scenario. But Fishman says they have serious potential for collateral damage beyond their intended target.  

“There’s nothing that would limit it to AI outputs, nothing that would limit it to deceptive uses,” Fishman said. “The lead singer in an Elvis tribute band who sings convincingly like The King certainly seems to fall under the definition. So do Elvis impersonators.”  

In an “even more extreme” hypothetical, Fishman imagined an “unflattering” photo of Elvis that he knew the Presley estate didn’t like. “The law seems to say I’d be liable if I sent that photo to a friend. After all, I’m transmitting his likeness, knowing that the rightsholder hasn’t authorized the use. Stop and think about that for a moment.”

The ELVIS Act does contain exemptions aimed at protecting free speech, including those that allow for the legal use of someone’s likeness in news coverage, criticism, scholarship, parody and other “fair use” contexts. It also expressly allows for “audiovisual works” that contain “a representation of the individual as the individual’s self” — a provision likely aimed at allowing Hollywood to keep making biopics and other films about real people without getting sued in Tennessee.

But confusingly, the law says those exemptions only apply “to the extent such use is protected by the First Amendment.” That wording, according to Rothman, means those exemptions essentially “don’t exist” unless and until a court rules that a specific alleged activity is a form of protected free speech, a costly extra step that will mostly benefit those who want to be in court. “This specific law creates great work for lawyers,” Rothman said. “So much work for lawyers.”  

Those lawyers are going to be filing real lawsuits against real people — some of whom are the scary, voice-cloning bad actors that the music industry wants to crack down on, but also some of whom are likely just regular people doing things that used to be legal.

“The law could absolutely lead to lots of lawsuits,” Fishman says. “There’s plenty of room here for people to test how far the statute can go, whether because they object to how they’re being depicted or because they see an opportunity for an extra licensing stream.”  

Though it only applies to Tennessee, the importance of the ELVIS Act is magnified because it is the first of likely many such legislative efforts aimed at addressing AI mimicry. At least five other states are currently considering amending their publicity rights laws to address the growing problem, and lawmakers on Capitol Hill are also weighing federal legislation that would create a national likeness statute for the first time.  

At last week’s roundtable, Rothman said those efforts were misguided. She said that laws already on the books — including federal trademark law, existing publicity rights laws, and numerous other statutes and torts — already provide avenues to stop voice cloning and deepfakes. And she warned that the proposed federal bills posed even more serious problems, like allowing someone to sign away their likeness rights in perpetuity.

For other legal experts critical of the ELVIS Act, including Harvard University law professor Rebecca Tushnet, the hope is that any subsequent legislation, whether at the state or federal level, can be more directly tailored to the actual AI-fueled deceptions they’re supposed to address. 

“Any new laws need to be far more targeted at specific harms,” says Tushnet, who has written extensively about the intersection of intellectual property and free speech. “Right now, this statute and other proposals are dramatically overbroad, and threaten legitimate creative conduct.” 

This is The Legal Beat, a weekly newsletter about music law from Billboard Pro, offering you a one-stop cheat sheet of big new cases, important rulings and all the fun stuff in between.
This week: Mary J. Blige’s 1992 “Real Love” draws a new copyright case over an oft-sampled funk song with a long history in both hip hop and music law; Madonna strikes back against angry fans who sued over delayed concerts; Morgan Wallen is charged with multiple felonies after allegedly throwing a chair from the roof of a Nashville bar; and much more.

THE BIG STORY: Sampling Saga

If you’ve listened to any significant amount of rap music over the past 30 years, you’ve probably heard “Impeach the President” by the Honey Drippers — a legendary piece of hip-hop source material with a drum track that’s been sampled or interpolated literally hundreds of times, including by Run-DMC, Biggie, Tupac, Dr. Dre and many others.

And, allegedly, by Mary J. Blige.

In a lawsuit filed last week, Tuff City Records claimed that Blige’s 1992 classic “Real Love,” which spent 31 weeks on the Hot 100, featured an unlicensed sample from “Impeach.” The case claims that Universal Music Publishing has “repeatedly refused” to pay for the underlying composition, even though UMG Recordings has already agreed to a deal covering the master.

The new lawsuit is the latest chapter in a story dating back several decades, starting with a seminal 1991 case over an LL Cool J song that also featured “Impeach” – a legal battle that would ultimately prove to be the beginning of fundamental changes to how the music industry and the courts treated sampling.

Other top stories this week…

MADONNA CONCERT CLASH – The Material Girl fired back at a class action lawsuit filed by New York City fans who are angry that her concerts started later than scheduled, asking for the case to be dismissed. Madonna’s attorneys argued that needing to “wake up early the next day for work” is not the kind of “cognizable injury” someone can sue over, and that “no Madonna fan” has a “reasonable expectation” that her shows will start on time.

LAST NIGHT (ALLEGEDLY) – Morgan Wallen was arrested in Nashville and charged with three felony counts of reckless endangerment over accusations that he threw a chair off the six-story roof of a popular bar on the city’s bustling Broadway street, allegedly narrowly missing several police officers. He was later released on bond, and his lawyer told Billboard he was “cooperating fully with authorities.”

RAMONES MOVIE LAWSUIT – Joey Ramone‘s brother (Mickey Leigh) responded to a lawsuit filed by Johnny Ramone’s widow (Linda Cummings-Ramone) over a planned Netflix movie about the pioneering punk band, calling the case “baseless and flimsy” and arguing that she actually signed off on such a project years ago.

AI COPYRIGHT DISCLOSURE BILL – Rep. Adam Schiff (D-Calif.) introduced new legislation in the U.S. House of Representatives that would require AI companies to disclose which copyrighted works were used to train their models, or face a financial penalty. The measure would not directly require payment to artists, but would certainly make it easier for copyright owners to file infringement cases against AI companies demanding such compensation.

NEW DIDDY ABUSE CASE – Sean “Diddy” Combs was hit with yet another sexual abuse case, this time centering on allegations that his son Christian “King” Combs assaulted a staffer on a luxury yacht in the Caribbean. The case, one of many against Diddy over the past six months, claimed that he “encouraged an environment of debauchery” that enabled his son’s behavior.

ACCUSER’S LAWYER CRITICIZED – Tyrone Blackburn, an attorney who has filed two of the pending sexual abuse cases against Combs, could be facing potential discipline himself. In a scathing ruling last week, a federal judge in an unrelated lawsuit referred him to the court’s grievance committee over his “pattern of behavior” in which he allegedly “improperly files cases in federal court to garner media attention, embarrass defendants with salacious allegations, and pressure defendants to settle quickly.”

ROD WAVE ARRESTED OVER SHOOTING – The rapper was arrested on gun charges in Florida over alleged connections to a shooting last month at a sports bar in St. Petersburg. At a press conference after the arrest, police claimed that the alleged assailants used a getaway car registered to Wave and fled to a house he had rented, where they later discovered two assault rifles and other evidence.

MORE BIZARRE DONDA CLAIMS – Kanye West was hit with another lawsuit filed by a former employee at his Donda Academy, this time accusing him of discriminating against Black staffers. Like the several previous cases from former staffers, the case included bizarre allegations about conditions inside the school – including that West told students to “shave their heads” and that he “intended to put a jail at the school” where students could be “locked in cages.”

Representative Adam Schiff (D-Calif.) introduced new legislation in the U.S. House of Representatives on Tuesday (April 9) that, if passed, would require AI companies to disclose which copyrighted works were used to train their models, or face a financial penalty. Called the Generative AI Copyright Disclosure Act, the new bill would apply both to new models and, retroactively, to previously released and used generative AI systems.
The bill requires that a full list of copyrighted works in an AI model’s training data set be filed with the Copyright Office no later than 30 days before the model becomes available to consumers. This would also be required when the training data set for an existing model is altered in a significant manner. Financial penalties for non-compliance would be determined on a case-by-case basis by the Copyright Office, based on factors like the company’s history of noncompliance and the company’s size.

Generative AI models are trained on up to trillions of existing works. In some cases, data sets, which can include anything from film scripts to news articles to music, are licensed from copyright owners, but often these models will scrape the internet for large swaths of content, some of which is copyrighted, without the consent or knowledge of the author. Many of the world’s largest AI companies have publicly defended this practice, calling it “fair use,” but many of those working in creative industries take the position that this is a form of widespread copyright infringement.

The debate has sparked a number of lawsuits between copyright owners and AI companies. In October, Universal Music Group, ABKCO, Concord Music Group, and other music publishers filed a lawsuit against AI giant Anthropic for “unlawfully” exploiting their copyrighted song lyrics to train AI models.

“In the process of building and operating AI models, Anthropic unlawfully copies and disseminates vast amounts of copyrighted works,” wrote lawyers for the music companies at the time. “Publishers embrace innovation and recognize the great promise of AI when used ethically and responsibly. But Anthropic violates these principles on a systematic and widespread basis.”

While many in the music business are also calling for compensation and the ability to opt in or out of being used in a data set, this bill focuses only on requiring transparency around copyrighted training data. Still, it has garnered support from many music industry groups, including the Recording Industry Association of America (RIAA), National Music Publishers’ Association (NMPA), ASCAP, Black Music Action Coalition (BMAC), and Human Artistry Campaign.

It is also supported by other creative industry groups, including the Professional Photographers of America, SAG-AFTRA, Writers Guild of America, International Alliance of Theatrical Stage Employees (IATSE) and more.

“AI has the disruptive potential of changing our economy, our political system, and our day-to-day lives,” said Rep. Schiff in a statement. “We must balance the immense potential of AI with the crucial need for ethical guidelines and protections. My Generative AI Copyright Disclosure Act is a pivotal step in this direction. It champions innovation while safeguarding the rights and contributions of creators, ensuring they are aware when their work contributes to AI training datasets. This is about respecting creativity in the age of AI and marrying technological progress with fairness.”

A number of rights groups also weighed in on the introduction of the bill.

“Any effective regulatory regime for AI must start with one of the most fundamental building blocks of effective enforcement of creators’ rights — comprehensive and transparent record keeping,” adds RIAA chief legal officer Ken Doroshow. “RIAA applauds Congressman Schiff for leading on this urgent and foundational issue.”

“We commend Congressman Schiff for his leadership on the Generative AI Copyright Disclosure Act,” NMPA president/CEO David Israelite said. “AI only works because it mines the work of millions of creators every day and it is essential that AI companies reveal exactly what works are training their data. This is a critical first step towards ensuring that AI companies fully license and that songwriters are fully compensated for the work being used to fuel these platforms.”

“Without transparency around the use of copyrighted works in training artificial intelligence, creators will never be fairly compensated and AI tech companies will continue stealing from songwriters,” ASCAP CEO Elizabeth Matthews said. “This bill is an important step toward ensuring that the law puts humans first, and we thank Congressman Schiff for his leadership.”

“Protecting the work of music creators is essential, and this all begins with transparency and tracking the use of copyrighted materials in generative AI,” Black Music Action Coalition (BMAC) co-chair Willie “Prophet” Stiggers said. “BMAC hopes Rep. Schiff’s Generative AI Copyright Disclosure Act helps garner support for this mission and that author and creator rights continue to be protected and preserved.”

“Congressman Schiff’s proposal is a big step forward towards responsible AI that partners with artists and creators instead of exploiting them,” Human Artistry Campaign senior advisor Dr. Moiya McTier said. “AI companies should stop hiding the ball when they copy creative works into AI systems and embrace clear rules of the road for recordkeeping that create a level and transparent playing field for the development and licensing of genuinely innovative applications and tools.”

Spotify has launched a new AI playlist feature for premium users in the United Kingdom and Australia, the company revealed in a blog post on Sunday (April 7). The new feature, which is still in beta, allows Spotify users in those markets to turn any concept into a playlist by using prompts like “an indie […]