After Fake Drake Debacle, Expect More AI Songs. But Are They Legal? 

April 18, 2023


A song featuring AI-generated fake vocals from Drake and The Weeknd might be a scary moment for artists and labels whose livelihoods feel threatened, but does it violate the law? It’s a complicated question.

The song “Heart on My Sleeve,” which also featured Metro Boomin’s distinctive producer tag, racked up hundreds of thousands of spins on streaming services before it was pulled down on Monday evening, powered to viral status by uncannily similar vocals over a catchy instrumental track. Millions more have viewed shorter snippets of the song that the anonymous creator posted to TikTok.

It’s unclear whether only the soundalike vocals were created with AI tools – a trick used for years in internet parody videos and deepfakes – or whether the entire song was generated by a machine from nothing more than a prompt to create a Drake track, a more novel and potentially disruptive development.

For an industry already on edge about the rapid growth of artificial intelligence, a song that convincingly replicated the work of two of music’s biggest stars and one of its top producers, and reached likely millions of listeners, has set off serious alarm bells.

“The ability to create a new work this realistic and specific is disconcerting, and could pose a range of threats and challenges to rights owners, musicians, and the businesses that invest in them,” says Jonathan Faber, the founder of Luminary Group and an attorney who specializes in protecting the likeness rights of famous individuals. “I say that without attempting to get into even thornier problems, which likely also exist as this technology demonstrates what it may be capable of.”

“Heart on My Sleeve” disappeared from most streaming services by Monday evening. Representatives for Drake, The Weeknd and Spotify all declined to comment when asked about the song on Monday. And while the artists’ label, Universal Music Group, issued a strongly worded statement condemning “infringing content created with generative AI,” a spokesperson would not say whether the company had sent formal takedown requests over the song.

A rep for YouTube said on Tuesday that the platform “removed the video in question after receiving a valid takedown notice,” noting that the track was removed because it used a copyrighted music sample.

The debacle highlights a monumental legal question for the music industry, one likely to sit at the center of legal battles for years to come: To what extent do AI-generated songs violate the law? Though “Heart on My Sleeve” was removed relatively quickly, the question is more complicated than it might seem.

For starters, the song appears to be an original composition that doesn’t directly copy any of Drake or The Weeknd’s songs, meaning it could be hard to claim that it infringes their copyrights the way a track does when an artist uses elements of someone else’s song without permission. While Metro Boomin’s tag may have been illegally sampled, that element likely won’t exist in future fake songs.

By mimicking their voices, however, the track represents a clearer potential violation of Drake and The Weeknd’s so-called right of publicity – the legal right to control how your individual identity is commercially exploited by others. Such rights are more typically invoked when someone’s name or visual likeness is stolen, but they can extend to someone’s voice if it’s particularly well-known – think Morgan Freeman or James Earl Jones.

“The right of publicity provides recourse for rights owners who would otherwise be very vulnerable to technology like this,” Faber said. “It fits here because a song is convincingly identifiable as Drake and the Weeknd.”

Whether a right of publicity lawsuit is legally viable against this kind of voice mimicry might be tested in court soon, albeit in a case dealing with decidedly more old-school tech.

Back in January, Rick Astley sued Yung Gravy over the rapper’s breakout 2022 hit that heavily borrowed from the singer’s iconic “Never Gonna Give You Up.” While Yung Gravy had licensed the underlying composition, Astley claimed Yung Gravy violated his right of publicity when he hired a singer who mimicked his distinctive voice.

That case has key differences from the situation with “Heart on My Sleeve,” like the allegation that Gravy falsely suggested to his listeners that Astley had actually endorsed his song. In the case of “Heart on My Sleeve,” the anonymous creator Ghostwriter omitted any reference to Drake and The Weeknd on streaming platforms; on TikTok, he directly stated that he, and not the two superstars, had created his song using AI.

But for Richard Busch of the law firm King & Ballow, a veteran music industry litigator who brought the lawsuit on behalf of Astley, the right of publicity and its protections for likeness still provide the most useful tool for artists and labels confronted with such a scenario in the future.

“If you are creating a song that sounds identical to, let’s say, Rihanna, regardless of what you say people are going to believe that it was Rihanna. I think there’s no way to get around that,” Busch said. “The strongest claim here would be the use of likeness.”

But do AI companies themselves break the law when they create programs that can so effectively mimic Drake and The Weeknd’s voices? That would seem to be the far larger looming crisis, and one without the same kind of relatively clear legal answers.

The fight ahead will likely be over how AI platforms are “trained” – the process whereby machines “learn” to spit out new creations by ingesting millions of existing works. From the point of view of many in the music industry, if that process is accomplished by feeding a platform copyrighted songs — in this case, presumably, recordings by Drake and The Weeknd — then those platforms and their owners are infringing copyrights on a mass scale.

In UMG’s statement Monday, the label said clearly that it believes such training to be a “violation of copyright law,” and the company previously warned that it “will not hesitate to take steps to protect our rights and those of our artists.” The RIAA has said the same, blasting AI companies for making “unauthorized copies of our members’ works” to train their machines.

While the training issue is legally novel and unresolved, it could be answered in court soon. A group of visual artists has filed a class action over the use of their copyrighted images to train AI platforms, and Getty Images has filed a similar case against AI companies that allegedly “scraped” its database for training materials. 

And after this week’s incident over “Heart on My Sleeve,” a similar lawsuit against AI platforms filed by artists or music companies gets more likely by the day.
