Why AI Lawsuits May Have a Lot to Do With Andy Warhol, Prince & a 1981 Photograph
Written by djfrosty on July 19, 2024
The lawsuits filed by the major labels against the AI companies Suno and Udio could be the most important cases for the music business since the Supreme Court's Grokster decision, as I explained in last week's Follow the Money column. The outcomes are hard to predict, however, because the central issue will be "fair use," a U.S. legal doctrine shaped by judicial decisions that involves famously — sometimes notoriously — nuanced determinations about art and appropriation. And although most creators focus more on issues around generative AI "outputs" — music they'll have to compete with or songs that might sound similar to theirs — these cases involve the legality of copying music for the purposes of training AI.
Neither Suno nor Udio has said how they trained their AI programs, but both have essentially said that copying music in order to do so would qualify as fair use. Determining that could touch on the development of Google Books, the compatibility of the Android operating system, and even a Supreme Court case that involves Prince, Andy Warhol and Vanity Fair. It's the kind of fair use case that once inspired a judge to call copyright "the metaphysics of the law." So let's get metaphysical!
Fair use essentially provides exceptions to copyright, usually for the purpose of free expression, allowing for quotation (as in book or film reviews) and parody (to comment on art), among other things. (The iconic example in music is the Supreme Court case over 2 Live Crew's parody of Roy Orbison's "Oh, Pretty Woman.") These determinations involve a four-factor test that weighs "the purpose and character of the use"; "the nature of the copyrighted work"; how much and how important a part of the work is used; and the effect of the use upon the potential market for, or value of, the copyrighted work. Over the last decade or so, though, the concept of "transformative use," derived from the first factor, expanded in a way that allowed the development of Google Books (the copying of books to create a searchable database with excerpts) and the use of some Oracle API code in Google's Android system — which could arguably be said to go beyond the origins of the concept.
Could copying music for the purposes of machine learning qualify as well?
In a paper on the topic, "Fair Use in the U.S. Redux: Reformed or Still Deformed," the influential Columbia Law School professor Jane Ginsburg suggests that the transformative use argument might have reached the peak of its influence. (I am oversimplifying a very smart paper, and if you are interested in this topic, you should read it.)
The Supreme Court's decision in the Google-Oracle case involved part of a computer program, far from the creative "core" of copyright, and music recordings would presumably be judged differently. The Supreme Court also made a very different decision last year in a case that pitted the Andy Warhol Foundation for the Visual Arts against prominent rock photographer Lynn Goldsmith. The case involved an Andy Warhol silkscreen of Prince, based on a Goldsmith photograph that the magazine Vanity Fair had licensed for Warhol to use. Warhol used the photo for an entire series — which Goldsmith only found out about when the magazine used the silkscreen image again for a commemorative issue after Prince died.
On the surface, this seemed to cast the Supreme Court Justices as modern art critics, in a position to rule all appropriation art infringing. But the case wasn't about whether Warhol's silkscreen inherently infringed Goldsmith's copyright; it was about whether licensing it for use by a magazine, in a way that could compete with the original photo, infringed it. There was a limit to transformative use, after all. "The same copying," the court decided, "may be fair when used for one purpose but not another."
So it might constitute fair use for Google to copy entire books for the purpose of creating a searchable database about those books with excerpts from them, as it did for Google Books — but not necessarily for Suno or Udio to copy terabytes of recordings to spur the creation of new works to compete with them, especially if it results in similar works. In the first case, it's hard to find real economic harm — there will never be much of a market for licensing book databases — but there's already a nascent market for licensing music to train AI programs. And, unlike Google Books, the AI programs are designed to make music to compete with the recordings used to train them. Obviously, licensing music to train an AI program is what we might call a secondary use — but so is turning a book into a film, and no one doubts that filmmakers need permission for that.
All of this might make it sound like I think the major labels will win their cases, but that's a tough call — the truth is that I just don't think they'll lose. And there's a lot of space between victory and defeat here. If one of these cases ends up going to the Supreme Court — and if one of them doesn't, another case about AI training surely will within the next few years — the decision might be more limited than either side is looking for, since the court has tended to step lightly around technology issues.
It’s also possible that the decision could depend on whether the outputs that result from all of this training are similar enough to copyrighted works to qualify, or plausibly qualify, as infringing. Both label lawsuits are full of such examples, presumably because that could make a difference. These cases are about the legality of AI inputs, but a fair use determination on that issue could easily involve whether those inputs lead to infringing output.
In the end, Ginsburg suggests, “system designers may need to disable features that would allow users to create recognizable copies.” Except that — let’s face it — isn’t that really part of the fun? Sure, AI music creation might eventually grow to maturity as some kind of art form — it already has enormous practical value for songwriters — but for ordinary consumers it’s still hard to beat Frank Sinatra singing Lil Jon’s “Get Low.” Of course, that could put a significant burden on AI companies — with severe consequences for crossing a line that won’t always be obvious. It might be easier to just license the content they need. The next questions, which will be the subject of future columns, involve exactly what they need to license and how they might do that, since it won’t be easy to get all the rights they need — or in some cases even agree on who controls them.