The U.S. Senate Judiciary Committee convened on Tuesday (April 30) to discuss a proposed bill that would effectively create a federal publicity right for artists in a hearing that featured testimony from Warner Music Group CEO Robert Kyncl, artist FKA Twigs, Digital Media Association (DiMA) CEO Graham Davies, SAG-AFTRA national executive director/chief negotiator Duncan Crabtree-Ireland, Motion Picture Association senior vp/associate general counsel Ben Sheffner and University of San Diego law professor Lisa P. Ramsey.
The draft bill — called the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (NO FAKES Act) — would create a federal right for artists, actors and others to sue those who create “digital replicas” of their image, voice, or visual likeness without permission. Those individuals have previously only been protected through a patchwork of state “right of publicity” laws. First introduced in October, the NO FAKES Act is supported by a bipartisan group of U.S. senators including Sen. Chris Coons (D-Del.), Sen. Marsha Blackburn (R-Tenn.), Sen. Amy Klobuchar (D-Minn.) and Sen. Thom Tillis (R-N.C.).


Warner Music Group (WMG) supports the NO FAKES Act along with many other music businesses, the RIAA and the Human Artistry Campaign. During Kyncl’s testimony, the executive noted that “we are in a unique moment of time where we can still act and we can get it right before it gets out of hand,” pointing to how the government was not able to properly handle data privacy in the past. He added that it’s imperative to get out ahead of artificial intelligence (AI) to protect artists’ and entertainment companies’ livelihoods.

“When you have these deepfakes out there [on streaming platforms],” said Kyncl, “the artists are actually competing with themselves for revenue on streaming platforms because there’s a fixed amount of revenue within each of the streaming platforms. If somebody is uploading fake songs of FKA Twigs, for example, and those songs are eating into that revenue pool, then there is less left for her authentic songs. That’s the economic impact of it long term, and the volume of content that will then flow into the digital service providers will increase exponentially, [making it] harder for artists to be heard, and to actually reach lots of fans. Creativity over time will be stifled.”

Kyncl, who recently celebrated his first anniversary at the helm of WMG, previously held the role of chief business officer at YouTube. When questioned about whether platforms like YouTube, Spotify and the other services represented by DiMA should be held responsible for unauthorized AI fakes on their platforms, Kyncl had a measured take: “There has to be an opportunity for [the services] to cooperate and work together with all of us to [develop a protocol for removal],” he said.

During his testimony, Davies spoke from the perspective of the digital service providers (DSPs) DiMA represents. “There’s been no challenge [from platforms] in taking down the [deepfake] content expeditiously,” he said. “We don’t see our members needing any additional burdens or incentives here. But…if there is to be secondary liability, we would very much seek that to be a safe harbor for effective takedowns.”

Davies added, however, that the Digital Millennium Copyright Act (DMCA), which provides a notice and takedown procedure for copyright infringement, is not a perfect model to follow for right of publicity offenses. “We don’t see [that] as being a good process as [it was] designed for copyright…our members absolutely can work with the committee in terms of what we would think would be an effective [procedure],” said Davies. He added, “It’s really essential that we get specific information on how to identify the offending content so that it can be removed efficiently.”

There is currently no perfect solution for tracking AI deepfakes on the internet, making a takedown procedure tricky to implement. Kyncl said he hopes for a system that builds on the success of YouTube’s Content ID, which tracks sound recordings. “I’m hopeful we can take [a Content ID-like system] further and apply that to AI voice and degrees of similarity by using watermarks to label content and carry the provenance,” he said.

The NO FAKES draft bill as currently written would create a nationwide property right in one’s image, voice, or visual likeness, allowing an individual to sue anyone who produced a “newly-created, computer-generated, electronic representation” of it. It also includes publicity rights that would not expire at death and could be controlled by a person’s heirs for 70 years after their passing. Most state right of publicity laws were written long before the invention of AI and often limit or exclude the protection of an individual’s name, image and voice after death.

The proposed 70 years of post-mortem protection was one of the major points of disagreement between participants at the hearing. Kyncl agreed with the points made by Crabtree-Ireland of SAG-AFTRA — the actors’ union that recently came to a tentative agreement with major labels, including WMG, for “ethical” AI use — whose view was that the right should not be limited to 70 years post-mortem and should instead be “perpetual,” in his words.

“Every single one of us is unique, there is no one else like us, and there never will be,” said Crabtree-Ireland. “This is not the same thing as copyright. It’s not the same thing as ‘We’re going to use this to create more creativity on top of that later [after the copyright enters public domain].’ This is about a person’s legacy. This is about a person’s right to give this to their family.”

Kyncl added simply, “I agree with Mr. Crabtree-Ireland 100%.”

However, Sheffner shared a different perspective on post-mortem protection for publicity rights, saying that while “for living professional performers use of a digital replica without their consent impacts their ability to make a living…that job preservation justification goes away post-mortem. I have yet to hear of any compelling government interest in protecting digital replicas once somebody is deceased. I think there’s going to be serious First Amendment problems with it.”

Elsewhere during the hearing, Crabtree-Ireland expressed a need to limit how long a young artist can license out their publicity rights during their lifetime to ensure they are not exploited by entertainment companies. “If you had, say, a 21-year-old artist who’s granting a transfer of rights in their image, likeness or voice, there should not be a possibility of this for 50 years or 60 years during their life and not have any ability to renegotiate that transfer. I think there should be a shorter, perhaps seven-year, limitation on this.”

TikTok has returned to the bargaining table with Universal Music Group (UMG), but a fast-tracked Congressional bill that could result in the platform being sold or, as a last resort, banned in the United States may reach President Joe Biden’s desk before those negotiations are finished.
A source familiar with the talks says ByteDance — the Chinese company that owns TikTok — has returned to the bargaining table with UMG after the label group pulled its music from the social media platform at the end of January, citing the platform’s refusal to address three “critical” issues: “appropriate compensation for our artists and songwriters,” “protecting human artists from the harmful effects of AI” and “online safety for TikTok’s users.”

It’s unclear whether any progress has resulted — neither UMG nor TikTok will comment — but ByteDance currently faces a more urgent, existential issue now that the Speaker of the House of Representatives has attached what’s being called the TikTok national security bill to the foreign aid package for Ukraine and Israel that is expected to move quickly through Congress. The House may vote on it as early as this weekend, and the Senate is expected to act quickly. If it passes in both houses, President Biden has promised to sign it immediately.


Officially titled The Protecting Americans From Foreign Adversary Controlled Applications Act, the proposed legislation was drawn up after White House national security and intelligence leaders briefed House lawmakers on the potential dangers that TikTok, which is used by 170 million Americans, poses to the nation.  

What the TikTok National Security Bill Does

If Biden signs the bill into law, ByteDance will have approximately a year from its enactment — the original bill gave it just 90 days — to sell TikTok to a buyer in a country that the United States does not consider a foreign adversary. If ByteDance, which has ties to the Chinese Communist Party and is subject to its government, refuses to divest itself of TikTok or does not meet the deadline, then the app could be banned from being downloaded or used in the United States.

Rick Lane, TikTok Coalition.org leader and child safety advocate, says the TikTok bill “is moving forward very quickly. The language between the House and Senate is so close — they are millimeters apart, and I think agreements are being made to bring them together. Unless something drastic happens, I don’t see this bill’s momentum slowing down, no matter who’s on the other side. That is why adding it to the foreign aid bill makes sense.”

At a time when Congress is mired in ideological infighting, particularly among Republicans, the House of Representatives moved with remarkable speed to mark up and pass the bill and send it to the Senate.  

Despite a deluge of calls and messages from TikTok users protesting the legislation, the House passed it, 352 votes to 65, on March 13 — less than a week after national security and intelligence officials held a classified briefing for an executive session of the House Energy and Commerce Committee. A music industry source familiar with activity on Capitol Hill tells Billboard that, before the briefing started, “members’ and staffers’ devices were taken away, and the committee room’s AV systems and the like were removed.” Following the morning briefing, the committee marked up the bill that afternoon and voted unanimously to advance it to the full House of Representatives.

A classified intelligence briefing was also held in the Senate and prompted similar remarks of concern. Sen. Eric Schmitt (R-Mo.) told Axios that the Chinese-controlled platform’s “ability to spy is shocking.”

“We don’t know exactly what was briefed,” says the music industry source. “But what is absolutely crystal clear is that whatever has been presented to Congress members by the intelligence community is clearly driving this. You don’t see — particularly Congress members — reacting with that kind of dispatch and unanimity.” 

A ‘Once-in-a-Lifetime’ Alarm

“This is really a once-in-a-lifetime kind of alarm,” the source adds. “People who have been around the Hill for decades don’t remember there ever being this level of concern.”

An unclassified 2024 Annual Threat Assessment issued by the Office of the Director of National Intelligence (ODNI) in February may offer a glimpse of these security concerns. The assessment reported that “China is demonstrating a higher degree of sophistication in its influence activity, including experimenting with generative AI. TikTok accounts run by a [People’s Republic of China] propaganda arm reportedly targeted candidates from both political parties during the U.S. midterm election cycle in 2022.” 

In response, a TikTok spokesperson referred Billboard to its written response to the ODNI, dated March 15, which asserts that the social media platform “regularly takes action against deceptive behavior, including covert influence networks throughout the world, and has been transparent in reporting them publicly. TikTok has protected our platform through more than 150 elections globally,” the response continues, “and is continuing to work with electoral commissions, experts, and fact-checkers to safeguard our community during this historic election year.” 

In addition to the intelligence briefings, Billboard obtained a slide presentation that one Capitol Hill source says has been shown to staffers for over 40 senators. The presentation cobbles together previously published articles, analyses and reports about TikTok’s alleged dissemination of disinformation and propaganda to much of the same demographic that uses the app for music discovery. (According to a 2024 Pew Research Center report, 56% of U.S. adults 18 to 34 use the platform and 52% of the users in this age group have posted a video to the platform.)  

‘TikTok Is a News Organization’

As one tech policy expert says, “TikTok is a news organization. Trends are indicating that up to 40% of adults 18-to-29 will be getting their news from TikTok in 2024. It’s their CNN or Fox News or MSNBC.”  

One of the first slides, titled “TikTok Has Rapidly Evolved From an Entertainment to a News Platform, Enormously Expanding Its Influence on The U.S. Population,” includes a graph built from Pew Research Center data that shows 43% of TikTok users regularly got their news from the platform in 2023, nearly double the 22% that did so in 2020. Only X (59%) and Facebook (54%) were higher. And nearly a third of that 43% were adults under 30 years of age.  

Although music’s role in TikTok’s alleged dispersal of disinformation is not examined in the presentation, the tech policy expert says it’s definitely a factor. A 2023 report released in February by the rights management startup Pex revealed that 85% of TikTok videos contain music, more than on YouTube (84%), Instagram (58%) and Facebook (49%), and the tech policy expert says that music played on the platform often functions as an emotional gateway to propaganda.

“The power of music is what draws people to social interaction,” the source says. “They’re taking music that gets people excited and, for instance, following them with horrific videos — and the interaction of those data points creates this powerful tool to affect policy.” The expert adds that TikTok’s algorithm enables the platform to essentially tailor its approach to each user. “It’s no longer just one size fits all; the ability now is to take visual cues, music and sound and target each individual with what sets them off — and they can do that on a massive scale.  

“The argument in favor of TikTok is that Meta and Alphabet are collecting data from even more people, but they are not based in an adversarial country,” the expert continues. “There’s another key difference as well. TikTok sends you videos that they think you are interested in no matter what. Most young people want to be influencers. In order to be an influencer on TikTok, you have to follow what’s trending, so your video is blasted to more people. You tag along with feeds. In the policy realm, if they want to influence public policy, your view is going to be whatever direction that feed is going in.”

A TikTok spokesperson responds: “There is absolutely no evidence to these assertions. We have clear rules prohibiting deceptive behaviors.”  

‘They Deserve It’

The music industry’s view of the proceedings in Washington is mixed. The perspective of artists and songwriters is arguably best expressed by David Lowery, the artist rights activist and frontman for the bands Cracker and Camper Van Beethoven, who was also one of the more than 200 creators who, in early April, signed an open letter to tech platforms urging them to stop using AI “to infringe upon and devalue the rights of human artists.”

“The rates TikTok pays artists are extremely low, and it has a history — at least with me — of using my catalog with no licenses,” Lowery says. “I just checked to make sure and there are plenty of songs that I wrote on TikTok, and I have no idea how they have a license for those songs.” 

As a result, Lowery says that while “I’m kind of neutral as to whether TikTok needs to be sold to a U.S. owner, the bill pleases me in a general way because I feel that they’ve gotten away with abusing artists for so long that they deserve it. I realize the bill doesn’t punish them for doing that,” he continues, “but that’s why a lot of musicians feel they really deserve it.” 

The consensus among label executives is that TikTok is not going anywhere, but were the app banned in the United States, they wouldn’t spill many tears. In early April, Billboard reported that two months after UMG pulled its music from TikTok, its market share and chart appearances had not been greatly affected. And though numerous UMG artists have devised workarounds to maintain a presence on TikTok, one senior label executive says, “When you’re looking at the competitive set for TikTok, you see a migration to YouTube, Instagram and Snap. And those platforms see a real opportunity, so they’re starting to lean in. The absence of TikTok would just mean migration to other platforms and, frankly, because those platforms monetize better, even if you lose a significant chunk of your audience, you’re still going to make more money.”

$8.7 Million For Lobbyists

Capitol Hill sources say ByteDance has enlisted a small army of lobbyists to keep TikTok on U.S. mobile devices. In 2023, ByteDance spent $8.7 million on lobbyists, according to the nonprofit government transparency organization OpenSecrets. That’s almost double the $4.9 million it dropped in 2022, although a TikTok spokesperson attributes the year-to-year increase to “a unique, one-time higher expenditure in the third quarter of 2023 that reflects the vesting of Restricted Stock Units related to the launch of our U.S. buyback program.” (Data for 2024 lobbyist expenditures were not available at publication time.) 

That 2023 outlay was the fourth-highest amount spent on lobbyists by a tech company that year, behind Meta ($19.3 million), Amazon.com (nearly $19.3 million) and Alphabet (almost $12.4 million). In 2019, ByteDance spent less than $1 million on lobbyists.

Lobbyists hired by ByteDance include Rosemary Gutierrez, the former deputy chief of staff for Democratic Senator Maria Cantwell of Washington, who chairs the Senate Commerce Committee — which will review the TikTok legislation before a floor vote is taken — and Kellyanne Conway, former senior counselor to President Donald Trump. Conway is reportedly considering joining Trump’s reelection campaign, but last month, Politico reported that she was working for the conservative Club for Growth to lobby on TikTok’s behalf.   

One of the Club for Growth’s biggest donors is billionaire Jeffrey Yass, who owns 15% of ByteDance, a stake reportedly worth roughly $40 billion. Yass’ trading firm, Susquehanna International Group, is also the largest institutional shareholder — 2% — of Digital World Acquisition Corporation, which merged with Trump Media & Technology Group, the parent company of the former president’s Truth Social app, and took it public in late March. (The New York Times reported that it’s unclear if Susquehanna still owned the shares at the time of the IPO.)

Given Yass’ support of Trump, it’s not shocking that, after attempting to ban TikTok during his time in office, Trump has said on social media and in interviews that though he still considers TikTok a national security risk, he has reconsidered banning the platform. One reason he has cited is that such a move would benefit Meta and its social media app Facebook. Trump has made no secret of his enmity for Meta’s chairman/CEO Mark Zuckerberg and Facebook, which banned him in 2021. (Trump was reinstated in 2023.)  

The Taylor Factor

The news last week that Taylor Swift had restored her Taylor’s Version songs to TikTok in the run-up to the April 19 release of her new album The Tortured Poets Department led to speculation that the superstar singer-songwriter — who has often spoken out for artists’ rights — could be weaponized by TikTok in its standoff with UMG. In Washington, however, TikTok Coalition leader Lane says, “Taylor Swift being or not being on TikTok has never come up in any meeting I’ve been in on Capitol Hill.” He sees Swift’s return to the app as “a business decision” that’s no different than President Biden’s and Congress members’ presence on the app, or even UMG’s continued talks with TikTok. “It doesn’t diminish the strong bipartisan/bicameral support within Congress and the White House that TikTok is a clear and present danger to the U.S. national security and needs to be divested from ByteDance,” he says.  

Trump’s sway over the GOP has some on Capitol Hill predicting that passage of the TikTok National Security bill in concert with the foreign aid package is not a slam dunk. “It’s hard to say how it’s going to play on the Republican side,” says the music industry source familiar with the Capitol Hill proceedings. “Because while they’re feeling pressure from the former President on one hand to oppose the bill, they are also feeling heat from their constituents to support it.”  

Representative Adam Schiff (D-Calif.) introduced new legislation in the U.S. House of Representatives on Tuesday (April 9) which, if passed, would require AI companies to disclose which copyrighted works were used to train their models, or face a financial penalty. Called the Generative AI Copyright Disclosure Act, the new bill would apply to both new models and retroactively to previously released and used generative AI systems.
The bill requires that a full list of copyrighted works in an AI model’s training data set be filed with the Copyright Office no later than 30 days before the model becomes available to consumers. This would also be required when the training data set for an existing model is altered in a significant manner. Financial penalties for non-compliance would be determined on a case-by-case basis by the Copyright Office, based on factors like the company’s history of noncompliance and the company’s size.

Generative AI models are trained on up to trillions of existing works. In some cases, data sets, which can include anything from film scripts to news articles to music, are licensed from copyright owners, but often these models will scrape the internet for large swaths of content, some of which is copyrighted, without the consent or knowledge of the author. Many of the world’s largest AI companies have publicly defended this practice, calling it “fair use,” but many of those working in creative industries take the position that this is a form of widespread copyright infringement.


The debate has sparked a number of lawsuits between copyright owners and AI companies. In October, Universal Music Group, ABKCO, Concord Music Group, and other music publishers filed a lawsuit against AI giant Anthropic for “unlawfully” exploiting their copyrighted song lyrics to train AI models.

“In the process of building and operating AI models, Anthropic unlawfully copies and disseminates vast amounts of copyrighted works,” wrote lawyers for the music companies at the time. “Publishers embrace innovation and recognize the great promise of AI when used ethically and responsibly. But Anthropic violates these principles on a systematic and widespread basis.”

While many in the music business are also calling for compensation and the ability to opt in or out of being used in a data set, this bill focuses only on requiring transparency with copyrighted training data. Still, it has garnered support from many music industry groups, including the Recording Industry Association of America (RIAA), National Music Publishers’ Association (NMPA), ASCAP, Black Music Action Coalition (BMAC), and Human Artistry Campaign.

It is also supported by other creative industry groups, including the Professional Photographers of America, SAG-AFTRA, Writers Guild of America, International Alliance of Theatrical Stage Employees (IATSE) and more.

“AI has the disruptive potential of changing our economy, our political system, and our day-to-day lives,” said Rep. Schiff in a statement. “We must balance the immense potential of AI with the crucial need for ethical guidelines and protections. My Generative AI Copyright Disclosure Act is a pivotal step in this direction. It champions innovation while safeguarding the rights and contributions of creators, ensuring they are aware when their work contributes to AI training datasets. This is about respecting creativity in the age of AI and marrying technological progress with fairness.”

A number of rights groups also weighed in on the introduction of the bill.

“Any effective regulatory regime for AI must start with one of the most fundamental building blocks of effective enforcement of creators’ rights — comprehensive and transparent record keeping,” adds RIAA chief legal officer Ken Doroshow. “RIAA applauds Congressman Schiff for leading on this urgent and foundational issue.”

“We commend Congressman Schiff for his leadership on the Generative AI Copyright Disclosure Act,” NMPA president/CEO David Israelite said. “AI only works because it mines the work of millions of creators every day and it is essential that AI companies reveal exactly what works are training their data. This is a critical first step towards ensuring that AI companies fully license and that songwriters are fully compensated for the work being used to fuel these platforms.”

“Without transparency around the use of copyrighted works in training artificial intelligence, creators will never be fairly compensated and AI tech companies will continue stealing from songwriters,” ASCAP CEO Elizabeth Matthews said. “This bill is an important step toward ensuring that the law puts humans first, and we thank Congressman Schiff for his leadership.”

“Protecting the work of music creators is essential, and this all begins with transparency and tracking the use of copyrighted materials in generative AI,” Black Music Action Coalition (BMAC) co-chair Willie “Prophet” Stiggers said. “BMAC hopes Rep. Schiff’s Generative AI Copyright Disclosure Act helps garner support for this mission and that author and creator rights continue to be protected and preserved.”

“Congressman Schiff’s proposal is a big step forward towards responsible AI that partners with artists and creators instead of exploiting them,” Human Artistry Campaign senior advisor Dr. Moiya McTier said. “AI companies should stop hiding the ball when they copy creative works into AI systems and embrace clear rules of the road for recordkeeping that create a level and transparent playing field for the development and licensing of genuinely innovative applications and tools.”

Every passing day, a new statistic emerges that would make any aspiring artist, producer or songwriter feel foolish for trying to fund their dreams.
More than 100,000 songs are ingested by Spotify daily, but the vast majority of them fail to surpass the 1,000-play mark. Sony Music Entertainment, Warner Music Group, and Universal Music Group reported record profits in 2023, but those numbers are largely driven by a small number of star artists. A 2017 study showed that out of 7,000 bands tracked, only 21 managed to headline a venue with a capacity exceeding 3,000. Limited opportunity and long odds face artists who don’t have significant industry backing.

Content saturation makes it harder to stand out, inspiring strategic conservatism from major labels, who, driven by data, fear financial risk and tend to invest in artists who demonstrate substantial market appeal. 


What are musicians — and, frankly, writers, visual artists, filmmakers, or any creators — in need of resources to do when corporations appear more risk-averse than ever? 

Why do we need a grant system for individual artists?

While art is often considered a luxury rather than a public good, it has been shown to provide both cultural enrichment and economic stimulus. 

In 2023, Americans for the Arts found that the nonprofit arts and culture industry supported 2.6 million jobs, generated $29.1 billion in tax revenue, and provided $101 billion in personal income to U.S. residents. These numbers include the individuals who benefit from public arts funding to become working artists, who tour, show their work at museums, and fill movie theaters.

America’s nonprofit and for-profit arts sectors work together to promote cultural growth as much as they stimulate economic activity locally and nationally.

Public funding for the arts has remained relatively steady in absolute terms. However, inflation-adjusted spending on the arts by local governments has declined consistently throughout the 2000s. Local arts agencies now receive 27% less in funding than they did in 2001.

Other countries have shown a better system can exist. 

For 37 years, Canada’s FACTOR grant program has supported Canadian recording artists in meaningful ways. 

FACTOR covers costs that traditionally require the debt financing of a label deal: recording, music videos, and tour funding chief among them. 

Notable FACTOR recipients who have launched into successful careers include Jessie Reyez, Grimes, Charlotte Cardin, BADBADNOTGOOD and TOBi. Drake’s vaunted company October’s Very Own has also received a variety of grants from Canadian governmental sources — including funding for the 2014 OVO Fest.

In Sweden, robust arts education in public schools, combined with an internationally minded grant system, contributes to the small nation’s outsized influence on popular music abroad, particularly in the United States, where Max Martin’s Swedish pop sensibilities have dominated Billboard charts since Bill Clinton was in office.

While American artists can gain access to grants through institutions like the Guggenheim Memorial Foundation or the Henry Luce Foundation, or via state institutions, there is no unified federal mechanism for arts funding akin to FACTOR or the Swedish Arts Grants Committee. The National Endowment for the Arts has an impressive grantmaking operation but does not give direct grants to individual artists.

Introducing the CREATE Art Act.

We need a better system. 

In 2024, we are working to bring the CREATE Art Act to the American public. Created by Congressman Maxwell Frost, a drummer and musician himself and the first Gen-Z person ever elected to the United States Congress, the CREATE Art Act proposes a novel grant system for individual artists of all disciplines.

CREATE grants go beyond international models in the way they target emerging artists, those creators who may not yet have the good fortune of making a living off their art or who wish to avoid potentially injurious record and publishing deals. Recipients must show net earnings of less than $50,000 in the previous five years and not more than $400,000 in the previous 20 years from their art. The art produced must be relevant to the community and accessible to the public. The grants include:

Progress Grant – Up to $2,000 to support a year of artist activities.

Project Grant – Up to $100,000 per project that can be used over two years.

Live Performance Grant – Up to $35,000 for live performances.

Development Fund – Up to $10,000 to pay the living and working expenses of artists while they research, write, or cultivate stories or projects.

The purpose of the program is twofold. 

First, and simply, more artists with funding means more art. The greater the creative output of our nation, the greater the diversity of voices with the potential to gain an audience, shift perspectives, inspire future generations, and tell new American stories. 

Second, more artists creating means more economic activity in a sector experiencing an algae bloom of creators and consumers.

The current media landscape cuts a more jagged figure than ever. No monoliths. No starmakers. No obvious paths to success. 

In a time of such noise and fragmentation, artists find it as hard as ever to fund their dreams and more difficult than before to cut through the clutter. 

The CREATE Art Act would plant a foot on the right path forward, opening up possibilities for generations of American artists to come.

The first member of Generation Z to be elected to Congress, Maxwell Alejandro Frost is proud to represent the people of Central Florida (FL-10) in the United States House of Representatives. As a young Member of Congress and Afro-Latino, Congressman Frost brings a fresh, progressive perspective to an institution formerly out of reach for young, working Black and Latino Americans.

Jon Tanners is a manager, writer, and entrepreneur based in Los Angeles. He manages Grammy-winning, multi-platinum producers Dahi, Michael Uzowuru, and Take A Daytrip and is also co-founder of CreateSafe.

The House on Wednesday passed a bill that would lead to a nationwide ban of the popular video app TikTok if its China-based owner doesn’t sell, as lawmakers acted on concerns that the company’s current ownership structure is a national security threat.


The bill, passed by a vote of 352-65, now goes to the Senate, where its prospects are unclear.

TikTok, which has about 170 million American users, is a wholly owned subsidiary of Chinese technology firm ByteDance Ltd.


The lawmakers contend that ByteDance is beholden to the Chinese government, which could demand access to the data of TikTok’s consumers in the U.S. any time it wants. The worry stems from a set of Chinese national security laws that compel organizations to assist with intelligence gathering.

“We have given TikTok a clear choice,” said Rep. Cathy McMorris Rodgers, R-Wash. “Separate from your parent company ByteDance, which is beholden to the CCP (the Chinese Communist Party), and remain operational in the United States, or side with the CCP and face the consequences. The choice is TikTok’s.”

House passage of the bill is only the first step. The Senate would also need to pass the measure for it to become law, and lawmakers in that chamber indicated it would undergo a thorough review. Senate Majority Leader Chuck Schumer, D-N.Y., said he’ll have to consult with relevant committee chairs to determine the bill’s path.

President Joe Biden has said if Congress passes the measure, he will sign it.

The House vote is poised to open a new front in the long-running feud between lawmakers and the tech industry. Members of Congress have long been critical of tech platforms and their expansive influence, often clashing with executives over industry practices. But by targeting TikTok, lawmakers are singling out a platform popular with millions of people, many of whom skew younger, just months before an election.

Opposition to the bill was also bipartisan. Some Republicans said the U.S. should warn consumers if there are data privacy and propaganda concerns, while some Democrats voiced concerns about the impact a ban would have on its millions of users in the U.S., many of whom are entrepreneurs and business owners.

“The answer to authoritarianism is not more authoritarianism,” said Rep. Tom McClintock, R-Calif. “The answer to CCP-style propaganda is not CCP-style oppression. Let us slow down before we blunder down this very steep and slippery slope.”

Ahead of the House vote, a top national security official in the Biden administration held a closed-door briefing Tuesday with lawmakers to discuss TikTok and the national security implications. Lawmakers are balancing those security concerns against a desire not to limit free speech online.

“What we’ve tried to do here is be very thoughtful and deliberate about the need to force a divestiture of TikTok without granting any authority to the executive branch to regulate content or go after any American company,” said Rep. Mike Gallagher, the bill’s author, as he emerged from the briefing.

TikTok has long denied that it could be used as a tool of the Chinese government. The company has said it has never shared U.S. user data with Chinese authorities and won’t do so if it is asked. To date, the U.S. government also has not provided any evidence that shows TikTok shared such information with Chinese authorities. The platform has about 170 million users in the U.S.

The security briefing seemed to change few minds, instead solidifying the views of both sides.

“We have a national security obligation to prevent America’s most strategic adversary from being so involved in our lives,” said Rep. Nick LaLota, R-N.Y.

But Rep. Robert Garcia, D-Calif., said no information has been shared with him that convinces him TikTok is a national security threat. “My opinion, leaving that briefing, has not changed at all,” he said.

“This idea that we’re going to ban, essentially, entrepreneurs, small business owners, the main way how young people actually communicate with each other is to me insane,” Garcia said.

“Not a single thing that we heard in today’s classified briefing was unique to TikTok. It was things that happen on every single social media platform,” said Rep. Sara Jacobs, D-Calif.

Republican leaders have moved quickly to bring up the bill after its introduction last week. A House committee approved the legislation unanimously, on a 50-0 vote, even after their offices were inundated with calls from TikTok users demanding they drop the effort. Some offices even shut off their phones because of the onslaught.

Lawmakers in both parties are anxious to confront China on a range of issues. The House formed a special committee to focus on China-related issues. And Schumer directed committee chairs to begin working with Republicans on a bipartisan China competition bill.

Senators have expressed an openness to the bill but suggested they don’t want to rush ahead.

“It is not for me a redeeming quality that you’re moving very fast in technology because the history shows you make a lot of mistakes,” said Sen. Ron Wyden, D-Ore.

In pushing ahead with the legislation, House Republicans are also creating rare daylight between themselves and former President Donald Trump as he seeks another term in the White House.

Trump has voiced opposition to the effort. He said Monday that he still believes TikTok poses a national security risk but is opposed to banning the hugely popular app because doing so would help its rival, Facebook, which he continues to lambast over his 2020 election loss.

As president, Trump attempted to ban TikTok through an executive order that called “the spread in the United States of mobile applications developed and owned by companies in the People’s Republic of China (China)” a threat to “the national security, foreign policy and economy of the United States.” The courts, however, blocked the action after TikTok sued, arguing such actions would violate free speech and due process rights.

A new bill designed to increase streaming payouts for artists was introduced in the U.S. House of Representatives on Wednesday (March 6). Titled the Living Wage for Musicians Act, the legislation proposes the establishment of a new royalty fund that would pay artists directly, bypassing labels altogether.
Introduced by Reps. Rashida Tlaib (D-Mich.) and Jamaal Bowman (D-N.Y.), the bill aims to boost artists’ streaming royalties from fractions of a penny up to one penny per stream by way of the new fund. It proposes to fund the additional royalty payments, in part, by mandating the addition of a fee to every streaming subscription equal to 50% of the subscription price — an amount that would be set anywhere between $4 and $10. The bill would also establish a royalty cap for tracks that generate at least 1 million streams per month, with the royalties those tracks generate beyond that threshold divided among all artists.

Notably, the bill would not affect the existing payout model but rather establish a separate fund on top of what artists already receive under the current system.


“It’s only right that the people who create the music we love get their fair share, so that they can thrive, not just survive,” said Tlaib, who has long advocated for higher royalty payments to artists on streaming services, in a statement.

Damon Krukowski, a member of the band Damon & Naomi (and formerly Galaxie 500) and an organizer for the Union of Musicians and Allied Workers (UMAW), added in a statement, “There is a lot of talk in the industry about how to ‘fix’ streaming — but the streaming platforms and major labels have already had their say for more than a decade, and they have failed musicians.”

The UMAW partnered with the representatives in drafting the act.

It’s unlikely streaming services and top labels will support all of the changes proposed by the bill. Daniel Ek, Spotify’s co-founder/CEO, for years expressed reluctance to raise subscription prices, although they did finally increase in 2023, rising from $9.99 to $10.99. Also likely to be unpopular with streaming services: a mandate that 10% of all of their non-subscription revenue, including from advertising, goes to the fund as a way to further increase payments to recording artists.

Labels and some artists also seem likely to oppose the cap in which the most popular artists share portions of their streaming revenue with the rest of the streaming pool. And labels — which have lobbying power through the Recording Industry Association of America (RIAA) — will also likely challenge the provision that would see artists paid directly from the fund rather than through the labels themselves.

An RIAA representative declined to comment on the bill.

Country star Lainey Wilson and Recording Academy president/CEO Harvey Mason voiced their support for federal regulation of AI technology at a hearing conducted by the House Judiciary Subcommittee on Courts, Intellectual Property, and the Internet in Los Angeles on Friday (Feb. 2). 
“Our voices and likenesses are indelible parts of us that have enabled us to showcase our talents and grow our audiences, not mere digital kibble for a machine to duplicate without consent,” Wilson said during her comments. 

“The artists and creators I talk to are concerned that there’s very little protection for artists who see their own name or likeness or voice used to create AI-generated materials,” Mason added. “This misuse hurts artists and their fans alike.” 

“The problem of AI fakes is clear to everyone,” he continued later. “This is a problem that only Congress can address to protect all Americans. For this reason, the academy is grateful for the introduction of the No AI FRAUD Act,” a bill announced in January that aims to establish a federal framework for protecting voice and likeness. 

The star of the hearing was not from the music industry, though. Jennifer Rothman, a professor of law at the University of Pennsylvania, offered an eloquent challenge to a key provision of the No AI FRAUD Act, which would allow artists to transfer the rights to their voice and likeness to a third party.

It’s easy to imagine this provision is popular with labels, which historically built their large catalogs by taking control of artists’ recordings in perpetuity. However, Rothman argued that “any federal right to a person’s voice or likeness must not be transferable away from that person” and “there must be significant limits on licensing” as well.

“Allowing another person or entity to own a living human being’s likeness or voice in perpetuity violates our fundamental and constitutional right to liberty,” she said.

Rothman cleverly invoked the music industry’s long history of perpetuity deals — a history that has upset many artists, including stars like Taylor Swift, over the years — as part of the reason for her objection. 

“Imagine a world in which Taylor Swift‘s first record label obtained rights in perpetuity to young Swift’s voice and likeness,” Rothman explained. “The label could then replicate Swift’s voice over and over in new songs that she never wrote and have AI renditions of her perform and endorse the songs and videos and even have holograms perform them on tour. In fact, under the proposed No AI Fraud Act, the label would be able to sue Swift herself for violating her own right of publicity if she used her voice and likeness to write and record new songs and publicly perform them. This is the topsy-turvy world that the draft bills would create.”

(Rothman’s reference to Swift was just one of several at the hearing. Rep. Kevin Kiley (R-CA) alluded to the debate over whether or not the singer would be able to make it to the Super Bowl from her performance in Tokyo, while Rep. Nathaniel Moran (R-TX) joked, “I have not mentioned Travis Kelce’s girlfriend once during this testimony.”)

Rothman pointed out that the ability to transfer voice or likeness rights in perpetuity potentially “threatens ordinary people” as well: They “may unwittingly sign over those rights as part of online Terms of Service” that exist on so many platforms and are barely ever read. In the music industry, a similar issue is already tripping up a number of young artists who sign up to distribute their music through an online service, agree to the Terms of Service without reading them, and later discover that they have unknowingly locked their music into some sort of agreement. In an AI world, this problem could be magnified.

Rothman’s comments put her at odds with the Recording Academy. “In this particular bill, there are certain safeguards, there’s language that says there have to be attorneys present and involved,” Mason said during questioning. (Though many young artists can’t afford counsel or can’t find good counsel.) “But we also believe that families should have the freedom to enter into different business arrangements.” 

Mason’s view was shared by Rep. Matt Gaetz (R-FL). “If tomorrow I wanted to sell my voice to a robot and let that robot say whatever in the world that it wanted to say, and I wanted to take the money from that sale and go buy a sailboat and never turn on the internet again, why should I not have the right to do that?” he asked.

In addition to Rothman, Mason and Wilson, there was one other witness at the hearing: Christopher Mohr, who serves as president of the Software & Information Industry Association. He spoke little and mostly reiterated that his members wanted the courts to answer key questions around AI. “It’s really important that these cases get thoroughly litigated,” Mohr said.

This answer did not satisfy Rep. Glenn Ivey (D-MD), a former litigator. “It could take years before all of that gets solved and you might have conflicting decisions from different courts in jury trials,” Ivey noted. “What should we be doing to try and fix it now?”

A bipartisan group of U.S. House lawmakers announced a new bill on Wednesday (Jan. 10) that regulates the use of AI for cloning voices and likenesses. Called the No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2023 (“No AI FRAUD” Act), the bill aims to establish a federal framework for protecting one’s voice and likeness and lays out First Amendment protections.


More federal and state legislation regulating artificial intelligence is expected to be announced later today, including a bill from Gov. Bill Lee of Tennessee also regarding AI voice and likeness cloning. On Jan. 5, Gov. Lee hinted at the subject of his forthcoming legislation: “As the technology landscape evolves with artificial intelligence, we’re proud to lead the nation in proposing legal protection for our best-in-class artists and songwriters.”

The No AI FRAUD Act was introduced by Rep. María Elvira Salazar (R-FL), the lead Republican sponsor of the bill, alongside Reps. Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Joe Morelle (D-NY) and Rob Wittman (R-VA). It is said to be based on the Senate discussion draft Nurture Originals, Foster Art, and Keep Entertainment Safe Act (“NO FAKES” Act), which was announced last October.

“It’s time for bad actors using AI to face the music,” said Rep. Salazar. “This bill plugs a hole in the law and gives artists and U.S. citizens the power to protect their rights, their creative work, and their fundamental individuality online.”

AI voice synthesis technology poses a new problem and opportunity for recording artists. While some laud it as a novel marketing, creative or fan engagement tool, it also leaves artists vulnerable to uncanny impersonations that could confuse, scam or mislead the public.

An artist’s voice, image or likeness may be covered by “right of publicity” laws, which protect against commercial exploitation without authorization, but this is a right that varies state by state. The No AI FRAUD Act aims to establish a harmonized baseline of protection. Still, if one lives in a state with an even stronger right of publicity law than the No AI FRAUD Act, that state protection remains viable and may be easier to address in court.

This bill is in keeping with the regulations that a number of music business executives, including those at Sony, ASCAP and UMG, have called for in recent months, following incidents like the viral fake-Drake song “Heart On My Sleeve.”

Mitch Glazier, chairman and CEO of the Recording Industry Association of America (RIAA), released a statement, showing support for the No AI FRAUD Act. “The No AI FRAUD Act is a meaningful step towards building a safe, responsible and ethical AI ecosystem, and the RIAA applauds Representatives Salazar, Dean, Moran, Morelle, and Wittman for leading in this important area. To be clear, we embrace the use of AI to offer artists and fans new creative tools that support human creativity. But putting in place guardrails like the No AI FRAUD Act is a necessary step to protect individual rights, preserve and promote the creative arts, and ensure the integrity and trustworthiness of generative AI. As decades of innovation have shown, when Congress establishes strong IP rights that foster market-led solutions, it results in both driving innovation and supporting human expression and partnerships that create American culture.”

Lucian Grainge, chairman and CEO of Universal Music Group, also shared his praise for the new bill in a statement: “Universal Music Group strongly supports the ‘No AI FRAUD Act’ because no one should be permitted to steal someone else’s image, likeness or voice. While we have an industry-leading track record of enabling AI in the service of artists and creativity, AI that uses their voice or identity without authorization is unacceptable and immoral. We call upon Congress to help put an end to nefarious deepfakes by enacting this federal right of publicity and ensuring that all Americans are protected from such harm.” 

Harry Dunn, one of the Black police officers who defended the U.S. Capitol from rioters during the Jan. 6 insurrection, is running for Congress.
On Friday (Jan. 5), former Capitol police officer Harry Dunn announced that he was running for a congressional seat in Maryland. Dunn declared his candidacy on X, formerly Twitter, a day before the third anniversary of the vicious and violent attack by supporters of former President Donald Trump on the U.S. Capitol building, which temporarily disrupted the certification of then-President-elect Joe Biden’s victory.

In an interview with the Associated Press concerning his announcement, Dunn said: “As a Capitol Police officer, I did all that I can do in that role to protect, defend, and preserve democracy. But that is exhausted now.” The 15-year veteran retired from the Capitol Police last month. Dunn became known as one of the prominent faces of the embattled officers, saying he was “hell-bent on finishing what he started.” He was among the officers who protected members of Congress as rioters swarmed the halls, holding a stairwell leading to the Lower West Terrace.
In testifying before the House Select Committee investigating the insurrection, Dunn, who is Black, spoke about the intense violence of the rioters and being hit with numerous racial slurs. “I know so many other officers continue to hurt both physically and emotionally,” Dunn said during his testimony. “What we went through that day was traumatic.” The federal investigation into the attack has led to more than 1,230 people being arrested and charged with federal crimes, with about 730 pleading guilty and 170 convicted of at least one charge at trial.
Dunn would later be awarded the Presidential Citizens Medal by President Biden for his service. He is entering the race for Maryland’s heavily Democratic 3rd District after Rep. John Sarbanes declared last October that he would not seek reelection. Sarbanes is one of 12 members of his party not returning to Congress, while 14 Republican members of Congress have also declared that they will not seek reelection.
“A lot of people are leaving, because I don’t know of a better way to say it, it’s a very toxic place. But I do believe that in times like this it is important for good people to stand up, so the bad guys, so to speak, do not win,” Dunn said in an interview with AP. 

Buying concert tickets could become an easier, more straightforward process after a U.S. House Energy and Commerce subcommittee passed the Speculative Ticketing Oversight and Prohibition (STOP) Act on Wednesday (Dec. 6). The bill is now eligible for a vote by the full House.
The STOP Act, which Rep. Gus Bilirakis (R-Fla.) called the “biggest ticket reform in years,” does far more than prevent speculative ticketing, though. The bill also addresses a range of deceptive ticketing practices and transparency issues that perplex, aggravate and annoy consumers.

For starters, the bill requires ticket sellers to conspicuously show the final ticket price at the beginning of the purchase process rather than at check-out. “The first price that you see when you order the ticket is the price that you pay — not a penny more,” said Rep. Jan Schakowsky (D-Ill.) during Wednesday’s hearing.

The bill also ensures ticket buyers can get refunds when concerts are cancelled or postponed. Ticket buyers will have the option of receiving a full refund or, subject to availability, a replacement ticket if the event is postponed and rescheduled in the same or a “comparable” location.

“Consumers should not be left on the hook if an event is canceled or postponed and should have the option to receive a full refund or comparable ticket to a rescheduled show or game,” said Rep. Frank Pallone (D-N.J.).

The STOP Act also helps consumers know if they’re buying a ticket from the primary seller or a secondary marketplace. The bill would require ticket sellers to provide buyers with “a clear and conspicuous statement” that the provider is engaged in the secondary sale of the ticket. In addition, the secondary ticket marketplace cannot state that it is “affiliated with or endorsed by a venue, team, or artist” unless a partnership agreement exists.

Deceptive websites that could mislead ticket buyers are also banned. Ticket providers are prevented from using a domain name or subdomain that contains the name of a specific team, league, venue, performance or artist — including “substantially similar” and misspelled names — unless authorized by the owner of the name. Ticket sellers must also make their refund policies known up front.

Finally, as the name of the bill implies, the STOP Act bans speculative ticketing, in effect barring primary and secondary ticketing marketplaces from selling tickets they do not possess.

For its part, Live Nation, owner of the country’s largest ticketing company, Ticketmaster, welcomes the new measures. “We’ve long supported a federal all-in pricing mandate, along with other measures including banning speculative ticketing and deceptive websites that trick fans,” the company said in a statement. “We’ll continue working with policymakers, advocating for even stronger reforms and enforcement to stop predatory practices that hurt fans and artists.”

Even if the STOP Act passes in the full House, the U.S. Senate must pass a version of the bill for it to become law. Two similar bills have already been introduced in the Senate. Like the STOP Act, the TICKET Act, introduced by Sens. Ted Cruz (R-Texas) and Maria Cantwell (D-Wash.), would prevent hidden ticket fees, require upfront pricing and stop speculative ticket selling. The Unlocking Ticketing Markets Act, introduced by Sens. Amy Klobuchar (D-Minn.) and Richard Blumenthal (D-Conn.), would limit exclusive, multi-year ticketing contracts in live entertainment.