Senate

Angela Alsobrooks and Rep. Lisa Blunt Rochester achieved historic firsts on Election Day, each becoming the first Black woman elected to the U.S. Senate from her state. Their wins also mark the first time that two Black women will serve in the Senate chamber at the same time.

Angela Alsobrooks, currently the County Executive for Prince George’s County in Maryland, faced former Maryland Gov. Larry Hogan in a race that most expected Alsobrooks to win in the heavily blue-leaning state. Hogan spent heavily in Maryland in an attempt to flip Sen. Ben Cardin’s open seat, but Alsobrooks ran a sharp campaign and leaned on her strong record as a well-known elected official in one of the state’s largest counties.
Rep. Lisa Blunt Rochester, a congresswoman representing Delaware’s at-large district, has been in politics since the 1990s. She served as deputy secretary of the Department of Health and Social Services in 1993 and as secretary of the Department of Labor in 1998 under outgoing Delaware Sen. Tom Carper during his time as governor of the state.
Once sworn in, Alsobrooks and Blunt Rochester will bring the number of Black senators serving in the chamber to five. They are also just the fourth and fifth Black women to serve in the Senate.
The pair follow a path first laid by Sen. Carol Moseley Braun, who in 1992 became the first Black woman elected to the Senate. Vice President Kamala Harris was elected to the Senate in 2016. Sen. Laphonza Butler, who will vacate her seat at the end of the current term, was appointed to serve out the late Sen. Dianne Feinstein’s term, which ends this coming January.
Excitement over the news of Senator-elect Angela Alsobrooks and Senator-elect Lisa Blunt Rochester is spreading. We’ve got reactions from X below.

Thank you, Maryland! pic.twitter.com/fUl9HGdmaU
— Angela Alsobrooks (@AlsobrooksForMD) November 6, 2024

From the bottom of my heart, Delaware, thank you 💙 pic.twitter.com/UI9GtzqYBJ
— Lisa Blunt Rochester (@LisaBRochester) November 6, 2024


A bipartisan group of four senators led by Majority Leader Chuck Schumer is recommending that Congress spend at least $32 billion over the next three years to develop artificial intelligence and place safeguards around it, writing in a new report released Wednesday that the U.S. needs to “harness the opportunities and address the risks” of the quickly developing technology.
The group of two Democrats and two Republicans said in an interview Tuesday that while they sometimes disagreed on the best paths forward, it was imperative to find consensus with the technology taking off and other countries like China investing heavily in its development. They settled on a raft of broad policy recommendations that were included in their 33-page report.

While any legislation related to AI will be difficult to pass, especially in an election year and in a divided Congress, the senators said that regulation and incentives for innovation are urgently needed.

“It’s complicated, it’s difficult, but we can’t afford to put our head in the sand,” said Schumer, D-N.Y., who convened the group last year after AI chatbot ChatGPT entered the marketplace and showed that it could in many ways mimic human behavior.

The group recommends in the report that Congress draft “emergency” spending legislation to boost U.S. investments in artificial intelligence, including new research and development and new testing standards to try to understand the potential harms of the technology. The group also recommended new requirements for transparency as artificial intelligence products are rolled out, as well as studies into the potential impact of AI on jobs and the U.S. workforce.

Republican Sen. Mike Rounds, a member of the group, said the money would be well spent not only to compete with other countries racing into the AI space but also to improve Americans’ quality of life, supporting technology that could help cure some cancers or chronic illnesses, he said, or improve weapons systems in ways that could help the country avoid a war.

“This is a time in which the dollars we put into this particular investment will pay dividends for the taxpayers of this country long term,” he said.

The group came together a year ago after Schumer made the issue a priority — an unusual posture for a majority leader — and brought in Democratic Sen. Martin Heinrich of New Mexico, Republican Sen. Todd Young of Indiana and Rounds of South Dakota.

As the four senators began meeting with tech executives and experts, Schumer said in a speech over the summer that the rapid growth of artificial intelligence tools was a “moment of revolution” and that the government must act quickly to regulate companies that are developing it.

Young said the development of ChatGPT, along with other similar models, made them realize that “we’re going to have to figure out collectively as an institution” how to deal with the technology.

“In the same breath that people marveled at the possibilities of just that one generative AI platform, they began to hypothesize about future risks that might be associated with future developments of artificial intelligence,” Young said.

While passing legislation will be tough, the group’s recommendations lay out the first comprehensive road map on an issue that is complex and has little precedent for consideration in Congress. The group spent almost a year compiling the list of policy suggestions after talking privately and publicly to a range of technology companies and other stakeholders, including in eight forums to which the entire Senate was invited.

The first forum in September included Elon Musk, CEO of Tesla and owner of X, Meta’s Mark Zuckerberg, former Microsoft CEO Bill Gates and Google CEO Sundar Pichai.

Schumer said after the private meeting that he had asked everyone in the room — including almost two dozen tech executives, advocates and skeptics — whether government should have a role in the oversight of artificial intelligence, and “every single person raised their hand.”

The four senators are pitching their recommendations to Senate committees, which are then tasked with reviewing them and trying to figure out what is possible. The Senate Rules Committee is already moving forward with legislation, voting on Wednesday on three bills that would ban deceptive AI content used to influence federal elections, require AI disclaimers on political ads and create voluntary guidelines for state election offices that oversee candidates.

Schumer, who controls the Senate’s schedule, said those election bills were among the chamber’s “highest priorities” this year. He also said he planned to sit down with House Speaker Mike Johnson, who has expressed interest in looking at AI policy but has not said how he would do that.

Some experts warn that the U.S. is behind many other countries on the issue, including the European Union, which took the lead in March when it gave final approval to a sweeping new law governing artificial intelligence in the 27-nation bloc. Europe’s AI Act sets tighter rules for the AI products and services deemed to pose the highest risks, such as in medicine, critical infrastructure or policing. But it also includes provisions regulating a new class of generative AI systems like ChatGPT that have rapidly advanced in recent years.

“It’s time for Congress to act,” said Alexandra Reeve Givens, CEO of the Center for Democracy & Technology. “It’s not enough to focus on investment and innovation. We need guardrails to ensure the responsible development of AI.”

The senators emphasized balance between those two issues, and also the urgency of action.

“We have the lead at this moment in time on this issue, and it will define the relationship between the United States and our allies and other competing powers in the world for a long time to come,” Heinrich said.

FKA Twigs is slated to testify before the Senate Judiciary Subcommittee on Intellectual Property on Tuesday afternoon (April 30) to warn members of Congress about the dangers of the unsanctioned use of artificial intelligence to mimic an artist’s unique style and delivery.
The singer/dancer will also reveal that she has been developing a deepfake of herself over the past year in a bid to explore using AI to help with marketing and streamlining the creative process, as well as to head off anyone else beating her to the AI punch.

“As a future-facing artist, new technologies are an exciting tool that can be used to express deeper emotions, create fantasy worlds, and touch the hearts of many people,” she will tell the committee, which will also hear from Warner Music Group CEO Robert Kyncl. Her appearance in D.C. is in support of the Senate’s bipartisan Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act draft proposal, aimed at protecting Americans from nonconsensual AI-generated deepfakes and creating federal-level rules to protect an individual’s voice and image from being used in harmful AI-generated content.

Her testimony, provided to Billboard ahead of her appearance, will open with the 36-year-old artist describing a life spent immersed in the arts, including the ballet, singing and acting lessons that her dancer mother and dance company director stepfather sacrificed to provide for her. “From the age of 16, I began to explore both dance and music as a career, and that interest in multiple disciplines has defined my life for the past two decades both personally and professionally,” she will tell the committee.

The Grammy-nominated singer and recent soloist with the acclaimed Martha Graham Dance Company — and co-star in an upcoming adaptation of The Crow — will tell the committee that she wanted to testify because “my music, my dancing, my acting, the way that my body moves in front of a camera and the way that my voice resonates through a microphone is not by chance; they are essential reflections of who I am. My art is the canvas on which I paint my identity and the sustaining foundation of my livelihood. It is the essence of my being.”

All of that, however, is under threat, she testifies, noting that while AI can’t replicate the depth of her journey, “those who control it hold the power to mimic the likeness of my art, to replicate it and falsely claim my identity and intellectual property. This prospect threatens to rewrite and unravel the fabric of my very existence. We must enact regulation now to safeguard our authenticity and protect against misappropriation of our inalienable rights.”

At a time when bootleg AI songs claiming to feature the voices of major stars such as The Weeknd are proliferating — including the Drake AI freestyle diss track “Taylor Made” with computer-generated voices of Snoop Dogg and the late Tupac Shakur that was removed after a lawsuit threat from Shakur’s estate — Twigs says that the progenitors of the internet could not have predicted three decades ago how integral, and sometimes dangerous, it would become to our lives.

“AI is the biggest leap in technological advancement since the internet. You know the saying ‘Fool me once, shame on you… Fool me twice, shame on me,’” she says. “If we make the same mistakes with the emergence of AI, it will be ‘shame on us.’”

Having gleefully embraced technology throughout her career, Twigs will describe her bespoke deepfake, which she trained in the quirks of her personality and tuned to the exact tone of her voice to speak in several languages. “I will be engaging my AI twigs later this year to extend my reach and handle my online social media interactions, whilst I continue to focus on my art from the comfort and solace of my studio,” she says.

“These and similar emerging technologies are highly valuable tools both artistically and commercially when under the control of the artist,” she tells the committee. “What is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain without my consent due to the absence of appropriate legislative control.”

Noting that history is littered with the stories of artists being the first ones to be exploited during moments of great technological advance, Twigs will warn that the “general and more vulnerable public” are often next. “By protecting artists with legislation at such a momentous moment in our history we are protecting a five-year-old child in the future from having their voice, likeness and identity taken and used as a commodity without prior consent, attribution or compensation,” she says.

Her testimony includes a plea to the committee to help protect artists and their work from the dangers of AI exploitation, speaking on behalf of fellow creators whose careers depend on the ability to create with the knowledge that they can maintain “tight control” over their “art, image, voice and identity.”

“Our careers and livelihoods are in jeopardy, and so potentially are the wider image-related rights of others in society,” she says. “You have the power to change this and safeguard the future. As artists and, more importantly, human beings, we are a facet of our given, learned, and developed identity. Our creativity is the product of this lived experience overlaid with years of dedication to qualification, training, hard work and, dare I say it, significant financial investment and sacrifice. That the very essence of our being at its most human level can be violated by the unscrupulous use of AI to create a digital facsimile that purports to be us, and our work, is inherently wrong.”

The testimony will end with an urgent plea, as well as a dire warning: “We must get this right … you must get this right,” she says. “Now… before it is too late.”

In January, a bipartisan group of U.S. House lawmakers announced a bill aimed at regulating the use of AI for cloning voices and likenesses, the No AI FRAUD Act, which could establish a federal framework for protecting one’s voice and likeness while laying out First Amendment protections.

U.S. Senators Marsha Blackburn (R-Tenn.) and John Hickenlooper (D-Colo.) have introduced a bill to help support music tourism throughout the country. Dubbed the American Music Tourism Act of 2024, the newly introduced legislation would be an amendment to the Visit America Act that passed in 2022 and required the assistant secretary of commerce for travel and tourism to lead a coordinated national effort to rejuvenate international tourism following declines from the pandemic.  
The American Music Tourism Act of 2024 requires the assistant secretary to identify locations and events in the United States that are important to music tourism and promote domestic travel and tourism to those sites and events.

“Tennesseans know a thing or two about the positive impact that music tourism has on the economy and culture,” Sen. Blackburn said in a statement. “The Volunteer State is proud to be home to so many iconic musical landmarks for tourists to enjoy – from Graceland in Memphis to the Grand Ole Opry in Nashville, Dollywood in Pigeon Forge, and the Birthplace of Country Music Museum in Bristol. This bipartisan legislation promotes music tourism’s fast-growing industry and ensures fans from all over the world can celebrate the rich history of music for generations to come.”

The act classifies music tourism as the act of traveling to a state or locality to visit historic or modern-day music-related attractions, including museums, studios, venues of all sizes and other sites related to music. The definition also includes traveling somewhere in the U.S. to attend a music festival, concert or other live music performance. If passed, the act would strengthen the economic benefits of music festivals like Tennessee’s Bonnaroo or California’s Stagecoach, as well as music venues from Madison Square Garden in New York City to the Bluebird Cafe in Nashville.

“Music venues are keepers of our culture. From Red Rocks to the Grand Ole Opry, and hundreds of small venues across our country, millions visit Colorado and all our states to hear world class musicians and connect with each other,” said Sen. Hickenlooper in a statement. “Our bipartisan American Music Tourism Act will support these venues by helping our music tourism industry grow and expand.”

The bipartisan legislation is endorsed by the Recording Academy, the Nashville Songwriters Association International, the Recording Industry Association of America, Live Nation Entertainment, the National Independent Venue Association, the Tennessee Department of Tourism Development, the Tennessee Entertainment Commission, Memphis Tourism, the Pigeon Forge Department of Tourism and the Overton Park Shell in Memphis.

“The Recording Academy is pleased to support the American Music Tourism Act and applauds Senators Blackburn and Hickenlooper for their continued dedication to lifting up the music community,” said Recording Academy chief advocacy and public policy officer Todd Dupler in a statement. “Music has long played an important role in our economy and culture. This bill will amplify the music community’s contributions to economic growth and increase understanding of music’s impact on the U.S. and the world.”

Live Nation’s president of Nashville music and business strategy Sally Williams also voiced her approval of the act, stating, “In Nashville, Memphis, and countless other communities across the country, a vibrant live music scene is an economic magnet that draws fans from around the globe. The American Music Tourism Act is an important piece of legislation that will help ensure live music remains a pillar of American culture and tourism, and we’d like to thank the Senator for her leadership on this issue.”

The American Music Tourism Act would leverage this existing framework within the Department of Commerce to highlight and promote music tourism in the United States. The act requires the assistant secretary to report findings, achievements and activities to the relevant congressional committees within one year of its passage and every year thereafter.

“From rural communities to city centers, independent stages attract investment and visitors for the artists and professionals that put on shows and the restaurants, retail, and attractions around them,” said National Independent Venue Association executive director Stephen Parker in an endorsement. “The American Music Tourism Act finally recognizes music tourism as a catalyst for economic development and ensures its growth is a national priority. We applaud Senators Marsha Blackburn and John Hickenlooper for aligning the nation’s tourism strategy with the venues and festivals across our country that the world travels to experience.”

A bipartisan coalition of high-profile U.S. senators introduced a sweeping ticketing reform bill today that backers say would significantly improve transparency in concert and sports ticketing, better manage and enforce laws around ticket resale and ban deceptive sales tactics designed to trick consumers into overpaying for access to major events.
The Fans First Act, sponsored by U.S. Senator John Cornyn (R-TX) and co-sponsored by Amy Klobuchar (D-MN), Marsha Blackburn (R-TN), Ben Ray Luján (D-NM), Roger Wicker (R-MS) and Peter Welch (D-VT), is the most comprehensive ticketing industry reform package ever introduced in Congress. It could lead to long-needed reforms championed by consumer rights groups, advocacy groups and live music companies, including both Live Nation and Ticketmaster, as well as members of the Fix The Tix coalition: the National Independent Venue Association, the Recording Academy, the National Independent Talent Organization, the Screen Actors Guild-American Federation of Television and Radio Artists and the Association of Performing Arts Professionals.

“The current ticketing system is riddled with problems and doesn’t serve the needs of fans, teams, artists, or venues,” Sen. Cornyn said in a statement. “This legislation would rebuild trust in the ticketing system by cracking down on bots and others who take advantage of consumers through price gouging and other predatory practices and increase price transparency for ticket purchasers.”

Klobuchar added, “Buying a ticket to see your favorite artist or team is out of reach for too many Americans. Bots, hidden fees, and predatory practices are hurting consumers whether they want to catch a home game, an up-and-coming artist or a major headliner like Taylor Swift or Bad Bunny. From ensuring fans get refunds for canceled shows to banning speculative ticket sales, this bipartisan legislation will improve the ticketing experience.”

The Fans First Act boasts more than a dozen reform proposals aimed at protecting consumers, including requiring sites like StubHub and Ticketmaster to disclose the full price of tickets, including fees, at the beginning of the sale and to specify whether tickets are being sold by a primary seller or a reseller.

The bill would also strengthen the Better Online Ticket Sales (BOTS) Act, signed into law in 2016 by President Barack Obama, which prohibits the use of automated bots to purchase tickets online. It would additionally require sellers and resellers to provide proof of purchase to consumers within 24 hours of purchase and refund consumers the full cost of their tickets when events are canceled. If passed, the bill would also commission a Government Accountability Office study to investigate the marketplace and make recommendations.

Among other provisions, the Fans First Act would also ban the sale of tickets that a reseller claims to possess but does not actually acquire until after securing a buyer. Known as speculative ticket sales, the practice is often the subject of complaints from consumers who later learn they significantly overpaid for tickets.

Those who violate the law could face civil penalties and be added to a reporting website for fans to file complaints about illegal ticket sales tactics that would then be investigated by the Federal Trade Commission and state attorneys general.

“Fans have become increasingly frustrated with how difficult it has been to obtain affordable tickets to see their favorite artists perform,” said Sen. Blackburn. “Bots are snatching up tickets and selling them for exorbitant prices on secondary markets, while some ticketing companies are selling speculative event tickets that don’t even exist. This bipartisan legislation builds upon my work to safeguard artists and their fans in the online ticket marketplace.”

Sen. Luján stated that the “current ticketing system is limiting access to live entertainment,” adding, “That’s why I’m proud to join my colleagues in introducing the Fans First Act to ensure the sale of tickets is accessible to all consumers.” Sen. Wicker added, “Deceptive ticketing practices have become far too common. This bipartisan effort would result in more transparency and less price gouging.”

The Fans First Act is expected to be heard by the Senate Committee on Commerce, Science, and Transportation. Earlier this week, the U.S. House Subcommittee on Energy and Commerce passed a similar bill called the Speculative Ticketing Oversight and Prohibition (STOP) Act, which is now eligible for a vote by the full House.

The STOP Act also bans speculative ticketing, and like the Fans First Act, addresses a range of deceptive ticketing practices and pricing transparency issues. Live Nation and other groups have also expressed support for the STOP Act.

Earlier today, Live Nation officials issued a statement endorsing the Fans First Act.

“We support the Fans First Act and welcome legislation that brings positive reform to live event ticketing. We believe it’s critical Congress acts to protect fans and artists from predatory resale practices, and have long supported a federal all-in pricing mandate, banning speculative ticketing and deceptive websites, as well as other measures. We look forward to our continued work with policymakers to advocate for even stronger reforms and enforcement,” the statement reads.

Recording Academy CEO Harvey Mason jr. also came out with a statement supporting the bill on Friday. “With the introduction of the Fans First Act today, the Recording Academy applauds Senators Klobuchar, Cornyn, Blackburn, Luján, Wicker and Welch for taking this important step towards comprehensive ticketing reform,” he said. “As we work together to improve the ticket marketplace, we urge Congress to act on this bill quickly and continue its effort to protect both artists and fans by increasing transparency and limiting bad actors that take away from the joyous experience of live music.”

Dennis Kooker, president of global digital business at Sony Music Entertainment, represented the music business at Sen. Chuck Schumer’s (D-NY) seventh artificial intelligence insight forum in Washington, D.C. on Wednesday (Nov. 29). In his statement, Kooker implored the government to act on new legislation to protect copyright holders to ensure the development of “responsible and ethical generative AI.”

The executive revealed that Sony has already sent “close to 10,000 takedowns to a variety of platforms hosting unauthorized deepfakes that SME artists asked us to take down.” He says these platforms, including streamers and social media sites, are “quick to point to the loopholes in the law as an excuse to drag their feet or to not take the deepfakes down when requested.”

Presently, there is no federal law that explicitly requires platforms to take down songs that impersonate an artist’s voice. Platforms are only obligated to do so when a copyright (in a sound recording or a musical work) is infringed, as stipulated by the Digital Millennium Copyright Act (DMCA). Interest in using AI to clone the voices of famous artists has grown rapidly since a song featuring AI impersonations of Drake and The Weeknd went viral earlier this year. That track, called “Heart on My Sleeve,” has become one of the most prominent examples of music-related AI.

A celebrity’s voice and likeness can be protected by “right of publicity” laws that safeguard it from unauthorized exploitation, but this right is limited. Its protections vary state-to-state and are even more limited post-mortem. In May, Billboard reported that the major labels — Sony, Universal Music Group and Warner Music Group — had been in talks with Spotify, Apple Music and Amazon Music to create a voluntary system for takedowns of right of publicity violations, much like the one laid out by the DMCA, according to sources at all three majors. It is unclear from Kooker’s remarks if the platforms that are dragging their feet on voice clone removals include the three streaming services that previously took part in these discussions.

In his statement, Kooker asked the Senate forum to create a federal right of publicity to provide stronger and more uniform protection for artists. “Creators and consumers need a clear unified right that sets a floor across all fifty states,” he said. This echoes what UMG general counsel/executive vp of business and legal affairs Jeffery Harleston asked of the Senate during a July AI hearing.

Kooker expressed his “sincere gratitude” to Sens. Chris Coons, Marsha Blackburn, Amy Klobuchar and Thom Tillis for releasing a draft bill called the No FAKES (“Nurture Originals, Foster Art, and Keep Entertainment Safe”) Act in October, which would create a federal property right for one’s voice or likeness and protect against unauthorized AI impersonations. At its announcement, the No FAKES Act drew resounding praise from music business organizations, including the RIAA and the American Association of Independent Music.

Kooker also stated that in this early stage many available generative AI products today are “not expanding the business model or enhancing human creativity.” He pointed to a “deluge of 100,000 new recordings delivered to [digital service providers] every day” and said that some of these songs are “generated using generative AI content creation tools.” He added, “These works flood the current music ecosystem and compete directly with human artists…. They reduce and diminish the earnings of human artists.”

“We have every reason to believe that various elements of AI will become routine in the creative process… [as well as] other aspects of our business,” like marketing and royalty accounting, Kooker continued. He said Sony Music has already started “active conversations” with “roughly 200” different AI companies about potential partnerships.

Still, he stressed five key issues remain that need to be addressed to “assure a thriving marketplace for AI and music.” Read his five points, as written in his prepared statement, below:

Assure Consent, Compensation, and Credit. New products and businesses built with music must be developed with the consent of the owner and appropriate compensation and credit. It is essential to understand why the training of AI models is being done, what products will be developed as a result, and what the business model is that will monetize the use of the artist’s work. Congress and the agencies should assure that creators’ rights are recognized and respected.

Confirm That Copying Music to Train AI Models is Not Fair Use. Even worse are those that argue that copyrighted content should automatically be considered fair use so that protected works are never compensated for usage and creators have no say in the products or business models that are developed around them and their work. Congress should assure and agencies should presume that reproducing music to train AI models, in itself, is not a fair use.

Prevent the Cloning of Artists’ Voices and Likenesses Without Express Permission. We cannot allow an artist’s voice or likeness to be cloned for use without the express permission of the artist. This is a very personal decision for the artist. Congress should pass into law effective federal protections for name, image, and likeness.

Incentivize Accurate Record-Keeping. Correct attribution will be a critical element to artists being paid fairly and correctly for new works that are created. In addition, rights can only be enforced around the training of AI when there are accurate records about what is being copied. Otherwise, the inability to enforce rights in the AI marketplace equates to a lack of rights at all, producing a dangerous imbalance that prevents a thriving ecosystem. This requires strong and accurate record keeping by the generative AI platforms, a requirement that urgently needs legislative support to ensure incentives are in place so that it happens consistently and correctly.

Assure Transparency for Consumers and Artists. Transparency is necessary to clearly distinguish human-created works from AI-created works. The public should know, when they are listening to music, whether that music was created by a human being or a machine.

A bipartisan group of U.S. senators released draft legislation Thursday (Oct. 12) aimed at protecting musical artists and others from artificial intelligence-generated deepfakes and other replicas of their likeness, like this year’s infamous “Fake Drake” song.

The draft bill, labeled the Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or NO FAKES Act, would create a federal right for artists, actors and others to sue those who create “digital replicas” of their image, voice or visual likeness without permission.

In announcing the bill, Sen. Chris Coons (D-Del.) specifically cited the April release of “Heart On My Sleeve,” an unauthorized song that featured AI-generated fake vocals from Drake and The Weeknd.

“Generative AI has opened doors to exciting new artistic possibilities, but it also presents unique challenges that make it easier than ever to use someone’s voice, image, or likeness without their consent,” Coons said in a statement. “Creators around the nation are calling on Congress to lay out clear policies regulating the use and impact of generative AI.”

The draft bill quickly drew applause from music industry groups. The RIAA said it would push for a final version that “effectively protects against this illegal and immoral misappropriation of fundamental rights that protect human achievement.”

“Our industry has long embraced technology and innovation, including AI, but many of the recent generative AI models infringe on rights — essentially instruments of theft rather than constructive tools aiding human creativity,” the group wrote in the statement.

The American Association of Independent Music offered similar praise: “Independent record labels and the artists they work with are excited about the promise of AI to transform how music is made and how consumers enjoy art, but there must be guardrails to ensure that artists can make a living and that labels can recoup their investments.” The group said it would push to make sure that the final bill’s provisions were “accessible to small labels and working-class musicians, not just the megastars.”

A person’s name and likeness, including their distinctive voice, are already protected in most states by the so-called right of publicity, which allows individuals to control how their identity is commercially exploited by others. But those rights are currently governed by a patchwork of state statutes and common-law systems.

The NO FAKES Act would create a nationwide property right in your image, voice, or visual likeness, allowing an individual to sue anyone who produced a “newly-created, computer-generated, electronic representation” of it. Unlike many state-law systems, that right would not expire at death and could be controlled by a person’s heirs for 70 years after their passing.

A tricky balancing act for any publicity rights legislation is the First Amendment and its protections for free speech. In Thursday’s announcement, the NO FAKES Act’s authors said the bill would include specific carveouts for replicas used in news coverage, parody, historical works or criticism.

“Congress must strike the right balance to defend individual rights, abide by the First Amendment, and foster AI innovation and creativity,” Coons said.

The draft was co-authored by Sen. Marsha Blackburn (R-Tenn.), Sen. Amy Klobuchar (D-Minn.), and Sen. Thom Tillis (R-N.C.).

Sen. John Fetterman found himself trending on X, formerly known as Twitter, Tuesday morning (September 19) in the wake of the uproar over the Pennsylvanian’s casual style of dress, despite it not harming a soul. Sen. Fetterman, savvy on social media, has been waving off the potshots from conservative talking heads and his GOP counterparts in elected office with ease.
Sen. John Fetterman was the target of a Fox News report regarding his style of dress, which includes hoodies, unbuttoned and untucked shirts, and basketball shorts, none of which hindered his ability to serve the people of his home state.
Quoting the tweet from Fox News, Fetterman wrote, “I figure if I take up vaping and grabbing the hog during a live musical, they’ll make me a folk hero.”

This was in reference to Rep. Lauren Boebert vaping during a Beetlejuice musical in Denver, Colo.
Rep. Marjorie Taylor Greene then used X to fire a shot at Fetterman, of course neglecting her own gaffes and stances while doing so.
“The Senate no longer enforcing a dress code for Senators to appease Fetterman is disgraceful,” she writes. “Dress code is one of society’s standards that set etiquette and respect for our institutions. Stop lowering the bar!”

Florida Gov. Ron DeSantis, he of tumbling poll numbers in his bid to win the White House back for the GOP, took shots at Fetterman’s clothing while on the campaign trail in Jacksonville on Monday (September 18), you know, hard-hitting policy talk that the American people care about.
“So he would campaign in that, which is your prerogative, right? I mean, if that’s what you want to do, but to show up in the United States Senate with that, and not have the decency to put on proper attire. I think it’s disrespectful to the body,” DeSantis said.
Quoting a video of Gov. DeSantis’ stump speech, Fetterman wrote, “I dress like he campaigns,” shutting it all down.

Adding to all of this are MAGA tinfoil-hat types who think that recent appearances by Sen. Fetterman are actually a body double, and he’s handling those doofuses with the same smoothness as well.

Salute to Sen. John Fetterman. Keep scrolling to check out the reactions and chatter from all sides below.


Universal Music Group general counsel/executive vp of business and legal affairs, Jeffery Harleston, spoke as a witness in a Senate Judiciary Committee hearing on AI and copyright on Wednesday (July 12) to represent the music industry. In his remarks, the executive called for a “federal right of publicity” — the state-by-state right that protects artists’ likenesses, names, and voices — as well as for “visibility into AI training data” and for “AI-generated content to be labeled as such.”

Harleston was joined by other witnesses including Karla Ortiz, a conceptual artist and illustrator who is waging a class action lawsuit against Stability AI; Matthew Sag, professor of artificial intelligence at Emory University School of Law; Dana Rao, executive vp/general counsel at Adobe; and Ben Brooks, head of public policy at Stability AI.

“I’d like to make four key points to you today,” Harleston began. “First, copyright, artists, and human creativity must be protected. Art and human creativity are central to our identity.” He clarified that AI is not necessarily always an enemy to artists, and can be used in “service” to them as well. “If I leave you with one message today, it is this: AI in the service of artists and creativity can be a very, very good thing. But AI that uses, or, worse yet, appropriates the work of these artists and creators and their creative expression, their name, their image, their likeness, their voice, without authorization, without consent, simply is not a good thing,” he said.

Second, he noted the challenges that generative AI poses to copyright. In written testimony, he noted the concern of “AI-generated music being used to generate fraudulent plays on streaming services, siphoning income from human creators.” And while testifying at the hearing, he added, “At Universal, we are the stewards of tens of thousands, if not hundreds of thousands, of copyrighted creative works from our songwriters and artists, and they’ve entrusted us to honor, value and protect them. Today, they are being used to train generative AI systems without authorization. This irresponsible AI is violative of copyright law and completely unnecessary.”

Training is one of the most contentious areas of generative AI for the music industry. In order to learn how to generate a human voice, a drum beat or lyrics, an AI model will train itself on up to billions of data points. Often this data contains copyrighted material, like sound recordings, used without the owner’s knowledge or compensation. And while many believe this should be considered a form of copyright infringement, the legality of using copyrighted works as training data is still being determined in the United States and other countries.

The topic is also the source of Ortiz’s class action lawsuit against Stability AI. Her complaint, filed in California federal court along with two other visual artists, alleges that the “new” images generated by Stability AI’s Stable Diffusion model used their art “without the consent of the artists and without compensating any of those artists,” which they feel makes any resulting generation from the AI model a “derivative work.”

In his spoken testimony, Harleston pointed to today’s “robust digital marketplace” — including social media sites, apps and more — in which “thousands of responsible companies properly obtained the rights they need to operate. There is no reason that the same rules should not apply equally to AI companies.”

Third, he reiterated that “AI can be used responsibly…just like other technologies before.” Among his examples of positive uses of AI, he pointed to Lee Hyun [aka MIDNATT], a K-pop artist distributed by UMG who used generative AI to release the same single in six languages, in his own voice, on the same day. “The generative AI tool extended the artist’s creative intent and expression with his consent to new markets and fans instantly,” Harleston said. “In this case, consent is the key,” he continued, echoing Ortiz’s complaint.

While making his final point, Harleston urged Congress to act in several ways — including by enacting a federal right of publicity. Currently, rights of publicity vary widely state by state, and many states’ versions include limitations, including less protection for some artists after their deaths.

The shortcomings of this state-by-state system were highlighted when an anonymous internet user called Ghostwriter posted a song, apparently using AI to mimic the voices of Drake and The Weeknd, called “Heart On My Sleeve.” The track’s uncanny rendering of the two major stars immediately went viral, forcing the music business to confront the new, fast-developing concern of AI voice impersonation.

A month later, sources told Billboard that the three major label groups (UMG, Warner Music Group and Sony Music) had been in talks with the big music streaming services to allow them to cite “right of publicity” violations as a reason to take down songs with AI vocals. Removing songs based on right of publicity violations is not required by law, so any participation by the streamers would be voluntary.

“Deep fakes, and/or unauthorized recordings or visuals of artists generated by AI, can lead to consumer confusion, unfair competition against the artists that actually were the original creator, market dilution and damage to the artists’ reputation or potentially irreparably harming their career. An artist’s voice is often the most valuable part of their livelihood and public persona. And to steal it, no matter the means, is wrong,” said Harleston.

In his written testimony, Harleston went deeper, stating UMG’s position that “AI generated, mimicked vocals trained on vocal recordings from our copyrighted recordings go beyond Right of Publicity violations… copyright law has clearly been violated.” Many AI voice uses circulating on the internet involve users mashing up a previously released song with a different artist’s voice on top. These types of uses, Harleston wrote, mean “there are likely multiple infringements occurring.”

Harleston added that “visibility into AI training data is also needed. If the data on AI training is not transparent, the potential for a healthy marketplace will be stymied as information on infringing content will be largely inaccessible to individual creators.”

Another witness at the hearing raised the idea of an “opt-out” system so that artists who do not wish to be part of an AI’s training data set have the option of removing themselves. Already, Spawning, a music-tech start-up, has launched a website to put this possible remedy into practice for visual art. Called HaveIBeenTrained.com, the service helps creators opt out of training data sets commonly used by an array of AI companies, including Stability AI, which previously agreed to honor the HaveIBeenTrained.com opt-outs.

Harleston, however, said he did not believe opt-outs are enough. “It will be hard to opt out if you don’t know what’s been opted in,” he said. Spawning co-founder Mat Dryhurst previously told Billboard that HaveIBeenTrained.com is working on an opt-in tool, though this product has yet to be released.

Finally, Harleston urged Congress to label AI-generated content. “Consumers deserve to know exactly what they’re getting,” he said.

A U.S. senator representing Music City had tough questions about artificial intelligence’s impact on the music industry during a Congressional hearing on Tuesday, at one point asking the CEO of the company behind ChatGPT to commit to not using copyrighted songs to train future machines.

At a hearing before the Senate Judiciary Committee about potential regulation for AI, Sen. Marsha Blackburn (R-Tenn.) repeatedly grilled Sam Altman, CEO of OpenAI, over how songwriters and musical artists should be compensated when their works are used by AI companies.

Opening her questioning, Blackburn said she had used OpenAI’s Jukebox to create a song that mimicked Garth Brooks – and that she was clearly concerned about how the singer’s music and voice had been used to create such a tool.

“You’re training it on these copyrighted songs,” Blackburn told Altman. “How do you compensate the artist?”

“If I can go in and say ‘write me a song that sounds like Garth Brooks,’ and it takes part of an existing song, there has to be compensation to that artist for that utilization and that use,” Blackburn said. “If it was radio play, it would be there. If it was streaming, it would be there.”

At one point, Blackburn demanded a firm answer: “Can you commit, as you’ve done with consumer data, not to train [AI models] on artists’ and songwriters’ copyrighted works, or use their voices and their likenesses without first receiving their consent?”

Though Altman did not directly answer that question, he repeatedly told the senator that artists “deserve control” over how their copyrighted music and their voices were used by AI companies.

“We think that content creators need to benefit from this technology,” Altman told the committee. “Exactly what the economic model is, we’re still talking to artists and content owners about what they want. I think there’s a lot of ways this can happen. But very clearly, no matter what the law is, the right thing to do is to make sure people get significant upside benefit from this new technology.”

Blackburn’s questioning came amid a far broader discussion of the potential risks posed by AI, including existential threats to democracy, major harm to the labor market, and the widespread proliferation of misinformation. One witness, a New York University professor and expert in artificial intelligence, told the lawmakers that it poses problems “on a scale that humanity has not seen before.”

The music industry, too, is worried about AI-driven disruption. Last month, a new song featuring AI-generated fake vocals from Drake and The Weeknd went viral, underscoring growing concerns about AI’s impact on music and highlighting the legal uncertainties that surround it.

One of the biggest open questions is over whether copyrighted music can be used to train AI platforms – the process whereby machines “learn” to spit out new creations by ingesting millions of existing works. Major labels and other industry players have already said that such training is illegal, and cutting-edge litigation against the creators of such platforms could be coming soon.

At Tuesday’s hearing, in repeatedly asking Altman to weigh in on that question, Blackburn drew historical parallels to the last major technological disruption to wreak havoc on the music industry — a scenario that also posed novel legal and policy questions.

“We lived through Napster,” Blackburn said. “That was something that really cost a lot of artists a lot of money.”

Though he voiced support for compensation for artists, Altman did not get into specifics, saying that many industry stakeholders had “different opinions” on how creators should be paid. When Blackburn asked him if he thought the government should create an organization similar to SoundExchange – the group that collects certain blanket royalties for streaming – Altman said he wasn’t familiar with it.

“You’ve got your team behind you,” Blackburn said. “Get back to me on that.”