

European Union regulators opened investigations into Apple, Google and Meta on Monday, the first cases under a sweeping new law designed to stop Big Tech companies from cornering digital markets. The European Commission, the 27-nation bloc’s executive arm, said it was investigating the companies for “non-compliance” with the Digital Markets Act.


The Digital Markets Act, which took full effect earlier this month, is a broad rulebook targeting Big Tech “gatekeeper” companies that provide “core platform services.” Those companies must comply with a set of do’s and don’ts, under threat of hefty financial penalties or even the breakup of their businesses. The rules have the broad but vague goal of making digital markets “fairer” and “more contestable” by breaking up closed tech ecosystems that lock consumers into a single company’s products or services.

The commission has heard complaints that tech companies’ measures to comply have fallen short, European Commission Vice President Margrethe Vestager, the bloc’s competition chief, said at a press briefing in Brussels. “Today, we decided to investigate a number of these suspected non-compliance issues. And as we unearth other problems, we will tackle those too.”


The companies have been ordered to hold on to certain documents that the commission can access in current and future investigations, she said.

Regulators are looking into whether Google and Apple are fully complying with the DMA’s rules requiring tech companies to allow app developers to direct users to cheaper options available outside their app stores. The commission said it’s concerned the two companies are imposing “various restrictions and limitations” including charging recurring fees that prevent apps from freely promoting offers.

Google is also facing scrutiny for not complying with DMA provisions that prevent tech giants from giving preference to their own services over rivals. The commission said it is concerned Google’s measures will result in third-party services listed on Google’s search results page not being treated “in a fair and non-discriminatory manner.”

Google said that it has made “significant changes” to the way its services operate in Europe to comply with the DMA. “We will continue to defend our approach in the coming months,” Google’s director of competition, Oliver Bethell, said.

The commission is also investigating whether Apple is doing enough to allow iPhone users to easily change web browsers.

Apple said it’s confident that its plan complies with the DMA, and it will “continue to constructively engage with the European Commission as they conduct their investigations.” The company said it has created a wide range of new developer capabilities, features, and tools to comply with the regulation.

The commission is also looking into Meta’s option for European users to pay a monthly fee for ad-free versions of Facebook or Instagram, so they can avoid having their personal data used to target them with online ads. “The Commission is concerned that the binary choice imposed by Meta’s ‘pay or consent’ model may not provide a real alternative in case users do not consent, thereby not achieving the objective of preventing the accumulation of personal data by gatekeepers,” it said.

Meta said it will “engage constructively” with the Commission. “Subscriptions as an alternative to advertising are a well-established business model across many industries, and we designed Subscription for No Ads to address several overlapping regulatory obligations, including the DMA,” it said in a prepared statement.

The commission said it aims to wrap up its investigations within 12 months.

LONDON — Sweeping new laws regulating the use of artificial intelligence (AI) in Europe, including controls around the use of copyrighted music, have been approved by the European Parliament, following fierce lobbying from both the tech and music communities.
Members of the European Parliament (MEPs) voted in favor of the EU’s Artificial Intelligence Act by a clear majority of 523 votes for, 46 against and 49 abstentions. The “world first” legislation, which was first proposed in April 2021 and covers a wide range of AI applications including biometric surveillance and predictive policing, was provisionally approved in December, but Wednesday’s vote formally establishes its passage into law.

The act places a number of legal and transparency obligations on tech companies and AI developers operating in Europe, including those working in the creative sector and music business. Among them is the core requirement that companies using generative AI or foundation AI models like OpenAI’s ChatGPT or Anthropic’s Claude 2 provide detailed summaries of any copyrighted works, including music, that they have used to train their systems.


Significantly, the law’s transparency provisions apply regardless of when or where in the world a tech company scraped its data. For instance, even if an AI developer scraped copyrighted music and/or trained its systems in a non-EU country — or bought data sets from outside the 27-member bloc — as soon as those systems are used or made available in Europe, the company is required to make publicly available a “sufficiently detailed summary” of all copyright-protected music it has used to create AI works.

There is also a requirement that any training data sets used in generative AI music or audio-visual works are watermarked, so there is a traceable path for rights holders to track and block the illegal use of their catalog.

In addition, content created by AI, as opposed to human works, must be clearly labeled as such, while tech companies have to ensure that their systems cannot be used to generate illegal and infringing content.

Large tech companies that break the rules — which govern all applications of AI inside the 27-member bloc of EU countries, including so-called “high risk” uses — will face fines of up to €35 million or 7% of global annual turnover. Start-up businesses or smaller tech operations will receive proportionate financial punishments.

Speaking ahead of Wednesday’s vote, which took place in Strasbourg, co-rapporteur Brando Benifei said the legislation means that “unacceptable AI practices will be banned in Europe and the rights of workers and citizens will be protected.” 

Co-rapporteur Dragos Tudorache called the AI Act “a starting point for a new model of governance built around technology.” 

European legislators first proposed introducing regulation of artificial intelligence in 2021, although it was the subsequent launch of ChatGPT — followed by the high-profile release of “Heart on My Sleeve,” a track that featured AI-powered imitations of vocals by Drake and The Weeknd, last April — that made many music executives sit up and pay closer attention to the technology’s potential impact on the record business. 

In response, lobbyists stepped up their efforts to convince lawmakers to add transparency provisions around the use of music in AI – a move which was fiercely opposed by the technology industry, which argued that tougher regulations would put European AI developers at a competitive disadvantage.

Now that the AI Act has been approved by the European Parliament, the legislation will undergo a number of procedural rubber-stamping stages before it is published in the EU’s Official Journal — most likely in late April or early May — with its regulations coming into force 20 days after that. 

There are, however, tiered timelines for tech companies to comply with its terms, and some of its provisions are not fully applicable for up to two years after its enactment. (The rules governing existing generative AI models commence after 12 months, although any new generative AI companies or models entering the European market after the Act has come into force have to comply with its regulations immediately.)

In response to Wednesday’s vote, a coalition of European creative and copyright organizations, including global recorded-music trade body IFPI and international music publishing trade group ICMP, issued a joint statement thanking regulators and MEPs for the “essential role they have played in supporting creators and rightsholders.”

“While these obligations provide a first step for rightsholders to enforce their rights, we call on the European Parliament to continue to support the development of responsible and sustainable AI by ensuring that these important rules are put into practice in a meaningful and effective way,” said the 18 signatories, which also included European independent labels trade association IMPALA, European Authors Society GESAC and CISAC, the international trade organization for copyright collecting societies.

If bosses at the world’s biggest technology companies were still in any way doubting the European Union’s commitment towards regulating the digital marketplace, the 1.8 billion euro ($1.95 billion) fine levied against Apple on Monday (March 4) by the European Commission for breaking competition laws over music streaming served as a powerful statement of intent.
This week, more new EU rules come into force governing how the largest online platforms operate in Europe, now that the deadline for complying with the Digital Markets Act (DMA) has passed. 

Beginning today (March 7), the six tech giants designated “gatekeepers” by the European Commission — Apple, Google parent company Alphabet, Amazon, TikTok-owner ByteDance, Meta and Microsoft – are required to comply with a raft of legislative changes designed to rein in their global dominance. 


They include outlawing companies favoring in-house services at the expense of third-party providers and forcing platforms to offer other businesses, such as apps, access to the data they generate – allowing smaller services to contact their customers directly and making it easier for users to switch services.

The laws are enforceable by fines of up to 20% of total worldwide turnover (i.e., gross revenue) for repeat infringers or, in extreme cases, the “last resort option” of forced divestments and the breakup of businesses.

THIS IS FINE

The changes are already having a significant impact on digital music services and, in turn, the global record business. 

In January, Apple announced that it will begin allowing European users to download app stores other than the company-operated one that comes installed on iPhones. It will additionally lower the fees it charges developers for purchases made through the App Store, reducing commission from the existing 15%-30% range to between 10% and 17% for developers using the company’s payment-processing system.

However, Apple’s plans to charge “high volume” services with over one million users a €0.50 ($0.54) “Core Technology Fee” per download, per year, for using alternatives to the App Store have been heavily criticized by a number of European businesses, including Spotify and Deezer.

“Apple’s new terms not only disregard both the spirit and letter of the law, but if left unchanged, make a mockery of the DMA,” said the streaming services in an open letter to the European Commission, sent last week and also signed by 32 other European digital companies and associations, including trade body Digital Music Europe.

The new fee structure, which only applies in the 27 EU member states, will deter app developers from opting into the revised terms “and will hamper fair competition,” say Spotify and Deezer, calling on regulators to take “swift, timely and decisive action against Apple.” (In January, Spotify stated in a company blog post that the new fees “equates for us to being the same or worse as under the old rules.”)

Similar anti-competition concerns were behind the European Commission’s decision to fine Apple 1.8 billion euros at the start of March, following longstanding complaints from Spotify over Apple’s restrictions on outside developers and the 30% fee it charges them on all purchases made through iOS apps. (Apple has said it will appeal the fine, which was issued under existing EU competition rules rather than the Digital Markets Act.)

Defending its response to the new EU provisions, Apple estimates that less than 1% of developers will pay the Core Technology Fee and warns that the DMA brings greater risks to users and developers by compromising its ability to detect malware, fraud and illicit content in external apps.

NOT JUST APPLE

Other so-called gatekeepers — defined by policymakers as platforms with an annual turnover of more than 7.5 billion euros ($8.1 billion) and more than 45 million monthly active users in the EU — are also making sweeping changes as a result of the DMA.

Aside from Apple, music executives will be paying most attention to how ByteDance, the Chinese owner of TikTok, responds to the law’s provisions. In November, the company launched an appeal against the EU’s classification of TikTok as a “gatekeeper,” arguing that the platform is a “challenger, not an incumbent, in the digital advertising market” and that the new rules could hamper its ability to “remain competitive and grow.”

Despite the ongoing legal challenge, TikTok has already taken a number of steps to comply with the terms of the DMA, including the launch of enhanced data portability tools that allow developers to download and export data profiles, followers and posts from TikTok to other services with users’ permission. These changes are being introduced now to European users, TikTok announced in a blog post on March 4, “with plans to roll out globally in the near future.”

In January, Google and YouTube parent company Alphabet announced that it will allow users to pick their default browser and provide more links to competing sites when searching Google – although, like Apple, Alphabet’s compliance with the DMA has been questioned.

Posting on X (formerly Twitter) this month, Epic Games CEO Tim Sweeney criticized the tech giant for imposing a commission fee of up to 27% on any app purchases made without using Google’s payment services. (Google/Alphabet has previously been issued three major fines totaling 8.2 billion euros by the EU over antitrust issues.)

Meanwhile, Meta is allowing users to separate their Facebook and Instagram accounts to prevent personal information being shared for targeted ads. Amazon is modifying its Amazon Ads service to provide stronger data protections for customers, and Microsoft is implementing changes to its Windows operating system.

The terms of the Digital Markets Act only apply to companies and services operating in the 27-member EU bloc, but their impact extends far beyond it. Following the EU’s lead, similar regulations to rein in tech companies’ dominance are being drawn up in several other nations, including Japan, South Korea, India, Brazil, Australia and the United Kingdom.

What meaningful impact the DMA or comparable international legislation will actually have on curbing Big Tech — and the music companies that either drive or rely upon them to reach audiences — could take years to be felt, if at all, but EU regulators say they are not shying away from the challenge.

“We are looking very carefully at how companies are complying [with the DMA],” the European Commission recently said in a statement, “and once we have full enforcement powers will not hesitate to act.”

The European Union leveled its first antitrust penalty against Apple on Tuesday, fining the U.S. tech giant nearly $2 billion for breaking the bloc’s competition laws by unfairly favoring its own music streaming service over rivals.
Apple banned app developers from “fully informing iOS users about alternative and cheaper music subscription services outside of the app,” said the European Commission, the 27-nation bloc’s executive arm and top antitrust enforcer.

That is illegal under EU antitrust rules. Apple behaved this way for almost a decade, which meant many users paid “significantly higher prices for music streaming subscriptions,” the commission said.

The 1.8 billion-euro fine follows a long-running investigation triggered by a complaint from Swedish streaming service Spotify five years ago.


The EU has led global efforts to crack down on Big Tech companies, including a series of multibillion-dollar fines for Google and charging Meta with distorting the online classified ad market. The commission also has opened a separate antitrust investigation into Apple’s mobile payments service.

The commission’s investigation initially centered on two concerns. One was the iPhone maker’s practice of forcing app developers that are selling digital content to use its in-house payment system, which charges a 30% commission on all subscriptions.

But the EU later dropped that to focus on how Apple prevents app makers from telling their users about cheaper ways to pay for subscriptions that don’t involve going through an app.

The investigation found that Apple banned streaming services from telling users how much their subscription offers cost outside of the app, from including links in their apps to alternative payment options, and even from emailing users to tell them about different pricing options.

The fine comes the same week that new EU rules are set to kick in that are aimed at preventing tech companies from dominating digital markets.

The Digital Markets Act, due to take effect Thursday, imposes a set of do’s and don’ts on “gatekeeper” companies including Apple, Meta, Google parent Alphabet, and TikTok parent ByteDance — under threat of hefty fines.

The DMA’s provisions are designed to prevent tech giants from the sort of behavior that’s at the heart of the Apple investigation. Apple has already revealed how it will comply, including allowing iPhone users in Europe to use app stores other than its own and enabling developers to offer alternative payment systems.

The commission also has opened a separate antitrust investigation into Apple’s mobile payments service, and the company has promised to open up its tap-and-go mobile payment system to rivals in order to resolve it.

The European Union is looking into whether Elon Musk’s online platform X breached tough new social media regulations in the first such investigation since the rules designed to make online content less toxic took effect.


“Today we open formal infringement proceedings against @X” under the Digital Services Act, European Commissioner Thierry Breton said Monday in a post on the platform formerly known as Twitter.

“The Commission will now investigate X’s systems and policies related to certain suspected infringements,” spokesman Johannes Bahrke told a press briefing in Brussels. “It does not prejudge the outcome of the investigation.”

The investigation will look into whether X failed to do enough to curb the spread of illegal content and whether measures to combat “information manipulation,” especially through its crowd-sourced Community Notes fact-checking feature, were effective.

The 27-nation EU also will examine whether X was transparent enough with researchers and will look into suspicions that its user interface, including for its blue check subscription service, has a “deceptive design.”

“X remains committed to complying with the Digital Services Act, and is cooperating with the regulatory process,” the company said in a statement. “It is important that this process remains free of political influence and follows the law. X is focused on creating a safe and inclusive environment for all users on our platform, while protecting freedom of expression, and we will continue to work tirelessly towards this goal.”

A raft of big tech companies faced stricter scrutiny after the EU’s Digital Services Act took effect earlier this year, threatening penalties of up to 6% of their global revenue — which could amount to billions — or even a ban from the EU.

The DSA is a set of far-reaching rules designed to keep users safe online and stop the spread of harmful content that’s either illegal — such as child sexual abuse or terrorism content — or violates a platform’s terms of service, such as promotion of genocide or anorexia.

The EU has already called out X as the worst place online for fake news, and officials have exhorted owner Musk, who bought the platform a year ago, to do more to clean it up. The European Commission, the EU’s executive arm, quizzed X over its handling of hate speech, misinformation and violent terrorist content related to the Israel-Hamas war after the conflict erupted.

Legislators have provisionally agreed to sweeping new laws that will regulate the use of artificial intelligence (AI) in Europe, including controls around the use of copyrighted music.
The deal between policy makers from the European Union Parliament, Council and European Commission on the EU’s Artificial Intelligence Act was reached late on Friday night in Brussels local time following months of negotiations and amid fierce lobbying from the music and tech industries.   

The draft legislation is the world’s first comprehensive set of laws regulating the use of AI and places a number of legal obligations on technology companies and AI developers, including those working in the creative sector and music business.   

The precise technical details of those measures are still being finalized by EU policy makers, but earlier versions of the bill decreed that companies using generative or foundation AI models like OpenAI’s ChatGPT or Anthropic’s Claude 2 would be required to provide summaries of any copyrighted works, including music, that they use to train their systems. 

The AI Act will also force developers to clearly identify content that is created by AI, as opposed to human works, before it is placed on the market. In addition, tech companies will have to ensure that their systems are designed in such a way that prevents them from generating illegal content.

Large tech companies that break the rules — which govern all applications and uses of AI inside the 27-member bloc of EU countries — will face fines of up to €35 million or 7% of global annual turnover. Start-up businesses or smaller tech operations will receive proportionate financial punishments, said the European Commission.

Governance will be carried out by national authorities, while a new European AI Office will be created to supervise the enforcement of the new rules on general purpose AI models. 

President of the European Commission Ursula von der Leyen called the agreement “a historic moment” that “will make a substantial contribution to the development of global rules and principles for human-centric AI.” 

Responding to the announcement, Tobias Holzmüller, CEO of German collecting society GEMA, said the deal reached by the European government was a welcome “step in the right direction” but cautioned that its rules and provisions “need to be sharpened further on a technical level.”  

“The outcome must be a clearly formulated transparency regime that obliges AI providers to submit detailed evidence on the contents they used to train their systems,” said Holzmüller.  

Representatives of the technology industry, which had lobbied to weaken the AI Act’s transparency provisions, criticized the deal and warned that it was likely to put European AI developers at a competitive disadvantage.  

Daniel Friedlaender, Senior Vice President of the Computer and Communications Industry Association (CCIA), which counts Alphabet, Apple, Amazon and Meta among its members, said in a statement that “crucial details” of the AI act are still missing “with potentially disastrous consequences for the European economy.”  

“The final AI Act lacks the vision and ambition that European tech startups and businesses are displaying right now,” said CCIA Europe’s Policy Manager, Boniface de Champris. He warned that, if passed, the legislation might “end up chasing away the European champions that the EU so desperately wants to empower.” 

Now that a political agreement has been reached on the AI Act, legislators will spend the coming weeks finalizing the exact technical details of the regulation and translating its terms for the 27 EU member countries.

The final text then needs to be approved by the European Council and Parliament, with a decisive vote not expected to take place until early next year, possibly as late as March. If passed, the act will be applicable two years after its entry into force, except for some specific provisions: bans will apply after six months, while the rules on generative AI models will begin after 12 months.

In a statement, international recorded music trade organization IFPI said the first-of-its-kind legislation provides “a constructive and encouraging framework” for regulation of the nascent technology.   

“AI offers creators both opportunities and risks,” said an IFPI spokesperson, “and we believe there is a path to a mutually successful outcome for both the creative and technology communities.”

LONDON — Representatives of the creative industries are urging legislators not to water down forthcoming regulations governing the use of artificial intelligence, including laws around the use of copyrighted music, amid fierce lobbying from big tech companies.     
On Wednesday (Dec. 6), policy makers from the European Union Parliament, Council and European Commission will meet in Brussels to negotiate the final text of the EU’s Artificial Intelligence Act – the world’s first comprehensive set of laws regulating the use of AI.  

The current version of the AI Act, which was provisionally approved by Members of European Parliament (MEPs) in a vote in June, contains several measures that will help determine what tech companies can and cannot do with copyright protected music works. Among them is the legal requirement that companies using generative AI models like OpenAI’s ChatGPT or Anthropic’s Claude 2 (classified by the EU as “general purpose AI systems”) provide summaries of any copyrighted works, including music, that they use to train their systems.

The draft legislation will also force developers to clearly identify content that is created by AI, as opposed to human works. In addition, tech companies will have to ensure that their systems are designed in such a way that prevents them from generating illegal content.

While these transparency provisions have been openly welcomed by music executives, behind the scenes technology companies have been actively lobbying policymakers to try and weaken the regulations, arguing that such obligations could put European AI developers at a competitive disadvantage.

“We believe this additional legal complexity is out of place in the AI Act, which is primarily focused on health, safety, and fundamental rights,” said a coalition of tech organizations and trade groups, including the Computer and Communications Industry Association, which counts Alphabet, Apple, Amazon and Meta among its members, in a joint statement dated Nov. 27.

In the statement, the tech representatives said they were concerned “about the direction of the current proposals to regulate” generative AI systems and said the EU’s proposals “do not take into account the complexity of the AI value chain.”   

European lawmakers are also in disagreement over how to govern the nascent technology, with EU member states France, Germany and Italy understood to be in favor of light-touch regulation for developers of generative AI, according to sources close to the negotiations.

In response, music executives are making a final pitch to legislators to ensure that AI companies respect copyright laws and strengthen existing protections against the unlawful use of music in training AI systems.  


Helen Smith, executive chair of European independent labels group IMPALA, tells Billboard that the inclusion of “meaningful transparency and record keeping obligations” in the final legislation is a “must for creators and rightsholders” if they are to be able to effectively engage in licensing negotiations.

In a letter sent to EU ambassadors last week, Björn Ulvaeus, founder member of ABBA and president of CISAC, the international trade organization for copyright collecting societies, warned policymakers that “without the right provisions requiring transparency, the rights of the creator to authorise and get paid for use of their works will be undermined and impossible to implement.”

The European Composer and Songwriter Alliance (ECSA), International Federation of Musicians (FIM) and International Artist Organisation (IAO) are also calling for guarantees that the rights of their members are respected.

If legislators fail to reach a compromise agreement at Wednesday’s fifth and intended final negotiating session on the AI Act, there are a number of possible outcomes, including further ‘trilogue’ talks the following week. If a deal doesn’t happen this month, however, there is the very real risk that the AI Act won’t be passed before the European parliamentary elections take place in June.

If that happens, a new parliament could theoretically scrap the bill altogether, although executives closely monitoring events in Brussels, the de facto capital of the European Union, say that is unlikely to happen and that there is strong political will from all sides to find a resolution before the end of the year when the current Spain-led presidency of the EU Council ends.

Because the AI Act is a regulation and not a directive — such as the equally divisive and just-as-fiercely-lobbied 2019 EU Copyright Directive — it would pass directly into law in all 27 EU member states, although only once it has been fully approved by the different branches of the European government via a final vote and officially entered into force (the exact timeframe of which could be determined in negotiations, but could take up to three years). 

In that instance, the act’s regulations will apply to any company that operates in the European Union, regardless of where they are based. Just as significant, if passed, the act will provide a world-first legislative model to other governments and international jurisdictions looking to draft their own laws on the use of artificial intelligence.

“It is important to get this right,” says IMPALA’s Smith, “and seize the opportunity to set a proper framework around these [generative AI] models.”

A coalition of artist and label groups is calling on legislators to urgently address a 2020 court ruling that risks seeing European musicians lose out on millions of euros in royalties each year to U.S. acts. 
For decades, American musicians have been denied royalties for the use of their music on broadcast radio or when it’s played in cafes, shops and bars in many overseas countries due to the lack of equivalent terrestrial radio performance and public performance rights in the United States. This practice is based on a principle known as material reciprocity, which means that broadcast and performance revenues are only paid out to countries that apply the same rights.   

The longstanding practice of reciprocal treatment was, however, suspended in the European Union (EU) by a 2020 ruling from the European Court of Justice (ECJ). In that decision, the ECJ decreed that all recording artists are entitled to an equal share of the royalties generated when their music is played on radio or in public premises in the EU, regardless of their nationality — or the absence of radio and performance rights in an artist’s home country. 

Brussels-based independent labels trade body IMPALA says the ECJ ruling will result in European artists and labels losing out on around 125 million euros ($137 million) in royalty income each year, with the equivalent sum instead going to U.S. musicians. Previously, these broadcast and performance royalties were mostly divided up between local labels according to their market share.

European countries that currently withhold public performance and broadcast royalty payments to U.S. artists and labels include the United Kingdom, France, Belgium, Denmark and Ireland. (Outside of Europe, three countries — Japan, Argentina and Australia — also deny royalties to U.S. musicians because of a lack of reciprocal rights.)

In 2019, prior to the court ruling, SoundExchange, which issues licenses to online and satellite radio services, estimated that recording artists and rights holders in the United States lost out on an estimated $350 million in royalty payments due to what it called the “unfair treatment of music creators.” 

So far, the Netherlands is the only EU country to change its legislation in line with the ECJ ruling, which has become widely known as the “RAAP” case in reference to Irish collection society Recorded Artists Actors Performers (RAAP), which initiated the reforms by taking legal action against Phonographic Performance Ireland (PPI) in 2020. In that case, RAAP challenged PPI in the Irish High Court after it reduced royalty payments to performers from a 50-50 split with labels to around 20%. The case was then referred up to the ECJ, which made the now-controversial ruling in September of that year.

U.S. repertoire represents around 40% of all public performance and broadcast income collected annually in the Netherlands, according to Dutch collecting society SENA. Until recently, this income was neither collected nor distributed. Since the change in practice, SENA has increased its tariffs on public performance royalties from 12.5% to 26%.

Will Maas, chair of the Netherlands’ musicians’ union, said in a statement that the rise in rates is not enough to make up for the additional U.S. repertoire now being collected, resulting in a “clear and substantial drop” in revenue going to Dutch and European performers. “This is what awaits other countries if nothing is done to address this,” he added. 

In response, IMPALA executive chair Helen Smith wants the European Court of Justice to reverse its 2020 ruling and restore the principle of material reciprocity.

“It is the EU’s responsibility to prevent European artists and producers losing millions every year to the USA, which has chosen not to protect these rights,” said Smith in a statement. She added that the lack of terrestrial radio performance rights and public performance rights in the United States costs the world music economy “hundreds of millions, if not billions a year.” 

IMPALA also supports a flexible solution that would enable EU countries to pay U.S. artists if they already did so before the ECJ judgment.

Other music groups and CMOs backing IMPALA’s call for action include Adami in France, the Swedish Musicians’ Union, Belgium’s PlayRight and the German Federation of Musicians. They argue that reciprocal treatment forces countries to raise their own levels of protection for musicians by not allowing nations to benefit from other countries’ rules unless they follow the same standards.

Not everyone in the music business is against the ECJ ruling and the push for so-called national treatment — whereby foreign recording artists and labels receive the same types of royalties as the nationals of a given country — to be standardized across the global music business. Executives who back national treatment say that any fall in label income would likely be offset by the increased set of rights and royalty collections elsewhere in Europe resulting from the ECJ decision.

That, however, is not a view shared by IMPALA or its members. 

“Hundreds of thousands of artists count on the EU to do the right thing,” said Dutch musician Matthijs van Duijvenbode in a statement, “and to do it fast.”      

LONDON — When the European Union announced plans to regulate artificial intelligence in 2021, legislators started focusing on “high risk” systems that could threaten human rights, such as biometric surveillance and predictive policing. Amid increasing concern among artists and rights holders about the potential impact of AI on the creative sector, however, EU legislators are also now looking at the intersection of this new technology and copyright.

The EU’s Artificial Intelligence Act, which is now being negotiated among politicians in different branches of government, is the first comprehensive legislation in the world to regulate AI. In addition to banning “intrusive and discriminatory uses” of the technology, the current version of the legislation addresses generative AI, mandating that companies disclose content that is created by AI to differentiate it from works authored by humans. Other provisions in the law would require companies that use generative AI to provide details of copyrighted works, including music, on which they trained their systems. (The AI Act is a regulation, so it would pass directly into law in all 27 member states.)

Music executives began paying closer attention to the legislation after the November launch of ChatGPT. In April, around the time that “Heart on My Sleeve,” a track that featured AI-powered imitations of vocals by Drake and The Weeknd, drove home the issue posed by AI, industry lobbyists convinced lawmakers to add the transparency provisions.

So far, big technology companies, including Alphabet, Meta and Microsoft, have publicly stated that they, too, support AI regulation, at least in the abstract. Behind the scenes, however, multiple music executives tell Billboard that technology lobbyists are trying to weaken these transparency provisions by arguing that such obligations could put European AI developers at a competitive disadvantage.

“They want codes of conduct” — as opposed to laws — “and very low forms of regulation,” says John Phelan, director general of international music publishing trade association ICMP.

Another argument is that summarizing training data “would basically come down to providing a summary of half, or even the entire, internet,” says Boniface de Champris, Brussels-based policy manager at the Computer and Communications Industry Association Europe, which counts Alphabet, Apple, Amazon and Meta among its members. “Europe’s existing copyright rules already cover AI applications sufficiently.”

In May, Sam Altman, CEO of ChatGPT developer OpenAI, emerged as the highest-profile critic of the EU’s proposals, accusing it of “overregulating” the nascent business. He even said that his company, which is backed by Microsoft, might consider leaving Europe if it could not comply with the legislation, although he walked back this statement a few days later. OpenAI and other companies lobbied — successfully — to have an early draft of the legislation changed so that “general-purpose AI systems” like ChatGPT would no longer be considered high risk and thus subject to stricter rules, according to documents Time magazine obtained from the European Commission. (OpenAI didn’t respond to Billboard’s requests for comment.)

The lobbying over AI echoes some of the other political conflicts between media and technology companies — especially the one over the EU Copyright Directive, which passed in 2019. While that “was framed as YouTube versus the music industry, the narrative has now switched to AI,” says Sophie Goossens, a partner at global law firm Reed Smith. “But the argument from rights holders is much the same: They want to stop tech companies from making a living on the backs of their content.”

Several of the provisions in the Copyright Directive deal with AI, including an exception in the law for text- and data-mining of copyrighted content, such as music, in certain cases. Another exception allows scientific and research institutions to engage in text- and data-mining on works to which they have lawful access.

So far, the debate around generative AI in the United States has focused on whether performers can use state laws on right of publicity to protect their distinctive voices and images — the so-called “output side” of generative AI. In contrast, both the Copyright Directive and the AI Act address the “input side,” meaning ways that rights holders can either stop AI systems from using their content for training purposes or limit which ones can in order to license that right.

Another source of tension created by the Copyright Directive is the potential for blurred boundaries between research institutions and commercial businesses. Microsoft, for example, refers to its Muzic venture as “a research project on AI music,” while Google regularly partners with independent research, academic and scientific bodies on technology developments, including AI. To close potential loopholes, Phelan wants lawmakers to strengthen the bill’s transparency provisions, requiring specific details of all music accessed for training, instead of the “summary” that’s currently called for. IFPI, the global recorded-music trade organization, regards the transparency provisions as “a meaningful step in the right direction,” according to Lodovico Benvenuti, managing director of its European office, and he says he hopes lawmakers won’t water that down.

The effects of the AI Act will be felt far outside Europe, partly because they will apply to any company that does business in the 27-country bloc and partly because it will be the first comprehensive set of rules on the use of the technology. In the United States, the Biden administration has met with technology executives to discuss AI but has yet to lay out a legislation strategy. On June 22, Senate Majority Leader Chuck Schumer, D-N.Y., said that he was working on “exceedingly ambitious” bipartisan legislation on the topic, but political divides in the United States as the next presidential election approaches would make passage difficult. China unveiled its own draft laws in April, although other governments may be reluctant to look at legislation there as a model.

“The rest of the world is looking at the EU because they are leading the way in terms of how to regulate AI,” says Goossens. “This will be a benchmark.”

LONDON — Amid increasing concern among artists, songwriters, record labels and publishers over the impact of artificial intelligence (AI) on the music industry, European regulators are finalizing sweeping new laws that will help determine what AI companies can and cannot do with copyrighted music works.  
On Wednesday (June 14), Members of the European Parliament (MEPs) voted overwhelmingly in favor of the Artificial Intelligence (AI) Act with 499 votes for, 28 against and 93 abstentions. The draft legislation, which was first proposed in April 2021 and covers a wide range of AI applications, including its use in the music industry, will now go before the European Parliament, European Commission and the European Council for review and possible amendments ahead of its planned adoption by the end of the year.  

For music rightsholders, the European Union’s (EU) AI Act is the world’s first legal framework for regulating AI technology in the record business and comes as other countries, including the United States, China and the United Kingdom, explore their own paths to policing the rapidly evolving AI sector.  

The EU proposals state that generative AI systems will be forced to disclose any content that they produce which is AI-generated — helping distinguish deep-fake content from the real thing — and provide detailed publicly available summaries of any copyright-protected music or data that they have used for training purposes.    

“The AI Act will set the tone worldwide in the development and governance of artificial intelligence,” MEP and co-rapporteur Dragos Tudorache said following Wednesday’s vote. The EU legislation would ensure that AI technology “evolves and is used in accordance with the European values of democracy, fundamental rights, and the rule of law,” he added.

The EU’s AI Act arrives as the music business is urgently trying to respond to recent advances in the technology. The issue came to a head in April with the release of “Heart on My Sleeve,” the now-infamous song uploaded to TikTok that is said to have been created using AI to imitate vocals from Drake and The Weeknd. The song was quickly pulled from streaming services following a request from Universal Music Group, which represents both artists, but not before it had racked up hundreds of thousands of streams.

A few days before “Heart on My Sleeve” became a short-lived viral hit, UMG wrote to streaming services, including Spotify and Apple Music, asking them to stop AI companies from accessing the label’s copyrighted songs “without obtaining the required consents” to “train” their machines. The Recording Industry Association of America (RIAA) has also warned against AI companies violating copyrights by using existing music to generate new tunes.

If the EU’s AI Act passes in its present draft form, it will strengthen supplementary protections against the unlawful use of music in training AI systems. Existing European laws dealing with text and data-mining copyright exceptions mean that rightsholders will still technically need to opt out of those exceptions if they want to ensure their music is not used by AI companies that are either operating or accessible in the European Union.

The AI Act would not undo or change any of the copyright protections currently provided under EU law, including the Copyright Directive, which came into force in 2019 and effectively ended safe harbor provisions for digital platforms in Europe.  

That means that if an AI company were to use copyright-protected songs for training purposes — and publicly declare the material it had used as required by the AI Act — it would still be subject to infringement claims for any AI-generated content it then tried to commercially release, including infringement of the copyright, legal, personality and data rights of artists and rightsholders.   

“What cannot, is not, and will not be tolerated anywhere is infringement of songwriters’ and composers’ rights,” said John Phelan, director general of international music publishing trade association ICMP, in a statement. The AI Act, he says, will ensure “special attention for intellectual property rights” but further improvements to the legislation “are there to be won.”