
Source: Rockstar Games / GTA 6
Mark your calendars! Gamers are excited after Rockstar Games dropped the date for the first trailer for Grand Theft Auto 6.
After years of GTA 6 trending on social media for absolutely no reason at all, opening X and seeing it in the top trending topics finally means something.
While not saying much, Rockstar Games announced that the first trailer for what is easily the most highly anticipated game in the world will arrive on December 5.

Proving just how popular this game is, just announcing the trailer’s release date sent social media into a tizzy.
“Fast forwarding to Tuesday,” Sony’s official X (formerly known as Twitter) account wrote.

The trailer will also arrive two days before The Game Awards, shutting down the theory that Geoff Keighley had GTA 6 as a big reveal during the show.

What We Know So Far Ahead of GTA 6 First Trailer Release
While we know when to expect the trailer, we still have no idea what this game will look like outside of the massive leak that showed footage from a pre-build.
Rumors also suggest that GTA 6 will return the series to Vice City, Grand Theft Auto’s version of Miami, will feature a female protagonist as one half of the playable duo for the first time, drawing inspiration from Bonnie and Clyde, and will include one of the most extensive “evolving” maps in GTA history.
Like everyone else, we are excited to see what Rockstar Games has been taking its sweet time to build since GTA V, which has seen life on three console generations since its initial release in 2013 on the PlayStation 3 and Xbox 360, making it one of the most profitable pieces of media ever, if not the most.
Until the day the trailer drops, you can see more reactions in the gallery below.

Photo: Rockstar Games / GTA 6

[Gallery: fan reactions, including “The wait is finally over,” “Vince McMahon has never been so popular on social media” and “Howling”]

Montana’s first-in-the-nation law banning the video-sharing app TikTok in the state was blocked Thursday, one month before it was set to take effect, by a federal judge who called the measure unconstitutional.
The ruling delivered a temporary win for the social media company that has argued Montana’s Republican-controlled Legislature went “completely overboard” in trying to regulate the app. A final ruling will come at a later date after the legal challenge moves through the courts.

U.S. District Judge Donald Molloy said the ban “oversteps state power and infringes on the Constitutional right of users and businesses” while singling out the state for its fixation on purported Chinese influence.

“Despite the state’s attempt to defend (the law) as a consumer protection bill, the current record leaves little doubt that Montana’s legislature and Attorney General were more interested in targeting China’s ostensible role in TikTok than with protecting Montana consumers,” Molloy wrote Thursday in granting the preliminary injunction. “This is especially apparent in that the same legislature enacted an entirely separate law that purports to broadly protect consumers’ digital data and privacy.”

Montana lawmakers in May made the state the first in the U.S. to pass a complete ban on the app based on the argument that the Chinese government could gain access to user information from TikTok, whose parent company, ByteDance, is based in Beijing.

The ban, which was scheduled to take effect Jan. 1, was first brought before the Montana Legislature a few weeks after a Chinese spy balloon flew over the state.

It would prohibit downloads of TikTok in the state and fine any “entity” — an app store or TikTok — $10,000 per day for each time someone “is offered the ability” to access or download the app. There would not be penalties for users.

TikTok spokesperson Jamal Brown issued a statement saying the company was pleased that “the judge rejected this unconstitutional law and hundreds of thousands of Montanans can continue to express themselves, earn a living, and find community on TikTok.”

A spokeswoman for Montana Attorney General Austin Knudsen, also a Republican, tried to downplay the significance of the ruling in a statement.

“The judge indicated several times that the analysis could change as the case proceeds,” said Emily Cantrell, spokeswoman for Knudsen. “We look forward to presenting the complete legal argument to defend the law that protects Montanans from the Chinese Communist Party obtaining and using their data.”

Western governments have expressed worries that the popular social media platform could put sensitive data in the hands of the Chinese government or be used as a tool to spread misinformation. Chinese law allows the government to order companies to help it gather intelligence.

More than half of U.S. states and the federal government have banned TikTok on official devices. The company has called the bans “political theatre” and says further restrictions are unnecessary due to the efforts it is taking to protect U.S. data by storing it on Oracle servers. The company has said it has not received any requests for U.S. user data from the Chinese government and would not provide any if it were asked.

“The extent to which China controls TikTok, and has access to its users’ data, forms the heart of this controversy,” the judge wrote.

Attorneys for TikTok and the content creators argued on Oct. 12 that the state had gone too far in trying to regulate TikTok and was essentially trying to implement its own foreign policy over unproven concerns that TikTok might share user data with the Chinese government.

TikTok has said in court filings that Montana could have limited the kinds of data TikTok could collect from its users rather than enacting a complete ban. Meanwhile, the content creators said the ban violates free speech rights and could cause economic harm for their businesses.

Christian Corrigan, the state’s solicitor general, argued Montana’s law was less a statement of foreign policy than an attempt to address “serious, widespread concerns about data privacy.”

The state hasn’t offered any evidence of TikTok’s “allegedly harmful data practices,” Molloy wrote.

Molloy noted during the hearing that TikTok users consent to the company’s data collection policies and that Knudsen — whose office drafted the legislation — could air public service announcements warning people about the data TikTok collects.

The American Civil Liberties Union, its Montana chapter and the Electronic Frontier Foundation, a digital privacy rights advocacy group, have submitted an amicus brief in support of the challenge. Meanwhile, 18 attorneys general from mostly Republican-led states are backing Montana and asking the judge to let the law be implemented. Even if that happens, cybersecurity experts have said it could be challenging to enforce.

Roblox players are about to get an eyeful — and potentially an earful — of KINGSHIP, the metaverse “supergroup” comprised of and managed by a shrewdness of Bored-and-Mutant Ape NFTs.
10:22PM, the Web3 label of Universal Music Group founded by Celine Joshua, announced on Thursday the launch of KINGSHIP Islands — an immersive in-game experience wherein Robloxers can work to unite the four band members on something called the “Floating Villa,” plus earn reward accessories and “acquire emotes for their avatars.” For minors with parents who are cool and totally not a drag, players can purchase customized animated heads and bodies for their avatars using Roblox’s facial animation technology.

As the game environment ages, more free virtual goods will be added along with new music produced by Hit-Boy and James Fauntleroy, the KINGSHIP “sonic creative team” that was announced a year ago. The band’s label said the pair — officially co-executive producers — are “overseeing the evolution of the group’s music direction and sound.”

The supergroup has yet to release music, and their manager, Manager Noët All, could not be reached for comment.

KINGSHIP Islands is free to play for any Roblox user, who must first complete various quests to gain access to the Floating Villa. Wanna skip all that? Owners of one of the 5,000 KINGSHIP Key Cards qualify for VIP access, along with special badges and other metadoodads. Key Card holders can access the villa at any time because they will have a special Roblox badge, which provides unique roles inside experiences, the label said.

The aforementioned Floating Villa, part of KINGSHIP Islands.

When they were released in July of 2022, the entire batch of Key Cards sold out in the span of a day, though they continue to trade on the secondary market. Over the last 30 days, 66 cards have been resold on OpenSea at an average price of 0.0592 ETH, or roughly $120 at the current exchange rate. The cards were designed to unlock forthcoming partnerships with major brands (see: Roblox), as well as unique artwork and immersive digital experiences.

10:22PM’s KINGSHIP project made its debut in November 2021 and is comprised of mutant ape Captain (vocals, bass) and bored ones KING (lead vocals), Arnell (beats, producer, drums) and Hud (guitar, keyboards, vocals). Avid NFT collector Jimmy McNeils supplied the apes for KINGSHIP from his own collection. At the time of launch it was billed as a “landmark, first-ever exclusive agreement to create a metaverse group.”

Hey look, a trailer:

[Embedded video: KINGSHIP Islands trailer]

Dennis Kooker, president of global digital business at Sony Music Entertainment, represented the music business at Sen. Chuck Schumer’s (D-NY) seventh artificial intelligence insight forum in Washington, D.C. on Wednesday (Nov. 29). In his statement, Kooker implored the government to act on new legislation to protect copyright holders to ensure the development of “responsible and ethical generative AI.”

The executive revealed that Sony has already sent “close to 10,000 takedowns to a variety of platforms hosting unauthorized deepfakes that SME artists asked us to take down.” He says these platforms, including streamers and social media sites, are “quick to point to the loopholes in the law as an excuse to drag their feet or to not take the deepfakes down when requested.”

Presently, there is no federal law that explicitly requires platforms to take down songs that impersonate an artist’s voice. Platforms are only obligated to do this when a copyright (a sound recording or a musical work) is infringed, as stipulated by the Digital Millennium Copyright Act (DMCA). Interest in using AI to clone the voices of famous artists has grown rapidly since a song with AI impersonations of Drake and The Weeknd went viral earlier this year. The track, called “Heart on My Sleeve,” has become one of the most popular use cases of music-related AI.

A celebrity’s voice and likeness can be protected by “right of publicity” laws that safeguard it from unauthorized exploitation, but this right is limited. Its protections vary state-to-state and are even more limited post-mortem. In May, Billboard reported that the major labels — Sony, Universal Music Group and Warner Music Group — had been in talks with Spotify, Apple Music and Amazon Music to create a voluntary system for takedowns of right of publicity violations, much like the one laid out by the DMCA, according to sources at all three majors. It is unclear from Kooker’s remarks if the platforms that are dragging their feet on voice clone removals include the three streaming services that previously took part in these discussions.

In his statement, Kooker asked the Senate forum to create a federal right of publicity to provide stronger and more uniform protection for artists. “Creators and consumers need a clear unified right that sets a floor across all fifty states,” he said. This echoes what UMG general counsel/executive vp of business and legal affairs Jeffery Harleston asked the Senate during a July AI hearing.

Kooker expressed his “sincere gratitude” to Sens. Chris Coons, Marsha Blackburn, Amy Klobuchar and Thom Tillis for releasing a draft bill called the No FAKES (“Nurture Originals, Foster Art, and Keep Entertainment Safe”) Act in October, which would create a federal property right for one’s voice or likeness and protect against unauthorized AI impersonations. At its announcement, the No FAKES Act drew resounding praise from music business organizations, including the RIAA and the American Association of Independent Music.

Kooker also stated that in this early stage many available generative AI products today are “not expanding the business model or enhancing human creativity.” He pointed to a “deluge of 100,000 new recordings delivered to [digital service providers] every day” and said that some of these songs are “generated using generative AI content creation tools.” He added, “These works flood the current music ecosystem and compete directly with human artists…. They reduce and diminish the earnings of human artists.”

“We have every reason to believe that various elements of AI will become routine in the creative process… [as well as] other aspects of our business,” like marketing and royalty accounting, Kooker continued. He said Sony Music has already started “active conversations” with “roughly 200” different AI companies about potential partnerships.

Still, he stressed five key issues remain that need to be addressed to “assure a thriving marketplace for AI and music.” Read his five points, as written in his prepared statement, below:

Assure Consent, Compensation, and Credit. New products and businesses built with music must be developed with the consent of the owner and appropriate compensation and credit. It is essential to understand why the training of AI models is being done, what products will be developed as a result, and what the business model is that will monetize the use of the artist’s work. Congress and the agencies should assure that creators’ rights are recognized and respected.

Confirm That Copying Music to Train AI Models is Not Fair Use. Even worse are those that argue that copyrighted content should automatically be considered fair use so that protected works are never compensated for usage and creators have no say in the products or business models that are developed around them and their work. Congress should assure and agencies should presume that reproducing music to train AI models, in itself, is not a fair use.

Prevent the Cloning of Artists’ Voices and Likenesses Without Express Permission. We cannot allow an artist’s voice or likeness to be cloned for use without the express permission of the artist. This is a very personal decision for the artist. Congress should pass into law effective federal protections for name, image, and likeness.

Incentivize Accurate Record-Keeping. Correct attribution will be a critical element to artists being paid fairly and correctly for new works that are created. In addition, rights can only be enforced around the training of AI when there are accurate records about what is being copied. Otherwise, the inability to enforce rights in the AI marketplace equates to a lack of rights at all, producing a dangerous imbalance that prevents a thriving ecosystem. This requires strong and accurate record keeping by the generative AI platforms, a requirement that urgently needs legislative support to ensure incentives are in place so that it happens consistently and correctly.

Assure Transparency for Consumers and Artists. Transparency is necessary to clearly distinguish human-created works from AI-created works. The public should know, when they are listening to music, whether that music was created by a human being or a machine.

Most conversations around AI in music are focused on music creation, protecting artists and rightsholders, and differentiating human-made music from machine-made works. And there is still discourse to be had as AI has some hidden superpowers waiting to be explored. One use for the technology that has immense potential to positively impact artists is music marketing.

As generative and complementary AI is becoming a larger part of creative works in music, marketing will play a larger role than ever before. Music marketing isn’t just about reaching new and existing fans and promoting upcoming singles. Today, music marketing must establish an artist’s ownership of their work and ensure that the human creatives involved are known, recognized, and appreciated. We’re about to see the golden age of automation for artists who want to make these connections and gain this appreciation.

While marketing is a prerequisite to a creator’s success, it takes a lot of time, energy, and resources. Creating engaging content takes time. According to Linktree’s 2023 Creator Report, 48% of creators who make $100-500k per year spend more than 10 hours on content creation every week. On top of that, three out of four creators want to diversify what they create but feel pressure to keep making what is rewarded by the algorithm. Rather than fighting the impossible battle of constantly evolving and cranking out more content to match what the algorithm is boosting this week, creatives can have a much greater impact by focusing on their brand and making high-quality content for their audience.

For indie artists without support from labels and dedicated promotion teams, the constant pressure to push their new single on TikTok, post on Instagram, and engage with fans while finding the time to make new music is overwhelming. The pressure is only building, thanks to changes in streaming payouts. Indie artists need to reach escape velocity faster.


AI-powered music marketing can lighten that lift, generating campaign templates and delivering to artists the data they need to reach their intended audience. AI can take the data that artists and creators generate and put it to work in a meaningful way, automatically extracting insights from the information and analytics to build marketing campaigns and map out tactics that get results.

AI-driven campaigns can give creators back the time they need to do what they do best: create. While artificial intelligence saves artists time and generates actionable solutions for music promotion, it is still highly dependent on the artist’s input and human touch. Just as a flight captain has to set route information and parameters before switching on autopilot, an artist enters their content, ideas, intended audience, and hopeful outcome of the marketing campaign. Then, using this information, the AI-powered marketing platform can provide all of the data and suggestions necessary to produce the targeted results.  
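
To make that division of labor concrete, here is a minimal, purely hypothetical sketch of the kind of brief an artist might hand to such a platform and the kind of tactics it might get back. None of the names below come from SymphonyOS or any real product; they only illustrate the shape of the workflow described above.

```python
# Hypothetical illustration of "artist sets the parameters, the platform
# does the legwork." All names and fields are invented for this sketch.
from dataclasses import dataclass


@dataclass
class CampaignBrief:
    track_title: str
    release_date: str          # e.g. "2024-03-01"
    content_links: list[str]   # teasers, cover art, short clips, etc.
    intended_audience: str     # e.g. "fans of melodic drill, 18-24, US/UK"
    goal: str                  # e.g. "1,000 pre-saves before release day"
    budget_usd: float = 0.0


def suggest_tactics(brief: CampaignBrief) -> list[str]:
    """Stand-in for the platform's recommendation step: turn the brief
    into a handful of concrete, schedulable actions."""
    tactics = [f"Schedule a pre-save link push for {brief.release_date}."]
    if brief.budget_usd > 0:
        tactics.append(
            f"Run a ${brief.budget_usd:.0f} targeted ad set aimed at "
            f"{brief.intended_audience}."
        )
    tactics += [f"Cut a 15-second teaser from {link}." for link in brief.content_links]
    return tactics


if __name__ == "__main__":
    brief = CampaignBrief(
        track_title="Midnight Drive",
        release_date="2024-03-01",
        content_links=["teaser_01.mp4"],
        intended_audience="fans of melodic drill, 18-24, US/UK",
        goal="1,000 pre-saves before release day",
        budget_usd=150,
    )
    for step in suggest_tactics(brief):
        print("-", step)
```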

Rather than taking over the creative process, AI should be used to assist and empower artists to be more creative. It can help put the joy back into what can be a truly fun process — finding, reaching, and engaging with fans. 

A large portion of artists who have tapped into AI marketing have never spent money on marketing before, but with the help of these emerging tools, planning and executing effective campaigns is more approachable and intuitive. As the music industry learns more about artificial intelligence and debates its ethical implications in music creation, equal thought must be given to the opportunities that it unlocks for artists to grow their fanbases, fuel more sustainable careers, and promote their human-made work.

Megh Vakharia is the co-founder and CEO of SymphonyOS, the AI-powered marketing platform empowering creatives to build successful marketing campaigns that generate fan growth using its suite of smart, automated marketing tools.

Earlier this month, 760 stations owned by iHeartMedia simultaneously threw their weight behind a new single: The Beatles’ “Now and Then.” This was surprising, because the group broke up in 1970 and two of the members are dead. “Now and Then” began decades ago as a home recording by John Lennon; more recently, AI-powered audio technology allowed for the separation of the demo’s audio components — isolating the voice and the piano — which in turn enabled the living Beatles to construct a whole track around them and roll it out to great fanfare. 

“For three days, if you were a follower of popular culture, all you heard about was The Beatles,” says Arron Saxe, who represents several estates, including Otis Redding’s and Bill Withers’s. “And that’s great for the business of the estate of John Lennon and the estate of George Harrison and the current status of the two living legends.”

For many people, 2023 has been the year that artificial intelligence technology left the realm of science fiction and crashed rudely into daily life. And while AI-powered tools have the potential to impact wide swathes of the music industry, they are especially intriguing for those who manage estates or the catalogs of dead artists. 

That’s because there are inherent constraints involved with this work: No one is around to make new stuff. But as AI models get better, they have the capacity to knit old materials together into something that can credibly pass as new — a reproduction of a star’s voice, for example. “As AI develops, it may impact the value of an estate, depending on what assets are already in the estate and can be worked with,” says Natalia Nataskin, chief content officer for Primary Wave, who estimates that she and her team probably spend around 25% of their time per week mulling AI (time she says they used to spend contemplating possibilities for NFTs).

And a crucial part of an estate manager’s job, Saxe notes, is “looking for opportunities to earn revenue.” “Especially with my clients who aren’t here,” he adds, “you’re trying to figure out, how do you keep it going forward?”

The answer, according to half a dozen executives who work with estates or catalogs of dead artists or songwriters, is “very carefully.” “We say no to 99 percent of opportunities,” Saxe says. 

“You have this legacy that is very valuable, and once you start screwing with it, you open yourself up to causing some real damage,” adds Jeff Jampol, who handles the estates of The Doors, Janis Joplin and more. “Every time you’re going to do something, you have to be really protective. It’s hard to be on the bleeding edge.”

To work through these complicated issues, WME went so far as to establish an AI Task Force where agents from every division educate themselves on different platforms and tools to “get a sense for what is out there and where there are advantages to bring to our clients,” says Chris Jacquemin, the company’s head of digital strategy. The task force also works with WME’s legal department to gain “some clarity around the types of protections we need to be thinking about,” he continues,  as well as with the agency’s legislative division in Washington, D.C. 

At the moment, Jampol sees two potentially intriguing uses of AI in his work. “It would be very interesting to have, for instance, Jim Morrison narrate his own documentary,” he explains. He could also imagine using an AI voice model to read Morrison’s unrecorded poetry. (The Doors singer did record some poems during his lifetime, suggesting he was comfortable with this activity.) 

On Nov. 15, Warner Music Group announced a potentially similar initiative, partnering with the estate of the French great Edith Piaf to create a voice model — based on the singer’s old interviews — which will narrate the animated film Edith. The executors of Piaf’s estate, Catherine Glavas and Christie Laume, said in a statement that “it’s been a special and touching experience to be able to hear Edith’s voice once again — the technology has made it feel like we were back in the room with her.”

The use of AI tech to recreate a star’s speaking voice is “easier” than attempting to put together an AI model that will replicate a star singing, according to Nataskin. “We can train a model on only the assets that we own — on the speaking voice from film clips, for example,” she explains. 

In contrast, to train an AI model to sing like a star of old, the model needs to ingest a number of the artist’s recordings. That requires the consent of other rights holders — the owners of those recordings, which may or may not be the estate, as well as anyone involved in their composition. Many who spoke to Billboard for this story said they were leery of AI making new songs in the name of bygone legends. “To take a new creation and say that it came from someone who isn’t around to approve it, that seems to me like quite a stretch,” says Mary Megan Peer, CEO of the publisher peermusic. 

Outside the United States, however, the appetite for this kind of experimentation may differ. Roughly a year ago, the Chinese company Tencent Music Entertainment told analysts that it used AI-powered technology to create new vocal tracks from dead singers, one of which went on to earn more than 100 million streams.

For now, at least, Nataskin characterized Primary Wave as focused on “enhancing” with AI tech, “rather than creating something from scratch.” And after Paul McCartney initially mentioned that artificial intelligence played a role in “Now and Then,” he quickly clarified on X that “nothing has been artificially or synthetically created,” suggesting there is still some stigma around the use of AI to generate new vocals from dead icons. The tech just “cleaned up some existing recordings,” McCartney noted.

This kind of AI use for “enhancing” and “cleaning up,” tweaking and adjusting has already been happening regularly for several years. “For all of the industry freakout about AI, there’s actually all these ways that it’s already operating everyday on behalf of artists or labels that isn’t controversial,” says Jessica Powell, co-founder and CEO of Audioshake, a company that uses AI-powered technology for stem separation. “It can be pretty transformational to be able to open up back catalog for new uses.”

The publishing company peermusic used AI-powered stem separation to create instrumentals for two tracks in its catalog — Gaby Moreno’s “Fronteras” and Rafael Solano’s “Por Amor” — which could then be placed in ads for Oreo and Don Julio, respectively. In a move much like the Beatles’, Łukasz Wojciechowski, co-founder of Astigmatic Records, used stem separation to isolate, and then remove distortion from, the trumpet part in a previously unreleased recording he found of jazz musician Tomasz Stanko. After the cleanup, the music could be released for the first time. “I’m seeing a lot of instances with older music where the quality is really poor, and you can restore it,” Wojciechowski says.
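
As an illustration of the general technique — not the specific tools used in the cases above — a stem split can be run in a few lines with Deezer’s open-source Spleeter library. The filename here is hypothetical, and the sketch assumes the package is installed locally.

```python
# A minimal sketch of stem separation using the open-source Spleeter
# library; only an illustration of the technique, not the proprietary
# tools named in the article. Assumes `pip install spleeter` and that
# "por_amor.mp3" is a local audio file (hypothetical filename).
from spleeter.separator import Separator

# The pretrained "2stems" model splits a mix into vocals + accompaniment,
# which is how an a cappella or an instrumental (e.g. for an ad sync)
# can be pulled out of a finished master.
separator = Separator("spleeter:2stems")

# Writes output/por_amor/vocals.wav and output/por_amor/accompaniment.wav
separator.separate_to_file("por_amor.mp3", "output")
```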

Powell acknowledges that these uses are “not a wild proposition like, ‘create a new voice for artist X!’” Those have been few and far between — at least the authorized ones. (Hip-hop fans have been using AI-powered technology to turn snippets of rap leaks from artists like Juice WRLD, who died in 2019, into “finished” songs.) For now, Saxe believes “there hasn’t been that thing where people can look at it and go, ‘They nailed that use of it.’ We haven’t had that breakout commercial popular culture moment.”

It’s still early, though. “Where we go with things like Peter Tosh or Waylon Jennings or Eartha Kitt, we haven’t decided yet,” says Phil Sandhaus, head of WME’s Legends division. “Do we want to use voice cloning technologies out there to create new works and have Eartha Kitt in her unique voice sing a brand new song she’s never sung before? Who knows? Every family, every estate is different.”

Additional reporting by Melinda Newman

Full-service music company ONErpm is filling out further with the launch of two divisions, one being a new administration system meant to simplify managing an artist’s day-to-day needs — and the other an updated distribution platform geared for budget-crunched DIYers. […]

Listeners remain wary of artificial intelligence, according to Engaging with Music 2023, a forthcoming report from the International Federation of the Phonographic Industry (IFPI) that seems aimed in particular at government regulators.
The IFPI surveyed 43,000 people across 26 countries, coming to the conclusion that 76% of respondents “feel that an artist’s music or vocals should not be used or ingested by AI without permission,” and 74% believe “AI should not be used to clone or impersonate artists without authorisation.” 

The results are not surprising. Most listeners probably weren’t thinking much, if at all, about AI and its potential impacts on music before 2023. (Some still aren’t thinking about it: 89% of those surveyed said they were “aware of AI,” leaving 11% who have somehow managed to avoid a massive amount of press coverage this year.) New technologies are often treated with caution outside the tech industry. 

It’s also easy for survey respondents to support statements about getting authorization for something before doing it — that generally seems like the right thing to do. But historically, artists haven’t always been interested in preemptively obtaining permission. 

Take the act of sampling another song to create a new composition. Many listeners would presumably agree that artists should go through the process of clearing a sample before using it. In reality, however, many artists sample first and clear later, sometimes only if they are forced to.

In a statement, Frances Moore, IFPI’s CEO, said that the organization’s survey serves as a “timely reminder for policymakers as they consider how to implement standards for responsible and safe AI.”

U.S. policymakers have been moving slowly to develop potential guidelines around AI. In October, a bipartisan group of senators released a draft of the NO FAKES Act, which aims to prevent the creation of “digital replicas” of an artist’s image, voice, or visual likeness without permission.

“Generative AI has opened doors to exciting new artistic possibilities, but it also presents unique challenges that make it easier than ever to use someone’s voice, image, or likeness without their consent,” Senator Chris Coons said in a statement. “Creators around the nation are calling on Congress to lay out clear policies regulating the use and impact of generative AI.”


Source: SOPA Images / Getty / Grand Theft Auto 6
As we get closer to the big reveal of Grand Theft Auto 6, the hype for the highly-anticipated video game is so thick you can cut it with a knife.
GTA VI is trending on X, formerly known as Twitter, but this time around, the energy is much different. Unlike the other times the long-awaited sequel to GTA V trended, there is actual excitement now because a trailer for the game is on the horizon.
Rockstar Games confirmed a trailer is on the way following the equivalent of a Woj Bomb in the video game space, with Bloomberg’s Jason Schreier coming through with slam-dunk reporting claiming the studio was dropping a trailer for the game.
In a post shared on X, formerly known as Twitter, Rockstar Games president Sam Houser wrote:
“We are very excited to let you know that in early December, we will release the first trailer for the next Grand Theft Auto. We look forward to many more years of sharing these experiences with all of you.”

Gamers React To GTA VI’s Trailer Coming In December
With that in mind and Thanksgiving and Black Friday now behind us, gamers have taken to Elon Musk’s trash platform to express how excited they are about GTA VI’s announcement being on the horizon.
“The only thing on my mind right now is #GTAVI December hurry up!!! Its gonna be game of the generation bruh….,” one post on X read. 
Another post read, “I hope we get that #GTAVI trailer Friday but I’m expecting nothing until next week at least. Regardless, I can’t believe we’ve made it this close.”
We feel both posts and wonder when the trailer will drop in December.
There is a strong chance it could happen during the upcoming Game Awards. 
We shall see.
Until then, hit the gallery below for more reactions.

Photo: SOPA Images / Getty

[Gallery: fan reactions, including “Howling,” “The graphics will be mind-blowing” and “Not confirmed, but if true, this will be crazy”]

Most people send files to collaborators without a second thought — open a new email or text, click attach, hit send. Benjamin Thomas is not most people. “What I normally do is I encrypt it and send it with a 20 character randomized password via email,” he says. “And then I do a verbal confirmation of who the file is going to and deliver them the password through another method.”
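
A rough sketch of that routine in Python, using the standard-library secrets module and the third-party cryptography package, might look like the following. The filenames are hypothetical, and this is only an illustration of the general idea, not Thomas’s actual setup.

```python
# A rough sketch of the "encrypt it, then send the password another way"
# routine described above. Filenames are hypothetical; this illustrates
# the general idea, not the engineer's actual workflow.
import base64
import os
import secrets
import string

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

# 20-character randomized password, as in the quote above.
alphabet = string.ascii_letters + string.digits
password = "".join(secrets.choice(alphabet) for _ in range(20))

# Derive a symmetric key from the password so the file is useless without it.
salt = os.urandom(16)
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
key = base64.urlsafe_b64encode(kdf.derive(password.encode()))

with open("rough_mix.wav", "rb") as f:          # hypothetical session bounce
    ciphertext = Fernet(key).encrypt(f.read())

with open("rough_mix.wav.enc", "wb") as f:      # email this file...
    f.write(salt + ciphertext)

print(password)  # ...and deliver this password over a separate channel
```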

Thomas is not employed by the National Security Agency; he’s an engineer who works closely with the rapper Lil Uzi Vert. But since Lil Uzi Vert’s most passionate fans are not fond of waiting for him to release music at his own pace — they want to hear it now, and will happily consume leaked songs in whatever form they can find them — these sorts of safeguards are necessary.

Thomas’ precautions have helped drastically reduce the frequency of leaks. “He’s got a real blueprint for engineers to keep sh– under lock and key,” says Jason Berger, a partner at Lewis Brisbois, where he represents a number of producers who frequently collaborate with Lil Uzi Vert. “Uzi’s stuff used to leak a lot,” the attorney notes. “From about March of 2020 until the Pink Tape dropped [in June, 2023], not one f—ing record leaked out.”

But leaks remain a fairly regular occurrence in the music industry, especially in hip-hop, and happen in myriad ways. Despite being digital natives, the younger generation that tends to drive music consumption is susceptible to being swindled: The Federal Trade Commission reported last year that 44% of people ages 20 to 29 said they lost money to online fraud, compared to 20% of people ages 70 to 79. When it comes to music, a lot of leaks boil down to “doing dumb stuff on the internet,” as Thomas puts it.

He puts leaks into two categories: Some stem from carelessness, others from hacking. A lot of the careless leaks are the result of common email phishing techniques.

The producer Warren “Oak” Felder recently received an email from his assistant — or so he thought. “He was asking me for something that made sense: ‘Hey, I need the bounces for some records because management asked,’” Felder says. But one sentence in the email stuck out for its odd construction, so the producer texted his assistant, who confirmed he didn’t send the email. “The amount of fake emails I get is crazy,” Thomas adds.  

The same thing also happens by text. “Friends of mine in the industry have fallen victim to people phishing, where they’ll get a text from somebody that is acting like somebody else, maybe an artist, saying they had to get a new phone, for example,” says Anthony Cruz, an engineer who works closely with Meek Mill. Maybe they ask for a demo, or maybe “they’ll send you a link, and that file ends up hacking your entire account,” prying loose any closely guarded tracks. 

Obtaining leaks via hacks can be more sophisticated, like the technique known as  SIM-swapping. “They’ll first find as much information as they can about the person that they want to hack,” explains the producer Waves, who has an unreleased song with Juice WRLD that’s floating around online due to a leak. “Then they would call your cell phone provider, say, ‘Hey, I lost my SIM card, I just got another one. Can you transfer my number over to this phone?’” 

If the perpetrator has been able to glean enough personal information — ranging from troves of previously hacked passwords that exist online to things like a mother’s maiden name — they can waltz past account protections and take over the account of the target phone. “From there, they just look through your email,” Waves says. “Sometimes they’ll even find full Pro Tools sessions and they’ll sell those. Honestly, some of them are pretty good hackers.”

Even if engineers, producers and artists are vigilant about protecting their own phones and computers, that may not be enough. Studios can be surprisingly loose with valuable materials. “A year ago, one of my clients was in one of these major recording studios and all of a sudden he’s hearing a collaboration between an A-list artist and somebody else that nobody even knew happened,” says Dylan Bourne, who manages artists and producers. “He was hearing it by accident in the studio because it was just on the computer.” Thomas “heard a story about somebody sitting outside the studio who logged into the Wi-Fi from a car,” enabling him to make off with files. 

And yet another moment of vulnerability occurs when artists ask other acts for features and send over a track. “Now you’re relying on that artist and their team to protect the files,” Cruz says. “It’s out of your hands. A couple of leaks that we have been a part of have been because of that.”

Due to the danger of files ending up in the wrong hands, “a lot of artists are starting to use private servers to share music,” according to Felder. “They’re saying, ‘listen, if you send me the record, don’t text it to me, don’t email it to me.’” 

One of Bourne’s clients, a producer, recently had to determine splits on a song he worked on for an A-list artist. The artist convened a listening session on Zoom “so that they could know what it made sense to argue for, but not have access to the song in any capacity,” Bourne says. “In the past people would have sent listening links.”

Another tactic Felder has taken up is “naming things cryptically.” This way, in case someone gets into a Dropbox folder or email and roots around for demos of songs featuring notable artists, at least that person can’t easily figure out who is recorded on what.

Leaks are “not a situation that’s going to go away,” Bourne continues. “Artists who care have to get ahead of it and be more protective about the music.”