State Champ Radio

by DJ Frosty

Current show

Lunch Time Rewind

12:00 pm – 1:00 pm

Elon Musk & Other Tech Leaders Call for Pause on Training AI With ‘Human-Competitive Intelligence’

Written on March 29, 2023

In a new open letter signed by Elon Musk, Steve Wozniak, Andrew Yang and others on Wednesday (March 29), leaders in technology, academia and politics called for a moratorium on training AI systems “more advanced than GPT-4” for “at least 6 months.”

The letter states that “AI systems with human-competitive intelligence can pose profound risks to society and humanity,” including the increased spread of propaganda and fake news as well as automation leading to widespread job loss. “Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?” the letter asks.

By drawing the line at AI models “more advanced than GPT-4,” the signees are likely pointing to generative artificial intelligence, a term for the subset of AI that can create new content after being trained on millions or even billions of pieces of data. While some companies license or create their own training data, many AIs are trained on data sets scraped from the web that contain copyright-protected material, including songs, books, articles and images. This practice has sparked widespread debate over whether AI companies should be required to obtain consent from, or compensate, rights holders, and whether the fast-evolving models will endanger the livelihoods of musicians, illustrators and other creatives.

Before late 2022, generative AI was little discussed outside of tech-savvy circles, but it has gained national attention over the last six months. Popular examples of generative AI today include image generators like DALL-E 2, Stable Diffusion and Midjourney, which use simple text prompts to conjure up realistic pictures. Chatbots built on large language models (LLMs), such as ChatGPT, are also considered generative, as are systems that can create new music at the touch of a button. Though generative AI models in music have yet to make as many headlines as chatbots and image generators, companies like Boomy, Soundful, Beatlab, Google’s Magenta, OpenAI and others are already building them, leading to fears that their output could one day threaten human-made music.

The letter urging the pause in AI training was signed by some of the field’s most prominent executives, notably Stability AI CEO Emad Mostaque, Conjecture AI CEO Connor Leahy, Unanimous AI CEO and chief scientist Louis Rosenberg and Scale AI CEO Julien Billot. It was also signed by Pinterest co-founder Evan Sharp, Skype co-founder Jaan Tallinn and Ripple CEO Chris Larsen.

Other signees include several engineers and researchers at Microsoft, Google and Meta, though the list notably does not include any names from OpenAI, the firm behind GPT-4.

“This does not mean a pause on AI development in general, merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities,” the letter continues. Rather, the industry must “jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts.”

The letter comes only a few weeks after several major organizations in the entertainment industry, including in music, came together to release a list of seven principles, detailing how they hope to protect and support “human creativity” in the wake of the AI boom. “Policymakers must consider the interests of human creators when crafting policy around AI,” the coalition wrote. “Creators live on the forefront of, and are building and inspiring, evolutions in technology and as such need a seat at the table.”
