
Elon Musk and tech execs call for 'pause' on AI development

The authors of the letter explained that advanced AI could cause a profound change in the history of life on Earth, for better or worse.

More than 2,600 tech leaders and researchers have signed an open letter urging a temporary “pause” on further artificial intelligence (AI) development, fearing “profound risks to society and humanity.”

Tesla CEO Elon Musk, Apple co-founder Steve Wozniak and a host of AI CEOs, CTOs and researchers were among the signatories of the letter, which was authored by the United States think tank Future of Life Institute (FOLI) on March 22.

The institute called on all AI companies to “immediately pause” training AI systems that are more powerful than GPT-4 for at least six months, sharing concerns that “human-competitive intelligence can pose profound risks to society and humanity,” among other things:

“Advanced AI could represent a profound change in the history of life on Earth, and should be planned for and managed with commensurate care and resources. Unfortunately, this level of planning and management is not happening,” the institute wrote in its letter.

GPT-4 is the latest iteration of OpenAI’s artificial intelligence-powered chatbot, which was released on March 14. To date, it has passed some of the most rigorous U.S. high school and law exams, scoring within the 90th percentile. It is understood to be 10 times more advanced than the original version of ChatGPT.

There is an “out-of-control race” between AI firms to develop ever more powerful AI that “no one – not even their creators – can understand, predict, or reliably control,” FOLI claimed.

Among the top concerns were whether machines could flood information channels with “propaganda and untruth,” and whether they would “automate away” all employment opportunities.

FOLI took these concerns one step further, suggesting that the entrepreneurial efforts of these AI companies may lead to an existential threat:

“Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?”

“Such decisions must not be delegated to unelected tech leaders,” the letter added.

The institute also agreed with a recent statement from OpenAI founder Sam Altman suggesting an independent review may be required before training future AI systems.

In his Feb. 24 blog post, Altman highlighted the need to prepare for artificial general intelligence (AGI) and artificial superintelligence (ASI).

Not all AI pundits have rushed to sign the petition, though. Ben Goertzel, the CEO of SingularityNET, explained in a March 29 Twitter response to Gary Marcus, the author of Rebooting.AI, that large language models (LLMs) won’t become AGIs, of which there have been few developments to date.

Instead, he said research and development should be slowed down for things like bioweapons and nukes.

In addition to large language models like ChatGPT, AI-powered deepfake technology has been used to create convincing image, audio and video hoaxes. The technology has also been used to create AI-generated artwork, with some concerns raised about whether it could violate copyright laws in certain cases.

Related: ChatGPT can now access the internet with new OpenAI plugins

Galaxy Digital CEO Mike Novogratz recently told investors he was shocked by the amount of regulatory attention that has been given to crypto, while so little has been directed toward artificial intelligence.

“When I think about AI, it shocks me that we’re talking so much about crypto regulation and nothing about AI regulation. I mean, I think the government’s got it completely upside-down,” he opined during a shareholders call on March 28.

FOLI argued that if an AI development pause is not enacted quickly, governments should step in and institute a moratorium:

“This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium,” it wrote.

Magazine: How to prevent AI from ‘annihilating humanity’ using blockchain



