04/03/2023 / By Ethan Huff
The Future of Life Institute is circulating a petition that calls for an immediate pause on all major artificial intelligence (AI) projects, citing their existential threat to human life.
Signed by Elon Musk and numerous other bigwigs in the tech industry, the petition cites “extensive research” showing that “AI systems with human-competitive intelligence can pose profound risks to society and humanity.”
“Contemporary AI systems are now becoming human-competitive at general tasks, and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth?” the petition further reads.
“Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete, and replace us? Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders.”
Only if and when a consensus agrees that the effects of AI systems “will be positive and their risks will be manageable” should the world allow their wide-scale adoption as a welcome addition to the world, the petition concludes.
(Related: Musk has been warning for years that AI is a very serious threat against humanity.)
At first glance, the true motivation behind this petition and Musk’s signing of it seems to be more about protecting those “fulfilling” jobs than it is about protecting the non-fulfilling ones.
The oligarch class has never cared about ordinary people losing their jobs to machines, or about any of the other society-destroying technologies that have emerged. Only when AI became powerful enough to replace Hollywood actors, for instance, or high-level politicians did it suddenly become a problem.
Voice actors, as another example, are freaking out that their high-paying gigs are drying up as AI systems learn to mimic voices for free. When millionaires and billionaires start to feel that their positions in society are threatened by AI, in other words, that is when petitions like this suddenly begin to circulate.
For at least the next six months, the petition states, there should be a “public and verifiable” pause on all AI systems more powerful than GPT-4, the AI model that has been all the rage in recent months.
An independent oversight board with “rigid auditing” capabilities must be able to ensure that these advanced AI robots are “safe beyond a reasonable doubt” before such projects are ever allowed to proceed.
In addition to Musk, other well-known signatories of the petition include Steve Wozniak (co-founder of Apple), Jaan Tallinn (co-founder of Skype, Future of Life Institute), Evan Sharp (co-founder of Pinterest), Emad Mostaque (CEO of Stability AI), and numerous others, including professors and executives at MIT and Harvard, and even the CEOs of various AI start-up companies.
“I fear that AI may replace humans altogether,” warned the late physicist Stephen Hawking in an interview with Wired magazine that was reportedly seen by Cambridge News.
“If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans.”
Musk agrees, having stated that he himself should be “on the list of people who should absolutely *not* be allowed to develop digital superintelligence” because of what could come about from his involvement in the industry.
In the comments, someone wrote that she believes AI will not replace humans so much as it will be used “by horrible humans to control, torment, vaccinate, and / or exterminate other people.”
“The human monsters we are dealing with right now are dangerous enough,” this person added.
The latest news about the AI takeover of the world can be found at Transhumanism.news.