By Aaron Kesel
Thousands of leading scientists have urged their colleagues not to help governments create killer robots and make the movie The Terminator a reality.
More than 2,400 scientists have pledged to block the development of lethal weaponry that uses artificial intelligence, The Guardian reported.
In other words, killer robots that could eventually develop a mind of their own and take over the world. The scientists have vowed not to support robots “that can identify and attack people without human oversight.”
Two leading experts backing the commitment, Demis Hassabis at Google DeepMind and Elon Musk at SpaceX, are among the more than 2,400 signatories who have pledged to deter military firms and nations from building lethal autonomous weapon systems, referred to as LAWS.
The move is the latest from scientists and organizations concerned about giving a machine the power to decide whether a person lives or dies.
The pledge, organized by the Future of Life Institute, calls on governments to agree on norms, laws, and regulations that stigmatize and effectively outlaw the development of killer robots.
The signatories of the pledge have stated they will “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.” More than 150 AI-related firms and organizations added their names to the pledge that was announced at the International Joint Conference on AI in Stockholm.
Robots are beginning to take over every aspect of society. They are also moving into retail and freight delivery, threatening to eliminate trucking jobs.
But can we really give robots the choice of whether to kill a human being or let them live?
In fact, allowing a freight truck to drive itself sounds dangerous enough; if the sensors fail on a big rig traveling 60-70 MPH, that is potentially 40 tons barreling down the highway with nothing but artificial intelligence in control.
As another example, imagine an A.I. having to choose whom to let live in a freak accident.
This may be why, as Activist Post reported back in March, Uber had to halt nationwide testing of its A.I. vehicles following the death of a pedestrian in Arizona. And that was a car with a human backup operator behind the wheel.
Automation clearly isn’t a foolproof technology, and it can also be exploited by hackers for malicious purposes that could even include programming a bot to kill an individual.
This comes after Google employees drafted a petition against the company’s involvement with Project Maven, demanding that the engagement end and that Google commit to never work with the military again. That pressure led Google to announce it would quit its drone program.
However, the agreement was already signed, so the company is locked in for another year until the contract runs out in March 2019. Google can then legally stop assisting the government with the advancement of artificial intelligence for use with its drones.
At least a dozen staff resigned over the issue.
A DoD statement from last July announced that Project Maven aimed to “deploy computer algorithms to war zones by year’s end.”
The military has also proposed a drone mothership in the sky, like something out of the movie Captain America: The Winter Soldier, a frightening concept to say the least.
Musk, the Tesla founder, has previously said that artificial intelligence is potentially more dangerous than nuclear weapons. That concern was shared by scientist Stephen Hawking, who also warned that “artificial intelligence could spell the end for the human race if we are not careful enough because they are too clever.”
It sounds like the beginning of The Terminator, and even scientists agree that machines could begin to think for themselves in the near future and become a threat to the human race.
With this pledge by more than 2,400 of the world’s brightest minds, maybe we can stop a potential man-made threat to the human race.
* Aaron Kesel writes for Activist Post.