Worst Idea Ever
I assume by now you’ve heard about the ban on AI weapons proposed in a letter signed by over 1000 AI experts, including Elon Musk, Steve Wozniak, and Stephen Hawking. The letter was presented last week at the International Joint Conference on Artificial Intelligence in Buenos Aires.
The letter states: “AI technology has reached a point where the deployment of [autonomous weapons] is—practically if not legally—feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”
The rest of us have been preoccupied with the threats posed by various kinds of autonomous vehicles (cars, trucks, drones), and we failed to see what was right around the corner. Autonomous weapons are more than battlefield robot-soldiers (unlikely) or gun-toting drones (likely). And weapons are more than guns.
Robots have no qualms about sacrificing themselves. It’s a different kind of warfare when the weapons deliverer is the weapon. It’s kamikaze on steroids.
Only now do I realize, after writing a dozen posts about autonomous autos, that they and their ilk are merely stalking horses for the worst idea ever. Autos were just the beginning. Add long-haul trucks, obviously, because they’re already being tested.
How about local trucks? Post Office? UPS? How about city buses? School buses? Don’t forget Amazon’s drones. Can larger autonomous drones replace helicopters? For news? Police surveillance? Medevacs?
Look at it another way. Instead of a robot replacing a single human, an autonomous vehicle replaces an entire job, e.g., driving a truck. The idea behind all these autonomous devices is to get the public used to the concept, so they won’t question any of them, even those that go way over the top.
Aside from handing those already in control even more control through the leverage of computers, there’s the general degradation of the populace, who become less valued than the robots that replace them.
How did humans come to this insane position? Here’s how. The people who control the machines believe not only that they are far smarter than other people (e.g., the ones they want to replace with robots), but that they can make computers smarter than other people. This is the AI they seek.
And some are so enamored of intelligence in any form that if they succeed at making a superhuman artificial intelligence—one even smarter than themselves—they will bow down and worship it. Even as it destroys them.