Digital Minefield

Why The Machines Are Winning


Who’s In Control?


I’ve written a lot lately about autonomous vehicles, weapons, etc. In the news right now are remote-controlled drones interfering with California fire fighters. What’s the connection? Whether you’re on the annoying end of a self-driving car or a human-piloted drone, it’s all the same.

What’s my point? When it comes to laws prohibiting or regulating their actions, devices must be treated based on their actions and capabilities. The state of their autonomy has nothing to do with the case.

This is also true when it comes to finding the culprit. If a device breaks the law (or a regulation), then the responsible party must pay the penalty. If the device is autonomous, it will be hard to determine who sent it on its independent mission.

In other words, before we can have any kind of autonomous device, we need enforceable laws to connect the intent of the person controlling the device to its actions. As you might imagine, this will be more difficult than identifying a person with a drone.

Wait a minute! Can we even do that? If a drone can be connected to a controller device—and then to the person using that device—then why are California fire fighters having these problems?

It seems implausible that a drone controller could control more than one drone. However, suppose that instead of a unique identifier pairing each drone with its controller, the manufacturer uses only a hundred unique identifiers for the thousands of drones it makes. Or maybe only a dozen.

Inasmuch as drone buyers do not have to register the identifier (nor is there a law requiring sellers to keep such records), the only way an errant drone’s owner could be prosecuted would be to recover its identifier and find its controller.
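To make the arithmetic behind that problem concrete, here is a minimal sketch in Python (the production and identifier counts are purely hypothetical, not figures from any manufacturer) of how little a recovered identifier narrows the search when IDs are reused and sales go unregistered.

```python
# Minimal sketch, hypothetical numbers: when a manufacturer reuses a small pool
# of identifiers across a large production run and no one registers the sales,
# recovering an errant drone's ID only narrows the hunt to a pool of owners.

DRONES_SOLD = 10_000   # assumed production run for one model
UNIQUE_IDS = 100       # assumed pool of identifiers reused across that run

candidates_per_id = DRONES_SOLD / UNIQUE_IDS
print(f"A recovered ID points to roughly {candidates_per_id:.0f} possible drones.")

# With a true one-to-one registry (one ID per drone, each ID tied to a buyer),
# the same lookup would name exactly one responsible owner.
```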

The latter task requires searching an area whose radius is the maximum control distance for that model, assuming the owner is foolish enough to keep the controller after the drone didn’t come back, and assuming the owner was operating from a house and not a car.
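For a sense of scale, here is a rough calculation (the control range is an assumed figure, not the spec of any particular drone) of how much ground that search covers.

```python
# Rough sketch, assumed figure: the owner could be anywhere within the
# controller's maximum range of where the drone was seen flying.
import math

MAX_CONTROL_RANGE_KM = 2.0   # assumed maximum control distance for the model

search_area_km2 = math.pi * MAX_CONTROL_RANGE_KM ** 2
print(f"Search area: about {search_area_km2:.1f} square kilometers.")

# Even a modest 2 km range means canvassing more than a dozen square kilometers
# of houses, parks, and roads, and nothing at all if the operator was in a car.
```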

Without a complete registry of totally unique drone and controller IDs, these devices are free to fly wherever the owner wants. Unlike a gun, which leaves identifying marks on the bullets it fires, a drone controller can’t be traced.

These rabbits have clearly left Pandora’s hat. Short of declaring illegal all existing drones (i.e., those without a totally unique identifier), there is no way for society to control the use of these devices.

However, we have the opportunity to pass such laws for autonomous devices not yet on the market. The real question is: Does society have the will? I doubt it, since it’s not too late to redo the drones, and yet I see no inclination to do so.

Who would have thought that a technology as innocuous as toy drones could expand into total social chaos? As for banning autonomous weapons, the military will resist IDs. And I can see the NRA in the wings, salivating at the chance to put in its two cents.


Worst Idea Ever


I assume by now you’ve heard about the ban on AI weapons proposed in a letter signed by over 1000 AI experts, including Elon Musk, Steve Wozniak, and Stephen Hawking. The letter was presented last week at the International Joint Conference on Artificial Intelligence in Buenos Aires.

The letter states: “AI technology has reached a point where the deployment of [autonomous weapons] is—practically if not legally—feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”

The rest of us have been occupied with threats of various kinds of autonomous vehicles (cars, trucks, drones), and we failed to see what was right around the corner. Autonomous weapons are more than battlefield robot-soldiers (unlikely) or gun-toting drones (likely). And weapons are more than guns.

Robots have no qualms about sacrificing themselves. It’s a different kind of warfare when the weapons deliverer is the weapon. It’s kamikaze on steroids.

Only now do I realize, after writing a dozen posts about autonomous autos, that they and their ilk are merely stalking horses for the worst idea ever. Autos were just the beginning. Add long-haul trucks, obviously, because they’re already being tested.

How about local trucks? Post Office? UPS? How about city buses? School buses? Don’t forget Amazon’s drones. Can larger autonomous drones replace helicopters? For news? Police surveillance? Medevacs?

Look at it another way. Instead of using robots to replace individual humans, autonomous vehicles replace an entire job, e.g., driving a truck. The idea behind all these autonomous devices is to get the public used to the concept, so they won’t question any of them, even those that go way over the top.

Aside from giving the people in control even more control through the leverage of computers, there’s the general degradation of the populace, who are made less valued than the robots that replace them.

How did humans come to this insane position? Here’s how. People who control the machines not only think they are so much smarter than other people (e.g., the ones they want to replace with robots), they also think they can make computers smarter than other people. This is the AI they seek.

And there are some so enamored of intelligence in any form that if they succeed at making a superhuman artificial intelligence—one even smarter than themselves—they will bow down and worship it. Even as it destroys them.
