Put together, these two technologies may effectively combat mass shootings. In brief, non-lethal drones can be installed in schools and other venues and play the same role that sprinklers and other fire suppression tools do for firefighters: Preventing a catastrophic event, or at least mitigating its worst effects.
Of course, I appreciate the risks in such a proposal, and I know it sounds faintly ludicrous to some. That’s why we must start with a caveat: We cannot introduce anything like non-lethal drones into schools without rigorous debate and laws that govern their use. Here are three that can serve as a starting point:
1) Non-lethal drones should be used to save lives, not take them.
This must be a bedrock principle of any such proposal: Drones designed for a security purpose in a non-military setting must never have the capacity to take life. Simply put, drones that kill defeat the purpose of this proposal. Thus, any weapon attached to a robot or drone should fire only munitions that incapacitate, not kill.
2) Humans must own use-of-force decisions and take moral and legal responsibility.
In every dystopian sci-fi movie, the fear of technology is the same: What happens if the robot goes rogue? Which brings us to the second law of armed robotics: Human beings must control all drone decision-making and thus take moral and legal responsibility for outcomes. And not just any human being, but an authenticated human with the appropriate training, permissions, and authority to act. In other words, a highly trained law enforcement official.
The decision to deploy force can't be left in the hands of a random, untrained individual. For all our advancements in technology, human beings still possess judgment: the capacity to weigh risks and assess situations with nuance. We want to take the best of what technology and robotics can offer—speed, accuracy, and risk reduction—but never, ever lose sight of human judgment in volatile settings.
3) Agencies must provide rigorous oversight and transparency to ensure acceptable use.
A system like this can only be taken seriously if it starts with rigorous oversight—in the world of armed robots (even if only non-lethal), there’s no such thing as “shoot first, and ask questions later.” We must build the oversight and transparency systems into protective robotic technology from the get-go—both to ensure safety and to build public buy-in.