Lethal autonomous weapons systems (LAWS) are weapons systems
that, once activated, require little to no human action to carry
out an objective. LAWS often use computer algorithms and sensor systems to identify
a target and direct available weapons against it (CRS Report). The general
definition of LAWS covers a wide variety of weapons, and levels of autonomy can
range from partial to complete. Lawfare
defines autonomous weapons as "weapons that can select, detect and engage
targets with little to no human intervention."
At a basic level, all of this seems cool and innovative, but
is it actually a good idea to develop, produce, and deploy LAWS as part of our military
strategy? Or is this technological innovation for its own sake, innovation that
carries with it complex international implications?
Whatever the answer, LAWS raise serious questions and carry real
consequences when they are used.
When LAWS make mistakes, target innocent civilians, or
otherwise malfunction, who is ultimately responsible for the consequences? If an
autonomous weapon indiscriminately kills civilians because of a corrupted
algorithm, who is held responsible for the breach of international law? The military
deploying the weapon, the potentially civilian corporations that wrote the code,
and the contractors who produced and tested the weapon all potentially bear
responsibility. Unlike a drone, which has a dedicated military operator
dictating its actions, autonomous weapons may have no direct military
operator, muddling the lines of responsibility and weakening oversight and
accountability.
Autonomous weapons are also vulnerable to software breaches and
hacks. Like any other technology, LAWS have vulnerabilities. According to Human
Rights Watch, LAWS software is vulnerable to breaches: malicious actors could
corrupt or even take control of an adversary's LAWS and alter their algorithms
to target friendly groups, including that adversary's own military personnel or civilians.
Autonomous weapons in theory sound like innovations that could
limit mistakes and lower casualties in warfare. However, before these systems
enter the conventional theater, their vulnerabilities and weaknesses must be
considered.