Thursday, March 08, 2018

Self-Regulating Communities Writing Self-Regulating Algorithms

Autonomous weapons systems (AWS) open the door to “really neat operations” that the U.S. military hasn’t necessarily been able to do before, according to Major Jen Snow of SOFWERX. Major Snow, associated with the Donovan Group at SOFWERX, builds relationships between Special Operations Command (SOCOM) and what she calls “self-regulating communities” of “makers and hackers.” SOFWERX runs outreach to inventors and innovators, trying to glean technological advantages on the battlefield by offering prizes and exposure for makers and hackers at its rapid-prototyping expos. Engineers, hackers, inventors, and scientists fill SOFWERX’s Florida headquarters, offering new ways for soldiers to eliminate or use unmanned aerial systems (UAS), detect infrared signatures, hack enemy systems, and much more.

The outreach SOFWERX is engaged in highlights the pressure the U.S. military feels to stay ahead of technological developments on the battlefield, and SOCOM isn’t the only organization outsourcing. The Defense Advanced Research Projects Agency (DARPA) also issues grants to industry and academic teams working on advanced technology. In recent years, DARPA has focused increasingly on deep learning, a form of artificial intelligence in which a system learns its own decision rules from data rather than following rules a programmer wrote, to augment America’s warfighting capabilities. This brand of artificial intelligence can be dramatically more effective than comparatively transparent, hand-coded algorithms, but its OODA (observe, orient, decide, act) loop takes place within a black box.
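The contrast between transparent and self-taught algorithms can be made concrete with a toy sketch. The scenario below is entirely hypothetical (it is not drawn from any real military system): a hand-written rule exposes every threshold to an auditor, while even a minimal learned model, here a tiny perceptron standing in for deep learning at toy scale, reduces its decision logic to numbers a reviewer cannot read reasons out of.

```python
# Transparent rule: every branch was authored by a person and can be audited.
def transparent_classifier(speed_mps, altitude_m):
    # Each threshold below is explainable: "fast and low" flags the target.
    return speed_mps > 50 and altitude_m < 300

# Learned rule: the weights come from a training loop, not a programmer.
def train_weights(samples, epochs=100, lr=0.01):
    """Tiny perceptron update; a stand-in for deep learning at toy scale."""
    w = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            bias += lr * err
    return w, bias

# Hypothetical labeled data: (speed, altitude) -> flag (1) or ignore (0).
samples = [((60, 100), 1), ((10, 500), 0), ((70, 50), 1), ((5, 400), 0)]
w, b = train_weights(samples)

def learned_classifier(speed_mps, altitude_m):
    # A reviewer sees only the fitted numbers in w and b, not reasons.
    return w[0] * speed_mps + w[1] * altitude_m + b > 0

print(transparent_classifier(60, 100))  # True, and we can say exactly why
print(learned_classifier(60, 100))      # True, but "why" is buried in weights
```

Both functions agree on this data, yet only the first can explain itself; a deep network has millions of such weights instead of two, which is the black box the paragraph above describes.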

To clarify: the U.S. military’s plan for developing AWS is to rely on self-regulating communities to develop self-regulating algorithms, which will then be tasked with “really neat operations” that the U.S. military might feel uncomfortable risking human soldiers to conduct. Furthermore, not all of the ideas coming out of those self-regulating communities of makers and hackers will be selected for military use, but the rejects will be ready for quick sale to basically anyone else. So, ultimately, humans won’t be able to know why our AWS make the decisions they do, and the military won’t fully control who develops AWS or how they are developed. That’s not just a recipe for a lack of meaningful human control—that’s a recipe for a lack of meaningful state control.
