Saturday, March 26, 2022

Lethal Autonomous Weapons and the Future of War

    Many countries around the world are rapidly developing their artificial intelligence (AI) capabilities. AI technology has many applications, but defense seems to be among the most enticing. What role will AI really play in future conflicts? 


    AI can be useful in identifying enemy weapons systems, pinpointing high-value targets using facial recognition, translating texts for intelligence collection, and supporting various other information operations. These applications are valuable because they reduce workload and improve productivity. However, many fear what this technology will look like on the battlefield, as some global superpowers may have a more sinister use in mind. 


The future of AI might look like what people call “killer robots,” or lethal autonomous weapons systems (LAWS). These weapons could have the ability to eliminate targets without human decision-making. Some believe the benefits outweigh the potential drawbacks. The most obvious arguments in favor of LAWS are the lower human and political costs: removing soldiers from the battlefield seems too enticing to resist. 


Others have argued the risks are not worth the rewards. LAWS could upset international law and security, and removing human decision-making can also remove accountability. What happens if the technology makes a mistake? Additionally, if not controlled, LAWS could end up in the wrong hands (if there are even right hands for this technology). What if terrorist organizations develop their own LAWS?


But LAWS may be entering the battlefield sooner rather than later. The US, Russia, China, Turkey, and several other countries have already begun experimenting with and testing AI technology and LAWS. Russia has already demonstrated its use of controversial weapons, such as cluster munitions, during the war in Ukraine, and some fear LAWS are simply the next step. A 2021 United Nations Security Council report stated that the Turkish Kargu-2 drone was used in Libya to autonomously hunt down and attack targets, a reminder that the future is closer than we think. 


The need to control AI technology on the battlefield is becoming more urgent. Recent attempts toward regulation have been made, such as AI guidelines issued by the Department of Defense and the EU’s AI Act, but it may already be too late. A world that previously seemed beyond our reach, only present in otherworldly sci-fi novels and movies, may soon resemble the modern battlefield.  
