December 3, 2024

I worry that we’re going to field many of these systems without really thinking through both the legality and morality of putting them into the field.

A BETTER PEACE welcomes Dr. Paul Springer, Chair of the Department of Research at the U.S. Air Force Air Command and Staff College. Paul joins our Editor-in-Chief Jacqueline Whitt to examine the current state of autonomous warfare and to look ahead at where the technology may be going. Paul argues that the use of artificial intelligence (AI) on the battlefield is a revolution in military affairs (RMA) that affects both the nature and the character of warfare. This new norm will require a new structure of understanding and behavior that some are not ready to adopt. When will we become comfortable enough with the technology to eliminate the human in the loop, and what will it mean for humanity?

Dr. Paul Springer is the Chair of the Department of Research at the Air Command and Staff College, Maxwell AFB, AL. He is the author or editor of 12 books in print, including Outsourcing War to Machines: The Military Robotic Revolution and Military Robots and Drones: A Reference Handbook. Jacqueline E. Whitt is an Associate Professor of Strategy at the U.S. Army War College and the Editor-in-Chief of WAR ROOM. The views expressed in this presentation are those of the speakers and do not necessarily reflect those of the U.S. Army War College, U.S. Army, or Department of Defense.

Photo Description: A special ribbon-cutting ceremony, signaling the completion of work on the first RQ-4 Global Hawk at Robins Air Force Base, Ga., was held on the base flight line June 29, 2017. Robins AFB is the first and only installation to have a building-based Launch and Recovery Element, allowing the aircraft to take off and land from this location. This is also the first time a Global Hawk has flown into an Air Force air logistics complex. Warner Robins Air Logistics Complex maintenance professionals meticulously painted the aircraft to prevent corrosion.

Photo Credit: U.S. Air Force photo by Tech. Sgt. Kelly Goonan

Articles and Episodes related to this topic:

WE NEED AN AI-BASED ENEMY ANALYSIS TOOL … NOW!

ROLL OUT THE ROBOTS! MANAGING COMM NETWORKS AND ACCESS IN THE FUTURE

INCORPORATING ARTIFICIAL INTELLIGENCE: LESSONS FROM THE PRIVATE SECTOR

HOW DO ORGANIZATIONS CHANGE AFTER INCORPORATING ARTIFICIAL INTELLIGENCE?

THE IRON TRIANGLE: TECHNOLOGY, STRATEGY, ETHICS, AND THE FUTURE OF KILLING MACHINES

A.I. & THE URGENCY OF FINISHING FIRST

2 thoughts on “AI ON THE BATTLEFIELD? – IT’S ALREADY HERE”

  1. Be a bit cautious of human-robotic teams. The robots can document the environment and actions around them, including the actions of their human teammates. Some of those actions are war crimes, now fully documented and reported instantly to higher responsible authority, even to the national level. This is an invitation to “command by negation” from significantly higher command levels, negating local initiative based on commander’s intent.

    Remember that mines have always been autonomous weapons; now they are becoming mobile weapons with multi-sensor pattern recognition and libraries of target signatures.

    1. In general, most land mines are classified as “automatic” rather than “autonomous,” as they respond to stimuli but do not engage in any decision-making prior to detonation. To explain more fully, every mine has a specific condition that must be met before it triggers. For the traditional anti-personnel mine, the trigger mechanism is a mechanical system activated by sufficient pressure being placed upon it, usually a human (or other animal of sufficient weight) stepping on the mine. For anti-tank mines, the trigger requires either a much greater pressure (so that humans can’t set it off by accident), magnetic contact with metal (since humans aren’t inherently metallic), or a “flag” to be pushed over by the weight of a vehicle.

      But the development of autonomous mines is a real thing, to be sure: mines capable of deciding whether or not they will detonate, based on environmental sensing and decision-making. To be autonomous, they need multiple potential courses of action to select from, and the ability to choose among them without external input. The ability of these mines to move around just makes them all the more terrifying.
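To make the distinction drawn in the exchange above concrete, here is a minimal, purely illustrative sketch. Every name, threshold, and sensor value in it is hypothetical and chosen only to contrast a fixed “automatic” stimulus-response rule with an “autonomous” choice among several courses of action; it does not describe any fielded system.

```python
# Illustrative toy only: contrasts an "automatic" trigger (fixed rule, no
# alternatives) with an "autonomous" decision (selection among several
# courses of action from onboard sensing). All values are hypothetical.
from dataclasses import dataclass
from enum import Enum


@dataclass
class SensedContact:
    pressure_kg: float      # load on a pressure plate
    magnetic: bool          # ferrous-metal signature detected
    signature_match: float  # 0..1 similarity to a stored target signature


def automatic_trigger(contact: SensedContact) -> bool:
    """'Automatic': one condition, one response. If the stimulus meets the
    threshold, the device fires; there is no weighing of alternatives."""
    return contact.pressure_kg >= 7.0  # notional footstep threshold


class Action(Enum):
    HOLD = "hold"          # keep waiting
    RELOCATE = "relocate"  # (mobile system) reposition instead of engaging
    DETONATE = "detonate"


def autonomous_decision(contact: SensedContact) -> Action:
    """'Autonomous': multiple courses of action are available, and the device
    selects among them from its own sensing, without external input."""
    if contact.signature_match >= 0.9 and contact.magnetic:
        return Action.DETONATE  # high-confidence match to a stored signature
    if contact.signature_match >= 0.5:
        return Action.RELOCATE  # ambiguous contact: reposition rather than fire
    return Action.HOLD          # no engagement decision


if __name__ == "__main__":
    footstep = SensedContact(pressure_kg=80.0, magnetic=False, signature_match=0.1)
    vehicle = SensedContact(pressure_kg=900.0, magnetic=True, signature_match=0.95)
    print("automatic fires on footstep:", automatic_trigger(footstep))            # True
    print("autonomous action on footstep:", autonomous_decision(footstep).value)  # hold
    print("autonomous action on vehicle:", autonomous_decision(vehicle).value)    # detonate
```

The point of the contrast is the one made in the reply: the automatic rule fires on any sufficient stimulus, while the autonomous version holds, moves, or engages based on its own evaluation of the sensed environment.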
