Dr Radha Iyenghar Plumb, co-author of a 2010 paper linking civilian casualties in Afghanistan to subsequent insurgent violence, was sworn in on 9 April 2024 as the Department of Defense (DoD) Chief Digital and Artificial Intelligence Officer (CDAO), a position that should prove key to the implementation of US policies for autonomous weapon systems and autonomous systems in the kill chain.
Iyenghar Plumb most recently served as the DoD Deputy Under Secretary of Defense for Acquisition and Sustainment. She has held a variety of senior industry positions, including director of research and insights for trust and safety at Google, global head of policy analysis for Facebook, and senior economist at the RAND Corporation, as well as academic posts at the London School of Economics as an assistant professor and at Harvard University as a Robert Wood Johnson Health Policy Scholar.
Iyenghar Plumb co-authored a working paper in October 2010, which found strong evidence that for Afghanistan “local exposure to civilian casualties caused by international forces leads to increased insurgent violence over the long-run,” termed by the authors as the ‘revenge’ effect. However, no evidence was found for a similar reaction to civilian casualties in Iraq. The paper concluded that “reducing civilian casualties is not necessarily in conflict with the objective of protecting the lives of international forces.”
In post as the Deputy Under Secretary of Defense for Acquisition and Sustainment, Iyenghar Plumb worked to build out the US security industrial base and supply chain. As the new CDAO, she will be tasked with integrating and optimising artificial intelligence capabilities across the DoD, as well as accelerating the department's adoption of data and analytics.
In an open discussion with the Center for Strategic & International Studies in September 2023, Iyenghar Plumb described China's rapid fusion of AI and military technology as a worrisome development, framing it as a pacing challenge that furthers China's state interests in line with autocratic values. She went on to tout the US economic advantage as a hard counter to these efforts, with solutions derived from the rapid scaling of information technologies, aided by government programmes to reduce sticking points in development.
DoD Directive 3000.09 on lethal autonomous weapon systems (LAWS), first published in November 2012 and most recently updated in January 2023, is aimed at ensuring commanders and operators can exercise appropriate levels of human judgement over the use of force. Reflecting rapid advances in autonomous technology, and the need for responsibility in managing this advancement, the latest changes to DODD 3000.09 clarify which autonomous weapon systems require additional senior review prior to formal development and before fielding. The Chief Digital and AI Officer sits on the Autonomous Weapon System Working Group, established by DODD 3000.09, to support and advise the senior-level review process.
In international discussions on the use of LAWS, 30 countries and 165 nongovernmental organisations have called for a pre-emptive ban on such systems, according to a February 2024 Congressional Research Service paper, many citing ethical concerns around their use and the risk of collateral damage. Many see LAWS as incompatible with recognising accountability, and argue that the systems carry operational risk and fail to take into account considerations of proportionality.
The US Government does not currently favour a ban on LAWS, and in a 2018 white paper it argued that LAWS could prove a benefit to humanitarian causes by improving the accuracy of weapon strikes. However, as part of the weapons review process, any changes made to systems, whether as a result of machine learning or otherwise, must undergo relevant testing and evaluation to ensure the safety features and the ability to operate as intended are retained.