Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS)

Agenda item 5(a)
An exploration of the potential challenges posed by emerging technologies in the area of LAWS to International Humanitarian Law

Statement
September 21, 2020

Thank you, Mr. Chair. The United States appreciates the focus of this agenda item on the application of international humanitarian law (IHL) to emerging technologies in the area of lethal autonomous weapons systems (LAWS). Guiding Principle (a) reflects the foundational premise that IHL applies to these weapons, and the GGE’s 2019 report contains significant conclusions on IHL. Of course, much more work can be done on IHL. This work on understanding how IHL applies is critical to effectively implementing the other guiding principles, including guiding principles (b), (c), (d), (e), and (h).

Indeed, reaching common understanding on what existing IHL requires could also help us resolve diverging perspectives on whether new law or norms are needed, as such an effort would help us to better understand our respective legal positions and determine whether these diverging perspectives are based on different understandings of the requirements imposed by existing law.

Mr. Chair, the GGE should build on its successful work on IHL by further clarifying IHL requirements applicable to the use of emerging technologies in the area of LAWS. In our national commentary to Guiding Principle (a), we have proposed that this be done by considering how militaries have generally used autonomous functions in weapon systems and articulating conclusions about these general use scenarios.

In particular, we propose examining how the following uses of emerging technologies in the area of LAWS can be consistent with IHL:

1. Using autonomous functions to effectuate more accurately and reliably a commander’s or operator’s intent to strike a specific target or target group;

2. Using emerging technologies in the area of LAWS to inform decision-making; and

3. Using weapons systems that autonomously select and engage targets where the human operator has not expressly intended to strike a specific target or group of targets when activating the weapon system.

These use scenarios frame a number of questions that are worth further exploration, such as: when is it consistent with IHL for a decision-maker to rely on a machine assessment to consider a target to be a military objective? What factors should inform a proportionality assessment regarding the employment of weapons systems that autonomously select and engage targets?

We have proposed conclusions on these and other issues. For example, we propose that the GGE build on last year’s report, which recognized the importance of precautions, by elaborating on the types of precautions that States have employed in weapon systems with autonomous functions. On page 4 of our national commentary, we proposed the following:

Feasible precautions must be taken in use of weapon systems that autonomously select and engage targets to reduce the expected harm to civilians and civilian objects. Such precautions may include:

i. Warnings (e.g., to potential civilian air traffic or notices to mariners);

ii. Monitoring the operation of the weapon system; and

iii. Activation or employment of self-destruct, self-deactivation, or self-neutralization mechanisms (e.g., use of rounds that self-destruct in flight or torpedoes that sink to the bottom if they miss their targets).

Reaching more granular understandings like this of IHL requirements would strengthen the normative and operational framework. For example, it would improve our ability to conduct legal review of weapons, to train personnel to comply with IHL requirements, and to apply principles of State and individual responsibility.

We plan to discuss the issue of human-machine interaction in greater detail during the appropriate agenda item later this week, but let me just note that in our view, IHL does not establish a requirement for “human control” as such. Rather, IHL seeks, inter alia, to ensure the use of weapons is consistent with the fundamental principles and requirements of distinction, proportionality, and precautions.

The application of IHL to emerging technologies in the area of LAWS is a critical topic. We welcome today’s discussion, and we look forward to continued discussions with other delegations on the more technical legal conclusions regarding how IHL applies in these three use scenarios, which we have proposed in our working papers and national commentary. In particular, we would welcome more focused discussions with legal experts on these issues as part of our ongoing work.

Resources

· U.S. Commentary on Guiding Principles, September 1, 2020.

· U.S. Working Paper, “Autonomy in Weapons Systems,” November 10, 2017, available at https://unog.ch/80256EDD006B8954/(httpAssets)/99487114803FA99EC12581D40065E90A/$file/2017_GGEonLAWS_WP6_USA.pdf.

· U.S. Working Paper, “Implementing IHL in the Use of Autonomy in Weapon Systems,” March 28, 2019, available at https://unog.ch/80256EDD006B8954/(httpAssets)/B2A09D0D6083CB7CC125841E0035529D/$file/CCW_GGE.1_2019_WP.5.pdf.

· U.S. CCW GGE Statement, “U.S. Practice in the Assessment of Weapons Systems,” March 28, 2019, available at https://geneva.usmission.gov/2019/03/28/convention-on-ccw-u-s-practice-in-the-assessment-of-weapons-systems/.