CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)

U.S. Delegation Statement on Overarching Issues

Geneva, April 16, 2015

Thank you, Madame Chair.  We once again wish to thank you and the panel of experts for their presentations.  We would like to take this opportunity to comment on what we have heard today.

As with any weapon, any use of LAWS would have to be compliant with applicable law.  For purposes of law enforcement and border or crowd control, the applicable human rights framework and, of course, national law would play a vital role in discussions about the possible future development and use of LAWS.  In particular, how governments interact with their own populations and how legal frameworks prohibit repression of populations are questions of paramount importance in these discussions.  The United States is committed to ensuring the utmost respect for and adherence to human rights when it comes to the development, use, or export of any weapons system, including any potential future LAWS.

The context of armed conflict is governed primarily by international humanitarian law (IHL).  The United States is likewise committed to adherence to IHL.  As we discussed yesterday, there are no specific provisions in IHL that prohibit or restrict the use of autonomy to aid in the operation of weapons, including LAWS.

As we have discussed over the course of this week, our policy for many years has required legal review of the intended acquisition of a weapons system to ensure its development and use are consistent with applicable law, including IHL.  That review prohibits a weapon system where its intended use is calculated to cause superfluous injury; where it is inherently indiscriminate; or where it falls within a specifically prohibited class of weapons.  Part of the calculus of such a weapons review is to delineate whether there are legal restrictions on the weapon’s use that are specific to that type of weapon or whether other practical measures are needed, such as training or rules of engagement specific to the weapon.  In short, context and specific intended use matter, and this well-accepted approach to weapons legal review applies equally to LAWS.

Some have invoked the Martens Clause to assert that LAWS as a category are legally prohibited.  To be sure, the Martens Clause, with its focus on principles of humanity and dictates of public conscience, is germane to the issue of autonomy.  Equally, as with any emerging or novel technology of great complexity, we must be cognizant of unintended consequences.  Accordingly, it is all the more vital that states implement not just the legal review, but the broader operational, safety, and policy weapons reviews we have discussed.

Having said that, we believe that the assertion that the Martens Clause leads to a categorical legal prohibition of LAWS is not supported.  As an initial matter, the Martens Clause is not a rule of international law that prohibits any particular weapon, much less a weapon that does not currently exist.  Rather, it is recognition that when no specific rule applies, the customary principles of IHL govern conduct during armed conflict.  In general, the lawfulness of the use of a type of weapon under international law does not depend on an absence of authorization, but instead depends upon whether the weapon is prohibited.  Given that there remain significantly divergent views on the definition of LAWS, we cannot say such systems are prohibited by customary law.

We believe that the principles of humanity and the dictates of public conscience provide a relevant and important paradigm for discussing the moral or ethical issues related to the use of automation in warfare.  Indeed, they may even be indicative, as time and practice develop, of emerging norms and customs.  But as an initial matter, we agree with the point of the Swiss delegation this past Tuesday that it is important for purposes of our discussions not to conflate the legal and ethical issues presented by LAWS.  To be sure, it is vital that these discussions take up the ethical considerations presented, but we must be diligent in distinguishing what is legally prohibited from what is morally undesirable.  Otherwise, we run the real risk of stalling discussions in a morass of hypotheticals, with no tangible outcome forthcoming.

As to the moral questions, they turn on fundamental questions about the relationship between man and machine and how that relationship in turn impacts the conduct of armed conflict.  Given the current state of the technology, we cannot know what this impact will be, but it will be critical to optimize that relationship.  Some have suggested the possibility that LAWS could lower the threshold for going to war because they would allow us to go to war without putting as many human soldiers in harm’s way.  Others have suggested that such systems might lack empathy and therefore would not be restrained by compassion and mercy in situations where a human being might show such restraint, even where the law does not require it.  Still others have argued that ethical conduct on the battlefield requires complex judgments that only human beings can make, and that are impossible for programmers to predict in advance.  As we mentioned earlier this week, these are some of the challenges that States would face in developing LAWS.  This is why, under our Department of Defense Directive on autonomy in weapons systems, absent higher-level approval, our current policy permits human-supervised lethal autonomous weapons to intercept attempted time-critical or saturation attacks for static defense of manned installations or onboard defense of manned platforms, but not the selection of humans as targets.

While we believe that these and other questions are important questions to ask, they must be asked with respect to a particular weapon for a particular use in a particular context; only then can we accurately discuss these questions.  In certain contexts, the use of autonomous functions in weapons systems may be preferable to the weapons of today.  Looking back, we recognize that advances in weapons systems mean such systems have become more precise and in many ways less likely to cause collateral damage.  LAWS may continue this trend.  LAWS might even lower the threshold for engaging in peacekeeping operations, which could save lives, or prevent people from committing gross human rights violations, in situations where we might otherwise hesitate to put boots on the ground.  We do know that autonomy would be beneficial in contexts in which lethal force is not contemplated – defense against unmanned weapons, for example, or search and rescue missions.  What is important in all of these cases is that careful attention is paid to the context.

Finally, we note that human dignity is an important part of the discussion, particularly as it encompasses the moral questions noted.  But any such discussion must take account of the violence to human dignity inherent in armed conflict, regardless of the type of weapon used.
