Lethal Autonomous Weapons Systems: Can the International Community Agree?

February 13, 2019

"联合国法外处决、即决处决或任意处决问题特别报告员很少成为头条新闻,他/她的报告也很少得到人权活动家小圈子以外的关注。然而,当克里斯托夫-海恩斯从保护生命权的角度提交其关于通过武装无人机使用致命武力的报告时,情况就是这样"。

It is not very often that a United Nations Special Rapporteur on extrajudicial, summary or arbitrary executions makes headlines or that his/her report gets noticed beyond a small circle of human rights activists. Yet this is what happened when Christof Heyns submitted his report1 on the use of lethal force through armed drones from the perspective of protection of the right to life.

The report was prepared in response to a request by the General Assembly2, but the text of that resolution used only generic language about extrajudicial, summary or arbitrary executions: the focus on armed drones, which (though not illegal, as he pointed out) could make it easier for States to deploy deadly and targeted force on the territories of other States, was a departure from previous reports. Heyns argued that the modern concept of human rights was based on the fundamental principle that those responsible for violations must be held to account, and that a lack of appropriate transparency and accountability concerning the deployment of drones undermined the rule of law and might threaten international security.

In his conclusion, he urged action by the UN Security Council and recommended that States using armed drones be transparent about their development, acquisition and use, and that States ensure meaningful oversight of the use of drones, as well as investigation, accountability and reparations for their misuse.

It is not surprising that these recommendations did not see immediate follow-up. In my capacity as High Representative for Disarmament Affairs, I had met Mr. Heyns shortly after the release of the report and brought it up repeatedly in my discussions with Member States, in order to move the deliberations from the General Assembly's Third Committee (Human Rights) to the First (Disarmament and International Security), where I felt it clearly belonged.

Maybe that was over-ambitious, but sixteen countries expressed support in the First Committee for taking action under the umbrella of the Convention on Certain Conventional Weapons (CCW)—also known as the Inhumane Weapons Convention—to discuss the questions related to emerging technologies in the area of Lethal Autonomous Weapons Systems (LAWS). It is true that the structure of the CCW—a chapeau Convention and five annexed Protocols—is flexible in that it can accommodate additional Protocols, offering the possibility of adding a Protocol on LAWS. But the CCW has one additional feature that was clearly important to Member States: while the General Assembly takes decisions by majority, the CCW operates on the basis of consensus, which means that mandates or decisions can be blocked by a single objecting nation.

France, as Chair of the CCW, thus proposed to hold an Informal Meeting of Experts, which was convened for four days in May 2014 and attended by Member States as well as industry representatives and a large number of civil society organizations.

This was the first time that many Member States spoke on the topic, and it showed that few nations had developed a national policy on the matter, while many more States, less technologically advanced, still had questions. Thematic sessions, with significant input from AI scientists, academics and activists, dealt with legal aspects, ethical and sociological aspects, meaningful human control over targeting and attack decisions, as well as operational and military aspects. While it was acknowledged that international humanitarian and human rights law applied to all new weapons, there was no agreement on whether such weapons would be illegal under existing law or whether their use would be permitted in certain circumstances.

The report3 issued afterwards underlined that the meeting had contributed to forming common understandings, though questions remained, and many States expressed the view that the process should be continued.

Clearly, a chord had been struck. The year 2015 saw the topic of LAWS become ever more prominent: in the media (where LAWS were often characterized as "killer robots"), in scientific circles, in conferences, workshops, public meetings and panels. That year, at one of the world's leading AI conferences, the International Joint Conference on Artificial Intelligence (IJCAI 15), an Open Letter from AI & Robotics Researchers was presented, signed by nearly 4,000 preeminent scientists (such as Stuart Russell, Yann LeCun, Demis Hassabis, Noel Sharkey and many others) and over 22,000 endorsers (including Stephen Hawking, Elon Musk and Jaan Tallinn, to name just a few), which warned against AI weapons development and posited that LAWS could "become the Kalashnikovs of tomorrow". Most AI researchers have no interest in building AI weapons, the letter stated, and do not want others to tarnish their field by doing so4.

These actions encouraged advocacy organizations. The Campaign to Stop Killer Robots, coordinated by Human Rights Watch, had been created in 2013 as an international coalition of NGOs working to preemptively ban fully autonomous weapons. Influential organizations took notice: the World Economic Forum put on a one-hour televised discussion at its 2016 annual meeting called "What if robots go to war?" and followed it up with articles by Stuart Russell and others, as well as showcasing topics such as "How AI could increase the risk of nuclear war"5. The International Committee of the Red Cross (ICRC) posited that decisions to kill and destroy were a human responsibility and published papers on the subject.

Back to the discussions in the United Nations

The second informal meeting under the umbrella of the CCW took place in Geneva a year after the first, in April 2015. Participation was strong and diverse: 90 States (out of 125 High Contracting Parties), as well as delegations from other UN agencies, the ICRC, academics, scientists, military experts and industry representatives. When the meeting concluded, 58 States had given their views on LAWS, mostly to indicate support for multilateral talks. Some States voiced their explicit support for a preemptive ban, while others said a prohibition had to remain on the table for consideration.

It was interesting to note that not a single State said that it was actively pursuing fully autonomous weapons or that its armed forces would need them in the future, though extensive discussions were held on the potential benefits of such weapons and what advantages technological advancements might bring.

The only States to explicitly say that they were keeping the door open to the acquisition of fully autonomous weapons were Israel and the United States, while others noted that they had no plans to ever acquire them (Canada, France, Japan, UK).

The second meeting—now called the Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)—was clearly productive and advanced the discussion. The Chair (Ambassador Biontino of Germany) had consulted widely prior to the meeting and prepared a "food-for-thought" paper6 which outlined the issues to be covered. Prominent experts (including Stuart Russell, Andrea Omicini and Paul Scharre) briefed the participants and urged that action be taken soon, as progress in the development of AI was rapid.

Sessions were devoted to technical issues, characteristics of LAWS, possible challenges to IHL, overarching issues, transparency, and the way ahead7. The papers and presentations were uploaded to the UN CCW website; they were the most detailed contributions8 so far in this forum.

In the session on the way ahead, calls were made to continue the discussions on LAWS, with a dozen States urging that the CCW establish a Group of Governmental Experts to advance the issue. The subsequently-issued report9 by the Chair (issued in his personal capacity) was very detailed and brought out differing views and divergences of opinion, while urging that the debate be deepened, in particular to include an in-depth examination of legal weapons reviews (Article 36, Additional Protocol I10), a discussion of the general acceptability of LAWS in reference to the Martens Clause, ethical issues, the notion of meaningful human control, autonomy in the critical functions, command and control, and system-human interaction.

The Third Informal Meeting of Experts in 2016

The meeting took place in April, a year after the second meeting, and again the number of States participating rose, to 94. Again, the participants benefited from a "food-for-thought" paper11 issued by the Chair (Ambassador Biontino of Germany). He had also called for the submission of working papers to the meeting, to which five States (Canada, France, the Holy See, Japan and Switzerland) responded, together with the ICRC. Another novelty was the inclusion of a new element in the mandate, which stated that participating countries "may agree by consensus on recommendations for further work for consideration by the CCW's 2016 Fifth Review Conference". Lest you think otherwise, in UN-speak this was quite an important next step.

At the meeting, there was general agreement that lethal autonomous weapons did not yet exist, and the notion of "meaningful human control of weapons systems" was raised throughout the discussions. Other formulations were also proposed, including by the US, which noted its preference for "appropriate levels of human judgment".

The different perspectives of States came into clearer focus at this meeting: 14 States called for a preemptive ban on LAWS, while the US advocated increasing autonomy in weapon systems, citing perceived benefits in precision and the reduction of civilian casualties. This was echoed by Israel. China and Russia mostly reacted to the positions of other States without setting out positions of their own.

Yet the meeting did agree to recommend12 that the 2016 Fifth CCW Review Conference establish an open-ended Group of Governmental Experts (GGE), which should meet "for an appropriate time" starting in 2017, and outlined a series of questions to be addressed by the GGE, including a working definition of LAWS (which had already been the subject of extensive consideration at the expert meetings), IHL in the context of LAWS, ethical and moral questions, and effects on regional and global security, to name a few. The Review Conference did decide to establish a GGE13, starting its work in 2017, and doubled the allotted meeting time from one week a year to two.

A strong intervention from leading AI scientists

In August 2017, an open letter was addressed to the UN14 signed by 116 of the world's leading robotics and AI pioneers which rang an alarm bell:

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.

While the scientists "urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November", they also noted that "many of our researchers and engineers are eager to offer technical advice to your deliberations". As far as I know, this offer has not been taken up and remains on the table.

So now on to the Group of Governmental Experts

Eighteen months had elapsed after the 2016 meeting on LAWS before the GGE convened in November 2017. The Chairmanship passed from Germany to India, the Group of Non-Aligned States having made the argument that after two Western European chairs, it was their turn.

The meeting had been scheduled earlier, in April, but had to be cancelled because CCW States Parties had not paid in time the assessed contributions that cover conference and interpretation costs (the main culprit was apparently Brazil, which had not paid its dues since 2010). Instead of two meetings of one week each, only one meeting took place in 2017. This deferral was heavily criticized, particularly by civil society, which called 2017 "a lost year" for LAWS and international diplomacy.

Despite the rather robust mandate given to the GGE, the Indian Chair (Ambassador Amandeep Singh Gill) took a very conservative approach, stating that "we need to know the state of the art and develop a better understanding of the problem; we cannot come to the stage of examining solutions if we don't have the diagnosis right"15, so he wanted a further airing of views to find common ground, particularly in three areas: technical, military and legal. In keeping with past practice, he also issued a "food-for-thought" paper16 and announced that he would continue the review and "useful discussion of the past three years". His paper listed questions to be addressed but was essentially unambitious in its remit.

So what was achieved? The November GGE was preceded by the deliberations in the First Committee of the General Assembly in New York in October, where 34 States raised the issue of LAWS in their statements and expressed support for the work of the GGE.

The GGE followed the previously-established format of panels of experts, followed by interventions by States. The number of States calling for a ban had now grown to 21, while Israel, Russia, Turkey, the UK and the US reiterated their opposition to it. The US and UK indicated that it was too early for a ban, while Israel argued that the "futuristic nature" of LAWS and the differing opinions on them would demand a careful and incremental approach17.

Nine States submitted working papers to the meeting, a welcome development, as this set out their views and made them part of the official documentation on the issue. Civil society organizations are not allowed to make written submissions, so their papers are circulated informally. France and Germany submitted a proposal for the CCW to produce a non-binding political declaration and a code of conduct to be agreed upon in 2018. Their position—that prohibition is premature as these weapons do not yet exist—did, however, come in for criticism from a number of NAM countries: blinding laser weapons, outlawed under Protocol IV, were banned before they were fully developed, so the argument was seen as hollow.

Many States urged discussion of a working definition of lethal autonomous weapons systems, as well as of concepts such as meaningful—or adequate—human control. Many affirmed the need for concrete action, and most States were dissatisfied with the lack of ambition and urgency, as was evident from the Chair's report18. In its conclusions, the GGE stated that:

  • IHL continues to apply;
  • responsibility for the deployment of any weapons system in armed conflict remains with States;
  • States must ensure accountability for lethal action;
  • potential military applications of related technologies need to be kept under review;
  • the aspects of human-machine interaction need to be further assessed; and
  • the discussion (…) on possible options for addressing the humanitarian and international security challenges posed by emerging technologies needs to continue.

The GGE meeting in April 2018—the fifth on LAWS

Two meetings of one week each were scheduled for this year, the first in April, the second at the end of August. Both of them are chaired by Ambassador Amandeep Singh Gill of India. While the final report is not yet available at the time of writing, an informal Chair's summary19 gave an overview of the proceedings.

So what was different about this meeting? First, numbers matter. While the number of States participating had risen to 94 in 2016, the first GGE saw 84 States, and the 2018 GGE was attended by a total of 82 countries. There were quite a number of advocacy non-governmental organizations, but few AI specialists, academics and experts, and those present were not of the previous high level and standing. Since a GGE deals with governmental experts, a panel of five official participants was convened, which appeared to mirror a P-5 approach. Their views were extensively summarized in the Chair's paper, while only sparse comments from delegations on the positions taken were noted. In addition, papers were submitted in advance of the session by Poland, Russia, the US and the Non-Aligned Movement. Again, civil society was not able to submit written contributions.

In his summary, the Chair noted that the discussions were intensive and focused but did not achieve an agreed outcome as to the way forward. He underlined the need to deepen engagement further at the next meeting in August and suggested that informal consultations in the inter-sessional period would be useful, as would possible track 2 or track 1.5 side events on specific subjects or options. He also underlined that building blocks had already been identified in the agreed agenda points. As an example, he mentioned that the human element in the use of force, however worded, had emerged as a central consideration.

Not everyone agreed with this positive assessment.

Many States felt that rapid progress was possible, but that the window for credible preventative action in the CCW was closing. African countries as a group expressed their support for a ban on LAWS, while five States explicitly rejected moving to negotiate an international treaty. On the last day of the session, China for the first time expressed its desire to negotiate and conclude a CCW Protocol prohibiting fully autonomous lethal weapons systems, though it wanted the prohibition limited to their use only.

The Campaign to Stop Killer Robots added four more States (Austria, China, Colombia, Djibouti) to the list of those that have called for a ban, bringing the number to 26.

So what is next?

The next meeting of the GGE will take place at the end of August. States will have to agree on a report to be submitted to the Meeting of the High Contracting Parties to the Convention, which will be chaired by Latvia from 21 to 23 November 2018.

From consultations I have conducted, it is clear that not everyone supports the Chair, as he is seen as moving too cautiously, in a holding pattern rather than towards an agreed outcome. Would it be possible to expand the consultations and appoint friends of the Chair, as Germany had done? This would give those States that hold a different view an opportunity to influence the proceedings more directly, as some felt that their statements were not fully reflected in the summary.

What will be in the report? Concrete recommendations for a negotiating mandate? A recommendation to continue the work of the GGE for another year? A sharpened mandate for another GGE in 2019? This will be extremely important: the mandate recommendations from the 2016 informal meeting of experts to the CCW Contracting Parties were approved with only minor revisions, so the stronger the recommendation, the stronger the next mandate.

I think I can safely say that a negotiating mandate is very unlikely. The States opposing a negotiation will firmly oppose moving the item to the General Assembly, as the consensus rule in the CCW protects them from actions they do not agree with, while the majority decision-making in the Assembly does not. For this reason alone, they will reject it, as it would mean losing control of the issue. Memories of the negotiations for the Treaty on the Prohibition of Nuclear Weapons in 2017 are still very raw.

If another GGE is agreed, the choice of the next Chair will be crucial, as the difference in approach and activism between Germany and India has shown.

What else can be done to advance the discussion? The topic is so complex that a deeper understanding needs to be reached in order to have a meaningful impact. Many representatives of States do not have the legal and scientific background to weigh in on the discussions forcefully. Organizing meetings of an inter-disciplinary nature, with AI scientists and experts, academics, representatives of industry and the military, would contribute to the process, but what is needed is the willingness of the various parties to make the time and resources available: the States, the scientists, the experts, civil society—plus a requisite organizational unit and funding.

Funding might also be required to enable some States to attend the meetings on LAWS. Not all States have missions in Geneva, and many missions have few staff beyond the ambassador. Bringing diplomatic or technical representatives with more expertise from capitals would be an important contribution.

Equally important is the involvement of military experts. The military is one of the largest funders and adopters of AI technology, which is relatively cheap compared to other methods of warfare. A treaty on LAWS would help prevent large-scale manufacturing of the technology.

The offer by leading scientists to provide technical expertise and advice to the governmental experts would be a significant contribution and should be eagerly accepted. We have recently seen growing pressure by scientists on companies not to participate in developing and advancing military AI tools. Pension funds are divesting from investments that they consider not in keeping with their ethical guidelines.

The recent open letter to Google CEO Sundar Pichai, signed by 3,100 Google employees, protested against Google's collaboration with the US Department of Defense on an AI project, named "Maven", that studies imagery and could be used to improve drone strikes on the battlefield. "We believe that Google should not be in the business of war," the letter begins, before going on to explain that Google's involvement in Project Maven stands to damage its brand and the public's trust in it.

There are also parliamentary initiatives in capitals. Last month, for example, the House of Lords Select Committee on AI challenged the UK's futuristic definition of autonomous weapons systems as "clearly out of step" with those of the rest of the world and demanded that the UK's position be changed to align with them within a few months. Interest in other European parliaments is also high, as awareness of the issue has grown exponentially. It's the hot topic of the day.

The European Commission issued a communication20 in April 2018 with a blueprint for "Artificial Intelligence for Europe". While this does not specifically refer to LAWS, it calls for an appropriate ethical and legal framework based on the EU's values and in line with the Charter of Fundamental Rights of the Union.

In addition, what is needed is media and outreach work to better explain the issues to the general public. I have always regretted the terms "killer robots" and "killer machines". There is so much good and beneficial work that robots do and that AI is able to achieve that to tag them only with negative "Terminator" associations is extremely unfortunate. Maybe we should challenge PR companies to come up with a different moniker: think of pre-owned cars versus used cars, think of gently used clothing versus hand-me-downs.

How about simply "fully autonomous weapons"? Not as catchy but an accurate description—and maybe more creative minds can improve on that term. And finally, since there is objection to the unfortunate acronym LAWS (which lends a quasi-legitimacy to the AI-enabled machines), how about using the term "Fully Lethal Autonomous Weapon Systems"—which nicely shortens to FLAWS, a more apt description according to many advocacy groups.

Thank you.

NOTES

1. A/68/382 of 13 September 2013.
2. A/RES/67/168 of 20 December 2012.
3. CCW/MSP/2014/3 of 11 June 2014.
4. https://futureoflife.org/open-letter-autonomous-weapons/
5. https://www.weforum.org/agenda/2018/04/how-ai-could-increase-the-risk-of-nuclear-war
6. CCW/MSP/2015/WP.2 of 20 March 2015.
7. For detailed views expressed by individual States, the Report on Activities of the second informal meeting compiled by the Campaign to Stop Killer Robots is very useful; see https://stopkillerrobots.org
8. https://unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument
9. CCW/MSP/2015/3 of 2 June 2015.
10. National-level legal reviews of the acquisition or development of new weapons systems are required by Article 36 of Additional Protocol I (1977) to the 1949 Geneva Conventions.
11. https://www.unog.ch/80256EDD006B8954(httpAssets)/4423082AB7EF30E4C1257F7A00501844/$file/LAWSMX_FoodforThoughtFinal.pdf
12. CCW/CONF.V/2 of 10 June 2016.
13. CCW/CONF.V/10 of 23 December 2016.
14. https://futureoflife.org/autonomous-weapons-open-letter-2017/
15. From an interview conducted by SIPRI, https://youtu.be/OoenCZZKtdU
16. CCW/GGE.1/2017/WP.1.
17. https://stopkillerrobots.org/2017/11/gge/
18. CCW/GGE.1/2017/3 of 22 December 2017.
19. https://www.unog.ch/80256EDD006B8954/(httpAssets)/DF486EE2B556C86C125827A00488B9E/$file/Summary+of+the+discussions+during+GGE+on+LAS+April+2018.pdf
20. https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=51625
