The 2022 Meeting of the High Contracting Parties to the Convention on Certain Conventional Weapons (CCW) decided that the Group of Governmental Experts of the High Contracting Parties related to emerging technologies in the area of lethal autonomous weapons systems (LAWS) would meet "for a duration of 10 days, from 6 to 10 March 2023, and from 15 to 19 May 2023" (CCW/MSP/2022/7, paragraph 37 (b)). The Meeting also adopted the relevant cost estimates for 2023 as contained in CCW/MSP/2022/5.

The 2022 Meeting of the High Contracting Parties to the CCW also decided "That the work of the open-ended Group of Governmental Experts related to emerging technologies in the area of lethal autonomous weapon systems established by Decision 1 of the Fifth Review Conference as contained in document CCW/CONF.V/10, adhering to the agreed recommendations contained in document CCW/CONF.V/2, is to continue, to strengthen the Convention. In the context of the objectives and purpose of the Convention, the Group is to intensify the consideration of proposals and elaborate, by consensus, possible measures, including taking into account the example of existing protocols within the Convention, and other options related to the normative and operational framework on emerging technologies in the area of lethal autonomous weapon systems, building upon the recommendations and conclusions of the Group of Governmental Experts related to emerging technologies in the area of lethal autonomous weapon systems, and bringing in expertise on legal, military, and technological aspects" (CCW/MSP/2022/7, paragraph 32).

The two sessions of the 2023 Group of Governmental Experts on Lethal Autonomous Weapons Systems, chaired by Ambassador Flávio Soares Damico of Brazil, will both take place at the Palais des Nations in Geneva, on 6-10 March 2023 and 15-19 May 2023 respectively.

On 9 February 2023, the CCW Implementation Support Unit circulated an aide-mémoire to High Contracting Parties regarding the first 2023 session of the GGE on LAWS.

On 17 April 2023, the CCW Implementation Support Unit circulated an aide-mémoire to High Contracting Parties regarding the second 2023 session of the GGE on LAWS.


CCW/GGE.1/2023/1 - Provisional Agenda

Indicative timetable for the first session of the Group

Indicative timetable for the second session of the Group

CCW/GGE.1/2023/CRP.1 - Non-exhaustive compilation of definitions and characterizations

CCW/GGE.1/2023/2 - Report of the 2023 session of the Group of Governmental Experts on Emerging Technologies in the Area of LAWS

CCW/GGE.1/2023/INF.1 - List of participants

Working papers

States wishing to submit a working paper must send the document to ccw@un.org in Word format. States are kindly asked to submit working papers at least one week before the start of each session, to allow enough time to format and circulate the document.

CCW/GGE.1/2023/WP.1/Rev.1 - Working paper submitted by Austria

CCW/GGE.1/2023/WP.2/Rev.1 - Working paper submitted by the State of Palestine

CCW/GGE.1/2023/WP.3/Rev.1 - Working paper submitted by Pakistan

CCW/GGE.1/2023/WP.4/Rev.2 - Working paper submitted by Australia, Canada, Japan, Poland, the Republic of Korea, the United Kingdom, and the United States

CCW/GGE.1/2023/WP.5 - Working paper submitted by the Russian Federation

CCW/GGE.1/2023/WP.6 - Working paper submitted by Argentina, Colombia, Costa Rica, Ecuador, El Salvador, Guatemala, Kazakhstan, Nigeria, Palestine, Panama, Peru, Philippines, Sierra Leone and Uruguay

Side Events



Emerging technologies in the area of LAWS: novel approaches to "meaningful human control"
The notion of “human control” with regard to weapons systems is not new. It has been referred to by States, civil society, academic institutions, and international organizations alike. The term “meaningful human control” was first introduced into the LAWS discussions in the framework of the CCW in 2013, and it has since gained traction as a framing concept for discussions on autonomy in weapons systems in this Group.

This webinar unpacks the concept of “Meaningful Human Control” by exploring its meaning, operationalization, testing, and real-life applications. It seeks to gather the views of professionals conducting recent academic and applied research on meaningful human control and working on concrete products and services, with the aim of deepening participants' understanding of a term that has become central to the discussions of the GGE on LAWS.

This webinar is organized by UNODA with the financial support of the EU.

Link to register: https://ungeneva.webex.com/weblink/register/rfbbd0244e11ce9470542a47f42acd63f

13:15-14:45 CET


Applying International Humanitarian Law to Autonomous Weapons Systems

The Oxford Institute for Ethics, Law and Armed Conflict will host a panel discussion on the application of international humanitarian law (IHL) to autonomous weapons systems. In line with calls for further clarification and specification of IHL, the panel will explore the methodologies for identifying and interpreting existing law, the elements of a number of key positive and negative obligations relevant to the development and use of autonomous weapons systems, and the potential benefits and challenges of adopting new legal rules.

The wealth of national positions, working papers and statements on the application of IHL to autonomous weapons systems is a testament to the importance that states attribute to the clarification of IHL. Key principles of IHL, such as the principle of distinction, have been consistently reaffirmed in the discussions of the GGE. While it is common ground that IHL applies to autonomous weapons systems, how precisely it applies is a complex inquiry that requires further clarification.

Two factors complicate the clarification exercise. First, the elements of many obligations under IHL continue to be contested in general, that is, across contexts of application. For instance, different positions have been advanced on the type of subjective element that is built into the prohibition of making civilians the object of attack. Second, even if the elements of rules are clear, the application of these rules to autonomous weapons systems may need specification. Thus, the inclusion of autonomous functions in weapons systems may require particular precautions tailored to the risks of such functions. This session will examine existing legal uncertainties over the content of IHL through an examination of specific obligations.
The purpose of this panel is to explore the methodology of making claims about the content of IHL, discuss the areas of convergence on the elements of rules and those that call for further inquiry, and reflect on possible next steps.

A moderated discussion on these questions among the panelists will be followed by a wider discussion with the audience.
13:15-14:45 CET, Building H - Rooms 207-208-209
May 2023
Perspectives on Unpredictability in Autonomous Weapons Technology

Side event organized by Austria, Ireland, Mexico, New Zealand, Switzerland and WILPF UK.

This side event will explore issues related to aspects of ‘unpredictability’ as they relate to Autonomous Weapon Systems, particularly those which incorporate forms of artificial intelligence. The side event will examine the issue from a variety of perspectives and provide a space for consideration of particular ways in which Autonomous Weapons may function. It will continue the ongoing engagement of research and expert activity within the CCW on LAWS and further facilitate interactions between civil society specialists, international organisations and policymakers.

Flyer of the side event.
13:15-14:45 CET, Room XXVII
Side event organized by Stop Killer Robots on automated decision research

Automated Decision Research is the monitoring and research team of Stop Killer Robots, tracking state support for a legally binding instrument on autonomous weapons systems and conducting research and analysis on responses to autonomy and automated decision-making in warfare and wider society. This side event will introduce the work of Automated Decision Research, including the ADR state positions monitor, weekly news briefings, and recent publications in the area of autonomous weapons, and AI and automated decision-making.
13:15-14:45 CET, Room XXVII
Side event on autonomy and topic 6 - risk mitigation and confidence measures

The European Union, the Philippines and the Centre for the Study of Existential Risk at the University of Cambridge will host a discussion under Topic 6 of the indicative timetable of the CCW GGE on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, on the potential risks inherent in the use of autonomy in weapons systems, and the need for mitigation and confidence building. In particular, the panel will focus on factors that could lead to unintended technological and geopolitical consequences and escalation in the absence of suitable frameworks.

In the context of growing debate over the development and regulation of artificial intelligence and other forms of autonomy, parallel discussions have addressed both the immediate fallout and the wider political potential for conflict and retaliation.

While the subjects of unpredictability, targeting, meaningful human control and the application of international law, including international humanitarian law, are fundamental, two further threats deserve acknowledgement: in the political realm, asymmetric power responses, and in the technological realm, multi-agent triggering.

The risk of unintended consequences arising from localised false positives, exponential error, swarms or any perceived hostility is great, and the response to any of these must be measured and contained. Further, the asymmetry of power and capacity to develop these technologies must be recognised, given the potential for states to fall back on existing conventional weapons (including possibly unlawful methods), as well as on cyber-attacks against national critical infrastructure, as countermeasures. Failure to address these issues has direct repercussions for trust and confidence building.

The purpose of this panel is to explore these less-considered potential risks and how they might be mitigated. After the panellists' presentations, the floor will be open for questions and comments.
13:15-14:45 CET, Building H - Rooms 207-208-209