Kaja Kowalczewska: Challenges of Remote and Autonomous Warfare

  1. General overview of the event organization

The event was organized by Kaja Kowalczewska with the support of Zvonko Trzun, a co-researcher within the Professor Network (ProfNet). It was designed to share findings on the technical and legal aspects of military robots with students of law and military studies. The presentations by the ProfNet researchers drew on content previously delivered at the „Shielding Tomorrow: CSDP and Defense Developments in Central Europe” conference held in Budapest on March 7-8, 2024. These two researchers were joined by two additional scholars, specializing in legal and military studies, who addressed further topics related to the main theme.

  2. Details of the event
  • Date and time

The event was organized as an online International Scientific Symposium titled „Challenges of Remote and Autonomous Warfare.” It was conducted through Microsoft Teams on April 15th, 2024, from 15:30 to 17:30 CEST (local time in Zagreb).

  • Dissemination and registration

The event’s poster (Annex 1) contained the agenda and a registration link to a Microsoft Forms page (Annex 2). Participants were able to register for the event and receive the meeting link in advance of the April 15th date. The notice on the registration form included all necessary organizational details, as well as information on the processing and protection of personal data.

The event’s poster was distributed among researchers from the Research Group „Common Defence Policy: The Legal Framework for the Development of the European Defence Industry,” as well as students from the Croatian Defence Academy „Dr. Franjo Tuđman” and the Faculty of Law at the University of Zagreb.

In total, 21 individuals, including the speakers, registered for the event via the Microsoft Forms page (Annex 3).

  • Attendance and recording

The event proceeded as planned and was attended by 63 participants, as indicated by the Microsoft Teams attendance list (Annex 4). The event was recorded (Annex 5) and lasted for two hours.

  • Agenda

The agenda for the event included a keynote speech and several presentations by experts in the fields of remote and autonomous warfare, military studies, and legal challenges caused by new disruptive technologies.

The keynote speech, titled „Legal Challenges of Remote and Autonomous Warfare,” was delivered by Dr. Kaja Kowalczewska from the Digital Justice Center at the University of Wrocław in Poland. Dr. Kowalczewska explored the complex legal issues associated with the use of remote and autonomous weapons systems in modern warfare (a summary is provided below).

The keynote speech was followed by three presentations.

The first presentation, titled „Opportunities and Challenges of Remote Warfare – Technical Perspective,” was given by Colonel Assistant Professor Zvonko Trzun, PhD, Head of the Department of Military Engineering at the University of Defence and Security Dr. Franjo Tuđman in Zagreb. Professor Trzun provided an in-depth analysis of the technical aspects of remote warfare, highlighting both the opportunities and the potential challenges involved.

Assistant Professor Dijana Gracin, PhD, from Military Studies at the University of Zagreb, delivered the next presentation titled „New/Old Challenges of the Status of Child Soldiers – Child Soldiers as ‘Weapons’ and Victims (Legal Aspects).” In this session, Professor Gracin discussed the legal and ethical implications surrounding the use of child soldiers, both as combatants and as victims in conflict zones.

The final presentation, „Artificial Intelligence: CNN Image Denoise,” was given by Captain PhD Mario Šipoš from the Croatian Military Academy Dr. Franjo Tuđman in Zagreb. Captain Šipoš focused on the use of artificial intelligence, specifically convolutional neural networks (CNN), for image denoising, which has important applications in military technology and strategy.
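Captain Šipoš's material is not reproduced in this report. As a rough, hedged illustration of the convolution operation that underlies CNN denoisers, the sketch below applies a fixed 3×3 averaging kernel to a synthetically noised image; in a trained CNN the kernel weights would be learned from data rather than hand-picked, and the function name `convolve2d` is ours, not from his presentation.

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution of a grayscale image with a single kernel."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output pixel is a weighted sum of the patch under the kernel
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A flat grayscale image corrupted with Gaussian noise
rng = np.random.default_rng(0)
clean = np.full((32, 32), 0.5)
noisy = clean + rng.normal(0.0, 0.1, clean.shape)

# A 3x3 averaging kernel stands in for one learned CNN filter
kernel = np.full((3, 3), 1.0 / 9.0)
denoised = convolve2d(noisy, kernel)

# Averaging neighbouring pixels reduces the noise variance
print(noisy.std() > denoised.std())
```

The point of the toy example is only that local weighted averaging suppresses pixel-level noise; a denoising CNN stacks many such filters with nonlinearities and learns the weights by minimizing reconstruction error against clean images.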

The event concluded with a questions and discussion session, allowing participants to engage with the presenters, ask questions, and delve deeper into the topics covered during the symposium.

  • Summary of keynote presentation

Dr. Kaja Kowalczewska addressed legal challenges of military robots and autonomous weapons systems, emphasizing the complexities of developing ethical and responsible frameworks.

She began her presentation with the international debates on remote and autonomous warfare, referencing the 2012 Human Rights Watch report „Losing Humanity.” This report raised significant concerns about fully autonomous weapons, arguing that they inherently lack the human qualities necessary for legal and ethical checks on the killing of civilians. Human Rights Watch called for preemptive action, advocating a ban on fully autonomous weapons due to the substantial risks they pose to civilians in armed conflict. Dr. Kowalczewska also discussed the 2013 report by the Special Rapporteur on extrajudicial, summary, or arbitrary executions, Christof Heyns, which raised serious concerns about the implications of lethal autonomous robotics (LARs) for the protection of life during both war and peace. The report questioned the extent to which LARs could be programmed to comply with international humanitarian law and with the standards protecting life under international human rights law.

Dr. Kowalczewska reviewed discussions at the Convention on Certain Conventional Weapons (CCW) forum from 2014 to the present, focusing on several topics: definitions of Lethal Autonomous Weapon Systems (LAWS), the concept of meaningful human control (MHC), and the issue of accountability. These discussions led to the development of 11 guiding principles emphasizing that human control should be retained over autonomous weapon systems and that accountability for these systems’ actions cannot be transferred to machines. Another key point was the principle of human-machine interaction, which underlined the importance of maintaining clear distinctions between human and machine roles.

She compared the normative background with the AI Act, highlighting the European Union’s regulation of civilian AI applications based on risk levels. She emphasized the importance of human oversight for high-risk AI models, particularly in critical infrastructure, education, employment, healthcare, and banking.

Dr. Kowalczewska presented the major arguments advanced by states and organizations regarding the preemptive ban on LAWS proposed by the Campaign to Stop Killer Robots and supported by around 30 countries. This global coalition includes more than 160 international, regional, and national non-governmental organizations from 65 countries. The coalition aims to mitigate the risks associated with potential misuse of LAWS and to advocate for meaningful human control over weapon systems in order to prevent unintended harm and violations of human rights.

She discussed the political declaration proposed in 2017 by France and Germany, introducing a non-legally binding code of conduct for the use of LAWS. This declaration emphasized ensuring meaningful human control over LAWS and highlighted the importance of maintaining human responsibility in decision-making related to deploying and using such systems. Dr. Kowalczewska argued that soft law, like this code of conduct, is gaining traction, as seen with initiatives such as the Responsible AI in the Military Domain (REAIM) project and the United States’ declaration on responsible AI.

She concluded that a legally binding treaty regulating LAWS is highly unlikely given the current positions of several military powers. These states, including Australia, Belgium, Israel, Russia, South Korea, Spain, Sweden, Turkey, the United Kingdom, and the United States, hold the view that existing international law is sufficiently developed and can accommodate the modalities of LAWS.

Turning to the specific legal challenges of using military robots and autonomous weapons systems, Dr. Kowalczewska identified the first major obstacle to regulating LAWS as the lack of a precise definition. While there is broad agreement that remotely piloted platforms should be excluded from the definition, questions remain about autonomy in critical functions and about how to categorize autonomy on a spectrum from automation through semi-autonomous to fully autonomous systems.

This challenge is compounded by the complexity of determining when humans are in, on, or out of the loop in decision-making. Open questions that still require conclusive answers include whether existing defensive systems such as the Patriot missile defense system, Iron Dome, Phalanx, and other close-in weapon systems (CIWS) should be classified as LAWS, and whether regulation should focus on antipersonnel weapons or encompass all types of autonomous weapons.

Establishing clear standards for control and accountability is another major challenge in regulating LAWS. While there is widespread acknowledgment of the need for meaningful human control, there is no agreed definition of what this entails or how it should be implemented across the various stages of a weapon system’s life cycle. Questions remain about the extent and nature of human involvement: should human control be exercised over the attack phase, target acquisition, weapon system design, or throughout the system’s life cycle? Excessive human control could undermine the advantages of autonomous systems, while insufficient control raises ethical and legal concerns.

Accountability is another unresolved issue, with a lack of clarity on who should bear responsibility for the actions of autonomous systems, whether operators, commanders, software engineers, or other stakeholders involved in developing, deploying, and managing these systems. Uncertainty exists about standards applied for ensuring accountability, such as those related to explainable AI (XAI), operational constraints, and environmental considerations. Addressing these challenges is essential for establishing a comprehensive legal and ethical framework for LAWS use.

Dr. Kowalczewska discussed the challenges of ensuring compliance with international humanitarian law (IHL) when employing LAWS, particularly in relation to the law of targeting. These challenges stem from the key IHL principles of distinction, proportionality, and precautions, which are crucial for minimizing harm to civilians and non-combatants during armed conflicts.

Dr. Kowalczewska pointed out that meeting these principles presents significant challenges for autonomous systems, requiring a level of judgment and contextual understanding that current technology may not yet possess. Ensuring compliance with IHL in these areas is a complex and ongoing issue for future LAWS developments.

She presented case studies demonstrating the practical implications and challenges of employing these technologies in real-world scenarios. For example, she explored the Ukrainian practice of remote surrenders using drones and the Israel Defense Forces’ use of an AI-enabled decision support system called „Lavender.” While these technologies demonstrate potential benefits, they also pose challenges in complying with IHL principles.
