
The Accountability Gap in Lethal Autonomous Weapon Systems under International Humanitarian Law

Authored By: Natasha Abigail Dhliwayo

University of Zimbabwe

Abstract 

Artificial intelligence (AI) is increasingly reshaping the conduct of contemporary armed conflict, most notably through the deployment of lethal autonomous weapon systems (LAWS). While these technologies promise operational efficiency and enhanced precision, they pose profound challenges to the existing accountability frameworks under International Humanitarian Law (IHL). Where LAWS independently select and engage targets without human intervention, attributing responsibility for unlawful harm becomes legally complex, giving rise to what scholars have described as an “accountability gap”. This article examines whether the existing doctrines of state responsibility, individual criminal liability, and command responsibility are capable of addressing violations of IHL arising from the use of LAWS, and considers emerging solutions to close this gap. The article argues that while the current IHL framework formally applies to LAWS, its effectiveness is undermined by diffused human control, algorithmic opacity, and institutional outsourcing, necessitating normative and institutional reform to ensure meaningful accountability.

INTRODUCTION 

From bows and arrows to precision-guided munitions, technological advancements have consistently reshaped the conduct of warfare. AI presents the most disruptive shift to date, fundamentally altering how military decisions are made and executed.1 In the twenty-first century, battlefields are increasingly influenced by AI-enabled systems, particularly those commonly referred to as Lethal Autonomous Weapon Systems (LAWS), which are capable of selecting and engaging targets without direct human intervention.2

IHL has for centuries been premised on human agency. Its fundamental principles of distinction, proportionality, precaution and military necessity presuppose that decisions regarding the use of force are made by human actors capable of judgment, intention, and legal reasoning. The deployment of autonomous systems challenges this assumption. Responsibility may be diffused across programmers, manufacturers, commanders, operators, and the deploying state, creating a gap between harm and accountability.

This article examines whether existing frameworks of responsibility under IHL are capable of attributing liability for violations arising from the deployment of lethal autonomous weapon systems.

Definition of AI

Defining AI is no easy task, as scholars have proffered different definitions, such as a “computational artefact built through human intervention that thinks or acts like a human, or how we expect humans to think or act.”3 For the purposes of this paper, the definition advanced in the International Committee of the Red Cross (ICRC) position paper will be adopted, which defines AI as “the use of computer systems to carry out tasks often associated with human intelligence that require cognition, planning, reasoning or learning.”4

Lethal Autonomous Weapon Systems (LAWS) 

There is no universally accepted definition of LAWS. One definition describes them as “fully autonomous weapon systems that are capable of selecting and engaging a target in combat and support missions without any intervention5 of the human operator.”6 Another defines them as “any weapon system with autonomy in its critical functions—that is, a weapon system that can select (detect, identify, track or select) and attack targets without human intervention”.7

Overview of International Humanitarian Law 

IHL governs the conduct of hostilities in both international and non-international armed conflicts. It comprises treaty law, including the four Geneva Conventions and their Additional Protocols, as well as customary international law. IHL is founded on a set of core principles,8 which serve as interpretative guiding tools for the use of this technology:9

The Principle of Distinction 

It is jus cogens10 and one of the cardinal principles of IHL,11 enunciated in Article 48 of AP I: “In order to ensure respect for and protection of the civilian population and civilian objects, the parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives.”12

The Principle of Precaution 

The principle is reflected in Article 51(4) of AP I, which prohibits “indiscriminate attacks”: attacks not directed at a specific military objective, or employing a means or method of combat which cannot be so limited, and which are therefore “of a nature to strike military objectives and civilians or civilian objects without distinction”.13 It further requires parties to cancel or suspend an attack if it becomes apparent that the target is not a military objective or is subject to special protection.14

The Principle of Military Necessity 

Outlined in the 1907 Hague Regulations, it requires that the parties to the conflict adopt only those measures necessary to weaken the enemy and achieve their surrender; it is not necessary to bring about the total destruction of an enemy, its armed forces or its property.15

The Principle of Proportionality  

It is outlined in Article 51(5)(b) of AP I, which provides that a violation of this principle occurs through an “attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated”.16

The Principle of Humanity 

It underlies every other principle of IHL. It may be defined as the prohibition of inflicting “all suffering, injury or destruction not necessary for achieving the legitimate purpose of a conflict”, aiming to protect combatants from unnecessary suffering.17 This balancing function is best described by the Martens Clause, which attempts to fill a possible gap in the law by providing that, in an armed conflict, in the absence of a specific legal norm, belligerents remain bound by the laws of humanity and the requirements of public conscience.18

Attribution of Responsibility under IHL 

State Responsibility 

Under general international law, a state bears responsibility for internationally wrongful acts attributable to it. However, if an act is lawful (and collateral damage and accidental harm are often lawful), it is not internationally wrongful, and the law of state responsibility is not implicated.19

Command Responsibility 

Outlined in Article 87 of Additional Protocol I and also known as superior responsibility, it operates to hold commanders accountable for the acts of their subordinates.20 The commander’s military accountability pervades the battlefield.21

Individual Criminal Responsibility 

International criminal law also assigns individual criminal responsibility to individuals who commit war crimes with the requisite mental element. This model presupposes that perpetrators possess intent or knowledge in relation to the prohibited conduct.

Prospects and Risks of LAWS in Warfare  

The integration of AI systems and LAWS in warfare presents both significant opportunities and considerable challenges. These systems can markedly improve operational efficiency in military contexts: military use of AI is broad, extending well beyond target engagement22 to include logistics, intelligence and surveillance.23 Some of these systems combine powerful remote sensors, which can generate broad landscape overviews and long-range images, with the maneuverability to capture on-the-ground footage. These advancements may lead to more informed decision-making, potentially reducing the likelihood of errors that could result in civilian casualties,24 and improving awareness of the wider situational context.25

However, a major concern is the unpredictability of LAWS on the battlefield. While the principle of distinction between civilians and combatants sounds clear and easy to follow, it is far harder to apply in reality. The weighing of military advantage against the excessiveness of force, when performed by LAWS, is ethically fraught: these AI systems26 may not be able to make such assessments with the required level of understanding and human judgment. Such miscalculations can result in disproportionate harm to non-combatants, contravening fundamental tenets of IHL.27

The Accountability Gap 

This gap refers to situations in which harm occurs during armed conflict, yet no individual or entity can clearly be held legally responsible under existing legal doctrines. For autonomous weapon systems, the control exercised by humans can take various forms and degrees at different stages: development, deployment, the decision by the commander or operator to activate the weapon system, and the operation of the system itself, during which it independently selects and attacks targets.28 Given how complex the system of LAWS is, it is not clear who exactly should be held accountable.

A key contributor to this gap is the opacity of AI systems, often described as the “double black box” problem encountered in machine learning. Machine learning systems generate outputs through processes that are not readily explainable,29 which inherently makes it harder to determine whether an alleged violation occurred due to a malfunction, a glitch in the algorithm, or a deliberate command by the combatant.

Moreover, many functions previously carried out by military personnel are now being outsourced to private contractors and non-government personnel, sometimes located in different places. This further widens the accountability gap as to who should be held accountable when IHL30 violations occur, particularly as LAWS may behave erratically, misidentify targets, or respond in unanticipated ways due to adversarial inputs or latent biases in training data.

However, regardless of the autonomous nature of these systems, IHL rules of responsibility still apply to humans even when the systems operate “independently”. Scholars, particularly McFarland, acknowledge that LAWS might raise questions regarding their potential classification as combatants owing to their human-like abilities; he argues, however, that what is often termed independent decision-making in LAWS operations is in fact programmed and embedded by humans.31

This gap in the discourse arises from the tendency to assess designation as a combatant based on the human-likeness of AI in terms of practical and behavioral characteristics, rather than focusing32 on the root cause of these systems’ conduct. The responsibility for ensuring that IHL33 is adhered to rests first and foremost with each State that develops, deploys and uses such weapons.34 Arguments that AI should have separate legal personality, or be classified as a combatant due to its autonomous nature, are therefore flawed.35

Suggested New Accountability Mechanisms

Much has been said about the dangers and opportunities of LAWS, but very little ink has been spilled on securing binding obligations from states. In proposing new accountability mechanisms, Crootof has argued for the creation of a “war torts” regime, which would require states to pay compensation for both lawful and unlawful acts in armed conflict that cause civilian harm when they employ autonomous weapon systems.

This is echoed by Kurki and Solum, who propose limited frameworks of financial accountability for AI through mechanisms such as insurance or compensation, although such measures remain speculative within IHL.36 These considerations raise important questions regarding the legal framework that would be necessary to ensure that AI systems, combatants and other entities responsible for violations are appropriately held accountable in a manner that aligns with the principles of justice and fairness. Speculative as they are, these proposals highlight the need for innovative accountability mechanisms that complement existing IHL doctrines.

Conclusion  

The rapid integration of LAWS presents one of the most significant challenges faced by IHL in the modern era. The existing frameworks of state responsibility, individual criminal liability, and command responsibility formally apply to the use of such systems, yet each has its flaws. This article has demonstrated that they are substantively strained by autonomous decision-making, opacity, and the diffusion of human control, which makes it harder to pinpoint who is accountable when violations occur. The result is an emerging accountability gap that risks undermining the effective enforcement of IHL and the protection of vulnerable groups during armed conflict.

In light of these issues, it is a matter of urgency that legal scholars, policymakers, military practitioners, States and the wider international community engage in collaborative dialogue to address the lacunae in IHL before it is too late. The end goal is not to ban AI systems, but to set operational parameters so that the integration of AI systems and LAWS into military operations aligns with the fundamental tenets of international law, IHL and human rights law.


1 Hasan Suzen, “How Disruptive Technologies Affect Deterrence, Defence and Security” (Sep 29, 2020) & Katerina Yordanova, Artificial Intelligence and Armed Conflicts (NA ed, 2025) 411–428 <https://behorizon.org/how-disruptive-technologies-affect-deterrence-defence-and-security/> accessed on 13 January 2026

2 Benjamin Perrin “Lethal Autonomous Weapons Systems & International Law: Growing Momentum Towards a  New International Treaty” Volume 29: issue 1 Jan 2025 

3 McCarthy, Minsky, Rochester & Shannon, “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence” (link) accessed on 13 January 2026

4 ICRC, “Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centred Approach”, 102 Int’l Rev. Red Cross 463 (2021) (link) accessed on 13 January 2026

5 Benjamin Perrin, supra note 2, at 1

6 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons (CCW) (August 2022) <https://docs.un.org/en/CCW/GGE.1/2022/WP.9> accessed 13 January 2026

7 ICRC, Views of the ICRC on autonomous weapon systems, paper submitted to the CCW Meeting of Experts on LAWS (2016) <https://www.icrc.org/en/document/views-icrc-autonomous-weapon-system> accessed 13 January 2026

8 Ali Zeineddine, “AI in Warfare: Legal and Humanitarian Challenges Under IHL” <https://www.researchgate.net/publication/390103846_Law_Artificial_Intelligence_in_Warfare_Legal_and_Humanitarian_Challenges_Under_International_Humanitarian_Law> accessed on 13 January 2026

9 Katerina Yordanova, “Artificial Intelligence and Armed Conflicts”, Cambridge University Press <https://www.cambridge.org/core> accessed 13 January 2026

10 Ibid

11 Astrid Delissen & Gerard Tanja, Humanitarian Law of Armed Conflict: Challenges Ahead (1991) para 17

12 Additional Protocol I, Article 48

13 Additional Protocol I, Article 51(4)

14 Neil Davison, “A legal perspective: Autonomous weapon systems under IHL”, ICRC <https://www.icrc.org/sites/default/files/document/file_list/autonomous_weapon_systems_under_international_humanitarian_law.pdf> accessed on 13 January 2026

15 Burrus Carnahan, “Lincoln, Lieber and the Laws of War: The Origins and Limits of the Principle of Military Necessity” [1998] 92 AJIL 213

16 Supra Davison, note 14

17 ICRC, “What is IHL?” (September 2015) <www.icrc.org/en/document/what-ihl> accessed 13 January 2026

18 Supra Yordanova, note 9

19 Rebecca Crootof, “AI and the Actual IHL Accountability Gap” (November 28, 2022) <https://ssrn.com/abstract=4289005> accessed on 13 January 2026

20 Emily Crawford & Alison Pert, International Humanitarian Law (2nd Edition, Cambridge University Press 2020)

21 James Kraska, “Command Accountability for AI Weapon Systems in the Law of Armed Conflict” <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=2958&context=ils> accessed 13 January 2026

22 Sarah Grand-Clément, “Artificial Intelligence Beyond Weapons: Application and Impact of AI in the Military Domain” (2023) <https://unidir.org/wp-content/uploads/2023/10/UNIDIR_AI_Beyond_Weapons_Application_Impact_AI_in_the_Military_Domain.pdf> accessed on 13 January 2026

23 Supra Yordanova, note 9

24 Gubrud, “Military Use of AI: Surveillance and Ethical Dilemmas”, International Review of the Red Cross (2019) <link> accessed 13 January 2026

25 Winthrop Wells, “Battlefield Evidence in the Age of AI-Enabled Warfare”, Chicago Journal of International Law <https://cjil.uchicago.edu/print-archive/battlefield-evidence-age-artificial-intelligence-enabled-warfare>

26 ICRC, “Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons” (2019) <https://www.icrc.org/en/publication/4283-autonomous-weapons-systems> accessed on 13 January 2026

27 R. Crootof, “The Killer Robots Are Here: Legal and Policy Implications”, Cardozo Law Review (2016) <link> accessed 13 January 2026

28 Supra Davison, note 14

29 Arthur H Michel, “In the Debate over Autonomous Weapons, It’s Time to Unlock the ‘Black Box’ of AI” (October 2020) <https://thebulletin.org/2020/10/ban-regulate-or-do-nothing-the-debate-over-ai-weapons-and-one-path-forward/> accessed on 13 January 2026

30 Supra Ali Zeineddine 

31 Tim McFarland, “Autonomous Weapon Systems and the Law of Armed Conflict: Compatibility with International Humanitarian Law” (2020)

32 Mustafa Can Sati, “The Attributability of Combatant Status to Military AI Technologies under International Humanitarian Law” (2023) Global Society 38: 122–38 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4607136> accessed on 13 January 2026

33 Elizabeth Jalo-Okotie, “AI in Armed Conflict: The Role of Artificial Intelligence in Armed Conflict” (Vol 2, 2024) <https://oer.tsuniversity.edu.ng/index.php/tsulj/article/view/1642/1352> accessed 13 January 2026

34 Supra Davison, note 14

35 Supra R. Crootof, note 26

36 Visa Kurki, Legal Personhood: Elements in Philosophy of Law (Cambridge University Press 2023) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4669283> accessed on 13 January 2026
