Abstract
The jurisprudence of the International Court of Justice is instrumental to one’s understanding of the body of International Humanitarian Law in the contemporary context. This holds true despite the Court’s relative silence on recognised principles of International Humanitarian Law and other pertinent matters until its landmark 1996 advisory opinion on the Legality of the Threat or Use of Nuclear Weapons. The Court’s significant finding—that the principles of International Humanitarian Law are applicable to the use of nuclear weapons—resonates with the intention of this work, which will explore the impact of autonomous weapons on these very principles and argue that, in the future, the Court’s attention and efforts will be needed to rule on the legality of the use of such weapons.
The rise in the use of autonomous weapons in international and non-international armed conflicts is thoroughly evidenced by records of the use of force over the last decade. These weapons have become an integral part of the military-industrial complex of the states employing them, and international efforts to reach a consensus on their regulation have already begun. The very nature of autonomous weapons, which confers the capacity for autonomous decision-making on weapon systems by removing the need for human intervention and/or supervision, has given rise to the much-debated ‘responsibility gap.’ The increasing frequency of the use of these weapons, strengthened by software capable of collecting and orienting data and feeding it to features such as automated target recognition, raises legitimate concerns about the ability of autonomous weapons to comply with the aforementioned principles of International Humanitarian Law. This concern is further compounded by the lack of international regulation of these weapons and the challenges in establishing individual criminal responsibility given their autonomy in the use of force. The ability of these weapons to adhere to the conditions laid down in Additional Protocol I has not yet been conclusively established.
In this context, it is proposed that the International Court of Justice must look to the future by addressing this emerging technology as a tool of warfare. The Court could exercise its jurisdiction either through contentious cases or by rendering an advisory opinion, offering its views on the place of autonomous weapons in the framework of International Humanitarian Law and on whether their prohibition should be contemplated, particularly in view of the very nature of such weapons, which will exclude human decision-making from their operation. This is an opportunity for the Court to act pre-emptively: these weapons are yet to become a total replacement for human combatants, and such anticipatory action is necessary before that point is reached.
| Original language | English |
|---|---|
| Article number | 8 |
| Pages (from-to) | 1-21 |
| Number of pages | 21 |
| Journal | Granite Journal: The University of Aberdeen Postgraduate Interdisciplinary Journal |
| Volume | 10 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 30 Sept 2025 |
Keywords
- autonomous weapons
- international humanitarian law
- International Court of Justice