Position Statement on Turkish Drone Strikes in Iraq and the Debate over Lethal Autonomous Weapons
Updated: Dec 30, 2020
Sarmad Ahmad,
Senior Editor,
The Indian Learning.
On the 11th of August 2020, the Iraqi Army stated that two senior commanders of the Iraqi Border Guard were killed in a Turkish drone strike in northern Iraq. The region has been subjected to repeated Turkish raids on the positions of PKK fighters since mid-June.
These events are part of a decades-long conflict between the Kurdish PKK and Turkey, which resumed after a two-year ceasefire ended when peace talks collapsed in 2015.
Following these events, an update on the 14th of August 2020 reported that three Kurdish fighters had been killed. The three, along with a fourth who fled the scene, were targeted by another Turkish drone strike after they stopped their vehicle outside a grocery store near the site of the previous attack.
While the reasons for these cross-border attacks are specific to the geopolitical situation at hand, the Turkish forces' use of drone strikes is only one of many international instances in which AI demonstrates its utility in military applications. Autonomous weapons systems and their potential applications in international conflict are among the many incentives for states to fund AI research and development.
While ethical opinion on the topic remains divided, the increasing use of contemporary technology in warfare is observable across various states, from the Indian military revisiting its strategies to incorporate contemporary technologies into its training, to the Israeli Defence Forces "game-ifying" their new tanks by utilising game algorithms and gaming console controllers to increase operational efficiency.
International coalitions such as the Campaign to Stop Killer Robots, co-founded by Human Rights Watch, have for years publicly warned of the potential consequences of the increasing use of autonomous weapons across the spectrum of conflict, whether of an international or non-international character. With a growing consensus that the regulation of autonomous weapons ought to be introduced into codified law, the position of International Humanitarian Law has to be observed and assessed in this context.
International law scholars themselves remain divided on the legal question. One side emphasises that autonomous weapons already fall within the ambit of Article 36 of the First Additional Protocol to the Geneva Conventions, arguing that a state's obligation to conduct a legal review of new weapons is flexible enough to cover them. The other maintains that smart weapons were not contemplated when the provision was codified and therefore require codification of their own by means of a new convention, as has been the case for many contemporary weapons over the years, such as under the Biological Weapons Convention and the Chemical Weapons Convention.
Public discussion and continued research and development nonetheless heighten the urgency of this issue, which is arguably one of the foremost ethical questions in the realm of AI research directed towards state use.
For further reading on these developments, please refer to:
https://ahvalnews.com/northern-iraq/iraq-has-several-options-respond-turkish-drone-strike-military-spox
https://timesofindia.indiatimes.com/india/amid-lac-face-off-army-to-study-lasers-robotics-ai-for-warfare/articleshow/77425169.cms
https://reliefweb.int/report/world/legal-regulation-ai-weapons-under-international-humanitarian-law-chinese-perspective
https://www.icrc.org/en/document/autonomous-weapon-systems-under-international-humanitarian-law
The Indian Learning, e-ISSN: 2582-5631, Volume 1, Issue 2, January 31, 2021.