
The rapid evolution of defense technology has significantly changed the nature of the Russia–Ukraine War since its outbreak in February 2022. The unprecedented Russian drone incursions into Polish airspace in September 2025—and the shock they elicited from NATO officials—are but one of many indications that the twenty-first century arms race centers on autonomous weapons, artificial intelligence, and unmanned aerial vehicles (UAVs). To understand the changing landscape of autonomous defense tech and its impact on the Ukrainian battlefield, one must be cognizant of autonomous weapons’ recent past, contentious present, and uncertain future. Ultimately, NATO nations that aim to support Ukraine must supply the country with the most advanced technology available in order to effectively combat Russian aggression and secure the rest of eastern Europe from drone attacks.
The development of autonomous weapons systems (AWS)—weaponry that can select targets without human intervention but typically requires human approval before engaging—began during the Cold War, when the U.S. and U.S.S.R. both invested heavily in automated targeting technologies and radar-guided air defense systems. These early systems and their successors, including loitering munitions, trigger-based landmines, and automated missile defense platforms, are still largely considered legal under international law today, since their operation retains some level of human control. The development and deployment of artificial intelligence, however, presents new challenges for the regulation of AWS. Lethal autonomous weapons systems (LAWS) in particular—AWS with offensive capabilities that can engage targets without human oversight—raise both legal and ethical questions: how can international organizations and individual states regulate the development of machines that can directly engage combatants without any human in the loop, and should defense tech companies be allowed to develop such machines in the first place?
Yet, regardless of one’s views on the moral and ethical dimensions of the LAWS issue, it is clear that automation is, for now, progressing. In 2023, images circulated online showing the measures Russian and Ukrainian troops have taken to defend their tanks against drone attacks: many tanks had “counter-UAV” batting-cage-style nets installed on their exterior. Indeed, drones in particular are the platforms most affected by advances in AI. As reported by the BBC, drones are soon expected to become “fully autonomous weapons that can find and destroy targets on their own,” with one Ukrainian developer stating that “all a soldier will need to do is press a button on a smartphone app” while the drone executes the rest of the strike by itself. Notably, because an autonomous drone requires no communications link to a human operator, there is no signal for an enemy to “jam”—making these drones far harder to stop and laying the groundwork for further total weapons automation.
Fully autonomous weapons, however, are not without significant risks. One weapons manufacturer indicated that an AI’s inability to distinguish a Russian uniform from a Ukrainian one in certain cases may expose troops to friendly fire. Beyond targeting the wrong group of soldiers, there are also clear concerns about LAWS targeting civilians or surrendering combatants—placing innocent lives at risk and violating international law. Problematically, international rules on autonomous weapons have, for the most part, yet to be written. The Convention on Certain Conventional Weapons (CCW), a United Nations treaty that entered into force in 1983, seeks to restrict or ban weapons deemed excessively injurious or indiscriminate. Under the CCW framework, a Group of Governmental Experts convenes annually to assess emergent AWS risks, yet no further binding agreement has been enacted globally due to opposing national interests (such as Russia’s opposition to a LAWS ban).
Opinions on the automation issue are divided within the Ukrainian government. In July 2024, Ukrainian Minister of Digital Transformation Mykhailo Fedorov advocated for “maximum automation” in weapons of war, since “these technologies are fundamental to our victory.” President Volodymyr Zelensky, however, has urgently called for regulations on autonomous weapons development, stating that humanity is in the throes of “the most destructive arms race” in history “because this time, it includes artificial intelligence.” A more optimistic report by the Kyiv Independent argues that full drone autonomy “remains well out of reach—and is likely to stay there”; yet with creative technological advances being made by independent Ukrainian engineers, automation does seem to be approaching at a rapid pace. While this advancement could be interpreted as a threat to human rights, there are also clear benefits to the Ukrainian military that come with AWS implementation.
Ukrainian troops are currently outnumbered by Russian soldiers, with at least 1.4 Russian combatants for every Ukrainian soldier. Ukraine’s drone power, however, has proven to be an effective tool for leveling the playing field. Research and development (R&D) and investment from foreign companies such as the German defense tech firm Helsing and the American drone manufacturer Neros are pivotal to supplying Ukrainian forces with the most advanced technologies. Meanwhile, the Russian Ministry of Defense is also investing heavily in AI-driven AWS, yet international sanctions have constrained the state’s development process and supply chain access, leaving Russia behind Ukraine in technical prowess. Concerningly, Chinese investment in the Russian defense tech sector is expected to help Russia bridge some, if not most, of this gap.
NATO member states owe it to Ukraine—perhaps not out of explicit legal obligation, but out of support and respect for a neighboring country serving as a bastion of democracy and free trade on one of the world’s most contested fronts—to continue investment in the AWS sector and help Ukraine end the war with Russia. Beyond helping their neighbor, NATO governments should also implement robust systems to protect European skies and infrastructure from Russian drone incursions. The willingness of private companies to help is already there; stronger governmental action is now needed to truly stop the Russian war machine. As the Center for European Policy Analysis argues, the limiting factor in creating a united European defense mechanism against Russian drones is “not technology but politics…. Each month of delay is an invitation to further incursions.” Even so, R&D into deterrence mechanisms alone will not win the war; NATO countries must continue to actively invest in the Ukrainian offense. Lorenz Meier, CEO of Switzerland-based drone manufacturer Auterion, has indicated that pure R&D does relatively little for a country actively at war; rather, governments should buy drone units directly from private companies and send them to the Ukrainian front to be tested in real time on the battlefield. This public-private partnership model would tangibly demonstrate NATO governments’ commitment to defeating Russia, encourage private-sector innovation, and provide Ukrainian forces with much-needed advanced weaponry.
The legal and moral aspects of autonomous weapons development remain up for debate, especially in academic circles. As an active member of UCLA’s Model United Nations team, I have witnessed many a conversation on AWS, with some students arguing that the solution is a blanket ban on all autonomous weapons. The reality of our world, however, is that autonomous weapons are in use every day, and belligerent states like Russia will not stop investing in more advanced defense tech. Ultimately, Ukraine and Europe as a whole are best served by increased innovation and funding in autonomous drone technology—regulation of the technology can wait until after the war, when Ukraine is no longer fighting for its own survival.
Image source: Wikimedia Commons
