R9X Hellfire missile among alarming new unregulated weapons
The recent assassination of al-Qaeda leader Ayman al-Zawahiri by a CIA drone strike was the latest US response to 9/11. Politically, it has amplified the existing mistrust between US leaders and the Taliban government in Afghanistan. The killing also exposed compromises in the 2020 Doha peace agreement between the United States and the Taliban.
But another story emerges with wider implications: the speed and nature of international arms development. Take the weapon allegedly used to kill al-Zawahiri: the Hellfire R9X “Ninja” missile.
The Hellfire missile was originally designed in the 1970s and 1980s to destroy Soviet tanks. Rapid improvements from the 1990s onward produced multiple variants with different capabilities. They can be launched from helicopters or Reaper drones, and their explosive charges can be detonated either on impact or just before impact.
The R9X Hellfire missile is not new but in the shadows
Then there is the Hellfire R9X “Ninja”. It is not new, although it has remained largely in the shadows for five years. It was reportedly used in 2017 in Syria to kill al-Qaeda deputy leader Abu Khayr al-Masri.
The Ninja missile does not rely on an explosive warhead to destroy or kill its target. Instead, it uses the speed, accuracy and kinetic energy of a 100-pound missile fired from altitudes of up to 20,000 feet, armed with six blades that deploy in the final moments before impact.
R9X Hellfire missile among emerging superweapons
The Ninja missile is the most refined attempt so far to accurately target and kill a single person: no explosion, no widespread destruction and no deaths among passers-by.
But other weapon developments will also affect the way we live and how wars are fought or deterred. Russia has invested heavily in these so-called superweapons, building on older technologies. They aim to reduce or eliminate the technological advantages enjoyed by the United States or NATO.
Russia’s hypersonic missile development goals are very ambitious. The Avangard missile, for example, will not need to fly out of Earth’s atmosphere. Instead, it will remain in the upper atmosphere, which will give it the ability to maneuver.
Such maneuverability will make detection or interception more difficult. China’s DF-17 hypersonic ballistic missile is also intended to evade US missile defenses.
The autonomous era
On a smaller scale, robot dogs with mounted machine guns are appearing on the arms market. Weapons development company Sword International took a Ghost Robotics quadruped unmanned ground vehicle – or robot dog – and mounted an assault rifle on it. It was one of three robot dogs on display at a US Army show.
Turkey, meanwhile, claims to have developed four types of autonomous drones, capable of identifying and killing people without the intervention of a human operator or GPS guidance. According to a UN report from March 2021, such an autonomous weapon system has already been used in Libya against a logistics convoy affiliated with forces led by Khalifa Haftar.
Autonomous weapons that do not need GPS guidance are particularly significant. In a future war between great powers, the satellites that provide GPS navigation could be expected to be shot down. Any military system or aircraft that relies on GPS signals for navigation or targeting would then be rendered ineffective.
China, Russia, India and the United States have all developed weapons to destroy the satellites that provide positioning for car navigation systems and guidance for civilian aircraft.
The real nightmare scenario is to combine these weapon systems, and many others, with artificial intelligence.
New rules of war
Do we need new laws or new treaties to limit these futuristic weapons? In short, yes, but they do not seem likely. The United States has called for a global agreement to stop anti-satellite missile testing, but the proposal has not been accepted.
The closest to a deal is the signing of NASA’s Artemis Accords. They are principles aimed at promoting the peaceful use of space exploration. But they only apply to “civilian space activities carried out by civil space agencies” of the signatory countries. In other words, the agreement does not extend to military space activities or terrestrial battlefields.
In contrast, the United States withdrew from the Intermediate-Range Nuclear Forces Treaty. This is part of a long-term pattern of withdrawal from global agreements by US administrations.
Lethal autonomous weapon systems are a special class of emerging weapon systems. They incorporate machine learning and other types of AI so they can make their own decisions and take action without direct human intervention. In 2014, the International Committee of the Red Cross (ICRC) brought together experts to identify the issues raised by autonomous weapon systems.
In 2020, the ICRC and the Stockholm International Peace Research Institute went further by bringing together international experts to identify the controls needed on autonomous weapon systems.
In 2022, discussions are ongoing between the countries that the UN first brought together in 2017. This group of governmental experts continues to debate the development and use of lethal autonomous weapons systems. However, there has still been no international agreement on a new law or treaty to limit their use.
New rules for autonomous weapon systems
The campaign group Stop Killer Robots has called throughout this period for an international ban on lethal autonomous weapons systems. Not only has that not happened, but there is also an undeclared stalemate in UN talks on autonomous weapons in Geneva.
Australia, Israel, Russia, South Korea and the United States have opposed a new treaty or political declaration. Opposing them at the same talks, 125 member states of the Non-Aligned Movement are calling for legally binding restrictions on lethal autonomous weapons systems. But because Russia, China, the United States, the United Kingdom and France all hold veto power in the UN Security Council, any of them can block such a binding law on autonomous weapons.
Outside of these international talks and activist organizations, independent experts are offering alternatives. For example, in 2019, Australian-based American ethicist Deane-Peter Baker brought together the Canberra Group of independent international experts. The group produced a report, Guiding Principles for the Development and Use of Lethal Autonomous Weapons Systems.
These principles do not resolve the political impasse between superpowers. But if autonomous weapons are here to stay, this is a first attempt to figure out what new rules will be needed.
When Pandora’s mythical box was opened, unspeakable horrors were unleashed upon the world. Emerging weapon systems are all too real. Like Pandora, we are left with only hope.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Professor of Applied Ethics and Director, Safety and Risk Research at University of Portsmouth