Ukrainian developers have confirmed that their drones are now carrying out autonomous strikes on Russian forces without a human operator. This is the first time such drones are known to have been used, as UN allegations about autonomous attacks in Libya in 2020 remain unproven.
The Saker Scout drones can find, identify and attack 64 different types of Russian ‘military objects’ on their own, operating in areas where radio jamming blocks communication and prevents other drones from working.
The Saker Scout quadcopter came into service last month and can carry three kilos of bombs to a range of around 12 kilometres. Small remote-controlled drones have proved extremely effective as bombers, dropping modified RKG-3 anti-tank grenades or RPG warheads that can destroy even heavy tanks.
AI In The Service Of Ukraine
The Saker company was founded in 2021 to develop affordable AI for small business, with applications such as drone-based vision systems for crop protection. When Russia invaded, the company switched to assisting the military. One of the first requirements was an AI to help a drone operator spot vehicles concealed by vegetation or camouflage.
Saker’s system is based on machine learning and the developers say it can currently recognize 64 different types of ‘military object’ including tanks, personnel carriers and other hardware. The system is continuously improved and is updated on demand when a specific new object or vehicle type needs to be detected. An early Saker video shows the system identifying light and heavy armor (personnel carriers and tanks) and trucks as it flies over them.
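Saker has not published its model or software, but the kind of onboard recognition pass described above can be sketched with an off-the-shelf object detector. The sketch below uses the open-source Ultralytics YOLO interface purely as a stand-in; the weights file `military_objects.pt`, the video file name and the class labels are assumptions for illustration, not Saker’s actual system.

```python
# Illustrative sketch only: the detector weights and class list are hypothetical.
from ultralytics import YOLO
import cv2

model = YOLO("military_objects.pt")  # hypothetical weights trained on ~64 classes

def classify_frame(frame):
    """Run one detection pass over a video frame and return labelled boxes."""
    results = model(frame, verbose=False)[0]
    detections = []
    for box in results.boxes:
        cls_id = int(box.cls[0])
        detections.append({
            "label": results.names[cls_id],      # e.g. "tank", "APC", "truck"
            "confidence": float(box.conf[0]),
            "xyxy": box.xyxy[0].tolist(),        # bounding box in pixels
        })
    return detections

cap = cv2.VideoCapture("flight_footage.mp4")     # placeholder video source
ok, frame = cap.read()
if ok:
    for det in classify_frame(frame):
        print(det["label"], round(det["confidence"], 2))
```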
Saker’s AI software is also capable of visual navigation using known landmarks on the ground, so a drone can find its way even if GPS is jammed.
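Landmark-based navigation of this kind is a well-established computer vision technique: the live camera view is matched against a georeferenced reference image of the area. The sketch below shows a generic version using OpenCV feature matching; the file names and the final pixel-to-coordinate step are assumptions, and this is not Saker’s implementation.

```python
# Minimal sketch of GPS-denied navigation by matching the camera view
# against a georeferenced reference image (generic approach, not Saker's).
import cv2
import numpy as np

reference = cv2.imread("reference_map.png", cv2.IMREAD_GRAYSCALE)  # known area
orb = cv2.ORB_create(nfeatures=2000)
ref_kp, ref_desc = orb.detectAndCompute(reference, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def locate(camera_frame):
    """Estimate where the current camera view sits inside the reference map."""
    gray = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2GRAY)
    kp, desc = orb.detectAndCompute(gray, None)
    matches = sorted(matcher.match(desc, ref_desc), key=lambda m: m.distance)[:100]
    if len(matches) < 10:
        return None  # not enough recognisable landmarks in view
    src = np.float32([kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([ref_kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    h, w = gray.shape
    centre = np.float32([[[w / 2, h / 2]]])
    # Position of the image centre in reference-map pixels; a real system would
    # convert this to lat/lon using the reference map's geotransform.
    return cv2.perspectiveTransform(centre, H)[0][0]
```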
The Saker Scout is integrated with Ukraine’s Delta intelligence distribution system, which fuses data from drones, satellites and other sources to produce a complete map of the battlefield. A Saker Scout can reconnoiter an area on its own and bring back information. Rather than just raw video, the software highlights enemy vehicles, so the drones map out enemy positions — work that would otherwise take hundreds of hours of analysis by humans, which is not practical when you are fighting a war in real time.
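Delta’s data formats are not public, but turning frame-by-frame detections into a map layer rather than raw video can be sketched as follows. The record fields, grid size and merging rule here are assumptions chosen for illustration.

```python
# Illustrative sketch of collapsing repeated sightings into one map entry per
# object, instead of handing analysts hours of raw video. Not Delta's format.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Sighting:
    label: str        # e.g. "tank"
    lat: float
    lon: float
    confidence: float
    timestamp: float  # seconds since mission start

GRID = 0.001  # ~100 m grid cells used to merge repeat sightings of one vehicle

def build_map_layer(sightings):
    """Collapse many frame-by-frame sightings into one entry per object."""
    cells = defaultdict(list)
    for s in sightings:
        key = (s.label, round(s.lat / GRID), round(s.lon / GRID))
        cells[key].append(s)
    # Keep the highest-confidence sighting in each cell as the map entry.
    return [max(group, key=lambda s: s.confidence) for group in cells.values()]
```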
The aim is to enable an extremely fast reconnaissance-decision-strike process (also known as the ‘kill chain’) at a speed not possible when humans are in the loop. Saker suggest that a kill chain moving at machine speed, with minimal human involvement, could be transformational in defeating Russian forces.
The Saker Scout can act as a hunter for FPV ('First Person View') attack drone teams: the AI acts as an observer pointing out targets, and automatically passes details to FPV attack drone operators who will verify a target before striking it.
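The hand-off described above, in which a human operator verifies each target before a strike, could be represented as a simple structured report. The sketch below is purely illustrative; the field names and verification step are assumptions, not a published Saker or Delta interface.

```python
# Sketch of a scout-to-FPV target hand-off with a human verification gate.
from dataclasses import dataclass, field
from typing import Optional
import time

@dataclass
class TargetReport:
    target_id: str
    label: str                          # detected class, e.g. "tank"
    lat: float
    lon: float
    confidence: float
    reported_at: float = field(default_factory=time.time)
    verified_by: Optional[str] = None   # filled in by a human operator

    def verify(self, operator_callsign: str) -> None:
        """A human operator confirms the target before any strike is tasked."""
        self.verified_by = operator_callsign

    @property
    def cleared_to_strike(self) -> bool:
        return self.verified_by is not None
```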
Autonomous Attack Drone
The most radical use of the Saker Scout is to carry out attacks without a human in the loop, finding and hitting targets autonomously. A company spokesman confirmed to me that the Saker Scout had already been used in this mode, but only on a small scale. Most likely it is only used autonomously when radio interference or jamming prevents direct operator control. A video from Saker shows one of their drones carrying out a bombing mission. It is not known if this was autonomous.
The spokesman noted that the AI is not perfect, but their priority was getting a useful system that saves lives into the field. As mentioned, Saker are in constant contact with users and the system is updated continuously.
Going forward, if the system is deemed to be sufficiently reliable, large numbers of autonomous attack drones could be deployed simultaneously without the need for trained operators or scarce radio bandwidth. Dozens of bomber drones could attack at once, immune to jamming and anti-drone guns, possibly targeting jammers or other defenses to clear the way for operator-controlled drones.
Many campaigners have sought to ban this type of ‘killer robot’, but Paul Scharre, Director of Studies at the think tank Center for a New American Security, told me that despite UN discussions going back as far as 2014, there is still no international agreement.
“The pace of technology far outstrips the pace of diplomacy,” says Scharre. “After years of little to no progress in the consensus-based Convention on Certain Conventional Weapons (CCW), humanitarian disarmament activists pushing for a ban on autonomous weapons are taking their case to the UN General Assembly in late October. Whether this process spurs states to take action on autonomous weapons remains to be seen.”
Scharre notes the underlying technology has been around for some time, so many others may have comparable systems. Other companies produce drone AI software claimed to be better at spotting targets than a human operator. AeroVironment say they are ready with an autonomous version of their Switchblade kamikaze drone if there is a demand for it.
In January Mykhailo Fedorov, Ukraine’s Minister for Digital Transformation and lead on the Army of Drones initiative, stated in an interview with AP that autonomous weapons were a “logical and inevitable” next step in drone development, suggesting a quiet but deliberate policy. On October 6th Fedorov announced in an official statement that some 2,000 AI-enabled drones had just been supplied via Army of Drones.
"They will assist in safely carrying out reconnaissance, adjusting artillery fire, and uncovering even well-concealed Russian objectives thanks to AI," Fedorov stated. He did not say whether these drones were also capable of autonomous attack. If not, this capability is only a software update away.
The concern in the past has always been that bad actors would develop this type of technology. However, the moral argument becomes more complex when it is being deployed in an existential fight for Ukraine’s survival against a brutal invasion. As with cluster bombs, Ukraine may be less concerned with possible long-term effects and more concerned about winning the war. But such weapons will not be confined to Ukraine for long.
“Operational pressures are likely to push both sides in the direction of autonomous weapons,” says Scharre.
Scharre notes that while they may initially be limited to military targets like tanks and radar, autonomous weapons could soon be used for less discriminate approaches such as targeting personnel.