A drone skims over the treetops, ignoring the dense blankets of electronic jamming. Onboard cameras feed a steady stream of the landscape beneath to the navigation processor, which compares them to preloaded terrain data at machine speed.
Soon, the drone detects shapes below — the AI models in its electronic brain spit out a high probability match to some enemy hardware, enough to trip the engagement threshold. The drone locks onto the image, tips forward, and swoops down, tenaciously seeking its target.
This scenario is both the nightmare and the dream of militaries around the world, which are racing to be the first to field fleets of killer robots. While the US and China are spending billions on this contest, neither faces the live-fire conditions or the stakes of the race between Ukraine and Russia.
Ukraine is running out of soldiers, and Russia’s loss-to-recruitment ratio is starting to look grim. If full-spectrum autonomy is possible, it may not only help turn the tide but put the winner at the forefront of a terrifying military revolution. Former top commander Valeriy Zaluzhnyi has warned that Ukraine must win this race no later than 2027.
Fortunately, fully autonomous weapons — those that navigate, detect, select, and strike targets without a human pilot, all in one package — don't exist yet, according to more than a dozen interviews with AI developers, dronemakers, military insiders, and Russia analysts.
However, each separate capability, from navigation to object recognition to targeting assistance, is already available and increasingly common on the battlefield.
“The reality is both sides already have this capability, but currently it’s not widely employed. I’d expect increasing adoption within the next 18 months,” said ‘Walrus,’ a mathematician and AI instructor, who now develops drones for Ukraine’s National Guard.
“We’re not at full independent ‘killer robot’ level, but we are already deep into practical battlefield autonomy,” said another AI drone programmer with Ukraine’s National Guard, who goes by ‘Jack.’ “What exists now is task-level autonomy — auto tracking, assisted navigation, terminal guidance, and vision-based locking.”
What’s the big deal with AI weapons?
The full scope of what a war fought with autonomous weapons will look like is still unknown, as such a war has never been fought before. But the ability to strike the enemy without putting your own forces at risk can give a major edge to those who can field this tech.
First, there’s the EW resistance. Ukraine and Russia invented the modern version of machine war and as drones took over the battlefield, so did electronic warfare. Ukraine is a web of constant jamming, counter-jamming, frequency-hopping, and disrupted connections. Fiber-optic drones are a way to get around it, but they have their own trade-offs.
An autonomous weapon can theoretically navigate, seek targets, and hit them without a human operator to guide it, allowing it to ignore EW altogether, with no need for physical tethers.

It also means that “a single operator can control strikes on multiple targets simultaneously, which could greatly reduce the need for operators,” Walrus said. “Given the manpower shortage in the Ukrainian armed forces, this would be a significant advantage.”
Autonomy would also mean that drones can be launched from farther away, greatly improving pilot safety. Russia and its Rubicon (also spelled Rubikon) center have prioritized taking out Ukrainian drone operators.
Effective AI-powered weapons also theoretically reduce reliance on pilot skill, acuity, and dexterity when locating and engaging targets. Highly-skilled warfighters are often in short supply, especially in a war with this kind of body count.
All of the above raises serious ethical questions, which many, including the Vatican, have warned about. US AI company Anthropic got in hot water with the Pentagon for refusing to remove guardrails against autonomous weaponry and AI surveillance; online responses were scathing toward the US government.
Sources told Euromaidan Press that many Ukrainians, and even Russians, are leery of these possibilities as well, especially if false positives, and thus friendly fire, are on the table.
The race heats up
However, these reservations have done little to stop the race from proceeding at full speed, and the contest has produced a growing number of results along the way.
“What we are seeing on the battlefield is Ukrainians and Russians increasingly using things like machine learning and machine vision to aid some of their drone operations,” said Kateryna Stepanenko, a Russia researcher with the Institute for the Study of War.
These innovations come in many forms. Both Ukrainians and Russians already field interceptor and FPV drones that can lock on and pursue their targets. Ukrainian UAVs can navigate through terrain image matching for both close flights and deep strikes into Russia. Autoturret gun platforms are becoming increasingly viable on the battlefield.

Ukraine has said on the record that a growing number of its companies — at least 200 — are working on AI capabilities. Ukraine’s military tech advantage has been its quick iteration and innovation cycle, as well as the wealth of data collected in its battle management systems, like Delta and OCHI.
“Ukraine does have an advantage because they do have systems like Delta that are integrated, so there's a platform that AI can learn from,” Stepanenko said. “The Russians don't have that system just yet. They're trying to develop it, but they're behind."
“I think, honestly, that Ukraine is the closest army in the world now to having a vision that enables truly autonomous warfare, no longer risking human lives,” said Merlin Hipp, founder of Germany’s Lancelot Systems, which works with Ukraine.
Russia has been increasingly silent on what it’s doing with AI since 2022, but multiple entities connected to the defense ministry, including ERA Institute, Technopark, Rostec subsidiaries, and Southern Federal University, have been hacking away at the problem.
“On the Russian side, they're using things like optical navigation — homing and targeting… where the operator is still required to determine what the target is, but the drones are increasingly able to pursue the target after the target was set for them,” Stepanenko said.
Russia’s advantage in this war has been its greater access to resources and centralized production, allowing it to more quickly scale its most effective technologies to overwhelm Ukrainian forces, as it has done with fiber-optic drones. Russia has also sought to speed up its own innovation cycle.
“The Russians are not stupid," Hipp told Euromaidan Press. "We collected a lot of data from their ZALA loitering munitions… and you see that their AI is getting better and better, and it’s very scary.”
Full autonomy still out of reach
While machine learning is increasingly augmenting human abilities, neither side has yet managed to move beyond its need for skilled human operators, at least for now.
“The automatic decision-making component is still missing and we're not really seeing it,” Stepanenko said.
“Fully independent ‘search–identify–decide–engage’ systems without human supervision are still extremely difficult to make reliable at scale,” Jack concurred.
“Everyone is experimenting, including the Russians, but nobody has solved all the reliability and identification challenges cleanly.”

Part of the challenge is filtering the right training data. Thousands of drones generate terabytes of footage every single day, but the quality is often inconsistent. Experts said that finding, sorting, and labeling data can be one of the most expensive and time-consuming aspects of AI development. While Kyiv is ahead of Moscow in this regard, some sources said it remains a work in progress, even on the Ukrainian side.
Integration and engineering are even bigger obstacles. “The development of the full control stack will be the biggest challenge,” Walrus said. “A fully automated system must read video input, process the video and construct the scene, maintain awareness of where the drone is and where it has been, and then transmit the control signals.”
Modeling approaches
One way to tackle this problem is a multi-stage pipeline: multiple models working in tandem, each handling a different task — one to process the video feed, one to segment and identify potential targets, and one to determine the control signals needed to reach the target.
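To make the idea of a multi-stage pipeline concrete, here is a minimal pure-Python sketch. Every name in it is hypothetical, and the "detector" is a trivial stub standing in for a trained vision model; a real system would be far more complex.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    label: str
    confidence: float            # 0.0 .. 1.0
    center: Tuple[float, float]  # pixel coordinates in the frame

def preprocess(frame: List[List[int]]) -> List[List[float]]:
    """Stage 1: normalise raw pixel values (stand-in for a real vision frontend)."""
    return [[px / 255.0 for px in row] for row in frame]

def detect(frame: List[List[float]]) -> List[Detection]:
    """Stage 2: propose candidate targets.

    A real system would run a trained object-detection model here;
    this stub simply reports the brightest pixel as a candidate.
    """
    best_val, best_pos = -1.0, (0.0, 0.0)
    for y, row in enumerate(frame):
        for x, val in enumerate(row):
            if val > best_val:
                best_val, best_pos = val, (float(x), float(y))
    return [Detection("candidate", best_val, best_pos)]

def steer(det: Detection, width: int, height: int,
          gain: float = 0.1) -> Tuple[float, float]:
    """Stage 3: proportional guidance - command pitch/yaw toward the detection."""
    dx = det.center[0] - width / 2
    dy = det.center[1] - height / 2
    return (gain * dx, gain * dy)

def pipeline_step(raw: List[List[int]],
                  threshold: float = 0.8) -> Optional[Tuple[float, float]]:
    """Run one frame through all three stages; act only above the threshold."""
    frame = preprocess(raw)
    best = max(detect(frame), key=lambda d: d.confidence)
    if best.confidence < threshold:
        return None  # keep searching; confidence too low to engage
    return steer(best, len(raw[0]), len(raw))
```

The key design point the sketch illustrates is the confidence threshold between stages two and three: a low-confidence match yields no control output at all, which is exactly where the false-positive debate described earlier lives.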
Another way is to use what’s called an “agentic approach.” Unlike other forms of AI, “agents” are not instructed how to complete a task, but only what the task is; the AI is then left to determine a method to solve it. But this comes with its own trade-offs.
Insufficiently sophisticated models might get confused by conditions they weren't prepared for. For example, if the system wasn’t sufficiently trained on different types of weather, it might falter when snow covers the terrain. Enemies might also try to alter their targets’ appearance to confuse the drone.
“We actually saw this after Operation Spiderweb, where they would put different coats of paint and tires on their planes,” Stepanenko said.
“So that way, when the drone with machine learning capabilities sees the plane, it might get confused, because the plane doesn’t look as perfect as the image preloaded into it.”

As time goes on, AI-assisted machines are getting better at overcoming these challenges, but it remains a work in progress. A Ukrainian Unmanned Systems Forces officer, speaking on condition of anonymity, told Euromaidan Press that in battlefield tests, AI-assisted drones regularly failed to detect targets that eagle-eyed human pilots were able to spot.
“We have a lot of footage of, for example, Russian tanks in the field,” he said. “The AI was trained on this data, and it still misses quite a lot of stuff that people are able to see. I understand this is just a matter of time, but our training models are not good right now.”
Advancements being made
Still, some companies say they are moving closer. Hipp said his company’s AI model, trained on “millions” of videos and harvested Russian data, is able to identify Russian targets with over 80% certainty.
He demonstrated this on his computer, showing images of various Russian combat vehicles with a percentage overlaid on each. Vehicles receding into the background had lower percentages, but some closer to the foreground climbed to 90%.
Hipp expressed confidence that Lancelot’s AI model could soon compete with top humans, even in environments filled with smoke or inclement weather.
"It's getting better with every interval. There are some things that are extremely difficult to detect by the human eye," he said. "The human has to be an experienced pilot… I think that the next iteration (of the company’s AI) will be as good as the very best drone pilots, because it's learning with every mission flown."
Hardware and scaling
The final — some might say most important — challenge is to miniaturize this model, put it inside a drone, and make sure it works smoothly with the physical components. After all, a drone that has to connect to some distant database is useless on the EW-choked battlefield.
Edge deployment of this kind is quite feasible, multiple sources said. A model trained in a laboratory or data center can be compressed and squeezed onto a small onboard processing unit — most developers use Raspberry Pi boards, often augmented with additional memory.
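The compression step usually relies on techniques like quantization. As a minimal sketch of the idea (pure Python; real toolchains such as TensorFlow Lite or ONNX Runtime do this per-layer, with calibration data), mapping 32-bit float weights down to 8-bit integers cuts a model's memory footprint roughly fourfold at some cost in precision:

```python
from typing import List, Tuple

def quantize_int8(weights: List[float]) -> Tuple[List[int], float]:
    """Map float weights to int8 values (-127..127) with a single scale factor.

    The simplest form of post-training quantization: store one byte per
    weight plus one shared scale, instead of four bytes per weight.
    """
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized: List[int], scale: float) -> List[float]:
    """Recover approximate float weights on the edge device at inference time."""
    return [v * scale for v in quantized]
```

The reconstruction error per weight is bounded by the scale factor, which is why heavily quantized models lose some of the "AI sophistication" the next paragraph describes trading away.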

But the trade-offs are stark: how much AI sophistication you want to keep, how nimble and destructive the drone needs to be, and how much you have to spend per unit.
“Smarter” drones require more compute, raising weight and power draw, leaving less room for payload, sensors, range, or maneuverability. Latency is also critical. Even small delays can affect interception accuracy at high closing speeds.
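The latency arithmetic is simple but stark. As a back-of-the-envelope sketch (illustrative numbers, not taken from any specific system):

```python
def lag_distance(closing_speed_mps: float, latency_s: float) -> float:
    """Metres the engagement geometry shifts during one processing delay."""
    return closing_speed_mps * latency_s

# An interceptor closing at 70 m/s with 100 ms of onboard processing lag
# is steering toward where the target was seven metres ago.
offset_m = lag_distance(70.0, 0.1)
```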
“Hardware matters as much as software. Cameras are critical, and high-quality sensors — especially thermal — are expensive,” Jack said.
“Successful locking often comes down to pixel density. If the target is only a few pixels in size, detection reliability drops quickly. That’s why optics are extremely important.”
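Jack's point about pixel density follows directly from camera geometry. A rough sketch (illustrative numbers, not from any specific drone; uses the flat-image approximation that the frame spans `2 * distance * tan(fov/2)` metres at the target's range):

```python
import math

def pixels_on_target(target_m: float, distance_m: float,
                     fov_deg: float, h_resolution: int) -> float:
    """Approximate width of a target in pixels for a given camera and range."""
    frame_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return target_m / frame_width_m * h_resolution

# A 7 m vehicle seen through a 60-degree lens on a 1920-pixel-wide sensor:
# at 1,000 m it occupies roughly a dozen pixels; at 300 m, nearly forty.
far = pixels_on_target(7.0, 1000.0, 60.0, 1920)
near = pixels_on_target(7.0, 300.0, 60.0, 1920)
```

A dozen pixels is marginal for reliable classification, which is why narrower (more expensive) optics or closer approaches are often the only fixes.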
Expense is an even bigger issue when trying to scale this technology, especially when considering drones' attrition rate, both during development and on the battlefield. Putting really expensive boards into an expendable machine doesn’t make sense for either side.
Brian Streem, CEO of Vermeer, which makes drones for Ukraine that navigate through terrain image matching, emphasized the engineering challenges involved in AI weapons.
"We have a deep neural network that is matching terrain features to a locally stored database of imagery, and it's pretty impressive what it's capable of.”
"But that's one piece of 35 other things that need to work well... There's integration into other embedded systems, flight controllers, other sensor fusion technologies... You have to be aware of barometric pressure sensors; there's a whole lot of other stuff to get this to actually work and not just in a computer simulation.”