
September 28, 2020 | International, Land, C4ISR

Inside the Army’s futuristic test of its battlefield artificial intelligence in the desert

YUMA PROVING GROUND, Ariz. — After weeks of work in the oppressive Arizona desert heat, the U.S. Army carried out a series of live fire engagements Sept. 23 at Yuma Proving Ground to show how artificial intelligence systems can work together to automatically detect threats, deliver targeting data and recommend weapons responses at blazing speeds.

Set in the year 2035, the engagements were the culmination of Project Convergence 2020, the first in a series of annual demonstrations utilizing next generation AI, network and software capabilities to show how the Army wants to fight in the future.

The Army was able to use a chain of artificial intelligence, software platforms and autonomous systems to take sensor data from all domains, transform it into targeting information, and select the best weapon system to respond to any given threat in just seconds.

Army officials claimed that these AI and autonomous capabilities have shortened the sensor-to-shooter timeline — the time it takes from when sensor data is collected to when a weapon system is ordered to engage — from 20 minutes to 20 seconds, depending on the quality of the network and the number of hops between where the data is collected and its destination.

“We use artificial intelligence and machine learning in several ways out here,” Brigadier General Ross Coffman, director of the Army Futures Command's Next Generation Combat Vehicle Cross-Functional Team, told visiting media.

“We used artificial intelligence to autonomously conduct ground reconnaissance, employ sensors and then passed that information back. We used artificial intelligence and aided target recognition and machine learning to train algorithms on identification of various types of enemy forces. So, it was prevalent throughout the last six weeks.”

The first exercise featured is illustrative of how the Army stacked together AI capabilities to automate the sensor-to-shooter pipeline. In that example, the Army used space-based sensors operating in low Earth orbit to take images of the battleground. Those images were downlinked to a TITAN ground station surrogate located at Joint Base Lewis-McChord in Washington, where they were processed and fused by a new system called Prometheus.

Currently under development, Prometheus is an AI system that takes the sensor data ingested by TITAN, fuses it, and identifies targets. The Army received its first Prometheus capability in 2019, although its targeting accuracy is still improving, according to one Army official at Project Convergence. In some engagements, operators were able to send in a drone to confirm potential threats identified by Prometheus.

From there, the targeting data was delivered to a Tactical Assault Kit — a software program that gives operators an overhead view of the battlefield populated with both blue and red forces. As new threats are identified by Prometheus or other systems, that data is automatically entered into the program to show users their location. Specific images and live feeds can be pulled up in the environment as needed.

All of that takes place in just seconds.
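The data flow described above — downlink imagery, identify targets, auto-populate the operator's map — can be sketched in miniature. This is purely illustrative: every function and field name below is hypothetical, since the real TITAN, Prometheus, and Tactical Assault Kit interfaces are not public.

```python
# Toy sketch of the sensor-to-shooter data flow: imagery is downlinked,
# targets are identified, and the common operating picture (COP) is
# updated automatically for operators. All names are hypothetical.

def downlink_image(satellite_pass: str) -> dict:
    """Stand-in for a low-Earth-orbit sensor downlink to a ground station."""
    return {"image_id": satellite_pass, "pixels": None}

def detect_targets(image: dict) -> list:
    """Stand-in for Prometheus-style fusion and target identification.
    A real system would run trained recognition models on the imagery."""
    return [{"target_id": "T1", "lat": 33.0, "lon": -114.0, "confidence": 0.87}]

def update_common_operating_picture(cop: dict, detections: list) -> dict:
    """Stand-in for auto-populating a Tactical Assault Kit-style display."""
    for det in detections:
        cop[det["target_id"]] = (det["lat"], det["lon"])
    return cop

cop = {}
image = downlink_image("pass-042")
cop = update_common_operating_picture(cop, detect_targets(image))
print(cop)  # newly identified threats appear on the operators' map
```

The point of the chain is that no human retypes anything between steps: each stage's output is the next stage's input, which is what compresses minutes of manual handoffs into seconds.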

Once the Army has its target, it needs to determine the best response. Enter the real star of the show: the FIRES Synchronization to Optimize Responses in Multi-Domain Operations, or FIRESTORM.

“What is FIRESTORM? Simply put, it's a computer brain that recommends the best shooter, updates the common operating picture with the current enemy situation and friendly situation, and missions the effectors that we want to eradicate the enemy on the battlefield,” said Coffman.

Army leaders were effusive in praising FIRESTORM throughout Project Convergence. The AI system works within the Tactical Assault Kit. Once new threats are entered into the program, FIRESTORM processes the terrain, available weapons, proximity, number of other threats and more to determine the best firing system to respond to that given threat. Operators can assess and follow through with the system's recommendations with just a few clicks of the mouse, sending orders to soldiers or weapons systems within seconds of identifying a threat.

Just as important, FIRESTORM provides critical target deconfliction, ensuring that multiple weapons systems aren't redundantly firing on the same threat. Right now, that sort of deconfliction would have to take place over a phone call between operators. FIRESTORM speeds up that process and eliminates any potential misunderstandings.
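The weapon-target pairing and deconfliction logic described above can be illustrated with a simple greedy assignment: score each in-range shooter by proximity, pick the best one, and record the tasking so no second shooter is ordered against the same threat. This is a minimal sketch under assumed inputs, not a description of how FIRESTORM actually works; its real scoring factors (terrain, munition effects, threat count) are far richer.

```python
# Hypothetical sketch of shooter recommendation with deconfliction:
# each threat is paired with the closest in-range shooter, and the
# tasking is recorded so the same threat is never assigned twice.
from dataclasses import dataclass, field
from math import hypot

@dataclass
class Shooter:
    name: str
    x: float            # position in km
    y: float
    max_range_km: float
    taskings: set = field(default_factory=set)

def pair_threats_to_shooters(threats: dict, shooters: list) -> dict:
    """Greedy pairing of threat IDs to shooter names."""
    orders = {}
    for tid, (tx, ty) in threats.items():
        candidates = [
            (hypot(s.x - tx, s.y - ty), s)
            for s in shooters
            if hypot(s.x - tx, s.y - ty) <= s.max_range_km
        ]
        if not candidates:
            continue  # no shooter in range of this threat
        _, best = min(candidates, key=lambda c: c[0])
        best.taskings.add(tid)   # deconfliction: the tasking is on record
        orders[tid] = best.name
    return orders

threats = {"T1": (10.0, 5.0), "T2": (45.0, 0.0)}
shooters = [
    Shooter("cannon battery", 0.0, 0.0, 40.0),
    Shooter("armed drone", 30.0, 0.0, 25.0),
]
print(pair_threats_to_shooters(threats, shooters))
# → {'T1': 'cannon battery', 'T2': 'armed drone'}
```

Because the recommendation and the tasking record live in one system, the phone-call step the article mentions disappears: every operator sees the same assignment the moment it is made.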

In that first engagement, FIRESTORM recommended the Extended Range Cannon Artillery. Operators approved the algorithm's choice, and the cannon promptly fired a projectile at the target located 40 kilometers away. The process from identifying the target to sending those orders happened faster than it took the projectile to reach the target.

Perhaps most surprising is how quickly FIRESTORM was integrated into Project Convergence.

“This computer program has been worked on in New Jersey for a couple years. It's not a program of record. This is something that they brought to my attention in July of last year, but it needed a little bit of work. So we put effort, we put scientists and we put some money against it,” said Coffman. “The way we used it is as enemy targets were identified on the battlefield — FIRESTORM quickly paired those targets with the best shooter in position to put effects on it. This is happening faster than any human could execute. It is absolutely an amazing technology.”

Dead Center

Prometheus and FIRESTORM weren't the only AI capabilities on display at Project Convergence.

In other scenarios, an MQ-1C Gray Eagle drone was able to identify and target a threat using the on-board Dead Center payload. With Dead Center, the Gray Eagle was able to process the sensor data it was collecting, identifying a threat on its own without having to send the raw data back to a command post for processing and target identification. The drone was also equipped with the Maven Smart System and Algorithmic Inference Platform, a product created by Project Maven, a major Department of Defense effort to use AI for processing full-motion video.

According to one Army officer, the capabilities of the Maven Smart System and Dead Center overlap, but placing both on the modified Gray Eagle at Project Convergence helped them to see how they compared.

With all of the AI engagements, the Army ensured there was a human in the loop to provide oversight of the algorithms' recommendations. When asked how the Army was implementing the Department of Defense's principles of ethical AI use adopted earlier this year, Coffman pointed to the human barrier between AI systems and lethal decisions.

“So obviously the technology exists to remove the human — right, the technology exists — but the United States Army, an ethics-based organization, that's not going to remove a human from the loop to make decisions of life or death on the battlefield, right? We understand that,” explained Coffman. “The artificial intelligence identified geolocated enemy targets. A human then said, yes, we want to shoot at that target.”

https://www.c4isrnet.com/artificial-intelligence/2020/09/25/the-army-just-conducted-a-massive-test-of-its-battlefield-artificial-intelligence-in-the-desert/
