November 2, 2020 | International, Land, C4ISR

How the Army plans to revolutionize tanks with artificial intelligence

Even as the U.S. Army attempts to integrate cutting-edge technologies into its operations, many of its platforms remain fundamentally in the 20th century.

Take tanks, for example.

The way tank crews operate their machine has gone essentially unchanged over the last 40 years. At a time when the military is enamored with robotics, artificial intelligence and next-generation networks, operating a tank relies entirely on manual inputs from highly trained operators.

“Currently, tank crews use a very manual process to detect, identify and engage targets,” explained Abrams Master Gunner Sgt. 1st Class Dustin Harris. “Tank commanders and gunners are manually slewing, trying to detect targets using their sensors. Once they come across a target they have to manually select the ammunition that they're going to use to service that target, lase the target to get an accurate range to it, and a few other factors.”

The process has to be repeated for each target.

“That can take time,” he added. “Everything is done manually still.”

On the 21st century battlefield, it's an anachronism.

“Army senior leaders recognize that the way the crews in the tank operate is largely analogous to how these things were done 30, 45 years ago,” said Richard Nabors, acting principal deputy for systems and modeling at the DEVCOM C5ISR Center.

“These senior leaders, many of them with extensive technical expertise, recognized that there were opportunities to improve the way that these crews operate,” he added. “So they challenged the Combat Capabilities Development Command, the Armaments Center and the C5ISR Center to look at the problem.”

On Oct. 28, the Army invited reporters to Aberdeen Proving Ground to see their solution: the Advanced Targeting and Lethality Aided System, or ATLAS.

ATLAS uses advanced sensors, machine learning algorithms and a new touchscreen display to automate the process of finding and engaging targets, allowing crews to respond to threats faster than ever before.

“The assistance that we're providing to the soldiers will speed up those engagement times [and] allow them to execute multiple targets in the same time that they currently take to execute a single target,” said Dawne Deaver, C5ISR project lead for ATLAS.

At first glance, the ATLAS prototype the Army had set up looked like something out of a Star Wars film, albeit with treads and not easily harpooned legs. The system was installed on a mishmash of systems — a sleek black General Dynamics Griffin I chassis with the Army's Advanced Lethality and Accuracy System for Medium Caliber (ALAS-MC) auto-loading 50mm turret stacked on top.

And mounted on top of the turret was a small round Aided Target Recognition (AiTR) sensor — a mid-wave infrared imaging sensor to be more exact. Constantly rotating to scan the battlefield, the sensor almost had a life of its own, not unlike an R2 unit on the back of an X-Wing.

Trailing behind the tank and connected via a series of long black cables was a black M113. For this demonstration, the crew station was located inside the M113, not the tank itself. Cavernous compared to the inside of an Abrams tank, the M113 had three short seats lined up. At the forward-most seat was a touchscreen display and a video game-like controller for operating the tank, while further back computer monitors displayed ATLAS' internal processes.

Of course, ATLAS isn't the tank itself, or even the M113 connected to it. The chassis served as a surrogate for a future tank, a fighting vehicle or even a retrofit of current vehicles, while the turret was an available program being developed by the Armaments Center. The M113 is not really meant to be involved at all, but the Army decided to remotely locate the crew station inside of it due to safety concerns during a live fire demonstration expected to take place in the coming weeks. ATLAS, Army officials reminded observers again and again, is agnostic to the chassis or turret it's installed on.

So if ATLAS isn't the tank, what is it?

Roughly speaking, ATLAS is the mounted sensor collecting data, the machine learning algorithm processing that data, and the display/controller that the crew uses to operate the tank.

Here's how it works:

ATLAS starts with the optical sensor mounted on top of the tank. Once activated, the sensor continuously scans the battlefield, feeding that data into a machine learning algorithm that automatically detects threats.

Images of those threats are then sent to a new touchscreen display, the graphical user interface for the tank's intelligent fire control system. The images are lined up vertically on the left side of the screen, with the main part of the display showing what the gun is currently aimed at. Around the edges are a number of different controls for selecting ammunition, fire type, camera settings and more.

When the operator simply touches one of the targets on the left, the tank automatically swivels its gun, training its sights on the dead center of the selected object. As it does that, the fire control system automatically recommends the appropriate ammo and setting — such as burst or single shot — to respond with, though the user can adjust these as needed.

With the target in its sights and the weapon selected, the operator has a choice: Approve the AI's recommendations and pull the trigger, adjust the settings before responding, or disengage. The entire process from target detection to the pull of the trigger can take just seconds. Once the target is destroyed, the operator can simply touch the screen to select the next target picked up by ATLAS.
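The detect-recommend-approve loop described above can be sketched as a short state machine. This is purely illustrative: the class names, fields, thresholds and ammo types below are invented for the example, and are not taken from the actual ATLAS software.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """A detection produced by the aided target recognition algorithm."""
    image_id: str
    bearing_deg: float
    range_m: float

def recommend_fire_settings(threat: Threat) -> dict:
    """Hypothetical stand-in for the intelligent fire control step."""
    # Toy rule: closer targets get a single shot, distant ones a burst.
    fire_mode = "single" if threat.range_m < 1500 else "burst"
    return {"ammo": "airburst", "fire_mode": fire_mode}

def engage(threats: list[Threat], approve) -> list[str]:
    """Walk the detected-threat queue the way the article describes:
    slew to the target, take the AI's recommendation, then let the
    human operator approve, adjust, or disengage."""
    fired = []
    for threat in threats:
        settings = recommend_fire_settings(threat)
        decision = approve(threat, settings)  # human stays in the loop
        if decision == "fire":
            fired.append(threat.image_id)
        # "adjust" and "disengage" branches omitted in this sketch
    return fired

# Example: an operator who approves every recommendation
threats = [Threat("t1", 40.0, 900.0), Threat("t2", 85.0, 2200.0)]
print(engage(threats, lambda t, s: "fire"))  # ['t1', 't2']
```

The point of the structure is the one Army officials emphasize later in the piece: the algorithm only queues and recommends, while the `approve` callback — the soldier — makes the final call on every engagement.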

In automating what are now manual tasks, the aim of ATLAS is to reduce end-to-end engagement times. Army officials declined to characterize how much faster ATLAS is than a traditional tank crew. However, a demo video shown at Aberdeen Proving Ground claimed ATLAS allows “the operator to engage three targets in the time it now takes to just engage one.”

ATLAS is essentially a marriage between technologies developed by the Army's C5ISR Center and the Armaments Center.

“We are integrating, experimenting and prototyping with technology from C5ISR center — things like advanced EO/IR targeting sensors, aided target algorithms — we're taking those technology products and integrating them with intelligent fire control systems from the Armaments Center to explore efficiencies between those technologies that can basically buy back time for tank crews,” explained Ground Combat Systems Division Deputy Director Jami Davis.

Starting in August, the Army began bringing in small groups of tank operators to test out the new system, mostly using a new virtual reality setup that replicates the ATLAS display and controller. By gathering soldier feedback early, the Army hopes that they can improve the system quickly and make it ready for fielding that much faster. Already, the Army has brought in 40 soldiers. More soldier touchpoints and a live fire demonstration are anticipated to help the Army mature its product.

In some ways, ATLAS replicates in miniature the AI capabilities demonstrated at Project Convergence. Project Convergence is the Army's new campaign of learning, designed to integrate new sensor, AI and network capabilities to transform the battlefield. In September, the Army hauled many of its most advanced technologies to the desert at Yuma Proving Ground, then tried to connect them in new ways. In short, at Project Convergence the Army tried to create an environment where it could connect any sensor to the best shooter.

The Army demonstrated two types of AI at Project Convergence. First were the automatic target recognition AIs. These machine learning algorithms processed the massive amount of data picked up by the Army's sensors to detect and identify threats on the battlefield, producing targeting data for weapon systems to utilize.

The second type of AI was used for fire control, and is represented by FIRES Synchronization to Optimize Responses in Multi-Domain Operations, or FIRESTORM. Taking in the targeting data from the other AI systems, FIRESTORM automatically looks at the weapons at the Army's disposal and recommends the best one to respond to any given threat.
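As described, FIRESTORM's core job is a matching problem: given a threat and the weapons at hand, recommend the best shooter. A toy version of that logic might look like the following — the weapon names, ranges, effectiveness sets and cost scores are all invented for illustration and have no connection to the real FIRESTORM system.

```python
def recommend_shooter(threat_range_m, threat_type, weapons):
    """Filter weapons that can reach and affect the threat,
    then prefer the cheapest adequate option."""
    candidates = [
        w for w in weapons
        if w["max_range_m"] >= threat_range_m
        and threat_type in w["effective_against"]
    ]
    if not candidates:
        return None  # no suitable shooter available
    return min(candidates, key=lambda w: w["cost"])["name"]

# Invented inventory for the example
weapons = [
    {"name": "50mm cannon", "max_range_m": 3000,
     "effective_against": {"vehicle", "infantry"}, "cost": 1},
    {"name": "ATGM", "max_range_m": 8000,
     "effective_against": {"vehicle"}, "cost": 10},
]
print(recommend_shooter(2500, "vehicle", weapons))  # 50mm cannon
print(recommend_shooter(5000, "vehicle", weapons))  # ATGM
```

Even in this toy form, the separation mirrors the article's description: one AI layer produces targeting data, and a second layer reasons over the available effects to pair each threat with a weapon.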

While ATLAS does not yet have the networking components that tied Project Convergence together across domains, it essentially performs those two tasks: Its AI automatically detects threats and recommends the best response to the human operators. Although the full ATLAS system wasn't hauled out to Project Convergence this year, the Army was able to bring out the virtual prototyping setup to Yuma Proving Ground, and there is hope that ATLAS itself could be involved next year.

To be clear: ATLAS is not meant to replace tank crews. It's meant to make their jobs easier, and in the process, much faster. Even if ATLAS is widely adopted, crews will still need to be trained for manual operations in case the system breaks down. And they'll still need to rely on their training to verify the algorithm's recommendations.

“We can assist the soldier and reduce the number of manual tasks that they have to do while still retaining the soldiers' ability to always override the system, to always make the final decision of whether or not the target is a threat, whether or not the firing solution is correct, and that they can make that decision to pull the trigger and engage targets,” explained Deaver.

https://www.c4isrnet.com/artificial-intelligence/2020/10/29/how-the-army-plans-to-revolutionize-tanks-with-artificial-intelligence/
