10 November 2020 | International, C4ISR

Mixed-reality systems can bring soldier feedback into development earlier than ever before. Here’s how the US Army is using it.

ABERDEEN PROVING GROUND, Md. — The U.S. Army's Combat Capabilities Development Command has made clear it wants to introduce soldier feedback earlier in the design process, ensuring that new technologies are meeting users' needs.

“Within the CCDC, the need to get soldier feedback, to make sure that we're building the appropriate technologies and actually getting after the users' needs is critical,” said Richard Nabors, acting principal deputy for systems and modeling at the command's C5ISR Center (Command, Control, Computers, Communications, Cyber, Intelligence, Surveillance and Reconnaissance).

“There's a concerted effort within the C5ISR Center to do more prototyping not just at the final system level ... but to do it at the component level before the system of systems is put together,” he added.

But how can the service accomplish that with systems still in development?

One answer: virtual reality.

The Army's CCDC is testing this approach with its new artificial intelligence-powered tank concept: the Advanced Targeting and Lethality Aided System, or ATLAS. While tank operations today are almost entirely manual affairs, ATLAS aims to automate the threat detection and targeting components of a gunner's job, greatly increasing the speed of end-to-end engagements.

Using machine-learning algorithms and a mounted infrared sensor, ATLAS automatically detects threats and sends targeting solutions to a touch-screen display operated by the gunner. When the gunner touches an image of the target, ATLAS automatically slews the tank's gun to the threat and recommends the appropriate ammunition and response type. If everything appears correct, the gunner can simply pull the trigger to fire at the threat.

The process takes just seconds, and the gunner can immediately move on to the next threat by touching the next target on the display.
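That workflow reads as a simple sense-decide-act loop with a human trigger in the middle. The Python sketch below is purely illustrative: the Detection fields, the gun/display/gunner interfaces and the ammunition rule are hypothetical stand-ins, not the actual ATLAS software.

from dataclasses import dataclass

@dataclass
class Detection:
    """One threat reported by the mounted infrared sensor (hypothetical fields)."""
    track_id: int
    bearing_deg: float   # direction to the threat, relative to the hull
    range_m: float
    threat_class: str    # e.g. "armor" or "dismount"

def recommend_ammunition(det: Detection) -> str:
    """Stand-in for ATLAS's ammunition/response recommendation."""
    return "AP" if det.threat_class == "armor" else "HE"

def engagement_loop(detections, gun, display, gunner):
    """One pass through the touch-to-fire workflow the article describes."""
    for det in detections:                  # threats appear on the touch screen
        display.show(det)
        if not gunner.selects(det):         # gunner touches the target image...
            continue
        gun.slew_to(det.bearing_deg)        # ...and the system lays the gun
        ammo = recommend_ammunition(det)
        display.show_recommendation(det, ammo)
        if gunner.confirms():               # the human stays in the loop: only
            gun.fire(ammo)                  # a trigger pull actually fires

The structural point is that the machine proposes and the gunner disposes: nothing fires without the explicit confirmation step.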

ATLAS could revolutionize the way tank crews operate — at least in theory. But to understand how the system works with real people involved and whether this is a tool gunners want, CCDC needed to test it with soldiers.

The Army has set up an ATLAS prototype at Aberdeen Proving Ground in Maryland, and it hopes to conduct a live-fire exercise soon with targets in a field. However, to collect useful feedback, CCDC is giving soldiers a more robust experience with the system that involves multiple engagements and varying levels of data quality. To do this, the command has built a mixed-reality environment.

“It gives us the opportunity ... to get the soldiers in front of this system prior to it being here as a soldier touchpoint or using the live system so we get that initial feedback to provide back to the program, to get that soldier-centric design, to get their opinions on the system, be that from how the GUI is designed to some of the ways that the system would operate,” explained Christopher May, deputy director of the C5ISR Center's Modeling and Simulation Division.

The virtual world

In the new virtual prototyping environment — itself a prototype — users are placed in a 3D world that mimics the gunner station while using a physical controller and display that is a carbon copy of the current ATLAS design. The CCDC team can then feed simulated battlefield data into the system for soldiers to respond to as if they were actually using ATLAS.

As with most virtual reality systems, the outside looks less impressive than the rendered universe that exists on the inside. When the user sits down in the gunner's seat, their vision is enveloped by a trifold of tall blue walls, cutting them off from the real world. Directly in front of the chair is a recreation of ATLAS' touch-screen display and a 3D-printed copy of the controller.

Putting on the virtual reality headset, the user is immersed in a 3D rendering of the ATLAS prototype's gunner station, but with some real-world elements.

“We're leveraging multiple technologies to put this together. So as the operator looks around ... he has the ability to see the hand grips. He also has the ability to see his own hands,” May said.

All in all, the mixed-reality environment creates the distinct impression that the user is in the gunner's chair during a real-life engagement. And that's the whole point.

It's important to note that the virtual reality system is not meant to test the quality of the AI. While it populates the virtual battlefield with targets the same way ATLAS would, it doesn't use the actual targeting algorithm.

“We're not using the actual algorithm,” May said. “We're controlling how the algorithm performs.”
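Because the detections are scripted rather than computed, the experimenters can dial the apparent performance of the aided targeting up or down for each scenario. A minimal sketch of such a controllable stand-in, with invented parameter names rather than the real ATLAS interface, might look like this:

import random
from dataclasses import dataclass

@dataclass
class ScriptedDetector:
    """Mimics the output of an aided-target-recognition algorithm while letting
    experimenters decide exactly how well it appears to perform. All names and
    parameters here are hypothetical."""
    detection_rate: float = 1.0    # fraction of true targets that get reported
    false_alarms: int = 0          # spurious detections injected per scan
    position_noise_m: float = 0.0  # std. dev. of error on reported positions

    def scan(self, true_targets, rng=random):
        reported = []
        for x, y in true_targets:
            if rng.random() < self.detection_rate:   # scripted miss rate
                reported.append((x + rng.gauss(0, self.position_noise_m),
                                 y + rng.gauss(0, self.position_noise_m)))
        for _ in range(self.false_alarms):            # scripted clutter
            reported.append((rng.uniform(-500, 500), rng.uniform(-500, 500)))
        return reported

Instantiated with the defaults, it behaves like a perfect sensor; ScriptedDetector(detection_rate=0.7, false_alarms=2, position_noise_m=5.0) produces a deliberately degraded feed for the same scenario.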

Switching up the scenarios

Another advantage of the mixed-reality environment: The Army can experiment with ATLAS in different vehicles. CCDC leaders were clear that ATLAS is meant to be a vehicle-agnostic platform. If the Army decides it wants ATLAS installed on a combat vehicle rather than a tank, as in the current prototype, the CCDC team could recreate that vehicle within the simulated environment, giving users the opportunity to see how ATLAS would look on that platform.

“We can switch that out. That's a 3D representation,” May said. “This could obviously be an existing tactical vehicle or a future tactical vehicle as part of the virtual prototype.”
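Under the hood, that kind of swap is plausibly just scene configuration: the gunner station and GUI stay constant while the surrounding hull model is selected per scenario. The sketch below assumes a generic rendering-engine interface and invented asset paths.

# Hypothetical scene setup: the station geometry and touch screen are fixed,
# the vehicle model around them is chosen per scenario.
VEHICLE_MODELS = {
    "current_tank": "assets/vehicles/tank_prototype.glb",
    "tactical_vehicle": "assets/vehicles/existing_tactical.glb",
    "future_vehicle": "assets/vehicles/future_concept.glb",
}

def build_scene(engine, vehicle_key: str):
    scene = engine.new_scene()
    scene.load(VEHICLE_MODELS[vehicle_key])           # swap the 3D hull
    scene.load("assets/atlas/gunner_station.glb")     # station stays constant
    scene.attach_touchscreen("assets/atlas/gui.json") # real screen passes through
    return scene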

But is the virtual reality component really necessary to the experience? After all, the interactions with the ATLAS surrogate take place entirely through the touch screen and the controller, and a soldier could get an idea of how the system works without ever putting on the headset.

May said that, according to feedback he's received, the virtual reality component adds that extra level of realism for the soldier.

“They thought it added to their experience,” May said. “We've run through a version of this without the mixed reality — so they're just using the touch screens and the grips — and they thought the mixed reality added that realism to really get them immersed into the experience.”

“We've had over [40 soldiers] leveraging the system that we have here to provide those early insights and then also to give us some quantitative data on how the soldier is performing,” he added. “So we're looking from a user evaluation perspective: Again, how does the [aided target recognition] system influence the soldier, both positively and, potentially, negatively? And then what is the qualitative user feedback just of the system itself?”

In other words, the team is assessing how soldiers react to the simulated battlefield they are being fed through the mixed reality system. Not only is the team observing how soldiers operate when the data is perfect; it also wants to see how soldiers are impacted when fed less accurate data.
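One plausible way to capture that comparison is to replay the same scenario under several data-quality settings and log each soldier's speed and accuracy. In the sketch below, the session object, its attributes, and the output format are all assumptions for illustration; the condition parameters match the scripted-detector sketch above.

import csv

# Hypothetical experiment conditions: one scenario, two data-quality settings.
CONDITIONS = {
    "perfect":  dict(detection_rate=1.0, false_alarms=0, position_noise_m=0.0),
    "degraded": dict(detection_rate=0.7, false_alarms=2, position_noise_m=5.0),
}

def log_session(session, condition, writer):
    """Record per-engagement speed and accuracy for one soldier and condition."""
    for engagement in session.run(**CONDITIONS[condition]):
        writer.writerow([session.soldier_id, condition,
                         engagement.time_to_fire_s,        # end-to-end speed
                         engagement.selected_real_target]) # or a false alarm?

with open("atlas_trials.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["soldier", "condition", "time_to_fire_s", "engaged_real_target"])
    # for each recorded session: log_session(session, "perfect", writer), etc.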

Soldiers are also interviewed after using the system to get a sense of their general impressions. May said users are asked questions such as “How do you see this impacting the way that you currently do your operations?” or “What changes would you make based off your use of it?”

The virtual prototyping environment is an outgrowth of CCDC's desire to push soldier interactions earlier in the development process, and it could eventually be used for other systems in development.

“We're hoping that this is kind of an initial proof of concept that other programs can kind of leverage to enhance their programs as well,” May said.

“This is a little bit of a pilot, but I think we can expect [teams] across the C5ISR Center and other activities to spend and work a lot more in this virtual environment,” added Nabors. “It's a great mechanism for getting soldier feedback [and] provides us an opportunity to insert new capabilities where possible.”

https://www.c4isrnet.com/artificial-intelligence/2020/11/09/mixed-reality-systems-can-bring-soldier-feedback-into-development-earlier-than-ever-before-heres-how-the-us-army-is-using-it/
