
March 9, 2021 | International, Aerospace

First operational flight of a Rafale F3-R armed with two Meteor missiles

L'armée de l'Air et de l'Espace a effectué le 4 mars son premier vol opérationnel avec un Rafale équipé de 2 missiles Meteor. L'intégration du missile Meteor entre dans le cadre de la montée en puissance du Rafale F3-R de Dassault Aviation. Le missile Meteor, développé par MBDA, est propulsé par un statoréacteur pilotable qui lui apporte vitesse, portée et manoeuvrabilité terminale. Son intégration « apporte une allonge considérable dans le combat air-air, avec une portée estimée à une centaine de kilomètres », précise Aerobuzz. Son emploi se conjugue à celui du radar RBE2 AESA à antenne active, capable de détecter et désigner des cibles à cette distance. La prochaine étape pour le Rafale en matière de missile concerne l'intégration du MICA de nouvelle génération avec le standard F4, dans les années 2023-2024.

Air & Cosmos and Aerobuzz, March 9

On the same subject

  • National Geospatial-Intelligence Agency creating space intel hub

    May 6, 2024 | International, Aerospace

    While the Joint Mission Management Center is still in the concept phase, NGA is working quickly to get it up and running.

  • Inside the Army’s futuristic test of its battlefield artificial intelligence in the desert

    September 28, 2020 | International, Land, C4ISR

    By: Nathan Strout

    YUMA PROVING GROUND, Ariz. — After weeks of work in the oppressive Arizona desert heat, the U.S. Army carried out a series of live fire engagements Sept. 23 at Yuma Proving Ground to show how artificial intelligence systems can work together to automatically detect threats, deliver targeting data and recommend weapons responses at blazing speeds.

    Set in the year 2035, the engagements were the culmination of Project Convergence 2020, the first in a series of annual demonstrations utilizing next generation AI, network and software capabilities to show how the Army wants to fight in the future. The Army was able to use a chain of artificial intelligence, software platforms and autonomous systems to take sensor data from all domains, transform it into targeting information, and select the best weapon system to respond to any given threat in just seconds.

    Army officials claimed that these AI and autonomous capabilities have shortened the sensor-to-shooter timeline — the time it takes from when sensor data is collected to when a weapon system is ordered to engage — from 20 minutes to 20 seconds, depending on the quality of the network and the number of hops between where the data is collected and its destination.

    “We use artificial intelligence and machine learning in several ways out here,” Brigadier General Ross Coffman, director of the Army Futures Command's Next Generation Combat Vehicle Cross-Functional Team, told visiting media. “We used artificial intelligence to autonomously conduct ground reconnaissance, employ sensors and then passed that information back. We used artificial intelligence and aided target recognition and machine learning to train algorithms on identification of various types of enemy forces. So, it was prevalent throughout the last six weeks.”

    The first exercise is illustrative of how the Army stacked AI capabilities to automate the sensor-to-shooter pipeline. In that example, the Army used space-based sensors operating in low Earth orbit to take images of the battleground. Those images were downlinked to a TITAN ground station surrogate located at Joint Base Lewis-McChord in Washington, where they were processed and fused by a new system called Prometheus.

    Currently under development, Prometheus is an AI system that takes the sensor data ingested by TITAN, fuses it, and identifies targets. The Army received its first Prometheus capability in 2019, although its targeting accuracy is still improving, according to one Army official at Project Convergence. In some engagements, operators were able to send in a drone to confirm potential threats identified by Prometheus.

    From there, the targeting data was delivered to a Tactical Assault Kit — a software program that gives operators an overhead view of the battlefield populated with both blue and red forces. As new threats are identified by Prometheus or other systems, that data is automatically entered into the program to show users their location. Specific images and live feeds can be pulled up in the environment as needed. All of that takes place in just seconds.

    Once the Army has its target, it needs to determine the best response. Enter the real star of the show: the FIRES Synchronization to Optimize Responses in Multi-Domain Operations, or FIRESTORM.

    “What is FIRESTORM? Simply put, it's a computer brain that recommends the best shooter, updates the common operating picture with the current enemy situation and friendly situation, and missions the effectors that we want to eradicate the enemy on the battlefield,” said Coffman.

    Army leaders were effusive in praising FIRESTORM throughout Project Convergence. The AI system works within the Tactical Assault Kit. Once new threats are entered into the program, FIRESTORM processes the terrain, available weapons, proximity, the number of other threats and more to determine the best firing system to respond to a given threat. Operators can assess and follow through with the system's recommendations with just a few clicks of the mouse, sending orders to soldiers or weapons systems within seconds of identifying a threat.

    Just as important, FIRESTORM provides critical target deconfliction, ensuring that multiple weapons systems aren't redundantly firing on the same threat. Right now, that sort of deconfliction would have to take place over a phone call between operators. FIRESTORM speeds up that process and eliminates potential misunderstandings. (A toy sketch of this pairing-and-deconfliction pattern appears at the end of this item.)

    In that first engagement, FIRESTORM recommended the use of an Extended Range Cannon Artillery system. Operators approved the algorithm's choice, and the cannon promptly fired a projectile at the target located 40 kilometers away. The process from identifying the target to sending those orders happened faster than it took the projectile to reach the target.

    Perhaps most surprising is how quickly FIRESTORM was integrated into Project Convergence. “This computer program has been worked on in New Jersey for a couple years. It's not a program of record. This is something that they brought to my attention in July of last year, but it needed a little bit of work. So we put effort, we put scientists and we put some money against it,” said Coffman. “The way we used it is as enemy targets were identified on the battlefield, FIRESTORM quickly paired those targets with the best shooter in position to put effects on it. This is happening faster than any human could execute. It is absolutely an amazing technology.”

    Dead Center

    Prometheus and FIRESTORM weren't the only AI capabilities on display at Project Convergence. In other scenarios, an MQ-1C Gray Eagle drone was able to identify and target a threat using the on-board Dead Center payload. With Dead Center, the Gray Eagle was able to process the sensor data it was collecting, identifying a threat on its own without having to send the raw data back to a command post for processing and target identification.

    The drone was also equipped with the Maven Smart System and Algorithmic Inference Platform, a product created by Project Maven, a major Department of Defense effort to use AI for processing full motion video. According to one Army officer, the capabilities of the Maven Smart System and Dead Center overlap, but placing both on the modified Gray Eagle at Project Convergence helped the Army see how they compared.

    With all of the AI engagements, the Army ensured there was a human in the loop to provide oversight of the algorithms' recommendations. When asked how the Army was implementing the Department of Defense's principles of ethical AI use adopted earlier this year, Coffman pointed to the human barrier between AI systems and lethal decisions.
    “So obviously the technology exists to remove the human, right, the technology exists, but the United States Army, an ethical-based organization, that's not going to remove a human from the loop to make decisions of life or death on the battlefield, right? We understand that,” explained Coffman. “The artificial intelligence identified geo-located enemy targets. A human then said, ‘Yes, we want to shoot at that target.'”

    https://www.c4isrnet.com/artificial-intelligence/2020/09/25/the-army-just-conducted-a-massive-test-of-its-battlefield-artificial-intelligence-in-the-desert/
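    The article describes FIRESTORM's logic only in broad strokes: score the available shooters against each identified threat, recommend the best one, deconflict so no shooter is double-tasked and no threat gets redundant fire, and leave final approval to a human. The sketch below is a minimal toy version of that pattern; the unit names, positions, ranges and distance-only scoring are invented for illustration and are not FIRESTORM's actual method.

        # Toy shooter-to-threat pairing with deconfliction (illustrative only;
        # the article does not describe FIRESTORM's internals).
        from dataclasses import dataclass
        from math import hypot

        @dataclass
        class Shooter:
            name: str
            x_km: float
            y_km: float
            max_range_km: float
            assigned: bool = False

        @dataclass
        class Threat:
            name: str
            x_km: float
            y_km: float

        def recommend(threats, shooters):
            """Pair each threat with the closest in-range, still-unassigned shooter."""
            recommendations = {}
            for t in threats:
                in_range = [
                    s for s in shooters
                    if not s.assigned
                    and hypot(s.x_km - t.x_km, s.y_km - t.y_km) <= s.max_range_km
                ]
                if not in_range:
                    continue  # nothing can reach this threat; left for a human to resolve
                best = min(in_range, key=lambda s: hypot(s.x_km - t.x_km, s.y_km - t.y_km))
                # Deconfliction: a shooter is never double-tasked, and the dict
                # guarantees each threat gets exactly one recommendation.
                best.assigned = True
                recommendations[t.name] = best.name
            return recommendations  # recommendations only; an operator approves each one

        # Loosely mirrors the article's first engagement: a cannon firing on a target 40 km away.
        shooters = [Shooter("ERCA cannon", 0, 0, 70), Shooter("Gray Eagle", 10, 5, 8)]
        threats = [Threat("enemy system", 40, 0)]
        print(recommend(threats, shooters))  # -> {'enemy system': 'ERCA cannon'}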

  • DARPA wants commanding robots to work like a video game

    February 13, 2020 | International, Land

    By: Kelsey D. Atherton

    In a fake city in Mississippi, DARPA is training robots for war. In December 2019, at a camp southeast of Hattiesburg, hundreds of robots gathered to scout an urban environment, and then convert that scouting data into useful information for humans.

    Conducted at Camp Shelby Joint Forces Training Center, the exercise was the third test of DARPA's OFFensive Swarm-Enabled Tactics (OFFSET) program. OFFSET is explicitly about robots assisting humans in fighting in urban areas, with many robots working together at the behest of a small group of infantry to provide greater situational awareness than a human team could achieve on its own. The real-time nature of the information is vital to the vision of OFFSET. It is one thing to operate from existing maps, and another entirely to operate from recently mapped space, with continuing situational awareness of possible threats and other movement through the space.

    Dating back to at least 2017, OFFSET is in part an iterative process, with contractors competing for and receiving awards for various “sprints,” or narrower short-turnaround developments in coding capabilities. Many of these capabilities involve translating innovations from real-time strategy video games into real life, like dragging and dropping groups of units to give them commands.

    For the exercise at Camp Shelby, the swarms involved both ground and flying robots. These machines were tasked with finding specific items of interest located in buildings at Camp Shelby's Combined Arms Collective Training Facility. To assist the robots in the field experiment, organizers seeded the environment with AprilTags. These tags, which are similar to QR codes but trade data capacity for simplicity and robustness in being read at a distance, were used to mark the sites of interest, as well as hazards to avoid. In practical use, hazards seldom if ever arrive with barcodes explicitly labeling themselves as hazards, but for training, the AprilTags provide useful scaffolding while the robots coordinate in other ways. (A short sketch of AprilTag detection appears at the end of this item.)

    “As the swarm relayed information acquired from the tags,” wrote DARPA, “human swarm tacticians adaptively employed various swarm tactics their teams had developed to isolate and secure the building(s) containing the identified items.”

    That information is relayed in various ways, from updated live maps on computer screens to floating maps displayed in real time in augmented reality headsets. As foreshadowed by countless works of cyberpunk fiction, these “human swarm tacticians” interfaced with both the real world and a virtual representation of that world at once. Commanding robots to move in real space by manipulating objects in a virtual environment, itself generated by robots exploring and scouting the real space, blurs the distinction between artificial and real environments. That these moves were guided by gesture and haptic feedback only further underscores how deeply linked commanding robots can be to augmented reality.

    The gesture and haptic feedback command systems were built under sprint contracts by Charles River Analytics, Inc., Case Western Reserve University, and Northwestern University, with an emphasis on novel interaction for human-swarm teaming. Another development, which would be as at home in the real-time strategy game series StarCraft as it is in a DARPA OFFSET exercise, is the operational management of swarm tactics from Carnegie Mellon University and Soar Technology. Their developments allowed the swarm to search and map a building on its own, and to automate resource allocation in the process of accomplishing tasks.

    For now, the swarm functions chiefly as a scouting organism, built to provide information to human operators.

    https://www.c4isrnet.com/unmanned/2020/02/11/darpa-wants-commanding-robots-to-work-like-a-video-game
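    AprilTags are an open, well-documented fiducial marker system, so the tag-reading step described above can be reproduced with off-the-shelf tools. Below is a minimal sketch using OpenCV's aruco module, which ships AprilTag dictionaries in recent 4.x releases (the exact detector API varies between OpenCV versions). The mapping of tag IDs to “site of interest” versus “hazard” is an invented assumption; the article does not say how OFFSET encoded that information.

        # Detect AprilTags (tag36h11 family) in one camera frame and look up what
        # each tag marks. Requires opencv-contrib-python >= 4.7 for ArucoDetector.
        import cv2

        # Hypothetical ID-to-meaning map; OFFSET's real scheme is not public.
        TAG_MEANINGS = {1: "site of interest", 2: "hazard: avoid"}

        frame = cv2.imread("scene.jpg")                  # a frame from a robot's camera
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # detection runs on grayscale

        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
        detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
        corners, ids, _rejected = detector.detectMarkers(gray)

        if ids is not None:
            for tag_corners, (tag_id,) in zip(corners, ids):
                # Tag center in pixel coordinates: mean of its four corner points
                cx, cy = tag_corners[0].mean(axis=0)
                meaning = TAG_MEANINGS.get(int(tag_id), "unknown tag")
                print(f"tag {int(tag_id)} at ({cx:.0f}, {cy:.0f}): {meaning}")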
