September 25, 2019 | International, C4ISR

Boeing Australia collaborates on AI research for unmanned systems

BRISBANE, Australia, Sept. 25, 2019 — Boeing [NYSE:BA] is partnering with Australia's Trusted Autonomous Systems Defence Cooperative Research Centre (DCRC) to develop advanced artificial intelligence (AI) technologies to create smarter unmanned systems for global forces. Embedding machine learning techniques on-board will help unmanned systems better understand and react to threat environments.

“Over the next 12 months, Boeing Australia will design and test cognitive AI algorithms to enable sensing under anti-access conditions and to navigate and conduct enhanced tactics in denied environments,” said Dr. Shane Arnott, director of Phantom Works International.

Boeing Australia's first innovation project with the DCRC will examine an unmanned system's route planning, location, and identification of objects and the platform's subsequent behavioural response.
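
To make that pipeline concrete, below is a minimal, hypothetical Python sketch of an on-board decision cycle of the kind described: object detections from a classifier feed a simple re-planner that drops waypoints falling inside an identified threat's stand-off radius. Every class name, threshold, and radius here is an illustrative assumption; this is not Boeing's or the DCRC's algorithm.

```python
# Minimal, illustrative autonomy loop: sense -> identify -> re-plan.
# All names, radii, and thresholds are hypothetical, not Boeing/DCRC designs.
import math
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "radar_site", as reported by an on-board classifier
    position: tuple   # (x, y) in a local planning frame
    confidence: float # classifier confidence, 0.0-1.0

def threat_radius(label: str) -> float:
    """Crude stand-off distance per object class (illustrative values only)."""
    return {"radar_site": 30.0, "sam_site": 50.0}.get(label, 0.0)

def replan(waypoints, detections, min_confidence=0.6):
    """Keep only waypoints lying outside every confident detection's radius."""
    def violated(wp):
        return any(
            d.confidence >= min_confidence
            and math.dist(wp, d.position) < threat_radius(d.label)
            for d in detections
        )
    return [wp for wp in waypoints if not violated(wp)]

if __name__ == "__main__":
    route = [(0.0, 0.0), (40.0, 10.0), (80.0, 20.0)]
    sensed = [Detection("radar_site", (42.0, 12.0), 0.9)]
    print(replan(route, sensed))  # -> [(0.0, 0.0), (80.0, 20.0)]
```

A fielded system would generate alternative waypoints rather than simply dropping them; the sketch only illustrates the sense-identify-respond loop the project will examine.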

The DCRC for Trusted Autonomous Systems was announced by the Australian Government in 2017 to support the rapid creation and transition of industry-led trustworthy smart-machine technologies through the innovation ecosystem to the Australian Defence Force.

“Together with Boeing, we are investing in advanced technology that can have real game-changing product outcomes for our military to match the evolving threats and achieve a sustainable autonomous industry for Australia,” said Professor Jason Scholz, chief executive officer of the DCRC for Trusted Autonomous Systems.

Boeing will work with Australian university partners and Brisbane-based supplier RF Designs to flight-test and evaluate the capability with autonomous high-performance jets.

* The Trusted Autonomous Systems DCRC receives funding support from the Australian Government's Next Generation Technologies Fund and the Queensland Government's Advance Queensland initiative.

# # #

Contact:

Melanie de Git
Boeing Australia
Mobile: +61 423 829 505
melanie.degit@boeing.com

Trusted Autonomous Systems DCRC
Phone: +61 7 3371 0524
info@tasdcrc.com.au

View source version on Boeing: https://boeing.mediaroom.com/2019-09-24-Boeing-Australia-collaborates-on-AI-research-for-unmanned-systems#assets_20295_130508-117

On the same subject

  • Dassault Aviation: 2020 half-year results

    July 24, 2020 | International, Aerospace

    On July 23, Éric Trappier, Chairman and CEO of Dassault Aviation, held a press conference to present the group's 2020 half-year results. Dassault Aviation posted revenue of €2.6 billion (versus €3 billion in the first half of 2019) and net income of €87 million (versus €286 million in 2019). On deliveries, the group handed over 16 Falcons in the first half of 2020, one fewer than over the same period in 2019. On the commercial front, 5 Falcon orders were signed, versus 7 a year earlier. As for the Rafale, 7 aircraft were delivered for export. Mr. Trappier underlined the group's determination to keep investing in the future: “We will maintain our self-funded R&D effort for our future range of Falcon business jets: the 6X, which is our absolute priority, and the NX, the new Falcon. Because when we come out of the crisis, we will be there with new aircraft. It will cost us in margin, but this is not the time to lower our guard,” he said. Le Figaro, July 24

  • British government clears sensitive Ultra Electronics sale to US-based Advent

    July 9, 2022 | International, C4ISR

    Media reports said U.S. officials had threatened to limit intelligence cooperation if the sale was blocked by London.

  • DARPA wants to arm ethical hackers with AI

    April 30, 2018 | International, C4ISR

    By: Brandon Knapp

    The Defense Advanced Research Projects Agency (DARPA) wants to leverage human-artificial intelligence teaming to accelerate the military's cyber vulnerability detection, according to agency documents.

    The task of securing the Pentagon's diverse networks, which support nearly every function of the military's operations, presents a nightmare for defense officials. The current time-intensive and costly process involves extensively trained hackers using specialized software suites to scour the networks in search of vulnerabilities that could potentially be exploited, but the scarcity of expert hackers makes detecting cyberthreats a challenge for the Defense Department.

    DARPA's Computers and Humans Exploring Software Security (CHESS) program seeks to bolster existing cyber defenders with a new tool that would render much of the current toolkit ancient history: artificial intelligence. The program aims to incorporate automation into the software analysis and vulnerability discovery process by enabling humans and computers to reason collaboratively. If successful, the program could enhance existing hacking techniques and greatly expand the number of personnel capable of ethically hacking DoD systems.

    To achieve its goal, DARPA will solicit proposals from industry across five technical areas, including developing tools that mimic the processes used by expert hackers and ultimately transitioning a final solution to the government. “Through CHESS, we're looking to gather, understand and convert the expertise of human hackers into automated analysis techniques that are more accessible to a broader range of technologists,” the DARPA program description reads. “By allowing more individuals to contribute to the process, we're creating a way to scale vulnerability detection well beyond its current limits.”

    While DARPA sees artificial intelligence as an important tool for enhancing cybersecurity efforts, officials emphasize the essential role humans play in the collaborative process. “Humans have world knowledge, as well as semantic and contextual understanding that is beyond the reach of automated program analysis alone,” said Dustin Fraze, the I2O program manager leading CHESS. “These information gaps inhibit machine understanding for many classes of software vulnerabilities. Properly communicated human insights can fill these information gaps and enable expert hacker-level vulnerability analysis at machine speeds.”

    The CHESS program will span three phases lasting a total of 42 months. Each phase will focus on increasing the complexity of an application the CHESS system is able to analyze effectively.

    https://www.c4isrnet.com/it-networks/2018/04/27/darpa-wants-to-arm-ethical-hackers-with-ai/
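
As a rough illustration of the simplest form of the “automated analysis techniques” the program description refers to, the sketch below flags calls to well-known unsafe C library functions in source text. It is a toy, pattern-matching scanner under assumptions of our own (the UNSAFE_CALLS table and scan helper are invented for illustration); CHESS itself targets far deeper, collaborative human-machine program analysis.

```python
# Toy vulnerability "scanner": flags calls to well-known unsafe C functions in
# source text. Purely illustrative; real program analysis goes far beyond this.
import re

UNSAFE_CALLS = {
    "gets":    "unbounded read into a fixed-size buffer",
    "strcpy":  "no length check on the destination buffer",
    "sprintf": "no limit on output size",
}

CALL_RE = re.compile(r"\b(" + "|".join(UNSAFE_CALLS) + r")\s*\(")

def scan(source: str, filename: str = "<input>"):
    """Yield (file, line number, function, reason) for each suspicious call."""
    for lineno, line in enumerate(source.splitlines(), start=1):
        for match in CALL_RE.finditer(line):
            fn = match.group(1)
            yield filename, lineno, fn, UNSAFE_CALLS[fn]

if __name__ == "__main__":
    sample = "char buf[8];\ngets(buf);\nstrcpy(buf, user_input);\n"
    for finding in scan(sample, "sample.c"):
        print(finding)
```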
