April 20, 2018 | International, C4ISR, Security

DARPA official: To build trust in AI, machines must explain themselves

Artificially intelligent systems must be able to explain themselves to operators if they are to be trusted, according to an expert from the Defense Advanced Research Projects Agency, who voiced concern that the methods current AI systems use are often hidden inside opaque algorithms.

“A lot of the machine learning algorithms we're using today, I would tell you ‘good luck,’” said Fred Kennedy, director of DARPA's Tactical Technology Office, during a panel at the Navy League's Sea-Air-Space exposition April 10. “We have no idea why they know the difference between a cat and a baboon.”

“If you start diving down into the neural net that's controlling it,” Kennedy continued, “you quickly discover that the features these algorithms are picking out have very little to do with how humans identify things.”
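One common way researchers actually “dive down” into a trained network is to compute a gradient-based saliency map, which highlights the input pixels a prediction is most sensitive to. The sketch below is a minimal, illustrative example in PyTorch using a toy, untrained convolutional network; the model, its layer sizes and the random input are placeholders chosen for this article, not anything DARPA or the panelists describe. On real trained models, maps like this often emphasize textures and high-frequency detail rather than the cues a person would name, which is the gap Kennedy is pointing at.

```python
# Minimal sketch: gradient-based saliency for a toy CNN (PyTorch).
# The network and input are placeholders; the point is only to show how one
# can inspect which input pixels a model's prediction is sensitive to.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, num_classes)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        x = self.pool(x).flatten(1)
        return self.fc(x)

model = TinyCNN().eval()

# A stand-in "image" (batch of 1, 3 channels, 32x32). requires_grad lets us
# ask how the winning class score changes as each pixel changes.
image = torch.rand(1, 3, 32, 32, requires_grad=True)

scores = model(image)
top_class = int(scores.argmax(dim=1))
scores[0, top_class].backward()

# Saliency: gradient magnitude per pixel, taking the max over color channels.
saliency = image.grad.abs().max(dim=1).values.squeeze(0)
row, col = divmod(int(saliency.argmax()), saliency.shape[1])
print(f"predicted class {top_class}; most influential pixel at ({row}, {col})")
```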

Kennedy's comments were in response to Deputy Assistant Secretary of the Navy for Unmanned Systems Frank Kelley, who described the leap of faith operators must make when dealing with artificially intelligent systems.

“You're throwing a master switch on and just praying to God that [Naval Research Laboratory] and Johns Hopkins knew what the hell they were doing,” Kelley said of the process.

The key to building trust, according to Kennedy, lies with the machines.

“The system has to tell us what it's thinking,” Kennedy said. “That's where the trust gets built. That's how we start to use and understand them.”

DARPA's Explainable Artificial Intelligence program seeks to teach AI how to do just that. The program envisions systems that will have the ability to explain the rationale behind their decisions, characterize their strengths and weaknesses, and describe how they will behave in the future. Such capabilities are designed to improve teamwork between man and machine by encouraging warfighters to trust artificially intelligent systems.
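For a rough sense of what “explaining the rationale behind a decision” can look like in practice, the sketch below uses a generic, off-the-shelf technique, scikit-learn's permutation importance, on synthetic data; it is an assumption-laden stand-in for illustration, not DARPA's XAI method. The model reports which input features its decisions actually depended on, the kind of statement an operator could sanity-check against their own judgment.

```python
# Minimal sketch: a classifier "explaining" which inputs drove its decisions
# via permutation importance (scikit-learn). The synthetic data and feature
# count are assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic "sensor" data: 6 features, only 3 of which actually carry signal.
X, y = make_classification(n_samples=2000, n_features=6, n_informative=3,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Shuffle each feature in turn and measure how much accuracy drops;
# a large drop means the model leaned heavily on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)

for i, (mean, std) in enumerate(zip(result.importances_mean,
                                    result.importances_std)):
    print(f"feature_{i}: importance {mean:.3f} +/- {std:.3f}")
```

A ranking like this only answers what a decision depended on; the characterization of strengths, weaknesses and future behavior that the DARPA program envisions goes well beyond it.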

“It's always going to be about human-unmanned teaming,” said Kennedy. “There is no doubt about that.”

https://www.defensenews.com/home/2018/04/10/darpa-official-to-build-trust-in-ai-machines-must-explain-themselves/

On the same subject

  • See the show floor and live drills at Eurosatory

    June 17, 2022 | International, Aerospace, Naval, Land, C4ISR, Security

    Whether you were able to attend Eurosatory this year or not, you won't want to miss these highlights.

  • The Air Force sends good guys in to hack its cloud

    August 8, 2019 | International, Security

    By: Andrew Eversden

    The Air Force invited ethical hackers into its IT networks again this spring, allowing good guys the chance to infiltrate its enterprise-wide Air Force Common Computing Environment in search of vulnerabilities, the white hat hacking company Bugcrowd announced Aug. 6.

    The bug bounty program, done in partnership with Bugcrowd and the Air Force's CCE program office, found 54 vulnerabilities. Bug bounties work under the assumption that the customer, in this case the Air Force, will now close the loopholes the hackers found, making the system more secure.

    The CCE cloud uses Amazon Web Services and Microsoft's Azure commercial cloud. The service plans to migrate more than 100 applications to that cloud environment, Bugcrowd executives said. The largest payout from the bug bounty totaled $20,000. The event ran from March 18 to June 21 at Hanscom Air Force Base in Massachusetts.

    Casey Ellis, Bugcrowd founder and CTO, said it was the first time Bugcrowd had worked with the Air Force. The Air Force has completed several other white hat hacking events with the firm HackerOne.

    Ellis said that moving to the cloud from an on-premises environment represents a “paradigm shift” for many organizations. Penetration testing is an important part of keeping that environment secure, he said. Bugcrowd conducted its testing in six phases: source code analysis, AWS environment testing, Azure environment testing, black box network authentication assessment, social engineering engagement and Air Force portal testing. Bugcrowd declined to discuss how many vulnerabilities were found at each stage of the process.

    According to an April news release from the Air Force, the CCE currently houses 21 Air Force applications and “has room for countless more.” The computing environment gives the Air Force a cloud to host applications that reside on its Global Combat Support System, a centralized, cohesive enterprise resource planning system. The Air Force said in the April release that each migration costs $446,000 and that the service has spent more than $136 million on the program since 2016.

    https://www.fifthdomain.com/dod/air-force/2019/08/06/the-air-force-sends-good-guys-in-to-hack-its-cloud/

  • Jordan asks US to deploy Patriot air defense systems

    October 30, 2023 | International, Aerospace

    Jordan's military has also responded to allegations that U.S. aircraft were using local air bases to supply Israel with equipment and ammunition.

All news