26 February 2024 | International, Land

Navy introduces new robotics warfare rating

The pool of sailors initially tapped for the Navy's new robotics rating will be "small and selective," according to the sea service.

https://www.c4isrnet.com/news/your-navy/2024/02/22/navy-introduces-new-robotics-warfare-rating/

On the same subject

  • Panel wants to double federal spending on AI

    2 April 2020 | International, C4ISR

    Panel wants to double federal spending on AI

    Aaron Mehta

    A congressionally mandated panel of technology experts has issued its first set of recommendations for the government, including doubling the amount of money spent on artificial intelligence outside the Defense Department and elevating a key Pentagon office to report directly to the Secretary of Defense.

    Created by the National Defense Authorization Act in 2018, the National Security Commission on Artificial Intelligence is tasked with reviewing “advances in artificial intelligence, related machine learning developments, and associated technologies,” for the express purpose of addressing “the national and economic security needs of the United States, including economic risk, and any other associated issues.” The commission issued an initial report in November, at the time pledging to slowly roll out its actual policy recommendations over the course of the next year. Today's report represents the first of those conclusions — 43 of them in fact, tied to legislative language that can easily be inserted by Congress during the fiscal year 2021 budget process.

    Bob Work, the former deputy secretary of defense who is the vice chairman of the commission, said the report is tied into a broader effort to move the DoD away from a focus on large platforms. “What you're seeing is a transformation to a digital enterprise, where everyone is intent on making the DoD more like a software company. Because in the future, algorithmic warfare, relying on AI and AI enabled autonomy, is the thing that will provide us with the greatest military competitive advantage,” he said during a Wednesday call with reporters.

    Among the key recommendations:

    • The government should “immediately double non-defense AI R&D funding” to $2 billion for FY21, a quick cash infusion that should work to strengthen academic centers and national labs working on AI issues. The funding should “increase agency topline levels, not repurpose funds from within existing agency budgets, and be used by agencies to fund new research and initiatives, not to support re-labeled existing efforts.” Work noted that he recommends this R&D funding double again in FY22. The commission leaves open the possibility of recommendations for increasing the DoD's AI investments as well, but said it wants to study the issue more before making such a request. In FY21, the department requested roughly $800 million in AI developmental funding and another $1.7 billion in AI enabled autonomy, which Work said is the right ratio going forward. “We're really focused on non-defense R&D in this first quarter, because that's where we felt we were falling further behind,” he said. “We expect DoD AI R&D spending also to increase” going forward.

    • The Director of the Joint Artificial Intelligence Center (JAIC) should report directly to the Secretary of Defense, and should continue to be led by a three-star officer or someone with “significant operational experience.” The first head of the JAIC, Lt. Gen. Jack Shanahan, is retiring this summer; currently the JAIC falls under the office of the Chief Information Officer, who in turn reports to the secretary. Work said the commission views the move as necessary to make sure leadership in the department is “driving” investment in AI, given all the competing budgetary requirements.

    • The DoD and the Office of the Director of National Intelligence (ODNI) should establish a steering committee on emerging technology, tri-chaired by the Deputy Secretary of Defense, the Vice Chairman of the Joint Chiefs of Staff, and the Principal Deputy Director of ODNI, in order to “drive action on emerging technologies that otherwise may not be prioritized” across the national security sphere.

    • Government microelectronics programs related to AI should be expanded in order to “develop novel and resilient sources for producing, integrating, assembling, and testing AI-enabling microelectronics.” In addition, the commission calls for articulating a national strategy for microelectronics and associated infrastructure. Funding for DARPA's microelectronics program should be increased to $500 million. The commission also recommends the establishment of a $20 million pilot microelectronics program to be run by the Intelligence Advanced Research Projects Activity (IARPA), focused on AI hardware.

    • A new office, tentatively called the National Security Point of Contact for AI, should be established, and allied governments should be encouraged to do the same in order to strengthen coordination at an international level. The first goal for that office would be to develop an assessment of allied AI research and applications, starting with the Five Eyes nations and then expanding to NATO.

    One issue identified early by the commission is the question of ethical AI. The commission recommends mandatory training on the limits of artificial intelligence for the AI workforce, which should include discussions around ethical issues. The group also calls for the Secretary of Homeland Security and the director of the Federal Bureau of Investigation to “share their ethical and responsible AI training programs with state, local, tribal, and territorial law enforcement officials,” and to track which jurisdictions take advantage of those programs over a five-year period.

    Missing from the report: any mention of the Pentagon's Directive 3000.09, a 2012 order laying out the rules for how AI can be used on the battlefield. Last year C4ISRNet revealed that there was an ongoing debate among AI leaders, including Work, on whether that directive was still relevant.

    While not reflected in the recommendations, Eric Schmidt, the former Google executive who chairs the commission, noted that his team is starting to look at how AI can help with the ongoing COVID-19 coronavirus outbreak, saying, “We're in an extraordinary time... we're all looking forward to working hard to help any way that we can.”

    The full report can be read here. https://www.c4isrnet.com/artificial-intelligence/2020/04/01/panel-wants-to-double-federal-spending-on-ai/

  • Airborne Triton drone key to Navy’s signal goals, Clapperton says

    13 February 2024 | International, Aerospace

    Airborne Triton drone key to Navy’s signal goals, Clapperton says

    The autonomous MQ-4C Triton intelligence, surveillance, reconnaissance and targeting drone can fly for more than 24 hours.

  • Can a dragonfly teach a missile how to hunt?

    6 August 2019 | International, C4ISR

    Can a dragonfly teach a missile how to hunt?

    By: Jen Judson

    WASHINGTON — A computational neuroscientist is studying whether a dragonfly's excellent hunting skills can be replicated in a missile's ability to maneuver and destroy targets midair with better precision.

    Dragonflies are vicious little creatures with a hit-to-kill track record of 95 percent, meaning only 5 percent of their prey escapes. Sandia National Laboratories' Frances Chance is building algorithms that simulate how a dragonfly processes information when intercepting prey, and she's testing them in a virtual environment. So far, the results are promising. The laboratories are federally funded and focus on national security missions through scientific and engineering research. The project is a yearlong, high-risk, high-gain effort that will wrap up in September, and it is funded by Sandia's Autonomy for Hypersonics Mission Campaign, Chance said.

    “I think what is really interesting about insects, in general, is they do something really fast and really well, but they are not particularly smart in the way you or I would think of ourselves as being smart,” Chance told Defense News in a recent interview. While insects may not be the right fit for studying cognitive capabilities to develop complex artificial intelligence, they are ideal for developing efficient computations for intercept capability.

    A dragonfly can react to a particular prey's maneuvers in 50 milliseconds, Chance explained. That is only enough time for information to cross three neurons in a dragonfly's brain, which indicates the dragonfly doesn't learn how to hunt; rather, the skill is inherent and part of its brain's hard-wiring. “The challenge then is: Is there anything that we can learn from how dragonflies do this that we can then bring to the next generation of missiles, or maybe even the next-next generation of missiles?” Chance said.

    By developing an artificial neural network that mimics a dragonfly's ability to hunt and then applying it to missile capabilities that rely on computation-heavy systems, one could reduce the size, weight and power needed for a missile's onboard computers; improve intercept techniques for targets such as hypersonic weapons; and home in on targets using simpler sensors. If the model of a dragonfly's neural circuit developed through Chance's research shows enough promise, she would then pass the information to scientists who would try to apply it directly to weapons systems.

    One of the greatest leaps involves adapting an algorithm to handle the speed at which a missile flies. While a dragonfly is fast, it's not nearly as fast as a missile. Animal brains process information significantly slower than a computer, so it's possible computations can be sped up to better align with the speed at which a missile approaches targets. “The hope is that even if the algorithm isn't wildly successful, you might be able to say something about what you can get away with in terms of what types of capabilities you give the next generation of weapons,” Chance said.

    The model she's building is several steps removed from implementation on a weapon. “I would consider the project complete when we have a viable model — ‘viable’ meaning it does interception — and a bonus if it's neurobiologically plausible. There is no reason to force that for this type of research, but only because it doesn't necessarily matter; so something biologically inspired that works I would consider a success.”

    https://www.c4isrnet.com/land/2019/08/05/can-a-dragonfly-teach-a-missile-how-to-hunt/
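    The shape of the computation Chance describes, a few fast processing stages turning a prey-angle measurement into a steering command, can be illustrated with a toy model. The Python sketch below is a hypothetical illustration only, not Sandia's model or code, and every name, number and weight in it is invented for the example: a pursuer steered by a shallow three-stage network chases a constant-velocity target in 2D, and a crude random search over weights stands in for whatever tuning the real research uses.

    # Hypothetical toy sketch (NumPy only), not Sandia's model or code: a pursuer
    # steered by a shallow, three-stage network (echoing the "three neurons" idea)
    # chases a constant-velocity target. Random search stands in for real tuning.
    import numpy as np

    rng = np.random.default_rng(0)

    def miss_distance(weights, steps=200, dt=0.05, speed=1.5):
        """Run one pursuit and return how close the pursuer gets to the target."""
        w1, w2, w3 = weights
        pursuer, heading = np.zeros(2), 0.0
        target, target_vel = np.array([5.0, 3.0]), np.array([-0.6, 0.4])
        prev_los, closest = None, np.inf
        for _ in range(steps):
            rel = target - pursuer
            los = np.arctan2(rel[1], rel[0])                  # line-of-sight angle
            los_rate = 0.0 if prev_los is None else (los - prev_los) / dt
            prev_los = los
            # Three successive stages: two nonlinear layers, one linear output.
            h1 = np.tanh(w1 @ np.array([los - heading, los_rate]))
            h2 = np.tanh(w2 @ h1)
            turn = float(w3 @ h2)                             # steering command (rad/s)
            heading += np.clip(turn, -2.0, 2.0) * dt
            pursuer = pursuer + speed * dt * np.array([np.cos(heading), np.sin(heading)])
            target = target + dt * target_vel
            closest = min(closest, float(np.linalg.norm(target - pursuer)))
        return closest

    # Sample 300 random three-stage circuits and keep the best interceptor.
    results = [miss_distance((rng.normal(size=(4, 2)),
                              rng.normal(size=(4, 4)),
                              rng.normal(size=4)))
               for _ in range(300)]
    print(f"best miss distance over 300 random circuits: {min(results):.3f}")

    The point of the sketch is the size of the computation rather than its performance: a handful of multiply-and-squash steps per measurement, which is what makes a dragonfly-style circuit attractive for reducing the size, weight and power of a seeker's onboard processing.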

All news