March 4, 2021 | International, C4ISR, Security

Defense Intelligence Agency awards IT services contract worth up to $12.6 billion

The military intelligence agency named 144 companies that can receive pieces of the contract.

https://www.c4isrnet.com/it-networks/2021/03/03/defense-intelligence-agency-awards-it-services-contract-worth-up-to-126-billion/

On the same subject

  • Can the Army perfect an AI strategy for a fast and deadly future?

    October 15, 2019 | International, C4ISR

    Can the Army perfect an AI strategy for a fast and deadly future?

    By: Kelsey D. Atherton

    Military planners spent the first two days of the Association of the United States Army's annual meeting outlining the future of artificial intelligence for the service, and tracing back from this imagined future to the needs of the present. This is a world where AI is so seamless and ubiquitous that it factors into everything from rifle sights to logistical management. It is a future where every soldier is a node covered in sensors, and every access point to that network is under constant threat from enemies moving invisibly through the very parts of the electromagnetic spectrum that make networks possible. It is a future where weapons can, on their own, interpret the world, position themselves within it, plot a course of action and then, in the most extreme situations, follow through. It is a world of rich battlefield data, hyperfast machines and vulnerable humans. And it is discussed as an inevitability.

    “We need AI for the speed at which we believe we will fight future wars,” said Brig. Gen. Matthew Easley, director of the Army AI Task Force. Easley is one of a handful of people with an outsized role in shaping how militaries adopt AI.

    The past of data future

    Before the Army can build the AI it needs, the service must collect the data that will fuel and train its machines. In the shortest terms, that means the task force's first areas of focus will include preventive maintenance and talent management, where the Army is already gathering a wealth of data. Processing what is already collected has the potential for an outsized impact on the logistics and business side of administering the Army. For AI to matter in combat, the Army will need to build a database of the sensor-readable events that happen in battle, and then refine that data to ultimately provide useful information to soldiers. Getting there means turning every member of the infantry into a sensor.
    “Soldier lethality is fielding the Integrated Visual Augmentation Systems, or our IVAS soldier goggles that each of our infantry soldiers will be wearing,” Easley said. “In the short term, we are looking at fielding nearly 200,000 of these systems.”

    The IVAS is built on top of Microsoft's HoloLens augmented reality tool. That the equipment has been explicitly tied not just to military use, but to military use in combat, led to protests from Microsoft workers who objected to the product of their labor being used with “intent to harm.”

    With IVAS in place, Easley imagines a scenario where IVAS sensors plot fields of fire for every soldier in a squad, up through a platoon and beyond. “By the time it gets to [a] battalion commander,” Easley said, “they're able to say where their dead zones are in front of [the] defensive line. They'll know what their soldiers can touch right now, and they'll know what they can't touch right now.” Easley compared the overall effect to the data collection done by commercial companies through the sensors on smartphones — devices that build detailed pictures of the individuals carrying them. Fitting sensors to infantry, vehicles or drones can help build the data the Army needs to power AI.

    Another path involves creating synthetic data. While the Army has largely fought the same type of enemy for the past 18 years, preparing for the future means designing systems that can handle the full range of vehicles and weapons of a professional military. With insurgents unlikely to field tanks or attack helicopters at scale anytime soon, the Army may need to generate synthetic data to train an AI to fight a near-peer adversary.

    Faster, stronger, better, more autonomous

    “I want to proof the threat,” said Bruce Jette, the Army's assistant secretary for acquisition, logistics and technology, speaking at a C4ISRNET event on artificial intelligence at AUSA.
    Jette then set out the kind of capability he wants AI to provide, starting from the perspective of a tank turret. “Flip the switch on, it hunts for targets, it finds targets, it classifies targets. That's a Volkswagen, that's a BTR [Russian-origin armored personnel carrier], that's a BMP [Russian-origin infantry fighting vehicle]. It determines whether a target is a threat or not. The Volkswagen's not a threat, the BTR is probably a threat, the BMP is a threat, and it prioritizes them. BMP is probably more dangerous than the BTR. And then it classifies which one's [an] imminent threat, one's pointing towards you, one's driving away, those type of things, and then it does a firing solution to the target, which one's going to fire first, then it has all the firing solutions and shoots it.”

    Enter Jette's ideal end state for AI: an armed machine that senses the world around it, interprets that data, plots a course of action and then fires a weapon. It is the observe–orient–decide–act cycle without a human in the loop, and Jette was explicit on that point.

    “Did you hear me anywhere in there say ‘man in the loop'?” Jette said. “Of course, I have people throwing their hands up about ‘Terminator.' I did this for a reason. If you break it into little pieces and then try to assemble it, there'll be 1,000 interface problems. I tell you to do it once through, and then I put the interface in for any safety concerns we want. It's much more fluid.”

    In Jette's end state, the AI of the vehicle is designed to be fully lethal and autonomous, and the safety features are added in later — a precautionary stop, a deliberate calming intrusion into an already complete system. Jette was light on the details of how to get from the present to the thinking tanks of tomorrow's wars. But it is a process that will, by necessity, involve buy-in and collaboration with industry to deliver the tools, whether they come as a gestalt whole or in a thousand little pieces.
    Learning machines, fighting machines

    Autonomous kill decisions, with or without humans in the loop, remain a matter of still-debated international legal and ethical concern. That likely means Jette's thought-experiment tank belongs to a more distant future than a host of other weapons. The existence of small, cheap battlefield robots, however, means we are likely to see AI used against drones in the more immediate future. Before robots fight people, robots will fight robots. Before that, AI will mostly manage spreadsheets and maintenance requests.

    “There are systems now that can take down a UAS pretty quickly with little collateral damage,” Easley said. “I can imagine those systems becoming much more autonomous in the short term than many of our other systems.”

    Autonomous systems designed to counter other fast, autonomous systems without people on board are already in place. The aptly named Counter Rocket, Artillery, and Mortar, or C-RAM, systems use autonomous sensing and reaction to destroy projectiles aimed at humans. Likewise, autonomy already exists on the battlefield in systems like loitering munitions designed to search for and then destroy anti-air radar defense systems.

    Iterating AI will mean finding a new space of acceptable risk for machines sent into combat. “From a testing and evaluation perspective, we want a risk knob. I want the commander to be able to go maximum risk, minimum risk,” said Brian Sadler, a senior research scientist at the Army Research Laboratory. “When he's willing to take that risk, that's OK. He knows his current rules of engagement, he knows where he's operating, he knows if he uses some platforms, he's willing to make that sacrifice.”

    In his work at the Vehicle Technology Directorate of the Army Combat Capabilities Development Command, Sadler is tasked with catching the science of AI up to the engineered reality of it. It is not enough to get AI to work; it has to be understood.
    “If people don't trust AI, people won't use it,” Tim Barton, chief technology officer at Leidos, said at the C4ISRNET event. Building that trust is an effort that industry and the Army have to tackle from multiple angles. Part of it involves iterating the design of AI tools with the people in the field who will use them, so that the information analyzed and the product produced have immediate value.

    “AI should be introduced to soldiers as an augmentation system,” said Lt. Col. Chris Lowrance, a project manager in the Army's AI Task Force. “The system needs to enhance capability and reduce cognitive load.”

    Away from but adjacent to the battlefield, Sadler pointed to tools that can provide immediate value even as they're iterated upon. “If it's not a safety-of-life mission, I can interact with that analyst continuously over time in some kind of spiral development cycle for that product, which I can slowly whittle down to something better and better, and even at the get-go we're helping the analyst quite a bit,” Sadler said. “I think Project Maven is the poster child for this,” he added, referring to the Google-started tool that identifies objects in drone footage.

    Project Maven is the rare intelligence tool that found its way into the public consciousness. It was built on top of open-source tools, and workers at Google circulated a petition objecting to the role of their labor in creating something that could “lead to potentially lethal outcomes.” The worker protest led the Silicon Valley giant to outline new principles for its own use of AI.

    Ultimately, the experience of engineering AI is vastly different from that of the end user, for whom AI fades seamlessly into the background, becoming just an ambient part of modern life. If the future plays out as described, AI will move from a hyped feature, to a normal component of software, to an invisible processor that runs all the time.
    “Once we succeed in AI,” said Danielle Tarraf, a senior information scientist at the think tank Rand, “it will become invisible like control systems, noticed only in failure.”

    https://www.c4isrnet.com/artificial-intelligence/2019/10/15/can-the-army-perfect-an-ai-strategy-for-a-fast-and-deadly-future
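Jette's described hunt, classify, prioritize and fire chain can be read as a simple decision pipeline. A minimal Python sketch, purely illustrative: the threat rankings and tie-breaking rules below are assumptions drawn from his quote, not any real targeting system.

```python
# Purely illustrative sketch of the hunt/classify/prioritize chain described
# in Jette's quote. The threat rankings and rules are assumptions, not any
# real Army targeting system.

# Higher number = more dangerous vehicle class (0 = not a threat).
THREAT_LEVEL = {"Volkswagen": 0, "BTR": 1, "BMP": 2}

def prioritize(detections):
    """detections: list of (vehicle_type, pointing_toward_us) tuples.
    Returns threats ordered most to least urgent; non-threats are dropped."""
    threats = [d for d in detections if THREAT_LEVEL[d[0]] > 0]
    # An imminent threat (pointing toward us) outranks one driving away;
    # ties break on vehicle class.
    return sorted(threats, key=lambda d: (d[1], THREAT_LEVEL[d[0]]), reverse=True)

detections = [("Volkswagen", False), ("BTR", True), ("BMP", True)]
print(prioritize(detections))
# → [('BMP', True), ('BTR', True)]: engage the BMP first, ignore the Volkswagen
```

The safety interface Jette mentions would then sit between this ordering step and any firing solution, which is exactly the "assemble first, add the interface after" sequencing he argues for.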

  • RTX's Pratt and Whitney expands operations with opening of new India Digital Capability Center

    February 15, 2024 | International, Land

    RTX's Pratt and Whitney expands operations with opening of new India Digital Capability Center

    Bengaluru, India, February 13, 2024 /PRNewswire/ -- Pratt & Whitney, an RTX (NYSE: RTX) business, announced the establishment of its new India Digital Capability Center (IDCC) in Bengaluru, India. The...

  • DIUx wants drones that are out for blood

    May 4, 2018 | International, Aerospace

    DIUx wants drones that are out for blood

    By: Kelsey Atherton

    For drone delivery to make sense with the existing capabilities of drones, the cargo needs to be relatively light, it needs to have tremendous value, and it needs to urgently travel the last mile by air. This is why, to the extent we've seen drones used for delivery in the wild, it's more likely as a means to carry contraband into a prison than as a practical alternative to the postal service. But there's one other cargo that fits the description, and that's blood itself.

    Defense Innovation Unit Experimental, the Pentagon's stand-up Silicon Valley-focused acquisition house, is looking for a drone that can carry a modest cargo of blood through the dark of night toward where it's most needed. Call it “Dronesferatu.”

    From FCW: The specs of the solicitation from the Defense Innovation Unit Experimental -- the ability to deliver a 5-pound package over 100 kilometers in “austere environments” -- strongly suggest that they're looking at an unmanned aerial vehicle system that supports refrigeration or other means of temperature control.

    “These deliveries, ideally automated, will provide essential items to critically wounded military personnel as quickly as possible after an injury occurs,” the April 23 solicitation states. “Ability to sustain a very high frequency of operations over an extended period of time is critical. Speed of delivery, reliability and robustness to failure and interference, response time, and overall delivery throughput are critical.”

    Getting the right blood to the right people as fast as possible means saving lives. To that end, DARPA has funded research into metabolic rate reduction, to see if there's a way to make people bleed out more slowly, and into using female hormones to similarly prolong the survivable time without transfusion. In 2013, the U.S. Army conducted a study on pre-hospital transfusion for battlefield casualties being medically evacuated in Afghanistan, and in 2012 Canadian Blood Services even tested the viability of paratroopers transporting blood for transfusion. Consider blood drones complementary to this field of work.

    Early tests by researchers at Johns Hopkins and Uganda's Makerere University proved that small vials of blood transported by drone were just as viable as blood transported by car. Those same researchers followed up with a test of blood delivery from ship to shore, for possible use in responding to coastal areas hit by natural disasters, where the roads are impassable but drones could still safely fly. The American startup Zipline demonstrated its own blood delivery drones in 2016, and has for a year and a half worked on delivering blood by robot to parts of Rwanda.

    DIUx's ask, that a drone fly over 60 miles and carry 5 pounds of blood, is not far off from what Zipline's drones can already do, with the company stating a range of 100 miles and a cargo capacity of just under 4 pounds. Weight and range tradeoffs are at the heart of aviation design, so it's likely that vendors have already pitched something within the bounds of the solicitation. Should that drone make a fast turnaround from ask to prototype to useful tool, the troops fighting abroad may gain a better shot at surviving otherwise-fatal blood loss. It's unlikely that the reverse-vampire drones will look like bats, though.

    https://www.c4isrnet.com/unmanned/2018/05/03/diux-wants-drones-that-are-drones-out-for-blood/
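As a back-of-the-envelope check on the numbers reported above: the "just under four pounds" Zipline payload is approximated here as 3.9 lb, an assumption for illustration only.

```python
# Back-of-the-envelope comparison of the DIUx solicitation against Zipline's
# stated specs, using the figures reported in the article above.
KM_PER_MILE = 1.609

diux_range_km = 100        # "a 5-pound package over 100 kilometers"
diux_payload_lb = 5.0
zipline_range_km = 100 * KM_PER_MILE  # stated range of 100 miles (~161 km)
zipline_payload_lb = 3.9              # "just under four pounds" (approximation)

print(f"Range margin: {zipline_range_km - diux_range_km:.0f} km to spare")
print(f"Payload gap: {diux_payload_lb - zipline_payload_lb:.1f} lb short of the ask")
```

The arithmetic bears out the article's point: Zipline already clears the range requirement comfortably, and only the payload falls about a pound short, which is the kind of gap a weight-for-range trade could plausibly close.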

All news