
January 14, 2022 | International, Aerospace

Combat aircraft: first tests of an engine prototype for the SCAF

Discussions between Dassault and Airbus on the SCAF continue. In the meantime, the DGA and Safran have announced the first tests of an engine for the future European combat aircraft.

https://www.bfmtv.com/economie/entreprises/industries/avion-de-combat-premiers-tests-d-un-prototype-de-moteur-pour-le-scaf_AN-202201120218.html

On the same subject

  • U.S. Navy orders 48 retrofit redesign kits in support of Super Hornet aircraft

    November 25, 2019 | International, Aerospace, Naval

    U.S. Navy orders 48 retrofit redesign kits in support of Super Hornet aircraft

    The U.S. Department of Defense announced on Thursday that Boeing Co. has been awarded a new contract to support the F/A-18 Super Hornet aircraft. The U.S. aerospace giant won a contract valued at as much as $43 million to build, test, and deliver 48 Trailing Edge Flap retrofit redesign kits in support of the F/A-18E/F aircraft. Work will be performed in St. Louis, Missouri (72%); Lucerne, Switzerland (20%); Paramount, California (5%); and Hot Springs, Arkansas (3%), and is expected to be completed in June 2022. Production of the flaps involves new manufacturing methods, including advanced composites and high-speed machining, that were not used in the manufacture of flaps for the earlier Hornets. The Super Hornet is the most advanced addition to the combat-proven family of F/A-18 Hornets. Both the single-seat E and two-seat F models offer longer range, greater endurance, more payload-carrying ability, more powerful engines, increased carrier bringback capability, enhanced survivability, and the growth potential to incorporate future systems and technologies to meet emerging threats. Although it is 25 percent larger than the Hornet, the Super Hornet has 42 percent fewer parts. The company's website says the Super Hornet is the backbone of the U.S. Navy carrier air wing now and for decades to come. The combat-proven Super Hornet delivers cutting-edge, next-generation multi-role strike fighter capability, outdistancing current and emerging threats well into the future. The Super Hornet has the capability, flexibility, and performance necessary to modernize the air or naval aviation forces of any country. Two versions of the Super Hornet – E model and F model – are able to perform virtually every mission in the tactical spectrum, including air superiority, day/night strike with precision-guided weapons, fighter escort, close air support, suppression of enemy air defenses, maritime strike, reconnaissance, forward air control, and tanker missions.
    https://defence-blog.com/news/u-s-navy-orders-48-retrofit-redesign-kits-in-support-of-super-hornet-aircraft.html

  • Intelligence Agencies Release AI Ethics Principles

    July 24, 2020 | International, C4ISR, Security

    Intelligence Agencies Release AI Ethics Principles

    Getting it right doesn't just mean staying within the bounds of the law. It means making sure that the AI delivers reports that are accurate and useful to policymakers. By KELSEY ATHERTON ALBUQUERQUE — Today, the Office of the Director of National Intelligence released the first take on an evolving set of principles for the ethical use of artificial intelligence. The six principles, ranging from privacy to transparency to cybersecurity, are described as Version 1.0, approved by DNI John Ratcliffe last month. The six principles are pitched as a guide for the nation's many intelligence agencies, especially to help them work with the private companies that will build AI for the government. As such, they provide an explicit complement to the Pentagon's AI principles put forth by Defense Secretary Mark Esper back in February. “These AI ethics principles don't diminish our ability to achieve our national security mission,” said Ben Huebner, who heads the Office of Civil Liberties, Privacy, and Transparency at ODNI. “To the contrary, they help us ensure that our AI or use of AI provides the unbiased, objective and actionable intelligence policymakers require; that is fundamentally our mission.” The Pentagon's AI ethics principles came at the tail end of a long process set in motion by workers at Google. These workers called upon the tech giant to withdraw from a contract to build image-processing AI for Project Maven, which sought to identify objects in video recorded by the military. While ODNI's principles come with an accompanying six-page ethics framework, there is no extensive 80-page supporting annex, like that put forth by the Department of Defense. “We need to spend our time under the framework and the guidelines that we're putting out to make sure that we're staying within the guidelines,” said Dean Souleles, Chief Technology Advisor at ODNI. “This is a fast-moving train with this technology.
    Within our working groups, we are actively working on many, many different standards and procedures for practitioners to use and begin to adopt these technologies.” Governing AI as it is developed is a lot like laying out the tracks ahead while the train is in motion. It's a tricky proposition for all involved — but the technology is evolving too fast and too unpredictably to try to carve commandments in stone for all time. Here are the six principles, in the document's own words:

      • Respect the Law and Act with Integrity. We will employ AI in a manner that respects human dignity, rights, and freedoms. Our use of AI will fully comply with applicable legal authorities and with policies and procedures that protect privacy, civil rights, and civil liberties.

      • Transparent and Accountable. We will provide appropriate transparency to the public and our customers regarding our AI methods, applications, and uses within the bounds of security, technology, and releasability by law and policy, and consistent with the Principles of Intelligence Transparency for the IC. We will develop and employ mechanisms to identify responsibilities and provide accountability for the use of AI and its outcomes.

      • Objective and Equitable. Consistent with our commitment to providing objective intelligence, we will take affirmative steps to identify and mitigate bias.

      • Human-Centered Development and Use. We will develop and use AI to augment our national security and enhance our trusted partnerships by tempering technological guidance with the application of human judgment, especially when an action has the potential to deprive individuals of constitutional rights or interfere with their free exercise of civil liberties.

      • Secure and Resilient. We will develop and employ best practices for maximizing reliability, security, and accuracy of AI design, development, and use. We will employ security best practices to build resilience and minimize potential for adversarial influence.

      • Informed by Science and Technology. We will apply rigor in our development and use of AI by actively engaging both across the IC and with the broader scientific and technology communities to utilize advances in research and best practices from the public and private sector.

    The accompanying framework offers further questions for people to ask when programming, evaluating, sourcing, using, and interpreting information informed by AI. While bulk processing of data by algorithm is not a new phenomenon for the intelligence agencies, having a learning algorithm try to parse that data and summarize it for a human is a relatively recent feature. Getting it right doesn't just mean staying within the bounds of the law; it means making sure that the data produced by the inquiry is accurate and useful when handed off to the people who use intelligence products to make policy. “We are absolutely welcoming public comment and feedback on this,” said Huebner, noting that there will be a way for public feedback at Intel.gov. “No question at all that there's going to be aspects of what we do that are and remain classified. I think, though, what we can do is talk in general terms about some of the things that we are doing.” Internal legal review, as well as classified assessments from the Inspectors General, will likely be what makes the classified data-processing AI accountable to policymakers. For the general public, as it offers comment on intelligence service use of AI, examples will have to come from outside classification and will likely center on examples of AI in the private sector. “We think there's a big overlap between what the intelligence community needs and frankly, what the private sector needs that we can and should be working on, collectively together,” said Souleles. He specifically pointed to the task of threat identification, using AI to spot malicious actors that seek to cause harm to networks, be they e-commerce giants or three-letter agencies.
Depending on one's feelings towards the collection and processing of information by private companies vis-à-vis the government, it is either reassuring or ominous that when it comes to performing public accountability for spy AI, the intelligence community will have business examples to turn to. “There's many areas that I think we're going to be able to talk about going forward, where there's overlap that does not expose our classified sources and methods,” said Souleles, “because many, many, many of these things are really really common problems.” https://breakingdefense.com/2020/07/intelligence-agencies-release-ai-ethics-principles/

  • Autonomous Firefighting Drone

    March 12, 2019 | International, Aerospace, Security

    Autonomous Firefighting Drone

    Working with mentors from Sikorsky, three University of Connecticut engineering seniors are translating their classroom education to the field. Electrical engineering majors Kerry Jones and Joshua Steil, and computer engineering major Ryan Heilemann, are collaborating to build and program an autonomous firefighting drone to battle blazes without a pilot's guidance. “In the world today there's a high prevalence of forest fires, like in California, but the problem is how to safely put out these fires,” says Steil. “So our project, in essence, is to see if we can start putting out fires without a human driver.” Once finished, the drone will carry a thermal imaging camera to identify a fire, object-avoidance technology to steer clear of any obstacles, and a softball-sized fire-extinguishing ball that will be dropped over the flames. The system's components will be tied together through code written by the students, and it will operate based on inputted coordinates. While their drone will only be able to put out a campfire-size blaze, the project is meant to prove that this technology is possible, so that much bigger systems can be engineered in the future, says Heilemann. “The idea is that in the future, on a larger scale, there can be a fleet of unmanned helicopters that can go out and put out forest fires, thereby lowering loss of life,” says Steil. While drones are currently used by fire departments across the country, all of them so far have a pilot who navigates the drone from a distance, and most are used for observation, not fire suppression. “The autonomy definitely makes it different,” says Jones, “and the fire-extinguishing ball, for sure.” Teams in previous years have worked on similar projects with Sikorsky, which provided some guidance on what has worked and what has not. The team looked back on previous projects' reports, including last year's team, which was the first to integrate firefighting capabilities into the drone.
While the previous team to work on this project used small thermal sensors called thermopile array sensors, Heilemann says these sensors required the previous drone to be only about six feet from the flames, which was too close for real-world applications. His team decided to use an infrared camera, which allows for more distance from the flames. This year's team had the added benefit of working on their project in UConn's brand new 118,000 square-foot Engineering and Science Building, which features three engineering floors filled with faculty and labs focused on robotics, machine autonomy, and virtual and augmented reality. At Sikorsky, the team is working with a recent UConn School of Engineering alum, Jason Thibodeau, deputy manager of Sikorsky's Flight Controls and Autonomous Systems Department. “He's really helpful. We have phone meetings every Monday, and we tell him what's going on, what we're struggling with, and he reasons with us,” says Jones. Adds Heilemann, “He really wants us to figure our way through issues we have, instead of just giving us a direct solution.” Working with Sikorsky also introduced the UConn seniors to new career options. Jones has accepted an offer with Sikorsky after she graduates, in their autonomy lab as part of their Rotary and Mission Systems department. Steil has accepted a job offer with Sikorsky's parent company, Lockheed Martin, in Massachusetts after graduation. “Working with Sikorsky definitely sparked a greater interest looking into the company as a whole,” he says. Heilemann also decided to go into the aerospace industry, and has found a job doing control and diagnostics at another aerospace company. Most importantly, the collaboration was a chance to get some experience with a top company. 
“In this project, I get to learn so much about Sikorsky and what they do,” says Steil, “and having a company like that so close to home and have them be our sponsor is definitely an added benefit.” https://dronescrunch.com/autonomous-firefighting-drone/
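    The article describes the drone's autonomy as three stages chained together: detect the fire with a thermal camera, navigate toward it while avoiding obstacles, and drop the extinguishing ball over the flames. A minimal sketch of that control loop is below; every function name and the grid-based navigation are invented here for illustration, as the team's actual implementation is not public.

    ```python
    # Illustrative sketch (not the UConn team's code) of a detect -> navigate ->
    # drop control loop, simulated on a 2D grid of thermal readings.
    from dataclasses import dataclass

    @dataclass
    class Drone:
        x: int
        y: int

    def thermal_detect(frame, threshold=200):
        """Return (row, col) of the hottest pixel if it exceeds threshold, else None."""
        value, pos = max(((v, (r, c)) for r, row in enumerate(frame)
                          for c, v in enumerate(row)), key=lambda t: t[0])
        return pos if value >= threshold else None

    def step_toward(drone, target, obstacles):
        """Greedy one-cell move toward target, sidestepping an occupied cell."""
        dx = (target[0] > drone.x) - (target[0] < drone.x)
        dy = (target[1] > drone.y) - (target[1] < drone.y)
        nxt = (drone.x + dx, drone.y + dy)
        if nxt in obstacles:  # naive avoidance: try each axis alone
            nxt = (drone.x + dx, drone.y)
            if nxt in obstacles:
                nxt = (drone.x, drone.y + dy)
        drone.x, drone.y = nxt

    def mission(frame, start, obstacles, max_steps=50):
        """Fly from start to the detected hotspot; return the drop point, or None."""
        drone = Drone(*start)
        fire = thermal_detect(frame)
        if fire is None:
            return None
        for _ in range(max_steps):
            if (drone.x, drone.y) == fire:
                return fire  # release the fire-extinguishing ball here
            step_toward(drone, fire, obstacles)
        return None
    ```

    For example, on a 5x5 frame with a 250-degree hotspot at (3, 4) and an obstacle at (1, 1), `mission(frame, (0, 0), {(1, 1)})` sidesteps the obstacle and returns (3, 4) as the drop point. A real system would replace the grid with GPS waypoints and the hotspot search with infrared image processing, as the team's switch from thermopile arrays to an IR camera suggests.
    
    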
