19 November 2018 | International, Aerospace, C4ISR

Russian drones can jam cellphones 60 miles away

Russia's Defense Ministry announced Nov. 6 that the nation had extended the range of its drone-carried jammers to 100 km, or over 60 miles. The use of drones as a platform for, and not just a target of, electronic warfare means that the sight of a flying robot overhead could signal incoming strikes as well as a sudden inability to call for help.

“Russia has been using a UAV-mounted cellphone jammer for a number of years now,” said Samuel Bendett, a research analyst at the Center for Naval Analyses. The drones operate in a two- or three-vehicle pod with a ground station, collectively grouped as a “Leer-3” system.

“When these UAVs fly in teams, one acts as a signal-and-comms relay while another acts as a jammer,” Bendett said. “These Leer-3 systems have been around for about two years at this point.”

What has changed is the range of the jammer. The Orlan-10 drones already have an operating range of 75 miles, which means that, with the latest update to the jammer, the drone pod can interfere with signals up to 135 miles from where the drone was launched. TASS reports that the new 60-mile jamming range is a 3.5-fold increase over the initial range.

In addition, Bendett said there's a chance this capability, or an earlier version of it, has already been witnessed in conflict.

“Ukrainian forces claim to have spotted Leer-3 systems in eastern Ukraine, while there is potential evidence that Leer-3 was used in Syria as well,” Bendett said. “Russian forces are constantly training with Leer-3 UAVs as they practice adversary signal and cell comms suppression, identification and eventual destruction of the enemy force. In fact, this kind of training is part of the official [tactics, techniques and procedures] in electronic warfare and other forces across the Russian military.”

Advancements in electronic warfare are one of the key factors guiding the development of autonomous systems for the military. For now, drones are conducting electronic warfare against cellular communications, but it's not hard to imagine the same doctrine applied with new technology. In that scenario, it is easy to picture other vehicles transforming into jamming machines on future battlefields ... and maybe even present ones.

https://www.c4isrnet.com/newsletters/unmanned-systems/2018/11/16/russian-drones-can-jam-cell-phones-60-miles-away

On the same subject

  • USAF’s Big Data Approach To Logistics And Predictive Maintenance

    24 April 2022 | International, C4ISR

    New logistics program feeds basing, force protection data into maintenance analysis.

  • Artificial intelligence systems need ‘checks and balances’ throughout development

    22 June 2020 | International, C4ISR

    Andrew Eversden

    The Pentagon's primary artificial intelligence hub is already studying how to aim a laser at the correct spot on an enemy vehicle, pinpointing which area to target to inflict the most damage, and identifying the most important messages headed to commanders, officials said June 16. But as part of that work, the Department of Defense needs to carefully implement checks and balances into the development process, experts urged June 16.

    “Fundamentally I would say there's a requirement ... that there's going to be a mixture of measures taken to ensure the governability of the system from the first stage of the design of the system all the way up through the operations of the system in a combat scenario,” said Greg Allen, chief of strategy and communications at the Joint Artificial Intelligence Center, at the Defense One Tech Summit June 16.

    The JAIC is working on several lethality projects through its new joint warfighting initiative, boosted by a new contract award to Booz Allen potentially worth $800 million. “With this new contract vehicle, we have the potential to do even more this next year than we did in the past,” Allen said. Meanwhile, the Army's Artificial Intelligence Task Force is working on an advanced threat recognition project. DARPA is exploring complementary AI systems that would identify available combat support assets and quickly plan their route to the area.

    Throughout all of the development work, experts from the military and from academia stressed that human involvement and experimentation are critical to ensuring that artificial intelligence assets are trustworthy. The department has released a document of five artificial intelligence ethical principles, but the challenge remains implementing those principles into projects across a department with disparate services working on separate artificial intelligence projects.

    “We want safe, reliable and robust systems deployed to our warfighters,” said Heather Roff, senior research analyst at the Johns Hopkins Applied Physics Lab. “We want to be able to trust those systems. We want to have some sort of measure of predictability even if those systems act unpredictably.”

    Brig. Gen. Matt Easley, director of the artificial intelligence task force at Army Futures Command, said the service is grappling with those exact challenges, trying to understand how it can insert “checks and balances” as it trains systems and soldiers. Easley added that the unmanned systems under development by the Army will have to be adaptable to different environments, such as urban or desert scenarios. To ensure that the systems and soldiers are ready for those scenarios, the Army has to complete a series of tests, just like the autonomous vehicle industry.

    “We don't think these systems are going to be 100 percent capable right out of the box,” Easley said on the webinar. “If you look at a lot of the evolution of the self-driving cars throughout our society today, they're doing a lot of experimentation. They're doing lots of testing, lots of learning every day. We in the Army have to learn how to go from doing one to two to three vehicle experiments to have many experiments going on every day across all our camp posts and stations.”

    Increasingly autonomous systems also mean that there needs to be a cultural shift among all levels of military personnel, who will need to better understand how artificial intelligence is used. Roff said that operators, commanders and judge advocate generals will need to better understand how systems are supposed “to ensure that the human responsibility and governability is there.” “We need to make sure that we have training, tactics, procedures, as well as policies, ensuring where we know the human decision maker is,” Roff said.
https://www.c4isrnet.com/it-networks/2020/06/18/artificial-intelligence-systems-need-checks-and-balances-throughout-development/

  • Air Force Was ‘Hyper Focused’ on Cybersecurity for IT Networks. Now Other Systems Need Protection. - Air Force Magazine

    11 August 2022 | International, C4ISR

    Air Force Life Cycle Management Center leaders discussed the importance of cybersecurity for weapons systems and base facilities.

All news