Artificial intelligence and machine learning form the foundation of the cognitive techniques set to revolutionise electronic warfare.

Since their first use en masse during World War 2, Electronic Warfare (EW) systems – whether carried by aircraft, fitted to ships or supporting armies – have tended to follow the same philosophy. A person programs them to be aware of the radios or radars operating within their detection range. The system recognises a particular radar or radio because a person has told it how to recognise transmissions from that emitter.
For example, an Electronic Support Measure (ESM) or a Radar Warning Receiver (RWR) equipping a ship or aircraft may be programmed to detect all signals in L-band (1.215GHz to 1.4GHz) and to alert the crew every time it detects such a signal.
However, an eye-watering array of RF (Radio Frequency) emitters transmit in L-band, including radars, satellite communications, conventional Ultra High Frequency (UHF – 300MHz to 3GHz) radio communications, television and radio broadcasting, and satellite navigation systems. The result is an EW system that sounds alerts almost continuously as it encounters the myriad emitters in the ether.
To avoid these false alarms the ESM/RWR programmer may configure the equipment to recognise only certain L-band waveforms.
Put simply, a waveform is a transmission which is altered or modulated to perform a particular task. For example, an L-band signal may be modulated in a specific way to carry voice traffic between two UHF radios. Alternatively, the signal may be modulated to carry satellite navigation information to a vehicle, or to detect an aircraft.
The programmer can task the ESM or RWR to only alert the crew if L-band waveforms with specific characteristics are detected. For instance, the system may be programmed to detect tracking waveforms transmitted by a ground-based air surveillance radar or naval surveillance radar.
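To make that concrete, the logic of a conventional, human-programmed threat library can be sketched in a few lines of code. The following Python fragment is purely illustrative – the band edges, pulse widths and pulse repetition intervals are invented values, not parameters of any real ESM or radar.

```python
# Minimal sketch of a hand-programmed ESM threat library.
# All numeric values are illustrative, not real emitter parameters.
from dataclasses import dataclass

@dataclass
class Intercept:
    frequency_ghz: float   # centre frequency of the detected signal
    pulse_width_us: float  # pulse width in microseconds
    pri_us: float          # pulse repetition interval in microseconds

# Hypothetical library entry for an L-band surveillance radar in tracking mode
THREAT_LIBRARY = [
    {
        "name": "L-band air surveillance radar (tracking waveform)",
        "band_ghz": (1.215, 1.4),
        "pulse_width_us": (50.0, 120.0),
        "pri_us": (2000.0, 4000.0),
    },
]

def classify(intercept: Intercept) -> str | None:
    """Return the matching library entry's name, or None if nothing matches."""
    for entry in THREAT_LIBRARY:
        f_lo, f_hi = entry["band_ghz"]
        pw_lo, pw_hi = entry["pulse_width_us"]
        pri_lo, pri_hi = entry["pri_us"]
        if (f_lo <= intercept.frequency_ghz <= f_hi
                and pw_lo <= intercept.pulse_width_us <= pw_hi
                and pri_lo <= intercept.pri_us <= pri_hi):
            return entry["name"]
    return None  # everything else is ignored, so the crew is not swamped with alerts

# A 1.3GHz intercept with matching pulse parameters raises an alert;
# an L-band signal with different characteristics does not.
print(classify(Intercept(frequency_ghz=1.3, pulse_width_us=80.0, pri_us=3000.0)))
```

The point of the sketch is that every rule in the library has to be written, and updated, by a person: the system only recognises what it has already been told about.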
The detection of such transmissions could mean that the ship or aircraft has been detected and is being tracked. This may be the first stage in a potential kill cycle that could see the ship or aircraft’s coordinates passed to a fire-control radar in preparation for the launch of a surface-to-air missile (SAM) or anti-ship missile (ASM). Unsurprisingly, it is vital that the crew have such information so that they can begin jamming the radar or commence manoeuvres to reduce the ship or aircraft’s visibility to the radar.
Electronic warfare is hugely dependent on humans. While ESMs and RWRs equipping aircraft, satellites, ships, vehicles, and soldiers are tasked with collecting signals intelligence (SIGINT) on communications (COMINT) and radars (ELINT), it is people – albeit assisted by increasingly sophisticated software – who analyse this raw SIGINT.
This analysis turns the raw SIGINT into actionable COMINT and ELINT that can in turn be programmed into ESMs and RWRs, so they know what threats to look for. Once these threats are detected they can either be avoided or engaged with kinetic or electronic effects.
COGNITIVE EW
The term Cognitive EW has proliferated within and beyond the global EW community over the past decade. Definitions differ, but Cognitive EW can broadly be defined as the application of Artificial Intelligence (AI) and Machine Learning (ML) techniques to assist the conduct of EW.
EW comprises three subdisciplines:
- Electronic attack is the offensive transmission of electromagnetic energy to disrupt, degrade, or destroy hostile platforms, subsystems, weapons, or capabilities dependent wholly or in part on the electromagnetic spectrum.
- Electronic protection is the safeguarding of platforms, personnel, subsystems, weapons, and capabilities against electronic attack.
- Electronic support is the collection, management, and dissemination of SIGINT relevant to the pursuit of electronic attack and protection.
Cognitive EW is applicable to all three of these pillars, and the overriding goal is to harness AI and ML to train EW systems to think for themselves.

Returning to our example, once the ESM is programmed to recognise L-band radar waveforms it will continue to detect and recognise those transmissions until it is told otherwise. Conversely, if it is reprogrammed to ignore those transmissions, it will stop.
However, an ESM with cognitive functions would behave differently. Imagine you have just received a new ESM. It has AI software, but its memory is blank: it has yet to encounter any radar or radio transmissions. Before its first sortie on an aircraft it is programmed, like its conventional counterpart, to recognise L-band transmissions from ground-based air surveillance radars. During that sortie it performs much as a conventional system would, detecting, classifying, and locating these L-band radars.
But during the mission the ESM notes a pattern of behaviour. When some of the L-band radars illuminate the aircraft, an X-band radar transmission is detected shortly afterwards. Unbeknown to the ESM, a SAM battery’s L-band ground-based air surveillance radar is detecting and tracking the aircraft, and its operators then activate an X-band (8.5GHz to 10.68GHz) fire-control radar to illuminate the target and, potentially, guide a SAM to its quarry.
The ESM also receives inputs from other EW systems on the aircraft. Feedback from the aircraft’s RF jammer and countermeasures launcher informs the ESM that these systems are activated during specific sequences of events: the countermeasures launcher and RF jammer are triggered every time the radar warning receiver detects X-band radar transmissions shortly after L-band transmissions, and this pattern recurs over several sorties.
An AI/ML-enabled ESM will be able to learn that detecting an L-band radar signal followed by an X-band transmission constitutes potential danger for the aircraft. So in the future, when this sequence of events occurs, cognitive EW techniques could allow the ESM to automatically activate countermeasures without needing to alert the pilot. Likewise, the ESM may realise that an L-band radar illuminating the aircraft, not followed by an X-band signal, is relatively benign and thus no action is required.
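The learning step itself need not be exotic. As a loose illustration only – a real cognitive ESM would use far richer features and models – the following Python sketch shows how a system might estimate, from hypothetical sortie logs, how often a given sequence of band detections was followed by a defensive response, and use that estimate to recommend an action. The logs and the 0.9 threshold are invented for the example.

```python
# Loose sketch of learning a threat rule from observed sequences of events.
# The sortie logs and the decision threshold are invented for illustration.
from collections import Counter

# Each record: (sequence of bands detected, were countermeasures fired?)
sortie_logs = [
    (("L",), False),
    (("L", "X"), True),
    (("L",), False),
    (("L", "X"), True),
    (("L", "X"), True),
]

seen = Counter(seq for seq, _ in sortie_logs)
responded = Counter(seq for seq, fired in sortie_logs if fired)

def threat_probability(sequence: tuple[str, ...]) -> float:
    """Estimated probability that this sequence of detections demands a response."""
    if seen[sequence] == 0:
        return 0.0
    return responded[sequence] / seen[sequence]

def recommend_action(sequence: tuple[str, ...], threshold: float = 0.9) -> str:
    """Recommend countermeasures only for sequences that have reliably preceded them."""
    return ("activate countermeasures"
            if threat_probability(sequence) >= threshold
            else "monitor only")

print(recommend_action(("L", "X")))  # -> activate countermeasures
print(recommend_action(("L",)))      # -> monitor only
```

In effect, the system infers the rule ‘L-band followed by X-band means danger’ from its own experience, rather than having a programmer write that rule by hand.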

As this technology makes its way into other weapons systems and sensors, AI and ML could soon become essential elements for EW, particularly electronic support. Communications systems are already adopting cognitive techniques, as are radars, and both may soon be capable of continually changing their transmission waveforms to frustrate attempts by ESMs to accurately detect and classify their transmissions.
Ironically, cognitive EW systems could use similar approaches against these same radars and radios. “The idea is to give EW systems the same conceptual capabilities as cognitive radios and radars, entering the spectrum with agility, adaptability and efficiency,” Daniela Pistoia, corporate chief scientist of Elettronica Group, an Italian Defence, Cyber and Security company told ADBR.
We could even be moving towards a situation where the permutations of waveforms used by radars and radios are effectively endless. Dr Sue Robertson, director of EW Defence, told us: “The vast amount of data needed to effectively characterise signals makes it impossible for human operators to process all the data available.”
Dr Robertson believes that “cognitive EW techniques would be a bonus” in making sense of this bewildering array of signals that SIGINT practitioners seem destined to encounter in the future. Ms Pistoia agrees, saying, “The spectrum is much more congested and contested, and we are operating in a competitive environment.”
But like any other aspect of EW, cognitive approaches are not necessarily a magic wand in the electronic support arena. Any EW system, cognitive or otherwise, is only as good as the data it receives, which forms the raw material for its decision-making. Dr Robertson cautions that “care must be taken to insert cognitive EW at the correct points in the processing chain and to have safeguards to check the results”.
This could be complicated by the reality that most armed forces spend far more time at peace than at war, so any cognitive EW system will always have vastly more peacetime data to crunch than wartime data. The danger is that this imbalance could skew the decisions these systems make when applying AI and ML.

BOUNDARIES
It is important to stress that cognitive EW will not make the human operator obsolete. In the context of electronic support, Dr Robertson stresses that it is designed as “an aid to human processing, and not the complete replacement of expert operators”. As with many aspects of warfare that look set to be revolutionised by AI and ML augmentation, the human will still play the dominant part, albeit ‘on the loop’ – playing a supervisory role – rather than being ‘in the loop’.
Nonetheless, she believes there is potential for humans to be replaced in the context of the electronic attack mission. This could be driven by two factors.
The first is the speed of engagement. The velocity of modern weapons systems is steadily increasing. Radio waves already travel at the fastest speed possible – 300,000km/second, the speed of light – while a threat such as the Almaz-Antey S-400 high-altitude/long-range SAM system fields weapons like the 48N6DM/E2/E3 missiles, which can reach hypersonic speeds of almost six times the speed of sound (around 7,400km/h).
Therefore, we may already be entering an era where a human simply cannot observe and react to a kinetic threat in sufficient time to avoid being killed. This situation may be exacerbated further if the weapon is using AI and ML techniques to detect and engage targets, something which seems almost inevitable in the future.
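Some back-of-the-envelope arithmetic illustrates how tight the timeline can be. Assuming – purely for illustration – that an inbound missile travelling at the 7,400km/h cited above is detected at a range of 30km:

```python
# Illustrative reaction-time arithmetic; the 30km detection range is assumed.
missile_speed_kmh = 7_400                      # near-hypersonic speed cited above
missile_speed_kms = missile_speed_kmh / 3_600  # roughly 2.06km per second

detection_range_km = 30                        # hypothetical detection range
time_to_impact_s = detection_range_km / missile_speed_kms

print(f"{time_to_impact_s:.0f} seconds to detect, decide and act")  # about 15 seconds
```

Around 15 seconds is precious little time for a human to recognise the threat, choose a response and execute it – precisely the gap a machine-speed response is intended to close.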
The second factor is workload. The ever-deepening sophistication of platforms and weapons will only increase the tasks the human brain must undertake. The more tasks that can be automated, the more ‘headspace’ the human will have to concentrate on other things. From an ethical perspective, it may be less controversial to give a machine autonomy over the initiation and control of an electronic attack than to allow it full control of a kinetic engagement.
There is also the possibility that the quality of the machine’s decision may be better than that of a human. After all, EW systems don’t experience fatigue or battle stress – they are as fresh at the end of a mission as they were at the start. “Sometimes what seems the safest thing to do is not actually the right one, so it is possible that cognitive EW could be better than the human operator,” Dr Robertson said.
But simply designing AI and ML capabilities into an EW system is not the end of the process. Verifying and validating these capabilities is imperative if the system is to function as desired. “Once you have designed a solution with elements of autonomous decision-making, what are the tools and approaches you will use to validate the system for all the applications you have in mind?” she asks.
Ms Pistoia adds that she expects militaries will need to modify their doctrines and tactics, training, and procedures to take account of cognitive EW systems. “They might adapt their concepts of operation to include the fact that elements of their orders of battle now have autonomous decision-making capabilities,” she said.
Moreover, EW training will have to be modified to cover cognitive elements, and Ms Pistoia believes that it is inevitable that the requisite skills of tomorrow’s EW practitioner will be different from those of today’s.

IMPLEMENTATION
The advent of cognitive EW is not a question of if, but when. “We have already started on the cognitive EW journey,” Dr Robertson told us. Yet progress is measured in baby steps. “Industry may claim impressive solutions for cognitive EW, but often when detailed analysis of results is undertaken, especially for ESM systems, the results are not as good as expected,” she added.
It is unlikely that there will be a ‘big bang’ moment as far as the advent of cognitive EW is concerned. Instead, cognitive capabilities are likely to be added incrementally to new EW systems, and perhaps retrofitted into existing ones.
Moreover, introducing cognitive approaches into electronic warfare will require numerous hurdles to be overcome, not only technological but also cultural. “There is a misconception that translating traditional EW tasks into autonomous or semi-autonomous tasks is easy when actually it is not,” warned Dr Robertson. She added that she is concerned “there are not enough EW experts involved in the development of cognitive EW”, and that replicating tasks currently performed by a human EW practitioner will not be easy.
In addition, would-be users of EW equipment incorporating AI/ML capabilities must be won over on the issue of trust. It is no secret that militaries around the world can be conservative, and cognitive approaches to EW will need to demonstrate that they are safe, reliable, and capable, particularly in combat, before they are likely to gain wholesale acceptance by armed services. Indeed, Ms Pistoia says that some service personnel she has encountered have expressed concern that they risk losing control of a cognitive EW system.
The incremental adoption of cognitive EW approaches will help in this regard, by gradually enhancing EW systems and getting users progressively accustomed to these approaches. That cognitive EW has the potential to significantly change how EW practitioners do business today is not in doubt, but the technology will not lead to a wholesale replacement of EW practitioners with machines, nor is this the logic behind the innovation.
Instead, the advent of cognitive approaches could improve the way humans perform the electronic attack, electronic protection, and electronic support dimensions of EW. Furthermore, debating whether cognitive approaches should be adopted in EW is largely pointless – the technology is becoming an unstoppable reality. For the machines to be the humans’ servant, it is up to the humans to decide how best to deploy this potent new technology.
We have only just begun to scratch the surface of this challenge.

This feature story was published in the Jan-Feb 2021 issue of ADBR.