Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls, and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special

The army of one

How the writings of an al-Qaeda strategist inspired the spate of small-cell terror attacks in Britain and other Western countries.

Again. The attack on London Bridge and Borough Market on 3 June has claimed seven lives, with many more people still receiving intensive care for critical injuries. Within hours of the terrorist attack – the third within as many months in England – Islamic State released a statement claiming responsibility, as it did for the two outrages in the UK that immediately preceded it. At least one of the three attackers, Khuram Butt, had a long history of extremist activism and associations in Britain. He was a member of one of the Islamist networks that emerged during the 1990s and then proliferated after the al-Qaeda attacks of 11 September 2001. Although most of the preachers who founded these groups – such as Omar Bakri Muhammad, in the case of Butt and al-Muhajiroun – are no longer in the country, their legacy endures.

Bakri founded al-Muhajiroun (meaning “the emigrants”) in 1996. Now outlawed, it was a radical group committed to the re-establishment of a caliphate. After Bakri was finally excluded from the UK in 2005 (he is now in prison in Lebanon), leadership of the network passed to Anjem Choudary, the well-known British jihadist. From its inception, al-Muhajiroun embraced ever greater extremism, declaring support for Osama Bin Laden, for the 9/11 attacks and for al-Qaeda. Scores of its members have been convicted of terrorist offences. Choudary was sentenced to five and a half years in prison in September 2016 for supporting IS.

Several individuals from his network have travelled to Syria in recent years. Among them was Abu Rahin Aziz, originally from Luton, who became involved in planning IS attacks against the West. He was killed in a US drone strike on Raqqa in 2015. Another prominent member of the group, Abu Rumaysah from London, moved his wife and five children to IS-held territory. Along with another British member, Mohammad Ridha Alhak, Rumaysah is believed to have appeared in execution videos for IS.

Those who have remained at home can be found on the edges of terrorist plots. Butt, a 27-year-old British national born in Pakistan, was featured in a recent Channel 4 documentary about British supporters of Islamic State. He glorified and revelled in the barbarism of IS.

Butt will not be the last British jihadist to carry out a terrorist outrage in this country. The London Bridge attack may have seemed chaotic and amateurish but that is the jihadists’ purpose. And behind even the most unsophisticated attack is a considered strategic theory of global jihad, the antecedents of which are long and extend from Afghanistan into Yemen and Syria.

***

Al-Qaeda concentrated on carrying out the 11 September 2001 attacks with such tunnel vision that it gave little consideration to what might come next. The group reasoned that the US would be forced to respond to the atrocity, as Bill Clinton had done in 1998 after groups affiliated with Osama Bin Laden bombed US embassies in Tanzania and Kenya. The Clinton administration launched a series of retaliatory cruise missile strikes against sites associated with Bin Laden in Sudan and Afghanistan. But the response was otherwise muted.

Nonetheless, al-Qaeda had learned an important lesson. Push the US hard enough and the president will be forced to act – which is what al-Qaeda wanted. What the group had not anticipated was the ferocity of American resolve.

As the Taliban melted away after the US invasion of Afghanistan in late 2001 – and with much of its leadership captured, or killed, or on the run – al-Qaeda feared it had overreached. What good had the 9/11 attacks achieved if the global jihad movement would now crumble?

This provoked an intense debate within the movement about its future. Two competing schools of thought arose, which were considered mutually exclusive until Islamic State’s emergence brought them together.

The first view came from a theorist called Abu Bakr Naji (this is a pen name). Naji argued that al-Qaeda should promote an asymmetry of fear by adopting especially brutal and gruesome tactics. He believed Western societies were ultimately weak and lacked the resolve to endure the long war. Instead, he reasoned, jihadists should continue to escalate their depravity and barbarism. This would in turn allow them to re-establish formal control over territory as the Taliban had done, creating safe havens and launchpads for future attacks.

Naji’s view was robustly opposed by another theorist, Abu Musab al-Suri (a nom de guerre for Mustafa Setmariam Nasar, a Syrian strategist within al-Qaeda). He argued that the American response to 9/11 was too severe and that the group would never be able to regain the freedom it had enjoyed under the Taliban. The global jihadi movement would have to embrace the new reality.

According to the Norwegian scholar Brynjar Lia, who has written an authoritative biography of Suri, he opposed the 9/11 attacks precisely because he feared al-Qaeda would be unable to withstand the ferocity of the US response. When it eventually came, Suri felt vindicated.

It reaffirmed a long-standing view of his that the global jihad movement could succeed only if it was decentralised. Suri had begun to advocate decentralisation in the early 1990s, arguing that formal hierarchies did not serve the jihadist cause well. At the time, militant groups were being rounded up in Egypt, Libya and Algeria because their members congregated in large structures.

The most formative influence on Suri’s views, however, was the uprising led by the Muslim Brotherhood in the Syrian city of Hama in the late 1970s and early 1980s. He wrote about the experience in a 900-page book, The Islamic Jihadi Revolution in Syria. The uprising was brutally repressed by President Hafez al-Assad, father of Bashar al-Assad. Too much centralisation and formal structuring had caused the revolution to be lost, Suri reasoned. What the movement needed was smaller and more autonomous cells, which could wage a form of low-intensity guerrilla warfare, grinding down the local populations.

Although Suri had been formulating his ideas since the early 1990s, it was only after 9/11 that they coalesced into a coherent theory. Towards the end of 2004, he published his seminal work, The Global Islamic Resistance Call, which outlined his vision for the future of the global jihad movement.

Suri took a more strategic view of terrorism and its outcomes than Bin Laden or his al-Qaeda network. They obsessed about “spectacular”, large-scale attacks such as the 1998 twin embassy bombings, 9/11, the Madrid bombings, or the 7 July 2005 attacks in London. Suri welcomed the successful execution of these attacks, but above all he wanted continuous, low-level action of the kind we are now experiencing in Britain. Despite overt public displays of resilience and camaraderie, these attacks have succeeded in making people more fearful and angry.

“The jihad of individual or cell terrorism, using the methods of urban or rural guerrilla warfare, is fundamental for exhausting the enemy and causing him to collapse and withdraw,” Suri wrote in The Global Islamic Resistance Call.

To justify indiscriminate attacks on civilians, he invoked verse 8:60 of the Quran, which states: “And prepare against them whatever you are able of power and of steeds of war by which you may terrify the enemy of Allah and your enemy and others besides them.” The passage is often invoked by jihadi theorists to rationalise acts of mass and indiscriminate terror. “This generous verse has ordered preparation for the purpose of terrorising the assailants’ and God’s enemies among the infidels and their servants,” Suri wrote.

He interpreted its invocation to “terrify the enemy” broadly, arguing that “terrorism is a religious duty, and assassination is a Prophetic tradition”.

This is what is known as the doctrine of the “army of one”. The idea is simple: individuals are empowered to carry out deadly and destructive attacks without an overriding command-and-control structure. Having no command structure makes their attacks harder to intercept and suppress. The reality is that the antecedents of the threat we face in Britain today were first theorised in the mountains of the Hindu Kush more than a decade ago.

***

Al-Qaeda’s central leadership favoured Suri’s doctrine, believing that Naji’s approach was too fantastical. What Suri offered was a simpler, more tangible vision of how the global jihad movement should proceed.

However, it was al-Qaeda’s branch in Yemen (known as AQAP) that capitalised most on this doctrine. Under the spiritual tutelage of Anwar al-Awlaki, a Yemeni-American cleric who was eventually killed in a US drone strike, AQAP pursued an aggressive doctrine of global jihad.

What made Awlaki so dangerous was not just his charismatic appeal, but his experience of living in the West. He understood the motivation of Western Muslims and knew how to radicalise them.

AQAP published a magazine called Inspire – a precursor to the glossy IS magazine Dabiq – which both glorified and inspired attacks against the West. Inspire published Abu Musab al-Suri in English translation. Much of his work has never been translated and remains accessible only in Arabic, putting it beyond the reach of many Western Muslims. AQAP changed this by bringing the most devastating sections of his writing directly to readers in the West.

Most importantly, Inspire created and promoted a programme of Open Source Jihad (OSJ), which is the strategy of inspiring lone-actor attacks. It offered simple instructions for launching unsophisticated attacks on civilians: pipe bombs, stabbings and vehicle-based assaults.

The impact was considerable. According to original research by Alexander Meleagrou-Hitchens in his forthcoming book on Awlaki, between 2009 and 2016, 66 of a total of 212 terrorism cases in the United States could be directly linked to the cleric in one form or another. Put another way, Awlaki was responsible for or inspired almost one-third of all terrorism cases in the US in that period.

“In this section, the OSJ [Open Source Jihad], we give our readers suggestions on how to wage their individual jihad,” is how Inspire magazine described its OSJ programme. “It allows Muslims to train at home instead of risking [sic] a dangerous travel abroad.” Awlaki explained that this was “a disaster for the repressive imperialistic nations . . . America’s worst nightmare”.

Its effects were felt not only in the United States. In May 2010, Roshonara Choudhry, a then 21-year-old university dropout, attempted to murder Stephen Timms, the Labour MP for East Ham, at his constituency surgery in London because he had voted in favour of the war in Iraq. During her trial, Choudhry explained that she had been motivated to stab Timms after she became a devotee of Awlaki and his OSJ programme.

The murder of Fusilier Lee Rigby by Michael Adebolajo and Michael Adebowale in May 2013 was another Awlaki-inspired plot. Rigby was attacked on the streets of Woolwich, south-east London. His attackers first rammed him with a vehicle and then stabbed him with knives.

Documents seized by the United States following the raid in which Osama Bin Laden was killed show that the al-Qaeda leader was uneasy about AQAP’s strategy. He felt that attacks using vehicles against civilians were wrong as well as amateurish. And he believed they were so brutal that they would reduce support for violent jihad.

***

Islamic State has never worried about public opinion. It emerged in 2003 after the invasion of Iraq, in a period when the murderous Abu Musab al-Zarqawi, a Jordanian, was leading the organisation. Unlike al-Qaeda’s central leadership, which found itself on the defensive in Afghanistan, Zarqawi, who revelled in barbarism, was presented with an opportunity to confront some of the group’s biggest enemies – America and Britain – in the heart of the Arab world.

In these circumstances, he favoured Naji’s nihilistic doctrine of brutality. His methods have been played out in Syria and Iraq, producing especially egregious acts of barbarism against the local populations over which IS has ruled. Yet for its attacks in the West, the group continues to embrace Suri’s model of decentralisation.

IS has produced significant amounts of literature promoting gruesome attacks in its followers’ home countries. In its Rumiyah (Rome) magazine, one infographic promotes truck attacks, describing their use as “just terror”. It advises readers to acquire a vehicle that is “large in size, heavy in weight” and which has a “slightly raised chassis and bumper”. Among its suggested targets are “congested streets, outdoor markets and large outdoor festivals”.

Another infographic disseminated by the group on social media advises supporters to “kill the civilians of the crusaders, run over them by vehicles [sic]”.

The Nice attack in 2016, which killed 86 people, demonstrated just how effective a relatively unsophisticated plot carried out by a lone actor can be. This is one of the ways in which terrorism works: a successful attack gives confidence to others, inspiring and emboldening imitators. The wave of terrorism that is now sweeping the UK is born of this.

Terrorists take encouragement from one another, and we have seen comparable spikes in attacks across the European continent, with France and Germany enduring periods of similar activity.

***

None of this occurs in a vacuum. For many years we allowed radical preachers such as Omar Bakri Muhammad, Anjem Choudary and Abu Hamza to preach on the streets and in the mosques of Britain. They spread a deadly message of separateness, telling young Muslims not to identify as British. In many cases, they invoked the very same verses of the Quran as al-Qaeda theorists such as Abu Musab al-Suri in order to spread their message.

A leaflet produced and distributed in 2006 by the same network from which Khuram Butt emerged brazenly glorified terrorism of the kind he unleashed in London. “Jihad against the Kuffar [infidels], the enemies of Allah, puts fear in their hearts and terrifies them,” it stated. “This will give Islam victory, humiliate its enemy and put happiness into the hearts of believers.”

Al-Muhajiroun frequently celebrated terrorist atrocities at home and abroad. What we have, therefore, is a culture in which young men growing up in Britain are divorced from our society and its values. They are invested instead in the fortunes of foreign conflicts, exposing us to the turbulence of distant wars. Once it was Yemen and Anwar al-Awlaki that posed the greatest threat to Britain, after al-Qaeda regrouped and established a base there. Yet the potency of his message tailed off sharply after he was killed in 2011.

There is a lesson to learn. The message of leaders and movements, however ideological, still requires them to have an active presence. When they are killed, or pushed back through military action, their potency is much reduced.

In recent times, Islamic State has been able to project a message of momentum and success. Now that the group is suffering significant setbacks in Mosul – where it has lost practically the entire city – its prestige is diminished. Its de facto capital in Raqqa, Syria, is also being slowly encircled by coalition troops.

As with the efforts in Iraq, reclaiming the city will be difficult and dangerous, but Raqqa will fall. In the meantime, attacks from the “army of one” will only intensify and increase in frequency, yet this is a critical phase through which we must pass if the overall threat from IS and other such groups is to be defeated decisively. Our security can only be built over their ruins.

Shiraz Maher is a contributing writer for the New Statesman and a senior research fellow at King’s College London’s International Centre for the Study of Radicalisation.

This article first appeared in the 08 June 2017 issue of the New Statesman, Election special
