
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


Goodbye to the Confederate flag

After the shootings in Charleston, the Republican right showed it was finally ready to reject the old symbols of the Confederacy.

On 27 June, an African-American activist named Bree Newsome woke up before dawn, put on her climbing equipment and scaled a 30-foot flagpole on the lawn of State House in Columbia, South Carolina. She then removed the Confederate battle flag that flew from it. “We can’t wait any longer,” she explained later in an online statement. “It’s time for a new chapter where we are sincere about dismantling white supremacy.”

After she was led away in handcuffs, the flag was raised again.

Newsome’s protest reflected a growing impatience within America’s black community and anger about liberal inaction. Political rallies by the Democratic presidential contenders Hillary Clinton and Bernie Sanders have been disrupted by the Black Lives Matter campaign against violence committed against young African Americans and the cultural and legal biases that justify it. While promoting his book on race in the US, the writer Ta-Nehisi Coates argued that, to African Americans, the battle flag represents a lingering attempt “to bury the fact that half this country thought it was a good idea to raise an empire rooted in slavery”.

Yet, on this matter, to everyone’s surprise, the black civil rights movement and many southern Republicans have proved to be of one mind. On 9 July the House of Representatives in South Carolina voted to lower the battle flag for good. It stood, representatives said, for racism. It had to go.

The context of this agreement was a painful one. Ten days before Newsome’s act, a 21-year-old white man named Dylann Roof shot and killed nine black worshippers at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina. According to his room-mate, he wanted to start a race war. The TV screens showed a photo of him holding a gun in one hand and a Confederate battle flag in the other.

If the demands for redress made by civil rights groups didn’t come as a surprise, conservative acquiescence did. The Republican Party had built a solid base in the South by courting white voters who cherished the memory of the Confederacy. Yet the party’s presidential hopefuls from both the North and the South – including Jeb Bush, Lindsey Graham, Scott Walker and George Pataki – said that the battle flag ought to be lowered. The most striking intervention was made by the governor of South Carolina, Nikki Haley, who denounced the use of the Confederate flag and signed the bill removing it. Haley is now tipped to figure on the list of potential vice-presidential nominees.

The volte-face of the US right is in part a result of the horror of the Charleston shootings. Yet it also occurs in the context of major shifts within American society. There are still many conservatives who will defend Confederate heritage as a matter of southern pride but the culture wars are changing as the US becomes increasingly European in outlook. This is taking place across the country. It just happens to be more pronounced in the South because no other region has fought so violently and so long to resist the liberal tide.

The story of the battle flag is the story of the South. The first official Confederate flag used in the civil war of 1861-65 caused confusion during fighting – through the haze of gun smoke, its design of 13 stars and red and white bars was hard to distinguish from the Stars and Stripes. An alternative blue cross was rejected for being too sectarian; the racist Confederacy was anxious not to offend its Jewish citizens. So the cross became a diagonal X. This flag was never officially adopted by the Confederate army. In the years after the war its use was infrequent.

There was little need to visualise southern difference in a flag. It was self-evident in the physical signs of racial segregation: separate schools, pools and drinking fountains; black people confined to the back of the bus. Political displays of the battle flag of Dixie (the historical nickname for the states that seceded from the Union) only really resurfaced when that racial order was challenged by northern liberals. In 1948, the Democrats – then the party overwhelmingly in control of the South – split over modest calls for civil rights. The conservatives who refused to support that year’s presidential ticket, the “Dixiecrats”, triggered a revival of flag-waving across the region.

The old battle flag suddenly appeared on private lawns, on cars and at political rallies. Supposedly ancient cultural traditions were invented overnight. For instance, the 1948 student handbook of the University of Mississippi confessed: “Many Ole Miss customs are fairly new; they lack only the savouring which time brings . . . Ole Miss has adopted the Confederate flag as a symbol of the Mississippi spirit. Each football game finds the scarlet flag frantically waving to the rhythm of the Rebel band.”

I can confirm that this “tradition” was still going as recently as 2005. That year, I attended an American football game at Ole Miss and was surprised when the band played “Dixie” at the end. White boys and white girls stood up and belted out the folk song of the Confederacy, while black students filed out.

In 1958, South Carolina made it a crime to desecrate the battle flag. Three years later, on the 100th anniversary of the outbreak of the civil war, it was hoisted above its Capitol building in Columbia. That day, there was a struggle in the US Congress to keep federal funding going for segregated schools.

So clear is the link between the postwar white resistance to civil rights and the battle flag that many see it as the symbolic equivalent of the N-word. Jack Hunter, the editor of the conservative website Rare Politics, says: “Some people insist that it’s not about racism, not about slavery, not about segregation. But it’s about all those things.” Hunter grew up in Charleston and used to skateboard in the car park of the church that Dylann Roof attacked. When he was a young journalist, he appeared on local radio as a rabidly right-wing masked character called “the Southern Avenger”. His past was exposed in 2013 while he was working for Rand Paul, a Republican presidential candidate, and Hunter stepped down from his position. He publicly renounced his youthful association with racial conservatism. He now eschews any romanticism about the Confederate cause and its demand for states’ rights. “States’ rights to do what?” he asks: the right to discriminate against African Americans? He is glad that the State House flag is gone. He ascribes its longevity to ignorance, which was corrected by Roof’s rampage: “It was the first time that [southern Republicans] were able to see a different perspective on this symbol.”

Not everyone agrees. Richard Hines – a former South Carolina legislator, Reagan campaign state co-chair and senior activist with the Sons of Confederate Veterans – insists that the flag is “an enduring symbol of the southern fighting man”. Indeed, a poll in July found that 57 per cent of Americans think it stands for southern heritage, rather than racism. Yet that heritage has a political dimension. “Southern people are proud of who they are and there is a leftist assault to destroy the best part of America,” Hines says. “The Trotskyite elite in control of the establishment wants to root out the southern tradition” – a tradition of religious devotion, chivalry and military honour. It is possible to cast the battle flag as a pawn in a much larger cultural conflict.

In 2000, civil rights activists lobbied hard to get the battle flag removed from the top of the South Carolina Capitol and succeeded in having it shrunk in size and relocated to the grounds of State House. The issue came up in that year’s Republican presidential primaries – an unusually poisonous contest between George W Bush and John McCain. Supporters of Bush put out a false story that McCain had fathered an interracial child out of wedlock. McCain added to his woes by opining that the battle flag was “a symbol of racism and slavery”. An organisation called Keep It Flying flooded the state with 250,000 letters attacking him and he lost the crucial South Carolina contest to Bush.

The battle flag has retained a strong emotional power for a long time. This makes the Republican establishment’s abandonment of the flag all the more surprising. Then again, those who run the South are probably the people most likely to grasp how much the region has changed in just a decade.

***

In 2010 I took a trip through North Carolina. The landscape told a story. Dotted along the roadside were abandoned black buildings, the old tobacco sheds. The decline of the rural economy had rendered them obsolete. Over the fields that would once have been full of farmers were freshly tarmacked roads, stretching out to nowhere. My guide explained that these were supposed to be cul-de-sacs for new houses. North Carolina was going through a property boom. But who was going to buy all those homes, I asked? The answer: damn Yankees.

Demography is destiny. This once agricultural region developed fast from the 1960s onwards by keeping union membership, taxes and regulation as low as possible. Yet capitalism proved disastrous for southern conservatism. Northerners flooded in, seeking work or retirement and bringing their own values. The forecast is that North Carolina’s Research Triangle – the South’s Silicon Valley – will grow by 700,000 jobs and 1.2 million people in two decades.

White migration was accompanied by an influx of Spanish speakers as the service sector flourished. Between 2000 and 2010, the white share of the population of North Carolina fell from 70 to 65 per cent. The black proportion remained at roughly 21 per cent. The Latino proportion, however, jumped from 4.7 per cent to 8.4 per cent. Today, about a third of those over 60 are non-white; among those under 18, the figure is approaching half. As a result, politics in the South is no longer biracial: a contest between white and black. It is increasingly multiracial and uncoupled from the region’s complex past.

The impact of these changes is reflected in voting patterns. In 2000, the South was still overwhelmingly Republican in presidential contests. Even the Democratic nominee, Al Gore, a southerner, lost his home state of Tennessee. But in 2008 and 2012, Barack Obama took those states with the fastest-changing demographics: Florida and Virginia. He won North Carolina in 2008 and lost it in 2012 – but by less than 100,000 votes. It is true that the Republicans won back control in the 2014 midterm elections, with the result that the Deep South now sends few Democrats to Congress; but the region’s political masters are not quite as traditional-minded as they once were.

The Republican relationship with the Confederate past is complex. As the party of Abraham Lincoln and the Union, the GOP historically had little southern support. But in the 1960s the national Democratic Party embraced civil rights and alienated its once loyal southern following; the Republicans took the opportunity to steal some conservative white voters.

The growing southern Republican vote had a class component. Its success in local and congressional races was built more on winning over middle-class moderates than on appealing to the working-class racists who filled the ranks of the Ku Klux Klan. The southern Republican Party did enthusiastically embrace the Confederate battle flag in many quarters. But some office-holders did so only with ambiguity, while large sections of the party never identified with it at all. The period of Republican ascendancy in the South was, in reality, linked with a softening of the area’s racial politics.

Two of the Republicans’ current southern stars are Indian Americans: Bobby Jindal, the governor of Louisiana, and Nikki Haley, the anti-flag governor of South Carolina. There are just two black people in the US Senate and one of them is a Republican, the Tea Party-backed senator for South Carolina, Tim Scott. Marco Rubio, the Floridian senator and presidential candidate, is Cuban American, and the former Florida governor Jeb Bush is married to a Mexican-born woman and speaks fluent Spanish. Bush has tried to push a more moderate line on immigration, in deference to how the GOP will struggle to win the White House if it appeals only to angry white voters. The Kentucky libertarian senator Rand Paul, Jack Hunter’s former boss, has called for legal reforms to correct the trend of keeping far more black than white people in prison. And he is not the only Republican to have been moved by recent race riots sparked by police violence.

***

Violence on the streets of Ferguson, Missouri, and Baltimore, Maryland, confirmed that there still is a culture war in the US. Yet its character has changed. In the past, civil disturbances were typically leapt upon by conservative politicians as evidence of social decline. The 1992 LA riots were blamed on single parenthood and rap lyrics. In contrast, conservative leaders today are far more likely to acknowledge the problems of white racism. There is no place in their ranks for the likes of Dylann Roof. White supremacists are tiny in number.

Jack Hunter claims: “The KKK is like 12 guys in a telephone booth. Liberal groups will use their threat for fundraising but it doesn’t exist. It hasn’t properly since the 1960s.” Roof’s actions say more about gun control, mental illness and the angst of the young than they do about popular, largely liberal views on race, as polling shows.

We can see a similar liberal shift in other areas of the historic culture war. In May 2015 Gallup released the results of a “moral acceptability” survey charting changes in national attitude across all age groups, from 2001 to 2015. Approval of gay relationships jumped from 40 to 63 per cent; having a baby out of wedlock from 45 to 61 per cent; sex between unmarried men and women from 53 to 68 per cent; doctor-assisted suicide from 49 to 56 per cent; even polygamy went from 7 to 16 per cent. Abortion remained narrowly disapproved of: support for access has only crept up from 42 to 45 per cent. This is probably a result of an unusual concentration of political and religious opposition and because it involves a potential life-or-death decision. But the general trend is that young people just don’t care as much about what consenting adults get up to.

Why? It might be because old forms of identity are dying. One way of measuring that is religious affiliation. From 2007 to 2014, according to Pew Research, the proportion of Americans describing themselves as Christian fell from 78 to 71 per cent. Today, only a quarter of the population is evangelical and 21 per cent Catholic, down despite high immigration. Then there is the decline in civic or communal activity. Since 2012, the organisers of Nascar, the stock-car races, have not published attendance figures at their tracks, probably because they have fallen so sharply. The decline of this most macho and working class of sports parallels the fall in conservative forms of collective identity such as southern traditionalism.

The old culture war was, like the racial politics of the old South, binary. In the 1950s, around the same time as the South invented its tradition of flying the battle flag in colleges, the US constructed an ideal of the “normal” nuclear family unit: straight, white, patriarchal, religious. On the other side was the “abnormal”: gay, black, feminist, atheist, and the rest. The surest way to get elected in the US between 1952 and 2004 was to associate yourself with the economic needs and cultural prejudices of the majority. The approach was once summed up by a Richard Nixon strategist thus: split the country in two and the Republicans will take the larger half. But that is changing. The old normal is no longer the cultural standard but just one of many identities to choose from. The races are mixing. Women want to work more and have children later in life, possibly without marriage. Many religious people are having to rethink their theology when a child comes out as gay. And the enforcers of the old ways – the unions, churches or political parties – are far less attractive than the atomising internet.

***

Politicians are scrabbling to keep up with the diffusion of American identity. Democrats got lucky when they nominated Barack Obama and chose a presidential candidate who reflected the fractured era well: interracial, non-denominational Christian, and so on. In the 2012 presidential race the Republicans got burned when they tried to play the old culture war card on abortion. They won’t repeat that mistake. After the Supreme Court legalised gay marriage across the country in June, the right’s response was not as uniformly loud and outraged as it would have been in the past. Some protested, but serious presidential contenders such as Jeb Bush grasped the implications of the defeat. There is a cultural and political realignment going on and no one is sure where it will lead. It’s encouraging caution among the Republican top brass. It is time, they think, to abandon lost causes.

The death of southern traditionalism is part of the ebb and flow of cultural history. Identities flourish and die. As political fashions change, you find the typically American mix of triumph on one side and jeremiad on the other. Richard Hines stood vigil as the battle flag was lowered in Columbia and noted with disgust the presence of what he described as “bussed-in” activists. “They pulled out all these gay pride flags and started shouting, ‘USA, USA, USA!’ It reminded me of the Bolshevik Revolution.”

Hines reckons that more southerners will now fly the flag than ever before and says he has attended overflow rallies of ordinary folks who love their region. He may well be correct. The faithful will keep the old Confederate standard fluttering on their lawns – an act of secession from the 21st century. But in the public domain, the battle flag is on its way down and in its place will be raised the standard of the new America. The rainbow flag flutters high. For now.

Tim Stanley is a historian and a columnist for the Telegraph

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars