
The paradox of fairness

Is the world a better place if the vicious suffer for their viciousness? And what exactly are just deserts?

For as far back as I can remember language, and uttered the very last time I saw her, one of my mother’s most repeated sentences was: “Every dog has its day.” She said it aloud to herself and to the knowing, listening universe, though, when I was in the room, her eyes might be pointing in my direction. It was an incantation, voiced in a low growl. There was something of a spell about it, but it was mainly an assertion of a fundamental and reassuring truth, a statement to vibrate and stand in the air against whatever injustice she had just suffered or remembered suffering. It was, I understood, a reiterated form of self-comfort to announce that justice, while taking its time, was inevitably to come; perhaps, too, a bit of a nudge for the lackadaisical force responsible for giving every dog its day.

Hers was an other-worldly view, of justice meted out from beyond the human sphere, held in this case by an uneducated non-observant Jewish woman with parents from the shtetl, but it is a foundational promise made by all three religions of the Book, and surely their most effective selling point. My mother’s recitation of her truth belonged with another, more impassioned phrase, which I recall her saying only when sitting in a chair, rocking back and forth, or in bed, rolling her head from side to side. “God! God! What have I done to deserve this?” Generally, unlike the harsh confidence of the first phrase, it was wept, sometimes screamed, mumbled madly, wailed, moaned, and usually repeated over and over again, whereas “Every dog has its day” needed saying only once, whenever the situation merited it. Both phrases were occasioned by my repeatedly philandering, disappearing, money-withholding conman father, and each marked opposite ends of the continuum of disappointment on which my mother lived.

I learned from this, in the first place, obviously, to sneak away so that I wouldn’t get dragged into the conversation and end up (perhaps not unjustly) as a substitute accusee for my father’s failure to care. But I learned also that she had certain expectations of the world: that the world properly consisted of a normality, and that the world had peculiarly failed her in respect of it.

From a very early age I already knew about the norms of the world, what it was supposed to be like, from nursery rhymes, fairy tales, books, films, television and radio. I knew that the most basic of all the norms was that fairness was to be expected. I doubt that I needed to be taught that; it was inward to me, never unknown, and I would guess that I knew it in some way even before I got the hang of language. I would also guess that it was the same for you.

I suppose what I importantly learned from my cursing and keening mother was that grown-ups still knew it, too. That fairness was not just one of those always suspect childish expectations – like money in return for a tooth, or a man coming down the chimney – that one grew out of.

At the same time, I learned from her that fairness was not an infallible fact of the world and that the most apparently fundamental essentials failed, yet the idea I got from my mother about this was that she (and sometimes I was included) was the only person on the planet whom the arranger of fairness had let down. All other husbands and fathers were true and trustworthy, everyone else had enough money to pay the rent and buy food, everyone else had relatives or friends who rallied round, so my mother often explicitly said. Everyone except her (and me, as the appendage inside her circle of misfortune).

It did rather astonish me that we should be so unfortunate to have been singled out, but I was also impressed that we should have received such special treatment from the universe. The stories and nursery rhymes had told me that bad things were supposed to happen to people who had done something to merit it. But my mother had no doubt she had done nothing, had been a helpless victim, yet the world was bad to her, and therefore bafflingly unfair. When she wailed to her personal inattentive God, “What have I done to deserve this?” she meant that she had done nothing.

It seemed that there was a crack in the heart of fairness, and she had fallen into it. She was innocent and deserving of better in her adulthood than her emotionally and economically impoverished childhood had provided, and yet she was receiving punishment and unhappiness in the form of my father and his bad behaviour. He, not she, deserved her misery, and yet, having disappeared and left us behind, he was living an apparently untroubled, unpunished life.

So I understood that on the one hand there was a rule of universal fairness, and every dog had to have its day, even if it was late in coming, and on the other hand that it was possible for some people to be delivered into unhappiness for no reason at all (as I grew older I understood it wasn’t just her and me). What was odd was the way my mother kept calling, albeit reproachfully, on this God who had so let her down.

I grew up to ditch the notion of a structural fairness, of a god or a nature that rewarded and punished on a moral basis. What occurred in people’s lives was consequent on their choices, their lack of choice, and the interrelation between the two, as well as high-or-low-risk-taking or simple arbitrary happenstance. I settled for a universe where narratives and meanings were fractured rather than based on moral cause and effect.

Lives were fragmented, subject to chance, not a continuing stream of moral repercussions, and although chance did have consequences, those consequences, too, were subject to chance. I recognised it in the devil’s distorting mirror from The Snow Queen which accidentally fell and broke into millions of splinters – a random shard falling into Kay’s eye but not into the eye of his friend Gerda. The story starts and takes its shape from a shrug of fate that knows nothing of you or what you deserve, but quite by accident or because of how our story-craving minds work, life could look as if it was conforming to moral judgement. I built a way to pass and describe my time around the rejection of the expectation of fairness, playing with the sharp edges of deconstructed fairy stories and tales children and adults are easily told. And I shook my head against those I came across who echoed my mother, such as, 30 years later, my mother-in-law, who contracted breast cancer at the age of 75 and asked, over and over, whenever I visited her: “How could this have happened to me? I’ve never done anything to deserve cancer.”

My attempt to grow up and away from the childishness of just deserts was, it goes without saying, no more than a position I took. It was necessary and useful, and allowed me to construct narratives that were more interesting to me than the most expected ones, but naturally I never did manage to do away with the sense of outrage against unfairness that I conclude is at the heart of self- and other-conscious life. I have to acknowledge the fundamental human desire for fairness, which, turned inwards, hampered my mother and which, turned outwards, causes people to work in danger and discomfort in places of war and hunger to improve imbalances of fortune.

Desert, the noun deriving from the verb “to deserve”, appears to be an essential human dynamic. It is at least a central anxiety that provides the plot for so many novels and films that depend on our sense that there is or should be such a thing. Like Kafka and Poe, Hitchcock repeatedly returns to the individual who is singled out, wrongly accused, an innocent suffering an injustice. Yet consider Montgomery Clift’s priest in I Confess, Henry Fonda in The Wrong Man, Blaney, the real killer’s friend played by Jon Finch in Frenzy, James Stewart in The Man Who Knew Too Much and Cary Grant in North by Northwest; none of them is – or could be according to Hitchcock’s Catholic upbringing – truly innocent of everything, and often their moral failings give some cause for the suspicion that falls on them. There is always a faint tang of consequence about their troubles.

We worry about people not getting what they deserve, but, due to religion or some essential guilt we carry with us, we are also concerned that there might be a deeper, less obvious basis for guilt that our everyday, human sense of justice doesn’t take into account. In Victorian fiction, Dickens and Hardy are masters of just and unjust deserts, as innocents such as Oliver Twist, David Copperfield, Tess of the D’Urbervilles and Jude the Obscure become engulfed by persecutory institutions and struggle, only sometimes with success, to find the life they ought, in a fair world, to have.

In Dickens, readers get a joyful reassurance after evil intent almost overcomes goodness but justice finally, though at the last moment, wins out by decency and coincidence. Hardy, in his covert modernism, offers no reassurance at all that his innocents’ day will come; his victims’ hopes and lives are snuffed out by forces such as nature and class that have no concern at all with the worth of individual lives and hopes. For both writers, however, the morally just or unjust result is usually an accident that works in or against the protagonist’s favour.

Every child ever born has at some time or other wailed, “It’s not fair.” To which the adults answer, “Life isn’t fair,” and always, surely, with a sense of sorrow and a vague feeling of betrayal, but also an understanding that a vital lesson is being imparted.

Fairness and desert are not exactly the same, I suppose; we might have a basic requirement for a generalised fairness – equality of opportunity, say – that has nothing to do with what anyone deserves, but our strangely inbuilt earliest sense of fairness provides our first encounter with the complexity of justice and injustice. Perhaps it arose even earlier than human consciousness. There are those who, like the primatologist Frans de Waal, suggest that a sense of fairness is an inherent emotion in monkeys:

An experiment with capuchin monkeys by Sarah Brosnan, of Georgia State University’s CEBUS Lab, and myself illuminated this emotional basis. These monkeys will happily perform a task for cucumber slices until they see others getting grapes, which taste so much better. They become agitated, throw down their measly cucumbers, and go on strike.

I’m not sure if this is exactly a sense of fairness. If so, it is a limited, unidirectional sense. Perhaps a sense of unfairness precedes the more general idea. I imagine a full sense of fairness would be demonstrated by a capuchin throwing her grapes down when she sees her fellow worker receiving cucumber. All for one and one for all. I couldn’t find any experiment that showed this.

A sense of personal unfairness may be all that is experienced by small children, too. It is always easy enough to come up with the idea that we have been morally mistreated. We manage to do it from a very young age and, like my mother-in-law, continue to the end of our lives. That others might deserve something is a more sophisticated thought. Usually, before any egalitarian fervour has a chance to emerge on its own, we have introduced the children, if not the monkeys, to the concept of desert. You get the grape for good behaviour, or helping with the washing-up, or not hitting your baby brother when he hits you, and you don’t get a grape if you throw a tantrum, or refuse to put on your socks. In this way, you and your brother get different amounts of goodness according to some very general rule that you are not much in a position to question, and the inherent problems of universal fairness are put into abeyance, except in the deepest dungeon of our consciousness.

There’s a revival of the childish sense of unfairness in adolescence when they/we cry, “I didn’t ask to be born.” To which we/they reply, again with an implication of just deserts: “The world doesn’t owe you a living.” But neither party explains how either statement asks or answers the difficulty of unfairness in the world.

I dare say all this harks back to our earliest desperation – the battle for the breast – with the helpless infant demanding that her hunger be assuaged and demanding comfort for her discomfort, the formerly helpless infant now in charge and having the capacity to deny it. It starts in a milky muddle and goes on to just des(s)erts. It is astonishing, actually, that the word for pudding in English is not, as it plainly ought to be, related to the desert that is getting what you deserve.

Nevertheless, eventually the hard-learned reward and punishment system becomes social glue and enters into the world as law and civic organisation, as a clumsy attempt to solve the insoluble. The legislation starts early, in the family, and is a necessity in the community and the state, because, in any unlegislated situation, goodness and altruism are not necessarily rewarded on an individual level. Payback, positive and negative, is rarely found in the wild, and only sometimes in what we call civilisation. Cheats very often prosper and an eye for an eye is a brutal, primitive formulation that advanced cultures (us, as we like to think of ourselves) reject as a kind of exact justice that lacks all mature consideration of circumstances. Yahweh hardly applied fairness when he repaid Job’s devotion with vastly incommensurate loss just to win a bet with Satan. And certainly the knotted family romance that is the basis for Judaism, Christianity and Islam, involving Abram, Sara, Isaac, Hagar, Ishmael and Yahweh, is resolved only by Abram’s adultery with Hagar, then Hagar’s expulsion with her son, nearly ending in their death, and the near-filicide of Isaac. All the victims are as completely innocent as human beings and God can be.

In an attempt properly to get to grips with the idea of fairness, justice and desert, I have recently been struggling with the story of Amos, Boris, Claire and Zoey. They are the protagonists in a drama plotted by Shelly Kagan in his new book, The Geometry of Desert. To simplify, but only slightly, all four of them are injured by an explosion at work. A fifth person, You, comes along with a syringe containing a single dose of painkiller, while they wait in agony for the ambulance.

There is no point in giving everybody a little bit; it won’t help any of them enough. Whom do you give the single useful dose to? At this point, the devastation fades into the background and we learn that Amos was hurt as he happened to walk past an explosion from a device planted by the disgruntled or revolutionary Boris, who failed to get away in time, and that Claire, who instigated the bomb attack and set off the detonator, stood too close and was also injured by the blast, while Zoey came on the horrible scene and was wounded by a second blast as she was trying to go to the aid of the other three. The carnage can now return to the forefront of your mind and you have to choose whom to help with your exiguous morphine supply.

The first thing that should become clear before you start mulling over whom to assist is that you are, in fact, in the middle of a philosophical thought experiment. If you are, like me, a novelist with a resistance (as well as a – probably related – hopelessly inept attraction) to this kind of theoretical reasoning, you might reject the entire scenario, because it never happened and your plot is as good as anyone else’s. No, you think, as if you were back in school rebelling against the insistence that all lines meeting at infinity is a given, I don’t have to make any choice at all.

The bomb at the factory didn’t go off. It was never set in the first place. Boris and Claire are gentle vegans who have no animus that would impel them to set a bomb, and no one is hurt. Amos, Boris, Claire and Zoey can continue their ordinary daily business, perhaps never even meeting, or, if they do, knowing nothing about the drama that never happened and which they all failed to be involved in. Or perhaps each of them becomes the protagonist of his or her own novel of which You, and not Shelly Kagan, are the author – the A, B, C, Z Quartet.

In my version, You’s choices are broadened infinitely, there is no given, and I can simply refuse the parameters of the thought experiment because I am not a philosopher, I do not wish to be restricted to the terms set by someone else for their own didactic purposes, and likely I’ve got several deadlines that don’t depend on figuring out how much or how little guilt deserves the morphine and why. And so, once again, I fail to get to grips with academic philosophy.

The Geometry of Desert considers both the fundamental and the complex nature of deserving. Kagan poses familiar questions initially (what makes one person more deserving than another?; what is it that the more deserving deserve more of?; does anyone deserve anything at all?) and then puts them aside in order to examine the underlying complexity of desert by means of graphs that represent his elaborately anatomised notion of desert and all the possible implications and interactions between its teased-apart elements. This graphical representation of desert is, he says, the most important and original part, and the point, of his book.

Which I dare say it is, but I got no further than my enjoyment and childish rejection of the initial elementary narrative. If I were Alice, the Wonderland in which I find myself wandering, enchanted but fearful and utterly baffled, would be geometry, algebra and (as Alice also encounters) formal logic. I am, if it is possible, spatially challenged. Maps and reality completely fail to come together in my brain. My eyes tear up and the trauma of school maths lessons returns to me as Kagan translates away from situation to number and algebraic representation to devise graphs whose plotted lines meander across their grid in a, to me, mysterious arithmetical relation to each other. I’m rubbish at all things numerical and graphical and, with all the will in the world, which I started off with, I could no more have read the greater part of Kagan’s book with comprehension than I could read the Bhagavadgita in Sanskrit.

And yet and yet, I can’t get away from the foothills of desert. I can’t shake off the elementary problems that Amos, Boris, Claire and Zoey create, lying there, waiting for that ambulance, me with a hypodermic full of morphine still in my pocket. Amos and Zoey innocent as lambs, but perhaps Zoey more innocent, having put herself in harm’s way in order to help the others? Boris and Claire guilty, for sure, but is Claire, the initiator of the harmful event, more guilty than Boris the foot soldier? Does it go without saying that I should perform a moral triage in order to decide which sufferer to give the morphine, based on the hierarchy of guilt and innocence? Kagan calls it “Fault Forfeits First”, so that Zoey would be first in line for the morphine and Claire and Boris at the back of the queue. But he points out a basic division in Fault Forfeits First, between the “retributionists” and “moderates” who subscribe to that belief.

The retributionists would not give Claire or Boris any morphine even if some were left over after soothing Zoey’s pain, because they deserve to suffer, having caused suffering. The moderates believe that no one should suffer, but that the innocent Amos and Zoey should be helped first if a choice has to be made. The world, the moderates believe, I believe and perhaps you believe, is improved by an improvement in everyone’s well-being. The retributionists think that the world is a better place if the vicious suffer for their viciousness.

But, as John Rawls claimed early in his career, unless you completely accept free will in people’s behaviour, unclouded by fortune or misfortune in birth, education or life experience, it is possible that no one deserves anything as a result of his actions, good or bad. The first instinct is to give Zoey the painkiller, other things being equal. Other things being equal is the problem. Why, when you come to think of it, does Zoey deserve less pain or more well-being on account of her good will? Did she have a particularly fortunate upbringing or, indeed, an unfortunate one that inclined her to acts of benevolence? No one is culturally, genetically free of influence. In any case, she had no intention of being injured when she went to help. And who knows why Claire, who conceived a bomb and detonated it, became the person she did?

How do we know (butterfly wings beating in the rainforest, and all that) if there might not be something we are not aware of that would make it more beneficial to give Claire the morphine? What if she has information about other bombs that have been planted? And what if, given an “undeserved” benefit, she came to rethink her viciousness? There may be more purely angelic joy in heaven over such a change of heart, but there are also very good practical reasons to rejoice far more, here on earth, over the redemption of one sinner than over 99 people who do not need to repent.

The retributionists and the moderates believe as they do for the same complicated reasons as the good and the vicious. In the practical world, getting just deserts is enshrined in legislation, and justice is separated from fairness, precisely to avoid the endless entailments of the philosophy of desert. It isn’t so surprising that there have been 20 seasons of Law and Order, which in every episode neatly segments and plays out the uncertainties of policing wrongdoing and providing justice. Finally, I suppose, we have to settle for the muddle of “good enough” fairness, while thinking and trying for something better. But don’t try telling that to my mother.

Jenny Diski’s most recent book is “What I Don’t Know About Animals” (Virago, £9.99)

This article first appeared in the 01 April 2013 issue of the New Statesman, Easter Special Issue


Goodbye to the Confederate flag

After the shootings in Charleston, the Republican right showed it was finally ready to reject the old symbols of the Confederacy.

On 27 June, an African-American activist named Bree Newsome woke up before dawn, put on her climbing equipment and scaled a 30-foot flagpole on the lawn of State House in Columbia, South Carolina. She then removed the Confederate battle flag that flew from it. “We can’t wait any longer,” she explained later in an online statement. “It’s time for a new chapter where we are sincere about dismantling white supremacy.”

After she was led away in handcuffs, the flag was raised again.

Newsome’s protest reflected a growing impatience within America’s black community and anger about liberal inaction. Political rallies by the Democratic presidential contenders Hillary Clinton and Bernie Sanders have been disrupted by the Black Lives Matter campaign against violence committed on young African Americans and the cultural and legal biases that justify it. While promoting his book on race in the US, the writer Ta-Nehisi Coates argued that, to African Americans, the battle flag represents a lingering attempt “to bury the fact that half this country thought it was a good idea to raise an empire rooted in slavery”.

Yet, on this matter, to everyone’s surprise, the black civil rights movement and many southern Republicans have proved to be of one mind. On 9 July the House of Representatives in South Carolina voted to lower the battle flag for good. It stood, representatives said, for racism. It had to go.

The context of this agreement was a painful one. Ten days before Newsome’s act, a 21-year-old white man named Dylann Roof shot and killed nine black worshippers at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina. According to his room-mate, he wanted to start a race war. The TV screens showed a photo of him holding a gun in one hand and a Confederate battle flag in the other.

If the demands for redress made by civil rights groups didn’t come as a surprise, conservative acquiescence did. The Republican Party had built a solid base in the South by courting white voters who cherished the memory of the Confederacy. Yet the party’s presidential hopefuls from both the North and the South – including Jeb Bush, Lindsey Graham, Scott Walker and George Pataki – said that the battle flag ought to be lowered. The most striking intervention was made by the governor of South Carolina, Nikki Haley, who denounced the use of the Confederate flag and signed the bill removing it. Haley is now tipped to figure on the list of potential vice-presidential nominees.

The volte-face of the US right is in part a result of the horror of the Charleston shootings. Yet it also occurs in the context of major shifts within American society. There are still many conservatives who will defend Confederate heritage as a matter of southern pride but the culture wars are changing as the US becomes increasingly European in outlook. This is taking place across the country. It just happens to be more pronounced in the South because no other region has fought so violently and so long to resist the liberal tide.

The story of the battle flag is the story of the South. The first official Confederate flag used in the civil war of 1861-65 caused confusion during fighting – through the haze of gun smoke, its design of 13 stars and red and white bars was hard to distinguish from the Stars and Stripes. An alternative blue cross was rejected for being too sectarian; the racist Confederacy was anxious not to offend its Jewish citizens. So the cross became a diagonal X. This flag was never officially adopted by the Confederate army. In the years after the war its use was infrequent.

There was little need to visualise southern difference in a flag. It was self-evident in the physical signs of racial segregation: separate schools, pools and drinking fountains; black people confined to the back of the bus. Political displays of the battle flag of Dixie (the historical nickname for the states that seceded from the Union) only really resurfaced when that racial order was challenged by northern liberals. In 1948, the Democrats – then the party overwhelmingly in control of the South – split over modest calls for civil rights. The conservatives who refused to support that year’s presidential ticket, the “Dixiecrats”, triggered a revival of flag-waving across the region.

The old battle flag suddenly appeared on private lawns, on cars and at political rallies. Supposedly ancient cultural traditions were invented overnight. For instance, the 1948 student handbook of the University of Mississippi confessed: “Many Ole Miss customs are fairly new; they lack only the savouring which time brings . . . Ole Miss has adopted the Confederate flag as a symbol of the Mississippi spirit. Each football game finds the scarlet flag frantically waving to the rhythm of the Rebel band.”

I can confirm that this “tradition” was still going as recently as in 2005. That year, I attended an American football game at Ole Miss and was surprised when the band played “Dixie” at the end. White boys and white girls stood up and belted out the folk song of the Confederacy, while black students filed out.

In 1958, South Carolina made it a crime to desecrate the battle flag. Three years later, on the 100th anniversary of the outbreak of the civil war, it was hoisted above its Capitol building in Columbia. That day, there was a struggle in the US Congress to keep federal funding going for segregated schools.

So clear is the link between the postwar white resistance to civil rights and the battle flag that many see it as the symbolic equivalent of the N-word. Jack Hunter, the editor of the conservative website Rare Politics, says: “Some people insist that it’s not about racism, not about slavery, not about segregation. But it’s about all those things.” Hunter grew up in Charleston and used to skateboard in the car park of the church that Dylann Roof attacked. When he was a young journalist, he appeared on local radio as a rabidly right-wing masked character called “the Southern Avenger”. His past was exposed in 2013 while he was working for Rand Paul, a Republican presidential candidate, and Hunter stepped down from his position. He publicly renounced his youthful association with racial conservatism. He now eschews any romanticism about the Confederate cause and its demand for states’ rights. “States’ rights to do what?” he asks: the right to discriminate against African Americans? He is glad that the State House flag is gone. He ascribes its longevity to ignorance, which was corrected by Roof’s rampage: “It was the first time that [southern Republicans] were able to see a different perspective on this symbol.”

Not everyone agrees. Richard Hines – a former South Carolina legislator, Reagan campaign state co-chair and senior activist with the Sons of Confederate Veterans – insists that the flag is “an enduring symbol of the southern fighting man”. Indeed, a poll in July found that 57 per cent of Americans think it stands for southern heritage, rather than racism. Yet that heritage has a political dimension. “Southern people are proud of who they are and there is a leftist assault to destroy the best part of America,” Hines says. “The Trotskyite elite in control of the establishment wants to root out the southern tradition” – a tradition of religious devotion, chivalry and military honour. It is possible to cast the battle flag as a pawn in a much larger cultural conflict.

In 2000, civil rights activists lobbied hard to get the battle flag removed from the top of the South Carolina Capitol and succeeded in having it shrunk in size and relocated to the grounds of State House. The issue came up in that year’s Republican presidential primaries – an unusually poisonous contest between George W Bush and John McCain. Supporters of Bush put out a false story that McCain had fathered an interracial child out of wedlock. McCain added to his woes by opining that the battle flag was “a symbol of racism and slavery”. An organisation called Keep It Flying flooded the state with 250,000 letters attacking him and he lost the crucial competition here to Bush.

The battle flag has retained a strong emotional power for a long time. This makes the Republican establishment’s abandonment of the flag all the more surprising. Then again, those who run the South are probably the people most likely to grasp how much the region has changed in just a decade.

***

In 2010 I took a trip through North Carolina. The landscape told a story. Dotted along the roadside were abandoned black buildings, the old tobacco sheds. The decline of the rural economy had rendered them obsolete. Over the fields that would once have been full of farmers were freshly tarmacked roads, stretching out to nowhere. My guide explained that these were supposed to be cul-de-sacs for new houses. North Carolina was going through a property boom. But who was going to buy all those homes, I asked? The answer: damn Yankees.

Demography is destiny. This once agricultural region developed fast from the 1960s onwards by keeping union membership, taxes and regulation as low as possible. Yet capitalism proved disastrous for southern conservatism. Northerners flooded in, seeking work or retirement and bringing their own values. The forecast is that North Carolina’s Research Triangle – the South’s Silicon Valley – will grow by 700,000 jobs and 1.2 million people in two decades.

White migration was accompanied by an influx of Spanish speakers as the service sector flourished. Between 2000 and 2010, the white share of the population of North Carolina fell from 70 to 65 per cent. The black proportion remained at roughly 21 per cent. The Latino proportion, however, jumped from 4.7 per cent to 8.4 per cent. Today, the proportion of people who are non-white and over 60 is about a third. But it’s approaching nearly half for those under 18. As a result, politics in the South is no longer biracial: a contest between white and black. It is increasingly multiracial and uncoupled from the region’s complex past.

The impact of these changes is reflected in voting patterns. In 2000, the South was still overwhelmingly Republican in presidential contests. Even the Democratic nominee, Al Gore, a southerner, lost his home state of Tennessee. But in 2008 and 2012, Barack Obama took those states with the fastest-changing demographics: Florida and Virginia. He won North Carolina in 2008 and lost it in 2012 – but by less than 100,000 votes. It is true that the Republicans won back control in the 2014 midterm elections, with the result that the Deep South now sends few Democrats to Congress; but the region’s political masters are not quite as traditional-minded as they once were.

The Republican relationship with the Confederate past is complex. As the party of Abraham Lincoln and the Union, the GOP’s southern support was historically small. But in the 1960s the national Democratic Party embraced civil rights and alienated its once loyal southern following; the Republicans took the opportunity to steal some conservative white voters.

The growing southern Republican vote had a class component. Its success in local and congressional races was built more on winning over middle-class moderates than on appealing to the working-class racists who filled the ranks of the Ku Klux Klan. The southern Republican Party did enthusiastically embrace the Confederate battle flag in many quarters. But some office-holders did so only ambiguously, while large sections of the party never identified with it at all. The period of Republican ascendancy in the South was, in reality, linked with a softening of the region’s racial politics.

Two of the Republicans’ current southern stars are Indian Americans: Bobby Jindal, the governor of Louisiana, and Nikki Haley, the anti-flag governor of South Carolina. There are just two black people in the US Senate and one of them is a Republican, the Tea Party-backed senator for South Carolina, Tim Scott. Marco Rubio, the Floridian senator and presidential candidate, is Cuban American, and the former Florida governor Jeb Bush is married to a Mexican-born woman and speaks fluent Spanish. Bush has tried to push a more moderate line on immigration, recognising that the GOP will struggle to win the White House if it appeals only to angry white voters. The Kentucky libertarian senator Rand Paul, Jack Hunter’s former boss, has called for legal reforms to correct the trend of imprisoning far more black than white people. And he is not the only Republican to have been moved by recent race riots sparked by police violence.

***

Violence on the streets of Ferguson, Missouri, and Baltimore, Maryland, confirmed that there still is a culture war in the US. Yet its character has changed. In the past, civil disturbances were typically leapt upon by conservative politicians as evidence of social decline. The 1992 LA riots were blamed on single parenthood and rap lyrics. In contrast, conservative leaders today are far more likely to acknowledge the problems of white racism. There is no place in their ranks for the likes of Dylann Roof. White supremacists are tiny in number.

Jack Hunter claims: “The KKK is like 12 guys in a telephone booth. Liberal groups will use their threat for fundraising but it doesn’t exist. It hasn’t properly since the 1960s.” Roof’s actions say more about gun control, mental illness and the angst of the young than they do about popular, largely liberal views on race, as polling shows.

We can see a similar liberal shift in other areas of the historic culture war. In May 2015 Gallup released the results of a “moral acceptability” survey charting changes in national attitude across all age groups, from 2001 to 2015. Approval of gay relationships jumped from 40 to 63 per cent; having a baby out of wedlock from 45 to 61 per cent; sex between unmarried men and women from 53 to 68 per cent; doctor-assisted suicide from 49 to 56 per cent; even polygamy went from 7 to 16 per cent. Abortion remained narrowly disapproved of: support for access has only crept up from 42 to 45 per cent. This is probably a result of an unusual concentration of political and religious opposition and because it involves a potential life-or-death decision. But the general trend is that young people just don’t care as much about what consenting adults get up to.

Why? It might be because old forms of identity are dying. One way of measuring that is religious affiliation. From 2007 to 2014, according to Pew Research, the proportion of Americans describing themselves as Christian fell from 78 to 71 per cent. Today, only a quarter of the population is evangelical and 21 per cent Catholic, a decline despite high immigration. Then there is the decline in civic or communal activity. Since 2012, the organisers of Nascar, the stock-car races, have not published attendance figures at their tracks, probably because they have fallen so sharply. The decline of this most macho and working-class of sports parallels the fall in conservative forms of collective identity such as southern traditionalism.

The old culture war was, like the racial politics of the old South, binary. In the 1950s, around the same time as the South invented its tradition of flying the battle flag in colleges, the US constructed an ideal of the “normal” nuclear family unit: straight, white, patriarchal, religious. On the other side was the “abnormal”: gay, black, feminist, atheist, and the rest. The surest way to get elected in the US between 1952 and 2004 was to associate yourself with the economic needs and cultural prejudices of the majority. The approach was once summed up by a Richard Nixon strategist thus: split the country in two and the Republicans will take the larger half. But that is changing. The old normal is no longer the cultural standard but just one of many identities to choose from. The races are mixing. Women want to work more and have children later in life, possibly without marriage. Many religious people are having to rethink their theology when a child comes out as gay. And the enforcers of the old ways – the unions, churches or political parties – are far less attractive than the atomising internet.

***

Politicians are scrabbling to keep up with the diffusion of American identity. The Democrats got lucky with Barack Obama, a presidential candidate who reflected the fractured era well: interracial, non-denominational Christian, and so on. In the 2012 presidential race the Republicans got burned when they tried to play the old culture war card on abortion. They won’t repeat that mistake. After the Supreme Court legalised gay marriage across the country in June, the right’s response was not as uniformly loud and outraged as it would have been in the past. Some protested, but serious presidential contenders such as Jeb Bush grasped the implications of the defeat. There is a cultural and political realignment going on and no one is sure where it will lead. It’s encouraging caution among the Republican top brass. It is time, they think, to abandon lost causes.

The death of southern traditionalism is part of the ebb and flow of cultural history. Identities flourish and die. As political fashions change, you find the typically American mix of triumph on one side and jeremiad on the other. Richard Hines stood vigil as the battle flag was lowered in Columbia and noted with disgust the presence of what he described as “bussed-in” activists. “They pulled out all these gay pride flags and started shouting, ‘USA, USA, USA!’ It reminded me of the Bolshevik Revolution.”

Hines reckons that more southerners will now fly the flag than ever before and says he has attended overflow rallies of ordinary folks who love their region. He may well be correct. The faithful will keep the old Confederate standard fluttering on their lawns – an act of secession from the 21st century. But in the public domain, the battle flag is on its way down and in its place will be raised the standard of the new America. The rainbow flag flutters high. For now.

Tim Stanley is a historian and a columnist for the Telegraph

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars