
Why futurologists are always wrong – and why we should be sceptical of techno-utopians

From predictions of AI within 20 years to mass starvation in the 1970s, those who foretell the future often come close to doomsday preachers.

Image: Randy Mora

In his book The Future of the Mind, the excitable physicist and futurologist Michio Kaku mentions Darpa. This is America’s Defence Advanced Research Projects Agency, the body normally credited with creating, among other things, the internet. It gets Kaku in a foam of futurological excitement. “The only justification for its existence is . . .” he says, quoting Darpa’s strategic plan, “to ‘accelerate the future into being’ ”.

This isn’t quite right (and it certainly isn’t literate). What Darpa actually says it is doing is accelerating “that future into being”, the future in question being the specific requirements of military commanders. This makes more sense but is no more literate than Kaku’s version. Never mind; Kaku’s is a catchy phrase. It is not strictly meaningful – the future will arrive at its own pace, no matter how hard we press the accelerator – but we know what he is trying to mean. Technological projects from smartphones to missiles can, unlike the future, be accelerated and, in Kaku’s imagination, such projects are the future.

Meanwhile, over at the Googleplex, the search engine’s headquarters in Silicon Valley, Ray Kurzweil has a new job. He has been hired by Google to “work on new projects involving machine learning and language processing”.

For two reasons I found this appointment pretty surprising. First, I had declined to review Kurzweil’s recent book How to Create a Mind – the basis for Google’s decision to hire him – on the grounds that it was plainly silly, an opinion then supported by a sensationally excoriating review by the philosopher Colin McGinn for the New York Review of Books which pointed out that Kurzweil knew, to a rough approximation, nothing about the subject. And, second, I am not sure a religious fanatic is quite the right man for the job.

OK, Kurzweil doesn’t say he is religious but, in reality, his belief system is structurally identical to that of the Southern hot gospellers who warn of the impending “Rapture”, the moment when the blessed will be taken up into paradise and the rest of us will be left to seek salvation in the turmoil of the Tribulation before Christ returns to announce the end of the world. Kurzweil’s idea of “the singularity” is the Rapture for geeks – in this case the blessed will create an intelligent computer that will give them eternal life either in their present bodies or by uploading them into itself. Like the Rapture, it is thought to be imminent. Kurzweil forecasts its arrival in 2045.

Kaku and Kurzweil are probably the most prominent futurologists in the world today. They are the heirs to a distinct tradition which, in the postwar world, has largely focused on space travel, computers, biology and, latterly, neuroscience.

Futurologists are almost always wrong. Indeed, Clive James invented a word – “Hermie” – to denote an inaccurate prediction by a futurologist. This was an ironic tribute to the cold war strategist and, in later life, pop futurologist Herman Kahn. It was slightly unfair, because Kahn made so many fairly obvious predictions – mobile phones and the like – that it was inevitable quite a few would be right.

Even poppier was Alvin Toffler, with his 1970 book Future Shock, which suggested that the pace of technological change would cause psychological breakdown and social paralysis, not an obvious feature of the Facebook generation. Most inaccurate of all was Paul R Ehrlich who, in The Population Bomb, predicted that hundreds of millions would die of starvation in the 1970s. Hunger, in fact, has since declined quite rapidly.

Perhaps the most significant inaccuracy concerned artificial intelligence (AI). In 1965 the polymath Herbert Simon predicted that “machines will be capable, within 20 years, of doing any work a man can do” and in 1967 the cognitive scientist Marvin Minsky announced that “within a generation . . . the problem of creating ‘artificial intelligence’ will substantially be solved”. Yet, in spite of all the hype and the dizzying increases in the power and speed of computers, we are nowhere near creating a thinking machine.

Such a machine is the basis of Kurzweil’s singularity, but futurologists seldom let the facts get in the way of a good prophecy. Or, if they must, they simply move on. The nightmarishly intractable problem of space travel has more or less killed that futurological category and the unexpected complexities of genetics have put that on the back burner for the moment, leaving neuroscientists to take on the prediction game. But futurology as a whole is in rude health despite all the setbacks.

Why? Because there’s money in it; money and faith. I don’t just mean the few millions to be made from book sales; nor do I mean the simple geek belief in gadgetry. And I certainly don’t mean the pallid, undefined, pop-song promises of politicians trying to turn our eyes from the present – Bill Clinton’s “Don’t stop thinking about tomorrow” and Tony Blair’s “Things can only get better”. No, I mean the billions involved in corporate destinies and the yearning for salvation from our human condition. The future has become a land-grab for Wall Street and for the more dubious hot gospellers who have plagued America since its inception and who are now preaching to the world.

Take the curious phenomenon of the Ted talk. Ted – Technology, Entertainment, Design – is a global lecture circuit propagating “ideas worth spreading”. It is huge. Half a billion people have watched the 1,600 Ted talks that are now online. Yet the talks are almost parochially American. Some are good but too many are blatant hard sells and quite a few are just daft. All of them lay claim to the future; this is another futurology land-grab, this time globalised and internet-enabled.

Benjamin Bratton, a professor of visual arts at the University of California, San Diego, has an astrophysicist friend who made a pitch to a potential donor of research funds. The pitch was excellent but he failed to get the money because, as the donor put it, “You know what, I’m gonna pass because I just don’t feel inspired . . . you should be more like Malcolm Gladwell.” Gladwellism – the hard sell of a big theme supported by dubious, incoherent but dramatically presented evidence – is the primary Ted style. Is this, wondered Bratton, the basis on which the future should be planned? To its credit, Ted had the good grace to let him give a virulently anti-Ted talk to make his case. “I submit,” he told the assembled geeks, “that astrophysics run on the model of American Idol is a recipe for civilisational disaster.”

Bratton is not anti-futurology like me; rather, he is against simple-minded futurology. He thinks the Ted style evades awkward complexities and evokes a future in which, somehow, everything will be changed by technology and yet the same. The geeks will still be living their laid-back California lifestyle because that will not be affected by the radical social and political implications of the very technology they plan to impose on societies and states. This is a naive, very local vision of heaven in which everybody drinks beer and plays baseball and the sun always shines.

The reality, as the revelations of the National Security Agency’s near-universal surveillance show, is that technology is just as likely to unleash hell as any other human enterprise. But the primary Ted faith is that the future is good simply because it is the future; not being the present or the past is seen as an intrinsic virtue.

Bratton, when I spoke to him, described some of the futures on offer as “anthrocidal” – indeed, Kurzweil’s singularity is often celebrated as the start of a “post-human” future. We are the only species that actively pursues and celebrates the possibility of its own extinction.

Bratton was also very clear about the religiosity that lies behind Tedspeak. “The eschatological theme within all this is deep within the American discourse, a positive and negative eschatology,” he said. “There are a lot of right-wing Christians who are obsessed with the Mark of the Beast. It’s all about the Antichrist . . . Maybe it’s more of a California thing – this messianic articulation of the future is deep within my culture, so maybe it is not so unusual to me.”

Bratton also speaks of “a sort of backwash up the channel back into academia”. His astrophysicist friend was judged by Ted/Gladwell values and found wanting. This suggests a solution to the futurologists’ problem of inaccuracy: they actually enforce rather than merely predict the future by rigging the entire game. It can’t work, but it could do severe damage to scientific work before it fails.

Perhaps even more important is the political and social damage that may be done by the future land-grab being pursued by the big internet companies. Google is the leading grabber simply because it needs to keep growing its primary product – online advertising, in which it already holds a global monopoly. Eric Schmidt, having been displaced as chief executive, is now, as executive chairman, effectively in charge of global PR. His futurological book The New Digital Age, co-written with Jared Cohen, came decorated with approving quotes from Richard Branson, Henry Kissinger, Tony Blair and Bill Clinton, indicating that this is the officially approved future of the new elites, who seem, judging by the book’s contents, intent on their own destruction – oligocide rather than anthrocide.

For it is clear from The New Digital Age that politicians and even supposedly hip leaders in business will have very little say in what happens next. The people, of course, will have none. Basically, most of us will be locked in to necessary digital identities and, if we are not, we will be terrorist suspects. Privacy will become a quaint memory. “If you have something that you don’t want anyone to know,” Schmidt famously said in 2009, “maybe you shouldn’t be doing it [online] in the first place.” So Google elects itself supreme moral arbiter.

Tribalism in the new digital age will increase and “disliked communities” will find themselves marginalised. Nobody seems to have any oversight over anything. It is a hellish vision but the point, I think, is that it is all based on the assumption that companies such as Google will get what they want – absolute and unchallengeable access to information.

As the book came out, Larry Page, the co-founder of Google, unwisely revealed the underlying theme of this thinking in a casual conversation with journalists. “A law can’t be right,” he said, “if it’s 50 years old. Like, it’s before the internet.” He also suggested “we should set aside some small part of the world”, which would be free from regulation so that Googlers could get on with hardcore innovation. Above the law and with their own island state, the technocrats could rule the world with impunity. Peter Thiel, a co-founder of PayPal, is trying to make exactly that happen with his Seasteading Institute, which aims to build floating cities in international waters. “An open frontier,” he calls it, “for experimenting with new ideas in government.” If you’re an optimist this is just mad stuff; if you’re a pessimist it is downright evil.

One last futurological, land-grabbing fad of the moment remains to be dealt with: neuroscience. It is certainly true that scanners, nanoprobes and supercomputers seem to be offering us a way to invade human consciousness, the final frontier of the scientific enterprise. Unfortunately, those leading us across this frontier are dangerously unclear about the meaning of the word “scientific”.

Neuroscientists now routinely make claims that are far beyond their competence, often prefaced by the words “We have found that . . .” The two most common of these claims are that the conscious self is an illusion and that there is no such thing as free will. “As a neuroscientist,” Professor Patrick Haggard of University College London has said, “you’ve got to be a determinist. There are physical laws, which the electrical and chemical events in the brain obey. Under identical circumstances, you couldn’t have done otherwise; there’s no ‘I’ which can say ‘I want to do otherwise’.”

The first of these claims is easily dismissed – if the self is an illusion, who is being deluded? The second has not been established scientifically – all the evidence on which the claim is made is either dubious or misinterpreted – nor could it be established, because none of the scientists seems to be fully aware of the complexities of definition involved. In any case, the self and free will are foundational elements of all our discourse and that includes science. Eliminate them from your life if you like but, by doing so, you place yourself outside human society. You will, if you are serious about this displacement, not be understood. You will, in short, be a zombie.

Yet neuroscience – as in Michio Kaku’s manic book of predictions – is now one of the dominant forms of futurological chatter. We are, it is said, on the verge of mapping, modelling and even replicating the human brain and, once we have done that, the mechanistic foundations of the mind will be exposed. Then we will be able to enhance, empower or (more likely) control the human world in its entirety. This way, I need hardly point out, madness lies.

The radicalism implied, though not yet imposed, by our current technologies is indeed as alarming to the sensitive and thoughtful as it is exciting to the geeks. Benjamin Bratton is right to describe some of it as anthrocidal; both in the form of “the singularity” and in some of the ideas coming from neuroscience, the death of the idea of the human being is involved. If so, it is hard to see why we should care: the welfare of a computer or the fate of a neuroscientifically specified zombie would not seem to be pressing matters. In any case, judging by past futurologies, none of these things is likely to happen.

What does matter is what our current futurologies say about the present. At one level, they say we are seriously deluded. As Bratton observes, the presentational style of Ted and of Gladwell involves embracing radical technologies while secretly believing that nothing about our own cherished ways of life will change; the geeks will still hang out, crying “Woo-hoo!” and chugging beer when the gadgets are unveiled.

At another level, futurology implies that we are unhappy in the present. Perhaps this is because the constant, enervating downpour of gadgets and the devices of the marketeers tell us that something better lies just around the next corner and, in our weakness, we believe. Or perhaps it was ever thus. In 1752, Dr Johnson mused that our obsession with the future may be an inevitable adjunct of the human mind. Like our attachment to the past, it is an expression of our inborn inability to live in – and be grateful for – the present.

“It seems,” he wrote, “to be the fate of man to seek all his consolations in futurity. The time present is seldom able to fill desire or imagination with immediate enjoyment, and we are forced to supply its deficiencies by recollection or anticipation.”

Bryan Appleyard is the author of “The Brain Is Wider Than the Sky: Why Simple Solutions Don’t Work in a Complex World” (Phoenix, £9.99)

© MARK PETERSON/REDUX/EYEVINE

Goodbye to the Confederate flag

After the shootings in Charleston, the Republican right showed it was finally ready to reject the old symbols of the Confederacy.

On 27 June, an African-American activist named Bree Newsome woke up before dawn, put on her climbing equipment and scaled a 30-foot flagpole on the lawn of State House in Columbia, South Carolina. She then removed the Confederate battle flag that flew from it. “We can’t wait any longer,” she explained later in an online statement. “It’s time for a new chapter where we are sincere about dismantling white supremacy.”

After she was led away in handcuffs, the flag was raised again.

Newsome’s protest reflected a growing impatience within America’s black community and anger about liberal inaction. Political rallies by the Democratic presidential contenders Hillary Clinton and Bernie Sanders have been disrupted by the Black Lives Matter campaign against violence committed against young African Americans and the cultural and legal biases that justify it. While promoting his book on race in the US, the writer Ta-Nehisi Coates argued that, to African Americans, the battle flag represents a lingering attempt “to bury the fact that half this country thought it was a good idea to raise an empire rooted in slavery”.

Yet, on this matter, to everyone’s surprise, the black civil rights movement and many southern Republicans have proved to be of one mind. On 9 July the House of Representatives in South Carolina voted to lower the battle flag for good. It stood, representatives said, for racism. It had to go.

The context of this agreement was a painful one. Ten days before Newsome’s act, a 21-year-old white man named Dylann Roof shot and killed nine black worshippers at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina. According to his room-mate, he wanted to start a race war. The TV screens showed a photo of him holding a gun in one hand and a Confederate battle flag in the other.

If the demands for redress made by civil rights groups didn’t come as a surprise, conservative acquiescence did. The Republican Party had built a solid base in the South by courting white voters who cherished the memory of the Confederacy. Yet the party’s presidential hopefuls from both the North and the South – including Jeb Bush, Lindsey Graham, Scott Walker and George Pataki – said that the battle flag ought to be lowered. The most striking intervention was made by the governor of South Carolina, Nikki Haley, who denounced the use of the Confederate flag and signed the bill removing it. Haley is now tipped to figure on the list of potential vice-presidential nominees.

The volte-face of the US right is in part a result of the horror of the Charleston shootings. Yet it also occurs in the context of major shifts within American society. There are still many conservatives who will defend Confederate heritage as a matter of southern pride but the culture wars are changing as the US becomes increasingly European in outlook. This is taking place across the country. It just happens to be more pronounced in the South because no other region has fought so violently and so long to resist the liberal tide.

The story of the battle flag is the story of the South. The first official Confederate flag used in the civil war of 1861-65 caused confusion during fighting – through the haze of gun smoke, its design of 13 stars and red and white bars was hard to distinguish from the Stars and Stripes. An alternative blue cross was rejected for being too sectarian; the racist Confederacy was anxious not to offend its Jewish citizens. So the cross became a diagonal X. This flag was never officially adopted by the Confederate army. In the years after the war its use was infrequent.

There was little need to visualise southern difference in a flag. It was self-evident in the physical signs of racial segregation: separate schools, pools and drinking fountains; black people confined to the back of the bus. Political displays of the battle flag of Dixie (the historical nickname for the states that seceded from the Union) only really resurfaced when that racial order was challenged by northern liberals. In 1948, the Democrats – then the party overwhelmingly in control of the South – split over modest calls for civil rights. The conservatives who refused to support that year’s presidential ticket, the “Dixiecrats”, triggered a revival of flag-waving across the region.

The old battle flag suddenly appeared on private lawns, on cars and at political rallies. Supposedly ancient cultural traditions were invented overnight. For instance, the 1948 student handbook of the University of Mississippi confessed: “Many Ole Miss customs are fairly new; they lack only the savouring which time brings . . . Ole Miss has adopted the Confederate flag as a symbol of the Mississippi spirit. Each football game finds the scarlet flag frantically waving to the rhythm of the Rebel band.”

I can confirm that this “tradition” was still going as recently as 2005. That year, I attended an American football game at Ole Miss and was surprised when the band played “Dixie” at the end. White boys and white girls stood up and belted out the folk song of the Confederacy, while black students filed out.

In 1958, South Carolina made it a crime to desecrate the battle flag. Three years later, on the 100th anniversary of the outbreak of the civil war, it was hoisted above its Capitol building in Columbia. That day, there was a struggle in the US Congress to keep federal funding going for segregated schools.

So clear is the link between the postwar white resistance to civil rights and the battle flag that many see it as the symbolic equivalent of the N-word. Jack Hunter, the editor of the conservative website Rare Politics, says: “Some people insist that it’s not about racism, not about slavery, not about segregation. But it’s about all those things.” Hunter grew up in Charleston and used to skateboard in the car park of the church that Dylann Roof attacked. When he was a young journalist, he appeared on local radio as a rabidly right-wing masked character called “the Southern Avenger”. His past was exposed in 2013 while he was working for Rand Paul, a Republican presidential candidate, and Hunter stepped down from his position. He publicly renounced his youthful association with racial conservatism. He now eschews any romanticism about the Confederate cause and its demand for states’ rights. “States’ rights to do what?” he asks: the right to discriminate against African Americans? He is glad that the State House flag is gone. He ascribes its longevity to ignorance, which was corrected by Roof’s rampage: “It was the first time that [southern Republicans] were able to see a different perspective on this symbol.”

Not everyone agrees. Richard Hines – a former South Carolina legislator, Reagan campaign state co-chair and senior activist with the Sons of Confederate Veterans – insists that the flag is “an enduring symbol of the southern fighting man”. Indeed, a poll in July found that 57 per cent of Americans think it stands for southern heritage, rather than racism. Yet that heritage has a political dimension. “Southern people are proud of who they are and there is a leftist assault to destroy the best part of America,” Hines says. “The Trotskyite elite in control of the establishment wants to root out the southern tradition” – a tradition of religious devotion, chivalry and military honour. It is possible to cast the battle flag as a pawn in a much larger cultural conflict.

In 2000, civil rights activists lobbied hard to get the battle flag removed from the top of the South Carolina Capitol and succeeded in having it shrunk in size and relocated to the grounds of State House. The issue came up in that year’s Republican presidential primaries – an unusually poisonous contest between George W Bush and John McCain. Supporters of Bush put out a false story that McCain had fathered an interracial child out of wedlock. McCain added to his woes by opining that the battle flag was “a symbol of racism and slavery”. An organisation called Keep It Flying flooded the state with 250,000 letters attacking him and he lost the crucial contest there to Bush.

The battle flag has retained a strong emotional power for a long time. This makes the Republican establishment’s abandonment of the flag all the more surprising. Then again, those who run the South are probably the people most likely to grasp how much the region has changed in just a decade.

***

In 2010 I took a trip through North Carolina. The landscape told a story. Dotted along the roadside were abandoned black buildings, the old tobacco sheds. The decline of the rural economy had rendered them obsolete. Over the fields that would once have been full of farmers were freshly tarmacked roads, stretching out to nowhere. My guide explained that these were supposed to be cul-de-sacs for new houses. North Carolina was going through a property boom. But who was going to buy all those homes, I asked? The answer: damn Yankees.

Demography is destiny. This once agricultural region developed fast from the 1960s onwards by keeping union membership, taxes and regulation as low as possible. Yet capitalism proved disastrous for southern conservatism. Northerners flooded in, seeking work or retirement and bringing their own values. The forecast is that North Carolina’s Research Triangle – the South’s Silicon Valley – will grow by 700,000 jobs and 1.2 million people in two decades.

White migration was accompanied by an influx of Spanish speakers as the service sector flourished. Between 2000 and 2010, the white share of the population of North Carolina fell from 70 to 65 per cent. The black proportion remained at roughly 21 per cent. The Latino proportion, however, jumped from 4.7 per cent to 8.4 per cent. Today, about a third of those over 60 are non-white; among those under 18, the figure is approaching half. As a result, politics in the South is no longer biracial: a contest between white and black. It is increasingly multiracial and uncoupled from the region’s complex past.

The impact of these changes is reflected in voting patterns. In 2000, the South was still overwhelmingly Republican in presidential contests. Even the Democratic nominee, Al Gore, a southerner, lost his home state of Tennessee. But in 2008 and 2012, Barack Obama took those states with the fastest-changing demographics: Florida and Virginia. He won North Carolina in 2008 and lost it in 2012 – but by less than 100,000 votes. It is true that the Republicans won back control in the 2014 midterm elections, with the result that the Deep South now sends few Democrats to Congress; but the region’s political masters are not quite as traditional-minded as they once were.

The Republican relationship with the Confederate past is complex. As the party of Abraham Lincoln and the Union, the GOP’s southern support was historically small. But in the 1960s the national Democratic Party embraced civil rights and alienated its once loyal southern following; the Republicans took the opportunity to steal some conservative white voters.

The growing southern Republican vote had a class component. Its success in local and congressional races was built more on winning over middle-class moderates than on appealing to the working-class racists who filled the ranks of the Ku Klux Klan. The southern Republican Party did enthusiastically embrace the Confederate battle flag in many quarters. But some office-holders did so only with ambiguity, while large sections of the party never identified with it at all. The period of Republican ascendancy in the South was, in reality, linked with a softening of the area’s racial politics.

Two of the Republicans’ current southern stars are Indian Americans: Bobby Jindal, the governor of Louisiana, and Nikki Haley, the anti-flag governor of South Carolina. There are just two black people in the US Senate and one of them is a Republican, the Tea Party-backed senator for South Carolina, Tim Scott. Marco Rubio, the Floridian senator and presidential candidate, is Cuban American, and the former Florida governor Jeb Bush is married to a Mexican-born woman and speaks fluent Spanish. Bush has tried to push a more moderate line on immigration, mindful that the GOP will struggle to win the White House if it appeals only to angry white voters. The Kentucky libertarian senator Rand Paul, Jack Hunter’s former boss, has called for legal reforms to correct the trend of keeping far more black than white people in prison. And he is not the only Republican to have been moved by recent race riots sparked by police violence.

***

Violence on the streets of Ferguson, Missouri, and Baltimore, Maryland, confirmed that there still is a culture war in the US. Yet its character has changed. In the past, civil disturbances were typically leapt upon by conservative politicians as evidence of social decline. The 1992 LA riots were blamed on single parenthood and rap lyrics. In contrast, conservative leaders today are far more likely to acknowledge the problems of white racism. There is no place in their ranks for the likes of Dylann Roof. White supremacists are tiny in number.

Jack Hunter claims: “The KKK is like 12 guys in a telephone booth. Liberal groups will use their threat for fundraising but it doesn’t exist. It hasn’t properly since the 1960s.” Roof’s actions say more about gun control, mental illness and the angst of the young than they do about popular, largely liberal views on race, as polling shows.

We can see a similar liberal shift in other areas of the historic culture war. In May 2015 Gallup released the results of a “moral acceptability” survey charting changes in national attitude across all age groups, from 2001 to 2015. Approval of gay relationships jumped from 40 to 63 per cent; having a baby out of wedlock from 45 to 61 per cent; sex between unmarried men and women from 53 to 68 per cent; doctor-assisted suicide from 49 to 56 per cent; even polygamy went from 7 to 16 per cent. Abortion remained narrowly disapproved of: support for access has only crept up from 42 to 45 per cent. This is probably a result of an unusual concentration of political and religious opposition and because it involves a potential life-or-death decision. But the general trend is that young people just don’t care as much about what consenting adults get up to.

Why? It might be because old forms of identity are dying. One way of measuring that is religious affiliation. From 2007 to 2014, according to Pew Research, the proportion of Americans describing themselves as Christian fell from 78 to 71 per cent. Today, only a quarter of the population is evangelical and 21 per cent Catholic, down despite high immigration. Then there is the decline in civic or communal activity. Since 2012, the organisers of Nascar, the stock-car races, have not published attendance figures at their tracks, probably because they have fallen so sharply. The decline of this most macho and working-class of sports parallels the fall in conservative forms of collective identity such as southern traditionalism.

The old culture war was, like the racial politics of the old South, binary. In the 1950s, around the same time as the South invented its tradition of flying the battle flag in colleges, the US constructed an ideal of the “normal” nuclear family unit: straight, white, patriarchal, religious. On the other side was the “abnormal”: gay, black, feminist, atheist, and the rest. The surest way to get elected in the US between 1952 and 2004 was to associate yourself with the economic needs and cultural prejudices of the majority. The approach was once summed up by a Richard Nixon strategist thus: split the country in two and the Republicans will take the larger half. But that is changing. The old normal is no longer the cultural standard but just one of many identities to choose from. The races are mixing. Women want to work more and have children later in life, possibly without marriage. Many religious people are having to rethink their theology when a child comes out as gay. And the enforcers of the old ways – the unions, churches or political parties – are far less attractive than the atomising internet.

***

Politicians are scrabbling to keep up with the diffusion of American identity. Democrats got lucky when they nominated Barack Obama and chose a presidential candidate who reflected the fractured era well: interracial, non-denominational Christian, and so on. In the 2012 presidential race the Republicans got burned when they tried to play the old culture war card on abortion. They won’t repeat that mistake. After the Supreme Court legalised gay marriage across the country in June, the right’s response was not as uniformly loud and outraged as it would have been in the past. Some protested, but serious presidential contenders such as Jeb Bush grasped the implications of the defeat. There is a cultural and political realignment going on and no one is sure where it will lead. It’s encouraging caution among the Republican top brass. It is time, they think, to abandon lost causes.

The death of southern traditionalism is part of the ebb and flow of cultural history. Identities flourish and die. As political fashions change, you find the typically American mix of triumph on one side and jeremiad on the other. Richard Hines stood vigil as the battle flag was lowered in Columbia and noted with disgust the presence of what he described as “bussed-in” activists. “They pulled out all these gay pride flags and started shouting, ‘USA, USA, USA!’ It reminded me of the Bolshevik Revolution.”

Hines reckons that more southerners will now fly the flag than ever before and says he has attended overflow rallies of ordinary folks who love their region. He may well be correct. The faithful will keep the old Confederate standard fluttering on their lawns – an act of secession from the 21st century. But in the public domain, the battle flag is on its way down and in its place will be raised the standard of the new America. The rainbow flag flutters high. For now.

Tim Stanley is a historian and a columnist for the Telegraph

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars