
The Goebbels of the English language

We cannot state conclusively that anything is true.

I’m not entirely sure what my fellow contributors will have to say upon the subject but I expect they’ll generally see evidence as quite a good thing and will make compelling arguments to that effect, backed up by documented facts, deductions, and, well, evidence. Except that evidence would say that, wouldn’t it? It isn’t going to testify against itself.

I submit that if in the preceding paragraph the reader replace the term “evidence” with, say, the term “News International”, then the sleazy duplicity of this butter-wouldn’t-melt-in-its-mouth logical, scientific and forensic concept will become immediately apparent. Although it might be incautious to suggest that “evidence” and “evil” are synonymous based solely on their first three letters, I say that we go for it. Let’s subject this oily and persuasive abstract noun to the same brutal scrutiny that it is all too ready to inflict on others and see how it likes it.

When we try to build a solid case against our purely notional defendant, though, we start to get an idea of exactly what we’re up against. For one thing we discover that, suspiciously, we’re suddenly without a single shred of data to support our claims. Forthcoming witnesses are nowhere to be found and even the police appear reluctant to become involved. We learn that evidence, being pretty much made of evidence, has got an alibi for absolutely everything, with all the confirmatory theatre ticket-stubs and time-logged credit card exchanges carefully in place. In this, at least, evidence bears a strong resemblance to the imperviously powerful and homicidal drug cartels of Mexico.

Despite evidence appearing to be protected by the prince of darkness from on high, we can still pursue our investigation and construct a profile of our subject. Evidence, it turns out, is a relatively young latecomer to the scene that muscled its way into our vocabulary 700 years ago, ruthlessly ousting older and more venerated competition such as rumour, superstition, malicious gossip and some bloke down ye tavern, in the lexicological equivalent of an axe-wielding turf war.

Now, quite clearly, the mere fact that evidence shares the same medieval pedigree as the Black Death and the Spanish Inquisition doesn’t mean that it is equally abhorrent, but it’s worth observing that airborne carbon particulates are often indicative of a fire, hence smoke alarms.

A glance at evidence’s backstory reveals a seemingly impeccable and spotless record sheet, with glowing testimonials to the subject’s many acts of great social benevolence: tremendous contributions to the methodology of science and medicine that have allowed humanity to crawl up from a swamp of ignorance and early death; providing the foundations for a legal process that goes further than establishing if witches float or not; and blah blah blah. It hardly need be said that representing oneself as a public benefactor is a timeworn strategy for camouflaging acts of dire monstrosity, as employed by Alphonse Capone, the Kray twins and disc jockeys in the 1970s. As yet, no information has emerged connecting evidence with any of the previously mentioned malefactors but there are of course fresh revelations every day. Is it a mere coincidence that the most commonly used adjectives or adjectival nouns describing evidence are “cold”, “hard”, or, more worryingly, “DNA”?

If we are hoping to make something stick with this notoriously slippery and Teflon-coated piece of terminology, we obviously need to dig a little deeper. A good place to start might be with a description of the suspect, something that a sketch artist could work from. Evidence, according to a reputable source, is “that which tends to prove or disprove something; grounds for belief; proof”. The sharp-eyed juror may note a considerable leap between the first two cautious, unassuming clauses of that definition and the confident, declamatory third example. How does “that which tends to prove or disprove” or which gives “grounds for belief” equate so suddenly with “proof”? While a creationist might justifiably regard the book of Genesis as documentary evidence supporting their profoundly stupid version of the universe’s origin, as something which in their opinion tends to prove or to at least give grounds for their belief, that surely doesn’t mean that a several-millennia-old Just So Story is a proof of anything.

Alternatively, back in 1887, the physicists Albert Michelson and Edward Morley ably demonstrated that the hitherto convincing evidence for the existence of the ether was not actually a proof of that existence. All of this implies that evidence has quite a history of passing itself off as proof. The two are frequently mistaken for each other and I would suggest that it is under this deliberate smokescreen of ambiguity that evidence is free to carry on with its insidious racket.

Most accounts of a debate where evidence is in attendance generally depict the aforementioned entity as an intimidating presence, useful when it comes to shutting people up and not afraid to use its hefty physicality as a deterrent. On examination, though, it would appear that evidence is not so much the physical material of which it is composed, as it is the entirely abstract and subjective processes involved in the selection and classification of material phenomena as evidence. A lead pipe, in and of itself, is after all just a lead pipe and needs considerable human interpretation to connect it with Professor Plum and the conservatory. It is in this dependence on the unreliable perceptions and concealed agendas of an individual that we finally identify the weak spot of this domineering thug.

In order for an item to be classed as evidence, the thing it evidences must be previously extrapolated or determined, presupposing the conditions under which it qualifies as evidence. As an example, you conceivably might be employed by a giant petrochemical concern and have for some time loathed Professor Plum for his outspoken views on global warming, or, I don’t know, because you think he looks Jewish. When you heard about the murder, you immediately let your prejudices as a climate-change-denying anti-Semite influence your judgement as to who might be the culprit. The well-known phenomenon of confirmation bias led you to ignore such data as did not support your predetermined theory and instead carefully to select only those facts that did. You gathered evidence and then presented it as proof. For God’s sake, there must be a thousand ways that lead pipe could have ended up in that conservatory, you scientifically illiterate Nazi.

Evidence, that always plausible and superficially convincing psychopath, can only ever be a charting of our own perceptions and our intellectual processes, as in Niels Bohr’s Copenhagen interpretation – or at least in my interpretation of it. Evidence is thus the map, while proof by the same token is the territory and the two might not exactly or even remotely correspond, as in the recent mortifying case of Google Earth and that South Pacific island, which, it turned out, wasn’t really there.

The yawning and yet easily ignored gap between map and territory, evidence and proof, along with the confusion that apparently persists between the two, is indicated in the subtle disagreement that is polarising current scientific thought upon what constitutes reality. One side in the debate contends that if our theories on the nature of the universe – for instance, the existence of inferred quantum effects or particles that may be unobservable – are in accordance with the way that space-time seems to function, then we may as well afford these theoretical constructions their full status as reality. Those with opposing views, perhaps more wisely and more cautiously, point to the many “Michelson and Morley” instances where our most informed understanding of existence proves to be fallacious and instead suggest that even our most powerful theories can only be part of an evolving and continually adapting apprehension of a hypothesised “ultimate reality”.

As the philosopher Karl Popper pointed out, we cannot state conclusively that anything is true, only that it has not thus far been falsified. Since even proof itself is seemingly fatally undermined by Popper’s hard-to-discount observation, might we not therefore conclude that evidence is a completely hopeless bastard?

Evidence is not proof and occasionally it isn’t even evidence. While it undoubtedly illuminates the human landscape, it obscures it in equal measure. It has led to the incarceration of some thoroughly vile people and similarly has collaborated in the execution or internment of the blameless and the mentally impaired. In its contribution to the sciences, it has repeatedly allowed us to escape from the miasma of disinformation that somebody else’s view of evidence had visited upon us in the first place. Even in those instances where evidence is plentiful, are we entirely capable of judging what the evidence is of?

Approximately 18 months ago, it was announced that measurements of the fine-structure constant yielded different values depending upon which way the observer happened to be facing. This apparently nonsensical discovery would make sense in the context of “etheric flow”, a kind of directional current conducted through the rarefied essential medium of our universe, except that back in 1887 we were assured by Michelson and Morley that the ether was entirely fictional, according to their evidence. Now, I’m not saying that these two respected physicists should be exhumed and pilloried, their gravestones reduced to unfathomable rubble by an angry, crowbar-swinging mob. That is, naturally, a matter for the individual reader to decide. My only aim is to present the facts, as they appear to me. If I can do that in around 2,000 words, so much the better.

Those who still prefer to picture evidence as some variety of loveable old villain in the manner of Mad Frankie Fraser, despite all the documented torture and brutality, should give some thought as to what a society entirely based on evidence might look like. An informative example is available in South America’s extraordinary Pirahã people, for whom every statement or remark must be accompanied by some sort of supporting evidence or proof. For instance, simply saying “John has gone upriver” would not be sufficient by itself and would need qualifying with an explanation of how this conclusion was arrived at. Proof from observation, as in “John has gone upriver and I know because I personally saw him go”, would be acceptable, as would proof from deduction, as in “John has gone upriver and I know this because his canoe’s no longer here”. This rigorous approach to conversation would appear to have significant advantages in that it does not permit the Pirahã any concept of a god or notions of an afterlife, surely good news for scientific atheists who may have recently become distressed by the idea that human beings might be “hardwired for religion” and possess a “god-shaped hole” in their psychology. The world view of the Pirahã, practically unique in having no creation myth, reliably refutes this notion.

Other things that the Pirahã do not have include a written language, possibly because the provenance of written statements is impossible to validate compared with first-hand verbal information from a trusted relative or colleague. This means that, along with being unencumbered by a deity or a religion, the Pirahã also have no scientific theory, no literature or art, nor any history extending further back than a few generations. On the other hand, if you’re still worrying about where John’s gone, the Pirahã are nothing if not dependable.

To summarise, evidence schmevidence. This Goebbels of the English language has for too long passed itself off as a thing of formidable weight and substance, bolstering its image with the use of terms like “solid”, “irrefutable” and “cast-iron”, when in fact it often only demonstrates the pattern-recognition processes of those presenting it. A jar of saddlebag-faced Saddam Hussein’s anti-wrinkle cream confirms the presence of weapons of mass destruction and so justifies the comprehensive devastation of Iraq. Evidence is sometimes murderously deceptive.

For all we know, it hasn’t even stopped beating its wife.

Alan Moore is the author of “Watchmen”, “V for Vendetta”, “From Hell” and many other titles


This article first appeared in the 24 December 2012 issue of the New Statesman, Brian Cox and Robin Ince guest edit


Goodbye to the Confederate flag

After the shootings in Charleston, the Republican right showed it was finally ready to reject the old symbols of the Confederacy.

On 27 June, an African-American activist named Bree Newsome woke up before dawn, put on her climbing equipment and scaled a 30-foot flagpole on the lawn of State House in Columbia, South Carolina. She then removed the Confederate battle flag that flew from it. “We can’t wait any longer,” she explained later in an online statement. “It’s time for a new chapter where we are sincere about dismantling white supremacy.”

After she was led away in handcuffs, the flag was raised again.

Newsome’s protest reflected a growing impatience within America’s black community and anger about liberal inaction. Political rallies by the Democratic presidential contenders Hillary Clinton and Bernie Sanders have been disrupted by the Black Lives Matter campaign against violence committed against young African Americans and the cultural and legal biases that justify it. While promoting his book on race in the US, the writer Ta-Nehisi Coates argued that, to African Americans, the battle flag represents a lingering attempt “to bury the fact that half this country thought it was a good idea to raise an empire rooted in slavery”.

Yet, on this matter, to everyone’s surprise, the black civil rights movement and many southern Republicans have proved to be of one mind. On 9 July the House of Representatives in South Carolina voted to lower the battle flag for good. It stood, representatives said, for racism. It had to go.

The context of this agreement was a painful one. Ten days before Newsome’s act, a 21-year-old white man named Dylann Roof shot and killed nine black worshippers at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina. According to his room-mate, he wanted to start a race war. The TV screens showed a photo of him holding a gun in one hand and a Confederate battle flag in the other.

If the demands for redress made by civil rights groups didn’t come as a surprise, conservative acquiescence did. The Republican Party had built a solid base in the South by courting white voters who cherished the memory of the Confederacy. Yet the party’s presidential hopefuls from both the North and the South – including Jeb Bush, Lindsey Graham, Scott Walker and George Pataki – said that the battle flag ought to be lowered. The most striking intervention was made by the governor of South Carolina, Nikki Haley, who denounced the use of the Confederate flag and signed the bill removing it. Haley is now tipped to figure on the list of potential vice-presidential nominees.

The volte-face of the US right is in part a result of the horror of the Charleston shootings. Yet it also occurs in the context of major shifts within American society. There are still many conservatives who will defend Confederate heritage as a matter of southern pride but the culture wars are changing as the US becomes increasingly European in outlook. This is taking place across the country. It just happens to be more pronounced in the South because no other region has fought so violently and so long to resist the liberal tide.

The story of the battle flag is the story of the South. The first official Confederate flag used in the civil war of 1861-65 caused confusion during fighting – through the haze of gun smoke, its design of 13 stars and red and white bars was hard to distinguish from the Stars and Stripes. An alternative blue cross was rejected for being too sectarian; the racist Confederacy was anxious not to offend its Jewish citizens. So the cross became a diagonal X. This flag was never officially adopted as the Confederacy’s national flag. In the years after the war its use was infrequent.

There was little need to visualise southern difference in a flag. It was self-evident in the physical signs of racial segregation: separate schools, pools and drinking fountains; black people confined to the back of the bus. Political displays of the battle flag of Dixie (the historical nickname for the states that seceded from the Union) only really resurfaced when that racial order was challenged by northern liberals. In 1948, the Democrats – then the party overwhelmingly in control of the South – split over modest calls for civil rights. The conservatives who refused to support that year’s presidential ticket, the “Dixiecrats”, triggered a revival of flag-waving across the region.

The old battle flag suddenly appeared on private lawns, on cars and at political rallies. Supposedly ancient cultural traditions were invented overnight. For instance, the 1948 student handbook of the University of Mississippi confessed: “Many Ole Miss customs are fairly new; they lack only the savouring which time brings . . . Ole Miss has adopted the Confederate flag as a symbol of the Mississippi spirit. Each football game finds the scarlet flag frantically waving to the rhythm of the Rebel band.”

I can confirm that this “tradition” was still going as recently as 2005. That year, I attended an American football game at Ole Miss and was surprised when the band played “Dixie” at the end. White boys and white girls stood up and belted out the folk song of the Confederacy, while black students filed out.

In 1958, South Carolina made it a crime to desecrate the battle flag. Three years later, on the 100th anniversary of the outbreak of the civil war, it was hoisted above its Capitol building in Columbia. That day, there was a struggle in the US Congress to keep federal funding going for segregated schools.

So clear is the link between the postwar white resistance to civil rights and the battle flag that many see it as the symbolic equivalent of the N-word. Jack Hunter, the editor of the conservative website Rare Politics, says: “Some people insist that it’s not about racism, not about slavery, not about segregation. But it’s about all those things.” Hunter grew up in Charleston and used to skateboard in the car park of the church that Dylann Roof attacked. When he was a young journalist, he appeared on local radio as a rabidly right-wing masked character called “the Southern Avenger”. His past was exposed in 2013 while he was working for Rand Paul, a Republican presidential candidate, and Hunter stepped down from his position. He publicly renounced his youthful association with racial conservatism. He now eschews any romanticism about the Confederate cause and its demand for states’ rights. “States’ rights to do what?” he asks: the right to discriminate against African Americans? He is glad that the State House flag is gone. He ascribes its longevity to ignorance, which was corrected by Roof’s rampage: “It was the first time that [southern Republicans] were able to see a different perspective on this symbol.”

Not everyone agrees. Richard Hines – a former South Carolina legislator, Reagan campaign state co-chair and senior activist with the Sons of Confederate Veterans – insists that the flag is “an enduring symbol of the southern fighting man”. Indeed, a poll in July found that 57 per cent of Americans think it stands for southern heritage, rather than racism. Yet that heritage has a political dimension. “Southern people are proud of who they are and there is a leftist assault to destroy the best part of America,” Hines says. “The Trotskyite elite in control of the establishment wants to root out the southern tradition” – a tradition of religious devotion, chivalry and military honour. It is possible to cast the battle flag as a pawn in a much larger cultural conflict.

In 2000, civil rights activists lobbied hard to get the battle flag removed from the top of the South Carolina Capitol and succeeded in having it shrunk in size and relocated to the grounds of State House. The issue came up in that year’s Republican presidential primaries – an unusually poisonous contest between George W Bush and John McCain. Supporters of Bush put out a false story that McCain had fathered an interracial child out of wedlock. McCain added to his woes by opining that the battle flag was “a symbol of racism and slavery”. An organisation called Keep It Flying flooded the state with 250,000 letters attacking him and he lost that crucial contest to Bush.

The battle flag has retained a strong emotional power for a long time. This makes the Republican establishment’s abandonment of the flag all the more surprising. Then again, those who run the South are probably the people most likely to grasp how much the region has changed in just a decade.

***

In 2010 I took a trip through North Carolina. The landscape told a story. Dotted along the roadside were abandoned black buildings, the old tobacco sheds. The decline of the rural economy had rendered them obsolete. Over the fields that would once have been full of farmers were freshly tarmacked roads, stretching out to nowhere. My guide explained that these were supposed to be cul-de-sacs for new houses. North Carolina was going through a property boom. But who, I asked, was going to buy all those homes? The answer: damn Yankees.

Demography is destiny. This once agricultural region developed fast from the 1960s onwards by keeping union membership, taxes and regulation as low as possible. Yet capitalism proved disastrous for southern conservatism. Northerners flooded in, seeking work or retirement and bringing their own values. The forecast is that North Carolina’s Research Triangle – the South’s Silicon Valley – will grow by 700,000 jobs and 1.2 million people in two decades.

White migration was accompanied by an influx of Spanish speakers as the service sector flourished. Between 2000 and 2010, the white share of the population of North Carolina fell from 70 to 65 per cent. The black proportion remained at roughly 21 per cent. The Latino proportion, however, jumped from 4.7 per cent to 8.4 per cent. Today, about a third of those over 60 are non-white; among those under 18, the proportion approaches half. As a result, politics in the South is no longer biracial: a contest between white and black. It is increasingly multiracial and uncoupled from the region’s complex past.

The impact of these changes is reflected in voting patterns. In 2000, the South was still overwhelmingly Republican in presidential contests. Even the Democratic nominee, Al Gore, a southerner, lost his home state of Tennessee. But in 2008 and 2012, Barack Obama took those states with the fastest-changing demographics: Florida and Virginia. He won North Carolina in 2008 and lost it in 2012 – but by less than 100,000 votes. It is true that the Republicans won back control in the 2014 midterm elections, with the result that the Deep South now sends few Democrats to Congress; but the region’s political masters are not quite as traditional-minded as they once were.

The Republican relationship with the Confederate past is complex. As the party of Abraham Lincoln and the Union, the GOP’s southern support was historically small. But in the 1960s the national Democratic Party embraced civil rights and alienated its once loyal southern following; the Republicans took the opportunity to steal some conservative white voters.

The growing southern Republican vote had a class component. Its success in local and congressional races was built more on winning over middle-class moderates than on appealing to the working-class racists who filled the ranks of the Ku Klux Klan. The southern Republican Party did enthusiastically embrace the Confederate battle flag in many quarters. But some office-holders did so only with ambiguity, while large sections of the party never identified with it at all. The period of Republican ascendancy in the South was, in reality, linked with a softening of the area’s racial politics.

Two of the Republicans’ current southern stars are Indian Americans: Bobby Jindal, the governor of Louisiana, and Nikki Haley, the anti-flag governor of South Carolina. There are just two black people in the US Senate and one of them is a Republican, the Tea Party-backed senator for South Carolina, Tim Scott. Marco Rubio, the Floridian senator and presidential candidate, is Cuban American, and the former Florida governor Jeb Bush is married to a Mexican-born woman and speaks fluent Spanish. Bush has tried to push a more moderate line on immigration, recognising that the GOP will struggle to win the White House if it appeals only to angry white voters. The Kentucky libertarian senator Rand Paul, Jack Hunter’s former boss, has called for legal reforms to correct the trend of keeping far more black than white people in prison. And he is not the only Republican to have been moved by recent race riots sparked by police violence.

***

Violence on the streets of Ferguson, Missouri, and Baltimore, Maryland, confirmed that there still is a culture war in the US. Yet its character has changed. In the past, civil disturbances were typically leapt upon by conservative politicians as evidence of social decline. The 1992 LA riots were blamed on single parenthood and rap lyrics. In contrast, conservative leaders today are far more likely to acknowledge the problems of white racism. There is no place in their ranks for the likes of Dylann Roof. White supremacists are tiny in number.

Jack Hunter claims: “The KKK is like 12 guys in a telephone booth. Liberal groups will use their threat for fundraising but it doesn’t exist. It hasn’t properly since the 1960s.” Roof’s actions say more about gun control, mental illness and the angst of the young than they do about popular, largely liberal views on race, as polling shows.

We can see a similar liberal shift in other areas of the historic culture war. In May 2015 Gallup released the results of a “moral acceptability” survey charting changes in national attitude across all age groups, from 2001 to 2015. Approval of gay relationships jumped from 40 to 63 per cent; having a baby out of wedlock from 45 to 61 per cent; sex between unmarried men and women from 53 to 68 per cent; doctor-assisted suicide from 49 to 56 per cent; even polygamy went from 7 to 16 per cent. Abortion remained narrowly disapproved of: support for access has only crept up from 42 to 45 per cent. This is probably a result of an unusual concentration of political and religious opposition and because it involves a potential life-or-death decision. But the general trend is that young people just don’t care as much about what consenting adults get up to.

Why? It might be because old forms of identity are dying. One way of measuring that is religious affiliation. From 2007 to 2014, according to Pew Research, the proportion of Americans describing themselves as Christian fell from 78 to 71 per cent. Today, only a quarter of the population is evangelical and 21 per cent is Catholic, the latter share down despite high immigration. Then there is the decline in civic or communal activity. Since 2012, the organisers of Nascar, the stock-car races, have not published attendance figures at their tracks, probably because they have fallen so sharply. The decline of this most macho and working-class of sports parallels the fall in conservative forms of collective identity such as southern traditionalism.

The old culture war was, like the racial politics of the old South, binary. In the 1950s, around the same time as the South invented its tradition of flying the battle flag in colleges, the US constructed an ideal of the “normal” nuclear family unit: straight, white, patriarchal, religious. On the other side was the “abnormal”: gay, black, feminist, atheist, and the rest. The surest way to get elected in the US between 1952 and 2004 was to associate yourself with the economic needs and cultural prejudices of the majority. The approach was once summed up by a Richard Nixon strategist thus: split the country in two and the Republicans will take the larger half. But that is changing. The old normal is no longer the cultural standard but just one of many identities to choose from. The races are mixing. Women want to work more and have children later in life, possibly without marriage. Many religious people are having to rethink their theology when a child comes out as gay. And the enforcers of the old ways – the unions, churches or political parties – are far less attractive than the atomising internet.

***

Politicians are scrabbling to keep up with the diffusion of American identity. Democrats got lucky when they nominated Barack Obama and chose a presidential candidate who reflected the fractured era well: interracial, non-denominational Christian, and so on. In the 2012 presidential race the Republicans got burned when they tried to play the old culture war card on abortion. They won’t repeat that mistake. After the Supreme Court legalised gay marriage across the country in June, the right’s response was not as uniformly loud and outraged as it would have been in the past. Some protested, but serious presidential contenders such as Jeb Bush grasped the implications of the defeat. There is a cultural and political realignment going on and no one is sure where it will lead. It’s encouraging caution among the Republican top brass. It is time, they think, to abandon lost causes.

The death of southern traditionalism is part of the ebb and flow of cultural history. Identities flourish and die. As political fashions change, you find the typically American mix of triumph on one side and jeremiad on the other. Richard Hines stood vigil as the battle flag was lowered in Columbia and noted with disgust the presence of what he described as “bussed-in” activists. “They pulled out all these gay pride flags and started shouting, ‘USA, USA, USA!’ It reminded me of the Bolshevik Revolution.”

Hines reckons that more southerners will now fly the flag than ever before and says he has attended overflow rallies of ordinary folks who love their region. He may well be correct. The faithful will keep the old Confederate standard fluttering on their lawns – an act of secession from the 21st century. But in the public domain, the battle flag is on its way down and in its place will be raised the standard of the new America. The rainbow flag flutters high. For now.

Tim Stanley is a historian and a columnist for the Telegraph

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars