Illustration by Melinda Gebbie.

The Goebbels of the English language

We cannot state conclusively that anything is true.

I’m not entirely sure what my fellow contributors will have to say upon the subject but I expect they’ll generally see evidence as quite a good thing and will make compelling arguments to that effect, backed up by documented facts, deductions, and, well, evidence. Except that evidence would say that, wouldn’t it? It isn’t going to testify against itself.

I submit that if in the preceding paragraph the reader replace the term “evidence” with, say, the term “News International”, then the sleazy duplicity of this butter-wouldn’t-melt-in-its-mouth logical, scientific and forensic concept will become immediately apparent. Although it might be incautious to suggest that “evidence” and “evil” are synonymous based solely on their first three letters, I say that we go for it. Let’s subject this oily and persuasive abstract noun to the same brutal scrutiny that it is all too ready to inflict on others and see how it likes it.

When we try to build a solid case against our purely notional defendant, though, we start to get an idea of exactly what we’re up against. For one thing we discover that, suspiciously, we’re suddenly without a single shred of data to support our claims. Forthcoming witnesses are nowhere to be found and even the police appear reluctant to become involved. We learn that evidence, being pretty much made of evidence, has got an alibi for absolutely everything, with all the confirmatory theatre ticket-stubs and time-logged credit card exchanges carefully in place. In this, at least, evidence bears a strong resemblance to the imperviously powerful and homicidal drug cartels of Mexico.

Despite evidence appearing to be protected by the prince of darkness from on high, we can still pursue our investigation and construct a profile of our subject. Evidence, it turns out, is a relatively young latecomer to the scene that muscled its way into our vocabulary 700 years ago, ruthlessly ousting older and more venerated competition such as rumour, superstition, some bloke down ye tavern and malicious gossip, in the lexicological equivalent of an axe-wielding turf war.

Now, quite clearly, the mere fact that evidence shares the same medieval pedigree as the Black Death and the Spanish Inquisition doesn’t mean that it is equally abhorrent, but it’s worth observing that airborne carbon particulates are often indicative of a fire, hence smoke alarms.

A glance at evidence’s backstory reveals a seemingly impeccable and spotless record sheet, with glowing testimonials to the subject’s many acts of great social benevolence: tremendous contributions to the methodology of science and medicine that have allowed humanity to crawl up from a swamp of ignorance and early death; providing the foundations for a legal process that goes further than establishing if witches float or not; and blah blah blah. It hardly need be said that representing oneself as a public benefactor is a timeworn strategy for camouflaging acts of dire monstrosity, as employed by Alphonse Capone, the Kray twins and disc jockeys in the 1970s. As yet, no information has emerged connecting evidence with any of the previously mentioned malefactors but there are of course fresh revelations every day. Is it a mere coincidence that the most commonly used adjectives or adjectival nouns describing evidence are “cold”, “hard”, or, more worryingly, “DNA”?

If we are hoping to make something stick with this notoriously slippery and Teflon-coated piece of terminology, we obviously need to dig a little deeper. A good place to start might be with a description of the suspect, something that a sketch artist could work from. Evidence, according to a reputable source, is “that which tends to prove or disprove something; grounds for belief; proof”. The sharp-eyed juror may note a considerable leap between the first two cautious, unassuming clauses of that definition and the confident, declamatory third example. How does “that which tends to prove or disprove” or which gives “grounds for belief” equate so suddenly with “proof”? While a creationist might justifiably regard the book of Genesis as documentary evidence supporting their profoundly stupid version of the universe’s origin, as something which in their opinion tends to prove or to at least give grounds for their belief, that surely doesn’t mean that a several-millennia-old Just So Story is a proof of anything.

Alternatively, back in 1887, the physicists Albert Michelson and Edward Morley ably demonstrated that the hitherto convincing evidence for the existence of the ether was not actually a proof of that existence. All of this implies that evidence has quite a history of passing itself off as proof. The two are frequently mistaken for each other and I would suggest that it is under this deliberate smokescreen of ambiguity that evidence is free to carry on with its insidious racket.

Most accounts of a debate where evidence is in attendance generally depict the aforementioned entity as an intimidating presence, useful when it comes to shutting people up and not afraid to use its hefty physicality as a deterrent. On examination, though, it would appear that evidence is not so much the physical material of which it is composed, as it is the entirely abstract and subjective processes involved in the selection and classification of material phenomena as evidence. A lead pipe, in and of itself, is after all just a lead pipe and needs considerable human interpretation to connect it with Professor Plum and the conservatory. It is in this dependence on the unreliable perceptions and concealed agendas of an individual that we finally identify the weak spot of this domineering thug.

In order for an item to be classed as evidence, the thing it evidences must be previously extrapolated or determined, presupposing the conditions under which it qualifies as evidence. As an example, you conceivably might be employed by a giant petrochemical concern and have for some time loathed Professor Plum for his outspoken views on global warming, or, I don’t know, because you think he looks Jewish. When you heard about the murder, you immediately let your prejudices as a climate-change-denying anti-Semite influence your judgement as to who might be the culprit. The well-known phenomenon of confirmation bias led you to ignore such data as did not support your predetermined theory and instead carefully to select only those facts that did. You gathered evidence and then presented it as proof. For God’s sake, there must be a thousand ways that lead pipe could have ended up in that conservatory, you scientifically illiterate Nazi.

Evidence, that always plausible and superficially convincing psychopath, can only ever be a charting of our own perceptions and our intellectual processes, as in Niels Bohr’s Copenhagen interpretation – or at least in my interpretation of it. Evidence is thus the map, while proof by the same token is the territory and the two might not exactly or even remotely correspond, as in the recent mortifying case of Google Earth and that South Pacific island, which, it turned out, wasn’t really there.

The yawning and yet easily ignored gap between map and territory, evidence and proof, along with the confusion that apparently persists between the two, is indicated in the subtle disagreement that is polarising current scientific thought upon what constitutes reality. One side in the debate contends that if our theories on the nature of the universe – for instance, the existence of inferred quantum effects or particles that may be unobservable – are in accordance with the way that space-time seems to function, then we may as well afford these theoretical constructions their full status as reality. Those with opposing views, perhaps more wisely and more cautiously, point to the many “Michelson and Morley” instances where our most informed understanding of existence proves to be fallacious and instead suggest that even our most powerful theories can only be part of an evolving and continually adapting apprehension of a hypothesised “ultimate reality”.

As the philosopher Karl Popper pointed out, we cannot state conclusively that anything is true, only that it has not thus far been falsified. Since even proof itself is seemingly fatally undermined by Popper’s hard-to-discount observation, might we not therefore conclude that evidence is a completely hopeless bastard?

Evidence is not proof and occasionally it isn’t even evidence. While it undoubtedly illuminates the human landscape, it obscures it in equal measure. It has led to the incarceration of some thoroughly vile people and similarly has collaborated in the execution or internment of the blameless and the mentally impaired. In its contribution to the sciences, it has repeatedly allowed us to escape from the miasma of disinformation that somebody else’s view of evidence had visited upon us in the first place. Even in those instances where evidence is plentiful, are we entirely capable of judging what the evidence is of?

Approximately 18 months ago, it was announced that measurements of the cosmological constant yielded different values depending upon which way the observer happened to be facing. This apparently nonsensical discovery would make sense in the context of “etheric flow”, a kind of current having a direction that’s conducted through the rarefied essential medium of our universe, except that back in 1887 we were assured by Michelson and Morley that the ether was entirely fictional, according to their evidence. Now, I’m not saying that these two respected physicists should be exhumed and pilloried, their gravestones rendered to unfathomable rubble by an angry, crowbar-swinging mob. That is, naturally, a matter for the individual reader to decide. My only aim is to present the facts, as they appear to me. If I can do that in around 2,000 words, so much the better.

Those who still prefer to picture evidence as some variety of loveable old villain in the manner of Mad Frankie Fraser, despite all the documented torture and brutality, should give some thought as to what a society entirely based on evidence might look like. An informative example is available in South America’s extraordinary Pirahã people, for whom every statement or remark must be accompanied by some sort of supporting evidence or proof. For instance, simply saying “John has gone upriver” would not be sufficient by itself and would need qualifying with an explanation of how this conclusion was arrived at. Proof from observation, as in “John has gone upriver and I know because I personally saw him go” would be acceptable, as would proof from deduction, as in “John has gone upriver and I know this because his canoe’s no longer here”. This rigorous approach to conversation would appear to have significant advantages in that it does not permit the Pirahã any concept of a god or notions of an afterlife, surely good news for scientific atheists who may have recently become distressed by the idea that human beings might be “hardwired for religion” and possess a “god-shaped hole” in their psychology. With the world view of the Pirahã, practically unique in having no creation myth, this notion is reliably refuted.

Other things that the Pirahã do not have include a written language, possibly because the provenance of written statements is impossible to validate compared with first-hand verbal information from a trusted relative or colleague. This means that, along with being unencumbered by a deity or a religion, the Pirahã also have no scientific theory, no literature or art, nor any history extending further back than a few generations. On the other hand, if you’re still worrying about where John’s gone, the Pirahã are nothing if not dependable.

To summarise, evidence schmevidence. This Goebbels of the English language has for too long passed itself off as a thing of formidable weight and substance, bolstering its image with the use of terms like “solid”, “irrefutable” and “cast-iron”, when in fact it often only demonstrates the pattern-recognition processes of those presenting it. A jar of saddlebag-faced Saddam Hussein’s anti-wrinkle cream confirms the presence of weapons of mass destruction and so justifies the comprehensive devastation of Iraq. Evidence is sometimes murderously deceptive.

For all we know, it hasn’t even stopped beating its wife.

Alan Moore is the author of “Watchmen”, “V for Vendetta”, “From Hell” and many other titles.

This article first appeared in the 24 December 2012 issue of the New Statesman, Brian Cox and Robin Ince guest edit


A nervous breakdown in the body politic

Are we too complacent in thinking that the toxic brew of paranoia and populism that brought Hitler to power will never be repeated?

The conventional wisdom holds that “all that is necessary for the triumph of evil is that good men do nothing”, in Edmund Burke’s familiar phrase; but this is at best a half-truth. Studying the biography of a moral monster triumphantly unleashed on the political and international stage points us to another perspective, no less important. What is necessary for the triumph of evil is that the ground should have been thoroughly prepared by countless small or not-so-small acts of petty malice, unthinking prejudice and collusion. Burke’s axiom, though it represents a powerful challenge to apathy, risks crediting evil with too much of a life of its own: out there, there are evil agencies, hostile to “us”, and we (good men and women) must mobilise to resist.

No doubt; but mobilising intelligently demands being willing to ask what habits and assumptions, as well as what chances and conditions, have made possible the risk of evil triumphing. And that leads us into deep waters, to a recognition of how what we tolerate or ignore or underestimate opens the way for disaster, the ways in which we are at least half-consciously complicit. If this is not to be the silly we-are-all-guilty response that has rightly been so much mocked, nor an absolution for the direct agents of great horrors, it needs a careful and unsparing scrutiny of the processes by which cultures become corruptible, vulnerable to the agendas of damaged and obsessional individuals.

This can be uncomfortable. It raises the awkward issue of what philosophers have learned to call “moral luck” – the fact that some people with immense potential for evil don’t actualise it, because the circumstances don’t present them with the chance, and that some others who might have spent their lives in blameless normality end up supervising transports to Auschwitz. Or, to take a sharply contemporary example, that one Muslim youth from a disturbed or challenging background becomes a suicide bomber but another from exactly the same background doesn’t. It is as though there were a sort of diabolical mirror image for the biblical Parable of the Sower: some seeds grow and some don’t, depending on the ground they fall on, or what chance external stimulus touches them at critical moments.

If what interests us is simply how to assign individuals rapidly and definitively to the categories of sheep and goats, saved and damned, this is offensively frustrating. But if we recognise that evil is in important respects a shared enterprise, we may be prompted to look harder at those patterns of behaviour and interaction that – in the worst cases – give permission to those who are most capable of extreme destructiveness, and to examine our personal, political and social life in the light of this.

***

It would be possible to argue that the anti-Semitism of a lot of German culture – as of European Christian culture overall – was never (at least in the modern period) genocidal and obsessed with absolute racial purity; limited but real possibilities of integration were taken for granted, converts to Christianity were not disadvantaged merely because of their race, and so on. Yet the truth is that this cultural hinterland offered a foothold to the mania of Adolf Hitler; that it gave him just enough of the permission he needed to identify his society’s problems with this clearly definable “alien” presence. In his new book, Hitler: the Ascent, Volker Ullrich compellingly tells us once again that no one could have been under any illusion about Hitler’s general intentions towards the Jews from his very first appearance as a political figure, even if the detailed planning of genocide (lucidly traced in the late David Cesarani’s recent, encyclopaedic Final Solution) took some time to solidify. Yet so much of the German public heard Hitler’s language as the slightly exaggerated version of a familiar trope and felt able to treat it as at worst an embarrassing overstatement of a common, even a common-sense, view. One of the most disturbing things about this story is the failure of so many (inside and outside Germany) to grasp that Hitler meant what he said; and this failure in turn reinforced the delusion of those who thought they could use and then sideline Hitler.

To say that Hitler “meant what he said”, however, can be misleading. It is one of the repeated and focal themes in Ullrich’s book that Hitler was a brazen, almost compulsive liar – or, perhaps better, a compulsive and inventive actor, devising a huge range of dramatic roles for himself: frustrated artist, creative patron, philosopher-king (there is a fine chapter on the intellectual and artistic circle he assembled frequently at his Berchtesgaden residence), workers’ friend, martyr for his people (he constantly insinuated that he believed himself doomed to a tragic and premature death), military or economic messiah and a good deal else besides. His notorious outbursts of hysterical rage seem to have been skilfully orchestrated as instruments of intimidation (though this did not exactly indicate that he was otherwise predictable). Ullrich devotes a fair measure of attention to the literal staging of National Socialism, the architectural gigantism of Albert Speer which gave the Führer the sophisticated theatre he craved. In all sorts of ways, Hitler’s regime was a profoundly theatrical exercise, from the great public displays at Nuremberg and the replanning of Berlin to the various private fantasies enacted by him and his close associates (Göring above all), and from the emotional roller coaster he created for his circle to the dangerously accelerated rate of military-industrial expansion with which he concealed the void at the centre of the German economy.

Theatre both presupposes and creates a public. In the anxiety and despair of post-Versailles Germany, there was a ready audience for the high drama of Nazism, including its scapegoating of demonic enemies within and without. And in turn, the shrill pitch of Hitler’s quasi-liturgies normalised a whole set of bizarre and fantastic constructions of reality. A N Wilson’s challenging novel Winnie and Wolf, a fantasia on Hitler’s relations with Winifred Wagner, culminates in a scene at the end of the war where refugees and destitute citizens in Bayreuth raid the wardrobe of the opera house and wander the streets dressed in moth-eaten costumes; it is an unforgettable metaphor for one of the effects of Hitlerian theatre. Ullrich leaves his readers contemplating the picture of a vast collective drama centred on a personality that was not – as some biographers have suggested – something of a cipher, but that of a fantasist on a grand scale, endowed with a huge literal and metaphorical budget for staging his work.

All of this prompts questions about how it is that apparently sophisticated political systems succumb to corporate nervous breakdowns. It is anything but an academic question in a contemporary world where theatrical politics, tribal scapegoating and variegated confusions about the rule of law are increasingly in evidence. On this last point, it is still shocking to realise how rapidly post-Versailles Germany came to regard violent public conflict between heavily armed militias as almost routine, and this is an important background to the embittered negotiations later on around the relation between Hitler’s Sturmabteilung and the official organs of state coercion. Ullrich’s insightful account of a de facto civil war in Bavaria in the early 1920s makes it mercilessly plain that any pretensions to a state monopoly of coercion in Germany in this period were empty.

Yet the idea of such a state monopoly is in fact essential to anything that could be called a legitimate democracy. In effect, the polity of the Third Reich “privatised” coercion: again and again in Ullrich’s book, in the struggles for power before 1933, we see Nazi politicians successfully bidding for control of the mechanisms of public order in the German regions, and more or less franchising public order to their own agencies. A classical democratic political philosophy would argue that the state alone has the right to use force because the state is the guarantor of every community’s and every individual’s access to redress for injury or injustice. If state coercion becomes a tool for any one element in the social complex, it loses legitimacy. It is bound up with the rule of law, which is about something more than mere majority consent. One way of reading the rise of Hitler and National Socialism is as the steady and consistent normalising of illegitimate or partisan force, undermining any concept of an independent guarantee of lawfulness in society. It is the deliberate dissolution of the idea of a Rechtsstaat, a law-governed state order that can be recognised by citizens as organised for their common and individual good. Rule by decree, the common pattern of Nazi governmental practice, worked in harness with law enforcement by a force that was essentially a toxic hybrid, combining what was left of an independent police operation with a highly organised party militia system.

So, one of the general imperatives with which Hitler’s story might leave us is the need to keep a clear sense of what the proper work of the state involves. Arguments about the ideal “size” of the state are often spectacularly indifferent to the basic question of what the irreducible functions of state authority are – and so to the question of what cannot be franchised or delegated to non-state actors (it is extraordinary that we have in the UK apparently accepted without much debate the idea that prison security can be sold off to private interests). This is not the same as saying that privatisation in general leads to fascism; the issues around the limits to state direction of an economy are complex. However, a refusal to ask some fundamental questions about the limits of “franchising” corrodes the idea of real democratic legitimacy – the legitimacy that arises from an assurance to every citizen that, whatever their convictions or their purchasing power, the state is there to secure their access to justice. And, connected with this, there are issues about how we legislate: what are the proper processes of scrutiny for legislation, and how is populist and short-view legislation avoided? The Third Reich offers a masterclass in executive tyranny, and we need not only robust and intelligent counter-models, but a clear political theory to make sense of and defend those models.

***

Theatre has always been an aspect of the political. But there are different kinds of theatre. In ancient Athens, the annual Dionysia festival included the performance of tragedies that forced members of the audience to acknowledge the fragility of the political order and encouraged them to meditate on the divine interventions that set a boundary to vendetta and strife. Classical tragedy is, as political theatre, the exact opposite of Hitlerian drama, which repeatedly asserted the solid power of the Reich, the overcoming of weakness and division by the sheer, innate force of popular will as expressed through the Führer.

Contemporary political theatre is not – outside the more nakedly totalitarian states – a matter of Albert Speer-like spectacle and affirmation of a quasi-divine leader; but it is increasingly the product of a populist-oriented market, the parading of celebrities for popular approval, with limited possibilities for deep public discussion of policies advanced, and an assumption that politicians will be, above all, performers. It is not – to warn once again against cliché and exaggeration – that celebrity culture in politics is a short route to fascism. But a political theatre that never deals with the fragility of the context in which law and civility operate, that never admits the internal flaws and conflicts of a society, and never allows some corporate opening-up to the possibilities of reconciliation and reparation, is one that exploits, rather than resolves, our anxieties. And, as such, it makes us politically weaker, more confused and fragmented.

The extraordinary mixture of farce and menace in Donald Trump’s campaign is a potent distillation of all this: a political theatre, divorced from realism, patience and human solidarity, bringing to the surface the buried poisons of a whole system and threatening its entire viability and rationality. But it is an extreme version of the way in which modern technology-and-image-driven communication intensifies the risks that beset the ideals of legitimate democracy.

And – think of Trump once again – one of the most seductively available tricks of such a theatre is the rhetoric of what could be called triumphant victimhood: we are menaced by such and such a group (Jews, migrants, Muslims, Freemasons, international business, Zionism, Marxism . . .), which has exerted its vast but covert influence to destroy us; but our native strength has brought us through and, given clear leadership, will soon, once and for all, guarantee our safety from these nightmare aliens.

***

This is a rhetoric that depends on ideas of collective guilt or collective malignity: plots ascribed to the agency of some dangerous minority are brandished in order to tarnish the name of entire communities. The dark legacy of much popular Christian language about collective Jewish guilt for the death of Jesus could be translated without much difficulty into talk about the responsibility of Jews for the violence and poverty afflicting Germans in the 1920s. (Shadows of the same myths still affect the way in which – as recent reports suggest – sinister, vague talk about Zionism and assumptions of a collective Jewish guilt for the actions of various Israeli politicians can become part of a climate that condones anti-Semitic bullying, or text messages saying “Hitler had a point”, on university campuses.)

Granted that there is no shortage of other candidates for demonic otherness in Europe and the United States (witness Trump’s language about Muslims and Mexicans), the specific and abiding lesson of Nazi anti-Semitism is the twofold recognition of the ease with which actually disadvantaged communities can be cast in the role of all-powerful subverters, and the way in which the path to violent exclusion of one kind or another can be prepared by cultures of casual bigotry and collective anxiety or self-pity, dramatised by high-temperature styles of media communication.

Marie Luise Knott’s recent short book Unlearning With Hannah Arendt (2014) revisits the controversy over Arendt’s notorious characterisation of the mindset of Nazism as “the banality of evil”, and brilliantly shows how her point is to do with the erosion in Hitlerian Germany of the capacity to think, to understand one’s agency as answerable to more than public pressure and fashion, to hold to notions of honour and dignity independent of status, convention or influence – but also, ultimately, the erosion of a sense of the ridiculous. The victory of public cliché and stereotype is, in Arendt’s terms, a protection against reality, “against the claim on our thinking attention that all events and facts make by virtue of their existence”, as she memorably wrote in The Life of the Mind. Hitler was committed to the destruction of anything that challenged the simple self-identity and self-justification of the race and the nation; hence, as Ullrich shows in an acutely argued chapter of Hitler: a Biography, the Führer’s venom against the churches, despite their (generally) embarrassingly lukewarm resistance to the horrors of the Reich. The problem was that the churches’ rationale entailed just that accountability to more than power and political self-identity that Nazi philosophy treated as absolute. They had grounds for thinking Nazism not only evil, but absurd. Perhaps, then, one of the more unexpected questions we are left with by a study of political nightmare such as Ullrich’s excellent book is how we find the resources for identifying the absurd as well as for clarifying the grounds of law and honour.

The threats now faced by “developed” democracy are not those of the 1920s and 1930s; whatever rough beasts are on their way are unlikely to have the exact features of Hitler’s distinctive blend of criminality and melodrama. But this does not mean that we shouldn’t be looking as hard as we can at the lessons to be learned from the collapse of political legality, the collective panics and myths, the acceptance of delusional and violent public theatre that characterised Hitler’s Germany. For evil to triumph, what is necessary is for societies to stop thinking, to stop developing an eye for the absurd as well as the corrupt in language and action, public or private.

Hitler: a Biography – Volume I: Ascent by Volker Ullrich is published by the Bodley Head

Rowan Williams is an Anglican prelate, theologian and poet, who was Archbishop of Canterbury from 2002 to 2012. He writes on books for the New Statesman

This article first appeared in the 28 April 2016 issue of the New Statesman, The new fascism