In praise of pessimism

Who needs the politics and mindset of “jam tomorrow”, asks Will Self, when you can adopt a sensibly pessimistic attitude and live by the principle of “shit happens, but until it does, make hay”?

Illustration by Ralph Steadman

The last time I remember going out with my mother it was to Hampstead Heath. We drove there in my car, then walked arm in arm along the terrace in front of Kenwood House. As if elliptically commenting on our own halting progress, my mother said: “The good thing about being a pessimist is that you’re never really wrong-footed; even before you’ve put one foot in front of the other you suspect that you’re likely to trip up, and that makes adversity much easier to deal with.”

She died three weeks later, lying in a bed in the Royal Ear Hospital. Not, you understand, that there was anything in particular wrong with her hearing; rather, despite the cancer that had metastasised from lymph to liver to brain, she remained highly attuned to the vapidity of yeasayers.

Indeed, I imagine the last thing she heard – and silently dismissed – before she slid into the coal-hole of inexistence was some well-meaning health professional or other, telling her it was all going to be all right.

That was a quarter-century ago, but my mother’s valedictory wisdom has stayed with me, informing my life, refining an epicurean attitude to personal life and a stoical one towards public events. For those who would dismiss pessimism out of hand, seeing it as a negative and self-fulfilling prophecy, let’s lay our jokers on the table right now: in respect of which of the major social and political developments of the past 25 years would optimism have been an appropriate attitude to take? My mother would have had to hang on only a few months in order to prop herself up in bed, or possibly lie supine, while I read aloud to her Francis Fukuyama’s essay in the National Interest “The End of History?”. How she would have snorted derisively at Fukuyama’s assertion that the end of the cold war would be followed by the worldwide dissemination of benign western liberal democracy.

Of course, in 1989, the immiseration of the former Soviet Union was just a gleam in wannabe oligarchs’ eyes and the rise of Putin’s Potemkin democracy lay some way ahead. The US was about to disengage itself from a range of proxy wars across the globe, in order to reinvest its peace dividend in the prosecution of a brand new range of hegemonic interventions.

A decade had already passed since the Camp David accords that were to have ushered in a peaceful era – but there was no sign of a lasting peace in the Middle East then and there certainly isn’t now. Indeed, US support of the Israeli state’s expansionist territorial aims remains to this day the festering pressure sore on the posterior of international relations. Mum, a Jew who believed passionately in justice for the Palestinian people, wouldn’t have been in the least bit surprised about this.

Nor, I imagine, would she have kept her sunny side uppermost as the western coalition’s air forces vaporised the retreating Iraqi conscript army at the end of the first Gulf war. An optimist, of necessity, believes in a future typified by knowns, because if – in the rousing chorus of the Blair government’s accession anthem of 1997 – “things can only get better”, then this must be in comparison with what already obtains. The pessimist, by contrast, is fully attuned to Donald Rumsfeld’s unknown unknowns: the black swans that swoop down out of a clear blue sky to annihilate thousands of New York office workers. The pessimist does not sanction foreign wars on the basis that democracy can issue forth from the barrel of a gun – which is not to say that pessimists don’t believe in the need to defend democratic values. Indeed, the chief paradox of whatever still obtains in the way of British greatness is that it derives from Churchillian pessimism: while the appeasers were optimistically waving their brollies, Winnie was scrying the storm clouds of the Nazi blitzkrieg. It was when optimism got the better of him – believing in the continuation of British rule in India – that Churchill’s Pollyanna intransigence contributed to the deaths of up to three million Bengalis in the 1943 famine.

No, in foreign affairs a healthy dose of pessimism – if by this is meant a willingness to accept that things may be for the worst in a less-than-perfect world – is definitely indicated. But domestically the optimism (if you can call it that) of the Thatcher-Blair neoliberal consensus hardly seems to have been borne out. Abed in the late 1980s, receiving cancer treatment that may have been less advanced than that of today, but which nonetheless was administered free and on demand with no caveats, my moribund mother would have undoubtedly been right in taking a gloomy view of the sell-off of our public assets.

As we shiver our way through an interminable winter, facing both fuel poverty in individual households and collective energy insecurity after having bartered our oil reserves for a mess of banker’s pottage, the much-vaunted efficiency of the market seems like just another optimistic mirage. Food banks opening at the rate of two a week, sickness benefit claimants about to be struck off by private contractors without any recourse to justice – these are developments that wouldn’t have fazed her.

She regarded the auto-cannibalistic tendencies of capitalism not from a theoretical perspective, but with the weary eyes of an American child of the Depression era. She had witnessed her own father keep the family afloat by organising fire sales for bust department stores.

Indeed, what are speculative bubbles if not the purest example of optimism run wild? The same sort of loony thinking that once invested in perpet­ual motion machines leads the contemporary credulous to believe that financial wizardry can conjure something out of nothing. The same glad-eyed and groundless enthusiasm for the Good News that the Redeemer’s arrival is imminent also leads people to believe that economies can continue to grow for all eternity, spawning more goods for more clap-happy consumers. I’m by no means the most eminent Cassandra to have pointed out that there’s a worm in the Enlightenment’s apple of knowledge; this distinction belongs to my New Statesman colleague John Gray. But, by contrast with him, I retain the same ideals as I always did: a belief that an egalitarian and essentially socialistic society is worth striving for.

Is this a paradox? I think not. Moreover, I also hold that a healthy streak of pessimism is pretty much mandated by such idealism. Let me explain. For every instance of a pessimistic forecast being fulfilled that I’ve set out above, the optimist can probably instance a counter-example. So be it. But the optimist also thinks that it is her willingness to entertain a better future that acts as a psychic midwife to its birth. How, the optimist argues, can you be bothered to struggle for a state of affairs that you regard as at best unlikely, and quite possibly altogether unattainable? The answer lies in the appreciation that the political and the personal are linked not instrumentally, but existentially. Subscribing to an ideology, whether it bases its appeal in the reasonable prolegomena of a Rousseau-inflected state-of-nature, or one of the instinctive and Hobbesian variety, nonetheless involves the individual in an act of deferral of the form: not now, but given such-and-such, then.

It is this “such-and-such” that forms the basis of all institutionalised appeals to political action: the communist utopia is forestalled quite as much as the thousand-year Reich; both retreat in advance of the measured tramping of the mobilised masses. At a less dramatic level, politicians in our highly imperfect (but still vaguely operable) representative democracies exhort us with their manifesto promises of jam tomorrow and seek to remind us of the jam we spread on yesterday’s bread. It’s no wonder that electorates that are gummed up within the mechanisms of internet commerce find such appeals increasingly difficult to hear above the whine of their computers. After all, this is the most compelling contemporary paradigm of gratification: push the button to receive jam by express 24-hour delivery; and if you sign up for repeat deliveries, you can indeed have tomorrow’s jam today.

It is this consumerist ethic – if it can be so glorified – that has eaten away at any remaining semblance of altruism, its chomping in synchrony with the optimistic belief in the power of the market to unite mouths efficiently with jam. And this also explains why all political parties and charitable organisations now aspire to the form of commercial enterprises, complete with marketing departments and tax breaks for donations. Implicit in all of these activities, whether ostensibly dedicated to social welfare or to capital aggregation, is a utilitarian calculus. The nature of the good – or goods – may be disputed, but the conviction remains that it can be factually accounted for and numerically arrived at.

Yet to live a full life is not to cede such a large percentage of it to a purely statistical perspective; such a life – to borrow the title of Céline’s novel – is merely death on the instalment plan. And it is the optimist, paradoxically, who enforces such a life on the generality of humankind with her plea that we look to a better future.

I have, as you have probably realised, a good deal of sympathy for that apocalyptic tendency that led Spanish anarchists to burn the town hall records and string up the priest. But I don’t think we have to resort to such excesses in order to reclaim the primacy of the here, the now and the individual over the insistent compulsions of the there, the then and the collective. All that’s necessary is to expect the worst but live hopefully, if by “living hopefully” is meant to invest the present in the raiment of all the idealism any of us could wish for – to practise, in the telling phrase of Basho, the Japanese Zen poet, “random acts of senseless generosity”.

We do not arrive at any idea of what is best for the collective unless we are prepared to seize the day and practise it on our own behalf. Most mature individuals understand what this means in respect of themselves – it’s just all those feckless others that they don’t trust to act appropriately. And so, by one means or another, they seek to organise society in such a way as to corral the human kine and herd them towards pastures new. But really, the sweet-smelling grass is beneath our hoofs right now: what is required is that we take pleasure in what is available to us. I said above that my pessimism resulted in an epicureanism when it came to personal life. Unfortunately, in our gastro-fixated culture, the epicurean is associated with fancy concoctions of wheatgrass, rather than the stuff growing close to hand. We need to redress this balance and understand that once the basic necessities of life are accounted for, all the rest can be creative and even wilful.

The optimist can never embrace this perspective, driven as she is by an inchoate need that can always be shaped by others so as to tantalise her. The optimist – again, paradoxically – lives in fear of a future that she endeavours, futilely, to control. The optimist can never be that most desirable of things, a meliorist, because every setback is necessarily a disaster. For the pessimist, it’s simply a matter of shit happens, but until it does, make hay.

But if I may end, as I began, on a personal note (rightly so, given the tenor of what I’ve had to say), while I have maintained a pessimistic cast of mind for the past quarter-century, there are many areas of my life in which my pessimism has been unwarranted – none more so than in respect of the New Statesman. In 1988 my career as a contributor was in distinct abeyance; 25 years on it’s going strong, and long may it and this vehicle for it continue.

Will Self’s most recent novel, the Booker-nominated “Umbrella”, is published by Bloomsbury (£18.99). He writes a weekly column for the New Statesman


This article first appeared in the 12 April 2013 issue of the New Statesman, Centenary Special Issue


The end of solitude: in a hyperconnected world, are we losing the art of being alone?

In the end, Solitude feels a bit like an amiable cop-out. 

Michael Harris is a Canadian writer who lives in a big city and whose life is defined and circumscribed, as so many Western lives are now, by digital technologies. He finds it hard to leave his phone at home in case he misses anything. He worries about his social media reputation. He uses apps and plays games, and relies on the internet hive mind to tell him which films to watch or where to eat. Here is what happens when he goes on holiday to Paris:

Disembarking from the train from London, I invited a friendly app to guide me to a hotel near the Pompidou . . . The next morning, Yelp guided me towards a charming café in the Marais. There, wizard-like, I held my phone over the menu and waited for Google Translate to melt the words into English. When the waiter arrived, I spoke into my phone and had it repeat my words to the grinning garçon in a soft, robotic French. Later, at the Louvre, I allowed a Nintendo-sponsored guidance system to track my steps up the centuries-old Daru staircase as I squinted confusedly at its glowing blue you-are-here dot . . .

Terrifying, isn’t it? Well, I thought so as I read it, and Harris thought so afterwards. It was situations like this, during which he realised that his life was controlled, confined and monitored by distancing technologies, that led him to wonder whether solitude – the act and the art of being alone – was in danger of disappearing.

Harris has an intuition that being alone with ourselves, paying attention to inner silence and being able to experience outer silence, is an essential part of being human. He can remember how it felt to do this, before the internet brought its social anxiety and addiction into his life. “I began to remember,” he writes, “a calm separateness, a sureness I once could live inside for an easy hour at a time.”

What happens when that calm separateness is destroyed by the internet of everything, by big-city living, by the relentless compulsion to be with others, in touch, all the time? Plenty of people know the answer already, or would do if they were paying attention to the question. Nearly half of all Americans, Harris tells us, now sleep with their smartphones on their bedside table, and 80 per cent are on their phone within 15 minutes of waking up. Three-quarters of adults use social networking sites regularly. But this is peanuts compared to the galloping development of the so-called Internet of Things. Within the next few years, anything from 30 to 50 billion objects, from cars to shirts to bottles of shampoo, will be connected to the net. The internet will be all around you, whether you want it or not, and you will be caught in its mesh like a fly. It’s not called the web for nothing.

I may not be the ideal reader for this book. By page 20, after a few more facts of this sort, I had already found myself scrawling “Kill everyone!” in the margins. This is not really the author’s fault. I often start behaving like this whenever I’m forced to read a list of ways in which digital technology is wrecking human existence. There are lots of lists like this around at the moment, because the galloping, thoughtless, ongoing rush to connect everything to the web has overcome our society like a disease. Did you know that cows are now connected to the internet? On page 20, Harris tells us that some Swiss dairy cows, sim cards implanted in their necks, send text messages to their farmers when they are on heat and ready to be inseminated. If this doesn’t bring out your inner Unabomber, you’re probably beyond help. Or maybe I am.

What is the problem here? Why does this bother me, and why does it bother Harris? The answer is that all of these things intrude upon, and threaten to destroy, something ancient and hard to define, which is also the source of much of our creativity and the essence of our humanity. “Solitude,” Harris writes, “is a resource.” He likens it to an ecological niche, within which grow new ideas, an understanding of the self and therefore an understanding of others.

The book is full of examples of the genius that springs from silent and solitary moments. Beethoven, Dostoevsky, Kafka, Einstein, Newton – all developed their ideas and approach by withdrawing from the crowd. Peter Higgs, the Nobel Prize-winner who predicted the Higgs boson, did his best work in peace and solitude in the 1960s. He suggests that what he did then would be impossible today, because it is now virtually impossible to find such solitude in the field of science.

Collaboration, not individuality, is fetishised today, in business as in science and the arts, but Harris warns that collaboration often results in conformism. In the company of others, most of us succumb to pressure to go with the crowd. Alone, we have more chance to be thoughtful, to see differently, to enter a place where we feel free from the mob to moderate our unique experience of the world. Without solitude, he writes, genius – which ultimately springs from different ways of thinking and seeing – becomes impossible. If Thoreau’s cabin in the woods had had wifi, we would never have got Walden.

Yet it is not only geniuses who have a problem: ordinary minds like yours and mine are threatened by the hypersocial nature of always-on urbanity. A civilisation can be judged by the quality of its daydreams, Harris suggests. Who daydreams now? Instead of staring out of the window on a train, heads are buried in smartphones, or wired to the audio of a streaming film. Instead of idling at the bus stop, people are loading up entertainment: mobile games from King, the maker of Candy Crush, were played 1.6 billion times a day in the first quarter of 2015 alone.

If you’ve ever wondered at the behaviour of those lines of people at the train station or in the street or in the café, heads buried in their phones like zombies, unable or unwilling to look up, Harris confirms your worst fears. The developers of apps and games and social media sites are dedicated to trapping us in what are called ludic loops. These are short cycles of repeated actions which feed our brain’s desire for reward. Every point you score, every candy you crush, every retweet you get gives your brain a dopamine hit that keeps you coming back for more. You’re not having a bit of harmless fun: you are an addict. A tech corporation has taken your solitude and monetised it. It’s not the game that is being played – it’s you.

So, what is to be done about all this? That’s the multibillion-dollar question, but it is one the book cannot answer. Harris spends many pages putting together a case for the importance of solitude and examining the forces that splinter it today. Yet he also seems torn in determining how much of it he wants and can cope with. He can see the damage being done by the always-on world but he lives in the heart of it, all his friends are part of it, and he doesn’t want to stray too far away. He understands the value of being alone but doesn’t like it much, or want to experience it too often. He’ll stop checking his Twitter analytics but he won’t close down his account.

At the end of the book, Harris retreats, Thoreau-like, to a cabin in the woods for a week. As I read this brief last chapter, I found myself wishing it was the first, that he had spent more time in the cabin, that he had been starker and more exploratory, that he had gone further. Who will write a Walden for the Internet Age? This book is thick with fact and argument and some fine writing, but there is a depth that the author seems afraid to plumb. Perhaps he is afraid of what he might find down there.

In the end, Solitude feels a bit like an amiable cop-out. After 200 pages of increasingly disturbing facts about the impact of technology and crowded city living on everything from our reading habits to our ability to form friendships, and after warning us on the very last page that we risk making “an Easter Island of the mind”, the author goes back home to Vancouver, tells his boyfriend that he missed him, and then . . . well, then what? We don’t know. The book just ends. We are left with the impression that the pile-up of evidence leads to a conclusion too vast for the author, and perhaps his readers, to take in, because to do that would be to challenge everything.

In this, Solitude mirrors the structure of many other books of its type: the Non-Fiction Warning Book (NFWB), we might call it. It takes a subject – disappearing childhood; disappearing solitude; disappearing wilderness; disappearing anything, there’s so much to choose from – trots us through several hundred pages of anecdotes, science, interviews and stories, all of which build up to the inescapable conclusion that everything is screwed . . . and then pulls back. It’s like being teased by an expert hustler. Yes, technology is undermining our sense of self and creating havoc for our relationships with others, but the solution is not to stop using it, just to moderate it. Yes, overcrowded cities are destroying our minds and Planet Earth, but the solution is not to get out of the cities: it’s to moderate them in some way, somehow.

Moderation is always the demand of the NFWB, aimed as it is at mainstream readers who would like things to get better but who don’t really want to change much – or don’t know how to. This is not to condemn Harris, or his argument: most of us don’t want to change much or know how to. What books of this kind are dealing with is the problem of modernity, which is intractable and not open to moderation. Have a week away from your screen if you like, but the theft of human freedom by the machine will continue without you. The poet Robinson Jeffers once wrote about sitting on a mountain and looking down on the lights of a city, and being put in mind of a purse seine net, in which sardines swim unwittingly into a giant bag, which is then drawn tightly around them. “I thought, We have geared the machines and locked all together into interdependence; we have built the great cities; now/There is no escape,” he wrote. “The circle is closed, and the net/Is being hauled in.”

Under the circumstances – and these are our circumstances – the only honest conclusion to draw is that the problem, which is caused primarily by the technological direction of our society, is going to get worse. There is no credible scenario in which we can continue in the same direction and not see the problem of solitude, or lack of it, continue to deepen.

Knowing this, how can Harris just go home after a week away, drop off his bag and settle back into his hyperconnected city life? Does he not have a duty to rebel, and to tell us to rebel? Perhaps. The problem for this author is our shared problem, however, at a time in history when the dystopian predictions of Brave New World are already looking antiquated. Even if Harris wanted to rebel, he wouldn’t know how, because none of us would. Short of a collapse so severe that the electricity goes off permanently, there is no escape from what the tech corporations and their tame hive mind have planned for us. The circle is closed, and the net is being hauled in. May as well play another round of Candy Crush while we wait to be dragged up on to the deck. 

Paul Kingsnorth’s latest book, “Confessions of a Recovering Environmentalist”, is published by Faber & Faber

This article first appeared in the 20 April 2017 issue of the New Statesman, May's gamble
