Modernism still matters

Writers such as T S Eliot and Samuel Beckett worked in synchrony with continental Europeans such as Thomas Mann and Franz Kafka, pushing against the limitations of art. Why have English-language writers turned away from this challenge?

One of the minor themes of my latest book, Whatever Happened to Modernism?, is that a grave problem with cultural life in Britain today is how all issues are reduced to a question of personalities. I learned just how true this is when, shortly before the book came out, the Guardian published an article that was ostensibly about it but which, in fact, was only about personalities (in this instance, Salman Rushdie, Ian McEwan and Julian Barnes). The journalist who wrote it found a few sentences in one chapter of a 200-page book, wrenched them from their context and, on the basis of three telephone conversations with me, passed the whole thing off as an interview. Following the appearance of the article, I was rung up by the Evening Standard and Radio 4's PM programme and emailed by Newsnight - all of which wanted me to "elaborate" on what I had apparently said in the Guardian. When I pointed out that I had not said those things and that I would talk to them only if they gave me the chance to set the record straight (and not discuss personalities), they lost interest. So it goes, as Kurt Vonnegut's narrator says. I am grateful to the New Statesman for giving me the chance to explain what I was trying to do in the book.

I wrote it in the first place to try to make sense of a problem that had long puzzled me: why was it that works of literature such as the poems of T S Eliot, the stories of Kafka and Borges, the novels of Proust, Mann, Claude Simon and Thomas Bernhard seemed worlds apart from those admired by the English literary establishment (works by writers such as Margaret Atwood, John Updike, Martin Amis and Ian McEwan)? The first group touched me to the core, leading me into the depths of myself even as they led me out into worlds I did not know. The latter were well-written narratives that, once I'd read them, I had no wish ever to reread. Was it my fault? Was I in some way unable to enter into the spirit of these works? Or did they belong to a kind of writing that was clearly to the taste of the English public but not to mine?

There was another problem: no composer would dream of writing like Tchaikovsky today, except in an ironic manner; no painter today would dream of painting like Sargent, except in an ironic manner; yet novelists writing in English seemed to want to write like the Victorians and the Edwardians. Others might object that literature is simply different from the other arts and it is absurd to compare them. But then why did I feel that there were profound affinities between Eliot and Picasso, Proust and Bonnard, Simon and Cézanne? Were Eliot and Proust really in thrall to the debilitating idea that they should be modern at all costs? No one who has responded to them could ever imagine this to be the case. Yet critics and reviewers who paid lip-service to Eliot and Proust seemed to fail utterly to see that to take their work seriously meant asking questions about the bulk of current English writing that were simply never asked. Even writers such as William Golding and Muriel Spark, whose work gave me the same thrill as the one I got from Marguerite Duras and Milan Kundera, were treated as the quirky authors of books about children, shipwrecks and eccentric schoolteachers.

It had not always been like that. When I first came to England in the late 1950s, it was a reviewer in the Observer, Philip Toynbee, who alerted me to the novels of Claude Simon. It was in the pages of Encounter that I first came across the stories of Borges. The back pages of the Listener and the New Statesman were alive with critics familiar with European culture and with a wide historical grasp: John Berger, David Drew and Wilfrid Mellers, among others. By the early 1990s, Encounter and the Listener had gone, to be replaced by three-for-the-price-of-two creative writing courses and literary festivals. What had happened to literary modernism in this country? How did it expire like this, without leaving a trace?

To answer this question, it was necessary to show that modernism was not a "movement", like mannerism, or the name of a period. Like Romanticism, it is multifaceted and ambiguous. And it didn't begin in 1880 and end in 1930. Modernism, whenever it began, will always be with us, for it is not primarily a revolution in diction, or a response to industrialisation or the First World War, but is art coming to a consciousness of its limitations and responsibilities.

The principal issue is that of authority. Shelley talked of poets being the "unacknowledged legislators" of the world and the prophetic strand of Romanticism did, indeed, see the artist as inspired and authoritative. Modernism can be seen as a reaction to this and a recognition that the artist is no different from the rest of us. "I am no prophet," says Eliot's Prufrock, and "here's no great matter". Marcel Duchamp spelled out the implications:

The word "art", etymologically speaking, means to make, simply to make. Now what is making? Making something is choosing a tube of blue, a tube of red, putting some of it on the palette, and always choosing the quality of blue, the quality of red, and always choosing the place to put it on canvas, always choosing.

If that is so, why not take a lavatory bowl, isolate it from its normal context, give it a title and, hey presto, it's art! Not all artists were as bold as Duchamp, but every modern artist has had, somehow or other, to come to terms with what he did. Kafka got it, but not Max Brod. Walter Benjamin got it, but not, for all his great gifts, William Empson. Simon got it, but not Irène Némirovsky. Tom Stoppard got it, but not John Osborne.

Alongside the prophetic strand of Romanticism, there runs another: despair at the thought of having come too late, of having only ruins to contemplate, of recognising that the voice of the nightingale can be heard only fleetingly, if at all. That, it would seem, is where the origins of modernism are to be located. But the coming of modernism is like the rise of the bourgeoisie - the closer you look, the further into the distance it recedes.

If, for the Romantics, Shakespeare and Milton were gigantic figures they could not hope to emulate, for some artists in the Renaissance their own age had already lost contact with authority. Albrecht Dürer sums this up in his two parallel engravings of 1514, Saint Jerome in His Study and Melencolia I. The former shows us the saint who gave the Latin west its Bible, at ease within tradition, working away peacefully in his room. The latter shows us a figure many modern artists have identified with: a wild-eyed, impotent giantess in a bleak landscape, surrounded by instruments of making, but incapable of making anything because she is unable to connect with any tradition. Rabelais, Cervantes and Sterne later explored this predicament in comic style and, for that reason, they seem to us to be strikingly modern, the true contemporaries of Borges and Beckett.

Thomas Mann understood all this; his wonderful novel Doctor Faustus is an exploration of the paradoxes and depths of the modernist crisis, which, as the title suggests, he locates firmly in the 16th century. Taking our cue from this, we could say that, for Homer, the Muses dictated both the content and the form of what he had to say; for medieval artists such as the sculptors of the great cathedrals, what was to be depicted was determined by the cathedral's clerics, and the forms - the way the beard of Moses or the hand of Christ were to be carved - were given by tradition. This gives medieval art, as both Pound and Proust recognised, an innocence and freedom from ego that both writers felt went missing from European art in the ensuing centuries.

By the 16th century, the consensus on which this was based had disappeared. Though patrons went on giving specific commissions to artists and composers for the next two centuries, artists were becoming increasingly conscious that, from now on, they had to rely only on their imagination. Our culture, which is still in thrall to the individualistic strain in the Renaissance and in Romanticism, welcomed this as a splendid new freedom. More prescient souls, however, sensed what Duchamp would eventually articulate so icily - if every choice is merely the artist's, why is one choice better than any other?

This is what Kafka, Beckett and Borges struggled with: how to escape the conclusion that whatever you do is private self-indulgence. Your work may earn you and your publisher money but, having no authority, it remains nothing more than an object of consumption, like a pair of shoes.

And yet the urge to speak remains. That is what we find with Prufrock, with Hamm in Beckett's Endgame, with Saul Bellow's Henderson. And this combination of the need, which we all have, to speak out our deepest feelings and the recognition that, as soon as the need is expressed, it becomes obvious that it is not what we meant at all is what makes the work of Eliot, Bellow, Beckett and Bernhard so moving. This is what is so signally lacking in the bulk of postwar English novels, which tend to consist of well-plotted tales in the first or third person, in which morality and the convolutions of plot now take the place of authority.

"How many poems he denied himself/In his observant progress, lesser things/Than the relentless contact he desired." So reads Wallace Stevens's poem "The Comedian as the Letter C". Modernism has found many ways of establishing that "relentless contact" with reality: the constant shift from book to world and back in Rabelais and Sterne; the sly reminders in Nabokov and Queneau that we are reading words on a page; the tragic, climactic wrenchings of Golding's Pincher Martin and Rosalind Belben's Our Horses in Egypt.

At those moments, modern art reaches beyond words to that which we share but cannot speak. I find it in the work of writers as diverse as Marguerite Duras, Robert Pinget, Peter Handke, the French-writing Hungarian Agota Kristof, Gert Hofmann and the Israeli Yaakov Shabtai. I rarely find it in the English-language writers of today.

Since the Romantics, English culture has been deeply suspicious of Romantic posturing and some of this suspicion is reasonable - posturing needs to be debunked. But suspicion too easily slides into philistinism and an intolerance of ambiguity and fear of the unknown. We find this in the cultural commentary of Evelyn Waugh (whose early novels I love and admire), Philip Larkin and Kingsley Amis. Unfortunately, it is now so ubiquitous that people no longer have even a glimmer of what has been lost. My book was written in an attempt to reawaken that sense.

Gabriel Josipovici's "Whatever Happened to Modernism?" is published by Yale University Press (£18.99)

This article first appeared in the 06 September 2010 issue of the New Statesman, The Pope on Trial

Chill out

Stress is not as destructive as is often assumed: a little bit of it may even be good for us.

It creeps up on you as soon as the alarm clock rings. Fingers reflexively unlock your phone. Emails bound in with a jolly ping: things you should have done last week; pointless meeting requests; bills to pay.

Over a hurried breakfast you scan the headlines: wall-to-wall misery. On the train you turn to social media for relief. Gillian is funnier than you. Alex got promoted again. Laura’s sunning herself in Thailand. You’re here, packed in, surrounded but alone, rattling your way towards another overstretched day.

Stress: we know what it feels like, we can smell it on others, we complain about it most days. And we’re living through an epidemic of it. The government’s Health and Safety Executive estimates that stress cost the economy nearly ten million working days last year. Some 43 per cent of all sick days were attributed to stress. In the US, a large survey conducted by the National Public Radio network in 2014 showed that nearly one in two people reported a major stress event at some point in the previous 12 months. The year before that, American doctors wrote 76 million unique prescriptions for the anti-anxiety drugs Xanax and Ativan. With the media running stories about stress-induced heart disease, strokes, obesity, depression, ulcers and cancer, it’s hard not to conclude that stress kills.

But consider this: just a century ago, nobody got stressed. They suffered with their nerves, got a touch of the vapours; they worried; but they were never stressed. In fact, our current view of stress – what it is, what it feels like, and when it is harmful – evolved surprisingly recently. And research shows that the way we think about stress has a profound influence on how it affects us.

Prolonged, uncontrollable stress – particularly if suffered in childhood – can be profoundly corrosive and debilitating. But what of the familiar stresses of day-to-day life? Are they actually damaging you? Might the belief that stress is harmful be self-fulfilling? And what would a stress-free life look like? Instead of turning in on ourselves and doing battle with our personal stress demons, might we be able to put their diabolic energy to good use?

If we paused for a moment in our daily hustle, we would see that many of us are incurably hooked on stress. We thrive on it, getting a kick out of surviving the high-stakes presentation, meeting the deadline and overcoming our fears and prejudices. Watching a thriller, we are on the edge of our seat, pulses racing. Sports, on the field or on television, can propel us into “fight or flight” mode. Humanity’s fascination with gambling hinges on stress.

If the most skilled physiologists in the world could peer beneath the skin of a thrill-seeker on a roller coaster and an out-of-his-depth job interview candidate, they would struggle to tell them apart. Deep in the brain, they would see a structure called the hypothalamus fired up. With each lurch of the ride or disarming question asked, the hypothalamus signals to the adrenal glands, which sit atop each kidney. The adrenals then squirt a shot of adrenalin into the bloodstream. In the background, the hypothalamus prods the pituitary gland, which passes a different message on to the adrenal gland. This increases production of cortisol, the textbook “stress hormone”. Flipping these biological switches triggers the familiar bodily symptoms of stress: a pounding heart, raised blood pressure, dilated pupils, arrested digestion and a damped-down immune system. In both cases, the biological stress response would look very similar.

Even if we could eliminate stress entirely, or smother it with pharmaceuticals, we wouldn’t want to. To muzzle the stress response is to silence the good as well as the bad. At best, stress can motivate us to achieve more and fix the sources of our stress. Boredom is stressful in its own way: observe a caged lion, or an understimulated teenager. In fact, as the animal psychologist Françoise Wemelsfelder told New Scientist recently, boredom may exist to spur us back into activity. This half-forgotten idea, that some degree of stress can inspire and elevate, is common sense. It also has deep roots in the earliest scientific study of stress and stress responses.

***

At the beginning of the 20th century, two American psychologists, Robert Yerkes and John Dodson, wanted to know how stressing out lab mice affected their learning. They set the rodents navigational challenges and punished wrong turns by administering small electric shocks to the feet. In their terminology, larger electric currents caused greater “arousal”.

They spotted some consistent trends. When they gave mice an easy task (choosing between a black or a white tunnel) the relationship between the strength of the shock and the speed of learning was simple. The greater the stressor, the quicker the mice learned to pick the right tunnel.

When the challenge was subtler (differentiating between grey tunnels), the response was less straightforward. Weak shocks provided little impetus to learn, but as the zaps got stronger, the mice gradually upped their game. They focused on the task and remembered the consequences of wrong choices. Yet, at a certain point, the high stress levels that helped with the easy task became counterproductive. Overwhelmed, the mice skittered around at random, trying in vain to escape.

On a graph, the relationship between stress and performance on onerous tasks traces an inverted U shape. Some degree of stress helps, but there is a clear tipping point, beyond which stress becomes paralysing. The findings became known as the Yerkes-Dodson law.
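A minimal way to picture the law (my own illustrative sketch, not a formula from Yerkes and Dodson's paper) is to treat performance $P$ as a concave function of arousal $a$:

$$P(a) \approx P_{\max} - k\,(a - a^{*})^{2}$$

Here $a^{*}$ is the task-dependent optimum and $k$ sets how quickly performance collapses once arousal overshoots it; both symbols are assumptions introduced for illustration. In this picture, the harder the task, the lower $a^{*}$ sits, so the peak arrives sooner and the tipping point is easier to cross.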

This was all very well for mice, but could it be applied to human beings? According to the Canadian-Austrian endocrinologist Hans Selye, the “father of stress”, it could. Selye was the first person to describe the key glands, hormones and nerves of the biological stress response during the 1930s and 1940s, and also one of the first to apply the word “stress” to human biology.

For Selye, “stress” described an all-purpose response the body had to any demand placed upon it. When stress is on the upswing of Yerkes and Dodson’s inverted-U performance curve, Selye called it “eustress”. This is where good teachers and managers should push their charges: to the sweet spot that separates predictable tedium from chaotic overload. Where stress gets more persistent, unmanageable and damaging, Selye called it “distress”. Eustress and distress have identical biological bases; they are simply found at different points on the same curve.

Despite this knowledge, stress has a terrible public image today, often synonymous with distress. While some wear their stress as a badge of honour (“I’m important enough to be stressed”), deep down even the most gung-ho City workers probably stress about their stress. And in painting stress as a beast, we grant it more destructive power.

When did we come to view stress as the universal enemy? Mark Petticrew, Professor of Public Health Evaluation at the London School of Hygiene and Tropical Medicine, has sifted through a huge archive of historical tobacco industry documents. In a 2011 paper, he revealed that a large proportion of stress research during the second half of the 20th century was funded, steered and manipulated by this most unexpected of benefactors. Indeed, from the late 1950s, Hans Selye received hundreds of thousands of tobacco-stained dollars. He also allowed industry lawyers to vet his research and appeared in several pro-tobacco propaganda films.

“They put a massive, massive amount of money into it,” Petticrew told me.

Why were tobacco manufacturers so interested in stress? First, cigarettes were marketed as a stress reliever. “To anxiety . . . I bring relief,” reads a 1930s advertisement for Lucky Strike. So if research could help them pin poor mental and physical health to stress, this sort of message would carry more weight. (Incidentally, the still widespread belief that smoking reduces anxiety appears to be wrong.)

Later, as evidence grew that smoking caused cancer and heart disease, the tobacco industry wanted to prove that stress was an equally significant risk factor. They used the authority of Selye and several other leading researchers as a smokescreen. “Doubt is our product,” read a top industry executive’s 1969 memo. And so doubt they sowed, arguing repeatedly that stress was a major cause of disease. Those seeking to control tobacco were wrong, they claimed.

It worked: the industry convinced the general public of the evils of stress and diverted public health research for at least a decade. With tobacco regulation and compensation payouts postponed, the profits kept rolling in.

Should we doubt the veracity and neutrality of all the foundational research into stress as a disease? “I wouldn’t want to argue that stress doesn’t exist, or that it isn’t bad for your health and certainly your mental health,” Petticrew says. “But you can’t ignore this story.”

He goes on to describe concrete “findings” that industry-funded researchers got wrong. Prominent among these was a link between coronary disease and people displaying so-called Type A personality traits: competitiveness, ambition, anxiety. Such temperamentally “stressed” people were especially likely to suffer heart attacks and, not coincidentally, to smoke. Then the association faded away. “Aside from the scientific weaknesses, which are many, Type A is a cultural artefact to some extent constructed by the tobacco lobby,” Petticrew says. And yet, despite its fragile foundations, the Type A myth persists today.

The long shadow cast by decades of one-sided, propaganda-laced stress research has led many people to believe that stress is a direct cause of heart attacks. But the British Heart Foundation’s website states, “There is no evidence to suggest that stress causes coronary heart disease or heart attacks.” Nor does it cause stomach ulcers: usually it is a bacterium called Helicobacter pylori which does that.

The tobacco-funded researchers didn’t get it all wrong. Stress does have clear causal links to some diseases, particularly mental illnesses, including depression, anxiety disorders, schizophrenia and addictive behaviour. High stress levels appear to be a general risk factor for early death, among middle-aged men in particular. Moreover, we all know how unpleasant stress can be. From insomnia to binge eating and boozing, we respond to stress with all sorts of counterproductive and antisocial behaviours. And that is partly why the tone of messages we hear about stress matters so much. Human beings are inherently suggestible and particularly vulnerable to warning messages about our health, especially when those messages seem to be backed by science.

***

With mice in a cage, you can measure the tipping point – the precise current of the electric shock – where good stress becomes bad. But we don’t need the lurking menace of a lion in the long grass to activate our stress response. We can do it perfectly well for ourselves. All it takes is a negative thought, the memory of an insult, or a vague feeling of unease.

We can think our way into stress. And, as recent evidence shows, if we believe stress is going to hurt us, it is more likely to hurt us. This is one message emerging from the Whitehall II project, a long-term study of 10,000 UK government civil servants, set up in 1985 to study the social, economic and personal determinants of health and disease. A 2013 analysis of Whitehall II data concluded that people who believe stress adversely affects their health are more than twice as likely to suffer a heart attack, irrespective of their stress levels.

There is a flipside to this gloomy news. If our thoughts and beliefs can switch on a damaging stress response, can they also switch it off? Could the power of suggestion be a partial vaccination in the battle against the stress epidemic?

This is the contention of Alia Crum, a psychology professor at Stanford University and a flagbearer for the science of mindset manipulations. In 2007 she showed that if hotel chambermaids come to think of their work as exercise, they lose weight and their blood pressure falls, apparently without them working any harder. More recently, she described how UBS bankers who were shown videos about the life-enhancing effects of stress – how it can sharpen attention, boost cognition and force fresh perspectives – reported being more productive, focused and collaborative, and less afflicted by depression and anxiety.

The inescapable conclusion is this: the human mind is a powerful gatekeeper to the stress response. But we have to tread carefully here. UBS employees may have the freedom to choose a less stressful life, and the opportunity to reshape their stress mindsets. What about those whose stress is delivered early and compounded by a lifetime of disadvantage and adversity? Perhaps this is where the story of familiar, workaday stress and the grinding strain of social injustice come together. Stress gets under our skin only when we can’t see the end or spot the fix. So what, other than using Crum’s mindset interventions, can we do to restore the critical feeling of empowerment?

Emily Ansell, an assistant professor of psychiatry at Yale, says that reaching out a kindly hand to your fellow human beings can be surprisingly helpful. In a study published last year, Ansell and colleagues gave a group of 77 people a diary-like smartphone app. They asked the subjects to record all the stressful incidents they encountered, and any minor acts of kindness they performed, during a 14-day period. The data shows that gestures such as holding doors for strangers and helping elderly people across the road buffer the effects of stress and make you feel more optimistic.

Positive interactions deliver a reward at the neurological level. They restore a sense of control and show that meaningful relationships are possible. Moreover, helpers often get more psychological and health benefits than those on the receiving end of that help.

How do we encourage prosocial behaviour throughout society, particularly at the margins? According to Paul Piff, a social psychologist at the University of California, Irvine, lower-class people in America often “have less and give more”. They are more generous, charitable, trusting and helpful than their upper-class counterparts. It’s possible that this tendency to reach out and muck in is a direct response to a life of chronic stress. In response to Piff’s theory, Michael Poulin, a professor of psychology at the University of Buffalo, suggests: “We should perhaps really focus on encouraging prosocial behaviour among the well-off, potentially leading to benefits both for them – in terms of stress – and for the disadvantaged, who would presumably benefit from their generosity.”

This article is published simultaneously in the Long + Short, the free online magazine of ideas published by Nesta, the UK’s innovation foundation. thelongandshort.org

This article first appeared in the 19 May 2016 issue of the New Statesman, The Great Huckster