The New Statesman Essay 4 - Is science good for us?

Helen McCarthy argues that our unease about new technology makes it more urgent than ever to revive our democracy

Judging goodness is not an exact science. Received opinion has, over the ages, recommended various pursuits for the benefits they purportedly bestow, from wearing hair shirts and reading the Bible to cleaning one's plate at dinner time and listening to Mozart. Self-improvement, be it of body or of mind, is the key, we are told, to individual happiness and collective well-being; striving to find what is good for us will lead us to the good life and the good society.

But does science help or hinder? Historians have often identified the scientific revolution of the late 17th and 18th centuries as the watershed that separated the moderns from the ancients in ways of knowing the world. As a result, superstition, tradition and custom no longer stood as the primary authorities that could explain, legitimate and preserve the status quo. The emerging spirit of inquiry and discovery released humanity from pre-modern unenlightenment; out of the darkness came the gas lamp, the electric light bulb and the ultraviolet beam, shedding light on man's formerly slavish, subordinated state of being.

In this Whiggish narrative of progress, science plays its benevolent part in bringing mankind to a higher stage of evolution. Elemental forces are mastered and managed: killer diseases no longer kill, long distances cease to be prohibitive, mass media and communications transform our knowledge of societies outside our own. The length and quality of life increase in tandem with the onward procession of scientists, physicians, inventors and techno-entrepreneurs.

The Victorian paternalists were the pioneers in bringing scientific knowledge directly to bear on solving social problems, such as poverty, disease and crime. Private philanthropy and public works, from Jeremy Bentham's Panopticon model for prisons to Edwin Chadwick's Public Health Act 1848, became rationalised: empirical in method, utilitarian in philosophy, professional in execution. By the beginning of the 20th century, the social sciences were established as academic disciplines, and, as poverty and need became quantifiable, the modern welfare state began to take shape.

Thus, spreading good and delivering relief became subject to scientific management. But the nature of the good itself did not change; people had aspired to help the poor, cure the sick and preserve law and order for centuries. In other words, science might be said to be good for us because it helps us become more efficient at being good - using an established moral framework to determine what that good should consist of. The 19th century remained for most an age of faith; morality was God-given, and science could be integrated successfully into a theistic world-view by viewing man's newfound rationality as a divine gift.

However, the advance of scientific knowledge has been coupled with an irresistible rise in secularism, bringing new questions to a post-faith age. What happens to goodness when science begins to create its own rationale? What happens to morality when the onward march of science alters the paradigm within which we both ask and answer the question: How shall I live? Modernity changed not just our view of the physical universe; for many, it caused the moral universe to be reconfigured as well.

Some early 20th-century visions of the scientific dawn were hopeful, even utopian. The Italian futurists envisaged an alternative aesthetic in which the machine would become an object of beauty and a vehicle for good. Marxism-Leninism was heralded by its advocates as the first truly scientific political system, which would build an entirely new kind of society and offer an entirely new kind of freedom. Later, industrialisation and collectivisation, put at the centre of the communist project under Stalin, would collapse morality and science together into the broader sweep of economics and ideology. As late as 1971, Shulamith Firestone formulated a feminist vision of techno-utopianism, looking hopefully to a future in which gestation and childbirth would be out-of-body experiences, thus liberating women from being defined primarily as wombed creatures.

However, many other voices have used apocalyptic language to express anxiety about the ethical possibilities of the modern era. A cultural narrative has emerged that frames the coming of the scientific age as the beginning of the battle for the soul of western civilisation. Many poets and artists of the First World War presented machines as agents of alienation and dehumanisation; the natural or organic becomes a redeeming force, yet ever more fragile; the pastoral becomes elegiac. Man at odds with his environment, man endangered by his own technological achievement, man playing around with the natural environment at his peril - these are themes that dominate a strand of cautionary tale-telling, from H G Wells's The Time Machine to Ridley Scott's Blade Runner.

And behind these fictionalised fantasies of what man might do with his technology lurked the disturbing reality of what man was already doing: in the industrialised carnage on Flanders fields, in the assembly-line gas ovens of Auschwitz, in the seconds of indiscriminate obliteration that consumed Hiroshima and Nagasaki. Science, without need of the artistic imagination, became at once the harbinger of total war and total death, with no pleading and no respite. Theodor Adorno said that there could be no lyric poetry after Auschwitz; perhaps there can be no virtuous science either.

Another philosopher, Jean-François Lyotard, saw no option but to greet this troubled and troubling place - which we might call "postmodernity" - with incredulity and despair. In The Inhuman, Lyotard envisages a cyber-age in which human beings are finally discarded for their inefficiencies, irrationalities and frailties, and artificial intelligence takes over the world. The humanist project is dead, replaced by that of the non-living.

But is Lyotard's nightmare vision the only, or inevitable, future reality for a world where technology continues to be driven by its own incremental force? Or can we find a new narrative of science and morality for the 21st century?

It may be that the language of politics and democracy will provide the answer. Popular discourse has too often been shaped by a caricatured "two cultures" divide between amoral scientists and fatalistic artists. The challenge for politicians is to transcend this dichotomy and recast the debate by bringing science back into the realm of human understanding and democratic accountability.

This entails the recognition of science as an interest - rather than an ineluctable force - rooted in institutional, commercial and organisational structures, susceptible to peaks and troughs in consumer or labour markets, and heavily implicated in any analysis of where power lies in our society. Scientific knowledge, like any kind of knowledge, can not only create possibilities and tear down walls, but can also inhibit development in other directions and build new walls of social exclusion.

What science amounts to thus hinges on our ability to integrate new technologies and their practitioners into a social contract of rights balanced with responsibilities: the freedom to innovate, tempered by the obligation to seek legitimacy through the democratic process.

The popular protests that fill the news headlines - anti-capitalist rioters in Seattle and Genoa, opponents of GM foods, critics of stem-cell research, angry members of the Countryside Alliance - reflect people's fears that a coalition of business, political and scientific interests will deprive them of any stake in decisions which will determine how we live in the future. The dissenting voices may be described as fatalistic, reactionary or nihilist, especially by those who would rather choose to ignore them. But it is the failure of political leaders to nurture healthy, participatory, pluralistic political cultures which must account for the loss of confidence in representative government across western democracies.

Anxieties about where technology might lead us are therefore part of the broader malaise of our impoverished democracy. If we are to feel confident about the power of science to build a brighter future, then we must create structures for the development of moral consensus, through debate and dialogue, across communities and societies at all levels. A socially integrated, politically connected, virtuous science cannot be successfully locked into an inclusive, democratic system when that system itself is weak and failing.

Thus science can help make the case for a revitalised democracy. In practice, this could take several forms: for example, a more active and high-profile role in the public realm for regulatory bodies such as the Human Fertilisation and Embryology Authority or the Food Standards Agency; a greater emphasis on social, moral and political issues in school science teaching; more systematic requirements for the declaration of financial interests in all scientific publications, along the lines recently pioneered by Nature; and something like an international scientists' charter, to bind academics and technology-driven business to principles of democratic accountability.

In short, we need to engage citizens in public conversations about science. Though scientific experts will always need to preserve a degree of intellectual independence, and to protect the integrity of their specialist and professional knowledge, they should welcome such a participatory culture as a means of gaining public support and the stamp of moral legitimacy for their work.

Science in today's world makes us work harder to be good: the choices are tougher, the dilemmas seem ever more impossible, and the established value systems for making moral judgements have been displaced and fragmented. The good society, however, is still in our sights, and, with vision, collective will and a shared language of rights and responsibilities, science and democracy can join hands to build it on this earth.

Helen McCarthy is currently a Kennedy scholar at Harvard University. This essay was the winner of the Webb Essay Prize 2001, sponsored by the NS, the Foreign Policy Centre and the Webb Memorial Trust. The judges were Helena Kennedy QC, Robert Winston, Richard Rawes of the Webb Trust, Mark Leonard of the FPC and Peter Wilby, NS editor

This article first appeared in the 17 December 2001 issue of the New Statesman, The ignorance of the Islamophobes


Rupert Goold: “A director always has to be more of a listener”

The artistic director of the Almeida Theatre on working with Patrick Stewart, the inaccessibility of the arts, and directing his wife in Medea.

Eight years ago Rupert Goold’s Macbeth made his name. The critics were unanimous in their praise, with one calling it the “Macbeth of a lifetime”. Goold’s first Olivier Award soon followed (Enron won him a second in 2009, King Charles III nearly won him a third last year). It was a family triumph; Lady Macbeth was played by Goold’s wife, Kate Fleetwood.

Now the pair has finally reunited and Fleetwood is his undisputed lead. She is playing Medea in the Almeida’s latest and final play of its Greek season. Directing your wife is one thing. Directing her in a play about a woman who murders her children because her husband abandons her is another. And it’s been harder than Goold expected.

“You live with someone every day, and they don’t age because the change is so incremental, and then you do something together and you realise how much you’ve changed. It’s like playing tennis with someone after eight years: you’re completely different players.”

As it is, Goold thinks the director-actor relationship is inevitably fraught. “There is an essential slave-master, sadomasochistic, relationship,” he says. “The incredibly complicated thing about being an actor is you’re constantly being told what to do. And one of the most damaging things about being a director – and why most of them are complete arseholes – is because they get off on telling people what to do.”

Goold doesn’t. He’s as amicable in person as the pictures – bountiful hair, loose jacket, wide grin – suggest. And when we meet in the Almeida’s crowded rehearsal rooms, tucked away on Upper Street, 100 yards from the theatre, he’s surprisingly serene given his play is about to open.

He once said that directing a play is like running towards a wall and hoping it becomes a door just before the curtain goes up. Has the door appeared? “It’s always a funny moment [at the end of rehearsal]. Sometimes you do a show and it’s a bit dead and the costumes and set transform it. Then sometimes it’s perfect and the design kills it.”

We meet shortly before last Thursday’s press night, and he can’t tell how good it is. But it “certainly feels quite private. The idea that loads of people are going to come and watch it now feels a bit weird. You bring a lot of your sense of relationships and parenting into it.”

Goold has always argued that the classics wither without intervention. So in this revival of Euripides’ 2,446-year-old play, Medea is a writer and her husband, Jason (of Argonauts fame), is an actor. “But it’s not really about that… it’s more about divorce, about what it means to separate.”

“It’s about the impact of a long-term relationship when it collapses. I don’t know whether there is a rich tradition of drama like that, and yet for most people, those kind of separations are far more profound and complicated and have greater ramifications than first love; and we have millions of plays about first love!”

Every generation discovers their own time in the Greek plays. Goold thinks he and playwright Rachel Cusk were shaped by the aftermath of the 1970s in interpreting Medea; “That’s the period when the idea of the family began to get tainted.” And when critics praised Oresteia, the Almeida’s first Greek play and a surprise West End transfer, they compared it to The Sopranos.

Yet there is something eternal about these plays. Goold says it’s the way they “stare at these problems that are totally perennial, like death,” and then offer answers that aren’t easy. Medea kills the kids and a mother rips her son to shreds in the Bakkhai (the Almeida’s predecessor to Medea). Where’s the moral compass in that?

Except there is a twist in Goold’s Medea, and it’s not one every critic has taken kindly to. It was enough to stop the Telegraph’s Dominic Cavendish, otherwise lavish in his praise, from calling it “a Medea for our times”. Nevertheless, the reviews have been kind, as they often are for Goold; although The Times’ Ann Treneman was vitriolic in her dislike (“Everyone is ghastly. The men are beyond irritating. The women even worse.”).

In theory, Goold welcomes the criticism. “I’d rather our audience hated something and talked about it than was passively pleased,” he tells me ahead of reviews.

Controversial and bracing theatre is what Goold wants to keep directing and producing; as the Almeida’s artistic director he is in charge of more than just his own shows. But how does he do it? I put a question to him: if I had to direct Medea instead of him, what advice would he give me?

He pauses. “You’ve got to love words,” he begins. “There’s no point doing it unless you have a real delight in language. And you have to have vision. But probably the most important thing is, you’ve got to know how to manage a room.”

“It’s people management. So often I have assistants, or directors I produce, and I think ‘God, they’re just not listening to what that person is trying to say, what they’re trying to give.’ They’re either shutting them down or forcing them into a box.”

“Most people in a creative process have to focus on what they want to say, but a director always has to be more of a listener. People do it different ways. Some people spin one plate incredibly fast and vibrantly in the middle of the room, and hope all the others get sucked in. It’s about thriving off of one person – the director, the lead performer, whomever.”

“I’m more about the lowest common denominator: the person you’re most aware of is the least engaged. You have to keep lifting them up, then you get more creativity coming in.”

It’s not always simple. When actors and directors disagree, the director can only demand so much, especially if the actor is far more famous than them. When Goold directed Macbeth, Patrick Stewart was his lead. Stewart was a movie star and twice his age.

“Patrick’s take on Macbeth… I didn’t think it should be played that way. I’d played him as a student and I had an idea of what he was.”

“But then you think, ‘Ok, you’re never going to be what I want you to be, but actually let me get rid of that, and just focus on what’s good about what you want to be, and get rid of some of the crap.’”

Goold doesn’t think he’s ever really struggled to win an actor’s respect (“touch wood”). The key thing, he says, is that “they just feel you’re trying to make legible their intention”.

And then you must work around your lead. In Macbeth, Stewart was “a big deep river of energy… when normally you get two people frenetically going ‘Uhgh! Is this a dagger I see before me! Uhgh!’ and there’s lots of hysteria.”

“So we threw all sorts of other shit at the production to compensate, to provide all the adrenalin which Patrick was taking away to provide clarity and humanity.”

Many people want to be theatre directors, and yet so few are successful. The writers, actors and playwrights who sell shows can be counted on a few hands. Depressingly, Goold thinks it’s becoming harder to break in. It’s difficult to be discovered. “God, I don’t know, what I worry – wonder – most is: ‘Are there just loads of great directors who don’t make it?’”

The assisting route is just not a good way to find great new directors. “The kind of people who make good assistants don’t make good directors, it’s almost diametrically opposite.” As for regional directors, newspaper budgets have collapsed, so they can no longer rely on a visit from a handful of national critics, as Goold did when he was based in Salisbury and Northampton. And audiences for touring shows have, by some measures, halved in the past twenty years.

Theatre has also evolved. When Goold was coming through, “There were not a lot of directors who felt they were outside the library, so for me to whack on some techno was radical! Now it’d be more commonplace.” New directors have to find new ways to capture our attention – or at least the critics’.

But the critics have changed too. A nod from a critic can still be vital in the right circles, but the days when critics “made” directors are long over. “I remember Nick de Jongh saying, ‘Oh Rupert Goold, I made him.’ Because he’d put Macbeth on the front page of the Standard. I owed my career to him, and in some ways I did! But it's an absurd idea, that would not happen now.”

“It’s all changed so much in literally the past three years. There was a time, for better or worse, when you had a big group of establishment critics: de Jongh, Michael Billington, Michael Coveney, Charlie Spencer – they were mostly men – Susannah Clapp. And if they all liked your show, you were a hit.” (“They could be horrible,” he adds.)

“Now I get more of a sense of a show by being on Twitter than reading the reviews.” It’s “probably a good thing”, Goold thinks, and it certainly beats New York, where a single review – the New York Times' – makes or breaks plays. But it’s another problem for aspiring directors, who can no longer be so easily plucked from the crowd.

It’s no longer a problem Goold needs to overcome. His star could wane, but he seems likely to be among the leading voices in British theatre for a while yet.

Harry Lambert is a staff writer and editor of May2015, the New Statesman's election website.