The New Statesman Essay 4 - Is science good for us?

Helen McCarthy argues that our unease about new technology makes it more urgent than ever to revive democracy

Judging goodness is not an exact science. Received opinion has, over the ages, recommended various pursuits for the benefits they purportedly bestow, from wearing hair shirts and reading the Bible to cleaning one's plate at dinner time and listening to Mozart. Self-improvement, be it of body or of mind, is the key, we are told, to individual happiness and collective well-being; striving to find what is good for us will lead us to the good life and the good society.

But does science help or hinder? Historians have often identified the scientific revolution of the late 17th and 18th centuries as the watershed that separated the moderns from the ancients in ways of knowing the world. As a result, superstition, tradition and custom no longer stood as the primary authorities that could explain, legitimate and preserve the status quo. The emerging spirit of inquiry and discovery released humanity from pre-modern unenlightenment; out of the darkness came the gas lamp, the electric light bulb and the ultraviolet beam, shedding light on man's formerly slavish, subordinated state of being.

In this Whiggish narrative of progress, science plays its benevolent part in bringing mankind to a higher stage of evolution. Elemental forces are mastered and managed: killer diseases no longer kill, long distances cease to be prohibitive, mass media and communications transform our knowledge of societies outside our own. The length and quality of life increase in tandem with the onward procession of scientists, physicians, inventors and techno-entrepreneurs.

The Victorian paternalists were the pioneers in bringing scientific knowledge directly to bear on solving social problems, such as poverty, disease and crime. Private philanthropy and public works, from Jeremy Bentham's Panopticon model for prisons to Edwin Chadwick's Public Health Act 1848, became rationalised: empirical in method, utilitarian in philosophy, professional in execution. By the beginning of the 20th century, the social sciences were established as academic disciplines, and, as poverty and need became quantifiable, the modern welfare state began to take shape.

Thus, spreading good and delivering relief became subject to scientific management. But the nature of the good itself did not change; people had aspired to help the poor, cure the sick and preserve law and order for centuries. In other words, science might be said to be good for us because it helps us become more efficient at being good - using an established moral framework to determine what that good should consist of. The 19th century remained for most an age of faith; morality was God-given, and science could be integrated successfully into a theistic world-view by viewing man's newfound rationality as a divine gift.

However, the advance of scientific knowledge has been coupled with an irresistible rise in secularism, bringing new questions to a post-faith age. What happens to goodness when science begins to create its own rationale? What happens to morality when the onward march of science alters the paradigm within which we both ask and answer the question: How shall I live? Modernity changed not just our view of the physical universe; for many, it caused the moral universe to be reconfigured as well.

Some early 20th-century visions of the scientific dawn were hopeful, even utopian. The Italian futurists envisaged an alternative aesthetic in which the machine would become an object of beauty and a vehicle for good. Marxism-Leninism was heralded by its advocates as the first truly scientific political system, which would build an entirely new kind of society and offer an entirely new kind of freedom. Later, industrialisation and collectivisation, put at the centre of the communist project under Stalin, would collapse morality and science together into the broader sweep of economics and ideology. As late as 1971, Shulamith Firestone formulated a feminist vision of techno-utopianism, looking hopefully to a future in which gestation and childbirth would be out-of-body experiences, thus liberating women from being defined primarily as wombed creatures.

However, many other voices have used apocalyptic language to express anxiety about the ethical possibilities of the modern era. A cultural narrative has emerged that frames the coming of the scientific age as the beginning of the battle for the soul of western civilisation. Many poets and artists of the First World War presented machines as forces that alienate and dehumanise; the natural or organic becomes a redeeming force, yet ever more fragile; the pastoral becomes elegiac. Man at odds with his environment, man endangered by his own technological achievement, man tampering with the natural world at his peril - these are themes that dominate a strand of cautionary tale-telling, from H G Wells's The Time Machine to Ridley Scott's Blade Runner.

And behind these fictionalised fantasies of what man might do with his technology lurked the disturbing reality of what man was already doing: in the industrialised carnage of Flanders fields, in the assembly-line gas ovens of Auschwitz, in the seconds of indiscriminate obliteration that consumed Hiroshima and Nagasaki. Science, without need of the artistic imagination, became at once the harbinger of total war and total death, with no pleading and no respite. Theodor Adorno said that there could be no lyric poetry after Auschwitz; perhaps neither can there be a virtuous science.

Another philosopher, Jean-François Lyotard, saw no option but to greet this troubled and troubling place - which we might call "postmodernity" - with incredulity and despair. In The Inhuman, Lyotard envisages a cyber-age in which human beings are finally discarded for their inefficiencies, irrationalities and frailties, and artificial intelligence takes over the world. The humanist project is dead, replaced by that of the non-living.

But is Lyotard's nightmare vision the only, or inevitable, future reality for a world where technology continues to be driven by its own incremental force? Or can we find a new narrative of science and morality for the 21st century?

It may be that the language of politics and democracy will provide the answer. Popular discourse has too often been shaped by a caricatured "two cultures" divide between amoral scientists and fatalistic artists. The challenge for politicians is to transcend this dichotomy and recast the debate by bringing science back into the realm of human understanding and democratic accountability.

This entails the recognition of science as an interest - rather than an ineluctable force - rooted in institutional, commercial and organisational structures, susceptible to peaks and troughs in consumer or labour markets, and heavily implicated in any analysis of where power lies in our society. Scientific knowledge, like any kind of knowledge, can not only create possibilities and tear down walls, but can also inhibit development in other directions and build new walls of social exclusion.

What science amounts to thus hinges on our ability to integrate new technologies and their practitioners into a social contract of rights balanced with responsibilities: the freedom to innovate, tempered by the obligation to seek legitimacy through the democratic process.

The popular protests that fill the news headlines - anti-capitalist rioters in Seattle and Genoa, opponents of GM foods, critics of stem-cell research, angry members of the Countryside Alliance - reflect people's fears that a coalition of business, political and scientific interests will deprive them of any stake in decisions that will determine how we live in the future. The dissenting voices may be dismissed as fatalistic, reactionary or nihilist, especially by those who would rather ignore them. But it is the failure of political leaders to nurture healthy, participatory, pluralistic political cultures that must account for the loss of confidence in representative government across western democracies.

Anxieties about where technology might lead us are therefore part of the broader malaise of our impoverished democracy. If we are to feel confident about the power of science to build a brighter future, then we must create structures for the development of moral consensus, through debate and dialogue, across communities and societies at all levels. A socially integrated, politically connected, virtuous science cannot take root within an inclusive, democratic system when that system is itself weak and failing.

Thus science can help make the case for a revitalised democracy. In practice, this could take several forms: for example, a more active and high-profile role in the public realm for regulatory bodies such as the Human Fertilisation and Embryology Authority or the Food Standards Agency; a greater emphasis on social, moral and political issues in school science teaching; more systematic requirements for the declaration of financial interests in all scientific publications, along the lines recently pioneered by Nature; and something like an international scientists' charter, to bind academics and technology-driven business to principles of democratic accountability.

In short, we need to engage citizens in public conversations about science. Though scientific experts will always need to preserve a degree of intellectual independence, and to protect the integrity of their specialist and professional knowledge, they should welcome such a participatory culture as a means of gaining public support and the stamp of moral legitimacy for their work.

Science in today's world makes us work harder to be good: the choices are tougher, the dilemmas ever more intractable, and the established value systems for making moral judgements have been displaced and fragmented. The good society, however, is still in our sights, and, with vision, collective will and a shared language of rights and responsibilities, science and democracy can join hands to build it on this earth.

Helen McCarthy is currently a Kennedy scholar at Harvard University. This essay was the winner of the Webb Essay Prize 2001, sponsored by the NS, the Foreign Policy Centre and the Webb Memorial Trust. The judges were Helena Kennedy QC, Robert Winston, Richard Rawes of the Webb Trust, Mark Leonard of the FPC and Peter Wilby, NS editor.