Why life is good

A dangerous gap exists between our personal experience, which is mainly happy, and our view of a society that we believe to be in decline

Progressive ideology relies on the capacity of human beings to live fulfilled lives in a just and co-operative society. That people whose beliefs imply optimism seem to spend most of their time wallowing in pessimism is one reason that leftists sometimes lack personal credibility (another reason being that egalitarians so clearly enjoy being very well-off). But miserable idealists need to make a New Year resolution to look on the bright side. Pessimism is becoming an impediment to progressive politics. It is 50 years since J K Galbraith coined the phrase "private affluence and public squalor"; today, the dichotomy is between private hubris and public pessimism.

It is pessimism of a particular and pernicious kind. People are not generally negative about their own lives. In fact, we systematically exaggerate the control we have as individuals. As Malcolm Gladwell, among others, has shown, we tend to give our conscious minds credit for many reactions that are in fact instinctive. Other studies - of what we say has made us happy and what has actually increased our levels of contentment - show that we have a huge capacity to rationalise our life choices. When we are forced to make a choice between limited options, we are as likely to end up claiming the choice as our own as we would if it were unconstrained. And the more we like a future possibility in our lives, the more inclined we are to believe it will happen. The human mind is hard-wired to be personally Panglossian.

In contrast, we are unduly negative about the wider world. As a government adviser, I would bemoan what we in Whitehall called the perception gap. Time and again, opinion polls expose a dramatic disparity between what people say about their personal experiences and about the state of things in general. Take attitudes towards public services. In a recent poll, 81 per cent of respondents said that they were happy with their last visit to hospital. Yet when the same people were asked whether they thought the National Health Service was providing a good service nationally, only 47 per cent said that it was, and most thought the NHS was going to get worse.

This perception gap is not restricted to public services, as a recent BBC poll on families confirms. Some 93 per cent of respondents described themselves as optimistic about their own family life, up 4 per cent from the previous time the survey was conducted, 40 years ago. Yet more people - 70 per cent, across race, class and gender - believe families are becoming less successful overall. While we apparently thrive in our own families of many shapes and forms, as social commentators we prefer to look back, misty-eyed, to the gendered certainties of our grandparents' generation.

What is true for families is true for neighbourhoods: we think ours is improving while community life is declining elsewhere. We tend to like the people we know from different ethnic backgrounds but are less sure about such people in general. We think our own prospects look OK but society is going to the dogs.

The media seem to be the most obvious cause of this phenomenon. Bad news makes more compelling headlines than good. Tabloids and locals feed off crime stories, middlebrow papers are dismayed at the chaos of the modern world and the alleged venality and ignorance of those in power, and left-leaning broadsheets enjoy telling us that global instability is endemic and environmental apocalypse inevitable. Meanwhile, the content of television programmes - from dramas to news bulletins - contributes to what the communication theorist George Gerbner called "mean world syndrome": people who regularly watch TV systematically overstate the level of criminality in society.

Yet it is too easy to blame the media; the job of commissioning editors is to give us what we want. We make our own contribution to social pessimism. In the burgeoning industry of reputation management, it is generally argued that people are much more likely to tell others about bad experiences of services than good ones (5:1 is the usual ratio). Academic research suggests that people tend to exaggerate in the direction of the general mood. Viewing our own lives positively but wider society negatively, we will tend to pass on and exaggerate evidence that supports these prejudices.

Evolutionary determinists may seek an explanation of our predilection for bad news in neurological hard-wiring; perhaps, for the survival of hunter-gatherers, warning is more important than celebrating. But it is in two of the mega-trends of modernity that more likely reasons for our social pessimism are to be found.

First, there has been the inexorable rise in individualism since the Enlightenment. As Richard Sennett brilliantly argued in The Fall of Public Man, aspects of modernity such as the power of consumer capitalism and the ubiquity of the idioms of psychotherapy have accelerated the process by which we see our authentic selves as revealed in the private and personal spheres, rather than the public and social.

Unstoppable force

Hand in hand with the rise of individualism, we have seen the decline of industrial and pre-industrial collectivist institutions, including the organised church, trade unions, political parties and municipal elites. Robert Putnam's work on social capital suggests this decline in collectivism reaches down into our social lives, with people choosing to spend less time with acquaintances and more with intimates. Putnam's more recent work controversially argues that trust levels are lower and loose social networking less common in more diverse communities.

This points to the second of modernity's mega-trends. Increasingly, we feel that we are the victims of processes set in train by human activity but no longer under anyone's control. Globalisation is the gravity of modern society: an unstoppable force that will knock us over if we try to defy it. The origins of the current credit squeeze in the US sub-prime mortgage market show a financial system that is beyond not only its managers' control, but even their capacity to chart.

Illegal immigration, terrorism and pandemics are seen as the inevitable flip side of cheap travel and consumer goods. Philosophers and policy-makers argue about how best to regulate emerging science and technology in genetics, nanotechnology and artificial intelligence. But can anything long delay the advance of knowledge - especially if it has commercial applications?

It is not only that we as ordinary citizens feel beset by forces beyond our control. We are ever less likely to believe in the power or authority of our elected representatives (although we much prefer our own MP to MPs in general). At a time when they have more to prove to us than ever before, our leaders are diminished by the politics of a populist consumerism. In this time of uncertainty, is it surprising that the more politically successful national leaders - think Chávez or Putin - are those who offer strong leadership in defiance of democratic constraints?

This is the anatomy of social impotence. By definition, progressives argue for the possibilities of progress; but is anyone inclined to believe us? A hundred years ago, Joseph Rowntree established his charitable works after analysing the social evils of his age. When, last year, the Joseph Rowntree Foundation asked today's public for its definition of the "new social evils", the list had changed very little. Greed, poverty, crime, family and community breakdown all featured on both lists. But at a seminar to discuss the findings, advisers from the foundation and elsewhere agreed on one big shift between the late-Victorian era and today: while Rowntree had seen his evils as the unfinished business of society's onward march, today we see social pathologies as the inevitable consequences of an idea of progress that itself feels imposed upon us.

Brainier than before

And yet. There is a different story to be told about our world. It is a story of unprecedented affluence in the developed world and fast-falling poverty levels in the developing world; of more people in more places enjoying more freedom than ever before. It is a story of healthier lives and longer life expectancy (obesity may be a problem, but it is one that individuals have more chance of solving than rickets or polio). Think of how we thrive in the diversity of modern cities. Think, in our own country, of rivers and beaches cleaner than at any time since the Industrial Revolution. When you read the next report bemoaning falling standards in our schools, remember the overwhelming evidence that average IQs have risen sharply over recent decades. If you think we have less power over our lives, think of the internet, of enhanced rights at work and in law, or remember how it was to be a woman or black or gay 30 years ago.

As for the powerlessness of leaders, the Bali deal last month may leave much to be resolved, but isn't this at last a sign that nations can unite in the best interests of the planet? And should we really lose faith that human determination and ingenuity ultimately will win through? Despite the power of international finance, this is a world where it is possible to be economically successful in societies as deliberately different as those of Sweden or the United States.

We rightly worry about rogue states and terrorists with dirty bombs; but let us also remember that since Nagasaki we have managed to carry on for 60 years without anyone unleashing the power of nuclear warfare. Not only have there been three generations of peace in Europe, but when in the past has a project as grand as EU enlargement been accomplished, let alone accomplished in a decade?

Progressives want the world to be a better place. We bemoan its current inequities and oppression - yet if we fail to celebrate the progress that human beings have made, and if we sound as though the future is a fearful place, we belie our own philosophy. Instead, we need to address a deficit in social optimism that threatens the credibility of our core narrative.

There are many aspects to this; we should, for example, be making the case for a more balanced and ethical media. But my starting point is the need to forge a new collectivism. It is in working with others on a shared project of social advance that we can be reconnected to the sense of collective agency so missing from modern political discourse. It is the attitude of the spectator that induces pessimism, the experience of the participant that brings hope. The problem is not that change brings fear and disorientation (there's nothing new in this), it is that we lack the spaces and places where people can renew hope and develop solutions.

The old collectivism is dead or dying. Its characteristics - hierarchical, bureaucratic, paternalistic - are no longer suited to the challenges or the mood of the times. The institutions of the new collectivism must be devolved, pluralistic, egalitarian and, most of all, self-actualising.

For all the talk of the decline of social capital, people are doing more stuff together. Twenty-five years ago, with falling audiences, commentators assumed that the cinema and live football were dead: we would all rather stay in the safety and comfort of our new, hi-tech living rooms. But then the multiplex, the blockbuster, the all-seater stadium and foreign players showed the problem to be no deeper than the failure to keep up with modern tastes and expectations.

Self-actualisation is the peak of Maslow's hierarchy of needs. There is evidence that more of us are trying to climb that hierarchy. It is in the crowds at book festivals and art galleries, in ever more demanding consumerism with an emphasis on the personal, sensual and adventurous. We want to enjoy ourselves, to be appreciated and to feel we are growing from the experience. Compare that to the last Labour Party, trade union or council meeting you went to.

Roll up your sleeves

The failure to provide routes to collective fulfilment means we assume that our journey is best pursued alone. In the 1970s and 1980s, new left movements at home and abroad placed emphasis on forms of political organisation and debate that were innovative, exciting and (dare I say it without mockery) consciousness-raising.

Today, there are signs of a yearning for new ways of working together. There is the growing interest in social and co-operative enterprise and the emergence of new forms of online collaboration. Gordon Brown's citizens' juries are a tentative step in the right direction, albeit without much fun or risk-taking, but generally, progressives seem more interested in bemoaning the state of the world than in rolling up their sleeves and getting to work on building the institutions of a new collectivism.

Despite the huge impersonal forces of the modern world, people are prepared not only to believe in a better future, but to work together to build it. Tackling climate change offers a fascinating opportunity to interweave stories of action at the individual, community, national and international levels. This potential will be fulfilled only when we provide spaces for collective decision-making and action that speak to the same vision of collaboration, creativity and human fulfilment that progressives claim to be our destiny.

Matthew Taylor is chief executive, Royal Society for the Encouragement of Arts, and former chief adviser on political strategy to Tony Blair


This article first appeared in the 07 January 2008 issue of the New Statesman, Pakistan plot


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, offering such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***

The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.
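
To make that distinction concrete, here is a minimal sketch in Python of operant conditioning in the Pavlok's mould; the action names and numbers are purely illustrative assumptions, not the device's actual algorithm or data from any study. A voluntary action is repeatedly followed by negative reinforcement, and the learned appeal of that action, and so the probability of choosing it, gradually falls.

```python
import random
import math

# Learned "attractiveness" of each voluntary action; names and numbers
# are illustrative only, not the Pavlok's actual algorithm.
values = {"habit": 1.0, "abstain": 0.0}
LEARNING_RATE = 0.2
ZAP = -1.0       # negative reinforcement delivered after the habit
NEUTRAL = 0.0    # no consequence for abstaining

def choose(values, temperature=1.0):
    """Pick an action with probability rising in its learned value (softmax)."""
    weights = {a: math.exp(v / temperature) for a, v in values.items()}
    r = random.random() * sum(weights.values())
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return action  # fallback for floating-point rounding

for trial in range(1, 51):
    action = choose(values)
    consequence = ZAP if action == "habit" else NEUTRAL
    # Operant conditioning: the consequence follows the voluntary action
    # and nudges that action's learned value towards it.
    values[action] += LEARNING_RATE * (consequence - values[action])
    if trial % 10 == 0:
        print(f"trial {trial:2d}: learned value of the habit = {values['habit']:+.2f}")
```

Classical conditioning, by contrast, needs no choice at all: one stimulus is simply paired with another, and the salivation follows the bell.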

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.
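
The dynamics Skinner described can be illustrated with a toy simulation, sketched below with assumed parameters rather than figures from any real experiment: under a stable contingency the simulated animal quickly settles on the rewarded lever, while random switching of the levers' functions keeps it close to chance.

```python
import random

# A toy two-lever Skinner Box. All parameters are assumptions for
# illustration, not figures from any actual experiment.
def run(trials=500, switch_randomly=False, learning_rate=0.1, seed=0):
    random.seed(seed)
    values = [0.0, 0.0]   # the animal's learned estimate of each lever
    correct = 0           # the lever that currently delivers food
    right_presses = 0
    for _ in range(trials):
        if switch_randomly and random.random() < 0.2:
            correct = 1 - correct        # contingency changed unpredictably
        # Mostly press the currently preferred lever, with a little exploration.
        if random.random() < 0.1:
            lever = random.choice((0, 1))
        else:
            lever = 0 if values[0] >= values[1] else 1
        reinforcement = 1.0 if lever == correct else -1.0   # food vs. shock
        values[lever] += learning_rate * (reinforcement - values[lever])
        right_presses += (lever == correct)
    return right_presses / trials

print("stable contingency:", run(switch_randomly=False))  # learns the rewarded lever
print("random switching:  ", run(switch_randomly=True))   # stays close to chance
```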

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge