The cast and crew during a shoot for Downton Abbey. Photograph: Carnival Films/Nick Briggs

Horlicks for Chummy: Britain’s romance with cosy TV nostalgia

Why is our home-grown drama so fixated on the past?

British television is on a huge nostalgia binge. On one Sunday evening in January, the new series of Call the Midwife (set in the East End of London in the 1950s) was sandwiched between Blandings (a 1920s country-house comedy) and Ripper Street (a late-19th-century cop show). On the same evening, BBC2 was repeating the Second World War episode of Fawlty Towers (“Don’t mention the war”) and ITV was running Mr Selfridge (an Edwardian drama described as “Downton Abbey with tills”).

The following Tuesday, ITV offered the first part of Great Houses with Julian Fellowes. That’s not counting all the reruns of 1970s comedies. On BBC2 on Christmas Eve, apart from Carols from King’s, the entire evening schedule from 5.35pm to after midnight consisted of such repeats. Four of these made the top five for the channel’s ratings during Christmas week.

Much of today’s television drama, in particular, is set in the past, not least the two biggest hits of all, Call the Midwife and Downton Abbey. What is striking is not just that these are set in the past but how idealised their view of British history is. Why this turn to the past and why such cosy nostalgia?

There is a striking contrast with foreign TV drama. The best examples from the US (Homeland, Breaking Bad, Boss) are dark explorations of modern America. Similarly, Scandinavian series such as Wallander, The Bridge and The Killing have used detectives to transform our sense of modern Sweden and Denmark. While these series make gripping drama out of Muslim terrorists, Mexican drug cartels and modern-day politics, British TV is making Horlicks for Chummy.

The big TV event of 2013 is the new series of Call the Midwife. The Radio Times dedicated 13 pages to its return. Series one was acclaimed by critics and proved hugely popular with audiences. A second series was immediately commissioned after the drama’s opening episode attracted nearly ten million viewers. The figures for the next two episodes passed ten million and episode four’s rating of 10.89 million overtook ITV’s 2010 hit Downton Abbey to become the largest first-series audience for original drama on UK television in recent years. Both Downton and Call the Midwife are period dramas; both are hugely popular. There are two principal reasons for their appeal. First, they are soaps. Second, they present a rose-tinted vision of the past.

Call the Midwife is based on four books of memoirs by the late Jennifer Worth, about her experiences as a midwife in the East End. The differences between the books and the TV series are revealing. Worth’s books are full of fascinating social history: about living conditions in east London, the scale of poverty and violence, the realities of postwar medicine and the workhouse. In her introduction, Worth points out what a “rough area” the East End of the 1950s was. “Pub fights and brawls were an everyday event,” and: “Domestic violence was expected.” Hardly any of this features in the TV series. The terrible daily grind of life without running water, central heating and washing machines that looms large in Worth’s memoirs gives way to dewy-eyed romance.

Romance hardly features in the books. Jimmy, Jenny Lee’s on-off “friend” in the TV series, barely appears in the books and there’s no mention of his romance with Jenny. Chummy’s romance with PC Noakes only features in one chapter in the four books and Chummy herself barely appears. Even Cynthia’s moment with the widowed husband of a violinist who dies of eclampsia never happens. Indeed, Cynthia and Trixie, the minxy blonde, don’t appear that much in the books. The opposite is the case with the TV series. It cleverly mixes romance with stories from Worth’s books.

Conversely, the darkest stories in the book (“Molly”, a story of domestic abuse; “Of Mixed Descent II”, about a white husband’s violent reaction to his wife having a black baby) never made it into the first series, though a predictably happier version of “Molly” began series two. What happened with the TV adaptation was that most of the history got taken out and soapy romance was put in instead – romance and a peculiar kind of nostalgia for a time of high employment and a strong sense of community and neighbourhood.

In Call the Midwife, there is always a friendly bobby on the beat, East Enders are salt-of-the-earth types and, crucially, everyone is white (except for a few non-speaking extras). This is the appeal of Call the Midwife. Except for one Asian pimp and a few foulmouthed underclass mums, everyone is decent and respectable. Even in a family of 24 children, they all have white teeth and clean hair. This is the world we have lost, which bears little resemblance to today’s Britain of feral children, family and social breakdown and violence. Call the Midwife is like Dixon of Dock Green with babies. The result is a huge ratings success.

Something else has been cut out from the books. There’s a scene in one story in which Sister Evangelina makes a reference to The Black and White Minstrel Show and, several times, Sister Monica Joan is seen knitting golliwogs. There is no place for that in the TV series. All references to a past that might make us uncomfortable today get airbrushed out. It is unacceptable today. But isn’t that the point? We don’t want to be reminded of how different the past was. We want a past that is cosy and better than today, the past we would like to remember, not the past as it actually was – golliwogs, domestic violence and all.

The same is true with Downton Abbey. There are a few pantomime villains (the scheming Thomas, a gay servant, and Miss O’Brien, Lady Grantham’s lady’s maid) but otherwise almost everyone is decent. The Granthams treat the servants kindly and respectfully. Lord Grantham sends the cook to Moorfields Eye Hospital and pays for her cataract operation. He employs his old batman as his valet at Downton. When the footman Will’s mother is dying, he is swiftly sent home on compassionate leave. Carson, the butler, speaks of Downton as “family”: “They’re all the family I’ve got.” This is England as one happy family with Lord Grantham, an old-time Tory paternalist, in charge.

Downton Abbey has exactly the same formula as Call the Midwife. It mixes this rose-tinted view of the past with lots of romance. Grantham has three grown-up daughters – lots of opportunity for romance and gossip. Numerous young chaps come to Downton. Which one will marry Lady Mary? Or perhaps Lady Edith? There’s even the occasional scandal – the dodgy Turk (bisexual, of course); the gossipy Lady Rosamund. It’s like Dynasty with butlers.

What has been smoothed out, again, is history. There are barely any references to trade unions or tenant farmers. The Strange Death of Liberal England seems far away. There’s history with a big H: Lady Sybil is interested in women’s rights; two distant relatives (whom we never meet) die on the Titanic; there’s a single reference to Lloyd George; series one ends with the announcement of the First World War (cue countless reaction shots). Yet poverty, unemployment and falling agricultural prices are far from Downton Abbey. “I hanker for a simpler world,” says Maggie Smith as the dowager countess. That’s what we get: a simpler world with the complexities of real history removed.

It’s not just that the dark side of British history has been edited out. What is revealing is what has been left in. Both series are about close-knit communities, in which everyone knows everyone: Nonnatus House and Downton Abbey. There’s always plenty of cake and Horlicks, pale ale and allotments and, as we are reminded several times, the NHS has made miracles possible. We hear no talk of cuts. There’s always an obstetric flying squad or a copper with a kind word on hand. It’s a world of happy endings: the woman with rickets will have her healthy baby, Chummy will learn to ride her bike, Mrs Patmore will see again. In the background, we hear the dulcet tones of Harold Macmillan saying we have never had it so good.

Meanwhile, a few acclaimed American series are set in the past: Boardwalk Empire and Mad Men. But there is no Horlicks in The Killing, no coconut cake in Boss. The best Scandinavian and American drama is TV noir. Young women get sexually abused and murdered; terrible things happen in the Middle East and spread to the US and Denmark; Mexican drug barons perpetrate acts of unimaginable violence. There are no good old days, just bad new days, and nowhere is safe.

A central issue in many of these series is the border between good and evil and the constant worry that the border will not hold. Middle Eastern terrorists and Mexican drug cartels are never far away. In the second series of The Killing, Breaking Bad and Homeland, the question is: “Where is the bad guy?” The dark answer is: “He’s here.” Too close for comfort.

There is another alternative to rose-tinted nostalgia: dramas that explore the past in all its complexity and challenge conventional wisdom. During the 1970s and 1980s, a number of British TV dramas and series did exactly this: Days of Hope, Alan Bleasdale’s The Monocled Mutineer, David Hare’s Licking Hitler and Ian McEwan’s The Imitation Game and Ploughman’s Lunch were among programmes that explored significant moments in 20th-century British history, as well as issues of national identity and mythology.

More recently, Stephen Poliakoff’s plays have been about personal and national history; how we come to terms with the past and how we don’t; how the past gets to be sold off (Shooting the Past); secret histories (Perfect Strangers, The Lost Prince); the experiences of black and Jewish people. Or he brings together bits of the past that don’t seem to belong together: the royal family and black jazz musicians in Dancing on the Edge; the Holocaust and country-house drama in his earlier plays. Poliakoff shows how we can see the past differently. We don’t have to see it through the soapy prisms of romance and nostalgia.

David Herman is a writer and former television producer

This article first appeared in the 25 February 2013 issue of the New Statesman, The cheap food delusion


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart-rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, supplying such externally applied “motivation” as the Apple Watch provides. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***

 

The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge