Illustration by Nick Hayes for the New Statesman.

What David Cameron can learn from Abraham Lincoln

Wearing the Union blue.

Abraham Lincoln’s Emancipation Proclamation of 1863 is one of the noblest statements ever delivered, and forcing the abolition bill through a reluctant Congress, as Steven Spielberg’s masterful Oscar-nominated film attests, was a monumental achievement. But Lincoln’s principal contribution to American history was to save the Union, as those from the Southern states are quick to tell you. In the former Confederacy, the civil war is still called “the War Between the States”.

Lincoln confided his thoughts about secession and slavery in a letter of 1862. “If I could save the Union without freeing any slave I would do it, and if I could save it by freeing all the slaves I would do it; and if I could save it by freeing some and leaving others alone I would also do that,” he wrote. “What I do about slavery, and the coloured race, I do because I believe it helps to save the Union; and what I forbear, I forbear because I do not believe it would help to save the Union.”

His proclamation did not, in fact, free the slaves in the border states loyal to the Union, nor was he in a position to free slaves in the Confederate South, but, under his powers as commander-in-chief in wartime, he issued an executive order that freed all slaves in the Southern states as soon as they were occupied by the Union army.

It may at first seem a little far-fetched, but there are poignant similarities between the conundrum that Lincoln encountered 150 years ago and the dilemma David Cameron faces today. They are both confronted with threats to the very existence of the nations they govern. While Lincoln was obliged to respond to a fait accompli, a group of slave states that had decided before his election to wrest themselves from the Union, by force of arms if necessary, Cameron finds himself under siege on all sides. But while Lincoln was presented with the simple option of whether to take up arms to defend the Union or watch as his country split in two, Cameron has no such easy choice.

In Scotland, the Scottish National Party has finally achieved what it has been dreaming of for 80 years. It has a mandate to demand from Westminster a referendum on whether, after three centuries united with England and Wales, Scotland should become a free nation again. The Union came about as a result of the Union of the Crowns, when the Scottish king James VI, son of Mary, Queen of Scots, acceded to the throne of England following the death of the childless Elizabeth I in 1603. It took a full century before the English and Scottish parliaments combined in the Acts of Union of 1707. Lincoln was obliged to defend a union barely 90 years old; Cameron must protect a union that has lasted more than 300 years.

In Ireland, Cameron presides over the latest skirmish in a bloody struggle that has lasted much longer. The colonisation of Ireland was messy and brutal from the start, and the independence wrested from Britain in 1922 left the northern, overwhelmingly Protestant and unionist part of the island in British hands. A border had to be drawn somewhere, leaving many who would prefer to live in the republic stranded in a British province. The continuing troubles offer a challenge to Cameron to find a permanent peace. No less than in Scotland, British sovereignty and British lives are severely at risk.

Then there is the European Union. Those with a sense of history will remember that joining Europe was always predominantly a Conservative project. It was Harold Macmillan, with Edward Heath at his side, who first flirted with the continentals in 1961 and had his overture rudely rebuffed by Charles de Gaulle’s “Non!”. Heath, the eternal bachelor, then made it his life’s mission to make a marriage with the Europeans, and the lasting legacy of his otherwise awkward, chilly and ultimately tragic premiership was British entry into the European Economic Community in 1973. As Cameron must be all too aware, the principled Heath condemned the referendum that Harold Wilson called on European membership two years later as a shabby gimmick, designed to appease internal Labour divisions over the Common Market.

Since the moment when Heath’s successor Margaret Thatcher – who had campaigned in favour of remaining in Europe in 1975 – began arguing, as prime minister after 1979, against closer European union, the Conservatives have been profoundly and openly divided on the matter. The rupture over Europe, even more than Thatcher’s unpopular poll tax, led to her defenestration by cabinet colleagues in 1990. John Major’s leadership of the Tories was blighted by the question of Europe; and the election of three Eurosceptic leaders in a row – William Hague, Iain Duncan Smith and Michael Howard – did not settle the matter.

Cameron’s inheritance is a party facing both ways on Europe, and his inability to reconcile the opposing forces has given rise to a challenge for the affections of his patriotic electoral base from the anti-European Ukip. Although Ukip’s leader, Nigel Farage, along with every other Ukip candidate, failed to win a Commons seat in 2010 (Farage was beaten by a candidate dressed as a dolphin), his party stole enough votes from the Conservatives to deprive Cameron of a parliamentary majority.

When he dreamed of leading his party, Cameron could never have imagined that Britain’s existence would be subject to a three-pronged attack. But he finds himself in the same position as today’s Republican leadership in America, under assault from an angry rank and file who feel they are being ignored and betrayed by their leaders. The Republicans, once the proud “party of Lincoln”, have evolved into a testy vehicle for insurgent mavericks and malcontents.

To add insult to indignity, the “Grand Old Party”, which once bravely saved the Union, is the home of a new secessionist movement. Having failed to wrest substantial powers from the federal government for the states, many Republicans are now demanding outright independence. At present, eight states, all from the defeated Confederacy, have petitioned the White House to be allowed to secede: Texas, Louisiana, Florida, Georgia, Tennessee, North Carolina, Alabama and South Carolina. The muddled, ahistorical thinking behind the treacherous talk is evident in the argument proffered by the libertarian Ron Paul: “It’s very American to talk about secession. That’s how we came into being.”

On a personal level, there are as few similarities between Cameron and Lincoln as between Jacob and Esau. Lincoln was brought up in a sparsely furnished log cabin and, much to his ignorant father’s despair, taught himself to read and write, eventually emerging as a jobbing country lawyer in Illinois. Cameron, as we know, was the son of high privilege. Everyone who met Lincoln commented on his rough looks and his even rougher clothes. Cameron’s smooth, unlined face betrays an easy, affluent, well-fed life.

Both men, however, could be described more accurately as Whigs than Conservatives, in their commitment to parliament or Congress over absolute powers held by the monarch or president. Indeed, Lincoln was an old American Whig before he joined the Republicans over the issue of abolition. Allied to their commitment to rewarding individual effort, irrespective of background, is a strong, Protestant sense that their good fortune entails paying something back. Despite his comfortable circumstances, Cameron has argued that “it’s where you are going to, not where you have come from, that matters”. In a decisive break from the philosophy of heroic individualism that inspired Thatcher, he believes “there is such a thing as society”.

As well as soaring ambition, the two men share other similarities. Both are most eloquent when they do not refer to notes. Although stiff and wooden at first, Lincoln’s speeches gathered pace and by the peroration he would be ripping off his necktie, loosening his starched collar and throwing his arms around like a deranged windmill. “His pronunciation is bad, his manners uncouth and his general appearance anything but prepossessing,” is how one eyewitness described his platform presence.

Cameron’s delivery is calm, ordered and deliberate. His speech to the Tory party conference in 2005, delivered without notes, may not have been as powerful and inspirational as the 268 words of Lincoln’s Gettysburg Address of 1863, which would be a tall order for anyone except, perhaps, Winston Churchill. But the performance at Blackpool, in its carefully pitched content tailored to the party faithful and the confidence of its delivery, ensured his election as leader.

Lincoln took into his administration the big beasts of the Republican Party whom he had beaten to the Republican nomination: William H Seward, Salmon P Chase and Edward Bates. And Cameron, too, assembled a team of former rivals. To become Tory leader, he saw off David Davis, Liam Fox and Kenneth Clarke, all of whom he invited into his shadow cabinet. Like Lincoln, Cameron leads his disparate colleagues with the minimum of friction. But there the favourable comparisons between the two leaders start to run out.

Lincoln was always a man of principle rather than pragmatism. He could be rash, failing to hold his tongue in the presence of those he knew disagreed with him, and found it difficult to compromise even when it was in his best interest to do so. Nowhere was this more obvious and powerful than when he spelled out, years before running for the White House, what he felt about race.

He declared that when the Founding Fathers wrote, “We hold these truths to be self-evident: that all men are created equal,” they meant “the whole great family of man” and not merely those with white faces. Lincoln said the founders knew enough about human nature to imagine that, “in the distant future”, people would emerge who would “set up the doctrine that none but rich men, or none but white men, were entitled to life, liberty and the pursuit of happiness”. But he was certain that racism could never have been in the founders’ minds and he would have none of it.

In comparison to this eloquent statement of principle, just one among dozens that Lincoln crisply articulated in his short life, Cameron emerges as a dissembler, always alert for a way to delay taking a stand, ever ready with the smudgy phrase and the tactical retreat. Let us give him a pass on Ireland. Few have got it right and it may well be insoluble so long as a vociferous minority in Northern Ireland demands the impossible and the intractable majority insists on being British. In Scotland, however, when the SNP obtained a majority in the Scottish Parliament and claimed a mandate to call a referendum on independence, Cameron ignored Lincoln’s example of resisting the dissolution of the Union and readily agreed to Alex Salmond’s demands.

In calling an all-or-nothing, in-out referendum on independence next year in Scotland only, David William Donald Cameron, to give him his full, Scots-derived name, failed to question the legality of one half of the nation being able to secede from the other on its own cognisance. Instead, he conceded the principle that if the referendum records a majority of Scots in favour of secession, that is enough to grant a divorce, as if England, Wales and Northern Ireland, and the Scots living in the rest of Britain, were not entitled to a say in the dissolution of the United Kingdom. “I’m not going to stand here and suggest Scotland couldn’t make a go of being on its own, if that’s what people decide,” Cameron said. “There are plenty of small, independent nation states of a similar size or even smaller. Scotland could make its way in the world alongside countries like those.”

Lincoln would never have yielded on such a fundamental principle. As he put it, “If we do not make common cause to save the good old ship of the Union on this voyage, nobody will have a chance to pilot her on another voyage.”

When Cameron conceded the principle that one part of the United Kingdom may constitutionally break from the rest, he also declared himself “ready for the fight for our country’s life”. He appears to be in favour of two incompatible principles, the right of Britain to remain a nation and the right of Scotland to secede. He then invokes the principle that gives Scotland the moral right to secede to justify his party’s demand that Britain be allowed to renegotiate a looser union with our European partners. What, then, is Cameron’s guiding principle when dealing with Scotland and the European Union? There is none. Both are craven acts of political expedience. His promise of a referendum on British membership of the EU is largely an attempt to save the Conservatives from being driven from office by Ukip.

Cameron’s answer to the Ukip threat to the renewal of his Downing Street lease is to avoid saying exactly what the relationship between Britain and the EU should be, because plainly he doesn’t know where the line should be drawn. Instead he abdicates the responsibility of a true leader and, in the hope of being re-elected, promises an in-out referendum on EU membership. As Lincoln asked, “What is conservatism? Is it not adherence to the old and tried against the new and untried?”

Cameron is less a conservative than a trimmer, less a Heath than a Wilson, less a Thatcher than a Blair.

When Lincoln confronted the break-up of the United States, he borrowed from the Gospel according to Saint Matthew: “A house divided against itself cannot stand.” To avoid the consequences of the Conservatives’ deeply divided house, Cameron is willing to risk the dissolution of the United Kingdom and British withdrawal from the European Union. Both are too high a price to pay for trying to bridge the irreconcilable schism in the Tory ranks.

Nicholas Wapshott’s most recent book is “Keynes Hayek: the Clash That Defined Modern Economics” (W W Norton, £12.99)


This article first appeared in the 11 February 2013 issue of the New Statesman, Assange Alone

John Devolle/Getty Images

Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand-new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, offering such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***

The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.
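
For anyone who thinks in code, the dynamic Skinner observed – steady learning under a fixed rule, disarray under a shifting one – can be caricatured as a simple reinforcement loop. The Python sketch below is an invented analogy, not anything drawn from the behaviourist literature: the levers, weights and learning rate are illustrative assumptions, chosen only to show how blind reward and punishment can shape behaviour without anything resembling thought.

```python
import random

def run_box(trials, correct_lever, weights, lr=0.2):
    """Run trials in the box, nudging the rat's lever preferences up or down."""
    for _ in range(trials):
        # The rat mostly presses whichever lever has paid off so far.
        lever = 0 if random.random() < weights[0] / sum(weights) else 1
        reward = 1.0 if lever == correct_lever else -1.0  # food, or a shock
        # Reinforcement: strengthen or weaken the pressed lever's weight.
        weights[lever] = max(0.05, weights[lever] + lr * reward)
    return weights

# Steady training: lever 0 always yields food, lever 1 always shocks.
weights = run_box(200, correct_lever=0, weights=[1.0, 1.0])
print("after steady training:", weights)

# Now change the contingency unpredictably, as the experimenters do:
# with the rule shifting every ten trials, no stable preference can form.
for _ in range(20):
    weights = run_box(10, correct_lever=random.choice([0, 1]), weights=weights)
print("after unpredictable swaps:", weights)
```

Run it and the first print shows a lopsided preference for the rewarded lever; after the unpredictable swaps, the weights drift without settling – a crude echo of the confused rat.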

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But they are also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled “Whatever Happened to Psychology as the Science of Behaviour?”, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge