
1066 and all that

Michael Gove argues that schools should teach children about kings, queens and wars. He's offering a quack remedy for a misdiagnosed complaint.

"Fewer and fewer students want to study the past," complained the Tory MP and historian Chris Skidmore recently, adding: "[G]iven the way it is currently presented in schools, who can blame them?" In 2011, in 159 schools no pupils at all were entered for GCSE history. "We are facing a situation," he warns, "where history is at risk of dying out in schools and regions in the country." His remedy is to reorient the GCSE towards "our national history, rather than focusing on Hitler's Germany, Stalin's Russia or the history of medicine. We should introduce a narrative-based exam that covers every age in British history across a broad chronological span", instead of focusing on isolated "bite-sized" chunks of history. "Local history," he adds, would bring it all to life and "can easily be woven into the school curriculum".

Skidmore joins a swelling chorus of voices clamouring for a restoration of a British history narrative at the core of the curriculum as a means of halting the subject's decline in schools. It has been led by the Education Secretary, Michael Gove. The current National Curriculum, he says, neglects our national history: "Most parents would rather their children had a traditional education, with children sitting in rows, learning the kings and queens of England." David Cameron has lamented the "tragedy that we have swept away the teaching of narrative history and replaced it with a bite-sized, disjointed approach to learning about historical events . . . [in a] shift away from learning actual knowledge, such as facts and dates."

Some historians take the same view. "The syllabus," thunders Dominic Sandbrook, "has been a shambles for years. Fragmented and fractured, obsessed with the Nazis and apparently indifferent to the pleasures of narrative, it leaves students struggling for a sense of the contours of our national story." The Labour MP and historian Tristram Hunt has added his voice to those demanding a replacement of the current National Curriculum with a British-focused national narrative, showing there is a cross-party consensus behind these criticisms.

But is history in our schools really in a state of terminal crisis? As David Cannadine has shown in his new book The Right Kind of History: Teaching the Past in Twentieth-Century England, such complaints are not new. They were made by Margaret Thatcher's government in the 1980s and by others long before, all of whom wanted history teaching to be a vehicle for the creation of a unified sense of national identity. Indeed, at the beginning of the 20th century, history was barely taught in schools at all. When GCSEs were introduced in the 1980s, history, unlike many other subjects, was not made compulsory; still, about a third of all GCSE candidates voluntarily studied it as one of their exam subjects. Over the following years the spread of thematic and social history approaches pioneered by the Schools History Project, including the history of medicine, far from plunging the subject into crisis, actually led to an increase in its popularity, and GCSE history entries reached 40 per cent by 1995.

The introduction of league tables in the 1990s, however, focused schools' attention on maths, English and science at primary level. The result was a rapid and drastic fall in history teaching, so that nowadays only 4 per cent of class time in primary schools is devoted to the subject. League tables based on GCSE and A-level results have led secondary schools to focus on subjects in which better GCSE results can be achieved, and pupils often prefer to take a GCSE in a subject that's compulsory until the age of 16 rather than add to their workload by taking one that's not - such as history. All this has led to a drop of ten percentage points in history GCSE entries since 1995, putting them back to around 30 per cent. However, this is roughly where they were when the GCSE was introduced; it's not, as Skidmore implies, a decline from some past golden age when all 14-to-16-year-olds took the subject.

Blaming the curriculum is wrong. In 2007 the Qualifications and Curriculum Authority reported that a survey of 1,700 children, two-thirds of whom gave up history at 14, found that half of them enjoyed the subject. And it's important not to exaggerate the decline either. A recent Ofsted report on the teaching of history in 166 primary and secondary schools noted that between 2007 and 2010 "there were more examination entries for history than for any other optional subject at GCSE level apart from design and technology".

The number of students taking GCSE history remained stable from 2000 to 2010. Moreover, Ofsted reported that "numbers taking the subject at A-level have risen steadily over the past ten years", making history one of the "top five subject choices at A-level". The report found the subject was well taught at a majority of schools at all levels, and that pupils enjoyed their lessons, found history fun, and praised it for making them think. Far from being in a state of terminal decay, then, history in schools is actually a success story.

Still, nobody seriously interested in the subject would want to disagree with the proposition that more schoolchildren should study it. Is the way forward to focus it exclusively on British history? In fact, the National Curriculum for children up to the age of 14 already has a chronological account of British history from 1066 to the present as its core, surrounding it with forays into European and extra-European history to introduce pupils to other countries and cultures. And local history is also a key part of the curriculum, as Skidmore would discover if he actually bothered to read it. So the Ofsted report, surveying the content of teaching across the country, concludes firmly that "the view that too little British history is taught in secondary schools in England is a myth". Complaints about the "Nazification" of the curriculum are mere rhetoric. One can smell more than a whiff of Tory Euro-scepticism in the complaint that pupils learn more about Russia and Germany than they do about England.

Would a greater emphasis on kings and queens help? Dominic Sandbrook notes that, "for all the efforts of academic historians, popular history is still dominated by vivid characters and bloody battles, often shot through with a deep sense of national pride". But many of the most popular history books don't deal with British history at all, even if they do focus on vivid characters and bloody battles: Antony Beevor's Stalingrad, for instance; or Jung Chang and Jon Halliday's Mao: The Unknown Story; or, in a rather different way, Edmund de Waal's bestselling part-history, part-memoir, The Hare with Amber Eyes. And many popular history books deal with social and cultural history, including, ironically, Sandbrook's own marvellous, bestselling trilogy of books on post-war Britain; some of the greatest bestsellers of recent years, such as Dava Sobel's Longitude, are on subjects about as far away as one could imagine from kings and battles.

How about teaching narrative rather than analysis, then? It is wrong, David Starkey has asserted, that history in the schools has modelled itself on university research. What we need, he declares, is to give children "a sense of change and development over time . . . The skills-based teaching of history is a catastrophe." But what sells in the bookshops or what succeeds on TV is not necessarily what should be taught in schools. Teaching is a profession with its own skills and techniques, different from those needed to present a television programme (as Starkey's performance on the reality TV show Jamie's Dream School dramatically indicated). Physics, biology and every other subject in schools is taught along lines that reflect research in the universities. One wouldn't expect physics teachers to ignore Stephen Hawking's ideas about black holes, or biology teachers to keep quiet about the discovery of DNA. So what makes history so different? Chemistry devotes a large amount of time to transmitting skills to students; why shouldn't history?

The narrative that the critics want shoved down pupils' throats in schools - as they sit in rows silently learning lists of kings and queens - is essentially what's been called the "Whig theory of history"; that is, telling a story of British history over a long period of time, stressing the development of parliamentary democracy in a narrative that culminates in a present viewed in self-congratulatory terms.

This theory was exploded by professional historians more than half a century ago, under the influence of the classic tract The Whig Interpretation of History by the conservative historian Herbert Butterfield. Yet it still has strong support in the media. The Daily Telegraph and the right-wing think tank Civitas even campaigned to get H E Marshall's patriotic textbook Our Island Story put on the National Curriculum. Dating from the Edwardian era, this book, with its stories of how the British brought freedom and justice to the Maoris of New Zealand and many other lucky peoples across the world, has rightly been described as "imperialist propaganda masquerading as history". In what other academic subject would people seriously advocate a return to a state of knowledge as it was a hundred years ago?

Perhaps instead of this outdated volume they might use Simon Jenkins's new A Short History of England. But its message is in the end not very different. Interviewed in the Guardian, its author intoned with breathtaking complacency his view that "England really is a most successful country" and claimed that English history was separate from that of the other European powers. "The British talent," if we are to believe Jenkins, "had always been to keep away from wars overseas. We had kept out of Europe all the time."

Jenkins talks as if there had never been a Norman Conquest, an Angevin regime, a Hundred Years' War, a Dutch invasion (in 1688), joint rule of a large chunk of Germany (Hanover) from 1714 to 1837, or a series of wars with France, ranging across the world from India to the Americas, from the age of Louis XIV to that of Napoleon; as if there had never been any immigration or any cultural exchange with the Continent; as if our history had not been part of Europe's through two world wars and the ensuing decades of peace. The thought of such an ignorant and insular approach to English history finding its way into the hands of children is frightening; but on the other hand, its errors of fact and perspective are so egregious that it might provide a good starting point from which they can sharpen their critical faculties.

It's all very well demanding that the curriculum should be filled with facts, but what facts you choose depends on what vision you have of British national identity. The concept of "British history" itself is contentious and politically debatable, which perhaps is why some of the National Curriculum's critics advocate a narrative history of England instead; though in the case of Jenkins the justification for this, that "England is an island", is a geographical howler that even six-year-olds should be able to spot. Time and again, the advocates of a national narrative confuse English history with British history, in a way that would not go down well in Cardiff or Edinburgh.

History at every level, not just in the universities, is endlessly contentious and argumentative. How can this provide a basis for a unified national consciousness? Rote learning suppresses critical thought; narrative isn't something you can teach unless you subject it to critical analysis and for that you need the skills to interrogate it. For analysis, especially in depth, you need to study selected topics, even if it has to be within a broader chronological context. Critics who complain of the breaking up of the seamless web of chronology have no concept of what history teaching and learning actually involve.

Forcing students to study a narrowly focused curriculum based on British kings and queens would soon put them off history in their thousands. There would be a collapse of take-up at GCSE and A-level. Our culture and our national identity would be impoverished. A quack remedy for a misdiagnosed complaint, it would only make things worse. The real threat to history teaching in our schools doesn't come from the curriculum; it comes from somewhere else, not mentioned by Skidmore at all: it comes from the academies, Michael Gove's flagship secondary schools, which are free from local authority control and don't have to follow the National Curriculum. In 2011, just 20 per cent of academy students taking GCSEs included history among their subjects. As academies - which already make up 10 per cent of secondary schools - spread further, with government encouragement, the teaching of history really will be in crisis.

Richard J Evans is Regius Professor of History and president of Wolfson College, Cambridge. He is the author of "The Third Reich at War" (Penguin, £12.99)

This article first appeared in the 23 January 2012 issue of the New Statesman, Has the Arab Spring been hijacked?


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, providing such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***


The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, a device called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of ­behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge