Gove's proposals would see pupils studying primarily British history. Image: Alex Leme "Globe" 2009

Michael Gove’s history curriculum is a pub quiz not an education

The rote sets in.

Michael Gove’s new draft national curriculum for history, launched on 7 February, has been greeted with dismay by history teachers at every level, from primary schools to universities, and from every part of the political spectrum.

What has annoyed them most is Gove’s decision to ignore the consultation process and do it all himself. He initially asked the historian Niall Ferguson to come up with ideas for a new curriculum but Ferguson’s response, based on a positive presentation of Europe’s – and especially Britain’s – global ascendancy since the early modern period, did not appeal to Gove, because it advocated history with a global sweep instead of history focused on supposedly key personalities and events within the British past.

Sidelining Ferguson, Gove then asked another expatriate British television historian, Simon Schama, to take a lead. A process of consultation began. A large meeting was held with interested parties including the Better History Forum of conservative teachers led by a former teacher, Seán Lang. Clearly those selected to advise the secretary of state, such as Steven Mastin, a state school history teacher, were chosen partly for political reasons (Mastin was an unsuccessful Conservative candidate at the 2010 general election). With their participation, a draft national history curriculum was hammered out in January and prepared for consultation.

What was actually announced in early February came as a shock to everyone. Those who had taken part in the preparation process did not recognise it. The history profession, including the history sections of the British Academy, the Historical Association, the Royal Historical Society and History UK, complained that the “details of the [new] curriculum have been drafted inside the Department for Education without any systematic consultation or public discussion with historians, teachers or the wider public”.

Even conservative historians were dismayed. A group of 15 academic historians close to the Conservative Party gave their support in a letter to the Times only “in principle” and hoped that the proposals “will no doubt be adapted as a result of full consultation”. Ferguson found the draft curriculum “too prescriptive” and complained that his advice to Gove on this point had been ignored. Lang complained on behalf of the Better History Forum: “Our proposal was ignored; Mr Gove has apparently shut his ears to anyone’s advice but his own.” Mastin said the proposed new curriculum bore “no resemblance” to drafts he had worked on as late as January of this year. “Between January and the publication of this document – which no one involved in the consultation had seen – someone has typed it up and I have no idea who that is,” he remarked.

The answer is inescapable: it was Gove. Just as Margaret Thatcher declared herself shocked and appalled when she saw her first national history curriculum, drawn up largely by education professionals, Gove must have reacted with dismay when he saw the final draft of his history curriculum. Neither document delivered what the politicians wanted, namely the learning of names, dates and facts strung together to form a celebratory, patriotic national narrative. Unlike Thatcher, however, who in the end reluctantly respected the professionals’ expertise, he tore it up and wrote his own.

What does the proposed new curriculum suggest? It begins well enough by reminding us: “A high-quality history education equips pupils to think critically, weigh evidence, sift arguments and develop perspective and judgement.” Yet this introduction seems to have been left over from an earlier draft, for it is no more than a token gesture, almost completely forgotten in the rest of the text, which focuses on listing the facts that pupils will have to learn by rote.

The contradiction between aims and content is even more crass in the passage about the requirement that pupils “know and understand the broad outlines of European and world history”. Despite this laudable aim, they are given no opportunity whatsoever to do so in the rest of the curriculum, in which the emphasis is exclusively on British history. European and world history are included only where they are relevant to Britain.

At times, this verges on the comical. When pupils study the Enlightenment, for instance, they study “Francis Bacon, John Locke, Christopher Wren, Isaac Newton, the Royal Society, Adam Smith and the impact of European thinkers”, though not those thinkers themselves; clearly Voltaire, Montesquieu and Diderot are unimportant because they were French.

This is a curriculum that will produce a generation of young Britons with no knowledge of the history of any part of the world beyond the shores of the British Isles. “As far as I am aware,” Mastin has warned, “we will be the only jurisdiction in the western world that won’t teach world history.” The curriculum declares: “A knowledge of Britain’s past, and our place in the world, helps us understand the challenges of our own time.” Yet in today’s globalised world, it does no such thing.

How are history pupils going to be tested on their knowledge of, say, Thatcher’s election (oddly, the period that the curriculum specifies stops at the moment she comes to power and does not require pupils to know anything about her government), the Chartists or King Athelstan? The draft curriculum is no help at all here. Will they be given multiple-choice examinations? There are no clues: the document does not even mention the skills whose deployment, at varying levels, is the main basis for assessment. This is preparation for Mastermind or a pub quiz; it is not education.

The new curriculum tells pupils what to think. The Dutch invasion that overthrew King James II was, it declares, “the Glorious Revolution”, ignoring its violent anti-Catholicism and deadly effects in Scotland and Ireland, which were followed by the discrimination against Catholics in the UK that lasted another 140 years. Not glorious for everyone, then. It also tells us what the causes of the First World War were (“colonial rivalry, naval expansion and European alliances”); the causes of the Second World War, meanwhile, were “appeasement, the failure of the League of Nations and the rise of the dictators”.

Evidence gathered in the recent Ofsted report History for All suggests that one of the chief attractions of history for school students is the opportunity that it gives them to find out about historical personalities and issues for themselves and to make their own decisions. The new curriculum is sure to put them off the subject.

Gove has said he wants pupils to study British heroes. However, is “Clive of India” a hero to the many British children of Indian parentage or descent? Historical individuals, including objects of left-wing admiration such as the Levellers or the black nurse Mary Seacole, should be presented as subjects for historical inquiry, not as heroes or heroines to be admired mindlessly.

The new chronology that forms the basis of the proposed curriculum isn’t workable. In practice, it will produce even more superficial knowledge than pupils have at the moment. With only one hour a week devoted to history, taught by a non-specialist teacher, how are primary school pupils going to work their way through the dense factual material of Key Stages 1 and 2? There is simply too much material to teach; only bits and pieces can be selected.

And how are seven-year-olds going to understand topics such as “the heptarchy” or “feudalism”? What will 11-year-olds make of the Putney debates? After the age of 11, pupils will study only modern history. They will come to maturity with a knowledge of the Middle Ages stuck at the level of a nine-year-old. The teaching prescribed by the draft curriculum is not appropriate to the ages of the children being taught.

Given the time available, the chronology will end up being taught as discrete episodes. Narrative (or, to use a better word, chronicle: the recital of one event after another) will not help children understand change over time; to do that, they need to compare and relate events with each other and with their contexts, not just to learn that the Vikings came after the Anglo-Saxons and the Normans after the Vikings. In practice, sequential teaching of this kind does not provide a context; it rips events out of their context, leaving them insusceptible to analysis.

All of the new developments over the past half-century – in economic, social, cultural and other kinds of history – that have made history so exciting as a discipline are pushed to the sidelines in favour of a political narrative that might have been lifted straight from a textbook written in the 1930s. There are labels and concepts in the new curriculum that haven’t been used by historians for years – “gunboat diplomacy” and “Clive of India”, to name only two.

Gove wants the teaching of history to give pupils a positive sense of national identity and pride. Yet history isn’t a form of instruction in citizenship. It’s an academic subject in its own right. If he really wants more rigour in education, Gove should tear up his amateurish new curriculum and start listening to the professionals.

Richard J Evans is Regius professor of history and president of Wolfson College, University of Cambridge

This article first appeared in the 18 March 2013 issue of the New Statesman, The German Problem

Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart-rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

Now comes a brand-new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, supplying the kind of externally applied “motivation” that the Apple Watch promises. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***

The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are likewise treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery; yet they are also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge