Immersed in death: on a packed commuter train in New York on the day JFK got shot, there is only one headline. (Photo: Getty Images)

The assassin’s creed

The killings of Abraham Lincoln, Archduke Franz Ferdinand and John F Kennedy all seemed world-changing events. But is assassination anything other than an act of petty vanity?

It was a grey January day in St Petersburg in 1878 when Vera Zasulich, a young nihilist, made the short journey to the office of the city’s governor, General Fyodor Trepov. Here the general listened to petitions and examined complaints. A crowd of people had gathered in the cold. Zasulich waited in line for her turn to approach the great man. At last they spoke, and just as Trepov was turning from her to deal with the next supplicant, she pulled a gun from under her cloak and fired at him at point-blank range. The bullet burst into his pelvis, wounding but not killing him. Zasulich threw down the gun, stood quite still, and waited to be arrested. They beat her, of course, and then bundled her into a room, and then wondered a little feebly what to do with her next.

As they deliberated in the immediate aftermath of her deed, Zasulich moved from moments of dissociation and strangeness to an honest desire to offer advice to her baffled captors. Her words are quoted in a collection of revolutionary-era Russian memoirs, Five Sisters: Women Against the Tsar, edited by Barbara Alpern Engel and Clifford N Rosenthal:

My foresight, and consequently my precise plan of action, did not extend beyond the moment of attack. But every minute my joy increased – not because I was in full control of myself . . . but rather because I found myself in an extraordinary state of the most complete invulnerability, such as I had never before experienced. Nothing at all could confuse me, annoy me, or tire me. Whatever was being thought up by those men, at that time conversing animatedly in another corner of the room, I would regard them calmly, from a distance they could not cross.

This mingled feeling of elation and satisfaction appears often in the personal accounts of assassins; the work has been done and, in the process, their own lives thrown away. A sudden liberation from the burden of self fills them; they ascend to a height above life. They have realised themselves in the perfection of a deed.

Zasulich’s act succeeded by virtue of its comparative failure. Her shooting of Trepov was an act of revenge, after he had ordered an innocent man to be badly whipped in the house of detention on account of a small act of insubordination. Put on trial for her retaliation, she found herself acquitted unexpectedly; indignation against Trepov and sympathy for Zasulich’s courage meant only one possible end to the trial, despite the weight of evidence against her. That she had only wounded her man no doubt also facilitated her acquittal.

At the end of the trial, there were wild scenes of jubilation in court. Almost everyone was elated; only the judge and Zasulich were suitably sober. The result depressed the judge, who knew that it made a nonsense of the law, and disheartened Zasulich, who had been deprived of her death. She was confronted by the terrible responsibility of living on; freedom had been returned to her.

Zasulich’s state of mind following her attempt at murder is symptomatic of the “archetypal assassin” from the French Revolution onwards, that is, the assassin who struck at a prominent political figure for idealistic and ideological reasons. It illustrates how the results of assassination were perhaps always less vital to the perpetrators than the sheer exhilaration and abandonment central to the deed. There is no question that they also looked for a kind of political “success” in such murders, but in fact such triumphs were always more limited and less vital than the psychological rewards: the desire, in a righteous deed, to justify the self and in the same instant to escape its trammels.

It is doubtful how far assassinations have worked as an instrument of political or revolutionary change. In most cases, such murders have made only a negligible impression on events; the chaos and instability they carry with them have nearly always meant more than the change brought about by the deed.

One of the Great Courses, those DVD lecture series advertised in the New York Review of Books or the LRB, is on Events That Changed History. Two of its 36 defining moments are assassinations – the murder of Archduke Franz Ferdinand in Sarajevo in 1914 and the killing of John F Kennedy in Dallas in 1963. Both events look world-changing, but were they?

The assassination on 28 June 1914 in Sarajevo is a textbook example of contingency in historical matters. Along the Appel Quay, where the visiting Austrian archduke, his wife and their entourage were scheduled to drive past, waited seven adolescent assassins, some of them still schoolboys, all determined to kill their man and spark a situation that might lead to Bosnia joining a Greater Serbia. The car swept by, passing the first assassin, who could not act, as a policeman was standing by him in the crowd. The second assassin was more fortunate, and lobbed a nail bomb that landed on the opened bonnet of the car. The archduke swiftly scooped it up and threw it back on to the road, where it exploded as it hit the ground. One soldier was injured by the blast; 70 holes punctured the car. The bomb-thrower bit into a cyanide capsule but the poison was old and its potency was gone. He pushed past the bystanders and leapt over the wall to drown himself. But the summer’s heat had shrunk the river, and it was too shallow to drown. Vomiting from the unstable pill, he was pulled down by a throng of people and bundled into the custody of the police. When they asked him if he was a Serb, he replied, “Yes, I am a Serb hero.”

Meanwhile the car drove on. The next assassin it passed was moved by pity for the royal pair and failed to fire his gun. The fourth assassin’s nerve failed him and he ran off home. The others watched as the car sped past too fast, and the moment was gone. Disappointed, one of the would-be killers, 19-year-old Gavrilo Princip, feeling hungry now, crossed the Quay and ambled on to the entrance of Franz Joseph Street; there, at Moritz Schiller’s food store, he stepped in and bought a sandwich. He was still sitting and eating it when the archduke and duchess pulled up in their car, right outside the store. They were coming back from the city hall and the driver had taken a wrong turning. They tried to reverse, but there was too little room to manoeuvre in the narrow street. Princip stood up, strode over to the right-hand side of the vehicle and, from a distance of four or five paces, fired two shots directly into the car. The first one killed the archduke; the second, intended for Oskar Potiorek, the governor of Bosnia and Herzegovina, fatally wounded the duchess.

Some still believe that this product of accident and misadventure sparked a world war that killed millions. It is the case that some schoolboys playing the role of doomed heroes helped topple a civilisation; yet, on a grander political scale, the murder was very largely only a pretext for action. There had been brutal assassinations before. Such murders usually occurred within the body politic of a sovereign state, as an element in a coup or an outcome of insanity. By their transnational nature, the numerous anarchist murders from the 1880s to the 1900s offered no foreign country as a suitable target for retaliation; like the pirate, the anarchist was equally an enemy everywhere. As the assassins operated outside the boundaries of the nation state, the vengeance of armed retribution was meaningless in relation to the horrors of their actions. The murders in Sarajevo were entirely different. They occurred on Hapsburg soil but could easily be said to have their origin in the very existence of the Serbian state. The response to the killings potentially involved war – yet such a confrontation was always avoidable as long as everyone wanted peace.

However, far from aspiring to avert a war, the Austrians did all they could to invite it. The Austrian foreign minister, Count Berchtold, wrongly believed, or chose to believe, that the murders in Sarajevo had been carried out with the connivance of the Serbian government. With German backing, the Austrians were disposed to pick a fight. They thought they could bully the Serbs with impunity and quickly crush an upstart neighbour. They pressed for war, but even so meant only to settle scores with Serbia, a smaller enemy whose certain defeat would bolster the empire; they never intended the European conflagration that would burn down their power.

None of the young conspirators imagined that the assassination would provoke immediate war between Serbia and Austria; as for their deed sparking a worldwide conflict, it was beyond their powers to conceive such an outcome. Nedeljko Cabrinovic, the youth who threw the nail bomb at the car, lamented that “if I had foreseen what was to happen I should myself have sat down on the bombs so as to blow myself to bits”. Though he toughed it out in court, in private Princip was devastated by reports of the war. Yet, later, he could hardly believe that a world war could have followed on from their choices; he couldn’t really feel guilty for that bit of bad luck. They had aimed at a symbol, the embodiment of all their frustrations. They were too young and too naive to grasp fully the potential consequences of their actions; they were in love with the heroic deed, and their bloodily rose-tinted imaginations could not picture anything beyond that fair vision: at the trial, Cabrinovic remarked, “We thought that only noble characters are capable of committing assassinations.” Their most pressing motive in murdering the archduke and his wife was the desire to share in that nobility.

The “world-changing” consequences of the events in Sarajevo depended on the context in which the murders happened. The world was poised for war, and so the killings led to carnage. The deed resonated within the desires of others, and just then what others wanted was what they imagined would be the speedy resolution of questions of European dominance and prestige. If it had not been Sarajevo that pulled the trigger, it would have been something else, but war would have come in any case.

If assassination’s potency to alter history is questionable in Sarajevo, there must be even greater doubts in the case of the killing of John F Kennedy on 22 November 1963. Kennedy had mastered the new politics, offering charisma to the electorate. His final place in the national consciousness was as a symbol of all that was most desirable in the American myth. In terms of tangible achievement or foreign policy gains, however, he left almost nothing for posterity; at best, he founded the Peace Corps. The great legislative triumphs of the period, in civil rights, Medicaid, environmental law and social welfare, are all attributable to the much-disparaged and untelegenic Lyndon Baines Johnson.

In the cruellest interpretation, the single most important impact of Kennedy’s career is that his death handed Johnson, as his successor, the moral force to pass these necessary reforms – laws that Kennedy would have been unlikely to get through undamaged on his own. In foreign policy terms, it was almost certainly Kennedy’s weakness with Khrushchev that prompted the Cuban missile crisis, the resolution of which was his only victory. Even that success was not quite what the public perceived it to be, involving as it did the hushed-up quid pro quo removal of US missiles from Turkey. Meanwhile, Kennedy had already sparked an arms race with the Russians, and his policy on Vietnam helped to create the conditions for the disastrous war that followed. There were hints that he would have withdrawn from Vietnam if he had been re-elected; and later there were other hints that he had been murdered precisely because of this private intention.

To imagine that Kennedy could have ended the Vietnam war presupposes a strength of purpose in him of which there was little evidence in the first years of his presidency, other than the brinkmanship of the missile crisis and (on a much smaller scale) his confrontation with George Wallace over racial integration. In any case, Kennedy had fatally undermined the Diem regime in South Vietnam, with consequences that would have precluded such a sudden withdrawal. Otherwise, he inspired and launched the space programme – and that was about all.

Despite this paltry legacy, Kennedy still stands in the eyes of many as a “great president”, even one of the greatest. This owes more to marketing than delivery. Knowing that his Catholicism would prevent a straightforward coronation by the Democratic Party, he was forced to fight the 1960 election campaign on the basis of his national popularity. He had to win primaries and show his power. It was a new kind of strategy, and it hinged on the retailing of Kennedy. They were going to “sell Jack like soap flakes”.

On 26 September 1960, Kennedy triumphed over Richard Nixon, the Republican candidate, on television; radio listeners were more evenly divided on who they believed had won the debate. On the screen, JFK had looked like a superstar, and the sweating, stubbly Nixon, as one journalist put it, “a real middle-class uneducated swindler with all the virtues of a seller of fountain pens in Naples”. The smear on the Democratic posters – “Would YOU buy a used car from this man?” – stuck. The Kennedys’ relationship with the press and with television, their youth, their attractiveness, placed them in a position of mediated confidence with the electorate. However, it was the faux-intimacy of the television image, the allure of cinema. In 1960, in an article for Esquire, Norman Mailer put it like this:

Since the First World War Americans have been living a double life, and our history has moved on two rivers, one visible, the other underground; there has been the history of politics which is concrete, factual, practical and unbelievably dull . . . and there is a subterranean river of untapped, ferocious, lonely and romantic desires, that concentration of ecstasy and violence which is the dream life of the nation . . . if elected he would not only be the youngest president ever to be chosen by the voters, he would be the most conventionally attractive young man ever to sit in the White House, and his wife – some would claim it – might be the most beautiful first lady in our history. Of necessity the myth would emerge once more, because America’s politics would now be also America’s favourite movie, America’s first soap opera, America’s bestseller.

The Kennedys were stars, and John F Kennedy died on-screen. The assassination was an experience broadcast on television; two days later Jack Ruby’s murder of Lee Harvey Oswald was shown live by NBC across homes in America. Within half an hour of the JFK shooting, 68 per cent of Americans had heard the news, carried to them by the media. The television set that brought the outside world into the domestic space displayed the pathos to a nation. All could feel involved; the deed became an image.

Yet it was an image that could not be assimilated or understood. In one sense, apart from the tragedy of a young man’s murder, it was precisely its lack of historical significance that rendered it so potent. The plethora of conspiracy theories around Kennedy’s murder responded to genuine mysteries and unresolved problems in the story; the theories were also a way to inscribe meaning into the event, as though an unseen betrayal underwrote it, and they might produce the simulacra of significance. For, just as the assassinations of the 1960s were often assumed to be manifestations of a vague “climate of violence”, so it was that their significance lay chiefly in their effect on American mentalities – even, as Mailer suggests, on the dream life of the nation. It was not the political consequences of Kennedy’s murder, nor indeed all the various assassinations of that decade, that truly mattered, but rather the way they sustained and exemplified an atmosphere of panic, or of social disintegration. They worried Americans with a sense of things falling apart, of a polis under strain.

In the eyes of many, political violence, random killings and unrest seemed a constant factor in American life from the early 1960s to the early 1980s. As the journalist Jack Newfield wrote: “We felt, by the time we reached 30, that we had already glimpsed the most compassionate leaders our nation could produce, and they had all been assassinated.” The folk singer Dick Holler’s 1968 song “Abraham, Martin and John” links the deaths of Lincoln, JFK, Martin Luther King and Bobby Kennedy. It presents the four men as simple embodiments of goodness who were not allowed to live out their potential.

In losing those individuals, America suffered irreparable damage to the possibilities of its national political life. Other people failed to do what the man who was killed might have done. Similarly, the history of assassination depends in two senses on the centrality of the individual: on the idea of the “indispensable person” who is assassination’s chosen victim, and on the fantasy that such murders gift their perpetrators with an undying, if ignoble, fame.

Though there is a great deal of force to Newfield’s lament, the idea of the “indispensable person” runs counter to the strengths and resilience of democratic life. The American mood in the late 1960s was one of intense unease; and yet the fabric of social and political life held good. The anarchists who struck at presidents, monarchs or high-ranking officials were sometimes engaged in personal attacks, their killings a move in an ongoing vendetta between the government and revolutionaries. More usually they were simply aiming at the office itself: in their own judgement, murdering a symbol and not a person. Yet, seen as such, the deed was meaningless. The president was killed and another president took his place. The structures of power were always designed to take into account the fact of mortality, to maintain continuity; that death should be caused by an assassin’s gun altered little.

Even in the case of the many assassination attempts directed against Adolf Hitler, it is doubtful whether striking their target would have altered events significantly. Hitler’s would-be assassins were as much involved in making a gesture, an indication of the survival of an internal opposition, as attempting to decapitate the Third Reich. Success would probably have led to succession by another, equally wicked Nazi. Where assassins did succeed in killing a leading Nazi – with the murder of Reinhard Heydrich in Prague in 1942 – the murder, for all its justice, merely prompted horrible reprisals, notably the massacres in the Czech villages of Lidice and Ležáky. Meanwhile, the “architect of the Holocaust” may have been killed, but the Final Solution continued apace.

In modern times, with very few exceptions (such as the killing of Abraham Lincoln in 1865), assassination has been a sideshow – although, I would argue, a highly significant one. Assassination has often been linked to a kind of “secret history”, contained in the romance of conspiracy theories. It seems instead that assassination belongs to another kind of concealed history – the history, in Norman Mailer’s terms, of the dream life of the west.

The assassins of the past 200 years were besotted with action, the power of deeds. It was part of the thrill of such action that no one could foresee to what it would lead. Killing was sufficient, even without the understanding of its consequences. Indeed, practical results were the last thing sought for by any assassin. For assassination long ago broke free of any idea of efficacy or political influence, and instead became the central expression of the extremists’ taste for action as such – a pure deed that annihilates both the victim and the perpetrator, even as it depends on the target’s fame and the fame and attention that it grants to the killer. It is an act of self-assertion that is simultaneously a self-negation.

In one respect, the historical importance of Zasulich’s action was limited: Trepov survived and the autocracy continued. However, as a muse of murder she proved a vital figure, her deed provoking attempts against the kaiser in Germany and arousing a broad campaign of assassinations in Russia which led to the murder of the tsar himself. Zasulich’s example was crucial in this swing towards the practice of terror. She was an inspiration to crime. A police official was murdered in Kiev in the spirit of emulation, and when in August 1878 a young man called Sergei Kravchinskii executed General Nikolai Mezentsev on the streets of St Petersburg, he was consciously following the line set by the courageous Zasulich.

For most within the movement, the heightening of stakes seemed inevitable and just. And yet, for Zasulich herself, there was no such easy acceptance of the killings. Soloviev’s infamous attempt to shoot the tsar outside the Winter Palace in 1879 merely depressed her. As years went by, Zasulich’s position grew clearer. The assassin’s deed was without revolutionary merit. It led not to great social changes, but only to an ineffectual puff of violence. It exhilarated other revolutionaries, who sensed vicariously and inappropriately the retort of power. Conversely, it dismayed and sickened potential supporters among the masses, or rendered them passive spectators of outrage. The people were not roused to rebel by such deeds, but became mere witnesses to others’ glorious, or infamous, violence. Worst of all was terror’s dependence upon a sickly illusion in the mind of the assassin herself.

Zasulich knew this at first hand. The assassin worked in a spirit of vanity or anomie: either conceited by an impression of their own potency or buoyed by the awareness of their own insignificance. The assassin embraced their victim’s death and their own, and both inspirited them with the weightless emancipation from the burden of having to live at all. Zasulich’s act of terror had sought to publicise another’s brutality; the danger was that such acts would only advertise their own horror. The injustice that prompted them would be forgotten in the impact of the assassin’s bullet. It was, she might have realised, only her own incompetence, in merely hitting Trepov in the hip, that had permitted her deed to appear noble.

Michael Newton is the author of “Age of Assassins” (Faber & Faber, £25)


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand-new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, offering such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***

The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours, but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But such devices are also merely the latest means by which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled “Whatever Happened to Psychology as the Science of Behaviour?”, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge