
Learning how to live

Why do we find free time so terrifying? Why is a dedication to work, no matter how physically destructive and ultimately pointless, considered a virtue? Jenny Diski urges you to down tools while you can.

Stop what you’re doing. I don’t mean stop reading this, or whatever you’re doing while you’re reading (brushing your teeth, eating, waiting for the water to boil). I mean consider the possibility of stopping whatever your answer is to the conversational gambit, “And what do you do?” Try putting the appropriate response in the past tense: “I used to be [. . .]” It’s very likely, unless your interlocutor gives up on you at that point (as an academic sitting at a Cambridge “feast” once did, turning to her other neighbour for the rest of the meal when I told her I was a novelist), that the follow-up question will be: “So what do you do now?” You might attempt to circumvent this with “I used to be [. . .] but now I’m retired”, if you look old enough, or if you’re younger you could try, “I used to be [. . .] but now I’m vastly wealthy”, but the chances are that the next question will still be in the conceptual area of “What do you do now?”, such as: “How do you spend your time? What do you do with yourself? What are your hobbies?” If you wanted to avoid the whole party chatter thing (but what are you doing at this vacuous party, anyway?), you could say: “Unemployed, thanks to the government’s economic policy, and lacking the financial resources for hobbies to pass the time until I die.” Or in a more passive-aggressive mode just answer, “Oh, these days I skive and scrounge.”

But what if as you use the phrase “I used to [. . .]” your own heart sinks, or your psyche panics at the idea that you might not be what you think yourself to be? Or that what you think yourself to be crumbles into nameless dread at the thought that you are not being what you are doing? The party questioner is only you (or me) on another day, wondering how on earth we are to get through the rest of our time as conscious beings without the reassurance that we are a writer, a teacher, a taxi driver, a parent. The Tory rhetoric about the skiver and scrounger is not nearly as disturbing as the idea we have of ourselves, of being cut loose from a sense of purpose. And the venom directed at the skivers is surely the result of the rhetoric feeding on our own fears about a life without a labelled purpose.

Driving ambition might just be a way of staving off the vacuum, rather than a sign of bottomless greed for more when you have enough. An unquenchable passion for work might be a panic-stricken way of concealing the fear of a lack of passion for life itself. If you are what you do, what are you when you stop doing it and you still are? There are people who don’t find this a problem, who have not entirely or even at all identified existence with what they do and how they make a living, but they are evidently a great problem to those – the majority – who do.

What if you answered the question “What do you do all day?” with “Nothing”? It isn’t as if that could possibly be true. If you spent all day in bed watching television, or staring at the clouds, you wouldn’t be doing nothing. Children are always being told to stop doing “nothing” when they’re reading or daydreaming. It is lifelong training for the idea that activity is considered essential to mental health, whether it is meaningful or not. Behind the “nothing” is in part a terror of boredom, as if most of the work most people do for most of their lives isn’t boring. The longing people express to be doing “creative” work suggests that they think it less boring than other kinds of work. Many people say that writing isn’t “proper work”. Often they tell me they are saving up writing a book for their “retirement”. Creative work sits uneasily in the fantasy life between dread leisure and the slog of the virtuous, hardworking life. It’s seen as a method of doing something while doing nothing, one that stops you flying away in terror.

It was Michel de Montaigne’s chosen solution in 1571, after retiring from his position as a counsellor of the Bordeaux high court. He settled himself at the top of a circular tower in his chateau, surrounded by books, and decided to write delicate morsels of classical rhetoric to pass the time. He crashed into a depression and then, in desperation, started to write a newfangled form of essay that looked, not from some high, abstract point at well-trodden arguments, but deep into the well of his self to investigate the nature of the world of which he had once been so much a part. It turned out to be not so much a retirement, as a reinvention of life and form.

It’s true that the Tories (imitated by every other political party) did not invent the idea of “decent, hard-working families” and “strivers”, even if it seems as if they have so convincingly coined the phrases that their clichéd-language coffers are now overflowing. (If only the mountain of hard-working-family rhetoric could be used to pay off the national debt.) Max Weber and R H Tawney would claim the work-ethic-as-self-worth idea behind the virtuous labouring discourse to be the cultural property of the Protestant Reformation. In the north/south religious divide it does, roughly speaking, keep to the same side as Protestantism. It can’t be only the lack of sunshine that prevents us in the more northern parts of the western world from enjoying and benefiting from those civilised siestas and mañanas that punitive economists partly blame for the Greek, Spanish, Italian and Portuguese financial crises. If we’re going delving, there’s also Adam (and all of us), punished for his disobedience by having to work hard for a living, as well as the first deadly rivalry between the farmer Cain and the herder Abel, each striving to have God favour his produce over his brother’s. Not such an honest and decent family, that original one. Working hard to earn a living may go back to the very beginning, but it was called the Fall for a reason, and it signalled the opposite of an ideal way of life. Work as ethic and work as punishment might come to seem, in the omnipresence of religious or Freudian guilt, to be one and the same thing, but they are not.

Nor are the skiver and scrounger labels recent inventions, although “welfare state”, which is the context for the latest iterations (and not about scrounging but a social safety net for any of us who find we cannot earn a living by ourselves), is relatively new. Most familiarly, concern about skivers and scroungers takes us back to the deserving and undeserving poor of the Poor Law Amendment Act of 1834. This legislation embodied the Victorian view that if you made destitution unpleasant enough (because it wasn’t unpleasant enough already?) and arguably worse than a fairly swift death from cold and starvation, with grim and regimented workhouses providing bare sustenance, only the most hopeless cases would consider it an option. Genesis gave us work as punishment and the Victorians doubled it, by punishing those who didn’t or couldn’t work. I’m rather inclined to think that those who can liberate themselves from the severe whims of old Nobodaddy deserve a cheer, but the Victorians’ moral assessment of the poor into good and bad, worthy and unworthy sorts, translates effortlessly into the present government’s employment of companies such as Atos, which use standardised questionnaires to decide who is “genuinely” seeking but unable to find a job, and who is disabled enough not to be fit to work. Then and now, avidness to work hard all their lives is – unsurprisingly, you might think – the ruling classes’ and corporations’ definition of the good citizen.

My father often used to tell me how my immigrant grandfather declined in health and spirit once he gave up the café he ran from dawn to late into the night in Petticoat Lane to retire to a leafy suburb. It was only a matter of time, my father said of the man I never met and knew almost nothing else about, before he died of having stopped work. I think this story is the equivalent of an urban myth of that generation. The decent man who worked all the hours that God sent and more, provided what he could (which was never lavish) for his family, toiled unceasingly in order to make sure his son went to a good school and got a profession, collapsed and died once he stepped off the treadmill.

I never doubted that retirement killed my grandfather. I did wonder sometimes why his devotion to work unto death was considered a virtue. It was never explained, as if it were self-evident, although frequently the story would be told to me as an improving tale when I had failed to complete some task or activity – regardless of its lack of efficacy on my own father, who was a criminal conman, a profession that David Cameron and Iain Duncan Smith would presumably not include in the decent, hard-working category.

There is an argument to be made against the prototypical life of hard work as the inevitable lot of humanity. In 1974 the Chicago anthropologist Marshall Sahlins published Stone Age Economics. He proposed the idea that individuals in many “simple” societies, far from working themselves to death merely to exist in their nasty, brutish and short lives, were actually members of the “original affluent society”. He suggested that, in those parts of the world where co-operation and social exchange were paramount, once people had done the few days’ hard work of felling a tree and carving out a canoe, there were large amounts of free time to lie about daydreaming, exploring, telling stories: doing “culture” or just skiving. You’d fish in the canoe you’d made, and by preserving and sharing the catch with others, who also shared theirs with you, you could then take a few days off before you needed to get any more. Decent members of those communities did what they needed to do and then when they didn’t need to do it, they stopped.

Only when you worship the idea of accumulation and status based on its perceived wealth-giving properties do you have to work hard all the time. Accumulation was hampering; you had to carry it about with you when you moved from camp to camp, or find ways of storing and securing it if you were sedentary. Without the idea of surplus as a value beyond its use value, when you needed/wanted something you got it, and when you had it, you enjoyed it until it was time to get some more.

To modernity’s inability to grasp the idea of a pattern of necessity, sufficiency and rest, we could add its lack of understanding about the social conditions needed to produce a willingness to labour. A few years ago I visited the isolated island of St Helena, a plaintive, forgotten and unwanted British overseas territory left over from the days of the East India Company. There were desperate plans by DfID (the Department for International Development, responsible for the island) to make St Helena economically viable by building an airport to fly in rich South Africans for “luxury holidays”. This was in spite of the mountainous island being overrun with flax that was once disastrously imported as a possible cash crop, the place having no natural resources or industry, frequent shortages of fresh water, not a single accessible beach or usable port, and a dwindling, elderly population of 4,000.

A DfID official was travelling from England on the same boat as me in 2008 (this dedicated boat, the RMS St Helena, was the only means of delivering people and goods as basic as salt and potatoes to the island from England and South Africa, though the English leg has now ceased). DfID Man explained that the people living on the island were fatally dependent on Britain’s (rather paltry) annual handouts. As he told me, one example of the essential laziness of the Saints – as they call themselves – was that those with boats and nets on the island fished only when they needed to, and then waited until they needed more fish before going out again. St Helena was one of George Osborne’s feckless families on a slightly grander scale, stuck in the middle of the southern Atlantic Ocean, “sleeping off a life on benefits”. If it had blinds around its sheer coastal cliffs, it would keep them down all day.

Only a handful of people I spoke to wanted the airport or believed it could be anything other than an outrageously expensive white elephant, especially since the planned airstrip was battered by fierce crosswinds that would make landing and taking off terrifying at the least. And if it worked, it would be a less-than-attractive, island-sized case of, as always, the “feckless” poor being forced to earn their own living by servicing the pleasures of the rich. Only the old were left, and they loved the island, having returned after retirement from a life of work abroad, taking up half the passenger space on the RMS St Helena to be back where they belong.

I wondered: given how little the Saints cost the British taxpayer, on whose behalf the DfID official was wringing his hands, why not carry on paying our dues and let those who want to live there continue to live there without requiring economic self-sufficiency for the whole island? The population of St Helena is roughly half that of Malton, North Yorkshire, a town from which we wouldn’t think of demanding self-sufficiency.

There were, of course, all sorts of problems in St Helena – empty shelves in the shops before the boat with supplies arrived, very poor standards of education, a class division between self-important bureaucrats and the rest of the population, inadequate self-esteem – but those things could be improved with a little more money and commitment to our historical responsibility to the place that did not seek to turn the islanders’ perceived paradise into a service industry for wealthy tourists. Why not let them be? “Because,” I was told firmly, “they have a culture of dependency. St Helena, like everywhere and everyone else, must earn its living.” My “Why? Not everyone can” was left hanging in the air, the question so evidently absurd and troublemaking that the man from DfID didn’t bother to reply.

Even those imbued with the work ethic used to concede that a lifetime’s work earned an easeful retirement early enough in life to allow you a few years to appreciate it before you died. If you weren’t driven, like my grandfather, the gold watch represented the time you’d looked forward to during those decades of nine-to-five, the time when you would potter in the garden, read books, go on long, lazy cruises or play with the grandchildren. It was a prize of extended leisure for a life of hard work and a consolation for forthcoming death. It was the equivalent of the Lord’s seventh day of rest, a well-deserved, built-in part of the pattern of a life of doing. The Lord got one day in seven for the graft of creating the earth, and his virtuous followers got ten or 15 years in addition after four or five decades of shipbuilding, selling, teaching or manufacturing cardboard boxes. At any rate, that was how it was for a western capitalist society that thought it had got itself sorted.

In the 1960s some of the postwar generation, given time to think by relative peace, security and wealth, voiced their doubts about the pattern of virtuous hard work followed by a bit of a rest and death, but, on the whole, nothing much changed structurally. Now, a new demographic (those very 1960s dissidents reaching retirement age) and the results of the greed inherent in capitalism are causing economists and politicians to fret about the cost of an ageing population “being paid for by the hard-working young”, idling their lives away too soon and for too long to sustain an honest hard-working economy. If only their deteriorating bodies can be kept going, the old folk could stay in work for longer and cost less. But keeping those bodies going is expensive, and the longer the old work, the fewer jobs there are for the young.

All very perplexing, when things seemed to be going so nicely in our small part of the planet for a not very long time. Especially confusing as it turns out that the economy, in fact, is controlled by people who gamble rather than graft, and that the decent hard-working family has to be provided with mythical villains – the skivers and scroungers somehow taking the benefit of their efforts – to prevent them from questioning what all the hard work and striving is for. The state has reasons of its own survival for requiring everyone to keep busy; it must maintain the status quo, keep the taxes rolling in and above all thwart the devil’s penchant for making work or something even more dangerous for idle hands.

The wealthy, the privileged and those satisfied with what they have done with their life (if anyone really is or ever could be) will continue to retire, to give themselves a rest and a break. The most dogged and unlikely people are taking the final sabbatical. Alex Ferguson, Philip Roth, even popes are retiring these days. Only the Queen is a holdout, the very emblem of the old standing in the way of the young and preventing them from having a decent hard-working existence. For decades now people have voiced concern about Prince Charles finding a role for himself and what the lack of purpose in his life might be doing to his character. The worry is that, if he finally attains the throne, it will cause the next prince-in-waiting to become a fretful, interfering busybody who has nothing to do but believe in odd theories, being an odd theory himself. The whole problem of the decent hard-working family in modern times is acted out for us by that quaint historical anomaly, the Windsors.

Philip Roth, apparently, is delighted not to be writing any more novels and seems to be having a wonderful time sitting around in coffee bars learning to use an iPhone. Alex Ferguson can have the satisfaction of watching the football or, perhaps, not watching it and going to the races instead if he wants to. But generally there isn’t very much evidence of joyful retirement even among the elite. The Daily Mail reports that the Pope Emeritus has gone into a physical decline of Diski-grandfatherly proportions, even though he is living comfortably next door to Pope Francis in a flat in the Vatican, in the care of “four consecrated laywomen”. Margaret Thatcher didn’t go gracefully into retirement; indeed, she seems to have taken the long route to going the way of my grandfather after the day job gave up on her.

It has always seemed to me that even those with the most worldly and desirable or admirable successes in their working life end up disappointed. How can it be otherwise? Although people fantasise the immense satisfaction of certain achievements, I would guess that if that is what you actually did with your life (whatever the achievement was), when it comes towards the end, it never seems to be quite enough, or the right thing, or what or how you really meant it to be.

The inevitability of it being too late to have another go must and perhaps should cast a shadow over whatever you have done. Only those who wish they had written the books of Philip Roth, coached the greatest football team, been a leader of “the free world”, succeeded Saint Paul as bishop of Rome and leader of the Catholic Church, brought up small children to be independent adults or taught generations of children to think for themselves think these achievements would feel sufficient when it’s game over. Those who do, fret, in my experience. And if satisfaction is properly absent for gaudy high achievers, is it any more available for all those who virtuously felled trees, dug out canoes and fished without cease until they dropped, because they were told it was “the right thing to do”, when all along their Palaeolithic ancestors knew that there was more to being alive than working to live, than doing something rather than being something?

Leisure, not doing, is so terrifying in our culture that we cut it up into small, manageable chunks throughout our working year in case an excess of it will drive us mad, and leave the greatest amount of it to the very end, in the half-conscious hope that we might be saved from its horrors by an early death.


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart-rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, supplying such externally applied “motivation” as the Apple Watch’s. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.
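The loop is simple enough to write down. What follows is a minimal sketch in Python of the measure-compare-nudge cycle such gadgets run; the function names and the 10,000-step goal are invented for illustration, not any real wearable’s API.

```python
# A minimal sketch of the measure-compare-nudge cycle described above.
# Hypothetical throughout: read_step_count() and send_nudge() are invented
# stand-ins, not any real wearable's API, and the goal is an assumption.
import random
import time

def read_step_count() -> int:
    """Stand-in for a wearable's sensor reading (invented for this sketch)."""
    return random.randint(0, 12000)

def send_nudge(message: str) -> None:
    """Stand-in for the device's 'motivational' stimulus (also invented)."""
    print(f"[device] {message}")

def nudge_loop(daily_goal: int = 10000, checks: int = 3) -> None:
    """Treat the organism as a black box: sample its output and apply a
    stimulus whenever the output falls short of the target."""
    for _ in range(checks):
        steps = read_step_count()
        if steps < daily_goal:
            send_nudge(f"Only {steps} steps - time to move!")
        else:
            send_nudge("Goal met. Reward unlocked.")
        time.sleep(1)  # a real device would run this loop all day

nudge_loop()
```

Notice that the person appears in the loop only as a number to be sampled and a target to be enforced, which is exactly the reduction described above.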

***


The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.
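The difference is easy to state as a toy calculation. In the Python sketch below, the classical side uses the textbook Rescorla-Wagner learning rule, while the operant side is a bare-bones action-value update in the general spirit of reinforcement rather than any published model; the habit names, rates and reinforcement values are invented for illustration.

```python
# A toy contrast between the two kinds of conditioning. The classical side
# uses the textbook Rescorla-Wagner update; the operant side is a bare-bones
# action-value rule in the spirit of reinforcement, not any published model.
# Habit names, rates and reinforcement values are invented for illustration.
import random

def classical_conditioning(trials: int = 10, alpha: float = 0.3,
                           lam: float = 1.0) -> float:
    """Classical: the stimulus (bell) comes BEFORE the involuntary response;
    learning strengthens the bell-food association V towards the ceiling lam."""
    v = 0.0
    for _ in range(trials):
        v += alpha * (lam - v)  # Rescorla-Wagner: surprise shrinks as V grows
    return v

def operant_conditioning(trials: int = 100, lr: float = 0.3,
                         eps: float = 0.2) -> dict:
    """Operant: the consequence comes AFTER a voluntary action (the Pavlok's
    zap); learning shifts which action the subject emits."""
    value = {"smoke": 0.0, "abstain": 0.0}
    reinforcement = {"smoke": -1.0, "abstain": 0.0}  # zap vs. nothing
    for _ in range(trials):
        # Emit the currently preferred action, with occasional lapses.
        action = (random.choice(list(value)) if random.random() < eps
                  else max(value, key=value.get))
        value[action] += lr * (reinforcement[action] - value[action])
    return value

print(classical_conditioning())  # association climbs towards 1.0
print(operant_conditioning())    # the zapped habit ends up suppressed
```

In the first function the response is never chosen at all; in the second, the punishment only works because it follows a choice. The Pavlok’s zap belongs to the second kind, whatever its Pavlovian branding suggests.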

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools – computer software that disables your internet connection, word processors that imitate an old-fashioned DOS screen with nothing but green text on a black background, the brain-measuring Muse headband – are just the latest versions of what seems to be an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours, but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.
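The box’s logic, and the cruelty of the lever swap, can be simulated in a few lines. The sketch below is not a model from the behaviourist literature: it treats the “rat” as a simple trial-and-error learner, and the learning and exploration parameters are invented.

```python
# A back-of-the-envelope simulation of the two-lever box - a sketch, not a
# model from the behaviourist literature. The 'rat' is a simple trial-and-
# error learner; the learning and exploration rates are invented parameters.
import random

def run_box(correct_lever: int, trials: int, values: list,
            eps: float = 0.1, lr: float = 0.2) -> float:
    """One session: the rat picks a lever, gets food (+1) or a shock (-1),
    and nudges its estimate of that lever's value towards the outcome."""
    total = 0.0
    for _ in range(trials):
        # Mostly press the lever currently believed best; sometimes explore.
        lever = (random.randrange(2) if random.random() < eps
                 else values.index(max(values)))
        reward = 1.0 if lever == correct_lever else -1.0
        values[lever] += lr * (reward - values[lever])
        total += reward
    return total

values = [0.0, 0.0]
print("learning:", run_box(correct_lever=0, trials=200, values=values))
# The experimenters now swap the levers' functions: the learned values point
# the wrong way, and the score collapses until the rat relearns.
print("after swap:", run_box(correct_lever=1, trials=200, values=values))
```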

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are treating ourselves similarly as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge