
An American suicide

There was no subject about which David Foster Wallace could not write brilliantly, from politics to Caribbean cruises to the ethics of boiling lobsters alive.

On the evening of 20 October this year a memorial service was held at Amherst College, Massachusetts, for the writer David Foster Wallace. Wallace, who graduated from Amherst in 1985, had hanged himself the previous month at his home in California. He was 46.

Since the publication of his voluminous and extravagantly ambitious second novel, Infinite Jest, Wallace had been widely acknowledged as the finest American writer of his generation. When Infinite Jest came out, in 1996, New York magazine's Walter Kirn declared that the competition had been "obliterated". It was, he wrote, "as though Paul Bunyan had joined the NFL, or Wittgenstein had gone on Jeopardy!". More or less instantly, Infinite Jest became the benchmark against which any ambitious and intellectually curious young novelist would measure himself. And so, when Wallace died, American fiction lost its lodestar. His suicide was a kind of Kurt Cobain moment, a profound generational shock. (Cobain, frontman of the rock band Nirvana, shot himself at home in Seattle in 1994. He was 27.)

Wallace had begun to write fiction while at Amherst, and his first novel, the antic and exuberant The Broom of the System, published when he was 24, was based on his senior English thesis. One of his professors, Andrew Parker, told the congregation at the Amherst service at Johnson Chapel how Wallace had insisted on writing a thesis in the philosophy department at the same time, and had still managed to graduate summa cum laude in both subjects. Sue Dickman, an Amherst alumna, remembered the year Wallace spent there as an instructor after an abortive attempt to start a PhD at Harvard; she recalled how he would come to class with a tennis racket and let his students take breaks so he could smoke.

The mourners also heard from Mark Costello, Wallace's college room-mate in the early 1980s, who implored them not to forget "how painful Dave's day-to-day life was". Wallace had suffered from depression since adolescence, and was hospitalised during his sophomore year at Amherst. Later, he briefly shared an apartment with Costello in Somerville, near Boston. Wallace was drinking heavily and experimenting enthusiastically with drugs, and eventually ended up in McLean Hospital, a psychiatric institution that had previously counted the poets Sylvia Plath and Robert Lowell among its patients. The "power of death", Costello said, was his constant companion, and "eventually it cornered him and killed him".

While he was at McLean, Wallace was prescribed Nardil, a powerful antidepressant that he would take for most of the next 20 years. By the summer of 2007, however, the drug had begun to have unpleasant side effects, and it was decided that he would come off it. Doing so had catastrophic results. "Severe depression came back," Wallace's father, James, told the online magazine Salon. "They tried all kinds of things. He was hospitalised twice . . . and had a series of electroconvulsive therapy treatments, which just really left him very shaky and very fragile and unable to sleep."

By this summer, Wallace hadn't written anything for more than a year. He had been granted medical leave from his teaching job at Pomona College in Claremont, California, and had stopped going to the gym with his friend and colleague John Seery, where they had been the only "skinny-ass egghead types" in the place.

When, in mid-August, his wife, Karen Green, had to go away on family business, Wallace's parents came to stay with him in Claremont. "He was very emotional," his mother, Sally, said. "He was just terrified of so much. We would just try to hold him. He did tell me that he was glad I was his mom." James and Sally Wallace flew back home to Illinois at the end of the month. A fortnight later, Karen came home from a shopping trip to find her husband dead.

News of Wallace's suicide first broke on the Twitter page of the Brooklyn-based writer Edward Champion. By the time the mainstream media picked up the story, Champion was already gathering tributes and testimonials from the "literary community" on his blog, the comments box of which was soon filling up with expressions of shock and anguish from Wallace's fans. The confession of one devotee that "I haven't really cried over the death of someone I haven't met since Kurt Cobain" was typical.

Within a couple of days, Dave Eggers's McSweeney's operation had a virtual book of condolence up on its website, and people were posting reminiscences of meeting "DFW" at readings in out-of-the-way branches of Barnes & Noble, of taking writing classes with him and, in one case, of playing tennis against him in fifth grade. (Wallace had been an adolescent tennis prodigy, playing the junior circuit in the Midwest in the mid-Seventies. The first piece of extended narrative journalism he wrote, for Harper's magazine, was about summers spent "being driven through dawns to tournaments all over Illinois, Indiana and Iowa"; and Infinite Jest is, among other things, probably the finest, and certainly the longest, work of fiction about tennis ever written.)

The internet is a uniquely effective mechanism for processing public grief, of course, but there was something striking nevertheless about the sheer copiousness of the reaction to Wallace's death. This was partly because he was the kind of writer who attracts acolytes as well as admirers. The esoteric, self-enclosed quality of much of Wallace's fiction, not least the great thousand-odd-page slab of Infinite Jest itself, tended to inspire unusually fierce allegiance in many of his fans - the sort of obsessive who can tell you the likely effects of putting your head in a microwave (this being how the father of one of the central characters kills himself) because they've checked with a "med-school acquaintance", and who posts elaborate theories about arcane plot points online ("It sounds far-fetched, but check this out . . .").

Just as striking, and much more significant, however, was the reaction of other writers, Wallace's peers. There was spontaneous, unanimous agreement that he had been the best of them. Jonathan Franzen described Wallace as a "huge talent, our strongest rhetorical writer". In Zadie Smith's view, he had no "equal among living writers. He was an actual genius". He was also, everyone agreed, an unusually generous man: as "sweet a person" as Franzen had ever known, "meticulous about putting people at their ease", according to Smith.

Eggers posted a series of anecdotes about Wallace on the McSweeney's website. In one, he recalled the first time they met, in the mid-Nineties, at a diner in New York. While Wallace chewed tobacco, using a teacup in his lap as a spittoon (a habit he would never give up), he and Eggers "talked about how he'd grown up in Champaign-Urbana, Illinois, and how I'd gone to college there, how his father taught there, about the pleasures and quirks of east-central Illinois. There's something very strange and uniquely powerful about meeting a guy whose writing you find world-changing but who also comes from your part of the world."

Wallace was, by his own description, an “infantile transplant” to Illinois from upstate New York: his father had done his PhD at Cornell University and had come west to teach philosophy at the University of Illinois in Champaign. His mother taught English at a local community college. Wallace described the atmosphere of the family home in an essay he wrote for Harper's: “Suppers often involved a game: if one of us children [Wallace had a sister, Amy] made a usage error, Mom would pretend to have a coughing fit that would go on and on until the relevant child had identified the relevant error and corrected it. It was all very self-ironic and light-hearted; but still, looking back, it seems a bit excessive to pretend that your small child is actually denying you oxygen by speaking incorrectly.”

Being raised as an "extreme usage fanatic" by a teacher of composition certainly left its mark on the mature Wallace's prose style. In another of his McSweeney's posts, Eggers remembered the time Wallace sent him a story, "Mister Squishy", together with a note asking that if it was to run in his magazine, it should do so under the pseudonym "Elizabeth Klemm". Eggers agreed ("we were so proud to publish it"), but the subterfuge didn't last long: Wallace was simply "too recognisable to hide, too singular to fool anyone". Who else could have written a 60-page story about a marketing executive who hatches a plan to inject deadly cultures into the "snack cakes" he is testing with a focus group? And all this in sentences of improbable length and fiendish complexity.

Take the passage in which the protagonist fantasises, while briefing his focus group, about how much ricin or botulinus it would take to bring the entire corporate snack industry to its knees. As he is doing so, he is struck by his ability to entertain such fantasies in "total subjective private" and then by the thought that half the people in the room with him are doing the same:

. . . Schmidt had a quick vision of them all in the conference room as like icebergs and/or floes, only the sharp caps showing, unknowing and -knowable to one another, and he imagined that it was only in marriage (and a good marriage, not the decorous dance of loneliness he'd watched his mother and father do for seventeen years but rather true conjugal intimacy) that partners allowed each other to see below the berg's cap's public mask and consented to be truly known, maybe even to the extent of not only letting the partner see the repulsive nest of moles under their left arm or the way after any sort of cold or viral infection the toenails on both feet turned a weird deep yellow for several weeks but even perhaps every once in a while sobbing in each other's arms late at night and pouring out the most ghastly private fears and thoughts of failure and impotence and terrible and thoroughgoing smallness . . .

This is writing of extraordinary syntactic control, and it is characteristic of what Eggers describes as Wallace's "dense, discursive, and insanely detailed style". The sentence continues for almost another page; the paragraph in which it occurs runs over four pages. Eggers says that he asked Wallace to consider breaking up some of the paragraphs before the story was published: "It was as if he were visiting the notion . . . for the first time. He was that kind of genius, whose understanding of the workings of his own fiction was, I think, largely separate from ideas of audience."

Actually, Eggers is missing something here, because Wallace thought more deeply about questions of audience and address than any other American writer of his generation. All that "insane" detail, and the formal strategies he borrowed from postmodernism (authorial intrusions, digressions, footnotes, flash-cutting between scenes and so on), were meant in fact to serve the rather traditional end of saying something about "what it is to be a fucking human being".

Wallace believed that each of us is "sort of marooned" inside our own skull, and that it is fiction's job to "aggravate this sense of entrapment and loneliness and death in people". It was the estranging apparatus of his style - the postmodern rhetorical devices, the hyperextended sentences - that was meant to do the aggravating.

That, at least, was the theory. However, Wallace was tormented by the thought that the “antagonistic elements” in his fiction might in fact just be manifestations of a pathological exhibitionism. He deplored his own “grossly sentimental affection for gags” and weakness for “formal stunt-pilotry” that served no narrative purpose. But he also understood that this predicament was not his exclusively; it was that of an entire generation of writers who, in a sense, had come too late – who had arrived, that is, just as the bold innovations of postmodern novelists such as John Barth, William Gaddis and Thomas Pynchon were being absorbed and neutered by the culture.

This highly developed generational self-consciousness is one reason Wallace was held in such esteem by his peers: he held up a mirror to their own anxieties, and articulated them more clearly and honestly than they ever dared.

Infinite Jest bears the scars of Wallace's parricidal struggle with his influences. He insisted that he had wanted it to be an "extraordinarily sad" book about loneliness and addiction, rather than a postmodern one. And to this end, the novel's two principal settings are a tennis academy, which Wallace depicts as a kind of laboratory of obsession (he writes with considerable feeling about the psychic costs exacted by endless early-morning tennis drills), and a halfway house for recovering drug addicts and alcoholics. However, it is also crammed with set-ups and gags that could have come straight out of the fiction of Thomas Pynchon: a geopolitical agglomeration with the acronym ONAN; a gang of wheelchair-bound Québécois separatists; and a film so entertaining that it paralyses anyone who watches it.

At times it seems as if the novel is conducting an argument with itself - for instance, in a long scene in which Don Gately, a former drug addict who is now a live-in staffer at the halfway house, goes to an Alcoholics Anonymous meeting in Boston. One of the residents in Gately's care is there, too, and complains about the "psychobabbly dialect" that's de rigueur at events like this. Gately admits that the "seminal little mini-epiphanies" routinely experienced by new inductees into AA come embalmed in language of "polyesterish" banality. Then someone else says they also find the sentimental argot hard to stomach - especially the habit the speakers have of saying they are "here but for the grace of God", which phrase, she points out, is "literally senseless", and should be used only when introducing a conditional clause. Wallace is flattering his hip and savvy readers here, inviting them to identify with this sophisticated cynicism. But it is also clear that we are meant at the same time to find something ridiculous and overwrought about someone who is driven to want to "put her head in a Radarange" by a home-spun solecism or two. Indeed, Wallace said later that the scene was designed to get his readers - privileged, educated Americans, most of them - to "confront stuff about spirituality and values", stuff "our generation needs to feel".

Infinite Jest turned out to be Wallace's last novel. He didn't stop writing fiction - there were two further collections of stories, the second of which, Oblivion, contains some of his finest work - but, as Mark Costello observed in his Amherst encomium, something changed: "If you sit down and read his prose from the early Nineties to later, you'll hear the music changing. You'll hear sentences getting longer and longer, with these wonderfully balanced dependent clauses."

According to the novelist and journalist Tom Bissell, who knew Wallace, this was also a shift in "world-view". Bissell told me: "I think the man who wrote Oblivion would not have been satisfied with the cross-dressing leader of a Québécois separatist group whose primary mode of transport is wheelchairs, or with the political acronym ONAN, both of which are kind of silly in the way Pynchon is silly, but not in the way the world ever feels silly. I believe he escaped the anxiety of his influence."

Wallace's deepening ambivalence about the moral as well as aesthetic legacy of postmodernism is especially noticeable in the non-fiction he wrote in the last decade of his life. In many of his essays, whether the brief was to write about Caribbean cruises or the Maine Lobster Festival, he can be seen grappling obsessively with all that "stuff about spirituality and values".

A good example is a piece he wrote after Harper's sent him back to Illinois to attend the state fair, and to gorge on corn dogs while watching the rural Midwest at play. "Getting Away From Already Pretty Much Being Away From It All" derives all its considerable force from the tension between Wallace's self-acknowledged "East Coast cynicism" and his yearning for a kind of authenticity.

He describes coming across a small hillock that, for some reason, has been covered in artificial grass: "a quick look under the edge of the fake-grass mat reveals the real grass underneath, flattened and already yellowing". Now, the postmodernist debunker in him would have been content to leave that image to stand for the emptiness and shoddiness of the state fair as a whole. But Wallace doesn't settle for simply unmasking the event as a sham; and this is partly because, for all his protestations that he is no longer "spiritually Midwestern", he remains of the place he is describing.

He understands that to be Midwestern is to be "marooned in a space whose emptiness is both physical and spiritual", and that what the state fair provides is a kind of temporary communal respite from that condition. As Bissell, himself from the Midwest, puts it, "in terms of literary persona, [Wallace] was temperamentally speaking a rural Midwesterner, intellectually speaking a high-wire postmodernist, and emotionally speaking an artist-as-priest type. I'm not sure anyone else has really managed to combine those qualities."

Wallace’s best essays are artefacts of this multifarious literary personality, dramatisations of its internal conflicts. Nowhere is this drama more powerfully performed than in the 15,000-word article he wrote for Rolling Stone about a week he spent following John McCain during his Republican primary battle, in 2000, with George W Bush. (The article was turned into a short book this year to coincide with McCain’s second run at the presidency.)

It is unsettling reading the piece now - after a Republican campaign this year conducted mostly from the sewer - to recall that eight years ago McCain represented to many the last, best hope for decency and truth in American politics. Even the most jaded hacks aboard McCain's bus, the Straight Talk Express, were wondering if "humanity and politics, shrewdness and decency" really could coexist. And Wallace finds himself as inspired as anyone: "It's difficult not to feel enthused and to really like this man and want to support him in just about any sort of feasible way you can think of."

Underpinning McCain's promise in 2000, of course, was "something riveting and unspinnable and true": that the candidate had been imprisoned and tortured during the Vietnam War, and spent five years in a box-sized cell at the "Hanoi Hilton". But whatever it was in his character that sustained McCain during those five years was shut away in that box, too. This, for Wallace, was the essential "paradox" of his campaign: the fact that the "box that makes McCain 'real' is, by definition, locked. Impenetrable." Which meant that it was up to the voter to decide whether McCain was "truly 'for real'".

What the essay is really about, therefore, is the "interior war" inside Wallace's own head between the need to believe in something larger than himself and his anxiety that the need to believe might be a sham or a fraud. And "fraud", Mark Costello reminded the mourners at Amherst, "was one of the worst words in his personal vocabulary".

In the end, Wallace took his own life. Perhaps the struggle to believe in something was too great; he had suffered too much. He was the ultimate victim of his own interior war.

Jonathan Derbyshire is a writer and philosopher

The best of David Foster Wallace

1987: The Broom of the System

The metafictional game-playing and sheer verbal inventiveness of Wallace's first novel established him as a successor to American postmodernists such as John Barth and Thomas Pynchon. It is set in the near future in Cleveland, which now stands on the border of the Great Ohio Desert (or GOD), a vast tract of land filled with black sand.

1989: Girl With Curious Hair

The centrepiece of this first collection of stories is "Westward the Course of Empire Takes Its Way", a 150-page novella set in a creative writing workshop. Wallace said the story was "written in the margins" of Barth's Lost in the Funhouse - the professor running the workshop is the author of a famous story also called "Lost in the Funhouse".

1996: Infinite Jest

The gargantuan novel that sealed Wallace's reputation as the most exciting and ambitious American novelist of his generation. The plot motor of Infinite Jest, which is over a thousand pages long and weighed down with more than a hundred pages of endnotes, is a movie of the same name which paralyses anyone who watches it.

1997: A Supposedly Fun Thing I'll Never Do Again

Wallace's first collection of extended non-fiction contains some of his most celebrated essays: the titular record of a comically awful Caribbean cruise; a paean to the genius of David Lynch; and "E Unibus Pluram", his account of the way the "rebellious irony" of postmodernism was co-opted by television.

1999: Brief Interviews With Hideous Men

In the title story of this second collection of shorter fiction, several unidentified men describe their sexual proclivities - including one who can't help shouting "Victory for the Forces of Democratic Freedom!" when he is on the point of ejaculating. Another story, "The Depressed Person", with its extravagantly long sentences, footnotes and abundant psychotherapeutic jargon, is typical of Wallace's later fictional style.

2004: Oblivion

Oblivion has several of Wallace's finest stories, notably "Good Old Neon" and "The Suffering Channel". The former is narrated from beyond the grave by a high school acquaintance of "David Wallace". In the latter, a journalist on a hip New York style magazine travels to Indiana to profile a man who developed an ability to shit perfectly rendered sculptures while on latrine duty during the first Gulf War.

2005: Consider the Lobster

When Gourmet magazine sent Wallace to the Maine Lobster Festival, he came back with the title essay of this collection, in which he asks whether it is morally acceptable to boil sentient creatures alive (short answer: no).

2008: McCain's Promise

This short book about John McCain was published this year to coincide with the US presidential election. Originally published as a long narrative report in Rolling Stone magazine under the title "Up, Simba!", it is an account of a week spent following the McCain campaign during the 2000 Republican primaries. Wallace admired McCain the man, if not the politician, and was fascinated by his years in prison in Vietnam. He did not live long enough to see the outcome of this year's presidential election.

The death of Kurt Cobain

Whatever the cultural significance of Kurt Cobain's suicide, his reasons, as with David Foster Wallace, were first and foremost personal. Struggling with heroin addiction, various medical problems and facing an impending separation from his wife, Courtney Love, the lead singer of Nirvana shot himself at home in April 1994. Yet when news broke, the public outpouring of grief among teenage fans in his home city of Seattle resonated around the world.

To many, his death represented a clash between conflicting value systems: the counterculture from which Nirvana had emerged, and the corporate world of MTV and major record labels that transformed them into global rock stars just months after the release of their 1991 album, Nevermind. Cobain's suicide seemed like an admission that these two worlds could not be reconciled. "The worst crime I can think of would be to rip people off by faking it and pretending as if I'm having 100 per cent fun," he declared in his suicide note. The irony is that this exit only pushed Cobain deeper into music-industry mythology.

But it is a mistake to see his death as an artistic gesture. Cobain had come adrift in his life, which is something he shared with the increasing numbers of young men who kill themselves every year. In England and Wales, for example, suicide remains the second most common cause of death for men under the age of 35. If you really want to know something about the hopes and fears of a generation, understanding these everyday tragedies is as important as unpicking the famous ones.

Daniel Trilling

This article first appeared in the 1 December 2008 issue of the New Statesman, How safe is your job?


Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, offering such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***


The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are similarly treating ourselves as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled “Whatever Happened to Psychology as the Science of Behaviour?”, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge