Steve Jobs: monster and genius

An insight into the man who crowdsourced his own marriage.

Steve Jobs was obsessive about the pursuit of perfection. When he bought a family home after his son was born, he didn't just pop to Ikea for a coffee table and some chairs. Oh no. "We spoke about furniture in theory for eight years," his wife, Laurene, says. "We spent a lot of time asking ourselves, 'What is the purpose of a sofa?'" (I know this one: it's to sit on.)

Walter Isaacson's biography of the Apple svengali is peppered throughout with such eyebrow-raising anecdotes. For several years up to his death in October, Jobs gave the writer his full co-operation, and did not (for once) attempt to exercise any control over how he was portrayed. The result sometimes feels less "warts-and-all" and more "all-warts".

The computer pioneer could be, in his own words, an "asshole". Colleagues said he projected a "reality distortion field", which convinced employees, rivals and the press that the impossible was possible. It sprang from a belief that the rules of normal behaviour did not apply to him. In the early days of Apple, he claimed his vegan diet meant he didn't need to shower, and he relaxed by soaking his feet in the loo ("a practice that was not as soothing for his colleagues", Isaacson writes drily). Pulled over for speeding in 1984, he waited for a few moments as the policeman wrote his ticket, then honked his horn impatiently. He was, he explained to the traffic cop, in a hurry.

Jobs may have cried frequently when crossed, but he could be frighteningly cold to those he believed had betrayed him. Often he would scream at employees and tell them their work was "totally shitty", even if he later embraced it - and took the credit. Jonathan Ive, the trusted English-born lieutenant whose close collaboration with Jobs led to the sinuous designs of the iPod and iPhone, is one of several friends who complain about this.

Yet perhaps the most shocking example of his callousness is one that Isaacson describes with little fanfare. After abandoning a pregnant girlfriend at 23 - Jobs's reality distortion field became a mirror and he convinced himself that he was not the father of her baby - he met a young graduate called Laurene Powell and proposed to her twice before she became pregnant. Then, abruptly, he broke up with her and crowdsourced a decision on their future, asking dozens of his friends if she was prettier than his ex. "It was probably fewer than a hundred," Isaacson writes. (The two then married and lived happily for 20 years until his death.)

The triumph of this biography, however, is that Jobs's mountain of peccadilloes is weighed perfectly against his undeniable achievements. Isaacson makes a convincing case that he was an artistic visionary with pure motives, driven only by a love of "the product". Jobs knew how to inflame desire for something you didn't even know you wanted: a computer with a graphical rather than text interface, a phone with no keyboard, a computer the size and thickness of a magazine.

He also ruthlessly exploited other companies' shortcomings. Take the graphical user interface - essentially, the use of a picture-based desktop rather than lines of text - that put the early Apple computers so far ahead of the competition. The interface was originally developed at Xerox PARC, the research arm of a rival firm, but the management there did not grasp its significance. Jobs did, and promptly appropriated it. (When Bill Gates used the same approach to design Windows, Jobs accused him of "ripping us off". Gates's reply is immensely endearing: "Well, Steve . . . I think it's more like we both had this rich neighbour called Xerox and I broke into his house to steal his TV set and found out that you had already stolen it.")

The comparison to Gates, his near-exact contemporary, is illuminating. The Microsoft man is cool, methodical and humane; Jobs was fiery, intuitive and unreasonably demanding. Their approaches to design were equally opposed: Gates believed in licensing Windows to any hardware manufacturer who would pay, while Jobs wanted "end-to-end control" of the user's experience.

Throughout the 1980s and 1990s, it seemed that Gates's promiscuous approach guaranteed him market dominance, until Jobs made a triumphant return to Apple in 1997, 11 years after being ousted in a boardroom coup, and led the company to greatness with a raft of iDevices. Apple surpassed Microsoft's valuation in May 2010, and last quarter it had larger cash reserves than the US Treasury.

The only duff moment here, aside from too much boardroom infighting for my taste, is when Jobs woos Bono to release U2's records on iTunes. The author retells the story breathlessly, but it is clear that behind the billing and cooing about artistic integrity, two monumental egos were jockeying shamelessly for supremacy.

Isaacson ends the book with Jobs slowly succumbing to the cancer that killed him last month. The unspoken question is whether Apple can thrive without its founder. This biography's great achievement is to interweave the personal and the professional, showing how Jobs the monster and Jobs the genius were indivisible. Apple may survive, but it will miss its monstrous genius.

Steve Jobs
Walter Isaacson
Little, Brown, 627pp, £25

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.

This article first appeared in the 07 November 2011 issue of the New Statesman, The triumph of the Taliban


Back to the future – mankind’s new ideas that aren’t new at all

Rethink: The Surprising History of New Ideas by Steven Poole, reviewed.

When Steven Poole writes a book review, he likes to lie to himself. His only conscious decision is to jot down a few notes as the deadline approaches. There is no pressure to think deep thoughts, he tells himself, or to reach the required word count. Then, invariably, within a few hours, he has written the entire review. This happens time and again. No matter how many times he convinces himself he is merely jotting and thinking, the result is a finished article.

Human beings are extraordinarily good at deceiving themselves, and possibly never more so than when they think they have had a new idea, as Poole makes clear in this fascinating compendium of new ideas that aren't new at all. He digs deep into subjects as various as cosmology, economics, health care and bioethics to show that, as the writer of Ecclesiastes put it (long before Poole), "There is nothing new under the sun." This is demonstrated in the re-emergence of ideas such as therapeutic psychedelic drugs, inherited traits that aren't programmed into the genome, cognitive behavioural therapy, getting our protein from insects, and the multiverse.

Poole explores these propositions deftly enough, but they are not what interest him here. Rather, his subject is the way that we have seen them all before. He ties together what he concedes is a "highly selective snapshot of the looping evolution of ideas" with the observation that "Any culture that thinks the past is irrelevant is one in which future invention threatens to stall." Originality, he argues, is overrated.

The book might be something of a downer for those who like to gaze at “progress” with wide-eyed admiration. The starkest takeaway is that we are clearly hopeless at putting good ideas to work. In his discussion of artificial intelligence, for instance, Poole mentions the emerging idea of a universal basic income, which is likely to become a necessary innovation as robots take over many of the least demanding tasks of the human workforce. Yet he traces it back to 1796, when Thomas Paine first published his pamphlet Agrarian Justice.

Maybe this tells us something about the limits of the brain. It has always innovated, thought through its situations and created solutions. But those solutions can only be drawn from a limited pool of possibilities. Hence we get the same ideas occurring inside human skulls for millennia, and they are not always presented any better for the passing of time. Richard Dawkins and his ilk provide a salient example, as Poole points out: "Virtually none of the debating points in the great new atheism struggles of the 21st century . . . would have been unfamiliar to medieval monks, who by and large conducted the argument on a more sophisticated and humane level."

So, perhaps we should start to ask ourselves why so many proposed solutions remain unimplemented after what seem to be thousand-year development programmes. It is only through such reflection on our own thinking that we will overcome our barriers to progress.

Sometimes the barriers are mere prejudice or self-interest. After the Second World War, Grace Hopper, a computer scientist in the US Navy, created a language that allowed a computer to be programmed in English, French or German. "Her managers were aghast," Poole writes. It was "an American computer built in blue-belt Pennsylvania" - so it simply had to be programmed in English. "Hopper had to promise management that from then on the program would only accept English input."

It is worth noting that Hopper was also a victim of postwar sexism. In 1960 she and several other women participated in a project to create COBOL, the computing language. Critics said there was no way that such a "female-dominated process" could end in anything worthwhile. Those critics were wrong. By the turn of the century, 80 per cent of computer coding was written in COBOL. But this is another unlearned lesson. A survey in 2013 showed that women make up just 11 per cent of software developers. A swath of the population is missing from one of our most creative endeavours. And we are missing out on quality. Industry experiments show that women generally write better code. Unfortunately, the gatekeepers only accept it as better when they don't know it was written by a woman.

Solving the technology industry's gender problems will be a complex undertaking. Yet it is easy to resolve some long-standing difficulties. Take that old idea of providing a universal basic income. It appears to be a complex economic issue, but experimental projects show that the answer can be as simple as giving money to the poor.

We know this because the non-profit organisation GiveDirectly has done it. It distributed a basic income to an entire community and the “innovation” has proved remarkably effective in providing the means for people to lift themselves out of poverty. Projects in Kenya, Brazil and Uganda have made the same discovery. As Poole notes, even the Economist, that “bastion of free-market economics”, was surprised and impressed. It said of the scheme: “Giving money directly to poor people works surprisingly well.” You can almost hear the exclamation “Who knew?” – and the slapping sound of history’s facepalm.

Michael Brooks’s books include “At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise” (Profile)

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 21 July 2016 issue of the New Statesman, The English Revolt