"Oh Wow": the life and death of Steve Jobs

The last words of the Apple founder revealed a side of him that was usually hidden.

If there's one thing I've learned from Walter Isaacson's biography of Steve Jobs, it's that there is no line between monster and genius: the Apple founder was undoubtedly both.

In my review -- to be published in this week's magazine -- I trace some of the "asshole" things that Jobs did: abandoning a pregnant girlfriend, "crowdsourcing" his decision to marry his wife Laurene, even parking in disabled spaces. But the biography also does a wonderful job of showing how the character traits that led him to those actions were exactly the ones that made him great.

Jobs believed the normal rules didn't apply to him. He refused to put up with anything less than perfection, assembling a team of "A players" at Apple. He made sure his products were as beautiful on the inside as on the outside, even if no one would ever see it. He was also unafraid to tear up months of work if he had a better idea.

The result is that by the end of the book you can't help admiring him, even if you're not a fully paid-up member of the Cult of Apple (I've only got an iPhone and a MacBook, so I think that makes me a Christmas and Easter churchgoer). His death from pancreatic cancer is told simply and movingly: Isaacson does not flinch from the fact that Jobs's stubbornness -- he believed that his vegan diet would halt the spread of his tumours -- meant he died earlier than he needed to. But nonetheless, the way Jobs dealt with his diagnosis revealed a side of the great showman we might never otherwise have seen.

Jobs spoke about his cancer in his 2005 Stanford commencement address:

Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Because almost everything -- all external expectations, all pride, all fear of embarrassment or failure -- these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

Now, his sister Mona has revealed his last words, in a eulogy reprinted in the New York Times:

Even now, he had a stern, still handsome profile, the profile of an absolutist, a romantic. His breath indicated an arduous journey, some steep path, altitude.

He seemed to be climbing.

But with that will, that work ethic, that strength, there was also sweet Steve's capacity for wonderment, the artist's belief in the ideal, the still more beautiful later.

Steve's final words, hours earlier, were monosyllables, repeated three times.

Before embarking, he'd looked at his sister Patty, then for a long time at his children, then at his life's partner, Laurene, and then over their shoulders past them.

Steve's final words were:

OH WOW. OH WOW. OH WOW.

I'm sure there will be some people who aren't moved by that -- but I'm not one of them.

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.


Extremist ads and LGBT videos: do we want YouTube to be a censor, or not?

Is the video-sharing platform a morally irresponsible slacker for putting ads next to extremist content – or an evil, tyrannical censor for restricting access to LGBT videos?

YouTube is having a bad week. The Google-owned video-sharing platform has hit the headlines twice over complaints that it 1) is not censoring things enough, and 2) is censoring things too much.

On the one hand, big brands including Marks & Spencer, HSBC, and RBS have suspended their advertisements from the site after a Times investigation found ads from leading companies – and even the UK government – were shown alongside extremist videos. On the other, YouTubers are tweeting #YouTubeIsOverParty after it emerged that YouTube’s “restricted mode” (an opt-in setting that filters out “potentially objectionable content”) removes content with LGBT themes.

This isn’t the first time we’ve seen a social media giant criticised for being a lax, morally irresponsible slacker and an evil, tyrannical censor in the same week. Last month, Facebook was criticised both for failing to remove a group called “hot xxxx schoolgirls” and for removing a nude oil painting by an acclaimed artist.

That is not to say these things are equivalent. Quite obviously child abuse imagery is more troubling than a nude oil painting, and videos entitled “Jewish People Admit Organising White Genocide” are infinitely more problematic than those called “GAY flag and me petting my cat” (a highly important piece of content). I am not trying to claim that ~everything is relative~ and ~everyone deserves a voice~. Content that breaks the law must be removed and LGBT content must not. Yet these conflicting stories highlight the same underlying problem: it is a very bad idea to trust a large multibillion-pound company to be the arbiter of what is or isn’t acceptable.

This isn’t because YouTube has some strange agenda whereby it can’t get enough of extremists and hates the LGBT community. In reality, the company’s “restricted mode” also affects Paul Joseph Watson, a controversial YouTuber whose pro-Trump conspiracy theory content includes videos titled “Islam is NOT a Religion of Peace” and “A Vote For Hillary is a Vote For World War 3”, as well as an interview entitled “Chuck Johnson: Muslim Migrants Will Cause Collapse of Europe”. The issue is that if YouTube did have such an agenda, it would have complete control over what it wanted the world to see – and not only are we willingly handing it this power, we are begging it to use it.

Moral panics are the most common justification for extreme censorship and surveillance methods. “Catching terrorists” and “stopping child abusers” are two of the greatest arguments for the dystopian surveillance measures in Theresa May’s Investigatory Powers Act and Digital Economy Bill. Yet in reality, last month the FBI let a child pornographer go free because it didn’t want to disclose to a court the surveillance methods used to catch him. That raises the question: what is the surveillance really for? The same is true of censorship. When we insist that YouTube stop this and that, we are asking it to take complete control – why do we trust that this will reflect our own moral sensibilities? Why do we think it won't use this for its own benefit?

Obviously extremist content needs to be removed from YouTube, but why should YouTube be the one to do it? If a book publisher released A Very Racist Book For Racists, we wouldn’t trust it to pull the book off the shelves itself. We have laws (such as the Racial and Religious Hatred Act) that ban hate speech, and we have law enforcement bodies to enforce them. On the whole, we don’t trust giant commercial companies to rule over what is and isn’t acceptable to say, because oh, hello, yes, dystopia.

In the past, public speech was made up of hundreds of book publishers, TV stations, film-makers, and pamphleteers, and no one person or company had the power to censor everything. A book that didn’t fly at one publisher could go to another, and a documentary that the BBC didn’t like could find a home on Channel 4. Why are we happy for essentially two companies – Facebook and Google – to take this power? Why are we demanding that they use it? Why are we giving them justification to use it more, and more, and more?

In response to last week’s criticism about extremist videos on YouTube, Google UK managing director Ronan Harris said that in 2016 Google removed nearly 2 billion ads, banned over 100,000 publishers, and prevented ads from showing on over 300 million YouTube videos. We are supposed to consider this a good thing. Why? We don't know what these adverts were for. We don't know if they were actually offensive. We don't know why they were banned.

As it happens, YouTube has responded well to the criticism. In a statement yesterday, Google's EMEA President, Matt Brittin, apologised to advertisers and promised improvements, and in a blog post this morning, Google said it is already "ramping up changes". A YouTube spokesperson also tweeted that the platform is "looking into" concerns about LGBT content being restricted. But people want more. The Guardian reported that Brittin declined three times to answer whether Google would go beyond allowing users to flag offensive material. Setting aside Brexit, wouldn't you rather it was up to us as a collective to flag offensive content and come together to make these decisions? Why is it preferable that one company takes on a job that was previously trusted to the government?

Editor’s Note, 22 March: This article has been updated to clarify Paul Joseph Watson’s YouTube content.

Amelia Tait is a technology and digital culture writer at the New Statesman.