Follow the leader: Apple’s co-founder Steve Jobs leaves the stage after launching the iPad in San Francisco in January 2010. Image: Bloomberg via Getty Images

Is this the death of Apple?

Steve Jobs’s creation, long thought to be the smartest company in the world, is in danger of falling behind Google and Facebook in the race to be the internet platform of the future.

Steve Jobs died on 5 October 2011, the day after the launch of the iPhone 4S. His chosen successor, Tim Cook, was already installed as Apple’s chief executive and, after initially faltering, the share price recovered, rising 76 per cent in the following year. Then, in a few months, all those gains vanished and the price is still well below its peak.

The reason is market scepticism about the post-Jobs regime. Cook is not Jobs and, since the iPad, there has been no spectacular product launch, only the usual stream of updates and improvements. His more conventional management practices are said to be counter-innovative. In addition, competitors are thriving and, most importantly, Google and Facebook seem to have solved the puzzle of how to make money out of advertising on mobile devices. All of which is just another way of saying that Steve Jobs is dead.

Then there is the sigmoidal curve. Companies are like animals. After an initial growth spurt, they slow down and die, usually in a matter of decades. Imagine the letter “S” fallen on its face: there is a curve downwards on the left, then a rise – the start-up and growth spurt – then it reaches a plateau and begins to decline. This is the sigmoidal curve, applied by the physicist Geoffrey West of the Santa Fe Institute in New Mexico to both organisms and companies. Some are now saying that it applies to Apple, one of the biggest companies on the planet – and certainly the smartest.
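
(For the mathematically inclined, the rising portion of that curve is conventionally written as the logistic function – a sketch using the standard textbook symbols rather than anything drawn from West’s own papers: f(t) = L/(1 + e^(−k(t − t₀))), where L is the plateau a company eventually approaches, k sets the pace of its growth spurt and t₀ marks the midpoint of that spurt.)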

Business theories are like Marxism in the Soviet Union: they are only true to the extent that enough people pretend that they are. Apple, however, looks like an unusually perfect test case, not just of the theory but also of the possibility that the sigmoidal curve can, for a time, be beaten by the creation of a new curve, a new injection of start-up energy, a new growth spurt. Under Jobs, the iPod, the iPhone and the iPad did this and it is what the markets are now looking for, post-Steve. Apple has yet to deliver.

One symptom of market nerves is that investors are increasingly unwilling to let the company sit on its great pile of cash. In Apple’s case, this amounts to $150bn, almost 10 per cent of all the cash held by non-financial companies in the United States and, at times, more than the cash cushion held by the US Treasury. Cook has agreed to hand back $100bn in dividends and share buybacks. In October, a powerful investor, Carl Icahn (who is worth $20bn), demanded the return of the whole $150bn.

Meanwhile, the shares of Google, Apple’s bitter rival, have soared over the past year. Google, the story goes, is now the great innovator in Silicon Valley. Apple, after Jobs, began to ossify, to lose its ability to scatter “insanely great” products over the adoring masses. There are currently only three serious players in the battle to be the platform on which the future is built – Facebook is the other. Apple is in danger of taking the bronze.

The mixed response to the launch of iOS7 – the new version of the iPhone and iPad operating system – in September seemed to confirm this. It was also a reminder of one of the most important things that happened after Jobs’s death: Cook’s dismissal of Scott Forstall. Forstall is a curious figure, a Jobsian to his core – he tended to wear the same clothes as Steve and even drive the same car (a silver Mercedes-Benz SL55 AMG). He was in charge of iOS and, crucially, of the disastrous launch of Apple Maps, an app initially so bad that Cook advised users to go back to Google Maps.

Forstall was also a fan of skeuomorphs, a design term of art that has suddenly escaped into common usage. It means things that look like older, more familiar things – so an icon for an app for making notes looks like a notebook. This dominated iOS visuals until Jonathan “Jony” Ive, Apple’s senior vice-president of design, threw it out in favour of a “flatter”, de-skeuomorphed look with iOS7.

Many objected to the new system, claiming that it hurt their eyes, made them dizzy, and so on. The truth was that with the flat look, Ive and Cook had bigger fish to fry. As Will Self has astutely observed in the New Statesman, skeuomorphs signal the dominance of western old-guy culture. They are a way of making the old feel comfortable with the new. In that context, it seems that iOS7 is aimed at youth, specifically Asian youth. It is in Asia that Apple is growing and will continue to grow fastest. Nevertheless, iOS7 remains contentious and further evidence for those who believe that Apple has lost it.

This raises two questions. Should we believe any of this? Should we care? A quick answer to the first question is: of course not. Precisely because it is the largest and – thanks to the epic tale of Jobs’s life and death – the most dramatic tech company, the imminent demise of Apple is announced with tedious regularity. The “Apple Death Knell Counter” at the Mac Observer website records 63 premature obituaries since 1995, the most recent being in July, when an article on the CBS site carried the headline: “Why Apple is a dead company walking”.

Nevertheless, the point about crying wolf is that eventually the wolf does come. Its arrival would be traumatic for veteran Apple users like me. For anybody young enough to know music primarily as something heard through white earphones (the iPod was launched in October 2001), the idea of an Apple-less world is hard to imagine.

“I was born in 1985,” says Luke Dormehl, author of The Apple Revolution. “I was 16 when the iPod came out. For my whole consumer-electronics-buying life, Apple has been as we know [it] today.”

The software and hardware of Apple’s computers remain years ahead of and infinitely more usable than anything else on the market. Its ethos of striving for “machine beauty” – a kind of beauty now reflected in the splendour of the Apple Stores – is still seductively unique. In this, Google comes a distant second but there is not a single brand – not in technology, not anywhere – as potent as Apple.

Consider this: Apple makes very few products – Cook once said its entire range could fit on a tabletop – and they are more expensive than the competition. So how has it become one of the biggest companies in the world? It has done so through the power of mystique, aspiration and industrial design; through, in short, the narcissistic, brutally competitive aesthetic obsessiveness of Steve Jobs. Apple continues to be formidably profitable – its stores, for example, have the highest sales per square foot of any retail outlet in the world. Yet Apple is not a viable business model: it is, like Jobs, an unrepeatable corporate freak show. Can it possibly be, post-Jobs, a freak show that runs and runs? The reviews are not yet in but doubt is priced into the shares.

What is going on? Given the number of full-time Apple watchers in the world, you’d think that somebody would know – but nobody does, not even, thanks to the company’s internal as well as external secrecy, most of its employees. Products are developed in closed-off rooms entered only by those who need to know. The one small peephole in this wall of silence used to be patent applications, which are public documents. Apple’s used to be scanned for what would come next but now the company patents everything, even abandoned ideas, so there is no knowing which are the real possibilities.

However, when the watchers converge on a forecast, it tends to have at least a grain of truth. Currently, they say, an iPhone with a much larger, curved screen and pressure-sensitive touch controls is on the way. But that simply augments the current product. Two possible and entirely new products could restart the company’s sigmoidal curve. “I’m pretty confident that they have at least two more technologies,” says Leander Kahney, editor of the Cult of Mac blog and author of a new book about Jony Ive.

Current televisions are among the worst-designed electronic products on the market; they are a vision of what mobile phones and computers would be like if Apple had never existed. Everybody has several remotes and, to the rear of the flat screen, there is a zoo of wires and ports. Apple can fix this – just before his death, Jobs said he had solved the problem – and Ive can make it beautiful. Imagine a screen and an iPad-like controller and you probably get the picture. An Apple television would be an invasion of a new product zone comparable to the launch of the iPhone. The project is fraught with difficulty, however, not least because Apple would need to control the content flow, as it has done with music ever since the iPod, in order to preserve its freakish profitability. Jobs may be dead but the company’s control mania is undiminished.

The latest rumours suggest that the TV is not imminent but also that something perhaps much more interesting is. The biggest clue to this is Cook’s appointment of Angela Ahrendts, CEO of Burberry, as head of retail. This has been called “the most important hire Cook has ever made” and Ahrendts has even been tipped as the next Apple CEO. Less obvious are a number of other appointments from the fashion industry.

“They are hiring fashion designers like crazy,” says Kahney, “and they’re getting all these people from the fashion industries; they’re working with somebody from Yves Saint Laurent. They’ve just secretly hired three industrial designers with experience in hi-tech clothing; there was a wetsuit designer from Patagonia [the company, not the place]; there’s an expert in industrial knitting from Nike; and at the same time, they are hiring all these engineers who build biometric sensors.”

The next move in this game is, therefore, the cyborg – the part-human, part-machine hybrid dreamed of by science-fiction writers. This is all about wearable computing or “technologically enhanced clothing”, as Kahney puts it. The widely rumoured iWatch may be the first step in this direction, though this would hardly be revolutionary, as there are many such devices already on the market. What follows may be, for example, clothing that tracks your vital signs – blood pressure, heart rate, and so on – giving you instant feedback so that you can adjust your behaviour. Apple Stores could thus become, in part, clothing outlets. Hence the appointment of Ahrendts.

This would be a move in the great Jobs tradition: the annexation of a new industry. Wearable computing is, at the moment, a mess of rather dull-looking products, primarily watches, and the best known of them, Google Glass, is still not widely available. In any case, it seems specifically designed to make its users look like idiots – Glass is a spectacle frame controlled by a series of strange movements of the head and hands; it is guaranteed not to break the ice at parties. Yet if Kahney is right, Apple intends not just to compete with these ephemeral gadgets but with the entire clothing industry.

Can Cook do it? The first point to make is that he may not be Jobs but he made what Jobs did possible. “It is no coincidence,” writes Adam Lashinsky in his book Inside Apple, “that the more responsibility Cook took on in the nuts-and-bolts parts of Apple, the more Jobs was freed up for his creative endeavours. Released from worrying whether customer service was operating smoothly or if retail outlets were receiving inventory to match customer demand, Jobs spent the last decade of his life dreaming up the iPod, iPhone and iPad – and then marketing them.”

Cook created the supply chain, a global network of manufacturers whose components converge, primarily, on giant Chinese assembly operations. It is unlikely that Jobs could have done this, given his volcanic temper, his impatience and his love of the product rather than its manufacturing ancestry. Kahney suggests that Cook’s strategy now is to become an enabler for Jony Ive, the man who, more than anybody else, seems to keep the Jobs mystique alive.

It is certainly still the case that ID – the industrial design studio – is the dominant force within the company. It is said that at meetings, everybody falls silent when the designers walk into the room. They have the last and the first word on everything that Apple ships, the packaging included. This is unique in Silicon Valley and probably unique in the industrialised world. If Cook tampers with that, then he risks Apple finally becoming just another company.

What Cook can’t do, however, is maintain Apple’s marketing operation; this is because it is dead. Jobs was the marketing department. His story – ejected in 1985 from the company he co-founded, returning in 1996 to save it from bankruptcy and lead it to world dominance – combined with his theatrical skills (his product announcements were some of the greatest shows on earth) and his attention to detail in advertising all created a personal mystique that fed into a product mystique of religious intensity. Steven Levy of Newsweek was one of the four journalists to get an iPhone ahead of its launch. In the crowds outside the New York Apple Store, he was being interviewed on TV when somebody grabbed the microphone and announced that he had one of the sacred gadgets. What followed was a weird display of what can only be called piety. “Shaken but undaunted,” Levy wrote later, “we restarted [the interview]. It got scarier. People pressed in close, fingers stretching toward the device, Michelangelo-style.”

Even Jobs’s death seems to have been choreographed as a quasi-religious ceremony. His last words are said to have been: “Oh, wow. Oh, wow. Oh, wow” – as if he saw, beyond the veil, the ultimate product. And he once said that death was “very likely the single best invention of life”.

Cook can’t match this – nobody could – and it is noticeable that the quality of Apple’s advertising has slipped badly. Print and video ads have become corporate feel-good hack work, indistinguishable from the competition: a sad decline from the often outrageous and always stylish edginess of the Jobs years. Cook now seems to be buying in marketing skills, notably with the hiring of Ahrendts, but Apple will have to function without the giant personality attached.

The fate of this confection of ego, art and advertising will ultimately be determined by competition. As a new book – Fred Vogelstein’s Dogfight – explains, Apple’s history since the iPhone has been dominated by an increasingly bitter war with Google. Prior to the iPhone, the two companies were quite friendly, sharing a director, Eric Schmidt; Apple even tolerated the fact that Google seemed to be developing a mobile operating system that might compete with its device.

After the launch of the iPhone in 2007, this fell apart as phones using Google’s Android system started to appear. Jobs was incandescent and launched a series of lawsuits against the makers of Android phones. In his biography of Jobs, Walter Isaacson quotes him in full flow after he launched a case against the smartphone manufacturer HTC in 2010: “Our lawsuit is saying, ‘Google, you fucking ripped off the iPhone, wholesale ripped us off.’ Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40bn in the bank to right this wrong. I am going to destroy Android because it is a stolen product. I’m willing to go to thermonuclear war on this. They are scared to death because they know they are guilty. Outside of Search, Google’s products – Android, Google Docs – are shit.”

Vogelstein points out that Jobs’s vendetta “created one of the largest patent law firms in the world”, an amalgam of Apple’s legal team and four outside firms deploying 300 lawyers and costing about $200m a year. All of this on the basis of a case that, Vogelstein suggests, was far from fair. Apple, too, had copied ideas – notably the computer mouse from Xerox – but any defence from Google along those lines could not dent Jobs’s conviction that Apple had invented everything.

Cook is unlikely to be any less determined to wage war on Google. Yet the late boss’s vehemence may prove a burden. It wasn’t just applied to one company; all of Apple’s competitors and even their partners were subjected to his monomaniacal conviction that there was Apple and there was the rest of the world, which largely consisted of bozos and rip-off artists. This left behind a burden of scratchy corporate relationships. When Jobs was alive, the other companies grinned and bore it; now they are not doing so.

The big picture now is this. There are three companies competing to be the internet platform of the future – Apple, Google and Facebook. They have quite different methods and utterly different images. Google’s approach is based on its near monopoly over advertising and its drive to feed this by a rapid expansion of its ability to acquire and control information; its image is that of the wacky, let’s-give-it-a-whirl inventor. Facebook pursues a massive expansion of the idea of the social network; its image is that of a hip, genial, idealistic maker of friendships and connections. Apple aims for the tightest possible integration of hardware and software that ties users into its system; its image is that of the autocratic genius who knows better than you how to live your life and, soon, how to dress. Apple’s image is at least the most honest.

At the moment, Google is the favourite for gold, with Facebook as a possible silver if it can control its appalling public relations and crass handling of private information. Apple is on the ropes. I hope it won’t stay there for long for one simple reason. None of these companies is especially loveable; they are all power- and money-hungry operations that seem to think they have a right to remake the world in their own image. They employ people who think that Ayn Rand’s objectivism is the last word in philosophy and that the “technological singularity” – the takeover of the machines – is imminent and desirable. They no doubt want us to be more like machines, the better to be interacted with and be read by their products. Yet Apple has a redeeming feature. It does, in spite of everything and thanks to Steve Jobs, make things beautiful.

Bryan Appleyard is a writer for the Sunday Times and other publications. He tweets as: @bryanappleyard

This article first appeared in the 20 November 2013 issue of the New Statesman, iBroken


Arsène Wenger: how can an intelligent manager preside over such a hollowed-out team?

The Arsenal manager faces a frustrating legacy.

Sport is obviously not all about winning, but it is about justified hope. That distinction has provided, until recently, a serious defence of Arsène Wenger’s Act II – the losing part. Arsenal haven’t won anything big for 13 years. But they have been close enough (and this is a personal view) to sustain the experience of investing emotionally in the story. Hope turning to disappointment is fine. It’s when the hope goes, that’s the problem.

Defeat takes many forms. In both 2010 and 2011, Arsenal lost over two legs to Barcelona in the Champions League. Yet these were rich and rewarding sporting experiences. In the two London fixtures of those ties, Arsenal drew 2-2 and won 2-1 against the most dazzling team in the world. Those nights reinvigorated my pride in sport. The Emirates Stadium had the best show in town. Defeat, when it arrived in Barcelona, was softened by gratitude. We’d been entertained, more than entertained.

Arsenal’s 5-1 surrender to Bayern Munich on 15 February was very different. In this capitulation by instalments, the fascination was macabre rather than dramatic. Having long given up on discerning signs of life, we began the post-mortem mid-match. As we pored over the entrails, the curiosity lay in the extent of the malady that had brought down the body. The same question, over and over: how could such an intelligent, deep-thinking manager preside over a hollowed-out team? How could failings so obvious to outsiders, the absence of steel and resilience, evade the judgement of the boss?

There is a saying in rugby union that forwards (the hard men) determine who wins, and the backs (the glamour boys) decide by how much. Here is a footballing equivalent: midfielders define matches, attacking players adorn them and defenders get the blame. Yet Arsenal’s players as good as vacated the midfield. It is hard to judge how well Bayern’s playmakers performed because they were operating in a vacuum; it looked like a morale-boosting training-ground drill, free from the annoying presence of opponents.

I have always been suspicious of the default English critique which posits that mentally fragile teams can be turned around by licensed on-field violence – a good kicking, basically. Sporting “character” takes many forms; physical assertiveness is only one dimension.

Still, it remains baffling, Wenger’s blind spot. He indulges artistry, especially the mercurial Mesut Özil, beyond the point where it serves the player. Yet he won’t protect the magicians by surrounding them with effective but down-to-earth talents. It has become a diet of collapsing soufflés.

What held back Wenger from buying the linchpin midfielder he has lacked for many years? Money is only part of the explanation. All added up, Arsenal do spend: their collective wage bill is the fourth-highest in the League. But Wenger has always been reluctant to lavish cash on a single star player, let alone a steely one. Rather two nice players than one great one.

The power of habit has become debilitating. Like a wealthy but conservative shopper who keeps going back to the same clothes shop, Wenger habituates the same strata of the transfer market. When he can’t get what he needs, he’s happy to come back home with something he’s already got, usually an elegant midfielder, tidy passer, gets bounced in big games, prone to going missing. Another button-down blue shirt for a drawer that is well stuffed.

It is almost universally accepted that, as a business, Arsenal are England’s leading club. Where their rivals rely on bailouts from oligarchs or highly leveraged debt, Arsenal took tough choices early and now appear financially secure – helped by their manager’s ability to engineer qualification for the Champions League every season while avoiding excessive transfer costs. Does that count for anything?

After the financial crisis, I had a revealing conversation with the owner of a private bank that had sailed through the turmoil. Being cautious and Swiss, he explained, he had always kept more capital reserves than the norm. As a result, the bank had made less money in boom years. “If I’d been a normal chief executive, I’d have been fired by the board,” he said. Instead, when the economic winds turned, he was much better placed than more bullish rivals. As a competitive strategy, his winning hand was only laid bare by the arrival of harder times.

In football, however, the crash never came. We all wrote that football’s insane spending couldn’t go on but the pace has only quickened. Even the Premier League’s bosses confessed to being surprised by the last extravagant round of television deals – the cash that eventually flows into the hands of managers and then the pockets of players and their agents.

By refusing to splash out on the players he needed, whatever the cost, Wenger was hedged for a downturn that never arrived.

What an irony it would be if football’s bust comes after he has departed. Imagine the scenario. The oligarchs move on, finding fresh ways of achieving fame, respectability and the protection achieved by entering the English establishment. The clubs loaded with debt are forced to cut their spending. Arsenal, benefiting from their solid business model, sail into an outright lead, mopping up star talent and trophies all round.

It’s often said that Wenger – early to invest in data analytics and worldwide scouts; a pioneer of player fitness and lifestyle – was overtaken by imitators. There is a second dimension to the question of time and circumstance. He helped to create and build Arsenal’s off-field robustness, even though football’s crazy economics haven’t yet proved its underlying value.

If the wind turns, Arsène Wenger may face a frustrating legacy: yesterday’s man and yet twice ahead of his time. 

Ed Smith is a journalist and author, most recently of Luck. He is a former professional cricketer and played for both Middlesex and England.

This article first appeared in the 24 February 2017 issue of the New Statesman, The world after Brexit