Two numbers from Apple Inc's recent quarterly results provide a cold but fitting memorial to the company's co-founder, the late Steve Jobs. The first is $81.6bn, the cash, or near-cash, in the company's accounts. The second is 40.3 per cent, the gross margin on sales.
A couple of months ago, Apple had more cash in hand than the US government. It can buy Tesco, BT and Nokia without borrowing a cent. The margins, meanwhile, indicate that it can sell millions of premium products at premium prices in the most competitive business sectors on the planet - computers and mobile phones.
Less cold, but equally astonishing, was the global mourning that followed the announcement of Jobs's death on 5 October. It was comparable to the great outpourings of grief for the Princess of Wales, John Lennon, Pope John Paul II and John F Kennedy. The left-wing French philosopher Jean-Claude Michéa was baffled. Why, he wondered, was this being seen as a "planetary catastrophe"? Jobs was just another American businessman, after all. The answer was, the mourners silently declared, because he made us happy.
Perhaps the people mourning Jobs also knew that they would never see his like again. This is undoubtedly true, but not because of the exotic, narcissistic, obsessive confection of his personality. There are many such people and there will be many more. Rather, it is true because it is highly unlikely that any such person will ever again lead a publicly quoted company at the very summit of the commanding heights of the world economy. Creating a one-man company with a meagre product line while sitting on a pile of cash and selling at dazzlingly high profit margins is not, in our present, over-financialised climate, an acceptable corporate strategy.
Also no longer an option is what management consultants would identify as the sheer inefficiency of the Apple process. This was largely because of the costly perfectionism that Jobs brought to his products. He was neither a geek nor a code-writer, but the ultimate user of end products of which he demanded a quasi-mystical purity. He was the customer and, it is said, the only market research he ever did was in the mirror. His products were "insanely great", as he put it, because they sprang from his inner sense of insane greatness.
In that mirror he saw a man neurotically repelled by the sloppy or the incomplete. He insisted that even the innards of Apple's machines, which customers would normally never see, should be as well designed as the exterior. "For you to sleep well at night," he said, "the aesthetic, the quality, has to be carried all the way through." In pursuit of perfection, he could hold up the release of products for years - as he did with the iPad - or demand expensive last-minute changes. A few weeks before its launch, the first iPhone had to have its plastic screen replaced with glass because Jobs thought the plastic scratched too easily. "At its core," wrote David Pogue, tech watcher for the New York Times, "Apple existed to execute the visions in his brain. He oversaw every button, every corner, every chime. He lost sleep over the fonts in his menus, the cardboard of the packaging, the colour of the power cord."
This fetishism, this intense focus on the aesthetic delights and usability, the erotica of the machines, was underpinned by a very ungeeky faith in their benign social significance. "It is in Apple's DNA that technology alone is not enough," Jobs said in March, when he launched the iPad 2. "It's technology married with liberal arts, married with the humanities, that yields us the results that make our hearts sing."
Beyond the beige box
This was matched by a broader critique of the narrowness of his own industry. "A lot of people in our industry haven't had very diverse experiences," he once said. "So they don't have enough dots to connect, and they end up with very linear solutions . . ." Bill Gates, he argued, would be "a broader guy if he had dropped acid once or gone off to an ashram when he was younger". Once asked what was wrong with Microsoft, he replied that the company had no taste. Subjective, immaterial values were his refuge from the dull grind of business and the tyranny of the price-earnings ratio.
Besides the perfection, narcissism and inefficiency of his business style, there was also his stubborn counter-intuitiveness. He opened specialist Apple retail stores when all other such ventures had failed; he removed the physical keyboard from the smartphone when everybody said that was exactly what people wanted; and, against all industry wisdom, he stripped computers - especially the iPad - of features, because he realised that fewer features made them more pleasurable to use than the feature-bloated Windows machines.
As a result of all this perversity, Jobs created the most valuable company in the world in opposition to the cult of management, the mores of his industry and the strictures of Wall Street and the accountants. Nobody will dare do that again. Nor, in most cases, would they be allowed to. So, what happens next? The answer for Apple is a long and difficult attempt to maintain the extremity and oddity of the Jobs dispensation. The answer for the rest of us is more problematic.
Take yourself back to 1994. Computers had invaded the home, but the internet was not quite there yet. Indeed, Gates was saying that what he called the "information superhighway" would not be with us for another decade. I talked to him at length in August that year and, writing about the meeting, I felt I had to explain to my readers what "email" was. Gates had risen to the top of the computer business, toppling IBM in the process, by concluding that it was all about software - hardware would be a low-margin business, banging out plastic boxes and circuit boards.
Meanwhile, Apple, the great innovator, was dying. Jobs had been ejected from the company in 1985 and by the time he returned in 1997 it was circling the drain. The future seemed to consist of a computing environment created by Microsoft's ugly, annoying and slow software (it is still ugly, annoying and slow) running on nasty, plastic machines. But at least we would have the internet, which, by 1996, Gates had finally realised had arrived a decade ahead of schedule. Jobs saved us all from that ill-designed fate.
Apple's development from that mythopoeic moment when the king returned to claim his crown defined the markets for, first, computers, then MP3 players, then phones and, latterly, tablet computers. It was based primarily on a reversal of the Gates conviction that software was paramount. Jobs was too much in love with the hardware to leave it to the production managers.
The first iMac, which appeared in 1998, looked like a giant wine gum; its laptop sibling, the iBook of 1999, looked like a bath toy. They were gestures against the beige-box culture of the day. Software was not ignored - Apple's dismal Mac OS 9, launched in 1999, was replaced by the superb OS X in 2001 - but by then the company had clearly re-established its position as the cool kid on the computing block solely through hardware design.
Ever since, Apple's wave of product innovation has dictated terms to its competitors. As this wave was entirely due to Jobs's return to the company, the big question after his death is: if he had not returned in 1997, would we now be suffering from horrible, clunky software with barely functional mobile phones and tablet computers like those awful machines running Windows that the iPad turned into a joke? Or, to put it another way: does the death of Jobs also mark the death of innovation?
Hardened geeks are likely to sneer at that question because, in strict computing terms, Jobs was not an innovator. He did not write code and almost all the technical aspects of his machines were available elsewhere. "The name 'Steve Jobs' may appear on 300 patents," writes Pogue, "but his gift wasn't invention. It was seeing the promise in some early, clunky technology - and polishing it, refining it and simplifying it until it becomes a standard component. Like the mouse, menus, windows, the CD-ROM or wifi."
Or take Siri, the "personal assistant" on the new iPhone which can understand and answer questions in ordinary speech. It feels like another Jobsian leap into the future. But, in fact, Apple simply bought the company and thereby took the software off the market - otherwise, it would have been available on BlackBerries and Android phones first.
On the other hand, Jobs's pursuit of design and usable integration of existing technologies was as innovative as anything in Silicon Valley. Furthermore, it drove technical innovation by showing how profitable and popular these machines could be. Now that he is gone, will the whole process slow or come to a halt?
This is a more profound question than it might appear to be, because it calls into question the continued ability of the liberal democratic system to deliver the goods. Perhaps stunned by the relatively sudden rise of China and her staggering ability to satisfy her own and the world's consumers, western thinkers, with or without the death of Jobs, have begun to consider the possibility of a decline in our innovative capacity.
Running to stand still
This is a startling and, for some, a blasphemous idea. The unthinking assumption of most people is that technological progress is more or less a straight, upward line. This view, in the age of the computer, has been encouraged by Moore's Law. Formulated in 1965 by the Intel co-founder Gordon Moore, this forecast that the number of transistors on a chip - and with it computing power - would double roughly every two years. For nearly half a century, the prediction has proved phenomenally accurate and has underpinned the conviction that an inevitable increase in our technological competence is, in effect, a law of nature.
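The scale of that forecast is easy to underestimate. Doubling every two years means multiplying by two to the power of half the elapsed years - a back-of-envelope sketch (the function name and the perfectly clean doubling period are illustrative assumptions, not Moore's exact formulation):

```python
# Rough arithmetic of Moore's Law, assuming an idealised clean
# doubling of transistor counts every two years from 1965 onward.
def moore_factor(start_year, end_year, doubling_period=2):
    """Growth multiple implied by one doubling every `doubling_period` years."""
    doublings = (end_year - start_year) / doubling_period
    return 2 ** doublings

# From Moore's 1965 paper to the year of Jobs's death: 23 doublings,
# an increase of roughly eight million times.
print(moore_factor(1965, 2011))  # 2**23 = 8388608
```

Twenty-three doublings in 46 years yields a factor of more than eight million - the kind of compounding that makes progress feel like a law of nature.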
Hyperoptimistic geek history now usually draws an upward line from the printing press to the iPad, ignoring any intervening carnage in order to claim this is also the line of human progress. Such optimists sometimes go further. They say that the movement upwards is exponential and will, therefore, lead to a climactic moment known as "the Singularity" - apparently due in 2045, according to the American inventor and author Ray Kurzweil. At this point, a hyperintelligent machine will usher in the technological rhapsody, freeing us from the bonds of biology and leading us into the post- or transhuman future. If you think this sounds like spilt religion, you are not wrong.
But you don't have to go that far. Moderate types may simply say that frontier-crossing global connectivity just might make us a more tolerant, less violent species. Realists will say that nothing changes. Where you stand on this depends on your theology or sensibility. Yet it is not just a matter of opinion. There are more hard-headed analyses of the reality or the speed of innovation.
In 1990, Jonathan Huebner, a physicist at the Pentagon's Naval Air Warfare Center in China Lake, California, was struck by what seemed to be a decline in new technological advances. He had been told that it was a great time to be in technology, but increasingly it seemed to him that the golden age had passed. He embarked on a study, which showed that the rate of innovation peaked in 1873 and had been declining ever since. In fact, our current rate of innovation - which Huebner puts at seven important technological developments per billion people per year - is about the same as it was in 1600. By 2024, it will have slumped to the level of the Dark Ages, the centuries that followed the collapse of the Roman empire.
Huebner offered two possible explanations for this: economics, and the size of the human brain. Either it's just not worth pursuing certain innovations because they won't pay off - this is one reason why space exploration has practically stopped - or it is occurring because we already know most of what we can know and, therefore, discovering new things is becoming increasingly difficult.
At the same time, Ben Jones, a management professor at Northwestern University, pointed out that we were having to run ever faster to stay in the same place. We need inventions but we have to work harder to find them. "The result," he said, "is that the average individual innovator is having a smaller and smaller impact . . . I've noticed that Nobel prizewinners are getting older. That's a sure sign it's taking longer to innovate."
The "low-hanging fruit" aspect of this argument also emerged in June in the economist Tyler Cowen's book The Great Stagnation. Cowen says innovation has flatlined since 1973 because all the easiest prizes have been won. In this, he was backed up by Peter Thiel, founding chief executive of PayPal. Thiel points to the way transport speeds have stalled since the Sixties while biotechnology and medical research are bearing ever less fruit. Even computers, in his analysis, are not what they seem. With real incomes stagnating since the Seventies, the effect of the new gadgetry of connection has been merely to offset declines elsewhere.
"To a first approximation," Thiel wrote this month in the National Review, "the progress in computers and the failure in energy appear to have roughly cancelled each other out. Like Alice in the Red Queen's race, we (and our computers) have been forced to run faster and faster to stay in the same place."
The science-fiction writer Neal Stephenson brings in another, more immediately topical factor: the short-termism of the financialised economy. "Any strategy," he writes in the current issue of World Policy Journal, "that involves crossing a valley - accepting short-term losses to reach a higher hill in the distance - will soon be brought to a halt by the demands of a system that celebrates short-term gains and tolerates stagnation, but condemns anything else as failure. In short, a world where big stuff can never get done."
If the low-hanging fruit argument is taken to mean that we are closer to the limit of what we are capable of, then there is nothing much that we can do about it. But if it is taken to be an economic and financial issue - can we afford to reach for the high-hanging fruit? - then plainly there is a great deal we can do.
This brings me back to Apple's cash and margins. In the logic of Wall Street, Apple should be a basket case. It holds cash rather than gearing itself up; it held products back until Jobs judged them "insanely great", and has only five lines to sell - desktops, laptops, iPods, iPhones and iPads; and it clings on to what would appear to be uncompetitive margins. Even so, the share price has risen relentlessly.
Head in the cloud
In response to this, those who believe in the present financial dispensation might repeat the geek argument that Apple has not been a true innovator, or they might say that Jobs was too much of a one-off to work as a model for any future business based on innovation. Naturally, those who believe in the present financial dispensation should be seeking help and not offering advice, but the argument that Jobs is just too much of an exception deserves careful consideration.
His is a hard act to follow - as Tim Cook, Jobs's successor as Apple chief executive, well knows - but it is not impossible. Indeed, before boneheaded globalisers, ruthless private equity operators, thuggish City traders and banks incapable of understanding risk came along, it used to be the business of enlightened capitalism to discover, back and nurture such one-offs. Apple was such a rarity in the deregulated Nineties and Noughties precisely because almost everything else had come to seem like a kind of cheap scam, the spawn of the neoliberal bitch goddess.
The answer to this problem of financial corruption and short-termism, for some, is to throw out all the babies with all the bathwater. In France "déglobalisation" is now an intellectual programme, and potentially a political movement. There is a directly anti-capitalist strand called "décroissance" (decrease) which supports a deliberate end to economic growth. The demonstrators around St Paul's Cathedral and Wall Street are routinely described as "anti-capitalist". Even the Tea Party in the US - the French and the demonstrators will hate this - is a response, couched in hypernationalistic terms, to the failures of globalised capitalism.
This brings me to one final feature of Apple which has not attracted comment, presumably because everyone assumes it is just another big, global company. In certain respects, it isn't. The products all proudly boast of their Californian roots. Even the stamp icon that denotes its email program sweetly carries a Cupertino postmark. And one of Jobs's last projects was the commissioning of a giant, spaceship-like headquarters in this little suburb of San Jose. This is still, in gleaming, hypertrophied form, a small-town company. Apple, in short, does not embed a global identity in its imagery; its "offer" is a home in California and all that implies to the contemporary imagination. Contrast this with the blackmail and extortion deployed by the banks in resisting regulation precisely because they can go anywhere in the world and owe no special loyalty to anybody.
Innovation, for the moment, seems to continue. We are now entering the world of "the cloud". Cloud computing - in which remote servers will ultimately carry all our software and information, to be accessed by any device - is the next big thing. Apple has only just got there. Google is ahead and is seriously challenging the iPhone with its Android operating system and its agreement in August to purchase Motorola Mobility for $12.5bn. Microsoft is limping along with the recent completion of its $8.5bn acquisition of Skype. Amazon, having launched the Kindle Fire tablet in the US, plainly intends to cut itself a slice of the Apple action.
Product innovation, therefore, is rife but there is now no Jobs to make it beautiful, to make you want it even if you don't need it, and there is still a critically unreformed financial system to make it fail. Apple will almost certainly suffer a descent from the peaks of the past decade and, in the present climate, there is only a slim chance that anything will take its place. So, cuddle your iPhone 4S. It may be the last insanely great thing in your lifetime.
Bryan Appleyard's book "The Brain Is Wider Than the Sky: Why Simple Solutions Don't Work in a Complex World" will be published by Weidenfeld & Nicolson on 10 November (bryanappleyard.com)