Invasion of the cyber hustlers
From Jeff Jarvis to Clay Shirky, a class of gurus are intent on "disrupting" old-fashioned practices like asking us to pay for valuable content. Meanwhile, web giants like Google and Apple jealously guard their profitable secrets.
Like every other era, the internet age has its own class of booster gurus. They are the “cybertheorists”, embedded reporters of the social network, dreaming of a perfectible electronic future and handing down oracular commandments about how the world must be remade. As did many religious rebels before them, they come to bring not peace, but a sword. Change is inevitable; we must abandon the old ways. The cybertheorists, however, are a peculiarly corporatist species of the Leninist class: they agitate for constant revolution but the main beneficiaries will be the giant technology companies before whose virtual image they prostrate themselves.
Cybertheorists’ jargon often betrays an adolescent hatred of the world in which they find themselves. Jay Rosen, a prominent “future of news” cyber-guru, takes care at every opportunity to sneer at publishing institutions by pasting to them the epithet “legacy”: “legacy newsrooms”, “legacy media”. Another favourite cyber-adjective is “disruptive”. For most of us, disruption is annoying, but for cyber-swamis the more disruptive of established practices technology becomes, the more exciting it is.
Another new-media cyber-quack, the journalist Jeff Jarvis, wrote in his 2009 tract What Would Google Do?: “Education is one of the institutions most deserving of disruption.” (The tone of resentful loathing is cyber-typical.) What form might such exciting disruption take? The start-up Coursera, for one, promises to transform university teaching by offering lectures on snippets of web video and getting students to mark each other’s work. If you are a cybertheorist, this wheeze is a brilliant plan to leverage peer networks; if you are anyone else, it’s a brilliant plan to offload more of the labour of education on to the learners.
Another purported quality of Coursera is that it is “open”, as everything must now be. The cyber-credo of “open” sounds so liberal and friendly that it is easy to miss its remarkable hypocrisy. The big technology companies that are the cybertheorists’ beloved exemplars of the coming world order are anything but open. Google doesn’t publish its search algorithm; Apple is notoriously secretive about its product plans; Facebook routinely changes its users’ privacy options. Apple, Google and Amazon are all frantically building proprietary “walled-garden” content utopias for profit.
“Open-source” software, on the model of the Linux operating system, used to be the cybertheorists’ favourite example of why open would always beat closed. Yet, for all the admirable successes of open-source software (especially in industrial applications), closed commercial software and services still dominate. Even Google’s open-source Android smartphone operating system is, for the vast majority of customers, experienced as a customised and re-closed version on phones manufactured by Samsung, HTC and Sony.
“Owning pipelines, people, products or even intellectual property is no longer the key to success,” Jarvis wrote in 2009. “Openness is.” What is now the company with the highest market valuation on earth? Apple, which sells physical products, jealously guards its patent hoard and is about as “open” as Fort Knox. Only the most black-hearted of cynics could suppose that it is in a cybertheorist’s interest to lecture media companies that they must be “open” so that the technology companies for which he acts as a useful idiot can happily hoover up all their data for free and monetise it.
The genius of Coursera’s plot to “disrupt” university teaching can, if you like, be imbibed through the medium of a Ted talk. These are lavishly produced speeches, lasting 18 minutes at most, which provide the assembled digerati with a nugget-sized “takeaway”: ideas as fast food. A Ted talk takes the form of a collection of stories tied together by an intriguingly stupid marquee title, forming a tight video-bolus of anti-thought that is then linked around by web enthusiasts.
A perfect Ted talk title is the recent “The Game That Can Give You Ten Extra Years of Life”, by the cybertheorist and “gamification” promoter Jane McGonigal. What could such a game be? Wiring up a joystick to an iron lung? Playing a gladiatorial game of televised chess in which the loser is killed instantly – and winning? No, it’s a little web-based game that McGonigal created called SuperBetter. There are, to date, no large-cohort longitudinal studies showing that SuperBetter makes you live ten years longer, but then a Ted talk is all about attention-grabbing truthiness, not truth.
McGonigal’s glib vision of using multiplayer video games to solve global problems, as articulated in her book Reality Is Broken, is in a way just the zaniest new application of another omni-relevant cyber-meme: the wisdom of crowds. James Surowiecki’s 2004 book of that title was relatively careful and there are some interesting wisdom-of-crowds effects. (Probably the best-known one is that if you ask lots of people to guess the number of jelly beans in a jar, the arithmetical mean of all the guesses will be surprisingly accurate.)
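The jelly-bean effect Poole concedes is real is easy to demonstrate in a few lines. The sketch below uses invented numbers (a jar of 1,000 beans, 500 guessers whose errors are independent and unbiased); it shows why the arithmetic mean of many noisy guesses lands far closer to the truth than a typical individual does — the individual errors partly cancel when averaged.

```python
import random
import statistics

# Illustrative only: jar size, crowd size and error spread are invented.
random.seed(42)

TRUE_COUNT = 1_000   # jelly beans actually in the jar (assumed)
N_GUESSERS = 500     # size of the crowd (assumed)

# Each guess is the truth plus independent noise (up to +/- 40% here).
guesses = [TRUE_COUNT * random.uniform(0.6, 1.4) for _ in range(N_GUESSERS)]

crowd_estimate = statistics.mean(guesses)
crowd_error = abs(crowd_estimate - TRUE_COUNT)
typical_individual_error = statistics.mean(
    abs(g - TRUE_COUNT) for g in guesses
)

print(f"crowd mean:               {crowd_estimate:.0f}")
print(f"crowd error:              {crowd_error:.0f}")
print(f"typical individual error: {typical_individual_error:.0f}")
```

Note the assumptions doing the work: the effect holds only when errors are independent and unbiased, which is exactly the condition that breaks down once guessers copy one another — the scenario the cyber-thinkers' herd-worship ignores.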
Since then, however, cyber-thinkers have run with the wisdom-of-crowds notion to a place that bears little resemblance to reality as we know it, high-fiving each other among the rubble of reason in a fatuous kind of hi-tech, misanthropic herd-worship. It can now seriously be proposed that there are occasions when “the smartest person in the room is the room”, as the subtitle of the cybertheorist David Weinberger’s book Too Big to Know, published last January, claims. Its weirdly self-undermining idea (perfect for a Ted talk) is that books are outdated and useless ways of organising “information” and that the sum total of information is now so overwhelming that we may as well throw up our hands and concede that “the network” knows better than we do.
If Weinberger’s thesis were correct, then his book would be disposable, because a random cohort of bloggers could be expected to come up with something far superior in a couple of weeks. Weinberger’s book is also cyber-typical for its pseudo-democratic hatred of any kind of expertise, and its cartoonish intellectual history, in the service of pretending that our age is utterly novel. “The internet,” he opines grandly, “enables groups to develop ideas further than any individual could.” So have writing and talking, since time immemorial.
It is surprising how often Wikipedia is cited by such cyber-pedlars as a paradigm of communal “knowledge creation”, given that Wikipedia explicitly bans the creation of any new knowledge. Its highest law is “no original research”, barring any mention of either “facts” or “ideas” that are not already published elsewhere. Observance of this edict has the effect that Wikipedia is entirely dependent on its cited sources, including newspaper and journal articles, for the “knowledge” it contains. This does not mean that Wikipedia is useless – far from it – but it is not an example of what it is so often claimed to be.
Cybertheorists, in any case, daren’t attempt to distinguish information from knowledge, because to do so would require them to perform the kind of intellectual triage that their rhetorical success depends crucially on avoiding. What is certainly true is that a book contains less information, in terms of the number of bytes needed to encode it, than a video of a sneezing kitten. And if you are a cyber-sage you are likely to join Weinberger in despising books anyway. Clay Shirky, the cybertheorist author of 2008’s Here Comes Everybody – a crowd-sourcing manifesto that now reads as a forlorn exercise in boosterism of once-hot internet services such as Flickr – notoriously wrote that same year: “No one reads War and Peace. It’s too long, and not so interesting.”
Anyone who might object, Shirky apparently assumed, could not be an ordinarily cultured person scornful of the sheer dopiness of such a claim but had to be a “littérateur”, anxious about how, “Having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well.” Thus does the cybertheorist project his own resentment on to his critics. (As a matter of fact, according to data supplied by Nielsen BookScan, War and Peace sold nearly 17,000 copies in the UK alone in 2011 and has sold a quarter of a million since 1998.)
Books do matter for the cyber-babbler on the make; it’s just that they matter in a different way. Jarvis told this inspirational story in the acknowledgements of his most recent book, Public Parts, published in the US last year: “[Seth] Godin is to blame for my writing books. He sat me down one day and said I was a fool if I didn’t write one – and I would further be a fool if I thought that the book was the goal. No, he said, the book would build my public reputation, which would lead to other business. It has.” There you go: if you write a book with the book as the goal, you are a fool. A book’s correct function is that of a business card that gets you invited to where the real action is. Anyone who thinks literature, thought and argument are noble pursuits in themselves is an idiot. This is the proud yawp of the ultramodern philistine.
The sexiness of sharing
It might be accounted by some pedants a failure on the cybertheorists’ part that the world as it is today bears so little relation to their descriptions of it. People still read long Russian novels; the mass market for culture has not been replaced, as we have repeatedly been assured it has over the years, by “niches” (just ask the Fifty Shades of Grey author, E L James). Open is not the prevailing model for internet business success. The New York Times now has nearly 600,000 digital subscribers, driving recent rises in circulation, even though cyberdogmatists long swore that an internet pay wall could never work. The term “pay wall” is designed as a rancour-evoking sneer, as though one were expressing outrage that one had to pass through a “pay gate” to be allowed to take food out of a shop.
Meanwhile, the cybertheorists celebrate what they euphemistically refer to as the sharing of music and films by people who didn’t buy them, conflating it with sharing as the practice of retweeting a link and with the Oprah-era sense of sharing that denotes emotional revelation. (The term “oversharing” is not in the cyber-dictionary.)
Indeed, sharing is now much sexier than making the stuff that sharers share. In an article published in February headlined “What the media can learn from Facebook”, on which no parodist could hope to improve, Jarvis pontificated on how newspapers ought to imitate Mark Zuckerberg’s business model: “Production is expensive. Sharing is inexpensive and it scales. Facebook will soon serve a billion people with a staff equivalent to that of a large newspaper.” Make the readers do most or all of the work and hey presto! Legacy media are transformed into money-printing social media.
As with the notion of sharing, so with “social”: the cybertheorists have adopted a term of presumptive virtue and sprayed on to it a newly etiolated and instrumental meaning. Social is now a commercial technique to persuade users of digital services to reveal more to potential advertisers about their “networks” of friendship and business contacts and to “connect” such users more intimately with brands by means of a “Like” button – and soon, as recent reports of in-house experiments at Facebook suggest, a “Want” button.
Even the practice of reading books – if it must continue at all – apparently needs to be transformed into “social reading”, as though books were not already social and sociable artefacts. If you remember Johannes Gutenberg, it will be as Jarvis does in his hilarious e-pamphlet Gutenberg the Geek, published on Kindle this spring. Gutenberg, he writes, should rightly be considered “the patron saint of Silicon Valley”, the tech-entrepreneur John the Baptist to the eventual and glorious coming of Google’s Christ.
However, it doesn’t matter if cyber-hustlers are wrong about the present, because their brand value is more as wireless Nostradamuses. The cyber-maniac ideates a perfect cyber-future and affirms at the top of his voice that it has already arrived, or is so vague about the date of its realisation that he could never possibly be refuted. The title of a recent Ted talk by Shirky is a beautiful example of such unfalsifiable cyber-augury: “How the Internet Will (One Day) Transform Government.”
What sells, to the cyber-fanatic’s intended audience, is ludicrous utopian fantasy, silicon Panglossianism. Bill Leigh, who is the agent for the minor cybertheorist Steven Johnson, recently told New York magazine that his client “wanted to take his book sales to the next level” and so decided “to slant his material with a particular innovation feel to it”. Johnson’s new book is about how networks of “peer progressives” will make everything better, as they already have done through Wikipedia (yet again), the crowd-funding site du moment Kickstarter and New York City’s 311 hotline for reporting urban repair needs. The book’s title is, cyber-speculatively, unimprovable. It is called Future Perfect.
Also particularly canny is the newly published Makers: the New Industrial Revolution, by the über-cybertheorist Chris Anderson, the former editor of Wired (not the Chris Anderson who is the controller – or, if you will, “curator” – of the Ted talk franchise). His new book snappily melds its cyber-utopian vision (in the future, we will all make cool things using robots and 3D printers) with geopolitical uplift (this means that the United States will once again become the globe’s leading industrial powerhouse).
Anderson, who didn’t mind the world knowing that some passages from his earlier book Free were simply copied and pasted from Wikipedia entries, is at least a real expert in matters of gadgetry and online empire-building, unlike most of the second-tier cyber-hawkers, many of whom, as the Pulitzer Prize-winning journalist Dean Starkman put it in a devastating essay about the “future of news” thinkers in the Columbia Journalism Review last year, are “journalism academics known for neither their journalism nor their scholarship”.
Cybertheorists in general could perhaps be tolerated as harmlessly colourful futurists, were it not that so many of them, through the influence of their consulting work and virtual bully pulpits, are right now engaged in promoting widespread cultural vandalism. Whatever smells mustily of the pre-digital age must be torn down, “disrupted” and made anew in the sacred image of Google and Apple, except more open to the digital probings of the internet-company oligopoly. Long live sharing, social reading, volunteering free labour as a peer student or member of a company’s online “community”, and entrusting your documents to the data-mining mega-corporations that control the “cloud”.
Cybertheorists love to apply the adjective “smart” to one another but, as a group, they are the most prominent anti-intellectual cadre of our day – little Pol Pots of the touchscreen and Twitter.
Steven Poole is the author of “You Aren’t What You Eat” (Union Books, £12.99). His previous essay for the New Statesman was "Your brain on pseudoscience: the rise of popular neurobollocks".