The pseudo-profundity of Malcolm Gladwell

The essayist's mania for teachable narrative goes hand in hand with a revealingly indifferent attitude to truth.

Malcolm Gladwell is sometimes criticised on the basis that, although he has a reputation as a thinker, all he does is précis other people’s research. That’s not fair. Popularising academic ideas with style for a broad audience is hardly an ignoble pursuit. The real problem with Gladwell goes far deeper. It is the method that he has helped make ubiquitous in modern non-fiction trade publishing.
 
“Through these stories,” he explains in the introduction to his latest book, David and Goliath: Underdogs, Misfits and the Art of Battling Giants (Allen Lane, £16.99), “I want to explore two ideas.” The method of “exploring” ideas through stories is now the preferred mode of, or replacement for, serious thought and argument. Unfortunately, it can lead an incautious writer into a conceptual shambles.
 
Gladwell is a brilliant salesman for a certain kind of cognitive drug. He tells his readers that everything they thought they knew about a subject is wrong, and then delivers what is presented as a counterintuitive discovery but is actually a bromide of familiar clichés. The reader is thus led on a pleasant quasi-intellectual tour, to be reassured at the end that a flavour of folksy wisdom was right all along. Little things really can make a big difference; trusting your gut can be better than overthinking; successful people work hard.
 
The art here lies in making the platitudinous conclusion seem like a revelatory place to end up, after one has enjoyed the colourful “stories” about carefully described plucky individuals with certain hairstyles and particular kinds of trousers. (Actual quote: “He is a tall young man with carefully combed dark-brown hair and neatly pressed khakis.”) Such books must thus be constructed with a certain suspenseful cunning. Gladwell likes first to tell an apparently convincing story and then declare that it’s not true, like a magician pulling an empty hat out of a rabbit. Thus does his book begin, relaying the standard version of David and Goliath – plucky shepherd defeating fearsome giant with fortunately slung pebble – and then announcing that “almost everything about it is wrong”.
 
In ancient times, Gladwell writes, the sling was a potent weapon and bound to defeat an infantryman such as Goliath, who moved slowly because of all his armour and might even have been suffering from the hereditary disease acromegaly. What made him look strong was what made him weak. The problem with our current way of thinking – for if there were no problems with our way of thinking, Gladwell would surely invent some – is that “we consistently get these kinds of conflicts wrong”.
 
Do we? Well, if you ever suspected that the weak should play to their own strengths rather than the strengths of their adversaries, you are way ahead of him. You will not be surprised by his subsequent lengthy discussions of “asymmetrical” tactics in warfare or how peaceful protest that provokes overreaction by the authorities can be excellent PR. But banal nostrums about physical conflict cannot be the whole story, for such books must act as keys to all mythologies. So, Gladwell promises that our alleged misunderstanding has “consequences for everything from the way we educate our children to the way we fight crime and disorder”. Consequences for everything! That is the hard sell, the first free rock of intellectual crack.
 
The examples of “everything” include basketball coaching, policing, university science, Martin Luther King, and the Impressionists. (The waft of luxury art-history tourism in the Impressionists sequence is only the most obvious example of how Gladwell is now the non-fiction equivalent of Dan Brown.) The promise that such heterogeneous matter can be governed by one or two big ideas and understood through them constitutes the main attraction of the Gladwellian literary genre. Armed with these “ideas”, you won’t have to think for yourself ever again.
 
One early story Gladwell tells is about classroom sizes. A large class is usually thought to be a “disadvantage” (the abstract equivalent of a “giant”) for pupils, and smaller class sizes are assumed to be better. Surveying studies, Gladwell observes that though really big classes are a problem, there is a happy medium, and smaller classes don’t necessarily lead to better outcomes. This, he explains, is because teachers don’t usually adjust their teaching style to smaller class sizes; instead, they just work less. So, the “disadvantage” of moderately big classes isn’t one after all. 
 
A bizarre coda to this story shows the weaselly potential of Gladwell’s method. Up the road from the state school where he has been talking to a nice teacher, there is a private school, which boasts that its average class size is 12. Oh dear, thinks Gladwell. “Why does a school like [this] do something that so plainly makes its students worse off?” The odd thing is that he simply doesn’t know whether the students there are worse off, because he doesn’t know whether the staff teach in a way that suits their small classes. If they do, then the students won’t be worse off at all. So does Gladwell talk to anyone at the school to find out? He does not. Perhaps he fears ruining the story.
 
Another yarn focuses on a doctor called Jay Freireich, who spearheaded advances in treating childhood leukaemia in the 1950s. Gladwell tells a fascinating, bloody and frightening tale with great verve. Freireich was a maverick who gave sick children untested treatments because they were otherwise certain to die quickly. To understand where this fits into Gladwell’s David and Goliath pattern, we must take a historical detour to the Blitz. (Another important feature of a Gladwellian text is the relentless montage.) Famously, the Blitz did not destroy the morale of Londoners. Why not? Gladwell cites a study. People who suffered “near misses”, when a bomb landed very close to them, were traumatised. But a lot more people experienced “remote misses”, when a bomb landed far off, and this usually gave them a sense of invulnerability. Back to Freireich. His father died when he was very young and his childhood was generally unpleasant. Gladwell assumes that Freireich experienced his horrible youth as a “remote miss” and that this explains his heroism as an adult. “Freireich had the courage to think the unthinkable,” Gladwell orates. “He experimented on children. He took them through pain no human being should ever have to go through. And he did it in no small part because he understood from his own childhood experience that it is possible to emerge from even the darkest hell healed and restored.”
 
The interesting thing about this – apart from it being the kind of gruesomely emetic, cliché-rammed prose that would not be out of place in the trashiest kind of spiritualist self-help book – is that, although Gladwell has interviewed Freireich, he is unable to quote his subject saying anything of the sort. Freireich says he regularly took painful bone marrow samples from the sick children, because “we needed to know if their bone marrow had recovered”. Nothing about feeling great because he had survived the death of his dad; just the single-minded epistemological need of the driven scientist.
 
Nor is Gladwell afraid to tackle the “giant” of dyslexia, which might be a “desirable difficulty” in its own right. How come? Why, because lots of “successful entrepreneurs” and “famous innovators” are dyslexic. Coincidence? “There are two possible interpretations for this remarkable fact. One is that this remarkable group of people triumphed in spite of their disability,” Gladwell remarks, and then hastens to dispose of this boringly un-Gladwellian explanation. “The second, more intriguing possibility is that they succeeded, in part, because of their disorder.”
 
The easiest way to support that “intriguing possibility” would be to cite statistics showing that, proportionally, more people with dyslexia enjoy worldly success than people without. But the data-happy writer doesn’t do that. Perhaps the answer doesn’t fit. Instead, Gladwell offers anecdotes. Here is “one of the most famous trial lawyers in the world”, David Boies. Because he is dyslexic, Boies couldn’t read much at law school, but he became very good at listening to people. People who can thus overcome dyslexia, Gladwell concludes, turn out to be “better off than they would have been otherwise”.
 
Not even Gladwell can run the experiment in which Boies repeats his childhood without dyslexia, to see if he still becomes a high-profile lawyer, or maybe a bestselling author of high-concept non-fiction books. So the claim that Boies wouldn’t have done as well if he hadn’t been dyslexic is just cheaply comforting counterfactual speculation, to swallow which one must also assent to the bizarre assumption that no lawyer who can read well is also a very good listener.
 
Somewhat unhelpfully for the credibility of his own style of argument, Gladwell later reveals: “There are a remarkable number of dyslexics in prison.” In a parallel universe, another Malcolm Gladwell is using exactly the same pseudo-reasoning to argue that being dyslexic turns you into a criminal.
 
He is forced into such inconsistency and contortion throughout because there wouldn’t have been a Gladwellian book to write if he had just accepted the proverbial truth that, when life gives people lemons, some are able to make lemonade. (Strikingly, Gladwell the serial study-citer makes no reference to the substantial psychological literature on “resilience”.) Any teenager could also sum up much of David and Goliath by quoting the not-entirely-obscure maxim of a long-deceased German: “What does not kill me makes me stronger.”
 
Gladwell’s mania for teachable narrative goes hand in hand with a revealingly indifferent attitude to truth. The most blatant and unintentionally hilarious example of this comes at the book’s finale, when he tells the inspiring story of André Trocmé, pastor of the French village of Le Chambon-sur-Lignon, who defied the occupying Nazis and refused to give up the town’s Jews.
 
How did Trocmé get away with it? Gladwell acknowledges one explanation: “Philip Hallie, who wrote the definitive history of Le Chambon, argues that the town was protected at the end of the war by Major Julius Schmahling, a senior Gestapo official in the region.”
 
Sadly, this explanation does not deliver the right kind of heart-warming moral. “But the best answer,” he concludes blithely, “is the one David and Goliath has tried to make plain – that wiping out a town or a people or a movement is never as simple as it looks. The powerful are not as powerful as they seem – nor the weak as weak.”
 
This idea is definitely satisfying in stories. (I pictured Obi-Wan Kenobi telling Darth Vader: “If you strike me down, I shall become more powerful than you can possibly imagine.”) In life, however, the Nazis did not have much trouble wiping out the Jewish populations of other towns. But this is rather a depressing thought. Gladwell therefore jettisons the opinion of the scholar he says wrote the “definitive history” and decides instead that “the best answer” is the one he just made up to fit in with his uplifting scheme.
 
Malcolm Gladwell has thus done everyone a service by illustrating all too clearly the baleful drawbacks of “exploring ideas through stories”. In doing so, you might, like him, become incapable of understanding the stories in any other way than through the lens of your prefabricated idea. And so, because your idea is never allowed to be challenged by opposing evidence, it will languish forlornly, like Malcolm Gladwell’s, at the level of vapid homily.
 
Steven Poole’s latest book is “You Aren’t What You Eat” (Union Books, £7.99) 

This article first appeared in the 07 October 2013 issue of the New Statesman, The last days of Nelson Mandela

In the age of podcasts, the era of communal listening is over

Where once the nation would listen to radio events together, now it is the booming podcast market that commands our attention

It’s a moment so celebrated that no TV drama about the Second World War is complete without it. At 11.15am on 3 September 1939, Neville Chamberlain made a live radio broadcast from Downing Street announcing that “this country is now at war with Germany”. A silence fell over the nation as people rushed to the wireless to hear him. The whole country was listening, but crucially, it was listening together.

Nearly eight decades later, it is difficult to imagine a communal audio event like that ever happening again. The arrival of the Walkman in 1979, since superseded by the iPod and then the smartphone, turned listening into a personal, solitary pastime. It was no longer necessary for families to get a radio on a hire-purchase arrangement and gather round it in the sitting room. The technology that delivers audio to us is now small and cheap enough for each of us to have one in our pocket (with headphones tangled around it, of course).

At the same time, the method of delivery changed, too. “Radio” ceased to indicate simply “programming transmitted by electromagnetic waves” in the late 1990s, when conventional radio stations began to make their output available on the internet. Online-only radio stations sprang up, streaming their shows directly to computers. Free from any regulation and with the internet as a free distribution platform, these early stations echoed the tone of pirate radio stations in the 1960s.

The idea of “audioblogging” – making short voice recordings available for download online – has been around since the early 1980s, but it wasn’t until 2004 that the word “podcasting” was coined by the technology journalist Ben Hammersley in an article for the Guardian. He was looking for a name for the “new boom in amateur radio” that the internet had enabled.

Thanks to technological advances, by the early 2000s, a podcaster could record a sound clip and upload it to his or her feed, and it would arrive automatically on the computer of anyone who had subscribed. Apple began to include podcasts as a default option on iPods; in 2008 iPhones offered a podcast app as standard. The market boomed.

Apple is notoriously reluctant to provide data on its products, but in 2013 it announced that there had been more than a billion podcast subscriptions through its iTunes store, which carried over 250,000 podcasts in 100 languages. In 2016, Edison Research released a study suggesting that 21 per cent of all Americans over the age of 12 had listened to at least one podcast in the past month – roughly 57 million people. Audiobooks, too, are booming in this new age of listening; the New York Times reported that although publishing revenue in the US was down overall in the first quarter of 2016, digital audio sales had risen by 35.3 per cent.

The vast majority of this listening will be solitary. This is because audio is a secondary medium. For all the talk about the rise of “second screening”, it isn’t really possible to do much more than idly scroll through Twitter on your phone as you watch television, but you can easily get things done while you listen to a podcast. Put on a pair of headphones, and you can go for a run or clean out the oven in the company of your favourite show. In this sense, the medium has been a game-changer for commuters and those doing repetitive or manual work: there’s no longer any need to put up with sniffling on the train or your boss’s obsession with Magic FM.

Though podcasts are an internet phenomenon, they have managed to remain free from the culture of trolling and abuse found elsewhere. It is difficult to make audio go viral, because it’s tricky to isolate a single moment from it in a form that can be easily shared. That also deters casual haters. You can’t just copy and paste something a host said into an insulting tweet.

Our new and solitary way of listening is reflected in the subjects that most podcasts cover. While there is the occasional mega-hit – the American true crime podcast Serial attracted 3.4 million downloads per episode in 2014, the year it launched – most shows exist in a niche. A few hundred listeners who share the host’s passion for pens or for music from antique phonographs can be enough to sustain a series over hundreds of episodes (there are real podcasts on both of these topics).

This is also where the commercial opportunity lies. It costs relatively little to produce even high-quality podcasts, compared to TV or conventional radio, yet they can attract very high advertising rates (thanks to the dedication of regular listeners and the trust they have in the host). The US is far ahead of the UK in this regard, and podcast advertising revenue there is expected to grow 25 per cent year on year, reaching half a billion dollars in 2020. Where this was once a hobby for internet enthusiasts, it is now big business, with venture capitalists investing in new networks and production companies. The US network Gimlet attracted $6m in funding in 2015. However, in the UK, the BBC crowds out smaller, independent operations (the trade-off is that it makes undeniably outstanding programmes).

There is even a movement to make listening a communal activity again. The same hipsters responsible for the resurgence of vinyl sales are organising “listening parties” at trendy venues with high-quality sound systems. Live shows have become an important source of revenue for podcasters. Eleanor McDowall, a producer at the Falling Tree radio production company, organises subtitled “screenings” for podcasts in languages other than English. I even have a friend who is part of a “podcast club”, run on the same lines as a monthly book group, with a group of people coming together to discuss one show on a regular schedule.

The next big technological breakthrough for audio will be when cars can support internet-based shows as easily as conventional radio. We might never again gather around the wireless, but our family holidays could be much improved by a podcast.

Caroline Crampton is assistant editor of the New Statesman. She writes a weekly podcast column.

This article first appeared in the 16 February 2017 issue of the New Statesman, The New Times