Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls, and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special

How the modern addiction to identity politics has fractured the left

This partisan, divisive form of liberalism alienated the working class and helped create the conditions for the rise of Donald Trump.

Donald Trump is the president of the United States. His election in November 2016 turned our campuses in America upside down. The day after his victory, some professors held teach-ins, some students asked to be excused from class, and now many have been joining marches and attending raucous town hall meetings. This warms the heart of an impassioned if centrist liberal like me.

But something more needs to happen, and soon. All of us liberals in higher education should take a long look in the mirror and ask ourselves how we contributed to putting the country in this situation. We must accept our share of responsibility. Anyone involved in Republican politics will tell you that our campus follies, magnified by Fox News, mobilise their base as few things do. But our responsibility extends beyond feeding the right-wing media by tolerating attempts to control speech, limit debate and stigmatise and bully conservatives, as well as encouraging a culture of complaint that strikes people outside our privileged circles as comically trivial. We have distorted the liberal message to such a degree that it has become unrecognisable.

After Ronald Reagan’s election in 1980, liberals in the US faced the challenge of developing a fresh and truly political vision of the country’s shared destiny, adapted to the new realities of American society, chastened by the failures of old approaches. And this they failed to do. Instead, they threw themselves into the movement politics of identity, losing a sense of what we share as citizens and what binds us as a nation. An image for Roosevelt-era liberalism and the unions that supported it was that of two hands shaking. A recurring image of identity liberalism is that of a prism refracting a single beam of light into its constituent colours, producing a rainbow. This says it all.

The politics of identity is nothing new, certainly on the American right. And it is not dead, as the recent events in Charlottesville, Virginia, remind us. The white nationalist march that set off the conflict and then led to a counter-protester’s death was not only directed against minorities. It was also directed at the university and everything it stands for. In May 1933, Nazi students marched at night into the courtyard of the University of Berlin and proceeded to burn “decadent” books in the library. The alt-right organisers were “quoting” this precedent when they flooded Thomas Jefferson’s campus, looking for blood. This was fascist identitarianism, something liberals and progressives have always battled in the name of human equality and universal justice.

What was astonishing during the Reagan years was the development of an explicit left-wing identity politics that became the de facto creed of two generations of liberal politicians, professors, schoolteachers, journalists, movement activists and officials of the Democratic Party. This has been disastrous for liberalism’s prospects in our country, especially in the face of an increasingly radicalised right.

There is a good reason that liberals focus extra attention on minorities, since they are the most likely to be disenfranchised. But the only way in a democracy to assist them meaningfully – and not just make empty gestures of recognition and “celebration” – is to win elections and exercise power in the long run, at every level of government. And the only way to accomplish that is to have a message that appeals to as many people as possible and pulls them together. Identity liberalism does the opposite and just reinforces the alt-right’s picture of politics as a war of competing identity groups.

Identity politics on the left was at first about large classes of people – African Americans, women, gays – seeking to redress major historical wrongs by mobilising and then working through our political institutions to secure their rights. By the 1980s, it had given way to a pseudo-politics of self-regard and increasingly narrow, exclusionary self-definition that is now cultivated in our colleges and universities.

The main result has been to turn young people back on to themselves, rather than turning them outward towards the wider world they share with others. It has left them unprepared to think about the common good in non-identity terms and what must be done practically to secure it – especially the hard and unglamorous task of persuading people very different from themselves to join a common effort. Every advance of liberal identity consciousness has marked a retreat of effective liberal political consciousness.

Campus politics bears a good deal of the blame. Until the 1960s, those active in liberal and progressive politics were drawn largely from the working class or farm communities and were formed in local political clubs or on shop floors. Today’s activists and leaders are formed almost exclusively at colleges and universities, as are members of the mainly liberal professions of law, journalism and education. Liberal political education, such as it is, now takes place on campuses that, especially at the elite level, are largely detached socially and geographically from the rest of the country. This is not likely to change. As a result, liberalism’s prospects will depend in no small measure on what happens in our institutions of higher education.

***

Flash back to 1980 and the election of Ronald Reagan. Republican activists are setting out on the road to spread the new individualist gospel of small government and pouring their energies into winning out-of-the-way county, state and congressional elections – a bottom-up strategy. Also on the road, though taking a different exit off the interstate, are former New Left activists in rusting, multicoloured VW buses. Having failed to overturn capitalism and the military-industrial complex, they are heading for college towns all over America, where they hope to practise a very different sort of politics aimed at transforming the outlook of the educated classes – a top-down strategy. Both groups succeeded.

The retreat of the post-1960s left was strategic. In 1962, the authors of The Port Huron Statement – the manifesto of the activist movement Students for a Democratic Society (SDS) – wrote: “We believe that the universities are an overlooked seat of influence.” Universities were no longer isolated preserves of learning. They had become central to American economic life, serving as conduits and accrediting institutions for post-industrial occupations, and to political life, through research and the formation of party elites.

The SDS authors made the case that a New Left should first try to form itself within the university, where they were free to argue among themselves and work out a more ambitious political strategy, recruiting followers along the way. The ultimate point, however, was to enter the wider world, looking “outwards to the less exotic but more lasting struggles for justice”.

But as hopes for a radical transformation of American life faded, ambitions shrank. Many who returned to campus invested their energies in making their sleepy college towns into socially progressive and environmentally self-sustaining communities. These campus towns still do stand out from the rest of America and are very pleasant places to live, though they have lost much of their utopian allure. Most have become meccas of a new consumerist culture for the highly educated, surrounded by techie office parks and increasingly expensive homes. They are places where you can visit a bookshop, see a foreign movie, pick up vitamins and candles, have a decent meal followed by an espresso and perhaps attend a workshop to ease your conscience. A thoroughly bourgeois setting without a trace of the demos, apart from the homeless men and women who flock there and whose job is to keep it real for the residents.

That’s the comic side of the story. The other side (heroic or tragic, depending on your politics) concerns how the retreating New Left turned the university into a political theatre for the staging of morality plays and operas. This has generated enormous controversy about tenured radicals, the culture wars, political correctness – and with good reason. But these developments mask a quieter, far more significant one.

The big story is not that leftist professors successfully turn millions of young people into dangerous political radicals every year. Some certainly try, but that seems not to have slowed the line of graduates shoving their way towards professional schools and then moving on to conventional careers. The real story is that the 1960s generation passed on to students a particular conception of what politics is, based on its idiosyncratic historical experience.

The experience of that era taught the New Left two lessons. The first was that movement politics was the only mode of engagement that changes things (which once was true but no longer is). The second was that political activity must have some authentic meaning for the self, making compromise seem a self-betrayal (which renders ordinary politics impossible).

The lesson of these two lessons, so to speak, was that if you want to be a political person, you should begin not by joining a broad-based party but by searching for a movement that has some deep personal meaning for you. In the 1950s and early 1960s, there were already a number of such movements – about nuclear disarmament, war, poverty, the environment – that engaged the self, though they were not about the self. Instead, engaging with those issues required having to engage with the wider world and gain some knowledge of economics, sociology, psychology, science and especially history.

With the rise of identity consciousness, engagement in issue-based movements began to diminish and the conviction took root that the movements most meaningful to the self are, unsurprisingly, about the self. This new attitude has had a profound impact on American universities. Marxism, with its concern for the fate of the workers of the world – all of them – gradually lost its allure. The study of identity groups now seemed the most urgent scholarly and political task, and soon there was an extraordinary proliferation of departments, research centres and professorial chairs devoted to it.

This has had many good effects. It has encouraged academic disciplines to widen the scope of their investigations to incorporate the experiences of large groups that had been somewhat invisible, such as women and African Americans. But it also has encouraged a single-minded fascination with group differences and the social margins, so much so that students have come away with a distorted picture of history and of their country in the present – a significant handicap at a time when American liberals need to learn more, not less, about the vast middle of the country.

***

Imagine a young student entering such an environment today – not your average student pursuing a career, but a recognisable campus type drawn to political questions. She is at the age when the quest for meaning begins and in a place where her curiosity could be directed outward towards the larger world she will have to find a place in. Instead, she is encouraged to plumb mainly herself, which seems an easier exercise. She will first be taught that understanding herself depends on exploring the different aspects of her identity, something she now discovers she has. An identity that, she also learns, has already been largely shaped for her by various social and political forces. This is an important lesson, from which she is likely to draw the conclusion that the aim of education is not progressively to become a self – the task of a lifetime, Kierkegaard thought – through engagement with the wider world. Rather, one engages with the world and particularly politics for the limited aim of understanding and affirming what one already is.

And so she begins. She takes classes in which she reads histories of the movements related to whatever she determines her identity to be, and reads authors who share that identity. (Given that this is also an age of sexual exploration, gender studies will hold a particular attraction.) In these courses she also discovers a surprising and heartening fact: that although she may come from a comfortable, middle-class background, her identity confers on her the status of one of history’s victims. This discovery may then inspire her to join a campus group that engages in movement work. The line between self-analysis and political action is now fully blurred. Her political interest will be genuine but circumscribed by the confines of her self-definition. Issues that penetrate those confines now take on looming importance and her position on them quickly becomes non-negotiable; those issues that don’t touch on her identity (economics, war and peace) are hardly perceived.

The more our student gets into the campus identity mindset, the more distrustful she becomes of the word “we”, a term her professors have told her is a universalist ruse used to cover up group differences and maintain the dominance of the privileged. And if she gets deeper into “identity theory”, she will even start to question the reality of the groups to which she thinks she belongs. The intricacies of this pseudo-discipline are only of academic interest. However, where it has left our student is of great political interest.

An earlier generation of young women, for example, might have learned that women as a group have a distinct perspective that deserves to be recognised and cultivated, and have distinct needs that society must address. Today, the theoretically adept are likely to be taught, to the consternation of older feminists, that one cannot generalise about women since their experiences are radically different, depending on their race, sexual preference, class, physical abilities, life experiences, and so on. More generally, they will be taught that nothing about gender identity is fixed, that it is all highly malleable. This is either because, on the French view, the self is nothing, just the trace left by the interaction of invisible, tasteless, odourless forces of “power” that determine everything in the flux of life; or, on the all-American view, because the self is whatever we damn well say it is. (The most advanced thinkers hold both views at once.)

A whole scholastic vocabulary has been developed to express these notions: fluidity, hybridity, intersectionality, performativity, and more. Anyone familiar with medieval scholastic disputes over the mystery of the Holy Trinity – the original identity problem – will feel right at home.

What matters about these academic trends is that they give an intellectual patina to the narcissism that almost everything else in our society encourages. If our young student accepts the mystical idea that anonymous forces of power shape everything in life, she will be perfectly justified in withdrawing from democratic politics and casting an ironic eye on it. If, as is more likely, she accepts the all-American idea that her unique identity is something she gets to construct and change as the fancy strikes her, she can hardly be expected to have an enduring political attachment to others, and certainly cannot be expected to hear the call of duty towards them. Instead, she will find herself in the hold of what might be called the Facebook model of identity: the self as a home page I construct like a personal brand, linked to others through associations I can “like” and “unlike” at will. Intersectionality is too ephemeral to serve as a lasting foundation for solidarity and commitment.

***

The more obsessed with personal identity campus liberals become, the less willing they are to engage in reasoned political debate. Over the past decade, a new, very revealing locution has drifted from our universities into the media mainstream: “Speaking as an X…” This is not an anodyne phrase. It tells the listener that I am speaking from a privileged position on this matter. It sets up a wall against questions, which by definition come from a non-X perspective. And it turns the encounter into a power relation: the winner of the argument will be whoever has invoked the morally superior identity and expressed the most outrage at being questioned.

So classroom conversations that once might have begun, “I think A, and here is my argument,” now take the form: “Speaking as an X, I am offended that you claim B.” This makes perfect sense if you believe that identity determines everything. It means that there is no impartial space for dialogue. White men have one “epistemology”, and black women have another. So what remains to be said?

What replaces argument is taboo. At times, our more privileged campuses can seem stuck in the world of archaic religion. Only those with an approved identity status are, like shamans, allowed to speak on certain matters. Particular groups are given temporary totemic significance. Scapegoats are duly designated and run off campus in a purging ritual. Propositions become pure or impure, not true or false.

And not only propositions but simple words. Left identitarians who think of themselves as radical creatures, contesting this and transgressing that, have become like buttoned-up schoolmarms when it comes to the English language, parsing every conversation for immodest locutions and rapping the knuckles of those who inadvertently use them.

It’s a depressing development for professors who went to college in the 1960s, rebelled against the knuckle rappers and mussed the schoolmarm’s hair. Things seem to have come full circle: now the students are the narcs.

That was hardly the intention when the New Left, fresh from real political battles in the great out there, returned to campus in the hope of encouraging the young to follow in their footsteps. They imagined raucous, no-holds-barred debates over big ideas, not a roomful of students looking suspiciously at one another. They imagined being provocative and forcing students to defend their positions, not getting emails from deans suggesting they come in for a little chat. They imagined launching their politically committed and informed students into the world, not watching them retreat into themselves.

***

Conservatives are right: our colleges, from bottom to top, are mainly run by liberals, and teaching has a liberal tilt. Yet they are wrong to infer that students are therefore being turned into an effective left-wing political force. The liberal pedagogy of our time, focused as it is on identity, is actually a depoliticising force. It has made our children more tolerant of others than certainly my generation was, which is a very good thing. However, by undermining the universal democratic “we” on which solidarity can be built, duty instilled and action inspired, it is unmaking rather than making citizens. In the end, this approach just strengthens all the atomising forces that dominate our age.

It’s strange: liberal academics idealise the 1960s generation, as their weary students know. But I’ve never heard any of my colleagues ask an obvious question: what was the connection between that generation’s activism and what they learned about our country in school and in college? After all, if professors would like to see their own students follow in the footsteps of the left’s “Greatest Generation”, you would think they would try to reproduce the pedagogy of that period. But they don’t. Quite the contrary. The irony is that the supposedly bland, conventional colleges of the 1950s and early 1960s incubated what was perhaps the most radical generation of American citizens since the country’s founding – young people who were eager to engage in “the less exotic but more lasting struggles for justice” for everyone in the great out there beyond the campus gates.

The universities of our time instead cultivate students so obsessed with their personal identities and campus pseudo-politics that they have much less interest in, less engagement with, and frankly less knowledge of matters that don’t touch on identity in the great out there. Neither Elizabeth Cady Stanton (who studied Greek) nor Martin Luther King, Jr (who studied Christian theology), nor Angela Davis (who studied Western philosophy), received an identity-based education. And it is difficult to imagine them becoming who they became had they been cursed with one. The fervour of their rebellion demonstrated the degree to which their education had widened their horizons and developed in them a feeling of democratic solidarity rare in America today.

Whatever you wish to say about the political wanderings of the 1960s generation, its members were, in their own way, patriots. They cared about what happened to their fellow citizens and cared when they felt that America’s democratic principles had been violated. Even when the fringes of the student movement adopted a wooden, Marxist rhetoric, it always sounded more like “Yankee Doodle” than Wagner.

That they received a relatively non-partisan education in an environment that encouraged debates over ideas and that developed emotional toughness and intellectual conviction surely had a great deal to do with it. You can still find such people teaching in our universities and some are my friends. Most remain to the left of me but we enjoy disagreeing and respect arguments based on evidence. I still think they are unrealistic; they think I don’t see that dreaming is sometimes the most realistic thing one can do. (The older I get, the more I think they have a point.) But we shake our heads in unison when we discuss what passes for political activity on campus.

It would not be such a terrible thing to raise another generation of citizens like them. The old model, with a few tweaks, is worth following: passion and commitment, but also knowledge and argument. Curiosity about the world outside your own head and about people unlike yourself. Care for this country and its citizens, all of them, and a willingness to sacrifice for them. And the ambition to imagine a common future for all of us.

Any professor who teaches these things is engaged in the most important political work: that of building effective, and not just right-thinking, democratic citizens. Only when we have such citizens can we hope that they will become liberal ones. And only when we have liberal ones can we hope to put the country on a better path.

Mark Lilla is a professor of humanities at Columbia University, New York. His new book is “The Once and Future Liberal: After Identity Politics” (Harper), from which this essay is adapted
