Leon Wieseltier: “I don’t believe that civility or tenderness is a primary intellectual virtue”

An interview with New Republic literary editor Leon Wieseltier, winner of the US$1m Dan David Prize, on critical standards in a technological age, slowing the march of Big Data and Barack Obama's moral vanity.

On Sunday 9 June Leon Wieseltier will be presented with the Dan David Prize at Tel Aviv University. The New Republic literary editor will join French philosopher Michel Serres and MIT economist Esther Duflo in receiving awards worth US$1m for their “outstanding contribution to humanity”. The prize, conferred annually since 2002, emphasises interdisciplinary research and aims to “foster university values of excellence, creativity, justice, democracy and progress”.

A little biography: Wieseltier was born in Brooklyn in 1952. After studying philosophy, literature and art history at universities in Britain and the US, he was made a member of the Society of Fellows at Harvard University. He has been literary editor at the New Republic since 1983. His books include Nuclear War, Nuclear Peace (1983), Against Identity (1996) and Kaddish (1998) – part-memoir, part-cultural history, in which the writer traces the history of the Jewish prayer for the dead after losing his father. He also translates modern Hebrew poetry into English and appeared briefly in The Sopranos.

According to the Dan David board, Wieseltier is being honoured for “setting the standard for serious cultural discussion in the United States”. He laughed when I read this aloud to him: “It’s a terrible responsibility to bear…”. I asked whether he believed the award implied a degree of anxiety about declining standards, the influence of technology upon the humanities (he remains involved “at the level of a layman – perhaps a little less than that”), the condition of criticism and the sheer noisiness of intellectual life.

Leon Wieseltier: The cultural debate in America usually happens at quite a low level, and there isn’t that much of it, but right now it’s never been noisier. The new technology is ideal for generating noise, and I think there’s also a lot of – how shall I put this – worthless praise and empty friendliness out there. Every once in a while somebody has to come in and say something that may seem negative but nonetheless has the effect of making things a little stricter.

There are rules of engagement, are there not, where serious criticism is concerned?

People often accuse me of publishing or writing negative criticism, but I actually believe in negative criticism. When I write an article that seems to be an attack on a book or on a figure, my view is that I’m not attacking something, I’m defending something that has already been attacked. I don’t write ad hominem pieces, I’m not that sort of journalist (I’m not really a journalist at all), but I do believe that when one sees something that one values traduced in some way, the response should be forceful. There are stakes and sometimes the stakes are quite high. It’s not always a matter of life or death, but still, the quality of the culture in which one lives really matters.

There’s an award, the Hatchet Job of the Year, that sees itself in those terms – as a defence rather than an attack.

I’m against gratuitous cruelty in criticism but I don’t believe that civility or tenderness is a primary intellectual virtue, especially on important subjects. If one is in the business of what used to be called “the battle of ideas” one should thicken one’s skin.

Next year the New Republic will turn 100. You’ve been literary editor there for 30 years, and must have seen some big changes, particularly over the last few months. How has the magazine changed?

It’s changed in all sorts of ways in its external characteristics – physically, in the frequency of its publication and of course in the addition of a website and so on. In its internal characteristics – its essential characteristics – there’s been an extraordinary level of continuity. The discontinuities have never wildly outweighed the continuities so that it became unrecognisable. Quite the contrary. One of my roles there is to worry about this and work towards a high degree of continuity. We have a very different look right now visually, but we are still unmistakeably the New Republic, which is a wonderful thing.

My view of the digital revolution and what it’s doing to publishing and journalism is this: just because there are new bottles, it doesn’t mean the old wine was bad. Even at the very beginning of the internet period there was this idea that the medium would change the message and the form would have to be the content. It had to be quick and fast. I think things are calming down a little bit now. I think the first period of internet history – the dizzy, inebriated period – is ending, which is good. But as I say, I think that the challenge for us is to take ideas that are delivered at a high level of intellectual and literary quality and to get them onto the new technology. Not to alter what we do because people have no patience anymore.

One could easily be forgiven for getting lost amid all the information, losing track of the value of any of it.

It’s the greatest single assault on human attention ever devised. It’s going to affect everything, for better or worse. It’s a huge subject. I write about it sometimes. I certainly worry about it a lot.

You recently focussed on the limits of Big Data, and the “datafication” – ugly word…

It’s OK, you can put it in quotes.

…of subjectivity. Viktor Mayer-Schönberger and Kenneth Cukier write confidently about Twitter's capacity to act as a kind of barometer of sentiment. It reminded me of that Emerson quote: “Life is what a man is thinking of all day”. I couldn’t help but think it really isn’t what a person is tweeting. Mind and Twitter will never converge. There is data and there is the human activity of contextualising and understanding and emoting and processing things. They are distinct.

I think two things about that: one, you’re right when you speak about Twitter as a barometer. Unfortunately, a great many people are now busy looking at the barometer once every five minutes. Imagine if you did that about the actual weather! The distinction between information and knowledge is the central point. I think that there are two things which differentiate information from knowledge – one is time and the other is method. The computer reduces all knowledge to the status of information. You can go online and you can Google my Social Security number, you can Google God’s existence, but essentially they're the same type of request.

I don’t know how things are in the States, but here, “global competition” is being used almost as an excuse to reduce those areas of learning that are seen as in any way inefficient.

Oh yes, the attitude towards knowledge now is highly instrumental and highly pragmatic. Those are the values that are generated by a technological culture. There’s no question about that. Sometimes the things that most change one’s life or comfort one in times of sore need or enable one to help other people are the kinds of knowledge that are in some sense useless.

You have written about the “emotional efficiency” of the American response to the Boston bombings. This story rolled on for weeks in the States, while in Britain it fizzled out after a few days – does that suggest a deeper trauma than the refusal to appear moved by terrorism lets on?

Don’t be fooled. Efficiency in emotion is in some ways an American value – in other ways you’d think that wallowing in emotion was more of an American trait, except of course wallowing in emotion is never really experiencing emotion – and yet the efficiency of emotion was in fact an expression of fear. Terrorism works. It scares people. I didn’t follow the British coverage of the American events but of course England has had more experience of this than the United States has had in terms of terrorism in the homeland. The United States is freakishly insulated by history, geography, culture. When these things happen they are especially startling, but Americans like to move on. To achieve “closure”. When all the stuff about the brothers emerged it felt like it was time to hold a memorial service. But they’d already held it – Obama spoke.

The way the mechanism springs into action as if it’s been pre-planned leaves even less room for reflection.

Speed is the central quality of technology. It’s about the acceleration of things, including the inner life, and you saw that in the response to Boston. What one has to do, not just as a writer or as a thinker but even in one’s own life, is look for those experiences that allow one to de-accelerate. In other words, to find those things that cannot be sped up and cherish them more.

You have spoken a great deal about Syria, which has been a case of analysis and hesitance rather than expediency. It is a daily horror show, and has been for some time. After the attacks in Boston people in Britain were keen to point out what was happening around the world – in Iraq, for example, or Syria, on the same day – where death tolls were far higher than in the United States. Where are you now on Syria?

I’m still for western intervention. I still think we have to build up a force that will be friendly to the west after Assad is gone. A political force that will owe us something and will fight the jihadists. It’s not too late to deny Al-Qaeda a government in Damascus. It isn’t too late to stem the tide of refugees. It’s a huge problem. Even on humanitarian grounds, I don’t think the west should sit back and watch a dictator rain Scud missiles on civilians or use chemical weapons. In foreign policy there are such things as moral emergencies that require rapid deployment, and, as I understand it, Syria’s not the only moral emergency. I never understood the argument that because we can’t act in one place we shouldn’t act in any. It’s late, though; our options would have been better two years ago, or even one year ago.

And waiting is not a neutral action.

That’s exactly right. It’s been more than two years. It was quite a lot more than two years in Bosnia before the United States and the west decided to take action. One keeps at it, but it is not the job of the President of the United States to bear witness to anything. You and I have to bear witness to things because we don’t have the power to actually affect them, but there are people in the world, starting with the President of the United States, who can pick up the phone and actually alter the situation. It’s grotesque. Obama’s indecisiveness and the disconnection between his lofty moral rhetoric and his actual behaviour are the central features of his foreign policy.

There are examples from recent history, of course – Bill Clinton famously spoke of his regrets over the Rwandan genocide. Will Obama speak similarly about Syria?

I don’t think so. Obama’s moral vanity is so large I don’t think he’ll ever apologise. I thought Clinton’s apology on Rwanda was a little cheap – that’s easy enough, I’ve apologised – but no, Obama’s bearing witness. It’s the strangest thing. The most powerful man in the world has decided for reasons I don’t entirely understand to be genuinely passive about the most important challenges.

“We have to look for those experiences that allow one to de-accelerate. To find those things that cannot be sped up, and cherish them more”. Leon Wieseltier in his office. Photograph: Brooks Kraft.

Philip Maughan is a freelance writer in Berlin and a former Assistant Editor at the New Statesman.


How nature created consciousness – and our brains became minds

In From Bacteria to Bach and Back, Daniel C Dennett investigates the evolution of consciousness.

In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.

The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.

Just as in nature there is design without a designer, so in many natural phenomena we can observe what Dennett calls “competence without comprehension”. Evolution does not understand nightingales, but it builds them; your immune system does not understand disease. Termites do not build their mounds according to blueprints, and yet the results are remarkably complex: reminiscent in one case, as Dennett notes, of Gaudí’s church the Sagrada Família. In general, evolution and its living products are saturated with competence without comprehension, with “unintelligent design”.

The question, therefore, is twofold. Why did “intelligent design” of the kind human beings exhibit – by building robotic cars or writing books – come about at all, if unintelligent design yields such impressive results? And how did the unintelligent-design process of evolution ever build intelligent designers like us in the first place? In sum, how did nature get from bacteria to Bach?

Dennett’s answer depends on memes – self-replicating units of cultural evolution, metaphorical viruses of the mind. Today we mostly use “meme” to mean something that is shared on social media, but in Richard Dawkins’s original formulation of the idea, a meme can be anything that is culturally transmitted and undergoes change: melodies, ideas, clothing fashions, ways of building pots, and so forth. Some might say that the only good example of a meme is the very idea of a meme, given that it has replicated efficiently over the years despite being of no use whatsoever to its hosts. (The biologist Stephen Jay Gould, for one, didn’t believe in memes.) But Dennett thinks that memes add something important to discussions of “cultural evolution” (a contested idea in its own right) that is not captured by established disciplines such as history or sociology.

The memes Dennett has in mind here are words: after all, they reproduce, with variation, in a changing environment (the mind of a host). Somehow, early vocalisations in our species became standardised as words. They acquired usefulness and meaning, and so, gradually, their use spread. Eventually, words became the tools that enabled our brains to reflect on what they were doing, thus bootstrapping themselves into full consciousness. The “meme invasion”, as Dennett puts it, “turned our brains into minds”. The idea that language had a critical role to play in the development of human consciousness is very plausible and not, in broad outline, new. The question is how much Dennett’s version leaves to explain.

Before the reader arrives at that crux, there are many useful philosophical interludes: on different senses of “why” (why as in “how come?” against why as in “what for?”), or in the “strange inversions of reasoning” offered by Darwin (the notion that competence does not require comprehension), Alan Turing (that a perfect computing machine need not know what arithmetic is) and David Hume (that causation is a projection of our minds and not something we perceive directly). Dennett suggests that the era of intelligent design may be coming to an end; after all, our best AIs, such as the AlphaGo program (which beat the human European champion of the boardgame Go 5-0 in a 2015 match), are these days created as learning systems that will teach themselves what to do. But our sunny and convivial host is not as worried as some about an imminent takeover by intelligent machines; the more pressing problem, he argues persuasively, is that we usually trust computerised systems to an extent they don’t deserve. His final call for critical thinking tools to be made widely available is timely and admirable. What remains puzzlingly vague to the end, however, is whether Dennett actually thinks human consciousness – the entire book’s explanandum – is real; and even what exactly he means by the term.

Dennett’s 1991 book, Consciousness Explained, seemed to some people to deny the existence of consciousness at all, so waggish critics retitled it Consciousness Explained Away. Yet it was never quite clear just what Dennett was claiming didn’t exist. In this new book, confusion persists, owing to his reluctance to define his terms. When he says “consciousness” he appears to mean reflective self-consciousness (I am aware that I am aware), whereas many other philosophers use “consciousness” to mean ordinary awareness, or experience. There ensues much sparring with straw men, as when he ridicules thinkers who assume that gorillas, say, have consciousness. They almost certainly don’t in his sense, and they almost certainly do in his opponents’ sense. (A gorilla, we may be pretty confident, has experience in the way that a volcano or a cloud does not.)

More unnecessary confusion, in which one begins to suspect Dennett takes a polemical delight, arises from his continued use of the term “illusion”. Consciousness, he has long said, is an illusion: we think we have it, but we don’t. But what is it that we are fooled into believing in? It can’t be experience itself: as the philosopher Galen Strawson has pointed out, the claim that I only seem to have experience presupposes that I really am having experience – the experience of there seeming to be something. And throughout this book, Dennett’s language implies that he thinks consciousness is real: he refers to “conscious thinking in H[omo] sapiens”, to people’s “private thoughts and experiences”, to our “proper minds, enculturated minds full of thinking tools”, and to “a ‘rich mental life’ in the sense of a conscious life like ours”.

The way in which this conscious life is allegedly illusory is finally explained in terms of a “user illusion”, such as the desktop on a computer operating system. We move files around on our screen desktop, but the way the computer works under the hood bears no relation to these pictorial metaphors. Similarly, Dennett writes, we think we are consistent “selves”, able to perceive the world as it is directly, and acting for rational reasons. But by far the bulk of what is going on in the brain is unconscious, low-level processing by neurons, to which we have no access. Therefore we are stuck at an “illusory” level, incapable of experiencing how our brains work.

This picture of our conscious mind is rather like Freud’s ego, precariously balanced atop a seething unconscious with an entirely different agenda. Dennett explains wonderfully what we now know, or at least compellingly theorise, about how much unconscious guessing, prediction and logical inference is done by our brains to produce even a very simple experience such as seeing a table. Still, to call our normal experience of things an “illusion” is, arguably, to privilege one level of explanation arbitrarily over another. If you ask me what is happening on my computer at the moment, I shall reply that I am writing a book review on a word processor. If I embarked instead on a description of electrical impulses running through the CPU, you would think I was being sarcastically obtuse. The normal answer is perfectly true. It’s also true that I am currently seeing my laptop screen even as this experience depends on innumerable neural processes of guessing and reconstruction.

The upshot is that, by the end of this brilliant book, the one thing that hasn’t been explained is consciousness. How does first-person experience – the experience you are having now, reading these words – arise from the electrochemical interactions of neurons? No one has even the beginnings of a plausible theory, which is why the question has been called the “Hard Problem”. Dennett’s story is that human consciousness arose because our brains were colonised by word-memes; but how did that do the trick? No explanation is forthcoming. Dennett likes to say the Hard Problem just doesn’t exist, but ignoring it won’t make it go away – even if, as his own book demonstrates, you can ignore it and still do a lot of deep and fascinating thinking about human beings and our place in nature.

Steven Poole’s books include “Rethink: the Surprising History of New Ideas” (Random House Books)

This article first appeared in the 16 February 2017 issue of the New Statesman, The New Times