I gave up on Mozza years ago – Morrissey 25: Live is proof that I was right to do it

As far as Morrissey concerts go, the one immortalised in his latest film Morrissey 25: Live isn't the best. It saddens me to say it, but my love affair with Mozza is well and truly over.

A few times a week, I pass the UCKG (Universal Church of the Kingdom of God) building on Kilburn High Road, and I usually glance up at its plump dome and feel a teensy bit nostalgic. Before it was a church, it was the National Ballroom, a thriving music venue for decades (it closed in 1999). Nirvana played there in December 1991, but the gig to which I think back when I pass the building took place a few months earlier that year.

It was Morrissey—the second time I had seen him on his 1991 Kill Uncle tour—and my companion and I had arrived in Kilburn early enough to catch a glimpse of him being chauffeured away after soundcheck. It would be factually incorrect to say that we chased his car. It was more a moderate hotfooting than an actual chase. We made it to the side street just as he was pulling away, and snapped frantically at the vehicle’s window with our cameras. The pictures came out well. You could see clearly it was Morrissey: aloof as a queen, smug as a cat. He was smirking, as he often is. Was it at the thought of the gold foil-effect shirt he would wear later that night on stage? How I loved that shirt.

And how I loved Morrissey. This confers on me no particular distinction. “I Was a Teenage Morrissey fan” is a revelation to file alongside other popular adolescent confessions such as “I was insufferably pretentious” and “I had acne.” But—and I’m sorry to break it to you so brutally if you had not already heard—Morrissey and I are over. Finished. I’m never going back. Not after what he did to me. What did he do? Well, his music went off and so did he.

It was nice while it lasted. And it lasted 20 years. I was a shade too young to be in on the Smiths from the start but by the time The Queen Is Dead was released in June 1986, I was hanging out with some cool older kids who clued me in. Morrissey and I went all the way. All the way, that is, from 1986 to 2006, when the release of his eighth solo album, Ringleader of the Tormentors, coincided with a frosting of my affection for him. I can’t say whether the feeling was mutual; you’ll just have to contact him for his side of the story.

And it wasn’t so much that album that killed off our relationship—it’s at least half-brilliant, and far more nuanced than what followed. But what he was saying and doing away from the studio began to interfere with the music. There was always a prickly arrogance about him to offset the self-flagellation in his writing; that was part of the joy of his persona. But now there was an air of social and cultural intolerance in his proclamations which was no longer about defending the outsider—it seemed to involve lashing out pointlessly at anyone whose perspective deviated even mildly from his, or slighting entire races (“You can’t help but feel that the Chinese are a subspecies,” he told the Guardian in 2010). By the time he was ranting about Kate Middleton’s admission to hospital last year, insisting that she was swinging the lead, I found myself in the unusual position of feeling sympathy for a member of the Royal family. My 16-year-old self would have thrown up at that.

Then there were the pompous dispatches he had begun issuing through the uncritical portal of the fansite true-to-you.net; they were like a Private Eye pastiche of rock-star delusions. He had also become a strikingly poor writer. This, from a recent 1,500-word, single-paragraph diatribe against Thatcher, will make any sane person reach for the red pen: “The coverage by the British media of Thatcher’s death has been exclusively absorbed in Thatcher’s canonization to such a censorial degree that we suddenly see the modern British establishment as an uncivilized entity of delusion, giving the cold shoulder to truth, and offering indescribable disgust to anyone unimpressed by Thatcher.” (Not quite “Margaret on the Guillotine,” is it?)

I should probably confess that the blame for my cooling can’t be laid entirely at Morrissey’s feet. I think you know what I’m saying: yes, there was someone else. Another man, younger and livelier and so much more innovative than Morrissey. Ariel Pink is his name, and I realised when I heard his album Worn Copy in 2006 that he had the playfulness, wit and passion that had been missing from Morrissey for the longest time. What can I say? He’s good for me.

I didn’t leave Morrissey a goodbye note, a Dear John letter. I guess in some ways, this is that letter. But now he has left me one: his concert film Morrissey 25: Live (so named because it marks the quarter-century point in his solo career). It’s a terrible film: depressingly conservative as an example of the concert-movie genre, and a harsh indictment of its subject’s complacency and declining creativity. Helpfully, it only confirms to me how right it was that we went our separate ways. It was a hard decision. But, as he once put it, that’s how people grow up.

The film includes the full concert he played in March this year at Hollywood High School. The set-list perversely scrapes the barrel of his solo career: the inclusion of “Alma Matters,” “Ouija Board, Ouija Board” and “You’re the One For Me, Fatty” suggests he was going in his contrarian way for a Greatest Misses effect. Any fine songs in his repertoire—from solo numbers like “Everyday Is Like Sunday” to the Smiths’ “Still Ill” and “The Boy with the Thorn in His Side”—tend to be massacred by his increasingly callous band. The low point of the movie shows Morrissey handing the microphone to a selection of front-row fans who compete to give the best impressions of lobotomy patients (“Thank you for living,” says one).

We can’t blame them, though. It’s Morrissey who disgraces himself by fishing for their compliments using an industrial trawler. His egotism can only undermine the sincerity of a song like “Please Please Please Let Me Get What I Want,” released in 1984 but performed here in an overwrought new arrangement. To hear him sing “For once in my life, let me get what I want” after several fans have done everything short of offering themselves up to him for sacrifice is ungrateful at best, disingenuous at worst.

I’ve seen good Morrissey gigs and bad ones. I went to more than 20 shows—one for each year of my infatuation—and I cherish the great nights (Wembley Arena 1991, Battersea Power Station 1996, Royal Albert Hall 2002, Harlem’s Apollo Theatre 2004) as much as I wince at the lacklustre ones (Bournemouth 1991, Ilford, east London, 1996). Unless the transfer from stage to screen has been especially harsh, my Moz-memory tells me that the performance we see in Morrissey 25: Live is not one that merited preserving. But at least it has helped bring me closure on this relationship. I know I will still gaze up at the old National Ballroom building and get goosebumps. But I know also that I can move on. I only hope the same is true of Morrissey.

Morrissey 25: Live is in cinemas from Saturday.

Mozza glances down at the groundlings at Hollywood High School. Photograph: Kevin Winter/Getty Images.

Ryan Gilbey is the New Statesman's film critic. He is also the author of It Don't Worry Me (Faber), about 1970s US cinema, and a study of Groundhog Day in the "Modern Classics" series (BFI Publishing). He was named reviewer of the year in the 2007 Press Gazette awards.


How nature created consciousness – and our brains became minds

In From Bacteria to Bach and Back, Daniel C Dennett investigates the evolution of consciousness.

In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.

The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.

Just as in nature there is design without a designer, so in many natural phenomena we can observe what Dennett calls “competence without comprehension”. Evolution does not understand nightingales, but it builds them; your immune system does not understand disease. Termites do not build their mounds according to blueprints, and yet the results are remarkably complex: reminiscent in one case, as Dennett notes, of Gaudí’s church the Sagrada Família. In general, evolution and its living products are saturated with competence without comprehension, with “unintelligent design”.

The question, therefore, is twofold. Why did “intelligent design” of the kind human beings exhibit – by building robotic cars or writing books – come about at all, if unintelligent design yields such impressive results? And how did the unintelligent-design process of evolution ever build intelligent designers like us in the first place? In sum, how did nature get from bacteria to Bach?

Dennett’s answer depends on memes – self-replicating units of cultural evolution, metaphorical viruses of the mind. Today we mostly use “meme” to mean something that is shared on social media, but in Richard Dawkins’s original formulation of the idea, a meme can be anything that is culturally transmitted and undergoes change: melodies, ideas, clothing fashions, ways of building pots, and so forth. Some might say that the only good example of a meme is the very idea of a meme, given that it has replicated efficiently over the years despite being of no use whatsoever to its hosts. (The biologist Stephen Jay Gould, for one, didn’t believe in memes.) But Dennett thinks that memes add something important to discussions of “cultural evolution” (a contested idea in its own right) that is not captured by established disciplines such as history or sociology.

The memes Dennett has in mind here are words: after all, they reproduce, with variation, in a changing environment (the mind of a host). Somehow, early vocalisations in our species became standardised as words. They acquired usefulness and meaning, and so, gradually, their use spread. Eventually, words became the tools that enabled our brains to reflect on what they were doing, thus bootstrapping themselves into full consciousness. The “meme invasion”, as Dennett puts it, “turned our brains into minds”. The idea that language had a critical role to play in the development of human consciousness is very plausible and not, in broad outline, new. The question is how much Dennett’s version leaves to explain.

Before the reader arrives at that crux, there are many useful philosophical interludes: on different senses of “why” (why as in “how come?” against why as in “what for?”), or on the “strange inversions of reasoning” offered by Darwin (the notion that competence does not require comprehension), Alan Turing (that a perfect computing machine need not know what arithmetic is) and David Hume (that causation is a projection of our minds and not something we perceive directly). Dennett suggests that the era of intelligent design may be coming to an end; after all, our best AIs, such as the AlphaGo program (which beat the human European champion of the boardgame Go 5-0 in a 2015 match), are these days created as learning systems that will teach themselves what to do. But our sunny and convivial host is not as worried as some about an imminent takeover by intelligent machines; the more pressing problem, he argues persuasively, is that we usually trust computerised systems to an extent they don’t deserve. His final call for critical thinking tools to be made widely available is timely and admirable. What remains puzzlingly vague to the end, however, is whether Dennett actually thinks human consciousness – the entire book’s explanandum – is real; and even what exactly he means by the term.

Dennett’s 1991 book, Consciousness Explained, seemed to some people to deny the existence of consciousness at all, so waggish critics retitled it Consciousness Explained Away. Yet it was never quite clear just what Dennett was claiming didn’t exist. In this new book, confusion persists, owing to his reluctance to define his terms. When he says “consciousness” he appears to mean reflective self-consciousness (I am aware that I am aware), whereas many other philosophers use “consciousness” to mean ordinary awareness, or experience. There ensues much sparring with straw men, as when he ridicules thinkers who assume that gorillas, say, have consciousness. They almost certainly don’t in his sense, and they almost certainly do in his opponents’ sense. (A gorilla, we may be pretty confident, has experience in the way that a volcano or a cloud does not.)

More unnecessary confusion, in which one begins to suspect Dennett takes a polemical delight, arises from his continued use of the term “illusion”. Consciousness, he has long said, is an illusion: we think we have it, but we don’t. But what is it that we are fooled into believing in? It can’t be experience itself: as the philosopher Galen Strawson has pointed out, the claim that I only seem to have experience presupposes that I really am having experience – the experience of there seeming to be something. And throughout this book, Dennett’s language implies that he thinks consciousness is real: he refers to “conscious thinking in H[omo] sapiens”, to people’s “private thoughts and experiences”, to our “proper minds, enculturated minds full of thinking tools”, and to “a ‘rich mental life’ in the sense of a conscious life like ours”.

The way in which this conscious life is allegedly illusory is finally explained in terms of a “user illusion”, such as the desktop on a computer operating system. We move files around on our screen desktop, but the way the computer works under the hood bears no relation to these pictorial metaphors. Similarly, Dennett writes, we think we are consistent “selves”, able to perceive the world as it is directly, and acting for rational reasons. But by far the bulk of what is going on in the brain is unconscious, low-level processing by neurons, to which we have no access. Therefore we are stuck at an “illusory” level, incapable of experiencing how our brains work.

This picture of our conscious mind is rather like Freud’s ego, precariously balanced atop a seething unconscious with an entirely different agenda. Dennett explains wonderfully what we now know, or at least compellingly theorise, about how much unconscious guessing, prediction and logical inference is done by our brains to produce even a very simple experience such as seeing a table. Still, to call our normal experience of things an “illusion” is, arguably, to privilege one level of explanation arbitrarily over another. If you ask me what is happening on my computer at the moment, I shall reply that I am writing a book review on a word processor. If I embarked instead on a description of electrical impulses running through the CPU, you would think I was being sarcastically obtuse. The normal answer is perfectly true. It’s also true that I am currently seeing my laptop screen even as this experience depends on innumerable neural processes of guessing and reconstruction.

The upshot is that, by the end of this brilliant book, the one thing that hasn’t been explained is consciousness. How does first-person experience – the experience you are having now, reading these words – arise from the electrochemical interactions of neurons? No one has even the beginnings of a plausible theory, which is why the question has been called the “Hard Problem”. Dennett’s story is that human consciousness arose because our brains were colonised by word-memes; but how did that do the trick? No explanation is forthcoming. Dennett likes to say the Hard Problem just doesn’t exist, but ignoring it won’t make it go away – even if, as his own book demonstrates, you can ignore it and still do a lot of deep and fascinating thinking about human beings and our place in nature.

Steven Poole’s books include “Rethink: the Surprising History of New Ideas” (Random House Books)

This article first appeared in the 16 February 2017 issue of the New Statesman, The New Times