Family values: Gugu Mbatha-Raw as Dido and Sarah Gadon as Lady Elizabeth Murray in Belle

Race and sensibility: Belle by Amma Asante

The illegitimate mixed-race daughter of an admiral in 18th-century England, Dido Elizabeth Belle has a status too high to allow her to eat with the servants, yet too low to permit her to join guests for dinner.

It should go down as one of the disgraces of British cinema that it took ten years for the writer-director Amma Asante to get the chance to make a follow-up to her 2004 debut, A Way of Life. That picture, which explored the tensions contributing to a (fictional) racist murder in Cardiff, showed a director capable of keeping her nerve when faced with the dual temptations of melodrama and blame. Each character, no matter how monstrous their actions, could count on being the beneficiary of her insights and her mercy.

Several follow-up projects collapsed during the economic crisis but Asante has finally made a second film. Like her first, Belle is a story of race seen from an oblique angle. Its focus is a real woman whose horizons were narrowed by prejudice but who nevertheless enjoyed a life of greater privilege than some white members of society and even of her own family.

As the illegitimate mixed-race daughter of a Royal Navy admiral in late-18th-century England, Dido Elizabeth Belle (Gugu Mbatha-Raw) grows up at Kenwood House with her father’s family while he returns to the sea. Her great-uncle Lord Mansfield (Tom Wilkinson) talks her through the canvases that gaze imposingly from the walls. The only black subjects in those paintings are subservient to white masters but Dido’s life is more complicated than that. Her status is too high to allow her to eat with the servants, yet too low to permit her to join guests for dinner.

She has a guaranteed income for life from her father, which perversely makes her a lesser priority than her cousin Lady Elizabeth Murray (Sarah Gadon) when it comes to finding a husband. A man of good breeding, she is told, would be unlikely to marry her. Any other kind of suitor, however, would lower her rank. Her fate is to have fallen between a multitude of stools, racial and economic. She has the wealth and standing of aristocracy and none of the leverage.

Asante (who also worked on the screenplay, though only the original writer, Misan Sagay, is credited) can’t correct history. What she can do is restore some of the power that must have been denied to Dido in life. (Diaries from the Mansfield household formed the spine of the research, while a painting of Dido and Elizabeth was the film’s springboard, but the screenplay is predominantly speculative.)

Dido is made the lynchpin of social transactions that appear to exclude her. When Elizabeth is disparaged by the cad she hopes to marry, the film puts Dido in a position to blow the whistle. When Lord Mansfield, the lord chief justice, presides over the court case involving the Zong slave ship, from which 142 Africans were thrown to their deaths, Dido is the one who discovers incriminating inconsistencies in the ship’s log. She might have come across as a proper Nancy Drew if not for Mbatha-Raw’s screen presence, gentle to the point of faintness but brimming with inner hurt.

She and the film are never better than in the brief scene in which a black maid, Mabel (Bethan Mary-James), notices Dido struggling to brush her hair. Mbatha-Raw has to cram layers of conflicting emotion into the petulant scowl that Dido shoots across the room at Mabel. She is smarting at the servant’s impertinence in staring but she is also curious and embarrassed at the disparity in status between them: two black women kept in their respective places by racism of varying strengths. There is envy, too. After all, Mabel knows from her childhood how to take unruly African hair in hand, which Dido does not. The scene’s genius comes in the next shot, a brisk and brilliant cut to the pair of them in front of a mirror – Dido seated as she is groomed by Mabel, both women wearing girlish slumber-party grins.

Never content to give a scene a single flavour when she can squeeze in two, Asante is careful to show that Elizabeth is the gooseberry in this moment of sisterhood. But then one of the points of Belle, expressed in its central metaphor of the portrait for which the cousins pose, is that someone is always at risk of being painted out of history. The film paints everyone back in.

Ryan Gilbey is the New Statesman's film critic. He is also the author of It Don't Worry Me (Faber), about 1970s US cinema, and a study of Groundhog Day in the "Modern Classics" series (BFI Publishing). He was named reviewer of the year in the 2007 Press Gazette awards.

This article first appeared in the 11 June 2014 issue of the New Statesman, The last World Cup


How nature created consciousness – and our brains became minds

In From Bacteria to Bach and Back, Daniel C Dennett investigates the evolution of consciousness.

In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.

The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.

Just as in nature there is design without a designer, so in many natural phenomena we can observe what Dennett calls “competence without comprehension”. Evolution does not understand nightingales, but it builds them; your immune system does not understand disease, yet it fights it. Termites do not build their mounds according to blueprints, and yet the results are remarkably complex: reminiscent in one case, as Dennett notes, of Gaudí’s church the Sagrada Família. In general, evolution and its living products are saturated with competence without comprehension, with “unintelligent design”.

The question, therefore, is twofold. Why did “intelligent design” of the kind human beings exhibit – by building robotic cars or writing books – come about at all, if unintelligent design yields such impressive results? And how did the unintelligent-design process of evolution ever build intelligent designers like us in the first place? In sum, how did nature get from bacteria to Bach?

Dennett’s answer depends on memes – self-replicating units of cultural evolution, metaphorical viruses of the mind. Today we mostly use “meme” to mean something that is shared on social media, but in Richard Dawkins’s original formulation of the idea, a meme can be anything that is culturally transmitted and undergoes change: melodies, ideas, clothing fashions, ways of building pots, and so forth. Some might say that the only good example of a meme is the very idea of a meme, given that it has replicated efficiently over the years despite being of no use whatsoever to its hosts. (The biologist Stephen Jay Gould, for one, didn’t believe in memes.) But Dennett thinks that memes add something important to discussions of “cultural evolution” (a contested idea in its own right) that is not captured by established disciplines such as history or sociology.

The memes Dennett has in mind here are words: after all, they reproduce, with variation, in a changing environment (the mind of a host). Somehow, early vocalisations in our species became standardised as words. They acquired usefulness and meaning, and so, gradually, their use spread. Eventually, words became the tools that enabled our brains to reflect on what they were doing, thus bootstrapping themselves into full consciousness. The “meme invasion”, as Dennett puts it, “turned our brains into minds”. The idea that language had a critical role to play in the development of human consciousness is very plausible and not, in broad outline, new. The question is how much Dennett’s version leaves to explain.

Before the reader arrives at that crux, there are many useful philosophical interludes: on different senses of “why” (why as in “how come?” against why as in “what for?”), or on the “strange inversions of reasoning” offered by Darwin (the notion that competence does not require comprehension), Alan Turing (that a perfect computing machine need not know what arithmetic is) and David Hume (that causation is a projection of our minds and not something we perceive directly). Dennett suggests that the era of intelligent design may be coming to an end; after all, our best AIs, such as the AlphaGo program (which beat the human European champion of the boardgame Go 5-0 in a 2015 match), are these days created as learning systems that will teach themselves what to do. But our sunny and convivial host is not as worried as some about an imminent takeover by intelligent machines; the more pressing problem, he argues persuasively, is that we usually trust computerised systems to an extent they don’t deserve. His final call for critical thinking tools to be made widely available is timely and admirable. What remains puzzlingly vague to the end, however, is whether Dennett actually thinks human consciousness – the entire book’s explanandum – is real; and even what exactly he means by the term.

Dennett’s 1991 book, Consciousness Explained, seemed to some people to deny the existence of consciousness at all, so waggish critics retitled it Consciousness Explained Away. Yet it was never quite clear just what Dennett was claiming didn’t exist. In this new book, confusion persists, owing to his reluctance to define his terms. When he says “consciousness” he appears to mean reflective self-consciousness (I am aware that I am aware), whereas many other philosophers use “consciousness” to mean ordinary awareness, or experience. There ensues much sparring with straw men, as when he ridicules thinkers who assume that gorillas, say, have consciousness. They almost certainly don’t in his sense, and they almost certainly do in his opponents’ sense. (A gorilla, we may be pretty confident, has experience in the way that a volcano or a cloud does not.)

More unnecessary confusion, in which one begins to suspect Dennett takes a polemical delight, arises from his continued use of the term “illusion”. Consciousness, he has long said, is an illusion: we think we have it, but we don’t. But what is it that we are fooled into believing in? It can’t be experience itself: as the philosopher Galen Strawson has pointed out, the claim that I only seem to have experience presupposes that I really am having experience – the experience of there seeming to be something. And throughout this book, Dennett’s language implies that he thinks consciousness is real: he refers to “conscious thinking in H[omo] sapiens”, to people’s “private thoughts and experiences”, to our “proper minds, enculturated minds full of thinking tools”, and to “a ‘rich mental life’ in the sense of a conscious life like ours”.

The way in which this conscious life is allegedly illusory is finally explained in terms of a “user illusion”, such as the desktop on a computer operating system. We move files around on our screen desktop, but the way the computer works under the hood bears no relation to these pictorial metaphors. Similarly, Dennett writes, we think we are consistent “selves”, able to perceive the world as it is directly, and acting for rational reasons. But by far the bulk of what is going on in the brain is unconscious, low-level processing by neurons, to which we have no access. Therefore we are stuck at an “illusory” level, incapable of experiencing how our brains work.

This picture of our conscious mind is rather like Freud’s ego, precariously balanced atop a seething unconscious with an entirely different agenda. Dennett explains wonderfully what we now know, or at least compellingly theorise, about how much unconscious guessing, prediction and logical inference is done by our brains to produce even a very simple experience such as seeing a table. Still, to call our normal experience of things an “illusion” is, arguably, to privilege one level of explanation arbitrarily over another. If you ask me what is happening on my computer at the moment, I shall reply that I am writing a book review on a word processor. If I embarked instead on a description of electrical impulses running through the CPU, you would think I was being sarcastically obtuse. The normal answer is perfectly true. It’s also true that I am currently seeing my laptop screen, even though this experience depends on innumerable neural processes of guessing and reconstruction.

The upshot is that, by the end of this brilliant book, the one thing that hasn’t been explained is consciousness. How does first-person experience – the experience you are having now, reading these words – arise from the electrochemical interactions of neurons? No one has even the beginnings of a plausible theory, which is why the question has been called the “Hard Problem”. Dennett’s story is that human consciousness arose because our brains were colonised by word-memes; but how did that do the trick? No explanation is forthcoming. Dennett likes to say the Hard Problem just doesn’t exist, but ignoring it won’t make it go away – even if, as his own book demonstrates, you can ignore it and still do a lot of deep and fascinating thinking about human beings and our place in nature.

Steven Poole’s books include “Rethink: the Surprising History of New Ideas” (Random House Books)

This article first appeared in the 16 February 2017 issue of the New Statesman, The New Times