The angry fundamentalists of the church of gaming

Why are gamers such an angry bunch?

I like the idea that the hate storm surrounding Anita Sarkeesian is a surprise to some people. It pleases me that there are still those who possess that level of innocence, who can still be outraged by the viciousness and ignorance that so many of us take for granted when using the Internet.

Sarkeesian, you see, broke two rules of online communication. The first, which I don’t want to dwell on, is that she forgot to be male. If you want to express any sort of opinion without a penis to give you credibility, then you are going to get a certain type of abuse from men, almost regardless of the topic.

The second rule she broke, however, is that she poked the sacred cow: video games.

Gamers are an incredibly diverse bunch, as I’m sure we all know, but as with any broad church there are going to be some people in there who are, for want of a better word, fundamentalists.

Gaming is no exception, and in many ways gaming culture mirrors the structure of an established religion. The younger gamers are often the more radical, more extreme in their views and in how they express them: the fanboys and the fanatics. The games industry is itself the church, delivering the games that are to be worshipped and revered by the masses. The older generation of gamers tends to view this church with more suspicion, but most, at heart, are still believers.

The biggest element linking gaming culture to a religion, however, is just how conservative it is. A lot of gamers do not like change: they will wait like hungry dogs for the next game in a series, but they don’t want it to be too different. Just like the faithful going to church, they expect to hear what they want to hear: nothing radical, nothing too different, but not word for word what was said last week. It is no coincidence that so many of the most successful games of recent years are sequels, giving the public more of the same.

You can see evidence of this conservatism in the kind of language gamers often use to describe new games. New releases are jumped upon and embraced, of course, but at the same time they are often resented by the faithful. Many games, even successful ones like Skyrim and Battlefield 3, are seen as toned-down, casual shadows of the tougher, less forgiving and less accessible games that we cut our teeth on.

In the face of this orthodoxy, the arrival of women on the scene, carrying with them an agenda of change, is inevitably greeted with vitriol and anger by gamers who perceive their precious stream of the same thing as last year to be under threat. Worse, it is not just women who openly have an agenda who face this wrath; female gamers are abused merely for the crime of being female. They are seen as harbingers of some sort of oestrogen-induced end of days for gaming: a spoilt little sister who has climbed the rope ladder to our clubhouse and intends to paint it pink.

It is this conservatism that Sarkeesian’s project confronted: the desire of many gamers not to see things changed. While it takes only a small minority to unleash a torrent of abuse like the one she was subjected to, the views held by her abusers are not that rare, as evidenced by how often female gamers encounter them.

This mindset is of course not shared by all gamers, and gaming does see radical ideas breaking out into the world on a regular basis, but it should concern anybody who wants to see the medium progress that the biggest titles are always the hardy perennials: Call of Duty, Halo, FIFA, the same Malibu Stacy as last year with a new hat.

To an extent, change is already happening. Lara Croft’s transformation from a heavily armed blow-up doll into a relatable teenage girl wielding a bow is a laudable, if clumsy, step in the right direction. It could also be seen as an attempt to cash in on The Hunger Games, of course, but even that shows a change in the focus of marketing. Meanwhile, recent games such as Duke Nukem Forever and Postal 3, which used their misogyny and crassness as selling points, have been total failures.

Games and gaming are growing up fast, and no amount of hostility and rage from hardline gamers is going to change that.

A gamer of yesteryear. This dying breed will defend its turf to the death.

Phil Hartup is a freelance journalist with an interest in video gaming and culture


How nature created consciousness – and our brains became minds

In From Bacteria to Bach and Back, Daniel C Dennett investigates the evolution of consciousness.

In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.

The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.

Just as in nature there is design without a designer, so in many natural phenomena we can observe what Dennett calls “competence without comprehension”. Evolution does not understand nightingales, but it builds them; your immune system does not understand disease. Termites do not build their mounds according to blueprints, and yet the results are remarkably complex: reminiscent in one case, as Dennett notes, of Gaudí’s church the Sagrada Família. In general, evolution and its living products are saturated with competence without comprehension, with “unintelligent design”.

The question, therefore, is twofold. Why did “intelligent design” of the kind human beings exhibit – by building robotic cars or writing books – come about at all, if unintelligent design yields such impressive results? And how did the unintelligent-design process of evolution ever build intelligent designers like us in the first place? In sum, how did nature get from bacteria to Bach?

Dennett’s answer depends on memes – self-replicating units of cultural evolution, metaphorical viruses of the mind. Today we mostly use “meme” to mean something that is shared on social media, but in Richard Dawkins’s original formulation of the idea, a meme can be anything that is culturally transmitted and undergoes change: melodies, ideas, clothing fashions, ways of building pots, and so forth. Some might say that the only good example of a meme is the very idea of a meme, given that it has replicated efficiently over the years despite being of no use whatsoever to its hosts. (The biologist Stephen Jay Gould, for one, didn’t believe in memes.) But Dennett thinks that memes add something important to discussions of “cultural evolution” (a contested idea in its own right) that is not captured by established disciplines such as history or sociology.

The memes Dennett has in mind here are words: after all, they reproduce, with variation, in a changing environment (the mind of a host). Somehow, early vocalisations in our species became standardised as words. They acquired usefulness and meaning, and so, gradually, their use spread. Eventually, words became the tools that enabled our brains to reflect on what they were doing, thus bootstrapping themselves into full consciousness. The “meme invasion”, as Dennett puts it, “turned our brains into minds”. The idea that language had a critical role to play in the development of human consciousness is very plausible and not, in broad outline, new. The question is how much Dennett’s version leaves to explain.

Before the reader arrives at that crux, there are many useful philosophical interludes: on different senses of “why” (why as in “how come?” against why as in “what for?”), or on the “strange inversions of reasoning” offered by Darwin (the notion that competence does not require comprehension), Alan Turing (that a perfect computing machine need not know what arithmetic is) and David Hume (that causation is a projection of our minds and not something we perceive directly). Dennett suggests that the era of intelligent design may be coming to an end; after all, our best AIs, such as the AlphaGo program (which beat the human European champion of the board game Go 5-0 in a 2015 match), are these days created as learning systems that will teach themselves what to do. But our sunny and convivial host is not as worried as some about an imminent takeover by intelligent machines; the more pressing problem, he argues persuasively, is that we usually trust computerised systems to an extent they don’t deserve. His final call for critical thinking tools to be made widely available is timely and admirable. What remains puzzlingly vague to the end, however, is whether Dennett actually thinks human consciousness – the entire book’s explanandum – is real; and even what exactly he means by the term.

Dennett’s 1991 book, Consciousness Explained, seemed to some people to deny the existence of consciousness at all, so waggish critics retitled it Consciousness Explained Away. Yet it was never quite clear just what Dennett was claiming didn’t exist. In this new book, confusion persists, owing to his reluctance to define his terms. When he says “consciousness” he appears to mean reflective self-consciousness (I am aware that I am aware), whereas many other philosophers use “consciousness” to mean ordinary awareness, or experience. There ensues much sparring with straw men, as when he ridicules thinkers who assume that gorillas, say, have consciousness. They almost certainly don’t in his sense, and they almost certainly do in his opponents’ sense. (A gorilla, we may be pretty confident, has experience in the way that a volcano or a cloud does not.)

More unnecessary confusion, in which one begins to suspect Dennett takes a polemical delight, arises from his continued use of the term “illusion”. Consciousness, he has long said, is an illusion: we think we have it, but we don’t. But what is it that we are fooled into believing in? It can’t be experience itself: as the philosopher Galen Strawson has pointed out, the claim that I only seem to have experience presupposes that I really am having experience – the experience of there seeming to be something. And throughout this book, Dennett’s language implies that he thinks consciousness is real: he refers to “conscious thinking in H[omo] sapiens”, to people’s “private thoughts and experiences”, to our “proper minds, enculturated minds full of thinking tools”, and to “a ‘rich mental life’ in the sense of a conscious life like ours”.

The way in which this conscious life is allegedly illusory is finally explained in terms of a “user illusion”, such as the desktop on a computer operating system. We move files around on our screen desktop, but the way the computer works under the hood bears no relation to these pictorial metaphors. Similarly, Dennett writes, we think we are consistent “selves”, able to perceive the world as it is directly, and acting for rational reasons. But by far the bulk of what is going on in the brain is unconscious, low-level processing by neurons, to which we have no access. Therefore we are stuck at an “illusory” level, incapable of experiencing how our brains work.

This picture of our conscious mind is rather like Freud’s ego, precariously balanced atop a seething unconscious with an entirely different agenda. Dennett explains wonderfully what we now know, or at least compellingly theorise, about how much unconscious guessing, prediction and logical inference is done by our brains to produce even a very simple experience such as seeing a table. Still, to call our normal experience of things an “illusion” is, arguably, to privilege one level of explanation arbitrarily over another. If you ask me what is happening on my computer at the moment, I shall reply that I am writing a book review on a word processor. If I embarked instead on a description of electrical impulses running through the CPU, you would think I was being sarcastically obtuse. The normal answer is perfectly true. It’s also true that I am currently seeing my laptop screen even as this experience depends on innumerable neural processes of guessing and reconstruction.

The upshot is that, by the end of this brilliant book, the one thing that hasn’t been explained is consciousness. How does first-person experience – the experience you are having now, reading these words – arise from the electrochemical interactions of neurons? No one has even the beginnings of a plausible theory, which is why the question has been called the “Hard Problem”. Dennett’s story is that human consciousness arose because our brains were colonised by word-memes; but how did that do the trick? No explanation is forthcoming. Dennett likes to say the Hard Problem just doesn’t exist, but ignoring it won’t make it go away – even if, as his own book demonstrates, you can ignore it and still do a lot of deep and fascinating thinking about human beings and our place in nature.

Steven Poole’s books include “Rethink: the Surprising History of New Ideas” (Random House Books)

This article first appeared in the 16 February 2017 issue of the New Statesman, The New Times