No, Jane Austen was not a game theorist

Using science to explain art is a good way to butcher both, and is intellectually bankrupt to boot.

This article first appeared on newrepublic.com

Proust was a neuroscientist. Jane Austen was a game theorist. Dickens was a gastroenterologist. That’s the latest gambit in the brave new world of “consilience,” the idea that we can overcome the split between “the two cultures” by bringing art and science into conceptual unity – which is to say, by setting humanistic thought upon a scientific foundation. Take a famous writer, preferably one with some marketing mojo, and argue that their work anticipates contemporary scientific insights. Proust knew things about memory that neuroscientists are only now discovering. Austen constructed her novels in a manner that is consistent with game theory. Bang, there’s your consilience.

There is only one problem with this approach: it is intellectually bankrupt. Actually, there are a lot of problems, as Michael Suk-Young Chwe’s abominable volume shows. If this is the sort of thing that we have to look forward to, as science undertakes to tutor the humanities, the prospect isn’t bright. Game theory is a method for modeling decisions in mathematical terms, especially in contexts that involve a multiplicity of actors. One would think, given its title, that Chwe’s book offers an in-depth game-theoretical analysis of the ways that Austen’s characters (specifically, her heroines and heroes) work through their choices (specifically, the ones they make in relation to one another) – why Elizabeth Bennet, to take the most obvious example, rejects Mr Darcy the first time he proposes but accepts him on the next go-round.

No such luck. What we really get, once we fight through Chwe’s meandering, ponderous, frequently self-contradictory argument, is only the claim that Austen wants her characters to think in game-theoretic ways: to reflect upon the likely consequences of their choices, to plan out how to reach their goals, to try to intuit what the people around them are thinking and how they in turn are likely to act. But this is hardly news. Austen describes a world in which young ladies have to navigate their perilous way to happiness (that is, a rich husband they can get along with, or, more charitably, a man they love who happens to be wealthy) by controlling their impulses and thinking coolly and deliberately. Act otherwise and you end up like Lydia Bennet, yoked forever to the feckless Mr Wickham. That Austen is no D H Lawrence – that she believed that reason should govern our conduct – is pretty much the most obvious thing about her work.

But Chwe himself is not content with being reasonable. When he says that Austen was a game theorist, he means for us to take him at his word. Never mind the fact that game theory did not emerge until the middle of the twentieth century. Austen, he claims, was a “social theorist” who “carefully establishes game theory’s core concepts” and “systematically explored” them in her novels, which are “game theory textbooks.” This is a perfectly valid statement, as long as we ignore the accepted meaning of most of the words it contains. Chwe apparently saw the title of Proust Was a Neuroscientist and took it literally. Jonah Lehrer, to give him what little credit he deserves, does not actually believe that the author of the Recherche conducted experiments with rats and prions. But Chwe insists that Austen’s novels do not just adumbrate some social-scientific concepts, they represent a pioneering “research program” into game theory (which, again, did not exist) that constituted her essential purpose in creating them. This, apparently, is how you achieve consilience: by pretending that artists are scientists in disguise.

We’ll get to the category errors in a minute. For now, let’s recognise that Chwe, a professor of political science with a PhD in economics, is making two rather large and improbable claims: that Austen programmatically developed such concepts as “choice (a person takes an action because she chooses to do so), preferences (a person chooses the action with the highest payoff), and strategic thinking (before taking an action, a person thinks about how others will act)” – thundering ideas, to be sure – and that she was the very first to show an interest in them.
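To see just how little machinery these concepts involve, here is a minimal sketch in Python – mine, not Chwe’s, with actions, payoffs, and a probability invented purely for illustration; nothing like them appears in Austen or, in any worked-out form, in Chwe’s book.

    # A toy rendering of the three concepts, with made-up numbers.
    # Preferences: each available action carries a payoff.
    payoffs = {"accept the first proposal": 1.0, "refuse and wait": 3.0}

    # Choice: the agent takes the action with the highest payoff.
    choice = max(payoffs, key=payoffs.get)

    # Strategic thinking: my payoff depends on the other player's move --
    # here, a hypothetical chance that the suitor proposes a second time.
    p_second_proposal = 0.5
    expected = {
        "accept the first proposal": payoffs["accept the first proposal"],
        "refuse and wait": p_second_proposal * payoffs["refuse and wait"],
    }
    print(choice, expected)  # "refuse and wait" still comes out ahead

That a first-year student could write these dozen lines is rather the point.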

Chwe falls down the moment he begins to make the case. “The most specific ‘smoking gun’ evidence that Austen is centrally concerned with strategic thinking is how she employs children: when a child appears, it is almost always in a strategic context,” as a pawn or bit player “in an adult’s strategic actions.” Really, that’s the best you can do? First of all, when a child appears in Austen, it isn’t almost always in a strategic context. She also often uses them – Emma’s nieces and nephews, for example, whom we see her love and care for – to certify the goodness of her heroine’s heart. More importantly, what would it prove if she did always use them in a strategic context? Children are not a privileged category of representation; in Austen, in fact, they are a very minor one, never more than incidental to the action. Yes, they are sometimes used strategically – but so are pianos and snowstorms and horses. So what?

The balance of Chwe’s evidence is comparably trivial. As a clincher, he cites the moment where Jane and Elizabeth Bennet find their comically pedantic sister Mary “deep in the study of thorough-bass and human nature.” Thorough-bass, he reasons, is a mathematical approach to music. By having Mary study music and human nature the same way, Austen suggests the possibility of a mathematical approach to the latter – that is, game theory. Don’t worry, I don’t get it either. No one said that Mary studies them the same way, only at the same time. Besides, as everyone but Chwe can see, the character is being held up as a figure of fun, not an intellectual role model. As hard as it is to believe that Austen undertook to construct a systematic approach to human behavior along game-theoretical lines, the notion that she did so within the kind of quantitative framework that exists today – decision trees, decision matrices, numerical inputs and outcomes – is truly idiotic.

As for the question of Austen’s priority as a “game theorist,” there is a grain of truth to the idea. She did depict strategic thinking in everyday social situations with a new depth, a new detail, and a number of new techniques – literary techniques, such as free indirect discourse, not mathematical ones. But she was hardly the first in the field. As even Chwe acknowledges (as quickly as he can), literature has been exploring the mind, and strategic thinking in particular, for as long as it has existed. The Odyssey, the story of a master strategist, is the most obvious early example. But the whole history of stage comedy, with its tricky servants and maneuvering lovers, as well as of dramatic tragedy – Hamlet, Iago, Richard III, Edmund in King Lear (as well as Lear himself, as a failed example), not to mention Marlowe’s Barabas and Jonson’s Volpone – is replete with schemers. The ways that people try to use each other to achieve their ends, and the grief they often come to in the process, is a central subject of classical theater, as well as of a giant chunk of the other narrative genres.

But neither Homer, nor Shakespeare, nor Austen, nor any other writer worth their salt believed that people think only strategically. You see, game theory – at least in Chwe’s account – does not merely analyse strategic thought; it regards such thinking as the exclusive explanation of human behavior. Chwe runs through a series of alternatives – emotions, instincts, habits, rules, social factors, ideology, intoxication (not being in your right mind), the constraints of circumstance – claiming to show that Austen rejects them as possible sources of action. But Austen wasn’t dumb enough to think that people never act out of habit or instinct or sudden emotion. All Chwe really shows is that she thought they shouldn’t.

Austen knew, in other words, that human motivation is enormously complex. Reducing it to any single factor – well, for that you need a social scientist. Great literature has the power, through painstaking art, to fashion a convincing representation of human behavior in all its inextricable, mysterious, and endlessly ramifying mixture of sources. That is why it never becomes obsolete. What does become obsolete are the monocausal theories of people such as Chwe. Literature puts back everything the social sciences – by way of methodological simplification, or disciplinary ideology, or just plain foolishness – take out. That is why the finest literature responds to every monocausal theory you can throw at it. Shakespeare was a game theorist, too – and a neuroscientist, and a political scientist, and a Freudian, and a Marxist, and a Lacanian, and a Foucauldian, and all the -ists and -ians that we haven’t yet devised.

Though really, of course, he was none of these. He was a dramatist, just as Austen was a novelist. She didn’t write textbooks, she had no use for concepts, and she wasn’t interested in making arguments. If she had a research program, as Chwe insists, it was into the techniques of fiction and the possibilities of the English language. She was no more a social theorist than Marx or Weber was a novelist. Chwe has much to say about “cluelessness,” the inability to think strategically, another concept he insists that Austen pioneered. After cataloging five Austenian varieties of the phenomenon, he adds an equal number of his own. But he forgets a few. You can also be clueless because you have sworn allegiance to a theory, or because you never learned to handle the material in question, or because you didn’t do the work to find out what you’re talking about, or because you want to get an academic promotion and need to publish another book. Jane Austen, game theorist: as Mencken, the great American bard of cluelessness, said, “There is no idea so stupid that you can’t find a professor who will believe it.” Usually, of course, because he thought it up himself. 

 

Chwe’s book, apparently, has made a stir in social-scientific circles – that is, among the kind of readers who know even less about Jane Austen, and literature in general, than he does. A depressing enough thought, but what really bothers me is that his titular idea is the kind of effluent that contaminates the cultural water supply. Without even opening his book, a lot of otherwise intelligent people are going to go around believing that Jane Austen “was” a game theorist, just as lots of them undoubtedly believe that Proust “was” a neuroscientist. Which means that Chwe’s book, like Lehrer’s, reinforces the notion that art is merely a diffuse or coded form of scientific or social-scientific knowledge, and that its insights are valid only insofar as they approximate (or can be made to seem to approximate) those of those disciplines – or worse, the latest fashions in those disciplines.

Lehrer is pretty direct about this. Contemporary science is “true,” and that art is best which best accords with it. “Their art,” he writes in reference to the eight creators, largely modernist, whom he discusses in his book, “proved to be the most accurate, because they most explicitly anticipated our science.” Poor Sophocles, poor Rembrandt. But art is not about being accurate, the way that the solution to an equation can be accurate; it is about being expressive. Art does not have winners. Cézanne might have been “right” about the cognitive science of vision, as Lehrer tells us, but there are many ways, in art, of being right. Raphael, Vermeer, Turner, Matisse – they were also right, and still are.

Insofar as we do sometimes talk about art as if it had winners, it is not because of science. We speak of Shakespeare as supreme among the writers not because he had a systematic conception of human behavior (and if he did, it was probably the medieval theory of the humors), but because his work has been felt to constitute, persistently and by the widest number of people, the most profound and powerful representation – not explanation – of our shared experience. It doesn’t matter, in that respect, what science happens to believe today about the material substrate of that experience, which may not be what it will believe tomorrow. Whom would Lehrer have anointed, in the visual arts, if he had written half a century ago? Not Cézanne, just as it is likely not to be Cézanne half a century from now. Lehrer can point, in retrospect, to the art that best accords with the current state of scientific knowledge, but what about the artist who proposes something science hasn’t (yet) discovered? How can we guess what it will?

Lehrer belongs to the “we used to think ... now we know” school of science writing. He understands that scientific discoveries are always provisional, but he keeps pushing the recognition away. He also knows that art and science do not belong to the same order of knowledge, but he cannot sustain the idea. Although his writing is more stylish than Chwe’s, his command of his material is not much more sophisticated. Before the middle of the nineteenth century, Lehrer believes, the arts were merely “pretty or entertaining.” (You know – Goya, Beethoven, Swift.) Then came modernism, inspired by the science of its time (a claim he never supports and, in seeking to align his subjects with the science of our time, frequently contradicts). Lehrer is the kind of person who believes that people woke up on January 1, 1500 and started speaking Modern English. “Cézanne invented modernist art.” Stravinsky “steeped himself in angst.” As for Gertrude Stein, “after a few years, her revolution petered out, and writers went back to old-fashioned storytelling.” “All of these artists,” Lehrer tells us, “shared an abiding interest in human experience.” Really, all of these artists? “In a move of stunning arrogance and ambition, they tried to invent fictions that told the truth.” Too bad Dante never thought of that. Lehrer, innocent of subtlety or history or depth, with no idea of how much he doesn’t know, is like the college student who comes home for winter break, all eager to regurgitate the things he has learned in Freshman Humanities.

Like Chwe’s, his argument advances through hyperbole, self-contradiction, oversimplification, and sheer incoherence. Maybe those are no more than the failures of these two men in particular, but I think they point to something larger. I have read other efforts to analyse artistic phenomena in scientific terms – most notably, in the emerging field of literary Darwinism, itself an outgrowth of the highly dubious discipline of evolutionary psychology – and they tend to falter in the same sorts of ways. At best, they tell us things we already know – and know immensely better – through humanistic means. They are almost always either crushingly banal or desperately wrongheaded. Pride and Prejudice is about mate selection. Hamlet struggles to choose between personal and genetic self-interest: killing Claudius and usurping his throne (but the latter never crosses his mind) or letting Gertrude furnish him with siblings (though since Hamlet is already thirty, that hardly seems probable). Interpretive questions are not responsive to scientific methods. It isn’t even like using a chainsaw instead of a scalpel; it’s like using a chainsaw instead of a stethoscope. The instrument is not too crude; it is the wrong kind altogether.

The problem of “the two cultures” is not, in fact, a problem at all. There’s a reason that art and science are distinct. They don’t just work in different ways; they work on different things. Science addresses external reality, which lies outside our minds and makes itself available for objective observation. The arts address our experience of the world; they tell us what reality feels like. That is why the chain of consilience ruptures as we make the leap from material phenomena to the phenomena of art. Physics can explain chemistry, which can explain biology, which can explain psychology, and psychology might someday tell us, at least in the most general terms, how we create art and why we respond to it. But it will never account for the texture, the particularities, of individual works, or tell us what they mean. Nor will it explain the history of art: its moments, its movements, the evolution of its modes and styles, the labyrinths of influence that join its individual creators. The problem isn’t just that there is so much data that is unrecoverable. It’s too late now to turn up Sappho’s DNA, but even if we dug up Austen’s bones and sequenced her genome, it would never tell us why she wrote Persuasion, or how she came up with the opening of Pride and Prejudice, or what we are supposed to make of Emma. Art is experiential. It doesn’t just speak of experience; it needs to be experienced itself, inhabited in ways that proofs and formulae are not. And experience cannot be weighed or measured; it can only be evoked.

Scientists did not appreciate it when the “science studies” hucksters were attempting to usurp their turf, nor should they have. Even the disciples of consilience seemingly retain a dim awareness of the independent validity of humanistic knowledge, as witnessed by their tendency to appeal to Shakespeare – that is, to argue that he lends support to this or that contemporary view of human nature. So here is a suggestion: why not simply go to him directly, and learn what else he had to say? And after Shakespeare, you can turn to Virgil, and Goethe, and Tolstoy, and Rumi, and Murasaki – and, well, the possibilities are endless. You can give yourself, in other words, a humanistic education. And that is neither game nor theory.

William Deresiewicz is a contributing editor at The New Republic. His new book, Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life (Free Press), will be published this summer.


The end of solitude: in a hyperconnected world, are we losing the art of being alone?

In the end, Solitude feels a bit like an amiable cop-out. 

Michael Harris is a Canadian writer who lives in a big city and whose life is defined and circumscribed, as so many Western lives are now, by digital technologies. He finds it hard to leave his phone at home in case he misses anything. He worries about his social media reputation. He uses apps and plays games, and relies on the internet hive mind to tell him which films to watch or where to eat. Here is what happens when he goes on holiday to Paris:

Disembarking from the train from London, I invited a friendly app to guide me to a hotel near the Pompidou . . . The next morning, Yelp guided me towards a charming café in the Marais. There, wizard-like, I held my phone over the menu and waited for Google Translate to melt the words into English. When the waiter arrived, I spoke into my phone and had it repeat my words to the grinning garçon in a soft, robotic French. Later, at the Louvre, I allowed a Nintendo-sponsored guidance system to track my steps up the centuries-old Daru staircase as I squinted confusedly at its glowing blue you-are-here dot . . .

Terrifying, isn’t it? Well, I thought so as I read it, and Harris thought so afterwards. It was situations like this, during which he realised that his life was controlled, confined and monitored by distancing technologies, that led him to wonder whether solitude – the act and the art of being alone – was in danger of disappearing.

Harris has an intuition that being alone with ourselves, paying attention to inner silence and being able to experience outer silence, is an essential part of being human. He can remember how it felt to do this, before the internet brought its social anxiety and addiction into his life. “I began to remember,” he writes, “a calm separateness, a sureness I once could live inside for an easy hour at a time.”

What happens when that calm separateness is destroyed by the internet of everything, by big-city living, by the relentless compulsion to be with others, in touch, all the time? Plenty of people know the answer already, or would do if they were paying attention to the question. Nearly half of all Americans, Harris tells us, now sleep with their smartphones on their bedside table, and 80 per cent are on their phone within 15 minutes of waking up. Three-quarters of adults use social networking sites regularly. But this is peanuts compared to the galloping development of the so-called Internet of Things. Within the next few years, anything from 30 to 50 billion objects, from cars to shirts to bottles of shampoo, will be connected to the net. The internet will be all around you, whether you want it or not, and you will be caught in its mesh like a fly. It’s not called the web for nothing.

I may not be the ideal reader for this book. By page 20, after a few more facts of this sort, I had already found myself scrawling “Kill everyone!” in the margins. This is not really the author’s fault. I often start behaving like this whenever I’m forced to read a list of ways in which digital technology is wrecking human existence. There are lots of lists like this around at the moment, because the galloping, thoughtless, ongoing rush to connect everything to the web has overcome our society like a disease. Did you know that cows are now connected to the internet? On page 20, Harris tells us that some Swiss dairy cows, sim cards implanted in their necks, send text messages to their farmers when they are on heat and ready to be inseminated. If this doesn’t bring out your inner Unabomber, you’re probably beyond help. Or maybe I am.

What is the problem here? Why does this bother me, and why does it bother Harris? The answer is that all of these things intrude upon, and threaten to destroy, something ancient and hard to define, which is also the source of much of our creativity and the essence of our humanity. “Solitude,” Harris writes, “is a resource.” He likens it to an ecological niche, within which grow new ideas, an understanding of the self and therefore an understanding of others.

The book is full of examples of the genius that springs from silent and solitary moments. Beethoven, Dostoevsky, Kafka, Einstein, Newton – all developed their ideas and approach by withdrawing from the crowd. Peter Higgs, the Nobel Prize-winner who predicted the Higgs boson, did his best work in peace and solitude in the 1960s. He suggests that what he did then would be impossible today, because it is now virtually impossible to find such solitude in the field of science.

Collaboration, not individuality, is fetishised today, in business as in science and the arts, but Harris warns that collaboration often results in conformism. In the company of others, most of us succumb to pressure to go with the crowd. Alone, we have more chance to be thoughtful, to see differently, to enter a place where, free from the mob, we can attend to our unique experience of the world. Without solitude, he writes, genius – which ultimately springs from different ways of thinking and seeing – becomes impossible. If Thoreau’s cabin in the woods had had wifi, we would never have got Walden.

Yet it is not only geniuses who have a problem: ordinary minds like yours and mine are threatened by the hypersocial nature of always-on urbanity. A civilisation can be judged by the quality of its daydreams, Harris suggests. Who daydreams now? Instead of staring out of the window on a train, heads are buried in smartphones, or wired to the audio of a streaming film. Instead of idling at the bus stop, people are loading up entertainment: mobile games from King, the maker of Candy Crush, were played 1.6 billion times every day in the first quarter of 2015 alone.

If you’ve ever wondered at the behaviour of those lines of people at the train station or in the street or in the café, heads buried in their phones like zombies, unable or unwilling to look up, Harris confirms your worst fears. The developers of apps and games and social media sites are dedicated to trapping us in what are called ludic loops. These are short cycles of repeated actions which feed our brain’s desire for reward. Every point you score, every candy you crush, every retweet you get gives your brain a dopamine hit that keeps you coming back for more. You’re not having a bit of harmless fun: you are an addict. A tech corporation has taken your solitude and monetised it. It’s not the game that is being played – it’s you.

So, what is to be done about all this? That’s the multibillion-dollar question, but it is one the book cannot answer. Harris spends many pages putting together a case for the importance of solitude and examining the forces that splinter it today. Yet he also seems torn in determining how much of it he wants and can cope with. He can see the damage being done by the always-on world but he lives in the heart of it, all his friends are part of it, and he doesn’t want to stray too far away. He understands the value of being alone but doesn’t like it much, or want to experience it too often. He’ll stop checking his Twitter analytics but he won’t close down his account.

At the end of the book, Harris retreats, Thoreau-like, to a cabin in the woods for a week. As I read this brief last chapter, I found myself wishing it were the first, that he had spent more time in the cabin, that he had been starker and more exploratory, that he had gone further. Who will write a Walden for the Internet Age? This book is thick with fact and argument and some fine writing, but there is a depth that the author seems afraid to plumb. Perhaps he is afraid of what he might find down there.

In the end, Solitude feels a bit like an amiable cop-out. After 200 pages of increasingly disturbing facts about the impact of technology and crowded city living on everything from our reading habits to our ability to form friendships, and after warning us on the very last page that we risk making “an Easter Island of the mind”, the author goes back home to Vancouver, tells his boyfriend that he missed him, and then . . . well, then what? We don’t know. The book just ends. We are left with the impression that the pile-up of evidence leads to a conclusion too vast for the author, and perhaps his readers, to take in, because to do that would be to challenge everything.

In this, Solitude mirrors the structure of many other books of its type: the Non-Fiction Warning Book (NFWB), we might call it. It takes a subject – disappearing childhood; disappearing solitude; disappearing wilderness; disappearing anything, there’s so much to choose from – trots us through several hundred pages of anecdotes, science, interviews and stories, all of which build up to the inescapable conclusion that everything is screwed . . . and then pulls back. It’s like being teased by an expert hustler. Yes, technology is undermining our sense of self and creating havoc for our relationships with others, but the solution is not to stop using it, just to moderate it. Yes, overcrowded cities are destroying our minds and Planet Earth, but the solution is not to get out of the cities: it’s to moderate them in some way, somehow.

Moderation is always the demand of the NFWB, aimed as it is at mainstream readers who would like things to get better but who don’t really want to change much – or don’t know how to. This is not to condemn Harris, or his argument: most of us don’t want to change much or know how to. What books of this kind are dealing with is the problem of modernity, which is intractable and not open to moderation. Have a week away from your screen if you like, but the theft of human freedom by the machine will continue without you. The poet Robinson Jeffers once wrote about sitting on a mountain and looking down on the lights of a city, and being put in mind of a purse seine net, in which sardines swim unwittingly into a giant bag, which is then drawn tightly around them. “I thought, We have geared the machines and locked all together into interdependence; we have built the great cities; now/There is no escape,” he wrote. “The circle is closed, and the net/Is being hauled in.”

Under the circumstances – and these are our circumstances – the only honest conclusion to draw is that the problem, which is caused primarily by the technological direction of our society, is going to get worse. There is no credible scenario in which we can continue in the same direction and not see the problem of solitude, or lack of it, continue to deepen.

Knowing this, how can Harris just go home after a week away, drop off his bag and settle back into his hyperconnected city life? Does he not have a duty to rebel, and to tell us to rebel? Perhaps. The problem for this author is our shared problem, however, at a time in history when the dystopian predictions of Brave New World are already looking antiquated. Even if Harris wanted to rebel, he wouldn’t know how, because none of us would. Short of a collapse so severe that the electricity goes off permanently, there is no escape from what the tech corporations and their tame hive mind have planned for us. The circle is closed, and the net is being hauled in. May as well play another round of Candy Crush while we wait to be dragged up on to the deck. 

Paul Kingsnorth’s latest book is “Confessions of a Recovering Environmentalist” (Faber & Faber).

This article first appeared in the 20 April 2017 issue of the New Statesman, May's gamble
