Robots assemble a car. Photo: Camera Press

Reign of the robots: how to live in the machine age

By using ever more machines, we lose not only physical skills but also cognitive faculties.

The Glass Cage: Where Automation is Taking Us
Nicholas Carr
Bodley Head, 299pp, £20

The Second Machine Age: Work, Progress and Prosperity in a Time of Brilliant Technologies
Erik Brynjolfsson and Andrew McAfee
WW Norton, 320pp, £17.99

There is a Mitchell and Webb sketch in which two cavemen sit at a work table making tools. Big Feet (Webb) chips the flint, Red Beard (Mitchell) ties it to sticks. Red Beard reminds Big Feet that they won’t be getting much done today, because today is Bronze Orientation Day. “I’m so sick of hearing about bronze,” says Big Feet. “What’s wrong with stone? Does stone not work all of a sudden?” Red Beard is more open-minded: “They say it will revolutionise the way we hunter-gather.”

The leader of Bronze Orientation Day is Hairy Back, who is breathlessly enthusiastic and talks in slogans. “My message to you is this: Don’t Be Afraid of Bronze. Bronze Is Brilliant.” Foremost among its virtues, he says, is that it doesn’t need to be chipped. Big Feet is understandably concerned to hear this, but Hairy Back confirms the bad news with ruthless cheer: “Old-fashioned chipping is a thing of the past. Have you thought about retraining as a smelter?”

The sketch is eloquent about our relationship with technological change. New technologies don’t just disrupt old ways of working, they split us into tribes: those who resist disruption to old ways, and those who insist that the new must always be embraced. Digital technology has come at us in such a rush in recent years that the polarisation is particularly acute. Most books about its effects on society are polemics, written by a follower of either Hairy Back or Big Feet: the internet will save the world, or destroy everything you love, depending on which. But now that personal computers are approaching middle age, perhaps we can have a more mature debate, one that presumes every new technology comes with a credit and debit sheet attached.

The Glass Cage by Nicholas Carr provides a fine example. Carr’s previous book The Shallows argued that the internet is making us stupid, by turning us into a twitchy, distractible species capable of little more than clicking on someone else’s answers. The Shallows was rather one-sided: it underestimated the capacity of the web, used thoughtfully, to extend and deepen our thinking. Carr is an avid internet user, and if his thesis were correct he should hardly have been able to write another book. But I am glad he did, because The Glass Cage, a more nuanced account of human cognition in the age of digital automation, is very good.

The Glass Cage warns that we may be creating a technological environment that erodes our skills, anaesthetises our curiosity and dims our critical faculties. From airline cockpits to central heating systems, cars and phones, we are swaddling ourselves in technologies whose workings we don’t understand, and that ask so little of us that we feel no need to enquire further. Carr quotes the technology historian George Dyson: “What if the cost of machines that think is people who don’t?”

In Seattle in 2008 the driver of a 12-foot-high school bus ran it into a nine-foot-high bridge. The top of the bus was sheared off and 16 students were injured. The driver later told police that he hadn’t seen the signs and flashing lights warning him of the bridge ahead because he was following GPS instructions. In 2009 Air France Flight 447 crashed into the Atlantic, killing all 228 of its passengers and crew. A subsequent investigation showed the aircraft had run into a storm that caused the autopilot to disengage, which threw the plane’s human pilots into a panic. In the words of the report, the crew suffered a “total loss of cognitive control of the situation”.

Carr acknowledges that digital automation has benefits, and to argue otherwise would be absurd. Computers remove the need for us to carry out tedious tasks; they make fewer errors than we do, perform miraculous feats of calculation and save lives. High-profile exceptions notwithstanding, you are less likely to be killed in an airline crash now than at any other time in the history of aviation. But none of this is to say we should not interrogate automation’s downsides. If we don’t, we may end up making ourselves redundant.

In The Second Machine Age, one of last year’s most important books, the economists Erik Brynjolfsson and Andrew McAfee invite us to think about the relationship between human beings and technology as a race in which we are competing with machines for the best jobs. The race has had two stages. The first, which started with the Industrial Revolution, was mechanical. In the workplace and the home, machines took over the heavy lifting, performing physically demanding and repetitive tasks more reliably and efficiently than people. In the short term this created human losers such as the Luddite weavers, but in the long run many more of us were winners.

New technologies made some jobs obsolete but created whole new categories of employment, and the newer jobs have, on the whole, been more productive and better paid. They have also been more interesting: human beings have responded to the challenge of machines by cultivating brain over brawn. The grandchildren of Luddite weavers became factory managers; the children of typists in the 1960s became software engineers. A vast and prosperous middle class was created – if “middle class” means anything it indicates the ability to make a living from your mind rather than your muscle.

Understandably, given how well the past two centuries turned out, it has become almost heretical among economists and policymakers to suggest that technological automation is anything but beneficial, at least in the long run (as Hairy Back might put it, Automation is Awesome). So it is brave of Brynjolfsson and McAfee to contend that this time the machines really have put humanity on notice. The second stage of the race has begun, and we are in danger of losing it.

Human beings won the race with the machines of the Industrial Revolution by cultivating their intelligence. But information technology automates mental, not just manual, tasks, and now, thanks to a huge increase in computing power and the sheer number of interconnected devices, the machines are catching up.

The digital sphere is expanding at a dizzying rate: from music to agriculture to the military, ever more sophisticated cognitive labour is performed by algorithm. As the venture capitalist Marc Andreessen said, “software is eating the world”.

We appear to be next on the menu. It’s no coincidence that middle-class incomes are falling at a time when complex production and logistics processes are running smoothly with minimal human oversight, and hi-tech companies with small staffs are conquering the world at the expense of companies that rely heavily on cumbersome and temperamental meat-based robots . . . excuse me, people.

Brynjolfsson and McAfee sketch a potential future in which corporate profits rise higher than ever and an elite of robot-owners grows phenomenally rich while the rest of us wonder what to do with our time or how to feed our families. You might say that’s a fair description of the present. They would say you ain’t seen nothing yet. They note that there is no iron (nor bronze) law of economics that says most people benefit from technological progress, even if that has been true to date. It is quite possible that, to adapt Keynes, in the long, long run we are all obsolete.

Despite this, Brynjolfsson and McAfee are optimistic. They advise us to make the most of the capabilities that remain uniquely human: inventiveness, empathy, an ability to cope with the unpredictable. To ensure that machines remain in the service of human happiness, we need to play to our strengths.

Sensible as that sounds, something tells me the robots have thought this one through. By taking so much out of our hands, and now our brains, they are neutering the very capabilities that might enable us to outrun them. The long-recognised problem of “deskilling” – as factories replace skilled labour with machines, workers become less skilled and therefore more dispensable – now applies to the very skills that ought to give us an edge over our machines. We are easing ourselves into a dependency trap.

Take navigation, a fascinating discussion of which appears in The Glass Cage. Mapping apps are a godsend. I do a lot of walking around London, the city where I live, and I rely much of the time on a blue dot to tell me where to go, which makes it much easier to get around (my sense of direction barely deserves the name). But Carr points to a hidden cost. Cognitive science has established, as a general principle, that the less you exercise a mental skill, the worse you get at it, and that this applies to the skill of mentally mapping space. Maybe that’s OK, as long as my battery doesn’t run out. But then again, maybe not.

The neural networks we employ to find our way through space are the same as those used in forming memories (memory seems to have begun in the need to find one’s way back to the right cave). Véronique Bohbot, a professor of psychiatry at McGill University, has found that the way people exercise their navigational skills affects the functioning of the hippocampus, a part of the brain pivotal to memory; studies have famously found the posterior hippocampus to be larger in the brains of London cabbies.

Bohbot also discovered that the more effort a person makes to build cognitive maps of space, the stronger his memory circuits become. When I follow the dot, I am not engaging my hippocampus. Consequently, I may be allowing age to ravage my memory faster than it would do otherwise. Bohbot worries that over the next 20 years, because society is increasingly geared towards shrinking the hippocampus (Uber drivers, like everyone else, use GPS), dementia will occur earlier and earlier.

Does this mean I’m going to delete Google Maps from my phone? No. But I will use it only when I absolutely have to. In other words, I will use it to augment, rather than replace, my meagre capacity for navigation. My aim is to avoid a relationship with my phone in which I become increasingly lost without it.

Just as it’s hard to say no to a good app, so the greater the degree of automation in society, the more likely we are to regard it as a force of nature, resistance to which is as rational as dancing a rainstorm away. But let’s recall Stevie Wonder’s definition of superstition: to believe in things that you don’t understand. Technophiles are fond of saying that the best technology is indistinguishable from magic. It is useful to remember, however, that magic relies on tricks, and the sleight of hand performed by digital technology is to hide from us what we are losing. Steve Jobs loved to say of a new Apple product, “It just works.” I, too, love technology that just works, but that’s not to say there isn’t a price to pay for tools that efface their very existence.

If the Luddites underestimated the benefits of innovation, they did at least regard the introduction of novel technologies as a choice, and in that sense, Carr points out, they were more rational than those today who insist that we hand over more tasks to machines. It has somehow become accepted that if a machine can do something, it should do it. This is the attitude of a child, cooing at every shiny thing that crosses her path, sticking it in her mouth even at the risk of choking.

Even the Luddites weren’t Luddites. Carr reminds us that the original gang (named after a legendary, possibly mythical Leicestershire machine-breaker, Ned Ludd) didn’t go around demolishing things because they hated machines, but because they were trying to prevent another form of destruction. They were out to protect a way of life that was bound up with the practice of their craft. They took pride in being good at something difficult, and enjoyed the security, respect and independence it brought. They didn’t hate what was new; they just loved what they had.

In 1958, New York’s modern master planner Robert Moses proposed to blast a highway through Greenwich Village, scattering its communities in order to make room for the inevitable technology of its day, the automobile. Moses had already built a highway through the Bronx, which never recovered from it. His plan for the Village was defeated by an alliance of local residents, including the urban philosopher Jane Jacobs, who articulated what would be lost in unforgettable terms: “the sidewalk ballet”, the dense web of glances, handshakes and hellos that constitutes city life at its most creative and fulfilling.

With digital technology today we are roughly at the stage we were with the car in the 1950s – dazzled by its possibilities and unwilling to think seriously about its costs, which is another way of saying we haven’t thought about how to maximise its benefits. Tools, whether they are made of flint or silicon, should be deployed to extend our potential, not erase it. Hunter-gathering has been revolutionised many times over but we still have the job of being human. It’s up to us to define the scope of work.

Ian Leslie is the author of “Curious: the Desire to Know and Why Your Future Depends on It” (Quercus)


This article first appeared in the 16 January 2015 issue of the New Statesman, The Jihadis Among Us
