Here come the supertaskers

New technologies and social media are training up the next generation of superbrains, but are young people losing the capacity for empathy?

Young people today are dangerously self-obsessed, over-cosseted and computer-addled - or so the media would have us believe. Recent science stories seem to confirm popular concerns about the feckless brains of Generation Whatever (to use the latest label). But we are not getting the whole story.

On 29 May, British newspapers rushed to report on a study by Sara Konrath, a University of Michigan researcher, showing that current college students are lacking in empathy compared to their predecessors. The study concludes that "college kids today are about 40 per cent lower in empathy". The biggest fall came after the year 2000 - the advent of mass connectivity - according to the survey of 14,000 personality tests over the past three decades. Konrath says that modern students are far less likely to agree with lines such as "I sometimes try to understand my friends better by imagining how things look from their perspective" and "I often have tender, concerned feelings for people less fortunate than me".

Presenting their findings on 27 May, Konrath and her colleague Edward O'Brien told the US Association for Psychological Science that the rise of social media seemed to be a factor: "The ease of having 'friends' online might make people more likely to tune out when they don't feel like responding to others' problems - a behaviour that could carry over offline." Thus, "many people", Konrath said, "see the current group of college students as one of the most self-centred, narcissistic, competitive, confident and individualistic in recent history".

It is a popular impression. Not only is Generation Whatever accused of unprecedented selfishness, but we are told that it is getting increasingly stupid. Again, technology is blamed. In June, for example, a Duke University study found that having home computers and broadband lowers students' scores in reading and maths - particularly if they don't have the sort of middle-class parents who nag them to lay off the messaging and gaming.

Techy teens

But are these concerns new? "I see no hope for the future of our people if they are dependent on the frivolous youth of today, for certainly all youth are reckless beyond words . . . and impatient of restraint." That was the poet Hesiod in the 8th century BC.

The human basics may have changed very little - but that does not make headlines. In March, the British media ignored a University of Western Ontario study of 477,380 high-school seniors in 1976 and 2006, which found that Generation Whatever looks very similar to youth from the mid-1970s. The main difference, said the study published in Perspectives on Psychological Science, is that the new generation of young people has higher expectations of its education and is less trustful of government. So perhaps it cares more.

Jeroen Boschma, creative director of the advertising firm Keesie, based in Rotterdam, believes as much. He told the Spanish newspaper El País the story of how he interviewed a 17-year-old for a job and asked him a tough technical question to see how he would react. The candidate did not know the answer, but requested a minute to find out, consulted an online forum and got more than 100 informed responses from across the world.

In 2006, Boschma published the book Generation Einstein: Smarter, Faster and More Socially Aware, to loud media buzz. He believes that rapid-paced technology has imbued these so-called "digital natives" with new qualities: they challenge authority and are highly pragmatic in dealing with information. "This sets them apart from any other generation and has consequences that are by no means trivial."

Certainly, young people are politically engaged on a scale unseen since the 1960s, thanks to their ability to clamber on to the internet's global soapbox. For example, when Farouk Olu Aregbe, a recent graduate in the US, set up the One Million Strong for Obama Facebook group, it rapidly gained 820,000 members. And in Britain, pressure from a 5,000-strong Facebook group forced HSBC to stop charging interest on graduates' overdrafts.

The laptop revolutionaries can also be altruistic. Twitter and Facebook were still primarily driven by college students when these networks overwhelmed the Red Cross with millions of texted $10 gifts to Haitian earthquake relief. Digital networking, far from merely fostering passivity, has created a generation that can engage vigorously and fast. Empathy has not disappeared - it is simply taking different forms.

And it's not just empathy that is changing. The notion that the human population is developing a different kind of intelligence has also become widespread. Studies by James Flynn, a professor of political studies at the University of Otago, New Zealand, who specialises in measuring intelligence, show a consistent rise in global IQ performance of roughly three points per decade, in some cases going back to the early 20th century. This implies that, over the past 100 years, the IQs of people (predominantly in the west) have risen by about 30 points, an observation known as the "Flynn effect".

Flynn believes that our brains have changed in recent decades because TV, computers and social networking challenge the brain in new ways and for far longer periods of time. Those challenges are developing quickly. The plotlines for The Wire are infinitely more complex than those of, say, The Good Life in the 1970s. Games such as Civilization IV re-create human economic and technological history, challenging teens to work out whether they should develop an agrarian capitalist society or a monarchy.

But Flynn argues that his "effect" does not show a genetic increase in intelligence per se. It is the product of a bias in IQ tests towards abstract-reasoning intelligence. Our brains are becoming more creative, but this is perhaps at the cost of older, everyday skills.

This theory is echoed by Gary Small, professor of psychiatry at the University of California Los Angeles and author of iBrain: Surviving the Technological Alteration of the Modern Mind. He believes the generation that has grown up using computers is having a harder time reading social cues. "Even though [they] are very good with the tech skills, they are weak with the face-to-face human contact skills," he told the New York Times in April.

Such shifts in consciousness are not without peril. Two recently published studies - by the University of California and the University of Southern California - indicate that our constant diet of digital news is beginning to move faster than our ability to make moral judgements. Rapid info-bursts of stabbings, suffering and war are consumed but may not make us indignant, compassionate or inspired.

Yet there is evidence, too, that the human brain is advancing its ability to sift information quickly. We appear to be evolving rapidly under pressure from unprecedented demands, using evolutionary mechanisms we are just beginning to understand. One is called epigenetics - a frontier science that is revealing how the changes we experience in our brains during our lives do not simply go to the grave with us, but can be passed on to our offspring.

Scientists are also discovering that the brain retains high levels of plasticity throughout our lives, particularly if we keep challenging it with new learning.


Tomorrow's people may already be buzzing away among us. They will include the "supertaskers". For most of us, multitasking is tough. Trials show that it tends to result in two things done poorly rather than one done well. But one in 40 people appears immune to this problem. These lucky speed-freaks can, for example, drive and talk on a mobile phone at the same time without loss of concentration on either task, according to tests on 200 people by the University of Utah psychologist Jason Watson.

Supertaskers constitute only 2.5 per cent of the population, Watson believes. But even that level is surprisingly high. "According to cognitive theory, these individuals ought not to exist," he says in a paper soon to be published in Psychonomic Bulletin & Review. Further research into supertaskers may reveal how the multitasking regions of their brains are different, due to some inherited variation. Watson predicts that employers in high-performance professions will want to screen for genetic markers of supertasking ability. Generation Whatever's multi-mediated brains may be the key to our ever-faster future.

But even in a hyper-accelerated culture, someone is going to have to pay close attention to socially indispensable matters such as law, politics, academia and medicine - disciplines that demand conscientiousness and a gimlet eye for mono-tasking detail. Old-brainers, the over-thirties, aren't out of business yet. So we should not be so snippy about welcoming the children of the network-minded generation, even if we don't understand their ways.

John Naish is the author of "Enough: Breaking Free from the World of More" (Hodder, £7.99)

This article first appeared in the 12 July 2010 issue of the New Statesman, Behind the mask


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have neither fixed workplaces nor regular colleagues and regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.



My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood