
Why curiosity will rule the modern world

We ought to be doing everything we can to foster curiosity but we undervalue and misunderstand it.

Before the internet and before the printing press, knowledge was the preserve of the 1 per cent. Books were the super yachts of 17th-century kings. The advent of the printing press and, latterly, the worldwide web has broken this monopoly. Today, in a world where vast inequalities in access to information are finally being levelled, a new cognitive divide is emerging: between the curious and the incurious.

Twenty-first-century economies are rewarding those who have an unquenchable desire to discover, learn and accumulate a wide range of knowledge. It’s no longer just about who or what you know, but how much you want to know. The curious are more likely to stay in education for longer. A hungry mind isn’t the only trait you need to do well at school, but according to Sophie von Stumm of Goldsmiths, University of London, it is the best single predictor of achievement, allied as it is with the other two quantifiably important traits: intelligence and conscientiousness.

Across the developed world, the cost of university is rising, but the cost of not going to university is rising faster. In the UK, according to a report commissioned by the Department for Business, Innovation and Skills, workers with degrees earn 27 per cent more than those with only A-levels. To put it another way, not having a degree costs £210,000 over the course of a lifetime.

Once at university, students find that individual differences in their cognitive ability flatten out (you had to be reasonably smart to get there) and an inner drive to learn becomes even more important. Curiosity can make the difference between classes of degree, and that matters: workers with Firsts or Upper Seconds earn on average £80,000 more over a lifetime than those with 2:2s or lower.

Students entering the workplace need to stay curious, because the wages for routine intellectual work, even in professional industries such as accountancy and law, are falling. Technology is rapidly taking over tasks historically performed by human beings, and it’s no longer enough to be merely competent or smart: computers are both. But no computer can yet be said to be curious. As the technology writer Kevin Kelly puts it, “Machines are for answers; humans are for questions.”

Industries are growing more complex and unpredictable and employers are increasingly looking for curious learners: people with an aptitude for cognitively demanding work and a thirst for knowledge. This applies even in areas not previously thought to be intellectually challenging.

Take football. Managers are no longer there just to pick the team or give a rousing half-time talk. They need to be tactically sophisticated, statistically literate and financially astute. Asked why there is a trend towards footballers with short playing careers becoming managers, José Mourinho replied, “More time to study.”

Curious people are often good at solving problems for their employers, because they’re really solving them for themselves. When confident that others are working on the same problem, most people cut themselves slack. Highly curious people form an exception to this rule.

We ought to be doing everything we can to foster curiosity but we undervalue and misunderstand it. Our education system is increasingly focused on preparing students for specific jobs. To teach someone to be an engineer or a lawyer is not the same as teaching them to be a curious learner – yet often the best engineers and lawyers are the most curious learners.

If there is one idea we need to be rid of it’s “natural curiosity”. Ever since Rousseau created Émile, the boy who learns best when left alone, we’ve been in love with the idea that children thrive when parents and teachers get out of the way. All the scientific evidence suggests the opposite.

Curiosity may be a fundamental human trait but intellectual curiosity is hard work. It requires a willingness to learn things that can seem pointless at the time but turn out to be useful later, and to perform boring tasks such as writing out equations over and over again. This is dependent on the guidance of adults and experts.

The web is just as likely to neuter curiosity as supercharge it. It presents us with more opportunities to learn than ever before, and also to watch endless videos of kittens. Those who acquire the habits of intellectual curiosity early on will use computers to learn throughout their lives; those who don’t may find they are replaced by one, having had their curiosity killed by cats.

Ian Leslie is the author of Curious: The Desire to Know and Why Your Future Depends on It (Quercus, £10.99)


This article first appeared in the 21 May 2014 issue of the New Statesman, Peak Ukip


It’s been 25 years since the Super Nintendo and Sega Mega Drive were released – what’s changed?

Gaming may be a lonelier pursuit now, but there have been positive changes you can console yourselves with too.

Let's not act as if neither of us knows anything about gaming, regardless of how old we are. Surely you'll remember the Super Nintendo console (SNES) and Sega's Mega Drive (or Genesis, if you're an American)? Well, it's now been 25 years since they were released. OK, fine, it's been 25 years since the SNES' debut in Japan, whereas the Mega Drive was released 25 years ago only in Europe, having arrived in Asia and North America a bit earlier, but you get the idea.

Sonic the Hedgehog by Sega

It's amazing to think a quarter of a century has passed since these digital delights went on sale, with both corporate heavyweights ready for battle. Sega jumped into the new era by bundling Sonic, their prized blue mascot, while Nintendo retaliated by including a Mario title with their console.

Today's equivalent console battle involves (primarily) Sony and Microsoft, each trying to entice customers with similar titles and features unique to the PlayStation 4 (PS4) or Xbox One. Back then, however, Nintendo focused on younger gamers, or rather family-friendly audiences (and still does), thanks to the endless worlds provided by Super Mario World, while Sega marketed its device to older audiences with popular action titles such as Shinobi and Altered Beast.

Donkey Kong Country by Rare

But there was one thing the Mega Drive had going for it that made it my favourite console ever: speed. The original Sonic the Hedgehog was blazingly fast compared with anything I had ever seen before, and the sunny background music helped calm my nerves and the urge to race through the game without care. What the SNES offered instead was better visuals. Just look at the 3D characters and scenery in Donkey Kong Country. No wonder it ended up becoming the second best-selling game for the console.

Street Fighter II by Capcom

The contest between Sega and Nintendo was rough, but Nintendo ultimately came out ahead thanks to significant titles released later, none demonstrating this better than Capcom's classic fighting game Street Fighter II. Here was a game that flooded arcade floors across the world and allowed friends to play side by side against each other.

The frantic sights and sounds of the 16-bit era of gaming completely changed many people's lives, including my own, and the industry as a whole. My siblings and I still fondly remember our parents buying different consoles (thankfully we were saved from owning a Dreamcast or Saturn). Whether it was the built-in version of Sonic on the Master System or the pain-in-the-ass difficulty of the Black Belt, My Hero or Asterix titles, our eyes were glued to the screen more than Live & Kicking ever managed on a Saturday morning.

The Sims 4 by Maxis

Today's console games are hyper-realistic, whether in serious ways, such as the over-the-top fatalities in modern Mortal Kombat games, or for comedy, such as having to monitor characters' urine levels in The Sims 4. That forgotten generation of 90s gaming provided just enough visual cues for players to understand what was happening, letting a new world form in our minds, like a good graphic novel.

I'm not at all saying gaming has become better or worse, but it is different. While advantages have been gained over the years, such as the time a child asked me if I was gay during a Halo 3 battle online, there are very few chances to bond with someone over what's glaring from the same TV screen, other than during "Netflix and chill".

Wipeout Pure by Sony

This is where the classics of previous eras win for emotional value over today's blockbuster games. Working with my brother to complete Streets of Rage, Two Crude Dudes or even the first Halo was a draining, adventurous journey, with all the ups and downs of a Hollywood epic. I was just as enthralled watching him navigate away from the baddies, pushing Mario to higher and higher platforms in Super Mario World on the SNES, just before breaking the fast.

It's no surprise YouTube's Let's Play culture is so popular. Solo experiences such as Ico and Wipeout Pure can be mind-bending journeys too, into environments that films could not even remotely compete with.

But here's the thing: playing with friends in the same room was a big social occasion. Now, even the latest Halo game assumes you no longer want physical contact with your chums, restricting you to playing with them online rather than in their company.

Halo: Combat Evolved by Bungie

This is odd, given that I, like many others, only ever played the original title as part of an effective duo. Somehow these sorts of games have become simultaneously lonely and social, unless one of you decides to carry out the logistical nightmare of hooking up a second TV and console next to the one already in your living room.

This is why handhelds such as the Game Boy and PSP were so popular, forcing you to move your backside to strengthen your friendship. That was the whole point of the end-of-year "games days" in primary school, after all.

Mario Kart 8 by Nintendo

The industry can learn one or two things by seeing what made certain titles successful. It's why the Wii U – despite its poor sales performance compared with the PS4 – is an excellent party console, allowing you to blame a friend for your pitfalls in the latest Donkey Kong game. Or you can taunt them no end in Mario Kart 8, the console's best-selling game, which is ironic given that its crucial local multiplayer feature lets one copy entertain a whole room, so you'd expect fewer physical copies in the wild.

In the same way social media makes it seem like you have loads of friends until you try to recall the last time you saw them, gaming has undergone tremendous change through the advent of the internet. But the best games are always the ones you remember playing with someone by your side.