
I’ve probably played over 10,000 hours of video games. I could be a concert pianist by now

Escaping into video games is something that people have been doing since video games were first invented. But is it time wasted, or valuable escapism?

Video games have great power, but it can be difficult to see from the perspective of a player. I didn’t see it myself until I saw what happened to my nephew, then three years old, when he was plonked down in front of my computer to play Minecraft. One second he’s an energetic babbling scamp, and the next he’s sat stock still in the chair, eyes locked on the screen, completely engaged with what is going on to the exclusion of everything else. The worrying realisation loomed that he could cheerfully stay sat forever in front of that computer playing games, circumstances permitting. Just as I could.

Escaping into video games is something that people have been doing since video games were first invented. Even the blockiest, bloopiest attempts at making digital entertainment could hook players. When you're properly absorbed in a game it's much like being completely engaged with any other activity, whether it's reading a book, playing a musical instrument or performing open heart surgery. The difference between a game and almost any other activity is that for many games the duration is potentially infinite: the connection can last hundreds, even thousands of hours. Almost no other recreational activity can sustain that kind of attention. Traditional media cannot provide the hours of entertainment that games can, exercise and sports are limited by physical exhaustion, and most other hobbies or activities would be impractical if pursued to the same extent. Even if one game gets dull there's usually another, and from simplistic mobile phone and web browser games to sprawling sandbox games and MMOs there is always something else around the corner for the determined games fan. Some of us are very determined indeed.

It has been said that you can become a concert pianist with 10,000 hours of practice. According to Steam, which has been tracking how much I play games on my PC since 2009, I've clocked in the ballpark of 6,000 hours on various games on that platform alone. Factor in the preceding 25 or so years of playing on top of that and I'm starting to think maybe somebody should book me in to Carnegie Hall so culture vultures can watch me play Dwarf Fortress in C minor.

Carnegie Hall won’t be calling. The simple fact is that though I’ve tried, and largely succeeded, to cram quality game playing time into every corner of my waking life (plus a mission in Kerbal Space Program during a particularly vivid and memorable dream), I’m not an outlier. There are men and women whose dedication to playing video games is almost beyond belief. Players who measure the time committed to single games, usually multiplayer games, in the thousands of hours, and are happy to do it.

Sometimes I wonder if I spent my time with games well. If there wasn’t something else I could have been doing. Maybe if I’d spent fifty fewer hours playing and fifty more hours at the gym I’d be in better shape. Maybe if I’d skipped a given game entirely, 200 hours or so might have been diverted to something more traditionally productive, like learning a foreign language or training a swan to bite the Queen. It is easy to juggle hypotheticals when the time spent is all laid out before us. Ultimately I don’t regret any of it.

People pouring their lives into an insatiable digital abyss could be seen as a waste, because, for all intents and purposes, it is a waste. Our greatest accomplishments in the field of gaming are just a hard disk failure or cloud save snafu away from obliteration. It could be argued that the two decades or so through which the video game has risen to prominence have created a boondoggle of incalculable proportions. Millions of hours across the world going into this sinkhole - had gamers only looked outwards, we might wonder, could they not have achieved something great? Could we not all have done something useful?

Well, no. To view time in games in those terms, as time lost, is to miss the point of playing the game in the first place. For me, playing a game is a way to recharge, to escape from a reality that is, let’s face it, not nearly as enticing as we were brought up to think it’d be.

We live in a world of suffering and injustice and it isn’t getting better. In general there’s climate change and wealth disparity, and on a personal level (spoiler alert) we’re all going to die eventually. Life, even in comparatively comfortable surroundings, is difficult and painful and it never ends well. The decades marked by the growth of video games have coincided with the collapse of traditional ideas of job security, with the near complete breakdown of trust in our political class and with the rise of a surveillance culture so comprehensive and intrusive that the Stasi would be telling us to steady on. We enjoy communications technology that prior generations could only have dreamed of, and what have we found out? That people all over the world are getting just as screwed as we are, or worse, and there’s no vote we can cast or placard we can march under that will help them. Video games are the most effective means of temporary escapism that humankind has ever developed that didn’t involve a syringe, and we need them now more than ever.

If you can wake up in the morning, look at the news, check your Twitter, know that your entire existence is fleeting and just deal with that, all day long, until you turn in at night ready to do it all the next day, more power to you. Me? I’m going to need to spend a chunk of that time pretending to be a magic Viking or something or else my head is going to explode, and if the millions of people spending millions of hours doing likewise are any indication I’m not the only one.

Phil Hartup is a freelance journalist with an interest in video gaming and culture


There are only two rules for an evening drink: it must be bitter, and it must be cold

A Negroni is the aperitif of choice in bars everywhere from London to Palermo - and no wonder.

The aperitif has the odd distinction of being the only alcohol that can always rely on a sober audience: it is the opener, the stimulant, a spur to the appetite for good food and good conversation. This preparatory beverage is considered the height of sophistication, and certainly nobody labouring in field or factory ever required a pep to their evening appetite. Still, to take a drink before one starts drinking is hardly clever behaviour. So why do it?

One reason is surely the wish to separate the working day from the evening’s leisure, an increasingly pressing matter as we lose the ability to switch off. This may change the nature of the aperitif, which was generally supposed to be light, in alcohol and character. Once, one was expected to quaff a pre-dinner drink and go in to dine with faculties and taste buds intact; now, it might be more important for those who want an uninterrupted meal to get preprandially plastered. That way, your colleagues may contact you but they won’t get much sense out of you, and pretty soon they’ll give up and bother someone else.

The nicest thing about the aperitif, and the most dangerous, is that it doesn’t follow rules. It’s meant to be low in alcohol, but nobody ever accused a gin and tonic or a Negroni (Campari, gin and vermouth in equal portions) of that failing; and sherry, which is a fabulous aperitif (not least because you can keep drinking it until the meal or the bottle ends), has more degrees of alcohol than most wines. An aperitif should not be heavily perfumed or flavoured, for fear of spoiling your palate, yet some people love pastis, the French aniseed drink that goes cloudy in water, and that you can practically smell across the Channel. They say the scent actually enhances appetite.

Really only two rules apply. An aperitif should be bitter – or, at any rate, it shouldn’t be sweet, whatever the fans of red vermouth may tell you. And it must be cold. Warm drinks such as Cognac and port are for after dinner. Not for nothing did Édith Piaf warble, in “Mon apéro”, about drowning her amorous disappointments in aperitifs: fail to cool your passions before sharing a table, and you belong with the barbarians.

On the other hand, conversing with your nearest over a small snack and an appropriate beverage, beyond the office and before the courtesies and complications of the dinner table, is the essence of cultured behaviour. If, as is sometimes thought, civilisation has a pinnacle, surely it has a chilled apéro carefully balanced on top.

The received wisdom is that the French and Italians, with their apéritifs and aperitivi, are the experts in these kinds of drinks. Certainly the latter are partial to their Aperol spritzes, and the former to such horrid, wine-based tipples as Lillet and Dubonnet. But the English are good at gin and the Americans invented the Martini. As for Spain, tapas were originally snacks atop a covering that kept the flies out of one’s pre-dinner drink: tapa means lid.

Everywhere, it seems, as evening approaches, people crave a drink that in turn will make them salivate: bitterness, the experts tell us, prepares the mouth to welcome food. The word “bitter” may come from “bite”, in which case the aperitif’s place before dinner is assured.

I like to think that a good one enables the drinker to drown all sour feelings, and go in to dinner cleansed and purified. Fanciful, perhaps. But what better lure to fancy than a beverage that exists only to bring on the evening’s pleasures?

Nina Caplan is the 2014 Fortnum & Mason Drink Writer of the Year and 2014 Louis Roederer International Wine Columnist of the Year for her columns on drink in the New Statesman. She tweets as @NinaCaplan.

This article first appeared in the 22 September 2016 issue of the New Statesman, The New Times