Why I won't be buying an Xbox One or a PS4

As a veteran of many Console Wars, Alan Williamson believes that the best console is the one you have with you.

This year brings the most boring games console launch in history. I don’t mean that in a hyperbolic, share-this-incendiary-link-with-your-friends way: having lived through and been an active combatant in four generations of console wars, like many former soldiers I have now become an advocate for peace.

The First Great World Console War broke out in the early Nineties between Sega’s Mega Drive and the Super Nintendo Entertainment System (Nintendo’s progress in the 1980s was more a swift annexation). Both manufacturers were then broadsided by the introduction of the Sony PlayStation in 1995. Since then we have descended into effective Cold War, an ever-escalating technological arms race between equally weighted armies with few casualties. While there was isolated fighting in smaller Handheld Console Wars, a gaming Vietnam where Pokémon waged a guerrilla war for children’s minds, the Fourth World Console War ended with over eighty million consoles sold for each belligerent. I’d make an analogy about PCs and the United Nations, but I think the metaphor is already stretched to breaking at this point.

Each generation of the Console Wars had its own innovations, each console its own personality and fan base. The second saw the birth of affordable 3D graphics and some of the most critically acclaimed games of all time, such as The Legend of Zelda: Ocarina of Time, Metal Gear Solid and Final Fantasy VII. The third was dominated by Sony’s PlayStation 2 (don’t blame me, I had a Dreamcast) but also saw the beginnings of online play and the rise of Xbox megabrand Halo. We’re now at the tail end of the fourth generation, which has brought consoles mostly up to speed with PCs through high-definition graphics and digital distribution of games.

The war has reached a stalemate: Sony and Microsoft still rule in their home countries, while a Nintendo Wii gathers dust in every living room of the developed world. As a consequence of the global recession, game publishers have tried to extract even more revenue from a squeezed market. Witness the rise and demise of the loathed ‘online pass’, now replaced by the euphemistic ‘season pass’; paid downloads on day one that unlock content already on the disc; pre-order bonuses; and free-to-play games - or as I like to call them, pay-to-play games. History will show this generation as one that expanded the monetisation of games as much as the experiences themselves, often to the detriment of fun and artistic merit.

So here we go again with two new omnipotent wonderboxes, the PlayStation 4 and Xbox One. The games look much the same as the old ones: of course, similar criticisms were levelled at the Xbox 360, but even a layman could appreciate the beauty of Project Gotham Racing 3 compared to its predecessor. Perhaps it was the somewhat-sweaty razzmatazz of Earl’s Court at Eurogamer Expo in September, but I just couldn’t tell the difference between Forza Motorsport 5 and Forza 4, and I’ve played over fifty hours of Forza 4. With big franchises like Call of Duty and Assassin’s Creed launching on both new consoles as well as on their predecessors, the choice is based more on ideology and available funds than on real, quantifiable differences. In fact, with so few games available, it makes less sense to buy a PS4 than a PS3.

Games journalists are in a difficult position. Those who haven’t been invited to New York for a complimentary gold-plated PS4 (I’m writing on a train travelling through Slough, but thanks for asking) will have pre-ordered the expensive new consoles to support their job, adding buyer’s remorse to an increasingly dominant business model of offering their words online for free, funded by advertising. This model relies on hyperbole: every month a new ‘best game ever’, every day an announcement of something on the horizon, every minute a constant stream of rumour posts. It focuses on sensationalising games and the machines that play them, rather than criticising them. It encourages critics to follow the zeitgeist rather than dwelling on the games that linger in our minds. It blurs the lines between editorial and advertorial: after all, what is a news post about a PlayStation TV commercial if not advertising?

The YouTubeisation of publishing elevates every berk with a webcam and an opinion to the same level as the seasoned journalist. The Twitterisation of news encourages news-breaking, but not fact-checking. This isn’t unique to coverage of videogames, but the medium is entangled with technology and therefore at the forefront of innovations in publishing. Meanwhile, channels like PlayStation Access and Nintendo Direct show that publishers can successfully skip the middleman and advertise directly to customers. A new generation of games consoles deserves newer, deeper ways of looking at them - something outlets like Press Select, Boss Fight Books and my own Five out of Ten are trying to address. But we may be at the stage where readers are better served by the groupthink of their peers than the proclamations of journalists, where real decisions are made and discussions are had below the line.

While we’re keen to proclaim that videogames now generate more revenue than cinema, few have asked whether that is sustainable or even desirable. While digital distribution has led to a bigger market for indie developers, especially on the PC and smartphones, the biggest successes like Call of Duty and Battlefield are better for their publishers than for the people who make them. In the UK we’ve seen the closure of much-loved studios like Blitz Games and Sony Liverpool, while other British studios like Rare have lost their lustre; the team that made the charming Banjo-Kazooie and Viva Piñata now produces bland sports titles, a reflection of the wishes of its corporate overlord Microsoft. The industry undervalues its creators and programmers, encouraging a ‘crunch’ culture with unpaid overtime and ridiculous hours. This distributor-takes-all system reminds me of the Hollywood visual effects studio Rhythm and Hues, which won an Oscar for Life of Pi before declaring bankruptcy. If videogames really are such an important industry, they can do a lot better than emulating Hollywood in content and working culture.

Some pundits believe this may be the last console generation. I’d like to believe otherwise. I have fond memories of consoles and continue to make more: they provide a cheaper entry point into the fantastic worlds of fiction that games offer, without the expense or complexities of a PC. Yet perhaps games have outgrown the traditional model of consoles: the exponential growth of indie games is better suited to the less restrictive system of a personal computer, mobile phone or even the Ouya ‘microconsole’. Valve’s SteamOS promises the power of Linux married with their friendly distribution platform. While Sony and Microsoft are taking steps to open development on their consoles, their revenue model is built on strict control of the system: they focus on making money off the games they sell, not the platform itself.

Even more exciting are devices like the Oculus Rift, a virtual reality headset that offers a sea change in the way we play games. However, according to its creators it requires tremendous computer horsepower to be convincing - more than even the Xbox One and PS4 can provide. For years, consoles offered the best way to play games, but with that advantage gone they’re like digital cameras in a world where everyone has a camera built into their phone. I choose that analogy carefully, because I think portable consoles like the Nintendo 3DS are much better than an iPhone for games, but there’s a trade-off between quality and the utility of having an all-in-one device. The best console is the one you have with you.

“War never changes,” mused Ron Perlman in the introduction to 2008’s Fallout 3. But this is a war that needs to change if games consoles are to expand, or merely retain their cultural relevance. Consoles used to represent inclusivity and the comforts of socialising with friends, but now they are targeted at an audience - and a medium - which is growing up and leaving them behind.

What should I buy?

I don’t play games, but I’d like to start

Nintendo’s latest console, the Wii U, is an underrated box. It’s cheaper than the competition and can play older Wii games as well as its new, shinier ones. Nintendo still make the best games, appealing to both children and adults like the videogame equivalent of Pixar. Unfortunately, also like Pixar, they release one great product every two years. Super Mario 3D World and The Legend of Zelda: The Wind Waker HD are better than anything PlayStation and Xbox can offer this year.

Not only is the iPad a great computer, it’s also a great way to play games. But please avoid the mainstream tosh like Candy Crush Saga and instead try innovative titles like The Room, Year Walk and Device 6.

As an alternative, the website Forest Ambassador lists free five-minute games that work on most computers, and has the feel of a hippie art gallery. Hopefully, that last sentence will tell you whether you’ll like it.

I play games on my phone and want something better

The Nintendo 3DS goes from strength to strength with life-absorbers like Animal Crossing and Pokémon X, plus the usual Mario, Mario Kart and Zelda. Since you probably won’t use the retina-bursting 3D functionality, you may as well buy the cheaper 2DS. It can also play games from the vast library of DS titles.

Steam is a free download for any computer running Windows, OS X or Linux and has an unrivalled library of games, from the biggest new releases to smaller (but no less compelling) games like Spelunky, FTL: Faster Than Light and Redshirt.

I want the best gaming experience available

PS4, Xbox One or a monster gaming PC. Choose a side, then spend the next five years of your life attacking the option you didn’t pick in internet comment threads.

Alan Williamson is Editor-in-Chief of the videogame culture magazine Five out of Ten

The new controller for the Xbox One. Photo: Getty

Peculiar Ground by Lucy Hughes-Hallett asks how we shape history and how much is beyond our control

In Wychwood, a great house in Oxfordshire, the wealthy build walls around themselves to keep out ugliness, poverty, political change. Or at least they try to. 

The great cutting heads of the Crossrail tunnel-boring machines were engines of the future drilling into the past. The whole railway project entailed a crawl back into history as archaeologists worked hand in hand with engineers, preserving – as far as possible – the ancient treasures they discovered along the way. One of the most striking finds, relics of which are now on display at the Museum of London Docklands, was a batch of skeletons, unearthed near Liverpool Street Station, in which the bacteria responsible for the Great Plague of 1665 were identified for the first time. Past and present are never truly separable.

Lucy Hughes-Hallett’s ambitious first novel ends in 1665 in the aftermath of that plague, and it, too, dances between past and present, history and modernity. Like those skeletons buried for centuries beneath Bishopsgate, it is rooted in the ground. The eponymous “peculiar ground” is Wychwood, a great house in Oxfordshire, a place where the wealthy can build walls around themselves to keep out ugliness, poverty, political change. Or at least that is what they believe they can do; it doesn’t spoil the intricacies of this novel to say that, in the end, they will not succeed.

It is a timely idea. No doubt Hughes-Hallett was working on her novel long before a certain presidential candidate announced that he would build a great wall, but this present-day undiplomatic reality can never be far from the reader’s mind, and nor can the questions of Britain’s connection to, or break with, our European neighbours. Hughes-Hallett’s last book, a biography of Gabriele d’Annunzio, “the John the Baptist of fascism”, won a slew of awards when it was published four years ago and demonstrated the author’s skill in weaving together the forces of culture and politics.

Peculiar Ground does not confine itself to a single wall. Like Tom Stoppard’s classic play Arcadia, it sets up a communication between centuries in the grounds at Wychwood. In the 17th century, John Norris is a landscape-maker, transforming natural countryside into artifice on behalf of the Earl of Woldingham, who has returned home from the depredations of the English Civil War. In the 20th century a new cast of characters inhabits Wychwood, but there are powerful resonances of the past in this place, not least because those who look after the estate – foresters, gardeners, overseers – appear to be essentially the same people. It is a kind of manifestation of what has been called the Stone Tape theory, after a 1972 television play by Nigel Kneale in which places carry an ineradicable echo of their history, causing ghostly lives to manifest themselves through the years.

But the new story in Peculiar Ground broadens, heading over to Germany as it is divided between East and West in 1961, and again as that division falls away in 1989. Characters’ lives cannot be divorced from their historical context. The English breakage of the civil war echoes through Europe’s fractures during the Cold War. The novel asks how much human actors shape history and how much is beyond their control.

At times these larger questions can overwhelm the narrative. As the book progresses we dance between a succession of many voices, and there are moments when their individual stories are less compelling than the political or historical situations that surround them. But perhaps that is the point. Nell, the daughter of the land agent who manages Wychwood in the 20th century, grows up to work in prison reform and observes those who live in confinement. “An enclosed community is toxic,” she says. “It festers. It stagnates. The wrong people thrive there. The sort of people who actually like being walled in.”

The inhabitants of this peculiar ground cannot see what is coming. The novel’s modern chapters end before the 21st century, but the future is foreshadowed in the person of Selim Malik, who finds himself hiding out at Wychwood in 1989 after he becomes involved in the publication of an unnamed author’s notorious book. “The story you’re all so worked up about is over,” he says to a journalist writing about the supposed end of the Cold War. “The story I’m part of is the one you need to think about.”

A little heavy-handed, maybe - but we know Selim is right. No doubt, however, Wychwood will endure. The landscape of this novel - its grounds and waters and walls - is magically and movingly evoked, and remains in the imagination long after the reader passes beyond its gates.

Erica Wagner’s “Chief Engineer: the Man Who Built the Brooklyn Bridge” is published by Bloomsbury

Erica Wagner is a New Statesman contributing writer and a judge of the 2014 Man Booker Prize. A former literary editor of the Times, she is the author of Ariel's Gift: Ted Hughes, Sylvia Plath and the Story of “Birthday Letters” and Seizure.

This article first appeared in the 25 May 2017 issue of the New Statesman, Why Islamic State targets Britain
