Why I won't be buying an Xbox One or a PS4

As a veteran of many Console Wars, Alan Williamson believes that the best console is the one you have with you.

This year brings the most boring games console launch in history. I don’t mean that in a hyperbolic, share-this-incendiary-link-with-your-friends way: having lived through and been an active combatant in four generations of console wars, like many former soldiers I have now become an advocate for peace.

The First Great World Console War broke out in the early Nineties between Sega’s Mega Drive and the Super Nintendo Entertainment System (Nintendo’s progress in the 1980s was more a swift annexation). Both manufacturers were then broadsided by the introduction of the Sony PlayStation in 1995. Since then we have descended into an effective Cold War, an ever-escalating technological arms race between equally weighted armies with few casualties. While there was isolated fighting in smaller Handheld Console Wars, a gaming Vietnam where Pokémon waged a guerrilla war for children’s minds, the Fourth World Console War ended with over eighty million consoles sold for each belligerent. I’d make an analogy about PCs and the United Nations, but the metaphor is already stretched to breaking point.

Each generation of the Console Wars had its own innovations, each console its own personality and fan base. The second saw the birth of affordable 3D graphics and some of the most critically acclaimed games of all time, such as Legend of Zelda: Ocarina of Time, Metal Gear Solid and Final Fantasy VII. The third was dominated by Sony’s PlayStation 2 (don’t blame me, I had a Dreamcast) but also saw the beginnings of online play and the rise of Xbox megabrand Halo. We’re now at the tail end of the fourth generation, which has brought consoles mostly up to speed with PCs through high-definition graphics and digital distribution of games.

The war has reached a stalemate: Sony and Microsoft still rule in their home countries, while a Nintendo Wii gathers dust in every living room of the developed world. As a consequence of the global recession, game publishers have tried to extract even more revenue from a squeezed market. Witness the rise and demise of the loathed ‘online pass’, now replaced by the euphemistic ‘season pass’; paid downloads on day one that unlock content already on the disc; pre-order bonuses; and free-to-play games - or, as I like to call them, pay-to-play games. History will show this generation as one that expanded the monetisation of games as much as the experiences themselves, often to the detriment of fun and artistic merit.

So here we go again with two new omnipotent wonderboxes, the PlayStation 4 and Xbox One. The games look much the same as the old ones: of course, similar criticisms were levelled at the Xbox 360, but even a layman could appreciate the beauty of Project Gotham Racing 3 compared to its predecessor. Perhaps it was the somewhat sweaty razzmatazz of Earl’s Court at Eurogamer Expo in September, but I just couldn’t tell the difference between Forza Motorsport 5 and Forza 4, and I’ve played over fifty hours of Forza 4. With big franchises like Call of Duty and Assassin’s Creed launching on both the new consoles and their predecessors, the choice is based more on ideology and available funds than on real, quantifiable differences. In fact, with so few games available, it makes less sense to buy a PS4 than a PS3.

Games journalists are in a difficult position. Those who haven’t been invited to New York for a complimentary gold-plated PS4 (I’m writing on a train travelling through Slough, but thanks for asking) will have pre-ordered the expensive new consoles to support their work, adding buyer’s remorse to an increasingly dominant business model of offering their words online for free, funded by advertising. This model relies on hyperbole: every month a new ‘best game ever’, every day an announcement of something on the horizon, every minute a constant stream of rumour posts. It focuses on sensationalising games and the machines that play them, rather than criticising them. It encourages critics to follow the zeitgeist rather than dwelling on the games that linger in our minds. It blurs the lines between editorial and advertorial: after all, what is a news post about a PlayStation TV commercial if not advertising?

The YouTubeisation of publishing elevates every berk with a webcam and an opinion to the same level as the seasoned journalist. The Twitterisation of news encourages news-breaking, but not fact-checking. This isn’t unique to coverage of videogames, but the medium is entangled with technology and therefore at the forefront of innovations in publishing. Meanwhile, channels like PlayStation Access and Nintendo Direct show that publishers can successfully skip the middleman and advertise directly to customers. A new generation of games consoles deserves newer, deeper ways of looking at them - something outlets like Press Select, Boss Fight Books and my own Five out of Ten are trying to address. But we may be at the stage where readers are better served by the groupthink of their peers than the proclamations of journalists, where real decisions are made and discussions are had below the line.

While we’re keen to proclaim that videogames now generate more revenue than cinema, few have asked whether that is sustainable or even desirable. While digital distribution has led to a bigger market for indie developers, especially on the PC and smartphones, the biggest successes like Call of Duty and Battlefield are better for their publishers than for the people who make them. In the UK we’ve seen the closure of much-loved studios like Blitz Games and Sony Liverpool, while other British studios like Rare have lost their lustre; the team that made the charming Banjo-Kazooie and Viva Piñata now produces bland sports titles, a reflection of the wishes of their corporate overlord Microsoft. The industry undervalues its creators and programmers, encouraging a ‘crunch’ culture with unpaid overtime and ridiculous hours. This distributor-takes-all system reminds me of the Hollywood visual effects studio Rhythm and Hues, which won an Oscar for Life of Pi before declaring bankruptcy. If videogames really are such an important industry, they can do a lot better than emulating Hollywood in content and working culture.

Some pundits believe this may be the last console generation. I’d like to believe otherwise. I have fond memories of consoles and continue to make more: they provide a cheaper entry point into the fantastic worlds of fiction that games offer, without the expense or complexities of a PC. Yet perhaps games have outgrown the traditional model of consoles: the exponential growth of indie games is better suited to the less restrictive system of a personal computer, mobile phone or even the Ouya ‘microconsole’. Valve’s SteamOS promises the power of Linux married with their friendly distribution platform. While Sony and Microsoft are taking steps to open development on their consoles, their revenue model is built on strict control of the system: they focus on making money off the games they sell, not the platform itself.

Even more exciting are devices like the Oculus Rift, a virtual reality headset that offers a sea change in the way we play games. However, according to its creators it requires tremendous computer horsepower to be convincing - more than even the Xbox One and PS4 can provide. For years, consoles offered the best way to play games, but with that advantage gone they’re like digital cameras in a world where everyone has a camera built into their phone. I choose that analogy carefully, because I think portable consoles like the Nintendo 3DS are much better than an iPhone for games, but there’s a trade-off between quality and the utility of having an all-in-one device. The best console is the one you have with you.

“War never changes,” mused Ron Perlman in the introduction to 2008’s Fallout 3. But this is a war that needs to change if games consoles are to expand, or even merely retain, their cultural relevance. Consoles used to represent inclusivity and the comforts of socialising with friends, but now they are targeted at an audience - and a medium - which is growing up and leaving them behind.

What should I buy?

I don’t play games, but I’d like to start

Nintendo’s latest console, the Wii U, is an underrated box. It’s cheaper than the competition and can play older Wii games as well as its new, shinier ones. Nintendo still make the best games, appealing to both children and adults like the videogame equivalent of Pixar. Unfortunately, also like Pixar, they release one great product every two years. Super Mario 3D World and Legend of Zelda: The Wind Waker are better than anything PlayStation and Xbox can offer this year.

Not only is the iPad a great computer, it’s also a great way to play games. But please avoid the mainstream tosh like Candy Crush Saga and instead try innovative titles like The Room, Year Walk and Device 6.

As an alternative, the website Forest Ambassador lists free five-minute games that work on most computers, and has the feel of a hippie art gallery. Hopefully, that last sentence will tell you whether you’ll like it.

I play games on my phone and want something better

The Nintendo 3DS goes from strength to strength with life absorbers like Animal Crossing and Pokémon X, plus the usual Mario, Mario Kart and Zelda. Since you probably won’t use the retina-bursting 3D functionality, you may as well buy the cheaper 2DS. It can also play games from the vast library of DS titles.

Steam is a free download for any computer running Windows, OS X or Linux and has an unrivalled library of games, from the biggest new releases to smaller (but no less compelling) games like Spelunky, FTL: Faster Than Light and Redshirt.

I want the best gaming experience available

PS4, Xbox One or a monster gaming PC. Choose a side, then spend the next five years of your life attacking the option you didn’t pick in internet comment threads.

Alan Williamson is Editor-in-Chief of the videogame culture magazine Five out of Ten
