
How do you make the perfect sequel to a successful game?

Much like that difficult second album, video game sequels are easy to get wrong. So what's the best formula for a successful remake or sequel?

Games work best in series, both for the player, who gets to keep enjoying a style of game they like, and for the developer, who gets to make money year after year from a single great idea. But simply churning out game after game isn’t as easy as it looks, and the history of video games is littered with examples of sequels and remakes gone wrong.

The first job of a sequel is to retain enough of what was familiar to sustain the original audience. When a sequel gets this wrong there can be hell to pay, because you are not just losing the original fan base, you are angering it. We’ve seen this with games like the recent Thief and Devil May Cry remakes, and also Hitman: Absolution. When you take a franchise and produce a sequel that lacks the unique selling points its original fan base loved, that fan base will respond with outrage, as well it should.

A game is more than just a story told through the medium of colours, shapes and button mashing: it is, well, a game. Its mechanics, rules and systems are what make it distinctive, and when a sequel strips those elements out, the fans who enjoyed them will feel that the series has left them behind and may move on without it. Much as an animal might evolve to lose a certain feature, games can do this too over time. Resident Evil, for example, had the survival horror traits bred out of it for Resident Evil 4, becoming instead a third-person action game, and the series never looked back. Over time, games can shed the quirks and oddities of their design and, in doing so, become painfully bland.

Another example is Hitman: Absolution, which brought in a couple of changes to the rules. One was that it allowed the main character to conceal weapons like rifles on his person. In the old Hitman games, people would react if you walked around visibly armed; now the main character could hide an assault rifle in his pocket and walk through a crowd with nobody any the wiser. It seems a small change, but for a game about stealthy assassination it was significant. Players complained, and hopefully that will lead to improvements; without the vocal complaints of the series' oft-maligned fans, the change might never be reversed. The more distinctive features a game has, the harder it is to sell to the mass market, but if it loses too many of them, there’s nothing to sell.

That being said, you cannot simply keep serving up a slightly rehashed version of the same game that players already have, even if you keep all the unique elements intact. Single-player games in particular can suffer badly from diminishing returns: the scary becomes familiar, the exciting becomes routine and the epic becomes ordinary. The most frustrating example of this is the STALKER series of games. The series debuted with the buggy and slightly awkward STALKER: Shadow of Chernobyl, which, for all its flaws, remains an unforgettable game. The setting, the creatures, the atmosphere: it was nothing short of majestic the first time out.

The next two games in the series, Clear Sky and Call of Pripyat, refined the game with improved visuals, animations and mechanics but didn’t add very much. A flawed masterpiece rebuilt with fewer flaws sounds perfect, but in reality neither game had the same atmosphere as the original; nothing had the same impact. There’s nothing inherently wrong with polishing up a good game, but there always has to be something substantial added as well.

The best developers tend to be the ones who have the courage to make a big change when they see that it needs to be made. For example, the change from GTA 2 to GTA 3 was huge, swapping the game from top-down 2D to the 3D world while keeping the tone of the games largely intact. When the jump from GTA: San Andreas to GTA 4 happened, the change was more in the tone and style of the game. Meanwhile, the Saints Row series went with a tonal shift between the first and second games, dumping any attempt at seriousness in favour of becoming a full-on parody, and it worked. Changing the formula in a series that is already doing OK for itself is a big gamble, but the payoff can be immense.

The Saints Row series also highlights what can happen when the change is too severe. The shift from Saints Row: The Third to Saints Row 4, which turned it into a superhero game rather than a GTA clone, was a step too far. Taking the story to its logical over-the-top conclusion is one thing, and there wasn’t much room to move up in the world after the third game, but a developer should always remember that fans are there for the game – mess with that too much and you will lose them.

The concern for players will always be that games gradually lose their character over time and there is certainly some evidence that this is the case. Even if we take something as commercially and creatively successful as the Elder Scrolls series we can see that Skyrim is a much less idiosyncratic and complex creature than Oblivion, which in turn is very streamlined compared to Morrowind. It is difficult to dislike Skyrim, but it’s pretty clear that if the developers do decide to smooth out the design of the game further, it is basically going to have nothing left. The current trajectory of the series would suggest that The Elder Scrolls 6 might just be a shiny thing that dangles on the end of a piece of elastic.

The hope is that game developers stay true to the fans of their games over the pull of the mass market, or at least attempt to reconcile the two. The fact is that if a game succeeds, and then increases its budget and production values, it will have to sell more. Players always want to see the games they love improve, but the downside is that improvements cost money, and if a game grows beyond its commercial niche the company that made it is going to fail. As it stands, it seems that fans of video games will have to make do with a system that is happy to give them more of what they like every couple of years or so, on the understanding that they are happy to have it watered down a little more each time.

Phil Hartup is a freelance journalist with an interest in video gaming and culture


In an age where history is neatly divided, two new books take a longer view

The Evolution of Everything by Matt Ridley and Human Race by Ian Mortimer have, to put it gently, an abundance of ambition.

Historians often take evasive action when confronted by well-meaning members of the public who believe that if you call yourself a historian you must know about the bit of the past they are interested in – if not all of it, at least some of it. But most academic history is now sliced and diced in ways that limit the scope of what can be reasonably researched, taught and learned. So you can, if you choose, leave most universities as a history student without having any idea at all about the medieval world or what happened, ever, on most continents.

Here, however, we have two studies of the past (and present and future, too) setting inhibition aside and aiming rather wider, with grandiloquent titles that suggest, to put it gently, an abundance of ambition.

Matt Ridley, the Times columnist and author of bestselling books on science, ranges across the planet, and from the beginning of time, throwing off ideas about every conceivable discipline: biology and genetics as you would expect, but also morality, economics, philosophy, culture, technology and more. Ian Mortimer is almost timid by comparison. His focus is on the West and for a mere thousand years. Whereas Ridley sets out to explain everything, Mortimer’s task is only to judge which century out of the past ten brought the most change.

Ridley’s book is much the more readable, provocative and infuriating. He wants us to understand that Darwin’s theory of natural selection applies to all human activity. The world, in every particular, changes because of endless trial and error and innumerable small steps: it is self-organising. Nobody is in charge and nobody is responsible – certainly not God (there isn’t one), nor kings, popes, politicians and officials. He deplores our need for “skyhooks” (a phrase he attributes to the philosopher Daniel Dennett): the notion that somebody or something is responsible for designing and planning outcomes. For instance, nobody is in charge of English or invented it, but it has rules that make sense, and can develop without a management structure ordaining changes.

Obviously, Ridley knows that many other writers have attacked the notion that individuals have a significant role in changing the course of the world. And he is not the first to suggest we are too keen to dramatise change – rather than giving credit to “cumulative complexity” or the recombination of existing ideas: the multiple and widely dispersed actions that lead to, say, the invention of a pencil. He draws a number of challenging conclusions. We should be sceptical about awarding Nobel Prizes or patents; no single person should claim intellectual property rights for what are collective, “bottom-up” achievements.

Ridley’s relegation of individual agency in history does not stop him from telling us about his heroes, some predictable – Darwin, Mendel, Hume, Locke – and some not. His favourite is Lucretius, the 1st-century BC Roman philosopher and poet whom he loves for his eclecticism, hatred of superstition, love of pleasure and view of nature, “ceaselessly experimenting”.

When Ridley is on his central territory (physics, animal behaviour, Darwin’s science, genomes and the like) he is very interesting. Here he considers the relationship between genes and culture: “It is wrong to assume that complex cognition is what makes human beings uniquely capable of cumulative cultural evolution. Rather it is the other way around. Cultural evolution drove the changes in cognition that are embedded in our genes.” The relationship between culture and biology is contested but he argues his corner – that biology can and does respond to culture – with much erudition. Yet in his pursuit of an overarching evolutionary theory for every activity there are arguments and whole chapters that lurch out of control. Ridley the clever scientist becomes Ridley the political champion of a very much smaller state, Ridley the cheery optimist about anthropogenic global warming, Ridley the libertarian. He makes a point, takes out an ideological hammer and smashes his own argument to pieces.

So, summarising Steven Pinker’s theories in The Better Angels of Our Nature about the long-term decline of violence, he asks us to see capitalism as the crucial beneficial cause of tranquillity. Ridley asserts that the ten most prosperous countries in the world are all firmly capitalist and the ten least clearly not. This is a glib assertion of cause and effect and somehow overlooks the United States, the most obvious emblem for capitalism, with its extraordinary gun culture and homicide rate. The US does not feature in either list but he does not appear to notice.

He dislikes state provision of most things, and chooses to contrast the fall in the price of clothes and food over the past fifty years with the big rise in state expenditure on health and education – which leads to a hopelessly airy summary of the state’s performance: “The quality of both [health and education] is the subject of frequent lament and complaint. Costs keep going up, quality not so much, and innovation is sluggish.” We read nothing here about, say, the rise in life expectancy; but by this point Ridley is in full-on polemical mode. He notes North Korea’s dismal productivity, frequent shortages, scandalous lapses in quality, rationing by queue and by privilege, and continues: “These are exactly the features that have dominated Britain’s health-care debate over the past few years.” This is no longer interestingly mischievous, but silly and shrill.

Ridley was chairman of Northern Rock in 2007 when it went belly-up. The only problem, apparently, was too much pressure from the US government to lend to inappropriate borrowers. He is not the first to assert that; yet the passage reeks of complacency and self-interest. He is a fan of Bitcoin, or at least of breaking the state’s monopoly of printing money. The iconoclasm, though fun at times, becomes exhausting.

Mortimer’s approach is largely more conservative. He goes through each of his ten centuries and gives compressed accounts of the main events: the 14th-century Black Death (“by far the most traumatic event that humanity has ever experienced”), the 16th-century Scientific Revolution, the 17th-century wars of religion, the Industrial Revolution of the 19th century and much else. He also explores some rich themes, particularly the importance of population growth, climate change, transport, food production and literacy. He makes a spirited attempt at the end of each chapter to choose someone he labels “the principal agent of change”, an approach that Ridley would deplore. Winners of this accolade are two popes, an English monarch, three philosophers, a religious revolutionary, a scientist, an explorer and Hitler. It’s harmless fun.

After a largely conventional, chronological account, he ends up using Abraham Maslow’s hierarchy of needs to decide which century brought the biggest transformation of human experience in the West. The question is more a parlour game than an academic historical conundrum.

Both Ridley and Mortimer end with “the future”. They point in different directions: Ridley is sunnily optimistic (everything not only evolves but does so in a vaguely progressive way, too) and Mortimer gloomy, worrying about our fuel reserves and a consequent decline of individual freedom. As with most futurology, though, there is not much rigour in these sections, so don’t take them too seriously.

Mark Damazer is the Master of St Peter’s College, Oxford, and a former controller of BBC Radio 4

This article first appeared in the 19 November 2015 issue of the New Statesman, The age of terror