Live the world, don't tell the story

The games industry doesn't need to model itself on the film industry, argues Bill Thompson

Around the turn of the century, people were encouraged to explore a new medium, one based around a technology that could tell stories in a way that tapped directly into the viewer's emotions.

The medium was not videogames but cinema. Between the late 1880s and the start of the 20th century, motion pictures developed from an experimental technology into an established entertainment medium. There were thousands of kinetoscope parlours around the US and Europe and, after Robert Paul introduced the projector in 1895, large audiences for the short films of the day.

The early years of the film industry were as chaotic as any high technology start-up of today, as new inventions flooded on to the market and audiences grew. In the US, film-making became concentrated around southern California.

The Hollywood studio system emerged, offering Ford-like production lines for films with vertically integrated giants controlling every stage of the process, a model that survived for decades.

The big studios remain, but today the film industry in the west is far more fragmented, with star directors and actors holding the real power. It is not even clear that Hollywood is profitable: in Do Movies Make Money? insider Roger Smith calculates that the 2006 releases from Hollywood will lose $1.9bn over five years, once every source of income is added up.

There are some obvious parallels between the film industry and the relatively young videogames industry. Early games were commissioned by the companies that built arcade systems or written by hobbyists for the home computers of the eighties, just as early films were made by the inventors who developed cameras and projectors. As the technology matured, many small companies were started, and a period of consolidation in the 1980s and 1990s created Electronic Arts and the other giants we see today. Major players like Sony, Nintendo and Microsoft funded games development for their consoles, hoping to bring audiences to their next-generation platforms.

Independent control

Now there are signs that the studio model is breaking down - as it did with film. Big-name designers like Peter Molyneux are attempting to control their own destiny just as film directors do, although they remain reliant on the big-name publishers to distribute their work, in the same way as independent film-makers need a distribution deal.

From the outside, games seem very similar to films and, indeed, the term "cinematic" is often used approvingly in reviews and discussion. Both are industrialised forms of entertainment, which rely on sophisticated technology to create a product and advanced capitalism to provide a market within which the product can be promoted, sold and consumed.

However, the superficial similarity disguises fundamental differences between the two forms of entertainment which may lead the games industry to diverge from the path taken by film.

Back in 2005, film director Steven Spielberg announced a working partnership with games company Electronic Arts to develop three games, including one for Nintendo's family-friendly Wii. The attendant promotion gave the film industry another chance to claim superiority over mere games developers, and Spielberg remarked: "I am a gamer myself and game development has always intrigued me."

It may have intrigued him, but the assumption that being a good film director automatically equips him to design and develop games is not one that many in the industry would support. Respected games developers like Shigeru Miyamoto, Peter Molyneux, Andy Schatz and Jenova Chen could reasonably argue that their skills in creating engaging and interactive environments are somewhat different from those needed to persuade a bunch of highly-paid actors to sit up and beg in front of the camera.

Commercial relationship

One reason for the confusion may be that the commercial relationship between films and games has been very lucrative, and there is much at stake in encouraging the belief that the overlap is meaningful. Film tie-ins are among the most-hyped titles each year, with Star Wars and Lord of the Rings leading the sales charts and Spider-Man, Harry Potter and The Simpsons all crowding out other games from the shelves.

But a game is not a story. It is a space for interaction and exploration, a space that may be dressed as a medieval world or a vast alien planet, occupied by human-like characters or small yellow blobs. Games require a very different form of engagement from film. Do nothing in the cinema and the story will continue without you. Press no buttons in a game and the action pauses, at least until a character turns up and kills you.

Self-direction

The digital technology that supports film is now very similar to that used for gaming, but the end results are very different. The emotion felt by the audience at the end of Annie Hall was put there by Woody Allen. The sense of achievement my son felt when he completed Halo 3 on Heroic difficulty came from inside him, facilitated by developers Bungie - but not created by them.

This difference has provoked a wide-ranging debate online, much of it spurred by a blog entry from RJ Layton, a student at the film school at the University of Southern California.

Studio-system parallels

In a provocative post titled "movies suck", this experienced gamer expressed his profound frustration with the view that "film is something that videogames should aspire to", telling games developers that "instead of trying to make a video game that accomplishes things that films do, why not make a video game that accomplishes things films were never able to?"

Game developers could model their industry on Hollywood, but we should not assume that there are any necessary parallels between the two or even that the games developers of tomorrow would want to find themselves in the same situation as today's struggling, undervalued and exploited independent film-makers. In the fragmented multimedia online world we are currently creating, the space for gaming may owe more to web development and virtual worlds than the old media style of the film industry.

Gaming and me

When I was younger I played Doom and Quake over networked systems, but I was never a hardcore gamer. In the eighties there was a game I liked: a text-based adventure game that ran on Unix.

About four years ago my son got an Xbox and he insisted I play Halo with him.

What I want for Christmas...

If I were to get a game for Christmas, I would like a preview of Halo Wars, which is a multiplayer game due out next summer.

Bill Thompson is a technology critic and a trustee of the Cambridge Film Trust

This article first appeared in the 17 December 2007 issue of the New Statesman, Christmas and New Year special 2007

Picture: Ralph Steadman

The age of disorder: why technology is the greatest threat to humankind

Disruptive technologies might change the very nature of humanity – and no nation can fight on its own.

Though human beings are social animals, for millions of years they lived in small, intimate communities numbering no more than a few dozen people. Even today, as the evolutionary biologist Robin Dunbar has shown, most human beings find it impossible properly to know more than 150 individuals, irrespective of how many Facebook “friends” they boast. Human beings easily develop loyalty to small, intimate groups such as a tribe, an infantry company or a family business, but it is hardly natural for them to be loyal to millions of strangers. Such mass loyalties have appeared only in the past few thousand years as a means of solving practical problems that no single tribe could solve by itself. Ancient Egypt was created to help human beings gain control of the River Nile, and ancient China coalesced to help the people restrain the turbulent Yellow River.

Nations solved some problems and created new ones. In particular, big nations led to big wars. Yet people were willing to pay the price in blood, because nations provided them with unprecedented levels of security and prosperity. In the 19th and early 20th centuries the nationalist deal still looked very attractive. Nationalism was leading to horrendous conflicts on an unprecedented scale, but modern nation states also built systems of health care, education and welfare. National health services made Passchendaele and Verdun seem worthwhile.

Yet the invention of nuclear weapons sharply tilted the balance of the deal. After Hiroshima, people no longer feared that nationalism would lead to mere war: they began to fear it would lead to nuclear war. Total annihilation has a way of sharpening people’s minds, and thanks in no small measure to the atomic bomb, the impossible happened and the nationalist genie was squeezed at least halfway back into its bottle. Just as the ancient villagers of the Yellow River Basin redirected some of their loyalty from local clans to a much bigger nation that restrained the dangerous river, so in the nuclear age a global community gradually developed over and above the various nations because only such a community could restrain the nuclear demon.

In the 1964 US presidential campaign, Lyndon B Johnson aired the “Daisy” advertisement, one of the most successful pieces of propaganda in the annals of television. The advert opens with a little girl picking and counting the petals of a daisy, but when she reaches ten, a metallic male voice takes over, counting back from ten to zero as in a missile launch countdown. When the count reaches zero, the bright flash of a nuclear explosion fills the screen, and Johnson addresses the American public: “These are the stakes – to make a world in which all of God’s children can live, or to go into the dark. We must either love each other, or we must die.” We often associate the slogan “Make love, not war” with the late-1960s counterculture, but already in 1964 it was accepted wisdom, even among hard-nosed politicians such as Johnson.

During the Cold War, nationalism took a back seat to a more global approach to international politics, and when the Cold War ended, globalisation seemed to be the irresistible wave of the future. It was expected that humankind would leave nationalistic politics behind, as a relic of more primitive times that might appeal at most to the ill-informed inhabitants of a few underdeveloped countries. Events in 2016 proved, however, that nationalism still has a powerful hold even on the citizens of Europe and the United States, not to mention Russia, India and China. Alienated by the impersonal forces of global capitalism, and fearing for the fate of national systems of health, education and welfare, people all over the world seek reassurance and meaning in the bosom of the nation.

Yet the question raised by Johnson in the Daisy advertisement is even more pertinent today than it was in 1964. Will we make a world in which all human beings can live together, or will we all go into the dark? Can Donald Trump, Vladimir Putin and their like save the world by appealing to our national sentiments, or is the current nationalist spate a form of escapism from the intractable global problems we face?

***

Let’s start with nuclear war. When the Daisy advert aired, two years after the Cuban missile crisis, nuclear annihilation was a palpable threat. Pundits and laypeople alike feared that humankind did not have the wisdom to avert destruction, and that it was only a matter of time before the Cold War turned scorching hot. In fact, humankind successfully rose to the nuclear challenge. Americans, Soviets, Europeans and Chinese changed the way geopolitics had been conducted for millennia, so that the Cold War ended with little bloodshed, and a new internationalist world order fostered an era of unprecedented peace. Not only was nuclear war averted, but war of all kinds declined. Since 1945, surprisingly few borders have been redrawn through naked aggression, and most countries have ceased to use war as a standard political tool. In 2016, despite wars in Syria, Ukraine and other hot spots, fewer people died from human violence than from obesity, car accidents or suicide. This may well have been the greatest political and moral achievement of our times.

Unfortunately, we are so used to this achievement that we take it for granted. This is partly why people allow themselves to play with fire, and that includes not only the latest Russian adventures in eastern Europe and the Middle East, but also the choices made by European and American voters.

The Brexit debate in Britain revolved mainly around questions of economics and immigration, while the EU’s vital contribution to European and global peace has largely been ignored. After centuries of terrible bloodshed, the French, Germans, Italians and Britons have finally built a mechanism that ensures continental harmony – only to have the British public throw a wrench into the miracle machine. Meanwhile, Donald Trump mixes calls for US isolationism with plans to strengthen the country’s nuclear arsenal and reignite a nuclear arms race, thereby threatening to undo the hard-won gains of the past decades and bring us back to the brink of nuclear annihilation.

It was extremely difficult to construct the internationalist regime that prevented nuclear war and safeguarded global peace. No doubt we need to adapt this regime to changing conditions in the world: for example, by relying less on the United States and giving a greater role to non-Western powers such as China and India. But abandoning this regime altogether and reverting to nationalist power politics would be an irresponsible gamble.

True, in the past, countries played the nationalist politics game without destroying human civilisation. But that was in the pre-Hiroshima era. Since then, nuclear weapons have raised the stakes and changed the fundamental nature of war and politics. No matter whom American voters elect to the presidency, the atom bomb is still there and E still equals mc². As long as human beings know how to enrich uranium and plutonium, their survival will depend on privileging the prevention of nuclear war over the interests of any particular nation. Zealous nationalists should ask themselves whether their nation by itself, without a robust system of international co-operation, can protect the world – or even itself – from nuclear destruction.

On top of nuclear war, in the coming decades humankind will face a new threat to its existence that hardly registered on the political radar in 1964: climate change. If we continue with our present course it is likely that global warming, ocean acidification and ecological degradation will result in unprecedented economic, political and social problems, and might well destroy the foundations of human prosperity. What is the nationalist answer to climate change? How can any nation, however powerful, stop global warming on its own? Will the US build a wall against rising oceans?

When it comes to climate, countries are not sovereign, but are at the mercy of actions taken by governments on the other side of the planet. As long as 200 governments pursue 200 different ecological strategies, shaped by their unique needs and interests, none is likely to succeed. With present-day technology, any serious measures to stop global warming are likely to slow down economic growth. Such a policy carries an unbearable political price if it is undertaken by a single country while others continue with business as usual. Any US administration that deliberately slowed down economic growth for environmental reasons would be bound to lose the next election; a Chinese administration that did so would court revolution tomorrow morning. In a nationalist and xenophobic world no government will sacrifice itself for the greater good of humanity, as Trump’s actions show.

***

Indeed, nationalism is even more dangerous in the context of climate change than that of nuclear war. An atomic bomb is such an obvious and immediate threat that even the most ardent nationalist cannot ignore it. Global warming, by contrast, is a much more vague and protracted menace. Hence, whenever environmental considerations demand some painful sacrifice, nationalists will be tempted to put the national interest first, reassuring themselves that we can worry about the environment later, or just leave it to people elsewhere. Alternatively, as in the case of Trump, they may simply deny the problem. It isn’t a coincidence that scepticism about climate change is usually the preserve of nationalist politicians. They have no answer to the problem, and so they prefer to believe it does not exist.

The same dynamics are likely to spoil any nationalist antidote to the third large threat to human existence in the 21st century: technological disruption. New technologies, particularly in the fields of bioengineering and artificial intelligence, will soon give humankind unprecedented, godlike powers. Whereas previously human beings learned to produce food, weapons and vehicles, in the coming decades our main products will probably be bodies, brains and minds. However, it is extremely difficult to foresee the potential impact of such technologies. They open the door to an entire supermarket of doomsday scenarios.

If and when artificial intelligence (AI) surpasses human intelligence, it may be given control of weapon systems and crucial decisions, with potentially calamitous consequences. In addition, as AI outperforms human beings in ever more tasks, it might push billions of us out of the job market, creating a new “useless class” of people, devoid of both economic value and political power. Meanwhile, given enough biometric data and enough computing power, external algorithms could know us better than we know ourselves, and then governments and corporations could predict our decisions, manipulate our emotions and gain absolute control over our lives.

On the bioengineering front, breakthroughs in genetics, nanotechnology and direct brain/computer interfaces could unleash deadly new epidemics or disturb our internal mental balance. In past centuries we have gained control of the world outside us and reshaped the planet, but because we didn’t understand the complexity of the global ecology, the changes we made inadvertently disrupted the entire ecological system. In the coming century we will gain control of the world inside us and reshape our bodies and brains, but because we don’t understand the complexity of our own minds, the changes we will make might disrupt our mental system. In addition, bioengineering might for the first time in history translate economic inequality into biological inequality, creating an upper caste of enhanced superhumans, and relegating the poor to the dustbin of evolution.

What is the nationalist answer to these menaces? As in the case of global warming, so, too, with technological disruption: the nation state is the wrong framework to address the threat. Given that research and development are not the monopoly of any one country, even a superpower such as the US or China cannot restrict them by itself. If the US government forbids the genetic engineering of human embryos, it won’t prevent North Korean scientists from doing such work. And if the resulting developments confer on North Korea some crucial economic or military advantage, the US will be tempted to break its own ban. Particularly in a xenophobic, dog-eat-dog world, if even a single country chooses to pursue a high-risk, high-gain technological path, other countries will be forced to do the same, because nobody can afford to remain behind. In order to avoid such a race to the bottom, humankind will probably need some kind of global identity and loyalty.

Whereas nuclear war and climate change threaten only the physical survival of humankind, disruptive technologies might change the very nature of humanity, and are therefore entangled with human beings’ deepest ethical and religious beliefs. Although everyone agrees that we should avoid nuclear war and ecological meltdown, people have widely differing opinions about using bioengineering and AI to upgrade human beings and to create new life forms. If we fail to cobble together globally accepted ethical guidelines, it will be open season for Dr Frankenstein.

When it comes to formulating such ethical guidelines, nationalism suffers above all from a failure of the imagination. Nationalists think in terms of territorial conflicts lasting centuries, whereas the technological revolutions of the 21st century should be understood in cosmic terms. Ever since its appearance on Earth four billion years ago, life has been governed by the laws of natural selection, and no matter what strange shapes it took – virus or dinosaur – it remained confined to the organic realm: whether a cactus or a whale, you were made of organic compounds. Now, after four billion years of organic life shaped by natural selection, science might replace natural selection with intelligent design and usher in an era of inorganic life forms. What has Israeli, Russian or French nationalism got to say about this? In order to make wise choices about the future of life we need to go way beyond the nationalist viewpoint and look at things from a much wider perspective.

***

The nationalist wave sweeping across the world cannot turn the clock back to 1939 or 1914. Technology has changed everything by creating a set of global threats to human existence that no nation can fight on its own. A common enemy is the best catalyst for forging a common identity, and humankind now has three such enemies – nuclear war, climate change and disruptive technology. If, despite these threats, we choose to privilege our particular national loyalties above everything else, the results may be far worse than in 1914 and 1939.

A much better path is the one outlined in the EU’s constitution, which states that “while remaining proud of their own national identities and history, the peoples of Europe are determined to transcend their former divisions and, united ever more closely, to forge a common destiny”. There is still plenty of room in the world for the kind of patriotism that celebrates the uniqueness of my nation and stresses my special obligations towards it. Yet, if we want to survive and flourish, humankind has little choice but to complement such local loyalties with substantial obligations towards a global community.

In previous centuries national identities were forged because human beings faced problems and discovered opportunities that went far beyond the scope of local tribes, and which only countrywide co-operation could hope to handle. In the 21st century, nations find themselves in the same situation as the old tribes.

We need a new global identity, because national institutions are incapable of managing a set of unprecedented global challenges. We now have a global ecology, a global economy and a global science – but we are still stuck with only national politics. This mismatch prevents the political system from countering our main problems effectively.

To have effective politics, we must either de-globalise the ecology, the economy and the march of science, or we must globalise our politics. As it is impossible to de-globalise the ecology and the march of science, and as the cost of de-globalising the economy will probably be ruinous, the only solution is to globalise politics.

Yuval Noah Harari lectures at the Hebrew University of Jerusalem. His latest book is “Homo Deus: a Brief History of Tomorrow” (Vintage)
