It's only a movie: horror films may claim cultural relevance but their main appeal is shock or terror

Blood money: how the market affects what horror makes it to Hollywood

Recent torture pornographers such as Eli Roth arguably have aligned themselves with 1970s American horror auteurs not only to legitimise their work but to cash in on those auteurs’ rebel credibility.

Merchants of Menace: the Business of Horror Cinema 
Edited by Richard Nowell
Bloomsbury Academic, 280pp, £21.99

Selling the Splat Pack 
Mark Bernard
Edinburgh University Press, 224pp, £70


The drab, yellowy walls at the edges of the photograph are what I remember best, perhaps because what dominates the foreground is so horrific: a young woman smiling at the camera, leaning over the corpse of a prisoner on a black sheet. His face is cut and bruised; crop the image and hers wouldn’t look out of place on a pinboard in a student dormitory. She makes a thumbs-up gesture. It’s hard not to turn away.

A decade since that picture and several others started to trickle out of Abu Ghraib, the cruelty on display is no less repulsive. The Iraq torture scandal was a reminder of the fragility of civilised behaviour. The smiling woman, Sabrina Harman, was the Virginia-born daughter of a homicide detective. Charles Graner, another of the disgraced soldiers shown posing among the abject prisoners, was once a member of his Pennsylvania school’s drama club. They weren’t psychopaths or bogeymen. If their actions were evil, that evil was both banal and unknowable.

The photographs were published in the spring of 2004. It was a visual moment, replete with instant icons: the towers of naked men, the hoods, the metal bars, the characterless corridors. When, a few months later, James Wan’s horror movie Saw was released in the US, the New York Times reviewer Stephen Holden noted its “uncomfortable resemblance” to the scenes captured at Abu Ghraib. The spartan, squalid-seeming room, the arbitrariness of the victims’ situation, the killer’s “impulse to humiliate and torture . . . and justify it with some twisted morality” – the comparison suggested itself, even though the film, as Holden acknowledged, had been completed before the Iraq images emerged.

It wasn’t long before horror directors such as Eli Roth were claiming that their work could trace a direct lineage to the “war on terror”. “I really try to load up the films with ideas,” Roth insisted, citing with pride the university seminars discussing his Hostel series as “a post-9/11 response to Iraq and torture”. The ecstatic violence of that franchise at first attracted the scorn of many reviewers, who dismissed it as “torture porn”, but Roth’s articulate justifications for his on-screen cruelties seem to have won over the academy.

This strategy of media management isn’t new – Night of the Living Dead’s George Romero said in 1973 that his pioneering zombie film was intended as “a statement about society”, and its semi-documentary style and black hero, murdered by white authorities, served to corroborate his claim. Yet, presenting Romero with a lifetime achievement award in 2009, Quentin Tarantino characterised his movies as consisting of “heart-stopping violence, explosive bloodshed, undead flesh-eaters and dismembered ghouls”. So does the political content really give life to the films, or is it ancillary to the thrills of gore and suspense? And does the meaning that a film-maker attaches to his lurid tales ultimately matter?

Much writing on cinema draws from auteur theory, which privileges directors’ preoccupations when discussing their work. A film culture centred on notions of an auteur’s sole agency also facilitates what the industry rather gruesomely calls “product differentiation” – making what are, in effect, franchises out of disparate movies – while indulging nerdy connoisseurship. Horror fans’ satisfying sense of their own expertise is nourished, even in the popcorn-scented multiplex, by cod-scientific classification: does Hitchcock, often cited as the catalyst for the slasher picture, belong in the same phylum as the Italian Dario Argento, another serial dismemberer of beautiful women? And what of his relation to Carrie’s Brian De Palma, whose films self-consciously draw from Psycho, Rear Window and the rest? Auteurism creates bodies of work and the critic-fan has long been cheerfully employed as their mortician, tagging identifying labels to their toes.

Two new books – Mark Bernard’s Selling the Splat Pack and Merchants of Menace, a collection of essays edited by Richard Nowell – challenge this consensus by exploring the business end of horror movie-making. Like Jason Zinoman’s excellent 2011 book Shock Value, they focus on industry machinations; but where Zinoman framed his account of horror’s “eccentric outsiders” as a tale of subgenre heretics conquering Hollywood largely through force of will, these latest studies take a cooler, less personally invested view of how the market affects what nightmares make it on to the screen.

In Merchants of Menace, Joe Tompkins argues that “the horror auteur” is, in effect, a “brand”: directors are sold as subversives to attract consumers weaned on the appeal of the “radical artist”. So recent torture pornographers such as Roth arguably have aligned themselves with the 1970s cohort of American horror’s “golden age” – Romero et al – not only to “legitimate themselves as artists” but to cash in on the rebel credibility of those film-makers. The irony of this is that each of those directors used conventional media to sell his work, which competed in the same market as the big studio movies for the same dollars. The fixation on an individual’s influence masks that of the industry and alchemises product into art.

As a VHS-hoarding fanboy, I prefer the more romantic narrative of heroic mischief-makers testing the limits of taste, but it is hard to deny that some of horror’s most recognisable innovations were influenced by the demands of business. Bernard bemoans how an “overdependence on textual and filmic analysis” comes “at the expense of industry analysis”, and his case for a shift in emphasis to the latter is persuasive.

Robert Wiene’s The Cabinet of Dr Caligari (1920), widely considered to be one of “the foundational films of the horror genre”, has been celebrated for its ability to offer “a glimpse into the cultural chaos of the historical moment” – that moment being the aftermath of the First World War. Its power lies largely in its expressionist style, which seems to psychologise its settings, using shadows and weird architecture to evoke the characters’ interior worlds. Bernard counters this reading with the suggestion that its look was, in part, just another form of branding – a way of making a niche for German cinema as a more crafted alternative to Hollywood.

Meanwhile, it’s a given that the US horror directors who emerged around or shortly after 1968 embody the freewheeling spirit of their age; that they stuck it to the man, making their own rules. Yet many of their creative choices were in line with changes in the business. In the late 1960s, the restrictive Production Code – guidelines set by the industry to police itself, the better to avoid government meddling – was scrapped in favour of ratings ranging from G (for general audiences) to X (for adults only). Although an R (restricted) or X certificate would result in fewer ticket sales, the new system suddenly allowed film-makers to push boundaries in sex and violence with less risk of their work being banned. Perhaps they took this as a challenge: within half a decade, Leatherface was hanging pesky kids on meat hooks (The Texas Chainsaw Massacre) and a demon-possessed girl was masturbating with a crucifix (The Exorcist).

By focusing on film cycles and their modes of production, Selling the Splat Pack and Merchants of Menace broaden the terms of discussion and help liberate the genre from the dungeon of worthy cultural interpretation. As Mikal J Gaines writes in the latter, even Romero’s Night of the Living Dead, which “provides some of the richest social critique of any horror film before or since”, was a hit with black audiences at the time of release not for its progressive political message but for the promise that it “contained spectacles of the abject body”. (It was run in a double bill with the less forward-thinking exploitation picture Slaves.)

Although it would be folly to dismiss interpretative readings of the horror genre entirely, I am sceptical of the claims of critics and film-makers alike that a zombie or torture movie is primarily to be approached as political allegory. That view conforms to an apologetic attitude to art, in which the work serves, at best, a medicinal function: the Hostel series is valid because it negotiates, even purges, society’s anxieties about Abu Ghraib, and so on. Yet Eli Roth is not Noam Chomsky. And who thinks about Nixon or Vietnam when confronted with The Texas Chainsaw Massacre?

Horror is among cinema’s most visceral genres and its final meaning, if it must have one, is located in its affective power. A film such as Saw doesn’t just signify some real-life horror – our bodies respond to it as if it were something truly horrific. The Australian cultural theorist Claire Colebrook once described her experience of watching movies as follows: “I watch a scene . . . and my heart races, my eye flinches and I begin to perspire. Before I even think or conceptualise, there is an element of response that is prior to any decision.” In films about killers, monsters and ghosts, this pre-intellectual state is surely at its most profound. After all, as H P Lovecraft put it: “The oldest and strongest emotion of mankind is fear.”

Forms of expression and media that cause physical reactions are usually shunted to the lower end of the cultural hierarchy – pornography being a case in point – but such a valuation seems to me squeamish and arbitrary. Horror films scare us and the fear they evoke enriches us by making us more alert to our senses. Maybe Yeats was right when he wrote, “Only two topics can be of the least interest to a serious and studious mind: sex and the dead.” By which reasoning, The Exorcist, say, or Carrie, is as serious as art comes.

Yo Zushi’s new album, "It Never Entered My Mind", will be released in October by Eidola Records


This article first appeared in the 13 August 2014 issue of the New Statesman, A century of meddling in the Middle East


It’s been 25 years since the Super Nintendo and Sega Mega Drive were released – what’s changed?

Gaming may be a lonelier pursuit now, but there have been positive changes you can console yourselves with too.

Let's not pretend that we know nothing about gaming, whatever our age. Surely you'll remember the Super Nintendo Entertainment System (SNES) and Sega's Mega Drive (or Genesis, if you're American)? Well, it's now been 25 years since they were released. OK, fine: it's been 25 years since the SNES's debut in Japan, whereas the Mega Drive reached Europe 25 years ago, having arrived in Asia and North America a little earlier. But you get the idea.

Sonic the Hedgehog by Sega

It's amazing to think that a quarter of a century has passed since these digital delights went on sale, with both corporate heavyweights ready for battle. Sega jumped into the new era by bundling Sonic, its prized blue mascot, with the console, and Nintendo retaliated by including a Mario title with its own.

Today's equivalent console battle involves (primarily) Sony and Microsoft, each trying to entice customers with similar titles and features unique to the PlayStation 4 (PS4) or Xbox One. Back then, however, Nintendo focused on younger gamers, or rather family-friendly audiences (and still does), thanks to the endless worlds of Super Mario World, while Sega marketed its device to older audiences with popular action titles such as Shinobi and Altered Beast.

Donkey Kong Country by Rare

But there was one thing the Mega Drive had going for it that made it my favourite console ever: speed. The original Sonic the Hedgehog was blazingly fast compared to anything I had ever seen before, and the sunny background music helped calm the nerves – and the urge to speed through the game without care. The SNES countered with better visuals: just look at the pre-rendered 3D characters and scenery in Donkey Kong Country. No wonder it ended up becoming the second best-selling game for the console.

Street Fighter II by Capcom

The contest between Sega and Nintendo was rough, but Nintendo ultimately came out ahead thanks to significant titles released later in the generation – none more so than Capcom's classic fighting game Street Fighter II. Here was a game that had flooded arcade floors across the world, and now it let friends face off against each other at home.

The frantic sights and sounds of the 16-bit era of gaming completely changed many people's lives, including my own, and the industry as a whole. My siblings and I still fondly remember our parents buying us different consoles (thankfully we were saved from owning a Dreamcast or Saturn). Whether it was the built-in version of Sonic on the Master System or the punishingly difficult Black Belt, My Hero or Asterix titles, our eyes were glued to the screen more firmly than Live & Kicking ever managed on a Saturday morning.

The Sims 4 by Maxis

Today's console games are hyper-realistic, whether in serious ways, such as the over-the-top fatalities in modern Mortal Kombat games, or through comedy, such as having to monitor characters' urine levels in The Sims 4. That forgotten generation of 90s gaming provided just enough visual cues for players to comprehend what was happening, allowing a new world to be created in our minds, like a good graphic novel.

I'm not at all saying gaming has become better or worse, but it is different. While online play has brought its own "advantages" over the years – such as the time a child asked me if I was gay during a Halo 3 battle – there are now very few chances to bond with someone over what's glaring from the same TV screen, other than during "Netflix and chill".

Wipeout Pure by Sony

This is where the classics of previous eras win for emotional value over today's blockbuster games. Working with my brother to complete Streets of Rage, Two Crude Dudes or even the first Halo was a draining, adventurous journey, with all the ups and downs of a Hollywood epic. I was just as enthralled watching him evade the baddies and push Mario to higher and higher platforms in Super Mario World on the SNES just before breaking the fast.

It's no surprise that YouTube's Let's Play culture is so popular. Solo experiences such as Ico and Wipeout Pure can be mind-bending journeys too, into environments with which films cannot remotely compete.

But here’s the thing: playing with friends in the same room was a big social occasion. Now, even the latest Halo game assumes you no longer want physical contact with your chums, restricting you to playing with them without ever being in their company.

Halo: Combat Evolved by Bungie

This is odd, given that I, like many others, only ever played the original title as one half of an effective duo. Somehow these sorts of games have become simultaneously lonely and social – unless one of you decides to take on the logistical nightmare of hooking up a second TV and console next to the one already in your living room.

This is why handhelds such as the Game Boy and PSP were so popular: they forced you to move your backside if you wanted to play together, strengthening your friendships in the process. That was the whole point of the end-of-year "games days" in primary school, after all.

Mario Kart 8 by Nintendo

The industry can learn a thing or two from what made certain titles successful. It's why the Wii U – despite its poor sales performance compared with the PS4 – is an excellent party console, allowing you to blame a friend for your pitfalls in the latest Donkey Kong game, or to taunt them no end in Mario Kart 8. That the latter is the console's best-selling game is somewhat ironic: its crucial local multiplayer feature means one copy can entertain a whole room, so you'd expect fewer physical copies in the wild, not more.

In the same way that social media makes it seem as if you have loads of friends – until you try to recall the last time you saw them – gaming has undergone tremendous change through the advent of the internet. But the best games are always the ones you remember playing with someone by your side.