Animating platitudes

The genius of David Foster Wallace

Something Tom Shone says in his piece about writers and booze (about which Seher Hussain blogged here last week) reminded me of David Foster Wallace, who took his own life almost a year ago. Shone compares, unfavourably, the "recovered life" (that of the recovered, or recovering, alcoholic) and "its endless meetings [and] rote ingestion of the sort of clichés the writer has spent his entire life avoiding", with the bibulous life of the carousing writer.

It was that reference to the "ingestion of . . . clichés" that made me think of "DFW" -- specifically, of a passage from his magnum opus Infinite Jest that I discussed in a piece I wrote for the NS in autumn 2008, a couple of months after his death. Here is what I wrote:

At times it seems as if the novel is conducting an argument with itself -- for instance, in a long scene in which Don Gately, a former drug addict who is now a live-in staffer at the halfway house, goes to an Alcoholics Anonymous meeting in Boston. One of the residents in Gately's care is there, too, and complains about the "psychobabbly dialect" that's de rigueur at events like this. Gately admits that the "seminal little mini-epiphanies" routinely experienced by new inductees into AA come embalmed in language of "polyesterish" banality. Then someone else says they also find the sentimental argot hard to stomach -- especially the habit the speakers have of saying they are "here but for the grace of God", which phrase, she points out, is "literally senseless", and should be used only when introducing a conditional clause. Wallace is flattering his hip and savvy readers here, inviting them to identify with this sophisticated cynicism. But it is also clear that we are meant at the same time to find something ridiculous and overwrought about someone who is driven to want to "put her head in a Radarange" by a home-spun solecism or two. Indeed, Wallace said later that the scene was designed to get his readers -- privileged, educated Americans, most of them -- to "confront stuff about spirituality and values", stuff "our generation needs to feel".

I was trying there to excavate what one might call the moralist in Wallace; to separate a part of his writerly personality that was distinct from the metafictional showman of popular repute. This aspect of Wallace is the subject of a magnificent (and, I think, previously unpublished) essay by Zadie Smith that appears in a collection of hers, Changing My Mind, which comes out later this year. Smith quotes a remark Wallace makes somewhere about Wittgenstein's private language argument and how it entails that language must "always be a function of relationships between persons", and goes on to say:

He was always trying to place "relationships between persons" as the light at the end of his narrative dark tunnels; he took special care to re-create and respect the (often simple) language shared by people who feel some connection with each other . . . "In the day-to-day trenches of adult existence," Wallace once claimed, "banal platitudes can have a life-or-death importance." Among his many gifts was this knack for truly animating platitudes, in much the same way that moral philosophers through the ages have animated abstract moral ideas through "dialogues" or narrative examples.

Smith then points out that Wallace was also obsessed by nomenclatures and argots, those "specialized islands of language within the system". According to his editor at Little, Brown, Wallace's last, unfinished novel, The Pale King (an excerpt from which appeared in the New Yorker this year), is an attempt "to weave a novel out of life's dark matter: boredom, banality, the 'irrelevant complexity' of everyday life, all the maddening stuff that stands between us and the rest of the world and through which we have to travel to arrive at joy" -- specifically, as Smith puts it, out of "the specialised language of IRS tax inspectors".

Jonathan Derbyshire is Managing Editor of Prospect. He was formerly Culture Editor of the New Statesman.


It’s been 25 years since the Super Nintendo and Sega Mega Drive were released – what’s changed?

Gaming may be a lonelier pursuit now, but there have been positive changes you can console yourselves with too.

Let's not pretend we know nothing about gaming, however old we are. Surely you'll remember the Super Nintendo console (SNES) and Sega's Mega Drive (or Genesis, if you're American)? Well, it's now been 25 years since they were released. OK, fine: it's been 25 years since the SNES's debut in Japan, and 25 years since the Mega Drive arrived in Europe, having launched in Asia and North America a little earlier, but you get the idea.

Sonic the Hedgehog by Sega

It's amazing to think a quarter of a century has passed since these digital delights went on sale, with both corporate heavyweights ready for battle. Sega jumped into the new era by bundling Sonic, its prized blue mascot, with the console, and Nintendo retaliated by including a Mario title with its own.

Today's equivalent console battle involves (primarily) Sony and Microsoft, each trying to entice customers with similar titles and features unique to the PlayStation 4 (PS4) or Xbox One. Back then, however, Nintendo focused on younger gamers, or rather family-friendly audiences (and still does), thanks to the endless worlds of Super Mario World, while Sega marketed its device to older audiences with popular action titles such as Shinobi and Altered Beast.

Donkey Kong Country by Rare

But there was one thing the Mega Drive had going for it that made it my favourite console ever: speed. The original Sonic the Hedgehog was blazingly fast compared with anything I had ever seen before, and the sunny background music calmed both the nerves and the urge to tear through the game without care. The SNES countered with better visuals: just look at the pre-rendered 3D characters and scenery in Donkey Kong Country. No wonder it became the second best-selling game for the console.

Street Fighter II by Capcom

The contest between Sega and Nintendo was rough, but Nintendo ultimately came out ahead thanks to significant titles released later, none demonstrating this better than Capcom's classic fighting game Street Fighter II. Here was a game that flooded arcade floors across the world and let friends compete against each other side by side.

The frantic sights and sounds of the 16-bit era completely changed many people's lives, including my own, and the industry as a whole. My siblings and I still fondly remember our parents buying different consoles (thankfully we were spared owning a Dreamcast or Saturn). Whether it was the built-in version of Sonic on the Master System or the punishingly difficult Black Belt, My Hero or Asterix titles, our eyes were glued to the screen more firmly than Live & Kicking ever managed on a Saturday morning.

The Sims 4 by Maxis

Today's console games are hyper-realistic, whether seriously, as in the over-the-top fatalities of modern Mortal Kombat games, or comically, as in having to monitor your characters' urine levels in The Sims 4. This forgotten generation of 90s gaming, by contrast, provided just enough visual cues for players to understand what was happening, letting a whole new world take shape in our minds, like a good graphic novel.

I'm not at all saying gaming has become better or worse, but it is different. While the years have brought their advantages (such as the time a child asked me whether I was gay during an online Halo 3 battle), there are now very few chances to bond with someone over what's glaring from the same TV screen, other than during "Netflix and chill".

Wipeout Pure by Sony

This is where the classics of previous eras beat today's blockbuster games for emotional value. Working with my brother to complete Streets of Rage, Two Crude Dudes or even the first Halo was a draining, adventurous journey, with all the ups and downs of a Hollywood epic. I was just as enthralled watching him dodge the baddies and push Mario to higher and higher platforms in Super Mario World on the SNES just before breaking the fast.

It's no surprise that YouTube's Let's Play culture is so popular. Solo experiences such as Ico and Wipeout Pure can be mind-bending journeys too, into environments that films cannot even remotely compete with.

But here's the thing: playing with friends in the same room was a big social occasion. Now even the latest Halo game assumes you no longer want physical contact with your chums, restricting you to playing with them online without ever being in their company.

Halo: Combat Evolved by Bungie

This is odd, given that I, like many others, only ever played the original title as one half of a duo. Somehow these sorts of games have become simultaneously lonely and social, unless one of you undertakes the logistical nightmare of hooking up a second TV and console next to the one already in your living room.

This is why handhelds such as the Game Boy and PSP were so popular, forcing you to move your backside to strengthen your friendships. That was the whole point of the end-of-year "games days" in primary school, after all.

Mario Kart 8 by Nintendo

The industry could learn a thing or two from what made certain titles successful. It's why the Wii U – despite its poor sales compared with the PS4 – is an excellent party console, letting you blame a friend for your pitfalls in the latest Donkey Kong game. Or you can taunt them no end in Mario Kart 8, the console's best-selling game, which is ironic given that its crucial local multiplayer feature lets a single copy serve a whole room of players; you would expect fewer physical copies in the wild.

Just as social media makes it seem like you have loads of friends until you try to recall the last time you saw them, gaming has been transformed by the advent of the internet. But the best games are always the ones you remember playing with someone by your side.