Friedrich Nietzsche certainly had a knack for aphorisms, and there is none better than this, from Beyond Good and Evil: “If you gaze for long into an abyss, the abyss gazes also into you.” In other words, too much exposure to the bad side of human nature is infectious.
Every time I read that quotation now, I think: Oh Freddy, if only you could have lived to see Google Analytics. This is a service that shows web administrators a smorgasbord of data, including which search queries brought readers to their sites. And having run several personal blogs, I can tell you what those are, by and large: poorly spelled attempts to find porn. The simplicity of the requests can give them a plaintive air, and I often find myself wondering what series of life choices ends with a person googling “erectoin” at 4pm on a rainy Thursday.
Anyone working in digital media gazes into the abyss, all the time. A friend who works on a big newspaper website told me how the referral traffic broke down on articles with headlines including the words “Jennifer Lawrence nude pictures” following the theft of her private photos. Most came from search, and a sizeable chunk was click-through from the website’s home page. Hardly any came from Twitter or Facebook. “Lots of people want to look at stolen naked photos,” was the reasonable explanation. “But no one wants their friends to know they’re looking at stolen naked photos.”
How to deal with knowing what your customers really want is perhaps the biggest challenge facing journalism in this century (well, after how to pay for it). In Ye Olden Days (say, ooh, 1999) your choice of newspaper was based not simply on how you preferred it formatted – excruciating puns? Writers who assume you know what derivatives are? – but also on how you wanted to be seen. There was no point sitting down to a bowl of organic muesli in a fair-trade batik dress if a copy of the Daily Express was about to plop on to the doormat.
Now, though, no one knows if you are idly browsing the Mail’s Sidebar of Shame on your smartphone, or hate-reading the Tatler website in the offices of the Morning Star. At the same time, we show off to our friends by sharing individual pieces that make us look good. So sites such as Upworthy go heavy on “inspirational stories” – behold this child with no hands who has learned to play the piano with his tongue! – which get shared like crazy. But a painstaking investigation into the hospital blunders that led Piano Boy to lose his hands in the first place? Not so much.
The horror of staring into the Big Data abyss is not only affecting journalists, as a new book by Christian Rudder shows. Its title contains the most menacing asterisk I’ve ever seen: Dataclysm: Who We Are* (*When We Think No One’s Looking). In it, Rudder discusses what he learned from crunching the numbers generated by the American dating site OkCupid, which he co-founded.
Some of his findings were unsurprising, if dispiriting: women prefer men roughly their own age, whether they are in their twenties or forties; men prefer twentysomethings, even when they’ve hit 50. But it is the data Rudder collected on race that is most challenging, particularly in a culture which seems to assume that if only we can pillory enough individual racists, we can end racism.
On OkCupid, 84 per cent of users answer No to this question: “Would you consider dating someone who has vocalised strong negative bias toward a certain race of people?” Look at the ratings given to individual profiles, however, and a different picture emerges: black women are consistently rated lower than women of other races. They also receive only three-quarters of the first messages that other women do, and their messages are replied to about 75 per cent as often.
This isn’t about “sticking to your own kind”: black men consistently give Asian and Latina women higher scores than black women. And what applies to dating also applies to job interviews. As one research paper put it: “Are Emily and Greg more employable than Lakisha and Jamal?” (Yes.) Rudder notes that these decisions are made instantly, and unconsciously: “By hundreds of small, everyday actions, none of them made with racist intent or feeling, we reflect a broader culture that is, in fact, racist.”
The interesting question here is whether we feel it is OkCupid’s responsibility to do anything to redress this systemic injustice. It would be perfectly possible, for instance, for the site to boost the number of black people’s profiles that appear in its search results, in order to level the playing field.
But that kind of positive action makes the tech world queasy: many of its leading lights are completely in thrall to the idea that their creations are “neutral” or “apolitical”, where those words turn out to mean “reinforcing the status quo”. Rudder gives the example of Google’s autocomplete feature – which responded to me typing “why do black people . . .” with the suggestions “have white palms”, “say aks” (for “ask”) and “like fried chicken”. He points to research suggesting that “autocomplete will eventually perpetuate the stereotypes it should only reflect . . . It’s the site acting not as Big Brother, but as Older Brother, giving you mental cigarettes.” By knowing what people are like right now, it reinforces their existing beliefs and increases the chances of them staying like that.
This brings us back to journalists: for us, the same temptation has always existed. The better you know your readers, the easier it is not to challenge them. We, too, could artificially boost stories that we believe redress systemic injustices: but in a competitive marketplace – and with no one else knowing what any individual reader is reading – will readers just vote with their clicks and go elsewhere? Journalists are looking into the abyss . . . and the abyss is full of traffic.