By altering people’s newsfeeds to show more “positive” or “negative” content, Facebook’s “research” sought to understand how feelings can spread. Image: Getty

Laurie Penny on Facebook's manipulation: It can manipulate your mood. It can affect whether you vote. When do we start to worry?

The social network admits manipulating its users’ emotions through the content it put in their newsfeeds. Think that’s creepy? A couple of years ago, it influenced their voting patterns, too. When do we get scared about what Facebook could do with its power?

When you clicked the little box that said you agreed to Facebook’s terms of service, you agreed to be a lab rat. 

The internet is alight with news of a study conducted by the social media company’s research department into “emotional contagion”. Over 600,000 people had their Facebook newsfeeds altered to reflect more “positive” or “negative” content, in order to determine whether seeing more sad messages makes a person sadder. The “negative” content wasn’t entirely censored from the newsfeeds of the test subjects – if you clicked through to a friend’s personal page, you could still see whether he’d had a good day or not. But the newsfeeds themselves were tweaked without warning, and the emotional responses of test subjects tracked, judged by changes in their use of language.

The findings of the study – that people are influenced by the emotions of others online just as they are offline – surprised precisely nobody. The findings are not the point. The point, and indeed the fact that has sent ripples of outrage around the web, is that Facebook can do this. Facebook can manipulate the emotions of hundreds of thousands of people just to see what happens. 

I’ve been digging into this story for a number of weeks now. Having read the paper and spoken to a number of experts in the field, including those who are more informed than me about the dirt-under-the-fingernails procedures of psychological research, I am not convinced that the Facebook team knows what it’s doing. It does, however, know what it can do – what a platform with access to the personal information and intimate interactions of 1.25 billion users can do. 

Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion. Dr Adam Kramer, the man behind the study and a longtime member of the company’s research team, commented in an excited Q&A that “Facebook data constitutes the largest field study in the history of the world.” The ethics of this situation have yet to be unpacked.

I put a number of questions to Facebook’s representatives, including Dr Kramer, over the course of three weeks of phone calls, emails and direct messages. I was repeatedly told that Adam Kramer was too busy to talk to me and would remain too busy for the foreseeable future, although Dr Kramer himself told me that he couldn’t speak to me without the say-so of the press team. Facebook were unavailable for comment. Facebook went to some lengths to be as unavailable as possible for comment without directly telling me where to shove my inquiry. Facebook were unavailable for comment in the way that a man who, on hearing the doorbell, runs out of the back door and over the garden wall is not at home to visitors.

I asked if it was possible for users to find out if their own newsfeeds had been altered. No answer. I asked if it was possible for users to opt out of any further such studies. No answer, but if I’d got one, I suspect it would have been “no” – all users agree implicitly to be experimented upon when they sign up for the service. I asked if anyone had bothered to check up on all the people in whom negative emotions were apparently induced. No answer.


The one thing Facebook’s representatives would tell me is that yes, they had indeed carried out the study and yes, they had been looking into the effects of emotional contagion for some time. Right now, the internet is outraged that Facebook played with the emotions of over half a million users in the name of research, without their consent. But one key thing to remember here – and what becomes clear upon reading through half a decade’s worth of news reports – is that Facebook have been doing this for years. The company’s suggestion that it doesn’t understand all the fuss about its “emotional contagion” research is rather undermined by the fact that it has been conducting said research at great expense and for some time. 

Their barely-consensual experiments with manipulating emotion and ideologies, rather than merely tracking their patterns, are not new. They are ongoing. Facebook is deeply invested in “what happens when you apply the science of how people relate to each other to social technology”, to quote from the prospectus for their recent “Compassion Research Day”. The ethics of altering people’s experience of the world on this scale, without their consent, for the purposes of research, do not appear to trouble the Facebook research team.

Emotional engineering is, and always has been, Facebook’s business model. It is the practice of making itself socially indispensable that has ensured that, for many millions of people, Facebook has become the default front page of the internet. Their newsfeed is literally that – it’s the first place many of us go to find out what’s been happening in the world, and in the worlds of those we love, those we like, and those we once met at a party and got an awkward friend request from two weeks later. Nathan Jurgenson, a social media theorist, researcher at Snapchat and editor at the New Inquiry, told me: “This is part of the terms of service. All design plays at our emotions. That study didn’t mention the like button, which is itself emotional engineering.” 

Of course, actual newspapers have been doing this since the days of hot type. They select stories to represent a particular worldview, alter content to suit their advertisers, change headlines, circulate propaganda. But Facebook is not a newspaper: it is a distribution platform, a site of social exchange. The equivalent here is not, just for example, the Sun newspaper deciding to back up Rupert Murdoch’s view of the world, but your local pub, shop, supermarket and post office suddenly refusing to stock anything but the Sun without prior warning. 

Most experiments of this sort, although there have been few on this scale, offer subjects a blind choice on the assumption that the choices will be harmless to them. Aleks Krotoski, an expert in internet research methods who has a PhD in social research, told me that while the study passed two ethics research boards, “for such a networked study, ethics boards consider the following when waiving the need to gain informed consent: the research must not involve greater than minimal risk”. What Facebook has done, by its own claims, is not harmless. The test induced negative emotions in tens of thousands of people in order to prove a point. 

Writing at Medium, Zeynep Tufekci identifies this as “a Gramscian model of social control: one in which we are effectively micro-nudged into ‘desired behaviour’ as a means of societal control”. That sort of control was never possible on this scale before. Television couldn’t do it. Radio couldn’t do it. Newspapers couldn’t keep track of how an individual reader was feeling and who they were talking to and determine what message to send. Social media companies can do all of that, and more. This has already extended to control of voting behaviour.

Facebook has already manipulated the voting behaviour of its users, and bragged about it, too.

I’m going to give you a second to consider the implications of that.

Here’s what happened. In 2010, Facebook made small, experimental alterations to the banners it released reminding US citizens to vote. Most users saw a banner encouraging voting, with images below the banner of Facebook friends who had already voted, or who had at least clicked the button claiming so (four per cent of people turned out to be lying about that). Two randomly-selected groups of 600,000 users – this seems to be the magic number for Facebook’s in-house wonks – saw a message without the faces of their friends, or no message at all. The 2012 study based on this data claimed that Facebook’s “get out and vote” messages may have caused an extra 340,000 votes to be cast, and that merely manipulating the message changed those numbers by tens of thousands. They determined this by examining public voter rolls: cross-checking private status information against the records the state holds on your political activity. 

The abstract proudly declares that its results “show that the messages directly influenced political self-expression, information seeking and real-world voting behaviour of millions of people”. The lead researcher on the voter-manipulation experiment, Dr J Fowler, told CNN: “If we want to make the world a better place on a massive scale, we should focus not just on changing a person’s behavior, but also on utilising the network to influence that person’s friends.”

For you and me, this is a massive secret political experiment on the creepy-totalitarian side of interesting. For a senator, or a Member of Parliament, this news means and meant much, much more. It means power. Power of a new and breathtaking kind. Power that demands to be paid attention to and courted. 

What if Facebook, for example, chose to subtly alter its voting message in swing states? What if the selected populations that didn’t see a get-out-and-vote message just happened to be in, say, majority African-American neighbourhoods? The fact that Facebook are obviously good guys who get movies made about them with Aaron Sorkin scripts and Trent Reznor soundtracks and would obviously never do such a thing doesn’t change the fact that they could do it, and more if they chose, and then claim it as research. 

The studies, taken individually, are creepy quasi-consensual experiments on individuals’ most intimate feelings and most important choices. Taken together, they form a terrifying pattern.

Facebook’s service is not free. Facebook’s product is your information, your worldview, your memories and experience, and that is what you pay with every time you log in. That information is power of a quality that can be traded upon and sold.

The simple answer would, of course, be that if you don’t want to be spied on, emotionally manipulated and studied, quit Facebook. But that’s not how the modern economy of information works. It never has been. There is a cost to not participating in these new networks. The choice not to participate is the choice to miss out on events, birthdays, status updates. Your best friend’s wedding photos. Your young cousin’s call for help at four in the morning. Professional networks, like “Binders Full of Women Writers”, the great big glorious global group of authors and journalists I got an invite to last week. And, most importantly, news about the world. The recent articles I’ve read on the Facebook emotional contagion study have been linked on my Facebook newsfeed. I find my own emotions negatively impacted by the news, and can’t help but wonder if anyone’s tracking that fact, if perhaps I should try to throw them off the scent by posting some pictures of adorable baby sloths.

More people live a part of their lives on Facebook than live in any single country on earth, apart from China. It is, effectively, a country itself, a country of pure information where the authorities know everything you do and can change everything you see, without even telling you first. They can make sure you only hear happy news on a particular day, to encourage you to buy more MacGuffins. And they can manipulate the way you vote.

If Facebook is a country, then it is a corporate dictatorship. This is not a metaphor. I believe that it is beyond time that we began to hold social networking not just to the laws of the market, but to the common laws of the societies we live in and the societies we want to see. Principles like the right to receive information without impediment. Principles like not making tens of thousands of people sad for your personal gain. Principles like corporations not messing about with the voting behaviour of their users in any way, for any reason. And right now is when those principles, those precedents, will be decided.

Laurie Penny's Unspeakable Things: Sex, Lies and Revolution is available for pre-order. She will also be in conversation with classicist and author Mary Beard on 30 July at Conway Hall, London. More details and tickets here.


Laurie Penny is a contributing editor to the New Statesman. She is the author of five books, most recently Unspeakable Things.

Photo: mdl70 via Flickr

Childhood mythology is being revamped by digital monsters like Slenderman

The stories the younger generation tell one another are just as rich, and as terrifying.

“I am the spirit of dark and lonely water, ready to trap the unwary, the show-off, the fool...”

The eldritch tones of Donald Pleasence will breathe a chilly memory along the vertebrae of those of a certain age. That 1973 public information broadcast about water safety lingers long in the memory. But why? There was something about the hooded figure depicted beside the river where children cavorted that just resonated. A shadow, a memory, a whisper. It was as if we'd almost heard it before. It wasn't just a warning. It was a story.

Before Donald Pleasence kept Britain's children from its treacherous riverbanks, we had tales of ubiquitous river-hags. Two good examples are Peg Powler, who haunts the banks of the Tees, and Jenny Green Teeth, who stalks Shropshire's waterways. Both of these terrifying water spirits live to drag youngsters to a watery lair. These monsters span the world, from the native Penobscot people of Maine, with their child-luring swamp-woman Skwaktemus, to the Inuit's Qalupalik, a green-skinned water witch that reaches up from below the ice to snatch wayward and disobedient children.

These stories are geographically distant but carry essentially the same message: “Stay away from the water”. Predatory creatures embody a very real fear. The unimaginable nightmare of our children in real peril is blunted by the presence of a monster.

Children need stories as much as adults do; stories make sense of reality when reality is hard to understand. Stories are told to be re-told, to be embellished, to raise heroes and to make monsters.

For people of the Dark and Lonely Water generation, including me, it's easy to assume that today's kids have lost the art of storytelling. We say social media has diffused, has numbed, has snuffed the flame of imagination. Yet perhaps it hasn't. Perhaps we just got old. The stories the younger generation tell each other are just as rich. Monsters are still being made. This world of ours is still being understood.

There is real danger out there. There are real monsters. But now they come in new forms, they lurk in new lairs.

Today, the internet is the new hunting ground of the monster. Grooming, trolling, cat-fishing and scamming have become the MOs of the vile in our society and, as if in direct response, legends and myths have sprung from the same place.

Creepypasta, 4Chan and /nosleep are breeding colonies of legend. Forums and social media have taken the place of the skipping-rope chants and the childhood whispers. Young people still know Bloody Mary, yet Black-Eyed Kids and the Goatman have usurped her throne. Nefarious rituals and games like Hooded Man or Elevator to Another World have been born of the internet age, submitted as stories and experiences. Like the Spirit of Dark and Lonely Water in the 70s, they have touched a common nerve.

The most iconic of these net-dredged horrors is Slenderman: born of a paranormal Photoshop competition, his legend has transmogrified into an internet Tulpa, the power of which played a significant part in the decision of 12-year-olds Anissa Weier and Morgan Geyser to stab their friend Bella Leutner 19 times. He is strengthened with every share, every fan image, every account of his being. It is no wonder that he has become flesh in the hearts and minds of those who need him, who want to escape into his world.

The readers of the forums that bear these eldritch fruits know the content isn't real. Yet disbelief is intentionally suspended. As it says in the /nosleep board guidelines: “Everything is true here, even if it's not. Don't be the jerk in the movie theater [sp] hee-hawing because monkeys don't fly.” It's disrespectful to negate the skill or the talent that it takes to write a story or make an image on Photoshop. This leaves space for storytelling. We need stories.

These stories stir something inside us, work like bellows on the flames of our imagination. Images, anecdotes and instructions - they are monsters we have the power to control. Online, we can pass on the whispers, we have the ability to interact with the shadows. Online we can be the purveyors of this mythology. We can tell each other stories. We can control. If we make a monster, it is ours. Most importantly, we can escape from reality and immerse ourselves in our monsters.

Monsters are not the only things lurking online; there are games and rituals too, rich in their own mythology. The illicit Ouija boards in the parks and graveyards of my own childhood are dwarfed by the trans-cultural crucible of today's games. With the ingenuity of Koji Suzuki's cursed video in Ring, far-eastern influence and technology are pervasive throughout. Japan can boast the ghost-summoning Satoru-kun and the White Kimono game, both alleged to summon spirits, Satoru-kun specifically with a mobile phone. Many more of these games are sprouting up all over the world, such as Mexico's 'El Juego del Libro Rojo' (Red Book game) and Portugal's 'Ritual da Televisão' (Television ritual), and nearly all carry grave warnings.

These nebulous games, like the internet's monsters, carry their own stories. Peruse Reddit and you'll find accounts and speculation from those who claim to have played and had their lives altered by what they've done. The comments below the hundreds of accounts begging for advice are a mix of sincerity and concern.

“Dude, luck only lasts so long, and even longer less when you tempt things you know nothing of.”

“OP, you messed up big time. You're always supposed to follow the rules of the game as completely as you can!”

“You idiot! Ghost games are not for play! Especially japanese ones, they are dangerous”

“Get some sage. Burn the sage, and wave it into every corner of every room in the house... I would recommend putting salt across doorways and window sills, anything that would be an 'entry' into the house, but it sounds like you may have summoned it inside the house”

Everything is true here even if it's not.

But how can we know for sure? Do we really know that the user didn't summon something terrible from the void they opened with one of these games?

That's what makes them so compelling. That's what makes Slenderman, Smile Dog and Jeff the Killer so iconic. Like that friend of your mate's brother who went mad after he did an Ouija board down the park, we are still whispering, we are still embellishing and interacting.

The internet is an open, uncharted landscape, free of the real world's rules, in which to weave mythology; the quest is to be the creator of something that wriggles from our grasp and is embraced, formed and made flesh by a collective consciousness. Are we in some way thankful for these creatures that bring us together over oceans and time zones?

Phones and tablets in the hands of our children are frightening to us: they are the unknown, the window into an abyss. Yet from that abyss, we are like our ancestors, toasting heraldry and horror, and making new myths.

Hydra by Matt Wesolowski, published by Orenda Books is out now.