30 June 2014, updated 9 June 2021, 10:39am

How Facebook’s news feed controls what you see and how you feel

The social media giant has allowed scientists to conduct a mass experiment in psychological manipulation on hundreds of thousands of users – without consent.

By Ian Steadman

The internet is a lab, and we are the rats. Facebook, in January 2012, decided to conduct a psychological experiment on nearly 700,000 of its users without their consent, to see if it was possible to manipulate their moods. It was incredibly easy, too. To explain how, we have to delve into how companies like Facebook use something called “A/B testing”.

For at least a decade, A/B testing has been a key technique for web developers. It’s not especially complicated – all it means is that you present two versions of something to users (option A or option B) and see which one they respond to better – but it can have dramatic benefits when implemented properly. Google in particular is fond of it, and has popularised it more than any other company. Brian Christian wrote a great in-depth feature on A/B’s conquest of Silicon Valley for Wired in 2012:

[E]ngineers ran their first A/B test on February 27, 2000. They had often wondered whether the number of results the search engine displayed per page, which then (as now) defaulted to ten, was optimal for users. So they ran an experiment. To 0.1 per cent of the search engine’s traffic, they presented 20 results per page; another 0.1 per cent saw 25 results, and another, 30.

Due to a technical glitch, the experiment was a disaster. The pages viewed by the experimental groups loaded significantly slower than the control did, causing the relevant metrics to tank. But that in itself yielded a critical insight – tenths of a second could make or break user satisfaction in a precisely quantifiable way. Soon Google tweaked its response times and allowed real A/B testing to blossom. In 2011 the company ran more than 7,000 A/B tests on its search algorithm.

The changes that A/B testing is used for can be extremely subtle: the width of text columns, the placement of input fields, the colour of buttons, the order in which categories appear in a drop-down menu. Here’s a blog post from two Facebook developers explaining how they use Airlock, for example. Barack Obama’s 2008 presidential campaign made heavy use of A/B testing in its famously effective social media operation, as his staff – some of whom were hired from Google for their experience in this specific area – tried to work out how best to squeeze donations from supporters.
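Mechanically, there isn’t much to it: users are bucketed deterministically into a variant, the same metric is measured for each bucket, and the difference is checked for statistical significance. Here is a rough sketch of the idea in Python – the function names, the button-colour scenario and the numbers are all invented for illustration, not lifted from any company’s actual tooling:

```python
import hashlib
import math

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user by hashing their ID with the experiment name.

    The same user always lands in the same bucket, so a test can run quietly
    for weeks without anyone's experience flip-flopping between versions.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def two_proportion_z(conv_a: int, total_a: int, conv_b: int, total_b: int) -> float:
    """Two-proportion z-test: is variant B's conversion rate really higher than A's?"""
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (conv_b / total_b - conv_a / total_a) / se

# A hypothetical button-colour test: B appears to nudge sign-ups up slightly.
print(assign_variant("user-12345", "signup-button-colour"))   # "A" or "B"
print(round(two_proportion_z(480, 10_000, 525, 10_000), 2))   # z of about 1.46, short of the usual 1.96 cut-off
```

Run over a big enough slice of traffic, even a tiny difference between the buckets becomes detectable – a point that will matter later.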

When a company has a near-monopoly in a web market, as Google, Facebook and Amazon do, A/B testing also has the advantage of making it easier to introduce innovation from within rather than waiting for a competitor to steal a march. Why bother having an internal debate over how your website should work when you can run an A/B test on some fraction of users and see how people react?

(That said, critics argue that it kills genuine, fresh innovation, focusing instead on quantifying and perfecting existing products or sites without any awareness of bigger questions.)

The key to A/B testing, though, is that it is subtle and that users are unaware of it. If you’ve ever noticed that a website looks a little different to normal, only for it strangely to “go back to normal” the next time you visit, chances are you wandered into the middle of an A/B test.

Fast-forward to today, and the news is this: Facebook has turned A/B testing into a psychological experiment, one so subtle that few, if any, of its subjects would have been aware of it.

A study published in the Proceedings of the National Academy of Sciences details an investigation into what the authors call “massive-scale emotional contagion” – that is, how effectively emotions and moods spread through a group of people when they see other people experiencing them. Clearly, emotional transference is a thing (as anyone who has cried at the sight of someone else crying can attest), but the Facebook study set out to quantify exactly how transferable emotions are, especially in situations where there is no direct message. An advert might cause an emotional reaction, but it’s an advert, explicitly trying to sell an audience something. This was about measuring how influential mere exposure to an emotion is, rather than an interaction with someone experiencing that emotion. The authors write:

[M]ethods used in controlled experiments have been criticised for examining emotions after social interactions. Interacting with a happy person is pleasant (and an unhappy person, unpleasant). As such, contagion may result from experiencing an interaction rather than exposure to a partner’s emotion. Prior studies have also failed to address whether nonverbal cues are necessary for contagion to occur, or if verbal cues alone suffice. Evidence that positive and negative moods are correlated in networks suggests that this is possible, but the causal question of whether contagion processes occur for emotions in massive social networks remains elusive in the absence of experimental evidence. 

They wanted to figure out how an emotion can travel through a network of people even if those people never directly speak or interact with each other; Facebook was perfect for their needs.

The news feeds of 689,003 users were manipulated to show either more “negative” or more “positive” posts, as determined by language-analysis software (a control group, with negative and positive posts mixed in at random, was included for comparison), in the hope of making people either happier or more depressed. After a week of running the manipulated feeds, the researchers found a clear change: those who had seen more positive posts were posting more positively themselves, while those subjected to negative posts were posting more negatively. Facebook had successfully influenced the emotional states of hundreds of thousands of its users without their knowledge.
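The paper says posts were classified with the LIWC word-counting software, and that “emotional” posts were withheld from a given load of the feed with a certain probability rather than deleted. A heavily simplified sketch of that kind of filtering – the word lists, probabilities and function names below are invented for illustration and bear no relation to Facebook’s real pipeline – might look like this:

```python
import random

# Toy stand-ins for the LIWC word lists the study used; the real
# dictionaries contain thousands of entries.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def classify(post: str) -> str:
    """Crude bag-of-words sentiment label for a single post."""
    words = set(post.lower().split())
    if words & POSITIVE and not words & NEGATIVE:
        return "positive"
    if words & NEGATIVE and not words & POSITIVE:
        return "negative"
    return "neutral"

def build_feed(candidate_posts, reduce: str, omit_probability: float = 0.5):
    """Withhold a fraction of posts carrying the targeted emotion.

    In the study, withheld posts weren't deleted - they were simply left out
    of that particular load of the news feed.
    """
    feed = []
    for post in candidate_posts:
        if classify(post) == reduce and random.random() < omit_probability:
            continue  # this post is held back for users in the experimental group
        feed.append(post)
    return feed

posts = ["I love this wonderful day", "feeling sad and lonely", "off to the shops"]
print(build_feed(posts, reduce="positive"))   # the "reduced positivity" condition
print(build_feed(posts, reduce="negative"))   # the "reduced negativity" condition
```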

This study has been met with a barrage of criticism. If your gut reaction is disgust or fear, you’re not alone.

Facebook’s news feed isn’t everything that your friends post to Facebook – instead, it’s a kind of curated selection of highlights, chosen based on all kinds of secret algorithmic factors that determine what constitutes “news” within the world of your friends. The company has previously boasted that it managed to increase voter turnout in the 2010 US midterm elections to the tune of 340,000 people, simply by experimenting with a banner prompting people to get up and go to their nearest polling booth.

That experiment was greeted warmly by those who saw the potential of social networks for civic engagement. Yet this latest study is exactly its flipside, and an apt demonstration of the danger of giving any one organisation or company the ability to carry out that kind of social engineering, however innocent it believes its intentions to be.

When scientists wish to conduct experiments on human subjects they have to obtain permission – in this case, as it’s the US, from one of the institutional review boards attached to every research university (which should have been Cornell’s for this study). The curious thing here is that, according to Forbes and the Atlantic, there appear to have been two big problems with how this study was approved and published: first, the researchers told PNAS and the Cornell review board that the data collection had already been approved by another “local” review board – on the basis that Facebook A/B-tests news feed algorithms all the time anyway – when in reality it had only been approved by Facebook’s own internal team; second, the Cornell board, satisfied that the data had been collected legitimately, waved through permission for what it saw as analysis of a “pre-existing dataset”.

There is a possibility that the study will now be pulled for violating ethics standards, but that shouldn’t detract from the key issue here: if you accept Facebook’s terms and conditions, you are consenting to become a participant in a new experiment like this every time it tweaks its news feed algorithm. Maybe we should be grateful to have been allowed to hear about this one.

Is it ethically dubious? Yes – even if the legal technicality of a terms and conditions form can be used to argue that participants gave “informed consent” to the collection of their data, there’s clearly no way that any Facebook user consented to someone deliberately trying to make them depressed or happy. To argue otherwise is absurd.

Is it scientifically dubious? Again, yes – the actual effect detected was tiny, barely meeting the level of statistical significance. As lead author Adam Kramer wrote in a Facebook post in response to the outcry over his study, “the result was that people produced an average of one fewer emotional word, per thousand words, over the following week”. In fact, the most striking result may be that people whose feeds contained fewer emotional words – positive or negative – posted less to Facebook overall, and vice versa. It implies that the key to getting people to post anything to Facebook at all is to increase the amount of emotion in the posts that they see.
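This is partly just a consequence of sample size: with hundreds of thousands of people in each condition, an effect far too small to matter to any individual still clears the bar for “statistical significance”. A back-of-the-envelope illustration – all the numbers below are invented for the example, not taken from the paper:

```python
import math

def welch_t(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Welch's t-statistic for the difference between two group means."""
    se = math.sqrt(sd_a**2 / n_a + sd_b**2 / n_b)
    return (mean_b - mean_a) / se

# Suppose emotional words make up about 5% of what both groups write, with a
# standard deviation of 5 percentage points, and the treatment shifts the mean
# by 0.1 percentage points - roughly "one word per thousand".
print(round(welch_t(5.0, 5.0, 100, 5.1, 5.0, 100), 2))          # 0.14: nowhere near significant
print(round(welch_t(5.0, 5.0, 340_000, 5.1, 5.0, 340_000), 2))  # 8.25: overwhelmingly "significant"
```

The same tiny shift that is invisible in a small sample becomes overwhelming in a huge one – which is why the interesting question is the size of the effect, not merely whether it exists.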

Everything that happens on the social web happens because of a bargain that many of us made, almost by default: let us use your service for free, and we’ll let you use our data to try to sell us stuff. It’s a bargain that obscures the power dynamic between the individual and the company, and we should feel like we’re getting a bum deal out of it.

Almost every major news site is at the mercy of the news feed algorithm. When it was changed late last year to promote more “serious” stuff and downplay “trivial” stuff, Upworthy lost 40 per cent of its traffic. It’s a neat illustration of exactly how powerful one tweak of code can be.

Think back to the A/B testing at Google, and the fact that it was engineers with experience of that particular skill who were hired for a presidential election campaign. Think of Facebook influencing an election with nothing more complicated than a banner ad. Think of how little choice there is in consenting to any of this – it’s either all in when you tick the box below the terms and conditions, or all out.

Quite aside from the ethical questions raised by the possibility that any Facebook engineer can now decide to try to make hundreds of thousands of people depressed with the change of a few lines of code, there is the larger civic question to consider. There is no neat divide between real life and digital life, and for many people a digital life allows them to escape the disadvantages – of physical ability, of age, of location, of race, of class – that would historically have kept them socially isolated. Yet those tools are controlled by private companies with no obligation to their users beyond the bottom line of a balance sheet, and which can manipulate this environment however they choose.

It might not seem too awful for a light-hearted site like Upworthy to lose traffic, but what if the algorithm stops promoting posts from NGOs that campaign against Facebook advertisers? What if search results favour one political party over another when someone looks for information on a key issue? What if the news feed shows posts that contain one type of hate speech more often than another? What if a tweak to the news feed causes a statistically significant uptick in suicides among those who suddenly feel more alone? What happens to a democracy if the platforms we use to discuss and debate issues are controlled by unelected, unaccountable organisations which can choose which parts of the debate we’re privy to?

It’s issues like these that frighten many of those who reacted with shock to the news of this study, and why it’s bizarre to hear Facebook’s defenders argue that emotional manipulation has been around for as long as advertising. There’s no process for keeping a company like Facebook in check if a tweak to its algorithm both increases its ad revenues by ten per cent and leads to an increase in mental health problems among users. We can accept that the influence measured in this study was minor, while acknowledging that we are in the early days of this kind of sophisticated data control.

Usually I’m quite dismissive of those moral-panic stories built around a claim that “Facebook is changing our brains” or some such hyperbole, but technology is without question changing how humans interact with each other. If we increasingly accept that technology companies need access to the most private parts of our lives in order to provide the products we want – social media, fitness-tracking bracelets, driverless cars, smart thermostats and so on – then we also have to accept that those companies have a financial incentive to test and refine those products, and the data they gather, ruthlessly, and that their primary concern is not our consent.
