Eli Pariser explains the dangers of a personalised web and “what the internet is hiding from you”.
Eli Pariser has an intriguing idea — that the increasing “personalisation” of the internet is trapping us all in echo chambers, where we hear only opinions we already agree with and our interests steadily narrow.
Pariser, the board president of citizens’ organisation MoveOn.Org, points to Google’s switch to personalised search in 2009 as the moment when “the filter bubble” became an urgent topic of discussion. There is no longer an “objective” Google — you receive search results based on your previous searches and other information — and the company reportedly measures 57 “signals” about you every time you search.
I spoke to Pariser about how much companies such as Google and Facebook are shaping the information we see and whether they have a distinct political world-view.
What is a “filter bubble” and why should we be worried about it?
It used to be the case that you’d Google something and I’d Google something and we’d get the same results. Now that’s no longer true. Many sites, including Google, predict what we want to see, based on personal data that we’ve given them. The filter bubble is the personal, unique universe of information that results when we have these algorithms following us around, sifting through data for us and showing us what they think we want to see.
It’s a problem because it’s happening invisibly; you don’t know how your view of the world is being edited — you can get a distorted picture and not even really know it.
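To make the mechanism concrete, here is a minimal sketch in Python of the kind of personalisation Pariser describes: one shared pool of stories is re-ranked for each user according to the topics they have clicked before, so two people making the same request quietly see different results. The stories, topic tags and scoring rule are illustrative assumptions, not any real company’s algorithm.

```python
# A toy model of personalised filtering (illustrative assumptions only):
# rank one shared pool of stories differently per user, based on the
# topics each user has clicked in the past.
from collections import Counter

STORIES = [
    {"title": "Protests continue in Tripoli", "topics": {"world", "libya"}},
    {"title": "Ten photos of adorable puppies", "topics": {"animals", "fun"}},
    {"title": "Markets rally on jobs report", "topics": {"business"}},
]

def personalised_rank(stories, click_history):
    """Order stories by overlap with topics the user clicked before."""
    interests = Counter(t for clicked in click_history for t in clicked["topics"])
    return sorted(stories,
                  key=lambda s: sum(interests[t] for t in s["topics"]),
                  reverse=True)

# Two users issue the "same search" and get different front pages;
# neither sees what was pushed down for them.
alice = [{"topics": {"animals", "fun"}}] * 5   # clicks mostly fun stories
bob = [{"topics": {"world", "libya"}}] * 5     # clicks mostly world news

print("alice sees first:", personalised_rank(STORIES, alice)[0]["title"])
print("bob sees first:", personalised_rank(STORIES, bob)[0]["title"])
```

The invisibility Pariser warns about lives in the last step: the re-ordering happens on the server, and nothing in the interface tells either user that a different ranking even exists.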
Is the filter bubble made worse because of the dominance of a few companies over the net — Google, Apple, Microsoft, Facebook?
Absolutely. Google and Facebook are especially prominent and Microsoft is a close runner-up. These companies have an incredible power to edit and filter what we see and what we don’t. But they don’t think of themselves that way; they don’t seem to be taking much responsibility for the power that they’ve accumulated.
So the problem is that you can’t see what you’re missing?
Yes. You don’t know what the editorial sensibility is. When you pick up the Guardian, you know what its editorial stance is; but you don’t know who Google or Facebook think you are, and you don’t know in what way they are editing information, so you really don’t know what’s being edited out.
Do you think, for example, Facebook’s Mark Zuckerberg has a distinct political world-view?
I don’t think that they’re doing this to shape politics. I think the contrary — it’s sort of a wilful ignorance of the critical implications of this stuff.
These guys are engineers and they tend to be very wary of the messy, social, consensus-orientated culture of politics.
None of these guys would have ever run for office. But here they are, running these huge companies and making decisions that apparently aren’t political, yet have important repercussions for the fabric of society.
Google’s motto once was: “Don’t be evil.” Is it evil now?
I don’t think that they are evil. Except maybe in the way that institutions that are wilfully blind to their responsibilities [are]. I had this conversation with a Google engineer that really stuck with me. He said: “Yeah, we’re not evil but if we wanted to be, boy, we could be.” It’s an incredible amount of trust to place in an institution and they have all the tools at their disposal to do some quite creepy and concerning things.
What was the biggest surprise while researching this book?
There were a lot of surprises because it’s such a hidden realm. One is that there’s a “behaviour market”; increasingly, almost every action you take online can be viewed as a commodity and bought or sold.
That was quite striking to me, that that infrastructure is being built. The other thing is just how far this kind of personalised-search thinking has permeated our culture, not just in the context of Google, or the context of news even, but also in the context of dating sites or restaurant recommendations. Increasingly, it seems like almost every domain is shaped by this kind of algorithmic decision-making.
I’m sure journalists would agree that not all of the results of search-engine optimisation [writing headlines that use popular search terms] are positive.
If the way that you get to a large audience is having a headline that people will click “like” on a lot on Facebook, that changes the kind of headline you write — because you literally don’t want it to be a downer.
I think the “like” button has a really significant effect on both what is produced and what is distributed. It’s not a neutral word. And so, you know, a story about someone overcoming the odds and surviving their fight with cancer gets lots of “likes” — but the war in Libya? That’s harder to click “like” on.
Is there a filter bubble on Twitter — because you only follow people similar to you?
Twitter, until recently, has been bucking this trend somewhat in that it doesn’t make a lot of personalised decisions for you. One of my jumping-off points with the book was that on Facebook I was being shown updates only from my Facebook friends whose political views I agreed with — the others were being edited out, essentially. That doesn’t happen on Twitter — I see the conservatives as well as the liberals. I get to choose.
Do you think that we will outgrow this obsession with personalisation and instant gratification?
I feel hopeful that people are shocked when they hear that this is happening. The more it’s brought to people’s attention, the more [companies] will be pressured to develop products that give people much more fulfilling media.
Personally, where do you get your news from these days?
I find Twitter very helpful because it does allow me to get a taste of a lot of different information in the world. I also rely a lot on standard newspapers; the New York Times and the Washington Post, I think, still do a very good job. I use Google News and I use Facebook, so I don’t want to just go backwards.
So it’s not a case of “ban Facebook and we’ll be fine”?
No, the question is: how do you make the new 21st-century media as good as the best of the earlier centuries’ media?
Can you trick Google?
It’s very hard because the masses of data that they’ve collected on most people are so significant that, you know, one day of clicking links willy-nilly doesn’t throw it off a whole lot. We’re looking at gigabytes of data.
You can clear your cookies and you can go into an incognito window, or a private browsing window, and that helps a bit. But even then, one Google engineer told me there are 57 signals that they can track: what kind of computer you’re on, where you’re located, even what font size you’re using. And all of that then forms a picture of who you are. So I really do think this comes down to putting some pressure on these companies rather than modifying behaviour.
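The list of 57 signals has never been published, but a small sketch suggests why clearing cookies only helps a bit: a handful of observable browser properties, hashed together, can serve as a fairly stable identifier on its own. The signal names below are assumptions based on the examples Pariser gives, not Google’s actual list.

```python
# Hypothetical browser fingerprinting: combine a few observable signals
# into a stable identifier that survives cookie-clearing. Signal names
# are assumptions, not any company's real tracking list.
import hashlib

def fingerprint(signals: dict) -> str:
    """Hash a canonical, sorted rendering of the signals."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(signals.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6)",
    "timezone": "Europe/London",
    "screen": "1440x900",
    "font_size": "16px",  # even this narrows the candidate pool
}

# Same machine, same identifier -- no cookie required.
print(fingerprint(visitor))
```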
Presumably the only way to do that is legislation.
Some of this change can be produced by consumers. When people get to know this is happening, they do want a change. It’s one of the frustrating conversations I get into with engineers at these companies where they say: “Nobody’s demanding better tools on this.” You have to know [about them] first.
There is a legislative component as well, which is about updating the laws around personal information to reflect this new world in which you click in one place and that ripples out. It starts with giving people better control over their own personal information. If the basis of the web is going to be that we hand over all this personal data — which is valuable — in exchange for services, then we need to understand the value of the data.
Have you heard any reaction from Google or Facebook to the book? You haven’t had a late night phone call from [executive chairman] Eric Schmidt?
No, no phone calls from Eric Schmidt. Different people in the company have different reactions. At the very top, they don’t seem very compelled by this. I had a conversation very briefly with [the Google co-founder] Larry Page and he said, “I just don’t think this is a very interesting problem.” And I found that not that helpful. Further down the company, I think there are people sincerely wrestling with it.
What change would you like to come out of publishing the book?
The best scenario would be that people understand how their information is filtered in a more serious way so that they can make better choices.
“The Filter Bubble” is out now (Penguin, £12.99). For more information, visit thefilterbubble.com. Helen Lewis-Hasteley is on Twitter: @helenlewis