29 August 2018

Wikipedia has resisted information warfare, but could it fight off a proper attack?

The online encyclopedia’s openness has protected it, but a more concerted effort to interfere could turn that strength into a weakness. 

By Carl Miller

It’s been happening for around a decade now. NATO was probably first, back in 2009. A few years later Russia did it, and then in 2014 the British military too. They had all realised that victory is now won as much in the eyes of the watching public as between opposing armies; that the battlefield extends from a muddy field all the way to online blogs. “To”, as the British Army put it, “change attitudes and behaviour in our favour”. One by one, militaries around the world began to fight what they saw as a new form of warfare, one fit for the information age. Information warfare.

Since then, researchers have begun to uncover what this new kind of warfare looks like in practice. In China, it looks like two million people employed by the government to write 448 million social media posts every year. In Mexico, it looks like 75,000 Peñabots, automated Twitter accounts, busily lionising President Enrique Peña Nieto. In Thailand, it looks like 100,000 students trained as “cyberscouts” to monitor and report online behaviour deemed to threaten national security. And it looks like an office block in downtown St Petersburg full of digitally savvy millennials, each paid by the Kremlin to manage dozens of different online identities.

This strange, new kind of militarised marketing has broken out on blogs, across social media platforms, in the mainstream press; almost everywhere that people can be reached and influenced.

Yet there is one enormous exception. The place that over a billion people visit every month in search not of opinions but of facts. The site that, in 2014, people trusted more than BBC News. The most common destination following any Google search. The source the tech giants themselves are turning to for help in sorting fact from fiction. If I were an information warrior, it would be the first place I’d target. And it seems to be the place where it is happening the least.

Anyone can edit Wikipedia, of course, but it’s not an easy target. It has famously proved resilient to vandalism in the past, which is probably the most significant demonstration yet of the power of the “crowd” in the digital age. All edits are open, vandalism can be rolled back in a second, pages can be locked, and the site is patrolled by a combination of bots and editors. Some of Wikipedia’s users have powers to check IP addresses, and if an editor finds suspicious links between accounts, they can open a sock-puppet investigation to get the user blocked or banned. People have tried to manipulate Wikipedia from the very beginning, and others have worked to stop them for just as long.
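
That patrolling layer is not hidden machinery; it runs on a public feed anyone can read. As a minimal sketch, in Python, using the public MediaWiki API and the third-party requests library (the filtering choices here are illustrative, not any patroller’s actual configuration), this is roughly how a bot or a human patroller pulls the latest human edits to inspect:

```python
import requests

# Query the public MediaWiki API for the most recent non-bot edits:
# the same live feed that anti-vandalism bots and patrollers watch.
API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "recentchanges",
    "rcprop": "title|user|comment|timestamp",
    "rcshow": "!bot",   # exclude edits flagged as bot edits
    "rclimit": 25,
    "format": "json",
}

response = requests.get(API_URL, params=params, timeout=10)
response.raise_for_status()

for change in response.json()["query"]["recentchanges"]:
    print(f'{change["timestamp"]}  {change["user"]!r} edited '
          f'"{change["title"]}": {change.get("comment", "")}')
```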


But let’s imagine we’re not a lone user with a grudge, or a PR company trying to engineer a bit of positive coverage. Instead, we’re Russia, with all the resources and people needed to undertake a years-long mission to influence the watching public.

Research by Stanford University shows that successful hoax articles on Wikipedia, the ones that stayed around, got traffic, and were cited by credible media sources, tended to be longer, well linked, and created by accounts that were older and had longer histories of legitimate edits. That means the key is not to start by attacking Wikipedia’s content, but by infiltrating its community.
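
Purely as an illustration of that finding: the features the researchers identified translate into a crude survival heuristic. The sketch below is hypothetical; the Article structure, thresholds and weights are invented for this example and are not the study’s actual model:

```python
from dataclasses import dataclass

@dataclass
class Article:
    # Hypothetical features, echoing those the Stanford study found
    # predictive of a hoax surviving: length, linkage, and the
    # creating account's age and track record.
    text_length: int          # characters of article text
    inbound_links: int        # links from other Wikipedia articles
    creator_age_days: int     # age of the creating account
    creator_prior_edits: int  # creator's earlier, legitimate edits

def survival_score(a: Article) -> float:
    """Toy heuristic: each feature that made hoaxes harder to catch
    nudges the score up. Thresholds and weights are invented."""
    score = 0.0
    score += 0.25 if a.text_length > 2000 else 0.0
    score += 0.25 if a.inbound_links > 5 else 0.0
    score += 0.25 if a.creator_age_days > 365 else 0.0
    score += 0.25 if a.creator_prior_edits > 100 else 0.0
    return score

print(survival_score(Article(5000, 12, 700, 250)))  # -> 1.0
```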

I’d create a team managing tens of thousands of fictitious Wikipedia editors. This is legend farming. Unlike the creation of crude sock-puppets, it is a technique to create and develop intricate, deep online identities over months or years. Researchers in Oxford, for instance, uncovered a single firm focused on online propaganda that created and managed 40,000 unique identities; each one with multiple accounts on social media, a unique IP address, its own internet address, even its own personality, interests and writing style.
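
In data terms, a legend is nothing exotic: just a bundle of attributes kept consistent over months of activity. A hypothetical sketch of what one record in such a farm might look like (every field and value here is invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Legend:
    # Hypothetical record for one fabricated identity; all fields are
    # invented for illustration. Nothing here is technically hard --
    # the real cost is keeping it consistent for months or years.
    handle: str
    ip_address: str                  # stable, unique exit address
    interests: list[str] = field(default_factory=list)
    writing_style: str = ""          # notes on tone, idioms, habitual typos
    platform_accounts: dict[str, str] = field(default_factory=dict)

legend = Legend(
    handle="history_buff_84",
    ip_address="203.0.113.17",       # documentation-range address
    interests=["military history", "local politics"],
    writing_style="terse, occasional dropped articles",
    platform_accounts={"wikipedia": "HistoryBuff84"},
)
```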

I’d spoof IP addresses to make my legends look as though they came from all over the world, a trivial matter for a state, and over months they’d weave in amongst the 120,000 genuine Wikipedians who actively edit the site. As in any open-source project, reputation and influence are largely earned through trust and long-term contribution, and the first mission of my legends would be simple: to gain reputation and trust within the community. Helping out. Removing vandalism. Paid, in fact, simply to supply genuine, constructive edits to Wikipedia, with absolutely no link to each other.
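
With the IP trail spoofed, investigators are left with behavioural signals such as writing style. As an illustrative sketch only, and not Wikipedia’s actual CheckUser tooling, one simple stylometric signal is the similarity of character trigrams across two editors’ edit summaries:

```python
from collections import Counter
from math import sqrt

def char_ngrams(text: str, n: int = 3) -> Counter:
    """Frequency of character trigrams, a standard stylometric feature."""
    t = text.lower()
    return Counter(t[i:i + n] for i in range(len(t) - n + 1))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two n-gram frequency vectors."""
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Two editors' edit summaries (toy data): high similarity is one weak
# signal, among many, that the accounts might share an author.
editor_a = "rv vandalism. fixed refs per talk page consensus"
editor_b = "rv vandalism, fixed refs as per talk consensus"
print(cosine_similarity(char_ngrams(editor_a), char_ngrams(editor_b)))
```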

Over time, as my legends earned trust, some would eventually run for Wikipedia office. While consensus comes from the community, it is applied by a network of Wikipedians with additional powers to resolve disputes and lock pages. These administrators are themselves chosen by the community, in processes resembling elections. Some of my legends would seek these positions of greater power, pointing to their rich edit histories and long terms of service.

Next, I’d go after the rules. Every claim on Wikipedia needs to be referenced, and I might try to get my administrator legends to nudge the conventions towards being more accepting of my state-owned media (“how is that different from the BBC?”, I’d argue). I might try to tweak the rules around contentious political topics, and even make some of my legends active in the discussions around the identification of sock-puppetry and the levying of sanctions.

Finally, and only after laying that groundwork, I’d go after the content. But don’t expect any of my legends to suddenly try to convince you that the Russian occupation of Crimea is legitimate, or that MH-17 was shot down by the Ukrainian air force. The Russian information campaigns uncovered by Facebook and Twitter have shown little interest in trying to improve Russia’s reputation in our eyes.

Messages are targeted at environmental activists and racists, Wall Street financiers and radical egalitarians. They touch on everything from police brutality to racial tensions, online privacy concerns to alleged government misconduct, gun rights to transgender issues, conspiracy theories to isolationism and anti-interventionism. Some themes dwell on the glorious never-was of Soviet Communism; others offer caricatures of contemporary western politicians. All the edits from my legends would be within the rules of Wikipedia; they’d target pages outside the English language, and stay away from the most popular and heavily policed ones. They’d have a softer, more pervasive, more attritional aim: building a sedimentary bedrock of feeling that everyone is corrupt, that everything is rotten. A war on information as much as information warfare.

No grand attempt to subvert Wikipedia has yet been uncovered, and most Wikipedians I spoke to thought that the processes in place would stop it happening. However, there is an assumption, I think, that because Wikipedia is open, it is resilient. And to crude forms of vandalism, that is probably true. But the danger here isn’t vandalism; it is entryism.

“You can easily set up a crew of editors to subvert the system,” Stevie Benton, the former head of external relations for Wikimedia UK, told me. “There is very little stopping someone with money paying to influence the system by doing it in the ways the system allows. It really would not be difficult to do at scale if people knew how it worked, played by the rules, became established, then used their position within the community. With its scope, scale and openness I think it is extremely naive to think Wikipedia is secure. I’d say it is pretty vulnerable, and evidently so – not from the outside perhaps, but from the inside, very much so.” Openness, Wikipedia’s great strength, could also be its greatest weakness.

Carl Miller is the author of The Death of the Gods: The New Global Power Grab published by William Heinemann. He tweets @carljackmiller. 
