Viral hit: we all suffer from an inbuilt psychological bug, exacerbated by the internet. Photo: Marcelo Graciolli on Flickr, via Creative Commons

Omniscience bias: how the internet makes us think we already know everything

The internet is an answer machine, but it doesn’t help us ask better questions. It feeds the illusion that we already know everything we need to know to be well-informed.

Last week, Washington DC was hit by an earthquake. The Republican congressional leader Eric Cantor lost his seat to a Tea Party upstart with the suggestive name of David Brat. This wasn’t just a surprise: it was a shock. Nobody saw it coming – not even Nate Silver.

The political press quickly concluded that Cantor had committed the ultimate political sin of losing touch with the voters. Spending his days and nights in the unreal world of the nation’s capital, absorbed in the high politics of the Capitol, Cantor had forgotten about the people who put him there.

No doubt this is true. But as David Carr, the media correspondent of the New York Times, suggests, certain members of the press might want to get an appointment with an optician to see about that log in their eye.

Carr points out that the only journalists who got even a sniff of the trouble that Cantor was in were from local newspapers. Jim McConnell is a staff reporter at the Chesterfield Reporter, which serves the district in question. He didn’t call Brat’s victory, but he did predict it was going to be very close at a time when everyone assumed Cantor was a shoo-in.

He was able to do this by employing the sophisticated journalistic technique of leaving the office and talking to people. “You could tell wherever you went that Cantor was incredibly unpopular, that people saw him as arrogant,” he told Carr. Meanwhile, members of more prestigious and well-funded national newspapers completely missed the big story about to explode in a district less than two hours’ drive from Capitol Hill.

Carr blames the internet, at least in part. The web is a tremendous boon to reporters: the world’s information is now accessible from a desk or smartphone. But it can also seduce journalists into thinking that they know everything worth knowing. As Carr puts it, “the always-on data stream is hypnotic, giving us the illusion of omniscience.”

Take another story that seemed to come out of the blue: the current violence in Iraq. There’s no shortage of pundits pronouncing with impressive confidence on its causes and ramifications. The real experts tend to be more cautious; they know how little we know about ISIS and its aims. They may also have been left wondering why editors only got interested in this story once pictures started to show up in their Twitter streams.

Actually, I think Carr puts his finger on something with implications far beyond the media. We all suffer from an inbuilt psychological bug, which is exacerbated by the internet. Call it “omniscience bias”: the illusion that we know everything we need to.

In 1987, researchers at the University of Oklahoma ran an experiment in which they gave students a series of problems to solve, and asked them to generate as many solutions as they could. The researchers deliberately gave their subjects a very limited amount of information on each problem. One problem was how to provide enough parking spaces on the university campus, given the limited space available. The students came up with different solutions, including reducing demand for parking space by raising fees or using the space more efficiently.

After the students had generated their answers they were asked to estimate what percentage of possible good solutions they thought they had come up with, while, separately, a panel of experts were asked to compile a database of the possible solutions. It turned out that the average participant generated only about one in three of the best solutions – yet when asked, participants guessed that they had landed on three out of four possible solutions. Not only had they missed most of the best ideas, but they found it hard to imagine there were many alternatives they hadn’t covered.

Psychologists have replicated this or similar effects in different ways: we tend to be over-confident that we have the information we need to form opinions or make judgements. The modern internet feeds this tendency by persuading you that everything you need to know is a click away or coming soon from a feed near you. Google never says, “I don’t know.” It is an answer machine, but it doesn’t help us ask better questions.

Even those paid to be intellectual explorers can be stymied by the apparent certainties of the web. James Evans, a sociologist at the University of Chicago, assembled a database of 34 million scholarly articles published between 1945 and 2005. He analysed the citations included in the articles to see if patterns of research had changed as journals shifted from print to online.

His working assumption was that he would find a more diverse set of citations, as scholars used the web to broaden the scope of their research. Instead, he found that as journals moved online, scholars actually cited fewer articles than they had before. A broadening of available information had led to “a narrowing of science and scholarship”.

Explaining his finding, Evans noted that Google has a ratchet effect, making popular articles even more popular, thus quickly establishing and reinforcing a consensus about what’s important and what isn’t. Furthermore, the efficiency of hyperlinks means researchers bypass many of the “marginally related articles” print researchers would routinely stumble upon as they flipped the pages of a printed journal or book. Online research is faster and more predictable than library research, but precisely because of this it can have the effect of shrinking the scope of investigation.
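The ratchet Evans describes can be illustrated with a toy preferential-attachment simulation (all numbers here are invented for illustration): each new citation goes to an article with probability proportional to the citations it already has, so a small early lead compounds into dominance.

```python
import random

def simulate_citations(n_articles=20, n_citations=2000, seed=42):
    """Toy model of the 'ratchet effect': each new citation picks an
    article with probability proportional to its existing citations
    (preferential attachment), so popular articles get more popular."""
    random.seed(seed)
    counts = [1] * n_articles  # every article starts with one citation
    for _ in range(n_citations):
        # weighted choice: probability proportional to current count
        pick = random.choices(range(n_articles), weights=counts)[0]
        counts[pick] += 1
    return sorted(counts, reverse=True)

counts = simulate_citations()
top_share = sum(counts[:4]) / sum(counts)
print(f"Top 20% of articles hold {top_share:.0%} of the citations")
```

Even though every article starts equal, the feedback loop typically leaves a small minority holding most of the citations – the consensus-forming effect Evans observed.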

According to the psychologist Daniel Kahneman, “our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.” It’s never been easier to go through life assuming you know everything you need to know. But that leaves you more vulnerable to information earthquakes. Just ask Eric Cantor.

Ian Leslie is the author of Curious: The Desire to Know and Why Your Future Depends on It (Quercus, £10.99)

 


How hackers held the NHS to ransom

NHS staff found their computer screens replaced by a padlock and a demand for money. Eerily, a junior doctor had warned of such an attack just days earlier.

On Friday, doctors at Whipps Cross Hospital, east London, logged into their computers, but a strange red screen popped up. Next to a giant padlock, a message said the files on the computer had been encrypted, and would be lost forever unless $300 was sent to a Bitcoin address – a payment in virtual currency that is difficult to trace. The price doubled if the money wasn’t sent within six days. Digital clocks counted down the time.

It was soon revealed that Barts Health NHS Trust, which runs the hospital, had been hit by ransomware, a type of malicious software that hijacks computer systems until money is paid. It was one of 48 trusts in England and 13 in Scotland affected, as well as a handful of GP practices. News reports soon broke of companies hit in other countries. According to Europol, the attack affected 200,000 victims in 150 countries, including the Russian Interior Ministry, FedEx, Nissan, Vodafone and Telefónica. It is thought to be the biggest outbreak of ransomware in history.

Trusts worked all through the weekend and are now back to business as usual. But the attack revealed how easy it is to bring a hospital to its knees. Patients are rightly questioning whether their medical records are safe. Others fear hackers may strike again and attack other vital systems. The defence secretary, Michael Fallon, was forced to confirm that the Trident nuclear submarines could not be hacked.

So how did this happen? The virus, called WannaCry or WannaDecrypt0r, was an old piece of ransomware that had gained a superpower. It had been combined with a tool called EternalBlue, which was developed by US National Security Agency spies and dumped on the dark web by a criminal group called the Shadow Brokers. Computers usually become infected with ransomware when somebody clicks on a dodgy link or downloads a booby-trapped PDF, and normally another person has to be fooled for it to reach a different computer. EternalBlue meant the virus could cascade between machines within a network, copying itself over and over, moving from one vulnerable computer to the next and spreading like the plague. Experts cannot trace who caused it – a criminal gang, or just one person in their bedroom hitting "send".
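The cascade described above behaves like a breadth-first traversal of a network graph: each infected machine probes its neighbours, and every unpatched neighbour becomes a new source of infection. A minimal sketch (the machine names and topology are invented for illustration):

```python
from collections import deque

def simulate_worm(network, patched, start):
    """Toy model of worm propagation: from one infected machine,
    the infection spreads to every reachable *unpatched* neighbour,
    like a breadth-first search over the network graph."""
    infected = set()
    queue = deque([start])
    while queue:
        machine = queue.popleft()
        if machine in infected or machine in patched:
            continue  # already hit, or immune thanks to the patch
        infected.add(machine)
        queue.extend(network.get(machine, []))
    return infected

# Hypothetical five-machine ward network; only 'pc3' is patched.
network = {
    "pc1": ["pc2", "pc3"],
    "pc2": ["pc1", "pc4"],
    "pc3": ["pc5"],
    "pc4": [],
    "pc5": [],
}
print(simulate_worm(network, patched={"pc3"}, start="pc1"))
```

Note that patching 'pc3' also shields 'pc5', which the worm could only reach through it – one reason patch coverage matters beyond each individual machine.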

Like a real virus, it had to be quarantined. Trusts had to shut down computers and scan them to make sure they were bug-free. Doctors – not used to writing anything but their signature – had to go back to pen and paper. But no computers meant they couldn’t access appointments, referral letters, blood test results or X-rays. In some hospitals computer systems controlled the phones and doors. Many declared a major incident, flagging up that they needed help. In Barts Health NHS Trust, ambulances were directed away from three A&E departments and non-urgent operations were cancelled.

The tragedy is that trusts had been warned of such an attack. Dr Krishna Chinthapalli, a junior doctor in London, wrote an eerily premonitory piece in the British Medical Journal just two days earlier telling hospitals they were vulnerable to ransomware hits. Such attacks had increased fourfold between 2015 and 2016, he said, with the money paid to the criminals rising to $1bn, according to the FBI. NHS trusts had been hit before. A third reported a ransomware attack last year, with Imperial College Healthcare NHS Trust hit 19 times. None admitted to paying the ransom.

Hospitals had even been warned of this exact virus. It exploited a vulnerability in Microsoft Windows operating systems – but Microsoft had been tipped off about it and raised the red flag in March. It issued a patch – an update which would fix the flaw and stop systems being breached this way. But this patch only worked for its newer operating systems. Around 5 per cent of NHS devices still run the ancient Windows XP, the equivalent of a three-wheeled car; Microsoft had announced two years earlier that it would no longer create updates for XP, rendering it obsolete.
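The exposure this creates is easy to quantify from a device inventory: any machine on an operating system that no longer receives patches is permanently vulnerable to a flaw like this one. A sketch with invented fleet numbers, roughly echoing the ~5 per cent XP figure above:

```python
def exposure_share(fleet, unsupported):
    """Fraction of devices running operating systems that no longer
    receive security patches. Fleet numbers here are hypothetical."""
    exposed = sum(n for os_name, n in fleet.items() if os_name in unsupported)
    return exposed / sum(fleet.values())

# Invented inventory for a hypothetical trust.
fleet = {"Windows 7": 60_000, "Windows 10": 35_000, "Windows XP": 5_000}
share = exposure_share(fleet, unsupported={"Windows XP"})
print(f"{share:.0%} of devices have no available patch")  # → 5%
```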

There are many reasons why systems weren’t updated. Labour and the Lib Dems were quick to blame the attack on a lack of Tory funding for the NHS, and it is clear cost was an issue. Speaking on BBC Radio 4’s PM programme on Saturday, Kingsley Manning, a former chair of NHS Digital, estimated it would take £100m a year to update systems and protect trusts against cyber attacks. Even if that money were granted, there is no guarantee cash-strapped trusts would ringfence it for IT; they might use it to plug holes elsewhere.

Yet even with the money to do so, updating systems and applying patches in hospitals is genuinely tricky. There is no NHS-wide computer system – each trust runs its own mix of software, evolved through historical quirks. New software or machines may be coded with specific instructions to help them run, and changing the operating system could stop them working – affecting patient care. While other organisations might have time to do updates, hospital systems have to be up and running 24 hours a day, seven days a week. In small hospitals, it’s a man in a van manually updating each computer.

Some experts believe these are just excuses; that good digital hygiene kept most trusts in the UK safe. "You fix vulnerabilities in computers like you wash your hands after going to the toilet," said Professor Ross Anderson, a security engineering expert at Cambridge University. "If you don't, and patients die, excuses don't work and blame shifting must not be tolerated."

It is not yet known if any patients have died as a result of the attack, but it has certainly raised fears about the safety of sensitive medical records. This particular virus got into computer files and encrypted them – turning them into gobbledegook and locking doctors out. Systems were breached, but there have been no reports of records being extracted. Yet the scale of this attack raises fears that in future the NHS could be targeted for the confidential data it holds. "If it’s vulnerable to ransomware in this way, it could be vulnerable to other attacks," said Professor Alan Woodward, a security expert at the University of Surrey’s department of computing.
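The lockout works because this kind of encryption is symmetric: the same key that scrambles a file is needed to restore it, so without the key (or a backup) the data is unreadable. A deliberately weak toy cipher shows the principle – real ransomware uses strong ciphers such as AES, not this XOR sketch:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key.
    Applying it twice with the same key restores the original.
    Illustrative only -- real ransomware uses strong ciphers like AES."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = b"Patient: J. Doe, blood test results..."
key = b"secret"
locked = xor_cipher(record, key)           # gibberish without the key
unlocked = xor_cipher(locked, key)         # same key reverses it
print(locked != record, unlocked == record)  # True True
```

Which is also why the attackers can credibly promise to restore the files on payment: they alone hold the key.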

In the US, there have been examples where ransomware attacks have led to patient data being sucked out, he said. The motivation is not to embarrass people with piles or "out" women who have had an abortion, but because medical information is lucrative. It can be sold to criminals for at least $10, a price 10 times higher than can be earned by selling credit card details. Dossiers with personal identification information – known as "fullz" on the dark web – help crooks commit fraud and carry out scams. The more personal details a conman knows about you the more likely you are to fall for their hustle.

Hospital data is backed up at least hourly and three copies are kept, one offsite, so it is unlikely any medical records or significant amounts of data will have been lost – although the hack will cost the NHS millions in disruption. A British analyst, who tweets under the name Malware Tech, became an unlikely hero after accidentally finding a killswitch to stop the virus replicating. He registered a website, whose presence signalled to the virus it should stop. Yet he admits that a simple tweak of the code would create a new worm able to infect computers.
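The killswitch worked because, before doing any damage, the worm tried to contact a hard-coded domain and halted if the lookup succeeded; registering that domain therefore flipped the check for every new infection. A sketch of the logic – the domain name and resolver here are stand-ins, not the real ones:

```python
def domain_resolves(domain, registered_domains):
    """Stand-in for a DNS lookup: True if the domain exists."""
    return domain in registered_domains

def worm_should_run(killswitch_domain, registered_domains):
    """The worm only proceeds if its hard-coded domain does NOT resolve."""
    return not domain_resolves(killswitch_domain, registered_domains)

KILLSWITCH = "example-killswitch-domain.test"  # placeholder, not the real domain

# Before the analyst registered the domain: lookup fails, the worm runs.
print(worm_should_run(KILLSWITCH, registered_domains=set()))          # True
# After registration: lookup succeeds, the worm halts itself.
print(worm_should_run(KILLSWITCH, registered_domains={KILLSWITCH}))   # False
```

As the analyst noted, the protection is fragile: a variant that simply drops or changes this check would spread unimpeded.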

Experts warn this event could trigger a spate of copycat attacks, and hackers may turn their attention to other public services. Dr Brian Gladman, a retired Ministry of Defence director and former director of security at Nato, points out that our entire infrastructure – from the national grid and food distribution channels to the railways – relies on computer systems. We now face an arms race, and the criminals only have to get lucky once.

"We’re going to get more attacks and more attacks and it’s going to go on," he said. "We’ve got to pay more attention to this."

Madlen Davies is a health and science reporter at The Bureau of Investigative Journalism. She tweets @madlendavies.
