David Cameron’s recent demand that companies including Google and Microsoft do more to prevent the trade in child abuse images on the “dark web” exposes a worrying ignorance of how the internet works. Things don’t have to be that way.
It can help to think of Google as working similarly to a fishing trawler – the net gathers everything close to the surface but it misses things further down. Hence the term “deep web”. The “dark web” is the smaller portion of the deep web that can only be reached through anonymising software, though the two terms are often conflated.
Much of the deep web isn’t indexed by Google because it’s boring, or inaccessible without passwords. Or it’s just not needed – imagine if every frame of every YouTube video was indexed, instead of just the title and description.
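Some of that invisibility is deliberate. A site can ask search crawlers to skip whole sections of itself with a robots.txt file placed at its root – a simple, long-standing convention that well-behaved crawlers honour. The paths below are illustrative, not taken from any real site:

```text
# robots.txt – read by crawlers before they fetch a site's pages
User-agent: *          # applies to all crawlers, including Googlebot
Disallow: /members/    # hypothetical password-protected area: don't crawl
Disallow: /archive/    # hypothetical section the site keeps out of search
```

Pages excluded this way are never fetched by a compliant crawler, so they never reach the “surface” that the trawler’s net sweeps.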
People sometimes use the dark web out of necessity, connecting through anonymising programs such as Tor. Among the users are political activists, as well as criminals – such as those behind Silk Road, the online drug marketplace closed down by the FBI in October. The dark web is where gangs that make and exchange child abuse images like to hide.
This is why the government’s response to online exploitation is so frustrating. It has asked Google and Microsoft to improve their algorithms so as to block child abuse images from appearing in search results, something that both companies have agreed to do.
The Daily Mail described the move as a “stunning U-turn”, but Google has been filtering child abuse images for years. A vital element of any search company’s business model is to tweak and improve its search algorithms, and this includes blocking illegal content. The non-profit Internet Watch Foundation, founded in 1996, combs the web to find illegal images and adds them to its blacklist, and many search engines automatically block everything on that list. The politicians seem to be equating a tech company’s size with its ability to sheriff the internet’s Wild West – but asking Google for help with a dark web problem is like reporting a burglary in London to the New York Police Department.
Monitoring the dark web requires old-fashioned detective work, and yet, for all his bluster, Cameron is quietly squeezing the budget of the agency tasked with tracking down those who create and disseminate child abuse images online. The Child Exploitation and Online Protection Centre (CEOP) has had its budget frozen since 2010, and this year it was cut by 10 per cent, to £6m. Since its establishment in 2006, CEOP has saved hundreds of children from abuse. It also runs training schemes for schools and community groups to help prevent child abuse before it happens.
In October 2010 Jim Gamble resigned as head of CEOP over fears that the unit would lose its independence after it was absorbed into the National Crime Agency, which became operational this autumn. This summer, Gamble told PC Pro that he feared Cameron was being given inaccurate information on how to police the web. “I really am concerned that [the Prime Minister] is being poorly advised and that some of the information being set in front of him isn’t as accurate as it should be,” he said.
Perhaps Cameron has bad advisers, or perhaps he is more interested in cheap vote-winning tactics than in effective interventions. Whatever the reason for his approach, the end result is that the government has adopted a faulty strategy to tackle some of the gravest challenges posed by modern technology.