“Real people often prefer ease of use and a multitude of features to perfect, unbreakable security. Who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family?”
This is a little like saying: “Who uses a car because it has airbags and seatbelts, rather than because it’s a convenient way to get around?”
The Home Office strategy here may be to persuade internet companies to take action by telling them that ordinary people don’t care about security. This would be dangerous and misleading.
Clearly, real people (and who are Rudd’s “not real people”?) do value security in their communications, just as they value safety in their cars. Security is not – or at least does not have to be – the opposite of usability.
For many people, good security makes a service usable and useful. Some people want privacy from corporations, abusive partners or employers. Others may be worried about confidential information, sensitive medical conversations, or be working in countries with a record of human rights abuses.
Whatever the reasons people want secure communications, it is not for the Home Secretary to tell the public that they don’t have any real need for end-to-end encryption.
While Rudd seems to be saying she does not want encryption to be “removed” or bypassed, there are other things she might be looking for. It is possible that she wants internet companies to assist the police with “computer network exploitation” – that’s hacking people’s devices.
It could mean providing communications data about users, which might include: “This user uses this device, often these IP addresses, this version of their operating system with these known vulnerabilities, talks to these people at these times, is online now, is using this IP address, is likely at this address and has visited these websites this many times.”
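To make the point concrete, the kind of record described above can be sketched as a simple data structure. This is purely illustrative – the field names are assumptions for the sake of the example, not any real provider’s schema:

```python
# Hypothetical sketch of a "communications data" record of the kind the
# article describes. Field names and values are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class CommsDataRecord:
    device_model: str
    os_version: str
    known_vulnerabilities: List[str]   # e.g. unpatched CVE identifiers
    recent_ip_addresses: List[str]
    contacts: List[str]                # who the user talks to, and when
    online_now: bool
    likely_address: str
    site_visit_counts: Dict[str, int]  # website -> number of visits

record = CommsDataRecord(
    device_model="ExamplePhone 7",
    os_version="OS 10.3.1",
    known_vulnerabilities=["CVE-XXXX-0001"],
    recent_ip_addresses=["203.0.113.5"],
    contacts=["alice", "bob"],
    online_now=True,
    likely_address="10 Example Street",
    site_visit_counts={"example.org": 14},
)
```

Note that nothing in this record is message *content* – yet taken together the fields are highly revealing, which is exactly why such metadata is valuable to investigators and sensitive for users.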
Alternatively, Rudd might mean pushing out compromised app updates with end-to-end encryption disabled.
However, it is likely to be the police rather than the security services asking for this help. While targeted hacking does offer an investigative option that avoids blanket communications surveillance, it would be risky for the police to have these powers: their training and oversight are not as thorough or exacting as the security services’.
What is completely lacking is any serious attempt to tell the public what the Home Office wants internet companies to do to make people’s end-to-end communications accessible.
We should be told what risks the public would be exposed to if the companies were to agree to the Home Office’s private requests. Have these risks been properly weighed up and scrutinised? What safeguards and oversight would there be?
One risk is that users may start to distrust tech companies and the apps, operating systems and devices that they make. When security vulnerabilities are identified, firms push out updates to users. Keeping devices and apps up-to-date is one of the most important ways of keeping them secure. But if people are unsure whether they can trust pending updates, will they keep their devices up-to-date?
It would be incredibly damaging to UK security if large numbers of people were dissuaded from doing so. A prime example is the WannaCry ransomware attack that paralysed parts of the NHS in May. It spread through old Windows computers that hadn’t been updated, forcing doctors to cancel thousands of appointments.
The government must spell out its plans in clear, precise legislation and subject that legislation to full parliamentary scrutiny, and it should bring security and usability experts into a public debate about these questions.
Measures that deeply affect everybody’s privacy, freedom of expression, and access to information must not be decided behind closed doors.
Ed Johnson-Williams is a campaigner at the Open Rights Group. This article originally appeared on NS Tech, a new division of the New Statesman focusing on the intersection of technology and politics.