GDPR has made it easier to access our own data – and for hackers to do so too

It was heralded by data security experts as a much-needed, sweeping change. But some elements of GDPR, it appears, are beginning to cause problems. 

When the European Union's General Data Protection Regulation (GDPR) came into force in May, data specialists and privacy experts breathed a collective sigh of relief. After 20 years under outdated legislation drafted in the mid-Nineties, huge swathes of privacy encroachments would finally be addressed. GDPR was treated as a godsend: the be-all and end-all solution to data breaches and data misuse. But only three months into the new regime, the cracks are beginning to show.

Jean Yang, an assistant professor of computer science at Carnegie Mellon University, revealed on Twitter this week how her Spotify account had been hacked – and how the hackers were able to request, and ultimately download, her entire streaming history, date of birth, and card details through a newly accessible function required by GDPR.

The hacker seems to have exploited what is called the subject access right – your right to have access to your data, which was bolstered under GDPR (Yang is based in the US, but Spotify has implemented GDPR globally). In Information Commissioner’s Office (ICO) terms, this right means that, at any given moment, any individual has the right to request access to all of the data a company has, or has collected, on them.

The subject access right has existed in data protection law for 30 years, but before GDPR it came with far more restrictions. Under the Data Protection Act 1998 – GDPR's predecessor – organisations could charge a fee for requests, had up to 40 days to hand over information, and weren't required to provide it in electronic form. GDPR has scrapped the fee, requires information to be supplied without undue delay (and within a month at most), and expects companies to be able to provide this data in a commonly used electronic format.

The personal information a company may have can include any sensitive information you have supplied to it – such as your date of birth, address, credit card number (all affected in Yang’s case) – as well as the information it has collected, such as the songs you've streamed or the websites you often visit. 

But the real concern with GDPR lies in the speed with which this data can now be supplied. Under the updated subject access right, many companies have made access automatic: when you're logged into some accounts, requesting your data can be as simple as clicking a button. While it is, of course, convenient for individuals to have such easy access to the information a company holds on them, it also means that, in the case of a hack, other people could have the same incredibly easy access. Hackers merely need to figure out your password in order to gain access to everything a company has on you.

In an article for the European Data Protection Law Review entitled “Is the Subject Access Right Now Too Great a Threat to Privacy?”, Andrew Cormack, chief regulatory adviser at Jisc technologies, outlines how the subject access right could come back to bite us. In a world where data regulation is in its earliest and, importantly, most naïve stages, Cormack argues that the ability to immediately gain access to a cache of personal information could be dangerous. The risks, in other words, outweigh the benefits.

“Subject access rights would probably increase the incidence of personal records being accidentally or deliberately opened to unauthorised third parties,” Cormack writes, quoting directly from a Lindop Committee on Data Protection report from 1978. He goes on to argue that the subject access right, while seemingly giving more power to the individual, actually does very little in terms of protecting our data because of the risks it poses. He notes that GDPR’s other elements have made the subject access right marginally more useful, but the risk – where a company holding data can just give it away without having to distinguish whether or not the person requesting it is indeed the person the data concerns – is still painfully high.

One answer to this problem, as Yang posed on Twitter, could be to require companies to implement two-factor authentication. Two-factor authentication is a relatively simple process, requiring the data holder to obtain two independent forms of verification from the individual before releasing their data. In a case like Yang's, this would mean that after logging in to their Spotify account, the person would also have to confirm via text or email that it is indeed them requesting access to their data. (It's worth noting that Spotify, in particular, has been hounded for years over its lack of two-factor authentication – meaning this has been a problem for the platform well before GDPR.)
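To make the mechanism concrete, here is a minimal sketch of how a service could gate a data export behind a second factor. All names here (`issue_challenge`, `verify_and_export`) are illustrative, not any real Spotify or GDPR-mandated API: the service generates a one-time code, sends it to the user's registered email or phone, and only releases the export if the code comes back correct.

```python
import hmac
import secrets

# Hypothetical sketch: a one-time code must be confirmed before a
# subject-access export is released, even to a logged-in session.

_pending = {}  # user_id -> one-time code awaiting confirmation


def issue_challenge(user_id):
    """Step 1: generate a six-digit one-time code. In a real system this
    would be sent to the user's registered email or phone number, NOT
    handed back to the active (possibly hijacked) session."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    _pending[user_id] = code
    return code  # returned here only so the example is self-contained


def verify_and_export(user_id, submitted_code, export_fn):
    """Step 2: release the data export only if the submitted code matches
    the pending challenge. Codes are single-use (popped on first attempt),
    and compared in constant time to avoid timing leaks."""
    expected = _pending.pop(user_id, None)
    if expected is None or not hmac.compare_digest(expected, submitted_code):
        raise PermissionError("second factor failed; export refused")
    return export_fn(user_id)
```

With a flow like this, a hacker who knows only the password still cannot trigger the download, because the code goes to a channel the account owner controls.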

But companies aren’t entirely to blame for a lack of two-factor authentication. Charities, small organisations, and local businesses can’t practically be expected to know the ins and outs of data security beyond the law, and two-factor authentication is not a legal requirement. Although what Spotify has done, or failed to do, by handing over data to whoever is logged in on an account, could be considered irresponsible, it is in no way illegal – and, in all likelihood, is generally the norm.

Just three months into GDPR, we're likely to see this type of breach happening more and more often. And until the law changes, we can't expect companies to have the know-how to tell whether they are releasing your data to you or to someone else.

Sarah Manavis is the New Statesman's tech and digital culture writer.