For Sale: NHS Records. Condition: morally dubious

Should we be worried about the monetisation of access to NHS records?

It’s finally upon us. In what many will argue is a victory for science, but a blow to privacy, this September will see the launch of the Clinical Practice Research Datalink (CPRD), the controversial database that will make NHS records available for research – at a price.

Scientifically, there’s a lot to celebrate. The CPRD will offer researchers in the health sciences unheard-of access to the life records of 52 million English NHS users and, in time, will be connected to other databases, such as those that deal with genetics and mental health. What this will provide is a resource for large-scale, longitudinal studies of a kind researchers could previously only dream of. To understand the benefits one need only consider the important findings of previous studies that have used NHS records, notable examples of which include revealing a link between power lines and leukaemia, dismissing the proposed link between autism and the MMR vaccine, and uncovering the disastrous effects of thalidomide.

David Cameron was right to argue that it’s “simply a waste” for NHS data not to be used “to make new medical breakthroughs”. Yet the current setup only makes sense if the CPRD’s main interest is monetary, rather than scientific, profit. Note, for instance, the emphasis in the first paragraph of the Department of Health’s webpage about the project’s launch:

In March 2011, The Government launched its ‘Plan for Growth’ which details steps needed to enable the British economy to become more internationally competitive. As part of this initiative The Government pledged to build a consensus on using e-health record data to create a unique position for the UK in health research.

Though scientific rationales are mentioned later, it’s quite clear that the economic benefits are foremost in their minds. Taken in this light, perhaps the CPRD should be seen as nothing new. From education to transport to policing, the government is surveying the welfare state with sparkly pound signs in its eyes. Yet there’s a subtle difference between such asset stripping and what has happened with the CPRD, in which that hungry look has fallen on the population itself. There’s a new and radical idea here – the possibility that one of England’s most lucrative assets is us. This isn’t mere speculation: the life sciences industry is currently worth £50bn a year, and the CPRD, with its unparalleled mass of data, is an irresistible honey pot that will entice global pharma back to our shores.

Yet nothing ever comes for free, and the price we are set to pay is an infringement, however slight, on our privacy and rights. Though there is the opportunity to opt out of the system, MPs intend, despite huge protest, to rewrite the NHS constitution to presume patient consent. There have also been grave concerns over anonymity. Though no names will be included with records, postcodes and age profiles will remain attached, meaning that in some cases publicly known information will make it possible to trace anonymous records back to individuals. As a report from the Royal Society stated in June:

It had been assumed in the past that the privacy of data subjects could be protected by processes of anonymisation such as the removal of names and precise addresses of data subjects. However, a substantial body of work in computer science has now demonstrated that the security of personal records in databases cannot be guaranteed through anonymisation procedures where identities are actively sought.
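
To see how fragile that anonymity is, here is a deliberately simplified sketch in Python. Every record, name and postcode in it is invented; the point is only that stripping names achieves little when the postcode and age still attached to a record can be matched against publicly available information.

```python
# Illustrative only: all data below is invented.
# "Anonymised" health records still carry quasi-identifiers (postcode, age)
# that can be joined against public sources such as an electoral roll.

health_records = [  # names removed, but postcode and age retained
    {"postcode": "BS1 4ND", "age": 34, "diagnosis": "type 2 diabetes"},
    {"postcode": "SW1A 1AA", "age": 67, "diagnosis": "hypertension"},
]

public_register = [  # publicly known information about named individuals
    {"name": "J. Smith", "postcode": "BS1 4ND", "age": 34},
]

for record in health_records:
    for person in public_register:
        if (person["postcode"], person["age"]) == (record["postcode"], record["age"]):
            # A unique postcode/age combination is enough to undo the anonymity.
            print(person["name"], "->", record["diagnosis"])
```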

And this isn’t even taking human error into consideration. Consider the furore in June last year when a laptop holding 8.6 million medical records went missing. Centralisation projects like the CPRD only make incidents like this more likely and more damaging.

Perhaps it should be taken as a sign of the times that an egalitarian institution, which arose from the post-war ashes on the belief that every individual should be valued and given the right to health, is now becoming one in which those same individuals are increasingly valued as profitable data points. It’s the type of ideology and practice we are more used to seeing in the likes of social media, but which is rapidly permeating society.

By practice, I am, of course, talking about data-mining. Since the term popped up as an innocuous 90s buzzword, the subtle statistical craft has become a dominant, and highly lucrative, marketing force. Simplistically, it’s the process of running algorithms over huge amounts of data to reveal powerful associations from seemingly irrelevant information, granting the investigators immense inferential power. Worst of all, data-mining is insatiable. As people have finite pockets, there is a threshold beyond which a population can’t consume any more, and data-mining becomes an arms race in which companies are pressured to paw through our psyches for ever more invasive information in a scramble to regain their edge. Admittedly, as faceless, nameless number-crunching, data-mining doesn’t infringe upon personal privacy, but it could be argued that it is an assault on our personal integrity.
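
To make that process a little less abstract, the following is a toy Python sketch of the basic move: counting which items turn up together in shopping “baskets” more often than chance would predict. The data is invented and real commercial systems are far more sophisticated, but the principle – powerful inferences drawn from apparently mundane records – is the same.

```python
# Toy association-mining sketch; the baskets below are invented.
from collections import Counter
from itertools import combinations

baskets = [
    {"pregnancy test", "unscented lotion", "vitamins"},
    {"unscented lotion", "vitamins", "cotton wool"},
    {"beer", "crisps"},
    {"pregnancy test", "vitamins"},
]

item_counts = Counter()
pair_counts = Counter()
for basket in baskets:
    item_counts.update(basket)
    pair_counts.update(combinations(sorted(basket), 2))

n = len(baskets)
for (a, b), together in pair_counts.items():
    # "Lift" > 1 means the pair co-occurs more often than independence predicts.
    lift = (together / n) / ((item_counts[a] / n) * (item_counts[b] / n))
    if together > 1 and lift > 1:
        print(f"{a} + {b}: lift {lift:.2f}")
```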

I’m therefore always surprised at how ambivalent, even welcoming, people tend to be towards the idea. ‘Surely advertising tailored to me is a good thing’, the reasoning goes. Yet this argument is based on a conception of people as completely rational agents. I speak as a psychology graduate when I say: trust me, we’re really not. A massive amount of private scientific research is devoted to unpicking consumer behaviour and mapping the subconscious, emotional and impulsive factors driving our buying habits. What’s more, the influence is so subtle that individuals aren’t even conscious of its impact on their actions. Yet we should never forget that knowledge is power. What data-mining sells is access to the inner workings of a population, and what is bought is the ability to manipulate behaviour. If you don’t believe this, then ask yourself why so many big corporations are throwing their best minds and resources at the practice.

So should we be worried about the data-mining that the CPRD will facilitate? Admittedly, it isn’t dealing in anything as candidly invasive as kidneys hawked in a piss-reeking back alley, but one doesn’t need much imagination to see that, were the CPRD to give too much access to the likes of drug companies and other private industries, this would be a very troubling state of affairs. Of course, there are limits to the use of the database: under the current framework, data can only be used for medical research and all projects must publish their results publicly. The Medicines and Healthcare Products Regulatory Agency, the governmental agency running the scheme, will also charge private companies double the academic rate. Yet the pockets of drug companies are very deep, and giving them any kind of access opens up opportunities for manoeuvre. Not to mention that once such resources become monetised, it’s an easy step to start loosening the conditions under which data can be used.

Even the Information Commissioner’s Office itself has suggested applications such as the creation of an encryption key to be shared by the NHS and supermarkets, which would allow the diabetic status of individuals to be correlated with their supermarket purchases. Big Brother issues aside, the idea that supermarkets, or any business, could have this sort of access is terrifying. As any marketer worth their salt knows, two of the most effective sellers are fear and sex, both of which are heavily rooted in health.
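
It is not hard to imagine how such a scheme would work in practice. The sketch below is hypothetical – the shared key, names and figures are all invented – but it shows, in Python, how two organisations holding a common secret can replace identities with matching pseudonyms and then link their records without ever exchanging a name.

```python
# Hypothetical illustration: a shared secret key lets two databases be joined
# via keyed-hash pseudonyms, with no names exchanged. All values are invented.
import hashlib
import hmac

SHARED_KEY = b"secret shared between the two organisations"

def pseudonym(identity: str) -> str:
    """Deterministic pseudonym: the same person maps to the same token in both databases."""
    return hmac.new(SHARED_KEY, identity.encode(), hashlib.sha256).hexdigest()

nhs_records = {pseudonym("jane.doe@example.com"): {"diabetic": True}}
supermarket_records = {pseudonym("jane.doe@example.com"): {"weekly_sugar_grams": 420}}

for token, health in nhs_records.items():
    if token in supermarket_records:
        # The two organisations can now correlate health status with purchases.
        print(health, supermarket_records[token])
```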

Despite such arguments, it’s important not to lose track of the fact that in principle the CPRD is an excellent humanitarian project. Though there are significant dangers involved, standing in the way of scientific progress is never the responsible answer to controversy. As with nearly all modern technology, morality lies in the application and the ideology behind it, and this is where we should exercise caution, especially as the economic motivation of the government is far from reassuring. Vigilance is needed.

David Cameron speaking on NHS reforms (Image: Getty)

Emma Geen is a freelance writer. She tweets @EmmaCGeen and blogs at

This Ada Lovelace Day, let’s celebrate women in tech while confronting its sexist culture

In an industry where men hold most of the jobs and write most of the code, celebrating women's contributions on one day a year isn't enough. 

Ada Lovelace wrote the world’s first computer program. In the 1840s Charles Babbage, now known as the “father of the computer”, designed (though never built) the “Analytical Engine”, a machine which could accurately and reproducibly calculate the answers to maths problems. While translating an article by an Italian mathematician about the machine, Lovelace included a written algorithm which would allow the engine to calculate a sequence of Bernoulli numbers.
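
For the curious, the same sequence can be produced today in a few lines. The Python sketch below uses a standard modern recurrence for the Bernoulli numbers; it is purely illustrative and makes no attempt to reconstruct the scheme Lovelace actually devised for the Analytical Engine.

```python
# Generate Bernoulli numbers with the standard recurrence
# B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k, starting from B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list[Fraction]:
    """Return the exact Bernoulli numbers B_0 .. B_n."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B

print(bernoulli(8))  # B_0..B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```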

Around 170 years later, Whitney Wolfe, one of the founders of dating app Tinder, was allegedly forced to resign from the company. According to a lawsuit she later filed against the app and its parent company, she had her co-founder title removed because, the male founders argued, it would look “slutty”, and because “Facebook and Snapchat don’t have girl founders. It just makes it look like Tinder was some accident”. (They settled out of court.)

Today, 13 October, is Ada Lovelace Day – an international celebration of inspirational women in science, technology, engineering and mathematics (STEM). It’s lucky we have this day of remembrance, because, as Wolfe’s story demonstrates, we also spend a lot of time forgetting and sidelining women in tech. In the wash of pale male founders of the tech giants that rule the industry, we don’t often think about the women who shaped its foundations: Judith Estrin, one of the designers of TCP/IP, for example, or Radia Perlman, inventor of the spanning-tree protocol. Both inventions sound complicated, and they are – they’re some of the vital building blocks that allow the internet to function.

And yet David Streitfeld, a Pulitzer Prize-winning journalist, somehow felt it accurate to write in 2012: “Men invented the internet. And not just any men. Men with pocket protectors. Men who idolised Mr Spock and cried when Steve Jobs died.”

Perhaps we forget about tech’s founding women because the needle has swung so far in the other direction. A huge proportion – perhaps even 90 per cent – of the world’s code is written by men. At Google, women fill 17 per cent of technical roles. At Facebook, 15 per cent. Over 90 per cent of the code repositories on GitHub, an online service used throughout the industry, are owned by men. Yet it’s also hard to believe that this erasure of women’s role in tech is completely accidental. As Elissa Shevinsky writes in the introduction to Lean Out, a collection of essays on gender in tech: “This myth of the nerdy male founder has been perpetuated by men who found this story favourable.”

Does it matter? It’s hard to believe that it doesn’t. Our society is increasingly defined and delineated by code and the things it builds. Small slip-ups, like the lack of a period tracker on the original Apple Watch, or fitness trackers too big for some women’s wrists, gesture to the fact that these technologies are built by male-dominated teams, for a male audience.

In Lean Out, one essay written by a Twitter-based “start-up dinosaur” (don’t ask) explains how dangerous it is to allow one small segment of society to build the future for the rest of us:

If you let someone else build tomorrow, tomorrow will belong to someone else. They will build a better tomorrow for everyone like them… For tomorrow to be for everyone, everyone needs to be the one [sic] that build it.

So where did all the women go? How did we get from a rash of female inventors to a situation where the major female presence at an Apple iPhone launch is a model’s face projected onto a screen and photoshopped into a smile by a male demonstrator? 

Photo: Apple.

The toxic culture of many tech workplaces could be a cause or an effect of the lack of women in the industry, but it certainly can’t make it easy to stay. Behaviours range from the ignorant – Martha Lane-Fox, founder of lastminute.com, was often asked “what happens if you get pregnant?” at investors’ meetings – to the much more sinister. An essay in Lean Out by Katy Levinson details her experiences of sexual harassment while working in tech:

I have had interviewers attempt to solicit sexual favors from me mid-interview and discuss in significant detail precisely what they would like to do. All of these things have happened either in Silicon Valley working in tech, in an educational institution to get me there, or in a technical internship.

Others featured in the book joined in with the low-level sexism and racism of their male colleagues in order to “fit in” and deflect negative attention. Erica Joy writes that while working in IT at the University of Alaska as the only woman (and only black person) on her team, she laughed at colleagues’ “terribly racist and sexist jokes” and “co-opted their negative attitudes”.

The casual culture and allegedly meritocratic hierarchies of tech companies may actually be encouraging this discriminatory atmosphere. HR and the strict reporting procedures of large corporates at least give those suffering from discrimination a place to go. A casual office environment can discourage reporting or calling out prejudiced humour or remarks. Brook Shelley, a woman who transitioned while working in tech, notes: “No one wants to be the office mother”. So instead, you join in and hope for the best.

And, of course, there’s no reason why people working in tech would have fewer issues with discrimination than those in other industries. A childhood spent as a “nerd” can also spawn its own brand of misogyny – Katherine Cross writes in Lean Out that “to many of these men [working in these fields] it is all too easy to subconsciously confound women who say ‘this is sexist’ with the young girls who said… ‘You’re gross and a creep and I’ll never date you’”. During GamerGate, Anita Sarkeesian was often called a “prom queen” by trolls.

When I spoke to Alexa Clay, entrepreneur and co-author of The Misfit Economy, she confirmed that there’s a strange, low-lurking sexism in the start-up economy: “They have [it] all very open and free, but underneath it there’s still something really patriarchal.” Start-ups, after all, are a culture which celebrates risk-taking, something which women are societally discouraged from doing. As Clay says:

“Men are allowed to fail in tech. You have these young guys who these old guys adopt and mentor. If his app doesn’t work, the mentor just shrugs it off. I would not be able to get away with that, and I think women and minorities aren’t allowed to take the same amount of risks, particularly in these communities. If you fail, no one’s saying that’s fine.”

The conclusion of Lean Out, and of women in tech I have spoken to, isn’t that more women, over time, will enter these industries and seamlessly integrate – it’s that tech culture needs to change, or its lack of diversity will become even more severe. Shevinsky writes:

The reason why we don't have more women in tech is not because of a lack of STEM education. It's because too many high profile and influential individuals and subcultures within the tech industry have ignored or outright mistreated women applicants and employees. To be succinct—the problem isn't women, it's tech culture.

Software engineer Kate Heddleston has a wonderful and chilling metaphor about the way we treat women in STEM. Women are, she writes, the “canary in the coal mine”. If one dies, surely you should take that as a sign that the mine is uninhabitable – that there’s something toxic in the air. “Instead, the industry is looking at the canary, wondering why it can’t breathe, saying ‘Lean in, canary, lean in!’. When one canary dies they get a new one because getting more canaries is how you fix the lack of canaries, right? Except the problem is that there isn't enough oxygen in the coal mine, not that there are too few canaries.” We need more women in STEM, and, I’d argue, in tech in particular, but we need to make sure the air is breathable first.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.