We need to stop worrying and trust our robot researchers

The work of Francis Crick and James Watson gives us a vision of what's to come.

It’s now 60 years since the publication of the structure of DNA. As we celebrate the past, the work of Francis Crick and James Watson also gives us a vision of what’s to come. Their paper was not subjected to peer review, today’s gold standard for the validation of scientific research. Instead, it was discussed briefly over a lunch at the Athenaeum Club. In an editorial celebrating the anniversary, the journal Nature, which originally published the research, points out that this is “unthinkable now”.

However, peer review has always been somewhat patchy and it is becoming ever more difficult. This is the age of “big data”, in which scientists make their claims based on analysis of enormous amounts of information, often carried out by custom-written software. The peer review process, done on an unpaid, voluntary basis in researchers’ spare time, doesn’t have the capacity to go through all the data-analysis techniques. Reviewers have to rely on their intuition.

There are many instances of this leading science up the garden path, but recently we were treated to a spectacular example in economics. In 2010, the Harvard professors Carmen Reinhart and Kenneth Rogoff published what quickly became one of the most cited papers of the year. Simply put, it said that if your gross public debt is more than 90 per cent of your national income, you are going to struggle to achieve any economic growth.

Dozens of newspapers quoted the research, the Republican Party built its budget proposal on it and no small number of national leaders used it to justify their preferred policies. Which makes it all the more depressing that it has been unmasked as completely wrong.

The problem lay in poor data-handling. The researchers left out certain data points, gave questionable weight to parts of the data set and – most shocking of all – made a mistake in the programming of their Excel spreadsheet.

The Harvard paper was not peer-reviewed before publication. It was only when the researchers shared software and raw data with peers sceptical of the research that the errors came to light.

The era of big data in science will stand or fall on such openness and collaboration. It used to be that collaboration arose from the need to create data. Crick and Watson collaborated with Maurice Wilkins to gather the data they needed – from Rosalind Franklin’s desk drawer, without her knowledge or permission. That was what gave them their pivotal insight. However, as Mark R Abbott of Oregon State University puts it, “We are no longer data-limited but insight-limited.”

Gaining insights from the data flood will require a different kind of science from Crick's and Watson's, and it may turn out to be one to which computers and laboratory-based robots are better suited than human beings. In another 60 years, we may well be looking back at an era when silicon scientists made the most significant discoveries.

A robot working in a lab at Aberystwyth University made the first useful computer-generated scientific contribution in 2009, in the field of yeast genomics. It came up with a hypothesis, performed experiments and reached a conclusion, then had its work published in the journal Science. Since then, computers have made further inroads. So far, most (not all) have been checked by human beings, but that won't be possible for long. Eventually, we'll be taking their insights on trust, our intuition stretched almost to breaking point – just as we did with Crick and Watson.


Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.


Your life's work, ruined – how storms can wipe out scientific research in an instant

Some researchers face the prospect of risking their own lives to save valuable scientific research that could benefit future generations.

Before the autumn of 2012, if you went into the basement of New York University's School of Medicine in Manhattan, you would find a colony of more than 3,000 live mice. This was the collection of Gordon Fishell, the associate director of the NYU Neuroscience Institute, which he had spent more than 20 years building up, and which he was using to discover how neurons communicate with other cells.

As Hurricane Sandy began to approach New York State, Fishell and his colleagues, like others in the city, made preparations for the onslaught. This meant leaving extra food and water for their colonies, and making sure that emergency power was on.

But no one anticipated the size and intensity of the hurricane. On the day it finally arrived, Fishell was forced by the weather to stay home, and to his horror he saw that his lab was now in the path of the storm. As he wrote later in Nature magazine: "We were done for. It was obvious that our labs were in great danger, and there was nothing I could do." All of Fishell's mice drowned. Furthermore, scientific equipment and research worth more than $20m was destroyed.

In seeing years of academic work wiped out by a storm, Fishell and his colleagues at the School of Medicine are not alone. In 2001, Tropical Storm Allison caused similar devastation at the Texas Medical Center, the world's largest such research complex, inflicting at least $2bn in damage. In 2011, the Japanese tsunami hit Tohoku University's world-renowned Advanced Institute for Materials Research, destroying some of the world's best electron microscopes and causing $12.5m in equipment losses.

Such stories used to be seen as unique and unfortunate incidents. But the increasing incidence of extreme weather events over the last 20 years has highlighted the dangers of complacency.

Not only do facilities affected by natural disasters lose decades of irreplaceable research; many also contain toxic chemicals that could prove deadly if released into the water or food supply. During the 2007 floods in the UK, a foot-and-mouth outbreak was traced back to a lab affected by heavy rain. In Houston, during the recent Hurricane Harvey, leakages from industrial facilities contaminated the floodwater.

Gradually, university deans and heads of research facilities in the United States have realised that the Federal Emergency Management Agency (FEMA) is badly prepared for this kind of problem. "They had never thought of how to deal with a research loss," Susan Berget, the vice president of emergency planning at Baylor College of Medicine, told Nature in 2005. "To them, transgenic mice are a foreign concept."

It therefore falls to universities, local communities and regional governments to ensure they are adequately prepared for disasters. A common complaint is the lack of guidance they receive.

Often, researchers who choose to save valuable scientific research are putting their lives at risk. One particularly harrowing story is that of the biochemist Dr Arthur Lustig, who spent four days in his Tulane University laboratory before being evacuated to a shelter. Despite his tenacity, he lost more than 80 per cent of his work on yeast strains, carried out over 20 years, to flooding caused by Hurricane Katrina.

Beyond the immediate, heartbreaking effects of losing research, natural disasters also pose a threat to future investment. If a region comes to be seen as not disaster-resilient, it will attract less federal and private funding for groundbreaking research, as well as fewer applications from prospective researchers.

A recent report from the National Academies of Sciences, Engineering, and Medicine quantified this link. It found that various tropical storms led to as many as 120 researchers losing their livelihoods. In one instance, a psychology internship for high-schoolers was discontinued.

Disasters like hurricanes and tropical storms are usually thought of as high-risk but low-probability events. As Bill McKibben noted in the Guardian, Hurricane Harvey was a once-in-25,000-years kind of storm, but the “normal” measurements of incidence cannot necessarily be held as true any more. Just like the rest of us, researchers will have to be prepared for every possibility.