Does dark matter exist?

After 80 years of agreement about the dark stuff, opinions may be changing.

The peasants are revolting. Last night the Flamsteed Astronomical Society met at the National Maritime Museum to hear a debate on the existence (or not) of dark matter. In a vote at the end, the audience decided it probably doesn’t exist.

The idea of dark matter has been around since 1933, when a Swiss astronomer called Fritz Zwicky found that the galaxies in clusters were moving so fast that the clusters should have been flying apart – but weren’t. The answer, he suggested, was that there was extra stuff in there, whose gravitational pull was holding everything together.
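(A rough sketch of the logic, rather than Zwicky’s actual calculation: for a galaxy moving at speed $v$ a distance $R$ from the centre of a cluster to stay bound, the cluster’s gravity must supply at least the required centripetal pull, which puts a floor on the cluster’s mass $M$.

$$
\frac{v^{2}}{R} \;\lesssim\; \frac{GM}{R^{2}}
\quad\Longrightarrow\quad
M \;\gtrsim\; \frac{v^{2}R}{G}.
$$

The galaxy speeds Zwicky measured in the Coma cluster implied far more mass than the visible galaxies could supply.)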
 
Astronomers now believe this stuff makes up around a quarter of the universe, if you take into account all the mass and energy in the cosmos. Ignore the pure energy, and dark matter accounts for 80 per cent of the universe’s mass. Which makes it a little embarrassing that we have never seen any.
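(The two figures are consistent, by the way. Taking round numbers of the sort cosmologists quote – roughly 70 per cent of the cosmic energy budget in dark energy, 25 per cent in dark matter and 5 per cent in ordinary matter, with the exact values shifting from survey to survey – the arithmetic runs:

$$
\frac{\text{dark matter}}{\text{all matter}} \;\approx\; \frac{25}{25+5} \;\approx\; 0.83,
$$

or roughly 80 per cent of the mass.)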

Neither do we know what it looks like. We’ve been groping around for dark matter since about 1970. Various predictions have been made: in 1980, astronomer Vera Rubin said it would be found within 10 years. In 1990, Astronomer Royal Martin Rees said the dark matter mystery would be solved by the turn of the century. By 1999, Rees had realised he had been too hasty, and said we would know what dark matter is by 2004. Last January, CERN theoretical physicist and Gandalf lookalike John Ellis gave the physicists another decade.
 
But patience is starting to wear thin. At last night’s debate, Oxford physicist and co-presenter of The Sky at Night Chris Lintott made the case for dark matter; astronomy writer Stuart Clark argued that a modification to the laws of gravity, currently described by Einstein’s general theory of relativity, held more promise for explaining the (apparently) missing mass. At the end of the evening, the audience sided with Clark and modifying gravity.
 
That’s not going to have dark matter astronomers quaking in their boots. But it is nonetheless indicative of a change of mood. Take what went on at the Cosmic Variance blog last week. Sean Carroll, the blog’s host, has always been bullishly pro dark matter. But it seems he has started to hedge a bit.
 
In a fascinating post, he published the trialogue he had been conducting with astronomer Stacy McGaugh, a leading proponent of the modified-gravity idea (it’s called MOND: Modified Newtonian Dynamics, first put forward by Mordehai Milgrom in the early 1980s), and German astrophysicist Rainer Plaga. Right at the top, Carroll concedes that “it may very well turn out that the behavior of gravity on large scales does not precisely match the prediction of ordinary general relativity”. In other words, he is saying, we might well have to modify gravity.
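(For the record, MOND’s core idea fits in a line. In simplified form – the full theory uses an interpolating function between the two regimes – the acceleration $a$ felt by a star follows Newton’s $a_{\mathrm N} = GM/r^{2}$ when that value is large, but changes character below a tiny threshold $a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m\,s^{-2}}$:

$$
a \;\approx\;
\begin{cases}
a_{\mathrm N}, & a_{\mathrm N} \gg a_0,\\[1ex]
\sqrt{a_{\mathrm N}\,a_0}, & a_{\mathrm N} \ll a_0.
\end{cases}
$$

In the low-acceleration regime, setting $a = v^{2}/r$ gives $v^{4} \approx GMa_0$, so orbital speed stops depending on radius – the flat galactic rotation curves that dark matter is otherwise invoked to explain.)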
 
It’s worth pointing out a couple more reasons it’s OK to harbour doubts about the dark stuff. Last September, Durham astronomer Carlos Frenk admitted he was “losing sleep” over the results of his own computer simulations. His work had shown that the way simulated dwarf galaxies – mainly composed of dark matter – form in a halo around our own galaxy doesn’t tally with what we observe. His conclusion was that the standard theory of dark matter is almost certainly wrong, adding that searches for the stuff at the LHC in Geneva would therefore prove fruitless.
 
Then last month two groups of astronomers announced that dark matter wasn’t where it should be. The sun sits inside the galaxy’s supposed halo of dark matter, so our corner of space ought to contain some – yet the measurements found none.
 
If there really is no dark matter, that won’t be a mainstream view for decades to come. Once it’s got some momentum, it takes a lot of effort to change direction in science. But it does seem that, after 80 years, someone’s found the handbrake on the dark matter juggernaut.
 

Images of giant galaxy clusters, said to be mainly made up of dark matter. Photograph: Nasa/Getty Images

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.


Your life's work, ruined – how storms can wipe out scientific research in an instant

Some researchers face the prospect of risking their own lives to save valuable scientific research that could benefit future generations.

Before the autumn of 2012, if you went into the basement of New York University’s School of Medicine in Manhattan, you would have found a colony of more than 3,000 live mice. This was the collection of Gordon Fishell, the associate director of the NYU Neuroscience Institute, which he had spent more than 20 years building up, and which he was using to discover how neurons communicate with other cells.

As Hurricane Sandy began to approach New York State, Fishell and his colleagues, like others in the city, made preparations for the onslaught. This meant leaving extra food and water for their colonies, and making sure that emergency power was on.

But no one anticipated the size and intensity of the hurricane. On the day it finally arrived, Fishell was forced by the weather to stay home, and to his horror he saw that his lab was now in the path of the storm. As he wrote later in Nature magazine: "We were done for. It was obvious that our labs were in great danger, and there was nothing I could do." All of Fishell's mice drowned. Furthermore, scientific equipment and research worth more than $20m was destroyed.

In seeing years of academic work wiped out by a storm, Fishell and his colleagues at the School of Medicine are not alone. In 2001, Tropical Storm Allison caused similar devastation at the Texas Medical Center in Houston, the world’s largest medical complex, inflicting at least $2bn in damage. In 2011, the Japanese tsunami hit Tohoku University’s world-renowned Advanced Institute for Materials Research, destroying some of the world’s best electron microscopes and causing an estimated $12.5m in equipment losses.

Such stories used to be seen as unique and unfortunate incidents. But the increasing incidence of extreme weather events over the last 20 years has highlighted the dangers of complacency.

Not only do facilities affected by natural disasters lose decades of irreplaceable research, but many contain toxic chemicals and pathogens that could be deadly if released into the water or food supply. During the 2007 floods in the UK, a foot-and-mouth outbreak was traced back to a laboratory site affected by heavy rain. In Houston, during the recent Hurricane Harvey, leaks from industrial facilities contaminated the floodwater.

Gradually, university deans and heads of research facilities in the United States have realised that the Federal Emergency Management Agency (FEMA) is badly prepared for this kind of problem. "They had never thought of how to deal with a research loss," Susan Berget, the vice president of emergency planning at Baylor College of Medicine, told Nature in 2005. "To them, transgenic mice are a foreign concept."

It therefore falls on universities, local communities and regional governments to ensure they are adequately prepared for disasters. A common complaint is the lack of guidance they receive. 

Often, researchers who choose to save valuable scientific research are putting their lives at risk. One particularly harrowing story was that of biochemist Dr Arthur Lustig, who spent four days in his Tulane University laboratory before being evacuated to a shelter. Despite his tenacity, he lost more than 80 per cent of his work on yeast strains, carried out over 20 years, to flooding caused by Hurricane Katrina.

Beyond the immediate, heartbreaking effects of losing research, natural disasters also pose a threat to future investment. If a region comes to be seen as lacking resilience to disasters, federal and private funding for groundbreaking research tends to dry up, along with applications from prospective researchers.

A recent report from the National Academies of Sciences, Engineering, and Medicine quantified this link. It found that various tropical storms had led to as many as 120 researchers losing their livelihoods. In one instance, a psychology internship for high schoolers was discontinued.

Disasters like hurricanes and tropical storms are usually thought of as high-risk but low-probability events. As Bill McKibben noted in the Guardian, Hurricane Harvey was a once-in-25,000-years kind of storm, but such “normal” measures of frequency can no longer necessarily be relied upon. Just like the rest of us, researchers will have to be prepared for every possibility.