Stephen Hawking now thinks "there are no black holes"

The physicist, whose pioneering work on black holes in the 1970s made him a household name, has proposed a radical fudge to try to resolve a baffling paradox.

For a scientist whose career was made by his work on black holes, it might seem a little confusing to read that Stephen Hawking now thinks that they don’t exist. But that’s what “Information Preservation and Weather Forecasting for Black Holes”, the study Hawking published last week on arXiv, says: “there are no black holes”.

While this might seem surprising - after all, there’s a huge amount of (indirect) evidence that black holes exist, including a massive one several million times the mass of our Sun at the centre of the Milky Way - it’s really not. It’s Hawking’s latest attempt to solve a paradox that he, and other astrophysicists, have been grappling with for a couple of years.

So what’s he talking about? Here’s the background: black holes are objects so massive, with such strong gravity, that even light can’t escape. The boundary around a black hole, beyond which nothing gets out, is the event horizon. However, Hawking made his name in the 1970s when he published a paper showing that black holes don’t just suck stuff up endlessly - they slowly leak so-called “Hawking radiation”. That means black holes actually lose mass over time, eventually whittling away to nothing.

Black holes are frustrating, though, because their extreme gravity exposes a major inadequacy in our current scientific understanding of the universe - we don’t know how to reconcile quantum mechanics and general relativity. With general relativity, we can make definite predictions about how objects behave, but on the tiny scale of quantum mechanics it’s only possible to talk about the behaviour of objects in terms of probability. When we do the maths on what happens to things that fall into black holes, using relativity gives results that break quantum mechanics, and vice versa.

One of the key things about quantum mechanics is that it tells us information can’t be destroyed - that is, if you measure the radiation given off by a black hole, you should be able to build up a picture of what matter fell into the hole to create it. However, if general relativity holds, and nothing can escape from inside the event horizon, then that should apply to quantum information too - any radiation that comes out is, Hawking showed, random. This is the black hole “information paradox”. Either give up quantum mechanics, or accept that information can die.

Hawking was in the “information can die” camp until 2004, when it became clear - thanks to string theory - that quantum mechanics held up (Nature has an excellent in-depth explanation if you want the full story). There was just one problem - nobody could work out *how* information was getting out of black holes, even if the maths said it must be.

And, just in case this wasn’t all entirely confusing, it turns out that our best post-2004 theory about what’s been going on gives rise to an entirely new paradox - the “firewall”.

It’s to do with quantum entanglement, where two particles are created whose quantum states are linked - measuring the properties of one particle gives readings that mirror those of its entangled partner, however far apart the two are. Exactly how this works isn’t clear yet - it could be something to do with string theory and wormholes - and it might one day lead to teleportation technology, but scientists aren’t sure.

Joseph Polchinski, from the Kavli Institute for Theoretical Physics in Santa Barbara, California, published a paper in 2012 arguing that the information paradox could be solved if Hawking radiation was quantum entangled with the stuff falling in. But entanglement has its limits - a particle can’t be fully entangled with two things at once - and if this is true, it would mean that a massive amount of energy is given off at the event horizon by particles entering and leaving.

Hence “firewall” - anything crossing the event horizon would be burnt to a crisp. And even though most scientists, including Polchinski, thought this couldn’t possibly be right - it completely contradicts a lot of the stuff underlying general relativity, for example - nobody’s yet managed to disprove it.

The choice for physicists, once again, was: a) accept the firewall and throw out general relativity, or b) accept that information dies in black holes and that quantum mechanics is wrong.

Still with me? Here’s where Hawking’s latest paper comes in.

(That title - “Information Preservation and Weather Forecasting for Black Holes” - might make some more sense too, hopefully.)

Hawking’s proposed solution, building on an idea first floated in 2005, is that the event horizon isn’t as sharply defined as we’ve come to imagine. He instead proposes something called an “apparent horizon”, from which light and other stuff can eventually escape:

"The absence of event horizons mean that there are no black holes - in the sense of regimes from which light can't escape to infinnity. There are however apparent horizons which persist for a period of time."

Black holes, in this picture, are more like massive galactic washing machines. Stuff falls in, gets tossed around and mixed up with everything else in there, and only eventually escapes again. This happens because the quantum effects around a black hole, like weather on Earth, churn so violently and unpredictably that it’s impossible either to pin down the position of an event horizon or to expect uniform effects on stuff crossing it. In theory, the information is preserved; in practice, recovering it from what comes out is about as feasible as forecasting the weather far in advance.

It’s a fudge of an idea, which tries to have its general relativity and quantum mechanics cakes and eat them too. Possible weaknesses, as Nature points out, are that it could make escaping from black holes look easier than it should be, and that apparent horizons may turn out to be just as much of a firewall as the traditional conception of an event horizon. Hawking's peers have yet to have a go at assessing the proposal, so we'll have to wait to see whether it has merit - or whether it merely gives rise to yet more paradoxes.

Hawking in Cambridge, September 2013. (Photo: Getty)

Ian Steadman is a staff science and technology writer at the New Statesman. He is on Twitter as @iansteadman.


From Darwin to Damore - the ancient art of using "science" to mask prejudice

Charles Darwin, working at a time when women had few legal rights, declared that “woman is a kind of adult child”.

“In addition to the Left’s affinity for those it sees as weak, humans are generally biased towards protecting females,” wrote James Damore, in his now infamous anti-diversity Google memo. “As mentioned before, this likely evolved because males are biologically disposable and because women are generally more co-operative and agreeable than men.” Since the memo was published, hordes of women have come forward to say that views like these – where individuals justify bias on the basis of science – are not uncommon in their traditionally male-dominated fields. Damore’s controversial screed reignited an age-old debate: do biological differences justify discrimination?

Modern science developed in a society which assumed that men were superior to women. Charles Darwin, the father of modern evolutionary biology, who died before women got the right to vote, argued that young children of both genders resembled adult women more than they did adult men; as a result, “woman is a kind of adult child”.

Racial inequality wasn’t immune from this kind of theorising either. As fields such as psychology and genetics developed a greater understanding of the fundamental building blocks of humanity, prominent researchers such as Francis Galton, Darwin’s cousin, argued that there were biological differences between races which explained why Europeans prospered and gathered wealth while other races fell far behind. The same kind of reasoning fuelled Nazi eugenics, and continues to fuel the alt-right in its many guises today.

Once scorned as blasphemy, "science" is today approached by many non-practitioners with a cult-like reverence. Attributing differences between races and genders to scientific research carries the allure of empiricism. Opponents of "diversity" would have you believe that scientific research validates racism and sexism, however much one's bleeding heart might wish otherwise.

The problem is that current scientific research just doesn’t agree. Some branches of science, such as physics, are concerned with irrefutable laws of nature. But the reality, as evidenced by the growing convergence of social sciences like sociology and life sciences such as biology, is that science as a whole will, and should, change. The research coming out of fields like genetics and psychology paints an increasingly complex picture of humanity. Saying (and proving) that gravity exists isn't equivalent to saying, and trying to prove, that women are somehow less capable at their jobs because of presumed inherent traits like submissiveness.

When it comes to matters of race, the argument against racial realism, as it’s often referred to, is unequivocal. A 2002 study by Neil Risch and others built on the work of the Human Genome Project to examine the long-standing and popular myth of seven distinct races. The researchers found that “62 per cent of Ethiopians belong to the same cluster as Norwegians, together with 21 per cent of the Afro-Caribbeans, and the ethnic label ‘Asian’ inaccurately describes Chinese and Papuans who were placed almost entirely in separate clusters.” In other words, white supremacists are wrong, and always have been.

Even the researcher Damore cites in his memo, Bradley Schmitt of Bradley University in Illinois, doesn’t agree with Damore’s conclusions. Schmitt pointed out, in correspondence with Wired, that biological difference accounts for only about 10 per cent of the variance between men and women in what Damore characterises as female traits, such as neuroticism. In addition, nebulous traits such as being “people-oriented” are difficult to define and have led to wildly contradictory research from experts in the field. Suggesting that women are bad engineers because they’re neurotic is not only ridiculous but unsubstantiated even by Damore’s own sources. As many have done before him, Damore couched his worldview - and what he was trying to convince others of - in the language of rationalism, but ultimately didn't pay attention to the facts.

And even if you did buy into Damore's memo, a true scientist would retort: so what? It's a fallacy to argue that just because a certain state of affairs prevails, that is the way it ought to be. If that were the case, why would humanity march on in the direction of technological and industrial progress?

Humans weren’t meant to travel large distances, or we would possess the ability to do so intrinsically. Boats, cars, airplanes and trains would, by the Damore mindset, be a perversion of nature. Yet as a species we consider overcoming our biology to be a sign of success.

Of course, the damage done by these kinds of views is not only that they’re hard to counteract, but that they have real consequences. Throughout history, appeals to the supposed rationalism of scientific research have justified moral atrocities such as ethnic sterilisation, apartheid, the creation of the slave trade, and state-sanctioned genocide.

If those in positions of power genuinely think that black and Hispanic communities are genetically predisposed to crime and murder, they’re very unlikely to invest in education, housing and community centres for those groups. Cycles of poverty then continue, and the myth, dressed up in pseudo-science, is entrenched. 

Damore and those like him will certainly maintain that the evidence for gender differences is on their side. Since he was fired from Google, Damore has become something of an icon to some parts of society, giving interviews to right-wing YouTubers and posing in a dubious shirt parodying the Google logo (it now reads Goolag). Never mind that Damore’s beloved science has already proved them wrong.