A hard problem for soft brains: is there a Hard Problem?

Daniel Dennett wants to convince Tom Stoppard that there is no Hard Problem.

Oh, to have been a fly on the wall when the philosopher Daniel Dennett chatted with Tom Stoppard. The conversation took place after a performance of Stoppard’s new play about consciousness, The Hard Problem. A few days earlier, Dennett had told an audience at the Royal Institution (RI) that there is no “Hard Problem”.

The play’s name comes from the label that the Australian philosopher and cognitive scientist David Chalmers gives to the task of understanding consciousness. This is hard, he says, because no physical phenomena will ever be found to account for the emergence of conscious experience. It is a statement of faith, but one that has garnered plenty of support and clearly caught Stoppard’s attention.

Consciousness is a tough nut to crack. Scientists aren’t sure how to define it and they don’t know how it – whatever “it” is – emerges from the squidgy, biological matter of the brain. Somehow, billions of neurons connect and give us the ability to sense the outside world and have what we describe as “feelings” about our experience.

To Stoppard, consciousness is an almost supernatural phenomenon – something beyond the reach of science. His play suggests that those who indulge in spiritual beliefs might be more successful in hunting down the root of consciousness, as if consciousness inhabited some realm beyond physics, chemistry and biology.

Dennett, on the other hand, thinks that we may have already solved the problem of consciousness with a coterie of small-scale, rather banal explanations. The non-mysterious ways in which the brain creates our sensory experience might be the only ingredients we need to explain how it is that we are aware of feeling something.

He expands on this possibility in his contribution to a new collection of essays at edge.org that asks the question: “What scientific idea is ready for retirement?” He chooses the Hard Problem (even though, he says, it isn’t actually a scientific idea) and suggests we should approach all of its difficulties in the same way as scientists approach extrasensory perception and telekinesis: as “figments of the imagination”.

The central issue concerns our trouble with believing in the physicality of things we cannot see or touch. Software, Dennett suggested at the RI, provides a good example. Everyone agrees that software exists and performs tasks that are far from mysterious. But what is it made of? Lines of code written on a piece of paper do nothing. When written into a computer, they become abstract information encoded in the electronic state of silicon chips – we know that they are there, but they are transformed. However hard that is to grasp, it doesn’t make software spiritual or take it beyond analysis.
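To see the point in miniature, here is a toy sketch in Python (my illustration, not Dennett’s): the very same characters are inert data in one context and live computation in another.

```python
# A sketch of the software analogy: identical "lines of code" can be
# inert text or running behaviour, depending on context.

source = "print(sum(range(10)))"  # as data: just a 21-character string

print(len(source))  # inspecting the text itself; nothing is computed
exec(source)        # the same characters, executed, print 45
```

Nothing spooky happens between the two lines: the string never stops being a physical arrangement of bits, it is simply put to a different use.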

A word of caution: there is always a danger of interpreting our scientific struggles within a familiar paradigm. Newton discovered his “clockwork heavens” in an age when accurate timekeeping was a central goal of his scientifically minded contemporaries. Einstein’s special relativity, which defines the fundamentals of the universe in terms that reference light and signals, was born in the era of the electric telegraph. Neither was the final word.

These days, much of physics and biology focuses on issues of information transfer, probably because computing now plays such a significant role. So it is possible that Dennett’s software analogy is an innocent sleight of hand. It may be that we haven’t yet encountered the paradigm that will allow us to frame a good understanding of consciousness.

That would certainly make consciousness a hard problem to solve right now – but still not the Hard Problem.

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 09 April 2015 issue of the New Statesman, The Anniversary Issue 2015


The second coming of Gordon Ramsay

A star is reborn. 

It would be a lie to say that Gordon Ramsay ever disappeared. The celebrity chef made his television debut in 1997 and went on to star in shows in 1998, 2001, 2004, 2005, 2006, 2007, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, and 2017. There hasn’t been a lull in Ramsay’s career, which has arguably gone from strength to strength. In 2000, he was cooking for Vladimir Putin and Tony Blair – in 2008, he ate the raw heart of a dead puffin.

Left: Gordon Ramsay shaking hands with Vladimir Putin. Right: Gordon Ramsay hugging a puffin (different from the one he ate).

Yet we are, undeniably, in the middle of a Ramsay renaissance. How? How could a man who conquered the last twenty years of cookery-based television have an upsurge in popularity? There are only so many television channels – so many amateur donkey chefs. Wrong. The internet has enabled a Ramsay resurgence, the second act of a play overflowing with blood, sweat, and French onion soup.

Wow.

We all, of course, know about Gordon’s Twitter account. Although started in 2010, the social media profile hit the headlines in February this year when Ramsay began rating food cooked by the world’s amateur-amateur chefs. But other elements of Ramsay’s internet celebrity are more miraculous and mysterious.

His official YouTube channel uploads, on average, three videos a week. Decade-old clips from Kitchen Nightmares accumulate over three million views in as many days. A 15,000-follower-strong Facebook fan page for the show – which premiered in 2007 and ended in 2014 – was set up on 19 June 2017.

Wow, wow, wow, wow. Wow.       

A Google Trends graph shows an April 2017 surge in Ramsay’s popularity, after a decline in 2014.
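(For the curious, a trend like this can be re-queried. Below is a rough sketch using pytrends, an unofficial third-party wrapper around Google Trends; the timeframe, region, and search term are my assumptions, not details from the original graph.)

```python
# A rough sketch: pull UK search interest in "Gordon Ramsay" over the
# relevant years via pytrends (an unofficial Google Trends wrapper).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-GB", tz=0)
pytrends.build_payload(["Gordon Ramsay"],
                       timeframe="2013-01-01 2017-07-01", geo="GB")

interest = pytrends.interest_over_time()   # pandas DataFrame, scores 0-100
print(interest["Gordon Ramsay"].idxmax())  # week with peak search interest
```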

What makes a meme dank? Academics don’t know. What is apparent is that a meme parodying Gordon Ramsay’s fury over missing lamb sauce (first aired on Hell’s Kitchen in 2006) had a dramatic upsurge in popularity in December 2016. This is far from Gordon’s only meme. Image macros featuring the star are captioned with fictitious tirades from the chef, for example: “This fish is so raw… it’s still trying to find Nemo”. A parody clip from The Late Late Show with James Corden in which Ramsay calls a woman an “idiot sandwich” has been watched nearly five million times on YouTube.

And it is on YouTube where Ramsay memes most thrive. The commenters happily parrot the chef’s most meme-able moments, from “IT’S RAW” to the more forlorn “fuck me” after the news that something is frozen. “HELLO MY NAME IS NINOOOOO!” is an astonishingly popular comment, copied from a clip in which a Kitchen Nightmares participant mocks his brother. If you have not seen it – you should.

But what does all this mean for Ramsay’s career? His YouTube channel and Facebook page are clearly meticulously managed by his team – who respond to popular memes by clipping and cutting new videos of classic Ramsay shows. Although this undoubtedly earns a fortune in ad revenue, Ramsay’s brand has capitalised on his internet fame in more concrete ways. The chef recently voiced Gordon Ramsay Dash, a mobile game by Glu Games Inc in which you can cook with the star and he will berate or praise you for your efforts. Ten bars of gold – which are required to get upgrades and advance in the game – cost 99p.

Can other celebrity chefs learn from Ramsay? A generation will never forgive that twisted, golden piece of meat, Jamie Oliver, for robbing them of their lunchtime Turkey Twizzlers. But beyond this, the internet’s love is impossible to game. Any celebrity who tried to generate an online following similar to Ramsay’s would instantly fail. Ramsay’s second coming is so prolific and powerful because it is completely organic. In many ways, the chef is not responsible for it.

In truth, the Ramsay renaissance only worked because it was – though the chef himself would not want to admit it – completely raw.

Amelia Tait is a technology and digital culture writer at the New Statesman.