Jeremy Clarkson. Photo: Mark Thompson/Getty Images

The neuroscience of Jeremy Clarkson

If humans can’t control themselves, they cannot be allowed the freedoms others enjoy: we learn self-control in the same way that toddlers learn to control their bladders.

The great Jeremy Clarkson drama is, at heart, a question of neuroscience. Can our brains, formed and tweaked over millions of years, adjust to modern times? Or are we doomed to let their ancient structures rule for ever?

Perhaps it helps to compare Clarkson to Sims, the central character in Jennifer Haley’s brilliant play The Nether. Both are accused of an ongoing series of only vaguely defined transgressions. Both are largely unrepentant. They consider themselves to be beneficent figures: they provide pleasure and entertainment, albeit of a kind that is frowned on by many. Neither sees grounds for criminal proceedings: it is the tenor of their lives and their general proclivities that the authorities seek to rein in.

The difference is that Sims is an entrepreneurial paedophile. In The Nether, the internet has become a fully immersive experience, gratifying all the senses. Sims has created a virtual world where customers’ avatars are free to have sex with childlike avatars – and even slay them with an axe, if they wish.

Is this wrong? It is an unsettling question for the audience, especially since the set makes the online world an immersive experience for them, too. The online children are not real; they are the online personas of consenting adults. No one is physically hurt. The uncomfortable truth is that we have no rules for how virtual adults should behave, even towards virtual children. Technology has already outrun the evolution of our morals.

Sims exploits this. His creation, he argues, militates against people with his proclivities offending in the real world. Their brains make them do these things, he says. He is merely providing an alternative path of action. Does he have a point? The scientific literature certainly associates paedophilia with specific abnormalities in the brain. In 2003, the Archives of Neurology reported the case of a man whose brain tumour had caused him to start sexually molesting his eight-year-old stepdaughter. When the tumour was removed, his sexual interest disappeared.

This month, German researchers reported that there is “growing evidence that paedophilia is linked to both structural and functional brain abnormalities”. One of those abnormalities lies in the areas of the brain that deal with impulse control. The question is: does that make us any less inclined to condemn behaviour when those impulses are acted upon?

It seems not. The American neuroscientist and philosopher Patricia Churchland puts it like this: “An explanation is not an excuse.” If humans can’t control themselves, they cannot be allowed the freedoms others enjoy: humans learn self-control, she says, in the same way that toddlers learn to control their bladders.

This brings us back to Clarkson. Top Gear creates a virtual experience for those who have learned to control their bladders but not their impulses to do (or watch) silly or dangerous things. The programme’s appeal is often described as adolescent – and with good neuroscientific reason: in teenagers, the frontal lobes are not yet fully connected. These are the structures responsible for assessing consequences and making judgements. The Clarkson issue is about whether this half-formed physiology and the appeal of its world-view can justifiably be exploited and enjoyed by adults.

Now, though, someone has allegedly been physically hurt. The incident invokes ancient brain structures that flood us with deep-rooted but possibly anachronistic moral certitudes. We, the audience, are left in turmoil and yet utterly compelled to watch as the BBC’s moral dilemma plays out. Having had its way with Jeremy Clarkson for so long, should the corporation now control that impulse towards the axe?

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 19 March 2015 issue of the New Statesman, British politics is broken


A quote-by-quote analysis of how little Jeremy Hunt understands technology

Can social media giants really implement the health secretary’s sexting suggestions? 

In today’s “Did we do something wrong? No, it was social media” news, Health Secretary Jeremy Hunt has argued that technology companies need to do more to prevent sexting and cyber-bullying.

Hunt, whose job it is to help reduce the teenage suicide rate, argued that the onus for reducing the teenage suicide rate should fall on social media companies such as Facebook and Twitter.

Giving evidence to the Commons Health Committee on suicide prevention, Hunt said: “I think social media companies need to step up to the plate and show us how they can be the solution to the issue of mental ill health amongst teenagers, and not the cause of the problem.”

Pause for screaming and/or tearing out of hair.

Don’t worry, though; Hunt wasn’t simply trying to pass the buck, despite the committee suggesting he direct more resources to suicide prevention. Instead, he offered extremely well-thought-out technological solutions that are in no way inferior to providing better sex education for children. Here’s a quote-by-quote analysis of just how technologically savvy Hunt is.

***

“I just ask myself the simple question as to why it is that you can’t prevent the texting of sexually explicit images by people under the age of 18…”

Here’s Hunt asking himself a question that he should be asking the actual experts, which is in no way a waste of anybody’s time at all.

“… If that’s a lock that parents choose to put on a mobile phone contract…”

A lock! But of course. But what should we lock, Jeremy? Should teenagers’ phones come with a ban on all social media apps and, for good measure, a block on the use of the camera app itself? It’s hard to see how this would lead to the use of dubious applications that have significantly less security than giants such as Facebook and Snapchat. Well done.

“Because there is technology that can identify sexually explicit pictures and prevent it being transmitted.”

Erm, is there? Image recognition technology does exist, but it’s incredibly complex and expensive, and companies often rely on other information (such as URLs, tags, and hashes) to filter out and identify explicit images. In addition, social media sites like Facebook rely on their users to click the button that identifies an image as an abuse of their guidelines, and then have human teams look through reported images. The technology is simply unable to identify individual and unique images that teenagers take of their own bodies, and the idea of a human team tackling the job is preposterous.
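
To make the point concrete, here is a minimal sketch in Python of how hash-based filtering of known images works. Everything here (the hash set, the function name) is illustrative, not any real platform’s API:

```python
import hashlib

# Illustrative blocklist of hashes of previously reported images, of the
# kind industry databases maintain. This entry is a placeholder.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_abusive(image_bytes: bytes) -> bool:
    """Return True only if this exact image has been reported before."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ABUSE_HASHES

# A photo a teenager has just taken has never been hashed by anyone,
# so it sails straight through - which is precisely the problem.
print(is_known_abusive(b"a brand-new photo's bytes"))  # False
```

The lookup only ever catches material that someone, somewhere, has already seen and catalogued; a novel image matches nothing.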

But suppose the technology did exist that could flawlessly scan a picture for fleshy bits and bobs? As a tool to prevent sexting, it would still be extremely flawed. What if two teens were trying to message one another Titian’s Venus for art or history class? In September, Facebook itself was forced to U-turn after removing the historical “napalm girl” photo from the site.

As for the second part of Jezza’s suggestion, if you can’t identify it, you can’t block it. Facebook Messenger already blocks you from sending pornographic links, but this again relies on analysis of the URLs rather than the content within them. Other messaging services, such as WhatsApp, offer end-to-end encryption (E2EE), meaning – most likely to Hunt’s chagrin – the messages sent on them are neither stored in readable form nor easily accessible by the government.
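
URL-based blocking of the sort Messenger performs is straightforward to sketch. The blocklist and helper below are hypothetical stand-ins, assumed for illustration, not Facebook’s actual mechanism:

```python
import re

# Hypothetical stand-in for a real, constantly updated domain blocklist.
BLOCKED_DOMAINS = {"notorious-porn-site.example"}

URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def message_allowed(text: str) -> bool:
    """Reject a message if it links to a blocklisted domain.

    Note what this does NOT do: inspect image or page content. And on an
    end-to-end encrypted service the server never sees `text` at all,
    so even this modest check is impossible there.
    """
    for domain in URL_RE.findall(text):
        if domain.lower().removeprefix("www.") in BLOCKED_DOMAINS:
            return False
    return True

print(message_allowed("look at http://notorious-porn-site.example/x"))  # False
print(message_allowed("just a normal message"))                          # True
```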

“I ask myself why we can’t identify cyberbullying when it happens on social media platforms by word pattern recognition, and then prevent it happening.”

Jeremy, Jeremy, Jeremy, Jeremy, can’t you spot your problem yet? You’ve got to stop asking yourself!

There is simply no algorithm yet intelligent enough to identify bullying language. Why? Because we call our best mate “dickhead” and our worst enemy “pal”. Human language and meaning are infinitely complex, and scanning for certain words would almost certainly lead to false positives. As Labour MP Thangam Debbonaire famously learned this year, even humans can’t always identify whether language is offensive, so what chance does an algorithm stand?
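
A deliberately naive sketch shows the failure mode. The word list is invented for illustration; no real system is quite this crude, but the underlying problem is the same:

```python
# A toy word-pattern filter of the kind Hunt imagines.
ABUSIVE_WORDS = {"dickhead", "loser"}

def looks_abusive(message: str) -> bool:
    """Flag a message if it contains any listed word, context be damned."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not ABUSIVE_WORDS.isdisjoint(words)

print(looks_abusive("alright dickhead, pint later?"))            # True:  banter flagged
print(looks_abusive("walk home alone tonight, pal. we'll seeate"))  # False: menace missed
```

The friendly message trips the filter; the genuinely menacing one passes clean through.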

(Side note: It is also amusing to imagine that Hunt could even begin to keep up with teenage slang in this scenario.)

Many also argue that because social media sites can remove copyrighted files efficiently, they should get better at removing abusive language. This is a flawed argument because it is easy to search for a specific file (copyright holders will often send social media giants hashed files, which the sites can then search for in their databases), whereas, for the reasons outlined above, it is exceptionally difficult for algorithms to accurately identify the true meaning of language.
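
The copyright case is the same exact-lookup mechanism sketched earlier for known images, which is why it works. The hash set below is again illustrative; real systems use dedicated audio and video fingerprinting rather than plain SHA-256, but the principle holds:

```python
import hashlib

# Hypothetical fingerprints supplied by a rights holder.
COPYRIGHTED_HASHES = {
    "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",
}

def is_copyrighted(upload: bytes) -> bool:
    """Matching a known file is a cheap, exact set lookup. Deciding what
    a sentence *means* is not reducible to any such lookup."""
    return hashlib.sha256(upload).hexdigest() in COPYRIGHTED_HASHES
```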

“I think there are a lot of things where social media companies could put options in their software that could reduce the risks associated with social media, and I do think that is something which they should actively pursue in a way that hasn’t happened to date.”

Leaving aside the fact that social media companies constantly come up with solutions for these problems, Hunt has left us with the burning question of whether any of this is even desirable at all.

Why should he prevent under-18s from sexting when the age of consent in the UK is 16? Where has this sudden moral panic about pornography come from? Are the government laying the ground for mass censorship? If two consenting teenagers want to send each other the aubergine emoji a couple of times a week, why should we stop them? Is it not up to parents, rather than the government, to supervise their children’s online activities? Would education, with all of this in mind, not be the better option? Won’t somebody please think of the children?

“There is a lot of evidence that the technology industry, if they put their mind to it, can do really smart things.”

Alas, if only we could say the same for you, Mr Hunt.

Amelia Tait is a technology and digital culture writer at the New Statesman.