On manipulating memories, we're not as far behind Hollywood as you might think

Deep brain stimulation is racing ahead, and the ethical issues associated with it are starting to be debated.

Remember Total Recall? When the film came out in 1990, its premise, in which people take virtual holidays using memory manipulation, seemed far-fetched. But on 20 August President Obama’s commission on bioethics debated what we ought to do about memory manipulation. That’s because it is just one of many invasive actions we are beginning to perform on the brain.
 
This month, the first trials of a new technique for controlling Parkinson’s disease began. A German sufferer has had a “deep brain stimulation” device, essentially a pair of electrodes, implanted in his brain. It will monitor the brain’s activity to deliver electrical currents designed to combat tremors and muscle rigidity. A similar technique has been shown, in a few cases, to reverse the shrinkage of brain tissues associated with Alzheimer’s disease. This reversal was not only about the neural tissue’s physical appearance: it led to improved brain functioning. No one knows how it works; the best guess is that it stimulates the growth of neurons.
 
Deep brain stimulation is also a treatment option if you have obsessive compulsive disorder. OCD appears to arise when electrical circuits conveying signals between the emotional and the decision-making parts of the brain become stuck in feedback loops. That leads to people compulsively repeating actions because the anxieties associated with not having done the task don’t get erased. A jolt of electricity seems to clear the brain jam, however. Similar treatments seem to be a cure for depression in some people.
 
And, true to Hollywood, we are now manipulating memories. We’re not yet at the virtual holiday stage, but mice are starting to have some strange experiences. Last month it was reported that electricity delivered to a mouse’s hippocampus gave it a memory of receiving a shock to the foot.
 
Hence the need for ethical review: it is easy to see how this could eventually be used to create a tool for controlling errant prisoners, say, or mental-health patients. Perhaps you remember the electroconvulsive “therapy” punishment in One Flew Over the Cuckoo’s Nest? It’s still seen as a treatment option for depression but some think it’s too blunt an instrument. Deep brain stimulation is far less blunt – yet who decides just how blunt is acceptable?
 
There are many other issues to face. As we begin our assault on the brain, we will begin to gather information that might turn out to be problematic. Brain experiments are already suggesting that some people have naturally poor control over impulsive actions, and are more prone to criminal or antisocial behaviour. It is important that such information should not get thrown casually into the public sphere.
 
For all the appropriate caution, let’s acknowledge that some of the things we’re learning to do to the brain are really rather exciting. Having a virtual holiday might sound like a bore, but what about having razor-sharp focus at the flick of a switch? The US military is piloting a scheme that is mind-bendingly futuristic: a DC electrical current applied to the brain that in effect puts you into a high-concentration zone. With “transcranial direct current stimulation”, learning is accelerated and performance in tasks that require mental focus is significantly enhanced.
 
The Americans are using it to improve sniper training but that won’t be the only application. One day soon you might unplug yourself and utter the immortal words: “I know kung fu.” Hollywood races ahead, but we’re not as far behind as you might think.

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 26 August 2013 issue of the New Statesman, How the dream died


The one where she turns into a USB stick: the worst uses of tech in films

The new film Worst Tinder Date Ever will join a long tradition of poorly-thought-through tech storylines.

News just in from Hollywood: someone is making a film about Tinder. What will they call it? Swipe Right, perhaps? I Super Like You? Some subtle allusion to the app’s small role in the plotline? Nope – according to the Hollywood Reporter, the film has been christened Worst Tinder Date Ever.

With the exception of its heavily branded title (You’ve Got Gmail, anyone?), Worst Tinder Date Ever follows neatly in the tradition of writers manhandling tech into storylines. Because really, why does it matter if it was a Tinder date? This “rom com with action elements” reportedly focuses on the couple’s exploits after they meet on the app, so the dogged focus on it is presumably just a ploy to get millennial bums on cinema seats.  

Like the films on this list, it sounds like the tech in Worst Tinder Date Ever is just a byword for “modern and cool” – even as it demonstrates that the script is anything but.

Warning: spoilers ahead.

Lucy (2014)

Scarlett Johansson plays Lucy, a young woman who accidentally ingests large quantities of a new drug which promises to evolve your brain beyond normal human limits.

She evolves and evolves, gaining superhuman powers, until she hits peak human, and turns into first a supercomputer, and then a very long USB stick. USB-Lucy then texts Morgan Freeman's character on his flip phone to prove that: “I am everywhere.”

Beyond the obvious holes in this plotline (this wouldn’t happen if someone’s brain evolved; texting a phone is not a sign of omnipotence), USB sticks aren’t even that good – as Business Insider points out: “Flash drives are losing relevance because they can’t compete in speed and flexibility with cloud computing services . . . Flash drives also can’t carry that much information.”

Star Wars: The Force Awakens (2015)

If you stare at it hard enough, the plotline in the latest Star Wars film boils down to the following: a gaggle of people travels across space in order to find a map showing Luke Skywalker’s location, held on a memory stick in a drawer in a spherical robot. Yep, those pesky flash drives again.

It later turns out that the map is incomplete, and the rest of it is in the hands of another robot, R2-D2, who won’t wake up for most of the film in order to spit out the missing fragment. Between them, creator George Lucas and writer and director JJ Abrams have dreamed up a dark vision of the future in which robots can talk and make decisions, but can’t email you a map.

Willy Wonka and the Chocolate Factory (1971)

In which a scientist uses a computer to find the “precise location” of the three remaining golden tickets sent out into the world by Willy Wonka. When he asks it to spill the beans, it announces: “I won’t tell, that would be cheating.”



The film inhabits a world where artificial intelligence has been achieved, but no one has thought to pull Charlie's poor grandparents out of extreme poverty, or design a computer with more than three buttons.

Independence Day (1996)

When an alien invasion threatens Earth, David Levinson (Jeff Goldblum) manages to stop it by hacking the alien spaceship and installing a virus. Using his Mac. Amazing, really, that aliens from across the universe would somehow use computing systems so similar to our own. 

Skyfall (2012)

In the Daniel Craig reboot of the series, MI6’s “Q” character (played by Ben Whishaw) becomes a computer expert, rather than just a gadget wizard. Unfortunately, this heralded some truly cringeworthy moments of “hacking” and “coding” in both Skyfall and Spectre (2015).

In the former, Bond and Q puzzle over a screen filled with a large, complex, web shape. They eventually realise it’s a map of subterranean London, but then the words “security breach” flash up, along with a skull. File under “films which make up their own operating systems because a command prompt box on a Windows desktop looks too boring”.

An honourable mention: Nelly and Kelly Rowland’s “Dilemma” (2002)

Not a movie, but how could we leave out a music video in which Kelly Rowland texts Nelly on a Microsoft Excel spreadsheet on a weird Nokia palm pilot?



You’ll be waiting a long time for that response, Kelly. Try Tinder instead.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.