On manipulating memories, we're not as far behind Hollywood as you might think

Deep brain stimulation is racing ahead, and the ethical issues associated with it are starting to be debated.

Remember Total Recall? When the film came out in 1990, its premise, in which people take virtual holidays using memory manipulation, seemed far-fetched. But on 20 August President Obama's commission on bioethics debated what we ought to do about memory manipulation. That's because it is just one of many invasive actions we are beginning to perform on the brain.
 
This month, the first trials of a new technique for controlling Parkinson's disease began. A German sufferer has had a "deep brain stimulation" device, essentially a pair of electrodes, implanted in his brain. It will monitor the brain's activity and deliver electrical currents designed to combat tremors and muscle rigidity. A similar technique has been shown, in a few cases, to reverse the shrinkage of brain tissue associated with Alzheimer's disease. The reversal was not merely cosmetic: it led to improved brain function. No one knows how it works; the best guess is that it stimulates the growth of neurons.
 
Deep brain stimulation is also a treatment option if you have obsessive-compulsive disorder. OCD appears to arise when electrical circuits conveying signals between the emotional and the decision-making parts of the brain become stuck in feedback loops. That leads people to repeat actions compulsively, because the anxieties associated with not having done the task don't get erased. A jolt of electricity seems to clear the jam, however. Similar treatments appear to cure depression in some people.
 
And, true to Hollywood, we are now manipulating memories. We’re not yet at the virtual holiday stage, but mice are starting to have some strange experiences. Last month it was reported that electricity delivered to a mouse’s hippocampus gave it a memory of receiving a shock to the foot.
 
Hence the need for ethical review: it is easy to see how this could eventually be used as a tool for controlling errant prisoners, say, or mental-health patients. Perhaps you remember the electroconvulsive "therapy" punishment in One Flew Over the Cuckoo's Nest? It is still seen as a treatment option for depression, but some think it too blunt an instrument. Deep brain stimulation is far less blunt – yet who decides just how blunt is acceptable?
 
There are many other issues to face. As we begin our assault on the brain, we will gather information that might turn out to be problematic. Brain experiments are already suggesting that some people have naturally poor control over impulsive actions, and are more prone to criminal or antisocial behaviour. It is important that such information is not thrown casually into the public sphere.
 
For all the appropriate caution, let's acknowledge that some of the things we're learning to do to the brain are really rather exciting. Having a virtual holiday might sound like a bore, but what about having razor-sharp focus at the flick of a switch? The US military is piloting a scheme that is mind-bendingly futuristic: a direct electrical current applied through the scalp that in effect puts you into a high-concentration zone. With this "transcranial direct current stimulation", learning is accelerated and performance in tasks that require mental focus is significantly enhanced.
 
The Americans are using it to improve sniper training, but that won't be the only application. One day soon you might unplug yourself and utter the immortal words: "I know kung fu." Hollywood races ahead, but we're not as far behind as you might think.

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 26 August 2013 issue of the New Statesman, How the dream died


With everything from iPhones to clothing turning monochrome, is the West afraid of colour?

If modern design appears particularly achromatic, it only reflects the "chromophobia" which courses through the history of Western thought.

To many English observers, 1666 – the year that the poet John Dryden christened the annus mirabilis, or “year of miracles” – wasn’t especially miraculous. The country was gripped by plague and, after a hot, dry summer, the Great Fire cut a swath through London. But for Isaac Newton, then still a student, it did prove illuminating. It was in 1666 that he first used prisms to prove that white light was not a pure, indissoluble substance but was made up of different coloured rays. This was such a profound challenge to the prevailing world-view that even Newton was shaken. “I perswade my self,” he wrote, “that this Assertion above the rest appears Paradoxical, & is with most difficulty admitted.”

The belief that colours are inferior and therefore naturally subordinate, rather than fundamental, was not new in Newton’s day, nor did it end with his discovery of spectral colour. A pattern of chromophobia – an aversion to colours – courses through Western thought.

Writing in the fourth century BC, Aristotle argued: "The most attractive colours would never yield as much pleasure as a definite image without colour." For Renaissance artists, this idea was embodied in the division between disegno (drawing or design) and colore (colour). Disegno was the foundation of any serious artistic endeavour. The preference for achromatic, "intellectual" form is also evident in architecture. Despite rock-solid evidence from the 19th century that Greek marble buildings and statues were once brightly painted, the classical ideal has remained anachronistically bleached. And while modernist and postmodern architects have made some use of colour, the primacy of form is unmistakable in the work of everyone from John Pawson to Zaha Hadid and Toyo Ito.

A broad cultural dislike of colour is curious because, in evolutionary terms, our ability to see it has been crucial to our success. Colour vision in primates developed between 38 and 65 million years ago, making it easier to spot ripening red and yellow fruits amid green foliage. Neurons devoted to visual processing occupy much more of our neocortical real estate than those devoted to hearing or touch. Estimates vary, but the Optical Society of America has suggested that humans may be able to distinguish up to ten million different shades.

And we have put this skill to good use. Bold colours have been used by many cultures to mark temporal and spiritual power. Tyrian purple, a rich, reddish dye said to resemble clotted blood, was made using an extract from two different kinds of Mediterranean shellfish and was beloved by emperors in the ancient world. A single pound of dyed cloth would cost a skilled craftsman three years’ wages and became steadily more expensive as the shellfish became rarer.

But even as such saturated colours were coveted, they also elicited disgust. The manufacture of many, including Tyrian purple, involved ingredients such as stale urine and dung. Dye- and paintworks were relegated to the urban fringes. Increasingly, the wearing of bright colours was seen as vainglorious and ungodly. Protestants indicated their humility by whitewashing over jewel-coloured murals and smashing stained-glass windows in churches, and by restricting their sartorial palette predominantly to black. An echo prevails today in men's suits: colours are largely confined to small accessories such as ties, and white shirts are held up as the ne plus ultra of refined sophistication. (The late Apple co-founder Steve Jobs went one better, opting for a uniform of identical black turtlenecks.)

One reason for this distrust is that colours are difficult to conceptualise. Do they exist physically, or only in our brains? Does everyone see them the same way? Colours have been maligned as chaotic, fickle, irrational and female. The early Christian thinker St Augustine of Hippo accused them of “a seductive and dangerous sweetness”.

Our ambivalence towards colour, however, has profited white. Like black, white has not been classed as a real colour since Newton. It has almost become an anti-colour. Take Apple, for example. Although Sir Jony Ive is usually credited with the company's love of monochrome products (it was certainly Ive who brought this to its apogee), the trend predates his arrival. It can be traced back to the "Snow White" design language developed in the 1980s. Today, as consumer neophilia demands that technology be continually refreshed, Apple's higher-end products are available in the smallest range of colours – usually just white, black and, for the Asian market, gold – while those lower down come in a slew of fruity brights.

White is not only big business for Apple. In 2014, a Californian man named Walter Liew was found guilty of 20 counts of economic espionage and sentenced to 15 years in jail for selling the secret of a very special shade of titanium dioxide white, used in everything from luxury cars to tennis courts, to Chinese firms for $28m.

Perhaps the final word on the matter should go to Le Corbusier. In 1925, the great modernist recommended that all interior walls should be whitewashed, to act as a moral and spiritual restorative. But he wasn’t just advocating white for white’s sake: although he continued to dabble with colour, he disapproved of it, too. “Let us leave to the clothes-dyers,” he wrote, “the sensory jubilations of the paint tube.”

“The Secret Lives of Colour” (John Murray) by Kassia St Clair will be published on 20 October

This article first appeared in the 26 May 2016 issue of the New Statesman, The Brexit odd squad