Yes We Can Go Forward and Believe in America! When did US campaign slogans become self-help mantras?

Martha Gill's Irrational Animals column.

Something’s happened to presidential campaign slogans. Something affirmative. Motivational. Inspiring. Yes! They’ve become self-help mantras.

Romney’s got his rather hectoring “Believe in America” and Obama’s gone for the grammatically pointed “Forward.” – the much-discussed full stop signifying, apparently, a mind set on its course. Last election, of course, we had the rabble-rousing chant “Yes we can”. The tone now borrows from life coaches where it once borrowed from the advertising industry (“I Like Ike”, “Keep Cool and Keep Coolidge”), and this time it’s much harder to oppose. Agreeing is not only right – it’s healthy!

This would all be very clever, but the trouble with life coaching is that it’s already been through several loops of cultural backlash. If a film features fairground music we know a grisly murder is not far away, and if a character recites motivational mantras, that is a character primed for gentle tragedy. In fact, I’m so damaged by the likes of Little Miss Sunshine and The Office that I can’t hear Romney’s slogan without picturing him saying it in front of a mirror (“I believe in America. I believe in myself. I am a strong, independent individual moving daily towards a better future”) before bursting into tears and eating Ben & Jerry’s straight from the tub.

But there is also something intrinsically tragicomic about motivational quotes. Who really springs into action after reciting a wholesale phrase about how great they are? The slogans seem to mock you, denying the gap between where you are and where you want to be (“I am the best presidential candidate in the world, EVER”) and making that gap all the more apparent in the process. It could only be a matter of time before science found they didn’t really work.

A paper published in Psychological Science looked at the differences between “declarative” self-talk (yes we can) and interrogative self-talk (can we, though?). The scientists, Ibrahim Senay and Dolores Albarracin, took fifty-three undergraduates and gave them some anagrams to solve (like rearranging the letters in “cause” to spell “sauce”). But before they were allowed to start the task, they had to spend a minute talking to themselves. Half were in the “Will I?” group – they had to ask themselves whether they could complete the task. The other half were in the “I Will” group – they had to tell themselves they would. The groups were then given ten minutes to solve as many anagrams as possible.

Raised on Nike adverts and positive thinking, we might expect the assertive group to do better. They are pumped on self-belief, after all, whereas the other group have only mild self-doubt. But no – the “Will I?” group solved 25 per cent more anagrams. Real motivation seemed to come from the question, rather than the pre-emptive answer.

The scientists thought that the question helped people to tap into their intrinsic motivation – whether they actually wanted to do the activity for its own sake. It turned out they did. The extrinsic hectoring, by contrast, blocked that internal motivation.

So there we are, Obama, just a small change in punctuation is needed. “Forward?” Yeah, go on then.

Mitt Romney and his wife. Photograph: Getty Images

Martha Gill writes the weekly Irrational Animals column. You can follow her on Twitter here: @Martha_Gill.

This article first appeared in the 27 August 2012 issue of the New Statesman, The end of the political cartoon?


YouTube announces new measures against extremism – but where do they leave the far right?

Videos by alt-right commentators have arguably radicalised many online. Will Google's latest policies do anything to change this?

Within hours of the terrorist attack in Finsbury Park, Tommy Robinson was trending on Twitter. The former leader of the English Defence League accused the Finsbury Park mosque of “creating terrorists” in a series of tweets on his personal account.

More than 17,400 people have now tweeted about the 34-year-old, with many theorising he could have radicalised the attacker who allegedly shouted “I’m going to kill all Muslims” at the scene. At present, there is no evidence that the man arrested by police on suspicion of attempted murder is a fan of Robinson.

“People are saying I’m inciting hate,” said Robinson in a video uploaded to Twitter and YouTube after the attack. “I just tell the facts and the truth and I’m not going to apologise for that…

“If giving you quotes from the Quran that incite murder and war against us is inciting hate, I’m guilty. If telling you all the problematic problems that come from the teachings and scriptures of Islam, I’m guilty. But these are just facts.”

After describing the country as being at “war”, he goes on to say: “Please one person, just one, give me one example of me inciting hate.”

When we talk about radicalisation and terrorism, we are finally beginning to understand that this extends beyond the work of Isis.

Just over a year ago, Labour MP Jo Cox was murdered by a white supremacist. This morning, Harry Potter author JK Rowling used Twitter to accuse columnist Katie Hopkins of contributing to radicalisation. The New Statesman’s own Media Mole notes how right-wing tabloids incite hate.

In particular, it is now evident how the far right radicalises online. In December 2016, a man fired three shots in a Washington DC pizza parlour that the alt-right (on 4Chan and YouTube) had accused of being at the centre of a paedophile ring.

The internet arguably allowed Anders Breivik, the Norwegian far right white supremacist who killed 77 people in 2011, to cultivate his extreme views. Alexandre Bissonnette, the white nationalist who murdered six men at a Québec City mosque in January, was described by many as an “internet troll”.

Earlier this year, a report by the Commons home affairs committee accused social media giants of not doing enough to tackle terrorism online. In response to this – and following a series of high-profile brands pulling their advertising from YouTube after it appeared alongside terrorism-related videos – Google, which owns the video-sharing site, has now announced four steps it is taking to fight online terror. But do these reflect the reality that there are many forms of extremism?

Google’s new guidelines speak of “terrorism” and “extremism” in broad terms. This means that videos glorifying or inciting terrorism will be treated the same whether they are from the far right, far left, or pro-Isis organisations.

Google’s four steps for tackling such videos include using machine learning to identify videos glorifying violence, using a team of human flaggers to identify problematic videos, and using a “redirect method” to send potential Isis recruits towards anti-terror videos. Each of these steps is concerned with content that either breaks the law or violates YouTube’s policies.

The fourth step (or rather the third, as it is ordered in Google’s blogpost) is focused on content that is neither illegal nor in violation of YouTube’s policies. For example, this could include videos that don’t directly incite terrorism, but arguably incite hate, such as those denying the Holocaust.

According to Kent Walker, Google’s general counsel, these could also be “videos that contain inflammatory religious or supremacist content”. Rather than being removed like the other offending videos, these will be hidden behind a warning, not have adverts on them (therefore preventing their creators from making money), and will not be eligible for comments. Essentially, as Walker writes, “that means these videos will have less engagement and be harder to find”.

It remains to be seen whether – or how – this will apply to the content of Tommy Robinson. YouTube’s steps will be taken on a video-by-video basis, meaning no far right commentator will be banned outright. Instead, YouTube simply won’t promote any offending videos, meaning they will not appear in their subscribers’ recommended feeds and will be difficult to find on the site.

In this way, Google has remained committed to free speech while doing more to tackle extremism on YouTube. Those like Robinson who claim to just “tell the facts” could arguably now be held to account for their actions. Many on the far right are careful to not explicitly advocate violence. Nevertheless, the loaded language used in their videos could arguably incite hate.

Paul Joseph Watson, a right-wing conspiracy theorist YouTuber with nearly one million subscribers, has never advocated terrorism, but has videos entitled “Islam is NOT a Religion of Peace” and “Chuck Johnson: Muslim Migrants Will Cause Collapse of Europe”.

In the past I have argued that allowing Google and YouTube to censor us in the name of “extremism” and “terrorism” is a troubling trend, but with these new promises, the company has walked the delicate line between the law and free speech. By allowing hateful, but not illegal, content to be hosted on its site and yet restricted from a wider audience, YouTube is taking a stand against extremists of all kinds.

Amelia Tait is a technology and digital culture writer at the New Statesman.
