Separation from our mobiles impacts our cognitive, emotional and physiological wellbeing. Image: Getty.

Can't survive without your phone? You could be suffering from nomophobia

Our smartphones are fast becoming extensions of ourselves. So what happens when we're separated from them?

You could be one of the millions of people suffering from Nomophobia (or, as it’s also known, Smartphone Separation Anxiety). It’s the pathological fear, anxiety, or discomfort associated with being without your mobile phone. In other words, it’s the sheer panic that descends the moment you suspect that you’ve accidentally left it at home. Or that sinking feeling of despair when it’s on 1 per cent charge. And the sense of relief when someone offers you a charger.

The findings from last year's Deloitte Mobile Consumer Survey showed that of the 35 million smartphone owners in the UK, one in six looks at their phone more than 50 times a day. Nearly a third reach for their smartphones within five minutes of waking up (not including turning off the alarm). And I reluctantly admit that I fall into the 11 per cent of people who scroll through their smartphones immediately after waking up.

Smartphones have morphed into “physical extensions of ourselves”, and separation from our mobiles could have a significant impact on our cognitive, emotional and physiological wellbeing. This is according to a study published in the Journal of Computer-Mediated Communication, which aimed to investigate the psychological and physiological changes in participants when they were separated from their smartphones and prevented from answering them.

The research team from the University of Missouri asked 40 participants to complete various cognitive tasks, once with their smartphones in their possession and once without. As predicted, when the participants were separated from their smartphones they showed poorer cognitive performance, increased blood pressure, increased heart rate, and greater levels of self-reported anxiety.

Their findings supported the Extended Self Theory, a concept formulated by marketing professor Russell Belk. The theory proposes that “an individual’s possessions, whether knowingly or unknowingly, intentionally or unintentionally, can become an extension of one’s self”. In other words, when we exercise control over our possessions in much the same way as we control our limbs, we eventually come to view the external object as part of our self. In line with this theory, the research team suggests that when a person loses a close possession, such as a smartphone, it should be viewed as a “loss or lessening of self”.

This is not entirely surprising. Nowadays smartphones are more than just gadgets; they are ever-present aspects of our daily lives. Our mobile phones are like portable windows to the outside world, providing us with instant access to vast amounts of information and a sense of connection to our social circles.

One limitation of this study is its small sample size, which means the results may not be representative of the wider population. Nevertheless, the research team concludes that subsequent research should aim to investigate whether “other technological devices are capable of becoming incorporated into the extended self”. This could be an important area for future research, especially since the International Data Corporation predicts that there will be more than 2bn “Internet of Things” devices installed by 2020. The Internet of Things – a popular phrase used to describe the technology in which our devices are connected to and controlled over the internet – is growing rapidly. “Smart homes”, in which our washing machines, fridges, smoke detectors and other household appliances are connected to the internet, constitute a major part of this trend.

Tech companies such as Google have shared their plans to link their devices with appliances in our homes. And earlier this month, Apple launched its smart home platform HomeKit, which will allow a number of products to be controlled by its voice command system Siri. iPhones, iPads and Apple Watches could be used to dim the lights, determine whether a kitchen window is open, and even detect home air quality.

It’s exciting to see technology advancing in this way. However, the findings of this study raise a number of questions. Are we becoming unhealthily reliant on technology? If so, how can we develop a healthier attachment to our gadgets? Or is this even something to be worried about? Most importantly, since an increasing number of devices are being connected to the Internet, should we also be concerned about the repercussions of humans becoming connected to an increasing number of devices?

Image: Metro-Goldwyn-Mayer Pictures.

The one where she turns into a USB stick: the worst uses of tech in films

The new film Worst Tinder Date Ever will join a long tradition of poorly-thought-through tech storylines.

News just in from Hollywood: someone is making a film about Tinder. What will they call it? Swipe Right, perhaps? I Super Like You? Some subtle allusion to the app’s small role in the plotline? Nope – according to the Hollywood Reporter, the film has been christened Worst Tinder Date Ever.

With the exception of its heavily branded title (You’ve Got Gmail, anyone?), Worst Tinder Date Ever follows neatly in the tradition of writers manhandling tech into storylines. Because really, why does it matter if it was a Tinder date? This “rom com with action elements” reportedly focuses on the couple’s exploits after they meet on the app, so the dogged focus on it is presumably just a ploy to get millennial bums on cinema seats.  

Like the films on this list, it sounds like the tech in Worst Tinder Date Ever is just a byword for “modern and cool” – even as it demonstrates that the script is anything but.

Warning: spoilers ahead.

Lucy (2014)

Scarlett Johansson plays Lucy, a young woman who accidentally ingests large quantities of a new drug that promises to evolve the brain beyond normal human limits.

She evolves and evolves, gaining superhuman powers, until she hits peak human and turns first into a supercomputer, and then a very long USB stick. USB-Lucy then texts Morgan Freeman’s character on his flip phone to prove that “I am everywhere”.

Beyond the obvious holes in this plotline (this wouldn’t happen if someone’s brain evolved; texting a phone is not a sign of omnipotence), USB sticks aren’t even that good – as Business Insider points out: “Flash drives are losing relevance because they can’t compete in speed and flexibility with cloud computing services . . . Flashdrives also can’t carry that much information.”

Star Wars: The Force Awakens (2015)

If you stare at it hard enough, the plotline in the latest Star Wars film boils down to the following: a gaggle of people travels across space in order to find a map showing Luke Skywalker’s location, held on a memory stick in a drawer in a spherical robot. Yep, those pesky flash drives again.

It later turns out that the map is incomplete, and the rest of it is in the hands of another robot, R2-D2, who stays dormant for most of the film before waking up to spit out the missing fragment. Between them, creator George Lucas and writer and director JJ Abrams have dreamed up a dark vision of the future in which robots can talk and make decisions, but can’t email you a map.

Willy Wonka and the Chocolate Factory (1971)

In which a scientist uses a computer to find the “precise location of the three remaining golden tickets” sent out into the world by Willy Wonka. When he asks it to spill the beans, it announces: “I won’t tell, that would be cheating.”


Image: Paramount Pictures. 

The film inhabits a world where artificial intelligence has been achieved, but no one has thought to pull Charlie's poor grandparents out of extreme poverty, or design a computer with more than three buttons.

Independence Day (1996)

When an alien invasion threatens Earth, David Levinson (Jeff Goldblum) manages to stop it by hacking the alien spaceship and installing a virus. Using his Mac. Amazing, really, that aliens from across the universe would somehow use computing systems so similar to our own. 

Skyfall (2012)

In the Daniel Craig reboot of the series, MI6’s “Q” character (played by Ben Whishaw) becomes a computer expert, rather than just a gadget wizard. Unfortunately, this heralded some truly cringeworthy moments of “hacking” and “coding” in both Skyfall and Spectre (2015).

In the former, Bond and Q puzzle over a screen filled with a large, complex web shape. They eventually realise it’s a map of subterranean London, but then the words “security breach” flash up, along with a skull. File under “films which make up their own operating systems because a command prompt box on a Windows desktop looks too boring”.

An honourable mention: Nelly and Kelly Rowland’s “Dilemma” (2002)

Not a movie, but how could we leave out a music video in which Kelly Rowland texts Nelly on a Microsoft Excel spreadsheet on a weird Nokia palm pilot?


Image: Vevo.

You’ll be waiting a long time for that response, Kelly. Try Tinder instead.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.