The ratings game

Games are rated in the same way as film and DVD releases, but confusion still reigns in consumers' minds

Interactive entertainment is a hot potato, the subject of headlines, parliamentary discussion and media scrutiny. At the centre of the conflagration is the old debate about media effects: what (if any) real-life impact do video games have on the people who play them?

The hundreds of games released every year tend to be lumped into the same bucket, accused of violence, sexual promiscuity and other content considered inappropriate for younger audiences. Yet, according to the Entertainment and Leisure Software Publishers Association (ELSPA), fewer than 3 per cent of annual releases contain content rated 18. The problem is that many detractors appear unaware of the variety of gaming experiences on offer.

Social commentaries

As a relatively new entertainment medium, games are generally misunderstood. On the one hand, most people think of them as kids' toys. On the other, advances in game technology over the past 20 years allow designers to create political statements, hard-hitting dramas and fiercely cutting social commentaries that are decades beyond the innocent blips of Pong. The Atari generation has grown up, and so have its tastes in leisure activities. Games are now developed for all age brackets, from infants to adults.

In the UK, all video games released to market are voluntarily submitted to the British Board of Film Classification (BBFC) and are assessed like DVD and video releases. The classification board works closely with the Pan-European Game Information (PEGI) system, a consortium of 29 countries.

PEGI was established in 2003 by the Interactive Software Federation of Europe to ensure that parents and other consumers were informed about games that may be unsuitable for children. Although it's voluntary, the programme does have the support of all major publishers and console manufacturers, including Sony, Microsoft and Nintendo.

When a game reaches the board, the application and the product are reviewed and given an appropriate certificate, with the age rating clearly marked on the front of the package. Games are also tagged with icons that indicate potentially offensive content, such as discrimination, gambling, drug abuse, sex or violence.

BBFC and PEGI

In the UK, the BBFC adds its own familiar icons, ranging from U for "universal" to 18 for adults. In contrast to PEGI's ratings, which are for information only, the BBFC's ratings are legally binding, and anyone caught selling age-restricted content to a minor can be fined. When a game is refused a BBFC rating, as Rockstar Games' Manhunt 2 was earlier this year, it is effectively banned. Only two games have been banned in the scheme's 21-year history.

The interactivity of games can make a difference in ratings. The examiners work to a similar remit for both games and film, but a game may receive a higher rating if a player's action leads a character towards a behaviour that may be offensive.

At the GameCity event in Nottingham in October, BBFC examiner Jim Cliff explained that, for example, an instance of bad language in a film may attract a lower rating than the same language in a game, particularly if it is triggered by a player's action (such as pressing a button or passing a particular location) and can be repeated again and again.

Parental uncertainty

Cliff admits that parents may know what a 15 rating means for a film, but may not understand what earns a game a "15" rating. To bridge this disconnect, industry bodies have implemented public-facing education programmes, websites and white papers, with varying degrees of success. It is hoped that the issue will be clarified in March next year by the results of the Byron Review, spearheaded by the Department for Children, Schools and Families and the Department for Culture, Media and Sport, which was set up to critically examine research on the effects of violent video games.

This article first appeared in the 17 December 2007 issue of the New Statesman, Christmas and New Year special 2007

Are smart toys spying on children?

If you thought stepping on a Lego was bad, consider the new ways in which toys can hurt and harm families.

In January 1999, the president of Tiger Electronics, Roger Shiffman, was forced to issue a statement clearing the name of the company’s hottest new toy. “Furby is not a spy,” he announced to the waiting world.

Shiffman was speaking out after America’s National Security Agency (NSA) banned the toy from its premises. The ban was its response to a playground rumour that Furbies could be taught to speak, and therefore could record and repeat human speech. “The NSA did not do their homework,” said Shiffman at the time.

But if America’s security agencies are still in the habit of banning toys that can record, spy, and store private information, then the list of contraband items must be getting exceptionally long. Nearly 18 years after Tiger Electronics was forced to deny Furby’s secret-agent credentials, EU and US consumer watchdogs are filing complaints about a number of WiFi- and Bluetooth-connected interactive toys, also known as smart toys, which have hit the shelves. Equipped with microphones and an internet connection, many have the power to invade both children’s and adults’ private lives.

***

“We wanted a smart toy that could learn and grow with a child,” says JP Benini, the co-creator of the CogniToys “Dino”, an interactive WiFi-enabled plastic dinosaur that can hold conversations with children and answer their questions. Benini and his team won the 2014 Watson Mobile Developer Challenge, allowing them to use the question-answering software IBM Watson to develop the Dino. As such, unlike the “interactive” toys of the Nineties and Noughties, Dino doesn’t simply repeat a host of pre-recorded stock phrases, but has real, organic conversations. “We grew it from something that was like a Siri for kids to something that was more conversational in nature.”

In order for this to work, Dino has a speaker in one nostril and a microphone in the other, and once a child presses the button on his belly, everything they say is processed by the internet-connected toy. The audio files are turned into statistical data and transcripts, which are then anonymised and encrypted. Most of this data is, in Benini’s words, “tossed out”, but his company, Elemental Path, which owns CogniToys, does store statistical data about a child, which it calls “Play Data”. “We keep pieces from the interaction, not the full interaction itself,” he tells me.

“Play Data” are things like a child’s favourite colour or sport, which are used to make a profile of the child. This data is then available for the company to view, use, and pass on to third parties, and for parents to see on a “Parental Panel”. For example, if a child tells Dino their favourite colour is “red”, their mother or father will be able to see this on their app, and Elemental Path will be able to use this information to, Benini says, “make a better toy”.

Currently, the company has no plans to use the data with any external marketers, though it is becoming more and more common for smart toys to store and sell data about how they are played with. “This isn’t meant to be just another monitoring device that's using the information that it gathers to sell it back to its user,” says Benini.

Sometimes, however, Elemental Path does save, store, and use the raw audio files of what a child has said to the toy. “If the Dino is asked a question that it doesn’t know, we take that question and separate it from the actual child that’s asking it and it goes into this giant bucket of unresolved questions and we can analyse that over time,” says Benini. It is worth noting, however, that Amazon reviews of the toy claim it is frequently unable to answer questions, meaning a potentially large amount of audio is being saved, rather than the occasional file.

CogniToys have a relatively transparent Privacy Policy on their website, and it is clear that Benini has considered privacy at length. He admits that the company has been back and forth about how much data to store, originally offering parents the opportunity to see full transcripts of what their child had been saying, until many fed back that they found this “creepy”. Dino is not the first smart toy to raise concerns of this kind.

Hello Barbie is the world’s first interactive Barbie doll, and when it was released by Mattel in 2015, it was met with scorn by parents’ rights groups and privacy campaigners. Like Dino, the doll holds conversations with children and stores data about them which it passes back to the parents, and articles expressing concern about the toy appeared on CNN and in the Guardian and the New York Times. Despite Dino’s similarities, however, Benini’s toy received almost no negative attention, while Hello Barbie won the Campaign for a Commercial-Free Childhood’s prize for worst toy of the year 2015.

“We were lucky with that one,” he says. “Like the whole story of the early bird gets the worm but the second worm doesn’t get eaten. Coming second on all of this allowed us to be prepared to address the privacy concerns in greater depth.”

Nonetheless, Dino is in many ways essentially the same as Hello Barbie. Both toys allow companies and parents to spy on children’s private playtimes, and while the former might seem more troubling, the latter is not without its problems. A feature on the Parental Panel of the Dino also allows parents to see the exact wording of questions children have asked about certain difficult topics, such as sex or bullying. In many ways, this is the modern equivalent of a parent reading their child's diary. 

“Giving parents the opportunity to side-step their basic responsibility of talking to, engaging with, encouraging and reassuring their child is a terrifying glimpse into a society where plastic dinosaurs rule and humans are little more than machines providing the babies for the reptile robots to nurture,” says Renate Samson, the chief executive of privacy campaign group Big Brother Watch. “We are used to technology providing convenience in our lives to the detriment of our privacy, but allowing your child to be taught, consoled and even told to meditate by a WiFi connected talking dinosaur really is a step in the wrong direction.”

***

Toy companies and parents are one thing, however, and to many it might seem trivial for a child’s privacy to be compromised in this way. Yet many smart toys are also vulnerable to hackers, meaning security and privacy are under threat in a much more direct way. Ken Munro, of Pen Test Partners, is an ethical hacker who exposed security flaws in the interactive smart toy “My Friend Cayla” by making her say, among other things, “Calm down or I will kick the shit out of you.”

“We just thought ‘Wow’, the opportunity to get a talking doll to swear was too good,” he says. “It was the kid in me. But there were deeper concerns.”

Munro explains that any device could connect to the doll over Bluetooth, provided it was in range, as the set-up didn’t require a pin or password. He also found issues with the encryption processes used by the company. “You can say anything to a child through the doll because there's no security,” he says. “That means you've got a device that can potentially be used to groom a child and that's really creepy.”

Pen Test Partners tells companies about the flaws it finds in their products, a process it calls “responsible disclosure”. Most of the time, companies are grateful for the information and work through ways to fix the problem. Munro feels that Vivid Toy Group, the company behind Cayla, did a “poor job” of fixing the issue. “All they did was put one more step in the process of getting it to swear for us.”

It is one thing for a hacker to speak to a child through a toy, and another for them to listen in. Earlier this year, a hack on baby monitors ignited such concerns. But any toy with speech recognition that is connected to the internet is also vulnerable to being hacked. The data stored about how children play with smart toys is also under threat, as Fisher-Price found out this year when a security company managed to obtain the names, ages, birthdays, and genders of children who had played with its smart toys. In 2015, VTech also admitted that five million of its customers had their data breached in a hack.

“The idea that your child shares their playtime with a device which could potentially be hacked, leaving your child’s inane or maybe intimate and revealing questions exposed is profoundly worrying,” says Samson. Today, the US Electronic Privacy Information Center (EPIC) said in a statement that smart toys “pose an imminent and immediate threat to the safety and security of children in the United States”. 

Munro says big brands are usually great at tackling these issues, but warns about smaller, cheaper brands, which have less to lose than companies like Disney or Fisher-Price. “I’m not saying they get it right but if someone does find a problem they’ve got a huge incentive to get it right subsequently,” he says of larger companies. Thankfully, Munro says that he found Dino to be secure. “I would be happy for my kids to play with it,” he says. “We did find a couple of bugs but we had a chat with them and they’re a good bunch. They aren’t perfect but I think they’ve done a hell of a lot better job than some other smart toy vendors.”

Benini appears alert to security and the credibility it gives his company. “We took the security very, very seriously,” he says. “We were still building our systems whilst these horror stories were coming about so I already set pipelines and parameters in place. With a lot of devices out there it seems that security takes a backseat to the idea, which is really unfortunate when you’re inviting these devices into your home.”

As well as being wary of smaller brands, Munro advises, parents should look out for Bluetooth toys without a secure pairing process (i.e. any device can pair with the toy if it is near enough) and think twice about which toys they connect to their WiFi. He also advises using unique passwords for toys and their corresponding apps.

“You might think ‘It's just a toy, so I can use the same password I put in everything else’ – dog’s name, football club, whatever – but actually if that ever got hacked you’d end up getting all your accounts that use that same password hacked,” he says.

Despite his security advice, Munro describes himself as “on the fence” about internet-connected smart toys as a whole. “Most internet of things devices can be hacked in one way or another,” he says. “I would urge caution.”

***

Is all of this legal? Companies might not be doing enough ethically to protect the privacy of children, but are they acting responsibly within the confines of the law?

Benini explains that Dino complies with the United States Children's Online Privacy Protection Act (COPPA), of which there is no real equivalent in the UK. COPPA says that companies must have parental permission to collect personal information over the internet about children under 13 years of age. “We’ve tried to go above and beyond the original layout of COPPA,” says Benini, describing CogniToys’ transparent privacy documents. Parents give their consent for Elemental Path to collect their children’s data when they download the app that pairs with the toy.

Dino bears a striking similarity to Amazon Echo and Google Home, smart speakers that listen out for commands and questions in your home. Everything that is said to Amazon Echo is recorded and sent to the cloud, and an investigation by the Guardian earlier this year discovered that this does not comply with COPPA. We are therefore now in a strange position whereby many internet of things home devices are legally considered a threat to a child’s privacy, whereas toys with the same capabilities are not. This is an issue because many parents may not actually be aware that they are handing over their children’s data when installing a new toy.

As of today, EU consumer rights groups are also launching complaints against certain smart toys, claiming they breach the EU Unfair Contract Terms Directive and the EU Data Protection Directive, as well as potentially the Toy Safety Directive. Though smart toys may be better regulated in Europe, there are no signs that the problem is being tackled in the UK. 

At a time when the UK government is implementing unprecedented measures to monitor its citizens on the internet and Jeremy Hunt wants companies to scour teens’ phones for sexts, it seems unlikely that any legislation will be enacted that protects children’s privacy from being violated by toy companies. Indeed, many internet of things companies – including Elemental Path – admit they will hand over your data to government and law enforcement officials when asked.

***

As smart toys develop, the threat they pose to children only becomes greater. The inclusion of sensors and cameras means even more data can be collected about children, and their privacy can and will be compromised in worrying ways.

Companies, hackers, and even parents are denying children their individual right to privacy and private play. “Children need to feel that they can play in their own place,” says Samson. It is worrying to set a precedent where children get used to surveillance early on. All of this is to say nothing of the educational problems of owning a toy that will tell you (rather than teach you) how to spell “space” and figure out “5+8”.

In a 1999 episode of The Simpsons, “Grift of the Magi”, a toy company takes over Springfield Elementary and spies on children in order to create the perfect toy, Funzo. It is designed to destroy all other toys, just in time for Christmas. Many at the time criticised the plot for being absurd. Like the show's prediction of President Trump, however, it seems that we are living in a world where satire slowly becomes reality.

Amelia Tait is a technology and digital culture writer at the New Statesman.