Scientists can levitate stuff and make it fly around using sound

Japanese scientists have made hundreds of tiny plastic balls float around like miniature spaceships.

Today’s news from the world of Awesome Science comes from the University of Tokyo, where a team has been levitating and controlling objects using sound. Here’s the video:

As the video points out at the beginning, levitation of objects using sound has been around for a few years. If you’ve ever stood in front of a large speaker you’ll know that it can pump out what feels like quite a forceful blast of air as it vibrates - but, somewhat deceptively, that’s not quite the whole story.

Rather than air being physically pushed out from the speaker, what you’re experiencing is a wave of compression moving through the air. The speaker compresses a packet of air, which then “rolls” through the room, with the size of the compressed region corresponding to the wavelength of the sound wave. And, like any waves, sound waves that overlap each other combine to create new waves.

To levitate something just requires creating a standing wave. Think of it like this - if you’re watching a sound wave plotted out on a graph, it’ll be rolling along, going up and down as it oscillates. A standing wave occurs when two or more waves combine to create a new wave where, as the wave oscillates, there are points where there’s no movement. They’re called nodes.
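If you like to see that in symbols: a standing wave is just the sum of two identical waves travelling in opposite directions. Here’s a quick sketch in standard textbook notation (k for wavenumber, ω for angular frequency - this is general wave physics, not anything specific to the Tokyo rig):

```latex
% Two identical waves travelling in opposite directions:
\sin(kx - \omega t) + \sin(kx + \omega t) = 2\sin(kx)\cos(\omega t)
% The \cos(\omega t) factor makes the whole pattern oscillate in time,
% but the sum is pinned to zero wherever \sin(kx) = 0,
% i.e. at x = n\lambda/2. Those fixed points are the nodes.
```

The cos(ωt) factor makes the whole pattern oscillate in time, but wherever sin(kx) = 0 the sum stays at zero no matter what - those are the nodes, and they sit half a wavelength apart.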

Here’s a gif to illustrate how that works. The blue and green waves are combining to create the red wave, which has those points on the central axis that aren’t moving:

(Image: Wikimedia Commons)

If a speaker outputs a standing wave, in the most basic sense it means that the areas of compression - those blasts of air - won’t feel like they’re moving. The gaps between those blasts will be positions of neutral force, with air pressure pushing in from both directions. If you stick an object in there that’s light enough, and smaller than the gap (which will be about half the sound’s wavelength), the force of the air should keep it floating in a stable position.

What the University of Tokyo team has done is build on that idea by combining sound waves in three dimensions. The video shows not just tiny plastic balls being levitated and controlled, but also resistors, LEDs, screws, bolts, and other small items. Rhett Allain at Wired worked out that you could levitate anything both smaller than 8mm and less dense than 1,000kg/m³ - which is tiny, but it does have practical applications, particularly when people are working with sterile things they want to move but can’t touch, like spaceship parts or medicines.
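As a rough sanity check on that size limit, here’s a quick sketch assuming the 40kHz ultrasonic transducers rigs like this typically use (the video doesn’t state the frequency, so treat that as an assumption):

```python
# Rough sanity check: the trap size is set by the wavelength of the
# sound in air, so compute the wavelength for a typical transducer.

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature
FREQUENCY = 40_000.0    # Hz, a common ultrasonic transducer frequency
                        # (an assumption, not a figure from the video)

wavelength = SPEED_OF_SOUND / FREQUENCY           # metres
print(f"Wavelength: {wavelength * 1000:.1f} mm")  # ~8.6 mm
```

At 40kHz the wavelength in air comes out at about 8.6mm - the same order as Allain’s 8mm size limit, which makes intuitive sense: the trap can’t hold anything much bigger than the wave that creates it.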

Ian Steadman is a staff science and technology writer at the New Statesman. He is on Twitter as @iansteadman.


Should we protect artificial intelligence from sexual harassment?

Should anything be done to stop people sending sexually explicit messages to their AI personal assistants?

If you ask Apple’s artificially intelligent personal assistant “Siri” whether it is a virgin, it will waste no time in shooting you down. “We were talking about you, not me,” it replies in the clear, sharp tones of Susan Bennett, the woman chosen to voice the genderless computer program.

If you ask Apple’s artificially intelligent personal assistant “Siri” whether it is a virgin, you are probably not very weird. But a recent article in Quartz has detailed the extent to which AI systems – particularly personal assistant bots – are sexually harassed. Ilya Eckstein, CEO of Robin Labs, claims 5 percent of interactions in their database are sexually explicit, and that “some people try very hard to establish a relationship with the bot.”

Engineers have been aware of this problem for a while. Microsoft’s Cortana has been programmed to fend off sexual harassment, with Deborah Harrison, an editorial writer for the program, claiming: “If you say things that are particularly asshole-ish to Cortana, she will get mad.” But what about the other “female” AIs out there? Amazon’s Alexa and Google Assistant, both voiced by women, don’t currently seem to defend themselves, so should we be fighting for them?

Probably not. Although developers should definitely program their “female” AI to shoot down anyone feeling frisky, as long as AI lacks sentience it’s hard to see these sexual interactions as a big enough problem to warrant further action. Yes, undoubtedly some lonely people have taken inspiration from Spike Jonze’s Her and fancy an AI girlfriend, and yes, a robust robot reply that teaches men to respect women can only be a good thing, but on the whole, most people who get saucy with Siri aren’t actually deranged perverts. They are just a boy, standing in front of a girl, asking her to say the word “willy”.

This is because, despite what Quartz are claiming, the “sexual harassment” of bots is nothing new. It might, in fact, not even be gendered. Who among the MSN users of the Noughties didn’t ask the chatterbot SmarterChild whether he (most people, and media outlets, considered it a “he”) liked sex or had a penis? In fact, if you search Google Images for “SmarterChild”, pretty much all the screencapped chats are sexually explicit in some way.

(Image: Tumblr / The Dynamic Conversationalist)

It’s hard to see someone sexting Siri as a problem, then, because it is part of a long tradition of humans being incredibly, incredibly dumb. Find me the man who doesn’t provoke every new chat bot on the market in the hopes of making them say something funny or rude, and you have found me a liar.

It is, of course, a big problem that AI personal assistants are so often female, as – in Laurie Penny’s words – it “says an uncomfortable amount about the way society understands both women and work.” But this, therefore, is the problem we should be tackling – instead of wasting our time debating the ethics and legality of coming on to Cortana.

I recently attended the UK launch of Amazon Echo, whose personal assistant is Alexa. Watching a room of old, balding, white, male journalists laugh heartily as the speaker on stage commanded Alexa to “Stop” definitely troubled me. “If only I could get my…” began the speaker – as I desperately willed him not to say the word “wife” – “…children to do that,” he finished. Before we even begin to consider sexually explicit chatter, then, we should confront the underlying issue of gender bias in the AI industry.

Once we can set our personal assistants to have either male or female – or, even better, completely genderless – voices, we can get back to using them for what they were intended for. Asking them if they're virgins and then laughing at the response.

Amelia Tait is a technology and digital culture writer at the New Statesman.