We've seen how a young child can place an order for a dollhouse using Alexa, and how a television anchorman repeating the story accidentally caused other Echo units to order dollhouses too. Now comes a story with even scarier implications. A report in Thursday's New York Times revealed that students from the University of California, Berkeley, and Georgetown University were able to hide subliminal commands for virtual personal assistants inside white noise played over loudspeakers. The commands included enabling airplane mode and opening a website on a smartphone. That research dates back to 2016.

To make matters worse, a new report from some of the Berkeley students found that the subliminal messages could also be hidden in music. While a smart device owner might hear only music, buried underneath are commands aimed at the device's virtual personal assistant. According to Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley, it is only a matter of time before this is exploited. "My assumption is that the malicious people already employ people to do what I do," Carlini said.
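To get a conceptual feel for how such an attack works, here is a toy NumPy sketch, not the researchers' actual pipeline (which optimizes a perturbation against a full speech-to-text model): it nudges a "music" clip by an amount capped at a tiny per-sample amplitude, yet sharply raises the score of a deliberately simple stand-in command detector. Every name and number in it is illustrative.

```python
# Toy illustration of hiding a command in music: a perturbation small enough
# to be masked by the music, chosen to push a command detector's score upward.
# The linear "detector" is a stand-in so the sketch stays self-contained.
import numpy as np

rng = np.random.default_rng(0)
N = 16_000                                   # one second of audio at 16 kHz
t = np.arange(N) / 16_000
music = 0.5 * np.sin(2 * np.pi * 220 * t)    # stand-in "music": a 220 Hz tone

template = rng.standard_normal(N)            # what the toy detector listens for
template /= np.linalg.norm(template)

def detector_score(audio: np.ndarray) -> float:
    """Toy stand-in for a recognizer: a higher score means 'more command-like'."""
    return float(audio @ template)

# Because the toy detector is linear, the strongest perturbation under a
# per-sample cap is epsilon * sign(gradient) -- the same projected-gradient
# idea real attacks apply to neural speech models.
epsilon = 0.02                               # max change per sample (4% of the music's peak)
adversarial = music + epsilon * np.sign(template)

print(detector_score(music))        # close to zero: nothing command-like
print(detector_score(adversarial))  # raised by about epsilon * sum(|template|), roughly 2
```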

Major smart speaker manufacturers like Amazon, Google and Apple say that they have safeguards in place to prevent their assistants from being hijacked. Amazon and Google use technology to detect and disregard commands that humans cannot hear. To keep Alexa from accidentally being activated during its Super Bowl ad (which was about Alexa losing her voice), Amazon reportedly played a tone in the 3,000 to 6,000 Hz range that signals its devices to ignore the wake word. In addition, voice recognition prevents Alexa and Google Assistant from carrying out certain commands unless they recognize the user's voice. Apple says that its HomePod won't act on commands that unlock doors, and iPhones and iPads must be unlocked before Siri can access personal data, call up websites or open apps.
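Amazon hasn't detailed exactly how that broadcast protection works, but the reported frequency band suggests a simple check along the lines of the NumPy sketch below; the sample rate, threshold and function names are all assumptions made for illustration.

```python
# A minimal sketch of checking for an acoustic mark in the 3,000-6,000 Hz band.
# The band comes from the reporting above; the threshold is arbitrary.
import numpy as np

SAMPLE_RATE = 16_000  # Hz, a typical rate for voice capture

def band_energy_ratio(audio: np.ndarray, low_hz: float, high_hz: float) -> float:
    """Fraction of the clip's spectral energy between low_hz and high_hz."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / SAMPLE_RATE)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return float(spectrum[band].sum() / max(spectrum.sum(), 1e-12))

def looks_marked(audio: np.ndarray, threshold: float = 0.5) -> bool:
    """Treat a clip as 'broadcast-marked' if the 3-6 kHz band dominates its energy."""
    return band_energy_ratio(audio, 3_000.0, 6_000.0) > threshold

# Example: a clip dominated by a 4.5 kHz tone would be flagged and ignored.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
print(looks_marked(0.2 * np.sin(2 * np.pi * 4_500 * t)))  # True
```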

Last year, a Burger King commercial said, "O.K., Google, what is the Whopper burger?" The idea was to get Android devices to read aloud the burger's ingredients from the Whopper's Wikipedia page. But when the page was overrun with obviously false ingredients like "toenail clippings," the ad was pulled.

A technique called DolphinAttack shows that subliminal commands can even be embedded in sounds inaudible to the human ear. Check out the video at the top of this story. While a DolphinAttack requires the transmitter to be placed close to the smart device receiving the hidden message, last month researchers were able to send these subliminal messages via ultrasound from 25 feet away.
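The published DolphinAttack research describes the trick as amplitude-modulating a voice command onto an ultrasonic carrier: a human hears nothing, but the nonlinearity of a phone's microphone demodulates the envelope back into an audible-band command. The NumPy sketch below shows only that modulation step; the carrier frequency, sample rate and placeholder "voice" are illustrative.

```python
# Minimal sketch of the modulation behind a DolphinAttack-style signal:
# amplitude-modulate a baseband voice command onto an ultrasonic carrier.
import numpy as np

FS = 192_000          # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000   # above the roughly 20 kHz ceiling of human hearing

def modulate_ultrasonic(voice: np.ndarray) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto the ultrasonic carrier."""
    t = np.arange(len(voice)) / FS
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Classic AM: the DC offset keeps the envelope non-negative so it can be
    # recovered by simple (even unintentional) envelope detection.
    return (1.0 + 0.8 * voice / np.max(np.abs(voice))) * carrier

# Placeholder standing in for a recorded voice command:
t = np.arange(FS) / FS
fake_voice = np.sin(2 * np.pi * 300 * t)      # a 300 Hz tone instead of real speech
inaudible = modulate_ultrasonic(fake_voice)   # spectral energy now sits near 25 kHz
```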

Perhaps the scariest scenario was tested by Chinese and American researchers, who discovered that virtual personal assistants would follow commands hidden inside music played from the radio or over YouTube. So if your phone or smart speaker starts acting funny and performing tasks you didn't request, it could be reacting to a message embedded in a song you're listening to on the radio. Or it could be reacting to a message embedded in a sound you can't even hear.