Hackers could turn your smart speaker into an acoustic cyber-weapon
Look around your home – how many smart speakers do you have? For many of us, they have become an integral part of daily life: as music players, as hubs controlling our smart home devices, and as vessels for voice assistants like Google Assistant, Siri, and Alexa.
But could our smart speakers one day be used to harm us? One researcher believes it’s possible. Matt Wixey, cybersecurity research lead at technology consulting firm PwC UK, says that it’s “surprisingly easy to write custom malware that can induce all sorts of embedded speakers to emit inaudible frequencies at high intensity, or blast out audible sounds at high volume”.
Such aural attacks could potentially damage your hearing, cause tinnitus, or even lead to psychological effects.
Speaking at the Defcon security conference in Las Vegas on August 11, Wixey explained that his team “wondered if an attacker could develop malware or attacks to emit noise exceeding maximum permissible level guidelines, and therefore potentially cause adverse effects to users or people around”.
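Wixey hasn’t published his malware, but the underlying point – that a few lines of ordinary code can produce a full-volume tone above the range of human hearing – is easy to illustrate. The sketch below (hypothetical, and not Wixey’s code) generates one second of a full-amplitude 21 kHz sine wave, inaudible to most adults, and writes it to a standard WAV file that any speaker-equipped device could play:

```python
import math
import struct
import wave

SAMPLE_RATE = 48000  # Hz; high enough to represent a 21 kHz tone

def ultrasonic_tone(freq_hz=21000, seconds=1.0, amplitude=1.0):
    """Return full-scale 16-bit PCM samples of a sine tone.

    21 kHz sits above most adults' hearing range but within the
    output range of many consumer speakers.
    """
    n = int(SAMPLE_RATE * seconds)
    peak = int(amplitude * 32767)  # 16-bit full scale
    return [int(peak * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
            for t in range(n)]

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 2 bytes = 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

samples = ultrasonic_tone()
write_wav("tone_21khz.wav", samples)
```

The unsettling part is how unremarkable this is: nothing here requires elevated privileges or special hardware access, only the same audio path any music app uses.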
The research analyzed the potential acoustic output of a number of devices, including “a laptop, a smartphone, a Bluetooth speaker, a smart speaker, a pair of over-ear headphones, a vehicle-mounted public address system, a vibration speaker, and a parametric speaker, which channels sound in a specific direction”.
Wixey then created malware to run on each device, and tested each device’s acoustic output and surface temperature in an anechoic chamber.
He found that the smart speaker, the headphones, and the parametric speaker “were capable of emitting high frequencies that exceeded the average recommended by several academic guidelines”.
Wixey also discovered that the Bluetooth speaker, a pair of noise-canceling headphones, and the smart speaker again “were able to emit low frequencies that exceeded the average recommendations”.
On top of that, his attack on the smart speaker “generated enough heat to start melting its internal components after four or five minutes, permanently damaging the device”.
So what does that tell us? Well, according to Wixey, “the upshot of it is that the minority of the devices we tested could in theory be attacked and repurposed as acoustic weapons”.
The extent of the risks posed by acoustically hijacked smart speakers isn’t yet clear. However, Wixey notes that “existing research on detrimental human exposure to acoustic emanations has found potential effects that are both physiological and psychological”.
The use of ‘noise torture’ has been well documented in the media, notably being used by US interrogators “to break the will” of Iraqi POWs during the Iraq War.
Wixey’s research also demonstrates the potential for acoustic malware to be inflicted on internet-connected smart speakers remotely, without the need for physical access to the device.
It raises questions about the security of our smart speakers and other connected devices: if hackers can hijack our smart home devices for nefarious ends, manufacturers will need to step up the security of their products.