This could have gone in several forums, such as Science or Computing, but I think it's also a current issue for anyone who depends on, or is thinking about using, smart devices for the safety and running of their home.
It turns out that with a laser you can activate the microphones in smart devices, and this creates a vulnerability in smart homes.
Check out this video by Smarter Every Day where they investigate it, or read the Light Commands paper:
[yt]ozIKwGt38LQ[/yt]
Quote:
We propose a new class of signal injection attacks on microphones based on the photoacoustic effect: converting light to sound using a microphone. We show how an attacker can inject arbitrary audio signals to the target microphone by aiming an amplitude-modulated light at the microphone's aperture. We then proceed to show how this effect leads to a remote voice-command injection attack on voice-controllable systems. Examining various products that use Amazon's Alexa, Apple's Siri, Facebook's Portal, and Google Assistant, we show how to use light to obtain full control over these devices at distances up to 110 meters and from two separate buildings. Next, we show that user authentication on these devices is often lacking or non-existent, allowing the attacker to use light-injected voice commands to unlock the target's smartlock-protected front doors, open garage doors, shop on e-commerce websites at the target's expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) that are connected to the target's Google account. Finally, we conclude with possible software and hardware defenses against our attacks.
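The core trick in the abstract is amplitude modulation: the audio waveform of a voice command is used to vary the laser's intensity, and the microphone's diaphragm responds to that varying light as if it were sound. Here's a toy sketch of that modulation step in Python; the sample rate, modulation depth, and the 440 Hz "command" tone are all illustrative values of mine, not from the paper.

```python
import numpy as np

# Illustrative parameters (not from the paper)
fs = 44_100                        # audio sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)     # 10 ms of signal

# Stand-in for a voice command: a 440 Hz tone normalized to [-1, 1]
audio = np.sin(2 * np.pi * 440 * t)

# Amplitude modulation: bias the laser at half power and swing its
# intensity with the audio. Optical power can't go negative, so the
# audio is scaled by a modulation depth m < 1 around the DC bias.
m = 0.8                                    # modulation depth
laser_intensity = 0.5 * (1 + m * audio)    # normalized optical power, in [0, 1]

# The microphone (via the photoacoustic effect) effectively recovers
# the audio once the DC bias is stripped out:
recovered = (laser_intensity / 0.5 - 1) / m
```

The point is that `laser_intensity` never leaves the valid power range, yet it carries the full audio waveform, which is why a plain intensity-modulated laser pointer aimed at the microphone port is enough.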
via International Skeptics Forum https://ift.tt/2F0j38g