Acoustic Eavesdropping: How AIs Can Steal Your Secrets by Listening to Your Typing

Authors

  • A. Shaji George, Independent Researcher, Chennai, Tamil Nadu, India
  • S. Sagayarajan, Independent Researcher, Chennai, Tamil Nadu, India

DOI:

https://doi.org/10.5281/zenodo.8260814

Keywords:

Keystroke listening, Audio eavesdropping, AI spying, Acoustic surveillance, Microphone security, Audio cryptography, Keystroke biometrics, Acoustic fingerprinting, Speech privacy

Abstract

Recent advances in artificial intelligence have enabled a disturbing new form of cyberattack: acoustic side-channel attacks, in which AIs identify keystrokes and steal passwords simply by listening to the sounds of typing. A study published in 2021 demonstrated that this is possible with 93% accuracy over common video conferencing apps like Zoom. By analyzing audio recordings, AIs can detect the precise timings between keystrokes and map those patterns to reveal what is being typed. This poses serious risks, as acoustic eavesdropping can steal passwords, private messages, credit card numbers, and other sensitive information. This paper examines the technical details of how acoustic side-channel attacks work, using machine learning algorithms to match audio signals to keystroke inputs. It outlines real-world examples of AIs stealing credentials, text messages, and financial data simply by listening to typing sounds. The paper then discusses the broader dangers of acoustic snooping attacks, which exploit ubiquitous apps like video calls to turn devices into surveillance bugs. Because acoustic attacks operate silently in the background, users may have their most confidential information extracted without their knowledge. To defend against this novel threat, the paper provides recommendations such as using strong randomized passwords, enabling two-factor authentication, and avoiding typing sensitive information during video calls in public spaces. It also explores emerging countermeasures, including audio masking, localized jamming signals, and AI detection of acoustic anomalies, though these defenses are still in their early stages. The paper concludes with an outlook on the future of acoustic security, emphasizing the need for continued research and increased user awareness to combat the disturbing privacy risks posed by AI listening hacks.
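The attack pipeline the abstract outlines (segment a recording into individual keystroke events, then match each event's acoustic signature to a key) can be sketched in miniature. Everything below is an illustrative assumption, not the method of the cited study: real attacks use spectral features and deep learning on sampled audio, whereas this toy uses an energy threshold for event detection and a nearest-neighbor match against hypothetical per-key "fingerprints".

```python
# Toy sketch of an acoustic side-channel pipeline (illustrative only):
# 1) find frames whose energy exceeds a threshold (keystroke events),
# 2) classify each event by nearest-neighbor distance to labeled key profiles.

def energy(frame):
    """Mean squared amplitude of one frame of audio samples."""
    return sum(s * s for s in frame) / len(frame)

def detect_keystrokes(samples, frame_size=4, threshold=0.25):
    """Return (start_index, frame) pairs for frames louder than the threshold."""
    events = []
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[i:i + frame_size]
        if energy(frame) > threshold:
            events.append((i, frame))
    return events

def classify(frame, fingerprints):
    """Match a frame to the key whose fingerprint is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(fingerprints, key=lambda key: dist(frame, fingerprints[key]))

# Synthetic demo: two distinct key "sounds" embedded in a quiet stream.
fingerprints = {"a": [0.9, -0.8, 0.7, -0.6], "b": [0.5, 0.9, -0.9, 0.4]}
stream = [0.0] * 4 + fingerprints["a"] + [0.0] * 4 + fingerprints["b"]
events = detect_keystrokes(stream)
typed = "".join(classify(frame, fingerprints) for _, frame in events)
print(typed)  # recovers "ab" from the synthetic stream
```

The gap between event start indices also yields the inter-keystroke timings the abstract mentions, which real attacks combine with per-key acoustics to constrain what was typed.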


Published

2023-08-25

How to Cite

A. Shaji George, & S. Sagayarajan. (2023). Acoustic Eavesdropping: How AIs Can Steal Your Secrets by Listening to Your Typing. Partners Universal International Innovation Journal, 1(4), 1–14. https://doi.org/10.5281/zenodo.8260814

Section

Articles