Researchers can now listen to your phone calls from afar
Penn State researchers have figured out how to use radar and AI to pick up parts of your phone conversations, even from roughly 10 feet (about three meters) away.
Their system detects tiny vibrations on a phone's earpiece, then uses a tweaked version of OpenAI's Whisper model to turn those signals into words.
How it works
A radar sensor sits about three meters from the phone during a call, capturing subtle vibrations.
The model was fine-tuned just enough to handle these noisy signals, reaching about 60% accuracy over a 10,000-word vocabulary: not perfect, but enough to get the gist of a conversation.
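The pipeline above implies a preprocessing step the article doesn't spell out: the radar's tiny displacement readings have to be cleaned up and framed before a speech model can consume them. Here is a minimal sketch of that idea; the sample rate, frame size, and hop length are illustrative assumptions, not the researchers' actual parameters.

```python
import numpy as np

# Illustrative values only -- not the parameters used in the Penn State work.
RADAR_RATE = 2000   # radar sampling rate in Hz (assumed)
FRAME_LEN = 400     # samples per analysis frame (assumed)
HOP = 160           # samples between frame starts (assumed)

def radar_to_frames(displacement: np.ndarray) -> np.ndarray:
    """Normalize a 1-D vibration signal and slice it into overlapping frames,
    the shape a downstream speech model would typically expect."""
    x = displacement - displacement.mean()   # remove DC offset
    peak = np.max(np.abs(x))
    if peak > 0:
        x = x / peak                         # scale to [-1, 1]
    n_frames = 1 + (len(x) - FRAME_LEN) // HOP
    frames = np.stack(
        [x[i * HOP : i * HOP + FRAME_LEN] for i in range(n_frames)]
    )
    return frames  # shape: (n_frames, FRAME_LEN)

# Example: one second of synthetic earpiece vibration (a 200 Hz tone
# with micrometer-scale amplitude, standing in for real radar data).
t = np.arange(RADAR_RATE) / RADAR_RATE
sig = 1e-6 * np.sin(2 * np.pi * 200 * t)
frames = radar_to_frames(sig)
print(frames.shape)  # (11, 400)
```

In a real system, frames like these would be converted to spectrogram-style features and fed to the fine-tuned Whisper model; the hard part, and the researchers' actual contribution, is making the model robust to how noisy and band-limited these signals are compared with microphone audio.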
Privacy concerns
The researchers compare the results to lip reading: the system doesn't capture every word, but it recovers enough to piece conversations together.
This raises real privacy questions: as radar hardware shrinks, sensors like this could be hidden in everyday objects and used for covert eavesdropping.