MIT's tech can see through walls


11 Oct 2019

Now, MIT's superhuman tech can see through thick walls

After beating humans at professional games and detecting critical diseases, artificial intelligence-backed machines are now doing things well beyond human capability.

A new report has revealed that the folks at the Massachusetts Institute of Technology (MIT) have developed a machine vision tech with the superhuman ability to see through thick walls and into pitch-dark rooms.

Here's all about it.


Machine vision for identification of people, objects

Machine vision employs sophisticated algorithms to give robots, including self-driving cars, the ability to 'see' their surroundings.

The tech has been around for a long while and is mainly used for helping robots identify faces/objects/people.

However, when these objects and people are obscured in some way, say by a wall or a lack of light, it becomes impossible for machine vision to identify its target.

Radio waves worked but had their own problems

As visible light-based machine vision struggles in dark environments, engineers thought about using radio waves, which work at night, pass through walls, and are reflected by human bodies. However, studies revealed radio imagery can be noisy, making it difficult to discern actions.



So, MIT engineers employed a mix of both

After learning the possible advantages and disadvantages of both radio and visible light vision systems, the folks at MIT integrated the two.

They trained a radio vision system on visible light images, allowing it to tell what a person is doing, even if they were behind a wall or in dark environments (places where visible light-based tech had failed before).
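The training described above can be pictured as cross-modal supervision: a camera-based "teacher" labels synchronized recordings, and the radio "student" learns to reproduce those labels from RF input alone. The sketch below is a toy illustration of that general technique, not MIT's published code; the data, the teacher, and the threshold "student" are all hypothetical stand-ins.

```python
# Toy sketch of cross-modal supervision (hypothetical, not MIT's code):
# a camera-based "teacher" labels synchronized samples; an RF "student"
# is fit to agree with those labels using only the radio modality.
import numpy as np

rng = np.random.default_rng(0)

def camera_teacher(rgb_frame):
    """Stand-in teacher: derives a toy binary action label from the camera view."""
    return int(rgb_frame.mean() > 0.5)

# Synchronized dataset: each RF sample is paired with an RGB frame
# of the same moment, so the RF sample is correlated with the label.
rgb_frames = rng.random((100, 16))
rf_samples = rgb_frames + 0.1 * rng.standard_normal((100, 16))

labels = np.array([camera_teacher(f) for f in rgb_frames])

# "Train" a trivial student on RF input only: pick the threshold on the
# RF signal that best matches the teacher's labels.
best_t, best_acc = 0.0, 0.0
for t in np.linspace(0, 1, 101):
    acc = ((rf_samples.mean(axis=1) > t).astype(int) == labels).mean()
    if acc > best_acc:
        best_t, best_acc = t, acc

# After training, the student can label actions from RF alone --
# including in conditions where the camera would see nothing.
print(f"student threshold={best_t:.2f}, agreement with teacher={best_acc:.0%}")
```

Once the student matches the teacher on synchronized data, the camera can be removed entirely, which is what lets the radio side keep working behind walls and in darkness.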


How the system actually works

After training, the hybrid neural network developed by the team recorded and analyzed imagery in both visible light and radio waves.

It identified a particular action in the visible-light data and correlated it with the radio-wave information (or vice versa) to infer what a person is doing behind a wall or in an unlit room.


Stick figures generated to highlight human activity

In addition to this, the system developed by the researchers uses 3D stick figures to ensure that only human activity gets highlighted in the results.

"By translating the input to an intermediate skeleton-based representation, our model can learn from both vision-based and radio frequency-based datasets, and allow the two tasks to help each other," said Tianhong Li, who developed the system.
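The skeleton-based representation Li describes can be sketched as a shared interface: each modality is first converted to the same stick-figure layout, so a single action classifier can serve both camera and radio input. The names, joint count, and "classifier" below are hypothetical placeholders, assumed for illustration only.

```python
# Hypothetical sketch of a skeleton-based intermediate representation
# (an illustration of the idea, not the researchers' actual model).
import numpy as np

NUM_JOINTS = 14  # assumed joint count for a 3D stick figure

def rgb_to_skeleton(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a vision pose estimator: frame -> (NUM_JOINTS, 3) coords."""
    rng = np.random.default_rng(int(frame.sum()) % 2**32)
    return rng.standard_normal((NUM_JOINTS, 3))

def rf_to_skeleton(heatmaps: np.ndarray) -> np.ndarray:
    """Stand-in for an RF pose estimator: radio heatmaps -> same skeleton layout."""
    rng = np.random.default_rng(int(heatmaps.sum()) % 2**32)
    return rng.standard_normal((NUM_JOINTS, 3))

def classify_action(skeleton_seq: np.ndarray) -> str:
    """Toy classifier over a skeleton sequence; the real system uses a neural net."""
    motion = np.abs(np.diff(skeleton_seq, axis=0)).mean()
    return "walking" if motion > 0.5 else "standing"

# Both pipelines converge on the same stick-figure interface, so the
# classifier never needs to know which sensor produced the skeletons.
rgb_seq = np.stack([rgb_to_skeleton(np.full((8, 8), i)) for i in range(5)])
rf_seq = np.stack([rf_to_skeleton(np.full((4, 4), i)) for i in range(5)])
print(classify_action(rgb_seq), classify_action(rf_seq))
```

Because only the skeleton reaches the classifier, the system also reports nothing but human pose and motion, which is why the output highlights human activity alone.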

This could ultimately improve smart home security

"We introduce a neural network model that can detect human actions through walls and occlusions, and in poor lighting conditions," Li further added, noting that "it can bring action recognition to people's homes and allow for its integration in smart home systems."
