Amazing stuff! Privacy?
"... The team used DensePose, a system for mapping all of the pixels on the surface of a human body in a photo, developed by researchers at Facebook’s AI lab and a London-based team. What makes DensePose really powerful is its ability to identify over two dozen key points and areas in the human body, such as joints and body parts like the arms, head, and torso, allowing the AI to describe a person’s pose. Combining this with a deep neural network, they were able to map the phase and amplitude of WiFi signals sent and received by routers to coordinates on human bodies. ...
For their demonstration, the researchers used three $30 WiFi routers and three aligned receivers, which bounce WiFi signals around the walls of a room. The system cancels out static objects and focuses on the signals reflected off moving objects, reconstructing the pose of a person in a radar-like image even if there’s a wall between the routers and the subjects. This approach could enable standard WiFi routers to see through a variety of opaque obstacles, including drywall, wooden fences, and even concrete walls. ...
This is not the first time researchers have attempted to “see” people through walls. In 2013, a team at MIT found a way to use cell phone signals for this purpose, and in 2018, another MIT team used WiFi to detect people in another room and translate their movements to stick figures. However, the new study from the Carnegie Mellon team delivers much higher spatial resolution. You can actually see what people who are moving are doing by looking at their poses. ..."
From the abstract:
"Advances in computer vision and machine learning techniques have led to significant development in 2D and 3D human pose estimation from RGB cameras, LiDAR, and radars. However, human pose estimation from images is adversely affected by occlusion and lighting, which are common in many scenarios of interest. Radar and LiDAR technologies, on the other hand, need specialized hardware that is expensive and power-intensive. Furthermore, placing these sensors in non-public areas raises significant privacy concerns. To address these limitations, recent research has explored the use of WiFi antennas (1D sensors) for body segmentation and key-point body detection. This paper further expands on the use of the WiFi signal in combination with deep learning architectures, commonly used in computer vision, to estimate dense human pose correspondence. We developed a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions. The results of the study reveal that our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches, by utilizing WiFi signals as the only input. This paves the way for low-cost, broadly accessible, and privacy-preserving algorithms for human sensing."
DensePose From WiFi (open access)
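The abstract describes a network that takes the phase and amplitude of WiFi signals as input and outputs, for every pixel, one of 24 body regions plus UV surface coordinates. The toy sketch below only illustrates those input and output shapes with a random untrained linear map; the antenna counts, subcarrier count, output resolution, and the `wifi_to_densepose` function are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Toy sketch of the abstract's mapping: WiFi phase/amplitude -> dense human pose.
# All shapes below are illustrative assumptions, not the paper's architecture.
TX, RX, SUBCARRIERS = 3, 3, 30  # 3 transmitters, 3 receivers (subcarrier count assumed)
H, W = 12, 16                   # output spatial resolution (assumed)
REGIONS = 24                    # 24 human body regions, as stated in the abstract

rng = np.random.default_rng(0)

def wifi_to_densepose(amplitude, phase):
    """Map flattened amplitude/phase features to per-pixel region labels and UV coords."""
    x = np.concatenate([amplitude.ravel(), phase.ravel()])    # 1D WiFi feature vector
    w = rng.standard_normal((x.size, H * W * (REGIONS + 2)))  # stand-in for a trained network
    y = np.tanh(x @ w).reshape(H, W, REGIONS + 2)
    region = y[..., :REGIONS].argmax(axis=-1)                 # which of the 24 regions each pixel maps to
    uv = (y[..., REGIONS:] + 1) / 2                           # UV surface coordinates, squashed into [0, 1]
    return region, uv

amp = rng.standard_normal((TX, RX, SUBCARRIERS))
ph = rng.uniform(-np.pi, np.pi, (TX, RX, SUBCARRIERS))
region, uv = wifi_to_densepose(amp, ph)
print(region.shape, uv.shape)  # (12, 16) (12, 16, 2)
```

In the actual paper the random matrix would be replaced by a trained deep network, and the UV outputs index positions on a body-surface mesh rather than raw image coordinates.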