Google's New Tech Can Read Your Body Language

The company’s ATAP research team is using radar to help computers respond to your movements, like turning off a TV if it senses you've dozed off.

The same Soli sensor is being used in this new round of research, but instead of using the sensor input to directly control a computer, ATAP is using the sensor data to enable computers to recognize our everyday movements and make new kinds of choices.

“We believe as technology becomes more present in our life, it's fair to start asking technology itself to take a few more cues from us,” says Leonardo Giusti, head of design at ATAP. In the same way your mom might remind you to grab an umbrella before you head out the door, perhaps your thermostat can relay the same message as you walk past and glance at it—or your TV can lower the volume if it detects you've fallen asleep on the couch.

Radar can detect you moving closer to a computer and entering its personal space. The computer could then choose to perform certain actions, like waking the screen without requiring you to press a button.

This kind of interaction already exists in current Google Nest smart displays, though instead of radar, Google employs ultrasonic sound waves to measure a person's distance from the device. When a Nest Hub notices you're moving closer, it highlights current reminders, calendar events, or other important notifications.
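To make the pattern concrete, here is a minimal sketch of this kind of proximity-triggered behavior. It is not Google's actual code; the read_distance_cm function, the distance threshold, and the wake_screen action are all hypothetical stand-ins for whatever the device's radar or ultrasonic sensor and display stack actually provide.

```python
import time

# Assumed "personal space" boundary, in centimeters (hypothetical value).
WAKE_THRESHOLD_CM = 120


def read_distance_cm() -> float:
    """Placeholder for a radar or ultrasonic distance reading to the nearest person."""
    return 300.0  # stubbed: in a real device this would come from the sensor


def wake_screen() -> None:
    """Placeholder for the device action, e.g. showing reminders and notifications."""
    print("Screen on: showing reminders, calendar events, and notifications")


def main() -> None:
    screen_awake = False
    while True:
        distance = read_distance_cm()
        if distance < WAKE_THRESHOLD_CM and not screen_awake:
            # Someone has stepped into the device's personal space: wake it up.
            wake_screen()
            screen_awake = True
        elif distance >= WAKE_THRESHOLD_CM:
            # Reset once the person steps away so the screen can wake again later.
            screen_awake = False
        time.sleep(0.5)  # poll the sensor a couple of times per second


if __name__ == "__main__":
    main()
```

The point of the sketch is the shape of the interaction, not the sensing technology: whether the distance comes from Soli's radar or the Nest Hub's ultrasonic ranging, the device is simply watching for a person to cross a proximity threshold and responding without being asked.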