

What is Google ATAP, and how could its Pixel phone technology lead to socially intelligent displays?

By Roland Udvarlaki March 2, 2022, 10:25 am
Google Pixel 4 with Soli sensor (Source: Google ATAP)

Google’s ATAP team posted a new video highlighting use cases for its Soli sensor technology. The video shows how Google is trying to make devices more socially intelligent by having them work and display information only when a person is looking at, or facing, the device.

The team was also responsible for Project Soli, the Soli sensor that shipped in the Google Pixel 4 series and in Nest smart devices. The Soli sensor let the smartphone wake the display and trigger face recognition, providing a secure unlock mechanism. The sensor also powered various hand gestures – which never quite worked properly on the Pixel 4 series – and could recognize when a user was reaching for the device or trying to skip tracks.


What is Google ATAP?

Google ATAP (short for “Advanced Technology and Projects”) is a hardware invention studio that belongs to Google. It’s a group of engineers, scientists, analysts, artists, and designers who work together on new ideas that could one day make life easier. The team mainly focuses on inventions it tries to turn into finished products that can later be used in devices – such as the Soli sensor in the Google Pixel 4 series and in Nest devices.

In the new video, Google ATAP highlighted its main goal: “to create ambient, socially intelligent devices that are controlled by the wave of a hand or turn of the head” (via 9to5Google).

“As humans, we understand each other intuitively — without saying a single word. We pick up on social cues, subtle gestures that we innately understand and react to. What if computers understood us this way?”

In the video, the team shows off a few different devices that understand the “social context” of the environment around a person, interacting and displaying information as the person approaches the product. The video also highlights that the technology could let a device recognize when a user turns toward it to read messages or check the weather, and display other content when it isn’t being actively looked at.
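The behavior described above – showing detail only when someone is actually looking, and falling back to ambient content otherwise – boils down to a simple mapping from an attention state to a display mode. The sketch below is purely illustrative; the state names and content modes are assumptions for the example, not anything Google has published:

```python
from enum import Enum, auto

class Attention(Enum):
    """Hypothetical attention states a sensor like Soli might report."""
    AWAY = auto()     # no one detected nearby
    NEARBY = auto()   # a person is present but not facing the display
    ENGAGED = auto()  # a person is facing / looking at the display

def choose_content(state: Attention) -> str:
    """Map an attention state to what an ambient display should show."""
    if state is Attention.ENGAGED:
        return "detail"   # e.g. messages, full weather forecast
    if state is Attention.NEARBY:
        return "glance"   # e.g. a large clock, minimal info
    return "idle"         # screen dimmed or off

print(choose_content(Attention.ENGAGED))  # detail
```

The hard part, of course, is not this mapping but reliably inferring the attention state from radar or camera data in the first place – which is exactly the recognition problem the video says the team is still working on.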

The device in the video appears to be a mockup, and it’s unlikely we’ll see a similar smart home device anytime soon. Still, it shows what may be possible in the near future, once the team figures out how to handle such challenging recognition tasks reliably.
