Google Pixel Visual Core is the company’s machine-learning co-processor for Pixel 2

Google revealed that the Pixel 2’s and Pixel 2 XL’s HDR+ and machine-learning capabilities for computational photo processing will be handled by the Pixel Visual Core, the company’s first-ever custom-designed co-processor.

The co-processor pairs a single ARM Cortex-A53 core and LPDDR4 RAM with, among other things, eight image processing cores, each containing 512 arithmetic logic units. Together, they can perform more than 3 trillion operations per second — enough for up to five times the processing speed on HDR+ at a tenth of the power usage. The chip works with domain-specific languages (Halide for images and Google’s TensorFlow for machine learning) to make things easier for third-party developers.
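As a rough sanity check on those figures: eight cores with 512 ALUs apiece gives 4,096 ALUs in total, and if we assume each ALU retires one operation per cycle (an assumption for illustration, not something the article states), hitting 3 trillion operations per second implies a clock in the neighborhood of 730 MHz:

```python
# Back-of-the-envelope check of the quoted throughput figure.
# Assumption (not from the article): every ALU completes exactly
# one operation per clock cycle.

IPU_CORES = 8          # image processing cores (from the article)
ALUS_PER_CORE = 512    # arithmetic logic units per core (from the article)
OPS_TARGET = 3e12      # "more than 3 trillion operations per second"

total_alus = IPU_CORES * ALUS_PER_CORE       # 4,096 ALUs in total
required_clock_hz = OPS_TARGET / total_alus  # clock needed at 1 op/ALU/cycle

print(f"Total ALUs: {total_alus}")
print(f"Clock for 3 trillion ops/s: {required_clock_hz / 1e6:.0f} MHz")
```

The numbers are self-consistent: a sub-gigahertz clock, typical for a low-power mobile co-processor, is all that massively parallel design needs to reach the claimed throughput.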

The Pixel Visual Core will be enabled on Pixel 2 (and, presumably, Pixel 2 XL) devices with the Android 8.1 Oreo update (Maintenance Release 1), at which point third-party apps will be able to take a crack at harnessing the power of this new hardware.

Rumors that Google has been developing its own mobile applications processor have circulated for years. The company hired away Apple’s head of chipset design, Manu Gulati, in June.


About The Author
Jules Wang
Jules Wang is News Editor for Pocketnow and one of the hosts of the Pocketnow Weekly Podcast. He came onto the team in 2014 as an intern, editing and producing videos and the podcast while studying journalism at Emerson College. He graduated the following year and moved into his current full-time position at Pocketnow.