Pixel Visual Core should make photos taken in Instagram not suck

The Pixel Visual Core in the Pixel 2 and Pixel 2 XL already handles the devices’ camera app image processing for what Google likes to call “HDR+” or computational high dynamic range post-processing.

Google announced that third-party apps using the Android Camera API would eventually be able to tap the co-processor in all its glory with the upcoming Android 8.1 update, but those running the second developer preview of that update can already opt in and have the Core work in those apps.

If you have a Pixel 2 device, you'll need to get into the developer settings and toggle on the option reading "Camera HAL HDR+" — Android Central notes that the HAL acronym stands for Hardware Abstraction Layer. Once you restart your device, boot up Instagram (just one of many apps using the Android Camera API) and take a picture in the app.


It took just a beat for our Pixel 2 unit to tweak the challenging composition we saw in the viewfinder. The edge of the monitor's corner was lit up just a tad as contrast was adjusted across several regions, and the overall brighter result let details of the painted wall show up better.

It'll be interesting to see how results get tuned over time, as the Pixel Visual Core also uses Google's TensorFlow machine learning framework.



About The Author
Jules Wang
Jules Wang is News Editor for Pocketnow and one of the hosts of the Pocketnow Weekly Podcast. He came onto the team in 2014 as an intern, editing and producing videos and the podcast while studying journalism at Emerson College. He graduated the following year and moved into his current full-time position at Pocketnow.