Pixel Visual Core should make photos taken in Instagram not suck
The Pixel Visual Core in the Pixel 2 and Pixel 2 XL already handles image processing for the devices’ camera app, powering what Google calls “HDR+” — computational high dynamic range post-processing.
Google had announced that third-party apps using the Android Camera API would eventually be able to tap the co-processor in the upcoming Android 8.1 update. Those testing the second developer preview of that update can now opt to have the Core work in those apps.
If you have a Pixel 2 device, you’ll need to get into the developer settings and toggle on the option reading “Camera HAL HDR+” — Android Central notes that the HAL acronym stands for Hardware Abstraction Layer. Once you restart your device, boot up Instagram (just one of many apps using the Android Camera API) and take a picture in the app.
In our test, it took just a beat for our Pixel 2 unit to tweak the challenging composition we framed in the viewfinder. The edge of a monitor’s corner was lit up just a tad as contrast was adjusted across several regions, and the overall brighter result let details of the painted wall show through better.
It’ll be interesting to see how results get tuned over time, as the Pixel Visual Core also leans on Google’s TensorFlow machine learning framework.