The Pixel Visual Core in the Pixel 2 and Pixel 2 XL already handles image processing in the devices' own camera app for what Google likes to call "HDR+," its computational high dynamic range processing.

Google announced that third-party apps using the Android Camera API would eventually be able to tap the co-processor in all its glory with the upcoming Android 8.1 update, but those testing the second developer preview of that update can now opt in to have the Core work in those apps.
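
To put that in concrete terms, here's a minimal sketch (our own, not code from Google or Instagram) of how a third-party app takes a still through the Camera2 API. Notice that nothing in it references HDR+ or the Pixel Visual Core; the co-processor does its work behind the standard API in the camera HAL, so the function name, surface, and handler below are all placeholders of ours.

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.os.Handler
import android.view.Surface

// Assumes the CAMERA permission has already been granted and that
// `targetSurface` (e.g. from an ImageReader) and `handler` exist elsewhere.
@SuppressLint("MissingPermission")
fun captureStill(context: Context, targetSurface: Surface, handler: Handler) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val cameraId = manager.cameraIdList.first() // usually the rear camera

    manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
        override fun onOpened(camera: CameraDevice) {
            camera.createCaptureSession(listOf(targetSurface),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigured(session: CameraCaptureSession) {
                        // An ordinary still-capture request: no HDR+-specific keys
                        // are set, yet on a Pixel 2 with the new toggle enabled the
                        // HAL can hand the frames to the Pixel Visual Core.
                        val request = camera
                            .createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
                            .apply { addTarget(targetSurface) }
                            .build()
                        session.capture(request, null, handler)
                    }
                    override fun onConfigureFailed(session: CameraCaptureSession) = Unit
                }, handler)
        }
        override fun onDisconnected(camera: CameraDevice) = camera.close()
        override fun onError(camera: CameraDevice, error: Int) = camera.close()
    }, handler)
}
```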

If you have a Pixel 2 device, you'll need to head into the developer settings and toggle on the option labeled "Camera HAL HDR+" (Android Central notes that the HAL acronym stands for Hardware Abstraction Layer). Once you restart your device, open Instagram (just one of many apps using the Android Camera API) and take a picture in the app.
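
If you want to sanity-check what your own test app sees before and after flipping the toggle, here's a small sketch of ours (the log tag and function name are made up): it simply logs what each camera reports through the public Camera2 API. As far as we can tell, nothing in those characteristics directly advertises Pixel Visual Core involvement, so treat this as a quick way to confirm which cameras and capabilities your app is working with, not as a detector for the co-processor.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.util.Log

// Logs the hardware level and capability list each camera exposes through
// the public Camera2 API. Nothing here names HDR+ or the Pixel Visual Core;
// the co-processor sits behind the HAL, out of sight of app code.
fun logCameraCapabilities(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(id)
        val level = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)
        val capabilities = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
        Log.d("CameraCaps", "Camera $id: level=$level, capabilities=${capabilities?.joinToString()}")
    }
}
```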

[Embedded Instagram post shared by Jules Wang (@juicefarcicles)]

We observed that it took just a beat for our Pixel 2 unit to rework the challenging composition we had framed in the viewfinder. The edge of the monitor in one corner was brightened a tad as contrast was adjusted across several regions, and the overall brighter result let the details of the painted wall show up better.

It'll be interesting to see how results get tuned over time, as the Pixel Visual Core also supports Google's TensorFlow machine learning framework.
