- Google has explained how its Live HDR feature works.
- The feature delivers HDR in the viewfinder of the Pixel camera app.
Google debuted its Live HDR and Dual Exposure Controls features on the Pixel 4 last year, and it’s since brought them to the Pixel 4a. Now, the company has published a breakdown on its AI blog of how the former feature works.
The vast majority of smartphones show you a scene in the camera viewfinder that isn’t representative of the final HDR or HDR+ photo. In other words, what you see in the viewfinder and the photo you end up with can look quite different.
That’s where Google’s Live HDR comes in, giving the viewfinder the HDR treatment as well so you have a better idea of what to expect before hitting the shutter. Google says running its HDR+ algorithm in the viewfinder would be way too slow, so it had to use another approach.
Google says it essentially divides the viewfinder image into many smaller tiles and then approximates the HDR+ look by applying tone mapping on a per-tile basis.
“Compared to HDR+, this algorithm is particularly well suited for GPUs. Since the tone mapping of each pixel can be computed independently, the algorithm can also be parallelized,” the firm notes. It then uses a machine learning algorithm called HDRNet for better, more accurate results, and the resulting preview ends up very close to the final HDR+ photo, as the comparison below shows.
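To make the idea concrete, here’s a minimal, illustrative sketch in Python/NumPy of per-tile tone mapping. The function names, the simple gamma-based tone curve, and the tile size are assumptions for illustration only; Google’s actual pipeline predicts its tone curves with HDRNet rather than computing them like this.

```python
import numpy as np

def tile_tone_curve(tile):
    """Toy per-tile tone curve: pick a gamma from the tile's mean brightness,
    so dark tiles get lifted and bright tiles get compressed.
    (Assumption for illustration - Google predicts its curves with HDRNet.)"""
    mean = float(np.clip(tile.mean(), 1e-3, 1 - 1e-3))
    gamma = np.clip(np.log(0.5) / np.log(mean), 0.4, 2.5)
    return lambda x: np.power(x, gamma)

def live_hdr_preview(luma, tile_size=64):
    """Apply a tone curve tile by tile. Each output pixel depends only on its
    own value and its tile's curve, so tiles (and pixels) can be processed
    independently - the property that makes this GPU-friendly."""
    h, w = luma.shape
    out = np.empty_like(luma)
    for y in range(0, h, tile_size):
        for x in range(0, w, tile_size):
            tile = luma[y:y + tile_size, x:x + tile_size]
            out[y:y + tile_size, x:x + tile_size] = tile_tone_curve(tile)(tile)
    return out

# Example: a synthetic viewfinder frame with luminance values in [0, 1]
frame = np.random.rand(480, 640).astype(np.float32)
preview = live_hdr_preview(frame)
```

Because every pixel’s result depends only on its own value and a precomputed curve, the work maps naturally onto a GPU. A naive version like this one would show seams at tile boundaries, which is why a real implementation blends the curves of neighboring tiles for each pixel rather than applying one curve per tile.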
It’s an interesting way of delivering a better camera experience, as users get a good idea of what to expect before they hit the shutter key. Hopefully, more Android OEMs will implement similar functionality in their camera apps.
Google’s explanation also comes after a teardown of the latest Google Camera app yielded evidence of potentially upcoming features. More specifically, it looks like the firm is working on audio zoom functionality, flash intensity adjustments, and a “motion blur” mode.