
Apple’s Smart HDR sounds a lot like the Google Pixel camera

Stressing the Image Signal Processor more than lenses and sensors this year

Apple’s new camera system in the iPhone XS and XS Max has just been detailed at the company’s big launch event, and the headline change has nothing to do with hardware specs: this year’s improvement is all about computational photography. Sound familiar? That’s exactly the approach Google has relied on to build its world-beating Google Pixel camera, and Apple’s solution follows a very similar path.

A new Smart HDR mode in the freshly announced iPhones does two Pixel-like things. First, it shoots a constant four-frame buffer whenever the camera app is open, so the moment you press the shutter is the exact moment the photo is taken; there is effectively zero lag. Second, it intelligently combines and splices multiple frames of the same shot, each exposed at a different value, and Apple says it takes the best parts of each frame to produce the best possible image. Google’s approach is to capture a whole bunch of underexposed photos and combine them, so Apple isn’t doing exactly the same thing, but it’s a very similar technique.
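
To make those two ideas concrete, here’s a minimal Swift sketch assuming a simplified grayscale frame. None of these names (Frame, FrameBuffer, fuse) come from Apple’s actual API, and real Smart HDR is far more sophisticated than picking the pixel closest to mid-gray:

import Foundation

// Hypothetical model of a single buffered frame: an exposure offset plus
// luminance values in 0...1. Real frames carry full sensor data.
struct Frame {
    let exposureBias: Double
    let pixels: [Double]
}

// Rolling buffer that always holds the latest four frames, so a shutter
// press can use frames that were already captured: zero perceived lag.
struct FrameBuffer {
    private(set) var frames: [Frame] = []
    let capacity = 4

    mutating func push(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }
}

// Crude stand-in for "taking the best parts of each frame": per pixel,
// keep the value from whichever bracket is closest to mid-gray.
func fuse(_ frames: [Frame]) -> [Double] {
    guard let first = frames.first else { return [] }
    return (0..<first.pixels.count).map { i in
        frames.map { $0.pixels[i] }
              .min(by: { abs($0 - 0.5) < abs($1 - 0.5) })!
    }
}

var buffer = FrameBuffer()
buffer.push(Frame(exposureBias: -1, pixels: [0.10, 0.40]))
buffer.push(Frame(exposureBias: 0, pixels: [0.30, 0.80]))
buffer.push(Frame(exposureBias: 1, pixels: [0.60, 0.95]))
print(fuse(buffer.frames)) // [0.6, 0.4]: shadows from one bracket, highlights from another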

Apple’s other cool camera feature is something the Pixel lacks: adjusting a photo’s depth of field after the shot is captured. A slider lets you change the simulated aperture, so you can isolate your subject from the background by blurring it or, alternatively, keep everything in the frame in focus. Samsung has offered a similar adjustment in its phone cameras since last year’s Note 8, so it’s fair to say Apple is playing catch-up here.
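
As a rough illustration of how a post-capture aperture slider could work, here’s a Swift sketch assuming a depth map stored alongside the photo. blurRadius is a hypothetical helper, not Apple’s or Samsung’s implementation; real pipelines render a per-pixel blur disc sized by something like this value:

import Foundation

// Map a pixel's depth and the chosen simulated f-number to a blur amount.
// Smaller f-numbers mimic a wider aperture, so pixels away from the focal
// plane get a larger blur kernel; at large f-numbers everything stays sharp.
func blurRadius(pixelDepth: Double,  // metres from the camera
                focusDepth: Double,  // depth of the in-focus subject
                fNumber: Double) -> Double {
    abs(pixelDepth - focusDepth) / fNumber
}

// Dragging the slider from f/16 toward f/1.4 blurs the background far more:
print(blurRadius(pixelDepth: 5.0, focusDepth: 1.5, fNumber: 16.0)) // 0.21875
print(blurRadius(pixelDepth: 5.0, focusDepth: 1.5, fNumber: 1.4))  // 2.5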
