Aside from folks who still shoot film, almost nobody uses the term 'digital photography' anymore – it's simply 'photography,' just as we don't keep our food in an 'electric refrigerator.' Given the changes in the camera system in Apple's latest iPhone models, we're headed down a path where the term 'computational photography' will also just be referred to as 'photography,' at least by the majority of photographers.

The iPhone XS and iPhone XS Max feature the same dual-camera and processing hardware; the upcoming iPhone XR also sports the same processing power, but with only a single camera: the same wide-angle F1.8 camera found on the other models. The image sensor captures 12 megapixels of data, the same resolution as every previous model dating back to the iPhone 6s, but the pixels themselves are larger at 1.4 µm, compared to 1.22 µm for the iPhone X, which means a slightly larger sensor. (For more on the camera's specs, see "iPhone XS, XS Max, and XR cameras: what you need to know.")

More important this year are the upgraded computational power and the software it enables: the A12 Bionic processor, the eight-core 'Neural Engine,' and the image signal processor (ISP) dedicated to camera functions. The results include a new Smart HDR feature that rapidly combines multiple exposures for every capture, and improved depth-of-field simulation in Portrait mode. (All the examples throughout are straight out of the device.)

Smart HDR

This feature intrigued me the most, because last year's iPhone 8, iPhone 8 Plus and iPhone X introduced HDR as an always-on feature. (See "HDR is enabled by default on the iPhone 8 Plus, and that's a really good thing.") HDR typically blends two or more images of varying exposures to end up with a shot with increased dynamic range, but doing so introduces time as a factor: if objects are in motion, the delay between captures makes those objects blurry. Smart HDR captures many short 'interframes' between the main exposures to gather additional highlight information, and can also help avoid motion blur when all the slices are merged into the final image.
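
Apple hasn't detailed exactly how those slices are merged, but the basic idea of folding a darker, highlight-preserving exposure back into a frame that's close to clipping is easy to sketch. The snippet below is only a conceptual illustration, not Apple's pipeline; the function name, the threshold values, and the assumption of pre-aligned, gain-matched linear frames are all mine.

```python
import numpy as np

def merge_highlights(base, short_exp, threshold=0.85, softness=0.1):
    """Blend detail from a shorter (darker) exposure into the blown-out
    regions of a base frame. Both frames are assumed to be aligned,
    gain-matched, linear-light float arrays in [0, 1]. Conceptual sketch
    only; this is not how Smart HDR is actually implemented."""
    # Per-pixel brightness of the base frame
    luma = base.mean(axis=-1, keepdims=True)
    # Weight ramps from 0 to 1 as the base frame approaches clipping
    weight = np.clip((luma - threshold) / softness, 0.0, 1.0)
    # Favor the short exposure wherever the base frame has lost highlight detail
    return base * (1.0 - weight) + short_exp * weight
```

In this model, the benefit for motion is that every source frame is itself a short exposure, so the merged result never has to contend with the blur of a single long capture.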

Testing Smart HDR proved to be a challenge at first, because unlike with the HDR feature in earlier models, the Photos app doesn't automatically label all Smart HDR images as such. After shooting in conditions that would be ripe for HDR – bright backgrounds and dark foregrounds, low-light conditions at dusk – nothing carried the HDR indicator. I wasn't initially sure whether the image quality was due to Smart HDR or to the larger sensor pixels; no doubt some credit is due to the latter, but it couldn't be that much. As it turns out, the HDR label appears only once you've enabled the option to keep the original image. However, there's no way to force Smart HDR on.

Comparing shots with those taken with an iPhone X reveals the enhanced effect of Smart HDR. In the following photo at dusk, I wanted to see how well the cameras performed in the fading light and also with motion in the scene (the flying sand). The iPhone X image is dark, but you still get a fair bit of detail in the girl's face and legs, which are away from the sun. The iPhone XS image almost looks as if it was shot using an off-camera flash, likely because the interframes allow highlight retention and motion freezing even as 'shutter speeds' become longer.

Shot with iPhone X
Shot with iPhone XS

As another example, you can see the Smart HDR on the iPhone XS working in even darker light compared to the iPhone X shot. At this point there's more noise in both images, but it's far more pronounced in the iPhone X photo.

Shot with iPhone X
Shot with iPhone XS

Smart HDR doesn't seem to kick in when shooting in burst mode, or at least the effect isn't as pronounced. Considering that the following photo was captured at 1/1000 sec and the foreground isn't a silhouette, the result isn't bad.

iPhone XS image shot in burst mode. It's dark, but picks up the detail in the sand.
iPhone XS non-burst image captured less than a minute after the photo above.

Portrait Mode

The iPhone's Portrait mode is a clever cheat involving a lot of processing power. On the iPhone X and iPhone 8 Plus, Apple used the dual rear cameras to create a depth map to isolate a foreground subject – usually a person, but not limited to people-shaped objects – and then blur the background based on depth. It was a hit-or-miss feature that sometimes created a nice shallow depth-of-field effect, and sometimes resulted in laughable, blurry misfires.

On the iPhone XS and iPhone XS Max, Apple augments the dual cameras with Neural Engine processing to generate better depth maps, including a segmentation mask that improves detail around the edges of the subject. It's still not perfect, and one pro photographer I know immediately called out what he considered a terrible-looking rendering, but it is improved, and in some cases most people may not realize that the effect is all done in software.
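
Apple hasn't published its Portrait mode algorithm, but the role the depth map and segmentation mask play is easy to picture with a toy sketch like the one below. Everything in it is an assumption for illustration: the Gaussian blur is a crude stand-in for Apple's bokeh rendering, and the parameter names are invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image, depth, subject_mask, focus_depth, max_sigma=8.0):
    """Toy depth-of-field simulation (not Apple's Portrait mode).
    `image` is an HxWx3 float array; `depth` and `subject_mask` are HxW
    floats in [0, 1]. Pixels far from the focal plane blend toward a
    blurred copy of the frame, then the segmentation mask composites the
    sharp subject back on top so its edges stay crisp."""
    blurred = gaussian_filter(image, sigma=(max_sigma, max_sigma, 0))
    # 0 at the focal plane, 1 at maximum defocus
    t = np.clip(np.abs(depth - focus_depth), 0.0, 1.0)[..., None]
    out = image * (1.0 - t) + blurred * t
    # Keep the masked subject sharp regardless of its depth values
    mask = subject_mask[..., None]
    return out * (1.0 - mask) + image * mask
```

A bad depth map or a sloppy mask is exactly what produces the misfires described below: stray background objects kept sharp, or subject detail smeared into the blur.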

The notable addition to Portrait mode in the iPhone XS and iPhone XS Max is the ability to edit the simulated depth of field within the Photos app. A depth control slider appears for Portrait mode photos, with f-stop values from F1.4 to F16. The algorithm that creates the blur also seems improved, creating a more natural effect than a simple Gaussian blur.
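
Apple doesn't say how the slider's f-numbers map to blur strength, but a reasonable mental model is that the simulated blur scales with the aperture diameter, i.e. focal length divided by f-number. The sketch below makes that assumption explicit; the 52mm-equivalent focal length and the f/4.5 normalization point are placeholders I picked for illustration, not documented behavior.

```python
def blur_scale(f_number, focal_length_mm=52.0, reference_f=4.5):
    """Map a depth-slider f-number to a relative blur strength, assuming
    blur diameter is proportional to aperture diameter (focal length / N).
    The focal length and reference f-number are illustrative guesses,
    not values Apple has published."""
    aperture = focal_length_mm / f_number
    reference_aperture = focal_length_mm / reference_f
    return aperture / reference_aperture

print(blur_scale(1.4))   # ~3.2x the reference blur
print(blur_scale(16.0))  # ~0.28x, nearly everything in focus
```

Under that model, dragging the slider from F16 to F1.4 increases the simulated blur by more than a factor of ten.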

Apple also says it's analyzed the optical characteristics of some "high-end lenses" and tried to mimic their bokeh. For instance, out-of-focus highlights in the simulated blur should render as circular discs at the center of the image but take on a 'cat's-eye' shape as you approach the edges of the frame. The company says that a future update will bring that depth control to the Camera app for a real-time preview of the effect.
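
One way to picture that behavior in code is a blur kernel that starts as a full disc and is clipped by a second, offset circle as you move toward the frame edge, which is roughly how optical vignetting produces the cat's-eye shape in real lenses. This is only a sketch of the geometry, not Apple's implementation, and the parameters are invented.

```python
import numpy as np

def bokeh_kernel(radius, edge_factor):
    """Build a bokeh-shaped blur kernel: a full disc when edge_factor is 0
    (image center), progressively clipped into a cat's-eye shape as
    edge_factor approaches 1 (image corner). Illustrative geometry only."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disc = (x ** 2 + y ** 2) <= radius ** 2
    # A second disc, shifted by up to one radius; the overlap forms the cat's-eye
    shift = edge_factor * radius
    clipped = ((x - shift) ** 2 + y ** 2) <= radius ** 2
    kernel = (disc & clipped).astype(float)
    return kernel / kernel.sum()
```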

Portrait mode is still no substitute for optics and good glass. Sometimes background objects get swept into the foreground mask – note the coffee cup over the shoulder at left in the following image – and occasionally the processor just gets confused, blurring the horizontal lines of the girl's shirt in the next example. But overall, you can see progress being made toward better computational results.

Flare and a Raw Footnote

One thing I noticed with my iPhone XS is that it produced more noticeable lens flare when catching direct light from the sun or bright sources such as playing-field lights, as in the following examples; notice the blue dot pattern in the foreground of the night image.

Since I wanted to focus on the Smart HDR and Portrait mode features for this look, I haven't shot many Raw photos using third-party apps such as Halide or Manual (the built-in Camera app does not include a Raw capture mode). Sebastiaan de With, the developer of Halide, determined that in order to make faster captures, the camera shoots at higher ISOs with shorter exposures, and then de-noises the results via software and image averaging. With Raw photos, however, that results in originals that aren't as good as those created by the iPhone X, because the often shorter exposures and correspondingly higher ISOs make them noisier. You can read more at the Halide blog: iPhone XS: Why It's a Whole New Camera. But keep in mind: if you shoot Raw on recent smartphones, you relinquish the very real benefits of the computational approaches these devices are taking.
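
The trade-off de With describes comes down to simple statistics: averaging N aligned frames with independent noise reduces the noise level by roughly √N, and that's the benefit you give up when you pull a single Raw frame out of the pipeline. A quick synthetic demonstration (this is just the math, not the iPhone's actual processing):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((480, 640), 0.4)          # flat synthetic "scene"
noise_sigma = 0.05                        # per-frame noise from a high-ISO capture

# Nine short, noisy exposures of the same static, aligned scene
frames = scene + rng.normal(0.0, noise_sigma, size=(9, *scene.shape))

single = frames[0]
averaged = frames.mean(axis=0)

print(single.std())    # ~0.05: one noisy frame
print(averaged.std())  # ~0.017: noise drops by about sqrt(9) = 3x
```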

Overall, though, the camera systems in the iPhone XS and iPhone XS Max turn out to be larger improvements than they initially seemed, especially for the majority of iPhone owners who want to take good photos without fuss. Apple's computational photography advancements in these models deliver great results most of the time, and point toward more improvements in the future.

iPhone XS sample gallery