Beautygate —

Apple says glossy iPhone XS selfies were a bug, promises a fix in iOS 12.1

"Computational photography" isn't without its downsides.

The notch and front-facing camera on the new iPhone. (Photo: Samuel Axon)

When we reviewed the iPhone XS and XS Max, we found a number of small things to quibble about in our generally positive review. Among them was the fact that Apple's self-styled "computational photography" approach involves the software doctoring the photos you take in ways that are not transparent to you and that may not produce the effect you want. That issue was at the heart of a minor scandal surrounding the phone's launch that some dubbed "Beautygate."

Users who took selfies with the front-facing camera found that the resulting images looked a little bit glossed over. Skin was unrealistically smooth, with blemishes and details missing—similar to what you'd see with a beauty filter. Some users took to forums to speculate that Apple did this deliberately because these kinds of filters are popular in one of its largest markets (China) and in social messaging apps like Snapchat.

However, Apple has told The Verge that the behavior is in fact a bug and that a fix is coming in iOS 12.1.

The issue is reportedly related to the Smart HDR feature in the iPhone XS, which shoots a buffer of four frames for every photo, along with "interframes" at different exposures. This is done in part to reduce the effects of shutter lag, but it also presents the phone's software and hardware with the opportunity to quickly analyze and compare every frame to make smart decisions about which frame to use, which things to highlight, and so on.
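The buffering idea can be sketched in a few lines. This is a conceptual illustration only, not Apple's implementation: the `Frame` and `FrameBuffer` names are hypothetical, and real capture pipelines work on hardware image buffers, not Python objects.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Frame:
    exposure_ms: float           # shutter time used for this frame
    pixels: list = field(default_factory=list)  # stand-in for image data

class FrameBuffer:
    """Rolling buffer: the camera fills this continuously, so when the
    shutter is pressed, recent frames already exist -- no shutter lag."""
    def __init__(self, size=4):
        self.frames = deque(maxlen=size)  # old frames fall off automatically

    def push(self, frame):
        self.frames.append(frame)

    def snapshot(self):
        # Frames available the instant the user taps the shutter.
        return list(self.frames)

buf = FrameBuffer(size=4)
for _ in range(10):                 # camera streams frames before the tap
    buf.push(Frame(exposure_ms=8.0))
shot = buf.snapshot()               # only the 4 most recent remain
```

The point of the rolling buffer is that capture happens before the button press, so the software can then compare the candidate frames at its leisure.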

According to Apple, Smart HDR would sometimes choose a frame with a long shutter speed, which led to images in which fine details in the subject's skin were lost. The problem was exacerbated by the fact that the front-facing camera does not have optical image stabilization. The company says iOS 12.1 will bring a fix that is meant to make the phone use frames that show more detail instead.

iOS 12.1 doesn't have a release date yet, but it is currently well into beta testing, and it's expected soon.

Apple isn't the only company to employ these sorts of methods for touching up photos behind the scenes. Using powerful image processors and machine learning is a trend in today's phones for a reason—it's an extremely promising way to make up for the limitations inherent in phone cameras' optics. But it does have the downside of taking some control away from the user. It's good that Apple is addressing the issue, but we doubt we've seen the last of this kind of concern from our smartphone cameras.
