Technology

Is the iPhone’s Camera Finally Good Enough?

Reviewers have lots of praise for the iPhone X—but that selfie might feel too accurate for comfort.

After generations of improvement, is the iPhone camera finally good enough?

Justin Sullivan/Getty Images

One of the most quantifiable improvements to Apple’s iPhone is the evolution of its cameras. Now that the line is ten generations deep, we can look back (as Mashable did here) and compare how the saturation, brightness, sharpness, and overall quality of the iPhone’s photos have changed. While there’s still room for improvement—and Apple knows it—people are beginning to wonder: Is the iPhone’s camera finally good enough? In fact, is it too good?

Some think so. The iPhone X pairs a 7-megapixel front-facing camera with Apple’s TrueDepth sensing technology, which brings Portrait mode to the front-facing camera for the first time. Portrait mode uses AI to figure out which parts of a photo should stay in sharp focus and which should be blurred, giving photographs a more professional look. Not everyone likes that, though. In fact, some feel that the iPhone X’s selfies are suddenly too accurate for comfort.
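For the technically curious: a minimal sketch, using Apple’s public AVFoundation depth APIs, of how a third-party app can request the same kind of depth data the TrueDepth camera feeds into portrait-style blur. This is not Apple’s own Portrait mode pipeline, and the function names here are purely illustrative.

```swift
import AVFoundation

// Configure a capture session on the front TrueDepth camera with depth
// delivery enabled. Returns nil on devices without TrueDepth hardware.
func configureFrontDepthCapture() -> (session: AVCaptureSession, output: AVCapturePhotoOutput)? {
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device) else {
        return nil
    }

    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()

    session.beginConfiguration()
    guard session.canAddInput(input), session.canAddOutput(output) else { return nil }
    session.addInput(input)
    session.addOutput(output)
    // Depth delivery has to be switched on at the output level first...
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    session.commitConfiguration()

    return (session, output)
}

// ...and then requested again per shot. The captured AVCapturePhoto carries
// an AVDepthData map an app can use to decide what stays sharp and what gets blurred.
func depthPhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    return settings
}
```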

A number of iPhone X owners have taken to Twitter to air their complaints.

Others dig it, though.

“Taking selfies on the iPhone X is a religious experience,” Paris Martineau wrote for New York. “Gone are the shaky-handed, perma-blurry thumb pics; in their place are selfies so well-lit and clear that your friends will think you hired a photographer.”

One of the best parts about hiring a professional photographer is the professional-level touch-ups you get before the photos are sent out to the world. The iPhone X doesn’t do that for you (although plenty of third-party apps will). If you’ve got wrinkles, crinkles, smile lines, or overlarge pores, the iPhone X’s camera and Portrait mode will highlight those details in all their 7-megapixel glory.

It’s fascinating that Apple may have made a camera that is too accurate for comfort, especially since the company continues to work on improving its imaging technology. On Thursday, TechCrunch confirmed that Apple acquired imaging sensor startup InVisage Technologies. InVisage holds 27 patents, but its primary development is something called QuantumFilm. This product combines software and materials science into an imaging sensor that’s smaller than today’s sensors while capturing better photos in low light. Instead of using silicon, QuantumFilm incorporates a photosensitive layer made of quantum dots, a type of nanoparticle. This layer is 10 times thinner than silicon but absorbs the same amount of light—and the whole light spectrum, too.

While the iPhone excels at portrait photography—especially compared to other leading smartphones on the market—it could still stand to improve when it comes to snapping photographs in darker conditions. In tests against a professional mirrorless camera, the Panasonic GH5, testers found that the iPhone X performed about as well as or better than the GH5 in a variety of situations, but not when it came to low light, image stabilization, or zoom. It’s possible that InVisage’s QuantumFilm could fix that.

InVisage’s technology could be useful to Apple in other ways, too. According to its website, QuantumFilm can also be used in areas such as authentication, autonomy, virtual reality, and augmented reality. And we know Apple is heavily invested in that last one.

Apple hasn’t yet commented on the acquisition—it gave TechCrunch a boilerplate response that it “buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans”—but it would be surprising if Apple didn’t use InVisage’s technology to better its iPhone cameras in the future. Right now, you may still be able to escape Portrait mode’s pore-highlighting exactness with a dimly lit bar bathroom selfie. But that might not be the case come next year.