
Google's Pixel 2 Secret Weapon Is 5 Times Faster Than iPhone X

This article is more than 6 years old.

Google’s Pixel 2 and Pixel 2 XL phones pack the highest-rated smartphone cameras on the planet right now, but they’ll soon be getting even better.

It’s all thanks to a new image processing chip which currently lies dormant inside the phone and wasn’t even mentioned in Google’s keynote at the launch of the new Pixel 2 handsets.


The chip, named ‘Pixel Visual Core’, is Google’s first custom-designed co-processor for consumer products and is designed to deliver maximum image processing performance while using minimum power. Furthermore, it will enable third-party apps to use the HDR+ feature, something which is currently only possible from within Google’s home-grown camera app.

HDR+ is Google’s name for the image processing magic which enables the company’s phones to take pictures of a much higher quality than would normally be possible with comparable hardware.

Because HDR+ is confined to Google’s app, popular third-party camera apps, which often offer greater manual control and additional features, can’t currently take pictures of the same quality.
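The core idea behind HDR+ is to capture a rapid burst of short exposures and merge them, which averages away sensor noise without the motion blur of one long exposure. Here’s a minimal sketch of the merge principle in Python with NumPy; the scene and noise values are synthetic stand-ins, not Google’s actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a real capture: a clean scene plus
# independent per-frame sensor noise on each of 8 burst frames.
scene = rng.uniform(0.2, 0.8, size=(64, 64))
burst = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(8)]

# Merging the (already aligned; here identical) frames averages out the noise.
merged = np.mean(burst, axis=0)

noise_single = np.std(burst[0] - scene)
noise_merged = np.std(merged - scene)
print(noise_single, noise_merged)  # merged noise is roughly 1/sqrt(8) of a single frame
```

Real HDR+ also has to align the burst frames against hand shake and handle moving subjects, which is where the heavy computation, and hence the dedicated silicon, comes in.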


The Pixel Visual Core chip will be turned on ‘in the coming months’ through a software update, says Google, and will enable more applications to use the Pixel 2’s camera hardware in HDR+ mode.

Pixel Visual Core is a powerful, eight-core processor capable of delivering more than 3 trillion operations per second. According to Google, this means the chip can process images five times faster than the phone’s main processor while using only one-tenth of the power.


This will not only speed up imaging performance but also greatly reduce the draw on the phone’s battery while taking and processing images.

Pixel Visual Core requires specially written code, but Google has made this easier for developers by supporting the open-source projects Halide (for image processing) and TensorFlow (for machine learning).

Pixel Visual Core is due to be enabled for the first time in the forthcoming Android Oreo 8.1 (MR1) developer preview, available in the coming weeks. More exciting is the news that the chip will eventually be made accessible to all third-party applications via the Android Camera API, meaning the door will be open for all your favourite camera apps to take photos in HDR+ mode without the need for any new code in Halide or TensorFlow.

Google’s camera app is great but lacks the sophistication of some third-party offerings. I particularly look forward to being able to save HDR+ images in uncompressed high bit-depth versions for processing in Adobe Lightroom Mobile or later on my PC.

Soon you won’t have to choose between the ultimate image quality and the convenience and enhanced flexibility of your favourite camera app.

Just as exciting is the level of performance offered by Pixel Visual Core. The A11 Bionic chip found in Apple's iPhone 8, iPhone 8 Plus and iPhone X features a dual-core 'Neural Engine' capable of up to 600 billion operations per second. Pixel Visual Core, on the other hand, is rated at up to three trillion. That's five times the performance, assuming like-for-like operations.
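The arithmetic behind that five-times figure is straightforward, taking both manufacturers' headline numbers at face value:

```python
# Headline figures as quoted: Apple's Neural Engine vs Google's Pixel Visual Core.
a11_neural_engine_ops = 600e9   # 600 billion operations per second
pixel_visual_core_ops = 3e12    # 3 trillion operations per second

speedup = pixel_visual_core_ops / a11_neural_engine_ops
print(speedup)  # 5.0
```

Of course, raw operations per second only translate into a real-world advantage if the two chips are doing comparable work, hence the like-for-like caveat.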
