Kuo: Android devices to rely on alternate 3D sensing tech after iPhone X takes early lead with TrueDepth

In a bid to catch up with Apple's TrueDepth 3D-sensing camera, Android device makers are looking to incorporate similar components using time-of-flight (ToF) technology, according to KGI analyst Ming-Chi Kuo, though the first models are not expected to arrive until 2019.

In a note to investors seen by AppleInsider, Kuo predicts Chinese manufacturer Huawei will introduce smartphone models with 3D-sensing capabilities in 2019.

The company will start off with structured light solutions like those found in Apple's TrueDepth camera system for iPhone X, but will move to cheaper, smaller time-of-flight sensors by the second or third quarter of next year, Kuo says. Models with ToF sensors could replace those with structured light technology within three to six months of the start of mass production.

Other Android device makers could follow Huawei's lead and build out their own facial recognition camera systems using ToF technology.

"With advantages in BOM, we see ToF becoming the Android camp's mainstream facial recognition solution, as long as production yield and recognition quality don't become major issues," Kuo writes.

Apple became the first major smartphone maker to leverage depth-sensing camera technology with the release of iPhone X last year. Dubbed TrueDepth, Apple's specialized camera system integrates a dot projector, flood illuminator, infrared and color cameras, and advanced learning algorithms to power a facial recognition solution called Face ID.

To accurately map a user's face, TrueDepth projects an array of 30,000 dots in a known pattern using a vertical-cavity surface-emitting laser (VCSEL) module operating in the infrared spectrum. The resulting image, specifically deviations in the pattern, is captured by an infrared camera, combined with 2D image information and compared against secure reference data.
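For context, structured light systems like this recover depth by triangulation: a dot from the known pattern appears shifted in the infrared camera's image by an amount that shrinks as the surface moves farther away. The sketch below illustrates the general principle only; the baseline and focal length values are hypothetical and do not reflect Apple's hardware.

```swift
import Foundation

// Simplified structured-light depth recovery by triangulation.
// A dot projected in a known pattern shifts laterally in the infrared
// camera's image depending on the distance of the surface it hits.
// The baseline and focal length below are illustrative, not Apple's specs.
let baselineMeters = 0.02          // projector-to-camera separation (assumed)
let focalLengthPixels = 1_400.0    // IR camera focal length in pixels (assumed)

/// Estimates depth in meters from the observed dot disparity in pixels.
/// Returns nil when the disparity is too small to triangulate reliably.
func structuredLightDepth(disparityPixels: Double) -> Double? {
    guard disparityPixels > 0.1 else { return nil }
    // Classic triangulation: depth = baseline * focal length / disparity.
    return baselineMeters * focalLengthPixels / disparityPixels
}

// Example: a dot displaced by 70 pixels implies a surface roughly 0.4 m away.
print(structuredLightDepth(disparityPixels: 70) ?? "out of range") // ≈ 0.4 m
```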

Rather than capturing an image of projected patterned light, ToF solutions generate depth maps by measuring the time it takes laser pulses, or alternatively modulated laser light, to bounce off an object's surface and return to the sensor.
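As a rough illustration of the direct ToF approach (not drawn from Kuo's note), distance follows from the round-trip travel time of a light pulse at the speed of light; the function name and example timing below are hypothetical.

```swift
import Foundation

// Illustrative sketch of direct time-of-flight ranging: distance is derived
// from the round-trip travel time of a light pulse at the speed of light.
// Names and numbers are hypothetical, not from any vendor's hardware or API.
let speedOfLight = 299_792_458.0 // meters per second

/// Converts a measured round-trip time (in nanoseconds) to distance in meters.
func tofDistance(roundTripNanoseconds: Double) -> Double {
    let roundTripSeconds = roundTripNanoseconds * 1e-9
    // Divide by two because the pulse travels to the object and back.
    return speedOfLight * roundTripSeconds / 2.0
}

// Example: a pulse returning after about 3.3 ns corresponds to roughly 0.5 m.
print(tofDistance(roundTripNanoseconds: 3.3)) // ≈ 0.49 m
```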

Despite having a lead in terms of progress toward mass production, Kuo believes structured light technology might become a "transitional solution" amid a wider industry push into ToF systems. Whether Apple will also turn to ToF in a future iteration of TrueDepth is unknown, though rumors last year suggested the company is investigating integration of the technology in a rear-facing 3D sensor for iPhone that could debut in 2019.

Kuo's predictions come on the heels of a report estimating that Apple has a two-year head start over the Android camp in what is becoming a 3D sensing arms race. Suppliers of VCSEL modules and optical filters said they are unlikely to reach production levels adequate for wide adoption until 2019.

Apple, on the other hand, gobbled up parts supply for iPhone X and inked an exclusive deal with key manufacturer Finisar in 2017, giving it a substantial leg up on the competition.

Even if companies like Huawei shift from structured light to ToF, supply will likely remain constrained, as the two technologies share a number of components, including hard-to-source VCSEL arrays.