Apple's Animoji Will Teach You To Love Face Tracking, For Better or Worse

With the new iPhone X, you can turn your visage into a fox, a unicorn, or a pile of poo.
Phil Schiller, senior VP of worldwide marketing at Apple Inc., speaks about the iPhone X at the Steve Jobs Theater in Cupertino, CA on September 12, 2017. David Paul Morris/Bloomberg/Getty Images

One of the first things you'll do after throwing down $1,000 on a new iPhone X is take your own picture. But don't expect a regular selfie. This time, you'll fire up Apple's new TrueDepth camera and a suite of technologies—flood illuminator, infrared camera, front-facing camera, and dot projector—will project and analyze 30,000 dots across your visage, creating a high-resolution map of your facial features.

This superselfie enables some of the phone's most compelling new features: things like automatically unlocking your phone, paying for coffee with your face, and turning your facial expressions into a grinning pile of poop. In a demo at Apple's launch event, the audience watched as Craig Federighi, Apple's senior VP of software engineering, transformed himself into various "animoji" (as in emoji, but animated). As he thumbed through the iOS Messages app, Federighi's face became a clucking chicken, a neighing unicorn, and a chattering dung pile.

The point was to show animoji's silly side. "If you were by chance wondering what humanity would do when given access to the most advanced facial tracking technology available, you now have your answer," Federighi joked. But he also undersold animoji's potential. Someday the facial-tracking software used to create talking animoji might do more than mimic your emotion—it might predict it too.

Follow the Face

The technology that Apple is using isn't new, exactly. "The infrared light projection and 3-D scanning is something that is present on the consumer market for some time now," says Dino Paic, director of sales at facial-recognition software company Visage Technologies. In fact, it's a lot like what Microsoft Kinect's depth camera has done for years. Using the phone's TrueDepth camera, Apple can track more than 50 muscle movements and overlay those features onto the emoji you know and love—that includes the fox, the unicorn, and yes, the pile of poop. As you contort your face from a smile to a frown, the expression on your chosen emoji changes with it.
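For developers, that capability surfaces through ARKit, which reports the TrueDepth camera's output as a dictionary of normalized "blend shape" coefficients, roughly one per tracked muscle movement. Here's a minimal sketch using ARKit's published face-tracking API, with a simple print-out standing in for Apple's actual avatar rendering:

    import ARKit

    // A minimal sketch of reading TrueDepth face tracking through ARKit.
    // ARFaceTrackingConfiguration and blendShapes are real ARKit APIs;
    // printing coefficients stands in for driving a rigged 3-D avatar.
    final class FaceTracker: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Requires the iPhone X's TrueDepth hardware.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let face as ARFaceAnchor in anchors {
                // Each blend shape is one tracked muscle movement, normalized 0...1.
                let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
                let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
                print("smile: \(smile)  jaw open: \(jawOpen)")
            }
        }
    }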

Apple's primary achievement was squeezing this technology into a phone that you'll have with you all the time. In that light, Apple tasking its biggest brains with turning you into a talking emoji isn't trivial: it acclimates people to seeing their faces tracked in real time. Defusing that creep factor will be even more important if and when Apple extends its facial-tracking technology to other uses.

That certainly seems within reach. Last year, Apple acquired a company called Emotient, which uses facial tracking software to analyze and predict human emotion. By watching how your face moves—if you raise an eyebrow, glance downward, smile, or frown—machine learning can begin to figure out how you feel.
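To make that concrete: once per-muscle coefficients like the blend shapes above are available, even crude rules can gesture at a mood. The toy sketch below is an illustration of the idea only, not Emotient's method; the thresholds and labels are invented, and a real system would learn the mapping from labeled data rather than hand-coding it.

    // A toy mood estimate from tracked facial coefficients. Thresholds
    // and labels are invented for illustration; a production system
    // would use a trained classifier, not hand-tuned rules.
    enum Mood { case happy, surprised, displeased, neutral }

    func estimateMood(smile: Float, browInnerUp: Float, browDown: Float) -> Mood {
        if smile > 0.6 { return .happy }           // broad smile
        if browInnerUp > 0.5 { return .surprised } // raised eyebrows
        if browDown > 0.5 { return .displeased }   // furrowed brow
        return .neutral
    }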

Big tech companies such as Facebook and Google are already investing in research around this kind of affective computing. Amazon claims it can parse human emotion by studying video clips. And while Apple's animoji does little more than mimic facial expressions, it could someday lead to a phone that has far greater emotional intelligence.

"It's interesting what they chose to come out with," says Rana el Kaliouby, founder of Affectiva, an MIT spinoff that develops technology to track and measure human emotion. "Animoji is fun and engaging and interactive, but it's not exactly advanced emotional AI in the way we think about it."

Second That Emotion

It's a short walk from there to something more involved, though. Startups such as Polygram are attempting something similar to Apple's efforts, with a layer of emotion analysis built in. The LA-based company trained a convolutional neural network to recognize facial expressions and then map those onto sentiments such as boredom, happiness, and interest. The app, which is like a Snapchat/Instagram hybrid, shows the poster, in real time, the facial expressions of the people looking at the post. It also shows an emotional analysis of how people react to the content, without the inauthentic niceties that usually accompany social media. "The gist of what we do is understanding human facial expression for purpose of understanding your mood," says Faryar Ghazanfari, Polygram's founder.
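On iOS, a pipeline like that could be sketched with Apple's Vision framework running a Core ML model over camera frames. In the sketch below, VNCoreMLRequest and VNImageRequestHandler are real Vision APIs, while ExpressionNet is a hypothetical stand-in for Polygram's unpublished network; the sentiment labels would depend entirely on the training data.

    import Vision
    import CoreML

    // Sketch: classify the facial expression in one camera frame with a CNN.
    // "ExpressionNet" is a hypothetical Core ML model, not Polygram's.
    func classifyExpression(in frame: CGImage) throws {
        let model = try VNCoreMLModel(
            for: ExpressionNet(configuration: MLModelConfiguration()).model)
        let request = VNCoreMLRequest(model: model) { request, _ in
            let top = (request.results as? [VNClassificationObservation])?.first
            // e.g. "interest 0.82" -- labels come from the training data.
            print(top.map { "\($0.identifier) \($0.confidence)" } ?? "no result")
        }
        try VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
    }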

Apple doesn't have a social network to bolster with emotional intelligence, nor does it rely on advertising like Google or retail prompts like Amazon, two businesses that stand to benefit clearly from emotion recognition. It does, however, have a music-streaming service whose success hinges on personalization and a vested interest in original video content. It's interesting to imagine the ways that emotional data gleaned from your facial expression could be combined with biometric data from your Apple Watch to paint a more holistic picture of health. It's just as scary to imagine the repercussions of that data being misused.

For now, at least, Apple's animoji are little more than a clever gimmick. Will Apple eventually make its messaging more emotionally intelligent? Maybe. Are animoji a Trojan horse for making a creepy technology less creepy? Probably. It's easy to imagine that someday the company might be interested not just in how you look, but in how you feel.