Training My iPhone

I’ve often referred to Apple’s iOS as a learning operating system. Look beneath the surface and you see iOS adapting to your habits and constantly learning about your behavior. A simple example: if you go to a specific location every day at the same time, the iPhone will give you a notification telling you how long it will take to get there. If you commute to an office every day, you don’t need to tell it where your office is; the device simply recognizes the pattern. Siri app suggestions similarly look for patterns in the location or time of day at which you use certain apps and give you quick access to the apps you use regularly. These are two simple examples of how the iPhone, and the underlying iOS, learns about the habits and behaviors of its owner.
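To make that idea concrete, here is a minimal sketch of the kind of frequency-based habit detection I’m describing. It’s written in Swift, but the `VisitRecord` and `HabitDetector` names and the threshold logic are my own illustration, not Apple’s actual APIs or on-device models.

```swift
// Hypothetical sketch of frequency-based habit detection.
// `VisitRecord` and `HabitDetector` are illustrative names, not Apple APIs.
struct VisitRecord {
    let place: String   // e.g. "Office"
    let weekday: Int    // 1...7
    let hour: Int       // 0...23
}

struct HabitDetector {
    // Suggest a place when the owner has been there at this weekday/hour
    // in most of the recorded observations.
    func suggestedPlace(weekday: Int, hour: Int,
                        history: [VisitRecord],
                        threshold: Double = 0.6) -> String? {
        let matching = history.filter { $0.weekday == weekday && $0.hour == hour }
        guard !matching.isEmpty else { return nil }

        // Count how often each place shows up in this weekday/hour slot.
        let counts = Dictionary(grouping: matching, by: { $0.place })
            .mapValues { $0.count }
        guard let (place, count) = counts.max(by: { $0.value < $1.value }) else { return nil }

        // Only suggest when the pattern is dominant enough.
        return Double(count) / Double(matching.count) >= threshold ? place : nil
    }
}

// After a few weeks of 9 a.m. weekday visits, "Office" becomes the suggestion.
let history = (1...5).flatMap { day in
    (0..<4).map { _ in VisitRecord(place: "Office", weekday: day, hour: 9) }
}
print(HabitDetector().suggestedPlace(weekday: 3, hour: 9, history: history) ?? "no suggestion")
```

The point is simply that a history of repeated visits at the same weekday and hour is enough signal for a useful suggestion; whatever Apple runs on device is obviously far more sophisticated than this.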

An operating system that is constantly learning, adapting, and becoming more tuned to its owner’s behavior is a relatively new idea. I believe Apple has been heading in this direction for a while, and the software, hardware, and even custom silicon chipsets are all uniquely part of the equation. When you dig into it, you realize Apple is only scratching the surface of what is possible with an operating system that you train, simply by using it, to serve you better.

Training my iPhone
I’m probably in the minority, but I am highly aware my iPhone is learning about me and even giving me opportunities to train it. For example, I am intentionally letting the Apple News app know when I like a particular article. My goal in doing this is to train the underlying algorithms behind Apple News so that the service can better serve me with relevant news articles. I view this as an important part of training my iPhone, and in this case Apple News, so it can better help me discover content I will like and want to read.

Similarly, I am intentional about letting Apple Music know when I like a song or artist. In doing this, I’m trying to train Apple Music to learn what I like and what I don’t so it will surface songs I enjoy and help me discover new artists. Apple News and Apple Music are just two examples of how you can train your iPhone so it becomes more customized and tailored to your unique needs.
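At its simplest, this kind of explicit feedback amounts to nudging per-topic or per-artist weights up or down and ranking candidates by those weights. The Swift sketch below shows that idea; `PreferenceModel`, its tags, and the scoring step are all assumptions on my part for illustration, not how Apple News or Apple Music actually work.

```swift
// Hypothetical sketch of like-driven preference weights.
// `PreferenceModel` and its tags are illustrative, not Apple News/Music internals.
struct PreferenceModel {
    private(set) var weights: [String: Double] = [:]   // topic or artist -> affinity

    // Explicit feedback nudges the weight of every tag on the liked/disliked item.
    mutating func record(liked: Bool, tags: [String], step: Double = 0.2) {
        for tag in tags {
            weights[tag, default: 0.0] += liked ? step : -step
        }
    }

    // Rank candidate items by summing the affinity of their tags.
    func rank(_ candidates: [(title: String, tags: [String])]) -> [String] {
        candidates
            .map { item in
                (item.title, item.tags.reduce(0.0) { total, tag in total + (weights[tag] ?? 0.0) })
            }
            .sorted { $0.1 > $1.1 }
            .map { $0.0 }
    }
}

var model = PreferenceModel()
model.record(liked: true,  tags: ["indie rock", "new artist"])
model.record(liked: false, tags: ["talk radio"])
print(model.rank([("Playlist A", ["indie rock"]), ("Playlist B", ["talk radio"])]))
// ["Playlist A", "Playlist B"]
```

Every like or dislike shifts the weights a little, which is why being intentional about that feedback speeds the whole process up.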

In some ways, each owner’s iPhone will become like a fingerprint. No two iPhones will be the same because each one is becoming tailor-made for its owner. Right now Apple News, Apple Music, Siri app suggestions, and even personalized Maps experiences are all unique to each iPhone owner. I’m not sure how often normal consumers do this, and it certainly isn’t necessary to manually train the iPhone since iOS itself is built to learn and adapt. But, for now, being intentional about liking and disliking music or news makes this process quicker.

The iOS Digital Nervous System
In many ways, iOS is the central nervous system of the iPhone. The unique customization and personalization that takes place in the underlying iOS architecture will lead to a depth of experience that will be hard to find on other products. I view this as one of the more sticky innovations of iOS and one that will lead to even deeper loyalty once it reaches its full potential.

What will be interesting to watch is how Apple weaves all of this into more core experiences in their software. I’m also curious to see what additional control or assisted training Apple can weave into their apps and services. For example, what if (someday), when you like a song, Siri asked why, or what you like about it? You could then give Siri some context, which deepens the learning and could lead to better assistance, recommendations, and discovery.

Using Siri as the conversational interface to better train Apple’s core software and services seems like a natural fit. Granted, I’m not sure consumers will be immediately comfortable with this. However, I think the convenience benefits of letting your iPhone get to know you will be worth it in the end.

Another point that stands out to me is that the smartphone is the only device this experience can start from. The smartphone is the most personal computer any human will use and own, which makes it the perfect device for training and personalization that is unique to its owner. But that does not mean the things your iPhone learns about you can’t extend to other products. For example, perhaps macOS can benefit from what my iPhone has learned about me and begin to be customized as well. The same goes for my Apple TV, Apple Watch, and any other device that is part of the Apple ecosystem. What that will lead to is an ecosystem of devices uniquely customized and tailored to you.

Apple’s Lead
With everything I’ve outlined above, which blends a number of themes around devices, custom chipsets, integrated software, AI/ML, and security and privacy, Apple is the clear leader. Other companies will try to do the things I mentioned but can only do so on one device. Apple has the ability to leverage their ecosystem and depth of products so they all work together. This creates a more compelling story for an Apple customer to buy more Apple products. Buying more Apple products increases Apple’s ARPU, leads to more stickiness, and creates more opportunity for services revenue over time.

The machine learning/AI era will not be limited to one device but will have to span all the things a customer uses. As these devices become more aware of the unique interests and behaviors of their owner, those benefits will translate to all the devices in the ecosystem. The implication is that the point I made about your iPhone becoming uniquely tuned to its owner will apply to every device Apple makes, thanks to the training that takes place on the iPhone.

Lastly, this central nervous system can become a distributed learning engine where data from all Apple hardware feeds the central brain and trains it. Therefore, while your iPhone may be the central brain, your Apple TV, Apple Watch, iPad, and future products can all feed that brain and provide even more useful training data. When you look at those competing with Apple, you quickly realize Apple is the only one with any critical mass of ecosystem devices, with Apple Watch and iPad being the best examples.
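To illustrate what I mean by a distributed learning engine, here is a rough Swift sketch in which each device contributes its own affinity scores and only those scores are merged into a single profile. `DeviceSignals` and `merge` are names I made up for the illustration; this is not an Apple framework, and averaging is just one simple way the aggregation could work.

```swift
// Hypothetical sketch of the distributed learning idea: each device keeps its own
// affinity scores and only those scores are merged into the owner's profile.
// `DeviceSignals` and `merge` are invented names, not an Apple framework.
struct DeviceSignals {
    let device: String                // e.g. "Apple Watch"
    let affinities: [String: Double]  // interest -> strength observed on that device
}

// Average per-device scores so no single device dominates the merged profile.
func merge(_ signals: [DeviceSignals]) -> [String: Double] {
    var totals: [String: (sum: Double, count: Int)] = [:]
    for signal in signals {
        for (interest, strength) in signal.affinities {
            let current = totals[interest, default: (sum: 0.0, count: 0)]
            totals[interest] = (sum: current.sum + strength, count: current.count + 1)
        }
    }
    return totals.mapValues { $0.sum / Double($0.count) }
}

let profile = merge([
    DeviceSignals(device: "iPhone",      affinities: ["cycling": 0.8, "jazz": 0.4]),
    DeviceSignals(device: "Apple Watch", affinities: ["cycling": 0.9]),
    DeviceSignals(device: "Apple TV",    affinities: ["jazz": 0.7]),
])
print(profile)  // ["cycling": 0.85, "jazz": 0.55] (key order may vary)
```

The detail that matters here is that each device only has to share what it has learned, not the raw behavior itself, which fits the security and privacy theme above.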

Hopefully, I’ve articulated a few points to think about regarding how Apple is using AI/ML uniquely, with more of a focus on the individual than on a general data set the way Google approaches it. I’m still fleshing out this model of Apple’s central nervous system and learning OS, and perhaps the next time I update or tackle this subject I can create some visuals to clarify it even more.

