
The end of app isolation: How our apps will work together to serve us better


In mobile, “context” is everything. Where you are, what time it is, what you are doing — these are essential to delivering a great mobile experience.

But let’s be honest: Since the dawn of the first iPhone eight years ago, apps haven’t done context as well as we’d like. That’s because they have been stuck in isolation. And isolation is no place to be when you’re trying to understand the world around you.

The phones we carry around have dozens of individually packaged applications that appear as a series of unrelated icons on the screen. The good ones do an exceptional job of offering up deeply engaging experiences, but each is a solitary experience. They are oblivious to one another and unaware of whom you are texting or emailing at the moment. All this means apps alone can’t really understand your context well enough to deliver on their enormous potential.

However, mobile operating systems are in a position to more fully understand you. At the recent Google I/O and Apple WWDC developer events, the world’s two most important mobile companies each announced big leaps in how they harness data and behavior that defines your context. With these leaps, Android and iOS both are offering up a new type of mobile experience that Google refers to as “glanceable moments.”

The welcome news for consumers and app developers is this: Apps have been invited out of isolation.

Google Now on Tap takes what it knows about you by looking at your calendar, location, contacts and recent behavior — including tapping into your app usage — and serves up contextually relevant snippets. These snippets, or cards, come from a combination of things — including data on your phone, the mobile web and your apps.

For example, if I am reading a text message about going to a restaurant tonight, I can hold the home button and Google Now on Tap will show snippets at the bottom of my message app, including directions, reviews and contact information for the restaurant. Or, if I’m listening to a song, I can tap the home button and simply say “OK Google, who is the lead singer?” Now on Tap would understand that I am asking about the lead singer of the band currently playing in my chosen music app.

Siri’s upgrade also uses artificial intelligence to understand what you need in the moment. Siri’s “proactive” feature doesn’t analyze what is on your phone screen in the same way that Google Now on Tap does, a distinction Apple touted as part of its commitment to privacy. But it interprets your habits – such as your tendency to listen to your music app when going for a run in the morning, or a podcast when driving – and offers up music or podcast choices accordingly.
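For developers, feeding these glanceable moments generally means describing in-app activity to the operating system so it can surface that content later. As a rough illustration — not Apple’s or Google’s prescribed approach, and with the activity type, class, and restaurant data invented for the example — here is a minimal Swift sketch using iOS’s NSUserActivity API, one mechanism Apple exposes for making app content available to search and proactive suggestions:

```swift
import UIKit
import CoreSpotlight
import MobileCoreServices

// Sketch only: shows how an app *might* describe the content a user is
// viewing so OS-level features (Spotlight search, proactive suggestions)
// can surface it. The activity type and restaurant data are illustrative.
final class RestaurantViewController: UIViewController {

    // Hypothetical model for the screen the user is currently viewing.
    struct Restaurant {
        let name: String
        let neighborhood: String
    }

    var restaurant = Restaurant(name: "Osteria Example", neighborhood: "SoMa")

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Describe what the user is doing right now.
        let activity = NSUserActivity(activityType: "com.example.app.viewRestaurant")
        activity.title = restaurant.name
        activity.keywords = [restaurant.name, restaurant.neighborhood, "dinner"]
        activity.isEligibleForSearch = true   // allow the system to index it
        activity.isEligibleForHandoff = false // this example is search-only

        // Optional richer metadata via Core Spotlight attributes.
        let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
        attributes.contentDescription = "Menu, photos and reviews for \(restaurant.name)"
        activity.contentAttributeSet = attributes

        // Mark this as the user's current activity so the OS can act on it.
        userActivity = activity
        activity.becomeCurrent()
    }
}
```

The point of the sketch is simply that the app stays the source of the experience; it volunteers a small, structured summary of what the user is doing, and the operating system decides when a glanceable moment is worth showing.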

When I first saw the I/O demo, it reminded me of The Knowledge Navigator, a virtual assistant concept presented in 1987 by John Sculley, then CEO of Apple, and the contextual “Data Detectors” we were working on in the ’90s when I was at Apple. Nearly three decades after the Knowledge Navigator was conceived, we’re finally getting close to having a virtual assistant that really knows us.

Some pundits have misinterpreted this improvement to context as a step toward the end of apps, including this Wired headline about the Google Now news: “Google’s Ingenious Plan to Make Apps Obsolete.” The argument is that if the data behind the apps — which may also be available via the mobile web — is what drives the more contextually aware mobile OS, then why would anyone need apps?

There are three reasons why:

1. Better snapshots won’t replace deeper experiences

Apps have long excelled at offering deeply engaging experiences — and it’s one of the reasons app usage remains much higher than mobile web usage (estimates suggest 80 percent or more of time spent on mobile devices is spent in apps). Many of these “sticky” and immersive experiences can only come from software that lives on a device and was natively designed for a platform — the device-agnostic mobile web just can’t perform the same way.

While Siri and Google Now promise to offer up snapshot information you need in the moment, they don’t try to satisfy this need for deeper engagement. In the restaurant example, even though Google Now may recommend a nearby restaurant based on texts from your friends, you may still want to open your Yelp app to learn more about the restaurant’s ambiance and see pictures of its food. There are also apps that serve no informational purpose at all, such as games. There are more than a billion mobile gamers worldwide — and beyond launching a game on your command, Siri and Google Now won’t have much of a role in that experience.

The virtual assistant that really knows you can help guide you throughout your day, but it can’t help you live it.

2. Glanceable information, like notifications, will increase app usage

Improvements to Google Now and Siri represent a huge step for the user experience, and they promise to do their part to increase the overall use of mobile devices. Given that apps are a huge part of the mobile experience and information from apps is now included in these glanceable moments, there’s every reason to believe that app usage will also increase.

The information presented by Google Now and Siri is in some ways an evolution of notifications. Effective notifications have been shown to bring people back into their apps; one study found that notifications increased app use by 88 percent. Like notifications, the snippets Google Now surfaces or the patterns Siri identifies will drive you back into your apps — not keep you from them.

3. We trust apps and the companies that have them

The app customer lifecycle typically starts something like this: You search the web for a kind of service or company and engage with the content you find. That company or service may also have an app that offers a more deeply engaging experience designed specifically for your device. If you like what you see on the web, you download the native app. By downloading the app, you show a higher level of trust in that company or service than in a similar one whose app you didn’t install.

So, when Google Now and Siri are analyzing your context, the most relevant information is going to be from the apps you’ve chosen to download to your phone. To put it another way, what would you be more likely to trust: a piece of information that came from one of millions of web pages or one of the apps you have on your phone?

Hyperbole and the end of apps

Wired wasn’t the first publication to suggest the end of apps is imminent. Even the MIT Tech Review declared them as good as dead in 2011. But the declarations of “the end of apps” are hyperbole — in most cases, the arguments aren’t really about the utility of the apps themselves but rather their isolated state as disconnected icons on a phone screen.

The advancements of Google Now and Siri address this — and represent a whole new world for apps and the developers that make them. As we ideate, design and engineer the next generation of mobile experiences, the apps we build can finally coexist with the operating systems and the data on our devices, along with the mobile web — and together offer the kind of mobile experiences we deserve.

Now that apps are out of isolation, “context” has a whole new context.

Eric N. Shapiro is the CEO and cofounder of ArcTouch, a San Francisco-based mobile app development studio. His days of working with mobile technology go all the way back to his time at Apple, when (among other things) he was involved in the launch of the Newton.
