iPhone X: A New Frontier

For a radically redesigned iPhone launching on the tenth anniversary of the first model, it seems reasonable to lead a review of the device with a retrospective on the original iPhone and how, over the course of a decade, its vision matured into the iPhone X. But today, I want to begin with the iPhone 4.

In many ways, it was 2010’s redesigned iPhone 4 that turned the iPhone into the social, cultural, and economic phenomenon we all recognize in 2017. It was the first iPhone to be publicly leaked several weeks in advance, kicking off the modern era of Apple rumors and speculation on unreleased hardware. The iPhone 4 brought a substantial rear camera upgrade and, for the first time, a front-facing camera – features that would become instrumental to the redefinition of mobile photography for self-expression, communication, and sharing through the rise of selfies, FaceTime video calling, and Instagram.

Perhaps most importantly, the iPhone 4, with its Retina display and bold new look, was a piece of modern industrial design that demanded to be held, used, and studied to be fully appreciated. Unlike its predecessor – the faster and forever endeared iPhone 3GS – the iPhone 4 felt like a marvel of pocket computing that seemed impossible a mere 12 months after the 3GS’ plastic body and low-resolution display. The iPhone 4 was an instant classic that leaped beyond anything we’d seen at the time.

I understand why the original iPhone makes for an ideal starting point to discuss the iPhone X. The device’s anniversary and (debatable) nomenclature beg for a reflection on the past decade and how our lives have changed thanks to smartphones. But that’s not how I feel about the iPhone X.

Like the iPhone 4 did in 2010, the iPhone X follows a highly successful iPhone model and diverges from its established formula with a new design and features that will cause our habits to evolve. The iPhone X doesn’t just add to an existing model – it remixes and changes our expectations of what an iPhone should be altogether. I see more parallels between the iPhone 4 and iPhone X than, say, the original iPhone and the X: selfies becoming portraits, the advent of Retina and now OLED, and, yes, even being left in a bar and “being left” inside an unreleased firmware.

The iPhone X, like the iPhone 4, inaugurates a new direction, with effects that will inevitably ripple through the entire iPhone line. But beyond its striking, sensor-laden display and glass body, the iPhone X is a collection of major themes that will reshape the Apple ecosystem over the next decade.



The Truly Personal Computer

When rumors of Apple abandoning Touch ID in favor of facial recognition appeared earlier this year, I was skeptical. I couldn’t believe the company would already move beyond the technology it had been perfecting over the past four years. But not only has Apple been able to smoothly transition from Touch ID to Face ID in core system features and apps – with the iPhone’s improved biometric framework, the company has set a new standard for what “personal computing” ultimately entails.
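Part of what makes the transition seamless for third-party apps is that biometric authentication on iOS sits behind the LocalAuthentication framework: an app that requests the generic biometrics policy gets Face ID on the iPhone X and Touch ID everywhere else, often with no code changes. A minimal sketch (the reason strings are my own placeholders; a real app also needs an NSFaceIDUsageDescription entry in its Info.plist):

```swift
import Foundation
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check whether biometric authentication is available at all.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    // After canEvaluatePolicy, biometryType reports which sensor backs the
    // policy: .faceID on the iPhone X, .touchID on older devices.
    let reason = context.biometryType == .faceID
        ? "Unlock your journal with Face ID"   // placeholder reason strings
        : "Unlock your journal with Touch ID"

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, authError in
        DispatchQueue.main.async {
            if success {
                // Reveal protected content.
            } else {
                // Biometrics failed or were canceled; offer the passcode fallback.
            }
        }
    }
}
```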

Face ID

I’ve always used and loved Touch ID, even when the first-generation sensor made the original iPad Pro slower to unlock than the iPhone 6s with a second-gen Touch ID. Setting aside the concerns regarding 3D masks and genetic resemblance being able to spoof Face ID[1], I believe Face ID is better than Touch ID for two reasons: it’s a secure biometric feature that doesn’t require physical contact with the iPhone, and it’s also software that improves itself over time.

Since the iPhone 5s, I’ve probably used Touch ID several thousand times, but the sensor never taught itself to recognize my fingerprint after I got out of the shower, washed dishes, or played with my dogs at the beach. In those situations, Touch ID predictably and consistently failed. I was okay with it because I knew what was happening: the sensor’s chief advantage over the passcode (“just put your finger on the Home button”) was also its liability in imperfect reading conditions. Touch ID was (and still is) great at recognizing fingerprints, but, by design, it can’t infer a correct fingerprint unless it can read your skin just right.

With Face ID, these physical limitations are removed from the experience. Your hair may be wet because you just took a shower, and Face ID will still log you into the iPhone X. You may be shopping and holding your iPhone with one hand in a precarious way, and Face ID won’t care. The mechanical act of holding your finger over a sensor is completely gone, and I can’t stress enough how nice (and futuristic) this feels when you just want to use your iPhone without worrying that biometric authentication might fail. What your hands have touched is no longer a variable to account for when unlocking an iPhone. It only takes a couple of days of getting used to Face ID before you look at other Touch ID devices (like iPads) and wish they, too, did away with the physical sensor.

There are trade-offs involved with scanning your face as a means of authentication, of course. But here’s the unique, and potentially game-changing, part of the iPhone X experience: Face ID’s downsides turn into an advantage thanks to machine learning.

Just like our fingerprints, our faces are also subject to changes that may impact the sensor’s ability to recognize us. In the iPhone X’s case, these are visual changes related to our appearance or position in physical space – glasses, beards, hats, facial expressions, head movements and orientation, and so forth. There isn’t much a fingerprint sensor can do when it can’t come in contact with the skin on your finger, but a face? For years, iOS has proven to be more than apt at recognizing people’s faces; Face ID takes it to the next level with the ability to do it in milliseconds, in real time, and gradually improve its understanding of your face.

Face ID builds upon the more capable face recognition APIs introduced in iOS 11[2] through a dedicated, self-contained, self-learning process that uses errors as data points for improvement. According to Apple, when Face ID fails and you have to enter your passcode manually, if the failed scan still hits a certain threshold of familiarity with the enrolled user, the correct passcode acts as confirmation to add the unrecognized scan to Face ID’s data pool – hopefully teaching it to recognize a similar scan in the future.
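To make that loop concrete, here’s a purely illustrative sketch – Apple’s actual pipeline is private and runs inside the Secure Enclave, so every type, name, and threshold below is hypothetical:

```swift
// Purely illustrative — not Apple's implementation. The real model works on
// mathematical representations of the face, not raw similarity scores.
struct FaceScan {
    let similarity: Double // similarity to the enrolled face, 0.0–1.0
}

final class AdaptiveFaceModel {
    private var learnedScans: [FaceScan] = []
    private let unlockThreshold = 0.95 // confident match: unlock outright
    private let learnThreshold = 0.80  // "familiar enough" to learn from

    /// Returns true if the device unlocks via the face scan alone.
    /// `passcodeEntered` asks the user for their passcode after a failure.
    func attemptUnlock(with scan: FaceScan, passcodeEntered: () -> Bool) -> Bool {
        if scan.similarity >= unlockThreshold {
            return true
        }
        // A failed but familiar scan, confirmed by the correct passcode,
        // becomes new training data so a similar scan succeeds next time.
        if scan.similarity >= learnThreshold && passcodeEntered() {
            learnedScans.append(scan)
        }
        return false
    }
}
```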

This is a powerful idea for a core system functionality: it isn’t so much designed in a hardware lab or programmed beforehand as it is an adaptive feature that constantly evolves, discards data, and ingests new scans to learn and grow more accurate. I’ve already started seeing the benefits of this approach after a few weeks with the iPhone X: Face ID recognizes me just fine when my eyes are squinting after I’ve just woken up in the morning, and it authenticates me when my hand is partially covering my right cheek.[3]

With Face ID, the iPhone’s ability to recognize you isn’t affected by what you just did before picking it up; Face ID dynamically adapts to any situation where you might be looking at the screen. Therefore, Face ID’s potential struggles lie partly in hardware (how wide an angle the TrueDepth camera can see) but mostly in software; Apple’s algorithms to train Face ID via machine learning will be the definitive test for this new technology.

Machine learning isn’t new to iOS, but Face ID is the most prominent example of a major new iPhone feature that depends on it to train itself. It’s fascinating to consider how future changes to iOS hardware and software may be imbued with the same principles. For now though, I can only judge the adaptivity of Face ID and its quality over weeks of intense usage. And from that perspective, I’m impressed. In my experience, Face ID has been more reliable than Touch ID (in all kinds of situations in and outside of the house); when Face ID fails, at least I know I’m helping it become smarter by entering my passcode. More than a chore, it almost feels like a lesson I’m imparting to my iPhone.

More importantly, Face ID is nicer than Touch ID. There’s something utterly pleasant about simply looking at an iPhone to securely unlock it that is hard to convey without experiencing it. A month later, its novelty effect hasn’t worn off for me yet.

I don’t perceive Face ID as slower than Touch ID; while it may be technically true that it takes longer for iOS 11 to recognize a face than scan a fingerprint, the integration of Face ID on the Lock screen makes it superior to Touch ID. It comes down to detaching user authentication from a physical point of contact: your hand isn’t busy clicking the Home button, so authentication and the vertical swipe gesture required to leave the Lock screen can happen simultaneously. Face ID will scan your face as you’re swiping up to go to the Home screen; this parallel operation helps make Face ID feel faster than it is because it coexists with touch input. It’s a subtle difference, but more than a user action, Face ID is a layer embedded within the Lock screen.

Speaking of the Lock screen, it’s evident now that Apple was playing the long game with Raise to Wake, introduced in iOS 10 and finally finding its true purpose with the iPhone X. A device that turns on as soon as it’s picked up plays into the idea of absolving users of the need to tap or click to unlock an iPhone. Combined with Face ID, Raise to Wake yields the ideal iPhone unlocking behavior: a smooth, uninterrupted flow that switches from screen turned off to the Home screen with little user intervention in between.[4]

Apple has spent the past couple of years adjusting our Lock screen behaviors, and the goal is clear with Face ID on the iPhone X. Suddenly, the Lock screen raising instead of fading to reveal the Home screen makes sense: the animation was hinting at the iPhone X’s swipe gesture on the Home indicator all along. Same with iOS 11’s Cover Sheet: ditching Notification Center in favor of a quasi-Lock screen environment seemed like an odd choice on older hardware; on the iPhone X, it acts as a positive reinforcement for the vertical swipe gesture. Both features are a good example of how Apple prepares for upcoming hardware changes through software tested at scale several months in advance.

Most of all though, Apple’s effort to turn the Lock screen into a system area where users can spend time interacting with notifications and widgets is paying off with Face ID. Instead of awkwardly placing your finger on the Touch ID sensor without clicking so you can authenticate and stay on the Lock screen, Raise to Wake’s natural activation and Face ID’s invisible authentication make the Lock screen more personal, private, and easier to use.

Locked notifications (left) are expanded with Face ID.

Notification previews are hidden by default on the iPhone X; you have to authenticate with Face ID while actively looking at the screen to expand them. While this option was available on older iOS devices under Settings ⇾ Notifications ⇾ Show Previews ⇾ When Unlocked, unlocking an iPhone has never been as seamless as on the iPhone X; making this option the default now is, I believe, the right move. I love the idea of an always-private Lock screen that fully reveals itself only if I’m looking straight at it. Before Face ID, I had to look at the screen and use my thumb if I wanted to show hidden notifications; now, authentication happens as I look at the display.[5]
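Developers can account for this default, too: with iOS 11’s UserNotifications framework, an app can customize the placeholder text shown on the Lock screen while its previews are hidden. A minimal sketch (the category identifier and placeholder string are my own examples):

```swift
import UserNotifications

// Registers a notification category whose collapsed Lock screen text is
// customized; "%u" is replaced with the number of hidden notifications.
let messages = UNNotificationCategory(
    identifier: "com.example.messages", // hypothetical identifier
    actions: [],
    intentIdentifiers: [],
    hiddenPreviewsBodyPlaceholder: "%u new messages",
    options: [])

UNUserNotificationCenter.current().setNotificationCategories([messages])
```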

The hands-free nature of Face ID has other benefits too. As part of the iPhone X’s Face ID & Passcode settings, you can disable Lock screen features such as Control Center, HomeKit controls, Siri, and widgets while the iPhone is locked. If you enable these privacy features and try to swipe to open the Today view or Control Center, nothing will happen. A stolen iPhone X cannot be quickly put in Airplane Mode if Control Center’s availability is gated by authentication, which is fantastic. These options were available for Touch ID devices as well, but they blend seamlessly with the Lock screen thanks to Face ID.[6]

My bigger point here is that the convenience of Face ID implicitly makes the iPhone X more secure because you’re more inclined to leave its privacy settings turned on. With Touch ID, having to relocate your finger from somewhere onscreen to the Home button was, even if a small one, still a chore; higher privacy came at the expense of ease of use as authentication was physically tied to an offscreen button. Face ID has no such limitations. It’s integrated with the experience so that you don’t think of it as an action that depends on you; there are (almost) no waiting times and no buttons to reach for. And if there’s no “catch” involved with enabling as many privacy features as possible, why not leave them turned on at all times?

My doubts about Face ID disappeared within my first few minutes of iPhone X usage. Millisecond-by-millisecond comparisons with Touch ID miss the point. Even if the recognition process might be technically slower than second-gen Touch ID, it’s not the only step of authentication as a whole; Face ID’s authentication flow often feels faster than Touch ID because it’s easier.

Face ID’s real test will happen over the next several months as its machine learning algorithms adapt to different seasons and associated headwear, facial features, and other variables. We can’t tell now if Face ID’s performance will improve or degrade over a long period of time. So far, I’ve found Face ID to be more convenient and accurate than Touch ID, and also just as fast thanks to the design of the vertical unlock gesture.

With Face ID, I went from being highly skeptical of facial recognition to a believer. I’m also convinced, though, that the system behind Face ID has deeper implications than authentication alone.

Attention Awareness

One such ramification of Face ID is the idea of attention awareness – iOS being able to dynamically adapt certain features depending on whether the user is looking at the screen or not. This wasn’t possible with Touch ID, as pressing a button didn’t necessarily indicate attention; but with persistent face detection, Apple can start dipping their toes into ambient computing, which I believe is the overarching theme around face recognition.

The iPhone X shows the first signs of iOS becoming truly personal through tweaks that adapt to the user based on context and presence. First off, unlocking via Face ID can be done with or without attention – essentially, you can either glance at the iPhone while unlocking it, or let Face ID authenticate without having to confirm you’re looking at the device. The ‘Require Attention for Face ID’ setting makes for an impressive demo in front of other people (I’ve had fun demonstrating how notifications expand on the Lock screen only when I turn my head and look at them), and Apple says that it provides an additional layer of security. I left it enabled on my iPhone X.

Furthermore, the volume of timer alerts is automatically lowered if the user is looking at the iPhone X’s screen; if the iPhone X is on a table and a timer goes off, picking it up and looking at it will instantly reduce the alert’s sound. Similarly, the TrueDepth camera checks for attention before dimming the display. The latter is perhaps one of my favorite smaller changes in iOS for the iPhone X: for years I never enabled display auto-lock because I didn’t want to tap the screen every few minutes to make it stay active; with attention awareness, I can leave auto-lock set to 1 minute and the iPhone will never lock itself as it knows I’m looking at it.[7]

These features show how, besides authentication and security, the TrueDepth camera’s support for attention awareness lets iOS adapt to us. After trying timer alerts on the iPhone X, I want all notification sounds to work this way. Whenever an old iPhone would blast a loud timer alert as I was holding and using the device, I almost wanted to yell “Why won’t you realize I am using you right now?” – finally, the iPhone X makes that small frustration obsolete. I hope Apple will expand this sound and notification behavior to all apps on the system.

I’ve been asking myself which parts of iOS and the iPhone experience could be influenced by attention awareness and redesigned to intelligently fit our context and needs. I don’t think this idea will be limited to Face ID, timers, and auto-lock in the future. What happens, for example, if we take attention awareness further and imagine how an iPhone X could capture user emotions and reactions? TrueDepth could turn into an attention and context layer that might be able to suggest certain emoji if we’re smiling or shaking our heads, or perhaps automatically zoom into parts of a game if we’re squinting and getting closer to the screen. A future, more sophisticated TrueDepth camera system might even be able to guess which region of the display we’re focusing on, and display contextual controls around it. Siri might decide in a fraction of a second to talk more or less depending on whether we’re looking at the screen. Lyrics might automatically appear in the Music app if we keep staring at the Now Playing view while listening to a song.

There’s immense potential around what Apple has built with Face ID and TrueDepth in the iPhone X. I think we’re on the cusp of seeing how devices aware of our presence and state will gain the ability to alter their functionality and anticipate our inputs. Apple is facing multiple challenges here: attention-based features will also have to be powered by machine learning and processed privately and securely on-device; at the same time, the company shouldn’t push too hard on the idea of “face control” for the OS – these features have to be seamlessly integrated in the background of the experience, not become the primary interaction methods of iOS.

I take Apple’s comments on the iPhone X evolving over time at face value.[8] The TrueDepth stack is a perfect example of how the iPhone X might learn to perform tasks we can only imagine today. And the best part is – Face ID and attention awareness may already feel like the future, but they’re only a 1.0.

The Camera as a 3D Lens

While I regarded the Depth API as one of iOS 11’s most exciting developments, it’s TrueDepth that takes the idea of 3D mapping performed by the iPhone’s camera to the next level. Apple is entering the field of real-world 3D visualizations on multiple fronts: Depth API, ARKit, and, on the iPhone X, Animoji. These frameworks are all connected and part of the broader narrative of the iPhone camera growing into an augmented lens through an interplay of hardware and software.

For portrait photography, the depth map generated by the front-facing TrueDepth camera (this includes the dot projector and IR camera) is often superior to the depth estimate calculated by the dual-camera system. Thanks to face-mapping optimizations, this results in Portrait selfies that have more accurate or more pleasing blurred backgrounds than Portrait photos taken with the “better” camera on the back.

For this reason, I’m not surprised that Portrait selfies have been, by far, the best demo of the iPhone X among my friends and family. The jump in quality from FaceTime HD camera selfies to TrueDepth camera portraits is astounding, and, while not perfect by any professional photography standards, they tend to look great both in normal and low-light situations.

Despite the hiccups of Portrait Lighting effects[9], TrueDepth’s mapping of a 3D space for Portrait selfies is remarkable. To understand what the iPhone X’s front-facing camera is able to reconstruct in 3D, I’ve been playing around with Focos, a third-party app that uses iOS 11’s Depth API. Focos’ standout feature is the ability to visualize a photo’s depth map in 3D and rotate a 3D representation of your face to see how multiple layers are stacked to form the final image. This isn’t sci-fi: it’s what the iPhone X’s TrueDepth camera sees today, in its first iteration.
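To give a sense of what apps like Focos build on, here’s a minimal sketch of capturing a photo with depth data from the TrueDepth camera via iOS 11’s AVFoundation Depth API (session permissions and error handling omitted; the class name is my own):

```swift
import AVFoundation

// A minimal TrueDepth capture pipeline; names are illustrative.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // iOS 11 exposes the TrueDepth camera as its own device type.
        guard let camera = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front) else {
            return // no TrueDepth camera on this device
        }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(photoOutput)

        // Depth delivery must be opted into before capturing.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func capturePortrait() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = true
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // depthData is the per-pixel map apps like Focos visualize in 3D.
        guard let depth = photo.depthData?
            .converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32) else { return }
        let map = depth.depthDataMap
        print("Depth map:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```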


Animoji are an extension of the same idea – using TrueDepth to capture more accurate 3D information – only applied to turn your expressions into fun and cute characters.

Like Portrait selfies, Animoji are one of the first iPhone X features people usually want to try, and for good reason: it’s not just that they’re adorable emoji characters – it’s impressive how well a portable device can track your facial movements in real time without requiring any “special mode” or additional calibration. Thanks to integrated face tracking, Animoji aren’t constrained to the preset emotions baked into static emoji; you can make your own animated emoji and share them.
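Under the hood, ARKit exposes this tracking to developers as a dictionary of blend shape coefficients – dozens of values, one per tracked muscle movement, updated every frame. A small sketch of reading a few of them (the logging function is my own; the anchor comes from a face-tracking session like the one shown later in this section):

```swift
import ARKit

// Reads a few of the ~50 blend shape coefficients ARKit tracks per frame.
// `faceAnchor` comes from an ARSession running ARFaceTrackingConfiguration.
func logExpressions(for faceAnchor: ARFaceAnchor) {
    let shapes = faceAnchor.blendShapes // [BlendShapeLocation: NSNumber], 0.0–1.0

    let jawOpen = shapes[.jawOpen]?.floatValue ?? 0
    let smileLeft = shapes[.mouthSmileLeft]?.floatValue ?? 0
    let blinkRight = shapes[.eyeBlinkRight]?.floatValue ?? 0

    // An Animoji-style character maps these coefficients onto its rig:
    // the larger jawOpen gets, the wider the chicken opens its beak.
    print(String(format: "jaw %.2f · smile %.2f · blink %.2f",
                 jawOpen, smileLeft, blinkRight))
}
```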

Some characters in the initial roster are more expressive than others (I don’t understand why the rather dull robot and alien made the cut for launch), but I’ve been having a lot of fun playing with Animoji and sending them as video messages to Silvia and my friends.

The chicken is my spirit animal.

There’s room for improvement in Animoji, both in terms of performance and execution. From a technical standpoint, Animoji don’t work well in low light – characters are jittery when I try to record an Animoji video in my bedroom at night, which makes them look either scared or overly caffeinated. Also, Animoji currently don’t recognize nose movements and, more importantly, a tongue sticking out – omissions that even my non-geek friends noticed when they tried Animoji.

Last, I hope Apple doesn’t limit Animoji to Messages because they want to drive usage of iMessage among iPhone users. I would love to have a standalone Animoji app[10] to record videos longer than 10 seconds, or perhaps a custom keyboard to attach Animoji videos to Twitter and Instagram, or a special Animoji mode in Clips. I’m sure the fun and unexpected uses of Animoji we’ve seen since the iPhone X’s launch will push Apple to consider more ideas for their new animated characters.

It’s intriguing to imagine how Animoji could grow into something bigger than 12 characters in iMessage animated by recognition of 50 different face muscles.[11] I expect Apple to add more characters over time (perhaps turning Animoji updates into another reason to update your iPhone?), but I’m also curious about the possibility of Animoji being available in other iOS apps, as well as avatars (think Mii-like 3D characters) built on the same tech as Animoji and used to represent your profile/activity in social apps, games, and real-time communication.[12]

Third-party developers can also work with the TrueDepth camera system by integrating with the iPhone X-only ARKit face-tracking mode. This API is an addition to the ARKit framework launched with iOS 11, and it uses the TrueDepth camera to provide developers with a face mesh – effectively, a map of face muscle movements tracked in real time with higher performance and accuracy than similar technologies seen before the iPhone X.

MeasureKit’s latest update is an impressive demonstration of this technology – the app can now show you the face mesh captured by the TrueDepth camera in real-time. Once MeasureKit recognizes your face via TrueDepth, you can make different expressions (such as smiling, frowning, shaking your head, laughing) and see how the geometric shapes that make up the face mesh contract and expand to form a 3D representation visualized by ARKit. It’s a bit freaky, but quite the technical feat.
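Rendering the same kind of face mesh takes surprisingly little code with ARKit and SceneKit. A minimal sketch of a view controller that draws the wireframe over your face (the class name and outlet are my own; this isn’t MeasureKit’s actual implementation):

```swift
import UIKit
import ARKit
import SceneKit

final class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView! // wired up in Interface Builder

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires the TrueDepth camera, i.e. an iPhone X.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called once when ARKit anchors your face in the scene.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let device = sceneView.device,
              let mesh = ARSCNFaceGeometry(device: device) else { return nil }
        mesh.firstMaterial?.fillMode = .lines // draw the mesh as a wireframe
        return SCNNode(geometry: mesh)
    }

    // Called every frame so the mesh follows your expressions.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let mesh = node.geometry as? ARSCNFaceGeometry else { return }
        mesh.update(from: faceAnchor.geometry)
    }
}
```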

An example of how the TrueDepth camera works with partially occluded faces. This isn’t unsettling at all.

We’re soon going to see some of the more practical implementations of ARKit face tracking that go beyond fun effects and filters (which Apple itself showcased with Snapchat in September). Warby Parker is using the more precise data returned by TrueDepth and ARKit to suggest glasses that might fit you well, for instance. It’s easy to imagine how eyewear companies, the makeup industry, tattoo artists, and other apparel makers might take advantage of TrueDepth’s ARKit mode just like IKEA and other furniture companies are leveraging standard ARKit to facilitate online shopping.

Another impressive application of depth as captured by the iPhone X’s TrueDepth camera is immersion in virtual scenes rendered by software in real time within the camera view. This is a powerful idea as it turns augmented reality into something closer to virtual reality, where the object (or, in this case, the person) is blended with 360-degree 3D environments without any additional hardware.

Apple’s updated Clips app is a first example of this concept built specifically for the TrueDepth camera. Clips 2.0 offers a new Selfie Scenes mode that puts you in different 3D environments such as a city at night, outer space, Star Wars’ Millennium Falcon, and a room filled with emoji stickers on the wall. Think of it as a modern-day Photo Booth, but fully rendered in 3D, animated, much more accurate thanks to TrueDepth, and, most importantly, presented in 360 degrees.

A TrueDepth camera scene animated by Clips on iOS 11.

You can walk and pan around in Clips’ Selfie Scenes and the iPhone X’s camera won’t lose your position; just looking at the screen gives you the illusion that Japanese cherry blossoms are, in fact, gently swaying behind you.

As with portrait photography or ARKit, Clips’ live Selfie Scenes are in the early stages and exhibit clear shortcomings. Some filters may cut off parts of your head when applying a live background, while others have a less convincing blending between the person in focus and what’s behind them.

However, the fact that a smartphone’s front-facing camera can now put you in animated 360-degree scenes and work well in most cases is astonishing. The results in Clips may be somewhat crude for now, but, in just a couple of years, educational software, 3D photography, games, and other creative apps could vastly benefit from similar techniques. I wouldn’t be surprised to see Selfie Scenes-like TrueDepth filters as third-party extensions for the Camera app in iOS 12.


It’s been a long road since the iPhone 4’s front-facing VGA camera. With TrueDepth on the iPhone X, Apple is going beyond selfies and FaceTime: from authentication and attention awareness to 3D effects and live scenes, the iPhone’s camera is maturing into a lens that brings us into apps and rich virtual experiences.

To discard TrueDepth as a mere enabler of Face ID would be shortsighted; the iPhone X’s camera system feels like one of Apple’s most important innovations in years.


  1. If you’re concerned about other people using masks to unlock your iPhone X, I have the perfect xkcd for you. ↩︎
  2. Which added support for recognizing partially occluded faces and profiles with a higher degree of accuracy and performance. ↩︎
  3. Apparently, this is my “focused” pose when I’m working on a story. I'm doing it right now as I'm editing this footnote. ↩︎
  4. An equivalent of Touch ID’s ‘Rest Finger to Open’ option is the only missing piece: if you don’t care about interacting with notifications on the Lock screen, the iPhone X should offer a setting to instantly open the Home screen after a successful Face ID scan. ↩︎
  5. Hidden notification previews are only a problem when I'm driving and ask my girlfriend to read my missed notifications. I thought about disabling the setting, but I like it so much, I'm asking my girlfriend to check my notifications less often; if she has to because I received an important message, she can just unlock my iPhone with the passcode and open Cover Sheet. ↩︎
  6. Enabling Control Center on the Lock screen only when you authenticate hasn't been an issue for me when it comes to toggling commonly accessed controls. The iPhone X has flashlight and camera buttons at the bottom of the Lock screen, and the Now Playing widget is always displayed above notifications. ↩︎
  7. Both these options are managed by a single setting dubbed 'Attention Aware Features' which is available in Settings ⇾ Face ID & Passcode as well as Settings ⇾ General ⇾ Accessibility ⇾ Face ID & Attention. I’m not sure why Apple replicated the same settings in two places. ↩︎
  8. No pun intended. ↩︎
  9. iOS 11’s Portrait Lighting effects (currently in beta for the iPhone 8 Plus and X) still need a lot of work; the Stage Light ones are particularly prone to artifacts right now, and even though you can make them look good if you know what you’re doing, I feel like Stage Light effects are more an alpha than a beta. ↩︎
  10. An official one, that is. I've been using AnimojiStudio to make my own longer Animoji and #AnimojiKaraoke videos outside of Messages. ↩︎
  11. In the short term – perhaps by iOS 12 next year – I’d like Apple to revamp Animoji with new characters and animation of more face features. ↩︎
  12. An idea already being explored, for example, by Facebook and Oculus with Spaces. ↩︎
