Apple’s new sign-in button is built for a post-Cambridge Analytica world

Protection against developers, not hackers

In 2019, Facebook isn’t just a homepage; it’s a passport. As mobile apps look for an alternative to passwords, big networks like Facebook and Google have become login services, letting a single login sign you into dozens of different apps. As long as Facebook or Google will vouch for you, most apps are happy to take them at their word, using open protocols like OAuth to verify the login. In return, the big companies find out what you’re logging into and when. It’s a good deal for apps trying to avoid the friction of a sign-up process, and one of the many ways major tech companies have made themselves indispensable — or inescapable, depending on your perspective.
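
To see why the login networks learn so much, it helps to look at what an OAuth-style request actually contains. The sketch below is illustrative rather than drawn from any particular provider's SDK: the endpoint, client ID, and redirect URI are placeholders, but the shape of the request is what tells the provider exactly which app you're signing into.

```swift
import Foundation

// Illustrative OAuth 2.0 authorization request of the kind behind a
// "Continue with Facebook" or "Sign in with Google" button. All values
// here are placeholders; real apps get them from the provider's console.
var authorize = URLComponents(string: "https://accounts.provider.example/oauth2/authorize")!
authorize.queryItems = [
    URLQueryItem(name: "client_id", value: "example-app-id"),            // identifies the app to the network
    URLQueryItem(name: "redirect_uri", value: "exampleapp://callback"),  // where the user is sent after approving
    URLQueryItem(name: "response_type", value: "code"),                  // ask for an authorization code
    URLQueryItem(name: "scope", value: "openid email profile"),          // the data the app wants back
    URLQueryItem(name: "state", value: UUID().uuidString),               // guards against request forgery
]

// The app opens this URL in a browser or web view. Because client_id is in
// the request, the provider knows which app the login is for, and when.
print(authorize.url!)
```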

At its Worldwide Developers Conference this week, Apple threw a wrench into that system. Apple is introducing its own single sign-on (SSO) service, a direct competitor to the services offered by Google and Facebook. The new service is aimed at paring back data collection, with only minimal data shared with the app and a promise to quarantine any data Apple collects so it can't be used for other purposes. More importantly, the button will be mandatory for any iOS app that offers single sign-on, putting Apple's option in front of users everywhere Google's and Facebook's already appear.

That might seem like an odd move from a hardware company, but Apple has made an explicit push toward web services in recent years, with a particular focus on privacy. The new sign-in button sits comfortably alongside iMessage's focus on encryption and Safari's push against third-party tracking, all part of Apple's broader vision of itself as a cleaner and more controlled alternative to the rest of the tech world. Unlike iMessage, the new system won't be restricted to iPhone users. It will be available on Android and in web browsers, too, which means there's less concern about lock-in than you might think. It also means the system could reach more users than any previous effort, aiming for internet-wide scale in a way that few Apple products do. But unlike cookie-blocking or encryption, this latest move is targeted at legitimate software as much as hostile intruders. The people losing data from this change won't be hackers or third-party ad networks, but apps you've purposefully installed on your phone and networks you've chosen to join. It's a product of the growing scope of privacy concerns in the wake of Cambridge Analytica, and it's a sign of just how much tech infrastructure needs to be rebuilt as our expectations of privacy change.

So what does Apple’s new system actually do? The Apple SSO system isn’t exactly OAuth — the open standard used by Google and Facebook — but Apple says it’s OAuth-like, letting third parties verify a login as authentic while protecting against man-in-the-middle attacks. But where intermediaries would typically pass along the email address associated with an account, Apple’s new system can hand the app a relay address generated specifically for that service, standing in for your real email. In essence, Apple is adding an extra intermediary step, making sure the app doesn’t know your email and that third parties can’t combine data to get a picture of your activity across multiple apps. To the extent that data is leaking out through the sign-in process, this plugs the biggest leak.
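
For a sense of what this looks like from the developer's side, here is a rough sketch against the AuthenticationServices framework Apple previewed alongside the feature. Treat it as a sketch of the announced API rather than final code: the relay behavior is described in the comments as Apple has characterized it, and details may shift before the system ships.

```swift
import AuthenticationServices

// Sketch of requesting Apple's new credential from an iOS app,
// based on the AuthenticationServices API previewed at WWDC 2019.
final class AppleSignInCoordinator: NSObject, ASAuthorizationControllerDelegate {
    func start() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email]   // the only data an app can ask for

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self   // presentation context omitted for brevity
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        guard let credential = authorization.credential as? ASAuthorizationAppleIDCredential else { return }
        // credential.user is a stable, app-specific identifier.
        // credential.email can be a per-app relay address rather than the
        // user's real inbox, so the app never learns the underlying email.
        // credential.identityToken is a signed token the app's server can
        // verify, OAuth-style, to confirm the login is genuine.
        print(credential.user, credential.email ?? "(email withheld)")
    }
}
```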

Still, it’s not clear how much data was actually leaking out that way. If you’re concerned about Google and Facebook knowing what apps you use, the technical situation hasn’t changed much. Apple will still know which apps you’re logging into and when. (It has to in order to operate the system.) The company has promised to stovepipe the information internally, but all you’ve really done is transfer your trust from Google to Apple, like switching from Gmail to iCloud.

What’s more affected is the data flowing from the larger networks (Google, Facebook, and now Apple) to the apps themselves. In most cases, the information flowing to the app is fairly straightforward — a person’s name, their email, sometimes their avatar — but with most of that data moving in the background, it can be hard to tell exactly what’s changing hands. Most of the data scandals of the past few years have involved tech companies failing to protect those data flows, whether it’s Cambridge Analytica or plug-ins reading through your Gmail. In theory, Facebook has since locked down its API so apps can no longer harvest social graph data through a login, but after so many broken promises, it’s hard to take the company at its word. Apple’s system would bake that protection into the protocol itself, taking an openly adversarial stance toward any data shared with outside apps.
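
To make that background exchange concrete, here's an illustrative look at the kind of basic profile request an app can make after a conventional Facebook login. The access token is a placeholder and the fields shown are the typical basics; what actually comes back depends on the permissions the user granted.

```swift
import Foundation

// Illustrative Graph API call for the profile data an app typically
// receives after a Facebook login. The access token is a placeholder.
var profile = URLComponents(string: "https://graph.facebook.com/me")!
profile.queryItems = [
    URLQueryItem(name: "fields", value: "id,name,email,picture"),   // name, email, avatar
    URLQueryItem(name: "access_token", value: "USER_ACCESS_TOKEN"),
]

URLSession.shared.dataTask(with: profile.url!) { data, _, _ in
    // The response is JSON containing only the requested fields; the point
    // is that this exchange happens in the background, out of the user's view.
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body)
    }
}.resume()
```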

Again, this might seem strange: Apple is tightening the reins on developers at a developer conference, of all places. But it’s one of the first privacy measures that seems to grasp the norms of how privacy works after the Cambridge Analytica scandal, which put the focus on abusive apps that users had willingly installed. It’s not enough to make sure users know what they’re getting into when they install an app. Platforms are expected to monitor and control all the ways partners could be abusing their privileges, which will mean rearchitecting how many of those partnerships work.

Of course, single sign-on alone won’t stop the next Cambridge Analytica. Addressing the true problem means changing the way apps interact with the data on your phone and changing the way personal info moves between third parties online. We’re already starting to make those changes with tighter App Store policies and regulations like the General Data Protection Regulation. But if you’re trying to build a world in which networks are keeping a closer eye on third-party software, Apple’s new sign-in button might be exactly what you need.