Apple Flexes Its Privacy Muscles

Apple events follow a consistent pattern that rarely changes beyond the details of their particular announcements. This consistency becomes its own language. Attend enough Apple events and you start to pick up the deliberate undertones that the company wants to communicate but not directly express. These are the postures and facial expressions that accompany the words of the slides, demos, and videos.

Five years ago I walked out of the WWDC keynote with a feeling that those undertones were screaming a momentous shift in Apple’s direction, that privacy was emerging as a foundational principle for the company. I laid out my interpretation of Apple’s privacy principles in this piece in Macworld. Privacy had been increasing in importance at Apple for years before that, but that WWDC keynote was the first time the company clearly articulated that privacy not only mattered but was being built into its foundational technologies.

This year I sat in the WWDC keynote, hearing the undertones, and realized that Apple is upping its privacy game to levels never before seen from a major technology company. That is, beyond improving privacy in its own products, the company is starting to use its market strength to extend privacy through the tendrils that touch the Apple ecosystem.

Regardless of Apple’s motivation—altruism, the personal principles of Apple executives, or a shrewd business strategy—Apple’s stance on privacy is unique and historic in the annals of consumer technology. The real question now isn’t if Apple can succeed at a technical level, but if its privacy push can withstand the upcoming onslaught from governments, regulators, the courts, and its competitors.

Apple executives say that they believe that privacy is a human right. History, however, is strewn with the remains of well-intentioned champions of such rights.

Sign in with Apple

When discussing shifts in strategy, whether at Apple or any other technology firm, we should keep in mind that such changes typically start years earlier and are more gradual than we realize. In Apple’s case, the company’s privacy extension efforts date back at least to WWDC 2014, when Apple first started requiring privacy protections for developers who wanted to participate in HomeKit and HealthKit.

The most obvious privacy push to come out of WWDC 2019 is “Sign in with Apple,” which offers benefits to both consumers and developers. Additional WWDC sessions made it clear that Apple is using a carrot-and-stick approach: developers that include competing sign-in offerings from Google and Facebook are required to support the service as well, but, in exchange, they gain built-in fraud prevention. Every Apple ID is already vetted by Apple and secured with two-factor authentication, and Apple provides developers with the digital equivalent of a thumbs-up or thumbs-down indicating whether Apple’s monitoring code thinks the connection is from a real human being. Since Apple uses similar mechanisms for iCloud activity, iTunes, and App Store purchases, the odds are that this is a reliable indicator.
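
To make this concrete, here is a minimal Swift sketch of how a developer might request a sign-in and read that real-user signal through the AuthenticationServices framework. Only the request and the signal check are shown; the delegate object is assumed to exist elsewhere in the app:

import AuthenticationServices

// Kick off a Sign in with Apple request. The delegate (assumed to be
// implemented elsewhere) receives the resulting credential.
func startSignInWithApple(delegate: ASAuthorizationControllerDelegate & ASAuthorizationControllerPresentationContextProviding) {
    let request = ASAuthorizationAppleIDProvider().createRequest()
    request.requestedScopes = [.fullName, .email]

    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    controller.presentationContextProvider = delegate
    controller.performRequests()
}

// In the delegate's success callback, the credential carries Apple's
// thumbs-up or thumbs-down on whether this looks like a real person.
func inspect(credential: ASAuthorizationAppleIDCredential) {
    switch credential.realUserStatus {
    case .likelyReal:
        print("Apple's signals suggest a real human")
    case .unknown:
        print("No determination was made; treat as a normal sign-up")
    case .unsupported:
        print("The signal is unavailable on this platform")
    @unknown default:
        break
    }
}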

Apple also emphasized that Sign in with Apple extends this privacy to the developers themselves, saying that it isn’t Apple’s business to know how developers engage with their users in their apps. Apple serves merely as an authentication provider and collects no telemetry on user activity. This isn’t to imply that Google and Facebook necessarily abuse their authentication services. Google denies any such abuse and also offers features to detect suspicious activity. Facebook, on the other hand, has famously abused phone numbers supplied for two-factor authentication.

The difference between Sign in with Apple and previous privacy requirements within Apple’s ecosystems is that the feature extends Apple’s insistence on privacy beyond the company’s walled garden. Previous requirements—from HomeKit’s data use strictures to App Store rules about how apps can collect and use data—applied mostly to apps running on Apple devices. While this is technically true for Sign in with Apple, practically speaking the implications extend much further.

That’s because, when developers add Sign in with Apple to an iOS app, they will likely also need to add it to their apps on other platforms if they expect their customers to ever use anything other than an Apple device. If they don’t, they will create a confusing user experience (which, I hate to say, we will likely see a lot of). Once users create their accounts for an app with their Apple IDs, there are technical complexities in supporting those same user accounts with alternative login credentials. Thus developers will likely support Sign in with Apple across all their platforms, extending the feature’s inherent privacy beyond Apple’s usual reach.

Intelligent Ad Tracking Prevention

Two other technologies stand out as additional examples of how Apple is extending its privacy regime. The first is an important update to intelligent tracking prevention for advertising: privacy-preserving ad-click attribution, which provides (at least some) privacy in the ugly ad-tracking market. The second technology is HomeKit Secure Video, which offers a new privacy-respecting foundation to video security firms that want to be feature-competitive without dealing with the mess of building their own back-end cloud services.

Let’s look first at Intelligent Tracking Prevention. This Safari feature reduces the ability of services to track users across different Web sites. The idea behind it is that users can and should be able to enable cookies for a trusted site without having additional trackers continue to monitor them through the rest of their browsing to other sites. Cross-site tracking is epidemic, with many sites hosting dozens of trackers. Such tracking is meant to support advertising and to provide one key marketing metric: did an ad lead the user to visit the target site and buy something?

Effective tracking prevention is an existential risk to online advertisers and the sites that rely on advertising for income, but increased scrutiny from Apple (and other browser makers) is almost completely the result of overly intrusive tracking by advertisers. While Intelligent Tracking Prevention (combined with other browser privacy and security features) is the stick, privacy-preserving ad-click attribution is Apple’s carrot. Its method of monitoring clicks allows advertisers to track conversion rates without invading user privacy.

This privacy-preserving ad-click attribution is an upcoming feature of Safari (and a proposed Web standard) that enables the browser to remember ad clicks for 7 days. If a purchase is made within that time period, it is marked as a potential ad conversion. After a semi-random time delay to limit user identification, that conversion is then reported as a delayed ephemeral post to the search or advertising provider using a limited set of IDs that can’t be linked back to the actual user.
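
The WebKit proposal is a Web standard rather than a native API, but the constraints it describes are easy to model. The Swift sketch below is purely illustrative, not Apple’s implementation; the names, the 6-bit ID width, and the 24-48 hour delay reflect my reading of the proposal:

import Foundation

// An ad click the browser remembers, holding nothing but a low-entropy
// campaign ID and a timestamp.
struct StoredAdClick {
    let campaignID: UInt8   // the proposal caps this at 6 bits (0-63)
    let clickedAt: Date
}

// If a purchase follows a remembered click within 7 days, build a report
// containing only two tiny IDs, to be sent after a semi-random delay so
// the report cannot be linked back to a specific user or purchase time.
func conversionReport(for click: StoredAdClick,
                      conversionData: UInt8,
                      now: Date = Date()) -> (payload: [String: UInt8], sendAt: Date)? {
    // Clicks older than 7 days are forgotten and never attributed.
    guard now.timeIntervalSince(click.clickedAt) < 7 * 24 * 3600 else { return nil }

    // The payload contains only two low-entropy IDs, nothing user-identifying.
    let payload = ["campaign": click.campaignID & 0x3F,
                   "conversion": conversionData & 0x3F]

    // A semi-random 24-48 hour delay decouples the report from the purchase.
    let delay = TimeInterval.random(in: (24 * 3600)...(48 * 3600))
    return (payload, now.addingTimeInterval(delay))
}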

By building a privacy-preserving advertising technology into the second-most popular Web browser on the planet (Safari’s market share is about 15%, behind Google Chrome with 63%) and by making it an open standard, all while making Herculean efforts to block invasive forms of tracking, Apple is again leveraging its market position to improve privacy beyond its walls. What’s most interesting about the technology is that, unlike Sign in with Apple, it improves user privacy without completely disrupting the business model of Apple’s advertising-driven competitors like Google and Facebook. Those companies can use Apple’s technology and still track ad conversions, and Apple still supports user-manageable ad identifiers for targeted advertisements.

HomeKit Secure Video

As I said above, HomeKit Secure Video is another technology with which Apple is extending its privacy push. Coming in macOS 10.15 Catalina, iOS 13, and iPadOS, it provides HomeKit security cameras with a privacy-preserving update. I’m a heavy user of such cameras myself, even though they are only marginally useful at preventing crime. Nearly all home security camera systems, including my Arlo cameras, record their video directly to cloud-based storage (see “The HomeKit-Compatible Arlo Baby Security Cam Is Not Just for Parents,” 3 September 2018). Cloud storage is a feature you generally want in order to avoid the risk of having bad guys steal your security footage, as happens so often on popular crime shows. Security camera companies also use cloud processing to identify people, animals, and vehicles, and to offer other useful features. Like many customers, I’m not thrilled that these companies also have access to my videos, which is one reason none of their cameras run inside my home when anyone in my family is present.

HomeKit Secure Video will send encrypted video from supported cameras to iCloud, where it’s stored, for free, for 10 days without impacting your iCloud storage limits. If you have an Apple TV or iPad on your network, it will use that device for machine learning analysis and image recognition instead of performing any analysis in the cloud. This is an interesting area for Apple to step into: it certainly doesn’t seem like the sort of thing that would drive profits since Apple doesn’t sell its own cameras, and security camera support isn’t a motivator when customers decide to purchase a phone or tablet. It’s almost as though some Apple executives and engineers were personally creeped out by the lack of privacy protection for existing security camera systems and said, “Let’s fix this.”

HomeKit Secure Video opens the security video market to a wider range of competitors while protecting consumer privacy. It is a platform, not a product, and it eliminates the need for manufacturers to build their own back-end cloud service and machine learning capabilities. Companies using the platform will experience less friction when they bring a product to market, and it simultaneously allows them to provide better user privacy.

Apple Created a Culture of Privacy, but Will It Survive?

These are just a few highlights that demonstrate Apple’s extension of privacy beyond its direct ecosystem, but WWDC featured even more privacy-related announcements.

Apple continues to expand existing privacy features across all its platforms, including the new offline Find My device tracking tool (see “How Apple’s New Find My Service Locates Missing Hardware That’s Offline,” 21 June 2019). Having seen how some apps abuse Wi-Fi and Bluetooth data for ad hoc location tracking, Apple now blocks app access in iOS to such data unless it’s needed as a core feature. Users now can also track the trackers and see when even approved apps accessed their location.

Then there’s the upcoming Apple credit card, which is the closest thing we can get to a privacy-respecting payment option. Even speech recognition is getting a privacy polish: developers will soon be able to mandate that speech recognition in their apps runs on-device, without voice data ever being exposed to the cloud. In fact, Apple dedicated an entire WWDC session to examples of how developers can adopt Apple’s thinking to improve privacy within their own apps.
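
In the iOS 13 Speech framework, that mandate appears as a flag on the recognition request. A minimal sketch, assuming speech-recognition authorization has already been granted (the audio file URL is a placeholder):

import Speech

func recognizeLocally(audioFile: URL) {
    // Not every locale or device supports on-device recognition yet.
    guard let recognizer = SFSpeechRecognizer(),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechURLRecognitionRequest(url: audioFile)
    // With this flag set, audio never leaves the device: if on-device
    // recognition can't be performed, the request fails rather than
    // falling back to Apple's servers.
    request.requiresOnDeviceRecognition = true

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}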

During John Gruber’s The Talk Show Live, Craig Federighi said that Apple’s focus on privacy started back in its earliest days, when the company was founded to create “personal” computers. Maybe it did, maybe it didn’t, but Apple certainly didn’t build a real culture of privacy (or any technical protections) until the start of the iPhone era. When Microsoft launched its highly successful Trustworthy Computing Initiative in 2002 and reversed the company’s poor security record, one of its founding principles was “Secure by Design.” During Apple’s developer-focused Platform State of the Union session, privacy took center stage as Apple talked about “Privacy by Design.”

Apple and other tech firms have already run into resistance when building secure and private devices and services. Some countries, including Australia, are passing laws to break end-to-end encryption and require device backdoors. US law enforcement officials have been laying the groundwork for years to push for laws that permit similar access, even while knowing it would then be impossible to guarantee device security (see “Apple and Google Spark Civil Rights Debate,” 10 October 2014). China requires Apple and other non-Chinese cloud providers to hand over their data centers to Chinese companies who can then feed information to the government. Apple’s competitors aren’t sitting idly by, with Google’s Sundar Pichai muddying the waters in a New York Times opinion piece that equates Google security with privacy and positions Apple’s version of privacy as a luxury good. While Google’s security is the best in the industry, equating that security with the kind of privacy that Apple offers is disingenuous at best.

The global forces arrayed against personal privacy are legion. Advertising companies and marketing firms want to track your browsing and buying. Governments want to solve crimes and prevent terrorism whatever the cost. Telecommunication providers monitor all our Internet traffic and locations, just because they can. The financial services industry is sure our data is worth something. And even grocery stores can’t resist offering minor discounts if you just let them correlate all your buying to your phone number. While, theoretically, we have a little control over some of this tracking, practically speaking we have essentially no control over most of it, and even less insight into how our data is used. It’s a safe bet that many of these organizations will push back hard against Apple’s privacy promotion efforts and, by extension, against any of us who care about and want to control our own privacy.

Calling privacy a fundamental human right is as strong a position as any company or individual can take. It was one thing for Apple to build privacy into its own ecosystem, but as it extends this privacy outside its ecosystem, we have to decide for ourselves whether we consider these protections meaningful and worthy of support. I know where I stand, but I also recognize that privacy is a highly personal concept, and I shouldn’t assume that a majority of the world feels the same as I do, or that Apple’s efforts will survive the challenges of the next decades.

It’s in our hands now.


Comments About Apple Flexes Its Privacy Muscles

Notable Replies

  1. Great article. I forwarded it to many, stressing that security is not privacy.
    Facebook might get to having a secure service, but their business model would put them into bankruptcy if they provided privacy.

  2. Trust in Apple’s privacy protections is one of the main reasons why we stick with Apple.

  3. All of Apple’s laudable privacy moves live in the huge shadow of the billions that Google pays Apple every year to keep Google as Safari’s default search engine.

  4. Apple’s Intelligent Tracking Prevention is frequently updated to prevent Google and others from exceeding very strict limits on tracking. Rich Mogull’s excellent article in this week’s TidBITS has a summary of how ITP strictly limits tracking:

    https://tidbits.com/2019/07/12/apple-flexes-its-privacy-muscles/

    There’s an analysis here about how Safari’s privacy measures are frequently updated and how they negatively impact advertising. Keep in mind that the companies include Alphabet/Google, which is spending $12 billion this year to continue to be Safari’s search engine, up from $9 billion in 2018:

    https://fortune.com/2018/09/29/google-apple-safari-search-engine/

    There is no way Google would be shelling out increasingly big bucks every year to be the search engine for a browser that not only limits the amount of data it collects but also greatly limits the time in which that data is available to advertisers, unless it were a sure thing it would be a highly profitable investment. They are basically just acquiring traffic, not accumulating and storing huge amounts of tracked data from Safari. This greatly limited and constricted data is likely to be worth well in excess of double the amount Google is paying for it. Look at the Safari market share numbers that Rich discusses in his article.

    Here’s the scoop from WebKit on the latest iteration of ITP. Keep in mind that over the last few years Apple first eliminated third-party cookies, then crippled cross-site tracking by cutting the cookie window from a week to a day, which has pretty much made cookie tracking crumble:

    https://webkit.org/blog/8828/intelligent-tracking-prevention-2-2/

  5. I have very strong feelings about Apple security. First, I find it too invasive and “Big Brother”. For instance, it does not allow me to turn off 2FID once it is turned on. When I am doing account maintenance, it is constantly asking me for verification, which is annoying and time-consuming. As such, it continues to enforce 2FID on me whether I want it or not. As a customer, I should be allowed to control my own machines.

    Personally, I regard 2FID as a cost-saving method: it lets companies avoid the cost of beefing up their own security by putting the security responsibilities on the backs of their customers. By blaming the customers for any security breaches, the companies can avoid any responsibility for customer hardships that could lead to lawsuits.

    Based on personal experience, they also make user security decisions for the customer without notifying the user, resulting in frustration and significant loss of time. Recently, for some reason still unknown, they blocked me from sharing calendars from user accounts, all contained locally on my own computer, for which I am the sole user and system admin, with the root account activated. They did this without any notice. I spent many hours troubleshooting, finally giving up and calling support. Neither first-level nor senior support had any idea what was going on. Only after escalating it to engineering did an agent report back to me that Apple had decided to put a block on my machine to prevent me from sharing my own calendars. It took over a week from the time I contacted Apple to have it removed. I still don’t know why they did it or how to prevent it in the future.

    Finally, whenever I sign in to Apple ID or iCloud, it wants me to verify my identity by sending me an email with a code, and a window pops up asking me to enter the code on the same screen from which I was trying to log in. All that does is verify that I can receive mail on the same machine that I use to access the services. There is no attempt to actually confirm that I am that person using any personal information. If I am going to attempt to get into iCloud or Apple ID with my username and password, then it’s obvious I will be able to receive email on that same machine via Mail. All I need to do is set up an account in Mail, or, if the machine is stolen, the account is already there to let me enter the verification code. Total nonsense, and annoying in my opinion, as it does little to secure my identity.

    In my opinion, it is past time that Apple took the steps and spent the money to secure its own systems and stopped shifting the responsibility onto the backs of its users, along with ceasing to act as “Big Brother” and allowing users to determine the security levels their accounts and machines require for their personal protection.

  6. If you have better ideas, I’m all ears, and I’m sure Apple would like to hear them as well.

    Then you clearly don’t understand the purpose of 2FID. It has nothing to do with protecting your computer. Only physical protection measures, the lock screen, and use of Find My… are useful for that. When Apple wants to verify that it’s you using an Apple ID, it allows all trusted platforms to produce a code that can be used for such verification. If you have told Apple that the platform from which you are attempting iCloud, email, etc. access is trusted, then that machine should be able to produce a validation code. If you are using the hotel business center machine, then it’s not trusted, and you will need some other device to produce the code.

    If your machine is stolen and the thief has your username and password, then you have much bigger problems.

  7. Unfortunately, there now seem to be two Apples: the Apple before Steve Jobs passed and the Apple after. As far as the current Apple listening to users, that seems to have happened only in a bygone era. Now the only thing Apple seems to pay attention to is stockholders clamoring for more dividends. Unless threatened with a lawsuit, or tens of thousands give ‘feedback’ on a topic, most user input now seems to be consigned to the ‘bit bucket’, as evidenced by mobile products now costing upwards of four figures, and a fully equipped new Mac Pro desktop being essentially a closed system, due to unique Apple connectors for RAM, SSDs, and GPUs, and costing as much as a luxury automobile.

    The obvious suggestion would be to have Apple beef up its own systems and protect its users so that they would not need to use 2FID. But that costs money and takes diligence, something that Apple and many other companies are not willing to invest in. 2FID is a very cost-effective substitute.

    If Apple already knows that the code is coming from a trusted machine, why send the code to that machine at all? It only creates an annoying effort for the user to verify something Apple is already aware of. If they wish to verify the user, why not ask for a PIN code that the user has memorized or otherwise secured, and that only the user and some data field on a server can access during the verification process?


  8. jweil wrote on July 16:

        I have very strong feelings about Apple security. First, I find it too invasive and “Big Brother”. For instance, it does not allow me to turn off 2FID once it is turned on. When I am doing account maintenance, it is constantly asking me for verification, which is annoying and time-consuming. As such, it continues to enforce 2FID on me whether I want it or not. As a customer, I should be allowed to control my own machines.

    2FID is one of the security and privacy protections I greatly appreciate from Apple. If someone gets their hands on any one of our many personal or business Mac or iOS devices, it is extremely unlikely they would be able to gain access to any of our information, even if they can guess my password. I’ll bet Jennifer Lawrence and the other movie, TV, and pop stars whose accounts were compromised via one-factor sign-in, and whose nude photos they assumed would be private were splashed all over the Internet, were very happy when Apple implemented 2FID.

    Security isn’t usually the most convenient option; in fact, I think it’s almost always not. Even though it might be a PITA and take a little more time, IMHO, security is well worth it. Even though I hate the long lines and checks I have to go through at airports, I know how important they are. I live in NYC and still feel the effects of 9/11.

        Personally, I regard 2FID as a cost-saving method: it lets companies avoid the cost of beefing up their own security by putting the security responsibilities on the backs of their customers. By blaming the customers for any security breaches, the companies can avoid any responsibility for customer hardships that could lead to lawsuits.

    2FID was costly to develop and is costly to maintain. Apple would probably have a significantly larger stash of hundreds of billions in cash if it had never developed it. Apple’s implementation has been popular and caused Google to scramble to build an option it advises turning on in Android, even though it could make tracking difficult for Google in certain circumstances. IIRC, Microsoft also rushed to develop something that’s not quite like 2FID. It sounds to me like Google and MS built “light” versions, and Google made its version optional to minimize any impact it might have on data collection.

    I’m not a lawyer or anything resembling one, but as far as I have read or seen, there have been no lawsuits like what you described against Android or Microsoft, whose 2FID is not as strict as Apple’s.

  9. I have no issue with high security; in fact, I endorse it. My issue is the way it has been implemented, making life more difficult and frustrating for users. As your examples indicate, some individuals require high security, but many others do not, do not feel they need it, or just don’t care about their privacy and security. As such, the level of security should be the choice of the user/customer, so long as it does not compromise the security of others who wish to have high security. Companies should not be acting as the police or “Big Brother”, enforcing protocols on customers who do not wish to use them. That said, there is now technology that makes 2FID archaic. This includes voice printing, retina scans, facial recognition (most computers have cameras and mics), fingerprints (most computers now have touch capability or it can be added), finger codes from pressing keypads, flash drive keys, etc. Many of these are passive and do not require user interaction, avoiding the multiple devices, lost time, and frustration.

  10. Al Varnell already posted an excellent and concise explanation.

    Apple has a history of investing very significant sums to develop state-of-the-art security measures like Touch ID and Face ID that its competitors are still scrambling to keep up with. It leads the competition in security and privacy, and could even be working on something other than 2FID for all we know.

  11. Obviously, the reality distortion field was working well.

    The defining attribute of Steve J running Apple was that HE knew what the customer wanted. And the customer did not.
