Apple events follow a very consistent pattern, which rarely changes beyond the details of the content. This consistency has gradually become its own language. Attend enough events and you start to pick up the deliberate undertones Apple wants to communicate, but not express directly. They are the facial and body expressions beneath the words of the slides, demos, and videos.

Five years ago I walked out of the WWDC keynote with a feeling that those undertones were screaming a momentous shift in Apple’s direction. That privacy was emerging as a foundational principle for the company. I wrote up my thoughts at Macworld, laying out my interpretation of Apple’s privacy principles. Privacy had been growing in importance at Apple for years before that, but that WWDC keynote was the first time they so clearly articulated that privacy not only mattered, but was being built into foundational technologies.

This year I sat in the WWDC keynote, reading the undertones, and realized that Apple was upping their privacy game to levels never before seen from a major technology company. That beyond improving privacy in their own products, the company is starting to use its market strength to push privacy out through the tendrils that touch the Apple ecosystem.

Regardless of motivations – whether it be altruism, the personal principles of Apple executives, or simply shrewd business strategy – Apple’s stance on privacy is historic and unique in the annals of consumer technology. The real question now isn’t whether they can succeed at a technical level, but whether Apple’s privacy push can withstand the upcoming onslaught from governments, regulators, the courts, and competitors.

Apple has clearly explained that they consider privacy a fundamental human right. Yet history is strewn with the remains of well-intentioned champions of such rights.

How privacy at Apple changed at WWDC19

When discussing these shifts in strategy, at Apple or any other technology firm, it’s important to keep in mind that the changes typically start years before outsiders can see them, and are more gradual than we can perceive. Apple’s privacy extension efforts started at least a couple years before WWDC14, when Apple first started requiring privacy protections to participate in HomeKit and HealthKit.

The most important privacy push from WWDC19 is Sign In with Apple, which offers benefits to both consumers and developers. In WWDC sessions it became clear that Apple is using a carrot-and-stick approach with developers: the stick is that App Review will require support for Apple’s new service in apps that offer competing sign-in options from Google and Facebook, but in exchange developers gain Apple’s strong account security and fraud prevention. Apple IDs are vetted by Apple and secured with two-factor authentication, and Apple gives developers the digital equivalent of a thumbs-up or thumbs-down on whether the request is coming from a real human being. Apple uses the same mechanisms to secure iCloud, iTunes, and App Store purchases, so that signal should be a reliable one.
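To give a sense of what adoption looks like, here is a minimal sketch of the iOS flow using the AuthenticationServices framework. The SignInCoordinator type is illustrative, and a real app would also set a presentation context provider and verify the identity token on its own servers.

```swift
import Foundation
import AuthenticationServices

// Illustrative coordinator for the basic Sign In with Apple flow.
final class SignInCoordinator: NSObject, ASAuthorizationControllerDelegate {

    func startSignIn() {
        // Request only the scopes the app actually needs.
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email]

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        // A real app also sets controller.presentationContextProvider here.
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        guard let credential = authorization.credential as? ASAuthorizationAppleIDCredential else { return }

        // A stable, app-specific identifier vetted by Apple.
        let userID = credential.user

        // Apple's fraud signal: a coarse "likely a real person" verdict,
        // not a behavioral profile of the user.
        switch credential.realUserStatus {
        case .likelyReal:
            print("Apple vouches that \(userID) is probably a real person")
        case .unknown, .unsupported:
            print("No strong signal for \(userID); apply your own fraud checks")
        @unknown default:
            break
        }
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithError error: Error) {
        print("Sign In with Apple failed: \(error.localizedDescription)")
    }
}
```

The realUserStatus check is the thumbs-up or thumbs-down mentioned above: Apple shares a verdict, not the data behind it.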

Apple also emphasized that they extend this privacy to developers themselves: it isn’t Apple’s business to know how developers engage with users inside their apps. Apple serves as an authentication provider and collects no telemetry on user activity. This isn’t to say that Google abuses its authentication service; Google denies such accusations and offers features to detect suspicious activity. Facebook, on the other hand, famously abused phone numbers supplied for two-factor authentication, as well as a wide variety of other user data.

The difference between Sign In with Apple and previous privacy requirements within the iOS and Mac ecosystems is that the feature extends Apple’s privacy reach beyond its own walled garden. Previous requirements, from HomeKit to data usage limitations on apps in the App Store, really only applied to apps on Apple devices. This is technically true for Sign In with Apple, but practically speaking the implications extend much further.

When developers add Apple as an authentication provider on iOS, they also need to add it on other platforms if they expect customers to ever use anything other than Apple devices. Either that, or support a horrible user experience (which, I hate to say, we will likely see plenty of). Once you create your account with an Apple ID, there are considerable technical complexities to supporting non-Apple login credentials for that account. So providers will likely support Sign In with Apple across their platforms, extending Apple’s privacy reach beyond its own platforms.

Beyond sign-in

Privacy permeated WWDC19 in both presentations and new features, but two more features stand out as examples of Apple extending its privacy reach: a major update to Intelligent Tracking Prevention paired with a new approach to web advertising, and HomeKit Secure Video. Privacy-preserving ad click attribution is a surprisingly ambitious effort to drive privacy into the ugly user- and advertising-tracking market, and HomeKit Secure Video offers a new privacy-respecting foundation for video security firms that want to be feature-competitive without the mess of building (and securing) their own back-end cloud services.

Intelligent Tracking Prevention is a Safari feature that reduces the ability of services to track users across websites. The idea is that you can and should be able to enable cookies for one trusted site without having additional trackers monitor you as you browse to other sites. Cross-site tracking is endemic to the web, with typical sites embedding dozens of trackers. This is largely to support advertising and answer a key marketing question: did an ad lead you to visit a target site and buy something?

Effective tracking prevention is an existential risk to online advertisements and the sites that rely on them for income, but this is almost completely the fault of overly intrusive tracking companies. Intelligent Tracking Prevention (combined with other browser privacy and security features) is the stick, and privacy-preserving ad click attribution is the corresponding carrot: it promises to let advertisers measure conversion rates without violating user privacy. Planned as an upcoming Safari feature and proposed as a web standard, it has the browser remember ad clicks for seven days. If a purchase is made within that window, it is considered a potential ad conversion (a sale) and reported to the search or advertising provider as a delayed, ephemeral post, using a limited set of IDs too coarse to be linked to an individual user and sent after a random time delay to further frustrate identification.
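To make the mechanics concrete, here is a conceptual sketch of that scheme. It is not the WebKit implementation; the field types and the 24-to-48-hour delay are illustrative assumptions layered on the seven-day window and coarse identifiers described above.

```swift
import Foundation

// Conceptual model only: coarse identifiers plus a randomized, delayed report.
struct StoredAdClick {
    let sourceSite: String      // the search or social site that served the ad
    let destinationSite: String // the advertiser's site
    let campaignID: UInt8       // far too few possible values to identify a user
    let clickedAt: Date
}

struct AttributionReport {
    let campaignID: UInt8
    let conversionValue: UInt8  // an equally coarse "what kind of purchase" bucket
    let sendAfter: Date         // the randomized delay frustrates timing correlation
}

func attribute(click: StoredAdClick,
               conversionValue: UInt8,
               now: Date = Date()) -> AttributionReport? {
    // The browser only remembers the click for a limited window (seven days).
    let window: TimeInterval = 7 * 24 * 60 * 60
    guard now.timeIntervalSince(click.clickedAt) <= window else { return nil }

    // Delay the ephemeral report by a random interval (assumed 24-48 hours here)
    // so the provider cannot tie it back to an individual browsing session.
    let delay = TimeInterval.random(in: (24 * 3600)...(48 * 3600))
    return AttributionReport(campaignID: click.campaignID,
                             conversionValue: conversionValue,
                             sendAfter: now.addingTimeInterval(delay))
}
```

The point of the sketch is the shape of the data: the provider learns that some campaign converted, but not who converted or exactly when.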

By providing a privacy-preserving advertising technology inside one of the most important and popular web browsers, then offering it as an open standard, all while making herculean efforts to block invasive tracking, Apple is again leveraging its market position to improve privacy and extend its reach. What’s most interesting is that unlike Sign In with Apple, this improves privacy without directly attacking their advertising-driven competitors’ business models. Google can use this same technology and still track ad conversions, and Apple still supports user-manageable ad identifiers for targeted advertisements, accepting less user data to provide better privacy. Of course, a cynic might ask whether more accurate conversion metrics would hurt advertisers who inflate their numbers.

HomeKit security cameras also get a privacy-preserving update with macOS Catalina and iOS 13. I’m a heavy user of security cameras myself, even though they are only marginally useful for preventing crime. Nearly all these systems record to the cloud (including my Arlo cameras). This is a feature customers want, clearly demonstrated by the innumerable crime shows where criminals steal the tapes. The providers also use cloud processing to distinguish people from animals from vehicles and to offer other useful features. But like many customers, I’m not thrilled that providers also have access to my videos, which is one reason none of them run inside my house when anyone is home.

HomeKit Secure Video will encrypt video end-to-end from supported cameras into iCloud, where recordings are kept free for 10 days without consuming iCloud storage capacity. If you have an Apple TV or iPad on your home network, it will perform the machine learning analysis and image recognition locally instead of in the cloud. This is an interesting area for Apple to step into. It doesn’t seem likely to drive profits, because Apple doesn’t sell its own cameras and security camera support is hardly driving phone and tablet brand choices. It’s almost as if some Apple executives and engineers were personally creeped out by the lack of privacy protection in existing camera systems and said, “Let’s fix this.”

The key to HomeKit Secure Video is that it opens the security video market to a wider range of competitors while protecting consumer privacy. This is a platform, not a product, and it removes manufacturers’ need to build their own back-end cloud service and machine learning capabilities. Less friction to market with better customer privacy.

Apple created a culture of privacy, but will it survive?

These are only a few highlights to demonstrate Apple’s extension of privacy beyond its direct ecosystem, but WWDC was filled with examples. Apple continues to expand privacy features across all their platforms, including the new Find My app, which can locate devices even when they are offline. iOS now blocks access to Wi-Fi and Bluetooth data unless it is required as a core app feature, since Apple noticed it being abused for location tracking. Users can also now track the trackers, seeing when and where approved apps accessed their location. The upcoming Apple credit card is the closest thing we are likely to see to a privacy-respecting payment option. Developers will soon be able to mandate that speech recognition in their apps runs on-device, never exposed to the cloud (see the sketch below). Privacy enhancements permeate Apple’s upcoming updates, and that’s before we hear anything about new hardware. Apple even dedicated an entire WWDC session not only to its own updates, but also to examples of how developers can adopt Apple’s thinking to improve privacy within their own apps.
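As a rough example of that last developer-facing point, here is what the on-device requirement looks like through the Speech framework on iOS 13. The helper function name is mine, and audio capture and permission prompts are omitted.

```swift
import Speech

// Minimal sketch: keep speech recognition on-device (iOS 13 and later).
func makeOnDeviceRequest(for recognizer: SFSpeechRecognizer) -> SFSpeechAudioBufferRecognitionRequest? {
    guard recognizer.supportsOnDeviceRecognition else {
        // Older devices or unsupported languages cannot guarantee local processing.
        return nil
    }
    let request = SFSpeechAudioBufferRecognitionRequest()
    // With this flag set, the audio is never sent to Apple's servers for recognition.
    request.requiresOnDeviceRecognition = true
    return request
}
```

One flag, and a whole category of cloud exposure disappears, which is very much in the spirit of the rest of these changes.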

During John Gruber’s The Talk Show Live, Craig Federighi stated that Apple’s focus on privacy started back in its earliest days, when the company was founded to create “personal” computers. Maybe it did and maybe it didn’t, but Apple certainly didn’t build an effective culture of privacy (or noteworthy technical protections) until the beginning of the iPhone era. When Microsoft launched its highly successful Trustworthy Computing Initiative in 2002 and reversed the company’s poor security record, one of its founding principles was “Secure by Design”. During Apple’s developer-focused Platform State of the Union session, privacy took center stage as Apple talked about “Privacy by Design”.

Apple and other tech firms have already run into resistance in building secure and private devices and services. Countries, including Australia, are passing laws to break end-to-end encryption and require device backdoors. U.S. law enforcement officials have been laying the groundwork for years for laws requiring access, even knowing it would then be impossible to guarantee device security. China requires Apple and other non-Chinese cloud providers to hand over their data centers to Chinese companies, which are legally required to feed information to the Chinese government. Apple’s competitors aren’t sitting idly by either: Google’s Sundar Pichai muddied the waters in a New York Times opinion piece that equates Google’s security with privacy and positions Apple-style privacy as a luxury good. Google’s security is definitely industry-leading, but equating it with Apple-style privacy is disingenuous at best.

The global forces arrayed against personal privacy are legion. From advertising companies and marketing firms, to governments, to telecommunication providers that monitor all our internet traffic and locations, to the financial services industry, and even to grocery stores offering minor discounts if you’ll just let them correlate all your buying to your phone number. We have a bit of control over some of this tracking, little over most of it, and even less insight into how the data is used. It’s a safe bet that many of these organizations will push back hard against Apple, and by extension against any of us who care about and want to control our own privacy.

Calling privacy a fundamental human right is as strong a position as any company or individual can take. It was one thing for Apple to build privacy into its own ecosystem, but as they extend this privacy outward, it is up to us to decide for ourselves whether we consider these protections meaningful and worthy of support. I know where I stand, but I recognize that privacy is highly personal and I shouldn’t assume a majority of the world feels the same, or that Apple’s efforts will survive the challenges of the next decades.

It’s in our hands now.
