Every year, as I travel the security conference circuit, hallway conversations always turn to, "See anything interesting?" To be honest, I can't remember the last time I was excited about a genuinely cool security technology (which I didn't create myself, but let's not go there today). I see plenty of cloud innovation, and plenty of security evolution, but not a lot of revolution.
A week ago I picked up my iPhone X. Although I received a background brief on Face ID a couple weeks earlier, I hadn't gotten my hands on it until then. And really, I didn't get to play with it until the next day, after spending five hours restoring my data (all 200 GB of it).
Face ID is the most compelling security advance I have seen in a very long time. It’s game-changing not merely due to technology, but also thanks to design and implementation. Apple has created a new authentication modality.
First things first: Face ID nails nearly every criterion I came up with to evaluate it. The false positive rate, within certain genetic constraints, is 1 in 1,000,000, compared to Touch ID's 1 in 50,000. The inherent security architecture doesn't look quite as tied to hardware as Touch ID's (because the phone needs the sensor package for other capabilities), but does appear to be either as strong (including the software implementation) or close enough in practical circumstances. Watch enough videos of journalists buying masks of their own faces, and it's clear Face ID is more expensive to circumvent than Touch ID. We haven't actually seen a public crack yet, but I always assume one will happen eventually. Because history.
Apple sometimes has a weak spot underestimating adversaries in their threat models, but they did a good job on this one.
In my pre-release article I wrote:
Face ID doesn’t need to be the same as Touch ID – it just needs to work reasonably equivalently in real-world use.
In my personal experience, with every user I've talked to, and in every article I've read, Face ID's core usability is equal to or greater than Touch ID's. For example, it doesn't work as well at many of the angles you could touch your phone from, but it works better in the kitchen and after a shower or workout. I've tested it in all sorts of lighting conditions and haven't found one that trips it up yet. The only downside is I can't register my wife's face, and we had become accustomed to using Touch ID on each other's devices.
I do believe it's slower at actual recognition, but the implementation makes that nearly impossible to notice. Face ID is tightly bound to activity, which masks its latency. For example, the time it takes to swipe up from the lock screen is long enough to unlock, whereas with Touch ID recognition and unlocking were the same action, which made the latency more visible.
But I think it’s time to justify that hyperbolic headline.
Apple didn’t just throw a facial recognition sensor into the iPhone and replace a fingerprint sensor – they enabled a new security modality. I call this “continuous authentication”.
When you use an iPhone you look at the iPhone (some calls and music listening excepted). Instead of unlocking your iPhone once and opening up everything, or requiring you to put your finger on the sensor when an app or feature wants to re-authenticate, the phone can quickly scan your face on demand. And the iPhone does this constantly. Here are the examples I’ve discovered so far:
- It's already been widely reported that notifications, by default, don't show details on the lock screen until you look at the iPhone. This is my favorite new feature because it improves security with effectively zero usability impact.
- I always disabled Control Center on the lock screen for security reasons, but now, as with notifications, just looking at my phone unlocks it. It's just too bad my thumb can't reach that upper right corner.
- Safari now (optionally) uses Face ID before filling in passwords on web sites. Previously, even with Touch ID, passwords filled in automatically whenever the phone was unlocked.
- Apple Pay and the App Store now authenticate with your face without separate authentication actions.
- Apps can authenticate as you open them. This is where I notice that Face ID is likely a bit slower, but because I don't need to take another action it feels faster.
The lock screen and Safari passwords are, to my knowledge, legitimately new modalities. The others are evolutions of previous use cases.
Face ID allows your iPhone to authenticate you under nearly every circumstance you would use your phone and need to authenticate, but without requiring any user action. I think we are just scratching the surface of what's possible here. Yes, we've used tools like YubiKeys plugged into devices to keep sessions open, but I think it's clear how this is different.
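The shift from one-time unlocking to continuous authentication can be sketched in a few lines of code. This is purely an illustrative model, not Apple's API (on iOS the actual biometric check goes through the LocalAuthentication framework); all names here are hypothetical:

```python
# Illustrative sketch (not Apple's API): contrasting a one-time "session
# unlock" with continuous authentication, where every sensitive action
# triggers its own invisible identity check.

def face_matches_owner() -> bool:
    """Stand-in for a biometric check; on an iPhone X this is Face ID."""
    return True  # hypothetical sensor result for the enrolled user

class SessionAuthPhone:
    """Classic model: one unlock opens up everything until re-lock."""
    def __init__(self):
        self.unlocked = False

    def unlock(self):
        self.unlocked = face_matches_owner()

    def fill_password(self) -> bool:
        return self.unlocked  # rides entirely on the earlier unlock

class ContinuousAuthPhone:
    """Face ID model: each sensitive action re-verifies the user."""
    def show_notification_details(self) -> bool:
        return face_matches_owner()

    def fill_password(self) -> bool:
        return face_matches_owner()

    def approve_payment(self) -> bool:
        return face_matches_owner()

phone = ContinuousAuthPhone()
phone.fill_password()  # re-checks the face at the moment of use
```

The design point is that in the second model the check happens at the moment of each sensitive action, yet costs the user nothing, because looking at the phone is something they were doing anyway.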
This is just the first generation of Face ID. Imagine the use cases once it evolves and can, for example, register multiple users. My Xbox Kinect (may it rest in peace) already does this pretty well, so we know it's possible (Kinect's implementation isn't as secure, and it's a lot bigger). One of the biggest problems in healthcare security is quickly authenticating to shared workstations in clinical environments… I could see a future version of Face ID significantly addressing that problem.
I previously said that Touch ID enables you to use a strong password with the convenience of no password at all. Face ID not only exceeds that mark, it may be the ultimate expression of it, by deeply integrating effortless authentication throughout the user experience without requiring new behaviors.
That, my friends, is the power of security design, not just security engineering.