It should surprise no one that Apple is writing their own playbook for bug bounties: both bigger, with the largest potential payout I’m aware of, and smaller, focusing on a specific set of vulnerabilities with, for now, a limited number of researchers. Many of us, myself included, are surprised Apple is launching a program at all. I never considered it a certainty, nor even necessarily something Apple had to do.

Personally, I cannot help but mention that this news hits almost exactly 10 years after Securosis started… with my first posts on, you guessed it, a conflict between Apple and a security researcher.

For those who haven’t seen the news, the nuts and bolts are straightforward. Apple is opening a bug bounty program to a couple dozen select researchers. Payouts go up to $200,000 for a secure boot firmware exploit, and down to $25,000 for a sandbox break. They cover a total of five categories of issues, all on iOS or iCloud. The full list is below. Researchers have to provide a working proof of concept and coordinate disclosure with Apple.

Unlike some members of our community, I don’t believe bug bounties always make sense for the company offering them, especially for ubiquitous, societal, Internet-scale companies like Apple. First, they don’t really want to get into bidding wars with governments and well-funded criminal organizations, some of which are willing to pay a million dollars for certain exploits (including some in this program). On the other side is the potential deluge of low-quality, poorly validated bugs that can suck up engineering and communications resources; more than one vendor mentions that problem to me pretty regularly.

Additionally, negotiation can be difficult. For example, I know of situations where a researcher refused to disclose any details of a bug until they were paid (or guaranteed payment), without providing sufficient evidence to support their claims. Most researchers don’t behave like this, but it only takes a few to sour a response team on bounties.

A bug bounty program, like any corporate program, should be about achieving specific objectives. In some situations finding as many bugs as possible makes sense, but not always, and not necessarily for a company like Apple.

Apple’s program sets clear objectives: find exploitable bugs in key areas. Because proving exploitability with a repeatable proof of concept is far more labor-intensive than merely finding a vulnerability, pay the researchers fair value for their work. In the process, learn how to tune a bug bounty program and derive maximum value from it. The result should be high-quality exploits, discovered and engineered by researchers and developers whom Apple believes have the skills and motivation to help advance product security.

It’s the Apple way: focus on quality, not quantity. Start carefully, on their own schedule, and iterate over time. If you know Apple, this is no different from how they release their products and services.

This program will grow and evolve, just as the iPhone in your pocket today is very different from the original iPhone. Expect more researchers, more exploit classes, and more products and services covered.

My personal opinion is that this is a good start. Apple didn’t need a program, but can certainly benefit from one. This won’t motivate the masses or those with ulterior motives, but it will reward researchers willing to put in the extremely difficult work of discovering and engineering exploits for some of the really scary classes of vulnerabilities.

Some notes:

  • Sources at Apple mentioned that if someone outside the program discovers an exploit in one of these classes, they could then be added to the program. It isn’t completely closed.
  • Apple won’t be publishing a list of the invited researchers, but they are free to say they are in the program.
  • Apple may, at its discretion, match any awarded dollars the researcher donates to charity. That discretion lets Apple avoid matching donations to a controversial charity, or one that runs against its corporate culture.
  • macOS isn’t included yet. It makes sense to focus first on the much more widely used iOS and iCloud, where exploitable bugs are much harder to find, but I really hope Mac security starts catching up to iOS, as much as Apple can manage without the same tight control over hardware.
  • I’m very happy iCloud is included. It is quickly becoming the lynchpin of Apple’s ecosystem. It makes me a bit sad all my cloud security skills are defensive, not offensive.
  • I’m writing this in the session at Black Hat, which is full of more technical content, some of which I haven’t seen before.

And here are the bug categories and payouts:

  • Secure boot firmware components: up to $200,000.
  • Extraction of confidential material protected by the Secure Enclave: up to $100,000.
  • Execution of arbitrary code with kernel privileges: up to $50,000.
  • Unauthorized access to iCloud account data on Apple servers: up to $50,000.
  • Access from a sandboxed process to user data outside that sandbox: up to $25,000 (a short sketch of this boundary follows the list).
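
For readers less familiar with that last category, here is a minimal Swift sketch of the boundary a sandbox break defeats. The paths and file names are hypothetical, and this is not Apple’s actual enforcement code; on iOS the denial comes from the kernel’s sandbox policy, not from anything the app itself does.

```swift
import Foundation

// Illustrative sketch only: hypothetical paths, not Apple's enforcement code.
// Each iOS app gets its own container; the kernel's sandbox policy denies
// file access outside that container no matter what the app asks for.

let fm = FileManager.default

// Inside the app's own container: allowed.
let mine = fm.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("notes.txt")
try? "my data".write(to: mine, atomically: true, encoding: .utf8)

// Another app's container (hypothetical path): denied by the sandbox.
// An exploit in this bounty class is one that makes this read succeed.
let theirs = URL(fileURLWithPath:
    "/private/var/mobile/Containers/Data/Application/OTHER-APP/Documents/secret.txt")
do {
    _ = try String(contentsOf: theirs, encoding: .utf8)
    print("sandbox break: read another app's data")
} catch {
    print("denied, as designed: \(error.localizedDescription)")
}
```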

I have learned a lot more about Apple over the decade since I started covering the company, and Apple itself has evolved far more than I ever expected: from a company that seemed content to skate by on security to one that now battles governments to protect customer privacy.

It’s been a good ten years, and thanks for reading.
