The PCI DSS Tokenization Guidelines Information Supplement – which I will refer to as “the supplement” for the remainder of this series – is intended to address how tokenization may impact Payment Card Industry (PCI) Data Security Standard (DSS) scope. The supplement is divided into three sections: a discussion of the essential elements of a tokenization system, PCI DSS scoping considerations, and new risk factors to consider when using tokens as a surrogate for credit card numbers. It’s aimed at merchants who process credit card payment data and fall under PCI security requirements. At this stage, if you have not downloaded a copy, I recommend you do so now. It will provide a handy reference for the rest of this post.

The bulk of that document covers tokenization systems as a whole: technology, workflow, security, and operations management. The tokenization overview does a good job of introducing what tokenization is, what tokens look like, and the security impact of different token types. The diagrams do an excellent job of illustrating how token substitution fits within the normal payment processing flow, providing a clear picture of how an on-site tokenization system – or a tokenization service – works. The supplement stresses the need for authorization and network segmentation – the two critical security tools needed to secure a token server and reduce compliance scope.
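To make the substitution flow concrete, here is a minimal sketch of how an on-site token server works. This is an illustrative in-memory vault of my own devising – not code or an API from the supplement – and a real Card Data Vault would encrypt stored PANs and sit behind strict access controls and network segmentation:

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random surrogate tokens back to PANs.
    A production vault would encrypt stored PANs and enforce authorization."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Generate a random surrogate with no mathematical link to the PAN.
        token = secrets.token_hex(8)  # 16 hex characters
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (which stays in PCI scope) can recover the PAN.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Merchant systems store and pass around only the token...
assert token != "4111111111111111"
# ...while the vault alone can map it back to the real card number.
assert vault.detokenize(token) == "4111111111111111"
```

The point of the design is that every system holding only the token has nothing of value to steal – which is exactly why the scoping question matters so much.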

The last section of the supplement helps readers understand the risks inherent in using tokens – risks which are new and distinct from the issues addressed by traditional security controls. Using tokens directly for financial exchange, instead of as simple references to the real financial data in a private token database, carries its own risk: a hacker could use the tokens to conduct transactions without ever needing to crack the token database. If attackers penetrate the IT systems, anything that can be used as a financial instrument will be misused, even if it is not a credit card. If the token can initiate a transaction, force a repayment, or be used as money, there is risk. This section covers a couple of critical risk factors merchants need to consider, although this has little to do with the token service itself – it is simply an effect of how tokens are used.

Those were the highlights of the supplement – now the lowlights. The section on PCI Scoping Considerations is convoluted and ultimately unsatisfying. I wanted bacon but only got half a piece of Sizzlean. Seriously, it was one of those “Where’s the beef?” moments. Okay, I am mixing my meats – if not my metaphors – but I must say that initially I thought the supplement was going to be an excellent document. They did a fantastic job answering the presales questions of tokenization buyers in section 1.3: simplification of merchant validation, verification of deployment, and unique risks to token solutions. But after my second review, I realized the document does offer “scoping considerations”, but does not provide advice, nor a definitive standard for auditing or scope reduction. That’s when I started making phone calls to others who have read the supplement – and they were as perplexed as I was. Who will evaluate the system and what are the testing procedures? How does a merchant evaluate a solution? What if I don’t have an in-house tokenization server – can I still reduce scope? Where is the self-assessment questionnaire?

The supplement does not improve user understanding of the critical questions posed in the introduction. As I waded through page after page, I was numbed by the words. It slowly lulled me to sleep with stuff that sounded like information – but wasn’t. Here’s an example:

The security and robustness of a particular tokenization system is reliant on many factors including the configuration of the different components, the overall implementation, and the availability and functionality of the security features for each solution.

No sh&$! Does that statement – which sums up their tokenization overview – help you in any way? Isn’t it true of every software or hardware system? I think so. Uselessly vague statements like this litter the supplement. Sadly, the first paragraph of the ‘guidance’ – a disclaimer repeated at the foot of each page, quoting Bob Russo from the PCI press release – reflects the supplement’s true nature:

“The intent of this document is to provide supplemental information. Information provided here does not replace or supersede requirements in the PCI Data Security Standard”.

Tokenization should replace some security controls and should reduce PCI DSS scope. It’s not about layering: tokenization replaces one security model with another. Technically there is no need to adjust the PCI DSS specification to account for a tokenization strategy – the two can happily co-exist, with one model governing non-sensitive systems and the other governing systems that store payment data. But failing to provide a clear definition of which is which, and of what merchants will be held accountable for, demonstrates the problem.

It seems clear to me that, based on this supplement, PCI DSS scope will never be reduced. For example, section 2.2 rather emphatically states “If the PAN is retrievable by the merchant, the merchant’s environment will be in scope for PCI DSS.” Section 3.1, “PCI DSS Scope for Tokenization”, starts from the premise that everything is in scope, including the tokenization server, as it should be. But what falls out of scope and how is not made clear in section 3.1.2 “Out-of-scope Considerations”, where one would expect to find such information. Rather than define what is out of scope, it outlines many objectives to be met, seemingly without regard for where the credit card vault resides, or the types of tokens used. Section 3.2, titled “Maximizing PCI DSS Scope Reduction”, states that “If tokens are used to replace PAN in the merchant environment, both the tokens, and the systems they reside on will need to be evaluated to determine whether they require protection and should be in scope of PCI DSS”. From this statement, how can anything then be out of scope? The merchant, and likely the auditor, must still review every system to determine scope, which means there is no benefit of audit scope reduction.

Here’s the deal: Tokenization – properly implemented – reduces security risks and should reduce compliance costs for merchants. Systems that have fully substituted PAN data with random tokens, and that have no method of retrieving credit card data, are out of scope. The Council failed to endorse tokenization as a recommended approach to securing data, and also failed to provide more than a broad handwave as to how this will happen.

There are a few other significant topics where the PCI Council should have written guidance to help their customers, but failed to do so.

  • Take a stand on encryption: Encrypted values are not tokens. Rich Mogull wrote an excellent post called An Encrypted Value Is Not a Token! last year. Encrypted credit card numbers are just that – encrypted credit card numbers. A token is a random number surrogate – pulled randomly out of thin air, obtained from a sequence generator, copied from a one-time pad, or even taken from a code book; these are all acceptable token generation methods. We can debate the philosophical nature of just how secure encrypted values are until the end of time, but in practice you simply cannot reduce audit scope with encrypted values. Companies that store encrypted PANs do so because – at some point – they need to ‘de-tokenize’ and access the original PAN/credit card number. That means the system has access to the Card Data Vault, or to the encryption key manager. The supplement glosses over this in several places – section 2.1.1 for example – but stops short of saying encrypted values remain in scope. The Council should have acknowledged this; expect auditors to treat encrypted values as in scope.
  • Address the token identification problem: PCI calls it Token Distinguishability, and section 2.3.4 of the supplement discusses it. In simplest terms: how can you tell whether a value is a token or a credit card number? I won’t cover all the nuances here, but I do want to point out that this is a problem only the PCI Council can address. Merchants, QSAs, and payment providers can’t distinguish tokens from credit card numbers with any certainty, especially considering that most merchants want several digits of each token to reflect the original credit card number – yet they are required to have such a facility! Honestly, there is no good solution here. It would be better to acknowledge this unpleasant fact and recommend moving away from tokens that preserve portions of the original credit card values.
  • Liability and the merchant: Section 2.4.2 says “The merchant has ultimate responsibility for the tokenization solution”. Most merchants buy tokenization as a service. Further, if a) the PAN is not retrievable by the merchant, and b) the merchant never sees PAN data because they use end-to-end encryption from customer to payment processor, it’s hard to justify putting the onus on the merchant. Sure, any merchant could collect PAN directly from their customers and scatter it across their servers indiscriminately. But when it comes to tokenization technology, merchants are simply incapable of validating the security of a tokenization product – it’s not their field of expertise. If the payment processor screws up their encryption implementation or exposes the token server through bogus identity and access controls, the merchant has no way of recognizing this. Logically the card vendors are the only party that can address this problem, as they have a much deeper understanding of payment processor systems and already audit payment processors. This blanket transfer of liability is entirely unjustified.
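To see why distinguishability has no good solution, consider a sketch (function names and the 16-digit last-four-preserving format are my own assumptions, not anything the supplement specifies). A random token that keeps the PAN’s final four digits looks exactly like a card number, and roughly one in ten such random tokens will even pass the Luhn checksum that real card numbers satisfy – so a Luhn test cannot reliably separate the two:

```python
import secrets

def luhn_valid(number: str) -> bool:
    """Luhn checksum test satisfied by real card numbers."""
    digits = [int(d) for d in number]
    odd = digits[-1::-2]                  # digits in odd positions from the right
    even = digits[-2::-2]                 # digits to be doubled
    total = sum(odd) + sum(sum(divmod(2 * d, 10)) for d in even)
    return total % 10 == 0

def last4_preserving_token(pan: str) -> str:
    """Random 16-digit token keeping the PAN's last four digits –
    the format most merchants ask for (illustrative only)."""
    prefix = "".join(str(secrets.randbelow(10)) for _ in range(12))
    return prefix + pan[-4:]

token = last4_preserving_token("4111111111111111")
# The token is format-identical to a PAN, and by pure chance about
# 10% of such tokens will pass the Luhn check as well.
print(token, luhn_valid(token))
```

This is why I argue the only real fix is moving away from tokens that preserve portions of the original card number, rather than pretending a reliable detection facility exists.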

While scope is the biggest issue by far, there are additional areas where the PCI Council and tokenization task force punted.

  1. Audit Guidelines: How will merchants be judged? The Council should publish guidelines so merchants know the answer before the auditor arrives.
  2. Update the self-assessment questionnaires: The vast majority of merchants don’t process enough transactions to warrant an on-site audit, but must complete the self-assessment questionnaire. There should be a series of clear questions, in layman’s terms, to determine whether the merchant has implemented tokenization correctly.
  3. Token Service Checklist: What should you look for in a token service? The majority of merchants will not run their own tokenization server – or what the PCI Council is calling a Card Data Vault – instead they will buy this additional service from their payment providers. But the required changes and the migration process are quite complicated. A discussion of the impact on both software and process is needed, as this influences the selection of a token format, and is likely to be the deciding factor when choosing between different solutions.
  4. Provide Auditor Training: Understanding how to audit a tokenization system and validate the migration from PAN to token storage is critical. Without specifications, QSAs are naturally concerned about what they can approve, and worry about reprisals for accepting tokenization. If something goes wrong, the first person who will be blamed is the auditor, so they keep everything in scope. Yes, it’s CYA, but auditing is their livelihood.
  5. State a definitive position: It would be best if they came out and endorsed tokenization as a way to remove the PAN from merchant sites. Reading the guidance, I got a clear impression that the PCI Council would prefer that payment gateways/banks and cardholders be the only parties with credit card numbers. This would remove all credit cards from merchant exposure, and I think this is the right course of action – which should have happened 15 years ago. That sort of roadmap would help merchants, mobile application developers, PoS device manufacturers, encryption vendors, and key management vendors plan out their products and efforts.
  6. Tokenization for mobile payment: Some of you are probably saying “What?” While even basic adoption of mobile payments is nowhere to be seen, dozens of large household name companies are building mobile and ‘smart’ payment options. When the players are ready these will come fast and furious, and most firms I speak with want to embed tokenization for security purposes. Mobile payment will be a form of “card not present” transaction for the merchants. It is likely to fall outside the scope for evaluating Point of Sale systems as well as common payment application architectures, so guidance is sorely needed.

I’m trying not to be totally negative, but you can see my wish list is pretty long. This was not the time to be fuzzy or dance around the elephant in the room. We all need clear and actionable guidance, and the supplement failed in this regard. Next I will offer guidance for merchants adopting tokenization, including which areas you should consider above and beyond the Payment Card Industry Supplement. Rather than a list of 20 hurdles to jump, I’ll provide a simple list to follow.