Tokenization Guidance. I have wanted to write this post since the middle of August. Every time I started writing, another phone call came in from a merchant, payment processor, technology vendor, or someone loosely associated with a Payment Card Industry (PCI) task force or steering committee (SIG). And every conversation yielded some new sliver of information that changed what I wanted to say, or implied that research had already been conducted that was far more interesting and useful than anything being provided to the public. This in turn prompted more calls, new conversations, more digging and – like a good mystery novel – led me to iteratively peel back another layer of the onion. I’ve finally reached a point where I believe I have enough of the story to understand what was published and why it’s not what they should have published.

But enough preamble: let’s back up and dive into the subject at hand. On August 12, 2011, the PCI task force driving the study of tokenization published an “Information Supplement” called the PCI DSS Tokenization Guidelines. More commonly known as the ‘Tokenization Guidance’ document, it covers the dos and don’ts of using token surrogates for credit card data. The only problem is that the document is sorely lacking in actual guidance. Even the section on “Maximizing PCI DSS Scope Reduction” is a collection of broad generalizations about security rather than practical advice. After spending the better part of the last two weeks with this wishy-washy paper, I think a better title would be “Quasi-acceptance of Tokenization Without Guidance”. And all my conversations indicate this opinion is universally held outside the PCI Council.

“We read the guidance but we don’t know what falls out of scope!” is the universal merchant response to the tokenization information supplement. “Where are the audit guidelines?” is the second most common reaction. The tokenization guidelines provide an overview of the elements of a tokenization system, along with the promise of reduced compliance requirements, but they don’t provide a roadmap for getting there. Let’s make one thing very clear right from the start: there is wide interest in tokenization because it promises better security, lower risk, and – potentially – significant cost reductions for compliance efforts. Merchants want to reduce the work required to comply with PCI requirements – which is exactly why they are interested in tokenization technologies. Security and lower risk are secondary benefits. But without a concrete idea of the actual cost reduction – or worse, without an understanding of how they will be audited once tokenization is deployed – they are dragging their feet on adoption.
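To make the core idea concrete: a tokenization system swaps each primary account number (PAN) for a random surrogate and keeps the mapping in a secure vault, so systems that store only tokens hold nothing an attacker can monetize – which is the basis for pulling them out of PCI scope. The sketch below is purely illustrative and assumes nothing about any particular vendor’s design; the class and function names are my own, and a real system would add an encrypted datastore, collision and Luhn checks, and strict access controls.

```python
import secrets

class TokenVault:
    """Illustrative vault mapping surrogate tokens back to real PANs."""

    def __init__(self):
        # token -> PAN; in practice an encrypted, access-controlled datastore
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # Preserve the last four digits (common for receipts and customer
        # service lookups); everything else is random, so the token cannot
        # be reversed without access to the vault.
        token = "".join(secrets.choice("0123456789")
                        for _ in range(len(pan) - 4)) + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (which stays in PCI scope) can recover the PAN.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert vault.detokenize(token) == "4111111111111111"
print(token)  # 12 random digits plus "1111" -- safe to store downstream
```

The point of the exercise: every downstream system – reporting, CRM, loyalty, analytics – that touches only the token has no cardholder data to protect, and that is where the promised scope reduction comes from.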

There is no good reason to omit a basic cookbook for scope reduction when using tokenization, so I am going to take the guesswork out of it: real guidance for evaluating tokenization, and clarity on how to benefit from it. This will take the form of concrete, actionable steps for merchants deploying tokenization, with checklists for auditors reviewing tokenization systems. I’ll fill in the gaps in the PCI supplement, poke at the topics the council apparently found politically unpalatable to discuss, and specify what you can reasonably omit from the scope of your assessment. With an overview of what can reasonably be considered out of scope, I’ll advise you on how to approach compliance and follow up with checklists to make it easier. This is more than I can cover in a single post, so I will cover these topics over the next two weeks, ultimately wrapping everything into my own tokenization guidance white paper.

The series will have four parts:

  • Key points from the supplement: What the PCI information supplement on tokenization means, and the aspects of the technology users should focus on. We’ll discuss what is missing from the guidance and what does – and does not – help reduce PCI assessment effort.
  • Guidance for merchants: How tokenization changes PCI compliance. We’ll discuss critical areas of concern when deciding to adopt a tokenization solution, with guidance on reducing audit scope, covering implementation tradeoffs, integration, rollout, and vendor lock-in.
  • The audit process: How tokenization affects the audit process, how to work with your assessor to establish testing criteria, and where to look to reduce the scope of your audit. We’ll provide guidance on working with QSAs and on self-assessment.
  • Checklists: The guidance describes the major components of the technology but lacks operational guidelines for assessors or merchants. As with the original PCI DSS documents, I’ll include an audit checklist to supplement the standard, covering what should be considered out of scope and where you can shave time from your audit process.

I will present information I feel should have been included in the tokenization supplement. And I will advise against certain technologies and deployment models that frankly should not have been lumped into the supplement, because they don’t simplify operations or reduce risk in the ways merchants need. I am willing to bet that some of my recommendations will make a number of interested stakeholders quite angry. I accept that as unavoidable – my guidance is geared toward making life easier for merchants who buy tokenization solutions, not toward avoiding conflict with vendor products. No technology vendor or payment provider ever endorses guidance that negatively impacts their sales, so I expect blowback. As always, if you think some of my recommendations are BS, I encourage you to comment. We are open to criticism and alternate viewpoints, and we always factor relevant comments into our final research. I do ask vendors to identify themselves.

I will also assume some prior knowledge of tokenization and PCI DSS. There is a ton of research on these subjects on the Securosis blog and in the Research Library. If you are not fully up to speed on tokenization systems, or want to learn more about tokenization in general, I suggest reviewing some of our previous research. Most helpful will be Understanding and Selecting a Tokenization System, Tokenization vs. Encryption: Options for Compliance, and FireStarter: an Encrypted Value is Not a Token!

Next: Key points from the tokenization supplement.
