Analysis of Visa’s Proposed Tokenization Spec

By Adrian Lane

Visa, Mastercard, and Europay – together known as EMVCo – published a new specification for Payment Tokenisation this month. Tokenization is a proven security technology, adopted by a couple hundred thousand merchants to reduce PCI audit costs and the security exposure of storing credit card information. That said, there is no real tokenization standard, for payments or otherwise. Even the PCI-DSS standard does not address tokenization, so companies have employed everything from hashed credit card (PAN) values (craptastic!) to very elaborate and highly secure random-value tokenization systems. This new specification is intended both to raise the bar on schlock home-grown token solutions and, more importantly, to address fraud with existing and emerging payment systems.

I don’t expect many of you to read 85 pages of token system design to determine what it really means, whether there are significant deficiencies, or whether these are the best approaches to solving payment security and fraud issues, so I will summarize here. But I expect this specification to last, so if you build tokenization solutions for a living you had best get familiar with it. For the rest of you, here are some highlights of the proposed specification.

  • As you would expect, the specification requires the token format to be similar to credit card numbers (13-19 digits) and to pass a Luhn check.
  • Unlike financial tokens used today, and at odds with the PCI specification I might add, the tokens can be used to initiate payments.
  • Tokens are merchant or payment network specific, so they are only relevant within a specific domain.
  • For most use cases the PAN remains private between issuer and customer. The token becomes a payment object shared between merchants, payment processors, the customer, and possibly others within the domain.
  • There is an identity verification process to validate the requestor of a token each time a token is requested.
  • The type of token generated is variable based upon risk analysis – higher risk factors mean a lower-assurance token!
  • When tokens are used as payment objects, there are “Data Elements” – think of them as metadata describing the token – to buttress security. These include a cryptographic nonce, payment network data, and the token assurance level.
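The format requirement in the first bullet is easy to see in code. Here is a minimal sketch (mine, not from the specification) of validating the Luhn check and generating a PAN-shaped token that passes it:

```python
import secrets

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn check."""
    digits = [int(d) for d in number]
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10 == 0

def generate_token(length: int = 16) -> str:
    """Generate a random PAN-shaped token (13-19 digits) with a valid Luhn
    check digit. Illustrative only -- a real token service would also
    guarantee uniqueness, avoid collisions with live PANs, and bind the
    token to its domain."""
    body = [secrets.randbelow(10) for _ in range(length - 1)]
    # Exactly one check digit makes the full number pass Luhn; find it.
    for check in range(10):
        candidate = "".join(map(str, body)) + str(check)
        if luhn_valid(candidate):
            return candidate
```

Because the token looks exactly like a card number, it flows through existing payment plumbing unchanged, which is the whole point of the format constraint.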

Each of these points has ramifications across the entire tokenization ecosystem, so your old tokenization platform is unlikely to meet these requirements. That said, they designed the specification to work within today’s payment systems while addressing near-term and emerging security needs.

Don’t let the misspelled title fool you – this is a good specification! Unlike the PCI’s “Tokenization Guidance” paper from 2011 – rumored to have been drafted by VISA – this is a really well thought out document. It is clear that whoever wrote this has been thinking about tokenization for payments for a long time, and has done a good job of providing functions to support all the use cases the specification needs to address. There are facilities and features to address PAN privacy, mobile payments, repayments, EMV/smartcard, and even card-not-present web transactions. And it does not address one single audience to the detriment of others – the needs of all the significant stakeholders are addressed in some way. Still, NFC payments seem to be the principal driver; the process and data elements really only gel when considered from that perspective. I expect this standard to stick.


There’s a lot of standardization work being done in the tokenization space, and this document is certainly a valuable contribution. I think it’s important to keep in mind that the document defines the category of “payment token”, which is intended to drop in as a limited-context payment instrument, as opposed to the classic “security token” (or, as they call it on page 8, a “non-Payment Token”), which may be built so that it cannot be used for anything besides analysis tasks and must be detokenized before use in a payment.

It’s possible (perhaps even likely) that in the future you’ll see environments where a payment token is locally tokenized into a non-payment form, allowing analysis tasks to function without exposing a value that could be used to initiate a transaction.
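One way such a local step might look is a keyed one-way derivation: analytics can still join and count on the derived value, but it cannot be replayed as a payment instrument. This is a hypothetical sketch (the function and format are mine, not from the specification or the comment above):

```python
import hashlib
import hmac

def to_analysis_token(payment_token: str, domain_key: bytes) -> str:
    """Derive a deterministic, non-reversible analysis token from a
    payment token. The output supports joins and aggregation in local
    analytics but cannot initiate a transaction. Key management and the
    exact output format would be dictated by the local environment."""
    digest = hmac.new(domain_key, payment_token.encode("ascii"), hashlib.sha256)
    # Truncated hex for storage convenience; still one-way.
    return digest.hexdigest()[:19]
```

Because the derivation is keyed per domain, the same payment token yields different analysis tokens in different environments, which preserves the domain-restriction property the specification emphasizes.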

By Terence Spies