I realize I might be dating myself a bit, but to this day I still miss the short-lived video arcade culture of the 1980s. Aside from the excitement of playing on “big hardware” that far exceeded my Atari 2600 or C64 back home (though still less powerful than the watch on my wrist today), I enjoyed the culture of lining up my quarters or piling around someone hitting some ridiculous level of Tempest.
One thing I didn’t really like was the whole “token” thing. Rather than playing with quarters, some arcades (pioneered by the likes of that other Big Mouse) issued tokens that would only work on their machines. On the upside you would occasionally get 5 tokens for a dollar, but overall it was frustrating as a kid. Years later I realized that tokens were a parental security control – worthless for anything other than playing games in that exact location, they kept the little ones from buying gobs of candy 2 heartbeats after a pile of quarters hit their hands.
With the increasing focus on payment transaction security due to the quantum-entangled forces of breaches and PCI, we are seeing a revitalization of tokenization as a security control. I believe it will become the dominant credit card transaction processing architecture until we finally dump our current plain-text, PAN-based system.
I first encountered the idea a few years ago while talking with a top-tier retailer about database encryption. Rather than trying to encrypt all credit card data in all their databases, they were exploring the possibility of concentrating the numbers in one master database, and then replacing the card numbers with “tokens” in all the other systems. The master database would be highly hardened and encrypted, and keep track of which token matched which credit card. Other systems would send the tokens to the master system for processing, which would then interface with the external transaction processing systems.
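To make that concrete, here’s a rough sketch of the vault idea in Python – a plain dictionary stands in for what would really be a hardened, encrypted database, and all the names are mine rather than any vendor’s:

```python
import secrets

class TokenVault:
    """Sketch of the 'master database' idea: this is the only place real
    card numbers live; every other system stores and passes around tokens.
    A dict stands in for what would really be a hardened, encrypted store."""

    def __init__(self):
        self._token_to_pan = {}   # token -> real card number
        self._pan_to_token = {}   # so one card always maps to one token

    def tokenize(self, pan: str) -> str:
        """Called by the other systems; they keep only the returned surrogate."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)          # opaque random surrogate
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """Only the payment-processing path should ever be able to call this."""
        return self._token_to_pan[token]
```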
By swapping out all the card numbers, they could concentrate most of their security efforts on a single system that’s far easier to lock down. Sure, someone might still be able to hack the application logic of some server and kick off an illicit payment, but they’d have to crack the hardened master server to get card numbers for any widespread fraud.
We’ve written about it a little bit in other posts, and I have often recommended it directly to users, but I probably screwed up by not pushing the concept on a wider basis. Tokenization solves far more problems than trying to encrypt in place, and while complex, it is still generally easier to implement than the alternatives. Well-designed tokens fit the structure of credit card numbers, which means distributed systems may need fewer application changes. The PCI assessment scope is reduced, since card numbers live in only one location, which can cut associated costs. From a security standpoint, it lets you focus more effort on one hardened location. Tokenization also reduces data spillage, since far fewer locations use card numbers, and fewer business units need them for legitimate functions, such as processing refunds (one of the main reasons to store card numbers in retail environments).
Today alone we were briefed on two different commercial tokenization offerings – one from RSA and First Data Corp, the other from Voltage. The RSA/FDC product is a partnership in which RSA provides the encryption/tokenization technology FDC uses in their processing service, while Voltage offers tokenization as an option on top of their Format Preserving Encryption technology. (Voltage is also partnering with Heartland Payment Systems on the processing side, but that deal uses their encryption offering rather than tokenization.)
There are some extremely interesting things you can do with tokenization. For example, with the RSA/FDC offering, the card number is encrypted on collection at the point-of-sale terminal with the public key of the tokenization service, then sent to the tokenization server, which returns a token that still “resembles” a card number (it passes the Luhn check and might even include the same last 4 digits – the rest is random). The real card number is stored in a highly secured database up at the processor (FDC). The token is the stored value on the merchant side, and since it’s paired with the real number on the processor side, it can still be used for refunds and such. This particular implementation always requires the original card for new purchases, but only the token for anything else.
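Just to illustrate how a surrogate like that can be built – this is purely a sketch of the general technique, not how RSA/FDC actually generate their tokens – you can randomize everything except the last 4 digits and keep drawing until the result passes the Luhn check:

```python
import secrets

def luhn_checksum(number: str) -> int:
    """Return the Luhn checksum; 0 means the number validates."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10

def make_token(pan: str) -> str:
    """Random surrogate that keeps the last 4 digits of the PAN and still
    passes the Luhn check (illustrative only -- about 1 in 10 draws works)."""
    tail = pan[-4:]
    while True:
        head = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 4))
        candidate = head + tail
        if luhn_checksum(candidate) == 0 and candidate != pan:
            return candidate
```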
Thus the real card number is never stored in the clear (or even encrypted) on the merchant side. There’s really nothing to steal, which eliminates any possibility of a card number breach (according to the Data Breach Triangle). The processor (FDC) is still at risk, so they will need to use a different set of technologies to lock down and encrypt the plaintext numbers. The tokens still look like real card numbers, reducing retrofitting requirements for existing applications and databases, but they’re useless for most forms of fraud. This implementation won’t work for recurring payments and such, which they’ll handle differently.
Over the past year or so I’ve become a firm believer that tokenization is the future of transaction processing – at least until the card companies get their stuff together and design a stronger system. Encryption is only a stop-gap in most organizations, and once you hit the point where you have to start making application changes anyway, go with tokenization.
Even payment processors should be able to expand use of tokenization, relying on encryption to cover the (few) tokenization databases which still need the PAN.
Messing with your transaction systems, especially legacy databases and applications, is never easy. But once you have to crack them open, it’s hard to find a downside to tokenization.
16 Replies to “Tokenization Will Become the Dominant Payment Transaction Architecture”
Interesting discussion. As I read the article I was also interested in the ways in which a token could be used as a ‘proxy’ for the PAN in such a system – the necessity of having the actual card number for the initial purchase seems to assuage most of that concern.
Another aspect of this method that I have not seen mentioned here: if the Tokens in fact conform to the format of true PANs, won’t a DLP scan for content recognition typically ‘discover’ the Tokens as potential PANs? How would the implementing organization reliably prove the distinction, or would they simply rest on the assumption that as a matter of design any data lying around that *looks* like a credit card number _must_be_ a Token? I’m not sure that would cut the mustard with a PCI auditor. Seems like this could be a bit of a sticky wicket still?
Let’s also remember that tokenization cannot be accomplished by itself. It has to be accompanied by defined processes that help close the gap of human intervention. One of the best ways to implement this architecture is the use of web services, which helps control who’s asking for the token. In the example given by Rich, the parent has to know that it is his/her child asking for the token and not someone posing as that child.
Huroye
Marc,
That’s what the RSA/FDC solution does – it immediately encrypts the card, sends it in for the token, and then the token is what’s kept. The original card is always required for a new purchase, but the token can be used for refunds and such.
For recurring payments, the model will have to change and the retailer will need to be able to use the token for new purchases. In those cases, the token plus a separate retailer credential (which could be a digital cert) will be needed to initiate a new purchase transaction. Thus even if the token is stolen, the attacker also needs to compromise the retailer identification, and even then will only be able to initiate transactions from that specific network.
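Very roughly, you can think of it like this – here a shared merchant key and an HMAC stand in for the digital cert, just to show the pairing, and all the names are made up:

```python
import hmac, hashlib

# Hypothetical merchant credentials; in practice this role would be played
# by a digital cert issued to each retailer by the processor.
MERCHANT_KEYS = {"retailer-42": b"secret-issued-by-processor"}

def sign_purchase(merchant_id: str, token: str, amount_cents: int) -> str:
    """Merchant side: bind the token to this merchant's credential."""
    msg = f"{merchant_id}|{token}|{amount_cents}".encode()
    return hmac.new(MERCHANT_KEYS[merchant_id], msg, hashlib.sha256).hexdigest()

def processor_accepts(merchant_id: str, token: str, amount_cents: int, sig: str) -> bool:
    """Processor side: a stolen token alone is useless without the matching
    merchant credential, and even then it only works from that merchant."""
    key = MERCHANT_KEYS.get(merchant_id)
    if key is None:
        return False
    msg = f"{merchant_id}|{token}|{amount_cents}".encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```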
That’s why I like the approach… but there are other ways to pull this off.
I always thought Chuck E. Cheese was a rat… not a mouse. That being said, I think your example of a video arcade is a good one. I have used the casino chip analogy when explaining tokenization to people. You trade the high-value data (cash in the analogy, a CC# in the use case) for some lower-value data (a casino chip, or a piece of “tokenized” data). The problem I have with tokens, though, is that they still have value in a certain context. You haven’t sufficiently devalued the original data by making it a “token.” The token can still be used to perform functions, albeit in a more limited context than the original data. And I question the methodologies currently used to generate these tokens. I have yet to see any academic research establishing that the tokens are truly random or that they are any better than hashed values. What we’ve done is trade one type of attack for one that has yet to emerge (an underground market in valid card data swapped for one that will surely emerge, trading valid token data from poorly implemented solutions). Now, coupling a token with a time-based signature or some other authentication value makes these solutions much more palatable, because then I can prove the token is being properly used. There are numerous implementation issues in the different token solutions on the market today… and not enough discussion of provable security and standardization of those implementations…
And all of those security considerations are only part of the problem… many of the previous comments focus on the differences between e-commerce and brick-and-mortar retail. E-commerce sites don’t have to worry about cash registers, store controllers, EFT switches, gateways, etc. between the card acceptance point (a browser) and a processing host. Changing the message that is generated at the point of swipe and passes through all of these intermediate systems is costly and sometimes not even possible. Some of the token systems require 200 bytes or more of extra payload. That’s not a big deal in the e-commerce world (I shouldn’t generalize like that, but it is relatively simpler than the brick-and-mortar space).
I’ve reviewed the RSA/FDC solution using what is available to their sales channel and public information, and I find that there are still a lot of unanswered questions. In my opinion, an integrated solution using encryption and tokenization is better than just tokenization. Knowing what I do about retail card processing, I have my doubts that tokenization will be viable during the authorization process… but I think it will continue to be a viable storage tactic.
Check out the book – even though it’s Java 1.4-centric, it’s powerful stuff. Tokenization was not a buzzword in ’05, but the book certainly discusses the underlying concepts for building such a system from scratch.
@Jim Manico – Thanks for the publication reference. It appears the book was written in 2005, and I had never heard of it. It looks like a DIY cookbook for application or database (API- or user-level) encryption. However, I am looking at the table of contents and I see nothing about tokenization per se.
-Adrian