
Tokenization Will Become the Dominant Payment Transaction Architecture

I realize I might be dating myself a bit, but to this day I still miss the short-lived video arcade culture of the 1980s. Aside from the excitement of playing on “big hardware” that far exceeded my Atari 2600 or C64 back home (still less powerful than the watch on my wrist today), I enjoyed the culture of lining up my quarters or piling around someone hitting some ridiculous level of Tempest.

One thing I didn’t really like was the whole “token” thing. Rather than playing with quarters, some arcades (pioneered by the likes of that other Big Mouse) issued tokens that would only work on their machines. On the upside you would occasionally get 5 tokens for a dollar, but overall it was frustrating as a kid. Years later I realized that tokens were a parental security control – worthless for anything other than playing games in that exact location, keeping the little ones from buying gobs of candy two heartbeats after a pile of quarters hits their hands.

With the increasing focus on payment transaction security due to the quantum-entangled forces of breaches and PCI, we are seeing a revitalization of tokenization as a security control. I believe it will become the dominant credit card transaction processing architecture until we finally dump our current plain-text, PAN-based system.

I first encountered the idea a few years ago while talking with a top-tier retailer about database encryption. Rather than trying to encrypt all credit card data in all their databases, they were exploring the possibility of concentrating the numbers in one master database, and then replacing the card numbers with “tokens” in all the other systems. The master database would be highly hardened and encrypted, and keep track of which token matched which credit card. Other systems would send the tokens to the master system for processing, which would then interface with the external transaction processing systems.
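The master-database design that retailer described can be sketched in a few lines. This is a hypothetical illustration only, not any vendor's implementation: `TokenVault` and the card number are made up for the example, and a real vault would sit on a hardened, encrypted database rather than in-memory dictionaries.

```python
import secrets

# Hypothetical sketch of the "master database" idea: one hardened vault
# maps opaque tokens to real card numbers; every other system stores and
# passes around only the token.
class TokenVault:
    def __init__(self):
        self._token_to_pan = {}
        self._pan_to_token = {}

    def tokenize(self, pan):
        # Reuse the existing token so one card always maps to one token.
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)  # random value, unrelated to the PAN
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token):
        # Only the vault's own interface to the external payment
        # processor should ever call this.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # standard test number, not a real card
```

All the other systems only ever see `token`; the lookup back to the real number happens inside the one hardened system.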

By swapping out all the card numbers, they could focus most of their security efforts on a single central system that’s far easier to control. Sure, someone might be able to hack the application logic of some server and kick off an illicit payment, but they’d have to crack the hardened master server to get card numbers for any widespread fraud.

We’ve written about it a little bit in other posts, and I have often recommended it directly to users, but I probably screwed up by not pushing the concept on a wider basis. Tokenization solves far more problems than trying to encrypt in place, and while complex it is still generally easier to implement than alternatives. Well-designed tokens fit the structure of credit card numbers, which may require fewer application changes in distributed systems. The assessment scope for PCI is reduced, since card numbers are only in one location, which can reduce associated costs. From a security standpoint, it allows you to focus more effort on one hardened location. Tokenization also reduces data spillage, since there are far fewer locations which use card numbers, and fewer business units that need them for legitimate functions, such as processing refunds (one of the main reasons to store card numbers in retail environments).

Today alone we were briefed on two different commercial tokenization offerings – one from RSA and First Data Corp, the other from Voltage. The RSA/FDC product is a partnership where RSA provides the encryption/tokenization tech FDC uses in their processing service, while Voltage offers tokenization as an option to their Format Preserving Encryption technology. (Voltage is also partnering with Heartland Payment Systems on the processing side, but that deal uses their encryption offering rather than tokenization).

There are some extremely interesting things you can do with tokenization. For example, with the RSA/FDC offering, the card number is encrypted on collection at the point of sale terminal with the public key of the tokenization service, then sent to the tokenization server which returns a token that still “resembles” a card number (it passes the Luhn check and might even include the same last 4 digits – the rest is random). The real card number is stored in a highly secured database up at the processor (FDC). The token is the stored value on the merchant site, and since it’s paired with the real number on the processor side, can still be used for refunds and such. This particular implementation always requires the original card for new purchases, but only the token for anything else.
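The “resembles a card number” property is easy to demonstrate. The sketch below is a hypothetical illustration of one way to build such a token (random digits, same length and last four, passes the Luhn check); the actual RSA/FDC generation method isn't public, and `make_token` is my own naming.

```python
import random

def luhn_checksum(number):
    """Return the Luhn sum mod 10 (0 means the number validates)."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10

def make_token(pan):
    """Random token: same length and last four digits, passes Luhn."""
    last4 = pan[-4:]
    while True:  # roughly 1 in 10 random prefixes yields a Luhn-valid number
        prefix = "".join(random.choice("0123456789") for _ in range(len(pan) - 4))
        candidate = prefix + last4
        if candidate != pan and luhn_checksum(candidate) == 0:
            return candidate

token = make_token("4111111111111111")  # standard test number, not a real card
```

The result drops into any field, schema, or validation routine built for a 16-digit card number, which is exactly why retrofitting costs stay low.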

Thus the real card number is never stored in the clear (or even encrypted) on the merchant side. There’s really nothing to steal, which eliminates any possibility of a card number breach (according to the Data Breach Triangle). The processor (FDC) is still at risk, so they will need to use a different set of technologies to lock down and encrypt the plain text numbers. The numbers still look like real card numbers, reducing any retrofitting requirements for existing applications and databases, but they’re useless for most forms of fraud. This implementation won’t work for recurring payments and such, which they’ll handle differently.

Over the past year or so I’ve become a firm believer that tokenization is the future of transaction processing – at least until the card companies get their stuff together and design a stronger system. Encryption is only a stop-gap in most organizations, and once you hit the point where you have to start making application changes anyway, go with tokenization.

Even payment processors should be able to expand use of tokenization, relying on encryption to cover the (few) tokenization databases which still need the PAN.

Messing with your transaction systems, especially legacy databases and applications, is never easy. But once you have to crack them open, it’s hard to find a downside to tokenization.





By Adrian Lane  on  09/30  at  04:21 PM

I wanted to add some clarity on what Rich said in the post.  Since Format Preserving Encryption preserves the size and format of the sensitive data it protects (i.e., credit card numbers), it can be used, in and of itself, as a token. The view is that this provides the same advantage as other tokenization approaches in that the devices and systems along the data processing workflow do not break, and credit card data is kept from those systems that should not have unencrypted access. For trusted systems you can deploy key sharing to provide access to the original data.  Certainly there are pros and cons to the approach, but this does provide flexibility for those select organizations who need it.

By Ben  on  10/01  at  07:34 AM

I think one needs to be very careful how they talk about the “solution” of tokenization. It is, at best, a bridge to a better future, but it is not, itself, that future. Tokenization in POS systems is also significantly different than use in an eCommerce platform. As you yourself note in the article, it’s an ok solution “...until we finally dump our current plain-text, PAN-based system.” That, I think, is really where the focus needs to be placed; not on these kludgey solutions.

I do, however, take issue with some of what you’ve said, because I think it treads dangerously close to overstating the benefits. Specifically, you said:
“The assessment scope for PCI is reduced, since card numbers are only in one location, which can reduce associated costs.”

Actually, this is not necessarily true, and depends on the environment. In my research last Spring, I found that while tokenization solves the storage problem (sidesteps requirement 3.6, for example), it does not always do anything for the point of entry into the environment. Specifically, most tokenization solutions appeared to be geared toward legacy billing systems, giving the false sense that one could then just ignore most PCI DSS requirements. Nothing, of course, could be further from the truth. In eCommerce, the data still enters through a web portal and, in most cases, that card data is then passed off to the tokenization solution, either on the local net or over the Internet. So, you still have to secure the point of entry (web servers, web applications, network infrastructure), which means you still have to conform to the majority of requirements in the DSS.

To get to the point: in a nutshell, unless the merchant can suddenly check the “we do not store, process, or transmit cardholder data in our environment” box on the self-assessment questionnaire, then they are still responsible for the lion’s share of requirements in DSS. One then must weigh the cost vs benefit of an add-on (additional overhead cost) system vs other solutions. Tokenization is perhaps easier to manage than encryption (though with less control), but one must still wonder how that data is being stored and managed by the 3rd party. In extending the chain of trust, you could in fact be increasing the level of risk represented by the cardholder data. A better solution would be full outsourcing of everything to do with cardholder data, targeted to Level 3 and 4 merchants in particular, but those types of useful solutions seem to be rare-to-non-existent.


By Rich  on  10/01  at  10:08 AM


I think I outlined most of those differences- BTW, the tokenization scenario I described for FDC is being done inside the PoS. The PAN is never stored in the PoS- it’s encrypted and shipped off immediately to the tokenization server.

The scope is *definitely* reduced, unless it’s a crappy implementation. No, you don’t get to check that box, but you do get to more quickly exclude systems from certain aspects of the assessment. Keep in mind that some of these are exactly targeted at level 3/4 merchants as full services- the merchant never stores the card number, but the token still supports refunds and other functions that might be excluded when normally outsourcing. With a good implementation, very few systems ever touch a real card number- basically any PoS, and possibly a web app server if you do online. All of the other systems normally under assessment are now out of scope.

But again, in the post I say outsourcing is your best bet if you can pull it off :)

By Ben  on  10/01  at  10:14 AM

The problem is that using POS as the basis of your discussion will inevitably lead to confusion as many organizations are selling tokenization to eCommerce merchants. The benefits for POS are much greater than the benefits for eCommerce. For eCommerce, I think you have it completely backwards… you said “Encryption is only a stop-gap in most organizations…” and I think that’s completely wrong. Tokenization is the stop-gap, but encryption is the long-term solution. I think your post, while obviously discussing POS, will be extrapolated inappropriately to eCommerce merchants and could end up increasing problems overall.

By Rich  on  10/01  at  10:19 AM

No way is encryption the long term solution- anything that reduces the storage of card numbers in any form will always be superior to obfuscation.

Tokenization still provides advantages in ecommerce. The card number is never stored, and, in fact, is encrypted in memory only for the initial transaction and sent off for the tokenization. When done at the application level, especially supported by an HSM, how will this possibly increase problems?

Customer enters number. App encrypts it with the service’s public key and sends the transaction to the tokenization service… and there you go. The card number is never needed again.

It’s more complex when you want to store numbers for recurring transactions, but even there the token has the edge. The tokenization provider can require mutual authentication and other strong authentication before accepting the token for a transaction. Again, the merchant never holds a card number, and even if the token is stolen it can’t ever work anyplace else.

By Ben  on  10/01  at  10:36 AM

You’re playing a shell game. The data is still transmitted and processed through merchant systems, thus their scope is not reduced all that much. Most of these tokenization solutions are add-on costs. This is a kludge, by its very definition. It’s something you add onto an existing solution to bridge the gap.

BTW, have you asked the vendors how they’re dealing with key management and key rotation? *Somebody* is storing the data, and it’s being stored encrypted. Just because they’re getting a Tier 1 assessment for PCI providers doesn’t mean they’re doing things any better.

Have you looked at the contracts between the merchants and the providers?
Are you aware that some processors are also providing this service?

Overall, I’m finding your conclusions and research a little lacking here. More importantly, you seem to think tokenization “solves” a big problem, when in fact it only addresses a small piece of the overall picture. You haven’t even fully described where the data ends up here. Is it offsite with the 3rd party, or is it in a box that the vendor provided, but that still sits onsite? There are tokenization solutions that do both.

Tokenization is coming into its own for one reason: it’s the cheapest way to get cardholder data out of legacy billing systems without having to upgrade or replace those systems. However, you’re having to introduce additional moving pieces (aka “complexity”) in order to accomplish that goal. This is not a long-term or sustainable solution. It’s a kludge until the billing systems and/or 3rd party providers+processors get a much more viable and realistic solution in place. Oh, and btw, all tokenization solutions rely on encryption. Ergo, encryption is the preferred solution, it’s only how you implement it that is in question.

By Rich  on  10/01  at  10:54 AM

No Ben, you are completely missing how this works, and diverting the discussion.

The card number is never in a database, never in any other systems beyond the first transaction, never anywhere other than in memory (and perhaps the HSM) during collection. Merchant systems aren’t a single tier, and there is a variety of storage and processing of the PAN in different areas. Tokenization removes that from nearly all the systems except the initial collection, thus reducing the scope.

As for how the vendors handle the PAN, yes- I *have* asked some of those questions. They do all the encryption, key rotation, and so on on their back end. If they don’t, then they would be non-compliant themselves. I won’t promise they have perfect security, and if they are breached, like Heartland was, that’s an entirely different can of worms, but almost irrelevant to the merchant since they shouldn’t have any associated breach notification costs.

*OF COURSE* this is being offered by the processors. THAT’S THE ENTIRE POINT! It doesn’t work nearly as well if the processor isn’t involved, although I do know of level 1 merchants who implemented this years ago (well, started years ago, they are big projects in those environments).

My conclusions aren’t lacking; you aren’t grokking what I’m saying. It doesn’t solve anything- nothing in our transaction system solves the real problems, as I say in every single post I write on the subject, including this one.

It is far from the cheapest option- there are other, cheaper alternatives that meet PCI requirements (like transparent encryption). This *is* a complex option if you are building it yourself, but one with material benefits that you don’t seem able to address other than saying, “it’s complicated”. That’s a poor response.

Saying encryption=tokenization is a total mischaracterization. But if you read my entire post, which I’m not sure you have, you’ll see I specifically call that out. The token isn’t an encrypted value, it’s a random value. Not even a hash.

Anyway, you can’t write that I’m not seeing the big picture when I specifically call out that this is a stop gap until the payment industry decides to address the real problem. But merchants, today, have a real need to do the best they can within the current system, and tokenization is emerging as a preferred choice.

By Dave  on  10/02  at  04:41 AM


You are dead on.  There seems to be some misunderstanding of this solution in the marketplace - the feeling that somehow encryption is “just as good”, or even better.  The major point is: if you tokenize (using the FDC solution above), the card number is *never* stored in the Merchant’s systems, anywhere.  With encryption, it is stored, albeit “protected” by encryption. That leaves every system that accesses the encrypted information still in scope, and keeps both the PCI burden and liability with the Merchant.  In the FDC solution, the track data is encrypted at the point of capture (POS system, PIN pad, or even eCommerce site) with a public key.  That keeps those systems in scope, albeit a reduced scope (since it’s never actually stored there).  By using a public/private key pair, key management is reduced well below any other form of encryption, since I don’t have to protect the public key at all - I just need to rotate it at least once a year. I’m not even storing the encrypted value - just protecting it for transport over existing communication lines (no SSL).  The tokenization is even done as part of the authorization, so the merchant never connects to the tokenization service directly - it’s done by the card authorization process.

I do agree with one of the earlier posts - this is not “the ultimate solution” - new technologies such as smart cards, one-time card numbers, etc., will be “more secure”.  The issue is that Merchants generally aren’t going to spend money to deploy those solutions unless it adds revenue or reduces cost, or becomes “mandated”.  The FDC solution attempts to minimize the cost to implement, and provide a significant enough “step forward” in reducing cost (audit cost, fraud cost, compliance cost), while moving the burden of securing the card information from the Merchant to the processor, where economies of scale will be able to provide much better protection.


By David Mortman  on  10/02  at  05:15 AM

Great post. I’d like to highlight one important thing that I don’t think you emphasized enough: tokenization does not solve local fraud issues or business logic flaw abuse. Then again, neither does encryption, so that’s not a reason to avoid tokenization. Really this is more a complaint about the way tokenization has been advertised by the vendors as yet another cure-all.

By Jim Manico  on  10/05  at  10:21 AM

If you are shoving data like PII and credit card information into your data storage mechanism using some kind of weak custom encryption scheme, you’re going down the wrong road. These turnkey tokenization systems do a lot more than just give you tokens - they provide PKI, key rotation/expiration, use strong single-use encryption keys (tokens!), etc. Really powerful and necessary stuff. Check out Kevin Kenan’s book, Cryptography in the Database; he describes some of the details of a system of this nature from a coder’s perspective. It demonstrates the power of tokenization systems and why you need to consider them.

By Adrian Lane  on  10/05  at  11:07 AM

@Jim Manico - Thanks for the publication reference.  It appears that book was written in 2005, and I had never heard of it. It looks like a DIY cookbook for application or database (API or user level) encryption. However, I am looking at the table of contents and I see nothing about tokenization per se.


By Jim Manico  on  10/06  at  09:25 AM

Check out the book - even though it’s Java 1.4-centric, it’s powerful stuff. Tokenization was not a buzzword in ’05, but the book certainly discusses the underlying concepts for building such a system from scratch.

By Marc  on  10/27  at  05:20 PM

I always thought Chuck E. Cheese was a rat…not a mouse.  That being said, I think your example of a video arcade is a good one.  I have used the casino chip analogy when explaining tokenization to people.  You trade the high value data (cash in the analogy and a CC# in the use case) for some lower value data (a casino chip and a piece of “tokenized” data).  The problem I have with tokens though is that they still have value in a certain context.  You haven’t sufficiently devalued the original data by making it a “token.”  The token can still be used to perform functions, albeit in a more limited context than the original data.  And I question the methodologies currently used to generate these tokens.  I have yet to see any academic research that establishes that the tokens are truly random or that they are any better than hashed values.  What we’ve done is trade an established attack market for one that has yet to emerge: the underground market in valid card data will surely be joined by one trading valid token data from poorly implemented solutions.  Now, coupling a token with a time-based signature or some other authentication value makes these solutions much more palatable because then I can prove the token is being properly used.  There are numerous implementation issues in the different token solutions provided in the market today…and not enough discussion of provable security and standardization of those implementations…
And all of those security considerations are only part of the problem…many of the previous comments focus on the differences between e-com and brick and mortar retail.  E-commerce sites don’t have to worry about cash registers, store controllers, EFT switches, gateways, etc between the card acceptance point (a browser) and a processing host.  Changing the message that is generated at the point of swipe and passes through all of these intermediate systems is costly and sometimes not even possible.  Some of the token systems are requiring 200 bytes or more of extra payload.  That’s not a big deal in the e-commerce world (I shouldn’t generalize like that, but it is relatively simpler than the brick and mortar space). 
I’ve reviewed the RSA/FDC solution using what is available to their sales channel and public information and I find that there are still a lot of unanswered questions.  In my opinion, an integrated solution using encryption and tokenization is better than just tokenization.  Knowing what I do about retail card processing I have my doubts that tokenization will be viable during the authorization process…but I think it will continue to be a viable storage tactic.
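Marc's idea of coupling a token with a time-based signature could look roughly like the following sketch, which binds the token to a timestamp with an HMAC under a merchant secret so a bare stolen token can't simply be replayed. This is an illustrative assumption, not any vendor's actual protocol; `sign_token`, `verify`, and the shared secret are all made up for the example.

```python
import hashlib
import hmac
import time

# The shared secret would be provisioned out of band between merchant
# and tokenization provider (an assumption for this sketch).
SECRET = b"merchant-shared-secret"

def sign_token(token, ts=None):
    """Bind a token to a timestamp with an HMAC the provider can verify."""
    if ts is None:
        ts = int(time.time())
    sig = hmac.new(SECRET, f"{token}|{ts}".encode(), hashlib.sha256).hexdigest()
    return token, ts, sig

def verify(token, ts, sig, max_age=300):
    """Accept only a correctly signed, sufficiently recent pair."""
    expected = hmac.new(SECRET, f"{token}|{ts}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and abs(time.time() - ts) <= max_age

tok, ts, sig = sign_token("4012888888881881")  # standard test number
```

An attacker who steals only the token can neither forge the signature (no secret) nor replay an old capture (the timestamp ages out), which is the "prove the token is being properly used" property Marc describes.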

By Rich  on  10/28  at  10:04 AM


That’s what the RSA/FDC solution does- it immediately encrypts the card, sends it in for the token, and then the token is what’s kept. The original card is always required for a new purchase, but the token can be used for refunds and such.

For recurring payments, the model will have to change and the retailer will need to be able to use the token for new purchases. In those cases, the token and a special retailer ID/token will be needed for a new purchase transaction (which could be a digital cert). Thus even if the token is stolen, the attacker also needs to compromise the retailer identification, and even then will only be able to initiate transactions from that specific network.

That’s why I like the approach… but there are other ways to pull this off.

By huroye scott  on  05/27  at  08:08 AM

Let’s also remember that tokenization cannot be accomplished by itself.  It has to be accompanied by defined processes that will help close the gap of human intervention.  One of the best ways to implement this architecture is the use of web services.  This helps control who’s asking for the token. In the example given by Rich, the parent has to know that it is his/her child that is asking for the token and not someone posing as that child.


By Troy  on  09/17  at  05:11 AM

Interesting discussion.  As I read the article I was also interested in the ways in which a token could be used as a ‘proxy’ for the PAN in such a system - the necessity of having the actual card number for the initial purchase seems to assuage most of that concern. 

Another aspect of this method that I have not seen mentioned here: if the Tokens in fact conform to the format of true PANs, won’t a DLP scan for content recognition typically ‘discover’ the Tokens as potential PANs?  How would the implementing organization reliably prove the distinction, or would they simply rest on the assumption that as a matter of design any data lying around that *looks* like a credit card number _must_be_ a Token?  I’m not sure that would cut the mustard with a PCI auditor.  Seems like this could be a bit of a sticky wicket still?
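Troy's DLP concern is easy to demonstrate: a naive content-recognition rule can't tell a Luhn-valid, format-preserving token from a real PAN. The toy rule below (my own sketch, not any DLP product's logic) flags any 16-digit Luhn-valid run; both test values below are standard test numbers, one playing the "real PAN" and one the "token", and the rule matches both.

```python
import re

def luhn_ok(number):
    """Standard Luhn validity check."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pan_like(text):
    """Toy DLP content rule: any 16-digit run that passes the Luhn check."""
    return [m for m in re.findall(r"\b\d{16}\b", text) if luhn_ok(m)]

# The first number stands in for a real PAN, the second for a
# format-preserving token; the rule cannot distinguish them.
hits = find_pan_like("pan: 4111111111111111 token: 4012888888881881")
```

In practice the merchant would need some out-of-band way to prove the distinction, such as checking suspect numbers against the token vault, or a token format the DLP tool can whitelist.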


