Continuing our series on tokenization for compliance, it’s time to look at how tokens are used to secure payment data. I will focus on how tokenization is employed for credit card security and compliance, because this model is driving adoption today.

As defined in the introduction, tokenization is the process of replacing sensitive information with tokens. The tokens are ‘random’ values that resemble the sensitive data they replace but have no intrinsic value. In payment security, tokenization is used to replace sensitive data such as bank account numbers, but its recent surge in popularity has been driven specifically by replacing credit card data. The vast majority of current tokenization projects are squarely intended to reduce the cost of achieving PCI compliance. Removing credit cards from all or part of your environment sounds like a good security measure, and it is: thieves can’t steal what’s not there. But that’s not actually why tokenization has become popular for credit card replacement. Tokenization is popular because it saves money.
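To make the idea concrete, here is a minimal sketch of the core operation a token server performs, assuming a format-preserving token that keeps the last four digits of the card. The vault structure and function names are hypothetical, not any vendor’s actual implementation.

```python
import secrets

# In-memory stand-in for a secure token vault; a real system would use a
# hardened, access-controlled database rather than a dictionary.
token_vault = {}

def tokenize(pan: str) -> str:
    """Replace a card number (PAN) with a random token of the same length
    that keeps the last four digits, so systems that only display or match
    on the last four keep working."""
    if pan in token_vault:
        return token_vault[pan]
    # Random digits for the rest of the token -- there is no mathematical
    # relationship to the original number, so the token cannot be reversed.
    # A production system would also verify the result fails Luhn validation
    # and does not collide with an existing token.
    random_part = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    token = random_part + pan[-4:]
    token_vault[pan] = token
    return token

print(tokenize("4111111111111111"))
```

The point of keeping the last four digits is purely operational convenience: receipts, customer-care screens, and fraud reports keep working without exposing the full card number.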

Large merchants must undergo extensive examinations of their IT security and processes to verify compliance with the Payment Card Industry Data Security Standard (PCI-DSS). Every system that transmits or stores credit card data is subject to review. Small and mid-sized merchants must go through all the same steps as large merchants except the compliance audit, where they are on the honor system. The list of DSS requirements is lengthy – a substantial investment of time and money is required to create policies, secure systems, and generate the reports PCI assessors need. While the Council’s prescribed security controls are conceptually simple, in practice they demand a security review of the entire IT infrastructure.

Over the last couple of decades firms have used credit card numbers to identify and reference customers, transactions, payments, and chargebacks. As the standard reference key, credit card numbers were stored in billing, order management, shipping, customer care, business intelligence, and even fraud detection systems. They were used to cross-reference data from third parties in order to gather intelligence on consumer buying trends. Large retail organizations typically stored credit card data in every critical business processing system. When firms began suffering data breaches they started to encrypt databases and archives, and to implement central key management systems to control access to payment data. But faulty encryption deployments, SQL injection attacks, and credential hijacking continued to expose credit cards to fraud. The Payment Card Industry quickly stepped in to require a standardized set of security measures from everyone who processes or stores credit card data. The problem is that it is incredibly expensive to audit network, platform, application, user, and data security across all these systems – and then document usage and security policies to demonstrate compliance with PCI-DSS.

If credit card data is replaced with tokens, almost half of the security checks no longer apply. For example, the requirement to encrypt databases or archives goes away along with the credit card numbers. Key management systems shrink, as they no longer need to manage keys across the entire organization. You don’t need to mask report data, rewrite applications, or reset user authorization to restrict access. Tokenization drastically reduces the complexity and scope of auditing and securing operations. That doesn’t mean you don’t need to maintain a secure network, but the requirements are greatly reduced. Even for smaller merchants who can self-assess, tokenization reduces the workload. You must still secure your systems – primarily to ensure the token and payment services are not open to attack – but the burden is dramatically lightened.

Tokens can be created and managed in-house or by third-party service providers. Both models support web commerce and point-of-sale environments, and integrate easily with existing systems. With an in-house token platform, you own and operate the token system, including the token database. The token server is integrated with back-end transaction systems and swaps tokens in during transactions. You still keep credit card data, but only a single copy of each card, in the secure token database. This type of system is most common with very large merchants who need to keep the original card data and want to keep transaction fees to a minimum. Third-party token services, such as those provided directly by payment processors, return a token to signify a successful payment; the merchant retains only the token rather than the credit card. The payment processor stores the card data along with the issued token for recurring payments and dispute resolution. Small and mid-sized merchants with no need to retain credit card numbers lean towards this model: they sacrifice some control and pay higher transaction fees in exchange for convenience, reduced liability, and lower compliance costs.
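For the third-party model, the integration is usually little more than an API call: the card goes to the processor once, and only a token comes back for the merchant to store. The endpoint, field names, and response format below are hypothetical – real processor APIs differ – but the shape of the exchange is the same.

```python
import requests

# Hypothetical processor endpoint; substitute your processor's actual API.
PROCESSOR_URL = "https://api.example-processor.com/v1/tokenize"

def charge_and_tokenize(pan: str, expiry: str, amount_cents: int, api_key: str) -> str:
    """Authorize a payment and return the processor-issued token.
    The merchant stores only the token; the processor keeps the card."""
    response = requests.post(
        PROCESSOR_URL,
        json={"card_number": pan, "expiry": expiry, "amount": amount_cents},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    token = response.json()["token"]
    # Persist only the token in order, billing, and customer-care systems;
    # the full card number never lands in the merchant's databases.
    return token
```

The design trade-off is visible in the code: the merchant never touches stored card data again, which is exactly what shrinks the compliance scope, but every recurring charge or dispute now depends on the processor’s token service.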

Deployment of token systems can still be tricky, as you need to substitute existing payment data with tokens. Updates must be synchronized across multiple systems so keys and data maintain relational integrity. Token vendors, whether in-house platform vendors or third-party service providers, offer tools and services to perform the conversion. If you have credit card data scattered throughout your company, plan on paying a bit more during the conversion, as described in the sketch below.
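A one-time conversion might look roughly like the following sketch: build a single card-to-token mapping, then apply it to every table that references card numbers so the tables still join correctly afterwards. The table and column names are made up for illustration, and a real migration would run against production-grade databases rather than SQLite.

```python
import secrets
import sqlite3

def make_token(pan: str) -> str:
    # Same format-preserving random token idea as the earlier sketch.
    digits = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    return digits + pan[-4:]

def migrate(db_path: str) -> None:
    """One-time conversion: replace card numbers with tokens in every
    table that references them, using a single mapping so relational
    integrity is preserved."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()

    # Collect every card number from all systems that hold one.
    pans = {row[0] for row in cur.execute("SELECT card_number FROM orders")}
    pans |= {row[0] for row in cur.execute("SELECT card_number FROM billing")}
    mapping = {pan: make_token(pan) for pan in pans}

    # Apply the same mapping everywhere so cross-system joins still work.
    for pan, token in mapping.items():
        cur.execute("UPDATE orders SET card_number = ? WHERE card_number = ?", (token, pan))
        cur.execute("UPDATE billing SET card_number = ? WHERE card_number = ?", (token, pan))

    conn.commit()
    conn.close()
```

The essential discipline is that one mapping is applied everywhere in a coordinated pass; converting systems piecemeal with different tokens for the same card is what breaks reporting and reconciliation.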

But tokenization is mostly a drop-in replacement for encryption of credit card data. It requires very little in the way of changes to your systems, processes, or applications. While encryption can provide very strong security, customers and auditors prefer tokenization because it’s simpler to implement, simpler to manage, and easier to audit.

Today, tokenization of payment data is driving the market. But there are many other uses for data tokenization, particularly in health care and for other Personally Identifiable Information (PII). In the mid-term I expect to see tokenization increasingly applied to databases containing PII, which is the topic for our next post.
