Tokenization is currently one of the hottest topics in database and application security. In this report we explain what tokenization is, when it works best, and how it works – and give recommendations to help choose the best solution.
At its core, tokenization replaces the original sensitive data with non-sensitive placeholders. Tokenization is closely related to encryption – both mask sensitive information – but its approach to data protection is different. Encryption protects data by scrambling it with a process that is reversible if you have the right key: anyone with access to the key and the encrypted data can recreate the original values. Tokenization instead replaces the real value entirely with a random token that merely stands in for it.
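To make the distinction concrete, here is a minimal sketch of a token vault in Python. The class name and structure are illustrative assumptions, not a production design: a real system would persist the mapping in a hardened database and tightly control access to it. The key point is that the token is random, so it cannot be "decrypted" – only looked up.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault (assumed design, not a product API)."""

    def __init__(self):
        # Maps each issued token back to the original sensitive value.
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # The token is purely random; it has no mathematical relationship
        # to the original value, unlike ciphertext produced by encryption.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Recovering the original requires access to the vault itself --
        # there is no key that can reverse the substitution.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token != "4111-1111-1111-1111")          # the stored token reveals nothing
print(vault.detokenize(token))                 # only the vault can map it back
```

Note the design consequence: systems that hold only tokens carry far less risk, because compromising them yields nothing without also compromising the vault.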