Tokenization

The PCI SSC released the August 2011 “Information Supplement: PCI DSS Tokenization Guidelines.” In the guidelines, the council describes various aspects of tokenization, including desired security properties of the system and the tokens, as well as how tokenization may reduce PCI DSS scope (which is ultimately tokenization’s raison d’être). (Source: Ramon Krikken, Gartner IT1 Security and Risk Management Strategies team.)

Why Tokenization?

Tokenization is a useful and valid technique for risk mitigation. It is not encryption; it is more secure because there is no key to steal, and it enables you to decrease PCI DSS scope. Video explaining tokenization.

Once the transaction is authorized, the processor generates a 16-character token and passes it back to the merchant’s POS, where it can be stored in place of the PAN. For credit card processing applications the token is therefore the same length as the 16-digit PAN it replaces, and it can retain the last four digits of the PAN for transaction reference.
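
As a minimal sketch of what such a token could look like (illustrative only; real processors define their own formats), twelve random digits can be paired with the PAN's last four:

    import secrets

    def make_token(pan: str) -> str:
        """Build a 16-character surrogate: 12 random digits plus the PAN's last four."""
        # Only the last four real digits survive; the rest carries no cardholder data.
        surrogate = "".join(secrets.choice("0123456789") for _ in range(12))
        return surrogate + pan[-4:]

    make_token("4111111111111111")  # e.g. '7302915486021111'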

One feature that tokenization can provide is support for recurring transactions. In a recurring transaction scenario, the processor maintains a mapping between tokens and the original cardholder data: on each recurring charge the merchant passes the token back to the processor, which looks it up and generates a transaction based on the cardholder data associated with the token.
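
A toy version of that mapping is sketched below; the class and method names are illustrative, and a real vault lives at the processor, is encrypted at rest, and is itself in PCI DSS scope.

    import secrets

    class TokenVault:
        """Illustrative processor-side vault mapping tokens back to PANs."""

        def __init__(self):
            self._vault = {}

        def tokenize(self, pan: str) -> str:
            # Issue a fresh 16-character token and remember which PAN it stands for.
            token = "".join(secrets.choice("0123456789") for _ in range(12)) + pan[-4:]
            self._vault[token] = pan
            return token

        def detokenize(self, token: str) -> str:
            # On a recurring charge, the merchant sends the token back and the
            # processor recovers the PAN to authorize a new transaction.
            return self._vault[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")   # merchant stores only this
    pan = vault.detokenize(token)                # processor side only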

What makes a good token? One with no computable relationship to the PAN it replaces; random generation with a FIPS-validated random number generator (RNG) is theoretically the best choice.
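
In Python terms (an analogy for illustration, not a FIPS-validation claim), that means drawing token digits from the operating system's CSPRNG via the secrets module rather than from the predictable Mersenne Twister behind random:

    import random
    import secrets

    random.seed(1234)
    random.randint(0, 10**12)    # Mersenne Twister: reproducible, hence guessable

    secrets.randbelow(10**12)    # OS CSPRNG: unpredictable, suitable for tokens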

Tokenization Guidance: PCI Requirement Checklist

PCI Tokenization Guidance PDF

A checklist of each PCI requirement affected by tokenization, with guidance on how to adjust compliance efforts accordingly.

Tokenization Products

Intel Application Security and Identity Products - Tokenization Broker

By using the Intel Tokenization Broker (Intel ETB) solution you get:

  • Storage and processing using surrogate data rather than the original data.
  • Format-preserving tokenization, allowing parts of the PAN to be retained for internal purposes (see the sketch after this list).
  • Handling of any form of data, such as Microsoft Word, Excel, PDF, or other document types.
  • The necessary security certifications (Common Criteria EAL 4+, FIPS 140-2, etc.).
  • The ability to secure the perimeter, the edge, and the API.
  • Reduced PCI DSS scope and protection of cardholder data.
  • Deployment anywhere within the enterprise or extended enterprise, including partner locations and virtual environments such as the cloud.
  • Placement in the DMZ, thanks to a hardened appliance form factor.
  • Reduced annual assessment costs.
  • Help with compliance issues.
  • A hardware-based random token generator.
  • Full-disk encryption, database storage encryption, secure boot/BIOS, Tripwire, and protection against snooping.
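
As a hedged illustration of the format-preserving bullet above (a generic sketch, not Intel's actual mechanism), a token might retain the issuer BIN (first six digits) and the last four for internal routing and reporting, randomizing only the middle digits; note that the PCI guidance treats first-six-plus-last-four tokens with extra care, since few digits remain unknown:

    import secrets

    def fpt_token(pan: str) -> str:
        """Hypothetical format-preserving token: keep first six and last four."""
        # Only the middle digits are replaced, so length and layout are preserved.
        middle = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 10))
        return pan[:6] + middle + pan[-4:]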

EncryptRIGHT Tokenization

The EncryptRIGHT product suite includes an optional tokenization module that reduces the scope and cost of a PCI audit. 

EncryptRIGHT tokenization can be used for all types of processing and is suitable for a wide range of personally identifiable information (PII), such as Social Security numbers, passport numbers, and health records. Here are some common reasons why tokenization is a popular choice:

  • PCI compliance. For merchants around the world, substituting tokens for card numbers meets compliance requirements and reduces the threat of a data breach.
  • No impact on your existing systems. Tokens have the same format and data type as the original values, so you don't need to change your applications or databases to support tokenized values (see the sketch after this list).
  • Reduced audit costs. By removing sensitive information from data processing systems, security audits become simpler, smaller in scope, easier to perform, and less expensive.
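
For instance, a same-format token for a US Social Security number could look like the sketch below (illustrative only, not EncryptRIGHT's API): every digit is replaced, but the NNN-NN-NNNN layout survives, so an existing column type or regex check still accepts the value.

    import secrets

    def tokenize_ssn(ssn: str) -> str:
        """Swap each digit for a random one, preserving the NNN-NN-NNNN layout."""
        return "".join(secrets.choice("0123456789") if ch.isdigit() else ch
                       for ch in ssn)

    tokenize_ssn("123-45-6789")  # e.g. '804-17-3952'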

Tokenization Manager, SafeNet

  • Complies with PCI DSS Tokenization Guidelines
  • Follows VISA Tokenization Best Practices
  • Supports numeric, alphanumeric and date data types
  • Format Preserving Tokenization
  • Supports token masking and Luhn-algorithm pass/fail checks (see the sketch after this list).
  • Supports multiple token vaults
  • Ability to purge tokens by date
  • Highly scalable - can generate and retrieve millions of tokens per day
  • Supported token vault databases:
    • Microsoft SQL Server
    • Oracle
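
The Luhn pass/fail check mentioned in the list is a public checksum, so it can be sketched independently of SafeNet's implementation; a product can use it to validate an incoming PAN, or to verify that a generated token deliberately fails the check and so cannot be mistaken for a live card number:

    def luhn_ok(number: str) -> bool:
        """Return True if the digit string passes the Luhn checksum."""
        digits = [int(d) for d in number]
        # Double every second digit from the right, subtracting 9 on overflow.
        for i in range(len(digits) - 2, -1, -2):
            digits[i] *= 2
            if digits[i] > 9:
                digits[i] -= 9
        return sum(digits) % 10 == 0

    luhn_ok("4111111111111111")  # True for this well-known test number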