Vaultless Tokenization Released for the KMES Series 3
Futurex today announced the release of vaultless tokenization using the KMES Series 3 Key Management Enterprise Server. Tokenization is a method of protecting sensitive data such as credit card numbers by cryptographically generating substitute values known as tokens. Tokens have no intrinsic value but allow authorized users to retrieve the clear data when needed. If tokenized data is lost or stolen, it is useless to cybercriminals. This functionality is available in the KMES Series 3 immediately, and additional technical details are outlined in a whitepaper.
Futurex’s vaultless tokenization is natively interoperable with other Hardened Enterprise Security Platform use cases such as PIN processing, card and mobile issuance, remote key loading for ATM and Point of Sale, and Point-to-Point Encryption. It uses the FF1 format-preserving encryption algorithm defined in the National Institute of Standards and Technology’s Special Publication 800-38G: Recommendation for Block Cipher Modes of Operation: Methods for Format-Preserving Encryption. Because format-preserving encryption produces tokens with the same length and character set as the original data, tokenization can be deployed without changing business applications or database schemas, an important step in streamlining the deployment process.
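The format-preserving property can be illustrated with a toy Feistel construction over decimal strings. This is a simplified stand-in for illustration only, not the NIST FF1 algorithm or Futurex's implementation; tweaks, key management, and FF1's precise round structure are omitted:

```python
import hashlib
import hmac


def _prf(key: bytes, round_no: int, half: str, out_len: int) -> int:
    """Pseudorandom round function: HMAC-SHA256 reduced to out_len digits."""
    digest = hmac.new(key, bytes([round_no]) + half.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest, "big") % (10 ** out_len)


def tokenize(key: bytes, digits: str, rounds: int = 10) -> str:
    """Encrypt a decimal string into a token of the same length and charset."""
    mid = len(digits) // 2
    l, r = digits[:mid], digits[mid:]
    for i in range(rounds):
        if i % 2 == 0:  # even rounds modify the right half using the left
            r = str((int(r) + _prf(key, i, l, len(r))) % 10 ** len(r)).zfill(len(r))
        else:           # odd rounds modify the left half using the right
            l = str((int(l) + _prf(key, i, r, len(l))) % 10 ** len(l)).zfill(len(l))
    return l + r


def detokenize(key: bytes, token: str, rounds: int = 10) -> str:
    """Invert tokenize() by running the Feistel rounds in reverse."""
    mid = len(token) // 2
    l, r = token[:mid], token[mid:]
    for i in reversed(range(rounds)):
        if i % 2 == 0:
            r = str((int(r) - _prf(key, i, l, len(r))) % 10 ** len(r)).zfill(len(r))
        else:
            l = str((int(l) - _prf(key, i, r, len(l))) % 10 ** len(l)).zfill(len(l))
    return l + r
```

A 16-digit card number tokenizes to another 16-digit string, so a database column sized for card numbers can hold the token unchanged; an authorized caller holding the key recovers the original value with `detokenize()`.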
For organizations within the scope of Payment Card Industry Data Security Standard (PCI DSS) requirements for safeguarding cardholder data, tokenization can be a valuable tool. Under this standard, applications and databases storing clear cardholder data must fulfill rigorous compliance requirements. When clear cardholder data is replaced with tokens, that compliance scope can be significantly reduced.
In addition to cardholder data, the KMES Series 3 also tokenizes personally identifiable information (PII). PII tokenization applies to data types such as passport numbers, birth dates, Social Security numbers, and account numbers. Futurex's tokenization interface, offered as network-level and REST APIs, is designed for integration into a variety of environments.
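Integration through a REST API typically reduces to an authenticated HTTP POST carrying the value to be tokenized. The sketch below builds such a request with Python's standard library; the endpoint path, JSON field names, and bearer-token authentication are illustrative assumptions, not Futurex's documented interface:

```python
import json
import urllib.request


def build_tokenize_request(base_url: str, api_key: str, value: str) -> urllib.request.Request:
    """Construct (but do not send) a hypothetical REST tokenization request.

    The "/tokenize" path and the "data" field are assumptions for
    illustration; consult the vendor's API documentation for real names.
    """
    payload = json.dumps({"data": value}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/tokenize",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

An application would pass the returned request object to `urllib.request.urlopen()` and store the token from the response in place of the clear value.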
“Tokenization is a well-established and trusted method of de-scoping data from PCI DSS requirements, and we are pleased to include it in our Hardened Enterprise Security Platform,” said Ryan Smith, Vice President for Global Business Development. “Although tokenization’s use is expanding greatly in financial environments alongside our Point-to-Point Encryption and ATM/Point of Sale remote key loading functionality, we also see tremendous applicability in a wide range of other industry verticals.”