Sunday, March 8, 2026

Tokenization is at the forefront of the fight for data security


Presented by Capital One Software


Tokenization is becoming the cornerstone of modern data security, helping companies separate the value of their data from its risk. In this VB In Conversation, Ravi Raghu, president of Capital One Software, discusses how tokenization reduces the value of compromised data while preserving the underlying data’s format and usability, and shares Capital One’s own experience using tokenization at scale.

Tokenization, Raghu argues, is a far better approach. It converts sensitive data into a non-sensitive stand-in, called a token, while the original value is secured in a digital vault. The token preserves both the format and usability of the sensitive data and can be used across a variety of applications – including AI models. And because tokenization eliminates the need to manage encryption keys or devote computing power to continuous encryption and decryption, it offers companies one of the most scalable ways to protect their most sensitive data, he added.
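For readers who want a concrete picture of the vault model Raghu describes, here is a minimal, illustrative sketch of vault-based, format-preserving tokenization (the class and method names are hypothetical, and this is not Capital One’s implementation):

```python
import secrets
import string

# Illustrative sketch: the real value lives only in the vault; callers see a
# token that keeps the original's shape, so downstream systems work unchanged.
class TokenVault:
    def __init__(self):
        self._vault = {}    # token -> original value (the only place real data lives)
        self._reverse = {}  # original value -> token (reuse the token for repeat values)

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        # Preserve format: digits become random digits, letters become random
        # letters, and separators pass through untouched.
        token = "".join(
            secrets.choice(string.digits) if ch.isdigit()
            else secrets.choice(string.ascii_letters) if ch.isalpha()
            else ch
            for ch in value
        )
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callers authorized to reach the vault can recover the original.
        return self._vault[token]

vault = TokenVault()
ssn_token = vault.tokenize("123-45-6789")
print(ssn_token)                    # e.g. "804-19-3327": same shape, no real data
print(vault.detokenize(ssn_token))  # "123-45-6789", recoverable only via the vault
```

If an attacker steals only the tokens, there is nothing to decrypt or brute-force; the mapping back to real values exists solely inside the vault.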

“The killer part, from a security standpoint, compared to other methods, is that if a bad actor gets the data, they get the tokens,” he explained. “The actual data is not stored in the token, unlike other methods such as encryption, where the actual data is there just waiting for someone to get a key or use brute force to reach it. By all accounts, this is the ideal way to approach protecting sensitive data.”

Tokenization differentiator

Most organizations are just scratching the surface of data security, adding protection only at the very end, when the data is read, to keep unauthorized users from accessing it. At a minimum, organizations should secure data as it is written and while it sits in storage. But best-in-class organizations go further, protecting data from the moment it is created.

At one end of the security spectrum is a plain lock-and-key approach that restricts access but leaves the underlying data intact. More advanced methods, such as masking or modifying data, permanently change its meaning, which can reduce its usefulness. File-level encryption provides broad protection for large volumes of stored data, but field-level encryption – encrypting a single value such as a Social Security number – is more of a challenge: encrypting each field and then decrypting it at the point of use requires significant computation. And encryption has a fatal flaw: the original data is still there, and all an attacker needs to reach it is the key.

Tokenization avoids these pitfalls by replacing the original data with a surrogate that has no intrinsic value. If the token is intercepted – whether by the wrong person or the wrong machine – the data itself remains secure.

The business value of tokenization

“You’re basically protecting data, and that’s invaluable,” Raghu said. “The other thing that’s invaluable – can it still be used later for modeling purposes? On one hand it’s a matter of protection, and on the other it’s a matter of enabling business activity.”

Because tokenization preserves the structure and order of the original data, the tokenized data can still be used for modeling and analysis, turning protection into a business enabler. Take private health data regulated by HIPAA, for example: with tokenization, that data can feed pricing models or gene therapy research while maintaining compliance.
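As a hypothetical illustration (the records and field names below are invented), analytics such as per-member counts and spend aggregates still work when the protected identifier has been replaced by a deterministic, format-preserving token:

```python
from collections import Counter

# Claims records in which the HIPAA-protected member ID has already been
# replaced by a deterministic token. Equal IDs always map to equal tokens,
# so grouping and joining still work without exposing real identifiers.
claims = [
    {"member_token": "830-11-4421", "procedure": "MRI",   "cost": 1200},
    {"member_token": "830-11-4421", "procedure": "X-ray", "cost": 150},
    {"member_token": "271-94-0038", "procedure": "MRI",   "cost": 1100},
]

# Per-member utilization and spend for a pricing model, computed entirely on tokens.
visits_per_member = Counter(c["member_token"] for c in claims)
spend_per_member = {}
for c in claims:
    spend_per_member[c["member_token"]] = spend_per_member.get(c["member_token"], 0) + c["cost"]

print(visits_per_member)  # Counter({'830-11-4421': 2, '271-94-0038': 1})
print(spend_per_member)   # {'830-11-4421': 1350, '271-94-0038': 1100}
```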

“If your data is already protected, you can expand data use across the enterprise and get everyone creating more and more value from it,” Raghu said. “Conversely, if that protection isn’t there, modern enterprises are very reluctant to give more people – or more and more AI agents – access to their data. Ironically, that limits the blast radius of innovation. The impact of tokenization is huge, and there are many metrics that can measure it – operational impact, revenue impact and, of course, peace of mind from a security perspective.”

Breaking down adoption barriers

Until now, the primary challenge with traditional tokenization has been performance, and artificial intelligence demands unprecedented scale and speed. That is the main challenge Capital One tackles with Databolt, a vaultless tokenization solution that can generate up to 4 million tokens per second.

“Capital One has been doing tokenization for over a decade. We started because we serve 100 million banking customers, and we want to protect that sensitive data,” Raghu said. “We ate our own dog food, using our internal tokenization capability over 100 billion times a month. We took that knowledge, capability, scale and speed, innovated on it, and turned it into a commercial offering for the world to consume.”

Vaultless tokenization is an advanced form of tokenization that does not require a central database (vault) to store token mappings. Instead, it uses mathematical algorithms, cryptographic techniques, and deterministic mapping to dynamically generate tokens. This approach is faster, more scalable, and eliminates the security risks associated with vault management.
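As a rough illustration of the deterministic, vaultless idea (a simplified sketch, not Databolt’s actual algorithm, with the key handling assumed), a keyed hash can derive a format-preserving token from the input alone, so no mapping database needs to exist:

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would live in a key-management service.
SECRET_KEY = b"replace-with-a-managed-secret"

def tokenize_digits(value: str, key: bytes = SECRET_KEY) -> str:
    """Derive a format-preserving token deterministically from a keyed hash:
    digits map to pseudorandom digits, separators pass through unchanged."""
    digest = hmac.new(key, value.encode(), hashlib.sha256).digest()
    # Repeat the digest so there is one keystream byte per input character.
    stream = iter(digest * (len(value) // len(digest) + 1))
    out = []
    for ch in value:
        out.append(str(next(stream) % 10) if ch.isdigit() else ch)
    return "".join(out)

print(tokenize_digits("123-45-6789"))  # deterministic and format-preserving
print(tokenize_digits("123-45-6789") == tokenize_digits("123-45-6789"))  # True: no vault lookup
```

A production vaultless system would use a reversible, format-preserving construction so authorized services can detokenize; the sketch above only shows how tokens can be generated deterministically without a central vault.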

“We realized that due to the scale and speed requirements, we needed to build these capabilities ourselves,” Raghu said. “We are constantly working to ensure that the solution can scale to hundreds of billions of operations per month. All of our innovations are focused on building intellectual property and the ability to do so within our enterprise at a battle-proven scale to serve our customers.”

While conventional tokenization methods can introduce complexity and slow down operations, Databolt integrates with encrypted data warehouses, letting companies maintain strong security without degrading performance or operations. Tokenization occurs within the client’s environment, eliminating the round trip to an external network that can also slow performance.

“We believe that, in principle, tokenization should be easy to adopt,” Raghu said. “You should be able to secure your data very quickly and operate at the speed, scale and cost needs that organizations have. I think this has been a critical barrier to mass adoption of tokenization so far. In the world of AI, this will become a huge enabler.”

Don’t miss the full interview with Ravi Raghu, president of Capital One Software, here.


Sponsored articles are content created by a company that pays to publish or has a business relationship with VentureBeat, and they are always clearly marked. For more information, please contact sales@venturebeat.com.
