We have made great strides in the area of encryption since its introduction over 3,000 years ago, particularly with the advent of asymmetric and hash algorithms. From a policy perspective, the National Institute of Standards and Technology (NIST) has helped us evolve our encryption capabilities by pushing for public submission and review of algorithms, as it did with the Advanced Encryption Standard (AES) and Secure Hash Algorithm 3 (SHA-3). In 2009, IBM research scientist Craig Gentry presented the first fully homomorphic encryption scheme, and in 2017, IBM unveiled pervasive encryption as part of its z14 platform.

Given all these developments, why do many enterprises still resist using encryption? For years, experts within the security community have debated worst-case scenarios — that an encryption key might be lost or stolen, that a brute-force attack might unlock data, or that an intentional backdoor or unforeseen bug in the encryption algorithm might cause a failure, or encryption blast, that compromises all the data under its protection. These discussions foster fear and uncertainty in our community and can cause us to overreact or, worse, do nothing.

Why Are Organizations Reluctant to Adopt Encryption?

In 1883, Dutch cryptographer Auguste Kerckhoffs formulated the principle that a cryptosystem should be secure even if everything about the system — except the key — is public knowledge. In simpler terms, even if you know everything about a lock, you should not be able to open it without the proper key.

Encryption critics have long stated that encryption carries overhead. It does — it must — since no algorithm is free of processor consumption. However, the current standard for encryption, AES, is extremely efficient, and its security and performance were vetted publicly against numerous implementations, in a variety of languages and across a number of hardware architectures. In fact, even its implementation in silicon was evaluated before it was considered for ratification. Furthermore, AES has withstood well over a decade of use without compromise since its formal ratification in 2001. Unlike the algorithms before it, AES epitomizes Kerckhoffs’ principle: The lock is well-known, but the key is virtually impossible to find, especially within a 256-bit keyhole.
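To make the lock-and-key metaphor concrete, here is a minimal sketch (mine, not from the article) of AES-256 in authenticated GCM mode, using the third-party Python "cryptography" package; the file name and message are placeholders.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the 256-bit "keyhole"
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # standard 96-bit GCM nonce; never reuse one with the same key
plaintext = b"quarterly-report contents"  # illustrative data
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# The algorithm is completely public (Kerckhoffs' principle);
# only the holder of the key can open the lock.
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext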

Managing Encryption Keys

Now let’s talk about managing keys. As security professionals, we do not place all of our reports, presentations or spreadsheets into a single file, because that would be too difficult to manage. Instead, we manage thousands of files on our laptops with ease by separating them by topic, time, type or filename. Our emails, and even our countless hyperlinks, are organized in a similar fashion. Why, then, can we manage such a large set of files while managing a similar number of encryption keys seems so challenging? In truth, managing encryption keys should be easier than managing files.

If we placed all of our data within one file or database, the loss of that database would have a very large blast radius: all of the data would be gone. However, since we break our data into manageable chunks (files, databases or spreadsheets), the loss of one file does not mean the loss of everything. The blast radius is therefore smaller.

This approach enables us to minimize the encryption blast radius. Encryption not only gives you the ability to protect data, but it also allows you to encrypt it on a file-by-file basis, regardless of where it is stored, with each file scrambled under a unique encryption key. With granular file encryption and a unique key per file, the loss of one key exposes only one file, keeping the blast radius small.
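As a hypothetical sketch of that idea (the key table and function names are mine, not the article's), per-file encryption with a fresh AES-256 key per file might look like this in Python:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_table = {}  # hypothetical key store: one unique key per file

def encrypt_file(name: str, data: bytes) -> bytes:
    key = AESGCM.generate_key(bit_length=256)  # fresh 256-bit key per file
    key_table[name] = key
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, data, None)

def decrypt_file(name: str, blob: bytes) -> bytes:
    key = key_table[name]  # losing this key loses only this one file
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)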

Better yet, granular encryption and key management also allow you to erase data forever by merely deleting the associated encryption key. Can you imagine a file delete feature that merely discarded the encryption key, rendering the data cryptographically erased and unrecoverable? Would rewriting with ones and zeros even be necessary? Sure, to feel better, you could still scrub the bits of the key, but overwriting a 256-bit key takes far less time than overwriting a 1 GB file.
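Continuing the hypothetical sketch above (reusing its key_table and encrypt_file), cryptographic erasure is then nothing more than discarding the file's key:

def crypto_erase(name: str) -> None:
    # Scrubbing 32 bytes of key material is far cheaper than
    # rewriting a multi-gigabyte file with ones and zeros.
    key_table.pop(name, None)

blob = encrypt_file("q3-report.xlsx", b"sensitive figures")  # placeholder data
crypto_erase("q3-report.xlsx")
# decrypt_file("q3-report.xlsx", blob) now raises KeyError:
# the key is gone, so the ciphertext is unrecoverable.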

Are We Future-Proof?

We are never future-proof. Encryption is a practice, not a destination. We still have a lot of data that needs our protection, and some of our locks and keys may need to be updated as well.

We do not know precisely when quantum computing will arrive, nor when it will be cost-effective, but NIST has reported that, by 2030, a quantum computer capable of breaking a 2,000-bit RSA key in a matter of hours could be built for about $1 billion. That is certainly not couch change, but the price will likely fall from there.

Will this advancement impact today’s encryption algorithms? Yes, but the impact on symmetric algorithms such as AES will be mild compared to the disastrous impact it will have on asymmetric algorithms such as RSA and elliptic curve cryptography (ECC).
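The reason, roughly (my gloss, not the article's): Grover's algorithm offers only a quadratic speedup on brute-force key search, so a 256-bit symmetric key retains about 128 bits of effective strength, while Shor's algorithm solves the factoring and discrete-logarithm problems behind RSA and ECC in polynomial time (the cubic bound below is the commonly cited rough figure):

$$
\underbrace{O\!\left(\sqrt{2^{256}}\right) = O\!\left(2^{128}\right)}_{\text{Grover vs. AES-256: still infeasible}}
\qquad
\underbrace{O\!\left((\log N)^{3}\right)}_{\text{Shor vs. an RSA modulus } N\text{: broken in polynomial time}}
$$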

In response, we should not scream, declare the sky is falling or wait before we do anything. Instead, we should encrypt all of our data at a very granular level using AES-256, delete all data that we do not want to be made public in 10 years, and find new algorithms to replace RSA and ECC.

It’s that simple — we must keep moving forward. It is the only way to minimize the encryption blast radius.

