We have made great strides in encryption since its introduction more than 3,000 years ago, particularly with the advent of asymmetric and hash algorithms. From a policy perspective, the National Institute of Standards and Technology (NIST) has helped our encryption capabilities evolve by pushing for public submission and review of algorithms, as it did with the Advanced Encryption Standard (AES) and Secure Hash Algorithm 3 (SHA-3). In 2009, IBM research scientist Craig Gentry presented the first fully homomorphic encryption scheme, and in 2017 IBM unveiled pervasive encryption as part of its z14 platform.

Given all these developments, why do many enterprises still resist using encryption? For years, experts within the security community have debated worst-case scenarios: that an encryption key might be lost or stolen, that a brute-force attack might unlock data, or that an intentional backdoor or unforeseen bug in an encryption algorithm might cause a failure (an "encryption blast") that compromises all the data under its protection. These discussions foster fear and uncertainty in our community and can cause us to overreact or, worse, do nothing.

Why Are Organizations Reluctant to Adopt Encryption?

In the 19th century, Dutch cryptographer Auguste Kerckhoffs formulated a principle stating that a cryptosystem should be secure even if everything about the system, except the key, is public knowledge. In simpler terms, even if you know everything about a lock, you should not be able to open it without the proper key.

Encryption critics have long stated that encryption carries overhead. It does, and it must, since no algorithm is free of processor consumption. However, the current standard for encryption, AES, is extremely efficient, and its security and performance were vetted publicly across numerous implementations, in a variety of languages and on a range of hardware architectures. In fact, even its implementation in silicon was evaluated before it was considered for ratification. Furthermore, AES has withstood years of widespread use without compromise since its formal ratification in 2001. Unlike the algorithms that came before it, AES epitomizes Kerckhoffs’ principle: The lock is well-known, but the key is virtually impossible to find, especially within a 256-bit keyhole.
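
To make that concrete, here is a minimal sketch of AES-256 in practice, using the open-source Python cryptography package (an illustrative choice on our part; any vetted AES implementation works the same way). Notice that nothing about the algorithm is secret; security rests entirely on the key.

```python
# A minimal sketch of Kerckhoffs' principle in practice: the AES-GCM
# algorithm is fully public, and only the 256-bit key must stay secret.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the only secret in the system
nonce = os.urandom(12)                     # unique per message, but not secret
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"quarterly-report contents", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)  # raises an error without the right key
```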

Managing Encryption Keys

Now let’s talk about managing keys. As security professionals, we do not place all of our reports, presentations and spreadsheets into a single file; that would be too difficult to manage. Instead, we manage thousands of files on our laptops with ease by separating them by topic, time, type or filename. We even organize our emails, along with countless hyperlinks, in a similar fashion. Why, then, can we manage such a large set of files while managing a similar number of encryption keys seems so challenging? In truth, managing encryption keys should be easier than managing files.

If we placed all of our data within one file or database, the loss of that database would have a very large blast radius: All of the data would be gone. However, since we break our data into manageable chunks (files, databases or spreadsheets), the loss of one file does not mean the loss of everything. Our blast radius is therefore smaller.

This approach enables us to minimize the encryption blast radius. Encryption can be applied on a file-by-file basis, regardless of where the data is stored, with each file scrambled under its own unique key. With granular file encryption and a unique key per file, the loss of one key or one file exposes only that file, keeping the blast radius small, as the sketch below illustrates.
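
As a hypothetical sketch of that approach, the snippet below assigns each file its own AES-256 key. The key_store dict is a stand-in of our own invention for a real key management system (KMS):

```python
# Hypothetical per-file encryption sketch: every file gets its own AES-256
# key, so a single compromised key exposes exactly one file.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_store = {}  # stand-in for a real key management system (KMS)

def encrypt_file(name: str, data: bytes) -> bytes:
    key = AESGCM.generate_key(bit_length=256)  # unique key per file
    key_store[name] = key
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, data, None)

def decrypt_file(name: str, blob: bytes) -> bytes:
    key = key_store[name]                      # fails if the key is gone
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```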

Better yet, granular encryption and key management also allow you to erase data forever by merely deleting the associated encryption key. Can you imagine a file delete feature that simply discarded the encryption key, rendering the data cryptographically erased and unrecoverable? Would overwriting the file with ones and zeros even be necessary? Sure, to feel better, you could still scrub the bits of the key, but it takes far less time to scrub a 256-bit key than a 1 GB file.
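
In the same illustrative spirit, this standalone sketch shows why deleting a key amounts to erasing the data it protects:

```python
# Hypothetical crypto-shredding sketch: discarding the key "erases" the
# data, because the ciphertext is unrecoverable without it.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive record", None)

# Discard the only copy of the key; a real system would scrub the 32 bytes
# of key material in the KMS rather than rewrite the (much larger) file.
del key

# Recovering the plaintext now requires guessing a 256-bit key: infeasible.
```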

Are We Future-Proof?

We are never future-proof. Encryption is a practice, not a destination. We still have a lot of data that needs our protection, and some of our locks and keys will need to be updated along the way.

We do not know precisely when quantum computing will arrive, but it is coming. Nor do we know exactly when it will be cost-effective, but NIST has estimated that by 2030, the cost of building a quantum computer that can break a 2,000-bit RSA key in a matter of hours will be about $1 billion. That is certainly not pocket change, but the price will likely come down from there.

Will this advancement impact today’s encryption algorithms? Yes, but the impact on symmetric algorithms such as AES will be mild compared to the disastrous impact on asymmetric algorithms such as RSA and elliptic curve cryptography (ECC). The reason is that Grover’s algorithm only gives quantum computers a quadratic speedup against symmetric keys, while Shor’s algorithm breaks RSA and ECC outright.
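
A rough back-of-the-envelope sketch of that asymmetry (the figures are standard textbook estimates, not precise engineering numbers):

```python
# Grover's algorithm searches an unstructured keyspace of size N in roughly
# sqrt(N) steps, so an n-bit symmetric key retains about n/2 bits of security.
aes_key_bits = 256
effective_bits = aes_key_bits // 2   # sqrt(2**256) == 2**128 quantum steps

print(f"AES-256 under Grover: ~2**{effective_bits} operations")  # still infeasible
# Shor's algorithm, by contrast, factors RSA moduli and solves the ECC
# discrete-log problem in polynomial time; no practical key size saves them.
```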

In response, we should not scream, declare that the sky is falling or freeze and do nothing. Instead, we should encrypt all of our data at a very granular level using AES-256, delete any data that we do not want made public in 10 years, and find new algorithms to replace RSA and ECC, an effort NIST’s post-quantum cryptography standardization project has already begun.

It’s that simple — we must keep moving forward. It is the only way to minimize the encryption blast radius.

To learn more, watch the “Unified Data Encryption: Reduce the Risk of Costly Breaches and Compliance Pain” video at the top of this page.
