We have made great strides in encryption since its introduction more than 3,000 years ago, particularly with the arrival of asymmetric and hash algorithms. From a policy perspective, the National Institute of Standards and Technology (NIST) has helped us evolve our encryption capabilities by pushing for public submission and review of algorithms, as it did with the Advanced Encryption Standard (AES) and Secure Hash Algorithm 3 (SHA-3). In 2009, IBM research scientist Craig Gentry presented the first fully homomorphic encryption scheme, and IBM unveiled pervasive encryption as part of its z14 platform last year.

Given all these developments, why do many enterprises still resist using encryption? For years, experts within the security community have debated worst-case scenarios: that an encryption key might be lost or stolen, that a brute-force attack might unlock data, or that an intentional backdoor or an unforeseen bug in the encryption algorithm might cause a failure, or encryption blast, that compromises all the data under its protection. These discussions foster fear and uncertainty in our community and can cause us to overreact or, worse, do nothing.

Why Are Organizations Reluctant to Adopt Encryption?

In the 19th century, Dutch cryptographer Auguste Kerckhoffs formulated a principle stating that a cryptosystem should be secure even if everything about the system, except the key, is public knowledge. In simpler terms, even if you know everything about a lock, you should not be able to open it without the proper key.

Encryption critics have long stated that encryption has overhead. It does, and it must, since no algorithm is free of processor consumption. However, the current standard for encryption, AES, is extremely efficient, and its security and performance were vetted publicly against numerous implementations, in a variety of languages and across a number of hardware architectures. In fact, even its implementation in silicon was evaluated before it was considered for ratification. Furthermore, AES has withstood a long lifetime of use without compromise since its formal ratification in 2001. Unlike algorithms before it, AES epitomizes Kerckhoffs' principle: The lock is well-known, but the key is virtually impossible to find, especially within a 256-bit keyhole.
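
To make that concrete, here is a minimal sketch of Kerckhoffs' principle in practice, written in Python against the widely used cryptography package (an illustrative choice, not a reference to any particular product): the algorithm, the code and even the nonce are public, and the only secret is the 256-bit key.

```python
# Minimal sketch of Kerckhoffs' principle with AES-256-GCM.
# Everything below is public knowledge except the value of `key`.
# Assumes the third-party "cryptography" package: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the only secret: a 256-bit key
nonce = os.urandom(12)                     # unique per message, but not secret
ciphertext = AESGCM(key).encrypt(nonce, b"quarterly results", None)

# Anyone holding the key can open the lock...
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"quarterly results"
# ...but knowing the algorithm, the code and the nonce gets an attacker
# nowhere without the key itself.
```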

Managing Encryption Keys

Now let's talk about managing keys. We, as security professionals, do not place all of our reports, presentations and spreadsheets into a single file because it would be too difficult to manage. Instead, we manage thousands of files on our laptops with ease by separating them by topic, time, type or filename. We even organize our email, and countless hyperlinks, in a similar fashion. Why are we able to manage such a large set of files when managing a similar number of encryption keys seems so challenging? In truth, managing encryption keys should be easier than managing files.

If we placed all of our data within one file or database, the loss of that database would have a very large blast radius: all of the data would be gone. However, since we break our data into manageable chunks (files, databases or spreadsheets), the loss of one file does not mean the loss of everything. Therefore, our blast radius is smaller.

This approach enables us to minimize the encryption blast radius. Encryption not only gives you the ability to protect data, but it also allows you to encrypt it on a file-by-file basis, regardless of where it is stored, with each file scrambled under a unique encryption key. With granular file encryption and a unique key per file, the loss of any one key or file carries a much smaller blast radius.
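
As a rough sketch of what a unique key per file could look like, the following Python example generates a fresh AES-256 key for each file it encrypts; the in-memory key_store dictionary is a hypothetical stand-in for a real key management system.

```python
# Illustrative per-file encryption: each file gets its own AES-256 key, so
# compromising one key exposes only that one file's contents.
# The key_store dict stands in for a real key management system.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_store = {}  # filename -> 256-bit key

def encrypt_file(name, plaintext):
    key = AESGCM.generate_key(bit_length=256)  # unique key for this file only
    key_store[name] = key
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_file(name, blob):
    key = key_store[name]  # raises KeyError if the key has been deleted
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

report = encrypt_file("q3-report.pptx", b"confidential slides")
budget = encrypt_file("budget.xlsx", b"confidential numbers")
# Losing the budget.xlsx key affects only budget.xlsx; the blast radius does
# not reach q3-report.pptx because its key is independent.
```

In practice, the per-file keys would themselves be wrapped by a master key or held in a hardware security module rather than a Python dictionary, but the principle of isolation is the same.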

Better yet, granular encryption and key management also allow you to erase data forever by merely deleting the associated encryption key. Can you imagine a file delete feature that simply discarded the encryption key, rendering the data cryptographically erased and unrecoverable? Would rewriting the file with ones and zeros even be necessary? Sure, to feel better, you could scrub the bits of the key, but it would take a lot less time to scrub the bits of a 256-bit key than it would a 1 GB file.
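
Continuing the same sketch, cryptographic erasure could be as simple as removing a file's entry from the key store: the gigabyte of ciphertext can stay on disk or on a backup tape, but without its 32-byte key it can no longer be turned back into plaintext. Again, this is an illustration under the assumptions above, not any specific product's delete feature.

```python
# Cryptographic erasure sketch: "deleting" a file by discarding its key.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_store = {"old-archive.tar": AESGCM.generate_key(bit_length=256)}
nonce = os.urandom(12)
ciphertext = AESGCM(key_store["old-archive.tar"]).encrypt(
    nonce, b"data we no longer want to keep", None
)

# Crypto-erase: scrub the 32-byte key instead of overwriting 1 GB of data.
# (A real system would also scrub every copy held by the key manager.)
del key_store["old-archive.tar"]

# The ciphertext may still exist somewhere, but it is now unrecoverable in
# practice: any attempt to decrypt it fails for lack of the key.
```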

Are We Future-Proof?

We are never future-proof. Encryption is a practice, not a destination. Unfortunately, we have a lot of data that still needs our protection, but some of our locks and keys may need to be updated as well.

We do not know precisely when, but quantum computing is coming. Nor do we know exactly when it will be cost-effective, but NIST has reported that by 2030, the cost of building a quantum computer that can break a 2,000-bit RSA key in a matter of hours will be about $1 billion. That is certainly not couch change, but the price will likely go down from there.

Will this advancement impact today's encryption algorithms? Yes, but the impact on symmetric algorithms such as AES will be mild compared to the disastrous impact on asymmetric algorithms such as RSA and elliptic curve cryptography (ECC).
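
One back-of-the-envelope way to see why the impact is so lopsided: Grover's algorithm only square-roots the work of a brute-force key search, so AES-256 retains roughly 128 bits of effective strength, while Shor's algorithm solves factoring and discrete logarithms in polynomial time, which breaks RSA and ECC outright. The figures in the sketch below are standard rough estimates, not measurements.

```python
# Rough comparison of quantum impact on today's algorithms (coarse estimates).

# Grover's algorithm: searching a k-bit key space takes about 2**(k/2) steps
# instead of 2**k, so AES-256 still offers roughly 128-bit security.
aes256_classical_steps = 2 ** 256
aes256_quantum_steps = 2 ** 128  # still far beyond any foreseeable attacker

# Shor's algorithm: factoring and discrete logs become polynomial-time, so
# RSA-2048 and ECC keys offer essentially no protection once a large enough
# quantum computer exists.
print(f"AES-256 vs. Grover: ~2^128 steps ({aes256_quantum_steps:.1e})")
print("RSA-2048 / ECC vs. Shor: broken outright")
```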

In response, we should not scream, declare that the sky is falling or wait to act. Instead, we should encrypt all of our data at a very granular level using AES-256, delete any data that we do not want made public within 10 years, and find new algorithms to replace RSA and ECC.

It’s that simple — we must keep moving forward. It is the only way to minimize the encryption blast radius.

To learn more, watch the “Unified Data Encryption: Reduce the Risk of Costly Breaches and Compliance Pain” video at the top of this page.
