We have made great strides in the area of encryption since its introduction over 3,000 years ago, particularly with the introduction of asymmetric and hash algorithms. From a policy perspective, the National Institute of Standards and Technology (NIST) has helped us evolve our encryption capabilities by pushing for public submission and review of algorithms, as it did with the Advanced Encryption Standard (AES) and Secure Hash Algorithm 3 (SHA-3). In 2009, IBM research scientist Craig Gentry presented the first fully homomorphic encryption scheme, and in 2017, IBM unveiled pervasive encryption as part of its z14 platform.

Given all these developments, why do many enterprises still resist using encryption? For years, experts within the security community have debated worst-case scenarios: an encryption key might be lost or stolen, a brute-force attack might unlock data, or an intentional backdoor or unforeseen bug in the encryption algorithm might cause a failure — an encryption blast — that compromises all the data under its protection. These discussions foster fear and uncertainty in our community and can cause us to overreact or, worse, do nothing.

Why Are Organizations Reluctant to Adopt Encryption?

In the 19th century, Dutch cryptographer Auguste Kerckhoffs formulated a principle stating that a cryptosystem should be secure even if everything about the system — except the key — is public knowledge. In simpler terms: even if you know everything about a lock, you should not be able to open it without the proper key.

Encryption critics have long stated that encryption has overhead. It does — it must — since no algorithm is free of processor consumption. However, the current standard for encryption, AES, is extremely efficient, and its security and performance were vetted publicly against numerous implementations, in a variety of languages, across a number of hardware architectures. In fact, even its implementation within silicon was evaluated before it was considered for ratification. Furthermore, AES has withstood quite a lifetime of use without compromise since its formal ratification in 2001. Unlike algorithms before it, AES epitomizes Kerckhoffs' principle: The lock is well-known, but the key is virtually impossible to find, especially within a 256-bit keyhole.

Managing Encryption Keys

Now let's talk about managing keys. We, as security professionals, do not place all of our reports, presentations or spreadsheets into a single file, because it would be too difficult to manage. Instead, we manage thousands of files on our laptops with ease by separating them by topic, time, type or filename. Our email is organized in a similar fashion. Why are we able to manage such a large set of files, yet managing a similar number of encryption keys seems so challenging? In truth, managing encryption keys should be easier than managing files.

If we placed all of our data within one file or database, the loss of that database would have a very large blast radius — all of the data would be gone. However, since we break our data into manageable chunks (files, databases or spreadsheets), the loss of one file does not mean a loss of everything. Therefore, our blast radius is smaller.

This approach enables us to minimize the encryption blast radius. Encryption not only gives you the ability to protect data, but it also allows you to encrypt it on a file-by-file basis, regardless of where it is stored, with each file being scrambled under a unique encryption key. With granular file encryption and a unique key per file, the loss of one key or file significantly reduces the blast radius.
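The file-by-file approach can be sketched in a few lines. The example below is a toy illustration, not a production design: it builds a keystream from SHA-256 as a stand-in for a real cipher such as AES-GCM, and the function names are hypothetical. The point it demonstrates is the blast-radius property — each file is scrambled under its own randomly generated 256-bit key, so losing one key exposes or destroys only one file.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key || nonce || counter.
    A stand-in for a real cipher mode like AES-CTR/GCM, for illustration only."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt_file(plaintext: bytes):
    # Each file gets its own unique 256-bit key and fresh nonce.
    key = secrets.token_bytes(32)
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    return key, nonce, ciphertext

def decrypt_file(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))
```

Because every call to `encrypt_file` draws a fresh random key, two files with identical contents still produce unrelated ciphertexts — and the compromise of one key tells an attacker nothing about any other file.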

Better yet, granular encryption and key management also allow you to erase the data forever by merely deleting the associated encryption key. Can you imagine a file-delete feature that merely discarded the encryption key, thus rendering the data cryptographically erased and unrecoverable? Would rewriting with ones and zeros even be necessary? Sure, to feel better, you could rotate the bits of the key, but it would take a lot less time to scrub the bits of a 256-bit key than it would to scrub a 1 GB file.
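Cryptographic erasure amounts to a single key-store operation. The sketch below assumes a minimal in-memory key store (the class and method names are illustrative, not any particular product's API): once a file's key is discarded, its ciphertext is unrecoverable, so scrubbing 32 bytes stands in for overwriting gigabytes.

```python
import secrets

class KeyStore:
    """Toy in-memory key store: one unique key per file identifier."""

    def __init__(self):
        self._keys = {}

    def new_key(self, file_id: str) -> bytes:
        # Generate and register a unique 256-bit key for this file.
        key = secrets.token_bytes(32)
        self._keys[file_id] = key
        return key

    def get(self, file_id: str):
        return self._keys.get(file_id)

    def crypto_erase(self, file_id: str) -> None:
        # Discarding the 32-byte key renders the file's ciphertext
        # permanently undecryptable -- no need to rewrite the file itself.
        self._keys.pop(file_id, None)
```

In a real system the keys would live in a hardware security module or dedicated key manager rather than a Python dict, but the erase semantics are the same: delete the key, and the data is gone.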

Are We Future-Proof?

We are never future-proof. Encryption is a practice, not a destination. Unfortunately, we have a lot of data that still needs our protection, but some of our locks and keys may need to be updated as well.

We do not know precisely when quantum computing will arrive, or when it will be cost-effective, but NIST has reported that by 2030, the cost of building a quantum computer that can break a 2,000-bit RSA key in a matter of hours will be about $1 billion. That is certainly not couch change, but the price will likely go down from there.

Will this advancement impact today's encryption algorithms? Yes, but the impact on symmetric algorithms such as AES will be mild compared to the disastrous impact on asymmetric algorithms such as RSA and elliptic curve cryptography (ECC). Grover's algorithm effectively halves a symmetric key's strength — reducing AES-256 to roughly 128 bits of security — while Shor's algorithm breaks RSA and ECC outright.

In response, we should not scream, declare the sky is falling or wait before we do anything. Instead, we should encrypt all of our data at a very granular level using AES-256, delete all data that we do not want to be made public in 10 years, and find new algorithms to replace RSA and ECC.

It’s that simple — we must keep moving forward. It is the only way to minimize the encryption blast radius.

