June 15, 2015 By Douglas Bonderud 2 min read

It’s been more than a year since Heartbleed, but companies remain nervous about OpenSSL, the widely used open-source SSL/TLS library. That’s understandable, since the flaw put millions of websites and email servers at risk. A thorough audit of the codebase is now underway, however, and as reported by Threatpost, the project has just released four new patches (1.0.2b, 1.0.1n, 1.0.0s and 0.9.8zg) to deal with a number of moderate- and low-severity issues. Here’s a quick recap.

Cleared Out

The biggest news from these updates? Logjam is finally cleared. Officially tracked as CVE-2015-4000, Logjam stemmed from a weakness in the way servers handle Diffie–Hellman key exchanges. With millions of HTTPS, SSH and VPN servers relying on the same handful of common prime numbers, attackers could perform the expensive number field sieve precomputation for a prime just once and then reuse that work to break individual connections. More worrisome? A flaw in the TLS protocol allowed man-in-the-middle attackers to downgrade connections to export-grade, 512-bit Diffie–Hellman cryptography, a holdover from 1990s-era U.S. export restrictions that still lingers in many older deployments. As noted by CIO, releases 1.0.2b and 1.0.1n reject 512-bit parameters altogether and will only accept handshakes using Diffie–Hellman groups of 768 bits or more; a future release will raise that floor to 1024 bits.
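
The practical server-side defense against Logjam is to stop sharing weak, well-known Diffie–Hellman groups. Below is a minimal sketch, using the OpenSSL 1.0.x C API, of a server configuring its own freshly generated 2048-bit group and dropping export-grade ciphers; the 2048-bit size and the cipher string are illustrative choices, not an official recommendation from the project.

```c
/*
 * Sketch: hardening a server against Logjam-style downgrades by
 * generating a unique 2048-bit Diffie-Hellman group instead of a
 * shared or export-grade one. OpenSSL 1.0.x API.
 * Build: cc dh_hardening.c -lssl -lcrypto
 */
#include <stdio.h>
#include <openssl/ssl.h>
#include <openssl/dh.h>
#include <openssl/err.h>

int main(void)
{
    SSL_library_init();
    SSL_load_error_strings();

    SSL_CTX *ctx = SSL_CTX_new(SSLv23_server_method());
    if (ctx == NULL) {
        ERR_print_errors_fp(stderr);
        return 1;
    }

    /* Generate a fresh 2048-bit DH group (slow, but done only once). */
    DH *dh = DH_new();
    if (dh == NULL ||
        !DH_generate_parameters_ex(dh, 2048, DH_GENERATOR_2, NULL)) {
        ERR_print_errors_fp(stderr);
        return 1;
    }

    /* Hand the parameters to the context and drop export-grade ciphers. */
    if (!SSL_CTX_set_tmp_dh(ctx, dh) ||
        !SSL_CTX_set_cipher_list(ctx, "HIGH:!EXPORT:!aNULL")) {
        ERR_print_errors_fp(stderr);
        return 1;
    }

    printf("Context configured with a unique 2048-bit DH group\n");

    DH_free(dh);
    SSL_CTX_free(ctx);
    return 0;
}
```

Because parameter generation is slow, real deployments typically generate the group once offline and load it at startup rather than regenerating it on every launch.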

Other Issues With OpenSSL

The new OpenSSL patches also address several other low-severity issues, such as potential memory corruption. In versions 1.0.1, 1.0.0 and 0.9.8, a DTLS peer could receive application data between the ChangeCipherSpec and Finished messages; buffering that data could trigger an invalid free, resulting in a segmentation fault or memory corruption. According to Softpedia, there is also a fix for a denial-of-service (DoS) condition caused by an out-of-bounds read in the X509_cmp_time function, which is supposed to check the length of the ASN1_TIME string it is passed but may read a few bytes past the end. Because X509_cmp_time also accepts an arbitrary number of fractional seconds in the time string, attackers could craft malformed certificates that crash applications trying to verify them.
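
To see where that fix matters, here is a minimal sketch, again against the 1.0.x API, of the kind of certificate-expiry check that calls X509_cmp_time; a crafted notAfter field would exercise the code path these releases harden. The certificate file name is hypothetical.

```c
/*
 * Sketch: a certificate-expiry check that exercises X509_cmp_time.
 * A malformed notAfter field in a crafted certificate could trigger
 * the out-of-bounds read fixed in these releases.
 * Build: cc check_expiry.c -lssl -lcrypto
 */
#include <stdio.h>
#include <openssl/pem.h>
#include <openssl/x509.h>

int main(void)
{
    FILE *fp = fopen("peer-cert.pem", "r");   /* hypothetical input file */
    if (fp == NULL) {
        perror("fopen");
        return 1;
    }

    X509 *cert = PEM_read_X509(fp, NULL, NULL, NULL);
    fclose(fp);
    if (cert == NULL) {
        fprintf(stderr, "could not parse certificate\n");
        return 1;
    }

    /* Compare the certificate's notAfter time against "now" (NULL). */
    int rc = X509_cmp_time(X509_get_notAfter(cert), NULL);
    if (rc == 0)
        fprintf(stderr, "malformed time field\n");   /* the risky path */
    else if (rc < 0)
        printf("certificate has expired\n");
    else
        printf("certificate is still valid\n");

    X509_free(cert);
    return 0;
}
```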

Alone, none of these vulnerabilities is particularly damning, but they’re further proof that open-source legacy code often comes with a host of hidden problems. It appears, however, that the OpenSSL project is making good on its promise to evaluate existing code line-by-line and address these vulnerabilities before another Heartbleed-type event occurs. Best bet? Update OpenSSL ASAP and keep an eye on all future updates.
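
For administrators following that advice, one quick sanity check is to confirm which OpenSSL build an application is compiled against and which library it is actually linked to at runtime, since the two can differ. A minimal sketch using the 1.0.x-era API:

```c
/*
 * Sketch: printing the compile-time and runtime OpenSSL versions,
 * useful when verifying that the 1.0.2b/1.0.1n/1.0.0s/0.9.8zg
 * updates have actually taken effect.
 * Build: cc version_check.c -lcrypto
 */
#include <stdio.h>
#include <openssl/crypto.h>
#include <openssl/opensslv.h>

int main(void)
{
    /* Version the program was compiled against (header macro). */
    printf("built against: %s\n", OPENSSL_VERSION_TEXT);

    /* Version of the library loaded at runtime. */
    printf("running with:  %s\n", SSLeay_version(SSLEAY_VERSION));
    return 0;
}
```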
