The White House recently unveiled its new National Cybersecurity Strategy. The 35-page document lays out how the United States will confront cybersecurity challenges over the next several years. For anyone interested in security, it’s important to understand where the federal government will be focusing its cyber efforts.
Addressing ongoing risks
The opening pages of the National Cybersecurity Strategy contain a statement that summarizes one of the greatest challenges organizations face:
“A single person’s momentary lapse in judgment, use of an outdated password or errant click on a suspicious link should not have national security consequences. Our collective cyber resilience cannot rely on the constant vigilance of our smallest organizations and individual citizens.”
This describes what security pros deal with every day — a porous line of defense with no clear boundaries, where every endpoint is a potential gateway for intruders. As the strategy document states, “end users bear too great a burden for mitigating cyber risks. Individuals, small businesses and local governments, and infrastructure operators have limited resources and competing priorities, yet these actors’ choices can have a significant impact on our national cybersecurity.”
Software regulation coming
In response to these growing cybersecurity challenges, the White House states that the most capable and best-positioned actors in cyberspace must become better stewards of the digital ecosystem. The National Cybersecurity Strategy proposes new measures aimed at encouraging secure development practices. Specifically, it would shift liability for software products and services onto the large corporations that create these products and license them to the federal government. This transfer of liability would not impact developers of open-source applications, whose components often serve as building blocks for new software products.
However, some cybersecurity industry insiders fear legislation that holds software manufacturers liable. New liability laws could make software manufacturers reluctant to share information if their products are discovered to have an exploited vulnerability.
Proofpoint Executive Vice President of Cybersecurity Strategy Ryan Kalember was quoted in an Information Security Media Group (ISMG) article, saying, “The threat of liability will always discourage transparency. I don’t think that there is a simple, straightforward, easy compromise here.”
What legislation might look like
New legislation targeting software vendors might include measures such as:
- Prohibiting the disclaiming of liability by contract
- Establishing higher standards of care for software in high-risk scenarios
- Developing a safe harbor framework to shield companies that securely develop and maintain software products and services from liability
According to the new White House strategy, the safe harbor framework will draw on current best practices for secure software development, such as the NIST Secure Software Development Framework (SSDF).
According to the ISMG article, any liability protection policy should factor in a company’s maturity and security measures. Veracode founder and Chief Technology Officer Chris Wysopal notes that, at present, no established institutions are adequately equipped to evaluate compliance with NIST guidance or apportion blame after a security breach. He suggests incorporating several tiers of safety protocols for developing secure software. Although the SSDF may provide a solid foundation, he argues it needs to be more practical and simpler.
Wysopal also feels that a safe harbor mechanism should establish high standards for established, incumbent software manufacturers without impeding the ability of new startups to introduce innovative products to the market quickly. Also, liability should extend to all software, not just critical infrastructure software, as attackers can target critical systems through general-purpose software as well.
Expecting everyone to fully comply with NIST SSDF is not realistic, Wysopal suggests. However, conducting fundamental application security testing and managing open-source risks should be included in any safe harbor provisions. Wysopal says that automated static and dynamic testing of applications can effectively uncover frequently exploited vulnerabilities, such as buffer overflows, remote command injection and SQL injection.
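SQL injection, one of the vulnerability classes Wysopal names, is a useful illustration of why such baseline testing pays off. Below is a minimal, hypothetical sketch in Python (the table, data and function names are invented for illustration) contrasting a query built by string interpolation with a parameterized query that an automated scanner would flag as the safe pattern:

```python
import sqlite3

# In-memory database with illustrative sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: user input is interpolated directly into the SQL string,
    # so a payload like "' OR '1'='1" rewrites the query and leaks rows.
    query = f"SELECT name, role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # SAFE: a parameterized query treats the input as data, never as SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks all rows
print(find_user_safe(payload))    # matches nothing
```

Static analyzers detect the string-interpolated query without running the code, while dynamic testing finds it by submitting payloads like the one above and observing the leaked response.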
Focus on critical infrastructure
Defending critical infrastructure is also of the utmost importance, according to the White House document. For this reason, requirements and regulations will continue to be rolled out to “drive better cybersecurity practices at scale.”
Along these lines, the U.S. Environmental Protection Agency (EPA) recently issued a memorandum emphasizing the importance of evaluating cybersecurity risks in drinking water systems. Some public water systems (PWSs) have already implemented measures to enhance their cybersecurity. However, a recent survey and reports of cyberattacks reveal that many PWSs have yet to adopt fundamental cybersecurity best practices, leaving them vulnerable to attacks by individuals, criminal groups and state-sponsored actors with sophisticated capabilities. The memorandum requires states to assess cybersecurity best practices at PWSs.
“Cyberattacks against critical infrastructure facilities, including drinking water systems, are increasing, and public water systems are vulnerable. Cyberattacks have the potential to contaminate drinking water, which threatens public health,” said EPA Assistant Administrator for Water Radhika Fox. “EPA is taking action to protect our public water systems by issuing this memorandum requiring states to audit the cybersecurity practices of local water systems.”
Future-proofing national cybersecurity
The White House document also addressed the looming threats associated with quantum computers and artificial intelligence. To address the “investment gap,” the government plans to leverage public investments in innovation, R&D and education to prepare for future challenges. The federal government plans to prioritize the transition of vulnerable public networks to quantum-resistant environments. It also plans to develop complementary mitigation strategies that provide “cryptographic agility” against future threats.
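Cryptographic agility, in practice, means designing systems so an algorithm can be swapped out without reworking everything that depends on it. The sketch below is a hypothetical illustration (the registry and tagging scheme are invented, and classical hash functions stand in for future post-quantum primitives): each output is tagged with the algorithm that produced it, so old values remain verifiable after the default changes.

```python
import hashlib

# Hypothetical registry: algorithm name -> constructor. Migrating to a
# replacement (e.g., post-quantum) algorithm only requires a new entry
# and a change of default; callers are untouched.
ALGORITHMS = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,
}

def digest(data: bytes, alg: str = "sha3_256") -> str:
    # Tag the output with the algorithm used, so it can still be
    # verified after the system-wide default moves on.
    return f"{alg}:{ALGORITHMS[alg](data).hexdigest()}"

def verify(data: bytes, tagged: str) -> bool:
    # Parse the tag to pick the right algorithm for verification.
    alg, expected = tagged.split(":", 1)
    return ALGORITHMS[alg](data).hexdigest() == expected

token = digest(b"hello")
print(verify(b"hello", token))  # verifies under the current default
old_token = digest(b"hello", "sha256")
print(verify(b"hello", old_token))  # legacy tokens still verify
```

The same tag-and-registry pattern applies to signatures and key exchange, which is where quantum-resistant migration matters most.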
The growing complexity of systems and networks makes all of this more difficult. As the strategy document notes, new software and systems continue to provide value but also add to insecurity, and adding new functionality without security and resilience may do more harm than good. Now even artificial intelligence has become widely available, sometimes with unexpected results, which amplifies complexity even more.
Planning for uncertainty
The White House security strategy addresses some of the most pressing security issues of our time. The changes won’t happen overnight, but an overarching plan is necessary to guide security efforts. The cyber threat universe is unpredictable and constantly changing. For this reason, the government also places significant emphasis on intelligence gathering, sharing and cooperation between friendly entities and law enforcement to defend against the ongoing cyber threat.