Inside the Box — Advancing Software to Better Protect Data
Updates in guidance programs and standards are underway
Data center security breaches have been on the rise in recent years. High-profile incidents at Equifax, Yahoo, Marriott, and others are still on the minds of those who own or run mission-critical environments. The primary strength of data centers — continual uptime — is actually a weakness when it comes to security: it gives bad actors unfettered time to launch attacks on data center systems.
When it comes to the security of data in use and at rest in data centers, it’s imperative to have secure software that relies on the cryptographic modules built into data center hardware systems. These modules ensure data is cryptographically protected — meaning it’s converted into an unreadable or difficult-to-read form — if hackers attempt to steal information. But just as hackers have become more sophisticated throughout the years, so have the processes that guide the validation and testing of cryptographic algorithms and modules.
Time for an Updated Program and Standard
More than two decades ago, the National Institute of Standards and Technology (NIST) launched the Cryptographic Module Validation Program (CMVP) to certify cryptographic modules. Around the same time, NIST released the original Federal Information Processing Standard (FIPS) 140-1, the standard that independent labs use to test cryptographic modules. These were the right program and standard at the time, but updates are now needed to keep pace with the proliferation and advancement of technology — as well as growing threats.
A companion program, Cryptographic Algorithm Validation Program (CAVP), was also released to validate the implementation of the NIST-approved algorithms employed by the modules.
CAVP testing will end on June 30, when a transition to the Automated Cryptographic Validation Protocol (ACVP) will take place. The big change with the new program is that the testing of cryptographic algorithms is automated: a NIST server is available to validate implementations against the next-generation algorithm standards and protocols. Over the internet, test vectors can be produced, responses validated, and certificates issued — speeding the time it takes to validate and certify cryptographic algorithms and modules. This is especially significant given the surging volume of certification requests in recent years.
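To make the idea concrete, the sketch below shows the kind of known-answer testing that ACVP automates. The real protocol exchanges JSON test vectors with a NIST server over HTTPS; this is a simplified, local illustration. The two SHA-256 digests are published known-answer values, but the function name and data structure here are purely illustrative, not part of the ACVP API.

```python
import hashlib

# Two published SHA-256 known-answer vectors: (message, expected hex digest).
# In the real ACVP flow, vectors like these arrive from the NIST server and
# the module under test sends back its computed responses for validation.
KNOWN_ANSWER_VECTORS = [
    (b"", "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"),
    (b"abc", "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"),
]

def validate_sha256(vectors):
    """Return True only if the local SHA-256 implementation produces the
    expected digest for every test vector."""
    return all(
        hashlib.sha256(message).hexdigest() == expected
        for message, expected in vectors
    )

print(validate_sha256(KNOWN_ANSWER_VECTORS))
```

A correct implementation prints `True`; any deviation in even one digest fails the whole set, which is exactly the pass/fail property an automated validation server relies on.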
FIPS 140-2 (the current standard) continues until September 2020, when the new FIPS 140-3 will begin its transition period. Like CAVP, FIPS 140-2 is insufficient for a rapidly transforming world. While FIPS 140-2 provides security requirements only for finalized modules, FIPS 140-3 takes the standard further, providing security requirements during the design, implementation, and deployment phases of cryptographic development. The new standard takes into account areas that have emerged since the original FIPS standard was released more than two decades ago — software/firmware security, noninvasive security, sensitive security parameter management, and life-cycle assurance. Furthermore, FIPS 140-3 aligns with ISO/IEC 19790, the international standard for cryptographic module requirements.
In 2019, NIST weighed in specifically on software development by announcing its Secure Software Development Framework (SSDF). These recommendations are intended to reduce the number of vulnerabilities in released software, mitigate the potential impact of the exploitation of undetected or unaddressed vulnerabilities, and address the root causes of vulnerabilities to prevent future recurrences. As a nonregulatory body, NIST cannot force software developers to adopt its framework; however, the SSDF is a valuable tool that consolidates existing best practices to guide the development of more secure software.
In 2020, NIST also published guidance to address the growing number of cybersecurity threats across the globe: Seamless Security: Elevating Global Cyber Risk Management Through Interoperable Frameworks. It recommends the development of a globally recognized core framework for cybersecurity. Currently, there is a patchwork of cybersecurity regulations at the national and local levels — resulting in inconsistent approaches to protecting data worldwide. This is especially onerous if an organization has data centers located throughout the world.
Making Progress to Better Protect Data
For mission critical environments, protecting data is a paramount concern. At a time when malicious threats persist and technology continues to evolve, those who own or run data centers need greater assurance that the systems and software deployed provide the best protection against outside attacks. To that end, industry forces are at play to create unified software development practices and a global cybersecurity framework.
More imminent is the shift to ACVP and FIPS 140-3 for testing, validating, and certifying cryptographic algorithms and modules. These updates are greatly needed to meet the challenges of a 21st-century threat landscape. Choosing data center software that is validated and certified under the new ACVP program and FIPS 140-3 standard will bring greater assurance that valuable data will be protected.