Government agency dragging its heels on OpenSSL validation


Author: Stephen Feller

An agency created by the US and Canadian governments to validate security software has spent about two years reviewing the OpenSSL project — and continues to search for a way to validate that the software will always do what it is expected to do.

The Cryptographic Module Validation Program (CMVP), a joint program of the US National Institute of Standards and Technology (NIST) and Canada’s Communications Security Establishment (CSE), has been reviewing test reports of OpenSSL since at least December 2003.

According to CMVP director Randy Easter, a typical testing cycle runs from several weeks to a few months, and NIST’s goal is to process, within six to nine weeks, the reports the labs generate after testing. Once a report is processed, NIST either sends additional questions back to the testing lab or moves forward with granting validation. The process typically takes less than a year. Because testing on OpenSSL has now taken more than twice that long, some have begun questioning the review process and whether the open source toolkit is getting a fair shake from the agency.

OpenSSL is an open source project that provides a toolkit implementing the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols, based on the SSLeay library.

The CMVP and its 12 accredited testing labs are responsible for validating that encryption modules meet the guidelines set out in Federal Information Processing Standard (FIPS) 140-2. The document, updated every five years to reflect technological progress and current security concerns, sets out the security requirements and review process for encryption software used by government agencies.

As with other free and open source options, companies that sell similar encryption tools would prefer that OpenSSL not be validated, because of the effect CMVP recognition of the toolkit could have on their own products. Though rumor has it that some of those companies have lobbied against OpenSSL, Easter said he had not heard any such rumors and that lobbying attempts are of no concern.

Easter said the CMVP “does not address, acknowledge, or consider outside comments as it may be perceived as a mechanism to ‘sway’ the independent review of a conformance report.” He added that while there have been many inquiries into the OpenSSL application, all receive the same response: “the CMVP cannot comment.” The agency does maintain a publicly available pre-validation list of modules under consideration for validation, but that’s as far as it goes.

Both the CMVP and the DOMUS IT Security Lab, the accredited lab testing OpenSSL as part of the validation process, maintain that the lengthy review time is part of the process with this application. According to Chris Brych, FIPS-140 program manager at the IBM Canada-owned DOMUS, the OpenSSL review is different because it is a proof of concept review, which has resulted in questions from NIST that have not come up before.

To validate OpenSSL, the CMVP must specifically review the source code and how it is compiled and implemented, which is not normally the case. Because OpenSSL is open source, some users may opt for a pre-compiled version of the software while others will compile their own, which raises the possibility of the software behaving differently, or incorrectly, if it is changed or compiled differently.

In addition to the review itself, Brych said the CMVP has had to find a suitable framework for testing and reviewing open source software, now and in the future, as a result of the OpenSSL application.

“Software is software,” Brych said, adding that the actual testing of cryptographic algorithms in OpenSSL is not different from other reviews. “NIST is very conscious in how they’re making this work. Other [reviews] in the future could be faster because this will have been done…. They’re just being overly cautious.”

Easter said that any idea floating around that NIST or the CMVP, which is hosted and largely administered by NIST, has been “sitting on” the OpenSSL application is “totally untrue.” They are simply doing what he said is the agency’s due diligence.

“We have no apprehension here,” Easter said. “We’re happy to validate it if it conforms to the standard. The timeframe may just be larger because we’re unfamiliar with it.”

The CMVP does not normally validate source code. Instead, it usually receives object code that has already been compiled and is usable, said Peter Sargent, former director of the COACT FIPS lab and a consultant with OpenSSL. Although source code is part of the reports the agency reviews for validation, the cryptographic algorithms and processing of information are the parts CMVP is interested in. The agency looks at the source simply to make sure nothing is there that shouldn’t be.

“You cannot validate the source code,” Sargent said. “You have to create object code…. I think that’s where the problem is. You cannot validate source code because it doesn’t do anything. It has to be compiled in order for it to be a product and work. That’s probably where the biggest hang-ups come from.”

What consultants say

The Open Source Software Institute (OSSI) has helped OpenSSL get funding for the validation process, and found consultants and experts who know what the CMVP is looking for in the software it validates, said John Weathersby, executive director of OSSI.

According to Steve Marquess, the lead technical consultant brought in by OSSI for the application, the OpenSSL team offered object code digests, more commonly referred to as fingerprints, for DOMUS to compare against the binary builds of the software it would have to compile. When the builds were compared to the digests, he said, the lab would have tangible proof of whether OpenSSL had been compiled properly.
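
In rough terms, a fingerprint comparison amounts to hashing the freshly built binary and checking the result against a published digest. The Python sketch below illustrates the idea; the file names and the choice of SHA-256 are assumptions made for illustration, not details of the actual OpenSSL or DOMUS process.

    import hashlib
    import sys

    def file_digest(path):
        """Return the hex SHA-256 digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Hypothetical arguments: a freshly compiled module and a published digest file.
        built, reference = sys.argv[1], sys.argv[2]
        expected = open(reference).read().split()[0]
        if file_digest(built) == expected:
            print("OK: build matches the published fingerprint")
        else:
            print("MISMATCH: build differs from the published fingerprint")
            sys.exit(1)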

Additionally, Sargent said that hash codes and digital signatures were inserted into the code to ensure things were in their proper order and compiling correctly.
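
An embedded integrity check of the kind Sargent describes generally works by recording a keyed digest of the module when it is built and recomputing it when the module is loaded. The sketch below shows the general technique only; the key, file name, and choice of HMAC-SHA-256 are illustrative assumptions, not the mechanism actually used in OpenSSL.

    import hashlib
    import hmac

    def integrity_check(module_path, key, expected_hex):
        """Recompute a keyed digest over the module file and compare it,
        in constant time, with the value recorded at build time."""
        with open(module_path, "rb") as f:
            mac = hmac.new(key, f.read(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(mac, expected_hex)

    # Hypothetical usage: the key and expected value would be fixed when the module is built.
    # if not integrity_check("libcrypto.so", b"build-time-key", recorded_hmac):
    #     raise RuntimeError("module failed its integrity self-test")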

All of this was done, Marquess said, because when software is compiled from source code, individual bits may not land in the same place with each build. And although this does not affect the software in most cases, it also means that the software is not the same each time it is compiled.

Easter said he could not specify whether this was an issue because CMVP policy does not allow him to.

Design assurance

One of the things Easter mentioned, while speaking generally about what the CMVP reviews, is design assurance, which includes the delivery and operation of software. With open source software, he said, the delivery aspect of the standard is an important part of what is checked.

Because CMVP validation is based on assurance that software will do what it has been validated to do, any change to the software invalidates that instance of it. Something the government has guaranteed to work, and that then does anything less, would defeat the purpose of FIPS. “What we validate cannot be changed in any way, because … any change can affect the effectiveness of the validated module,” Easter said.

Marquess said that “flipping bits” in different compilations of the OpenSSL code should not affect whether it works, but he is aware the issue is a sticking point for the CMVP because FIPS does not allow for it. And although he said the digests provided by OpenSSL can show that no code has changed, he believes the definition of “exactly” the same build is where the problem lies, because “the functional behavior of the object code is different than being bit for bit equivalent.”

Breaking new ground

OpenSSL currently is in the coordination stage of review, as it has been for the last several months, with the CMVP exchanging questions and answers with DOMUS about the software. The most recent report to the CMVP from DOMUS was in November, and the agency was still reviewing that report before the Christmas holiday, Brych said.

Sargent said that some other products, though not open source, have taken this long to be validated. He said that while NIST, the CSE, and the CMVP would not lose credibility if they cannot find a way to validate OpenSSL, the question needs to be addressed because open source shows up in other areas NIST is involved with.

“We went into it with the understanding that we were breaking new ground,” Sargent said. “I know that there’s a lot of really down people in the community … because it’s taken at least a year to a year-and-a-half longer than we expected. But I think it can still get through.”

Sargent said the CMVP may need to “go out on a limb a little bit” and bring in outside experts to help with the review, something Easter alluded to as a possibility in areas in which the CMVP and NIST are not necessarily expert.

Sargent also said it might not be “the worst thing in the world” if OpenSSL wasn’t validated, but that the validation should be granted if the agency can work around its issues with source code and the module being open source. Like Weathersby, Marquess, and others, Sargent said that with the use of free and open source software growing in businesses and governments around the world, this validation is pretty important.

Richard Levitte, a developer on the project who had an early hand in the CMVP application, called the validation a political statement that nonetheless holds weight for the open source community.

“It’s never been done before, so it is kind of a proof of concept,” Levitte said. “I think that it is a huge step for the free and open source software (FOSS) community. We could say that it shows that FOSS has grown as a movement and is a force to count with [proprietary software].”

Weathersby pointed out that no organization, or group of organizations, requires the kind of security guarantees, among other guarantees, that the US Department of Defense (DoD) does. The DoD is just one of the US government agencies required by law to use only software validated under FIPS. And if open source software is good enough for the DoD, he said, it should be good enough for anybody.

“Everybody says they want open source on a level playing field,” Weathersby said. “The DoD has the most stringent requirements — if we want to be accepted into the big game, we have to play by DoD rules.”