NDcPP – The devil is in the details

Lachlan Turner | Common Criteria

In this post, we identify some common problem areas for vendors complying with the Network Device Collaborative Protection Profile (NDcPP). We'll discuss how Lightship has adjusted to the new reality that every product evaluated against the very prescriptive NDcPP will have gaps because of the strict level of conformance required – even if the same product was tested against a previous version of NDPP / NDcPP.

Another candidate sub-title for this post was 'Tail wags the dog', because our experience has shown that the specific test activities within the NDcPP Supporting Document – not solely the Security Functional Requirements (SFRs) of the NDcPP – are the source of truth for what a product must do to conform.

Here’s our list of common problem areas for NDcPP test conformance:

  • X.509 – CN/SAN usage, wildcards, revocation checking
  • TLS – protocol testing requirements, including complex man-in-the-middle attacks
  • SSH – accurately determining large packet sizes and understanding rekey limits
  • Audit – missing required events, some of which are needed only for CC certification
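To make the SSH rekey item concrete, here is a minimal sketch of the kind of bookkeeping an implementation needs. The class and method names are hypothetical, and the one-hour / one-gigabyte thresholds reflect our reading of the NDcPP SSH rekey requirement (rekey before either limit is reached, whichever comes first):

```python
# Hypothetical sketch: tracking when an SSH session must initiate a rekey.
# Thresholds below are our reading of the NDcPP rekey limits (no more than
# one hour of session time and no more than one gigabyte of traffic).

ONE_HOUR_SECONDS = 60 * 60
ONE_GB_BYTES = 2 ** 30  # assumption: gigabyte interpreted as 2^30 bytes


class RekeyTracker:
    """Accumulates session time and traffic since the last key exchange."""

    def __init__(self):
        self.elapsed_seconds = 0
        self.bytes_transferred = 0

    def record(self, seconds, nbytes):
        """Record activity on the session (both directions of traffic)."""
        self.elapsed_seconds += seconds
        self.bytes_transferred += nbytes

    def rekey_required(self):
        """True once either the time or the volume limit is reached."""
        return (self.elapsed_seconds >= ONE_HOUR_SECONDS
                or self.bytes_transferred >= ONE_GB_BYTES)

    def rekeyed(self):
        """Reset the counters after a successful key exchange."""
        self.elapsed_seconds = 0
        self.bytes_transferred = 0
```

In testing, the evaluator drives traffic (or waits out the clock) to confirm the device actually initiates the rekey at these limits – which is exactly where "accurately determining large packet sizes" becomes fiddly in practice.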

Of these, the most complex test requirements are those related to X.509 and TLS, which add requirements beyond the SFRs.

[May 2019 Update for NDcPPv2.1] One example is TLS reference identifiers. IP addresses are relatively commonplace as reference identifiers (e.g. a syslog server address), and the NDcPP states they are valid. However, because wildcard testing was not optional, vendors were forced to support host names so that the single positive wildcard test could succeed (FCS_TLSC_EXT.x.2 Test 5.2). Under NDcPPv2.1, FQDNs are no longer mandatory for wildcard support, and wildcard support is indeed optional – but if you explicitly claim you do *not* support wildcards, there are now tests to verify that you don't support them. That means if you *do* support them – even with a non-compliant implementation – you are now on the hook to fix it to be compliant OR to remove the functionality entirely.
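A compliant wildcard implementation is stricter than many off-the-shelf matchers. The sketch below is a hypothetical illustration of the single left-most-label wildcard rule that the positive and negative wildcard tests exercise; the function name and corner cases are ours, not from the NDcPP text:

```python
# Hypothetical sketch of strict wildcard matching for a SAN dNSName:
# '*' is accepted only as the complete left-most label and matches
# exactly one label; partial wildcards like 'w*.example.com' and
# overly broad patterns like '*.com' are rejected.

def wildcard_match(san_dns_name: str, reference_id: str) -> bool:
    san_labels = san_dns_name.lower().split(".")
    ref_labels = reference_id.lower().split(".")

    if san_labels[0] != "*":
        # No (complete) wildcard label: require an exact match.
        return san_labels == ref_labels

    if len(san_labels) != len(ref_labels):
        # The wildcard matches exactly one label, never 'a.b.example.com'.
        return False

    if len(san_labels) < 3:
        # Reject patterns as broad as '*.com'.
        return False

    # Remaining labels must match exactly.
    return san_labels[1:] == ref_labels[1:]
```

A matcher like this passes the single positive wildcard test ('*.example.com' vs 'syslog.example.com') while rejecting the multi-label and partial-wildcard negative cases – the combination that trips up products relying on permissive library defaults.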

Another way to illustrate the added complexity that vendors face is the sheer number of test activities. For FCS_TLSC_EXT.1 (TLS client), the SFR contains 4 elements, which the Supporting Document breaks down into 14 tests. As an example, a typical NDcPPv2 full test run can contain more than 250 discrete tests once the various combinations between interacting components are considered.
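To see how the count climbs so quickly, consider how a few independent parameter axes multiply. The axes and values below are hypothetical illustrations (not Greenlight's actual generator or any PP-mandated matrix):

```python
# Illustrative sketch: a handful of hypothetical test axes combine into
# dozens of discrete test cases via a simple Cartesian product.
from itertools import product

tls_versions = ["TLS 1.1", "TLS 1.2"]
ciphersuites = ["ECDHE-RSA-AES128-GCM-SHA256",
                "ECDHE-RSA-AES256-GCM-SHA384"]
cert_cases = ["valid", "expired", "bad-SAN", "revoked", "untrusted-CA"]
channels = ["syslog", "admin-UI", "update-server"]

test_cases = [
    {"version": v, "suite": s, "certificate": c, "channel": ch}
    for v, s, c, ch in product(tls_versions, ciphersuites,
                               cert_cases, channels)
]

print(len(test_cases))  # 2 * 2 * 5 * 3 = 60 cases from just four axes
```

Four small axes already yield 60 cases; add SSH, X.509 revocation, and audit combinations across every trusted channel and the 250-test figure above is easy to believe.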

Whereas it used to be possible to perform a "paper-based" gap analysis using the SFRs as a requirements specification, we have found that with the collaborative PPs, only by actually performing the test activities can you be truly confident that all gaps are identified and remediated prior to the formal evaluation. Our approach to addressing this new paradigm starts with a Functional Gap Assessment (FGA) – aka end-to-end testing.

At Lightship, we use our Greenlight Common Criteria test automation platform to perform FGA testing in a fraction of the time that it would take manually. By leveraging our flexible and intelligent combinatorial test generation engine, we can create a complete test plan and start executing testing on day one. This is a game changer that flips the test activity from the end of an evaluation to the start of the project – during the evaluation preparation phase.

By performing an FGA in the preparation cycle for NDcPP evaluation, vendors can avoid mid-certification product changes and have certainty that products are truly conformant and ready for evaluation.

Lachlan has 20+ years of extensive product security certification experience, including roles as a government certifier, lab evaluator and vendor consultant. As the Director of Cyber Labs, Lachlan has overall responsibility for our Canadian and US Common Criteria laboratories.