Last week the ISA Security Compliance Institute (ISCI), after quite a long delay, published draft requirements documents for two of the three legs of the Embedded Device Security Assurance (EDSA) certification. The Software Development Security Assessment and Functional Security Assessment documents are now online for your review. The Communications Robustness Testing draft has not been published yet.
Much as when I began the review of the WIB document, I was hopeful and optimistic that this would be a big plus for improving the security of vendor products because it addresses vendor product development. After reading the two documents, I’m still optimistic. A lot of work remains to clean up the requirements and make them testable for certification, but this is painful, detailed work; it does not reflect a fundamental flaw or require a change in approach.
The Functional Security Assessment requires a lot of useful features not found in many current embedded devices, such as audit logging, no cleartext passwords in storage or transit, … ISCI anticipates three levels of rigor and certification, with Level 1 being the lowest. The majority of the requirements do not apply to Level 1. This is probably a nod to the fact that these features do not exist today and could be years away. At least defining requirements for Levels 2 and 3 gives vendors an idea of what security features they should be developing.
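To make one of those features concrete, here is a minimal sketch of what “no cleartext passwords in storage” could look like in practice: the device keeps only a per-user random salt and hash, and verification recomputes the hash. This is my own illustration, not an example from the draft; the function names are hypothetical, and a real product would likely use a purpose-built password hash such as PBKDF2.

```python
import hashlib
import hmac
import os

def store_credential(username: str, password: str, cred_store: dict) -> None:
    """Keep only a per-user random salt and SHA-256 hash, never the cleartext password."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode("utf-8")).digest()
    cred_store[username] = (salt, digest)

def verify_credential(username: str, password: str, cred_store: dict) -> bool:
    """Recompute the hash from the stored salt and compare in constant time."""
    if username not in cred_store:
        return False
    salt, digest = cred_store[username]
    candidate = hashlib.sha256(salt + password.encode("utf-8")).digest()
    return hmac.compare_digest(candidate, digest)
```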
This is potentially an issue of setting expectations. If the community believes a Level 1 certified device will have basic security features such as role-based authorization or integrity protection for request and response packets, they will be disappointed. But if they think of Level 1 as a device with a robust protocol stack and simple security features like password support and very basic auditing, they may be satisfied.
There were a number of testing issues in the Functional Security Assessment document involving the definition of the quality of security measures. Words like “basic,” “sufficient,” or “acceptable” just kick the issue down the road to other documents. Perhaps the details don’t belong in this document because they may change, but they have to be determined before certification testing can begin. An example early in the document:
- FSA-AC-2: The IACS embedded device shall support acceptable authentication methods for user identification to support Access Management and Use Control Functionality for all services supported by the device. Comment: What are “acceptable authentication methods”?
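To see why that wording matters for certification, consider a hedged sketch of two authentication methods a vendor might submit. Both “support authentication for user identification”; whether either is “acceptable” is exactly what the draft leaves undefined, so two test labs could reach opposite conclusions. The code is illustrative only and not drawn from the document.

```python
import hashlib
import hmac
import os

# Method 1: cleartext password comparison over the wire.
# It "supports authentication," but most assessors would reject it.
def authenticate_cleartext(received_password: bytes, stored_password: bytes) -> bool:
    return received_password == stored_password

# Method 2: challenge-response, so the shared secret never crosses the wire.
# Stronger, but is it "acceptable"? The draft gives the tester no way to decide.
def make_challenge() -> bytes:
    return os.urandom(16)

def authenticate_challenge_response(challenge: bytes, response: bytes,
                                    shared_secret: bytes) -> bool:
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```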
The Software Development Security Assessment document is going to need more work before it is ready for certification. It may sound odd, but a lot of the problems with this document stem from its trying to do too much; there are 23 pages of tabular requirements. It often reads as if a number of documents were reviewed and requirements were pulled from them and cobbled together. Better organization and elimination of untestable and overlapping requirements could reduce the size dramatically, at least by half, without reducing the value of the certification.
As in the WIB review, there are a lot of problems with an independent organization being able to consistently evaluate compliance with specific requirements around vendor development programs. While reading the SDSA document I repeatedly asked myself, how would I test that? The SDSA document is, however, clearly aimed at the vendor rather than a mix of owner/operator and vendor requirements. Just a few examples:
- SDSA-SRS-5: Required security assurance level for the product should be included in the SecRS. Comment: I’m unsure how long it will be before we have a clear definition of what is required to meet the various SALs.
- SDSA-SRS-8: The security requirements shall be expressed and structured such that they are clear, precise, unequivocal, verifiable by test, analysis or other means, maintainable, and feasible, but do not contain unnecessary design or verification detail. Comment: The sentiment is good, but how do I test this? Perhaps it would be better to avoid some of the adjectives and simply state that each security requirement must have a corresponding documented method of testing that it has been met; a sketch of what that could look like follows this list.
- SDSA-SAD-6: Attack surface reduction techniques shall be practiced to minimize the number of available entry points. Comment: SDSA-SAD-5 requires the vendor to enumerate all possible entry points for an attacker. That requirement, with the addition of documentation on why each entry point is needed, would be great, but how will attack surface reduction techniques be evaluated?
- SDSA-SAD-11: The design process shall incorporate secure design best practices. This applies to all features, not just security features. Comment: Great, but how do I test it?
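On the SDSA-SRS-8 comment above, here is a sketch of the alternative I have in mind: a traceability table that maps every security requirement to at least one documented verification method, which an assessor can check mechanically. The requirement IDs, methods, and structure below are hypothetical, not taken from the draft.

```python
# Hypothetical traceability table: each security requirement must map to at
# least one documented verification method (test, analysis, inspection, ...).
REQUIREMENTS = {
    "SR-01": "No cleartext passwords in storage or transit",
    "SR-02": "Audit log records all failed authentication attempts",
}

VERIFICATION = {
    "SR-01": [("test", "test_password_storage_is_hashed"),
              ("analysis", "protocol_capture_review.md")],
    "SR-02": [("test", "test_failed_login_is_logged")],
}

def untested_requirements() -> list:
    """Return requirement IDs that have no documented verification method."""
    return [req_id for req_id in REQUIREMENTS if not VERIFICATION.get(req_id)]

if __name__ == "__main__":
    missing = untested_requirements()
    if missing:
        print("Requirements lacking a verification method:", missing)
    else:
        print("Every requirement has a documented verification method.")
```

An assessor could then verify an objective organizational requirement, that every security requirement has a documented verification method, instead of judging adjectives like “clear” and “precise.”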
Lastly, a few general process and formatting comments:
- These are tables of requirements rather than a prose document. I actually like this; the surrounding prose can be added later.
- There is a column labeled “Source of Requirement,” filled with the IEC, ISA99, NIST, and other source documents that led to each requirement.
- These drafts are clean enough to read and understand, but they need some careful editing, which is very tedious in a table format. I reviewed a few pages with an editor’s hat on, and they came back heavily marked.
We will keep monitoring this effort. I’m certain ISCI would be pleased with any detailed review or comments if you are looking for a pro bono project. More detail on the documents is available on the SCADApedia.