13.6. Risk Assessment
For each Level I and Level II critical function or component, the program performs a risk assessment. Figure 13.6.F1 shows the overall risk assessment methodology.
Figure 13.6.F1. Risk Assessment Methodology
The system impact level from the criticality analysis is used to determine the risk consequence. The risk likelihood is based upon the vulnerability assessment and the knowledge or suspicion of threats within the supply chain and potential vulnerabilities within supplied hardware, software, and firmware products. Each Service and program may have specific guidance on how to use the threat assessment and vulnerability assessment to develop the risk likelihood. A basic method that may be used in the absence of program- or Service-specific guidance is described in this section.
One way to translate the threat and vulnerability assessments into a risk likelihood or probability is to develop specific questions for supply chain and software assurance. The following paragraphs list two sets of sample “Yes/No” vulnerability questions that a program can use to establish the risk likelihood. The first set of vulnerability questions applies to supply chain considerations.
- Does the Contractor:
  - Have visibility into lower-level suppliers that provide sub-components used in constructing or assembling critical components?
  - Vet suppliers of critical function components (hardware/software/firmware) based upon the security of their processes?
  - Have processes to verify critical function components received from suppliers to ensure that components are free from malicious insertion (e.g., seals, inspection, secure shipping, testing)?
  - Have controls in place to ensure technical manuals are printed by a trusted supplier who limits access to the technical material?
  - Have a process to establish trusted suppliers of critical components?
  - Require suppliers to have similar processes for the above questions?
  - Have processes to limit access to critical components? Can the Contractor identify everyone who has access to critical components?
- Are blind buys used to contract for critical function components?
- Are specific test requirements established for critical components?
- Does the Developer require secure design and fabrication or manufacturing standards for critical components?
- Are Critical Program Information (CPI) and critical functions stored, maintained, transported, or transmitted (e.g., electronic media, blueprints, training materials, facsimile, modem) securely?
The second set of sample “Yes/No” questions applies to software and firmware assurance considerations.
- Does the Developer have:
  - A design and code inspection process that requires specific secure design and coding standards as part of the inspection criteria?
  - Secure design and coding standards that consider the Common Weakness Enumeration (CWE), the Software Engineering Institute (SEI) Top 10 secure coding practices, and other sources when defining the standards?
- Have software vulnerabilities derived from the following three sources been mitigated?
  - Common Weakness Enumeration (CWE)
  - Common Vulnerabilities and Exposures (CVE)
  - Common Attack Pattern Enumeration and Classification (CAPEC)
- Are static analysis tools used to identify and mitigate vulnerabilities?
- Does the software contain Fault Detection/Fault Isolation (FDFI) and tracking or logging of faults?
- Do the software interfaces contain input checking and validation?
- Is access to the development environment controlled with limited authorities, and does it enable tracing all code changes to specific individuals?
- Are specific code test-coverage metrics used to ensure adequate testing?
- Are regression tests routinely run following changes to code?
Each “No” response marks a point where a countermeasure may be considered for risk mitigation. A simple way of translating the “No” responses into a risk likelihood is to map the percentage of “No” responses to a likelihood level, as shown in Table 13.6.T1.
Table 13.6.T1. Sample Risk Likelihood Mapping

| Number of “No” Responses | Risk Likelihood |
|---|---|
| All “No” | Near Certainty (VH) |
| >= 75% “No” | High Likelihood (H) |
| >= 25% “No” | Likely (M) |
| <= 25% “No” | Low Likelihood (L) |
| <= 10% “No” | Not Likely (NL) |
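The mapping in Table 13.6.T1 can be sketched in code. This is an illustrative scoring helper, not a prescribed tool; the function name is hypothetical, and where the table's boundary values overlap (exactly 25% “No”), the sketch assumes the higher likelihood applies.

```python
# Sketch: derive a risk likelihood from "Yes/No" questionnaire responses
# using the illustrative thresholds of Table 13.6.T1. The function name and
# the tie-breaking at the 25% boundary are assumptions, not guidance.

def risk_likelihood(responses):
    """responses: list of booleans, True = "Yes", False = "No"."""
    if not responses:
        raise ValueError("at least one response is required")
    pct_no = responses.count(False) / len(responses)
    if pct_no == 1.0:
        return "Near Certainty (VH)"      # all "No"
    if pct_no >= 0.75:
        return "High Likelihood (H)"
    if pct_no >= 0.25:
        return "Likely (M)"               # boundary resolved upward
    if pct_no > 0.10:
        return "Low Likelihood (L)"
    return "Not Likely (NL)"              # <= 10% "No"
```

A questionnaire with four questions and one “No” response (25%) would map to Likely (M) under this reading of the table.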
Table 13.6.T2 provides an example of a table that summarizes the vulnerability and threat assessment results used to develop the risk likelihood. A table similar to this helps the program understand the rationale and should be documented in the Risk section of the Program Protection Plan (PPP). The overall likelihood is derived from the supply chain risk likelihood, the software assurance risk likelihood, and the threat assessment. The overall risk likelihood may be derived by using a weighted average of the three inputs or by taking the highest of the three. In the example shown in Table 13.6.T2, the overall risk likelihood of “High” was derived by applying equal weights to the supply chain risk likelihood, the software assurance risk likelihood, and the threat assessment risk. The program or Service may develop its own weightings based upon program- and domain-specific knowledge.
Table 13.6.T2. Risk Likelihood Derived from Vulnerability and Threat Assessments

| Critical Function Component | Mission Impact | Supply Chain Risk Likelihood | Software Assurance Risk Likelihood | Threat Assessment Risk | Overall Risk Likelihood |
|---|---|---|---|---|---|
| Component 1 | I | High: no blind buys; no supply chain visibility; no supplier qualification process; no receiving verification; no trusted suppliers | Very High: no fault logging; no secure design standard; no static analysis; no CVE, CWE, or CAPEC review; no input validation; no development environment control; no regression testing; low test coverage | Medium | High |
| Component 2 | II | Low: no supply chain visibility; no supplier qualification | Not Likely | Medium | Low |
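The equal-weighting scheme described above for deriving the overall risk likelihood can be sketched as follows. The ordinal scale matches the levels used in Table 13.6.T2; the function name, the weight parameters, and the rounding rule are illustrative assumptions, not prescribed by the guidance.

```python
# Sketch: combine supply chain, software assurance, and threat assessment
# likelihoods into an overall risk likelihood, either by weighted average
# or by taking the highest of the three. Names and rounding are assumptions.

SCALE = ["Not Likely", "Low", "Medium", "High", "Very High"]

def overall_likelihood(supply_chain, software_assurance, threat,
                       weights=(1.0, 1.0, 1.0), use_highest=False):
    levels = [SCALE.index(x) for x in (supply_chain, software_assurance, threat)]
    if use_highest:
        # Conservative alternative: the overall likelihood is the worst input.
        return SCALE[max(levels)]
    # Weighted average over the ordinal levels, rounded to the nearest level.
    avg = sum(w * l for w, l in zip(weights, levels)) / sum(weights)
    return SCALE[round(avg)]
```

With equal weights, the Component 1 inputs from Table 13.6.T2 (High, Very High, Medium) average to the “High” overall likelihood shown in the table, and the Component 2 inputs (Low, Not Likely, Medium) average to “Low”.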
The “No” responses to the questions help determine the possible countermeasures to be considered for risk mitigation. A similar table may be created that records the planned countermeasures and the new risk likelihood resulting from the planned mitigations. Table 13.6.T3 provides an example worksheet for planning the countermeasures and the resulting risk likelihood.
Table 13.6.T3. Risk Likelihood After Mitigations

| Critical Function Component | Mission Impact | Supply Chain Mitigations | Software Assurance Mitigations | Threat Assessment Risk | Overall Risk Likelihood |
|---|---|---|---|---|---|
| Component 1 | I | Blind buys; supply chain (SC) visibility included in the Statement of Work (SOW); supplier verification and test of commercial off-the-shelf (COTS) components; requirement to flow down SOW requirements to sub-tier suppliers | Secure design and coding standard included in SOW; fault logging added; static analysis added; CVE, CWE, and CAPEC used to establish and update secure design standards; input validation added to interfaces; development environment control added to limit access and record all access; regression testing added; test coverage increased to 60%; penetration testing added | Medium | Low to Medium |
The risk is then incorporated into the program technical risks. The risk entry may look similar to the following example:
| Software Assurance Technical Risks | Mitigation Activities |
|---|---|
| R1. Field-programmable gate array (FPGA) 123 has high exposure to software vulnerabilities with potential foreign influence | Establishing a wrapper to implement secure design standards and fault logging, static analysis, increased test coverage, and penetration testing |
| Technical Issues | |
| 1. May impact performance, cost, and schedule | |
| Opportunities | |
| O1. Low investment, great benefit for the program and overall for Missile Programs | Low cost, benefit for program and command |
Ensure that the top program protection risks (very high and high) have a risk cube and mitigation plans.
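Placing a risk on a cube can be sketched as a simple lookup from likelihood and consequence levels. This is a minimal illustration assuming a standard 5x5 matrix; the red/yellow/green banding below is a common convention, not the rating scheme mandated by this guidance, and programs should use their approved reporting matrix.

```python
# Sketch: rate a risk on an assumed 5x5 risk cube from its likelihood and
# consequence levels (1 = lowest, 5 = highest). The banding thresholds are
# illustrative only; use the program's approved risk reporting matrix.

def risk_cube_rating(likelihood, consequence):
    """likelihood, consequence: integers 1..5."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("levels must be 1..5")
    score = likelihood * consequence
    if score >= 15:
        return "High"      # red: requires a mitigation plan
    if score >= 6:
        return "Moderate"  # yellow
    return "Low"           # green
```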