13.4. Intelligence and Counterintelligence (CI) Support


13.4.1. Defense Intelligence Agency Supply Chain Risk Management Threat Assessment Center (DIA SCRM TAC)

DoD has designated the Defense Intelligence Agency (DIA) as the DoD enterprise focal point for the threat assessments the acquisition community needs to assess supplier risks. DIA established the Threat Assessment Center (TAC) for this purpose. The TAC provides enterprise management of, and the interface to, resources within the National Counterintelligence Executive (NCIX), and coordinates with the Defense Intelligence and Defense Counterintelligence Components to provide standardized all-source intelligence assessments in support of acquisition risk management. This enterprise integration role was designed to achieve comprehensive and consistent engagement across the Military Departments (MILDEPs) and Defense Agencies in meeting their needs for supplier threat assessments, and to ensure efficient and coherent use of the results by the acquisition community.

13.4.1.1. Threat Assessments

Defense Intelligence Agency (DIA) threat assessments provide specific and timely threat characterizations of identified suppliers to inform program management. The Program Manager and the engineering team use TAC reports to assist in selecting supplier and/or architecture alternatives and in developing appropriate mitigations for supply chain risks. For the policy and procedures regarding the request, receipt, and handling of TAC reports, refer to DoD Instruction O-5240.24.

Supplier threat assessment requests are developed based on the criticality analysis. An annotated work breakdown structure (WBS) or system breakdown structure (SBS) that identifies the suppliers of critical function components may be used to assist in creating TAC requests. Requests may be submitted as soon as sources of critical capability are identifiable. Near the end of the Materiel Solution Analysis (MSA) Phase, when some threat information is available from the capstone threat assessment (CTA) and technologies and potential suppliers have been identified, Supply Chain Risk Management (SCRM) threat assessments may be used to help define the lowest-risk architectures based on the suppliers associated with particular architecture alternatives. Note that early in the system lifecycle, threat requests may focus on suppliers in general technology areas to inform architecture choices, while later they may focus on the critical components defined in the criticality analysis.

13.4.1.2. Criticality Analysis to Inform TAC Requests

Engineering activities related to Supply Chain Risk Management (SCRM) begin as architecture alternatives are considered and continue throughout the acquisition lifecycle. As the systems engineering team develops the initial view of system requirements and system design concepts, a criticality analysis is performed to define critical technology elements. Criticality analysis produces a list of critical components and suppliers that are used to generate Threat Assessment Center (TAC) requests and supplier risk mitigation.

The criticality analysis begins early in the system acquisition lifecycle and continues to be updated and enhanced through Milestone C, becoming more specific as architecture decisions are made and the system boundaries are fully defined. The engineering team may, at any point beginning prior to Milestone A, identify technology elements and potential manufacturers and request supplier threat assessments. The number of requests is expected to grow as the criticality analysis becomes more specific and the system architecture and boundaries are fully specified; the greatest number of TAC requests will typically occur between Milestones B and C, i.e., between the Preliminary Design Review (PDR) and the Critical Design Review (CDR). See Section 13.3.2 for more information.

13.4.2. Counterintelligence Support

[This section will be updated to reflect implementation guidance for DoD Instruction O-5240.24, but content was not ready by the submission deadline for this major update.]

When an acquisition program containing Critical Program Information (CPI) is initiated, the Program Manager (PM) should request a counterintelligence (CI) analysis of CPI from the servicing CI organization. The CI analysis focuses on how the opposition sees the program and on how to counter the opposition's collection efforts. The CI analyst, in addition to having an in-depth understanding and expertise on foreign intelligence collection capabilities, must have a good working knowledge of the U.S. program. Therefore, CI organizations need information that describes the CPI and its projected use to determine the foreign collection threat to an acquisition program.

The CI analytical product that results from the analysis provides the PM with an evaluation of foreign collection threats to specific program or project technologies, the impact if that technology is compromised, and the identification of related foreign technologies that could affect program or project success. The CI analytical product is updated as necessary (usually prior to each major milestone decision) throughout the acquisition process. Changes are briefed to the program office or PM within 60 days.

13.4.2.1. Requesting Counterintelligence (CI) Analytical Support

The PM's request to the counterintelligence organization for an analytical product normally contains the following information and is classified according to content:

  • Program office, designator, and address;
  • PM's name and telephone number;
  • Point of contact (POC) name, address, and telephone number;
  • Supporting or supported programs' or projects' names and locations;
  • Operational employment role, if any;
  • List of CPI;
  • Relationship to key technologies or other controlled technology lists of the Departments of Defense, Commerce, and/or State;
  • CPI technical description, including distinguishing characteristics (e.g., emissions; sight or sensor sensitivities) and methods of CPI transmittal, usage, storage, and testing;
  • Use of foreign equipment or technology during testing (if known);
  • Anticipated foreign involvement in the development, testing, or production of the U.S. system;
  • Contractor names, locations, Points of Contact (POCs), and telephone numbers, as well as the identification of each CPI used at each location; and
  • Reports of known or suspected compromise of CPI.
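The fields a request normally contains can be tracked as a structured record. The sketch below is illustrative only; the field names and the minimal completeness check are assumptions, not a mandated DoD format.

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the request contents listed above;
# field names are illustrative, not a prescribed format.
@dataclass
class CIAnalysisRequest:
    program_office: str                # office, designator, and address
    pm_contact: str                    # PM's name and telephone number
    poc_contact: str                   # POC name, address, and telephone number
    related_programs: list[str] = field(default_factory=list)
    operational_role: str = ""
    cpi_list: list[str] = field(default_factory=list)
    controlled_tech_refs: list[str] = field(default_factory=list)  # DoD/Commerce/State lists
    cpi_description: str = ""          # distinguishing characteristics, handling
    foreign_test_equipment: str = "unknown"
    foreign_involvement: str = ""
    contractor_sites: dict[str, list[str]] = field(default_factory=dict)  # location -> CPI used there
    compromise_reports: list[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        """Illustrative pre-submission check: contacts and a CPI list present."""
        return bool(self.program_office and self.pm_contact and self.cpi_list)
```

A record like this makes it easy to confirm the request is complete and classify it according to content before submission.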

13.4.2.2. Preliminary Counterintelligence (CI) Analytical Product

After the request is submitted, the DoD Component CI organization provides a preliminary CI analytical product to the program manager within 90 days. A preliminary analytical product is more generic and less detailed than the final product. It is limited in use since it only provides an indication of which countries have the capability to collect intelligence on the U.S. system or technology as well as the possible interest and/or intention to collect it. The preliminary CI analytical product may serve as the basis for the draft Program Protection Plan.

13.4.2.3. Final Counterintelligence (CI) Analytical Product

The program manager approves the Program Protection Plan only after the final CI analysis of Critical Program Information (CPI) has been received from the applicable DoD Component CI and/or intelligence support activity. Normally, the CI analysis of CPI is returned to the requesting program office within 180 days of the CI and/or intelligence organization receiving the request.

The CI analysis of CPI answers the following questions about CPI:

  • Which foreign interests might be targeting the CPI and why?
  • What capabilities does each foreign interest have to collect information on the CPI at each location identified by the program office?
  • Does evidence exist to indicate that a program CPI has been targeted?
  • Has any CPI been compromised?

13.5. Vulnerability Assessment

This section briefly describes a process for identifying vulnerabilities in systems. A vulnerability is any weakness in system design, development, production, or operation that can be exploited by a threat to defeat a system’s mission objectives or significantly degrade its performance. Decisions about which vulnerabilities need to be addressed, and which countermeasures or mitigation approaches should be applied, will be based on an overall understanding of threats, risks, and program priorities. Vulnerability assessment is a step in the overall risk assessment process, as described in Section 13.6.

Vulnerability assessments should focus first on the mission-critical functions and components identified by a Criticality Analysis (see Section 13.3.2) and the Critical Program Information (CPI) identified (see Section 13.3.1). The search for vulnerabilities should begin with these critical functions, associated components and CPI.

13.5.1. Approaches to Identifying Vulnerabilities

Potential malicious activities that could interfere with a system’s operation should be considered throughout a system’s design, development, testing, production, and maintenance. Vulnerabilities identified early in a system’s design can often be eliminated with simple design changes at lower cost. Vulnerabilities found later may require “add-on” protection measures or operating constraints that may be less effective and more expensive.

The principal vulnerabilities to watch for in an overall review of system engineering processes are:

  • Access paths within the supply chain that would allow threats to introduce components that could cause the system to fail at some later time (“components” here include hardware, software, and firmware); and
  • Access paths that would allow threats to trigger a component malfunction or failure at a time of their choosing.

“Supply chain” here means any point in a system’s design, engineering and manufacturing development, production, configuration in the field, updates, and maintenance. Access opportunities may be extended or brief, but even brief access is potentially exploitable.

Two design processes that have proven effective in identifying vulnerabilities are Fault Tree Analysis (FTA) and Failure Modes, Effects, and Criticality Analysis (FMECA). An important twist in applying these techniques is that the potential sources of failures are malicious actors, not random device failures. Malicious actors invalidate many assumptions made about randomness and event independence in reliability analysis. Both FTA and FMECA assume hypothetical system or mission failures have occurred, and trace back through the system to determine contributing component malfunctions or failures. For a vulnerability assessment, the possible access paths and opportunities a threat would have to exercise to introduce the vulnerability or trigger the failure must also be considered.
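The trace-back logic of FTA can be sketched in a few lines. The tree below is a hypothetical example, not drawn from any program: leaves are component compromises a malicious actor could cause, and because the actor is deliberate rather than random, the question is reachability ("can an actor with these access paths cause the top-level failure?"), not probability.

```python
# Minimal fault-tree evaluation sketch (assumed structure, illustrative only).
# A gate is ("AND" | "OR", [child gates]) or ("LEAF", component_name).
def evaluate(gate, compromised):
    """Return True if the event represented by `gate` occurs, given the
    set of leaf components a malicious actor has compromised."""
    kind, children = gate
    if kind == "LEAF":
        return children in compromised
    results = [evaluate(child, compromised) for child in children]
    return all(results) if kind == "AND" else any(results)

# Hypothetical tree: mission fails if the processor is compromised, OR
# both a software module and its backup are compromised.
tree = ("OR", [
    ("LEAF", "Processor X"),
    ("AND", [("LEAF", "SW Module Y"), ("LEAF", "SW Module Y backup")]),
])

print(evaluate(tree, {"SW Module Y"}))                        # False: backup intact
print(evaluate(tree, {"SW Module Y", "SW Module Y backup"}))  # True: mission failure
```

Walking such a tree from an assumed top-level failure down to its leaves identifies which component compromises, and therefore which access paths, must be considered.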

For software, a number of automated analysis tools are available that identify common vulnerabilities. These tools apply different criteria and often find different flaws, so it is beneficial to run code through multiple tools.
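Because each tool finds different flaws, their outputs need to be merged and de-duplicated. A sketch of that merge step follows; the tool names, finding fields, and CWE identifiers are illustrative placeholders, not output from any real analyzer.

```python
# Merge findings from several analysis tools, de-duplicating by
# (file, line, weakness) and remembering which tools agree.
def merge_findings(per_tool):
    """Map (file, line, cwe) -> set of tools that reported it."""
    merged = {}
    for tool, findings in per_tool.items():
        for f in findings:
            key = (f["file"], f["line"], f["cwe"])
            merged.setdefault(key, set()).add(tool)
    return merged

# Hypothetical results from two tools run over the same code base.
per_tool = {
    "tool_a": [{"file": "auth.c", "line": 42, "cwe": "CWE-120"}],
    "tool_b": [{"file": "auth.c", "line": 42, "cwe": "CWE-120"},
               {"file": "io.c",   "line": 10, "cwe": "CWE-78"}],
}

merged = merge_findings(per_tool)
print(len(merged))                                     # 2 distinct findings
print(sorted(merged[("auth.c", 42, "CWE-120")]))       # ['tool_a', 'tool_b']
```

Findings flagged by more than one tool are often the highest-confidence candidates for remediation, while single-tool findings show the coverage gained by running additional tools.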

Controls on access to software during development and in the field are critical to limiting opportunities for exploitation. One approach to testing access controls and software vulnerabilities in general is Red Teaming. Red teams typically subject a system under test to a series of attacks, simulating the tactics of an actual threat. (See further discussion of software tools and access controls in Section 13.7.3, Software Assurance.)

13.5.2. Rating Vulnerability Severity

The consequences of exploiting a vulnerability should be rated on the same scale as criticality (catastrophic, critical, marginal, and negligible). Vulnerability levels, however, may not match the criticality levels. For example, a vulnerability may expose a “critical” function in a way that has only a “marginal” consequence. At the same time, another vulnerability may expose several “critical” functions that taken together could lead to a “catastrophic” system failure.

Additional factors that should be rated include the ease or difficulty of exploiting a vulnerability, the developers’ or maintainers’ ability to detect access used to introduce or trigger a vulnerability, and any other deterrents to threats such as the consequences of being caught. A summary table of the vulnerability assessment is illustrated in Table 13.5.2.T1.

Table 13.5.2.T1. Sample summary vulnerability assessment table

  Critical Components      Identified          Exploitability   System Impact      Exposure
  (Hardware, Software,     Vulnerabilities                      (I, II, III, IV)
  Firmware)
  -----------------------  ------------------  ---------------  -----------------  --------
  Processor X              Vulnerability 1     Low              II                 Low
                           Vulnerability 4     Medium
  SW Module Y              Vulnerability 1     High             I                  High
                           Vulnerability 2     Low
                           Vulnerability 3     Medium
                           Vulnerability 6     High
  SW Algorithm A           None                Very Low         II                 Very Low
  FPGA 123                 Vulnerability 1     Low              I                  Low
                           Vulnerability 23
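The columns of such a table can be combined into a single ordering for triage. The sketch below ranks the sample components by a weighted score; the ranking scales, the weighting rule, and the use of each component's worst-case exploitability are illustrative assumptions, not a prescribed DoD method.

```python
# Illustrative ordering of the table's rating scales; impact is weighted
# most heavily in the combined score. These weights are assumptions.
IMPACT_RANK = {"IV": 1, "III": 2, "II": 3, "I": 4}
LEVEL_RANK = {"Very Low": 0, "Low": 1, "Medium": 2, "High": 3}

def concern(impact, exploitability, exposure):
    """Higher score = more urgent attention warranted."""
    return IMPACT_RANK[impact] * 3 + LEVEL_RANK[exploitability] + LEVEL_RANK[exposure]

# Rows from the sample table, using each component's worst-case exploitability.
rows = [
    ("Processor X",    "II", "Medium",   "Low"),
    ("SW Module Y",    "I",  "High",     "High"),
    ("SW Algorithm A", "II", "Very Low", "Very Low"),
    ("FPGA 123",       "I",  "Low",      "Low"),
]

ranked = sorted(rows, key=lambda r: concern(*r[1:]), reverse=True)
print(ranked[0][0])  # SW Module Y: highest impact, exploitability, and exposure
```

Any such score is only an aid to discussion; the actual prioritization belongs to the program's risk assessment, where threat information is also weighed.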

13.5.3. Identifying Vulnerability Mitigations or Countermeasures

There are multiple countermeasures available to mitigate a wide range of possible vulnerability risks. Design changes may 1) eliminate a vulnerability, 2) reduce the consequences of its exploitation, or 3) block the access necessary to introduce or exploit it. “Add-on” protection mechanisms may block the access required to trigger an exploitation. An effective update process, particularly for software, can correct or counteract vulnerabilities discovered after fielding.

As a result of globalization, commercial off-the-shelf (COTS) components are designed and manufactured anywhere in the world, and it may be difficult or impossible to trace all opportunities for malicious access. If the source of a particular component might be compromised, it may be possible to substitute a comparable component from another, more dependable source. Anonymous purchases (Blind Buys) may prevent an untrustworthy supplier from knowing where the component is being used. More extensive testing may be required for critical components from unverified or less dependable sources. A variety of different countermeasures should be identified to inform and provide options for the program manager’s risk-mitigation decisions.

13.5.4. Interactions with Other Program Protection Processes

Investigation of vulnerabilities may indicate the need to raise or at least reconsider the criticality levels of functions and components identified in earlier criticality analyses. Investigation of vulnerabilities may also identify additional threats, or opportunities for threats, that were not considered risks in earlier vulnerability assessments. Vulnerabilities inform the risk assessment and the countermeasure cost-risk-benefit trade-off.

Discovery of a potentially malicious source from the threat assessment may warrant additional checks for vulnerabilities in other (less-critical) products procured from that source. Therefore, threat assessments can inform vulnerability assessments.

In the Program Protection Plan (PPP), the vulnerability assessment process should be documented at a high level, along with the person responsible for it. The date and results of each vulnerability assessment, and the planned dates or period of future assessments, are also recorded in the PPP.
