
10.4. Role of Exit Criteria

Each Milestone Decision Authority (MDA) should use exit criteria for ACAT I and ACAT IA programs during an acquisition phase. Prior to each milestone decision point and at other decision reviews, the Program Manager will develop and propose exit criteria appropriate to the next phase or effort of the program. The Overarching Integrated Product Team will review the proposed exit criteria and make a recommendation to the MDA. Exit criteria approved by the MDA will be published in the Acquisition Decision Memorandum.

System-specific exit criteria normally track progress in important technical, schedule, or management risk areas. Unless waived or modified by the MDA, exit criteria must be satisfied before the program may continue with additional activities within an acquisition phase or proceed into the next acquisition phase (depending on the decision with which they are associated). Exit criteria should not be part of the Acquisition Program Baseline (APB) and are not intended to repeat or replace APB requirements or the phase-specific entrance criteria specified in DoD Instruction 5000.02, nor should they cause program deviations.

10.5. Role of Independent Assessments

Assessments, independent of the developer and the user, provide a different perspective on program status. However, requirements for independent assessments (for example, Program Support Reviews, Assessments of Operational Test Readiness, independent cost estimates, and technology readiness assessments) must be consistent with statutory requirements, policy, and good management practice. Senior acquisition officials consider these assessments when making acquisition decisions. Staff offices that provide independent assessments should support the orderly and timely progression of programs through the acquisition process. Overarching Integrated Product Team access to independent assessments that provide additional program perspectives facilitates full and open discussion of issues.

10.5.1. Independent Cost Estimate

Section 2334 of title 10, United States Code, requires the Director, Cost Assessment and Program Evaluation (DCAPE) to conduct independent cost estimates (ICEs) on Major Defense Acquisition Programs (MDAPs) and Major Automated Information Systems (MAIS) programs for which the Under Secretary of Defense (Acquisition, Technology, and Logistics) is the Milestone Decision Authority. The statute also requires DCAPE to review Department of Defense (DoD) Component cost estimates and cost analyses conducted in connection with MDAPs and MAIS programs.

Further, the statute gives DCAPE the authority to prescribe the policies and procedures for the conduct of all cost estimates for DoD acquisition programs and to issue guidance relating to the full consideration of life-cycle management and sustainability costs.

10.5.1.1. Independent Cost Estimate (ICE) for Major Defense Acquisition Programs (MDAPs)

The Director, Cost Assessment and Program Evaluation (DCAPE) conducts ICEs and cost analyses for MDAPs for which the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) is the Milestone Decision Authority:

(1) In advance of any decision to enter low rate initial production or full rate production.

(2) In advance of any certification pursuant to sections 2366a, 2366b, or 2433a of title 10, United States Code.

(3) At any other time considered appropriate by the DCAPE or upon the request of the USD(AT&L).

10.5.1.2. Independent Cost Estimate (ICE) for Major Automated Information Systems (MAIS) Programs

The Director, Cost Assessment and Program Evaluation (DCAPE) conducts ICEs and cost analyses for MAIS programs for which the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) is the Milestone Decision Authority:

(1) In advance of any report pursuant to section 2445c(f) of title 10, United States Code.

(2) At any other time considered appropriate by the DCAPE or upon the request of the USD(AT&L).

10.5.1.3. Review of Cost Estimates

The Director, Cost Assessment and Program Evaluation (DCAPE) participates in the discussion of any discrepancies related to cost estimates for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) programs; comments on deficiencies in the methodology or execution of the estimates; concurs with the choice of the cost estimate used to support the Acquisition Program Baseline or any of the cost estimates identified in paragraphs 10.5.1.1. and 10.5.1.2.; and participates in the consideration of any decision to request authorization of a multi-year procurement contract for an MDAP.

10.5.1.4. Cost Estimate Confidence Levels

The Director, Cost Assessment and Program Evaluation (DCAPE) and the Secretary of the Military Department concerned or the head of the Defense Agency concerned (as applicable) state the confidence level used in establishing the cost estimate for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) programs, ensure that the confidence level provides a high degree of confidence that the program can be completed without the need for significant adjustment to program budgets, and provide the rationale for selecting that confidence level. The confidence level statement shall be included in the Acquisition Decision Memorandum approving the Acquisition Program Baseline and in any documentation of cost estimates for MDAPs or MAIS programs prepared in association with the events identified in paragraphs 10.5.1.1. and 10.5.1.2. The confidence level statement shall also be included in the next Selected Acquisition Report prepared in compliance with section 2432 of title 10, United States Code, or in the next quarterly report prepared in compliance with section 2445c of title 10, United States Code.
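
To make the confidence level concept concrete, the following is a minimal illustrative sketch, not DCAPE methodology or a prescribed tool: a cost estimate stated at, say, the 80 percent confidence level is the cost that the cost-risk analysis indicates the program has an 80 percent probability of not exceeding. The cost elements, triangular cost ranges, and dollar values below are hypothetical assumptions used only to show how such a figure is read from a simulated cost distribution.

# Illustrative sketch only: reading a cost estimate at a stated confidence
# level from a simulated cost-risk distribution. The cost elements, triangular
# (low, most likely, high) ranges, and dollar values are hypothetical; actual
# DCAPE cost-risk methods are governed by its own policies and procedures.
import random

# Hypothetical cost-risk inputs per cost element, in then-year $M: (low, most likely, high)
cost_elements = {
    "Air Vehicle":         (900.0, 1100.0, 1600.0),
    "Propulsion":          (250.0,  300.0,  450.0),
    "Training":            ( 60.0,   80.0,  140.0),
    "Systems Engineering": (120.0,  150.0,  240.0),
}

def simulate_total_costs(elements, trials=20000, seed=1):
    """Monte Carlo simulation of total program cost from element-level cost ranges."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        totals.append(sum(rng.triangular(low, high, likely)
                          for low, likely, high in elements.values()))
    return sorted(totals)

def cost_at_confidence_level(sorted_totals, confidence):
    """Cost with the given probability of not being exceeded (the 'confidence level')."""
    index = min(int(confidence * len(sorted_totals)), len(sorted_totals) - 1)
    return sorted_totals[index]

totals = simulate_total_costs(cost_elements)
for level in (0.50, 0.80):
    estimate = cost_at_confidence_level(totals, level)
    print(f"Estimate at the {level:.0%} confidence level: ${estimate:,.0f}M")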

10.5.2. Technology Maturity and Technology Readiness Assessments

A Technology Readiness Assessment (TRA) is a systematic, metrics-based process that assesses the maturity of, and the risk associated with, critical technologies to be used in Major Defense Acquisition Programs (MDAPs). It is conducted by the Program Manager (PM) with the assistance of an independent team of subject matter experts (SMEs). The TRA is submitted to the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) and forms part of the basis on which the ASD(R&E) advises the Milestone Decision Authority (MDA), at Milestone (MS) B or at other events designated by the MDA, in determining whether the program's technologies have acceptable levels of risk, based in part on the degree to which they have been demonstrated (including demonstration in a relevant environment), and in supporting the risk-mitigation plans prepared by the PM.

A TRA is required by Department of Defense Instruction (DoDI) 5000.02 for MDAPs at MS B (or at a subsequent Milestone if there is no MS B) and is also conducted whenever otherwise required by the MDA.

A TRA focuses on the program’s “critical” technologies (i.e., those that may pose major technological risk during development, particularly during the Engineering and Manufacturing Development (EMD) phase of acquisition). Technology Readiness Levels (TRLs) can serve as a helpful knowledge-based standard and shorthand for evaluating technology maturity, but they must be supplemented with expert professional judgment.

The Program Manager should identify critical technologies using tools such as the Work Breakdown Structure. To provide useful technology maturity information to the acquisition review process, technology readiness assessments of critical technologies and identification of critical program information (CPI) must be completed prior to the Milestone B and Milestone C decision points.

10.5.2.1. Assessment of MDAP Technologies

The TRA final report for MDAPs must be submitted to ASD(R&E) for review to support the requirement that ASD(R&E) provide an independent assessment to the Milestone Decision Authority.

10.5.2.2. Technology Readiness Levels (TRLs)

A summary table of TRL descriptions (Table 10.5.2.2.T1) follows:

Table 10.5.2.2.T1. TRL Descriptions

TRL 1. Basic principles observed and reported.
Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology's basic properties.

TRL 2. Technology concept and/or application formulated.
Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative, and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies.

TRL 3. Analytical and experimental critical function and/or characteristic proof of concept.
Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.

TRL 4. Component and/or breadboard validation in laboratory environment.
Basic technological components are integrated to establish that they will work together. This is relatively "low fidelity" compared to the eventual system. Examples include integration of "ad hoc" hardware in the laboratory.

TRL 5. Component and/or breadboard validation in relevant environment.
Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so the technology can be tested in a simulated environment. Examples include "high fidelity" laboratory integration of components.

TRL 6. System/subsystem model or prototype demonstration in a relevant environment.
A representative model or prototype system, well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.

TRL 7. System prototype demonstration in an operational environment.
Prototype near, or at, planned operational system. Represents a major step up from TRL 6, requiring demonstration of an actual system prototype in an operational environment such as an aircraft, a vehicle, or space. Examples include testing the prototype in a test bed aircraft.

TRL 8. Actual system completed and qualified through test and demonstration.
Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine if it meets design specifications.

TRL 9. Actual system proven through successful mission operations.
Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. Examples include using the system under operational mission conditions.

The use of TRLs enables consistent, uniform discussions of technical maturity across different types of technologies. Decision authorities will consider the recommended TRLs (or some equivalent assessment methodology, e.g., Willoughby templates) when assessing program risk. TRLs are a measure of technical maturity only; they do not address the probability of attaining the required maturity or the impact of failing to attain it.
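
As a purely illustrative aid, the sketch below expresses the TRL scale from Table 10.5.2.2.T1 as a simple data structure and screens hypothetical critical technologies against an assumed maturity threshold. The technology names, assessed TRLs, and the TRL 6 threshold are assumptions for illustration only; the acceptability of technology risk is determined by the MDA, informed by the ASD(R&E) assessment, not by any fixed numeric cutoff.

# Illustrative sketch only: representing the TRL scale and screening hypothetical
# critical technologies against an assumed maturity threshold. Technology names,
# assessed TRLs, and the threshold are illustrative assumptions; the actual risk
# determination is made by the MDA, informed by the ASD(R&E) assessment.
TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental critical function and/or characteristic proof of concept",
    4: "Component and/or breadboard validation in laboratory environment",
    5: "Component and/or breadboard validation in relevant environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

# Hypothetical critical technologies identified from the program WBS, with the
# TRL assessed by the independent SME team.
assessed_critical_technologies = {
    "Conformal antenna array": 5,
    "High-density energy storage": 6,
    "Adaptive flight control software": 4,
}

# Assumed threshold for illustration: demonstration in a relevant environment (TRL 6).
DEMONSTRATION_THRESHOLD = 6

def immature_technologies(assessed, threshold=DEMONSTRATION_THRESHOLD):
    """Return technologies assessed below the threshold, as candidates for risk-mitigation planning."""
    return {name: trl for name, trl in assessed.items() if trl < threshold}

for name, trl in immature_technologies(assessed_critical_technologies).items():
    print(f"{name}: TRL {trl} ({TRL_DESCRIPTIONS[trl]}) - below TRL {DEMONSTRATION_THRESHOLD}")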

For additional information, see the on-line TRA Deskbook.
