The DAG does not reflect the changes in DoDI 5000.02. Work to update the content is in progress and will be completed as soon as possible.

9.5 Test and Evaluation Planning


T&E planning should produce statistically defensible test results that effectively support decision makers. Design of Experiments (DOE), a common approach, serves as a structured process to assist in developing T&E strategies using statistical analysis. Many constraints exist in testing: limited test resources, limited test time, and limited test articles. DOE aids in understanding the tradeoffs among these constraints and their implications. Additionally, DOE can provide a statistically optimum allocation of assets under given constraints, and can optimally allocate test points among multiple phases of testing. DOE also ensures that the data collected in multiple phases combine synergistically to support sequential learning about the system.
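As a minimal illustration of the DOE idea (not DAG guidance), the sketch below enumerates a full-factorial design over hypothetical operational and environmental factors; the factor names and levels are invented for the example, and a real program would typically use fractional or optimal designs when the full factorial exceeds the available test articles or test time.

```python
from itertools import product

# Hypothetical operational/environmental factors and their levels;
# these names are illustrative, not drawn from any program.
factors = {
    "altitude": ["low", "high"],
    "weather": ["clear", "rain"],
    "target_type": ["stationary", "moving"],
}

# A full-factorial design enumerates every combination of factor levels,
# making the required number of test points explicit up front.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(design))   # 2 * 2 * 2 = 8 test points
print(design[0])
```

Even this toy enumeration shows the resource tradeoff DOE makes visible: each added factor or level multiplies the number of test points, which is why structured designs, rather than one-factor-at-a-time testing, are used to allocate scarce test assets.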

A program applying DOE should start early in the acquisition process and assemble a team of subject matter experts who can identify the operational and environmental conditions that drive successful system performance, along with the levels of each factor to consider. The team should include representation from all testing (contractor testing, Government DT, and OT). The developed TEMP should include the resources needed, the plan for early tests (including component tests), and the use of early test results to plan further testing.

9.5.1. DT&E Planning

A well-planned and executed DT&E program supports the technology development and acquisition strategies as well as the systems engineering process, providing the information necessary for informed decision-making throughout the development process and at each acquisition milestone. DT&E provides the verification and validation (V&V) of the systems engineering process as well as confidence that the system design solution satisfies the desired capabilities. The strategy for T&E should remain consistent with and complementary to the SEP and acquisition strategy. The T&E WIPT, working closely with the PM and the system design team, facilitates this process. Rigorous component and sub-system DT&E enables early performance and reliability assessments for use in system design. DT&E and integrated testing events should advance to rigorous system-level and system-of-systems (SoS) level T&E, ensuring the system matures to the point where it can enter production and ultimately meet operational employment requirements.

DT&E reduces technical risk and increases the probability of a successful program. During early DT&E, the prime contractor focuses contractor testing on technical contract specifications. Government testers observe the critical contractor testing, conduct additional T&E, and, when practical, facilitate early user involvement. The PM's contract with industry must support open communication between Government and contractor testers. The OSD document, "Incorporating Test and Evaluation into Department of Defense Acquisition Contracts," dated October 2011, provides additional guidance on contract-related issues for the successful solicitation, award, and execution of T&E-related aspects of acquisition contracts. Items such as commercial-off-the-shelf products, non-developmental items, and Government-off-the-shelf products, regardless of the manner of procurement, must undergo DT&E to verify readiness to enter IOT&E and to enable proper evaluation of operational effectiveness, operational suitability, and survivability or operational security for the intended military application. Programs should not enter IOT&E until the DoD Components indicate confidence that the "production representative" system will successfully demonstrate the effectiveness, suitability, and survivability criteria established in the capability production document (CPD). In addition, the Government will report DT&E results at each program milestone, providing knowledge to reduce risk in those acquisition decisions.

9.5.2. OT&E Planning

DoDI 5000.02 Enclosure 6 lists mandatory elements of OT&E planning and execution. Other considerations include:

  • Planning should consider an integrated testing approach. The integrated approach should not compromise either DT&E or OT&E objectives. Planning should provide for an adequate OT period and report generation, including the DOT&E BLRIP report to the SecDef and Congress prior to the FRP decision.
  • OT&E should take maximum advantage of training and exercise activities to increase the realism and scope of both the OT&E and training, and to reduce testing costs.
  • OTAs should participate in early DT&E and M&S to provide operational insights to the PM, the JCIDS process participants, and acquisition decision-makers. OT&E responsibility resides with the DoD Component OTA, including planning, gaining DOT&E plan approval, execution, and reporting.
  • Prototype testing should be emphasized early in the acquisition process and during EOAs to identify technology risks and provide operational user impacts. OTAs should maximize their involvement in early, pre-acquisition activities. T&E provides early operational insights during the developmental process. This early operational insight should reduce the scope of the integrated and dedicated OT&E, thereby contributing to reduced acquisition cycle times and improved performance.
  • OT&E planning should consider appropriate use of accredited M&S to support DT&E, OT&E, and LFT&E, coordinated through the T&E WIPT. Test planners should collaborate early with the PM's M&S proponent on the planned use of M&S to support or supplement test planning or to analyze test results. Where feasible, consider the use or development of M&S that encompasses the needs of each phase of T&E. Test planners must coordinate with the M&S proponent/developer/operator to establish the acceptability criteria required to allow VV&A of proposed M&S. The PM's M&S proponent has responsibility for ensuring the conduct of V&V in a manner that supports accreditation of M&S for each intended use. Whenever possible, an OA should draw upon test results with the actual system, subsystem, or key components thereof, or with operationally meaningful surrogates. When a PM cannot conduct actual system testing to support an OA, such assessments may utilize computer modeling, hardware-in-the-loop simulations (preferably with real operators in the loop), or an analysis of information contained in key program documents. However, the PM must ensure they receive a risk assessment when system testing cannot support an OA. The TEMP explains the extent of M&S supporting OT&E, whether to develop M&S, the identification of resources, and a cost/benefit analysis. Naval vessels, the major systems integral to ship construction, and military satellite programs typically have development and construction phases extending over long periods of time and involve small procurement quantities. To facilitate evaluations and assessments of system performance (operational effectiveness, operational suitability, and mission capability), the PM should ensure the independent OTA is involved in monitoring or participating in all relevant activity, so that any relevant results can be used to complete operational assessments (OAs). The OTA should determine the inclusion/exclusion of test data for use during OAs and determine the requirement for any additional operational testing needed for evaluation of operational effectiveness, operational suitability, and mission capability.
  • OT&E uses threat or threat representative forces, targets, and threat countermeasures, validated by the DIA or the DoD Component intelligence agency, as appropriate, and approved by DOT&E during the operational test plan approval process. DOT&E oversees threat target, threat simulator, and threat simulation acquisitions and validation to meet OT&E and LFT&E needs.
  • PMs and OTAs assess the reliability growth required for the system to achieve its reliability threshold during IOT&E and report the results of that assessment to the MDA at Milestone C.
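The reliability-growth assessment in the last bullet is often projected with a power-law growth model; the sketch below uses the Duane model as one common illustration, with entirely hypothetical numbers (the test hours, MTBF values, growth rate, and threshold are invented for the example, not drawn from any program or DoD source).

```python
# Illustrative Duane reliability-growth projection; all inputs are hypothetical.
def projected_cumulative_mtbf(t_hours, t0_hours, mtbf0_hours, alpha):
    """Duane model: cumulative MTBF grows as a power law of accumulated
    test time. alpha is the growth rate estimated from early DT&E data."""
    return mtbf0_hours * (t_hours / t0_hours) ** alpha

# Hypothetical inputs: 100 h into DT&E the observed cumulative MTBF is 40 h,
# the estimated growth rate is 0.3, and 2,000 test hours will accrue by IOT&E.
mtbf_at_iotae = projected_cumulative_mtbf(2000, 100, 40, 0.3)
threshold = 90  # hypothetical reliability threshold (hours)

print(round(mtbf_at_iotae, 1))        # projected cumulative MTBF at IOT&E
print(mtbf_at_iotae >= threshold)     # does the projection meet the threshold?
```

A projection like this gives the PM and OTA a quantitative basis for the Milestone C report: if the projected MTBF falls short of the threshold, the growth curve shows how much additional test time (or design improvement) would be needed before IOT&E.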

9.5.3. Early Involvement

T&E early involvement advises program offices on the testability of requirements, the scoping of the T&E program and resources for inclusion in the technology and acquisition strategies, contractual requirements, and other upfront actions that help the acquisition program succeed. This requires the active engagement of skilled T&E personnel in the requirements and acquisition processes to get the "up-front" work right, particularly in terms of definitional precision in describing the operational context, mission and system measures, the integration of DT and OT, and the construct for translating performance results into mission effectiveness terms. Developing a framework to accomplish those objectives enhances the efficiency and effectiveness of T&E programs and results in less conflict during T&E planning and execution.

An integral element of the Defense Acquisition System (DoDI 5000.02), T&E has a role across the entire lifecycle as depicted in Figure 9.5.3.F1. The Integrated Defense Acquisition, Technology, and Logistics Life Cycle Management System Chart (v5.3.4, 15 Jun 2009) outlines the key activities in the systems acquisition processes that must work in concert to deliver the capabilities required by the warfighters: the requirements process (JCIDS); the acquisition process (Defense Acquisition System); and program and budget development (the Planning, Programming, Budgeting, and Execution (PPBE) process).

Figure 9.5.3.F1: Key T&E Processes across the Lifecycle – T&E Perspective

[Image: improving milestone process effectiveness]

Key sources of T&E information, used during the formulation of a Materiel Solution, include the capabilities-based assessment (CBA), Analysis of Alternatives (AOA), JCIDS documents, etc. Items of particular interest to the T&E community include:

  • Mission description, scenarios, Concept of Operations (CONOPS), performance attributes and effectiveness metrics, targets and threats, operational environments, etc.
  • Mission to task decomposition and scenario-based task performance standards.
  • Task to system/sub-system associations and functionality.
  • Alignment of mission Measures of Effectiveness (MOEs) with system performance attributes and measures.

The requirements process defines and subsequently refines a program’s operational capability requirements (system attributes) and operational environments (mission attributes) throughout the development process in the CBA, Initial Capabilities Document (ICD), CDD, and CPD.

The pedigree of the operational context across the lifecycle is critical to the developers, testers, and representatives of the COCOM Area of Responsibility (AOR) for operational employment: the operational context in which the system is designed should remain the same as the operational context in which it is evaluated. If the operational context changes over the course of development, those changes should be documented in both the AoA and JCIDS updates.

9.5.3.1. Defining Mission Measures: Early Involvement – JCIDS (Measures of Effectiveness (MOE) and Measures of Performance (MOP))

JCIDS processes are currently undergoing a significant revision, with the new policy expected in late FY 2011. The current JCIDS process has evolved from a joint mission-based process, focused on evaluating MOEs and MOPs in a mission context to deliver a capability, to an operational environments-based process focused on evaluating system performance attributes to deliver a required capability, as seen in the excerpt from the current JCIDS policy below:

  • The primary objective of JCIDS is to ensure the identification of the capabilities required by the joint warfighter, with their associated operational performance criteria, in order to successfully execute the missions assigned.
  • The JCIDS process supports the acquisition process by identifying and assessing capability needs and associated performance criteria used as a basis for acquiring the right capabilities, including the right systems.
  • The primary objective of the CDD is to specify the operational technical performance attributes of the system delivering the capability to fill the gaps identified in the ICD.
  • The primary objective of the CPD is to describe the actual performance of the system delivering the required capability.
  • If the system does not meet all of the threshold levels for the KPPs, the Joint Requirements Oversight Council (JROC) will assess whether or not the system remains operationally acceptable.
  • The CDD and CPD identify the attributes contributing most significantly to the desired operational capability in threshold-objective format. Whenever possible, state attributes in terms reflecting the range of military operations the capabilities must support and the joint operational environment intended for the system (family of systems (FoS) or SoS).
  • Other compatibility and interoperability attributes (e.g., databases, fuel, transportability, and ammunition) might need identification to ensure a capability’s effectiveness.

The CJCSI 3170.01H “Joint Capabilities Integration and Development System,” dated January 10, 2012 complements the JCIDS instruction. Additionally:

  • DOT&E’s role with respect to the ICD is included in the JCIDS Manual: “DOT&E will advise on the testability of chosen capability attributes and metrics so that the system’s performance measured in operational testing can be linked to the CBA.”
  • The JCIDS manual further states “The ICD will include a description of the capability, capability gap, threat, expected joint operational environments, shortcomings of existing systems, the capability attributes and metrics, joint Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel and Facilities (DOTMLPF), and policy impact and constraints for the capabilities.”

Per DoDD 5141.02, "Director of Operational Test and Evaluation (DOT&E)," DOT&E will:

  • Assist the CJCS in efforts to ensure the specification of expected joint operational mission environment, mission-level MOE, and KPPs in JCIDS documents in terms verifiable through testing or analysis.

Note: the JCIDS policy no longer requires or discusses MOPs and MOEs; however, JCIDS derives and documents performance attributes from the analyses that supported the CBA and the AoA, and MOPs and MOEs remain essential metrics for evaluating those performance attributes.

  • Measure of Effectiveness (MOE) – The data used to measure the military effect (mission accomplishment) that comes from the use of the system in its expected environment. That environment includes the system under test and all interrelated systems, that is, the planned or expected environment in terms of weapons, sensors, command and control, and platforms, as appropriate, needed to accomplish an end-to-end mission in combat.
  • Measures of Performance (MOPs) – System-particular performance parameters such as speed, payload, range, time-on-station, frequency, or other distinctly quantifiable performance features. Several MOPs may be related to the achievement of a particular MOE.

Further, the OTAs and DOT&E have a requirement to address effectiveness in their evaluations. In the memorandum “Reporting of Operational Test and Evaluation (OT&E) Results,” dated January 6, 2010, DOT&E states:

  • The data used for evaluation are appropriately called measures of effectiveness, because they measure the military effect (mission accomplishment) that comes from the use of the system in its expected environment. This statement of policy precludes measuring operational effectiveness and suitability solely on the basis of system-particular performance parameters.
  • “. . . "performance attributes” (sic) are often what the program manager is required to deliver….they are not the military effect or measure of operational effectiveness required for achieving the primary purpose” of a mission capability.
  • “It is therefore unacceptable in evaluating and reporting operational effectiveness and suitability to parse requirements and narrow the definition of mission accomplishment so that MOP are confused with MOE.”

9.5.3.2. Defining the Operational Context: Early Involvement - CBA: Operational Context (Scenarios, Missions and Objectives, Environments, etc.)

The JCIDS process begins with the CBA, which provides the basis for JCIDS to articulate the system performance attributes required by the warfighters. Any DoD organization may initiate a CBA. See the "Manual for the Operation of the Joint Capabilities Integration and Development System," dated July 31, 2009, for CBA information.

9.5.3.3. Analysis of Alternatives

For potential and designated ACAT I and IA programs, the Director, Cost Assessment and Program Evaluation (CAPE) should draft, for MDA approval, AoA study guidance for review at the Materiel Development Decision. Following approval, the guidance should be issued to the DoD Component designated by the MDA, or for ACAT IA programs, to the office of the Principal Staff Assistant responsible for the mission area. According to DoDI 5000.02, Enclosure 7, dated December 8, 2008, the DoD Component or the Principal Staff Assistant shall designate responsibility for completion of the study plan and the AoA; neither of which may be assigned to the PM. The study plan shall be coordinated with the MDA and approved by the CAPE prior to the start of the AoA. The final AoA shall be provided to the CAPE not later than 60 days prior to the DAB or Information Technology Acquisition Board milestone reviews. The CAPE shall evaluate the AoA and provide an assessment to the Head of the DoD Component or Principal Staff Assistant and to the MDA. In this evaluation, the CAPE, in collaboration with the OSD and Joint Staff, shall assess the extent to which the AoA:

a) Illuminated capability advantages and disadvantages.

b) Considered joint operational plans.

c) Examined sufficient feasible alternatives.

d) Discussed key assumptions and variables and sensitivity to changes in these.

e) Calculated costs.

f) Assessed the following:

  1. Technology risk and maturity.
  2. Alternative ways to improve the energy efficiency of DoD tactical systems with end items that create a demand for energy, consistent with mission requirements and cost effectiveness.
  3. Appropriate system training to ensure that effective and efficient training is provided with the system.

9.5.3.4. Defining Critical Technical Parameters (CTPs)

T&E programs will have hundreds or thousands of technical parameters that need to be captured to support data analysis and evaluations; however, not every technical parameter is a CTP. CTPs measure critical system characteristics that, when achieved, enable the attainment of desired operational performance capabilities in the mission context. CTPs do not simply restate the KPPs and/or KSAs. Each CTP must have a direct or significant indirect correlation to a KPP and/or KSA, measuring a physical characteristic essential to evaluation of that KPP or KSA. Per the 2011 JCIDS Manual: "The Director, Operational Test & Evaluation (DOT&E) will advise on the testability of chosen capability attributes and metrics so that the system's performance measured in operational testing can be linked to the CBA. The ICD will include a description of the capability, capability gap, threat, expected joint operational environments, shortcomings of existing systems, the capability attributes and metrics, joint DOTMLPF, and policy impact and constraints for the capabilities."

CTPs should focus on critical design features or risk areas (e.g., technical maturity, reliability, availability, and maintainability (RAM) issues, physical characteristics or measures) that if not achieved or resolved during development will preclude delivery of required operational capabilities. CTPs will likely evolve/change as the system matures during EMD. Resolve existing CTPs and identify new CTPs as the system progresses during development. Identify any CTPs not resolved prior to entering LRIP and establish an action plan to resolve them prior to the FRP Decision Review.

The Program T&E Lead has responsibility for coordinating the CTP process with the Program's Chief or Lead Systems Engineer, with assistance from the appropriate test organization subject matter experts and the lead OTA. The evaluation of CTPs is important in projecting the maturity of the system and in informing the PM as to whether the system is on (or behind) the planned development schedule and will likely (or not likely) achieve an operational capability, but it is not sufficient for projecting mission capability. Projecting mission capability requires an evaluation of the interoperability of systems and sub-systems in the mission context, when used by a typical operator. CTPs associated with the systems/sub-systems provide a basis for selecting entry or exit criteria to be demonstrated for the major developmental test phases.
