9.5.5. Test and Evaluation Master Plan

The TEMP serves as the overarching document for managing a T&E program. Based on the AT&L memo “Improving Milestone Process Effectiveness,” dated June 23, 2011, PMs should develop a draft TEMP for the pre-EMD review and a formal TEMP for Milestone B. PMs must submit an updated TEMP prior to each subsequent Defense Acquisition System milestone. The TEMP should include sufficient detail to support development of other test-related documents.

PMs develop a TEMP and subsequent updates meeting the following objectives:

  • Accomplish all certification requirements necessary for the conduct of T&E.
  • Provide an event-driven T&E schedule.
  • Ensure the T&E strategy aligns with and supports the approved acquisition strategy to provide adequate, risk-reducing T&E information to support decisions.
  • Integrate DT&E and OT&E objectives into an efficient test continuum for use in the TEMP to maximize efficiencies during test execution, and increase the test sample size while minimizing test resource requirements.
  • Identify and describe design, technical, integration, operational, safety, and security risks. The T&E strategy should flow naturally from the user mission requirements and concept of operations (CONOPS) and from the systems engineering processes of requirements analysis, functional allocation, and design synthesis.
  • Serve as the basis for T&E budgetary estimates identified in the Cost Analysis Requirements Description (required by DoD 5000.4-M “Cost Analysis Guidance and Procedures,” dated December 11, 1992).
  • Identify test strategies to efficiently identify technology limitations and capabilities of alternative concepts to support early cost performance tradeoff decisions.
  • Provide data and analytic support to certify the system ready for IOT&E. The DT&E report discussed below provides this data.
  • Assess technical progress and maturity against critical technical parameters (CTPs), key system attributes (KSAs), KPPs, and critical operational issues (COIs) as documented in the TEMP and test plans. CTPs can be used to assess completion of a major phase of developmental testing, such as ground or flight testing, and to determine readiness to enter the next phase of testing, whether developmental or operational.
  • To mitigate technical risk, the required assessment of technical progress should also include desired reliability, maintainability, and supportability capabilities, software functionality, and technical and manufacturing risks.
  • Include reliability growth curves at Pre-EMD and report progress to plan at future updates.
  • Include adequate measures to support the program’s reliability growth plan and the requirements for a RAM Cost Rationale Report, defined in the DoD RAM Cost Rationale Manual, for MS B and C. For more information, read DTM 11-003, “Reliability Analysis, Planning, Tracking, and Reporting,” dated December 2, 2011.
  • Some technical parameters can be expressed either as a rate of change or as a specific threshold value when assessing the level of success; for example, the rate at which system accuracy or reliability is increasing, or simply the success rate with which the system meets a certain accuracy or reliability threshold (a notional calculation is sketched after this list). The PM may use a combination of both to tailor the test strategy to support decision requirements.
  • Utilize M&S and ground test activities, to include integration laboratories, hardware-in-the-loop simulation, and installed-system test facilities prior to conducting full-up, system-level and end-to-end testing in open-air realistic environments. Programs normally limit DT&E of military medical devices to airworthiness certification and environmental testing to ensure the device does not fail due to the austere or harsh environments imposed by the operational environment or interfere with the aircraft’s operational environment. This can often be integrated into, or performed alongside, the requisite OT.
  • Perform V&V in the use of M&S and the systems engineering process.
  • Stress the system under test to at least the limits of the Operational Mode Summary/Mission Profile, and for some systems, beyond the normal operating limits to ensure the robustness of the design. This testing will reduce risk for performance in the expected operational environments.
  • Provide safety releases (to include formal Environment, Safety, and Occupational Health (ESOH) risk acceptance), in concert with the user and the T&E community, to the developmental and operational testers prior to any test using personnel.
  • Demonstrate the maturity of the production process through Production Qualification Testing (PQT) of low-rate initial production (LRIP) assets prior to full-rate production (FRP). The focus of this testing is on the contractor's ability to produce a quality product, since the design testing should have been completed.
  • Provide data and analytic support to the Milestone C decision to enter LRIP.
  • For weapons systems, use the System Threat Assessment (STA) or System Threat Assessment Report (STAR) as a basis for scoping a realistic test environment.
  • For IT & NSS, use DIA, the North American Industry Classification System (NAICS), or another applicable standard as a basis for scoping a realistic test environment.
  • Conduct Information Assurance (IA) testing on any system that collects, stores, transmits, or processes unclassified or classified information. The extent of IA testing depends upon the assigned Mission Assurance Category and Confidentiality Level. DoDI 8500.2, "Information Assurance (IA) Implementation," dated February 6, 2003, mandates specific IA Control Measures a system should implement as part of the development process.
  • In the case of IT systems, including NSS, support the DoD Information Assurance Certification and Accreditation Process and Joint Interoperability Certification process.
  • Discover, evaluate, and mitigate potentially adverse electromagnetic environmental effects (E3).
  • Support joint interoperability assessments required to certify system-of-systems interoperability.
  • For business systems, the TEMP identifies certification requirements needed to support the compliance factors established by the Office of the Under Secretary of Defense (Comptroller) (USD(C)) for financial management, enterprise resource planning, and mixed financial management systems.
  • Demonstrate performance against threats and their countermeasures as identified in the Defense Intelligence Agency (DIA) or component-validated threat document. Any impact on technical performance by these threats should be identified early in technical testing, rather than in operational testing where their presence might have serious repercussions.
  • Assess SoS Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) prior to OT&E to ensure interoperability under loaded conditions representative of stressed OT&E scenarios.
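
To illustrate the two ways of scoring a technical parameter noted in the list above (a success rate against a threshold versus a rate of change across test phases), the following is a brief, purely notional Python sketch; all values are made up and nothing in it is prescribed by the TEMP or this guidebook.

```python
# Notional illustration only: a technical parameter scored two ways.
# (1) as a simple success rate against a threshold, and
# (2) as a rate of change (improvement) across successive test phases.
hits, trials = 43, 50
success_rate = hits / trials                  # 0.86 demonstrated success rate
meets_threshold = success_rate >= 0.80        # compared against a notional 0.80 threshold

reliability_by_phase = [120.0, 150.0, 195.0]  # notional demonstrated MTBF (hours) per DT phase
growth_rates = [later / earlier - 1.0
                for earlier, later in zip(reliability_by_phase, reliability_by_phase[1:])]

print(success_rate, meets_threshold)          # 0.86 True
print([round(r, 2) for r in growth_rates])    # [0.25, 0.3]
```

Either view, or a combination of the two, can be reported against the decision points the parameter supports.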

9.5.5.1. Strategy for Test and Evaluation

PMs should structure a T&E program strategy to provide knowledge that reduces risk in acquisition and operational decisions. That knowledge is developed through evaluation of all available and relevant data and information from contractor and government sources. The evaluation should focus on providing essential information to decision makers, specifically with regard to attainment of technical performance attributes and an assessment of the system’s mission-level operational effectiveness, operational suitability, and survivability or operational security. The evaluation framework supports estimates for test resource requirements and provides a basis for determining test program adequacy and assessing risk margins within the T&E plans and events.

The PM should structure the strategy to provide essential information to decision-makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for intended use. The conduct of T&E, integrated with M&S, should facilitate learning, assess technology maturity and interoperability, facilitate integration into fielded forces, and confirm performance against documented capability needs and adversary capabilities as described in the system threat assessment.

In other words, the evaluation should describe the links between key program and user decisions and the developmental and operational tests that require evaluation for those decisions. It correlates the knowledge required concerning KPPs/KSAs, CTPs, and key test measures (i.e., Measures of Effectiveness (MOEs) and Measures of Suitability (MOSs)) with the planned test methods and the key test resource, facility, or infrastructure needs. The framework discussion should also identify major risks or limitations to completing the evaluations. The TEMP should clearly reflect what key questions the evaluations will answer for the program and user, and at what key decision points. This layout and discussion provides the rationale for the major test objectives and the resulting major resource requirements shown in the “Resources” portion of the TEMP.

The evaluation should also discuss the intended maturation of key technologies within the overall system, the evaluation of capabilities in a mission context, and evaluations needed to support required certifications or to comply with statute(s). Separate evaluation plans should provide details for the PM’s overall evaluation strategy (e.g., System Evaluation Plan (Army), Operational Test and Evaluation plan, LFT&E plan).

The DT&E section describes the evaluation of the maturation of a system or capability and should address the overall approach to evaluating the development of system capabilities in operationally relevant environments. The approach should cover CTPs, key system risks, and any certifications required (weapon safety, interoperability, etc.). The evaluation of technology maturity should support the TDS.

The evaluation of system maturity should support the acquisition strategy. The amount of development in the acquisition strategy will drive the extent of the discussion. For example, for a non-developmental item (i.e., Commercial-Off-The-Shelf (COTS) or Government-Off-The-Shelf (GOTS)), little if any maturation of the system may be required. A new technology effort that pushes the state of the art, or seeks capabilities significantly improved over what is currently achieved in the operational environment, may require a significant amount of effort to mature or develop the system or its support system, and therefore more decisions requiring knowledge from evaluations. In assessing the level of evaluations necessary, give equal consideration to the maturity of the technologies used, the degree to which the system design (hardware and software) has stabilized, and the operational environment for the employment of the system. Using COTS items in a new environment can result in significant capability changes, potentially eliminating a true COTS item from a system maturity perspective.

The system maturation discussions should also cover evaluations for production qualification, production acceptance, and sustainment of the system. Defense Contract Management Agency (DCMA) representatives and procedures may cover the production evaluations at the contractor’s manufacturing plant, or the T&E effort may need to establish and mature the processes. Therefore, the appropriate level of evaluation could range from none, for normal DCMA practices, to minimal for first article qualification checks, to more extensive evaluations based upon PQT results for new or unique manufacturing techniques, especially with new technologies. The sustainment evaluation discussion should address key risks or issues in sustaining or assessing the system capability in operational use, as well as the overall T&E logistics effort, maintenance (both corrective and preventive), servicing, calibration, and support aspects.

The discussion of mission context evaluations addresses the approach to evaluate operational effectiveness and operational suitability of the system for use by typical users in the intended mission environments. This should also include joint operations issues. These evaluations provide a prediction of how well the system will perform in field use as well as in IOT&E, and may reduce the scope of the IOT&E, but will not replace or eliminate the need for IOT&E.

COIs are also relevant to this discussion. COIs are the key operational effectiveness or operational suitability issues that must be examined in OT&E to determine the system’s capability to perform its mission. COIs must be relevant to the required capabilities, be of key importance to the system’s operational effectiveness, operational suitability, and survivability, and represent a significant risk if not satisfactorily resolved.

The strategy for T&E must include those evaluations required by statute, specifically IOT&E, survivability or operational security, and lethality. The IOT&E discussion should describe the approach to conducting the independent evaluation of the system, including official resolution of COIs. The discussion of the approach to evaluating the survivability or operational security/lethality of the system should show how it will influence the development and maturation of the system design. The discussion should include a description of the overall live fire evaluation strategy for the system (as defined in section 2366 of title 10 U.S.C.), critical live fire evaluation issues, and any major evaluation limitations.

9.5.5.2. Evaluation Framework

The Evaluation Framework Matrix describes, in table format, the most important links and relationships between the types of testing conducted to support the entire acquisition program. It also shows the linkages between the KPPs/KSAs, CTPs, key test measures (i.e., MOEs, MOSs), planned test methods, key test resources (i.e., facility and infrastructure), and the decisions supported. Table 9.5.5.2.T1., “Top-Level Evaluation Framework Matrix,” taken from the TEMP format annex and shown below, depicts a notional Evaluation Framework Matrix. Programs may also use equivalent Service-specific formats identifying the same relationships and information. Note: the Evaluation Framework Matrix provides a tabular summary of the evaluation strategy.

Table 9.5.5.2.T1. Top-Level Evaluation Framework Matrix

Note: The first four columns (Key Reqs, COIs, Key MOEs/MOSs, and CTPs & Threshold) comprise the Key Requirements and T&E Measures.

| Key Reqs | COIs | Key MOEs/MOSs | CTPs & Threshold | Test Methodologies/Key Resources (M&S, SIL, MF, ISTF, HITL, OAR) | Decision Supported |
|----------|------|---------------|------------------|------------------------------------------------------------------|--------------------|
| KPP #1 | COI #1. Is the XXX effective for… | MOE 1.1. | Engine thrust | Chamber measurement; observation of performance profiles (OAR) | PDR, CDR |
| | COI #2. Is the XXX suitable for… | | Data upload time | Component-level replication; stress and spike testing in SIL | PDR, CDR |
| | COI #3. Can the XXX be… | MOS 2.1. | | | MS-C, FRP |
| | | MOE 1.3. | | | Post-CDR, FRP |
| | | MOE 1.4. | Reliability based on growth curve | Component-level stress testing; sample performance on growth curve; sample performance with M&S augmentation | PDR, CDR, MS-C |
| KPP #2 | | MOS 2.4. | Data link | | MS-C, SRR |
| KPP #3 | COI #4. Is training… | MOE 1.2. | | Observation and survey | MS-C, FRP |
| KSA #3.a | COI #5. Documentation | MOS 2.5. | | | MS-C, FRP |

The Evaluation Framework Matrix is a key tool used to capture all major parts of a complete T&E program, identify gaps in coverage, and ensure more efficient integrated testing. Programs must include it in Part III of the TEMP and base it on the strategy for T&E (i.e., the evaluation strategy) developed at Milestone A. The Evaluation Framework Matrix should succinctly enumerate the top-level key values and information for all types of T&E. Updates should occur as the system matures and as the source documents (e.g., CDD/CPD, AS, STAR, SEP, ISP) are updated. Include demonstrated values for measures and parameters as the acquisition program advances from milestone to milestone and as the TEMP is updated.

Three major sections comprise the Evaluation Framework Matrix: Key Requirements and T&E Measures; Test Methodologies/Key Resources; and Decisions Supported. When filled in, readers can scan the matrix horizontally and see all linkages from the beginning of a program (i.e., from the requirement document) to the decision supported. Each requirement should be associated with at least one T&E issue and measure. However, T&E measures can exist without an associated key requirement or COI/COI Criteria (COIC). Hence, some cells in Table 9.5.5.2.T1. may be void.

Key Requirements and T&E Measures – These include KPPs and KSAs and the top-level T&E issues and measures for evaluation. The top-level T&E issues would typically include COIs and COIC, CTPs, and key MOEs/MOSs. This should also include SoS issues. Each measure should be associated with one or more key requirements. However, there could be T&E measures without an associated key requirement or COI/COIC. Hence, some cells in Table 9.5.5.2.T1. of the TEMP may be void. A simple test to determine if this section of the matrix is minimally adequate is to confirm that each decision supported has at least one T&E measure associated with it, and each key requirement also has at least one T&E measure associated with it. Outside of that, only include the T&E issues and measures that drive size or scope of the T&E program.
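
The “simple test” described above can be illustrated with a short script. This is a minimal sketch that assumes a simple in-memory representation of the matrix; the class and function names are hypothetical and are not part of any DoD tool or prescribed format.

```python
# Minimal sketch of the matrix adequacy check described above (illustrative names only).
from dataclasses import dataclass, field
from typing import List

@dataclass
class FrameworkRow:
    key_requirement: str = ""                                      # e.g., "KPP #1" (may be empty)
    coi: str = ""                                                  # e.g., "COI #1" (may be empty)
    measures: List[str] = field(default_factory=list)              # MOEs, MOSs, CTPs
    methodologies: List[str] = field(default_factory=list)         # M&S, SIL, OAR, ...
    decisions_supported: List[str] = field(default_factory=list)   # PDR, CDR, MS-C, ...

def minimally_adequate(matrix: List[FrameworkRow]) -> List[str]:
    """Return gaps: key requirements or supported decisions with no associated T&E measure."""
    gaps = []
    for row in matrix:
        if row.key_requirement and not row.measures:
            gaps.append(f"{row.key_requirement} has no associated T&E measure")
    decisions = {d for row in matrix for d in row.decisions_supported}
    for decision in decisions:
        if not any(row.measures for row in matrix if decision in row.decisions_supported):
            gaps.append(f"Decision {decision} has no supporting T&E measure")
    return gaps

# Notional rows loosely based on Table 9.5.5.2.T1.
matrix = [
    FrameworkRow("KPP #1", "COI #1", ["MOE 1.1", "Engine thrust CTP"],
                 ["Chamber measurement", "OAR"], ["PDR", "CDR"]),
    FrameworkRow("KPP #2", "", [], [], ["MS-C", "SRR"]),   # deliberate gap: no measure
]
print(minimally_adequate(matrix))
```

Run against these notional rows, the check flags KPP #2 and the MS-C and SRR decisions as lacking any supporting T&E measure, which is the kind of hole the matrix review is intended to expose.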

Test Methodologies/Key Resources – These identify the test methodologies or key resources necessary to generate data for evaluations to support decisions. The content of this column should indicate the key methodologies or significant resources required. Test methodology refers to a high-level description of the methods used to obtain the data. For example, modeling and simulation, a system integration lab, or an open-air range each represents a different methodology for obtaining test data. Where multiple methodologies are acceptable, show the preferred methodology. Short notes or acronyms should be used to identify the methodology. Models or simulations should be identified by their specific name or acronym.

Decisions Supported – These are the major design, developmental, manufacturing, programmatic, acquisition, or employment decisions driving the need for knowledge to be obtained through T&E. These decisions include acquisition milestones, design reviews, certifications, safety releases, production acceptance, and operational employment/deployment. The operational employment/deployment decisions include those made by operators and maintainers that drive the need for validated operating and maintenance manuals. The Decisions Supported column should not contain each decision an operator or maintainer would make, but only the overall level of knowledge needed for operating or maintenance data or instructions, or those decisions that steer significant or top-level choices. The key determinant for what to include in this section is whether the decision supported (or knowledge requirement) drives trade space for performance, cost, or schedule, or the size or scope of the T&E program; only such decisions should be included.

If portions of any T&E activity are missing, those become immediately evident. For example, if a KPP for reliability, availability, and maintainability (RAM) is listed, then there must be a supporting COI (or criterion in the set of COIC), along with CTPs and MOSs, to show that RAM will be fully evaluated in DT&E and OT&E. Specifically in the case of RAM measures, many acquisition programs included little to no RAM testing in DT&E and subsequently failed Suitability in OT&E (i.e., were rated "Not Suitable" by DOT&E). Had the TEMPs for those programs contained a full Evaluation Framework Matrix, the weak or missing RAM areas may have been identified early and properly tested before systems reached OT&E. Increasing the visibility of all key measures will help ensure these areas are developed and properly tested in DT&E and are ready for OT&E.

The Evaluation Framework Matrix also aids integrated testing and systems engineering by providing a broad outline of the linkages and corresponding areas for each kind of T&E activity. Mutual support between tests can be planned based on these linkages. For example, DT&E can augment the high-visibility areas in OT&E, and OT&E can "right-size" its T&E concept based on what it can use from DT&E. More synergy is possible where DT and OT measures are the same or similar, or where the same T&E resources (test articles and/or facilities) are used. Data sharing protocols can be developed early to aid test planning. DoD Information Assurance Certification and Accreditation Process (DIACAP) Certification and Accreditation (C&A) requirements can be folded in early. Redundancy and gaps can be spotted and eliminated. Greater visibility and transparency between T&E activities will generate countless ways to enhance integration. The discussion of the evaluation strategy can fill in all the details.

Table 9.5.5.2.T2. provides key inputs within the TEMP.

Table 9.5.5.2.T2 – Key Inputs within the TEMP
(The table applies to both the Milestone B TEMP, updated from the MS A TEMP when one was developed, and the Milestone C TEMP, updated from the MS B TEMP.)

Part I, Introduction

  • Include Purpose
  • Include Mission Description
  • Include System Description
  • Include System Threat Assessment
  • Include Program Background
  • Include Key Capabilities / SE Requirements

Part II, Management & Schedule

  • Include T&E Management / Organizational Construct
  • Include Common T&E Database Requirements (for integrated testing)
  • Include Deficiency Reporting
  • Include TEMP Update
  • Include the Integrated Test Program Schedule within the TEMP, updated prior to each MS

Part III, T&E Strategy

  • Evaluation Framework Matrix (cross-referenced with COIs (or COIC), KPPs, CTPs, KSAs, MOPs, MOEs, and MOSs)
  • Describe planned DT&E, OT&E, and LFT&E in detail. Include an overview and use of integrated test (CT, DT&E, and OT&E) and list those events requiring stand-alone (or dedicated) Government DT&E and OT&E. Delineate test limitations (annotated by DT&E, LFT&E, or OT&E).
  • A list of supporting interfaces, consistent with the ISP/TISP. An SV-5b should be included, with each interface cross-referenced to any planned EMD phase T&E or C&A activities utilizing that interface.
  • Provide for operational evaluation of mission-level interoperability across key interfaces.
  • Plan for the conduct of dedicated Government DT&E or integrated test (led by Government personnel) to provide confidence that the system design solution is on track to satisfy the desired capabilities.
  • A listing of all test events within the dedicated IOT&E
  • Identify the lead Government DT&E organization.
  • Plan for one full-up, system-level Government DT&E event and at least one OA with intended operational users.
  • Reliability Growth Curve(s) (RGCs) reflecting the reliability growth plans at the appropriate level of analysis for the program (updated RGCs at MS C)
  • Listing of all commercial items and NDIs
  • Provide a tabulation of factors
  • Determination of critical interfaces and information security
  • The TEMP should describe the T&E program in sufficient detail for decision makers to determine whether the planned activities are adequate to achieve the T&E objectives for the program.
  • Identify each test event as contractor or Government DT&E
  • Identify M&S to be used and the VV&A process; annotate supporting usage (i.e., DT&E or OT&E)
  • T&E support of the Reliability Growth Plan
  • Plan for data collection
  • The TEMP should identify entrance and exit criteria and their associated test events or test periods.
  • The TEMP should consider the potential impacts on the environment and on personnel.

Part IV, Resource Summary

  • The TEMP should describe the resources required in sufficient detail, aligned with Part III of the TEMP.
  • Programs should maximize the use of DoD Government T&E capabilities and invest in Government T&E infrastructure unless an exception can be justified as cost-effective to the Government.

9.5.5.3. TEMP Format

TEST AND EVALUATION MASTER PLAN

FOR

PROGRAM TITLE/SYSTEM NAME

ACRONYM

ACAT Level

Program Elements

Xxxxx

************************************************************************

SUBMITTED BY

____________________________________________________ ____________

Program Manager DATE

CONCURRENCE

____________________________________________________ ____________

Program Executive Officer or Developing Agency DATE

(If not under the Program Executive Officer structure)

____________________________________________________ ____________

Operational Test Agency DATE

____________________________________________________ ____________

User's Representative DATE

DoD COMPONENT APPROVAL

____________________________________________________ ____________

DoD Component Test and Evaluation Director DATE

____________________________________________________ ____________

DoD Component Acquisition Executive (Acquisition Category I) DATE

Milestone Decision Authority (for less-than-Acquisition Category I)

Note: For Joint/Multi-Service or Agency Programs, each Service or Defense Agency should provide a signature page for parallel staffing through its CAE or Director, and a separate page should be provided for OSD Approval

************************************************************************

OSD APPROVAL

____________________________________________________ ____________

DASD(DT&E) DATE

____________________________________________________ ____________

D,OT&E DATE

TABLE OF CONTENTS

PART I – INTRODUCTION

1.1 PURPOSE
1.2 MISSION DESCRIPTION
1.3 SYSTEM DESCRIPTION
1.3.1 System Threat Assessment
1.3.2 Program Background
1.3.2.1 Previous Testing
1.3.3 Key Capabilities
1.3.3.1 Key Interfaces
1.3.3.2 Special test or certification requirements
1.3.3.3 Systems Engineering (SE) Requirements

PART II – TEST PROGRAM MANAGEMENT AND SCHEDULE

2.1 T&E MANAGEMENT
2.1.1 T&E Organizational Construct
2.2 Common T&E Database Requirements
2.3 DEFICIENCY REPORTING
2.4 TEMP UPDATES
2.5 INTEGRATED TEST PROGRAM SCHEDULE
Figure 2.1 – Integrated Test Program Schedule

PART III – TEST AND EVALUATION STRATEGY

3.1 T&E STRATEGY
3.2 EVALUATION FRAMEWORK
Figure 3.1 – Top-Level Evaluation Framework Matrix
3.3 Developmental Evaluation Approach
3.3.1 Mission-Oriented Approach
3.3.2 Developmental Test Objectives
3.3.3 Modeling and Simulation
3.3.4 Test Limitations
3.4 Live Fire Evaluation Approach
3.4.1 Live Fire Test Objectives
3.4.2 Modeling and Simulation
3.4.3 Test Limitations
3.5 Certification for IOT&E
3.5.1 Assessment of Operational Test Readiness
3.6 Operational Evaluation Approach
3.6.1 Operational Test Objectives
3.6.2 Modeling and Simulation
3.6.3 Test Limitations
3.7 OTHER CERTIFICATIONS
3.8 RELIABILITY GROWTH
3.9 FUTURE TEST AND EVALUATION

PART IV – RESOURCE SUMMARY

4.1 Introduction
4.1.1 Test Articles
4.1.2 Test Sites and Instrumentation
4.1.3 Test Support Equipment
4.1.4 Threat Representation
4.1.5 Test Targets and Expendables
4.1.6 Operational Force Test Support
4.1.7 Models, Simulations, and Test-beds
4.1.8 Joint Operational Test Environment
4.1.9 Special Requirements
4.2 Federal, State, Local Requirements
4.3 Manpower/Personnel Training
4.4 Test Funding Summary
Table 4.1 Resource Summary Matrix

APPENDIX A – BIBLIOGRAPHY

APPENDIX B – ACRONYMS

APPENDIX C – POINTS OF CONTACT

ADDITIONAL APPENDICES AS NEEDED

1. PART I – INTRODUCTION

1.1. Purpose.

  • State the purpose of the Test and Evaluation Master Plan (TEMP).
  • Identify if this is an initial or updated TEMP.
  • State the Milestone (or other) decision the TEMP supports.
  • Reference and provide hyperlinks to the documentation initiating the TEMP (i.e., Initial Capability Document (ICD), Capability Development Document (CDD), Capability Production Document (CPD), Acquisition Program Baseline (APB), Acquisition Strategy Report (ASR), Concept of Operations (CONOPS)).
  • State the Acquisition Category (ACAT) level, operating command(s), and whether the program is listed on the OSD T&E Oversight List (actual or projected).

1.2. Mission Description.

  • Briefly summarize the mission need described in the program capability requirements documents in terms of the capability it will provide to the Joint Forces Commander.
  • Describe the mission to be accomplished by a unit equipped with the system using all applicable CONOPS and Concepts of Employment.
  • Incorporate an OV-1 of the system showing the intended operational environment.
  • Also include the organization in which the system will be integrated, as well as significant points from the Life Cycle Sustainment Plan, the Information Support Plan, and the Program Protection Plan.
    • Provide links to each document referenced in the introduction.
  • For business systems, include a summary of the business case analysis for the program.

1.3. System Description.

  • Describe the system configuration.
  • Identify key features and subsystems, both hardware and software (such as architecture, system and user interfaces, security levels, and reserves) for the planned increments within the Future Years Defense Program (FYDP).

1.3.1. System Threat Assessment.

  • Succinctly summarize the threat environment (to include cyber-threats) in which the system will operate.
  • Reference the appropriate DIA or component-validated threat documents for the system.

1.3.2. Program Background.

  • Reference the Analysis of Alternatives (AoA), the APB and the materiel development decision to provide background information on the proposed system.
  • Briefly describe the overarching Acquisition Strategy (for space systems, the Integrated Program Summary (IPS)), and the Technology Development Strategy (TDS).
  • Address whether the system will be procured using an incremental development strategy or a single step to full capability.
  • If it is an evolutionary acquisition strategy, briefly discuss planned upgrades, additional features and expanded capabilities of follow-on increments.
    • The main focus must be on the current increment with brief descriptions of the previous and follow-on increments to establish continuity between known increments.

1.3.2.1. Previous Testing.

  • Discuss the results of any previous tests that apply to, or have an effect on, the test strategy.

1.3.3. Key Capabilities.

  • Identify the Key Performance Parameters (KPPs) and Key System Attributes (KSAs) for the system.
    • For each listed parameter, provide the threshold and objective values from the CDD/CPD and reference the paragraph.

1.3.3.1. Key Interfaces.

  • Identify interfaces with existing or planned systems’ architectures that are required for mission accomplishment.
  • Address integration and modifications needed for commercial items.
  • Include interoperability with existing and/or planned systems of other Department of Defense (DoD) Components, other Government agencies, or Allies.
  • Provide a diagram of the appropriate DoD Architectural Framework (DoDAF) system operational view from the CDD or CPD.

1.3.3.2. Special test or certification requirements.

  • Identify unique system characteristics or support concepts that will generate special test, analysis, and evaluation requirements, e.g.:
    • security test and evaluation and Information Assurance (IA) Certification and Accreditation (C&A);
    • post-deployment software support;
    • resistance to chemical, biological, nuclear, and radiological effects;
    • resistance to countermeasures;
    • resistance to reverse engineering/exploitation efforts (Anti-Tamper);
    • development of new threat simulations, simulators, or targets.

1.3.3.3. Systems Engineering (SE) Requirements.

  • Reference all SE-based information that will be used to provide additional system evaluation targets driving system development.
    • Examples could include hardware reliability growth and software maturity growth strategies.
    • The SEP should be referenced in this section and aligned to the TEMP with respect to SE Processes, methods, and tools identified for use during T&E.

2. PART II – TEST PROGRAM MANAGEMENT AND SCHEDULE

2.1 T&E Management.

  • Discuss the test and evaluation responsibilities of all participating organizations (such as developers, testers, evaluators, and users).
  • Describe the role of contractor testing in early system development.
  • Describe the role of government developmental testers to assess and evaluate system performance.
  • Describe the role of the Operational Test Agency (OTA) /operational testers to confirm operational effectiveness, operational suitability and survivability.

2.1.1. T&E Organizational Construct.

  • Identify the organizations or activities (such as the T&E Working-level Integrated Product Team (WIPT) or Service equivalent, LFT&E IPT, etc.) in the T&E management structure, to include sub-working groups such as modeling and simulation or reliability.
  • Provide sufficient information to adequately understand the functional relationships. Reference the T&E WIPT charter that includes specific responsibilities and deliverable items for detailed explanation of T&E management.
    • These items include TEMPs and Test Resource Plans (TRPs) that are produced collaboratively by member organizations.

2.2. Common T&E Database Requirements.

  • Describe the requirements for and methods of collecting, validating, and sharing data as it becomes available from the contractor, Developmental Test (DT), Operational Test (OT), and oversight organizations, as well as supporting related activities that contribute or use test data (e.g., information assurance C&A, interoperability certification, etc.).
  • Describe how the pedigree of the data will be established and maintained. The pedigree refers to understanding, for each piece of data, the configuration of the test asset and the actual test conditions under which the data were obtained (a notional record layout is sketched after this list).
  • State who will be responsible for maintaining this data.
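
As an illustration of the data pedigree concept described in the list above, the following is a minimal, notional sketch of a per-datum record. The field names are assumptions chosen for illustration; they are not prescribed by this guidebook or by DoD policy.

```python
# Notional data pedigree record: ties each data item to the test asset configuration
# and the test conditions under which it was collected (illustrative field names only).
from dataclasses import dataclass
from datetime import datetime
from typing import Dict

@dataclass(frozen=True)
class PedigreeRecord:
    data_id: str                      # identifier of the measurement or data item
    test_event: str                   # e.g., a DT&E phase, flight, or lab run (notional)
    asset_configuration: str          # hardware/software configuration of the test asset
    test_conditions: Dict[str, str]   # e.g., facility, environment, threat representation
    source_organization: str          # contractor, Government DT, OT, oversight, etc.
    collected_on: datetime            # when the data were obtained
    data_steward: str                 # organization responsible for maintaining the data

record = PedigreeRecord(
    data_id="ENG-THRUST-0042",
    test_event="Chamber measurement (notional)",
    asset_configuration="EMD prototype, software v0.9",
    test_conditions={"facility": "SIL", "profile": "stress"},
    source_organization="Contractor DT",
    collected_on=datetime(2012, 5, 1),
    data_steward="Lead Government DT&E organization",
)
```

Recording the configuration and conditions with each data item is what allows DT, OT, and oversight organizations to judge whether shared data can support their own evaluations.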

2.3. Deficiency Reporting.

  • Briefly describe the processes for documenting and tracking deficiencies identified during system development and testing.
  • Describe how the information is accessed and shared across the program.
  • The processes should address problems or deficiencies identified during both contractor and government test activities.
  • The processes should also include issues that have not been formally documented as a deficiency (e.g., watch items).

2.4. TEMP updates.

  • Reference instructions for complying with DoDI 5000.02 required updates, or identify exceptions to those procedures if determined necessary for more efficient administration of the document.
  • Provide guidelines for keeping TEMP information current between updates.
  • For a Joint or Multi-Service TEMP, identify references that will be followed or exceptions as necessary.

2.5. Integrated Test Program Schedule.

  • Display (see Figure 2.1) the overall time sequencing of the major acquisition phases and milestones (as necessary, use the NSS-03-01 time sequencing).
    • Include the test and evaluation major decision points, related activities, and planned cumulative funding expenditures by appropriation by year.
    • Include event dates such as
      • Major decision points as defined in DoD Instruction 5000.02, e.g., operational assessments,
      • Preliminary and critical design reviews,
      • Test article availability; software version releases;
      • Appropriate phases of DT&E; LFT&E; Joint Interoperability Test Command (JITC) interoperability testing and certification date to support the MS-C and Full-Rate Production (FRP) Decision Review (DR).
      • Include significant Information Assurance certification and accreditation event sequencing, such as Interim Authorization to Test (IATT), Interim Authorization to Operate (IATO) and Authorization to Operate (ATO).
      • Also include operational test and evaluation;
      • Low-Rate Initial Production (LRIP) deliveries;
      • Initial Operational Capability (IOC); Full Operational Capability (FOC);
      • Statutorily required reports such as the Live-Fire T&E Report and Beyond Low-Rate Initial Production (B-LRIP) Report.
    • Provide a single schedule for multi-DoD Component or Joint and Capstone TEMPs showing all related DoD Component system event dates.

Figure 9.5.5.3.F1. Integrated Test Program Schedule (Figure 2.1)

3. PART III – TEST AND EVALUATION STRATEGY

3.1 T&E Strategy.

  • Introduce the program T&E strategy by briefly describing how it supports the acquisition strategy as described in Section 1.3.2. This section should summarize an effective and efficient approach to the test program.
  • The developmental and operational test objectives are discussed separately below; however, this section must also address how the test objectives will be integrated to support the acquisition strategy by evaluating the capabilities to be delivered to the user without compromising the goals of each major test type.
  • Where possible, the discussions should focus on the testing for capabilities, and address testing of subsystems or components where they represent a significant risk to achieving a necessary capability.
  • As the system matures and production representative test articles are available, the strategy should address the conditions for integrating DT and OT tests.
  • Evaluations shall include a comparison with current mission capabilities using existing data, so that measurable improvements can be determined.
    • If such evaluation is considered costly relative to the benefits gained, the PM shall propose an alternative evaluation strategy.
    • Describe the strategy for achieving this comparison and for ensuring data are retained and managed for future comparison results of evolutionary increments or future replacement capabilities.
  • To present the program’s T&E strategy, briefly describe the relative emphasis on methodologies (e.g., Modeling and Simulation (M&S), Measurement Facility (MF), Systems Integration Laboratory (SIL), Hardware-In-the-Loop Test (HILT), Installed System Test Facility (ISTF), Open Air Range (OAR)).

3.2. Evaluation Framework.

  • Describe the overall evaluation approach focusing on key decisions in the system lifecycle and addressing key system risks, program unique Critical Operational Issues (COIs) or Critical Operational Issue Criteria (COIC), and Critical Technical Parameters (CTPs).
  • Specific areas of evaluation to address include the following:

(1) Development of the system and processes (include maturation of system design)

(2) System performance in the mission context

(3) OTA independent assessments and evaluations

(4) Survivability and/or lethality

(5) Comparison with existing capabilities, and

(6) Maturation of highest risk technologies

  • Describe any related systems that will be included as part of the evaluation approach for the system under test (e.g., data transfer, information exchange requirements, interoperability requirements, and documentation systems).
  • Also identify any configuration differences between the current system and the system to be fielded.
    • Include mission impacts of the differences and the extent of integration with other systems with which it must be interoperable or compatible.
  • Describe how the system will be evaluated and the sources of the data for that evaluation.
    • The discussion should address the key elements for the evaluations, including major risks or limitations for a complete evaluation of the increment undergoing testing.
    • The reader should be left with an understanding of the value-added of these evaluations in addressing both programmatic and warfighter decisions or concerns.
    • This discussion provides rationale for the major test objectives and the resulting major resource requirements shown in Part IV - Resources.
  • Include a Top-Level Evaluation Framework matrix that shows the correlation between the KPPs/KSAs, CTPs, key test measures (i.e., Measures of Effectiveness (MOEs) and Measures of Suitability (MOSs)), planned test methods, and key test resources, facility or infrastructure needs.
    • When structured this way, the matrix should describe the most important relationships between the types of testing that will be conducted to evaluate the Joint Capabilities Integration and Development System (JCIDS)-identified KPPs/KSAs, and the program’s CTPs.
    • Figure 3.1 shows how the Evaluation Framework could be organized. Equivalent Service-specific formats that identify the same relationships and information may also be used.
    • The matrix may be inserted in Part III if short (less than one page), or as an annex.
    • The evaluation framework matrix should mature as the system matures. Demonstrated values for measures should be included as the acquisition program advances from milestone to milestone and as the TEMP is updated.

The suggested content of the evaluation matrix includes the following:

  • Key requirements & T&E measures – These are the KPPs and KSAs and the top-level T&E issues and measures for evaluation. The top-level T&E issues would typically include COIs/Critical Operational Issues and Criteria (COICs), CTPs, and key MOEs/MOSs. System-of-Systems and technical review issues should also be included, either in the COI column or inserted as a new column. Each T&E issue and measure should be associated with one or more key requirements. However, there could be T&E measures without an associated key requirement or COI/COIC. Hence, some cells in figure 3.1 may be empty.
  • Overview of test methodologies and key resources – These identify test methodologies or key resources necessary to generate data for evaluating the COIs/COICs, key requirements, and T&E measures. The content of this column should indicate the methodologies/resources that will be required and short notes or pointers to indicate major T&E phases or resource names. M&S should be identified with the specific name or acronym.
  • Decisions Supported – These are the major design, developmental, manufacturing, programmatic, acquisition, or employment decisions most affected by the knowledge obtained through T&E.

Figure 3.1, Top-Level Evaluation Framework Matrix

Note: The first four columns (Key Reqs, COIs, Key MOEs/MOSs, and CTPs & Threshold) comprise the Key Requirements and T&E Measures.

| Key Reqs | COIs | Key MOEs/MOSs | CTPs & Threshold | Test Methodologies/Key Resources (M&S, SIL, MF, ISTF, HITL, OAR) | Decision Supported |
|----------|------|---------------|------------------|------------------------------------------------------------------|--------------------|
| KPP #1 | COI #1. Is the XXX effective for… | MOE 1.1. | Engine thrust | Chamber measurement; observation of performance profiles (OAR) | PDR, CDR |
| | COI #2. Is the XXX suitable for… | | Data upload time | Component-level replication; stress and spike testing in SIL | PDR, CDR |
| | COI #3. Can the XXX be… | MOS 2.1. | | | MS-C, FRP |
| | | MOE 1.3. | | | Post-CDR, FRP |
| | | MOE 1.4. | Reliability based on growth curve | Component-level stress testing; sample performance on growth curve; sample performance with M&S augmentation | PDR, CDR, MS-C |
| KPP #2 | | MOS 2.4. | Data link | | MS-C, SRR |
| KPP #3 | COI #4. Is training… | MOE 1.2. | | Observation and survey | MS-C, FRP |
| KSA #3.a | COI #5. Documentation | MOS 2.5. | | | MS-C, FRP |

3.3. Developmental Evaluation Approach.

  • Describe the top-level approach to evaluate system and process maturity, as well as, system capabilities and limitations expected at acquisition milestones and decision review points.
  • The discussion should include logistics, reliability growth, and system performance aspects.
  • Within this section, also discuss:

1) rationale for CTPs (see below for a description of how to derive CTPs),

2) key system or process risks,

3) any certifications required (e.g. weapon safety, interoperability, spectrum approval, information assurance),

4) any technology or subsystem that has not demonstrated the expected level of technology maturity (i.e., Technology Readiness Level 6 or higher), the expected system performance, or the desired mission capabilities for this phase of development,

5) degree to which system hardware and software design has stabilized so as to determine manufacturing and production decision uncertainties,

6) key issues and the scope for logistics and sustainment evaluations, and

7) reliability thresholds when the testing is supporting the system’s reliability growth curve.

  • CTPs are measurable critical system characteristics that, if not achieved, preclude the fulfillment of desired operational performance capabilities. While not user requirements, CTPs are technical measures derived from desired user capabilities. Testers use CTPs as reliable indicators that the system is on (or behind) the planned development schedule or will likely (or not likely) achieve an operational capability.
  • Limit the list of CTPs to those that support the COIs. Using the system specification as a reference, the chief engineer on the program should derive the CTPs to be assessed during development.

3.3.1. Mission-Oriented Approach.

  • Describe the approach to evaluate the system performance in a mission context during development in order to influence the design, manage risk, and predict operational effectiveness and operational suitability.
  • A mission context focuses on how the system will be employed. Describe the rationale for the COIs or COICs.

3.3.2. Developmental Test Objectives.

  • Summarize the planned objectives and state the methodology to test the system attributes defined by the applicable capability requirement document (CDD, CPD, CONOPs) and the CTPs that will be addressed during each phase of DT as shown in Figure 3.1, Top-Level Evaluation Framework matrix and the Systems Engineering Plan.
  • Subparagraphs can be used to separate the discussion of each phase.
  • For each DT phase, discuss the key test objectives to address both the contractor and government developmental test concerns and their importance to achieving the exit criteria for the next major program decision point.
  • If a contractor is not yet selected, include the developmental test issues addressed in the Request For Proposals (RFPs) or Statement of Work (SOW).
  • Discuss how developmental testing will reflect the expected operational environment to help ensure developmental testing is planned to integrate with operational testing.
  • Also include key test objectives related to logistics testing.
  • All objectives and CTPs should be traceable in the Top-Level Evaluation Framework matrix to ensure all KPPs/KSAs are addressed, and that the COIs/COICs can be fully answered in operational testing.
  • Summarize the developmental test events, test scenarios, and the test design concept.
  • Quantify the testing sufficiently (e.g., number of test hours, test articles, test events, test firings) to allow a valid cost estimate to be created.
  • Identify and explain how models and simulations, specific threat systems, surrogates, countermeasures, component, or subsystem testing, test beds, and prototypes will be used to determine whether or not developmental test objectives are achieved.
  • Identify the DT&E reports required to support decision points/reviews and OT readiness.
  • Address the system’s reliability growth strategy, goals, and targets and how they support the Evaluation Framework.
  • Detailed developmental test objectives should be addressed in the System Test Plans and detailed test plans.

3.3.3. Modeling & Simulation (M&S).

  • Describe the key models and simulations and their intended use.
  • Include the developmental test objectives to be addressed using M&S to include any approved operational test objectives.
  • Identify data needed and the planned accreditation effort.
  • Identify how the developmental test scenarios will be supplemented with M&S, including how M&S will be used to predict the Sustainment KPP and other sustainment considerations.
  • Identify who will perform M&S verification, validation, and accreditation. Identify developmental M&S resource requirements in Part IV.

3.3.4. Test Limitations.

  • Discuss any developmental test limitations that may significantly affect the evaluator's ability to draw conclusions about the maturity, capabilities, limitations, or readiness for dedicated operational testing.
    • Also address the impact of these limitations, and resolution approaches.

3.4. Live Fire Test and Evaluation Approach.

  • If live fire testing is required, describe the approach to evaluate the survivability/lethality of the system, and (for survivability LFT&E) personnel survivability of the system's occupants.
  • Include a description of the overall live fire evaluation strategy to influence the system design (as defined in Title 10 U.S.C. § 2366), critical live fire evaluation issues, and major evaluation limitations.
  • Discuss the management of the LFT&E program, to include the shot selection process, target resource availability, and schedule.
  • Discuss a waiver, if appropriate, from full-up, system-level survivability testing, and the alternative strategy.

3.4.1. Live Fire Test Objectives.

  • State the key live fire test objectives for realistic survivability or lethality testing of the system.
  • Include a matrix that identifies all tests within the LFT&E strategy, their schedules, the issues they will address, and which planning documents will be submitted for DOT&E approval and which will be submitted for information and review only.
  • Quantify the testing sufficiently (e.g., number of test hours, test articles, test events, test firings) to allow a valid cost estimate to be created.

3.4.2. Modeling & Simulation (M&S).

  • Describe the key models and simulations and their intended use.
  • Include the LFT&E test objectives to be addressed using M&S to include operational test objectives. Identify data needed and the planned accreditation effort.
  • Identify how the test scenarios will be supplemented with M&S.
  • Identify who will perform M&S verification, validation, and accreditation. Identify M&S resource requirements in Part IV

3.4.3. Test Limitations.

  • Discuss any test limitations that may significantly affect the ability to assess the system’s vulnerability and survivability.
    • Also address the impact of these limitations, and resolution approaches.

3.5. Certification for Initial Operational Test and Evaluation (IOT&E).

  • Explain how and when the system will be certified safe and ready for IOT&E.
  • Explain who is responsible for certification and which decision reviews will be supported using the lead Service’s certification of safety and system materiel readiness process.
  • List the DT&E information (i.e., reports, briefings, or summaries) that provides predictive analyses of expected system performance against specific COIs and the key system attributes - MOEs/MOSs.
  • Discuss the entry criteria for IOT&E and how the DT&E program will address those criteria.

3.6. Operational Evaluation Approach.

  • Describe the approach to conduct the independent evaluation of the system.
  • Identify the periods during integrated testing that may be useful for operational assessments and evaluations.
  • Outline the approach to conduct the dedicated IOT&E and resolution of the COIs.
    • COIs must be relevant to the required capabilities and of key importance to the system being operationally effective, operationally suitable and survivable, and represent a significant risk if not satisfactorily resolved. A COI/COIC is typically phrased as a question that must be answered in the affirmative to properly evaluate operational effectiveness (e.g., "Will the system detect the threat in a combat environment at adequate range to allow successful engagement?") and operational suitability (e.g., "Will the system be safe to operate in a combat environment?"). COIs/COICs are critical elements or operational mission objectives that must be examined.
    • COIs/COICs should be few in number and reflect total operational mission concerns. Use existing documents such as capability requirements documents, Business Case Analysis, AoA, APB, war fighting doctrine, validated threat assessments and CONOPS to develop the COIs/COICs.
    • COIs/COICs must be formulated as early as possible to ensure developmental testers can incorporate mission context into DT&E.
    • If every COI is resolved favorably, the system should be operationally effective and operationally suitable when employed in its intended environment by typical users.

3.6.1. Operational Test Objectives.

  • State the key MOEs/MOSs that support the COIs/COICs.
  • Ensure the operational tests can be identified in a way that allows efficient DOT&E approval of the overall OT&E effort in accordance with Title 10 U.S.C. § 139(d).
  • Describe the scope of the operational test by identifying the test mission scenarios and the resources that will be used to conduct the test.
  • Summarize the operational test events, key threat simulators and/or simulation(s) and targets to be employed, and the type of representative personnel who will operate and maintain the system.
  • Identify planned sources of information (e.g., developmental testing, testing of related systems, modeling, simulation) that may be used to supplement operational test and evaluation.
  • Quantify the testing sufficiently (e.g., number of test hours, test articles, test events, test firings) to allow a valid cost estimate to be created.
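
As a hedged illustration of the quantification asked for in the bullet above, the sketch below uses two textbook relationships, a zero-failure exponential reliability demonstration and a zero-failure binomial success-rate demonstration, to size test hours and test events. The threshold values and confidence level are illustrative assumptions, not program requirements; the resulting numbers would feed the Figure 2.1 schedule and the Part IV resource and cost estimates.

    # Illustrative sizing of test scope (not program data): how many operating
    # hours and discrete test events are needed to demonstrate notional
    # thresholds at a stated statistical confidence, assuming zero failures.
    import math

    def exponential_demo_hours(mtbf_req_hours: float, confidence: float) -> float:
        """Test hours to demonstrate MTBF >= mtbf_req_hours at the given
        confidence, assuming exponential failure times and zero failures."""
        return -mtbf_req_hours * math.log(1.0 - confidence)

    def binomial_demo_trials(p_success_req: float, confidence: float) -> int:
        """Number of zero-failure trials needed to demonstrate a success
        probability >= p_success_req at the given confidence."""
        return math.ceil(math.log(1.0 - confidence) / math.log(p_success_req))

    if __name__ == "__main__":
        # Notional thresholds for illustration only.
        print(exponential_demo_hours(mtbf_req_hours=200.0, confidence=0.80))  # ~322 hours
        print(binomial_demo_trials(p_success_req=0.90, confidence=0.80))      # 16 events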

3.6.2. Modeling & Simulation (M&S).

  • Describe the key models and simulations and their intended use.
  • Include the operational test objectives to be addressed using M&S. Identify data needed and the planned accreditation effort.
  • Identify how the operational test scenarios will be supplemented with M&S.
  • Identify who will perform the M&S verification, validation, and accreditation.
  • Identify operational M&S resource requirements in Part IV.

3.6.3. Test Limitations.

  • Discuss test limitations, including threat realism, resource availability, limited operational environments (military; climatic; Chemical, Biological, Radiological, and Nuclear (CBRN); etc.), limited support environment, maturity of tested systems or subsystems, and safety, that may affect the resolution of the affected COIs.
  • Describe measures taken to mitigate limitations.
  • Indicate if any system contractor involvement or support is required, the nature of that support, and steps taken to ensure the impartiality of the contractor providing the support according to Title 10 U.S.C. §2399.
  • Indicate the impact of test limitations on the ability to resolve COIs and to formulate conclusions regarding operational effectiveness and operational suitability. Indicate the COIs affected in parentheses after each limitation.

3.7. Other Certifications.

  • Identify key testing prerequisites and entrance criteria, such as required certifications (e.g., DoD Information Assurance Certification and Accreditation Process (DIACAP) Authorization to Operate, Weapon Systems Explosive Safety Review Board (WSERB) review, flight certification).

3.8. Reliability Growth.

  • Since reliability is a driver during system development, identify, in tabular form, the amount of operating time being accrued during each of the tests listed in Figure 2.1.
    • The table should contain the system configuration, operational concept, etc. Reference and provide hyperlinks to the reliability growth planning document.
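
As one way to build the tabulation requested above, the sketch below projects instantaneous MTBF at the cumulative operating hours planned for each test event using a Crow-AMSAA (power-law) growth model. The growth parameters and per-event hours are illustrative assumptions; in practice they would come from the program's reliability growth planning document.

    # Illustrative reliability growth tabulation (assumed values throughout).
    # Crow-AMSAA power-law model: cumulative failures N(t) = lam * t**beta, so
    # the instantaneous MTBF at time t is 1 / (lam * beta * t**(beta - 1)).
    lam, beta = 0.45, 0.65           # assumed model parameters (beta < 1 implies growth)

    planned_hours = [                # (test event, planned operating hours) - notional
        ("IT-B1", 300),
        ("IT-B2", 500),
        ("IT-C1", 800),
        ("IT-C2", 600),
    ]

    cumulative = 0.0
    print(f"{'Event':8} {'Cum hours':>10} {'Projected MTBF (h)':>20}")
    for event, hours in planned_hours:
        cumulative += hours
        mtbf_inst = 1.0 / (lam * beta * cumulative ** (beta - 1.0))
        print(f"{event:8} {cumulative:10.0f} {mtbf_inst:20.1f}")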

3.9. Future Test and Evaluation.

  • Summarize all remaining significant T&E that has not been discussed yet, extending through the system life cycle.
    • Significant T&E is that T&E requiring procurement of test assets or other unique test resources that need to be captured in the Resource section.
    • Significant T&E can also be any additional questions or issues that need to be resolved for future decisions.
    • Do not include any T&E in this section that has been previously discussed in this part of the TEMP.

4. PART IV-RESOURCE SUMMARY

4.1. Introduction.

  • In this section, specify the resources necessary to accomplish the T&E program.
  • Testing will be planned and conducted to take full advantage of existing DoD investment in ranges, facilities, and other resources wherever practical.
  • Provide a list, in table format (see Table 4.1), of all key government and contractor test and evaluation resources that will be used during the course of the current increment, including their schedule (ensure the list is consistent with the Figure 2.1 schedule). Include long-lead items for the next increment if known.
  • Specifically, identify the following test resources, along with any shortfalls, the impact on planned testing, and the plan to resolve the shortfalls.

4.1.1. Test Articles.

  • Identify the actual number of and timing requirements for all test articles, including key support equipment and technical information required for testing in each phase of DT&E, LFT&E, and OT&E.
    • If key subsystems (components, assemblies, subassemblies or software modules) are to be tested individually, before being tested in the final system configuration, identify each subsystem in the TEMP and the quantity required.
  • Specifically identify when prototype, engineering development, or production models will be used.

4.1.2. Test Sites and Instrumentation.

  • Identify the specific test ranges/facilities and schedule to be used for each type of testing.
  • Compare the requirements for test ranges/facilities dictated by the scope and content of planned testing with existing and programmed test range/facility capability.
  • Identify instrumentation that must be acquired specifically to conduct the planned test program.

4.1.3. Test Support Equipment.

  • Identify test support equipment and schedule specifically required to conduct the test program.
  • Anticipate all test locations that will require some form of test support equipment. This may include test measurement and diagnostic equipment, calibration equipment, frequency monitoring devices, software test drivers, emulators, or other test support devices that are not included under the instrumentation requirements.

4.1.4. Threat Representation.

  • Identify the type, number, availability, fidelity requirements, and schedule for all representations of the threat (to include threat targets) to be used in testing.
  • Include the quantities and types of units and systems required for each of the test phases. Appropriate threat command and control elements may be required and utilized in both live and virtual environments.
  • The scope of each T&E event will determine the final threat inventory.

4.1.5. Test Targets and Expendables.

  • Specify the type, number, availability, and schedule for all test targets and expendables (e.g., targets, weapons, flares, chaff, sonobuoys, smoke generators, countermeasures) required for each phase of testing.
  • Identify known shortfalls and associated evaluation risks.
  • Include threat targets for LFT&E lethality testing and threat munitions for vulnerability testing.

4.1.6. Operational Force Test Support.

  • For each test and evaluation phase, specify the type and timing of aircraft flying hours, ship steaming days, on-orbit satellite contacts/coverage, and other operational force support required.
  • Include supported/supporting systems that the system under test must interoperate with if testing a system-of-systems or family-of-systems.
  • Include the size, location, and type of unit required.

4.1.7. Models, Simulations, and Testbeds.

  • For each test and evaluation phase, specify the models and simulations to be used, including computer-driven simulation models and hardware/software-in-the-loop test beds.
  • Identify opportunities to simulate any of the required support.
  • Identify the resources required to validate and accredit their usage, the responsible agency, and the timeframe.

4.1.8. Joint Mission Environment.

  • Describe the live, virtual, or constructive components or assets necessary to create an acceptable environment to evaluate system performance against stated joint requirements.
  • Describe how both DT and OT testing will utilize these assets and components.

4.1.9. Special Requirements.

  • Identify requirements and schedule for any necessary non-instrumentation capabilities and resources, such as special data processing/databases, unique mapping/charting/geodesy products, extreme physical environmental conditions, or restricted/special-use air/sea/landscapes.
  • Briefly list any items impacting the T&E strategy or government test plans that must be put on contract or which are required by statute or regulation. These are typically derived from the JCIDS requirement (i.e., Programmatic Environment, Safety and Occupational Health Evaluation (PESHE) or Environment, Safety and Occupational Health (ESOH)).
  • Include key statements describing the top-level T&E activities the contractor is responsible for and the kinds of support that must be provided to government testers.

4.2. Federal, State, and Local Requirements.

  • All T&E efforts must comply with federal, state, and local environmental regulations.
  • Current permits and appropriate agency notifications will be maintained regarding all test efforts.
  • Specify any National Environmental Policy Act documentation needed to address specific test activities that must be completed prior to testing and include any known issues that require mitigations to address significant environmental impacts.
  • Describe how environmental compliance requirements will be met.

4.3. Manpower/Personnel and Training.

  • Specify manpower/personnel and training requirements and limitations that affect test and evaluation execution. Identify how much training will be conducted with M&S.

4.4. Test Funding Summary.

  • Summarize the cost of testing by fiscal year (FY), separated by major events or phases, and identify DT and OT dollars within each FY (see the illustrative roll-up sketched below).
    • When costs cannot be estimated, identify the date when the estimates will be derived.
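
As a hedged sketch of the roll-up described above, the snippet below aggregates event-level estimates into DT and OT dollars by fiscal year. The fiscal years, categories, and dollar amounts are placeholders for illustration only, not program figures.

    # Illustrative roll-up of test funding by fiscal year and DT/OT split.
    # All fiscal years and dollar figures are placeholders.
    from collections import defaultdict

    event_estimates = [  # (fiscal year, "DT" or "OT", cost in $K)
        ("FY06", "DT", 4200),
        ("FY07", "DT", 6800),
        ("FY08", "DT", 3500),
        ("FY08", "OT", 2100),
        ("FY09", "OT", 5400),
    ]

    summary = defaultdict(lambda: {"DT": 0, "OT": 0})
    for fy, category, cost_k in event_estimates:
        summary[fy][category] += cost_k

    for fy in sorted(summary):
        row = summary[fy]
        print(f"{fy}: DT ${row['DT']:,}K  OT ${row['OT']:,}K  Total ${row['DT'] + row['OT']:,}K")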

Table 4.1. Test Sites and Instrumentation (Example)

Fiscal Year:   06      07      08            09      10      11      12      TBD
Test Event:    IT-B1   IT-B2   IT-B2/IT-C1   IT-C1   IT-C1   IT-C2   OT-C1   OT-D1

Test Resource (planned usage across the events above):
  • Integration Lab: X, X, X, X, X, X
  • Radar Integration Lab: X, X, X, X, X, X
  • Loads (flights)
  • Operating Area #1 (flights): X (1), X (1), X (1), X (2)
  • Operating Area #2 (flights): 50 (1), 132 (1), 60, 100, 140, X (1), X (2)
  • Northeast CONUS Overland (flights): 10, X (1), X (2)
  • SOCAL Operating Areas (flights): X, X
  • Shielded Hangar (hours): 160, 160
  • Electromagnetic Radiation Facility (hours): 40, 40
  • Arresting Gear (Mk 7 Mod 3) (events): 10, 10
  • NAS Fallon: 5, 5, A/R, X (1), X (2)
  • Link-16 Lab, Eglin AFB: X
  • NAWCAD WD, China Lake Range: X
  • Eglin AFB ESM Range: X

Notes:
1. Explanations as required.
2. Enter the date the funding will be available.

9.5.5.4. Other Milestone TEMPs and Updates

An updated TEMP is required as part of the entry criteria for each acquisition phase and whenever a major programmatic change occurs. For example, an updated TEMP may be required because of a significant design or configuration change resulting from the Critical Design Review (CDR), a change to the acquisition strategy, or changes to capability requirements.
