DEFENSE ACQUISITION GUIDEBOOK
Chapter 9 - Test and Evaluation (T&E)

9.4. Integrated Test and Evaluation

According to the OSD memorandum “Definition of Integrated Testing,” dated April 25, 2008, integrated testing is defined as “the collaborative planning and collaborative execution of test phases and events to provide shared data in support of independent analysis, evaluation, and reporting by all stakeholders, particularly the development (both contractor and government) and operational test and evaluation communities.”

The goal of integrated testing is to conduct a seamless test program that produces credible qualitative and quantitative data useful to all evaluators and that addresses developmental, sustainment, and operational issues. Integrated testing allows for the collaborative planning of test events, in which a single test point or mission can provide data to satisfy multiple objectives without compromising the test objectives of the participating test organizations. A test point, in this context, means a test condition denoted by time, three-dimensional location and energy state, and system operating configuration, at which a pre-planned test technique is applied to the system under test and the response(s) are observed and recorded.

Integrated testing involves more than just concurrent or combined DT and OT, in which DT and OT test points are interleaved on the same mission or schedule. Integrated testing focuses the entire test program (contractor test, Government DT, OT, and LFT) on designing, developing, and producing a comprehensive plan that coordinates all test activities to support evaluation results for decision makers at required decision reviews.

Integrated testing may include all types of test activities, such as contractor testing, developmental and operational testing, interoperability and IA testing, and certification testing. All testing, regardless of the source, should receive consideration, including tests from other Services for multi-Service programs. Software-intensive and IT systems should use the reciprocity principle as much as possible, i.e., "Test by one, use by all." Specifically name any required integrated test combinations.

For successful integrated testing, understanding and maintaining the pedigree of the data is vital. The pedigree of the data refers to accurately documenting the configuration of the test asset and the actual test conditions under which each element of test data was obtained. The pedigree of the data should indicate whether the test configuration represented operationally realistic or representative conditions. The T&E WIPT plays an important role in maintaining the data pedigree within the integrated test process for a program. The T&E WIPT establishes agreements among the test program stakeholders regarding roles and responsibilities, not only in implementing the integrated test process but also in developing and maintaining data release procedures and data access procedures or a data repository, through which all stakeholders will have access to test data for separate evaluations.
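
As a concrete illustration only, and not a prescribed format, the sketch below shows one way a program might record the pedigree of a single test data element, using the test point attributes described earlier (time, three-dimensional location and energy state, and system operating configuration) plus a flag for operationally representative conditions and the source of the data. All class names, field names, and values are hypothetical.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class TestDataPedigree:
        """Hypothetical record documenting the pedigree of one test data element."""
        test_point_id: str                  # identifier for the planned test point
        timestamp: datetime                 # when the data element was obtained
        location: tuple                     # three-dimensional location (lat, lon, alt)
        energy_state: dict                  # e.g., airspeed and altitude at the test point
        asset_configuration: str            # configuration of the test asset (hardware/software build)
        test_conditions: dict               # actual conditions under which the data were obtained
        operationally_representative: bool  # whether conditions represented operational realism
        source: str                         # contractor DT, government DT, OT, LFT, etc.

    # Example entry shared through a repository accessible to all stakeholders
    record = TestDataPedigree(
        test_point_id="TP-042",
        timestamp=datetime(2009, 6, 15, 14, 30),
        location=(34.9, -117.9, 25000.0),
        energy_state={"airspeed_kts": 450, "altitude_ft": 25000},
        asset_configuration="EMD prototype, software build 2.1",
        test_conditions={"threat": "emulated", "weather": "clear"},
        operationally_representative=False,
        source="government DT",
    )

A record along these lines lets each stakeholder decide independently which data elements are usable for its own evaluation, which is the purpose the data pedigree serves in the integrated test process.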

Integrated testing must provide shared data in support of independent analyses for all T&E stakeholders. A common T&E database is required, including descriptions of the test environments, to ensure commonality and usability by other testers. Integrated testing must allow for and support separate, independent OT&E in accordance with section 2399 of title 10 U.S.C. and DoDI 5000.02, “Operation of the Defense Acquisition System,” dated December 8, 2008. It does not include the earliest engineering design or testing of early prototype components.

Integrated testing serves as a ‘concept’ for test design, not a new type of T&E. Programs must intentionally design it into the earliest program strategies, plans, documentation, and test plans, preferably starting before Milestone A. Developing and adopting integrated testing strategies early in the process increases the opportunities and benefits. If done correctly, the enhanced operational realism in DT&E provides greater opportunity for early identification of system design improvements and may even change the course of system development during EMD. Integrated testing can increase the statistical confidence and power of all T&E activities. Most obviously, integrated testing can also reduce the T&E resources needed in OT&E. However, integrated testing does not replace or eliminate the need for dedicated IOT&E, as required by section 2399 of title 10 U.S.C., "Operational Test and Evaluation of Defense Acquisition Programs," and DoDI 5000.02.
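
As a rough numerical illustration of the statistical confidence point, the minimal sketch below uses a standard normal-approximation confidence interval on a success probability; the sample sizes and observed rate are invented for the example and are not drawn from any program. Pooling DT test points that share an operationally representative pedigree with the dedicated OT points increases the effective sample size, which narrows the interval for the same evaluation question.

    import math

    def ci_half_width(p_hat: float, n: int, z: float = 1.645) -> float:
        """Half-width of an approximate 90% confidence interval on a success probability."""
        return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

    p_hat = 0.80          # observed success rate (illustrative value)
    n_ot_only = 20        # trials available from dedicated OT alone
    n_pooled = 20 + 45    # OT trials plus DT trials with an operationally representative pedigree

    print(f"OT-only 90% CI half-width: {ci_half_width(p_hat, n_ot_only):.3f}")
    print(f"Pooled  90% CI half-width: {ci_half_width(p_hat, n_pooled):.3f}")
    # The pooled estimate is appreciably tighter, which is the sense in which integrated
    # testing can increase statistical confidence; it does not replace the dedicated
    # IOT&E required by 10 U.S.C. 2399.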

The T&E strategy should embed integrated testing, although most of the effort takes place during the detailed planning and execution phases of a test program. It is critical that all stakeholders understand the evaluations required to assess risk, assess the maturity of the system, and assess operational effectiveness, operational suitability, and survivability (or operational security)/lethality. Up front, define the “end state” for evaluation, ensuring all stakeholders work toward the same goal. Once this is accomplished, develop an integrated test program that generates the data required to conduct the evaluations.

Early identification of ‘system’ and ‘mission’ elements enables the development and execution of an efficient and effective T&E strategy and an integrated DT/OT program. The use of scientific and statistical principles for test and evaluation, for example design of experiments (DOE), will help develop an integrated DT/OT program by providing confidence about the performance of a system in a mission context.
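
To make the DOE reference concrete, the sketch below builds a simple full-factorial test matrix spanning both DT-oriented and OT-oriented conditions. The factors and levels are hypothetical and chosen only for illustration; an actual program would typically size the design (often a fractional or optimal design rather than a full factorial) through the T&E WIPT.

    from itertools import product

    # Hypothetical factors and levels for an integrated DT/OT design of experiments
    factors = {
        "threat_density":  ["low", "medium", "high"],
        "countermeasures": ["off", "on"],
        "time_of_day":     ["day", "night"],
        "target_type":     ["fixed", "moving"],
    }

    # Full-factorial matrix: every combination of factor levels becomes a test point
    full_factorial = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    print(f"Full factorial requires {len(full_factorial)} test points")  # 3*2*2*2 = 24

    # Each run can serve multiple stakeholders when the data pedigree is preserved,
    # e.g., DT evaluators analyze measured responses while OT evaluators score
    # mission-level outcomes from the same events.
    for run_id, point in enumerate(full_factorial[:3], start=1):
        print(run_id, point)

Covering the factor space systematically in this way is what allows statements about system performance across the mission context, rather than at a handful of ad hoc conditions.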

Although DT and OT require different fidelity to meet their individual objectives (e.g., data parameters, mission control, onboard and test range instrumentation, data collection and analysis), some areas of commonality include:

  • Evaluation in complex joint mission operating environments with systems of different levels of maturity (integrating upgraded systems with legacy systems)
  • Replication of the “real world” environment as closely as practical in a safe and affordable manner
  • Need for a distributive live/virtual/constructive (LVC) representation of the joint operational environments (the only affordable way to test and train in a complex system-of-systems environment)
  • Use of validated tactics, techniques, and procedures (TTPs)
  • Representation of Blue and Red Forces
  • Validated scenarios
  • Threat and threat countermeasures
  • Dedicated instrumented ranges (differences exist in the instrumentation fidelity required to control participants, collect data, and support real-time and post-event analyses)
  • Data collection, management, archiving, and retrieval processes
  • Embedded sensors and instrumentation

Integrated DT/OT initiatives encourage all testers – contractor, developmental, operational, and live fire – to plan an integrated test program, seeking an efficient continuum. They focus on the early discovery of problems in a mission context and in realistic operational environments – even for component testing. The appropriate T&E environment includes the system under test (SUT) and any interrelated systems (that is, its planned or expected environment in terms of weapons, sensors, command and control, and platforms, as appropriate) needed to accomplish an end-to-end mission in combat. A few integrated test concerns include:

  1. Balancing the test event to effectively capture different DT and OT data collection objectives
  2. Requiring early investment in detailed planning that many programs lack in early stages
  3. Requiring constant planning and updates to effectively maximize test results
  4. Much of the early information for a program is preliminary, requiring rework and updates
  5. Analysis becomes difficult when unanticipated anomalies appear in test results
