Test & Evaluation

[Flowchart: test and evaluation sequence by system complexity]
Non-Complex Systems: HSI → CPI → GPI → GFI
Complex Systems: HSI/NPE → CPI (or ICPI) → TRR → GPI → CFI → GFI
HSI = Hardware Software Integration
NPE = Navy Preliminary Evaluation
CPI = Contractor Preliminary Inspection
ICPI = Incremental Contractor Preliminary Inspection
TRR = Test Readiness Review
GPI = Government Preliminary Inspection
FCA = Functional Configuration Audit
PCA = Physical Configuration Audit
CFI = Contractor Final Inspection
GFI = Government Final Inspection

Overview

The testing process for training systems varies according to the complexity of the training system being developed. Each training system is tested first at the contractor's facility where it is built, and again after it is delivered to the training site. Once the contractor begins to integrate the hardware and software components, some level of testing can begin. For complex aviation trainers, a Navy Preliminary Evaluation (NPE) is conducted to determine whether the trainer is ready for testing. The contractor then tests the system during the Contractor Preliminary Inspection (CPI). Depending on the complexity of the trainer, this event can be incremental (ICPI), with subsystems tested before the overall integrated system. After the contractor completes in-plant testing, the government conducts a Test Readiness Review (TRR) for complex training systems to determine whether the system is ready to be tested by the government. If it is, the government conducts the Government Preliminary Inspection (GPI). Once the GPI is approved, the training system is delivered and installed at the training site. For complex training systems, a Contractor Final Inspection (CFI) is conducted prior to the Government Final Inspection (GFI).
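The two event sequences can be summarized in a few lines of code. The sketch below is illustrative only; the function and flag names are hypothetical and not part of any NAWCTSD tool. Event names follow the legend above.

    # Illustrative sketch of the test-event sequences described above.
    # Names here are hypothetical, not drawn from any official tool.

    NON_COMPLEX_EVENTS = ["HSI", "CPI", "GPI", "GFI"]
    COMPLEX_EVENTS = ["HSI", "NPE", "ICPI", "TRR", "GPI", "CFI", "GFI"]

    def test_events(complex_trainer: bool) -> list[str]:
        """Return the ordered test events for a trainer of the given complexity."""
        return COMPLEX_EVENTS if complex_trainer else NON_COMPLEX_EVENTS

    print(" -> ".join(test_events(complex_trainer=True)))
    # HSI -> NPE -> ICPI -> TRR -> GPI -> CFI -> GFI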

Definitions

Test. A project or program designed to obtain, verify, and provide data for the evaluation of: research and development (other than laboratory experiments); progress in accomplishing development objectives; and the performance and operational capability of systems, subsystems, components, and equipment items.

Evaluation. The review and analysis of data produced during current or previous testing, data obtained from tests conducted by other government agencies and contractors, data from operational and commercial experience, or combinations thereof.

Acceptance. The determination of whether a system satisfies its acceptance criteria. These criteria enable NAWCTSD to determine whether or not to accept the system. Government acceptance of the training system is documented by a signed DD Form 250.

Test & Evaluation IPT (T&E IPT). Chaired by the Systems Engineer to provide a forum for test-related subjects such as the Trainer Test and Evaluation Plan (TTEP). The T&E IPT assists in establishing test objectives and evaluation baselines; defines organization, responsibilities, and relationships; estimates costs and schedules; and identifies needed test resources. The T&E IPT maintains a continuous interchange on test-related issues and identifies and resolves potential problem areas that could jeopardize a successful trainer acceptance program. T&E IPT meetings are scheduled to coincide with requirements, design, progress, and TRR conferences.

Fundamental Role of T&E

The fundamental role of Test and Evaluation (T&E) in the acquisition of new military systems is changing as a result of acquisition reform. The policy calls for reducing government oversight “...by substituting process controls and non-government standards in place of development and/or production testing and inspection and military unique quality assurance systems.” Although this policy can reduce the need for acceptance testing, it is not obvious that it should lead to reduced developmental testing. In fact, development and verification of manufacturing process capability is an Engineering and Manufacturing Development (EMD) task that is generally recognized as needing additional emphasis. It must always be remembered that the sole function of testing is to generate accurate and sufficient data to enable the appropriate decision authority to determine acceptability.

However, testers, like program managers, must be committed to program success. Everyone must be on the same Integrated Product Team (IPT) -- the one responsible for delivering a superior capability to the user. The outlook and approach must shift from “oversight” and after-the-fact reporting to “early insight.” T&E expertise must be included in the IPT early so that problems are prevented, rather than identified in “gotcha” fashion when a test report is written. Quality and excellence should be built in from the start -- not inspected in two weeks before the test program begins or an In Process Review (IPR) meeting occurs. This is one of the important value-added contributions that the developmental and operational T&E communities must provide.

It must be stressed that being responsible for program success does not compromise a functional member’s independent assessment role. T&E team members remain accountable for ensuring that each program has a workable approach; test planning is accomplished through the T&E IPT process and documented in the Trainer Test and Evaluation Plan (TTEP). T&E team members must continue to perform independent assessments and satisfy themselves that a program is executable, but they are expected to do so by engaging early and constructively. The T&E community is not working as an integrated team if it waits until the test report is written or the IPR meets to surface “surprises.”

Constructive stakeholder behavior is also expected. When concerns are raised, they should be accompanied by workable suggestions and practical solutions. As this cultural change is institutionalized, remember that the goal is a process that secures early insight -- not event-driven oversight.

With the emphasis now on performance specifications, modeling and simulation can be used to evaluate many key performance parameters. Modeling and simulation must be integrated into the T&E process to secure early insight, reduce costs, and shorten the acquisition cycle time. The Navy’s senior leadership is strongly committed to greater use of modeling and simulation, especially models that incorporate real physical underpinnings. With such models, certain tests can be eliminated entirely and test resources focused on the areas where understanding is weakest. In many cases, tests should be conducted to validate the models and simulations.
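The validation step can be as simple as comparing paired simulated and measured values against a tolerance. The following is a minimal sketch under assumed conditions: the function name, tolerance, and data are illustrative, not drawn from any real program or NAWCTSD procedure.

    # Minimal sketch, assuming simulated and test data for the same key
    # performance parameter are available as paired samples.

    def validate_model(simulated: list[float], measured: list[float],
                       tolerance: float) -> bool:
        """Accept the model if every simulated point is within tolerance of test data."""
        return all(abs(s - m) <= tolerance for s, m in zip(simulated, measured))

    # Example: simulated vs. measured airspeed (knots) at three test points.
    sim = [250.0, 301.5, 352.0]
    test = [252.1, 300.2, 355.3]
    print(validate_model(sim, test, tolerance=5.0))  # True: within tolerance

In practice the comparison criteria come from the TTEP and test procedures; the point is that a validated model can then substitute for some tests with known confidence.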

Roles Of The IPT During Testing

Systems Engineer (SE) - As the lead technical person for the Government, the SE performs the functions of Test Director during Government tests, ensuring that the tests are complete and resolving any conflicts. The SE also represents the Government at technical meetings and working groups. The SE is normally designated as the Contracting Officer’s Representative and is thus given oversight of the technical aspects of the contract. The SE also ensures that ICPI, TRR, GPI, CFI, and GFI are conducted in accordance with the contract, the TTEP, Test Procedures, Test/Inspection Report, and any other applicable documents.

Project Manager (PJM) - The PJM monitors overall project cost and schedule and is the primary interface with the Fleet users and the Sponsors. The PJM makes decisions that affect project scope, cost, and schedule, and acts upon the recommendations of the SE and other Government team members.

Contractor - Responsible for conducting tests during ICPI/CFI, providing test facilities and equipment, and maintaining test logs and records. The Contractor develops the Test Procedures and the Test/Inspection Report and is a member of the Test and Evaluation Working Group. The Contractor is also responsible for correcting all discrepancy reports (DRs) determined by the Government to be within the scope of the contract and for re-testing each correction.

Fleet Project Team (FPT) - Provides the user’s perspective during the Preliminary Evaluation(s) and at various technical meetings. The FPT assists the PJM and SE (Test Director) in interpreting user requirements. Their military experience is invaluable during testing, as they are the actual fleet operators or instructors. The FPT can verify that the behavior of the system under test is representative of the actual weapon system.

In-Service Engineer (ISE) - Acts as a technical consultant to the SE (Test Director) during Preliminary Evaluation(s) and assists in reviewing technical and support documentation. Frequently designated as the Trainer Software Support Activity (TSSA), the ISE office can provide valuable insight into the acceptability of the Training System. The ISE’s “hands on” experience with a wide range of training systems provides an inherent objectivity and an ability to identify potential weak areas.

Defense Contract Management Area Operations (DCMAO) - Provides administrative oversight of the contract at the Contractor’s facilities and monitors the quality of the Contractor’s processes. When the DCMAO representative is located at the Contractor’s plant, they serve as the in-plant point of contact for the Government.

Subject Matter Experts (SMEs) - Government experts in various areas of training systems development provide their expertise during Preliminary Evaluation(s), technical meetings, and review of technical documentation. In-house SMEs normally consist of visual systems engineers, software engineers, and modeling and simulation engineers. Test engineers from other Government agencies and FPT members are also considered SMEs.

Contracts - Maintains communications with the Contractor on all contractual and legal issues, and accepts and issues official correspondence during performance of the contract. Only the Procuring Contracting Officer is authorized to make changes to the contract, and does so under the direction of the PJM.

Integrated Logistics Support Management Team (ILSMT) - Ensures all logistics aspects of the project are in compliance with the contract. Reviews logistics and support documentation.

Trainer Testing Evaluation Criteria for Award

When evaluating proposals, some general T&E criteria to consider are whether:

  1. The offeror understands, and has addressed, the T&E requirements in the request for proposal.
  2. The proposed test schedules are reasonable relative to the scope of the trainer development effort.
  3. The offeror has identified, and has at its disposal, the resources necessary to adequately support the T&E effort.
  4. Any proposed incremental tests are independent, contain feasible entrance and exit criteria, and will not require further testing after successful completion.
  5. The offeror understands and adequately addresses the test sequence relative to CDRL deliveries and test milestones.

Trainer Access During Government Testing

Control of the trainer is critical during Government testing. At the start of each testing day, do not begin testing and do not allow any Government personnel on the trainer until the contractor has released the trainer to the Government. Likewise, no contractor personnel are permitted on the trainer while the Government has control unless the contractor has the systems engineer's permission. Control of the trainer is returned to the contractor whenever the Government is not using the trainer (during lunch, Government meetings, at the end of the day, etc.).

The Government team should be aware that no tests are to be started without first checking with the systems engineer. One of the biggest problems the systems engineer may face is team members walking onto the trainer to rerun a test without first informing the systems engineer. If the contractor happens to be modifying the trainer at the time, the Government could be liable for any damage done. Likewise, the contractor is not permitted to work on the trainer while it is under Government control unless the actions are coordinated with the systems engineer. Many tests have been restarted because someone touched a terminal "unrelated to the test" during an inspection.
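The custody rules above amount to a small state machine: the trainer is held by exactly one party at a time, and no test starts without both Government control and systems engineer approval. The toy sketch below illustrates that protocol; the class and method names are hypothetical, not a real tool.

    # Toy sketch of the trainer-control handoff described above.
    # All names here are hypothetical illustrations of the protocol.

    class TrainerControl:
        def __init__(self):
            self.holder = "contractor"   # contractor holds the trainer by default

        def release_to_government(self):
            self.holder = "government"

        def return_to_contractor(self):  # at lunch, meetings, end of day, etc.
            self.holder = "contractor"

        def may_start_test(self, se_approved: bool) -> bool:
            # No test starts unless the Government holds the trainer AND the
            # systems engineer has been informed and has approved.
            return self.holder == "government" and se_approved

    ctrl = TrainerControl()
    print(ctrl.may_start_test(se_approved=True))   # False: contractor has control
    ctrl.release_to_government()
    print(ctrl.may_start_test(se_approved=False))  # False: SE not informed
    print(ctrl.may_start_test(se_approved=True))   # True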

