 
The DAG does not reflect the changes in DoDI 5000.02. Work is in progress to update the content and will be completed as soon as possible.
 

4.3.15. Verification Process

DEFENSE ACQUISITION GUIDEBOOK
Chapter 4 -- Systems Engineering

Verification provides evidence that the system or system element performs its intended functions and meets all performance requirements listed in the system performance specification and functional and allocated baselines. Verification answers the question, “Did you build the system correctly?” Verification is a key risk-reduction activity in the implementation and integration of a system and enables the program to catch defects in system elements before integration at the next level, thereby preventing costly troubleshooting and rework.

The Program Manager and Systems Engineer manage verification activities and methods as defined in the functional and allocated baselines, and review the results of verification. Guidance for managing and coordinating integrated testing activities can be found in DAG Chapter 9 Test and Evaluation and in DoDI 5000.02.

Verification begins during Requirements Analysis, when top-level stakeholder performance requirements are decomposed and eventually allocated to system elements in the initial system performance specification and interface control specifications. During this process, the program determines how and when each requirement should be verified, the tasks required to do so, and the necessary resources (e.g., test equipment, range time, and personnel). The resulting verification matrix and supporting documentation become part of the program’s functional and allocated baselines.
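A verification matrix of this kind can be sketched as a simple data structure that maps each requirement to its verification method(s), verification event, and required resources. The sketch below is illustrative only; the requirement IDs, event names, and resource lists are invented for the example and are not prescribed by the DAG.

```python
from dataclasses import dataclass, field
from enum import Enum

class Method(Enum):
    """The four verification methods described in this section."""
    DEMONSTRATION = "Demonstration"
    EXAMINATION = "Examination"
    ANALYSIS = "Analysis"
    TEST = "Test"

@dataclass
class VerificationEntry:
    """One row of a verification matrix: how, when, and with what
    resources a single requirement will be verified."""
    requirement_id: str          # e.g., a spec paragraph number (illustrative)
    requirement_text: str
    methods: list                # one or more Method values
    event: str                   # when verification occurs (illustrative name)
    resources: list = field(default_factory=list)

# Illustrative entries -- requirement IDs, events, and resources are made up.
matrix = [
    VerificationEntry(
        requirement_id="3.2.1",
        requirement_text="System shall operate continuously for 24 hours.",
        methods=[Method.TEST],
        event="DT&E",
        resources=["test range time", "instrumentation"],
    ),
    VerificationEntry(
        requirement_id="3.7.4",
        requirement_text="Enclosure shall bear the required product markings.",
        methods=[Method.EXAMINATION],
        event="Acceptance",
    ),
]

# A simple completeness check: every requirement has at least one method.
assert all(entry.methods for entry in matrix)
```

In practice a program would maintain this cross-reference in its requirements-management tooling rather than in code; the point of the sketch is that each requirement carries its verification method, event, and resources as part of the functional and allocated baselines.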

Verification may be accomplished by any combination of the following methods:

  • Demonstration. Demonstration is the performance of operations at the system or system element level where visual observations are the primary means of verification. Demonstration is used when quantitative assurance is not required for verification of the requirements.
  • Examination. Visual inspection of equipment and evaluation of drawings and other pertinent design data and processes should be used to verify conformance with characteristics such as physical, material, part, and product marking and workmanship.
  • Analysis. Analysis is the use of recognized analytic techniques (including computer models) to interpret or explain the behavior/performance of the system element. Analysis of test data or review and analysis of design data should be used as appropriate to verify requirements.
  • Test. Test is an activity designed to provide data on functional features and equipment operation under fully controlled and traceable conditions. The data are subsequently used to evaluate quantitative characteristics.

Designs are verified at all levels of the physical architecture through a cost-effective combination of these methods, all of which can be aided by modeling and simulation.

Verification activities and results are documented among the artifacts for Functional Configuration Audits (FCA) and the System Verification Review (SVR) (see DAG section 4.2.14. Functional Configuration Audits/System Verification Review). When possible, verification should stress the system or system elements under realistic conditions representative of their intended use.

The individual system elements provided by the Implementation process are verified through developmental test and evaluation (DT&E), acceptance testing, or qualification testing. During the Integration process, the successively higher-level system elements may be verified before they move on to the next level of integration. Verification of the system as a whole occurs when integration is complete. As design changes occur, each change should be assessed for potential impact to the qualified baseline. This may require repeating portions of verification to mitigate the risk of performance degradation.

The output of the Verification process is a verified production-representative article with documentation to support Initial Operational Test and Evaluation (IOT&E). The SVR provides a determination of the extent to which the system meets the system performance specification.
