Test and Evaluation: Impact of DOD's Office of the Director of Operational Test and Evaluation

NSIAD-98-22 October 24, 1997
Full Report (PDF, 44 pages)

Summary

The Defense Department (DOD) has proposed that the practices and policies of the Office of the Director of Operational Test and Evaluation be modified to reduce the time and cost of developing and fielding new weapon systems. This report reviews the Office's operations and organizational structure for overseeing operational testing. GAO assesses (1) the Office's efforts and their impact on the quality of operational testing and evaluation in DOD and (2) the strengths and weaknesses of the current organizational framework in DOD for operational testing. As part of its review, GAO prepared 13 case studies involving the testing of individual weapon systems.

GAO noted that:

(1) GAO's review of 13 case studies indicated that DOT&E oversight of operational testing and evaluation increased the probability that testing would be more realistic and more thorough.

(2) Specifically, DOT&E was influential in increasing the reliability of observed performance and reducing the risk of unknowns by advocating more thorough testing, conducting more realistic testing, enhancing data collection and analysis, reporting independent findings, and recommending follow-on operational test and evaluation when suitability or effectiveness was not fully demonstrated prior to initiating full-rate production.

(3) The independence of DOT&E--and its resulting authority to report directly to Congress--is the foundation of its effectiveness.

(4) That independence, along with DOT&E's legislative mandate, provides sufficient freedom and authority to exercise effective oversight of the operational testing and evaluation of new systems before a decision is made to begin full-rate production.

(5) DOT&E can reduce the risk that systems are inadequately tested prior to the full-rate production decision, but it cannot ensure that: (a) only systems whose operational effectiveness and suitability have been demonstrated through operational testing will proceed to the full-rate production decision; or (b) newly fielded systems will accomplish their missions as intended, or that fielded systems are safe, survivable, and effective.

(6) DOT&E management must balance its oversight responsibilities for operational testing against the broader acquisition priorities of program managers and service test agencies.

(7) Though supportive of DOT&E's mission and independence, program and service representatives frequently considered the time, expense, and resources expended to accommodate DOT&E concerns to be ill-advised.

(8) Several current trends may challenge DOT&E's ability to manage its workload and to influence operational test and evaluation: (a) service challenges to DOT&E's authority to require and oversee follow-on operational testing and evaluation; (b) a decline in resources available for oversight; (c) an expansion of DOT&E involvement in activities other than oversight of major acquisition programs; (d) DOT&E participation in the acquisition process as a member of working-level integrated product teams; and (e) greater integration of developmental and operational testing.

(9) These trends make it imperative that DOT&E prioritize its workload to achieve a balance between oversight of major defense acquisition programs and other initiatives important to the quality of operational test and evaluation.