ExpectMore.gov


Detailed Information on the
Test & Evaluation Programs Assessment

Program Code 10003216
Program Title Test & Evaluation Programs
Department Name Dept of Defense--Military
Agency/Bureau Name Research, Development, Test, and Evaluation
Program Type(s) Direct Federal Program
Assessment Year 2006
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 25%
Program Management 58%
Program Results/Accountability 25%
Program Funding Level
(in millions)
FY2007 $2,974
FY2008 $3,060
FY2009 $3,147
*Note: funding shown for a program may be less than the actual program amount in one or more years because part of the program's funding was assessed and shown in other PART(s).

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Develop a comprehensive portfolio of notional composite performance measures (i.e., fairly specific measurement types) that reflect the general health of the Department of Defense's (DoD's) test ranges and other test infrastructure. Use the output of this improvement element as input to an effort to develop a complete set of performance measures. These measures will be published in future biennial Strategic Plans for DoD Test Resources (2009 and every two years thereafter).

Action taken, but not completed Each DoD test range has its own unique capabilities and most often has performance measures applicable to its mission. However, DoD has not had aggregate performance measures with which to assess the health of the overall enterprise and help determine where resources on the margin might be most effectively applied. To date, the Department has identified some notional measures in the human resources area that it will use to develop specific measures. Other notional categories will follow.
2007

Develop a complete set of specific human resources performance measures and a baseline measurement that will allow a comprehensive assessment of the health of the test range workforce. Measures will likely include educational/technical training attainment, seniority, and skill sets. Measures will be published in future biennial Strategic Plans for DoD Test Resources (starting in 2009).

Action taken, but not completed The establishment of this set of measures is the most advanced of the measurement categories.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term/Annual Output

Measure: Long-term investment programs are balanced to the Strategic Plan.


Explanation: The TRMC Annual Budget Certification Report to Congress defines "balance" and documents the findings for each fiscal year. Details of each year's actual rating can be found in that year's certification report, as the form below does not allow a full explanation. Balanced means that more than 80 percent of the needs contained in the Strategic Plan and associated Addendum are addressed by test capability investment programs in the proposed T&E budgets. Balanced but improvement needed means that between 50 and 80 percent of those needs are addressed. Not Balanced means that less than 50 percent of those needs are addressed.

Year Target Actual
2006 Balanced Balanced
2008 Balanced Avail. Jan. 2009
2009 Balanced
2011 Balanced
2013 Balanced
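
The rating thresholds above reduce to a simple percentage classification. The short Python sketch below illustrates that mapping only; the function name, inputs, and example figures are illustrative assumptions and are not drawn from the TRMC certification reports.

def balance_rating(needs_addressed, total_needs):
    """Classify a proposed T&E budget against Strategic Plan needs.

    Thresholds follow the definitions in the measure explanation above:
    more than 80 percent of needs addressed -> "Balanced"
    50 to 80 percent of needs addressed     -> "Balanced but improvement needed"
    less than 50 percent of needs addressed -> "Not Balanced"
    """
    if total_needs <= 0:
        raise ValueError("total_needs must be positive")
    percent = 100.0 * needs_addressed / total_needs
    if percent > 80:
        return "Balanced"
    if percent >= 50:
        return "Balanced but improvement needed"
    return "Not Balanced"

# Hypothetical example: 42 of 50 Strategic Plan needs addressed (84 percent).
print(balance_rating(42, 50))  # -> "Balanced"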

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of these programs is to exercise, test and evaluate the performance of weapons and supporting systems during system development and prior to purchase and fielding, against pre-established standards, in order to ensure that fielded systems operate effectively in their expected threat environment.

Evidence: 10 USC 139 establishes the position of Director, Operational Test and Evaluation and the responsibilities of that office; 10 USC 2366 establishes requirements for survivability testing prior to purchase of weapon systems; 10 USC 2399 establishes requirements for operational test and evaluation before full-scale production and purchase; PL 107-314, Secs. 231-235, establishes the Defense Test Resource Management Center and its Director.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The Nation needs weapon and supporting systems that allow us to dissuade potential opponents from undertaking programs that would challenge crucial U.S. interests, deter persistent opponents from attacking U.S. forces, and defeat them decisively if they do attack. The Department of Defense's acquisition program is focused on ensuring that U.S. military forces have access to the weapons and supporting systems needed to perform those functions efficiently and effectively. The Test and Evaluation program, which tests those systems to determine performance, ranges from "developmental test," when the tested systems are evolving relatively rapidly and the weapon system development community is closely involved, to the "operational test" phase, when the system must measure up to carefully selected final performance standards, assessed by arm's-length testers and overseers, prior to purchase.

Evidence: The Quadrennial Defense Review, 2005, sets the requirements for the Department to (1) Defend the Homeland, (2) Prevail in the War on Terror and Conduct Irregular Operations, and (3) Conduct and Win Conventional Campaigns; 10 USC Chap. 137 establishes the Defense procurement system, which supports the Department's ability to fulfill those requirements; DoD Directive 5000.1, May 12, 2003 ("The Defense Acquisition System") and DoD Instruction 5000.2, May 12, 2003 ("Operation of the Defense Acquisition System") establish how the DoD acquisition system is to work as a whole, including the test and evaluation process, relying on the authorities provided by 10 USC Chap. 137; 10 USC Secs. 139, 2366, and 2399 state some of the requirements of the final test process that ensure that the weapons and supporting systems will work as intended.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The military test and evaluation program is a Department of Defense responsibility. It is carried out by Federal employees and, in some cases, by private sector employees under Federal control and supervision. (The exception to this DoD assignment is responsibility for nuclear weapon "tests" (non-explosive simulations and investigations), which are conducted by the Department of Energy.) The responsibilities are split up as described in question 1.4, below. There is no commercial or other non-DoD Federal analog to these responsibilities. The Department, using the periodic Base Realignment and Closure (BRAC) process and a Strategic Plan for T&E Resources, works to ensure that there are no unwarranted redundancies within DoD's T&E infrastructure. The Department also provides various T&E services under agreements with other Federal agencies to assist in their T&E needs when they have no suitable T&E facilities.

Evidence: The Strategic Plan for DoD Test and Evaluation (T&E) Resources sets out a Department-wide plan for the non-duplicative use of test resources; the NASA/DoD National Aeronautical Test Alliance aims to avoid duplication between large-scale DoD and NASA wind tunnel facilities; and an Interagency Agreement with the Department of Justice (later amended to include the Department of Homeland Security) on the use of Dugway Proving Ground aims to avoid duplication of facilities between DoD and DHS.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The early stages of the T&E process are managed by the military services during development of the candidate weapon and supporting systems. Through all stages, the primary test range resources are provided by the military services, the system-specific test funds are provided by the system development customer (usually the military Service), and resource oversight is provided by the Test Resource Management Center (TRMC). The final stage is overseen by the independent Director of Operational Test and Evaluation (DOT&E), who provides the coordination and strong oversight for the crucial operational test (test-before-buy) process. Regular program assessments and infrastructure reviews ensure communication among the stakeholders and the planning needed for the use of major resources. There is no strong evidence that another approach would be more efficient or effective.

Evidence: The OT&E authorization is found at 10 USC 139, 10 USC 2399. A Memorandum of Understanding (MOU) between OT&E and TRMC delineates the transition of oversight responsibility between the two organizations. The Strategic Plan for DoD Test and Evaluation Resources lays out the requirements for test resource investments.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: There are three integrated efforts to ensure that DoD T&E resources address the intended purpose and reach the intended beneficiaries. 1) DoD T&E resources are identified to unique T&E activities in individual program funding elements within each military Service and Agency during the budget planning process. The activities report funding and workload projections to the TRMC and DOT&E via Major Range and Test Facility Base (MRTFB) Budget Exhibits, as required by the DoD Financial Management Regulation (FMR). 2) Customer charge policies are specifically outlined in the FMR. These policies outline resourcing responsibilities between the T&E activity (the facility) and the programs under test. 3) The TRMC analyzes the MRTFB Budget Exhibits, taking into account the current resource levels together with workload projections and investment requirements, and provides an annual Budget Certification Report to Congress. This report summarizes TRMC's certification (or non-certification) of the Services' and Defense Agencies' approved and proposed T&E budgets against the Department's requirements. TRMC's certification accounts for resource balance against the requirements of the T&E strategic plan, as well as adequacy of the resources to support operation, sustainment, and improvement of the DoD T&E activities in support of the specific weapon or supporting system being tested. The beneficiaries of the program are the paying customers as well as the Secretary of Defense (as proxy for the public), who is served by the Director, Operational Test and Evaluation (DOT&E) in his role as final test evaluator. The Director's independent, arm's-length assessment provides an objective determination that the tested system's performance in a realistic threat environment is well understood and that potential weaknesses are noted.

Evidence: The DoD Financial Management Regulation 7000.14-R establishes the charge policies and financial requirements; the Annual Report to Congress of the Director, Test Resource Management Center, provides the final annual report on the program and its resource adequacy; the Test and Evaluation Budget Certification for FYs 2006 and 2007 (per 10 USC Sec. 196) to the Secretary of Defense certifies that the various programs' resources are adequate and balanced to support the planned test programs.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program does not currently have specific long-term performance measures, but will work to develop useful measures. The TRMC effort, which oversees the military Services' and Agencies' allocation of resources to the test infrastructure, is new, and the next iteration of measures will not be finished until about September 2007. In the meantime, the TRMC is developing measures of the number of solutions (test capabilities) for critical test needs that are delivered on time, that is, ready for use for scheduled tests, and will work with the Services to develop measures that can be used to assess T&E process performance and monitor program health.

Evidence: The Strategic Plan for DoD Test and Evaluation Resources documents current activities and in the future will include long-term performance measures.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Because the long-term performance measures are still under development, ambitious targets and timeframes for those measures are not yet available.

Evidence: The Strategic Plan for DoD Test and Evaluation Resources documents current capabilities and plans and is the report in which future targets and timeframes will be documented.

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: Because the long-term performance measures are still under development, specific annual performance measures are not yet available.

Evidence: The Strategic Plan for DoD Test and Evaluation Resources documents current capabilities and it, along with the annual addendum, will be the report in which future annual performance measures will be documented.

NO 0%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Because the performance measures are still under development, baselines and ambitious targets for the annual measures are not yet available.

Evidence: The Strategic Plan for DoD Test and Evaluation Resources documents current capabilities and will be the report in which baselines and targets will be documented.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Because long-term and annual goals are still under development, the partners (Military Services, Defense Agencies, contractors) cannot yet commit to and work toward a common set of quantifiable goals. However, it should be noted that the various organizations generally work together well to allocate resources to upcoming needs and to perform needed tests.

Evidence:

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The Director, Operational Test and Evaluation, resides at the end of the DoD test chain and fills a statutory role as the final, objective, arm's-length evaluator of weapon and supporting systems that are the product of the DoD development system. The standards for DoD tests are documented in the Test and Evaluation Master Plan for the system under test. Thus the DOT&E is the independent check on the system under review, and the test results are reported in DOT&E's annual report to the Secretary and the Congress. In a broader sense, DOT&E effectively provides a check on the performance of the T&E process as a whole, and the effectiveness of this broader check is due, in part, to publication of its annual report. This office and its functions meet the tests of scope, quality, and independence for those systems. The DOT&E itself, as well as the overall system, has been evaluated by the Government Accountability Office (GAO NSIAD-98-22), and that review, when combined with the previous oversight mechanisms, meets the standards of scope, quality, and independence.

Evidence: The Test and Evaluation Master Plans establish the objective basis upon which future operational tests will be conducted. DoD Directive 5105.71 outlines the role of TRMC in oversight of the DoD T&E program. The DOT&E Annual Report contains the overall assessment by DOT&E. GAO Report NSIAD 98-22 ("Test and Evaluation: Impact of DoD's Office of the Director of Operational Test and Evaluation") contains GAO's assessment of the role and value of the DOT&E office.

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Because the T&E program does not yet have annual and long-term performance measures, it does not have those as bases on which to build or link its budget needs. That is not to say, however, that budget requests are subjectively determined or are not determined in a rational manner. Budget requests are determined by reference to needs for documented upcoming tests as well as needs to maintain the T&E infrastructure to support future test capabilities. Costs for infrastructure are provided through the military services and TRMC. Variable costs associated with individual tests are provided by the user programs. The DOT&E budget supports the cost of evaluating the results of operational testing and live fire testing.

Evidence: Strategic Plan for DoD Test and Evaluation Resources, Strategic Plan addendum, Report on Test and Evaluation Budget Certification, DoD Financial Management Regulation 7000.14-R, 10 USC 139.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: TRMC, as the lead organization that plans for infrastructure adequacy and assesses the adequacy of Service and Agency budgets to support planned tests, is the key to this answer. TRMC is a relatively new organization and will provide its second Strategic Plan in about September 2007. The Strategic Plan already shows progress in bridging identified gaps. A Strategic Planning Process Improvement Team exists within TRMC to consider and develop improvements to the planning process.

Evidence: The Strategic Plan for DoD Test and Evaluation Resources documents the plans for future improvements, with the Strategic Plan Annual Addendum providing annual updates.

YES 12%
Section 2 - Strategic Planning Score 25%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Test and Evaluation program performance includes performance related to test resource use as well as evaluation activities. DoD collects data on resource use for almost all of its T&E efforts. The responsibility for this requirement resides with the military services, which supply the test infrastructure used in the tests, as well as with TRMC, which oversees infrastructure programming and planning. However, at this time the collected performance data, including resource use data, vary widely in scope, and their usefulness for improving the application of test resources varies as well. Long-term performance measures are under development. Evaluation activities are difficult to measure and currently lack performance information.

Evidence:

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Because performance measures are still under development, managers and partners can't be held accountable for achievement of specific performance results. Those responsible for achieving key T&E program results include--in addition to the program officials of the systems under test--the Service test range managers, the developmental and operational test and evaluation organizations within the military services, and the Director of Operational Test and Evaluation. (The acquisition program managers have much responsibility for the smooth functioning of the test facilities they use, but are largely outside of the scope of this review.) The Director, TRMC, provides the oversight of the test resources and the directors of the tests (developmental through operational test) provide the management or oversight of the tests.

Evidence: The Strategic Plan sets out the resource needs; DoD Directive 3200.11 establishes responsibilities for oversight and management of the DoD test ranges and facilities; 10 USC 139 establishes responsibilities of the Director, Operational Test and Evaluation; and 10 USC 196 establishes responsibilities of the Director, TRMC. The program managers for major development programs are accountable for meeting the cost, schedule, and performance goals of their programs as they proceed through the Defense Acquisition System (DoD Directive 5000.1 and DoD Instruction 5000.2), and breaches of so-called "Nunn-McCurdy" cost thresholds must be reported to Congress (10 USC 2433). However, we are awaiting evidence of follow-through (contract requirements for contracted support, performance metrics for test managers, etc.).

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: As mostly reimbursable entities, the T&E activities are responsible for timely execution of both their direct budget lines and those of their customers. The majority of the funds support labor hours and materials and, as such, the activities typically do not have problems executing funds. Site assessments began in FY06 to ensure that major T&E activities are appropriately spending both the reimbursable and direct budgets according to the intended purpose and applicable charge policies. Major Range and Test Facility Base (MRTFB) budget exhibits provide a feedback loop on planned versus actual expenditures at each major T&E activity. Program managers of both the systems under test and the test activities (ranges) are held accountable for the timely obligation and expenditure of their funds. Programs that fail to obligate their funds in a timely fashion lose those funds to programs with more urgent funding needs. DoD monitors obligation and expenditure rates closely and adjusts funding accordingly. Program awards are reported promptly. The funding display shown above for the program shows the budgets supporting test infrastructure (TRMC and military Service infrastructure funding) and evaluation of operational testing (DOT&E), but not the costs levied against the development programs seeking the tests.

Evidence: DoD Financial Management Regulation 7000.14-R sets the standards and DoD 1102 forms track obligations and expenditures.

YES 14%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: All test infrastructure investment proposals are required to include an analysis of the associated payoffs. If the proposal addresses an improvement to an existing capability, a payback date and the schedule for retirement of existing equipment that will become obsolete as a result of the upgrade must be provided. A discussion of alternatives and cross-service opportunities must also be included. This approach ties long-range investment planning to programming for the operation of the T&E activity. Each test range is responsible for measuring its own efficiencies. Some of the test ranges have fairly repeatable activities and have established effective measures of performance and efficiency. Others lag.

Evidence: DoD Financial Management Regulation 7000.14-R.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: DoD has memoranda of understanding with other agencies that specify how they will work together to use single test installations to meet user needs, rather than duplicating them across the government. For example, NASA makes frequent use of the White Sands Missile Range, a test range owned and operated by the Army. Correspondingly, DoD has made use of unique NASA facilities and people, particularly for rotary-wing aircraft testing. In addition, commercial firms sometimes use the DoD test infrastructure when adequate commercial facilities are not available and such tests do not conflict with DoD access. For example, some commercial aerospace companies use DoD aeronautics facilities for selected tests, on a non-interference basis with DoD customers. Likewise, DoD developmental tests often use commercial test facilities. There is little interaction with state or local programs.

Evidence: Interagency memoranda of agreement (MOAs) or memoranda of understanding (MOUs) with civilian agencies (e.g., DoD-NASA MOU, June 22, 2006, on Aeronautics Research) and related cross-agency tests; contract test activities with aerospace companies.

YES 15%
3.6

Does the program use strong financial management practices?

Explanation: Although the various test activities use the financial systems maintained by their parent Services, TRMC, which oversees the vast majority of the test infrastructure, has instituted a rigorous audit process to ensure that the test activities under its purview maintain tight control of and accountability for the assets under their control. Because of TRMC's statutory requirement to certify the T&E budgets, coupled with a requirement to assure adherence to a new test facility charge policy, the TRMC initiated on-site financial management assessments. A contractor with expertise in government financial management and audits assesses the T&E activities and provides TRMC a statement of assurance as to compliance with the Financial Management Regulation. The site assessments, which began in FY06, provide confidence that major T&E activities are indeed using the strong financial management practices required of reimbursable activities. Because the test ranges are reimbursable activities, they are accountable not only for their own budgets but also for the funds supplied by their customers. Thus, the T&E activities track funds execution, inventory, work hours, etc., at the individual transaction level, aggregate the information, and pass it to the higher-level DoD systems. After each site assessment is completed, a report is prepared that documents the site-specific internal processes and procedures along with an assessment of internal controls. All six assessments completed to date have been rated "green," with "strong internal controls" observed. TRMC's plan is to conduct assessments at six T&E sites per year on a continual basis.

Evidence: The TRMC Budget Certification Report documents TRMC's analysis of the adequacy and balance of DoD Service and Component budgets to support planned T&E activities. Operating budgets are analyzed for workload requirements identified in the major test range budget exhibits, as required by the DoD Financial Management Regulation (FMR) 7000.14-R. Furthermore, the Office of the Secretary and each of the military Services have documented and implemented recent changes in charge policies stated in DoD Financial Management Regulation 7000.14-R that establish uniform user charges for test activities (AF Updated Guidance MRTFB Charging Policy, Army MRTFB Charge Policy, DTRMC Guidance (Clarification of MRTFB Charge Policy), Navy Direct Product Accounts Policy, Navy MRTFB Guidance). Sites reviewed for charge policy implementation via the Compliance Assessment process include Arnold Engineering and Development Center (Air Force), Aberdeen Test Center (Army), Naval Air Warfare Center Aircraft Division (Navy), 30th Space Wing (Air Force), the Atlantic Undersea Test and Evaluation Center (Navy), and White Sands Missile Range (Army).

YES 14%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The TRMC was established by the 2003 National Defense Authorization Act. It is a new organization, created specifically to coordinate the various DoD test infrastructure elements. Examples of deficiencies noted and addressed by the new organization include: 1) Production of a Strategic Plan for DoD T&E Resources, which is the Department's first overarching look at requirements and resources across the Department, as opposed to strictly within component lines of authority. 2) Creation of a Strategic Planning Process Improvement Team. To improve planning, a working group was created to explore and select options and to integrate and coordinate ideas to improve the Strategic Plan's utility as a management tool. 3) Production of a Budget Certification Report, which requires comparison of test range funding needs with budgeted funds. 4) Addition of site assessments at major T&E sites to ensure common DoD-wide application of a revised charge policy, which requires that the various test range owners treat their users the same regardless of which DoD organization is using the facilities. 5) Revision of the DoD Major Range and Test Facility Base (MRTFB) budget exhibit requirements to increase transparency of financial transactions. In addition, the Department assisted the recent Base Realignment and Closure (BRAC) Commission in identifying duplicative or inefficiently utilized infrastructure, including test and evaluation infrastructure. The BRAC process realigned certain DoD test activities to make the infrastructure more productive. On the other hand, more remains to be done in establishing performance metrics and in ensuring that funds have been set aside in the military Services and Agencies to support all activities projected in the Test and Evaluation Master Plans.

Evidence: Strategic Plan for DoD T&E Resources, Budget Certification Report, DoD Financial Management Regulation 7000.14-R, Base Realignment and Closure process and results.

YES 15%
Section 3 - Program Management Score 58%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Because the long-term performance measures are still under development, demonstration of progress is not yet achievable.

Evidence: No goals, as documented in 2.1.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The annual performance goals are still under development.

Evidence: No annual goals, as documented in 2.3.

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: Because many DoD test ranges don't have measures of efficiencies or cost effectiveness, they can't yet demonstrate quantitatively that they achieve improved efficiencies year by year.

Evidence:

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation:

Evidence:

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The CTEIP investment program was the subject of a recent DoD Inspector General report, D-2004-097, "The Central Test and Evaluation Investment Program," dated 30 June 2004. The report states that the "completed projects were successful because the Program process is well structured and includes stringent reviews which facilitate effective management of the projects. As a result, the DoD has achieved tangible benefits, through the CTEIP, that are demonstrated and in place on the nation's ranges and test facilities." In addition, beginning in FY06, the TRMC instituted a process by which an independent evaluation of individual T&E activities is conducted by a government-led team to evaluate each T&E activity's performance in implementing the requirements of the revised FMR (DoD 7000.14-R) charge policy. Six site visits have been accomplished to date; four indicated proper compliance, and two ended in findings of non-compliance due to issues with higher-level guidance (i.e., not controlled at the site). On-site feedback to the non-compliant sites and required follow-up actions corrected the issues. GAO reports have indicated that the DOT&E is performing its function effectively and is achieving desired results.

Evidence: DoD Inspector General Report D-2004-097, "The Central Test and Evaluation Investment Program," dated 30 June 2004; DoD FMR 7000.14-R; Budget Certification Report; CTEIP program; GAO NSIAD-98-22, "Test and Evaluation: Impact of DOD's Office of the Director of Operational Test and Evaluation."

YES 25%
Section 4 - Program Results/Accountability Score 25%


Last updated: 09062008.2006SPR