This is the accessible text file for GAO report number GAO-09-338 
entitled 'Defense Acquisitions: Production and Fielding of Missile 
Defense Components Continue with Less Testing and Validation Than 
Planned' which was released on March 16, 2009.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Committees: 

United States Government Accountability Office: 
GAO: 

March 2009: 

Defense Acquisitions: 

Production and Fielding of Missile Defense Components Continue with 
Less Testing and Validation Than Planned: 

GAO-09-338: 

GAO Highlights: 

Highlights of GAO-09-338, a report to Congressional Committees. 

Why GAO Did This Study: 

The Missile Defense Agency (MDA) has spent about $56 billion and will 
spend about $50 billion more through 2013 to develop a Ballistic 
Missile Defense System (BMDS). GAO was directed to assess the annual 
progress MDA made in developing the BMDS as well as improvements in 
accountability and transparency in agency operations, management 
processes, and the new block strategy. To accomplish this, GAO reviewed 
contractor cost, schedule, and performance; tests completed; and the 
assets fielded during 2008. GAO also reviewed pertinent sections of the 
U.S. Code, acquisition policy, and the activities of the new Missile 
Defense Executive Board (MDEB). An appendix on the effect the 
cancellation of a Ground-based Midcourse Defense flight test (FTG-04) 
had on BMDS development is also included. 

What GAO Found: 

Cost: 
MDA has not yet established baselines for total costs or unit costs, 
both fundamental markers most programs use to measure progress. 
Consequently, for the sixth year, GAO has not been able to assess MDA’s 
actual costs against a baseline of either total costs or unit costs. 
MDA planned to establish such baselines in 2008 in response to past GAO 
recommendations, but has delayed this until 2009. GAO was able to 
assess the cost performance on individual contracts, and project an 
overrun at completion of between $2 billion and $3 billion. However, 
because in some cases the budgeted costs at completion—the basis for 
our projection—have changed significantly over time as adjustments were 
made, this projection does not capture as cost growth the difference 
between the original and current budgeted costs at completion. In one 
case, these costs increased to approximately five times their original 
value. 

Performance and Testing: 
While MDA completed several key tests that demonstrated enhanced 
performance of the BMDS, all elements of the system had test delays and 
shortfalls. Overall, testing achieved less than planned. For example, 
none of the six Director’s test knowledge points established by MDA for 
2008 were achieved. Poor performing target missiles have been a 
persistent problem. Testing shortfalls have slowed the validation of 
models and simulations, which are needed to assess the system’s overall 
performance. Consequently, the performance of the BMDS as a whole 
cannot yet be determined. 

Schedule: 
Although fewer tests have been conducted than planned, the production 
and fielding of assets have proceeded closer to schedule. While no 
ground-based interceptors were delivered, radars, standard missiles, 
and software were delivered as planned. However, some 
deliveries, such as enhanced Exoatmospheric Kill Vehicles, will now 
precede test results. In most cases, MDA has also reduced the bases it 
planned to use to declare when capabilities are operational in the 
field. Thus, fielding decisions are being made with a reduced 
understanding of system effectiveness. 

Transparency, Accountability, and Oversight: 
Improvement in this area has been limited. The Missile Defense 
Executive Board (MDEB) has acted 
with increased authority in providing oversight of MDA and the BMDS. 
However, transparency into and accountability for MDA's work are 
limited by the management fluidity afforded through the lack of cost 
baselines, an 
unstable test baseline, continued use of development funds to produce 
assets for fielding, and renewed potential for transferring work from 
one predefined block to another. A better balance must still be struck 
between the information Congress and the Department of Defense need to 
conduct oversight of the BMDS and the flexibility MDA needs to manage 
across the portfolio of assets that collectively constitute the 
system’s capability. At this point, the balance does not provide 
sufficient information for effective oversight. 

What GAO Recommends: 

GAO recommends that the MDEB assess how the transparency and 
accountability of MDA acquisitions can be strengthened without losing 
the benefits of MDA’s existing flexibilities. Meanwhile, MDA should 
improve its cost and test baselines; tie modeling and simulation needs 
into test objectives; provide more time to analyze tests; better 
coordinate with independent testers; synchronize development, 
manufacturing, and fielding with testing and validation; complete a key 
developmental test; and strengthen the basis for capability 
declarations. DOD agreed with 10 of the 11 recommendations and 
partially agreed with one. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/products/GAO-09-338]. For more 
information, contact Paul Francis at (202) 512-4841 or 
francisp@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Cost Tracking Deficiencies Hinder Assessment of Cost Performance: 

While Some Tests Succeeded, Others Were Deferred; Overall System 
Performance Cannot Yet Be Assessed: 

Production, Fielding, and Declaration of Capabilities Proceed despite 
Delays in Testing and Assessments: 

Production and Fielding of BMDS Systems Getting Ahead of Testing: 

Limited Progress Made in Improving Transparency and Accountability: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Comments from the Department of Defense: 

Appendix II: BMDS Prime Contractors Exceed Budgeted Cost and Schedule 
Performance during Fiscal Year 2008: 

Aegis BMD Contractors Experienced Mixed Performance during the Fiscal 
Year: 

ABL Contractor Overran Budgeted Fiscal Year Cost: 

C2BMC Program Incurred Negative Cumulative and Fiscal Year Variances: 

GMD Contractor Maintained Negative Cumulative Cost and Schedule 
Variances throughout the Fiscal Year: 

KEI Cost and Schedule Performance Continued to Decline after Replan: 

Limited Contractor Data Prevented Analysis of All MKV Task Orders: 

Sensors' Radar Experienced Fiscal Year Cost and Schedule Growth: 

Technical Issues Drove STSS Cost Growth during the Fiscal Year: 

Targets and Countermeasures Program's Rebaseline Positively Affected 
Fiscal Year Schedule Variances: 

THAAD Contractor Spent More Money and Time Than Budgeted: 

Appendix III: FTG-04 Flight Test Cancellation: 

Faulty Telemetry Component Caused Delay and Subsequent Cancellation of 
FTG-04: 

Most FTG-04 Test Objectives Will Be Allocated to Follow-on Tests: 

Cancellation Eliminates One of Few Opportunities to Demonstrate GMD 
Capabilities: 

Conclusions: 

Appendix IV: Reduced Basis for Capability Declarations: 

Appendix V: Scope and Methodology: 

Appendix VI: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: MDA BMDS Elements: 

Table 2: MDA Block Construct: 

Table 3: Fiscal Year 2008 Capability Goals for Blocks 1.0, 2.0, and 
3.1/3.2: 

Table 4: Analysis of Contractor Realignments from Contract Start 
through Fiscal Year 2008: 

Table 5: Prime Contractor Fiscal Year 2008 and Cumulative Cost and 
Schedule Performance: 

Table 6: Test and Targets Issues: 

Table 7: Status of Fiscal Year 2008 Director's Test Knowledge Points: 

Table 8: BMDS Deliveries and Total Fielded Assets as of September 30, 
2008: 

Table 9: MDA BMDS Test Baseline Revisions: 

Table 10: Timeline of Events: 

Table 11: Engagement Sequence Groups with Revised Basis for Fiscal Year 
2008 Capability Declarations: 

Table 12: Block 1.0 Engagement Sequence Groups with Revised Basis for 
Completion at End of Fiscal Year 2009: 

Figures: 

Figure 1: Estimated Percentage of Total BMDS Block and Capability 
Development Funds through Fiscal Year 2013 Expected to Be Baselined in 
2009: 

Figure 2: Difference in Traditional Unit Cost Reporting and MDA's Unit 
Cost Reporting: 

Figure 3: GMD Reduction in Flight Tests from January 2006 to March 
2010: 

Figure 4: GMD Flight Test and Fielding Plan for Interceptors 
Comparison--September 2006 versus January 2009: 

Figure 5: Timeline Showing Declaration of Capabilities in Fiscal Year 
2008: 

Figure 6: Timeline Showing Deferred Declaration of Capabilities from 
Fiscal Year 2008 to 2009: 

Figure 7: Aegis BMD Weapon System Fiscal Year 2008 Cost and Schedule 
Performance: 

Figure 8: Aegis BMD SM-3 CLIN 1 Fiscal Year 2008 Cost and Schedule 
Performance: 

Figure 9: ABL Fiscal Year 2008 Cost and Schedule Performance: 

Figure 10: C2BMC Fiscal Year 2008 Cost and Schedule Performance: 

Figure 11: GMD Fiscal Year 2008 Cost and Schedule Performance: 

Figure 12: KEI Fiscal Year 2008 Cost and Schedule Performance: 

Figure 13: MKV Task Order 6 Fiscal Year 2008 Cost and Schedule 
Performance: 

Figure 14: MKV Task Order 7 Fiscal Year 2008 Cost and Schedule 
Performance: 

Figure 15: MKV Task Order 8 Fiscal Year 2008 Cost and Schedule 
Performance: 

Figure 16: Sensors Fiscal Year 2008 Cost and Schedule Performance: 

Figure 17: STSS Fiscal Year 2008 Cost and Schedule Performance: 

Figure 18: Targets and Countermeasures Fiscal Year 2008 Cost and 
Schedule Performance: 

Figure 19: THAAD Fiscal Year 2008 Cost and Schedule Performance: 

Abbreviations: 

ABL: Airborne Laser: 

Aegis BMD: Aegis Ballistic Missile Defense: 

BMDS: Ballistic Missile Defense System: 

C2BMC: Command, Control, Battle Management, and Communications: 

CE: Capability Enhancement: 

CLIN: Contract Line Item Number: 

DOD: Department of Defense: 

DOT&E: Director, Operational Test and Evaluation: 

EKV: Exoatmospheric Kill Vehicle: 

FTF: Flexible Target Family: 

GBI: Ground-based Interceptor: 

GMD: Ground-based Midcourse Defense: 

KEI: Kinetic Energy Interceptor: 

MDA: Missile Defense Agency: 

MDEB: Missile Defense Executive Board: 

MKV: Multiple Kill Vehicle: 

PCME: Pulse Code Modulation Encoder: 

SAR: Selected Acquisition Report: 

STSS: Space Tracking and Surveillance System: 

THAAD: Terminal High Altitude Area Defense: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

March 13, 2009: 

Congressional Committees: 

The Missile Defense Agency (MDA) has spent almost $56 billion since its 
initiation in 2002 on developing and fielding a Ballistic Missile 
Defense System (BMDS) and is on course to spend about $50 billion more 
over the next 5 years. In 2002, the President directed the Department 
of Defense (DOD) to "deploy a set of initial missile defense 
capabilities beginning in 2004".[Footnote 1] MDA began delivering an 
initial capability in late 2004, as directed, and deployed an initial 
capability in 2005 by concurrently developing and fielding BMDS 
assets.[Footnote 2] Though this approach facilitated the rapid 
deployment of an initial BMDS capability, as MDA has proceeded beyond 
that initial capability, it has been less successful in fostering 
adequate knowledge of system capabilities prior to manufacturing and 
fielding BMDS assets. 

In its fiscal year 2002, 2007, and 2008 National Defense Authorization 
Acts, Congress directed GAO to assess the cost, schedule, testing, and 
performance progress that MDA is making in developing the BMDS. 
[Footnote 3] We have delivered assessments covering fiscal years 2003 
through 2007.[Footnote 4] This report assesses the progress made during 
fiscal year 2008 toward BMDS goals as well as the progress MDA made in 
improving accountability and transparency through its agency 
operations, management processes, and new block strategy. This report 
also includes an appendix that addresses the Senate Armed Services 
Committee's request that we review the reasons behind and the effects 
on BMDS development of the cancellation of a Ground-based Midcourse 
Defense flight test designated FTG-04. 

To assess progress during fiscal year 2008, we examined the 
accomplishments of 10 BMDS elements that MDA is developing and 
fielding: the Aegis Ballistic Missile Defense (Aegis BMD); Airborne 
Laser (ABL); BMDS Sensors; Command, Control, Battle Management, and 
Communications (C2BMC); Ground-based Midcourse Defense (GMD); Kinetic 
Energy Interceptors (KEI); Multiple Kill Vehicles (MKV); Space Tracking 
and Surveillance System (STSS); Targets and Countermeasures; and 
Terminal High Altitude Area Defense (THAAD).[Footnote 5] These elements 
collectively account for about 80 percent of MDA's research and 
development budget. We also examined MDA's Fiscal Year 2008 Statement 
of Goals, Program Execution Reviews, test plans and reports, production 
plans, and Contract Performance Reports. We interviewed officials 
within program offices and within MDA functional directorates, such as 
the Directorate for Cost Estimating. In addition, we discussed each 
element's test program and its results with the BMDS Operational Test 
Agency and DOD's Office of the Director, Operational Test and 
Evaluation (DOT&E). 

In assessing progress made toward improving accountability and 
transparency, we held discussions with officials in MDA's Directorate 
of Business Operations to determine whether its new block structure 
improved accountability and transparency of the BMDS. In addition, we 
reviewed pertinent sections of the U.S. Code to compare MDA's current 
level of accountability with federal acquisition laws. We also 
interviewed officials from the Office of the Under Secretary of Defense 
for Acquisition, Technology and Logistics to discuss the oversight role 
of the Missile Defense Executive Board. Additionally, we reviewed the 
board's charter to determine its oversight responsibility. Our scope 
and methodology is discussed in more detail in appendix V. 

We conducted this performance audit from May 2008 to March 2009 in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

Background: 

MDA's mission is to develop an integrated and layered system to defend 
the United States and its deployed forces, friends, and allies against 
ballistic missile attacks. The BMDS aims to engage all ranges of 
ballistic missiles during all phases of flight. This challenging 
expectation requires complex coordination within an integrated system 
of defensive components--space-based sensors, surveillance and tracking 
radars, advanced interceptors, and a battle management, command, 
control, and communication component. 

A typical engagement scenario to defend against an intercontinental 
ballistic missile would occur as follows: 

* Infrared sensors aboard early-warning satellites detect the hot plume 
of a missile launch and alert the command authority of a possible 
attack. 

* Upon receiving the alert, land- or sea-based radars are directed to 
track the various objects released from the missile and, if so 
designed, to identify the warhead from among spent rocket motors, 
decoys, and debris. 

* When the trajectory of the missile's warhead has been adequately 
established, an interceptor--consisting of a kill vehicle mounted atop 
a booster--is launched to engage the threat. The interceptor boosts 
itself toward a predicted intercept point and releases the kill 
vehicle. 

* The kill vehicle uses its onboard sensors and divert thrusters to 
detect, identify, and steer itself into the warhead. With a combined 
closing speed of up to 10 kilometers per second (22,000 miles per 
hour), the warhead is destroyed above the atmosphere through a "hit to 
kill" collision with the kill vehicle. 

* Some interceptors use sensors to steer themselves into the inbound 
ballistic missile. Inside the atmosphere, these systems kill the 
ballistic missile either through a direct collision between the 
interceptor and the inbound missile or, in cases where a direct hit 
does not occur, through the combined effects of a blast fragmentation 
warhead (heat, pressure, and grains/shrapnel). 

Table 1 provides a brief description of 10 BMDS elements currently 
under development by MDA. 

Table 1: MDA BMDS Elements: 

BMDS element: Aegis Ballistic Missile Defense; 
Missile defense role: Aegis BMD is a ship-based missile defense system 
designed to destroy short- to intermediate-range ballistic missiles 
during the midcourse phase of their flight; its capability is being 
expanded to include the terminal phase of flight. Aegis BMD's mission 
is twofold: it provides an engagement capability against regional 
ballistic missile threats--the BMDS's first mobile, global, deployable, 
and proven capability that can destroy ballistic missiles both above 
and within the atmosphere--and it serves as a forward-deployed 
combatant that searches for, detects, and tracks ballistic missiles of 
all ranges and transmits track data to the BMDS, performing a strategic 
role in homeland defense. To date, 18 ships have been upgraded for the 
Aegis BMD 
mission. MDA is planning to procure 147 Aegis BMD missiles--the 
Standard Missile-3 (SM-3)--from calendar years 2004 through 2013. 

BMDS element: Airborne Laser; 
Missile defense role: ABL is an air-based missile defense system 
designed to destroy all classes of ballistic missiles during the boost 
phase of their flight. ABL employs a high-energy chemical laser to 
rupture a missile's motor casing, causing the missile to lose thrust or 
flight control. MDA plans to demonstrate proof of concept in a system 
demonstration in 2009. 

BMDS element: BMDS Sensors; 
Missile defense role: MDA is developing radars for fielding as part of 
the BMDS. The BMDS uses these sensors to identify and track ballistic 
missiles. The ultimate goal is to provide continuous tracking of 
ballistic missiles in all phases of flight and increase the probability 
for successful intercept. 

BMDS element: Command, Control, Battle Management and Communications; 
Missile defense role: C2BMC is the integrating element of the BMDS. Its 
role is to provide deliberate planning, situational awareness, sensor 
management, and battle management for the integrated BMDS. 

BMDS element: Ground-based Midcourse Defense; 
Missile defense role: GMD is a ground-based missile defense system 
designed to destroy intercontinental ballistic missiles during the 
midcourse phase of their flight. Its mission is to protect the U.S. 
homeland against ballistic missile attacks from North Korea and the 
Middle East. Currently, GMD has fielded 24 interceptors with the 
original configuration, known as Capability Enhancement-I (CE-I). GMD 
has recently begun emplacing a new configuration of the kill vehicle 
known as the Capability Enhancement-II (CE-II). This configuration was 
designed to replace obsolete parts. MDA is planning on fielding 44 
interceptors at Fort Greely, Alaska, and Vandenberg Air Force Base, 
California, by fiscal year 2011. MDA also plans to field 10 
interceptors in Europe. 

BMDS element: Kinetic Energy Interceptors; 
Missile defense role: KEI is a mobile land-based missile defense system 
designed to destroy medium, 
intermediate, and intercontinental ballistic missiles during the boost 
and midcourse phases of their flight. The agency plans to conduct the 
first booster flight test in 2009. The KEI capability could be expanded 
to sea basing in subsequent blocks. 

BMDS element: Multiple Kill Vehicle; 
Missile defense role: The MKV is being designed as a spiral improvement 
for midcourse interceptors to counter advancements in the threat. This 
approach mitigates the need to pinpoint a single lethal object in a 
threat cluster by using numerous kill vehicles to engage all objects 
that might be lethal. The system under development consists of a 
carrier vehicle housing a number of smaller kill vehicles, which would 
primarily benefit the Ground-based and Kinetic Energy interceptors as 
well as the Aegis BMD SM-3. To mitigate risk, MDA has initiated a 
parallel acquisition with a second contractor. Because MKV is in the 
technology development stage, it does not project an initial capability 
date, but the program expects that the capability could be available by 
2017. 

BMDS element: Space Tracking and Surveillance System; 
Missile defense role: STSS is designed to be a low-orbit constellation 
of space-based sensors that is able to observe targets in all phases of 
trajectory. MDA intends to launch two demonstration satellites in 2009. 
If the demonstration satellites perform successfully, MDA plans to have 
an operational capability of next-generation satellites. 

BMDS element: Targets and Countermeasures; 
Missile defense role: MDA maintains a series of targets used in BMDS 
flight tests to present authentic threat scenarios. The targets are 
designed to encompass the full spectrum of threat missile ranges and 
capabilities. In 2005, MDA began developing a new family of targets, 
the Flexible Target Family (FTF), which was to represent evolving 
threats of all ranges. However, in 2008, MDA narrowed the FTF focus to 
developing one long-range 72-inch target, the LV-2. The first launch of 
this target is scheduled for 2009. 

BMDS element: Terminal High Altitude Area Defense; 
Missile defense role: THAAD is a ground-based missile defense system 
designed to destroy short- and medium-range ballistic missiles during 
the terminal phase of flight, both inside and outside of the 
atmosphere. Its mission is to defend deployed U.S. forces and 
population centers. MDA plans to field a fire unit, which includes 24 
missiles, in 2010 and a second unit in 2011. MDA also plans to field 
two additional fire units, which include 24 missiles each, in 2012 and 
2013, respectively. 

Source: MDA data. 

[End of table] 

To manage BMDS development, MDA uses an acquisition strategy defined by 
a block structure. From its inception in 2002 through 2007, MDA 
developed BMDS capability in biennial increments, ultimately delivering 
two blocks--Block 2004 and Block 2006. These 2-year blocks each built 
on preceding blocks and enhanced the development and capability of the 
BMDS. However, in response to recommendations from GAO, in December 
2007 MDA announced a new block structure that was intended to improve 
the program's transparency, accountability, and oversight. The new 
blocks are not based on biennial time periods, but instead focus on 
fielding capabilities that address particular threats. Because the new 
block structure is not aligned to regular time periods, multiple blocks 
are underway concurrently. Table 2 details the current blocks and 
categories included in the BMDS block structure. 

Table 2: MDA Block Construct: 

Block: Block 1.0: Defend U.S. from Limited North Korean Long-Range 
Threats; 
Description: Provides an initial capability to protect the United 
States from a limited North Korean attack. It is the most mature 
capability and will be the first block delivered to the warfighter. 

Block: Block 2.0: Defend Allies and Deployed Forces from Short- to 
Medium-Range Threats in One Region/Theater; 
Description: Includes capabilities needed to defend allies and deployed 
forces from short- to medium-range threats in one region/theater. 

Block: Block 3.0: Expand Defense of the U.S. to Include Limited Iranian 
Long-Range Threats[A]; 
Description: Builds on the foundation established in Block 1.0 and 
includes capabilities needed to expand the defense of the United States 
against limited Iranian long-range threats. 

Block: Block 4.0: Defend Allies and Deployed Forces in Europe from 
Limited Iranian Long-Range Threats; 
Description: Builds on the foundation established by Blocks 1.0 and 3.0 
and includes capabilities needed to defend allies and deployed forces 
in Europe from limited Iranian long-range threats and to expand 
protection of the U.S. homeland. 

Block: Block 5.0: Expand Defense of Allies and Deployed Forces from 
Short- to Intermediate-Range Threats in Two Regions/Theaters; 
Description: Builds on the foundation established by Block 2.0 and 
includes capabilities needed to expand defense of allies and deployed 
forces from short- to intermediate-range threats in two regions/ 
theaters. 

Categories: Capability Development; 
Description: Includes BMDS elements and other development elements that 
are not baselined in the existing agency block structure, such as ABL, 
KEI, and MKV. These programs have knowledge points tailored to critical 
risks. 

Categories: Sustainment; 
Description: Funding for Contractor Logistics Support and other 
operation and support activities. 

Categories: Mission Area Investment; 
Description: Investments that cut across several blocks and cannot be 
reasonably allocated to a specific block. Examples include modeling and 
simulation and intelligence and security. 

Categories: MDA Operations; 
Description: Contains operations support functions such as MDA 
headquarters management. 

Source: MDA data. 

[A] Block 3.0 is subdivided into three sections: 3.1, 3.2, and 3.3. 
Block 3.0 will focus on more sophisticated sensors and algorithms, and 
therefore includes upgrades to the Ground-based Interceptors, sensors, 
and the C2BMC system to allow discrimination of the threat missile. MDA 
is pursuing two parallel and complementary approaches to counter 
complex countermeasures. The full implementation of this approach will 
be conducted in phases, with the first phase referred to as Near Term 
Discrimination (Block 3.1/3.2) and the second phase as Improved 
Discrimination and System Track (Block 3.3). 

[End of table] 

MDA uses a Statement of Baselines and Goals to report modifications to 
established block baselines.[Footnote 6] For those blocks that are 
currently or will soon be underway, block baselines are created to make 
a firm commitment to Congress. The Statement of Goals also includes the 
following: 

* BMDS Baseline Capabilities - Assets and engagement sequence groups 
that will be made available for fielding for a particular block. 
[Footnote 7] During 2008, cost baselines were under development for 
Block 2.0 and Block increments 3.1/3.2. MDA established schedule and 
performance baselines for Blocks 1.0, 2.0, 3.1, and 3.2 in 2008. 

* BMDS Capability Goals - Assets and engagement sequence groups 
expected to be made available for future blocks. 

* Adversary Benchmarks - Adversary missile systems used for block 
performance estimates. 

* BMDS Budget Breakdowns - Detailed fielding, development, and 
integration budgets for each block and BMDS Capability Development 
activity. 

MDA also uses an incremental declaration process to designate BMDS 
capability for its blocks. Three capability designations are applied to 
all BMDS elements, their hardware and software components, and 
engagement sequence groups. This allows these BMDS features to play a 
limited role in system operations before they have attained their 
expected level of capability. Each capability designation in the 
delivery schedule represents upgraded capacity to support the overall 
function of BMDS in its mission as well as the level of MDA confidence 
in the system's performance. The designations are defined as follows: 

* Early Capability Delivery signifies readiness for contingency use. At 
this point, MDA has determined that the capability can be utilized by 
the BMDS. When integrated, the capability must be adequately 
demonstrated to build sufficient confidence that it will safely perform 
as intended without degrading the existing capabilities of the BMDS. 

* Partial Capability Delivery is an intermediate state of maturity 
indicating that a capability has been shown through testing to perform 
as intended in certain scenarios. At this point, MDA is sufficiently 
confident that the capability can support the warfighter's partially 
mission-capable objectives and logistics support is adequate to achieve 
defensive operations. 

* Full Capability Delivery is the point at which a capability satisfies 
the BMDS block objectives and is considered to be completely mature and 
ready for full operational use. 

MDA's capability goals for fiscal year 2008 for Blocks 1.0, 2.0, 3.1, 
and 3.2 are shown in table 3. 

Table 3: Fiscal Year 2008 Capability Goals for Blocks 1.0, 2.0, and 
3.1/3.2: 

Block 1.0--Initial Defense of U.S. from North Korea Expected 
Completion: Fiscal Year 2009: 

Engagement Sequence Group: Ground-Based Interceptor (GBI) Launch on 
COBRA DANE/Upgraded Early Warning Radar; 
2008 Planned Capability Deliveries: Early. 

Engagement Sequence Group: GBI Engage on COBRA DANE/Upgraded Early 
Warning Radar; 
2008 Planned Capability Deliveries: Full. 

Engagement Sequence Group: GBI Engage on forward-based AN/TPY-2 radar; 
2008 Planned Capability Deliveries: Full. 

Engagement Sequence Group: GBI Engage on Sea-based X-band radar; 
2008 Planned Capability Deliveries: Early. 

Engagement Sequence Group: GBI Launch on Sea-based X-band radar; 
2008 Planned Capability Deliveries: Early. 

Block 2.0--Initial Defense of Allied Forces Expected Completion: Fiscal 
Year 2011: 

Engagement Sequence Group: SM-2 Engage on shipboard Aegis radar (AN/ 
SPY-1); 
2008 Planned Capability Deliveries: Early. 

Engagement Sequence Group: SM-3 Engage on shipboard Aegis radar (AN/ 
SPY-1); 
2008 Planned Capability Deliveries: Full. 

Engagement Sequence Group: SM-3 Launch on remote on shipboard Aegis 
radar (AN/SPY-1); 
2008 Planned Capability Deliveries: Early. 

Engagement Sequence Group: THAAD Interceptor Engage on AN/TPY-2 radar 
in the terminal mode; 
2008 Planned Capability Deliveries: Early. 

Block 3.1/3.2--Initial Defense of U.S. from Iran Expected Completion: 
Fiscal Year 2013: 

Engagement Sequence Group: GBI Engage on COBRA DANE/Upgraded Early 
Warning Radar Mod 1 (Fylingdales, UK; Forward-Based mobile radar (AN/ 
TPY-2)); 
2008 Planned Capability Deliveries: Partial. 

Engagement Sequence Group: GBI Launch on shipboard Aegis radar Mod 1 
(Fylingdales, UK; Sea-based X-band radar); 
2008 Planned Capability Deliveries: Early. 

Source: GAO analysis of MDA data. 

Note: In addition to the engagement sequence groups listed above, as of 
October 2007 MDA had planned to declare the capability of several more 
engagement sequence groups in fiscal year 2008. However, these were 
excluded from the February 2008 Statement of Goals. MDA continues to 
work toward declaring these additional engagement sequence groups. 

[End of table] 

Cost Tracking Deficiencies Hinder Assessment of Cost Performance: 

MDA has not yet established baselines for total costs or unit costs, 
both fundamental markers that most programs use to measure progress. 
MDA had planned to establish total cost baselines at the element and 
block levels in 2008, but the initial set of total cost baselines will 
not be available until the spring of 2009. Similarly, MDA has not 
established unit costs for selected assets, such as GBIs. Consequently, 
for the sixth year, we have been unable to assess MDA's overall 
progress on total or unit cost. While MDA plans to establish some total 
cost baselines in 2009, most efforts will not be captured in a 
baseline. MDA also plans to establish unit costs, another improvement, 
but is considering a narrower definition of unit cost than is used by 
other weapon system programs. MDA's definition will report a subset of 
procurement costs called flyaway costs, which includes only the major 
piece of equipment and excludes all research and development as well as 
some procurement costs--those for support equipment and spares. 
Moreover, these unit costs will only be tracked within those blocks 
that are baselined, which will represent a minority of those assets 
being produced and fielded. Without total cost baselines in place, the 
BMDS concept continually evolves, as indicated by the number of 
realignments to the program of work at the individual contract level. 
While the changing nature of the BMDS and the lack of total cost 
baselines preclude analysis of total cost progress, we were able to 
analyze contractor fiscal year performance on the current work under 
the contract. We were also able to project overruns or underruns at 
completion for BMDS contracts using the contracts' current budgeted 
costs at completion as a basis for our projections. However, in some 
cases, the current budgeted cost at completion changed significantly 
over time. In one case, the budgeted cost at completion grew to 
approximately five times its original value. Our analysis of fiscal 
year 2008 progress shows that several prime contractors exceeded 
budgeted costs. 

Absence of Cost Baselines Prevents Assessment of System-Level Costs: 

To provide accountability, major defense acquisition programs are 
required by statute to document program goals in an acquisition program 
baseline.[Footnote 8] MDA is not yet required to establish an 
acquisition program baseline because of the acquisition flexibilities 
it has been granted. However, Congress has enacted legislation 
requiring MDA to establish some baselines.[Footnote 9] Baselines impose 
an important discipline, both by ensuring that the full cost commitment 
is considered before embarking on major development efforts and by 
identifying cost growth as a program proceeds. Since we began annual 
reporting on missile defense in 2004, we have been unable to assess 
overall progress on cost--that is, comparing BMDS baselined costs with 
actual costs. For example, under the prior block structure, we reported 
that BMDS costs grew by at least $1 billion, but the total cost growth 
could not be determined because MDA did not account for all costs for a 
given block. 

In response to recommendations we made in March 2008, MDA agreed to 
develop cost estimates and to provide independent verification for 
blocks outlined under its new approach. Upon conclusion of those 
estimates, MDA will develop cost baselines. In addition, on April 1, 
2008, the Director, MDA, testified before a Senate Armed Services 
Subcommittee that cost baselines for the new block structure would be 
available by the end of 2008. As of January 2009, the agency had not 
yet developed full cost baselines for any blocks. MDA plans to have 
these cost baselines for Blocks 2.0, 3.1, and 3.2 completed, 
independently reviewed by DOD's Cost Analysis Improvement Group, and 
released by the spring of 2009. The only information that was available 
for this report was limited to budget projections for BMDS blocks and 
capability development for fiscal years 2008 through 2013, totaling 
approximately $42 billion. 

Even with the release of some block cost estimates in 2009, not all 
block costs will be baselined, and no date has been established for 
when the remaining block costs will be baselined. MDA does not plan to 
baseline Block 1.0 costs but will provide the actual costs of the 
block, since it is near completion. Full cost baselines for Block 2.0 
are anticipated to be available in the spring of 2009, but only 
portions of Block 3.0 and none of the Block 4.0 or 5.0 costs will be 
baselined at this time. As figure 1 shows, if MDA does complete 
baselines as planned, they will only cover about 26 percent of its 
block and capability development costs. 

Figure 1: Estimated Percentage of Total BMDS Block and Capability 
Development Funds through Fiscal Year 2013 Expected to Be Baselined in 
2009: 

[Refer to PDF for image: vertical bar graph] 

Fiscal year: 2010; 
Percentage of funding baselined (Blocks 2.0, 3.1, and 3.2): 26.26%. 

Fiscal year: 2011; 
Percentage of funding baselined (Blocks 2.0, 3.1, and 3.2): 5.65%. 

Fiscal year: 2012; 
Percentage of funding baselined (Blocks 2.0, 3.1, and 3.2): 3.36%. 

Fiscal year: 2013; 
Percentage of funding baselined (Blocks 2.0, 3.1, and 3.2): 2.26%. 

Source: GAO analysis of MDA’s Fiscal Year 2009 Budget Estimate 
Submission and January 2008 Statement of Goals. 

Note: Analysis is based on MDA's fiscal year 2009 projected funding 
through fiscal year 2013 from the February 2008 request. Funding 
includes defense-wide resources projected for MDA. 

[End of figure] 

At this point, MDA plans to baseline between 2 and 26 percent of BMDS 
block and capability development costs from fiscal years 2010 to 2013 
as depicted above. MDA has not determined when other blocks will be 
baselined. If other blocks were baselined before 2013, the percentage 
of funding baselined would increase. The rapid decline 
in percentage of baselined funds also shows that initial baselines are 
being set late in a block's duration. For example, Block 2.0 will be 
completed within 2 years of its baseline being set. If baselines are to 
facilitate management and oversight, they will have to be set sooner 
for Block 3.3 and beyond. Additionally, agency officials stated that 
although cost estimates will be developed for individual capability 
development efforts--such as ABL, KEI, and MKV--the agency does not 
plan to baseline their costs until these elements have matured and moved 
into a defined block. MDA may eventually baseline these elements as 
part of a block once a firm commitment can be made to Congress. The 
budgets for capability development elements account for approximately 
$22 billion--or more than half of MDA's fiscal year 2008 6-year Future 
Years Defense Plan for BMDS blocks and capability development. 

Planned Unit Cost Reporting Will Not Be Comprehensive: 

Major defense acquisition programs are required by statute to report 
certain unit costs to Congress,[Footnote 10] track unit cost growth 
against their original and current baseline estimates, and perform an 
additional assessment of the program if certain cost growth thresholds 
are reached.[Footnote 11] This cost monitoring mechanism helps ensure 
that programs are being held accountable. MDA is not yet required to 
report these unit costs because of the acquisition flexibilities it has 
been granted by DOD, but Congress has enacted legislation requiring MDA 
to provide unit cost reporting data for certain BMDS elements, and MDA 
does plan to develop and report unit costs for some of its assets in 
the spring of 2009.[Footnote 12] The agency has also established 
thresholds for reporting cost growth. However, the approach MDA is 
taking, while an improvement, provides a much less comprehensive 
assessment of unit cost than the traditional acquisition unit costs 
typically reported for major defense acquisition programs. 
Normally, unit costs are reported in two ways: (1) program acquisition 
unit cost, which is the total cost for the development and procurement 
of, and system-specific military construction for, the acquisition 
program divided by the number of fully configured end items to be 
produced, or (2) average procurement unit cost, which is the total of 
all funds programmed to be available for obligation for procurement 
divided by the number of end items to be procured.[Footnote 13] 
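The two standard measures are simple ratios, and can be sketched as 
follows. The dollar figures below are illustrative placeholders for a 
notional program, not actual MDA or DOD data.

```python
# Sketch of the two standard DOD unit-cost measures described above.
# All dollar figures are illustrative placeholders, not actual MDA data.

def program_acquisition_unit_cost(rdte, procurement, milcon, end_items):
    """PAUC: development + procurement + system-specific military
    construction, divided by the number of fully configured end items."""
    return (rdte + procurement + milcon) / end_items

def average_procurement_unit_cost(procurement, end_items):
    """APUC: total procurement funding divided by end items procured."""
    return procurement / end_items

# Notional program, amounts in millions of dollars:
pauc = program_acquisition_unit_cost(rdte=9_000, procurement=4_500,
                                     milcon=500, end_items=50)
apuc = average_procurement_unit_cost(procurement=4_500, end_items=50)
print(pauc, apuc)  # 280.0 90.0
```

Because PAUC includes development and military construction while APUC 
does not, PAUC is always at least as large as APUC for the same program.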

MDA's development of the BMDS outside of DOD's normal acquisition 
process makes it difficult to compare the actual unit cost of a 
delivered asset with its planned unit cost. For example, MDA plans to 
report only recurring unit flyaway costs for the blocks that are 
baselined.[Footnote 14] Figure 2 reveals the significant reduction in 
standard areas of costs covered by MDA's approach compared to that 
normally reported for major defense acquisition programs. 

Figure 2: Difference in Traditional Unit Cost Reporting and MDA's Unit 
Cost Reporting: 

[Refer to PDF for image: illustration] 

Most major acquisitions programs report: 

Program acquisition cost: 

Development cost: 

Research, Development, Test, and Evaluation: 

* Development costs of Prime Mission Equipment and support items; 
* Systems engineering; 
* Program management; 
* Test and evaluation. 

Military Construction: 
* Facilities 

Procurement cost: 

Procurement: 
* Prime equipment; 
* Support items; 
* Initial spares. 

MDA plans to report: 

Flyaway cost: 

Procurement: 
* Prime equipment. 

Source: GAO analysis. 

[End of figure] 

MDA's decision to report only flyaway unit costs will not capture 
research and development costs associated with BMDS assets--which 
account for more than 97 percent of MDA's nearly $56 billion in costs 
to date. In addition, the procurement costs for initial spares and support 
equipment are not included. Thus, while the flyaway cost baseline will 
provide visibility into changes in recurring manufacturing costs, it 
will not provide a basis for comparison with the costs of other DOD 
programs. If a cost increase occurs in research and development or 
nonrecurring procurement, it will not be reported in MDA's unit cost. 
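The gap can be made concrete with a hypothetical example in which, as 
in the report's figures, development dominates total cost. The numbers 
below are invented for illustration only and are not MDA data.

```python
# Hypothetical illustration of how a flyaway-only unit cost omits most of
# a program's acquisition cost. Figures are invented, not MDA data.

costs = {
    "rdte": 9_700,           # research, development, test, and evaluation
    "prime_equipment": 200,  # recurring manufacturing cost (flyaway)
    "support_items": 60,     # procurement of support equipment
    "initial_spares": 40,    # procurement of initial spares
}
end_items = 10

flyaway_unit = costs["prime_equipment"] / end_items  # MDA's planned measure
full_unit = sum(costs.values()) / end_items          # PAUC-style measure

print(flyaway_unit, full_unit)  # 20.0 1000.0
# A cost increase in rdte, support_items, or initial_spares would move
# full_unit but leave flyaway_unit unchanged.
```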

Agency officials told us that the reason for using flyaway unit costs 
was the new MDA block structure. They further explained that within the 
block structure, there are many cases where MDA procures and delivers a 
weapon system for more than one block and, in some cases, the same 
configuration in more than one block. For example, THAAD deliveries are 
included in Blocks 2.0 and 5.0. Agency officials cite this as a key 
difference between MDA and other defense programs. For MDA, most of the 
development costs are assigned to the first block where a capability is 
delivered and very little of the development cost is assigned to 
subsequent blocks. MDA officials further stated that if the agency were 
to use the standard unit cost methodology, it would show very 
dissimilar unit costs between the first and subsequent block deliveries 
and the difference could not be explained by learning curves and 
manufacturing efficiencies. MDA officials also told us that they chose 
unit flyaway cost for unit cost reporting because flyaway cost provides 
a better measure of what the individual system components cost to 
procure. However, MDA is not precluded from also determining and 
reporting unit costs based on the entire cost of an asset, regardless 
of the capability or block for which it was originally developed. 

Frequent Realignments Indicate That Full Scope Is Not Yet Determined: 

Without a firm cost commitment or baseline in place for the full scope 
of the BMDS, in some cases the work currently under contract changes 
frequently. These changes manifest themselves in realignments that 
often add scope, cost, and time to the value of the work under the 
contract. Since contracts began on the BMDS, MDA has performed 31 
realignments; in some cases, the period of stability between these 
major realignments averages only 1 to 2 years. These frequent changes 
indicate that the total BMDS effort has not been fully 
determined and is likely to grow. While we have been able to make an 
assessment of contractor costs, that assessment is limited to the 
current approved program. 

Until total and unit cost baselines are established, the only tool for 
us to use in assessing BMDS costs is the costs reported on individual 
contracts under BMDS's Earned Value Management System.[Footnote 15] All 
BMDS contracts that we assessed have a cost and schedule baseline 
against which progress, measured by cost and schedule performance, can 
be measured. It is appropriate for a program to realign its current 
status with the remaining contractual effort when officials conclude 
that the baseline no longer provides valid performance assessment 
information.[Footnote 16] A program can realign its current status with 
the remaining contractual effort through rebaselines, replans, and 
restructures. 

* A rebaseline is a general term for a major realignment of the 
performance measurement baseline, used to better correlate the work 
plan with the baseline budget, scope, and schedule; it can encompass 
replans and restructures as well. 

* A replan is a reallocation of schedule or budget for the remaining 
effort within the existing constraints of the contract. 

* A restructure adds funds to the performance management budget that 
exceed the value of the negotiated contract, resulting in a contract 
modification. 

For purposes of this report, we refer to each of these types of program 
changes as realignments. Since work under the BMDS contracts first 
began, there have been a total of 31 realignments to the program, which 
have added nearly $14 billion to the value of the work under the 
contracts. Table 4 shows the BMDS elements' realignments since 
contract start. 

Table 4: Analysis of Contractor Realignments from Contract Start 
through Fiscal Year 2008: 

ABL; 
Contract start: Nov-96; 
Date of the last realignments[A]: Jun-07; 
Total number realignments: 6; 
Total contract value increase due to realignments: $2,590,881,491; 
Total period of performance increase due to realignments (in months): 
80; 
Average time between realignments: 2 years; 
Average contract value increase per year due to realignments: 
$218,947,732; 
Average period of performance increase per year due to realignments (in 
months): 7. 

Aegis BMD; 
Contract start: Oct-03; 
Date of the last realignments[A]: Mar-06; 
Total number realignments: 1; 
Total contract value increase due to realignments: 0; 
Total period of performance increase due to realignments (in months): 
0; 
Average time between realignments: 4 years 11 months; 
Average contract value increase per year due to realignments: 0; 
Average period of performance increase per year due to realignments (in 
months): 0. 

C2BMC; 
Contract start: Feb-02; 
Date of the last realignments[A]: Nov-06; 
Total number realignments: 1; 
Total contract value increase due to realignments: $36,514,950; 
Total period of performance increase due to realignments (in months): 
0; 
Average time between realignments: 6 years 7 months; 
Average contract value increase per year due to realignments: 
$5,546,575; 
Average period of performance increase per year due to realignments (in 
months): 0. 

GMD[B]; 
Contract start: Jan-01; 
Date of the last realignments[A]: Ongoing as of Sep-08; 
Total number realignments: 8; 
Total contract value increase due to realignments: $8,086,607,706; 
Total period of performance increase due to realignments (in months): 
27; 
Average time between realignments: 1 year; 
Average contract value increase per year due to realignments: 
$1,054,774,918; 
Average period of performance increase per year due to realignments (in 
months): 4. 

KEI; 
Contract start: Dec-03; 
Date of the last realignments[A]: Apr-08; 
Total number realignments: 4; 
Total contract value increase due to realignments: $1,639,800,000; 
Total period of performance increase due to realignments (in months): 
20; 
Average time between realignments: 1 year 2 months; 
Average contract value increase per year due to realignments: 
$345,000,000[C]; 
Average period of performance increase per year due to realignments (in 
months): 4. 

MKV; 
Contract start: Jan-04; 
Date of the last realignments[A]: Jul-07; 
Total number realignments: 4; 
Total contract value increase due to realignments: $51,213,935; 
Total period of performance increase due to realignments (in months): 
14; 
Average time between realignments: 1 year 2 months; 
Average contract value increase per year due to realignments: 
$10,974,415; 
Average period of performance increase per year due to realignments (in 
months): 3. 

Sensors; 
Contract start: Apr-03; 
Date of the last realignments[A]: N/A; 
Total number realignments: 0; 
Total contract value increase due to realignments: 0; 
Total period of performance increase due to realignments (in months): 
0; 
Average time between realignments: N/A; 
Average contract value increase per year due to realignments: 0; 
Average period of performance increase per year due to realignments (in 
months): 0. 

STSS; 
Contract start: Apr-02; 
Date of the last realignments[A]: Oct-07; 
Total number realignments: 1; 
Total contract value increase due to realignments: $232,293,329; 
Total period of performance increase due to realignments (in months): 
13; 
Average time between realignments: 6 years 5 months; 
Average contract value increase per year due to realignments: 
$36,201,558; 
Average period of performance increase per year due to realignments (in 
months): 2. 

THAAD; 
Contract start: Aug-00; 
Date of the last realignments[A]: May-08; 
Total number realignments: 5; 
Total contract value increase due to realignments: $1,179,000,000; 
Total period of performance increase due to realignments (in months): 
15; 
Average time between realignments: 1 year 7 months; 
Average contract value increase per year due to realignments: 
$146,000,000[C]; 
Average period of performance increase per year due to realignments (in 
months): 2. 

Targets and Countermeasures; 
Contract start: Dec-03; 
Date of the last realignments[A]: Jun-08; 
Total number realignments: 1; 
Total contract value increase due to realignments: $41,300,000; 
Total period of performance increase due to realignments (in months): 
0; 
Average time between realignments: 4 years 9 months; 
Average contract value increase per year due to realignments: 
$8,694,737; 
Average period of performance increase per year due to realignments (in 
months): 0. 

Total; 
Total number realignments: 31; 
Total contract value increase due to realignments: $13,857,611,411. 

Source: GAO analysis of MDA data. 

[A] Dates for some elements reflect when realignments were completed 
and not necessarily when realignments were incorporated into cost and 
schedule performance reporting. 

[B] The GMD program began a restructure during the fiscal year that 
includes a proposal to add between $350 million and $580 million to the 
contract value as well as 39 months to the period of performance. 
Because the restructure is still ongoing and has not yet been placed on 
contract, this information is not included in the totals in the table 
above. 

[C] The realignment data provided for the KEI and THAAD programs 
included rounding for contract value increases; therefore we have 
rounded the average contract value increase per year since contract 
start to reflect this. 

[End of table] 

Some programs realigned more often than others. For example, GMD 
realigned work under its contract every year on average since its 
contract start in 2001, adding nearly 4 months and close to $1.1 
billion to the time and value of the work under the contract with each 
realignment. KEI realigned its contract about every 14 months on 
average, adding more than $345 million and 4 months every year. Since 
contract start in 1996, ABL also added more than $218 million to the 
value of the work under its contract every year on average. 
Additionally, ABL adds approximately 7 months to its period of 
performance every year on average--more than any other element. 
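Table 4's per-year averages appear to follow from dividing each total 
increase by the elapsed contract period. The sketch below reconstructs 
the GMD figure under that assumption; MDA's exact date conventions are 
not stated in the report, so the result is approximate rather than an 
exact match to the table.

```python
# Approximate reconstruction (assumed method) of table 4's "average
# contract value increase per year" for GMD: total realignment growth
# divided by elapsed years since contract start. MDA's exact date
# conventions are not given, so this only roughly matches the table's
# reported $1,054,774,918 per year.

from datetime import date

def avg_increase_per_year(start, end, total_increase):
    years = (end - start).days / 365.25
    return total_increase / years

# GMD: contract start Jan-01; realignments ongoing as of Sep-08;
# total contract value increase of $8,086,607,706 (from table 4).
gmd = avg_increase_per_year(date(2001, 1, 1), date(2008, 9, 30),
                            8_086_607_706)
print(round(gmd / 1e6))  # roughly $1,044 million per year
```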

During fiscal year 2008, 5 of 10 BMDS elements performed a realignment-
-KEI, Targets and Countermeasures, GMD, THAAD, and STSS. The KEI replan 
in April 2008 reflected an 8-month delay to the booster flight test 
date because of technical issues experienced by the program over the 
past 2 years. Since the replan, the booster flight test has been 
further delayed to the fourth quarter of fiscal year 2009. However, 
during the replan, the program did not extend the period of performance 
or add value to the work under the contract. In June 2008, a delivery 
order under the Targets and Countermeasures element that is developing 
a new family of targets--the FTF--performed a rebaseline, adding more 
than $41 million to the value of the work under the contract without 
extending the period of performance. The program changed major 
milestone delivery dates as a result of manufacturing delays for some 
systems, caused principally by qualification program failures, 
subsequent redesigns, and requalification efforts. 

GMD, THAAD, and STSS added time and money to the value of the work 
under their contracts during the fiscal year.[Footnote 17] GMD's 
ongoing restructure includes a proposal to add between $350 million and 
$580 million to the value of the work under contract and more than 3 
years to the period of performance. The restructure rephases and 
rescopes ongoing efforts to refine European capability requirements and 
adjust program content, as well as to perform weapon system 
integration, conduct flight test planning, and develop the two-stage 
booster, among other tasks. During its realignment in May 2008, THAAD 
added approximately $80 million and 3 months, citing cost effects of 
insufficient target availability. In October 2007, STSS replanned 
work citing funding constraints and the addition of STSS software 
upgrades. This resulted in the program changing its launch date from 
December 2007 to July 2008 and adding approximately $232 million to the 
value of the work under contract and 13 months to its period of 
performance. Since the replan, the program has further delayed launch 
of its demonstrator satellite to the third quarter of fiscal year 2009. 

MDA Contractors Overran Fiscal Year Cost and Schedule: 

Our analysis of contractor costs indicates that during fiscal year 
2008, MDA contractors collectively overran budgeted costs by $152.4 
million.[Footnote 18] These overruns occurred in 11 of 14 MDA contracts 
we reviewed, with the STSS contract accounting for more than 50 percent 
of the total.[Footnote 19] Based on cost performance during the fiscal 
year and using formulas accepted within the cost community, we estimate 
that at completion the cumulative overrun in the contractors' budgeted 
costs could range from about $2.0 billion to $3.0 billion. Our projections 
are based on the current budgeted costs at completion for each contract 
we assessed, which represents the total current planned value of the 
contract.[Footnote 20] However, the budgeted costs at completion, in 
some cases, have grown significantly over time. For example, the ABL 
contractor reported budgeted costs at completion totaling about $724 
million in 1997, but as depicted in table 5, that cost has since grown 
to about $3.6 billion. Our assessment only reveals the overrun or 
underrun since the latest adjustment to the budget at completion. It 
does not capture, as cost growth, the difference between the original 
and current budgeted costs at completion. As a result, comparing the 
underruns or overruns for MDA programs in table 5 with cost growth on 
major defense acquisition programs is not appropriate, because those 
programs have established their full scope of work and developed total 
cost baselines, while MDA programs have not. Our analysis is presented 
in table 5. 
Appendix II provides further details on the cost and schedule 
performance outlined in the table. 
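The projection method described above is consistent with standard 
earned-value estimate-at-completion formulas. The sketch below shows 
one common form of that range calculation; this is an assumption about 
which "formulas accepted within the cost community" were used, and the 
dollar figures are illustrative rather than drawn from table 5.

```python
# Sketch of a standard earned-value estimate-at-completion (EAC) range.
# An assumed illustration only: the report does not specify GAO's exact
# formulas, and the figures below are not taken from table 5.

def eac_range(bac, bcwp, acwp, bcws):
    """Return (optimistic, pessimistic) estimates at completion.

    bac:  budget at completion
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed
    bcws: budgeted cost of work scheduled
    """
    cpi = bcwp / acwp                          # cost performance index
    spi = bcwp / bcws                          # schedule performance index
    remaining = bac - bcwp
    eac_low = acwp + remaining / cpi           # cost trend continues
    eac_high = acwp + remaining / (cpi * spi)  # cost and schedule compound
    return eac_low, eac_high

# Notional contract, amounts in millions of dollars:
low, high = eac_range(bac=1_000.0, bcwp=400.0, acwp=500.0, bcws=450.0)
print(round(low, 2), round(high, 2))  # 1250.0 1343.75
# Projected overrun range is EAC minus BAC: here, $250.0M to $343.75M.
```

A contract performing exactly to plan (CPI = SPI = 1) projects no 
overrun; any CPI below 1 widens the range above the budget at completion.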

Table 5: Prime Contractor Fiscal Year 2008 and Cumulative Cost and 
Schedule Performance: 

(Dollars in millions): 

ABL; 
Fiscal year 2008 cost performance[A]: (10.6); 
Fiscal year 2008 schedule performance[A]: 2.2; 
Cumulative cost performance: (84.8); 
Cumulative schedule performance: (23.6); 
Percentage of contract completed: 91.1; 
Estimated contract overrun/underrun at completion: Overrun of $89.7 to 
$95.4; 
Budget at completion: $3,626.7; 
Period of performance: Nov. 1996 - Feb. 2010. 

Aegis BMD Weapon System; 
Fiscal year 2008 cost performance[A]: (7.0); 
Fiscal year 2008 schedule performance[A]: (5.1);
Cumulative cost performance: 0.0; 
Cumulative schedule performance: (8.4); 
Percentage of contract completed: 81.1; 
Estimated contract overrun/underrun at completion: Overrun of $1.9 to 
$12.2; 
Budget at completion: 1,247.0; 
Period of performance: Oct. 2003 - Sept. 2010. 

Aegis BMD SM-3 CLIN 9 (20 Block 1A missiles)[B]; 
Fiscal year 2008 cost performance[A]: (3.9); 
Fiscal year 2008 schedule performance[A]: 3.9;
Cumulative cost performance: 2.3; 
Cumulative schedule performance: (0.1); 
Percentage of contract completed: 94.2; 
Estimated contract overrun/underrun at completion: Underrun of $7.5; 
Budget at completion: 179.0; 
Period of performance: Aug. 2006 - Aug. 2008. 

Aegis BMD SM-3 CLIN 1 (27 Block 1A missiles)[C]; 
Fiscal year 2008 cost performance[A]: 3.0; 
Fiscal year 2008 schedule performance[A]: (7.6);
Cumulative cost performance: 3.3; 
Cumulative schedule performance: (7.0); 
Percentage of contract completed: 46.3; 
Estimated contract overrun/underrun at completion: Underrun of $6.6 to 
overrun of $0.7; 
Budget at completion: 237.5; 
Period of performance: May 2007 - Apr. 2010. 

C2BMC; 
Fiscal year 2008 cost performance[A]: (9.8); 
Fiscal year 2008 schedule performance[A]: (3.6);
Cumulative cost performance: (24.3); 
Cumulative schedule performance: (7.1); 
Percentage of contract completed: 71.1; 
Estimated contract overrun/underrun at completion: Overrun of $37.1 to 
$76.8; 
Budget at completion: 1,040.0; 
Period of performance: Jan. 2002 - Dec. 2009. 

GMD; 
Fiscal year 2008 cost performance[A]: 53.9; 
Fiscal year 2008 schedule performance[A]: (77.4);
Cumulative cost performance: (1,027.9); 
Cumulative schedule performance: (130.3); 
Percentage of contract completed: 84.0; 
Estimated contract overrun/underrun at completion: Overrun of $950.2 to 
$1,251.3; 
Budget at completion: 14,934.9; 
Period of performance: Jan. 2001 - Dec. 2011. 

KEI[D]; 
Fiscal year 2008 cost performance[A]: (8.3); 
Fiscal year 2008 schedule performance[A]: (8.5);
Cumulative cost performance: (2.6); 
Cumulative schedule performance: (21.3); 
Percentage of contract completed: 13.9; 
Estimated contract overrun/underrun at completion: N/A; 
Budget at completion: 6,068.3; 
Period of performance: Dec. 2003 - Oct. 2014. 

MKV Task Order 6 (Prototype Carrier Vehicle Seeker)[E]; 
Fiscal year 2008 cost performance[A]: (1.4); 
Fiscal year 2008 schedule performance[A]: (1.5);
Cumulative cost performance: (1.1); 
Cumulative schedule performance: (0.6); 
Percentage of contract completed: 78.3; 
Estimated contract overrun/underrun at completion: Overrun of $1.6 to 
$2.5; 
Budget at completion: 19.3; 
Period of performance: Nov. 2006 - May 2009. 

MKV Task Order 7 (Engagement Management Algorithms)[E]; 
Fiscal year 2008 cost performance[A]: 1.4; 
Fiscal year 2008 schedule performance[A]: 0.0;
Cumulative cost performance: 1.7; 
Cumulative schedule performance: 0.1; 
Percentage of contract completed: 52.8; 
Estimated contract overrun/underrun at completion: Underrun of $3.9 to 
$3.2; 
Budget at completion: 43.9; 
Period of performance: Dec. 2006 - May 2010. 

MKV Task Order 8 (Hover Test Bed)[E]; 
Fiscal year 2008 cost performance[A]: (10.7); 
Fiscal year 2008 schedule performance[A]: (0.0);
Cumulative cost performance: (10.3); 
Cumulative schedule performance: 0.3; 
Percentage of contract completed: 81.4; 
Estimated contract overrun/underrun at completion: Overrun of $5.7 to 
$13.8; 
Budget at completion: 48.0; 
Period of performance: Dec. 2006 - Jan. 2009. 

Sensors; 
Fiscal year 2008 cost performance[A]: (2.2); 
Fiscal year 2008 schedule performance[A]: (27.4);
Cumulative cost performance: 22.0; 
Cumulative schedule performance: (9.6); 
Percentage of contract completed: 80.7; 
Estimated contract overrun/underrun at completion: Underrun of $25.0 to 
overrun of $9.1; 
Budget at completion: 1,125.2; 
Period of performance: Mar. 2003 - Dec. 2010. 

STSS[F]; 
Fiscal year 2008 cost performance[A]: (87.9); 
Fiscal year 2008 schedule performance[A]: 1.9;
Cumulative cost performance: (319.3); 
Cumulative schedule performance: (17.8); 
Percentage of contract completed: 53.2; 
Estimated contract overrun/underrun at completion: Overrun of $621.7 to 
$1,157.9; 
Budget at completion: 1,603.0; 
Period of performance: Apr. 2002 - Sept. 2011. 

Targets and Countermeasures; 
Fiscal year 2008 cost performance[A]: (35.7); 
Fiscal year 2008 schedule performance[A]: 23.2;
Cumulative cost performance: (52.8); 
Cumulative schedule performance: (6.4); 
Percentage of contract completed: 84.5; 
Estimated contract overrun/underrun at completion: Overrun of $63.7 to 
$75.9; 
Budget at completion: 1,056.4; 
Period of performance: Dec. 2003 - Dec. 2009. 

THAAD[G]; 
Fiscal year 2008 cost performance[A]: (33.5); 
Fiscal year 2008 schedule performance[A]: (7.4);
Cumulative cost performance: (228.7); 
Cumulative schedule performance: (16.5); 
Percentage of contract completed: 91.4; 
Estimated contract overrun/underrun at completion: Overrun of $252.0 to 
$274.0; 
Budget at completion: 4,649.4; 
Period of performance: Aug. 2000 - Sept. 2009. 

Total; 
Fiscal year 2008 cost performance[A]: (152.4); 
Fiscal year 2008 schedule performance[A]: (107.4);
Cumulative cost performance: (1,722.5); 
Cumulative schedule performance: (248.3); 
Percentage of contract completed: [Empty]; 
Estimated contract overrun/underrun at completion: Overrun of $1,980.8 
to $2,959.0. 

Source: Contract Performance Reports (data); GAO (analysis). 

Note: Comparing the percentage of total overrun to total budget at 
completion for MDA contracts with the percentage of total cost growth 
for major defense acquisition programs that are past milestone B is not 
appropriate, because those programs have established their full scope 
of work and developed total cost baselines, while MDA programs have 
not. 

[A] Cost performance here is defined as the difference between the 
budget for the work performed and the actual cost of work performed; 
while the schedule performance is the difference between the budgeted 
cost of planned work and the budgeted cost of work performed. Negative 
cost performance (budget overruns) and negative schedule performance 
(less work performed than planned) are shown with parentheses around 
the dollar amounts. 

[B] The Aegis BMD SM-3 contractor began work on contract line item 
number (CLIN) 9 in February 2007 that concluded in August 2008 for the 
acquisition of an additional 20 SM-3 Block 1A missiles. All 
corresponding analysis is based on data through August 2008. 

[C] The Aegis BMD SM-3 contractor began reporting performance on CLIN 1 
in August 2007. This CLIN is for the production of a fourth lot of 27 
Block 1A missiles. 

[D] We could not estimate the likely outcome of the KEI contract at 
completion because a trend cannot be predicted until 15 percent of the 
planned work is complete. 

[E] Out of the five task orders open during fiscal year 2008, there was 
sufficient cost performance data to report on the three listed above. 

[F] The STSS contract includes line items for work that do not 
necessarily apply to the program being launched in the third quarter of 
fiscal year 2009. Removing these line items from our analysis, the 
program's contract would be considered 78% complete. 

[G] Earned Value data for the THAAD contract is reported under two 
CLINs, 1 and 10. We report only the contractor's cost and schedule 
performance for CLIN 1 because it represents the majority of the total 
work performed under the contract. CLIN 10 provides for Patriot Common 
Launcher initiatives funded by the Army's Lower Tier Program Office. 

[End of table] 
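Note [A]'s variance definitions and note [D]'s 15-percent threshold 
follow standard earned value management conventions. The sketch below 
illustrates those calculations with hypothetical dollar amounts; the 
function names and figures are ours for illustration, not MDA's.

```python
# Earned value calculations as defined in table notes [A] and [D],
# using hypothetical dollar amounts (millions) for illustration only.

def cost_variance(bcwp, acwp):
    """Budgeted cost of work performed minus actual cost of work
    performed; negative values indicate a budget overrun."""
    return bcwp - acwp

def schedule_variance(bcwp, bcws):
    """Budgeted cost of work performed minus budgeted cost of work
    scheduled; negative values indicate less work done than planned."""
    return bcwp - bcws

def estimate_at_completion(bac, cum_bcwp, cum_acwp):
    """A common CPI-based estimate at completion (BAC / CPI). Per
    note [D], no estimate is returned until at least 15 percent of
    the planned work is complete."""
    if cum_bcwp / bac < 0.15:
        return None  # trend cannot yet be predicted
    cpi = cum_bcwp / cum_acwp  # cost performance index
    return bac / cpi

# Hypothetical fiscal year: $480M of work performed against a plan
# of $500M, at an actual cost of $520M.
cv = cost_variance(bcwp=480.0, acwp=520.0)      # -40.0 (overrun)
sv = schedule_variance(bcwp=480.0, bcws=500.0)  # -20.0 (behind plan)

# Hypothetical cumulative figures on a $4,000M budget at completion.
eac = estimate_at_completion(bac=4000.0, cum_bcwp=1200.0,
                             cum_acwp=1300.0)

# Note [A] displays negative values in parentheses.
def fmt(value):
    return f"({abs(value):.1f})" if value < 0 else f"{value:.1f}"

print(fmt(cv), fmt(sv))  # (40.0) (20.0)
print(round(eac, 1))     # projected cost at completion
```

The table's "estimated contract overrun at completion" entries reflect 
this kind of extrapolation: cumulative performance trends projected to 
contract end, which is why note [D] withholds a KEI estimate below the 
15-percent threshold.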

Technical difficulties caused most elements to overrun their fiscal 
year 2008 budgeted costs. For example, STSS attributed most of its 
overrun of approximately $87.9 million to hardware problems on the 
program's second space vehicle, including the flight communication box 
and the main spacecraft computer. The box overheated during testing, 
requiring a thorough retest of the unit; upon successful completion of 
that testing, the program determined that the box did not need to be 
replaced. In addition, the program experienced a failure in the main 
spacecraft computer, for which the program office initially 
recommended removing the entire computer from the spacecraft. After 
extensive research and testing, however, the program manager 
determined that the event was an unverifiable failure with a low 
probability of occurrence and low mission impact and decided to leave 
the computer in place. As a result of these issues, the launch was 
delayed from April 2008 to at least the third quarter of fiscal year 
2009. 

The MKV Task Order 6 and ABL contractors also experienced technical 
difficulties. The MKV contractor for Task Order 6 reported cost 
overruns during the fiscal year of $1.4 million due mostly to software 
development issues and late delivery of government-furnished 
components. ABL's fiscal year cost overruns of $10.6 million were 
mainly related to late deliveries of key laser system components and 
the acquisition or refurbishment of the Beam Control/Fire Control 
system components. For example, a major component of the laser system 
required redesign and fabrication, delaying planned delivery and 
installation onto the aircraft. Also, multiple Beam Control/Fire 
Control hardware components either were not refurbished to 
specification or failed initial testing, delaying delivery and 
integration testing. The overall effect was an approximate 1-month 
slip that the contractor believes will be recovered in time to support 
the lethality demonstration planned for the end of fiscal year 2009. 

Three elements' contracts--Aegis BMD's contract for 27 SM-3 Block 1A 
missiles, MKV Task Order 7, and GMD--performed below their fiscal year 
budgeted costs by nearly $58.3 million with the GMD element accounting 
for approximately $53.9 million of that. The GMD element's underruns 
occurred partially because the contractors delayed or eliminated some 
planned work. For example, the GMD program did not emplace the three 
GBIs it expected to in fiscal year 2008 or conduct either of its two 
planned flight tests as scheduled during the fiscal year. As a result, 
it employed less labor than originally intended. The MKV Task Order 7 
contract's fiscal year cost underruns stemmed from a restructuring of 
the effort, a decision to use one approach for coordinated attack 
rather than several, and the use of less manpower than originally 
planned in its procurement and software efforts. Lastly, the 
Aegis BMD contract for 27 SM-3 Block 1A missiles underran its fiscal 
year 2008 budget by approximately $3 million due in part to spending 
less than planned for engineering efforts with the missile's third 
stage component as well as adjustments made in program management, 
labor efficiencies, and material transfers in the missile's fourth 
stage component. 

While Some Tests Succeeded, Others Were Deferred; Overall System 
Performance Cannot Yet Be Assessed: 

Although several tests showed progress in individual elements and some 
system-level capabilities, all BMDS elements experienced test delays 
and shortfalls, in part due to problems with the availability and 
performance of target missiles. Most significantly, GMD was unable to 
conduct either of its planned intercept attempts during fiscal year 
2008; however, it was able to conduct one delayed intercept test in 
December 2008. As a result, key performance capabilities of the 
current configuration of the GMD kill vehicles may not be 
demonstrated, and the new configuration is being fielded prior to 
flight testing. As a consequence of testing problems, none of the six 
MDA Director's test knowledge points for 2008 were achieved. Poor 
target performance continues to be a problem and caused several tests 
to fail in part or in whole. Shortfalls in testing have delayed 
validation of the models and simulations used to assess the 
performance of the BMDS as a whole. Consequently, comprehensive 
assessments of the capabilities and limitations of the BMDS are not 
currently possible, and MDA still does not have the capability to 
model or simulate BMDS performance from enemy missile launch to 
engagement. 

Test, Targets, and Performance Challenges Continued during Fiscal Year 
2008 for Several Elements: 

During fiscal year 2008, all BMDS elements experienced delays in 
conducting tests, most were unable to accomplish all objectives, and 
performance challenges continued for many. Moreover, the inability of 
MDA to conduct its full fiscal year 2008 flight test campaign as 
planned precluded the agency from collecting key information specified 
by the Director, MDA--known as Director's test knowledge points--to 
make certain decisions at critical points in some BMDS programs. Table 
6 below summarizes test results and target performance for BMDS 
elements during the fiscal year. 

Table 6: Test and Targets Issues: 

Element: ABL; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: Yes; 
Target issues: N/A. 

Element: Aegis BMD; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: Target availability delayed key test from 2008 until at 
least the third quarter fiscal year 2009. 

Element: C2BMC; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: N/A. 

Element: GMD; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: Target failed to release countermeasures during December 
2008 flight test--FTG-05.[A] 

Element: KEI; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: N/A. 

Element: MKV; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No[B]; 
Target issues: N/A. 

Element: Sensors; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: Target failed to release countermeasures during July 
2008 testing (FTX-03). 

Element: STSS; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: N/A. 

Element: Targets and Countermeasures; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: FTF delivery delayed and experienced cost growth. 

Element: THAAD; 
Tests/activities conducted as scheduled: No; 
All objectives achieved: No; 
Target issues: Target experienced anomaly during a September flight 
test resulting in a no-test. 

Sources: GAO (presentation); MDA (data). 

[A] This flight test was originally scheduled for fiscal year 2008, but 
was later executed in fiscal year 2009. 

[B] The MKV program was able to achieve its objective in the first 
quarter of fiscal year 2009. 

[End of table] 

As a result of test delays, MDA restructured its flight test plan for 
fiscal year 2009, increasing the number of tests and compressing the 
amount of time to analyze and prepare for subsequent tests. For 
example, MDA plans to conduct 14 of 18 flight tests in the third and 
fourth quarter of fiscal year 2009. MDA's past performance raises 
questions about whether this is realistic. In fiscal year 2008, MDA 
had planned to conduct 18 flight tests but accomplished only 10, with 
several others delayed into fiscal year 2009. An MDA 
official acknowledged that the 2009 plan is aggressive, but stated that 
it can be achieved. Specifics of each element's testing experience 
during fiscal year 2008 follow. 

According to Aegis BMD officials, budgetary constraints prompted the 
Aegis BMD element to delay some tests, reducing the number of tests 
planned for 2008. However, the program was able to successfully 
complete its first test involving two non-separating targets, conduct a 
short-range ballistic missile intercept, and participate in a THAAD 
intercept during the fiscal year. The program also planned to 
participate in a BMDS-level ground test during the year, but the test 
was delayed until at least the second quarter of fiscal year 2009 
because of real-world events. Finally, Aegis BMD standard missile 
flight tests showed that interoperability issues persist between THAAD 
and Aegis BMDS with respect to correlation and object reporting. 

ABL experienced delays during fiscal year 2008, but achieved all of its 
primary test objectives. The program planned to complete the 
installation of its high energy laser on the aircraft by June 2008 in 
preparation for testing. However, it was not completed until August 
2008 because of problems with activating some of the laser's 
subsystems. The program delayed final testing of the laser until those 
problems were resolved and then completed functionality testing of the 
laser in September 2008. 

C2BMC experienced delays in conducting tests, but achieved several test 
objectives. For example, software upgrade verification testing slipped 
from fiscal year 2008 to 2009 but the program was able to participate 
in many other system-level ground and flight tests during the year that 
enabled the program to demonstrate multiple capabilities, including 
situational awareness and sensor management.[Footnote 21] The C2BMC 
element extended development for its next software release, 6.4, by 
more than a year because of delays in system-level BMDS testing and 
challenges in developing the network server for version 6.2, as well as 
unplanned work to incorporate effects from earth rotation in the 6.4 
C2BMC planning architecture. C2BMC added earth rotation effects to 
address a requirement that Spiral 6.2 and later releases have the 
ability to model the true extent of ranges for long-range threats. 
Finally, C2BMC is still developing its capability to generate a single 
track from multiple sensors through a new resource management 
function, the Global Engagement Manager. For example, the development 
team had to redesign the function's new track processing after it 
exhibited unacceptable delays in processing data. 

In fiscal year 2008, the GMD program was unable to conduct either of 
its two planned intercept attempts--FTG-04 and FTG-05. MDA first 
delayed and then later canceled the FTG-04 test in May 2008 due to a 
problem with a telemetry component in the interceptor's Exoatmospheric 
Kill Vehicle (EKV) needed to gather test data. MDA also delayed FTG-05 
from fiscal year 2008 and conducted it in December 2008. Over the past 
two years MDA had expected to conduct seven GMD interceptor flight 
tests by the end of the first quarter of fiscal year 2009. However, MDA 
was only able to conduct two, as shown in figure 3. 

Figure 3: GMD Reduction in Flight Tests from January 2006 to March 
2010: 

[Refer to PDF for image: illustration] 

As of September 2005: Integrated flight tests planned: 
FY06, Q1: FT-1 (achieved); Type: CE I EKV; 
FY06, Q4: FTG-2 (achieved); Type: CE I EKV; 
FY07, Q1: FTG-3; Type: CE I EKV; 
FY07, Q3: FTG-4; Type: CE I EKV; 
FY07, Q4: FTG-5; Type: CE I EKV; 
FY08, Q1: FTG-6; Type: CE II EKV New processor; 
FY08, Q2: FTG-7; Type: CE I EKV; 
FY08, Q4: FTG-8; Type: CE I EKV; 
FY09, Q1: FTG-9. Type: CE I EKV and CE II EKV New processor. 

As of January 2009: Integrated flight tests planned: 
FY06, Q1: FT-1 (achieved); Type: CE I EKV; 
FY06, Q4: FTG-2 (achieved); Type: CE I EKV; 
FY07, Q4: FTG-3a (achieved); Type: CE I EKV; 
FY09, Q1: FTG-5 (achieved); Type: CE I EKV; 
FY09, Q4: FTG-6. Type: CE II EKV New processor. 

Source: GAO analysis of MDA data. 

[End of figure] 

The cancellation of FTG-04 raised concerns within the test community 
and among members of Congress. MDA replaced the canceled test with a 
test to assess sensor capability--FTX-03. The 
sensor test allowed GMD to verify fire control software and integration 
with multiple sensors. The DOT&E was not consulted on the decision to 
cancel FTG-04 and expressed concern that the elimination of any 
intercept test reduced the opportunity to gather data that might have 
increased confidence in the models and simulations. In the conference 
report accompanying the National Defense Authorization Act for Fiscal 
Year 2008, conferees expressed concern about the loss of the FTG-04 
flight test and requested that we review the circumstances and the 
effects on the BMDS. Details of our review of the FTG-04 flight test 
cancellation appear in appendix III. 

Because GMD conducted FTG-05 in December 2008, only two full sets of 
GMD intercept data are available to date for analysis, which limits 
the ability to verify and validate the models and simulations. 
[Footnote 22] Additionally, FTG-04 and the subsequent test--FTG-05--
were planned to present different stresses to the kill vehicle, which 
would provide critical data needed to further verify the fielded 
configuration of the kill vehicle. The cancellation and subsequent 
restructuring of the first test delayed FTG-05 from the third quarter 
of fiscal year 2008 until December 2008. In the FTG-05 test, the 
interceptor hit its intended target. However, MDA judged the target a 
failure because it did not release its countermeasures as planned. 
Consequently, not all primary test objectives were achieved. 

Looking forward to the next GMD intercept flight test--FTG-06, planned 
for no earlier than the fourth quarter of fiscal year 2009--MDA is 
accepting a higher level of risk than it previously expected because 
this first test of the CE-II EKV will combine several objectives that 
MDA had planned to test earlier but has not. MDA had set up an 
approach to test one new major component change at a time. For example, 
MDA had planned to test the CE-I EKV first against simple targets, then 
the CE-I against a complex target, and once that had been proven MDA 
planned to test the CE-II EKV against a complex target. However, MDA 
was not able to test the CE-I EKV against a complex target due to a 
target failure. Due to testing problems, GMD has only been able to 
assess the CE-I EKV with a target without countermeasures. As a result, 
the FTG-06 flight test will be the first GMD test assessing both a CE-
II EKV and a complex target scene. Adding to the risk, this will be 
only the second test using a newly developed FTF LV-2 target. 

During fiscal year 2008, the KEI program experienced problems during 
testing that required it to rework components which, in turn, caused a 
delay to subsequent testing. More importantly, due to technical issues 
experienced by the program over the past two years, the first booster 
flight test--a key decision point for the program--has been delayed by 
nearly a year and is not scheduled to occur until at least the fourth 
quarter of fiscal year 2009. Technical difficulties delayed the MKV 
program's fiscal year 2008 hover test until fiscal year 2009. This 
hover test will allow the program to integrate and test key components 
of the system in a repeatable ground-based free flight environment as 
their technologies reach maturity. Although originally planned for the 
fourth quarter of fiscal year 2008, the test was successfully conducted 
in December 2008. The STSS program encountered problems during testing 
that forced the program to delay the launch of its demonstration 
satellites from April 2008 to at least the third quarter of fiscal year 
2009. The program continued to experience technical difficulties with 
its space vehicles. For example, during testing, the program 
experienced problems with its main spacecraft computer as well as an 
overheating flight communications box. After extensive testing, the 
program determined that these components were acceptable for flight. 

Similarly, in fiscal year 2008, the Sensors element also experienced 
flight test delays as well as difficulties in achieving planned 
objectives due to target performance, but met some primary objectives. 
The element successfully participated in other tests during the fiscal 
year that demonstrated the sensors' ability to acquire and track a 
target. One test event, FTX-03, provided the first opportunity 
for four key sensors--Sea-based X-band radar, AN/TPY-2, Upgraded Early 
Warning Radar, and an Aegis BMD radar--to operate in a more 
operationally realistic test scenario.[Footnote 23] This test 
demonstrated the sensors' capability to correlate target information 
in order to conduct an intercept test. However, the target 
failed to release its countermeasures as planned. This failure 
precluded sensors from assessing capability against a dynamic lethal 
target scene with countermeasures. As a result, the sensors could not 
collect all of the expected data, which delayed the element's ability 
to develop algorithms needed for the discrimination capability. These 
objectives will need to be addressed in future testing. The BMDS 
Operational Test Agency has had ongoing concerns regarding the 
formatting, tracking, and accounting of messages from GMD sensors. 
[Footnote 24] The timely reception of messages from sensors to weapon 
systems is key to support decisions and achieve effective intercepts. 
Since 2000, the BMDS Operational Test Agency has reported concerns to 
MDA that poor data collection and management practices involving 
sensors affect its assessment of tests. These data management 
problems prevented the analysis of message data, according to BMDS 
Operational Test Agency officials. In response, the contractor proposed 
a message monitoring system among communications nodes. Consequently, 
MDA recommended that this issue be closed out, but the BMDS Operational 
Test Agency still considers the matter to be open because GMD has not 
funded the monitoring system. 

THAAD planned to conduct three intercept attempts, but due to a target 
failure, it was only able to conduct two. The program could not 
complete its final flight test of the fiscal year because the target 
experienced an anomaly during flight. The test was planned to be a BMDS-
level event and was designated as a developmental test/operational test 
mission utilizing multiple BMDS elements and operationally realistic 
criteria. The program also expected to demonstrate that it could launch 
more than one THAAD interceptor during the engagement. Program 
officials rescheduled this test for the second quarter of fiscal year 
2009. In addition, THAAD's radar data collection test, RDC-2, was 
planned for 2008 but was deleted due to target availability and 
funding. As a result, program officials told us that these test 
objectives will be covered in the future with hardware-in-the-loop 
simulations and other radar events. The program successfully completed 
its first two planned tests for the fiscal year. In October 2007, THAAD 
successfully demonstrated an intercept of a target outside the earth's 
atmosphere. This was the first time THAAD had successfully conducted an 
intercept outside of the atmosphere since 1999. Additionally, in June 
2008, THAAD completed a successful intercept of a separating target. 
This intercept utilized warfighter procedures developed by the U.S. 
Army Air Defense School. 

Key MDA Test Knowledge Points Not Achieved: 

As a consequence of flight test delays as well as a delay in a key 
ground test, MDA was unable to achieve any of the Director's test 
knowledge points scheduled for fiscal year 2008 as shown in table 7. 

Table 7: Status of Fiscal Year 2008 Director's Test Knowledge Points: 

Knowledge point: Assess Capability to Deliver Real-Time Engagement 
Tracks; 
Knowledge gained: Verification of initial Global Engagement Manager 
capability to support BMDS-level sensor/weapon system pairing; 
Flight and ground test: GTD-03[A]; 
Original completion: 4th Quarter 2008; 
Current projection: 2nd Quarter 2009. 

Knowledge point: Verify 72-inch Flexible Target Family; 
Knowledge gained: Confirmation of 72" performance. Viability of FTF 
concept to efficiently configure and transport target to launch 
facility. Confidence to discontinue use of STARS; 
Flight and ground test: FTM-15; 
Original completion: 4th Quarter 2008; 
Current projection: 3rd Quarter 2009. 

Knowledge point: Demonstrate High-Acceleration Booster; 
Knowledge gained: Confirmation of Boost Phase Capability alternative to 
ABL and High Acceleration Booster for Midcourse Defense (mobile and 
fixed sites); 
Flight and ground test: FTK-01; 
Original completion: 4th Quarter 2008; 
Current projection: 4th Quarter 2009. 

Knowledge point: Confirm Constellation Affordability; 
Knowledge gained: Space sensor performance against operationally 
realistic targets confirmed with existing Block 06 technology (anchors 
performance-cost baseline for future STSS); 
Flight and ground test: FTS-01; 
Original completion: 4th Quarter 2008; 
Current projection: 4th Quarter 2009. 

Knowledge point: Verify Capability to Conduct Launch on Tactical 
Digital Information Link BM Engagement; 
Knowledge gained: Assessment of Aegis BMD 3.6 and SM-3 Block IA 
performance and ability to successfully engage and intercept a long-
range ballistic missile target and to use an off-board sensor's track 
data via Link-16 to initiate that engagement; 
Flight and ground test: FTM-15; 
Original completion: 4th Quarter 2008; 
Current projection: 3rd Quarter 2009. 

Knowledge point: Confirm Constellation Performance; 
Knowledge gained: Space sensor performance against operationally 
realistic targets confirmed with existing Block 06 technology (anchors 
performance-cost baseline for future STSS); 
Flight and ground test: FTS-03; 
Original completion: 4th Quarter 2008; 
Current projection: To Be Determined. 

Source: GAO analysis of MDA data. 

[A] GTD-03 was delayed to accommodate a real-world contingency as 
requested by the warfighter. 

[End of table] 

In May 2007, the Director, MDA, established key system-level and 
element-level knowledge points to provide critical information for 
making key decisions regarding the BMDS. According to MDA, these 
knowledge points are unique management approaches chosen to manage 
MDA's critical program risks.[Footnote 25] Each knowledge point is 
based on an event that provides critical information--or knowledge--for 
a key MDA decision requiring the Director's approval. 

Among the Director's test knowledge points delayed in fiscal year 
2008, MDA had to defer confirmation of the 72-inch target's 
performance because of delays in qualifying components. MDA also had 
to delay confirmation of the KEI booster because of problems 
encountered during testing of its nozzle. 

Poor Target Missile Performance Continues to Hamper BMDS Testing: 

While targets caused problems in fiscal year 2008 testing, poor target 
performance is not new. Target reliability and availability problems 
have significantly affected BMDS development and testing since 2006 
and have grown worse in recent years. Although target anomalies and 
failures have affected many 
of the missile defense elements, THAAD and GMD have been most affected. 
In 2006, the THAAD program was unable to achieve its first intercept 
attempt (FTT-04) because the target did not function properly. In 2007, 
two THAAD radar characterization tests (RDC-1c and RDC-1d) were 
unsuccessful due to target anomalies. These tests flew targets with 
characteristics needed for radar observation in support of advanced 
discrimination algorithm development. However, target problems 
prevented the radar from exercising all of the planned algorithms, 
causing a loss of expected data. In addition to target 
failure issues, the THAAD program deferred some flight tests because 
targets were not available, which cost the program about $201 million. 
GMD also experienced similar long-term effects on its flight test 
schedule when it was unable to achieve primary test objectives in a 
2007 intercept attempt (FTG-03) due to a target failure. 

MDA's existing targets are becoming less capable of meeting 
requirements for near-term flight tests. These targets are aging and 
likely to grow even less reliable with time; some components, such as 
the rocket motors, are more than 40 years old. Among other things, 
MDA's Targets and Countermeasures program office has also had 
difficulty incorporating requirements into contracts and obtaining 
supplies, as vendors left the market for lack of business. 

To address the growing need for more sophisticated and reliable targets 
for the future BMDS test program, MDA was developing a new family of 
targets called the FTF, which was originally intended to be a family of 
new short, medium, and long-range targets with ground-, air-, and sea- 
launch capabilities. MDA embarked on this major development without 
estimating the cost to develop the family of target missiles. MDA 
proceeded to develop and even to produce some FTF targets without a 
sound business case and, consequently, their acquisition has not gone 
as planned.[Footnote 26] The funds budgeted for the FTF were spent 
sooner than expected and proved insufficient for the development; in 
particular, getting the FTF target's components through the 
qualification process was more difficult and costly than the program 
expected. For example, 
MDA originally planned to launch the first FTF target--a 72-inch LV-2-
-in a 2008 STSS flight test, but the test was rescheduled due to delays 
in satellite integration and target affordability and availability. 
While many of the target missile's components are found on existing 
systems, their form, fit, function, and the environment they fly in 
have been changed for the 72-inch LV-2 target. Consequently, many 
critical components initially failed shock and vibration testing and 
other qualification tests and had to be redesigned. The qualification 
process was most recently scheduled to be complete in early October 
2008 but, after several delays, was not finished until December 2008. 
Despite this, MDA 
expects the target to be complete and ready for its first launch in a 
third quarter fiscal year 2009 Aegis BMD flight test (FTM-15). 

We recently reported that the FTF has been delayed, costs have 
increased and exceeded $1 billion, and the program's scope has been 
reduced.[Footnote 27] Work on all but one of the FTF target variants, 
the 72-inch LV-2, was canceled in June 2008, including plans for 
development and production of the second type of FTF target, the 52- 
inch, originally scheduled to launch in 2009. With guidance from the 
Missile Defense Executive Board, MDA is currently conducting a 
comprehensive review of the targets program to determine the best 
acquisition strategy for future BMDS targets. It is expected to be 
completed in mid-2009. Whether MDA restarts acquisition of the 52-inch 
target or other FTF variants, and what form those targets take, 
depends on the results of this review. 

Currently, MDA has narrowed its FTF development efforts, focusing on a 
single vehicle, the 72-inch LV-2 ground-launched target. The first 
launch was supposed to determine the viability of the FTF concept and 
the feasibility of discontinuing the use of existing targets. However, 
rather than first conducting a separate developmental test to confirm 
the target's capability, MDA has chosen a much riskier approach. The 
first launch of the new LV-2 target will be in an Aegis BMD intercept 
test. Aegis BMD originally planned to use this new target in a fiscal 
year 2008 flight test; however, because the target was not ready, the 
test is delayed until at least the third quarter of fiscal year 2009. 

Repeated target problems and test cancellations have also affected the 
development of capabilities needed to discriminate the real target from 
countermeasures. Without opportunities to test the functionality of the 
software, there now is a system-level shortfall in BMDS progress toward 
developing a target discrimination capability against more 
sophisticated countermeasures in the midcourse phase of flight. In 
order to improve the effectiveness of the BMDS against evolving 
threats, MDA elements are developing advanced discrimination software 
in their component's sensors. The advanced discrimination software is 
critical to distinguish the threat re-entry vehicle from associated 
countermeasures and debris. Target failures during tests prevented 
opportunities to gather data to assess how well discrimination software 
performs in an operational environment. 

Overall Performance of BMDS Can Not Yet Be Assessed: 

MDA's modeling and simulation program enables the agency to assess 
BMDS capabilities and limitations under a far wider variety of 
conditions than the limited number of flight tests can cover. Flight 
tests alone are insufficient because each demonstrates only a single 
data point of element and system performance. Flight tests are, 
however, an essential tool both for validating BMDS performance and 
for anchoring the models and simulations to ensure that they 
accurately reflect real performance. 
Computer models of individual elements replicate how those elements 
function. These models are then combined into various configurations 
that simulate the BMDS engagement of enemy ballistic missiles. 
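The composition just described (individual element models combined 
into configurations that simulate an engagement) can be sketched 
abstractly. Everything below, including the element names and 
interfaces, is an illustrative assumption of ours, not MDA's actual 
modeling framework:

```python
# Abstract sketch of composing element models into a system-level
# simulation; interfaces and behaviors are illustrative only.

class ElementModel:
    """Base class: each element model transforms engagement state."""
    def step(self, state: dict) -> dict:
        raise NotImplementedError

class SensorModel(ElementModel):
    def step(self, state):
        # A real sensor model would estimate a threat track from
        # radar physics; here we simply record the threat as tracked.
        return {**state, "track": state["threat_position"]}

class InterceptorModel(ElementModel):
    def step(self, state):
        # A real interceptor model would fly out the missile; here we
        # mark an engagement attempt whenever a track exists.
        return {**state, "engaged": "track" in state}

class SystemSimulation:
    """Chains element models to simulate an end-to-end engagement,
    mirroring how element models are grouped into system-level
    configurations."""
    def __init__(self, elements):
        self.elements = elements

    def run(self, initial_state):
        state = initial_state
        for element in self.elements:
            state = element.step(state)
        return state

sim = SystemSimulation([SensorModel(), InterceptorModel()])
result = sim.run({"threat_position": (100.0, 45.0)})
print(result["engaged"])  # True
```

The design point the sketch illustrates is that each element model is 
validated on its own, then composed; the system-level result is only 
as credible as the accreditation of each element model in the chain, 
which is why the accreditation process discussed below matters.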

To ensure confidence in the accuracy of modeling and simulation in 
representing BMDS capability, the program goes through a process called 
accreditation.[Footnote 28] Element models are validated individually 
using flight and other test data and accredited for their intended use. 
MDA intends to group these models into system-level representations 
according to user needs. One such grouping is the annual performance 
assessment, a system-level end-to-end simulation that assesses the 
performance of the current BMDS configuration.[Footnote 29] The 
performance assessment integrates element-specific models into a 
coherent representation of the BMDS. Performance assessments are used 
to: 

* assess objectives from MDA's Deputy of Engineering and the BMDS 
Operational Test Agency,[Footnote 30] 

* support MDA decisions about engagement sequence group capability 
deliveries, and: 

* support MDA decisions about BMDS fielding and declaring capabilities. 

Fundamentally, performance assessments anchored by flight and ground 
tests are a comprehensive means to fully understand the performance 
capabilities and limitations of the BMDS. 

Developing an end-to-end system-level model and simulation has been 
difficult. BMDS Operational Test Agency officials told us that they do 
not expect a fully accredited, system-level model and simulation 
capability to be available until 2011. MDA's first effort to bring 
together different element models and simulations to produce a fully 
accredited, end-to-end model and simulation for Performance Assessment 
2007 was unsuccessful primarily because of inadequate data for 
verification and validation to support accreditation and a lack of 
common threat and environment input data among element models. Though 
Performance Assessment 2007 succeeded in establishing a capability for 
integrated modeling and simulation in a short time frame, the lack of 
accreditation left too little confidence in its results for it to be 
used to assess system-level performance. Consequently, acting on a 
joint recommendation from MDA and the Operational Test Agency, MDA 
officials canceled their 2008 performance assessment efforts in April 
2008 because of developmental risks associated with models and 
simulations, focusing instead on testing and models for Performance 
Assessment 2009. MDA officials believe that the refocused efforts will 
increase the chances for success during Performance Assessment 2009. 

According to the BMDS Operational Test Agency's January 2009 Modeling 
and Simulation Accreditation Report, confidence in MDA's modeling and 
simulation efforts remains low although progress was made during the 
year. MDA is now exercising stronger central leadership to provide 
guidance and resources as it coordinates the development of verified 
and validated models and simulations, as recommended by a 2004 Defense 
Science Board study. MDA and element officials are now working more 
closely with the BMDS Operational Test Agency. For example, MDA and the 
BMDS Operational Test Agency have agreed on performance parameters and 
criteria used to validate element models and simulations. Nonetheless, 
BMDS Operational Test Agency officials stated that there are several 
weaknesses in the BMDS testing program such as: 

* Insufficient consideration of modeling and simulation requirements in 
MDA flight test plans, though they emphasized that MDA is finalizing a 
list of such parameters for future flight test plans, 

* Use of artificialities in flight tests which limit the realism of 
scenarios for anchoring models and simulations,[Footnote 31] and: 

* Inadequate test planning for comprehensive modeling of weather 
conditions.[Footnote 32] 

MDA intends to verify and validate models and simulations by December 
2009 for Performance Assessment 2009. However, BMDS Operational Test 
Agency officials stated that there is a high risk that the Performance 
Assessment 2009 analysis will be delayed because of remaining 
challenges and MDA's slow progress in accreditation, as follows: 

* The compressed schedule of ground and flight tests leaves little time 
for data analysis that is essential to anchor models to those tests, 
particularly for a complete analysis supporting MDA's Performance 
Assessment 2009. 

* Out of 40 models, the BMDS Operational Test Agency recommended in 
January 2009 full accreditation for only 6 models, partial 
accreditation for 9 models, and no accreditation for 25 models. 

* Because MDA canceled the follow-on Performance Assessment 2008, the 
BMDS Operational Test Agency did not receive verification and 
validation data that would have been included in the modeling and 
simulation portion of its 2008 operational assessment. 

BMDS Operational Test Agency officials told us that MDA also does not 
adequately plan for the collection of flight test data and post-flight 
reconstruction to support anchoring MDA models and simulations, even 
though post-flight reconstruction is needed to validate that models and 
simulations are adequate representations of the real world for their 
intended purpose.[Footnote 33] MDA guidance emphasizes that one of the 
primary objectives of the MDA ground and flight test program is to 
anchor BMDS models and simulations. Additionally, this guidance 
requires MDA's testing program to work with the MDA engineers to define 
a test program that anchors these models and simulations across the 
operating spectrum. According to BMDS Operational Test Agency 
officials, the first full post-flight reconstruction was conducted in 
December 2008. 

Despite the guidance delineating responsibilities for test data, MDA 
test plans currently do not include enough detail to allocate and 
synchronize resources in order to anchor models and simulations. MDA 
recently initiated a three-phase review of the entire BMDS test 
program. According to MDA, the review will emphasize basing BMDS test 
planning and test design on critical factors that have not yet been 
proven and that will drive target selection requirements. One outcome 
of the review will be to create integrated 
campaigns of ground and flight tests to efficiently collect data needed 
to validate the models and simulations. MDA intends to complete all 
three phases of the review by May 2009, after which MDA intends to have 
a date when all MDA models and simulations will be verified and 
validated. However, the current lack of flight test data for MDA's and 
BMDS Operational Test Agency analysis prevents the timely validation of 
models and simulations that are used to build the 2009 end-to-end 
performance assessment. 

Production, Fielding, and Declaration of Capabilities Proceed despite 
Delays in Testing and Assessments: 

In fiscal year 2008, MDA met most of its delivery goals. However, it 
continued to pursue a concurrent development, manufacturing, and 
fielding strategy in which assets are produced and fielded before they 
are fully demonstrated through testing and modeling. Although flight 
tests and modeling and simulation produced less validation of 
performance than planned, MDA continued manufacturing untested 
components and declaring capabilities ready for fielding. For example, 
10 of the new configuration kill vehicles for the GBI will have been 
manufactured and delivered before being flight-tested. MDA also 
declared that it had fielded 9 of 22 BMDS capabilities planned for 2008 
(postponing 13), but due to test cancellations and performance 
assessment delays, it had to change the basis of these declarations, 
often relying on previous, less realistic testing. 

MDA Met Most 2008 Asset Delivery Goals: 

MDA achieved four of the five delivery goals it set for fiscal year 
2008, as shown in table 8. 

Table 8: BMDS Deliveries and Total Fielded Assets as of September 30, 
2008: 

BMDS element: GMD; 
Fiscal year 2008 delivery goals: 3 interceptors; 
Assets delivered in fiscal year 2008: 0 interceptors; 
Total assets available (cumulative total of assets since 2005): 24 
interceptors[A]. 

BMDS element: Sensors; 
Fiscal year 2008 delivery goals: 1 AN/TPY-2 radar; Sea-based X-band 
radar; 
Assets delivered in fiscal year 2008: 1 AN/TPY-2 radar; Sea-based X-
band radar[B]; 
Total assets available (cumulative total of assets since 2005): 4 
AN/TPY-2 radars[C]; Sea-based X-band radar. 

BMDS element: Aegis BMD; 
Fiscal year 2008 delivery goals: 20 SM-3 missiles; 
Assets delivered in fiscal year 2008: 20 SM-3 missiles; 
Total assets available (cumulative total of assets since 2005): 34 SM-3 
missiles; 15 destroyers; 3 cruisers. 

BMDS element: C2BMC; 
Fiscal year 2008 delivery goals: 1 fielding and activation site; 
Assets delivered in fiscal year 2008: 1 fielding and activation site; 
Total assets available (cumulative total of assets since 2005): 6 
suites; 31 Web browsers; 1 fielding and activation site; 46 enterprise 
workstations. 

Source: MDA (data); GAO (presentation). 

[A] The GMD program did not deliver any interceptors as planned in 
fiscal year 2008, but was able to deliver two interceptors--one in 
October 2008 and one in November 2008. Therefore, the cumulative total 
for GBIs as of December 2008 is 26. 

[B] Partial capability for the Sea-based X-band radar will be based on 
satisfying planned objectives for two tests scheduled for fiscal year 
2009. 

[C] AN/TPY-2 radars were formerly known as Forward-Based X-Band 
Transportable radars. According to MDA, an additional AN/TPY-2 radar 
has been provided and is undergoing government ground testing. 

[End of table] 

The agency planned to deliver the Sea-based X-band radar and three 
additional GBIs for Block 1.0, 20 additional SM-3 missiles for its 
Block 2.0 capability, a C2BMC site for fielding and activation for its 
Blocks 3.1/3.2 and 5.0 capabilities, and an additional AN/TPY-2 
radar.[Footnote 34] Although partial capability for the Sea-based X- 
band radar will not be declared until at least fiscal year 2009, it was 
approved for Early Capability Delivery in fiscal year 2008. The agency 
delivered the Aegis BMD SM-3 missiles, the AN/TPY-2 radar, and the 
C2BMC site in fiscal year 2008 as planned, but was unable to deliver 
the GBIs because the GMD element encountered development challenges 
with components for the CE-II EKV. In addition, the Navy Commander, 
Operational Test and Evaluation Force declared the Aegis BMD 3.6 system 
as operationally suitable and effective in October 2008. This decision 
signifies that 18 Aegis BMD-equipped ships and 90 SM-3 missiles are 
ready for transition to the Navy. 

Production and Fielding of BMDS Systems Getting Ahead of Testing: 

Despite developmental problems, test delays, and MDA's inability to 
complete all fiscal year 2008 Director's test knowledge points, 
manufacturing, production, and fielding have proceeded close to 
schedule. In some cases fielding has gotten ahead of testing. For 
example, Aegis BMD expected to assess the ability of the SM-3 Block 1A 
missile to engage and intercept a long-range ballistic target to 
satisfy a Director's test knowledge point. Even though that test has 
been delayed until the third quarter of fiscal year 2009, MDA purchased 
20 SM-3 Block 1As in fiscal year 2008. 

Furthermore, MDA intended to assess, through flight tests, the CE-I 
EKV's capability against scenarios that included complex target scenes 
with countermeasures. However, due to the frequent restructuring of its 
test plan and a target failure during its most recent flight test, the 
fielded configuration for GMD has not completed a test against 
countermeasures. According to MDA, no more CE-I flight tests have been 
approved, although the agency is considering additional flight testing 
of the CE-I EKV in the future. Moreover, earlier ground and flight 
testing, along with manufacturing discoveries, prompted the GMD program 
to initiate a refurbishment program for the kill vehicles and the 
boosters. Refurbishment consists of: (1) reliability improvements to 
address high priority risks and to support the development and 
understanding of GBI reliability and (2) surveillance of aging through 
the examination of removed components. Consequently, the capability of 
the CE-I, including improvements designed to mitigate risk, may never 
be flight-tested against targets employing countermeasures, and its 
capabilities and limitations against such targets may never be 
understood, yet all 24 interceptors with this configuration are 
already emplaced and declared operational. 

More importantly, the GMD program continues to experience test delays, 
causing fielding to outpace flight tests as shown in figure 4. 

Figure 4: GMD Flight Test and Fielding Plan for Interceptors 
Comparison--September 2006 versus January 2009: 

[Refer to PDF for image: illustration] 

As of September 2006: 

Integrated flight tests planned: 
FY06, Q1: FT-1 (achieved); Type: CE I EKV; 
FY06, Q4: FTG-2 (achieved); Type: CE I EKV; 
FY07, Q1: FTG-3; Type: CE I EKV; 
FY07, Q3: FTG-4; Type: CE I EKV; 
FY07, Q4: FTG-5; Type: CE I EKV; 
FY08, Q1: FTG-6; Type: CE II EKV New processor; 
FY08, Q2: FTG-7; Type: CE I EKV; 
FY08, Q4: FTG-8; Type: CE I EKV; 
FY09, Q1: FTG-9. Type: CE I EKV and CE II EKV New processor. 

Fieldings: 
FY06, Q1: CE I EKV (achieved); 
FY06, Q2: CE I EKV (achieved); 
FY06, Q4: CE I EKV (2) (achieved); 
FY07, Q1: CE I EKV (3); 
FY07, Q2: CE I EKV (2); 
FY07, Q4: CE I EKV (2); 
FY08, Q1: CE I EKV (2); 
FY08, Q2: CE II EKV New processor; 
FY08, Q3: CE II EKV New processor (2); 
FY08, Q4: CE II EKV New processor (3); 
FY09, Q1: CE I EKV; 
FY09, Q2: CE II EKV New processor (3); 
FY09, Q3: CE II EKV New processor (3); 
FY09, Q4: CE II EKV New processor (3); 
FY10, Q1: CE II EKV New processor (3); 
FY10, Q2: CE II EKV New processor (2). 

As of January 2009: 

Integrated flight tests planned: 
FY06, Q1: FT-1 (achieved); Type: CE I EKV; 
FY06, Q4: FTG-2 (achieved); Type: CE I EKV; 
FY07, Q4: FTG-3a (achieved); Type: CE I EKV; 
FY09, Q1: FTG-5 (achieved); Type: CE I EKV; 
FY09, Q4: FTG-6. Type: CE II EKV New processor. 

Fieldings: 
FY06, Q1: CE I EKV (achieved); 
FY06, Q2: CE I EKV (achieved); 
FY06, Q4: CE I EKV (2) (achieved); 
FY07, Q1: CE I EKV (achieved); 
FY07, Q2: CE I EKV (2) (achieved); 
FY07, Q3: CE I EKV (3) (achieved); 
FY07, Q4: CE I EKV (5) (achieved); 
FY08, Q1: CE I EKV (2) (achieved); 
FY09, Q1: CE II EKV New processor (2) (achieved); 
FY09, Q2: CE II EKV New processor; 
FY09, Q3: CE II EKV New processor (3). 

Source: GAO analysis of MDA data. 

[End of figure] 

For example, the program has been able to conduct only two intercepts 
since 2006 to verify the fielded configuration, yet the production of 
interceptors continues. According to GMD's September 2006 flight test 
plan, the program was to conduct seven flight tests during fiscal 
years 2007 and 2008 and the first quarter of fiscal year 2009, 
including a test that would use 2 GBIs against a single target--known 
as a salvo test[Footnote 35]--and field 16 new GBIs. By January 2009, GMD 
had changed its plan, removing the salvo test and conducting two flight 
tests, yet it fielded 13 GBIs. 

Similarly, GMD had planned to conduct an intercept test to assess the 
enhanced version of the EKV called the Capability Enhancement II in the 
first quarter of fiscal year 2008, months before emplacing any 
interceptors with this configuration. However, developmental problems 
with the new configuration's inertial measurement unit and the target 
delayed the first flight test with the CE-II configuration--FTG-06-- 
until at least the fourth quarter of fiscal year 2009. Despite these 
delays, MDA expects to have emplaced five CE-II interceptors before 
this flight test. MDA indicated that these five interceptors will not 
be declared operational until the satisfactory completion of the test 
and the Program Change Board declares their status. However, MDA 
projects that 10 CE-II EKVs will have been manufactured and delivered 
before that first flight test demonstrates the CE-II capability. This 
amounts to over half of the CE-II EKV deliveries that are currently on 
contract. 

MDA did not emplace the three GBIs it needed to meet its fiscal year 
2008 fielding goals. MDA will have to emplace twice as many GBIs as 
planned in fiscal year 2009 before Block 1.0 can be declared complete. 
As of January 2009, the agency had emplaced two and must emplace four 
more in order to complete Block 1.0 as planned. 

Major defense and acquisition programs must complete operational test 
and evaluation before entering full-rate production.[Footnote 36] 
Because MDA considers the assets it has fielded to be developmental, it 
has not advanced BMDS elements to DOD's acquisition cycle or begun full-
rate production. Therefore, MDA has not yet triggered the requirement 
for an operational test and evaluation prior to fielding. However, 
MDA's concurrent approach to developing and fielding assets has led to 
testing problems and concerns about the performance of some fielded 
assets. After two flight test failures in 2005, MDA undertook a Mission 
Readiness Task Force to establish confidence in GMD's ability to 
reliably hit its target, establish credibility in setting and meeting 
test event dates, build increasing levels of operationally realistic 
test procedures and scenarios, raise confidence in successful outcomes 
of flight missions, and conduct the next flight test as soon as 
practical within acceptable risk bounds. However, GMD accelerated the 
objectives for its test program after the first Mission Readiness Task 
Force flight test, and the program continues to experience developmental 
problems with key interceptor components. MDA also separately 
established a refurbishment program designed to replace questionable 
interceptor parts and increase reliability of GBIs. Since 2006, we have 
reported that the performance of some fielded GBIs was uncertain. 

Despite MDA's previous efforts to build confidence in its test program, 
MDA continues to pursue a risky approach for fielding BMDS assets under 
its new block structure. In March 2008, we reported that MDA's new 
block structure did not address whether it would continue its practice 
of concurrently developing and fielding BMDS elements and components. 
However, in 2008 the agency continued to field assets without adequate 
knowledge. MDA emplaced GBIs during the year although its refurbishment 
program was barely underway, meaning that the risks of rework continue. 
To date, 26 GBIs have been emplaced--many of which may contain 
unreliable parts--and only a few have been refurbished since the 
initiation of the refurbishment program in 2007. According to program 
officials, some improvements have already been introduced into the 
manufacturing flow and demonstrated during flight testing. 

While it is always a concern when tests are eliminated or the 
complexity of a planned test is reduced, the concern is heightened for 
a system of systems such as the BMDS because of the complex interaction 
of components within an element, and between that element and the other 
elements within the BMDS. Consequently, the need to synchronize the 
development and testing of different capabilities is crucial before 
fielding begins. For example, for certain engagement scenarios, the 
ground-based interceptor will launch based on information provided by 
an entirely separate element such as an Aegis cruiser or destroyer. If 
a problem is discovered during these flight tests, post-flight 
reconstruction using models needs to be conducted, the root-cause must 
be determined, a solution or mitigation must be developed and 
implemented, and a new test to confirm the effectiveness of the 
solution or mitigation must be performed. 

Reduced Testing Has Delayed Some Capability Declarations and Weakened 
the Basis for Others: 

When MDA determines that a capability can be considered for operational 
use, it does so through a formal declaration. MDA uses an incremental 
declaration process to designate BMDS capability for its blocks in 
three levels--early, partial, and full. The first two levels allow these 
BMDS features to play a limited role in system operations before they 
have attained their full level of capability. Each capability 
designation in the delivery schedule represents upgraded capacity to 
support the overall function of BMDS in its mission as well as the 
level of MDA confidence in the system's performance. Capability 
declarations are important because MDA uses them to assess progress 
toward block completion. MDA guidance calls for an orderly sequence of 
events that lead to declaring that a fielded capability has been 
achieved and is ready for consideration for operational use. 

MDA bases its declarations on, among other things, a combination of 
models and simulations--such as end-to-end performance assessments-- 
and ground tests all anchored to flight test data. Because performance 
assessments analyze the BMDS as an entire system in a variety of ways, 
they provide more comprehensive information than flight tests alone. 
These events and assessments build on each other every year as MDA adds 
capabilities by improving hardware and software. MDA decision makers 
would then declare the achievement of capability goals for engagement 
sequence groups based on performance assessments. While in some 
instances, declarations of capability have been deferred, in other 
instances MDA has declared capabilities despite shortfalls in testing, 
modeling and simulation, and performance assessments. The agency 
declared the delivery of nine capabilities during fiscal year 2008 as 
shown in figure 5 below. It declared three early capabilities for Block 
1.0 engagement sequence groups, three early as well as one full 
capability for Block 2.0 engagement sequence groups, and one early 
capability and one partial capability for Block 3.1/3.2 engagement 
sequence groups. 

Figure 5: Timeline Showing Declaration of Capabilities in Fiscal Year 
2008: 

[Refer to PDF for image: illustration] 

Block 1.0: 

GBI Launch-on COBRA DANE radar (Beale): 
FY08, Q1: Capability declaration, early; 
FY08, Q4: Milestone achieved, early. 

GBI Engage-on sea-based X-band radar: 
FY08, Q1: Capability declaration, early; 
FY08, Q4: Milestone achieved, early. 

GBI Launch-on sea-based X-band radar: 
FY08, Q1: Capability declaration, early; 
FY08, Q4: Milestone achieved, early. 

Block 2.0: 

SM-3 Engage-on shipboard Aegis radar: 
FY08, Q2: Full milestone scheduled; 
FY08, Q4: Full milestone achieved. 

SM-3 Launch on remote shipboard Aegis radar: 
FY08, Q2: Milestone scheduled, early; 
FY08, Q4: Milestone achieved, early. 

SM-2 Engage-on shipboard Aegis radar: 
FY08, Q4: Milestone scheduled, early; 
FY08, Q4: Milestone achieved, early. 

THAAD Engage-on AN/TPY-2 radar (terminal mode): 
FY08, Q3: Milestone scheduled, early; 
FY08, Q4: Milestone achieved, early. 

Block 3.1/3.2: 

GBI Engage-on sea-based X-band radar: 
FY08, Q2: Milestone achieved, partially. 

GBI Launch-on sea-based X-band radar: 
FY08, Q3: Milestone scheduled, early; 
FY08, Q4: Milestone achieved, early. 

Source: GAO analysis of MDA data. 

Note: Our analysis above is based on the MDA Master Execution Fielding 
Schedule dated October 2007 as well as the Master Fielding Plan dated 
February 2008. Commensurate with its new block structure, MDA reported 
a subset of these as part of its fiscal year 2008 Statement of Goals 
dated January 2008. However, MDA continues to work toward declaring all 
of the October 2007 engagement sequence groups. 

[End of figure] 

MDA had intended to use the results of a flight test (FTG-04) that was 
later canceled;[Footnote 37] a distributed ground test (GTD-03) that 
was delayed into fiscal year 2009; and the results of Performance 
Assessments 2007 and 2008 to determine whether capabilities were ready 
for declaration in fiscal year 2008. These shortfalls in knowledge led 
MDA to reduce the basis for declaring capability goals. 
Performance Assessment 2007--identified by MDA as a key source to 
assess capabilities during fiscal year 2008--achieved only limited 
accreditation.[Footnote 38] This less-than-full accreditation indicated 
that MDA could not rely on the assessment's results to gauge end-to-end 
BMDS performance. Subsequently, MDA officials decided to cancel 
Performance Assessment 2008 because they needed time to address 
problems and prepare for Performance Assessment 2009. 

While MDA officials declared these capabilities during fiscal year 
2008, in most cases they did so only after reducing the basis for the 
declarations. In several cases they reverted to older ground and 
flight tests, though in a few cases MDA also added some newer flight 
and ground tests. For example, MDA declared early Block 1.0 capability 
for three engagement sequence groups in fiscal year 2008 without the 
planned results from Performance Assessment 2007. Although MDA had 
intended in all cases to use the final results from comprehensive 
performance assessments, it eliminated them entirely after revising 
the basis for declaring capability goals. Specifically, MDA dropped some sources of 
data it expected to use, such as the canceled Performance Assessment 
2008, and shifted from flight and ground tests planned to occur in 
fiscal year 2008 to older flight and ground tests. For example, in 
Block 2.0 MDA declared full capability during fiscal year 2008 for one 
engagement sequence group, Aegis BMD engage on its shipboard radar, 
even though Performance Assessment 2008 had been canceled. MDA instead 
based its decision on integrated and distributed ground tests (GTI-02 
and GTD-02) conducted in calendar year 2007 as well as prior flight 
tests during fiscal years 2006 through 2008. However, the BMDS 
Operational Test Agency raised concerns about the comprehensiveness 
of the GTI-02 scenarios,[Footnote 39] specifically, the incorrect 
configuration of U.S. satellites and threat data. 

MDA also deferred 13 capability goals scheduled to occur in fiscal year 
2008 to the end of fiscal year 2009, as shown in figure 6 below. 

Figure 6: Timeline Showing Deferred Declaration of Capabilities from 
Fiscal Year 2008 to 2009: 

[Refer to PDF for image: illustration] 

Block 1.0: 

GBI Engage-on COBRA DANE radar (Beale): 
FY08, Q2: Full milestone scheduled; 
FY09, Q3: Full milestone achieved. 

GBI Launch-on COBRA DANE radar (Beale): 
FY08, Q1: Partial milestone scheduled; 
FY09, Q4: Partial milestone achieved. 

GBI Engage-on shipboard Aegis radar: 
FY09, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone achieved. 

GBI Launch-on shipboard Aegis radar: 
FY08, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone achieved. 

GBI Engage-on forward-based AN/TPY-2 radar: 
FY08, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone achieved. 

GBI Launch-on forward-based AN/TPY-2 radar: 
FY08, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone achieved. 

GBI Engage-on sea-based X-band radar: 
FY08, Q1: Partial milestone scheduled; 
FY09, Q4: Partial milestone achieved. 

GBI Launch-on sea-based X-band radar: 
FY08, Q1: Partial milestone scheduled; 
FY09, Q4: Partial milestone achieved; 
FY09, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone achieved. 

Block 2.0: 

SM-3 Launch-on shipboard Aegis radar: 
FY08, Q1: Partial milestone scheduled; 
FY09, Q4: Partial milestone still delayed; 
FY08, Q4: Full milestone scheduled; 
FY09, Q4: Full milestone still delayed. 

Block 3.1/3.2: 

GBI Engage-on COBRA DANE radar, Mod 1 (Fylingdales, UK; forward-based 
AN/TPY-2 radar): 
FY08, Q2: Full milestone scheduled; 
FY09, Q4: Full milestone still delayed. 

GBI Launch on forward-based AN/TPY-2 radar, Mod 1a (Hercules 1): 
FY08, Q3: Early milestone scheduled; 
FY09, Q4: Early milestone still delayed. 

GBI Engage on forward-based AN/TPY-2 radar, Mod 1a (Hercules 1): 
FY08, Q3: Early milestone scheduled; 
FY09, Q4: Early milestone still delayed. 

Source: GAO analysis of MDA data. 

Note: Our analysis above is based on the MDA Master Execution Fielding 
Schedule dated October 2007 as well as the Master Fielding Plan dated 
February 2008. Commensurate with its new block structure, MDA reported 
a subset of these as part of its fiscal year 2008 Statement of Goals 
dated January 2008. However, MDA continues to work toward declaring all 
of the October 2007 engagement sequence groups. Several engagement 
sequence groups are not shown here because they are Block 3.3, which 
MDA has not yet baselined. 

[End of figure] 

MDA intended to declare all Block 1.0 engagement sequence groups as 
fully capable by the middle of fiscal year 2009.[Footnote 40] However, 
as MDA encountered test delays and technical challenges, it had to 
defer full capability declaration for these engagement sequence groups 
until the end of fiscal year 2009. For Block 2.0, MDA also deferred 
declaring full capability for one of the two planned full capability 
declarations for fiscal year 2008. This declaration is contingent upon 
the review of a ground test that has been rescheduled to the second 
quarter of fiscal year 2009 and a flight test rescheduled to the third 
quarter of fiscal year 2009. MDA also deferred one full and two early 
capability declarations for Block 3.1/3.2 beyond the end of fiscal year 
2009. 

In response to the limitations of Performance Assessment 2007, the 
cancellation of Performance Assessment 2008 and FTG-04, and the delayed 
GTD-03 and FTG-05 flight tests, MDA is planning to rely on older ground 
and flight tests; a sensor flight test, FTX-03, instead of intercept 
flight tests; and the initial quick look review of Performance 
Assessment 2009 instead of the previously planned full analysis. 
Appendix IV provides a detailed layout for the reduced basis of 
capability declarations for fiscal years 2008 and 2009. 

Since MDA was only able to declare a few of the capabilities it planned 
for fiscal year 2008, the schedule for fiscal year 2009 and subsequent 
years will be compressed if the agency plans to maintain the schedule 
it has set for its blocks. For example, MDA may need to declare three 
times as many capabilities as originally planned for fiscal year 2009 
in order to meet the 2009 capability declaration schedule. In addition, 
if the schedule cannot be maintained, MDA will likely have to make 
further adjustments to mitigate additional delays in BMDS capabilities. 

Increased reliance on integrated ground testing will provide less 
knowledge than a complete analysis of capabilities from a performance 
assessment. Integrated ground testing involves less robust conditions 
than distributed ground testing, which involves operational systems in 
the field. The MDA master fielding plan indicates that the agency 
originally intended to take a more comprehensive approach upon which to 
base capability declarations. Reliance on an upcoming Performance 
Assessment 2009 quick look for Block 1.0 completion is a particular 
concern because the knowledge it provides may be limited with respect 
to testing, according to BMDS Operational Test Agency officials. For 
example, officials told us that a quick look may indicate anomalies 
from a test but will not analyze their causes. In contrast, MDA 
originally planned to have a complete analysis from the Performance 
Assessment 2009 models, simulations, and tests. 

Limited Progress Made in Improving Transparency and Accountability: 

In March 2008, we reported that efforts were underway to improve BMDS 
management, transparency, accountability, and oversight including a new 
executive board outside of MDA and a new block structure along with 
other improvements within MDA.[Footnote 41] Since that time, the 
executive board that was established in 2007 has acted with increased 
oversight. MDA's efforts, however, have not made the expected progress. 
In particular, MDA has decided to retain the option of deferring work 
from one block to another; cost baselines have not been established; 
test baselines remain relatively unstable; and requesting procurement 
funds for some assets, as directed by Congress, will not occur until 
fiscal year 2010.[Footnote 42] 

To accomplish its mission, in 2002 the Secretary of Defense gave MDA 
requirements, acquisition, and budget flexibilities and relief from 
some oversight mechanisms and reporting responsibilities. The 
flexibility granted to MDA has allowed concurrent development, testing, 
manufacturing, and fielding. MDA used this flexibility to quickly 
develop and field the first increment of capability in 2005. In August 
2008, in response to Congressional direction to assess the current and 
future missions, roles, and structure of MDA, an independent study 
group agreed that there is a need to move MDA toward more normal 
acquisition processes. However, the group noted that the continuous 
evolution of the BMDS requires that the approach to setting 
requirements for, developing, and fielding increments of capability 
should remain governed by special authorities, with oversight by the 
Missile Defense Executive Board (MDEB). Further, regarding budget 
flexibilities, the independent group concluded that while these 
flexibilities may have been deemed necessary at the time, there was no 
reason to expect that all of the special authorities granted to MDA 
would need to continue in full force once the President's goal of 
deploying a set of initial capabilities had been achieved. 

Missile Defense Executive Board's Oversight Role More Active in 2008: 

During 2008, the MDEB appeared to act with an increased level of 
authority in providing oversight of MDA and the BMDS. For example, the 
board took on a major role in making key decisions regarding the 
transition of elements to military services. We previously reported 
that MDA and the military services had been negotiating the transition 
of responsibilities for the sustainment of fielded BMDS elements, but 
this process had proven arduous and time consuming. However, in 2008, 
with the influence of the MDEB, a lead military service was designated 
for one BMDS asset--the Sea-based X-band 
radar.[Footnote 43] 

In March 2008, we reported that the MDEB could play a key role in the 
Joint Requirements Oversight Council's proposal to return the BMDS to 
the Joint Capabilities Integration and Development System requirements 
process--a formal DOD procedure followed by most DOD programs that 
defines acquisition requirements and evaluation criteria for future 
defense programs. In responding to the proposal, the Acting Under 
Secretary of Defense for Acquisition, Technology, and Logistics 
recommended that the Deputy Secretary of Defense delay his approval of 
the Joint Staff's proposal until the MDEB could review the proposal and 
provide a recommendation. According to Acquisition, Technology and 
Logistics officials, no decision has been made regarding returning the 
BMDS to the requirements process. However, the Deputy Secretary of 
Defense, in September 2008, appeared to strengthen the oversight role 
of the MDEB, clarifying the roles of the MDEB as well as MDA, the 
Office of the Secretary of Defense, Combatant Commands, and Military 
Departments. With respect to the role of the MDEB, he established a 
life cycle management process for the BMDS stating that the MDEB will 
recommend and oversee implementation of strategic policies and plans, 
program priorities, and investment options to protect our Nation and 
our allies from missile attack. One of the MDEB functions is to provide 
the Under Secretary of Defense for Acquisition, Technology, and 
Logistics--or Deputy Secretary of Defense, as necessary--a recommended 
strategic program plan and feasible funding strategy for approval. The 
Deputy Secretary further noted that, through the use of the BMDS Life 
Cycle Management Process outlined in the memo, the MDEB will oversee 
the annual preparation of the BMDS portfolio, including BMDS-required 
capabilities and a program plan to meet the requirements with Research, 
Development Test & Evaluation, procurement, operations and maintenance, 
and military construction resources in defense-wide accounts. 

To further increase transparency and oversight of the BMDS, the Under 
Secretary of Defense for Acquisition, Technology, and Logistics plans 
to hold program reviews for several BMDS elements commensurate with the 
authority granted to the MDEB by the Deputy Secretary of Defense. 
According to Under Secretary of Defense for Acquisition, Technology, 
and Logistics officials, the MDEB conducted the first of these 
reviews, of the THAAD program, in November 2008. This review covered production 
and the element's contract schedule. Under Secretary of Defense for 
Acquisition, Technology, and Logistics officials told us that these 
reviews are designed to provide the Deputy Director Acquisition, 
Technology, and Logistics with comprehensive information that will be 
used as the basis for MDEB recommendations for the BMDS business case 
and baseline processes--which, according to these officials, are 
similar to the traditional Defense Acquisition Board process for 
reviewing other major acquisition programs. However, it is unclear 
whether the information provided to the MDEB will be comparable to that 
produced for other major acquisition program reviews as most of the 
information appears to be derived or presented by MDA as opposed to 
independent sources as required for traditional major defense 
acquisition programs.[Footnote 44] 

Efforts to Improve Transparency of MDA's Work Have Not Progressed as 
Planned: 

Deferral of Work: 

In 2007, MDA redefined its block structure to better communicate its 
plans and goals to Congress. The agency's new structure is based on 
fielding capabilities that address particular threats instead of the 
biennial time periods previously used to develop and field the BMDS. 
Last year, we reported that MDA's new block plans included many 
positive changes.[Footnote 45] However, MDA, with its submission of its 
Fiscal Year 2008 Statement of Goals, reversed some of the positive 
aspects of the new block structure. For example, we previously reported 
that the new block structure would improve the transparency of each 
block's actual cost by disallowing the deferral of work from one block 
to another. Under its prior block structure, MDA deferred work from one 
block to another; but it did not track the cost of the deferred work so 
that it could be attributed to the block that it benefited. Because MDA 
did not track the cost of the deferred work, the agency was unable to 
adjust the cost of its blocks to accurately capture the cost of each. 
This weakened the link between budget funds and the work performed. 
Last year, MDA officials told us that under the new block approach, MDA 
would no longer transfer work to a different block under any 
circumstances. However, MDA officials recently said that they are retaining the 
option to move work from one block to another as long as it is 
accompanied by a rebaseline. This change allows the agency to continue 
the practice of moving work from one block to another, thereby 
reducing the transparency of the new block structure and undermining 
any baselines that are established. 

Use of Procurement Funds: 

In March 2007, we reported that the majority of MDA's funding comes 
from the Research, Development, Test, and Evaluation appropriation 
account, another flexibility provided by law.[Footnote 46] In past 
years, Congress authorized MDA to pay for assets incrementally using 
research and development funds. This allowed MDA to fund the purchase 
of assets over multiple years. However, in the 2008 National Defense 
Authorization Act, Congress restricted MDA's authority, requiring MDA 
to purchase certain assets with procurement funds and directing that, 
for any year after fiscal year 2009, MDA's 
budget materials must delineate between funds needed for research, 
development, test, and evaluation, procurement, operations and 
maintenance, and military construction. Requesting funding in these 
appropriation categories will require MDA to follow the funding 
policies for each category. For example, using procurement funds will 
require MDA to fully fund assets in the year of their purchase rather 
than incrementally funding them over several years. 
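The practical difference between the two funding approaches can be sketched with a hypothetical example; the asset cost and fiscal-year figures below are invented for illustration and are not drawn from MDA's budget.

```python
# Hypothetical illustration of full funding vs. incremental funding.
# The $120 million asset cost and the fiscal years are invented;
# they are not actual MDA budget figures.

ASSET_COST = 120  # total asset cost, in millions of dollars

# Full funding (required when using procurement appropriations):
# the entire cost is requested in the year of purchase.
full_funding = {2010: 120}

# Incremental funding (previously allowed with R&D appropriations):
# the same cost is spread across several fiscal years.
incremental_funding = {2010: 40, 2011: 40, 2012: 40}

# Both profiles sum to the same total cost.
assert sum(full_funding.values()) == ASSET_COST
assert sum(incremental_funding.values()) == ASSET_COST

# Under incremental funding, however, no single year's request
# reveals the asset's total cost to appropriators.
print(max(incremental_funding.values()))  # -> 40, not 120
```

The sketch is meant only to show why annual budget requests under incremental funding understate an asset's total cost, which is the transparency concern behind the full funding policy.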

The Congressional Research 
Service reported in 2006 that "incremental funding fell out of favor 
because opponents believed it could make the total procurement costs of 
weapons and equipment more difficult for Congress to understand and 
track, create a potential for DOD to start procurement of an item 
without necessarily stating its total cost to Congress, permit one 
Congress to 'tie the hands' of future Congresses, and increase weapon 
procurement costs by exposing weapons under construction to uneconomic 
start-up and stop costs."[Footnote 47] 

In the 2008 National Defense Authorization Act, Congress also provided 
MDA with the authority to use procurement funds for fiscal years 2009 
and 2010 to field its BMDS capabilities on an incremental funding 
basis, without any requirement for full funding. Congress has granted 
similar authority to other DOD programs. In the conference report 
accompanying the Act, Conferees cautioned DOD that additional authority 
will be considered on a limited case-by-case basis and noted that they 
expect that future missile defense programs will be funded in a manner 
more consistent with other DOD acquisition programs. 

MDA did not request any procurement funds for fiscal year 2009, and at 
the time of our audit, the agency had not yet released its fiscal year 
2010 budget request, which is to include such funding categories. MDA 
officials told us that the agency plans to request procurement funds 
for some of its assets in its fiscal year 2010 request but could not 
elaborate on those plans. Because these data were unavailable, it is 
unclear which assets procurement funds will be requested for or the 
extent to which the request will meet the direction given by Congress. 
According to MDA officials, 
information regarding its plans to request procurement funding will not 
be released until spring 2009. 

Baselines: 

Baselines represent starting points against which actual progress can 
be measured and thus provide indications of when a program is 
deviating from its plan. Baselines can be established to gauge 
progress in different areas, including cost, schedule, and testing. 
Overall, the BMDS does not have baselines that are useful for 
oversight. With regard to cost, we have already discussed the lack of 
total and unit cost baselines for missile defense as well as the 
frequency of changes in contract baselines. MDA made some progress in 
developing a schedule baseline for its blocks and their associated 
capabilities. The agency's annual Statement of Goals identifies its 
schedule baseline as the fiscal year dates for early, partial, and full 
capability deliveries of hardware and functionality for a block. Thus, 
while MDA has changed its schedule for making declarations, the effect 
of the change can be determined by comparison with the original 
schedule. 

MDA does not have test baselines for its blocks. The agency does, 
however, have baselines for its test program, but it revises them so 
frequently that they are not effective for oversight. The agency 
identified its Integrated Master Test Plan as the test baseline for the 
BMDS. However, as depicted in table 9, the agency has made a number of 
changes to the content of the baseline. 

Table 9: MDA BMDS Test Baseline Revisions: 

Version: 5.6.2; 
Revisions/change date: February 20; 
Rationale for change: Interim Update: Changed for signature; 
Version approved: [Check]. 

Version: 8.01; 
Revisions/change date: August 15; 
Rationale for change: Incorporated MDA new block construct; migrated 
from calendar year format to fiscal year format; 
Version approved: [Empty]. 

Version: 8.04; 
Revisions/change date: October 12; 
Rationale for change: Updated funding status. Incorporated Operational 
Test Agency input; 
Version approved: [Empty]. 

Version: 8.06; 
Revisions/change date: November 6; 
Rationale for change: Preparation for internal MDA coordination; 
Version approved: [Empty]. 

Version: 8.07; 
Revisions/change date: December 11; 
Rationale for change: Program Change Board changes incorporated. 
Includes some fiscal year 2008 and 2009 budget decisions; 
Version approved: [Empty]. 

Version: 8.1; 
Revisions/change date: February 5; 
Rationale for change: Updated with Program Change Board changes; 
Version approved: [Check]. 

Version: 8.4; 
Revisions/change date: July 19; 
Rationale for change: Quarterly update based on Program Change Board 
and working group decisions; 
Version approved: [Empty]. 

Version: 9.1; 
Revisions/change date: September 26; 
Rationale for change: Quarterly update limited to schedules; 
Version approved: [Empty]. 

Source: GAO analysis of MDA documents. 

[End of table] 

The officially approved test baseline changes every year, and numerous 
more informal changes occur even more frequently. Most of the annual 
revisions to the test baseline occur because MDA has changed the 
substance of tests, changed the timing of tests, or added tests to the 
baseline. 

The Integrated Master Test Plan establishes the executable test program 
for the current fiscal year and extends through the following fiscal 
year. According to MDA, the test plan is updated quarterly based upon 
decisions from MDA leadership or formal decision-making forums such as 
the Program Change Board, and is also coordinated annually with 
external agencies such as the Office of the Director of Operational 
Test and Evaluation. However, as shown in table 9, there are several 
versions for a given quarter and as many as seven versions have been 
developed since the fiscal year 2008 baseline was established. It is 
unclear which version of the Integrated Master Test Plan MDA is 
managing to at any given time. For example, in November 2008, we 
requested the latest 
version of the Integrated Master Test Plan and were told that the 
latest approved version was 8.4, which was revised in July 2008. 
However, the signature page for that version is from a prior version-- 
version 8.1. Since there is no signature page referring to version 8.4, 
it appears that this version is unapproved, though MDA officials told us 
that it was being used to manage the BMDS. 

Agency officials maintain that the document is used to manage tests and 
associated requirements for key test events. However, it is unclear how 
well the baseline is integrated with other key aspects of testing such 
as the acquisition of targets needed to execute tests. For example, in 
some instances, targets are placed on contract for two or more years in 
advance of the planned tests. Yet, the test baseline--the Integrated 
Master Test Plan--does not appear to include events beyond the 
following fiscal year that are key to the BMDS test program. As we 
reported in September 2008, MDA officials acknowledged that its target 
contracts did not capture all element testing requirements and target 
baselines were not always established before target contracts were 
signed.[Footnote 48] 

Conclusions: 

In 2002, MDA was given unusual authorities to expedite the fielding of 
an initial BMDS capability. As this initial capability was fielded in 
2005, it showed the benefits of these flexibilities. MDA has improved 
on this capability in the ensuing years, including 2008, the focus of 
this report. Today, the program is still operating at a fast pace, as 
production and fielding of assets outstrip the ability to test and 
validate them. A collateral effect of these flexibilities has been 
reduced visibility into actual versus planned progress. Some 
fundamental questions of an oversight nature are not yet answerable. 
For example, a comparison of actual versus planned costs at the system 
or asset level is not yet possible, nor is an assessment of the 
performance of the fielded system as a whole. In 2007, MDA began 
efforts to improve visibility into its actual performance, starting 
with the new way of defining blocks, coupled with DOD's creation of 
the MDEB. However, progress has been slow in some areas, and 
value for money cannot be satisfactorily assessed. Delays are 
especially consequential in a program of this size: a one-year delay 
in establishing cost baselines means another $8 billion to $10 billion 
may be spent in the meantime. 

With the transition to a new administration, the deployment and 
subsequent improvement of an initial capability, a new agency Director, 
and a new block structure for managing the BMDS, an opportunity exists 
to revisit and strengthen the processes by which MDA operates. Looking 
to the future, decision makers in Congress and DOD face multi-billion 
dollar investment decisions in allocating funds both within MDA and 
between MDA and other DOD programs. At this point, a better balance 
must still be struck between the information Congress and DOD need to 
conduct oversight of the BMDS and the flexibility MDA needs to manage 
across the portfolio of elements that collectively constitute the 
system's capability. 

At this point, the balance does not provide sufficient information for 
effective oversight. In particular: 

* Total cost and unit cost baselines have not been set and contract 
baselines are subject to frequent changes. Even if such baselines are 
set as planned, they will only capture about 26 percent of MDA's work. 

* Less testing is conducted than planned, thus delaying the validation 
of the models and simulations needed to assess the overall performance 
of the BMDS. Moreover, test plans do not hold and are revised often, in 
many cases due to the poor performance of target missiles. The current 
test plan is at risk given its ambitious scope. 

* Manufacturing, production, and fielding are outpacing testing, 
modeling, and validation. Consequently, fielding decisions and 
capability declarations are being made with limited understanding of 
system effectiveness. 

Recommendations for Executive Action: 

We recommend that the Secretary of Defense direct the MDEB to assess 
how the transparency and accountability of MDA's acquisitions can be 
strengthened to enhance oversight, such as by adopting relevant aspects 
of DOD's normal requirements, acquisition and budgeting processes, 
without losing the beneficial features of MDA's existing flexibility. 

In the near term, we recommend that the Secretary of Defense direct MDA 
to undertake the following 10 actions: 

In the area of cost: 

1. Complete total cost baselines before requesting additional funding 
for Blocks 2.0 and 3.0 and commit to a date when baselines for all 
blocks will be established. 

2. Ensure that transfers of work from one block to another are 
transparent and reported as cost variances. 

3. Provide additional unit cost reports, beyond flyaway unit costs, 
that incorporate both procurement and research and development funding 
so that there is a more comprehensive understanding of the progress of 
the acquisitions. 

In the area of testing and performance: 

4. Expand the BMDS test baseline to include tests scheduled beyond the 
first succeeding year of the plan to ensure its synchronization with 
BMDS contracts. 

5. Ensure that DOT&E is consulted before making significant changes to 
the test baseline so that the tests planned provide DOT&E with 
sufficient data to assess the performance of the BMDS elements. 

6. Ensure that planned test objectives include concrete data 
requirements anchoring models and simulations to real-world tests, 
synchronized with flight and ground test plans and that the effects on 
models and simulations of test cancellations, delays or problems are 
clearly identified and reported. 

7. Reassess the flight tests scheduled for the end of fiscal year 2009 
to ensure that they can be reasonably conducted and analyzed given 
targets and other constraints. 

In the area of knowledge-based decisions: 

8. Synchronize the development, manufacturing, and fielding schedules 
of BMDS assets with the testing and validation schedules to ensure that 
items are not manufactured for fielding before their performance has 
been validated through testing. 

9. Conduct a flight test of the CE-I EKV against a complex target scene 
with countermeasures to complete MDA's previous testing goal of 
understanding the performance capabilities of the first 24 fielded 
GBIs. 

10. Strengthen the capability declarations by using the complete 
analysis from annual performance assessments as the basis for declaring 
engagement sequence groups as fully capable and block development as 
fully complete; otherwise, indicate the limitations of the capabilities 
and steps that MDA will take to reduce the risks. 

Agency Comments and Our Evaluation: 

DOD provided written comments on a draft of this report. These comments 
are reprinted in appendix I. DOD also provided technical comments, 
which were incorporated as appropriate. 

DOD fully concurred with 10 of our 11 recommendations and partially 
concurred with our recommendation that the Secretary of Defense direct 
MDA to synchronize the development, manufacturing, and fielding 
schedules of BMDS assets with testing and validation schedules to 
ensure that items are not manufactured for fielding before their 
performance has been validated through testing. Yet, even DOD's 
response to this recommendation appears to be, in substance, 
concurrence. 

DOD concurred with our recommendation that the Secretary of Defense 
direct MDA to ensure that transfers of work from one block to another 
are transparent and reported as cost variances. DOD noted in its 
response that MDA will report block baselines and variances annually to 
Congress in the BMDS Accountability Report. The Department further 
noted that for the purposes of unit cost reporting, MDA has defined a 
cost variance as a confirmed increase of 10 percent or more in block or 
unit costs when compared to the current cost baseline or 20 percent or 
more compared to the original cost baseline, stating that transfers of 
work creating such cost variances will be reported. The intent of our 
recommendation is to increase visibility into transfers of work between 
blocks regardless of the amount of the increase or the baseline status 
of the blocks. The trigger for reporting the variances selected by DOD 
will not necessarily provide that visibility. Given that only between 2 
and 26 percent of BMDS block and capability development costs from 
fiscal year 2010 to 2013 will be baselined initially, visibility into 
transfers into blocks that are not yet baselined may not occur. 
Further, an increase may not be reported in the baselined block from 
which work is transferred because the transfer would actually yield a 
decrease in the cost of the baselined block. An increase would also not 
be reported in the receiving block if that block is not baselined or if 
the transfer did not increase costs above the threshold. MDA may need 
to consider a different approach to reporting that captures meaningful 
transfers of work into and out of blocks regardless of whether any of 
the blocks are baselined. MDA should work with Congress to determine 
what constitutes a meaningful or significant cost increase. 
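DOD's reporting trigger, and the gap our recommendation is concerned with, can be sketched as a simple rule; the block names, baseline values, and transfer amount below are hypothetical, chosen only to show how a transfer of work can escape reporting.

```python
# Hypothetical sketch of DOD's variance-reporting trigger: a variance
# is reported on a confirmed increase of 10 percent or more against
# the current baseline, or 20 percent or more against the original
# baseline. All dollar figures are invented for illustration.

def variance_reported(original, current, actual):
    """True if the cost growth crosses either reporting threshold."""
    return actual >= current * 1.10 or actual >= original * 1.20

# Block A is baselined at $1,000M; Block B is not yet baselined.
block_a_original = block_a_current = 1000
block_b_baseline = None  # no baseline exists to compare against

# MDA transfers $80M of work out of Block A into Block B.
block_a_actual = 1000 - 80  # the transfer *reduces* Block A's cost
block_b_actual = 80         # the transferred work lands in Block B

# Block A shows a decrease, so no variance is flagged there.
assert not variance_reported(block_a_original, block_a_current, block_a_actual)

# Block B has no baseline, so nothing can be flagged there either:
# the work moved between blocks, yet no variance is reported.
assert block_b_baseline is None
```

The sketch illustrates why a threshold keyed to cost increases within baselined blocks cannot, by itself, surface transfers of work between blocks.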

DOD also concurred with our recommendation that the Secretary of 
Defense strengthen the capability declarations by using the complete 
analysis from annual performance assessments. In responding to our 
recommendation, DOD noted that if there is limited performance 
assessment data, the overall capability assessment will factor in the 
knowledge gained from ground tests and flight tests against the 
identified risks. While we recognize that MDA is not always able to 
complete all of its planned tests in a given time period, when MDA 
decides to change the planned basis for its capability declarations to 
a different or reduced set of data, it is important for the agency to 
clearly report the limitations that affect the capability declaration 
as well as the mitigation steps it is taking. 

We are sending copies of this report to the Secretary of Defense and to 
the Director, MDA. In addition, the report will be available at no 
charge on the GAO Web site at [hyperlink, http://www.gao.gov]. 

If you or your staff have any questions concerning this report, please 
contact me at (202) 512-4841. Contact Points for our offices of 
Congressional Relations and Public Affairs may be found on the last 
page of this report. The major contributors are listed in appendix VI. 

Signed by: 

Paul Francis: 
Director, Acquisition and Sourcing Management: 

List of Congressional Committees: 

The Honorable Carl Levin: 
Chairman: 
The Honorable John McCain: 
Ranking Member: 
Committee on Armed Services: 
United States Senate: 

The Honorable Daniel K. Inouye: 
Chairman: 
The Honorable Thad Cochran: 
Ranking Member: 
Subcommittee on Defense: 
Committee on Appropriations: 
United States Senate: 

The Honorable Ike Skelton: 
Chairman: 
The Honorable John M. McHugh: 
Ranking Member: 
Committee on Armed Services: 
House of Representatives: 

The Honorable John P. Murtha: 
Chairman: 
The Honorable C.W. Bill Young: 
Ranking Member: 
Subcommittee on Defense: 
Committee on Appropriations: 
House of Representatives: 

[End of section] 

Appendix I: Comments from the Department of Defense: 

Office Of The Under Secretary Of Defense: 
Acquisition Technology And Logistics: 
3000 Defense Pentagon: 
Washington, DC 20301-3000: 

March 3, 2009: 

Mr. Paul Francis: 
Director, Acquisition and Sourcing Management: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Francis: 

This is the Department of Defense (DoD) response to the GAO Draft 
Report, GAO-09-338, "Defense Acquisitions: Production and Fielding of 
Missile Defense Components Continue With Less Testing and Validation 
Than Planned," dated February 11, 2009 (GAO Code 120744). Detailed 
comments on the report recommendations are enclosed. 

The DoD concurs with ten of the draft report's recommendations and 
partially concurs with one. The rationale for our position is included 
in the enclosure. I submitted separately a list of technical and 
factual errors for your consideration. 

We appreciate the opportunity to comment on the draft report. My point 
of contact for this effort is Mr. David Crim, (703) 697-5385, 
david.crim@osd.mil. 

Sincerely, 

Signed by: 

David G. Ahern: 
Director: 
Portfolio Systems Acquisition: 

Enclosure: As stated: 

GAO Draft Report Dated February 11, 2009: 
GAO-09-338 (GAO Code 120744): 

"Defense Acquisitions: Production And Fielding Of Missile Defense 
Components Continue With Less Testing And Validation Than Planned" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
direct the Missile Defense Executive Board to assess how the 
transparency and accountability of the Missile Defense Agency's (MDA) 
acquisitions can be strengthened to enhance oversight, such as by 
adopting relevant aspects of DoD's normal requirements, acquisition and 
budgeting processes, without losing the beneficial features of MDA's 
existing flexibility. 

DoD Response: Concur. As GAO noted, DoD has recently enhanced the 
transparency, accountability, and oversight of the missile defense 
program. For example, the Missile Defense Executive Board (MDEB) has 
played an increasingly important role in Ballistic Missile Defense 
System (BMDS) policy and programmatic decisions. Existing flexibilities 
and the associated integrated decision authority for requirements, 
acquisition, and budget have facilitated MDA's efforts to provide 
critical capabilities to the war fighter. However, it's an appropriate 
time to take a fresh look. Such a review might also identify where the 
flexibilities and integrated decision authority granted MDA could be 
applied beneficially to other DoD programs. 

Recommendation 2: The GAO recommends that the Secretary of Defense 
direct MDA to complete total cost baselines before requesting 
additional funding for blocks two and three and commit to a date when 
baselines for all blocks will be established. 

DoD Response: Concur. MDA intends to present its cost baselines for 
Blocks 2.0 and 3.1/3.2 in the BMDS Accountability Report (BAR) 
accompanying the President's Budget for Fiscal Year (FY) 2010. As for 
Blocks 3.3 and 5.0, assuming events unfold as expected, we intend to 
baseline their costs no later than the issuance of next year's BAR. MDA 
intends to baseline Block 4.0 within one fiscal year of reaching 
agreements with the Czech and Polish governments and obtaining needed 
Congressional approvals. 

Recommendation 3: The GAO recommends that the Secretary of Defense 
direct MDA to ensure that transfers of work from one block to another 
are transparent and reported as cost variances. 

DoD Response: Concur. MDA will report block baselines and variances 
annually to Congress in the BMDS Accountability Report. For the 
purposes of unit cost reporting, MDA has defined a cost variance as a 
confirmed increase of 10 percent or more in block or unit costs when
compared to the current cost baseline or 20 
percent or more compared to the original cost baseline. Transfers of 
work creating such cost variances will be reported. 

Recommendation 4: The GAO recommends that the Secretary of Defense 
direct MDA to provide additional unit costs reports, beyond flyaway 
unit costs, that incorporate both procurement and research and 
development funding so that there is a more comprehensive understanding 
of the progress of the acquisitions. 

DoD Response: Concur. MDA will provide additional unit cost reports 
that include development and integration costs, flyaway costs, initial 
spares, and support items (incorporating research and development or 
procurement funds as appropriated) for major pieces of equipment such 
as interceptors, sensors, fire control/command systems, and launch 
systems in baselined blocks. 

Recommendation 5: The GAO recommends that the Secretary of Defense 
direct MDA to expand the Ballistic Missile Defense Systems (BMDS) test 
baseline to include tests scheduled beyond the first succeeding year of 
the plan to ensure its synchronization with BMDS contracts. 

DoD Response: Concur. The Missile Defense Agency is developing the 
Integrated Master Test Plan (IMTP) to span the Future Years Defense 
Program (FYDP) rather than two-year increments. It will include test 
objectives based on specifications, modeling and simulation 
verification, validation and accreditation and Critical Operational 
Issues. 

Recommendation 6: The GAO recommends that the Secretary of Defense 
direct MDA to ensure Director, Operational Test and Evaluation (DOT&E) 
is consulted before making significant changes to the test baseline so 
that the tests planned provide the Director of DOT&E with sufficient 
data to assess the performance of the BMDS elements. 

DoD Response: Concur. Proposed updates to the test baseline and the 
IMTP will be coordinated through established Working Groups, which 
include DOT&E, Operational Test Agencies (OTAs), and the War fighter as 
permanent members. The updates will continue to be staffed through MDA 
Leadership and both DOT&E and the OTAs, and DOT&E will continue as a 
signatory to the IMTP. 

Recommendation 7: The GAO recommends that the Secretary of Defense 
direct MDA to ensure that planned test objectives include concrete data 
requirements anchoring models and simulations to real-world tests, 
synchronized with flight and ground test plans and that the effects on 
models and simulations of test cancellations, delays or problems are 
clearly identified and reported. 

DoD Response: Concur. Flight tests include objectives to anchor models 
and simulations to real-world test data. MDA currently does a post-
flight-test reconstruction of each flight test, which includes the 
requirement to "re-fly" flight tests in our ground hardware-in-the-
loop tests. This helps to ensure that models and simulations accurately 
represent element and system performance. Additionally, MDA has clearly 
identified and reported the effects on models and simulations of test 
cancellations, delays or problems and will continue to do so. 

Recommendation 8: The GAO recommends that the Secretary of Defense 
direct MDA to reassess the flight tests scheduled for the end of fiscal 
year 2009 to ensure that they can be reasonably conducted and analyzed 
given targets and other constraints. 

DoD Response: Concur. MDA concurs with reassessment of the test 
schedule as it applies to the conduct of flight tests through the end 
of fiscal year 2009. 

Recommendation 9: The GAO recommends that the Secretary of Defense 
direct MDA to synchronize the development, manufacturing, and fielding 
schedules of BMDS assets with testing and validation schedules to 
ensure that items are not manufactured for fielding before their 
performance has been validated through testing. 

DoD Response: Partially Concur. MDA is pursuing synchronization of 
development, manufacturing and fielding of BMDS assets with the IMTP's 
testing and validation requirements. That synchronization will be 
captured in the draft BMDS Master Plan (BMP) and its associated 
Integrated Master Plan (IMP) and Integrated Master Schedule (IMS). 
While successful ground and flight tests have provided confidence in 
BMDS capabilities being fielded, MDA and the war fighter recognize that 
additional validation through modeling and simulation is needed. 

Recommendation 10: The GAO recommends that the Secretary of Defense 
direct MDA to conduct a flight test of the Capability Enhancement (CE)-
I Exoatmospheric Kill Vehicle (EKV) against a complex target scene with 
countermeasures to complete MDA's previous testing goal of 
understanding the performance capabilities of the first 24 fielded 
Ground-Based Interceptors (GBIs). 

DoD Response: Concur. MDA is currently reexamining its flight testing 
program and expects to include additional flight testing of the CE-I 
EKV. This testing will be reflected in the 
next IMTP update. Such testing will include the specific objective to 
discriminate and intercept the dynamic lethal object from a target 
scene with countermeasures. 

Recommendation 11: The GAO recommends that the Secretary of Defense 
direct MDA to strengthen the capability declarations by using the 
complete analysis from annual performance assessments as the basis for 
declaring engagement sequence groups as fully capable and block 
development as fully complete; otherwise, indicate the limitations of 
the capabilities and steps the MDA will take to reduce the risks. 

DoD Response: Concur. MDA makes a capability declaration based on 
complete analysis of data from all available ground test, flight test 
and performance assessment events. As part of the technical assessment 
criteria for a capability delivery (whether an engagement sequence 
group or BMDS Block capability), MDA identifies the capabilities and 
limitations and provides the MDA Director with a summary of any 
remaining risk to the capability. If there is limited
performance assessment data, the overall capability assessment will 
factor in the knowledge gained from ground tests and flight tests 
against the identified risks. 

[End of section] 

Appendix II: BMDS Prime Contractors Exceed Budgeted Cost and Schedule 
Performance during Fiscal Year 2008: 

Based on our analysis of 14 Ballistic Missile Defense System (BMDS) 
elements' prime contractor earned value management performance, we 
determined that collectively the contractors overran budgeted cost by 
$152.4 million and were behind schedule by approximately $107.4 million 
during the fiscal year.[Footnote 49] Our insight into the dollars 
gained or lost for each dollar invested is based on monthly earned 
value reports, which are required of each BMDS program office's prime 
contractor. These reports compare monthly progress to the cost or 
schedule performance baseline to reveal whether the work scheduled is 
being completed on time and if the work is being completed at the cost 
budgeted. For example, if the contractor was able to complete more work 
than scheduled and for less cost than budgeted, the contractor reports 
a positive schedule and cost variance. Alternatively, if the contractor 
was not able to complete the work in the scheduled time period and 
spent more than budgeted, the contractor reports both a negative 
schedule and cost variance. The results can also be mixed by, for 
example, completing the work under cost (a positive cost variance) but 
taking longer than scheduled to do so (a negative schedule variance). 
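The sign conventions described above follow directly from the standard earned value quantities. The sketch below is purely illustrative: the function name and all dollar figures are hypothetical, not data from this report.

```python
# Illustrative sketch of the standard earned value variance formulas.
# BCWS = budgeted cost of work scheduled (planned value),
# BCWP = budgeted cost of work performed (earned value),
# ACWP = actual cost of work performed. All figures are hypothetical.

def variances(bcws, bcwp, acwp):
    """Return (cost variance, schedule variance); positive is favorable."""
    cost_variance = bcwp - acwp      # work done for less than budgeted -> positive
    schedule_variance = bcwp - bcws  # more work done than scheduled -> positive
    return cost_variance, schedule_variance

# A contractor that planned $12.0 million of work, completed $10.0 million
# of it, and spent $9.5 million shows the "mixed" result described above:
cv, sv = variances(bcws=12.0, bcwp=10.0, acwp=9.5)
print(cv, sv)  # 0.5 -2.0: under cost, but behind schedule
```

A positive cost variance with a negative schedule variance is the mixed case in the text: the completed work cost less than budgeted, but less work was completed than scheduled.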

We also used contract performance report data as the basis for 
predicting the likely overrun or underrun of each prime contractor's 
budgeted cost at completion. Our predictions of final contract cost are 
based on the 
assumption that the contractor will continue to perform in the future 
as it has in the past. In addition, since they provide the basis for 
our projected overruns, we also provide the total budgeted contract 
cost at completion for each contract we assessed in this 
appendix.[Footnote 50] However, the budgeted costs at completion, in 
some cases, have grown significantly over time. For example, in one 
case the budgeted cost at completion increased by approximately five 
times its original value. Because our assessment does not reveal, as 
cost growth, the difference between the original and current budgeted 
costs at completion, it would be inappropriate to compare the underruns 
or overruns for MDA programs with cost growth on major defense 
acquisition programs; those programs have established their full scope 
of work and developed total cost baselines, while MDA programs have 
not. 
[Footnote 51] 
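The projection approach described above (assuming the contractor's future performance mirrors its past performance) is commonly implemented with earned value performance indices. The report does not specify the exact formulas GAO used, so the sketch below shows the standard index-based estimates at completion purely as an illustration; the function name and all inputs are hypothetical.

```python
# Hypothetical sketch of index-based estimate-at-completion (EAC) formulas,
# assuming future performance matches past performance. These are standard
# EVM projections, not necessarily the exact formulas GAO used.

def eac_range(bac, bcwp_cum, acwp_cum, bcws_cum):
    """Return (low, high) projected cost at completion for a contract
    with budget at completion bac and cumulative BCWP/ACWP/BCWS."""
    cpi = bcwp_cum / acwp_cum          # cost performance index
    spi = bcwp_cum / bcws_cum          # schedule performance index
    eac_cpi = bac / cpi                # cost efficiency alone persists
    eac_composite = bac / (cpi * spi)  # schedule drag also adds cost
    return min(eac_cpi, eac_composite), max(eac_cpi, eac_composite)

# A $1.2 billion contract (figures in $ millions), roughly half done but
# slightly over cost and behind schedule, projects an overrun range:
low, high = eac_range(bac=1200.0, bcwp_cum=600.0, acwp_cum=610.0, bcws_cum=608.0)
print(round(low - 1200.0, 1), round(high - 1200.0, 1))  # projected overrun range
```

Reporting the result as a range, as this appendix does, reflects the uncertainty between the optimistic (cost index only) and pessimistic (cost and schedule indices combined) projections.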

Aegis BMD Contractors Experienced Mixed Performance during the Fiscal 
Year: 

The Aegis Ballistic Missile Defense (Aegis BMD) program manages two 
prime contractors for work on its two main components--the Aegis BMD 
Weapon System and the Standard Missile-3 (SM-3). We report on work 
under one contract line item number (CLIN) from each of the two 
separate Aegis BMD SM-3 contracts, based on the performance data we 
received during fiscal year 2008. The first contract's CLIN 9 was for 
the production of 20 Block 1A missiles, which began in February 2007 
and finished deliveries in August 2008. Deliveries were completed $7.5 
million under budget on the contractor's total budgeted cost of $179.0 
million. The other Aegis BMD SM-3 contract's CLIN 1 is for a fourth lot 
of 27 Block 1A missiles and began reporting performance data in August 
2007 for work that is still ongoing. The weapon system contractor 
experienced cost growth and schedule delays while the SM-3 contractor 
for the ongoing CLIN 1 for 27 Block 1A missiles had mixed performance. 
Neither of these CLINs experienced a realignment during fiscal year 
2008. 

The Aegis BMD weapon system contractor experienced cumulative cost 
growth and schedule delays throughout the year. The Aegis BMD weapon 
system contractor overran budgeted cost and schedule during the fiscal 
year by $7 million and $5.1 million respectively. Although cumulative 
cost performance remains positive at $16 thousand, cumulative schedule 
performance continued to decline to negative $8.4 million. The negative 
cumulative schedule variance is driven by late engineering data, delays 
to qualification efforts, and the need to return components 
experiencing issues to the vendor, which required more time than 
originally planned. See figure 7 for cumulative cost and schedule 
performance during the fiscal year. 

Figure 7: Aegis BMD Weapon System Fiscal Year 2008 Cost and Schedule 
Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $6.979 million; 
Cumulative Schedule Variance: -$3.258 million. 

Month: October 07; 
Cumulative Cost Variance: $8.085 million; 
Cumulative Schedule Variance: -$3.704 million. 

Month: November 07; 
Cumulative Cost Variance: $1.761 million; 
Cumulative Schedule Variance: -$3.61 million. 

Month: December 07; 
Cumulative Cost Variance: -$1.343 million; 
Cumulative Schedule Variance: -$2.384 million. 

Month: January 08; 
Cumulative Cost Variance: -$0.761 million; 
Cumulative Schedule Variance: -$5.098 million; 

Month: February 08; 
Cumulative Cost Variance: -$7.333 million; 
Cumulative Schedule Variance: -$6.143 million. 

Month: March 08; 
Cumulative Cost Variance: -$2.407 million; 
Cumulative Schedule Variance: -$5.35 million. 

Month: April 08; 
Cumulative Cost Variance: -$0.496 million; 
Cumulative Schedule Variance: -$8.977 million. 

Month: May 08; 
Cumulative Cost Variance: $0.12 million; 
Cumulative Schedule Variance: -$8.575 million. 

Month: June 08; 
Cumulative Cost Variance: -$1.922 million; 
Cumulative Schedule Variance: -$8.842 million. 

Month: July 08;
Cumulative Cost Variance: -$2.384 million; 
Cumulative Schedule Variance: -$10.276 million. 

Month: August 08; 
Cumulative Cost Variance: -$2.442 million; 
Cumulative Schedule Variance: -$8.914 million. 

Month: September 08; 
Cumulative Cost Variance: $0.016 million;
Cumulative Schedule Variance: -$8.397 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The program attributes the fiscal year cost and schedule overruns 
mainly to the additional time and testing needed to ensure that the 
weapon system's fielded capability matched what was originally promised 
to the warfighter. To account for some of the overruns, the program 
performed fewer risk reduction efforts for a future weapon system 
capability release. If the contractor continues to perform as it has 
during the fiscal year, we project that at contract completion in 
September 2010, the contractor will overrun its budgeted cost of $1.2 
billion by between $1.9 million and $12.2 million. 

The Aegis BMD SM-3 contractor, producing another lot of 27 Block 1A 
missiles under its CLIN 1, ended the fiscal year by underrunning 
budgeted costs by $3.0 million. The Aegis BMD SM-3 contractor for CLIN 
1 work also ended the year with a negative $7.6 million schedule 
variance, which means that the contractor was unable to accomplish $7.6 
million worth of planned work. Since reporting began in August 2007, 
cumulative and fiscal year variances are nearly equal with cumulative 
cost variances at a positive $3.3 million and cumulative schedule 
variances at negative $7.0 million. See figure 8 for a graphic 
representation of the cumulative cost and schedule variances during 
fiscal year 2008. 

Figure 8: Aegis BMD SM-3 CLIN 1 Fiscal Year 2008 Cost and Schedule 
Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $0.2657 million; 
Cumulative Schedule Variance: $0.5567 million. 

Month: October 07; 
Cumulative Cost Variance: $0.6644 million; 
Cumulative Schedule Variance: -$0.2938 million. 

Month: November 07; 
Cumulative Cost Variance: -$0.1986 million; 
Cumulative Schedule Variance: $1.7739 million. 

Month: December 07; 
Cumulative Cost Variance: -$0.8043 million; 
Cumulative Schedule Variance: -$0.2961 million. 

Month: January 08; 
Cumulative Cost Variance: -$0.3651 million; 
Cumulative Schedule Variance: $1.4555 million; 

Month: February 08; 
Cumulative Cost Variance: $0.017 million; 
Cumulative Schedule Variance: $1.9683 million. 

Month: March 08; 
Cumulative Cost Variance: -$1.1359 million; 
Cumulative Schedule Variance: $2.3834 million. 

Month: April 08; 
Cumulative Cost Variance: -$0.0039 million; 
Cumulative Schedule Variance: $4.7049 million. 

Month: May 08; 
Cumulative Cost Variance: $0.3091 million; 
Cumulative Schedule Variance: $3.1361 million. 

Month: June 08; 
Cumulative Cost Variance: $1.236 million; 
Cumulative Schedule Variance: $1.4158 million. 

Month: July 08;
Cumulative Cost Variance: $1.8045 million; 
Cumulative Schedule Variance: -$4.6013 million. 

Month: August 08; 
Cumulative Cost Variance: $1.5336 million; 
Cumulative Schedule Variance: -$5.4807 million. 

Month: September 08; 
Cumulative Cost Variance: $3.313 million;
Cumulative Schedule Variance: -$7.0103 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The contractor was able to accomplish fiscal year 2008 work for $3.0 
million less than originally planned in part due to adjustments made in 
program management, labor efficiencies, reductions in vendor cost, and 
material transfers in the missile's fourth stage component. The 
$7.6 million of unaccomplished work behind the negative fiscal year 
schedule variance lies largely in the first, second, and fourth stage 
portions of the work. In the first stage booster, the contractor 
attributes some of the negative schedule variance to a delay of more 
than a year in testing the first stage, caused by rework needed to 
correct errors in the original drawing packages. In addition, the 
contractor cites second 
stage component delivery delays as drivers for the negative schedule 
variance. Vendors were unable to deliver these components due to 
holdups in approving waivers, achieving recertification after test 
equipment failures, and property damage to facilities. Lastly, the 
contractor experienced delays in components for the fourth stage which 
also contributed to the unfavorable schedule variance. If the 
contractor continues to perform as it did through September 2008, our 
analysis predicts that, at completion in April 2010, the work under the 
contract could cost from $6.6 million less to $0.7 million more than 
the budgeted cost of $237.5 million. 

ABL Contractor Overran Budgeted Fiscal Year Cost: 

For fiscal year 2008, the Airborne Laser (ABL) contractor overran 
fiscal year budgeted costs by $10.6 million but had a positive fiscal 
year schedule variance of $2.2 million. Despite some gains in its 
schedule variance during the fiscal year, the program still maintains 
negative cumulative cost and schedule variances of $84.8 million and 
$23.6 million respectively. The contractor mostly attributes the 
negative cumulative variances in cost and schedule to late beam 
control/fire control hardware deliveries. Although the program 
performed a replan in June 2007, the ABL contractor did not perform any 
type of realignment during fiscal year 2008. Figure 9 shows cumulative 
variances at the beginning 
of fiscal year 2008 along with a depiction of the contractor's cost and 
schedule performance throughout the fiscal year. 

Figure 9: ABL Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$74.217 million; 
Cumulative Schedule Variance: -$25.767 million. 

Month: October 07; 
Cumulative Cost Variance: -$74.533 million; 
Cumulative Schedule Variance: -$22.015 million. 

Month: November 07; 
Cumulative Cost Variance: -$74.602 million; 
Cumulative Schedule Variance: -$20.48 million. 

Month: December 07; 
Cumulative Cost Variance: -$76.455 million; 
Cumulative Schedule Variance: -$20.444 million. 

Month: January 08; 
Cumulative Cost Variance: -$75.803 million; 
Cumulative Schedule Variance: -$19.487 million; 

Month: February 08; 
Cumulative Cost Variance: -$83.76 million; 
Cumulative Schedule Variance: -$24.927 million. 

Month: March 08; 
Cumulative Cost Variance: -$87.968 million; 
Cumulative Schedule Variance: -$26.377 million. 

Month: April 08; 
Cumulative Cost Variance: -$89.066 million; 
Cumulative Schedule Variance: -$25.155 million. 

Month: May 08; 
Cumulative Cost Variance: -$89.248 million; 
Cumulative Schedule Variance: -$24.72 million. 

Month: June 08; 
Cumulative Cost Variance: -$90.097 million; 
Cumulative Schedule Variance: -$26.772 million. 

Month: July 08;
Cumulative Cost Variance: -$88.78 million; 
Cumulative Schedule Variance: -$26.546 million. 

Month: August 08; 
Cumulative Cost Variance: -$91.003 million; 
Cumulative Schedule Variance: -$29.548 million. 

Month: September 08; 
Cumulative Cost Variance: -$84.775 million;
Cumulative Schedule Variance: -$23.606 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

Technical issues with key components of the beam control/fire control 
system, which required new hardware or refurbishment of existing 
components, along with late deliveries of key laser system components, 
are the primary drivers of the unfavorable fiscal year cost variance of 
$10.6 million. These issues have caused delays in integration and test 
activities for the overall ABL weapon system. Based on the contractor's 
performance up through fiscal year 2008, we estimate that, at 
completion in February 2010, the contractor will overrun its budgeted 
cost of $3.6 billion by between $89.7 million and $95.4 million. 

C2BMC Program Incurred Negative Cumulative and Fiscal Year Variances: 

Our analysis of the Command, Control, Battle Management, and 
Communications' (C2BMC) cumulative contract performance indicates that 
the prime contractor's performance declined during fiscal year 2008. 
The contractor overran its fiscal year 2008 budget by $9.8 million and 
did not perform $3.6 million of work on schedule. By September 2008, 
this resulted in an unfavorable cumulative cost variance of $24.3 
million and an unfavorable cumulative schedule variance of $7.1 
million. The main drivers for the negative cumulative cost variance 
were costs associated with unplanned work, increased technical 
complexity, and reduced cost efficiency due to the loss of key staff. 
The contractor attributes the unfavorable cumulative schedule variances 
to software issues related to the global engagement manager and 
components of test training operations. Although the C2BMC contractor 
performed a replan in November 2006, the contractor did not perform any 
type of realignment during fiscal year 2008. Trends in cost and 
schedule performance during the fiscal year are depicted in figure 10. 

Figure 10: C2BMC Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$14.476 million; 
Cumulative Schedule Variance: -$3.535 million. 

Month: October 07; 
Cumulative Cost Variance: -$18.458 million; 
Cumulative Schedule Variance: -$4.59 million. 

Month: November 07; 
Cumulative Cost Variance: -$21.255 million; 
Cumulative Schedule Variance: -$5.988 million. 

Month: December 07; 
Cumulative Cost Variance: -$17.271 million; 
Cumulative Schedule Variance: -$0.776 million. 

Month: January 08; 
Cumulative Cost Variance: -$19.3 million; 
Cumulative Schedule Variance: -$1.375 million; 

Month: February 08; 
Cumulative Cost Variance: -$23.593 million; 
Cumulative Schedule Variance: -$5.695 million. 

Month: March 08; 
Cumulative Cost Variance: -$22.957 million; 
Cumulative Schedule Variance: -$3.534 million. 

Month: April 08; 
Cumulative Cost Variance: -$26.384 million; 
Cumulative Schedule Variance: -$7.521 million. 

Month: May 08; 
Cumulative Cost Variance: -$20.172 million; 
Cumulative Schedule Variance: $0.309 million. 

Month: June 08; 
Cumulative Cost Variance: -$20.408 million; 
Cumulative Schedule Variance: $1.452 million. 

Month: July 08;
Cumulative Cost Variance: -$21.062 million; 
Cumulative Schedule Variance: -$2.696 million. 

Month: August 08; 
Cumulative Cost Variance: -$23.253 million; 
Cumulative Schedule Variance: -$5.647 million. 

Month: September 08; 
Cumulative Cost Variance: -$24.273 million;
Cumulative Schedule Variance: -$7.121 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The negative fiscal year cost variance of $9.8 million is driven mainly 
by problems in the performance of work under Part 4 and Part 5 of the 
contract. The Part 4 effort, which began in December 2005, includes 
completing several spiral capabilities, upgrading spiral suites, and 
implementing initial global engagement capabilities at an operations 
center. The Part 5 effort began in December 2007 and covers operations 
and sustainment support for fielded C2BMC; the delivery of spiral 
hardware, software, and communications; as well as development, 
planning, and testing for other spiral capabilities. The contractor was 
able to use reserves to cover some of its Part 4 unfavorable fiscal 
year cost variances. 

The Part 5 fiscal year cost variance's primary drivers are unexpected 
complexities with the network design, unplanned work that required more 
resources for developing the planner, and the extension of efforts past 
the completion date on the global engagement management portion of 
work. The unfavorable fiscal year schedule variance of $3.6 million is 
attributable to the Part 5 portion of work and was primarily caused by 
an unexpected reallocation of resources away from the global engagement 
management portion of work to other areas, delays in requesting 
material procurement (also for global engagement management), and a 
lagging schedule for building out a testing lab. If the contractor 
continues to perform as it has in the past, we predict that the 
contractor will overrun its budgeted cost of $1.0 billion at completion 
in December 2009 by between $37.1 million and $76.8 million. 

GMD Contractor Maintained Negative Cumulative Cost and Schedule 
Variances throughout the Fiscal Year: 

The government and the Ground-based Midcourse Defense (GMD) contractor 
began a contract restructuring during the fiscal year to rephase and 
rescope ongoing efforts, refine capability requirements, and adjust 
program content, as well as to perform weapon system integration, 
conduct flight test planning, and develop the two-stage booster, among 
other tasks. The ongoing realignment includes a proposal to add between 
$350 million and $580 million to the cost of the work under contract 
and 39 months to the period of performance. 

The GMD contractor reports a cumulative negative cost variance of more 
than $1.0 billion that it attributes to technical challenges with its 
Exoatmospheric Kill Vehicle (EKV) as well as supplier component quality 
problems. The contractor also carries a total unfavorable cumulative 
schedule variance of $130.3 million, the bulk of which the contractor 
attributes to the technical issues connected with the Ground-based 
Interceptor (GBI), particularly the EKV. For example, during the fiscal 
year the program experienced difficulties in manufacturing the 
Capability Enhancement II (CE-II) EKVs. Although the CE-II EKVs are 
expected to provide better performance, the contractor produced the 
kill vehicles before completing developmental tests, discovered 
problems during manufacturing, incorporated a new design, and continued 
manufacturing them. Although these issues contributed to unfavorable 
fiscal year cost variances of $42.7 million, the program was able to 
make up for these losses in other areas. The variances depicted in 
figure 11 represent the GMD contractor's cumulative cost and schedule 
performance over fiscal year 2008. 

Figure 11: GMD Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$1081.76 million; 
Cumulative Schedule Variance: -$52.887 million. 

Month: October 07; 
Cumulative Cost Variance: -$1077.93 million; 
Cumulative Schedule Variance: -$118.649 million. 

Month: November 07; 
Cumulative Cost Variance: -$1066.65 million; 
Cumulative Schedule Variance: -$115.545 million. 

Month: December 07; 
Cumulative Cost Variance: -$1049.57 million; 
Cumulative Schedule Variance: -$120.363 million. 

Month: January 08; 
Cumulative Cost Variance: -$1058.01 million; 
Cumulative Schedule Variance: -$111.88 million; 

Month: February 08; 
Cumulative Cost Variance: -$1060.91 million; 
Cumulative Schedule Variance: -$117.452 million. 

Month: March 08; 
Cumulative Cost Variance: -$1059.82 million; 
Cumulative Schedule Variance: -$125.855 million. 

Month: April 08; 
Cumulative Cost Variance: -$1045.19 million; 
Cumulative Schedule Variance: -$143.324 million. 

Month: May 08; 
Cumulative Cost Variance: -$1046.85 million; 
Cumulative Schedule Variance: -$123.692 million. 

Month: June 08; 
Cumulative Cost Variance: -$1037.42 million; 
Cumulative Schedule Variance: -$115.797 million. 

Month: July 08;
Cumulative Cost Variance: -$1038.12 million; 
Cumulative Schedule Variance: -$113.994 million. 

Month: August 08; 
Cumulative Cost Variance: -$1030.16 million; 
Cumulative Schedule Variance: -$119.216 million. 

Month: September 08; 
Cumulative Cost Variance: -$1027.87 million;
Cumulative Schedule Variance: -$130.261 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The GMD contractor did have a favorable fiscal year cost variance of 
$53.9 million, which it attributed to labor efficiencies in the ground 
system as well as less field maintenance support required than planned, 
and labor efficiencies in the deployment and sustainment portion of the 
work under the contract. However, the GMD element's underruns occurred 
partially because the contractor delayed or eliminated some planned 
work. For example, the GMD program did not accomplish the emplacement 
of three GBIs, or conduct either of its two planned flight tests. As a 
result, it employed less labor than originally intended. The program 
also reports an unfavorable fiscal year schedule variance of $77.4 
million, which it attributes to an administrative error in September 
2007 that incorrectly adjusted the baseline for the booster effort; the 
baseline was corrected in October. However, it should be noted that 
Missile Defense Agency (MDA) officials believe that ongoing adjustments 
to the GMD element's baseline have skewed recent variances to such a 
degree that they should not be used to predict future costs. We did 
perform analysis based on the 
contractor's reported performance through fiscal year 2008, and our 
analysis estimates that at contract end planned for December 2011, the 
contractor could overrun its budgeted cost of $14.9 billion by between 
$950.2 million and $1.25 billion. 

KEI Cost and Schedule Performance Continued to Decline after Replan: 

Despite replans in April 2007 and April 2008, the Kinetic Energy 
Interceptors (KEI) contractor continued to experience declining cost 
and schedule performance during the fiscal year. Although the 
contractor began the year with a positive cost variance, it overran 
fiscal year 2008 budgeted costs by $8.3 million, ending the year with 
an unfavorable cumulative cost variance of $2.6 million. In 
addition, the program was unable to accomplish $8.5 million worth of 
work which added to an unfavorable cumulative schedule variance of 
$21.3 million. Cumulative cost and schedule variances were mainly 
driven by costs associated with delays to booster drawing releases, 
delays in procurement, and unexpected costs and rework related to 
issues with the second stage. Figure 12 depicts the cost and schedule 
performance for the KEI contractor during fiscal year 2008. 

Figure 12: KEI Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $5.672 million; 
Cumulative Schedule Variance: -$12.784 million. 

Month: October 07; 
Cumulative Cost Variance: $7.276 million; 
Cumulative Schedule Variance: -$11.519 million. 

Month: November 07; 
Cumulative Cost Variance: $2.096 million; 
Cumulative Schedule Variance: -$12.699 million. 

Month: December 07; 
Cumulative Cost Variance: $7.025 million; 
Cumulative Schedule Variance: -$14.872 million. 

Month: January 08; 
Cumulative Cost Variance: $6.535 million; 
Cumulative Schedule Variance: -$17.217 million; 

Month: February 08; 
Cumulative Cost Variance: $5.144 million; 
Cumulative Schedule Variance: -$19.172 million. 

Month: March 08; 
Cumulative Cost Variance: $3.333 million; 
Cumulative Schedule Variance: -$18.34 million. 

Month: April 08; 
Cumulative Cost Variance: $2.689 million; 
Cumulative Schedule Variance: -$4.326 million. 

Month: May 08; 
Cumulative Cost Variance: -$1.871 million; 
Cumulative Schedule Variance: -$5.097 million. 

Month: June 08; 
Cumulative Cost Variance: $2.418 million; 
Cumulative Schedule Variance: -$7.655 million. 

Month: July 08;
Cumulative Cost Variance: $1.23 million; 
Cumulative Schedule Variance: -$12.58 million. 

Month: August 08; 
Cumulative Cost Variance: $0.434 million; 
Cumulative Schedule Variance: -$17.127 million. 

Month: September 08; 
Cumulative Cost Variance: -$2.584 million;
Cumulative Schedule Variance: -$21.277 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The KEI contractor attributes its unfavorable fiscal year cost and 
schedule variances of $8.3 million and $8.5 million, respectively, to 
issues with its interceptor booster. Problems initially arose in fiscal 
year 2007 with a motor case failure during acceptance testing which led 
to unexpected redesigns. In October 2007, the program experienced 
several issues with the nozzle during a second stage ground test and 
also experienced a deviation in measured performance from pre-test 
predictions. These issues added costly redesigns and delayed the 
program's knowledge point, a booster flight test. Because of these 
issues, the program replanned its work in April 2008 to realign the 
schedule with the booster flight test knowledge point, which slipped 
from August 2008 to April 2009. Since the replan, the booster flight 
test has been further delayed to the fourth quarter of fiscal year 
2009. As part of the replan, the program zeroed out some schedule 
variances from the baseline to reflect progress toward the newly 
defined schedule. Our analysis shows, however, that the replan has not 
improved overall performance, as cumulative cost and schedule variances 
continue their downward trend. 
We were unable to estimate whether the total work under the contract is 
likely to be completed within budgeted cost since trends cannot be 
developed until at least 15 percent of the work under the contract is 
completed. 
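The 15-percent threshold noted above is a simple screen on percent complete, measured as cumulative earned value divided by total budgeted cost. A minimal sketch, with a hypothetical function name and figures:

```python
# Sketch of the percent-complete screen described above: EVM trend
# projections are withheld until at least 15 percent of the budgeted
# work is complete. All figures are hypothetical.

def trends_supportable(bcwp_cum, bac, threshold=0.15):
    """True once enough work is complete for trend-based projections."""
    return bcwp_cum / bac >= threshold

# Only $30 million earned against a $400 million budget (7.5 percent
# complete), so no completion-cost estimate would be made yet:
print(trends_supportable(bcwp_cum=30.0, bac=400.0))  # False
```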

Limited Contractor Data Prevented Analysis of All MKV Task Orders: 

The Multiple Kill Vehicles (MKV) program began utilizing an indefinite 
delivery indefinite quantity contract in January 2004. Since then, the 
program has initiated eight task orders, five of which were open during 
fiscal year 2008--Task Orders 4 through 8. Task Order 4 provided 
insufficient data to complete full earned value analysis for the fiscal 
year. In addition, Task Order 5 was completed shortly after the fiscal 
year began, without providing enough data to show performance trends. 
Therefore, we performed analysis for Task Orders 6, 7, and 8, as shown 
below. None of the task orders were realigned during the fiscal year. 

MKV Task Order 6 began in November 2006 for the component development 
and testing of a prototype carrier vehicle seeker (a long-range 
sensor). According to the task order, this seeker for the carrier 
vehicle will assign individual kill vehicles for target destruction. 
This task will culminate in a demonstration planned for fiscal year 
2010. As shown in figure 13 below, performance data over the course of 
the fiscal year illustrates declining cost and schedule performance. 
Although it began the fiscal year with slightly positive cumulative 
cost and schedule variances, the program ended the year with slightly 
negative cumulative cost and schedule variances of $1.1 million and 
$0.6 million respectively. In addition, the contractor has unfavorable 
fiscal year cost and schedule variances of $1.4 million and $1.5 
million, respectively. The program attributes its negative cumulative 
cost and schedule variances to increased work necessary to resolve 
software development issues, unplanned efforts as a result of late 
hardware arrivals, and a government-directed change in vendors for 
hardware resulting in additional design work. Based on our analysis and 
the assumption that the contractor will continue to perform as it has 
through fiscal year 2008, we predict that at its contract completion in 
May 2009, the contractor on Task Order 6 will overrun its budgeted cost 
of $19.3 million by between $1.6 million and $2.5 million. 
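
The fiscal year variances in this analysis are the change in the cumulative variances over the year: the cumulative value at September 2008 minus the value at September 2007. A minimal sketch of that arithmetic, using the Task Order 6 endpoints in millions of dollars:

```python
# Fiscal year variance = cumulative variance at the end of the fiscal
# year minus cumulative variance at the start. Values in $ millions are
# the September 07 and September 08 endpoints for MKV Task Order 6.
cum_cost = {"Sep 07": 0.266, "Sep 08": -1.128}
cum_sched = {"Sep 07": 0.913, "Sep 08": -0.616}

fy_cost = cum_cost["Sep 08"] - cum_cost["Sep 07"]     # -1.394
fy_sched = cum_sched["Sep 08"] - cum_sched["Sep 07"]  # -1.529

# Both round to the unfavorable $1.4 million and $1.5 million fiscal
# year variances reported for the task order.
print(f"FY cost variance: {fy_cost:+.1f}M, FY schedule variance: {fy_sched:+.1f}M")
```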

Figure 13: MKV Task Order 6 Fiscal Year 2008 Cost and Schedule 
Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $0.266 million; 
Cumulative Schedule Variance: $0.913 million. 

Month: October 07; 
Cumulative Cost Variance: -$0.053 million; 
Cumulative Schedule Variance: $0.536 million. 

Month: November 07; 
Cumulative Cost Variance: -$0.191 million; 
Cumulative Schedule Variance: $0.382 million. 

Month: December 07; 
Cumulative Cost Variance: -$0.436 million; 
Cumulative Schedule Variance: $0.354 million. 

Month: January 08; 
Cumulative Cost Variance: -$0.489 million; 
Cumulative Schedule Variance: $0.172 million; 

Month: February 08; 
Cumulative Cost Variance: -$0.591 million; 
Cumulative Schedule Variance: $0.226 million. 

Month: March 08; 
Cumulative Cost Variance: -$0.602 million; 
Cumulative Schedule Variance: $0.189 million. 

Month: April 08; 
Cumulative Cost Variance: -$0.344 million; 
Cumulative Schedule Variance: $0.218 million. 

Month: May 08; 
Cumulative Cost Variance: -$0.269 million; 
Cumulative Schedule Variance: $0.151 million. 

Month: June 08; 
Cumulative Cost Variance: -$0.554 million; 
Cumulative Schedule Variance: -$0.067 million. 

Month: July 08;
Cumulative Cost Variance: -$0.718 million; 
Cumulative Schedule Variance: -$0.319 million. 

Month: August 08; 
Cumulative Cost Variance: -$0.93 million; 
Cumulative Schedule Variance: -$0.47 million. 

Month: September 08; 

Cumulative Cost Variance: -$1.128 million;
Cumulative Schedule Variance: -$0.616 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

MKV Task Order 7 is for the development and testing of engagement 
management algorithms and the test bed in which they will be 
demonstrated. These algorithms will give the carrier vehicle a 
critical capability: managing kill vehicle engagements using target 
information from the BMDS sensors and the carrier vehicle's long-range 
sensor. The contractor on this task order performed positively 
during the fiscal year, both in terms of its cumulative and fiscal year 
cost and schedule variances. The program had a favorable fiscal year 
cost variance of $1.4 million and a positive fiscal year schedule 
variance of $11 thousand, adding to its favorable cumulative cost and 
schedule variances of $1.7 million and $0.1 million, respectively. The 
program attributes its cumulative cost underruns to several reasons 
including a programmatic decision to proceed with one approach for 
organizing kill vehicles in attack formation rather than funding 
several different approaches. In addition, the contractor realized 
cost savings from greater-than-expected efficiencies in the kill 
vehicle portion of the work under the contract and from using less 
manpower than planned in other portions. If the 
contractor continues to perform as it has in the past, we estimate that 
at completion in May 2010 the work under the contract could cost 
between $3.2 million and $3.9 million less than the expected $43.9 
million budgeted for the work under the contract. See figure 14 below 
for an illustration of cumulative cost and schedule performance during 
fiscal year 2008. 

Figure 14: MKV Task Order 7 Fiscal Year 2008 Cost and Schedule 
Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $0.25 million; 
Cumulative Schedule Variance: $0.068 million. 

Month: October 07; 
Cumulative Cost Variance: $0.252 million; 
Cumulative Schedule Variance: $0.056 million. 

Month: November 07; 
Cumulative Cost Variance: $0.347 million; 
Cumulative Schedule Variance: $0.029 million. 

Month: December 07; 
Cumulative Cost Variance: $0.474 million; 
Cumulative Schedule Variance: $0.002 million. 

Month: January 08; 
Cumulative Cost Variance: $0.588 million; 
Cumulative Schedule Variance: $0.064 million; 

Month: February 08; 
Cumulative Cost Variance: $0.567 million; 
Cumulative Schedule Variance: $0.044 million. 

Month: March 08; 
Cumulative Cost Variance: $0.24 million; 
Cumulative Schedule Variance: -$0.191 million. 

Month: April 08; 
Cumulative Cost Variance: $0.635 million; 
Cumulative Schedule Variance: -$0.07 million. 

Month: May 08; 
Cumulative Cost Variance: $0.846 million; 
Cumulative Schedule Variance: -$0.074 million. 

Month: June 08; 
Cumulative Cost Variance: $1.036 million; 
Cumulative Schedule Variance: -$0.016 million. 

Month: July 08;
Cumulative Cost Variance: $1.264 million; 
Cumulative Schedule Variance: -$0.009 million. 

Month: August 08; 
Cumulative Cost Variance: $1.269 million; 
Cumulative Schedule Variance: $0.002 million. 

Month: September 08; 
Cumulative Cost Variance: $1.665 million;
Cumulative Schedule Variance: $0.079 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

MKV Task Order 8 was awarded in January 2007 and began reporting full 
performance data in July 2007. The task order is for the development 
and testing of a hover test bed and hover test vehicle. This hover test 
bed will allow the program to integrate and test key components of the 
system in a repeatable ground-based free flight environment as their 
technologies reach maturity. The program experienced a continuing 
decline in cost performance, as seen in figure 15. 

Figure 15: MKV Task Order 8 Fiscal Year 2008 Cost and Schedule 
Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $0.379 million; 
Cumulative Schedule Variance: $0.267 million. 

Month: October 07; 
Cumulative Cost Variance: -$0.259 million; 
Cumulative Schedule Variance: $0.106 million. 

Month: November 07; 
Cumulative Cost Variance: -$1.287 million; 
Cumulative Schedule Variance: -$0.895 million. 

Month: December 07; 
Cumulative Cost Variance: -$1.178 million; 
Cumulative Schedule Variance: -$0.759 million. 

Month: January 08; 
Cumulative Cost Variance: -$1.314 million; 
Cumulative Schedule Variance: -$1.375 million; 

Month: February 08; 
Cumulative Cost Variance: -$1.493 million; 
Cumulative Schedule Variance: -$0.176 million. 

Month: March 08; 
Cumulative Cost Variance: -$2.284 million; 
Cumulative Schedule Variance: $2.186 million. 

Month: April 08; 
Cumulative Cost Variance: -$4.147 million; 
Cumulative Schedule Variance: $2.717 million. 

Month: May 08; 
Cumulative Cost Variance: -$5.747 million; 
Cumulative Schedule Variance: $2.154 million. 

Month: June 08; 
Cumulative Cost Variance: -$6.843 million; 
Cumulative Schedule Variance: $1.906 million. 

Month: July 08;
Cumulative Cost Variance: -$7.854 million; 
Cumulative Schedule Variance: $1.469 million. 

Month: August 08; 
Cumulative Cost Variance: -$9.131 million; 
Cumulative Schedule Variance: $0.772 million. 

Month: September 08; 
Cumulative Cost Variance: -$10.303 million;
Cumulative Schedule Variance: $0.252 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

Although the contractor began the year with a positive cumulative cost 
variance, overruns during the fiscal year of $10.7 million led the 
program to a total cumulative cost overrun of $10.3 million. The task 
order's fiscal year schedule variance was slightly negative at an 
unfavorable $15 thousand, leaving its cumulative schedule variance 
largely unchanged at a favorable $0.3 million. The program attributes 
the cumulative cost variances to increased labor, procurement, and 
material costs; additional hardware and engineering drawings; and 
management oversight needed to resolve subcontractor inefficiencies. In 
addition, the program increased expenditures to resolve technical and 
schedule issues associated with the development of avionics subsystems. 
The planned date for the task order's main effort--completing the hover 
test--was delayed 2 months from its original date to December 2008 in 
part because of technical issues associated with the test vehicle's 
power unit and a software anomaly. These issues were resolved prior to 
the hover test being conducted. Based on its prior performance, the MKV 
contractor could overrun the budgeted cost of $48.0 million for the 
work under the contract at completion in January 2009 by between $5.7 
million and $13.8 million. 

Sensors' Radar Experienced Fiscal Year Cost and Schedule Growth: 

As of September 2008, the Sensors contractor had overrun its fiscal 
year budget by $2.2 million and was behind in completing $27.4 million 
worth of work. Considering prior years' performance, the contractor is 
performing under budget with a favorable cumulative cost variance of 
$22.0 million. However, the contractor has a cumulative unfavorable 
schedule variance of $9.6 million. The contractor reports the 
cumulative schedule variance is driven by delays in the manufacturing 
of the sixth radar and a software capability release that is 2 to 3 
months behind schedule. Additionally, the contractor reports that its 
favorable cumulative cost variance is attributable to efficiencies in 
the second radar's manufacturing, design, development, and software. 
The Sensors contractor has not performed a realignment of its work 
since contract start in April 2003. See figure 16 for trends in the 
contractor's cost and schedule performance during the fiscal year. 

Figure 16: Sensors Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: $24.146 million; 
Cumulative Schedule Variance: $17.828 million. 

Month: October 07; 
Cumulative Cost Variance: $22.739 million; 
Cumulative Schedule Variance: $18.009 million. 

Month: November 07; 
Cumulative Cost Variance: $24.037 million; 
Cumulative Schedule Variance: $17.767 million. 

Month: December 07; 
Cumulative Cost Variance: $24.993 million; 
Cumulative Schedule Variance: $16.444 million. 

Month: January 08; 
Cumulative Cost Variance: $26.023 million; 
Cumulative Schedule Variance: $12.228 million; 

Month: February 08; 
Cumulative Cost Variance: $26.135 million; 
Cumulative Schedule Variance: $6.864 million. 

Month: March 08; 
Cumulative Cost Variance: $24.922 million; 
Cumulative Schedule Variance: $4.215 million. 

Month: April 08; 
Cumulative Cost Variance: $24.793 million; 
Cumulative Schedule Variance: $0.896 million. 

Month: May 08; 
Cumulative Cost Variance: $23.734 million; 
Cumulative Schedule Variance: -$3.31 million. 

Month: June 08; 
Cumulative Cost Variance: $15.282 million; 
Cumulative Schedule Variance: -$5.772 million. 

Month: July 08;
Cumulative Cost Variance: $14.947 million; 
Cumulative Schedule Variance: -$7.484 million. 

Month: August 08; 
Cumulative Cost Variance: $16.51 million; 
Cumulative Schedule Variance: -$8.162 million. 

Month: September 08; 
Cumulative Cost Variance: $21.994 million;
Cumulative Schedule Variance: -$9.607 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The contractor reports that its unfavorable fiscal year schedule 
variance of $27.4 million is due to the erosion of previously earned 
positive schedule variances, which had come from manufacturing 
efficiencies leveraged from the Terminal High Altitude Area Defense 
radar hardware design. Late delivery of components also contributed to 
the negative fiscal year schedule variance. The negative fiscal year 
cost variance of $2.2 million is largely due to a contract change 
related to its incentive fee. Our analysis predicts that if the 
contractor continues to perform as it has through fiscal year 2008, the 
work under the contract could cost from $25 million less to $9.1 
million more than the budgeted cost of $1.1 billion at completion 
currently planned for December 2010. 
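
Ranges such as the one above are typically derived from earned value performance indices. The exact estimating formulas behind this analysis are not spelled out in the report; the sketch below shows one common approach, in which the cost of remaining work is scaled by the cost performance index for an optimistic bound and by the product of the cost and schedule performance indices for a pessimistic bound. All input values are hypothetical.

```python
# Hedged sketch of a common EVM estimate-at-completion (EAC) range.
# None of these inputs are taken from the contract data; they are
# hypothetical values for illustration only.
def eac_range(acwp, bcwp, bcws, bac):
    """Return (optimistic, pessimistic) estimates at completion.

    acwp: actual cost of work performed to date
    bcwp: budgeted cost of work performed (earned value)
    bcws: budgeted cost of work scheduled
    bac:  budget at completion
    """
    cpi = bcwp / acwp          # cost performance index
    spi = bcwp / bcws          # schedule performance index
    remaining = bac - bcwp
    optimistic = acwp + remaining / cpi           # assumes CPI holds
    pessimistic = acwp + remaining / (cpi * spi)  # schedule drag worsens cost
    return optimistic, pessimistic

# Hypothetical contract: $1,000M budget, 60 percent of work earned.
low, high = eac_range(acwp=630.0, bcwp=600.0, bcws=620.0, bac=1000.0)
print(f"EAC range: ${low:.0f}M to ${high:.0f}M")
```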

Technical Issues Drove STSS Cost Growth during the Fiscal Year: 

After a replan of work in October 2007, the Space Tracking and 
Surveillance System (STSS) contractor experienced an unfavorable cost 
variance of $87.9 million during the fiscal year. The replan was 
undertaken to extend the period of performance and delay the launch 
date of its demonstrator satellite. Despite fiscal year cost overruns, 
the contractor made gains on the cumulative schedule variance by 
accomplishing $1.9 million more work than originally planned. 
Cumulatively, the program has both unfavorable 
cost and schedule variances at $319.3 million and $17.8 million, 
respectively. The program attributes cumulative cost variances and 
schedule variances to continual launch date schedule slippages. In 
addition, problems in the space segment portion of work also added to 
the cumulative cost variances. Figure 17 shows both cost and schedule 
trends during fiscal year 2008. 

Figure 17: STSS Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$231.41 million; 
Cumulative Schedule Variance: -$19.667 million. 

Month: October 07; 
Cumulative Cost Variance: -$246.178 million; 
Cumulative Schedule Variance: -$13.029 million. 

Month: November 07; 
Cumulative Cost Variance: -$249.783 million; 
Cumulative Schedule Variance: -$14.222 million. 

Month: December 07; 
Cumulative Cost Variance: -$256.466 million; 
Cumulative Schedule Variance: -$15.784 million. 

Month: January 08; 
Cumulative Cost Variance: -$263.5 million; 
Cumulative Schedule Variance: -$15.202 million; 

Month: February 08; 
Cumulative Cost Variance: -$268.303 million; 
Cumulative Schedule Variance: -$15.858 million. 

Month: March 08; 
Cumulative Cost Variance: -$274.939 million; 
Cumulative Schedule Variance: -$16.97 million. 

Month: April 08; 
Cumulative Cost Variance: -$282.524 million; 
Cumulative Schedule Variance: -$18.962 million. 

Month: May 08; 
Cumulative Cost Variance: -$288.702 million; 
Cumulative Schedule Variance: -$19.985 million. 

Month: June 08; 
Cumulative Cost Variance: -$295.914 million; 
Cumulative Schedule Variance: -$21.368 million. 

Month: July 08;
Cumulative Cost Variance: -$305.378 million; 
Cumulative Schedule Variance: -$22.753 million. 

Month: August 08; 
Cumulative Cost Variance: -$313.263 million; 
Cumulative Schedule Variance: -$18.452 million. 

Month: September 08; 
Cumulative Cost Variance: -$319.261 million;
Cumulative Schedule Variance: -$17.752 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

Program cost variances during the fiscal year were driven primarily by 
technical issues with hardware installed on the second space vehicle. 
These issues included an overheating flight communications box, a leak 
on the propulsion side of the satellite, and problems with the 
spacecraft processor that failed to send a critical command to the 
onboard computer. To resolve the issues with the processor, the program 
office initially recommended the removal of the entire computer from 
the spacecraft. However, after extensive research and testing, the 
program manager determined that the event with the spacecraft is an 
unverifiable failure with a low probability of occurrence and low 
mission impact and decided not to remove the computer from the 
spacecraft to resolve the issue. We estimate that if the contractor 
continues to perform as it has through fiscal year 2008, the work under 
the contract at completion in September 2011 could exceed its budgeted 
cost of $1.6 billion by between $621.7 million and $1.2 billion. 

Targets and Countermeasures Program's Rebaseline Positively Affected 
Fiscal Year Schedule Variances: 

In June 2008, a delivery order under the Targets and Countermeasures' 
element that is developing a new family of targets--the Flexible Target 
Family (FTF)--performed a rebaseline as a result of experiencing 
manufacturing delays to several components. The majority of the delays 
were from qualification failures, subsequent redesigns, and 
requalification efforts. The rebaseline was to realign the work under 
the contract to reflect realistic hardware delivery dates. This 
rebaseline did not affect cost variances, but did rebaseline major 
milestone delivery dates and, as a result, set some of the previously 
existing schedule variances to zero. 

The Targets and Countermeasures contractor made gains with a favorable 
$23.2 million fiscal year schedule variance due in part to the 
rebaseline in June 2008. However, the contractor ended the year with an 
unfavorable cumulative schedule variance of $6.4 million which was 
primarily driven by delays in the completion of the FTF qualification 
program.[Footnote 52] The program also ended the year with an 
unfavorable cumulative cost variance of $52.8 million, which the 
contractor attributed to costs 
associated with the FTF's avionics components integration and 
qualification issues, and more effort than expected required on motors 
for one of the targets in the program. See figure 18 below for an 
illustration of cumulative cost and schedule variances during the 
course of the fiscal year. 

Figure 18: Targets and Countermeasures Fiscal Year 2008 Cost and 
Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$17.091 million; 
Cumulative Schedule Variance: -$29.577 million. 

Month: October 07; 
Cumulative Cost Variance: -$18.662 million; 
Cumulative Schedule Variance: -$34.647 million. 

Month: November 07; 
Cumulative Cost Variance: -$21.078 million; 
Cumulative Schedule Variance: -$37.97 million. 

Month: December 07; 
Cumulative Cost Variance: -$23.654 million; 
Cumulative Schedule Variance: -$34.392 million. 

Month: January 08; 
Cumulative Cost Variance: -$25.813 million; 
Cumulative Schedule Variance: -$32.862 million; 

Month: February 08; 
Cumulative Cost Variance: -$25.6 million; 
Cumulative Schedule Variance: -$32.2 million. 

Month: March 08; 
Cumulative Cost Variance: -$34.549 million; 
Cumulative Schedule Variance: -$30.243 million. 

Month: April 08; 
Cumulative Cost Variance: -$40.03 million; 
Cumulative Schedule Variance: -$32.813 million. 

Month: May 08; 
Cumulative Cost Variance: -$45.077 million; 
Cumulative Schedule Variance: -$31.551 million. 

Month: June 08; 
Cumulative Cost Variance: -$49.187 million; 
Cumulative Schedule Variance: -$3.288 million. 

Month: July 08;
Cumulative Cost Variance: -$52.211 million; 
Cumulative Schedule Variance: -$4.232 million. 

Month: August 08; 
Cumulative Cost Variance: -$51.679 million; 
Cumulative Schedule Variance: -$5.196 million. 

Month: September 08; 
Cumulative Cost Variance: -$52.776 million;
Cumulative Schedule Variance: -$6.383 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The contractor attributes its unfavorable fiscal year cost variance of 
$35.7 million to the increased cost of completing the first four FTF 
72-inch targets. Delays in completing component qualification extended 
the period of performance, which in turn led to higher costs. In 
addition, the contractor attributes cost increases to the failure of 
one of its targets in July 2008, which added mission assurance and 
testing costs to the follow-on mission using the same target 
configuration. We 
estimate that if the contractor continues to perform as it has in the 
past, it will overrun its budgeted cost of $1.1 billion at contract's 
end in December 2009 by between $63.7 million and $75.9 million. 

THAAD Contractor Spent More Money and Time Than Budgeted: 

The Terminal High Altitude Area Defense (THAAD) program experienced 
target issues during fiscal year 2008. The THAAD program performed a 
realignment in May 2008 to extend the flight test program after 
experiencing several delayed target deliveries. Because of the cost 
impact of these delayed targets, the program will increase the value 
of the work under its contract by approximately $80 million. The THAAD 
program performed a similar realignment in December 2006 as a result of 
delayed target deliveries as well. As a result of this realignment, the 
program extended its flight test program and added an estimated $121 
million to the value of work under its contract. 

The THAAD contractor experienced downward trends in its cost and 
schedule performance during fiscal year 2008. The program overran its 
budgeted costs for the fiscal year by $33.5 million. It was also unable 
to accomplish $7.4 million worth of work during the fiscal year. Both 
of these unfavorable variances added to the negative cumulative cost 
and schedule variances of $228.7 million and $16.5 million, 
respectively, as shown in figure 19. 

Figure 19: THAAD Fiscal Year 2008 Cost and Schedule Performance: 

[Refer to PDF for image: multiple line graph] 

Month: September 07; 
Cumulative Cost Variance: -$195.232 million; 
Cumulative Schedule Variance: -$9.058 million. 

Month: October 07; 
Cumulative Cost Variance: -$196.647 million; 
Cumulative Schedule Variance: -$9.515 million. 

Month: November 07; 
Cumulative Cost Variance: -$187.51 million; 
Cumulative Schedule Variance: -$8.4 million. 

Month: December 07; 
Cumulative Cost Variance: -$187.962 million; 
Cumulative Schedule Variance: -$9.815 million. 

Month: January 08; 
Cumulative Cost Variance: -$187.929 million; 
Cumulative Schedule Variance: -$10.492 million; 

Month: February 08; 
Cumulative Cost Variance: -$192.491 million; 
Cumulative Schedule Variance: -$10.311 million. 

Month: March 08; 
Cumulative Cost Variance: -$199.843 million; 
Cumulative Schedule Variance: -$10.774 million. 

Month: April 08; 
Cumulative Cost Variance: -$206.191 million; 
Cumulative Schedule Variance: -$11.979 million. 

Month: May 08; 
Cumulative Cost Variance: -$213.162 million; 
Cumulative Schedule Variance: -$14.308 million. 

Month: June 08; 
Cumulative Cost Variance: -$217.408 million; 
Cumulative Schedule Variance: -$12.351 million. 

Month: July 08;
Cumulative Cost Variance: -$221.086 million; 
Cumulative Schedule Variance: -$15.883 million. 

Month: August 08; 
Cumulative Cost Variance: -$223.567 million; 
Cumulative Schedule Variance: -$14.516 million. 

Month: September 08; 
Cumulative Cost Variance: -$228.743 million;
Cumulative Schedule Variance: -$16.497 million. 

Source: Contractor (data), GAO (presentation). 

[End of figure] 

The THAAD prime contractor's fiscal year cost overrun of $33.5 million 
was primarily caused by the radar, missile, and launcher portions of 
work. Design problems delayed the prime power unit design review and 
slowed parts production, causing the radar's negative cost trend. In 
addition, the missile's negative cost trend for this same period was 
driven by design complexity, ongoing rework/retest of subsystems, 
unexpected qualification discoveries, and unfavorable labor variances 
at key subcontractors. Lastly, the launcher variances were driven by 
hardware and software complexities and higher-than-expected costs for 
transitioning a portion of this effort to a different facility for 
production. 

The contractor reports that its unfavorable fiscal year schedule 
variance of $7.4 million is primarily driven by the radar and missile 
components. The radar's negative schedule variance is associated with 
vendor delays in delivering trailers for both of the system's prime 
power units. The late delivery of the trailers has subsequently delayed 
delivery of the prime power units. Missile rework due to qualification 
test discoveries also negatively affected schedule performance. If the 
contractor continues to perform as it has through fiscal year 2008, we 
project that at the contract's completion currently scheduled for 
September 2009, the contractor could overrun its budgeted cost of $4.6 
billion by between $252 million and $274 million. 

[End of section] 

Appendix III: FTG-04 Flight Test Cancellation: 

On May 23, 2008, the Senate Armed Services Committee requested that we 
review the reasons behind the cancellation of a GMD flight test 
designated FTG-04. Initially, on May 1, 2008, the Director, MDA decided 
to delay this test due to problems discovered in a telemetry device, 
the Pulse Code Modulation Encoder (PCME). This device does not affect 
operational performance, but rather is a critical component needed to 
transmit flight test data only. The PCME problems were due in large 
part to manufacturing defects, which the manufacturers and MDA 
concluded likely affected all the PCMEs. However, on May 8, the 
Director of MDA instead decided to cancel this flight test entirely, 
resulting in one less GMD end-to-end intercept flight test. MDA told us 
that delaying the flight test until the PCMEs could be repaired would 
cause delays in future tests since various test assets were shared. MDA 
officials therefore decided to cancel FTG-04 and transfer some test 
objectives to other tests, including a new non-intercept flight test, 
FTX-03, and an already planned intercept flight test, FTG-05. Also, for 
some remaining objectives not captured in FTG-05 and FTX-03, MDA stated 
that it planned a third intercept test, FTG-X. We were asked to 
investigate this test cancellation and answer the following questions: 

* Why did the MDA change its initial decision to delay FTG-04 until 
November 2008 and decide to cancel FTG-04 instead and what deliberative 
process did MDA follow in deciding to cancel FTG-04? 

* When and how, if ever, will each of the specific test objectives 
previously planned for FTG-04 be accomplished? 

* What are the implications of canceling this flight test on the 
ensuing test program, on demonstrating the capability of the GMD 
system, and on other programmatic decisions? 

Faulty Telemetry Component Caused Delay and Subsequent Cancellation of 
FTG-04: 

MDA initially delayed the FTG-04 flight test because of defects in the 
PCME, a telemetry component in the Exoatmospheric Kill Vehicle (EKV) 
only needed to gather test data. Although the PCME does not affect 
operational performance, it is needed for test assets to determine if 
design and operational issues have been resolved. FTG-04 had been 
delayed four times previously and was originally scheduled for the 
first quarter of fiscal year 2007. These delays, in turn, affected 
multiple tests over several years. 

Several defects contributed to the problem, the first three of which 
are presumed to affect all PCMEs manufactured up to that point and all 
24 fielded Test Bed/Capability Enhancement (CE) I EKVs: 

* The PCMEs experienced gold embrittlement due to lack of pretinning. 

* Insufficient oscillator stand-off height increased thermal stress. 

* Circuit board deflection was caused by three washers missing from 
the board. 

In addition to these manufacturing defects, there were stress fractures 
in the solder of three PCMEs caused by the removal and replacement of a 
chip on the device. This chip was removed because its clock was 
asynchronous with another component's clock. It was estimated that 
there was an 18 to 48 percent chance of the loss of telemetry data at 
some point during a flight test due to the asynchronous chip problem. 
Again, all 24 fielded Test Bed/CE-I EKVs have the chip with this 
problem. This chip does not affect operational performance, but rather 
is a critical component needed to transmit flight test data only. 

See table 10 for timeline of events related to this cancellation. 

Table 10: Timeline of Events: 

1/12/08-2/4/08; 
During early tests of the Payload 33 PCME at the subcontractor's 
facility, no failures were detected. 

2/7/08; 
During final test readiness reviews at Vandenberg Air Force Base, the 
first failure was identified. 

2/8/08-2/22/08; 
Troubleshooting isolates the problem to the PCME; the EKV is returned 
to the contractor for removal of the PCME. 

2/22/08; 
MDA de-emplaced an interceptor (Payload 32) as a replacement. 

3/3/08-3/26/08; 
Troubleshooting continues, failures are repeatable but intermittent. 

3/28/08-4/2/08; 
Fault isolated to oscillator and solder joints. 

4/30/08-5/1/08; 
Tiger Team formed to assess risk and presented risk assessment to the 
Director, MDA. 

5/1/08; 
Director, MDA delays FTG-04 test. 

5/8/08; 
Program Change Board recommends cancellation; Director, MDA makes the 
decision to cancel, replace it with FTX-03 and informs Congress. 

Source: GAO presentation; MDA documentation. 

[End of table] 

The contractor, Boeing, and the subcontractors, Raytheon and L-3 (the 
manufacturer of the component), took actions to mitigate the problem. 
They eliminated the gold embrittlement problem by sending the 
oscillator out for pretinning, they designed custom washers for two 
already produced PCMEs and raised three bosses for new PCMEs to 
eliminate the need for washers, and they tightened tolerances on the 
board to eliminate the deflection issue. These first three PCME 
manufacturing improvements were finalized on May 16, 2008. They also 
made changes to correct the chip with the asynchronous clock problem in 
all newly manufactured PCMEs. None of the previously fielded GBIs will 
be refurbished with improved PCMEs needed for flight tests, but the 
GBIs emplaced starting in October 2008 and thereafter have the improved 
PCME. 

On May 1, 2008, MDA's Program Change Board considered five options. 

1. Execute FTG-04 as scheduled, using payload 32 "as is." 

2. Continue diagnostic testing of payload 32, but if a decision was 
made that it was not ready, substitute payload 33, leading to a delay 
in the test schedule. 

3. Refurbish payload 32, but if it did not improve, substitute payload 
33. 

4. Immediately replace payload 32 with payload 33 without further 
testing. 

5. Immediately return payload 32 for repair. 

The Director, MDA chose option 5, delaying the FTG-04 into the November 
to December 2008 timeframe, but keeping the program on track to provide 
this intercept data as planned. MDA consulted the test community, 
Director, Operational Test and Evaluation (DOT&E) and the BMDS 
Operational Test Agency, on this initial decision to delay the test and 
both agreed with this decision. According to MDA, the Director also 
asked for options for a sensor test in the summer of 2008. 

On May 8, 2008, MDA's Program Change Board reconvened to consider three 
options for a sensor test (FTX-03) and canceling instead of delaying 
FTG-04: 

1. Conduct FTX-03, with a baseline like the planned FTG-04, but without 
a live intercept attempt. 

2. Conduct FTX-03 with a baseline like the FTG-05, an intercept flight 
test to be conducted in December 2008. 

3. Similar to option 2, but with more sensor data collected. 

The Director, MDA changed the May 1 decision to delay, refurbish, and 
fly the planned FTG-04 test and chose instead to cancel FTG-04 and 
pursue the modified option 3 above. Choosing option 3 resulted in 
restructuring the intercept test into a test designed to assess 
multiple sensor integration capability. This new test benefited sensor 
modeling and simulation because post-flight reconstruction could now 
occur on two missions. 

According to MDA, it canceled the FTG-04 at the May 8, 2008 meeting 
instead of delaying it, in part, because rescheduling FTG-04 would have 
caused a major delay in another test, the Distributed Ground Test-03 
(GTD-03). GTD-03 and FTG-04 required many of the same assets, so 
conducting FTG-04 would have delayed GTD-03 and thus the delivery of 
this new capability by 4 months. Ground tests assess the increased 
BMDS capability to be fielded next; GTD-03 was to provide both a more 
realistic simulation of threats and scenarios and the means by which 
new software capability could be declared ready to move into the 
operational baseline. MDA consulted with BMDS Operational Test 
Agency officials on this decision and they supported it. DOT&E was not 
consulted on this decision and expressed concern that the elimination 
of any intercept test reduced the opportunity to gather additional data 
that might have increased confidence in models and simulations. DOT&E 
has repeatedly expressed concerns over the lack of test data needed to 
validate MDA's models and simulations. 

Most FTG-04 Test Objectives Will Be Allocated to Follow-on Tests: 

According to MDA, all FTG-04 test objectives were allocated to other 
flight tests. However, partly due to differences in how MDA describes 
test objectives, it is unclear whether all planned FTG-04 test 
objectives will be accomplished in follow-on tests. The loss of a 
primary objective, an intercept of a complex target scene, will slow 
MDA's efforts to build confidence in the EKV's ability to consistently 
achieve intercepts, unless an additional intercept is scheduled. In 
August 2008, MDA informed Congress that it planned to conduct a new 
intercept test called FTG-X in fiscal year 2009. However, in January 
2009 MDA stated that the FTG-X intercept test was never formally 
approved and is no longer planned. 

In addition, some test objectives related to modeling and simulations 
have been redefined so it is unclear whether they will be fully tested. 
Models and simulations are critical to understanding and assessing the 
performance of the BMDS because flight tests are limited by their cost, 
complexity, and range safety constraints. Modeling and simulation is 
therefore the primary way to fully assess the overall performance of 
the BMDS and its various components. According to DOT&E, cancellation 
of FTG-04 reduced interceptor and EKV data available for modeling, 
leaving only two intercepts (FTG-3a and FTG-05) that have provided 
complete sets of information. In October 2008, MDA stated that 
modification of FTG-04 into a sensor test eliminated a second 
opportunity to anchor the models of EKV-fielded software. Test 
objectives in MDA planning documents describe modeling and simulation 
objectives at a high level. However, it is difficult to determine 
whether modeling and simulation objectives are addressed in the near 
term because the objectives are defined differently for each test. For 
example, the FTX-03 and FTG-05 objectives do not distinguish between 
primary and secondary objectives, while the FTG-04 objectives do. One 
objective that appears to be absent from FTG-05 and FTX-03 is 
collecting data to support validation and anchoring of system-level 
(vs. element-level) simulations. MDA stated that the exclusion of this 
objective was inadvertent and that it will be addressed by the tests. 

BMDS Operational Test Agency objectives related to the GBI engagement 
were not met, although test agency officials indicate that the majority 
of their non-intercept, sensor-related objectives for FTG-04 were met 
in the FTX-03 test. However, BMDS Operational Test Agency officials 
state that some of their intercept objectives may be addressed through 
a combination of a previous intercept test, FTG-03a, and the recently 
conducted FTG-05. Finally, several warfighter objectives for FTG-04, 
related to 
tactics, techniques, and procedures, will be met through the ground 
tests instead because, according to the warfighter representative at 
the BMDS Operational Test Agency, flight tests do not offer the best 
opportunity to assess this kind of objective. 

Cancellation Eliminates One of Few Opportunities to Demonstrate GMD 
Capabilities: 

The cancellation has increased the strain on the ensuing test program. 
GMD's current plans call for two intercept attempts in fiscal year 
2009--FTG-05, which was conducted in December 2008, and FTG-06--and one 
booster verification test. This is an ambitious schedule as GMD has 
been able to conduct only one intercept flight test per year--FTG-02 in 
September 2006, FTG-03a in September 2007 and FTG-05 in December 2008. 
MDA had planned to conduct five intercept tests with varying stresses 
to assess the EKV capability between February 2007 and December 2008. 
Flight test failures and test plan revisions caused MDA to carry out 
only two intercept tests in that period--FTG-03a and FTG-05--both of 
which resulted in an intercept. 

In addition, the number of future flight tests planned has been 
reduced. MDA has not funded or scheduled an intercept replacement for 
FTG-04. In January 2008, MDA decided to merge two intercept tests--FTG-
06 and FTG-07--into a single intercept attempt. This merger removes 
another opportunity to gather end-game EKV performance data needed to 
assess capability. In January 2008, MDA also decided to accelerate a 
two-stage, non-intercept verification test required to assess the 
European component. 

The cancellation of FTG-04 removed one chance to obtain end-game 
performance data needed to develop GMD models and to assess the 
capability of the CE-I EKV. The repetition of intercept-related 
objectives is important to build confidence in the intercept 
capability. These models are the primary way to fully assess the 
overall system performance, since flight tests are limited by their 
cost, complexity and range safety concerns. MDA planned to test the CE- 
I EKV against a dynamic target scene with countermeasures in both FTG- 
04 and FTG-05. FTG-04 was canceled and an FTG-05 target anomaly 
affected this objective. According to MDA, no more CE-I EKV flight 
tests have been approved, although it is considering whether to conduct 
an intercept test using a CE-I EKV in the future. GMD developed some 
mitigations to various developmental issues, but realistic flight 
testing is needed to anchor the models and to determine the 
effectiveness of these mitigations. 

The test cancellation and target problems have reduced the knowledge 
that MDA expected to use for its upcoming end-to-end performance 
assessment. Performance assessments are annual system-level assessments 
to test, evaluate, and characterize the operational capability of the 
BMDS as of the end of the calendar year. To date, MDA has completed 
only one--Performance Assessment 2007. Furthermore, acting on a joint 
recommendation by MDA and the Operational Test Agency, MDA officials 
canceled their 2008 performance assessment efforts in April 2008 
because of developmental risks associated with modeling and 
simulations. Instead, MDA is focusing on testing and models for 
Performance Assessment 2009. However, the planned performance 
information available for Performance Assessment 2009 will be reduced. 
The FTG-04 cancellation reduced one set of data that was expected to be 
available. In addition, both FTX-03 and FTG-05 will be used to anchor 
data for Performance Assessment 2009, but target anomalies in each test 
precluded the completion of all planned test objectives. Neither target 
presented the complexity needed for advanced algorithm development. 

Manufacturing and emplacement continue unabated by the reductions and 
delays in tests. Twenty-four CE-I GBIs have been fielded, and the new 
CE-II GBIs are now being fielded without important knowledge about the 
system's capabilities that tests were expected to provide. The first 
CE-II GBI emplacement occurred prior to any flight testing of this 
configuration. The first such flight test, FTG-06, is currently 
scheduled to occur no earlier than the fourth quarter of fiscal year 
2009. According to MDA, these CE-II GBIs will not be declared 
operational until after 
the successful completion of FTG-06. FTG-04 was also identified as a 
key source of data supporting a number of capabilities declarations. 
The cancellation of FTG-04, plus other testing delays, prompted MDA to 
defer some capabilities and to declare others based on previous tests. 

Conclusions: 

The cancellation of the FTG-04 flight test increases the risk to the 
GMD program and to the overall BMDS capability, since the lack of 
adequate intercept data adversely affects confidence that the system 
could perform as intended in a real-world situation. The GMD program 
has reduced its plans to assess operational performance of the fielded 
configuration between February 2007 and December 2008 from five to two 
intercept tests, leaving gaps in knowledge about the repeatability of 
the performance of fielded assets. In addition, the opportunity to 
obtain additional intercept data vital to the anchoring of models and 
simulations has been lost, unless the FTG-X flight test is conducted, 
adding to an existing concern expressed by DOT&E. Despite test 
reductions and effects on assessing system-level performance, 
production and fielding of assets continues as planned. 

[End of section] 

Appendix IV: Reduced Basis for Capability Declarations: 

The two tables below list test events supporting MDA capability 
declarations during fiscal year 2008 for certain engagement sequence 
groups in Blocks 1.0 through 3.0 (see table 11), as well as for the 
full completion of Block 1.0 by the end of fiscal year 2009 (see table 
12). Both tables illustrate that MDA reduced the basis for declaring 
certain engagement sequence groups as early or fully capable. The basis 
for declaring an early, partial, or full capability includes flight and 
ground tests as well as performance assessments. 

Table 11: Engagement Sequence Groups with Revised Basis for Fiscal Year 
2008 Capability Declarations: 

Engagement sequence: Block 1.0: GBI Launch on COBRA DANE/Upgraded Early 
Warning Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 1.0: GBI Launch on COBRA DANE/Upgraded Early 
Warning Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Launch on COBRA DANE/Upgraded Early 
Warning Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Launch on COBRA DANE/Upgraded Early 
Warning Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTG-04-3. 

Engagement sequence: Block 1.0: GBI Engage on Sea-based X-band Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 1.0: GBI Engage on Sea-based X-band Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Engage on Sea-based X-band Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Engage on Sea-based X-band Radar; 
Capability declaration: Early;
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTG-04-3. 

Engagement sequence: Block 1.0: GBI Launch on Sea-based X-band Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 1.0: GBI Launch on Sea-based X-band Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Launch on Sea-based X-band Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 1.0: GBI Launch on Sea-based X-band Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTG-04-3. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA 08; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTM-15; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTX-03[A]; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: FTM-10,-11,-12,-13. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTD-02. 

Engagement sequence: Block 2.0: SM-3 Engage on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTI-02. 

Engagement sequence: Block 2.0: SM-3 Launch on Remote shipboard Aegis 
Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Launch on Remote shipboard Aegis 
Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Launch on Remote shipboard Aegis 
Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: FTM-13; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 2.0: SM-3 Launch on Remote shipboard Aegis 
Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: FTM-14. 

Engagement sequence: Block 2.0: THAAD Engage on AN/TPY-2 (terminal 
mode); 
Capability declaration: Early; 
Planned basis for 2008 declaration: FTT-09; 
Revised basis for 2008 declaration[A]: [Check]. 

Engagement sequence: Block 2.0: THAAD Engage on AN/TPY-2 (terminal 
mode); 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: FTT-07,-08. 

Engagement sequence: Block 2.0: THAAD Engage on AN/TPY-2 (terminal 
mode); 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTI-02,-03. 

Engagement sequence: Block 2.0: THAAD Engage on AN/TPY-2 (terminal 
mode); 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTX-03a. 

Engagement sequence: Block 3.0: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: GTX-03[A]; 
Revised basis for 2008 declaration[A]: Dropped. 

Engagement sequence: Block 3.0: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTD-02. 

Engagement sequence: Block 3.0: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTI-02. 

Engagement sequence: Block 3.0: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: GTI-03. 

Engagement sequence: Block 3.0: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Early; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2008 declaration[A]: FTX-03. 

Source: GAO analysis of MDA data. 

[A] A "Check" in this column indicates that the planned assessment or 
test was actually used for the capability declaration. 

[End of table] 

Table 12: Block 1.0 Engagement Sequence Groups with Revised Basis for 
Completion at End of Fiscal Year 2009: 

Engagement sequence: GBI Engage on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA-07; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Partial; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Partial; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: GTD-02. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Partial; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: FTI-02. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA-08; 
Revised basis for 2009 declaration[A]: PA-09 Quick Look. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on COBRA DANE (Beale Air Force Base, 
CA); 
Capability declaration: Full;
Planned basis for 2008 declaration: FTG-05; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Empty]. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: [Empty]. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA-08; 
Revised basis for 2009 declaration[A]: PA-09 Quick Look. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-05; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Partial; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA-08; 
Revised basis for 2009 declaration[A]: PA-09 Quick Look. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-03; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on Sea-based X-band radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-05; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: GTI-03. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: FTX-03. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: GTI-03. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2009 declaration[A]: [Check]. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on forward-based AN/TPY-2 radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: FTX-03. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: GTI-03. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Engage on shipboard Aegis radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: FTX-03. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: PA 07; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: GTI-03. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTD-02; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: GTI-02; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: FTG-04; 
Revised basis for 2009 declaration[A]: Dropped. 

Engagement sequence: GBI Launch on shipboard Aegis Radar; 
Capability declaration: Full; 
Planned basis for 2008 declaration: [Empty]; 
Revised basis for 2009 declaration[A]: FTX-03. 

Source: GAO analysis of MDA data. 

[A] A "Check" in this column indicates that the planned assessment or 
test for the capability declaration has not changed. 

[End of table] 

[End of section] 

Appendix V: Scope and Methodology: 

To examine the progress MDA made in fiscal year 2008 toward its cost, 
schedule, testing, and performance goals, we examined the efforts of 10 
BMDS elements that MDA is developing and fielding. The elements 
included in our review collectively accounted for 80 percent of MDA's 
fiscal year 2008 research and development budget requests. In assessing 
each element, we examined the BMDS Fiscal Year 2008 Statement of Goals, 
Program Execution Reviews, test plans and reports, production plans, 
Contract Performance Reports, MDA briefings, and earned value 
management data. We developed data collection instruments that were 
completed by MDA and each element program office. The instruments 
gathered detailed information on planned and completed program 
activities including tests, design reviews, prime contracts, estimates 
of element performance, and challenges facing the elements. In 
addition, we discussed fiscal year 2008 progress and performance with 
officials in MDA's Agency Operations Office, each element program 
office, as well as the Office of DOD's Director, Operational Test and 
Evaluation, and DOD's Operational Test Agency. To assess each element's 
progress toward its cost goals, we reviewed Contract Performance 
Reports and, when available, the Defense Contract Management Agency's 
analyses of these reports. We applied established earned value 
management techniques to data captured in Contract Performance Reports 
to determine trends and used established earned value management 
formulas to project the likely costs of prime contracts at completion. 
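
The established earned value management formulas referred to above are 
standard and can be sketched briefly. The following Python sketch is 
illustrative only; the contract figures are hypothetical and are not 
drawn from MDA data. BCWS, BCWP, ACWP, and BAC are the standard 
cumulative quantities reported in Contract Performance Reports: 

```python
def evm_projection(bcws, bcwp, acwp, bac):
    """Project an estimate at completion (EAC) from cumulative EVM data.

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    bac:  budget at completion (total contract budget)
    """
    cpi = bcwp / acwp                 # cost performance index
    spi = bcwp / bcws                 # schedule performance index
    cost_variance = bcwp - acwp       # negative means over cost
    schedule_variance = bcwp - bcws   # negative means behind schedule
    eac = bac / cpi                   # common CPI-based projection
    return {"cpi": cpi, "spi": spi, "cv": cost_variance,
            "sv": schedule_variance, "eac": eac}

# Hypothetical contract: $500M budget, $200M of work earned to date,
# $220M of work planned, and $250M actually spent (figures in millions).
result = evm_projection(bcws=220.0, bcwp=200.0, acwp=250.0, bac=500.0)
print(f"CPI={result['cpi']:.2f} SPI={result['spi']:.2f} "
      f"EAC=${result['eac']:.0f}M")
```

A CPI below 1.0 indicates the contract is earning less value than it is 
spending, so the CPI-based formula projects a completion cost above the 
original budget, which is the kind of trend such analyses surface. 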

To evaluate the sufficiency of MDA's modeling and simulation practices, 
we reviewed DOD and MDA policies, memos, flight and test plans related 
to modeling and simulations, the Acquisition Modeling and Simulation 
Master plan, as well as verification, validation and accreditation 
plans and reports for various elements, and MDA white papers discussing 
modeling and simulation techniques. We also interviewed officials in 
element program offices to discuss modeling and simulation plans and 
procedures particular to each. 

In assessing MDA's accountability, transparency, and management 
controls, we interviewed officials from the Office of the Under 
Secretary of Defense for Acquisition, Technology, and Logistics, as 
well as officials in the MDA Agency Operations Directorate. 
We also reviewed an Institute for Defense Analysis study, two 
Congressional Research Service reports, a Congressional Budget Office 
report, U.S. Code, DOD acquisition system policy, various DOD 
directives, the Missile Defense Executive Board charter, and various 
MDA statements and documents related to the agency's block structure. 

To ensure that MDA-generated data used in our assessment are reliable, 
we evaluated the agency's management control processes. We discussed 
these processes with MDA senior management. In addition, we confirmed 
the accuracy of MDA-generated data with multiple sources within MDA 
and, when possible, with independent experts. To assess the validity 
and reliability of prime contractors' earned value management systems 
and reports, we interviewed officials and analyzed audit reports 
prepared by the Defense Contract Audit Agency. Finally, we assessed 
MDA's internal accounting and administrative management controls by 
reviewing MDA's Federal Manager's Financial Integrity Report for Fiscal 
Years 2003, 2004, 2005, 2006, 2007, and 2008. 

Our work was performed primarily at MDA headquarters in Arlington, 
Virginia. At this location, we met with officials from the Aegis 
Ballistic Missile Defense Program Office; Airborne Laser Program 
Office; Command, Control, Battle Management, and Communications Program 
Office; MDA's Agency Operations Office; DOD's Office of the Director, 
Operational Test and Evaluation; and the Office of the Under Secretary 
of Defense for Acquisition, Technology and Logistics. In addition, in 
Huntsville, Alabama, we met with officials from the Ground-based 
Midcourse Defense Program Office, the Sensors Program Office, the 
Terminal High Altitude Area Defense Project Office, the Kinetic Energy 
Interceptors Program Office, the BMDS Kill Vehicles Program Office, the 
Targets and Countermeasures Program Office, and the Office of the 
Director for BMDS Tests. We also met with Space Tracking and 
Surveillance System officials in El Segundo, California. 

In December 2007, the conference report accompanying the National 
Defense Authorization Act for Fiscal Year 2008 noted the importance of 
DOD and MDA providing information to GAO in a timely and responsive 
manner to facilitate the review of ballistic missile defense programs. 
During the course of this audit, we experienced significant delays in 
obtaining information from MDA. MDA did not provide GAO staff with 
expeditious access to requested documents, which delayed some audit 
analysis and contributed to extra staff-hours. Of the 
documents we requested, we received approximately 19 percent within the 
10-15 business day protocols that were agreed upon with MDA. Pre- 
existing documentation took MDA on average about 50 business days to 
provide and many pre-existing documents took over 100 business days to 
be provided to GAO. Notwithstanding these delays, we were able to 
obtain the information needed to satisfy our objectives in accordance 
with generally accepted government auditing standards. 

We conducted this performance audit from May 2008 to March 2009 in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

[End of section] 

Appendix VI: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Paul Francis (202) 512-4841 or francisp@gao.gov: 

Acknowledgments: 

In addition to the contact named above, David Best, Assistant Director; 
LaTonya Miller; Beverly Breen; Ivy Hübler; Tom Mahalek; Steven Stern; 
Claire Cyrnak; Isabella Johnson; Meredith Allen Kimmett; Kenneth E. 
Patton; Karen Richey; and Alyssa Weir made key contributions to this 
report. 

[End of section] 

Footnotes: 

[1] This initial BMDS capability was to defend the U.S. homeland, 
deployed troops, friends, and allies against ballistic missiles of all 
ranges in all phases of flight. MDA was tasked with carrying out the 
President's direction. 

[2] According to MDA, the agency was expected to field an initial 
increment of missile defense capability that provides initial 
protection of the entire United States from North Korea, partial 
protection of the United States from the Middle East threat and 
protection of deployed forces, allies, and friends with terminal 
defenses. MDA fielded a limited capability that included initial 
versions of Ground-based Midcourse Defense; Aegis Ballistic Missile 
Defense; Patriot Advanced Capability-3; and Command, Control, Battle 
Management, and Communications elements. MDA expected to enhance these 
initial capabilities and, over time, produce an overarching BMDS capable
of protecting the United States, deployed forces, friends, and allies 
from ballistic missile attacks of all ranges. 

[3] National Defense Authorization Act for Fiscal Year 2002, Pub. L. 
No. 107-107, § 232(g) (2001); John Warner National Defense 
Authorization Act for Fiscal Year 2007, Pub. L. No. 109-364, § 224 
(2006); and National Defense Authorization Act for Fiscal Year 2008, 
Pub. L. No. 110-181, § 225 (2007). 

[4] We did not assess MDA's progress in fiscal year 2002 as the agency 
did not establish goals for that fiscal year. We delivered the 
following reports for fiscal years 2003 through 2007: GAO, Missile 
Defense: Actions Are Needed to Enhance Testing and Accountability, 
[hyperlink, http://www.gao.gov/products/GAO-04-409] (Washington, D.C.: 
Apr. 23, 2004); Defense Acquisitions: Status of Ballistic Missile 
Defense Program in 2004, [hyperlink, 
http://www.gao.gov/products/GAO-05-243] (Washington, D.C.: Mar. 31, 
2005); Defense Acquisitions: Missile Defense Agency Fields Initial 
Capability but Falls Short of Original Goals, [hyperlink, 
http://www.gao.gov/products/GAO-06-327] (Washington, D.C.: Mar. 15,
2006); Defense Acquisitions: Missile Defense Acquisition Strategy 
Generates Results but Delivers Less at a Higher Cost, [hyperlink, 
http://www.gao.gov/products/GAO-07-387] (Washington, D.C.: Mar. 15,
2007); and Defense Acquisitions: Progress Made in Fielding Missile 
Defense, but Program Is Short of Meeting Goals, [hyperlink, 
http://www.gao.gov/products/GAO-08-448] (Washington, D.C.: Mar. 14,
2008). 

[5] The BMDS also includes an 11th element, Patriot Advanced
Capability-3, which has been transferred to the Army for production,
operation, and sustainment. This report does not evaluate Patriot
Advanced Capability-3 because its initial development is complete and
the program is now managed by the Army.

[6] Beginning in 2009, this report--previously referred to as the 
Statement of Goals--will be called the BMDS Accountability Report. 

[7] BMDS hardware and software are grouped into engagement sequence 
groups, each of which is the specific combination of sensors, weapons,
and C2BMC capabilities needed to detect, track, and
intercept a particular threat. The engagement sequence group construct 
was created as an engineering tool to provide a simple representation 
of BMDS capabilities, integration, and functionality and is defined as 
a unique combination of detect-control-engage functions performed by 
BMDS subsystems used to engage a threat ballistic missile. 

[8] 10 U.S.C. § 2435 requires an approved program baseline for major 
defense acquisition programs. The BMDS program meets the definition of 
a major defense acquisition program, which is defined in 10 U.S.C. § 
2430; however, the requirement to establish a baseline is not triggered 
until entry into system development and demonstration. Because the BMDS 
has not yet formally entered the acquisition cycle, it has not yet been 
required to meet the minimum requirements of section 2435. 

[9] The National Defense Authorization Act for Fiscal Year 2005, Pub.
L. No. 108-375, § 234(e), required the Director, MDA, to establish and
report annually to Congress a cost, schedule, and performance baseline 
for each block configuration being fielded. Modifications to the
baseline and variations against the baseline must also be reported. In 
addition, the National Defense Authorization Act for Fiscal Year 2008, 
Pub. L. No. 110-181 § 223(g) required that no later than the submittal 
of the budget for fiscal year 2009, MDA shall "establish acquisition 
cost, schedule and performance baselines" for BMDS elements that have 
entered the equivalent of system development and demonstration or are 
being produced and acquired for operational fielding.

[10] 10 U.S.C. § 2432 requires DOD to submit to Congress a Selected 
Acquisition Report (SAR) for each major defense acquisition program 
that includes the program acquisition unit cost for each program. 
Unless waived by the Secretary of Defense, this requirement applies 
when funds have been appropriated for the program and the program has 
proceeded to system development and demonstration. Development programs 
that have not entered system development and demonstration may submit a 
limited SAR. MDA submits a limited SAR that does not include program 
acquisition unit costs. 

[11] 10 U.S.C. § 2433 (commonly referred to as Nunn-McCurdy). 

[12] National Defense Authorization Act for Fiscal Year 2008, Pub. L. 
No. 110-181 § 223(g) requires MDA to provide unit cost reporting data 
for each BMDS element that has entered the equivalent of the systems 
development and demonstration phase of acquisition or is being produced 
and acquired for operational use, and to secure independent estimation
and verification of such cost reporting data. How MDA was to calculate
these unit costs was not specified. 

[13] 10 U.S.C. § 2432. 

[14] Flyaway cost refers to the cost of procuring prime mission 
equipment (e.g., an aircraft, ship, tank, etc.). It is funded with 
Procurement appropriations and is part of the Investment cost category. 
This term includes the Work Breakdown Structure elements of Prime 
Mission Equipment, System Engineering/Program Management, System Test 
and Evaluation, Warranties, and Engineering Changes. 

[15] Earned Value Management is a program management tool that 
integrates the technical, cost, and schedule parameters of a contract. 
During the planning phase, an integrated baseline is developed by time 
phasing budget resources for defined work. As work is performed and 
measured against the baseline, the corresponding budget value is 
"earned." 

[16] According to the Over Target Baseline and Over Target Schedule 
Handbook, May 7, 2003. 

[17] The GMD program's replan began in fiscal year 2008 and is still
ongoing; as such, it is not included in the totals in table 4.

[18] To determine if contractors are executing the work planned within 
the funds and time budgeted, each BMDS program office requires its 
prime contractor to provide monthly reports detailing cost and schedule 
performance. 

[19] We analyzed three task orders for the MKV program issued as part 
of an Indefinite Delivery/Indefinite Quantity contract as well as three 
contracts managed by the Aegis BMD element. We assessed one contract 
for each of the other eight elements. Indefinite Delivery/Indefinite 
Quantity contracts provide for an indefinite quantity, within stated 
limits, of supplies or services during a fixed period. 

[20] The current budgeted costs at completion are as of September 30,
2008. 

[21] Situational awareness is defined as the degree to which the 
perception of the current environment mirrors reality. 

[22] According to program officials, an earlier test--FTG-02--provided 
limited data for assessment purposes. However, the data were incomplete
and could not be used to fully verify and validate the models and 
simulations. 

[23] AN/TPY-2 was formerly known as Forward-Based X-Band Transportable 
radar. 

[24] The BMDS Operational Test Agency conducts independent operational 
assessments of BMDS capability to defend the United States, its 
deployed forces, friends, and allies against ballistic missiles of all 
ranges and in all phases of flight. MDA funds all BMDS Operational Test 
Agency activities. 

[25] There are various categories of knowledge points. For example, an 
element knowledge point is based on an element event that provides 
critical information for a key element program decision requiring the 
Program Manager's approval. Element knowledge points support one or 
more Director knowledge points, and may be supported by other knowledge 
points. 

[26] A sound business case demonstrates that (1) the identified needs 
are real and necessary and are best met with the chosen concept and (2) 
the chosen concept can be developed and produced with existing 
resources--such as technical knowledge, funding, time, and management
capacity.

[27] GAO, Defense Acquisitions: Sound Business Case Needed to Implement 
Missile Defense Agency's Targets Program, [hyperlink, 
http://www.gao.gov/products/GAO-08-1113] (Washington, D.C.: Sept. 26, 
2008). 

[28] The accreditation of models and simulations is an official
certification that a model or simulation is acceptable for use as its
developers intended. Before deciding to accredit a model, MDA must
first verify that the models and simulations operate as the designers
conceptualized, and then validate that the models are sufficiently
accurate representations of real-world conditions for their intended
purposes.

[29] An end-to-end simulation represents a complete BMDS engagement-- 
from enemy missile launch to attempted intercept by a BMDS kill vehicle.

[30] The BMDS Operational Test Agency provides an independent 
accreditation of MDA models and simulations. 

[31] The BMDS Operational Test Agency defines artificialities as BMDS 
architecture, targets, procedures, and conditions that exist in flight 
tests but would not exist in the real world. Flight test 
artificialities are introduced for a number of reasons, such as 
increased chances of success, range safety, data collection, and asset 
availability. 

[32] Weather conditions include rain, clouds, and snow. Severe sea 
states, ice loads, or winds could render tests unsafe to execute. 

[33] Post-flight reconstruction is the process of manually recreating 
and running a past flight test scenario in a simulated environment. 

[34] AN/TPY-2 radars were formerly known as Forward-Based X-Band
Transportable radars. According to MDA, an additional AN/TPY-2 radar
has been provided and is undergoing Government ground testing. 

[35] The National Defense Authorization Act for Fiscal Year 2002, Pub. 
L. No. 107-107, § 234(c), states that "for ground-based midcourse 
interceptor systems, the Secretary of Defense shall initiate steps 
during fiscal year 2002 to establish a flight test capability of 
launching not less than three missile defense interceptors and not less 
than two ballistic missile targets to provide a realistic test 
infrastructure." To date, GMD has not conducted this test.

[36] 10 U.S.C. § 2399. 

[37] See appendix III, which details the reasons for the FTG-04
cancellation.

[38] GTD-03 was delayed to support warfighter needs, which resulted in
capability assessments and a capability declaration occurring later
than planned.

[39] The accreditation of models and simulations is an official
certification that a model or simulation is acceptable for use as its
developers intended. Before deciding to accredit a model, MDA must
first verify that the models and simulations operate as the designers
conceptualized, and then validate that the models are sufficiently
accurate representations of real-world conditions for their intended
purposes.

[40] GTI-02 included models and simulations. 

[41] Our assessment of engagement sequence groups utilizes MDA's plans 
as of October 1, 2007. Commensurate with the new block structure, MDA 
devised a baseline in February 2008 for engagement sequence groups for 
Blocks 1.0, 2.0, 3.1, and 3.2. 

[42] [hyperlink, http://www.gao.gov/products/GAO-08-448]. 

[43] Section 223(b) of the National Defense Authorization Act for
Fiscal Year 2008, Pub. L. No. 110-181 (Jan. 28, 2008), specified a
revised structure for the missile defense budget to be submitted in the
President's budget no later than the first Monday in February. See 31
U.S.C. § 1105.

[44] Lead services have already been designated for Aegis BMD, the AN/ 
TPY-2 radar, THAAD, GMD, ABL, the European radar, Cobra Dane, and 
upgraded early warning radars. 

[45] Before a program can enter the system development and 
demonstration phase of the acquisition cycle, statute requires that 
certain information be developed. 10 U.S.C. § 2366b. In 2002, the 
Secretary of Defense deferred the application of some of DOD's 
acquisition processes to BMDS. Therefore, MDA has not yet entered 
System Development and Demonstration which would trigger the statutes 
requiring the development of information that the Defense Acquisition 
Board uses to inform its decisions. Most major defense acquisition 
programs are also required by statute to obtain an independent 
verification of program cost prior to beginning system development and 
demonstration, and/or production and deployment. 10 U.S.C. § 2434. 
Statute also requires an independent verification of a system's 
suitability for and effectiveness on the battlefield before a major 
defense acquisition program can proceed beyond low-rate initial 
production. 10 U.S.C. § 2399. 

[46] [hyperlink, http://www.gao.gov/products/GAO-08-448]. 

[47] [hyperlink, http://www.gao.gov/products/GAO-07-387]. 

[48] Congressional Research Service, Defense Procurement: Full Funding 
Policy--Background, Issues, and Options for Congress (Oct. 20, 2006). 

[49] [hyperlink, http://www.gao.gov/products/GAO-08-1113]. 

[50] Earned Value Management is a program management tool that 
integrates the technical, cost, and schedule parameters of a contract. 
During the planning phase, an integrated baseline is developed by time 
phasing budget resources for defined work. As work is performed and 
measured against the baseline, the corresponding budget value is 
"earned." Using this earned value metric, cost and schedule variances 
can be determined and analyzed. 

[51] The total contract cost at completion is based on budgeted cost at 
completion for each contract we assessed. The budget at completion 
represents the total planned value of the contract. 

[52] The current budgeted costs at completion are as of September 30,
2008. 

[53] MDA is developing the FTF to represent evolving threats of all 
ranges. MDA has narrowed its FTF development efforts, focusing on a 
single vehicle, the 72-inch LV-2 ground-launched target. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: