This is the accessible text file for GAO report number GAO-08-294 
entitled 'Best Practices: Increased Focus on Requirements and Oversight 
Needed to Improve DOD's Acquisition Environment and Weapon System 
Quality' which was released on February 1, 2008.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Committees: 

United States Government Accountability Office: 
GAO: 

February 2008: 

Best Practices: 

Increased Focus on Requirements and Oversight Needed to Improve DOD's 
Acquisition Environment and Weapon System Quality: 

GAO-08-294: 

GAO Highlights: 

Highlights of GAO-08-294, a report to congressional committees. 

Why GAO Did This Study: 

A Senate report related to the National Defense Authorization Act for 
Fiscal Year 2007 asked GAO to compare quality management practices used 
by the Department of Defense (DOD) and its contractors to those used by 
leading commercial companies and make suggestions for improvement. 
To do this, GAO (1) determined the impact of quality problems on 
selected weapon systems and prime contractor practices that contributed 
to the problems; (2) identified commercial practices that can be used 
to improve DOD weapon systems; (3) identified problems that DOD must 
overcome; and (4) identified recent DOD initiatives that could improve 
quality. GAO examined 11 DOD weapon systems with known quality problems 
and met with quality officials from DOD, defense prime contractors, and 
five leading commercial companies that produce complex products and/or 
are recognized for quality products. 

What GAO Found: 

Problems related to quality have resulted in major impacts to the 11 
DOD weapon systems GAO reviewed—billions in cost overruns, years-long 
delays, and decreased capabilities for the warfighter. For example, 
quality problems with the Expeditionary Fighting Vehicle program were 
so significant that DOD extended development 4 years at a cost of $750 
million. The F-22A fighter aircraft experienced cracks in the plane’s 
canopy that grounded the flight test aircraft, and initial operating 
capability for the Wideband Global SATCOM satellite was delayed 18 
months because a supplier installed some fasteners incorrectly. GAO’s 
analysis of 11 DOD weapon systems illustrates that defense contractors’ 
poor practices for systems engineering activities as well as 
manufacturing and supplier quality problems contributed to these 
outcomes. Reliance on immature designs, inadequate testing, defective 
parts, and inadequate manufacturing controls are some of the quality 
problems that GAO found. Senior prime contractor officials GAO met with 
generally agreed with GAO’s assessment of the causes of the quality 
problems. 

In contrast, leading commercial companies GAO contacted use more 
disciplined systems engineering, manufacturing, and supplier quality 
practices. For example, rather than wait to discover defects after the 
fact, Boeing Commercial Airplanes tries to design parts that can be 
assembled only one way. Effective use of many systems engineering 
practices has helped Space Systems/Loral, a satellite producer, improve 
overall quality, for example, by allowing the company to operate its 
satellites for more than 80 million consecutive hours in orbit with 
just one failure. Companies also put significant effort into validating 
product design and production processes to catch problems early on, 
when problems are less costly to fix. They conduct regular audits of 
their suppliers and hold them accountable for quality problems. 

DOD faces its own set of challenges—setting achievable requirements for 
systems development and providing effective oversight during the 
development process. In conducting systems development, DOD generally 
pays the allowable costs incurred for the contractor’s best efforts. 
These conditions contribute to an acquisition environment that gives 
contractors little incentive to build high-quality weapon systems. DOD, 
which typically uses cost-reimbursement contracts to develop weapon 
systems, assumes most of the risks and pays contractors to fix most of 
the problems. 

DOD has taken steps to improve its acquisition practices by 
experimenting with a new concept decision review practice, selecting 
different acquisition approaches according to expected fielding times, 
and establishing panels to review weapon system configuration changes 
that could adversely affect program cost and schedule. None of these 
initiatives focus exclusively on quality issues, and none specifically 
address problems with defense contractors’ practices. 

What GAO Recommends: 

GAO recommends that the Secretary of Defense take actions to set 
achievable requirements for new weapon system development, oversee and 
expand initiatives that could improve quality, and use data to assess 
contractor performance and weapon system quality. DOD partially agreed 
with the recommendations, stating that its current practices or planned 
actions are appropriate. We believe our recommendations remain valid 
and can improve weapons systems quality. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/products/GAO-08-294]. For more information, contact 
Michael Sullivan at (202) 512-4841 or sullivanm@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

DOD Weapon Systems Experience Quality Problems Due to Prime 
Contractors' Inconsistent Practices: 

Leading Commercial Companies Use Disciplined Quality Management 
Practices: 

Different Environments Create Different Incentives to Improve Quality: 

DOD Efforts to Improve Acquisition Outcomes: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Quality Problems for 11 DOD Weapon Systems: 

Appendix III: Comments from the Department of Defense: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Figures: 

Figure 1: Weapon System Quality Problems and Impact: 

Figure 2: Advanced SEAL Delivery System: 

Figure 3: Advanced Threat Infrared Countermeasure/Common Missile 
Warning System: 

Figure 4: Expeditionary Fighting Vehicle: 

Figure 5: F-22A: 

Figure 6: Global Hawk Unmanned Aircraft System: 

Figure 7: Joint Air-to-Surface Standoff Missile: 

Figure 8: LPD 17 Amphibious Transport Dock: 

Figure 9: MH-60S Fleet Combat Support Helicopter: 

Figure 10: Patriot Advanced Capability-3: 

Figure 11: V-22 Joint Services Advanced Vertical Lift Aircraft: 

Figure 12: Wideband Global SATCOM: 

Abbreviations: 

DOD: Department of Defense: 

RDT&E: research, development, test and evaluation: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

February 1, 2008: 

Congressional Committees: 

The Department of Defense (DOD) plans to invest about $1.5 trillion (in 
2007 dollars) in its current portfolio of major weapon systems. 
However, the cost of designing and developing these systems could 
continue to exceed estimates by billions of dollars if DOD continues to 
employ the same acquisition practices, including those for quality, as 
it has in the past. Excessive scrap, rework, and repair costs, as well 
as reliability problems, degrade overall quality and could ultimately 
have serious consequences for a weapon system's long-term support 
costs and affordability. 

Like DOD, commercial companies collectively invest trillions to develop 
their products. Fundamentally, they know they must meet customer 
expectations to ensure continued growth. Many leading companies follow 
a knowledge-based approach for product development and rely on proven 
practices to attain high-quality products, control costs, and make a 
profit. While commercial companies are not without flaws and can 
produce poor-quality products, the demands of the marketplace force 
them to place a high priority on quality. 

This report examines how DOD and its defense contractors can improve 
the quality of major weapon systems. A Senate report related to the 
John Warner National Defense Authorization Act for Fiscal Year 2007 
asked GAO to compare quality management practices used by DOD and its 
contractors to those used by leading commercial companies and make 
suggestions as to how DOD's practices could be improved. We (1) 
determined the impact of quality problems on selected DOD weapon 
systems and defense contractors' practices that contributed to the 
problems; (2) identified practices used by leading commercial companies 
that can be used to improve the quality of DOD weapon systems; (3) 
identified problems DOD faces in terms of improving quality; and (4) 
identified recent DOD initiatives that could improve quality. 

To do this, we considered quality activities that take place from the 
time requirements are established for a product until it is fielded. 
This includes virtually all key design and engineering elements during 
development, the transition to production, and production itself. We 
examined 11 DOD weapon systems with known deficiencies that are in 
later phases of development or production to determine the impact of 
quality problems and assessed the effectiveness of defense contractors' 
quality practices for these systems. The 11 weapon systems were chosen 
to demonstrate the types of problems DOD weapon systems experience and 
to help focus our discussions with leading commercial companies on 
aspects of development that caused DOD major quality problems. We also 
met with representatives from the Office of the Under Secretary of 
Defense (Acquisition, Technology and Logistics), each of DOD's military 
services, selected commands, the Defense Contract Management Agency, 
and six of DOD's largest prime contractors--BAE Systems, Boeing 
Integrated Defense Systems, General Dynamics, Lockheed Martin, Northrop 
Grumman, and Sikorsky Aircraft--to discuss quality practices used to 
build DOD weapon systems and to obtain related documentation.[Footnote 
1] These prime contractors are involved in a little over $1.1 trillion, 
or about 76 percent, of DOD's expected $1.5 trillion expenditure on 
weapon systems in its current portfolio. To identify leading commercial 
companies' quality practices, we interviewed and obtained documentation 
from quality management personnel at five companies: Boeing Commercial 
Airplanes; Cummins Inc., a manufacturer of diesel and natural gas- 
powered engines; Siemens Medical Solutions, a producer of ultrasound 
systems; Space Systems/Loral, a producer of satellite systems; and 
Kenworth, a manufacturer of heavy- and medium-duty trucks. Much of the 
information we obtained from 
these companies is anecdotal, due to the proprietary nature of the data 
that could affect their competitive standing. We also met with 
officials from American Airlines and Intelsat, a satellite 
communications company, to understand the commercial customers' role in 
acquiring high-quality products. 

We compared leading commercial company practices with those used by DOD 
and the prime contractors we reviewed to identify both potential areas 
for improvement and practices that could improve the quality of DOD 
weapon systems. We also reviewed recent DOD initiatives aimed at 
improving acquisitions to determine if they have the potential to 
improve weapon system quality. Appendix I includes additional details 
about our scope and methodology. We conducted this performance audit 
from September 2006 to December 2007 in accordance with generally 
accepted government auditing standards. Those standards require that we 
plan and perform the audit to obtain sufficient, appropriate evidence 
to provide a reasonable basis for our findings and conclusions based on 
our audit objectives. We believe that the evidence obtained provides a 
reasonable basis for our findings and conclusions based on our audit 
objectives. 

Results in Brief: 

Quality problems have caused cost overruns, schedule delays, and 
reduced weapon system availability on the 11 DOD weapon systems we 
reviewed. The Expeditionary Fighting Vehicle is a case in point. Just 
as it was scheduled to move into production, DOD extended the program's 
development by 4 years at an estimated cost of $750 million when the 
prime contractor could not meet interim reliability goals. In another 
example, the Air Force temporarily grounded the F-22A's flight test 
aircraft when its first-of-a-kind canopy suffered cracks near the 
mounting holes because of problems with the prime contractor's 
manufacturing practices. In addition, the Wideband Global SATCOM 
communications satellite's initial operating capability date was 
delayed by 18 months because a supplier had installed some fasteners 
incorrectly and 1,500 fasteners on each of the first three satellites 
had to be reinspected. We found many other problems on other programs 
we reviewed as well--a laser jammer that did not work as intended, 
peeling coating on ships, deficient welding, and nonconforming parts-- 
that added to DOD's costs and schedules. Prime contractors' poor 
practices related to systems engineering, manufacturing, and supplier 
quality contributed to these problems. Senior prime contractor quality 
officials generally agreed with our assessment of the causes of 
problems in the systems we reviewed. 

Like DOD prime contractors, leading commercial companies rely on many 
practices related to systems engineering, manufacturing, and supplier 
quality, but the companies we reviewed apply more discipline and more 
rigorous, institutionalized processes to ensure product quality. The 
companies set well-defined product requirements and performed 
appropriate testing, which are critical systems engineering practices. 
For example, recent satellite components designed and developed by 
Space Systems/Loral, a satellite producer, have over 80 million hours 
of in-orbit experience with only one failure, a greater than 99 percent 
availability rate. Space Systems/Loral accomplished this by focusing on 
reliability requirements during development and using reliability 
assessments and extensive testing to identify weak links before 
production started. Likewise, leading commercial companies focus on 
getting manufacturing processes in control prior to production. Cummins 
builds prototype engines to validate its manufacturing processes and 
Kenworth uses electronic versions of installation work processes to 
ensure that there is configuration control over the installation 
process and reduce rework. The companies also conducted regular audits 
of their suppliers and tracked supplier performance related to parts 
delivery and quality. For example, Boeing Commercial Airplanes requires 
its highest-rated suppliers to meet a 99 percent rate for parts 
conformance. For these companies, using disciplined processes and 
continuous improvement was essential to producing high-quality products 
and sustaining their competitive position in the commercial 
marketplace. 

DOD's acquisition environment does not provide incentives to prime 
contractors to use best practices to efficiently build high-quality 
weapon systems. The department faces challenges setting achievable 
requirements for systems development and providing effective oversight 
during the development process. In conducting systems development, DOD 
generally pays the allowable costs incurred for the contractor's best 
efforts and accepts most of the financial risks associated with 
development because of technical uncertainties. However, DOD and its 
contractors often enter into development contracts before requirements 
have been analyzed with disciplined systems engineering practices. This 
introduces significant cost and schedule risk to a development program, 
risk that is not borne by the prime contractor, but by DOD. Contractors 
have little incentive to utilize the best systems engineering, 
manufacturing, and supplier quality practices to control costs. DOD 
also has limited oversight of prime contractor activities and does not 
aggregate quality data in a manner that helps decision makers assess or 
identify systemic quality problems. In contrast, commercial companies 
we visited operate in an environment that requires their own investment 
of significant funds to develop new products before they are able to 
sell them and recoup that investment. This high-cost environment 
creates incentives for reasonable requirements that have been analyzed 
and proven achievable, the use of best practices, and continuous 
improvement in systems engineering, manufacturing, and supplier quality 
activities. 

In response to the John Warner National Defense Authorization Act for 
Fiscal Year 2007, the Under Secretary of Defense for Acquisition, 
Technology and Logistics has identified several initiatives DOD 
recently started that might eventually help improve weapon system 
quality. Some of its new initiatives address problems we noted in this 
report, such as placing greater emphasis on setting achievable 
requirements before starting development. However, DOD has not taken 
actions that would address problems related to prime contractor systems 
engineering, manufacturing, and supplier quality practices we found in 
our review of the 11 weapon systems. 

We are making recommendations that the Secretary of Defense improve 
weapons system quality by setting achievable requirements at the start 
of weapon system development, overseeing and expanding initiatives that 
could improve quality, and using data to assess prime contractor 
performance and weapon system quality. DOD partially agreed with each 
of the recommendations, stating that it believes the current practices 
or actions it plans to take are appropriate. In response to DOD's 
comments, we added more detail to one recommendation and acknowledged 
that the department is taking steps that could improve weapons system 
quality. Nevertheless, we believe our recommendations remain valid for 
improving weapons system quality. 

Background: 

In general, a quality product is one that is delivered on time, 
performs as expected, and can be depended on to perform when needed, at 
an affordable cost. This applies whether the customer is an individual 
purchasing a simple consumer good, such as a television, a hospital 
purchasing medical imaging equipment to help doctors treat cancer 
patients, or DOD purchasing sophisticated weapons for its warfighters 
to use on the battlefield. 

For about 3 decades, DOD based its quality requirements on a military 
standard known as MIL-Q-9858A, and its quality assurance practices were 
oriented toward discovering defects through inspections. In 1994, the 
Secretary of Defense announced that commercial quality standards should 
replace MIL-Q-9858A. The intent was to remove military-unique 
requirements that could present barriers to DOD in accessing the 
commercial supplier base. Currently, responsibilities for quality 
policy and oversight fall under the Systems and Software Engineering 
organization, within the Office of the Secretary of Defense. 

Over the past 20 years, commercial companies have had to dramatically 
improve quality in response to increased competition. Many companies 
moved from inspection-oriented quality management practices--where 
problems are identified and corrected after a product is produced--to a 
process in which quality is designed into a product and manufacturing 
processes are brought under statistical control to reduce defects. Many 
companies have also adopted commercial quality standards, such as ISO 
9001.[Footnote 2] This standard was developed by the International 
Organization for Standardization, a non-governmental organization 
established in 1947 to facilitate the international coordination and 
unification of industrial standards. Similar to DOD's MIL-Q-9858A, ISO 
9001 includes requirements for controlling a product's design and 
development, and production, as well as processes for oversight and 
improvement. Some industries, such as the automotive and aerospace 
industries, also have standards specific to their sector based on 
ISO 9001.[Footnote 3] Because supplier parts account for a substantial 
amount of the material value of many companies' products, companies may 
require their suppliers to adopt the same standards. 

In practice, DOD and its prime contractors both participate in 
activities that contribute to weapon system quality. DOD plays a large 
role in quality when it sets key performance parameters, which are the 
most important requirements DOD wants prime contractors to focus on 
during development. For example, if reliability is one of those key 
performance parameters, then prime contractors are expected to focus on 
it during weapon system design. Prime contractors employ quality 
assurance specialists and engineers to assess the quality and 
reliability of parts they receive from suppliers, as well as the 
overall weapon system. DOD has its own quality specialists within the 
Defense Contract Management Agency and the military services, such as 
the Navy's Supervisor of Shipbuilding organization. DOD's quality 
specialists oversee prime contractors' design, manufacturing, and 
supplier management activities; oversee selected supplier manufacturing 
activities; and conduct final product inspections prior to acceptance. 

GAO previously reported on DOD quality practices in 1996.[Footnote 4] 
At that time, we reported that numerous weapon system programs had 
historically had quality problems in production because designs were 
incomplete. The B-2 bomber program and the C-17 Airlifter program, for 
example, encountered major manufacturing problems because they went 
forward with unstable designs and relied on inspections to find defects 
once in production. Since 1996, GAO has recommended several times that 
DOD adopt a knowledge-based acquisition approach used by leading 
commercial companies to develop its weapon systems. Under this 
approach, high levels of knowledge are demonstrated at critical 
decision points in the product development process, which results in 
successful product development outcomes. Systems engineering is a key 
practice that companies use to build quality into new products. 
Companies translate customers' broad requirements into detailed 
requirements and designs, including identifying requisite 
technological, software, engineering, and production capabilities. 
Systems engineering also involves performing verification activities, 
including testing, to confirm that the design satisfies requirements. 
Products developed under a knowledge-based approach stand a significantly 
better chance of being delivered on time, within budget, and with the 
promised capabilities. Related GAO products, listed at the back of this 
report, provide detailed information about the knowledge-based 
approach. 

DOD Weapon Systems Experience Quality Problems Due to Prime 
Contractors' Inconsistent Practices: 

Although major defense contractors have adopted commercial quality 
standards in recent years, quality and reliability problems persist in 
DOD weapon systems. On the 11 weapon systems GAO reviewed, these 
problems have resulted in billions of dollars in cost overruns, years 
of schedule delays, and reduced weapon system availability. Prime 
contractors' poor systems engineering practices related to requirements 
analysis, design, and testing were key contributors to these quality 
problems. We also found problems with manufacturing and supplier 
quality that contributed to problems with DOD weapon systems. Senior 
officials from the prime contractor companies we contacted said that 
they agreed with our assessment of the causes of the quality problems 
of weapon system programs we reviewed and that disciplined processes 
help improve overall quality. 

Case Studies Illustrate Impact of DOD Weapon System Quality Problems: 

Quality problems caused significant cost increases and/or schedule delays in the 
11 weapon systems we reviewed. Figure 1 shows the types of problems we 
found and the resulting impacts. Appendix II contains detailed 
information about each of the programs' quality problems. Quality 
problems occurred despite the fact that each of the prime contractors 
for these programs is certified to commercial quality standards and 
most provided us with quality plans that address systems engineering 
activities such as design, as well as manufacturing, and supplier 
quality. However, quality problems in these areas point to a lack of 
discipline or an inconsistency in how prime contractors follow through 
on their quality plans and processes. 

Figure 1: Weapon System Quality Problems and Impact: 

[See PDF for image] 

This figure is a table illustrating Weapon System Quality Problems and 
Impact. For each system, there is an accompanying photograph. The 
following data is depicted: 

System: Advanced SEAL Delivery System[A]; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Empty]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $87; 
Impact of quality problem, Schedule: Program halted. 

System: Advanced Threat Infrared Countermeasure/Common Missile Warning 
System; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Empty]; 
Impact of quality problem, Cost (in millions of dollars): $117; 
Impact of quality problem, Schedule: 5-year delay. 

System: Expeditionary Fighting Vehicle; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Empty]; 
Source of quality problem, Supplier quality: [Empty]; 
Impact of quality problem, Cost (in millions of dollars): $750; 
Impact of quality problem, Schedule: 4-year extension to system 
development. 

System: F-22A; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $400; 
Impact of quality problem, Schedule: No schedule impact to program. 

System: Global Hawk[A]; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Empty]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $239; 
Impact of quality problem, Schedule: 4-month production slip for sensor 
suite. 

System: Joint Air-to-Surface Standoff Missile; 
Source of quality problem, Systems engineering: [Empty]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $39; 
Impact of quality problem, Schedule: Program deferred. 

System: LPD 17 Amphibious Transport Dock[A]; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $846; 
Impact of quality problem, Schedule: 3-year delay. 

System: MH-60S Fleet Combat Support Helicopter; 
Source of quality problem, Systems engineering: [Empty]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Empty]; 
Impact of quality problem, Cost (in millions of dollars): No cost 
impact to program; 
Impact of quality problem, Schedule: 6-month production slip. 

System: Patriot Advanced Capability-3; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $26; 
Impact of quality problem, Schedule: 6-month delay. 

System: V-22 Joint Services Advanced Vertical Lift Aircraft; 
Source of quality problem, Systems engineering: [Check]; 
Source of quality problem, Manufacturing: [Empty]; 
Source of quality problem, Supplier quality: [Empty]; 
Impact of quality problem, Cost (in millions of dollars): $165; 
Impact of quality problem, Schedule: Flight operations halted for 17 
months. 

System: Wideband Global SATCOM; 
Source of quality problem, Systems engineering: [Empty]; 
Source of quality problem, Manufacturing: [Check]; 
Source of quality problem, Supplier quality: [Check]; 
Impact of quality problem, Cost (in millions of dollars): $10; 
Impact of quality problem, Schedule: 18-month delay for initial 
operating capability. 

Source: GAO analysis of DOD and prime contractor data. 

[A] Cost and schedule figures are not solely attributable to quality 
problems. 

[End of figure] 

Lack of Systems Engineering Discipline Early in Programs Leads to 
Significant Quality Problems Later: 

GAO's past work has identified systems engineering as a key practice 
for ensuring quality and achieving successful acquisition 
outcomes.[Footnote 5] Systems engineering is a sequence of activities 
that translates customer needs into specific capabilities and 
ultimately into a preferred design. These activities include 
requirements analysis, design, and testing in order to ensure that the 
product's requirements are achievable and designable given available 
resources, such as technologies. In several of the DOD weapon programs 
we reviewed, poor systems engineering practices contributed to quality 
problems. Examples of systems engineering problems can be found on the 
Expeditionary Fighting Vehicle, Advanced Threat Infrared 
Countermeasure/Common Missile Warning System, and Joint Air-to-Surface 
Standoff Missile programs. 

Design problems have hampered the development of the Marine Corps' 
Expeditionary Fighting Vehicle. The system, built by General Dynamics, 
is an amphibious vehicle designed to transport troops from ships 
offshore to land at higher speeds and from farther distances than its 
predecessor. According to program officials, prime contractor design 
and engineering changes were not always passed to suppliers, resulting 
in supplier parts not fitting into assemblies because they were 
produced using earlier designs. Systems engineering problems have also 
contributed to poor vehicle reliability, even though reliability was a 
key performance parameter. Consequently, the prime contractor was only 
able to demonstrate 7.7 hours between mission failures, which was well 
short of the 17 hours it needed to demonstrate in pre-production 
testing. Subsequently, the vehicle's development phase has been 
extended. Program officials estimate that this extension, which will 
primarily focus on improving reliability, will last an additional 4 
years at an estimated cost of $750 million. 

For several other weapon systems, inadequate testing was another 
systems engineering problem. The Army's Advanced Threat Infrared 
Countermeasure/Common Missile Warning System program, developed by BAE 
Systems, is designed to defend U.S. aircraft from advanced infrared- 
guided missiles. Reliability problems related to the Advanced Threat 
Infrared Countermeasure jam head forced the Army to initiate a major 
redesign of the jam head in fiscal year 2006, and fielding of the 
subsystem has been delayed until fiscal year 2010. According to a prime 
contractor official, the reliability problems were caused, at least in 
part, by inadequate reliability testing. Likewise, the Joint Air-to- 
Surface Standoff Missile program, developed by Lockheed Martin, has 
experienced a number of flight test failures that have underscored 
product reliability as a significant problem. Ground testing, which 
prime contractor officials said could have identified most of the 
failure modes observed in flight testing, did not occur initially. 
Prime contractor officials indicated that ground testing was not 
considered necessary because the program was a spin-off of a previous 
missile program and there was an urgent need for the new missile. As a 
result of the test failures, the program has initiated a reliability 
improvement effort that includes ground and flight testing. A program 
official reported that the cost of reliability improvements for fiscal 
years 2006 and 2007 totaled $39.4 million. 

Manufacturing Problems Are Often Caused by Lack of Process Controls: 

GAO's past work addresses the importance of capturing manufacturing 
knowledge in a timely manner as a means for ensuring that an 
organization can produce a product within quality targets.[Footnote 6] 
Prime contractor activities to capture manufacturing knowledge should 
include identifying critical characteristics of the product's design 
and then the critical manufacturing processes to achieve these 
characteristics. Once done, those processes should be proven to be in 
control prior to production. This would include making work 
instructions available, preventing and removing foreign object debris 
in the production process, and establishing criteria for workmanship. 
However, prime contractors' lack of controlled manufacturing processes 
caused quality problems on several DOD weapon programs, including the F-
22A and LPD 17 programs. 

The F-22A, a fighter aircraft with air-to-ground attack capability 
being built by Lockheed Martin, entered production with less than 50 
percent of the critical manufacturing processes in control. In 2000, 
citing budgetary constraints and specific hardware quality problems 
that demanded attention, the Air Force abandoned its efforts to get 
manufacturing processes in control prior to the start of production. 
Subsequently, the contractor experienced a scrap, rework, and repair 
rate of about 30 percent on early-production aircraft. The contractor 
also experienced major problems with the aircraft canopy. According to 
program officials, the aircraft uses a first-of-a-kind canopy, with an 
external metallic stealth layer. The contractor did not bring its 
manufacturing processes in control and the canopy cracked near the 
mounting holes. This problem was discovered in March 2000 and 
temporarily grounded the flight test aircraft. In addition, in 2006 a 
pilot was trapped in an F-22A for 5 hours when a defective actuator 
prevented him from opening the canopy. According to the Air Force, when 
production began in 2001, the prime contractor should have been able to 
demonstrate that the F-22A could achieve a mean time between maintenance 
of almost 2 flying hours. However, at that time, the contractor could 
demonstrate only about 40 minutes. Six years later, the contractor had 
increased the mean time between maintenance to 97 minutes, still short 
of the Air Force's current 3-hour requirement. The program now has 
budgeted an additional $400 
million to improve the aircraft's reliability and maintainability. 

Northrop Grumman, the prime contractor for the LPD 17, the first ship 
of a new class of amphibious transport dock ships, delivered the ship 
to the Navy in 2005 with many quality problems resulting from poor 
manufacturing practices. For example, the program experienced problems 
with non-skid coating applications because the company did not keep the 
ship's surface free from dirt and debris when applying the coating, which 
caused it to peel. As of late 2007, the problem was not fixed. In 
addition, the ship encountered problems with faulty welds on piping 
used in some of the ship's hydraulic applications. According to the 
prime contractor, it could not verify that the welds had been done 
properly. Correcting this problem required additional rework and 
reinspection of all the welds. Had the problem gone undiscovered and a 
weld failed, the crew and the ship could 
have been endangered. These problems, as well as many others, 
contributed to a 3-year delay and cost increase of $846 million in 
delivering the ship to the Navy. In June 2007, the Secretary of the 
Navy sent a letter to the Chairman of the Board of Northrop Grumman 
expressing his concerns about the contractor's ability to construct and 
deliver ships that meet Navy quality standards and to meet agreed-to 
cost and schedule commitments. 

Supplier Quality Problems Can Result in Higher Product Cost: 

Management of supplier quality is another problem area for DOD weapon 
systems. Supplier quality is particularly important because more than 
half of the cost of a weapon system can be attributed to material 
received by the prime contractor from its supplier base. While DOD 
prime contractors told us that they manage and control the quality of 
parts and material they receive from their suppliers with the help of 
performance reviews and process audits, we found supplier quality 
problems on seven of the weapon systems we reviewed. Two examples are 
the Wideband Global SATCOM and Patriot Advanced Capability-3 programs. 

Boeing Integrated Defense Systems is the prime contractor for the Air 
Force and Army's Wideband Global SATCOM communications satellite. 
Boeing Integrated Defense Systems discovered that one of its suppliers 
had installed certain fasteners incorrectly. As a result, 1,500 
fasteners on each of the first three satellites had to be inspected or 
tested, and 148 fasteners on the first satellite had to be reworked. 
The DOD program office reported that the resulting 15-month schedule 
slip would add rework and workforce costs to the program and delay 
initial operating capability by 18 months. A prime contractor official 
estimated the cost to fix the problem was about $10 million. 

In 2006, a supplier for the Patriot Advanced Capability-3 program, a 
long-range system that provides air and missile defense for ground 
combat forces, accepted nonconforming hardware for a component of the 
missile's seeker. The seeker contractor had to re-inspect components 
and some were returned for rework. As a result of this and other 
problems involving poor workmanship and inadequate manufacturing 
controls, the supplier facility was shut down for 7 months, delaying 
delivery of about 100 missiles. 

Prime Contractors' Observations on Quality: 

We met with senior quality officials at the prime contractor companies 
we included in this review to discuss the problems we found. For the 
most part, they agreed with our assessment and acknowledged that the discipline 
with which a company implements its processes is a key contributor to 
quality outcomes. The officials discussed the importance of quality and 
how they are attempting to improve quality across their companies. This 
includes the use of Six Sigma, a tool for measuring defects and 
improving quality, as well as independent program reviews and improving 
design processes.[Footnote 7] 

The senior quality officials also identified factors they believe 
affect the quality of DOD weapon systems, including insufficient 
attention to reliability by DOD during development and the prime 
contractors' lack of understanding of weapon system requirements, 
including those for testing. 

Leading Commercial Companies Use Disciplined Quality Management 
Practices: 

While there are similarities between the quality management practices 
of DOD prime contractors and leading commercial companies in our 
review, the discipline with which leading companies implement their 
practices contributes to the high quality of their products. According 
to company officials we contacted, reliability is a paramount concern 
for them because their customers demand products that work, and the 
companies must develop and produce high-quality products to sustain 
their competitive position in the marketplace. Leading commercial 
companies use disciplined, well-defined, and institutionalized 
practices for (1) systems engineering to ensure that a product's 
requirements are achievable with available resources, such as 
technologies; (2) manufacturing to ensure that a product, once 
designed, can be produced consistently with high quality and low 
variability; and (3) supplier quality to ensure that their suppliers 
are capable of delivering high-quality parts. These practices, which 
were part of the companies' larger product development processes, and 
other tools such as Six Sigma, provided an important foundation for 
producing quality products and continually improving performance. 

Adherence to Systems Engineering Practices Leads to Clear Requirements 
and Reliable Designs: 

Several of the companies we met with discussed how they use systems 
engineering as a key practice for achieving quality outcomes. As part 
of Siemens Medical Solutions' standard product development process, the 
company validates that product requirements are sufficiently clear, 
precise, measurable, and comprehensive. The company ensures that 
requirements address quality, including requirements for reliability 
and readiness, before committing to developing and building a new product. 
Officials with Boeing Commercial Airplanes say they have shifted their 
view of quality into a more proactive approach, which includes a focus 
on "mistake-proofing" designs so that they can be assembled only one 
way. To help assess the producibility of critical parts designs, the 
company has also developed a tool that rates different attributes of 
the design, including clarity of engineering requirements, consequences 
of defects on performance or manufacturability, and verification 
complexity. Company officials say they use the tool's ratings to modify 
designs to ensure that parts will be less prone to manufacturing and 
assembly error, and that its use has resulted in lower costs for scrap, 
rework, and repair and fewer quality problems. 

Space Systems/Loral also relies on well-defined and disciplined 
processes to develop and produce satellites. Because the company's 
customers expect satellites to perform for up to 15 years, product 
reliability is paramount and company officials say that using systems 
engineering to design reliability into a satellite is essential. As 
part of its systems engineering activities, the company performs 
reliability assessments to verify that satellite components and 
subsystems will meet reliability requirements and to identify potential 
hardware problems early in the design cycle. Space Systems/Loral 
officials also discussed testing and its importance to developing 
products. For significant new product developments, Space Systems/Loral 
employs highly accelerated life testing to find weak links in a design 
and correct them to make the product more robust before going into 
production. As a result of the company's disciplined quality management 
practices, new satellite components--such as lithium-ion batteries, 
stationary plasma thrusters, and a satellite control system--have over 
80 million hours of operation in orbit with only one component failure, 
according to company data. 

Effective Manufacturing Process Controls Reduce Variability and 
Defects: 

Several company officials discussed the importance of having controlled 
manufacturing processes, and described several approaches to reduce 
variability and the likelihood of defects. These approaches greatly 
increase the likelihood that a product, once designed, can be produced 
consistently and with high quality and low variability. In this way, 
they reduce waste and increase a product's reliability in the field. 

Early in its product development process, Cummins, a manufacturer of 
diesel and natural gas-powered engines, establishes a capability growth 
plan for manufacturing processes. This increases the probability that 
the manufacturing process will consistently produce parts that meet 
specifications. Prior to beginning production, Cummins completes what 
it calls "alpha" and "beta" builds, which are prototypes intended to 
validate the product's design and production processes. Cummins 
officials noted that these activities allow them to catch problems 
earlier in development, when problems are less costly to fix. 

Officials from Kenworth, a manufacturer of heavy- and medium-duty 
trucks, described several initiatives the company uses to improve manufacturing 
process controls. For example, the company has a new electronic system 
for process documents. Workers on the manufacturing floor used to rely 
on paper installation instructions and sometimes used outdated 
instructions. Kenworth officials say that converting to an electronic 
system ensures that all workers use the most current process 
configuration and reduces rework. For a selected number of processes, 
Kenworth has also developed documents that include pictures as well as 
engineering specifications to ensure that workers follow the correct 
processes, and performs audits to assess whether workers are properly 
trained and know where to go if they have questions regarding the 
process. 

Companies Hold Suppliers Accountable to Deliver High Quality Parts for 
the Product: 

At several of the companies we visited, officials reported that 
supplier parts accounted for a substantial amount of the overall 
product value. Companies we met with systematically manage and oversee 
their supply chain through such activities as regular supplier audits 
and performance evaluations of quality and delivery, among other 
things. Several officials noted that their supplier oversight focuses 
on first-tier suppliers, with limited interaction and oversight of 
lower-tier suppliers. However, Kenworth officials said they hold their 
first-tier suppliers accountable for quality problems attributable to 
lower-tier suppliers. 

Leading commercial companies we met with set high expectations for 
supplier quality. Boeing Commercial Airplanes categorizes its suppliers 
by rates of defective parts per million. To achieve the highest rating 
level, a supplier must exhibit more than 99 percent part conformance, 
and company officials said they have been raising their supplier 
quality expectations over time. The organization has taken steps to 
reduce the number of direct suppliers and retain higher-performing 
suppliers in the supply base. Similarly, suppliers of major components 
for Siemens Medical Solutions' ultrasound systems must provide 
conforming products 98 percent of the time, and the company will levy 
financial penalties against suppliers that do not meet this standard. 
Other companies also financially penalized suppliers for providing 
nonconforming parts. 

Disciplined Processes and Continuous Quality Improvement Are a Focus at 
Several Companies: 

Several company officials discussed how a focus on improving product 
development processes and product quality served as the foundation for 
their systems engineering, manufacturing, and supplier quality 
practices. Officials with Space Systems/Loral discussed how they 
adopted a more disciplined product development process following 
quality problems in the 1990s with some of its satellites. This 
included creating companywide product development processes, adopting a 
formal program that institutionalized an iterative development process, 
and implementing strict documentation requirements and pass/fail 
criteria. The company also established an oversight organization to 
ensure that processes are followed. As a result, the first-year failure 
rate for Space Systems/Loral's satellites decreased by approximately 50 
percent from 2000 through 2006. Likewise, Cummins officials told us 
that quality problems following the initial release of their ISX engine 
were a major factor in the implementation of their current product 
development process. This includes review gates to ensure process 
compliance and management reviews that use knowledge-based approaches 
for evaluating projects. 

Cummins and Kenworth also use tools such as Six Sigma to define, 
measure, analyze, control, and continually improve their processes. For 
example, Cummins applies Six Sigma to its technology development, 
design, and production activities. The company also expects its 
critical suppliers to implement Six Sigma programs to improve quality 
and customer satisfaction. As a result of implementing initiatives such 
as Six Sigma, Cummins officials reported that the company's warranty 
costs have declined substantially in the last several years. Kenworth 
also uses Six Sigma to drive efficiencies into the organization's work 
processes, particularly in the design phase of new product development 
and in controlling manufacturing processes. Kenworth requires its first-
tier suppliers to participate in a Six Sigma program. Company officials 
estimated that Six Sigma projects saved its Chillicothe, Ohio, facility 
several million dollars in 2006. 

In addition, each of the commercial companies we met with collected and 
used data to measure and evaluate their processes and products. This 
helps them gauge the quality of their products and identify areas that 
need improvement. For example, Cummins tracks warranty costs as a 
measure of product quality, while Siemens Medical Solutions measures 
manufacturing process yields for its ultrasound systems. 

Different Environments Create Different Incentives to Improve Quality: 

The quality problems in our case studies and the practices that relate 
to them--whether systems engineering, manufacturing, or supplier 
quality practices--are strongly influenced by, and often the result of, 
larger environmental factors. DOD's acquisition environment is not 
wholly conducive to incentivizing prime contractors to efficiently 
build high-quality weapon systems--ones that perform as expected, can 
be depended on to perform when needed, and are delivered on time and 
within cost estimates. During systems development, DOD usually pays for 
a contractor's best efforts, which can include efforts to achieve 
overly optimistic requirements. In such an environment, the pursuit of 
overly optimistic requirements, combined with a lack of oversight of 
the development process, contributes to quality problems. In 
contrast, commercial companies we visited operate in an environment 
that requires their own investment of significant funds to develop new 
products before they are able to sell them and recoup that investment. 
This high-cost environment creates incentives for reasonable 
requirements and best practices, as well as continuous improvement in 
systems engineering, manufacturing, and supplier quality. 

DOD's Environment: 

DOD uses cost reimbursement contracts with prime contractors for the 
development of its weapon systems. In this type of contract 
arrangement, DOD accepts most of the financial risks associated with 
development because of technical uncertainties. Because DOD often sets 
overly optimistic requirements for new weapon systems that require new 
and unproven technologies, development cycles can take up to 15 years. 
The financial risk tied to achieving these requirements during 
development is not borne by the contractor in this environment, but by 
the government. This environment provides little incentive for 
contractors to utilize the best systems engineering, manufacturing, and 
supplier quality practices discussed earlier in this report to ensure 
manageable requirements, stable designs, and controlled manufacturing 
processes to hold costs down. Finally, DOD's quality organizations, 
which collect information about prime contractors' quality systems and 
problems, provide limited oversight of prime contractor activities and 
do not aggregate quality data in a manner that helps decision makers 
assess or identify systemic quality problems. 

Overly Optimistic Requirements Hamper Good Quality Outcomes: 

DOD's ability to obtain a high-quality weapon system is adversely 
impacted by an environment where it both (1) assumes most of the 
financial risks associated with technical or cost uncertainties for the 
systems development and (2) sets requirements without adequate systems 
engineering knowledge.[Footnote 8] Without requirements that have been 
thoroughly analyzed for feasibility, development costs are impossible 
to estimate and are likely to grow out of control. 

DOD typically assumes most of the financial risk associated with a new 
weapon system's development by establishing cost reimbursement 
contracts with prime contractors. In essence, this means that prime 
contractors are asked to give their best effort to complete the 
contract and DOD pays for allowable costs, which often include fixing 
quality problems experienced as part of the effort. As stated earlier, 
these problems can cost millions of dollars to fix. For example, DOD as 
the customer for the Expeditionary Fighting Vehicle signed a cost 
reimbursement contract with the prime contractor, General Dynamics, to 
develop a new weapon system that would meet performance and reliability 
requirements that had not yet been adequately informed by systems 
engineering analysis. Once General Dynamics performed a detailed 
requirements analysis, it informed DOD that more resources would be 
needed to meet the key reliability requirement established earlier. DOD 
decided not to invest the additional money at that time. However, when 
the vehicle was unable to meet its reliability goal prior to moving 
into production, DOD eventually decided to invest an additional $750 
million into its development program to meet the reliability goal. 

Often DOD enters into contracts with prime contractors before 
requirements for the weapon systems have been properly analyzed. For 
example, in March 2007 we reported that only 16 percent of the 62 DOD 
weapon system programs we reviewed had mature technologies to meet 
requirements at the start of development.[Footnote 9] The prime 
contractors on these programs ignored best systems engineering 
practices and relied on immature technologies that carry significant 
unknowns about whether they are ready for integration into a product. 
The situation is exacerbated when DOD adds or changes requirements to 
reflect evolving threats. Prime contractors must then spend time and 
resources redesigning the weapon system, flowing down the design 
changes to its suppliers, and developing new manufacturing plans. In 
some cases, special manufacturing tools the prime contractor thought it 
was going to use might have to be scrapped and new tooling procured. 

Lack of detailed requirements analysis, for example, caused significant 
problems for the Advanced Threat Infrared Countermeasure/Common Missile 
Warning System program. Prior to 1995, the services managed portions of 
the program separately. Then, in 1995, DOD combined the efforts and 
quickly put a developer on contract. This decision resulted in 
significant requirements growth and presented major design and 
manufacturing difficulties for the prime contractor. It took over a 
year to determine that the tactical fixed-wing aircraft requirements 
were incorrect. The extent of the shortfall, however, did not become 
evident until the critical design review, when numerous changes were 
required in the contract statement of work. More than 4 years after the 
system's critical design review, the sensor units were built in 
prototype shops, with engineers only then trying to identify critical 
manufacturing processes. Further, sensor manufacturing was slowed by 
significant rework, and at one point was halted while the contractor 
addressed configuration control problems. The Navy and Air Force, which 
required the system for fixed-wing aircraft, dropped out of the program 
in 2000 and 2001, respectively. 

Ultimately, quality is defined in large part by reliability. But, in 
DOD's environment, reliability is not usually emphasized when a program 
begins, which forces the department to fund more costly redesign or 
retrofit activities when reliability problems surface later in 
development or after a system is fielded.[Footnote 10] The F-22A 
program illustrates this point. Because DOD as the customer assumed 
most of the financial risk on the program, it made the decision that 
system development resources primarily should be focused on 
requirements other than reliability, leading to costly quality 
problems. After 7 years in production, the Air Force had to budget an 
additional unplanned $400 million for the F-22A to address numerous 
quality problems and help the system achieve its baseline reliability 
requirements. 

Oversight of Development Programs Could Be Strengthened: 

DOD oversight of prime contractor activities varies and has decreased 
as its quality assurance workforce has decreased. Weapon system 
progress reviews at key decision points are a primary means for DOD to 
oversee prime contractor performance in building high-quality systems, 
but they are not used consistently across programs. The purpose of the 
reviews is to determine if the program has demonstrated sufficient 
progress to advance to the next stage of product development or to 
enter production. The department has developed decision criteria for 
moving through each phase of development and production, and DOD's 
acquisition executive has the authority to prevent programs from 
progressing to later stages of development if requisite knowledge has 
not been attained. Unfortunately, most programs are allowed to advance 
without demonstrating sufficient knowledge. For example, in our recent 
review of 62 DOD weapon systems, we found that only 27 percent of the 
programs demonstrated that they had attained a stable design at the 
completion of the design phase.[Footnote 11] 

In addition, as a result of downsizing efforts over the past 15 years, 
DOD's oversight of prime contractor and major supplier manufacturing 
processes varies from system to system. DOD quality officials stated 
that they have had to scale back on the amount of oversight they can 
provide, focusing only on the specific areas that the weapon system 
program managers ask them to review. It is unclear what impact the 
reduction in quality assurance specialists and the reduction of 
oversight has had on the department's ability to influence quality 
outcomes. However, in the case of the Advanced SEAL Delivery System, a 
lapse in effective management oversight exercised by both the 
government and contractor contributed to very late discovery of costly 
quality problems. DOD quality organizations such as the Defense 
Contract Management Agency do capture a significant amount of 
information electronically about the quality of DOD weapon systems 
through audits and corrective action reports. They collect quality data 
on a program-by-program basis and share information about certain types 
of deficiencies and nonconforming parts they find. While the 
organizations are looking for additional opportunities to share 
information, they do not currently aggregate and consolidate the 
information in a manner that would allow the department to determine 
the overall quality of products it receives from prime contractors or 
to identify quality-related systemic problems or trends with its prime 
contractors. 

Commercial Environment: 

Commercial companies must develop and deliver high-quality, highly 
capable products to markets on time or suffer financial loss. The 
companies face competition and, therefore, their customers can choose 
someone else's products when they are not satisfied. It is this 
environment that incentivizes manufacturers to implement and use best 
practices to improve quality and reduce cost while delivering on time. 
Commercial customers must set achievable product requirements for their 
manufacturers that they know will result in a reliable, high-quality, 
and desirable product that can be delivered on time. Manufacturers then 
get their key manufacturing processes in control to reduce 
inconsistencies in the product. Commercial customers understand the 
need to monitor and track manufacturer and supplier quality performance 
over time to determine which companies they want to do business with in 
the future or to identify problem areas that need to be corrected. 

Commercial customers we visited--American Airlines and Intelsat-- 
expect to operate their products for 30 and 15 years, respectively. The 
companies focus a great deal of attention on setting performance and 
reliability goals that manufacturers like Boeing Commercial Airplanes 
and Space Systems/Loral must meet in order for them to purchase their 
products. This provides a strong, direct incentive for manufacturers 
and their customers to ensure that requirements are clear and 
achievable with available resources, including mature technologies, 
before the manufacturer will invest in a product's development. For 
example, Intelsat expects its satellites to be available at least 
99.995 percent of the time. To meet this goal, Intelsat expects its 
manufacturers to use mature technologies and parts where the 
reliability is already known. Several reasons drive this approach. The 
most obvious one is that there is no way to fix mechanical problems 
once a satellite has been launched. Another reason is that the company 
must credit television networks, telephone companies, or cable 
companies for any loss of service. The company also insures its 
satellites for launch plus the first year of in-orbit service. Having a 
proven record of in-orbit performance and using reliable, flight-proven 
technology are two important factors that help the company get 
favorable terms from insurance underwriters. Finally, the company does 
not want to spend a large sum of money on a replacement satellite 
before the end of the original satellite's design life, since doing so 
would negatively affect the company's financial performance. 
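
To put this goal in perspective (the following arithmetic is ours, not 
a figure reported by Intelsat), an availability requirement of 99.995 
percent allows only about 26 minutes of outage per satellite per year: 

\[
(1 - 0.99995) \times 365.25 \times 24 \times 60 \approx 26 \ \text{minutes of allowable outage per year.}
\]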

In the commercial environment, manufacturers are motivated to develop 
and provide high-quality products because their profit is tied to 
customer expectations and satisfaction. For example, American Airlines 
makes an initial payment to Boeing Commercial Airplanes when it places 
an order for new aircraft, but will not make final payment until it is 
satisfied that its requirements have been met. 

In another example, Cummins officials discussed how they were 
motivated to adopt more disciplined product development processes 
following the late-1990s development effort for one of the company's 
highest-selling engines. According to company officials, the design 
requirements were unstable from the start of development. They were 
changed and added to as development progressed, often without the 
benefit of timely and disciplined requirements analysis to ensure they 
could be met for the estimated investment cost. There were conflicting 
requirements (weight, size, performance, and fuel economy) that made 
development difficult. In addition, Cummins did not pay enough 
attention to reliability, focusing instead on weight and power 
considerations. As a result, development costs were higher than 
expected and, once the engine was sold, customers experienced lower 
than expected reliability. A Cummins official reported that the company 
found itself in 
an "intolerable" position with customers who were becoming increasingly 
dissatisfied. 

This significant event, in which Cummins lost customer confidence, 
caused the company to examine its product development processes. The 
result of this examination was an improved product development process 
that requires a more cross-functional and data-based approach to new 
development programs. The improvements resulted in better analysis and 
understanding of customer requirements, leading to better-informed 
resource allocations before new programs begin. Cummins had invested in 
both customer satisfaction and the development and support of its 
products, and protecting that investment provided the motivation to 
adopt a more disciplined approach to developing high-quality products 
for its customers. 

Intelsat officials told us that the company makes progress payments to its 
manufacturers throughout development and production. However, the 
company holds about 10 to 20 percent of the contract value to award to 
the manufacturer after a satellite is successfully launched. According 
to company officials, the 10 to 20 percent is paid to the manufacturer 
over the expected life of the satellite, which is typically 15 years, 
when the satellite performs as expected. 
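
As a rough sketch of how such a holdback works (the contract value, 15 
percent holdback rate, and even annual payout below are our illustrative 
assumptions, not Intelsat figures): 

# Hypothetical sketch of an on-orbit incentive payment stream.
# Assumptions (ours, for illustration): a $200 million contract, a
# 15 percent holdback, and a 15-year design life with the holdback paid
# out evenly in years when the satellite performs as expected.

def incentive_schedule(contract_value, holdback_fraction,
                       design_life_years, good_years):
    """Return the list of annual incentive payments over the design life."""
    holdback = contract_value * holdback_fraction
    annual_payment = holdback / design_life_years
    return [annual_payment if year in good_years else 0.0
            for year in range(1, design_life_years + 1)]

if __name__ == "__main__":
    schedule = incentive_schedule(
        contract_value=200_000_000,
        holdback_fraction=0.15,
        design_life_years=15,
        good_years=set(range(1, 16)),  # performs as expected every year
    )
    print("Annual incentive payment: $%.0f" % schedule[0])      # $2,000,000
    print("Total paid over design life: $%.0f" % sum(schedule)) # $30,000,000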

The commercial companies also all capture information about their 
manufacturing processes and key suppliers' quality. However, unlike 
DOD, they use the information when making purchasing decisions and 
determining how best to structure contracts to incentivize good quality 
outcomes. For example, in some cases Intelsat does not allow 
manufacturers to use certain suppliers whose parts do not meet 
specified reliability goals. In addition, Intelsat may include clauses 
in its contracts that require a manufacturer to conduct periodic 
inspections of particular suppliers. 

DOD Efforts to Improve Acquisition Outcomes: 

DOD has long recognized its acquisition problems and has initiated 
numerous improvement efforts over the years to address them. A recent 
set of initiatives is highlighted by the Under Secretary of Defense 
for Acquisition, Technology and Logistics in DOD's Defense Acquisition 
Transformation and Program Manager Empowerment and Accountability 
reports to Congress.[Footnote 12] Our analysis indicates that while 
none of the initiatives is aimed solely at improving the quality of DOD 
weapon systems or improving prime contractor quality practices, they 
could address some of the problems identified in this report, 
particularly through those initiatives that improve the DOD 
requirements-setting process and limit requirements growth during 
development. A brief description 
of the initiatives is included below. 

* Concept Decision Reviews: DOD is pilot-testing a concept decision 
reviews program to provide a better framework for strategic investment 
decisions. A Concept Decision Committee composed of senior DOD 
officials is applying the reviews to four pilot programs--the Joint 
Lightweight Tactical Mobility program, the Integrated Air and Missile 
Defense program, the Global Strike Raid Scenario, and the Joint Rapid 
Scenario Generation program. A key aspect of the pilot programs is the 
early involvement and participation of systems engineering prior to 
concept decision. DOD expects this to provide decision makers better 
insight for setting firm requirements early, assessing technology 
options, considering alternative acquisition strategies, ensuring that 
new technology will mature in time to meet development and delivery 
schedules, and delivering systems with predictable performance to the 
warfighter. 

* Time-Defined Acquisition: Under the time-defined acquisition 
initiative, DOD plans to use such criteria as technology maturity, time 
to delivery, and requirement certainty to select the appropriate 
acquisition approach to provide a needed capability. The department 
envisions using a different acquisition approach, depending on whether 
a capability can be fielded in 2 years or less, in more than 2 but less 
than 4 years, or in more than 4 years. In September 2006, the Under 
Secretary of Defense for Acquisition, Technology and Logistics stated 
that he anticipated the time-defined acquisition approach would 
facilitate better overall cost control and more effective use of total 
available resources. 

* Configuration Steering Boards: In July 2007, the Under Secretary of 
Defense for Acquisition, Technology and Logistics directed the 
establishment of Configuration Steering Boards for every current and 
future acquisition category I program in development.[Footnote 13] The 
boards, chaired by the service acquisition executive within each of the 
military services, are expected to review all requirements changes and 
significant technical configuration changes that have the potential to 
adversely affect program cost and schedule. Requirement changes are not 
to be approved unless funds are identified and schedule impacts are 
mitigated. However, the Under Secretary stated in his announcement of 
this initiative that such requirements changes would usually be 
rejected. 

* Key Performance Parameters/Key System Attributes:[Footnote 14] DOD 
has added new guidelines and procedures for establishing weapon system 
requirements in its Joint Capabilities Integration and Development 
System manual. The manual now requires that materiel availability be 
included as a key performance parameter for new weapon system 
development and that materiel reliability and ownership costs be 
included as key system attributes. Together, these requirements are 
aimed at ensuring weapon system sustainment considerations are fully 
assessed and addressed as part of the systems engineering process. 

* Award and Incentive Fees: DOD recently issued policy memorandums that 
revise its policy on the proper use of award and incentive fees. The 
memorandums emphasize the need to structure award fee contracts in ways 
that focus DOD and contractor efforts on meeting or exceeding cost, 
schedule, and performance requirements. The memorandums state that 
award fees should be linked to desired outcomes and that payments 
should be commensurate with contractor performance. They also provide 
guidelines for how much contractors will be paid for excellent, 
satisfactory, and less than satisfactory performance. 

While these initiatives are not directly linked together, they have the 
potential to help DOD implement some of the leading commercial 
practices we have highlighted in the past. In particular, they could 
help the Under Secretary of Defense for Acquisition, Technology and 
Logistics ensure that DOD has a better match between warfighter needs 
and funding at the start of weapon system development and that 
technology, engineering, and production knowledge is properly 
considered at that time. They can also help control requirements 
changes and requirements growth, which can adversely affect system 
quality during development. The initiatives are still new and, in the 
case of concept decision reviews, small in scope; therefore, their 
effectiveness may not be known for some time. 

Conclusions: 

DOD has developed policies that address the need for setting achievable 
requirements, adopting commercial quality standards, using good systems 
engineering practices, and overseeing supplier quality. However, DOD 
still has difficulty acquiring high-quality weapon systems in a cost- 
efficient and timely manner. While many problems are caused by poor 
prime contractor practices related to systems engineering, 
manufacturing, and supplier quality, an underlying cause lies in the 
environment. DOD typically assumes most of the financial risk 
associated with the development of complex systems. These risks are 
exacerbated because DOD generally enters into development contracts 
without demonstrated knowledge or firm assurance that requirements are 
achievable, which too often results in inefficient programs and quality 
problems. 

DOD can learn from leading commercial companies in the way they deal 
with risk and ensure quality in their products. Because commercial 
companies invest their own money in product development and recoup that 
investment when their customers buy the finished good, they put a new 
product's requirements to the test with disciplined systems engineering 
practices before they commit to a large investment to develop it. If a 
highly valued requirement cannot be demonstrated as achievable through 
systems engineering, it is deferred to a subsequent product variation 
or to another program. Moreover, and very importantly, companies do not 
shortcut essential quality practices that ensure process controls and 
high supplier quality, including collecting and analyzing quality data. 
Like commercial companies, DOD must demand appropriate knowledge about 
requirements and make hard decisions about program risk before it 
initiates costly investments. 

Improvements in the way DOD uses existing tools to analyze requirements 
during development, along with potential results of some of the 
initiatives it has underway, can help reduce quality risks and address 
some of the long-standing acquisition problems it faces. Although the 
initiatives are new and, in the case of the concept decision reviews, 
small in scope, they are a good first step toward the department 
setting more realistic requirements and time frames for weapon system 
development. Additional oversight could help ensure that prime 
contractors can meet requirements with given resources, such as funding 
and technologies, prior to DOD entering into a development contract. In 
addition, continued leadership from the Under Secretary of Defense for 
Acquisition, Technology and Logistics and a combination of actions from 
both DOD and prime contractors are needed to make these improvements 
and get the most from the department's planned $1.5 trillion investment 
in new weapons programs. 

Recommendations for Executive Action: 

To ensure that the department is taking steps to improve the quality of 
weapon systems, we recommend that the Secretary of Defense take the 
following actions related to recent initiatives highlighted in DOD's 
Defense Acquisition Transformation and Program Manager Empowerment and 
Accountability reports to Congress to improve its focus on setting 
achievable requirements and oversight: 

* As a part of the concept decision review initiative, have contractors 
perform more detailed systems engineering analysis to develop sound 
requirements before DOD selects a prime contractor for the systems 
development contract, which would help ensure that weapon system 
requirements, including those for reliability, are achievable with 
given resources. 

* Establish measures to gauge the success of the concept decision 
reviews, time-defined acquisition, and configuration steering board 
initiatives and properly support and expand these initiatives where 
appropriate. 

To better assess the quality of weapon system programs and prime 
contractor performance, DOD needs to obtain and analyze more 
comprehensive data regarding prime contractors and their key suppliers. 
Therefore, we also recommend that the Secretary of Defense direct the 
Defense Contract Management Agency and the military services to: 

* Identify and collect data that provides metrics about the 
effectiveness of prime contractors' quality management systems and 
processes by weapon system and business area over time and: 

* Develop evaluation criteria that would allow DOD to score the 
performance of prime contractors' quality management systems based on 
actual past performance, which could be used to improve quality and 
better inform DOD acquisition decision makers. 

Agency Comments and Our Evaluation: 

DOD provided us with written comments on a draft of this report. DOD 
partially concurred with each of the recommendations. DOD's comments 
appear in appendix III. 

In its comments, DOD partially concurred with the draft recommendation 
that, as part of its concept decision review initiative, prime 
contractors should complete systems engineering analysis prior to 
entering a development contract. The department stated that the 
recommendation was vague. DOD noted that it conducts systems 
engineering planning prior to entering into a development contract and 
that prime contractors conduct more detailed systems engineering 
analysis afterwards. Moreover, DOD noted that systems engineering is a 
continuous government-performed activity at the heart of any structured 
development process that proceeds from concept to production. The 
concept decision review initiative, in particular, considers 
fundamental systems engineering issues such as technology and 
integration and manufacturing risk before the concept decision review. 

To address DOD's concern that our recommendation was too vague, we 
modified it to add more detail. Specifically, as part of the concept 
decision review initiative, we recommend that contractors competing for 
the systems development contract provide DOD with more detailed systems 
engineering requirements analysis before the contract is awarded. This 
would help 
ensure that requirements are clear and reasonable before DOD enters 
into a development contract. We understand that currently DOD conducts 
systems engineering planning prior to entering a development contract 
with prime contractors and that prime contractors conduct a more 
thorough systems engineering analysis afterwards. However, because our 
work has found that many DOD systems development efforts have been 
hampered by poorly defined or poorly understood requirements, we 
believe that DOD should test, through the concept decision initiative, 
paying contractors to complete a more thorough systems engineering 
analysis prior to entering into a development contract. This would give 
the department the benefit of more knowledge when finalizing 
requirements and provide an opportunity for DOD to set requirements 
that can be met in a well-defined time frame, which could reduce the 
department's risk exposure in cost reimbursement contracts used for 
development. In addition, it would better position DOD to place more 
accountability on the winning contractor to meet the desired 
requirements within cost and schedule estimates. 

DOD also partially concurred with the recommendation to establish 
measures to gauge the success of the concept decision reviews, time- 
defined acquisition, and configuration steering board initiatives and 
properly support and expand these initiatives where appropriate. In its 
response, DOD stated that changes to the concept decision review and 
time-defined acquisition initiatives are being considered and any 
changes would be reflected in an update to DOD Instruction 5000.2. DOD 
also stated that the configuration steering board initiative is being 
implemented consistent with its policy. 

We are encouraged by the potential changes that could result from 
successful implementation of the concept decision reviews, time-defined 
acquisition, and configuration steering board initiatives. We believe 
that these three initiatives are aimed at addressing several of DOD's 
systemic problems that impact weapon system quality and that the 
department should not lose sight of these initiatives. While the 
initiatives are new and untested in practice, acquisition history tells 
us that these policy changes alone will not be sufficient to change 
outcomes. We have found that measures to gauge success can help 
facilitate senior-level oversight that is needed to bring about 
significant change within an organization. We, therefore, believe this 
recommendation remains valid. 

DOD partially concurred with the recommendation for the Defense 
Contract Management Agency and military services to identify and 
collect data that provides metrics about the effectiveness of prime 
contractors' quality management systems and processes by weapon system 
and business area over time. In its response, DOD stated that the 
Defense Contract Management Agency is in the process of identifying and 
will eventually collect data that could be used to determine the 
effectiveness of prime contractors' quality management systems. 
However, DOD stated that the added expense of capturing data by weapon 
system and business area does not seem warranted at this time. Further, 
it commented that there is no need for the military services to engage 
in an effort similar to the Defense Contract Management Agency's, since 
the agency is working in cooperation with the military services. 

We are encouraged by the Defense Contract Management Agency's efforts 
to identify and collect data on prime contractor quality management 
activities on a broad scale. As we noted in the report, this is a 
practice used by leading commercial companies we visited. During our 
review, the agency could only provide data on a weapon system by weapon 
system basis. We believe that data should be captured on both a weapon 
system and prime contractor basis and that the added expense of 
including data by weapon system is likely minimal, given that it is 
already being collected that way. Considering that DOD plans to invest 
about $1.5 trillion (in 2007 dollars) in its current portfolio of major 
weapon systems, we believe it would be valuable for DOD to know how the 
companies and business units responsible for delivering its high- 
quality weapon systems are performing as well as the quality associated 
with individual weapon systems. In addition, we believe the military 
services, particularly the Navy's Supervisor of Shipbuilding, which is 
responsible for overseeing contractor activities for shipbuilding, 
should identify and collect similar data so that information collected 
is consistent and can be used for comparison purposes. We, therefore, 
believe this recommendation remains valid. 

Finally, DOD partially concurred with the recommendation for the 
Defense Contract Management Agency and military services to develop 
evaluation criteria that would allow DOD to score the performance of 
prime contractors' quality management systems based on actual past 
performance. DOD stated that it plans to develop evaluation criteria 
based on data the Defense Contract Management Agency plans to collect 
in the future. DOD does not think the military services need to develop 
a parallel effort because Defense Contract Management Agency data will 
be shared with the military services. 

It was not our intent for the military services, the Defense Contract 
Management Agency, and the Navy's Supervisor of Shipbuilding to have 
parallel efforts. Rather, we expected that they would work 
collaboratively on this effort. Moreover, not only do we believe DOD 
should know how well the prime contractors and their respective 
programs are performing as noted above, we also believe that DOD should 
know how well the prime contractors' quality management systems are 
working. Again, this is a practice used by leading commercial companies 
we visited. We are encouraged that the Defense Contract Management 
Agency plans to develop evaluation criteria that would be used to score 
prime contractor quality management systems but believe the department 
should have a consistent methodology to be used across DOD. We, 
therefore, believe this recommendation remains valid. 

We are sending copies of this report to the Secretary of Defense and 
interested congressional committees. We will also make copies available 
at no charge on the GAO Web site at http://www.gao.gov. 

If you have any questions about this report or need additional 
information, please contact me at (202) 512-4841 or sullivanm@gao.gov. 
Contact points for our Offices of Congressional Relations and Public 
Affairs may be found on the last page of this report. Key contributors 
to this report are listed in appendix IV. 

Signed by: 

Michael Sullivan: 
Director:
Acquisition and Sourcing Management: 

List of Congressional Committees: 

The Honorable Carl Levin:
Chairman:
The Honorable John McCain:
Ranking Member:
Committee on Armed Services:
United States Senate: 

The Honorable Daniel K. Inouye:
Chairman:
The Honorable Ted Stevens:
Ranking Member:
Subcommittee on Defense:
Committee on Appropriations:
United States Senate: 

The Honorable Ike Skelton:
Chairman:
The Honorable Duncan Hunter:
Ranking Member:
Committee on Armed Services:
House of Representatives: 

The Honorable John P. Murtha:
Chairman:
The Honorable C.W. Bill Young:
Ranking Member:
Subcommittee on Defense:
Committee on Appropriations:
House of Representatives: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

This report compares the quality management policies and practices of 
the Department of Defense (DOD) and its large prime contractors with 
those of leading commercial companies--with a focus on improving the 
quality of DOD weapon systems. Specifically, we (1) determined the impact of 
quality problems on selected DOD weapon systems and defense 
contractors' practices that contributed to the problems, (2) identified 
practices used by leading commercial companies that can be used to 
improve the quality of DOD weapon systems, (3) identified problems DOD 
faces in terms of improving quality, and (4) identified recent DOD 
initiatives that could improve quality. 

To determine the impact of quality problems on selected DOD weapon 
systems and defense contractors' practices that contribute to the 
problems, we selected and reviewed 11 DOD weapon systems with known 
deficiencies from each of the military services and identified the 
quality problems associated with each deficiency. The 11 were chosen to 
demonstrate the types of problems DOD weapon systems experience and to 
help focus our discussions with leading commercial companies on aspects 
of development that caused DOD major quality problems. The prime 
contractors in charge of developing these systems include six of DOD's 
largest contractors; together, they are involved with a little over $1 
trillion, or about 76 percent, of the $1.5 trillion (in 2006 dollars) 
DOD plans to spend on weapon systems in its current portfolio. Systems 
we reviewed, along with the prime contractors responsible for 
developing the systems, are: 

* Advanced SEAL Delivery System, a battery-powered submarine funded by 
the Special Operations Command and developed by Northrop Grumman; 

* Advanced Threat Infrared Countermeasure/Common Missile Warning 
System, a defense countermeasure system for protection against infrared 
guided missiles in flight funded primarily by the Army and developed by 
BAE Systems; 

* Expeditionary Fighting Vehicle, an amphibious and armored tracked 
vehicle funded by the Navy for the Marine Corps and developed by 
General Dynamics; 

* F-22A, an air superiority fighter with an air-to-ground attack 
capability funded by the Air Force and developed by Lockheed Martin; 

* Global Hawk, a high-altitude, long-endurance unmanned aircraft funded 
by the Air Force and developed by Northrop Grumman; 

* Joint Air-to-Surface Standoff Missile, an air-to-surface missile 
funded by the Air Force and developed by Lockheed Martin; 

* LPD 17, an amphibious transport ship funded by the Navy and developed 
by Northrop Grumman; 

* MH-60S, a fleet combat support helicopter funded by the Navy and 
developed by Sikorsky Aircraft; 

* Patriot Advanced Capability-3, a long-range high-to-medium altitude 
missile system funded by the Army and developed by Lockheed Martin; 

* V-22, a tilt rotor, vertical/short take-off and landing aircraft 
funded primarily by the Navy for the Marine Corps and developed jointly 
by Bell Helicopter Textron and Boeing Integrated Defense Systems; and: 

* Wideband Global SATCOM, a communications satellite funded by the Air 
Force and developed by Boeing Integrated Defense Systems. 

To evaluate each of the 11 DOD weapon systems, we examined program 
documentation, such as deficiency reports and corrective action 
reports, and held discussions with quality officials from DOD program 
offices, the prime contractor program office, and either the Defense 
Contract Management Agency or the Supervisor of Shipbuilding office 
where appropriate. Based on information gathered through documentation 
and discussions, we grouped the problems into three general categories: 
systems engineering, manufacturing, and supplier quality. When 
possible, we identified the impact that quality problems had on system 
cost, schedule, performance, reliability, availability, or safety. 
After completing our weapon systems reviews, we held meetings with 
senior quality leaders at selected prime contractors included in our 
review to discuss the quality problems we found and to obtain their 
views on why the problems occurred. 

To identify practices used by leading commercial companies that can be 
used to improve the quality of DOD weapon systems, we selected and 
visited five companies based on several criteria: companies that make 
products similar to DOD weapon systems in terms of complexity; 
companies that have been recognized in quality management literature or 
by quality-related associations/research centers for their high- 
quality products; companies that have won quality-related awards; and/ 
or companies that have close relationships with customers when 
developing and producing products. We met with these companies to 
discuss their product development and manufacturing processes, supplier 
quality activities, and the quality of selected products made by these 
companies. Much of the information we obtained from these companies is 
anecdotal because their data are proprietary and disclosure could 
affect their competitive standing. Several of the companies provided data on 
specific products, which they agreed to let us include in this report. 
The companies we visited and the products we discussed include: 

* Boeing Commercial Airplanes, a leading aerospace company and a 
manufacturer of commercial jetliners. We met with quality officials in 
Seattle, Washington, and discussed the quality practices associated 
with the company's short-to-medium range 737 and extended range 777 
aircraft, as well as its new 787 aircraft. 

* Cummins Inc., a manufacturer of diesel and natural gas-powered 
engines for on-highway and off-highway use. We met with quality 
officials at the company's headquarters in Columbus, 
Indiana, and discussed the development and quality of the ISX, a heavy- 
duty engine. 

* Kenworth Truck Company, a division of PACCAR Inc. and a leading 
manufacturer of heavy- and medium-duty trucks. We met with quality 
officials at its manufacturing plant in Chillicothe, Ohio, which was 
named Quality Magazine's 2006 Large Plant of the Year, to discuss the 
development and quality of various large trucks. 

* Siemens Medical Solutions, a business area within Siemens AG, which 
is a global producer of numerous products, including electronics, 
electrical equipment, and medical devices. We met with quality 
officials at a company facility located in Mountain View, California, 
and discussed the division's quality practices for developing and 
manufacturing ultrasound systems such as the Sequoia ultrasound system. 

* Space Systems/Loral, one of the world's premier designers, 
manufacturers, and integrators of geostationary satellites and 
satellite systems. We met with quality officials at the company's 
headquarters in Palo Alto, California, and discussed the company's 
quality practices for developing satellites such as the Intelsat IX 
series and iPSTAR satellites. 

To identify problems that DOD must overcome to improve the quality of 
weapon systems, we reviewed processes and tools DOD can use to 
influence weapon system quality. These include setting requirements, 
participating in key decisions during weapon system development and 
production, using contracts to incentivize good quality, and overseeing 
weapon system quality and prime contractor performance. We examined 
these processes and tools for the 11 weapons programs we reviewed and 
discussed the use of these processes and tools with acquisition and 
quality officials from the Office of the Secretary of Defense, military 
services, prime contractors, Defense Contract Management Agency, and 
Supervisor of Shipbuilding. We also relied on previous GAO best 
practices and weapon system reports to identify DOD actions that 
contributed to poor quality outcomes. A comprehensive list of reports 
we considered throughout our review can be found in the related 
products section at the end of this report. 

We met with officials at two commercial companies that purchase 
products manufactured by two of the leading commercial manufacturers we 
included in this review. These companies included: 

* American Airlines, the largest scheduled passenger airline in the 
world, which has purchased aircraft from Boeing Commercial Airplanes. 
We met with quality officials at a major maintenance facility located 
in Tulsa, Oklahoma. 

* Intelsat, a leading provider of fixed satellite services for 
telecommunications, Internet, and media broadcast companies, which 
purchases satellites from all major satellite manufacturers in the 
United States and Europe. We met with officials in space systems 
acquisition and planning at the company's headquarters located in 
Washington, D.C. 

Our discussions focused on (1) the companies' roles in establishing 
requirements; (2) the types of contracts they award to manufacturers 
and the specificity included in the contracts in terms of quality, 
reliability, and penalties; and (3) the amount of oversight they 
exercise over their suppliers' development and manufacturing 
activities. 

To identify recent DOD initiatives that could improve weapon system 
quality, we reviewed DOD's formal response to Sections 804 and 853 of 
the John Warner National Defense Authorization Act for Fiscal Year 
2007. These sections require DOD to report to the congressional defense 
committees on acquisition reform and program management initiatives. We 
also met with senior defense leaders to discuss the implementation 
status of the acquisition reform initiatives identified in DOD's 
February 2007 and September 2007 reports to the committees and relied 
on a previous GAO report for the implementation status of planned 
program management improvements.[Footnote 15] 

We conducted this performance audit from September 2006 to December 
2007 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Quality Problems for 11 DOD Weapon Systems: 

This appendix summarizes the quality problems experienced by the 11 DOD 
weapon systems we reviewed. The problems are categorized as systems 
engineering, manufacturing, and/or supplier quality problems. Most of 
the programs had problems in more than one of these categories. These 
summaries do not address all quality problems experienced on the 
programs; rather, they emphasize major problems we discussed with 
officials from the military services, prime contractors, and the 
Defense Contract Management Agency. When possible, we include the 
direct impact the quality problems had on the program, the corrective 
actions the prime contractor or DOD took to address the problems, and 
the change in cost estimates and quantities from the start of program 
development to the present. 

The cost estimates were taken from DOD Selected Acquisition Reports or 
were program office estimates and include DOD's research, development, 
test and evaluation (RDT&E) and procurement expenditures on a 
particular program. We did not break out the portion of these funds 
that was paid to prime contractors versus the amount paid to 
suppliers. In addition, the change in cost estimates can be the result 
of a number of factors, including the amount paid to fix quality 
problems, a decision to procure more weapons, and increased labor rates 
or material prices. 
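
For reference, the percentage changes shown in the figures that follow 
can be reproduced directly from the development and current estimates. 
As an illustrative check (the worked example is ours), using the 
Advanced SEAL Delivery System RDT&E figures in figure 2: 

\[
\text{Change} = \frac{\text{Current estimate} - \text{Development estimate}}{\text{Development estimate}} \times 100\%, 
\qquad 
\frac{584 - 157}{157} \times 100\% \approx +272\%.
\]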

Figure 2: Advanced SEAL Delivery System: 

[See PDF for image] 

Photograph of the Advanced SEAL Delivery System, with the following 
accompanying data: 

DOD sponsor: Special Operations Command; 
Prime contractor: Northrop Grumman; 
Program start: 1994. 

RDT&E cost: 
Development estimate (in millions of dollars): $157; 
Current estimate[A] (in millions of dollars): $584; 
Change: +272%. 

Procurement cost: 
Development estimate (in millions of dollars): $139; 
Current estimate[A] (in millions of dollars): $173; 
Change: +24%. 

Total quantity: 
Development estimate: 3; 
Current estimate[A]: 1; 
Change: -2. 

Source: GAO analysis; US Navy (photo). 

[A] Estimated costs through fiscal year 2008. 

The Advanced SEAL Delivery System has experienced a number of problems 
that have degraded the boat’s reliability and performance. Since 
accepting the boat in 2003, the Navy has issued contracts to Northrop 
Grumman valued at approximately $87 million, and much of this was to 
address design and reliability issues. However, continuing reliability 
and performance problems led to a decision in 2006 to cancel purchases 
of additional boats. The Navy subsequently issued another contract to 
Northrop Grumman for an estimated cost of $18 million to perform 
critical systems reviews and failure reviews, among other things, to 
improve the reliability of the first boat. DOD has also directed the 
Navy and the Special Operations Command to assess alternate material 
solutions to fulfill remaining operational requirements. In July 2007, 
the system was reinstated for System Fielding and Deployment Release. 
Examples of quality problems related to systems engineering and 
supplier quality are highlighted below. 

Systems engineering: 
Quality Problems: Ineffective program management by the contractor, 
including systems engineering deficiencies, was a key contributor to 
the system’s quality problems. Navy and Special Operations Command 
reviews found that the contractor had considerable difficulty 
interpreting the underwater shock portion of the performance 
requirements, in part due to the contractor’s lack of experience in 
submarine design. Further, the contractor used substandard design 
methodologies, resulting in an unacceptable system design. The Advanced 
SEAL Delivery System’s tail is an example of the system’s design 
problems. The system’s aluminum tail was redesigned due to fatigue 
stresses that were revealed during mated operations with the host 
submarine. The aluminum tail was not structurally adequate to meet the 
30-year tail life requirement. In 2005, the Navy awarded Northrop 
Grumman an $8 million contract to redesign the tail and upgrade to 
titanium. The replacement tail still has not resolved all the tail 
assembly design deficiencies, and the Navy has imposed operating 
restrictions that limit the speed of the host submarine while 
transporting the boat. 

Supplier quality: 
Quality Problems: In 2004, during testing of tail repairs, the 
propeller stator, which is part of the tail section, broke off, causing 
damage to the propeller. The resulting investigation attributed this 
failure to improper manufacturing of the propeller stator by a 
supplier, as it was not done in accordance with the stated design. 

[End of figure] 

Figure 3: Advanced Threat Infrared Countermeasure/Common Missile 
Warning System: 

[See PDF for image] 

Photograph of the Advanced Threat Infrared Countermeasure/Common 
Missile Warning System, with the following accompanying data: 

DOD sponsor: Army; 
Prime contractor: BAE Systems; 
Program start: 1995. 

RDT&E cost: 
Development estimate (in millions of dollars): $637; 
Current estimate[A] (in millions of dollars): $798; 
Change: +25%. 

Procurement cost: 
Development estimate (in millions of dollars): $2,605; 
Current estimate[A] (in millions of dollars): $4,515; 
Change: +73%. 

Total quantity: 
Development estimate: 3,094; 
Current estimate[A]: 3,589; 
Change: +495. 

Source: GAO analysis; BAE Systems (photo). 

The Advanced Threat Infrared Countermeasure/Common Missile Warning 
System is composed of two systems initially managed by different 
military services. The Air Force and Navy initially managed the missile 
warning system designed to detect incoming missiles; and the Army 
managed the infrared countermeasure portion of the system, which 
employs laser energy to decoy or jam seekers on incoming missiles. In 
1995, DOD combined the two systems into a joint Army, Navy, and Air 
Force program. According to program officials, the services rushed to 
put a developer on contract, a move that resulted in significant 
requirements growth and presented major difficulties in designing the 
Common Missile Warning System sensor for use on both rotary wing (i.e., 
helicopter) and fixed-wing aircraft. Subsequently, the Navy and Air 
Force dropped out of the program in 2000 and 2001, respectively. 

Systems engineering: 
Quality problems: Reliability problems related to the Advanced Threat 
Infrared Countermeasure jam head and pointing accuracy forced the Army 
to halt laser testing in fiscal year 2005. As a result of these 
problems, fielding of the subsystem has been delayed for 5 years until 
fiscal year 2010 to develop a more reliable jam head. The program 
office estimated the cost of developing the new jam head at $117.3 
million. Reliability problems were caused, at least in part, by an 
early lack of focus on reliability. Specifically, according to a prime 
contractor official, neither the prime contractor nor the government 
had sufficient funding for a reliability testing program. 

Manufacturing: 
Quality problems: More than 4 years after the system’s critical design 
review, Common Missile Warning System sensor units were built in 
prototype shops, with engineers only then trying to identify critical 
manufacturing processes.[Footnote 18] Sensor manufacturing was slowed 
by significant rework, and at one point was halted while the contractor 
addressed configuration control problems. 

[End of figure] 

Figure 4: Expeditionary Fighting Vehicle: 

[See PDF for image] 

Photograph of the Expeditionary Fighting Vehicle, with the following 
accompanying data: 

DOD sponsor: Marine Corps; 
Prime contractor: General Dynamics; 
Program start: 2000. 

RDT&E cost: 
Development estimate (in millions of dollars): $1,569; 
Current estimate[A] (in millions of dollars): $3,565; 
Change: +127%. 

Procurement cost: 
Development estimate (in millions of dollars): $7,037; 
Current estimate[A] (in millions of dollars): $9,847; 
Change: +40%. 

Total quantity: 
Development estimate: 1,025; 
Current estimate[A]: 593; 
Change: -432. 

Source: GAO analysis; EFV Program Office (photo). 

The Expeditionary Fighting Vehicle has experienced significant 
reliability problems in development, despite reliability being one of 
the vehicle’s seven key performance parameters. The prime contractor 
must meet a requirement of 43.5 hours mean time between operational 
mission failures by fielding. The vehicle achieved only 7.7 hours between 
mission failures in pre-production testing, short of the 17 hours 
needed to be on an acceptable path for reliability growth. According to 
prime contractor officials, although reliability was a key performance 
parameter in development, DOD decided to focus its resources during 
this phase on meeting requirements related to water speed, 
survivability, and lethality. While this emphasis did not relieve the 
prime contractor of the reliability requirement, which was to be met at 
full-rate production, activities aimed at meeting the reliability 
requirement were to take place through a reliability growth program in 
which problems identified during testing would be fixed as they 
occurred. However, as a result of reliability problems, DOD extended 
the System Development and Demonstration phase, which program officials 
anticipate will last an additional 4 years at an estimated cost of $750 
million. The primary focus of the extension is to redesign the system 
for improved reliability. The extended development phase will focus on 
reliability improvements for several subsystems including the turret, 
hydraulics, drive train, software, and electrical/electronics. 

Systems engineering: 
Quality problems: According to program officials, design problems— 
manifested as part and subsystem interferences at integration and 
assembly points—were the primary cause of nonconformances noted during 
vehicle assembly. These interferences resulted from design and 
engineering changes that were not always passed to suppliers; this 
resulted in supplier parts not fitting into assemblies because they 
were produced using earlier designs. Interferences caused assembly 
schedule delays and, more generally, Defense Contract Management Agency 
officials said the high number of nonconformances experienced during 
assembly made every development vehicle late for testing. Prime 
contractor officials identified specific root causes for interference 
problems as: 
* engineering model release schedules were tight; 
* design engineers lacked experience and did not comply with 
engineering standards; 
* computer model checks were inconsistently performed; and 
* space claim checks were not performed or completed between subassembly 
teams, resulting in different components inadvertently claiming use of 
the same space. 

[End of figure] 

Figure 5: F-22A: 

[See PDF for image] 

Photograph of the F-22A, with the following accompanying data: 

DOD sponsor: Air Force; 
Prime contractor: Lockheed Martin; 
Program start: 1991. 

RDT&E cost: 
Development estimate (in millions of dollars): $23,820; 
Current estimate[A] (in millions of dollars): $36,723; 
Change: +54%. 

Procurement cost: 
Development estimate (in millions of dollars): $62,586; 
Current estimate[A] (in millions of dollars): $35,845; 
Change: -43%. 

Total quantity: 
Development estimate: 648; 
Current estimate[A]: 184; 
Change: -464. 

Source: GAO analysis; F-22A System Program Office (photo). 

In 2002, we reported that the F-22A program initially had taken steps 
to use commercial best practices to design and produce the aircraft. 
[Footnote 19] For example, the program planned to design in reliability 
and get critical manufacturing processes in statistical control by the 
full-rate production decision. In 2000, citing budgetary constraints 
and specific hardware quality problems that demanded attention, the Air 
Force decided to trade off producibility, reliability, and 
maintainability activities for performance in the system’s design and 
abandoned its efforts to get manufacturing processes in control. Less 
than 50 percent of its critical manufacturing processes were in control 
when the program entered production. Currently, the program is using 
post-assembly inspections to identify and fix defects rather than 
statistical process control techniques to prevent them. 
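
For readers unfamiliar with the term, a manufacturing process is 
generally considered "in statistical control" when its measurements 
stay within control limits derived from the process's own variation, 
commonly the mean plus or minus three standard deviations. The sketch 
below is a generic illustration using hypothetical measurements, not 
F-22A program data or methodology: 

# Minimal illustration of a Shewhart-style statistical control check.
# Control limits are derived from in-control baseline data; new
# measurements outside those limits signal a process that has drifted
# out of statistical control. All values are hypothetical.

import statistics

def control_limits(samples):
    """Return (lower, center, upper) 3-sigma control limits."""
    center = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, limits):
    """Return indices of measurements falling outside the limits."""
    lower, _, upper = limits
    return [i for i, x in enumerate(samples) if x < lower or x > upper]

if __name__ == "__main__":
    # Hypothetical hole-diameter measurements, in millimeters.
    baseline = [10.01, 10.02, 9.98, 10.00, 10.03, 9.99, 10.02, 10.01, 9.97, 10.00]
    limits = control_limits(baseline)      # limits from in-control data
    new_lot = [10.00, 10.02, 10.45, 9.99]  # 10.45 falls outside the limits
    print("Control limits (lower, center, upper):", limits)
    print("Out-of-control indices in new lot:", out_of_control(new_lot, limits))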

The program has not yet reached its reliability requirement of 3 hours 
mean time between maintenance actions. The Air Force estimated that, 
when production began in 2001, Lockheed Martin should have been able to 
demonstrate that the F-22A could achieve almost 2 flying hours between 
maintenance actions. However, it could demonstrate only 0.66 hours, or 
about 40 minutes, between maintenance actions. As of October 2007, the 
mean time between maintenance actions was 1.61 hours, or about 97 
minutes. For 
fiscal year 2008, the program has over $400 million budgeted to improve 
the reliability and maintainability of the aircraft. Following are some 
of the more significant quality problems experienced on the system. 

Systems engineering: 
Quality problems: The program was structured to provide the aircraft’s 
full capability with the first production unit off the line. This was 
an extreme design challenge and required the product design to include 
many new and unproven technologies, designs, and manufacturing 
processes. For example, the design of the transparency was a first for 
canopy systems. The program also included new low observable (stealth) 
materials, integrated avionics, and propulsion technology that were not 
mature at the start of the acquisition program. The program declared 
the design stable and ready to begin initial manufacturing, even though 
only 26 percent of the eventual design drawings were completed at the 
critical design review. The F-22A program did not achieve design 
stability, where 90 percent of drawings were complete, until almost 3 
years after the critical design review. By this time, the first two 
development aircraft had been delivered. 

Manufacturing: 
Quality problems: The program has experienced manufacturing problems 
with the canopy transparency that covers the aircraft's cockpit. 
According to program officials, the program includes a first-of-a-kind 
canopy, with an external metallic stealth layer, which makes it 
difficult to manufacture. Program officials acknowledge the program has 
addressed and corrected a number of manufacturing issues with the 
canopy, including cracks emanating from mounting holes in the 
transparency in 2000, and they maintain they continue to incorporate 
improvements to the coating system in order to meet durability 
requirements. Another canopy problem, unrelated to the transparency, 
involved a defective activator that prevented a pilot from exiting the 
aircraft for 5 hours, resulting in a cost of over $100,000 to release 
the pilot and retrofit other existing aircraft. 

Supplier quality: 
Quality problems: The Air Force identified a potential major structural 
problem with heat treatment of titanium forgings near the aircraft’s 
engine. According to program officials, the problem with the titanium 
was a material defect from a subcontractor. The program office, in 
conjunction with Air Force Research Laboratories and the contractor, 
performed a thorough review of the entire manufacturing process for the 
frames and determined that the root cause was that the frames had not 
spent enough time at the proper temperature during the heat treating 
process. An extensive structural test program was conducted to 
determine the service life impact of aircraft produced with incorrectly 
heat treated frames. The results of the testing showed that the frames 
met full durability life. The original heat treat vendor is no longer 
producing parts for the F-22A program, and the program office, in 
conjunction with Air Force Research Laboratories and Lockheed Martin, 
has implemented rigorous process controls at the new vendor to ensure 
that all frames for future production are properly heat treated. 

[End of figure] 

Figure 6: Global Hawk Unmanned Aircraft System: 

[See PDF for image] 

Photograph of the Global Hawk Unmanned Aircraft System, with the 
following accompanying data: 

DOD sponsor: Air Force; 
Prime contractor: Northrop Grumman; 
Program start: 2001. 

RDT&E cost: 
Development estimate (in millions of dollars): $989; 
Current estimate[A] (in millions of dollars): $3,682; 
Change: +272%. 

Procurement cost: 
Development estimate (in millions of dollars): $4,102; 
Current estimate[A] (in millions of dollars): $5,774; 
Change: +41%. 

Total quantity: 
Development estimate: 63; 
Current estimate[A]: 54; 
Change: -9. 

Source: GAO analysis; Northrop Grumman Corporation (photo). 

The Global Hawk program began as an Advanced Concept Technology 
Demonstration effort in 1994. Following a successful technology 
demonstration in 2001, DOD transitioned the program directly to a 
simultaneous development and production effort. In 2002, the program 
was restructured to include a more advanced Global Hawk model. 
Collectively, these decisions created several challenges for the 
program, including quality problems, as described below. 

Systems engineering: 
Quality problems: The advanced Global Hawk model’s design had been 
expected to be very similar to the previous Global Hawk model. However, 
as the design for the advanced model matured and production was about 
to start, the differences between the initial model and the advanced 
model were more extensive, complex, and costly than anticipated. Within 
a year, there were more than 2,000 engineering drawing changes to the 
baseline of 1,400 drawings. More than half of the changes were 
considered major. Design deficiencies, engineering changes, and work 
delays contributed to a $209 million overrun in the development 
contract. 

Supplier quality: 
Quality problems: The supplier producing the Integrated Sensor Suite, 
which is the primary air vehicle payload, encountered production 
problems as the program transitioned from an Advanced Concept 
Technology Demonstration effort to a development and production effort. 
During the demonstration effort, the supplier built the sensor suite in 
a laboratory using a labor-intensive process. However, this process was 
not efficient for a longer production effort, and the supplier 
struggled to meet the increased production rates for later deliveries. 
Defense Contract Management Agency officials stated that delays in 
developing specialized test equipment resulted in a schedule slip of 
approximately 4 months in production of the integrated sensor 
suite. To address this issue, the Air Force had to invest $30 million 
in specialized testing equipment to help the contractor implement 
efficient production processes. However, this supplier continued to 
experience quality problems related to workmanship and was delivering 
sensors late. Starting in October 2006, the prime contractor placed an 
assistance team at the supplier’s facility and addressed many of these 
problems. A 2006 review of lessons learned from the Global Hawk program 
by prime contractor and Air Force program personnel noted that the 
program had done little production planning as it began the development 
and production effort. In addition, the review found that the Air Force 
should have included funding in the first production estimates for 
specialized test equipment, which was needed to implement efficient 
production processes. 

[End of figure] 

Figure 7: Joint Air-to-Surface Standoff Missile: 

[See PDF for image] 

Photograph of the Joint Air-to-Surface Standoff Missile, with the 
following accompanying data: 

DOD sponsor: Air Force; 
Prime contractor: Lockheed Martin; 
Program start: 1998. 

RDT&E cost: 
Development estimate (in millions of dollars): $970; 
Current estimate[A] (in millions of dollars): $1,407; 
Change: +45%. 

Procurement cost: 
Development estimate (in millions of dollars): $1,208; 
Current estimate[A] (in millions of dollars): $3,998; 
Change: +231%. 

Total quantity: 
Development estimate: 2,469; 
Current estimate[A]: 5,006; 
Change: +2,537. 

Source: GAO analysis; Joint Air-to-Surface Standoff Missile (JASSM) 
Program Office (JASSM-Extended Range IT-2) (photo). 

The Joint Air-to-Surface Standoff Missile program entered production in 
December 2001; as of June 2007, more than 600 missiles had been 
delivered to DOD. However, following two flight test failures in the 
spring of 2005, quantities for one missile production lot were reduced 
and, in response to congressional concerns, DOD is focusing on 
increasing missile reliability. As of September 2007, the program 
office had spent $39.4 million on reliability improvements. Due to 
increased costs and schedule delays associated with reliability 
problems and development of an extended range version of the missile, 
the program reported a Nunn-McCurdy breach in 2006. [Footnote 20] 
According to a program official, in May 2007, DOD deferred Nunn-McCurdy 
certification to continue the program, pending improvements to system 
reliability. Overall, the office of DOD's Director, Operational Test 
and Evaluation recorded 25 flight test failures, 12 of which were attributed to 
quality and hardware design issues. Problems related to manufacturing 
and supplier quality were responsible for some test failures. 

Systems engineering: 
Quality problems: Wing retention devices—piston-like parts that hold 
the wings in the stowed position inside the missile—failed to deploy 
during two flight tests, causing test failures. The parts were designed 
to snap in an exact location when an electronic charge is fired, 
allowing wing deployment. Following the test failures, the design was 
shown to be adequate, but the manufacturing process could not guarantee 
that the devices would snap in the precise spot every time. Manufacturing 
tolerances were subsequently changed to ensure an exact break, and this 
remedy was retrofitted to some missile production lots. 

Supplier quality: 
Quality problems: Malfunction of a mechanical fuse provided by a 
supplier was responsible for another flight test failure. Prime 
contractor officials said the failure exhibited a repeat of a fuse 
problem experienced early in manufacturing, which resulted in a 2004 
flight test failure. That earlier failure was attributed to foreign 
object damage and a corrective action was applied to missiles in 
production at the time. Additionally, the prime contractor has agreed 
to fund replacement of affected fuses from two production lots. Another 
supplier problem resulted in an April 2005 test failure involving a 
part needed to move the tail and wings into flight position. This 
problem, which originally surfaced prior to a planned mission the 
previous year, resulted from a supplier employee not following work 
instructions. The missile fleet was inspected at that time, but, 
following a review of the flight failure, the program office and prime 
contractor determined that the criteria used for the inspection were 
inadequate. As a result, the prime contractor instituted a more robust 
inspection process at the supplier for future production. 

[End of figure] 

Figure 8: LPD 17 Amphibious Transport Dock: 

[See PDF for image] 

Photograph of the LPD 17 Amphibious Transport Dock, with the following 
accompanying data: 

DOD sponsor: Navy; 
Prime contractor: Northrop Grumman; 
Program start: 1993. 

RDT&E cost: 
Development estimate (in millions of dollars): $97; 
Current estimate[A] (in millions of dollars): $137; 
Change: +41%. 

Procurement cost: 
Development estimate (in millions of dollars): $11,025; 
Current estimate[A] (in millions of dollars): $13,557; 
Change: +23%. 

Total quantity: 
Development estimate: 12; 
Current estimate[A]: 9; 
Change: -3. 

Source: GAO analysis; U.S. Navy (PMS 317, PEO Ships). 

According to the program office, the LPD 17 Amphibious Transport Dock, 
which was delivered to the Navy in July 2005, has experienced numerous 
quality problems of varying degrees that significantly impacted the 
ship's mission. These problems contributed to a 3-year delay in the 
delivery of the ship and a cost increase of $846 million. According to 
Navy program officials, some of the problems are typical of the first 
ship of a class. Many of the problems can be attributed 
to systems engineering, manufacturing, and supplier issues as noted 
below. In June 2007, the Secretary of the Navy sent a letter to the 
Chairman of the Board of Northrop Grumman expressing his concerns about 
the contractor's ability to construct and deliver ships that conform to 
the quality standards maintained by the Navy and adhere to the agreed-
upon cost and schedule commitments. Northrop Grumman's Chairman 
acknowledged that the company was aware of the problems and was working 
to improve its processes. 

Systems engineering: 
Quality problems: Many of the systems engineering problems on the LPD 17 
can be attributed to the software-based design tool used by the 
contractors. To fulfill Navy requirements, the contractor selected a 
3-D modeling tool, the Intergraph software package, which had been used 
in large construction efforts but had not been fully adapted for 
shipbuilding. Workers were intended to design systems in, and extract 
drawings from, this 3-D model. Modifying this design tool while the 
ship was being designed caused delays in the release of production 
drawings. According to the program office, Northrop Grumman experienced 
some difficulty in acquiring and training qualified personnel to use 
the system. Consequently, the program experienced higher than expected 
engineering hours due to a large number of design drawings that 
required rework. Design rework also affected the sequencing of work 
being done on the ship as well as the accuracy of that work. Northrop 
Grumman Ship Systems officials stated that completing design work after 
beginning ship construction affects both the work schedule and the 
quality of work. The LPD 17 also encountered a problem with the 
isolators on titanium piping. The isolators are used to separate 
different types of metals to keep them from corroding. The problem was 
discovered in 2006, about a year after the launch of the first ship. 
According to DOD program officials, the titanium piping is used 
throughout the ship because it is lighter than the traditional copper-
nickel piping and has a longer service life. However, it has not been 
used much in naval surface ships or by the American shipbuilding 
industry, and therefore required new manufacturing and installation 
processes. According to the program office, these processes were being 
developed as Northrop Grumman Ship Systems was building the ship. In 
addition, designs for the piping hangers, which hold the piping in 
place, as well as tests of the isolators were subsequently delayed. 
When the titanium piping on the ship was changed, the hanger design had 
to be modified as well. The final hanger design was not completed until 
about 90 percent of the titanium piping was already on the ship, which 
resulted in additional rework and schedule delays. The LPD 17 Class has 
had problems associated with its steering system as well. Hydraulic 
fluid contamination occurred during system flushing, a cleaning 
procedure that involves running fluid throughout the piping. 
Additionally, there were problems in keeping air out of the system. 
After investigation, several steps were taken to mitigate these issues, 
including installing additional filters, modifying the flushing 
procedures, and modifying the system design. 

Manufacturing: 
Quality problems: The ship encountered problems with faulty welds on P-
1 piping systems, a designation used in high-temperature, high-
pressure, and other critical systems. This class of piping is used 
primarily in hydraulic applications in engineering and machinery 
spaces. P-1 piping systems require more extensive weld documentation 
than other pipes as they are part of critical systems and could cause 
significant damage to the ship and crew if they failed. Welds of this 
nature must be documented to ensure they were completed by qualified 
personnel and inspected for structural integrity. Further investigation 
revealed that weld inspection documentation was incomplete. As a 
result, increased rework levels were necessary to correct deficiencies 
and to re-inspect all the welds. Failure to complete this work would 
have increased the risk of weld failure and potentially presented a 
hazard to the ship and crew. According to the program office, a 
contributing factor was turnover in production personnel and their lack 
of knowledge on how to complete the proper documentation. The program 
is also experiencing problems with non-skid coating, a type of coating 
used on the ship that, unlike traditional surface coatings, creates a 
rough surface when it dries. This is particularly important on a ship 
because it provides increased traction when wet. One problem the 
program encountered with this coating involved surface preparation: 
when applying non-skid coating, it is important to have a clean surface 
free of dirt and debris. Additionally, the high humidity levels found 
along the Gulf Coast, where the ship was built, interfere with the 
bonding process and require dehumidification. These conditions have 
been difficult to consistently achieve in a construction environment. 
As a result, the non-skid coating would not adhere properly and began 
to peel away. As of November 2007, no change in process had occurred. 

Supplier quality: 
Quality problems: The ship program also experienced numerous supplier 
quality problems. For example, an inspection completed in March 2007 
identified the reverse osmosis units, which provide drinkable water to 
the crew, as one of the most troubled systems onboard the ship. At the 
time of the inspection, one of the three units was out of commission, 
one was unable to produce to capacity, and one was operational but 
unreliable. In this condition, the ship would not be able to support 
embarked troops for extended periods at sea and, as a result, the 
mission of the ship would be limited. During the design phase, it was 
determined that currently available reverse osmosis units could not 
meet the ship’s output requirements. Therefore, a new design was 
developed specifically for the LPD 17 Class. Problems with the reverse 
osmosis units were caused by premature failures of some mechanical and 
electrical components. According to the Navy program office, the 
supplier of the ship’s reverse osmosis units did not use parts rugged 
enough for the ship’s needs. This supplier is providing reverse osmosis 
units for all ships in the LPD 17 Class. Consequently, the LPD 18 and 
LPD 19 will need to have their units reworked as well. According to the 
program office, the vendor is now using more rugged parts and will 
provide properly working units for the LPD 20, the fourth ship to be 
delivered in this class, and all subsequent ships. 

[End of figure] 

Figure 9: MH-60S Fleet Combat Support Helicopter: 

[See PDF for image] 

Photograph of the MH-60S Fleet Combat Support Helicopter, with the 
following accompanying data: 

DOD sponsor: Navy; 
Prime contractor: Sikorsky Aircraft; 
Program start: 1998. 

RDT&E cost: 
Development estimate (in millions of dollars): $85; 
Current estimate[A] (in millions of dollars): $654; 
Change: +666%. 

Procurement cost: 
Development estimate (in millions of dollars): $3,246; 
Current estimate[A] (in millions of dollars): $7,212; 
Change: +122%. 

Total quantity: 
Development estimate: 166; 
Current estimate[A]: 267; 
Change: +101. 

Source: GAO analysis; H-60 program office. 

The MH-60S is currently in production. In recent years, Sikorsky began 
to outsource some of its production work and subsequently experienced 
problems when it outsourced more complex assemblies to suppliers. An 
example involving the outsourcing of MH-60S cabin production is 
described below. 

Manufacturing: 
Quality problems: The supplier building the cabin assemblies had to 
scrap two major assemblies for the helicopter cabins, and the program 
experienced approximately a 6-month delay in the delivery of the 
cabins. According to program office officials, these problems occurred 
because of inadequate work instructions for producing the cabins. 
Sikorsky sent the subcontractor its drawing packages and work 
instructions for the cabins, but due to the poor quality of this 
information, the subcontractor was not able to build the cabins 
properly. Initially, Sikorsky had planned to have a parallel production 
line with the subcontractor. However, Defense Contract Management 
Agency officials stated that Sikorsky discontinued the parallel 
production line after the supplier had built only a few cabins. 
Sikorsky officials said that the company did not provide enough overlap 
in the production line to ensure a smooth transition. 

[End of figure] 

Figure 10: Patriot Advanced Capability-3: 

[See PDF for image] 

Photograph of the Patriot Advanced Capability-3, with the following 
accompanying data: 

DOD sponsor: Army; 
Prime contractor: Lockheed Martin; 
Program start: 1994. 

RDT&E cost: 
Development estimate (in millions of dollars): $2,593; 
Current estimate[A] (in millions of dollars): $3,932; 
Change: +52%. 

Procurement cost: 
Development estimate (in millions of dollars): $2,357; 
Current estimate[A] (in millions of dollars): $5,665; 
Change: +140%. 

Total quantity: 
Development estimate: 1,200; 
Current estimate[A]: 969; 
Change: -231. 

Source: GAO analysis; PAC-3 Product Office, Lower Tier Project Office. 

When the Patriot Advanced Capability-3 program was initiated, the 
development contract did not require the prime contractor to have 
commercial quality standard certification. According to program 
officials, the contract’s reference to quality standards was for 
guidance only. Since the program went into production in 1999, it has 
experienced a number of problems with the seeker (target finding) 
portion of the missile. Below are some examples of these problems. 

Systems engineering: 
Quality problems: During flight testing in November 2005, two missile 
seekers reset shortly after launch, causing the missiles to fail to 
intercept their targets. These failures resulted from a design issue: 
the requirements passed from the prime contractor to the seeker 
manufacturer did not contain a sufficient design margin. The Army 
paid for a failure analysis review as well as a short-term software fix 
to mitigate the effects of a potential future seeker reset. Related 
hardware improvements will cost up to $2.1 million. 

Manufacturing: 
Quality problems: We reported that in low-rate production, only 25 
percent of the missile's seekers were being manufactured correctly the 
first time, with the rest being reworked an average of four times 
before being acceptable. [Footnote 21] Additionally, prior to entering 
production, only 40 percent of the missile's manufacturing processes 
were in control. In an effort to boost seeker first-pass yield rates, 
the Army agreed to pay an estimated $24 million for equipment to test 
subcomponents before integration into the seeker. This equipment has 
helped identify problems earlier in the manufacturing process and 
improve first-pass yield rates to 90 percent. 

Supplier quality: 
Quality problems: In the spring of 2006, a supplier producing a seeker 
component accepted non-conforming hardware without the authority to do 
so. One program official attributed this in part to the supplier 
having formerly operated in a development environment in which 
procedures for material acceptance differ from those followed during 
production. The supplier involved has also experienced other problems, 
some involving manufacturing and poor workmanship issues. As a result 
of problems experienced by this supplier, its production facility was 
temporarily shut down, causing a 6-month schedule slip and delaying 
delivery of about 100 missiles. 

[End of figure] 

Figure 11: V-22 Joint Services Advanced Vertical Lift Aircraft: 

[See PDF for image] 

Photograph of the V-22 Joint Services Advanced Vertical Lift Aircraft, 
with the following accompanying data: 

DOD sponsor: Navy; 
Prime contractor: Bell Helicopter Textron and Boeing Integrated Defense 
Systems; 
Program start: 1986. 

RDT&E cost: 
Development estimate (in millions of dollars): $4,033; 
Current estimate[A] (in millions of dollars): $12,474; 
Change: +209%. 

Procurement cost: 
Development estimate (in millions of dollars): $33,823; 
Current estimate[A] (in millions of dollars): $42,088; 
Change: +24%. 

Total quantity: 
Development estimate: 913; 
Current estimate[A]: 458; 
Change: -455. 

Source: GAO analysis; V-22 Joint Program Office. 

The V-22 is currently in production. The program has experienced four 
crashes throughout its development and production, three of which 
resulted in casualties. Quality issues related to design were a 
contributing factor in one of these fatal crashes, as described below. 

Systems engineering: 
Quality problems: In 2000, a low-rate initial production aircraft 
crashed during a training mission. This crash killed the four Marines 
aboard the flight, and, as a result, the Navy and the Marine Corps 
suspended program flight operations from December 2000 to May 2002. An 
investigation into the crash attributed the accident to a combination 
of a hydraulic line failure and a flight control software problem. 
While neither the hydraulic line failure nor the software problem alone 
would have caused the accident, the combination of the two problems 
resulted in a loss of flight control. The hydraulic line failure was 
due to chafing of the line on a wire harness in the nacelle, which is 
the portion of the aircraft that tilts or rotates in order to convert 
from helicopter to aircraft operations. The accident investigation also 
noted that hydraulic line chafing was a repeated problem among 
aircraft, citing various Airframe Bulletins, Hazardous Material Reports, 
and Quality Deficiency Reports from June 1999 through February 2001 
that described chafing of wire bundles and hydraulic lines in 
the aircraft nacelles. The Navy program office established Integrated 
Product Teams to identify the hydraulic system challenges facing the V-
22. They concluded that chafing due to insufficient clearances among 
components, installation flaws, and variances among aircraft were all 
problems affecting the hydraulic lines in the nacelles. The program 
subsequently completed a redesign to address system separation, 
including the hydraulic system design, hydraulic tubing, wire 
harnesses, and fuel system. These design changes were incorporated into 
what are known as “Block A” aircraft. The program office estimated that 
the recurring costs of the engineering change proposals to the aircraft 
design for the Block A aircraft were approximately $165 million. 

[End of figure] 

Figure 12: Wideband Global SATCOM: 

[See PDF for image] 

Illustration of the Wideband Global SATCOM, with the following 
accompanying data: 

DOD sponsor: Air Force and Army; 
Prime contractor: Boeing Integrated Defense Systems; 
Program start: 2000. 

RDT&E cost: 
Development estimate (in millions of dollars): $203; 
Current estimate[A] (in millions of dollars): $322; 
Change: +63%. 

Procurement cost: 
Development estimate (in millions of dollars): $929; 
Current estimate[A] (in millions of dollars): $1,699; 
Change: +83%. 

Total quantity: 
Development estimate: 3; 
Current estimate[A]: 5; 
Change: +2. 

Source: GAO analysis; WGS Program Office. 

The Wideband Global SATCOM acquisition is commercial in nature, with 
the satellite design based on an existing satellite manufactured by the 
prime contractor. As such, the program entered production in November 
2000 with mature technologies; however, supplier problems have delayed 
initial operational capability by 18 months. The first of the five 
satellites was launched in October 2007, 4 months later than expected. 

Manufacturing: 
Quality problems: Power dividers are located in the transmit phased 
array and in the receive phased array. The dividers, which split power 
between the separate elements that make up the arrays, failed hot and 
cold electrical performance tests, resulting in a 6-month schedule slip 
for the program. An in-orbit power divider failure could result in the 
loss of one of the satellite's eight shapeable beams. The problem was 
caused by circuit traces that cracked during the manufacturing process, 
and the prime contractor resolved it by changing suppliers and 
improving its processes for manufacturing the power dividers. 

Supplier quality: 
Quality problems: During replacement of a subcomponent on the first 
satellite, the prime contractor discovered that certain fasteners were 
installed incorrectly. As a result, 1,500 fasteners on each of the 
first three satellites had to be inspected or tested, and 148 fasteners 
on the first satellite had to be reworked. The DOD program office 
reported that the resulting 15-month schedule slip would add rework and 
workforce costs (to be borne by the contractor) to the program and 
delay initial operating capability by 18 months. A prime contractor 
official estimated the impact to the program was at least $10 million. 
The problem resulted from a supplier not testing installed fasteners as 
required. 

[End of figure] 

[End of section] 

Appendix III: Comments from the Department of Defense: 

Office Of The Under Secretary Of Defense: 
Acquisition, Technology And Logistics: 
3000 Defense Pentagon: 
Washington, DC 20301-3000: 

January 18, 2008: 

Mr. Michael Sullivan: 
Director, Acquisition and Sourcing Management: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Sullivan: 

This is the Department of Defense (DOD) response to the GAO draft 
report GAO-08-294, "Best Practices: Increased Focus on Requirements and 
Oversight Needed to Improve DoD's Acquisition Environment and Weapon 
System Quality," dated December 21, 2007 (GAO Code 120642). Detailed 
comments on the report recommendations are enclosed. 

Sincerely, 

Signed by: 

Mark D. Schaeffer: 
Director: 
Systems and Software Engineering: 

Enclosure: As stated: 

Enclosure: 

GAO Draft Report Dated December 21, 2007: 
GAO-08-294 (GAO CODE 120642): 

"Best Practices: Increased Focus On Requirements And Oversight Needed 
To Improve DoD's Acquisition Environment And Weapon System Quality" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
include as a part of the concept decision review initiative, a 
requirement that systems engineering analysis be completed by the prime 
contractor prior to entering into a development contract. (p. 27/GAO 
Draft Report) 

DOD Response: Partially concur. The concept decision (CD) initiative 
reflects the requirement to consider such fundamental systems 
engineering issues as technology maturity and integration and 
manufacturing risk before the CD review. Currently the CD initiative is 
under consideration by the Under Secretary of Defense (Acquisition, 
Technology and Logistics) (USD (AT&L)). The recommendation to require 
that systems engineering analysis be completed by the prime contractor 
prior to entering into a development contract is vague. A prime 
contractor generally cannot perform a systems engineering analysis as 
part of the concept decision review because prime contractors are 
generally not selected until after Milestone B when a development 
contract is awarded for the Systems Development and Demonstration 
Phase. However, by its very nature, systems engineering is a continuous 
government-performed activity at the heart of any structured 
development process that proceeds from concept to production. Formal 
systems engineering planning is consistent with current DoD policy and 
is required to support program decisions prior to award of any 
development contract. 

Recommendation 2: The GAO recommends that the Secretary of Defense 
establish measures to gauge the success of the concept decision 
reviews, time-defined acquisition, and configuration steering board 
initiatives and properly support and expand these initiatives where 
appropriate. (p. 27/GAO Draft Report) 

DOD Response: Partially concur. Configuration steering boards are being 
implemented consistent with the USD (AT&L) policy of July 30, 2007, 
Configuration Steering Boards. Changes to DoD Instruction 5000.2, 
"Operation of the Defense Acquisition System," on concept decision 
reviews and time-defined acquisition are under consideration by the USD 
(AT&L). 

Recommendation 3: The GAO recommends that the Secretary of Defense 
direct the Defense Contract Management Agency and the Military Services 
to identify and collect data that provides metrics about the 
effectiveness of prime contractors' quality management system and 
processes by weapon system and business area over time. (p. 27/GAO 
Draft Report) 

DOD Response: Partially concur. The Defense Contract Management Agency 
is already in the process of identifying and eventually collecting data 
as a source of metrics on the effectiveness of prime contractors' 
quality management system and processes over time. The added expense of 
refining the collection process to capture data by weapon system and 
business area does not seem warranted at this time. Greater levels of 
detail may be considered once the data are used to score performance. 
(See the DoD response to Recommendation 4.) The Defense Contract 
Management Agency is doing this work in cooperation with the Military 
Services; therefore there is no need for them to engage in a similar 
effort. 

Recommendation 4: The GAO recommends that the Secretary of Defense 
direct the Defense Contract Management Agency and the Military Services 
to develop evaluation criteria that would allow DoD to score the 
performance of prime contractors' quality management systems based on 
actual past performance. (p. 27/GAO Draft Report) 

DOD Response: Partially concur. The plant-level data discussed in the 
DoD response to Recommendation 3 will be used to develop evaluation 
criteria that would allow DoD to score the performance of prime 
contractors' quality management systems based on actual performance in 
the past. Such scoring will be helpful in source selection and to 
reduce risk in contracts. These results will be made available to the 
Military Services so there is no need for them to conduct a parallel 
effort. 

[End of enclosure] 

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Michael Sullivan (202) 512-4841 or sullivanm@gao.gov. 

Acknowledgments: 

Key contributors to this report were Jim Fuquay, Assistant Director; 
Cheryl Andrew; Lily Chin; Julie Hadley; Lauren Heft; Laura Jezewski; 
Andrew Redd; Charlie Shivers; and Alyssa Weir. 

[End of section] 

Related GAO Products: 

Defense Acquisitions: Assessments of Selected Weapon Programs. GAO-07- 
406SP. Washington, D.C.: March 30, 2007. 

Best Practices: Stronger Practices Needed to Improve DOD Technology 
Transition Processes. GAO-06-883. Washington, D.C.: September 14, 2006. 

Best Practices: Better Support of Weapon System Program Managers Needed 
to Improve Outcomes. GAO-06-110. Washington, D.C.: November 1, 2005. 

Defense Acquisitions: Major Weapon Systems Continue to Experience Cost 
and Schedule Problems under DOD's Revised Policy. GAO-06-368. 
Washington, D.C.: April 13, 2006. 

DOD Acquisition Outcomes: A Case for Change. GAO-06-257T. Washington, 
D.C.: November 15, 2005. 

Defense Acquisitions: Stronger Management Practices Are Needed to 
Improve DOD's Software-Intensive Weapon Acquisitions. GAO-04-393. 
Washington, D.C.: March 1, 2004. 

Best Practices: Setting Requirements Differently Could Reduce Weapon 
Systems' Total Ownership Costs. GAO-03-57. Washington, D.C.: February 
11, 2003. 

Defense Acquisitions: Factors Affecting Outcomes of Advanced Concept 
Technology Demonstration. GAO-03-52. Washington, D.C.: December 2, 
2002. 

Best Practices: Capturing Design and Manufacturing Knowledge Early 
Improves Acquisition Outcomes. GAO-02-701. Washington, D.C.: July 15, 
2002. 

Defense Acquisitions: DOD Faces Challenges in Implementing Best 
Practices. GAO-02-469T. Washington, D.C.: February 27, 2002. 

Best Practices: Better Matching of Needs and Resources Will Lead to 
Better Weapon System Outcomes. GAO-01-288. Washington, D.C.: March 8, 
2001. 

Best Practices: A More Constructive Test Approach Is Key to Better 
Weapon System Outcomes. GAO/NSIAD-00-199. Washington, D.C.: July 31, 
2000. 

Defense Acquisition: Employing Best Practices Can Shape Better Weapon 
System Decisions. GAO/T-NSIAD-00-137. Washington, D.C.: April 26, 2000. 

Best Practices: DOD Training Can Do More to Help Weapon System Programs 
Implement Best Practices. GAO/NSIAD-99-206. Washington, D.C.: August 
16, 1999. 

Best Practices: Better Management of Technology Development Can Improve 
Weapon System Outcomes. GAO/NSIAD-99-162. Washington, D.C.: July 30, 
1999. 

Defense Acquisitions: Best Commercial Practices Can Improve Program 
Outcomes. GAO/T-NSIAD-99-116. Washington, D.C.: March 17, 1999. 

Defense Acquisitions: Improved Program Outcomes Are Possible. GAO/T- 
NSIAD-98-123. Washington, D.C.: March 17, 1998. 

Best Practices: Successful Application to Weapon Acquisition Requires 
Changes in DOD's Environment. GAO/NSIAD-98-56. Washington, D.C.: 
February 24, 1998. 

Best Practices: Commercial Quality Assurance Practices Offer 
Improvements for DOD. GAO/NSIAD-96-162. Washington, D.C.: August 26, 
1996. 

[End of section] 

Footnotes: 

[1] Boeing has two primary businesses: Boeing Commercial Airplanes and 
Boeing Integrated Defense Systems. We held discussions with officials 
from both business areas and therefore refer to them separately 
throughout the report. 

[2] The ISO 9001 standard provides a framework for managing an 
organization's processes so that it consistently produces products that 
meet customer expectations. An ISO certification means that an 
independent external body has audited an organization's quality 
management system and verified that it conforms to the requirements 
specified in the standard. 

[3] For example, AS9100 is a set of quality standards for the aerospace 
industry; ISO/TS 16949 is a set of standards for the automotive 
industry. 

[4] GAO, Best Practices: Commercial Quality Assurance Practices Offer 
Improvements for DOD, GAO/NSIAD-96-162 (Washington, D.C.: Aug. 26, 
1996). 

[5] GAO, Best Practices: Better Matching of Needs and Resources Will 
Lead to Better Weapon System Outcomes, GAO-01-288 (Washington, D.C.: 
Mar. 8, 2001). 

[6] GAO, Best Practices: Capturing Design and Manufacturing Knowledge 
Early Improves Acquisition Outcomes, GAO-02-701 (Washington, D.C.: July 
15, 2002). 

[7] Six Sigma is a tool for measuring defects and improving quality. 
Over time, it has evolved into a business improvement methodology that 
focuses an organization on understanding customer requirements, 
aligning key business processes to achieve those requirements, 
utilizing rigorous data analysis to minimize variation in business 
processes, and achieving rapid and sustainable improvement to those 
processes. 

[8] The Federal Acquisition Regulation (FAR) indicates that complex 
requirements, particularly those unique to the government, usually 
result in greater risk assumption by the government. This is especially 
true for complex research and development contracts when performance 
uncertainties or the likelihood of changes makes it difficult to 
estimate performance costs in advance. Cost-reimbursable contracts are 
suitable for use only when uncertainties involved in contract 
performance do not permit costs to be estimated with sufficient 
accuracy to use any type of fixed-price contract. FAR 16.104 and 16.301-
2. 

[9] GAO, Defense Acquisitions: Assessments of Selected Weapon Programs, 
GAO-07-406SP (Washington, D.C.: Mar. 30, 2007). 

[10] GAO, Best Practices: Setting Requirements Differently Could Reduce 
Weapon Systems' Total Ownership Costs, GAO-03-57 (Washington, D.C.: 
Feb. 11, 2003). 

[11] GAO-07-406SP. 

[12] Section 804 of the John Warner National Defense Authorization Act 
for Fiscal Year 2007, Pub. L. No. 109-364 (2006), requires DOD to 
submit to Congress on a biannual basis an update of its implementation 
plans to reform the acquisition system. Many of the initiatives 
highlighted in the report were initiated in response to other reform 
efforts, such as the Defense Acquisition Performance Assessment Project 
Report (DAPA, January 2006); Defense Science Board 2005 Summer Study on 
Transformation: "A Progress Assessment" (February 2006); The Center for 
Strategic and International Studies Report, "Beyond Goldwater Nichols: 
U.S. Government and Defense Reform for a New Strategic Era" (July 
2005); and The 2006 Quadrennial Defense Review Report (February 2006). 
Section 853 of the act requires the Secretary of Defense to develop a 
comprehensive strategy for enhancing the role of DOD program managers 
in developing and carrying out defense acquisition programs. 

[13] Acquisition category I programs are major defense acquisition 
programs that are estimated by the Under Secretary of Defense for 
Acquisition, Technology and Logistics to require eventual expenditure 
for research, development, test, and evaluation of more than $365 
million (FY 2000 constant dollars) or procurement of more than $2.190 
billion (FY 2000 constant dollars), or those designated by the 
Milestone Decision Authority to be acquisition category I programs. 10 
U.S.C. § 2430. 

[14] Key performance parameters are defined as those attributes or 
characteristics of a system that are considered critical or essential 
to the development of an effective military capability and those 
attributes that make a significant contribution to the characteristics 
of the future joint force as defined in the Capstone Concept for Joint 
Operations. Key system attributes are attributes or characteristics 
that are considered crucial in support of achieving a balanced 
solution/approach deemed necessary by the program sponsor. Key 
performance parameters must be met before a weapon system can go into 
production. Key system attributes, on the other hand, can be traded in 
favor of other system attributes. 

[15] GAO, Defense Acquisitions: Department of Defense Actions on 
Program Manager Empowerment and Accountability, GAO-08-62R (Washington, 
D.C.: Nov. 9, 2007). 

[16] GAO-02-701. 

[17] 10 U.S.C. § 2433 establishes the requirement for unit cost 
reports. If certain cost thresholds are exceeded (known as unit cost or 
Nunn-McCurdy breaches), DOD is required to report to Congress and, in 
certain circumstances, certify the program to Congress. 

[18] At this point, most design drawings should be released, prototype 
hardware developed, and units ready to build. 

[19] GAO-02-701. 

[20] 10 U.S.C. § 2433 establishes the requirement for unit cost 
reports. If certain cost thresholds are exceeded (known as unit cost or 
Nunn-McCurdy breaches), DOD is required to report to Congress and, in 
certain circumstances, certify the program to Congress. 

[21] GAO-02-701. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: