This is the accessible text file for GAO report number GAO-04-642 
entitled 'NASA: Lack of Disciplined Cost-Estimating Processes Hinders 
Effective Program Management' which was released on June 22, 2004.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Committee on Science, House of Representatives: 

May 2004: 

NASA: 

Lack of Disciplined Cost-Estimating Processes Hinders Effective Program 
Management: 

GAO-04-642: 

GAO Highlights: 

Highlights of GAO-04-642, a report to the Committee on Science, House 
of Representatives 

Why GAO Did This Study: 

For more than a decade, GAO has identified the National Aeronautics and 
Space Administration’s (NASA) contract management as a high-risk area—
in part because of NASA’s inability to collect, maintain, and report 
the full cost of its programs and projects. Lacking this information, 
NASA has been challenged to manage its programs and control program 
costs. The scientific and technical expectations inherent in NASA’s 
mission create even greater challenges—especially if meeting those 
expectations requires NASA to reallocate funding from existing programs 
to support proposed new efforts.

Because cost growth has been a persistent problem in a number of NASA 
programs, GAO was asked to examine NASA’s cost estimating for selected 
programs, assess NASA’s cost-estimating processes and methodologies, 
and describe any barriers to improving NASA’s cost-estimating 
processes. To conduct GAO’s work, GAO analyzed a total of 27 NASA 
programs—10 of which GAO reviewed in detail.

What GAO Found: 

Considerable change in NASA’s program cost estimates—both increases and 
decreases—indicates that NASA lacks a clear understanding of how much 
its programs will cost and how long they will take to achieve their 
objectives. For example, the development cost estimates for more than 
half of the 27 programs that GAO reviewed have increased and for some 
programs this increase was significant—as much as 94 percent. Cost 
estimates changed for each of 10 programs that GAO reviewed in detail. 
For 8 of the 10 programs, the estimates increased. Although NASA cited 
specific reasons for the changes, such as technical problems and 
funding shortages, the variability in the cost estimates indicates that 
the programs lacked the knowledge needed to establish priorities, 
quantify risks, make informed investment decisions, and thus predict 
costs.

Most notably, NASA’s basic cost-estimating processes—an important tool 
for managing programs—lack the discipline needed to ensure that program 
estimates are reasonable. Specifically, GAO found that none of the 10 
NASA programs that GAO reviewed in detail met all of GAO’s cost-
estimating criteria, which are based on criteria developed by Carnegie 
Mellon University’s Software Engineering Institute. Moreover, none of 
the 10 programs fully met certain key criteria—including clearly 
defining the program’s life cycle to establish program commitment and 
manage program costs, as required by NASA. In addition, only three 
programs provided a breakdown of the work to be performed. Without this 
knowledge, the programs’ estimated costs could be understated and 
thereby subject to underfunding and cost overruns, putting programs at 
risk of being reduced in scope or requiring additional funding to meet 
their objectives. Finally, only two programs have a process in place 
for measuring cost and performance to identify risks.

NASA has limited ability to collect the program cost and schedule data 
needed to meet basic cost-estimating criteria. For example, as GAO has 
previously reported, NASA does not have a system to capture reliable 
financial and performance data—key to using effectively the cost-
estimating tools that NASA officials state that programs employ. 
Further, without adequate financial and nonfinancial data, programs 
cannot easily track an acquisition’s progress and assess whether the 
program can meet its cost and schedule goals before it incurs 
significant cost and schedule overruns. NASA identified other barriers, 
including limited cost-estimating staff. According to NASA officials, 
several initiatives are under way to remove such obstacles and improve 
the agency’s cost-estimating practices.

What GAO Recommends: 

GAO is recommending that NASA take a number of actions to better ensure 
that the agency’s planned and recently implemented initiatives to 
improve its cost-estimating practices will result in sound cost 
estimates and thereby enable NASA to control its programs better.

www.gao.gov/cgi-bin/getrpt?GAO-04-642.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Allen Li at (202) 
512-4841 or lia@gao.gov.

[End of section]

Contents: 

Letter: 

Results in Brief: 

Background: 

Development Cost Estimates Frequently Changed: 

Poor Estimating Processes and Methodologies Contributed to 
Wide Variations in Baseline Cost Estimates: 

NASA Has Begun to Address Certain Barriers to Effective Cost 
Estimating: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes: 

Appendix I: Scope and Methodology: 

Appendix II: Assessments of 10 Programs Reviewed in Detail: 

Gravity Probe B: 

Mars Exploration Rovers: 

Space Infrared Telescope Facility: 

Landsat-7: 

Aqua: 

Aura: 

Fluids and Combustion Facility: 

Hyper-X Program: 

Checkout and Launch Control System: 

Cockpit Avionics Upgrade: 

Appendix III: Summary Descriptions of the 17 Additional Programs: 

Space Science Enterprise: 

Earth Science Enterprise: 

Space Flight Enterprise: 

Appendix IV: Description of Earned Value Management: 

Appendix V: Comments from the National Aeronautics and 
Space Administration: 

Appendix VI: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Initial and Current Baseline Development Cost Estimates and 
Life-Cycle Cost Estimates for 27 NASA Programs: 

Table 2: Summary of Criteria Used to Assess 10 NASA Programs Reviewed: 

Table 3: Summary of Extent 10 NASA Programs Met Assessment Criteria: 

Table 4: Summary of the Number of Programs That Met, Partially Met, or 
Did Not Meet Criterion: 

Table 5: Thirty-Two Criteria for Evaluating the Quality of Management 
Systems: 

Figure: 

Figure 1: History of Rebaselinings of 10 Programs' Development Cost 
Estimates: 

Abbreviations: 

AHMS Phase 1: Advanced Health Management System Phase I: 

ATP: Alternate Turbopump Program: 

CAIV: cost as an independent variable: 

CALIPSO: Cloud-Aerosol Lidar and Infrared Pathfinder Satellite 
Observations: 

CARD: cost analysis requirements description: 

CAU: Cockpit Avionics Upgrade: 

CLCS: Checkout and Launch Control System: 

CPR: cost performance report: 

CRV: Crew Return Vehicle: 

DOD: Department of Defense: 

EOS: Earth Observing System: 

EVM: earned value management: 

FCF: Fluids and Combustion Facility: 

GP-B: Gravity Probe B: 

IFMP: Integrated Financial Management Program: 

INTEGRAL: International Gamma-Ray Astrophysics Laboratory: 

IPAO: Independent Program Assessment Office: 

MERs: Mars Exploration Rovers: 

MESSENGER: Mercury Surface, Space Environment, Geochemistry, and 
Ranging: 

NASA: National Aeronautics and Space Administration: 

NMP-EO-1: New Millennium Program Earth Observing-1: 

OMB: Office of Management and Budget: 

PMA: President's Management Agenda: 

SEER: System Evaluation and Estimation of Resources: 

SEI: Software Engineering Institute: 

SIRTF: Space Infrared Telescope Facility: 

SOFIA: Stratospheric Observatory for Infrared Astronomy: 

STEREO: Solar Terrestrial Relations Observatory: 

TDRS: Tracking and Data Relay Satellite Replenishment: 

TIMED: Thermosphere, Ionosphere, Mesosphere Energetics, and Dynamics: 

WBS: work breakdown structure: 

Letter May 28, 2004: 

The Honorable Sherwood L. Boehlert: 
Chairman:
The Honorable Bart Gordon: 
Ranking Minority Member: 
Committee on Science: 
House of Representatives: 

The lack of reliable financial and performance information has posed 
significant challenges to the National Aeronautics and Space 
Administration's (NASA) ability to manage its largest and most costly 
programs effectively. For nearly 15 years, NASA contract management has 
been on GAO's high-risk list--due in part to NASA's inability to 
collect, maintain, and report the full cost of its programs and 
projects.[Footnote 1] Without such information, NASA has consistently 
developed unrealistic cost and schedule estimates, which, at least in 
part, are reflected in the cost growth and schedule increases in many 
of its programs.

The demanding scientific and technical expectations inherent in NASA's 
mission create even greater challenges for the agency to control 
program costs--especially if meeting those expectations requires NASA 
to reallocate funding from existing programs to support new efforts. 
Because cost growth has been a persistent problem on a number of NASA 
programs, you asked us to (1) identify initial cost estimates in 
selected NASA programs and any changes in those cost estimates, 
(2) assess NASA's cost-estimating processes and methodologies, and (3) 
describe any barriers that make it difficult for NASA to improve its 
cost-estimating processes.

Our review focused on 27 of 68 NASA programs that were in the development 
phase as of April 2003 or that completed development in fiscal year 2001 or 
2002. To assess NASA's cost-estimating processes and methodologies, we 
conducted a more in-depth review of 10 of the 27 programs, which 
generally had the highest development cost estimate within five of 
NASA's seven Enterprises.[Footnote 2] Our work was conducted between 
February 2003 and March 2004 in accordance with generally accepted 
government auditing standards. For a complete description of our scope 
and methodology, see appendix I.

Results in Brief: 

Many of the NASA programs that we reviewed cost more and took longer 
than was proposed at the time of congressional approval.[Footnote 3] 
Several factors continue to put NASA projects at risk of increased cost 
and schedule delays. Most notably, NASA lacks the basic cost-estimating 
processes needed to establish priorities, quantify risks, and make 
informed investment decisions for its programs. Further, NASA has 
limited ability to collect, analyze, and use program cost and schedule 
data to identify and mitigate impediments to program success.

Current baseline development cost estimates for the 27 programs we 
reviewed varied considerably from the programs' initial baseline 
estimates.[Footnote 4] More than half of the programs' development cost 
estimates increased, and for some programs, this increase was 
significant--as much as 94 percent. In addition, the baseline 
development estimates for each of 10 programs that we reviewed in 
detail were rebaselined--some as many as four times. For 7 of the 
10 programs, the new baseline development estimate was an increase over 
the previous baseline estimate. Although NASA cited specific reasons 
for the cost growth and the recalculated baselines, such as technical 
problems and funding shortages, the variability in the cost estimates 
and the rebaselinings indicate that the programs lacked sufficient 
knowledge needed to make informed acquisition decisions.

Although an important tool for managing programs, NASA's 
cost-estimating processes lack the discipline needed to ensure that 
program estimates are reasonable. Specifically, we found that none of 
the 10 NASA programs that we reviewed in detail met all of the criteria 
that we selected to assess NASA's cost-estimating processes. Moreover, 
none of the 10 programs met certain key criteria--such as clearly 
defining the program's life cycle. NASA procedures and guidelines 
require programs and projects to be managed on the basis of life-cycle 
cost--which the agency clearly defines--and that such cost be developed 
to establish the program's commitment.[Footnote 5] In addition, only 
three programs provided a complete breakdown of the work to be 
performed. Without knowing the full life cycle and the work to be 
performed, the programs' estimated costs could be understated and 
thereby subject to underfunding and cost overruns, thus putting 
programs at risk of being reduced in scope or requiring additional 
funding to meet their objectives. Finally, only two programs had a 
process in place for measuring cost and performance to identify these 
potential risks and take action to avoid them.

NASA faces a number of barriers in meeting the cost-estimating criteria 
that we used to assess the 10 programs. For example, although NASA 
officials noted that programs are using cost-estimating tools, NASA 
generally lacks the data needed to employ these tools effectively. For 
more than a decade, we have reported that, despite repeated efforts, 
NASA has failed to develop a system to capture reliable financial and 
performance information. Most recently, we reported that the agency's 
current effort to implement a modern integrated financial management 
system will not, as it is being implemented, routinely provide program 
managers and other key stakeholders and decision makers--including the 
Congress--with the financial information needed to measure program 
performance and ensure accountability.[Footnote 6] According to NASA 
officials, nonfinancial 
data, such as data on technology readiness levels, have also been 
difficult for the NASA cost-estimating community to obtain. Without 
adequate financial and nonfinancial data, programs cannot easily track 
an acquisition's progress and assess whether the program can meet its 
cost and schedule goals before the program incurs significant cost and 
schedule overruns. NASA identified other barriers, including limited 
cost-estimating staff. According to NASA officials, there are several 
initiatives under way to remove such obstacles and improve the agency's 
cost-estimating practices.

We are recommending that NASA take a number of actions to better ensure 
that the agency's initiatives result in sound cost-estimating practices 
and are integrated into the project approval process. Specifically, we 
are recommending that NASA develop an integrated plan that includes 
specific actions that ensure that guidance is established on 
rebaselining and that programs have a well-defined process in place to 
measure cost and performance and identify potential risks. We are also 
recommending that NASA establish a framework for developing life-cycle 
cost estimates.

In its comments on a draft of this report, NASA stated that it 
concurred with our recommendations. NASA believes that it has already 
made progress toward achieving many of the improvements intended by the 
recommendations by developing new guidance, implementing management 
controls, and instituting additional levels of project oversight. These 
reforms to NASA's project development and implementation processes are, 
in our view, positive steps in addressing some of the problems 
discussed in our report. However, planned improvements must be 
integrated and enforced on an agencywide basis; our recommendations 
are in line with that thrust. NASA's detailed comments are included as 
appendix V.

Background: 

NASA's programs encompass a broad range of complex and technical 
activities--from investigating the composition and resources of Mars to 
providing satellite and aircraft observations of Earth for scientific 
and weather forecasting. NASA currently funds more than 100 programs 
and projects in various phases of execution in 7 strategic Enterprises: 
Space Science, Earth Science, Biological and Physical Research, 
Aeronautics, Space Flight, Education, and Exploration Systems. Two NASA 
offices have key responsibilities in ensuring the effective execution 
of these programs: the Office of the Chief Financial Officer, which is 
responsible for providing oversight and financial management of agency 
resources and establishing related policy guidance, and the Office of 
Chief Engineer, which is responsible for ensuring development efforts 
and mission operations are planned and conducted using sound 
engineering practices.

More than two-thirds of NASA's work force is made up of contractors and 
grantees, and 90 percent--or roughly $13 billion--of NASA's annual 
budget is spent on work performed by its contractors. Since 1990, we 
have identified NASA's contract management as a high-risk area. This 
assessment has been based in part on our repeated finding that NASA 
does not have good cost-estimating processes or the financial 
information needed to develop good cost estimates for its programs, 
making it difficult for NASA to oversee its contracts and control 
costs. For example, in July 2002, we reported that an independent task 
force convened to assess the management of the International Space 
Station concluded that the program's fiscal year 2002 through fiscal 
year 2006 budget was not credible because of weaknesses in its cost-
estimating processes.[Footnote 7] The task force pointed out that these 
problems occurred because NASA had not instituted or had ignored many 
of the program's control and contract oversight procedures--such as 
preparing a full life-cycle cost estimate--that should have alerted the 
agency to the growing cost problem and the need for mitigating actions. 
According to the cost analysis team that supported the task force, 
NASA's focus on staying within annual budgets instead of managing total 
program costs was perhaps the single greatest factor in the program's 
cost growth.

NASA's unreliable cost estimates have significant implications for 
potential future endeavors, such as those outlined by the President in 
January of this year. Specifically, the President called for a shift in 
NASA's long-term focus, envisioning that NASA will retire the shuttle 
program as soon as assembly of the International Space Station is 
completed, planned for the end of the decade; develop a new crew 
exploration vehicle as well as launch human missions to the moon 
between 2015 and 2020, and build a permanent lunar base as a stepping 
stone for more ambitious missions. To achieve these goals, the 
President proposed spending $12 billion over the next 5 years--about 
$1 billion of which would come from an increase in NASA's budget, 
currently $15.4 billion--with the remaining $11 billion being 
reallocated from existing NASA programs.

Developing reliable cost estimates has been difficult for agencies 
across the federal government. The need for reliable cost estimates is 
at the heart of two of the five governmentwide initiatives in the 2002 
President's Management Agenda (PMA): "improved financial performance" and 
"budget and performance integration."[Footnote 8] 
These initiatives are aimed at ensuring that federal financial systems 
produce accurate and timely information to support operating, budget, 
and policy decisions and that budgets are performance-based. As part of 
these initiatives, the President calls for changes to the budget 
process to better measure the real cost and performance of programs. 
According to the PMA, accomplishing all of the crosscutting initiatives 
will matter little without the integration of agency budgets with 
performance.

Development Cost Estimates Frequently Changed: 

As of April 2003, the baseline development cost estimates for the 
programs we reviewed varied considerably from the programs' initial 
baseline estimates. More than half of the programs' development cost 
estimates increased, and for some programs, this increase was 
significant. The baseline development cost estimates for each of the 10 
programs we reviewed in detail were rebaselined--that is, recalculated 
to reflect new costs, time frames, or resources associated with changes 
in program objectives, deliverables, scope, or plans. 
Although NASA provided specific reasons for the increased cost 
estimates and rebaselinings--such as delays in the development or 
delivery of key system components and funding shortages--it does not 
have guidance for determining when rebaselinings are justified. Such 
criteria are important to instilling discipline in the cost-estimating 
process.

Most of the 27 programs we reviewed experienced a change in their 
development cost estimates. While 8 of the 27 programs experienced 
slight decreases in their development cost estimates, 17 experienced 
cost growth--as much as almost 94 percent. The remaining two programs 
had no change. For 10 of the 17 programs, cost growth was greater than 
25 percent. Table 1 shows the development cost estimate changes from 
the initial baseline to the baseline as of April 2003 and the life-
cycle cost estimate for each of the 27 programs. The 10 programs that 
we reviewed in detail are shaded and italicized. (See app. II for 
assessments of the 10 programs and app. III for descriptions of the 
remaining 17 programs.): 

Table 1: Initial and Current Baseline Development Cost Estimates and 
Life-Cycle Cost Estimates for 27 NASA Programs: 

Then-year dollars in millions.

Program: Space Science; 
Enterprise: Space Infrared Telescope Facility (SIRTF)[B]; 
Baseline development cost estimate: Initial: $472.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: $610.5; 
Baseline development cost estimate: Percent change: 29.3%; 
Life-cycle cost estimate (as of April 2003)[A]: $1,170.6.

Program: Space Science; 
Enterprise: 2003 Mars Exploration Rovers (MERs); 
Baseline development cost estimate: Initial: 657.2; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$767.0; 
Baseline development cost estimate: Percent change: 16.7%; 
Life-cycle cost estimate (as of April 2003)[A]: $806.3.

Program: Space Science; 
Enterprise: Gravity Probe B (GP-B); 
Baseline development cost estimate: Initial: 529.6; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$709.3; 
Baseline development cost estimate: Percent change: 33.9%; 
Life-cycle cost estimate (as of April 2003)[A]: $734.9.

Program: Space Science; 
Enterprise: Stratospheric Observatory for Infrared Astronomy (SOFIA); 
Baseline development cost estimate: Initial: 234.8; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$373.0; 
Baseline development cost estimate: Percent change: 58.9%; 
Life-cycle cost estimate (as of April 2003)[A]: $604.5.

Program: Space Science; 
Enterprise: Solar Terrestrial Relations Observatory (STEREO); 
Baseline development cost estimate: Initial: 404.7; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$302.1; 
Baseline development cost estimate: Percent change: (25.4)%; 
Life-cycle cost estimate (as of April 2003)[A]: $423.0.

Program: Space Science; 
Enterprise: Mercury Surface, Space Environment, Geochemistry and 
Ranging (MESSENGER); 
Baseline development cost estimate: Initial: 325; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$235.1; 
Baseline development cost estimate: Percent change: (27.7)%; 
Life-cycle cost estimate (as of April 2003)[A]: $337.7.

Program: Space Science; 
Enterprise: Herschel; 
Baseline development cost estimate: Initial: 103.7; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$72.7; 
Baseline development cost estimate: Percent change: (29.9)%; 
Life-cycle cost estimate (as of April 2003)[A]: $277.6.

Program: Space Science; 
Enterprise: Thermosphere, Ionosphere, Mesosphere Energetics and 
Dynamics (TIMED); 
Baseline development cost estimate: Initial: 176.8; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$176.2; 
Baseline development cost estimate: Percent change: (0)%; 
Life- cycle cost estimate (as of April 2003)[A]: $253.5.

Program: Space Science; 
Enterprise: estimate: Current (as of April 2003)[A]: $80.4; 
Baseline development cost estimate: Percent change: (19.0)%; 
Life-cycle cost estimate (as of April 2003)[A]: $146.4.

Program: Space Science; 
Enterprise: Rosetta; 
Baseline development cost estimate: Initial: 28.4; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$40.1; 
Baseline development cost estimate: Percent change: 41.2%; 
Life-cycle cost estimate (as of April 2003)[A]: $106.0.

Program: Space Science; 
Enterprise: International Gamma-Ray Astrophysics Laboratory (INTEGRAL); 
Baseline development cost estimate: Initial: 8.2; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$11.9; 
Baseline development cost estimate: Percent change: 45.1%; 
Life-cycle cost estimate (as of April 2003)[A]: $51.2.

Program: Earth Science; 
Enterprise: Terra; 
Baseline development cost estimate: Initial: 1,309.1; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$1,393.2; 
Baseline development cost estimate: Percent change: 6.4%; 
Life- cycle cost estimate (as of April 2003)[A]: $1,451.7.

Program: Earth Science; 
Enterprise: Aqua; 
Baseline development cost estimate: Initial: 1,005.5; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$952.4; 
Baseline development cost estimate: Percent change: (5.3)%; 
Life-cycle cost estimate (as of April 2003)[A]: $1,050.6.

Program: Earth Science; 
Enterprise: Aura; 
Baseline development cost estimate: Initial: 762.5; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$764.6; 
Baseline development cost estimate: Percent change: 0.3%; 
Life- cycle cost estimate (as of April 2003)[A]: $788.5.

Program: Earth Science; 
Enterprise: Landsat-7; 
Baseline development cost estimate: Initial: 445.8; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$508.8; 
Baseline development cost estimate: Percent change: 14.1%; 
Life-cycle cost estimate (as of April 2003)[A]: $508.8.

Program: Earth Science; 
Enterprise: New Millennium Program Earth Observing-1 (NMP-EO-1); 
Baseline development cost estimate: Initial: $111.7; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$176.4; 
Baseline development cost estimate: Percent change: 57.9%; 
Life-cycle cost estimate (as of April 2003)[A]: $192.5.

Program: Earth Science; 
Enterprise: SeaWinds; 
Baseline development cost estimate: Initial: 130.2; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$148.8; 
Baseline development cost estimate: Percent change: 14.3%; 
Life-cycle cost estimate (as of April 2003)[A]: $160.1.

Program: Earth Science; 
Enterprise: Cloud-Aerosol Lidar and Infrared Pathfinder Satellite 
Observations (CALIPSO); 
Baseline development cost estimate: Initial: 98.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$133.9; 
Baseline development cost estimate: Percent change: 36.6%; 
Life-cycle cost estimate (as of April 2003)[A]: $150.9.

Program: Earth Science; 
Enterprise: Jason-1; 
Baseline development cost estimate: Initial: 77.5; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$87.8; 
Baseline development cost estimate: Percent change: 13.3%; 
Life-cycle cost estimate (as of April 2003)[A]: $127.8.

Program: Biological and Physical Research; 
Enterprise: Fluids and Combustion Facility (FCF); 
Baseline development cost estimate: Initial: 118.9; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$114.1; 
Baseline development cost estimate: Percent change: (4.0)%; 
Life-cycle cost estimate (as of April 2003)[A]: $132.0.

Program: Aeronautics; 
Enterprise: Hyper-X (X-43A); 
Baseline development cost estimate: Initial: 167.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$227.0; 
Baseline development cost estimate: Percent change: 35.9%; 
Life-cycle cost estimate (as of April 2003)[A]: $[C].

Program: Space Flight; 
Enterprise: Alternate Turbopump Program (ATP); 
Baseline development cost estimate: Initial: 1,056.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$764.0; 
Baseline development cost estimate: Percent change: (27.7)%; 
Life-cycle cost estimate (as of April 2003)[A]: $982.0.

Program: Space Flight; 
Enterprise: Cockpit Avionics Upgrade (CAU); 
Baseline development cost estimate: Initial: 442.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$454.0; 
Baseline development cost estimate: Percent change: 2.7%; 
Life-cycle cost estimate (as of April 2003)[A]: $514.0.

Program: Space Flight; 
Enterprise: Advanced Health Management System Phase I (AHMS Phase 1); 
Baseline development cost estimate: Initial: 55.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$55.0; 
Baseline development cost estimate: Percent change: (0)%; 
Life-cycle cost estimate (as of April 2003)[A]: $55.0.

Program: Space Flight; 
Enterprise: Tracking and Data Relay Satellite Replenishment (TDRS); 
Baseline development cost estimate: Initial: 937.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$518.1; 
Baseline development cost estimate: Percent change: (44.7)%; 
Life-cycle cost estimate (as of April 2003)[A]: $[D].

Program: Space Flight; 
Enterprise: X-38 Crew Return Vehicle (CRV); 
Baseline development cost estimate: Initial: 792.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$1,025.0; 
Baseline development cost estimate: Percent change: 29.4%; 
Life-cycle cost estimate (as of April 2003)[A]: $[E].

Program: Space Flight; 
Enterprise: Checkout and Launch Control System (CLCS); 
Baseline development cost estimate: Initial: 206.0; 
Baseline development cost estimate: Current (as of April 2003)[A]: 
$399.0; 
Baseline development cost estimate: Percent change: 93.7%; 
Life-cycle cost estimate (as of April 2003)[A]: $[F]. 

Source: NASA.

Note: The draft NASA Cost Estimating Handbook 2002 defines then-year 
dollars as dollars that are escalated into the time period of 
performance of a contract. It further states that then-year dollars are 
sometimes referred to as escalated costs, inflated costs, or real-year 
dollars. NASA normally uses the term "real-year dollars." (A brief 
illustration of escalating base-year dollars into then-year dollars 
follows the table.)

[A] Includes launch vehicle cost where applicable.

[B] SIRTF was renamed the Spitzer Space Telescope in December 2003.

[C] Because Hyper-X is classified as a test program, there is no life-
cycle cost estimate.

[D] A life-cycle cost estimate was not developed for the Tracking and 
Data Relay Satellite Replenishment program because it is currently in 
pre-phase A (conceptual definition). According to a NASA official, a 
life-cycle cost estimate will be determined before it enters phase C/D 
(design, development, test, and evaluation).

[E] A life-cycle cost estimate was not developed for the X-38 Crew 
Return Vehicle program because the program was cancelled in 2003, and 
the program's contracts remained undefinitized at termination--that is, 
the final price or estimated cost and fee were not negotiated and 
mutually agreed to by NASA and the contractor.

[F] A life-cycle cost estimate was not developed for the CLCS program 
because it was canceled due to excessive cost growth.

[End of table]
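
The note above describes then-year (escalated) dollars. As a purely 
illustrative sketch--using a hypothetical spending profile and an assumed 
constant inflation rate, not NASA's actual escalation indices--the 
following Python fragment shows how base-year amounts are escalated into 
then-year dollars of the kind reported in table 1: 

    # Illustrative sketch: converting base-year dollars to then-year
    # (escalated) dollars. The spending profile and inflation rate are
    # hypothetical, not NASA data.

    base_year_profile = {2001: 100.0, 2002: 150.0, 2003: 120.0}  # $ millions, base-year 2001 dollars
    assumed_inflation = 0.025  # assumed constant annual escalation rate

    def to_then_year(profile, base_year, rate):
        """Escalate each year's base-year amount into the dollars of the
        year in which the work is performed (then-year dollars)."""
        return {year: amount * (1 + rate) ** (year - base_year)
                for year, amount in profile.items()}

    then_year_profile = to_then_year(base_year_profile, 2001, assumed_inflation)
    print(then_year_profile)                 # escalated amounts by year
    print(sum(then_year_profile.values()))   # total estimate in then-year dollars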

The development cost estimates for each of the 10 programs that we 
reviewed in detail have been rebaselined--for some programs, as many as 
four times--and for 7 of the 10 programs, the cost estimate increased 
each time it was rebaselined (see fig. 1).

Figure 1: History of Rebaselinings of 10 Programs' Development Cost 
Estimates: 

[See PDF for image]

[A] SIRTF was renamed the Spitzer Space Telescope in December 2003.

[B] The baseline development estimates for the Aura and Aqua projects 
were rebaselined once as a result of a restructuring of the overall 
Earth Observing System (EOS) program in 1995 to address affordability 
issues. Before EOS' restructuring, the baseline was $524 million for 
Aura and $1.2 billion for Aqua. However, according to NASA officials, 
both the Congress and NASA recognize the revised baseline estimates as 
the initial baseline estimates.

[C] Landsat-7's initial baseline development estimate was established 
by the Department of Defense (DOD), which originally had responsibility 
for managing the program. A 1994 Presidential Directive later 
reassigned the program to a joint NASA, National Oceanic and 
Atmospheric Administration, and U.S. Geological Survey program, with 
NASA having responsibility for the development and launch of the 
satellite and development of the ground system. Landsat-7 also became a 
part of the EOS program. In 1995, NASA established a revised initial 
baseline development estimate for Landsat-7, which according to NASA 
officials is recognized by the Congress and NASA as the initial 
baseline development estimate. DOD's initial baseline estimate was not 
available.

[D] CLCS was rebaselined twice, but the second rebaselined estimate for 
CLCS was not established because NASA terminated the program due to the 
program's excessive cost growth.

[End of figure]

For the 10 programs we reviewed in detail, NASA cited specific reasons 
for changes in the baseline development cost estimates and the 
recalculated baselines--many of which were related to technical 
problems and subsequent delays in the development or delivery of key 
system components, and insufficient funding and reserves, as 
illustrated in the following examples: 

* Technical problems in the MERs program required a significant 
redesign of components and the development of a new landing system. Two 
of MERs' three rebaselinings were also the result of inadequate 
reserves. According to NASA officials, without the rebaselinings, the 
development cost "to go"[Footnote 9] would have drained the program's 
reserves.

* The increase in CLCS's development cost estimate and rebaselining was 
the result of poorly defined requirements and design, software 
integration problems, and fundamental changes in the project's 
management structure and contractors' approach to the work. The 
project, which experienced an almost 94 percent increase in its 
baseline development cost estimate, was ultimately terminated.

* The GP-B program--which was rebaselined four times--experienced 
significant schedule slippages due to repeated technical problems, 
including failures in the probe's heat exchanger, the need for 
additional testing, payload electronics delays, and thermal vacuum test 
failures.

* Schedule slippages in the SIRTF program--which contributed to 
increases in the program's baseline development cost estimate and four 
rebaselinings of the estimate--were caused by delays in the delivery of 
components, flight software, and the mission operation system as well 
as launch delays that resulted from a handling accident involving a 
global positioning system payload and concerns of delamination on the 
launch vehicle's solid rocket motors.

* Changes in development cost estimates for the CAU program were 
primarily the result of the program's expanded scope, which occurred in 
October 2002, to produce modification kits that would allow the CAU 
upgrade to be installed into the orbiters.

* The Hyper-X program experienced three rebaselinings, and according to 
the project manager, the program will be rebaselined again in the near 
future. The rebaselinings were due to schedule slippages resulting from 
the need to fund an investigation of the problems experienced in the 
first Mach 7 flight vehicle--which was destroyed in flight--and related 
corrective actions to the second Mach 7 flight.[Footnote 10]

Revised contract requirements, funding changes, or the realization that 
program goals are not achievable may require a formal rebaselining. 
However, NASA has not defined or provided guidance or restrictions on 
rebaselining to ensure that programs consistently and appropriately 
apply rebaselinings and do not adjust their baseline cost estimates 
whenever the estimates become unmanageable. Further, NASA lacks a 
process for systematically identifying and assessing programs that are 
not achieving their cost, schedule, and performance goals. Such a 
process has been employed by the Department of Defense (DOD), which 
also relies heavily on contractors to deliver complex, cutting-edge 
technologies to meet its mission. Specifically, DOD must report to the 
Congress programs that incur a cost growth of 15 percent or more in the 
program baseline. Moreover, DOD must justify the continuation of 
acquisition programs that incur a cost growth of 25 percent or more in 
the program baseline by certifying that specific criteria have been 
met--including that the new cost estimates are reasonable.[Footnote 11] 
Under such a process, 5 of the 10 programs that we reviewed in detail 
would have been required to report to the Congress, and 4 of 
the 5 programs would have had to certify that their new cost estimates 
were reasonable.
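
To make the DOD-style thresholds described above concrete, the following 
sketch applies the 15 percent reporting trigger and the 25 percent 
certification trigger to hypothetical baseline and current development 
estimates; the program names and dollar amounts are invented for 
illustration only: 

    # Illustrative sketch of the DOD-style cost-growth thresholds described
    # above: growth of 15 percent or more triggers a report to the Congress,
    # and growth of 25 percent or more also requires certifying that the new
    # estimate is reasonable. Program names and values are hypothetical.

    def cost_growth(initial, current):
        return (current - initial) / initial * 100.0

    def required_actions(initial, current):
        growth = cost_growth(initial, current)
        actions = []
        if growth >= 15.0:
            actions.append("report to the Congress")
        if growth >= 25.0:
            actions.append("certify new estimate is reasonable")
        return growth, actions

    programs = {"Program A": (500.0, 540.0),   # 8 percent growth, no action
                "Program B": (400.0, 480.0),   # 20 percent growth, report only
                "Program C": (300.0, 420.0)}   # 40 percent growth, report and certify

    for name, (initial, current) in programs.items():
        growth, actions = required_actions(initial, current)
        print(f"{name}: {growth:.1f}% growth -> {actions or ['no action']}")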

Poor Estimating Processes and Methodologies Contributed to 
Wide Variations in Baseline Cost Estimates: 

NASA has yet to implement a well-defined process for estimating the 
cost of its programs--a weakness we and NASA's Inspector General have 
repeatedly reported.[Footnote 12] Recognizing the need for such a 
process, NASA developed a cost-estimating handbook in 2002--the first 
such guidance provided to its cost-estimating community and program and 
project managers.[Footnote 13] Despite this effort, the programs we 
reviewed failed to follow key cost-estimating processes, including 
developing and documenting full life-cycle cost estimates, summarizing 
estimates according to the current breakdown of work to be performed, 
conducting an uncertainty analysis, performing an independent review of 
contractors' cost estimates, and later using earned value management 
(EVM) to assess progress.[Footnote 14]

Reflecting Office of Management and Budget (OMB) guidance and best 
practices of government and industry leaders, NASA requires that full 
life-cycle cost estimates be prepared using full cost 
accounting,[Footnote 15] that estimates be summarized according to the 
current breakdown of work to be performed, and that major changes be 
tracked to the life-cycle cost. In its draft cost-estimating handbook, 
NASA lists a number of steps that are integral to preparing a reliable 
life-cycle cost estimate, including preparing or obtaining a cost 
analysis requirements description (CARD),[Footnote 16] developing 
ground rules and assumptions, and developing cost range and risk 
assessments.
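
To illustrate how summarizing an estimate against the breakdown of work to 
be performed supports a life-cycle cost estimate, the following sketch 
rolls hypothetical work breakdown structure elements up into phase totals 
and a life-cycle total; the element names, phases, and amounts are 
invented and do not reflect any NASA program: 

    # Illustrative sketch: rolling a hypothetical work breakdown structure
    # (WBS) up into phase totals and a full life-cycle cost. Element names
    # and amounts are invented for illustration only.

    wbs = [
        # (WBS element, life-cycle phase, estimate in $ millions)
        ("1.1 Spacecraft bus",      "development", 180.0),
        ("1.2 Science instruments", "development", 140.0),
        ("1.3 Ground system",       "development",  60.0),
        ("2.1 Launch services",     "launch",       90.0),
        ("3.1 Mission operations",  "operations",   70.0),
        ("4.1 Disposal",            "disposal",      5.0),
    ]

    phase_totals = {}
    for element, phase, estimate in wbs:
        phase_totals[phase] = phase_totals.get(phase, 0.0) + estimate

    life_cycle_total = sum(phase_totals.values())
    print(phase_totals)        # cost summarized by phase
    print(life_cycle_total)    # full life-cycle cost estimate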

Carnegie Mellon University's Software Engineering Institute 
(SEI)[Footnote 17] echoes the need for reliable cost-estimating 
processes in managing software implementations--identifying tasks to be 
estimated, mapping the estimates to the breakdown of work to be 
performed, and identifying and explaining assumptions are among SEI's 
requisites for producing reliable cost estimates. To evaluate the cost-
estimating processes of the 10 NASA programs that we reviewed in 
detail, we selected 14 criteria based on SEI checklists (see table 
2).[Footnote 18] Many of these criteria are included in NASA's cost-
estimating guidance.

Table 2: Summary of Criteria Used to Assess 10 NASA Programs Reviewed: 

Criterion: The objectives of the estimate are stated in writing; 
Purpose/Significance: The objectives of the program must be clearly 
stated in a concise document for the cost estimator to use to develop 
the cost estimate. NASA guidance states that NASA programs and projects 
are to be defined as activities that have defined objectives along with 
goals and requirements.

Criterion: The life cycle to which the estimate applies is clearly 
defined; Purpose/Significance: The life cycle must be clearly defined 
to ensure that the full cost of the program--that is, all direct and 
indirect costs for planning, procurement, operations and maintenance, 
and disposal--is captured. The draft NASA cost-estimating handbook 
states that a life-cycle cost estimate provides "an exhaustive 
accounting of all resources necessary to develop, deploy or field, 
operate, maintain, and dispose of a system over its lifetime." The 
handbook defines life cycle as the program's or project's "total life, 
beginning with mission feasibility and extending through operation and 
disposal or conclusion of the system or program.".

Criterion: The task has been appropriately sized; Purpose/
Significance: This criterion asks whether the appropriate metric was used 
in developing the estimate, such as the size of a software product with 
the expected amount of reuse.

Criterion: The estimated cost and schedule are consistent with 
demonstrated accomplishments on other projects; Purpose/Significance: 
In other words, estimates have been validated by relating them back to 
demonstrated performance on completed projects.

Criterion: A written summary of parameter values and their rationales 
accompany the estimate; Purpose/Significance: This criterion refers to 
the underlying cost-estimating methodology. If a parametric equation 
was used to generate the estimate, then the parameters that feed the 
equation must be provided along with an explanation of why they were 
chosen.

Criterion: Assumptions have been identified and explained; Purpose/
Significance: The draft NASA cost-estimating handbook states that 
assumptions are a critical step in any estimate and should be presented 
prominently in all documentation for the estimate. Accurate assumptions 
can prevent inaccurate or misleading estimates.

Criterion: A structured process such as a template or format has been 
used to ensure that key factors have not been overlooked; Purpose/
Significance: This criterion refers to whether the program has 
established a work breakdown structure (WBS)--that is, a structure that 
organizes, defines, and graphically displays the individual work units 
to be performed. NASA policy guidance calls for breaking down work into 
smaller units to facilitate cost-estimating and project and contract 
management as well as to help ensure that all relevant costs are 
captured. The guidance requires that a preliminary WBS be developed 
during the formulation phase, and that a final WBS be generated 
following contractor selection or approval to implement. The guidance 
further requires that programs describe the overall WBS structure and 
the content of each individual element of the WBS.

Criterion: Uncertainties in parameter values have been identified and 
quantified; Purpose/Significance: Again, this criterion refers to the 
underlying cost-estimating methodology. For all major cost drivers, an 
uncertainty analysis should be performed to assess the risk associated 
with the cost estimate.

Criterion: If a dictated schedule has been imposed, an estimate of the 
normal schedule has been compared to the additional expenditures 
required to meet the dictated schedule.[A]; Purpose/Significance: This 
criterion asks whether a dictated schedule was imposed on the program, 
that is, whether the program was forced to accelerate the schedule in 
order to meet requirements. If this occurred, then the impacts to the 
cost estimate need to be calculated and provided.

Criterion: If more than one cost model or estimating approach has been 
used, any differences in results have been analyzed and explained; 
Purpose/Significance: This criterion checks to ensure that the primary 
methodology or cost model results are consistent with any secondary 
methodology (for example, cross checks) performed.

Criterion: Estimators independent of the performing organization 
concurred with the reasonableness of the parameter values and 
estimating methodology; Purpose/Significance: NASA policy guidance 
states, "when a project under a program has an estimated NASA life-
cycle cost greater than $150 million, an independent life-cycle cost 
analysis is required during formulation in conjunction with initiating 
the preliminary design."

Criterion: Estimates are current; Purpose/Significance: Estimates 
should be updated whenever changes to requirements affect cost or 
schedule. NASA policy guidance requires that the life-cycle cost 
estimate be updated prior to each budget submission.

Criterion: The results of the estimate have been integrated with 
project planning and tracking; Purpose/Significance: NASA policy 
guidance requires that a life-cycle cost estimate be developed to 
establish a program/project commitment, be assessed at major reviews, and 
be updated for each budget submission using currently available full cost 
initiative guidance.

Criterion: Earned value reporting has been used to manage the program; 
Purpose/Significance: NASA policy guidance requires program and project 
managers to "ensure that EVM provisions and requirements are included 
in requests for proposals and contracts and ensure that an effective 
surveillance program is in place to provide assurance that EVM data are 
valid and that the contractor's integrated management system remains in 
compliance with the EVM criteria." The guidance further requires each 
program and project to periodically generate estimates at completion, 
perform cost and schedule variance analyses based upon pre-established 
thresholds, and prepare corrective action plans where necessary.

Sources: NASA and SEI.

[A] Does not apply to all programs.

[End of table]

Despite NASA requirements and OMB and SEI guidance, few of the 
10 programs that we reviewed in detail met even a third of these 
criteria; only one met half. Further, none of the programs fully met 
certain key criteria. For example, none provided a complete life cycle 
with definitions or a complete description of the methodology--such as 
data sources and uncertainties--used to generate the cost estimate. 
According to the draft NASA cost-estimating handbook, a 
reliable life-cycle cost estimate is critical to making realistic 
decisions about developing or producing a system and to determining the 
appropriate scope or size of a program. NASA guidance also calls for 
breaking down the work to be performed into smaller units to facilitate 
cost estimating and program and contract management and to help ensure 
relevant costs are not omitted. However, only 3 of the 10 programs 
provided a complete breakdown of the work to be performed. Table 3 
shows for each program the applicable criteria that were met, partially 
met, or not met.[Footnote 19] (See app. II for a program-by-program 
assessment.): 

Table 3: Summary of Extent 10 NASA Programs Met Assessment Criteria: 

Criteria for cost estimating: Objectives stated in writing; 
Space science: GP-B: Not met; 
Space science: MERs: Partially met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Not met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Life cycle clearly defined; 
Space science: GP-B: Partially met; 
Space science: MERs: Partially met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Partially met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Partially met. 

Criteria for cost estimating: Tasks appropriately sized; 
Space science: GP-B: Not met; 
Space science: MERs: Not met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Not met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Estimates based on demonstrated programs; 
Space science: GP-B: Not met; 
Space science: MERs: Partially met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Not met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Partially met. 

Criteria for cost estimating: Parameter values and rationale 
documented; 
Space science: GP-B: Not met; 
Space science: MERs: Not met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Not met; 
Earth science: Aqua: Not met; 
Earth science: Aura: Not met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Partially met. 

Criteria for cost estimating: Assumptions identified and explained; 
Space science: GP-B: Partially met; 
Space science: MERs: Partially met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Not met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Structured format captures all costs; 
Space science: GP-B: Partially met; 
Space science: MERs: Met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Met; 
Aeronautics: Hyper-X: Partially met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Uncertainties identified and quantified; 
Space science: GP-B: Not met; 
Space science: MERs: Not met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Not met; 
Earth science: Aura: Not met; 
Biological and physical research: FCF: Not met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Not met; 
Space flight: CAU: Partially met.

Criteria for cost estimating: Accelerated schedules show cost impacts; 
Space science: GP-B: Partially met; 
Space science: MERs: Partially met; 
Space science: SIRTF: N/A; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: N/A; 
Biological and physical research: FCF: N/A; 
Aeronautics: Hyper-X: N/A; 
Space flight: CLCS: Partially met; 
Space flight: CAU: N/A.

Criteria for cost estimating: More than one estimating approach used; 
Space science: GP-B: Not met; 
Space science: MERs: Not met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Not met; 
Earth science: Aqua: Not met; 
Earth science: Aura: Not met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Not met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Partially met. 

Criteria for cost estimating: Independent and program estimates concur; 
Space science: GP-B: Partially met; 
Space science: MERs: Met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Partially met; 
Space flight: CLCS: Met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Estimates reflect changes over time; 
Space science: GP-B: Partially met; 
Space science: MERs: Met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Met; 
Biological and physical research: FCF: Partially met; 
Aeronautics: Hyper-X: Partially met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Estimates used for program tracking; 
Space science: GP-B: Met; 
Space science: MERs: Met; 
Space science: SIRTF: Met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Met; 
Aeronautics: Hyper-X: Met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Criteria for cost estimating: Earned value reporting used; 
Space science: GP-B: Partially met; 
Space science: MERs: Partially met; 
Space science: SIRTF: Partially met; 
Earth science: Landsat-7: Partially met; 
Earth science: Aqua: Partially met; 
Earth science: Aura: Partially met; 
Biological and physical research: FCF: Met; 
Aeronautics: Hyper-X: Partially met; 
Space flight: CLCS: Partially met; 
Space flight: CAU: Met. 

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

Failing to meet these criteria puts programs at certain risk. For 
example, underestimating a program's full life-cycle costs creates the 
risk that a program could be underfunded and subject to major cost 
overruns, which would ultimately result in the program being reduced in 
scope or additional funding being requested and appropriated to ensure 
the program meets its objectives. Conversely, overestimating life-cycle 
costs creates the risk that a program will be deemed unaffordable and 
would, therefore, go unfunded. Without a complete WBS, NASA programs 
cannot ensure that the life-cycle cost estimates have captured all 
relevant costs, which again can result in underfunding and cost 
overruns. Further, inconsistent WBS estimates across programs can 
create problems of double counting or, worse, underestimating costs 
when using historical program costs as a basis for projecting future 
costs on similar programs.

Despite the uncertainty inherent in estimating the cost of emerging 
technologies, all of the 10 programs we reviewed also failed to conduct 
an uncertainty analysis to assess risks associated with the cost 
estimates. Instead, the programs expressed their cost estimates as 
point values--which implies certainty--not as ranges or numbers with 
confidence levels.[Footnote 20] Performing an uncertainty analysis, 
such as a Monte Carlo simulation,[Footnote 21] quantifies the amount of 
cost risk within a program. Only by quantifying the cost risk can 
management make informed decisions about risk mitigation strategies. 
Quantifying cost risks also provides a benchmark against which future 
progress can be measured. Without this knowledge, NASA may have little 
specific basis to determine adequate financial reserves, schedule 
margins, and technical performance margins to provide managers the 
flexibility needed to address programmatic, technical, cost, and 
schedule risks, as required by NASA policy.
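
As a minimal sketch of the kind of uncertainty analysis described 
above--using hypothetical cost elements, ranges, and a triangular 
distribution rather than any program's actual data--the following Monte 
Carlo simulation produces a cost distribution from which a confidence 
level, rather than a single point value, can be read: 

    # Minimal Monte Carlo sketch of a cost-risk (uncertainty) analysis, as
    # discussed above. Each element has a hypothetical low / most-likely /
    # high range modeled with a triangular distribution; the distribution of
    # total cost yields confidence levels instead of a point estimate.

    import random
    random.seed(1)

    cost_elements = {                      # $ millions, hypothetical ranges
        "spacecraft":    (150.0, 180.0, 240.0),
        "instruments":   (100.0, 140.0, 210.0),
        "ground system": ( 40.0,  60.0,  90.0),
    }

    trials = 10_000
    totals = []
    for _ in range(trials):
        total = sum(random.triangular(low, high, mode)
                    for low, mode, high in cost_elements.values())
        totals.append(total)

    totals.sort()
    point_estimate = sum(mode for _, mode, _ in cost_elements.values())
    p70 = totals[int(0.70 * trials)]       # 70th-percentile (confidence-level) cost
    print(f"Point estimate (most likely values): {point_estimate:.0f}")
    print(f"70 percent confidence level:         {p70:.0f}")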

Seven of the 10 programs also failed to have an independent review of 
contractors' cost estimates--as required by NASA. Instead, programs 
established their budgets based on contractor proposals--a practice that 
is particularly problematic because contractors may bid low to win the 
contract. To ensure contractor costs are realistic, NASA procedures and 
guidelines specifically require programs to ensure that independent 
reviews are conducted and that these reviews address project life-cycle 
costs, risk management plans, as well as technical issues. Without such 
reviews, NASA decision makers lacked the benchmarks needed to assess 
the reasonableness of the contractors' proposed costs, limiting NASA's 
ability to make sound investment decisions and accurately assess 
contractor performance.

Finally, only two programs used EVM--an approach used by DOD and 
leading companies to provide meaningful assessments of a program's 
progress by comparing the value of work performed to its costs, rather 
than the traditional management approach of comparing budgeted and 
actual costs, which can provide a distorted view of a program's 
progress. (For a detailed discussion of EVM, see app. IV.) By using the 
value of completed work as a basis for estimating the cost and time 
needed to complete the program, EVM can alert program managers to 
potential problems early in the program. NASA requires that EVM be used 
on all significant contracts--that is, research and development 
contracts with a total anticipated final value of $70 million or more, 
and production contracts with a total anticipated final value of $300 
million or more--which includes all of the 10 programs we reviewed in 
detail.[Footnote 22] Although the program managers for all 10 programs 
stated that EVM was used in their projects, only two programs provided 
cost performance reports, indicating a true EVM process was in place. 
The remaining eight programs relied on NASA Form 533, which captures 
planned and actual obligations and expenditures--not the value of the 
work performed.[Footnote 23] Without a true EVM process, program managers 
cannot readily determine whether a program is at risk of cost and schedule 
overruns until it is too late to make programmatic changes to avoid 
these risks.
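
The following is a minimal, purely illustrative sketch of the basic 
earned value calculation; the dollar figures are hypothetical and are 
not drawn from the programs we reviewed: 

# Illustrative earned value management (EVM) sketch (hypothetical figures, $ millions).
# BCWS = budgeted cost of work scheduled (planned value)
# BCWP = budgeted cost of work performed (earned value)
# ACWP = actual cost of work performed

budget_at_completion = 400.0  # total budgeted development cost
bcws = 120.0                  # work planned to date
bcwp = 100.0                  # value of the work actually completed to date
acwp = 140.0                  # what that completed work actually cost

cost_variance = bcwp - acwp        # negative means over cost
schedule_variance = bcwp - bcws    # negative means behind schedule
cpi = bcwp / acwp                  # cost performance index
spi = bcwp / bcws                  # schedule performance index
estimate_at_completion = budget_at_completion / cpi

print(f"Cost variance: {cost_variance:+.1f}  Schedule variance: {schedule_variance:+.1f}")
print(f"CPI: {cpi:.2f}  SPI: {spi:.2f}")
print(f"Estimate at completion: {estimate_at_completion:.0f} "
      f"(vs. {budget_at_completion:.0f} budgeted)")

A budget-versus-actual comparison alone (120 planned versus 140 spent) 
would show only a 20 overrun; the earned value calculation shows that the 
completed work is worth only 100, so the hypothetical program is both over 
cost and behind schedule and is trending toward roughly 560 at completion.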

NASA Has Begun to Address Certain Barriers to Effective 
Cost Estimating: 

There are several impediments that NASA needs to overcome to implement 
effective cost-estimating practices. These include the lack of reliable 
financial data and other performance information; lack of trained EVM 
staff, data analysis tools, and incentive for supporting and 
implementing EVM; and ineffective use of cost analysts. NASA has 
initiated several measures to begin addressing some of these 
impediments.

Utility of Cost-Estimating Tools Depends on the Reliability of NASA's 
Financial and Performance Data: 

According to NASA officials, state-of-the-art cost-estimating tools 
have been funded and implemented. For example, NASA officials told us 
that commercial-off-the-shelf models have been used to estimate 
hardware and software acquisition costs and quantify the level of 
uncertainty surrounding cost estimates. However, these cost-estimating 
tools are only as good as the data they rely on to develop the 
estimates. For more than a decade, we have reported that NASA has 
failed to develop a system to capture reliable financial and 
performance information, posing significant challenges to NASA's 
ability to estimate and control program costs. Over the past year 
alone, we issued numerous reports on NASA's Integrated Financial 
Management Program (IFMP)--the agency's third and most recent effort to 
implement a modern, integrated financial management system. 
Specifically, we found that IFMP--which is under the responsibility of 
the Program Executive Officer for IFMP--will not, as it is being 
implemented, routinely provide program managers and other key 
stakeholders and decision makers--including the Congress--with the 
financial-related information needed to measure program performance and 
ensure accountability. For example, the core financial module 
(considered the backbone of the system) does not appropriately capture 
property, plant, and equipment, as well as material in its general 
ledger at the transaction level--which is needed to provide independent 
control over these assets. In addition, NASA implemented the system 
before it had the capability to capture the full costs of its programs 
and projects. According to headquarters officials, collecting 
nonfinancial data crucial to cost estimating--such as technology 
readiness levels, parts counts, and team and management experience and 
skill ratings--has also been difficult.

Use of EVM Has Been Undermined by a Lack of Trained Staff, Data 
Analysis Tools, and Incentive: 

According to headquarters officials, agencywide EVM implementation 
efforts began in 1996 and are recognized by NASA management as a key 
tool in monitoring and measuring cost trends in higher risk project 
elements--a tool that serves as an early warning of the need for cost-
risk mitigation actions to maintain control of program costs. These 
officials stated that EVM has been applied to the International Space 
Station Program[Footnote 24] and with varying levels of emphasis to 
other programs and projects at different NASA centers.[Footnote 25] 
While all of the program managers for the 10 programs that we reviewed 
in detail stated that they used EVM, only 2 of the programs used a true 
EVM process.

NASA headquarters officials identified several challenges that have 
affected the agency's ability to implement EVM effectively, including a 
lack of staff and data analysis tools. According to officials, resource 
constraints have prevented the agency from staffing many project 
offices with appropriate personnel to fulfill all project functions. In 
addition, little or no priority has been placed on including a trained EVM 
analyst, even if one were available. Headquarters officials also noted 
that EVM has been hampered by the lack of a practical automated 
software data analysis tool. Without such a tool, analyzing the 
contractors' EVM cost performance reports, which contain significant 
amounts of data, became a cumbersome undertaking that often resulted in 
incomplete and untimely analyses that were of little use in informing 
management decisions. A lack of incentive to support EVM has further 
undermined its use. Some project managers whom we spoke with are 
skeptical about the benefits of EVM and argue that it has failed to 
help them manage or control program costs. According to NASA 
headquarters officials, during proposal and contract negotiation 
phases, contractors have also suggested not using EVM as a way to 
reduce contract costs. While EVM was included in most contracts for the 
10 programs we reviewed in detail--as required by NASA policy--it was 
used only in two programs as a cost-estimating tool. In general, EVM 
has been viewed by NASA as a financial reporting tool. Consequently, 
there is little incentive to use EVM because the data needed to report 
financial activity is captured elsewhere, such as in Form 533.

Ineffective Use and Placement of Cost Analysts across the Agency's Cost 
Activities also Hinders NASA's Efforts to Improve Its Cost-Estimating 
Practices: 

NASA's efforts to improve its cost-estimating processes have also been 
undermined by ineffective use of its limited number of cost-estimating 
analysts. For example, headquarters officials state that as projects 
entered the formulation phase, they have typically relied on program 
control and budget specialists--not cost analysts--to provide the 
financial services to manage projects. Yet budget specialists are 
generally responsible for obligating and expending funding--not for 
conducting cost analyses that underlie the budget or ensuring budgets 
are based on reasonable cost estimates--and, therefore, tend to assume 
that the budget is realistic. While NASA officials state that its cost-
estimating staff is too limited to be involved in day-to-day project 
execution activities, they agreed that the cost analysts could be more 
effectively used throughout the life cycle--particularly when projects 
are rebaselined and independent cost estimates of project changes must 
be performed.

In some cases, cost analysts are not appropriately located in the 
organization, which may compromise controls NASA has in place to ensure 
reasonable cost estimates. For example, some cost analysts at NASA's 
centers are located with senior systems engineers in systems management 
organizations, while others are not. According to NASA officials, 
housing the cost analysts with senior systems engineers has two key 
benefits. First, the systems engineers generally conduct systems 
analyses to help ensure that a program's requirements are properly 
established and that the design is valid and meets the requirements. 
Such analyses can greatly inform the development of reasonable cost 
estimates. Second, the systems engineering offices afford some measures 
of independence for cost estimating, which, according to NASA cost-
estimating guidance and procedures, is important to the overall project 
management process. However, NASA officials stated that several of its 
centers' cost analysts are in the advocacy chain of command--not housed 
with senior systems engineers. For example, one center's 15 cost 
analysts work in the center's Office of the Chief Financial Officer--
which is responsible for directing the development and execution of the 
center's budget--not in the systems management organization, which is 
independent from the rest of the center. As a result, the cost 
analysts' estimates may not be adequately informed by the systems 
engineers and may lack the objectivity required to ensure that the 
criteria for independence have been met.

Efforts Under Way to Remove Some Barriers and Improve Cost Estimating: 

NASA has several initiatives under way to improve the agency's 
cost-estimating processes. First, NASA has established a Cost Analysis 
Division in the Office of the Comptroller to strategically manage 
analyses related to directing and funding research, improving cost-
estimating processes and practices, and providing cost-estimating tools 
and training throughout the agency. The division also provides, along 
with the Independent Program Assessment Office (IPAO), the last 
independent cost estimate of projects before the information is 
released externally. These efforts are being coordinated through a 
steering committee composed of the managers of the cost analysis 
organizations from each of the centers and IPAO's deputy director.

NASA is revising the cost sections in its governing procedures and 
guidelines and is finalizing its cost-estimating handbook to reflect 
these changes.[Footnote 26] These documents will require the routine 
use of probabilistic cost risk analysis, a CARD document, cost as an 
independent variable (CAIV), and EVM. The CARD supports the project 
life-cycle cost estimate and a congressionally required independent 
cost estimate. Agency officials note that while there has been some use 
of CARD in the agency, its first concentrated and successful use was in 
the 2001 to 2002 independent cost estimate for the International Space 
Station program. According to headquarters officials, NASA's revised 
guidance and finalized cost-estimating handbook will provide direction 
and guidance for fully implementing the use of CARDs for major 
development projects. Although NASA calls for CAIV to be used routinely 
and notes that CAIV demonstrates a commitment to evolutionary 
acquisition, it has yet to provide guidance on its implementation. NASA 
headquarters officials stated that guidance relating to improvements in 
the collection of cost data is also being reflected in its revised 
governing procedures and guidelines.

With respect to EVM, NASA headquarters officials described several 
efforts under way to ensure agencywide implementation of true EVM. For 
example, NASA recently revised its EVM policy directives to shift 
ownership of EVM responsibilities from NASA's Chief Financial Officer 
to NASA's Chief Engineer, to emphasize that EVM is to be considered a 
project management tool rather than a financial management tool. NASA 
officials also noted that the agency is working to inform managers of 
the performance management capabilities available to them through EVM 
and to emphasize the importance of providing adequate resources and 
management support to ensure successful EVM implementation. Agencywide 
goals for EVM implementation include promoting the effective use of EVM 
and providing needed training and education for program and project 
staff. These efforts and proposed initiatives should help resolve EVM 
utilization problems.

Finally, NASA officials told us that the agency is planning to hire 
additional cost analysts to alleviate understaffing at all of its 
center cost analysis offices. The agency envisions a total staff of 
about 100 cost analysts along with additional support contractors. NASA 
officials also stated that it is necessary to ensure that centers address 
the problem of having cost analysts located in the advocacy chain of 
command--a problem that could affect five NASA centers.

Because NASA's initiatives have only recently been implemented or are 
still in the drafting or planning stage, we cannot determine to what 
degree these efforts will enable NASA to provide reasonable and 
defensible cost estimates of its programs and projects.

Conclusions: 

There are numerous scientific and technical challenges inherent in the 
successful implementation of many NASA programs. Nevertheless, the need 
to choose among competing alternatives within limited budget resources 
makes it essential that the agency and the Congress clearly understand 
the costs and uncertainties of programs proposed for authorization and 
funding. Yet, NASA does not have the disciplined cost-estimating 
process needed to make informed acquisition decisions, nor does the 
agency have processes and tools for capturing, monitoring, and managing 
program costs and schedules within an implementation plan on a timely 
basis. This makes it difficult for senior NASA officials, program and 
project managers, and other key stakeholders to measure performance and 
initiate mitigation measures when needed. Taken together, the lack of 
disciplined and established cost-estimating processes and tools can 
cause program officials to restructure projects to available resources 
rather than develop realistic cost estimates and implementation plans 
for projects. As a result, programs may have to be modified to 
accommodate emerging technical, cost, and schedule realities. 
Ultimately, programs cost more, fail to meet their schedules, or 
deliver less than originally envisioned. To help minimize the project cost 
increases and implementation delays identified in this report, NASA 
needs to instill disciplined cost-estimating processes into its project 
development and approval activities and to ensure such processes are 
integrated with its implementation of an integrated 
financial management system. Without a process that prevents programs 
from proceeding before they have sufficiently demonstrated that key 
cost-estimating criteria have been met, NASA programs will continue to 
be at risk of cost and schedule overruns.

Recommendations for Executive Action: 

Improvements to NASA's cost-estimating processes will partly depend on 
the agency's ability to address recommendations that we made in 
November 2003 to help ensure NASA effectively implements a modern, 
integrated financial management system.[Footnote 27] Notwithstanding 
the need to address those recommendations, to better position NASA to 
ensure its recent initiatives result in sound cost-estimating practices 
agencywide, we are making three recommendations with minimum suggested 
courses of action. First, we are recommending that the NASA 
Administrator direct the Program Executive Officer for IFMP, the Chief 
Financial Officer, and the Chief Engineer to develop an integrated plan 
for improving cost estimating that, at a minimum, includes specific 
actions for ensuring that: 

* guidance is established on rebaselining and that rebaselining is 
consistently applied to provide accountability among programs,

* true earned value management is used as an organizational management 
tool to bring cost to the forefront in NASA's management decision-
making process,

* acquisition and earned value management policies and procedures are 
enforced, and: 

* staff and support for cost-estimating and earned value analyses are 
effectively used.

In addition, we recommend that the NASA Administrator direct the Chief 
Financial Officer to establish a standard framework for developing 
life-cycle cost estimates. At a minimum the framework should require 
each program or project to: 

* base its cost estimates on a full life cycle for the program--
including all direct and indirect costs for operations and maintenance 
and disposal as well as planning and procurement--and on a work 
breakdown structure that encompasses both in-house and contractor 
efforts,

* prepare a cost analysis requirements description,

* prepare an independent government estimate at each milestone of the 
program, and: 

* conduct a cost risk assessment that identifies the level of 
uncertainty inherent in the estimate.

Further, we recommend that the NASA Administrator develop procedures 
that would prohibit proposed projects from proceeding through the 
review and approval process when they do not address the elements of 
the recommended cost-estimating practices.

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, NASA's Deputy 
Administrator stated that the agency concurs with the recommendations, 
adding that the recommendations validate and reinforce the importance 
of activities under way at NASA to improve cost estimating and program 
management.

Notwithstanding agreement with our recommendations, the Deputy 
Administrator believes NASA has made substantive changes and achieved 
significant improvements in its cost-estimating processes. For example, 
NASA's comments on a draft of this report cite a 1992 GAO report 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/NSIAD-93-97] that 
found a median 77 percent increase in NASA program costs. According to 
the Deputy Administrator, this contrasts with a 13 percent cost growth 
in this present study. While the percentage of cost growth may have 
declined for some projects, such declines are often 
achieved by rescoping and rebaselining projects to remain within 
available resources, as was demonstrated in a number of projects 
discussed in this report. We do not believe the other examples cited by 
the Deputy Administrator, namely termination of the Checkout and Launch 
Control System and cost control measures imposed on the International 
Space Station, demonstrate that NASA has made substantive changes and 
achieved significant improvements in its cost-estimating processes. 
Rather, we believe these examples demonstrate what happens when 
projects are undertaken without a full understanding of the potential 
costs and management challenges inherent in many of the programs NASA 
proposes and then implemented without adequate financial management 
systems in place.

With regard to our recommendation to develop guidelines for 
rebaselining and ensure effective use of earned value management, the 
Deputy Administrator cited the development of revised direction on 
program and project management and a refocus on risk and cost-risk 
analysis. NASA also now requires the establishment of cost thresholds 
that, if exceeded, will require a rebaselining review. Further, because 
much of NASA's work is performed through grants and contracts, NASA's 
revised procedures will emphasize how risk and technical complexity 
affect contractor performance. New earned value management and 
acquisition policies and procedures will be implemented through program 
management councils that will review and approve programs and projects 
regularly through each step of their development. Also, a new Cost 
Analysis Division has been established, and cost-estimating staff has 
been added to it and NASA's Independent Program Assessment Office. NASA 
also noted the importance of training needed to match the new 
requirements.

NASA's Deputy Administrator also concurred with our recommendation to 
establish a standard framework for developing life-cycle cost 
estimates. According to the Deputy Administrator, NASA's new processes 
and procedural requirements document will define the full life-cycle 
cost to include development, operations, maintenance, disposal, and all 
NASA in-house direct and indirect costs to eliminate ambiguity and 
ensure consistency. NASA's revised cost-estimating handbook will 
provide further guidance for life-cycle cost estimates. Also, project 
managers will be responsible for developing and maintaining a cost 
analysis requirements document similar to a tool DOD uses that will 
include the equivalent of a project and technical description; key 
performance parameters, including documentation of actual work 
breakdown structure cost elements; and initial and annual updates of 
the life-cycle cost estimates. NASA guidance will also require periodic 
independent cost estimates on major programs and approval by the 
respective program management council to enter into implementation 
after an independent estimate has been completed.

Lastly, NASA's Deputy Administrator concurred with our recommendation 
to prohibit proposed projects from proceeding through the review and 
approval process when they do not address the elements of the 
recommended cost-estimating practices. Accordingly, NASA's forthcoming 
procedural requirements will define the authority of the program 
management councils that will, according to NASA, enforce the 
requirements, including the required information, documentation, and 
management methods needed for proceeding through the review and 
approval process. The Deputy Administrator also noted the availability 
of recent management information system improvements that enhance 
visibility over project and program performance. In his general 
comments, the Deputy Administrator also stated that NASA had recently 
taken steps to address issues raised in the draft report and suggested 
a report title that would better reflect that progress.

We agree that NASA has initiated a number of reforms to its project 
development and implementation processes that, if properly implemented, 
would be positive steps to addressing many of the problems noted in 
this report. However, we also note that some of these problems have 
been long-standing in the projects discussed in this report and in a 
number of other projects we and NASA's Office of Inspector General have 
reviewed. Furthermore, planned improvements in the past have fallen 
short of agencywide implementation. For example, poor or inadequate 
cost estimates and management oversight have been central to the 
problems that plagued several programs, including those intended to 
develop new space transportation systems and the International Space 
Station program. A reliable financial management structure is central to the 
success of many measures noted by the Deputy Administrator in his 
reply. We recently reported and testified on the impediments that exist 
in achieving such a capability. Finally, we note that contract 
management has been a long-standing problem at NASA. In 1990, we 
identified NASA's contract management function as an area at high risk. 
During that time, there was little emphasis on end results, product 
performance, and cost control. NASA found itself procuring expensive 
hardware that did not work properly. This report shows that these types 
of problems still exist. Regarding the Deputy Administrator's 
suggestion that we revise the title of our report to reflect recent 
progress that NASA has made toward addressing the issues we raise, we 
considered the concerns expressed in his comments and, consistent with 
our stated position that NASA's improvements are positive steps but that 
its problems still persist, we revised the title accordingly. We believe 
NASA's improvements are now properly reflected in our report's title.

Finally, until NASA's integrated financial management system, which is 
central to providing effective management and oversight, is fully 
implemented, performance assessments relying on cost data may be 
incomplete and full costing will be only partially achieved. And until 
these problems are resolved and the measures the Deputy Administrator 
noted in commenting on a draft of this report are fully implemented and 
integrated into the way the agency does business, NASA's contract 
management function will continue to be an area of concern.

As agreed with your office, unless you announce its contents earlier, 
we will not distribute this report further until 30 days from its date. 
At that time, we will send copies to the NASA Administrator and 
interested congressional committees. We will make copies available to 
others upon request. In addition, the report will be available at no 
charge on the GAO Web site at [Hyperlink, http://www.gao.gov].

If you or your staff have any questions concerning this report, please 
contact me at (202) 512-4841 or [Hyperlink, lia@gao.gov]. Key 
contributors to this report are acknowledged in appendix VI.


Signed by: 

Allen Li: 
Director, Acquisition and Sourcing Management: 

[End of section]

Appendixes: 

Appendix I: Scope and Methodology: 

To determine cost estimates in selected NASA programs and any changes 
in those estimates, we asked NASA to provide a list of programs that 
were currently in the development phase, and programs that had 
completed development or were launched in fiscal year 2001 or 2002. We 
also asked NASA to provide the initial baseline development cost 
estimate and current cost estimate for the development phase and life 
of the program, and the reasons for changes to initial development cost 
estimates. NASA identified 68 programs that were currently in 
development or had completed development in fiscal years 2001 and 2002. 
These included planetary missions and Earth observatory, aeronautical 
technology, and space flight systems. From that universe, we selected 
at least one program (10 in total) from 5 of NASA's 7 Enterprises. This 
involved 6 of 9 NASA centers (and the Jet Propulsion Laboratory) with 
lead responsibility for one or more of these programs. Our selection 
was generally based on programs with the highest current development 
cost estimates within an Enterprise. We compared the initial 
development cost estimates NASA provided to the current development 
cost estimates for the programs. The initial development estimates 
generally reflect the projected costs at the time a new program was 
first approved by the Congress. The current development and life-cycle 
cost estimates reflect the latest estimates provided by NASA as of 
April 2003. We also interviewed program officials to obtain additional 
information related to NASA's revisions to initially established 
baseline development cost estimates, including the rationale for 
changes to the cost estimates.

We also analyzed the initial and current development cost estimates for 
17 additional NASA programs, later added to the scope of our review, to 
ascertain the level of cost growth or decline as those programs 
progressed through the development phase.

To assess NASA's cost-estimating processes and methodologies, we used 
cost-estimating criteria developed by Carnegie Mellon University's 
Software Engineering Institute (SEI) designed to assess the reliability 
of project cost and schedule estimates. SEI is a government-funded 
research organization that is widely considered an authority on 
software implementation. SEI developed checklists with these criteria 
to help evaluate software costs and schedule; however, SEI states that 
these checklists are equally applicable to hardware and systems 
engineering projects. We first analyzed NASA's cost-estimating 
procedures and guidelines to determine if they incorporated key 
components of good cost-estimating practices advocated by SEI and other 
experts.

Based on that analysis, we selected 14 criteria from two SEI 
reports[Footnote 28] to use in assessing NASA's cost-estimating 
practices for the 10 programs we selected to review in detail. Our 
selection of the 14 criteria from the SEI reports was based, in part, 
on their commonality with NASA cost-estimating procedures and 
guidelines. Finally, using the cost-estimating documentation provided 
by NASA for the 10 programs, we determined the extent to which the 
programs met the 14 criteria. If a program provided substantiating 
evidence for a criterion, we determined that the program "fully met" 
the criterion. If partial evidence was provided for a criterion, we 
determined the program "partially met" the criterion. If no evidence 
was found, then we determined that the criterion was "not met." Table 2 
describes each of the 14 criteria and the significance of each 
criterion.
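
As a minimal illustrative sketch--using hypothetical program names and 
ratings rather than our actual assessment results--the following shows 
how per-program ratings against each criterion roll up into the summary 
counts reported in table 4: 

# Illustrative rollup of criteria assessments into summary counts
# (hypothetical programs and ratings).
from collections import Counter

# Rating per (program, criterion): "Met", "Partially met", or "Not met".
assessments = {
    ("Program A", "Estimate objectives stated in writing"): "Met",
    ("Program B", "Estimate objectives stated in writing"): "Partially met",
    ("Program C", "Estimate objectives stated in writing"): "Not met",
    ("Program A", "Uncertainties identified and quantified"): "Not met",
    ("Program B", "Uncertainties identified and quantified"): "Partially met",
    ("Program C", "Uncertainties identified and quantified"): "Not met",
}

summary = {}
for (program, criterion), rating in assessments.items():
    summary.setdefault(criterion, Counter())[rating] += 1

for criterion, counts in summary.items():
    print(f"{criterion}: Met: {counts['Met']}; "
          f"Partially met: {counts['Partially met']}; "
          f"Not met: {counts['Not met']}.")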

To identify any barriers that make it difficult to address 
weaknesses in NASA's cost-estimating processes, we reviewed our recent 
work on NASA's efforts to implement a modern integrated financial 
management system. We also provided questions to NASA headquarters that 
asked for information regarding NASA's ability to use its cost 
estimates as a management tool for its programs. We also provided 
questions related to the SEI criteria, and NASA's responses to these 
questions provided further insight into the agency's cost-estimating 
management process at the organizational level. In addition, we 
interviewed officials in NASA headquarters' Office of the Chief 
Financial Officer and Office of the Chief Engineer, and the center 
project managers for the 10 programs and other appropriate personnel to 
obtain further perspective on this issue.

To accomplish our work, we visited NASA headquarters, Washington, D.C., 
and Goddard Space Flight Center, Maryland. We also contacted officials 
at Marshall Space Flight Center, Alabama; Jet Propulsion Laboratory, 
California; Kennedy Space Center, Florida; Glenn Research Center, Ohio; 
Johnson Space Center, Texas; and Langley Research Center, Virginia.

We conducted our work from February 2003 to March 2004 in accordance 
with generally accepted government auditing standards.

[End of section]

Appendix II: Assessments of 10 Programs Reviewed in Detail: 

This appendix provides a program by program assessment of the 10 NASA 
programs we reviewed in detail. Each assessment provides: 

* a brief description of the program's mission;

* the status of the program--that is, whether it is in development, 
operational, or terminated;

* the year the program was initiated;[Footnote 29]

* the fiscal year in which the Congress approved the program--that is, 
when full-scale design and development funds were appropriated;

* a comparison of the initial and current (as of April 2003) baseline 
development estimates; and: 

* an assessment of the program's cost-estimating processes, 
methodologies, and practices to determine the extent they met 
the 14 cost-estimating criteria that we used to measure program 
performance. (Table 4 shows for each criterion the number of programs 
that met, partially met, or did not meet the criterion.): 

Table 4: Summary of the Number of Programs That Met, Partially Met, or 
Did Not Meet Criterion: 

Criterion: The objectives of the estimate are stated in writing; 
Number of programs that met criterion: Met: 2; 
Number of programs that met criterion: Partially met: 5; 
Number of programs that met criterion: Not met: 3.

Criterion: The life cycle to which the estimate applies is clearly 
defined; 
Number of programs that met criterion: Met: 0; 
Number of programs that met criterion: Partially met: 10; 
Number of programs that met criterion: Not met: 0.

Criterion: The task has been appropriately sized; 
Number of programs that met criterion: Met: 1; 
Number of programs that met criterion: Partially met: 5; 
Number of programs that met criterion: Not met: 4.

Criterion: The estimated cost and schedule are consistent with 
demonstrated accomplishments on other projects; 
Number of programs that met criterion: Met: 0; 
Number of programs that met criterion: Partially met: 7; 
Number of programs that met criterion: Not met: 3.

Criterion: A written summary of parameter values and their rationales 
accompanies the estimate; 
Number of programs that met criterion: Met: 0; 
Number of programs that met criterion: Partially met: 4; 
Number of programs that met criterion: Not met: 6.

Criterion: Assumptions have been identified and explained; 
Number of programs that met criterion: Met: 2; 
Number of programs that met criterion: Partially met: 6; 
Number of programs that met criterion: Not met: 2.

Criterion: A structured process such as a template or format has been 
used to ensure that key factors have not been overlooked; 
Number of programs that met criterion: Met: 3; 
Number of programs that met criterion: Partially met: 7; 
Number of programs that met criterion: Not met: 0.

Criterion: Uncertainties in parameter values have been identified and 
quantified; 
Number of programs that met criterion: Met: 0; 
Number of programs that met criterion: Partially met: 3; 
Number of programs that met criterion: Not met: 7.

Criterion: If a dictated schedule has been imposed, an estimate of the 
normal schedule has been compared to the additional expenditures 
required to meet the dictated schedule; 
Number of programs that met criterion: Met: [A]; 
Number of programs that met criterion: Partially met: [A]; 
Number of programs that met criterion: Not met: [A].

Criterion: If more than one cost model or estimating approach has been 
used, any differences in results have been analyzed and explained; 
Number of programs that met criterion: Met: 0; 
Number of programs that met criterion: Partially met: 4; 
Number of programs that met criterion: Not met: 6.

Criterion: Estimators independent of the performing organization 
concurred with the reasonableness of the parameter values and 
estimating methodology; 
Number of programs that met criterion: Met: 3; 
Number of programs that met criterion: Partially met: 7; 
Number of programs that met criterion: Not met: 0.

Criterion: Estimates are current; 
Number of programs that met criterion: Met: 3; 
Number of programs that met criterion: Partially met: 7; 
Number of programs that met criterion: Not met: 0.

Criterion: The results of the estimate have been integrated with 
project planning and tracking; 
Number of programs that met criterion: Met: 6; 
Number of programs that met criterion: Partially met: 4; 
Number of programs that met criterion: Not met: 0.

Criterion: Earned value reporting has been used to manage the program; 
Number of programs that met criterion: Met: 2; 
Number of programs that met criterion: Partially met: 8; 
Number of programs that met criterion: Not met: 0. 

Sources: NASA (data), SEI (criteria), GAO (analysis).

[A] This criterion did not apply to 5 of the 10 programs we reviewed. 
For those 5 programs to which the criterion did apply, none provided 
evidence comparing the dictated schedule to the normal schedule.

[End of table]

SPACE SCIENCE: Gravity Probe B: 

[See PDF for image]

[End of figure]

The mission of the Gravity Probe B (GP-B) space vehicle--launched in 
April 2004--is to test Einstein's theory of relativity, which states 
that space and time are very slightly distorted by the presence of 
massive objects, such as Earth. Over approximately 16 months, GP-B will 
measure very precisely the expected tiny changes in the direction of 
the spin of four gyroscopes contained in GP-B as it orbits at a 400- 
mile altitude directly over the poles. The gyroscopes, free from 
disturbance, will provide an almost perfect space-time reference 
system.

Program Facts: 

* Status: Development: 

* Program initiation: Fiscal year 1993: 

* Program approved by Congress: Fiscal year 1996: 

* Comparison of initial and current baseline development estimates: 
$179.7 million or 33.9 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Estimates used as baselines for program tracking; 

Partially met: 

* Estimate life cycle clearly defined; 
* Assumptions identified and explained; 
* Structured format used to ensure all costs are captured; 
* Dictated schedules show cost impacts of acceleration; 
* Independent estimates concur with program estimates; 
* Estimates reflect changes over time; 
* Earned value reporting used to manage program; 

Not met: 

* Estimate objectives stated in writing; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

SPACE SCIENCE: Mars Exploration Rovers: 

[See PDF for image]

[End of figure]

Launched in the summer of 2003, NASA's twin roving exploration robots-
-Spirit and Opportunity--landed on opposite sides of Mars in January 
2004 in search of answers about the history of water on the red planet. 
Over the course of their 90-day mission, the rovers were expected to 
perform on-site geological investigations, searching for and 
characterizing a wide range of rocks and soils. The robotic geologists 
were equipped with mast-mounted cameras that provide 360-degree, 
stereoscopic, humanlike views of the terrain; robotic arms capable of 
human-like elbow and wrist movements; and a mechanical "fist" with a 
microscopic camera and rock hammer.

Program Facts: 

* Status: Operations: 

* Program initiation: Fiscal year 2000: 

* Program approved by Congress: Fiscal year 2001: 

* Comparison of initial and current baseline development estimates: 
$109.8 million or 16.7 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Structured format used to ensure all costs are captured; 
* Independent estimates concur with program estimates; 
* Estimates are kept current by reflecting changes over time; 
* Estimates used as baselines for program tracking; 

Partially met: 

* Estimate objectives stated in writing; 
* Estimate life cycle clearly defined; 
* Estimated costs based on demonstrated programs; 
* Assumptions identified and explained; 
* Dictated schedules show cost impact of acceleration; 
* Earned value reporting used to manage program; 

Not met: 

* Tasks appropriately sized; 
* Written documentation of parameter values and rationale; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]
 

SPACE SCIENCE: Space Infrared Telescope Facility: 

[See PDF for image]

[End of figure]

The Space Infrared Telescope Facility (now called Spitzer), launched in 
August 2003, is the fourth and final mission in NASA's Great 
Observatories Program--a program designed to see the universe in 
different kinds of light. During its planned 2½-year mission, SIRTF 
aims to detect infrared heat, which is mostly blocked by the Earth's 
atmosphere. Infrared light penetrates gas and dust clouds, allowing 
scientists to peer into hidden regions of space, revealing star 
formations, centers of galaxies, and newly forming planetary systems. 
Infrared light also provides information about cooler objects, such as 
dim stars, extrasolar planets, and giant molecular clouds.

Program Facts: 

* Status: Operations: 

* Program initiation: Fiscal year 1984: 

* Program approved by Congress: Fiscal year 1998: 

* Comparison of initial and current baseline development estimates: 
$139 million or 29.3 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Estimates used as baselines for program tracking; 

Partially met: 
 

* Estimate objectives stated in writing; 
* Estimate life cycle clearly defined; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Assumptions identified and explained; 
* Structured format used to ensure all costs are captured; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used; 
* Independent estimates concur with program estimates; 
* Estimates reflect changes over time; 
* Earned value reporting used to manage program.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]
 

EARTH SCIENCE: Landsat-7: 

[See PDF for image]

[End of figure]

Launched in April 1999, Landsat-7 is the latest in a series of earth 
observation satellites. Since 1972, Landsat satellites have collected 
continuous data on the earth's continental surfaces for land surface 
monitoring and global change research. Landsat-7's combination of 
synoptic coverage, high spatial resolution, spectral range, and 
radiometric calibration is unparalleled and provides digital data in 
greater quantities, more quickly, and at lower cost than at any 
previous time in Landsat's history.

Program Facts: 

* Status: Operations: 

* Program initiation: Fiscal year 1992: 

* Program approved by Congress: Fiscal year 1995: 

* Comparison of initial and current baseline development estimates: 
$63 million or 14.1 percent increase: 

Cost-Estimating Criteria: 

Partially met: 

* Estimate life cycle clearly defined; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Structured format used to ensure all costs are captured; 
* Parameter value uncertainties identified and quantified; 
* Dictated schedules show cost impacts of acceleration; 
* Independent estimates concur with program estimates; 
* Estimates reflect changes over time; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program;

Not met: 

* Estimate objectives stated in writing; 
* Written documentation of parameter values and rationale; 
* Assumptions identified and explained; 
* More than one cost model or estimating approach used.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

EARTH SCIENCE: Aqua: 

[See PDF for image]

[End of figure]

Aqua, part of the Earth Observing System (EOS), is expected to provide 
a 6-year chronology of Earth and its processes. Launched in May 2002, 
the Aqua satellite collects information on evaporation from the oceans, 
water vapor in the atmosphere, clouds, precipitation, soil moisture, 
sea and land ice, and snow cover. Aqua also measures radiative energy 
fluxes; aerosols; land vegetation cover; dissolved organic matter and 
phytoplankton in the oceans; and air, land, and water temperatures. 
Measurements taken by on-board instruments will allow scientists to 
assess long-term climate change, identify its human and natural 
causes, and advance the development of models for long-term 
forecasting.

Program Facts: 

* Status: Operations: 

* Program initiation: Fiscal year 1991: 

* Program approved by Congress: Fiscal year 1991: 

* Comparison of initial and current baseline development estimates: 
$53.1 million or 5.3 percent decrease: 

Cost-Estimating Criteria: 

Partially met: 

* Estimate objectives stated in writing; 
* Estimate life cycle clearly defined; 
* Assumptions identified and explained; 
* Structured format used to ensure all costs are captured; 
* Dictated schedules show cost impacts of acceleration; 
* Independent estimates concur with program estimates; 
* Estimates reflect changes over time; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program; 

Not met: 

* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]
 

EARTH SCIENCE: Aura: 

[See PDF for image]

[End of figure]

Scheduled for launch in June 2004, the Aura satellite is the third in a 
series of major Earth-observing satellites to study environment and 
climate change. The first and second missions, Terra and Aqua, were 
designed to study the land, oceans, and the Earth's radiation budget. 
Aura's mission is to study, for at least a 5-year period, the Earth's 
ozone, air quality, and climate, focusing exclusively on the 
composition, chemistry, and dynamics of the Earth's upper and lower 
atmospheres.

Program Facts: 

* Status: Development: 

* Program initiation: Fiscal year 1991: 

* Program approved by Congress: Fiscal year 1994: 

* Comparison of initial and current baseline development estimates: 
$2.1 million or 0.3 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Estimate objectives stated in writing; 
* Estimates reflect changes over time; 

Partially met: 

* Estimate life cycle clearly defined; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Assumptions identified and explained; 
* Structured format used to ensure all costs are captured; 
* Independent estimators concur with program estimates; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program; 

Not met: 

* Written documentation of parameter values and rationale; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table] 

BIOLOGICAL AND PHYSICAL RESEARCH: Fluids and Combustion Facility: 

[See PDF for image]

[End of figure]

The Fluids and Combustion Facility (FCF) is designed to be a permanent 
modular facility for conducting microgravity experiments on the 
International Space Station. Through these experiments, scientists hope 
to enhance their understanding of gravity's role in a wide range of 
physical processes, including materials science, power, propulsion, 
combustion, fluid physics, and plasma physics. FCF is to be composed of 
two racks that share mutually necessary hardware. The fluids 
integration rack will be used to perform investigations for microscopic 
imaging to particle tracking. The combustion integration rack will be 
used to study the process of combustion in a near weightless 
environment with the aim of improving fire safety and increasing fuel 
efficiency.

Program Facts: 

* Status: Development: 

* Program initiation: Fiscal year 1987: 

* Program approved by Congress: Fiscal year 2001: 

* Comparison of initial and current baseline development estimates: 
$4.8 million or 4 percent decrease: 

Cost-Estimating Criteria: 

Met: 

* Structured format used to ensure all costs are captured; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program; 

Partially met: 

* Estimate objectives stated in writing; 
* Estimate life cycle clearly defined; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Assumptions identified and explained; 
* More than one cost model or estimating approach used; 
* Independent estimates concur with program estimates; 
* Estimates reflect changes over time; 

Not met: 

* Parameter value uncertainties identified and quantified.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

AERONAUTICS: Hyper-X Program: 

[See PDF for image]

[End of figure]

The goal of NASA's Hyper-X program is to flight validate key propulsion 
and related technologies for air-breathing hypersonic aircraft. The 
Hyper-X (X-43A) vehicle, launched in March 2004, flew at Mach 7-- 
greater than the cruising speed of the SR-71, the world's fastest air- 
breathing aircraft, which cruises slightly above Mach 3. The highest 
speed attained by NASA's rocket-powered X-15 was Mach 6.7, back in 
1967. NASA anticipates that the technologies demonstrated by the Hyper-X 
Program will increase payload capacities and reduce costs for future 
air and space vehicles.

Program Facts: 

* Status: Development: 

* Program initiation: Fiscal year 1996: 

* Program approved by Congress: Fiscal year 1998: 

* Comparison of initial and current baseline development estimates: $60 
million or 35.9 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Estimates used as baselines for program tracking; 

Partially met: 

* Estimate life cycle clearly defined; 
* Structured format used to ensure all costs are captured; 
* Independent estimates concur with program estimates; 
* Estimates are kept current by reflecting changes over time; 
* Earned value reporting used to manage program; 

Not met: 

* Estimate objectives stated in writing; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Assumptions identified and explained; 
* More than one cost model or estimating approach used; 
* Parameter value uncertainties identified and quantified.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

SPACE FLIGHT: Checkout and Launch Control System: 

[See PDF for image]

[End of figure]

The Checkout and Launch Control System (CLCS) was intended to replace a 
central component in NASA's existing launch processing system for the 
space shuttle. The original justification for CLCS was that a 
substantial portion of the vendors for the command control and monitor 
system no longer provided support. In addition, out-of-date software 
and systems were expected to increase costs. CLCS promised to reduce 
staff, paperwork, and operations and maintenance costs by 50 percent. 
The program was canceled in September 2002 due to cost overruns, which 
according to NASA, were caused by factors such as software development 
delays based on poorly defined requirements and design, integration 
problems, and a lack of experienced development staff.

Program Facts: 

* Status: Canceled: 

* Program initiation: Fiscal year 1996: 

* Program approved by Congress: Fiscal year 1998: 

* Comparison of initial and current baseline development estimates: 
$193 million or 93.7 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Assumptions identified and explained; 
* Independent estimators concur with program estimates; 

Partially met: 

* Estimate objectives stated in writing; 
* Estimate life cycle clearly defined; 
* Tasks appropriately sized; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Structured format used to ensure all costs are captured; 
* More than one cost model or estimating approach used; 
* Dictated schedules show cost impacts of acceleration; 
* Estimates reflect changes over time; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program; 

Not met: 

* Parameter value uncertainties identified and quantified.

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

SPACE FLIGHT: Cockpit Avionics Upgrade: 

[See PDF for image]

[End of figure]

The Cockpit Avionics Upgrade (CAU) project is redesigning the display 
formats on the liquid crystal displays of the space shuttle cockpit. 
The objective of the redesign is to enhance flight safety by presenting 
the crew with flight and vehicle critical information in a user- 
friendly format that enhances situational awareness. Because the new 
display format uses graphics and color to present complex information, 
crews are expected to have better and more rapid decision-making 
capability under off-nominal conditions than was possible with the 
legacy system, enhancing flight safety and the crew's ability to meet 
mission objectives.

Program Facts: 

* Status: Development: 

* Program initiation: Fiscal year 2000: 

* Program approved by Congress: Fiscal year 2003: 

* Comparison of initial and current baseline development estimates: $12 
million or 2.7 percent increase: 

Cost-Estimating Criteria: 

Met: 

* Estimate objectives stated in writing; 
* Tasks appropriately sized; 
* Assumptions identified and explained; 
* Structured format used to ensure all costs are captured; 
* Independent estimators concur with program estimates; 
* Estimates reflect changes over time; 
* Estimates used as baselines for program tracking; 
* Earned value reporting used to manage program; 

Partially met: 

* Estimate life cycle clearly defined; 
* Estimated costs based on demonstrated programs; 
* Written documentation of parameter values and rationale; 
* Parameter value uncertainties identified and quantified; 
* More than one cost model or estimating approach used; 

Sources: NASA (data), SEI (criteria), GAO (analysis).

[End of table]

[End of section] 

Appendix III: Summary Descriptions of the 17 Additional Programs: 

In addition to the 10 programs that we reviewed in detail, we analyzed 
the initial and current development cost estimates for 17 other 
NASA programs.

Space Science Enterprise: 

Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED): 

NASA's TIMED satellite is conducting the first global study of the 
Earth's mesosphere, lower thermosphere, and ionosphere--segments of the 
Earth's atmosphere located between 40 and 110 miles above the planet. 
Initially, TIMED's mission was to last 2 years, beginning with its 
launch in December 2001, but NASA extended the satellite's orbital 
operations through 2006. TIMED's goal is to improve our understanding 
of the influences the sun and humans have on this "gateway region" as 
well as the effects of its atmospheric variability on satellites and 
spacecraft reentering the Earth's atmosphere.

International Gamma-Ray Astrophysics Laboratory (INTEGRAL): 

INTEGRAL is a European Space Agency mission, with Russian and 
U.S. involvement. Launched in October 2002, the INTEGRAL satellite is 
equipped with two telescopes designed to register elusive gamma rays--
some of the universe's most energetic radiation--and give insight into 
the most violent processes in our universe. Through INTEGRAL, 
scientists plan to study black holes' interaction with their 
surroundings, the explosion of supernovae and their role in forming 
chemical elements, the nature of powerful gamma-ray bursts, and 
transient sources that suddenly change brightness. U.S. participation 
consists of co-investigators providing hardware and software components 
to the spectrometer and imager instruments, a co-investigator for the 
data center, a mission scientist, and a provision for ground tracking 
and data collection.

Rosetta: 

Rosetta is a European Space Agency mission whose objectives are to 
study the origin of and the relationship between comets and 
interstellar material and to improve our knowledge of the origins of 
the Solar System. The Rosetta satellite was launched in March 2004 and, 
after a long cruise phase, is planned to rendezvous with comet 
Churyumov-Gerasimenko in 2014. Plans call for Rosetta to orbit the 
comet while taking scientific measurements and to position a probe on 
the comet surface to take in-situ measurements. U.S. involvement 
includes developing three remote-sensing instruments and a subsystem 
for a fourth instrument.

Mercury Surface, Space Environment, Geochemistry and Ranging 
(MESSENGER): 

Currently scheduled to launch during a 15-day period that opens July 
30, 2004, the MESSENGER spacecraft is intended to collect images of 
Mercury. Through these images, NASA scientists hope to determine 
Mercury's geological history and the nature of its surface composition, 
core, poles, exosphere and magnetosphere, and magnetic field. This 
information is expected to provide scientists with a better 
understanding of how Earth was formed, how it evolved, and how it 
interacts with the sun.

Solar Terrestrial Relations Observatory (STEREO): 

Through STEREO--an international collaboration involving France, 
Germany, the United Kingdom, and the United States--NASA plans to trace 
the flow of energy and matter from the sun to Earth by studying the 
solar origin of coronal mass ejections, their evolution in the 
heliosphere, and their effects on geospace. Twin STEREO observatories, 
scheduled to be launched in November 2005, will be used to develop a 
three-dimensional, time-dependent model of the magnetic topology, 
temperature, density, and velocity structure of the ambient solar wind. 
Because coronal mass ejections are the prime drivers of major space 
weather hazards, STEREO is expected to greatly improve our 
understanding of the most severe disturbances of the Sun-Earth system. 
The observatories will also provide a continuous data stream for the 
purpose of real-time space weather forecasts.

Stratospheric Observatory for Infrared Astronomy (SOFIA): 

The SOFIA observatory--a modified Boeing 747 aircraft with a 
permanently installed telescope, which NASA plans to begin flying in 
2005--will be used to study different astronomical objects and 
phenomena, including star births and deaths; solar system formations; 
complex molecules in space; planets, comets, and asteroids in our solar 
system; nebulae and dust in galaxies; and black holes at the centers of 
galaxies. The telescope, provided through a partnership with the German 
Aerospace Center, is designed to provide routine access to nearly all 
of the visual, infrared, far-infrared, and submillimeter parts of the 
spectrum. As such, SOFIA is expected to extend the range of 
astrophysical observations significantly beyond that of previous 
infrared airborne observatories through increases in sensitivity and 
angular resolution. NASA plans to incorporate new or upgraded 
technologies over the aircraft's lifetime to allow additional 
scientific exploration. Because most of the instruments are to be 
designed and built by graduate students and post-doctoral scientists in 
universities throughout the United States, SOFIA will serve as a 
training ground for the next generation of instrument builders.

Solar-B Observatory: 

The Solar-B program's objectives are to investigate the interaction 
between the Sun's magnetic field and its corona and to understand the 
sources of solar variability. Solar-B is a Japanese Institute of Space 
and Astronautical Science mission, with significant U.S. involvement, 
and follows the Solar-A collaboration among Japan, the United Kingdom, 
and the United States. The observatory is designed to consist of a set 
of optical, extreme ultraviolet, and X-ray instruments, and NASA is 
expected to provide components for each. The Solar-B observatory is 
scheduled to be launched on a Japanese M-V rocket out of Kagoshima, 
Japan, in September 2006.

Herschel Space Observatory: 

The European Space Agency's Herschel Space Observatory (formerly the 
Far Infrared and Submillimetre Telescope, or FIRST) houses an infrared 
telescope that is expected to observe virtually unexplored spectrum 
wavelengths that cannot be observed from the ground. Scheduled for 
launch in February 2007, Herschel is expected to enable scientists to 
better understand galaxy formation, evolution in the early universe, 
and the nature of active galaxy power sources; star-forming regions and 
interstellar medium physics in the Milky Way and other galaxies; and 
the molecular chemistry of cometary, planetary, and satellite 
atmospheres in our solar system. NASA is providing components for two 
of the three instruments that will be flown on Herschel: the Heterodyne 
Instrument for Far Infrared and the Spectral and Photometric Imaging 
Receiver.

Earth Science Enterprise: 

Terra: 

Launched in February 2000, Terra is providing measurements that, 
according to NASA, are significantly contributing to the understanding 
of the total Earth system. Specifically, Terra is collecting 200 
gigabytes of data each day on the physical and radiative properties of 
the earth's clouds, air-land and air-sea exchanges of energy, carbon, 
and water, as well as measurements of trace gases and volcanic 
activity. One of the first operational uses of Terra was to provide 
imagery to support the U.S. Forest Service's efforts to combat forest 
fires in the western United States. Through Terra, firefighters were 
able to identify the locations of active fires rather than just the 
locations of smoke, giving them the data needed to better control 
spreading fires. 
Terra data were also used by the Geography Department of Dartmouth 
College in New Hampshire to assist in flood hazard reduction programs.

New Millennium Program's Earth Observing-1 (EO-1): 

NASA's New Millennium Program (NMP) is designed to identify, develop, 
and flight-validate key instrument and spacecraft technologies that can 
enable new or more cost-effective approaches to conducting science 
missions. EO-1--the first NMP mission, launched in November 2000--
includes three land imaging instruments that are expected to lead to a 
new generation of lighter weight, higher performance, and lower cost 
Landsat-type Earth surface imaging instruments.

Jason-1: 

The mission of the Jason-1 program, a cooperative effort with the 
French Space Agency, is to study the global oceans. Launched in 
December 2001, the Jason-1 satellite was expected to monitor ocean 
circulation and events such as El Nino and ocean eddies and to improve 
global climate forecasts and predictions. The Jason-1 satellite was 
positioned to orbit the earth in tandem with TOPEX/Poseidon, an earlier 
generation satellite launched in 1992, to provide data to the National 
Oceanic and Atmospheric Administration.

SeaWinds: 

The SeaWinds satellite, launched in December 2002, is providing high-
resolution ocean surface wind data used for studies of ocean 
circulation, climate, and air-sea interaction in order to better 
understand global climate change and weather patterns. By using long-
term wind data in numerical weather and wave prediction models, 
SeaWinds is expected to improve weather forecasts near coastlines as 
well as storm warning and monitoring.

Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations 
(Calipso): 

The Calipso satellite, scheduled for launch in 2005, is being designed 
to study the effect that aerosols and clouds have on the Earth's 
radiation balance, which ultimately controls the temperature of the 
Earth. Calipso is expected to provide scientists with data to construct 
three-dimensional structures of the atmosphere, enabling new 
observationally based assessments of the radiative effects of aerosol 
and clouds that will greatly improve our ability to predict future 
climate change. NASA plans to fly Calipso in formation with Aqua and 
CloudSat, a satellite being designed to measure the vertical structure 
of clouds from space and contribute to a better understanding of the 
role of clouds in the Earth's climate system. The Calipso program is a 
cooperative effort with France.

Space Flight Enterprise: 

X-38 Crew Return Vehicle (CRV): 

The X-38 Crew Return Vehicle was cancelled in April 2002 because of its 
single-purpose design and the potentially high costs identified by an 
independent assessment. The purpose of the CRV project was to initiate 
work toward an independent U.S. crew return capability for the 
International Space Station. As envisioned, the CRV was expected to 
serve as a backup to the space shuttle orbiters by resupplying the 
station or changing out its crew and by providing a safe return for up 
to seven crew members who became ill or injured or in the event that a 
catastrophic failure of the station made it unable to support life.

Alternate Turbopump Program (ATP): 

ATP's primary objectives were to significantly improve the safety and 
operating margins of the high-pressure turbopump in the space shuttle's 
main engine and to eliminate the need to remove the turbopump for 
postflight maintenance. An alternative turbopump was successfully 
implemented in the shuttle launched in April 2002. According to NASA, 
ATP's development contract, signed in December 1986, specifically 
addressed shortcomings of the previous turbopumps; took advantage of 
the latest technologies; and applied lessons learned. The contract 
called for the parallel development of two high-pressure turbopumps--
one that operates on oxidizer and one on fuel. However, 5 years into 
the program, technical problems prompted NASA to end parallel 
development and concentrate first on developing the oxidizer turbopump, 
which was first flown in July 1995. Although development of the fuel 
turbopump resumed in 1994, extreme high temperatures, pressures, and 
rotor speeds resulted in significant design challenges and the design 
certification review was not completed until March 2001. The fuel 
turbopump was fully incorporated into flight beginning with the April 
2002 shuttle flight.

Tracking and Data Relay Satellite (TDRS) Replenishment: 

In December 2002, the TDRS Replenishment project achieved its goal of 
launching three geosynchronous satellites to replace the existing aging 
satellite constellation, thereby continuing to provide space network 
tracking, data, voice, and video services to NASA scientific 
satellites, the Space Shuttle program, the International Space Station, 
and other NASA customers. According to NASA, the functional and 
technical performance requirements for the replacement satellites--
launched in June 2000, March 2002, and December 2002--are virtually 
identical to those of the previous satellites.

Advanced Health Management System (AHMS) Phase 1: 

AHMS is expected to provide safe shutdown of the space shuttle main 
engine during potentially catastrophic high-pressure turbopump 
failures through improved monitoring of engine vibration and anomaly 
response capabilities. According to NASA, AHMS modifications include 
(1) adding a vibration redline monitor for high pressure turbopumps, 
(2) doubling memory capacity and employing radiation tolerant memory, 
(3) adding an external communication interface for a potential phase-
two health management computer, and (4) eliminating existing memory 
retention batteries and replacing them with nonvolatile memory. While 
NASA stated the AHMS will be available for launch in January 2005, the 
shuttle fleet's return to flight date is planned for March or April 
2005.

[End of section]

Appendix IV: Description of Earned Value Management: 

Earned value management (EVM) goes beyond the two-dimensional approach 
of comparing budgeted costs to actuals. Instead, it attempts to compare 
the value of work accomplished during a given period with the work 
scheduled for that period. By using the value of completed work as a 
basis for estimating the cost and time needed to complete the program, 
earned value can alert program managers to potential problems early in 
the program.
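
To make this comparison concrete, the following minimal sketch in 
Python (hypothetical figures, not drawn from any NASA program or 
contract) computes the two basic earned value variances for a single 
reporting period:

# Hypothetical earned value data for one reporting period (dollars in thousands).
planned_value = 1000.0  # budgeted cost of the work scheduled for the period
earned_value = 850.0    # budgeted cost of the work actually accomplished
actual_cost = 950.0     # actual cost of the work accomplished

schedule_variance = earned_value - planned_value  # negative means behind schedule
cost_variance = earned_value - actual_cost        # negative means over cost

print(f"Schedule variance: {schedule_variance:+.0f}")  # -150
print(f"Cost variance:     {cost_variance:+.0f}")      # -100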

An accurate, valid, and current performance management baseline is 
needed to perform useful analyses using EVM. In 1996, in response to 
acquisition reform initiatives, the Department of Defense (DOD) adopted 
32 criteria for evaluating the quality of management systems. In 
general terms, the 32 criteria require contractors to (1) define the 
contractual scope of work using a work breakdown structure; 
(2) identify organizational responsibility for the work; (3) integrate 
internal management subsystems; (4) schedule and budget authorized 
work; (5) measure the progress of work based on objective indicators; 
(6) collect the cost of labor and materials associated with the work 
performed; (7) analyze any variances from planned cost and schedules; 
(8) forecast costs at contract completion; and (9) control changes. The 
criteria have become the standard for EVM and have been adopted by 
major U.S. government agencies, industry, and the governments of Canada 
and Australia. The full application of EVM system criteria is 
appropriate for large cost-reimbursable contracts where the government 
bears the cost risk. For such contracts, the management discipline 
prescribed by the criteria is essential. In addition, data from an EVM 
system have been shown to provide objective reports of contract 
status, allowing numerous indices and performance measures to be 
calculated. These can then be used to develop accurate estimates of 
anticipated costs at completion, providing early warning of impending 
schedule delays and cost overruns.
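
As one illustration of how such indices and completion estimates can be 
derived from EVM data, the short Python sketch below uses hypothetical 
cumulative figures and one commonly cited estimate-at-completion 
formula (several variants exist); it is not a description of any 
particular NASA or DOD system:

# Hypothetical cumulative EVM data for a contract (dollars in millions).
budget_at_completion = 120.0  # total budgeted cost of the contract
planned_value = 60.0          # budgeted cost of work scheduled to date
earned_value = 50.0           # budgeted cost of work performed to date
actual_cost = 55.0            # actual cost of work performed to date

cpi = earned_value / actual_cost    # cost performance index; below 1.0 means over cost
spi = earned_value / planned_value  # schedule performance index; below 1.0 means behind schedule

# One common projection: assume the remaining work is performed at the current CPI.
estimate_at_completion = actual_cost + (budget_at_completion - earned_value) / cpi

print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
print(f"Estimate at completion = {estimate_at_completion:.1f}")

In this hypothetical case the projected cost at completion exceeds the 
budget at completion, signaling a likely overrun well before it would 
appear in the accounting records.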

Table 5 lists the 32 criteria, organized into five basic categories: 
organization, planning and budgeting, accounting considerations, 
analysis and management reports, and revisions and data maintenance.

Table 5: Thirty-Two Criteria for Evaluating the Quality of Management 
Systems: 

Category: Organization; 

Criteria: 1. Define the authorized work elements for the program. A 
work breakdown structure, tailored for effective internal management 
control, is commonly used in this process.

Criteria: 2. Identify the program organizational structure, including 
the major subcontractors responsible for accomplishing the authorized 
work, and define the organizational elements in which work will be 
planned and controlled.

Criteria: 3. Provide for the integration of the company's planning, 
scheduling, budgeting, work authorization, and cost accumulation 
processes with each other and, as appropriate, the program work 
breakdown structure and the program organizational structure.

Criteria: 4. Identify the company organization or function responsible 
for controlling overhead (indirect costs).

Criteria: 5. Provide for integration of the program work breakdown 
structure and the program organizational structure in a manner that 
permits cost and schedule performance measurement by elements of either 
or both structures as needed.

Category: Planning and budgeting; 

Criteria: 6. Schedule the authorized work in a manner that describes 
the sequence of work and identifies significant task interdependencies 
required to meet the requirements of the program.

Criteria: 7. Identify physical products, milestones, technical 
performance goals, or other indicators that will be used to measure 
progress.

Criteria: 8. Establish and maintain a time-phased budget baseline, at 
the control account level, against which program performance can be 
measured. Budget for far-term efforts may be held in higher-level 
accounts until an appropriate time for allocation at the control 
account level. Initial budgets established for performance measurement 
will be based on either internal management goals or the external 
customer-negotiated target cost, including estimates for authorized 
but undefinitized work. On government contracts, if an over target 
baseline is used for performance measurement reporting purposes, prior 
notification must be provided to the customer.

Criteria: 9. Establish budgets for authorized work with identification 
of significant cost elements (labor and material, for example) as 
needed for internal management and for control of subcontractors.

Criteria: 10. To the extent it is practical to identify the authorized 
work in discrete work packages, establish budgets for this work in 
terms of dollars, hours, or other measurable units. Where the entire 
control account is not subdivided into work packages, identify the far 
term effort in larger planning packages for budget and scheduling 
purposes.

Criteria: 11. Provide that the sum of all work package budgets plus 
planning package budgets within a control account equals the control 
account budget.

Criteria: 12. Identify and control level of effort activity by time-
phased budgets established for this purpose. Only that effort which is 
unmeasurable or for which measurement is impractical may be classified 
as level of effort.

Criteria: 13. Establish overhead budgets for each significant 
organizational component of the company for expenses that will become 
indirect costs. Reflect in the program budgets, at the appropriate 
level, the amounts in overhead pools that are planned to be allocated 
to the program as indirect costs.

Criteria: 14. Identify management reserves and undistributed budget.

Criteria: 15. Provide that the program target cost goal is reconciled 
with the sum of all internal program budgets and management reserves.

Category: Accounting considerations; 

Criteria: 16. Record direct costs in a manner consistent with the 
budgets in a formal system controlled by the general books of account.

Criteria: 17. When a work breakdown structure is used, summarize 
direct costs from control accounts into the work breakdown structure 
without allocation of a single control account to two or more work 
breakdown structure elements.

Criteria: 18. Summarize direct costs from the control accounts into 
the contractor's organizational elements without allocation of a 
single control account to two or more organizational elements.

Criteria: 19. Record all indirect costs that will be allocated to the 
contract.

Criteria: 20. Identify unit costs, equivalent unit costs, or lot costs 
when needed.

Criteria: 21. For EVM, the material accounting system will provide (1) 
accurate cost accumulation and assignment of costs to control accounts 
in a manner consistent with the budgets using recognized, acceptable, 
costing techniques; (2) cost performance measurement at the point in 
time most suitable for the category of material involved, but no 
earlier than the time of progress payments or actual receipt of 
material; and (3) full accountability of all material purchased for 
the program, including the residual inventory.

Category: Analysis and management reports; 

Criteria: 22. At least on a monthly basis, generate the following 
information at the control account and other levels as necessary for 
management control using actual cost data from, or reconcilable with, 
the accounting system: (1) Comparison of the amount of planned budget 
and the amount of budget earned for work accomplished. This comparison 
provides the schedule variance. (2) Comparison of the amount of the 
budget earned and the actual (applied where appropriate) direct costs 
for the same work. This comparison provides the cost variance.

Criteria: 23. Identify, at least monthly, the significant differences 
between both planned and actual schedule performance and planned and 
actual cost performance, and provide the reasons for the variances in the detail needed by program management.

Criteria: 24. Identify budgeted and applied (or actual) indirect costs 
at the level and frequency needed by management for effective control, 
along with the reasons for any significant variances.

Criteria: 25. Summarize the data elements and associated variances 
through the program organization and/or work breakdown structure to 
support management needs and any customer reporting specified in the 
contract.

Criteria: 26. Implement managerial actions taken as the result of 
earned value information.

Criteria: 27. Develop revised estimates of cost at completion based on 
performance to date, commitment values for material, and estimates of 
future conditions. Compare this information with the performance 
measurement baseline to identify variances at completion important to 
company management and any applicable customer reporting requirements, 
including statements of funding requirements.

Category: Revisions and data maintenance; 

Criteria: 28. Incorporate authorized changes in a timely manner, 
recording the effects of such changes in budgets and schedules. In the 
directed effort prior to negotiation of a change, base such revisions 
on the amount estimated and budgeted to the program organizations.

Criteria: 29. Reconcile current budgets to prior budgets in terms of 
changes to the authorized work and internal replanning in the detail 
needed by management for effective control.

Criteria: 30. Control retroactive changes to records pertaining to 
work performed that would change previously reported amounts for 
actual costs, earned value, or budgets. Adjustments should be made only 
for correction of errors, routine accounting adjustments, effects of 
customer or management directed changes, or to improve the baseline 
integrity and accuracy of performance measurement data.

Criteria: 31. Prevent revisions to the program budget except for 
authorized changes.

Criteria: 32. Document changes to the performance measurement 
baseline. 

Source: Interim Defense Acquisition Guide Book, Appendix 4.

[End of table]

The standard format for tracking earned value is through a cost 
performance report (CPR). The CPR is a monthly compilation of 
cost, schedule, and technical data, which displays the performance 
measurement baseline, any cost and schedule variances from that 
baseline, the amount of management reserve used to date, the portion of 
the contract that is authorized unpriced work, and the contractor's 
latest revised estimate to complete the program. As a result, the CPR 
can be used as an effective management tool because it provides the 
program manager with early warning of potential cost and schedule 
overruns.

Using data from the CPR, a program manager can assess trends in cost 
and schedule performance. This information is useful because trends 
tend to continue and can be difficult to reverse. Studies have shown 
that once programs are 15 percent complete, the performance indicators 
are indicative of the final outcome. For example, a CPR showing a 
negative trend for schedule status would indicate that the program is 
behind schedule. By analyzing the CPR, one could determine the cause of 
the schedule problem, such as delayed flight tests, changes in 
requirements, or test problems, because the CPR contains a section that 
describes the reasons for the negative status. A negative schedule 
variance can be a predictor of later cost problems because additional 
spending is often necessary to resolve problems. CPR data also provide the basis 
for independent assessments of a program's cost and schedule status and 
can be used to project final costs at completion in addition to 
determining when a program should be completed.

Examining a program's management reserves is another way that a program 
can use a CPR to determine potential issues early on. Management 
reserves, which are funds that may be used as needed, provide 
flexibility to cope with problems or unexpected events. EVM experts 
agree that transfers of management reserves should be tracked and 
reported because they are often problem indicators. An alarming 
situation arises if the CPR shows that the management reserves are 
being used at a faster pace than the program is progressing toward 
completion. For example, a problem would be indicated if a program has 
used 80 percent of its management reserves, but only completed 
40 percent of its work. A program's management reserves should contain 
at least 10 percent of the cost to complete a program so that funds 
will always be available to cover future unexpected problems that are 
more likely to surface as the program moves into the testing and 
evaluation phase.
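
The reserve check described above can be expressed in a few lines of 
Python; this is a hypothetical sketch using the 80 percent and 40 
percent figures from the example in the text, not an actual NASA 
reporting rule:

def reserve_burn_warning(reserve_used_pct: float, work_complete_pct: float) -> bool:
    """Return True when management reserve is being consumed faster than work is completed."""
    return reserve_used_pct > work_complete_pct

# Example from the text: 80 percent of reserves used, but only 40 percent of work complete.
if reserve_burn_warning(80.0, 40.0):
    print("Warning: management reserve burn rate exceeds program progress.")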

[End of section]

Appendix V: Comments from the National Aeronautics and Space 
Administration: 

National Aeronautics and Space Administration:

Office of the Administrator 
Washington, DC 20546-0001:

May 24, 2004:

Mr. Allen Li:

Director, Acquisition and Sourcing Management Team:

United States General Accounting Office 
Washington, DC 20548:

Dear Mr. Li:

NASA appreciates the opportunity to comment on your draft report 
(General Accounting Office (GAO)-04-642) entitled "Lack of Disciplined 
Cost-Estimating Processes Undermines NASA's Ability to Effectively 
Manage Its Programs."

We concur with the recommendations of your report and your observations 
as they validate and reinforce the importance of activities already 
underway at NASA to improve cost estimating and program management. We 
believe that NASA, while still pursuing important improvements in many 
areas related to cost management, has already made substantive changes 
and achieved significant improvements in its cost-estimating processes. 
Some examples are as follows:

* Reduction in Cost Growth:

In your December 1992 report, "NASA Program Costs: Space Missions 
Require Substantially More Funding Than Initially Estimated" (GAO/
NSIAD-93-97), GAO cited a median 77 percent increase in Agency program 
cost growth. This contrasts with a 13 percent median cost growth in the 
present study - a dramatic improvement.

* Project Termination:

NASA has recently terminated projects with high cost growth, such as 
the Checkout and Launch Control System (CLCS), the highest growth 
project in your report.

* Space Station Reforms:

From the President's FY 2005 Budget Request, Office of Management and 
Budget (OMB) states, "During the 1990s, International Space Station 
costs were spiraling out of control, potentially threatening other NASA 
programs and using taxpayer resources ineffectively. Using independent 
reviews and implementing management reforms, Space Station managers 
have since gained control over the costs of this unique laboratory."

In order to ensure that the title of your report is appropriately 
consistent with your findings, we would suggest the title could be 
revised to recognize steps we have recently taken to address the issues 
you raise. We believe the title of the report would better reflect the 
current status if it were changed to "Consistent Implementation of 
NASA's Cost Management Processes are Necessary to Ensure Effective 
Program and Project Management." NASA has already identified several 
key processes in this area and must now be diligent in ensuring 
consistent implementation of those processes, whereas the title of the 
draft report implies that there are no processes.

The following paragraphs provide the current status and planned 
approach for addressing each of the recommendations made by GAO in its 
draft report.

Recommendation 1: GAO recommends that the NASA Administrator direct the 
Program Executive Officer for IFMP, Chief Financial Officer and Chief 
Engineer to develop an integrated plan that, at a minimum, includes 
specific actions for ensuring that:

* guidance is established on rebaselining and that rebaselining is 
consistently applied to provide accountability among programs:

* true earned value management is used as an organizational management 
tool to bring cost to the forefront in NASA's management decision-
making process:

* acquisition and earned value management policies and procedures are 
enforced, and:

* staff and support for cost-estimating and earned value analyses are 
effectively used.

NASA concurs with this recommendation, and is making progress towards 
implementation, including development of revised direction for program 
and project management by the Office of the Chief Engineer and design 
and implementation under the IFM Program's Integrated Asset Management 
(IAM) effort of enhanced Agencywide Project Management and Earned Value 
analytical capabilities. This revised direction is NASA Procedural 
Requirement (NPR) 7120.5C, entitled "NASA Program and Project 
Management Processes and Requirements." As an example of our commitment 
to implementing these reforms, we have changed the name from "NASA 
Procedural Guidance" to "NASA Procedural Requirement" to make clear 
that these directives are requirements and are in no way optional. NPR 
7120.5C will be released in August 2004 to replace NASA Procedural 
Guidance (NPG) 7120.5B. This management system governs the 
formulation, approval, implementation, and evaluation of all Agency 
programs and projects. An integral part of NPR 7120.5C will be NASA's 
new Continuous Cost-Risk Management (CCRM) process, focusing cost-
estimators, earned value management (EVM) analysts, and program 
analysts on risk and cost-risk insight. This will improve their 
effectiveness at making better estimates and identifying potential 
problems as they emerge, thereby controlling program and project cost 
growth.

We acknowledge that requirements must be established and discipline 
must be enforced in both determining when rebaselining is necessary and 
in the actual implementation of rebaselining activities when they are 
determined to be necessary. NASA has created firm requirements that 
establish thresholds that, if exceeded, will require a rebaselining 
review by the governing Program Management Council (PMC). NASA has 
governing PMCs at the Center level, Enterprise level, or Agency level 
depending on the size and criticality of the program or project. Each 
is comprised of senior representatives, has the authority to impose 
program and project management requirements, and regularly reviews 
programs and projects for satisfactory performance. A rebaselining 
review will require strict adherence to justification procedures, to 
include reasons for the cost growth, value of the program to the Agency 
and the Nation, an updated life-cycle cost estimate (LCCE), and 
evidence that a team is in place that is capable of managing the 
project to the updated technical, schedule and cost targets. If a 
rebaselining is determined to be appropriate, traceability to the 
original baseline will be ensured.

NASA recognizes the need for improved Earned Value Management (EVM) 
implementation on development projects across the Agency. It also 
recognizes the challenges with implementing true, full-cost EVM that 
seamlessly integrates in-house and multiple contractor activities. 
Accomplishing this requires evolution of Integrated Financial 
Management Program (IFMP) business practices and the integration of EVM 
software with business management systems. NPR 7120.5C will also 
reinforce implementation of EVM by providing requirements for EVM. The 
Office of the Chief Engineer is now responsible for EVM requirements 
definition and compliance at NASA, and is working with all Enterprises 
to ensure that full criteria-compliant EVM is a part of every major 
project and that EVM principles are applied to smaller projects.

To ensure that the most up-to-date EVM practices are consistently 
implemented across NASA, the Deputy Chief Engineer chairs the newly 
formed EVM Focal Point Council (FPC). The EVM FPC meets every 2 months, 
and includes representatives from the Enterprises and the IFMP who are 
experienced in EVM design, development, and implementation. NASA's 
training personnel are also included in the FPC to help develop EVM 
training modules. At present, the EVM FPC has formed seven working 
groups to address various improvements. The EVM FPC will also be 
publishing an EVM Handbook that will contain the results of each EVM 
FPC working group and provide the latest guidance to the NASA EVM 
community. The EVM FPC is overseeing the selection of software tools, 
prototype testing, and the rollout of a resultant set of pilot tools in 
FY 2005 that project managers can use to more readily and consistently 
implement EVM policy.

Acquisition management is another area receiving great emphasis in NPR 
7120.5C. Approximately 90 percent of NASA's work is performed through 
grants and contracts, underscoring the importance of acquisition 
strategies. New rules of engagement require project plans to dedicate 
extensive effort to acquisition planning and strategy, with an emphasis 
on how risk and technical complexity affect contractor performance. The 
new practices also emphasize methods for incentivizing contractor 
performance to achieve Agency safety, reliability, and performance 
goals.

Enforcement of these new EVM and acquisition policies and procedures 
will be achieved through Program Management Councils which will review 
and approve programs and projects regularly, including each step of 
their development, based on the new requirements in NPR 7120.5C. 
Additionally, the Contract Management module, which is part of IFM's 
IAM rollout, will significantly help move the Agency off GAO's "High 
Risk" list in Contract Management.

Prior to the initiation of this GAO study, NASA had already taken 
critical steps to address staffing and support needs for cost 
estimating and earned value management. For example, a new Cost 
Analysis Division, reporting to the Comptroller, has been 
established at NASA Headquarters. This division is being staffed with 
six new high-level civil service cost estimators. In addition, senior 
cost analyst positions have been added to the Independent Program 
Assessment Office (IPAO), the Agency's lead for conducting program and 
project cost estimates and technical reviews at key milestones. IPAO's 
Deputy Director will act as lead for Independent Cost Estimates. We are 
also strengthening interactions between the IPAO and the Center System 
Management Offices (SMOs); the SMOs provide an additional source of 
cost estimating expertise independently of projects. These measures 
will enable new cost management policy and direction that will ensure 
effective use of NASA-wide staff, and support cost estimating and 
earned value analysis capability.

NASA recognizes the importance of supporting its new management 
requirements with training. NASA recently moved the personnel 
responsible for engineering and management training to the Office of 
the Chief Engineer. NASA's Chief Engineer controls NPR 7120.5C, and 
will ensure that training is well matched to the new requirements, 
especially the cost-risk principles of NASA's new Continuous Cost-Risk 
Management (CCRM) process. Implementation of CCRM has already begun on 
programs in the Exploration Systems Enterprise, and full implementation 
of NASA's CCRM is expected to take place following publication of 
7120.5C. NASA has developed not only a strategy for an improved, 
rigorous and disciplined cost estimating and EVM capability, but a 
genuine enhancement to overall project management.

Recommendation 2: In addition, we recommend that the NASA Administrator 
direct the Chief Financial Officer to establish a standard framework 
for developing life-cycle cost estimates. At a minimum the framework 
should require each program or project to:

* base its cost estimates on a full life cycle for the program - 
including all direct and indirect costs for operations and maintenance 
and disposal as well as planning and procurement - and on a work 
breakdown structure that encompasses both in-house and contractor 
efforts, 

* prepare a cost analysis requirements document, 

* prepare an independent government estimate at each milestone of the 
program, and:

* conduct a cost risk assessment that identifies the level of 
uncertainty inherent in the estimate.

NASA concurs with this recommendation. NASA agrees that there is a need 
for consistency in defining life-cycle costs that includes the full 
cost of NASA's programs.

NPR 7120.5C will clearly define the full life-cycle cost to include 
development, operations, maintenance, disposal and all NASA in-house 
direct and indirect costs, including civil service salary, materials, 
service pool, Center G&A, and Corporate G&A to eliminate ambiguity and 
ensure consistency. Life-cycle cost estimates will be done for those 
phases relevant to the program or project - for example, some NASA 
activities are technology or test programs that may not include all the 
life cycle phases of a spacecraft development project.

The revised Cost Estimating Handbook, based on NPR 7120.5C, will 
provide further guidance for life-cycle cost estimates. Both NPR 
7120.5C and the revised cost estimating handbook will require rigorous 
development of life-cycle cost estimates, to include cost-risk 
assessment, for all phases of the program's life cycle.

The recommendation for the preparation of a cost analysis requirements 
document, similar to the Department of Defense's (DoD) cost analysis 
requirements description (CARD), is addressed by a similar but improved 
tool called the Cost Analysis Data Requirement or "CADRe." The Project 
Manager is responsible for developing and maintaining the CADRe, which 
has three basic parts: (1) a DoD CARD-equivalent project and technical 
description document; (2) identification and valuation of key 
engineering performance parameters, including updates over time, and 
documentation of actual Work Breakdown Structure (WBS) element costs; 
and (3) initial and annual updates of life-cycle cost estimates (LCCE). 
The LCCE is separable from the CADRe so that the CARD-equivalent 
portion of the CADRe can be given to an independent cost-estimating 
group, like the IPAO, so that it can perform an independent cost 
estimate (ICE) without knowledge of the project estimate.

NPR 7120.5C, complemented with guidance contained in the revised Cost 
Estimating Handbook, will require an ICE for major programs (identified 
by size and criticality) at least twice--prior to entering Phase B 
(corresponding to the System Design Review) and prior to entering Phase 
C/D (corresponding to the Preliminary Design Review). These estimates 
may be done by in-house organizations, such as the IPAO, by outside 
experts from Federally Funded Research and Development Centers (FFRDCs) 
or industry, or some combination thereof.

The transition from Phase A/B to Phase C/D, or from formulation to 
implementation, is a critical transition and one that NASA has 
emphasized in its budget structure. Prior to Phase C, when projects are 
in formulation, there tends to be a high level of uncertainty as 
designs mature. NASA communicates to its stakeholders that estimates 
should be expected to change in this phase. Budgets for programs and 
projects in the formulation phase are included in the "technology and 
advanced concepts" category of the budget. To move into Phase C/D, 
projects must be approved by their governing PMC, having already gone 
through a Preliminary Design Review and had an Independent Cost 
Estimate. Once projects are approved by their PMC to enter 
implementation and begin Phase C/D, the funding is transferred from 
"technology and advanced concepts" to the 
"development" category, at which time NASA commits to the cost and 
schedule estimates with confidence. In Phase C/D, NASA relies primarily 
on EVM to capture cost performance and estimates-to-complete, although 
updates to Independent Cost Estimates may also be used as required. 
These updates will be enabled through continual cost estimating 
community involvement in implementing annual parametric estimates as 
required by the CADRe. NASA is implementing these requirements at the 
present time.

NASA has already implemented guidance and training to ensure that all 
Independent Cost Estimates are based on risk and expressed in terms of 
probability distributions. Naturally, estimates that are performed very 
early in the development cycle contain 
higher levels of uncertainty, and NASA communicates that uncertainty to 
stakeholders through the budget, as described above, in briefings and 
in updates to the NASA Operating Plans. The vast majority of systems 
developed by NASA are unprecedented, and variation in cost estimates is 
to be expected as they mature from initial risk mitigation efforts 
(such as early technology development) through the phases of the 
project life cycle. The Comptroller and IPAO are preparing and testing 
alternative cost estimating tools that can be used early in development 
cycles for crosschecks and validation against traditional cost 
estimating methods.

Cost uncertainty analyses are required as part of the NASA Continuous 
Cost-Risk Management (CCRM) process. The CCRM process not only requires 
identification of the uncertainty inherent in the estimate, it ties 
this identification to the monitoring and management of medium-and 
high-risk WBS elements using EVM. NASA has also recently developed a 
policy for cost-risk data generation and analysis, and has documented 
it as "the 12 Tenets of NASA Cost-Risk" in the updated cost-estimating 
handbook.

The requirement for a CADRe is presently being implemented on all major 
new development programs, and will be institutionally required with its 
inclusion in the upcoming NPR 7120.5C. NASA is implementing milestone 
ICE requirements at the present time, along with uncertainty analysis 
requirements on ICEs and LCCEs, to include operations, maintenance, and 
disposal costs.

Recommendation 3: Further, we recommend that the NASA Administrator 
develop procedures such that proposed projects not be allowed to 
proceed through the review and approval process when they do not 
address the elements of the recommended cost estimating practices.

NASA concurs with this recommendation.

The new version of NPR 7120.5 defines the authority of the governing 
PMCs that will enforce the program requirements for proceeding through 
the review and approval process. It also makes very clear to project 
managers the procedures that have to be followed along with information 
and documentation required in the project plan to successfully receive 
authority to proceed at key milestone gates. The document also clearly 
identifies management methods for ensuring that projects perform 
according to plan.

NASA's recent implementation of its Executive Financial Management 
Information Dashboard ("Erasmus") provides the Administrator and the 
senior management of the Agency direct insight into individual Project 
and Program performance on a monthly basis. The Office of the Chief 
Engineer reviews Erasmus "stoplight" information regularly, and 
significant variances, such as cost estimating deficiencies, are 
identified for PMC review. Additionally, the Agency-wide Business 
Warehouse tool provides users across the Agency complete access to the 
full cost of projects or functional activities. Furthermore, the 
completion of populating FY05 phasing plan data in the new Budget 
Formulation tool will provide additional functionality to users, giving 
them complete access to actual versus planned cost performance.

Again, thank you for the critical insight the report provided. We 
assure you that we are well on our way toward implementing your 
recommendations.

Signed by: 

Frederick D. Gregory:

Deputy Administrator: 

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Allen Li (202) 512-4841: 

Acknowledgments: 

Staff making key contributions to this report were Jerry Herley, 
Shirley Johnson, Charles Malphurs, Karen Sloan, Madhav Panwar, Karen 
Richey, Jennifer Echard, and Deborah Lott.

(120208): 

FOOTNOTES

[1] U.S. General Accounting Office, Major Management Challenges and 
Program Risks: National Aeronautics and Space Administration, GAO-03-
114 (Washington, D.C.: Jan. 2003).

[2] NASA's Enterprises, listed in the background section of this 
report, function as primary business areas for implementing 
NASA's mission. Each Enterprise has its own strategic goals, 
objectives, and implementation strategies.

[3] According to NASA, congressional approval occurs when the Congress 
appropriates design and development funds for the program.

[4] NASA defines baseline as the technical performance and content, 
technology application, schedule milestones, and budget (including 
contingency and allowance for program adjustment) that are documented 
in the approved program and project plans. Current baseline development 
cost estimates are as of April 2003.

[5] NASA defines life-cycle cost as the total of the direct, indirect, 
recurring, nonrecurring, and other related expenses incurred, or 
estimated to be incurred, in the design, development, verification, 
production, operation, maintenance, support, and retirement of a system 
over its planned life.

[6] U.S. General Accounting Office, Business Modernization: NASA's 
Challenges in Managing Its Integrated Financial Management Program, 
GAO-04-255 (Washington, D.C.: Nov. 21, 2003); Business Modernization: 
Disciplined Processes Needed to Better Manage NASA's Integrated 
Financial Management Program, GAO-04-118 (Washington, D.C.: Nov. 21, 
2003); Business Modernization: NASA's Integrated Financial Management 
Program Does Not Fully Address Agency's External Reporting Issues, GAO-
04-151 (Washington, D.C.: Nov. 21, 2003); and Information Technology: 
Architecture Needed to Guide NASA's Financial Management Modernization, 
GAO-04-43 (Washington, D.C.: Nov. 21, 2003).

[7] U.S. General Accounting Office, Space Station: Actions Under Way to 
Manage Cost, but Significant Challenges Remain, GAO-02-735 (Washington, 
D.C.: July 17, 2002).

[8] The other three initiatives are strategic human capital management, 
competitive sourcing, and expanded electronic government.

[9] According to a NASA project manager, "to go" means from this point 
forward to completion of the project, given the current status of the 
project and the resources available to complete it.

[10] The second Hyper-X flight vehicle flew successfully at Mach 7 
speed in March 2004 (see app. II).

[11] 10 U.S.C. 2433.

[12] See, for example, GAO-04-118; GAO-04-255; GAO-03-114; U.S. General 
Accounting Office, Space Station: Actions Under Way to Manage Cost, but 
Significant Challenges Remain, GAO-02-735 (Washington, D.C.: July 17, 
2002); NASA Program Costs: Space Missions Require Substantially More 
Funding Than Initially Estimated, GAO/NSIAD-93-97 (Washington, D.C.: 
Dec. 31, 1992); and NASA Office of Inspector General, Final Management 
Letter on Failures in Cost Estimating and Risk Management Weaknesses 
in Prior Space Launch Initiative Assignment Numbers A-01-049-01 and A-
01-049-02, IG-03-023 (Washington, D.C.: Sept. 29, 2003).

[13] The cost-estimating handbook is in draft form, but NASA made it 
available for official use by its cost-estimating community and program 
managers. NASA expected to complete the handbook by May 2004.

[14] EVM compares the actual work performed at certain stages of a job 
to its actual costs--rather than comparing budgeted and actual costs, 
the traditional management approach to assessing progress. By measuring 
the value of work that has been completed at certain stages in a job, 
EVM can alert program managers, contractors, and administrators of 
potential cost overruns and schedule delays before they occur and of 
problems that need correcting before they worsen. For a more detailed 
discussion of EVM, see appendix IV.

[15] The full cost of a project is the sum of all direct costs, service 
costs, and general administrative costs. Full cost accounting ties all 
NASA agency costs (including civil service personnel costs) to major 
projects.

[16] A CARD provides a system technical description and programmatic 
information to create a common baseline used by the project team to 
develop estimates.

[17] SEI is a government-funded research organization that is widely 
considered an authority on software implementation. 

[18] SEI developed checklists to help evaluate software costs and 
schedule. However, SEI states that these checklists are equally 
applicable to hardware and systems engineering projects. 

[19] If a program provided substantiating evidence for a criterion, we 
determined that the program "fully met" the criterion. If partial 
evidence was provided for a criterion, we determined the program 
"partially met" the criterion. If no evidence was found, then we 
determined that the criterion was "not met."

[20] For example, a cost estimate of $1 million could be presented 
either as a range of $900,000 to $1.1 million or as $1 million with a 
confidence interval of 90 percent, indicating that there is a 10-
percent chance that the cost will exceed the estimate.

[21] A Monte Carlo simulation randomly generates values for uncertain 
variables over and over to simulate a model. Without the aid of 
simulation, a model will only reveal a single outcome, generally the 
most likely or average scenario, but after hundreds or thousands of 
trials, one can view the statistics of the results and the certainty of 
any outcome. 
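
For illustration only, a minimal Monte Carlo sketch in Python 
(hypothetical cost elements and triangular distributions, not NASA 
data or tools) shows how repeated trials yield a distribution of 
outcomes rather than a single estimate:

import random

def simulate_total_cost() -> float:
    """One trial: sum randomly drawn costs for three hypothetical work elements (dollars in millions)."""
    design = random.triangular(10, 20, 13)  # low, high, most likely
    hardware = random.triangular(30, 60, 40)
    integration = random.triangular(5, 15, 8)
    return design + hardware + integration

trials = sorted(simulate_total_cost() for _ in range(10000))
median_cost = trials[len(trials) // 2]
p80_cost = trials[int(0.8 * len(trials))]  # 80th-percentile outcome

print(f"Median cost: {median_cost:.1f}  80th percentile: {p80_cost:.1f}")

The percentile values in such a simulation correspond to the 
confidence levels discussed in footnote 20.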

[22] See Earned Value Management, NASA Policy Directive 9501.3A (Aug. 
3, 2002) and Earned Value Management Implementation on NASA Contracts, 
NASA Procedural Requirements 9501.3 (Nov. 24, 2002).

[23] Form 533 captures financial information that is used as a basis for 
the financial management and budget activities within projects and 
NASA-wide.

[24] The International Space Station Program was not a part of our 
review.

[25] NASA has nine centers located around the country and owns the Jet 
Propulsion Laboratory, which is operated by the California Institute of 
Technology.

[26] According to NASA officials, revisions of NASA's current governing 
program and project guidance--NASA Procedures and Guidelines 7120.5B, 
NASA Program and Project Management Processes and Requirements (Nov. 
21, 2002)--is expected to be completed by August 2004, and the draft 
cost-estimating handbook was expected to be finalized by May 2004.

[27] GAO-04-118, GAO-04-151, and GAO-04-43.

[28] Software Engineering Institute, A Manager's Checklist for 
Validating Software Cost and Schedule Estimates, CMU/SEI-95-SR-004 
(Pittsburgh, Penn.: Jan. 1995) and Software Engineering Institute, 
Checklists and Criteria for Evaluating the Cost and Schedule Estimating 
Capabilities of Software Organizations, CMU/SEI-95-SR-005 (Pittsburgh, 
Penn.: 1995).

[29] We use the date the program was initiated to refer to the 
beginning of the formulation subprocess--the phase of a NASA program 
that establishes an affordable project concept and plan to meet mission 
objectives or technology goals.

GAO's Mission: 

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site ( www.gao.gov ) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. General Accounting Office: 

441 G Street NW, Room LM: 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800: 

U.S. General Accounting Office, 441 G Street NW, Room 7149, 
Washington, D.C. 20548: