This is the accessible text file for GAO report number GAO-06-1099T 
entitled 'Information Technology: Improvements Needed to More 
Accurately Identify and Better Oversee Risky Projects Totaling Billions 
of Dollars' which was released on September 7, 2006. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 

Testimony: 

Before the Subcommittee on Federal Financial Management, Government 
Information, and International Security, Committee on Homeland Security 
and Governmental Affairs, U.S. Senate: 

For Release on Delivery Expected at: 

9:30 a.m. EDT: 

Thursday, September 7, 2006: 

Information Technology: 

Improvements Needed to More Accurately Identify and Better Oversee 
Risky Projects Totaling Billions of Dollars: 

Statement of David A. Powner: 
Director, Information Technology Management Issues: 

GAO-06-1099T: 

GAO Highlights: 

Highlights of GAO-06-1099T, a testimony before the Subcommittee on 
Federal Financial Management, Government Information, and International 
Security, Committee on Homeland Security and Governmental Affairs, U.S. 
Senate 

Why GAO Did This Study: 

The Office of Management and Budget (OMB) plays a key role in 
overseeing federal IT investments. The Clinger-Cohen Act, among other 
things, requires OMB to establish processes to analyze, track, and 
evaluate the risks and results of major capital investments in 
information systems made by agencies and to report to Congress on the 
net program performance benefits achieved as a result of these 
investments. 

OMB has developed several processes to help carry out its role. For 
example, OMB began using a Management Watch List several years ago as a 
means of identifying poorly planned projects based on its evaluation of 
agencies’ funding justifications for major projects, known as exhibit 
300s. In addition, in August 2005, OMB established a process for 
agencies to identify high risk projects, i.e., projects requiring 
special attention because of one or more reasons specified by OMB, and 
to report on those that are poorly performing or not meeting 
performance criteria. 

GAO recently issued reports on the Management Watch List, high risk 
projects, and agencies’ exhibit 300s. GAO was asked to summarize (1) 
the number of projects and the fiscal year 2007 dollar value of 
Management Watch List and high risk projects, (2) previously reported 
results on how these projects are identified and provided oversight, 
and (3) recommendations it made to improve these processes. 

What GAO Found: 

As a result of the Management Watch List and high risk projects 
processes, about 300 projects totaling about $12 billion in estimated 
IT expenditures for fiscal year 2007 have been identified as being 
either poorly planned or poorly performing. Specifically, of the 857 
major IT projects in the President's budget for fiscal year 2007, OMB 
placed 263 projects, representing about $10 billion, on its Management 
Watch List. In addition, in response to OMB's memorandum, agencies 
reported that 79 of 226 high risk projects, collectively totaling about 
$2.2 billion, had a performance shortfall. 

While this information helps to focus both agency and OMB management 
attention on these poorly planned and poorly performing projects, GAO 
identified opportunities to strengthen how these projects are 
identified and provided oversight. 

* The Management Watch List may be undermined by inaccurate and 
unreliable data. OMB uses scoring criteria to evaluate agencies’ 
exhibit 300s to derive the projects on its Management Watch List. GAO’s 
detailed evaluation of exhibit 300s showed that the information 
reported in them is not always accurate or supported by documentation. 

* The criteria for identifying high risk projects were not always 
consistently applied, and projects that appeared to meet the criteria 
were not identified as high risk. Without consistent application of the 
high risk criteria, OMB and agency executives cannot have the assurance 
that all projects that require special attention have been identified. 

* For both sets of projects, OMB did not develop a central list of 
projects and deficiencies that could facilitate tracking progress and 
reporting to Congress. Without such lists, OMB is not fully exploiting 
the opportunity to analyze and track these projects on a governmentwide 
basis and is not involving Congress in the oversight of these risky 
projects. 

To improve the way the Management Watch List and high risk projects are 
identified and provided oversight, GAO has made a number of 
recommendations to the Director of OMB. These recommendations include 
directing agencies to improve the accuracy and reliability of exhibit 
300 information and to consistently apply the high risk criteria 
defined by OMB. In addition, GAO recommended that the Director develop 
a single, aggregate list for both the Management Watch List and high 
risk projects to facilitate tracking progress, performing 
governmentwide analysis, and reporting the results to Congress. OMB 
generally disagreed with these recommendations. However, GAO believes 
that they are needed to provide greater assurance that poorly planned 
and poorly performing projects are more accurately identified and 
provided oversight, and ultimately ensure that potentially billions of 
taxpayer dollars are not wasted. 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-1099T]. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact David A. Powner at (202) 
512-9286 or pownerd@gao.gov. 

[End of Section] 

Mr. Chairman and Members of the Subcommittee: 

I am pleased to be here today to discuss the federal government's 
processes for improving the management of IT investments that total $64 
billion for fiscal year 2007. Effective management of these investments 
is essential to the health, economy, and security of the nation. The 
Office of Management and Budget (OMB) plays a key role in overseeing 
federal IT investments. The Clinger-Cohen Act, among other things, 
requires OMB to establish processes to analyze, track, and evaluate the 
risks and results of major capital investments in information systems 
made by executive agencies and to report to Congress on the net program 
performance benefits achieved as a result of these investments. 

To help carry out its role, OMB has developed several processes to 
improve the management of federal IT projects, including the e-Gov 
scorecard,[Footnote 1] Management Watch List, and high risk projects. 
The Management Watch List identifies projects with weaknesses in their 
funding justifications (or exhibit 300s) based on an evaluation of 
these documents. High risk projects are projects requiring special 
attention from oversight authorities and the highest level of agency 
management because of one or more of the following four 
reasons:[Footnote 2] (1) the agency has not consistently demonstrated 
the ability to manage complex projects; (2) the project has 
exceptionally high development, operating, or maintenance costs; (3) 
the project addresses deficiencies in the agency's ability to perform 
an essential mission program or function; or (4) the project's delay 
or failure would impact the agency's essential mission functions. 
Agencies are also required to provide quarterly reports to OMB on 
identified high risk projects that have performance shortfalls, 
meaning that they do not meet one or more of four performance 
evaluation criteria. The performance criteria are (1) establishing 
baselines with clear cost, schedule, and performance goals; (2) 
maintaining the project's cost and schedule variances within 10 
percent; (3) assigning a qualified project manager; and (4) avoiding 
duplication by leveraging interagency and governmentwide investments. 

These processes, among other things, are instrumental in helping to 
identify and improve oversight of poorly planned and poorly performing 
projects. Given the importance of these processes, you asked us to 
summarize (1) the number of projects and fiscal year 2007 dollar value 
of Management Watch List and high risk projects, (2) previously 
reported results on how these projects are identified and provided 
oversight, and (3) recommendations made to improve these processes. In 
preparing this testimony, we summarized our previous reports on 
initiatives for improving the management of federal IT 
investments.[Footnote 3] The work in these reports was performed in 
accordance with generally accepted government auditing standards. 

Results in Brief: 

As a result of the Management Watch List and high risk projects 
processes, about 300 projects totaling about $12 billion in estimated 
IT expenditures for fiscal year 2007 have been identified as being 
either poorly planned or poorly performing. Of the 857 major IT 
projects in the President's budget for fiscal year 2007, OMB placed 263 
projects, representing about $10 billion, on its Management Watch 
List. In addition, in response to OMB's memorandum, agencies reported 
that 79 of 226 high risk projects, collectively totaling about $2.2 
billion, had a performance shortfall, primarily associated with cost 
and schedule 
variances that exceeded 10 percent. 

While this information helps to focus both agency and OMB management 
attention on these poorly planned and poorly performing projects, our 
reviews identified opportunities to strengthen how these projects are 
identified and provided oversight. 

* The Management Watch List may be undermined by inaccurate and 
unreliable data. OMB uses scoring criteria to evaluate each major 
project's funding justification (known as the exhibit 300) to derive 
the projects on its Management Watch List. Our detailed evaluation of 
exhibit 300s showed that the information reported in them is not always 
accurate or supported by documentation. 

* For the high risk projects, the criteria for identifying projects 
were not always consistently applied, and we found examples of projects 
that appeared to meet the criteria but were not identified as high 
risk. Without consistent application of the high risk criteria, OMB and 
agency executives cannot have the assurance that all projects that 
require special attention have been identified. 

* For both sets of projects, OMB did not develop a central list of 
projects and deficiencies that could facilitate the tracking of 
progress and reporting to Congress. By not having such lists, OMB is 
not fully exploiting the opportunity to analyze and track these 
projects on a governmentwide basis and to involve Congress in the 
oversight of these risky projects. 

To improve the way the Management Watch List and high risk projects are 
identified and provided oversight, we have made a number of 
recommendations to the Director of OMB. These recommendations include 
directing agencies to improve the accuracy and reliability of exhibit 
300 information and to consistently apply the high risk criteria 
defined by OMB. In addition, we recommended that the Director provide 
for training of agency personnel responsible for completing exhibit 
300s and develop a single, aggregate list for both the Management 
Watch List projects and for high risk projects to facilitate tracking 
progress, performing governmentwide analysis, and reporting the results 
to Congress. OMB generally disagreed with our recommendations. However, 
we continue to believe that they are needed to help accurately identify 
poorly planned and performing projects and more effectively oversee 
these projects. 

Background: 

Each year, OMB and federal agencies work together to determine how much 
the government plans to spend for IT and how these funds are to be 
allocated. Federal IT spending has risen to an estimated $64 billion in 
fiscal year 2007. 

OMB plays a key role in overseeing federal IT investments and how they 
are managed. To drive improvement in the implementation and management 
of IT projects, Congress enacted the Clinger-Cohen Act in 1996 to 
further expand the responsibilities of OMB and the agencies under the 
Paperwork Reduction Act.[Footnote 4] In particular, the act requires 
agency heads, acting through agency chief information officers (CIOs), 
to, among other things, better link their IT planning and investment 
decisions to program missions and goals and to implement and enforce IT 
management policies, procedures, standards, and guidelines. The Clinger-
Cohen Act requires that agencies engage in capital planning and 
performance and results-based management.[Footnote 5] The act also 
requires OMB to establish processes to analyze, track, and evaluate the 
risks and results of major capital investments in information systems 
made by executive agencies. OMB is also required to report to Congress 
on the net program performance benefits achieved as a result of major 
capital investments in information systems that are made by executive 
agencies.[Footnote 6] 

In response to the Clinger-Cohen Act and other statutes, OMB developed 
policy for planning, budgeting, acquisition, and management of federal 
capital assets. This policy is set forth in OMB Circular A-11 (section 
300) and in OMB's Capital Programming Guide (supplement to Part 7 of 
Circular A-11), which directs agencies to develop, implement, and use a 
capital programming process to build their capital asset portfolios. 
Among other things, OMB's Capital Programming Guide directs agencies 
to: 

* evaluate and select capital asset investments that will support core 
mission functions that must be performed by the federal government and 
demonstrate projected returns on investment that are clearly equal to 
or better than alternative uses of available public resources; 

* institute performance measures and management processes that monitor 
actual performance and compare it with planned results; and: 

* establish oversight mechanisms that require periodic review of 
operational capital assets to determine how mission requirements might 
have changed and whether the asset continues to fulfill mission 
requirements and deliver intended benefits to the agency and customers. 

To further support the implementation of IT capital planning practices, 
we have developed an IT investment management framework[Footnote 7] 
that agencies can use in developing a stable and effective capital 
planning process, as required by statute and directed in OMB's Capital 
Programming Guide. Consistent with the statutory focus on 
selecting,[Footnote 8] controlling,[Footnote 9] and evaluating[Footnote 
10] investments, this framework focuses on these processes in relation 
to IT investments specifically. It is a tool that can be used to 
determine both the status of an agency's current IT investment 
management capabilities and the additional steps that are needed to 
establish more effective processes. Mature and effective management of 
IT investments can vastly improve government performance and 
accountability. Without good management, such investments can result in 
wasteful spending and lost opportunities for improving delivery of 
services to the public. 

Prior Reviews on Federal IT Investment Management Have Identified 
Weaknesses: 

Only by effectively and efficiently managing their IT resources through 
a robust investment management process can agencies gain opportunities 
to make better allocation decisions among many investment alternatives 
and further leverage their investments. However, the federal government 
faces enduring IT challenges in this area. For example, in January 2004 
we reported on mixed results of federal agencies' use of IT investment 
management practices.[Footnote 11] Specifically, we reported that 
although most of the agencies had IT investment boards responsible for 
defining and implementing the agencies' IT investment management 
processes, agencies did not always have important mechanisms in place 
for these boards to effectively control investments, including decision-
making rules for project oversight, early warning mechanisms, and/or 
requirements that corrective actions for underperforming projects be 
agreed upon and tracked. Executive-level oversight of project-level 
management activities provides organizations with increased assurance 
that each investment will achieve the desired cost, benefit, and 
schedule results. Accordingly, we made several recommendations to 
agencies to improve their practices. 

In previous work using our investment management framework, we reported 
that the use of IT investment management practices by agencies was 
mixed. For example, a few agencies that have followed the framework in 
implementing capital planning processes have made significant 
improvements.[Footnote 12] In contrast, however, we and others have 
continued to identify weaknesses at agencies in many areas, including 
immature management processes to support both the selection and 
oversight of major IT investments and the measurement of actual versus 
expected performance in meeting established performance 
measures.[Footnote 13] 

OMB's Management Watch List Intended to Correct Project Weaknesses and 
Business Case Deficiencies: 

In helping to ensure that investments of public resources are justified 
and that public resources are wisely invested, OMB began using the 
Management Watch List, in the President's fiscal year 2004 budget 
request, as a means to oversee the justification for and planning of 
agencies' IT investments. This list was derived based on a detailed 
review of the investments' Capital Asset Plan and Business Case, also 
known as the exhibit 300. 

The exhibit 300 is a reporting mechanism intended to enable an agency 
to demonstrate to its own management, as well as OMB, that a major 
project is well planned in that it has employed the disciplines of good 
project management; developed a strong business case for the 
investment; and met other Administration priorities in defining the 
cost, schedule, and performance goals proposed for the investment. 

We reported in 2005 that OMB analysts evaluate agency exhibit 300s by 
assigning scores to each exhibit 300 based on guidance presented in OMB 
Circular A-11.[Footnote 14] As described in this circular, the scoring 
of a business case consists of individual scoring for 10 categories, as 
well as a total composite score of all the categories. The 10 
categories are: 

* acquisition strategy, 

* project (investment) management, 

* enterprise architecture, 

* alternatives analysis, 

* risk management, 

* performance goals, 

* security and privacy, 

* performance-based management system (including the earned value 
management system[Footnote 15]), 

* life-cycle costs formulation, and: 

* support of the President's Management Agenda. 

Using these scores, a project was placed on the Management Watch List 
if its exhibit 300 business case received a total composite score of 
3 or less, or if it received a score of 3 or less in the areas of 
performance goals, performance-based management systems, or security 
and privacy, even if its overall score was a 4 or 5. To derive the 
total number of projects on the list that were reported for fiscal year 
2005, OMB polled the individual analysts and compiled the numbers. 
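
The placement rule just described is mechanical, so a brief 
illustrative sketch may help. The Python fragment below is not OMB's 
actual scoring tool, and the project scores shown are hypothetical; it 
simply applies the stated rule that a project is listed if its 
composite score is 3 or less, or if any of the three key categories 
scores 3 or less.

KEY_CATEGORIES = ("performance goals",
                  "performance-based management system",
                  "security and privacy")

def on_watch_list(composite_score, category_scores):
    # Listed if the composite exhibit 300 score is 3 or less...
    if composite_score <= 3:
        return True
    # ...or if any key category scores 3 or less, even with a composite of 4 or 5.
    return any(category_scores.get(c, 5) <= 3 for c in KEY_CATEGORIES)

# Hypothetical project: composite score of 4 but a weak security and privacy score.
scores = {"performance goals": 4,
          "performance-based management system": 4,
          "security and privacy": 2}
print(on_watch_list(4, scores))  # True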

According to OMB, agencies with weaknesses in these three areas were to 
submit remediation plans addressing the weaknesses. OMB officials also 
stated that decisions on follow-up and monitoring the progress were 
typically made by staff with responsibility for reviewing individual 
agency budget submissions, depending on the staff's insights into 
agency operations and objectives. According to OMB officials, those 
Management Watch List projects that did receive specific follow-up 
attention received feedback through the passback process; through 
targeted evaluation of remediation plans designed to address 
weaknesses; through the apportioning of funds, so that the use of 
budgeted dollars was conditional on appropriate remediation plans 
being in place; and through the quarterly e-Gov Scorecards. 

OMB Issued August 2005 Memorandum on Improving Performance of High Risk 
IT Projects: 

To improve IT project execution, OMB issued a memorandum in August 2005 
to all federal CIOs, directing them to begin taking steps to identify 
IT projects that are high risk and to report quarterly on their 
performance.[Footnote 16] As originally defined in OMB Circular A-11 
and subsequently reiterated in the August 2005 memorandum, high risk 
projects are those that require special attention from oversight 
authorities and the highest levels of agency management because of one 
or more of the following four reasons: 

* The agency has not consistently demonstrated the ability to manage 
complex projects. 

* The project has exceptionally high development, operating, or 
maintenance costs, either in absolute terms or as a percentage of the 
agency's total IT portfolio. 

* The project is being undertaken to correct recognized deficiencies in 
the adequate performance of an essential mission program or function of 
the agency, a component of the agency, or another organization. 

* Delay or failure of the project would introduce for the first time 
unacceptable or inadequate performance or failure of an essential 
mission function of the agency, a component of the agency, or another 
organization. 

As directed in the memorandum, agencies are to work with OMB to 
identify their high risk IT projects using these criteria. Most 
agencies reported that, to identify high risk projects, CIO office 
staff compared the criteria against their current portfolio to 
determine which projects met OMB's definition. They then submitted the 
list to OMB for review. According to OMB and agency officials, after 
the submission of the initial list, examiners at OMB worked with 
individual agencies to identify or remove projects as appropriate. 
According to most agencies, the final list was then approved by their 
CIO. 
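
As an illustration only (this is not any agency's actual screening 
tool, and the portfolio below is hypothetical), the comparison of a 
portfolio against OMB's four high risk reasons amounts to flagging any 
project for which at least one reason applies:

from dataclasses import dataclass

@dataclass
class Project:
    name: str
    agency_lacks_complex_project_track_record: bool  # reason 1
    exceptionally_high_cost: bool                     # reason 2
    corrects_essential_mission_deficiency: bool       # reason 3
    delay_or_failure_impacts_mission: bool            # reason 4

    def is_high_risk(self):
        # A project is high risk if any one of the four reasons applies.
        return (self.agency_lacks_complex_project_track_record
                or self.exceptionally_high_cost
                or self.corrects_essential_mission_deficiency
                or self.delay_or_failure_impacts_mission)

portfolio = [Project("Case management modernization", False, False, False, True),
             Project("Routine payroll upgrade", False, False, False, False)]
print([p.name for p in portfolio if p.is_high_risk()])
# ['Case management modernization']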

For the identified high risk projects, beginning September 15, 2005, 
and quarterly thereafter, CIOs were to assess, confirm, and document 
projects' performance. Specifically, agencies were required to 
determine, for each of their high risk projects, whether the project 
was meeting each of four performance evaluation criteria: (1) 
establishing baselines with clear cost, schedule, and performance 
goals; (2) maintaining the project's cost and schedule variances within 
10 percent; (3) assigning a qualified project manager; and (4) avoiding 
duplication by leveraging interagency and governmentwide investments. 
If a high risk project did not meet one or more of these four 
performance evaluation criteria, agencies were instructed to document 
the shortfall using a standard template provided by OMB and to provide 
this template to oversight authorities (e.g., OMB, agency inspectors 
general, agency management, and GAO) on request. Upon submission, 
according to OMB staff, individual analysts review the quarterly 
performance reports of projects with shortfalls to determine how well 
the projects are progressing and whether the actions described in the 
planned improvement efforts are adequate, using other performance data 
already received on IT projects, such as the e-Gov Scorecards, earned 
value management data, and the exhibit 300. 
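
A simplified sketch of this quarterly shortfall test follows. It is 
illustrative only: the field names and figures are hypothetical, and 
the particular way the variance percentages are computed from earned 
value quantities is an assumption made here, not something the 
testimony specifies.

def has_shortfall(project):
    # A high risk project has a performance shortfall if it fails any one
    # of the four performance evaluation criteria.
    baselined = project["has_baseline"]              # criterion 1
    qualified_pm = project["has_qualified_pm"]       # criterion 3
    no_duplication = project["avoids_duplication"]   # criterion 4

    # Criterion 2: cost and schedule variances within 10 percent, computed
    # here from standard earned value quantities (assumed formulas).
    ev = project["earned_value"]
    pv = project["planned_value"]
    ac = project["actual_cost"]
    cost_var_pct = abs(ev - ac) / ev * 100
    sched_var_pct = abs(ev - pv) / pv * 100
    within_10 = cost_var_pct <= 10 and sched_var_pct <= 10

    return not (baselined and qualified_pm and no_duplication and within_10)

sample = {"has_baseline": True, "has_qualified_pm": True, "avoids_duplication": True,
          "earned_value": 8.0, "planned_value": 10.0, "actual_cost": 8.5}  # $ millions
print(has_shortfall(sample))  # True: schedule variance of 20 percent exceeds 10 percent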

Poorly Planned and Performing Projects Identified, Totaling About $12 
Billion in Estimated Expenditures for Fiscal Year 2007: 

About 300 projects totaling about $12 billion in estimated IT 
expenditures for fiscal year 2007 have been placed on OMB's Management 
Watch List or identified as high risk projects with performance 
shortfalls. Specifically, the President's budget for fiscal year 2007 
included 857 major IT projects, totaling approximately $64 billion. Of 
these, OMB reported that 263 proposed major projects, totaling $10 
billion, were poorly planned. In addition, agencies reported that 79 
of the 226 high risk projects identified as of March 2006, collectively 
totaling about $2.2 billion, had a performance shortfall, primarily 
associated with cost and schedule variances that exceeded 10 percent. 

OMB first reported on the Management Watch List in the President's 
budget request for 2004. While the number of projects and their 
associated budget have decreased since then, they still represent a 
significant percentage of the total IT budget. Table 1 shows the budget 
information for projects on the Management Watch List for fiscal years 
2004, 2005, 2006, and 2007. 

Table 1: Management Watch List Budget for Fiscal Years 2004, 2005, 
2006, and 2007: 

Fiscal Years: 2004; 
Total IT budget (in billions): $59.0; 
IT budget for Management Watch List projects (in billions): $20.9; 
Percentage of budget for Management Watch List projects: 35%. 

Fiscal Years: 2005; 
Total IT budget (in billions): $60.0; 
IT budget for Management Watch List projects (in billions): $22.0; 
Percentage of budget for Management Watch List projects: 37%. 

Fiscal Years: 2006; 
Total IT budget (in billions): $65.0; 
IT budget for Management Watch List projects (in billions): $15.0; 
Percentage of budget for Management Watch List projects: 23%. 

Fiscal Years: 2007; 
Total IT budget (in billions): $64.0; 
IT budget for Management Watch List projects (in billions): $9.9; 
Percentage of budget for Management Watch List projects: 15%. 

Source: GAO analysis of OMB data. 

[End of table] 

Table 2 provides the number of projects on the Management Watch List 
for fiscal years 2004, 2005, 2006, and 2007. 

Table 2: Number of Projects on Management Watch List for Fiscal Years 
2004, 2005, 2006, and 2007: 

Fiscal Years: 2004; 
Total IT projects: 1400; 
Management Watch List projects: 771; 
Percentage of projects on Management Watch List: 55%. 

Fiscal Years: 2005; 
Total IT projects: 1200; 
Management Watch List projects: 621; 
Percentage of projects on Management Watch List: 52%. 

Fiscal Years: 2006; 
Total IT projects: 1087; 
Management Watch List projects: 342; 
Percentage of projects on Management Watch List: 31%. 

Fiscal Years: 2007; 
Total IT projects: 857; 
Management Watch List projects: 263; 
Percentage of projects on Management Watch List: 31%. 

Source: GAO analysis of OMB data. 

[End of table] 
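
The percentage columns in tables 1 and 2 follow directly from the 
other two columns in each table. The short fragment below simply 
recomputes them from the figures reported above, rounding to whole 
percentages.

budget = {2004: (59.0, 20.9), 2005: (60.0, 22.0),  # fiscal year: (total IT budget,
          2006: (65.0, 15.0), 2007: (64.0, 9.9)}   # watch list budget), in billions
projects = {2004: (1400, 771), 2005: (1200, 621),  # fiscal year: (total IT projects,
            2006: (1087, 342), 2007: (857, 263)}   # watch list projects)

for year in sorted(budget):
    total_b, watch_b = budget[year]
    total_p, watch_p = projects[year]
    print(f"{year}: {watch_b / total_b:.0%} of budget, {watch_p / total_p:.0%} of projects")
# 2004: 35% of budget, 55% of projects ... 2007: 15% of budget, 31% of projects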

In addition, in response to OMB's August 2005 memorandum, the 24 major 
agencies identified 226 IT projects as high risk, totaling about $6.4 
billion in funding requested for fiscal year 2007.[Footnote 17] 
Agencies identified most projects as high risk because their delay or 
failure would impact the essential business functions of the agency. In 
addition, agencies reported that about 35 percent of the high risk 
projects--or 79 investments, collectively totaling about $2.2 billion 
in fiscal year 2007--had a performance shortfall, primarily associated 
with cost and schedule variances that exceeded 10 percent.[Footnote 18] 

Figure 1 illustrates the number of agency high risk projects as of 
March 2006 with and without shortfalls. The majority of the agencies 
reported that their high risk projects did not have performance 
shortfalls in any of the four areas identified by OMB. In addition, six 
agencies--the departments of Commerce, Energy, Housing and Urban 
Development, and Labor; the National Aeronautics and Space 
Administration; and the National Science Foundation--reported that none 
of their high risk projects experienced any performance shortfalls. 

Figure 1: Number of Agencies' High Risk Projects with and without 
Performance Shortfalls (as of March 2006): 

[See PDF for image] 

Source: GAO analysis of 24 CFO agencies' March 2006 high risk reports. 

Note: Department of Agriculture (USDA); Department of Health and Human 
Services (HHS); Department of Homeland Security (DHS); Department of 
Housing and Urban Development (HUD); Department of Veterans Affairs 
(VA); Environmental Protection Agency (EPA); General Services 
Administration (GSA); National Aeronautics and Space Administration 
(NASA); National Science Foundation (NSF); Nuclear Regulatory 
Commission (NRC); Office of Personnel Management (OPM); Small Business 
Administration (SBA); Social Security Administration (SSA); Agency for 
International Development (USAID): 

[End of figure] 

Improvements Needed to Identify and Oversee Management Watch List and 
High Risk Projects: 

While the Management Watch List and high risk processes serve to 
highlight poorly planned and performing projects and focus attention on 
them, our reviews identified opportunities to strengthen the 
identification and oversight of projects for each. 

Management Watch List May Be Based on Unreliable Data and High Risk 
Project Criteria Are Not Always Consistently Applied: 

OMB's Management Watch List may be undermined by inaccurate and 
unreliable data. While OMB uses the exhibit 300s as the basis for 
designating projects as poorly planned, we have recently 
reported[Footnote 19] that the underlying support for the information 
provided in the exhibit 300s we reviewed was often inadequate. Three 
general types of weaknesses were evident: 

* All exhibit 300s had documentation weaknesses. Documentation either 
did not exist or did not fully agree with specific areas of the exhibit 
300. 

* Agencies did not always demonstrate that they complied with federal 
or departmental requirements or policies with regard to management and 
reporting processes. Also, none had cost analyses that fully complied 
with OMB requirements for cost-benefit and cost-effectiveness analyses. 
In contrast, most investments did demonstrate compliance with 
information security planning and training requirements. 

* In sections that required actual cost data, these data were 
unreliable because they were not derived from cost-accounting systems 
with adequate controls. In the absence of such systems, agencies 
generally derived cost information from ad hoc processes. 

Moreover, although agencies, with OMB's assistance, generally 
identified their high risk projects using criteria specified by OMB, 
these criteria were not always consistently applied. 

* In several cases, agencies did not use OMB's criteria to identify 
high risk projects. Some agencies reported using other reasons to 
identify a total of 31 high risk projects. For example, the Department 
of Homeland Security reported investments as high risk because OMB's 
evaluation had identified weaknesses in their business cases. The 
Department of Transportation reported projects as high risk because 
two lacked approved baselines and four had incomplete or poor earned 
value management assessments. 

* Regarding the criterion for high risk designation that the agency has 
not consistently demonstrated the ability to manage complex projects, 
only three agencies reported having projects meeting this criterion. 
This appears to be somewhat low, considering that we and others have 
previously reported on weaknesses in numerous agencies' ability to 
manage complex projects. For example, we have reported in our high risk 
series on major programs and operations that need urgent attention and 
transformation in order to ensure that our national government 
functions in the most economical, efficient, and effective manner 
possible.[Footnote 20] Specifically, the Department of Defense's 
efforts to modernize its business systems have been hampered because of 
weaknesses in practices for (1) developing and using an enterprise 
architecture, (2) instituting effective investment management 
processes, and (3) establishing and implementing effective systems 
acquisition processes. We concluded that the Department of Defense, as 
a whole, remains far from where it needs to be to effectively and 
efficiently manage an undertaking with the size, complexity, and 
significance of its departmentwide business systems modernization. We 
also reported that, after almost 25 years and $41 billion, efforts to 
modernize the air traffic control program of the Federal Aviation 
Administration, the Department of Transportation's largest component, 
are far from complete and that projects continue to face challenges in 
meeting cost, schedule, and performance expectations.[Footnote 21] 
However, neither the Department of Defense nor the Department of 
Transportation cited the "inability to consistently manage complex 
projects" criterion in designating any projects as high risk. 

* Finally, while agencies have reported a significant number of IT 
projects as high risk, we identified other projects, on which we have 
reported and testified, that appear to meet one or more of OMB's 
criteria for high risk designation, such as exceptionally high 
development or operating costs and recognized deficiencies in the 
performance of essential mission functions, but that were not 
identified as high risk. Examples we have recently reported include 
the following projects: 

* The Decennial Response Integration System of the Census Bureau is 
intended to integrate paper, Internet, and telephone responses. Its 
high development and operating costs are expected to make up a large 
portion of the $1.8 billion program to develop, test, and implement 
decennial census systems. In March 2006,[Footnote 22] we testified that 
the component agency has established baseline requirements for the 
acquisition, but the bureau has not yet validated them or implemented a 
process for managing the requirements. We concluded that, until these 
and other basic contract management activities are fully implemented, 
this project faces increased risks that the system will experience 
cost overruns, schedule delays, and performance shortfalls. 

* The National Polar-Orbiting Operational Environmental Satellite 
System--an initiative managed by the Department of Commerce, the 
Department of Defense, and the National Aeronautics and Space 
Administration--is to converge two satellite programs into a single 
satellite program capable of satisfying both civilian and military 
requirements. In November 2005,[Footnote 23] we reported that the 
system was a troubled program because of technical problems on critical 
sensors, escalating costs, poor management at multiple levels, and the 
lack of a decision on how to proceed with the program. Over the last 
several years, this system has experienced continual cost increases to 
about $10 billion and schedule delays, requiring difficult decisions 
about the program's direction and capabilities. More recently, we 
testified[Footnote 24] that the program is still in trouble and that 
its future direction is not yet known. While the program office has 
corrective actions under way, we concluded that, as the project 
continues, it will be critical to ensure that the management issues of 
the past are not repeated. 

* Rescue 21 is a planned coastal communications system of the 
Department of Homeland Security. We recently reported[Footnote 25] that 
inadequacies in several areas contributed to Rescue 21 cost overruns 
and schedule delays. These inadequacies occurred in requirements 
management, project monitoring, risk management, contractor cost and 
schedule estimation and delivery, and executive level oversight. 
Accordingly, the estimated total acquisition cost has increased from 
$250 million in 1999 to $710.5 million in 2005, and the timeline for 
achieving full operating capability has been extended from 2006 to 
2011. 

For the projects we identified as appearing to meet OMB's criteria for 
high risk, the responsible agencies reported that they did not consider 
these investments to be high risk projects for such reasons as (1) the 
project was not a major investment; (2) agency management is 
experienced in overseeing projects; or (3) the project did not have 
weaknesses in its business case. In particular, one agency stated that 
its list does not include all high risk projects, only those that are 
the highest priority among its high risk investments. However, none of the 
reasons provided are associated with OMB's high risk definition. 
Without consistent application of the criteria, OMB and agency executives 
cannot have the assurance that all projects that require special 
attention have been identified. 

OMB Does Not Use an Aggregate List to Perform Its Oversight of the 
Management Watch List or High Risk Projects: 

While OMB's Management Watch List identified opportunities to 
strengthen investments and promote improvements in IT management, OMB 
did not develop a single, aggregate list identifying the projects and 
their weaknesses. According to OMB officials, they did not construct a 
single list of projects meeting their watch list criteria because they 
did not see such an activity as necessary in performing OMB's 
predominant mission: to assist in overseeing the preparation of the 
federal budget and to supervise agency budget administration. Thus, OMB 
did not exploit the opportunity to use the list as a tool for analyzing 
IT investments on a governmentwide basis, limiting its ability to 
identify and report on the full set of IT investments requiring 
corrective actions. 

In addition, while OMB asked agencies to take corrective actions to 
address weaknesses associated with projects on the Management Watch 
List, it did not develop a structured, consistent process or criteria 
for deciding how to follow up on these actions. We also reported that 
because it did not consistently monitor the follow-up performed, OMB 
could not tell us which of the 621 projects identified on the fiscal 
year 2005 list received follow-up attention, and it did not know 
whether the specific project risks that it identified through its 
Management Watch List were being managed effectively. This approach 
could leave resources at risk of being committed to poorly planned and 
managed projects. Thus, OMB was not using its Management Watch List as 
a tool for improving IT investments on a governmentwide basis and 
focusing attention where it was most needed. 

Similar to the Management Watch List, we reported in June 2006 that 
while OMB analysts review the quarterly performance reports on high 
risk projects, they did not compile a single aggregate list of high 
risk projects. According to OMB staff, they did not see such an 
activity as necessary in achieving the intent of the guidance--to 
improve project planning and execution. Consistent with our Management 
Watch List observations and recommendations, we believe that by not 
having a single list, OMB is limiting its ability to identify and 
report on the 
full set of IT investments across the federal government that require 
special oversight and greater agency management attention. 

Implementation of Recommendations Can Lead to Improved Processes to 
Identify and Oversee Management Watch List and High Risk Projects: 

To address our key findings, we made several recommendations to the 
Director of OMB. For example, to improve how the Management Watch List 
projects are identified, we have made several recommendations to 
improve the accuracy and validity of exhibit 300s for major IT 
investments, including that the Director require agencies to determine 
the extent to which the information contained in each exhibit 300 is 
accurate and reliable, and, where weaknesses in accuracy and 
reliability are identified, disclose them and explain the agency's 
approach to mitigating them. We also recommended that the Director 
provide for training of agency personnel responsible for completing 
exhibit 300s, and specified that, in developing the training, OMB 
consult with agencies to identify deficiencies that the training should 
address. Likewise, to improve how high risk projects are identified, we 
recommended that the Director direct federal agency CIOs to ensure that 
they are consistently applying the high risk criteria defined by OMB. 

To improve how the Management Watch List is provided oversight, in our 
April 2005 report, we recommended that the Director of OMB develop a 
central list of projects and their deficiencies and report to Congress 
on progress made in addressing risks of major IT investments and 
management areas needing attention. In addition, to fully realize the 
potential benefits of using the Management Watch List, we recommended 
that OMB use the list as the basis for selecting projects for 
follow-up and for tracking follow-up activities, and that it analyze 
the prioritized list to develop governmentwide and agency assessments 
of the progress and risks of IT investments, identifying opportunities 
for continued improvement. 
We also made similar recommendations to the Director of OMB regarding 
high risk projects. Specifically, we recommended that OMB develop a 
single aggregate list of high risk projects and their deficiencies and 
use that list to report to Congress on progress made in correcting high 
risk problems, actions under way, and further actions that may be 
needed. 

OMB generally disagreed with our recommendations for strengthening the 
Management Watch List and high risk projects processes. Specifically, 
OMB's Administrator of the Office of E-Government and Information 
Technology stated that the ultimate responsibility to improve the 
accuracy and reliability of the exhibit 300s lies with the agencies. 
While this is true, OMB also has statutory responsibility for providing 
IT guidance governmentwide, especially when it involves an OMB-required 
budget document. Regarding the consistent application of the high risk 
criteria, the Administrator stated that some flexibility in the 
application of the criteria is essential. While some flexibility may be 
appropriate, we believe that these criteria should be more consistently 
applied so that projects that clearly meet them are identified and 
provided oversight. The Administrator also disagreed that aggregated 
governmentwide Management Watch List and high risk project lists are 
necessary to perform adequate oversight. 
However, we continue to believe that these lists are needed to 
facilitate OMB's ability to track progress. Addressing these 
recommendations would provide increased assurance that poorly planned 
and performing projects are accurately identified and more effectively 
provided oversight. 

In summary, the Management Watch List and High Risk processes play 
important roles in improving the management of federal IT investments 
by helping to identify poorly planned and performing projects totaling 
at least $12 billion that require management attention. However, the 
number of projects identified on both lists is likely understated 
because the Management Watch List is derived from budgetary documents 
that are not always accurate and reliable and the high risk projects 
are not always identified consistently using OMB criteria. In addition, 
we noted areas where oversight of both sets of projects could be 
strengthened primarily by reporting the results in the aggregate so 
that governmentwide analyses can be performed, progress can be tracked, 
and Congress can be informed. The recommendations we made to agencies 
and OMB to address these issues are aimed at providing greater 
assurance that poorly planned and performing projects are more 
accurately identified and provided adequate oversight, and ultimately 
at ensuring that potentially billions of taxpayer dollars are not wasted. 

Contacts and Acknowledgements: 

If you should have any questions about this testimony, please contact 
me at (202) 512-9286 or by e-mail at pownerd@gao.gov. Other individuals 
who made key contributions to this testimony are Sabine Paul and Niti 
Tandon. 

FOOTNOTES 

[1] The quarterly e-Gov Scorecards are reports that use a red/yellow/ 
green scoring system to illustrate the results of OMB's evaluation of 
agencies' implementation of e-government criteria in the President's 
Management Agenda. The scores are determined in quarterly reviews, 
where OMB evaluates agency progress toward agreed-upon goals along 
several dimensions, and provides input to the quarterly reporting on 
the President's Management Agenda. Key criteria used to score 
agencies' e-government progress include acceptable business cases, 
cost and schedule performance, and security accreditation. As of June 
30, 2006, 
21 of the 26 departments/major agencies were identified as having a 
yellow (mixed results) or red (unsatisfactory) score. 

[2] These reasons are specified in OMB, Memorandum for Chief 
Information Officers: Improving Information Technology (IT) Project 
Planning and Execution, M-05-23 (Washington, D.C.: Aug. 4, 2005). 

[3] GAO, Information Technology: OMB Can Make More Effective Use of 
Its Investment Reviews, GAO-05-276 (Washington, D.C.: Apr. 15, 2005); 
GAO, Information Technology: Agencies Need to Improve the Accuracy and 
Reliability of Investment Information, GAO-06-250 (Washington, D.C.: 
Jan. 12, 2006); and GAO, Information Technology: Agencies and OMB 
Should Strengthen Processes for Identifying and Overseeing High Risk 
Projects, GAO-06-647 (Washington, D.C.: June 15, 2006). 

[4] 44 U.S.C. § 3504(a)(1)(B)(vi)(OMB); 44 U.S.C. § 3506(h)(5) 
(agencies). 

[5] 40 U.S.C. § 11312; 40 U.S.C. § 11313. 

[6] These requirements are specifically described in the Clinger-Cohen 
Act, 40 U.S.C. § 11302 (c). 

[7] GAO, Information Technology Investment Management: A Framework for 
Assessing and Improving Process Maturity, GAO-04-394G (Washington, 
D.C.: March 2004). 

[8] During the selection phase, the organization (1) identifies and 
analyzes each project's risks and returns before committing significant 
funds to any project and (2) selects those IT projects that will best 
support its mission needs. 

[9] During the control phase, the organization ensures that, as 
projects develop and investment expenditures continue, the project is 
continuing to meet mission needs at the expected levels of cost and 
risk. If the project is not meeting expectations or if problems have 
arisen, steps are quickly taken to address the deficiencies. 

[10] During the evaluation phase, actual versus expected results are 
compared once projects have been fully implemented. This is done to (1) 
assess the project's impact on mission performance, (2) identify any 
changes or modifications to the project that may be needed, and (3) 
revise the investment management process based on lessons learned. 

[11] GAO, Information Technology Management: Governmentwide Strategic 
Planning, Performance Measurement, and Investment Management Can Be 
Further Improved, GAO-04-49 (Washington, D.C.: Jan. 12, 2004). 

[12] These agencies include the Departments of Agriculture, Commerce, 
and the Interior. 

[13] For example, GAO, Information Technology: Centers for Medicare & 
Medicaid Services Needs to Establish Critical Investment Management 
Capabilities, GAO-06-12 (Washington, D.C.: Oct. 28, 2005); Information 
Technology: Departmental Leadership Crucial to Success of Investment 
Reforms at Interior, GAO-03-1028 (Washington, D.C.: Sept. 12, 2003); 
and United States Postal Service: Opportunities to Strengthen IT 
Investment Management Capabilities, GAO-03-3 (Washington, D.C.: Oct. 
15, 2002). 

[14] GAO-05-276. 

[15] Earned value management is a project management tool that 
integrates the investment scope of work with schedule and cost elements 
for investment planning and control. This method compares the value of 
work accomplished during a given period with that of the work expected 
in the period. Differences in expectations are measured in both cost 
and schedule variances. 

[16] OMB Memorandum, M-05-23 (Aug. 4, 2005). 

[17] GAO-06-647. 

[18] Of the 79 projects with performance shortfalls, 19 projects, 
totaling about $500 million in estimated IT expenditures for fiscal 
year 2007, were 
also placed on OMB's Management Watch List. 

[19] GAO-06-250. 

[20] GAO, High-Risk Series: An Update, GAO-05-207 (Washington, D.C., 
January 2005). 

[21] GAO-05-207. 

[22] GAO, Census Bureau: Important Activities for Improving Management 
of Key 2010 Decennial Acquisitions Remain to be Done, GAO-06-444T 
(Washington, D.C.: Mar. 1, 2006). 

[23] GAO, Polar-Orbiting Operational Environmental Satellites: 
Technical Problems, Cost Increases, and Schedule Delays Trigger Need 
for Difficult Trade-Off Decisions, GAO-06-249T (Washington, D.C.: Nov. 
16, 2005). 

[24] GAO, Polar-Orbiting Operational Environmental Satellites: Cost 
Increases Trigger Review and Place Program's Direction on Hold, 
GAO-06-573T (Washington, D.C.: Mar. 30, 2006). 

[25] GAO, United States Coast Guard: Improvements Needed in Management 
and Oversight of Rescue System Acquisition, GAO-06-632 (Washington, 
D.C.: May 31, 2006). 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site ( www.gao.gov ) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: