This is the accessible text file for GAO report number GAO-04-49 
entitled 'Information Technology Management: Governmentwide Strategic 
Planning, Performance Measurement, and Investment Management Can Be 
Further Improved' which was released on February 11, 2004.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to Congressional Requesters:

January 2004:

INFORMATION TECHNOLOGY MANAGEMENT:

Governmentwide Strategic Planning, Performance Measurement, and 
Investment Management Can Be Further Improved:

GAO-04-49:

GAO Highlights:

Highlights of GAO-04-49, a report to congressional requesters 

Why GAO Did This Study:

Over the years, the Congress has enacted laws, and the Office of 
Management and Budget and GAO have issued policies and guidance, 
respectively, on (1) information technology (IT) strategic planning/
performance measurement (which defines what an organization seeks to 
accomplish, identifies the strategies it will use to achieve desired 
results, and then determines how well it is succeeding in reaching 
results-oriented goals and achieving objectives) and (2) investment 
management (which involves selecting, controlling, and evaluating 
investments). 

To obtain an understanding of the government's implementation of these 
key IT management policies, congressional requesters asked GAO to 
determine the extent to which 26 major agencies have in place 
practices associated with key legislative and other requirements for 
(1) IT strategic planning/performance measurement and (2) IT 
investment management.

What GAO Found:

Agencies' use of 12 IT strategic planning/performance measurement 
practices--identified based on legislation, policy, and guidance--is 
uneven (see figure, below left). For example, agencies generally have 
IT strategic plans and goals, but these goals are not always linked to 
specific performance measures that are tracked. Without enterprisewide 
performance measures that are tracked against actual results, agencies 
lack critical information about whether their overall IT activities 
are achieving expected goals.

Agencies' use of 18 IT investment management practices that GAO 
identified is also mixed (see figure, below right). For example, the 
agencies largely have IT investment management boards, but no agency 
has the practices associated with the control phase fully in place. 
Executive-level oversight of project-level management activities 
provides organizations with increased assurance that each investment 
will achieve the desired cost, benefit, and schedule results.

Agencies cited a variety of reasons for not having practices fully in 
place, such as that the chief information officer position had been 
vacant, that omitting a requirement from guidance was an oversight, and 
that the process was being revised; in some cases, however, they could 
not provide an explanation. Regardless of the reason, these practices are 
important ingredients for ensuring effective strategic planning, 
performance measurement, and investment management, which, in turn, 
make it more likely that the billions of dollars in government IT 
investments are wisely spent.

What GAO Recommends:

GAO is making a number of recommendations, including that each agency 
take action to address IT strategic planning, performance measurement, 
and investment management practices that are not fully in place. In 
commenting on a draft of the report, most agencies generally agreed 
with our findings and recommendations.

www.gao.gov/cgi-bin/getrpt?GAO-04-49.

To view the full product, including the scope and methodology, click 
on the link above. For more information, contact David Powner at (202) 
512-9286 or pownerd@gao.gov.

[End of section]

Contents:

Letter: 

Results in Brief: 

Background: 

Agencies' Use of IT Strategic Planning/Performance Measurement 
Practices Is Uneven: 

Agencies' Use of IT Investment Management Practices Is Mixed: 

Conclusions: 

Recommendations: 

Agency Comments and Our Evaluation: 

Appendixes:

Appendix I: Recommendations to Departments and Agencies: 

Appendix II: Comments from the Department of Agriculture: 

Appendix III: Comments from the Department of Commerce: 

Appendix IV: Comments from the Department of Defense (including 
comments from the Departments of the Air Force, Army, and Navy): 

GAO Comments: 

Appendix V: Comments from the Department of Education: 

GAO Comments: 

Appendix VI: Comments from the Environmental Protection Agency: 

GAO Comments: 

Appendix VII: Comments from the General Services Administration: 

GAO Comments: 

Appendix VIII: Comments from the Department of Health and Human 
Services: 

Appendix IX: Comments from the Department of Housing and Urban 
Development: 

Appendix X: Comments from the Department of the Interior: 

Appendix XI: Comments from the Department of Justice: 

GAO Comments: 

Appendix XII: Comments from the Department of Labor: 

GAO Comments: 

Appendix XIII: Comments from the National Aeronautics and Space 
Administration: 

GAO Comments: 

Appendix XIV: Comments from the Nuclear Regulatory Commission: 

Appendix XV: Comments from the Social Security Administration: 

GAO Comments: 

Appendix XVI: Comments from the Department of State: 

GAO Comments: 

Appendix XVII: Comments from the U.S. Agency for International 
Development: 

GAO Comments: 

Appendix XVIII: Comments from the Department of Veterans Affairs: 

GAO Comments: 

Appendix XIX: GAO Contacts and Staff Acknowledgments: 

GAO Contacts: 

Staff Acknowledgments: 

Tables: 

Table 1: IT Strategic Planning/Performance Measurement Practices: 

Table 2: IT Investment Management Practices: 

Figures: 

Figure 1: Percentage of Agencies' Use of IT Strategic Planning/
Performance Measurement Practices: 

Figure 2: Percentage of Agencies' Use of IT Investment Management 
Practices: 

Abbreviations: 

CFO: chief financial officer: 

CIO: chief information officer: 

COTS: commercial-off-the-shelf:

DHS: Department of Homeland Security:

DOD: Department of Defense: 

EPA: Environmental Protection Agency: 

FISMA: Federal Information Security Management Act: 

GISRA: Government Information Security Reform Act: 

GPRA: Government Performance and Results Act: 

GSA: General Services Administration:

HHS: Department of Health and Human Services: 

HUD: Department of Housing and Urban Development:

IRM: information resources management: 

IT: information technology: 

IV&V: independent verification and validation:

NARA: National Archives and Records Administration:

NASA: National Aeronautics and Space Administration:

NRC: Nuclear Regulatory Commission:

NSF: National Science Foundation: 

OMB: Office of Management and Budget: 

OPM: Office of Personnel Management: 

SBA: Small Business Administration: 

SSA: Social Security Administration: 

USAID: U.S. Agency for International Development:

VA: Department of Veterans Affairs:

Letter:

January 12, 2004:

The Honorable Susan M. Collins 
Chairman 
Committee on Governmental Affairs 
United States Senate:

The Honorable Tom Davis 
Chairman 
Committee on Government Reform 
House of Representatives:

The Honorable Adam H. Putnam 
Chairman 
Subcommittee on Technology, Information Policy, Intergovernmental 
Relations and the Census 
Committee on Government Reform 
House of Representatives:

According to the President's most recent budget, the federal government 
spends billions of dollars annually on information technology (IT)--
reportedly investing about $50 billion in fiscal year 2002 and 
expecting to invest about $60 billion in fiscal year 2004.[Footnote 1] 
Despite this substantial investment, the government's management of 
information resources has produced mixed results. Although agencies 
have taken constructive steps to implement modern strategies, systems, 
and management policies and practices, our most recent high-risk and 
performance and accountability series identified continuing high-risk 
system modernization efforts and governmentwide information and 
technology management challenges.[Footnote 2]

For years, the Congress has been working to increase the effectiveness 
of information and technology management in the federal government by 
passing legislation and providing oversight. For example, the Paperwork 
Reduction Act of 1995 applied life-cycle management principles to 
information and technology management and required that agencies 
indicate in strategic information resources management (IRM) plans how 
they are applying information resources to improve the productivity, 
efficiency, and effectiveness of government programs.[Footnote 3] The 
Clinger-Cohen Act of 1996 amended the Paperwork Reduction Act, 
establishing agency chief information officers (CIOs) who report 
directly to the agency head and are responsible for information 
resources management activities. Among other things, the Clinger-Cohen 
Act also (1) required senior executive involvement in IT decision 
making and (2) imposed much-needed discipline in acquiring and managing 
technology resources.

To obtain a broad view of the government's implementation of key IT 
management policies, you requested that we determine the extent to which 
agencies have in place practices associated with key legislative and 
other requirements for (1) IT strategic planning/performance 
measurement and (2) IT investment management. To address these 
objectives, we identified and reviewed major legislative requirements 
and executive orders pertaining to IT strategic planning/performance 
measurement, which defines what an organization seeks to accomplish, 
identifies the strategies it will use to achieve desired results, and 
then determines--through measurement--how well it is succeeding in 
reaching results-oriented goals and achieving objectives; and IT 
investment management, which involves selecting, controlling, and 
evaluating investments. Specifically, we identified 30 important IT 
management practices in these areas using legislative requirements, 
such as the Paperwork Reduction Act and the Clinger-Cohen Act, and 
policy and guidance issued by the Office of Management and Budget 
(OMB)[Footnote 4] and GAO.[Footnote 5] We selected 26 organizations 
for our review (23 major departments and agencies identified in 31 
U.S.C. 901[Footnote 6] and the 3 military services).

Results in Brief:

Agencies' use of IT strategic planning/performance measurement 
practices is uneven--46 percent of the practices are in place, 41 
percent are partially in place, and 7 percent are not in 
place.[Footnote 7] The lack of full implementation of these practices 
is of concern because effective strategic planning is important to 
ensure that agencies' IT goals are aligned with the strategic goals of 
the agency. Also important is having measures in place to monitor 
whether, or the extent to which, IT is supporting the agency. The 
agencies generally have IRM plans or IT strategic plans, but these 
plans do not always address important IRM elements, such as information 
collection, records management, or privacy. In addition, although 
agencies generally have goals associated with IT, these goals are not 
always linked to specific performance measures. Moreover, many agencies 
do not monitor actual-versus-expected performance against 
enterprisewide IT performance measures in their IRM plans. Agencies 
cited a variety of reasons why the strategic planning/performance 
measurement practices are not in place, including that there was a lack 
of support from agency leadership, that the agency had only recently 
begun developing IRM plans and recognized that the plans needed further 
refinement, or that the process was being revised. In 
addition, the agencies in our review could not always identify why the 
practices were not fully in place. Regardless of the reason, these 
practices were generally derived from legislative requirements and 
governmentwide policies and are fundamental ingredients to effective IT 
planning and performance measurement; therefore, it is important that 
they be implemented.

Agencies' use of IT investment management practices is also mixed in 
that 44 percent of the practices are in place, 37 percent are partially 
in place, and 17 percent are not in place.[Footnote 8] Only by 
effectively and efficiently managing their IT resources through a 
robust investment management process can agencies gain opportunities to 
make better allocation decisions among many investment alternatives and 
further leverage their investments. As part of their investment 
management process, the agencies largely have IT investment management 
boards in place that are responsible for making decisions on selecting 
investments. However, many of these boards do not have written policies 
and procedures for overseeing or controlling projects that address 
such critical areas as corrective action plans and the tracking of such 
actions to resolution. Having these policies and procedures is a 
critical element of the control phase of a comprehensive IT investment 
management process, which helps ensure that investments are on track 
and are continuing to meet mission needs. As in the strategic planning/
performance measurement area, agencies were not always able to explain 
why certain IT investment management practices were not in place. 
However, among the reasons cited were that the CIO position had been 
vacant, that not including a given requirement in an investment 
management guide was an oversight, and that the investment management 
process was being revised. Nevertheless, the full implementation of the 
investment management practices would bring more rigor and structure to 
how agencies select and manage their IT investments.

We are making a number of recommendations, including that each agency 
take action to address IT strategic planning, performance measurement, 
and investment management practices that are not fully in place.

We received written or oral comments on a draft of this report from 25 
of the agencies[Footnote 9] in our review. Most agencies generally 
agreed with our findings and recommendations, and some provided 
additional documentation and information that we incorporated into the 
report, as appropriate.

Background:

Advances in the use of IT and the Internet are continuing to change the 
way that federal agencies communicate, use, and disseminate 
information; deliver services; and conduct business. For example, 
electronic government (e-government) has the potential to help build 
better relationships between government and the public by facilitating 
timely and efficient interaction with citizens. To help the agencies 
more effectively manage IT, the Congress has established a statutory 
framework of requirements and roles and responsibilities relating to 
information and technology management. Nevertheless, the agencies face 
significant challenges in effectively planning for and managing their 
IT. Such challenges can be overcome through the use of a systematic and 
robust management approach that addresses critical elements, such as IT 
strategic planning and investment management.

Federal Government's Statutory Framework for Information and Technology 
Management:

The Congress established a statutory framework to help address the 
information and technology management challenges that agencies face. 
Under this framework, agencies are accountable for effectively and 
efficiently developing, acquiring, and using IT in their organizations. 
In particular, the Paperwork Reduction Act of 1995 and the Clinger-
Cohen Act of 1996 require agency heads, acting through agency CIOs, to, 
among other things,

* better link their IT planning and investment decisions to program 
missions and goals;

* develop and maintain a strategic IRM plan that describes how IRM 
activities help accomplish agency missions;

* develop and maintain an ongoing process to establish goals for 
improving IRM's contribution to program productivity, efficiency, and 
effectiveness; methods for measuring progress toward these goals; and 
clear roles and responsibilities for achieving these goals;

* develop and implement a sound IT architecture;

* implement and enforce IT management policies, procedures, standards, 
and guidelines;

* establish policies and procedures for ensuring that IT systems 
provide reliable, consistent, and timely financial or program 
performance data; and:

* implement and enforce applicable policies, procedures, standards, and 
guidelines on privacy, security, disclosure, and information sharing.

Moreover, under the government's current legislative framework, OMB has 
important responsibilities for providing direction on governmentwide 
information and technology management and overseeing agency activities 
in these areas. Among OMB's responsibilities are:

* ensuring agency integration of IRM plans, program plans, and budgets 
for the acquisition and use of IT and the efficiency and effectiveness 
of interagency IT initiatives;

* developing and maintaining a governmentwide strategic IRM plan;

* developing, as part of the budget process, a mechanism for analyzing, 
tracking, and evaluating the risks and results of all major capital 
investments made by an executive agency for information 
systems;[Footnote 10]

* directing and overseeing the implementation of policy, principles, 
standards, and guidelines for the dissemination of and access to public 
information;

* encouraging agency heads to develop and use best practices in IT 
acquisitions; and:

* developing and overseeing the implementation of privacy and security 
policies, principles, standards, and guidelines.

Further, in 2002, the Congress passed, and the President signed, 
legislation intended to improve the collection, use, and dissemination 
of government information and to strengthen information security. 
Specifically, Public Law 107-347, the E-Government Act of 2002, which 
was enacted in December 2002, includes provisions to promote the use of 
the Internet and other information technologies to provide government 
services electronically. The E-Government Act also contains the Federal 
Information Security Management Act (FISMA) of 2002, which replaced and 
strengthened the Government Information Security Reform legislative 
provisions (commonly referred to as "GISRA").[Footnote 11] Among other 
provisions, FISMA requires each agency, including national security 
agencies, to (1) establish an agencywide risk-based information 
security program to be overseen by the agency CIO and ensure that 
information security is practiced throughout the life cycle of each 
agency system; and (2) develop, maintain, and annually update an 
inventory of major information systems (including major national 
security systems) operated by the agency or under its control.

Federal IT Challenges:

Even with the framework laid out by the Congress, the federal 
government faces enduring IT challenges. Specifically, in January 2003, 
we reported on a variety of challenges facing federal agencies in 
continuing to take advantage of the opportunities presented by 
IT.[Footnote 12] Unless and until the challenges outlined below are 
overcome, federal agencies are unlikely to optimize their use of IT, 
which can affect an organization's ability to effectively and 
efficiently implement its programs and missions.

* Pursuing opportunities for e-government. E-government offers many 
opportunities to better serve the public, make government more 
efficient and effective, and reduce costs. Federal agencies have 
implemented a wide array of e-government applications, including using 
the Internet to collect and disseminate information and forms; buy and 
pay for goods and services; submit bids and proposals; and apply for 
licenses, grants, and benefits. Although substantial progress has been 
made, the government has not yet fully reached its potential in this 
area. Recognizing this, a key element of the President's Management 
Agenda is the expansion of e-government to enhance access to 
information and services, particularly through the Internet. In 
response, OMB established a task force that selected a strategic set of 
initiatives to lead this expansion. Our review of the initial planning 
projects associated with these initiatives found that important 
aspects--such as collaboration and customer focus--had not been thought 
out for all of the projects and that major uncertainties in funding and 
milestones were not uncommon. Accordingly, we recommended that OMB 
take steps as overseer of the e-government initiatives to reduce the 
risk that the projects would not meet their objectives.[Footnote 13]

* Improving the collection, use, and dissemination of government 
information. The rapid evolution of IT is creating challenges in 
managing and preserving electronic records. Complex electronic records 
are increasingly being created in a decentralized environment and in 
volumes that make it difficult to organize them and make them 
accessible. Further, storage media themselves are affected by the dual 
problems of obsolescence and deterioration. These problems are 
compounded as computer hardware and application software become 
obsolete, since they may leave behind electronic records that can no 
longer be read. Overall responsibility for the government's electronic 
records lies with the National Archives and Records Administration 
(NARA). Our past work has shown that while NARA has taken some action 
to respond to the challenges associated with managing and preserving 
electronic records, most electronic records remain unscheduled; that 
is, their value had not been assessed and their disposition had not 
been determined.[Footnote 14] In addition, records of historical value 
were not being identified and provided to NARA; as a result, they were 
at risk of being lost. We recommended that NARA develop strategies for 
raising agency management's awareness of the importance of records 
management and for performing systematic inspections. In July 2003 we 
testified that although NARA has made progress in addressing these 
issues, more work remains to be done.[Footnote 15]

The growth of electronic information--as well as the security threats 
facing our nation--is also highlighting privacy issues. For example, 
online privacy has emerged as one of the key--and most contentious--
issues surrounding the continued evolution of the Internet. In 
addition, our survey of 25 departments and agencies about their 
implementation of the Privacy Act--which regulates how federal agencies 
may use the personal information that individuals supply when obtaining 
government services or fulfilling obligations--found that a key 
characteristic of the agencies' 2,400 systems of records is that an 
estimated 70 percent contained electronic records.[Footnote 16] Our 
survey also found that although compliance with Privacy Act provisions 
and related OMB guidance was generally high in many areas, according to 
agency reports, it was uneven across the federal government. To improve 
agency compliance and address issues reported by the agencies, we made 
recommendations to OMB, such as to direct agencies to correct 
compliance deficiencies, to monitor agency compliance, and to reassess 
its guidance.

* Strengthening information security. Since September 1996, we have 
reported that poor information security is a high-risk area across the 
federal government with potentially devastating consequences.[Footnote 
17] Although agencies have taken steps to redesign and strengthen their 
information system security programs, our analyses of information 
security at major federal agencies have shown that federal systems were 
not being adequately protected from computer-based threats. Our latest 
analyses of audit reports published from October 2001 through October 
2002 continue to show significant weaknesses in federal computer 
systems that put critical operations and assets at risk.[Footnote 18] 
In addition, in June 2003 we testified that agencies' fiscal year 2002 
reports and evaluations required by GISRA found that many agencies have 
not implemented security requirements for most of their systems, such 
as performing risk assessments and testing controls.[Footnote 19] In 
addition, the usefulness of agency corrective action plans may be 
limited when they do not identify all weaknesses or contain realistic 
completion dates.

One of the most serious problems currently facing the government is 
cyber critical infrastructure protection--that is, protecting the 
information systems that support the nation's critical infrastructures, 
such as national defense and power distribution. Since the September 11 
attacks, warnings of the potential for terrorist cyber attacks against 
our critical infrastructures have increased. In addition, as greater 
amounts of money are transferred through computer systems, as more 
sensitive economic and commercial information is exchanged 
electronically, and as the nation's defense and intelligence 
communities increasingly rely on commercially available information 
technology, the likelihood increases that information attacks will 
threaten vital national interests. Among the critical infrastructure 
protection challenges the government faces are (1) developing a 
national critical infrastructure protection strategy, (2) improving 
analysis and warning capabilities, and (3) improving information 
sharing on threats and vulnerabilities. For each of the challenges, 
improvements have been made and continuing efforts are in progress, but 
much more is needed to address them. In particular, we have identified 
and made numerous recommendations over the last several years 
concerning critical infrastructure challenges that still need to be 
addressed. As a result of our concerns in this area, we have expanded 
our information security high-risk area to include cyber critical 
infrastructure protection.[Footnote 20]

* Constructing and enforcing sound enterprise architectures. Our 
experience with federal agencies has shown that attempts to modernize 
IT environments without blueprints--models simplifying the 
complexities of how agencies operate today, how they want to operate in 
the future, and how they will get there--often result in unconstrained 
investment and systems that are duplicative and ineffective. Enterprise 
architectures offer such blueprints. Our reports on the federal 
government's use of enterprise architectures in both February 2002 and 
November 2003 found that agencies' use of enterprise architectures was 
a work in progress, with much to be accomplished.[Footnote 21] 
Nevertheless, opportunities exist to significantly improve this outlook 
if OMB were to adopt a governmentwide, structured, and systematic 
approach to promoting enterprise architecture use, measuring agency 
progress, and identifying and pursuing governmentwide solutions to 
common enterprise architecture challenges that agencies face. 
Accordingly, we made recommendations to OMB to address these areas.

* Employing IT system and service management practices. Our work and 
other best-practice research have shown that applying rigorous 
practices to the acquisition or development of IT systems or the 
acquisition of IT services improves the likelihood of success. In other 
words, the quality of IT systems and services is governed largely by 
the quality of the processes involved in developing or acquiring each. 
For example, using models and methods developed by Carnegie Mellon 
University's Software Engineering Institute--which is recognized for 
its expertise in software processes--to determine the maturity of 
organizations' processes for software-intensive systems, we evaluated 
several agencies' software development or acquisition processes. We 
found that agencies are not consistently using rigorous 
or disciplined system management practices. We have made numerous 
recommendations to agencies to improve their management processes, and 
they have taken, or plan to take, actions to improve.[Footnote 22] 
Regarding IT services acquisition, we identified leading commercial 
practices for outsourcing IT services that government entities could 
use to enhance their acquisition of IT services.[Footnote 23]

* Using effective agency IT investment management practices. 
Investments in IT can have a dramatic impact on an organization's 
performance. If managed effectively, these investments can vastly 
improve government performance and accountability. If not, however, 
they can result in wasteful spending and lost opportunities for 
improving delivery of services to the public. Using our information 
technology investment management maturity framework,[Footnote 24] we 
evaluated selected agencies and found that while some processes have 
been put in place to help them effectively manage their planned and 
ongoing IT investments, more work remains.[Footnote 25]

IT Challenges Are Interdependent:

Complicating the government's ability to overcome these IT management 
challenges is the fact that the challenges are interdependent. As a 
result, an organization's inability to successfully address one IT 
management area can undermine its success in addressing another 
management function. For example, a critical aspect of implementing 
effective e-government solutions and developing and deploying major 
systems development projects is ensuring that robust information 
security is built into these endeavors early and is periodically 
revisited.

The government's many IT challenges can be addressed through effective 
planning and execution, which can be achieved, in part, through 
strategic planning/performance measurement and investment management. 
For example, strong strategic planning is focused on using 
IT to help accomplish the highest priority customer needs and mission 
goals, while effective performance measurement helps determine the 
success or failure of IT activities. Finally, IT investment management 
provides a systematic method for minimizing risks while maximizing the 
return on investments and involves a process for selecting, 
controlling, and evaluating investments. These processes, too, are 
interdependent. For example, the investment management process is a 
principal mechanism to ensure the effective execution of an agency's IT 
strategic plan.

Objectives, Scope, and Methodology:

Our objectives were to determine the extent to which federal agencies 
are following practices associated with key legislative and other 
requirements for (1) IT strategic planning/performance measurement and 
(2) IT investment management.

To address these objectives, we identified and reviewed major 
legislative requirements and executive orders pertaining to IT 
strategic planning, performance measurement, and investment 
management. Specifically, we reviewed:

* the Paperwork Reduction Act of 1995;

* the Clinger-Cohen Act of 1996;

* the E-Government Act of 2002;

* the Federal Information Security Management Act of 2002;

* Executive Order 13011, Federal Information Technology; and:

* Executive Order 13103, Computer Software Piracy.

Using these requirements and policy and guidance issued by OMB[Footnote 
26] and GAO,[Footnote 27] we identified 30 IT management practices that 
(1) can be applied at the enterprise level and (2) were verifiable 
through documentation and interviews. These 30 practices focused on 
various critical aspects of IT strategic management, performance 
measurement, and investment management, including the development of 
IRM plans, the identification of goals and related measures, and the 
selection and control of IT investments, respectively.

We selected 26 major departments and agencies for our review (23 
entities identified in 31 U.S.C. 901 and the 3 military 
services).[Footnote 28] At our request, each agency completed a self-
assessment on whether and how it had implemented the 30 IT management 
practices. We reviewed the completed agency self-assessments and 
accompanying documentation, including agency and IT strategic plans, 
agency performance plans and reports required by the Government 
Performance and Results Act, and IT investment management policy and 
guidance, and interviewed applicable agency IT officials to corroborate 
whether the practices were in place. We did not evaluate the 
effectiveness of agencies' implementation of the practices. For 
example, we did not review specific IT investments to determine whether 
they were selected, controlled, and reviewed in accordance with agency 
policy and guidance. However, we reviewed applicable prior GAO and 
agency inspector general reports and discussed with applicable agency 
IT officials whether agency policies had been fully implemented.

On the basis of the above information, we assessed whether the 
practices were in place, using the following definitions:

* Yes--the practice was in place.

* Partially--the agency has some, but not all, aspects of the practice 
in place. Examples of circumstances in which the agency would receive 
this designation include when (1) some, but not all, of the elements of 
the practice were in place; (2) the agency documented that it has the 
information or process in place but it was not in the prescribed form 
(e.g., in a specific document as required by law or OMB); (3) the 
agency's documentation was in draft form; or (4) the agency had a 
policy related to the practice but evidence supported that it had not 
been completely or consistently implemented.

* No--the practice was not in place.

* Not applicable--the practice was not relevant to the agency's 
particular circumstances.

We also collected information from the Department of Homeland Security 
(DHS) but found that since it had been established so recently, it was 
too early to judge its IT strategic planning, performance measurement, 
and investment management. As a result, although we provided 
information on what DHS was doing with respect to these areas, we did 
not include it in our assessment.

We also interviewed officials from OMB's Office of Information and 
Regulatory Affairs regarding OMB's role in establishing policies and 
overseeing agencies' implementation of the identified practices.

We performed our work at the agencies' offices in greater Washington, 
D.C. We conducted our review between April and mid-December 2003 in 
accordance with generally accepted government auditing standards.

Agencies' Use of IT Strategic Planning/Performance Measurement 
Practices Is Uneven:

The use of IT strategic planning/performance measurement practices is 
uneven (see fig. 1), which is of concern because a well-defined 
strategic planning process helps ensure that an agency's IT goals are 
aligned with that agency's strategic goals. Moreover, establishing 
performance measures and monitoring actual-versus-expected performance 
of those measures can help determine whether IT is making a difference 
in improving performance. Among the practices or elements of practices 
that agencies largely have in place are those pertaining to 
establishing goals and performance measures. On the other hand, 
agencies are less likely to have fully documented their IT strategic 
planning processes, developed comprehensive IRM plans, linked 
performance measures to their enterprisewide IT goals, or monitored 
actual-versus-expected performance for these enterprisewide goals. 
Agencies cited various reasons, such as the lack of support from agency 
leadership, for not having strategic planning/performance measurement 
practices in place. Without strong strategic management practices, it 
is less likely that IT is being used to maximize improvement in mission 
performance. Moreover, without enterprisewide performance measures 
that are being tracked against actual results, agencies lack critical 
information about whether their overall IT activities, at a 
governmentwide cost of billions of dollars annually, are achieving 
expected goals.

Figure 1: Percentage of Agencies' Use of IT Strategic Planning/
Performance Measurement Practices:

[See PDF for image]

Note: Yes--the practice was in place. Partially--the agency has some, 
but not all, aspects of the practice in place. Examples of 
circumstances in which the agency would receive this designation 
include when (1) some, but not all, of the elements of the practice 
were in place; (2) the agency documented that it has the information or 
process in place but it was not in the prescribed form (e.g., in a 
specific document as required by law or OMB); (3) the agency's 
documentation was in draft form; or (4) the agency had a policy related 
to the practice but evidence supported that it had not been completely 
or consistently implemented. No--the practice was not in place. Not 
applicable--the practice was not relevant to the agency's particular 
circumstances.

[End of figure]

Governmentwide Progress Demonstrated, but More Work Remains:

Critical aspects of the strategic planning/performance measurement area 
include documenting the agency's IT strategic planning processes, 
developing IRM plans, establishing goals, and measuring performance to 
evaluate whether goals are being met. Although the agencies often have 
these practices, or elements of these practices, in place, additional 
work remains, as demonstrated by the following examples:

* Strategic planning process. Strategic planning defines what an 
organization seeks to accomplish and identifies the strategies it will 
use to achieve desired results. A defined strategic planning process 
allows an agency to clearly articulate its strategic direction and to 
establish linkages among planning elements such as goals, objectives, 
and strategies.

About half of the agencies fully documented their strategic planning 
processes. For example, the General Services Administration (GSA) 
documented an IT governance structure that addresses the roles and 
responsibilities of various organizations in strategic planning and 
investment management. In addition, in its IT strategic plan, GSA 
describes how it developed the plan, including its vision, business-
related priorities, and goals. In contrast, the Department of 
Agriculture has not completely documented its IT strategic planning 
process or integrated its IT management operations and decisions with 
other agency processes. According to Agriculture IT officials, the 
department's ongoing budget and performance integration initiative is 
expected to result in a more clearly defined and integrated IT 
strategic management planning process. Such a process provides the 
essential foundation for ensuring that IT resources are effectively 
managed.

* Strategic IRM plans. The Paperwork Reduction Act requires that 
agencies indicate in strategic IRM plans how they are applying 
information resources to improve the productivity, efficiency, and 
effectiveness of government programs. An important element of a 
strategic plan is that it presents an integrated system of high-level 
decisions that are reached through a formal, visible process. The plan 
is thus an effective tool with which to communicate the mission and 
direction to stakeholders. In addition, a strategic IRM plan that 
communicates a clear and comprehensive vision for how the agency will 
use information resources to improve agency performance is important 
because IRM encompasses virtually all aspects of an agency's 
information activities.

Although the Paperwork Reduction Act also requires agencies to develop 
IRM plans in accordance with OMB's guidance, OMB does not provide 
cohesive guidance on the specific contents of IRM plans. OMB Circular 
A-130 directs that agencies have IRM plans that support agency 
strategic plans, provide a description of how IRM helps accomplish 
agency missions, and ensure that IRM decisions are integrated with 
organizational planning, budgets, procurement, financial management, 
human resources management, and program decisions. However, Circular A-
130 does not provide overall guidance on the plan's contents. As a 
result, although agencies generally provided OMB with a variety of 
planning documents to meet its requirement that they submit an IRM 
plan, these plans were generally limited to IT strategic or e-
government issues and did not address other elements of IRM, as defined 
by the Paperwork Reduction Act. Specifically, these plans generally 
include individual IT projects and initiatives, security, and 
enterprise architecture elements but do not often address other 
information functions, such as information collection, records 
management, and privacy, or the coordinated management of all 
information functions.

OMB IT staff agreed that OMB has not set forth guidance on the 
contents of agency IRM plans in a single place, stating that OMB's 
focus has been on agencies' cumulative results rather than on planning 
documents. These staff also noted that agencies account for their IRM 
activities through multiple documents (e.g., Information Collection 
Budgets[Footnote 29] and Government Paperwork Elimination Act[Footnote 
30] plans). However, the OMB IT staff stated that they would look at 
whether more guidance is needed to help agencies develop IRM plans, but 
they have not yet committed to providing such guidance. Half the 
agencies indicated a need for OMB to provide 
additional guidance on the development and content of IRM plans.

Strong agency strategic IRM plans could also provide valuable input to 
a governmentwide IRM plan, which is also required by the Paperwork 
Reduction Act. As we reported last year, although OMB designated the 
CIO Council's strategic plan for fiscal years 2001-2002 as the 
governmentwide strategic IRM plan, it does not constitute an effective 
and comprehensive strategic vision.[Footnote 31] Accordingly, we 
recommended that OMB develop and implement a governmentwide strategic 
IRM plan that articulates a comprehensive federal vision and plan for 
all aspects of government information. In April 2003, we testified that 
OMB had taken a number of actions that demonstrate progress in 
fulfilling the Paperwork Reduction Act's requirement of providing a 
unifying IRM vision.[Footnote 32] However, more remains to be done. In 
particular, we reported that although OMB's strategies and models are 
promising, their ability to reduce paperwork burden and accomplish 
other objectives depends on how OMB implements them.

One element required by the Clinger-Cohen Act to be included in agency 
IRM plans is the identification of any major IT acquisition program, 
or any phase or increment of that program, that has significantly 
deviated from the cost, performance, or schedule goals established for 
the program. 
However, few agencies met this requirement. In these cases, a common 
reason cited for not including this information was that it was not 
appropriate to have such detailed information in a strategic plan 
because such plans should be forward thinking and may not be developed 
every year. Agencies also identified other mechanisms that they use to 
track and report cost, schedule, and performance deviations. Because 
agencies generally do not address this Clinger-Cohen Act requirement in 
their IRM plans, they may benefit from additional guidance from OMB on 
how to address this requirement.

* IT goals. The Paperwork Reduction Act and the Clinger-Cohen Act 
require agencies to establish goals that address how IT contributes to 
program productivity, efficiency, effectiveness, and service delivery 
to the public. We have previously reported that leading organizations 
define specific goals, objectives, and measures, use a diversity of 
measure types, and describe how IT outputs and outcomes impact 
operational customer and agency program delivery 
requirements.[Footnote 33]

The agencies generally have the types of goals outlined in the 
Paperwork Reduction Act and the Clinger-Cohen Act. For example, the 
Social Security Administration (SSA) set a goal of achieving an average 
of at least a 2 percent per year improvement in productivity, and it 
expects that advances in automation will be a key to achieving this 
goal along with process and regulation changes. In addition, the 
Department of Veterans Affairs' (VA) latest departmental strategic plan 
has a goal that includes using business process reengineering and 
technology integration to speed up delivery of benefit payments, 
improve the quality of health care provided in its medical centers, and 
administer programs more efficiently. The VA goal includes strategies 
such as using its enterprise architecture as a continuous improvement 
process, implementing e-government solutions to transform paper-based 
collections to electronic-based mechanisms, and 
establishing a single, high-performance wide area data network. Five 
agencies do not have one or more of the goals required by the Paperwork 
Reduction Act and the Clinger-Cohen Act. For example, the Department of 
Labor's single IT strategic goal--to provide better and more secure 
service to citizens, businesses, government, and Labor employees to 
improve mission performance--which it included in its fiscal year 2004 
performance plan, does not address all required goals. Further, in 
contrast to other agencies, Labor does not have goals in its IRM plan. 
It is important that agencies specify clear goals and objectives to set 
the focus and direction of IT performance.

* IT performance measures. The Paperwork Reduction Act, the Clinger-
Cohen Act, and Executive Order 13103 require agencies to establish a 
variety of IT performance measures, such as those related to how IT 
contributes to program productivity, efficiency, and effectiveness, and 
to monitor the actual-versus-expected performance of those measures. As 
we have previously reported, an effective performance management system 
offers a variety of benefits, including serving as an early warning 
indicator of problems and the effectiveness of corrective actions, 
providing input to resource allocation and planning, and providing 
periodic feedback to employees, customers, stakeholders, and the 
general public about the quality, quantity, cost, and timeliness of 
products and services.[Footnote 34]

Although the agencies largely have one or more of the required 
performance measures, these measures are not always linked to the 
agencies' enterprisewide IT goals. For example, the Department of 
Defense (DOD), Air Force, and Navy have a variety of enterprisewide IT 
goals but do not have performance measures associated with these goals. 
Each of these organizations is in the process of developing such 
measures. To illustrate, the Air Force's August 2002 information 
strategy includes nine goals, such as providing decision makers and all 
Air Force personnel with on-demand access to authoritative, relevant, 
and sufficient information to perform their duties efficiently and 
effectively, but does not have performance measures for these goals. 
The Air Force recognizes the importance of linking performance measures 
to its goals and is developing such measures, which it expects to 
complete by the fourth quarter of fiscal year 2004.

Leading organizations use performance measures to objectively evaluate 
mission, business, and project outcomes. Such organizations also focus 
on performance measures for gauging service to key management processes 
and tailoring performance measures to determine whether IT is making a 
difference in improving performance. Few agencies monitored actual-
versus-expected performance for all of their enterprisewide IT goals. 
Specifically, although some agencies tracked actual-versus-expected 
outcomes for the IT performance measures in their performance plans or 
accountability reports and/or for specific IT projects, they generally 
did not track the performance measures specified in their IRM plans. 
For example, although the Department of Health and Human Services' 
(HHS) IT strategic plan identifies enterprisewide goals and performance 
measures, these measures generally do not identify quantified outcomes 
(e.g., the measures indicate that the outcome will be a percentage 
transaction increase or cost decrease in certain areas but do not 
provide a baseline or target). In addition, the HHS plan does not 
describe how the department will monitor actual-versus-expected 
performance for these measures. HHS's Director of Business Operations 
in its IRM office reported that the department recognizes the need to 
develop an integrated program for monitoring performance against the 
enterprisewide measures in the IT strategic plan. He stated that HHS 
has recently begun an initiative to establish such a process. By not 
measuring actual-versus-expected performance, agencies lack the 
information to determine where to target agency resources to improve 
overall mission accomplishment.

* Benchmarking. The Clinger-Cohen Act requires agencies to 
quantitatively benchmark agency process performance against public- and 
private-sector organizations, where comparable processes and 
organizations exist. Entities use benchmarking because external 
organizations may have more innovative or more efficient processes 
than their own. Our previous study of IT 
performance measurement at leading organizations found that they had 
spent considerable time and effort comparing their performance 
information with that of other organizations.[Footnote 35]

Seven agencies have mechanisms--such as policies and strategies--in 
place related to benchmarking their IT processes. For example, DOD's 
information resources and IT directive states that DOD components shall 
routinely and systematically benchmark their functional processes 
against models of excellence in the public and private sector and use 
these and other analyses to develop, simplify, or refine the processes 
before IT solutions are applied. In general, however, agencies' 
benchmarking decisions are ad hoc. Few agencies have developed a 
mechanism to identify comparable external private- or public-sector 
organizations and processes and/or have policies related to 
benchmarking; nevertheless, all but 10 of the agencies provided examples of 
benchmarking that had been performed. For example, the Small Business 
Administration (SBA) does not have benchmarking policies in place, but 
the agency provided an example of a benchmarking study performed by a 
contractor that compared SBA's IT operations and processes against 
industry cost and performance benchmarks and best practices and 
resulted in recommendations for improvement.

Practice-Specific Analysis:

Table 1 provides additional detail on each strategic planning/
performance measurement practice and our evaluation of whether each 
agency had the practice in place. The table indicates that work remains 
for the agencies to put each of the practices fully in place; it also 
shows that several agencies reported that they were taking, or planned 
to take, actions to address the practices or elements of practices.

Table 1: IT Strategic Planning/Performance Measurement Practices[A]:

Practice 1.1: The agency has documented its IT strategic management 
process, including, at a minimum:

* the responsibilities and accountability for IT resources across the 
agency, including the relationship between the chief information 
officer (CIO), chief financial officer (CFO), and mission/program 
officials; and;

* the method by which the agency defines program information needs and 
develops strategies, systems, and capabilities to meet those needs.

Results; Yes: 12; Partially: 11; No: 1; NA: 2.

Comments:

* Yes--the Departments of the Air Force, Army, Commerce, Defense (DOD), 
Education, Energy, Labor, Navy, and Veterans Affairs (VA) and the 
General Services Administration (GSA), the Office of Personnel 
Management (OPM), and the Social Security Administration (SSA) have 
this practice in place;

* Partially--the Departments of Agriculture[C], Health and Human 
Services (HHS)[C], Interior, Justice, and Transportation, and the 
Environmental Protection Agency (EPA), the National Aeronautics and 
Space Administration (NASA)[C], and the Small Business Administration 
(SBA) do not have a completely documented IT strategic planning 
process. The Department of Housing and Urban Development (HUD)[C] does 
not clearly describe the roles and responsibilities of the CFO and 
program managers in IT strategic planning. The Nuclear Regulatory 
Commission's (NRC) roles and responsibilities in its IT strategic 
management process are not clearly defined. The Department of the 
Treasury's[C] documentation supporting this practice is in draft form;

* No--the National Science Foundation (NSF) does not have this practice 
in place;

* NA (not applicable)--the Department of State and the U.S. Agency for 
International Development (USAID) are transitioning to a joint 
strategic planning process that will support their common policy 
objectives. The first step in this process was the August 2003 issuance 
of a State/USAID strategic plan. Because a new joint IT strategic 
planning process is also being implemented, it is too early to evaluate 
whether the new process will address this practice.

Practice 1.2: The agency has documented its process to integrate IT 
management operations and decisions with organizational planning, 
budget, financial management, human resources management, and program 
decisions.

Results; Yes: 13; Partially: 10; No: 1; NA: 2.

Comments:

* Yes--Air Force, Army, Commerce, DOD, Education, GSA, Labor, Navy, 
NSF, OPM, SBA, SSA, and VA have this practice in place;

* Partially--Agriculture[C] and EPA have not completely documented the 
integration of their IT management operations and decisions with other 
agency processes. Energy[C], HUD, NASA[C], and Justice have not 
documented how their IT management operations and decisions are 
integrated with human resources management. HHS[C] has not documented 
how its IT management operations and decisions are integrated with its 
budget processes. NRC reported that improvement is needed in how IT 
planning is integrated with the budget and human resources management. 
Transportation's[C] IT human capital planning is not yet integrated 
with the agency's human capital planning. Treasury's[C] documentation 
pertaining to this practice is in draft form;

* No--Interior does not have this practice in place;

* NA--this practice is not applicable to State and USAID for reasons 
outlined in practice 1.1.

Practice 1.3: The agency requires that information security management 
processes be integrated with strategic and operational planning 
processes.

Results; Yes: 24; Partially: 2; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Army, Commerce, DOD, Education, Energy, 
EPA, GSA, HHS, HUD, Interior, Justice, Labor, NASA, Navy, NSF, OPM, 
SBA, SSA, State, Transportation, USAID, and VA have this practice in 
place;

* Partially--NRC's and Treasury's[C] documentation supporting this 
practice is in draft form.

Practice 1.4: The agency has a process that involves the CFO, or 
comparable official, to develop and maintain a full and accurate 
accounting of IT-related expenditures, expenses, and results.

Results; Yes: 15; Partially: 11; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Commerce, Energy, GSA, HUD, Interior, Justice, 
NASA, NRC, NSF, OPM, SSA, Transportation, Treasury, and VA reported 
that they have this practice in place[B];

* Partially--prior GAO or inspector general work indicates that Army, 
Air Force, DOD, EPA, and Navy do not capture and report on the full 
costs of their programs. State and USAID reported that IT internal 
costs are not consistently captured. HHS reported that not all internal 
costs are captured and that the CFO is not involved in the process used 
to derive its IT costs. Education's and Labor's CFOs are not involved in 
the process used to derive their IT costs. SBA reported that not all 
costs are captured for nonmajor systems.

Practice 1.5: The agency prepares an enterprisewide strategic 
information resources management (IRM) plan that, at a minimum:

* describes how IT activities will be used to help accomplish agency 
missions and operations, including related resources; and

* identifies a major IT acquisition program(s) or any phase or 
increment of that program that has significantly deviated from the 
cost, performance, or schedule goals established for the program.

Results; Yes: 2; Partially: 22; No: 0; NA: 2.

Comments:

* Yes--Commerce and NSF have this practice in place;

* Partially--Agriculture, Air Force, Army, EPA, GSA, HHS, HUD, 
Interior, Justice, Labor, NASA, OPM, and SBA's IRM plans do not include 
resources and major IT acquisition programs that deviated from cost, 
schedule, or performance goals. Education, Energy, Navy, SSA, and 
Transportation's IRM plans do not include major IT acquisition programs 
that deviated from cost, schedule, or performance goals. DOD's and 
NRC's draft IRM plans do not include resources or major IT acquisition 
programs that deviated from cost, schedule, or performance goals. 
Treasury's and VA's draft IRM plans likewise do not include resources 
or major IT acquisition programs that deviated from cost, schedule, or 
performance goals;

* NA--this practice is not applicable to State and USAID for reasons 
outlined in practice 1.1.

Practice 1.6: The agency's performance plan required under the 
Government Performance and Results Act (GPRA) includes:

* a description of how IT supports strategic and program goals;

* the resources and time periods required to implement the information 
security program plan required by the Federal Information Security 
Management Act (FISMA); and

* a description of major IT acquisitions contained in the capital asset 
plan that will bear significantly on the achievement of a performance 
goal.

Results; Yes: 0; Partially: 23; No: 0; NA: 3.

Comments:

* Partially--no agency's performance plan, except VA's, includes time 
periods, and none includes resources required to implement the 
information security program plan required by FISMA. In addition, the 
plans of Agriculture, DOD, HHS, and Interior do not include a 
description of major IT acquisitions contained in their capital asset 
plans that bear significantly on the achievement of a performance goal;

* NA--this practice is not applicable to Air Force, Army, and Navy 
because they are not required to produce such plans.

Practice 1.7: The agency has a documented process to:

* develop IT goals in support of agency needs;

* measure progress against these goals; and

* assign roles and responsibilities for achieving these goals.

Results; Yes: 4; Partially: 12; No: 8; NA: 2.

Comments:

* Yes--Army, GSA, OPM, and SSA have this practice in place;

* Partially--Agriculture[C], NRC, and NSF do not have a documented 
process for assigning roles and responsibilities for achieving their 
enterprisewide IT goals. DOD[C] and HHS[C] have not established a 
documented process for measuring progress against their enterprisewide 
IT goals. Energy has this process in place for some, but not all, of 
its IT goals and performance measures. Air Force[C], Education, and 
Navy[C] do not have a documented process to measure progress against their 
enterprisewide IT goals or to assign roles and responsibilities for 
achieving these goals. Treasury's[C] documentation in support of this 
practice is in draft form. Transportation is piloting a process. 
VA's[C] documentation supporting this practice does not explicitly 
address how IT goals are developed and roles and responsibilities 
assigned;

* No--Commerce[C], EPA, HUD[C], Interior, Justice[C], Labor, NASA, and 
SBA do not have this practice in place;

* NA--this practice is not applicable to State and USAID for reasons 
outlined in practice 1.1.

Practice 1.8: The agency has established goals that, at a minimum, 
address how IT contributes to:

* program productivity;

* efficiency;

* effectiveness; and

* service delivery to the public (if applicable).

Results; Yes: 19; Partially: 5; No: 0; NA: 2.

Comments:

* Yes--Agriculture, Air Force, Army, Commerce, DOD, Education, EPA, 
GSA, HHS, HUD, Interior, Justice, NASA, NSF, OPM, SBA, SSA, Treasury, 
and VA have this practice in place;

* Partially--Navy does not have an IT goal associated with service 
delivery to the public. Energy, Labor, and Transportation do not have a 
goal associated with how IT contributes to program productivity. NRC's 
documentation in support of this practice is in draft form;

* NA--this practice is not applicable to State and USAID for reasons 
outlined in practice 1.1.

Practice 1.9: The agency has established IT performance measures and 
monitors actual-versus-expected performance that at least addresses:

* how IT contributes to program productivity;

* how IT contributes to the efficiency of agency operations;

* how IT contributes to the effectiveness of agency operations;

* service delivery to the public (if applicable);

* how electronic government initiatives enable progress toward agency 
goals and statutory mandates;

* the performance of IT programs (e.g., system development and 
acquisition projects); and

* agency compliance with federal software piracy policy.

Results; Yes: 0; Partially: 23; No: 1; NA: 2.

Comments:

* Partially--Agriculture[C], HHS[C], Interior, NASA, OPM, and VA[C] 
generally do not track actual-versus-expected performance for 
enterprisewide measures in their IRM plans. Commerce[C], EPA, Justice, 
SBA, and Treasury have some enterprisewide IT performance measures in 
their performance plans or accountability reports in which actual-
versus-expected performance is tracked but do not have measures for the 
enterprisewide IT goals in their IRM plans. SBA also does not have 
performance measures associated with program productivity, efficiency, 
effectiveness, and performance of IT programs. Moreover, Treasury's[C] 
IRM plan is in draft form. Air Force[C] has not developed measures for 
the enterprisewide goals in its information strategy and does not have 
measures associated with program productivity, electronic government, 
and service delivery to the public. Army[C] has neither performance 
measures for all of the objectives related to its enterprise IT goals 
nor measures associated with service delivery to the public. Navy[C] 
has not developed measures for the enterprisewide goals in its IRM plan 
and does not have measures related to how IT contributes to the 
effectiveness and efficiency of agency operations, service delivery to 
the public, or e-government. Education does not have measures related 
to how IT contributes to program productivity and the effectiveness and 
efficiency of agency operations and does not track actual-versus-
expected performance of measures identified in its IRM plan. GSA did 
not provide evidence that it tracked actual-versus-expected performance 
for one of its IT goals in its IRM plan. HUD[C] does not have 
performance measures related to how IT contributes to program 
productivity and does not track actual-versus-expected performance for 
enterprisewide measures in its IRM plan. Labor does not have 
performance measures associated with program productivity and 
efficiency. Energy and NRC's performance measures are not linked to the 
enterprisewide IT goals contained in their IRM plans. In addition, 
Energy does not have a measure associated with program productivity. 
Transportation's[C] performance measures are generally not linked to 
the goals contained in its IRM plan, and it does not track actual-
versus-expected performance for its enterprisewide measures. SSA 
reported that it has performance measures associated with the overall 
performance of its IT programs but provided no supporting 
documentation. Finally, no agency has performance measures related to 
the effectiveness of controls to prevent software piracy;

* No--DOD[C] does not have this practice in place but is working on 
developing such measures;

* NA--this practice is not applicable to State and USAID for reasons 
outlined in practice 1.1.

Practice 1.10: The agency has developed IT performance measures that 
align with and support the goals in the GPRA performance plan.

Results; Yes: 22; Partially: 0; No: 1; NA: 3.

Comments:

* Yes--Agriculture, Commerce, Education, Energy, EPA, GSA, HHS, HUD, 
Interior, Justice, Labor, NASA, NRC, NSF, OPM, SBA, SSA, State, 
Transportation, Treasury, USAID, and VA have this practice in place;

* No--DOD does not have this practice in place;

* NA--this practice is not applicable to the Air Force, Army, and Navy 
because they are not required to produce such plans.

Practice 1.11: The agency developed an annual report, included as part 
of its budget submission, that describes progress in achieving goals 
for improving the efficiency and effectiveness of agency operations 
and, as appropriate, the delivery of services to the public through the 
effective use of IT.

Results; Yes: 25; Partially: 1; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Army, Commerce, DOD, Education, Energy, 
EPA, GSA, HHS, HUD, Interior, Justice, Labor, NASA, Navy, NRC, NSF, 
OPM, SSA, State, Transportation, Treasury, USAID, and VA have this 
practice in place;

* Partially--SBA has not reported progress on achieving its goals for 
improving the efficiency and effectiveness of agency operations.

Practice 1.12: The agency requires that its IT management processes be 
benchmarked against appropriate processes and/or organizations from the 
public and private sectors in terms of cost, speed, productivity, and 
quality of outputs and outcomes where comparable processes and 
organizations in the public or private sectors exist.

Results; Yes: 7; Partially: 9; No: 10; NA: 0.

Comments:

* Yes--Air Force, Army, DOD, Education, Navy, NRC, and VA have this 
practice in place;

* Partially--Agriculture, Commerce, Energy, GSA, Interior, NASA, SBA, 
SSA, and Transportation provided an example of a process that they have 
benchmarked, but benchmarking is being performed on an ad hoc basis;

* No--EPA, HHS[C], HUD[C], Justice, Labor, NSF, OPM, State, 
Treasury[C], and USAID do not have this practice in place.

Source: GAO.

[A] Due to its recent establishment, we did not include DHS as a part 
of this analysis.

[B] We have previously reported that agencies are making progress in 
addressing financial management system weaknesses but that agency 
management does not yet have the full range of information needed for 
accountability, performance reporting, and decision making. In 
addition, for fiscal year 2002, auditors reported that 19 agency 
systems were not compliant with the Federal Financial Management 
Improvement Act, including those of Agriculture, Commerce, Education, 
HUD, Interior, and NASA. (Financial Management: Sustained Efforts 
Needed to Achieve FFMIA Accountability, [Hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-03-1062] GAO-03-1062, Sept. 30, 
2003).

[C] The agency reported that it was taking, or planned to take, action 
to address this practice or elements of the practice.

Note: Yes--the practice was in place. Partially--the agency has some, 
but not all, aspects of the practice in place. Examples of 
circumstances in which the agency would receive this designation 
include when (1) some, but not all, of the elements of the practice 
were in place; (2) the agency documented that it has the information or 
process in place but it was not in the prescribed form (e.g., in a 
specific document as required by law or OMB); (3) the agency's 
documentation was in draft form; or (4) the agency had a policy related 
to the practice, but evidence supported that it had not been completely 
or consistently implemented. No--the practice was not in place. NA (not 
applicable)--the practice was not relevant to the agency's particular 
circumstances.

[End of table]

Agency IT officials could not always identify why practices were not in 
place, but in those instances in which reasons were identified, a 
variety of explanations were provided. For example, agency IT officials 
cited a lack of support from agency leadership; the fact that the 
agency had not been developing IRM plans until recently and recognized 
that its plan needed further refinement; processes that were being 
revised (in at least one case because of changes needed to reflect the 
loss of component organizations to the new DHS); and evolving 
requirements. In other cases, the agency reported that it had the 
information but that it was not in the format required by legislation. 
For instance, FISMA requires agencies to include in the performance 
plans required by the Government Performance and Results Act the 
resources, including budget, staffing, and training, and the time 
periods needed to implement their information security programs. None 
of the agencies included this information in their performance 
plans.[Footnote 36] However, the agencies commonly reported that they 
had this information but that it was in another document. Nevertheless, 
this does not negate the need for the agency to report to the Congress 
in the required form. This is particularly important because, as in the 
example of the FISMA requirement, the reporting requirement involves a 
public document, whereas other reports may not be publicly available.

In the case of DHS, although we did not include the department in our 
assessment or in table 1, the department is in the process of 
developing its first IT strategic plan. According to DHS, it expects to 
complete this plan by mid-February 2004.

Agencies' Use of IT Investment Management Practices Is Mixed:

The use of IT investment management practices is mixed (as shown in 
fig. 2), which demonstrates that agencies do not have all the processes 
in place to effectively select, control, and evaluate investments. An 
IT investment management process is an integrated approach to managing 
investments that provides for the continuous identification, selection, 
control, life-cycle management, and evaluation of IT investments. Among 
the investment management practices that are most frequently in place 
are having investment management boards and requiring that projects 
demonstrate that they are economically beneficial. Practices less 
commonly in place are those requiring that IT investments be performed 
in a modular, or incremental, manner and that they be effectively 
controlled. Only by effectively and efficiently managing their IT 
resources through a robust investment management process can agencies 
gain opportunities to make better allocation decisions among many 
investment alternatives and further leverage their IT investments.

Figure 2: Percentage of Agencies' Use of IT Investment Management 
Practices[A]:

[See PDF for image]

[A] Percentages do not add up to 100 percent due to rounding.

Note: Yes--the practice was in place. Partially--the agency has some, 
but not all, aspects of the practice in place. Examples of 
circumstances in which the agency would receive this designation 
include when (1) some, but not all, of the elements of the practice 
were in place; (2) the agency documented that it has the information or 
process in place but it was not in the prescribed form (e.g., in a 
specific document as required by law or OMB); (3) the agency's 
documentation was in draft form; or (4) the agency had a policy related 
to the practice, but evidence supported that it had not been completely 
or consistently implemented. No--the practice was not in place. Not 
applicable--the practice was not relevant to the agency's particular 
circumstances.

[End of figure]

Governmentwide Progress Demonstrated, but More Work Remains:

Critical aspects of IT investment management include developing well-
supported proposals, establishing investment management boards, and 
selecting and controlling IT investments. The agencies' use of 
practices associated with these aspects of investment management is 
wide-ranging, as follows:

* IT investment proposals. Various legislative requirements, an 
executive order, and OMB policies provide minimum standards that govern 
agencies' consideration of IT investments. In addition, we have issued 
guidance to agencies for selecting, controlling, and evaluating IT 
investments.[Footnote 37] Such processes help ensure, for example, that 
investments are cost-beneficial and meet mission needs and that the 
most appropriate development or acquisition approach is chosen.

The agencies in our review have mixed results when evaluated against 
these various criteria. For example, the agencies almost always require 
that proposed investments demonstrate that they support the agency's 
business needs, are cost-beneficial, address security issues, and 
consider alternatives. To illustrate, the Department of Transportation 
requires that proposed projects complete a business case to indicate 
that the project (1) will meet basic requirements in areas such as 
mission need, affordability, technical standards, and disabled access 
requirements, (2) is economically beneficial, and (3) has considered 
alternatives.

One element in this area that agencies were not as likely to have fully 
in place was the Clinger-Cohen Act requirement that agencies follow, to 
the maximum extent practicable, a modular, or incremental, approach 
when investing in IT projects. Incremental investment helps to mitigate 
the risks inherent in large IT acquisitions/developments by breaking 
apart a single large project into smaller, independently useful 
components with known and defined relationships and dependencies. An 
example of such an approach is DOD's policy stating that IT acquisition 
decisions should be based on phased, evolutionary segments that are as 
brief and narrow in scope as possible and that each segment should 
solve a specific part of an overall mission problem and deliver a 
measurable net benefit independent of future segments.[Footnote 38] 
However, 14 agencies do not have a policy that calls for investments to 
be done in a modular manner. For example, although the Environmental 
Protection Agency (EPA) reported that it worked with program offices to 
try to segment work so that the scope and size of each project is 
manageable, it does not have a policy that calls for investments to be 
done in a modular manner. The absence of a policy calls into question 
whether EPA is implementing incremental investment in a consistent and 
effective manner.

* Investment management boards. Our investment management guide states 
that establishing one or more IT investment boards is a key component 
of the investment management process. According to our guide, the 
membership of this board should include key business executives and 
should be responsible for final project funding decisions or should 
provide recommendations for the projects under its scope of authority. 
Such executive-level boards, made up of business-unit executives, 
concentrate management's attention on assessing and managing risks and 
regulating the trade-offs between continued funding of existing 
operations and developing new performance capabilities.

Almost all of the agencies in our review have one or more enterprise-
level investment management boards. For example, HUD's Technology 
Investment Board Executive Committee and supporting boards have 
responsibility for selecting, controlling, and evaluating the 
department's IT investments. HUD's contractor-performed maturity 
audits also have helped the department validate its board structure and 
its related investment management processes. However, the investment 
management boards for six agencies are not involved, or the agency did 
not document the board's involvement, in the control phase. For 
example, the National Science Foundation (NSF) has a CIO advisory group 
that addresses only the select phase of the IT investment management 
process. NSF's CIO explained that the agency reviews the progress of 
its major information system projects through other means, such as 
meetings with management. In providing comments on a draft of this 
report, the CIO stated that he believes that NSF has a comprehensive 
set of management processes and review structures to select, control, 
and evaluate IT investments and cited various groups and committees 
used as part of this process. However, NSF's summary of its investment 
management process and memo establishing the CIO advisory group include 
only general statements related to the oversight of IT investments, and 
NSF provided no additional documentation demonstrating that its 
investment management board plays a role in the control and evaluation 
phases. Our investment management guidance identifies having an IT 
investment management board (or boards) responsible for project 
oversight as a critical process. Maintaining responsibility for oversight with the 
same body that selected the investment is crucial to fostering a 
culture of accountability by holding the investment board that 
initially selected an investment responsible for its ongoing success.

In addition, 17 agencies do not fully address the practice that calls 
for processes to be in place that address the coordination and 
alignment of multiple investment review boards. For example, we 
recently reported that the Department of the Interior has established 
three department-level IT investment boards and begun to take steps to 
ensure that investment boards are established at the bureau 
level.[Footnote 39] However, at the time of our review, the department 
(1) could not assert that department-level board members exhibited core 
competencies in using Interior's IT investment approach and (2) had 
limited ability to oversee investments in its bureaus. We made 
recommendations to Interior to strengthen both the activities of the 
department-level boards and the department's ability to oversee 
investment management activities at the bureaus.

* Selection of IT investments. During the selection phase of an IT 
investment management process, the organization (1) selects projects 
that will best support its mission needs and (2) identifies and 
analyzes each project's risks and returns before committing significant 
funds. To achieve desired results, it is important that agencies have a 
selection process that, for example, uses selection criteria to choose 
the IT investments that best support the organization's mission and 
prioritizes proposals.

Twenty-two agencies use selection criteria in choosing their IT 
investments. In addition, about half the agencies use scoring 
models[Footnote 40] to help choose their investments. For example, the 
working group and CIO office officials that support the Department of 
Education's investment review board used a scoring model as part of 
deciding which IT investments to recommend for the board's 
consideration and approval. This model contained two main categories of 
criteria: (1) value criteria that measured the impact and significance 
of the initiative, given project goals and the strategic objectives of 
the department; and (2) health criteria that measured the potential for 
the success of the initiative and helped to assess both the performance 
and the associated risks involved in project and contract management. 
(A simplified sketch of this kind of scoring model follows this list.) 
In the case of DOD, we reported in February 2003 that it had 
established some, and was establishing other, IT investment criteria 
but that these criteria had not been finalized.[Footnote 41] 
Accordingly, we recommended, and DOD concurred, that DOD establish a 
standard set of criteria. In September 2003, we reported that this 
recommendation had not been implemented.[Footnote 42] DOD officials 
stated that the department was developing the criteria but that the 
proposed governance structure had not yet been adopted.

* Control over IT investments. During the control phase of the IT 
investment management process, the organization ensures that, as 
projects develop and as funds are spent, the project is continuing to 
meet mission needs at the expected levels of cost and risk. If the 
project is not meeting expectations or if problems have arisen, steps 
are quickly taken to address the deficiencies. Executive-level 
oversight of project-level management activities provides the 
organization with increased assurance that each investment will achieve 
the desired cost, benefit, and schedule results.

Although no agency had the practices associated with the control 
phase fully in place, some have implemented important aspects of this 
phase. For example, Labor requires project managers to prepare a 
control status report, based on a review schedule established during 
the selection phase, that is reviewed by the Office of the CIO and its 
technical review board as part of determining whether to continue, 
modify, or cancel the initiative.[Footnote 43] For initiatives meeting 
certain criteria, the technical review board makes recommendations to 
the management council, which serves as the department's top-tier 
executive investment review council, is chaired by the Assistant 
Secretary for Administration and Management, and consists of component 
agency heads.

Nevertheless, in general, the agencies are weaker in the practices 
pertaining to the control phase of the investment management process 
than in the selection phase. In particular, the agencies did not always 
have important mechanisms in place for agencywide investment management 
boards to effectively control investments, including decision-making 
rules for project oversight, early warning mechanisms, or requirements 
that corrective actions for under-performing projects be agreed upon 
and tracked (a simplified early warning sketch follows this 
discussion). For example, the Department of the Treasury 
does not have a department-level control process; instead, each bureau 
may conduct its own reviews that address the performance of its IT 
investments and corrective actions for under-performing projects. In a 
multitiered organization like Treasury, the department is responsible 
for providing leadership and oversight for foundational critical 
processes by ensuring that written policies and procedures are 
established, repositories of information are created that support IT 
investment decision making, resources are allocated, responsibilities 
are assigned, and all of the activities are properly carried out where 
they may be most effectively executed. In such an organization, the CIO 
is specifically responsible for ensuring that the organization is 
effectively managing its IT investments at every level. Treasury IT 
officials recognize the department's weaknesses in this area and 
informed us that they are working on developing a new capital planning 
and investment control process that is expected to address these 
weaknesses. Similarly, the Department of Energy plans to implement, in 
fiscal year 2004, the investment control process outlined in its 
September 2003 capital planning and investment control guide, which 
addresses important elements such as corrective action plans. However, 
this guide does not document the role of Energy's investment management 
boards in this process.
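
The following sketch is purely illustrative; it is not drawn from the 
documentation of Education or any other agency, and the criteria names, 
weights, and ratings are hypothetical assumptions. It shows, in Python, 
how a simple weighted scoring model of the kind described above--value 
criteria and health criteria--might rank proposed IT investments for a 
board's consideration.

# Hypothetical weighted scoring model for ranking proposed IT investments.
# The criteria, weights, and ratings below are illustrative assumptions only.

# Criteria are grouped into "value" and "health" categories, mirroring the
# two categories described above; the weights sum to 1.0.
WEIGHTS = {
    "value": {"mission_impact": 0.30, "strategic_alignment": 0.20},
    "health": {"project_health": 0.25, "contract_health": 0.25},
}

def score_proposal(ratings):
    """Compute a weighted score (0-100) from per-criterion ratings (0-100)."""
    total = 0.0
    for criteria in WEIGHTS.values():
        for criterion, weight in criteria.items():
            total += weight * ratings[criterion]
    return total

# Hypothetical proposals with notional ratings against each criterion.
proposals = {
    "Case management modernization": {
        "mission_impact": 85, "strategic_alignment": 70,
        "project_health": 60, "contract_health": 75,
    },
    "Legacy system maintenance": {
        "mission_impact": 40, "strategic_alignment": 50,
        "project_health": 90, "contract_health": 85,
    },
}

# Rank proposals by weighted score, highest first, for the board's review.
for name in sorted(proposals, key=lambda n: score_proposal(proposals[n]),
                   reverse=True):
    print(f"{name}: {score_proposal(proposals[name]):.1f}")

In practice, an agency's model would reflect its own criteria and 
weighting scheme and would inform, rather than replace, the investment 
board's deliberations.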
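
As a second purely illustrative sketch, with made-up thresholds and 
project data, the Python fragment below shows the kind of early warning 
check that a control-phase review could apply: comparing actual cost 
and schedule figures against expected values and flagging any project 
whose deviation exceeds a threshold so that a corrective action plan 
can be requested and tracked.

# Hypothetical early warning check for control-phase oversight.
# Thresholds and project data are illustrative assumptions only.

COST_THRESHOLD = 0.10      # flag projects more than 10 percent over cost
SCHEDULE_THRESHOLD = 0.10  # flag projects more than 10 percent behind schedule

projects = [
    # (name, expected cost ($M), actual cost ($M), planned months, actual months)
    ("Grants processing system", 12.0, 14.5, 18, 22),
    ("Data center consolidation", 30.0, 30.5, 24, 24),
]

def deviation(expected, actual):
    """Fractional deviation of actual from expected (positive means overrun)."""
    return (actual - expected) / expected

for name, exp_cost, act_cost, plan_months, act_months in projects:
    cost_dev = deviation(exp_cost, act_cost)
    sched_dev = deviation(plan_months, act_months)
    if cost_dev > COST_THRESHOLD or sched_dev > SCHEDULE_THRESHOLD:
        # An under-performing project would be reported to the investment
        # management board, and corrective actions agreed upon and tracked.
        print(f"FLAG {name}: cost {cost_dev:+.0%}, schedule {sched_dev:+.0%}")
    else:
        print(f"OK   {name}: within thresholds")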

Practice-Specific Analysis:

Table 2 provides additional detail on each investment management 
practice and our evaluation of whether each agency had the practice in 
place. The table indicates those practices in which improvement is 
needed as well as which agencies reported that they were taking, or 
planned to take, actions to address the practices or elements of 
practices.

Table 2: IT Investment Management Practices[A]:

Practice 2.1: The agency has a documented IT investment management 
process that, at a minimum:

* specifies the roles of key people (including the CIO) and groups 
within the IT investment management process;

* outlines significant events and decision points;

* identifies external and environmental factors that influence the 
process;

* explains how the IT investment management process is coordinated with 
other organizational plans and processes; and

* describes the relationship between the investment management process 
and the agency's enterprise architecture.

Results; Yes: 12; Partially: 14; No: 0; NA: 0.

Comments:

* Yes--Commerce, Education, Energy, GSA, HUD, Interior, Justice, OPM, 
SBA, SSA, State, and USAID have this practice in place;

* Partially--Agriculture and Labor do not describe the relationship 
between their investment management processes and their enterprise 
architectures in their IT capital planning and investment control 
guides. Air Force's, EPA's, and VA's documentation related to this 
practice is in draft form. In addition, Air Force's[C] draft portfolio management 
document does not identify external and environmental factors or 
describe the relationship between the investment management process and 
the enterprise architecture. DOD[C] is piloting a draft IT portfolio 
management policy, but this policy does not address how this process 
relates to its other organizational plans and processes and its 
enterprise architecture or identify external and environmental factors. 
HHS[C] does not address how this process relates to its other 
organizational plans and processes and its enterprise architecture or 
identify external and environmental factors. NRC's current and draft 
capital planning and investment control policies do not address how 
this process relates to its other organizational plans and processes 
and its enterprise architecture or identify external and environmental 
factors. Army's and NASA's[C] investment management policies and 
guidance do not describe the relationship of this process to their 
enterprise architectures. Navy[C] recognizes the need to clarify roles and 
responsibilities related to IT investment management, and its IT 
capital planning guide does not identify external and environmental 
factors. NSF does not have an IT investment management guide, and its 
summary of its policy does not address how this process relates to its 
other organizational plans and processes and its enterprise 
architecture or identify external and environmental factors. 
Transportation reported that there was little integration between its 
capital planning and investment control process and the budget. 
Treasury[C] does not have a capital planning and investment control 
guide, and its documentation supporting this practice is in draft form.

Practice 2.2: The agency established one or more agencywide IT 
investment management boards responsible for selecting, controlling, 
and evaluating IT investments that, at a minimum:

* have final project funding decision authority (or provide 
recommendations) over projects within their scope of authority; and

* are composed of key business unit executives.

Results; Yes: 14; Partially: 10; No: 2; NA: 0.

Comments:

* Yes--Agriculture, Commerce, Education, GSA, HHS, HUD, Interior, 
Labor, OPM, SBA, SSA, State, Transportation, and VA have this practice 
in place;

* Partially--Treasury[C] and USAID have not completely implemented this 
practice. Air Force, Army[C], Energy, NASA, NRC, and NSF's IT 
investment management boards are not responsible for controlling and 
evaluating IT investments, or this role has not been fully documented. 
EPA's documentation in support of this practice is in draft form. 
Navy's[C] IT investment management board governance process is not 
completely implemented;

* No--DOD[B,C] does not have this practice in place. Justice[C] 
reported that it is piloting an IT investment management board, but did 
not provide documentation on the responsibilities, processes, or makeup 
of this board.

Practice 2.3: The work processes and decision-making processes of the 
agencywide board(s) are described and documented.

Results; Yes: 9; Partially: 6; No: 11; NA: 0.

Comments:

* Yes--Agriculture, Commerce, Education, HUD, Interior, Labor, SBA, 
State, and Transportation have this practice in place;

* Partially--Army has not consistently implemented this practice. GSA 
did not have policies and procedures for each of its IT investment 
management boards. HHS has not established procedures for the 
development, documentation, and review of IT investments. EPA and VA's 
documentation related to this practice is in draft form. USAID has not 
completely implemented this practice;

* No--Air Force, DOD[B,C], Energy, Justice[C], NASA, Navy[C], NRC, NSF, 
OPM, SSA[C], and Treasury[C] do not have this practice in place.

Practice 2.4: If more than one IT investment management board exists in 
the organization (e.g., at the component level), the organization has:

* documented policies and procedures that describe the processes for 
aligning and coordinating IT investment decision making;

* criteria for determining where in the organization different types of 
IT investment decisions are made; and

* processes that describe how cross-functional investments and 
decisions (e.g., common applications) are handled.

Results; Yes: 2; Partially: 10; No: 7; NA: 7.

Comments:

* Yes--GSA and Labor have this practice in place;

* Partially--Agriculture does not have documented policies and 
processes for aligning and coordinating IT investment decision making 
or processes for describing how cross-functional investments and 
decisions are made. Air Force, Army, Commerce, Education, HHS[C], and 
Transportation[C] do not have documented policies and procedures for 
aligning and coordinating investment decision making among their 
investment management boards. Interior[B] has not fully implemented its 
governance process for aligning and coordinating its IT investment 
decision making. OPM did not describe its criteria for determining 
major systems or describe how cross-functional investments and 
decisions are handled. SBA did not address whether its enterprisewide 
board can invoke final decision-making authority over its program 
office boards;

* No--DOD[B,C], Energy, NASA, Navy[C], Treasury[C], and VA[C] do not 
have this practice in place. Justice[C] reported that it is piloting an 
IT investment management board but did not provide supporting 
documentation;

* NA--EPA, HUD, NRC, NSF, SSA, State, and USAID do not have multiple IT 
investment management boards.

Practice 2.5: As part of its investment management process, the agency 
has available an annually updated comprehensive inventory of its major 
information systems that includes major national security systems and 
interfaces.

Results; Yes: 21; Partially: 5; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Commerce, Education, EPA, GSA, HUD, 
Interior, Justice, Labor, Navy, NRC, NSF, OPM, SBA, SSA, State, 
Transportation, Treasury, USAID, and VA have this practice in place;

* Partially--Army's[C] inventory is not complete and does not include 
interfaces. A DOD inspector general report stated that DOD's inventory 
may not capture the universe of current DOD business management 
systems. Energy and NASA's inventories do not include interfaces. HHS 
reported that its Exhibit 300s fulfill the requirements of this 
practice but did not provide supporting documentation.

Practice 2.6: A standard, documented procedure is used so that 
developing and maintaining the inventory is a repeatable event, which 
produces inventory data that are timely, sufficient, complete, and 
compatible.

Results; Yes: 21; Partially: 1; No: 4; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Commerce, DOD, Education, EPA, GSA, HUD, 
Interior, Justice, Labor, Navy, NSF, OPM, SBA, SSA, State, 
Transportation, Treasury, USAID, and VA have this practice in place;

* Partially--Army's documentation is in draft form;

* No--Energy[C], HHS, NASA, and NRC do not have this practice in place.

Practice 2.7: The IT asset inventory is used as part of managerial 
decision making.

Results; Yes: 12; Partially: 11; No: 3; NA: 0.

Comments:

* Yes--Agriculture, Army, Commerce, Education, GSA, HUD, Labor, Navy, 
SSA, State, Transportation, and VA have this practice in place;

* Partially--DOD, Energy, EPA, Interior[B], NRC[C], NSF, OPM, SBA, and 
USAID do not explicitly document how their IT asset inventory is used 
to identify asset duplication. Air Force reported that its inventory is 
not being consistently used to identify asset duplication. Justice[C] 
reported that it has begun to use its IT asset inventory to identify 
asset duplication as part of a pilot of its new IT investment 
management process;

* No--HHS, NASA, and Treasury[C] do not have this practice in place.

Practice 2.8: Proposed IT investments are required to document that 
they have addressed the following items during project planning:

* that the project supports the organization's business and mission 
needs and meets users' needs;

* whether the function should be performed by the public or private 
sector;

* whether the function or project should be performed or is being 
performed by another agency;

* that alternatives have been considered; and

* how security will be addressed.

Results; Yes: 25; Partially: 1; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Army, Commerce, DOD, Education, Energy, 
EPA, GSA, HUD, Interior, Justice, Labor, NASA, Navy, NRC, NSF, OPM, 
SBA, SSA, State, Transportation, Treasury, USAID, and VA have this 
practice in place;

* Partially--the HHS policy addressing the element related to whether 
the function or project should be performed by the private sector or 
another government agency is in draft form. This information is 
normally contained in the Exhibit 300s, but HHS did not provide us with 
this documentation.

Practice 2.9: In considering a proposed IT project, the agency requires 
that the project demonstrate that it is economically beneficial through 
the development of a business case that at least addresses costs, 
benefits, schedule, and risks.

Results; Yes: 25; Partially: 1; No: 0; NA: 0.

Comments:

* Yes--Agriculture, Air Force, Army, Commerce, Education, Energy, EPA, 
GSA, HHS, HUD, Interior, Justice, Labor, NASA, Navy, NRC, NSF, OPM, 
SBA, SSA, State, Transportation, Treasury, USAID, and VA have this 
practice in place;

* Partially--DOD has not consistently implemented this practice.

Practice 2.10: In considering a proposed IT project, the agency 
requires that the project demonstrate that it is consistent with 
federal and agency enterprise architectures.

Results; Yes: 20; Partially: 6; No: 0; NA: 0.

Comments:

* Yes--Air Force, Army, Commerce, Education, Energy, EPA, GSA, HHS, 
HUD, Interior, Labor, Navy, NRC, NSF, OPM, SSA, State, Transportation, 
Treasury, and VA have this practice in place;

* Partially--agencies are required to describe in their budget Exhibit 
300s how major IT investments align with the agency's enterprise 
architecture and the federal enterprise architecture. However, the 
following agencies do not have policies requiring compliance with their 
enterprise architectures, which is a core element in our Enterprise 
Architecture Management Framework. Agriculture and 
NASA[B,C] do not have approved policies related to this practice but 
require compliance as part of their IT investment management reviews 
for the fiscal year 2005 budget cycle. Justice and USAID do not have a 
policy requiring that IT projects comply with their enterprise 
architecture. SBA's policy requiring compliance with its enterprise 
architecture is in draft form. DOD does not have a policy requiring 
compliance with its business enterprise architecture.

Practice 2.11: The agency requires that the proposed IT investment, at 
a minimum:

* support work processes that it has simplified or redesigned to reduce 
costs and improve effectiveness; and

* make maximum use of commercial-off-the-shelf (COTS) software.

Results; Yes: 11; Partially: 7; No: 8; NA: 0.

Comments:

* Yes--Air Force, Army, DOD, GSA, Justice, Labor, NASA, Navy, NSF, SSA, 
and VA have this practice in place;

* Partially--Education, HHS, Interior, and SBA do not require that 
proposed IT investments support work processes that have been 
simplified or redesigned. NRC has policies related to this practice but 
reported that they have not been fully integrated into its investment 
decision making. Energy's business case guidelines address this 
practice, but Energy reported that consideration of these factors is 
not required for selection and approval. EPA's policy related to COTS 
is in draft form;

* No--Agriculture[C], Commerce, HUD, OPM, State, Transportation, 
Treasury[C], and USAID do not have this practice in place.

Practice 2.12: The agency has established project selection criteria 
distributed throughout the organization that include, at a minimum:

* cost, benefit, schedule, and risk elements;

* measures such as net benefits, net risks, and risk-adjusted return on 
investment; and

* qualitative criteria for comparing and prioritizing alternative 
information systems investment projects.

Results; Yes: 6; Partially: 16; No: 4; NA: 0.

Comments:

* Yes--Agriculture, GSA, Energy, NASA, Transportation, and VA have this 
practice in place;

* Partially--Commerce[C], Education, HUD[C], Justice, Labor, Navy, 
SBA[B], State, and USAID have project selection criteria that do not 
include net risk and risk-adjusted return on investment. DOD[B] has 
established some IT investment criteria, but these criteria are not 
finalized or part of an investment review process. EPA has project 
selection criteria that do not include net risks, risk-adjusted return 
on investment, or qualitative criteria. EPA's documentation in support 
of this practice is also in draft form. Interior's project selection 
criteria do not include cost and schedule. Air Force[C] and Army's[C] 
project selection criteria do not include cost, benefit, schedule, and 
risk elements or measures such as net benefits, net risks, and risk-
adjusted return on investment. OPM has not consistently implemented 
this practice. SSA's[B] criteria are high-level and not explicit;

* No--HHS[C], NRC, NSF, and Treasury[C] do not have this practice in 
place.

Practice 2.13: The agency has established a structured selection 
process that, at a minimum:

* selects IT proposals using selection criteria;

* identifies and addresses possible IT investments and proposals that 
are conflicting, overlapping, strategically unlinked, or redundant;

* prioritizes proposals; and

* is integrated with budget, financial, and program management 
decisions.

Results; Yes: 8; Partially: 15; No: 3; NA: 0.

Comments:

* Yes--Agriculture, Commerce, Education, GSA, HUD, Labor, SBA, and 
State have this practice in place;

* Partially--Air Force's[C] documentation in support of this practice 
is in draft form and does not include prioritizing proposals across the 
enterprise or the use of a scoring model. Army's prioritized list is 
limited to investments to address capability shortfalls. DOD[C] is 
piloting a draft IT portfolio management policy that includes a 
selection process. EPA's documentation of its selection processes is in 
draft form. Energy, Interior, and Transportation do not prioritize 
their IT proposals. Justice[C] does not use a scoring model or 
prioritize or rank its IT proposals. NASA[C] does not have a process 
for identifying possible conflicting, overlapping, strategically 
unlinked, or redundant proposals; does not use a scoring model; and 
does not prioritize or rank its IT proposals. Navy generally does not 
use its IT investment management boards outlined in its governance 
process as part of its IT investment selection process and does not use 
a scoring model or prioritize or rank its IT proposals. NRC does not 
select IT proposals using selection criteria, prioritize proposals, or 
document how its selection process is integrated with budget, 
financial, and program management decisions. OPM has not consistently 
implemented this practice. SSA[B] does not use a scoring model. 
USAID[C] does not have a process for identifying possible conflicting, 
overlapping, strategically unlinked, or redundant proposals. VA does 
not have a process to identify and address possible conflicting, 
overlapping, strategically unlinked, or redundant IT investments and 
does not prioritize IT proposals for selection;

* No--HHS[C], Treasury[C], and NSF do not have this practice in place.

Practice 2.14: Agency policy calls for investments to be modularized 
(e.g., managed and procured in well-defined useful segments or modules 
that are short in duration and small in scope) to the maximum extent 
achievable.

Results; Yes: 9; Partially: 3; No: 14; NA: 0.

Comments:

* Yes--Air Force, Army, Education, Justice, NASA, Navy, NRC, SBA, and 
VA have this practice in place;

* Partially--DOD has not consistently implemented this practice. HHS 
and NSF's documentation supporting this practice is in draft form;

* No--Agriculture, Commerce[C], Energy, EPA, GSA[C], HUD, Interior, 
Labor, OPM, SSA, State, Transportation[C], Treasury, and USAID do not 
have this practice in place.

Practice 2.15: The agencywide investment management board(s) has 
written policies and procedures for management oversight of IT projects 
that cover, at a minimum:

* decision-making rules for project oversight that allow for 
terminating projects, when appropriate;

* current project data, including expected and actual cost, schedule, 
and performance data, to be provided to senior management periodically 
and at major milestones;

* criteria or thresholds related to deviations in cost, schedule, or 
system capability actuals versus expected project performance; and

* the generation of an action plan to address a project's problem(s) 
and track resolution.

Results; Yes: 0; Partially: 20; No: 6; NA: 0.

Comments:

* Partially--Agriculture[C] reported that it has not implemented the 
corrective action plan element in a consistent manner. Air Force[C], 
NASA[C], and SSA[C] have control processes but do not explicitly 
document the role, responsibility, and authority of their 
enterprisewide IT investment management boards in the control phase. 
Army[C], DOD[B,C], and Navy's control processes do not involve 
enterprisewide IT investment management boards. Commerce[C] does not 
have decision-making rules to guide oversight of IT investments, and 
its projects are not required to submit reports of deviations in system 
capability. Education has not consistently required corrective actions 
or tracked corrective actions related to control phase reviews. GSA 
does not have clear decision-making rules, require projects to report 
on deviations in system capability, or require that corrective actions 
be tracked to resolution. HHS[C] does not have decision-making rules to 
guide oversight of IT investments, review projects at major milestones, 
or systematically track corrective actions. HUD[C] does not require 
reports of deviations of system capability or monitor projects at key 
milestones. Interior[B] does not have decision-making rules for 
oversight of IT investments, require reports of deviations of system 
capability, or require corrective action plans. Justice[B,C] reported 
that it is piloting an IT investment management board that includes the 
control phase but has not provided documentation supporting that all of 
the practice elements are addressed. Labor and Transportation have 
evaluation criteria to assess investments during the control phase, but 
do not have decision-making rules to guide their investment management 
boards' decisions. OPM has not consistently implemented this practice. 
State's draft documentation does not require projects to be reviewed at 
key milestones. USAID[C] does not have decision-making rules, require 
reports on deviations in system capability, or review projects at 
major milestones, and its policy for requiring action plans is in draft 
form. VA's[C] policies and procedures on decision-making rules, 
criteria or thresholds for system capability, and the generation of 
action plans have not been fully documented;

* No--SBA[B,C] and Treasury[C] do not have this practice in place. 
Energy plans to implement a control process in fiscal year 2004, but 
its new capital planning and investment review guide does not address 
the role of its investment management boards in the process. EPA[C] is 
implementing its control process in fiscal year 2004. NRC's current and 
draft capital planning and investment control documentation do not 
address the elements of this practice and do not explicitly document 
the role, responsibility, and authority of its enterprisewide IT 
investment management board in this process. NSF's investment 
management board is not responsible for the control process. NSF 
reported that it uses other mechanisms to implement this practice but 
provided no supporting documentation.

Practice 2.16: The agencywide investment management board(s) 
established an oversight mechanism of funded investments that, at a 
minimum:

* determines whether mission requirements have changed;

* determines whether the investment continues to fulfill ongoing and 
anticipated mission requirements;

* determines whether the investment is proceeding in a timely manner 
toward agreed-upon milestones;

* employs early warning mechanisms that enable it to take corrective 
action at the first sign of cost, schedule, or performance slippages; 
and

* includes the use of independent verification and validation (IV&V) 
reviews of under-performing projects, where appropriate.

Results; Yes: 2; Partially: 19; No: 5; NA: 0.

Comments:

* Yes--GSA and VA have this practice in place;

* Partially--Agriculture[C] reported that its oversight of IT 
investments has not been consistently implemented. Air Force[C], 
NASA[C], and SSA[C] have control processes but did not explicitly 
document the role, responsibility, and authority of their 
enterprisewide IT investment management boards in this process. 
Army[C], DOD[B,C], and Navy's control processes do not involve 
enterprisewide IT investment management boards. Commerce and Labor do 
not employ an early warning mechanism. State[C] has procedures for 
control phase reviews, but they are not fully implemented. Education, 
HHS, and HUD do not have a process for using IV&V reviews. 
Interior[B,C] does not have a process to determine whether investments 
are proceeding in a timely manner toward agreed-upon milestones, employ 
an early warning mechanism, or use IV&V reviews. Justice[B,C] reported 
that it is piloting an IT investment management board that includes the 
control phase but did not provide documentation supporting that all of 
the practice elements are addressed. OPM has not consistently 
implemented this practice. SBA[B] did not provide evidence that it had 
implemented all of the oversight mechanisms in its investment 
management guide and did not use IV&V reviews. Transportation and USAID 
do not employ an early warning system or have a process for using IV&V 
reviews;

* No--Treasury[C] does not have this practice in place. Energy plans to 
implement a control process in fiscal year 2004, but its new capital 
planning and investment review guide does not address the role of its 
investment management boards in the process. EPA[C] is implementing its 
control process in fiscal year 2004. NRC's current and draft capital 
planning and investment control documentation does not address the 
elements of this practice and does not explicitly document the role, 
responsibility, and authority of its enterprisewide IT investment 
management board in this process. NSF's investment management board is 
not responsible for the control process. NSF reported that it uses 
other mechanisms to implement this practice but provided no supporting 
documentation.

Practice 2.17: Corrective actions for under-performing projects are 
agreed upon, documented, and tracked by the agencywide investment 
management board(s).

Results; Yes: 5; Partially: 12; No: 9; NA: 0.

Comments:

* Yes--Commerce, HUD, Labor, Transportation, and VA have this practice 
in place;

* Partially--Agriculture[C] and SBA[B] reported that they have not 
consistently implemented this practice. Air Force[C], NASA[C], and SSA 
have control processes but did not explicitly document the role, 
responsibility, and authority of their enterprisewide IT investment 
management boards in this process. SSA[C] also did not provide support 
that it was tracking corrective actions. Army[C], DOD[B], and Navy's 
control processes do not involve enterprisewide IT investment 
management boards. Education has not consistently required corrective 
actions or tracked corrective actions related to control phase reviews. 
GSA and HHS[C] do not systematically track corrective actions. State[C] 
has procedures for control phase reviews, but they are not fully 
implemented;

* No--Interior[C], Justice, OPM, Treasury[C], and USAID do not have 
this practice in place. Energy plans to implement a control process in 
fiscal year 2004, but its new capital planning and investment review 
guide does not address the role of its investment management boards in 
the process. EPA[C] is implementing its control process in fiscal year 
2004. NRC's current and draft capital planning and investment control 
documentation does not address the elements of this practice and does 
not explicitly document the role, responsibility, and authority of its 
enterprisewide IT investment management board in this process. NSF's 
investment management board is not responsible for the control process. 
NSF reported that it uses other mechanisms to implement this practice, 
but provided no supporting documentation.

Practice 2.18: The agencywide investment management board(s) requires 
that postimplementation reviews be conducted to:

* validate expected benefits and costs; and

* document and disseminate lessons learned.

Results; Yes: 6; Partially: 17; No: 3; NA: 0.

Comments:

* Yes--Agriculture, GSA, HUD, Labor, OPM, and VA have this practice in 
place;

* Partially--Army, DOD, NASA[C], Navy, NRC, NSF, and SSA's[C] 
evaluation processes do not involve an enterprisewide IT investment 
management board. NSF also does not define what is to be included in a 
postimplementation review, and SSA[B] reported that such reviews are not 
done regularly. Commerce[C] reported that postimplementation reviews 
have not been consistently completed and are not required to be 
reported to its investment management board. Air Force's[C] 
documentation in support of this practice is in draft form and does not 
document the role of its IT investment management boards in this 
process. Education[C] reported that postimplementation reviews were not 
always performed. Energy[C], Justice[B], Transportation[C], and USAID 
have a policy related to this practice, but it has not been 
implemented. Also, Energy's processes do not involve an enterprisewide 
IT investment management board. HHS, SBA[B,C], and State[C] have a 
policy related to this practice but did not provide evidence that it 
has been completely implemented. In addition, HHS's policy does not 
specifically address validating expected benefits and costs;

* No--EPA[C] is implementing its evaluation process in fiscal year 
2004. Interior[C] and Treasury[C] do not have this practice in place.

Source: GAO.

[A] Due to its recent establishment, we did not include DHS as a part 
of this analysis.

[B] We have an outstanding recommendation related to this practice.

[C] The agency reported that it was taking, or planned to take, action 
to address this practice, or elements of the practice.

Note: Yes--the practice was in place. Partially--the agency has some, 
but not all, aspects of the practice in place. Examples of 
circumstances in which the agency would receive this designation 
include when (1) some, but not all, of the elements of the practice 
were in place; (2) the agency documented that it has the information or 
process in place but it was not in the prescribed form (e.g., in a 
specific document as required by law or OMB); (3) the agency's 
documentation was in draft form; or (4) the agency had a policy related 
to the practice, but evidence supported that it had not been completely 
or consistently implemented. No--the practice was not in place. Not 
applicable--the practice was not relevant to the agency's particular 
circumstances.

[End of table]

Among the variety of reasons cited for practices not being fully in 
place were that the CIO position had been vacant, that not including a 
requirement in the IT investment management guide was an oversight, and 
that the process was being revised. However, in some cases the agencies 
could not identify why certain practices were not in place.

Regarding DHS, although we did not include the department in our 
assessment or in table 2, it has put in place, or is in the process of 
putting in place, investment management processes.

Conclusions:

Federal agencies did not always have in place important practices 
associated with IT laws, policies, and guidance. At the governmentwide 
level, agencies generally have IT strategic plans or information 
resources management (IRM) plans that address IT elements, such as 
security and enterprise architecture, but do not cover other aspects of 
IRM that are part of the Paperwork Reduction Act, such as information 
collection, records management, and privacy. This may be attributed, in 
part, to OMB not establishing comprehensive guidance for the agencies 
detailing the elements that should be included in such a plan. There 
were also numerous instances in which individual agencies did not have 
specific IT strategic planning, performance measurement, or investment 
management practices fully in place. Agencies cited a variety of 
reasons for not having these practices in place, such as that the CIO 
position had been vacant, that not including a requirement in guidance 
was an oversight, or that the process was being revised. Nevertheless, 
not only are these practices based on law, executive orders, OMB 
policies, and our guidance, but they are also important ingredients for 
ensuring effective strategic planning, performance measurement, and 
investment management, which, in turn, make it more likely that the 
billions of dollars in government IT investments will be wisely spent. 
Accordingly, we believe it is important that individual agencies 
implement these practices expeditiously.

Recommendations:

To help agencies in developing strategic IRM plans that fully comply 
with the Paperwork Reduction Act of 1995, we recommend that the 
Director, OMB, develop and disseminate to agencies guidance on 
developing such plans. At a minimum, such guidance should address all 
elements of IRM, as defined by the Paperwork Reduction Act. As part of 
this guidance, OMB should also consider the most effective means for 
agencies to communicate information about any major IT acquisition 
program(s) or phase or increment of that program that significantly 
deviated from cost, performance, or schedule goals established by the 
program. One option for communicating this information, for example, 
could be through the annual agency performance reports that are 
required by the Government Performance and Results Act.

We are also generally making recommendations to the agencies in our 
review regarding those practices that are not fully in place unless, 
for example, (1) we have outstanding recommendations related to the 
practice, (2) the agency has a draft document addressing the practice, 
or (3) implementation of the practice was ongoing. Appendix I contains 
these recommendations.

Agency Comments and Our Evaluation:

We received written or oral comments on a draft of this report from OMB 
and 25 of the agencies in our review.[Footnote 44] We also requested 
comments from the Department of Homeland Security and the Office of 
Personnel Management, but none were provided.

Regarding OMB, in oral comments on a draft of this report, 
representatives from OMB's Office of Information and Regulatory Affairs 
and Office of the General Counsel questioned the need for additional 
IRM plan guidance because they do not want to be prescriptive in terms 
of what agencies include in their plans. We continue to believe that 
agencies need additional OMB guidance on the development and content of 
their IRM plans because OMB Circular A-130 does not provide overall 
guidance on the contents of such plans and because half of the agencies 
indicated a need for additional guidance in this area. Further, 
additional guidance
would help to ensure that agency plans address all elements of IRM, as 
defined by the Paperwork Reduction Act. A strategic IRM plan that 
communicates a clear and comprehensive vision for how the agency will 
use information resources to improve agency performance is important 
because IRM encompasses virtually all aspects of an agency's 
information activities.

In commenting on a draft of the report, most of the agencies in our 
review generally agreed with our findings and recommendations. The 
agencies' specific comments are as follows:

* Agriculture's CIO stated that the department concurred with the 
findings in this report and provided information on action it was 
taking, or planned to take, to implement the recommendations. 
Agriculture's written comments are reproduced in appendix II.

* The Secretary of Commerce concurred with the recommendations in this 
report and stated that, in response, the department is updating its 
policies and procedures. Commerce's written comments are reproduced in 
appendix III.

* DOD's Deputy Assistant Secretary of Defense (Deputy CIO) stated that 
the department concurred or partially concurred with the 
recommendations in this report. DOD also provided additional 
documentation and information on actions that it was taking, or planned 
to take, to address these recommendations. We modified our report based 
on these comments and documentation, as appropriate. DOD's written 
comments, along with our responses, are reproduced in appendix IV.

* Education's Assistant Secretary for Management/CIO stated that the 
agency generally agreed with our assessment of the department's use of 
IT strategic planning/performance measurement and investment 
management practices. Education provided additional comments and 
documentation related to two of our practices. We modified our report 
on the basis of these comments and documentation, as appropriate. 
Education's written comments, along with our responses, are reproduced 
in appendix V.

* Energy's Director of Architecture and Standards provided e-mail 
comments stating that the department believes that GAO fairly depicted 
where the department currently stands in the IT investment management 
process. The director also provided other comments that were technical 
in nature and that we addressed, as appropriate.

* EPA's Assistant Administrator/CIO generally agreed with our findings 
and recommendations on the need to complete work currently under way to 
formalize the documentation of IT management practices. However, EPA 
questioned our characterization of the agency's IT management and 
strategic planning and provided other comments, which we addressed, as 
appropriate. EPA's written comments, along with our responses, are 
reproduced in appendix VI.

* GSA's CIO stated that the agency generally agreed with the findings 
and recommendations in the report. GSA provided suggested changes and 
additional information and documentation related to nine of our 
practices and two recommendations. We modified our report on the basis 
of these comments and documentation, as appropriate. GSA's written 
comments, along with our responses, are reproduced in appendix VII.

* HHS's Acting Principal Deputy Inspector General stated that the 
department concurred with the findings and recommendations of the 
report. HHS's written comments are reproduced in appendix VIII.

* HUD's Assistant Secretary for Administration/CIO stated that the 
department was in agreement with the recommendations in this report. 
HUD's written comments are reproduced in appendix IX.

* Interior's Acting Assistant Secretary for Policy, Management and 
Budget stated that the recommendations in our report would further 
improve the department's IT investment management. Interior's written 
comments are reproduced in appendix X.

* Justice's CIO stated that, overall, the department concurred with the 
findings and recommendations in this report, noting that our 
recommendations will assist in further defining IT strategic planning, 
performance measurement, and investment management practices. 
Justice's written comments, along with our response, are reproduced in 
appendix XI.

* Labor's Assistant Secretary for Administration and Management/CIO 
reported that the department generally concurred with this report and 
provided suggested changes in two areas, which we addressed, as 
appropriate. Labor's written comments, along with our responses, are 
reproduced in appendix XII.

* NASA's Deputy Administrator reported that the agency generally 
concurred with the recommendations in this report and provided 
additional information on actions that it was taking, or planned to 
take, to address these recommendations. NASA's written comments, along 
with our response, are reproduced in appendix XIII.

* NSF's CIO provided e-mail comments disagreeing with three areas of 
this report. First, NSF did not agree with our assessment of practice 
1.1, stating that the agency has a comprehensive agency-level planning 
framework that includes a suite of planning documents and internal and 
external oversight activities that it believes addresses IT planning 
requirements. However, our review of the planning documents cited by 
NSF in its self-assessment found that they did not address the elements 
of the practice. In particular, the agency did not describe the 
responsibility and accountability for IT resources or the method that 
it uses to define program information needs and how such needs will be 
met. Moreover, in our exit conference with NSF officials, the CIO 
indicated agreement with our assessment. Since NSF provided no 
additional documentation, we did not modify the report. Second, the CIO 
disagreed with our characterization of the agency's enterprisewide 
investment management board. We modified the report to reflect the 
CIO's comments; however, we did not change our overall assessment of 
the role of the board because NSF's summary of its investment 
management process and memo establishing the CIO advisory group include 
only general statements related to the oversight of IT investments, and 
NSF provided no additional documentation demonstrating that its 
investment management board plays a role in the control and evaluation 
phases. Third, the CIO stated that NSF has established processes, 
management, and oversight controls over IT investments. However, as 
noted above, NSF provided only limited documentation on the control 
phase of its investment management process and no additional 
documentation demonstrating that its investment management board plays 
a role in the control and evaluation phases. Accordingly, we did not 
modify the report.

* NRC's Executive Director for Operations stated that this report 
provides useful information and agreed that the practices are important 
for ensuring effective use of government IT investments but had no 
specific comments. NRC's written comments are reproduced in appendix 
XIV.

* SBA's GAO liaison provided e-mail comments questioning the need for 
its enterprise investment management board to have final decision-
making authority over IT investments. Our IT investment management 
guidance states that enterprise-level IT investment boards should be 
capable of reviewing lower-level board actions and invoking final 
decision-making authority over all IT investments.[Footnote 45] In 
particular, 
if disputes or disagreements arise over decision-making jurisdiction 
about a specific IT investment project, the enterprise board must be 
able to resolve the issue. Accordingly, we did not modify the report. 
SBA also provided technical comments that we incorporated, as 
appropriate.

* SSA's Commissioner generally agreed with the recommendations in the 
report and provided comments on each recommendation that we addressed, 
as appropriate. SSA's written comments, along with our responses, are 
reproduced in appendix XV.

* State's Assistant Secretary/Chief Financial Officer stated that the 
findings in the report are consistent with discussions held with its IT 
staff and provided additional information on four practices. On the 
basis of this additional information, we modified our report, as 
appropriate. State's written comments, along with our response, are 
reproduced in appendix XVI.

* A program analyst in the Department of Transportation's Office of the 
CIO provided oral comments that were technical in nature and that we 
addressed, as appropriate.

* The Acting Director, Budget and Administrative Management in 
Treasury's Office of the CIO, provided oral comments stating that the 
department concurred with our findings and recommendations. The 
official further stated that the department recognized its shortcomings 
and was working to correct them.

* USAID's Assistant Administrator, Bureau for Management, did not 
address whether the agency agreed or disagreed with our overall 
findings or recommendations but commented on our evaluation of two 
practices, which we addressed, as appropriate. USAID's written 
comments, along with our response, are reproduced in appendix XVII.

* The Secretary of VA stated that the department concurred with the 
recommendations in the report and provided comments on actions that it 
has taken, or planned to take, in response. We modified the report 
based on these comments, as appropriate. VA's written comments, along 
with our responses, are reproduced in appendix XVIII.

As agreed with your offices, unless you publicly announce the contents 
of this report earlier, we plan no further distribution until 30 days 
from the report date. At that time, we will send copies of this report 
to the secretaries of the Departments of Agriculture, the Air Force, 
the Army, Commerce, Defense, Education, Energy, Health and Human 
Services, Homeland Security, Housing and Urban Development, the 
Interior, Justice, Labor, the Navy, State, Transportation, the 
Treasury, and Veterans Affairs; the administrators of the Environmental 
Protection Agency, General Services Administration, National 
Aeronautics and Space Administration, Small Business Administration, 
and U.S. Agency for International Development; the commissioners of the 
Nuclear Regulatory Commission and the Social Security Administration; 
and the directors of the National Science Foundation, Office of 
Management and Budget, and Office of Personnel Management. We will also 
make copies available to others upon request. In addition, this report 
will be available at no charge on the GAO Web site at [Hyperlink, 
http://www.gao.gov].

If you have any questions on matters discussed in this report, please 
contact me at (202) 512-9286 or Linda J. Lambert, Assistant Director, 
at (202) 512-9556. We can also be reached by e-mail at [Hyperlink, 
pownerd@gao.gov] and [Hyperlink, lambertl@gao.gov], respectively.

Other contacts and key contributors to this report are listed in 
appendix XIX.

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues:

[End of section]

Appendixes: 

Appendix I: Recommendations to Departments and Agencies:

Agriculture:

To improve the department's information technology (IT) strategic 
planning/performance measurement processes, we recommend that the 
Secretary of Agriculture take the following six actions:

* document the department's IT strategic management processes and how 
they are integrated with other major departmental processes, such as 
the budget and human resources management;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by the Federal Information Security Management Act 
(FISMA) and include a description of major IT acquisitions contained in 
its capital asset plan that bear significantly on its performance 
goals;

* implement a process for assigning roles and responsibilities for 
achieving the department's IT goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its information resources 
management (IRM) plan; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Agriculture take the following four 
actions:

* include a description of the relationship between the IT investment 
management process and the department's enterprise architecture in its 
IT capital planning and investment control guide and require that IT 
investments be in compliance with the agency's enterprise architecture;

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments, including cross-cutting investments;

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of commercial-off-
the-shelf (COTS) software; and:

* establish a policy requiring modularized IT investments.

Air Force:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of the Air Force 
take the following two actions:

* establish a documented process for measuring progress against the 
department's IT goals and assign roles and responsibilities for 
achieving these goals; and:

* develop IT performance measures related to the IT goals in the 
department's information strategy, including measures such as those 
contained in practice 1.9 in our report, and track actual-versus-
expected performance.

To improve the department's IT investment management processes, we 
recommend that the Secretary of the Air Force take the following four 
actions:

* include a description of the relationship between the IT investment 
management process and the department's enterprise architecture, and an 
identification of external and environmental factors in its portfolio 
management guide;

* include costs, benefits, schedule, and risk elements as well as 
measures such as net benefits, net risks, and risk-adjusted return-on-
investment in the department's project selection criteria;

* implement a scoring model and develop a prioritized list of IT 
investments as part of its project selection process; and:

* document the role, responsibility, and authority of its IT investment 
management boards, including work processes, alignment, and 
coordination of decision making among its various boards, and document 
processes for controlling and evaluating IT investments, such as those 
outlined in practices 2.15, 2.16, 2.17, and 2.18.

Army:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of the Army take 
the following action:

* complete the development of IT performance measures related to the 
Army's enterprisewide IT goals, including measures such as those in 
practice 1.9 in our report, and track actual-versus-expected 
performance.

To improve the department's IT investment management processes, we 
recommend that the Secretary of the Army take the following four 
actions:

* include a description of the relationship between the IT investment 
management process and the department's enterprise architecture in the 
department's IT capital planning and investment control guide;

* document the alignment and coordination of responsibilities of its 
various IT investment management boards for decision making related to 
IT investments;

* include costs, benefits, schedule, and risk elements as well as 
measures such as net benefits, net risks, and risk-adjusted return-on-
investment in the department's project selection criteria; and:

* involve the department's IT investment management boards in 
controlling and evaluating IT investments, including the development 
and documentation of oversight processes such as those in practices 
2.15, 2.16, 2.17, and 2.18.

Commerce:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Commerce take 
the following four actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* document its process of developing IT goals in support of agency 
needs, measuring progress against these goals, and assigning roles and 
responsibilities for achieving these goals;

* develop performance measures related to the department's IT goals in 
its IRM plan, and track actual-versus-expected performance for these IT 
performance measures; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Commerce take the following eight 
actions:

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments;

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* include net risks and risk-adjusted return-on-investment in the 
department's project selection criteria;

* establish a policy requiring modularized IT investments;

* develop decision-making rules to help guide the investment management 
board's oversight of IT investments during the control phase;

* require that reports of deviations in systems capability in a project 
be submitted to the IT investment management board;

* develop an early warning mechanism that enables the investment 
management board to take corrective action at the first sign of cost, 
schedule, or performance slippages; and:

* require that postimplementation reviews be completed and the results 
reported to its investment management board.

Defense:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Defense take 
the following three actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA, align its performance measures with the goals 
in the plan, and include a description of major IT acquisitions 
contained in its capital asset plan that bear significantly on its 
performance goals;

* establish a documented process for measuring progress against the 
department's IT goals; and:

* develop IT performance measures related to its IT goals, including, 
for example, the measures contained in practice 1.9 in our report, and 
track actual-versus-expected performance.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Defense take the following action:

* document, as part of its planned IT portfolio management process, how 
this process relates to other departmental processes and the 
department's enterprise architecture, and document the external and 
environmental factors that influence the process.

Education:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Education 
take the following four actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* establish and document a process for measuring progress against the 
department's IT goals in its IRM plan and for assigning roles and 
responsibilities for achieving these goals;

* develop performance measures related to how IT contributes to program 
productivity, the effectiveness and efficiency of agency operations, 
and the effectiveness of controls to prevent software piracy; and:

* track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its IRM plan.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Education take the following five 
actions:

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments;

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs;

* include net risks and risk-adjusted return-on-investment in the 
department's project selection criteria;

* develop a process to use independent verification and validation 
reviews, when appropriate; and:

* track the resolution of corrective actions for under-performing 
projects and report the results to the investment management board.

Energy:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Energy take 
the following six actions:

* document how its IT management operations and decisions are 
integrated with human resources management;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a goal related to how IT contributes to program productivity;

* develop performance measures related to how IT contributes to program 
productivity and the effectiveness of controls to prevent software 
piracy;

* develop and link performance measures to the department's 
enterprisewide goals in its IRM plan and track actual-versus-expected 
performance for these measures; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Energy take the following four actions:

* include interfaces in its inventory of the agency's major information 
systems, implement a standard, documented procedure to maintain this 
inventory, and develop a mechanism to use the inventory as part of 
managerial decision making;

* prioritize the department's IT proposals;

* establish a policy requiring modularized IT investments; and:

* document the role, responsibility, and authority of its IT investment 
management boards, including work processes, alignment, and 
coordination of decision making among its various boards, and document 
the processes for controlling and evaluating IT investments, such as 
those in practices 2.15, 2.16, 2.17, and 2.18.

Environmental Protection Agency:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Administrator of the Environmental 
Protection Agency take the following six actions:

* document the agency's IT strategic management processes and how they 
are integrated with other major departmental processes, such as the 
budget and human resources management;

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for the agency's measures 
associated with the IT goals in its IRM plan; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Administrator of the Environmental Protection Agency 
take the following three actions:

* include net risks, risk-adjusted return-on-investment, and 
qualitative criteria in the agency's project selection criteria;

* establish a policy requiring modularized IT investments; and:

* fully implement an IT investment management control phase, including 
the elements contained in practices 2.15, 2.16, and 2.17.

General Services Administration:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Administrator of the General Services 
Administration take the following four actions:

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for each of the agency's 
measures associated with the IT goals in its IRM plan; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Administrator of the General Services Administration 
take the following four actions:

* develop work processes and decision-making processes for the agency's 
investment management boards;

* establish a policy requiring modularized IT investments;

* help guide the oversight of IT investments by developing clear 
decision-making rules for its IT investment management board and by 
requiring that IT projects report on deviations in system capability; 
and:

* track the resolution of corrective actions for under-performing 
projects and report the results to the investment management board.

Health and Human Services:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Health and 
Human Services take the following six actions:

* document the department's IT strategic management processes and how 
they are integrated with its budget processes;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA and include a description of major IT 
acquisitions contained in its capital asset plan that bear 
significantly on its performance goals;

* establish a documented process for measuring progress against the 
department's IT goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for its enterprisewide IT 
performance measures in its IRM plan; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Health and Human Services take the 
following 10 actions:

* revise the department's IT investment management policy to include 
(1) how this process relates to other agency processes, (2) an 
identification of external and environmental factors, (3) a description 
of the relationship between the process and the department's enterprise 
architecture, and (4) the use of independent verification and 
validation reviews, when appropriate;

* develop procedures for the department's enterprisewide investment 
management board to document and review IT investments;

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments;

* implement a standard, documented procedure to maintain the 
department's inventory of major information systems and develop a 
mechanism to use the inventory as part of managerial decision making;

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness;

* implement a structured IT selection process that includes processes 
and criteria such as those in practices 2.12 and 2.13;

* develop decision-making rules to help guide the investment management 
board's oversight of IT investments during the control phase;

* require the investment management board to review projects at major 
milestones;

* track the resolution of corrective actions for under-performing 
projects and report the results to the investment management board; 
and:

* revise the department's investment management policy to require 
postimplementation reviews to address validating benefits and costs, 
and conduct such reviews.

Housing and Urban Development:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Housing and 
Urban Development take the following six actions:

* document the roles and responsibilities of the chief financial 
officer and program managers in IT strategic planning and how the 
department's IT management operations and decisions are integrated with 
human resources management;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to how IT contributes to program 
productivity and the effectiveness of controls to prevent software 
piracy;

* track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its IRM plan; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Housing and Urban Development take the 
following five actions:

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* include net risks and risk-adjusted return-on-investment in the 
department's project selection criteria;

* establish a policy requiring modularized IT investments;

* require IT projects to report on deviations in system capability and 
monitor IT projects at key milestones; and:

* develop a process to use independent verification and validation 
reviews, when appropriate.

Interior:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of the Interior 
take the following six actions:

* document the department's IT strategic management processes and how 
they are integrated with other major departmental processes, including 
organizational planning, budget, financial management, human resources 
management, and program decisions;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA and include a description of major IT 
acquisitions contained in its capital asset plan that bear 
significantly on its performance goals;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its IRM plan; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of the Interior take the following five 
actions:

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness;

* include cost and schedule in the department's project selection 
criteria and prioritize its IT proposals;

* establish a policy requiring modularized IT investments;

* require that corrective actions be undertaken, tracked, and reported 
to the investment management board for under-performing projects; and:

* implement an evaluation process for IT investments that addresses the 
elements of practice 2.18.

Justice:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Attorney General take the 
following six actions:

* document the department's IT strategic management processes;

* document how the department's IT management operations and decisions 
are integrated with human resources management processes;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to the department's IT goals in 
its IRM plan, and track actual-versus-expected performance for these IT 
performance measures; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Attorney General take the following five actions:

* develop work processes and procedures for the department's investment 
management boards, including aligning and coordinating IT investment 
decision making among its various boards;

* establish a policy requiring that IT investments be in compliance 
with the agency's enterprise architecture;

* include net risks and risk-adjusted return-on-investment in the 
department's project selection criteria;

* implement a scoring model and develop a prioritized list of 
investments as part of the department's project selection process; and:

* require that corrective actions be undertaken, tracked, and reported 
to the investment management board for under-performing projects.

Labor:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Labor take 
the following five actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop a goal related to how IT contributes to program productivity;

* develop performance measures related to how IT contributes to program 
productivity, efficiency, and the effectiveness of controls to prevent 
software piracy, and track actual-versus-expected performance; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Labor take the following five actions:

* include a description of the relationship between the IT investment 
management process and the department's enterprise architecture in the 
department's IT capital planning and investment control guide;

* include net risks and risk-adjusted return-on-investment in its 
project selection criteria;

* establish a policy requiring modularized IT investments;

* develop decision-making rules to help guide the investment management 
board's oversight of IT investments during the control phase; and:

* develop an early warning mechanism that enables the investment 
management board to take corrective action at the first sign of cost, 
schedule, or performance slippages.

National Aeronautics and Space Administration:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Administrator of the National 
Aeronautics and Space Administration take the following seven actions:

* document the agency's IT strategic management processes;

* document how the agency's IT management operations and decisions are 
integrated with human resources management processes;

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for the agency's 
enterprisewide IT performance measures in its IRM plan; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Administrator of the National Aeronautics and Space 
Administration take the following four actions:

* revise the agency's IT investment management policy and guidance to 
describe the relationship of this process to the agency's enterprise 
architecture;

* include interfaces in its inventory of the agency's major information 
systems, implement a standard, documented procedure to maintain this 
inventory, and develop a mechanism to use the inventory as part of 
managerial decision making;

* within the agency's IT investment selection process, implement a 
mechanism to identify possible conflicting, overlapping, strategically 
unlinked, or redundant proposals; implement a scoring model; and 
develop a prioritized list of investments; and:

* document the role, responsibility, and authority of its IT investment 
management boards, including work processes, alignment, and 
coordination of decision making among its various boards, and document 
the processes for controlling and evaluating IT investments, such as 
those in practices 2.15, 2.16, 2.17, and 2.18.

National Science Foundation:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Director of the National Science 
Foundation take the following five actions:

* document the agency's IT strategic management processes;

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* implement a process for assigning roles and responsibilities for 
achieving its IT goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Director of the National Science Foundation take the 
following four actions:

* develop an IT investment management guide that includes a description 
of the relationship between the IT investment management process and 
the agency's other organizational plans and processes and its 
enterprise architecture, and identify external and environmental 
factors that influence the process in the agency's IT capital planning 
and investment control policy;

* implement a structured IT selection process that includes the 
elements of practices 2.12 and 2.13;

* involve the agency's IT investment management board in 
controlling and evaluating IT investments, including the development 
and documentation of oversight processes such as those in practices 
2.15, 2.16, 2.17, and 2.18; and:

* define and document the elements of the agency's postimplementation 
reviews.

Navy:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of the Navy take 
the following three actions:

* develop a documented process to measure progress against the 
department's enterprisewide IT goals and assign roles and 
responsibilities for achieving these goals;

* develop an IT goal related to service delivery to the public; and:

* develop IT performance measures related to the department's IT goals, 
including, at a minimum, measures contained in practice 1.9 in our 
report, and track actual-versus-expected performance.

To improve the department's IT investment management processes, we 
recommend that the Secretary of the Navy take the following four 
actions:

* include net risks and risk-adjusted return-on-investment in the 
department's project selection criteria;

* implement a structured IT selection process that includes the 
elements of practice 2.13;

* involve all elements of the department's IT investment management 
board governance process in selecting, controlling, and evaluating IT 
investments; and:

* document the role, responsibility, and authority of its IT investment 
management boards, including work processes, alignment, and 
coordination of decision making among its various boards, and document 
the processes for controlling and evaluating IT investments, such as 
those outlined in practices 2.15, 2.16, 2.17, and 2.18.

Nuclear Regulatory Commission:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Commissioner of the Nuclear Regulatory 
Commission take the following five actions:

* document the agency's roles and responsibilities for its IT strategic 
management processes and how IT planning is integrated with its budget 
and human resources planning;

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to assign roles and responsibilities for 
achieving its enterprisewide IT goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy; and:

* develop performance measures for the agency's enterprisewide goals in 
its IRM plan, and track actual-versus-expected performance for these 
measures.

To improve the agency's IT investment management processes, we 
recommend that the Commissioner of the Nuclear Regulatory Commission 
take the following five actions:

* include a description of the relationship between the IT investment 
management process and the agency's other organizational plans and 
processes and its enterprise architecture, and identify external and 
environmental factors that influence the process in the agency's IT 
capital planning and investment control policy;

* develop work processes and procedures for the agency's investment 
management boards;

* implement a standard, documented procedure to maintain its IT asset 
inventory, and develop a mechanism to use the inventory as part of 
managerial decision making;

* develop a structured IT investment management selection process that 
includes project selection criteria, a scoring model, and 
prioritization of proposed investments; and:

* document the role, responsibility, and authority of its IT investment 
management boards, including work processes, and document control and 
evaluation processes that address the oversight of IT investments, such 
as those outlined in practices 2.15, 2.16, 2.17, and 2.18.

Office of Personnel Management:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Director of the Office of Personnel 
Management take the following four actions:

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* track actual-versus-expected performance for the agency's 
enterprisewide IT performance measures in its IRM plan; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Director of the Office of Personnel Management take 
the following four actions:

* develop work processes and procedures for the agency's investment 
management board, including establishing criteria for defining major 
systems and documenting a process for handling cross-functional 
investments;

* implement a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* establish a policy requiring modularized IT investments; and:

* require that corrective actions be undertaken, tracked, and reported 
to the investment management board for under-performing projects.

Small Business Administration:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Administrator of the Small Business 
Administration take the following five actions:

* document the agency's IT strategic management processes;

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a documented process to develop IT goals in support of agency 
needs, measure progress against these goals, and assign roles and 
responsibilities for achieving these goals;

* develop performance measures related to the agency's IT goals in its 
IRM plan, including, at a minimum, measures related to how IT 
contributes to program productivity, efficiency, effectiveness, the 
overall performance of its IT programs, and the effectiveness of 
controls to prevent software piracy, and track actual-versus-expected 
performance for these IT performance measures; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Administrator of the Small Business Administration 
take the following two actions:

* document a process by which the investment management board can 
invoke final decision-making authority over IT investments addressed by 
lower-level boards; and:

* implement a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs.

Social Security Administration:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Commissioner of the Social Security 
Administration take the following three actions:

* include in its annual performance plan the resources and time periods 
required to implement the information security program plan required by 
FISMA;

* develop performance measures related to the performance of the 
agency's IT programs and the effectiveness of controls to prevent 
software piracy; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Commissioner of the Social Security Administration 
take the following four actions:

* develop work processes and procedures for the agency's investment 
management board;

* establish a policy requiring modularized IT investments;

* document the role, responsibility, and authority of its IT investment 
management board for the oversight of IT investments, such as the 
processes outlined in practices 2.15, 2.16, and 2.18; and:

* require that corrective actions be tracked and reported to the 
investment management board for under-performing projects.

State:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of State take 
the following two actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of State take the following five actions:

* implement a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* establish a policy requiring modularized IT investments;

* include risk-adjusted return-on-investment in the department's 
project selection criteria;

* revise the department's draft IT investment management policy to 
include reviewing projects at major milestones; and:

* fully implement an IT investment management control phase, including 
the elements contained in practices 2.16 and 2.17.

Transportation:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of 
Transportation take the following five actions:

* document its IT strategic planning process;

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop a goal related to how IT contributes to program productivity;

* develop performance measures related to the department's IT goals in 
its IRM plan, and track actual-versus-expected performance for these IT 
performance measures; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Transportation take the following six 
actions:

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments;

* implement a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* prioritize the department's IT proposals;

* establish a policy requiring modularized IT investments;

* develop and document decision-making rules to help guide the 
investment management board's oversight of IT investments during the 
control phase; and:

* as part of the department's control phase, employ an early warning 
mechanism, and use independent verification and validation reviews, 
when appropriate.

Treasury:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of the Treasury 
take the following four actions:

* include in the department's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy;

* develop performance measures related to the department's IT goals in 
its IRM plan, and track actual-versus-expected performance for these IT 
performance measures; and:

* develop a mechanism for benchmarking the department's IT management 
processes, when appropriate.

To improve the department's IT investment management processes, we 
recommend that the Secretary of the Treasury take the following eight 
actions:

* develop a capital planning and investment control guide that 
includes, for example, the elements of practice 2.1;

* develop work processes and procedures for the agency's IT investment 
management board, and document the alignment and coordination of 
responsibilities of its various boards for decision making related to 
investments, including the criteria for which investments--including 
cross-cutting investments--will be reviewed by the enterprisewide 
board;

* use the department's IT asset inventory as part of managerial 
decision making, including using it to identify the potential for asset 
duplication;

* establish a policy requiring that proposed IT investments support 
work processes that have been simplified or redesigned to reduce costs 
and improve effectiveness and that make maximum use of COTS software;

* implement a structured IT selection process that includes the 
elements of practices 2.12 and 2.13;

* establish a policy requiring modularized IT investments;

* implement an IT investment management process that includes a control 
phase that addresses, for example, the elements of practices 2.15, 
2.16, and 2.17; and:

* implement an IT investment management process that includes an 
evaluation phase that addresses, for example, the elements of practice 
2.18.

U.S. Agency for International Development:

To improve the agency's IT strategic planning/performance measurement 
processes, we recommend that the Administrator of the U.S. Agency for 
International Development take the following two actions:

* include in the agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA; and:

* develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

To improve the agency's IT investment management processes, we 
recommend that the Administrator of the U.S. Agency for International 
Development take the following nine actions:

* develop work processes and procedures for the agency's IT investment 
management board;

* establish a policy requiring that IT investments be in compliance 
with the agency's enterprise architecture;

* develop a policy requiring that proposed IT investments support work 
processes that have been simplified or redesigned to reduce costs and 
improve effectiveness and that make maximum use of COTS software;

* include net risks, risk-adjusted return-on-investment, and 
qualitative criteria in the agency's project selection criteria;

* within the agency's IT investment selection process, implement a 
mechanism to identify possible conflicting, overlapping, strategically 
unlinked, or redundant proposals;

* develop a policy requiring modularized IT investments;

* develop decision-making rules, review projects at major milestones, 
and require projects to report on deviations in system capability to 
help guide the oversight of IT investments by the agency's investment 
management board during the control phase;

* as part of the agency's control phase, employ an early warning 
mechanism, and use independent verification and validation reviews, 
when appropriate; and:

* require that corrective actions be undertaken, tracked, and reported 
to the investment management board for under-performing projects.

Veterans Affairs:

To improve the department's IT strategic planning/performance 
measurement processes, we recommend that the Secretary of Veterans 
Affairs take the following four actions:

* include in the department's annual performance plan the resources 
required to implement the information security program plan required by 
FISMA;

* develop a documented process to measure progress against the 
department's IT goals, and assign roles and responsibilities for 
achieving these goals;

* develop performance measures related to the effectiveness of controls 
to prevent software piracy; and:

* track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its IRM plan.

To improve the department's IT investment management processes, we 
recommend that the Secretary of Veterans Affairs take the following two 
actions:

* document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments, including cross-cutting investments; 
and:

* within the agency's IT investment selection process, implement a 
mechanism to identify possible conflicting, overlapping, strategically 
unlinked, or redundant proposals, and prioritize its IT investments.

[End of section]

Appendix II: Comments from the Department of Agriculture:

USDA:

United States Department of Agriculture:

Office of the Chief Information Officer:

1400 Independence Avenue SW:

Washington, DC 20250:

DEC 3 2003:

David A. Powner, Director:

Information Technology Management Issues:

General Accounting Office:

Dear Mr. Powner:

The United States Department of Agriculture (USDA) has reviewed draft 
report number GAO-04-49 entitled "Information Technology Management - 
Government-wide Strategic Planning, Performance Measurement, and 
Investment Management Can Be Further Improved" and is in agreement with 
the findings. USDA is fully committed to meeting the Federal 
information technology challenges that are outlined in the report.

USDA is currently taking steps to implement the recommendations made in 
the report. For example, we are in the process of updating our Capital 
Planning and Investment Control guidance to reflect the improvements 
suggested by the GAO. Additional measures are being put in place to 1) 
improve the Department's IT strategic management/performance 
measurement processes; and 2) improve the Department's information 
technology (IT) investment management processes. We currently estimate 
that the six strategic management/performance measurement 
recommendations will be fully implemented by the end of fiscal year 
2005 and the four investment management recommendations will be fully 
implemented by the end of fiscal year 2004.

Thank you for the opportunity to review and provide feedback on the 
draft report. If additional information is needed, please contact Dr. 
Gregory Parham of my staff on (202) 720-5865.

Sincerely,

Signed for:

Scott Charbo:

Chief Information Officer:

[End of section]

Appendix III: Comments from the Department of Commerce:

THE SECRETARY OF COMMERCE:

Washington, D.C. 20230:

December 11, 2003:

Mr. David A. Powner:

Director, Information Technology Management Issues:

United States General Accounting Office:

Washington, DC 20548:

Dear Mr. Powner:

Thank you for the opportunity to comment on the GAO draft report 
"Information Technology Management: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved." We appreciate the thoroughness of your review and believe 
that our information technology strategic planning and capital 
investment processes will benefit from your insights. We concur with 
the report's recommendations, and are updating the documentation for 
our relevant policies and procedures.

Sincerely,

Signed by:

Donald L. Evans:

[End of section]

Appendix IV: Comments from the Department of Defense (including 
comments from the Departments of the Air Force, Army, and Navy):

[See PDF for image]

[End of figure]

The following are GAO's comments on the Department of Defense's (DOD) 
letter dated December 5, 2003.

GAO Comments:

1. DOD provided its annual report to the President and the Congress, 
which included its fiscal year 2004 performance plan. Based on a review 
of this plan, we modified our report.

2. We disagree that the cited objective fully addresses this issue. 
Specifically, although this objective addresses e-government, the 
wording of the objective, its description, and the discussion of 
related initiatives do not explicitly address service delivery to the 
public. Accordingly, we did not modify our report.

3. Our review of the acquisition management process documentation 
provided by the Navy did not support that the department's selection 
criteria include net risks and risk-adjusted return-on-investment. 
Accordingly, we did not modify our report.

[End of section]

Appendix V: Comments from the Department of Education:

UNITED STATES DEPARTMENT OF EDUCATION:

OFFICE OF MANAGEMENT:

ASSISTANT SECRETARY:

December 10, 2003:

To: David A. Powner:

Director, Information Technology Management Issues:

From: William Leidinger:

Assistant Secretary for Management and Chief Information Officer:

Signed by: William Leidinger:

Re: Draft Report: Information Technology Management:

Governmentwide Strategic Planning, Performance Measurement, and Investment 
Management Can Be Further Improved:

We thank you for the opportunity to review the Information Technology 
Management Report. In general, we agree with your assessment of our 
implementation of the various strategic planning and performance 
measurement practices as well as our investment management practices.

We have taken the position that IRM planning is subsumed in the 
Department's Strategic and Annual Plans (Goal 6.3), and occurs in the 
development and maintenance of the enterprise architecture. Our 
architecture is business driven and our IT investments are aligned with 
the architecture. We believe this approach meets the intent of both the 
Paperwork Reduction Act and the Clinger-Cohen Act.

You specifically recommended that we develop measures related to how IT 
contributes to program productivity and the effectiveness and 
efficiency of agency operations. Currently, we require all of our major 
and significant IT investments to collect and track against business 
process performance measures, IT performance measures, and customer-
based performance measures. We assess progress, which is reviewed by the 
Department's senior management quarterly. We believe that this approach 
satisfies the requirement that we develop the measures you stipulate.

Finally, specifically regarding IT Management practices 2.5 and 2.6 on 
pages 46 and 47, the Department does have a repeatable process for 
developing and maintaining an inventory of its major information 
systems. The Department has documented this procedure in Handbook OCIO-
09, Handbook for Information Technology Security General Support 
Systems and Major Applications Inventory Procedures. This process is 
formally completed twice a year and includes documented Critical 
Infrastructure Protection questionnaires and Systems Inventory forms to 
support the contents of the Department's IT systems inventory.

The following are GAO's comments on the Department of Education's 
letter dated December 10, 2003.

GAO Comments:

1. We agree that Education requires IT investments to have performance 
measures. However, our practice dealt with enterprise-level measures, 
such as those found in the department's IRM plan, not project-specific 
measures. Education reported that the performance measures in its IRM 
plan do not measure how IT contributes to program productivity and the 
efficiency and effectiveness of agency operations. Accordingly, we did 
not modify our report.

2. We modified our assessment of practice 2.6 in this report and 
deleted the related recommendation based on our evaluation of 
additional documentation provided by Education.

[End of section]

Appendix VI: Comments from the Environmental Protection Agency:

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY:

WASHINGTON, D.C. 20460:

DEC 9 2003:

OFFICE OF ENVIRONMENTAL INFORMATION:

Mr. David Powner:

Director, Information Technology Management Issues:

U.S. General Accounting Office:

441 G Street, NW Washington, D.C. 20548:

Dear Mr. Powner:

Thank you for the opportunity to respond to the U.S. General Accounting 
Office (GAO) draft Report to Congressional Requesters "Information 
Technology Management: Government-wide Strategic Planning, Performance 
Measurement, and Investment Management Can be Further Improved" (GAO-
04-49). The U.S. Environmental Protection Agency (EPA) is pleased to 
note that the reviewers acknowledge that the Agency has established and 
employs best-practices for IT management for the majority of key 
dimensions of the report. We accept the reviewers' findings and 
recommendations that in some cases the Agency does need to complete 
work currently underway to formalize the documentation of those 
practices. We expect to promulgate a final version of new Agency-wide 
official procedures in the near future.

At the same time, EPA believes that the report does not adequately 
reflect both key dimensions of EPA's IT management and strategic 
planning, and current best-practices in general. The draft report 
includes reference to the importance of enterprise architectures 
("Constructing and enforcing sound enterprise architectures," p. 13). 
However, the sections regarding IRM strategic planning do not reflect 
how EPA's Enterprise Architecture (EA) constitutes the multi year, 
business driven, enterprise-wide comprehensive IRM strategic plan. The 
EA should, and for EPA does, derive first from strategic, line of 
business, performance and outcome-based needs. Those needs tie to the 
comprehensive range of IT assets (applications, data, services, 
hardware, software, security) from a strategic perspective (the multi 
year target), leading to the business-based priorities to move toward 
those goals (baseline and sequencing plans).

As Chief Information Officer, I govern EPA's EA and the IT Capital 
Planning and Investment Control (CPIC) process to implement the 
architecture. This occurs through a formal integrated process in 
partnership with Agency senior executives, including the Chief 
Financial Officer. EPA requests that the reviewers reassess this 
dimension of their findings to acknowledge how EPA fulfills strategic 
planning mandates via our implementation of enterprise architecture and 
IT investment management practices. This also affects the 
recommendations regarding tracking actual-versus-expected performance 
as well as including net risks in project selection criteria and 
modularized IT investments which are all part of our EA and CPIC 
practice.

In addition, for Practice 1.7 ("The agency has a documented process to: 
develop IT goals in support of agency needs, measure progress against 
these goals, and assign roles and responsibilities for achieving these 
goals," p. 32) the reviewers state EPA "does not have this practice in 
place." As noted above, EPA has a very rigorous process to accomplish 
this through the integration of our EA and CPIC processes. The finding 
should be corrected to include EPA in the "partially" category to 
acknowledge we do have the practice in place and are working on 
documentation.

Again, thank you for the opportunity to respond to this important 
report. EPA is proud of the successful implementation of many of the 
best practices in IT investment and infrastructure. If you have any 
questions relating to this information, please contact Steve Tiber, 
EPA-GAO Liaison, at 202-564-5184.

Sincerely,

Kimberly T. Nelson:

Assistant Administrator and Chief Information Officer:

Signed by: Kimberly T. Nelson:

cc: Ramona Trovato;

Michael W.S. Ryan;

Mark Day;

Maggie Mitchell:

The following are GAO's comments on the Environmental Protection 
Agency's (EPA) letter dated December 9, 2003.

GAO Comments: 

1. As we reported and EPA acknowledged, its documentation on IT 
strategic planning and investment management was not complete or 
finalized. For example, the partial rating we gave EPA for its IT 
management and strategic planning practices--practices 1.1 and 1.2--
matched the agency's own self-assessment in these areas. Specifically, 
our review of planning documents cited by EPA in its self-assessment 
found that while the agency had documented agencywide roles and 
responsibilities for planning and managing IT resources and had 
documented its process to integrate the IT investment management 
process with the budget, EPA had not addressed other key elements of 
the practices. As an example, EPA had not fully documented the method 
by which it defines program information needs and develops strategies, 
systems, and capabilities to meet those needs. Since EPA provided no 
additional documentation, our practice assessment and our related 
recommendations remain unchanged.

2. As stated in our report, practice 1.7 refers to the documentation of 
the process used to develop IT goals and measures and the 
responsibility for achieving them. As EPA states in its comments, it is 
currently working on documenting this process. Accordingly, we did not 
modify our report.

[End of section]

Appendix VII: Comments from the General Services Administration:

GSA:

GSA Office of the Chief Information Officer:

December 9, 2003:

The Honorable David M. Walker:

Comptroller General of the United States:

General Accounting Office:

Washington, DC 20548:

Dear Mr. Walker:

This is to provide comments on the General Accounting Office (GAO) 
findings and recommendations included in the Governmentwide Strategic 
Planning, Performance Measurement, and Investment Management Practices 
draft report. We generally agree with the findings and recommendations 
and are pleased that General Services Administration (GSA) was 
identified as fully meeting 21 of the 29 practice areas included in the 
report. There were eight (8) areas identified as GSA partially meeting 
and one (1) as GSA not meeting. We are concentrating our comments on 
the areas identified as partially or not meeting where we feel 
additional information or clarification needs to be provided to GAO.

We appreciate the opportunity to review the draft report. Should you 
require additional information, please contact Ms. L. Diane Savoy, 
Director, Office of Policy and Plans, at (202) 501-3535.

Sincerely,

Signed by:

Michael W. Carleton:

Chief Information Officer:

Enclosures:

GAO GOVERNMENTWIDE STRATEGIC PLANNING, PERFORMANCE MEASUREMENT, AND 
INVESTMENT MANAGEMENT REVIEW:

Table 1: IT Strategic Planning/Performance Measurement Practices:

Practice 1.4: The agency has a process that involves the CFO or 
comparable official to develop and maintain a full and accurate 
accounting of IT-related expenditures, expenses, and results.

GAO Assessment: Partially - GSA has reported that not all costs may 
have been captured.

GSA OCIO Response:

To the best of our knowledge, all IT costs are captured in the GSA IT 
Capital Plan and submitted with the agency budget annually. We have 
been and continue to be very diligent about ensuring that all IT 
investments, agency-wide, are identified and captured in our IT Capital 
Plan.

Practice 1.5: The agency prepares an enterprisewide strategic 
information resources management (IRM) plan that, at a minimum:

* describes how IT activities will be used to help accomplish agency 
missions and operations, including related resources;

* identifies major IT acquisition program(s) or any phase or increment 
of that program that has significantly deviated from the cost, 
performance, or schedule goals established for the program.

GAO Assessment: Partially-GSA ... IRM plan(s) do not include resources 
and major IT acquisition programs that deviated from cost, schedule, or 
performance goals.

GSA OCIO Response:

The GSA IT Strategic Plan is a strategic document that sets the 
direction and focus of the GSA's IT program over a five-year period. It 
is not considered the appropriate place nor is it intended to track 
deviations from planned cost, performance or schedule goals. This 
information is tracked as part of the Control Phase of our IT Capital 
Planning and Investment Control process. GSA has provided GAO with 
significant documentation from our Project Summary Control database 
that currently tracks major investment deviations from planned cost, 
performance or schedule goals on a monthly basis. In addition, the 
project managers track these same items closely as they manage their 
investments day-to-day.

We request that GAO reconsider this statement and modify it to at least 
acknowledge the processes that GSA has in place to monitor the planned 
and actual cost, performance and schedule. In support of this, we are 
attaching copies of additional monitoring documents that are utilized 
and that identify deviations in cost, schedule, or performance. 
(Attachments 1 and 2):

Practice 1.6: The agency's performance plan required by GPRA includes:

* A description of how IT supports strategic and program goals:

* The resources and time periods required to implement the information 
security program plan required by the Federal Information Security 
Management Act (FISMA), and:

* A description of major IT acquisitions contained in the capital asset 
plan that will bear significantly on the achievement of a performance 
goal.

GAO Assessment: Partially-No agency's performance plan, except VA's, 
includes time periods, and none includes resources required to 
implement the information security program plan required by FISMA.

GSA OCIO Response:

This will be reviewed by and coordinated with the Office of the Chief 
Financial Officer and the Senior Agency Information Security Official, 
and corrected appropriately. We note, however, that GSA has developed 
and updates quarterly to OMB, as required, the FISMA security report and 
Plan of Actions and Milestones (POA&M), which includes specifics and 
details regarding the resources and time periods to accomplish 
necessary corrective actions.

Practice 1.9: The agency has established IT performance measures and 
monitors actual-versus-expected performance that at least addresses:

* How IT contributes to program productivity,

* How IT contributes to the efficiency of agency operations,

* How IT contributes to the effectiveness of agency operations,

* Service delivery to the public (if applicable),

* How electronic government initiatives enable progress toward agency 
goals and statutory mandates,

* The performance of IT programs (e.g., system development and 
acquisition projects), and:

* Agency compliance with federal software piracy policy.

GAO Assessment: Partially-GSA does not have measures for one of its IT 
goals and did not have performance measures on service delivery to the 
public.

GSA OCIO Response:

GSA has performance measures that map to each of the IT goals. We have 
attached a revised copy of the GSA Goals to Performance Measures 
Mapping document that was submitted earlier. (Attachment 3):

Please note that GSA's mission is to "help Federal agencies better 
serve the public by offering, at best value, superior workplaces, 
expert solutions, acquisition services and management policies." As 
clearly defined in our mission statement, GSA's primary role is in 
support of other Federal agencies. We have a lesser role in providing 
information to citizens through our Office of Citizen Services and 
Communications that was established on June 30, 2002. This Office has 
the following performance goals that are related to providing service to 
citizens:

1. Increase access to government for all audiences through the use of 
web sites, call centers, e-mail, publications, and all forms of media.

2. Develop a single government face to citizens to enable the Federal 
government to become more citizen-centric by presenting a "front door" 
for citizens who need timely, accurate, consistent responses about 
government programs.

The attached revised copy of the GSA Goals to Performance Measures 
Mapping includes the specific performance measures to accomplish these 
goals. In further support of this, we have also attached copies of the 
performance information and results from the GSA Performance 
Measurement Tool for these goals. (Attachment 4):

Practice 1.12: The agency requires that its IT management process be 
benchmarked against appropriate processes and/or organizations from the 
public and private sectors in terms of cost, speed, productivity, and 
quality of outputs and outcomes where comparable processes and 
organizations in the public or private sectors exist.

GAO Assessment: Partially - ... GSA ... provided an example of a 
process that they have benchmarked, but benchmarking is being performed 
on an ad hoc basis.

GSA OCIO Response:

GSA has identified the need for and used benchmarking when it has been 
determined that comparable processes and organizations in the public or 
private sector exist and that it will provide additional value in 
management decision-making. GSA recognizes the benefits of benchmarking 
and will continue to utilize it, as appropriate.

Table 2: IT Investment Management Practices:

Practice 2.3: The agencywide board(s) work processes and decision-
making processes are described and documented.

GAO Assessment: Partially - GSA did not have policies and procedures 
for each of its IT investment management boards.

GSA OCIO Response:

GSA was advised by GAO that, because the pending Business Systems 
Council (BSC) charter for the executive tier of our process is in 
draft, we would be rated "partially" on this practice. We are still 
planning to have the BSC Charter finalized and in place before the end 
of the year. Charters exist for all other tiers of our investment 
management boards, and overall policies exist in the GSA IT Capital 
Planning and Investment Control policy and Guide. Copies of these 
documents were provided to GAO earlier.

Practice 2.14: Agency policy calls for investments to be "modularized" 
(e.g., managed and procured in well-defined useful segments or 
"modules" that are short in duration and small in scope) to the maximum 
extent achievable.

GAO Assessment: No - GSA ... do not have this practice in place.

GSA OCIO Response:

Earlier versions of the GSA IT Capital Planning and Investment Control 
Guide, which GAO was provided, included this requirement. This 
requirement was inadvertently not included in the August 2002 version 
of the Guide, which was reviewed by GAO. The GSA Capital Planning and 
IT Investment Guide will be updated and will include modularization.

Practice 2.15: The agencywide investment management board(s) has 
written policies and procedures for management oversight of IT projects 
that cover at a minimum:

* decision-making rules for project oversight that allow for the 
termination of projects, when appropriate;

* current project data, including expected and actual cost, schedule, 
and performance data, to be provided to senior management periodically 
and at major milestones;

* criteria or thresholds related to deviations in cost, schedule, or 
system capability for actual vs. expected project performance; and:

* the generation of action plans to address a project's problem(s) 
and track resolution.

GAO Assessment: Partially - GSA does not have clear decision-making 
rules, require projects to report on deviations in system capability, 
or require that corrective actions be tracked to resolution.

GSA OCIO Response:

The GSA IT Capital Planning and Investment Control Guide identifies the 
requirements for reporting on major investments in the Control Phase of 
the IT Capital Planning and Investment Control process. The reporting 
is accomplished through the Summary Project Control database. Samples 
of reports from this database have been provided to GAO, and 
additional samples are included in Attachments 1 and 2.

The GSA IT Capital Planning and Investment Control guide requires the 
Information Technology Resources Boards (ITRBs) and the Technical 
Review Boards (TRBs), managed by sub-agency CIOs with the 
participation of the agency CIO, to closely monitor investments and to 
develop and monitor corrective actions when deviations occur. While 
this is a decentralized approach, it is fully integrated into our 
enterprise-wide process. In the upcoming revision of our IT Capital 
Planning and Investment Control Guide, we will add more details to 
further clarify this process.

Practice 2.17: Corrective actions for underperforming projects are 
agreed upon, documented, and tracked by the agencywide investment 
management board(s).

GAO Assessment: Partially - GSA ... do not systematically track 
corrective actions.

GSA OCIO Response:

As stated in reference to 2.15 above, the GSA IT Capital Planning and 
Investment Control guide requires the Information Technology Resources 
Boards (ITRBs) and the Technical Review Boards (TRBs), managed by 
sub-agency CIOs with the participation of the agency CIO, to closely 
monitor investments and to develop and monitor corrective actions when 
deviations occur.

Appendix I, Recommendations to Department and Agencies:

General Services Administration:

GAO Recommendation: To improve the agency's IT strategic management/
performance measurement processes, we recommend that the Administrator 
of the General Services Administration:

* develop performance measures for each of its IT goals in its IRM plan 
as well as measures related to how IT contributes to service delivery 
to the public and the effectiveness of controls to prevent software 
piracy:

GSA Response:

As noted in our response to Practice 1.9 and documented in attachment 
3, GSA does have performance measures for each IT goal in the IT 
Strategic Plan. In addition, the response to Practice 1.9 also 
identified measures in place to address how IT contributes to service 
delivery to the public.

We have also attached a copy of the GSA Order, 2104.1, GSA IT General 
Rules of Behavior, dated July 3, 2003 that outlines the GSA policy and 
requirements designed to prevent software piracy (Attachment 5). The 
specifics are included in Section 10, Software acceptable use. In 
addition, Section 6, Penalties for non-compliance, states that "users 
who do not comply with the IT General Rules of Behavior may incur 
disciplinary action and/or criminal prosecution.":

The GAO stated recommendation should be removed from the report.

GAO Recommendation: To improve the agency's IT investment management 
processes, we recommend that the Administrator of the General Services 
Administration:

* require corrective actions be undertaken, tracked, and reported to 
the investment management board for underperforming projects.

GSA Response:

As stated in our response to Practice 2.15 and 2.17, the GSA IT Capital 
Planning and Investment Control Guide does require corrective actions 
to be undertaken, tracked, and reported to the investment management 
board for underperforming projects. Specifically, the Guide, in Section 
8, The IT Capital Planning and Investment Control process, (b) Control 
Phase, (1). Reporting, includes the following excerpts with critical 
parts highlighted in bold to draw your attention to these sections:

The SSO CIOs will submit monthly stop light reports on all major IT 
projects that are ongoing developments, acquisitions, or enhancements.

The OCIO will use these stop light reports to monitor projects through 
their life cycle. In addition to alerting the CIO of projects that vary 
significantly from planned cost, schedule and performance estimates, 
stop light reports will allow intermittent updates of baseline 
information on the agency's overall IT investments.

For any reports which indicate a yellow or red light, SSOs must submit 
an update that indicates the change in status and the actions that are 
being accomplished to address conditions underlying the yellow or red 
light. Reports with red light changes must provide get-well plans. The 
SSO CIO's will synthesize project status data and submit summary 
control reports to the ITC and Executive Committee, as necessary.

Variances in project schedule or cost goals of 10 percent or more must 
be reported to the ITC. Any variance or slippage in actual performance 
from established goals must be included in monthly stoplight reports to 
the CIO.

In addition, projects which, based upon monthly reports or other 
indicators, consistently fail to meet requirements may be, in 
consultation with the SSO CIO, subject to a special independent review 
by an Independent Verification and Validation (IV&V) contractor or 
Governmentwide ITRB. The results of the review may be presented to the 
ITC and/or Executive Committee, as appropriate. When a TRB, ITRB, IV&V 
or Governmentwide ITRB finds a project has significant deviation from 
planned performance, cost and schedule, it may result in the project 
being modified or terminated.

The GSA IT Capital Planning and Investment Control Guide, section (d) 
Executive and technical oversight, includes the following:

(1). Executive Committee. The Executive Committee approves the IT 
Strategy and IT Capital Plan and acts on projects that significantly 
deviate from investment controls. (NOTE: This Committee is being 
replaced by the Business Systems Council being established):

The Executive Committee makes the decision for final approval of the 
investment portfolio and decisions on systems that have significant 
deviations from planned performance, cost and schedule.

The GAO stated recommendation should be removed from the report.

The following are GAO's comments on the General Services 
Administration's (GSA) letter dated December 9, 2003.

GAO Comments:

1. We based our evaluation on the agency's self-assessment and comments 
made by GSA's Director, Office of Policy and Plans. However, based on 
GSA's representation in commenting on our draft, we changed our 
evaluation of the referenced practice.

2. The Clinger-Cohen Act requires agencies to include in their 
information resources management (IRM) plan the identification of a 
major IT acquisition program(s), or any phase or increment of that 
program, that significantly deviated from cost, performance, or 
schedule goals established by the program. As we acknowledge in this 
report, agencies, which would include GSA, identified other mechanisms 
that they use to track and report cost, schedule, and performance 
deviations. Moreover, we evaluated agencies as a "partially" instead of 
a "no" in this practice to take into account that the agency had the 
required information, although it was not in the prescribed format. 
Accordingly, we did not modify our report.

3. The Federal Information Security Management Act of 2002 requires 
agencies to include in the performance plans required by the Government 
Performance and Results Act the resources and time periods to implement 
their information security program. As we noted in this report, 
agencies, which would include GSA, commonly stated that they had this 
information but that it was in another document. Nevertheless, this 
does not negate the need for having the agency report to the Congress 
in the form that it requires. This is particularly important since 
performance plans are public documents. Accordingly, we did not modify 
our report.

4. GSA's new documentation illustrates that it has performance measures 
for each of the IT goals in its IRM plan. However, GSA did not provide 
evidence that it was tracking actual versus expected performance for 
measures associated with one of its goals. We revised our report to 
reflect GSA's new documentation and our evaluation.

5. We revised our report on the basis of this new documentation.

6. GSA's highest-level IT investment management board is its Executive 
Committee. GSA did not provide a charter or any other evidence of 
policies and procedures for this committee. We therefore did not modify 
our report.

7. The additional documentation provided by GSA (1) does not address 
decision-making rules and (2) illustrates that GSA uses a monthly 
project control report on cost, schedule, and performance status, but 
the report does not explicitly address deviations in system capability. 
In addition, according to GSA's capital planning and investment control 
order, the format of the report is left to the applicable organization, 
thereby making it less likely that the investment management boards are 
obtaining consistent information. We therefore did not modify our 
report.

8. We agree that GSA's capital planning and investment control order 
requires that projects that have significant variances are to provide 
"get well" plans and that monthly control reports are used to report on 
project cost, schedule, and performance status. However, it is not 
clear that these status reports can be used to systematically track 
corrective actions. Moreover, according to GSA's capital planning and 
investment control order, the format of the monthly control report is 
left to the applicable organization, thereby making it less likely that 
the status of corrective actions is being consistently reported. We 
therefore did not modify our report.

9. See comment 8.

10. We modified our recommendations based on our evaluation of GSA's 
documentation. See comment 4 for our assessment.

11. Executive Order 13103 requires agencies to use software piracy 
performance measures that comply with guidance issued by the federal 
CIO Council.[Footnote 46] The Council, in turn, called on the agencies 
to develop such measures. The additional documentation that GSA 
provided was an order requiring agency employees to use properly 
licensed software, but it does not include performance measures that 
would demonstrate that this requirement is being honored. Measuring how 
well agencies are combating software piracy is important because it can 
verify that the controls that they have put in place are working. 
Accordingly, we did not change this part of the recommendation.

12. We modified our recommendation to reflect that GSA requires 
projects that have significant variances to develop corrective action 
plans. However, the other elements of the recommendation pertaining to 
the tracking and reporting on corrective actions remain outstanding. 
See comment 8 for additional information.

[End of section]

Appendix VIII: Comments from the Department of Health and Human 
Services:

DEPARTMENT OF HEALTH & HUMAN SERVICES:

Office of Inspector General:

Washington, D.C. 20201:

DEC 15 2003:

Mr. David A. Powner:

Director, Information Technology Management Issues:

United States General Accounting Office:

Washington, D.C. 20548:

Dear Mr. Powner:

Enclosed are the Department's comments on your draft report entitled, 
"Information Technology Management: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved." The comments represent the tentative position of the 
Department and are subject to reevaluation when the final version of 
this report is received.

The Department appreciates the opportunity to comment on this draft 
report before its publication.

Sincerely,

Signed by:

Dara Corrigan:

Acting Principal Deputy Inspector General:

Enclosure:

The Office of Inspector General (OIG) is transmitting the Department's 
response to this draft report in our capacity as the Department's 
designated focal point and coordinator for General Accounting Office 
reports. OIG has not conducted an independent assessment of these 
comments and therefore expresses no opinion on them.

Comments of the Department of Health and Human Services on the General 
Accounting Office's Draft Report, "Information Technology Management: 
Governmentwide Strategic Planning, Performance Measurement, and 
Investment Management Can Be Further Improved" (GAO-04-49):

The Department of Health and Human Services (Department) appreciates 
the opportunity to comment on this draft report. The Department concurs 
with the findings and recommendations contained in the report.

[End of section]

Appendix IX: Comments from the Department of Housing and Urban 
Development:

DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT:

WASHINGTON, DC 20410-3000:

ASSISTANT SECRETARY FOR ADMINISTRATION/CHIEF INFORMATION OFFICER:

DEC 15 2003:

Ms. Linda Lambert:

Assistant Director, Information Technology Management Issues:

U.S. General Accounting Office:

Washington, DC 20548:

Dear Ms. Lambert:

Thank you for the opportunity to comment on the GAO's draft report on 
"Information Technology Management: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can be Further 
Improved (GAO-04-49)." The report contains six recommendations to 
improve HUD's information technology (IT) strategic management/ 
performance measurement processes and five recommendations to improve 
the investment management processes.

We are in agreement with all the recommendations in the report. We are 
pleased that of the 29 applicable practices GAO examined at HUD, 17 are 
in place and 8 are partially implemented. In the coming months, we will 
focus on implementing the four practices that are not in place and 
strengthening those processes where further improvements can be made. 
The information in the report will help form a baseline against which 
we will measure our continuing improvement efforts.

Should you or your staff have any questions or require additional 
information, please contact Mary P. Barry, Acting Director, Office of 
Management and Planning, at (202) 708-1027, extension 123.

Sincerely,

Signed by:

Vickers B. Meadows:

Assistant Secretary for Administration/Chief Information Officer:

cc:

David Powner:

[End of section]

Appendix X: Comments from the Department of the Interior:

United States Department of the Interior:

OFFICE OF THE ASSISTANT SECRETARY:

POLICY, MANAGEMENT AND BUDGET:

Washington, DC 20240:

David A. Powner:

Director, Information Technology Management Issues:

United States General Accounting Office:

441 G Street, NW:

Washington, DC 20548:

Dear Mr. Powner:

The Department of the Interior reviewed the Draft GAO Report entitled, 
"Information Technology Management: "Governmentwide Strategic 
Planning, Performance Measurement, and Investment Management Can Be 
Further Improved" (GAO-04-49). This report follows closely upon the 
completion of the GAO Report on information technology investment 
management (ITIM) in Interior entitled "Information Technology: 
Departmental Leadership Crucial to Success of Investment Reforms at 
Interior" (GAO-03-1028), in which GAO commended Interior for the 
progress made to date in ITIM and provided recommendations for further 
improvement. In response to this report, Interior committed to develop 
a comprehensive plan to implement departmentwide improvements to the 
ITIM process based on the Stage 2 and Stage 3 critical processes of 
GAO's ITIM framework, and strengthen departmental oversight of bureau 
investment management processes.

The draft report (GAO-04-49), which covers several agencies, 
acknowledges that Interior's progress in improving information 
technology investment management (ITIM) has been evident and is 
ongoing. The report also provides very useful comparisons to ITIM 
progress in other agencies. In order to capitalize on the successes at 
other agencies, it would perhaps be more helpful to provide more 
examples for emulation in areas where agencies are successful. Interior 
acknowledges the assistance provided by the GAO auditors during the 
review process in highlighting a few examples of successes in areas 
where Interior needs greater improvement. Further examples of successes 
would provide the information needed to benchmark, as recommended in 
the draft report.

Interior is committed to continuing to move forward aggressively to 
execute key practices for ITIM, considering competing priorities for 
this and other initiatives. The Department of the Interior agrees that 
the recommendations in this report would further improve ITIM at 
Interior, and has plans in place to incorporate many of the suggestions 
noted. 
However, some of the recommendations go beyond what Interior could 
reasonably accomplish within current and projected budgets for 
information technology. Interior remains committed to address the 
recommendations of the final report GAO-03-1028.

For additional information, please contact Mr. W. Hord Tipton at 
(202) 208-6194.

Sincerely,

Signed by:

Lynn Scarlett:

Acting Assistant Secretary for Policy, Management and Budget:

[End of section]

Appendix XI: Comments from the Department of Justice:

U.S. Department of Justice:

Washington, D.C. 20530:

DEC 2 2003:

David A. Powner, Director:

Information Technology Management Issues:

U.S. General Accounting Office:

441 G St., N. W.

Washington, D.C. 20548:

Dear Mr. Powner:

I would like to thank you for affording the Department of Justice 
(Department) the opportunity to respond to the General Accounting 
Office (GAO) report entitled "Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved.":

The Attorney General appointed me Chief Information Officer for the 
Department in April 2002. In July 2002, I re-issued the Information 
Technology (IT) Strategic Plan conveying a fundamental reorientation of 
the role of IT in the Department of Justice. In May 2003, I received 
congressional approval to reorganize my office and realign staff to 
operate more effectively to carry out the management goals identified 
in the IT Strategic Plan. During the past 12 months, I have initiated 
the following specific initiatives in these areas in order to achieve 
measurable progress in IT management within the Department:

* Initiated an outside assessment of IT investment management with 
recommendations to improve the Department's investment management 
processes;

* Completed a pilot project to test and assess new processes to manage 
IT investments in the Select Phase resulting in a portfolio-centered 
analysis of candidate investments and recommendations for consolidated 
business cases as part of the Department's Fiscal Year 2005 budget 
formulation process;

* Implemented an ongoing pilot project to test, assess, and phase in 
new processes to manage IT investments in the Control Phase, including 
the use of an IT Investment Board, which will enable departmental 
senior executives to actively set direction and monitor high priority, 
high risk and department wide IT investments;

* Initiated a web-based "Project Dashboard" to collect current data on 
cost, schedule, and performance for major projects across the 
Department;

* Consolidated business cases for enterprise architecture and IT 
infrastructure - two areas critical for achieving my strategic 
objectives;

* Developed a project oversight process in collaboration with the 
FBI to oversee major IT investments.

I am currently institutionalizing and documenting those key management 
processes listed above to strengthen strategic planning, performance 
measurement, and IT investment across the Department. The 
recommendations of your report will assist me in further defining those 
management practices.

Overall, I concur with the findings and recommendations in your report; 
however, I want to modify one recommendation to ensure it is an 
actionable item that reflects the future direction of our oversight 
reporting requirements. That recommendation relates to Practice 1.6 - 
"The agency's performance plan required by GPRA [Government Performance 
and Results Act] includes the resources and time periods required to 
implement the information security program plan required by FISMA 
[Federal Information Security Management Act].":

The Department's self-assessment and the GAO review results are in 
agreement on the Partial rating for this assessment. As discussed in 
the Exit Conference, the Department lays out the full FISMA program, 
including major milestones and required resources, in the annual 
Security Report and the Plan of Actions and Milestones Report submitted 
to the Office of Management and Budget (OMB), GAO, and Congress but 
includes only selected performance measures in the GPRA Plan. However, 
the OMB will replace the annual GPRA performance plan with performance 
budgeting in FY 2005.

In light of this change, the following modification in language is 
requested:

Include the resources and time periods required to implement the 
information security program plan required by FISMA in reporting 
document(s) as directed by OMB guidance.

Thank you for the opportunity to comment on this report. If you need 
additional information, please do not hesitate to contact me.

Sincerely,

Signed by:

Vance E. Hitch:

Chief Information Officer:

The following are GAO's comments on the Department of Justice's letter 
dated December 2, 2003.

GAO Comments:

1. GAO has ongoing work looking at OMB's initiative. However, the 
Federal Information Security Management Act of 2002 requires agencies 
to include in the performance plans required by the Government 
Performance and Results Act the resources and time periods to implement 
their information security program. Accordingly, we did not change the 
recommendation.

[End of section]

Appendix XII: Comments from the Department of Labor:

U.S. Department of Labor:

Office of the Assistant Secretary for Administration and Management:

Washington, D.C. 20210:

DEC - 2 2003:

Mr. David Powner:

Director, Information Technology Management Issues U.S. General 
Accounting Office:

441 G Street, NW, Washington, D.C. 20548:

Dear Mr. Powner:

The enclosed comments are in reference to GAO's draft report entitled 
Information Technology Management: Government-wide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved (GAO 04-49). As you know, the objectives of this study were to 
determine the extent to which major government agencies have in place 
practices associated with key legislative and other requirements for 
(1) IT strategic planning/performance measurement and (2) IT investment 
management.

The Department concurs with GAO's findings and is proposing two 
changes, found in the enclosed document.

If you have any questions about these comments or this matter in 
general, please contact Thomas Wiesner, Deputy CIO, at 
wiesner.thomas@dol.gov or (202) 693-4200.

Sincerely,

Signed by:

Patrick Pizzella:

Assistant Secretary for Administration and Management, Chief 
Information Officer:

Enclosure:

DOL Response to GAO-04-49 IT Management Practices Recommendations to 
Departments and Agencies:

1) p. 25. "For example, the Department of Labor's... does not address 
all required goals.":

Recommended change: After this sentence, insert new sentence reading: 
"However, this single IT goal is supported by a number of IT 
indicators, which are specific, measurable and tied to Labor's mission 
- and address the Clinger-Cohen and PRA requirements. The CIO is 
assigned responsibility for achieving the goal, with component agencies 
responsible for carrying out the indicators appropriate to their 
mission.":

2) p. 71, second bullet under Labor's recommendations, which begins 
"develop a documented process...":

Recommended change: remove this bullet entirely for the following 
reasons:

As required by the Government Performance and Results Act (GPRA), the 
Department of Labor (DOL) prepares an annual performance plan covering 
each program activity set forth in the DOL annual budget, including 
information technology. This plan is built upon strategic goals 
established by the Department and is directly related to the 
Department's mission. In addition, performance indicators are 
established to measure outcomes. These indicators are tracked and 
reported quarterly through the GPRA process.

DOL's IT strategic goal, "to provide better and more secure service to 
citizens, businesses, government and Labor employees to improve mission 
performance" contain a number of IT indicators, which support this 
goal. The indicators are specific, measurable goals that support the 
strategic goal and are reported in GPRA on an annual basis.

In accordance with the Clinger-Cohen Act and DOL Secretary's Order 3-
2003, the Office of the Chief Information Officer is responsible for 
establishing, monitoring and evaluating Departmental IT goals to 
improve efficiencies, contain costs, streamline business processes, 
provide better access to Government information and services, and 
promote a secure environment. In addition, consistent with their 
statutory responsibilities and other applicable Secretary's Orders and 
guidelines, all DOL Agency Heads are assigned responsibility to 
implement Department-wide IT initiatives approved by the Department's 
Management Review Board (MRB).

The following are GAO's comments on the Department of Labor's letter 
dated December 2, 2003.

GAO Comments:

1. Because Labor did not disagree with our characterization of its IT 
goal, no changes were made to our report.

2. We agree with Labor's characterization of its IT strategic goal and 
order 3-2003. Nevertheless, the recommendation, and related practice 
1.7, refers to the documentation of the process used to develop IT 
goals and measures and the responsibility for achieving them. Labor 
neither provided documentation of such a process nor took issue with 
our assessment of practice 1.7, in which we stated that the agency did 
not have this practice in place. Moreover, Labor's self-assessment 
referenced a draft performance measurement guidebook and quarterly 
review process in support of this practice. However, these mechanisms 
relate to performance measures associated with IT projects, not Labor's 
enterprisewide IT goal. Finally, as we noted in our report, unlike 
other agencies in our review, Labor does not have goals in its IRM 
plan. Accordingly, we did not change this recommendation.

[End of section]

Appendix XIII: Comments from the National Aeronautics and Space 
Administration:

National Aeronautics and Space Administration:

Office of the Administrator:

Washington, DC 20546-0001:

December 8, 2003:

Mr. David A. Powner, Director:

Information Technology Management Issues:

United States General Accounting Office:

Washington, DC 20548:

Dear Mr. Powner:

Enclosed is the National Aeronautics and Space Administration (NASA) 
response to the General Accounting Office (GAO) Draft Report, 
"Information Technology Management: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved" (GAO-04-49). The Agency concurs with your recommendations for 
corrective action. Enclosed are NASA's detailed comments on each 
individual recommendation.

My point of contact for information technology management is Nancy 
Kaplan in the Office of the Chief Information Officer (CIO). She may be 
contacted by e-mail at Nancy.Kaplan@nasa.gov or by telephone at (202) 
358-1372.

Cordially,

Signed by:

Frederick D. Gregory:

Deputy Administrator:

Enclosure:

NASA Response to Draft GAO Report: "Information Technology Management: 
Governmentwide Strategic Planning, Performance Measurement, and 
Investment Management Can Be Further Improved" (GAO-04-49):

GAO Recommendations: To improve the Agency's Information Technology 
(IT) strategic management/performance measurement processes, we 
recommend that the Administrator of the National Aeronautics and Space 
Administration:

1. Document the Agency's IT strategic management processes;

NASA Response to GAO Recommendation 1: Concur. NASA has produced an 
Information Resources Management (IRM) Strategic Plan and is currently 
documenting the strategic planning process which will provide the 
framework for ensuring that the Agency's IT management strategies as 
defined in the IRM Strategic Plan remain clearly linked to the Agency's 
vision, mission, and strategic goals as defined in the NASA Strategic 
Plan. This activity is scheduled to be completed by June 30, 2004.

2. Document how the Agency's IT management operations and decisions are 
integrated with human resources management processes;

NASA Response to GAO Recommendation 2: Concur. The human resources 
management processes are critical to our success in meeting the NASA 
mission, and the NASA CIO has been working closely with the NASA Office 
of Human Resources to tightly integrate NASA's IT human capital needs 
with their efforts. NASA has developed a Strategic Human Capital Plan 
(SHCP) that establishes a systematic, Agencywide approach to human 
capital management, aligned with the Agency's vision and mission. The 
SHCP is built around five pillars--strategic alignment, strategic 
competencies, learning, performance culture, and leadership. A 
companion document to the SHCP, the Strategic Human Capital 
Implementation Plan (SHCIP), has also been developed, which contains 
detailed action plans for achieving the goals defined in the SHCP. 
Neither of these plans references specific disciplines (e.g., IT); 
both are intended to pertain to all disciplines that are critical to 
developing and maintaining NASA's strategic competencies. The current 
NASA IRM Strategic Plan does not address human resources management 
since the SHCP and the SHCIP address this from a multidiscipline 
perspective, but does reference the NASA Strategic Plan, which 
identifies the criticality of IT as a strategic competency to NASA 
through the definition of Implementing Strategy 2, "Demonstrate 
leadership in the use of information technology." However, the IRM 
Strategic Plan is currently in the process of being revised and will 
include language that clarifies the linkage between the various 
strategic plans and the integration of IT management with human 
resources management. The revised plan is scheduled to be completed by 
September 30, 2004.

3. Include in the Agency's annual performance plan the resources and 
time periods required to implement the information security program 
plan required by FISMA;

NASA Response to GAO Recommendation 3: Partially Concur. NASA has been 
including the resources and time periods required to implement the 
information security changes necessary to correct the identified 
weakness contained in the OMB FISMA report. OMB has instructed NASA to 
exclude this information from any submission to Congress because budget 
data are embargoed at this time. NASA will summarize this information in 
our annual performance plan.

4. Develop a documented process to develop IT goals in support of 
Agency needs, measure progress against these goals, and assign roles 
and responsibilities for achieving these goals; develop measures 
related to the effectiveness of controls to prevent software piracy;

NASA Response to GAO Recommendation 4: Concur. The strategic planning 
process referenced in (1) above is the process through which IT goals 
and performance measures are developed. NASA is currently in the 
process of revising NASA Policy Directive (NPD) 2800.1, Managing 
Information Technology, which identifies the roles and responsibilities 
for achieving the Agency's IT goals and objectives. This revision is 
scheduled for the first quarter of calendar year 2004. NASA will 
benchmark other agencies as well as private industry to identify best 
practices with respect to developing performance measures related to 
the effectiveness of controls to prevent software piracy. The 
benchmarking activity will be completed by September 30, 2004.

5. Track actual-versus-expected performance for the Agency's 
Enterprisewide IT performance measures in its IRM plan; and develop a 
mechanism for benchmarking the Agency's IT management processes, when 
appropriate.

NASA Response to GAO Recommendation 5: Partially Concur. NASA has 
established performance measures for major IT investments that are 
identified in the Agency's Exhibit 300 submissions to OMB and has 
included performance measures in our IRM Strategic Plan. We acknowledge 
the need to expand our benchmarking efforts.

6. Revise the Agency's IT investment management policy and guidance to 
describe the relationship of this process to the Agency's Enterprise 
architecture;

NASA Response to GAO Recommendation 6: Partially Concur. The NASA IT 
investment management policy is contained in NPD 2800.1, Managing 
Information Technology. The NASA IT Capital Planning and Investment 
Control (CPIC) Process document describes the process NASA will use for 
ensuring that all IT capital investments align with the Agency's 
mission, Enterprise architecture, and business needs. NASA will ensure 
that the planned revision of NPD 2800.1, referenced in (4) above, 
clarifies the relationship between these two documents and the Agency's 
Enterprise architecture. This revision is scheduled for the first 
quarter of calendar year 2004.

7. Include interfaces in its inventory of the Agency's major 
information systems, implement a standard, documented procedure to 
maintain this inventory, and develop a mechanism to use the inventory 
as part of managerial decisionmaking;

NASA Response to GAO Recommendation 7: Concur. NASA has started the 
process of replacing our existing asset management system, and the NASA 
CIO has provided IT asset-tracking requirements to the project 
responsible for implementing this new system. In addition, the CIO has 
established a database that captures all NASA systems and has begun to 
integrate the use of these data into our management processes to 
facilitate decisionmaking. We will be integrating our existing database 
with the new asset management system once that system is deployed.

8. Within the Agency's IT investment selection process, implement a 
mechanism to identify possible conflicting, overlapping, strategically 
unlinked or redundant proposals, implement a scoring model, and develop 
a prioritized list of investments;

NASA Response to GAO Recommendation 8: Concur. The NASA CPIC Process 
document referenced in (6) above describes the selection phase of the 
process NASA will use. The selection phase is divided into two stages-
-Concept Screening and Business Case Development and Screening. The 
Concept Screening stage permits the evaluation of a number of candidate 
investments with a minimum amount of information. If the investment is 
deemed viable after this initial screening, then it will move into the 
business case development and screening stage where it will undergo the 
development of a full business case and project plan. Investments that 
pass the initial screening will be rated and ranked against other 
proposed investments and in the context of ongoing projects. As 
described in the CPIC process, a portfolio analysis will be conducted 
for rating and ranking each investment in the context of a total 
investment portfolio. NASA will have completed this process for the FY 
2006 budget submission by September 30, 2004.

9. Document the role, responsibility, and authority of its IT 
investment management boards, including work processes, alignment and 
coordination of decisionmaking among its various boards and processes 
for controlling and evaluating IT investments, such as those in 
practices 2.15, 2.16, 2.17, and 2.18.

NASA Response to GAO Recommendation 9: Concur. NASA will ensure that 
the role, responsibility, and authority of its IT investment management 
board(s) in the referenced practices (2.15, 2.16, 2.17, and 2.18) are 
clarified in the revision of NPD 2800.1, Managing Information 
Technology, referenced in (4) and (6) above. This revision is scheduled 
for the first quarter of calendar year 2004.

The following are GAO's comments on the National Aeronautics and Space 
Administration's (NASA) letter dated December 8, 2003.

GAO Comments:

1. Our practice dealt with enterprise-level measures, not project-
specific measures. In addition, although we agree that NASA's IRM plan 
included performance measures, the agency generally does not track 
actual-versus-expected performance for these enterprisewide measures.

[End of section]

Appendix XIV: Comments from the Nuclear Regulatory Commission:

UNITED STATES NUCLEAR REGULATORY COMMISSION:

WASHINGTON, D.C. 20555-0001:

December 5, 2003:

Mr. David A. Powner, Director:

Information Technology Management Issues:

United States General Accounting Office:

441 G Street, NW:

Washington, D.C. 20548:

Dear Mr. Powner:

I would like to thank you for the opportunity to review and submit 
comments on the draft report, "Information Technology Management: 
Governmentwide Strategic Planning, Performance Measurement, and 
Investment Management Can Be Further Improved," (GAO-04-49). The U.S. 
Nuclear Regulatory Commission appreciates the time and effort that you 
and your staff have taken to review this important topic.

Overall, this report provides useful information on Federal agencies' 
use of 12 IT strategic planning/performance measurement practices and 
18 IT investment management practices. We agree that these practices 
are important for ensuring effective use of government IT investments, 
and we support the effort to encourage best practices across Federal 
agencies.

We have no specific comments on the report. Should you have any 
questions, please contact either Mr. William Dean at 301-415-1703 or 
Ms. Melinda Malloy at 301-415-1785 of my staff.

Sincerely,

Signed by:

William D. Travers:

Executive Director for Operations:

cc: Linda J. Lambert, GAO:

[End of section]

Appendix XV: Comments from the Social Security Administration:

SOCIAL SECURITY:

The Commissioner:

December 3, 2003:

Mr. David A. Powner:

Director, Information Technology Management Issues:

U.S. General Accounting Office, Washington, D.C. 20548:

Room 4075:

Dear Mr. Powner:

Thank you for the opportunity to review the draft report, "Information 
Technology Management: Governmentwide Strategic Planning, Performance 
Measurement, and Investment Management Can Be Further Improved" (GAO-
04-49). Our comments are enclosed. If you have any questions, please 
have your staff contact Mark Welch at (410) 965-0374.

Sincerely,

Signed by:

Jo Anne B. Barnhart:

Enclosure:

COMMENTS OF THE SOCIAL SECURITY ADMINISTRATION ON THE GENERAL 
ACCOUNTING OFFICE'S (GAO) DRAFT REPORT "INFORMATION TECHNOLOGY 
MANAGEMENT: GOVERNMENTWIDE STRATEGIC PLANNING, PERFORMANCE MEASUREMENT, 
AND INVESTMENT MANAGEMENT CAN BE FURTHER IMPROVED" (GAO-04-49):

Thank you for the opportunity to review and provide comments on this 
GAO draft report. The Social Security Administration (SSA) has taken 
many positive actions in recent years, and continues efforts, to fully 
comply with federal guidance relating to information technology (IT) 
management and improve overall performance in this area. We are pleased 
that this GAO report acknowledges the numerous IT management practices 
fully and partially in place at SSA, and that our performance in this 
area compares quite favorably with that of other federal agencies.

Recommendation I:

Include in its annual performance plan (APP) the resources and time 
periods required to implement the information security program plans 
required by the Federal Information Security Management Act of 2002 
(FISMA).

Comment:

We agree in part. We are concerned about including the details of our 
information security program, including information about the resources 
and time periods required for implementation in a public document, 
because such information might aid and abet those seeking to compromise 
the Agency's information security. However, we recognize that Section 
3544(d) of FISMA calls for agencies to include as part of their APPs, 
which are public documents, the time periods and resources that are 
necessary to implement their information security programs. Therefore, 
we will include this information in future APPs in a manner 
that will not risk compromise of the Agency's information security.

Recommendation 2:

Develop measures related to the performance of the Agency's IT programs 
and the effectiveness of controls to prevent software piracy.

Comment:

We agree and we believe the Agency already has effective measures in 
place to ensure effective SSA IT program performance. SSA's Strategic 
Information Resources Management Plan and associated IT programs are 
driven by the Agency's strategic goals and objectives which are defined 
in the Agency Strategic Plan and APP. IT programs provide the 
automation support required for the projects that help SSA to achieve 
those goals and objectives.

At the tactical level, the Control Phase of the Agency's Capital 
Planning and Investment Control Process provides the appropriate 
oversight process for IT initiatives, including measuring compliance 
with the cost, schedule and performance goals established for these IT 
initiatives. SSA also has availability, stability and performance 
measures for many components of its enterprise IT architecture.

Regarding software piracy, SSA currently tests the effectiveness of 
controls to prevent software piracy. Since the performance measures in 
the APP are at a high level and are focused on the four Agency goals, 
it seems inappropriate to include a goal focused at the tactical level 
of preventing software piracy. This issue is more appropriately 
addressed in the oversight provided in the Control Phase of the 
Agency's Capital Planning and Investment Control Process.

Recommendation 3:

Develop a mechanism for benchmarking the agency's IT management 
processes, when appropriate.

Comment:

We agree that the use of benchmarking of IT management processes is 
useful in certain situations. However, we do not agree with any 
recommendation that requires the Agency to develop a mechanism (that 
is, an explicit strategy or policy) for benchmarking its IT management 
processes. We are in compliance with the Clinger/Cohen Act, Section 
5123(4) requirement that "the head of an executive agency shall, where 
comparable processes and organizations in the public or private sectors 
exist, quantitatively benchmark agency process performance against such 
processes in terms of cost, speed, productivity, and quality of outputs 
and outcomes." SSA performs benchmarking, when appropriate, and has 
provided GAO a list of Agency benchmarking activities carried out in 
response to the Clinger/Cohen Act requirement.

Recommendation 4:

Develop work processes and procedures for the agency's investment 
management board.

Comment:

We agree and SSA's investment management board (the Information 
Technology Advisory Board - ITAB) already follows established work 
processes and procedures. The role of the ITAB is evolving as we 
explore ways to refine and improve our Capital Planning and Investment 
Control Process to ensure continued compliance with legislation and 
Office of Management and Budget (OMB) guidance. While high-level 
documentation concerning the ITAB's operations is in place, we are 
working to more fully document ITAB's work processes and procedures.

Recommendation 5:

Establish a policy requiring modularized IT investments.

Comment:

We agree and SSA has already established and is following a modularized 
investment policy. SSA's current review of business cases includes an 
assessment of major projects against the "Raines Rules" criteria that 
include "modularized" system implementations. The Raines Rules criteria 
are incorporated into guidance in OMB Circulars No. A-11 and A-130. The 
provisions of these circulars are incorporated by reference in SSA's 
Capital Planning and Investment Control Process. SSA software 
development initiatives follow a modular, release-oriented strategy and 
major IT technology deployments are performed through a phased-
implementation approach. Each IT investment is assessed to determine 
whether it is worth doing before approval is given to proceed. Any 
problems with the results from the implementations of earlier "modules" 
of a system are considered in the approval process for continued 
investments.

Recommendation 6:

Document the role, responsibility, and authority of its IT investment 
management board for the oversight of IT investments, such as what is 
outlined in practices 2.15, 2.16, and 2.18.

Comment:

We agree. The role, responsibility and authority of SSA's investment 
management board (the ITAB) for the oversight of IT investments is 
documented at a high level. As noted above, the role of the ITAB is 
evolving as we explore ways to refine and improve our Capital Planning 
and Investment Control Process to ensure continued compliance with 
legislation and OMB guidance. We are working to ensure that the ITAB's 
role, responsibility and authority for the oversight of IT investments 
are more explicitly documented.

Recommendation 7:

Require that corrective actions be tracked and reported to the 
investment management board for under-performing projects.

Comment:

We agree. SSA tracks the progress of all IT projects, including both 
projects that are proceeding as planned, and those that are not. Any 
surfacing problems are separately tracked and monitored within the Risk 
Identification and Mitigation System and the Problem and Issues 
Reporting System. These repositories collect and 
track risk/problem information and the corrective actions taken, which 
are reported upon at various points along a project's lifecycle. The 
ITAB has access to this information. Therefore, we will more explicitly 
document how corrective actions for under-performing projects are 
tracked and reported to the ITAB.

The following are GAO's comments on the Social Security 
Administration's (SSA) letter dated December 3, 2003.

GAO Comments:

1. We agree that SSA needs to consider the level of detail that is 
appropriate to include in its performance plans so as not to compromise 
security.

2. We requested documentation to support SSA's assertion that it has 
performance measures associated with the performance of IT programs 
(e.g., the percentage of IT projects that are meeting cost, schedule, 
and performance goals), but none were provided. Accordingly, we did not 
modify our report.

3. We agree that it is not appropriate to include measures related to 
the effectiveness of controls to prevent software piracy in agency 
performance plans. Neither our practice nor our recommendation 
specifies the document or process that should be used to address 
software piracy.

4. As we noted in this report, SSA performs benchmarking in an ad hoc 
manner. We believe that taking a more systematic approach is necessary 
to ensure that benchmarking is performed at suitable times using an 
appropriate methodology. Without a systematic approach, it is not 
possible to validate that the agency performs benchmarking "when 
appropriate." Accordingly, we did not modify our report.

5. References to OMB's Circular A-11 in agency policy documentation 
alone do not ensure that these practices are met. In particular, we 
believe that agency policies related to modularized IT investments 
should be explicit and that it is neither prudent nor practical to rely 
on users of SSA's documentation of its capital planning and investment 
control process to review a secondary source.

[End of section]

Appendix XVI: Comments from the Department of State:

United States Department of State:

Assistant Secretary and Chief Financial Officer:

Washington, D.C. 20520:

DEC 9 2003:

Dear Ms. Westin:

We appreciate the opportunity to review your draft report, "INFORMATION 
TECHNOLOGY MANAGEMENT: Governmentwide Strategic Planning, Performance 
Measurement, and Investment Management Can Be Further Improved," GAO-
04-49, GAO Job Code 310448.

The enclosed Department of State comments are provided for 
incorporation with this letter as an appendix to the final report.

If you have any questions concerning this response, please contact 
Karen Mummaw, Computer Specialist, Bureau of Information Resource 
Management at (202) 663-1415.

Sincerely,

Signed by:

Christopher B. Burnham:

Enclosure:

As stated.

cc: GAO/IT - David Powner;

State/OIG - Luther Atkins;

State/IRM - Bruce Morrison;

State/H - Paul Kelly:

Ms. Susan S. Westin, Managing Director, International Affairs and 
Trade, U.S. General Accounting Office.

Department of State Comments on GAO Draft Report:

Information Technology Management, Government-wide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved (GAO-04-49, GAO Job Code 310448):

Thank you for the opportunity to review and comment on your draft 
report "Information Technology Management, Government-wide Strategic 
Planning, Performance Measurement, and Investment Management Can Be 
Further Improved". In general, the findings of the report are 
consistent with the discussions between GAO staff and our Bureau of 
Information Resource Management staff in September 2003. However, we 
have made progress in several of the areas since those discussions took 
place. So that GAO might have the opportunity to reflect the most 
current information in its final report, we want to bring the following 
information to your attention:

* Practice Area 2.13, pg. 49 - Agency has established a structured 
selection process. GAO scored this item as partial for the Department 
and commented that the Department's documentation of its selection 
process is in draft form.

On September 25, 2003, the Department published the final version of 
the Department's Capital Planning and Investment Control (CPIC) Program 
Guide on the IT Planning Division's web page on the Department's 
Intranet. This guide contains the framework for the Department's 
information technology CPIC process. Section 4 of the guide contains 
detailed procedures for the Select phase and addresses process, 
prerequisites and data requirements, scoring criteria, and outputs. The 
Department suggests that GAO consider changing the score to a "yes" for 
this practice area and amend the comment section accordingly. (See pp. 
11-14 of the final CPIC guide that is attached.):

* Practice Area 2.16, pg. 51 - Agency has established an oversight 
mechanism of funded investments. GAO scored this item as partial for 
the Department and commented that State had draft procedures for 
control phase reviews in place, but they are not fully implemented.

Section 5 of the Department's final CPIC Guide, published on the 
Department's Intranet, contains detailed procedures for the Control 
phase of the process. The Department has begun the process of 
implementing these procedures for controlling IT investments, which 
address the initiative review process, portfolio review process, 
prerequisites and data requirements, scoring criteria, and outputs. 
(See pp. 16-21 of the CPIC Guide attached.) The Department suggests 
that GAO amend the comments section for this practice area to state 
that the Department recently issued final procedures for control phase 
reviews, which are not yet fully implemented.

* Practice Area 2.17, pg. 51 - Corrective actions for projects are agreed 
upon, documented, and tracked by the agency wide investment management 
board. GAO scored this item as partial for the Department and commented 
that State had draft procedures for control phase reviews in place, but 
they are not fully implemented.

See our comment for Practice Area 2.16. The Department suggests that 
GAO amend the comments section for this practice area to state that the 
Department recently issued final procedures for control phase reviews, 
which are not yet fully implemented.

* Practice Area 2.18, pg. 52 - Agency-wide investment management board 
required that post-implementation reviews be conducted. GAO scored this 
item as partial for the Department and commented that State has a 
policy related to this practice, but did not provide evidence that it 
is completely implemented.

Section 6.2 of the Department's final CPIC Guide provides for post-
implementation reviews of IT investments and sets forth the 
prerequisites and data requirements for such reviews. (See pp. 24-25.) 
The Department suggests that GAO amend the comments section for this 
practice area to state that the Department recently issued final 
procedures for conducting post implementation reviews, which are not 
yet fully implemented.

The following are GAO's comments on the Department of State's letter 
dated December 9, 2003.

GAO Comments:

1. We based our evaluation on the agency's draft Capital Planning and 
Investment Control Program Guide that was provided during our review. 
However, based on State's newly finalized Capital Planning and 
Investment Control Program Guide, we changed this evaluation in our 
report.

2. We based our evaluation on the agency's draft Capital Planning and 
Investment Control Program Guide that was provided at the time of our 
review. Based on the final version of the Capital Planning and 
Investment Control Program Guide provided by State in its response, we 
modified the language in our report, as appropriate.

3. See comment 2.

4. See comment 2.

[End of section]

Appendix XVII: Comments from the U.S. Agency for International 
Development:

USAID:

DEC 9 2003:

U.S. AGENCY FOR INTERNATIONAL DEVELOPMENT:

Mr. David A. Powner:

Director:

Information Technology Management Issues:

U.S. General Accounting Office:

441 G Street, N.W.

Washington, DC 20548:

Dear Mr. Powner:

I am pleased to provide the U.S. Agency for International Development's 
(USAID's) formal response on the draft GAO report entitled "Information 
Technology Management: Governmentwide Strategic Planning, Performance 
Measurement, and Investment Management Can Be Further Improved" 
(November 2003):

We have reviewed the draft recommendations to USAID and appreciate the 
thorough review which has been performed by your team. We do take 
exception to your assessment of USAID's compliance at the "No" level 
with the following practices:

1. Practice 2.11 (page 48): "The agency requires that the proposed IT 
investment, at a minimum:

* support work processes that it has simplified or redesigned to reduce 
costs and improve effectiveness, and:

* make maximum use of commercial-off-the-shelf (COTS) software.":

2. Practice 2.14 (page 49): "Agency policy calls for investments to be 
modularized (e.g., managed and procured in well-defined useful segments 
or modules that are short in duration and small in scope) to the 
maximum extent achievable.":

We believe that USAID implements both practices at a "partial" level 
because we require that USAID IT investment requests be submitted in 
accordance with OMB Circular A-11 which contains such policy 
requirements. For example, USAID's Automated Directives System (ADS) 
Chapter 577 (Capital Planning and Investment Control), Table 1, 
Investment Category Documentation and Review Requirements, states that 
OMB Exhibit 300 must be submitted to the CIO requesting approval of 
Level II and Level III investments, which are defined on that table. 
Authors of Level I Investment Decision Requests (USAID 3001) must 
follow instructions provided in OMB Exhibit 300 in OMB Circular A-11. 
OMB Circular A-11 states that, "Agencies must develop, implement and 
use a capital programming process to develop their capital asset 
portfolio, and must: ...

* Simplify or otherwise redesign work processes to reduce costs, 
improve effectiveness, and make maximum use of commercial services and 
off-the-shelf technology; ...

* Structure major acquisitions into useful segments with a narrow scope 
and brief duration ...":

(ADS 577 is available on the USAID website at: 
http://www.usaid.gov/policy/ads/500/577.pdf. Table 1 is available at: 
http://www.usaid.gov/policy/ads/500/577maa.doc. The Action Memorandum 
is available at: http://www.usaid.gov/policy/ads/500/577mag.pdf, and 
OMB Circular A-11, Part 7, p. 300-8, is available at: 
http://www.cio.gov/documents/s300.pdf.):

In addition, USAID evaluates and scores IT investment proposals, in 
part, according to the following criteria: whether or not the project 
includes the use of commercial off-the-shelf (COTS) solutions and 
minimizes the use of custom-designed components, and whether the Agency 
uses phased successive chunks. (See ADS 577, Table 3, page 3, item 2.2, 
which is available on the USAID website at: 
http://www.usaid.gov/policy/ads/500/577mab.pdf.):

Thank you for the opportunity to respond to the GAO draft report and 
for the courtesies extended by your staff in the conduct of this 
review. If you have any questions, you may contact me or contact Mark 
Kneidinger, Deputy CIO for Policy and Deputy Assistant Administrator 
for Management at (202)712-1200.

Sincerely,

Signed by:

John Marshall:

Assistant Administrator:

Bureau for Management:

The following are GAO's comments on the U.S. Agency for International 
Development's (USAID) letter dated December 9, 2003.

GAO Comments:

1. References to OMB's Circular A-11 in agency policy documentation 
alone do not ensure that these practices are met. In particular, we 
believe that agency policies related to practices 2.11 and 2.14 should 
be explicit and that it is neither prudent nor practical to rely on 
users of USAID's directives to review a secondary source. Regarding 
USAID's comments that it uses the criteria in practices 2.11 and 2.14 
as part of its evaluation and scoring of investments, we agree that the 
agency does ask some questions on the use of commercial-off-the-shelf 
software and whether the agency uses "successive chunks" within its 
proposed IT investment scoring model. However, addressing these 
criteria as part of a scoring model does not address our practice 
because scoring projects on the basis of the questions asked does not 
necessarily preclude projects from continuing if they do not fully meet 
the criteria. Additionally, the questions asked as part of the scoring 
model do not fully meet the requirements of the practices. Accordingly, 
we did not modify our report.

[End of section]

Appendix XVIII: Comments from the Department of Veterans Affairs:

THE SECRETARY OF VETERANS AFFAIRS:

WASHINGTON:

December 5, 2003:

Mr. David A. Powner:

Director:

Information Technology Management Issues:

U.S. General Accounting Office:

441 G Street, NW:

Washington, DC 20548:

Dear Mr. Powner:

The Department of Veterans Affairs (VA) has reviewed your draft report, 
Information Technology Management: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved (GAO 04-49). VA concurs with all seven of GAO's 
recommendations. I am pleased to advise that VA has already implemented 
five recommendations and plans to implement the remaining two 
recommendations by the end of April 2004.

VA's detailed comments specific to the report's recommendations are 
outlined in the enclosure. Thank you for the opportunity to comment on 
your draft report.

Sincerely yours,

Signed by:

Anthony J. Principi:

Enclosure:

Enclosure:

Department of Veterans Affairs Comments To GAO Draft Report, 
INFORMATION TECHNOLOGY MANAGEMENT: Governmentwide Strategic Planning, 
Performance Measurement, and Investment Management Can Be Further 
Improved (GAO-04-49):

To improve the department's IT strategic management/performance 
measurement processes, GAO recommends that the Secretary of Veterans 
Affairs:

* Include in the department's annual performance plan the resources 
required to implement the information security program plan required by 
FISMA.

Concur: The Department currently collects information on the costs to 
implement the provisions of the Federal Information Security Management 
Act of 2002 (FISMA) through several interrelated efforts and reports this 
information to the Office of Management and Budget. This information 
will be included in future Government Performance and Results Act 
submissions.

* Develop a documented process to measure progress against the 
department's IT goals and assign roles and responsibilities for 
achieving these goals.

Concur: VA's Acting Chief Information Officer (CIO) is revising the 
Department's IT Strategic Plan. The revision will include a section 
that outlines the performance measures that are associated with 
specific IT goals and objectives and which organizations are 
responsible for carrying them out. Progress against these performance 
measures will be assessed as part of the monthly performance reviews 
that all projects within the IT Portfolio must undergo. The process for 
measuring this performance will be documented in guidance being 
prepared to address IT Portfolio and Project Management and will be 
developed by April 2004. The IT Strategic Plan update will be developed 
by February 2004. VA anticipates implementing this recommendation by 
the end of April 2004.

* Develop measures related to the effectiveness of controls to prevent 
software piracy.

Concur: In fiscal year 2004, VA will field its Department-wide Security 
Configuration and Management Program (SCAMP). SCAMP will establish and 
deploy an enterprise-level, network configuration management framework 
capability to centrally manage all desktops, servers, and 
communications and security devices in the VA environment. This 
initiative will greatly enhance VA's existing capabilities to preclude 
the installation of any unauthorized (including pirated) software on VA 
networks and desktops, and should provide data to measure the 
effectiveness of the controls.

* Track actual-versus-expected performance for the department's 
enterprisewide IT performance measures in its IRM plan.

Concur: Monthly performance reports are provided to VA's Strategic 
Management Council for all OMB Exhibit 300-level projects and programs, 
covering actual-versus-expected performance metrics including:

* Acquisition requirements:

* Funding:

* Staffing:

* Schedule performance:

* Budget performance:

* Quality performance:

The Department's Strategic Management Council is chaired by the Deputy 
Secretary and is comprised of senior officers from the Department's 
administrations and staff offices. This recommendation has been 
implemented.

To improve the department's IT investment management process, GAO 
recommends that the Secretary of Veterans Affairs:

* Document the alignment and coordination of responsibilities of the 
department's various IT investment management boards for decision 
making related to IT investments, including cross-cutting 
investment(s).

Concur: Within VA, the sole board responsible for the overall 
governance of IT decision making, as it is related to the IT Portfolio 
of VA investments, is the Enterprise Information Board (EIB). The EIB 
is chaired by the VA Chief Information Officer and includes within its 
membership the senior IT leadership of the Department, as well as 
fiscal officer representation. The EIB uses the Integrated Management 
Process as its tool for managing IT investments. The Integrated 
Management Process ensures appropriate planning, technical, and fiscal 
review at key decision points ("milestones") within a project's life 
cycle. Projects must fully complete all requirements for a given 
milestone before they are granted unqualified approval to proceed to 
the next one. There are five milestones in the life cycle of a project: 
(0) Project Concept Development; (1) Project Planning; (2) Project 
Pilot/Prototyping; (3) Project Development/Roll-Out; and (4) Project 
Operation (i.e., "Post-Implementation"). VA constituent organizations 
(administrations and staff offices) are free to develop appropriate 
decision making mechanisms to vet investments and investment progress 
decisions prior to their review by the EIB. Nevertheless, those boards 
are still subject to the overall governance of the EIB. Detailed 
guidance on the Integrated Management Process, IT Portfolio Management, 
and IT Project Management is being prepared and will be completed by 
April 2004. VA anticipates implementing this recommendation by the end 
of April 2004.

* Within the agency's IT investment selection process, implement a 
mechanism to identify possible conflicting, overlapping, strategically 
unlinked or redundant proposals and prioritize its IT investment(s).

Concur: VA has already implemented an IT Capital Planning and 
Investment Control (CPIC) process that fully implements the requirement 
of the Office of Management and Budget (OMB), as defined within OMB 
Circular A-11, Section 300. A key component of the CPIC process is the 
annual review of the entire IT Portfolio through the analyses of 
Exhibit 300 - Capital Asset Plan and Business Case - for each of the 
Department's major IT investments. VA has fully aligned its review 
process with that of OMB to the extent that internal evaluation of 
Exhibit 300s (prior to their dispatch to OMB) uses the same scoring 
template and analytical paradigm as used by OMB. All Exhibit 300 
documents are reviewed by a small group of technical subject matter 
experts. Through such means, it becomes possible to identify those 
investments that may be in conflict with others, potentially offer 
duplicative or overlapping efforts, or not advance the mission, goals, 
and objectives of the Department (i.e., "strategically unlinked"). 
Based on this review, the EIB is offered recommendations on the IT 
Portfolio. The EIB, through its capability to decide whether to 
recommend continuance, modification, or termination of projects to the 
VA CIO, can then resolve conflicting, overlapping, or non-aligned 
investment proposals. This recommendation has been implemented.

* Develop a process to use independent verification and validation 
reviews, when appropriate.

Concur: As part of VA's Integrated Management Process, the VA CIO, or 
the EIB, can request "in process" reviews of investments during their 
life cycle, particularly if such an investment appears "off track" 
(that is, cost and/or schedule is no longer within planned values plus 
10 percent, or the project is not delivering the performance that was 
expected). In addition, after implementation, projects are subject to 
post-implementation reviews. This latter set of reviews determines 
whether a project, now fully implemented, provided the Department with 
what was initially expected. These reviews are conducted by someone 
other than the office responsible for the investment and often by 
outside consultants. This recommendation has been implemented.

The following are GAO's comments on the Department of Veterans Affairs' 
(VA) letter dated December 5, 2003.

GAO Comments:

1. VA's response indicates that the department will address this 
recommendation in the future and, therefore, we did not remove this 
recommendation.

2. See comment 1.

3. See comment 1.

4. VA's monthly performance reports track project-specific measures, 
not enterprisewide IT performance measures. VA's draft IRM plan states 
that it will establish metrics to measure performance for IT strategic 
initiatives. However, progress toward doing so was not addressed by VA 
in its comments. Therefore, we do not believe this recommendation has 
been fully addressed.

5. See comment 1.

6. Although VA describes a process followed for reviewing investment 
proposals, it did not provide evidence to support that this practice 
was actually followed. In addition, VA did not address the element of 
our recommendation related to prioritizing its IT investments. 
Therefore, we did not remove this recommendation.

7. On the basis of the additional information provided, we agree that 
the recommendation has been implemented and modified our report 
accordingly.

[End of section]

Appendix XIX: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Linda J. Lambert, (202) 512-9556, and Mark D. Shaw, (202) 512-6251:

Staff Acknowledgments:

Joseph P. Cruz, Lester P. Diamond, Laurence P. Gill, David B. Hinchman, 
Robert G. Kershaw, David F. Plocher, Susan S. Sato, and Patricia D. 
Slocum made key contributions to this report.

(310448):

FOOTNOTES

[1] Office of Management and Budget, Budget of the U.S. Government, 
Fiscal Year 2004, Report on IT Spending for the Federal Government for 
Fiscal Years 2002, 2003, and 2004. We did not verify these data. 

[2] U.S. General Accounting Office, High-Risk Series: An Update, GAO-
03-119 (Washington, D.C. January 2003) and Major Management Challenges 
and Program Risks: A Governmentwide Perspective, GAO-03-95 (Washington, 
D.C. January 2003). 

[3] The Paperwork Reduction Act of 1995 revised the information 
resources management responsibilities established under the Paperwork 
Reduction Act of 1980, as amended in 1986. 

[4] Office of Management and Budget, Circular A-11, Preparation, 
Submission, and Execution of the Budget (July 27, 2002) and Circular A-
130, Management of Federal Information Resources (Nov. 28, 2000).

[5] U.S. General Accounting Office, Information Technology Investment 
Management: A Framework for Assessing and Improving Process Maturity, 
GAO/AIMD-10.1.23, Exposure Draft (Washington, D.C. May 2000).

[6] This section of the United States Code requires 24 departments and 
agencies to establish chief financial officers. We did not include the 
Federal Emergency Management Agency in our review, even though it is 
one of the 24 departments and agencies, because this agency has been 
transferred to the Department of Homeland Security.

[7] Six percent were not applicable. 

[8] One percent were not applicable. Percentages do not add up to 100 
percent due to rounding.

[9] DOD submitted a single letter that included comments from the 
Departments of the Air Force, Army, and Navy. 

[10] This responsibility is in addition to OMB's role in assisting the 
President in reviewing agency budget submissions and compiling the 
President's budget, as discussed in 31 U.S.C. chapter 11. 

[11] Government Information Security Reform, Title X, Subtitle G, Floyd 
D. Spence National Defense Authorization Act for Fiscal Year 2001, P.L. 
106-398, Oct. 30, 2000. 

[12] GAO-03-95. 

[13] U.S. General Accounting Office, Electronic Government: Selection 
and Implementation of the Office of Management and Budget's 24 
Initiatives, GAO-03-229 (Washington, D.C. Nov. 22, 2002). 

[14] U.S. General Accounting Office, Information Management: Challenges 
in Managing and Preserving Electronic Records, GAO-02-586 (Washington, 
D.C. June 17, 2002). 

[15] U.S. General Accounting Office, Electronic Records: Management and 
Preservation Pose Challenges, GAO-03-936T (Washington, D.C. July 8, 
2003). 

[16] U.S. General Accounting Office, Privacy Act: OMB Leadership Needed 
to Improve Agency Compliance, GAO-03-304 (Washington, D.C. June 30, 
2003). 

[17] U.S. General Accounting Office, Information Security: 
Opportunities for Improved OMB Oversight of Agency Practices, GAO/AIMD-
96-110 (Washington, D.C. Sept. 24, 1996).

[18] U.S. General Accounting Office, Computer Security: Progress Made, 
but Critical Federal Operations and Assets Remain at Risk, GAO-03-303T 
(Washington, D.C. Nov. 19, 2002). 

[19] U.S. General Accounting Office, Information Security: Continued 
Efforts Needed to Fully Implement Statutory Requirements, GAO-03-852T 
(Washington, D.C. June 24, 2003). 

[20] U.S. General Accounting Office, High-Risk Series: Protecting 
Information Systems Supporting the Federal Government and the Nation's 
Critical Infrastructures, GAO-03-121 (Washington, D.C. January 2003).

[21] U.S. General Accounting Office, Information Technology: Leadership 
Remains Key to Agencies Making Progress on Enterprise Architecture 
Efforts, GAO-04-40 (Washington, D.C. Nov. 17, 2003) and Information 
Technology: Enterprise Architecture Use across the Federal Government 
Can Be Improved, GAO-02-6 (Washington, D.C. Feb. 19, 2002). 

[22] For example, see U.S. General Accounting Office, Information 
Technology: Inconsistent Software Acquisition Processes at the Defense 
Logistics Agency Increase Project Risks, GAO-02-9 (Washington, D.C. 
Jan. 10, 2002); and HUD Information Systems: Immature Software 
Acquisition Capability Increases Project Risks, GAO-01-962 
(Washington, D.C. Sept. 14, 2001).

[23] U.S. General Accounting Office, Information Technology: Leading 
Commercial Practices for Outsourcing of Services, GAO-02-214 
(Washington, D.C. Nov. 30, 2001).

[24] U.S. General Accounting Office, Information Technology Investment 
Management: A Framework for Assessing and Improving Process Maturity, 
Exposure Draft, GAO/AIMD-10.1.23 (Washington, D.C. May 2000). 

[25] For example, U.S. General Accounting Office, Information 
Technology: Departmental Leadership Crucial to Success of Investment 
Reforms at Interior, GAO-03-1028 (Washington, D.C. Sept. 12, 2003); 
Bureau of Land Management: Plan Needed to Sustain Progress in 
Establishing IT Investment Management Capabilities, GAO-03-1025 
(Washington, D.C. Sept. 12, 2003); United States Postal Service: 
Opportunities to Strengthen IT Investment Management Capabilities, GAO-
03-3 (Washington, D.C. Oct. 15, 2002); Information Technology: DLA 
Needs to Strengthen Its Investment Management Capability, GAO-02-314 
(Washington, D.C. Mar. 15, 2002); and Information Technology: INS 
Needs to Strengthen Its Investment Management Capability, GAO-01-146 
(Washington, D.C. Dec. 29, 2000).

[26] Office of Management and Budget, Circular A-11, Preparation, 
Submission, and Execution of the Budget (July 2002) and Circular A-130, 
Management of Federal Information Resources (Nov. 30, 2000).

[27] U.S. General Accounting Office, Information Technology Investment 
Management: A Framework for Assessing and Improving Process Maturity, 
GAO/AIMD-10.1.23, Exposure Draft (Washington, D.C. May 2000).

[28] The Departments of Agriculture, the Air Force, the Army, Commerce, 
Defense, Education, Energy, Health and Human Services, Housing and 
Urban Development, the Interior, Justice, Labor, the Navy, State, 
Transportation, the Treasury, and Veterans Affairs; and the 
Environmental Protection Agency, General Services Administration, 
National Aeronautics and Space Administration, National Science 
Foundation, Nuclear Regulatory Commission, Office of Personnel 
Management, Small Business Administration, Social Security 
Administration, and U.S. Agency for International Development. 

[29] Each year, OMB's Office of Information and Regulatory Affairs 
publishes an Information Collection Budget by gathering data from 
executive branch agencies on the total number of burden hours it 
approved for collection of information at the end of the fiscal year 
and agency estimates of the burden for the coming fiscal year. 

[30] In fulfilling its responsibilities under this act, OMB requires 
agencies to report to OMB on their plans for providing the public with 
the option of submitting, maintaining, and disclosing required 
information electronically, instead of on paper. 

[31] U.S. General Accounting Office, Information Resources Management: 
Comprehensive Strategic Plan Needed to Address Mounting Challenges, 
GAO-02-292 (Washington, D.C. Feb. 22, 2002). 

[32] U.S. General Accounting Office, Paperwork Reduction Act: Record 
Increase in Agencies' Burden Estimates, GAO-03-619T (Washington, D.C. 
April 11, 2003). 

[33] U.S. General Accounting Office, Executive Guide: Measuring 
Performance and Demonstrating Results of Information Technology 
Investments, GAO/AIMD-98-89 (Washington, D.C. March 1998). 

[34] GAO/AIMD-98-89. 

[35] GAO/AIMD-98-89. 

[36] VA included the time periods to implement its information security 
program in its performance plan.

[37] GAO/AIMD-10.1.23 and U.S. General Accounting Office, Assessing 
Risks and Returns: A Guide for Evaluating Federal Agencies' IT 
Investment Decision-making, GAO/AIMD-10.1.13 (Washington, D.C. 
February 1997). 

[38] We have previously reported that certain DOD system acquisitions 
were not utilizing incremental management best practices or were just 
beginning to do so. For example, see U.S. General Accounting Office, 
DOD Systems Modernization: Continued Investment in Standard Procurement 
System Has Not Been Justified, GAO-01-682 (Washington, D.C. July 31, 
2001).

[39] GAO-03-1028. 

[40] With a scoring model, the assessment body typically attaches 
numerical scores and "relative value" weights to each of the individual 
selection criteria. Each investment is then scored against each 
criterion, and each score is multiplied by the weight associated with 
that criterion. Finally, the weighted scores are summed to create a 
numerical value for each investment. 
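
The weighted-sum calculation described in this footnote can be 
illustrated with a short sketch. The following Python example is 
illustrative only; the criteria names, weights, and scores shown are 
hypothetical and are not drawn from any agency's actual selection 
process.

# Hypothetical weighted scoring model: each criterion's score is
# multiplied by its "relative value" weight, and the weighted scores
# are summed to produce a single numerical value used to compare and
# rank candidate investments.

def weighted_score(scores, weights):
    # Sum of (score x weight) across all selection criteria.
    return sum(scores[criterion] * weights[criterion] for criterion in weights)

# Hypothetical selection criteria and relative-value weights.
weights = {"mission alignment": 0.40, "risk": 0.25, "cost-benefit": 0.35}

# Hypothetical candidate investments scored from 1 to 10 on each criterion.
investments = {
    "Investment A": {"mission alignment": 8, "risk": 6, "cost-benefit": 7},
    "Investment B": {"mission alignment": 5, "risk": 9, "cost-benefit": 6},
}

# Rank the candidates by their weighted score, highest first.
for name, scores in sorted(investments.items(),
                           key=lambda item: weighted_score(item[1], weights),
                           reverse=True):
    print(name, round(weighted_score(scores, weights), 2))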

[41] U.S. General Accounting Office, DOD Business Systems 
Modernization: Improvements to Enterprise Architecture Development and 
Implementation Efforts Needed, GAO-03-458 (Washington, D.C. Feb. 28, 
2003). 

[42] U.S. General Accounting Office, DOD Business Systems 
Modernization: Important Progress Made to Develop Business Enterprise 
Architecture, but Much Work Remains, GAO-03-1018 (Washington, D.C. 
Sept. 19, 2003). 

[43] The technical review board, which serves as the department's 
first-tier investment review board, is chaired by the deputy CIO, and 
its members consist of IRM managers and administrative officers from 
each component. 

[44] DOD submitted a single letter that included comments from the 
Departments of the Air Force, Army, and Navy. 

[45] GAO/AIMD-10.1.23.

[46] The CIO Council is the principal interagency forum for improving 
agency practices related to the design, acquisition, development, 
modernization, use, operation, sharing, and performance of federal 
government information resources.

GAO's Mission:

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. General Accounting Office

441 G Street NW, Room LM:

Washington, D.C. 20548:

To order by Phone: 

 Voice: (202) 512-6000:

 TDD: (202) 512-2537:

 Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800:

U.S. General Accounting Office, 441 G Street NW, Room 7149, Washington, 
D.C. 20548: