This is the accessible text file for GAO report number GAO-04-1008 
entitled 'Financial Management Systems: Lack of Disciplined Processes 
Puts Implementation of HHS' Financial System at Risk' which was 
released on September 30, 2004.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to Congressional Requesters:

September 2004:

FINANCIAL MANAGEMENT SYSTEMS:

Lack of Disciplined Processes Puts Implementation of HHS' Financial 
System at Risk:

GAO-04-1008:

GAO Highlights:

Highlights of GAO-04-1008, a report to congressional requesters.

Why GAO Did This Study:

In June 2001, the Secretary of HHS directed the department to 
establish a unified accounting system that, when fully implemented, 
would replace five outdated accounting systems. GAO was asked to 
review HHS’ ongoing effort to develop and implement the Unified 
Financial Management System (UFMS) and to focus on whether the agency 
has 
(1) effectively implemented disciplined processes; 
(2) implemented effective information technology (IT) investment 
management, enterprise architecture, and information security 
management; and 
(3) taken actions to ensure that the agency has the human capital 
needed to successfully design, implement, and operate UFMS.

What GAO Found:

HHS has not followed key disciplined processes necessary to reduce the 
risks associated with implementing UFMS to acceptable levels. While 
development of a core financial system can never be risk free, 
effective implementation of disciplined processes can reduce those 
risks to acceptable levels. The problems that have been identified in 
such key areas as requirements management (including developing a 
concept of operations), testing, data conversion, systems interfaces, 
and risk management, compounded by incomplete IT management practices, 
information security weaknesses, and problematic human capital 
practices, significantly increase the risk that UFMS will not fully 
meet one or more of its cost, schedule, and performance objectives. 

With initial deployment of UFMS at the Centers for Disease Control and 
Prevention (CDC) scheduled for October 2004, HHS has not developed 
quantitative measures sufficient to evaluate its project efforts and to 
determine the impact of the many process weaknesses identified by GAO 
and others. Without well-defined requirements that are traceable 
from origin to implementation, HHS cannot be assured that the system 
will provide the functionality needed and that testing will identify 
significant defects in a timely manner prior to rollout when they are 
less costly to correct. The agency has not developed the necessary 
framework for testing requirements, and its schedule leaves little 
time for correcting process weaknesses and identified defects. HHS has 
focused on meeting its predetermined milestones in the project schedule 
to the detriment of disciplined processes. If HHS continues on this 
path, it risks not achieving its goal of a common accounting system 
that produces data for management decision making and financial 
reporting and risks perpetuating its long-standing accounting system 
weaknesses with substantial workarounds to address needed capabilities 
that have not been built into the system. Accordingly, GAO believes 
these issues need to be addressed prior to deployment at CDC.

Beyond the risks associated with this specific system development, HHS 
has departmental weaknesses in IT investment management, enterprise 
architecture, and information security. Because of the risks related 
to operating UFMS in an environment with flawed information security 
controls, HHS needs to take action to ensure that UFMS benefits from 
strong information security controls. HHS is modifying its IT 
investment management policies, developing an enterprise architecture, 
and responding to security weaknesses with several ongoing activities, 
but substantial progress in these areas is needed to prevent increased 
risks to cost, schedule, and performance objectives for UFMS.

In human capital, many positions were not filled as planned and 
strategic workforce planning was not timely. HHS has taken the first 
steps to address these issues; however, ongoing staff shortages have 
played a role in several key deliverables being significantly behind 
schedule.

What GAO Recommends:

GAO makes 34 recommendations that focus on helping HHS reduce the risks 
associated with its implementation of UFMS. These recommendations are 
aimed at establishing strong, disciplined processes, addressing 
information security weaknesses, and strengthening human capital.

In its comments, HHS indicated that it has implemented some of our 
recommendations but disagreed with our conclusion that a lack of 
disciplined processes puts UFMS at risk. HHS also commented on issues 
such as implementation methodology, testing, requirements management, 
program management, IT management, and our review. 

www.gao.gov/cgi-bin/getrpt?GAO-04-1008. To view the full product, 
including the scope and methodology, click on the link above. For more 
information, contact Sally Thompson (202) 512-9450, thompsons@gao.gov 
or Keith Rhodes (202) 512-6412, rhodesk@gao.gov.

[End of section]

Contents:

Letter: 

Results in Brief: 

Background: 

HHS Has Not Effectively Implemented the Disciplined Processes Necessary 
to Reduce UFMS Program Risks to Acceptable Levels: 

Weaknesses in HHS' IT Management Practices and Information Security 
Controls Put UFMS at Risk: 

Human Capital Issues Increase Risk Associated with the Implementation 
of UFMS: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes:

Appendix I: Scope and Methodology: 

Appendix II: Disciplined Processes Are Key to Successful System 
Development and Implementation Efforts: 

Requirements Management: 

Testing: 

Project Planning and Oversight: 

Risk Management: 

Appendix III: An Effective Requirements Management Process and the UFMS 
Functionality for CDC Had Not Been Fully Developed: 

Appendix IV: Comments from the Department of Health and Human Services: 

Appendix V: GAO Contacts and Staff Acknowledgments: 

GAO Contacts: 

Acknowledgments: 

Tables: 

Table 1: Current Agency Accounting Systems: 

Table 2: Type of Action Needed to Address Data Conversion Findings: 

Figures: 

Figure 1: HHS' Integration Strategy: 

Figure 2: Percentage of Effort Associated with Undisciplined Projects: 

Figure 3: Relationship between Requirements Development and Testing: 

Abbreviations: 

ACF: Administration for Children and Families:

AHRQ: Agency for Healthcare Research and Quality:

AoA: Administration on Aging:

ATSDR: Agency for Toxic Substances and Disease Registry:

BIA: Bureau of Indian Affairs:

CAS: Central Accounting System:

CDC: Centers for Disease Control and Prevention:

CFO: Chief Financial Officer:

CMS: Centers for Medicare and Medicaid Services:

COTS: commercial off-the-shelf:

DHS: Department of Homeland Security:

DOD: Department of Defense:

EIN: employer identification number:

ERP: enterprise resource planning:

FACS: Financial Accounting Control System:

FDA: Food and Drug Administration:

FEMA: Federal Emergency Management Agency:

FFMIA: Federal Financial Management Improvement Act of 1996:

FISCAM: Federal Information System Controls Audit Manual:

GLAS: General Ledger Accounting System:

HHS: Department of Health and Human Services:

HIGLAS: Healthcare Integrated General Ledger Accounting System:

HRSA: Health Resources and Services Administration:

IEEE: Institute of Electrical and Electronics Engineers:

IG: Inspector General:

IHS: Indian Health Service:

IT: information technology:

IV&V: independent verification and validation:

JFMIP: Joint Financial Management Improvement Program:

NASA: National Aeronautics and Space Administration:

NBRSS: National Institutes of Health Business and Research Support 
System:

NIH: National Institutes of Health:

OMB: Office of Management and Budget:

OS: Office of the Secretary of Health and Human Services:

PSC: Program Support Center:

SAMHSA: Substance Abuse and Mental Health Services Administration:

SEI: Software Engineering Institute:

SGL: U.S. Government Standard General Ledger:

TOPS: Total On-Line Processing System:

UFMS: Unified Financial Management System:

Letter September 23, 2004:

The Honorable Todd R. Platts: 
Chairman: 
The Honorable Edolphus Towns: 
Ranking Minority Member: 
Subcommittee on Government Efficiency and Financial Management: 
Committee on Government Reform: 
House of Representatives:

The Honorable Marsha Blackburn: 
House of Representatives:

The ability to produce the information needed to efficiently and 
effectively manage the day-to-day operations of the federal government 
and provide accountability to taxpayers and the Congress has been a 
long-standing challenge for federal agencies. To address some of these 
problems, many agencies are in the process of replacing their core 
financial systems as part of their financial management system 
improvement efforts. Although the implementation of any major system is 
not a risk-free proposition, organizations that follow and effectively 
implement accepted best practices in systems development and 
implementation (commonly referred to as disciplined processes) can 
reduce these risks to acceptable levels. The use of the term acceptable 
levels acknowledges the fact that any systems acquisition has risks and 
will suffer the adverse consequences associated with defects. However, 
effective implementation of the disciplined processes reduces the 
potential for risks to occur and helps prevent those that do occur from 
having any significant adverse impact on the cost, timeliness, and 
performance of the project.

Because of the importance of these financial management system 
improvement efforts and your question as to whether agencies are 
employing disciplined processes in implementing new systems, you asked 
us to evaluate the current plans for implementing financial management 
systems at the Chief Financial Officers Act (CFO Act) agencies.
[Footnote 1] 
As agreed with your offices, we initiated our review at the Department 
of Health and Human Services (HHS). HHS has undertaken a multiyear 
effort to implement its Unified Financial Management System (UFMS), a 
new core financial system, to help HHS management monitor budgets, 
conduct operations, evaluate program performance, and make financial 
and programmatic decisions. As a core financial system, UFMS will 
interface with an estimated 110 other HHS information systems. HHS 
envisions the eventual UFMS as a departmentwide system that will 
include the core financial systems currently under development at the 
National Institutes of Health (NIH) and the Centers for Medicare and 
Medicaid Services (CMS), along with an integrated system for the 
Centers for Disease Control and Prevention (CDC), the Food and Drug 
Administration (FDA), and the Program Support Center (PSC), which 
provides accounting support for the remaining HHS organizations.

This report provides our assessment of HHS' ongoing effort to develop 
and implement the integrated UFMS at CDC, FDA, and PSC, and focuses on 
whether the agency has (1) effectively implemented key disciplined 
processes in the development of UFMS to provide reasonable assurance 
that UFMS meets its cost, schedule, and performance goals; (2) 
implemented effective investment management, enterprise architecture, 
and security management to support UFMS efforts; and (3) taken actions 
to ensure that HHS has the human capital needed to successfully design, 
implement, and operate UFMS.

To achieve these objectives, we reviewed documentation related to the 
project and interviewed HHS officials and contractors used by HHS to 
assist with implementation. We used relevant government and industry 
standards, such as those from the Software Engineering Institute (SEI) 
and the Institute of Electrical and Electronics Engineers (IEEE), along 
with key best practice guides such as our Executive Guide: Creating 
Value Through World-class Financial Management,[Footnote 2] to assess 
the status of HHS' implementation of disciplined processes. This report 
does not assess HHS' other financial management improvement efforts at 
NIH and CMS. We conducted our work in Washington, D.C., Rockville, 
Maryland, and Atlanta, Georgia, from September 2003 through May 2004 in 
accordance with U.S. generally accepted government auditing standards. 
More details on our scope and methodology can be found in appendix I.

Results in Brief:

HHS has adopted some best practices in its development of UFMS, in 
particular, sponsorship from senior financial management officials and 
routine reviews of its progress by various HHS officials. However, at 
the time of our review, HHS had not effectively implemented several 
disciplined processes--accepted best practices in systems development 
and implementation efforts that have been shown to reduce risks to 
acceptable levels and therefore are key to a project's success--and had 
adopted other practices that put the project at unnecessary risk.

HHS officials told us that they had carefully considered the risks 
associated with implementing UFMS and that they had put in place 
strategies to manage these risks and to allow the project to meet its 
schedule within budget. However, we found that HHS had focused on 
meeting its schedule to the detriment of disciplined processes and thus 
had introduced unnecessary risks that may compromise the system's cost, 
schedule, and performance. Key disciplined processes that HHS had not 
fully embraced were requirements management (including developing a 
concept of operations), testing, project management and oversight using 
quantitative measures, and risk management. Compounding these problems 
are departmentwide weaknesses in investment management, enterprise 
architecture, and information security. Specifically, HHS had not 
established the information technology management processes needed to 
provide UFMS with a solid foundation for development. Also, staff 
shortages and limited strategic workforce planning have resulted in the 
project not having the resources needed to effectively design and 
operate UFMS. In our work at other agencies, we have found that project 
deficiencies such as those at HHS have led to a range of problems, from 
increased cost and reduced functionality to system failure. If UFMS 
continues along this path of development, it runs a much higher risk of 
following a long line of troubled system development efforts involving 
schedule delays and increased development costs for a system that 
ultimately may not serve the agency well.

Among the disciplined processes, we focused on requirements management 
and testing because these areas form the foundation for project 
success. To guide its requirements development process, HHS prepared a 
number of documents, such as the Financial Shared Services Study 
Concept of Operation and Initial Global Process Designs. However, the 
documents were developed too late or lacked the information needed to 
effectively guide development, and they did not include the key 
document--a concept of operations--that specifies the high-level 
business processes that form the basis for defining system 
requirements. HHS did establish a framework for its requirements 
development, a hierarchy of definitions from high-level processes to 
the detailed definitions needed for software development. However, the 
requirements we tested, which are the specifications that system 
developers use to design and develop a system, were not defined at each 
level and so could not be traced through the hierarchy as needed for 
system development and implementation. Individually, definitions were 
not specific enough to reduce requirements-related defects to 
acceptable levels. With these weaknesses, HHS did not have a firm 
foundation of requirements for testing activities, such as system 
testing, which verifies that the complete system satisfies functional 
requirements. In addition, system testing and data conversion[Footnote 
3] are occurring late in the project schedule, leaving little time to 
address any defects, which are commonplace in a large project such as 
UFMS, before the first UFMS implementation, scheduled for October 2004 
at CDC.

In addition to requirements and testing, we found weaknesses in the 
disciplined processes of risk management and project management and in 
the quantitative data needed to support management's assessment of the 
project's condition. HHS maintained a database of risks; however, where 
mitigation strategies had been identified, the database listed 
unresolved risks as closed. Project managers agreed to revise their 
procedures to provide more information, and this change should improve 
their ability to oversee project risks. In project management, UFMS had 
high-level support from senior financial management officials and 
assessments from a contractor hired to perform oversight services. 
However, HHS was slow to take action on several recommendations made by 
the contractor. For example, although the contractor identified the 
lack of personnel as a major risk factor in June 2003, this problem was 
not substantially addressed until more than 6 months later. In 
addition, in gathering data for project assessment, HHS had not 
effectively captured the metrics needed to assess capabilities, 
problems, and corrective actions and had not implemented a process to 
ensure that defects are promptly reported and corrected. These 
problems, if not corrected before system launch, will have to be 
addressed while the system is in operation, potentially resulting in 
costly and time-consuming rework and cumbersome procedures to 
compensate for a system that does not function as expected.

We have previously reported--and HHS has acknowledged--weaknesses in 
the HHS-wide information technology management processes within which 
UFMS will be implemented. HHS is modifying its information technology 
(IT) investment management policies, developing an enterprise 
architecture, and responding to security weaknesses with several 
ongoing activities; but these changes may not be implemented in time to 
prevent increased risks to cost, schedule, and performance objectives 
for this particular initiative. In investment management, we found 
weaknesses in review board procedures, coordination of decision making 
among review boards, and selection criteria. With most of the planning 
and development of UFMS completed, HHS has not yet established an 
agencywide enterprise architecture to guide and constrain its IT 
projects. Our experience has shown that without an enterprise 
architecture in place before planning and development, the project 
increases its risk of facing such problems as duplication, lack of 
integration, and costly maintenance. In addition, HHS has recognized 
the need to improve information security throughout the department and 
has various initiatives under way. However, it has not yet fully 
implemented the key elements of a comprehensive security management 
program. We found that HHS had not conducted a comprehensive assessment 
of information security general controls agencywide. Some operating 
divisions had not been recently assessed, and some that were recently 
assessed had not provided UFMS with current information. Without 
information on control weaknesses in the operating divisions, UFMS 
management is not in a position to develop mitigating controls.

In human capital, UFMS had a project manager, systems integrator, and 
some functional experts at the time of our review; however, many 
positions were not filled as planned, and ongoing staff shortages have 
played a role in key deliverables being significantly behind schedule. 
HHS had taken the first steps in strategic workforce planning; however, 
CDC, the site for UFMS' first implementation, was the only operating 
division that had prepared a competency report or adopted the project's 
global competency report. Further, a skills gap analysis and site-
specific training plan had not been completed for CDC.

We are making 9 recommendations to help HHS address the risks 
associated with implementing UFMS at CDC in October 2004 and, as HHS 
moves forward with UFMS, we are making another 25 recommendations aimed 
at establishing strong disciplined processes, addressing information 
security weaknesses, and strengthening human capital in order to 
minimize the risk and, ultimately, the resources needed to efficiently 
and effectively implement UFMS.

We requested comments on a draft of this report from the Secretary of 
Health and Human Services or his designee. Written comments from the 
Department of Health and Human Services are reprinted in appendix IV 
and evaluated in the "Agency Comments and Our Evaluation" section. In 
written comments on a draft of our report, HHS described its actions 
taken to date on some of our recommendations, and its other planned 
actions. If fully implemented, the actions HHS has taken and plans to 
take in the future should help to reduce some of the risks to the 
project. HHS contended that its processes have been rigorously executed 
and disagreed with our conclusion that a lack of disciplined processes 
is placing the UFMS program at risk. We disagree. We believe that if 
HHS continues to implement disciplined processes ineffectively, it 
cannot reduce risk to a reasonable level and risks implementing a 
system that does not serve its needs and will require costly and time-
consuming rework once in operation. HHS believes that the risk in its 
approach 
results from an aggressive project schedule, not the lack of 
disciplined processes. We agree that HHS has adopted an aggressive 
project schedule that increased the risks to UFMS. To keep to its 
schedule as it now stands, HHS is at risk of not substantively 
accomplishing the milestones in the schedule or, if it does implement 
the system in October 2004 as planned, of fielding a system with 
compromised functionality that must rely on manual workarounds. HHS also 
disagreed with several of our findings and stated its position on 
issues including implementation methodology, testing, requirements 
management, program management oversight, and human capital.

Background:

HHS is the federal government's principal agency for protecting the 
health of Americans and provides essential human services, such as 
ensuring food and drug safety and assisting needy families. HHS 
disburses almost a quarter of all federal outlays and administers more 
grant dollars than all other federal agencies combined, providing more 
than $200 billion of over $350 billion in federal funds awarded to 
states and other entities in fiscal year 2002, the most recent year for 
which these data are available. For fiscal year 2004, HHS had a budget 
of $548 billion and over 66,000 employees. HHS comprises 11 
agencies[Footnote 4] led by the Office of the Secretary, covering a wide 
range of activities including conducting and sponsoring medical and 
social science research, guarding against the outbreak of infectious 
diseases, assuring the safety of food and drugs, and providing health 
care services and insurance.

HHS is required by the CFO Act of 1990[Footnote 5] to modernize its 
financial management systems and by the Federal Financial Management 
Improvement Act (FFMIA) of 1996[Footnote 6] to have auditors--as part 
of an audit report on the agency's annual financial statements--
determine whether the agency's financial management systems comply 
substantially with three requirements: (1) federal financial management 
systems requirements,[Footnote 7] (2) applicable federal accounting 
standards, and (3) the U.S. Government Standard General Ledger 
(SGL)[Footnote 8] at the transaction level.

While HHS has received unqualified opinions on its financial statements 
at the consolidated departmental level since fiscal year 1999, the 
underlying financial systems that assist in the preparation of 
financial statements have not met all applicable requirements. For 
fiscal years 1997 through 2003, HHS auditors reported that the 
department's systems did not substantially comply with federal 
financial management systems requirements, and for fiscal year 2003, 
they reported that the systems also lacked compliance with the SGL 
requirement. In describing the financial management problems in the 
fiscal year 2003 financial statement audit report, the HHS Inspector 
General (IG) stated that the department's lack of an integrated 
financial system and internal control weaknesses made it difficult for 
HHS to prepare timely and reliable financial statements. The IG also 
noted that preparation of HHS financial statements required substantial 
"work arounds," cumbersome reconciliations and consolidation 
processes, and significant adjustments to reconcile subsidiary records 
to reported balances on the financial statements.

HHS' Financial System Implementation Effort:

In June 2001, the Secretary of HHS directed the department to establish 
a unified accounting system that, when fully implemented, would replace 
five outdated accounting systems. HHS considers the UFMS program a 
business transformation effort with IT, business process improvement, 
and operations consolidation components. According to HHS, the program 
supports the Office of Management and Budget's (OMB) requirements for 
each agency to implement and operate a single, integrated financial 
management system (required by OMB Circular No. A-127). HHS asserts 
that its approach will require it to institute a common set of business 
rules, data standards, and accounting policies and procedures, thereby 
significantly furthering the Secretary's management objectives. Table 1 
depicts the current accounting systems that will be replaced and the 
organizations currently served.

Table 1: Current Agency Accounting Systems:

Current accounting systems: CORE Accounting System; 
Agencies served: 
Administration for Children and Families (ACF); 
Administration on Aging (AoA); 
Agency for Healthcare Research and Quality (AHRQ); 
Health Resources and Services Administration (HRSA); 
Indian Health Service (IHS); 
Office of the Secretary of Health and Human Services (OS); 
Substance Abuse and Mental Health Services Administration (SAMHSA); 
These are entities supported by the Program Support Center (PSC).[A].

Current accounting systems: Total On-Line Processing System (TOPS); 
Agencies served: Centers for Disease Control and Prevention (CDC)[B].

Current accounting systems: General Ledger Accounting System (GLAS); 
Agencies served: Food and Drug Administration (FDA).

Current accounting systems: Central Accounting System (CAS); 
Agencies served: National Institutes of Health (NIH).

Current accounting systems: Financial Accounting Control System (FACS); 
Agencies served: Centers for Medicare and Medicaid Services (CMS). 

Source: HHS.

[A] The Program Support Center is an administrative office, 
organizationally aligned under the Office of the Secretary. The CORE 
Accounting system has been described as the "nucleus" of PSC's 
accounting operations.

[B] Includes the Agency for Toxic Substances and Disease Registry 
(ATSDR).

[End of table]

In response to the Secretary's direction, HHS began a project to 
improve its financial management operations.[Footnote 9] CMS and NIH 
had already initiated projects to replace their financial systems. 
Figure 1 illustrates the systems being replaced, the new configuration, 
and the approximate known implementation costs.

Figure 1: HHS' Integration Strategy:

[See PDF for image] 

[End of figure] 

As shown in figure 1, HHS plans to pursue a phased approach to 
achieving the Secretary's vision. The first phase is to implement the 
system at CDC and, as of May 2004, CDC was expected to begin using the 
system for its operations starting in fiscal year 2005 (October 2004). 
FDA was expected to implement UFMS in May 2005, and the entities served 
by PSC were to be phased in from July 2005 through April 2007. After 
all of the individual component agency implementations have been 
completed, UFMS and HHS consolidated reporting will be deployed. This 
effort involves automating the department's financial reporting 
capabilities and is expected to integrate the NIH Business and Research 
Support System (NBRSS) and CMS' Healthcare Integrated General Ledger 
Accounting System (HIGLAS), which are scheduled to be fully implemented 
in 2006 and 2007, respectively, into UFMS. The focus of our review was 
on the system implementation efforts associated with the HHS entities 
not covered by the NBRSS and HIGLAS efforts.

As shown in figure 1, the costs for this financial management system 
improvement effort can be broken down into four broad areas: NIH, CMS, 
all other HHS entities, and a system to consolidate the results of HHS' 
financial management operations. HHS estimates that it will spend about 
$713 million as follows:

* $110 million[Footnote 10] for its NIH efforts (NBRSS),

* $393 million to implement HIGLAS, and:

* $210 million for remaining HHS organizations.

HHS has not yet developed an estimate of the costs associated with 
integrating these efforts into the HHS unified financial management 
system envisioned in Secretary Thompson's June 2001 directive.

HHS selected a commercial off-the-shelf (COTS) product, Oracle U.S. 
Federal Financials software (certified by the Program Management Office 
of the Joint Financial Management Improvement Program (JFMIP)[Footnote 
11] for federal agencies' use), as the system it would use to design 
and implement UFMS. The department has hired two primary contractors to 
help implement UFMS. In November 2001, HHS awarded KPMG Consulting (now 
BearingPoint) a contract as systems integrator for assistance in 
planning, designing, and implementing UFMS. As the systems integrator, 
BearingPoint is expected to provide team members who are experienced in 
the enterprise resource planning (ERP)[Footnote 12] software and its 
installation, configuration, and customization, with expertise in 
software, hardware, business systems architecture, and business process 
transformation. HHS selected Titan Corporation to act as the 
project's independent verification and validation (IV&V) contractor, 
tasked with determining the programmatic, management, and technical 
status of the UFMS project and recommending actions to mitigate any 
identified risks to project success.

When fully implemented, UFMS is expected to permit the consolidation of 
financial data across all HHS component agencies to support timely and 
reliable departmentwide financial reporting. In addition, it is 
intended to integrate financial information from the department's 
administrative systems, including travel management systems, property 
systems, logistics systems, acquisition and contracting systems, and 
grant management systems. The department's goals in the development and 
implementation of this integrated system are to achieve greater 
economies of scale; eliminate duplication; provide better service 
delivery; and help management monitor budgets, conduct operations, 
evaluate program performance, and make financial and programmatic 
decisions.

HHS Has Not Effectively Implemented the Disciplined Processes Necessary 
to Reduce UFMS Program Risks to Acceptable Levels:

Experience has shown that organizations that adopt and effectively 
implement best practices, referred to in systems development and 
implementation efforts as the disciplined processes, can reduce the 
risks associated with these projects to acceptable levels.[Footnote 13] 
Although HHS has adopted some of the best practices associated with 
managing projects such as UFMS, it has adopted other practices that 
significantly increase the risk to the project. Also, HHS has not yet 
effectively implemented several of the disciplined processes--
requirements management, testing, project management and oversight, and 
risk management--necessary to reduce its risks to acceptable levels and 
has exposed the project to unnecessary risk that it will not achieve 
its cost, schedule, and performance objectives.

The project has been able to obtain high-level sponsorship at HHS, with 
senior financial management officials and other HHS personnel routinely 
reviewing its progress. HHS officials maintain that the project is on 
schedule and 
that the functionality expected to be available for its first 
deployment, at CDC in October 2004, is well known and acceptable to its 
users. However, the IV&V contractor identified a number of serious 
deficiencies that are likely to affect HHS' ability to successfully 
implement UFMS within its current budget and schedule while providing 
the functionality needed to achieve its goals. HHS management has been 
slow to take the recommended corrective actions necessary to address 
the findings and recommendations of its IV&V contractor. Further, it is 
not clear that the decision to proceed from one project milestone to 
the next is based on quantitative data that indicate tasks have been 
effectively completed. Rather, decisions to progress have been driven 
by the project's schedule. With a focus on meeting schedule milestones 
and without quantitative data, HHS faces significant risk that UFMS 
will suffer the adverse impacts on its cost, schedule, and performance 
that have been experienced by projects with similar problems.

Effective Implementation of the Disciplined Processes Is Key to 
Reducing Project Risks:

Disciplined processes, which are fundamental to successful systems 
development and implementation efforts, have been shown to reduce to 
acceptable levels the risks associated with software development and 
acquisition. A disciplined software development and acquisition process 
can maximize the likelihood of achieving the intended results 
(performance) within established resources (costs) on schedule. 
Although there is no standard set of practices that will ever guarantee 
success, several organizations, such as SEI[Footnote 14] and 
IEEE,[Footnote 15] as well as individual experts, have identified and 
developed the types of policies, procedures, and practices that have 
been demonstrated to reduce development time and enhance effectiveness. 
The key to having a disciplined system development effort is to have 
disciplined processes in multiple areas, including project planning and 
management, requirements management, configuration management, risk 
management, quality assurance, and testing. Effective processes should 
be implemented in each of these areas throughout the project life cycle 
because change is constant. Effectively implementing the disciplined 
processes necessary to reduce project risks to acceptable levels is 
hard to achieve because a project must effectively implement several 
best practices, and inadequate implementation of any one may 
significantly reduce or even eliminate the positive benefits of the 
others.

Acquiring and implementing a new financial management system requires a 
methodology that starts with a clear definition of the organization's 
mission and strategic objectives and ends with a system that meets 
specific information needs. We have seen many system efforts fail 
because agencies started with a general need, such as improving 
financial management, but did not define in precise terms (1) the 
specific problems they were trying to solve, (2) what their operational 
needs were, and (3) what specific information requirements flowed from 
these operational needs. Instead, they plunged into the acquisition and 
implementation process in the belief that these specifics would somehow 
be defined along the way. The typical result was that systems were 
delivered well past anticipated milestones; failed to perform as 
expected; and, accordingly, were over budget because of required costly 
modifications.

Figure 2 shows how organizations that do not effectively implement the 
disciplined processes lose the productive benefits of their efforts as 
a project continues through its development and implementation cycle. 
Although undisciplined projects show a great deal of productive work at 
the beginning of the project, the rework associated with defects begins 
to consume more and more resources. In response, processes are adopted 
in the hopes of managing what later turns out, in reality, to have been 
unproductive work. Generally, these processes are "too little, too 
late," and rework begins to consume more and more resources because 
sufficient foundations for building the systems were not laid or were 
laid inadequately. Experience has shown that projects for which 
disciplined processes are not implemented at the beginning are forced 
to implement them later when it takes more time and they are less 
effective.[Footnote 16]

Figure 2: Percentage of Effort Associated with Undisciplined Projects:

[See PDF for image] 

[End of figure] 

As shown in figure 2, a major consumer of project resources in 
undisciplined efforts is rework (also known as thrashing). Rework 
occurs when the original work has defects or is no longer needed 
because of changes in project direction. Disciplined organizations 
focus their efforts on reducing the amount of rework because it is 
expensive. Fixing a defect during the testing phase costs anywhere from 
10 to 100 times the cost of fixing it during the design or requirements 
phase.[Footnote 17] As shown in figure 2, projects that are unable to 
successfully address their rework will eventually spend their efforts 
only on rework and the associated processes rather than on productive 
work. In other words, the project will continually find 
itself reworking items. Appendix II provides additional information on 
the disciplined processes.

HHS Has Not Effectively Implemented Key Processes Necessary to Reduce 
Risks to Acceptable Levels:

We found that HHS has not implemented effective disciplined processes 
in several key process areas that have been shown to form the 
foundation for project success or failure, including requirements 
management, testing, project management and oversight, and risk 
management. Problems with HHS' requirements management practices 
include the lack of (1) a concept of operations to guide the 
development of requirements, (2) traceability of a requirement from the 
concept of operations through testing to ensure requirements were 
adequately addressed in the system, and (3) specificity in the 
requirements to minimize confusion in the implementation. These 
problems with requirements have resulted in a questionable foundation 
for the systems' testing process. In addition, HHS has provided an 
extremely limited amount of time to address defects identified from 
system testing, which reflects an optimism not supported by other HHS 
testing efforts, including those performed to test the conversion of 
data from CDC's legacy system to UFMS. This type of short time frame 
generally indicates that a project is being driven to meet 
predetermined milestones in the project schedule. While adherence to 
schedule goals is generally desirable, if corners are cut and there are 
not adequate quantitative data to assess the risks to the project of 
not implementing disciplined processes in these areas, the risk of 
project rework or failure rises appreciably. Ineffective implementation 
of these processes exposes a project to the unnecessary risk that 
costly rework will be required, which in turn will adversely affect the 
project's cost and schedule, and can adversely affect the ultimate 
performance of the system.

An effective risk management process can be used by an agency to 
understand the risks that it is undertaking when it does not implement 
an effective requirements management process. In contrast, HHS has 
implemented risk management procedures that close risks before it is 
clear that mitigating actions were effective. HHS has agreed to change 
these procedures so that the actions needed to address risks remain 
visible and at the forefront. While the executive sponsor for the UFMS 
project and other senior HHS officials have demonstrated commitment to 
the project, effective project management and oversight are needed to 
identify and resolve problems as soon as possible, when they are 
cheapest to fix. For example, HHS officials have struggled to address 
problems identified by the IV&V contractor in a timely manner. 
Moreover, HHS officials lack the quantitative data or metrics to 
effectively oversee the project. An effective project management and 
oversight process uses such data to understand matters such as (1) 
whether the project plan needs to be adjusted and (2) oversight actions 
that may be needed to ensure that the project meets its stated goals 
and complies with agency guidance. With ineffective project oversight, 
in contrast, management can only respond to problems as they arise.

HHS' Requirements Management Process Is Ineffective:

We found significant problems in HHS' requirements management process. 
(See appendix III for a more detailed discussion.) We found that HHS 
had not (1) developed a concept of operations that can be used to guide 
its requirements development process, (2) maintained traceability 
between the various requirements documents to ensure consistency, and 
(3) developed requirements that were unambiguous. Because of these 
weaknesses, HHS does not have reasonable assurance that the UFMS 
project is free of significant requirement defects that will cause 
significant rework.

Requirements are the specifications that system developers and program 
managers use to design, develop, and acquire a system. They need to be 
unambiguous, consistent with one another, verifiable, and directly 
traceable to higher-level business or functional requirements. It is 
critical that requirements flow directly from the organization's 
concept of operations, which describes how the organization's day-to-
day operations (1) are being carried out and (2) will be carried out to 
meet mission needs.[Footnote 18] Examples of problems noted in our 
review include the following.

* Requirements were not based on a concept of operations. HHS has 
prepared a number of documents that discuss various aspects of its 
vision for UFMS. However, these documents do not accomplish the 
principal objective associated with developing a concept of operations-
-specifying the high-level business processes that are expected to form 
the basis for requirements definition. One such document, issued April 
30, 2004,[Footnote 19] discusses the use of shared service centers
[Footnote 20] to perform financial management functions. This document 
was issued well after implementation efforts were under way and about 
5 months before the expected deployment date of UFMS at CDC. As 
discussed in more detail in appendix III, the April 30 document does 
not clearly explain who will perform these functions, and where and how 
these functions will be performed.

* Requirements were not traceable. HHS developed a hierarchical 
approach to defining its requirements. HHS defined the high-level 
requirements that were used to identify the requirements that could not 
be satisfied by the COTS product. Once these high-level requirements 
were defined, a hierarchical requirements management process was 
developed which included (1) reviewing and updating the requirements 
through process design workshops,[Footnote 21] (2) establishing the 
initial baseline requirements, (3) performing a fit/gap analysis, (4) 
developing gap closure alternatives, and (5) creating the final 
baseline requirements. The key in using such a hierarchy is that each 
step of the process builds upon the previous step (a simplified 
illustration of such a traceability check follows this list). However, 
this 
traceability was not maintained for the 74 requirements we reviewed. 
Therefore, HHS has little assurance that (1) requirements defined in 
the lower-level requirements documents are consistent with and 
adequately cover the higher-level requirements and (2) testing efforts 
based on lower-level requirements documents will adequately assess 
whether UFMS can meet the high-level requirements used to define the 
overall functionality expected from UFMS. Appendix III provides more 
details on problems we identified related to the traceability of 
requirements.

* Requirements were not always specific. Many requirements reviewed 
were not sufficiently specific to reduce requirements-related defects 
to acceptable levels. For example, one inadequately defined requirement 
stated that the system "shall track actual amounts and verify 
commitments and obligations against the budget as revised, consistent 
with each budget distribution level." The "Define Budget Distributions" 
process area was expected to provide the additional specificity needed 
for this requirement. However, as of May 2004, this process document 
stated that the functionality was "To Be Determined." Until HHS 
provides additional information concerning this requirement, it will 
not be able to determine whether the system can meet the requirement. 
Items that will need to be defined include the number of budget 
distribution levels that must be supported and what it means to verify 
the commitments and obligations against the revised budget. Appendix 
III includes more details on the problems related to the specificity of 
HHS' requirements.
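
To illustrate the kind of automated check that requirements 
traceability supports, the following minimal sketch, written in Python, 
links each lower-level requirement to its parent and flags broken 
traces and requirements without test coverage. The requirement 
identifiers, descriptions, and test references are invented for 
illustration and are not drawn from actual UFMS artifacts.

# Hypothetical illustration of a requirements traceability check.
# Requirement IDs, parents, and test references are invented examples,
# not actual UFMS data.
requirements = {
    # id: (parent id, or None for a high-level requirement; description)
    "HL-12": (None, "Track actual amounts and verify obligations "
                    "against the revised budget"),
    "BL-12.1": ("HL-12", "Verify commitments at each budget "
                         "distribution level"),
    "BL-12.2": ("HL-99", "Report actual amounts by program"),  # HL-99 missing
}

# Test cases claim coverage of specific requirement IDs.
test_coverage = {"TC-201": ["BL-12.1"]}
covered = {req_id for ids in test_coverage.values() for req_id in ids}

for req_id, (parent, _desc) in requirements.items():
    # Each lower-level requirement must trace to an existing
    # higher-level requirement.
    if parent is not None and parent not in requirements:
        print(f"{req_id}: broken trace; parent {parent} not found")
    # Each lower-level requirement should be covered by at least one test.
    if parent is not None and req_id not in covered:
        print(f"{req_id}: no test case provides coverage")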

HHS officials plan to use traditional testing approaches, including 
demonstrations and validations, to show UFMS' compliance with HHS high-
level requirements as well as the requirements contained in the various 
other requirements documents. However, the effectiveness of the testing 
process is directly related to the effectiveness of the requirements 
management process. HHS' IV&V contractor reported that as of April 
2004, the UFMS test program had not been adequately planned to provide 
the foundation for a comprehensive and coordinated process for 
validating that UFMS has the functionality to meet the stated 
requirements. For example, the test planning documents reviewed by the 
IV&V contractor did not have the detail typically found in test 
plans.[Footnote 22] As of May 2004, the information necessary for 
evaluating future testing efforts had not been developed for the 44 
requirements that we reviewed. Because of the weaknesses noted in the 
requirements management process, HHS does not yet have a firm 
foundation on which to base an effective testing program.

Key Testing Processes Have Not Been Completed:

Complete and thorough testing is essential to provide reasonable 
assurance that new or modified systems will provide the capabilities 
specified in the requirements. Testing activities that can provide 
quantitative data 
on the ability of UFMS to meet HHS' needs are scheduled late in the 
implementation cycle. For example, system testing on the capabilities 
for the CDC implementation was planned to start in August 2004 and to 
be completed in a 6-week time frame before the system is expected to 
become operational there. This leaves HHS with little time to address 
any defects identified during the system testing process and to ensure 
that the corrective actions taken to address the defects do not 
introduce new defects. Because HHS has allotted little time for system 
testing and defect correction, problems not corrected before system 
launch will in the worst case result in system failure, or will have to 
be addressed during operations, resulting in potentially costly and 
time-consuming rework.

Testing is even more challenging for this system development because 
HHS had not fully developed, before testing, the overall requirements 
traceability matrix[Footnote 23] needed to determine whether testing 
will address the requirements. HHS is placing a great deal of reliance on 
system testing to provide reasonable assurance of the functionality 
included in UFMS. Also, with system testing scheduled for August, HHS 
had not, as of May 2004, established an effective management framework 
for testing. For example, HHS had not (1) clearly defined the roles and 
responsibilities of the developers and testers, (2) developed 
acceptance criteria, and (3) strictly controlled the testing 
environment. As the IV&V contractor noted, if testing is not properly 
controlled and documented, there is no assurance that the system has 
been adequately tested and will perform as expected. Accordingly, HHS 
will need to develop such documentation before conducting testing 
activities, such as developing test cases and executing the actual 
tests.

Given the issues associated with HHS' requirements management process, 
even if HHS addresses these testing process weaknesses, evaluating UFMS 
based solely on testing will not ensure that CDC's and HHS' needs will 
be met. It is unlikely that the system testing phase will uncover all 
defects in the UFMS system. In fact, testing, based on well-defined 
requirements, performed through the system test phase, often catches 
less than 60 percent of a program's defects.[Footnote 24] In HHS' case, 
problems with its poorly defined requirements make creating test cases 
more challenging and increase the likelihood that the system test phase 
will fail to identify significant defects of the kind that well-planned 
system testing often catches. The remaining errors are found through 
other quality 
assurance practices, such as code inspections, or by end users after 
the software has been put into production. Thus, it will be important 
for HHS to implement a quality assurance program that is both rigorous 
and well-structured.

Initial Data Conversion and System Interface Efforts Encountered 
Problems:

The ability of HHS to effectively address its data conversion and 
system interface challenges will also be critical to the ultimate 
success of UFMS. In its white paper on financial system data 
conversion,[Footnote 25] JFMIP identified data conversion[Footnote 26] 
as one of the critical tasks necessary to successfully implement a new 
financial system. Moreover, JFMIP stated that data conversion is one of 
the most frequently underestimated tasks. JFMIP also noted that if data 
conversion is done right, the new system has a much greater opportunity 
for success. On the other hand, converting data incorrectly or entering 
unreliable data from a legacy system has lengthy and long-term 
repercussions. The adage "garbage in, garbage out" best describes the 
adverse impact. For example, the National Aeronautics and Space 
Administration (NASA) cited data conversion problems as a major reason 
that it was unable to prepare auditable financial statements from its 
new financial management system. HHS officials had initially expected 
to perform only two data conversion testing efforts, but decided that 
two additional data conversion testing efforts were needed after 
identifying 77 issues during the first data conversion test. While 
there is no standard number of data conversion tests that are needed, 
the key to successfully converting data from a legacy system to a new 
system is that the data conversion test is successfully executed with 
minimal errors. In addition, system interfaces had not been fully 
developed as expected for the conference room pilots held in March and 
April 2004. Proper implementation of the interfaces between UFMS and 
the other systems it receives data from and sends data to is essential 
for the successful deployment of UFMS.

HHS had originally expected to perform two data conversion testing 
efforts (commonly referred to as mock conversions) prior to the system 
being implemented at CDC. In discussions with HHS officials, we noted 
that other agencies have found that many more mock conversions are 
required, but HHS officials told us that the project schedule did not 
allow for many more conversion efforts. However, according to HHS, more 
than 8 months of preparatory activities were completed before beginning 
the first mock conversion. They also told us that at least some of 
these data cleanup efforts had started about 3 years ago. As with other 
efforts on this project, the quantitative data necessary to determine 
whether HHS' expectations were realistic, such as the number of issues 
identified during a mock conversion, were not produced until late in 
the implementation cycle. In May 2004, HHS performed the first of its 
two planned mock conversions. On the basis of the results of this 
effort, HHS has now decided that it will need to perform two additional 
mock conversions before the October 2004 implementation at CDC. As 
shown in the following examples of the problems found in the first mock 
conversion, data cleanup was not sufficient in at least some cases to 
support the data conversion efforts.

* Employer identification numbers (EINs) assigned to customers caused 
problems because adequate data cleanup efforts had not yet been 
performed. For example, multiple customers had the same EIN, or an EIN 
on an invoice did not have a corresponding customer. In addition, over 
1,300 vendors lacked the necessary banking information. (A sketch of 
the kind of automated checks that can surface such problems follows 
this list.)

* Problems related to data quality and conversion logic were found in 
the conversions related to general ledger account balances. A primary 
cause of the problems was that the legacy system performed its closing 
activities by appropriation while UFMS performs them by program. On the basis 
of a review of these problems by the project team, one of the team's 
recommendations was that a substantial data cleanup effort in the 
legacy system be started to mitigate the problems identified in this 
mock conversion.
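
The problems above are the kinds of defects that simple automated data 
quality checks can surface before a mock conversion is run. The 
following sketch, written in Python with an invented record layout and 
values rather than actual CDC legacy data, flags EINs shared by 
multiple customers and vendors that lack banking information.

# Hypothetical data quality checks of the kind run before a mock
# conversion; the record layout and values are invented examples.
from collections import defaultdict

customers = [
    {"customer_id": "C001", "ein": "12-3456789"},
    {"customer_id": "C002", "ein": "12-3456789"},  # duplicate EIN
    {"customer_id": "C003", "ein": "98-7654321"},
]
vendors = [
    {"vendor_id": "V001", "bank_routing": "021000021"},
    {"vendor_id": "V002", "bank_routing": ""},  # missing banking data
]

# Flag EINs shared by more than one customer.
customers_by_ein = defaultdict(list)
for customer in customers:
    customers_by_ein[customer["ein"]].append(customer["customer_id"])
for ein, ids in customers_by_ein.items():
    if len(ids) > 1:
        print(f"EIN {ein} is shared by customers: {', '.join(ids)}")

# Flag vendors that lack the banking information needed for conversion.
for vendor in vendors:
    if not vendor["bank_routing"]:
        print(f"Vendor {vendor['vendor_id']} lacks banking information")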

Overall, HHS identified 77 issues that applied to 10 of the 11 business 
activities[Footnote 27] covered by this mock conversion. Table 2 shows 
the types of actions HHS identified as necessary to address these 
issues.

Table 2: Type of Action Needed to Address Data Conversion Findings:

Type of corrective action: Data cleanup; 
Number of issues that will be addressed: 22.

Type of corrective action: Modify data extract process; 
Number of issues that will be addressed: 8.

Type of corrective action: Modify data conversion specification; 
Number of issues that will be addressed: 15.

Type of corrective action: Modify data conversion program; 
Number of issues that will be addressed: 1.

Type of corrective action: Modify configuration; 
Number of issues that will be addressed: 21.

Type of corrective action: Perform further research; 
Number of issues that will be addressed: 10.

Total; 
Number of issues that will be addressed: 77.

Source: HHS.

[End of table]

At the conclusion of the first mock conversion, the project team 
believed that most of the major conversion issues had been identified 
and that subsequent data conversion efforts would only identify issues 
that required refinements to the solutions developed for the issues 
already identified. On the basis of the results of the first mock 
conversion, they also agreed to perform two additional mock 
conversions.

We also noted similar problems in HHS' efforts related to system 
interfaces. For example, one purpose of the March/April 2004 conference 
room pilot was to demonstrate several key system interfaces. However, a 
key feature of system interface efforts--error correction--was not 
available for demonstration since it had not yet been developed. At the 
conference room pilot, a user asked how the error correction process 
would work for transactions that were not processed correctly between 
two systems and was told that the project team had not yet worked out 
how errors would be managed. Until HHS defines and implements this 
functionality, it will be unable to ensure that the processes used to 
exchange data between UFMS and more than 30 CDC systems maintain the 
necessary levels of data integrity. Properly 
implementing the interfaces will be critical to performing a realistic 
system test at CDC and ensuring UFMS will properly operate when in 
production. Also, HHS expects UFMS to interface with about 110 systems 
when it is fully implemented.
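
To make the missing functionality concrete, the following minimal 
Python sketch shows one common pattern for interface error correction, 
in which transactions that fail between systems are routed to a queue 
for review and resubmission rather than silently dropped. The pattern 
and the names (send, error_queue) are illustrative assumptions, not a 
description of UFMS' planned design.

import logging

def exchange(transactions, send, error_queue):
    # Hand each transaction to the receiving system; failures are
    # logged and queued for correction instead of being lost.
    for txn in transactions:
        try:
            send(txn)
        except ValueError as err:  # stand-in for a validation or
                                   # transmission failure
            logging.warning("transaction %s failed: %s",
                            txn["id"], err)
            error_queue.append({"txn": txn, "error": str(err)})
    return error_queue

[End of example]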

HHS Risk Management Process Prematurely Closed Identified Risks As 
Being Resolved:

In our view, a major value of a risk management system is the increased 
visibility over the scope of work and resources needed to address the 
risks. HHS officials have developed a risk assessment and mitigation 
strategy and have implemented a process for managing UFMS risks that 
meets many of the risk management best practices.[Footnote 28] For 
example, they cited a program to identify risks to the project, such 
as staffing shortages and training deficiencies, and to focus HHS 
management's attention on those risks. Our review confirmed that HHS 
does
maintain a risk database and that these risks are available for review 
and discussion during project oversight meetings. However, we noted 
problems with the implementation of the risk management system.

HHS routinely closed its identified risks on the premise that they had 
been identified and were being addressed. As of March 2004, 13 of the 
44 project risks identified by HHS were considered "closed," even 
though the actions intended to address those risks appeared to be 
still ongoing. For example, HHS had identified data conversion as a risk
because the conversion might be more complex, costly, and time 
consuming than previously estimated. However, this risk was closed in 
February 2003 because a data conversion strategy was in the project 
plan that UFMS officials considered as adequate to mitigate the risk. 
HHS officials characterized this practice as a way to limit the number 
of risks discussed at biweekly meetings. Project officials
defended this approach under the premise that if the mitigating actions 
were not achieving their desired results, then the risk would be 
"reopened." After we discussed this with HHS officials, they agreed to 
revise their procedures to include a resolution column with more 
information on why a risk was closed. This change should improve 
management's ability to oversee the inventory of risks, their status, 
and the effectiveness of the mitigating strategies.
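
The revised procedure can be made concrete with a small sketch. The 
following Python fragment is illustrative only--it is not HHS' actual 
risk database schema--and enforces the distinction between documenting 
a mitigation strategy and resolving a risk.

from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    mitigation: str = ""      # strategy for reducing the risk
    resolution: str = ""      # why the risk no longer applies
    status: str = "open"

    def close(self, resolution: str) -> None:
        # A risk closes only when it no longer applies; recording a
        # mitigation strategy in the project plan is not sufficient.
        if not resolution:
            raise ValueError(
                "a closed risk requires a documented resolution")
        self.resolution = resolution
        self.status = "closed"

    def reopen(self) -> None:
        self.status = "open"

[End of example]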

Project Management Benefits from the Support of Senior Officials, but 
Corrective Actions Lag:

According to HHS, the project has been able to obtain high-level 
sponsorship from senior financial management officials who routinely 
review its progress. This sponsorship has enabled the project to gain 
support from individuals critical to the implementation of UFMS at 
organizational units such as CDC. In addition, senior management 
officials have received periodic reports from a contractor hired to 
perform independent verification and validation[Footnote 29] that help 
identify issues needing management attention. Because of this strong 
support and oversight, HHS officials said they believed that the risks 
associated with the project have been reduced to acceptable levels and 
that the project can serve as a management model.

While we agree that top management commitment and oversight together 
constitute one critical factor in determining a project's success, they 
are not in themselves sufficient to provide reasonable assurance of the 
project's success. As noted in our discussion of disciplined processes, 
the inadequate implementation of any one of the disciplined processes 
in systems development can significantly reduce or even negate the 
benefits of the others. In this case, it is important to act promptly 
to address risks so as to minimize their impact.

In this regard, in February 2003, HHS obtained the services of the 
current contractor to perform the IV&V function for the UFMS 
project.[Footnote 30] As of May 2004, according to the contractor, its 
staff had participated in hundreds of meetings at all levels within the 
project, provided written comments and recommendations on over 120 
project documents, and produced 55 project status and assessment 
reports. Twice a month, it produces a report that is sent directly to
the Executive Sponsor of the UFMS project. These reports highlight the 
IV&V team's view on the overall status of the UFMS project, including a 
discussion of any impacts or potential impacts to the project with 
respect to cost, schedule, and performance and a section on current 
IV&V concerns and associated recommendations. The IV&V contractor 
reported several project management and oversight weaknesses that 
increase the risks associated with this project that were not promptly 
addressed. Examples include the following.

* Personnel. Although the contractor hired by HHS to perform IV&V 
services identified the lack of personnel as a major risk factor in 
June 2003, it took HHS and its system integrator over 6 months to 
substantially address this weakness. In February 2004, the IV&V 
contractor reported this issue as closed. In closing this issue, the 
IV&V contractor noted that the availability of adequate resources was 
an ongoing concern, and the issue may be reopened at a later date. 
Related human capital issues are discussed in a separate section of 
this report.

* Critical path analysis. In August 2003, the IV&V contractor noted 
that an effective critical path analysis had not been developed. A 
critical path defines the series of tasks that must be finished on time 
for the entire project to finish on schedule. Each task on the critical 
path is a critical task. As of April 2004, this weakness had not been 
effectively addressed. Until HHS can develop an effective critical path 
analysis for this project, it does not have adequate assurance that it 
can understand the impact of various project events, such as delays in 
project deliverables. HHS' critical path report shows planned start and 
finish dates for various activities but does not show actual progress, 
so the impact of schedule slips cannot be analyzed. The IV&V
contractor recommended that critical path analysis and discussion 
become a more prominent feature of UFMS project management to monitor 
the resources assigned to activities that are on the critical path.

* Earned value management system. In August 2003, the IV&V contractor 
also noted that an effective earned value management system had not 
been implemented. Earned value management compares the value of work 
accomplished during a given period with the work scheduled for that 
period. By using the value of completed work as a basis for estimating 
the cost and time needed to complete the program, earned value can 
alert program managers to potential problems early in the program. For 
example, if a task is expected to take 100 hours to complete and it is 
50 percent complete, the earned value management system would compare 
the number of hours actually spent on the task so far to the number of 
hours expected for the amount of work performed (see the sketch 
following this list).
In this example, if the actual hours spent equaled 50 percent of the 
hours expected, the earned value would show that the project's 
resources were consistent with the estimate. As of April 2004, this 
weakness had not been effectively addressed.[Footnote 31] Without an 
effective earned value management system, HHS has little assurance that 
it knows the status of the various project deliverables in the context 
of progress and associated cost. In other words, an effective earned 
value management system would be able to provide quantitative data on 
the status of a given project deliverable, such as a data conversion 
program.[Footnote 32] On the basis of this information, HHS management 
would be able to determine whether the progress of a task was within 
the expected parameters for completion. Management could then use this 
information to determine actions to take to mitigate risk and manage 
cost and schedule performance.
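
The 100-hour example above can be expressed as a short calculation. 
The following Python sketch is illustrative only; it computes the 
earned value and a cost performance index (the ratio of earned to 
actual hours), where a value below 1.0 is an early warning of a 
potential overrun.

def earned_value_status(budgeted_hours, fraction_complete,
                        actual_hours):
    # Earned value is the budgeted effort for the work actually done.
    earned = budgeted_hours * fraction_complete
    return earned, earned / actual_hours

# A task budgeted at 100 hours that is 50 percent complete has an
# earned value of 50 hours. Spending 50 actual hours yields an index
# of 1.0, consistent with the estimate; spending 75 hours for the
# same progress yields about 0.67, signaling a potential overrun.
print(earned_value_status(100, 0.50, 50))   # (50.0, 1.0)
print(earned_value_status(100, 0.50, 75))   # (50.0, 0.666...)

[End of example]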

The following additional significant issues were considered open by the 
IV&V contractor as of April 2004.

* Requirements management. The project had not produced an overall 
requirements traceability matrix that identified all the requirements 
and the manner in which each would be verified. In addition, HHS had 
not implemented a consistent approach to defining and maintaining a set 
of "testable" requirements. (A minimal sketch of a traceability matrix 
appears after this list.)

* UFMS test program adequacy. The test program for UFMS had not been 
adequately defined, and the test documentation reviewed to date lacked 
the detail typically found in test plans developed in accordance with 
industry standards and best practices.

* UFMS strategy documents. A number of key strategy documents that 
provide the foundation for system development and operations had not 
been completed as defined in the project schedule. These strategy 
documents guide the development of the detailed plans and procedures 
used to implement UFMS. Examples of the documents that
were 2 or more months late include the UFMS Business Continuity 
Strategy, UFMS Lifecycle Test Strategy, Global Interface Strategy, and 
Global Conversion Strategy.
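
A requirements traceability matrix of the kind discussed in the first 
item above can be as simple as a table linking each requirement to its 
origin, its design element, and the test that verifies it. The 
following Python sketch is illustrative; the identifiers are 
hypothetical and do not come from UFMS documentation.

# Each requirement is traced from origin through design to the test
# that verifies it; a gap in any link means the requirement cannot
# be confirmed as implemented and verified.
trace_matrix = {
    "REQ-001": {"origin": "JFMIP core requirement",
                "design": "DD-14", "test": "TC-101"},
    "REQ-002": {"origin": "agency-specific requirement",
                "design": "DD-22", "test": None},
}

untraced = [req for req, links in trace_matrix.items()
            if not all(links.values())]
print(untraced)   # ['REQ-002'] -- not yet verifiable

[End of example]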

In addition, the IV&V contractor has presented other issues, concerns, 
and recommendations in its reports. For example, a May 2004 report 
noted that the IV&V contractor had expressed some concerns on the 
adequacy of the project schedule and the status of some data conversion 
activities. Our review of the IV&V contractor's concerns found that 
they are consistent with those that we identified in our review of 
UFMS.

HHS Has Not Yet Developed the Quantitative Data Necessary for Assessing 
Whether the System Will Provide the Needed Functionality:

The ability to understand the impact of the weaknesses we and the IV&V 
contractor identified is limited because HHS has not effectively 
captured the types of quantitative data or metrics that can be used to 
assess the effectiveness of its management processes, such as 
identifying and quantifying any weaknesses in its requirements 
management process. This information is necessary to understand the 
risk being assumed and whether the UFMS project will provide the 
desired functionality. HHS does not have a metrics measurement process 
that allows it to fully understand (1) its capability to manage the 
entire UFMS effort; (2) how its process problems will affect the UFMS 
cost, schedule, and performance objectives; and (3) the corrective 
actions needed to reduce the risks associated with the problems 
identified. Without such a process, HHS management can only focus on 
the project schedule and whether activities have occurred as planned, 
not whether the activities achieved their objectives. Experience has 
shown that such an approach leads to rework rather than real progress 
on the project.

SEI has found that metrics identifying important events and trends are 
invaluable in guiding software organizations to informed decisions. Key 
SEI findings relating to metrics include the following.

* The success of any software organization depends on its ability to 
make predictions and commitments relative to the products it produces.

* Effective measurement processes help software groups succeed by 
enabling them to understand their capabilities so that they can develop 
achievable plans for producing and delivering products and services.

* Measurements enable people to detect trends and to anticipate 
problems, thus providing better control of costs, reducing risks, 
improving quality, and ensuring that business objectives are 
achieved.[Footnote 33]

Defect tracking systems are one means of capturing quantitative data 
that can be used to evaluate project efforts. Although HHS has a system 
that captures the defects that have been reported, we found that the 
agency has not effectively implemented a process to ensure that defects 
are reported as soon as they are identified. For
example, we noted in the March/April 2004 conference room pilot that 
one of the users identified a process weakness related to grant 
accounting as a "showstopper."[Footnote 34] However, this weakness did 
not appear in the defect tracking system until about 1 month later. As 
a result, during this interval, the HHS defect tracking system did not 
accurately reflect the potential problems identified by the users, and 
HHS management was unable to determine (1) how well the system was 
working and (2) the amount of work necessary to correct the defects. 
Such information is critical when assessing a project's status.
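
The cost of late recording can be measured directly once defect 
records carry both dates. The following Python sketch is illustrative; 
the record fields and dates are hypothetical, not actual UFMS data.

from datetime import date

# The gap between identification and recording is the interval
# during which the tracking system understates known problems.
defect = {"id": "D-1", "severity": "showstopper",
          "identified": date(2004, 4, 1),
          "recorded": date(2004, 5, 1)}

lag_days = (defect["recorded"] - defect["identified"]).days
if lag_days > 0:
    print(f"{defect['id']} recorded {lag_days} days "
          "after identification")

[End of example]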

According to HHS officials, as of the end of our fieldwork the UFMS 
project was on schedule. However, while the planned activities may have 
been performed, because there are no quantifiable criteria for 
assessing progress, it is unclear whether the activities were performed 
successfully or substantively accomplished. For example, one major 
milestone[Footnote 35] was to
conduct a conference room pilot in March/April 2004. HHS held the 
conference room pilot in March/April 2004, and so it considered that 
the milestone had been met. However, HHS did not define what 
constituted success for this event, such as the users identifying no 
significant defects in functionality. A discussion of the problems we 
identified with the March/April 2004 conference room pilot is included 
in appendix III and clearly demonstrates that the objective of this 
activity, to validate the prototype system and test interfaces, was not 
achieved. Therefore, by measuring progress based on the fact that this 
conference room pilot was held, HHS has little assurance that the 
project is in fact on schedule and can provide the desired 
functionality. This approach increases the risk that HHS will be 
surprised by a major malfunction at a critical juncture in the project, 
such as when it conducts system testing or attempts to implement the 
system at CDC.

Good metrics would enable HHS to assess the risk of moving forward on 
UFMS with a much greater degree of certainty. HHS will be better able 
to proactively manage UFMS through disciplined processes as opposed to 
having to respond to problems as they arise.

Experience Has Shown the Effects of Not Effectively Implementing the 
Disciplined Processes:

HHS' inability to effectively implement the types of disciplined 
processes necessary to reduce risks to acceptable levels does not mean 
that the agency cannot put in place an effective process prior to the 
CDC implementation. However, HHS has little time to (1) address long-
standing requirements management problems, (2) develop effective test 
cases from requirements that have not yet been defined at the level 
necessary to support effective testing efforts, and (3) develop and 
implement disciplined test management processes before it can begin its 
testing efforts. Furthermore, HHS will need to address its project 
management and oversight weaknesses so that officials can understand 
(1) the impact that the defects identified during system testing will 
have on the project's schedule and (2) the corrective actions needed to 
reduce the risks associated with the problems identified. Without 
effectively implementing disciplined processes and the necessary 
metrics to understand the effectiveness of the processes that it has 
implemented, HHS is incurring unnecessary risks that the project will 
not meet its cost, schedule, and performance objectives.

The kinds of problems we saw at HHS for the UFMS project have 
historically not boded well for successful system development at other 
federal agencies. In 1999 we reported[Footnote 36] on a system at the 
Department of the Interior's Bureau of Indian Affairs (BIA) that had 
problems similar to those discussed in this report. As is the case at 
HHS, Interior's deficiencies in requirements management and other 
disciplined processes meant that Interior had no assurance that its 
newly acquired system would meet its specific performance, security, 
and data management needs and that it would be delivered on schedule. 
To reduce these risks, we recommended that Interior develop
and implement an effective risk management plan and that Interior 
ensure that all project decisions were (1) based on objective data and 
demonstrated project accomplishments and (2) driven by events, not the 
schedule. In subsequent reviews we noted that, like HHS, Interior 
planned to use testing to demonstrate that the system could perform its 
intended functions.

However, as we reported in September 2000,[Footnote 37] BIA did not 
follow sound practices in conducting its system and user acceptance 
tests for this system. Subsequently, in May 2004, the agency 
reported[Footnote 38] that only one function had been successfully 
implemented and that it was in the process of evaluating the 
capabilities and shortcomings of the system to determine whether any 
other components could be salvaged for interim use while it looked for 
a new system to provide the desired functionality.

In reports on other agencies, we have also identified weaknesses in 
requirements management and testing that are similar to the problems we 
identified at HHS. Examples of problems that have resulted from 
undisciplined efforts include the following.

* In April 2003, we reported[Footnote 39] that NASA had not implemented 
an effective requirements management process and that these requirement 
management problems adversely affected its testing activities. We also 
noted that because of the testing inadequacies, significant defects 
later surfaced in the production system.

* In May 2004, we reported[Footnote 40] that NASA's new financial 
management system, which was fully deployed in June 2003 as called for 
in the project schedule, still did not address many of the agency's 
most challenging external reporting issues, such as external reporting 
problems related to property accounting and budgetary accounting.

* In May 2004, we reported[Footnote 41] that for two major Department 
of Defense (DOD) systems, the initial deployments for these systems did 
not operate as intended and, therefore, did not meet component-level 
needs. In large part, these operational problems were due to DOD not 
effectively implementing the disciplined processes that are necessary 
to manage the development and implementation of the systems in the 
areas of requirements management and testing. DOD program officials 
have acknowledged that the initial deployments of these systems 
experienced problems attributable to weaknesses in requirements 
management and testing.

The problems experienced by these other agencies are illustrative of 
the types of problems that can result when disciplined processes are 
not properly implemented. Whether HHS will experience such problems 
cannot be known until the agency obtains the quantitative data 
necessary to indicate whether the system will meet its needs. 
Accordingly, HHS will need to ensure it adequately addresses the 
numerous weaknesses we and the IV&V contractor identified and has 
reduced the risk to an acceptable level before implementing UFMS at 
CDC. As we discuss in the next section, compounding the risk to UFMS 
from not properly implementing disciplined processes is the fact that 
HHS is introducing UFMS into an environment with weaknesses in its 
departmentwide IT management practices.

Weaknesses in HHS' IT Management Practices and Information Security 
Controls Put UFMS at Risk:

HHS has planned and developed UFMS using the agency's existing IT 
investment management processes. However, we have reported--and HHS has 
acknowledged--weaknesses in IT investment management, enterprise 
architecture, and information security. Such weaknesses increase the 
risk that UFMS will not achieve planned results within the estimated 
budget and schedule.

HHS' Enterprise IT Management Processes Also Add Risk for UFMS:

In addition to weaknesses in disciplined processes in the development 
of UFMS, weaknesses in HHS' IT management processes also increase
the risks associated with UFMS. HHS is modifying its IT investment 
management policies, developing an enterprise architecture, and 
responding to security weaknesses with several ongoing activities, but 
these changes may not be implemented in time to compensate for the 
increased risks.

IT investment management provides for the continuous identification, 
selection, control, life-cycle management, and evaluation of IT 
investments. The Clinger-Cohen Act of 1996[Footnote 42] lays out 
specific aspects of the process that agency heads are to implement in 
order to maximize the value of the agency's IT investments. In 
addition, OMB and GAO have issued guidance[Footnote 43] for agencies to 
use in implementing the Clinger-Cohen Act requirements for IT 
investment management. Our Information Technology Investment 
Management framework[Footnote 44] is a maturity model composed of five 
progressive stages of maturity that an agency can achieve in its IT 
investment management capabilities. These stages range from creating 
investment awareness to developing a complete investment portfolio to 
leveraging IT for strategic outcomes. The framework can be used both to 
assess the maturity of an agency's investment management processes and 
as a tool for organizational improvement.

OMB Circular No. A-130,[Footnote 45] which implements the Clinger-Cohen 
Act, requires agencies to use architectures. A well-defined enterprise 
architecture provides a clear and comprehensive picture of the 
structure of any enterprise by providing models that describe in 
business and technology terms how the entity operates today and how it 
intends to operate in the future. It also includes a plan for 
transitioning to this future state. Enterprise architectures are 
integral to managing large-scale programs such as UFMS. Managed 
properly, an enterprise architecture can clarify and help optimize the 
interdependencies and relationships among an organization's business 
operations and the underlying IT infrastructure and applications that 
support these operations. Employed in concert with other important 
management controls, architectures can greatly increase the chances 
that organizations' operational and IT environments will be configured 
to optimize mission performance. To aid agencies in assessing and 
improving enterprise architecture management, we issued guidance 
establishing an enterprise architecture management maturity 
framework.[Footnote 46] That framework uses a five-stage maturity model 
outlining steps toward achieving a stable and mature process for 
managing the development, maintenance, and implementation of an 
enterprise architecture.

The reliability of operating environments, computerized data, and the 
systems that process, maintain, and report these data is a major 
concern to federal entities, such as HHS, that have distributed 
networks that enable multiple computer processing units to communicate 
with each other. Such a platform increases the risk of unauthorized 
access to computer resources and possible data alteration. Effective 
departmentwide information security controls will help reduce the risk 
of loss due to errors, fraud and other illegal acts, disasters, or 
incidents that cause systems to be unavailable. Inadequate security and 
controls can adversely affect the reliability of the operating 
environments in which UFMS and its applications operate. Without 
effective general controls, application controls may be rendered 
ineffective by circumvention or modification. For example, a control 
designed to preclude users from entering unreasonably large dollar 
amounts in a payment processing system can be an effective application 
control, but this control cannot be relied on if general controls 
permit unauthorized program modifications to allow certain payments to 
be exempted from it.
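
The payment-amount example can be sketched in a few lines. The 
following Python fragment is illustrative only; the threshold and the 
exempt flag are hypothetical stand-ins for an application control and 
for the kind of unauthorized program change that weak general controls 
can permit.

MAX_PAYMENT = 100_000.00   # illustrative threshold, not an HHS value

def payment_is_reasonable(amount, exempt=False):
    # Application control: reject unreasonably large payments. If
    # weak general controls let an unauthorized program change set
    # exempt=True for selected payments, the check is circumvented
    # even though the control itself is sound.
    if exempt:
        return True   # the control has been bypassed
    return 0 < amount <= MAX_PAYMENT

[End of example]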

Key HHS IT Investment Management Policies Still under Development:

UFMS is at increased risk because of previously reported weaknesses in 
the process that HHS uses to select and control its IT investments. In 
January 2004, we reported[Footnote 47] that there were serious 
weaknesses in HHS IT investment management. Notably, HHS had not (1) 
established procedures for the development, documentation, and review 
of IT investments by its review boards or (2) documented policies and 
procedures for aligning and coordinating investment decision making 
among its investment management boards. In addition, HHS had not yet 
established selection criteria for project investments or a requirement 
that IT investments support work processes that have been simplified or 
redesigned.

HHS is modifying several of its IT investment management policies, 
including its capital planning and investment control guidance and its 
governance policies; but as of May 12, 2004, these documents were not 
final or available for review. Until HHS addresses weaknesses in its 
selection or control processes, IT projects like UFMS will face an 
increased likelihood that the projects will not be completed on 
schedule and within estimated costs.

Risks to UFMS Are Heightened by the Absence of an Established 
Enterprise Architecture:

In November 2003, we released a report[Footnote 48] noting the 
importance of leadership to agency progress on enterprise architecture 
efforts. We reported that federal agencies' progress toward effective 
enterprise architecture management was limited: In a framework of five 
stages leading to a highly effective enterprise architecture program,
97 percent of the agencies surveyed were still in Stage 1--creating 
enterprise architecture awareness. In that report, we noted that HHS 
had reached Stage 2--building the enterprise architecture management 
foundation--by successfully satisfying all elements of that stage of 
the maturity framework. In addition, HHS had successfully addressed 
three of six elements of the Stage 3 maturity level--developing 
architecture products. HHS has laid that foundation by (1) assigning 
enterprise architecture management roles and responsibilities and (2) 
establishing plans for developing enterprise architecture products and 
for measuring program progress and product quality. Progressing through 
the next stage would involve defining the scope of the architecture and 
developing products describing the organization in terms of business, 
performance, information/data, service/application, and technology. 
Once the scope is defined and products developed, Stage 3 organizations 
track and measure progress against plans; identify and address 
variances, as appropriate; and report on their progress.

Although it has made progress, HHS has not yet established an 
enterprise architecture to guide and constrain its IT projects. In 
January 2004, HHS' acting chief architect told us that the department 
continues to work on implementing an enterprise architecture to guide 
its decision making. He also noted that HHS plans to make UFMS a 
critical component of the enterprise architecture now under 
development. However, most of the planning and development of the UFMS 
IT investment has occurred without the guidance of an established 
enterprise architecture. Our experience with other federal agencies has 
shown that projects developed without the constraints of an established 
enterprise architecture are at risk of being duplicative, not well 
integrated, unnecessarily costly to maintain and interface, and 
ineffective in supporting missions.

HHS Information Security Weaknesses Are Unresolved, and Needed 
Information for UFMS Is Not Shared:

HHS has recognized the need to improve information security throughout 
the department, including in key operating divisions, and has various 
initiatives under way; however, it has not yet fully implemented the 
key elements of a comprehensive security management program. Unresolved 
general control weaknesses at headquarters and in HHS' operating 
divisions include almost all areas of information system controls 
described in our Federal Information System Controls Audit Manual 
(FISCAM).[Footnote 49] These significant and pervasive weaknesses 
involve entitywide security, access controls, system software, 
application software, and service continuity.

According to a recent IG report,[Footnote 50] the underlying cause for 
most of the weaknesses was that the department did not have an 
effective management structure in place to ensure that sensitive data 
and critical operations received adequate attention and that 
appropriate security controls were implemented to protect them. HHS has 
not sufficiently controlled network access, appropriately limited 
mainframe access, or fully implemented a comprehensive program to 
monitor access. Weaknesses in other information security controls, 
including physical security, further increased the risk to HHS' 
information systems. As a result, sensitive data--including information 
related to the privacy of U.S. citizens, payroll and financial 
transactions, proprietary information, and mission-critical data--were 
at increased risk of unauthorized disclosure, modification, or loss, 
possibly without being detected. Overall, the IG concluded that the 
weaknesses left the department vulnerable to unauthorized access to and 
disclosure of sensitive information, malicious changes that could 
interrupt data processing or destroy data files, improper payments, or 
disruption of critical operations.

Extensive information security planning for UFMS was based on 
requirements and applicable guidance set forth in the Federal 
Information Security Management Act,[Footnote 51] OMB Circular No. A-
130 Appendix III (Security of Federal Automated Information Resources), 
National Institute of Standards and Technology guidance, and our 
FISCAM. However, that planning was done without complete information 
from the department and operating divisions. HHS has not conducted a 
comprehensive, departmentwide assessment of information security 
general controls. Further, information security general controls at 
four operating divisions have not been recently assessed. UFMS 
officials told us they did not know which operating divisions had 
conducted or contracted for a review of their individual information 
security environments. Without departmentwide and operating-division-
specific assessments, HHS increases its risk that information security 
general control weaknesses will not be identified and therefore will 
not be subject to departmentwide resolution or mitigation by UFMS 
controls.

According to HHS officials, some operating divisions that have been 
assessed recently have not provided UFMS with current information on 
the status of the outstanding weaknesses in their operating 
environments. UFMS officials told us that they do not have assurance of 
the reliability of the control environment of these operating 
divisions. Without information on control weaknesses in the operating 
divisions, UFMS management has not been in a position to develop 
mitigating controls that could compensate for departmentwide 
weaknesses. As a result, UFMS planning for security cannot provide 
reasonable assurance that the system is protected from loss due to 
errors, fraud and other illegal acts, disasters, and incidents that 
cause systems to be unavailable.

Human Capital Issues Increase Risk Associated with the Implementation 
of UFMS:

Serious understaffing and incomplete workforce planning have plagued 
the UFMS project. Human capital management for the UFMS project 
includes organizational planning, staff acquisition, and team 
development. It is essential that an agency take the necessary steps to 
ensure that it has the human capital capacity to design, implement, and 
operate a financial management system. However, the UFMS project has 
experienced staff shortages as high as 40 percent of the federal 
positions that HHS believed were needed to implement UFMS. Although the 
staff shortage has been alleviated to a great extent, the impact of 
such a significant shortfall lingers. Further, HHS has not yet fully 
developed key workforce planning tools, such as the CDC skills gap 
analysis, to help transform its workforce so that it can effectively 
use UFMS. It is important that agencies incorporate strategic workforce 
planning by (1) aligning an organization's human capital program with 
its current and emerging mission and programmatic goals and (2) 
developing long-term strategies for acquiring, developing, and 
retaining an organization's total workforce to meet the needs of the 
future. This incorporates a range of activities from identifying and 
defining roles and responsibilities to identifying team members to 
developing individual competencies that enhance performance. Human 
capital planning should be considered for all stages of the system 
implementation.

Positions Were Not Filled as Planned:

According to JFMIP's Building the Work Force Capacity to Successfully 
Implement Financial Systems, the roles needed on an implementation team 
are consistent across financial system implementation projects and 
include a project manager, systems integrator, functional experts, 
information technology manager, and IT analysts. Many of these project 
roles require the dedication of full-time staff for one or more of the 
project's phases.

HHS has identified the lack of resources as a risk to the project and 
acquired the staff to fill some of the roles needed for a systems 
implementation project. The project has a project manager, systems 
integrator, and some functional experts. However, on the basis of our 
review of the HHS Organization and Staffing Plan and the most recent 
program management office organization chart, many positions were not 
filled as planned. For example, as reported in the IV&V contractor's 
September 2003 report, some key personnel filled multiple positions and 
their actual available time was inadequate to perform the allocated 
tasks--commonly referred to as staff being overallocated on the 
project. As a result, some personnel were overworked, which according 
to the IV&V contractor, could lead to poor morale. The UFMS 
organization chart also showed that the UFMS project team was 
understaffed and that several integral positions were vacant or filled 
with part-time detailees. As of January 2004, 19 of the 47 positions in 
the UFMS Program Management Office identified as needed for the project 
were not filled. The vacant positions included key
positions such as the enterprise architect, purchasing, testing, and 
configuration management leads. While HHS and the systems integrator 
have taken measures to acquire additional human resources for the 
implementation of UFMS, scarce resources have already caused several 
key deliverables to fall significantly behind schedule, as discussed in 
the section on disciplined processes, and could significantly 
jeopardize the project's success. Without adequate resources to staff 
the project,
the project schedule could be negatively affected, project controls and 
accountability could be diminished, and the successful implementation 
of UFMS could be compromised.

Strategic Workforce Planning Is Incomplete:

Strategic workforce planning is essential for achieving the mission and 
goals of the UFMS project. As we have reported,[Footnote 52] there are 
five key principles that strategic workforce planning should address:

* Involve top management, employees, and other stakeholders in 
developing, communicating, and implementing the strategic workforce 
plan.

* Determine the critical skills and competencies that will be needed to 
achieve current and future programmatic results.

* Develop strategies that are tailored to address gaps in the number, 
deployment, and alignment of human capital approaches for enabling and 
sustaining the contributions of all critical skills and competencies.

* Build the capability needed to address administrative, educational, 
and other requirements important to support workforce planning 
strategies.

* Monitor and evaluate the agency's progress toward its human capital 
goals and the contribution that human capital results have made toward 
achieving programmatic results.

HHS has taken first steps to address three of the five key principles 
identified in our report on strategic workforce planning.[Footnote 53] 
To address the first key principle, HHS' top management first 
communicated the agency's goal to implement a unified financial 
management system in June 2001 and has continued to communicate the 
agency's vision. HHS has developed an Organizational Change Management 
Plan and, according to the UFMS project's Statement of Work, HHS, in 
undertaking UFMS, will seek to ensure that sufficient efforts are made 
to address communications, human resources, and training requirements.

To meet the second principle of identifying the needed skills and 
competencies, HHS developed a Global Organization Impact Analysis in 
March 2003 and subsequently prepared an analysis for CDC that 
identified workforce and training implications associated with the 
major changes that will occur in its financial management business 
processes. However, more work remains. Although a Global/CDC Pilot 
Competency Report was prepared that focuses on preparing and equipping 
the workforce to function effectively in the new environment, none of 
the other operating divisions scheduled to implement UFMS had prepared 
a competency report as of May 2004.

To effectively address the third principle of developing strategies to 
address the gaps in human capital, HHS must first identify the skills 
and competencies needed. HHS has plans to conduct a skills gap analysis 
on a site-specific basis. However, as of May 2004, the CDC skills gap 
analysis had not been completed. CDC officials maintain that they 
intend to wait until after the system is implemented to assess the 
changes in individuals' workloads and make decisions on staffing 
changes. In addition, HHS is currently developing a global Workforce 
Transition Strategy, which the other operating divisions will use as a 
model in developing their own strategies. According to HHS officials, 
HHS has also prepared a global training strategy. Training plans are to 
be developed on a site-specific basis using the global strategy as a 
model. Although CDC has a tentative schedule for planned training, as 
of May 2004 the CDC training plan was not complete.

As we have previously reported,[Footnote 54] having staff with the 
appropriate skills is key to achieving financial management 
improvements, and managing an organization's employees is essential to 
achieving results. HHS already faces challenges in implementing its 
financial management system due to the lack of adequate resources. By 
not identifying staff with the requisite skills to implement such a 
system and by not identifying gaps in needed skills and filling them, 
HHS has reduced its chances of successfully implementing and operating 
UFMS.

Conclusions:

HHS has not followed key disciplined processes necessary to reduce the 
risks associated with implementing UFMS to acceptable levels. These 
problems are similar to those encountered by other agencies that have 
found themselves under strong pressure to skip steps in their haste to 
get systems up and running and produce results. If HHS continues on 
this path, it runs a higher risk than necessary of finding, as many 
others have already discovered, that the system may be more costly to 
operate, take more time and effort to perform needed functions, be more 
disruptive to the work of the agency, and not achieve the intended 
improvement.

Ideally, HHS should not continue with its current approach for UFMS. 
However, if HHS decides for operational reasons to continue its plan to 
deploy UFMS at CDC in October 2004, then as a precursor to deployment 
at CDC, there are several key steps that must be taken to mitigate the 
significant risk related to this deployment. To begin, HHS must 
determine the system capabilities that are necessary for the CDC 
deployment and identify the relevant requirements related to those 
capabilities. The associated requirements will have to be unambiguous 
and adequately express how the system will work, be traceable from 
their origin through implementation, and be sufficiently tested to 
confirm that the system meets those functional needs. Validating data 
conversion efforts and systems interfaces will also be critical to the 
successful launch of UFMS. HHS will need to ensure that the October 
2004 initial deployment of UFMS is driven by the successful completion 
of at least these key events, as demonstrated by quantitative data, 
rather than by the schedule. HHS should not deploy UFMS
at CDC until these critical steps are complete.

Before proceeding further with the UFMS implementation beyond CDC, HHS 
should pause to assess whether an appropriate foundation is in place so 
that UFMS will achieve its ultimate goals of a unified accounting 
system that institutes common business rules, data standards, and 
accounting policies and procedures. From our perspective, HHS does not 
have a fully developed view of how UFMS will operate because it moved 
forward with the project before ensuring that certain key elements, 
such as a concept of operations and an enterprise architecture, were 
completed. Without assurances that it is moving ahead with a solid 
foundation and a fully developed and strongly administered plan for 
bringing the entire UFMS project under the disciplined processes of 
requirements management, testing, risk management, and the use of 
quantitative measures to manage the project, HHS risks not achieving 
its goal of a common accounting system that produces data for 
management decision making and financial reporting and risks 
perpetuating its long-standing accounting system weaknesses with 
substantial workarounds to address any needed capabilities that have 
not been built into the system.

Because we have recently issued reports providing HHS with 
recommendations to address weaknesses in its IT investment management 
and enterprise architecture processes, we are not making additional 
recommendations in this report related to those two disciplines other 
than to reiterate the importance of taking action on our prior 
recommendations. It will be important
that HHS continue with its ongoing initiatives to strengthen these two 
areas. Also, HHS has not fully secured its information systems security 
environment to provide a sound basis for incorporating adequate 
security features into UFMS as it is being developed. Finally,
addressing human capital and staffing shortages that have also 
increased risks related to UFMS is paramount to achieving the agency's 
objectives for this project.

Recommendations for Executive Action:

To help reduce risks associated with deployment of UFMS at CDC to 
acceptable levels, we recommend that the Secretary of Health and Human 
Services direct the Assistant Secretary for Budget, Technology, and 
Finance to require that the UFMS program staff take the following nine 
actions:

* Determine the system capabilities that are necessary for the CDC 
deployment.

* Identify the relevant requirements related to the desired system 
capabilities for the CDC deployment.

* Clarify, where necessary, any requirements to ensure they (1) fully 
describe the capability to be delivered, (2) include the source of the 
requirement, and (3) are unambiguously stated to allow for quantitative 
evaluation.

* Maintain traceability of the CDC-related requirements from their 
origin through implementation.

* Use a testing process that employs effective requirements to obtain 
the quantitative measures necessary to understand the assumed risks.

* Validate that data conversion efforts produce reliable data for use 
in UFMS.

* Verify that systems interfaces function properly so that data 
exchanges between systems are adequate to satisfy system needs.

* Measure progress based on quantitative data rather than the 
occurrence of events.

If these actions are not completed, delay deployment of UFMS at CDC.

Before proceeding with further implementation of UFMS after deployment 
at CDC, we recommend that the Secretary of Health and Human Services 
direct the Assistant Secretary for Budget, Technology, and Finance to 
require that the UFMS program staff take the following 14 actions:

* Develop and effectively implement a plan for how HHS will implement
the disciplined processes necessary to reduce the risks associated with 
this effort to acceptable levels. This plan should include the 
processes, such as those identified by SEI and IEEE, that will be 
implemented and the resources, such as staffing and funding, needed to 
implement the necessary processes.

* Develop a concept of operations in accordance with recognized 
industry standards such as those promulgated by IEEE. The concept of 
operations should apply to all HHS entities that will be required to 
use UFMS. This concept of operations should contain a high-level 
description of the operations that must be performed, who must perform 
them, and where and how the operations will be carried out, and be 
consistent with the current vision for the HHS information system 
enterprise architecture.

* Implement a requirements management process that develops 
requirements that are consistent with the concept of operations and 
calls for each of the resulting requirements to have the attributes 
associated with good requirements: (1) fully describing the 
functionality to be delivered, (2) including the source of the 
requirement, and (3) stating the requirement in unambiguous terms that 
allow for quantitative evaluation.

* Maintain traceability of requirements among the various 
implementation phases from origin through implementation.

* Confirm that requirements are effectively used for:

(1) determining the functionality that will be available in UFMS at a 
given location,

(2) implementing the required functionality,

(3) supporting an effective testing process to evaluate whether UFMS is 
ready for deployment,

(4) validating that data conversion efforts produce reliable data for 
use in UFMS, and

(5) verifying that systems interfaces function properly so that data 
exchanges between systems are adequate to satisfy each system's needs.

* Develop and implement a testing process that uses adequate 
requirements as a basis for testing a given system function.

* Formalize risk management procedures to ensure that:

(1) all risks currently applicable to the UFMS project are identified, 
and

(2) a risk is only closed after the risk is no longer applicable rather 
than once management has developed a mitigation strategy.

* Develop and implement a program that will identify the quantitative 
metrics needed to evaluate project performance and risks.

* Use quantitative measures to assess progress and compliance with 
disciplined processes.

To help ensure that HHS reduces risks in the agencywide IT environment 
associated with its implementation of UFMS, we recommend that the 
Secretary of Health and Human Services direct the Assistant Secretary 
for Budget, Technology, and Finance to require that the following seven 
actions are taken by the IT program management staff, as appropriate:

* Conduct assessments of operating divisions' information security 
general controls that have not been recently assessed.

* Establish a comprehensive program to monitor access to the network, 
including controls over access to the mainframe and the network.

* Verify that the UFMS project management staff has all applicable 
information needed to fully ensure a comprehensive security management 
program for UFMS. Specifically, this would include identifying and 
assessing the reported concerns for all HHS entities regarding key 
general control areas of the information security management process:

(1) entitywide security planning,

(2) access controls,

(3) system software controls,

(4) segregation of duties, and

(5) application development and change controls.

To help improve the human capital initiatives associated with the UFMS 
project, we recommend that the Secretary of Health and Human Services 
direct the Assistant Secretary for Budget, Technology, and Finance to 
require that the following four actions are taken by the UFMS program 
management staff:

* Assess the key positions needed for effective project management and 
confirm that those positions have the human resources needed. If 
needed, solicit the assistance of the Assistant Secretary for Budget, 
Technology, and Finance to fill key positions in a timely manner.

* Finalize critical human capital strategies and plans related to UFMS 
such as the:

(1) skills gap analysis,

(2) workforce transition strategy, and

(3) training plans.

Agency Comments and Our Evaluation:

In written comments on a draft of this report, HHS described the 
actions it had taken to date to develop UFMS, including some actions 
related to our recommendations, which, if effectively implemented, 
should reduce project risk. HHS disagreed with our conclusion that a
lack of disciplined processes is placing the UFMS program at risk, 
stating that its processes have been clear and rigorously executed. HHS 
characterized the risk in its approach as the result not of a lack of 
disciplined process but of an aggressive project schedule. HHS stated 
that it made a decision early in the program to phase in the deployment 
of the system to obtain what it referred to as incremental benefits, 
and said that a core set of requirements will be available for the 
October 2004 release at CDC. HHS added that if a system functional 
capability becomes high risk for the pilot implementation at CDC, it 
could be deferred to a subsequent release without affecting the overall 
implementation. HHS did not provide examples of the functional 
capabilities that could be deferred under such a scenario, but we 
understand that at least some of the grant accounting functionality 
being deployed at CDC is less than originally envisioned when we 
performed our review--less than 6 months before the scheduled CDC 
implementation date. HHS stated that it had reached every
major milestone to date within the planned timeframes and budget for 
almost 3 years while managing to mitigate the cost, schedule, and 
technical risks. The agency considers this a testament to UFMS 
management discipline, notwithstanding known needed improvements.

From our perspective, this project demonstrates the classic symptoms of 
a schedule-driven effort for which key processes have been omitted or 
cut short, thereby unnecessarily increasing risk. This is a multiyear
project, and it is important that the project adhere to disciplined 
processes that represent best practices. We have no problem whatsoever 
with a phased approach and view it as a sound decision for this 
project. There is no doubt that a phased approach can help reduce 
risks. However, we do not agree that a phased approach adequately 
mitigates risk in a project of this magnitude, given the other problems 
we identified. As discussed in our report and highlighted in the 
following sections that further evaluate HHS' comments on our draft 
report, we identified a number of problems with HHS' methodology, 
including problems in requirements management, testing, project 
management and oversight, and IT management, that are at the heart of 
our concern. Also, we are not saying that HHS is not following any 
disciplined processes, and in this report we have recognized certain 
HHS actions that we believe represent best practices that reduce risk. 
We are saying that HHS has not reduced its risk to an acceptable level 
because a number of key disciplined processes were not yet in place or 
were not effectively implemented. We focused our 34 recommendations on 
tangible actions that HHS can take to adequately mitigate risk. Risk on 
a project such as this can never be eliminated, but risk can be much 
better managed than what we observed for this project.

With respect to HHS' comment that all milestones have been met, as we 
discussed in detail in this report, we caution that because HHS has 
insufficient quantifiable criteria for assessing the quality of its 
progress and the impact of identified defects, it does not have the 
information it needs to determine whether the milestones have been 
substantively accomplished and the nature and extent of resources 
needed to resolve remaining defects. A best practice is having 
quantitative metrics and a disciplined process for continually 
measuring and monitoring results.

We stand firmly behind our findings that HHS had not reduced project 
risk to an acceptable level because it had not adequately adhered to 
disciplined processes called for in its stated implementation 
methodology. We are somewhat encouraged by the planned actions outlined 
in HHS' comment letter and the fact that it has now decided to delay 
initial implementation by at least 2 weeks to address known problems 
and has indicated it may delay the initial implementation further as 
needed. Only time will tell how well this project turns out, as the 
initial implementation at CDC represents just the first phase. Our hope 
is that the disciplined processes discussed in our report and addressed 
in our recommendations will be followed and that risks of a project of 
this magnitude and importance will be reduced to an acceptable level. 
If the past is prologue, taking the time to adhere to disciplined 
processes will pay dividends in the long term.

Implementation Methodology:

HHS stated that the underlying premise of our report is that there is 
one correct way to perform an implementation for a project such as UFMS 
and that this methodology, commonly referred to as the 
waterfall[Footnote 55] methodology, is inappropriate for a COTS-based 
system. Our report does not call for the use of this or any other 
specific methodology. Instead, we have emphasized the importance of 
following disciplined processes in the development and implementation 
of large and complex information management systems, including 
financial management systems such as UFMS. As we have reiterated 
throughout this report, we view disciplined processes as the key to 
successfully carrying out a system development and implementation 
program whatever the methodology.

In the case of HHS' COTS-based system development program, we did not 
question the methodology, but have concerns about HHS' ability to 
successfully implement its methodology. For example, as explained in 
our report and reiterated in HHS' comments, before a COTS software 
package is selected for implementation, requirements need to be more 
flexible and less specific than for custom-developed software because no
off-the-shelf product is likely to satisfy all of the detailed 
requirements for a large, complex organization such as HHS. Once the 
product is selected, however, a disciplined approach to COTS 
implementation demands that requirements be defined at a level of 
specificity that allows the software to be configured to fit the system 
under development and to be implemented to meet the organization's 
needs. In discussing the HHS methodology, our report is consistent with 
how HHS described its methodology in its comments. As we noted in the 
report, the methodology selected by HHS requires (1) reviewing and 
updating the requirements through process design workshops, (2) 
establishing the initial baseline requirements, (3) performing a fit/
gap analysis, (4) developing gap closure alternatives, and (5) creating 
the final baseline requirements. However, as noted in our report, HHS 
was unable to successfully implement its methodology for the majority 
of the requirements we reviewed. For example, one inadequately defined 
requirement was linked to the budget distributions process. However, 
the documentation for this process, which should have provided the 
additional specificity needed to understand how the system was to be 
configured, stated that the process was "To Be Determined."

Requirements Management:

In its comments, HHS stated that in July 2002 it had developed a 
"target business model" that is equivalent to a concept of operations 
for guiding its development efforts. The document HHS referenced, which 
we reviewed during our audit, along with several other requirement-
related documents HHS had provided, did not have all the elements 
associated with a concept of operations document as defined by IEEE. 
For example, the document did not address the modes of operation; user 
classes and how they should interact; operational policies and 
constraints; costs of systems operations; performance characteristics, 
such as speed, throughput, volume, or frequency; quality attributes, 
such as availability, reliability, supportability, and expandability; 
and provisions for safety, security, and privacy. The document also did 
not address a number of other critical issues associated with the 
project, such as the use of shared services. We also noted that some HHS
officials who had reviewed this document stated that it did not resolve 
a number of issues that needed to be addressed. For example, HHS 
reviewers raised questions about who was responsible for several core 
functions. When we performed our review, these types of questions 
remained unanswered, although HHS said in its comments on our draft 
report that it is taking steps to address these concerns and has now 
made certain decisions regarding shared services.

In addition, HHS' comment letter stated that it has developed a 
requirements database that could be used to track the requirements and 
that its requirements management process used two broad 
categories--Program Management Office of JFMIP requirements and agency-specific
requirements. HHS also stated that the requirements process has fully 
defined and documented the expected behavior of UFMS and that the 
agency-specific requirements it had identified had been developed in 
accordance with industry best practices. HHS noted that it has also 
developed a requirements traceability verification matrix since our 
review. The result, according to HHS, has been a requirements 
management process that provides fully traceable requirements that are 
fully tested by the implementation team.

Developing and effectively implementing the kinds of processes 
described in HHS' comments are positive steps that would reduce the 
risks associated with requirements-related defects. However, since
these key processes, which were called for in our report and during 
meetings held with HHS during our review, were developed and 
implemented after our work was complete, we are unable to determine 
whether HHS has yet fully addressed the weaknesses we observed. As 
noted in our report, we found numerous requirements that did not 
contain the necessary specificity to support a good testing program. We 
also note that the HHS comments refer to these processes being used for 
"testable" requirements but do not provide information on how many of 
the 2,130 requirements contained in its requirements database were 
considered testable and, therefore, subject to this improved process.

Testing:

While HHS stated in its comment letter that it has implemented a more 
disciplined system testing process, its comments also raised concerns 
about the thoroughness of the testing. HHS noted that it has selected 
an application certified by the Program Management Office of JFMIP and 
that "80% of the requirements have been met [with] out of the box 
functionality." Accordingly, HHS stated that it has, by design, tested 
these requirements with less rigor than the agency-specific
requirements. As noted in HHS' comments, its requirements management 
database contains 2,130 requirements that include requirements issued 
by the Program Management Office of JFMIP. However, according to the 
Program Management Office of JFMIP, its testing efforts encompass about 
331 requirements, or only about 16 percent of HHS' stated requirements.

Compounding this limitation, while the Program Management Office of 
JFMIP test results can be helpful, as the Program Management Office of 
JFMIP has consistently made it clear to agencies, these tests are not 
intended to take the place of agency-level tests. The Program 
Management Office of JFMIP tests are in a controlled environment that 
is not intended to represent the operating environment of a specific 
agency. As the Program Management Office of JFMIP points out on its Web
site, agencies need to (1) test the installed configured system to 
ensure continued compliance with the governmentwide core requirements 
and any agency-specific requirements, (2) assess the suitability of an 
application for the agency's operating environment, and (3) assess the 
COTS computing performance in the agency's environment for response 
time and transaction throughput capacity. For example, addressing this 
last point regarding transaction throughput capacity has proven 
problematic for some agencies that implemented a COTS package. A system 
may properly process a given type of transaction, which is all the test 
requires for certification, yet need a number of separate processing 
steps to do so. Those steps may be acceptable at an agency with a 
relatively low volume of such transactions but prove problematic at an 
agency with a high volume.
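
To illustrate the arithmetic behind this concern, the following sketch 
compares daily processing loads at two hypothetical agencies. All of the 
figures--steps per transaction, seconds per step, and daily volumes--are 
invented for illustration and assume serial processing; they are not 
drawn from JFMIP tests or UFMS documentation.

    # Sketch: the arithmetic behind transaction throughput concerns.
    # All figures are hypothetical and assume serial processing.
    STEPS_PER_TRANSACTION = 4   # separate steps the COTS package needs per transaction
    SECONDS_PER_STEP = 2.0      # assumed average time for one step

    def daily_processing_hours(volume: int) -> float:
        """Hours needed to push one day's transactions through the system."""
        return volume * STEPS_PER_TRANSACTION * SECONDS_PER_STEP / 3600

    for label, volume in [("low-volume agency", 500),
                          ("high-volume agency", 250_000)]:
        print(f"{label}: {daily_processing_hours(volume):,.1f} hours per day")

    # low-volume agency: 1.1 hours per day -- workable
    # high-volume agency: 555.6 hours per day -- far beyond any nightly
    # processing window, even though each transaction posts correctly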

As noted in the HHS comments, it had not yet developed the test scripts 
and other documentation that would have enabled us to assess the 
adequacy of its system testing activities at the time of our review. 
Therefore, we cannot conclude on whether its system testing activities 
will have a reasonable assurance of detecting the majority of the 
defects. HHS noted that it had conducted preliminary testing, referred 
to as conference room pilots, in August 2003 and in March and April 
2004 and that these activities were attended by finance, business, and 
program staff members from across HHS, who will be the ultimate users 
of the new system. As noted in our report, our review of the conference 
room pilot conducted in March and April 2004 found significant 
weaknesses in the processes being used. This was the last conference 
room pilot scheduled before the pilot deployment at CDC. We found that some
of the stated requirements in a given conference room pilot test script 
were not tested and defects identified were not promptly recorded. This 
is consistent with observations made by HHS' IV&V contractor on the 
August 2003 conference room pilots. Furthermore, we observed that when 
users asked about needed functionality, they were told that the 
functionality would be developed later. Therefore, we are encouraged by 
the statement in HHS' comment letter that it will implement a 
disciplined system testing process.

In our report, we also noted that the system testing activities were 
scheduled late in the first phase of the UFMS implementation process, 
leaving little time for HHS to address any defects identified during 
system testing and to ensure that the corrective actions taken to 
address the defects do not introduce new defects. HHS agreed that 
system testing would ideally come earlier in the process and noted that 
although the testing process is being performed late because of an 
aggressive schedule, it believed that, given its level of scrutiny, 
its testing plan would identify the majority of the defects in the 
system. We view this as adding to project risk. However, we are
encouraged that in its comments on our draft report, HHS said it was 
analyzing system integration test results prior to deploying the system 
at CDC, and that this assessment may result in revising the current 
software release strategy.

Program Management Oversight:

In its comments, HHS stated that its combined use of software tools, 
including TeamPlay from Primavera, provides management with information 
for monitoring the project's critical path and the earned value of 
completed work and that this action was taken in October 2003 after an 
August 2003 report from its IV&V contractor. As with other process 
areas, the key to reducing risks to acceptable levels is not only the 
tool that is used but, more importantly, the effective implementation 
of that tool. In other words, simply selecting an industry standard 
practice or tool does not guarantee success. As noted in a May 2004 
IV&V report, as of April 2004, the IV&V contractor was still raising 
concerns about HHS' ability to perform critical path and earned value 
analysis. HHS acknowledged in its comments on our draft report that it 
continues to work on improving the information provided in the critical 
path reports and is executing a plan to implement the remainder of the 
IV&V suggestions. As we discussed previously in this report, without an 
effective critical path analysis and an earned value management system, 
HHS does not have adequate assurance that it can understand the impact 
of various project events, such as delays in project deliverables, and 
that it knows the status of the various project deliverables in the 
context of progress and associated cost. We continue to believe that 
management needs this information to determine actions to take to 
mitigate risk and manage cost and schedule performance.
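
For readers unfamiliar with these measures, the following minimal 
sketch shows the two indexes that an earned value tool such as TeamPlay 
typically reports; the dollar amounts are illustrative and are not UFMS 
figures.

    # Sketch of the two basic earned value indexes; amounts are invented.
    planned_value = 1_000_000.00   # budgeted cost of work scheduled (BCWS)
    earned_value = 800_000.00      # budgeted cost of work performed (BCWP)
    actual_cost = 950_000.00       # actual cost of work performed (ACWP)

    cpi = earned_value / actual_cost     # cost performance index
    spi = earned_value / planned_value   # schedule performance index

    print(f"CPI = {cpi:.2f}")  # below 1.0: work is costing more than budgeted
    print(f"SPI = {spi:.2f}")  # below 1.0: less work done than scheduled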

HHS also stated that all of the needed improvements in its project 
execution were identified and documented prior to and during our review 
by its IV&V contractor and that improvements continue to be 
implemented. Our report clearly identifies areas of concern raised by 
both us and the IV&V contractor, as well as areas where our work 
uncovered additional issues. Regardless of who identified the problems, we remain
concerned that HHS has been slow to act upon the weaknesses identified 
by the IV&V contractor and has not yet clearly identified actions 
planned to address our recommendations. Our report provides examples 
where it has taken HHS months to address the findings made by its IV&V 
contractor.

Regarding quantitative measures, HHS agreed that quantitative measures 
are crucial to UFMS success and stated that it has struck an adequate 
balance between the number of measures used to assess UFMS progress and 
the effort and costs required to develop and maintain the measures. HHS 
described several measures related to its defect-tracking processes 
that are associated with its system testing efforts. We agree with HHS 
that the measures listed in its comment letter are critical to 
assessing system stability and readiness, but HHS' comments did not 
indicate whether it is also capturing metrics on items that can help it 
understand the risks associated with the processes it is implementing, 
such as with its requirements management process. For example, HHS 
stated that system testing had not identified any requirements 
problems, which, in its view, indicated that the requirements were 
thoroughly defined. However, system testing is normally not designed to 
capture requirements problems since, as noted in HHS' comment letter, 
testing is structured to determine whether the system meets the 
requirements that have been documented. Therefore, it is not clear 
whether HHS has fully developed a metrics process that will address its 
needs throughout the phased deployments.
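
One process-level metric of the kind we have in mind is classifying 
each defect by the phase that introduced it, so that requirements-
related leakage becomes visible. The sketch below illustrates the idea; 
the defect records are invented and are not drawn from HHS' defect-
tracking system.

    # Sketch: classify defects by the phase that introduced them so that
    # requirements-related leakage is visible. Records are hypothetical.
    from collections import Counter

    defects = [
        {"id": "D-001", "found_in": "system test", "introduced_in": "requirements"},
        {"id": "D-002", "found_in": "system test", "introduced_in": "configuration"},
        {"id": "D-003", "found_in": "conference room pilot", "introduced_in": "requirements"},
        {"id": "D-004", "found_in": "system test", "introduced_in": "interface design"},
    ]

    by_origin = Counter(d["introduced_in"] for d in defects)
    total = len(defects)
    for phase, count in by_origin.most_common():
        print(f"{phase}: {count} defect(s), {100 * count / total:.0f}% of total")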

Regarding human capital, HHS said that it faces its share of challenges 
in obtaining full-time federal staff due to the temporary nature of an 
implementation project and the agency's objective to staff a highly 
competent program team and not a permanent federal bureaucracy. We 
recognize that HHS and the systems integrator it has under contract to 
assist with the project have taken measures to acquire additional staff 
for the implementation of UFMS. We also recognize the challenge in 
finding people with the needed skills. Our concern is that the UFMS 
project has experienced staff shortages as high as 40 percent of the 
federal positions that HHS believed were needed to implement UFMS. This 
shortage of staff resources led to several key deliverables being 
significantly behind schedule. Also, while HHS said that CDC has the 
vast majority of its required positions filled, we found that many of 
the positions for this operating division were filled with staff from 
the program management office for the project, which affects the work 
that should be done to manage and oversee the project. As stated in our 
report, without adequate staff resources, the project schedule can be 
negatively affected, project controls and accountability can be 
diminished, and the successful implementation of UFMS may be 
compromised.

IT Management:

With respect to IT management, including investment management, 
enterprise architecture, and information security, HHS elaborated on 
further activities taken to address weaknesses that we had pointed out 
in our draft report. In its comments, HHS referenced a Web site that 
provides its IT investment policy dated January 2001, which we had 
already reviewed and which agency officials stated was in the process 
of being updated. In January 2004, we recommended 10 actions the 
department should take to improve its IT investment management process. 
One action called for HHS to revise the department's IT investment 
management policy to include (1) how this process relates to other 
agency processes, (2) an identification of external and environmental 
factors, (3) a description of the relationship between the process and 
the department's enterprise architecture, and (4) the use of 
independent verification and validation reviews, when appropriate. HHS 
concurred with our recommendations. Further, although HHS' comments 
indicated that we made a recommendation related to enterprise 
architecture, as we stated in our conclusions, we did not make 
recommendations about enterprise architecture in this report.

We agree with HHS that progress has been made in its information 
security management. However, HHS did not address the potential impact 
that outstanding departmentwide information security controls 
weaknesses could have on the reliability and integrity of the new 
financial management system. HHS will need to ensure effective 
information security controls departmentwide for UFMS operations.

GAO's Review Process:

In its response to a draft of this report, HHS stated that the timing 
of our review of the UFMS was not optimal and required significant 
staff time for meetings and preparation, document requests, and 
communications. In HHS' opinion, GAO involvement was in itself a 
significant contributor to project schedule risk. In our view, we 
conducted this engagement in a professional, constructive manner in 
which we worked proactively with HHS to provide timely observations on 
the implementation of UFMS. The timing of our review was aimed at 
providing input early in the process so that HHS can act to address 
weaknesses and reduce the risk of implementing a system that does not 
meet needs and expectations and requires costly rework and work-arounds 
to operate. We have found in our reviews of other agencies' system 
implementation efforts that effective implementation of disciplined 
processes can reduce risks that have an adverse impact on the cost, 
timeliness, and performance of a project. Through early recognition and 
resolution of the weaknesses identified, HHS can optimize its 
opportunities to reduce the risks that UFMS will not fully meet one or 
more of its cost, schedule, and performance objectives. Further, in 
performing our review, we made every effort to reduce inconvenience to 
HHS. For example, at HHS' request, we agreed to postpone our initial 
meetings with HHS staff until after the completion of HHS' fiscal year 
2003 financial statement audit. We also followed HHS' protocols in 
scheduling meetings and requested documentation that should have been 
readily available at this stage of the UFMS project. HHS' adoption of 
several of our recommendations evidences the added value of our review, 
and implementation of all 34 of our recommendations will add even 
greater value to the project.

As agreed with your offices, unless you announce the contents of this 
report earlier, we will not distribute it until 30 days after its date. 
At that time, we will send copies to the Chairman and Ranking Minority 
Member, Senate Committee on Governmental Affairs, and other interested 
congressional committees. We are also sending copies to the Secretary 
of Health and Human Services and the Director of the Office of 
Management and Budget. Copies will also be made available to others 
upon request. The report will also be available at no charge on GAO's 
Web site at [Hyperlink, http://www.gao.gov].

If you or your staff have any questions concerning this report, please 
contact Sally E. Thompson, Director, Financial Management and 
Assurance, who may be reached at (202) 512-9450 or by e-mail at 
[Hyperlink, thompsons@gao.gov], or Keith A. Rhodes, Chief Technologist, 
Applied Research and Methods, who may be reached at (202) 512-6412 or 
by e-mail at [Hyperlink, rhodesk@gao.gov]. Staff contacts and other 
key contributors to this report are listed in appendix V.

Signed by: 

Sally E. Thompson: 
Director, Financial Management and Assurance:

Signed by: 

Keith A. Rhodes: 
Chief Technologist, Applied Research and Methods:
Center for Technology and Engineering:

[End of section]

Appendixes: 

Appendix I: Scope and Methodology:

Our review of the Department of Health and Human Services' (HHS) 
ongoing effort to develop and implement a unified accounting system 
focused on one of the three concurrent but separate projects: the 
ongoing implementation of the Unified Financial Management System 
(UFMS) at the Centers for Disease Control and Prevention (CDC), the 
Food and Drug Administration (FDA), and HHS' Program Support Center (PSC). 
This project will be carried out in a phased approach. HHS is currently 
implementing UFMS at CDC, and it is scheduled to go live in October 
2004. The other two projects are the Centers for Medicare and Medicaid 
Services' (CMS) implementation of the Healthcare Integrated General 
Ledger Accounting System to replace the Financial Accounting Control 
System, and the National Institutes of Health's (NIH) implementation of 
the NIH Business and Research Support System to replace the Central 
Accounting System.

To assess HHS' implementation of disciplined processes, we reviewed 
industry standards and best practices from the Institute of Electrical 
and Electronics Engineers (IEEE), Software Engineering Institute (SEI), 
Project Management Institute, Joint Financial Management Improvement 
Program (JFMIP), GAO executive guides, and prior GAO reports. We 
reviewed and analyzed UFMS planning documents related to project 
management, testing, data conversion, requirements management, risk 
management, and configuration management. We also reviewed minutes from 
key meetings, such as the Information Technology Investment Review 
Board meetings, Risk Management meetings, and Planning and Development 
Committee meetings. In addition, we reviewed reports issued by the 
independent verification and validation (IV&V) contractor and 
interviewed the systems integrator to clarify the status of issues 
discussed in the reports.

To assess whether HHS had established and implemented disciplined 
processes related to requirements management, we:

* reviewed strategy and planning documents, including its Financial 
Shared Services Study Concept of Operation, dated April 30, 2004;

* reviewed HHS' procedures for defining its requirements management 
framework and compared these procedures to its current practices;

* reviewed guidance published by IEEE and SEI and publications by 
experts to determine the attributes that should be used in developing 
good requirements and selected over 70 requirements and performed an 
in-depth review and analysis to determine whether they could be traced 
between the various process documents;

* attended the second conference room pilot (the session held in 
Rockville, Maryland) to evaluate whether the test scripts demonstrated 
the functionality of the listed requirements; and:

* reviewed IV&V contractor reports to obtain its perspective on HHS' 
requirements management processes.

To assess the risk management process, we reviewed the 44 risks 
documented in the PMOnline risk management tool to determine the 
current status of each risk and to assess the related mitigation plans. We 
interviewed agency officials to obtain explanations for the status of 
the risks. We analyzed the project schedule and IV&V status reports to 
assess the probability of HHS meeting its projected completion dates 
for development, implementation, and testing.

To assess information technology (IT) management practices, we reviewed 
prior GAO reports on governmentwide investment management and 
enterprise architecture. We also reviewed and analyzed relevant IT 
policies and plans and HHS documentation on the IT investment 
management processes. To assess information security practices, we 
relied on prior years' audit work performed in this area. We reviewed 
pertinent HHS security policies and procedures, and reviewed HHS' 
efforts to minimize potential and actual risks and exposures.

To determine whether HHS had the human resources capacity to 
successfully design, implement, and operate the financial management 
system, we reviewed JFMIP's Core Competencies for Project Managers 
Implementing Financial Systems in the Federal Government, Building the 
Work Force Capacity to Successfully Implement Financial Systems, and 
Core Competencies in Financial Management for Information Technology 
Personnel Implementing Financial Systems in the Federal Government and 
prior GAO reports related to strategic workforce planning. We analyzed 
the UFMS program management office organization chart and obtained 
related information on project staffing. We also interviewed HHS 
officials and the IV&V contractor to discuss staffing resource issues.

For these areas, we interviewed HHS, UFMS, IV&V, and systems integrator 
officials to discuss the status of the project and their roles in the 
project. On April 26, 2004, and May 12, 2004, we briefed HHS management 
on our findings so that action could be taken to reduce risks 
associated with the UFMS project. We performed our work at HHS 
headquarters in Washington, D.C.; at the UFMS site in Rockville, 
Maryland; and at CDC offices in Atlanta, Georgia. Our work was 
performed from September 2003 through May 2004 in accordance with U.S. 
generally accepted government auditing standards. We did not review the 
prior implementation of Oracle at NIH or the ongoing implementation of 
Oracle at CMS. We requested comments on a draft of this report from the 
Secretary of Health and Human Services or his designee. Written 
comments from the Department of Health and Human Services are reprinted 
in appendix IV and evaluated in the "Agency Comments and Our 
Evaluation" section.

[End of section]

Appendix II: Disciplined Processes Are Key to Successful System 
Development and Implementation Efforts:

Disciplined processes have been shown to reduce the risks associated 
with software development and acquisition efforts to acceptable levels 
and are fundamental to successful systems acquisition. A disciplined 
software development and acquisition process can maximize the 
likelihood of achieving the intended results (performance) within 
established resources (costs) on schedule. Although a standard set of 
practices that will guarantee success does not exist, several 
organizations, such as SEI and IEEE, and individual experts have 
identified and developed the types of policies, procedures, and 
practices that have been demonstrated to reduce development time and 
enhance effectiveness. The key to having a disciplined system 
development effort is to have disciplined processes in multiple areas, 
including requirements management, testing, project planning and 
oversight, and risk management.

Requirements Management:

Requirements are the specifications that system developers and program 
managers use to design, develop, and acquire a system. They need to be 
carefully defined, consistent with one another, verifiable, and 
directly traceable to higher-level business or functional requirements. 
It is critical that they flow directly from the organization's concept 
of operations (how the organization's day-to-day operations are or will 
be carried out to meet mission needs).[Footnote 56]

According to IEEE, a leader in defining the best practices for such 
efforts, good requirements have several characteristics, including the 
following:[Footnote 57]

* The requirements fully describe the software functionality to be 
delivered. Functionality is a defined objective or characteristic 
action of a system or component. For example, for grants management, a 
key functionality includes knowing (1) the funds obligated to a grantee 
for a specific purpose, (2) the cost incurred by the grantee, and (3) 
the funds provided in accordance with federal accounting standards.

* The requirements are stated in clear terms that allow for 
quantitative evaluation. Specifically, all readers of a requirement 
should arrive at a single, consistent interpretation of it.

* Traceability among various requirement documents is maintained. 
Requirements for projects can be expressed at various levels depending 
on user needs. They range from agencywide business requirements to 
increasingly detailed functional requirements that eventually permit 
the software project managers and other technicians to design and build 
the required functionality in the new system. Adequate traceability 
ensures that a requirement in one document is consistent with and 
linked to applicable requirements in another document (see the sketch 
following this list).

* The requirements document contains all of the requirements identified 
by the customer, as well as those needed for the definition of the 
system.
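
The following minimal sketch illustrates such a traceability check, 
assuming that each requirement record carries links to a design element 
and a test case; the identifiers are invented and do not refer to 
actual UFMS artifacts.

    # Minimal traceability check: every requirement should link to a
    # design element and a test case. Identifiers are invented.
    requirements = {
        "FR-101": {"design": "DD-17", "test": "TC-204"},
        "FR-102": {"design": "DD-18", "test": None},     # no linked test case
        "FR-103": {"design": None, "test": "TC-209"},    # no linked design element
    }

    for req_id, links in requirements.items():
        missing = [name for name, target in links.items() if target is None]
        if missing:
            print(f"{req_id}: traceability gap, missing {' and '.join(missing)} link")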

Studies have shown that problems associated with requirements 
definition are key factors in software projects that do not meet their 
cost, schedule, and performance goals. Examples include the following:

* A 1988 study found that getting a requirement right in the first 
place costs 50 to 200 times less than waiting until after the system is 
implemented to get it right.[Footnote 58]

* A 1994 survey of more than 8,000 software projects found that the top 
three reasons that projects were delivered late, over budget, and with 
less functionality than desired all had to do with requirements 
management.[Footnote 59]

* A 1994 study found that the average project experiences about a 25 
percent increase in requirements over its lifetime, which translates 
into at least a 25 percent increase in the schedule.[Footnote 60]

* A 1997 study noted that between 40 and 60 percent of all defects 
found in a software project could be traced back to errors made during 
the requirements development stage.[Footnote 61]

Testing:

Testing is the process of executing a program with the intent of 
finding errors.[Footnote 62] Because requirements provide the 
foundation for system testing, specificity and traceability defects in 
system requirements preclude an entity from implementing a disciplined 
testing process. That is, requirements must be complete, clear, and 
well documented to design and implement an effective testing program. 
Absent this, an organization is taking a significant risk that 
substantial defects will not be detected until after the system is 
implemented. As shown in figure 3, there is a direct relationship 
between requirements and testing.

Figure 3: Relationship between Requirements Development and Testing:

[See PDF for image] 

[End of figure] 

Although the actual testing occurs late in the development cycle, 
disciplined test planning activities can help reduce requirements-related 
defects. For example, developing conceptual test cases based on the 
requirements derived from the concept of operations and functional 
requirements stages can identify errors, omissions, and ambiguities 
long before any code is written or a system is configured. Disciplined 
organizations also recognize that planning the testing activities in 
coordination with the requirements development process has major 
benefits.

Although well-defined requirements are critical for implementing a 
successful testing program, disciplined testing efforts for projects 
such as UFMS have several characteristics,[Footnote 63] which include 
the following:

* Testers who assume that the program has errors. Such testers are 
likely to find a greater percentage of the defects present in the 
system. This is commonly called the "testing mindset."

* Test plans and scripts that clearly define what the expected results 
should be when the test case is properly executed and the program does 
not have a defect that would be detected by the test case. This helps 
to ensure that defects are not mistakenly accepted.

* Processes that ensure test results are thoroughly inspected.

* Test cases that include exposing the system to invalid and unexpected 
conditions as well as the valid and expected conditions. This is 
commonly referred to as boundary condition testing and is illustrated 
in the sketch following this list.

* Testing processes that determine if a program has unwanted side 
effects. For example, a process should update the proper records 
correctly but should not delete other records.

* Systematic gathering, tracking, and analysis of statistics on the 
defects identified during testing.
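
The sketch below is a concrete illustration of boundary condition 
testing and of test cases with defined expected results; the validation 
rule, amounts, and threshold are hypothetical and are not taken from 
UFMS requirements.

    # Sketch of boundary condition testing for a hypothetical rule:
    # an obligation must be greater than zero and must not exceed the
    # available allotment. Each case states its expected result.
    def obligation_is_valid(amount: float, allotment: float) -> bool:
        return 0 < amount <= allotment

    ALLOTMENT = 10_000.00
    cases = [
        (0.01, True),        # just inside the lower boundary
        (0.00, False),       # on the lower boundary: invalid
        (-50.00, False),     # invalid, unexpected input
        (10_000.00, True),   # exactly at the upper boundary
        (10_000.01, False),  # just past the upper boundary
    ]

    for amount, expected in cases:
        result = obligation_is_valid(amount, ALLOTMENT)
        status = "pass" if result == expected else "DEFECT"
        print(f"amount={amount:>10.2f} expected={expected!s:5} got={result!s:5} {status}")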

Although these processes may appear obvious, they are often overlooked 
in testing activities.[Footnote 64]

Project Planning and Oversight:

Project planning is the process used to establish reasonable plans for 
carrying out and managing the software project. This includes (1) 
developing estimates of the resources needed for the work to be 
performed, (2) establishing the necessary commitments, and (3) defining 
the plan necessary to perform the work. Effective planning is needed to 
identify and resolve problems as soon as possible, when they are 
cheapest to fix. According to one author, the average project 
spends about 80 percent of its time on unplanned rework--fixing 
mistakes that were made earlier in the project. Recognizing that 
mistakes will be made in a project is an important part of planning. 
According to this author, successful system development activities are 
designed so that the project team makes a carefully planned series of 
small mistakes to avoid making large, unplanned mistakes. For example, 
spending the time to adequately analyze three design alternatives 
before selecting one results in time spent analyzing two alternatives 
that were not selected. However, discovering that a design is 
inadequate after development can result in code that must be rewritten 
two times, at a cost greater than analyzing the three alternatives in 
the first place. This same author notes that a good rule of thumb is 
that each hour a developer spends reviewing project requirements and 
architecture saves 3 to 10 hours later in the project.[Footnote 65]

Project oversight can also be a valuable contributor to successful 
projects. Agency management can perform oversight functions, such as 
project reviews and participating in key meetings, to help ensure that 
the project will meet the agency needs. Management can also use IV&V 
reviews to provide it with assessments of the project's software 
deliverables and processes. Although independent of the developer, IV&V 
is an integral part of the overall development program and helps 
management mitigate risks.

Risk Management:

Risk and opportunity are inextricably related. Although developing 
software is a risky endeavor, risk management processes should be used 
to manage the project's risks to acceptable levels by taking the 
actions necessary to mitigate the adverse effects of significant risks 
before they threaten the project's success. If a project does not 
effectively manage its risks, then the risks will manage the project.

Risk management is a set of activities for identifying, analyzing, 
planning, tracking, and controlling risks. Risk management starts with 
identifying the risks before they can become problems. If this step is 
not performed well, then the entire risk management process may become 
a useless exercise since one cannot manage something that one does not 
know anything about. As with the other disciplined processes, risk 
management is designed to eliminate the effects of undesirable events 
at the earliest possible stage to avoid the costly consequences of 
rework.

After the risks are identified, they need to be analyzed so that they 
can be better understood and decisions can be made about what actions, 
if any, will be taken to address them. Basically, this step includes 
activities such as evaluating the impact on the project if the risk 
does occur, determining the probability of the event occurring, and 
prioritizing the risk against the other risks. Once the risks are 
analyzed, a risk management plan is developed that outlines the 
information known about the risks and the actions, if any, that will 
be taken to mitigate those risks. Risk monitoring is a continuous 
process because both the risks and the actions planned to address them 
need to be monitored to ensure that the risks are being properly 
controlled and that new risks are identified as early as possible. If 
the actions envisioned in the plan are not adequate, then additional 
controls are needed to correct the deficiencies identified.
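
The analysis and prioritization steps can be made concrete with a 
simple risk exposure calculation, as in the sketch below; the risk 
entries, probabilities, and impact scores are invented examples, not 
items from a project's risk register.

    # Sketch of the risk analysis step: score each identified risk by
    # probability and impact, then rank by exposure. Entries are invented.
    risks = [
        {"risk": "data conversion errors", "probability": 0.6, "impact": 9},
        {"risk": "interface defects", "probability": 0.4, "impact": 7},
        {"risk": "key staff shortage", "probability": 0.7, "impact": 5},
    ]

    for r in risks:
        r["exposure"] = r["probability"] * r["impact"]  # simple priority score

    for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
        print(f"{r['risk']:<24} exposure = {r['exposure']:.1f}")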

[End of section]

Appendix III: An Effective Requirements Management Process and the UFMS 
Functionality for CDC Had Not Been Fully Developed:

HHS has not implemented an effective requirements management process to 
reduce requirements-related defects to acceptable levels or to support 
an effective testing process. In reviewing HHS' requirements management 
process, we found (1) the requirements were not based on a concept of 
operations that should provide the framework for the requirements 
development process, (2) traceability was not maintained between 
various requirements documents, and (3) the requirements contained in 
the documents did not provide the necessary specificity. Because of 
these weaknesses, HHS does not have reasonable assurance that it has 
reduced its requirements-related defects to acceptable levels. 
Furthermore, the requirements management problems we noted also prevent 
HHS from developing an effective testing process until they are 
adequately addressed. Although HHS has performed some functions that 
are similar to testing, commonly referred to as conference room pilots, 
to help it determine whether the system will meet its needs, these 
efforts have not provided the quantitative data needed to provide 
reasonable assurance that the system can provide the needed capability. 
Therefore, HHS is depending on system testing, which is not expected to 
start until less than 2 months before system implementation, to provide 
it with the quantitative data needed to determine whether the system 
will meet its needs.

Requirements Were Not Based on a Complete Concept of Operations:

Requirements for UFMS were not based on a concept of operations. The 
concept of operations--which contains a high-level description of the 
operations that must be performed, who must perform them, and where and 
how the operations will be carried out--provides the foundation on 
which requirements definitions and the rest of the systems planning 
process are built. Normally, a concept of operations is one of the 
first documents to be produced during a disciplined development effort. 
According to the IEEE Standards, a concept of operations is a user-
oriented document that describes the characteristics of a proposed 
system from the users' viewpoint.[Footnote 66] Its development is a 
particularly critical step at HHS because of the organizational 
complexity of its financial management activities and the estimated 110 
other systems HHS expects to interface with UFMS.

In response to our requests for a UFMS concept of operations, HHS 
officials provided its Financial Shared Services Study Concept of 
Operation, dated April 30, 2004, that studied several approaches for 
HHS management to consider for implementing shared services.[Footnote 
67] While making a decision on whether to operate in a shared services 
environment is important because it will dictate such items as 
hardware, network, and software needs, this study lacks many of the 
essential elements needed for a concept of operations document that can 
be used to fully inform users about the business processes that will be 
used by UFMS. Without this information, the document cannot serve as 
the foundation for HHS' requirements management processes.

HHS management has stated that it plans to establish centers of 
excellence for UFMS and has identified four functions as candidates to 
begin shared services. These functions are UFMS operations and 
maintenance, customer service (call center), vendor payments, and e-
travel. HHS management also decided that establishing a center of 
excellence for operations and maintenance should begin right away. 
Basically, this center of excellence will perform such UFMS operations 
and maintenance functions as maintaining the data tables in the UFMS 
database, managing various periodic closings, and performing various 
user maintenance functions as well as some security functions. While 
HHS officials advised us that they had selected PSC to operate the 
operations and maintenance center of excellence, there is limited time 
to establish the center before UFMS' planned deployment date at CDC. In 
addition, HHS has still not identified (1) who will operate the other 
centers of excellence and the location(s) performing these functions 
and (2) how these functions will be performed. To address these open 
issues, HHS has asked several HHS operating divisions to submit 
business plans for operating a center of excellence.

We also analyzed various other strategy and planning documents that are 
expected to be used in developing UFMS. Like the Financial Shared 
Services Study Concept of Operation, none of these other documents 
individually or in their totality addressed all of the key elements of 
a concept of operations. For example, operational policies and 
constraints have not been addressed. Moreover, profiles of user classes 
describing each class of user, including responsibilities, education, 
background, skill level, activities, and modes of interaction with the 
current system, have not been developed. In fact, as of May 2004, HHS 
had been unable to reach agreement on all the standard processes that it 
will use. For example, when HHS attempted to develop a standard way of 
recording grant-related information, the project team members were 
unable to reach agreement among the various operating divisions on how 
to develop crosscutting codes that would have to be maintained at the 
departmental level. Part of the process of developing a concept of 
operations for an organization includes describing how its day-to-day 
operations will be carried out to meet mission needs. The project team 
tasked with developing and implementing a UFMS common accounting system 
attempted to develop standardized processes that would be used for the 
UFMS project. Team members held meetings with several different operating 
divisions to reach agreement on how the processes should be structured. 
However, an agreement among the various parties could not be 
reached, and the decision on how these processes would be defined was 
deferred for further discussion for at least 6 months.

Since standardized processes could not be agreed upon at the outset, 
additional requirements definition and validation activities must be 
conducted later in the development cycle when they are more costly to 
implement. In addition, process modifications will affect all users, 
including those who have been trained in and perform financial 
management functions using the original process. These users may have 
to undergo additional training and modify their existing understanding 
of how the system performs a given function.

Because HHS has not developed a complete concept of operations, 
requirements definition efforts have not had the benefit of 
documentation that fully depicts how HHS' financial system will 
operate, and so HHS cannot ensure that all requirements for the 
system's operations have been defined. Without well-defined 
requirements, HHS cannot be certain that the level of functionality 
that will be provided by UFMS is understood by the project team and 
users and that the resulting system will provide the expected 
functionality.

Approach to Requirements Management Does Not Provide Traceability or 
the Necessary Specificity:

HHS has adopted an approach to requirements development that its 
officials believe is suited to the acquisition and development of 
commercial off-the-shelf software (COTS). HHS officials have stated 
that the requirements management process that we reviewed was adopted 
based on their belief that for COTS development, they do not need to 
fully define the UFMS requirements because UFMS is not a traditional 
system development effort. Therefore, they adopted the following 
approach.

* Define high-level requirements that could be used to guide the 
selection and implementation of the system.

* Understand how the COTS-based system meets the high-level 
requirements defined for UFMS and how HHS must (1) modify its existing 
processes to match the COTS processes or (2) identify the areas or gaps 
requiring custom solutions.

* Develop specific requirements for the areas that require custom 
solutions and document those requirements in the requirements 
repository tool as derived requirements.

HHS used a hierarchical approach to develop the specific requirements 
from the high-level requirements used to acquire the system. These 
high-level requirements and the related supporting documentation were 
expected to help HHS identify the requirements that could not be 
satisfied by the COTS product. This approach includes using the high-
level requirements to (1) update the requirements through process 
design workshops, which generated business processes; (2) establish 
initial baseline requirements; (3) perform a fit/gap analysis; (4) 
develop gap closure alternatives; and (5) create the final baseline 
requirements. The key advantage in using such a hierarchy is that each 
step of the process builds upon the previous one. However, unidentified 
defects in one step migrate to the subsequent steps where they are more 
costly to fix and thereby increase the risk that the project will 
experience adverse effects on its schedule, cost, and performance 
objectives.
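
The fit/gap step can be pictured as a simple worksheet over the 
baseline requirements, as in the sketch below; the requirement 
identifiers and dispositions are invented for illustration.

    # Sketch of a fit/gap worksheet over baseline requirements. A "fit"
    # is met by the COTS product as delivered; a "gap" needs a closure
    # alternative. Identifiers and dispositions are invented.
    baseline = {
        "FR-201": "fit",
        "FR-202": "gap",
        "FR-203": "fit",
        "FR-204": "gap",
    }

    gaps = sorted(req for req, disposition in baseline.items() if disposition == "gap")
    print(f"{len(gaps)} of {len(baseline)} requirements need gap closure alternatives:")
    for req in gaps:
        print(f"  {req}")
    # Note: a "fit" is only as reliable as the analysis behind it; an
    # unverified fit that later proves to be a gap surfaces late, when
    # it is costlier to fix.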

HHS recognized that the high-level requirements associated with the 
COTS processes are "by definition, insufficient to adequately define 
the required behavior of the COTS based system." However, HHS has 
stated that UFMS will be able to demonstrate compliance with these 
requirements as well as the requirements derived from high-level 
requirements associated with its custom development through traditional 
testing approaches including demonstrations and validations.

We agree with HHS' position that requirement statements for COTS 
products need to be more flexible and less specific before a product is 
selected because of the low probability that any off-the-shelf product 
will satisfy the detailed requirements of an organization like HHS. As 
HHS has noted, COTS products are designed to meet the needs of the 
marketplace not a specific organization. However, once the product is 
selected, requirements must be defined at a level that allows the 
software to be configured to fit the system under development and 
implemented to meet the organization's needs. As noted elsewhere, on 
the basis of the requirements we reviewed, HHS had not accomplished 
this objective. Furthermore, we identified numerous instances in which 
documented requirements used to design and test the system were not 
traceable forward to the business processes and therefore could not 
support the next step in the hierarchy. This is 
commonly referred to as traceability.[Footnote 68] In addition, the 
requirements (1) lacked the specific information necessary to 
understand the required functionality that was to be provided and (2) 
did not describe how to determine quantitatively, through testing or 
other analysis, whether the systems would meet HHS' needs.

One example showing that HHS did not adequately define a requirement 
and maintain traceability through the various documents involves 
general ledger entries. The high-level requirement stated that the system "shall 
define, generate, and post compound general ledger debit and credit 
entries for a single transaction." The system was also expected to 
"accommodate at least 10 debit and credit pairs," but this information 
was not included in the process document for the Create Recurring 
Journals process, to which the requirement was tied. Therefore, someone 
implementing this functionality from this process document would not 
know the number of debit and credit pairs that must be supported. 
Furthermore, in April 2004, HHS conducted a demonstration for the users 
to validate that this functionality had been implemented. Although the 
demonstration documentation stated that this requirement would be 
covered, none of the steps in the test scripts[Footnote 69] actually 
demonstrated (1) how the system would process a general ledger entry 
that consisted of 10 debit and credit pairs or (2) examples of 
transactions that would require such entries. Since HHS has neither 
demonstrated the functionality nor defined what entries need to be 
supported, HHS does not yet have reasonable assurance the system can 
address this requirement.
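
To suggest what quantitatively verifying this requirement might look 
like, the following sketch checks a single compound entry for at least 
10 debit and credit pairs that balance; the account numbers and amounts 
are invented, and the sketch is not a UFMS test script.

    # Sketch: verify a single compound entry posts at least 10 balanced
    # debit and credit pairs. Account numbers and amounts are invented.
    entry = [("debit", f"14{n:02d}", 100.00) for n in range(10)] + \
            [("credit", f"21{n:02d}", 100.00) for n in range(10)]

    debits = sum(amount for side, _, amount in entry if side == "debit")
    credits = sum(amount for side, _, amount in entry if side == "credit")
    pairs = min(sum(1 for side, _, _ in entry if side == "debit"),
                sum(1 for side, _, _ in entry if side == "credit"))

    assert debits == credits, "compound entry does not balance"
    assert pairs >= 10, "fewer than 10 debit and credit pairs"
    print(f"compound entry: {pairs} pairs, debits = credits = {debits:,.2f}")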

HHS expects that UFMS will be able to demonstrate compliance with the 
HHS high-level requirements as well as the derived requirements 
associated with its custom development through traditional testing 
approaches including demonstrations and validations. However, we found 
that as of May 2004, the necessary information to evaluate future 
testing efforts had not been developed for many of the requirements 
that we reviewed.

Conference Room Pilots Provide Little Confidence in Functionality:

HHS has conducted two conference room pilots that were intended to help 
determine and validate that the UFMS design and configuration meet HHS 
functional requirements. Such demonstrations, properly implemented, 
could be used to reduce the risks associated with the requirements 
management process weaknesses we identified. However, based on our 
review of the conference room pilots, the pilots did not (1) 
significantly reduce the risks associated with the requirements 
management processes discussed above or (2) provide HHS with reasonable 
assurance that the functionality needed by its users had been 
implemented in UFMS.

The first conference room pilot, held in August 2003, was designed to 
(1) demonstrate the functionality present in the COTS system that HHS 
believed could be used without modification and (2) identify any gaps 
in the functionality provided by the base system. The second conference 
room pilot in March and April 2004[Footnote 70] was conducted to 
demonstrate the functionality present in the system that should be 
available for the October 2004 implementation at CDC. This 
demonstration was expected to show that the gaps in functionality 
identified in the first conference room pilot had been addressed. 
Problems with these demonstrations include the following:

* The IV&V contractor noted that some of the test scripts involved a 
number of requirements that were only partially addressed or not 
addressed at all. The IV&V contractor expressed concern that HHS would 
not be mapping these requirements designated as "fits"[Footnote 71] to 
test cases until system testing. According to the IV&V contractor, if 
some of the "fits" turn out to be "gaps" as a result of system testing, 
HHS may not have enough time to provide a solution without compromising 
the project schedule.

* In our observations of the second conference room pilot held in March 
and April 2004, we noted several cases in which the users were told 
that the system's approach to address a given issue had not yet been 
defined but that the issue would be resolved before the system was 
deployed. One such issue was the process for handling erroneous 
transactions received from other systems. For example, procedures to 
correct errors in the processing of voucher batches had not been fully 
defined as of the demonstration. HHS officials stated that this would 
be addressed after this second conference room pilot. Additionally, 
during the demonstration it was unclear how five-digit object class 
codes used in the system would migrate to interfacing systems. We 
observed that four-digit object class codes from certain grant systems 
were cross-walked to five-digit object class codes when interfaced with 
the Oracle system. However, it was not clear how the data would be 
converted back to four-digit object class codes to flow back to the 
grant systems (see the sketch following this list).

* The scripts used for the second conference room pilot did not 
maintain traceability to the associated requirements.
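
The round-trip issue can be illustrated with a minimal crosswalk 
sketch; the object class codes below are invented, and the point is 
that data can flow back to the grant systems only if a reverse mapping 
is defined for every five-digit code the system can produce.

    # Sketch of the object class crosswalk round trip. Codes are invented;
    # data can flow back to the grant systems only if every five-digit
    # code has a defined four-digit equivalent.
    forward = {"2511": "25110", "2512": "25120"}              # grant system -> UFMS
    reverse = {five: four for four, five in forward.items()}  # UFMS -> grant system

    def to_grant_system(five_digit: str) -> str:
        if five_digit not in reverse:
            raise ValueError(f"no four-digit equivalent defined for {five_digit}")
        return reverse[five_digit]

    for code in ("25110", "25999"):
        try:
            print(f"{code} -> {to_grant_system(code)}")
        except ValueError as err:
            print(err)  # the unresolved case observed at the pilot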

In discussing our observations on the March and April 2004 conference 
room pilot, HHS officials stated that the conference room pilots were 
not a phase of formal testing but rather a structured working session 
(first conference room pilot) and a demonstration (second conference 
room pilot). However, they stated that the system test in August 
2004--less than 2 months before the system is implemented at CDC--would 
verify that UFMS satisfies all requirements and design constraints.

[End of section]

Appendix IV: Comments from the Department of Health and Human Services:

DEPARTMENT OF HEALTH & HUMAN SERVICES:
Office of Inspector General:
Washington, D.C. 20201:

SEP 8 2004:

Sally E. Thompson:
Director, Financial Management and Assurance:
United States Government Accountability Office:
Washington, DC 20548:

Dear Ms. Thompson:

Enclosed are the Department's comments on your draft report entitled, 
"Financial Management Systems: Lack of Disciplined Processes Puts 
Implementation of HHS' Financial System at Risk" (GAO-04-1008). The 
comments represent the tentative position of the Department and are 
subject to reevaluation when the final version of this report is 
received.

The Department appreciates the opportunity to comment on this draft 
report before its publication.

Sincerely,

Signed by:

Lewis Morris:

Chief Counsel to the Inspector General:

Enclosure:

The Office of Inspector General (OIG) is transmitting the Department's 
response to this draft report in our capacity as the Department's 
designated focal point and coordinator for Government Accountability 
Office reports. OIG has not conducted an independent assessment of 
these comments and therefore expresses no opinion on them.

HHS Unified Financial Management System:

Response to the GAO Review of the Unified Financial Management System 
Implementation:

Prepared by:

U.S. Department of Health and Human Services:
Office of Program Management and Systems Policy:
200 Independence Avenue, SW:
Washington, DC 20201:

Version 1.1:
September 7, 2004:

Document Revision History:

Date: 9/03/2004;
Version: 1.0;
Level of Review: UFMS;
Summary of Contents: Initial drafting.

Date: 9/07/2004;
Version: 1.1;
Level of Review: UFMS;
Summary of Contents: Correct Figure 1.

[End of table]

Table of Contents:

1.0: EXECUTIVE SUMMARY:

2.0: STRATEGIC RESPONSE:
2.1: Overall Implementation Strategy/Discipline:
2.2: Observations on GAO's Review Process:

3.0: TACTICAL ANALYSIS:
3.1: Impacts of an Aggressive Project Plan:
3.2: Requirements Management:
3.3: Program Management Oversight:
3.4: Risk Management:
3.5: HHS Information Technology Management:

4.0: RESPONSE TO RECOMMENDATIONS:
4.1: Crosswalk-GAO Recommendations to Response Sections:
4.2: Assessment of October Release Strategy:

5.0: RECOMMENDED CORRECTIONS:

APPENDIX A: ACRONYMS:

APPENDIX B: GLOSSARY OF TERMS:

[End of table of contents]

Table of Figures:

Figure 1 - High level UFMS Program Plan:

Figure 2 - Traditional vs. R^2i Implementation Methodology:

Figure 3 - COTS Design Process:

Figure 4 - UFMS CPI and SPI Performance:

Figure 5 - UFMS Governance Structure:

1.0: EXECUTIVE SUMMARY:

EXECUTIVE SUMMARY:

In August 2004 the Secretary of the Department of Health and Human 
Services (HHS) received the Draft GAO Report assessing the Unified 
Financial Management System (UFMS) implementation. GAO's report cited 
lack of discipline in specific implementation processes and Information 
Technology governance areas related to the implementation of HHS' 
Unified Financial Management System (UFMS) project. Herein, HHS 
presents its response to the subject report. This response clarifies 
HHS' implementation strategy and approach for the Unified Financial 
Management System (UFMS) project and addresses issues cited in the GAO 
report. We contend a more appropriate titling of the report would be: 
"Aggressive Schedule Increases Risk of Implementation of HHS' Financial 
Management System". The following Executive Summary highlights the most 
important points in HHS' detailed response from both the strategic and 
tactical perspectives.

HHS Implementation Strategy for the Unified Financial Management System 
Project:

The development and implementation of UFMS, like other complex 
technology projects, is inherently risky. HHS has chosen an 
implementation strategy that is well governed and aggressive. We have 
also prudently placed the UFMS under the scrutiny of an independent 
verification and validation (IV&V) agent who has the duty of 
monitoring, assessing and reporting on the rigor and execution of our 
management processes to the HHS Assistant Secretary for Budget, 
Technology, and Finance (ASBTF). Indeed, the findings in the GAO report 
were issues that were previously identified as a result of this 
governance and IV&V oversight. Our approach to using an IV&V was 
validated by GAO's use of IV&V analysis in the report. Our detailed 
response on GAO issues in each of these areas is presented in this 
document and summarized in the following paragraphs.

Issue Area 1 - Impacts of an Aggressive Project Plan:

One of the most challenging aspects of any COTS implementation is the 
continual management of the inter-related but sometimes competing 
priorities of cost, schedule, requirements, and resources. Early in the 
program, the UFMS leadership team made the decision that incremental 
benefits from UFMS would be obtained through a phased deployment of the 
system. A well-defined set of phases was established. A core set of 
functional requirements will be available in the October 2004 release 
for Centers for Disease Control and Prevention (CDC) and Food and Drug 
Administration (FDA). Additional capabilities will be added in 
subsequent releases resulting in a complete, Department wide core 
accounting system in 2007. This is an industry best practice risk 
reduction technique, and also allows the UFMS program to give priority 
to meeting the October 2004 "go live" schedule for CDC and FDA. All 
things being equal, if a system functional capability becomes high risk 
for the pilot implementation, it can be deferred to a subsequent 
release without impacting the overall implementation.

On the topic of how the UFMS schedule risk is being managed, HHS 
decided at the beginning of our pilot CDC implementation to push 
aggressively to meet an FY 2005 deployment. October 2004 was chosen as 
the aggressive goal in order to rapidly uncover system defects and 
increase chances that the system would go live in FY 2005. This 
strategy ensures that if the team encountered unsuspected technical 
issues and risks during the system build and testing phases, adequate 
time would remain in 2005 to deploy a quality system. This strategy is 
being executed within a UFMS governance and risk management framework 
that is rigorously managed. Risks are identified in a timely manner and 
scope and budgets are filtered through governance bodies consisting of 
Chief Information Officers (CIOs), Chief Financial Officers (CFOs) and 
executive managers from all of the major HHS operating divisions. This 
level of oversight and partnership has helped to ensure that the UFMS 
program management office continues to follow industry accepted 
processes for the system implementation and that key processes and 
milestones are not circumvented in order to meet the October 2004 
objective for the CDC and FDA implementation.

This management framework also exists to ensure that the key 
disciplines needed to implement UFMS are effectively executed. The UFMS 
program adheres to detailed plans in risk management, change 
management, quality assurance, configuration management, earned value 
management and critical path schedule management. The GAO reported on 
known imperfections in some of these processes. However, the UFMS 
program has for almost three years managed to mitigate the cost, 
schedule and technical risks well enough to keep the project within 
budget and has reached every major milestone to date within the planned 
timeframes. This is a testament to these and other UFMS management 
disciplines, notwithstanding known needed improvements. All of these 
needed improvements in our execution were identified and documented 
prior to and during the GAO review and continue to be implemented.

Issue Area 2 - Requirements Management:

HHS disagrees with the assertion that the most appropriate requirements 
management process is a custom development model. HHS' method of 
requirements management is carefully designed to follow industry best 
practices, including those of Oracle itself. In COTS-based systems, 
requirements statements need to be much more flexible and less specific 
since COTS products are designed to meet the needs of a marketplace 
instead of satisfying the needs of a particular organization. In the 
traditional custom development model, detailed requirements are 
developed at the onset of the program in order to build a custom 
solution that exactly meets those requirements. UFMS, an 
implementation of commercial off-the-shelf (COTS) software, is not a 
typical software development effort and therefore does not require as 
much definitional 
rigor in requirements management. This implementation is more focused 
on refitting existing HHS business practices to use the system as it 
was designed by the COTS vendor and configuring the software to meet 
the needs of the HHS business. Where typical software development is 
required such as in developing interfaces between UFMS and the many HHS 
feeder systems, the UFMS implementation team does follow a very typical 
and rigorous requirements definition, design and development process. 
However, this is not the primary focus of the UFMS implementation 
efforts.

On the topic of a concept of operations, we agree with GAO that an 
administrative concept of operations for UFMS did not exist at the 
beginning of the project. HHS disagrees, however, with the notion that 
such a concept of operations is a prerequisite to disciplined 
requirements management. This is not an oversight on the part of HHS 
executive management. HHS has, in fact, laid a course for financial 
management for the department that will unify operations consistent 
with the Secretary's vision for "One HHS". Based on analyses during 
UFMS planning, HHS executives explored various business models for the 
Department, all aimed at unification of process and achieving economies 
of scale. To this end, HHS published a "UFMS Core Financial Target 
Business Model" document during UFMS planning that details much of this 
thought on concepts for operation. Through this initial planning HHS 
concluded that it was more prudent to implement UFMS in a manner that 
provides HHS flexibility in enabling any business model it chooses in 
the future.

The UFMS program has a very detailed, disciplined process for tracing 
requirements from inception through testing. At the time that GAO 
completed its review, HHS had yet to develop its detailed plans for 
testing the UFMS system. These plans are being executed by our team 
with each and every system requirement assigned a tracking number and 
associated defect tracking, resolution and testing results where 
appropriate. Furthermore, the UFMS team has completed weeks of unit and 
integration testing of the system without uncovering any situations 
where a requirement had to be changed as a result of testing. This is 
an extremely good signal that the UFMS system requirements were 
defined thoroughly, since this is typically the point in the system 
development lifecycle where unclear requirements are discovered.

The requirements traceability process was never an oversight by HHS; 
it was always planned that traceability would be carried through as 
the UFMS testing process was documented and integrated into our 
requirements management process.

HHS acknowledges GAO's comment that testing of this system is 
occurring relatively late in relation to the October objective for 
deployment of the Global Pilot. At the time of the 
writing of this response HHS is analyzing system integration test 
results prior to deploying the first release of the system at the CDC 
and FDA. This assessment may result in a recommendation to the UFMS 
Steering Committee to revise the current software release strategy.

Issue Area 3 - Project Management Oversight:

Following is a summary of the areas that were cited in the report for 
improvement and HHS' response to each:

Personnel and Human Capital Management: The issue of human capital is 
one that HHS has managed carefully. UFMS faces its share of challenges 
in obtaining full-time federal staff due to the temporary nature of an 
implementation project. Our objective remains to staff a highly 
competent program team and not a permanent federal bureaucracy. 
However, at the CDC level, where the current phase of deployment is 
taking place, the project is adequately staffed. CDC has the vast 
majority of its required positions filled, and has developed creative 
arrangements with contractors and rearranged Global responsibilities 
to enable UFMS to be delivered successfully. To 
date, there has been minimal impact on the project due to human capital 
issues, and plans to acquire necessary resources for upcoming aspects 
of the UFMS project are in place.

UFMS Critical Path Management (CPM): An August 2003 UFMS IV&V review of 
the HHS Critical Path methodology identified that an effective critical 
path analysis had not been developed. Following the IV&V review of the 
HHS Critical Path methodology, HHS initiated steps to implement the 
recommendations provided by the IV&V contractor. Since October 2003 HHS 
has used TeamPlay to automatically generate the critical path report 
for the UFMS Global Pilot/CDC release and reviews the report on a 
weekly basis. The critical path report is calculated using activity 
status that is updated in TeamPlay weekly. HHS agrees with the IV&V 
that in isolation the critical path report does not provide a complete 
picture of program health. However, when the critical path report is 
viewed in conjunction with activity reports provided in TeamPlay HHS is 
able to effectively monitor the health of the UFMS schedule.

Earned Value Management (EVM) Procedures: HHS uses Primavera, named a 
project portfolio management 'Leader' by Gartner for five consecutive 
years, to calculate its EV. The HHS use of Primavera meets 94% of ANSI 
standards. At HHS' request, the IV&V contractor performed a review of 
the EVM procedures. Although it found the procedures substantially 
compliant with the ANSI standards, the IV&V report noted the need for 
HHS to include Federal hours in the EV calculations. After analysis, 
HHS determined it 
was not worth the additional investment to make its Primavera EVM fully 
ANSI compliant, and OMB provided a functional workaround for the 
calculation of the measure to include the Federal hours.

Quantitative Outcome Measures to Assess Program Status: Since the 
inception of the project, HHS has placed a focus on adequate 
performance measures for UFMS. Our focus has been on measuring three 
key program control facets instead of instituting outcome measures all 
along the implementation pathway. These areas are quality, cost, and 
schedule.

HHS contends that it has struck an adequate balance between the number 
of measures used to assess UFMS progress and the effort and costs 
required to maintain them. HHS agrees with GAO that these measures are 
crucial to UFMS success and we will therefore continue to assess and 
improve our use of them based on our past lessons and future needs.

Issue Area 4 - Risk Management:

The UFMS project relies on a well-implemented risk management process 
that uses business best practices developed by leading providers across 
market segments. The UFMS risk management process is the result of a 
Cooperative Research and Development Agreement (CRADA) between 
BearingPoint and the Software Engineering Institute (SEI) to co-develop 
a best practice based risk management program. The techniques were 
encapsulated into a UFMS risk management methodology that has been 
successfully applied to a wide variety of commercial and federal 
clients, from large multi-billion dollar, multi-year programs to 
smaller projects lasting less than a year.

GAO determined that although HHS has documented and implemented a risk 
management process for the project, its effectiveness is hampered by 
examples of risks being closed before their solution had been 
completed. HHS agrees with this observation and has since revised the 
risk management processes to keep all risks open until they are either 
realized or an appropriate mitigation has been successful.

Issue Area 5 - HHS Information Technology Management:

Governance: HHS is confident of its existing information technology 
management disciplines and has bolstered its IT management processes 
further with a multi-layered UFMS governance structure described later 
in this response. HHS plans to make UFMS a critical component of its 
enterprise architecture, and this separate but integrated governance 
structure was established to ensure that key stakeholders throughout 
HHS come together to accomplish the Secretary's vision for unified 
financial management at HHS. To this end, the UFMS governance structure 
provides a management framework that enables strategic direction and 
leadership to support successful implementation of the program. HHS 
believes that its overall management of IT investment management and 
architecture are augmented by the UFMS governance structure and 
executive oversight policies. Furthermore, the UFMS governance, which 
includes a change control governing body, stands as a cornerstone 
around which HHS' future enterprise architecture and IT management 
practices can be continually enhanced.

Enterprise Architecture: GAO noted UFMS was at a higher level of 
Enterprise Architecture attainment than 97% of other agencies, having 
completed all of stage 2 readiness, along with significant components 
of stage 3. UFMS is a critical and defining part of the Federal 
government's overall Enterprise Architecture. Even with its advanced 
state of EA readiness, HHS cannot design a complete Enterprise 
Architecture, as the GAO review recommends, due to the changing 
external environment. Even in the face of significant forces of 
change, the department has made great advances in creating a 
consolidating and unifying infrastructure for the Federal Government's 
Enterprise Architecture, and UFMS is the first ERP to do so, serving 
as a proof of concept for an architecture spanning different agencies 
and standardized processes.

More than one year elapsed between the gathering of information for 
the HHS OIG's FISMA report for FY03 and GAO's use of that information 
as the foundation of its security management findings in this report. 
During this span of time, much progress has been made in IT 
security management within HHS. Certification and Accreditation (C&A) 
of major systems is a very high priority for the Department, as 
demonstrated (and reinforced) by the focus on C&A milestones in OMB's 
E-Government scoring on HHS's PMA scorecard. HHS has received C&A 
confirmation for 95% (18 of 19 systems) of the systems associated with 
the implementation of UFMS at the CDC. The outstanding system is the 
HHSnet and enterprise wide network upgrade scheduled for implementation 
in late October 2004. Clearly HHS has made significant progress in the 
C&A arena.

Summary:

HHS is focused on executing its processes successfully. We disagree 
with the premise of the GAO report that a lack of discipline is 
placing the UFMS program at risk. Our disciplines have in fact kept 
the UFMS program on a successful path. This is despite 
the fact that UFMS, like other large systems implementations, faces 
known and unknown challenges to achieving our goals. HHS' approach to 
the UFMS implementation is well governed and aggressive. Our processes 
for program and risk management, requirements management, configuration 
management, quality assurance and testing are clear and rigorously 
executed.

As originally planned and since GAO completed its report, HHS has 
completed or begun the following activities discussed in the report:

* Established quantitative outcome measures.

* Wrote all test scripts for the October release.

* Wrote test plans for each test phase for the October release.

* Populated our Requirements Traceability Verification Matrix.

* Began taking steps on some of the recommendations from our IV&V that 
were later highlighted by GAO.

* Began reassessing the CDC deployment schedule in September 2004, with 
replanning if necessary.

* Revised our Risk Management review and closure process.

[End of section]

2.0: STRATEGIC RESPONSE:

2.1. Overall Implementation Strategy/Discipline:

The GAO report critiques the HHS Oracle implementation as being at 
risk due to the lack of a disciplined approach. HHS believes its 
approach, though not the one promulgated by GAO, is not only 
disciplined, but is in fact the most appropriate for the needs of the 
project. The risk inherent in the HHS approach comes not from a lack of 
discipline, but from an aggressive project plan, which was designed to 
begin securing value for the taxpayer and HHS community at the earliest 
possible time. The UFMS project plan does contain significant risk, but 
is supported by a robust risk mitigation plan, which is carefully 
managed on a daily basis. The GAO methodology against which the UFMS 
methodology has been compared is appropriate for a customized ERP 
development where there is a large design component and a long and 
careful build process. However, HHS deliberately chose a JFMIP-
certified COTS financial product and CMM Level 3-certified integrator 
to implement an ERP configuration strategy that has proven effective at 
several federal agencies. The difference between an ERP development/
design and an ERP configuration with minimal customization is 
considerable, and impacts the choice of methodologies for 
implementation. HHS' choice not to follow an implementation strategy 
such as the one advocated in this review was a conscious one, and is 
consistent with the best practices for COTS ERP implementations.

2.2. Observations on GAO's Review Process:

HHS believes there are several important points that need to be made 
about the process that the GAO followed during the review of the HHS 
Oracle implementation. It's important to note that this is the first in 
a series of twenty-four (24) CFO Agency reviews of financial management 
systems.

The timing of this review was not optimal, with this response due at 
approximately the same time as the October release. The GAO review 
occurred in the middle of the HHS implementation, at a point where many 
of the key items noted in the review were just starting to be 
developed. HHS strongly recommends that this practice not be followed 
in subsequent CFO Agency Reviews. The impact of such a review should be 
factored into the project plan.

The underlying argument of the report is that there is one correct 
way to perform an ERP implementation that shows sufficient discipline 
to reduce risk. HHS is not following a traditional waterfall 
methodology. In COTS-based systems, requirements statements need to be 
much more flexible and less specific [NOTE 1] since COTS products are 
designed to meet the needs of a marketplace instead of satisfying the 
needs of a particular organization [NOTE 2]. HHS has followed the 
methodology of the most successful implementers of COTS ERP systems.

The GAO analysis that led to this report took 8 months, as opposed to 
the 2-3 months projected; varied widely in its topics of research; and 
was 
itself a significant contributor to project schedule risks. GAO 
involved a total of 15 members of its staff throughout this review.

Over 130 different official UFMS documents, meeting minutes, reports, 
organization charts, meeting attendee lists, and budget extracts were 
supplied to GAO.

Beginning with the September 29, 2003 entrance conference, the UFMS PMO 
GAO Review coordinator attended over 40 meetings. This count does not 
include additional sessions conducted at the Humphrey Building and at 
CDC in Atlanta and Ft. Collins, Colorado. HHS estimates that these 
meetings consumed over 230 person-hours of the UFMS team. Taking into 
consideration preparation time for these meetings, follow-up on 
document requests, gathering of documents, and communications, we 
estimate an additional 460 person-hours to comply with GAO needs. In 
total, 
approximately 700 hours from over 30 individuals associated with the 
project, including a large number of senior staff and executives, were 
expended during this review.

[End of section]

3.0: TACTICAL ANALYSIS:

3.1. Impacts of an Aggressive Project Plan:

One of the most challenging aspects of any COTS implementation is the 
continual management of the inter-related but sometimes competing 
priorities of cost, schedule, requirements, and resources. Early in the 
program, the UFMS leadership team decided that incremental benefits 
from UFMS would be obtained through a phased deployment of the system. 
A well-defined set of phases was established, with the CDC acting as 
the pilot implementation for the department. A core set of functional 
requirements will be available in the October 2004 release for CDC and 
FDA. Additional capabilities will be added in subsequent releases 
resulting in a complete, Department wide core accounting system in 2007 
(refer to Figure 1-High Level UFMS Program Plan below). This is an 
industry best practice risk reduction technique that also allows the 
UFMS program to give priority to meeting the October 2004 "go live" 
schedule for CDC and FDA. All things being equal, if a system 
functional capability becomes high risk for the pilot implementation, 
it can be deferred to a subsequent release without impacting the 
overall implementation.

Figure 1 - High level UFMS Program Plan:

[See PDF for image]

[End of figure]

The flexibility afforded by the phased implementation approach, 
combined with the CMM level 3 compliant development processes, 
provides the balance necessary to manage the risks associated with an 
aggressive but achievable program schedule. One key risk in this 
approach, as GAO 
identified, is that the formal testing phase comes late in the overall 
timeline such that very limited time is available to resolve and retest 
any unexpected issues uncovered.

Testing of COTS based systems, like UFMS, takes on a significantly 
different focus from the testing of custom developed systems. Among 
the key reasons for choosing a COTS based implementation is to 
leverage the investment made by the COTS vendor in producing a mature 
product that has been thoroughly tested. Very mature products, such as 
Oracle U.S. Federal Financials, require little or no low-level 
testing. Functional tests based on HHS specific business processes 
that exercise the underlying COTS product are sufficient. 
Consequently, the test effort is concentrated at the system level, on 
code developed for HHS specific extensions and interfaces. The other 
important difference in COTS based implementations is the inclusion of 
the Finance, Business, and Program stakeholders in the testing process. 
Industry experience has repeatedly shown that including these key 
stakeholders in testing can be used to set expectations and introduce 
the future users to the system in a gradual way. The UFMS test effort 
is a multi-phased approach beginning with Conference Room Pilot (CRP) 
activities, progressing through formal test activities, and culminating 
in a User Acceptance Test (UAT).

The Finance, Business, and Program leaders, who have been active in the 
project and its design from the beginning, are heavily involved in 
testing the end product. The UFMS project will go through multiple 
CRPs, and multiple mock data conversions. A series of Go/No-Go 
checkpoints will be passed in these testing phases, which had not yet 
begun at the time of the review. The GAO review takes issue with 
the timing of the testing in the project plan and HHS agrees that 
system testing ideally occurs earlier in the schedule. However, even 
though the testing occurs relatively late in the timeline, it is 
subject to extreme scrutiny and management oversight, with regular 
review meetings, daily summaries and detailed communication. All test 
scripts and results are rigorously tracked in TestDirector, and testing 
teams manage defects on a daily basis. Based on this level of scrutiny, 
and in combination with basing UFMS on a very mature COTS product, HHS 
believes its testing plan will identify the significant majority of any 
defects in the system.

Each testing phase (CRPs, Unit-level testing, Integration Testing, 
System Testing, UAT) has a detailed plan developed that defines what 
will be tested, how it will be tested, where it will be tested, and who 
will test it. The results of each phase are recorded, defects noted, 
and corrective actions taken and functionality retested in each testing 
phase as necessary.

CRPs, held in August 2003 and March 2004, were the first phase of 
preliminary testing. These CRPs were used to validate the initial 
system configurations and the system's ability to meet the requirements 
deemed necessary for the CDC release in October 2004. Both CRPs were 
widely attended by Finance, Business, and Program staff members from 
agencies across HHS - the same people who helped identify the program 
requirements and will ultimately use UFMS.

Unit testing is the first phase of formal testing HHS utilizes to 
verify that individual UFMS units meet design requirements and/or 
design constraints and that no differences exist between the unit's 
expected behavior and the unit's actual behavior. Unit test results 
are recorded and reviewed; however, statistics of defects encountered 
are not maintained because developers go through multiple iterations 
of unit testing as extensions, interfaces, workflows, reports, and 
conversion programs are developed.

Integration testing, the next phase of HHS formal testing, verifies the 
interaction between groups of related units, verifying that each unit 
functions properly when invoked by another unit. Integration testing 
started in July 2004 and includes exercising standard Oracle 
functionality as well as interfaces, extensions, workflows, 
conversions, and reports. To date, 181 defects have been identified 
during integration testing, of which 94% have been resolved.
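
As a rough illustration of the arithmetic behind those figures (the 
defect counts are taken from the text above; the calculation itself is 
illustrative, not part of the UFMS tooling):

# Resolution rate for the integration-test defects reported above:
# 181 identified, 94% resolved.
defects_identified = 181
resolution_rate = 0.94
defects_resolved = round(defects_identified * resolution_rate)   # ~170
defects_open = defects_identified - defects_resolved             # ~11
print(f"{defects_resolved} resolved, {defects_open} still open")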

System test is a separate and distinct activity that re-uses existing 
test cases to verify that UFMS satisfies all requirements and design 
constraints and applies accurate accounting treatment. All functional 
development, defect resolution, and integration activities are 
complete before code is promoted to System test. HHS has separated 
system test 
into two distinct phases - Functional System Test and Infrastructure 
System Test. Functional System Test verifies the HHS integrated 
business processes and accounting treatment, while Infrastructure 
System Test verifies high-availability, disaster recovery, network, 
security, data transfer with external systems, performance, and end-to-
end processes. Results from both phases are tracked in the Requirements 
Traceability Verification Matrix (RTVM). Since formal testing began 
there have been no formal requirement change requests identified, 
demonstrating that the requirements management process is performing as 
anticipated.

Data conversions represent one of the riskiest areas of an ERP 
implementation. To mitigate this risk, UFMS is utilizing a series of 
four Mock conversions to perform dress rehearsals of the data 
conversion process. The first mock conversion was the initial 
conversion and setup of necessary background data (e.g., vendor 
tables). The second and third mock conversions further validated the 
conversion programs and data cleanup efforts. The data from mock 
conversion 3 was made available for system testing in August. 
Following mock conversion 3, final adjustments are made to the 
conversion programs and additional data cleanup may occur. A final 
test of the conversion programs (Mock conversion 4) is performed in 
the final month prior to go live and is used as the final data 
validation and reconciliation prior to User Acceptance Testing.
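
A data conversion dress rehearsal of this kind is typically validated 
by reconciling record counts and control totals between the legacy 
extract and the converted target data. The following is a minimal 
sketch of such a reconciliation check in Python; the field names and 
sample rows are hypothetical, not drawn from the actual UFMS 
conversion programs:

# Minimal mock-conversion reconciliation: compare record counts and a
# control total (here, a sum of amounts) between the legacy extract
# and the converted load. Field names are hypothetical.
from decimal import Decimal

def reconcile(legacy_rows, converted_rows, amount_field="amount"):
    legacy_total = sum(Decimal(r[amount_field]) for r in legacy_rows)
    converted_total = sum(Decimal(r[amount_field]) for r in converted_rows)
    ok = (len(legacy_rows) == len(converted_rows)
          and legacy_total == converted_total)
    return ok, {"counts": (len(legacy_rows), len(converted_rows)),
                "control_totals": (str(legacy_total), str(converted_total))}

# Example: one record failed to convert, so the check flags a mismatch.
legacy = [{"amount": "100.00"}, {"amount": "250.50"}]
converted = [{"amount": "100.00"}]
ok, detail = reconcile(legacy, converted)
print(ok, detail)   # False -> investigate before the next mock run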

As GAO discovered, the UFMS implementation schedule for the CDC 
deployment is extremely aggressive with significant risk. This led HHS 
to tailor its testing plans such that testing phases that normally 
occur sequentially have been allowed to overlap, but steps have never 
been skipped or eliminated. As testing has unfolded, HHS has taken the 
recommendations of the IV&V contractor and PMO and is analyzing system 
integration test results prior to deploying the first release of the 
system at the CDC and FDA (refer to Section 4.2 - Assessment of 
October Release Strategy).

3.2. Requirements Management:

In July 2002 HHS developed a target business model, which has been a 
guiding document since its creation. This document is the equivalent 
of the "Concept of Operations," which the GAO review notes is lacking.

The Core Financial Target Business Model is a description of business 
operations and design of how the operations will be performed at HHS 
across multiple, coordinated entities. For HHS, the target business 
model for financial management describes how financial management will 
be performed once the current five financial management systems are 
combined into one system with two components: one for Centers for 
Medicare and Medicaid Services (CMS) HIGLAS and one for the rest of the 
department. The target business model presents the target environment 
for each major JFMIP core financial functional area and its associated 
major business processes. It also defines the interaction between OS 
at the Department-
level and the component agencies (e.g., defining accounting policy), as 
well as the interaction between Program Support Center (PSC) and the 
PSC-serviced agencies (e.g., external reports submitted to the serviced 
agencies for review and approval).

Detailed diagrams depicting the target business model for each 
component agency are included in the document. These diagrams present 
the major business functions by JFMIP functional area, as well as the 
associated inputs and outputs (i.e., interfacing systems and external 
entities). The document also provides a matrix that compares the 
business functions across the component agencies, and a referenced 
system list 
that provides a brief description of the systems depicted in the 
detailed diagrams.

HHS' method of requirements management is carefully designed to follow 
industry best practices, including those of Oracle itself.

The UFMS requirements management process is a systematic approach to 
identify, document, organize, communicate, and manage changes in the 
requirements applicable to the UFMS Program. UFMS has established a 
central information repository, which includes requirements, their 
attributes, [NOTE 3] status and other management information pertinent 
to the UFMS environment in a COTS product designed for this purpose: 
RequisitePro (ReqPro). Requirements and their associated attributes 
have been developed, adapted, and reused, which results in an 
efficiency that lowers the effort and cost of development at each site, 
as well as subsequent iterations and related projects. The UFMS 
Baseline Requirements Specification is a primary output of this 
process, which fully defines and documents the behavior of the UFMS.
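
To illustrate the kind of record such a repository holds, the sketch 
below models a single requirement with the attributes described in 
note 3 (type, origin, and status); the field values are hypothetical, 
and the actual RequisitePro schema is richer:

# Illustrative requirement record carrying the attributes described in
# note 3: requirement type, origin, and status. Values are hypothetical.
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str      # tracking number, e.g. "UFMS-0001"
    text: str        # the requirement statement itself
    req_type: str    # "standard federal (JFMIP)" or "agency-specific"
    origin: str      # source document or stakeholder
    status: str      # e.g. "baselined", "implemented", "verified"

example = Requirement(
    req_id="UFMS-0001",
    text="Post transactions to the general ledger using the U.S. "
         "Standard General Ledger chart of accounts.",
    req_type="standard federal (JFMIP)",
    origin="JFMIP Core Financial System Requirements",
    status="baselined",
)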

The initial set of requirements gathered by HHS fell into two broad 
categories: 1) JFMIP* requirements and 2) Agency-specific 
requirements. The HHS requirements (2,130 total) break down as either 
"standard federal" requirements (including JFMIP) or requirements that 
require a business specific configuration of a COTS financial 
application. After analyzing the requirements and vendors, HHS 
selected a base application of Oracle U.S. Federal Financials, with a 
JFMIP certified Federal extension containing embedded industry best 
practices. To attain the highest efficiencies in cost and time for 
design and build, HHS made the conscious decision to select a 
financial application that had submitted itself to the JFMIP's 
disciplined certification process. This is consistent with the 
successful precedent at the Department of Education, Secret Service, 
and other federal agencies, as well as the implementation 
methodologies of major ERP integrators, including BearingPoint's R^2i 
(refer to Figure 2 below) and Oracle's integrator group's Application 
Implementation Methodology (AIM).

Figure 2 - Traditional vs. R^2i Implementation Methodology:

[See PDF for image]

[End of figure]

* JFMIP:

The JFMIP is a joint and cooperative undertaking of the U.S. Department 
of the Treasury, the General Accounting Office, the Office of 
Management and Budget, and the Office of Personnel Management working 
in cooperation with each other and other agencies to improve financial 
management practices in government. The Office revises the Federal 
government's requirements definitions, testing, and acquisition 
processes, and the first target of opportunity is core financial 
systems. The objectives of the Office are to develop systems 
requirements, communicate and explain Federal and agency needs, provide 
agencies and vendors information to improve financial systems, ensure 
that products meet relevant system requirements, and simplify the 
procurement process.

The "standard federal" requirements (including JFMIP):

JFMIP requirements are, by definition, global across the government, 
being the aspects of finance that all federal agencies have in common. 
The JFMIP certification signifies a software application has passed 
rigorous federal scrutiny, and allows Oracle to claim a "federalized" 
title when marketing its product. Configuration of a JFMIP certified 
application allows the federal agency and integrator to have a high 
degree of confidence that JFMIP requirements are met off-the-shelf, 
which means that for HHS, 80% of the requirements have been met by 
out-of-the-box functionality. JFMIP certification effectively allows an 
agency and its integrator to focus most of their time and energy on 
careful management and configuration of the product to meet the 
remaining 20% of requirements. Even with this certification, HHS tracks 
all requirements in a traceability document and has embedded them in 
testing scenarios to verify the product performs as designed throughout 
multiple conference room pilots conducted with the Finance, Business, 
and Program stakeholders. These JFMIP certified requirements are 
admittedly, and by design, tested with less rigor than the Agency-
specific "unique" requirements.

Agency-specific "unique" requirements:

The Agency-specific requirements are the true "design" requirements of 
the UFMS project. For any requirements that required custom code or 
configurations, HHS has followed all IEEE standards - creating fully 
traceable requirements, sourced, referenced and thoroughly tested by 
the implementation team (developers, business analysts, system 
architects, and testers) and the Finance, Business, and Program leaders 
who accept them. The major focus is on UFMS requirements management and 
testing, with extensive testing based on the HHS specific business 
processes. Multiple conference room pilots are held to gain acceptance 
from the Finance, Business, and Program stakeholders that the 
requirements have been met and satisfy the business need, and that the 
Go/No-Go gates have been passed.

Figure 3 - COTS Design Process:

[See PDF for image]

[End of figure]

Requirements Traceability Verification Matrix (RTVM):

UFMS has built a comprehensive RTVM in which the requirements are 
mapped to Business Processes to Test Scripts, resulting in a full trace 
of requirements to the appropriate testable area of Oracle, and the 
method used to verify that each requirement has been satisfied. The 
RTVM is maintained in an industry standard COTS testing tool - 
Mercury's TestDirector.

The purpose of the RTVM is to ensure that all requirements are met by 
the system deliverable and to demonstrate to HHS and outside parties 
that we have satisfied the system requirements. Through the RTVM, 
requirements management and testing are inseparably linked.

In addition (an illustrative traceability sketch follows this list):

* The RTVM is used to track all UFMS requirements and design 
constraints and ensure they are all tested during System Test.

* The UFMS Final Baseline Requirements have been mapped to integrated 
business processes at the script level since the GAO review.

* Test Director is populated with all testable requirements from the 
UFMS Final Baseline Requirements and subsequent approved changes.

* As the functional/technical specs were completed, the design 
constraints were added to ReqPro and then to TestDirector to ensure 
they are included in System Test.

* The requirements module in TestDirector maintains the list of 
testable requirements, organized by module, in order to map 
requirements to Test Scripts.
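
To make the requirement-to-test linkage concrete, here is a minimal 
sketch of an RTVM-style coverage check; the mapping structure and 
identifiers are hypothetical stand-ins for what TestDirector actually 
maintains:

# RTVM-style trace: requirement -> business process -> test scripts.
# A requirement counts as covered when at least one script verifies it.
rtvm = {
    "UFMS-0001": {"process": "GL posting",      "scripts": ["TS-101", "TS-102"]},
    "UFMS-0002": {"process": "Vendor payment",  "scripts": ["TS-210"]},
    "UFMS-0003": {"process": "Budget checking", "scripts": []},  # a gap
}

uncovered = [req for req, row in rtvm.items() if not row["scripts"]]
coverage = 1 - len(uncovered) / len(rtvm)
print(f"coverage: {coverage:.0%}; untested requirements: {uncovered}")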

3.3. Program Management Oversight:

HHS uses Primavera, named a project portfolio management 'Leader' by 
Gartner for five consecutive years, as the project-planning tool for 
UFMS. The baseline project schedule is maintained and tracked using 
Primavera, including tracking actual vs. planned hours by resource 
against each activity, and the automatic calculation of the critical 
path and earned value (EV).

Critical Path:

An August 2003 IV&V review of the HHS Critical Path methodology 
identified that an effective critical path analysis had not been 
developed. HHS immediately undertook steps to implement the 
recommendations provided. Since October 10, 2003, HHS has used 
TeamPlay to automatically generate the critical path report for the 
Global Pilot/CDC release and has reviewed the report on a weekly 
basis. The critical path report is calculated using activity status 
that is updated in TeamPlay weekly. HHS agrees with the IV&V that in 
isolation the critical path report does not provide a complete picture 
of program health. However, when the critical path report is viewed in 
conjunction with activity reports provided in TeamPlay, HHS is able to 
monitor the health of the UFMS schedule.

HHS continues to work on improving the information provided in the 
critical path reports and is executing a plan to implement the 
remainder of the IV&V suggestions. Additionally, HHS began reporting 
on the critical paths of the FDA and PSC releases on August 26, 2004.
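
For readers unfamiliar with the mechanics, the critical path is the 
longest-duration path through the activity dependency network. The 
sketch below shows that calculation on a toy network; the activities 
and durations are hypothetical and are not drawn from the TeamPlay 
schedule:

# Minimal critical path calculation: a forward pass over a topological
# order of the activity network, then a walk back along the longest
# path. Activities and durations are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+

activities = {  # name: (duration in days, predecessors)
    "design":  (10, []),
    "build":   (20, ["design"]),
    "convert": (15, ["design"]),
    "test":    (12, ["build", "convert"]),
    "deploy":  (3,  ["test"]),
}

order = TopologicalSorter(
    {name: set(preds) for name, (_, preds) in activities.items()})
earliest_finish, critical_pred = {}, {}
for name in order.static_order():
    duration, preds = activities[name]
    start = max((earliest_finish[p] for p in preds), default=0)
    earliest_finish[name] = start + duration
    critical_pred[name] = max(preds, key=earliest_finish.get) if preds else None

node = max(earliest_finish, key=earliest_finish.get)
path = []
while node is not None:
    path.append(node)
    node = critical_pred[node]
print(" -> ".join(reversed(path)),
      "| total:", max(earliest_finish.values()), "days")
# design -> build -> test -> deploy | total: 45 days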

EVM:

The HHS use of Primavera meets 94% of ANSI standards as supported by 
the report of our IV&V. One issue the IV&V noted is that the Federal 
hours are not included in the TeamPlay EV calculations. After analysis, 
HHS determined it was not worth the additional investment to make its 
Primavera EVM fully ANSI compliant, and OMB provided a functional 
workaround for the calculation of the measure. The two criteria not met 
are:

* Management Reserve: TeamPlay project plans do not show a management 
reserve. The Department (HHS) maintains a management reserve that it 
can apply where needed. TeamPlay is used only to track BearingPoint's 
progress; it does not incorporate the overall UFMS budget.

* Cost at Completion Data: The current Earned Value Report does not 
show Cost at Completion data. Cost at Completion data is readily 
available in TeamPlay and can be added to the current report as 
necessary.

Figure 4 presents the Cost Performance Index (CPI) and Schedule 
Performance Index (SPI) on a monthly basis for UFMS since January 2003.
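
For reference, CPI and SPI are simple ratios: CPI divides earned value 
by actual cost, and SPI divides earned value by planned value; values 
above 1.0 indicate better-than-planned performance. The sketch below 
shows the standard calculation with hypothetical monthly figures, not 
the actual UFMS data behind Figure 4:

# Standard earned value indices: CPI = EV / AC, SPI = EV / PV.
# Values above 1.0 mean under budget / ahead of schedule.
def evm_indices(ev, ac, pv):
    return {"CPI": ev / ac, "SPI": ev / pv}

# Hypothetical month: $4.5M earned against $4.2M spent, $4.8M planned.
print(evm_indices(ev=4.5, ac=4.2, pv=4.8))
# {'CPI': 1.07..., 'SPI': 0.9375} -> under budget, slightly behind schedule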

Quantitative Outcome Measures to Assess Program Status:

Since the inception of the project, HHS has focused on adequate 
performance measures for UFMS. Our focus has been on measuring three 
key program control facets instead of instituting outcome measures all 
along the implementation pathway. These areas are quality, cost, and 
schedule.

Until HHS reached the testing phases of the UFMS implementation, most 
of the focus on quality dealt with UFMS documents and artifacts. We 
are now conducting a very thorough and rigorous process for 
quantifying the results of test defect tracking and resolution. Given 
the importance of this process and its impact on our assessment of 
system stability and readiness, we have chosen to collect, analyze, 
and discuss these quantitative measures on a daily basis. Included in 
this process are the following quality indicators (an illustrative 
calculation follows the list):

* Percent of release requirements tested;

* Number of requirement change requests;

* Percent of Integrated Process test scripts completed;

* Percent of test scenarios that passed testing;

* Number of defects detected;

* Number of defects closed.
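
A minimal sketch of how such indicators could be computed from simple 
test and defect records follows; the record layout and counts are 
hypothetical and far simpler than what TestDirector tracks:

# Daily quality indicators computed from simple test/defect records.
# All counts are hypothetical.
requirements = {"tested": 412, "total": 450}
scripts = {"completed": 180, "total": 200}
defects = [{"id": 1, "status": "closed"},
           {"id": 2, "status": "open"},
           {"id": 3, "status": "closed"}]

indicators = {
    "pct_requirements_tested": requirements["tested"] / requirements["total"],
    "pct_scripts_completed": scripts["completed"] / scripts["total"],
    "defects_detected": len(defects),
    "defects_closed": sum(d["status"] == "closed" for d in defects),
}
for name, value in indicators.items():
    print(f"{name}: {value:.0%}" if name.startswith("pct") else
          f"{name}: {value}")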

For cost and schedule progress, we have instituted the earned value 
management and critical path measures mentioned above. For two years 
now HHS has collected and assessed monthly Cost Performance Index 
(CPI) and Schedule Performance Index (SPI) data to determine the 
degree to which the program is efficiently using budget and schedule. 
Furthermore, critical path schedule analysis is also used as a 
predictive schedule performance gauge to help our managers determine 
if schedule slippage is occurring. Despite some needed improvements in 
the use of these measures, they have been effective in helping 
managers drive the achievements mentioned earlier in this document.

HHS asserts that it has struck an adequate balance between the number 
of measures used to assess UFMS progress and the effort and costs 
required to maintain them. HHS agrees with GAO that these measures are 
crucial to UFMS success and we will therefore continue to assess and 
improve our use of them based on our past lessons and future needs.

Figure 4 - UFMS CPI and SPI Performance:

[See PDF for image]

[End of figure]

Human Capital: The issue of human capital is one that HHS has managed 
carefully. UFMS faces its share of challenges in obtaining full-time 
federal staff due to the temporary nature of an implementation project. 
Our objective remains to staff a highly competent program team and not 
a permanent federal bureaucracy. However, at the CDC level, where the 
current phase of deployment is taking place, the project is adequately 
staffed. CDC has the vast majority of its required positions filled, 
and has developed creative arrangements with contractors and 
rearranged Global responsibilities to enable UFMS to be delivered 
successfully. To date, there has been minimal impact on the 
project due to human capital issues, and plans to acquire necessary 
resources for upcoming aspects of the UFMS project are in place.

3.4. Risk Management:

The UFMS project relies on a well-implemented risk management process 
that is based on business best practices developed by leading providers 
across market segments. The UFMS risk management process is the result 
of a Cooperative Research and Development Agreement (CRADA) between 
BearingPoint and the Software Engineering Institute (SEI) to co-develop 
a best practice based risk management program. The techniques were 
included in a UFMS risk management methodology that has been 
successfully applied to a wide variety of commercial and federal 
clients, from large multi-billion dollar, multi-year programs to 
smaller projects lasting less than a year.

There is a certain amount of risk in the overall HHS program, which 
is controlled by a sound risk management process put in place early on 
by HHS' CMM Level 3-certified systems integrator. The risk 
management approach entails two major processes - risk assessment and 
risk mitigation. Risk assessment includes activities to identify risks, 
and analyze and prioritize them. Risk mitigation includes developing 
risk mitigation strategies and monitoring the impact of the strategies 
on effectively mitigating the risks. The continuous risk management 
process that is followed by the UFMS program includes weekly meetings 
with HHS Program Management to review current and past risks, update 
and refine mitigation strategies, and assess issues that might become 
risks to the success of UFMS.

GAO determined that although HHS has documented and implemented a risk 
management process for the project, its effectiveness is hampered by 
examples of risks being closed before their solution had been 
completed. HHS agrees with this observation and has since revised the 
risk management processes to keep all risks open until they are either 
realized or an appropriate mitigation has been successful.
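
The revised closure rule can be stated precisely: a risk leaves the 
register only when it has been realized or its mitigation has been 
verified complete. A minimal sketch follows, with hypothetical fields 
and a common likelihood-times-impact prioritization heuristic standing 
in for the actual UFMS methodology:

# Sketch of the revised closure rule: a risk stays open until it is
# either realized or its mitigation is verified successful.
from dataclasses import dataclass

@dataclass
class Risk:
    risk_id: str
    likelihood: int              # 1 (low) .. 5 (high)
    impact: int                  # 1 (low) .. 5 (high)
    realized: bool = False
    mitigation_verified: bool = False

    def exposure(self) -> int:
        # Common prioritization heuristic: likelihood x impact.
        return self.likelihood * self.impact

    def may_close(self) -> bool:
        return self.realized or self.mitigation_verified

register = [Risk("R-17", 4, 5), Risk("R-22", 2, 3, mitigation_verified=True)]
open_risks = sorted((r for r in register if not r.may_close()),
                    key=lambda r: r.exposure(), reverse=True)
print([r.risk_id for r in open_risks])   # ['R-17'] stays open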

3.5. HHS Information Technology Management:

Governance: Given the HHS plans to make UFMS a critical component of 
its enterprise architecture, a separate governance structure was 
established to ensure that key stakeholders throughout HHS come 
together to accomplish the Secretary's vision for unified accounting at 
HHS. To this end, the UFMS governance structure provides a management 
framework that enables strategic direction and leadership to support 
successful implementation of the program.

This governance structure also supports UFMS program objectives and 
creates shared ownership and responsibility for the program.

The UFMS governance structure, presented in Figure 5 below, is 
comprised of several bodies of business experts from the HHS business 
communities (finance, administration, budget, technology, and 
operations). Executive and staff resources from HHS' Office of the 
Secretary (HHS/OS) and component agencies interact to oversee and 
manage the UFMS program.

Governance of the UFMS program is organized into two levels of 
stakeholder leadership: executive leadership and program management.

Executive Leadership: The Assistant Secretary for Budget, Technology 
and Finance/Department CFO is the departmental executive sponsor for 
the UFMS initiative and, along with the UFMS Steering Committee, 
provides overall executive leadership for the program. The ASBTF/CFO 
chairs the Steering Committee, which is comprised of HHS and component 
agencies' executive officials.

The Steering Committee is an advisory board that provides counsel and 
guidance to the ASBTF/CFO and makes decisions regarding Departmental 
policy, strategy, funding decisions and program risks. The Steering 
Committee also makes decisions about UFMS including milestones, 
workforce transitions, budget, and staffing.

The UFMS Planning and Development Committee and the UFMS Program 
Management Office provide overall program management. The UFMS Planning 
and Development Committee is comprised of the HHS component agencies' 
Chief Financial Officers (CFO) and Chief Information Officers (CIO) and 
HHS/OS Deputy Assistant Secretaries who work to set guidelines and 
advise the UFMS PMO on system implementation. The HHS Deputy CFO and 
the Department CIO co-chair the UFMS Planning and Development 
Committee. The UFMS PMO routinely interacts with the UFMS Planning and 
Development Committee and the ASBTF/CFO on UFMS matters.

Figure 5 - UFMS Governance Structure:

[See PDF for image]

[End of figure]

IT Investment Management: HHS has a detailed and mature IT Capital 
Planning and Investment Control (CPIC) process, including an IT 
Investment Review Board (ITIRB) that meets regularly to review and 
prioritize projects, track project progress, and vote on funding. The 
policy document defining CPIC and ITIRB for HHS is publicly available 
at http://www.hhs.gov/read/irmpolicy/0001.html and was available for 
GAO review at that web location. ITIRB meetings have been held since 
that policy was promulgated. These meetings have reviewed project 
proposals, funding requests, and quarterly and annual updates. 
Additionally, the ITIRB establishes key decision points with specific 
criteria that must be met before the program is permitted to proceed. 
Below is a brief history of these ITIRB reviews, which demonstrates 
HHS's ongoing management of the continuous progress UFMS is making 
toward its stated objectives.

Date: January 13, 2002:
Event: UFMS (Global) has decided to integrate the other components 
(HIGLAS and NBRSS) into their annual update presentation.
Action: UFMS has updated their business case. Few, if any, significant 
issues were indicated by the OIRM subject area specialists.
Decision: No decision necessary:

Date: November 5, 2002:
Event: UFMS Decision Point II:
Action: Thomas presented documentation to support a Decision Point II 
briefing: 
Decision: The ITIRB approved the Decision Point II documentation.

Date: April 8, 2003:
Event: UFMS Quarterly Status Report:
Action: Thomas presented the UFMS FY03 April Quarterly Status Report:
Decision: The Board had no objections to the program's Quarterly Status 
Report:

Date: July 22, 2003:
Event: UFMS Quarterly Status Report:
Action: Tom Doherty, UFMS program manager, presented the UFMS FY03 July 
Quarterly Status Report:
Decision: The Board had no objections to the program's Quarterly Status 
Report.

Date: November 12, 2003:
Event: UFMS Quarterly Status Report:
Action: The UFMS FY03 October Quarterly Status Report was posted as a 
virtual ITIRB per the permission of the HHS CIO. All ITIRB-
participating entities were given the opportunity to review and comment 
on the report (http://intranet.hhs.gov/cio/meetings/itagenda.html).
Decision: The Board had no objections to the program's Quarterly 
Status Report.

Date: January 13, 2004:
Event: UFMS Annual Report:
Action: UFMS, HIGLAS, and NBRSS program managers gave a combined report 
to the ITIRB.
Decision: There were no objections to the combined presentation.

Enterprise Architecture: GAO itself noted UFMS was at a higher level of 
Enterprise Architecture attainment than 97% of other agencies, having 
completed all of stage 2 readiness, along with significant components 
of stage 3. UFMS is a critical and defining part of the federal 
government's overall Enterprise Architecture. This Enterprise 
Architecture is inherently consolidating and unifying, integrated at 
touch points with its feeder systems. Even with its advanced state of 
EA readiness at HHS, it would be impractical to fulfill GAO's 
recommendation that a complete Enterprise Architecture be designed 
prior to UFMS implementation, given the schedule delays that would 
impose on the UFMS project. Even in the face of significant forces of 
change, the department has made great advances in creating a 
consolidating and unifying infrastructure for the Federal Government's 
Enterprise Architecture, and UFMS is the first ERP to do so, serving 
as a proof of concept for an architecture spanning different agencies 
and standardized processes.

The foundation of its security management findings in GAO's report is 
the HHS OIG's FISMA report for FY03 that contains information that was 
gathered more than one full year prior to the use of that information 
by GAO. During that one year span of time much progress has been made 
in IT security management within HHS. Examples are:

* Developed and implemented a Department-wide IT security program, 
Secure One HHS, that incorporates Secretary Thompson's One HHS Vision.

* Employed the Project Matrix methodology to identify 30 nationally 
critical functions and services supported primarily by 24 cyber and 
physical assets. HHS is currently performing a Project Matrix Phase II 
(interdependency) analysis on the nationally critical functions, 
services, and assets.

* Developed a cohesive and up-to-date set of HHS IT Security Policies.

* Implemented a Managed Security Service (MSS) using an automated 
intrusion detection tool to monitor, detect, and report local and 
Department-wide system security weaknesses.

* Progressively increased key system security metrics reported in the 
FISMA quarterly report. Key items for the 3rd quarter of 2004 included:

-96% of systems have been assessed for risk.

-95% of systems have security plans.

-93% of systems have been certified and accredited.

* Completed and submitted Privacy Impact Assessments (PIAs) for 246 
systems, and institutionalized the delivery of periodic privacy 
awareness training.

* Developed a standardized and coherent process for maintenance and 
management of the HHS FISMA Plans of Action and Milestones (POA&M).

* Implemented an automated capital-planning tool (Prosight) to manage 
OMB Exhibit 300 and 53 submissions.

* Institutionalized a fabric of improved security awareness and 
communications by establishing a virtual Security help desk (SOS), the 
issuance of weekly and monthly Secure One HHS newsletters, the launch 
of Secure One HHS Online, and the establishment of the Secure One 
Communications Center (SOCC).

* Developed in-depth guides to 13 areas of HHS IT security.

* Implemented an automated, centralized data collection tool, the 
Information Security Data Manager (ISDM), to streamline FISMA, POA&M 
and PIA tracking and reporting.

* Implemented an automated privacy tool (Watchfire) to monitor online 
applications for HHS and to guard against privacy risks.

* Implemented an automated security vulnerability and threat alert 
system (iDefense) to detect and warn of potential cyber threats and 
security issues.

* Currently working to establish an automated centralized self-
assessment process using the Security Self Assessment Tool (SSAT). 
Current participants include: NIH, HRSA, AHRQ, IHS, FDA, and AoA.

Certification and Accreditation (C&A) of major systems is a very high 
priority for the Department, as demonstrated (and reinforced) by the 
focus on C&A milestones in OMB's E-Government scoring on HHS's PMA 
scorecard. HHS has received C&A confirmation for 95% (18 of 19 systems) 
of the systems associated with the implementation of UFMS at the CDC. 
The outstanding system is the HHSnet and enterprise wide network 
upgrade scheduled for implementation in late October 2004. Clearly HHS 
has made significant progress in the C&A arena.

[End of section]

4.0: RESPONSE TO RECOMMENDATIONS:

The GAO report identified specific recommendations that the HHS 
response addresses. This crosswalk identifies each of the 
recommendations and the specific section within this response where the 
recommendation is discussed. Additionally, based on our own assessment 
of UFMS, HHS is taking very specific actions to further mitigate the 
acknowledged schedule risk.

4.1. Crosswalk - GAO Recommendations to Response Sections:

1. Determine the system capabilities that are necessary for the CDC 
deployment.
Section 3.2 - Requirements Management:

2. Identify the relevant requirements related to the desired system 
capabilities for the CDC deployment.
Section 3.2 - Requirements Management:

3. Clarify, where necessary, any requirements to verify they 1) fully 
describe the capability to be delivered, 2) include the source of the 
requirement and 3) are unambiguously stated to allow for quantitative 
evaluation.
Section 3.2 - Requirements Management:

4. Maintain the traceability of the CDC-related requirements from their 
origin through implementation.
Section 3.2 - Requirements Management:

5. Use a testing process that employs effective requirements to obtain 
the quantitative measures necessary to understand the assumed risks.
Section 3.1 - Impacts of an Aggressive Project Plan:

6. Validate that data conversion efforts produce reliable data for use 
in UFMS.
Section 3.1 - Impacts of an Aggressive Project Plan:

7. Verify systems interfaces function properly so that data exchanges 
between systems are adequate to satisfy system needs.
Section 3.1 - Impacts of an Aggressive Project Plan:

8. Measure progress based on quantitative data rather than the 
occurrence of events.
Section 3.1 - Impacts of an Aggressive Project Plan:
Section 3.2 - Requirements Management:
Section 3.3 - Program Management Oversight:

4.2. Assessment of October Release Strategy:

HHS acknowledges GAO's comment that testing of this system is 
occurring relatively late in relation to the October objective for 
deployment of the Global Pilot. At the time of the 
writing of this response HHS is analyzing system integration test 
results prior to deploying the first release of the system at the CDC 
and FDA. This assessment may result in a recommendation to the UFMS 
Steering Committee to revise the current software release strategy.

[End of section]

5.0: RECOMMENDED CORRECTIONS:

Needed corrections to the report:

* Page 2: TWO SYSTEMS:

* The report does not note that HHS envisions the eventual UFMS as a 
departmental system that will include the core system currently under 
development at CMS, and will integrate with the NBS and others.

* The report does not reflect that the General Ledger (GL) component of 
the NIH NBRSS, implemented in October 2003, was used as proof of 
concept for UFMS and will be merged with UFMS at a future point to be 
determined by the PMO.

[End of section]

Appendix A: Acronyms:

ASBTF: Assistant Secretary for Budget, Technology, and Finance:

BAT: Business Analysis Team:

BTT: Business Transformation Team:

CCB: Change Control Board:

CCMP: Change Control Management Plan:

CDC: Centers for Disease Control and Prevention:

CMM: Capability Maturity Model:

CRP: Conference Room Pilot:

FDA: Food and Drug Administration:

FFMIA: Federal Financial Management Improvement Act:

HHS: Health and Human Services, Department of:

IT: Information Technology:

ITIRB: Information Technology Investment Review Board:

IV&V: Independent Verification and Validation:

JFMIP: Joint Financial Management Improvement Program:

OMB: Office of Management and Budget:

PDC: Planning and Development Committee:

PMO: Program Management Office:

PSC: Program Support Center:

QA: Quality Assurance:

QAP: Quality Assurance Plan:

SC: Steering Committee:

SCM: Software Configuration Management:

SI: Systems Integrator:

TAT: Technology Analysis Team:

UCCB: UFMS Change Control Board:

UFMS: Unified Financial Management System:

[End of section]

Appendix B: Glossary:

Artifacts: Policy documents, procedures, deliverables, or other 
documented work products associated with UFMS implementation.

Change Control Board: Governing body established for project change 
control procedures to manage project scope.

Conference Room Pilot: CRPs are held to verify updated configuration 
and business processes. CRP I tests the configuration of a single 
track/single module and is repeated for each track/module being 
implemented. CRP II is system integration testing of all functional 
tracks/modules. The end result of the CRP is a fully operational system 
that is more than 90 percent complete, as well as the initial knowledge 
transfer to the HHS's staff on the use of the system. Projects 
involving multiple HHS sites may involve a CRP I/II at each site.

Decision Document: Policy or other document developed by a UFMS team 
member to provide guidance or instruction.

Department-Level: Term used to describe implementation tasks focused 
upon analyzing and implementing business processes unique to the 
Department as a parent organization (e.g., monitoring use of budget 
authority across the Department).

Deliverable: Plan or other document contractually required to be 
created by BearingPoint and delivered to HHS for review and approval.

Financial Management System: A financial management system is comprised 
of the core financial system (funds management, general ledger, 
accounts payable, accounts receivable, and cost accounting) and the 
financial portions of mixed financial systems such as travel and 
acquisitions (reference: OMB Circular A-127).

Independent Verification and Validation: Contracted organization 
engaged to provide independent assessment of project activities, 
deliverables, and work products.

Joint Financial Management Improvement Program: A cooperative effort 
among major agencies of the U.S. Federal Government to arrive at a 
common set of financial management standards as mandated by the 
President of the United States. Representatives from major agencies 
serve on a committee charged with formulating these standards.

Oracle Functional Modules: Individual components of Oracle Federal 
Financials, such as General Ledger, Budget Execution, Receivables, 
Payables, and Purchasing.

Phase: Represents a major stage of the implementation life cycle. 
Phases are typically several months in duration and consist of lower 
level activities and milestones. The phases of implementation are: 
prepare, validate, simulate, test, and deploy.

Planning and Development Committee: The UFMS PDC is composed of 
executive-level officials from the HHS Office of the Secretary and 
component agencies: the HHS Deputy CFO and the HHS CIO, who serve as 
co-chairs; the CFOs and CIOs from each of the 12 component agencies; 
and the UFMS Program Director.

Program Management Office: Governing body formed to oversee and 
manage the day-to-day activities of the overall UFMS initiative and 
to coordinate with other HHS-wide efforts, such as the Enterprise 
Human Resource Planning (EHRP) initiative.

Quality: Property that distinguishes the state of being of a work 
product, process, or project.

Steering Committee: The Steering Committee is composed of executive-
level officials from the HHS Office of the Secretary and component 
agencies: the ASBTF/CFO, who serves as chairperson; the Assistant 
Secretary for Administration and Management; and the Directors or 
Administrators of Management from each of the 12 component agencies.

Systems Integrator: Contracted organization engaged to provide system 
integration services and personnel in support of a system 
implementation.

Work Product: Document or other product created by a UFMS team member 
that is reviewed through the QA process.

[End of section]

NOTES:

[1] Carney, D., COTS Evaluation in the Real World, SEI Interactive, 
Carnegie Mellon University, December 1998.

[2] Alves, C., and Finkelstein, A., Negotiating Requirements for COTS-
Based Systems, position paper presented at the Eighth International 
Workshop on Requirements Engineering: Foundation for Software Quality, 
Essen, Germany, 2002.

[3] In terms of this document, "attributes" refer to descriptive 
features related to the requirement, such as requirement type, origin, 
and status. 

[End of section]

Appendix V: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Kay Daly, (202) 512-9312; Chris Martin, (202) 512-9481:

Acknowledgments:

Staff members who made key contributions to this report were Linda 
Elmore, Amanda Gill, Rosa Harris, Maxine Hattery, Lisa Knight, Michael 
LaForge, W. Stephen Lowrey, Meg Mills, David Powner, Gina Ross, Norma 
Samuel, Yvonne Sanchez, Sandra Silzer, and William Thompson.

(193053):

FOOTNOTES

[1] There were initially 24 CFO Act agencies. The Federal Emergency 
Management Agency (FEMA), one of the 24 CFO Act agencies, was 
subsequently transferred to the new Department of Homeland Security 
(DHS) effective March 1, 2003. However, DHS was not established as a 
CFO Act agency. Each house of Congress is now considering adding DHS 
to the list of CFO Act agencies through the Department of Homeland 
Security Financial Accountability Act, H.R. 4259 and S. 1567, 108th 
Congress.

[2] GAO, Executive Guide: Creating Value Through World-class Financial 
Management, GAO/AIMD-00-134 (Washington, D.C.: April 2000).

[3] Data conversion is defined as the modification of existing data to 
enable it to operate with similar functional capability in a different 
environment.

[4] These agencies are the Administration for Children and Families 
(ACF), Administration on Aging (AoA), Centers for Medicare and Medicaid 
Services (CMS), Agency for Healthcare Research and Quality (AHRQ), 
Centers for Disease Control and Prevention (CDC), Agency for Toxic 
Substances and Disease Registry (ATSDR), Food and Drug Administration 
(FDA), Health Resources and Services Administration (HRSA), Indian 
Health Service (IHS), National Institutes of Health (NIH), and 
Substance Abuse and Mental Health Services Administration (SAMHSA).

[5] The Chief Financial Officers (CFO) Act of 1990 calls for 
modernization of financial management systems in the departments and 
major agencies in the federal government, so that the systematic 
measurement of performance, the development of cost information, and 
the integration of program, budget, and financial information for 
management reporting can be achieved. Pub. L. No. 101-576, 104 Stat. 
2838 (Nov. 15, 1990).

[6] Pub. L. No. 104-208, div. A., sec. 101(f), title VIII, 110 Stat. 
3009, 3009-389 (Sept. 30, 1996).

[7] Policies and standards prescribed for executive agencies in 
developing, operating, evaluating, and reporting on financial 
management systems are defined in the Office of Management and Budget 
(OMB) Circular No. A-127, Financial Management Systems. Circular A-127 
references the series of publications, entitled Federal Financial 
Management Systems Requirements, issued by the Joint Financial 
Management Improvement Program (JFMIP), as the primary source of 
governmentwide requirements for financial management systems. The OMB 
system requirements provide the framework for establishing integrated 
financial management systems to support the partnership between program 
and financial managers and ensure the integrity of information for 
decision making and measuring performance.

[8] The SGL provides a standard chart of accounts and standardized 
transactions that agencies are to use in all their financial systems.

[9] HHS has other projects under way to improve other financial 
management areas such as grant accounting and travel.

[10] About $12.2 million of the $110 million is to integrate NBRSS into 
UFMS. The general ledger component of the NIH NBRSS, implemented in 
October 2003, was used as a proof of concept for UFMS and will be 
merged with UFMS in the future.

[11] The Program Management Office, managed by the Executive Director 
of JFMIP, with funds provided by the CFO Council agencies, tests COTS 
software packages and certifies that they meet certain federal 
financial management system requirements for core financial systems.

[12] ERP is a business management system that integrates business 
processes such as planning, inventory control, order tracking, customer 
service, finance, and human resources.

[13] Acceptable levels refers to the fact that any systems 
acquisition effort will have risks and will suffer some adverse 
consequences from defects in its processes. However, effective 
implementation of the disciplined processes reduces the likelihood 
that potential risks will actually occur and prevents significant 
defects from materially affecting the cost, timeliness, and 
performance of the project.

[14] SEI is a federally funded research and development center operated 
by Carnegie Mellon University and sponsored by the U.S. Department of 
Defense. The SEI objective is to provide leadership in software 
engineering and in the transition of new software engineering 
technologies into practice.

[15] IEEE develops standards for a broad range of global industries 
including the information technology and information assurance 
industries.

[16] Steve McConnell, Rapid Development: Taming Wild Software 
Schedules (Redmond, Wash.: Microsoft Press, 1996).

[17] Steve McConnell, Rapid Development: Taming Wild Software 
Schedules.

[18] According to IEEE Standard 1362-1998, a concept of operations 
document is normally one of the first documents produced during a 
disciplined development effort since it describes system 
characteristics for a proposed system from the user's viewpoint. This 
is important since a good concept of operations document can be used to 
communicate overall quantitative and qualitative system 
characteristics to the user, developer, and other organizational 
elements. This allows the reader to understand the user organizations, 
missions, and organizational objectives from an integrated systems 
point of view.

[19] Department of Health and Human Services, Financial Shared Services 
Study Concept of Operation, Version 1.0, (Apr. 30, 2004).

[20] Shared service centers provide common services such as finance, 
human resources, procurement, and logistics.

[21] The process design workshops were held at the global level. The 
global-level process designs were then reviewed at the site-level to 
develop site-unique processes as necessary.

[22] Test plans typically contain a general description of what testing 
will involve, including tolerable limits.

[23] A requirements traceability matrix is used to verify that each 
requirement is mapped to one or more business processes and test cases.
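
For illustration only, the following minimal sketch (in Python, using 
hypothetical requirement and test identifiers not drawn from UFMS) 
shows the idea: each requirement is mapped to the business processes 
and test cases that cover it, and any requirement lacking either 
mapping is flagged as untraced.

# Illustrative requirements traceability matrix. All identifiers and
# process names are hypothetical.
matrix = {
    "REQ-001": {"processes": ["invoice entry"],
                "tests": ["TC-101", "TC-102"]},
    "REQ-002": {"processes": ["funds control"], "tests": ["TC-201"]},
    "REQ-003": {"processes": [], "tests": []},  # not yet traced
}

# Verification step: flag any requirement that is not mapped to at
# least one business process and one test case.
for req_id, links in matrix.items():
    if not links["processes"] or not links["tests"]:
        print(req_id, "is missing coverage")  # prints: REQ-003 ...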

[24] Steve McConnell, Rapid Development: Taming Wild Software 
Schedules.

[25] Joint Financial Management Improvement Program, White Paper: 
Financial Systems Data Conversion-Considerations, (Washington, D.C.: 
Dec. 20, 2002).

[26] Data conversion is defined as the modification of existing data to 
enable it to operate with similar functional capability in a different 
environment.

[27] Examples of business activities include reimbursable projects, 
grant obligations, and supplier information.

[28] Risk management recognizes that risk cannot be eliminated from a 
project but can be kept at acceptable levels through a set of 
continuous activities for identifying, analyzing, planning, tracking, 
and controlling risks. If a project does not effectively manage its 
risks, then the risks will manage the project. For example, if a 
project does not properly manage the risks associated with inadequate 
requirements, then the undesirable consequences associated with 
requirement defects, such as increased rework and schedule delays, will 
start consuming more and more project resources. Risk management starts 
with identifying the risks before they can become problems. Once risks 
are identified, they need to be understood. A risk management plan is 
then developed that outlines the information known about the risks and 
the actions, if any, that will be taken to mitigate those risks.
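
As a minimal illustration of the identify-analyze-plan-track cycle 
described above (the risks, probabilities, impact scores, and 
mitigation actions below are hypothetical, not taken from the UFMS 
risk management plan), a simple risk register might look like this in 
Python:

# Illustrative risk register: identify each risk, analyze it as
# probability times impact, and record the planned mitigation.
risks = [
    {"risk": "incomplete requirements", "probability": 0.6,
     "impact": 9,
     "mitigation": "trace all requirements before system testing"},
    {"risk": "data conversion errors", "probability": 0.4,
     "impact": 7,
     "mitigation": "reconcile converted balances to legacy totals"},
]

# Track risks in order of exposure so the highest-exposure items
# receive attention first.
for r in sorted(risks, key=lambda x: x["probability"] * x["impact"],
                reverse=True):
    exposure = r["probability"] * r["impact"]
    print(f"{r['risk']}: exposure {exposure:.1f}; plan: "
          f"{r['mitigation']}")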

[29] According to IEEE, verification and validation processes for 
projects such as UFMS can be used to determine whether (1) the products 
of a given activity conform to the requirements of that activity and 
(2) the software satisfies its intended use and user needs. This 
determination may include analyzing, evaluating, reviewing, 
inspecting, assessing, and testing software products and processes. The 
IV&V processes should assess the software in the context of the system, 
including the operational environment, hardware, interfacing software, 
operators, and users.

[30] Originally this contractor was a subcontractor. In September 2003, 
the company became the project's prime IV&V contractor, staffing the 
effort with the equivalent of five to six individuals. 

[31] On July 15, 2004, HHS officials stated that the IV&V contractor 
was satisfied with the earned value management system being used for 
the project. However, they were unable to provide any documentation to 
support this position.

[32] For example, a data conversion task may have several activities 
such as (1) determining the data that are needed from a given system, 
(2) ensuring that the data are acceptable to the other system, (3) 
determining the format of the data that will be used in the conversion, 
(4) performing the actual conversion, and (5) resolving any errors that 
resulted from the conversion process. Each of these activities may have 
a given percentage of completion status. For example, once a final 
determination of the data needed from a given system is completed, 10 
percent of the task would be considered completed. An earned value 
management system would take the completed activities, determine the 
completion status, and then compare that to the expected effort, such 
as costs incurred and staff hours expended, to determine whether they 
are consistent. Using the example above, if the determination of data 
needed consumed 15 percent of the dollars expected for that data 
conversion task, then the earned value management system would show 
that 10 percent of the work had consumed 15 percent of the task's 
resources.
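
The arithmetic in this example can be restated in a short sketch 
(Python; the task budget and activity weights below are hypothetical 
and chosen only to reproduce the 10 percent/15 percent comparison 
described above):

# Planned (budgeted) cost of the entire data conversion task; the
# figure is illustrative.
budget_at_completion = 100_000

# Each activity's planned share of the task and whether it is done.
activities = [
    ("determine data needed",       0.10, True),   # completed
    ("validate data acceptability", 0.20, False),
    ("determine conversion format", 0.20, False),
    ("perform conversion",          0.35, False),
    ("resolve conversion errors",   0.15, False),
]

# Earned value: the budgeted cost of the work actually performed.
earned_value = budget_at_completion * sum(
    weight for _, weight, done in activities if done)

# Actual cost to date: 15 percent of the budget, per the example.
actual_cost = 0.15 * budget_at_completion

# A cost performance index below 1.0 signals that completed work is
# consuming more resources than planned.
cpi = earned_value / actual_cost
print(f"EV=${earned_value:,.0f} AC=${actual_cost:,.0f} CPI={cpi:.2f}")
# prints: EV=$10,000 AC=$15,000 CPI=0.67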

[33] William A. Florac, Robert E. Park, and Anita D. Carleton, 
Practical Software Measurement: Measuring for Process Management and 
Improvement (Pittsburgh, Pa.: Software Engineering Institute, Carnegie 
Mellon University, 1997).

[34] Showstoppers were described as risks that would affect the forward 
movement of UFMS implementation if they were not resolved quickly. 

[35] The Project Management Institute has defined milestone as a 
"significant event in the project, usually completion of a major 
deliverable."

[36] GAO, Indian Trust Funds: Interior Lacks Assurance That Trust 
Improvement Plan Will Be Effective, GAO/AIMD-99-53 (Washington, D.C.: 
Apr. 28, 1999).

[37] GAO, Indian Trust Funds: Improvements Made in Acquisition of New 
Asset and Accounting System But Significant Risks Remain, GAO/AIMD-00-
259 (Washington, D.C.: Sept. 15, 2000).

[38] U.S. Department of the Interior, Status Report to the Court Number 
Seventeen (For the Period January 1, 2004 through March 31, 2004) 
(Washington, D.C.: May 3, 2004).

[39] GAO, Business Modernization: Improvements Needed in Management of 
NASA's Integrated Financial Management Program, GAO-03-507 
(Washington, D.C.: Apr. 30, 2003).

[40] GAO, National Aeronautics and Space Administration: Significant 
Actions Needed to Address Long-standing Financial Management Problems, 
GAO-04-754T (Washington, D.C.: May 19, 2004).

[41] GAO, DOD Business Systems Modernization: Billions Continue to Be 
Invested with Inadequate Management Oversight and Accountability, GAO-
04-615 (Washington, D.C.: May 27, 2004).

[42] Clinger-Cohen Act of 1996, Pub. L. No. 104-106, div. E, sec. 
5125, 110 Stat. 679, 684 (Feb. 10, 1996).

[43] In March 2004, we issued the latest version of our IT investment 
management framework, GAO-04-394G, to aid agencies in enhancing their 
IT investment management processes.

[44] GAO, Information Technology Investment Management: A Framework for 
Assessing and Improving Process Maturity (Version 1.1), GAO-04-394G 
(Washington, D.C.: March 2004).

[45] Office of Management and Budget, Circular No. A-130, Management of 
Federal Information Resources (Nov. 28, 2000).

[46] GAO, Information Technology: A Framework for Assessing and 
Improving Enterprise Architecture Management (Version 1.1), GAO-03-
584G (Washington, D.C.: April 2003).

[47] GAO, Information Technology Management: Governmentwide Strategic 
Planning, Performance Measurement, and Investment Management Can Be 
Further Improved, GAO-04-49 (Washington, D.C.: January 2004).

[48] GAO, Information Technology: Leadership Remains Key to Agencies 
Making Progress on Enterprise Architecture Efforts, GAO-04-40 
(Washington, D.C.: Nov. 17, 2003).

[49] GAO, Federal Information System Controls Audit Manual, Volume I: 
Financial Statement Audits, GAO/AIMD-12.19.6 (Washington, D.C.: 
January 1999).

[50] Department of Health and Human Services, Office of the Inspector 
General, Information Technology Security Program Evaluation (September 
2003).

[51] Pub. L. No. 107-347, title III, 116 Stat. 2899, 2946 (Dec. 17, 
2002).

[52] GAO, Human Capital: Key Principles for Effective Strategic 
Workforce Planning, GAO-04-39 (Washington, D.C.: Dec. 11, 2003).

[53] GAO-04-39.

[54] GAO, Executive Guide: Creating Value Through World-class Financial 
Management, GAO/AIMD-00-134 (Washington, D.C.: April 2000).

[55] The waterfall model uses a set of distinct sequential processes to 
develop and implement a system. For example, the software concept is 
developed, and then followed by requirements analysis, architectural 
design, detailed design, coding and debugging, and system testing.

[56] According to IEEE Standard 1362-1998, a concept of operations 
document is normally one of the first documents produced during a 
disciplined development effort since it describes system 
characteristics for a proposed system from the user's viewpoint. This 
is important since a good concept of operations document can be used to 
communicate overall quantitative and qualitative system 
characteristics to the user, developer, and other organizational 
elements. This allows the reader to understand the user organizations, 
missions, and organizational objectives from an integrated systems 
point of view.

[57] IEEE 830-1998.

[58] Barry W. Boehm and Philip N. Papaccio, "Understanding and 
Controlling Software Costs," IEEE Transactions on Software Engineering, 
vol. 14, no. 10 (1988). 

[59] The Standish Group, Charting the Seas of Information Technology 
(Dennis, Mass.: The Standish Group, 1994).

[60] Capers Jones, Assessment and Control of Software Risks (Englewood 
Cliffs, N.J.: Yourdon Press, 1994).

[61] Dean Leffingwell, "Calculating the Return on Investment from More 
Effective Requirements Management," American Programmer (1997).

[62] Glenford J. Myers, The Art of Software Testing (John Wiley & 
Sons, Inc., 1979).

[63] Testing covers a variety of activities. The discussion of the 
testing processes in this appendix has been tailored to selected 
aspects of the UFMS evaluation and is not intended to provide a 
comprehensive discussion of all the processes that are required or the 
techniques that can be used to accomplish a disciplined testing 
process. 

[64] Myers, The Art of Software Testing.

[65] Steve McConnell, Software Project Survival Guide (Redmond, Wash.: 
Microsoft Press, 1998).

[66] The IEEE Standard describes key elements that should be included 
in a concept of operations, including major system components, 
interfaces to external systems, and performance characteristics such 
as speed, throughput, and volume.

[67] Shared service centers provide common services such as finance, 
human resources, procurement, and logistics.

[68] Traceability allows the user to follow requirements both forward 
and backward through process documents and from origin through 
implementation. Traceability is also critical to understanding the 
parentage, interconnections, and dependencies among the individual 
requirements and the impact of a requirement change or deletion on the 
entire system. Without an effective traceability approach, it is very 
difficult to perform actions such as (1) accurately determining the 
impact of changes and making value-based decisions when considering 
requirement changes, (2) maintaining the system once it goes into 
production, (3) tracking the project's progress, and (4) understanding 
the impact of a defect discovered during testing.
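
A minimal sketch of such bidirectional links (Python; all artifact 
identifiers are hypothetical) shows how forward links support change-
impact analysis and how inverting them traces a defect found in a 
test case back to its originating requirements:

# Forward traceability: each requirement points to the design
# elements and test cases that depend on it. Identifiers are
# hypothetical.
forward = {
    "REQ-001": ["DESIGN-010", "TC-101"],
    "REQ-002": ["DESIGN-010", "TC-201"],
}

# Backward traceability: invert the forward links.
backward = {}
for req, artifacts in forward.items():
    for artifact in artifacts:
        backward.setdefault(artifact, []).append(req)

# Changing REQ-001 affects these artifacts (forward trace) ...
print(forward["REQ-001"])    # prints: ['DESIGN-010', 'TC-101']
# ... and a defect found in TC-201 traces back to its requirement.
print(backward["TC-201"])    # prints: ['REQ-002']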

[69] A test script is a series of instructions that carry out the test 
case contained in the test plan. A test case is a set of input 
information designed to determine the correctness of a routine. A test 
plan contains a general description of what testing will involve, 
including the tolerable limits.
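
For illustration, the following sketch (Python; the routine under 
test and its tolerable limit are hypothetical) shows the three 
elements in the sense defined above: a test case's input information 
and expected result, a script that executes it, and a tolerance drawn 
from the test plan:

# Hypothetical routine under test: simple interest for a period.
def compute_interest(principal, rate, days):
    return principal * rate * days / 365

# Test case: a set of input information plus the expected result and
# the tolerable limit taken from the test plan.
test_case = {"principal": 1000.0, "rate": 0.05, "days": 365,
             "expected": 50.0, "tolerance": 0.01}

# Test script: instructions that carry out the test case and check
# the actual result against the tolerable limit.
actual = compute_interest(test_case["principal"], test_case["rate"],
                          test_case["days"])
assert abs(actual - test_case["expected"]) <= test_case["tolerance"]
print("test case passed")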

[70] During the first week of April 2004, a separate session was held 
in the Washington, D.C. area. According to HHS, this session would 
provide the other HHS operating divisions an opportunity to participate 
in the demonstration of the global interfaces, extensions, and 
federally mandated reports. 

[71] "Fits" were those requirements related to actions or processes 
that were included as a standard part of the Oracle U.S. Federal 
Financials modules being implemented by the UFMS program team. 
Requirements satisfied through use of a standard Oracle U.S. Federal 
Financials Application Program Interface are also considered to be 
"fits."

GAO's Mission:

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. Government Accountability Office

441 G Street NW, Room LM

Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director,

NelliganJ@gao.gov

(202) 512-4800

U.S. Government Accountability Office,

441 G Street NW, Room 7149

Washington, D.C. 20548: