This is the accessible text file for GAO report number GAO-07-485R 
entitled 'Public Health and Hospital Emergency Preparedness Programs: 
Evolution of Performance Measurement Systems to Measure Progress' which 
was released on April 25, 2007. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer-term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

March 23, 2007: 

The Honorable Bennie G. Thompson: 
Chairman: 
Committee on Homeland Security: 
House of Representatives: 

The Honorable Judd Gregg: 
Ranking Minority Member: 
Committee on the Budget: 
United States Senate: 

The Honorable Charles E. Grassley: 
Ranking Minority Member: 
Committee on Finance: 
United States Senate: 

The Honorable Henry A. Waxman: 
Chairman: 
Committee on Oversight and Government Reform: 
House of Representatives: 

The Honorable Edward J. Markey: 
House of Representatives: 

Subject: Public Health and Hospital Emergency Preparedness Programs: 
Evolution of Performance Measurement Systems to Measure Progress: 

The September 11, 2001, terrorist attacks, the anthrax incidents during 
the fall of 2001, Hurricane Katrina, and concerns about the possibility 
of an influenza pandemic have raised public awareness and concerns 
about the nation's public health and medical systems' ability to 
respond to bioterrorist events and other public health emergencies. 
From 2002 to 2006, the Congress appropriated about $6.1 billion to the 
Department of Health and Human Services (HHS) to support activities to 
strengthen state and local governments' emergency preparedness 
capabilities under the Public Health Security and Bioterrorism 
Preparedness and Response Act of 2002 (Preparedness and Response 
Act).[Footnote 1] HHS has distributed funds annually to 62 recipients, 
including all 50 states and 4 large municipalities, through cooperative 
agreements under two programs--the Centers for Disease Control and 
Prevention's (CDC) Public Health Emergency Preparedness 
Program,[Footnote 2] and the Health Resources and Services 
Administration's (HRSA) National Bioterrorism Hospital Preparedness 
Program. The common goal of CDC's and HRSA's preparedness programs is 
to improve state and local preparedness to respond to bioterrorism and 
other large-scale public health emergencies, such as natural disasters 
or outbreaks of infectious disease. 

To guide efforts by federal, state, and local departments and agencies 
to prepare for and respond to terrorism and other major emergencies, 
federal government has developed a number of national strategies, 
including a National Strategy for Homeland Security, which was issued 
in July 2002.[Footnote 3] Among other things, the National Strategy for 
Homeland Security requires federal government departments and agencies 
to create performance measures to evaluate progress in achieving 
homeland security initiatives, including national preparedness and 
emergency response, and to allocate future resources. Annually, both 
CDC and HRSA develop and issue program guidance for recipients that 
describes activities necessary to improve their ability to respond to 
bioterrorism and other public health emergencies and sets out 
requirements for measuring their performance. Each recipient is 
required to submit periodic reports that track progress in improving 
its preparedness. 

As a result of the nation's ineffective response to Hurricane Katrina 
and the need to prepare for a possible influenza pandemic, members of 
the Congress have raised questions about CDC's and HRSA's efforts to 
monitor the progress of their preparedness programs. Because of these 
questions, we are reporting on (1) how CDC's and HRSA's performance 
measurement systems have evolved and (2) how CDC and HRSA are using 
these systems to measure the progress of their preparedness programs. 
Enclosure I contains the information we provided to your staff at our 
February 28, 2007, briefing. 

To do our work, we reviewed and analyzed federal government documents 
related to national security and emergency preparedness. We also 
obtained reports and interviewed officials from federal 
agencies[Footnote 4] that had evaluated CDC's and HRSA's public health 
and hospital preparedness programs, professional associations involved 
in emergency preparedness, and policy research organizations that had 
published assessments or evaluations of public health and hospital 
preparedness programs. We analyzed CDC and HRSA documents and 
interviewed officials to determine how they have developed and 
implemented performance management systems for their cooperative 
agreement programs, including recipient reporting requirements, and 
systems for collecting data from recipients. Additionally, we analyzed 
other CDC and HRSA documents to identify procedures in place for 
management review of program progress and for providing feedback and 
suggestions for program improvements to recipients. We did not evaluate 
the actual performance measures adopted by CDC or HRSA or examine the 
accuracy or completeness of recipients' self-reported data as contained 
in the progress reports they are required to submit to CDC or HRSA. See 
enclosure II for detailed information on our scope and methodology. We 
conducted our work from June 2006 through March 2007 in accordance with 
generally accepted government auditing standards. 

Results in Brief: 

Since 2002, CDC's and HRSA's performance measurements have evolved from 
measuring capacity to assessing capability. Early in their programs, 
both agencies used markers or values that they called benchmarks to 
measure capacity-building efforts, such as purchasing equipment and 
supplies and acquiring personnel.[Footnote 5] These benchmarks were 
developed from activities authorized in the Preparedness and Response 
Act. In 2002, CDC established 14 benchmarks, such as requiring each 
recipient to designate an executive director of the bioterrorism and 
response program, establish a bioterrorism advisory committee, and 
develop a statewide response plan. From 2003 to 2005, CDC further 
developed its performance measurements by obtaining input from 
stakeholders to make a transition from using benchmarks focused on 
capacities to using performance measures focused on capabilities, such 
as whether personnel have been trained and can appropriately use 
equipment. In 2006, CDC continued to work with stakeholders to refine 
its performance measures. At the beginning of its program in 2002, HRSA 
established 5 benchmarks, such as requiring each recipient to designate 
a coordinator for bioterrorism planning, establish a hospital 
preparedness committee, and develop a plan for hospitals to respond to 
a potential epidemic. From 2003 to 2005, HRSA modified existing 
benchmarks and added new ones, such as training benchmarks, based on 
the existing legislation and input from stakeholders. In 2006, HRSA 
convened an expert panel to propose a set of performance measures 
focused on capabilities. CDC and HRSA officials told us they will 
continue to face challenges as their performance measures evolve, such 
as gaining consensus among stakeholders in light of minimal scientific 
data about public health and hospital emergency preparedness. 

CDC and HRSA use data from recipients' reports and site visits to 
monitor recipients' progress in improving their ability to respond to 
bioterrorism events and other public health emergencies. CDC and HRSA 
project officers use performance measurement data from recipients' 
required progress reports, along with site visits, to monitor progress 
and provide feedback about whether individual recipients have 
accomplished activities related to their ability to respond to 
bioterrorism events and other public health emergencies. Currently, 
there are no standard analyses or reports that enable CDC and HRSA to 
compare data across recipients to measure collective progress, compare 
progress across recipients' programs, or provide consistent feedback to 
recipients. However, in mid- to late 2006, both agencies began developing 
formal data analysis programs that are intended to validate recipient- 
reported data and assist in generating standardized reports. According 
to CDC officials, CDC plans to finish validation projects by August 
2007 and then develop routine reports summarizing individual recipient 
and national progress. In addition, CDC plans to issue a report by the 
end of 2007 providing a "snapshot" of the progress recipients have made 
in building emergency readiness capacity and addressing how CDC will 
measure capability in the future. However, because HRSA's program was 
expected to move to a different HHS office in 2007, its schedule for 
finishing data validation was tentative at the time we briefed your 
staff, and HRSA officials said at that time that decisions about whether 
to issue a report in 2007 on recipients' progress also had not been made. 

Agency Comments: 

We requested comments on a draft of this report from HHS. The 
department provided written comments that are reprinted in enclosure 
III. 

In commenting on this draft, HHS provided additional information about 
the transfer on March 5, 2007, of the National Bioterrorism Hospital 
Preparedness Program from HRSA to the new HHS Office of the Assistant 
Secretary for Preparedness and Response. According to HHS, it has made 
a number of changes that it believes will improve its ability to 
monitor performance at the individual recipient level and for the 
program overall. HHS is also planning to conduct an analysis of the 
performance data for existing recipients for fiscal years 2002-2006 in 
order to develop a more complete picture of levels of preparedness for 
all National Bioterrorism Hospital Preparedness Program recipients. 

Many of the initiatives outlined in HHS's comments were begun after our 
briefing to your staff on February 28, 2007, and are still being 
implemented; we are unable to comment on their effectiveness. As we 
continue to evaluate emergency preparedness programs, we will review the 
results of HHS's continued efforts to develop measurable, evidence-based 
benchmarks and objective standards and its ability to compare data 
across recipients to measure collective progress, compare progress 
across recipients' programs, or provide consistent feedback to 
recipients. 

As arranged with your offices, unless you release its content earlier, 
we plan no further distribution of this report until 30 days after its 
issuance date. At that time, we will send copies of this report to the 
Secretary of HHS and other interested parties. We will also make copies 
available to others on request. In addition, the report will be 
available at no charge on the GAO Web site at http://www.gao.gov. 

If you and your staff have any questions or need additional 
information, please contact me at (202) 512-7101 or bascettac@gao.gov. 
Contact points for our Offices of Congressional Relations and Public 
Affairs may be found on the last page of this report. GAO staff members 
who made major contributions to this report are listed in enclosure IV. 

Signed by: 

Cynthia A. Bascetta: 
Director, Health Care: 

Enclosures - 4: 

Enclosure I: Information Presented in Briefing on February 28, 2007: 

The information in this enclosure is taken directly from the slides 
used in the briefing presented to the staffs of the Honorable Judd 
Gregg, Ranking Minority Member, Senate Committee on the Budget; the 
Honorable Charles E. Grassley, Ranking Minority Member, Senate 
Committee on Finance; the Honorable Bennie G. Thompson, Chairman, House 
Committee on Homeland Security; the Honorable Henry A. Waxman, 
Chairman, House Committee on Oversight and Government Reform; and the 
Honorable Edward J. Markey, House of Representatives, on February 28, 
2007. 

Introduction (slides 3 through 6): 

The September 11, 2001, terrorist attacks, the anthrax incidents, 
Hurricane Katrina, and concerns about the possibility of an influenza 
pandemic have raised public awareness and concerns about the nation's 
public health and medical systems' ability to respond to bioterrorist 
events and other public health emergencies. In November 2002, the 
Congress passed legislation creating the Department of Homeland 
Security (DHS), giving it the overall responsibility for managing 
emergency preparedness. The Department of Health and Human Services 
(HHS) is designated as the primary agency for implementing activities 
relating to public health and hospital emergency preparedness. 

From 2002 to 2006, the Congress appropriated about $6.1 billion to 
support activities under the Public Health Security and Bioterrorism 
Preparedness and Response Act of 2002 (Preparedness and Response Act) 
to strengthen state and local governments' emergency readiness 
capabilities. HHS has distributed these funds annually to 62 
recipients, including all 50 states and 4 large municipalities, through 
cooperative agreements under two programs: 

* Centers for Disease Control and Prevention's (CDC) Public Health 
Emergency Preparedness Program (formerly the Public Health Preparedness 
and Response for Bioterrorism Program), and: 

* Health Resources and Services Administration's (HRSA) National 
Bioterrorism Hospital Preparedness Program. 

In addition to bioterrorism, these programs also address other large- 
scale public health emergencies, such as natural disasters or outbreaks 
of infectious disease. This "all-hazards" approach recognizes that some 
aspects of response to bioterrorism, such as providing emergency 
medical services and managing mass casualties, can be the same as for 
response to other public health emergencies. 

Public Law 109-417, the Pandemic and All-Hazards Preparedness Act, 
enacted December 19, 2006, amended the Preparedness and Response Act 
and authorized appropriations for CDC's and HRSA's public health and 
hospital preparedness programs through 2011. The legislation also 
creates a new Assistant Secretary for Preparedness and Response in HHS 
and transfers responsibility for HRSA's hospital preparedness program 
to this position. The program is expected to move sometime in 2007. 

To guide preparedness and response for terrorism and other major 
emergencies, the federal government developed a number of national 
strategies, including a National Strategy for Homeland Security issued 
in July 2002.[Footnote 6] This national strategy requires federal 
government departments and agencies to create performance measures to 
evaluate progress in achieving homeland security initiatives, including 
national preparedness and emergency response, and to allocate future 
resources. 

Purpose and Questions (slide 7): 

As a result of the nation's ineffective response to Hurricane Katrina 
and the need to prepare for a possible influenza pandemic, members of 
the Congress have raised questions about CDC's and HRSA's efforts to 
monitor the progress of their preparedness programs. 

To assess CDC's and HRSA's systems to monitor these programs, we 
reviewed the following questions: 

1. How have CDC's and HRSA's performance measurement systems evolved? 

2. How are CDC and HRSA using these systems to measure the progress of 
their preparedness programs? 

Scope and Methodology (slides 8 through 10): 

To do our work, we interviewed officials from: 

* HHS's Office of Public Health Emergency Preparedness (OPHEP), Office 
of the Assistant Secretary for Planning and Evaluation, Office of the 
Inspector General (OIG), and Agency for Healthcare Research and Quality 
(AHRQ); 

* CDC's Coordinating Office for Terrorism Preparedness and Emergency 
Response; 

* HRSA's National Bioterrorism Hospital Preparedness Program; 

* Congressional Research Service; and: 

* professional associations involved in emergency preparedness and 
policy research organizations that had published assessments or 
evaluations of public health and hospital preparedness programs. 

We also reviewed and analyzed documents from: 

* The Executive Office of the President, including the National 
Strategy for Homeland Security and Homeland Security Presidential 
Directives; 

* DHS, including the National Response Plan, the Interim National 
Preparedness Goal, and the draft Target Capabilities List; 

* HHS's OIG and AHRQ; 

* Congressional Research Service; 

* Office of Management and Budget, including Program Assessment Rating 
Tool reviews; 

* CDC and HRSA on the development of performance management systems and 
recipients' annual applications and progress reports; and: 

* professional associations and policy research organizations. 

We did not evaluate the actual performance measures adopted by CDC or 
HRSA or examine the accuracy or completeness of recipients' self- 
reported data as contained in the progress reports they are required to 
submit to CDC or HRSA. Our review was conducted from June 2006 through 
March 2007 in accordance with generally accepted government auditing 
standards. 

Background (slides 11 through 17): 

CDC's and HRSA's Preparedness Programs: 

The common goal of CDC's and HRSA's preparedness programs is to improve 
state and local preparedness to respond to bioterrorism and other 
public health emergencies. 

* CDC's program focuses on public health preparedness. 

* HRSA's program focuses on hospital preparedness. 

CDC and HRSA annually distribute program funds to recipients. These 
funds are used to improve recipients' ability to respond to bioterrorism 
and other public health emergencies, for example, by training volunteers 
to provide mass vaccinations or antibiotics in the event of a public 
health emergency. 

CDC and HRSA also develop program guidance for recipients that 
describes activities necessary to improve preparedness and sets out 
requirements for measuring recipients' performance. 

CDC's Preparedness Program: 

CDC distributes funds under its cooperative agreements on an annual 
basis. Each recipient: 

* must apply annually for these funds; 

* receives a base amount, plus an amount based on its proportional 
share of the national population; and: 

* has flexibility in how to distribute the funds to local public health 
agencies based on the workplan submitted to CDC with the recipient's 
application. 

Each recipient must submit reports that track progress in improving its 
ability to respond to bioterrorism and other public health emergencies. 
These have included quarterly, midyear, and annual reports. 

HRSA's Preparedness Program: 

HRSA distributes funds under its cooperative agreements on an annual 
basis. Each recipient: 

* receives a base amount, plus an amount based on its proportional 
share of the national population; and: 

* must allocate at least 75 percent of its funds to hospitals or other 
health care entities. 

- Recipients distribute most of the funds to hospitals, with a small 
portion going to other entities such as community health centers, 
emergency medical services, and poison control centers. 

- Recipients may use the remaining funds to support their 
administrative costs and needs assessments. 

Each recipient must submit midyear and annual reports that track 
progress in improving its ability to respond to bioterrorism and other 
public health emergencies. 
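The allocation rules described above for both programs lend themselves 
to a simple arithmetic illustration. The following sketch is 
hypothetical: the total appropriation, base amount, and population 
figures are invented solely for illustration, since the report does not 
specify them; only the structure (a base amount plus a share 
proportional to population, with HRSA recipients allocating at least 75 
percent of funds to hospitals or other health care entities) comes from 
the program descriptions above. 

# Hypothetical illustration of the cooperative agreement allocation
# described above: each recipient receives a base amount plus a share of
# the remaining funds proportional to its population. All dollar and
# population figures below are invented for illustration only.

def allocate(total_funds, base_amount, populations):
    """Return each recipient's award: base amount plus proportional share."""
    remaining = total_funds - base_amount * len(populations)
    national_pop = sum(populations.values())
    return {
        name: base_amount + remaining * pop / national_pop
        for name, pop in populations.items()
    }

# Three hypothetical recipients and populations (not actual program data).
awards = allocate(
    total_funds=100_000_000,
    base_amount=5_000_000,
    populations={"State A": 10_000_000, "State B": 4_000_000, "City C": 1_000_000},
)

for name, award in awards.items():
    # HRSA recipients must allocate at least 75 percent of their funds to
    # hospitals or other health care entities; the remainder may support
    # administrative costs and needs assessments.
    minimum_to_hospitals = 0.75 * award
    print(f"{name}: award ${award:,.0f}; at least ${minimum_to_hospitals:,.0f} to hospitals")
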

Prior Reviews of CDC's and HRSA's Preparedness Programs: 

Several government and private studies, including those conducted by 
GAO, HHS's OIG, and Rand, have noted weaknesses in CDC's and HRSA's 
preparedness programs. 

* In February 2004, we reported (GAO-04-360R) that although the states' 
progress fell short of 2002 goals and much remained to be accomplished, 
these programs enabled states to make needed improvements in public 
health and health care capabilities critical for preparedness. 

* Since December 2002, HHS's OIG has issued seven evaluation and 
inspection reports on program results. It found that all of the studied 
recipients had prepared bioterrorism responses and were working to 
strengthen their infrastructure, but barriers to preparedness remained, 
including problems with staffing, funding, and communication, and the 
need for standards and guidance. 

* Since 2001, Rand has conducted many studies related to preparedness 
for public health emergencies. Rand studied how public health 
preparedness is transforming public health agencies and found: 

- the preparedness mission has raised challenges in terms of 
accountability among local health jurisdictions; 

- it is difficult to assess preparedness because measures to define and 
assess preparedness, as well as a strong evidence base to support those 
measures, are lacking; and: 

- it is difficult to measure preparedness because it involves measuring 
the capacity to deal with situations that rarely happen. 

Under a contract with HHS, Rand currently is convening expert panels 
and performing literature searches to help define preparedness. 

Presidential Directive 8--National Preparedness: 

Homeland Security Presidential Directive 8 provides some guidance on 
implementing the National Strategy for Homeland Security. Consistent 
with the directive, DHS developed the Interim National Preparedness 
Goal and the draft Target Capabilities List and issued them in 
2005.[Footnote 7] 

* The Interim National Preparedness Goal establishes preparedness 
priorities, targets, and standards for preparedness assessments and 
strategies to align efforts of federal, state, local, tribal, private- 
sector, and nongovernmental entities. 

* The draft Target Capabilities List identifies 37 capabilities that 
federal, state, local, tribal, private-sector, and nongovernmental 
entities need in order to prevent, protect against, respond to, and 
recover from a major event to minimize the impact on lives, property, 
and the economy. 

CDC's and HRSA's preparedness programs provide both funds and guidance 
to state and local entities and hospitals to help them develop these 
capabilities and meet these preparedness priorities. 

Performance Measurement Systems: 

Early in a program, performance measurement systems can focus on 
measuring capacity, such as equipment and supplies purchased and 
personnel hired. 

As programs mature and more data and scientific evidence are available, 
performance measurement systems can focus more on measuring 
capabilities, such as whether personnel are trained and can 
appropriately use equipment and supplies. Measurements can include: 

* type or level of program activities conducted (process), 

* direct products and services delivered (outputs), or: 

* results of those products and services (outcomes). 

Finding 1: CDC and HRSA Performance Measures Evolved from Measuring 
Capacity to Assessing Capability (slides 18 through 29): 

In 2002, CDC's and HRSA's efforts focused on measuring capacity, such 
as the type of staff hired and equipment needed to respond to a 
bioterrorism attack. To do this, CDC and HRSA identified markers or 
values against which recipients were expected to measure their 
performance. These initial markers or values, which they called 
benchmarks, were developed from emergency preparedness activities 
authorized in the Preparedness and Response Act. 

From 2003 to 2006, CDC and HRSA changed their approach from using 
benchmarks that measure capacity to using performance measures that 
focus on whether a program has met standards assessing capabilities. 

In 2004, CDC and HRSA increased their coordination and in 2005 began to 
coordinate with DHS to align their preparedness programs with the 
Interim National Preparedness Goal and draft Target Capabilities List. 

2002--CDC's Initial Measurements Based on Legislation: 

In 2002, CDC initially established its performance measurement systems 
using benchmarks based on emergency preparedness activities authorized 
in the Preparedness and Response Act. 

CDC officials said these initial benchmarks measured program capacity- 
building efforts such as purchasing equipment and supplies and 
acquiring personnel.[Footnote 8] 

CDC established 14 critical benchmarks, such as requiring each 
recipient to designate an executive director of the bioterrorism and 
response program, establish a bioterrorism advisory committee, and 
develop a statewide response plan. 

2003 to 2005--CDC's Transition from Measuring Capacity to Assessing 
Capability: 

From 2003 to 2005, CDC began to include the participation and input of 
stakeholders--other federal agencies, recipients of program funds, 
public health professional association officials, and industry experts-
-as it further developed its performance measurements. This input 
resulted in modifications of the benchmarks and the transition from 
benchmarks to performance measures that address capabilities. 

* In 2003, an initial draft of over 100 proposed measures was developed 
from input by CDC internal subject matter experts. An external 
workgroup, including professional association representatives, reviewed 
and assessed the proposed measures. Some of the measures focused on new 
areas, such as exercising, drilling, and training. 

* In 2004, CDC convened a second internal expert panel to conduct a 
literature search to identify evidence-based criteria to support the 
performance measures. The panel consolidated the over 100 performance 
measures into 47 interim performance measures. Subsequent field-testing 
eliminated one proposed measure. 

* In late 2004, CDC held teleconferences with selected recipients and 
professional association representatives to discuss these interim 
performance measures. This process reduced the number of performance 
measures to 34. 

* In 2005, CDC introduced the 34 performance measures in the 2005 
cooperative agreement guidance and field tested the new measures in 
five locations. 

Example of the transition of a CDC benchmark into a performance measure 
that addresses capabilities: 

* 2002 benchmark: Recipients were required to develop a system to 
receive and evaluate urgent disease reports on a 24-hour-per-day, 7- 
day-per-week basis. 

* 2003/2004 benchmark: Recipients were required to complete development 
of and maintain a system to receive and evaluate urgent disease 
reports. 

* 2005 performance measure: Recipients were required to meet a target 
time of 15 minutes for a knowledgeable public health professional to 
respond to a call or a communication that appears to be of urgent 
public health consequence. 

2005 to 2006--CDC's Refinement of Capability Assessment: 

In late 2005, CDC met with representatives from professional 
organizations and state and local public health laboratories and health 
departments to review and refine the performance measures. 

In 2006, CDC held further meetings with seven recipients and other 
stakeholders to discuss data collection efforts for performance 
measures and found that gathering some of the data would not be 
feasible. As a result, CDC further reduced the number of performance 
measures from 34 to 23. 

CDC's 2006 guidance with the 23 performance measures was issued in June 
2006. Recipients were expected to comply with this guidance when 
implementing their 2006 programs, during the period from August 31, 
2006, to August 30, 2007. 

2002--HRSA's Initial Measurements Based on Legislation: 

In 2002, HRSA initially established its performance measurement systems 
using benchmarks based on emergency preparedness activities authorized 
in the Preparedness and Response Act. 

HRSA officials said these initial benchmarks measured program capacity- 
building efforts such as purchasing equipment and supplies and 
acquiring personnel. 

HRSA established five critical benchmarks, such as requiring each 
recipient to designate a coordinator for bioterrorism planning, 
establish a hospital preparedness committee, and develop a plan for 
hospitals to respond to a potential epidemic. 

2003 to 2005--HRSA's Benchmarks Modified and Expanded: 

From 2003 to 2005, HRSA, like CDC, began to include the participation 
and input of stakeholders--federal agencies, cooperative agreement 
recipients, public health professional association officials, and 
industry experts--as it further developed its performance measurements. 
This input resulted in modifications of the benchmarks. 

* In 2003, HRSA added new benchmarks based on the existing legislation 
and meetings and discussions with stakeholders. The benchmarks focused 
on such things as exercising, drilling, and training. 

* In 2004, each of HRSA's benchmarks was divided into HRSA-identified 
"sentinel indicators," which are smaller component tasks that are 
intended to accomplish the larger benchmark activity. For example, for 
the benchmark "Surge Capacity: Beds," one of the sentinel indicators is 
the number of additional hospital beds that a recipient could make 
available for patient care within 24 hours. 

* In 2005, HRSA increased the number of sentinel indicators from 21 to 
72 at HHS's request. For example, HHS asked for additional measures to 
identify bed capacity for trauma and burn victims. 

2006--HRSA's Transition from Measuring Capacity to Assessing 
Capability: 

In early 2006, HRSA convened an expert panel that proposed a set of 
performance measures, which were then disseminated to stakeholders such 
as recipients, professional associations, industry experts, and federal 
agencies for feedback. 

This input resulted in adoption of 6 performance measures and 17 
program measures (HRSA defined program measures as a mixture of program 
activities and process and outcome measures) that focus on 
capabilities. 

HRSA also maintained reporting requirements for 17 of its 72 sentinel 
indicators. 

HRSA's 2006 performance and program measures and sentinel indicators 
were not issued with its guidance in July 2006 because HRSA officials 
were awaiting final approval by HHS. These measures were issued in 
December 2006. However, according to HRSA officials, recipients were 
aware of these expectations because they had helped develop the 
measures. HRSA therefore expected recipients to comply with the measures 
when implementing their 2006 programs, during the period from September 
1, 2006, to August 31, 2007. 

Increased Coordination between CDC and HRSA; Coordination Initiated 
with DHS: 

In 2004, CDC and HRSA increased their coordination and in 2005 began to 
coordinate with DHS to align their preparedness programs with the 
Interim National Preparedness Goal and draft Target Capabilities List. 
For example, 

* CDC and HRSA project officers shared information in monthly 
conference calls. 

* CDC subject matter experts assisted HRSA's recipients. 

* CDC, HRSA, and DHS created a Joint Advisory Committee in 2005 to 
create common terminology for their respective programs and improve 
commonality in their guidance. 

* CDC and HRSA officials stated that in 2005 they had more closely 
aligned their performance measurements with the draft Target 
Capabilities List and the Interim National Preparedness Goal. 

Figure 1 provides an example of how CDC and HRSA have aligned their 
performance measurements with DHS's draft Target Capabilities List and 
the Interim National Preparedness Goal. 

Figure 1: Alignment of CDC and HRSA Performance Measures with DHS's 
Draft Target Capabilities List and the Interim National Preparedness 
Goal: 

[See PDF for Image] 

Source: CDC, HRSA, and DHS documents. 

[End of figure] 

CDC's and HRSA's Challenges: 

According to CDC and HRSA officials, they will continue to face 
challenges as their performance measures evolve, because gaining 
consensus among the various stakeholders--federal agencies, state and 
local governments, and professional associations--is difficult. These 
difficulties arise because: 

* minimal scientific data exist in this new area of public health and 
hospital emergency preparedness to guide performance measurement 
systems; and: 

* scientists, subject matter experts, and program officials can 
disagree as to what could and should be measured. 

Finding 2: CDC and HRSA Use Data from Recipients' Reports and Site 
Visits to Measure Progress (slides 30 through 36): 

CDC and HRSA project officers use performance measurement data from 
recipients' required reports, along with site visits, to monitor 
progress and provide feedback about whether individual recipients meet 
goals and accomplish activities related to their ability to respond to 
bioterrorism events and other public health emergencies. 

Both CDC and HRSA are making improvements to address the need for 
formal data analysis programs based on validated data and standardized 
procedures. 

Report and Site Visit Data: 

CDC and HRSA project officers are responsible for monitoring individual 
recipients' progress, providing technical assistance, and giving 
feedback on their emergency preparedness activities. Experts in areas 
such as epidemiology, laboratory testing, and surveillance assist 
project officers in providing technical assistance. 

* Project officers analyze and monitor individual recipients' progress 
from the information gathered through recipients' progress reports, 
phone calls, and e-mails and by conducting site visits. 

* Project officers use the information and their analyses of it to (1) 
provide recipients with technical assistance and feedback on their 
ability to respond to bioterrorism and other public health emergencies, 
(2) determine issues to discuss during future site visits, and (3) 
assist recipients in developing future cooperative agreement 
applications. 

* Project officers also collaborate with recipients to identify their 
specific needs for improving their emergency preparedness. For example, 
prior to site visits CDC project officers ask recipients what type of 
technical assistance they need and then include appropriate subject 
matter experts on the site visit. 

Providing Feedback: 

Both CDC and HRSA have various methods for providing feedback on 
progress to recipients: 

* Project officers determine the type and amount of feedback to provide 
each recipient on its progress. 

* CDC and HRSA periodically provide recipients with information about 
promising practices and lessons learned on improving their ability to 
respond to bioterrorism and other public health emergencies. 

* CDC and HRSA both hold annual conferences with all recipients to 
provide training and other information, such as changes to program 
guidance. 

Standard Analysis and Reports Currently Lacking: 

CDC and HRSA officials told us that project officers lack standard 
protocols, checklists, or procedures for analyzing recipients' reports 
that include both qualitative and quantitative data. Consequently, each 
project officer develops his or her own methods or procedures for 
analyzing and measuring recipients' progress. 

CDC and HRSA project officers have not generated standardized reports 
summarizing individual or collective recipients' progress and 
activities. 

Ongoing Improvements: 

However, both CDC and HRSA are making improvements in measuring 
progress: 

* In mid- to late 2006, both CDC and HRSA began developing formal data 
analysis programs. They plan to generate standardized reports for 
management and other stakeholders as needed. 

* CDC and HRSA plan to put procedures in place to validate the 
accuracy, reasonableness, and completeness of selected data that 
recipients self-report. 

* Officials said validation is needed to: 

- ensure that reports based on recipients' data provide accurate 
information; 

- determine whether all recipients are comparably reporting the status 
of their preparedness; and: 

- allow managers to make informed decisions to improve the individual 
recipients' cooperative agreements and, ultimately, the nation's 
preparedness. 

Once the data validation projects are completed, CDC officials plan to 
develop routine reports with specific recipient information and reports 
that provide national summaries. CDC officials plan to finish the 
validation projects by August 2007. CDC officials said that in the 
interim they would continue to use many of the measurements from 2005 
and 2006 to trace recipients' progress. 
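To illustrate the kind of checks that such validation might involve, the 
sketch below applies simple accuracy, reasonableness, and completeness 
tests to a hypothetical recipient-reported record. The field names, 
value ranges, and sample data are invented for illustration only; the 
report does not describe the actual validation procedures CDC and HRSA 
are developing. 

# Hypothetical sketch of validating recipient-reported performance data
# for accuracy, reasonableness, and completeness, as described above.
# Field names, ranges, and the sample record are invented for illustration.

REQUIRED_FIELDS = {
    "recipient": str,
    "urgent_report_response_minutes": float,   # time to respond to an urgent disease report
    "staff_trained_percent": float,            # share of staff trained on response plans
}

# Invented "reasonableness" bounds for each numeric field.
BOUNDS = {
    "urgent_report_response_minutes": (0, 24 * 60),
    "staff_trained_percent": (0, 100),
}

def validate(record):
    """Return a list of problems found in one recipient's reported record."""
    problems = []
    # Completeness: every required field must be present and of the expected type.
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    # Reasonableness: numeric values must fall within plausible ranges.
    for field, (low, high) in BOUNDS.items():
        value = record.get(field)
        if isinstance(value, (int, float)) and not (low <= value <= high):
            problems.append(f"out-of-range value for {field}: {value}")
    return problems

# Invented example record: complete, but with one implausible value.
sample = {"recipient": "State A",
          "urgent_report_response_minutes": 12.0,
          "staff_trained_percent": 130.0}
print(validate(sample))   # ['out-of-range value for staff_trained_percent: 130.0']
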

HRSA's time frame to finish validation is tentative due to the hospital 
preparedness program's expected move to another office within HHS in 
2007. 

As programs mature and more data become available, performance measures 
will continue to evolve to better measure outcomes. Because the process 
is iterative, the system allows for continuous improvements. 

Plans for Making Preparedness Information Public: 

CDC plans to issue a report by the end of 2007 providing a "snapshot" 
of the progress recipients have made in building emergency readiness 
capacity and addressing how CDC will measure capability in the future. 

HRSA officials said that decisions about whether to issue a report in 
2007 on recipients' progress had not been made due to the hospital 
preparedness program's expected move to another office within HHS. 

Beginning in 2009, and every 4 years thereafter, the Pandemic and All- 
Hazards Preparedness Act requires that HHS report to the Congress on 
the status of public health emergency preparedness and response. 

* This includes a National Health Security Strategy and an 
implementation plan that includes an evaluation of progress made toward 
preparedness based on evidence-based benchmarks and objective standards 
that measure levels of preparedness. 

* The Act is generally silent on the type of information that is to be 
included in this evaluation other than an aggregate and recipient- 
specific breakdown of funding. 

[End of section] 

Enclosure II: Scope and Methodology: 

To determine how the Centers for Disease Control and Prevention's (CDC) 
and Health Resources and Services Administration's (HRSA) performance 
measurement systems have evolved, we reviewed and analyzed federal 
government documents related to national security and emergency 
preparedness, including the Executive Office of the President's 
National Strategy for Homeland Security and several Homeland Security 
Presidential Directives, and the Department of Homeland Security's 
(DHS) National Response Plan, Interim National Preparedness Goal, and 
draft Target Capabilities List. We interviewed officials from CDC and 
HRSA to identify and document how they have developed and implemented 
performance management systems for their cooperative agreement 
programs, including determining how standards were identified, 
indicators were selected, goals and targets were established, measures 
were defined, data systems were developed, and data were collected from 
recipients. We obtained and analyzed CDC and HRSA documents to identify 
the development of performance measures from program inception to the 
present, recipient reporting requirements, and systems for collecting 
data from cooperative agreement recipients. We also obtained reports 
and interviewed officials from federal agencies that had evaluated 
CDC's and HRSA's public health and hospital preparedness programs, 
including HHS's Office of Inspector General, HHS's Agency for 
Healthcare Research and Quality, and the Congressional Research 
Service. We also obtained reports and interviewed officials from 
professional associations involved in emergency preparedness and from 
policy research organizations that had published assessments or 
evaluations of public health and hospital preparedness programs. The 
professional associations included: 

* American Hospital Association, 

* Association of Professionals in Infection Control, 

* Association of Public Health Laboratories, 

* Association of State and Territorial Health Officials, 

* National Association of County and City Health Officials, 

* National Association of Public Hospitals, and: 

* The Joint Commission (formerly Joint Commission on Accreditation of 
Healthcare Organizations). 

The policy research organizations we contacted included: 

* Center for Studying Health System Change, 

* National Center for Disaster Preparedness at Columbia University, 

* Public Health Foundation, 

* Rand Corporation, 

* The Century Foundation, and: 

* Trust for America's Health. 

We did not evaluate the actual performance measures adopted by CDC or 
HRSA. 

To determine how CDC and HRSA measure the progress of their 
preparedness programs, we interviewed CDC and HRSA officials to 
identify and document how they oversee and evaluate their cooperative 
agreement programs. To identify procedures used for reviewing recipient 
data and reporting results to applicable program managers, we obtained 
and analyzed documents and recipient-submitted progress reports from 
CDC and HRSA for program year 2004 and the first half of program year 
2005 and interviewed CDC and HRSA project officers. Additionally, we 
analyzed documents to identify procedures in place for providing 
feedback and suggestions for program improvements to cooperative 
agreement recipients. We also reviewed documents and conducted 
interviews about the procedures used by project officers to provide 
recipients with feedback on their performance, share expertise on 
developing plans or conducting exercises, and disseminate "promising 
practices" information. We did not examine the accuracy or completeness 
of recipients' self-reported data in the progress reports submitted to 
CDC or HRSA. We conducted our work from June 2006 to March 2007 in 
accordance with generally accepted government auditing standards. 

[End of section] 

Enclosure III: Comments from Department of Health and Human Services: 

Office of the Assistant Secretary for Legislation: 
Department Of Health & Human Services: 
Washington, D.C. 20201: 

Mar 14 2007: 

Cynthia A. Bascetta: 
Director, Health Care: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Dear Ms. Bascetta: 

Enclosed are the Department's comments on the U.S. Government 
Accountability Office's (GAO) draft report entitled, "Public Health and 
Hospital Emergency Preparedness Programs: Evolution of Performance 
Measurement Systems to Measure Progress" (GAO-07-485R), before its 
publication. 

The Department appreciates the opportunity to comment on this draft 
report before its publication. 

Sincerely, 

Signed by: 

Vincent J. Ventimiglia, Jr. 
Assistant Secretary for Legislation: 

Comments Of The Department Of Health And Human Services On The 
Government Accountability Office Draft Report Entitled "Public Health 
And Hospital Emergency Preparedness Programs: Evolution Of Performance 
Measurement Systems To Measure Progress" (GAO-07-485R): 

HHS Comments: 

On March 5, 2007, the Bioterrorism Hospital Preparedness Program (BHPP) 
transferred to the Office of the Assistant Secretary for Preparedness 
and Response. We maintained the existing staff assignments with the 
states to ensure continuity of support during the transition period and 
immediately afterward. We did, however, make some immediate and 
meaningful changes that we anticipate will greatly improve our ability 
to monitor performance at the individual awardee level and for the 
program overall. Most importantly, we have reassigned staff from the 
Office of the Assistant Secretary for Preparedness and Response to 
support the evaluation unit of the BHPP and reassigned former members 
of the evaluation unit staff to assignments that are a better fit for 
their skill set. This change will ensure that those individuals 
monitoring the performance of the awardees have the necessary analysis 
skills. 

In addition to strengthening the Program's evaluation unit, we have 
taken steps to establish partnerships with the Office of the Assistant 
Secretary for Evaluation and Policy here in the Department of Health 
and Human Services (HHS) and the Division of State and Local Readiness, 
Outcome Monitoring and Evaluation Branch, the evaluation unit for the 
Public Health Emergency Preparedness Program at CDC. Currently, these three 
units are working to develop the measurable, evidence-based benchmarks 
and objective standards for both programs as required by the Pandemic 
and All-Hazards Preparedness Act (the Act). These benchmarks and standards 
will be vetted with our State and local stakeholders and finalized by 
the June 17, 2007 deadline specified in the Act. The establishment of 
these measures will allow us to monitor and track performance in a 
systematic and uniform manner during the upcoming BHPP project period, 
which includes Fiscal Years 2007-2011. 

Finally, we are preparing to conduct an analysis of performance data 
for existing BHPP awardees for the initial project period--Fiscal Years 
2002-2006. We will utilize a contract currently in place to review and 
analyze the data and other information to be submitted by awardees 
through August 31, 2007. While we do not anticipate gleaning consistent 
information from all awardees, we do expect to develop a more complete 
picture of levels of preparedness for all awardees. We welcome an 
opportunity to share the results of our analysis during the next 
several months. 

[End of section] 

Enclosure IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Cynthia A. Bascetta at (202) 512-7101 or bascettac@gao.gov: 

Acknowledgments: 

In addition to the contact named above, Karen Doran, Assistant Director; 
La Sherri Bush; Jeffrey Mayhew; Roseanne Price; Lois Shoemaker; and 
Cherie' Starck made major contributions to this report. 

(290537): 

FOOTNOTES 

[1] Pub. L. No. 107-188, 116 Stat. 594. The Pandemic and All-Hazards 
Preparedness Act, Pub. L. No. 109-417, 120 Stat. 2831, enacted December 
19, 2006, reauthorized and amended the Preparedness and Response Act 
and authorized appropriations for HHS's Centers for Disease Control and 
Prevention's and Health Resources and Services Administration's public 
health and hospital preparedness programs through 2011. 

[2] CDC's program was formerly known as the Public Health Preparedness 
and Response for Bioterrorism Program. 

[3] These strategies also include the National Strategy for Pandemic 
Influenza and the National Security Strategy. 

[4] The federal agencies include HHS's Office of Inspector General 
(OIG), HHS's Agency for Healthcare Research and Quality (AHRQ), and the 
Congressional Research Service (CRS). 

[5] According to CDC officials, acquisition of personnel was necessary 
in order to develop and implement the activities authorized in the 
Preparedness and Response Act. 

[6] These strategies also include the National Strategy for Pandemic 
Influenza and the National Security Strategy. 

[7] Homeland Security Presidential Directives record and communicate 
presidential decisions about homeland security policies of the United 
States. 

[8] According to CDC officials, acquisition of personnel was necessary 
in order to develop and implement emergency preparedness activities 
authorized by the Preparedness and Response Act.

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts 
newly released reports, testimony, and correspondence on its Web site. 
To have GAO e-mail you a list of newly posted products every afternoon, 
go to www.gao.gov and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 441 G Street NW, Room LM 
Washington, D.C. 20548: 

To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202) 
512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov (202) 512-4400 U.S. 
Government Accountability Office, 441 G Street NW, Room 7125 
Washington, D.C. 20548: 

Public Affairs: 

Paul Anderson, Managing Director, AndersonP1@gao.gov (202) 512-4800 
U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, D.C. 20548: