This is the accessible text file for GAO report number GAO-07-381R 
entitled 'Homeland Security Grants: Observations on Process DHS Used to 
Allocate Funds to Selected Urban Areas' which was released on February 
8, 2007. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

February 7, 2007: 

Congressional Requesters: 

Subject: Homeland Security Grants: Observations on Process DHS Used to 
Allocate Funds to Selected Urban Areas: 

In fiscal year 2006, the Department of Homeland Security (DHS) provided 
approximately $1.7 billion in federal funding to states, localities, 
and territories through its Homeland Security Grant Program (HSGP) to 
prevent, protect against, respond to, and recover from acts of 
terrorism or other catastrophic events. The Urban Areas Security 
Initiative (UASI) is a discretionary grant under this program, and 
since fiscal year 2003, Congress has directed DHS to target UASI 
funding to high-threat, high-density urban areas to assist in building 
capacity.[Footnote 1] To meet this requirement and inform funding 
decisions, DHS developed a method to estimate the relative risk of 
terrorist attacks to urban areas. From fiscal year 2003 through 2005, 
DHS used a number of risk indicators such as population density and 
threat to allocate UASI funds. UASI funding increased during this 
period from about $96 million to $830 million, while the number of 
urban areas that received grants grew from 7 to 43. In fiscal year 
2006, DHS awarded approximately $711 million in UASI grants--a 14 
percent reduction in funds from the previous year--while the number of 
eligible urban areas identified by the risk assessment decreased to 35. 
For fiscal year 2006, DHS made several changes to the grant allocation 
process, including modifying its risk assessment methodology, 
introducing an assessment of the anticipated effectiveness of 
investments, and combining the outcomes of these two assessments to 
inform funding decisions. 

The results of the UASI eligibility and funding allocations in fiscal 
year 2006 raised congressional questions and concerns about DHS's 
methods in making UASI determinations. Several congressional members 
requested that we examine aspects of DHS's UASI funding process, and 
the fiscal year 2007 DHS Appropriations Act directed us to examine the 
validity, relevance, reliability, timeliness, and availability of the 
risk factors (including threat, vulnerability, and consequence) used by 
the Secretary of Homeland Security for the purpose of allocating 
discretionary grants.[Footnote 2] On November 17, 2006, we responded to 
the mandate and the request by briefing congressional staff on the 
results of this review (see app. I). We specifically examined (1) DHS's 
method of estimating relative risk of terrorism in fiscal year 2006; 
(2) DHS's process for assessing the effectiveness of the various risk 
mitigation investments submitted in UASI applications; (3) how DHS used 
estimated relative risk scores and assessments of effectiveness to 
allocate UASI grant funds in fiscal year 2006; and (4) what changes, if 
any, DHS plans to make in its UASI award determination process for 
fiscal year 2007. This letter and the accompanying appendices transmit 
the information provided during those briefings. 

To gain understanding and describe DHS's process for awarding fiscal 
year 2006 UASI funds, including eligibility and award amount 
determinations, we reviewed available documentation and interviewed 
knowledgeable officials. During this document review and our 
interviews, we gathered information about the data DHS used to analyze 
relative risk and what efforts it had in place to ensure data 
reliability. For example, we collected information on DHS's 
consultation with states to obtain and review critical infrastructure 
data and DHS's internal assessments of intelligence data. Additionally, 
we examined DHS guidance and methods for implementing an assessment of 
effectiveness of applicants' plans to mitigate risk. Finally, during 
our review of documents and interviews with DHS officials, we also 
collected information regarding any changes to the determination 
process for the fiscal year 2007 grant cycle. We conducted our work 
from September 2006 through November 2006 in accordance with generally 
accepted government auditing standards. 

Summary of DHS's Process for Allocating UASI Grant Funds in Fiscal 
Years 2006 and 2007: 

The Risk Assessment. In fiscal year 2006, DHS used its risk assessment 
to identify urban areas that faced the greatest potential risk, which 
made them eligible to apply for the UASI grant, and based the amount of 
awards to all eligible areas primarily on the outcomes of the risk 
assessment and a new effectiveness assessment. DHS enhanced its risk 
assessment by including three components--threat, vulnerability, and 
consequences--to estimate the relative risk of successful terrorist 
attacks to urban areas. The risk assessment was used to inform DHS's 
selection of eligible urban areas. DHS also implemented a competitive 
process to evaluate the anticipated effectiveness of proposed 
investments to address homeland security needs by using peer reviewers, 
who were homeland security professionals from fields such as law 
enforcement and fire service. The peer reviewers scored the investments 
using criteria, such as regionalization, sustainability, and impact. 
According to DHS, it combined the outcomes of the risk and 
effectiveness assessments to inform the funding allocation decisions in 
fiscal year 2006, but the Secretary of Homeland Security made the final 
UASI grant decisions. Officials also reported no significant changes to 
the risk assessment process for next year's grant cycle, but other 
decisions, such as the identification of eligible urban areas through 
the risk assessment and how much weight risk and effectiveness will be 
given in determining amounts, have yet to be made. Figure 1 illustrates 
the UASI grant determination process in fiscal year 2006. 

Figure 1: Overview of DHS's UASI Grant Determination Process in Fiscal 
Year 2006: 

[See PDF for Image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of figure] 

In fiscal year 2006, DHS estimated the risk faced by urban areas by 
assessing the relative risk of terrorism[Footnote 3] as a product of 
three components: (1) threat, or the likelihood that a type of attack 
might be attempted; (2) vulnerability, or the likelihood of a 
successful attack using a particular attack scenario; and (3) 
consequence, or the potential impact of a particular attack. To 
estimate the relative risk, DHS assessed risk from two perspectives, 
asset-based and geographic, then combined the assessments weighting 
geographic risk twice as much as asset-based risk. According to DHS 
officials, DHS made the judgment to weight geographic risk at 1.0 and 
asset-based risk at 0.5 because a potential loss of lives within an 
area contributes to how geographic risk is assessed. To estimate asset 
risk, 
DHS computed the product of threat, vulnerability, and consequence by 
assessing the intent and capabilities of an adversary to successfully 
attack an asset type, such as a chemical plant, dam, or commercial 
airport, using one of 14 different attack scenarios (e.g., nuclear 
explosion or vehicle-borne improvised explosive device). 
Simultaneously, DHS assessed geographic risk by approximating the 
threat, vulnerability, and consequences considering general geographic 
characteristics mostly independent of the area's assets using counts of 
data such as reports of suspicious incidents, the number of visitors 
from countries of interest, and population. In DHS's view, the two 
estimates of risk--asset-based and geographic--are complementary and 
provide a "micro-and macro-" perspective of risk, respectively. In 
calculating these relative risk scores and addressing the uncertainties 
in estimating relative risk, policy and analytic judgments were 
required. For example, DHS made judgments about how to weight asset and 
geographic risk, how to identify the urban boundaries it used to 
estimate risk, and what data were sufficient to use in its risk 
estimates. DHS used this risk assessment to identify the eligibility 
cut point, which determined the number of urban areas that could apply 
for UASI funding in fiscal year 2006 and defined high-risk urban areas. 
According to DHS officials, the DHS Secretary selected a point that 
resulted in 35 eligible urban areas, which accounted for 85 percent of 
total estimated risk. DHS then decided to extend eligibility to 11 
sustainment areas that participated in the program in fiscal year 2005, 
but were not identified in fiscal year 2006 through the risk 
assessment.[Footnote 4] Appendix II contains more detail about the risk 
assessment process and describes how DHS used these estimates to 
determine which urban areas were eligible to apply for fiscal year 2006 
UASI grants. Appendix III provides information on the data sources used 
in DHS's fiscal year 2006 risk estimates. 
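
To illustrate the structure of the calculation described above, the 
following simplified Python sketch combines an asset-based score and a 
geographic score using the 0.5 and 1.0 weights DHS described. All 
values, weights, and variable names are illustrative assumptions; they 
are not DHS's actual data, scales, or code, and the normalization DHS 
applied before combining the two scores is omitted for brevity. 

# Illustrative sketch of the fiscal year 2006 risk structure; every value
# and weight below is invented for illustration only.

def asset_risk(scenario_scores):
    """Asset-based risk: threat x vulnerability x consequence, summed over
    hypothetical asset-type/attack-scenario pairs."""
    return sum(t * v * c for t, v, c in scenario_scores)

def geographic_risk(counts, weights):
    """Geographic risk: a weighted sum of area-level counts (for example,
    suspicious incident reports, visitors, and population)."""
    return sum(weights[k] * counts[k] for k in counts)

# Invented inputs for one hypothetical urban area.
asset_pairs = [(0.7, 0.4, 0.9), (0.3, 0.6, 0.5)]   # (threat, vulnerability, consequence)
geo_counts = {"incident_reports": 120, "visitors": 40_000, "population": 2_500_000}
geo_weights = {"incident_reports": 0.5, "visitors": 0.0001, "population": 0.000001}

# DHS reported weighting geographic risk (1.0) twice as heavily as
# asset-based risk (0.5) before summing the two scores.
total_risk = 0.5 * asset_risk(asset_pairs) + 1.0 * geographic_risk(geo_counts, geo_weights)
print(f"Relative risk score (illustrative): {total_risk:.2f}")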

The Effectiveness Assessment. For the first time since the inception of 
the program, DHS required urban areas to submit investment 
justifications as part of their grant application, so it could assess 
the anticipated effectiveness of the various risk mitigation 
investments urban areas proposed. The investment justifications 
included up to 15 "investments" or proposed solutions to address 
homeland security needs, identified by the states and urban areas 
through their strategic planning process. DHS used peer reviewers to 
assess the investments submitted by the 46 urban areas--35 eligible 
through the risk assessment and 11 sustainment areas. DHS and the 
states collaborated to identify and select these peer reviewers who 
were homeland security professionals and managers from disciplines such 
as law enforcement, fire service, and emergency communications. 
According to DHS, it arranged 17 peer review panels that included 
reviewers from a variety of professions, all levels of government, and 
representatives from different regions of the country and from both 
large- and small-population states. These reviewers evaluated, 
discussed, and scored the urban areas' investment justifications, 
initially on an individual basis, then in panels. The criteria 
reviewers used to score the investment justifications included the 
following categories: relevance to the interim National Preparedness 
Goal and to state and local homeland security plans, anticipated 
impact, sustainability, regionalism, and the implementation of each 
proposed investment. Reviewers on each panel assigned scores for six 
investment justifications, which according to DHS officials were 
averaged to determine a final effectiveness score for each urban area. 
Appendix IV provides additional details about the approach DHS used to 
assess effectiveness in fiscal year 2006. 

Final Allocation Decisions. Finally, DHS used a new method to help 
determine UASI allocation amounts for the 46 eligible urban areas, 
based primarily on the risk and effectiveness assessments, but final 
allocation decisions were made by the Secretary of Homeland Security. 
The risk and effectiveness scores did not automatically translate into 
funding amounts, but rather, the scores informed the decisions, 
according to DHS. While all eligible urban areas that applied for UASI 
grants would receive funding, DHS had to prioritize how funds would 
be allocated. DHS prioritized those areas estimated to have the highest 
risk of a successful terrorist attack, while still rewarding those 
areas that proposed ways to address homeland security needs that were 
anticipated to be effective. DHS used the combined scores to assign the 
46 eligible urban areas into four categories: Category I--higher risk, 
higher effectiveness; Category II--higher risk, lower effectiveness; 
Category III--lower risk, higher effectiveness; and Category IV--lower 
risk, lower effectiveness. According to DHS, it considered many 
different distributions of funding to each of the 4 categories. DHS 
officials said that they made the decision to give Category I the 
highest funding priority and Category IV the lowest funding priority. 
Once the amounts for each category were decided, DHS used a formula to 
determine the grant award for each urban area, giving the risk score a 
weight of 2/3 and the effectiveness score a weight of 1/3. According to 
DHS, these weights reflect its decision to prioritize risk over 
effectiveness. DHS officials reported presenting funding options to the 
Secretary of Homeland Security, who made the final decision about 
funding allocations. The final funding decision resulted in 70 percent 
of UASI funding going to "higher risk" candidates in Categories I and 
II. Figure 2 illustrates these funding priorities, as described by DHS 
officials, in which each circle represents a hypothetical urban area 
and the size of the circle corresponds to the relative amounts of the 
grant awards.[Footnote 5] Appendix V provides additional details on the 
allocation method used in fiscal year 2006. 
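
The allocation logic DHS described--a combined score weighting risk at 
2/3 and effectiveness at 1/3, with areas sorted into four risk and 
effectiveness categories--can be sketched as follows. The 2/3 and 1/3 
weights come from DHS's description; the category thresholds and the 
example scores are assumptions made for illustration, since DHS did 
not publish them. 

# Illustrative sketch of the fiscal year 2006 allocation categories.
RISK_WEIGHT, EFFECTIVENESS_WEIGHT = 2 / 3, 1 / 3
RISK_THRESHOLD = EFFECTIVENESS_THRESHOLD = 50  # assumed dividing lines on 0-100 scales

def combined_score(risk, effectiveness):
    """Weighted score DHS described: risk counts twice as much as effectiveness."""
    return RISK_WEIGHT * risk + EFFECTIVENESS_WEIGHT * effectiveness

def category(risk, effectiveness):
    """Assign an area to one of the four funding-priority categories (quadrants)."""
    if risk >= RISK_THRESHOLD:
        return "I" if effectiveness >= EFFECTIVENESS_THRESHOLD else "II"
    return "III" if effectiveness >= EFFECTIVENESS_THRESHOLD else "IV"

# Invented (risk, effectiveness) scores for four hypothetical urban areas.
areas = {"Area A": (90, 70), "Area B": (85, 30), "Area C": (40, 80), "Area D": (20, 25)}
for name, (risk, eff) in areas.items():
    print(f"{name}: Category {category(risk, eff)}, combined score {combined_score(risk, eff):.1f}")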

Figure 2: DHS Allocation Tool for Fiscal Year 2006 UASI Funding: 

[See PDF for Image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of figure] 

The Fiscal Year 2007 Process: 

The fiscal year 2007 process, as described by DHS officials, represents 
a continuing evolution in DHS's approach to its risk methodology for 
grant allocation. DHS officials said they will continue to use the 
risk and effectiveness assessments to inform final funding decisions. 
For fiscal year 2007, DHS officials described changes that simplified 
the risk methodology, integrating the separate analyses for asset-based 
and geographic-based risk, and included more sensitivity analysis in 
determining what the final results of its risk analysis should be. DHS 
officials said the primary goal was to make the process more 
transparent and more easily understood, focusing on key variables and 
incorporating comments from a variety of stakeholders regarding the 
fiscal year 2006 process. For the 2007 grant cycle, DHS no longer 
estimated asset-based and geographic risk separately; it considered 
most areas of the country equally vulnerable to a terrorist attack, 
given freedom of movement within the nation, and focused instead on 
the seriousness of the consequences of a successful terrorist attack. 
As shown in 
figure 3, the maximum risk score possible for a given area was 100. 
Threat to people and places accounted for a maximum of 20 points, and 
vulnerability and consequences for a maximum of 80 points. In the 
fiscal year 2007 process, the intelligence community for the first time 
assessed threat information for multiple years (generally, from 
September 11, 2001 forward) for all candidate urban areas and gave the 
Office of Grants and Training a list that grouped the 168 areas into 
one of four tiers. Tier I included those at the highest threat relative 
to the other areas, and Tier IV included those at the lowest threat 
relative to the others. 
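
A simplified sketch of the fiscal year 2007 scoring frame described 
above follows: a 100-point scale with up to 20 points for threat and up 
to 80 points for vulnerability and consequences. The threat points 
assigned to each tier and the split of the 80 points across the 
consequence-related indices (such as the population, economic, national 
infrastructure, and national security indices discussed below) are 
assumptions for illustration, not DHS's published values. 

# Hedged sketch of the fiscal year 2007 100-point scoring frame.
THREAT_POINTS_BY_TIER = {"I": 20, "II": 15, "III": 10, "IV": 5}  # assumed mapping

def fy2007_risk_score(threat_tier, index_scores, index_points):
    """index_scores: each index on a 0-1 scale; index_points: the share of the
    80 vulnerability/consequence points allotted to each index (assumed)."""
    consequence_points = sum(index_scores[k] * index_points[k] for k in index_points)
    return THREAT_POINTS_BY_TIER[threat_tier] + consequence_points

# Invented example: the four indices together account for the 80 points.
points = {"population": 30, "economic": 20, "infrastructure": 20, "national_security": 10}
scores = {"population": 0.9, "economic": 0.6, "infrastructure": 0.4, "national_security": 0.2}
print(f"Illustrative risk score (maximum 100): {fy2007_risk_score('II', scores, points):.1f}")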

Figure 3: DHS Risk Assessment Methodology for Fiscal Year 2007 UASI 
Funding: 

[See PDF for Image] 

Source: DHS. 

Note: DIB is Defense Industrial Base. 

[End of Figure] 

According to DHS officials, the greatest concern was the impact of an 
attack on people, including the economic and health impacts of an 
attack. Also of concern was the quantity and nature of critical 
infrastructure within each of the 168 urban areas assessed. DHS 
reported that the threat information used for risk estimates was based 
upon an analysis of all credible intelligence data. DHS's Office of 
Intelligence and Analysis performed this review and provided the Office 
of Grants and Training with threat assessments and corresponding threat 
values for each urban area. In contrast, for the 2006 grant cycle, DHS 
used total counts of threats and suspicious incidents and incorporated 
these into its model. In addition, estimates of asset-based 
vulnerability were assigned values on a cardinal scale of 1 to 100 
rather than an ordinal scale of 1 to 3, which DHS officials believe 
provided insight into the differences between asset types with 
different values. 

In assessing threat, vulnerability, and consequences, DHS specifically 
wanted to capture key land and sea points of entry into the United 
States and the location of defense industrial base facilities and 
nationally critical infrastructure facilities. The approximately 2,100 
nationally critical infrastructure assets included in the risk 
assessment were selected on the basis of analysis by DHS infrastructure 
protection analysts, sector specific federal agencies, and the states. 
According to DHS, these 2,100 assets include some 129 defense 
industrial base assets. Assets were grouped into two tiers: (1) those 
that if attacked could cause major national or regional impacts similar 
to those from Hurricane Katrina or 9/11; and (2) highly consequential 
assets with potential national or regional impacts if attacked. Tier II 
includes about 660 assets identified by state partners and validated by 
sector specific agencies. On the basis of Office of Infrastructure 
Protection analysis, Tier I assets were weighted using an average value 
three times as great as Tier II assets. According to DHS officials, 
defense industrial base assets were included in the national security 
index and all other assets in the national infrastructure index. 
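
DHS's description implies that Tier I assets carry roughly three times 
the weight of Tier II assets when critical infrastructure is counted. 
The following sketch shows one simple way such a weighted count could 
feed an index; the scaling rule and all numbers are assumptions, since 
DHS did not specify how the counts enter the national infrastructure 
index. 

# Hedged sketch: Tier I assets weighted three times as heavily as Tier II.
TIER_WEIGHTS = {"I": 3.0, "II": 1.0}

def infrastructure_index(tier1_assets, tier2_assets, assumed_max_weighted_count=60):
    """Weighted asset count scaled to a 0-1 index against an assumed maximum."""
    weighted = TIER_WEIGHTS["I"] * tier1_assets + TIER_WEIGHTS["II"] * tier2_assets
    return min(weighted / assumed_max_weighted_count, 1.0)

# Invented example: an urban area with 4 Tier I and 12 Tier II assets.
print(f"Infrastructure index (illustrative): {infrastructure_index(4, 12):.2f}")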

Throughout this process, a number of policy judgments were necessary, 
including what variables to include in the assessment and the points to 
be assigned to each major variable (e.g., threat, the population index, 
the economic index, the national infrastructure index, and the national 
security index), with an eye toward how these judgments affected 
outcomes. DHS officials noted that such judgments were the subject of 
extensive discussions, including among high-level officials. In 
addition, DHS officials said that they conducted more sensitivity 
analyses than was possible in the fiscal year 2006 process. DHS 
officials noted that because expert judgment was applied to the data, 
fewer variables were used in the current model, making it possible to 
track the effect of different assumptions and values on the ranking of 
individual urban areas. 

Finally, DHS officials said that the effectiveness assessment process 
will be consistent with last year's process, although a number of 
enhancements have been made based on feedback received. However, no 
final decision has been made on the weights to be given to risk and 
effectiveness for the allocation of the fiscal year 2007 grants, 
according to DHS officials. One modification to the effectiveness 
assessment will provide urban areas the opportunity to include 
investments that involve multiple regions, which can potentially earn 
them an extra 5 to 8 percent on their final score. In addition, DHS 
will convene a separate peer review panel to assess these proposed 
multi-regional investments. DHS has also offered applicants a mid-year 
review in which they can submit draft proposals to DHS to obtain 
comments and guidance or to address questions the application may raise 
(such as little or unclear information on the anticipated impact of the 
investment on preparedness). As in the 2006 process, DHS officials said 
that they cannot assess how effective these investments, once made, are 
in mitigating risk. 

Observations: 

Determining an appropriate methodology to assess terrorism risk is 
challenging, given uncertainties such as the limited data on actual 
attacks and the difficulty of understanding the capabilities, 
intentions, and adaptability of terrorists. The inherent uncertainties 
in estimating 
risk require policy and analytic judgments. Other federal agencies, 
terrorism risk researchers, and we agree that threat, vulnerability, 
and consequences of an attack should be incorporated into terrorism 
risk assessments. DHS has adopted an overall risk assessment approach 
that consists of these three risk factors, and in implementing this 
approach has made judgments in an attempt to address inherent 
uncertainties. According to DHS's Under Secretary, DHS has made 
significant progress in developing its risk assessment methods, which 
includes using a model based on the three risk factors and 
incorporating state and local input. However, for the 2006 risk 
assessment process, DHS officials told us that DHS had limited 
knowledge of how changes to its risk assessment methods, such as adding 
asset types and using additional or different data sources, affected 
its risk estimates. According to a senior technical advisor in DHS's 
Risk Management Division, DHS did not have the resources to undertake 
such analyses. Consequently, DHS could not assess the effects of these 
changes on risk rankings and the determination of eligibility for, and 
amount of, UASI grants. This official acknowledged the importance of 
judgments in assessing risk of terrorism and eligibility outcomes. 

DHS had a limited understanding of the effects of the judgments made in 
estimating risk that influenced eligibility and allocation outcomes in 
fiscal year 2006. DHS leadership can make more informed policy 
decisions if they are provided with alternative risk estimates and 
funding allocations resulting from analyses of varying data, judgments, 
and assumptions. The Office of Management and Budget (OMB) offers 
guidelines for treatment of uncertainty in a number of applications, 
including the analysis of government investments and programs. These 
guidelines call for the use of sensitivity analysis to gauge what 
effects key sources of uncertainty have on outcomes. According to OMB, 
assumptions should be varied and outcomes recomputed to determine how 
sensitive analytical results are to such changes.[Footnote 6] By 
applying these guidelines, decision makers are better informed about 
how sensitive outcomes are to key sources of uncertainty. While DHS has 
indicated that it performed some sensitivity analyses for fiscal year 
2006, it has not provided us with details on the extent of these 
analyses, how they were used, or how much they cost. DHS officials told 
us they had conducted more extensive sensitivity analyses for the 
fiscal year 2007 risk assessment, but we have no documentation on what 
analyses were conducted, how they were conducted, or how they were used 
and affected the final risk assessment scores and relative rankings. 
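
The kind of sensitivity analysis OMB's guidelines call for--varying an 
assumption and recomputing the outcome--can be illustrated with a short 
sketch. The scores, the candidate weights, and the ranking rule below 
are invented; this is a generic illustration, not DHS's model or the 
analyses DHS reported performing. 

# Generic sketch of OMB-style sensitivity analysis: vary a key assumption
# (here, the weight on one risk component), recompute the outcome, and
# compare the resulting rankings. All scores and weights are invented.

def rank_areas(asset_scores, geo_scores, asset_weight):
    totals = {a: asset_weight * asset_scores[a] + 1.0 * geo_scores[a] for a in asset_scores}
    return sorted(totals, key=totals.get, reverse=True)

asset_scores = {"Area A": 80, "Area B": 55, "Area C": 70, "Area D": 30}
geo_scores = {"Area A": 60, "Area B": 75, "Area C": 50, "Area D": 65}

baseline = rank_areas(asset_scores, geo_scores, asset_weight=0.5)
for weight in (0.25, 0.5, 0.75, 1.0):
    alternative = rank_areas(asset_scores, geo_scores, asset_weight=weight)
    moved = sum(1 for a in baseline if baseline.index(a) != alternative.index(a))
    print(f"Asset weight {weight}: ranking {alternative}; areas changing rank: {moved}")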

Agency Comments: 

We provided a draft of this report to DHS for review and comment. DHS 
provided us technical comments that we incorporated into our report 
where appropriate. 

GAO Contact: 

We are sending copies of this correspondence to the requesters listed 
below, the appropriate congressional committees, and the Secretary of 
Homeland Security. 

Contact points for our Offices of Congressional Relations and Public 
Affairs may be found on the last page of this report. For further 
information about this report, please contact William Jenkins, Jr., 
Director, GAO Homeland Security and Justice Issues Team, at (202) 512-
8777 or at jenkinswo@gao.gov. GAO staff members who were major 
contributors to this report are listed in appendix VI. 

Signed by: 

William Jenkins, Jr., Director, 
Homeland Security and Justice Issues Team: 

List of Congressional Addressees: 

The Honorable Robert C. Byrd: 
Chairman: 
The Honorable Thad Cochran: 
Ranking Minority Member: 
Committee on Appropriations: 
United States Senate: 

The Honorable David Obey: 
Chairman: 
The Honorable Jerry Lewis: 
Ranking Minority Member: 
Committee on Appropriations: 
House of Representatives: 

The Honorable Bennie Thompson: 
Chairman: 
Committee on Homeland Security: 
House of Representatives: 

The Honorable Barbara Boxer: 
United States Senate: 

The Honorable Dianne Feinstein: 
United States Senate: 

The Honorable Bob Filner: 
House of Representatives: 

The Honorable Doris Matsui: 
House of Representatives: 

The Honorable Mike Thompson: 
House of Representatives: 

The Honorable Susan Davis:
House of Representatives: 

[End of section] 

Appendix I: Briefing Slides: 

Urban Area Security Initiative (UASI) Grants: DHS combined analyses and 
judgments for the FY06 assessment and allocation processes, and is 
weighing potential changes for FY07: 

Briefing for Congressional Committees and Requesters: 
November 17, 2006: 

Objectives: 

In response to a legislative mandate and a congressional request, and 
based on discussions with relevant congressional staff, we addressed 
the following: 

What process did DHS use to allocate UASI grants in fiscal year 2006? 

* How did DHS determine the urban areas that were eligible to apply for 
UASI grants in fiscal year 2006? 

* How did DHS determine award amounts in fiscal year 2006? 

What changes, if any, does DHS plan to implement in fiscal year 2007 to 
its UASI award determination process? 

Scope and Methodology: 

We analyzed DHS documents and interviewed DHS officials about: 

The UASI grant determination process in FY06 including: 

* How risk analysis is used to inform decisions: 

* How the peer review process is conducted: 

* How allocation decisions are made: 

Planned changes to the FY07 grant determination process--DHS 
Correspondence of October 25, 2006. 

We did our work between September 2006 and November 2006, in accordance 
with generally accepted government auditing standards (GAGAS). 

Results in Brief: 

In FY06, DHS combined a risk analysis and effectiveness assessment--
using empirical analytical methods and policy judgments--to select 
eligible urban areas and allocate UASI funds. 

DHS estimated the relative risk of a successful terrorist attack to 
urban areas, identified candidate urban areas, and from these selected 
35 urban areas that were eligible to apply for FY06 grants. 

DHS introduced an effectiveness assessment--a peer review process--to 
assess and score the effectiveness of the proposed investments 
submitted by the eligible applicants. 

DHS based awards on a decision to allocate a larger proportion of funds 
to areas considered to be both at highest risk and most effective. 

For FY07, DHS plans to retain the general process used in FY06 but key 
decisions have yet to be determined. 

Overview of the grant determination process: 

[See PDF for image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of figure] 

Risk analysis model used in determining eligible areas: 

For FY06, DHS adopted a general approach to estimating Risk (= Threat x 
Vulnerability x Consequences), but calculated risk by combining the 
results from two complementary models: 

* Asset-based risk and: 

* Geographic-based risk: 

Risk analysis model used in determining eligible areas: 

[See PDF for Image] 

[End of figure] 

Data used to measure asset and geographic risk: 

DHS collected quantitative data such as: 

Number of each type of asset: 

Number of intelligence reports and other indicators of geographic 
Threat: 

Population and related measures for geographic Consequences: 

DHS generated data such as: 

Threat values assigned to assets based on current and credible 
intelligence: 

Vulnerability values for assets derived by mapping attack scenarios 
against asset types: 

Judgments involved in estimating risk: 

Judgments: Identifying critical assets; 
Outcomes in FY06: DHS identified 38 asset types. 

Judgments: Selecting appropriate geographic urban area boundaries, 
known as a footprint, for risk estimation purposes; 
Outcomes in FY06: DHS identified the footprint by (1) applying 
population or threat criteria to cities, (2) merging contiguous cities, 
then (3) drawing a 10-mile buffer area around the merged area. 

Judgments: Determining candidates that are considered for UASI grant 
eligibility; 
Outcomes in FY06: 90 candidate Urban Areas were identified. 

Judgments: Determining what data are integrated and how reliability is 
assessed; 
Outcomes in FY06: 62 data variables from public and private sources, 
and DHS-generated data, were used. DHS generally does not verify data; 
outside sources were assumed to be reliable. 

Judgments: Combining asset and geographic risk scores after weighting; 
Outcomes in FY06: Asset and geographic risk were weighted as 0.5 and 
1.0 respectively, then summed. Total risk = (0.5)Asset + 
(1.0)Geographic. 

[End of table] 

Judgments in eligibility decisions in FY06: 

DHS computed risk scores for the 90 candidate areas and plotted them on 
a relative risk curve (see next slide). 

DHS determined the cut point on the relative risk curve that would 
determine which Urban Areas were eligible to apply for grants, 
according to DHS officials. 

* The cut point the DHS Secretary selected resulted in 35 Urban Areas, 
which accounted for 85 percent of total estimated risk. 

DHS decided to allow an additional 11 Urban Areas to apply. These Urban 
Areas received UASI funds in FY05, but were outside the cut point in 
FY06. 

Risk estimates used to inform eligibility decisions: 

[See PDF for Image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of figure] 

Effectiveness assessment added in FY06: 

DHS assessed the applications submitted by the 46 eligible urban areas. 

DHS used a peer-review process to assess and score the effectiveness of 
proposed investments by: 

* engaging the states in identifying and selecting peer reviewers, 

* having peer reviewers individually score investments, and: 

* assigning peer reviewers to panels to make final effectiveness score 
determinations. 

According to DHS, this assessment encouraged urban areas to identify 
their needs and develop specific initiatives to address these needs. 

Allocation process based on both risk and effectiveness scores: 

In fiscal year 2006, DHS combined risk (2/3 of total) and effectiveness 
scores (1/3) to determine amounts to allocate. 

DHS used a 2 x 2 matrix to make decisions and set priorities (see next 
slide). 

* DHS created four quadrants which were given different funding 
priorities; each quadrant had its own funding allocation. 

* DHS plotted each eligible urban area's risk and effectiveness score 
on the matrix. 

* Some policy decisions were involved in determining how much funding 
applicants in each quadrant received. 

DHS used a 2 x 2 matrix to make decisions and set priorities: 

[See PDF for image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of figure] 

Plans for FY 2007 grant process: 

DHS plans to retain the structure and approach used in FY 2006: 

* assessing risk to determine eligibility, 

* scoring effectiveness of proposed investments, and: 

* combining risk and effectiveness to determine UASI award amounts. 

Planned changes include: 

* DHS Office of Intelligence & Analysis will apply judgments to threat 
information. 

Key eligibility and allocation decisions are still under consideration, 
according to DHS officials. 

Concluding observations: 

Inherent uncertainty associated with estimating risk of terrorist 
attack requires policy and analytic judgments. 

DHS performed some analyses to assess the sensitivity of its risk 
estimates to alternative policy and analytical judgments. 

DHS has adopted a process of "continuous improvement" to its methods 
for estimating risk and measuring applicants' effectiveness. 

[End of section] 

Appendix II: DHS's Approach to Risk Analysis for Fiscal Year 2006: 

In fiscal year 2006, DHS made enhancements to its approach to 
estimating risk that involved incorporating stakeholder feedback and 
three risk factors--threat, vulnerability, and consequence. Other 
models and methodologies of assessing risk also include these three 
risk factors. However, the inherent uncertainties associated with 
estimating the risk of terrorist attacks required DHS to make numerous 
policy and analytic judgments. The results of DHS's risk assessment 
were used to inform two key grant decisions in fiscal year 2006: (1) 
eligibility of urban areas to apply for UASI funding and (2) funding 
amounts. 

DHS has developed a flexible approach to assessing risk of terrorist 
attacks that considers several factors, including stakeholder feedback. 
In developing DHS's fiscal year 2006 UASI grant determination process, 
DHS considered agency goals and statutory responsibilities related to 
risk management. DHS's fiscal year 2006 funding criteria--based on 
relative risk and effectiveness of proposed solutions to identified 
needs--align federal resources with the national priorities established 
by the Interim National Preparedness Goal. In addition, DHS solicited 
feedback from states, territories, and local governments to increase 
transparency and held discussions with stakeholders and experts such as 
the RAND Corporation regarding data analysis.[Footnote 7] For example, 
in May 2005, DHS hosted a meeting to solicit feedback on the fiscal 
year 2005 risk formula, which was attended by representatives from 12 
states or urban areas and from law enforcement and fire service 
associations. DHS officials reported that changes to the fiscal year 
2006 risk estimation model for fiscal year 2007 were based on feedback 
received, provided that the data were relevant and the changes could be 
applied to all urban areas during the data collection phase. However, 
agency 
officials said that implementing these suggestions varied in cost and 
time from minimal to very costly and time-consuming. Additionally, we 
were told that incorporating suggestions from states, territories, and 
local areas may not add significant value to outcomes, but DHS did not 
test the impacts of these changes. Where possible, DHS has integrated 
approaches with the intent of improving the model and its approach to 
estimating risk. 

DHS changed its definition of risk in fiscal year 2006 to incorporate 
common components from other models. DHS defined risk by three 
principal components: threat, or the likelihood of a type of attack 
that might be attempted; vulnerability, or the likelihood of a 
successful attack with a particular attack method; and consequence, or 
the potential impact of a particular attack. Other risk assessment 
models also use these three components. For example, other federal 
agencies have adopted some form of threat, vulnerability, and 
consequence into their risk management frameworks. DOD's risk 
management approach includes threat and vulnerability assessments that 
identify potential threats and weaknesses that may be exploited by 
those threats. The Department of Justice provided guidance to law 
enforcement executives on how to assess risk of terrorism to an asset 
by combining assessments of threat, vulnerability, and criticality, 
which evaluates the likely impact if an identified asset is lost or 
harmed by specific events. Additionally, the RAND Corporation argues 
that threat, vulnerability, and consequences play a significant role in 
assessing risk to urban areas and defines risk in a way that links 
these three components. Further, in February 2005, the Congressional 
Research Service reported that many risk assessment models and 
methodologies consisted of identifying critical assets, evaluating 
threats, assessing the vulnerabilities of critical assets, and 
determining the expected consequences of specific types of attack on 
specific assets.[Footnote 8] For instance, the report noted the 
American Petroleum Institute and the National Petrochemical and 
Refiners Association defined risk as a function of consequences of a 
successful attack against an asset, and likelihood of a successful 
attack against an asset, where likelihood is defined as the 
attractiveness of the target to the adversary based on the adversary's 
intent and the target's perceived value to the adversary, degree of 
threat based on capabilities, and degree of vulnerability of the asset. 

In fiscal year 2006, DHS combined two risk assessments--asset-based 
risk and geographic-based risk--that were both based on threat, 
vulnerability, and consequence to determine the relative risk of a 
successful terrorist attack to urban areas. The asset-based risk 
assessment analyzed the intent and capabilities of an adversary to 
successfully attack any of 38 asset types, such as a chemical plant, 
dam, or commercial airport, using one of 14 different attack scenarios 
(e.g., nuclear explosion or vehicle-borne improvised explosive device). 
DHS identified the list of 38 asset types, and according to DHS 
officials, it collected data on over 200,000 individual assets from 
public and private sector sources. Geographic risk considered the 
general geographic characteristics of an area mostly independent of the 
area's assets using counts of information, such as suspicious incident 
reports, FBI cases, and population. Table 1 describes what we know 
about how each component of asset and geographic risk was calculated. 
According to DHS, the two estimates of risk, asset-based and 
geographic, were complementary, providing a micro- and macro-
perspective of risk, respectively. Furthermore, while DHS's risk 
analysis was 
largely based on population and population density in previous years, a 
DHS official told us that legislative language directed DHS to look at 
threats to infrastructure, which was partly why DHS added the asset- 
based analysis. 

Table 1: Description of Asset-Based and Geographic Risk Computations in 
Fiscal Year 2006: 

Asset-based risk. 

Component: Threat; 
Description: DHS used information from the intelligence community, such 
as communications intercepts and assessments of the abilities of 
adversaries to carry out various types of attacks. This information was 
evaluated on two main criteria, the intent and capability of the group 
making the threat. The strategic intent of an adversary is based on the 
"chatter factor" and "attractiveness," which is partly determined by 
how closely the results of a type of attack align with high-level 
objectives of an adversary. We learned that information used in this 
component for fiscal year 2006 was based on the terrorist group viewed 
as having the "greatest capabilities" across all attack scenarios. How 
the variables were calculated to get a measurement of threat to a 
particular asset type was not specified. 

Component: Vulnerability; 
Description: To measure the vulnerability of an asset type, DHS used 
internal subject matter experts who analyzed the general attributes of 
an asset type against various terrorist attack scenarios. These subject 
matter experts conducted site vulnerability analyses on a sample of 
sites from the asset type to catalog attributes for the generic asset. 
Experts evaluated vulnerability by attack scenario and asset type pairs 
(e.g., nuclear explosion against a chemical plant) and assigned an 
ordinal relative value (1, 2, or 3) to the pair based on 10 major 
criteria (e.g., electronic detection and access control). 

Component: Consequences; 
Description: The mode of attack associated with the greatest likelihood 
of success was used to assess the consequence that would result from 
such an attack on the asset type. DHS used four categories of 
consequences--human health, economic, strategic mission, and 
psychological--for this assessment, which were identified in the 
National Strategy for Infrastructure Protection. These categories were 
weighted and then summed. Details about what data were used to 
calculate or simulate consequence were not specified. 

Geographic risk. 

Component: Threat; 
Description: To measure threat to a geographic area, DHS used counts of 
data from seven variables--total of intelligence community reports, 
total of FBI investigations, total of DHS/Immigration and Customs 
Enforcement (ICE) investigations, total of suspicious incidents, total 
of ICE I-94 information for specific countries, total of international 
visitors from specific countries, and total of vessels from specific 
countries. Weights were assigned to each variable, then summed. 

Component: Vulnerability; 
Description: DHS used total of international visitors and miles of 
designated Waste Isolation Pilot Plant (WIPP) route to assess the 
vulnerability of a geographic area. Details on how the two were 
computed to achieve a measure of vulnerability for a given area were 
not specified. 

Component: Consequences; 
Description: DHS used three of the four categories used to assess 
consequences to asset types to assess the consequences to a geographic 
area. DHS did not factor economic consequences to urban areas in fiscal 
year 2006. According to a DHS official, it did not have a UASI-specific 
economic measure in fiscal year 2006, but has added it to the model for 
fiscal year 2007 using gross metropolitan product data. In fiscal year 
2006, DHS used various population types, population density, total of 
defense industrial base facilities, total of military installations, 
and total of large gatherings/special events to measure consequence. A 
description of how the variables were computed to achieve a measure of 
consequence for a given area was not specified. 

Source: GAO analysis of DHS data. 

[End of table] 

In calculating asset-based and geographic risk, DHS made a number of 
policy and analytic judgments because of uncertainties in estimating 
risk of terrorism. There are inherent uncertainties associated with 
estimating the risk of a terrorist attack, due to various factors such 
as limited information on actual attacks; limited information on goals, 
capabilities, and adaptability of terrorist groups; and differences in 
views about how to combine data about threat, vulnerability, and 
consequences into a risk methodology. Given uncertainties, policy and 
analytic judgments are required to inform the estimation process. For 
example, there are a number of judgments involved in estimating asset- 
based and geographic risk scores with various implications and 
limitations. Table 2 describes some of the judgments DHS made in 
estimating risk. 

Table 2: Judgments Used in Estimating Asset-Based and Geographic Risk 
in Fiscal Year 2006 and Potential Implications and Limitations: 

Asset-based risk. 

Judgment: Identifying critical assets--38 asset types for fiscal year 
2006; 
Potential Implications and Limitations: DHS assessed risk scores for 
generic types of assets, such as bridges. Alternatively, different risk 
models assess the threat and vulnerability of a specific asset, such as 
the Golden Gate Bridge, and factor in consequences from an attack to 
that specific asset. While determining which assets are critical can be 
a subjective judgment, there may also be a wide variance regarding the 
criticality of assets within a particular asset type. 

Judgment: Determining threat to assets from intelligence data on 
chatter, attractiveness of assets as targets, and strategic intent and 
capabilities of terrorist groups to attack assets; 
Potential Implications and Limitations: The capabilities of various 
terrorist groups are constantly changing, and there is no known method 
for predicting future motivations of adversaries. 

Judgment: Estimating vulnerability of assets to attack from internal 
subject matter experts who assigned values using various attack 
scenarios-- pairing of each of the 38 asset types to 14 attack 
scenarios for fiscal year 2006; 
Potential Implications and Limitations: DHS noted the limitation of 
this approach in determining vulnerability for generic asset types and 
would have liked to have conducted site visits for all assets instead 
of a sample for each asset type. Details about what information 
internal subject matter experts used to assign a value for 
vulnerability were not specified. DHS officials told us that using an 
ordinal value (1, 2, or 3) to measure vulnerability did not allow DHS 
to assess the differences in magnitude between the asset-scenario pairs 
with different values. Therefore, DHS used cardinal values (0-100) for 
fiscal year 2007 analyses. 

Judgment: Determining consequences of attack in terms of human health, 
economic, strategic mission, and psychological and assigning weights to 
each component; 
Potential Implications and Limitations: This method does not account 
for multiple or simultaneous attacks on assets because of a lack of 
data and DHS's inability to compute these scenarios. DHS officials 
stated the 
current model does not address simultaneous multiple attacks. 
DHS officials reported many challenges to modeling consequences from 
multiple or simultaneous attacks on assets, including answering 
modeling questions such as who should determine what combination of 
location or mode of attacks was most likely (e.g., a vehicle-borne 
improvised explosive device at a mall plus a suicide bomber at a 
federal building). Additionally, according to agency officials, even if 
DHS 
were able to select the most likely multi-attack, it is very difficult 
to estimate the potential interdependencies and consequences. DHS 
continues to devise a way to integrate these issues into its risk 
model. DHS acknowledges the uncertainty of consequence values used in 
the model, but does not know of available databases for consequence 
information for all asset- scenario pairs. However when data are 
available, DHS uses them, such as with its use of EPA's database of 
"worst-case" scenarios from chemical releases. 
DHS did conduct sensitivity analysis for the consequence weights and 
told us that risk results did not change much under different 
assumptions about weights and that this may have been due to the fact 
that the consequences were positively correlated. 

Geographic risk. 

Judgment: Geographic threat was based on information from the 
intelligence community, such as reports, and numbers of FBI 
investigations, ICE investigations, and ICE I-94 data; 
Potential Implications and Limitations: In general, we know very little 
about how DHS estimated geographic risk, and the judgments regarding 
what parameters were used in assessing threat, vulnerability, and 
consequence were not specified. RAND has indicated limitations in using 
simple indicators such as counts of data to assess risk, noting that 
there is no theoretical or empirical basis for deciding what counts 
should be included and in what proportion. 

Judgment: Vulnerability was assessed in relation to total international 
visitors and miles of designated WIPP routes. 

Judgment: Three types of consequences were assessed--human health, 
strategic mission, and psychological--and data on population and other 
factors were used. 

Source: GAO analysis of DHS data. 

[End of table] 

The results of the asset-based and geographic risk calculations were 
combined to determine a total risk score for a candidate area. 
Combining these scores involved (1) determining the values of 
parameters; (2) normalizing the values; (3) weighting factors, 0.5 for 
asset and 1.0 for geographic; and (4) summing the values. Before adding 
the two estimates of risk, DHS made a judgment to weight geographic 
risk twice as much as asset-based risk since the potential loss of 
lives within an area was factored into how geographic risk was 
calculated, according to DHS officials. In determining the appropriate 
weights, DHS reported that it conducted limited sensitivity analysis 
for the weights applied to the asset and geographic risk scores, but 
acknowledged that such analysis would have been a useful tool to help 
inform decision makers about eligible candidate areas. During our 
review, we conducted sensitivity analysis for the weights assigned to 
asset-based and geographic risk estimates, which took an analyst a few 
hours to complete. By varying the weights, DHS could identify the 
subset of candidate areas that fall in or out of the cutoff point for 
UASI grant eligibility, which could help justify the decision to 
designate 35 eligible urban areas. DHS has 
approached the National Infrastructure Simulation and Analysis Center 
to conduct work to identify sources of uncertainty, which could help 
better inform analytic judgments.[Footnote 9] 
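
The following sketch illustrates the type of weight sensitivity 
analysis discussed above: recomputing total risk under alternative 
asset weights and observing which candidate areas move in or out of an 
85-percent-of-total-risk cut point. All scores are invented, and DHS's 
actual normalization and cut-point rule may differ. 

# Illustrative sketch: vary the asset weight and report which hypothetical
# candidate areas enter or leave the 85-percent-of-total-risk eligible set.

def eligible_set(asset_scores, geo_scores, asset_weight, coverage=0.85):
    totals = {a: asset_weight * asset_scores[a] + 1.0 * geo_scores[a] for a in asset_scores}
    ranked = sorted(totals, key=totals.get, reverse=True)
    target, running, selected = coverage * sum(totals.values()), 0.0, []
    for area in ranked:            # take highest-risk areas until the cumulative
        selected.append(area)      # share of total risk reaches 85 percent
        running += totals[area]
        if running >= target:
            break
    return set(selected)

asset_scores = {"Area A": 95, "Area B": 10, "Area C": 80, "Area D": 15, "Area E": 70}
geo_scores = {"Area A": 60, "Area B": 70, "Area C": 20, "Area D": 55, "Area E": 10}

baseline = eligible_set(asset_scores, geo_scores, asset_weight=0.5)
for weight in (0.25, 0.75, 1.0):
    alternative = eligible_set(asset_scores, geo_scores, asset_weight=weight)
    print(f"Asset weight {weight}: gained {sorted(alternative - baseline)}, "
          f"lost {sorted(baseline - alternative)}")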

DHS used essentially the same risk assessment methods for fiscal year 
2007. According to DHS officials, the most significant change to the 
model is in how threat was assessed. In fiscal year 2006, DHS used 
counts of data from the intelligence community to estimate threat to 
asset types and geographic areas. In fiscal year 2007, DHS's Homeland 
Infrastructure Threat and Risk Analysis Center--a joint unit of the 
Office of Intelligence and Analysis and the Office of Infrastructure 
Protection--will assess current and trend threat data to assign a 
single threat value for each asset type and geographic area using a 
tiered approach. Other changes were in response to stakeholder 
feedback. For example, DHS expanded the number of asset types in its 
assessment from 38 in fiscal year 2006 to 47 in fiscal year 2007, based 
on the feedback provided to DHS from users, such as states and local 
governments. 

Identifying Eligible Urban Areas: 

In applying its risk assessment to determine the urban areas that were 
eligible to receive UASI grants, DHS first had to determine the 
geographic boundaries or footprint of candidate urban areas within 
which data were collected to estimate risk. Table 3 identifies the 
footprints for eligible urban areas in fiscal year 2006. It used data 
from various sources to calculate risk scores; the sources included 
information from federal agencies; proprietary data on assets; and 
intelligence data on threats, suspicious incidents, and other 
indicators of threats. Appendix III further describes the data sources 
used by DHS to assess risk. 

On the basis of comments from state and local governments, DHS chose to 
redefine the footprint for fiscal year 2006. DHS took several steps to 
identify this footprint (a simplified sketch of these steps follows the 
list below); these included: 

* Identifying areas with population greater than 100,000 persons and 
areas (cities) that had any reported threat data during the past year. 
For fiscal year 2006, DHS started with a total of 266 cities. 

* Combining cities or adjacent urban counties with shared boundaries to 
form single jurisdictions. For fiscal year 2006, this resulted in 172 
urban areas. 

* Drawing a buffer zone around identified areas. A 10-mile buffer was 
then drawn from the border of that city/combined entity to establish 
candidate urban areas.[Footnote 10] This area was used to determine 
what information was used in the risk analysis, and represents the 
minimum area that had to be part of the state or urban area's defined 
grant application area. 
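
The footprint steps above can be sketched with standard geospatial 
tools. The example below assumes the shapely library and city 
geometries in a projected coordinate system measured in miles; the city 
names, shapes, populations, and threat flags are invented. 

# Simplified sketch of the footprint steps; all inputs are invented.
from shapely.geometry import Point
from shapely.ops import unary_union

cities = {
    # name: (geometry, population, had_reported_threat_data)
    "City A": (Point(0, 0).buffer(5), 650_000, True),    # circular stand-in shapes
    "City B": (Point(8, 1).buffer(4), 120_000, False),
    "City C": (Point(40, 40).buffer(3), 60_000, False),  # too small, no threat data
}

# Step 1: keep cities with population over 100,000 or any reported threat data.
qualifying = [geom for geom, pop, threat in cities.values() if pop > 100_000 or threat]

# Step 2: merge contiguous (here, overlapping) qualifying cities into one area.
merged = unary_union(qualifying)

# Step 3: draw a 10-mile buffer around the merged area to form the footprint.
footprint = merged.buffer(10)
print(f"Candidate urban area footprint (square miles): {footprint.area:,.0f}")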

In fiscal year 2005, the footprint was limited to city boundaries (and 
did not include the 10-mile buffer zone). According to DHS, for fiscal 
year 2006, it considered other alternatives such as a radius from a 
city center, although such a solution created apparent inequities among 
urban areas. DHS incorporated buffer zones at the suggestion of 
stakeholders, although this action resulted in making the analysis more 
difficult, according to a DHS official. In addition, DHS officials told 
us the steps taken to determine the footprint were based on the "best 
fit," as compared with other alternatives. DHS did not provide details 
on what criteria this comparison was based on. 

Table 3: Footprint of Urban Areas Eligible for UASI Grants in Fiscal 
Year 2006: 

State: AZ; 
Eligible urban area: Phoenix Area[A]; 
Geographic area captured in the data count: Chandler, Gilbert, 
Glendale, Mesa, Peoria, Phoenix, Scottsdale, Tempe, and a 10-mile 
buffer extending from the border of the combined area; 
Previously designated urban areas included: Phoenix, AZ. 

State: CA; 
Eligible urban area: Anaheim/Santa Ana Area; 
Geographic area captured in the data count: Anaheim, Costa Mesa, Garden 
Grove, Fullerton, Huntington Beach, Irvine, Orange, Santa Ana, and a 10-
mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Anaheim, CA; Santa Ana, CA. 

State: CA; 
Eligible urban area: Bay Area; 
Geographic area captured in the data count: Berkeley, Daly City, 
Fremont, Hayward, Oakland, Palo Alto, Richmond, San Francisco, San 
Jose, Santa Clara, Sunnyvale, Vallejo, and a 10-mile buffer extending 
from the border of the combined area; 
Previously designated urban areas included: San Francisco, CA; San 
Jose, CA; Oakland, CA. 

State: CA; 
Eligible urban area: Los Angeles/Long Beach Area; 
Geographic area captured in the data count: Burbank, Glendale, 
Inglewood, Long Beach, Los Angeles, Pasadena, Santa Monica, Santa 
Clarita, Torrance, Simi Valley, Thousand Oaks, and a 10-mile buffer 
extending from the border of the combined area; 
Previously designated urban areas included: Los Angeles, CA; Long 
Beach, CA. 

State: CA; 
Eligible urban area: Sacramento Area[A]; 
Geographic area captured in the data count: Elk Grove, Sacramento, and 
a 10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Sacramento, CA. 

State: CA; 
Eligible urban area: San Diego Area[A]; 
Geographic area captured in the data count: Chula Vista, Escondido, and 
San Diego, and a 10-mile buffer extending from the border of the 
combined area; 
Previously designated urban areas included: San Diego, CA. 

State: CO; 
Eligible urban area: Denver Area; 
Geographic area captured in the data count: Arvada, Aurora, Denver, 
Lakewood, Westminster, Thornton, and a 10-mile buffer extending from 
the border of the combined area; 
Previously designated urban areas included: Denver, CO. 

State: DC; 
Eligible urban area: National Capital Region; 
Geographic area captured in the data count: National Capital Region and 
a 10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: National Capital Region, 
DC. 

State: FL; 
Eligible urban area: Fort Lauderdale Area; 
Geographic area captured in the data count: Fort Lauderdale, Hollywood, 
Miami Gardens, Miramar, Pembroke Pines, and a 10-mile buffer extending 
from the border of the combined area; 
Previously designated urban areas included: N/A. 

State: FL; 
Eligible urban area: Jacksonville Area; 
Geographic area captured in the data count: Jacksonville and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Jacksonville, FL. 

State: FL; 
Eligible urban area: Miami Area; 
Geographic area captured in the data count: Hialeah, Miami, and a 10-
mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Miami, FL. 

State: FL; 
Eligible urban area: Orlando Area; 
Geographic area captured in the data count: Orlando and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Orlando, FL. 

State: FL; 
Eligible urban area: Tampa Area[A]; 
Geographic area captured in the data count: Clearwater, St. Petersburg, 
Tampa, and a 10-mile buffer extending from the border of the combined 
area; 
Previously designated urban areas included: Tampa, FL. 

State: GA; 
Eligible urban area: Atlanta Area; 
Geographic area captured in the data count: Atlanta and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Atlanta, GA. 

State: HI; 
Eligible urban area: Honolulu Area; 
Geographic area captured in the data count: Honolulu and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Honolulu, HI. 

State: IL; 
Eligible urban area: Chicago Area; 
Geographic area captured in the data count: Chicago and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Chicago, IL. 

State: IN; 
Eligible urban area: Indianapolis Area; 
Geographic area captured in the data count: Indianapolis and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Indianapolis, IN. 

State: KY; 
Eligible urban area: Louisville Area[A]; 
Geographic area captured in the data count: Louisville and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Louisville, KY. 

State: LA; 
Eligible urban area: Baton Rouge Area[A]; 
Geographic area captured in the data count: Baton Rouge and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Baton Rouge, LA. 

State: LA; 
Eligible urban area: New Orleans Area; 
Geographic area captured in the data count: New Orleans and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: New Orleans, LA. 

State: MA; 
Eligible urban area: Boston Area; 
Geographic area captured in the data count: Boston, Cambridge, and a 10-
mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Boston, MA. 

State: MD; 
Eligible urban area: Baltimore Area; 
Geographic area captured in the data count: Baltimore and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Baltimore, MD. 

State: MI; 
Eligible urban area: Detroit Area; 
Geographic area captured in the data count: Detroit, Sterling Heights, 
Warren, and a 10-mile buffer extending from the border of the combined 
area; 
Previously designated urban areas included: Detroit, MI. 

State: MN; 
Eligible urban area: Twin Cities Area; 
Geographic area captured in the data count: Minneapolis, St. Paul, and 
a 10-mile buffer extending from the border of the combined entity; 
Previously designated urban areas included: Minneapolis, MN; St. Paul, 
MN. 

State: MO; 
Eligible urban area: Kansas City Area; 
Geographic area captured in the data count: Independence, Kansas City 
(MO), Kansas City (KS), Olathe, Overland Park, and a 10-mile buffer 
extending from the border of the combined area; 
Previously designated urban areas included: Kansas City, MO. 

State: MO; 
Eligible urban area: St. Louis Area; 
Geographic area captured in the data count: St. Louis and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: St. Louis, MO. 

State: NC; 
Eligible urban area: Charlotte Area; 
Geographic area captured in the data count: Charlotte and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Charlotte, NC. 

State: NE; 
Eligible urban area: Omaha Area[A]; 
Geographic area captured in the data count: Omaha and a 10-mile buffer 
extending from the city border; 
Previously designated urban areas included: Omaha, NE. 

State: NJ; 
Eligible urban area: Jersey City/Newark Area; 
Geographic area captured in the data count: Elizabeth, Jersey City, 
Newark, and a 10-mile buffer extending from the border of the combined 
area; 
Previously designated urban areas included: Jersey City, NJ; Newark, 
NJ. 

State: NV; 
Eligible urban area: Las Vegas Area[A]; 
Geographic area captured in the data count: Las Vegas, North Las Vegas, 
and a 10-mile buffer extending from the border of the combined entity; 
Previously designated urban areas included: Las Vegas, NV. 

State: NY; 
Eligible urban area: Buffalo Area[A]; 
Geographic area captured in the data count: Buffalo and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Buffalo, NY. 

State: NY; 
Eligible urban area: New York City Area; 
Geographic area captured in the data count: New York City, Yonkers, and 
a 10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: New York, NY. 

State: OH; 
Eligible urban area: Cincinnati Area; 
Geographic area captured in the data count: Cincinnati and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Cincinnati, OH. 

State: OH; 
Eligible urban area: Cleveland Area; 
Geographic area captured in the data count: Cleveland and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Cleveland, OH. 

State: OH; 
Eligible urban area: Columbus Area; 
Geographic area captured in the data count: Columbus and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Columbus, OH. 

State: OH; 
Eligible urban area: Toledo Area[A]; 
Geographic area captured in the data count: Oregon, Toledo, and a 10-
mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Toledo, OH. 

State: OK; 
Eligible urban area: Oklahoma City Area[A]; 
Geographic area captured in the data count: Norman, Oklahoma City, and 
a 10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Oklahoma City, OK. 

State: OR; 
Eligible urban area: Portland Area; 
Geographic area captured in the data count: Portland, Vancouver, and a 
10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Portland, OR. 

State: PA; 
Eligible urban area: Philadelphia Area; 
Geographic area captured in the data count: Philadelphia and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Philadelphia, PA. 

State: PA; 
Eligible urban area: Pittsburgh Area; 
Geographic area captured in the data count: Pittsburgh and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Pittsburgh, PA. 

State: TN; 
Eligible urban area: Memphis Area; 
Geographic area captured in the data count: Memphis and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Memphis, TN. 

State: TX; 
Eligible urban area: Dallas/Fort Worth/Arlington Area; 
Geographic area captured in the data count: Arlington, Carrollton, 
Dallas, Fort Worth, Garland, Grand Prairie, Irving, Mesquite, Plano, 
and a 10-mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Dallas, TX; Fort Worth, TX; 
Arlington, TX. 

State: TX; 
Eligible urban area: Houston Area; 
Geographic area captured in the data count: Houston, Pasadena, and a 10-
mile buffer extending from the border of the combined entity; 
Previously designated urban areas included: Houston, TX. 

State: TX; 
Eligible urban area: San Antonio Area; 
Geographic area captured in the data count: San Antonio and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: San Antonio, TX. 

State: WA; 
Eligible urban area: Seattle Area; 
Geographic area captured in the data count: Seattle, Bellevue, and a 10-
mile buffer extending from the border of the combined area; 
Previously designated urban areas included: Seattle, WA. 

State: WI; 
Eligible urban area: Milwaukee Area; 
Geographic area captured in the data count: Milwaukee and a 10-mile 
buffer extending from the city border; 
Previously designated urban areas included: Milwaukee, WI. 

Source: DHS. 

[A] Sustainment area: an urban area that received UASI funding in 
fiscal year 2005, but was not deemed eligible to apply through the 
fiscal year 2006 risk assessment. However, DHS extended eligibility to 
these areas for one additional grant cycle. 

[End of table] 

On the basis of the risk assessment and a policy decision, DHS 
determined which urban areas were eligible to apply for UASI grants in 
fiscal year 2006. DHS estimated risk for 172 urban areas but, in 
determining eligible urban areas, limited its analysis to 90 candidate 
areas that met a 200,000 population threshold or had reports of credible 
threats. DHS then combined the two risk assessment scores for each of the 
90 candidate urban areas to produce a total relative risk score. These 
relative risk scores were plotted in rank order, and a cutoff point was 
then selected that determined the number of urban areas eligible to apply 
for grants in fiscal year 2006 and defined the nation's most at-risk areas 
(also see appendix I, page 20). According to DHS 
officials, the Secretary of Homeland Security selected a cut point that 
resulted in 35 urban areas, which accounted for 85 percent of total 
estimated risk. A senior DHS official also told us that decision makers 
may bring other sensitive information--outside the risk model--to the 
table, but exactly what that information was or what priority that 
information held over other DHS goals was unclear. Further, DHS also 
extended eligibility to 11 sustainment areas--urban areas that 
participated in the program in fiscal year 2005, but were not 
identified as eligible through the risk analysis process in fiscal year 
2006. This policy decision was made in order to foster long-term 
planning for program participants across fiscal years. According to 
DHS, any urban area not identified as eligible through the risk 
analysis process for two consecutive years will not be eligible for 
continued funding under the UASI program, but will continue to be 
eligible to receive funding from other DHS programs. 
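
To illustrate the mechanics of such a cut point, the following simplified 
sketch (written in Python, with hypothetical area names and scores) ranks 
candidate areas by relative risk and keeps them until a target share of 
total estimated risk, such as the 85 percent figure cited by DHS officials, 
is reached. It illustrates the general approach described above, not DHS's 
actual computation or data. 

# Illustrative sketch only; area names, scores, and the selection rule
# (keeping areas until a target share of risk is reached) are assumptions
# based on the description above.
def select_eligible_areas(relative_risk_scores, target_share=0.85):
    """Rank areas by relative risk and keep them until their cumulative
    share of total estimated risk reaches target_share."""
    total = sum(relative_risk_scores.values())
    ranked = sorted(relative_risk_scores.items(),
                    key=lambda item: item[1], reverse=True)
    eligible, cumulative = [], 0.0
    for area, score in ranked:
        eligible.append(area)
        cumulative += score
        if cumulative / total >= target_share:
            break  # cut point reached; remaining areas fall below the line
    return eligible

# Hypothetical example with three candidate areas.
scores = {"Area A": 60.0, "Area B": 30.0, "Area C": 10.0}
print(select_eligible_areas(scores))  # ['Area A', 'Area B']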

DHS officials did not know the extent to which, if at all, the change 
in the definition of the footprint area between fiscal years 2005 and 
2006 influenced estimates of risk. According to DHS officials, it would 
be very difficult to pinpoint the source of changes in risk analysis 
outcomes in fiscal year 2006, since there were changes made to the 
urban area's footprint, the structure of the model, and the data inputs 
(e.g., new annual threat data for geographic risk). However, DHS 
officials believe that the change in footprint in 2006 was associated 
with changes in the relative risk of many urban areas. For example, by 
defining the footprint to extend beyond city limits, the fiscal year 2006 
risk assessment captured additional information, such as a nuclear power 
plant outside a city boundary or a suburban population, that had not been 
accounted for in several urban areas in prior years. As a 
consequence of the change in the footprint, DHS officials concluded 
that the relative risk of New York City and the National Capital Region 
declined compared to those of other urban areas. DHS could not 
determine how much of the decline was due only to the change in the 
footprint versus other components of the risk methodology that changed. 
While, as of November 2006, DHS expected to use the same definition for 
an urban area footprint for fiscal year 2007, it has yet to determine 
how eligibility for UASI funding will be decided. 

Appendix III: Data Sources Used in DHS's Fiscal Year 2006 Risk 
Analysis: 

In assessing risk for the UASI grant determination process in fiscal 
year 2006, DHS applied 57 types of data variables from sources such as 
(1) federal agencies; (2) state, territory, and local stakeholders; (3) 
private proprietary data; as well as (4) data compiled by DHS. Some 
data variables were populated from a combination of sources. Data 
variables from DHS and other federal government data sources made up 36 
asset-based and geographic data variables. Private proprietary data 
sources comprised 22 asset-based and geographic variables, of which, 7 
variables were constituted exclusively with data from private 
proprietary sources. (See table 4 for details.) DHS officials told us 
that the National Asset Database (NADB) was not a data source for risk 
analysis since the database is not populated with relevant attributes. 
Our review of the list and sources of variables for the risk methodology 
that DHS provided us also revealed that the NADB did not appear among the 
sources. 

DHS considered all data obtained for the risk model from the sources 
identified as reliable for the purposes of estimating risk, although 
DHS did not systematically test the reliability of the data used. This 
includes the intelligence data, which DHS officials acknowledged they had 
accepted as provided by the source agencies. DHS considered these data to be 
valid and reliable in the sense that DHS believed they appropriately 
measure the risk constructs for which they are collected (i.e., the 
data have face validity). According to DHS officials, to identify any 
data-related problems such as the validity of data used and any 
duplicative values, DHS made over 100 analytical runs of the fiscal 
year 2006 risk assessment model. These analyses revealed errors created 
by using buffer zones, which resulted in some individual assets being 
attributed to more than one urban area. 
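
The kind of double counting described above can be illustrated with a 
simple check. The sketch below (in Python, with hypothetical asset 
identifiers and area assignments) flags any asset attributed to more than 
one urban area's buffered footprint; it illustrates the data-quality 
problem DHS's analytical runs surfaced, not DHS's own procedure. 

# Illustrative sketch; asset identifiers and area assignments are hypothetical.
def find_double_counted_assets(asset_to_areas):
    """Return assets attributed to more than one urban area's buffered
    footprint."""
    return {asset: areas for asset, areas in asset_to_areas.items()
            if len(areas) > 1}

# Hypothetical example: one asset falls within the buffers of two areas.
assignments = {
    "asset_001": ["Baltimore Area", "National Capital Region"],
    "asset_002": ["Philadelphia Area"],
}
print(find_double_counted_assets(assignments))
# {'asset_001': ['Baltimore Area', 'National Capital Region']}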

Most of the data used by DHS in fiscal year 2006 were timely and 
appeared reliable. Our review of the data sources contained in table 4 
showed most of the data sources to be less than 2 years old, and most 
sources of data were from 2005 or 2006. All data supplied by private 
proprietary sources were less than 2 years old. Data from federal 
sources on some asset-based variables were from 2002 or 2001. 

Given the time constraints of this review, we performed a limited test of 
the reliability of the data sources. Drawing on prior or ongoing GAO work, 
we attempted to determine whether any reliability assessments had been 
conducted on any of the data sets DHS obtained from proprietary sources 
and, if so, what the results were. To comply with GAO policies, we review 
the reliability of data whenever our work relies on sources other than GAO-
generated data to reach conclusions. Of the 57 data variables, we 
identified five data sources that were used in past GAO work and found 
that one of the sources had been questioned by GAO analysts, although our 
past work was not directly related to the specific type of data provided 
to DHS. Specifically, the provider of DHS's data for the transoceanic 
cable landings asset type did not meet GAO's reliability standards, as our 
past work found internal control problems, such as the absence of 
mechanisms for the data providers to perform verifications. 

For fiscal year 2007, DHS reported that it would apply 69 types of asset-
based and geographic data variables from these sources. Of these 69 
variables, 38 were populated exclusively with data from a single source, 
and 24 asset-based variables were refined in fiscal year 2007 by adding 
data sources. Also, as we discussed in appendix 
II, DHS's Office of Intelligence and Analysis performed threat reviews 
and provided the Office of Grants and Training with a single threat 
value for each urban area and asset type. This is in contrast to fiscal 
year 2006, when DHS used total counts of threats and suspicious 
incidents. Data supplied directly by state and local governments for 
fiscal year 2007 analyses were current as of August 2006, except where 
otherwise noted. 

Table 4: DHS Sources of Data Used in UASI Risk Analysis Model: 

[See PDF for Table] 

Source: GAO analysis of DHS data. 

Note: DHS provided us information on the sources of data used in the 
risk model on November 8, 2006 and, at the time of our review, 
indicated that the list for fiscal year 2007 was subject to change. In 
addition, the fiscal year 2007 data used have a publication date of 
2006, unless otherwise noted. 

[A] Data sources with a publication date of prior to 2006. 

[B] Data sources with a publication date of either 2001 or 2002. 

[C] Data sources with a publication date not specified. 

[D] Denotes that publication dates for the variable were not provided 
for all sources. 

[E] DHS considered the variables used in the fiscal year 2006 risk 
model to assign a threat value between 0 and 1 for fiscal year 2007. 

[End of Table] 

Appendix IV: DHS's Approach to Assessing Effectiveness for Fiscal Year 
2006: 

Fiscal year 2006 marked the first time that eligible urban areas 
completed and submitted an investment justification to formally request 
UASI funding, which DHS used to assess the anticipated effectiveness of 
the risk mitigation investments urban areas proposed. The investment 
justification included up to 15 "investments" or proposed solutions to 
address homeland security needs identified by the states and urban 
areas through their strategic planning process. DHS and the states 
collaborated to identify and select peer reviewers who evaluated, 
discussed, and scored the investment justifications submitted by the 46 
eligible urban areas. Reviewers on each of the 17 panels assigned 
scores for six investment justifications, which according to DHS 
officials were averaged to determine a final effectiveness score for 
each urban area. 

Purpose and Goals of the Effectiveness Assessment: 

Given the uncertainties in estimating terrorism risk, DHS introduced 
the effectiveness assessment as an additional tool to inform DHS 
leaders when making allocation decisions. Specifically, the investment 
justifications allowed DHS to consider how the eligible urban areas 
planned to spend the grant money. While one identified goal of the UASI 
program is to address the needs of high-threat, high-density urban 
areas, DHS officials determined that it would be more useful for urban 
areas to suggest solutions for how to meet their self-identified needs 
within the investment justifications. In addition, DHS officials told 
us the emphasis on effectiveness was meant to avoid the potential bias 
that could have occurred from self-reported needs. The Interim National 
Preparedness Goal, which DHS described as a common planning framework 
to better understand preparedness levels, shape priorities, and focus 
expenditures, was in place for the first time for the fiscal year 2006 
grant cycle. DHS reported that designing funding criteria that 
incorporated both risk and effectiveness was done to ensure that urban 
areas' expenditures were in alignment with the national priorities 
established by the Interim National Preparedness Goal. In particular, 
for the new effectiveness assessment the states and urban areas 
requested fiscal year 2006 HSGP funding by submitting applications in 
support of their Homeland Security Strategies and related program 
planning documents.[Footnote 11] In addition, according to DHS the new 
effectiveness assessment added a degree of competition to the grant 
determination process, which was a change from fiscal year 2005, when 
urban areas did not have to justify their planned use of the grant 
before they received the funding.[Footnote 12] 

Instead of DHS determining the effectiveness of the urban areas' 
applications, it decided to use peer reviewers, who were homeland 
security professionals and managers from various fields, to make this 
assessment. DHS reported that involving subject matter experts from 
federal, state, and local government agencies was done to ensure a fair 
and equitable peer review process. To learn best practices for 
distributing competitive grants and conducting peer reviews, DHS met 
with officials who run other competitive grant programs (Assistance to 
Firefighter Grants, Transit Security Grant Program, and the National 
Science Foundation). According to DHS, this approach to evaluating 
anticipated effectiveness seeks to recognize applicants for proposing 
relevant, innovative, and reasonable investments that will directly 
affect our nation's preparedness. 

Preparing the Investment Justification: 

DHS assessed effectiveness only for the applications submitted by the 
46 eligible urban areas. Aside from the 11 sustainment areas, DHS 
stated that it did not allow areas that fell below the risk cut point 
to apply for a UASI grant because it did not want to set false 
expectations and create excessive work for candidate areas that were 
not going to receive funding. DHS provided states and urban areas with 
guidance that included instructions on completing the investment 
justification, the criteria peer reviewers would use to score the 
investment justifications, and an overview of how risk and 
effectiveness scores would be used to determine UASI allocations. DHS 
allowed each urban area to propose up to 15 investments, and for each 
investment, applicants were required to answer a total of 17 detailed 
questions across four sections: background, regionalization, impact, 
and funding and implementation plan. DHS instructed urban areas to 
build investments that supported their state's Enhancement Plan, a 
program management plan to help states identify strengths and weaknesses 
within their homeland security programs and capabilities.[Footnote 13] 
DHS cited this guidance as an example of how it encouraged states and 
urban areas to utilize the results of strategic planning efforts. DHS 
reported it was still determining how it would use the risk and 
effectiveness scores when allocating the UASI grants, and had not yet 
determined what weights would be applied to the risk and effectiveness 
scores at the time urban areas were completing their applications. 
Therefore, applicants did not know how much the effectiveness 
assessment would influence their grant amount. At the time of our 
review, DHS had not announced whether applicants would be provided with 
this information prior to submitting their fiscal year 
2007 applications. In addition, in fiscal year 2006, applicants did not 
have access to the outcomes of the risk analysis or to specific threats 
to assets or their area for consideration when preparing the investment 
justifications. 

Forming the Peer Review Panels: 

DHS engaged the states and territories in identifying and selecting the 
peer reviewers that would evaluate the investment justifications for 
their anticipated effectiveness. DHS provided some guidelines on what 
state officials should consider when nominating peer reviewers, and 
requested information, such as professional experience, about those 
nominated. DHS compiled a list of eligible peer reviewers from 
nominations made by the states and territories, and made its 
recommendations to the states based on the following high-level 
criteria: 

* the extent of the nominees' familiarity across multiple homeland 
security disciplines and their length of tenure, 

* the nominees' demonstrated experience managing an integrated homeland 
security program or initiative, 

* the nominees' familiarity with the HSGP (which was considered a 
benefit, but not required), and: 

* whether the nominees represented the State Administrative 
Agency (if so, these nominees were prioritized). 

DHS allowed the State Administrative Agencies to make the final 
selection of peer reviewers from their state or territory, who included 
homeland security professionals and managers from a variety of 
disciplines, such as officials from law enforcement, fire service, 
emergency management, state homeland security, and public health. DHS 
arranged 17 panels to include one facilitator, one note-taker, and up 
to seven peer reviewers, representing states or territories, urban 
areas, and federal agencies. DHS reported arranging the panels to 
ensure a diverse mix of backgrounds and experience, and to avoid 
potential conflicts of interest by: 

* including representatives from the eastern, western, and central 
geographic regions, and from large- and small-population states; 

* preventing reviewers from scoring their own state, territory, or 
urban area; and: 

* avoiding reviewers scoring neighboring states, territories, or urban 
areas, where possible. 

Overall, in DHS's view, the peer review panel process mitigated 
potential bias by requiring panelists to engage in discussion, justify 
their scores, and consider multiple perspectives. 

Reviewing and Scoring the Investment Justifications: 

When scoring an urban area's investment justification, peer reviewers 
conducted an individual assessment of the applications, and 
subsequently discussed scoring in peer review panels during a week-long 
conference. Each peer reviewer was responsible for reviewing six 
investment justifications, which included roughly 60 investments, on an 
individual basis over the course of 2 � weeks, and then submitted their 
scores, along with explanatory comments, to DHS. Specific scoring 
criteria were developed by DHS for the peer reviewers to use and were 
provided to states and urban areas about a month prior to the March 2, 
2006, HSGP application deadline. To score each individual investment, 
the reviewers evaluated the responses to the 17 questions, comparing 
them to detailed criteria and the state's Enhancement Plan to ensure 
the proposed investments were in alignment. Peer reviewers also scored 
the overall submission, so DHS provided the peer reviewers with 
criteria to consider the investment justification as a whole. By 
scoring the investment justification as a whole, DHS sought to reward 
innovative, forward-leaning approaches. The scoring criteria are 
summarized in table 5. 

Table 5: Factors Peer Reviewers Considered When Scoring Investment 
Justifications in Fiscal Year 2006: 

Individual Investment. 

Section description: Background: Applicants were asked to summarize the 
investment, its purpose, and how it will support the Enhancement Plan, 
state/urban area homeland security strategies, and national priorities 
and target capabilities. Includes four questions with multiple criteria 
for each question (a total of 14 criteria for the section); 
Examples: Question: Provide a summary description of this Investment 
and its purpose; Criteria: 
* Articulates clear end result of using fiscal year 2006 HSGP funds; 
* Explains how outcomes relate to the purpose. 

Section description: Regionalization: Applicants were asked to describe 
the investment's demographic and geographic area, and the urban area's 
plans for regional collaboration, stakeholder engagement, and an 
implementation approach to support the investment. Includes three 
questions with multiple criteria for each question (a total of 14 
criteria for the section); 
Examples: Question: Explain how the state/urban area is organizing to 
implement this Investment over the identified geographic area(s); 
Criteria: 
* Discusses regional partnerships; 
* Discusses mitigating duplication of effort. 

Section description: Impact: Applicants were asked to describe 
anticipated impacts of the investment, how requested funds will help 
achieve the impacts, how the investment will decrease or mitigate risk, 
and what the potential risks of not funding the investment would be. 
Includes three questions with multiple criteria for each question (a 
total of 11 criteria for the section); 
Examples: Question: Discuss how the implementation of this Investment 
will decrease or mitigate risk; Criteria: 
* Targets specific consequences, vulnerabilities, and threats; 
* Provides a rationale of choices. 

Section description: Funding and implementation plan: 
Applicants were asked to describe the investment's funding plan; 
describe the planned implementation and oversight approach of the 
management team, provide an implementation timeline with milestones, 
identify potential challenges to effective implementation and how they 
will be addressed and mitigated, and describe the planned duration and 
long-term sustainability plans of the investment after fiscal year 2006 HSGP 
funds are expended. Includes seven questions with multiple criteria for 
each question (a total of 27 criteria for the section); 
Examples: Question: Identify potential challenges to the effective 
implementation of this Investment (e.g., stakeholder buy-in, 
sustainability, aggressive timelines); Criteria: 
* Describes necessary steps required for successful implementation and 
describes potential challenges; 
* Explains why the identified implementation challenges are challenges 
to this Investment. 

Overall investment justification submission. 

Criteria: 
* Overall relevance to implementation of the Interim National 
Preparedness Goal; 
* Connection to both the spirit and scope of the Enhancement Plan; 
* Extent to which the individual investments relate to each other to 
portray a complete picture of plans for the homeland security program; 
* Innovativeness of the proposed solutions to address needs; 
* Overall feasibility and reasonableness of proposed solutions. 

Source: GAO analysis of DHS documents. 

[End of table] 

After peer reviewers submitted preliminary scores based on their 
individual review, DHS identified the questions that received the 
greatest range of scores. Then the reviewers participated in a week-long 
conference, where the panels of peer reviewers discussed and 
scored each individual investment and the investment justification 
submission as a whole. Each panel had a facilitator, whose role 
according to DHS was to focus the discussions on those questions that 
received the greatest range of scores, ensure that the scoring criteria 
were consistently applied, and to help the panel develop feedback for 
the states, territories, and urban areas. In addition, subject matter 
experts were on call to answer questions that arose. During the 
conference, peer reviewers could revise their initial scores if 
desired, based on panel discussions. DHS computed the final scoring of 
the investments and the whole investment justification submission and 
then combined them to determine an overall effectiveness score. 
Specifically, peer reviewers provided scores for each of the 
investments in their assigned investment justifications based on 
evaluation criteria, and DHS told us it averaged the reviewers' scores 
together for each urban area.[Footnote 14] DHS reported it selected the 
median score as the final total "investment score" for each urban area. 
In addition, according to DHS, peer reviewers provided a score for each 
overall investment justification submission they reviewed, and the 
panels discussed these scores and determined a final "overall 
investment justification" score for each urban area. DHS told us it 
decided to give these two scores equal weights--0.5 to the total of 
investment scores, and 0.5 to the overall investment justification 
score--and averaged them together to determine one final effectiveness 
score. While officials told us they discussed alternative weights, they 
did not have any data to indicate that they would be more appropriate 
than those chosen. 
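
The final combination step can be illustrated as follows. The sketch below 
(in Python, with hypothetical scores) takes the median of an urban area's 
panel-averaged investment scores as the total investment score and averages 
it, with the equal 0.5 weights DHS described, against the overall investment 
justification score. The scores are invented and the computation is a 
simplified reading of DHS's description, not its actual calculation. 

# Simplified sketch of the scoring combination described above; the inputs
# are hypothetical and the logic is an assumption based on the report.
import statistics

def final_effectiveness_score(investment_scores, overall_ij_score,
                              w_investments=0.5, w_overall=0.5):
    """investment_scores: panel-averaged scores for each of an urban area's
    investments; overall_ij_score: the panel's score for the submission as
    a whole."""
    total_investment_score = statistics.median(investment_scores)
    return (w_investments * total_investment_score
            + w_overall * overall_ij_score)

# Hypothetical urban area with three investments and an overall score of 80.
print(final_effectiveness_score([70, 85, 90], 80))  # 0.5*85 + 0.5*80 = 82.5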

At the end of the panel conference, DHS used a survey to gather feedback; 
80 percent of the 102 peer reviewers responded and provided comments. The 
following are some of the preliminary survey results that DHS reported: 

* Eighty-three percent of survey respondents agreed or strongly agreed 
that the fiscal year 2006 HSGP peer review resulted in objective, 
consistent, and defensible scores and feedback. 

* Ninety-six percent of respondents agreed or strongly agreed that each 
panel included balanced representation from different regions, 
disciplines, and backgrounds. 

* Sixty-nine percent of respondents disagreed or strongly disagreed 
that the level of effort necessary for the review process was clearly 
communicated, and 78 percent disagreed or strongly disagreed that 
panelists were given sufficient time to review, score, and return 
scoring sheets to DHS prior to the panel convention. 

At the time of our review, DHS planned to continue to use a peer review 
process to assess effectiveness, but did not indicate whether it would 
be making changes to the process for fiscal year 2007. 

Appendix V: UASI Grant Allocation Approach for Fiscal Year 2006: 

In fiscal year 2006, DHS used a new method to determine the amounts of 
UASI grants to each of the 46 eligible urban areas, based primarily on 
the risk and effectiveness assessments, but final allocation decisions 
were made by the Secretary of Homeland Security. DHS reported that the 
aim of considering both factors--risk and effectiveness--is to allocate 
and apply HSGP resources to generate the highest return on investment 
and, as a result, to strengthen national preparedness. The risk and 
effectiveness scores did not automatically translate into funding 
amounts, but rather, according to DHS, the scores informed the 
decisions made by DHS officials. While all eligible urban areas that 
applied for UASI grants received funding, DHS set priorities to 
determine how much each urban area would receive. When making funding 
decisions, DHS prioritized those areas estimated to have the highest 
risk of a successful terrorist attack, while still rewarding those 
areas that offered effective ways to address homeland security needs. 
As a result, the risk assessment was given a greater weight than the 
effectiveness assessment when allocating funds. 

DHS Allocation Tool Used to Categorize Urban Areas and Set Priorities: 

DHS established funding priorities before making allocation decisions. 
For example, DHS officials told us the Secretary of Homeland Security 
selected an approach that considered both the risk and effectiveness 
assessments when making allocation decisions, rather than using the 
outcomes of only one of the assessments. This approach combined the two 
assessments by using a graphical tool--a two-by-two matrix--to create 
four categories that would be used to set funding priorities (Figure 4 
illustrates the two-by-two matrix used by DHS). The four funding 
categories were: Category I--higher risk, higher effectiveness; 
Category II--higher risk, lower effectiveness; Category III--lower 
risk, higher effectiveness; and Category IV--lower risk, lower 
effectiveness. 

To create these four categories, DHS made judgments that affected the 
category in which urban areas fell. For example, dividing lines were 
drawn on the horizontal axis for effectiveness scores and the vertical 
axis for risk scores to create the four categories. Specifically, DHS 
officials told us they calculated a "natural inflection point" among 
the risk rankings of the 46 eligible urban areas, thereby determining 
the dividing line on the risk axis. DHS reported that about a third of 
the urban areas were above the dividing line and therefore considered 
"higher risk," and about two-thirds were below the line and thus "lower 
risk." DHS officials told us they selected the median of the effectiveness 
scores as the midpoint on the horizontal axis; those areas to the right of 
this point were considered "higher effectiveness" and those to the left 
"lower effectiveness." Each of the 46 eligible urban areas was plotted 
into one of the four categories according to its combination of risk and 
effectiveness scores. 

Figure 4: DHS Allocation Tool Used in Fiscal Year 2006 UASI Funding: 

[See PDF for Image] 

Source: GAO analysis of DHS documents and information provided in 
interviews. 

[End of Figure] 
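
The categorization illustrated in figure 4 can also be sketched in code. 
The example below (in Python, with hypothetical areas, scores, and cutoff 
values) applies a risk cutoff and uses the median of the effectiveness 
scores as the effectiveness dividing line, as DHS officials described, to 
place each area into one of the four categories. 

# Illustrative sketch; area names, scores, and the risk cutoff are hypothetical.
import statistics

def categorize(areas, risk_cutoff):
    """areas: dict mapping urban area -> (risk score, effectiveness score)."""
    effectiveness_cutoff = statistics.median(e for _, e in areas.values())
    categories = {}
    for name, (risk, effectiveness) in areas.items():
        higher_risk = risk >= risk_cutoff
        higher_eff = effectiveness >= effectiveness_cutoff
        if higher_risk and higher_eff:
            categories[name] = "I"    # higher risk, higher effectiveness
        elif higher_risk:
            categories[name] = "II"   # higher risk, lower effectiveness
        elif higher_eff:
            categories[name] = "III"  # lower risk, higher effectiveness
        else:
            categories[name] = "IV"   # lower risk, lower effectiveness
    return categories

areas = {"Area A": (90, 80), "Area B": (85, 40),
         "Area C": (30, 75), "Area D": (20, 35)}
print(categorize(areas, risk_cutoff=60))
# {'Area A': 'I', 'Area B': 'II', 'Area C': 'III', 'Area D': 'IV'}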

Determining the Final Allocations: 

According to DHS, it considered many different distributions of funding 
to each of the four categories, and decided to give Category I the 
highest funding priority and Category IV the lowest funding priority. 
The figure above illustrates the funding priorities DHS reported making; 
each circle represents a hypothetical urban area, and the size of the 
circle corresponds to the relative amount of the grant award (i.e., a 
larger circle indicates a greater allocation amount). DHS conducted what 
it described as an optimization process to produce many possible options 
for the funding amounts allotted to each category. DHS told us that 
once the allotments to categories were decided, DHS used a formula to 
determine the grant award for each urban area. DHS stated that it 
decided to prioritize the outcomes of the risk analysis over the 
effectiveness assessment, and so it made the policy decision to give 
each urban area's risk score a weight of 2/3 and the effectiveness 
score a weight of 1/3 when calculating the formula. DHS officials did 
not indicate whether they considered other weights for the risk 
and effectiveness scores. DHS reported that some stakeholders expressed 
frustration that the effectiveness assessment was not assigned a 
greater weight, since the peer review process required considerable 
time and effort. As was previously described in appendix IV, at the 
time urban areas were completing their applications, DHS had not yet 
determined the weights that would be applied to the risk and 
effectiveness scores. DHS officials expect that risk and effectiveness 
scores will both factor into allocation decisions for fiscal year 2007, 
but they do not currently know whether the weights given to risk and 
effectiveness will change. 
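
As a simple illustration of the weighting described above, the sketch 
below (in Python) combines a risk score and an effectiveness score using 
the 2/3 and 1/3 weights. The scaling of the scores before the weights are 
applied is not detailed here, so the input values are hypothetical. 

# Illustrative sketch of the 2/3 risk, 1/3 effectiveness weighting; the
# scaled input scores are hypothetical.
def combined_score(risk_score, effectiveness_score,
                   risk_weight=2/3, effectiveness_weight=1/3):
    return risk_weight * risk_score + effectiveness_weight * effectiveness_score

# Hypothetical urban area with a scaled risk score of 0.90 and an
# effectiveness score of 0.60.
print(round(combined_score(0.90, 0.60), 2))  # 0.8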

DHS officials told us they presented funding options to the Secretary 
of Homeland Security, who made the final decision about funding 
allocations. The official from the Office of Grants and Training we 
spoke to did not provide additional details about the information 
presented to the Secretary to inform his decision, and did not know 
what other goals or data may have factored into the allocation 
decision. DHS also reported that it determined it needed to treat two 
urban areas differently from the other urban areas when making funding 
decisions because it considered them to be outliers in the risk 
analysis. DHS officials told us these areas have consequences so great 
that they cannot be appropriately accounted for in the risk model. DHS 
did not specify what methods it used to determine the amount of these 
two UASI grants. 

All of the 46 eligible urban areas that applied for a fiscal year 2006 
UASI grant received funding. DHS reported that 70 percent of UASI 
funding went to the higher-risk urban areas in Categories I and II of 
the two-by-two matrix, and 45 percent of available funding went to the 
five urban areas with the highest relative risk estimates. The total 
amount of UASI funds DHS allocated in fiscal year 2006 decreased by 14 
percent from fiscal year 2005, but individual percentage changes in 
funding varied among the 46 grantees. For example, among the 46 urban 
areas, 15 experienced an increase in funding and 28 saw a funding 
decrease. Three of the 35 areas did not receive funding in fiscal year 
2005, but were identified as eligible to apply for funding through the 
risk assessment in fiscal year 2006. The total amount awarded to these 
three urban areas was $23,620,000. The table below describes the 
allocations to each urban area in fiscal years 2005 and 2006, and 
illustrates the percentage change between years. 

Table 6: Percent Change in UASI Funding between Fiscal Year 2005 and 
Fiscal Year 2006: 

Urban area[A]; Fiscal year 2005 allocation; Fiscal year 2006 
allocation; Percent change in funds from fiscal year 2005 to 2006. 

New recipients in fiscal year 2006. 

Eligible areas through risk assessment: FL - Ft. Lauderdale Area; 
Fiscal year 2005 allocation: 0; 
Fiscal year 2006 allocation: $9,980,000; 
Percent change in funds from fiscal year 2005 to 2006: -. 

Eligible areas through risk assessment: FL - Orlando Area; 
Fiscal year 2005 allocation: 0; 
Fiscal year 2006 allocation: $9,440,000; 
Percent change in funds from fiscal year 2005 to 2006: -. 

Eligible areas through risk assessment: TN -Memphis Area; 
Fiscal year 2005 allocation: 0; 
Fiscal year 2006 allocation: $4,200,000; 
Percent change in funds from fiscal year 2005 to 2006: -. 

Increased funding. 

Eligible areas through risk assessment: NJ -Jersey City/Newark Area; 
Fiscal year 2005 allocation: $19,172,120; 
Fiscal year 2006 allocation: $34,330,000; 
Percent change in funds from fiscal year 2005 to 2006: 79%. 

Eligible areas through risk assessment: NC -Charlotte Area; 
Fiscal year 2005 allocation: $5,479,243; 
Fiscal year 2006 allocation: $8,970,000; 
Percent change in funds from fiscal year 2005 to 2006: 64%. 

Eligible areas through risk assessment: GA -Atlanta Area; 
Fiscal year 2005 allocation: $13,117,499; 
Fiscal year 2006 allocation: $18,660,000; 
Percent change in funds from fiscal year 2005 to 2006: 42%. 

Eligible areas through risk assessment: WI -Milwaukee Area; 
Fiscal year 2005 allocation: $6,325,872; 
Fiscal year 2006 allocation: $8,570,000; 
Percent change in funds from fiscal year 2005 to 2006: 35%. 

Eligible areas through risk assessment: FL - Jacksonville Area; 
Fiscal year 2005 allocation: $6,882,493; 
Fiscal year 2006 allocation: $9,270,000; 
Percent change in funds from fiscal year 2005 to 2006: 35%. 

Eligible areas through risk assessment: MO - St. Louis Area; 
Fiscal year 2005 allocation: $7,040,739; 
Fiscal year 2006 allocation: $9,200,000; 
Percent change in funds from fiscal year 2005 to 2006: 31%. 

Eligible areas through risk assessment: CA -Los Angeles/Long Beach 
Area; 
Fiscal year 2005 allocation: $69,235,692; 
Fiscal year 2006 allocation: $80,610,000; 
Percent change in funds from fiscal year 2005 to 2006: 16%. 

Eligible areas through risk assessment: IL - Chicago Area; 
Fiscal year 2005 allocation: $45,000,000; 
Fiscal year 2006 allocation: $52,260,000; 
Percent change in funds from fiscal year 2005 to 2006: 16%. 

Eligible areas through risk assessment: MO - Kansas City Area; 
Fiscal year 2005 allocation: $8,213,126; 
Fiscal year 2006 allocation: $9,240,000; 
Percent change in funds from fiscal year 2005 to 2006: 13%. 

Eligible areas through risk assessment: MI - Detroit; 
Fiscal year 2005 allocation: $17,068,580; 
Fiscal year 2006 allocation: $18,630,000; 
Percent change in funds from fiscal year 2005 to 2006: 9%. 

Eligible areas through risk assessment: FL - Miami Area; 
Fiscal year 2005 allocation: $15,828,322; 
Fiscal year 2006 allocation: $15,980,000; 
Percent change in funds from fiscal year 2005 to 2006: 1%. 

Reduced funding. 

Eligible areas through risk assessment: OR - Portland Area; 
Fiscal year 2005 allocation: $10,391,037; 
Fiscal year 2006 allocation: $9,360,000; 
Percent change in funds from fiscal year 2005 to 2006: (10%). 

Eligible areas through risk assessment: TX -Houston Area; 
Fiscal year 2005 allocation: $18,570,464; 
Fiscal year 2006 allocation: $16,670,000; 
Percent change in funds from fiscal year 2005 to 2006: (10%). 

Eligible areas through risk assessment: PA -Philadelphia Area; 
Fiscal year 2005 allocation: $22,818,091; 
Fiscal year 2006 allocation: $19,520,000; 
Percent change in funds from fiscal year 2005 to 2006: (14%). 

Eligible areas through risk assessment: MD - Baltimore; 
Fiscal year 2005 allocation: $11,305,357; 
Fiscal year 2006 allocation: $9,670,000; 
Percent change in funds from fiscal year 2005 to 2006: (14%). 

Eligible areas through risk assessment: CA -Bay Area; 
Fiscal year 2005 allocation: $33,226,729; 
Fiscal year 2006 allocation: $28,320,000; 
Percent change in funds from fiscal year 2005 to 2006: (15%). 

Eligible areas through risk assessment: OH - Cincinnati Area; 
Fiscal year 2005 allocation: $5,866,214; 
Fiscal year 2006 allocation: $4,660,000; 
Percent change in funds from fiscal year 2005 to 2006: (21%). 

Eligible areas through risk assessment: WA - Seattle Area; 
Fiscal year 2005 allocation: $11,840,034; 
Fiscal year 2006 allocation: $9,150,000; 
Percent change in funds from fiscal year 2005 to 2006: (23%). 

Eligible areas through risk assessment: IN - Indianapolis Area; 
Fiscal year 2005 allocation: $5,664,822; 
Fiscal year 2006 allocation: $4,370,000; 
Percent change in funds from fiscal year 2005 to 2006: (23%). 

Eligible areas through risk assessment: MN - Twin Cities Area; 
Fiscal year 2005 allocation: $5,763,411; 
Fiscal year 2006 allocation: $4,310,000; 
Percent change in funds from fiscal year 2005 to 2006: (25%). 

Eligible areas through risk assessment: TX -San Antonio Area; 
Fiscal year 2005 allocation: $5,973,524; 
Fiscal year 2006 allocation: $4,460,000; 
Percent change in funds from fiscal year 2005 to 2006: (25%). 

Eligible areas through risk assessment: HI - Honolulu Area; 
Fiscal year 2005 allocation: $6,454,763; 
Fiscal year 2006 allocation: $4,760,000; 
Percent change in funds from fiscal year 2005 to 2006: (26%). 

Eligible areas through risk assessment: MA - Boston Area; 
Fiscal year 2005 allocation: $26,000,000; 
Fiscal year 2006 allocation: $18,210,000; 
Percent change in funds from fiscal year 2005 to 2006: (30%). 

Eligible areas through risk assessment: OH - Cleveland Area; 
Fiscal year 2005 allocation: $7,385,100; 
Fiscal year 2006 allocation: $4,730,000; 
Percent change in funds from fiscal year 2005 to 2006: (36%). 

Eligible areas through risk assessment: CA -Anaheim/Santa Ana Area; 
Fiscal year 2005 allocation: $19,825,462; 
Fiscal year 2006 allocation: $11,980,000; 
Percent change in funds from fiscal year 2005 to 2006: (40%). 

Eligible areas through risk assessment: DC -National Capital Region; 
Fiscal year 2005 allocation: $77,500,000; 
Fiscal year 2006 allocation: $46,470,000; 
Percent change in funds from fiscal year 2005 to 2006: (40%). 

Eligible areas through risk assessment: NY -New York City; 
Fiscal year 2005 allocation: $207,563,211; 
Fiscal year 2006 allocation: $124,450,000; 
Percent change in funds from fiscal year 2005 to 2006: (40%). 

Eligible areas through risk assessment: OH - Columbus Area; 
Fiscal year 2005 allocation: $7,573,005; 
Fiscal year 2006 allocation: $4,320,000; 
Percent change in funds from fiscal year 2005 to 2006: (43%). 

Eligible areas through risk assessment: TX - Dallas/Fort 
Worth/Arlington Area; 
Fiscal year 2005 allocation: $24,355,870; 
Fiscal year 2006 allocation: $13,830,000; 
Percent change in funds from fiscal year 2005 to 2006: (43%). 

Eligible areas through risk assessment: PA -Pittsburgh Area; 
Fiscal year 2005 allocation: $9,635,991; 
Fiscal year 2006 allocation: $4,870,000; 
Percent change in funds from fiscal year 2005 to 2006: (49%). 

Eligible areas through risk assessment: LA -New Orleans Area; 
Fiscal year 2005 allocation: $9,305,180; 
Fiscal year 2006 allocation: $4,690,000; 
Percent change in funds from fiscal year 2005 to 2006: (50%). 

Eligible areas through risk assessment: CO - Denver Area; 
Fiscal year 2005 allocation: $8,718,395; 
Fiscal year 2006 allocation: $4,380,000; 
Percent change in funds from fiscal year 2005 to 2006: (50%). 

Total funding for 35 eligible areas; 
Fiscal year 2005 allocation: $749,100,346; 
Fiscal year 2006 allocation: $642,520,000; 
Percent change in funds from fiscal year 2005 to 2006: (14%). 

Sustainment areas[B]. 

Increased funding. 

Eligible areas through risk assessment: KY -Louisville Area; 
Fiscal year 2005 allocation: $5,000,000; 
Fiscal year 2006 allocation: $8,520,000; 
Percent change in funds from fiscal year 2005 to 2006: 70%. 

Eligible areas through risk assessment: NE -Omaha Area; 
Fiscal year 2005 allocation: $5,148,300; 
Fiscal year 2006 allocation: $8,330,000; 
Percent change in funds from fiscal year 2005 to 2006: 62%. 

Eligible areas through risk assessment: CA -Sacramento Area; 
Fiscal year 2005 allocation: $6,085,663; 
Fiscal year 2006 allocation: $7,390,000; 
Percent change in funds from fiscal year 2005 to 2006: 21%. 

Eligible areas through risk assessment: FL - Tampa Area; 
Fiscal year 2005 allocation: $7,772,791; 
Fiscal year 2006 allocation: $8,800,000; 
Percent change in funds from fiscal year 2005 to 2006: 13%. 

Reduced funding. 

Eligible areas through risk assessment: NV -Las Vegas Area; 
Fiscal year 2005 allocation: $8,456,728; 
Fiscal year 2006 allocation: $7,750,000; 
Percent change in funds from fiscal year 2005 to 2006: (8%). 

Eligible areas through risk assessment: OK - Oklahoma City Area; 
Fiscal year 2005 allocation: $5,570,181; 
Fiscal year 2006 allocation: $4,102,000; 
Percent change in funds from fiscal year 2005 to 2006: (26%). 

Eligible areas through risk assessment: OH - Toledo Area; 
Fiscal year 2005 allocation: $5,307,598; 
Fiscal year 2006 allocation: $3,850,000; 
Percent change in funds from fiscal year 2005 to 2006: (27%). 

Eligible areas through risk assessment: LA -Baton Rouge Area; 
Fiscal year 2005 allocation: $5,226,495; 
Fiscal year 2006 allocation: $3,740,000; 
Percent change in funds from fiscal year 2005 to 2006: (28%). 

Eligible areas through risk assessment: CA -San Diego Area; 
Fiscal year 2005 allocation: $14,784,191; 
Fiscal year 2006 allocation: $7,990,000; 
Percent change in funds from fiscal year 2005 to 2006: (46%). 

Eligible areas through risk assessment: NY -Buffalo Area; 
Fiscal year 2005 allocation: $7,207,995; 
Fiscal year 2006 allocation: $3,710,000; 
Percent change in funds from fiscal year 2005 to 2006: (49%). 

Eligible areas through risk assessment: AZ -Phoenix Area; 
Fiscal year 2005 allocation: $9,996,463; 
Fiscal year 2006 allocation: $3,920,000; 
Percent change in funds from fiscal year 2005 to 2006: (61%). 

Total funding for 11 sustainment areas; 
Fiscal year 2005 allocation: $80,556,405; 
Fiscal year 2006 allocation: $68,102,000; 
Percent change in funds from fiscal year 2005 to 2006: (15%). 

Total UASI funding allocated to 46 urban areas; 
Fiscal year 2005 allocation: $829,656,751; 
Fiscal year 2006 allocation: $710,622,000; 
Percent change in funds from fiscal year 2005 to 2006: (14%). 

Source: GAO analysis. 

Notes: 

[A] For a description of the cities, counties, and other geographic 
areas included in each urban area, see appendix II, table 3. 

[B] Sustainment area: an urban area that received UASI funding in fiscal 
year 2005, but was not deemed eligible to apply through the fiscal year 
2006 risk assessment. However, DHS extended eligibility to these areas 
for one additional grant cycle. 

[End of table] 
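
The percent-change column in table 6 reflects a straightforward 
calculation: the difference between the fiscal year 2006 and fiscal year 
2005 awards, divided by the fiscal year 2005 award. The short Python check 
below applies it to the Jersey City/Newark Area allocations reported in 
the table. 

def percent_change(fy2005, fy2006):
    # Percent change = (FY 2006 award - FY 2005 award) / FY 2005 award * 100.
    return (fy2006 - fy2005) / fy2005 * 100

# Jersey City/Newark Area allocations from table 6.
print(round(percent_change(19172120, 34330000)))  # 79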

DHS Actions after the Fiscal Year 2006 UASI Grants Were Awarded: 

After DHS awarded the fiscal year 2006 UASI grants it took additional 
steps to provide information about the grant determination process and 
to gather feedback from stakeholders. These steps included providing 
award letters that summarized the risk and effectiveness assessments 
for each urban area, requiring states to conduct grant reporting 
activities, and hosting an HSGP after-action conference. 

* DHS provided individual award letters that included basic 
descriptions of the risk and effectiveness assessments. The award 
letter, which announced the amount of the urban area's fiscal year 2006 
UASI award, also provided high-level feedback. For example, counts of 
asset information and geographic attributes DHS used to estimate 
relative risk were included in the letter. It also described whether 
DHS's estimate of relative risk placed the urban area in the (1) top 25 
percent, (2) top 50 percent, (3) bottom 50 percent, or (4) bottom 25 
percent, compared to the other eligible urban areas. The letter did not 
provide the urban areas with their specific risk score or ranking, 
however. Summary information was also provided on the results of the 
effectiveness assessment, including which investments were anticipated 
to be the most and least effective. The award letter did not explain 
how the risk and effectiveness assessments were used by DHS to 
determine final grant allocation amounts. 

* Through its grant reporting process, DHS gathered additional 
information about how the fiscal year 2006 UASI grants were to be 
spent. Once it allocated the UASI funds, DHS allowed the recipient 
urban areas to decide how to spend the grant under some conditions with 
specific reporting requirements.[Footnote 15] According to DHS's fiscal 
year 2006 grant guidance, grants were to be awarded to the respective 
State Administrative Agencies, which were required to notify DHS within 
60 days of the award date as to how the grant funds were 
allocated.[Footnote 16] DHS also reported that grant recipients would 
be monitored periodically to ensure that the program goals, objectives, 
timeliness, budgets, and other related program criteria were being met. 
Officials from DHS's Office of Grants and Training reported that DHS 
plans to ask grant recipients how they spent their fiscal year 2006 
funds. DHS officials told us that they plan to consider this information 
when making decisions for fiscal year 2008 UASI allocations. 

* DHS convened a Homeland Security Grant Program after-action 
conference. At the July 2006 conference, DHS gathered feedback on the 
UASI grant award process. The conference included working groups on 
homeland security planning, the HSGP guidance and application, the risk 
assessment, and the effectiveness assessment. DHS officials told us 
that the conference provided a feedback loop intended to bolster 
stakeholder support and promote transparency. The state and local 
partners who participated in the working groups at the conference 
produced 32 substantive recommendations to improve upon the HSGP 
process for fiscal year 2007 and beyond. For example, one of the risk 
assessment working group's recommendations was that DHS should provide 
detailed briefings to state and local partners on the core components 
of the risk methodology used in the fiscal year 2006 process as one 
step to improve the transparency of the risk analysis process. The 
effectiveness assessment working group recommended eliminating the 
overall investment justification score, as the group believed the score 
was not beneficial and did not truly represent the quality of the 
application. DHS reported that state and local partners agreed the 
overall fiscal year 2006 planning process was the most effective and 
constructive thus far, and that the process helped to standardize the 
focus of state and local programs around key homeland security 
capabilities. 

According to DHS officials, stakeholder feedback on the fiscal year 
2006 UASI grant process has been obtained and is being considered and 
incorporated into the fiscal year 2007 process where appropriate. DHS 
stated it will continue to regularly seek stakeholder input and 
feedback to ensure that state and local governments are fully informed 
and that the process proceeds in a collaborative fashion in fiscal year 
2007. For example, DHS reported plans to convene stakeholder meetings 
to receive input on how to make specific grant programs more user- 
friendly and transparent, including a midterm review during the HSGP 
application process. 

Appendix VI: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

William O. Jenkins, Jr., Director, GAO Homeland Security and Justice 
Issues Team, (202) 512-8777 (jenkinswo@gao.gov): 

Acknowledgments: 

In addition to the contact named above, the following individuals from 
GAO's Homeland Security and Justice Team also made contributions to 
this report: William Sabol, Assistant Director; Chris Keisling, 
Assistant Director; John Vocino, Analyst-in-Charge; Leslie Sarapu; Lacy 
Vong; and Kathryn Godfrey. Also contributing were Charles Bausell, Jr., 
Economist; David Alexander, GAO Applied Methodology and Research Team; 
and Frances Cook, GAO Office of General Counsel. 

(440574): 

FOOTNOTES 

[1] Prior to fiscal year 2003, funding to urban areas was provided 
under the Nunn-Lugar-Domenici Domestic Preparedness Program, which was 
administered by the Department of Defense starting in fiscal year 1997, 
and later the Department of Justice during fiscal years 2001 and 2002. 
Other grants under the HSGP include the State Homeland Security 
Program, Law Enforcement Terrorism Prevention Program, and Citizen 
Corps Program, among others. 

[2] Pub. L. No. 109-295, 120 Stat. 1355, 1370 (2006). 

[3] By using a relative risk value, DHS assessed the risk of potential 
terrorist attacks to one urban area relative to another urban area. DHS 
estimated relative risk as an ordinal number, which typically is 
understood to indicate rank order. Further, the "distance" between the 
numbers has no meaning. According to DHS, a classical probabilistic 
risk assessment, in which risk is calculated using historical 
statistical data to quantitatively describe the likelihood of a 
particular event (usually expressed as a value between 0 and 1), cannot 
be used, because there are little available historical statistical data 
to describe terrorism risk. 

[4] According to DHS, extending eligibility to the 11 urban areas 
reflected feedback from stakeholders on the importance of providing 
funding across fiscal years. In addition, in DHS's view, this decision 
provided greater transparency in the process and fostered long-term 
planning for program participants. DHS has also stated that any urban 
area not identified as eligible through the risk analysis process for 
two consecutive grant cycles will not be eligible for continued UASI 
funding. 

[5] The figure does not represent actual urban areas or grant award 
amounts. 

[6] Office of Management and Budget, Circular A-94: Guidelines and 
Discount Rates for Benefit-Cost Analysis of Federal Programs, 
(Washington, D.C.: October 29, 1992), pp. 10-11. 

[7] The RAND Corporation is a nonprofit policy research and analysis 
institution that has conducted national security research for the U.S. 
Department of Defense, the intelligence community, and key allied 
governments and ministries of defense. In addition, RAND operates three 
federally funded research and development centers that focus on 
national security issues. 

[8] Congressional Research Service, Risk Management and Critical 
Infrastructure Protection: Assessing, Integrating, and Managing 
Threats, Vulnerabilities, and Consequences, (Washington, D.C.: February 
2005). 

[9] The National Infrastructure Simulation and Analysis Center is a 
virtual center that includes national laboratories, such as Sandia, Los 
Alamos, and Argonne National Laboratories. 

[10] Buffer zone extensions were considered for chemical plants (25 
miles) and nuclear power plants (50 miles). According to DHS officials, 
these distances were selected based on plume effects and were informed 
by research conducted by the Department of Energy. 

[11] States and UASI areas were required to maintain a Homeland 
Security Strategy, which was meant to (a) provide a blueprint for 
comprehensive, enterprisewide planning for homeland security efforts 
and (b) provide a strategic plan for the use of related federal, state, 
local and private resources within the state or urban area before, 
during, and after threatened or actual domestic terrorist attacks, 
major disasters, and other emergencies. 

[12] In fiscal year 2006, as in fiscal year 2005, the states were 
required to notify DHS how they spent the funds. Within 60 days of the 
grant award, state administrative agencies were required to submit a 
prioritization of investments based upon the final grant award amounts 
and a certification that funds had been passed through to local units 
of government. 

[13] According to DHS, in 2005 states conducted a Program and 
Capability Review and from this created an Enhancement Plan, which was 
meant to prioritize focus areas and develop high-level initiatives to 
address the most critical needs. The Enhancement Plan also served as 
the foundation for building an investment justification to request 
fiscal year 2006 HSGP funding. 

[14] DHS reported that a consensus on final scores was not required. 
Instead, reviewers' scores within each panel were averaged. 

[15] DHS required that conditions established by peer reviewers be met 
before it funded an investment with a score below a certain threshold. 

[16] Subsequent information on actual expenditures was to be reported 
every 6 months through the Biannual Strategy Implementation Report. 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: