This is the accessible text file for GAO report number GAO-04-781T 
entitled 'Child And Family Services Reviews: States and HHS Face 
Challenges in Assessing and Improving State Performance' which was 
released on May 13, 2004.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Testimony:

Before the Subcommittee on Human Resources, Committee on Ways and 
Means, House of Representatives:

United States General Accounting Office:

GAO:

For Release on Delivery Expected at 10:00 a.m. EDT:

Thursday, May 13, 2004:

Child and Family Services Reviews:

States and HHS Face Challenges in Assessing and Improving State 
Performance:

Statement of Cornelia M. Ashby, Director, Education, Workforce, and 
Income Security Issues:

GAO-04-781T:

GAO Highlights:

Highlights of GAO-04-781T, a testimony before the Subcommittee on Human 
Resources, Committee on Ways and Means, House of Representatives

Why GAO Did This Study:

In 2001, the Department of Health and Human Services’ (HHS) 
Administration for Children and Families (ACF) implemented the Child 
and Family Services Reviews (CFSR) to increase states’ accountability. 
The CFSR uses states’ data profiles and statewide assessments, as well 
as interviews and an on-site case review, to measure state performance 
on 14 outcomes and systemic factors, including child well-being and the 
provision of caseworker training. The CFSR also requires progress on a 
program improvement plan (PIP); otherwise ACF may apply financial 
penalties. This testimony is based on our April 2004 report and 
addresses (1) ACF’s and the states’ experiences preparing for and 
conducting the statewide assessments and on-site reviews; (2) ACF’s and 
the states’ experiences developing, funding, and implementing items in 
PIPs; and (3) any additional efforts that ACF has taken beyond the CFSR 
to improve state performance. For the April 2004 report, we surveyed 
all 50 states, the District of Columbia, and Puerto Rico regarding 
their experiences throughout the CFSR process, visited 5 states to 
obtain first-hand information, and conducted a content analysis of all 
31 available PIPs as of January 1, 2004. We also interviewed HHS 
officials—including those in all 10 regional offices—and key child 
welfare experts.

What GAO Found:

ACF and many state officials perceive the CFSR as a valuable process 
and a substantial undertaking, but some data enhancements could improve 
its reliability. ACF staff in 8 of the 10 regions considered the CFSR a 
helpful tool to improve outcomes for children. Further, 26 of 36 states 
responding to a relevant question in our survey commented that they 
generally or completely agreed with the results of the final CFSR 
report, even though none of the 41 states with final CFSR reports 
released through 2003 has achieved substantial conformity on all 14 
outcomes and systemic factors. Additionally, both ACF and the states 
have dedicated substantial financial and staff resources to the 
process. Nevertheless, several state officials and child welfare 
experts we interviewed questioned the accuracy of the data used in the 
review process. While ACF officials contend that stakeholder interviews 
and case reviews complement the data profiles, many state officials and 
experts reported that additional data from the statewide assessment 
could bolster the evaluation of state performance. 

Program improvement planning is under way, but uncertainties have 
affected the development, funding, and implementation of state PIPs. 
Officials from 3 of the 5 states we visited said ACF’s PIP-related 
instructions were unclear, and at least 9 states reported in our survey 
that challenges to implementing their plans include insufficient 
funding, staff, and time. While ACF has provided some guidance, ACF and 
state officials remain uncertain about PIP monitoring efforts and how 
ACF will apply financial penalties if states fail to achieve their 
stated PIP objectives.

Since 2001, ACF’s focus has been almost exclusively on the CFSRs and 
regional staff report limitations in providing assistance to states in 
helping them to meet key federal goals. While staff from half of ACF’s 
regions told us they would like to provide more targeted assistance to 
states, and state officials in all 5 of the states we visited said that 
ACF’s existing technical assistance efforts could be improved, ACF 
officials acknowledged that regional staff might still be adjusting to 
the new way ACF oversees child welfare programs. 

In the April 2004 report, we recommended that the Secretary of HHS 
ensure that ACF uses the best available data to measure state 
performance. We also recommended that the Secretary clarify PIP 
guidance and provide guidance to regional officials on how to better 
integrate their many oversight responsibilities. In commenting on a 
draft of the April 2004 report, HHS acknowledged that the CFSR is a new 
process that continues to evolve, and noted several steps it has taken 
to address the data quality concerns we raised in that report. 

www.gao.gov/cgi-bin/getrpt?GAO-04-781T.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Cornelia Ashby at (202) 
512-8403 or ashbyc@gao.gov.

[End of section]

Mr. Chairman and Members of the Subcommittee:

Thank you for inviting me here today to discuss states' efforts to 
comply with federal Child and Family Services Reviews (CFSR). As you 
are aware, in 2001, the Department of Health and Human Services' (HHS) 
Administration for Children and Families (ACF) began implementing the 
CFSRs to hold states accountable for improving child welfare outcomes. 
Unlike prior federal reviews--which determined states' adherence to 
certain process measures--ACF designed the CFSR as an outcome-oriented 
approach to assess children's safety; their timely placement in 
permanent homes; and their mental, physical, and educational well-
being; and it developed certain standards against which to measure 
states' success in these areas.[Footnote 1] ACF also designed the 
reviews to assess states' performance across a range of systemic 
factors, such as caseworker training and foster parent licensing. The 
CFSR has multiple phases, consisting of a statewide assessment; an on-
site review, which culminates in the release of a final report; and the 
development and implementation of a program improvement plan (PIP) when 
states are found to be deficient. Pursuant to CFSR regulations, ACF can 
withhold federal funds if states do not show adequate progress 
implementing their PIPs.

My testimony today will focus on three key issues: (1) ACF's and the 
states' experiences preparing for and conducting the statewide 
assessments and on-site reviews; (2) ACF's and the states' experiences 
developing, funding, and implementing items in their PIPs; and (3) 
additional efforts, if any, that ACF has taken beyond the CFSR to help 
ensure that all states meet federal goals of safety, permanency, and 
well-being for children. My comments are based on the findings from our 
April 2004 report.[Footnote 2] Those findings were based on a survey of 
all 50 states, the District of Columbia, and Puerto Rico regarding 
their experiences during each phase of the CFSR process;[Footnote 3] 
post-survey follow-up phone calls with key states;[Footnote 4] and site 
visits to California, Florida, New York, Oklahoma, and Wyoming to 
obtain first-hand information on states' experiences. We selected these 
states for diversity in their location, size, program administration, 
performance on the CFSR, and the timing of their review. We also 
examined all 31 approved PIPs available as of January 1, 2004, and 
conducted interviews with ACF's senior officials, regional staff from 
all 10 regions, ACF contractors, staff from all 10 national resource 
centers,[Footnote 5] and key child welfare experts. We conducted our 
work between May 2003 and February 2004 in accordance with generally 
accepted government auditing standards.

In summary, ACF and many state officials perceive the CFSR as a 
valuable process and a substantial undertaking, but some data 
enhancements could improve its reliability. ACF staff in 8 of the 10 
regions considered the CFSR a helpful tool to improve outcomes for 
children, and 26 of 36 states responding to a relevant question in our 
survey commented that they generally or completely agreed with the 
results of the final CFSR report, even though none of the 41 states 
with final CFSR reports released through 2003 has achieved substantial 
conformity on all CFSR outcomes and systemic factors. Additionally, 
both ACF and the states have dedicated substantial financial and staff 
resources to the process. Nevertheless, several state officials and 
child welfare experts we interviewed questioned the accuracy of the 
data used in the review process and noted that additional data from the 
statewide assessment could bolster the evaluation of state performance. 
While states' PIP planning is under way, uncertainties have affected 
the development, funding, and implementation of these plans. Officials 
from 3 of the 5 states we visited said ACF's PIP-related instructions 
were unclear, and at least 9 of the 25 states reporting on PIP 
implementation in our survey stated that insufficient funding, staff, 
and time, as well as high caseloads, were among the greatest 
challenges. While ACF has provided some guidance, ACF and state 
officials remain uncertain about PIP monitoring efforts and how ACF 
will apply financial penalties if states fail to achieve their stated 
PIP objectives. Further, since 2001, ACF's focus has been almost 
exclusively on the CFSRs, and regional staff report that they are 
limited in helping states meet key federal goals. To improve its 
oversight, we recommended in our April 2004 
report that the Secretary of HHS ensure that ACF use the best available 
data to measure state performance, clarify PIP guidance, and help 
regional offices better integrate their oversight responsibilities.

Background:

ACF's Children's Bureau administers and oversees federal funding to 
states for child welfare services under Titles IV-B and IV-E of the 
Social Security Act, and states and counties provide these child 
welfare services, either directly or indirectly through contracts with 
private agencies.[Footnote 6] Among other activities, ACF staff are 
responsible for developing appropriate policies and procedures for 
states to follow to obtain and use federal child welfare funds, 
reviewing states' planning documents required by Title IV-B, conducting 
states' data system reviews, assessing states' use of Title IV-E funds, 
and providing technical assistance to states through all phases of the 
CFSR process. In addition, ACF staff coordinate the work of the 10 
resource centers to provide additional support and assistance to the 
states.

Spurred by the passage of the 1997 Adoption and Safe Families Act 
(ASFA), ACF launched the CFSR in 2001 to improve its existing 
monitoring efforts, which had once been criticized for focusing 
exclusively on states' compliance with regulations rather than on their 
performance over a full range of child welfare services. The CFSR 
process combines a statewide self-assessment, an on-site case file 
review that is coupled with stakeholder interviews,[Footnote 7] and the 
development and implementation of a 2-year PIP with performance 
benchmarks to measure progress in addressing noted deficiencies. In 
assessing performance through the CFSR, ACF relies, in part, on its own 
data systems, known as NCANDS and AFCARS, which were designed prior to 
CFSR implementation to capture, report, and analyze the child welfare 
information collected by the states.[Footnote 8] Today, these systems 
provide the national data necessary for ACF to calculate national 
standards for key performance items against which all states are 
measured and to determine, in part, whether or not states are in 
substantial conformity on CFSR outcomes and systemic factors.[Footnote 
9] Once ACF approves the PIP, states are required to submit quarterly 
progress reports. Pursuant to CFSR regulations, federal child welfare 
funds can be withheld if states do not show adequate PIP progress, but 
these penalties are suspended during the 2-year PIP implementation 
term.[Footnote 10]

In preparation for the next round of CFSRs, ACF officials have formed a 
Consultation Work Group of ACF staff, child welfare administrators, 
data experts, and researchers who will propose recommendations on the 
CFSR measures and processes. The group's resulting proposals for 
change, if any, are not yet available.

The CFSR Is a Valuable Yet Substantial Undertaking, but Data 
Enhancements Could Improve Its Reliability:

ACF and many state officials perceive the CFSR as a valuable process--
highlighting many areas needing improvement--and a substantial 
undertaking, but some state officials and child welfare experts told us 
that data enhancements could improve its reliability. ACF staff in 8 of 
the 10 regions considered the CFSR a helpful tool to improve outcomes 
for children. Further, 26 of the 36 states responding to a relevant 
question in our survey commented that they generally or completely 
agreed with the results of the final CFSR report, even though none of 
the 41 states with final CFSR reports released through 2003 has 
achieved substantial conformity on all 14 outcomes and systemic 
factors. In addition, both ACF and the states have dedicated 
substantial financial and staff resources to the process. However, 
several state officials and child welfare experts we interviewed 
questioned the accuracy of the data used to compile state profiles and 
establish the national standards. While ACF officials in the central 
office contend that stakeholder interviews and case reviews complement 
the data profiles, many state officials and experts reported that 
additional data from the statewide assessment could bolster the 
evaluation of state performance.

The CFSR Is a Valuable Process for ACF and the States:

ACF and state officials support the objectives of the review, 
especially in focusing on children's outcomes and strengthening 
relationships with stakeholders, and told us they perceive the process 
as valuable. For example, ACF officials from 8 regional offices noted 
that the CFSRs were more intensive and more comprehensive than the 
other types of reviews they had conducted in the past, creating a 
valuable tool for regional officials to monitor states' performance. In 
addition, state officials from every state we visited told us that the 
CFSR process helped to improve collaboration with community 
stakeholders. Furthermore, state staff from 4 of the 5 states we 
visited told us the CFSR led to increased public and legislative 
attention to critical issues in child welfare. For example, caseworkers 
in Wyoming told us that without the CFSR they doubted whether their 
state agency's administration would have focused on needed reforms. 
They added that the agency used the CFSR findings to request 
legislative support for the hiring of additional caseworkers.

Along with the value associated with improved stakeholder relations, 
the ACF officials we talked to and many state officials reported that 
the process has been helpful in highlighting the outcomes and systemic 
factors, as well as other key performance items that need improvement. 
According to our survey, 26 of the 36 states that commented on the 
findings of the final CFSR report indicated that they generally or 
completely agreed with the findings, even though performance across the 
states was low in certain key outcomes and performance items. For 
example, not one of the 41 states with final reports released through 
2003 was found to be in substantial conformity with either the outcome 
measure that assesses the permanency and stability of children's living 
situations or with the outcome measure that assesses whether states had 
enhanced families' capacity to provide for their children's needs. 
Moreover, across all 14 outcomes and systemic factors, state 
performance ranged from achieving substantial conformity on as few as 2 
outcomes and systemic factors to as many as 9.[Footnote 11] As figure 1 
illustrates, the majority of states were determined to be in 
substantial conformity with half or fewer of the 14 outcomes and 
systemic factors assessed.

Figure 1: State Performance on the 14 CFSR Outcomes and Systemic 
Factors:

[See PDF for image]

[End of figure]

States' performance on the outcomes related to safety, permanency, and 
well-being--as well as the systemic factors--is determined by their 
performance on an array of items, such as establishing permanency 
goals, ensuring worker visits with parents and children, and providing 
accessible services to families. The CFSR showed that many states need 
improvement in the same areas. For example, across all 41 states 
reviewed through 2003, the 10 items most frequently rated as needing 
improvement included assessing the needs and services of children, 
parents, and foster parents (40 states); assessing the mental health of 
children (37 states); and establishing the most appropriate permanency 
goal for the child (36 states).

ACF and the States Report That Reviews Have Been a Substantial 
Undertaking:

Given the value that ACF and the states have assigned to the CFSR 
process, both have spent substantial financial resources and staff time 
to prepare for and implement the reviews. In fiscal years 2001-03, when 
most reviews were scheduled, ACF budgeted an additional $300,000 
annually for CFSR-related travel. In fiscal year 2004, when fewer 
reviews were scheduled, ACF budgeted about $225,000. To further enhance 
its capacity to conduct the reviews, and to obtain additional 
logistical and technical assistance, ACF spent approximately $6.6 
million annually to hire contractors. Specifically, ACF has let three 
contracts to assist with CFSR-related activities, including training 
reviewers to conduct the on-site reviews, tracking final reports and 
PIP documents, and, as of 2002, writing the CFSR final reports. 
Additionally, ACF hired 22 new staff to build central and regional 
office capacity and dedicated 4 full-time staff and 2 state government 
staff temporarily on assignment with ACF to assist with the CFSR 
process. To build a core group of staff with CFSR expertise, ACF 
created the National Review Team, composed of central and regional 
office staff with additional training in and experience with the review 
process. In addition, to provide more technical assistance to the 
states, ACF reordered the priorities of the national resource centers 
to focus their efforts primarily on helping states with the review 
process.

Like ACF, states also spent financial resources on the review. While 
some states did not track CFSR expenses--such as staff salaries, 
training, or administrative costs--the median expense to date reported 
by the 25 states that provided such information in our survey was $60,550, 
although states reported spending as little as $1,092 and as much as 
$1,000,000 on the CFSR process.[Footnote 12] Although ACF officials 
told us that states can use Title IV-E funds to pay for some of their 
CFSR expenses, only one state official addressed the use of these funds 
in our survey, commenting that it was not until after the on-site 
review occurred that the state learned these funds could have been used 
to offset states' expenses. States also reported that they dedicated 
staff time to prepare for the statewide assessment and to conduct the 
on-site review, which sometimes had a negative impact on staff members' 
regular duties. According to our survey, 45 states reported dedicating 
up to 200 full-time staff equivalents (FTE), with an average of 47 
FTEs, to the statewide assessment process.[Footnote 13] Similarly, 42 
states responded that they dedicated between 3 and 130 FTEs, with an 
average of 45 FTEs, to the on-site review process. For some 
caseworkers, dedicating time to the CFSR meant that they were unable or 
limited in their ability to manage their typical workload. For example, 
Wyoming caseworkers whose case files were selected for the on-site 
review told us that they needed to be available to answer reviewers' 
questions all day every day during the on-site review, which they said 
prevented them from conducting necessary child abuse investigations or 
home visits. Child welfare-related stakeholders--such as judges, 
lawyers, and foster parents--also contributed time to the CFSR.

States and Child Welfare Experts Report That Several Data Improvements 
Could Enhance CFSR Reliability:

State officials in the 5 states we visited, as well as child welfare 
experts, reported on several data improvements that could enhance the 
reliability of CFSR findings. In particular, they highlighted 
inaccuracies with the AFCARS and NCANDS data that are used for 
establishing the national standards and creating the statewide data 
profiles, which are then used to determine if states are in substantial 
conformity. These concerns echoed the findings of a prior GAO study on 
the reliability of these data sources, which found that states are 
concerned that the national standards used in the CFSR are based on 
unreliable information and should not be used as a basis for comparison 
and potential financial penalty.[Footnote 14] Furthermore, many states 
needed to resubmit their statewide data after finding errors in the 
data profiles ACF would have used to measure compliance with the 
national standards.[Footnote 15] According to our national survey, of 
the 37 states that reported on resubmitting data for the statewide data 
profile, 23 needed to resubmit their statewide data at least once, with 
one state needing to resubmit as many as five times to accurately 
reflect revised data. Four states reported in our survey that they did 
not resubmit their data profiles because they did not know they had 
this option or they did not have enough time to resubmit before the 
review.

In addition to expressing these data concerns, child welfare experts as 
well as officials in all of the states we visited commented that 
existing practices that benefit children might conflict with actions 
needed to attain the national standards. For example, officials in New 
York said that they recently implemented an initiative to facilitate 
adoptions. Because these efforts focus on the backlog of children who 
have been in foster care for several years, New York officials predict 
that their performance on the national standard for adoption will be 
lower since many of the children in the initiative have already been in 
care for more than 2 years. Experts and officials from multiple states 
also commented that they believe the on-site review case sample of 50 
cases is too small to provide an accurate picture of statewide 
performance, although ACF officials stated that the case sampling is 
supplemented with additional information.[Footnote 16] For example, 
Oklahoma officials we visited commented that they felt the case sample 
size was too small, especially since they annually assess more than 800 
of their own cases--using a procedure that models the federal CFSR--and 
obtain higher performance results than the state received on its CFSR. 
Furthermore, because not every case in the states' sample is applicable 
to each item measured in the on-site review, we found that sometimes as 
few as 1 or 2 cases were being used to evaluate states' performance on 
an item. For example, Wyoming had only 2 on-site review cases 
applicable for the item measuring the length of time to achieve a 
permanency goal of adoption, but for 1 of these cases, reviewers 
determined that appropriate and timely efforts had not been taken to 
achieve finalized adoptions within 24 months, resulting in the item 
being assigned a rating of area needing improvement.[Footnote 17] While 
ACF officials acknowledged the insufficiency of the sample 
size,[Footnote 18] they contend that the case sampling is augmented by 
stakeholder interviews for all items and applicable statewide data for 
the five CFSR items with corresponding national standards, therefore 
providing sufficient evidence for determining states' conformity.

All of the states we visited experienced discrepant findings between 
the aggregate data from the statewide assessment and the information 
obtained from the on-site review. We also found that in these 5 states, 
ACF had assigned an overall rating of area needing improvement for 10 
of the 11 instances in which discrepancies occurred. ACF officials 
acknowledged the challenge of resolving data discrepancies, noting that 
such complications can delay the release of the final report and 
increase or decrease the number of items that states must address in 
their PIPs. While states have the opportunity to resolve discrepancies 
by submitting additional information explaining the discrepancy or by 
requesting an additional case review, only 1 state to date has decided 
to pursue the additional case review.[Footnote 19] Further, several 
state officials and experts also told us that additional data from the 
statewide assessments--or other data sources compiled by the states--
could bolster the evaluation of states' performance, but they found 
this information to be missing or insufficiently used in the final 
reports. For example, child welfare experts and state officials from 
California and New York--who are using alternative data sources to 
AFCARS and NCANDS, such as longitudinal data that track children's 
placements over time--told us that the inclusion of this more detailed 
information would provide a more accurate picture of states' 
performance nationwide. An HHS official told us that alternative data 
are used only to assess state performance in situations in which a 
state does not have NCANDS data, since states are not mandated to have 
these systems.

Given their concerns with the data used in the review process, state 
officials in 4 of the 5 states believed that the threshold for 
achieving substantial conformity was difficult to attain. While an ACF 
official told us that different thresholds for the national standards 
had been considered, ACF policy makers ultimately concluded that a 
threshold at the 75th percentile of the nationwide data would be used. 
ACF officials recognize that they have set a high standard. However, 
they believe it is attainable and supportive of their overall approach 
to move states to the standard through continuous improvement.

Program Improvement Planning Under Way, but Uncertainties Challenge 
Plan Development, Implementation, and Monitoring:

Forty-one states are engaged in program improvement planning, but many 
uncertainties, such as those related to federal guidance and monitoring 
and the availability of state resources, have affected the development, 
implementation, and funding of the PIPs. State PIPs include strategies 
such as revising or developing policies, training caseworkers, and 
engaging stakeholders, and ACF has issued regulations and guidance to 
help states develop and implement their plans. Nevertheless, states 
reported uncertainty about how to develop their PIPs and commented on 
the challenges they faced during implementation. For example, officials 
from 2 of the states we visited told us that ACF had rejected their 
PIPs before final approval, even though these officials said that the 
plans were based on examples of approved PIPs that regional officials 
had provided. Further, at least 9 of the 25 states responding to a 
question in our survey on PIP implementation indicated that 
insufficient time, funding, and staff, as well as high caseloads, were 
the greatest challenges they faced. As states progress in PIP 
implementation, some ACF officials expressed a need for more guidance 
on how to monitor state accomplishments, and both ACF and state 
officials were uncertain about how the estimated financial penalties 
would be applied if states fail to achieve the goals described in their 
plans.

State Plans Include a Variety of Strategies to Address Identified 
Weaknesses:

State plans include a variety of strategies to address weaknesses 
identified in the CFSR review process. However, because most states had 
not completed PIP implementation by the time of our analysis, the 
extent to which states have improved outcomes for children has not been 
determined.[Footnote 20] While state PIPs varied in their detail, 
design, and scope, according to our analysis of 31 available PIPs, 
these state plans have focused to some extent on revising or developing 
policies; reviewing and reporting on agency performance; improving 
information systems; and engaging stakeholders such as courts, 
advocates, foster parents, private providers, or sister agencies in the 
public sector.[Footnote 21] Table 1 shows the number of states that 
included each of the six categories and subcategories of strategies we 
developed for the purposes of this study.

Table 1: Number of States Including Each of the PIP Strategy Categories 
Used in This Study:

PIP strategy category: Policies and procedures; 
Description (number of states that included the strategy in their PIP): 
* Review, modify, or develop/implement any policy, procedure or case 
practice standard (31); 
* Enhance foster home/parent licensing standards (7); 
* Develop child and family assessment tools, such as protocols for risk/
safety determinations (28); 
* Identify and adopt any promising practices (19).

PIP strategy category: Data collection and analysis; 
Description (number of states that included the strategy in their PIP): 
* Review and report on agency performance through self-assessments or 
internal audits/review (31); 
* Apply federal CFSR or similar process for internal statewide case 
reviews (16); 
* Improve information and data collection systems (31).

PIP strategy category: Staff supports; 
Description (number of states that included the strategy in their PIP): 
* Train and develop caseworkers (through dissemination and training on 
policy or through revisions to overall curriculum) (30); 
* Assess and monitor staff responsibilities, skills, or performance 
(24); 
* Recruit additional staff/retain staff (14); 
* Lower caseloads (11); 
* Increase caseworker pay (1).

PIP strategy category: Foster parent supports/services and resources 
for children and families; 
Description (number of states that included the strategy in their PIP): 
* Train and develop foster families'/providers' skills and capacities 
(27); 
* Recruit and retain foster families (22); 
* Increase involvement of foster or birth families in case (18); 
* Expand service array for children and families (includes developing 
or enhancing transportation systems to transport siblings and parents 
for visits, creating one-stop centers for assistance, modifying 
visitation services, and providing any additional support services) 
(27); 
* Engage stakeholders such as courts, advocates, foster homes, private 
providers, or sister agencies in public sector, e.g., mental health 
(can include consultation, training, or formal partnering to improve 
services or placements) (31); 
* Create or improve monitoring of contracts with private providers to 
enhance service delivery (includes development of performance based or 
outcome-based contracts or other evaluations of provider performance) 
(25).

PIP strategy category: State legislative supports; 
Description (number of states that included the strategy in their PIP): 
* State request for legislative action to support any of the above 
strategies (20).

PIP strategy category: Federal technical assistance; 
Description (number of states that included the strategy in their PIP): 
* State request technical assistance from ACF or any resource center to 
support any of the above strategies (27). 

Source: GAO analysis.

[End of table]

Our analysis also showed that many states approached PIP development by 
building on state initiatives in place prior to the on-site review. Of 
the 42 states reporting on this topic in our survey, 30 said 
that their state identified strategies for the PIP by examining ongoing 
state initiatives. For example, local officials in New York City and 
state officials in California told us that state reform efforts--born 
in part of legal settlements--have become the foundation for the PIP. 
State officials in California informed us that reform efforts initiated 
prior to the CFSR, such as implementing a new system for receiving and 
investigating reports of abuse and neglect and developing more early 
intervention programs, became integral elements in the PIP.

Insufficient Guidance Hampered State Planning Efforts, but ACF Has 
Taken Steps to Clarify Expectations and Improve Technical Assistance:

ACF has provided states with regulations and guidance to facilitate PIP 
development, but some states believe the requirements have been 
unclear. For example, several states commented in our survey that 
multiple aspects of the PIP approval process were unclear, such as how 
much detail and specificity the agency expects the plan to include; 
what type of feedback states could expect to receive; when states could 
expect to receive such feedback; and whether a specific format was 
required. Officials in the states we visited echoed survey respondents' 
concerns, with officials from 3 of the 5 states informing us that ACF 
had given states different instructions regarding acceptable PIP format 
and content. For example, California and Florida officials told us that 
their program improvement plans had been rejected prior to final 
approval, even though they were based on examples of approved plans 
that regional officials had provided. In addition, California officials 
told us that they did not originally know how much detail the regional 
office expected in the PIP and believed that the level of detail the 
regional office staff ultimately required was too high. Specifically, 
officials in California said that the version of their plan that the 
region accepted included 2,932 action steps--a number these officials 
believe is too high given their state's limited resources and the 2-
year time frame to implement the PIP.

ACF officials have undertaken several steps to clarify their 
expectations for states and to improve technical assistance. For 
example, in 2002, 2 years after ACF released the CFSR regulations and a 
procedures manual, ACF offered states additional guidance and provided 
a matrix format to help state officials prepare their plans. ACF 
officials told us the agency sends a team of staff from ACF and 
resource centers to the state to provide intensive on-site technical 
assistance when it determines that a state is slow in developing its 
PIP. Further, ACF has sent resource center staff to states to provide 
training almost immediately after the completion of the on-site review 
to encourage state officials to begin PIP development before the final 
report is released. Our survey results indicate that increasing numbers 
of states are developing their PIPs early in the CFSR process, which 
may reflect ACF's emphasis on PIP development. According to our 
analysis, of the 18 states reviewed in 2001, only 2 started developing 
their PIPs before or during the statewide assessment phase. Among 
states reviewed in 2003, this share increased to 5 of 9.

Evidence suggests that lengthy time frames for PIP approval have not 
necessarily delayed PIP implementation, and ACF has made efforts to 
reduce the time the agency takes to approve states' PIPs. For example, 
officials in 3 of the 5 states we visited told us they began 
implementing new action steps before ACF officially approved their 
plans because many of the actions in their PIPs were already under way. 
In addition, according to our survey, of the 28 states reporting on 
this topic, 24 reported that they had started implementing their PIP 
before ACF approved it. Further, our analysis shows that the length of 
time between the PIP due date, which statute sets at 90 days after the 
release of the final CFSR report, and final ACF PIP approval has ranged 
considerably--from 45 to 349 business days. For almost half of the 
plans, ACF's approval occurred 91 to 179 business days after the PIP 
was due. Our analysis indicated that ACF has recently reduced the time 
lapse by 46 business days. This shorter time lapse for PIP approval may 
be due, in part, to ACF's emphasis on PIP development. According to 
one official, ACF has directed states to concentrate on submitting a 
plan that can be quickly approved. Another ACF official added that 
because of ACF's assistance with PIP development, states are now 
submitting higher-quality PIPs that require fewer revisions.

State and Federal Uncertainties Cloud PIP Implementation and 
Monitoring:

Program improvement planning has been ongoing, but uncertainties have 
made it difficult for states to implement their plans and ACF to 
monitor state performance. Such uncertainties include not knowing 
whether state resources are adequate to implement the plans and how 
best to monitor state reforms. In answering a survey question about PIP 
implementation challenges, a number of states identified insufficient 
funding, staff, and time--as well as high caseloads--as their greatest 
obstacles. Figure 2 depicts these results.

Figure 2: Most Common Challenges Affecting States' PIP Implementation:

[See PDF for image]

Note: This is based on responses from 25 states. The results reported 
in the figure are a sum of the states reporting that the issue was a 
challenge to PIP implementation to a very great extent, great extent, 
moderate extent, or some/little extent. States not included answered no 
extent, no basis to judge, or not applicable.

[End of figure]

One official from Pennsylvania commented that because of the state's 
budget shortfall, no additional funds were available for the state to 
implement its improvement plan, so most counties must improve outcomes 
with little or no additional resources. A Massachusetts official 
reported that fiscal problems in his state likely would lead the state 
to lay off attorneys and caseworkers and to cut funding for family 
support programs. While state officials acknowledged that they do not 
have specific estimates of PIP implementation expenses because they 
have not tracked this information in their state financial systems, 
many states indicated that to cope with financial difficulties, they 
had to be creative and use resources more efficiently to fund PIP 
strategies. Of the 26 states responding to a question in our survey on 
PIP financing, 12 said that they were financing the PIP strategies by 
redistributing current funding, and 7 said that they were using no-cost 
methods. In an example of the latter, Oklahoma officials reported 
pursuing in-kind donations from a greeting card company so that they 
could send thank-you notes to foster parents, believing this could 
increase foster parent retention and engagement. Aside from funding 
challenges, states also reported that PIP implementation has been 
affected by staff workloads, but these comments were mixed. In Wyoming, 
for example, caseworkers told us that their high caseloads would 
prevent them from implementing many of the positive action steps 
included in their improvement plan. In contrast, Oklahoma caseworkers 
told us that the improvement plan priorities in their state--such as 
finding permanent homes for children--have helped them become more 
motivated, more organized, and more effective with time management.

ACF officials expressed uncertainty about how best to monitor states' 
progress and apply estimated financial penalties when progress was slow 
or absent, and 3 of the 5 states we visited reported frustration with 
the limited guidance ACF had provided on the PIP quarterly reporting 
process. For example, 4 regional offices told us that they did not have 
enough guidance on or experience with evaluating state quarterly 
reports. Some regional offices told us they require states to submit 
evidence of each PIP action step's completion, such as training 
curricula or revised policies, but one ACF official acknowledged that 
this is not yet standard procedure, although the agency is considering 
efforts to make the quarterly report submission procedures more 
uniform. Moreover, ACF staff from 1 region told us that because PIP 
monitoring varies by region, they were concerned about enforcing 
penalties. Shortly before California's quarterly report was due, state 
officials told us they still did not know how much detail to provide; 
how to demonstrate whether they had completed certain activities; or 
what would happen if they did not reach the level of improvement 
specified in the plan. Based on data from the states that have been 
reviewed to date, the estimated financial penalties range from a total 
of $91,492 for North Dakota to $18,244,430 for California, but the 
impact of these potential penalties remains unclear. While ACF staff 
from most regional offices told us that potential financial penalties 
are not the driving force behind state reform efforts, some contend 
that the estimated penalties affect how aggressively states pursue 
reform in their PIPs. For example, regional office staff noted that 1 
state's separate strategic plan included more aggressive action steps 
than those in its PIP because the state did not want to be liable for 
penalties if it did not meet its benchmarks for improvement. State 
officials also had mixed responses as to how the financial penalties 
would affect PIP implementation. An official in Wyoming said that 
incurring the penalties was equivalent to shutting down social service 
operations in 1 local office for a month, while other officials in the 
same state thought it would cost more to implement PIP strategies than 
it would to incur financial penalties if benchmarks were unmet. 
Nevertheless, these officials also said that while penalties are a 
consideration, they have used the CFSR as an opportunity to provide 
better services. One official in another state agreed that it would 
cost more to implement the PIP than to face financial penalties, but 
this official was emphatic in the state's commitment to program 
improvement.

ACF's Focus Rests Almost Exclusively on Implementing the CFSR:

Since implementing the CFSRs, ACF has focused its activities almost 
entirely on the review process, and regional staff report that they are 
limited in helping states meet key federal goals. ACF officials told us 
the CFSR has become the agency's primary 
mechanism for monitoring states and facilitating program improvement, 
but they acknowledged that regional office staff might not have 
realized the full utility of the CFSR as a tool to integrate all 
existing training and technical assistance efforts. Further, according 
to ACF officials, meetings to discuss a new system of training and 
technical assistance are ongoing, though recommendations were not 
available at the time of publication of our April 2004 report. Levels 
of resource center funding, the scope and objectives of the resource 
centers' work, and the contractors who operate the resource centers are 
all subject to change before the current cooperative agreements expire 
at the close of fiscal year 2004.

ACF officials told us that the learning opportunities in the Children's 
Bureau are intentionally targeted at the CFSR, but staff in 3 regions 
told us that this training should cover a wider range of subjects--
including topics outside of the CFSR process--so that regional 
officials could better meet states' needs. All 18 of the courses that 
ACF has provided to its staff since 2001 have focused on such topics as 
writing final CFSR reports and using data for program improvement, and 
while ACF officials in the central office said that the course 
selection reflects both the agency's prioritization of the CFSR process 
and staff needs, our interviews with regional staff suggest that some 
of them wish to obtain additional non-CFSR training. In addition, 
although ACF organizes biennial conferences for state and federal child 
welfare officials, staff from 5 regions told us that they wanted more 
substantive interaction with their ACF colleagues, such as networking 
at conferences, to increase their overall child welfare expertise. 
Further, staff from 6 of the 10 regions told us that their 
participation in conferences is limited because of funding constraints.

ACF staff in all 10 regions provide ongoing assistance or ad hoc 
counseling to states by phone, e-mail, or on-site support, 
but staff from 6 regions told us they would like to conduct site visits 
with states more regularly to improve their relationships with state 
officials and provide more targeted assistance. Further, staff in 4 
regions felt their travel funds were constrained and explained that 
they try to stretch their travel dollars by addressing states' non-CFSR 
needs, such as court improvements, during CFSR-related visits. While an 
ACF senior official from the central office confirmed that CFSR-related 
travel constituted 60 percent of its 2002 child welfare monitoring 
budget, this official added that CFSR spending represents an infusion 
of funding rather than a reprioritization of existing dollars, and 
stated that regional administrators have discretion over how the funds 
are allocated within their regions. In addition, the same official 
stated that he knew of no instance in which a region requested more 
money for travel than it received.

Concerns from state officials in all 5 of the states we visited echoed 
those of regional office staff and confirmed the need for improvements 
to the overall training and technical assistance structure. For 
example, state officials in New York and Wyoming commented that ACF 
staff from their respective regional offices did not have sufficient 
time to spend with them on CFSR matters because regional staff were 
simultaneously occupied conducting reviews in other states. However, 
our survey results revealed that states reviewed in 2003 had much 
higher levels of satisfaction with regional office assistance than 
those states reviewed in 2001, which suggests that regional office 
training and technical assistance improved as the process evolved. 

Concluding Observations:

ACF and the states have devoted considerable resources to the CFSR 
process, but to date, no state has passed the threshold for substantial 
conformity on all CFSR measures, and concerns remain regarding the 
validity of some data sources and the limited use of all available 
information to determine substantial conformity. The majority of states 
surveyed agreed that CFSR results are similar to their own evaluation 
of areas needing improvement. However, without using more reliable 
data--and in some cases, additional data from state self-assessments--
to determine substantial conformity, ACF may be over- or 
underestimating the extent to which states are actually meeting the 
needs of the children and families in their care. These over- or 
underestimates 
can, in turn, affect the scope and content of the PIPs that states must 
develop in response.

In addition, the PIP development, approval, and monitoring processes 
remain unclear to some, potentially reducing states' credibility with 
their stakeholders and straining the federal/state partnership. 
Similarly, regional officials are unclear as to how they can accomplish 
their various training and technical assistance responsibilities, 
including the CFSR. Without clear guidance on how to systematically 
prepare and monitor PIP-related documents, and how regional officials 
can integrate their many oversight responsibilities, ACF has left state 
officials unsure of how their progress over time will be judged and 
potentially complicated its own monitoring efforts.

To ensure that ACF uses the best available data in measuring state 
performance, we recommended in our April 2004 report that the Secretary 
of HHS expand the use of additional data states may provide in their 
statewide assessments and consider alternative data sources when 
available, such as longitudinal data that track children's placements 
over time, before making final CFSR determinations. In addition, to 
ensure that ACF regional offices and states fully understand the PIP 
development, approval, and monitoring processes, and that regional 
offices fully understand ACF's prioritization of the CFSR as the 
primary mechanism for child welfare oversight, we recommended that the 
Secretary of HHS issue clarifying guidance on the PIP process, 
evaluate states' and regional offices' adherence to this instruction, 
and provide guidance to regional offices explaining how to better 
integrate the many training and technical assistance activities for 
which they are responsible, such as participation in state planning 
meetings and the provision of counsel to states on various topics, with 
their new CFSR responsibilities. In response to the first 
recommendation, HHS acknowledged that the CFSR is a new process that 
continues to evolve, and also noted several steps it has taken to 
address the data quality concerns we raised in that report. We believe 
that our findings from the April 2004 report, as well as a previous 
report on child welfare data and states' information systems, fully 
acknowledge HHS's initial actions, as well as the substantial resources the 
agency has already dedicated to the review process. However, to improve 
its oversight of state performance, our recommendation was meant to 
encourage HHS to take additional actions to improve its use of data in 
conducting these reviews. In response to the second recommendation, HHS 
said that it has continued to provide technical assistance and training 
to states and regional offices, when appropriate. HHS noted that it is 
committed to continually assessing and addressing training and 
technical assistance needs. In this context, our recommendation was 
intended to encourage HHS to enhance existing training efforts and 
focus both on state and on regional officials' understanding of how to 
incorporate the CFSR process into their overall improvement and 
oversight efforts.

Mr. Chairman, this concludes my prepared statement. I would be pleased 
to respond to any questions that you or other members of the 
subcommittee may have.

[End of section]

Appendix I: GAO Contacts and Acknowledgments:

GAO Contacts:

For further contacts regarding this testimony, please call 
Cornelia M. Ashby at (202) 512-8403. Individuals making key 
contributions to this testimony include Diana Pietrowiak and Joy 
Gambino.

[End of section]

Related GAO Products:

D.C. Family Court: Operations and Case Management Have Improved, but 
Critical Issues Remain. GAO-04-685T. Washington, D.C.: April 23, 2004.

Child and Family Services Reviews: Better Use of Data and Improved 
Guidance Could Enhance HHS's Oversight of State Performance. 
GAO-04-333. Washington, D.C.: April 20, 2004.

Child Welfare: Improved Federal Oversight Could Assist States in 
Overcoming Key Challenges. GAO-04-418T. Washington, D.C.: 
January 28, 2004.

D.C. Family Court: Progress Has Been Made in Implementing Its 
Transition. GAO-04-234. Washington, D.C.: January 6, 2004.

Child Welfare: States Face Challenges in Developing Information Systems 
and Reporting Reliable Child Welfare Data. GAO-04-267T. Washington, 
D.C.: November 19, 2003.

Child Welfare: Enhanced Federal Oversight of Title IV-B Could Provide 
States Additional Information to Improve Services. GAO-03-956. 
Washington, D.C.: September 12, 2003.

Child Welfare: Most States Are Developing Statewide Information 
Systems, but the Reliability of Child Welfare Data Could Be Improved. 
GAO-03-809. Washington, D.C.: July 31, 2003.

D.C. Child and Family Services: Key Issues Affecting the Management of 
Its Foster Care Cases. GAO-03-758T. Washington, D.C.: May 16, 2003.

Child Welfare and Juvenile Justice: Federal Agencies Could Play a 
Stronger Role in Helping States Reduce the Number of Children Placed 
Solely to Obtain Mental Health Services. GAO-03-397. Washington, D.C.: 
April 21, 2003.

Foster Care: States Focusing on Finding Permanent Homes for Children, 
but Long-Standing Barriers Remain. GAO-03-626T. Washington, D.C.: April 
8, 2003.

Child Welfare: HHS Could Play a Greater Role in Helping Child Welfare 
Agencies Recruit and Retain Staff. GAO-03-357. Washington, D.C.: 
March 31, 2003.

Foster Care: Recent Legislation Helps States Focus on Finding Permanent 
Homes for Children, but Long-Standing Barriers Remain. GAO-02-585. 
Washington, D.C.: June 28, 2002.

District of Columbia Child Welfare: Long-Term Challenges to Ensuring 
Children's Well-Being. GAO-01-191. Washington, D.C.: December 29, 2000.

Child Welfare: New Financing and Service Strategies Hold Promise, but 
Effects Unknown. GAO/T-HEHS-00-158. Washington, D.C.: July 20, 2000.

Foster Care: States' Early Experiences Implementing the Adoption and 
Safe Families Act. GAO/HEHS-00-1. Washington, D.C.: December 22, 1999.

Foster Care: HHS Could Better Facilitate the Interjurisdictional 
Adoption Process. GAO/HEHS-00-12. Washington, D.C.: November 19, 1999.

Foster Care: Effectiveness of Independent Living Services Unknown. GAO/
HEHS-00-13. Washington, D.C.: November 10, 1999.

Foster Care: Kinship Care Quality and Permanency Issues. GAO/HEHS-99-
32. Washington, D.C.: May 6, 1999.

Juvenile Courts: Reforms Aim to Better Serve Maltreated Children. GAO/
HEHS-99-13. Washington, D.C.: January 11, 1999.

Child Welfare: Early Experiences Implementing a Managed Care Approach. 
GAO/HEHS-99-8. Washington, D.C.: October 21, 1998.

Foster Care: Agencies Face Challenges Securing Stable Homes for 
Children of Substance Abusers. GAO/HEHS-98-182. Washington, D.C.: 
September 30, 1998.

FOOTNOTES

[1] The CFSR measures state performance on 45 performance items, which 
correspond to 7 outcomes and 7 systemic factors. The outcomes relate to 
children's safety, permanency, and well-being, and the systemic factors 
address state agency management and responsiveness to the community. 
Six national standards, as reported in the Adoption and Foster Care 
Analysis and Reporting System (AFCARS) and the National Child Abuse and 
Neglect Data System (NCANDS), apply to 5 of the 45 items. Three of 
these standards are based on the 75th percentile of all states' 
performance--adoption; stability of foster care placements; and length 
of time to achieve reunification, guardianship, or permanent placement 
with relatives--because a higher incidence is desirable. However, the 
remaining three standards--recurrence of maltreatment, incidence of 
child abuse/neglect in foster care, and foster care re-entries--are 
based on the 25th percentile of state performance because lower 
incidence is a desired outcome for these measures. 
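
To illustrate the percentile calculation this footnote describes, here 
is a minimal sketch in Python (ours, not ACF's; the state values shown 
are hypothetical):

    # Illustrative only: derive a national standard from state-level
    # performance figures, as described in this footnote. Where higher
    # incidence is desirable, the standard is the 75th percentile of
    # all states' performance; where lower incidence is desirable, it
    # is the 25th percentile.
    import statistics

    def national_standard(state_values, higher_is_better):
        # statistics.quantiles with n=4 returns the three quartile cut
        # points: index 0 is the 25th percentile, index 2 the 75th.
        q25, _, q75 = statistics.quantiles(state_values, n=4)
        return q75 if higher_is_better else q25

    # Hypothetical state rates for a measure where higher is better.
    adoption_rates = [22.0, 28.5, 31.0, 35.2, 40.1]
    print(national_standard(adoption_rates, higher_is_better=True))

    # Hypothetical state rates for a measure where lower is better.
    recurrence_rates = [5.1, 6.4, 7.0, 8.2, 9.9]
    print(national_standard(recurrence_rates, higher_is_better=False))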

[2] U.S. General Accounting Office, Child and Family Services Reviews: 
Better Use of Data and Improved Guidance Could Enhance HHS's Oversight 
of State Performance (GAO-04-333, April 20, 2004). 

[3] We achieved a 98 percent response rate from this survey; Puerto 
Rico was the only non-respondent. 

[4] The 10 states participating in our phone follow-up surveys were 
Arkansas, Iowa, Kansas, Mississippi, North Dakota, New Jersey, 
Pennsylvania, Rhode Island, Utah, and West Virginia. 

[5] ACF has established cooperative agreements with 10 national 
resource centers to help states implement federal legislation intended 
to ensure the safety, permanency, and well-being of children and 
families. ACF sets the resource centers' areas of focus, and although 
each center has a different area of expertise, such as organizational 
improvement or information technology, all of them conduct needs 
assessments, sponsor national conference calls with states, collaborate 
with other resource centers and agencies, and provide on-site training 
and technical assistance to states. 

[6] Title IV-B of the Social Security Act, consisting of two subparts, 
is the primary source of federal funding for services to help families 
address problems that lead to child abuse and neglect and to prevent 
the unnecessary separation of children from their families. Funding 
under Title IV-E of the Social Security Act is used primarily to pay 
for the room and board of children in foster care.

[7] The term stakeholder refers to two groups: (1) agency stakeholders, 
such as judges or advocates, whose responsibilities are closely related 
to the work of the child welfare agency and who can comment on the 
agency's overall performance on outcomes and systemic factors, and (2) 
case-specific stakeholders, such as parents, caseworkers, children, or 
others who are interviewed to provide first-hand information that 
supplements reviewers' assessment of paper or electronic case files.

[8] States began voluntarily reporting to NCANDS in 1990, and in 1995 
started reporting to AFCARS on the demographic characteristics of 
adoptive and foster children and their parents, as well as foster 
children's type of placement and permanency goals. We recently issued a 
report on states' child welfare information systems and the reliability 
of child welfare data. U.S. General Accounting Office, Child Welfare: 
Most States Are Developing Statewide Information Systems, but the 
Reliability of Child Welfare Data Could Be Improved, GAO-03-809 
(Washington, D.C.: July 31, 2003).

[9] States achieve substantial conformity on outcomes and systemic 
factors when at least 90 percent of applicable cases are rated as 
having substantially achieved them; stakeholder interviews confirm 
that state plan and other 
program requirements are in place and functioning as described in the 
applicable regulations or statute; and performance on items with 
national standards, where applicable, meets the applicable threshold. 
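
The determination this footnote describes is conjunctive; a minimal 
sketch (ours, for illustration only; the simplified inputs are 
assumptions) encodes the three conditions:

    # Illustrative only: the substantial-conformity test as described
    # in this footnote, reduced to three simplified inputs.
    def substantially_conforms(share_cases_achieved,
                               interviews_confirm_requirements,
                               meets_national_standards=True):
        # The default of True covers outcomes and systemic factors
        # with no applicable national standard.
        return (share_cases_achieved >= 0.90
                and interviews_confirm_requirements
                and meets_national_standards)

    # Hypothetical state: 92 percent of applicable cases substantially
    # achieved, but an applicable national standard was not met.
    print(substantially_conforms(0.92, True,
                                 meets_national_standards=False))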

[10] The formula for calculating penalties is based in part on each 
state's allocation of federal child welfare funds from Titles IV-B and 
IV-E and the number of outcomes and systemic factors for which 
substantial conformity has not been achieved. 

[11] California and Puerto Rico were determined to be in substantial 
conformity on 2 outcomes and systemic factors, while North Dakota 
achieved substantial conformity on 9. 

[12] These values are state-reported and reflect officials' estimates 
of costs associated with all CFSR-related activities except those 
incurred during PIP implementation. In reporting on their expenses, 
states were instructed to include the value of training, travel, 
infrastructure, technology, food, administrative supplies, and any 
other expenses associated with the CFSR process. States were also asked 
to provide supporting documentation for this particular question, but 
most states were unable to provide documentation. Many states reported 
that they did not track CFSR-related expenses. The 25 states that did 
provide estimates were in different phases of the CFSR. 

[13] The number of FTEs participating in each phase of the CFSR is 
state-reported. While states were not given specific instructions for 
how to calculate FTEs, they were asked to report only on the phases of 
the CFSR that they had started or completed. Therefore, states' 
responses varied depending on the phase of the CFSR process they were 
in and the methods they used to calculate FTEs. 

[14] GAO-03-809.

[15] ACF provides states with their statewide data about 6 months prior 
to the on-site review, during which time states are allowed to make 
corrections to the data and resubmit the updated data so that they can be 
when determining state conformity with CFSR measures. 

[16] According to our calculations--which assumed that the attribute of 
interest occurred in about 50 percent of the cases--a sample size of 50 
would produce percentage estimates with a 95 percent margin of error of 
approximately plus or minus 14 percentage points. This level of 
variability is a limitation when attempting to interpret estimates 
based on this sample size. 
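
A minimal sketch of this margin-of-error arithmetic (ours, using the 
standard normal approximation for a proportion):

    # Illustrative only: 95 percent margin of error for a proportion
    # estimated from a simple random sample of n cases.
    import math

    def margin_of_error(p, n, z=1.96):
        # Half-width of the 95 percent confidence interval around p.
        return z * math.sqrt(p * (1 - p) / n)

    # Attribute assumed to occur in about 50 percent of cases; n = 50.
    print(round(margin_of_error(0.5, 50) * 100, 1))  # prints 13.9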

[17] Because 1 of the 2 cases applicable to the adoption measure was 
assigned a rating of area needing improvement, 50 percent of the cases 
for this item were assigned a rating of area needing improvement. As a 
result, the item was given an overall rating of area needing 
improvement since both cases would have needed to be assigned a rating 
of strength for this item to meet the 85 percent threshold necessary to 
assign an overall rating of strength.
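
The rating arithmetic in this footnote can be expressed as a short 
sketch (ours, simplified; the CFSR's actual procedures involve reviewer 
judgment on each case):

    # Illustrative only: overall item rating from case-level ratings,
    # using the 85 percent strength threshold described above.
    def item_rating(cases_rated_strength, applicable_cases):
        share = cases_rated_strength / applicable_cases
        return "strength" if share >= 0.85 else "area needing improvement"

    # Wyoming's adoption item: 2 applicable cases, 1 rated a strength,
    # so 50 percent falls short of the 85 percent threshold.
    print(item_rating(1, 2))  # prints "area needing improvement"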

[18] An ACF statistician also confirmed that the CFSR sample is too 
small to generalize to the states' populations and that the three 
sites from which cases are selected also are not representative. 

[19] Virginia requested an additional case review to resolve a 
discrepancy between the statewide data and on-site review findings for 
the item measuring the state's performance on foster care re-entries. 
According to an ACF regional official, the state met the national 
standard for this item but the case review findings showed the state 
did not meet the threshold for this measure. At the time of publication 
of our April 2004 report, ACF and the state were still finalizing plans 
to conduct the additional case review, and until the review is 
completed, the state cannot receive its final report.

[20] As we reported in our April 2004 report, only Delaware and North 
Carolina had completed the 2-year term of their PIPs, and ACF was still 
analyzing the states' progress and had not determined whether there had 
been overall improvement or whether it would apply financial penalties.

[21] Although 41 states were developing or implementing PIPs when our 
April 2004 report was published, we reviewed the 31 available PIPs that 
ACF had approved as of January 1, 2004.