
THE IMPLEMENTATION OF GPRA AT THE OFFICE OF CHILD SUPPORT ENFORCEMENT


U.S. Department of Health and Human Services
Administration for Children and Families
Office of Child Support Enforcement
February, 1996

This report was prepared under contract No. ACF-950213 by The Center for the Support of Children, Washington D.C.


TABLE OF CONTENTS

INTRODUCTION

GOVERNMENT PERFORMANCE AND RESULTS ACT

CHOOSING GPRA PILOTS

REQUIREMENTS OF GPRA PILOTS

DEVELOPING THE STRATEGIC PLAN

DEVELOPING PERFORMANCE MEASURES

PERFORMANCE PLANNING

OTHER GPRA ACTIVITIES

WHERE DOES OCSE GO FROM HERE?

SUMMARY OF LESSONS LEARNED




INTRODUCTION

     In 1993, Congress passed a law intended to have a profound 
effect on the way that agencies of the Federal government conduct 
business.  The Government Performance and Results Act (GPRA) is an 
attempt by Congress and the Clinton Administration to focus attention 
on the performance of Federal programs, to measure the results of 
those programs, and to look at the outcomes achieved by those 
programs.

     The legislation mandated that the Office of Management and 
Budget (OMB) select pilot projects in a number of Federal agencies 
before the Act could be fully implemented throughout the Federal 
government in 1997.  These pilots would begin work on strategic 
planning and performance measurement.  The success of their efforts 
would then be evaluated before further implementation of GPRA by the 
rest of the Federal government. 

     In the Department of Health and Human Services, the 
Administration for Children and Families' Office of Child Support 
Enforcement (OCSE) was selected as one GPRA pilot project.  This 
project is one of the few GPRA pilots that involves a program in 
which the Federal government and the States work together in a 
partnership.  Therefore, the effort to develop a strategic plan and 
performance measures for OCSE has been quite different from that of a 
strictly Federal program.  For the OCSE project to be successful, it 
is essential that the Federal Office of Child Support Enforcement 
work with its State partners in developing both the strategic plan 
and the performance measures on which the success of the OCSE program 
is to be judged.

     This report examines the history of the implementation of GPRA 
at OCSE from the beginning of the strategic planning effort in April, 
1994 through several phases in the development of performance 
measures.  It ends with the adoption, in February 1996, of a final 
draft set of performance measures which will be submitted to the IV-D 
directors for the States' consideration in the Spring of 1996.  The 
history is intended to provide a useful guide to other government 
agencies, particularly those involving Federal/State partnerships, as 
they begin the tasks of implementing GPRA and developing strategic 
plans and performance measures. By detailing the successes and 
failures of one pilot project, and sharing the lessons learned, this 
history attempts to make GPRA implementation easier for others.  As 
the documentation of a GPRA pilot project involving a Federal/State 
partnership, this report will also help States and local governments 
as they attempt strategic planning and performance measurement. 

     First, the report gives a brief history of GPRA and discusses 
the contents of the legislation.  It also looks at the reasons that 
OCSE applied to be a pilot project and why it was chosen.  Second, 
this report talks about the requirements of GPRA pilot projects and 
details in more depth the history of the strategic planning effort at 
OCSE as well as the development of performance measures and 
performance plans. The report also touches on the other OCSE efforts 
that have become part of the GPRA project, including the state pilot 
projects.  Finally, the report looks at the future direction of OCSE 
and summarizes the lessons learned by the agency during the first 
phases of the GPRA implementation effort.



GOVERNMENT PERFORMANCE AND RESULTS ACT OF 1993

The Background of GPRA

     On October 3, 1990, Sen. William V. Roth introduced the "Federal 
Program Performance Standards and Goals Act of 1990".  The bill was 
referred that day to the Committee on Governmental Affairs. [1]  The 
following January, Sen. Roth reintroduced the bill in a slightly 
revised form as S.20 and, again, it was referred to the Committee, 
which held hearings on the bill in May 1991 and, again, one year 
later. [2]  The Committee considered S. 20 on August 5, 1992 and 
adopted an amendment retitling the bill the "Government Performance 
and Results Act of 1992."  The amendment included, among other 
changes, the mandate that GPRA be implemented in a set of pilot 
projects before full implementation of the law throughout the 
government. The following January, S.20 was reintroduced and again 
referred to Committee and more hearings were held.  The Committee 
voted to report the bill favorably on March 27, 1993.[3]

     In the House of Representatives, H.R.826 was introduced by Reps. 
Conyers, Clinger, and McDade on February 4, 1993.  There was strong 
administration support from the OMB and the National Performance 
Review.  The bill was also supported by the General Accounting Office 
(GAO) which, since 1973, had produced over 70 reports on performance 
measures. [4]

     Legislation passed the House on May 25 and the Senate on June 
23. President Clinton signed GPRA on August 3, 1993, calling it "an 
important first step in the efforts to reform the way the federal 
government operates and relates to the American people." [5]

The Contents of the Act

     The purpose of GPRA is to improve the efficiency and 
effectiveness of Federal programs by establishing a system to set 
goals for program performance and to measure program results.

     GPRA forces agencies to focus on program results, service 
quality, and customer satisfaction by requiring strategic planning 
and performance measurement.  Under the Act, agencies must set 
program goals and then publicly report on their progress toward 
achieving those goals.

     The purposes of the GPRA, as stated in the legislation, are to:

-  improve the confidence of the American people in the 
         capability of the Federal Government by systematically 
          holding Federal agencies accountable for achieving program 
          results;

-  initiate program performance reform with a series of pilot 
          projects in setting program goals, measuring program 
          performance against these goals, and reporting publicly on 
          their progress;

-  improve Federal program effectiveness and public 
          accountability by promoting a new focus on results, service 
          quality, and public satisfaction;

-  help Federal managers improve service delivery by requiring 
          that they plan for meeting program objectives and by 
          providing them with information about program results and 
          service quality;

-  improve congressional decisionmaking by providing more 
          objective information on achieving statutory objectives, 
          and on the relative effectiveness and efficiency of Federal 
          programs and spending; and

-  improve internal management of the Federal Government.


Implications for Federal Agencies in the Future

     By September 30, 1997, all executive agencies will be required 
to submit to OMB and to the Congress a five-year strategic plan for 
their programs.  These strategic plans, which will then be submitted 
every three years, are to include a mission statement covering major 
functions and operations of the agency and general goals and 
objectives of the agency, as well as the approach and necessary 
resources to be used in achieving those goals and objectives.  
Agencies must also identify any "key external factors" that might 
have a significant effect on the agency's ability to achieve the 
general goals and objectives.  In addition, each agency must describe 
any program evaluations used in establishing or revising the goals 
and objectives (including plans for future evaluations).

     Beginning October 1, 1997, each Federal agency will also be 
required to prepare an annual performance plan.  The first plan will 
be for Fiscal Year 1999.  As with the pilot projects, these plans 
will cover each program activity in the agency's budget and establish 
performance goals.  These goals will then define the performance 
level to be achieved by a program activity.  The goals, whenever 
possible, are to be expressed in an objective, quantifiable, and 
measurable form. Performance indicators will be used to measure the 
relevant outputs, outcomes, and/or service levels for each program 
activity.  The performance plans will also describe the operational 
processes and resources needed to meet the performance goals and will 
establish a procedure for comparing actual program results with the 
performance goals.

     Beginning March 31, 2000, and every year thereafter, agencies 
will be required to publish annual program performance reports.  
(Each report will be due six months after the end of the fiscal year 
on which it is based.)  The reports will compare actual program 
performance, as measured by the indicators established in the 
performance plan, with the performance goals in that year's plan.  
These reports will also discuss the agency's success in achieving the 
performance goals and will describe and explain those cases in which 
performance goals have not been met.


GPRA Pilot Projects

     GPRA has provisions for pilot testing three different key 
concepts before implementing the Act throughout the government in 
1997:  annual performance plans and reports, managerial flexibility, 
and performance budgeting.  OMB is charged with assessing the costs, 
benefits, and usefulness of each concept.

     The OCSE's pilot project is testing the first concept -- that 
is, it is developing annual performance plans and performance 
reports.  GPRA requires the testing of performance plan and report 
preparation over a three-year period by at least 10 agencies.  
However, OMB decided that there should also be a second group of 
pilot projects of this type that would run for two years.  These 
projects would be geared specifically toward multi-agency functions 
or joint Federal/State/local activities.  This is the category of 
pilot project that applies to the OCSE efforts.  
 
     Pilot agencies are required to write performance plans which 
describe annual performance goals and objectives, summarize the 
resources to be used, and list the indicators to be used to measure 
performance.  Annual performance reports by the pilot agencies then 
describe how well each agency did in meeting the goals and objectives 
outlined in the performance plan.

     OMB will then assess the performance measurement and 
goal-setting concepts of GPRA during the implementation of these 
pilot projects and it will identify any significant difficulties 
experienced by agencies.



CHOOSING GPRA PILOTS

Why OCSE Applied

      The OCSE was, in many ways, well-suited to becoming a pilot. 
OCSE began with the enactment of title IV-D of the Social Security 
Act, in 1975, for the purpose of "establishing and enforcing the 
support obligations owed by noncustodial parents to their children 
and the spouse or former spouse with whom the children may be 
living." [6]  The program is federally funded, but administered by 
States and local governments and, as such, is a true Federal/State 
partnership. The legislation authorized the States' use of federal 
matching funds for enforcing support obligations owed by noncustodial 
parents, locating absent parents, establishing paternity, and 
obtaining child and spousal support. The States were given 
responsibility for administering the child support enforcement 
program while the federal government's role was to fund, monitor, and 
evaluate the State programs and provide technical assistance to the 
States.

     At the time of its pilot application, OCSE's Eighteenth Annual 
Report to Congress, covering the fiscal year ending September 30, 
1993, reported that the child support enforcement program had 
established paternity for over half a million children and collected 
nearly $9 billion in child support. [7]  In FY 1993, the IV-D child 
support enforcement caseload consisted of more than 17 million 
cases. [8]  Fiscal year 1995 data shows that the caseload has expanded 
to over 20 million cases and that collections have increased to 
$10.753 billion.  This growth trend has continued since the year the 
program began. 
     
     There were several factors surrounding the timing of the passage 
of GPRA that made OCSE ideally suited to participate as a pilot 
project.  OCSE had been part of a larger effort to develop strategic 
plans in the past, but never before had there been an attempt to 
include the States in the development of such a plan. By August 1993, 
when GPRA passed, there were already ongoing attempts at performance 
measurement and focusing on results within the agency and the States.  
If OCSE became a GPRA pilot, there would be a perfect opportunity to 
develop those efforts further.  

     There had also been recent attempts by OCSE to work with its 
State partners in the Measuring Excellence Through Statistics (METS) 
initiative, as described in more detail below. OCSE welcomed GPRA as 
an opportunity to build on these partnership efforts. 

     In addition, given that all agencies would have to follow the 
requirements of GPRA eventually, it made sense to those at OCSE to be 
a pilot project and, thus, get a head start on implementing a law 
that would affect the agency in the future.  

     By participating as a pilot agency, OCSE could take advantage of 
the additional help that would be available to the pilots in the form 
of technical assistance.  There might also be added attention focused 
on OCSE as a pilot that could generate some good publicity for the 
agency.  Further, being a pilot agency gave OCSE the opportunity to 
be on the cutting edge of federal change and have input into that 
change. OCSE also had an interest in the managerial waivers that 
might be made available to GPRA pilots in the "second wave" of 
piloting the key concepts in the legislation.

     Additionally, there was already a change occurring in the audit 
function at OCSE.  There was widespread agreement that the focus 
needed to change from looking at process to looking at results.  This 
change in focus had been evolving over time.  This shift fit neatly 
into the GPRA framework.

     Finally, there was an assumption in the agency that there would 
be welfare reform in the coming months.  The knowledge that 
legislation would be proposed and enacted which might change the 
welfare system generally and the child support enforcement program 
specifically in fundamental ways gave an added immediacy to the task 
of developing performance measures.  The proposed legislation 
included a focus on measuring the results of the child support 
enforcement program.
 

Previous OCSE Reform Efforts

     There had been several efforts to begin strategic planning, 
performance measurement, and audit reform at OCSE.  Assistant 
Secretary Mary Jo Bane had created a Performance Measurement Project 
Team at ACF in August 1993.  This Team had been asked "to explore and 
make recommendations for ways in which ACF could respond to the need 
to measure the results of its programs and initiatives, and to 
incorporate those measures in strategic planning and budgeting 
activities."[9]

     Originally, the team was created as part of a "quick start" 
toward the implementation of total quality management principles.  
However, there were a number of concurrent events which influenced 
the team and made its task even more important and immediate, 
including "the issuance of Vice President Gore's National Performance 
Review (NPR), the passage of the GPRA and President Clinton's signing 
of Executive Orders designed to put the government on a more 
results-oriented, customer-focused footing."[10]

     Because of this coincidence of timing, the Performance 
Measurement Project Team was able to play a part in recommending and 
brokering the application of the Child Support Enforcement Program to 
participate as one of OMB's GPRA implementation pilots.

     As noted above, another project that was an important precursor 
to the GPRA pilot was the METS initiative.  This was an attempt by 
OCSE and the States to improve the quality of the States' data 
collection efforts.  The effort involved numerous meetings between 
OCSE and State child support enforcement directors and the 
solicitation of comments and feedback from the States.  From 
February, 1992 through late 1993, the METS effort included attempts 
to revise federal reporting forms and the instructions that States 
are required to use to report program data.  Data definitions and 
reporting procedures were also to be revised.  

     The audit function at OCSE was undergoing a gradual and 
fundamental change at the same time.  The States, the American Public 
Welfare Association (APWA), the National Governors' Association (NGA) 
and OCSE worked together to develop ideas about how to focus the 
federal audit process on outcomes rather than on process.  
Preliminary attempts were also made to think about results-oriented 
performance measures.  

     The Federal statute mandated periodic comprehensive Federal 
audits of State programs to ensure substantial compliance with all 
Federal requirements.  If deficiencies identified in an audit were 
not corrected, States faced a mandatory fiscal penalty of between one 
and five percent of the Federal share of the State's AFDC program 
funding.  The current detail-oriented and process-oriented audit has 
been viewed by many as being time-consuming and labor intensive for 
both Federal auditors and the States.  One result is that audit 
findings do not measure current State performance or current program 
requirements. States contend that the audit system focuses too much 
on administrative procedures and processes rather than performance 
outcomes and results.  However, it is widely agreed that efforts to 
pass the audit have been a significant driving force behind the 
States' improved performance.  While two-thirds of the States fail 
the initial audit, three-fourths of these same States come into 
compliance after a corrective-action period, thereby avoiding the 
financial penalty.

     There have been several attempts to change the audit process so 
that it focuses more on outcomes. The current proposal in the welfare 
reform legislation simplifies the Federal audit requirements.  The 
States would be required to conduct self-reviews to assess whether or 
not all required services are being provided.  Federal auditors would 
assess States' data used to determine performance outcomes to assess 
whether it is valid and reliable and conduct periodic financial and 
other audits as the Secretary deems necessary.  If State self-reviews 
indicate that services are not being provided, OCSE would evaluate 
the State's program, ascertain the causes for the problems, and help 
States correct the problems. 

     The existence of all of the above efforts -- strategic planning, 
performance measurement, the METS initiative, and audit reform -- 
made OCSE interested in and well-prepared to be a part of the GPRA 
pilot effort.  The work previously done in all of these areas was 
essential to the success that OCSE would have with the GPRA work it 
would undertake.                                              


Why OCSE Was Chosen

     The Office of Management and Budget used several criteria for 
selecting initial pilot projects to implement the GPRA.  The key 
factor OMB would consider, according to its memo requesting 
nominations for pilot projects from the agencies, was whether an 
agency was currently collecting and reporting on performance. For an 
agency to be selected by OMB, it also had to have some capability to 
determine or estimate the costs and benefits of the planning and 
reporting process.  Furthermore, although OMB did not require an 
agency to have a current strategic plan, the agency should have long 
term goals and objectives related to its proposed pilot.

     As mentioned above, the OCSE had already begun efforts toward 
performance measurement, although it did not yet have a strategic 
plan.  The OCSE was also experienced at collecting and analyzing 
data, and although there were questions about the reliability of the 
data collected, the OCSE was addressing these reliability issues 
through the METS initiative. In addition, audit reform was beginning 
and the States were in the process of developing automated systems 
which would make gathering of necessary data easier. 

     These factors -- the ability to quantify performance results, 
the Federal/State partnership, and the data collection capacity -- 
all added to OCSE's suitability to be chosen as a GPRA pilot agency.
      


REQUIREMENTS OF GPRA PILOTS

     OMB specified a number of items that the agencies were required 
to provide as they applied to be GPRA pilot projects.  Included in 
these were:

     1.   The agency component(s), organization(s), or activities 
          that would form the pilot project.
     2.   The approximate amount of FY 1994 spending and the number 
          of FTEs that would be covered by the pilot.
     3.   Whether the agency currently has, or will have (not later 
          than FY 1996), a strategic plan covering the pilot project 
          function or operation.  (The Act requires that pilot 
          agencies have a strategic plan for at least one year of the 
          three-year pilot project period.)
     4.   An outline of the type or nature of the performance goals 
          that would be included in the performance plan(s).
     5.   A summary of the general nature and extent of any current 
          measurement of program performance for the function or 
          operation.
     6.   An indication whether, at the end of the pilot project 
          period, the agency could estimate the costs and benefits of 
          measuring performance and preparing the annual performance 
          plans and program performance reports.[11]

     Furthermore, pilots needed to be able to produce an annual 
performance plan setting measurable performance goals, and an annual 
report comparing actual performance to the target goals. A copy of 
the OCSE pilot project nomination, the strategic plan and the 
proposed performance indicators, and annual performance plans are 
included in the Appendix to this report.



DEVELOPING THE STRATEGIC PLAN  

     In October, 1993, Leon Panetta, then the OMB director, sent a 
memo to the Department of Health and Human Services soliciting 
nominations for GPRA pilot projects.  These nominations were due to 
the OMB by November 2, 1993.  In January, 1994 Secretary Shalala 
received a letter from Panetta designating the HHS as a pilot project 
for performance measurement under the GPRA.  Two projects were 
designated: a three-year project in the Social Security 
Administration and a two-year project in the Administration for 
Children and Families' Child Support Enforcement Program.

     Those who were working on the pilot project at OCSE knew that 
any project would have to include their State "partners," even though 
the federal legislation would not directly affect the States.  
Focusing the GPRA effort only at the OCSE would make little sense 
because it is the States that truly implement child support 
enforcement and provide almost all of the direct services to the 
public.  Therefore, it was essential to get the States to join in the 
effort if it was to be successful. Only by working with the States 
could the OCSE have an impact on the program's performance results.

     In late March, 1994, Judge David Gray Ross, the Deputy Director 
of the OCSE, sent a "Dear Colleague" letter to the State IV-D 
directors.  In this letter, he announced that OCSE was designated a 
GPRA pilot and he asked for their help on the performance plans 
required by the GPRA as well as their input into the development of 
performance indicators.

     Efforts to start designing a strategic planning document for the 
agency began at the end of April, 1994, when a group of participants 
from OCSE's central office met with Mike English and Hap Hadd, of the 
HHS Office of the Assistant Secretary for Management and Budget. 
Regional Office program managers were connected to this meeting via 
conference call. The group met and developed a first draft of a 
Strategic Plan.  This first draft was distributed and discussed with 
State IV-D directors at their annual meeting in Virginia Beach, 
Virginia in early May.

     The first draft of the Plan was intended by the central staff to 
provide a starting point for later discussions with the States.  They 
felt they needed to start with something written that the States 
could then react to, rather than starting the process with nothing.  
In fact, this first draft included many key elements that would 
remain in it throughout the planning process and be included in the 
final version that was adopted many months later.  However, there 
were also important changes made to the Plan as well as certain 
shifts in emphasis that were crucial to the Plan's ultimate 
acceptance by all partners.

     The May version of the Strategic Plan began with an Introduction 
that included some of the assumptions on which the Plan was based, a 
Vision Statement for OCSE, a Mission Statement, a list of Critical 
Success Factors, and five Goals along with Objectives for each Goal. 
It also included the beginnings of some performance indicators.

     Comments the OCSE staff received at the IV-D directors' annual 
meeting made it clear that the strategic planning effort should go no 
further without including State partners in the process.  Judge Ross, 
therefore, sent a letter on May 20 to the IV-D directors to ask for 
their help in developing a Strategic Plan for the national program, 
solicit from them proposals for two-year GPRA pilot projects in their 
States, and ask the directors to circulate the draft Strategic Plan 
in their States and get feedback from other interested parties. 
     
     At the end of May, further work was done on the Strategic Plan 
by OCSE staff and regional program managers (again, by phone) leading 
to a June 1st version of the document that was to be distributed at 
the Eastern Regional Interstate Child Support Association (ERICSA) 
Annual Conference in New Orleans.  At that conference, open houses 
were held to obtain input into the Plan.  These were attended by a 
cross section of State and local IV-D personnel, Federal staff, and 
advocacy group representatives. Again, the major comment received was 
that no further action should occur without State involvement.

     The June version of the Plan was not very different from the one 
written at the end of May. It was still a Plan that had been 
developed chiefly by the OCSE central office staff with input from 
the regional managers, but it was clearly seeking input from State 
partners.  A section was added about the "Overall Approach" that 
would be used to develop the Plan, acknowledging that "all partners 
involved in administering the Program need to be involved in 
developing and evolving the initial goals and objectives...." [12]  

     During the month of June, a Core Team was established to 
continue to refine and complete the work on the draft Strategic Plan.  
The National Council of Child Support Enforcement Administrators sent 
two representative IV-D directors to serve on this Core Team.  The 
OCSE regional program managers also named two people to the Core 
Team, one from Region X (Seattle) and one from Region II (New York).  
The OCSE central office had five representatives on the Core Team.  
Also during June, the draft Strategic Plan was discussed 
throughout the 10 HHS regions during various conference calls in an 
effort to solicit comments and feedback.

     In July, the IV-D directors' representatives sent a letter to 
all of the IV-D directors with a first draft of the Strategic Plan 
asking for their comments.  Also during July, the monthly OCSE staff 
meeting was held as a GPRA training session with Mike English and Hap 
Hadd.  The 10 HHS regional program managers were connected to this 
training session in Washington D.C. by phone.  The early involvement 
of the regional offices and their ongoing input to the national 
Strategic Plan effort were crucial to its success.  The regions 
continued to discuss the plan with their States during this month.  
For example, in Region X the Plan was discussed at the State 
Conference on July 12.  The plan was also presented and discussed at 
the Delaware Parent Locator Regional Conference, July 21.  The APWA 
Quarterly meeting in Washington and the Region III Executive Board 
Meeting of the Domestic Relations Association of Pennsylvania offered 
further opportunities to discuss the emerging plan and to get 
feedback.  Focus groups were also held in July in the Central Office 
to discuss and get input into the plan.

     On July 26, the Core Team met in Washington to revise the 
Strategic Plan based on all of the comments that had been received.  
This meeting was held just before the OCSE 4th National Training 
Workshop, where even more focus groups were held to solicit comments 
for a new draft. These meetings led to an August 10th draft of the 
plan.

     By the August 10th draft, there were some notable changes in the 
Strategic Plan.  The Introduction to the Plan clearly stated that 
this was to be a five-year plan for the Child Support Program.  It 
acknowledged that welfare reform legislation might change the 
program.  It also acknowledged and incorporated the broader ACF 
vision into the Plan because people felt that it was important that 
the OCSE Plan explicitly fit with the policy direction being set for 
ACF as a whole. 

     The change to a broader approach for the child support program 
was happening on many fronts at this time.  The child support program 
had been evolving from its initial phases, in which it was seen 
primarily as a law enforcement program working to collect money from 
noncustodial parents (mostly fathers) either to reimburse the States 
for AFDC grants or on behalf of mothers and children.  It was now 
becoming a program that sought primarily to help children.  The new 
charge was "putting children first."  This change in emphasis and in 
focus was happening concurrently in the concerns of Congress, the 
research about families, the greater visibility of fathers' rights 
groups, and as a result of Judge Ross' influence on the agency.  In a 
key development, the August draft of the Plan began to address the 
broader change in focus.  The Plan mentioned the importance of both 
parents being involved and important to the children and it stressed 
the need to treat both parents fairly.    

     This August version of the draft addressed the issue of the 
"customer" of the child support program.  It stated that children are 
the primary customers of the program, while their parents are 
secondary customers. It further stated that program partners must 
work together to achieve program results.  In the Goals and 
Objectives for this version of the Plan, there was an added concern 
about medical insurance coverage.  Also, the paternity establishment 
goal became the first goal rather than the third goal.

     Throughout August, additional focus groups were held in 
Washington D.C., including outreach meetings with advocacy groups and 
other stakeholders.  At the National Child Support Enforcement 
Association (NCSEA) meeting at the end of August, there were more 
focus groups and discussions.  

     During September, the Plan was distributed and discussed at a 
meeting of Program Managers in College Park, Maryland and at the 
Western Interstate Child Support Enforcement Annual Conference 
(WICSEC) in Seattle, Washington.  During the fall, it was also 
distributed to various custodial and noncustodial parent advocacy 
groups, who were given the opportunity to comment on the contents. 
The next version of the plan, dated December 5, 1994, incorporated 
many of the comments that had been received throughout the fall.  It 
was discussed with IV-D directors during the American Public Welfare 
Association (APWA) winter meeting in San Francisco, bringing even 
more States into the process. About 25 State IV-D directors met 
face-to-face, as well as via telephone and HCFA video facilities, in 
the regional offices.  During the discussion of the Plan, many of the 
same themes and criticisms were stated that had been heard before.  
Some States still didn't feel ownership of the Plan because, even 
though they had representatives on the Core Team, they hadn't been 
"at the table" as the Plan was developed.

     This December version, which was quite close to the final Plan, 
mentioned that the States were involved in creating the Plan.  It 
discussed the need for coordination among program partners to avoid a 
fragmented service delivery system. It acknowledged the importance of 
fathers, mothers, and other caretakers in a child's upbringing.  It 
mentioned a commitment to research and demonstration projects.  This 
version of the Plan included possible performance indicators and 
approaches to the measures, as well.  

     In order to get final consensus on the Strategic Plan, it became 
clear to the OCSE staff that the indicators or performance measures 
needed to be separated from the Plan.  Consensus on and acceptance of 
the Strategic Plan by the partners would be the first step and, then, 
the development of the performance measures would be a second and 
separate step in the implementation of GPRA.  

     Consensus was achieved on the final version of the Strategic 
Plan during a national videoconference of State IV-D and Federal OCSE 
leaders on February 28, 1995, originating from Washington, D.C.  More 
than 20 state CSE programs were represented and over 100 people 
participated in the videoconference.  Mike English facilitated the 
videoconference, which was aimed at bringing closure on the issues in 
the Plan.  

     In accepting the national Strategic Plan as a working blueprint 
for the child support enforcement program over the next five years, 
all participating IV-D partners signaled their agreement on the 
goals and objectives for the program.[13]  For those who participated 
in the videoconference, agreement on the Strategic Plan felt like a 
truly historic moment.  The accomplishment of consensus drew 
spontaneous applause from the group of 25 attending the 
videoconference in Washington.  Cecelia Burke, president of the 
National Council of Child Support Enforcement Administrators, 
acknowledged the event as a milestone in Federal-State relations in 
the child support enforcement program, saying, "For the first time 
ever, we have a Strategic Plan for the whole program.  I feel we are 
moving into a new realm with OCSE, when you consider the magnitude of 
what we have just accomplished here."[14]



DEVELOPING PERFORMANCE MEASURES

     After developing the goals and objectives for the Strategic 
Plan, the next step was to develop performance measures which would 
be used to measure the program's success in achieving the goals and 
objectives.  A representative group was formed to work on the 
development of the performance measures.  

     The Performance Measure Workgroup was made up of some members of 
the Core Team, Federal and State staff, and some additional 
volunteers.  It was a very representative group, including members 
from twenty States, three local jurisdictions, five regional offices 
and several different central office functions. The Workgroup held an 
organizational meeting in February, 1995 just prior to the 
videoconference at which the Strategic Plan was accepted.  The 
Workgroup planned its first working meeting to be held in Alexandria, 
Virginia on March 16 and 17, 1995. 

     At the March meeting, the Performance Measures Workgroup spent 
two days together in an effort to develop performance measures which 
would flow logically from the goals and objectives stated in the 
Strategic Plan.  These sessions, facilitated by Mike English and Hap 
Hadd, led to an initial set of performance measures.  The group 
recognized that more work would be needed to refine these measures 
(such as work on definitions, data gathering requirements, and so 
on), but it had achieved a first draft of a document which included 
some important performance measures and definitions.

     The first draft of the Performance Measures included indicators 
for each objective as well as equations for measuring those 
indicators.  For example, Goal I of the Strategic Plan is "All 
Children Have Parentage Established."  The Objective is "To Increase 
Establishment of Paternities, Particularly Those Established within 
One Year of Birth."  During the March meeting, five paternity 
indicators (and accompanying equations) were proposed.  These ranged 
from counting paternity acknowledgements in the IV-D caseload to 
measuring the total percentage of live births in a State "with 
paternity resolved."[15]
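
     As an illustration only (the Workgroup's actual equations and 
definitions are contained in the documents included in the Appendix 
and are not reproduced here), a paternity indicator of the first type 
might take a form such as:

          IV-D paternity establishment percentage =
               (number of paternities established or acknowledged
                during the fiscal year for children in the IV-D
                caseload)
               divided by
               (number of children in the IV-D caseload who were
                born out of wedlock and need paternity established),
               expressed as a percentage.

An indicator of the second type would instead use a statewide 
denominator, such as all out-of-wedlock live births in the State, 
rather than the IV-D caseload alone.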

     Following the March meeting, the measures that had been 
developed were circulated in a document to members of the Workgroup 
as well as to all of the State IV-D directors for their review and 
comments.  Regional phone calls were then held in which the Program 
Managers led discussions about the measures with the participating 
States and Federal staff in order to elicit their comments, 
questions, and criticisms.

     The Performance Measures Workgroup reconvened in Austin, Texas 
on May 6th, prior to the annual conference of the National Council of 
State Child Support Enforcement Administrators.  Assistant Secretary 
Mary Jo Bane participated in the discussions. At that meeting, 
feedback from the regional telephone calls was discussed and 
recommendations were made about how to adjust those measures that had 
not been acceptable to many people on the calls.  One of the chief 
areas of concern was that many of the paternity measures 
focused on populations "outside the control of the IV-D agencies."  
The agencies did not want to be held accountable for things that they 
felt they could not control.  For example, many States did not want 
to be measured on out of wedlock births.  Others did not want to 
include Native American cases because of their lack of jurisdiction 
over the tribes.  Others argued in favor of keeping these "universal 
measures" because they focus on the outcomes that the program is 
trying to affect.

     The Performance Measures Workgroup then presented its results to 
the full meeting of the State IV-D directors on Monday, May 8, 
recommending that certain measures on which there appeared to be 
consensus be agreed to by the whole group. More work would be done on 
the other measures (on which consensus had not been reached) in 
another attempt to develop measures that would satisfy most partners. 
The Workgroup agreed to reconvene following the OCSE Training 
Conference to be held in Washington D.C. in July.

     At the July 13 and 14 meetings, again led by Mike English and 
Hap Hadd, the remaining measures were decided upon and some initial 
attempts were made at defining some of the terms of the measures. 
Another outcome of this meeting was that the OCSE central office 
staff agreed to try to gather some of the data for the measures to 
test the feasibility of using those measures.

     The notes on and results of the July meetings were distributed 
to all Workgroup members during August.  All of the proposed measures 
were presented at the NCSEA Annual Meeting in Kansas City that same 
month.  In addition, the measures were presented and discussed at 
several round tables at the September conference that OCSE sponsored, 
entitled "Making Welfare Reform Work."  Several Workgroup members 
were present for these discussions, which served as a vehicle to 
allow other interested people to hear about and comment on the 
proposed measures.

     Finally, all of the measures (including those that had been 
accepted in July) were distributed again to all State IV-D directors 
during September.  Conference calls were arranged by OCSE central and 
regional staff to gather more comments and feedback about the 
measures in another effort to gain consensus.  Summary notes of these 
conference calls were distributed to all participants and to the 
members of the Performance Measures Workgroup.  

     The group reconvened in February in order to produce a final set 
of performance measures which they would recommend for adoption to 
all of the states in April.  At this meeting, they decided to adopt 
three of the five paternity measures, leaving the remaining two 
measures as optional and recommending that they be pilot-tested by 
several States.  The two indicators for Goal II were adopted.  In 
Goal III, two measures on collection were deleted and the effort to 
"age arrears" was dropped.  It was recommended that a sampling 
technique be developed to examine cases for whether or not they were 
"appropriate and up-to-date."

     As of this writing, the Performance Measures Workgroup plans to 
make their recommendations to all State IV-D directors in early May 
at the next annual IV-D directors' meeting and looks forward to 
having some States gather pilot data on the measures in the next few 
months.



PERFORMANCE PLANNING

     As a GPRA pilot, the OCSE was required to conduct performance 
planning for FY 1995 and FY 1996 even before a Strategic Plan and 
performance measures were finalized.  A basic performance plan was 
developed for FY 1995 focusing on two outcome measures that were tied 
to objectives in the draft Strategic Plan -- the total number of 
paternities established and the total child support dollars 
collected. The same two outcome measures were used for the FY 1996 
plan.  The Performance Plans for FY 1995 and FY 1996 are included in 
the Appendix to this report.
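
     As an illustration of how a performance plan and its companion 
performance report fit together (the figures below are hypothetical 
and are not taken from the actual OCSE plans), a plan entry and the 
corresponding report entry for one of these measures might read:

          Outcome measure:  Total child support dollars collected
          Performance goal (from the annual plan):     $11.0 billion
          Actual performance (from the annual report): $11.4 billion
          Result:  goal met; any shortfall would be described and
                   explained in the report.

The annual performance report would make this comparison for each 
measure in the plan, as described earlier in the discussion of GPRA's 
reporting requirements.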

     Mirroring the process of developing and implementing performance 
measures at the national level, the State GPRA demonstrations are 
putting performance measures in place for their own goals and 
objectives.  More than half of the States now have signed performance 
agreements with OCSE, just as Judge Ross has signed a performance 
agreement with the Assistant Secretary.



OTHER GPRA ACTIVITIES

     GPRA at OCSE has meant much more than just the drafting of a 
Strategic Plan and the setting of organizational goals and 
objectives.  Rather, all activities of the organization have been 
viewed in a new light following the GPRA and, in a sense, "GPRA is 
everything and everything is now GPRA." The GPRA effort has served to 
refocus and restructure the work of the agency toward achieving 
specific results.  GPRA activities for the program as a whole fall 
into three categories: strategic planning, performance planning, and 
special demonstrations.  The first two efforts have been described 
previously.  This section discusses the special demonstration 
projects that, at OCSE's initiative, the States have voluntarily 
undertaken to practice the GPRA principles of setting goals and 
objectives, determining performance measures, and measuring results.

     The most immediate and direct expression of GPRA, beyond the 
national Strategic Plan, is the existence of demonstration projects 
undertaken by 34 States and localities.  A list of these projects is 
included in the Appendix.  The State projects, solicited by Judge 
Ross as an extension of GPRA, foster and demonstrate GPRA principles 
and range from statewide projects to regional demonstrations to 
programs that will operate in a single county.  The OCSE will provide 
technical assistance and consultation to these State and local 
demonstrations.  The GPRA demonstrations are producing baseline and 
performance data and allowing validation of the data produced.  
"Several demonstrations focus on improving performance in all child 
support enforcement functions, while others take aim at enhancing 
specific functions, such as paternity establishment, asset 
identification, or medical support." [16]

     The special GPRA demonstration projects were seen by Judge Ross 
as a way to enhance the agency's GPRA pilot project.  By getting 
States more directly involved in projects that would be geared toward 
achieving program goals, OCSE was encouraging the States to become 
active participants in GPRA. It was hoped that there might be 
additional funding to assist these state demonstration projects, but 
even without extra funding, most States, once approved, were eager to 
participate. They would receive extra attention and technical 
assistance for their efforts.

     Another GPRA initiative, in addition to the Strategic Plan, is a 
set of six GPRA Research Demonstration Grants.  These projects are 
cooperative agreements with States lasting up to 17 months.  State 
child support enforcement agencies were eligible to apply for funding 
in order to demonstrate strategic planning, performance planning, 
performance measure development, and performance budgeting.  

     In Judge Ross' view, GPRA also became an excellent vehicle for 
improving communications throughout OCSE.[17]  The  OCSE pilot 
project is program-wide and all OCSE staff are involved in achieving 
the national goals.  Toward this end, there have been many efforts to 
increase and improve communications between regional offices and the 
central office.  For example, there are now weekly conference calls 
from the central office to all of the regional offices covering many 
programmatic and policy issues.  The GPRA task force, led by the GPRA 
Director Anne Donovan, took a lead in initiating these calls and 
increasing their frequency.

     Additional OCSE projects are focusing on program results as part 
of the GPRA.  Executive Order 12953 mandates that all executive 
agencies of the Federal government facilitate the payment of child 
support.  The OCSE is providing technical assistance to other 
executive branch agencies implementing this Executive Order by 
furnishing information on the child support program and assisting 
with annual efforts to inform all current and prospective Federal 
employees about the program.  

     In another GPRA-related project, the Washington Metro Project is 
aimed at identifying and overcoming barriers in interstate child 
support cases among the District of Columbia, Maryland, and Virginia.  
The OCSE is coordinating and facilitating efforts among the various 
State and local jurisdictions in an attempt to bridge the gaps 
between child support agencies in the metropolitan area.  Methods for 
improving enforcement of interstate cases developed and tested in the 
Metro Project will be disseminated to similar metropolitan areas 
nationwide. 

     The OCSE has also added specialists to its staff to focus 
exclusively on improving child support enforcement performance where 
cases involve Native Americans and foreign governments.  New staff 
are also working on the problems of cases involving members of the 
Armed Forces and employees of the Federal government.  There are also 
plans to add staff to serve as liaisons with law enforcement 
professionals on child support enforcement.

     Finally, there are two model office demonstration projects 
underway in Colorado and Maine testing various child support 
innovations in a controlled management environment.  For example, 
they are examining certain aspects of welfare reform and improved 
enforcement and location techniques.  Successful practices developed 
in these projects will be disseminated to all child support 
enforcement agencies. 
     

    
WHERE DOES OCSE GO FROM HERE?

     There are several issues facing OCSE as it plans to implement 
the performance measures developed for the national Strategic Plan.  

     Assuming that consensus is achieved on the performance measures, 
the first step in implementation will be to begin efforts at data 
collection.  There will need to be an assessment of how ready the 
States are to collect the necessary information.  While it was 
initially hoped that all the States would have automated systems in 
place by October 1995, that deadline has now been extended to 
October, 1997 and certification is proceeding.  The States will have 
to have fully operational systems and produce reliable data for the 
performance measures that have been identified, and some 
reprogramming of systems is likely.  Meanwhile, several States have 
volunteered to pilot test the measures and what they learn during 
this testing will be valuable to the implementation efforts 
nationwide.

     An additional task will be to mesh the performance measures with 
the measures in any final welfare reform legislation.  These measures 
will also need to be coordinated with those being established for 
incentive funding. It is OCSE's belief that the efforts of the 
Performance Measures Workgroup will flow smoothly into the incentive 
funding formula scheme envisioned in the Senate welfare reform bill.  
OCSE is confident that the Workgroup's efforts will save time in 
meeting the requirements of welfare reform because of the effort that 
has been made to date in thinking about and gaining consensus on 
appropriate measures.  But it will also take time for the incentive 
measures to be developed and agreed upon.


   
SUMMARY OF LESSONS LEARNED 

     The implementation of GPRA in OCSE has been a long process 
requiring the time, involvement, energy, and enthusiasm of people at 
every level of the agency and among all of the State partners.  The 
willing participation of all parties is certainly a key factor in the 
success of the effort. 

     There are a number of other factors that contributed to what is 
generally perceived to be the success to date of the implementation 
of GPRA at OCSE.  Among the most important of these is that the 
timing of the GPRA pilot coincided nicely with other efforts along 
the same lines that were already taking place at the agency.  There 
was also the external pressure of welfare reform which forced the 
agency to confront the changes that were being contemplated by 
others.  The OCSE was facing pressure for change internally and 
externally.  By forcing a focus on performance outcomes, the GPRA 
gave a structure to that change, requiring the development of a new 
Strategic Plan, consensus on performance measures, and an enhanced 
Federal/State partnership.

     According to the testimony of Johnny C. Finch, Assistant 
Comptroller General, "officials in a number of Federal agencies said 
that one of the single most important incentives to changing behavior 
in their agencies so that managers and staff focus more on achieving 
desired outcomes will be the degree to which top leadership actively 
demonstrates its support for such change." [18]

     The encouragement and commitment to participate as a GPRA 
project came from the top of the agency.  Assistant Secretary Mary Jo 
Bane and Deputy Director David Gray Ross are active and enthusiastic 
proponents of the results-oriented focus of GPRA.  Their 
participation, support, and persistence in the effort was a necessary 
component of such a fundamental change.  At the same time, there was 
recognition throughout the agency that an effort such as GPRA 
required the active participation of all child support enforcement 
partners and that a "top-down" approach to developing a strategic 
plan would not be effective.  The States had to be included and were 
essential to the success of the program.  At all stages of 
implementation, major efforts were made to communicate with partners, 
to include them in decisions, to solicit their opinions and listen 
carefully to their feedback.  While there was an initial effort to 
develop goals and a draft Strategic Plan centrally, the lesson was 
learned very quickly by the central office that, according to one 
staff member, "Unless you set your goals together, they are your 
goals and not your partners' goals."

     It was necessary to overcome a distrust of the Federal office 
that the States had developed over the years.  Prior to the 
implementation of GPRA, the relationship between the Federal staff 
and the States was not a true partnership.  The Federal office had 
been in the position of setting the rules, writing regulations, 
calling the shots and paying most of the bills, as Betsy Matheson 
described it.[19]  The States, as Matheson further noted, were the 
ones "doing the work and criticizing us."  The Federal staff focussed 
more and more on program and process details and found there was less 
emphasis by everyone on the program results.  It was necessary to 
change this relationship fundamentally and to foster a feeling of 
ownership and partnership among program staff at both the State and 
Federal levels.  

     One of the first steps to overcoming the distrust that the 
States felt about the Federal staff was the selection of Anne Donovan 
as the GPRA Director.  A former State IV-D Director, Donovan could 
identify with the problems faced by the States and she was also a key 
proponent of the new efforts toward increased communication and 
inclusion.  Many people mentioned the selection of Anne Donovan as a 
crucial element of GPRA's success.

     Another way that the distrust among partners was overcome was by 
selecting an "outside" facilitator to work with the Core Team as it 
developed the Strategic Plan.  Mike English and Hap Hadd, though they 
worked for HHS, were not perceived to be "Feds."  They brought not 
only their professionalism and experience as facilitators to the 
discussions but also a neutrality that was necessary given the 
initial distrust on the part of the participants.  This neutrality 
helped the participants move beyond past behaviors.

     The Strategic Plan became a joint project of OCSE and the IV-D 
directors.  Getting the active participation of the IV-D directors' 
organization was key in getting ultimate agreement on the Plan. In 
some cases, fellow IV-D directors could influence their colleagues in 
other states to go along with the Plan. But another important lesson 
learned was that there did not need to be -- nor should there be -- a 
unanimous vote to accept the Plan.  On something as large and 
complicated as the Strategic Plan, it was important to seek a general 
consensus.  Consensus doesn't necessarily mean that everyone agrees 
with everything in the Plan; it means that everyone has had a chance 
to participate in the deliberations and "can live with" and accept 
the results of those deliberations. 

     It is generally agreed that the videoconference was a very 
important tool in gaining consensus.  As Cecelia Burke said later, 
"We were able to use technology to our advantage by involving more 
States in the discussions." [20]  People felt that they were part of 
the process and they were also able to see the results of their hard 
work in a finalized Plan. Sometimes the feeling of true partnership 
comes not only from the process used to obtain a result but also from 
the fact that a tangible result has been achieved.

     In spite of the general consensus, however, one of the problems 
that will be faced in the future with implementing the Strategic Plan 
is that States still have to deal with their own separate agendas.  
While they may applaud and agree to the national goals, there are 
different political pressures and objectives facing different 
jurisdictions and these may have a higher priority than the national 
goals and objectives.  In addition, the ever-changing leadership in 
the State and Federal governments presents a continuing challenge to 
implementing these changes.

     As with any change to a large, bureaucratic system, there will 
be resistance along the way.  It is important that this resistance be 
anticipated and acknowledged ahead of time and that staff be aware 
that there will be inevitable roadblocks along the way.  Change 
requires persistence and time.  The Federal and State members of the 
Core Team and the expanded Performance Measures Workgroup worked long 
and hard to make sure that the Strategic Plan and performance 
measures were discussed and agreed to by all interested parties and 
they worked with an awareness that most of the effort to make the 
partnership work would be on their shoulders. One lesson they learned 
in this process was that they had to be prepared to do most of the 
work to develop the partnership because it was their priority and 
they had to overcome the distrust that the States had for the Federal 
office.  

     There are some who feel that the State pilot projects should 
have begun after the Strategic Plan was developed so that they would 
focus only on the specific goals and objectives articulated in the 
Plan.  Others argue that it was necessary to get started with the 
projects as soon as the GPRA pilot started and that waiting for the 
development of the Strategic Plan would have meant losing the 
momentum of GPRA.  The outcome of these projects will have to be 
analyzed in the future.

     In conclusion, OCSE's implementation of the GPRA pilot has been 
very successful as of this writing.  The State and Federal partners 
have worked together to develop a Strategic Plan that has been 
accepted by all of the States.  The Performance Measures Workgroup 
has developed measures that it is confident can be adopted and tested 
by the States.  Individual States and some regions have developed 
pilot projects to expand the reach of GPRA.  OCSE has succeeded, thus 
far, in fulfilling the major goals of the GPRA legislation: it is 
focusing its attention on the performance of the program; it is 
beginning to measure the results of the program; and it is examining 
the outcomes achieved by the program.

