This is the accessible text file for GAO report number GAO-06-82 
entitled 'Workforce Investment Act: Labor and States Have Taken Actions 
to Improve Data Quality, but Additional Steps Are Needed' which was 
released on November 14, 2005. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 

GAO: 

November 2005: 

Workforce Investment Act: 

Labor and States Have Taken Actions to Improve Data Quality, but 
Additional Steps Are Needed: 

GAO-06-82: 

GAO Highlights: 

Highlights of GAO-06-82, a report to congressional requesters: 

Why GAO Did This Study: 

Federal programs carried out in partnership with states and localities 
continually balance the competing objectives of collecting uniform 
performance data with giving program implementers the flexibility they 
need. Our previous work identified limitations in the quality of 
performance data for the key employment and training program—the 
Workforce Investment Act (WIA). WIA relies on states and localities to 
work together to track and report on participant outcomes, and it 
changed the way outcomes are measured. Given the magnitude of changes 
and the impact such changes can have on data quality, we examined (1) 
the data quality issues that affected states’ efforts to collect and 
report WIA performance data; (2) states’ actions to address them; and 
(3) the actions the Department of Labor (Labor) is taking to address 
data quality issues, and the issues that remain. 

What GAO Found: 

Three key issues—flexibility in federal guidance, major changes to 
states’ information technology (IT) systems, and limited 
monitoring—compromised states’ early efforts to collect and report WIA 
performance data. Labor’s initial guidance allowed states and local 
areas flexibility in deciding which jobseekers to track and when 
jobseekers leave services and get counted in the measures. As a result, 
states and local areas have differed on whom they track and for how 
long. States took various approaches to implement IT systems for 
meeting WIA reporting requirements. Thirty-nine states reported to us 
that they made major modifications to their IT systems since WIA was 
first implemented in 2000. Thirteen of them said the changes resulted 
in problems affecting data quality, and 5 states are still trying to 
resolve these problems. In addition, oversight of WIA performance data 
was insufficient at all levels during early implementation. 

Almost all states have made efforts to improve the quality of WIA 
performance data—at least 40 states have controls in their IT systems 
that capture WIA performance data, such as edit checks or exception 
reports to help screen for errors or missing data. Forty-three states 
have taken actions to clarify Labor’s guidance and help local areas 
determine who should be tracked in the performance measures. In 
addition, most states said they monitor local areas by assessing local 
procedures and policies. 

Labor recently began addressing data quality issues; however, some 
issues remain. In 2004, Labor addressed some data quality concerns by 
requiring states to validate their data and ensure the accuracy of 
their performance outcomes. Most states told us that Labor’s 
requirements have increased awareness of data quality at the state and 
local level. However, Labor does not have methods in place to review 
states’ validation efforts or hold states accountable for complying 
with its requirements. Labor issued guidance requiring states to 
implement common performance measures on July 1, 2005, which clarified 
some key data elements, but does not address all the issues. Labor has 
some federal monitoring processes in place but lacks a standard 
monitoring guide to address data quality. 

States’ Views of How Labor’s Data Validation Efforts Have Helped Them: 

[See PDF for image] 

[End of figure] 

What GAO Recommends: 

GAO is recommending that Labor determine a standard point of 
registration and monitor its implementation; that Labor conduct its own 
review of WIA participant files and take steps to hold states 
accountable for meeting data validation requirements; and that Labor 
develop a standard monitoring tool for WIA performance data. In its 
response, Labor agreed with our findings and recommendations and noted 
steps it is taking to implement them. 

www.gao.gov/cgi-bin/getrpt?GAO-06-82. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Sigurd R. Nilsen at (202) 
512-7215 or nilsens@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

Several Key Issues Have Affected States' Efforts to Ensure the Quality 
of WIA Performance Data: 

States Have Taken Steps to Improve the Quality of WIA Performance Data: 

Labor Has Taken Steps to Improve WIA Data Quality, but Some Issues 
Remain: 

Conclusions: 

Recommendations For Executive Action: 

Agency Comments: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Comments from the Department of Labor: 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Related Products: 

Tables: 

Table 1: WIA's Mandatory Programs and Related Federal Agencies: 

Table 2: Performance Measures and Allowable Data Sources for the WIA- 
Funded Programs: 

Table 3: Key WIA Terms That Allow Flexibility: 

Table 4: Summary of Data Quality Concerns That Affected States' Efforts 
to Ensure Data Quality: 

Table 5: Summary of How Labor's Data Validation Requirements Affect 
Data Quality Concerns: 

Table 6: Common Measures Are Similar to Some of the WIA Measures: 

Table 7: List of Services That Labor Does Not Consider Substantial 
Enough to Keep a Participant from Being Exited: 

Table 8: Site Visit States and Local Areas: 

Figures: 

Figure 1: Data Flow from Local to State to Labor under WIA: 

Figure 2: Time It Took States to Implement Changes to IT Systems for 
WIA Implementation: 

Figure 3: Actions States Have Taken to Clarify and Explain Federal 
Guidance: 

Figure 4: Types of Errors Addressed with Edit Checks and Exception 
Reports: 

Figure 5: States' Monitoring Activities: 

Figure 6: Few States Faced Major Difficulties Using Labor's Software: 

Figure 7: Most States Found Labor's Assistance in Data Validation 
Sufficient: 

Figure 8: States' Views of How Labor's Data Validation Efforts Have 
Helped Them: 

Abbreviations: 

GEMS: Grants E-Management System: 

GPRA: Government Performance and Results Act: 

IG: Inspector General: 

IT: information technology: 

JTPA: Job Training Partnership Act: 

OMB: Office of Management and Budget: 

PART: Program Assessment Rating Tool: 

UI: Unemployment Insurance: 

WIA: Workforce Investment Act: 

WIASRD: Workforce Investment Act Standardized Record Data: 

WRIS: Wage Record Interchange System: 

United States Government Accountability Office: 

Washington, DC 20548: 

November 14, 2005: 

The Honorable Michael B. Enzi: 
Chairman: 
The Honorable Edward M. Kennedy: 
Ranking Minority Member: 
Committee on Health, Education, Labor, and Pensions: 
United States Senate: 

The Honorable Patty Murray: 
Ranking Minority Member: 
Subcommittee on Employment and Workplace Safety: 
Committee on Health, Education, Labor, and Pensions: 
United States Senate: 

Performance data are becoming increasingly significant in helping 
policy makers and program managers assess progress of federal programs 
in meeting their long-term goals and in helping to make a variety of 
programmatic and budget decisions. Yet our previous work has identified 
limitations in the ability of federal agencies to produce credible 
performance data.[Footnote 1] In particular, federal programs that are 
carried out in partnership with states and localities continually 
balance the competing objectives of collecting uniform performance data 
at the national level with giving states and localities the flexibility 
they need to implement programs. The Workforce Investment Act (WIA) of 
1998--the centerpiece of the nation's employment and training system-- 
established three programs that rely on states and localities to work 
together to track and report on participant outcomes in areas of job 
placement, retention, earnings, and skill attainment, as well as 
customer satisfaction. WIA, implemented in July 2000, has resulted in a 
major shift from predecessor programs, including the Job Training 
Partnership Act (JTPA) program, by offering a broader array of services 
to the general public and no longer using income to determine 
eligibility for all program services. WIA also changed the way 
performance is measured, including establishing new performance 
measures that assess outcomes over time, requiring the use of 
Unemployment Insurance (UI) wage data to track outcomes, and requiring 
states to negotiate expected performance levels with the Department of 
Labor (Labor). 

States are held accountable for achieving their performance levels 
through financial incentives and sanctions. These changes have had 
profound implications for the way WIA performance data are collected 
and reported. 

Given the magnitude of these changes, the potential impact such changes 
can have on data quality, and the importance of having meaningful 
performance data, we examined (1) the data quality issues that have 
affected states' efforts to collect and report WIA performance data, 
(2) states' actions to address them, and (3) the actions Labor is 
taking to address data quality issues and the issues that remain. 

To learn more about states' experiences implementing data collection 
and reporting system changes for WIA, their implementation of Labor's 
data validation requirements for WIA, and state and local efforts to 
address the quality of WIA data, we conducted a web-based survey of 
workforce officials in 50 states and received a 100 percent response 
rate. We did not include the District of Columbia and U.S. territories 
in our survey. In addition, we conducted site visits in California, New 
York, Texas, West Virginia, and Wyoming, where we interviewed state 
officials and visited two local areas in each state. We selected these 
states because they represent a range of information technology (IT) 
systems--statewide comprehensive systems versus local systems with a 
state reporting function--include states with single and multiple 
workforce areas, and are geographically diverse. We also collected 
information on the quality of WIA data through interviews with Labor 
officials in headquarters and all six regional offices and with 
nationally recognized experts, and we reviewed relevant research 
literature. Our work 
was conducted between June 2004 and September 2005 in accordance with 
generally accepted government auditing standards. (For a complete 
description of our scope and methodology, see app. I.) 

Results in Brief: 

Three key issues--flexibility in federal guidance, major changes to 
states' information systems, and limited monitoring efforts--have 
compromised states' early efforts to collect and report accurate and 
consistent WIA performance data. The guidance available to states at 
the time of implementation allowed flexibility in key definitions and 
contributed to inconsistency in the way the data are collected. For 
example, Labor allowed states and local areas flexibility in 
determining which jobseekers to track and when jobseekers leave 
services and, therefore, get counted in the performance measures. As a 
result, states and local areas have differed on whom they track and for 
how long. In addition, the transition from JTPA to WIA required 
significant changes to state information technology (IT) systems--new 
data elements were required, some definitions changed, performance 
measures were added, and new data sources were introduced to track 
outcomes. States and local areas took various approaches to develop and 
implement new IT systems for collecting and reporting WIA data. Thirty- 
nine states reported on our survey that they made major modifications 
to their WIA IT systems since implementation that included switching to 
an Internet-based system and adding new capabilities such as case 
management. Thirteen of these states said the major modifications 
resulted in problems affecting data quality, such as difficulties 
transferring data from the old system to the new system, loss of data, 
and challenges reconciling data from multiple systems. While 8 of the 
states reported that these issues have been resolved, 5 told us that 
they are still trying to resolve these data quality concerns. In 
addition, we and others found that oversight and monitoring of WIA 
performance data were insufficient during early implementation. 

Almost all states have taken steps to improve the quality of WIA 
performance data. Forty-three states reported to us that they developed 
their own guidelines to help local areas determine who should be 
tracked in the performance measures. At least 40 states have controls 
in their IT systems that identify potential problems with their WIA 
performance data, such as edit checks or exception reports to help 
screen for errors or missing data. Labor officials in most of Labor's 
six regions told us that states have made improvements to their IT 
systems since WIA was first implemented. In addition, 38 states 
reported to us that they monitor local areas to ensure data quality and 
consistency by assessing local procedures and policies on data 
collection. 

Labor recently began addressing data quality issues; however, some data 
quality issues remain. Beginning in 2004, Labor addressed several 
concerns with data quality by implementing new data validation 
requirements. Through this effort, Labor required states to compare 
data reported to the state with a sample of participant case files and 
provided software to help states ensure that the performance measures 
are accurately calculated. While it is too soon to fully assess whether 
Labor's efforts have improved data quality, almost all states reported 
on our survey that Labor's new requirements have increased awareness of 
data quality at the state and local level. At the same time, Labor does 
not currently have mechanisms in place to review states' data 
validation efforts or hold states accountable for the data validation 
requirements. Labor's guidance to implement common performance measures 
on July 1, 2005, clarified some key data elements that had been 
problematic with regard to the WIA performance measures. For example, 
this guidance provides for a clearer understanding of when participants 
leave services. However, it did not clarify when participants should be 
registered for WIA and counted in the performance measures. In 
addition, Labor has some federal monitoring processes in place but 
lacks a standard monitoring guide to address data quality. 

To address the inconsistencies in determining when participants should 
be registered and counted in the performance measures, we recommend 
that Labor determine a standard point of registration and monitor its 
implementation. To enhance the data validation requirements, we 
recommend that Labor conduct its own review of the WIA participant 
files to ensure that validation was done correctly and take steps to 
hold states accountable to both the report validation and data element 
validation requirements. In addition, to address variations in federal 
monitoring practices, we recommend that Labor develop a standard 
comprehensive monitoring tool for WIA performance data. In its written 
comments, Labor agreed with our findings and recommendations and noted 
steps it is taking to implement them. 

Background: 

Labor required states to implement major provisions of WIA by July 1, 
2000, although some states began implementing provisions of WIA as 
early as July 1999. WIA replaced the Job Training Partnership Act 
(JTPA) program and requires that many federal programs provide 
employment and training services through one-stop centers. Services 
funded under WIA represent a marked change from those provided under 
the previous program, allowing for a greater array of services to the 
general public. WIA is designed to provide for greater accountability 
than under previous law: it established new performance measures and a 
requirement to use Unemployment Insurance (UI) wage data to track and 
report on outcomes. 

WIA-Funded Services Represent a Change from Those Funded under JTPA: 

Program services provided under WIA represent a marked change from 
those provided under JTPA. When WIA was enacted in 1998, it replaced 
the JTPA programs for economically disadvantaged adults and youth and 
for dislocated workers with three new programs--Adult, Dislocated 
Worker, and Youth--that provide a broad range of services to the 
general public, no longer using income to determine eligibility for all 
program services. The WIA adult and dislocated worker programs no 
longer focus exclusively on training, but provide for three tiers, or 
levels, of service: core, intensive, and training. Core services 
include basic services such as help with job searches and labor market 
information. These activities may either be self-service 
or require some staff assistance. Intensive services include such 
activities as comprehensive assessment of jobseekers' skill levels and 
service needs and case management--activities that typically require 
greater staff involvement. Training services include such activities as 
occupational skills development or on-the-job training. Labor's 
guidance specifies that monitoring and tracking for the adult and 
dislocated worker programs should begin when jobseekers receive core 
services that require significant staff assistance. Jobseekers who 
receive core services that are self-service or informational in nature 
are not counted in the performance measures. 

In addition to those services provided by the three WIA funded 
programs, WIA also requires that states and local areas use the one- 
stop center system to provide services for many other employment and 
training programs. Seventeen categories of programs funded through four 
federal agencies are now required to provide services through the one- 
stop center under WIA. Table 1 shows the programs that WIA requires to 
provide services through the one-stop centers (also known as mandatory 
programs) and the federal agencies that administer these programs. 

Table 1: WIA's Mandatory Programs and Related Federal Agencies: 

Federal agency: Department of Labor; 
Mandatory programs: 
* WIA adult; 
* WIA dislocated worker; 
* WIA youth; 
* Employment Service (Wagner-Peyser); 
* Trade Adjustment Assistance programs; 
* Veterans' employment and training programs; 
* Unemployment Insurance; 
* Job Corps; 
* Welfare-to-Work grant-funded programs; 
* Senior Community Service Employment Program; 
* Employment and training for migrant and seasonal farm workers; 
* Employment and training for Native Americans. 

Federal agency: Department of Education; 
Mandatory programs: 
* Vocational Rehabilitation Program; 
* Adult Education and Literacy; 
* Vocational Education (Perkins Act). 

Federal agency: Department of Health and Human Services; 
Mandatory programs: 
* Community Services Block Grant. 

Federal agency: Department of Housing and Urban Development (HUD); 
Mandatory programs: 
* HUD-administered employment and training. 

Source: U.S. Department of Labor. 

[End of table] 

WIA Performance Measures Are Designed to Increase Accountability for 
Three WIA-Funded Programs: 

WIA is designed to provide for greater accountability than its 
predecessor program by establishing new performance measures, a new 
requirement to use UI wage data to track and report on outcomes, and a 
requirement for Labor to conduct at least one multi-site control group 
evaluation. According to Labor, performance data collected from the 
states in support of the measures are intended to be comparable across 
states in order to maintain objectivity in determining incentives and 
sanctions. The performance measures also provide information to support 
Labor's performance goals under the Government Performance and Results 
Act (GPRA), the budget formulation process using the Office of 
Management and Budget's (OMB) Program Assessment Rating Tool (PART), 
and for program evaluation required under WIA. 

In contrast to JTPA, under which data on outcomes were obtained through 
follow-ups with job seekers, WIA requires states to use UI wage records 
to track employment-related outcomes. Each state maintains UI wage 
records to support the process of providing unemployment compensation 
to unemployed workers. The records are compiled from data submitted to 
the state each quarter by employers and primarily include information 
on the total amount of income earned during that quarter by each of 
their employees. Although UI wage records contain basic wage 
information for about 94 percent of workers, certain employment 
categories are excluded, such as self-employed persons, independent 
contractors, federal employees, and military personnel. According to 
Labor's guidance, if a program participant does not appear in the UI 
wage records, states may then use supplemental data sources, such as 
follow-up with participants and employers, or other administrative 
databases, such as U.S. Office of Personnel Management or U.S. 
Department of Defense records, to track most of the employment-related 
measures. However, only UI wage records may be used to calculate the 
earnings change and earnings replacement performance measures. (See 
table 2 for a complete list of WIA performance measures.) 

Table 2: Performance Measures and Allowable Data Sources for the WIA- 
Funded Programs: 

Program: Adult; 
Measure: 1. Entered employment rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: No. 

Measure: 2. Employment retention rate at 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: No. 

Measure: 3. Average earnings change in 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: No. 

Measure: 4. Employment and credential rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Program: Dislocated worker; 
Measure: 5. Entered employment rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: No. 

Measure: 6. Employment retention rate at 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: No. 

Measure: 7. Earnings replacement rate in 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: No. 

Measure: 8. Employment and credential rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Program: Youth (age 19-21); 
Measure: 9. Entered employment rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 10. Employment retention rate at 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 11. Average earnings change in 6 months; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 12. Employment/education/training and credential rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Program: Youth (age 14-18); 
Measure: 13. Skill attainment; 
Data source: UI wage records: No; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 14. Diploma or equivalent; 
Data source: UI wage records: No; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 15. Placement and retention rate; 
Data source: UI wage records: Yes; 
Data source: Supplemental data allowed: Yes; 
Data source: Other, such as educational data or survey: Yes. 

Program: All programs; 
Measure: 16. Customer satisfaction for participants; 
Data source: UI wage records: No; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: Yes. 

Measure: 17. Customer satisfaction for employers; 
Data source: UI wage records: No; 
Data source: Supplemental data allowed: No; 
Data source: Other, such as educational data or survey: Yes. 

Source: U.S. Department of Labor. 

[End of table] 

Unlike JTPA, which established expected performance goals using a 
computer model that took into account varying economic and demographic 
factors, WIA requires states to negotiate with Labor to establish 
expected performance levels for each measure. States, in turn, must 
negotiate performance levels with each local area. The law requires 
that these negotiations take into account differences in economic 
conditions, participant characteristics, and services provided. To 
derive equitable performance levels, Labor and the states use 
historical data to develop their estimates of expected performance 
levels. These estimates provide the basis for negotiations. 

WIA holds states accountable for achieving their performance levels by 
tying those levels to financial sanctions and incentive funding. States 
that meet their performance levels under WIA are eligible to receive 
incentive grants that generally range from $750,000 to $3 million. 
Nineteen states were eligible to apply for incentive grants in program 
year 2003.[Footnote 2] States that do not meet at least 80 percent of 
their WIA performance levels are subject to sanctions. If a state fails 
to meet its performance levels for 1 year, Labor provides technical 
assistance, if requested. If a state fails to meet its performance 
levels for 2 consecutive years, it may be subject to a 5 percent 
reduction in its annual WIA formula grant. No states received financial 
sanctions in program year 2003. 

Labor determines incentive grants or sanctions based on the performance 
data submitted by states each October in their annual reports. States 
also submit quarterly performance reports, which are due 45 days after 
the end of each quarter. In addition to the performance reports, states 
submit updates for their Workforce Investment Act Standardized Record 
Data (WIASRD) in mid-October. WIASRD is a national database of 
individual records containing characteristics, activities, and outcome 
information for all enrolled participants who receive services or 
benefits under WIA. All three submissions primarily represent 
participants who have exited the WIA programs within the previous 
program year. 

The process of collecting and reporting WIA data involves all three 
levels of government. Participant data are typically collected by 
frontline staff in local areas and entered into a state or local IT 
system. In some states, local area staff may enter data directly into a 
statewide IT system; in other states, local areas may use their own 
individualized IT system to enter data, from which staff can extract 
and compile the necessary information for state submission. 

After the state receives data from local areas, this information is 
compiled and formatted for various submissions to Labor, including the 
state's WIASRD file, quarterly report, and annual report. During the 
data compilation process, state agencies administering WIA typically 
match participant records to their state's UI wage record system to 
obtain wage records and employment status. In addition, states may use 
the Wage Record Interchange System (WRIS) to match participant records 
to other states' UI wage records or use other databases such as that of 
the U.S. Office of Personnel Management to fill gaps in the UI wage 
records. States may also link participant records to partner programs' 
IT systems to track activities across programs or to determine outcomes 
such as attaining high school diplomas, degrees, and certificates. For 
the quarterly and annual report, states use software to calculate their 
performance measures. States generate the required WIA performance 
reports and electronically submit them to Labor's regional offices 
using the Enterprise Business Support System (see fig. 1). 

Figure 1: Data Flow from Local to State to Labor under WIA: 

[See PDF for image] 

[End of figure] 
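In concept, the wage record match described above is a lookup of each 
participant record against quarterly earnings files. The following 
sketch, written in Python purely for illustration, shows one way such 
a match might work. The field names are hypothetical and are not drawn 
from Labor's specifications; actual state systems must also handle 
confidentiality, multiple employers per quarter, and cross-state 
matching through WRIS. 

def match_wage_records(participants, ui_wage_records): 
    """Match exiters to quarterly UI wage records. 

    Hypothetical structure: ui_wage_records maps (ssn, quarter) to 
    the total earnings employers reported for that person in that 
    quarter. 
    """ 
    results = [] 
    for person in participants: 
        key = (person["ssn"], person["first_quarter_after_exit"]) 
        earnings = ui_wage_records.get(key) 
        results.append({ 
            "id": person["id"], 
            "employed_q1_after_exit": earnings is not None, 
            "q1_earnings": earnings if earnings is not None else 0.0, 
        }) 
    return results 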

Internal Controls to Ensure Data Quality: 

Internal controls comprise the plans, methods, and procedures an 
organization uses to meet its missions, goals, and objectives. Internal 
controls used by government agencies may include guidance that defines 
the specific data to be collected and any documentation needed to 
support the data and safeguards to ensure data are secure.[Footnote 3] 

Some key aspects of internal controls for collecting and reporting data 
include: 

* Guidance: Guidance should clearly and consistently define all data 
elements required for reporting, and effectively communicate this 
information to states and local areas. If definitions are vague or 
inconsistent, then program staff may interpret them incorrectly, 
resulting in more errors in the data. Additionally, any guidance and 
documentation from the national office to states and local areas must 
be clear and free of any conflicting or contradictory instructions. If 
reporting instructions are misinterpreted by program staff, then the 
data may not be useful to assess program performance. 

* Data entry procedures and edit check software: Data entry procedures 
and edit check software can help ensure data entering the designated 
reporting system are accurate and consistent. Written guides 
establishing who is responsible for each step in data creation and 
maintenance, and how data are transferred from initial to final formats 
can ensure data are consistently reported. Additionally, using 
electronic data management and processing software programs to conduct 
automated checks on data values and integrity can limit errors when 
data are reported at a later date. 

* Monitoring: Monitoring can ensure reported data are accurate and 
complete. Common monitoring practices may include formal on-site 
reviews of individual case files and source documentation at both the 
state and local levels, and assessments of issued guidance to ensure 
that information collected nationwide is consistent with existing 
policies and in compliance with laws and regulations. 

Several Key Issues Have Affected States' Efforts to Ensure the Quality 
of WIA Performance Data: 

Three key issues--flexibility in federal guidance, major changes to 
states' information technology (IT)[Footnote 4] systems, and limited 
monitoring efforts--have compromised states' early efforts to collect 
and report WIA performance data. The guidance available to states at 
the time of implementation allowed flexibility in key definitions and 
contributed to inconsistency in the way the data are collected and 
reported. The transition from JTPA to WIA required states to make major 
changes to their IT systems and in some cases, the transition led to 
problems with the data. States used a variety of strategies to make the 
necessary system changes: some used the software they had used to 
report under JTPA, and others developed new software for WIA. More than 
three-fourths of the states told us that they had made major 
modifications to their WIA IT systems since implementation. One-third 
of these states reported that when these modifications were made, they 
experienced significant problems that affected the quality of the data. 
Lack of oversight at the local, state, and federal levels made it 
difficult to ensure that early WIA performance data were accurate. 

Flexibility in Guidance from Labor Led to Inconsistency in the Way Data 
Are Collected and Reported: 

The guidance available to states at the time of implementation was open 
to interpretation in key terms and contributed to inconsistency in the 
way that data are collected and reported. Labor allowed states and 
local areas flexibility in determining when to register a jobseeker in 
WIA and when participants leave the program (see table 3). 

Table 3: Key WIA Terms That Allow Flexibility: 

Key term: Registration; 
Performance measure affected: All adult and dislocated worker measures. 

Key term: Exit; 
Performance measure affected: All measures except the younger youth 
skill attainment rate and employer customer satisfaction measure. 

Source: GAO analysis. 

[End of table] 

Registration. When and who is registered affects all WIA performance 
measures for adults and dislocated workers because performance data are 
only collected for those job seekers who are registered under WIA--a 
process that occurs when they begin receiving services that require 
significant staff assistance. Labor has provided detailed written 
guidance to states on who should be registered under WIA and when this 
registration should occur, but the guidance is open to interpretation 
in some areas. The guidance provides examples of when to register job 
seekers, but it sometimes requires staff to make subtle and subjective 
distinctions. For example, those who receive initial assessment of 
skill levels and the need for supportive services are not to be 
registered; those requiring comprehensive assessment or staff-assisted 
job search and placement assistance are to be registered. In an earlier 
report, we found that local areas differed on when they registered WIA 
jobseekers, raising questions about both the accuracy and comparability 
of states' performance data, and we recommended that Labor provide 
clearer guidance.[Footnote 5] Inconsistencies in when states register 
participants could lead some states to register fewer participants than 
others do, which could affect the reported outcomes. 

Exit. Determining when a participant leaves the program--or exits-- 
affects nearly all WIA performance measures because jobseekers must 
exit the program in order to be counted in the performance measures. 
While Labor's guidance explains when an exit occurs, it also has 
allowed two different kinds of exits--the hard exit and the soft exit. 
A hard exit occurs when a participant has a specific date of case 
closure, program completion or known exit from WIA-funded or one-stop 
partner-funded services. A soft exit occurs when a participant does not 
receive any WIA-funded or partner-funded service for 90 days and is not 
scheduled for future services except follow-up. Furthermore, Labor's 
guidance on WIA did not clearly specify which services are substantial 
enough to delay exiting a participant, and local areas define these 
services differently. In a recent review we found considerable 
variation in exit practices at the state and local levels.[Footnote 6] 
For example, one local area defined exit as occurring when participants 
are finished with their WIA services; another local area defined exit 
as occurring when participants have found a new job and the wages for 
that job 
are considered acceptable (regardless of the number of days that have 
passed since their last service). 
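To make the soft-exit rule concrete, the sketch below shows how a 
system might apply it. It is a minimal illustration in Python with 
hypothetical field names, not Labor's specification; among other 
simplifications, it assumes the exit date is recorded as the last date 
of service. 

from datetime import date, timedelta 

SOFT_EXIT_GAP = timedelta(days=90)  # no WIA- or partner-funded service 

def soft_exit_date(last_service_date, has_scheduled_services, as_of=None): 
    """Return the soft-exit date, or None if the participant is active. 

    A soft exit occurs after 90 days with no WIA-funded or 
    partner-funded service and no future services scheduled 
    (follow-up does not count as a service). 
    """ 
    as_of = as_of or date.today() 
    if has_scheduled_services: 
        return None  # services are planned, so no exit 
    if as_of - last_service_date >= SOFT_EXIT_GAP: 
        return last_service_date  # exit dates back to the last service 
    return None 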

In addition to allowing states the flexibility to define some 
performance elements, the initial guidance failed to specify other key 
elements necessary to ensure data quality. For example, the guidance 
did not specify which source documentation was to be collected and 
maintained to support entries into the IT system. In the absence of 
guidance, some states continued to collect source documentation similar 
to that collected under JTPA; other states moved to paperless systems 
and did not collect and retain any source documentation. Without 
consistent source documentation, there is no assurance that the data in 
the IT system are accurate. 

States Needed to Make Significant Changes to IT Systems That Initially 
Compromised Data Quality: 

The transition from JTPA to WIA required states to make significant 
changes to their IT systems, and in some cases, problems during the 
transition led to data errors. For example, several data elements 
required in WIASRD--the file of individual exiters that states submit 
to Labor every year--were similar to those collected under JTPA, but 
the data definitions were slightly changed. This sometimes led to 
miscoded or missing data--especially for those participants who were 
carried over from JTPA into WIA. In addition, new data sources were 
used to measure outcomes, and the calculations for the measures were 
complex. Some states integrated their IT systems so that the system 
that is used for WIA data collection is used for tracking participation 
in other partner programs as well. These changes required major 
modifications to the IT systems. 

States used a variety of strategies to make the necessary system 
changes, often facing challenges in fully implementing WIA's 
requirements. For example, 22 states reported that they used the same 
software they had used under JTPA to report on WIA performance, but 15 
of these states later converted to different software for WIA. Twenty- 
six states used new software for WIA at implementation, but almost one- 
third of them replaced that system when it became clear that the new 
system was not sufficient to meet WIA reporting requirements. The time 
needed to make system changes varied across states. While nearly half 
of the states reported that they were able to implement their IT system 
changes in 1 year or less, the other half reported that it took more 
than 1 year, and in some cases as long as 3 years (see fig. 2). 

Figure 2: Time It Took States to Implement Changes to IT Systems for 
WIA Implementation: 

[See PDF for image] 

Note: One state reported "did not know" and two did not respond to this 
question. 

[End of figure] 

Thirty-nine states reported to us that they had made major 
modifications to their WIA IT systems since implementation, such as 
converting to Internet-based systems or adding new capabilities such as 
case management tracking. Thirteen of these states reported that when 
they made these modifications, they experienced significant problems 
that affected the quality of the data, including lost data and 
difficulties in combining or reconciling data from the multiple systems 
they had used. While 8 of the states reported that these issues have 
been resolved, 5 told us that they are still trying to resolve these 
data quality concerns. Some of the remaining 11 states that did not 
report making major changes to their IT systems since WIA 
implementation reported that they made minor changes, such as adding or 
deleting data elements and adding reporting capabilities. 

In addition to collecting and reporting the performance data, IT 
systems must also be able to calculate the performance measures. 
However, states are not all using the same methodology to calculate 
these measures. The calculations for the measures are complex and 
sometimes confusing. For example, in calculating some of the measures 
for the adult program, states must consider (1) whether the jobseeker 
is employed at registration, (2) whether he or she is employed at both 
the first and third quarters after exit, and (3) what data were used to 
confirm employment. This information results in 14 different ways that 
adult participants can be grouped together in order to calculate the 
measures. Labor does not mandate which software package states must use 
to calculate their performance measures, and at the 5 states we 
visited, each used a different approach--commercially available 
software, software developed by the state, or one of two different 
software packages developed under contract with Labor. These software 
packages can use slightly different formulas to calculate the measures 
and, as a result, produce differences in the outcomes reported. 
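To illustrate why such calculations can diverge, consider a simplified 
version of the adult entered employment rate, sketched below in 
Python. The record fields are hypothetical, and the actual formula in 
Labor's reporting instructions involves additional conditions; small 
differences in how software handles those details can shift the 
outcomes reported. 

def entered_employment_rate(exiters): 
    """Simplified adult entered employment rate. 

    Of exiters not employed at registration, the share employed in 
    the first quarter after exit. Hypothetical record fields; the 
    actual calculation involves more conditions and exclusions. 
    """ 
    base = [r for r in exiters if not r["employed_at_registration"]] 
    if not base: 
        return None  # no one in the denominator 
    employed = sum(1 for r in base if r["employed_q1_after_exit"]) 
    return employed / len(base) 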

Monitoring of WIA Performance Data Was Limited at All Levels: 

Lack of oversight at the local, state, and federal levels made it 
difficult to ensure that early WIA performance data were accurate or 
verifiable. During the first year of WIA implementation, Labor's 
Inspector General (IG) found insufficient documentation of verification 
procedures at the state and local levels.[Footnote 7] The same report 
questioned the lack of formal federal monitoring to gauge the progress 
of state efforts to ensure the quality of the data. Furthermore, the 
report noted that Labor and states lacked adequate monitoring 
procedures and little was being done to monitor performance data at the 
case file level. In a previous study, we reported that Labor did not 
have a standard data monitoring guide in place, and regional officials-
-who have primary responsibility for monitoring--followed various 
oversight procedures. Table 4 summarizes WIA's data quality issues. 

Table 4: Summary of Data Quality Concerns That Affected States' Efforts 
to Ensure Data Quality: 

Data quality issues: Flexibility in federal guidance; 
Result of data quality issues: Inconsistent data collection for 
registration and exits. 

Data quality issues: Lack of federal guidance; 
Result of data quality issues: State practices varied on which source 
documentation should be collected and maintained. 

Data quality issues: Required IT system changes and variation in 
software used to calculate measures; 
Result of data quality issues: Data errors and missing data; 
inconsistencies in how outcomes are computed. 

Data quality issues: Insufficient monitoring; 
Result of data quality issues: It is difficult to ensure that the data 
collected and reported were accurate or verifiable. 

Source: GAO analysis. 

[End of table] 

States Have Taken Steps to Improve the Quality of WIA Performance Data: 

States have made efforts to address data quality concerns and improve 
the quality of WIA performance data. Most states have taken actions to 
clarify Labor's guidance to help local areas determine who should be 
tracked in the performance measures. Almost all states reported on our 
survey that they have controls for IT systems, such as edit checks or 
reports to help screen for errors or missing data. In addition, most 
states reported to us that they monitor local areas to ensure data 
quality and consistency by assessing local procedures and policies. 

States Have Taken Some Actions to Clarify Labor's Guidance: 

States have taken some steps to provide additional clarity to help 
local areas adhere to federal guidance. Over 40 states reported to us 
that they provide guidance to help local areas determine which 
jobseekers should be tracked--or registered--for WIA and when 
participants leave--or exited--services, and therefore get counted in 
the performance measures. For example, a West Virginia state official 
said the state developed a list of staff-assisted services that should 
trigger registration under WIA. Most states also provide technical 
assistance and training on registration and exit policies (see fig. 3). 
Some states take other steps to help local areas adhere to federal 
policies. For example, California state officials attempt to prevent 
local areas from keeping participants enrolled in the program once they 
have exited services by incorporating a capability in their IT system 
that will automatically exit a person who has not had any service for 
150 days. 

Figure 3: Actions States Have Taken to Clarify and Explain Federal 
Guidance: 

[See PDF for image] 

[End of figure] 

States Have Taken Steps to Reduce Errors in IT Systems: 

States have made efforts to reduce the errors in their WIA performance 
data. Almost all states reported on our survey that they have controls 
for IT systems, such as edit checks or reports to help screen for 
errors or missing data. Forty-six states screen for missing values, and 
44 states screen for errors such as data logic inconsistencies (see 
fig. 4). For example, if an individual is registered in the youth 
program, but the birth date indicates that the person is 40 years old, 
this case would be flagged in an error report checking for 
inconsistencies between these two data elements. Some of the states we 
visited told us they allow local areas flexibility in deciding who 
should enter data and how it gets done. In some locations, a case 
manager who works with the participant may enter data, and sometimes 
the case manager completes forms that are given to a data entry 
specialist. Despite these differences, most states have implemented 
edit checks and other controls in their IT systems to detect and 
control for errors. For example, state officials we met with in West 
Virginia said that the state created screen edits and drop-down menus 
to guide case managers as they enter data. If a case manager does not 
enter the necessary data, the system will not let the data entry 
process go forward until the data are entered. State officials 
acknowledged that people entering data can still make mistakes if they 
choose the wrong option on a drop-down menu, but they told us they try 
to minimize these mistakes by conducting training sessions to acquaint 
staff with the right techniques. States also address data entry errors 
by running error reports. In New York, state officials told us that 
they produce error reports for each local area to show where data are 
missing, meeting with local officials to discuss these reports every 6 
weeks. 

Figure 4: Types of Errors Addressed with Edit Checks and Exception 
Reports: 

[See PDF for image] 

[End of figure] 
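The age check described above is typical of the data-logic edits 
states build into their systems. As a rough sketch, with hypothetical 
field names and a simplified age rule (youth participants are 14 to 21 
at registration), such a check might look like this in Python: 

from datetime import date 

def logic_errors(record): 
    """Return a list of data-logic inconsistencies in one record. 

    Example check: a participant registered in the youth program 
    whose birth date implies an age of 40 is flagged. 
    """ 
    errors = [] 
    birth = record.get("birth_date") 
    registered = record.get("registration_date", date.today()) 
    if birth is None: 
        errors.append("missing birth_date") 
    elif record.get("program") == "youth": 
        age = (registered - birth).days // 365  # approximate age 
        if not 14 <= age <= 21: 
            errors.append("youth participant with implausible age %d" % age) 
    return errors 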

Labor officials in most of Labor's six regions told us that states have 
made improvements to their IT systems since WIA was first implemented. 
For example, Labor officials in one region said that they identified 
data quality issues related to states' IT systems in 10 of the 11 
states in that region in program year 2000 and found similar issues in 
only 5 states in the region between program years 2002 and 2004. A 
Labor official in another region told us that initial data collection 
efforts were poor because states were largely focused on getting WIA up 
and running and had not developed adequate IT system instructions. Now, 
most states have developed IT system manuals with clear instructions. 
Some regional officials told us that they provided technical assistance 
and closely monitored states that had early problems with their IT 
systems. 

Most States Conduct Monitoring of Key Data Elements at the Local Level: 

Most states told us they monitor local areas to ensure data quality and 
consistency by assessing local procedures and policies. Thirty-eight 
states reported to us that they monitor data collection at the local 
level. At least 33 states also reported to us that they conduct 
monitoring of local policies and procedures on registrations and exits 
and data entry (see fig. 5). State officials at the sites we visited 
generally said that they conduct annual monitoring visits to local 
areas or one-stop centers, and some conduct more frequent monitoring 
visits. Texas state officials we visited told us that the state 
conducts an annual review of each local area that includes examining 
participant files to assess eligibility decisions and ensure that 
outcomes are documented. In New York, state officials said that they 
have monitoring teams located in five regions across the state who 
visit the local areas within their regions about once a month. 
Initially, these visits focused on program compliance, but they have 
recently been expanded to include data quality. 

Figure 5: States' Monitoring Activities: 

[See PDF for image] 

[End of figure] 

Labor Has Taken Steps to Improve WIA Data Quality, but Some Issues 
Remain: 

Labor recently began addressing data quality issues; however, some data 
quality issues remain. In 2004, Labor addressed some data quality 
concerns by implementing new data validation requirements that called 
for states to review samples of participant files and provided software 
to help states ensure that the performance measures are computed 
accurately. Most states reported on our survey that Labor's new 
requirements are having positive effects on states' and local areas' 
attention to data quality. However, Labor does not currently have 
methods in place to review states' data validation efforts and hold 
states accountable to the data validation requirements. Labor's 
guidance requiring states to implement common performance measures on 
July 1, 2005, clarified some key data elements that had been 
problematic with regard to the WIA performance measures, but it does 
not address all the issues. Further, Labor has some federal monitoring 
processes in place but lacks a standard monitoring guide to address 
data quality. 

Labor's Data Validation Requirements Address Some Data Quality Issues 
but Do Not Address All Concerns: 

To address data quality concerns, Labor required states to implement 
new data validation procedures for WIA performance data in October 
2004.[Footnote 8] This process requires states to conduct two types of 
validation: (1) data element validation--reviewing samples of WIA 
participant files, and (2) report validation--assessing whether states' 
software accurately calculated performance outcomes. These requirements 
addressed a gap in earlier guidance by providing instructions for 
collecting and retaining source documentation to verify that the 
reported data are accurate. This includes specifying which 
documentation is acceptable and what should be maintained in 
participant files. For example, to document that a participant is 
placed in post program employment, states must show that the 
information was obtained from the UI wage records or Wage Record 
Interchange System or other sources such as a pay stub, a 1099 form, or 
telephone verification with employers. 

Labor's data validation process requires states to monitor local areas 
to compare data elements that were reported to the state against source 
documentation to verify that the data are accurate. Labor selected data 
elements for validation based on factors such as feasibility and risk 
of error. For example, self-reported data elements, such as race and 
ethnicity, are not validated because it is not feasible to locate the 
participant to verify these items. Data elements needing independent 
documentation, such as the use of supplemental data sources to 
determine employment, are assumed to be at higher risk of error than 
those drawn from the UI wage records. Labor provided software to help states 
select a sample of files to be validated that includes participants 
from each group reported on in the performance measures--adults, 
dislocated workers, older youth, and younger youth. 

States are required to conduct monitoring visits to the local areas 
selected for validation and compare data elements for each participant 
in the sample to source files to ensure accuracy, but Labor does not 
have a standard process to verify that states did this 
correctly.[Footnote 9] State monitors record whether each data element 
is supported by source documentation, and therefore passes, or whether 
the documentation shows the element was incorrect or was not supported 
with source documentation, and, therefore, fails the element. States 
use Labor's software to total error rates for each population group and 
states submit these data to Labor. 
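In simplified form, the error-rate tally works as sketched below in 
Python. The record structure is hypothetical; Labor's validation 
software also handles sampling and weighting, which this illustration 
omits. 

def error_rates(sampled_records): 
    """Tally data element error rates by population group. 

    Each sampled record carries the participant's group (adult, 
    dislocated worker, older youth, or younger youth) and a 'failed' 
    flag set by the state monitor when a data element is unsupported 
    by source documentation. 
    """ 
    totals, failures = {}, {} 
    for record in sampled_records: 
        group = record["group"] 
        totals[group] = totals.get(group, 0) + 1 
        if record["failed"]: 
            failures[group] = failures.get(group, 0) + 1 
    return {g: failures.get(g, 0) / totals[g] for g in totals} 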

To address inconsistencies in calculating the performance measures, 
Labor's report validation software verifies the accuracy of outcomes 
reported by states. States can use Labor's software in two ways: they 
can use the software to compute the state's performance measures or 
they can use the software to check the calculations computed by their 
state's software to make sure that the measures were calculated 
accurately. According to Labor, about 20 states are currently using its 
software to compute their states' performance measures. The remainder of 
states use their own or commercially available software to compute 
outcomes. These states must submit validation reports to Labor to show 
any differences between their calculations and the outcomes computed 
with Labor's software. 
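Conceptually, report validation reduces to recomputing each measure 
and comparing the result with the state's reported value, as in the 
minimal Python sketch below. The inputs are hypothetical; Labor's 
software defines its own formats and comparison rules. 

def validation_report(state_outcomes, recalculated_outcomes, tolerance=0.001): 
    """Flag measures where the state's value differs from the recalculation. 

    Both inputs map measure names to rates; differences beyond the 
    tolerance are returned for review. 
    """ 
    discrepancies = {} 
    for measure, state_value in state_outcomes.items(): 
        recalculated = recalculated_outcomes.get(measure) 
        if recalculated is None or abs(state_value - recalculated) > tolerance: 
            discrepancies[measure] = (state_value, recalculated) 
    return discrepancies 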

Since initiating data validation, Labor made a number of modifications 
to its software, and states reported on our survey that they 
experienced some challenges in using the software. Most states reported 
that they experienced only minor difficulties or had no problems in 
using Labor's software for both data element validation and report 
validation (see fig. 6). However, some states did report major 
difficulties. For example, seven states reported that they initially 
had major difficulties with report validation, such as resolving 
discrepancies or errors in Labor's software. States also reported 
concerns that they were not always informed when Labor made updates to 
the software and did not always receive adequate time to work with the 
software before the results were due to Labor. In addition, some states 
reported on our survey that conducting data element validation was time 
consuming. Half the states that were able to estimate the time it took 
to complete data element validation said it took 60 days or less, and 
half said that it took more than 60 days. 

Figure 6: Few States Faced Major Difficulties Using Labor's Software: 

[See PDF for image] 

Note: Two states reported that they did not receive training 
assistance. 

[End of figure] 

The majority of states told us that Labor's guidance, training, and 
technical assistance on data validation were sufficient (see fig. 7). 

Figure 7: Most States Found Labor's Assistance in Data Validation 
Sufficient: 

[See PDF for image] 

[End of figure] 

Labor's Data Validation Requirements May Be Having Some Positive 
Effects on States and Local Areas: 

It is too soon to fully assess whether Labor's efforts have improved 
data quality; however, at least 46 states reported on our survey that 
Labor's new requirements have helped increase awareness of data 
accuracy and reliability at the state and local level (see fig. 8). A 
New York state official told us that the federal requirements helped 
local staff better understand the connection between the data that get 
entered and how these data affect performance levels. In addition, over 
30 states said that the new requirements have helped them in their 
monitoring of outcomes and eligibility. Some states and local areas we 
visited reported finding errors in their data through the data 
validation process and have made modifications to state and local 
procedures to enhance data quality as a result. For example, a local 
area in California started doing monthly spot checks of files to 
identify and correct errors on an ongoing basis. In New York, a local 
area told us that it added a new staff person, developed new forms and 
procedures, and centralized data entry to have more control over data 
quality as a result of the federal data validation process. While 
either centralized or decentralized data entry may be effective, 
experts in WIA performance data told us that one of the most important 
factors in avoiding human error is for program managers and staff who 
enter data to understand how the data are used. 

Figure 8: States' View of How Labor's Data Validation Efforts Have 
Helped Them: 

[See PDF for image] 

[End of figure] 

Labor Currently Has No Method to Hold States Accountable for Complying 
with Data Validation Requirements: 

While Labor's data validation requirements are having some positive 
effects on states and local areas, Labor currently has no mechanism to 
hold states accountable for complying with the data validation 
requirements. Labor has plans to develop accuracy standards for report 
validation and to hold states accountable to these standards in about 3 
years. Initially, Labor planned to use program year 2003--July 1, 
2003, through June 30, 2004--as the base year for developing accuracy 
standards 
on report validation. However, as a result of reporting changes for the 
common measures, Labor has postponed the development of these standards 
until program year 2006, beginning July 1, 2006. At this time, Labor 
does not have plans to develop accuracy standards for the data element 
validation portion of its requirements. In addition, Labor does not 
conduct its own review of a sample of WIA participant files verified by 
states as part of data validation to ensure that states did this 
process correctly. Table 5 provides a summary of data quality concerns 
and how Labor's data validation efforts affect these concerns. 

Table 5: Summary of How Labor's Data Validation Requirements Affect 
Data Quality Concerns: 

Data quality issues: State practices varied on which source 
documentation should be collected and maintained because Labor did not 
provide guidance on this; 
Labor's efforts to address issues: Provided instructions on collecting 
and retaining source documentation to verify that the reported data are 
accurate; 
Results of Labor's efforts: States and local areas have clearer 
guidance on the source documentation needed. 

Data quality issues: Variation in software used to calculate measures 
led to inconsistencies in how outcomes are computed; 
Labor's efforts to address issues: Provided software to verify the 
accuracy of outcomes reported by states; 
Results of Labor's efforts: About 20 states use Labor's software to 
calculate measures, and the rest must submit validation reports to 
Labor; 
Remaining data quality issues: Labor has no mechanism to hold states 
accountable to the report validation requirements, so it cannot ensure 
consistency in calculations. 

Data quality issues: Insufficient monitoring made it difficult to 
ensure that the data collected and reported were accurate or 
verifiable; 
Labor's efforts to address issues: Require states to conduct data 
element validation to compare reported data with source documents; 
Results of Labor's efforts: States must submit data element errors to 
Labor; 
Remaining data quality issues: Labor does not conduct a review of 
states' data element validation work; therefore, it cannot be sure 
that states are doing this correctly. 

Source: GAO analysis. 

[End of table] 

Labor's Recent Common Measures May Address Some Concerns, but Some 
Issues Remain: 

In response to an OMB initiative, Labor recently began requiring states 
to implement common performance measures for WIA programs. OMB 
established a set of common measures to be applied to most federally 
funded job training programs that share similar goals. Labor further 
defined the common measures for all of its Employment and Training 
Administration programs and required states to implement these measures 
beginning July 1, 2005. In addition, Labor is replacing the definitions 
for the WIA measures that are similar to the common measures with the 
new definitions for common measures (see table 6). 

Moving to the common measures may increase the comparability of outcome 
information across programs and make it easier for states and local 
areas to collect and report performance information across the full 
range of programs that provide services in the one-stop system. Many 
federal job training programs have had performance measures that track 
similar outcomes but vary in the terms used and the way the measures 
are calculated. For example, WIA's adult program uses a 
different time period to assess whether participants got a job than the 
Wagner-Peyser funded Employment Service does. WIA's adult program looks 
at whether participants get a job by the end of the first quarter after 
exit, whereas the Employment Service looks at whether participants get 
a job in the first or second quarter after registration. Under common 
measures, both programs use the same time period for this measure. 
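
To make the timing difference concrete, the Python sketch below 
encodes the two older definitions using simple quarter offsets; the 
function names and data layout are illustrative, not drawn from either 
program's systems. A jobseeker who registers and exits in the same 
quarter and first finds a job two quarters later is counted by one 
definition but not the other.

def entered_employment_wia_adult(exit_quarter, employed_quarters):
    """Older WIA adult definition: employed by the end of the first
    quarter after the exit quarter."""
    return (exit_quarter + 1) in employed_quarters

def entered_employment_es(registration_quarter, employed_quarters):
    """Older Employment Service definition: employed in the first or
    second quarter after registration."""
    return any(quarter in employed_quarters
               for quarter in (registration_quarter + 1,
                               registration_quarter + 2))

# Same jobseeker: registers and exits in quarter 10, first employed in
# quarter 12. The two older definitions disagree; the common measures
# apply one time period to both programs.
employed = {12}
print(entered_employment_wia_adult(10, employed))  # False
print(entered_employment_es(10, employed))         # True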

Table 6: Common Measures Are Similar to Some of the WIA Measures: 

Program: Adult; 
WIA measures: 
* Entered employment rate; 
* Average earnings change in 6 months; 
* Employment retention rate at 6 months; 
* Entered employment and credential rate; 
Common measures: 
* Entered employment; 
* Earnings increase; 
* Employment retention. 

Program: Dislocated workers; 
WIA measures: 
* Entered employment rate; 
* Earnings replacement rate at 6 months; 
* Employment retention rate at 6 months; 
* Entered employment and credential rate. 

Program: Youth (age 19-21); 
WIA measures: 
* Entered employment rate; 
* Average earnings change in 6 months; 
* Employment retention rate at 6 months; 
* Entered employment/education/training and credential rate; 
Common measures: 
* Placement in employment and education; 
* Attainment of a degree or certificate; 
* Literacy or numeracy gains. 

Program: Youth (age 14-18); 
WIA measures: 
* Skill attainment; 
* Diploma or equivalent; 
* Placement and retention rate. 

Source: U.S. Department of Labor. 

Note: Bolded WIA measures are similar to common measures. For common 
measures, adults and dislocated workers are reported using the same 
measures and all youth are reported together. 

[End of table] 

Labor's new guidance for common measures requires states to collect a 
count of all WIA participants who use one-stop centers. This can help 
provide a more complete picture of the one-stop system, but it does not 
clarify when participants should be registered for WIA and tracked in 
the performance measures. Therefore, it raises questions about both the 
accuracy and comparability of WIA's outcomes for adults and dislocated 
workers. Under common measures, states are being required to begin 
collecting and reporting a quarterly count of all jobseekers who 
receive services at one-stop centers. To track these jobseekers, Labor 
suggested that states collect a valid Social Security number, but 
allowed states to exclude individuals who do not wish to disclose their 
Social Security numbers. In addition, Labor is encouraging states to 
voluntarily report performance information on all jobseekers who are 
counted in one-stops. However, it is not clear how many states have the 
capability to track jobseekers who receive only self-service and 
informational activities. While 30 states reported on our survey that 
they have a state system to track all jobseekers, some officials we 
visited told us they do not require local areas to collect and report 
this information to the state. Given this, implementing the new 
requirement may take time and early data collection efforts may be 
incomplete. 
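
Under assumptions about the data layout (the field names here are 
hypothetical), a minimal Python sketch of the quarterly count might 
look like the following: distinct jobseekers are counted by Social 
Security number, and visits by individuals who decline to disclose a 
number are tallied separately rather than dropped silently.

def quarterly_jobseeker_count(visits, quarter):
    """Count distinct jobseekers (by SSN) served in a quarter; visits
    without an SSN on file are tallied but cannot be deduplicated."""
    ssns = set()
    visits_without_ssn = 0
    for visit in visits:
        if visit["quarter"] != quarter:
            continue
        if visit.get("ssn"):
            ssns.add(visit["ssn"])
        else:
            visits_without_ssn += 1
    return {"jobseekers_counted": len(ssns),
            "visits_without_ssn": visits_without_ssn}

visits = [
    {"quarter": "2005Q3", "ssn": "123-45-6789"},
    {"quarter": "2005Q3", "ssn": "123-45-6789"},  # repeat visit, counted once
    {"quarter": "2005Q3", "ssn": None},           # declined to disclose
]
print(quarterly_jobseeker_count(visits, "2005Q3"))
# {'jobseekers_counted': 1, 'visits_without_ssn': 1}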

Labor's guidance on common measures provides for a clearer 
understanding of when WIA participants should be exited from the 
program than did earlier WIA guidance. First, the guidance provides a 
more uniform definition of exit. In the past, local areas could use a 
hard exit--when a participant has a known date of completion of or 
exit from services--or a soft exit--when a participant has not 
received any 
services for 90 days. Under the new guidance, only soft exits will be 
allowed and states will no longer be able to report a hard exit. 
Second, Labor clarified that some services are not substantial enough 
to keep a participant from being exited from WIA. For example, if a 
case manager is only making phone calls to the participant to see if he 
or she has a job or needs additional services or income support 
payments, those phone calls are not considered a service (see table 7). 
This new clarification may help prevent local areas from keeping WIA 
participants enrolled long after they have completed their last valid 
service. In a previous study, however, we cautioned that rushed 
implementation of these reporting changes may not allow states and 
local areas enough time to fully meet the requirements and could 
negatively affect the data quality of the information 
reported.[Footnote 10] 

Table 7: List of Services That Labor Does Not Consider Substantial 
Enough to Keep a Participant from Being Exited: 

Services not considered substantial enough to keep a participant from 
being exited: 

* A determination of eligibility to participate in the program. 

* Self-directed job search that does not result in a referral to a job. 

* Services and activities specifically provided as follow-up. 

* Regular contact with the participant or employer to only obtain 
information regarding employment status, educational progress, need for 
additional services, or income support payments. 

Source: U.S. Department of Labor. 

Note: Income support payments do not include trade readjustment 
allowances and other needs-related payments funded through the Trade 
Adjustment Assistance program or National Emergency Grants. 

[End of table] 
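
The Python sketch below illustrates how the soft-exit rule and the 
exclusions in table 7 might be applied to a participant's service 
history; the service-type labels are hypothetical shorthand for the 
categories listed above, and the 90-day threshold comes from the 
guidance.

from datetime import date, timedelta

# Hypothetical shorthand for the table 7 categories that do not keep a
# participant from being exited.
NON_QUALIFYING = {"eligibility_determination", "unreferred_job_search",
                  "follow_up", "status_check_contact"}

def soft_exit_date(services, as_of):
    """Return the exit date (the date of the last qualifying service)
    if 90 or more days have passed with no qualifying service since
    then; otherwise return None (the participant has not exited)."""
    qualifying_dates = [service_date for service_date, kind in services
                        if kind not in NON_QUALIFYING]
    if not qualifying_dates:
        return None
    last = max(qualifying_dates)
    return last if as_of - last >= timedelta(days=90) else None

services = [
    (date(2005, 1, 10), "job_training"),
    (date(2005, 3, 2), "status_check_contact"),  # does not reset the clock
]
print(soft_exit_date(services, as_of=date(2005, 6, 1)))  # 2005-01-10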

Data Validation and Other Labor Efforts Address Some Monitoring 
Concerns, but Federal Monitoring Still Has Some Limitations: 

In addition to data validation, Labor has some limited federal 
monitoring processes in place to oversee state and local performance 
data. Labor's regional offices--with primary responsibility for 
oversight--conduct a limited review of the data in quarterly and 
annual WIA performance reports. This generally involves 
identifying outliers or missing data and comparing the data with data 
in previous reports. If Labor regional officials identify basic 
problems with the data, they contact states to reconcile concerns. 

Labor's headquarters implemented an electronic system to manage grant 
oversight and track activities throughout the program year--called 
Grants E-Management System (GEMS). This system provides automated tools 
for conducting grant monitoring activities, including performing risk 
assessments and generating reports. Labor developed the risk assessment 
to help determine the programs and grant projects most in need of 
monitoring. The risk assessment assigns a risk level to each state 
based on past performance and other criteria. For example, for the WIA 
program, a state may be considered at risk if it failed to meet its 
performance levels in the prior year. However, regional officials can 
override the risk assessment if they are aware of other information 
that may not be captured in GEMS. 
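
GEMS's actual criteria and weights are not described in this report. 
The Python sketch below only illustrates the general pattern: a risk 
level assigned from past performance and other criteria, with a 
regional official's override taking precedence. The specific criteria 
shown are assumptions.

def assess_risk(state, override=None):
    """Assign a risk level from simple, hypothetical criteria; an
    override supplied by a regional official takes precedence."""
    if override is not None:
        return override
    flags = 0
    if state["missed_performance_levels_prior_year"]:
        flags += 1
    if state["late_reports"] > 0:
        flags += 1
    if state["open_audit_findings"] > 0:
        flags += 1
    return ("low", "medium", "high", "high")[flags]

profile = {"missed_performance_levels_prior_year": True,
           "late_reports": 0,
           "open_audit_findings": 1}
print(assess_risk(profile))                     # 'high'
print(assess_risk(profile, override="medium"))  # override applies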

Labor also implemented a core monitoring guide in spring 2005 to ensure 
that certain basic parameters are being followed during monitoring 
visits across all regions, but this guide does not provide for a 
standard analysis of data quality issues. According to Labor officials, 
they are developing program supplements for this guide that will 
address other issues specific to various programs. One regional office 
developed an extensive monitoring guide to review state and local 
guidance, procedures used for data entry, IT systems, and other data 
quality factors. This guide has been used since 2003 to review the 
eight states in its region. In addition, Labor officials said that 
several regional offices are using this guide and that Labor plans to 
develop a similar guide to be used across all regions. 

Conclusions: 

WIA overhauled the way federally funded employment and training 
services are provided to jobseekers and employers, and introduced 
changes that significantly affected the way performance data are 
collected and reported for WIA. Making this shift has taken a long time 
and some trial and error on the part of Labor, states, and localities. 
The magnitude of changes required considerable retooling of states' IT 
systems, which had a negative effect on the integrity of WIA 
performance data during the initial years of implementation. Since 
then, states have made progress in addressing challenges they faced in 
modifying or developing new IT systems and have invested considerable 
effort establishing controls for IT systems to minimize data errors. 

In addition, Labor's recent efforts to implement common performance 
measures across many of the WIA partner programs and its revised WIA 
reporting requirements have helped to address the concerns about when 
participants complete services and should be tracked in the performance 
measures. The new requirement for states to capture limited data on all 
WIA participants is an important step to better determine the full 
reach of WIA. However, this change still does not address the 
long-standing challenge Labor has faced in clearly defining which 
participants should be counted in the performance measures. Without 
clear guidance, the WIA performance data will continue to be 
inconsistent, even if the other data quality safeguards in place at the 
federal, state, and local levels improve the quality of each state's 
and local area's data. 

Labor's implementation of new data validation requirements is a major 
step toward addressing concerns about data quality resulting from the 
limited guidance and monitoring of WIA performance data in the past. By 
providing additional guidance and software to help states calculate the 
performance measures in a more uniform manner and requiring states to 
compare data reporting with participant case files, Labor has gone a 
long way toward helping ensure the consistency and comparability of the 
data. Most notably, these requirements have significantly raised 
awareness of data quality at the state and local levels, which is an 
essential part of ensuring data quality. However, more time is needed 
to fully assess the impact these new requirements are having on data 
quality. In addition, Labor does not currently review a sample of the 
participant files verified by states, nor does it have a mechanism to 
hold states accountable for meeting the data validation requirements. 
Further, Labor has not developed a standard monitoring guide to more 
uniformly assess state and local data collection and processing to 
ensure data quality. Without a standard monitoring guide and a means to 
hold states accountable to the data validation requirements, it will be 
difficult to assure decision makers that the data are of sufficient 
quality for applying incentives and sanctions, and making budget 
decisions. 

Recommendations For Executive Action: 

To address the inconsistencies in determining when participants should 
be registered and counted in the performance measures, we recommend 
that the Secretary of Labor determine a standard point of registration 
and monitor states to ensure that the policy is consistently applied. 

To enhance the data validation requirements, we recommend that the 
Secretary of Labor: 

* conduct its own review of the WIA participant files validated by 
states to ensure that states did this correctly, and: 

* ensure that steps are taken to hold states accountable to both the 
report validation and data element validation requirements. 

To address variations in federal monitoring practices, we recommend 
that the Secretary of Labor develop a standard comprehensive monitoring 
tool for WIA performance data that is used across all regions, 
including monitoring the new guidelines for determining when 
participants end services. 

Agency Comments: 

We provided a draft of this report to Labor for review and comment. 
Labor agreed with our findings and recommendations. Labor agreed that 
the lack of a standard point of registration and exit prevents 
comparisons across states and leads to performance outcome information 
that is arbitrary and inconsistent. Labor also agreed that steps are 
needed to increase the integrity of the data validation requirements 
and to improve the completeness and consistency of oversight. A copy of 
Labor's response is in appendix II. 

In response to our recommendations, Labor noted that it plans to 
implement a policy prior to the start of program year 2006 to clarify 
the point of registration and exit. In addition, Labor plans to modify 
the current data validation procedures to begin reviewing a sample of 
states' validated files and plans to hold states accountable for data 
validation results by program year 2006. Further, Labor told us that it 
is taking steps to develop a comprehensive monitoring guide for 
performance data and plans to provide training on this new guide to 
help improve the completeness and consistency of oversight. 

We are sending copies of this report to the Secretary of Labor, 
relevant congressional committees, and others who are interested. 
Copies will also be made available to others upon request. The report 
is also available on GAO's home page at http://www.gao.gov. 

If you or members of your staff have any questions about this report, 
please contact me at (202) 512-7215. Contact points for our Offices of 
Congressional Relations and Public Affairs may be found on the last 
page of this report. GAO staff who made major contributors to this 
report are listed in appendix III. 

Signed by: 

Sigurd R. Nilsen: 
Director, Education, Workforce, and Income Security Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

We examined (1) the data quality issues that have affected states' 
efforts to collect and report Workforce Investment Act (WIA) 
performance data; (2) states' actions to address them; and (3) the 
actions the Department of Labor (Labor) is taking to address data 
quality issues, and the issues that remain. To learn more about states' 
experiences implementing data collection and reporting system changes 
for WIA, their implementation of Labor's data validation requirements 
for WIA, and state and local efforts to address the quality of WIA 
data, we conducted a web-based survey of state workforce officials and 
conducted site visits in five states, where we interviewed state 
officials and visited two local areas or one-stop centers in each 
state. We also collected information on the quality of WIA data through 
interviews with Department of Labor officials in headquarters and all 
six regional offices and with nationally recognized experts, and we 
reviewed relevant research literature. Our work was conducted between 
June 2004 
and September 2005 in accordance with generally accepted government 
auditing standards. 

Web-Based Survey: 

To determine the factors that affect the quality of WIA performance 
data, we conducted a Web-based survey of state workforce officials. 
These officials were identified using a GAO-maintained list of state 
WIA officials. We e-mailed the contacts, and they confirmed that they 
were the appropriate contact for our survey or identified and referred 
us to another person at the state level. Survey topics included (1) the 
changes made to data collection and reporting during the transition 
from the Job Training Partnership Act to WIA, (2) the current status of 
WIA data collection and reporting systems, (3) implementation of the 
U.S. Department of Labor's data validation requirements, and (4) state 
and local efforts to ensure the accuracy and reliability of WIA data. 
The survey was conducted using a self-administered electronic 
questionnaire posted on the Web. We contacted respondents via e-mail 
announcing the survey, and sent follow-up e-mails to encourage 
responses. The survey data were collected between February and May 
2005. We received completed surveys from all 50 states (a 100 percent 
response rate). We did not include Washington, D.C., and the U.S. 
territories in our survey. 

We developed the questionnaire with social science survey specialists. 
Because this was not a sample survey, there are no sampling errors. 
However, the practical difficulties of conducting any 
survey may introduce errors, commonly referred to as nonsampling 
errors. For example, differences in how a particular question is 
interpreted, in the sources of information that are available to 
respondents, or how the data are entered into a database can introduce 
unwanted variability into the survey results. We took steps in the 
development of the questionnaire, the data collection, and the data 
analysis to minimize these nonsampling errors. For example, prior to 
administering the survey, we pretested the content and format of the 
questionnaire with several states to determine whether (1) the survey 
questions were clear, (2) the terms used were precise, (3) respondents 
were able to provide the information we were seeking, and (4) the 
questions were unbiased. We made changes to the content and format of 
the final questionnaire based on pretest results. Because this was a 
Web-based survey in which respondents entered their responses directly 
into our database, the possibility of data entry errors was greatly 
reduced. We also performed computer analyses to identify 
inconsistencies in responses and other indications of error. In 
addition, a second independent analyst verified that the computer 
programs used to analyze the data were written correctly. 
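
As an example of the kind of automated consistency check described 
above--the question identifiers and answer values here are 
hypothetical--the following Python sketch flags respondents whose 
follow-up answer contradicts a screening answer.

def find_inconsistencies(responses):
    """Flag respondents who reported no major IT changes (q1) but also
    reported problems resulting from such changes (q2)."""
    flagged = []
    for respondent, answers in responses.items():
        if (answers.get("q1_major_it_changes") == "no"
                and answers.get("q2_problems_from_changes") == "yes"):
            flagged.append(respondent)
    return flagged

responses = {
    "State A": {"q1_major_it_changes": "no",
                "q2_problems_from_changes": "yes"},   # inconsistent
    "State B": {"q1_major_it_changes": "yes",
                "q2_problems_from_changes": "yes"},
}
print(find_inconsistencies(responses))  # ['State A']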

Site Visits: 

We visited five states--California, New York, Texas, West Virginia, and 
Wyoming--and traveled to two local areas or one-stop centers in each 
of these states.[Footnote 11] We selected these states because they 
represent a range of IT systems--statewide comprehensive systems 
versus local systems with a state reporting function--include both 
single and multiple workforce areas, and are geographically diverse. 
From within 
each state, we judgmentally selected two local boards. In the case of 
our single workforce area state, we visited two one-stop centers (see 
table 8). 

Table 8: Site Visit States and Local Areas: 

State: California; 
Local area: Alameda County; 
City: Hayward. 

Local area: Orange County; 
City: Westminster. 

State: New York; 
Local area: Dutchess County; 
City: Poughkeepsie. 

Local area: Fulton-Montgomery-Schoharie; 
City: Amsterdam. 

State: Texas; 
Local area: Alamo; 
City: San Antonio. 

Local area: Dallas; 
City: Dallas. 

State: West Virginia; 
Local area: Region I; 
City: Beckley. 

Local area: Region IV; 
City: Ripley. 

State: Wyoming; 
Local area: Central Region; 
City: Casper. 

Local area: North Central Region; 
City: Sheridan. 

Source: GAO analysis. 

[End of table] 

In each state visited, we obtained general information about the 
state's implementation of WIA, an overview of the state's WIA 
administrative structure, the management information system and 
reporting processes in place to meet the federal requirements, data 
quality practices at the state and local levels, and the 
implementation of Labor's data validation requirements. We interviewed 
state officials 
responsible for overseeing local areas' WIA programs and for analyzing 
and reporting 
on the state's WIA performance data, as well as other state WIA and 
information technology (IT) officials and staff of the state's 
Workforce Investment Board. At the local areas, we interviewed WIA 
officials and staff, including service providers, staff responsible for 
performance management issues, IT staff, case managers and other 
frontline staff, as well as staff of the local area Workforce 
Investment Board. The state and local interviews were administered 
using a semi-structured interview guide. 

Information that we gathered on our site visits represents only the 
conditions present in the states and local areas at the time of our 
site visits, from August 2004 through March 2005. We cannot comment on 
any changes that may have occurred after our fieldwork was completed. 
Furthermore, our fieldwork focused on in-depth analysis of only a few 
selected states and local areas or sites. On the basis of our site 
visit information, we cannot generalize our findings beyond the states 
and local areas or sites we visited. 

[End of section] 

Appendix II: Comments from the Department of Labor: 

U.S. Department of Labor: 
Assistant Secretary for Employment and Training: 
Washington, D.C. 20210: 

Oct. 31, 2005

Mr. Sigurd R. Nilsen: 
Director: 
Education, Workforce, and Income Security Issues: 
U.S. Government Accountability Office: 
441 G Street, N.W.: 
Washington, D.C. 20548: 

Dear Mr. Nilsen: 

The Employment and Training Administration (ETA) is in receipt of the 
draft Government Accountability Office (GAO) report entitled, 
"Workforce Investment Act: Labor and States Have Taken Actions to 
Improve WIA Data Quality, But Additional Steps Are Needed" (GAO-06-82). 

In addressing the report's first recommendation, the Department of 
Labor (Department) agrees that a policy needs to be implemented 
concerning a standard point of registration and exit for employment and 
training programs. Initially, upon enactment of the Workforce 
Investment Act, the Department established broad registration and exit 
parameters in order to support the principle of state and local 
flexibility. However, by not having a standard point of registration 
and exit, performance comparisons across states cannot be made and 
performance outcome information is arbitrary and inconsistent. It is 
our intent to implement a policy prior to the start of Program Year 
2006. 

The second recommendation suggests that the Department should conduct 
its own reviews of WIA participant files to ensure validation was done 
correctly and take steps to hold states accountable to both the report 
validation and data element validation requirements. The Department is 
already reviewing state validation results during monitoring visits. 
National and regional office staff have been working with the states to 
review the policies and training developed at the state and local 
levels to improve data quality and have conducted several visits during 
the past 18 months to review validation procedures. 

The Department originally intended to have standard error rates for the 
report validation results in place by Program Year 2004. Since that 
time, however, the Department has revised current reporting 
requirements in order to implement the set of common performance 
measures across all ETA-funded programs. We are in the process of 
reevaluating the impact this transition will have on data validation, 
including the changes to the data elements currently validated. In 
order to allow states a sufficient transition period, ETA plans to use 
the data validation results for Program Year 2005 as the basis for 
setting acceptable error rates to be implemented by Program Year 2006. 
Also, the Department is examining the possibility of making a state's 
data validation results one of the criteria to determine state 
eligibility for an incentive award. 

The Department agrees with GAO's third recommendation on developing a 
standard comprehensive monitoring tool for WIA performance data and is 
taking steps to develop a comprehensive monitoring supplement for 
regional use that will include a section on the review of performance 
information. The performance section will include data validation 
procedures for each specific program, including Workforce Investment 
Act Title 1B, Wagner-Peyser Act, and Trade Adjustment Assistance 
programs. Through the training of ETA employees using this supplement, 
we hope to improve the completeness and consistency of our oversight. 

ETA also plans to modify the current data validation software to allow 
Federal staff the opportunity to pull a sample of the validation 
records at the state level for review. This will increase the integrity 
of the process by allowing the monitoring staff the opportunity to 
randomly select records for each program for review. National and 
regional performance staff are planning the development and layout of 
the monitoring guide. One region has created a first draft, which has 
been used by other regions during initial monitoring visits. We will 
use that model as a starting point and work to create one guide that 
will be used by each region and can be utilized by every program. 

If you would like additional information, please do not hesitate to 
call me at (202) 693-2700. 

Sincerely, 

Signed by: 

Emily Stover DeRocco: 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Sigurd R. Nilsen, Director (202) 512-7215: 

Acknowledgments: 

Dianne Blank, Assistant Director: 

Laura Heald, Analyst-in-Charge: 

In addition, the following staff made major contributions to this 
report: Melinda Cordero, Vidhya Ananthakrishnan, and Leslie Sarapu 
served as team members; Jennifer Miller assisted with early data 
collection; Carolyn Boyce advised on design and methodology issues; 
Susan Bernstein advised on report preparation; Jessica Botsford advised 
on legal issues; Avrum Ashery and Robert Alarapon provided graphic 
design assistance; and Bill Hutchinson and Daniele Schiffman verified 
our findings. 

[End of section] 

Related GAO Products: 

Workforce Investment Act: Substantial Funds Are Used for Training, but 
Little Is Known Nationally about Training Outcomes. GAO-05-650. 
Washington, D.C.: June 29, 2005. 

Unemployment Insurance: Better Data Needed to Assess Reemployment 
Services to Claimants. GAO-05-413. Washington, D.C.: June 24, 2005. 

Workforce Investment Act: Labor Should Consider Alternative Approaches 
to Implement New Performance and Reporting Requirements. GAO-05-539. 
Washington, D.C.: May 27, 2005. 

Workforce Investment Act: Employers Are Aware of, Using, and Satisfied 
with One-Stop Services, but More Data Could Help Labor Better Address 
Employers' Needs. GAO-05-259. Washington, D.C.: February 18, 2005. 

Workforce Investment Act: States and Local Areas Have Developed 
Strategies to Assess Performance, but Labor Could Do More to Help. 
GAO-04-657. Washington, D.C.: June 1, 2004. 

Workforce Investment Act: Labor Actions Can Help States Improve Quality 
of Performance Outcome Data and Delivery of Youth Services. GAO-04-308. 
Washington, D.C.: February 23, 2004. 

Workforce Investment Act: One-Stop Centers Implemented Strategies to 
Strengthen Services and Partnerships, but More Research and Information 
Sharing Is Needed. GAO-03-725. Washington, D.C.: June 18, 2003. 

Older Workers: Employment Assistance Focuses on Subsidized Jobs and Job 
Search, but Revised Performance Measures Could Improve Access to Other 
Services. GAO-03-350. Washington, D.C.: January 24, 2003. 

Workforce Investment Act: Youth Provisions Promote New Service 
Strategies, but Additional Guidance Would Enhance Program Development. 
GAO-02-413. Washington, D.C.: April 5, 2002. 

Workforce Investment Act: Better Guidance and Revised Funding Formula 
Would Enhance Dislocated Worker Program. GAO-02-274. Washington, D.C.: 
February 11, 2002. 

Workforce Investment Act: Improvements Needed in Performance Measures 
to Provide a More Accurate Picture of WIA's Effectiveness. GAO-02-275. 
Washington, D.C.: February 1, 2002. 

FOOTNOTES: 

[1] GAO, Managing for Results: Challenges Agencies Face in Producing 
Credible Performance Information, GAO/GGD-00-52 (Washington, D.C.: Feb. 
4, 2000). 

[2] WIA operates on a program year basis. Program year 2003 ran from 
July 2003 to June 2004. 

[3] For more information on internal controls, see GAO, Standards for 
Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 
(Washington, D.C.: November 1999). 

[4] IT systems as discussed in this study include computers, ancillary 
equipment, telecommunications, software, firmware, and related 
procedures, services, and resources used to obtain, store, manage, use, 
or otherwise handle electronic data. 

[5] GAO, Workforce Investment Act: Improvements Needed in Performance 
Measures to Provide a More Accurate Picture of WIA's Effectiveness, GAO-
02-275 (Washington, D.C.: Feb. 1, 2002). 

[6] GAO, Workforce Investment Act: Substantial Funds Are Used for 
Training, but Little Is Known Nationally about Training Outcomes, 
GAO-05-650 (Washington, D.C.: June 29, 2005). 

[7] U.S. Department of Labor, Office of Inspector General, Workforce 
Investment Act Performance Outcomes Reporting Oversight, 06-02-006-03- 
390 (Washington, D.C.: Sept. 30, 2002). 

[8] While Labor asked states to begin implementing data validation for 
program year 2002 data, states were not required to submit validation 
results to Labor until 2004, when OMB approved Labor's process. 

[9] State officials may request that some local areas send source files 
to the state rather than traveling to the local area, such as in cases 
where there are too few files to review to justify the expense of 
traveling. 

[10] GAO, Workforce Investment Act: Labor Should Consider Alternative 
Approaches to Implement New Performance and Reporting Requirements, 
GAO-05-539 (Washington, D.C.: May 27, 2005). 

[11] Wyoming is a single workforce investment area. 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: