
DOL Annual Report, Fiscal Year 2008
Performance and Accountability Report

Reporting Performance Results

The Performance Section of this report presents results at the Strategic Goal and Performance Goal levels. The four Strategic Goals established in our FY 2006-2011 Strategic Plan are general outcomes clearly linked to the Department's mission. Performance goals articulate more specific objectives associated with one or more programs administered by a distinct DOL agency. Progress in achieving these goals is measured by one or more quantifiable performance indicators, for which targets are established in the annual Performance Budget Overview. Each of the four strategic goal sections is introduced by an overview of results, net cost and future plans for its component performance goals. Results at the performance goal level are presented in separate narratives, each of which includes the following:

  • Performance Goal statements appear at the top of the page, followed by a number that helps organize this report and also corresponds to commitments in DOL's annual budget/performance plans. The first two digits correspond to the funding (budget) period; in this report, "08" indicates goals reporting on a fiscal year and "07" those reporting on a program year. The single digit following the hyphen identifies the strategic goal, and the letter distinguishes the performance goal from others in the same group (e.g., 08-1A); a brief illustrative sketch of this numbering scheme follows this list. The agency acronym (e.g., BLS) appears in parentheses.17
  • Indicators, Targets and Results tables list each indicator, its targets, and its results for the reporting period and for previous years with data for the same indicators. Indicators that were dropped prior to the current year are not shown; however, a note indicates where additional historical performance information (legacy data) can be obtained. For any year in which all data are shown, goal achievement is indicated. Where "baseline" appears in the target cell for a new indicator, no data were available for establishing a numerical target, and these data do not count toward goal achievement. If results improve over the prior year but do not reach the target, "I" appears in the target cell. Net cost associated with the goal and indicators is also provided.18
  • Program Perspectives and Logic narratives describe the purpose of the program, how its activities are designed and managed to have a positive impact on the goal, how success is measured, and the external factors that influence performance. Photos and vignettes communicate examples of programs' impact at the personal level.
  • Analysis and Future Plans narratives interpret results, assess progress, explain shortfalls and describe strategies for improvement. Performance data at the indicator level and net cost at the goal level are displayed in charts where sufficient data are available to illustrate trends.
  • PART, Program Evaluations and Audits tables provide updated information on Program Assessment Rating Tool reviews and improvement plans. Relevant audits and evaluations completed during the fiscal year are summarized in tables covering relevance, findings and recommendations, and next steps.
  • Data Quality and Top Management Challenges narratives discuss DOL's confidence in the performance information reported for the goal's measures and address management challenges that may have significant implications for achievement of program performance goals.19
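
The goal-numbering convention described in the first item above can be expressed as a simple pattern. The following is an illustrative sketch only; it is not part of the Department's reporting systems, and the function and field names are hypothetical:

```python
import re

# Hypothetical sketch of the goal-numbering convention described above:
# two digits for the funding (budget) period, a hyphen, one digit for the
# strategic goal, and a letter distinguishing the performance goal,
# e.g. "08-1A". All names here are illustrative.
GOAL_ID = re.compile(r"^(?P<period>\d{2})-(?P<strategic>\d)(?P<letter>[A-Z])$")

def parse_goal_id(goal_id: str) -> dict:
    """Split a DOL performance goal number into its components."""
    match = GOAL_ID.match(goal_id)
    if match is None:
        raise ValueError(f"Unrecognized goal number: {goal_id!r}")
    period = match.group("period")
    return {
        # In this report, "08" reports on a fiscal year; "07" on a program year.
        "period_type": "fiscal year" if period == "08" else "program year",
        "funding_period": period,
        "strategic_goal": int(match.group("strategic")),
        "performance_goal_letter": match.group("letter"),
    }

print(parse_goal_id("08-1A"))
# {'period_type': 'fiscal year', 'funding_period': '08',
#  'strategic_goal': 1, 'performance_goal_letter': 'A'}
```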

Data Quality

This report is published six weeks after the end of the fiscal year. Since the Department uses a wide variety of performance data submitted by diverse systems and governed by agreements with State agencies and grant recipients, it is not possible in all cases to report complete data for the reporting period. The Department requires each agency responsible for performance goals in this report to submit a Data Estimation Plan in February that identifies, for each indicator, whether complete data are expected by the deadline for final review of the report in early October. If the data will not be available by then, the agencies must submit an acceptable plan to estimate results for the remainder of the year. Methodologies developed by agencies' program analysts are reviewed by the Department's Center for Program Planning and Results and the Office of Inspector General (OIG). The most common methods are substitution or extrapolation of two or three quarters of data and — for data with significant seasonal variation — use of the missing period's results from the previous year. Estimates are clearly identified wherever they are used in this report. With very few exceptions, final (actual) data are available by the end of the calendar year; these data will be reported in the FY 2010 President's Budget and the FY 2009 Performance and Accountability Report.
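
As a rough illustration of the two most common estimation methods described above (extrapolation from two or three completed quarters, and substitution of the prior year's results for data with significant seasonal variation), consider the following sketch. It is hypothetical and does not represent any agency's actual methodology:

```python
# Illustrative sketch of the two estimation methods described above;
# this is not the Department's actual code, and all names are hypothetical.

def extrapolate_full_year(quarterly_results: list[float]) -> float:
    """Estimate a full-year total from two or three completed quarters
    by assuming the remaining quarters match the observed average."""
    if not 2 <= len(quarterly_results) <= 3:
        raise ValueError("Extrapolation expects two or three quarters of data")
    average = sum(quarterly_results) / len(quarterly_results)
    missing_quarters = 4 - len(quarterly_results)
    return sum(quarterly_results) + average * missing_quarters

def substitute_prior_year(quarterly_results: list[float],
                          prior_year: list[float]) -> float:
    """For seasonal data, fill each missing quarter with the same
    quarter's result from the previous year."""
    filled = list(quarterly_results)
    filled += prior_year[len(quarterly_results):4]
    return sum(filled)

# Example: three quarters reported, the fourth quarter estimated.
print(extrapolate_full_year([120.0, 135.0, 129.0]))           # 512.0
print(substitute_prior_year([120.0, 135.0, 129.0],
                            [110.0, 130.0, 125.0, 140.0]))    # 524.0
```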

OIG assesses the internal controls of DOL agencies, i.e., the systems used to validate, verify and record data submitted by field staff and partners (e.g., grantees). These systems are identified as Data Sources at the bottom of each performance goal narrative. An absence of audit findings does not, however, guarantee that the data are accurate.

Material inadequacies are disclosed in the Secretary's Message, which includes a statement on the adequacy of program performance data that is supported by signed attestations from each agency head responsible for a performance goal in this report. OMB Circular No. A-11, Preparation, Submission, and Execution of the Budget, defines "material inadequacy" as a condition that significantly impedes the use of program performance data by agency managers and government decision makers. For Departmental management, this threshold is established at the performance goal level as data that are insufficient to permit determination of goal achievement. This is an uncommon occurrence, as most DOL performance goals have sufficient indicators and historical data to allow reasonable estimation of results.20

DOL uses a Data Quality Assessment process to improve the quality of performance information reported to the public. Through this process, DOL has not only increased the transparency of data quality across performance goals, but also implemented a forward-looking method for systematically evaluating data systems against widely accepted criteria. By increasing the visibility of data quality, DOL uses the assessment process as an important benchmark for monitoring progress and stimulating change.

Data assessments are based on seven criteria, of which two — accuracy and relevance — are weighted twice as much as others in the rating system (see box below). If data do not satisfy the standards for both of these criteria, the rating is Data Quality Not Determined. This reflects the DOL policy that further assessments of quality are irrelevant if the information is not reasonably correct or worthwhile. In FY 2008, no data assessments resulted in this rating.

Data Quality Rating System

Both bulleted descriptions under a criterion must be satisfied to receive points; no partial credit is awarded. The rating scale reflects 20 points for Section One "threshold" criteria plus additional points earned in Section Two. Data that do not satisfy both criteria presented in Section One are given the rating Data Quality Not Determined, regardless of the points achieved in Section Two. This rating indicates that the agency is unable to assess data quality because the data do not meet a minimum threshold.

Section One: 20 points

Accurate — Data are correct. (10 points)

  • Deviations can be anticipated or explained.
  • Errors are within an acceptable margin.

Relevant — Data are worth collecting and reporting. (10 points)

  • Data can be linked to the program's purpose to an extent that they are representative of overall performance.
  • The data represent a significant budget activity or policy objective.

Section Two: 25 points

Complete — Data should cover the performance period and all operating units or areas. (5 points)

  • If collection lags prevent reporting full-year data, a reasonably accurate estimation method is in place for planning and reporting purposes.
  • Data do not contain any significant gaps resulting from missing data.

Reliable — Data are dependable. (5 points)

  • Trends are meaningful; i.e., data are comparable from year-to-year.
  • Sources employ consistent methods of data collection and reporting and uniform definitions across reporting units and over time.

Timely — Data are available at regular intervals during the performance period. (5 points)

  • The expectation is that data are reported quarterly.
  • Data are current enough to be useful in decision-making and program management.

Valid — Data measure the program's effectiveness. (5 points)

  • The data indicate whether the agency is producing the desired result.
  • The data allow the agency and the public to draw conclusions about program performance.

Verifiable — Data quality is routinely monitored. (5 points)

  • Quality controls are used to determine whether the data are measured and reported correctly.
  • Quality controls are integrated into data collection systems.

Rating                         Points
Excellent                      45
Very Good                      40
Good                           30-35
Fair                           25
Unsatisfactory                 20
Data Quality Not Determined    Varied
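
The arithmetic of the rating scale above can be summarized in a short sketch. This is an illustrative reading of the rubric, not the Department's assessment tool; the data structures and function names are assumptions:

```python
# Illustrative sketch (not DOL's actual tool) of the rating arithmetic
# defined by the scale above. Criterion names follow the rubric; the
# data structures are hypothetical.

SECTION_ONE = {"accurate": 10, "relevant": 10}          # threshold criteria
SECTION_TWO = {"complete": 5, "reliable": 5, "timely": 5,
               "valid": 5, "verifiable": 5}

def rate_data_quality(criteria_met: set[str]) -> tuple[int, str]:
    """Return (points, rating). No partial credit: a criterion is either
    fully satisfied or not. Failing either Section One criterion yields
    Data Quality Not Determined regardless of Section Two points."""
    points = sum(value for name, value in {**SECTION_ONE, **SECTION_TWO}.items()
                 if name in criteria_met)
    if not all(name in criteria_met for name in SECTION_ONE):
        return points, "Data Quality Not Determined"
    if points == 45:
        return points, "Excellent"
    if points == 40:
        return points, "Very Good"
    if points >= 30:
        return points, "Good"        # 30-35 points
    if points == 25:
        return points, "Fair"
    return points, "Unsatisfactory"  # 20 points: threshold criteria only

print(rate_data_quality({"accurate", "relevant", "complete",
                         "reliable", "timely", "valid", "verifiable"}))
# (45, 'Excellent')
print(rate_data_quality({"accurate", "relevant", "timely"}))
# (25, 'Fair')
```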

After three years, the DOL data quality assessment process continues to challenge agencies' data systems. Designed to encompass more than the mechanics of data collection, the assessments also question the value of information collected and the extent to which it provides evidence of goal achievement. One of the most important outcomes of this process, aside from increasing the transparency of performance information reported in the PAR, is encouraging the development of plans to either maintain or improve data quality.

In FY 2006, DOL established the data quality assessment scale recognizing that the Department lacked a systematic process for advancing the department-wide goal to improve the quality of data presented in the PAR. The data quality assessment process supports this goal by exposing data quality issues and seeking targeted remedies strictly within the parameters of the performance goal data for that fiscal year. The data quality assessment process matures each year to realistically reflect the pace of progress and to incorporate further guidance from OMB.

In the first year, DOL conducted baseline assessments of data for all performance goals. In FY 2007, DOL reviewed these assessments to determine whether ratings merited an upgrade or downgrade; that year's process otherwise consisted primarily of collecting improvement plans based on the recently completed baseline assessments. The FY 2008 assessment required updates on activities to strengthen data, as defined by the rating scale criteria, or to maintain already robust systems. Agencies sought upgrades based on improvements to this fiscal year's data. Downgrades were also possible, as evidenced by last year's downgrade of the Senior Community Service Employment Program (SCSEP) within ETA, but none occurred in FY 2008.

Data for 63 percent of performance goals are rated Very Good or Excellent; with the inclusion of Good, the total reaches 92 percent. No performance goals were rated Unsatisfactory, nor were any rated Data Quality Not Determined (DQND) due to fundamental problems with accuracy and relevance. Three performance goals received upgrades, two of which were rated DQND in FY 2007. PBGC implemented new performance measures from its FY 2007 PART that better reflect the mission of the program, thus satisfying the threshold criterion of relevance. ETA's SCSEP expeditiously addressed the issues of accuracy and completeness related to the quality of its performance data, which had undermined its ability to report results in the FY 2007 PAR; grantee reporting was carefully reviewed for anomalies and variations throughout the year to ensure improved data quality and complete reporting by year's end. MSHA improved the timeliness of its health indicators by fully implementing more rigorous reporting requirements for its field and district offices, thereby increasing its rating from Good to Very Good. The last section of each performance goal narrative contains additional information about data quality.

Data Quality Criteria Met    Percent of Performance Goals
Verifiable                   46%
Reliable                     67%
Valid                        71%
Timely                       79%
Complete                     79%
Accurate                     100%
Relevant                     100%

At the Departmental level, certain criteria are met more frequently than others. With the two performance goal upgrades, all DOL performance goals satisfy the threshold criteria of accuracy and relevance. Over two-thirds of performance goals are supported by data that are valid, timely, and complete. As indicated in the table above, the clear challenges for many performance goals are data reliability and the ability to verify the data. Fewer than half of all performance goals have data quality controls in place that routinely monitor data and are fully integrated into the data collection system. Verifiability emerges as a predominant issue largely as a result of ETA's numerous grant programs and its challenges in monitoring and enforcing standards among grantees' diverse data systems. The reliability issue spans numerous agencies and generally reflects a lack of uniform definitions for data collection or inconsistent data reporting across years. As agencies refine performance measures, methodologies, and definitions to improve performance reporting, they can find it more difficult to demonstrate meaningful trends.

In FY 2009, in addition to the agencies' self-assessments and OASAM's review of those assessments, the Department will undertake an independent evaluation of the data for selected programs/performance goals. This evaluation will also analyze the current rating scale and make recommendations to improve the scale and process, particularly in support of the heightened requirements in OMB Circular No. A-11 (2008).


17FY 2008 covers October 1, 2007 to September 30, 2008; PY 2007 covers July 1, 2007 to June 30, 2008.
18See also Program Net Costs table in Cost of Results section of the Program Performance Overview (Management's Discussion and Analysis).
19See Top Management Challenges table in Management's Discussion and Analysis.
20Last year, data for one program/performance goal — the Community Based Job Training Grants (CBJTG) — were considered inadequate and omitted from the FY 2007 Performance and Accountability Report. Systems that would allow CBJTG to report on the job training common measures are currently being developed. Consequently, CBJTG measures were excluded from DOL's FY 2008 Performance Plan and again excluded from the Performance and Accountability Report.
