www.dol.gov | November 5, 2008
DOL Annual Report, Fiscal Year 2007

Reporting Performance Results

The Performance Section of this report presents results at the Strategic Goal and Performance Goal levels. The four Strategic Goals established in our FY 2006-2011 Strategic Plan are general outcomes clearly linked to the Department's mission. Performance goals articulate more specific objectives associated with one or more programs administered by a distinct DOL agency. Progress in achieving these goals is measured by one or more quantifiable performance indicators, for which targets are established in the annual Performance Budget. Each of the four strategic goal sections is introduced by an overview of results, net cost, and future plans for its component performance goals. Results for each performance goal are presented in a brief section that includes the following:
Data Quality

This report is published six weeks after the end of the fiscal year. Since the Department uses a wide variety of performance data submitted by diverse systems and governed by agreements with State agencies and grant recipients, it is not possible in all cases to report complete data for the reporting period. The Department requires each agency responsible for performance goals in this report to submit a Data Estimation Plan in February that identifies, for each indicator, whether complete data are expected by the deadline for clearance and final review of the report in early October. If the data will not be available by then, the agencies must submit an acceptable plan to estimate results for the remainder of the year.

Methodologies developed by agencies' program analysts are reviewed by the Department's Center for Program Planning and Results and the Office of Inspector General (OIG). The most common methods are substitution or extrapolation from two or three quarters of data and, for data with significant seasonal variation, use of the missing period's results from the previous year. Estimates are clearly identified wherever they are used in this report. With very few exceptions, final (actual) data are available by the end of the calendar year; these data will be reported in the FY 2008 Performance and Accountability Report.

OIG assesses the internal controls of DOL agencies' systems used to validate, verify, and record data submitted by field staff and partners (e.g., grantees). These systems are identified as Data Sources at the bottom of each performance goal history. An absence of findings does not guarantee that the data are accurate. Material inadequacies are disclosed in the Secretary's Message, which includes a statement on the adequacy of program performance data that is supported by signed attestations from each agency head responsible for a performance goal in this report.
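The two estimation methods described above — extrapolating from the quarters already reported, or substituting the prior year's results for the missing period when the data are seasonal — can be sketched as follows. This is an illustrative sketch only; the function and parameter names are hypothetical and do not reflect DOL's actual estimation code or any particular agency's Data Estimation Plan.

```python
def estimate_full_year(quarterly, prior_year=None, seasonal=False):
    """Estimate a full-year total from partial quarterly data.

    quarterly  : list of four values; None marks a quarter not yet reported
    prior_year : previous year's four quarterly values (for seasonal data)
    seasonal   : if True, substitute missing quarters from the prior year
    """
    reported = [q for q in quarterly if q is not None]
    missing = len(quarterly) - len(reported)
    if missing == 0:
        return float(sum(reported))
    if seasonal and prior_year is not None:
        # Seasonal substitution: use the missing period's results
        # from the previous year.
        return float(sum(
            q if q is not None else prior_year[i]
            for i, q in enumerate(quarterly)
        ))
    # Extrapolation: assume each missing quarter matches the
    # average of the quarters reported so far.
    average = sum(reported) / len(reported)
    return float(sum(reported) + missing * average)
```

For example, with three quarters of roughly level data the fourth quarter is filled in at the reported average, whereas a program with a seasonal spike would instead carry over last year's fourth-quarter result.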
OMB Circular A-11 defines "material inadequacy" as a condition that significantly impedes the use of program performance data by agency managers and government decision makers. For Departmental management, this threshold is established at the performance goal level as data that are insufficient to permit determination of goal achievement. This is an unlikely occurrence, as most DOL performance goals have sufficient indicators and historical data to allow reasonable estimation of results. Generally, if agency or program level managers do not trust their own data, the results are not reported, because the problems created by skewed targets and trends are much worse than a gap in the data.

Because DOL aspires to maintain high standards and because performance information is being used more than ever for decision-making and accountability, DOL recently created a Data Quality Assessment process to improve the quality of performance information reported to the public. The Data Quality and Major Management Challenges section of each performance goal narrative includes an overall rating of data quality (Excellent, Very Good, Good, Fair, or Unsatisfactory). Discussions summarize the rationale for these ratings and, for all but those rated Excellent, improvement plans. Data assessments are based on seven criteria, of which two (accuracy and relevance) are weighted twice as much as the others in the rating system (see box below). If data do not satisfy the standards for both of these criteria, the rating is Data Quality Not Determined. This reflects the DOL policy that further assessments of quality are irrelevant if the information is not reasonably correct or worthwhile.
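The weighting logic described above — seven criteria, with accuracy and relevance counted double and serving as gatekeepers for any rating at all — might look like the following. This is a hypothetical sketch: the report names only accuracy, relevance, and verifiability among the criteria, so the other criterion names and the score-to-rating cutoffs below are assumptions for illustration, not DOL's published scheme.

```python
# Weights per the text: accuracy and relevance count double.
# Criterion names other than the first three are assumed.
CRITERIA_WEIGHTS = {
    "accuracy": 2,
    "relevance": 2,
    "verifiability": 1,
    "timeliness": 1,
    "completeness": 1,
    "consistency": 1,
    "accessibility": 1,
}

def rate_data_quality(criteria_met):
    """Map per-criterion pass/fail results to an overall rating.

    criteria_met: dict of criterion name -> bool
    """
    # Gatekeeper rule: without both accuracy and relevance,
    # no rating is assigned at all.
    if not (criteria_met.get("accuracy") and criteria_met.get("relevance")):
        return "Data Quality Not Determined"
    score = sum(w for c, w in CRITERIA_WEIGHTS.items() if criteria_met.get(c))
    # Score bands are assumed; the report does not publish its cutoffs.
    if score == 9:
        return "Excellent"
    if score >= 8:
        return "Very Good"
    if score >= 7:
        return "Good"
    if score >= 6:
        return "Fair"
    return "Unsatisfactory"
```

Under this sketch, a data system failing only verifiability (the weakness the report attributes to ETA) would still rate Very Good, while one failing accuracy would receive no rating regardless of its other strengths.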
DOL piloted the Data Quality Assessment process in FY 2006. By doing so, DOL not only increased the transparency of data quality among performance goals, but also implemented a forward-looking method for systematically evaluating data systems using widely accepted criteria. In its pilot year, the assessments provided a valuable baseline by identifying weaknesses and establishing improvement plans. By increasing the visibility of data quality, DOL is using the assessment process as an important benchmark for monitoring progress and stimulating change.

In this year's report, data for four performance goals are rated Excellent, ten are Very Good, six are Good, two are Fair, and two are Data Quality Not Determined. No performance goals were rated Unsatisfactory. Ratings this year largely remained the same; exceptions are higher ratings for ESA's Wage and Hour Division, Office of Federal Contract Compliance Programs, and Office of Workers' Compensation Programs goals, and lower ratings for ETA's Senior Community Service Employment Program and VETS' Uniformed Services Employment and Reemployment Rights Act goals. For two other goals, FY 2006 and FY 2007 ratings were not directly comparable due to the restructuring of performance goals. The Community Based Job Training Grants program did not report results and therefore was not included. Given the short duration between the FY 2006 year-end pilot assessment and the FY 2007 mid-year assessment, this year's reporting focused on improvement plans to address the criteria not met in the pilot year assessment and considered the impact of any pertinent reports or audits released in FY 2007. OIG continues to identify data quality issues among its Major Management Challenges. Central to this ongoing challenge is the Department's limited ability to ensure the quality of data reported by States and other sources below the Federal level.
The Employment and Training Administration (ETA) is the principal agency affected by these findings. While ETA's data quality assessments consistently identify verifiability as a weakness, those findings relate strictly to the data collection systems for its performance goals; the OIG findings, by contrast, cover data quality for sources not related to the performance goals. Beyond ETA, measuring the societal impact of compliance assistance, enforcement, policy development, and outreach also poses measurement challenges. Individual agencies must find a balance between measuring activities linked to their performance goals and measuring the far-reaching benefits to their constituents. Multiple performance measures, often relying on various data collection systems, allow an agency to focus on key performance areas linked to specific strategies. It is important to recognize that the data quality rating system evaluates only those data collection systems which support performance indicators appearing in this report. Program evaluations and audit reports, such as those listed in the performance goal chapters, supplement the performance data, give agencies a more comprehensive view of the effectiveness of their programs, and help identify areas for improvement.

In FY 2008, the data quality assessment process will entail full re-assessments for all performance goals. This could result in upward or downward adjustments of ratings for some goals. As data quality standards are further institutionalized and awareness of data quality increases Department-wide, DOL expects improved quality and quantity of information; this may also result in minor changes to some ratings. As a testament to the robustness of the assessments to date, subsequent pertinent reports and audits generally confirmed assessment findings. This year's improvement plans focused on remedying deficiencies among data systems that are mostly rated between Good and Excellent.
Nonetheless, the Department views these results as the beginning of a long-term strategy to raise the bar in data quality and performance reporting.

Planning and Evaluation at the Department of Labor

The diagram below illustrates the theoretical foundation of performance planning and evaluation structures, processes, and results covered in this section of the Performance and Accountability Report. The outer circle represents the scope of DOL's resources and influence. At the core is our mission. Everything in between is in continuous motion, clockwise and counter-clockwise. Quadrants represent the planning elements that are tied to periodic budget documents. Spokes incorporate the actual processes that follow resource allocation decisions and translate theory into practice. These elements are managed on a real-time basis; emergent cost and results information ultimately closes the feedback loop via reporting documents and the next period's budget. A more detailed description of planning and evaluation processes follows the diagram.

[Diagram: Planning Cycle and Evaluation Cycle]

1 FY 2007 began October 1, 2006 and ended September 30, 2007. PY 2006 began July 1, 2006 and ended June 30, 2007.