ExpectMore.gov


Detailed Information on the
Job Training Apprenticeship Assessment

Program Code 10003901
Program Title Job Training Apprenticeship
Department Name Department of Labor
Agency/Bureau Name Department of Labor
Program Type(s) Direct Federal Program
Assessment Year 2005
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 38%
Program Management 72%
Program Results/Accountability 13%
Program Funding Level
(in millions)
FY2007 $22
FY2008 $21
FY2009 $23

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Implementing the common measures for earnings and retention and establishing an Internet-based apprenticeship registration system to efficiently obtain comprehensive performance data.

Action taken, but not completed A re-designed apprenticeship management system debuted in 2007. Phase 2 is scheduled for late FY 2008 and will allow State Apprenticeship Agencies and sponsors to upload data electronically. Proposed amendments to the regulations implementing the National Apprenticeship Act were published in December 2007. The final regulations will be forwarded to the Office of Management and Budget in the Fall of FY 2008--after the review and comment period.
2006

Evaluating and reporting participants' employment and earnings after they leave the program to compare apprenticeship program outcomes with those of other training models.

Action taken, but not completed By September 2008, the Office of Apprenticeship (OA) will have the first data set describing performance results for apprentices who completed their apprenticeship training. Based on the results of this data analysis, OA will begin developing new performance indicators and targets during FY 2009.
2006

Addressing underrepresentation of women in apprenticeship programs through a reinvigorated Equal Employment Opportunity review process and tracking and reporting performance.

Action taken, but not completed The Office of Apprenticeship will evaluate the progress reports from the three Women in Apprenticeship and Non-Traditional Occupations (WANTO) grant recipients (expected in the 4th Quarter of FY 2008) to determine: (1) the success of grantees in achieving the goal of placing women in nontraditional apprenticeship occupations; and (2) best practices for incorporating this methodology into other Apprenticeship programs in FY 2009.
2007

Adopting efficiency measures that are linked to performance outcomes, account for all costs, and facilitate comparisons across Department of Labor training and employment programs.

Action taken, but not completed The Employment and Training Administration (ETA) is funding a contractor to study and define appropriate outcome-based efficiency measures for the job training programs by September 2008. ETA will develop, adopt and implement the new efficiency measures by June 2009.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: Apprenticeship Retention Rate.


Explanation: Percent of apprentices still employed nine months after registration. The Federal Apprenticeship Program monitors compliance with Federal rules and regulations regarding apprentices and works with states to set up apprenticeship programs. The Federal government manages programs and registers apprentices in twenty-five states, which provide performance data on their programs. Those states that run their own apprenticeship programs are not required to provide performance results to the Federal government and are not included in the national results. The performance indicator measures how many registered apprentices retained their apprenticeship participation, according to the following formula: of the apprentices registered in the first quarter of Fiscal Year 2008, the percentage still in the program nine months later. This definition of retention was adopted in FY 2006 as part of an effort to establish performance measures for the apprenticeship program that more closely align with the common measures. To that end, the apprenticeship program is revising its data collection methodology to enable information to be collected on apprentices after completion of the program. This will allow the program to track retention in employment after the program intervention. The Apprenticeship Program does not currently track entered employment because, by virtue of registering in the program, an apprentice is employed. Of the apprentices registered during the first quarter of FY 2007, 83% remained registered in the program nine months later.

Year Target Actual
2005 Baseline 78%
2006 78% 82%
2007 79% 83%
2008 84%
2009 85%
2010 86%
2011 87%
2012 88%
2013 89%
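The retention formula described above reduces to a simple cohort ratio: apprentices still registered nine months later divided by apprentices registered in the first quarter. A minimal sketch with hypothetical cohort counts (actual figures come from the program's RAIS data):

```python
def retention_rate(registered_q1: int, still_active_nine_months: int) -> float:
    """Share of apprentices registered in the first quarter of the fiscal
    year who remain in the program nine months later."""
    return still_active_nine_months / registered_q1

# Hypothetical example: a Q1 cohort of 25,000 apprentices, of whom
# 20,750 are still registered nine months later.
print(f"{retention_rate(25_000, 20_750):.0%}")  # 83%
```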
Long-term/Annual Outcome

Measure: Average Earnings (wage gain).


Explanation: The current average wage gain for tracked apprentices employed in the first quarter after registration and still employed nine months later. The result is calculated according to the following formula: the average hourly wage of the apprentices registered in the first quarter of the fiscal year compared to the average hourly wage those same apprentices earned nine months later. (The wages of apprentices who exited the program during the nine months are not used in the estimate of the current average wage.) The earnings measure was adopted in FY 2006 as part of an effort to establish performance measures for the apprenticeship programs in the twenty-five federally managed states that more closely align with the common measures. To that end, the Apprenticeship Program is revising its data collection methodology to enable information to be collected on apprentices after completion of the program. This will allow the program to track the impact on wages after the program intervention.

Year Target Actual
2005 Baseline $1.26
2006 $1.26 $1.32
2007 $1.33 $1.50
2008 $1.51
2009 $1.53
2010 $1.55
2011 $1.59
2012 $1.64
2013 $1.66
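The wage-gain calculation above (average hourly wage at nine months minus average hourly wage at registration, with exiters excluded from both averages) can be sketched as follows; the apprentice records are hypothetical:

```python
# Hypothetical records: (hourly wage at registration, hourly wage nine
# months later). None marks an apprentice who exited during the nine
# months; per the measure's definition, exiters are excluded entirely.
cohort = [
    (14.00, 15.40),
    (12.50, 13.90),
    (16.25, 17.60),
    (13.75, None),  # exited; not counted in either average
]

retained = [(w0, w1) for w0, w1 in cohort if w1 is not None]
avg_at_registration = sum(w0 for w0, _ in retained) / len(retained)
avg_nine_months = sum(w1 for _, w1 in retained) / len(retained)
print(f"average wage gain: ${avg_nine_months - avg_at_registration:.2f}")
```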
Annual Efficiency

Measure: Cost per registered apprentice


Explanation: The average cost for each apprentice registered through Federal staff, which is calculated by dividing the total annual appropriation for the program by the number of apprentices in the program (registered through Federal staff) at the end of the program year.

Year Target Actual
2005 $101 (Baseline Year) $109
2006 $108 $97
2007 $100 $74
2008 $73
2009 $72
2010 $71
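The efficiency measure is a straight division of the annual appropriation by the year-end count of federally registered apprentices. A minimal illustration; the appropriation and apprentice figures below are hypothetical, not the program's actual data:

```python
def cost_per_apprentice(appropriation_dollars: float, apprentices: int) -> float:
    """Total annual appropriation divided by the number of apprentices
    registered through Federal staff at the end of the program year."""
    return appropriation_dollars / apprentices

# Hypothetical figures: a $21 million appropriation and 284,000
# federally registered apprentices at year end.
print(f"${cost_per_apprentice(21_000_000, 284_000):.0f} per registered apprentice")
```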

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the Office of Apprenticeship Training, Employer, and Labor Services (OATELS) is to encourage and assist industry and labor in developing apprenticeship programs, certify apprenticeship programs that meet certain minimum standards, promote equal opportunity for program participants, and safeguard the welfare of apprentices. Apprenticeship programs are sponsored jointly by employer and labor organizations or by individual employers, and/or employer associations. Programs use a combination of classroom training and on-the-job training under close supervision of a skilled worker to teach participants the practical and theoretical requirements of a highly skilled occupation.

Evidence: The National Apprenticeship Act of 1937 (Fitzgerald Act), 29 USC 50 (www.doleta.gov/atels_bat/fitzact_code.cfm).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The program began with an Executive Order, which created the Federal Committee on Apprenticeship Training. The Federal role was made permanent with the Fitzgerald Act of 1937, based on the view that there was a need for a Federal agency to bring employers and employees together to create apprenticeship programs, establish standards for apprenticeship programs, and remedy the shortage of skilled workers in certain industries at the time. While the economy has changed substantially since the passage of the Fitzgerald Act, there is still an interest in pursuing and maintaining apprenticeship programs in certain industries. In addition to the traditional apprenticeship industries (e.g., construction and the skilled trades), the program is focusing more on new and growing industries (e.g., health care) that did not traditionally use the apprenticeship model. The Federal role is also still valuable in terms of ensuring that apprenticeship programs meet a uniform set of skill standards and giving participants transferable credentials.

Evidence: U.S. House of Representatives, Committee Report 75-945 (June 7, 1937) and Committee on Labor Hearings on H.R. 6205 (April 22, 23, and 26, 1937). Advancing Apprenticeship: Implementation Plan, McNeil Research and Evaluation Associates, February 2003, www.doleta.gov/atels_bat/AAI.cfm

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: In 23 states, OATELS has direct responsibility for certifying and overseeing apprenticeship programs. In the remaining 27 states (as well as the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands), OATELS has delegated its responsibility to State Apprenticeship Councils (SACs), which operate without Federal funding. SACs operate with the authority of the Department of Labor and are guided by the Federal program regulations (29 CFR Parts 29 and 30). In SAC states, OATELS activities are generally limited to the conduct of SAC reviews to ensure compliance with Federal regulations and the provision of support such as training and technical assistance. While the apprenticeship programs themselves are not duplicative, DOL does duplicate State effort in the way it oversees the programs: OATELS has Federal staff in 20 of the 27 SAC states. While OATELS has decreased the number of Federal staff in SAC states, the continued Federal presence in these states is an inefficient use of resources.

Evidence:  

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no strong evidence that another approach or mechanism would be more efficient or effective to achieve the intended purpose. From a Federal standpoint, the program is extremely inexpensive, as most of the training costs are borne by apprenticeship program sponsors. However, as noted in 2.6, there have been no independent evaluations of the effectiveness of registered apprenticeship programs in terms of their impact on participants.

Evidence: In FY 2004, the Federal cost per registered apprentice was $96. In fiscal year 2001, the total OATELS budget was $21.069 million. The General Accounting Office (GAO) estimated that in the same year, State Apprenticeship Councils spent approximately $20 million, and employers and apprentices contributed at least $1 billion to the system (GAO, "Registered Apprenticeships: Labor Could Do More to Expand to Other Occupations," GAO-01-940 (September 2001)).

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: OATELS has made efforts to increase apprenticeship participation among underrepresented groups. In the FY 1999-2004 Strategic Plan, OATELS set a goal of increasing the number of newly registered female civilian apprentices by 15 percent over the FY 1999 baseline of 7,551. DOL surpassed this goal in FY 2001 and by 2004 had achieved a 29 percent increase in the number of new female apprentices. In FY 2002 OATELS piloted a Technical Assistance Provider's Bank to provide a pool of experts who could assist registered apprenticeship programs in achieving their diversity goals. The Provider's Bank provided technical assistance to 58 employer-sponsors across 20 states. Despite improvements in the 1970s and early 1980s, however, the Apprenticeship system continues to lag in terms of the participation of women. In 2004, women represented only 6.9 percent of apprentices in registered programs--roughly the same as in 1990 (7.1 percent) and far below their representation in the civilian labor force (46 percent). In 1992, GAO noted the underrepresentation of women in apprenticeship programs and found that women and minorities tend to be concentrated in apprenticeship programs for lower paying occupations. A study based on early 1990s OATELS data also showed that women have lower apprenticeship program graduation rates than men. (The same study also showed that the female graduation rate was higher in jointly sponsored apprenticeship programs than in management-sponsored programs.) The program should address the underrepresentation of women by more closely examining and holding accountable programs where underrepresentation is a problem, examining Equal Opportunity regulations to see whether they need to be strengthened, strengthening outreach activities, and adding a performance target to increase the representation of women.

Evidence: GAO, "Apprenticeship Training: Administration, Use, and Equal Opportunity," GAO-HRD-92-43 (March 1992); Gunseli Berik and Cihan Bilginsoy, "Do Unions Help or Hinder Women in Training?," Industrial Relations (October 2000, v. 39, issue 4, pages 600-624).

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: As articulated in the Department of Labor's FY 2003-FY 2008 Strategic Plan, OATELS' overarching goal is to strengthen the registered apprenticeship system to meet the training needs of business and workers in the 21st Century. The plan established two performance measures for this goal (number of apprenticeship graduates and completers' average earnings), but never set targets for them. Instead, the program tracked its performance against two output measures: the number of new apprenticeship programs in high-growth industries and number of registered apprentices. DOL has changed course and is now implementing the adult job training common measures (job retention and earnings) in the Apprenticeship program, using a slightly different approach than is being used in the other programs that are subject to the common measures. Specifically, rather than tracking individuals after they leave apprenticeship programs, the program will measure achievement during the programs. The program will also not measure placement, based on the argument that participation in an apprenticeship program itself constitutes employment (participants earn an hourly wage). The program should add a third measure to gauge the extent to which apprentices entering the programs actually complete them. This is an important indicator of program quality, and would be closer to an outcome than the other two measures. Tracking apprentices after they exit the program would require considerable additional data collection, which would not be cost effective relative to the Federal investment (less than $100 per apprentice). These post-program outcome data should instead be gathered through a program evaluation.

Evidence: The long-term goals for the Apprenticeship program are: (1) percentage of registered apprentices who are employed in both the second and third quarters after registration; and (2) total earnings in the second and third quarters after registration minus total earnings in the second and third quarters prior to registration, divided by the number of registered apprentices. For the former GPRA goals, see DOL's FY 2003-2008 Strategic Plan, www.dol.gov/_sec/stratplan/main.htm (see goal 1.1.B).

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: DOL did not establish targets for the measures included in the FY 2003-2008 Strategic Plan, and has not established a baseline or targets for its new long-term measures. DOL will establish a baseline in fiscal year 2005, and establish targets for FY 2006 and subsequent years in the beginning of fiscal year 2006, based on the data it collects in 2005.

Evidence: DOL FY 2003-2008 Strategic Plan, www.dol.gov/_sec/stratplan/main.htm (see goal 1.1.B)

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: As described under the response to question 2.1, DOL is implementing the adult job training common measures in the Apprenticeship program with a slightly different approach than is being used in the other programs that are implementing the common measures. The measures will be the same (with the exception of placement), but the annual targets will be set so as to demonstrate progress toward the long-term targets. The program should add a third measure to gauge the extent to which apprentices entering the programs actually complete them. This is an important indicator of program quality, and would be closer to an outcome than the other two measures. Tracking apprentices after they exit the program would require considerable additional data collection, which would not be cost effective relative to the Federal investment ($75 per apprentice). These post-program outcome data should instead be gathered through a program evaluation.

Evidence: The annual measures for the Apprenticeship program are as follows: (1) percentage of apprentices who are employed in both the second and third quarters after registration; and (2) total earnings in the second and third quarters after registration minus total earnings in the second and third quarters prior to registration.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: DOL has not yet established a baseline or targets for its annual measures. DOL will establish a baseline in fiscal year 2005, and establish targets for FY 2006 and subsequent years in the beginning of fiscal year 2006, based on the data it collects in 2005.

Evidence:  

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: While all 27 SAC states conduct activities that ultimately contribute to the annual and long-term goals, none of the SACs specifically commit to the achievement of specific goals and only seven report apprentice-level data to ETA that allow for measurement of their success with respect to the OATELS performance measures. The remaining states provide only quarterly aggregate data.

Evidence:  

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The program has never been evaluated in terms of its employment outcomes, i.e., the effect of apprenticeship programs on participants' employment and earnings after they leave apprenticeship programs. Evaluations to date have focused on specific aspects of program design and participation and the processes used to achieve program goals. GAO has conducted two reviews of the program: one focused on the program's identification of new apprenticeable occupations (2001) and one focused on the participation of women and minorities in apprenticeship programs (1992). There are some external evaluations of U.S. apprenticeship programs, although they also focus on the experience of the apprentices while they are in the programs.

Evidence: General Accounting Office, "Registered Apprenticeships: Labor Could Do More to Expand to Other Occupations," GAO-01-940 (September 2001) and "Apprenticeship Training: Administration, Use, and Equal Opportunity," GAO-HRD-92-43 (March 1992), both available at www.gao.gov; Expanding the National Registered Apprenticeship System Into New and Emerging Occupations: A Report on a Federal Grant to Six State Apprenticeship Programs, University of Wisconsin, 2004; The Quality Child Care Initiative After Three Years, McNeil Research and Evaluation Associates and Social Policy Research Associates, 2005; "Apprenticeship Impact Project," Coffey Communications, December [year]; and Cihan Bilginsoy, "The Hazards of Training: Attrition and Retention in Construction Industry Apprenticeship Programs," Industrial and Labor Relations Review (October 2003).

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Like the rest of DOL, ETA does not have an integrated accounting and performance management system to identify the full cost of achieving this program's performance goals and support day-to-day operations. While the program's budget requests are aligned with the program's strategic and operational goals, they are not based on what is needed to attain specific levels of performance.

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The program is establishing more outcome-oriented performance measures, using the adult common measures as a basis. OATELS will establish baselines for the new measures in FY 2005, and in the first quarter of FY 2006 set targets for FY 2006 through FY 2008 based on them. The registered apprenticeship program has also taken steps based on input received from the Government Accountability Office (GAO) and other parties to make program improvements. A 2001 GAO Report, for example, indicated the need for OATELS to expand apprenticeship to new occupations and improve apprenticeship data. In response, OATELS in 2002 launched the Advancing Apprenticeship Initiative, which sought to increase the number and types of industries and the number and types of employers participating in the Apprenticeship system. OATELS also took steps to improve its apprenticeship data collection process by establishing in 2002 the Registered Apprenticeship Information System (RAIS)--a system that has more mandatory data fields and data validity checks and is more user friendly than its predecessor.

Evidence: Research and Evaluation Associates (OATELS contractor), Advancing Apprenticeship: Implementation Plan (February 10, 2003).

YES 12%
Section 2 - Strategic Planning Score 38%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Registered Apprenticeship Information System (RAIS), the electronic system used by the program to track program outcomes and outputs, is not a comprehensive source of apprentice-level data on the program. While all of the Federal (BAT) states are included, only seven of the 27 SAC states participate in the RAIS by submitting apprentice-level data. The remaining states provide only quarterly aggregate data that OATELS enters into RAIS. OATELS is working with other states to encourage their participation. OATELS uses RAIS data to identify general program trends and analyze regional performance in terms of number of apprentices and programs, demographic information (e.g., race, ethnicity, and gender), industry type, apprentice completions, wage data, and number of apprentices who drop out of the programs. Data reports are compiled quarterly and are used to establish, and track performance against, targets. Managers have access to WebCEO, a database that permits comparisons across time periods and regions and provides demographic profiles. However, it is not clear how the program uses these data to improve the program--for example, to target programs with low female participation for EEO review.

Evidence: RAIS Statistical Reports.

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: OATELS monitors SAC states and registered apprenticeship programs to ensure compliance with Federal standards, but does not hold them accountable for performance outcomes. Ultimately, OATELS has the authority to derecognize a SAC for failure to fulfill, or to operate in conformity with, the Federal regulations. OATELS has formally initiated derecognition only once in its 68-year history--that case, involving the State of California, is still pending. OATELS can also deregister apprenticeship programs for noncompliance with program regulations (including EEO requirements), although it will first work to secure voluntary compliance and, failing that, will encourage the program to deregister voluntarily. More than 2,700 programs were voluntarily deregistered during FY 2004. No data were provided on the frequency of involuntary program deregistration, so it is unclear how often this accountability tool is used. OATELS' annual goals are incorporated into its managers' performance management plans, which are used to evaluate performance semi-annually and annually. These performance reviews are tied directly to annual bonuses and promotional opportunities. An annual Program Guidance Memorandum is developed and issued to OATELS staff and SAC partners to ensure that performance goals and indicators are communicated consistently throughout the apprenticeship community and tracked for GPRA and performance management purposes.

Evidence: ETA/OATELS Circular 95-02: Bureau of Apprenticeship and Training Review of State Apprenticeship Councils; and Circular 92-02: Apprenticeship Quality Assessment. ETA/OATELS Program Guidance Memorandum: Goals and Objectives for FY 2005.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: In FY 2004, the Employment and Training Administration's Program Administration account (of which OATELS is a part) had obligated 99.9 percent of its funding by the end of the fiscal year. OATELS reports that there were no unobligated OATELS funds at the end of FY 2004. OATELS has procedures in place to plan and track expenditures. Once OATELS receives its annual operating budget from ETA, it develops and distributes an operating budget for the National Office and Regional Offices based on its total operating funds. Each regional office submits a monthly status report on expenditures to the National Office. National and regional staff track funds to ensure they are spent for the intended purposes and obligated according to the established schedule. OATELS has developed an employment report, updated monthly, that tracks all personnel actions (e.g., separations, promotions) throughout the year. At the end of each month, OATELS' expenditures are reconciled with the Detailed Fund Report prepared by ETA's Accounting Office. OATELS also monitors and manages grants and contracts in accordance with the Federal Acquisition Regulation (FAR) and internal OATELS guidelines. The OATELS Federal Procurement Officers (FPOs) are charged with the administration and management of their assigned grants and contracts to ensure funds are expended appropriately. Each grantee is required to submit quarterly narrative reports, SF 269s, and a comprehensive final report. These reports are reviewed to ensure the expected outcomes and tasks are consistent with the Statement of Work and are in line with determined performance goals and objectives. The accuracy of the SF 269 financial reports is validated by the FPO and, further, by the voucher payments office through ETA's Payment Management System.

Evidence: OATELS tracks its budget and expenditures using the following: (1) the Employment and Training Administration (ETA) Agency Allocation, a document that provides offices with their annual allocations and detailed line item budget; (2) Monthly Forecast/Report of Personnel, which provides information on the status of remaining resources that have been allocated to OATELS; and (3) Monthly DOL/ETA Accounting Reports, which provides management with the status of budgeted line items.

YES 14%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The program has proposed an efficiency measure (cost per registered apprentice), and has set a proxy baseline and targets. To improve the efficiency of its registration process, OATELS in 2004 implemented the Apprenticeship Electronic Registration project, which allows apprenticeship program sponsors to register apprentices electronically. Currently more than 100,000 new apprentices are registered each year; OATELS enters most program and participant apprenticeship agreement data manually, a paper process that is time consuming and prone to error. While only a small share of apprentices (3,152) have been registered via the new process, OATELS is encouraging sponsors to use the system and anticipates broader acceptance after the first year. In 2003 OATELS issued guidance to its field staff to make the registration process more efficient and responsive to business needs. This policy directs staff to reduce the turnaround time on the processing of new program registrations; all programs must now be registered within 90 days.

Evidence: OATELS has proposed 2006 and 2007 targets of $100 per registered apprentice, based on an estimated 2005 baseline of $101. Apprenticeship Electronic Registration system, https://www.rais.doleta.gov/eform_appr/login.cfm

YES 14%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: OATELS coordinates with other offices within DOL and other Federal agencies such as the Departments of Housing and Urban Development and Defense. OATELS worked with HUD to develop a construction industry apprenticeship program (called "Step-Up") for low-income individuals. While the program is funded by HUD, OATELS and HUD jointly approve Step-Up projects. OATELS has worked with the Job Corps program (also administered by ETA) to increase the number of Job Corps graduates entering registered apprenticeship programs. Among other things, OATELS has agreed to encourage apprenticeship program sponsors to allow for the direct entry of Job Corps graduates into their programs (instead of requiring them to undergo the same selection processes as other apprenticeship candidates) and to involve Job Corps in their planning and marketing activities. Participants in the United Services Military Apprenticeship Program (USMAP) and the National Guard apprenticeship program can receive completion certification recognized in the civilian workforce. If they are unable to complete the program, they are eligible for direct entry into other registered apprenticeship programs without any loss of credit for previous training. As of March 31, 2005, 19,611 active military personnel were participating in the USMAP registered apprenticeship program. OATELS also formed the Apprenticeship Information Exchange Workgroup, composed of SACs and Workforce Investment System partners, to discuss strategies and activities to improve the linkages between the workforce investment system and apprenticeship.

Evidence: Information on HUD Step-up Program at www.hud.gov/progdesc/stepup.cfm; Job Corps/OATELS intra-agency partnership (February 1, 2005).

YES 14%
3.6

Does the program use strong financial management practices?

Explanation: OATELS' financial results are reported in the Financial Section of the DOL FY 2004 Annual Performance and Accountability Report. As a program office within the Employment and Training Administration (ETA), OATELS received no material findings in the 2001-2004 OIG audit reports. To assure compliance with OMB regulations, OASAM guidance, and ETA budget and classification codes, OATELS distributes Annual Accounting Instructions and written procedures at the beginning of each fiscal year. Expenditures are reported in the DOL monthly Detail Fund Report, which managers check to ensure consistency with the budget and to identify any need for corrections. OATELS maintains appropriate internal controls and separation of duties in compliance with FASB standards and OMB Bulletin No. 01-09.

Evidence: FY 2005 Accounting Instructions and Classification Codes (OATELS Annual Accounting Instructions); DOL Monthly Detail Fund Report #D-253A; Financial Section of the DOL FY 2004 Annual Performance and Accountability Report, www.dol.gov/_sec/media/reports/annual2004/MDA.htm#4; OIG Audits of DOL Consolidated Financial Statements.

YES 14%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: OATELS states that it cannot require SAC States to participate in the RAIS data system (see question 3.1), but reports that it is taking steps to encourage their participation. Of the 27 SAC States, 7 currently participate in RAIS. Seven additional states have shown an interest in using RAIS during the next year, and OATELS is assisting them in making that transition. This is an area requiring further attention. OATELS conducts reviews of SAC States and registered apprenticeship programs to ensure adherence to program regulations. OATELS reviews all SAC States over a three-year period. Circumstances that might prompt a formal SAC review include revisions to state apprenticeship laws, regulations, policies, or procedures, as well as sponsor, apprentice, or public complaints. In addition, regional staff are expected to review a certain number of apprenticeship programs each year. Registered programs are targeted for review based on complaints, new apprentice selections, history of deficiencies, underrepresentation of women and minorities, program completion rates, size of the program sponsor, and other factors. When deficiencies are identified during these reviews, staff provide technical assistance (e.g., training, linkages to related programs) to remedy the deficiency.

Evidence: ETA/OATELS Circular 95-02: Bureau of Apprenticeship and Training Review of State Apprenticeship Councils; ETA/OATELS Circular 92-02: Apprenticeship Quality Assessment.

YES 14%
Section 3 - Program Management Score 72%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: As noted in 2.2, the program has not yet established baselines or targets for its new long-term measures, so progress toward the long-term goals cannot be judged. The program also did not establish baselines or targets for the measures in the FY 2003-2008 Strategic Plan, so progress against these former long-term measures cannot be assessed either. The program exceeded its FY 1999-2004 long-term targets, but the measures--number of new apprentices and number of new female apprentices--are not outcome-oriented and are of limited value in gauging the success of the program in terms of participants' employment outcomes. It is also of concern that, although the program exceeded both long-term targets in FY 2001, it did not establish new long-term targets to take their place.

Evidence: DOL FY 2003-2008 Strategic Plan, www.dol.gov/_sec/stratplan/main.htm (see goal 1.1.B). In the FY 1999-2004 Strategic Plan, DOL established two long-term measures: (1) to increase the number of new civilian apprentices by 10 percent over the FY 1999 baseline of 108,622; and (2) to increase the number of new female civilian apprentices by 15 percent over the FY 1999 baseline of 7,551. DOL surpassed the first goal in FY 2000 and FY 2001, and the second goal in FY 2001. By FY 2004, DOL had achieved a 28 percent increase in the number of new apprentices and a 29 percent increase in new female apprentices (compared to FY 1999).

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: As noted in 2.2, the program has not yet established baselines or targets for its new measures. The program met all but one of its annual targets in the past three years. However, these measures (e.g., the number of new apprenticeship programs) are not outcome-oriented and do not gauge the success of the program in terms of participants' employment outcomes.

Evidence: In FY 2004, DOL: (1) established 526 new apprenticeship programs in new and emerging industries, versus a target of 366 programs; and (2) registered 69,697 new apprentices (through Federal efforts only), versus a target of 68,592 new apprentices. In FY 2003, DOL: (1) established 359 programs in new and emerging industries, meeting the target; and (2) registered 130,615 new apprentices (through Federal and State efforts), falling short of the goal of 133,909. In FY 2002, DOL: (1) established 326 programs in new and emerging industries, exceeding its target of 293 programs; (2) registered 129,388 new apprentices, versus the target of 86,647; (3) established 2,952 new apprenticeship programs, versus the target of 1,854; and (4) involved 5,883 new businesses in the registered apprenticeship system, versus the target of 3,248. FY 2004 DOL Annual Report on Performance and Accountability, www.dol.gov/_sec/media/reports/annual2004/main.htm, (TO ACCESS: scroll to Appendices, click on Performance Goal Detail, scroll to Performance Goal 1.1C (ETA), Strengthen the registered apprenticeship system.)

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The program has made some improvements to operate more efficiently. Specifically, as part of the Employment and Training Administration's reorganization, OATELS has reduced its regional offices from 10 to 6 and its area offices from 89 to 81. The program has also reduced its staff from 210 full-time equivalent positions in FY 2001 to 170 FTE in FY 2006, largely by reducing the number of Federal staff in SAC States. The program is being given "small extent" based on these improvements. From FY 2002 to FY 2004, however, cost per participant increased by 5.5 percent.

Evidence: Cost per participant data are as follows: FY 2002, $91 per participant; FY 2003, $82 per participant; and FY 2004, $96 per participant.

SMALL EXTENT 7%
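The 5.5 percent increase cited in 4.3 can be reproduced from the FY 2002 and FY 2004 figures in the Evidence line. A minimal check (illustrative only, not part of the original assessment):

```python
# Cost-per-participant figures from the 4.3 Evidence line (dollars).
cost_fy2002 = 91
cost_fy2004 = 96

# Percentage change from FY 2002 to FY 2004.
pct_increase = (cost_fy2004 - cost_fy2002) / cost_fy2002 * 100
print(round(pct_increase, 1))  # → 5.5
```

Note that the FY 2003 dip to $82 per participant is not reflected in the endpoint-to-endpoint comparison.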
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The apprenticeship program has never been evaluated in terms of its outcomes, i.e., the effect of apprenticeship programs on participants' employment and earnings after they leave apprenticeship programs, and there are no evaluations comparing the apprenticeship model with other training programs. The program is being given a "small extent" because it compares favorably to other job training programs in terms of cost. From a Federal perspective, the program is extremely inexpensive, at about $100 per participant, as most of the program costs are borne by the private sector. Further, apprentices are employed and earning a wage while they are being trained, a characteristic that sets the program apart from other job training models. In addition, DOL reports that the average weekly wage of apprenticeship graduates is $882, which is 33% above the median weekly wage for individuals aged 25 and older with some college or an associate degree ($661). Implementation of the common measures, which is planned for FY 2005, will allow for comparison of the apprenticeship program with other adult job training programs, although as noted in 2.1 the apprenticeship program will measure retention and wages during program participation instead of measuring true employment outcomes (i.e., after an individual has left the program).

Evidence: Bureau of Labor Statistics, Usual weekly earnings of full-time and salary workers by selected characteristics, 2004 annual averages.

SMALL EXTENT 7%
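The 33% wage premium cited in 4.4 follows directly from the two weekly-wage figures given there. A minimal check (illustrative only, not part of the original assessment):

```python
# Weekly wage figures from the 4.4 Explanation (dollars).
wage_apprentice_grad = 882  # average weekly wage of apprenticeship graduates (DOL)
wage_median_some_college = 661  # BLS median, age 25+, some college or associate degree

# Percentage by which graduates' average exceeds the BLS median.
pct_above = (wage_apprentice_grad - wage_median_some_college) / wage_median_some_college * 100
print(round(pct_above))  # → 33
```

The comparison mixes an average (graduates) with a median (BLS), so the premium is indicative rather than exact.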
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The registered apprenticeship programs have never been evaluated in terms of their ultimate outcomes, i.e., the effect of apprenticeship programs on participants' employment and earnings after they leave apprenticeship programs. Evaluations to date have focused on specific aspects of program design and participation and on the processes used to achieve program goals. GAO has conducted two reviews of the program: one, in 2001, noted that DOL had not systematically identified new occupations suitable for apprenticeship or addressed employer concerns about registration requirements, resulting in the slow expansion of the program to new occupations. A 1992 GAO study noted growth in minority and female participation in apprenticeship programs, but also the continued underrepresentation of women. There are some external evaluations of U.S. apprenticeship programs, although they focus on the experience of apprentices while they are in the programs.

Evidence: General Accounting Office, "Registered Apprenticeships: Labor Could Do More to Expand to Other Occupations," GAO-01-940 (September 2001) and "Apprenticeship Training: Administration, Use and Equal Opportunity," GAO-HRD-93-43 (March 1992), both available at www.gao.gov.

NO 0%
Section 4 - Program Results/Accountability Score 13%


Last updated: 09062008.2005SPR