ExpectMore.gov


Detailed Information on the
Vocational Education State Grants Assessment

Program Code 10000212
Program Title Vocational Education State Grants
Department Name Department of Education
Agency/Bureau Name Office of Vocational and Adult Education
Program Type(s) Block/Formula Grant
Assessment Year 2002
Assessment Rating Ineffective
Assessment Section Scores
Section Score
Program Purpose & Design 20%
Strategic Planning 43%
Program Management 56%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $1,182
FY2008 $1,161
FY2009 $0

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

The Department will set short-term targets based on the measures in the new Perkins law and develop strategies for collecting the necessary data.

Action taken, but not completed The Department has set targets for FY2008 and 2009 for the performance measures that are tied to ESEA. During FY 2008, targets for the other performance indicators will be set for FY2009 and 2010. The Department has issued guidance on measurement approaches, and a majority of States are using one or more of those approaches. States will submit their first set of data for ESEA-related indicators in December 2008.
2006

Issuing regulations on implementation of performance measures systems under the new Perkins law.

Action taken, but not completed The Department made the determination not to issue regulations on implementing the performance measures under the 2006 Perkins Act. The Department has issued guidance on measurement approaches for the program's indicators, and expects to issue additional nonregulatory guidance during 2008.
2006

Providing technical assistance on integrating challenging academic career/technical instruction and collecting and reporting better performance data under the new Perkins law.

Action taken, but not completed During 2008 the Department will make awards to up to 6 States to develop and implement rigorous programs of study and will post on its website examples of curriculum projects on enhancing math and science instruction in a number of technical fields. In addition, the Department will host Data Quality Institutes in 2008 to provide assistance on improving the quality of performance data for the program.
2007

The Department will set long-term targets based on the measures in the new Perkins law.

No action taken In 2009 the Department will negotiate long-term targets for States for FY 2010-2012.
2003

Grantee funding will be contingent on a rigorous assessment that student outcomes are being achieved.

Enacted Although the 2006 Perkins Act did not incorporate the proposal the Department released in 2003 to reauthorize the program as a Secondary and Technical Education program, it strengthened the program's accountability requirements.
2003

Grantees will have the flexibility to focus program funds in a manner that best serves students in a given locality.

Not enacted The Department released a blueprint for reauthorizing the program as a Secondary and Technical Education program that would have allowed grantees to focus program funds based on local needs. Congress did not enact the proposal.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: For postsecondary students - Percentage of participants who entered employment in the 1st quarter after program exit.


Explanation:

Year Target Actual
2002 84 86
2003 85 83
2004 86 83
2005 88 84
2006 89 87
2007 90 74
2008 91 no data
2009 80
2010 81
Long-term/Annual Outcome

Measure: For secondary students - Attainment of a high school diploma, certificate, or GED.


Explanation:

Year Target Actual
2002 85 84
2003 86 84
2004 88 84
2005 87 84
2006 88 89
2007 89 86
2008 90 no data
2009 87
2010 88
Long-term/Annual Outcome

Measure: For secondary students - Entry into employment or enrollment in postsecondary education/advanced training.


Explanation:

Year Target Actual
2002 85 84
2003 86 84
2004 87 87
2005 87 87
2006 88 87
2007 89 86
2008 90 no data
2009 87
2010 88
Long-term/Annual Efficiency

Measure: Cost per secondary student


Explanation: OMB job training common measure for efficiency. Cost is calculated separately for secondary and postsecondary students.

Year Target Actual
2001 n/a 96
2002 n/a 76
2003 n/a 83
2004 85 81
2005 85 80
2006 85 75
2007 85 77
2008 85
2009 76
2010 75
Long-term/Annual Efficiency

Measure: Cost per postsecondary student.


Explanation: OMB job training common measure for efficiency. Cost is calculated separately for secondary and postsecondary students.

Year Target Actual
2001 n/a 79
2002 n/a 74
2003 n/a 73
2004 75 81
2005 75 86
2006 75 83
2007 75 76
2008 75
2009 74
2010 73

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program provides financial assistance to states in support of a variety of efforts, including improving students' academic and technical skills, preventing dropouts, increasing graduation rates, increasing postsecondary and advanced degree placement, and improving job outcomes. These multiple and potentially overlapping objectives have caused ambiguity among stakeholders as to the central purpose of the program.

Evidence: The Department has received feedback from stakeholders that the broad scope and varied activities of the program have caused confusion at the local level about the key objectives of the program.

NO 0%
1.2

Does the program address a specific interest, problem or need?

Explanation: Data indicate that a significant number of students are graduating from high school and community college without the necessary academic and technical competencies to be productive members of the workforce.

Evidence: National Assessment of Vocational Education; Consolidated Annual Performance Reports.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: The program is not designed such that there is consensus among stakeholders on the program's key objectives. Moreover, because the impacts of the program are not currently known, the effect of reducing or increasing the federal investment in this program is unclear.

Evidence: The lack of information on program impact is due in large part to deficiencies in performance reporting, including problems with data quality.

NO 0%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: The Federal contribution provides support for services that are provided to students at the state and local level. Because Federal and state funds are commingled, the extent of the value added of the Federal investment is unclear.

Evidence: There are a variety of Federal programs that seek to improve academic skills and ensure that students get into college and succeed. The diverse and varied goals of this program overlap with the goals of many other programs.

NO 0%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation: There are a number of program design features that warrant improvement, including for example, focusing the scope and objectives of the program and developing a more rigorous performance accountability framework.

Evidence: The Department has received feedback from stakeholders that the broad scope and varied activities of the program have caused confusion and implementation problems at the local level because of ambiguity surrounding the key objectives of the program.

NO 0%
Section 1 - Program Purpose & Design Score 20%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: Consistent with measures established under the job training common measures framework, the Department is working to develop several long-term indicators and performance targets that are tied to short term goals and are consistent with the program's scope and activities.

Evidence:  

NO 0%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: Through the common measures matrix, the program has established a limited set of performance indicators designed to measure program performance/progress, including for example, placement in employment, degree attainment, and skill attainment. However, the Department must establish numerical targets and ensure that performance data exists to report on those targets. In addition, any short-term measures (whether the common measures or additional measures) must be linked to long-term goals. To the extent performance targets are set by states, a process should be put in place to ensure that state-defined targets are appropriately rigorous and that a methodology can be developed for aggregating performance data at the national level.

Evidence: The Department has made efforts to establish performance targets for each of the common performance measures. However, data integrity problems have made it difficult to assess past performance or establish a valid baseline. The Department believes that without such information, it is premature to establish performance targets.

NO 0%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: While the program receives regular and timely annual performance information from grantees, the information cannot yet be tied to a strategic planning framework where a limited number of annual performance goals demonstrate progress to achieving long-term goals.

Evidence: Instructions for this question indicate that a "no" is required if the program received a "no" for both questions 1 and 2 of this section.

NO 0%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: Considerable collaboration and coordination occurs at both the Federal level (e.g., with DOL) and at the grantee level (e.g., with WIA title I one-stops).

Evidence:  

YES 14%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: The National Assessment of Vocational Education (NAVE) is an independent analysis, conducted every 5 years, and tracks appropriate program outcomes. However, the NAVE does not measure the marginal effects that the Federal investment has on state vocational education programs.

Evidence:  

YES 14%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: The program does not have a strategic planning framework in which a limited number of annual performance goals demonstrate progress toward achieving long-term goals. Thus, performance goals are not currently aligned with budget policy.

Evidence: Reliable data on critical performance measures are limited. Specifically, educational and employment outcome data are not uniform across states and cannot be aggregated (e.g., states set their own thresholds and use different definitions of who counts as a vocational education student).

NO 0%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: The Department has undertaken a process to make strategic planning improvements. This process is being coordinated with the Department's ongoing development of a reauthorization proposal as well as the development of the common measures framework.

Evidence:  

YES 14%
Section 2 - Strategic Planning Score 43%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: While grantees provide regular and timely performance information for a series of existing performance measures, there are data quality problems that affect the validity and reliability of the data. Moreover, current performance information is not yet linked to a strategic goals framework (see Sec II, q 1 & q 2), nor is it consistent with the common measures at this time.

Evidence: The Department has made progress in using existing performance information to manage the program, e.g., imposing conditions on grantees through requirements for improvement plans. Data quality issues include lack of a uniform definition of who is a Voc Ed. student as well as an inability to use state-level performance data to develop national estimates.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: This program has not instituted an appraisal system that holds Federal managers accountable for grantee performance. However, as part of the President's Management Agenda, the Department is planning to implement an agency-wide system -- EDPAS -- that links employee performance to progress on strategic planning goals. Grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits. Grantees that do not meet Federal requirements are required to submit improvement plans and can have grants reduced or discontinued for serious or persistent failures to comply.

Evidence:  

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended.

Evidence:  

YES 11%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: This program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements.

Evidence: The common measures framework includes an efficiency measure -- cost per participant. The Department estimates that the cost per participant is $102 for high school students and $122 for post-secondary students. However, the lack of valid outcome data makes it impossible to link these costs to the achievement of program goals. Moreover, these figures will need further refinement once ED can establish a uniform definition of a Voc. Ed. participant.

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: Education's 2004 Budget satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute less than 1 percent of the program's full costs. However, Education has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels. The program does not have sufficiently valid and reliable performance information to assess the impact of the Federal investment.

Evidence:  

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: The program has a positive audit history, with no evidence of internal control weaknesses.

Evidence:  

YES 11%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has improved its monitoring process through increased review of grantee budgets and performance, and has taken steps to increase accountability through conditions on state grants.

Evidence:  

YES 11%
3.B1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through consolidated annual reports, site visits and compliance monitoring, and technical assistance activities.

Evidence:  

YES 11%
3.B2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The performance reports are annual and widely disseminated. Work remains to be done both to rectify data quality problems and to make those problems more transparent.

Evidence:  

YES 11%
Section 3 - Program Management Score 56%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: Consistent with measures established under the job training common measures framework, the Department is working to develop several long-term indicators and performance targets that are tied to short term goals and are consistent with the program's scope and activities.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Through the common measures matrix, the program has established a limited set of performance indicators designed to measure program impacts, including for example, placement in employment, degree attainment, and skill attainment. However, the Department must establish numerical targets and ensure that performance data exists to report on those targets. In addition, any short-term measures (whether the common measures or additional measures) must be linked to long-term goals.

Evidence: The Department has made considerable efforts to accumulate the data necessary to report on the common measures. However, ongoing data quality issues make reliable aggregation of state performance data impossible. For example, the Department estimates that 37% of postsecondary student participants earn a degree or certificate; however, states define "postsecondary student participant" differently, raising concerns about the reliability of ED's estimate.

NO 0%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: The common measures framework includes an efficiency measure -- cost per participant. The Department estimates that the cost per participant is $102 for high school students and $122 for post-secondary students. However, the lack of valid outcome data makes it impossible to link these costs to the achievement of program goals. Moreover, these figures will need further refinement once ED can establish a uniform definition of a Voc. Ed. participant.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation: To date, the Department has been unable to provide data on the common measures. NAVE results and individual state performance reports (non-aggregated) indicate that vocational education as currently constituted is not effective in achieving academic and employment outcomes.

Evidence:  

NO 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: The most recent NAVE was released in December 2002. Historically, the NAVE has provided mixed results in terms of whether program goals are achieved. The 1994 NAVE concluded that vocational education provides little or no measurable advantage for high school students in terms of high school completion, postsecondary enrollment, and academic achievement. Preliminary results from the 2002 NAVE confirm the 1994 findings and further find that substituting vocational courses for academic courses adversely affects student academic achievement and college enrollment. However, the 2002 NAVE did find that taking a high school vocational course (versus taking no vocational courses) may have a positive impact on earnings.

Evidence: 1994, 2002 NAVE.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2002SPR