ExpectMore.gov


Detailed Information on the
High School Equivalency Program Assessment

Program Code 10002094
Program Title High School Equivalency Program
Department Name Department of Education
Agency/Bureau Name Office of Elementary and Secondary Education
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 25%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2008 $19
FY2009 $18

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Developing targets for its newly adopted efficiency measure, and using the measure to analyze costs relative to the costs of other GED attainment or drop-out prevention programs.

Action taken, but not completed The Department plans to develop targets that show differences in costs between commuter and residential HEP projects. The Department will use data from grantee reports submitted in December 2008 to develop targets. The Department expected to develop the targets earlier, since it encouraged grantees to use the new reporting format voluntarily in December 2007, but too few did so. The Department will also consider the overall performance of HEP projects relative to similar programs.
2005

Setting and gathering data on long-term goals that address outcomes achieved once participants complete the program, specifically, the extent to which they go to college or obtain better employment. These goals should be indexed against the performance of other disadvantaged populations or non-participating migrants, and not just provide "before and after" snapshots.

Action taken, but not completed The Department developed a new annual performance report that will be required for all grantees as of December 2008. This new report uses clearly defined data elements for post-program outcomes, specifies how long exiting students must be followed, and collects information on the grantees' success in tracking students who have left the program. Baseline data will be available in May 2009, and targets will be set after two years of data are collected.
2005

Developing a more effective method of using outcome data to hold grantees accountable.

Action taken, but not completed In 2007, the Department began designating grantees as "low-performing" or "high-risk" recipients to improve program performance and accountability, and in 2008 it is testing a draft protocol for desk and on-site monitoring, with an emphasis on monitoring high-risk or low-performing grantees. Further, the Department plans to produce individual grantee project profiles containing grantee performance data and to disseminate them among all program grantees in 2008.
2005

The program will develop a reporting and auditing system to verify locally reported data and to ensure that performance data are being collected consistently across grantees according to established criteria.

Action taken, but not completed The Department provided guidance and technical assistance to grantees in 2007 to refine and improve the uniformity of data collection and reporting. The new performance reports, required as of December 2008, should generate accurate, consistent data across all projects. The Department held grantee meetings in April 2008, and another meeting will occur in July 2008. Grantees also received individualized letters to verify the denominators to be used in reporting outcomes, which will be pre-populated for reporting.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Developing a strategy for addressing impediments that discourage new applicants, including consideration of legislative strategies.

Completed The Department has notified grantees, through written communications and presentations at grantee meetings, that it has instituted procedures to weigh lack of compliance in reporting, failure to spend funds in a timely manner, and grantees' poor performance on GPRA objectives against the priority points experienced grantees receive when applying for program funds. The Department has also trained staff to implement these activities.

Program Performance Measures

Term Type  
Annual Outcome

Measure: The percentage of HEP participants receiving a General Educational Development (GED) credential.


Explanation:

Year Target Actual
1996 . 70
1997 . 66
1998 . 72
1999 . 73
2000 . 58
2001 . 53
2003 60 63
2004 60 65
2005 65 66
2006 66 63
2007 67 54
2008 68 data lag [Oct. 2010]
2009 69 data lag [Oct. 2012]
2010 70 data lag [Oct. 2013]
Annual Efficiency

Measure: The cost per training for HEP participants who earn a GED. (Targets under development.)


Explanation:
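(Illustrative sketch only, using hypothetical figures rather than program data: a cost-per-GED measure of this form is typically computed as total project cost divided by the number of participants who earn a GED. For example, a project that spent $1,900,000 in a year in which 100 participants earned a GED would report $1,900,000 / 100 = $19,000 per GED. Targets, once developed, are expected to distinguish commuter from residential projects, as noted in the improvement plan above.)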

Long-term Outcome

Measure: Increasing numbers of HEP participants will have improved employment outcomes or attend a post-secondary institution.


Explanation:

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of HEP is to help migrant and seasonal farmworkers and their children obtain a General Educational Development (GED) credential and to gain employment or participate in postsecondary education or training.

Evidence: Higher Education Act, Title IV, Part A, Subpart 5, Section 418A

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: Studies have identified the migrant family lifestyle, and the mobility associated with it, as major obstacles that keep migrant students from obtaining the equivalent of a high school diploma, entering post-secondary education, entering the military, or obtaining employment. Only 15% of migrant and seasonal farmworkers complete an 8th-grade level of education, and migrant and seasonal farmworkers earn an average of $5,000 annually.

Evidence: Literature Review prepared by RTI Center for Research in Education, October 30, 2003; No Longer Children, p. 10, Aguirre International, November 2000; ERIC Digest, No. ED 376-997.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: HEP uniquely focuses on serving migrant and seasonal farmworkers who do not have a high school diploma and are beyond high school age. This group is rarely served by mainstream adult programs, and federal supplemental education programs generally serve only students between the ages of 5 and 17. Since most HEP participants are over 17, HEP serves a population, and works toward a goal, that no other federal program does. HEP eligibility directly addresses the primary obstacle facing this post-high-school-age population: mobility, and its implications for migrant and seasonal farmworkers' access to education.

Evidence: Higher Education Act of 1965, as amended, Title IV, Section 418A; HEP Regulations, 34 CFR Part 206, subsection 206.2 (Who is eligible?); No Child Left Behind, Title I, Part C Legislation

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence that the program's design is flawed. Program eligibility requirements target the students who are most affected by the migrant lifestyle and high mobility.

Evidence: Eligibility requirements

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: There is no evidence that this program is not effectively targeted. Departmental regulations require that grantees provide services to the intended target population: migrant and seasonal farmworkers. An estimated 60 to 75% of all HEP students are between 19 and 30 years old.

Evidence: 34 CFR 206. Secs. 206.10

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department is working to create long-term performance measures through 2010.

Evidence:  

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Department is working to create targets and timeframes for long-term performance measures.

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Department has established the following performance measure for the program: The percentage of HEP participants who complete the program and receive the GED will remain high, if not increase. (This is a conditional "yes." The measure addresses attaining a GED, but not post-GED employment or post-secondary education.)

Evidence: PPMD - OESE HEP FY 2005

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: 2001 baseline data reported that 53 percent of HEP students received a GED. Increasing targets have been set for 2003, 2004, and 2005. However, ED has not established agreed-upon targets or a methodology for gathering and analyzing data.

Evidence: PPMD - HEP FY 2005

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Through annual performance reports, the Department confirms grantee commitment to working toward the program's goal of assisting migrant and seasonal farmworker students to obtain a high school equivalency diploma. However, there is no evidence that consistent performance information is gathered for the program, nor is there information on post-GED outcomes.

Evidence: Submission of annual grant performance reports; OME-provided technical assistance; monitoring per the annual schedule plus ad hoc telephone monitoring; review of budgets and expenditures; post-award technical assistance and guidance; an annual grant administration meeting with grantees; and OME oversight of application processing, the non-competing continuation (NCC) review process, and regional technical assistance presentations all serve to establish quality-control management over grant operations, including the required signed assurances from grantees'/universities' chief or key executives that submissions of progress data will be reliable and accurate.

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Although no program funds are appropriated or authorized for conducting independent program-level evaluations, grantees are required individually to implement strong evaluation plans to shape improvement and comply with program objectives.

Evidence: Grant application requirements.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Department has not satisfied the first part of the question because performance changes are not identified with changes in funding levels. The program, at this time, does not have sufficiently valid and reliable performance information to assess (whether directly or indirectly) the impact of the Federal investment. However, the Department has satisfied the second part of the question because the Department's budget submissions show the full cost of the program (including S&E).

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department is working to create long-term performance measures through 2010, to be reported in the 2006 PPMD Strategic Plan.

Evidence:  

YES 12%
Section 2 - Strategic Planning Score 25%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department collects annual grant performance reports, which are used to determine whether grantee performance meets project objectives and program and GPRA goals. Data are self-reported and not subject to verification. However, where achievement or progress is not realized, grantees develop corrective action plans that OME staff oversee for compliance and technical assistance needs. To receive a "yes" on this question, ED must show it has a plan for standardizing and verifying local grantee data.

Evidence: Project files

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence:  

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended.

Evidence: Evidence suggests that grantees are drawing down funds at an acceptable rate.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing "One-ED" -- an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements. A "yes" answer is likely once the One-ED process is applied to this program's relevant business functions.

Evidence:  

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Program staff has collaborated with Department of Labor management/supervisory staff to establish inter-program (DOL, ED) guidance on regulatory definitions of eligible "migrant and seasonal farmworker" HEP participants. Department of Education staff from the TRIO program have provided guidance to their grantees regarding the coordination of available services with the HEP program. Some HEP grantees have established working networks with State and/or local Title I Migrant Education Program (MEP) offices in order to share resources for recruiting and enrolling eligible HEP students.

Evidence: Grantee applications; Annual OME meeting with HEP and CAMP Directors; Annual OME meeting with MEP Directors

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: While material internal management deficiencies have not been identified for this program, the program has put in place a system to identify potential problems.

Evidence: Program staff monitor excessive drawdowns of funds to prevent high-risk situations.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The Department awards grants on a point system that is based on selection criteria published in the Federal Register. Grant information for potential applicants is published in hard copy by the Department and posted on the Department's website.

Evidence: Federal Register notice; EDGAR selection criteria are the standards used for developing slates of awards and identifying successful applicants. However, program experience has demonstrated that grantees with established operational histories and knowledgeable project faculty -- along with coordination and commitment from institutional organizations -- are the most successful in the grant competitions.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through annual performance reports, site visits, and technical assistance activities. Program staff reviews budget expenditures in the context of mid-year and end of year reporting.

Evidence: Annual performance reports; site visit reports; revised budgets submitted by grantees

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Annual performance data is collected from grantees and compiled by program staff. Data for the indicators have been reported in PPMD for 2001 and 2002. While performance data are not published or posted on the web, results are available to the public and presented to grantees at technical assistance and grant administration meetings.

Evidence: Official project files and program compilations of performance results

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Department has not yet established a long-term performance measure for this program. However, performance data from the FY 1999 cohort show that progress has been made toward achieving the long-term HEP goal being developed by the Department: HEP participants will complete their GED.

Evidence: PPMD - HEP FY 2004 and FY 2005; ED Grant Performance Report tabulations of FY 1999, FY 2000, and FY 2001.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: While the Department has established annual performance measures, only baseline data has been collected and reported. Data for the first year of performance targets (FY 2003) will be reported later in 2004.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department has established, and OMB has approved, an efficiency measure for this program: the cost per training for HEP participants who earn a GED. However, targets and baselines are still under development.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: There are no programs with readily accessible comparable data. ED should develop a strategy to compare HEP and Adult Ed. results.

Evidence: "The GED Myth" by Jay Greene. Manahattan Institute for Public Policy and Research. Page 8. October 2003.; ED Grant Performance Reports submitted in 2000, 2001, and 2002 from 23 HEP grantees,

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Although no program funds are appropriated or authorized for conducting independent program-level evaluations, grantees are required individually to implement strong evaluation plans to shape improvement and comply with program objectives. Some grantees provide copies to the Department as evidence of compliance.

Evidence: Grant application requirements.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 01/09/2009 (2004 Fall assessment)