ExpectMore.gov


Detailed Information on the
Adult Education State Grants Assessment

Program Code 10000180
Program Title Adult Education State Grants
Department Name Department of Education
Agency/Bureau Name Office of Vocational and Adult Education
Program Type(s) Block/Formula Grant
Assessment Year 2006
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 100%
Program Results/Accountability 80%
Program Funding Level
(in millions)
FY2007 $564
FY2008 $554
FY2009 $554

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Pursuing reauthorization language to enable the Department to collect participants' earnings data, either through Unemployment Insurance (UI) records or through other means allowed under State law.

Status: Not enacted
Comments: OMB is working with the Department to pursue the collection of UI wage data in states that have no legislative limits.
2007

Provide training to all States and Outlying Areas to improve the overall quality of program performance data collected and reported by State and local programs.

Status: No action taken
Comments: The Department will enter into a multi-year technical assistance and training contract to develop and deliver a multi-day training workshop for all grantees on building an electronic reporting tool, generated from the State-level individual database, that enables States to monitor and evaluate the extent to which local programs meet the Federal data quality standards in the Department's data quality standards checklist.
2008

Provide technical assistance and training for up to six States that are promoting program accountability through the use of performance-based funding systems.

Status: Action taken, but not completed
Comments: The Department has funded a two-year project to assist up to twelve States in adopting performance-based funding systems that use Federal performance outcomes to allocate Federal or State adult education resources, or both. The contract concludes in September 2010.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

During fiscal year 2008, improve the availability of program performance data to enable comparisons across States.

Status: Completed
Comments: The Department has cleaned up States' data from previous years (FY 2001-FY 2004) and populated the public database to ensure public access to improved data across States and programs. The updated data sets are available to the public online at http://wdcrobcolp01.ed.gov/CFAPPS/OVAE/NRS/tables/.

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: The percentage of adults in adult basic education programs who acquire the level of basic skills needed to complete the level in which they enrolled.


Explanation: This measure examines basic (ninth-to-less-than-twelfth-grade equivalent) skills acquisition. One functional level corresponds to approximately two grade levels. Skill acquisition is measured by standardized assessment instruments and reported through annual reports. Due to internal performance reporting deadlines, the 2006 target could not be adjusted. Targets for 2007-2015 have been adjusted to reflect current program performance. (An illustrative calculation of this completion-rate measure follows the data table below.)

Year Target Actual
2001 45 36
2002 40 37
2003 41 38
2004 42 38
2005 42 40
2006 39 39
2007 42 38
2008 44 Feb. 2009
2009 46 Feb. 2010
2010 48 Feb. 2011
2011 50 Feb. 2012
2012 52 Feb. 2013
2013 54 Feb. 2014
2014 56 Feb. 2015
2015 68 Feb. 2016
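
The percentage reported for this measure is a completion rate: the share of enrollees in an educational functioning level who complete that level during the program year. The sketch below is a minimal illustration of that arithmetic only; it is not the Department's NRS reporting code, and the enrollment and completion counts are hypothetical.

# Illustrative only: hypothetical counts, not actual NRS data.
def level_completion_rate(enrolled: int, completed: int) -> float:
    """Return the percentage of enrollees who completed the level in which they enrolled."""
    if enrolled == 0:
        return 0.0
    return 100.0 * completed / enrolled

# Hypothetical State totals for one program year.
enrolled_in_level = 120_000   # adults who enrolled in a given educational functioning level
completed_level = 46_800      # of those, adults who completed that level

print(f"Completion rate: {level_completion_rate(enrolled_in_level, completed_level):.1f}%")
# Prints: Completion rate: 39.0%
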
Long-term/Annual Outcome

Measure: The percentage of adults enrolled in English literacy programs who acquire the level of English language skills needed to complete the level of instruction in which they enrolled.


Explanation: This measure examines proficiency in English language skills using standardized assessment instruments and is reported through annual performance reports. One functional level corresponds to approximately two grade levels. Due to internal performance reporting deadlines, the 2006 target could not be adjusted. Targets for 2007-2015 have been adjusted to reflect current program performance.

Year Target Actual
2001 50 31
2002 42 34
2003 44 36
2004 45 36
2005 45 37
2006 38 37
2007 40 39
2008 42 Feb. 2009
2009 44 Feb. 2010
2010 46 Feb. 2011
2011 48 Feb. 2012
2012 50 Feb. 2013
2013 52 Feb. 2014
2014 54 Feb. 2015
2015 56 Feb. 2016
Long-term/Annual Outcome

Measure: The percentage of adults with a high school completion goal who earn a high school diploma or recognized equivalent.


Explanation: This measure represents one of the measures associated with the job training common measures initiative. Due to internal performance reporting deadlines, the 2006 target could not be adjusted.

Year Target Actual
2001 45 33
2002 40 42
2003 41 44
2004 42 45
2005 46 51
2006 46 49
2007 52 59
2008 53 Feb. 2009
2009 54 Feb. 2010
2010 55 Feb. 2011
2011 56 Feb. 2012
2012 57 Feb. 2013
2013 58 Feb. 2014
2014 59 Feb. 2015
2015 60 Feb. 2016
Long-term/Annual Outcome

Measure: The percentage of adults with a goal to enter postsecondary education or training who enroll in a postsecondary education or training program.


Explanation: This measure reflects one of the measures associated with the job training common measures initiative. Data show only those students who have expressed a goal to enter postsecondary education or training programs as a learning objective. Due to internal performance reporting deadlines, the 2006 target could not be adjusted and is lower than actual reported data from 2005. Targets for 2007-2015 have been adjusted to reflect current program performance.

Year Target Actual
2001 New Baseline 25
2002 25 30
2003 26 30
2004 27 30
2005 30 34
2006 33 35
2007 37 55
2008 39 Feb. 2009
2009 41 Feb. 2010
2010 43 Feb. 2011
2011 45 Feb. 2012
2012 47 Feb. 2013
2013 49 Feb. 2014
2014 51 Feb. 2015
2015 53 Feb. 2016
Long-term/Annual Outcome

Measure: The percentage of adults with an employment goal who obtain a job by the end of the first quarter after their program exit quarter.


Explanation: This measure reflects one of the employment measures associated with the job training common measures initiative. Data show only those students who have expressed a goal to enter employment as a learning objective.

Year Target Actual
2001 New Baseline 36
2002 36 39
2003 37 37
2004 38 36
2005 40 37
2006 40 48
2007 41 61
2008 41 Feb. 2009
2009 42 Feb. 2010
2010 42 Feb. 2011
2011 43 Feb. 2012
2012 43 Feb. 2013
2013 44 Feb. 2014
2014 44 Feb. 2015
2015 45 Feb. 2016
Long-term/Annual Outcome

Measure: The percentage of adults who retained employment in the third quarter after exit.


Explanation: This measure reflects one of the employment measures associated with the job training common measures initiative. Data show only those students who have expressed a goal to enter employment as a learning objective and then remained employed in the third quarter following program exit. The Department will collect baseline data on the revised legislative definition for job retention and will set long-term targets accordingly.

Year Target Actual
2001 60 62
2002 62 63
2003 64 69
2004 64 63
2005 64 64
2006 65 64
2007 66 73
2008 66 Feb. 2009
2009 66 Feb. 2010
2010 67 Feb. 2011
2011 67 Feb. 2012
2012 67 Feb. 2013
Long-term/Annual Efficiency

Measure: The annual cost per participant.


Explanation: This measure captures only the Federal portion of Adult Education funds and does not take State or local funds into consideration.

Year Target Actual
2001 * $190
2002 * $206
2003 * $210
2004 * $219
2005 $217 $227
2006 $215 Dec. 2008
2007 $215 Dec. 2009
2008 $215 Dec. 2010
2009 $214 Dec. 2011
2010 $214 Dec. 2012
2011 $213 Dec. 2013
2012 $213 Dec. 2014
Long-term/Annual Efficiency

Measure: The cost per adult education participant who advanced one or more educational levels or earned a high school diploma or GED.


Explanation: This measure captures the average Federal cost for a student to advance to at least the next higher educational level in an Adult Education program, or to earn a high school diploma or GED credential. (A worked illustration of both efficiency measures follows the table below.)

Year Target Actual
2000 * $435
2001 * $481
2002 * $510
2003 * $516
2004 * $474
2005 $475 $502
2006 $451 Dec. 2008
2007 $428 Dec. 2009
2008 $407 Dec. 2010
2009 $407 Dec. 2011
2010 $405 Dec. 2012
2011 $405 Dec. 2013
2012 $403 Dec. 2014
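
Both efficiency measures are simple ratios of Federal program funds to counts reported through the National Reporting System: total Federal funds divided by the number of participants, and divided by the number of participants who advanced at least one educational level or earned a high school diploma or GED. The sketch below is an illustrative calculation only; the funding figure and counts are hypothetical and are not the Department's reported data.

# Illustrative only: hypothetical funding and counts, not actual NRS data.
federal_funds = 560_000_000          # Federal Adult Education State Grant funds (hypothetical)
participants = 2_500_000             # adults served during the program year (hypothetical)
participants_with_gain = 1_150_000   # advanced a level or earned a diploma/GED (hypothetical)

cost_per_participant = federal_funds / participants
cost_per_learning_gain = federal_funds / participants_with_gain

print(f"Cost per participant:   ${cost_per_participant:,.0f}")    # Prints: $224
print(f"Cost per learning gain: ${cost_per_learning_gain:,.0f}")  # Prints: $487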

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The Adult Education State Grants program (authorized by the Adult Education and Family Literacy Act, or AEFLA) creates a partnership among the Federal government, States, and localities to provide adult education and literacy services that (i) help adults become literate and obtain the skills necessary to become employed and self-sufficient; (ii) help adults who are parents obtain the skills necessary to be full partners in the educational development of their children; and (iii) help adults complete a secondary school education.

Evidence: Adult Education and Family Literacy Act, Section 202.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: According to the National Assessment of Adult Literacy, in 2003, 14 percent of U.S. adults had "below basic" prose literacy skills (below eighth-grade equivalent skills) and 29 percent had "basic" prose literacy skills (ninth-to-less-than-twelfth-grade-equivalent skills). In 2000, 9.7 million adults reported that they spoke English "not well" or "not at all." Adults with low basic skills or limited English proficiency are significantly less likely to achieve economic self-sufficiency. In 2003, the median weekly earnings of adults with "intermediate" prose literacy skills were 65 percent higher than the earnings of adults with "below basic" skills and 27 percent higher than those of adults with "basic" skills. In addition, according to the Temporary Assistance for Needy Families (TANF) Sixth Annual Report to Congress, 42 percent of adults who received TANF benefits in 2003 had completed 11 or fewer years of education. Finally, the 2000 Census of Population and Housing indicated that limited English proficient males who worked full-time in 2000 earned 92 percent less than males who were proficient in English. Adults with limited English proficiency were three times as likely to receive public assistance.

Evidence: National Assessment of Adult Literacy; TANF Sixth Annual Report to Congress; 2000 Census of Population and Housing.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: This is the only Federal program focused exclusively on improving the skills of low-skilled and limited English proficient out-of-school youth and adults in reading, mathematics, and English proficiency. It also helps them earn a high school diploma or its equivalent. AEFLA is the main provider of basic skills and English literacy instruction in the U.S., providing instruction to more than 2.5 million out-of-school youth and adults each year. Other Federal programs may provide similar services, but these programs do not: (1) focus on improving the education of this particular population; (2) universally offer adult education services to participants who need them; or (3) serve as many individuals as AEFLA. In fiscal year (FY) 04, AEFLA served approximately eight times as many individuals as all other related Federal programs combined. For example, in FY 04, 63,887 Workforce Investment Act (WIA) Title I Youth program participants were high school dropouts, but 36 percent of these youth did not receive any educational services. The WIA Title I Adult program served 73,041 adults who lacked a high school diploma, but only 5,040 adults participated in programs that integrated basic skills instruction with occupational training. AEFLA complements state and local efforts to support adult education and ensures that services are available in states and communities with limited resources. In FY 02, AEFLA funds provided 26 percent of the total resources (including cash and in-kind contributions by states, local governments, and private entities) used to support adult education in the United States. State and local support for adult education varies widely across the states. AEFLA provided 51 percent or more of the resources used to support adult education in a majority of the states. The program requires states to use program funds to supplement and not supplant state efforts and to maintain not less than 90 percent of their prior year's support for adult education.

Evidence: AEFLA State financial and performance reports; Workforce Investment Act Standardized Record Data Books; Workforce System Results, Program Year 2004; Even Start Facts and Figures; Government Accountability Office, Multiple Employment and Training Programs: Funding and Performance Measures for Major Programs (GAO-03-589); National Youth Challenge 2004 Performance and Accountability Highlights; TANF Sixth Annual Report to Congress; Adult Education and Family Literacy Act, section 241; Job Corps PART.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: Adult Education has no major design flaws. The program authority is sufficiently broad to enable the program to use a variety of means for achieving program objectives. There is no evidence that other approaches would be more effective or efficient than those currently in use.

Evidence: Adult Education and Family Literacy Act, Section 202.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: Funding is distributed to states on the basis of their respective shares of adults who lack a high school diploma and immigrants admitted for legal residence. The law requires states to conduct a needs assessment to guide their use of program funds and to develop specific strategies to serve special populations, including individuals with disabilities, single parents, displaced homemakers, and individuals with limited English proficiency. States award funds competitively to adult education providers, and must consider the need for services in the geographical area that will be served, as well as provider effectiveness, in awarding funds. Eligibility for services is limited to individuals who: (1) "lack sufficient mastery of basic educational skills to enable the individuals to function effectively in society; (2) do not have a secondary school diploma or its recognized equivalent, and have not achieved an equivalent level of education; or (3) are unable to speak, read, or write the English language." The literacy or English proficiency skills of adults seeking services are established by standardized assessments at intake. A 2003 survey of participants confirmed that the program is serving a severely disadvantaged population. For example, nearly three-quarters of all participants were immigrants, and 18 percent were immigrants who had completed 8 or fewer years of education in their home countries.
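
The State allocation described above is, at its core, a proportional share calculation: each State's count of eligible adults relative to the national count determines its portion of the appropriation. The sketch below illustrates only that proportional step, using hypothetical State counts and the FY 2008 funding level shown at the top of this page; the actual statutory formula also includes additional provisions (for example, minimum allotments) that are omitted here.

# Illustrative only: hypothetical State counts; omits statutory adjustments
# such as minimum allotments.
appropriation = 554_000_000  # FY 2008 program funding level (from the table above)

eligible_adults = {          # hypothetical counts of eligible adults by State
    "State A": 1_200_000,
    "State B": 650_000,
    "State C": 2_900_000,
}

national_total = sum(eligible_adults.values())

for state, count in eligible_adults.items():
    share = count / national_total
    print(f"{state}: share {share:.1%}, allotment ${share * appropriation:,.0f}")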

Evidence: Adult Education and Family Literacy Act, sections 203(1), 211 (c)(2), 224(b)(1), 224(b)(10); Department of Education Appropriations Act, FY 2006; Adult Education Program Study.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has five specific long-term measures that focus on outcomes and reflect the program's purposes. These measures track: basic skills and English language acquisition, high school completion, transition to post-secondary education, and employment. The program adopted the employment measures as part of the Administration's job training common measures initiative.

Evidence: See Measures tab.

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Based on annual trend data from 2001, the program developed five long-term measures with ambitious and attainable targets through 2015. Previous targets set for years 2001-2006 were established without the benefit of baseline or trend data in most cases. Long-term targets are based on actual data through 2005 and therefore are more realistic targets for program success. The program has set a high bar for measuring educational gains by participants. The program seeks to improve performance by 15 to 25 percent in the next ten years. The long-term goals are ambitious in light of the 2003 National Assessment of Adult Literacy, which found no significant changes in adult prose, document, or quantitative literacy since 1993. Targets for long-term measures are ambitious and increase from 2007-2015. For instance, the target for measure #1 (basic skills acquisition) increases by 42 percent from 2007-2015. For measure #2 (English language acquisition), targets increase by 37.5 percent through 2015. The long-term performance targets for Adult Education are ambitious, and revised targets reflect actual data and represent attainable goals.

Evidence: Adult Education Long Term Goals (see Measures tab).

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The five long-term measures have annual targets, and the program has two efficiency measures, one that measures cost per participant and another that measures cost per learning gain. These targets were set using three years of reliable baseline data, unlike previous targets, which were established without the benefit of such data.

Evidence: Adult Education Long Term Goals; Adult Education Efficiency Measures (see Measures tab).

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Although new baselines were established for annual measures in 2001, targets for 2001-2005 were established without actual performance data, which resulted in some years' data exceeding targets. The program has established new baselines for 2007 onward using actual performance data from the most recent years and set ambitious targets for all of its annual measures accordingly. Targets increase by 1 to 5 percent annually. Due to internal performance reporting deadlines, 2006 targets could not be adjusted.

Evidence: See Measures tab.

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The program agrees upon annual targets with every State for each of its performance measures. These negotiations are driven by the aggregate annual and long-term performance targets the program has established. States, in turn, set performance targets for each local provider that receives AEFLA funds. States must report annually on their success in meeting their agreed-upon targets.

Evidence: Adult Education and Family Literacy Act, sections 212 and 231(e).

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The program has supported numerous studies conducted by independent external contractors, including randomized controlled trials (RCTs), to identify programmatic reforms and instructional strategies that will improve the program's effectiveness, evaluate State and local implementation of the program, determine the extent of the need for the program, and assess its success in fulfilling its purpose. While the size and scope of the Adult Education program preclude conducting an RCT of the overall program, the program has initiated seven RCTs to evaluate the effectiveness of different interventions (policies, instructional strategies, and curriculum) in improving participant outcomes. One of these RCTs, which evaluated the effectiveness of compelling low-skilled TANF recipients to enroll in adult education, found that the intensity and duration of services improve outcomes for program participants. The availability of data on program instruction and duration, along with standards for pre- and post-testing participants, helps improve the intensity of instruction provided to participants. The Department also responded to two of this study's principal findings (that individuals did not show significant reading skill gains until after 12 months, and that many individuals dropped out of the program before 12 months) by investing in five new RCTs to identify instructional strategies that will enable significant learning gains in fewer than 12 months. The first results of these studies will be reported in 2007. The program also implemented a pilot technical assistance initiative (Student Achievement in Reading, or STAR) to help 120 teachers in 6 States use existing scientifically-based reading research to improve instruction and accelerate learning gains. Another independent evaluation supported by the program, the Adult Reading Components Study, provided a detailed assessment of the reading skills of a representative sample of adult education participants. This evaluation helped inform the development and implementation of STAR. To improve instruction for English Language Learners, who comprise 40 percent of participants, the program is supporting another independent RCT of explicit literacy instruction that will report its first results in 2009. Other independent evaluations supported by the program include: (1) an evaluation of the extent and effects of State and local implementation of the 1998 law's funding, accountability, and one-stop career center participation requirements (which included a sample of State agencies); (2) a large-scale assessment of the literacy skills of a nationally representative sample of adults in the U.S. to determine the characteristics of adults with low-level skills and the relationship between low-level literacy skills and participation in the labor market (National Assessment of Adult Literacy, or NAAL); and (3) a survey of a nationally representative sample of Federally-funded adult education program administrators and a survey and a literacy skills assessment of their participants (Adult Education Program Study, or AEPS). The Department will use the results of the latter two studies, which will be reported by December 2006, to help States and local providers better target Federal resources to assist adults with the greatest educational needs, and improve the quality of the services they provide.
Specifically related to AEPS, the Office of Vocational and Adult Education (OVAE) has already developed a program tool that allows States to use AEPS data to make comparisons of programs and participant characteristics across the country. This information will be used to develop new technology-based service delivery models. This provides an important tool to improve the structure and delivery of services within the States.

Evidence: Bos, Scrivener, et al. Improving Basic Skills: The Effects of Adult Education in Welfare-to-Work Programs. MDRC (2002); Pindus, Aron, et al., Study to Assess Funding, Accountability, and One-Stop Delivery Systems in Adult Education, Urban Institute (2005); National Assessment of Adult Literacy (2005), conducted by the National Center for Education Statistics of the Institute of Education Sciences; Adult Education Program Study, Westat (2006).

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The budget materials for the program show the full cost of administering the program. However, budget requests are not explicitly tied to accomplishment of annual and long-term performance goals. While the Department reports performance data, the Department is not able to determine the impact of particular levels of funding on the performance of Adult Education in relation to annual and long-term performance measures.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The program has implemented the recommendations made in the previous PART assessment and addressed the deficiencies the PART identified by taking administrative action to improve data quality and by investing national activity funds strategically in projects and activities to improve program performance. The program has made six major administrative improvements and seven program improvements since the original PART. The administrative improvements include: (1) Implementing most of the job training common measures, with the exception of "increase in earnings." The Department of Education has sought statutory authority to collect data for this remaining measure. (2) Establishing ambitious long-term and annual performance targets through FY 2015 that focus on outcomes and reflect the program's purpose. (3) Improving the quality of state performance data as states have implemented the National Reporting System for adult education. (4) Using standardized assessments to measure student learning gains. (5) Creating data quality standards that identify the policies, processes and materials that states and local programs must have in place to collect valid and reliable data. (6) Pursuing rulemaking to make the data quality standards legally binding on states and local programs in 2006. Since the original PART assessment, the program refocused its national activity investments on projects and activities that will improve program performance, including: (1) experimental research to identify effective strategies in reading instruction for adults and literacy interventions for adult English Language Learners who have low levels of literacy in their native language; (2) technical assistance to states in developing and implementing content standards to drive instruction by local programs; (3) the development of a toolkit and accompanying professional development for state teams to help instructors incorporate evidence-based reading practices into the instruction of intermediate adult basic education learners (Project STAR); (4) technical assistance to states to expand and improve technology-enabled and/or web-enhanced distance education (AdultEd Online); (5) the identification of programs, practices, and policies that successfully facilitate transitions from adult basic education to community college certificate and degree programs; (6) assisting states in developing funding mechanisms that, in whole or in part, award funds on the basis of a provider's success in achieving measurable results; and (7) supporting a Center for Adult English Language Acquisition to disseminate research-based information and resources to all states regarding effective English language instruction for adults and to provide intensive professional development and technical assistance to states that have experienced a rapid influx of English Language Learners over the past several years.

Evidence: Adult Education Annual Performance and Financial Report (OMB Control #1830-0027); A Blueprint for Preparing America's Future: The Adult Basic and Literacy Education Act of 2003; Annual state performance and financial reports; Adult Education National Reporting System (www.nrsweb.org); Adult Education and Family Literacy Act National Activity Plans; Adult Education Content Standards Warehouse (www.adultedcontentstandards.org); STudent Achievement in Reading Project (www.startoolkit.org); Adult Basic Education to Community College Transitions (www.mdrc.org/project_31_74.html); Center for Adult English Language Acquisition (www.cal.org/caela/).

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department annually collects performance data from States. To ensure the data are credible, the program has established data quality standards and monitors State compliance. On-site monitoring is focused on States that consistently fail to meet their targets or perform significantly below other States in order to try to identify the causes of poor performance and possible solutions. Performance data also drive the program's national activity investments. For example, disappointingly few program participants pursue postsecondary education after attaining a GED or high school diploma. In an effort to increase the number, the Department has supported the development of State content standards that are aligned with community college entrance requirements, initiated a project to identify programs, practices, and policies that successfully facilitate transitions to community college certificate and degree programs, collaborated with the Department of Labor to provide technical assistance to encourage community colleges to create pathways between adult education programs and occupational training programs, and is launching a new demonstration program to create more adult education to community college pathway programs.

Evidence: Adult Education Annual Performance and Financial Report (OMB Control #1830-0027); Division of Adult Education and Literacy FY 2006 Monitoring Plan; Adult Education and Family Literacy Act National Activity Plans.

YES 11%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: The program agrees with each State on annual targets for the program's performance measures. States that exceed their targets are eligible for incentive grants if they also exceed their annual targets for the Carl D. Perkins Vocational and Technical Education Act and Title I of the Workforce Investment Act. The program office focuses its on-site monitoring on States that consistently fail to meet their targets or perform significantly below other States in order to try to identify the causes of poor performance and possible solutions. All States must consider applicants' past success in meeting State performance targets in their competitive award of funds to local providers. In addition, 3 States (California, Kansas, Pennsylvania) use performance-based criteria (e.g., number of students who advanced one or more educational functioning levels) to distribute 100 percent of Federal funds to local providers, and another 3 States (Indiana, Missouri, and Ohio) distributed a portion of Federal funds to local providers using performance-based criteria. An additional 3 States (Illinois, Kentucky, Michigan) use performance-based criteria to distribute a portion of State funds, and may begin to allocate Federal funds in this way as well. The program has encouraged all States to implement performance-based funding, and will be releasing a technical assistance guide to promote the use of performance-based funding in 2006. The program also has provided technical assistance to interested States in developing and disseminating "report cards" for local providers that would assist prospective students and community stakeholders in evaluating the success of local adult education programs. All national activity funds administered by the Office of Vocational and Adult Education (OVAE) have been awarded through performance-based contracts. Performance agreements hold the Deputy Assistant Secretary accountable for reducing the amount of program funds that States revert to the Treasury by 10 percent in FY 2005, and hold the program director accountable for awarding grants to States that have met Federal requirements at the time program funds become available (July 1), and for completing all scheduled procurement actions by the end of the fiscal year.
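
Several of the States described above distribute some or all Federal funds to local providers on the basis of measured outcomes. The sketch below shows one simple version of such a performance-based distribution, awarding funds in proportion to each provider's count of students who advanced one or more educational functioning levels; the provider names, counts, and allocation amount are hypothetical, and actual State funding formulas vary.

# Illustrative only: a hypothetical performance-based distribution; real State
# formulas differ and may combine performance with other factors.
state_allocation = 10_000_000  # Federal funds a State distributes to local providers (hypothetical)

level_advancements = {         # students who advanced one or more levels, by provider (hypothetical)
    "Provider 1": 4_200,
    "Provider 2": 1_300,
    "Provider 3": 2_500,
}

total = sum(level_advancements.values())

for provider, count in level_advancements.items():
    award = state_allocation * count / total
    print(f"{provider}: ${award:,.0f}")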

Evidence: Adult Education and Family Literacy Act, sections 212, 231(e)(2); Workforce Investment Act, section 501; Division of Adult Education and Literacy FY 2006 Monitoring Plan; Adult Education National Reporting System (www.nrsweb.org); MPR Associates, Performance-Based Funding in Adult Education (2005); Office of Vocational and Adult Education performance agreements.

YES 11%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Several times a year, program staff reviews the financial information in the Grant Administration and Payment System (GAPS). The Department of Education (ED) uses GAPS to track the financial activities of a grant, from initial obligation of funds by ED, through drawdown of funds by the grantee, to final settlement of the grant. In addition, GAPS maintains demographic information on the grantees. There have been no substantive audit findings in this area.

Evidence: SF-133 forms, GAPS reports and monitoring.

YES 11%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The National Reporting System (NRS) is the accountability system for the Federally funded, State-administered adult education program; it addresses the accountability requirements by collecting learner outcomes in the five areas mandated by the Adult Education and Family Literacy Act, Title II of the Workforce Investment Act (WIA, P.L. 105-220). The implementation of the NRS has increased cost-effectiveness in program administration and has enabled the Department to establish efficiency measures based on measured cost data. Based on NRS data from States, OVAE has established and implemented two efficiency measures (cost per participant and cost per learning gain) with ambitious targets. The cost per participant target remains $215 over a period of three years. The cost per participant with a learning gain decreases from $475 to $407, a savings of over 14% in program costs over 3 years. The use of NRS performance and financial data also allows the Department to make program decisions that increase the overall efficiency of the Department and of grantees of adult education programs. For example, efficiency measures are used by the Department to negotiate State targets and enable cost comparisons between States. The NRS also facilitates cost-cutting measures by States: States are given this information to institute performance-based funding at the sub-grantee level so that only high-performing programs are awarded funding. Performance data have alerted the Department to potential issues with State data, which can be corrected before problems arise. This also allows States to continuously improve their programs, saving time and resources, while increasing effectiveness in the long run.

Evidence: National Reporting System; Adult Education and Family Literacy Act Efficiency Measures.

YES 11%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: To address the lack of scientifically based research on adult education, the program office has collaborated with the U.S. Department of Health and Human Services, the Institute of Education Sciences, the National Institute of Child Health and Human Development, and the National Institute for Literacy to invest in randomized controlled trials to evaluate the effectiveness of different program strategies, curricula and instructional practices. There is also extensive, ongoing collaboration between the program office and the Employment and Training Administration (ETA) in the U.S. Department of Labor. The program office joined ETA in sponsoring three regional summits in 2005 that brought together State directors of adult education, State administrators of Workforce Investment Act Title I youth programs, State administrators of juvenile justice programs and others to improve the coordination of the delivery of services to out-of-school youth. Through the joint ETA-OVAE Strategic Partnerships for a Competitive Workforce initiative, the program office is providing technical assistance to recipients of ETA's Community-Based Job Training grants to create "pathways" that will enable participants in Federally funded adult education programs to transition successfully to the occupational training ETA grantees will offer. The program office also has provided extensive technical assistance to ETA in adapting the National Reporting System to ETA youth programs. In 2005, the program office collaborated with the Office of Citizenship in the U.S. Department of Homeland Security to produce and disseminate a publication to help immigrants understand the U.S. system of government (Welcome to the United States: A Guide for New Immigrants) in English, Spanish, Chinese, Vietnamese, Korean, Russian, Arabic, Tagalog, Portuguese, French, and Haitian Creole. The program office plans to extend the partnership in 2006 by supporting the Office of Citizenship's efforts to create a professional development program to assist adult education instructors in delivering an integrated English Literacy/Civics Education curriculum. The program office also is supporting an interagency workgroup convened by the Secretaries of Education and Labor to promote greater collaboration among Federal programs that serve adults and out-of-school youth with low basic skills or limited English proficiency. The program is working with the U.S. Departments of Agriculture, Defense, Health and Human Services, Housing and Urban Development, and Labor, the Corporation for National and Community Service, and the Institute of Museum and Library Services to develop and implement strategies to expand opportunities for learning for low-skilled and limited English proficient adults and out-of-school youth, and to improve the quality of instruction in Federally-supported programs for these populations.

Evidence: Adult Education and Family Literacy Act National Activity Plans; Creating New Opportunities for Collaboration: A Shared Vision for Youth (http://www.doleta.gov/ryf/); Welcome to the United States: A Guide for New Immigrants (http://uscis.gov/graphics/citizenship/imm_guide.htm)

YES 11%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:

YES 11%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The Department has implemented the job training common measures and made improving the quality of performance data and promoting its use to increase program performance its chief priority. It has established data quality standards all states must meet, required states to create individual student record systems that meet the program's specifications for quality and functionality, and prescribed how states and local programs must identify post-program outcomes using data matching or follow-up surveys. The program office conducts on-site monitoring to determine state and local program adherence to these requirements and directs states to correct any deficiencies it identifies within 90 days. To promote the use of performance data to improve student outcomes, the program has supported "train-the-trainer" professional development for state staff and provided them with materials they can adapt and use in working with local providers.

Evidence: Adult Education Annual Performance and Financial Report (OMB Control #1830-0027); National Reporting System for Adult Education (www.nrsweb.org); Division of Adult Education and Literacy FY 2006 Monitoring Plan.

YES 11%
3.BF1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Oversight practices at the Department provide sufficient knowledge of grantee activities. The Department collects NRS performance and financial data from States annually. The data collected from States inform the Department's decisions to improve the program. For example, States report data on program performance to the Office of Vocational and Adult Education (OVAE), and the performance data are assessed and used in a risk analysis of grantees. Based on the NRS data, grantees are then selected for program monitoring and targeted technical assistance. The Department has also incorporated performance outcomes into its monitoring tool, and the program outcome data collected are a key component of the monitoring process. Using NRS data, OVAE conducts on-site monitoring visits to approximately one-third of States, as well as desk monitoring, on a regular basis each year.
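
The risk analysis described above uses reported performance against negotiated targets to decide which grantees receive on-site monitoring and targeted technical assistance. The sketch below shows one simple screen of that kind, flagging States whose reported results fall well short of their targets; it is a hypothetical illustration, not the Department's actual risk model, and the States, targets, and results are invented.

# Hypothetical illustration of a performance-based monitoring screen;
# not the Department's actual risk analysis.
performance = {  # negotiated target vs. reported actual (percent achieving educational gain)
    "State A": {"target": 42.0, "actual": 45.5},
    "State B": {"target": 42.0, "actual": 33.0},
    "State C": {"target": 40.0, "actual": 39.5},
}

def flag_for_monitoring(data: dict, shortfall_threshold: float = 5.0) -> list:
    """Return States whose results fall short of target by more than the threshold (in points)."""
    return [state for state, p in data.items()
            if p["target"] - p["actual"] > shortfall_threshold]

print("Selected for on-site monitoring:", flag_for_monitoring(performance))
# Prints: Selected for on-site monitoring: ['State B']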

Evidence: Adult Education Annual Performance and Financial Report (OMB Control #1830-0027); Division of Adult Education and Literacy FY 2006 Monitoring Plan.

YES 11%
3.BF2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The Department has taken active steps to publish and make available grantee performance data. The program office is currently finalizing an on-line data warehouse system to make this information more widely available. Currently, the data are available on-line at http://wdchqdcolp02.ed.gov/CFAPPS/OVAE/NRS/tables/. This website allows users to access program performance and enrollment data reports by State and program year. Some States have included this website link on their own websites so that local program providers and other interested parties have access to this information. In addition, at least 15 States have published program performance report cards under the guidance of the program office. The Department supports States' efforts to publish performance data report cards and conducted a 3-day training workshop to assist States in publishing reports. Reports are available in two areas of the Department of Education's public website that include adult education enrollment and target population data by State: http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/census1.pdf and http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/aedatatables.html. Finally, the Adult Education Program Facts report includes a State-by-State breakdown of award funding, matching levels, and cost-per-participant data. This report can be found at http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/aeflaprogfacts.doc.

Evidence: Adult Education and Family Literacy Act Annual Reports to Congress, Department website.

YES 11%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Program performance on all indicators was higher in FY 2004 than in FY 2000. A participant must advance the equivalent of two or more grade levels on a standardized assessment in order to be reported as having made a learning gain. The percentage of participants achieving learning gains has risen consistently since FY 2000, although the average number of hours of instruction participants receive per year (less than 200) is significantly less than the average hours of academic instruction delivered in a year of high school (800). Significantly, the program has made its greatest gains in improving the outcomes of adults who have the lowest level skills (grade level equivalent 0 to 1.9, or kindergarten up to second grade) and are the most difficult to serve. The percentage of participants with the lowest level skills who advanced one or more educational levels in a program year grew from 27 percent in FY 2000 to 42 percent in FY 2004, an increase of 56 percent over FY 2000. Also, the percentage of participants who attained a GED or high school diploma has increased 55 percent since FY 2000.

Evidence: Adult Education and Family Literacy Act Annual Reports to Congress; Adult Education and Family Literacy Act Performance Trends by Educational Level (attached).

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program has not met its targets for measures #1 and #2 (basic skill acquisition and English language acquisition), the most important measures. However, program performance on all indicators increased between 2001 and 2004. The program's performance targets for 2001 through 2004 were established in FY 2000 in the absence of reliable baseline performance data and some of these targets proved to be significantly higher than baseline. For example, the target for measure #2 (English language acquisition), was at least 8 points above baseline in each of FYs 2001 through 2004. More recent targets for the first two measures have been revised to reflect the most recent performance data. The program has exceeded its annual performance targets for measures #3 (high school completion) and #4 (transition to postsecondary education) in each of the last four fiscal years. The program met its annual performance target for measure #5 (transition to work) in FY 2003.

Evidence: Adult Education Annual and Long-Term Goals (see Measures tab).

SMALL EXTENT 7%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The program implemented a web-based reporting system in FY 2004 that eliminated the need for program staff to compile aggregate program performance data from paper reports submitted by States. An estimated 1 FTE was required to perform this function prior to FY 2004. The web-based reporting system now automatically compiles aggregate performance data from State reports, and includes edit checks that have dramatically reduced the number of data entry errors on State reports. Implementation of web-based reporting has reduced the costs of aggregating data by an estimated $15,000. To improve data quality and reduce the costs of data collection, the program office has increased the number of States that use data matching to determine the post-program outcomes of participants (placement in employment, employment retention, GED attainment, and entrance into postsecondary education) rather than surveys. Since 2001, the number of jurisdictions (the 50 States, the District of Columbia and Puerto Rico) that use data matching to determine placement in employment has increased by 37 percent (from 24 to 33), while the number of jurisdictions using data matching to determine postsecondary entrance has increased by 24 percent (from 21 to 26). Nearly half of the jurisdictions (24 of 52) now use data matching to determine all post-program outcomes.
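
Data matching, as described above, determines post-program outcomes by linking exiters' records to administrative data (such as UI wage records) rather than surveying participants. The sketch below is a minimal, hypothetical illustration of one such match for the entered-employment outcome, which the measure earlier on this page defines as obtaining a job by the end of the first quarter after the exit quarter; the identifiers, fields, and records are invented, and real State matches operate on administrative databases under applicable privacy rules.

# Hypothetical illustration of data matching for the entered-employment outcome;
# identifiers and records are invented.
exiters = [                                   # adult education exiters with an employment goal
    {"id": "001", "exit_quarter": "2004Q2"},
    {"id": "002", "exit_quarter": "2004Q2"},
    {"id": "003", "exit_quarter": "2004Q3"},
]

wage_quarters = {                             # quarters with reported wages, by individual
    "001": {"2004Q3", "2004Q4"},
    "003": {"2005Q1"},
}

def next_quarter(q: str) -> str:
    """Return the calendar quarter following a 'YYYYQn' quarter string."""
    year, n = int(q[:4]), int(q[-1])
    return f"{year}Q{n + 1}" if n < 4 else f"{year + 1}Q1"

entered = sum(1 for e in exiters
              if next_quarter(e["exit_quarter"]) in wage_quarters.get(e["id"], set()))
print(f"Entered employment rate: {100 * entered / len(exiters):.0f}%")
# Prints: Entered employment rate: 33%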

Evidence: Adult Education State Performance and Financial Reports.

YES 20%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The program is more successful than other programs with similar purposes and goals in recruiting and retaining its target population of out-of-school youth and adults who lack a high school diploma or English proficiency, and assists more members of this target population in acquiring a GED or high school diploma, obtaining employment, and entering postsecondary education than all other related federal programs combined. It achieves these results at a significantly lower cost per participant than other programs. The program's performance exceeds that of other related programs on several of the job training common measures. The program's GED/high school diploma attainment rate was 51 percent in FY 2004, while the WIA Title I youth program achieved a 36 percent attainment rate, and Youthbuild achieved a rate of 33 percent for the target population served by AEFLA. The attainment rate for Job Corps in FY 2004 was not available, but it was 48 percent in FY 2003. Only the National Guard ChalleNGe program exceeded AEFLA's GED/high school diploma attainment rate in FY 2004 (55 percent), but it is a selective program that serves a narrow subpopulation of the individuals served by AEFLA. The program achieved these outcomes at a significantly lower cost to the Federal government than other programs. In FY 2004, the most recent year for which data are available, the Federal cost per high school diploma or GED attained by out-of-school youth and adults was $3,081 for AEFLA, $97,603 for WIA Title I youth, $73,212 for Job Corps, $60,024 for Youthbuild, and $15,113 for National Guard ChalleNGe. The program's postsecondary placement rate also exceeds that of comparable programs, placing 34 percent of exiters in postsecondary education in FY 2004; the postsecondary placement rates for out-of-school youth participating in the WIA Title I youth program, Job Corps, and National Guard ChalleNGe were 3 percent, 11 percent, and 16 percent, respectively. AEFLA's federal cost per postsecondary placement was $10,525, compared with $577,292 for WIA Title I youth, $318,965 for Job Corps, and $52,482 for ChalleNGe. While the program's employment placement rate for unemployed participants in FY 2004 was lower than that of other related programs, it placed twice as many out-of-school youth and adults who lacked basic skills or English proficiency in jobs (145,927) as all other related programs combined (72,514), at a significantly lower cost to the federal government. In FY 2004, the job placement rates and cost per placement for the target population in AEFLA and related programs were: 37 percent and $3,834 for AEFLA; 62 percent and $47,252 for the WIA Title I youth program; 68 percent and $33,424 for the WIA Title I adult program; 78 percent and $76,651 for the WIA Title I dislocated workers program; and 37 percent and $22,492 for the ChalleNGe program.

Evidence: Adult Education and Family Literacy Act State financial and performance reports; Workforce Investment Act Standardized Record Data; FY 2003 Job Corps Annual Report; National Guard Challenge Program, 2004 Performance and Accountability Highlights; YouthBuild Demographics and Outcomes, 2000-2004; U.S. Department of Justice, Federal Bureau of Prisons, State of the Bureau 2004; GAO (GAO-02-413 and GAO-04-308); Schochet, Burghardt, et al., National Job Corps Study (2000); Ricciuti, St. Pierre, et al., Third National Even Start Evaluation: Follow-Up Findings from the Experimental Design Study (2004); Porter, Cuban, and Comings, "One Day I Will Make It:" Study of Adult Student Persistence in Library Literacy Programs (2005); GED Testing Service annual statistical reports; Adult Education Program Study; St. Pierre, Gamse, et al., National Evaluation of the Even Start Family Literacy Program (1998); and Employment and Training Administration, Workforce System Results: July 1-September 30, 2004.

YES 20%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: A randomized controlled trial conducted by MDRC of adult education programs that served 2,923 TANF recipients (1,415 in the treatment group and 1,508 in the control group) for 6 to 18 months found that participants achieved learning gains in math and reading that were greater than or comparable to the gains associated with attending regular high school for the same length of time. Attaining a GED increased the average annual earnings of participants by 28 percent. An increase of one standard deviation in reading test scores increased average annual earnings by 13 percent, regardless of GED attainment, establishing that there is an earnings premium associated with improvements in reading skills that is independent of the effect of attaining a GED. An evaluation of the program's implementation found that collecting valid and reliable data and using it to improve program performance was a central focus of state and local program managers. A survey and assessment of program participants confirmed that the program is serving adults with the greatest need for assistance. In addition, the National Assessment of Adult Literacy found that adults who learned English by enrolling in an adult education program had significantly higher English literacy skills than adults who reported that they had learned English on their own. In 2003, 82 percent of adults who learned English independently had "below basic" (below eighth-grade equivalent) prose literacy skills and 12 percent had "basic" (ninth-to-less-than-twelfth-grade) prose literacy skills. In contrast, 63 percent of adults who had enrolled in an adult education program to learn English had "below basic" skills and 26 percent had "basic" skills.

Evidence: Bos, Scrivener, et al. Improving Basic Skills: The Effects of Adult Education in Welfare-to-Work Programs. MDRC (2002); Pindus, Aron, et al., Study to Assess Funding, Accountability, and One-Stop Delivery Systems in Adult Education, Urban Institute (2005); Adult Education Program Study (2006); National Assessment of Adult Literacy (2006).

YES 20%
Section 4 - Program Results/Accountability Score 80%


Last updated: 09062008.2006SPR