ExpectMore.gov


Detailed Information on the
TRIO Upward Bound Assessment

Program Code 10000210
Program Title TRIO Upward Bound
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2002
Assessment Rating Ineffective
Assessment Section Scores
Section Score
Program Purpose & Design 80%
Strategic Planning 72%
Program Management 55%
Program Results/Accountability 16%
Program Funding Level
(in millions)
FY2008 $314
FY2009 $361

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Developing targets for the efficiency measure.

Action taken, but not completed Program staff will examine the grantee efficiency data and overall efficiency results to date and propose strategies for setting outyear targets.
2007

Releasing the final report from the Upward Bound impact evaluation.

Action taken, but not completed The Department completed additional analyses and arranged for IES to have the report and additional analyses reviewed by external reviewers during the summer of 2008.
2009

Examining the grantee Annual Performance Report to determine whether changes are needed to ensure that HEOA prior experience outcome criteria data are available.

No action taken

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

Making regulatory changes in the upcoming year to increase the proportion of high-risk students served by UB grantees.

Completed In 2003, the Department renewed and strengthened the Expansion Initiative to target higher-risk students. The Department established an absolute priority for the 2007 competition to require all projects to target funds to higher-risk students.
2003

Providing technical assistance to new applicants and current grantees on high risk participants.

Completed The Department provided technical assistance on serving higher-risk students during the FY 2003 competition.
2003

Exploring policies that would encourage more qualified first-time grantees to participate in the program.

Completed The Department took action to ensure that prior experience points are awarded for demonstrated performance, and continues to examine ways to better link past performance with the key program goals.
2003

Monitoring and making better use of project performance data to improve the program.

Completed Based on performance data, the Department set an expenditure cap of $5,000 per participant and strengthened the Expansion Initiative by requiring grantees to serve higher-risk students.
2006

Implementing a strategy to use efficiency measures to improve cost effectiveness in achieving the program goals.

Completed The Department calculated overall UB efficiency data (the gap between cost per successful program completer and the cost per participant) for 2003 through 2005 and published grantee-level data to the Web in 2008. The Department has determined that length of participation in UB is directly related to improved postsecondary enrollment (which directly affects the efficiency measure), and has shared this information with grantees.
2006

Taking steps to better link rewards for past performance with the achievement of key program goals.

Completed The Department examined past performance in the award of prior experience points for the 2007 Upward Bound grant awards.

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: Percentage of Upward Bound participants enrolling in college.


Explanation: The Department plans to maintain overall performance while increasing the percentage of higher-risk participants. 2000 baseline data are provided by the program evaluation. Data for 2003 and later years are provided by project performance reports.

Year Target Actual
2000 N/A 65
2002 66 Not collected
2003 65.0 69.3
2004 65.0 74.2
2005 65.0 78.4
2006 65.0 data lag [Feb 2009]
2007 65.0 data lag [Nov 2009]
2008 70.0 data lag [Nov 2010]
2009 75.0
2010 75.0
2011 76.0
2012 76.0
2013 77.0
Annual Efficiency

Measure: The gap between cost per successful completer and the cost per participant for the Upward Bound program.


Explanation: A successful annual outcome is defined as a participant who persists toward or enrolls in college. The gap is the difference between the cost per output, which is the annual allocated appropriation divided by the number of students receiving services, and the cost per successful outcome, which is the annual allocated appropriation divided by the number of students who persist in high school and enroll in college. In 2007, data were recalculated using an improved methodology that uses data from a longitudinal data file (rather than a one-year snapshot file) and assesses the program outcome of school persistence from one year to the next. The longitudinal file allows for the correction of data inconsistencies. The new methodology also uses data reported in fields added to the annual performance report in recent years. As a consequence, the improved methodology captures participants who were previously excluded from the analysis because of data inconsistencies.
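The gap calculation described above can be sketched in a few lines. The appropriation and participant counts below are invented for illustration only; actual values come from the program's annual performance reports.

```python
def efficiency_gap(appropriation, participants, successful_outcomes):
    """Gap between cost per successful outcome and cost per participant.

    Cost per participant = appropriation / number of students served.
    Cost per successful outcome = appropriation / number of students who
    persist in high school and enroll in college. A smaller gap indicates
    a more efficient program.
    """
    cost_per_participant = appropriation / participants
    cost_per_success = appropriation / successful_outcomes
    return cost_per_success - cost_per_participant

# Hypothetical example: a $300 million allocated appropriation serving
# 65,000 participants, 52,000 of whom persist or enroll in college.
print(round(efficiency_gap(300_000_000, 65_000, 52_000)))  # 1154
```

Because both costs share the same numerator, the gap shrinks as the share of participants achieving a successful outcome rises, which is why longer participation (linked to improved postsecondary enrollment) directly affects this measure.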

Year Target Actual
2003 NA 306
2004 NA 384
2005 NA 279
2006 NA 233
2007 NA Data lag [Feb 2009]
2008 NA Data lag [Nov 2009]
2009 NA
2010 NA
2011 TBD [Feb 2010]
2012 TBD [Feb 2010]
2013 TBD [Feb 2010]
Annual Outcome

Measure: The percentage of TRIO Upward Bound participants persisting in college. (New measure, added February 2008)


Explanation: Persistence is the percentage of students who enrolled in college in the year immediately following high school graduation and then enroll for a second year. FY 2007 data, available in Nov. 2009, will reflect prior UB participants who enrolled in postsecondary education for the first time in the 2006-07 academic year and re-enrolled in the 2007-08 academic year.
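As a minimal sketch of the persistence definition above (the counts are invented for illustration, not drawn from program data):

```python
def persistence_rate(first_year_enrollees, second_year_enrollees):
    """Percentage of first-year college enrollees who enroll for a second year.

    The denominator is students who enrolled in college in the year
    immediately following high school graduation; the numerator is the
    subset of those students who re-enrolled the following year.
    """
    return 100.0 * second_year_enrollees / first_year_enrollees

# Hypothetical example: of 10,000 prior UB participants who enrolled in
# college the year after high school graduation, 8,200 re-enrolled.
print(persistence_rate(10_000, 8_200))  # 82.0
```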

Year Target Actual
2007 NA data lag [Nov 2009]
2008 Baseline data lag [Nov 2010]
2009 NA
2010 NA
2011 TBD [Dec 2010]
2012 TBD [Dec 2010]
2013 TBD [Dec 2010]

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program is designed to provide support services to high school students to increase their academic performance to prepare them for college.

Evidence: Statutory purpose (Subpart 2, Higher Education Act of 1965): "generate skills and motivation necessary for success in education beyond secondary school" for low-income, first generation students.

YES 20%
1.2

Does the program address a specific interest, problem or need?

Explanation: Data indicate that low-income, first-generation students are not adequately prepared for college and do not enroll in or complete college at the same rates as less-disadvantaged students.

Evidence: A wide range of data is available in NCES publications.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: UB is designed to provide highly intensive services to selected students with demonstrated need for assistance.

Evidence: The average per student expenditure is over $4,500, supporting a range of interventions and a 6-week residential summer program. This level of expenditure and effort is 5 to 15 times more expensive than other individual interventions such as tutoring and other student supplemental services.

YES 20%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: Few if any programs deliver the same combination of high-intensity academic instruction tailored to individual students, residential programs, and work-study stipends.

Evidence: GEAR UP and Talent Search have considerably lower expenditure levels per student and do not support residential programs and high school stipends.

YES 20%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation: The program does have significant impacts on certain types of students, but the evaluation findings indicate that it does not typically serve these students. This may be a design problem to be addressed through regulatory changes.

Evidence: The UB evaluation indicates that it increases 4-year college enrollment by 22 percentage points for students with lower expectations and 5 percentage points for all students, but the overall college enrollment rate is not improved. A multi-step plan has been put in place to improve performance by targeting higher-risk students, such as those with lower expectations.

NO 0%
Section 1 - Program Purpose & Design Score 80%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program's overall goal is to increase the college enrollment rates of low-income, first-generation students. ED has recently finalized targets for measuring success.

Evidence: The GPRA indicators track college enrollment rates for all UB students and for its higher-risk students, and targets are set to improve upon the current baseline performance levels.

YES 14%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: The long-term goals are the same as the annual goals. With annual performance information available for both goals, ED will be able to track annual progress against its short-term targets while also tracking progress against its long-term goals.

Evidence: The GPRA indicators track enrollment rates and targets are set to improve upon the current baseline performance levels.

YES 14%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: Annual performance reports (APRs) are required of all grantees and their performance is measured (including the allocation of prior experience points) on the basis of how well they meet program goals.

Evidence: Performance reports collect data on student persistence, high school completion, and college enrollment rates.

YES 14%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: UB projects are often linked with Talent Search, GEAR UP, and Student Support Services projects, creating a pipeline of services through college. Some projects share project directors to oversee all programs with coordinators providing day-to-day management.

Evidence: There are several UB grantees that also have GEAR UP, Talent Search, Educational Opportunity Centers, SSS, and McNair grants. Another example is the San Diego State University project which coordinates with NSF.

YES 14%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: The recently completed impact evaluation of this program was the first in over two decades. While this evaluation is of sufficient scope, this program has not had regular evaluations to guide program management and discern program impacts. ED has begun formulating a long-term evaluation plan for TRIO programs that will include Institute of Education Sciences (IES) intervention studies.

Evidence: The next interim evaluation report should be released in early 2003.

NO 0%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: Some funds for UB have been requested and allocated to provide work-study opportunities and recruit higher-risk students in response to evaluation findings, with the intent of improving program performance. However, the amount of funds allocated for these purposes has been very small, and no specific outcome goals for these initiatives have been set.

Evidence: Less than 10% of funds have been allocated for improving program performance ($16 million for higher risk students and $9 million for work-study).

NO 0%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: Action steps have been developed to improve program performance and make changes to the competitive process. Steps have been taken to develop annual goals, including short- and long-term targets.

Evidence: A multi-step plan for improving program performance, including an invitational priority and regulatory changes to serve higher risk students, has been developed. Prior to this plan, the Department initiated the UB Participant Expansion Initiative and the newly authorized work study provisions.

YES 14%
Section 2 - Strategic Planning Score 72%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Project performance information is not used to improve program performance even though it is used for grantee management, such as scoring prior experience points during each program competition (every 4 years), and assessing the degree to which grantees achieved their stated goals and objectives. Efforts are being made to use NSLDS data to validate project performance.

Evidence: In addition to scoring 15 prior experience points in each competition, staff work with project directors in developing partnership agreements to ensure that goals are attainable yet ambitious based on information included in newly funded proposals and staff's assessment of reports from the grantees. If grantees do not demonstrate sustained progress, continuation awards can and have been withheld.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: This program has not instituted an appraisal system that holds Federal managers accountable for grantee performance. However, as part of the President's Management Agenda, the Department is planning to implement an agency-wide system -- EDPAS -- that links employee performance to progress on strategic planning goals. Grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits. Grantees that do not meet Federal requirements are required to submit improvement plans and can have grants reduced or discontinued for serious or persistent failures to comply.

Evidence: Follow-up efforts to Inspector General reports have resulted in several SSS and UB grantees (Creighton University, Independence College, Miami-Dade, Winston Salem State College, etc.) being designated as high risks. They are required to submit monthly reports of activities and expenditures and staff conduct site visits.

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated in a timely manner, but IG reports have indicated that monitoring of expenditures needs improvement. New office-wide monitoring plans are being implemented.

Evidence: Staff now monitor grantees' draw-down of funds by reviewing grantees' financial reports (GAPS). A memo explaining the consequences of excessive draw-downs was sent to all TRIO grantees. In addition, grantees must submit a written request before any accounts are reopened after the close of a grant cycle.

YES 9%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The TRIO program office relies on competitive sourcing to farm out technical and other administrative tasks for which it lacks the expertise and staff. However, the program has no formal procedures for measuring and improving the cost efficiency of its operations.

Evidence: TRIO administration funds support multiple contracts to provide database, technical assistance, and reporting support. Electronic APRs also create efficiencies in reporting.

YES 9%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: Education's 2004 Budget submission satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute less than 1 percent of the program's full costs. However, Education has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels. The program does not have sufficiently valid and reliable performance information to assess the impact of the Federal investment.

Evidence:  

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: No internal control weaknesses have been identified in the TRIO program office, which follows Departmental guidelines for financial management.

Evidence:  

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: TRIO has developed a plan for responding to IG concerns regarding insufficient grantee monitoring and unclear reporting requirements.

Evidence: The TRIO program office has developed a detailed monitoring plan that emphasizes conducting on-site visits to newly funded projects and to high-risk projects (those showing evidence of mismanagement, constant turnover in leadership, etc.). In 2001, staff began visiting all new UB projects and continue to do so.

YES 9%
3.CO1

Are grant applications independently reviewed based on clear criteria (rather than earmarked) and are awards made based on results of the peer review process?

Explanation: Independent peer review panels are used to score and rank all applications.

Evidence: TRIO administration funds are used to pay for the peer review process. 100% of grants are subject to review.

YES 9%
3.CO2

Does the grant competition encourage the participation of new/first-time grantees through a fair and open application process?

Explanation: The TRIO program office provides outreach and technical assistance to new grantees, but significant competitive preference is given to existing grantees for their prior experience. The statute and regulations provide up to 15 bonus points for prior experience.

Evidence: Over 95% of grantees are successful in reapplying for funds. Without additional funds for new awards, few if any grants would go to first-time applicants.

NO 0%
3.CO3

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: New procedures have been developed for improving the monitoring of expenditures based on IG concerns.

Evidence: In addition to increasing efforts at on-site monitoring, the TRIO program office continues to review all reports (APRs, partnership agreements, interim performance reports, audits) that grantees are required to submit and make follow-up calls to clarify questions and concerns.

YES 9%
3.CO4

Does the program collect performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The TRIO program office does collect and compile data from performance reports, and occasionally produces a program profile report. However, these data are not readily available to the public, are not posted on the internet, and do not reflect program impacts.

Evidence: Student privacy concerns are currently being examined and may be a barrier to providing readily available data.

NO 0%
Section 3 - Program Management Score 55%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: ED has recently finalized its goals and targets for Upward Bound but does not yet have information to measure program progress. UB's latest evaluation findings indicate significant impacts for some groups of students, namely those with lower educational expectations. One of ED's new goals is to improve performance in the enrollment rates of these student groups.

Evidence: Evaluation findings revealed that UB increases 4-year college enrollment rates by 22 percentage points for higher-risk students and 5 percentage points overall. However, there is no overall impact on college enrollment because the program is poorly targeted, serving students who are not most in need of services. ED will use APR data for subsequent reporting on its new targets.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: ED's annual goals for this program are the same as the long-term goals. Annual targets are set as a proportion of the long-term targets.

Evidence: 2004 APR data will begin to inform about annual progress for ED's goals.

NO 0%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: The program does not lend itself to the development of efficiency measures that link the Federal investment to program outcomes.

Evidence:  

NA  %
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation: The overall performance of UB indicates that it can have positive effects if appropriately targeted, but there are no comparable programs with outcome data against which it can be judged.

Evidence: Upward Bound has significant impacts on some groups of students but is not well targeted to serve these students. A multi-step plan has been put in place to better target students who are shown to benefit.

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: The evaluation findings are those of an independent contractor hired by the Department to conduct a longitudinal study with a matched comparison group and case studies.

Evidence: The evaluation indicates significant impacts for some groups of students, but no overall impact.

SMALL EXTENT 16%
Section 4 - Program Results/Accountability Score 16%


Last updated: 01092009.2002FALL