ExpectMore.gov


Detailed Information on the
TRIO Talent Search Assessment

Program Code 10001036
Program Title TRIO Talent Search
Department Name Department of Education
Agency/Bureau Name Office of Postsecondary Education
Program Type(s) Competitive Grant Program
Assessment Year 2005
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 90%
Program Results/Accountability 50%
Program Funding Level
(in millions)
FY2007 $143
FY2008 $143
FY2009 $143

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Implementing a strategy to use efficiency measures to improve cost effectiveness in achieving the program goals.

Action taken, but not completed A contractor is compiling 2006-07 school year performance data for the project grants and is scheduled to deliver the database to the Department by September 30, 2008. The Department anticipates completing grantee-level analyses and posting the results on the web by December 30, 2008. The 2006-07 efficiency data are due from the contractor by December 30, 2008, and will be posted on the web by June 30, 2009.
2007

Developing targets for the efficiency measure.

Action taken, but not completed The program developed a new annual performance report (APR) for use beginning in the fall of 2007 that provides clearer data definitions to help ensure that comparable data are collected across grants. The Department will receive the first year of data under this APR (for the 2006-07 school year) by December 30, 2008. The Department will need at least two years of data before setting targets.
2007

Analyzing and placing on the public website grantee-level performance data for the 2006-07 school year.

Action taken, but not completed The program office has received preliminary 2006-07 school year data from the contractor and is reviewing it to ensure quality. The program office will publish 2006-07 grantee-level enrollment and financial aid measures by March 31, 2009, and efficiency data by June 30, 2009. A new annual performance report clarifying data definitions was implemented in the fall of 2007.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

Completing a program evaluation and utilizing its findings to improve program performance.

Completed The Study of the Effect of the Talent Search Program on Secondary and Postsecondary Outcomes in Florida, Indiana, and Texas was released in 2006 and is available at http://www.ed.gov/rschstat/eval/highered/talentsearch-outcomes/ts-report.pdf. The Department is examining study findings to help determine program improvement strategies.
2004

Developing a meaningful efficiency measure.

Completed The Department developed an efficiency measure to track the average annual cost for each successful outcome, defined as a participant who persists toward or enrolls in college.
2004

Exploring policies that would open up the Talent Search application process to include more worthy new applicants while still rewarding high-performing prior grantees.

Completed The Department took action to ensure that prior experience points were awarded for demonstrated performance, and continues to examine ways to better link past performance with the key program goals.
2006

Taking steps to better link rewards for past performance with the achievement of key program goals.

Completed The Department has begun examining additional administrative actions that can be taken to link past performance with achievement of key program goals.

Program Performance Measures

Term Type  
Long-term/Annual Output

Measure: Percentage of students who apply for financial assistance to attend college.


Explanation: A new definition of "college-ready" will be implemented in FY 2007. This new definition and other improvements in Talent Search's Annual Performance Report format should result in greater accuracy of FY 2007 data.

Year Target Actual
2000 N/A 82
2001 N/A 86
2002 N/A 86
2003 N/A 85.6
2004 N/A 85.1
2005 N/A 85.4
2006 86.0 85
2007 86.5 data lag [Dec 2008]
2008 86.5
2009 87.0
2010 87.0
2011 87.5
2012 87.5
2013 88.0
Long-term/Annual Outcome

Measure: Percentage of participants enrolling in college.


Explanation: The 2003 PART included a long-term target of 78 percent enrollment. Based on recent success in achieving this target, new targets were developed for 2006-2012.

Year Target Actual
2000 N/A 73
2001 N/A 77
2002 N/A 78
2003 N/A 79
2004 73.5 77.6
2005 74.0 77.8
2006 78.5 77.8
2007 79.0 data lag [Dec 2008]
2008 79.0
2009 79.5
2010 79.5
2011 80.0
2012 80.0
2013 80.5
Annual Efficiency

Measure: The gap between cost per successful outcome and cost per participant.


Explanation: A successful annual outcome is defined as a participant who persists toward or enrolls in college. The gap is the difference between the cost per program participant, which is derived by dividing the annual appropriation by the number of participants, and the cost per successful outcome, which is derived by dividing the annual appropriation by the number of students who persist in high school or enroll in college. The efficiency indicator followed an upward trend from FY 2004 to FY 2006. For 2006-07, applicants were encouraged to propose serving fewer target schools so as to increase the quality and efficiency of services; this may have a favorable effect on the indicator.

Year Target Actual
2004 N/A 1.65
2005 N/A 1.8
2006 N/A 1.9
2007 N/A data lag [Dec 2008]
2008 N/A data lag [Dec 2009]
2009 N/A data lag [Dec 2010]
2010 N/A
2011 TBD
2012 TBD
2013 TBD
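The arithmetic behind this efficiency measure can be illustrated with a minimal Python sketch. The appropriation, participant count, and successful-outcome count below are hypothetical placeholders, not actual Talent Search data; they are chosen only to show how the two unit costs and their gap follow from the definition above.

    # Minimal sketch of the "gap" efficiency measure defined above.
    # All figures are hypothetical placeholders, not actual Talent Search data.

    def cost_per_participant(appropriation, participants):
        """Annual appropriation divided by the number of participants served."""
        return appropriation / participants

    def cost_per_successful_outcome(appropriation, successful_outcomes):
        """Annual appropriation divided by the number of participants who
        persist in high school or enroll in college."""
        return appropriation / successful_outcomes

    def efficiency_gap(appropriation, participants, successful_outcomes):
        """Cost per successful outcome minus cost per participant."""
        return (cost_per_successful_outcome(appropriation, successful_outcomes)
                - cost_per_participant(appropriation, participants))

    if __name__ == "__main__":
        appropriation = 143_000_000  # hypothetical annual appropriation, in dollars
        participants = 390_000       # hypothetical number of participants served
        successful = 388_000         # hypothetical participants who persist or enroll

        print(f"Cost per participant:        ${cost_per_participant(appropriation, participants):,.2f}")
        print(f"Cost per successful outcome: ${cost_per_successful_outcome(appropriation, successful):,.2f}")
        print(f"Gap:                         ${efficiency_gap(appropriation, participants, successful):,.2f}")

Because nearly every participant counts as a successful outcome, the two unit costs are close and the gap is small; the gap shrinks toward zero as the share of participants with a successful outcome approaches 100 percent.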

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program is designed to encourage low-income, first-generation middle and high school students to complete high school and pursue a postsecondary degree.

Evidence: Section 402B of the Higher Education Act (HEA) states that the purpose is to identify low-income, first-generation students with college potential and "encourage such youths to complete secondary school and to undertake a program of postsecondary education."

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Data indicate that low-income, first-generation students are not adequately prepared for college, and do not enroll in and complete college at the same rates as students who are less disadvantaged.

Evidence: Data from the report "Coming of Age in the 1990s: The Eighth-Grade Class of 1988 12 Years Later" indicate that 47.9% of low-income students do not attend postsecondary education, compared with 23.3% of middle-income and 4.4% of high-income students. In addition, 43.9% of students whose parents never attended college do not attend college themselves, compared with 22.8% of students whose parents had some college and 5.2% of students whose parents had a bachelor's degree or higher (NCES, 2002, Table 2). A wide range of other data are available in NCES publications.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Talent Search provides help in applying for financial aid, multiple types of counseling, and other forms of assistance. Although similar services are provided by local school districts, the high level of need exceeds the capacity of school counselors. Talent Search complements existing efforts by targeting students not served.

Evidence: Numerous studies indicate that counselor-student ratios in public schools are very high (Blackwater Associates & Savage, 1989; Wells & Gaus, 1991; Yanis & Willner, 1988). Talent Search addresses this issue by targeting low-income students who require additional guidance. Talent Search also provides career and college planning counseling services not available to most low-income students.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence of design problems that limit the program's effectiveness or efficiency. Moreover, positive findings from the program evaluation suggest that the regulatory requirement that a minimum number of students be served works to ensure project efficiency without decreasing project effectiveness.

Evidence: Program regulations require projects to serve at least 600 students, with an initial projected expenditure of approximately $300 per student. The program evaluation's findings that participants were significantly more likely to apply for financial aid and enroll in college suggest that Talent Search is well designed to translate a relatively small Federal investment into positive effects.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: Talent Search appears well targeted to the neediest students who have potential for postsecondary education.

Evidence: The statute requires projects to assure that at least two-thirds of participants are low-income, first-generation college students. Talent Search's profile report (Mathematica, 2002) indicates that 81% of participants are low-income and 88% of participants are first-generation. The report also indicates that students in Talent Search schools have higher rates of participation in the Federal free lunch program than students in other schools (40% v. 25%).

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The primary goal of Talent Search is to increase the postsecondary enrollment rate of participating students. A secondary long-term goal is to increase the percentage of participants applying for financial aid.

Evidence: The GPRA measures track the percentage of college-ready participants who receive assistance in applying for financial aid and the percentage of college-ready participants who enroll in postsecondary education.

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Since Talent Search exceeded previous long-term targets, annual trend data were used to establish new performance targets through 2011. This 5-year time period is consistent with the Department's strategic planning process.

Evidence: The application for financial aid target is 88% by 2011 and the college enrollment target is 81% by 2011.

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The annual goals are the same as the long-term goals. Annual performance data will track progress against short-term targets while also tracking progress against the long-term goals.

Evidence: The GPRA measures track the percentage of participants who receive assistance in applying for financial aid and the percentage of participants who enroll in postsecondary education.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Annual trend data were used to establish targets for incremental improvement.

Evidence: Annual performance targets are included in the program performance plan.

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Talent Search projects all work toward the annual and long-term goals of the program. Although performance targets have yet to be established, the goals have been in place and widely accepted for some time. Annual performance reports (APRs) are required of all grantees and their performance is measured on the basis of how well they meet program goals.

Evidence: Program regulations clearly articulate the program goals (34 CFR 643.1) and indicate that grant awards, continuation funding, and prior experience points are awarded partly on the basis of how well projects achieve these goals.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: A final evaluation report on the impact of Talent Search on college enrollment rates, the first of its kind, is scheduled to be released in 2005. Although the evaluation was conducted in part to determine the feasibility of using state data sources as a more cost effective means of measuring program impacts, the evaluation was of sufficient scope and quality. The next study relating to Talent Search will examine the extent to which TRIO provides a continuous pipeline of services and the effect of that pipeline on postsecondary education outcomes.

Evidence: The Talent Search evaluation, conducted by Mathematica Policy Research, used quasi-experimental matching techniques to compare Talent Search participants with similar students in 3 states. Since the program is not state-administered and the sample size was quite large (approximately 6,000 students in both the treatment and comparison groups), findings from the evaluation are suggestive of the effectiveness of the Talent Search program as a whole. As part of TRIO's long-term evaluation strategy, the next study relating to Talent Search will fill a gap in performance information regarding the effects of TRIO on students who receive a continuous pipeline of services.

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: In the past, funding for Talent Search has not been directly linked with program performance because few data were available. However, in the FY 2006 budget, the Department justified the proposed elimination of Talent Search partly on the basis of its lack of demonstrated performance. Now that a significant quantity of performance and evaluation data are available, the Department is using this data to help develop its budgetary proposal for FY 2007. The Department also will use this data to guide its allocation of funds for FY 2006, when Talent Search is scheduled to hold a new competition.

Evidence: The FY 2006 budget justification proposed to eliminate Talent Search given its "lack of demonstrated ability to help disadvantaged high school youth prepare for and enroll in college."

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Talent Search performance report was revised to collect data that is more useful for assessing performance and making budgetary decisions on an annual basis. Talent Search developed and updated performance targets, and recently added a significant quantity of data to its program performance plan. Additionally, the Department developed a long-term evaluation strategy for TRIO, including Talent Search, and is using performance and evaluation data to inform FY 2007 budget discussions.

Evidence: Performance data are available for 4 consecutive years, and targets are being updated and developed for FYs 2006-2011.

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Since the last PART assessment, TRIO has begun using performance information to improve the performance of Talent Search. As a result of audit findings and compliance monitoring, largely with respect to Talent Search projects, TRIO strengthened the verification requirements for performance reporting and focused on financial management deficiencies. Changes also were made to the application package for the FY 2006 competition to increase compliance and improve performance data by ensuring projects are held to a consistent standard of performance. TRIO developed a quarterly newsletter to improve communication with grantees. TRIO uses OPE's new electronic grant monitoring system and recently focused site visits on non-profit grant recipients, which were found to be at higher risk of non-compliance. Additionally, TRIO has long used data to assess the degree to which Talent Search grantees achieve their stated goals and to award prior experience points in each program competition.

Evidence: The application for the FY 2006 Talent Search competition specifies that projects must demonstrate need based on the demographics and conditions of their target schools rather than a general need for serving disadvantaged students, and it specifies how participants are to be counted. The Federal Register notice will set a minimum grant award of $220,000 and it will state that projects will not be able to renegotiate their goals. In addition to reporting the number of students that apply for college, projects now are required to report the number of students that enroll in college. TRIO has conducted 100 site visits each year for the past 2 years, significantly more than were conducted in previous years. In FY 2004, 12 TRIO grants were closed due to one of the following reasons: audit issues, accreditation, lack of substantial progress, or no regulatory authority. An additional 8 grants received reduced funding due to a failure to make substantial progress. TRIO has held teleconferences, shared information, and conducted site visits focused on inaccurate indirect cost reporting, failure to adequately expend grant funds in a timely fashion, and failure to properly document and serve the required number of eligible students. Three editions of the newsletter have been published. In addition to allocating up to 15 prior experience points to current grantees on the basis of performance data, staff work with project directors to ensure that goals are attainable yet ambitious based on information included in newly funded proposals and staff's assessment of reports from the grantees. Reports from grantees also were the basis for TRIO providing technology supplements to improve project performance.

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to EDPAS, which links employee performance to relevant Strategic Plan goals and action steps and is designed to measure the degree to which a manager contributes to improving program performance. TS managers are identified, and their EDPAS agreements are linked with the performance of their projects or the program. Additionally, funding decisions for current grant recipients are based on prior performance.

Evidence: The EDPAS standards for TRIO program managers require them to set forth strategies for implementing GPRA and Strategic Plan initiatives related to TRIO. TRIO project officers are held accountable for assessing project performance, calculating prior experience points, and monitoring the progress of projects in achieving program goals and objectives. TS projects receive continuation funding on the basis of their reported progress in achieving the program goals, and they receive up to 15 points for their prior performance when applying for a new grant.

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated in a timely manner. In response to IG reports, TRIO has focused attention on appropriate financial management procedures. TRIO staff use GAPS and OPE's new electronic grant monitoring system.

Evidence: TRIO grants are typically obligated by May of each year, well ahead of the Department average. Additionally, new competitions are held 6 to 8 months ahead of schedule to ensure that new projects have adequate time to begin providing services on the project start date. Staff monitor grantees' draw-down of funds by reviewing grantees' financial reports electronically. A memo explaining the consequences of excessive draw-downs was sent to all TRIO grantees. In addition, staff have begun monitoring grants that are not spending funds as quickly as anticipated. Grantees must submit a written request before any accounts are reopened after the close of a grant cycle, and TRIO no longer allows funds that are to be returned to be used for current grant activities.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: TRIO developed a common efficiency measure, the annual cost per successful outcome, to assess the cost effectiveness of each TRIO program and project on an annual basis. In June 2005, TRIO began a pilot effort to implement its efficiency measures. The implementation strategy includes communicating with grantees, calculating efficiency data in a variety of ways, publishing efficiency data, and setting targets for improved efficiency. TRIO is working with projects to focus on methods for comparing and improving efficiency levels over time. Additionally, prior experience points serve as a performance incentive for grantees.

Evidence: Preliminary calculations indicate the annual cost per successful outcome for Talent Search was $379 in 2003. TRIO's fall newsletter discusses the efficiency measure strategy. Up to 15 prior experience points are awarded to all eligible applicants during a competitive cycle.

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: TRIO urges coordination with other Federal and non-Federal projects to create a pipeline of services through college. Some projects share a project director who oversees all programs, with coordinators providing day-to-day management. Additionally, TRIO administers the Child Care Access Means Parents In School (CCAMPIS) program and works closely with the GEAR UP and Institutional Development and Undergraduate Education Service (IDUES) programs.

Evidence: Talent Search projects are often linked with Upward Bound, GEAR UP, and Student Support Services projects, including a number of institutions that are the recipients of multiple such grants. The spring 2005 TRIO newsletter includes information about IDUES and CCAMPIS activities.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: No internal control weaknesses have been identified in the TRIO program office, which follows Departmental guidelines for financial management. TRIO staff also use OPE's new electronic grant monitoring system.

Evidence: The IG audit of TRIO's financial controls found no evidence of erroneous payments or other such material weaknesses.

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: TRIO developed a plan for responding to IG concerns regarding insufficient grantee monitoring and unclear reporting requirements. Additionally, an efficiency measure was developed and TRIO has begun to focus on ways that Talent Search projects can become more efficient.

Evidence: The TRIO program office developed a detailed monitoring plan that emphasizes conducting on-site visits to newly funded and high-risk projects (those with evidence of mismanagement, constant turnover in leadership, etc.). A full-time TRIO staff member is now dedicated to project oversight, and TRIO has reached agreement with the Federal Student Aid office to conduct joint site visits. The Talent Search efficiency measure was discussed in the FY 2006 budget justification, and the spring 2005 TRIO newsletter includes information on resources that can be used to improve projects, including FSA's online counselor and mentor handbook.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The TRIO program office provides outreach and technical assistance to new grantees. In addition, independent peer review panels are used to score and rank all applications. Although the statute and regulations provide up to 15 bonus points for prior experience, TRIO has tightened the process for awarding those points to ensure that the competitive preference given to existing grantees is based on demonstrated performance. Following HEA reauthorization, TRIO also plans to pursue regulatory changes to better link prior experience points with achievement of the key program outcomes.

Evidence: In the 2002 competition, only 37% of Talent Search grantees that applied for new awards received the maximum 15 prior experience points, down from 72% in the 1998 competition. This resulted in a 22% success rate for first-time applicants. It also reflected a trend that has continued in the other TRIO programs. In 2003, only 28% of Upward Bound grantees and 32% of Upward Bound Math/Science grantees received the maximum 15 points, down from 68% and 72% in 1999. In 2005, only 50% of Student Support Services grantees received the maximum 15 points, down from 74% in 2001. TRIO administration funds are used to pay for the peer review process. 100% of grants are subject to review.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: New procedures were implemented for improving the monitoring of expenditures based on IG concerns, including joint audits with IG and the use of OPE's new electronic grant monitoring system.

Evidence: In addition to increasing efforts at on-site monitoring, the TRIO program office continues to review all reports (APRs, partnership agreements, interim performance reports, audits) that grantees are required to submit and make follow-up calls to clarify questions and concerns. A full-time TRIO staff member is now dedicated to project oversight, and TRIO has reached agreement with the Federal Student Aid office to conduct joint site visits.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: While the TRIO program office collects and compiles data from performance reports on an annual basis and produces a program profile report biennially, it has not yet made disaggregated grantee-level data available to the public. TRIO continues working to increase the timeliness of making the data available to the public, and to make comparisons with data on participation in the Federal student financial assistance programs.

Evidence: Program profile reports prepared by independent contractors are available on TRIO's website (www.ed.gov/about/offices/list/ope/trio). The next report is scheduled to be released in 2005. The reports include complete performance data aggregated at the program level. Data also are disaggregated by type of institution and, in some cases, by region or state.

NO 0%
Section 3 - Program Management Score 90%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Talent Search exceeded its long-term performance goal for college enrollment, its primary performance goal. Targets were recently established for increasing the percentage of students who apply for financial aid, so Talent Search has not been able to show progress toward achieving that long-term goal.

Evidence: In 2003, Talent Search established long-term targets for college enrollment in 2007-2013 using baseline data from 2000. The targets were 75% in 2007, increasing to 78% in 2013. Actual data reported in the program performance plan show the college enrollment rate increasing to 77% in 2001, 78% in 2002, and 79% in 2003.

LARGE EXTENT 17%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Talent Search exceeded each of its annual performance goals for college enrollment. Targets were recently established for increasing the percentage of students who apply for financial aid, so Talent Search has not been able to show progress toward achieving those annual goals.

Evidence: Actual data reported in the program performance plan show the college enrollment rate increasing to 77% in 2001, 78% in 2002, and 79% in 2003. Annual targets had been set at the 73% baseline level for those years and were set to increase by 0.5% annually thereafter.

LARGE EXTENT 17%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department recently developed an efficiency measure for TS and a strategy for improving program efficiency. Although the average cost per successful annual outcome decreased in 2004, the Department is still developing efficiency targets.

Evidence: The average cost per successful annual outcome decreased from $379 in 2003 to $367 in 2004. A successful annual outcome is defined as a participant who persists toward or enrolls in college.

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: Although similar services are provided by local school districts, there are few data to demonstrate the effectiveness of those programs. And although some programs like Upward Bound target similar students, any comparison would be inherently difficult because of fundamental differences between the program designs: Talent Search encourages students to "complete secondary school and to undertake a program of postsecondary education," whereas Upward Bound generates "skills and motivation necessary for success in education beyond secondary school." Nevertheless, the successful performance of Talent Search is apparent even though there are no comparable programs with outcome data against which it can be judged. An increasing percentage of Talent Search participants are enrolling in college, and the program evaluation suggests that Talent Search is effective.

Evidence:  

NA  %
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The Talent Search evaluation was independently conducted and of sufficient quality and scope. The findings suggest that Talent Search has positive effects on applying for financial aid and enrolling in college.

Evidence: The Talent Search evaluation, conducted by Mathematica Policy Research, used quasi-experimental matching techniques to compare Talent Search participants with similar students in 3 states. Since the program is not state-administered and the sample size was quite large (approximately 6,000 students in both the treatment and comparison groups), findings from the evaluation are suggestive of the effectiveness of the Talent Search program as a whole. The differences between participants and similar students in applying for financial aid varied by state but ranged from 13 to 27 percentage points. The differences between participants and similar students in college enrollment also varied by state but ranged from 5 to 18 percentage points.

LARGE EXTENT 17%
Section 4 - Program Results/Accountability Score 50%


Last updated: 09062008.2005SPR