ExpectMore.gov


Detailed Information on the
TRIO Student Support Services Assessment

Program Code 10000208
Program Title TRIO Student Support Services
Department Name Department of Education
Agency/Bureau Name Office of Postsecondary Education
Program Type(s) Competitive Grant Program
Assessment Year 2005
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 90%
Program Results/Accountability 58%
Program Funding Level
(in millions)
FY2007 $273
FY2008 $281
FY2009 $282

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Taking steps to better link rewards for past performance with the achievement of key program goals.

Status: Action taken, but not completed. Comments: For the 2009 competition, the Department will examine strategies to ensure that past performance is again fairly and consistently assessed when assigning points for prior experience.
2007

Conducting a study on promising practices in the Student Support Services program that can be used to help improve grant outcomes.

Status: Action taken, but not completed. Comments: The Department began the study in the fall of 2006 and anticipates producing a draft report in 2008 and releasing a final report in the summer of 2009.
2008

Implementing a strategy to use performance and/or efficiency measures to improve program performance.

Status: No action taken. Comments: The Department will examine strategies to use grantee-level performance and efficiency data to identify areas for program improvement.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

Monitoring and making use of project performance data to improve the program.

Status: Completed. Comments: In the 2005 competition, the Department changed requirements for establishing objectives and tracking students to ensure projects are held to a consistent standard of performance.
2003

Exploring policies that would reduce statutory and regulatory barriers faced by qualified first-time grantees in order to encourage their participation in the program.

Status: Completed. Comments: The Department took action to ensure that prior experience points are awarded for demonstrated performance, and continues to examine ways to better link past performance with the key program goals.
2003

Collecting second-year data for performance measures.

Status: Completed. Comments: The Department has collected multiple years of data for each performance measure, including college completion data for multiple cohorts of participants.
2006

Implementing a strategy to use efficiency measures to improve cost effectiveness in achieving the program goals.

Status: Completed. Comments: The program calculated efficiency measure data for 2003-04 at the project level and shared the information with the TRIO community. Performance data (persistence rate, graduation rate, and the efficiency measure) do not show a strong association with the cost per student, suggesting that efficiency depends more on student demographics and institutional characteristics. The Department will begin examining other program performance data to identify areas for program improvement.
2007

Analyze and make available on the web a second year of program performance and efficiency data.

Status: Completed. Comments: Grantee-level efficiency data for 2003-04 are available on the web at http://www.ed.gov/programs/triostudsupp/efficiencygranteeanalysis.xls. Contractors have begun analysis of 2004-05 grantee-level data.

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: Percentage of SSS participants persisting at the same institution.


Explanation: College persistence rate of program participants. In the 2002 PART, a long-term target of 70% was established for 2007. Current actual performance exceeds that target, so new long-term targets have been established through 2011. The FY 2006 value is the first reporting year of a new SSS grant cycle; therefore, the 56 grantees that were not refunded in 2005-06 and several new grantees that did not receive funding in 2004-05 cannot be included. Performance may not be sustained in FY 2007 when the new grantees are included in the results. (A worked sketch of the rate arithmetic follows the table below.)

Year Target Actual
2000 67.0 67
2001 67.0 70
2002 67.0 72
2003 68.0 72
2004 68.5 73.1
2005 69.0 74.1
2006 72.0 79.1
2007 73.0 data lag [Dec 2008]
2008 73.0
2009 73.5
2010 73.5
2011 74.0
2012 74.0
2013 74.5
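
As an illustration of the cohort arithmetic behind the persistence measure above, here is a minimal sketch in Python. The record layout and field name are hypothetical and are not the actual annual performance report format.

    # Hypothetical sketch of the cohort persistence-rate arithmetic;
    # the record layout is invented, not the actual APR data format.
    def persistence_rate(cohort):
        """Percentage of a first-year cohort enrolled at the same
        institution in the following year."""
        persisters = sum(1 for student in cohort if student["enrolled_year_2"])
        return 100.0 * persisters / len(cohort)

    # Toy cohort of four students, three of whom persist.
    cohort = [
        {"id": "A", "enrolled_year_2": True},
        {"id": "B", "enrolled_year_2": True},
        {"id": "C", "enrolled_year_2": True},
        {"id": "D", "enrolled_year_2": False},
    ]
    print(f"{persistence_rate(cohort):.1f}%")  # prints 75.0%

In the actual measure, the cohort is each project's reported participants; the toy data above serve only to show the arithmetic.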
Long-term/Annual Outcome

Measure: The percentage of SSS freshmen at 2-year institutions completing an Associate's degree at the original institution or transferring to a 4-year institution within 3 years.


Explanation: Based on evaluation data, a 2007 long-term target was previously set at 31% for the combined college completion rate of participants in 2-year and 4-year institutions. Project-level data indicate that SSS performance is improving but, largely due to the changing proportion of 2-year and 4-year institutions in SSS, college completion data are now being tracked separately and new targets were developed for 2-year and 4-year institutions.

Year Target Actual
2001 N/A 23.1
2002 N/A 26
2003 N/A 27.7
2004 N/A 25.6
2005 N/A 24.5
2006 27.0 24.6
2007 27.5 data lag [Dec 2008]
2008 27.5
2009 28.0
2010 28.0
2011 28.5
2012 28.5
2013 29.0
Long-term/Annual Outcome

Measure: Percentage of SSS freshmen at 4-year institutions completing a Bachelor's degree at the original institution within 6 years.


Explanation: Based on evaluation data, a 2007 long-term target was previously set at 31% for the combined college completion rate of participants in 2-year and 4-year institutions. Project-level data indicate that SSS performance is improving but, largely due to the changing proportion of 2-year and 4-year institutions in SSS, college completion data are now being tracked separately and new targets were developed for 2-year and 4-year institutions. The FY 2006 value is the first reporting year of a new SSS grant cycle; therefore, the 28 grantees that were not refunded in 2005-06 and several new grantees that did not receive funding in 2004-05 cannot be included. Performance may not be sustained in FY 2007 when the new grantees are included in the results.

Year Target Actual
2004 N/A 28.1
2005 N/A 29.4
2006 28.0 33.5
2007 29.0 data lag [Dec 2008]
2008 29.0
2009 29.5
2010 29.5
2011 30.0
2012 30.0
2013 30.5
Annual Efficiency

Measure: The gap between the cost per successful outcome and the cost per program participant.


Explanation: A successful annual outcome is defined as a participant who persists toward or completes a college degree. The gap is the difference between two ratios: the cost per successful outcome (the annual appropriation divided by the number of students who, during that school year, persisted at the same institution, transferred from a 2-year to a 4-year school, or completed their degree) and the cost per student served (the annual appropriation divided by the number of students receiving services during that school year). Improvement does not necessarily indicate a decrease in overall program costs, but does result from a decrease in the cost per successful outcome. (A worked sketch of this arithmetic follows the table below.)

Year Target Actual
2003 N/A $263
2004 N/A $252
2005 N/A $245
2006 N/A $209
2007 $239 data lag [Dec 2008]
2008 $236 data lag [Dec 2009]
2009 $233
2010 $223
2011 $213
2012 $203
2013 $193
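
To make the gap definition concrete, the following minimal sketch recomputes the measure from an appropriation and participant counts. The figures are hypothetical, chosen only so the result lands near the reported 2003 values (a cost per successful outcome of about $1,528 and a gap of $263); they are not actual program data.

    # Sketch of the efficiency "gap" measure defined above; the
    # appropriation and counts are hypothetical, not program data.
    def efficiency_gap(appropriation, successes, participants):
        cost_per_success = appropriation / successes     # cost per successful outcome
        cost_per_student = appropriation / participants  # cost per student served
        return cost_per_success - cost_per_student

    gap = efficiency_gap(260_000_000, 170_000, 205_000)
    print(f"${gap:,.0f}")  # prints roughly $261

Because both ratios share the appropriation as their numerator, the gap narrows as the share of served students with a successful outcome rises, which is why an improving gap reflects a falling cost per successful outcome rather than a smaller appropriation.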

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program is designed to provide support services to disadvantaged college students to increase their retention and completion rates.

Evidence: Section 402D of the Higher Education Act (HEA) states that the purpose is "to increase college retention and graduation rates" for low-income, first generation, and disabled college students.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Data indicate that low-income, first-generation, and disabled college students do not attend and graduate from college at the same rates as students who are less disadvantaged.

Evidence: Data from the report "Descriptive Summary of the 1995-96 Beginning Postsecondary Students: Six Years Later" indicate that, of students who enrolled in public 4-year colleges, 49.7% of students with family incomes below $45,000 completed a bachelor's degree, compared to 58.0% of students with family incomes between $45,000 and $70,000 and 67.1% of students with family incomes above $70,000. In addition, data indicate that 39.0% of students whose parents never attended college completed a bachelor's degree, compared to 47.7% of students whose parents had some college and 62.4% of students whose parents had bachelor's degrees (NCES, 2002, Table 2.2-C). A wide range of other data are available in NCES publications.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Student Support Services (SSS) is designed to provide services to students with demonstrated need for assistance. SSS is distinguished from other programs by the intensity of its services and by the targeting of those services to the college-bound population for whom they have the highest impact.

Evidence: Although most institutions have programs that offer remedial education and support services, the SSS evaluation determined that, "there remains a continuing need to address the problem of equal opportunity in higher education and to have higher education serve economically, culturally, and academically disadvantaged youth."

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: SSS has no major design flaws. Furthermore, there is no evidence that other approaches, like leveraging community resources, are more effective in providing support services and improving graduation rates.

Evidence: The SSS evaluation found a wide range of positive impacts, especially for participants who receive supplemental services offered outside of SSS. The evaluation also found that SSS increases the amount of supplemental services that participants receive. These findings suggest that the program is well designed.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: SSS appears well targeted to the students who are most at risk of dropping out of college.

Evidence: The statute requires projects to assure that at least two-thirds of participants are disabled or low-income, first-generation college students. SSS's profile report (Mathematica, 2004) indicates that 74% of participants are low-income, 79% of participants are first-generation, and 13% of participants are disabled.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The primary goals of SSS are to increase the college persistence and completion rates of participating students.

Evidence: The GPRA measures track the percentage of participants who persist from their 1st to their 2nd year of college and the percentage of participants who complete any degree at their original institution within 6 years.

YES 13%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Since SSS exceeded previous long-term targets, annual trend data were used to establish new performance targets through 2011. This 5-year time period is consistent with the Department's strategic planning process.

Evidence: The college persistence target is 74% by 2011 and the college completion target is 32% by 2011.

YES 13%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The annual goals are the same as the long-term goals. Annual performance data will track progress against short-term targets while also tracking progress against the long-term goals.

Evidence: The GPRA measures track the percentage of participants who persist from their 1st to their 2nd year of college and the percentage of participants who complete any degree at their original institution within 6 years.

YES 13%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Annual trend data were used to establish targets for incremental improvement.

Evidence: Annual performance targets are included in the program performance plan.

YES 13%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: SSS projects all work toward the annual and long-term goals of the program. Annual performance reports (APRs) are required of all grantees and their performance is measured on the basis of how well they meet the program goals.

Evidence: Program regulations clearly articulate the program goals (34 CFR 646.1) and indicate that grant awards, continuation funding, and prior experience points are awarded partly on the basis of how well projects achieve these goals.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: A final evaluation report on the impact of SSS on college completion rates, the first in over two decades, is scheduled to be released in 2005. The evaluation was of sufficient scope and quality. The next SSS study will examine the extent to which TRIO provides a continuous pipeline of services and the effect of that pipeline on postsecondary education outcomes.

Evidence: The SSS evaluation, conducted by Westat, was a quasi-experimental design with stratified, random treatment and matched comparison samples of 2,900 students each. The findings, which were subject to internal and external review, are reflective of the SSS program as a whole. As part of TRIO's long-term evaluation strategy, the next SSS study will fill a gap in performance information regarding the effects of TRIO on students who receive a continuous pipeline of services.

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: While ED has made strides in more closely tying its SSS budget requests to performance data, no specific goals have been set to link increased intensity of services and grant aid to increased outcomes. Funds for SSS have been requested for these activities, and the SSS evaluation and other research suggest that these approaches will help SSS achieve its performance goals.

Evidence: Since the FY 2001 budget, the Department has requested and allocated significant budget increases for SSS projects to increase services and provide grant aid. In the FY 2005 competition, to ensure that first-time projects provide a level of services similar to other projects, the Department included guidelines on the minimum number of students projects should serve and increased the maximum grant award for first-time projects to $220,000. Over the last 5 years, the budget for SSS has increased by more than $90 million (or 50%), significantly more than every other TRIO program. The increases have allowed projects to provide more intensified services and grant aid.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department developed a long-term evaluation strategy for TRIO, including the SSS program. Additionally, the Department continues to improve strategic planning by finding new ways to link SSS budgetary decisions with performance information and by reexamining the program performance measures. Based on recent trend data, SSS may revise its performance measures to reflect the differences between 2-year and 4-year institutions in their student persistence and completion rates.

Evidence: In FY 2005, funds for grant aid were included in the base grant awards, giving projects the flexibility to determine the appropriate mixture of services. The FY 2006 budget justification noted that the Department is considering changes in how performance data are calculated and displayed because 2-year institutions, which have significantly lower student completion rates over a 6-year period, are comprising an increasing percentage of SSS grantees.

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Since the last PART assessment, TRIO has begun using performance information to improve the performance of Student Support Services. Changes were made in the FY 2005 competition to prevent applicants from providing misleading goals and to ensure projects are held to a consistent standard of performance. As a result of audit findings and compliance monitoring, TRIO strengthened the verification requirements for performance reporting and focused on financial management deficiencies. The requirement to submit student-level data, which are subject to less manipulation, helps ensure data quality. TRIO also developed a quarterly newsletter to improve communication with grantees. TRIO uses OPE's new electronic grant monitoring system, and has long used performance data to assess the degree to which Student Support Services grantees achieve their stated goals and to award prior experience points in each program competition.

Evidence: The FY 2005 application instructed that: "The objectives for persistence and graduation must be measured based upon cohorts and must address the need identified in the 'Need for the Project' section. The measurement for two-year institutions should be over a three-year period and four-year institutions should measure over a six-year period." "Multi-layered or compound objectives" were rejected, and the Federal Register notice stated that projects would not be able to renegotiate their goals. TRIO has conducted 100 site visits each year for the past 2 years, significantly more than were conducted in previous years. In FY 2004, 12 TRIO grants were closed due to one of the following reasons: audit issues, accreditation, lack of substantial progress, or no regulatory authority. An additional 8 grants received reduced funding due to a failure to make substantial progress. TRIO has held teleconferences, shared information, and conducted site visits focused on inaccurate indirect cost reporting, failure to adequately expend grant funds in a timely fashion, and failure to properly document and serve the required number of eligible students. Three editions of the newsletter have been published. In addition to allocating up to 15 prior experience points to current grantees on the basis of performance data, staff work with project directors to ensure that goals are attainable yet ambitious based on information included in newly funded proposals and staff assessments of reports from the grantees. Reports from grantees also were the basis for TRIO providing technology supplements to improve project performance.

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to EDPAS, which links employee performance to relevant Strategic Plan goals and action steps and is designed to measure the degree to which a manager contributes to improving program performance. SSS managers are identified, and their EDPAS agreements are linked with the performance of projects or the program. Additionally, funding decisions for current grant recipients are based on prior performance.

Evidence: The EDPAS standards for TRIO program managers require them to set forth strategies for implementing GPRA and Strategic Plan initiatives related to TRIO. TRIO project officers are held accountable for assessing project performance, calculating prior experience points, and monitoring the progress of projects in achieving program goals and objectives. SSS projects receive continuation funding on the basis of their reported progress in achieving the program goals, and they receive up to 15 points for their prior performance when applying for a new grant.

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated in a timely manner. In response to IG reports, TRIO has focused attention on appropriate financial management procedures. TRIO staff use GAPS and OPE's new electronic grant monitoring system.

Evidence: TRIO grants are typically obligated by May of each year, far exceeding the Department average. Additionally, new competitions are held 6 to 8 months ahead of schedule to ensure that new projects have adequate time to begin providing services on the project start date. Staff monitor grantees' draw-down of funds by reviewing grantees' financial reports electronically. A memo explaining the consequences of excessive draw-downs was sent to all TRIO grantees. In addition, staff have begun monitoring grants that are not spending funds as quickly as anticipated. Grantees must submit a written request before any accounts are reopened after the close of a grant cycle, and TRIO no longer allows funds that are to be returned to be used instead for current grant activities.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: TRIO developed a common efficiency measure, the annual cost per successful outcome, to assess the cost effectiveness of each TRIO program and project on an annual basis. In June 2005, TRIO began a pilot effort to implement its efficiency measures. The implementation strategy includes communicating with grantees, calculating efficiency data in a variety of ways, publishing the data, and setting targets for improved efficiency. TRIO is working with projects to focus on methods for comparing and improving efficiency levels over time. Additionally, prior experience points serve as a performance incentive for grantees.

Evidence: The annual cost per successful outcome for SSS was $1,528 in 2003. TRIO's fall newsletter discusses the efficiency measure strategy. Up to 15 prior experience points are awarded to all eligible applicants during a competitive cycle.

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: TRIO urges coordination with other Federal and non-Federal projects to create a pipeline of services through college. Some projects share a project director, who oversees all programs while coordinators provide day-to-day management. Additionally, TRIO administers the Child Care Access Means Parents In School (CCAMPIS) program and works closely with the GEAR UP and Institutional Development and Undergraduate Education Service (IDUES) programs.

Evidence: SSS projects are often linked with Upward Bound, GEAR UP, and Talent Search projects, including a number of institutions that are the recipients of multiple such grants. The spring 2005 TRIO newsletter includes information about IDUES and CCAMPIS activities.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: The TRIO program office has not been found to have internal control weaknesses and follows Departmental guidelines for financial management. TRIO staff also use OPE's new electronic grant monitoring system.

Evidence: The IG audit of TRIO's financial controls found no evidence of erroneous payments or other such material weaknesses.

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: TRIO developed a plan for responding to IG concerns regarding insufficient grantee monitoring and unclear reporting requirements. Additionally, an efficiency measure was developed and TRIO has begun to focus on ways that SSS projects can become more efficient.

Evidence: The TRIO program office developed a detailed monitoring plan that emphasizes conducting on-site visits to newly funded and high-risk projects (those with evidence of mismanagement, constant turnover in leadership, etc.). A full-time TRIO staff member is now dedicated to project oversight, and TRIO has reached agreement with the Federal Student Aid office to conduct joint site visits. The SSS efficiency measure was discussed in the FY 2006 budget justification, and the cover article in the spring 2005 TRIO newsletter focuses on learning communities as a more efficient and effective way to improve project performance.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The TRIO program office provides outreach and technical assistance to new grantees. In addition, independent peer review panels are used to score and rank all applications. Although the statute and regulations provide up to 15 bonus points for prior experience, TRIO has tightened the process for awarding those points to ensure that the competitive preference given to existing grantees is based on demonstrated performance. Following HEA reauthorization, TRIO also plans to pursue regulatory changes to better link prior experience points with achievement of the key program outcomes.

Evidence: In the 2005 competition, only 50% of Student Support Services grantees that applied for new awards received the maximum 15 prior experience points, down from 74% in 2001. This resulted in a 19% success rate for first-time applicants. It also reflected a trend in TRIO. In 2002, only 37% of Talent Search grantees that applied for new awards received the maximum 15 prior experience points, down from 72% in the 1998 competition. In 2003, only 28% of Upward Bound grantees and 32% of Upward Bound Math/Science grantees received the maximum 15 points, down from 68% and 72% in 1999. TRIO administration funds are used to pay for the peer review process. 100% of grants are subject to review.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: New procedures were implemented for improving the monitoring of expenditures based on IG concerns, including joint audits with IG and the use of OPE's new electronic grant monitoring system.

Evidence: In addition to increasing efforts at on-site monitoring, the TRIO program office continues to review all reports (APRs, partnership agreements, interim performance reports, audits) that grantees are required to submit and make follow-up calls to clarify questions and concerns. A full-time TRIO staff member is now dedicated to project oversight, and TRIO has reached agreement with the Federal Student Aid office to conduct joint site visits.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: While the TRIO program office collects and compiles data from performance reports on an annual basis and produces a program profile report biennially, it has not yet made disaggregated grantee-level data available to the public. TRIO continues working to increase the timeliness of making the data available to the public, and to make comparisons with data on participation in the Federal student financial assistance programs.

Evidence: A program profile report prepared by an independent contractor is available on TRIO's website (www.ed.gov/about/offices/list/ope/trio). The next report is scheduled to be released in 2005. The reports include complete performance data aggregated at the program level. Data also are disaggregated by type of institution and, in some cases, by region or state.

NO 0%
Section 3 - Program Management Score 90%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: SSS exceeded its long-term performance goal for college persistence. SSS has not yet met its long-term goal for the combined college completion rate of participants in 2-year and 4-year institutions, largely due to the increasing proportion of 2-year institutions participating in SSS. Separate college completion data for participants in 2-year and 4-year institutions indicate that SSS performance is improving and on pace to increase by a total of more than 2 percentage points by 2007.

Evidence: In 2002, SSS established long-term targets for 2007 using baseline data from the program evaluation. The targets were 70% for college persistence and 31% for college completion (2 percentage points higher than 2000 baseline data). SSS has met or exceeded the college persistence target every year since 2001. The first project-reported data on college completion (data for 2004) indicate that a combined 26% of participants completed a degree within 6 years at the same institution, less than baseline data from the evaluation. However, the proportion of 4-year institutions participating in SSS decreased from 59% during the time of the evaluation to 51% in 2004. 2004 data for 2-year institutions indicate that 26% of participants completed an Associate's degree or transferred to a 4-year institution within 3 years, significantly more than the 8% of participants who were found to do so in the program evaluation. 2004 data for 4-year institutions indicate that 28% of participants completed a Bachelor's degree within 6 years at the same institution (the same percentage as were found to do so in the program evaluation), and subsequent cohorts of participants are on pace for higher college completion rates. To better measure SSS performance, college completion data are now being tracked separately for 2-year and 4-year institutions. New long-term targets were developed for all SSS measures through 2011.

LARGE EXTENT 17%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: SSS exceeded each of its annual performance goals for college persistence. SSS did not meet its 2004 goal of 30% for the combined college completion rate of participants in 2-year and 4-year institutions, largely due to the increasing proportion of 2-year institutions participating in SSS. However, separate college completion data for participants in 2-year and 4-year institutions indicate that SSS performance has improved by more than 0.5 percentage points each year.

Evidence: Using evaluation data as the baseline, annual targets for SSS had been set to increase by as much as 0.5 percentage points annually. Actual data reported in the program performance plan show the college persistence rate increasing steadily from 67% in 2000 to 73% in 2004. The first project-reported data on college completion (data for 2004) indicate that a combined 26% of participants completed a degree within 6 years at the same institution, less than the baseline level from the evaluation. However, the proportion of 4-year institutions participating in SSS decreased from 59% during the time of the evaluation to 51% in 2004. Data for participants in 2-year institutions indicate that the college completion rate increased more than 0.5 percentage points each year between 2001 and 2003. To better measure SSS performance, college completion data are now being tracked separately for 2-year and 4-year institutions. New annual targets were developed for all SSS measures.

LARGE EXTENT 17%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department recently developed an efficiency measure for SSS and a strategy for improving program efficiency. Although the average cost per successful annual outcome decreased in 2004, the Department is still developing efficiency targets.

Evidence: The average cost per successful annual outcome decreased from $1,528 in 2003 to $1,510 in 2004. A successful annual outcome is defined as a participant who persists toward or completes a college degree.

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: Although there are other State and local programs that provide support services to disadvantaged college students, there are few data to demonstrate the effectiveness of those programs, and any comparison would be inherently difficult given differences in scope between the programs. Nevertheless, the successful performance of SSS is apparent even though there are no comparable programs with outcome data against which it can be judged. SSS has significant impacts on participants and is well targeted to those most in need of services.

Evidence:  

NA  %
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The SSS evaluation was independently conducted and of high quality and scope. The preponderance of evidence suggests that SSS has positive effects on college persistence, transfer rates from 2-year to 4-year institutions, and degree completion. An evaluation design limitation, the inability to collect information to distinguish the types of services SSS participants receive after their 1st year in the program, is the only reason why a high level of certainty is not attached to the findings.

Evidence: The SSS evaluation, conducted by Westat, was a quasi-experimental design, with random treatment and matched comparison samples of 2,900 students each. The evaluation was based on a random cross-section of projects, so the findings are reflective of the SSS program as a whole. Six methodologies were used to measure and compare impacts. The impacts for participants who received SSS services in their 1st year and SSS or other supplemental services after their 1st year were consistently large. For example, the impact on Bachelor's degree completion ranged from 8 to 15 percentage points. The impacts based solely on participants who received services in their 1st year of the program were somewhat limited. For example, the impact on Bachelor's degree completion ranged from 0 to 6 percentage points. It is important to note that the study found SSS participants to be more disadvantaged than students in the comparison sample, suggesting the program is well targeted.

YES 25%
Section 4 - Program Results/Accountability Score 58%


Last updated: 09062008.2005SPR