ExpectMore.gov


Detailed Information on the
Supplemental Educational Opportunity Grants Assessment

Program Code 10001033
Program Title Supplemental Educational Opportunity Grants
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Block/Formula Grant
Assessment Year 2003
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 62%
Program Management 56%
Program Results/Accountability 20%
Program Funding Level
(in millions)
FY2007 $771
FY2008 $757
FY2009 $0

Ongoing Program Improvement Plans

Year Began: 2006
Improvement Plan: Correct the funding allocation formula as part of the reauthorization of the Higher Education Act by ensuring that funds reach postsecondary institutions with the highest proportion of needy students.
Status: Action taken, but not completed
Comments: These proposals were included in the Administration's Higher Education Act reauthorization proposal; beginning with the FY 2008 President's Budget, the Administration has proposed to eliminate SEOG, which is duplicative of the larger and more broadly available Pell Grant program. The Department continues to encourage Congress to include the proposals in reauthorization legislation.

Year Began: 2006
Improvement Plan: In 2004, begin to collect data for the SEOG program sufficient to measure program performance and reconcile financial data. These data should support the Education Department's new performance measurement approach, which tracks program success in improving student persistence and graduation.
Status: Action taken, but not completed
Comments: The Department continues to explore a number of alternative data sources and methodological approaches for this measure. To date, a source of reliable data for this measure has not been identified. The Department will meet with OMB this spring to determine the feasibility of collecting program-specific data.

Completed Program Improvement Plans

Year Began: 2006
Improvement Plan: In 2004, develop meaningful efficiency measures for this program.
Status: Completed
Comments: Program-specific efficiency measures related to the unit cost of student aid administration have been developed and submitted to the Office of Management and Budget.

Program Performance Measures

Term: Annual
Type: Efficiency

Measure: Persistence: The gap between persistence rates for campus-based aid recipients and for the general student population will decrease each year. [Targets under development.]

Explanation: The persistence rate is defined as the percentage of non-graduating students in a given year who return to continue their studies in the following year. A specific methodology to account for transfers and other data anomalies is under development. (An illustrative formula appears after the data table below.)

Year Target Actual
2005 TBD n/a
2006 TBD n/a
2007 TBD n/a
2008 TBD
2009 TBD
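As a sketch of the definition above (the notation is introduced here for illustration and is not taken from the assessment), the persistence rate and the gap being tracked can be written as:

\[
p_t = \frac{\text{non-graduating students enrolled in year } t \text{ who return in year } t+1}{\text{non-graduating students enrolled in year } t},
\qquad
\text{gap}_t = \left| \, p_t^{\text{general}} - p_t^{\text{aid}} \, \right|
\]

Once baselines are set, the target is that the gap narrows each year, i.e. \(\text{gap}_{t+1} < \text{gap}_t\); how transfers and other data anomalies enter the numerator and denominator is the methodological question still under development.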
Term: Annual
Type: Outcome

Measure: Completion: The gap between completion rates for campus-based aid recipients and for the general student population will decrease each year. [Targets under development.]

Explanation: The completion rate is defined as the percentage of full-time, degree-seeking students who complete their program within 150 percent of the normal time required. (An illustrative formula appears after the data table below.)

Year Target Actual
2005 TBD n/a
2006 TBD n/a
2007 TBD n/a
2008 TBD
2009 TBD
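As with the persistence measure, a sketch of the definition (notation introduced here for illustration only):

\[
\text{completion rate} = \frac{\text{full-time, degree-seeking entrants in a cohort who complete within } 150\% \text{ of normal time}}{\text{full-time, degree-seeking entrants in the cohort}}
\]

For a four-year bachelor's program, 150 percent of normal time is six years; the tracked quantity is again the gap between this rate for campus-based aid recipients and for the general student population.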

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: According to the authorizing statute, the program's purpose is "to provide, through institutions of higher education, supplemental grants to assist in making available the benefits of postsecondary education to qualified students who demonstrate financial need..."

Evidence: The program's purpose is clearly expressed in section 413A of the Higher Education Act of 1965, as amended.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Many needy students qualify for more grant aid than is available under the Pell Grant program. This program offers an additional source of grant aid for some of these students.

Evidence: Over half of the nearly 5 million Pell Grant recipients each year have an expected family contribution of zero. Since the average cost of college significantly exceeds the Pell Grant maximum award, many if not most of these students qualify for additional grant assistance.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The program clearly duplicates the Pell Grant program, as well as other state, local, and institutional grant programs.

Evidence: Virtually all SEOG recipients also receive Pell Grants. Shifting the funds appropriated for SEOG into the Pell Grant program would raise the maximum Pell award by roughly $200. Because the average SEOG award is nearly $750, such an approach would distribute smaller grants more broadly across the Pell-eligible population, rather than concentrating larger awards on the subset of students the program currently reaches. (A rough consistency check of the $200 figure appears below.)

NO 0%
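As a rough consistency check of the figures above (an illustrative back-of-the-envelope calculation, not a computation from the assessment), spreading the program's appropriation of roughly $771 million (the FY 2007 level shown above) across the nearly 5 million annual Pell Grant recipients gives

\[
\frac{\$771 \text{ million}}{\approx 5 \text{ million recipients}} \approx \$155 \text{ per recipient},
\]

which is on the order of the "roughly $200" increase in the maximum Pell award cited above (the exact figure depends on how awards below the maximum would scale), compared with an average SEOG award of nearly $750 concentrated on far fewer students.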
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence of a better existing mechanism to deliver supplemental aid.

Evidence:  

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: The program's institutional allocation formula (i.e., how much program funding is given to each school to offer SEOG aid) is designed to heavily benefit postsecondary institutions that have participated in Campus-Based programs for a long time, at the expense of more recent entrants or new applicants. Since these longstanding institutions do not have a higher proportion of needy students, this allocation formula tends to limit the program's ability to target resources to the neediest beneficiaries.

Evidence: The program's allocation formula is detailed in section 442 of the Higher Education Act of 1965, as amended.

NO 0%
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department has developed common measures for the Campus-Based programs (Work-Study, Supplemental Educational Opportunity Grants, and Perkins Loans). These measures relate to the targeting of Campus-Based aid to low-income students and the impact of such aid on student persistence and graduation rates, benchmarked to the overall population. The Department is working with OMB on developing an appropriate efficiency measure for this program.

Evidence: Department of Education Strategic Plan

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Specific targets and timeframes are shown in the "Measures" tab and are under development. Once completed, they will also be included in the Department's annual performance plans. No annual data are currently available to support these goals.

Evidence: See answer to 2.1

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: See answer to 2.1.

Evidence: Department of Education Strategic Plan; Goal 5, Objectives 5.1, 5.3

YES 12%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: See answer to 2.2

Evidence: See answer to 2.2

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Program partners (i.e., schools) support the goals of the SEOG program, reporting data through the annual Fiscal Operations Report and Application to Participate (FISAP) and meeting program statutory and regulatory requirements, as set out in program participation agreements. Schools also report program data through a variety of Department financial systems, as well as through ongoing surveys such as the Integrated Postsecondary Education Data System (IPEDS). Data from these reports are used in determining program performance.

Evidence: IPEDS, Department of Education financial and program management reports.

YES 12%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Comprehensive studies by the American Council on Education, the National Center for Education Statistics, and the Advisory Committee on Student Financial Assistance, among others, assess the impact grant aid has on the enrollment and persistence of low-income students in higher education.

Evidence: NCES studies include "Student Financing of Undergraduate Education: 1999-2000"; "How Families of Low- and Middle-Income Undergraduates Pay for College: Full-Time Dependent Students in 1999-2000"; and "Low-Income Students: Who They Are and How They Pay for Their Education" (2000). Advisory Committee studies include "Access Denied: Restoring the Nation's Commitment to Equal Educational Opportunity" (February 2001) and "Empty Promises: The Myth of College Access in America" (June 2002).

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: ED has not satisfied the first part of the question because changes in program performance are not identified with changes in funding levels. The program, at this time, does not have sufficiently valid and reliable performance information to assess (whether directly or indirectly) the impact of the Federal investment. However, ED has satisfied the second part of the question: ED's budget submissions show the full cost of the program (including S&E), and ED's FY 2005 integrated budget and performance plan includes the program's annual and long-term goals. [Note: The measures discussed in 2.1 are new and will be reflected in future budget requests.]

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department is working to develop effective, program-specific performance measures, as discussed under 2.1.

Evidence: See 2.1

YES 12%
Section 2 - Strategic Planning Score 62%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: SEOG information is primarily collected through the FISAP, which participating institutions use to report program data to the Department and to apply for continued program participation. The data reported on the FISAP are not sufficient for program management or performance assessment.

Evidence: SEOG program and financial data. FISAP data are neither timely nor internally consistent: because the form requests cumulative rather than annual data, it is almost impossible to reconcile financial information.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to EDPAS, which links employee performance to relevant Strategic Plan goals and action steps and is designed to measure the degree to which a manager contributes to improving program performance. OFSA federal managers are also subject to performance agreements developed under its Performance-Based Organization authority. Postsecondary institutions (the program partners) are held accountable through statutory cohort default rate penalties, annual compliance audits, and periodic program reviews, including site visits by ED. In addition, ED requires institutions participating in the Campus-Based programs to sign program participation agreements. To receive a "Yes," ED needs to: (1) identify for OMB the federal managers for this program; (2) demonstrate the relationship between these managers' performance standards and the program's long-term and annual measures; and (3) demonstrate the relationship between program partners' performance standards and the program's long-term and annual measures.

Evidence:  

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Financial audits and program reviews indicate that funds are obligated in a timely manner and for the intended purpose.

Evidence: Department financial statements and supporting materials and documentation.

YES 11%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: This program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing "One-ED" -- an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements. A "yes" answer is likely once the One-ED process is applied to this program's relevant business functions. [Note: Although the Department is currently developing a unit cost accounting system to measure cost effectiveness in FSA programs, this system is not yet fully in place.]

Evidence:  

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The SEOG program operates effectively within the overall Federal student aid system, taking advantage of shared application and aid disbursement procedures and systems, common institutional and student eligibility regulations, and program reviews.

Evidence: SEOG application and Federal funds disbursement processes; aid award packaging.

YES 11%
3.6

Does the program use strong financial management practices?

Explanation: No financial management deficiencies have been identified for this program; no negative audit reports have been issued. That said, as noted in 3.1, there are problems with the financial data ED collects on the FISAP.

Evidence: Department financial statements and supporting materials and documentation.

YES 11%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The Department is in the process of developing program-specific unit cost measures to better assess management efficiency, and is finishing a data strategy for the Office of Federal Student Aid (OFSA). The Department also plans to conduct a One-ED strategic investment review for OFSA.

Evidence: The Department of Education's One-ED Strategic Investment Review process. Also, the Student Aid Administration PART includes a performance measure related to management efficiency, and information on OFSA's data strategy.

YES 11%
3.BF1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Program participants are subject to regular oversight, including institutional audits and periodic program reviews. These oversight activities, together with program and financial reports, provide sufficient knowledge of grantee activities.

Evidence: See FSA oversight procedures for the Campus-Based programs. However, the Department's Inspector General has concluded that ED should improve its monitoring of postsecondary institutions.

YES 11%
3.BF2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Annual data submitted through the FISAP contain compliance information, but not performance data.

Evidence: FISAP data collection. Program operations and financial reports.

NO 0%
Section 3 - Program Management Score 56%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: Program performance goals are newly established; no long-term data are available.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Program performance goals are newly established; no long-term data are available.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The Department has yet to develop and implement efficiency measures to quantitatively assess performance improvements. The Department is working with OMB on developing an appropriate efficiency measure for this program.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: Program performance goals are newly established; no long-term data are available.

Evidence:  

NO 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Comprehensive studies by the American Council on Education, the National Center for Education Statistics, and the Advisory Committee on Student Financial Assistance, among others, have consistently found that grant aid has a major impact on the enrollment and persistence of low-income students in higher education.

Evidence: NCES studies: "Student Financing of Undergraduate Education: 1999-2000"; "How Families of Low- and Middle-Income Undergraduates Pay for College: Full-Time Dependent Students in 1999-2000"; and "Low-Income Students: Who They Are and How They Pay for Their Education" (2000). Advisory Committee studies: "Access Denied: Restoring the Nation's Commitment to Equal Educational Opportunity" (February 2001) and "Empty Promises: The Myth of College Access in America" (June 2002).

YES 20%
Section 4 - Program Results/Accountability Score 20%


Last updated: 09062008.2003SPR