ExpectMore.gov


Detailed Information on the
Education State Grants for Innovative Programs Assessment

Program Code 10003315
Program Title Education State Grants for Innovative Programs
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Block/Formula Grant
Assessment Year 2005
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 20%
Strategic Planning 0%
Program Management 56%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $99
FY2008 $0
FY2009 $0

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Developing additional, meaningful performance measures for the program, collecting performance information, and setting targets for the measures.

Action taken, but not completed In fiscal year 2008, the Department will collect additional data for the program's performance measures. The Department uses program performance data to monitor State progress and to track how school districts are using program funds. The Department expects to report on FY 2007 data in August 2008.
2006

Initiating formal monitoring visits to States, including both on-site and "virtual" visits, to ensure that the program is funding effective local programs.

Action taken, but not completed In fiscal year 2008, the Department plans to complete 15 virtual and 2 on-site monitoring visits. The Department uses monitoring to determine the quality and accuracy of State data, how well States are implementing the program, and how States and school districts are using program funds. As of June 20, 2008, the Department has completed 11 virtual monitoring visits in fiscal year 2008.
2006

Developing a meaningful efficiency measure for the program and setting targets for the measure based on performance information.

Action taken, but not completed In fiscal year 2008, the Department will collect additional data on the program's efficiency measures. The program's efficiency measures ensure that program staff provide feedback to States soon after they are monitored, so that States can quickly begin to improve their management of the program, and help program staff determine whether States are being sufficiently responsive to Department feedback. The Department expects to report on FY 2008 data in September 2008.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Annual Output

Measure: The percentage of LEAs that complete a credible needs assessment.


Explanation:

Year Target Actual
2005 Baseline 100
2006 100 100
2007 100 August 2008
Annual Output

Measure: The percentage of school districts directing State Grants for Innovative Programs funds to activities designated as strategic priorities that link to the achievement of adequate yearly progress (AYP). The strategic priorities are: (1) support student achievement, enhance reading and math; (2) improve the quality of teachers; (3) ensure that schools are safe and drug free; and (4) promote access for all students.


Explanation:

Year Target Actual
2003 Baseline 65
2004 68 69
2005 69 69
2006 70 70
2007 71 August 2008
Annual Efficiency

Measure: The number of days it takes the Department to send a monitoring report to States after monitoring visits (both on-site and virtual).


Explanation:

Year Target Actual
2007 Baseline 56
2008 51 September 2008
Annual Efficiency

Measure: The number of days it takes States to respond satisfactorily to findings in monitoring reports.


Explanation:

Year Target Actual
2007 Baseline 28
2008 30 September 2008
Annual Output

Measure: The percentage of funds that districts use for the four strategic priorities combined. The strategic priorities are: (1) support student achievement, enhance reading and math; (2) improve the quality of teachers; (3) ensure that schools are safe and drug free; and (4) promote access for all students.


Explanation:

Year Target Actual
2005 Baseline 91
2006 92 Not Collected
2007 93 August 2008

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of this program is to provide a very flexible source of funding for States and districts to address their highest-priority education reform and improvement needs. The statutory purpose is to: (1) support local education reform efforts that are consistent with and support statewide education reform efforts; (2) provide funding to enable SEAs and LEAs to implement promising educational reform programs and school improvement programs based on scientifically based research; (3) provide a continuing source of innovation and educational improvement, including programs to provide library services and instructional and media materials; (4) meet the educational needs of all students, including at-risk youth; and (5) develop and implement education programs to improve school, student, and teacher performance, including professional development activities and class-size reduction programs.

Evidence: ESEA, Section 5101

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The program supports general educational reform. SEAs may use program funds for 7 allowable activities, and LEAs may use funds for 27 specified allowable activities; these cover a wide range of educational purposes. However, LEAs must use the funds consistent with an assessment of their local needs.

Evidence: ESEA, Sections 5121 and 5131

NO 0%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The program supports many activities that are allowable under other Federal education programs, including large State formula grant programs such as Improving Teacher Quality State Grants, Title I Grants to LEAs, and Safe and Drug-Free Schools and Communities State Grants, as well as several smaller discretionary grant programs. Many of the authorized activities are also duplicative of educational activities that are typically funded by State and local governments, such as the acquisition of instructional materials such as textbooks, school library resources, and computer software and hardware.

Evidence:  

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The program is not designed in a manner that allows its effectiveness or efficiency to be measured or for States and LEAs to be held accountable for results. The program is well designed to enable SEAs and LEAs to use funds consistent with their identified needs. LEAs are required to evaluate their programs annually and use the evaluations to make decisions on program improvement. The program has a supplement/not supplant provision, which prevents States and localities from using program funds in lieu of their own funds for school improvement activities.

Evidence: The supplement/not supplant provision is Section 5144 of the ESEA.

NO 0%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: This program is not well targeted to meet the needs of children in high-need schools. Funds are awarded to States based on their relative share of school-age children. Sub-State allocations are based on school enrollment, adjusted to provide higher per-pupil allocations to LEAs with the greatest numbers or percentages of children who are more expensive to educate, including LEAs with many children living in poverty or in sparsely populated areas. However, States have complete flexibility in determining how much of the funding to adjust on those bases.

Evidence: Sub-State allocation criteria are set forth in ESEA Section 5112.

NO 0%
Section 1 - Program Purpose & Design Score 20%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program's existing performance measure is: "School districts that direct Title V funds to activities designated as strategic priorities will increasingly make AYP under Title I. The strategic priorities are: (1) support student achievement, enhance reading and math; (2) improve the quality of teachers; (3) ensure that schools are safe and drug free; and (4) promote access for all students." However, although baseline data are available for this measure, no long-term targets have been set. Moreover, the measure is a somewhat indirect measure of program performance. In addition to the existing measure, the Department has recently added another performance measure (and is considering a third); the Department has not yet established targets for this measure.

Evidence:  

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Long-term targets have not yet been set for the program's performance measures.

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The program's existing performance measure is: "School districts that direct Title V funds to activities designated as strategic priorities will increasingly make AYP under Title I. The strategic priorities are: (1) support student achievement, enhance reading and math; (2) improve the quality of teachers; (3) ensure that schools are safe and drug free; and (4) promote access for all students." The Department has also recently added a second performance measure -- "The percentage of funds that districts use for the four strategic priorities will increase." The Department is also considering a third performance measure -- "The percentage of LEAs that complete a credible needs assessment will increase."

Evidence: These are not acceptable as performance measures.

NO 0%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The 2003 baseline for the existing performance measure is 63 percent; the target for 2004 is 68 percent, and the target for 2005 is 69 percent. Baseline data are not yet available for the new measure.

Evidence: This does not make sense as a target.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Because ED does not have a strategic planning framework for this program, there are no agreed-upon annual or long-term goals for the program partners to commit to. The Department requires States to provide information about their program goals in their consolidated State applications. In addition, the Department requires them to provide data in the Consolidated State Performance Report about LEA activities related to achieving the strategic priorities via the innovative program areas. The priorities are: student achievement in reading and math; teacher quality; safe and drug free schools; and access for all students to a quality education. The Department also asks States to provide quantitative data, if available, to describe the major results of State-level activities to improve student achievement and the quality of education.

Evidence: Consolidated State Performance Report -- ESEA, Section 9303

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: There is currently no Federal evaluation of the program. Although LEAs are required by law to conduct local evaluations of the program, the Department is not collecting the data from those evaluations and it is not clear if LEAs and States are using the data to make changes to the program as required by law.

Evidence: LEAs are required to evaluate their programs under State Grants for Innovative Programs annually. The evaluation must describe how program activities affected student academic achievement. At a minimum, the evaluation must include information and data on the use of funds, the types of services furnished, and the students served by the programs. LEAs must use information gleaned from the evaluation to make decisions about appropriate changes in programs for the subsequent year. The SEA is required to prepare, and submit to the Secretary, an annual statewide summary of how assistance under the program is contributing toward improving student achievement. The summary is to be based on the evaluation information that each LEA submits to the SEA. As with the local evaluation, the summary must include information about the effect of program activities implemented within the State on student academic achievement.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Administration budget arbitrarily sets request for this program at $100 million.

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: States have not submitted summaries of their programs. ED does not collect this information under the consolidated performance report. The program has annual performance measures and is in the process of developing a long-term measure.

Evidence: The SEA is required to prepare, and submit to the Secretary, an annual statewide summary of how assistance under Title V-A is contributing toward improving student achievement or improving the quality of education for students. The summary is to be based on the evaluation information that each LEA submits to the SEA. As with the local evaluation, the summary must include information about the effect of Title V-A programs implemented within the State on student academic achievement or the quality of education.

NO 0%
Section 2 - Strategic Planning Score 0%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department does not yet regularly use performance information to manage the program. However, the Department collects program data annually through the Consolidated State Performance Report and EDEN, including information about State activities and data relevant to the program's performance measures. The Department is also considering adding a new performance measure to assess whether LEAs are developing credible needs assessments; the Department would collect data about this measure through on-site monitoring visits and mailings.

Evidence:  

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Under the revised EDPAS, the Department holds managers accountable for the schedule and focus of grant accountability activities. However, the structure of the program does not provide for the Department to hold grantees accountable for performance.

Evidence:  

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended. Reports are generated from the Department's Grants and Payments System to monitor grantee spending. Program staff review financial records and audit reports as a standard practice during compliance reviews.

Evidence:  

YES 11%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The Department does not have measures of efficiency for this program. The structure of the program makes it difficult to develop such measures.

Evidence:  

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The Department provides examples of uses of Title V program funds in ways that might support local reform efforts or improve the quality of education for students through program guidance, meetings and conferences, and ongoing technical assistance. As necessary, other programs such as Charter Schools, Title I Grants to LEAs, Improving Teacher Quality State Grants, and Safe and Drug-Free Schools are invited to meetings and conferences to address issues. For example, an LEA could decide to implement a hands-on science program to help students achieve to high standards using Improving Teacher Quality State Grants funds for the professional development component of this program and State Grants for Innovative Programs funds to acquire the necessary instructional equipment and materials.

Evidence:  

YES 11%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 11%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: While material internal management deficiencies have not been identified for this program, the Department has procedures in place to identify potential problems. For example, when an auditor finds that a State has a cash management problem, the Department follows up and monitors State activities to ensure that adequate procedures are put in place so that program recipients minimize the time elapsing between their receipt and use of Federal program funds. Reports are generated from the Department's Grants and Payments System to monitor grantee spending.

Evidence:  

YES 11%
3.BF1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through consolidated annual reports and technical assistance activities. The program has two meetings with State coordinators every year; these meetings include technical assistance activities. However, the Department has not done on-site program monitoring in the past few years, except to resolve particular problems with the outlying areas. The Department also monitors and maintains ongoing communication with States via telephone calls, conference calls, and electronic mail.

Evidence:  

YES 11%
3.BF2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Grantee performance data are not available on the program's website. However, as the program integrates into the PBDMI system, these data will become available.

Evidence:  

NO 0%
Section 3 - Program Management Score 56%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Department does not have a long-term performance measure for this program.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: 2004 data will be available in the fall of 2005.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The program does not have an efficiency measure and is not well structured to demonstrate efficiency.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: No data are available for comparable programs. The structure of the program makes such comparisons difficult.

Evidence:  

NA  %
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The Department is not conducting an evaluation of this program and has not completed one in recent years.

Evidence:  

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2005SPR