ExpectMore.gov


Detailed Information on the
IDEA Special Education - Research and Innovation Assessment

Program Code 10001040
Program Title IDEA Special Education - Research and Innovation
Department Name Department of Education
Agency/Bureau Name Office of Special Education and Rehabilitative Services
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2003
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 40%
Program Management 60%
Program Results/Accountability 8%
Program Funding Level
(in millions)
FY2008 $72
FY2009 $71

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

Collect grantee performance data and make it available to the public in a transparent and meaningful manner.

Action taken, but not completed IES established new long-term and annual performance measures and a new efficiency measure for the Research in Special Education program. The first years of data for these measures will be made available to the public through the Department's Visual Performance Suite system in October 2009. These measures are aligned with the measures developed for the IES Research, Development, and Dissemination program during its 2007 PART assessment.
2004

Implement a regular schedule for review by an independent organization to assess overall program quality, coordinated with the reauthorization cycle.

Action taken, but not completed The last independent evaluation of special education research activities was a partial evaluation conducted by COSMOS in 1991. The National Board for Education Sciences awarded a contract for its evaluation of IES research activities in the summer of FY 2007. The results of this evaluation are not expected until 2009.
2006

Evaluate the impact of IDEA 2004, working in coordination with the Office of Special Education Programs. Use findings from the evaluation to advise the Administration and Congress on the next IDEA reauthorization.

Action taken, but not completed The National Center for Education Evaluation and Regional Assistance (NCEE) is conducting a national assessment of IDEA. Contracts for the analysis of extant data and an implementation study were awarded in 2007. Final reports from these studies are due by the end of FY 2009. Depending on the advice of the panel of experts and available resources, additional contract(s) may be awarded in FY 2008 for studies of interventions designed to improve educational outcomes for students with disabilities.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

Collaborate with the Office of Special Education and Rehabilitative Services (OSERS) to create a research plan that demonstrates how the IES priorities and programs for special education research are consistent with the purposes of IDEA and coordinated with OSERS activities.

Completed The National Board for Education Sciences has approved research priorities for IES, and NCSER has published Requests for Applications that show how these priorities are being applied within the Research in Special Education program.
2004

Articulate substantive long-term research objectives that have measurable outcomes and annual targets. If targets for annual measures are considered interim targets for the long-term measures, then the annual measures must be aligned with and demonstrate progress toward meeting the long-term research objectives of the program.

Completed Since a new Commissioner for Special Education Research is expected to be in place by the end of FY 2007, ED has set a milestone for IES to develop appropriate long-term outcome measures and targets for the Research in Special Education program in the first quarter of FY 2008. The measures will be submitted to OMB for approval and are expected to be similar to those approved by OMB for the Research, Development, and Dissemination program.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Minimum number of IES-supported interventions on reading, writing, or language development for special education that are reported by the What Works Clearinghouse to be effective at improving student outcomes by 2017-2018. (New measure, added February 2008)


Explanation: The published priorities of IES are organized around a principal goal of developing or identifying a substantial number of programs, practices, policies, and approaches that enhance academic achievement and can be widely deployed. A reliable count of the number of such education interventions that have been developed or identified by IES is an appropriate measure of the long-term success of IES investments in research on special education. This count, for both the annual and long-term measures of the efficacy of IES-supported interventions, uses the published What Works Clearinghouse (WWC) standards of intervention effectiveness. The WWC employs transparent, rule-based evidence standards, rating schemes, and reporting systems that are published on its website (e.g., http://whatworks.ed.gov/reviewprocess/standards.html; http://whatworks.ed.gov/rating_scheme.pdf). It can take many years of research and development, sometimes across independent teams of researchers, to generate interventions that can be adopted by multiple schools and districts and that have been shown to be effective in field trials that employ rigorous research designs. For reading, writing, or language development for special education, the long-term goal of 15 interventions that are effective and can be widely deployed is based on an analysis of the current state of the field and the portfolio of grants currently funded by IES.

Year Target Actual
2017 15
Long-term Outcome

Measure: Minimum number of IES-supported interventions on school readiness for special education that are reported by the What Works Clearinghouse to be effective at improving student outcomes. (New measure, added February 2008)


Explanation: The published priorities of IES are organized around a principal goal of developing or identifying a substantial number of programs, practices, policies, and approaches that enhance academic achievement and can be widely deployed. A reliable count of the number of such education interventions that have been developed or identified by IES is an appropriate measure of the long-term success of IES investments in research on special education. This count, for both the annual and long-term measures of the efficacy of IES-supported interventions, uses the published What Works Clearinghouse (WWC) standards of intervention effectiveness. The WWC employs transparent, rule-based evidence standards, rating schemes, and reporting systems that are published on its website (e.g., http://whatworks.ed.gov/reviewprocess/standards.html; http://whatworks.ed.gov/rating_scheme.pdf). It can take many years of research and development, sometimes across independent teams of researchers, to generate interventions that can be adopted by multiple schools and districts and that have been shown to be effective in field trials that employ rigorous research designs. For school readiness for special education, the long-term goal of 12 interventions that are effective and can be widely deployed is based on an analysis of the current state of the field and the portfolio of grants currently funded by IES.

Year Target Actual
2017 12
Long-term Outcome

Measure: Minimum number of IES-supported interventions on behavioral outcomes for special education that are reported by the What Works Clearinghouse to be effective at improving student outcomes by 2017-2018. (New measure, added February 2008)


Explanation: The published priorities of IES are organized around a principal goal of developing or identifying a substantial number of programs, practices, policies, and approaches that enhance academic achievement and can be widely deployed. A reliable count of the number of such education interventions that have been developed or identified by IES is an appropriate measure of the long-term success of IES investments in research on special education. This count, for both the annual and long-term measures of the efficacy of IES-supported interventions, uses the published What Works Clearinghouse (WWC) standards of intervention effectiveness. The WWC employs transparent, rule-based evidence standards, rating schemes, and reporting systems that are published on its website (e.g., http://whatworks.ed.gov/reviewprocess/standards.html; http://whatworks.ed.gov/rating_scheme.pdf). It can take many years of research and development, sometimes across independent teams of researchers, to generate interventions that can be adopted by multiple schools and districts and that have been shown to be effective in field trials that employ rigorous research designs. For behavioral outcomes for special education, the long-term goal of 10 interventions that are effective and can be widely deployed is based on an analysis of the current state of the field and the portfolio of grants currently funded by IES.

Year Target Actual
2017 10
Long-term Outcome

Measure: Minimum number of individuals who have completed IES-supported pre- or post-doctoral research training programs and are actively involved in special education research by 2017-2018. (New measure, added February 2008)


Explanation: To expand the national capacity to carry out research that generates evidence on the effectiveness of education programs and practices, IES is committed to training at least 125 researchers on special education by 2017-2018. The first grants for IES-supported pre- or post-doctoral research training programs in special education will be awarded in 2008. The long-term target of 125 researchers is derived from the number of pre-docs and post-docs that IES intends to fund, adjusted by the expected number of students who will not finish their degrees or will choose a career other than education research. IES intends to fund 270 individuals in its special education research training programs through 2016. Figures from the American Psychological Association show that, of 209 Ph.D.s awarded in education psychology in 2000, only 16 recipients were employed in a research-related career a year later; the long-term target of 125 active researchers therefore requires a significantly higher percentage of individuals supported with IES funds to be actively engaged in education research than is produced by existing education research doctoral programs.

Year Target Actual
2017 125
Long-term Outcome

Measure: The percentage of decision makers surveyed in 2017-2018 who indicate that they consult the What Works Clearinghouse prior to making decision(s) on reading, writing, language, school readiness, or behavior interventions for special education. (New measure, added February 2008)


Explanation: IES measures the long-term success of this program not only by examining the extent to which its program investments produce interventions that are effective in improving student outcomes, but also by examining the extent to which education decision makers seek evidence regarding the effectiveness of interventions before they adopt or use interventions that affect large numbers of students. In a 1997-98 survey, the percentage of principals in Title I schools who reported that the Department's dissemination vehicles, such as the Comprehensive Assistance Centers, Regional Labs, Parent Information Resource Centers, and ERIC, were very helpful ranged from less than one percent to three percent. IES's goal for the utility of the What Works Clearinghouse is 8 times greater than the baseline data from dissemination vehicles that have had much longer to establish relationships with educators. A recent report by Arthur Levine, Educating Researchers, surveyed 1,800 school principals and reported that almost all depended on trade associations and newspapers as their sources of information on research. IES's goal is ambitious in this context. In 2017-2018, IES will conduct a survey of a random sample of education decision makers to determine whether they consult the What Works Clearinghouse before making decisions on reading, writing, language development, school readiness, or behavior interventions for special education.

Year Target Actual
2017 25%
Annual Outcome

Measure: The minimum number of graduates of IES-supported special education research training programs who are employed in research positions. (New measure, added February 2008)


Explanation: Beginning in 2013, directors of the research training programs will be asked to obtain and provide information to IES about the current employment of the individuals who have completed their programs. Specifically, program graduates will be asked to indicate whether they are employed in research positions.

Year Target Actual
2013 10
2014 30
2015 60
2016 100
Annual Outcome

Measure: The minimum number of individuals who have been or are being trained in IES-funded special education research training programs. (New measure, added February 2008)


Explanation: The number of individuals who receive fellowship support as participants in IES-funded pre- and post-doctoral research training programs will be obtained from grantee reports contained in the official grant files.

Year Target Actual
2009 6
2010 15
2011 45
2012 85
2013 125
2014 165
2015 215
2016 270
Annual Outcome

Measure: The minimum number of IES-supported interventions with evidence of efficacy in improving behavior outcomes for students with disabilities. (New measure, added February 2008)


Explanation: The data for this annual measure are based on What Works Clearinghouse (WWC) reviews of initial findings on interventions from IES research grants, such as findings that will have been presented as papers at a convention or working papers provided to IES by its grantees. WWC principal investigators will review these initial reports from IES-supported projects on improving behavior outcomes for students with disabilities and rate them using the WWC published standards to determine whether the evidence from these research grants meets the evidence standards of the WWC and demonstrates a statistically significant positive effect in improving behavior outcomes for students with disabilities. The WWC defines "interventions" as "programs (for example, Accelerated Schools), products (for example, a textbook or a particular curriculum), practices (for example, mixed-age grouping), and policies (for example, class size reduction) that can be adopted by multiple schools and districts." It defines "meets evidence standards" as randomized controlled trials that do not have problems with randomization, attrition, or disruption, and regression discontinuity designs that do not have problems with attrition or disruption. The WWC's definition for a "statistically significant positive effect" is detailed and takes into account the statistical treatment of multiple measures of the outcome of interest and clustering (e.g., classrooms have been randomly assigned to receive or not receive the intervention whereas outcomes are assessed at the level of individual students clustered within those classrooms).

Year Target Actual
2009 1
2010 3
2011 5
2012 7
2013 10
2014 12
2015 15
2016 18
Annual Outcome

Measure: The minimum number of IES-supported interventions with evidence of efficacy in improving school readiness outcomes for students with disabilities. (New measure, added February 2008)


Explanation: The data for this annual measure are based on What Works Clearinghouse (WWC) reviews of initial findings on interventions from IES research grants, such as findings that will have been presented as papers at a convention or working papers provided to IES by its grantees. WWC principal investigators will review these initial reports from IES-supported projects on improving school readiness for students with disabilities and rate them using the WWC published standards to determine whether the evidence from these research grants meets the evidence standards of the WWC and demonstrates a statistically significant positive effect in improving school readiness outcomes for students with disabilities. The WWC defines "interventions" as "programs (for example, Accelerated Schools), products (for example, a textbook or a particular curriculum), practices (for example, mixed-age grouping), and policies (for example, class size reduction) that can be adopted by multiple schools and districts." It defines "meets evidence standards" as randomized controlled trials that do not have problems with randomization, attrition, or disruption, and regression discontinuity designs that do not have problems with attrition or disruption. The WWC's definition for a "statistically significant positive effect" is detailed and takes into account the statistical treatment of multiple measures of the outcome of interest and clustering (e.g., classrooms have been randomly assigned to receive or not receive the intervention whereas outcomes are assessed at the level of individual students clustered within those classrooms).

Year Target Actual
2009 1
2010 3
2011 7
2012 10
2013 12
2014 15
2015 18
2016 20
Annual Outcome

Measure: The minimum number of IES-supported interventions with evidence of efficacy in improving reading, writing, or language outcomes for students with disabilities. (New measure, added February 2008)


Explanation: The data for this annual measure are based on What Works Clearinghouse (WWC) reviews of initial findings on interventions from IES research grants, such as findings that will have been presented as papers at a convention or working papers provided to IES by its grantees. WWC principal investigators will review these initial reports from IES-supported projects in reading, writing, or language development for students with disabilities and rate them using the WWC published standards to determine whether the evidence from these research grants meets the evidence standards of the WWC and demonstrates a statistically significant positive effect in improving achievement outcomes in reading or writing. The WWC defines "interventions" as "programs (for example, Accelerated Schools), products (for example, a textbook or a particular curriculum), practices (for example, mixed-age grouping), and policies (for example, class size reduction) that can be adopted by multiple schools and districts." It defines "meets evidence standards" as randomized controlled trials that do not have problems with randomization, attrition, or disruption, and regression discontinuity designs that do not have problems with attrition or disruption. The WWC's definition for a "statistically significant positive effect" is detailed and takes into account the statistical treatment of multiple measures of the outcome of interest and clustering (e.g., classrooms have been randomly assigned to receive or not receive the intervention whereas outcomes are assessed at the level of individual students clustered within those classrooms).

Year Target Actual
2009 1
2010 3
2011 6
2012 11
2013 13
2014 15
2015 17
2016 20
Annual Outcome

Measure: The average number of research grants administered per research scientist employed in the National Center for Special Education Research. (New measure, added February 2008)


Explanation: The major function of IES grant making is to award funds for activities that are of high quality and that fit IES's priorities. IES's principal efficiency measure is the ratio of research grants to research staff. These data will be collected from the official grant files for the National Center for Special Education Research.

Year Target Actual
2009 20
2010 22
2011 25
2012 28
2013 32
2014 35
2015 38
2016 40
2017 40

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program's overall purpose of improving services and results for children with disabilities is clear from the authorizing legislation. While the main purpose of the program is to achieve these improvements through research, the program supports a wide range of other activities, such as technical assistance and dissemination, that overlap with other Part D program activities.

Evidence: IDEA section 672(a) "The Secretary shall make competitive grants to, or enter into contracts or cooperative agreements with, eligible entities to produce, and advance the use of, knowledge" to improve services and results for children with disabilities.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: IDEA Research and Innovation is the principal Federal program supporting research to improve early intervention and education for children with disabilities. Children with disabilities have special needs that, because of their low numbers, are unlikely to be addressed through most research activities, which are directed toward the majority of children who do not have disabilities.

Evidence: The No Child Left Behind (NCLB) legislation mandates improved results for all children, including children with disabilities. In order to achieve these results, schools need to have knowledge through research to address the specialized needs of children with disabilities.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The IDEA Research and Innovation program is the primary program focused on improving specialized services and results for children with disabilities through applied research. The special needs of these children require specialized research approaches to improve their outcomes. However, since most children with disabilities spend all or large parts of their school days in regular education classrooms, it is important to coordinate special and regular education research efforts. Legislation moving special education research from the Office of Special Education and Rehabilitative Services to the Institute of Education Sciences has been proposed by both the House and the Senate and is supported by the Administration. This transfer would improve coordination of special and regular education research activities.

Evidence: OSERS has long been a leader in supporting research and other activities to improve reading for children with disabilities. As part of its efforts, it has also played a leadership role in improving reading for all children (e.g., through its support of the National Center to Improve the Tools of Educators). However, conducting meaningful research that will benefit children with disabilities often entails looking at educational interventions for all children. Children who need special education should be those who do not respond to appropriate regular education interventions. For example, one reading center currently funded under the Research and Innovation program is providing primary and secondary interventions to almost 4,000 students and tertiary interventions to over 300 students.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: We are not aware of any studies that show that the current program structure is not cost effective compared to reasonable alternatives. However, the broad range of activities authorized under and funded through the program has detracted from its main focus of supporting research and providing new knowledge. For example, Research and Innovation is funding the Youth Leadership Development project which supports a group of youth leaders with disabilities who can provide input on policies and practices related to children with disabilities. This activity may be important but is not directly related to special education research.

Evidence: Diverse activities within the already broad purpose of the program described in section 672(a) include not only the production of new knowledge (section 672(b)), but also the integration of research and practice (section 672(c)) and improving the use of professional knowledge (section 672(d)).

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: Research and Innovation funding priorities are targeted through an elaborate planning process that involves extensive consultation with various interest groups. However, the absence of clear and definitive long-term performance goals linked to priorities is a problem in determining the extent to which the program is effectively targeted over time.

Evidence: See IDEA section 661(a), application packages for competitions, notices of competitions, 23rd Annual Report to Congress on the Implementation of the IDEA, OSEP web site at www.ed.gov/offices/OSERS/OSEP/Programs/CPP/index.html.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: Three of OSEP's four GPRA indicators for the IDEA Part D National Activities programs relate to the performance of Research and Innovation activities. They deal with the importance of program priorities, the quality of activities, and whether these activities produce results that are used. However, these goals do not focus on specific long-term improvements in educational outcomes for children with disabilities. The Department is currently working on developing long-term performance goals for the special education research program. The Department is also working with OMB on developing an appropriate efficiency measure for this program.

Evidence: Department of Education Annual Program Performance Reports (see www.ed.gov/pubs/annualplan2004/program/html); Department of Education Planning and Performance Management Database; priorities for grant competitions.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program does not have meaningful long-term outcome measures and targets.

Evidence: Department of Education Annual Program Performance Reports (see www.ed.gov/pubs/annualplan2004/program/html); Department of Education Planning and Performance Management Database; priorities for grant competitions.

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: The program does not have specific long-term outcome measures, but its annual GPRA indicators focus on the importance of program priorities, the quality of research, and whether research activities produce results that could be used to improve educational services for children with disabilities. OSEP should continue to examine whether the current methodology used to measure progress on GPRA indicators accurately represents progress toward improving results. For example, grantees under the program are included as assessors of the extent to which the program addresses critical needs.

Evidence: Department of Education Annual Program Performance Reports (see www.ed.gov/pubs/annualplan2004/program/html); Department of Education Planning and Performance Management Database; priorities for grant competitions.

YES 10%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: Research and Innovation has annual targets for its performance indicators. However, targets are often not ambitious.

Evidence: Department of Education Annual Program Performance Reports (see www.ed.gov/pubs/annualplan2004/program/html); Department of Education Planning and Performance Management Database; application packages including information on GPRA indicators.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The Research and Innovation program priorities include annual GPRA goals and commit selected grantees to work toward those goals. OSEP also contracts with the American Institutes for Research to conduct annual reviews of grantees' achievement of these goals. However, the methodology for conducting these reviews could be improved. For example, grantees are often used to review the importance of the priorities under which they have been funded.

Evidence: Research and Innovation annual competition notices.

YES 10%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: An evaluation of IDEA Part D activities is under consideration for funding under the Grants to States set-aside. However, there have been no independent evaluations of Research and Innovation activities since 1991, when a partial evaluation of program activities was conducted. Program activities are also assessed through the GPRA process. However, that process may not be very objective because some of the individuals involved are also engaged in program planning and/or are grant recipients.

Evidence: Evaluation plan provided to OMB. COSMOS Evaluation 1991.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Budget requests identify priority areas. However, these priority areas are not described in terms of overall long-term goals related to improving results for children with disabilities.

Evidence: Congressional Budget Justifications.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Although OSEP has been working to address its strategic planning deficiencies, meaningful actions to eliminate such deficiencies have not yet been implemented. As OSEP works to address planning deficiencies, it is placing particular emphasis on "adopting a limited number of specific, ambitious long-term performance goals and a limited number of annual performance goals."

Evidence: Twenty-third Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act (2001) (see www.ed.gov/offices/OSERS/OSEP/Products/OSEP2001AnlRpt/index.html); IDEA section 661(a).

NO 0%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: The program supports three types of activities - research, demonstration and outreach. These three activities support the acquisition of knowledge and the development of practical ways to apply that knowledge to improving results for children with disabilities. Funding for each of these types of activities is considered in the planning process through which priorities and funding levels are determined.

Evidence: IDEA section 661(a).

YES 10%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: There is a documented planning process that leads to the development of specific annual priorities. However, the absence of meaningful long-term outcome goals linking the planning process to priorities over time is a serious problem.

Evidence: Twenty-third Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act; priorities for grant competitions.

YES 10%
Section 2 - Strategic Planning Score 40%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: A team of reviewers typically assesses the performance of large grants in their second year. Program staff also work closely with these grantees in the implementation of their projects and review their final reports. However, the program could improve the collection of data related to GPRA performance goals.

Evidence: Program review files; reports

YES 10%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to EDPAS, which links employee performance to relevant Strategic Plan goals and action steps and is designed to measure the degree to which a manager contributes to improving program performance. However, ED cannot demonstrate specific ways in which OSEP's managers are held accountable for linking their performance standards to the program's long-term and annual measures. Program partners are subject to project reviews and grant monitoring, but these oversight activities are not designed to link partners to specific performance goals.

Evidence: Internal records

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: OSEP successfully obligates funds by the end of each fiscal year, but most funds are obligated late in each fiscal year. OSEP should institute changes to ensure that its grant competitions are announced on a regular schedule and provide sufficient time for preparation and review of applications. Funds are spent for the intended purposes; this is assessed through grant and contract monitoring and grant reviews for major grant programs. No improper uses of funds have been identified.

Evidence: Financial reports, notices of competitions, lists of funded applications.

YES 10%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: This program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing "One-ED" -- an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements.

Evidence:  

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The Research and Innovation program collaborates effectively with some other agencies and programs, particularly with the Department of Health and Human Services and with other OSEP Part D programs, such as Technical Assistance and Dissemination. For example, research projects are required to report their findings to OSEP Technical Assistance and Dissemination projects to facilitate the distribution of information to appropriate audiences. However, collaboration between the special education research program and regular education research funded under the Institute of Education Sciences is limited. The need to improve coordination between special and regular education is one reason the Department supports House and Senate legislative proposals to move special education research to the Institute of Education Sciences.

Evidence: Reimbursable agreements with other agencies; program priorities.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: No internal control weaknesses have been reported by auditors. The Department has a system for identifying excessive drawdowns and can put individual grantees on probation, requiring that drawdowns be approved.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has addressed some of its deficiencies. For instance, the President's Commission on Excellence in Special Education identified the peer review process as an area of weakness in current program management practice. Internet-based training on the peer review process has been provided in an effort to improve the process. However, serious and persistent problems related to the late award of grants have not been addressed. OSEP's inability to produce a Comprehensive Plan, as required by the IDEA Amendments of 1997, is also a problem.

Evidence: President's Commission on Excellence in Special Education: Final Report

NO 0%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Grants are awarded based on a competitive peer review process and are based on merit. However, the President's Commission identified the existing peer review process as an area needing improvement in current program management. Areas of concern include: ensuring appropriate separation between program management and peer review responsibilities; developing a more effective process for recruiting and utilizing peer reviewers; initiating a two-level review process that focuses on both technical quality/rigor and relevance to OSEP priorities; ensuring that the peer review process is itself organized in a manner that actively encourages progressive improvement of proposals through revision and resubmission.

Evidence: Program funds are used to support peer review costs. All applicants are subject to peer review. "A New Era: Revitalizing Special Education for Children and Their Families" - the President's Commission on Excellence in Special Education.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: OSEP reviews grantee performance through annual performance reports and final reports, and holds annual meetings with projects in Washington. When necessary, OSEP staff also conduct site visits to review grantee activities.

Evidence: Annual performance reports

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: GPRA data are now reported in several formats (including on the web) and are made available to the public through annual reports on the implementation of IDEA. Research final reports are available to the public through the Department's ERIC clearinghouse and the grantees' own websites. However, it would be difficult for the public to access the research information contained in these reports in a way that meaningfully conveys how the different research products support the program's goals and what they indicate about program performance.

Evidence: ericec.org; Office of Special Education Programs Technical Assistance and Dissemination Network (see www.dssc.org/frc/oseptad.htm)

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation:  

Evidence:  

NA 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: There has been some progress in meeting the output measures included in the program's GPRA reporting, but there are no long-term outcome measures against which to judge progress.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program has had some success in meeting its short-term, output-oriented GPRA goals. However, some of the long-term GPRA goals appear to be relatively arbitrary and not particularly ambitious. Another problem is that the methodology used to collect data is not objective; for example, the expert panel used to address some indicators is recruited from among individuals who were consulted during the planning process for developing the program priorities.

Evidence: Department of Education Annual Program Performance Reports (see www.ed.gov/pubs/annualplan2004/program/html); program reviews.

SMALL EXTENT 8%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The Department is working with OMB on developing an appropriate efficiency measure for this program.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: The Research and Innovation program is the only Federal program supporting applied research in special education, but it can be compared to other research programs in government. However, no systematic evidence has been collected to compare Research and Innovation to other research programs.

Evidence:  

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: There have been no independent evaluations of this program within the last 20 years.

Evidence:  

NO 0%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 8%


Last updated: 01092009.2003FALL