U.S. Department of Education: Promoting Educational Excellence for All Americans

Archived Information

RA: National Institute on Disability and Rehabilitation Research - 2005

CFDA Number: 84.133 - National Institute on Disability and Rehabilitation Research


Program Goal: To conduct high-quality research that leads to high-quality research products
Objective 8.1 of 3: Advance knowledge through capacity building: Increase capacity to conduct and use high-quality and relevant disability and rehabilitation research and related activities designed to guide decision-making, change practice and improve the lives of individuals with disabilities.
Indicator 8.1.1 of 2: Percentage of NIDRR-supported fellows, postdoctoral trainees, and doctoral students who publish results of NIDRR-sponsored research in refereed journals.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The percentage of NIDRR-supported fellows, post-doctoral trainees, and doctoral students who publish results of NIDRR-sponsored research in refereed journals.

Year    Group                     Actual Performance    Performance Targets
2005    Fellows                   --                    999
2005    Post-doctoral trainees    --                    999
2005    Doctoral students         --                    999


Explanation: The wording of this measure was revised in the first quarter of FY 2005 based on recommendations from NIDRR's PART review. This is an output-oriented annual performance measure. The FY 2005 target is to set a baseline. 2005 data will come from the revised Web-based annual project performance reporting (APPR) system containing information on all three target groups (i.e., fellows, postdoctoral trainees, and doctoral students). Baseline analyses will evaluate the merits of developing submeasures of this indicator to reflect different expectations for publication by size and type of award and to capture the success of NIDRR's capacity-building efforts among persons with disabilities and others from diverse backgrounds. The target for FY 2006 will be based on 2005 data findings.
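For illustration only, the measure reduces to a simple percentage for each target group: the number of NIDRR-supported individuals in that group who published in a refereed journal, divided by the total number supported. A minimal Python sketch, using hypothetical counts rather than actual APPR data:

    # Hypothetical counts, for illustration only; not actual APPR figures.
    groups = {
        "Fellows": {"supported": 40, "published": 25},
        "Post-doctoral trainees": {"supported": 30, "published": 18},
        "Doctoral students": {"supported": 55, "published": 22},
    }

    for name, counts in groups.items():
        pct = 100.0 * counts["published"] / counts["supported"]
        print(f"{name}: {pct:.1f}% published in refereed journals")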
Source: Performance Report
Contractor Performance Report

Program: Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, Model Systems, DRRPs, and ARRTs).
Contractor: Research Triangle Institute, North Carolina.

Frequency: Annually.
Collection Period: 2005
Data Available: September 2006
Validated By: On-Site Monitoring By ED.
NIDRR is planning to work with other ED staff to conduct an audit of publications entered into the web-based reporting system to verify grantees' self-reports of peer-reviewed journal articles.

 
Indicator 8.1.2 of 2: By 2015, at least 10 percent of all projects will be multisite, collaborative controlled trials of interventions and programs.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The percentage of active projects conducting multisite, collaborative controlled trials. (Long-term Measure)

Year    Actual Performance    Performance Targets
2005    --                    999


Explanation: Based on recommendations from NIDRR's PART review, this measure was added as a long-term measure. The FY 2005 target is to establish a baseline. Project monitoring information and data from the existing project performance reporting system (APPR) will be used to set the baseline.  
Source: Performance Report
Contractor Performance Report

Program: Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, Model Systems, and DRRPs).
Contractor: Research Triangle Institute, North Carolina.

Additional Source Information: NIDRR administrative data and reports.

Frequency: Annually.
Collection Period: 2005
Data Available: September 2005

 

Objective 8.2 of 3: Advance knowledge through research and related activities: Generate scientific-based knowledge, technologies, and applications to inform policy, change practice, and improve outcomes.
Indicator 8.2.1 of 3: The average number of publications per award based on NIDRR-funded research and development activities in refereed journals.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The average number of publications per award based on NIDRR-funded research and development activities in refereed journals. (Annual Measure)

Year    Actual Performance    Performance Targets
2002    2.74                  --
2003    2.84                  8
2004    --                    5
2005    --                    5


Progress: The 2002 baseline was determined in FY 2004. NIDRR encountered significant data management and verification problems associated with this measure; these were resolved in July 2004, allowing NIDRR to report nonduplicative and verifiable averages for both 2002 and 2003 using rigorous criteria established by the Institute for Scientific Information (ISI) to determine peer-review status. Actual values reflect the combined NIDRR-funded RERCs, RRTCs, and Model Systems programs. To capture all the refereed journal articles published in a given calendar year, data collection for this measure must span two years of performance reports. Accordingly, data on 2004 refereed publications will not be available until September 2005.

Explanation: The average number of peer-reviewed journal articles published in 2003 per award varied across program types, from a high of 4.95 for Model Systems (183 publications/37 centers) to 1.66 for RRTCs (48/29) and 0.96 for RERCs (22/23). The same ordering was observed for 2002 refereed publications, although the numbers were different. Average peer-reviewed publications per award increased approximately 1.5 points for Model Systems (from 3.48 to 4.95), whereas RRTCs declined by almost the same amount (from 2.89 to 1.66), and RERCs stayed relatively the same (from 1.1 to 0.96). Variations in performance by program type are most likely due to differences in the nature of R&D activities conducted (i.e., medical rehabilitation research vs. psychosocial research and engineering design) and differences in publication practices and expectations associated with these disciplines. Variations over time probably have more to do with changes in the number and types of centers reporting in a given year as a result of natural fluctuations in funding cycles.
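As a check on the arithmetic, each per-award average above is simply the count of refereed publications divided by the number of funded centers in that program. A brief Python sketch using the 2003 figures cited in the explanation:

    # 2003 refereed-publication counts and number of centers, as cited above.
    programs = {
        "Model Systems": (183, 37),  # 183 publications across 37 centers
        "RRTCs": (48, 29),
        "RERCs": (22, 23),
    }

    for name, (publications, centers) in programs.items():
        print(f"{name}: {publications / centers:.2f} publications per award")
    # Prints approximately 4.95, 1.66, and 0.96, matching the reported values.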
Source: Performance Report
Contractor Performance Report

Program: Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, Model Systems, DRRPs, FIP, and SBIRs).
Contractor: Research Triangle Institute, North Carolina.

Additional Source Information: The refereed status of journal articles for 2002 and 2003 publications was determined using criteria established by the Institute for Scientific Information (ISI), which was recommended for this purpose by the National Library of Education. NIDRR classified journal articles published in 2003 as "peer-reviewed" if the journal title appeared on ISI's listing as of October 2004.

Frequency: Annually.
Collection Period: 2003 - 2004
Data Available: September 2005
NIDRR is planning to work with other ED staff to conduct an audit of publications entered into the Web-based project performance reporting systems to verify grantees' self-reports of publications.

Limitations: Data on 2002 and 2003 peer-reviewed publications are limited to the three NIDRR program funding mechanisms (i.e., RERCs, RRTCs, and the SCI, TBI, and Burn Model Systems) that were required to provide citations in the existing APPR. In addition, data for these two years may underrepresent the number of refereed publications due to terminating centers with no-cost extensions of 6 months or longer, which would delay the submission of final reports beyond the data collection period for the 2002 and 2003 measures. Another possible limitation involves reliance on a single aggregate measure of scientific productivity regardless of the amount of the award or the nature of the research conducted. Refereed journal articles may be a better indicator of scientific productivity for awards in medical rehabilitation research than they are for other areas of NIDRR's portfolio related to community integration and product development.

Improvements: NIDRR plans to correct these limitations through the redesigned APPR, which will collect publication data from four additional program funding mechanisms (DBTACs, DRRPs, FIPs, and KDU projects), and through additional analyses of variations in publication rates across program mechanisms, with the aim of creating submeasures.

 
Indicator 8.2.2 of 3: Percentage of new studies funded by NIDRR that assess the effectiveness of interventions, programs, and devices using rigorous and appropriate methods.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The percentage of new studies that assess the effectiveness of interventions, programs, and devices using rigorous and appropriate methods.

Year    Actual Performance    Performance Targets
2005    --                    999


Explanation: Based on recommendations from NIDRR's PART review, this measure was added for 2005. This is an activity-oriented annual measure. The FY 2005 target is to establish a baseline. 2005 data will come from the revised Web-based annual project performance reporting system (APPR) and judgments of expert panelists participating in NIDRR's new portfolio assessment system.  
Source: Performance Report
Contractor Performance Report

Program: Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, Model Systems, DBTACs, DRRPs, and FIPs).
Contractor: Research Triangle Institute, North Carolina.

Additional Source Information: Review by expert panels.

Frequency: Annually.
Collection Period: 2005
Data Available: April 2006
Validated By: On-Site Monitoring By ED.

Improvements: To reduce the costs and improve the efficiency of collecting qualitative judgments from expert panels in 2004 and 2005, NIDRR will experiment with using Internet-based alternatives to face-to-face program-review-type meetings.

 
Indicator 8.2.3 of 3: The percentage of grantee research and development that has appropriate study design, meets rigorous standards of scientific and/or engineering methods, and builds on and contributes to knowledge in the field.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The percentage of grantee research and development that has appropriate study design, meets rigorous standards of scientific and/or engineering methods, and builds on and contributes to knowledge in the field.

Year    Actual Performance    Performance Targets
2002    54                    65
2003    67                    70
2004    --                    70
2005    --                    999


Progress: No data are reported for this measure for 2004 due to two interrelated factors: (1) only 20 of the 47 formative reviews slated for calendar year 2004 were conducted because of delays in contract approval and scheduling; and (2) NIDRR is currently redesigning its performance assessment system to convert from a reliance on program reviews of individual centers within a single program funding mechanism to portfolio assessments of clusters of related projects that cut across program mechanisms and year of award. The portfolio approach has the advantage of increasing the size and representativeness of the projects reviewed and thereby reducing fluctuations in scores due to cohort effects rather than to actual changes in performance. The initial phase of this system will be conducted in 2005, with pilot data available by September. Full implementation will take several years to complete. This measure has been revised since 2004 to clarify the standards of R&D excellence upon which expert judgments will be based. The FY 2005 target is to establish a baseline.

Explanation: Data for 2002 and 2003 come from the summative program reviews conducted with 28 and 9 centers, respectively. The percentages reported are based on the number of projects in each year that scored 4 or 5 on the following NIDRR center-of-excellence indicators for R&D: appropriateness of study designs, rigor with which standards of scientific and/or engineering methods are applied, and the degree to which the research builds on and contributes to knowledge in the field. NIDRR plans to correct the coverage limitation noted below, beginning in 2005 with the initial implementation of the new performance assessment system, which will include other types of R&D projects.
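For illustration, the reported percentage is the share of reviewed projects scoring 4 or 5 on the R&D excellence indicators. Individual review scores are not reproduced in this report, so the values below are hypothetical; note that a 2003-sized cohort of 9 reviews with 6 projects scoring 4 or 5 would yield the reported 67 percent. A minimal Python sketch:

    # Hypothetical review scores (1-5); actual 2002/2003 scores are not shown in this report.
    scores = [5, 4, 3, 5, 2, 4, 4, 3, 5]

    high_scorers = sum(1 for s in scores if s >= 4)
    pct = 100.0 * high_scorers / len(scores)
    print(f"{high_scorers} of {len(scores)} projects scored 4 or 5 ({pct:.0f}%)")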
Additional Source Information: Qualitative data from formative and/or summative program review meetings with expert panels.

Frequency: Annually.
Collection Period: 2004 - 2005
Data Available: April 2006
Validated By: On-Site Monitoring By ED.

Limitations: To date, the data for this indicator have been limited to the three largest program funding mechanisms within the NIDRR portfolio (i.e., RERCs, RRTCs, and Model Systems).

Improvements: NIDRR plans to correct this limitation, beginning in 2005.

 

Objective 8.3 of 3: Advance knowledge through translation and dissemination: Promote the effective use of scientific-based knowledge, technologies, and applications to inform policy, improve practice, and enhance the lives of individuals with disabilities.
Indicator 8.3.1 of 1: The number of new or improved assistive and universally designed technologies, products, and devices developed by grantees that are judged by an expert panel to be effective in improving outcomes and have the potential to be transferred to industry for commercialization.
Targets and Performance Data / Assessment of Progress / Sources and Data Quality

The number of new or improved assistive and universally designed technologies, products, and devices that are judged by an expert panel to be effective in improving outcomes and have the potential to be transferred to industry for commercialization.

Year    Actual Performance    Performance Targets
2005    --                    999


Explanation: Based on recommendations from NIDRR's PART review, this measure was reworded for 2005. This is an output-oriented annual performance measure. Baseline data were not collected in FY 2004 as expected. The FY 2005 target is to establish a preliminary baseline using the 2005 pilot version of the redesigned Web-based annual project performance reporting (APPR) system and judgments of expert panels.  
Source: Performance Report
Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects).
Program: National Institute on Disability and Rehabilitation Research.
Contractor: Research Triangle Institute, North Carolina.

Additional Source Information: Expert panel review.

Frequency: Annually.
Collection Period: 2005
Data Available: April 2006
Validated By: On-Site Monitoring By ED; review by expert panel.

Improvements: To reduce the costs and improve the efficiency of collecting qualitative judgments from expert panels, NIDRR will experiment in 2004 with using Internet-based alternatives to face-to-face program-review-type meetings.

 
