FR Doc 04-17739
[Federal Register: August 4, 2004 (Volume 69, Number 149)]
[Notices]               
[Page 47128-47133]
From the Federal Register Online via GPO Access [wais.access.gpo.gov]
[DOCID:fr04au04-55]                         
-----------------------------------------------------------------------

DEPARTMENT OF EDUCATION

 
Office of Special Education and Rehabilitative Services, Overview 
Information; Research and Innovation To Improve Services and Results 
for Children With Disabilities--Evidence-Based Interventions for Severe 
Behavior Problems; Notice Inviting Applications for New Awards for 
Fiscal Year (FY) 2004

    Catalog of Federal Domestic Assistance (CFDA) Number: 84.324P.
    Applications Available: August 4, 2004.
    Deadline for Transmittal of Applications: September 10, 2004.
    Eligible Applicants: State educational agencies (SEAs); local 
educational agencies (LEAs); institutions of higher education (IHEs); 
other public agencies; nonprofit private organizations; outlying areas; 
freely associated States; and Indian tribes or tribal organizations.
    Estimated Available Funds: $4,300,000.
    Estimated Average Size of Awards: $1,075,000.
    Maximum Award: We will reject any application that proposes a 
budget exceeding $1,075,000 for a single budget period of 12 months. 
The Assistant Secretary for Special Education and Rehabilitative 
Services may change the maximum amount through a notice published in 
the Federal Register.
    Estimated Number of Awards: 4.

    Note: The Department is not bound by any estimates in this 
notice.

    Project Period: Up to 48 months.

Full Text of Announcement

I. Funding Opportunity Description

    Purpose of Program: The purpose of this program is to produce, and 
advance the use of, knowledge to improve the results of education and 
early intervention for infants, toddlers, and children with 
disabilities.
    Priority: In accordance with 34 CFR 75.105(b)(2)(iv), this priority 
is from allowable activities specified in the statute (see sections 
661(e)(2) and 672 of the Individuals with Disabilities Education Act, 
as amended (IDEA)).
    Absolute Priority: For FY 2004 this priority is an absolute 
priority. Under 34 CFR 75.105(c)(3), we consider only applications that 
meet this priority.
    This priority is:
    Research and Innovation to Improve Services and Results for 
Children with Disabilities--Evidence-Based Interventions for Severe 
Behavior Problems.
    Background: Children with severe behavior problems often engage in 
behaviors that are disruptive in school environments and, at times, 
dangerous to themselves and others. Aggression, self-injurious 
behavior, and other disruptive behaviors pose a serious threat to 
efforts to help these individuals lead more independent lives. Behavior 
problems have been linked to initial referrals to institutions and 
increased recidivism for those individuals leaving institutional 
settings or those referred to crisis intervention programs from 
community placements. Behavior problems interfere with essential 
activities such as family life, employment, and education.
    There have been significant research advances in identifying 
procedures for reducing severe behavior problems--almost exclusively 
using behavioral approaches--and this research has expanded 
considerably over the past several decades. Theoretical formulations
that incorporate the variables maintaining these behavior problems have 
informed research on assessment and intervention. Functional 
assessments (that determine why a child might be disruptive in a 
particular setting) and functionally-based interventions (such as 
teaching replacement skills and addressing environmental limitations) 
for assessing and treating behavior problems dominate the research 
literature, and reviews of the effectiveness of these behavioral 
interventions support their use. Analyses of the research on
positive behavioral support conclude that from one-half to two-thirds 
of the outcomes are successful.\1\
---------------------------------------------------------------------------

    \1\ Carr, E.G., Horner, R.H., Turnbull, A.P., Marquis, J.G., 
McLaughlin, D.M., McAtee, M.L., Smith, C.E., Ryan, K.A., Ruef, M.B., 
& Doolabh, A. (1999). Positive behavior support for people with 
developmental disabilities: A research synthesis. Washington, DC: 
American Association on Mental Retardation.
---------------------------------------------------------------------------

    The No Child Left Behind Act of 2001 (NCLB) encourages education 
decision-makers to base instructional practices and programs on 
scientifically based research. Yet, despite growing evidence of the 
potential of various behavioral interventions to reduce behavior 
problems, there is a need to better understand these interventions and 
document their strengths and limitations. The accumulated knowledge 
base primarily is derived from discovery-based research (identifying 
new intervention strategies) and community-based research (applying 
various strategies for a limited number of students in community 
settings, such as schools). However, broad-based recommendations for 
practitioners and families cannot proceed without addressing 
population-based questions (e.g., what proportion of all children and 
what type of child will succeed with a particular intervention).
    Important guidelines common to most outcome evaluations are often 
not adequately followed in behavioral intervention studies. For 
example, standardization in applying interventions among participants 
is rare. Instead, programs aimed at reducing severe behavior problems 
are frequently designed individually for each student. This lack of 
standardization limits the ability to make definitive recommendations 
about a particular intervention approach. Similar concerns can be 
applied to functional assessments and outcome data. Functional 
assessments are designed independently by each research group, and 
often without addressing the psychometric properties of the 
instruments. Traditional measures of interrater reliability and test-
retest reliability as well as measures of validity are lacking in most 
functional assessments used in research programs. Outcomes assessed in 
most behavioral studies tend to rely solely on idiosyncratic 
observational data (e.g., frequency or duration of screaming), which 
makes the interpretation of results across studies problematic. Recent 
research on medical interventions uses psychometrically sound rating 
scale data.\2\ However, the exclusive reliance on this form of data 
makes judgments about the educational relevance of these findings 
suspect. Again, these concerns limit the ability to make 
generalizations about the role of functional assessment in intervention 
design and obscure conclusions about outcomes.
---------------------------------------------------------------------------

    \2\ Aman, M.G., De Smedt, G., Derivan, A., Lyons, B., Findling, 
R.L.; Risperidone Disruptive Behavior Study Group (2002). Double-
blind, placebo-controlled study of risperidone for the treatment of 
disruptive behaviors in children with subaverage intelligence. 
American Journal of Psychiatry, 159, 1337-1346.
---------------------------------------------------------------------------

    Adequate information about the characteristics of successful and 
unsuccessful participants is also noticeably absent in this research 
literature. Contributing to this problem is the lack of information 
about selection and attrition in individual

[[Page 47129]]

studies.\3\ Relatively few studies report how they selected 
participants for the research, and few mention whether they used 
procedures to reduce selection bias. Few studies indicate whether 
participants drop out of treatment prematurely or assess potential 
subject characteristics that would predict differential attrition. This 
relative lack of information on important population-based questions 
calls into question the generalizability of otherwise positive 
outcome data on the treatment of behavior problems among students with 
severe behavior problems.\4\
---------------------------------------------------------------------------

    \3\ Durand, V.M. (2002, September). Methodological challenges: 
Single subject designs. Presentation at the National Institutes of 
Health Conference--Research on Psychosocial and Behavioral 
Interventions in Autism: Confronting Methodological Challenges. 
Bethesda, MD.
    \4\ Id.
---------------------------------------------------------------------------

    Population-based educational research (including the use of 
randomized controlled trials--RCTs) is necessary to inform the field 
about the educational efficacy and effectiveness of interventions.\5\ 
Educational efficacy refers to research in which control has been 
exercised by the investigator over sample selection, the delivery of 
the intervention, and the conditions under which the intervention 
occurs.\6\ Efficacy research provides scientific evidence comparing an 
intervention's effects to other interventions, to non-specific 
intervention (e.g., ``treatment as usual''), or to no intervention. 
Randomized experimental designs such as RCTs are recognized as the 
standards by which efficacy is assessed. Guidelines, including the 
CONSORT Statement,\7\ provide criteria for designing, analyzing and 
reporting the findings from randomized experimental designs.
---------------------------------------------------------------------------

    \5\ APA Task Force on Psychological Intervention Guidelines 
(1995). Template for developing guidelines: Interventions for mental 
disorders and psychosocial aspects of physical disorders. 
Washington, DC: American Psychological Association; Chorpita, B.F., 
Barlow, D.H., Albano, A.M., & Daleiden, E.L. (1998). Methodological 
strategies in child clinical trials: Advancing the efficacy and 
effectiveness of psychosocial treatments. Journal of Abnormal Child 
Psychology, 26, 7-16.
    \6\ Hoagwood, K., Hibbs, E., Brent, D., & Jensen, P. (1995). 
Introduction to the special section: Efficacy and effectiveness in 
studies of child and adolescent psychotherapy. Journal of Consulting 
and Clinical Psychology, 63, 683-687.
    \7\ Moher, D., Schulz, K.F., & Altman, D. (2001). The CONSORT 
Statement: Revised recommendations for improving the quality of 
reports of parallel-group randomized trials. Journal of the 
American Medical Association, 285, 1987-1991.
---------------------------------------------------------------------------

    Educational effectiveness refers to research in which a previously 
tested efficacious intervention is examined with a heterogeneous group 
within a more naturalistic setting (e.g., a school) or is provided by 
real-world service practitioners rather than research therapists.\8\ 
Standards exist for identifying efficacious interventions \9\ and there 
are components of positive behavior support that meet the criteria for 
``probably efficacious'' interventions.\10\
---------------------------------------------------------------------------

    \8\ Hoagwood et al., supra note 6, at 683-687.
    \9\ Division 12 Task Force. (1995). Training in and 
dissemination of empirically-validated psychological treatments: 
Report and recommendations. The Clinical Psychologist, 48, 3-23.
    \10\ Kurtz, P.F., Chin, M.D., Huete, J.M., Tarbox, R.S., 
O'Connor, J.T., Paclawskyj, T.R., & Rush, K.S. (2003). Functional 
analysis and treatment of self-injurious behavior in young children: 
A summary of 30 cases. Journal of Applied Behavior Analysis, 36, 
205-219.
---------------------------------------------------------------------------

    Priority: This priority supports rigorous efficacy and 
effectiveness evaluations of empirically based interventions designed 
to reduce severe problem behaviors and promote achievement and positive 
social development among children with severe behavior problems. A 
student with severe behavior problems is defined as a student whose 
behavior significantly impedes his or her own learning or the learning 
of others. Interventions must focus on skill building and address 
social and environmental obstacles.
    Year one will be considered a pilot year in which awardees may work 
out final development issues for the intervention, pilot specific 
outcome measures, refine materials, resolve implementation issues, and 
train school personnel. Such pilot work could, but need not, include a
series of replicated single-case research designs. The implementation 
of the intervention will occur during years two through four.
    Applications must:
    (a) Propose a general design that is a randomized experiment in 
which each site randomly assigns students or classrooms or schools to 
the intervention or comparison group.
    (b) Specify in detail what activities will be conducted in the 
pilot year.
    (c) Propose to implement an intervention that is appropriate for 
children in grades kindergarten through eight.
    (d) Provide a convincing theoretical and empirical rationale that 
the proposed intervention is likely to improve children's outcomes 
compared with the practices used in the comparison conditions. Programs 
must have some preliminary data or ``soft'' evidence supporting the 
potential effectiveness of the intervention or the potential 
effectiveness of the components of the intervention, if the applicant 
is combining components to form a more comprehensive intervention. 
Preliminary or soft evidence means that the data may not be conclusive. 
The preliminary data may have been gathered in such a way as not to 
rule out alternative hypotheses. For example, the investigator might 
have pre-test and post-test data indicating reduction in behavior 
problems or improvement in positive behavior in a school or classroom 
using the intervention, but not have data from a control group. 
Preliminary evidence could also include controlled research not yet 
implemented across multiple sites or by typical intervention agents 
(e.g., teachers), or work using single-subject designs that has not 
yet been evaluated in randomized group designs. Preliminary data may 
include data that 
were obtained separately for specific components of the proposed 
intervention, but not from an evaluation of all of the proposed 
intervention components integrated into one intervention.
    (e) Describe the level and type of behavior support that is in 
place for each school (such as the presence of school-wide discipline 
procedures). This information may be used in analyses to determine if 
differences across schools influence outcomes of the targeted 
intervention. Applicants should describe how the level and type of 
behavior support will be assessed.
    (f) Provide access to students in a minimum of eight schools that 
agree to implement the proposed intervention (if assigned to the 
treatment condition) and to allow data collection to occur as outlined 
in this initiative (whether the school is selected for the treatment or 
comparison condition). Note that schools are not required to belong to 
the same school district.
    Before applicants may receive awards, they must--
    (1) provide written acknowledgement from schools that the schools 
agree to cooperate fully with the random assignment. To facilitate 
random assignment, applicants may offer incentives to schools, such as 
compensation for additional staff time required to cooperate with the 
research effort, and provision of additional resources to enable a 
school to conduct new activities under the project; and
    (2) provide a letter of cooperation from participating schools or 
school districts for the purposes of conducting the research. In the 
letter of cooperation, representatives of the participating schools or 
school districts must clearly indicate and accept the responsibilities 
associated with participating in the study. These responsibilities must 
include (i) an agreement to provide a sufficient number of students to

[[Page 47130]]

participate in the study; (ii) an agreement to the random assignment of 
students or classrooms or schools to the intervention being evaluated 
versus the comparison group (i.e., ``business as usual''); and (iii) an 
agreement to cooperate with school-level data collection (e.g., school 
personnel completing interventions, data from school records indicating 
number of students receiving office referrals).
    (g) Designate a coordinator to manage all aspects of data 
collection, intervention implementation, and interaction with the 
national contractor.
    (h) Assure that they will provide approval from the applicant's 
Institutional Review Board (IRB) for conducting research with human 
subjects in time to begin data collection for schools for the cross-
site study in the spring of year one of the award. Applicants need to 
have approval both for their own site-specific research and for the 
cross-site data collection.
    (i) Propose a sample of sufficient size to detect meaningful 
differences between outcomes in the intervention and control condition, 
taking into account attrition, variability across sites and children, 
and differences in fidelity of implementation of the intervention. 
Initial samples of fewer than 80 participants may be insufficient and 
need to be carefully and persuasively justified. In general, the larger 
the sample, the better. A power analysis should be included (an 
illustrative sample-size sketch follows this list of requirements).
    (j) Include students from a range of settings. These settings may 
include regular classrooms, segregated classrooms, or segregated 
schools, although the percentage of students in segregated settings 
should not exceed 60 percent of the total sample.
    (k) Propose to use the intervention only if the intervention is 
based on the individual needs of the child and will not interfere with 
the services required on a child's individualized education program 
(IEP) or the broad procedural safeguards stated in the IDEA.
    (l) Propose primary settings for evaluating the intervention only 
in classrooms within the child's educational placement. The 
intervention sites must implement the same intervention. The 
intervention must not be implemented in an intervention or comparison 
school prior to the beginning of the evaluation study.
    (m) Describe how age-related effects will be addressed in the 
research if the range of ages of selected students spans across both 
elementary and middle school age students. For example, the 
investigator may randomly assign students into intervention and 
comparison groups using child age stratification so that the groups 
will not differ significantly according to age. Age stratification 
would ensure that the results are attributable to the experimental 
intervention and not to differential maturation. Applicants are encouraged 
to incorporate age as a variable for analysis if there is a theoretical 
or empirical reason to analyze age as a variable in the research. In 
either case, applicants should discuss the rationale.
    (n) Describe how the applicant will handle the flow of participants 
through each stage (a diagram is strongly recommended), and indicate 
how protocol deviations will be decided and handled.
    (o) Provide research designs that permit the identification and 
assessment of factors impacting the fidelity of implementation. 
Mediating and moderating variables that are measured in the 
intervention condition and are also likely to affect outcomes in the 
comparison condition should be measured in the comparison condition 
(e.g., student time-on-task, school personnel experience/time in 
position). Outcome measures of behavior change and skill development 
should include standardized assessments of these outcomes.
    Studies must be planned in such a way that the design contributes 
to a broader body of knowledge beyond the efficacy or effectiveness of 
a particular approach. This requires attending to replicability, 
including treatment manuals, standardized subject selection criteria 
and measures, and explicit links between theory and predictions.
    (p) Describe plans to create an implementation manual for the 
intervention that provides sufficient information for others to be able 
to adopt and replicate the program.
    (q) Propose complementary studies to conduct in conjunction with 
the cross-site program evaluation. Complementary studies provide 
investigators with the opportunity to design studies and collect data 
within the context of the cross-site evaluation. Investigators will be 
responsible for collecting the data for their complementary studies. 
Funding for this data collection must be included in the applicant's 
budget. The complementary research studies may address a range of 
issues related broadly to the efficacy or effectiveness of the 
intervention, the mechanisms by which the intervention results in 
behavioral improvements, the development of assessment tools, or other 
related topics. The complementary research provides an opportunity to 
identify outcomes that, because of data constraints, are not explored 
in the core evaluation or are specific to an individual site. It 
expands the possibilities for multiple measures of the same variable, 
and for the development of new measures. The scientific merit of the 
complementary studies will be considered an important aspect of the 
applicant's proposal.
    (r) Address questions of implementation and how best to train and 
support school personnel in the use of these interventions in their 
classrooms.
    (s) Use psychometrically sound observational, survey, or 
qualitative methodologies as a complement to experimental methodologies 
to assist in the identification of factors that may explain the 
effectiveness or ineffectiveness of the intervention.
    (t) Propose research teams that collectively demonstrate expertise 
in (1) functional behavioral assessments and behavioral intervention, 
(2) implementation and analysis of results from the research design 
that will be employed, and (3) working with school personnel, schools, 
or other education delivery settings that will be employed.
    (u) Provide information documenting the credentials and level of 
preparation required to deliver the intervention (e.g., certified 
teacher, paraprofessional) and the nature and extent of professional 
development, coaching, and monitoring required to effectively implement 
the intervention. Additionally, applicants should document existing 
family or community involvement in behavioral support plans.
    (v) Discuss likely threats to the internal validity of the study 
including attrition, student mobility, existing behavioral intervention 
activities or programs at comparison schools, and potential difficulty 
in implementation.
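    The following minimal sketch illustrates one way the power analysis 
called for in paragraph (i) above might be approached. It is offered 
only as an illustration, not as a required method: it assumes a 
two-group comparison of independent means using a normal approximation, 
with no adjustment for clustering of students within classrooms or 
schools, and the effect size, alpha level, power, and attrition rate 
shown are hypothetical placeholders rather than prescribed values.

import math
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided test of the
    difference between two independent means (normal approximation)."""
    z_alpha = norm.ppf(1.0 - alpha / 2.0)  # critical value, two-sided test
    z_beta = norm.ppf(power)               # quantile giving the desired power
    n = 2.0 * (z_alpha + z_beta) ** 2 / effect_size ** 2
    return math.ceil(n)

def inflate_for_attrition(n, attrition_rate):
    """Enroll enough participants that at least n remain after the
    expected attrition."""
    return math.ceil(n / (1.0 - attrition_rate))

# Hypothetical planning values: a medium standardized effect (d = 0.5),
# alpha = .05, 80 percent power, and 20 percent expected attrition.
target = n_per_group(effect_size=0.5)           # roughly 63 per group
enrolled = inflate_for_attrition(target, 0.20)  # roughly 79 per group
print(f"Analyzable sample needed per group: {target}")
print(f"Enrollment per group after attrition adjustment: {enrolled}")

    If classrooms or schools, rather than individual students, are the 
unit of random assignment, these per-group figures would need to be 
inflated further by a design effect reflecting the intraclass 
correlation among students within the same unit.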
    Projects must include a plan to:
    (a) Work with the Office of Special Education Programs (OSEP) and a 
national evaluation coordination contractor, funded through a separate 
competition, to carry out randomized experiments of behavioral 
interventions. The national evaluation coordination contractor, working 
with OSEP and the recipients of awards under this competition, will 
coordinate the collection of a core set of measures following 
consistent protocols across sites so that comparable outcome data 
(including measures of positive and negative behaviors) will be 
obtained across sites. Details concerning the responsibilities of each 
awardee vis-a-vis the national contractor are provided in the sections 
below.
    (b) Within a month of receiving the award, meet with the Department 
and

[[Page 47131]]

the national evaluation coordinator in Washington, DC to agree upon 
common procedures that will permit linking of the funded studies. This 
linking will require agreement on a set of common identification and 
outcome measures, collected by all projects, that will support the 
evaluation of findings across studies and the generalization of those 
findings. Projects must also plan for an additional meeting during year 
one and two meetings (each year) in years two through four, for cross 
project activities with Federal officials, the national evaluation 
coordination contractor, and the other awardees funded in this 
competition. In addition, projects must budget for a two-day Project 
Directors' meeting in Washington, DC during each year of the project.
    (c) If the intervention is effective, deliver training on 
implementation to the control schools in the fourth year. For 
comparison schools, intervention training must not be provided to the 
school staff until the summer of year four, once the final cross-site 
data collection has been completed.
    (d) Obtain active informed consent of parents of children 
participating in the study and of all school staff from whom data will 
be collected.
    (e) Provide all necessary materials, training, and professional 
development to school personnel to implement the intervention to be 
evaluated in the intervention schools.
    (f) Work with the national evaluation coordination contractor for 
the collection of cross-site data, in coordination with any local data 
collection activities. The collection (including timing) of the cross-
site data will take precedence over any data collection activities for 
the complementary studies. Cross-site data must be collected from 
school staff and sent to the contractor in a timely fashion. There will 
be regular conference calls with OSEP staff, the national contractor, 
and each awardee to discuss, plan, and coordinate evaluation activities 
at each site.
    Projects must provide data to the national evaluation coordination 
contractor from each site in the fall and spring of years two and 
three, and the spring of year four of the project period. The core set 
of evaluation data provided to the national contractor will include 
assessments of the function of the students' behavior problems, changes 
in targeted behaviors, measures of intervention implementation, 
identified pro-social behaviors, and measures of academic achievement. 
The core evaluation data will be collected by the contractor and the 
individual awardees beginning in the first year of the implementation 
of the intervention and continuing through the second and third years of 
the implementation. (Note: applicants are not limited to collecting 
data through shared assessment procedures.)
    (g) Collaborate and participate with OSEP staff in the cross-site 
study activities, including (1) the design and implementation of the 
cross-site research; (2) the development of a research protocol for IRB 
review by all collaborating institutions; (3) the analysis, 
presentation, and publication of the cross-site study findings; and (4) 
the monitoring and evaluation of the scientific and operational 
accomplishments of the project through conference calls, site visits, 
and review of technical reports.
    (h) Assess the extent to which treatment outcomes produce 
meaningful changes in behavior. Clinical significance must be assessed 
for all outcomes (an illustrative sketch follows this list).
    (i) Conduct an economic analysis of the intervention (i.e., one of 
the outcome measures that must be collected by the awardee is the cost 
to conduct the intervention so that the cost-effectiveness of the 
intervention may be determined); and
    (j) Form an advisory group to provide both technical and 
substantive guidance and feedback on project activities.
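    The clinical significance assessment in paragraph (h) and the 
economic analysis in paragraph (i) above can be illustrated with 
simple, well-established calculations. The sketch below is a minimal 
illustration rather than a required method: it computes a 
Jacobson-Truax reliable change index for a hypothetical pre/post 
problem-behavior rating and a simple cost-effectiveness ratio (cost 
per student per unit of incremental behavioral improvement). All 
numeric values, variable names, and the choice of rating scale are 
hypothetical; applicants would substitute the instruments, reliability 
estimates, and cost categories from their own designs.

import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax reliable change index: the pre-to-post change
    divided by the standard error of the difference score. |RCI| > 1.96
    suggests the change is unlikely to reflect measurement error alone."""
    se_measurement = sd_pre * math.sqrt(1.0 - reliability)
    se_difference = math.sqrt(2.0) * se_measurement
    return (post - pre) / se_difference

def cost_per_unit_improvement(total_cost, n_students,
                              mean_reduction_intervention,
                              mean_reduction_comparison):
    """Simple cost-effectiveness ratio: intervention cost per student
    divided by the incremental reduction in problem behavior relative
    to the comparison condition (in rating-scale points)."""
    incremental_effect = (mean_reduction_intervention
                          - mean_reduction_comparison)
    return (total_cost / n_students) / incremental_effect

# Hypothetical values for illustration only (a problem-behavior rating
# scale on which lower scores are better).
rci = reliable_change_index(pre=68.0, post=55.0, sd_pre=9.0,
                            reliability=0.85)
ratio = cost_per_unit_improvement(total_cost=250_000, n_students=120,
                                  mean_reduction_intervention=12.0,
                                  mean_reduction_comparison=3.0)
print(f"Reliable change index (negative indicates reduction): {rci:.2f}")
print(f"Cost per student per point of incremental reduction: ${ratio:,.2f}")

    A fuller economic analysis would itemize the costs of the 
intervention (for example, training, materials, coaching, and staff 
time) and might express effects in standardized or other educationally 
meaningful units.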
    Waiver of Proposed Rulemaking: Under the Administrative Procedure 
Act (5 U.S.C. 553) the Department generally offers interested parties 
the opportunity to comment on proposed priorities. However, section 
661(e)(2) of the IDEA makes the public comment requirements 
inapplicable to the priority in this notice.
    Program Authority: 20 U.S.C. 1461 and 1472.
    Applicable Regulations: The Education Department General 
Administrative Regulations (EDGAR) in 34 CFR parts 74, 75, 77, 80, 81, 
82, 84, 85, 86, 97, 98, and 99.

    Note: The regulations in 34 CFR part 86 apply to IHEs only.

II. Award Information

    Type of Award: Cooperative agreements.
    Estimated Available Funds: $4,300,000.
    Estimated Average Size of Awards: $1,075,000.
    Maximum Award: We will reject any application that proposes a 
budget exceeding $1,075,000 for a single budget period of 12 months. 
The Assistant Secretary for Special Education and Rehabilitative 
Services may change the maximum amount through a notice published in 
the Federal Register.
    Estimated Number of Awards: 4.

    Note: The Department is not bound by any estimates in this 
notice.

    Project Period: Up to 48 months.

III. Eligibility Information

    1. Eligible Applicants: SEAs; LEAs; IHEs; other public agencies; 
nonprofit private organizations; outlying areas; freely associated 
States; and Indian tribes or tribal organizations.
    2. Cost Sharing or Matching: This competition does not involve cost 
sharing or matching.
    3. Other: General Requirements--(a) The projects funded under this 
competition must make positive efforts to employ and advance in 
employment qualified individuals with disabilities (see section 606 of 
the IDEA).
    (b) Applicants and grant recipients funded under this competition 
must involve individuals with disabilities or parents of individuals 
with disabilities in planning, implementing, and evaluating the 
projects (see section 661(f)(1)(A) of the IDEA).

IV. Application and Submission Information

    1. Address to Request Application Package: Education Publications 
Center (ED Pubs), PO Box 1398, Jessup, MD 20794-1398. Telephone (toll 
free): 1-877-433-7827. FAX: (301) 470-1244. If you use a 
telecommunications device for the deaf (TDD), you may call (toll free): 
1-877-576-7734.
    You may also contact ED Pubs at its Web site: 
http://www.ed.gov/pubs/edpubs.html or you may contact ED Pubs at its 
e-mail address: edpubs@inet.ed.gov.

    If you request an application from ED Pubs, be sure to identify 
this competition as follows: CFDA number 84.324P.
    Individuals with disabilities may obtain a copy of the application 
package in an alternative format (e.g., Braille, large print, 
audiotape, or computer diskette) by contacting the Grants and Contracts 
Services Team listed under FOR FURTHER INFORMATION CONTACT in section 
VII of this notice.
    2. Content and Form of Application Submission: Requirements 
concerning the content of an application, together with the forms you 
must submit, are in the application package for this competition.
    Page Limit: The application narrative (Part III of the application) 
is where you, the applicant, address the selection criteria that 
reviewers use to evaluate your application. You must limit Part III to 
the equivalent of no more than 70 pages, using the following standards:

[[Page 47132]]

     A ``page'' is 8.5" x 11", on one side only, with 1" margins at 
the top, bottom, and both sides.
     Double space (no more than three lines per vertical inch) 
all text in the application narrative, including titles, headings, 
footnotes, quotations, references, and captions, as well as all text in 
charts, tables, figures, and graphs.
     Use a font that is either 12-point or larger or no smaller 
than 10 pitch (characters per inch).
    The page limit does not apply to Part I, the cover sheet; Part II, 
the budget section, including the narrative budget justification; Part 
IV, the assurances and certifications; or the one-page abstract, the 
resumes, the bibliography, the references, or the letters of support. 
However, you must include all of the application narrative in Part III.
    We will reject your application if--
     You apply these standards and exceed the page limit; or
     You apply other standards and exceed the equivalent of the 
page limit.
    3. Submission Dates and Times:
    Applications Available: August 4, 2004.
    Deadline for Transmittal of Applications: September 10, 2004.
    We do not consider an application that does not comply with the 
deadline requirements.
    Applications for grants under this competition may be submitted by 
mail or hand delivery (including a commercial carrier or courier 
service), or electronically using the Electronic Grant Application 
System (e-Application) available through the Department's e-GRANTS 
system. For information (including dates and times) about how to submit 
your application by mail or hand delivery, or electronically, please 
refer to Section IV. 6. Procedures for Submitting Applications in this 
notice.
    4. Intergovernmental Review: This program is not subject to 
Executive Order 12372 and the regulations in 34 CFR part 79.
    5. Funding Restrictions: We reference regulations outlining funding 
restrictions in the Applicable Regulations section of this notice.
    6. Procedures for Submitting Applications: Applications for grants 
under this competition may be submitted electronically or in paper 
format by mail or hand delivery.
    a. Electronic Submission of Applications.
    If you submit your application to us electronically, you must use 
e-Application available through the Department's e-GRANTS system. The 
e-GRANTS system is accessible through its portal page at: http://e-grants.ed.gov
.

    If you use e-Application, you will be entering data online while 
completing your application. You may not e-mail an electronic copy of a 
grant application to us. The data you enter online will be saved into a 
database.
    If you participate in e-Application, please note the following:
     Your participation is voluntary.
     You must submit your grant application electronically 
through the Internet using the software provided on the e-Grants Web 
site (http://e-grants.ed.gov) by 4:30 p.m., Washington, DC time, on the 
application deadline date. The regular hours of operation of the 
e-Grants Web site are 6 a.m. Monday until 7 p.m. Wednesday, and 6 a.m. 
Thursday until midnight Saturday, Washington, DC time. Please note that 
the system is unavailable for maintenance on Sundays and after 7 p.m., 
Washington, DC time, on Wednesdays. Any modifications to these hours 
are posted on the e-Grants Web site. We strongly recommend that you do 
not wait until the application deadline date to initiate an e-
Application package.
     You will not receive additional point value because you 
submit your application in electronic format, nor will we penalize you 
if you submit your application in paper format.
     You must submit all documents electronically, including 
the Application for Federal Education Assistance (ED 424), Budget 
Information--Non-Construction Programs (ED 524), and all necessary 
assurances and certifications.
     Your e-Application must comply with any page limit 
requirements described in this notice.
     After you electronically submit your application, you will 
receive an automatic acknowledgement, which will include a PR/Award 
number (an identifying number unique to your application).
     Within three working days after submitting your electronic 
application, fax a signed copy of the Application for Federal Education 
Assistance (ED 424) to the Application Control Center after following 
these steps:
    1. Print ED 424 from e-Application.
    2. The applicant's Authorizing Representative must sign this form.
    3. Place the PR/Award number in the upper right hand corner of the 
hard copy signature page of the ED 424.
    4. Fax the signed ED 424 to the Application Control Center at (202) 
245-6272.
     We may request that you give us original signatures on 
other forms at a later date.
    Application Deadline Date Extension in Case of System 
Unavailability: If you are prevented from submitting your application 
on the application deadline date because the e-Application system is 
unavailable, we will grant you an extension of one business day in 
order to transmit your application electronically, by mail, or by hand 
delivery. We will grant this extension if--
    1. You are a registered user of e-Application and you have 
initiated an e-Application for this competition; and
    2. (a) The e-Application system is unavailable for 60 minutes or 
more between the hours of 8:30 a.m. and 3:30 p.m., Washington, DC time, 
on the application deadline date; or
    (b) The e-Application system is unavailable for any period of time 
during the last hour of operation (that is, for any period of time 
between 3:30 p.m. and 4:30 p.m., Washington, DC time) on the 
application deadline date.
    We must acknowledge and confirm these periods of unavailability 
before granting you an extension. To request this extension or to 
confirm our acknowledgement of any system unavailability, you may 
contact either (1) the person listed elsewhere in this notice under For 
Further Information Contact (see VII. Agency Contact) or (2) the e-
GRANTS help desk at 1-888-336-8930.
    You may access the electronic grant application for the Special 
Education--Research and Innovation to Improve Services and Results for 
Children with Disabilities--Evidence-Based Interventions for Severe 
Behavior Problems competition at: http://e-grants.ed.gov.

    b. Submission of Paper Applications by Mail.
    If you submit your application in paper format by mail (through the 
U.S. Postal Service or a commercial carrier), you must send the 
original and two copies of your application on or before the 
application deadline date to the following address: U.S. Department of 
Education, Application Control Center, Attention: (CFDA Number 
84.324P), 400 Maryland Avenue, SW., Washington, DC 20202.
    You must show proof of mailing consisting of one of the following:
    1. A legibly dated U.S. Postal Service Postmark;
    2. A legible mail receipt with the date of mailing stamped by the 
U.S. Postal Service;
    3. A dated shipping label, invoice, or receipt from a commercial 
carrier; or
    4. Any other proof of mailing acceptable to the U.S. Secretary of 
Education.
    If you mail your application through the U.S. Postal Service, we do 
not

[[Page 47133]]

accept either of the following as proof of mailing:
    1. A private metered postmark, or
    2. A mail receipt that is not dated by the U.S. Postal Service.
    If your application is postmarked after the application deadline 
date, we will notify you that we will not consider the application.

    Note: Applicants should note that the U.S. Postal Service does 
not uniformly provide a dated postmark. Before relying on this 
method, you should check with your local post office.

    c. Submission of Paper Applications by Hand Delivery.
    If you submit your application in paper format by hand delivery, 
you (or a courier service) must deliver the original and two copies of 
your application on or before the application deadline date to the 
following address: U.S. Department of Education, Application Control 
Center, Attention: (CFDA Number 84.324P), 550 12th Street, SW., Room 
7041, Potomac Center Plaza, Washington, DC 20202-4260.
    The Application Control Center accepts deliveries daily between 8 
a.m. and 4:30 p.m., Washington, DC time, except Saturdays, Sundays and 
Federal holidays. A person delivering an application must show 
identification to enter the building.
    Note for Mail or Hand Delivery of Paper Applications: If you mail 
or hand deliver your application to the Department:
    1. You must indicate on the envelope and--if not provided by the 
Department--in Item 4 of the Application for Federal Education 
Assistance (ED 424 (exp. 11/30/2004)) the CFDA number--and suffix 
letter, if any--of the competition under which you are submitting your 
application.
    2. The Application Control Center will mail a Grant Application 
Receipt Acknowledgment to you. If you do not receive the notification 
of application receipt within 15 days from the mailing of your 
application, you should call the U.S. Department of Education 
Application Control Center at (202) 245-6288.

V. Application Review Information

    Selection Criteria: The selection criteria for this competition are 
listed in 34 CFR 75.210 of EDGAR. The specific selection criteria to be 
used for this competition are in the application package.

VI. Award Administration Information

    1. Award Notices: If your application is successful, we notify your 
U.S. Representative and U.S. Senators and send you a Grant Award 
Notification (GAN). We may also notify you informally.
    If your application is not evaluated or not selected for funding, 
we notify you.
    2. Administrative and National Policy Requirements: We identify 
administrative and national policy requirements in the application 
package and reference these and other requirements in the Applicable 
Regulations section of this notice.
    We reference the regulations outlining the terms and conditions of 
an award in the Applicable Regulations section of this notice and 
include these and other specific conditions in the GAN. The GAN also 
incorporates your approved application as part of your binding 
commitments under the grant.
    3. Reporting: At the end of your project period, you must submit a 
final performance report, including financial information, as directed 
by the Secretary. If you receive a multi-year award, you must submit an 
annual performance report that provides the most current performance 
and financial expenditure information as specified by the Secretary in 
34 CFR 75.118.
    4. Performance Measures: Under the Government Performance and 
Results Act (GPRA), the Department is currently developing indicators 
and measures that will yield information on various aspects of the 
quality of the Research and Innovation to Improve Services and Results 
for Children with Disabilities program. Included in these indicators 
and measures will be those that assess the quality and relevance of 
newly funded research projects. Two indicators will address the quality 
of new projects. First, an external panel of eminent senior scientists 
will review the quality of a randomly selected sample of newly funded 
research applications, and the percentage of new projects that are 
deemed to be of high quality will be determined. Second, because much 
of the Department's work focuses on questions of effectiveness, newly 
funded applications will be evaluated to identify those that address 
causal questions and then to determine what percentage of those 
projects use randomized field trials to answer the causal questions. To 
evaluate the relevance of newly funded research projects, a panel of 
experienced education practitioners and administrators will review 
descriptions of a randomly selected sample of newly funded projects and 
rate the degree to which the projects are relevant to practice.
    Other indicators and measures are still under development in areas 
such as the quality of project products and long-term impact. Data on 
these measures will be collected from the projects funded under this 
competition. Awardees will also be required to report information on 
their projects' performance in annual reports to the Department (EDGAR, 
34 CFR 75.590).
    We will notify grantees of the performance measures once they are 
developed.

VII. Agency Contact

FOR FURTHER INFORMATION CONTACT: Renee Bradley, U.S. Department of 
Education, 400 Maryland Avenue, SW., room 4105, Potomac Center Plaza, 
Washington, DC 20202-2600. Telephone: (202) 245-7277.
    If you use a telecommunications device for the deaf (TDD), you may 
call the Federal Information Relay Service (FIRS) at 1-800-877-8339.
    Individuals with disabilities may obtain this document in an 
alternative format (e.g., Braille, large print, audiotape, or computer 
diskette) on request by contacting the following office: The Grants and 
Contracts Services Team, U.S. Department of Education, 400 Maryland 
Avenue, SW., room 5075, Potomac Center Plaza, Washington, DC 20202-
2550. Telephone: (202) 245-7363.

VIII. Other Information

    Electronic Access to This Document: You may view this document, as 
well as all other documents of this Department published in the Federal 
Register, in text or Adobe Portable Document Format (PDF) on the 
Internet at the following site: http://www.ed.gov/news/fedregister.

    To use PDF you must have Adobe Acrobat Reader, which is available 
free at this site. If you have questions about using PDF, call the U.S. 
Government Printing Office (GPO), toll free, at 1-888-293-6498; or in 
the Washington, DC, area at (202) 512-1530.

    Note: The official version of this document is the document 
published in the Federal Register. Free Internet access to the 
official edition of the Federal Register and the Code of Federal 
Regulations is available on GPO Access at: http://www.gpoaccess.gov/nara/index.html




    Dated: July 30, 2004.
Troy R. Justesen,
Acting Deputy Assistant Secretary for Special Education and 
Rehabilitative Services.
[FR Doc. 04-17739 Filed 8-3-04; 8:45 am]

BILLING CODE 4000-01-P