ExpectMore.gov


Detailed Information on the
IDEA Special Education - Parent Information Centers Assessment

Program Code 10002098
Program Title IDEA Special Education - Parent Information Centers
Department Name Department of Education
Agency/Bureau Name Office of Special Education and Rehabilitative Services
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 25%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2008 $26
FY2009 $27

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Develop a strategy for evaluating the impact and effectiveness of program activities.

Action taken, but not completed The Office of Special Education Programs has begun to work on a plan for evaluating programs under Part D of the Individuals with Disabilities Education Act.
2006

Develop a baseline and targets for the program's efficiency measure.

Action taken, but not completed The program efficiency measure is: "Cost per output, by category, weighted by the expert panel quality rating." The Department has collected baseline data for 2006 for this measure. However, it believes that more study is needed of the methodology and data quality before developing targets from this baseline.
2008

Assess the data and methodologies used for annual, long-term, and efficiency measures to determine their reliability as indicators of performance.

No action taken All Parent Information Centers should have at least one year of data by October 2008. The purpose of this action is to determine whether the data and methodologies used for these measures produce reliable indicators of performance.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Develop baselines and targets for three new measures that have been adopted for the Parent Information Centers program.

Completed Three new performance measures have been developed. Baseline data for these measures became available in November of 2006. However, these baseline data are of very low quality. Targets starting with 2007 were established at that time, but are expected to be revised.
2005

Develop baselines and targets for the program's 2 long-term measures.

Completed The long-term measures focus on 6 targeted areas: assessment, literacy, behavior, instructional strategies, early intervention, and inclusive practices. Data for establishing a 2006 baseline and targets for measures on the extent to which parent centers have enhanced the knowledge of parents they have served, and the extent to which parents who have been served promote scientifically- or evidence-based practices, became available in the fall of 2008.
2007

Conduct a data-based analysis of the distribution of funds among parent centers to determine the most effective allocation of funds.

Completed A data-based analysis of the distribution of funds among states was conducted, analyzing each state's parent information center needs relative to the number of parents served, the poverty level, and the population dispersion in the state. This information is being used to better align funding for the centers with their relative needs, using an $824,000 increase in the appropriation for the program provided for 2008.

Program Performance Measures

Term Type  
Annual Output

Measure: The percentage of materials used by PTI projects that are deemed to be of high quality.


Explanation: Data for the measure will be gathered through an expert panel review of materials used by projects.

Year Target Actual
2005 NA NA
2006 Historical 40%
2007 Baseline 70%
2008 72% 58%
2009 73%
2010 74%
Annual Output

Measure: The percentage of products and services deemed to be of high relevance to educational and early intervention policy or practice by an independent review panel of qualified members of the PTI target audiences.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2005 NA NA
2006 Historical 47%
2007 Baseline 96%
2008 96% 95%
2009 96%
2010 96%
Annual Outcome

Measure: The percentage of all products and services deemed to be useful by target audiences to improve educational or early intervention policy or practice.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2005 NA NA
2006 Historical 27%
2007 Baseline 96%
2008 96% 95%
2009 96%
2010 96%
Annual Efficiency

Measure: The cost per output, by category, weighted by the expert panel quality rating.


Explanation: Data for the measure will be gathered through an expert panel review.

Year Target Actual
2006 NA NA
2007 Baseline $2.24
2008 $1.49
2009
Long-term Outcome

Measure: The percentage of parents receiving PTI services who promote scientifically- or evidence-based practices for their infants, toddlers, children and youth.


Explanation: Data for the measure will be gathered through a sample survey of parents receiving Parent Information Centers services. Data will be collected biennially.

Year Target Actual
2006 Historical 69%
2008 71% 73%
2010 73%
2012 75%
2014 77%
Long-term Output

Measure: The percentage of parents receiving PTI services who report enhanced knowledge of IDEA rights and responsibilities.


Explanation: Data for the measure will be gathered through a sample survey of parents receiving Parent Information Center services. Data will be collected biennially.

Year Target Actual
2006 Baseline 85%
2008 85%
2010 87%
2012 89%
2014 90%

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the program is to provide training and information to parents of children with disabilities on their rights and protections under the Individuals with Disabilities Education Act (IDEA) so that they can develop the skills necessary to participate effectively in planning and decisionmaking relating to early intervention, educational and transitional services, and in systemic-change activities. Centers also help parents understand the nature of their children's disabilities and needs so that they can help improve their children's education and life outcomes.

Evidence: IDEA Part D sections 681(b)(1), 682(b) and 683(b).

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: Parental involvement and advocacy are critically important to the development and education of children with disabilities. Because IDEA services, procedures and protections are complicated, parents need specialized skills and knowledge to address issues where training and information are not readily available from other sources.

Evidence: Parents are automatic participants in the development of their children's Individualized Education Programs (IEPs) and Individualized Family Service Plans (IFSPs) and need expertise in a wide range of areas ranging from the evaluation of children's needs to the identification of appropriate services and educational goals. (See IDEA Part B section 614(d) and Part C section 636).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The program supports (1) parent training and information centers and (2) community parent resource centers. Parent training and information centers generally serve whole States or large portions of States. Community parent resource centers are in smaller, local areas within the larger areas covered by parent training and information centers. They focus especially on the needs of underserved parents. There is no significant overlap with other Federal programs, though many States provide additional support for parent information activities. Parents may also receive assistance relative to specific disabilities from non-profit associations focused on the needs of children with those particular disabilities (such as the ARC for children with mental retardation).

Evidence: IDEA Part D sections 682(b)(1) and 683(a).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: We do not have evidence that another approach, mechanism, or infrastructure would be more efficient or effective to achieve the program's purposes.

Evidence:  

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: Awards for both parent training and information centers and community parent resource centers must be made to parent organizations as defined in IDEA, whose members are highly motivated and focused on addressing parent needs. In addition, the program supports technical assistance activities so that centers have the knowledge and capacity to provide quality services to parents. However, the community parent resource centers are able to serve only a small percentage of underserved parents. Also, while community parent resource centers may be effective in reaching their target populations of underserved parents, these populations are typically relatively small. Compared with parent training and information centers, the cost of serving these parents is relatively high.

Evidence: Definition of "parent organization" is in IDEA Part D Section 682(g). Technical assistance is required in Section 684. For 2003, $2.6 million (10%) of total program funds was for technical assistance. In 2003, Education provided $20.7 million for 72 parent training and information centers and $3 million for 30 community resource centers.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program does not have meaningful long-term measures. The Department is also working with OMB on developing an appropriate efficiency measure.

Evidence: Lack of long-term measures.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program does not have meaningful long-term measures or ambitious targets.

Evidence: Lack of long-term goals.

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: Program staff recently participated in Department-wide meetings to develop common measures for assessing the performance of ED technical assistance programs. The data for measures generated through these meetings will be collected in 2006. Implementation includes developing a methodology for convening panels of scientists and practitioners to review products and project designs, and developing an instrument for obtaining data from target audiences on the usefulness of ED TA products and services.

Evidence: Draft Common Measures for Education Technical Assistance Programs.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: OSEP still needs to develop baselines and targets for these annual measures.

Evidence: Draft Common Measures for Education Technical Assistance Programs.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Parent organizations are typically highly motivated and focused on addressing parent needs. In the past, they have worked toward OSEP's performance goals for all Part D programs and are likely to be committed to the new annual goals.

Evidence: Parent Centers Helping Families Data Outcomes (1997 - 2002) Report.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: There has not been an independent, high quality evaluation of the program, but OSEP is planning one in 2004 or 2005. Nevertheless, the program monitors and assesses grantee performance and for many years, grantees have systematically collected a wide range of useful program management data. While the objectivity of these self-assessments may be questionable, the information provided does help OSEP staff support program improvements.

Evidence: Parent Centers Helping Families Data Outcomes (1997 - 2002) Report.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Budget request is not tied to either annual or long-term goals.

Evidence: Department of Education Fiscal Year 2005 Budget.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Although OSEP has been working to address its strategic planning deficiencies, meaningful actions to eliminate such deficiencies have not yet been implemented. As OSEP works to address planning deficiencies, it is placing particular emphasis on "adopting a limited number of specific, ambitious long-term performance goals and a limited number of annual performance goals."

Evidence: The program is participating in Education's Technical Assistance common measures group but more work still needs to be done to correct strategic planning deficiencies.

NO 0%
Section 2 - Strategic Planning Score 25%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: All parent center grantees are required to submit Annual Performance Reports and Final Reports, and OSEP staff work closely with them to review these reports. A team of reviewers also conducts an in-depth assessment of the Technical Assistance grantee in its second year. However, it is unclear how information gathered in these reports translates into improved performance and accountability for grantees, or how it is linked to management initiatives to better allocate resources or adjust program priorities.

Evidence: 3+2 evaluation of the Alliance Project (OSEP's TA contractor for parent centers) and grantee annual performance reports.

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence: The President's Management Agenda scorecard (Human Capital and Budget & Performance Integration initiatives) notes ED's efforts to improve accountability. The Department's Discretionary Grants Improvement Team (DiGIT) recommendations indicate that ED is reviewing its grant policies and regulations.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: OSEP successfully obligates funds by the end of each fiscal year and on a timely basis. Funds are spent for the intended purposes and no improper uses of funds have been identified.

Evidence: Finance reports, notices of competitions, lists of funded applications.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: To date, the Department has not established procedures for this program to measure and achieve efficiencies in program operations. However, ED is in the process of developing its competitive sourcing Green Plan, and is working to improve the efficiency of its grantmaking activities. The Department has also established a strengthened Investment Review Board to review and approve information technology purchases agency-wide.

Evidence: Department Investment Review Board materials. ED's Discretionary Grants Improvement Team (DiGIT) recommendations.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Education has convened a technical assistance working group to better coordinate similar TA&D programs in OSEP, IES, the What Works Clearinghouse, and elsewhere. All programs will collect common annual performance measures starting in 2006 on program quality, relevance, and utility. Also, OSEP is working to ensure that its various TA&D project grantees are collaborating with each other on program activities and strategies in order to reduce duplication.

Evidence: Application notices. www.ed.gov/legislation/FedRegister/announcements/2004-2/042104i.pdf. Example Web site for Kentucky can be seen at www.kyspin.com/. Web sites for all parent centers can be accessed through the National Dissemination Center for Children with Disabilities at www.nichey.org. Select "State Resources", Choose a State, "Parent Organizations".

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Auditors have not reported internal control weaknesses. The Department has a system for identifying excessive draw downs, and can put individual grantees on probation where draw downs need to be approved.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has taken steps to address some of its management deficiencies. For example, the President's Commission on Special Education identified the "peer review" process as an area of weakness in current program management practices. In response, OSEP has provided Internet training on the peer review process. However, OSEP's inability to produce a Comprehensive Plan as required by the IDEA Amendments of 1997 for this and other Part D National Activities program remains a major problem.

Evidence:  

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: New awards are based on a clear competitive process.

Evidence: OSEP application notices.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: OSEP reviews awardee performance through annual performance reports and final reports, and holds annual meetings with project officers in Washington. When necessary, OSEP staff also conduct site visits to review grantee activities.

Evidence: Annual performance reports and OSEP's "3+2" evaluation for the project providing TA to centers.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Performance data is collected annually from awardees. However, these data are not readily available to the public in a transparent and meaningful manner. Education is developing a department-wide approach to improve the way programs provide performance information to the public. In 2004, Education will conduct pilots with selected programs to assess effective and efficient strategies to share meaningful and transparent information.

Evidence: Lack of transparent data for the public.

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The program does not have meaningful long-term measures.

Evidence: Lack of long-term measures.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program still needs to develop annual performance goals.

Evidence: Lack of annual performance goals.

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department is working with OMB on developing an appropriate efficiency measure for this and other Education TA&D programs.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: We have no systematic evidence to compare OSEP's Parent Centers program with other TA&D programs. However, the Department is currently working with OMB to develop a limited number of cross cutting performance indicators that may allow for such comparisons in the future.

Evidence:  

NO 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: There has not been an independent evaluation of this program, but OSEP is planning an evaluation of all of its Part D National Activities in 2004 or 2005.

Evidence: Lack of an independent program evaluation.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 01092009.2004FALL