
Developmental Disabilities Program Independent Evaluation (DDPIE) Project

Task 7: Prepare Reports

Annual Report – Year Two

Contract No. 233-02-0087
October 10, 2007

Prepared for:
Administration on Developmental Disabilities
200 Independence Avenue, SW
Washington, DC 20201
Prepared by:
WESTAT
1650 Research Boulevard
Rockville, Maryland

 


TABLE OF CONTENTS

1.  INTRODUCTION
2.  HIGHLIGHTS OF MAJOR ACTIVITIES
    2.1  Expanding Our Understanding of DD Network Programs and Collaboration
    2.2  Developing a Framework for Measurement Matrices and Performance Standards
    2.3  Obtaining Feedback on Draft Documents and Process for Development
         2.3.1  DD Network Programs
         2.3.2  ADD Feedback
         2.3.3  Advisory Panel Feedback
         2.3.4  Nature of Feedback on ADD Independent Evaluation
    2.4  Developing the Process and Materials for the Pilot Study
3.  BARRIERS AND PROBLEMS ENCOUNTERED/SOLUTIONS
    3.1  Continued DD Network Program Concerns
    3.2  Timing of Performance Standards Development
    3.3  Time Schedule and Cost Implications
4.  ACTIVITIES AND DELIVERABLES PLANNED FOR NEXT REPORTING YEAR
    4.1  Conducting a Pilot Study
    4.2  Revisions of Benchmarks/Indicators
    4.3  Final Validation Meeting
    4.4  Final Report

LIST OF APPENDIXES

A  Program, Working Group Members, and State
B  Advisory Panel Members and Affiliation
C  Summary of Comments on ADD Independent Evaluation
D  Data Collection Overviews

LIST OF TABLES

Table 2-1  Meetings to obtain input and feedback on draft documents, September 30, 2006 - September 29, 2007
Table 2-2  State programs participating in scheduled telephone calls

LIST OF EXHIBITS

Exhibit 2-1  Summary of general comments provided by all three DD Network programs
Exhibit 4-1  Exclusion and inclusion criteria for selecting programs for the pilot study

LIST OF FIGURES

Figure 3-1  DDPIE Project, revised schedule, July 1, 2007 - September 29, 2008

1.  INTRODUCTION

The Developmental Disabilities Program Independent Evaluation (DDPIE) project is an independent evaluation of the impact of the state Developmental Disabilities (DD) Network programs -- State Councils on Developmental Disabilities (SCDDs or DD Councils), State Protection and Advocacy Systems (P&As), and the National Network of University Centers for Excellence in Developmental Disabilities Education, Research, and Service (University Centers or UCEDDs) -- on the lives of people with developmental disabilities and their families. The project is funded by the Administration on Developmental Disabilities (ADD), which is part of the Administration for Children and Families in the U.S. Department of Health and Human Services.

The DDPIE project is divided into two phases: development and testing, and a full-scale national evaluation. Phase I consists of two activities: (1) the development of tools for determining the impact of DD Network programs on individuals with developmental disabilities and their families; and (2) a pilot study to test the accuracy, feasibility, and utility of these tools. Phase II will involve a full-scale independent evaluation of the three DD Network programs, and of collaboration among them, using the tools designed in Phase I. Westat has been engaged by ADD to conduct Phase I of the independent evaluation.

Phase I is divided into seven tasks: (1) Attend meetings, (2) Conduct initial activities, (3) Develop study protocols, (4) Develop measurement matrices, (5) Implement DDPIE in states, (6) Synthesize findings, and (7) Prepare reports. This report is the Annual Report for Year Two of the DDPIE project under Task 7.

In addition to this Introduction (Chapter 1), the report provides ADD with highlights of the major activities that took place in the past year (Chapter 2), barriers and problems encountered and their solutions (Chapter 3), and activities and deliverables planned for the next reporting year of the project (Chapter 4).


2.  HIGHLIGHTS OF MAJOR ACTIVITIES

The major activities during Year 1 of the Developmental Disabilities Program Independent Evaluation (DDPIE) project (covering the period September 30, 2005 - September 29, 2006) included becoming familiar with the Developmental Disabilities (DD) Network programs through the review of background material and interviews with key informants, establishing and meeting with an Advisory Panel, and working with DD Network program Working Groups and a collaboration Working Group to begin developing draft measurement matrices.

Year 2 of the project, which covered the period September 30, 2006 - September 29, 2007, consisted of expanding our understanding of the three DD Network programs and collaboration; developing a framework for measurement matrices and performance standards; drafting documents for discussion and obtaining input and feedback on draft documents from Working Groups, DD Network programs, the Advisory Panel, and the Administration on Developmental Disabilities (ADD); and beginning to develop the methodology and materials for a pilot study. Year 2 activities are described below.

2.1  Expanding Our Understanding of DD Network Programs and Collaboration

During Year 1, Westat read background documents, conducted background interviews, and met with DD Network program Working Groups and the Advisory Panel. In Year 2, we continued to meet with Working Groups and the Advisory Panel to further expand our understanding of the DD Network Programs and collaboration (Table 2-1). These meetings took place in person and by telephone. More will be said about these meetings in section 2.3.

2.2  Developing a Framework for Measurement Matrices and Performance Standards

By the end of Year 1, Westat had developed a framework for measurement matrices (with assistance from Working Groups). The framework first identifies key functions for each DD Network program. Within each key function, we worked to identify benchmarks, indicators, and performance standards. Benchmarks are general standards or key expectations for each key function. Performance standards, more objectively defined than benchmarks, are statements of what DD Network programs should be achieving, doing, or having at a national level. Indicators are what will be measured to determine whether the benchmarks and performance standards are being met.
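For illustration only, the nesting of these elements can be sketched in code. The structure below is a minimal, hypothetical representation of the framework (it is not a project deliverable, and the sample entry is invented, not actual DDPIE content):

    # Minimal sketch of the measurement matrix hierarchy:
    # key function -> benchmarks -> indicators, with candidate
    # performance standards attached to indicators.
    # All names and the sample entry are hypothetical.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Indicator:
        text: str                     # what will be measured
        standards: List[str] = field(default_factory=list)  # e.g., "At least XX% ..."

    @dataclass
    class Benchmark:
        text: str                     # general standard or key expectation
        indicators: List[Indicator] = field(default_factory=list)

    @dataclass
    class KeyFunction:
        name: str
        benchmarks: List[Benchmark] = field(default_factory=list)

    # Hypothetical example of a single entry:
    example = KeyFunction(
        name="Advocacy and Leadership Development",
        benchmarks=[Benchmark(
            text="Participants apply the advocacy skills they are taught",
            indicators=[Indicator(
                text="Extent to which training participants report using advocacy skills",
                standards=["At least XX% of participants report using the skills"],
            )],
        )],
    )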

Table 2-1.  Meetings to obtain input and feedback on draft documents,
September 30, 2006 - September 29, 2007

1.  Advisory Panel meetings (Advisory Panel only)
    • October 25-26, 2006 (in person)
    • March 20-21, 2007 (in person)
    • April 17, 2007 *
    • April 18, 2007 *
    • May 1, 2007 *
    • May 2, 2007 *
    • August 1, 2007 (telephone)
    • September 27, 2007 (telephone and web cast)

2.  Working Group meetings
    P&A System:
    • October 3, 2006 (in person)
    • October 14, 2006
    • November 14, 2006 (web cast)
    • November 28, 2006
    • November 29, 2006 (collaboration)
    • December 12, 2006
    • May 17, 2007
    • May 24, 2007
    • June 5, 2007
    • June 27, 2007
    • June 28, 2007
    • July 31, 2007 (collaboration)
    DD Council:
    • October 10-11, 2006 (in person)
    • November 29, 2006
    • November 29, 2006 (collaboration)
    • May 1, 2007
    • May 2, 2007
    • May 15, 2007
    • May 16, 2007
    UCEDD:
    • November 2, 2006 (in person)
    • November 15, 2006
    • November 29, 2006 (collaboration)
    • April 18, 2007
    • April 19, 2007
    • April 30, 2007
    • May 3, 2007

3.  Presentations of draft documents at National technical assistance meetings
    P&A System: March 12, 2007
    DD Council: June 11, 2007
    UCEDD: May 31, 2007

4.  Distribution of draft documents electronically to all State programs
    P&A System: August 14, 2007
    DD Council: June 19, 2007
    UCEDD: July 11, 2007
    Advisory Panel: June 19, 2007; July 11, 2007; August 14, 2007

5.  Telephone conference calls to discuss draft documents distributed to all State programs
    P&A System: August 22, 2007; August 23, 2007; September 5, 2007
    DD Council: July 9, 2007; July 11, 2007; July 12, 2007
    UCEDD: July 25, 2007; July 26, 2007; August 2, 2007
    Advisory Panel: June 25, 2007; June 26, 2007; August 1, 2007

6.  Access to eRoom
    P&A System: August 24, 2007
    DD Council: August 2, 2007
    UCEDD: August 27, 2007
    Advisory Panel: August 14, 2007

* Telephone meetings arranged to discuss draft documents prior to distribution to all state programs.

Within this framework and these definitions, Westat developed draft matrices to present to the three DD Network Working Groups in person at their annual fall meetings. These meetings took place on October 3, 2006, in Austin, Texas (P&As); October 11, 2006, in Pittsburgh, Pennsylvania (DD Councils); and November 2, 2006, in Washington, DC (UCEDDs). The purpose of these Working Group meetings was to obtain feedback on the framework as well as on the content of the draft matrices.

2.3  Obtaining Feedback on Draft Documents and Process for Development

Year 2 has consisted primarily of obtaining feedback on draft documents and further input from a variety of stakeholders – the project’s Advisory Panel, DD Network program Working Groups, DD Network programs in each state, and ADD (Table 2-1). Based on that feedback, documents were revised several times.

As noted above, we began Year 2 of the project by obtaining further input from DD Network Working Groups (on October 3, October 11, and November 2, 2006). Based on comments and feedback from these groups, we revised all documents and presented our work to ADD on February 5-7, 2007. The purpose of these meetings was to describe the direction of our approach and provide ADD with the opportunity to make comments and suggestions. Based on these comments and suggestions, we again revised documents and prepared to distribute draft documents at technical assistance meetings in the spring of 2007.

Presentations and distribution of draft documents to DD Network State programs took place on March 12, 2007 for the P&As; May 31, 2007 for the UCEDDs; and June 11, 2007 for the DD Councils. The following is a summary of what transpired for each program at the technical assistance meeting and as follow-up.

2.3.1  DD Network Programs

As a way of alleviating DD Network program concerns about the independent evaluation and incorporating more program input into the process of tool development, ADD identified individuals working in DD Network programs who would be interested and available to assist Westat (Appendix A). Westat organized, prepared for, and conducted P&A, DD Council, UCEDD, and Collaboration Working Group meetings by telephone and web cast during Year 1 of the project and continued to meet with Working Group members in Year 2. Year 2 also included opening up the process to make it more transparent to DD Network programs in every state and to obtain wider input.

2.3.1.1  P&As

Westat distributed two documents to P&As at their technical assistance meeting on March 12, 2007: (1) benchmarks/indicators/performance standards for P&A systems and (2) benchmarks/indicators/performance standards on collaboration among the three DD Network programs. Although the intent had been to discuss the documents in three separate, smaller groups, the Executive Directors preferred to remain in one large group.

P&A Executive Directors expressed considerable concern about the level of detail in these documents, and particularly about the performance standards. They also expressed concern about the speed of the process, the short turnaround time for providing feedback, and not being able to be part of the process. Working Group members, only two of whom attended this meeting, expressed concern that they had not seen the documents prior to distribution.

With ADD assistance, Westat made considerable effort to get back on track with the P&A Working Group and alleviate many of the concerns expressed by the P&As. We conducted an initial meeting with the P&A Working Group on May 17, 2007 to confirm the Working Group's role and ours in developing the materials for the ADD independent evaluation and to confirm Westat's commitment to working with Working Group members and rebuilding their trust. Westat revised the P&A document distributed at the March meeting and sent it to the P&A Working Group and Advisory Panel. We then set up and conducted telephone meetings to discuss the revised document (June 5, June 27, and June 28). Another meeting was conducted with the P&A Working Group on August 31, 2007 to discuss the collaboration document.

Westat again made revisions to the P&A document based on Working Group feedback and suggestions. Westat provided a summary to the Working Group on how the document was revised and the rationale for making some changes and not making others. The revised document was sent to the P&A Working Group for final comments and then distributed to all P&As electronically on August 14, 2007.

Westat set up three telephone conference calls for all P&As in August and September so they could provide their general comments and any specific comments they might have on the document distributed on August 14th (Table 2-1). They were also given the opportunity to provide comments in writing. Telephone calls were also scheduled for Advisory Panel members. Table 2-2 lists the state programs that participated in telephone calls. Thirteen individuals representing P&As in 10 states were on the calls.

Table 2-2.  State programs participating in scheduled telephone calls
(Numbers in parentheses indicate more than one participant from a state.)

UCEDD:  Delaware (2), Indiana, Massachusetts (2), Texas

DD Council:  Delaware (2), Indiana, Massachusetts (2), Texas, New York (3), Minnesota, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas

P&A:  Alabama, Arizona, California, Illinois (2), Iowa (2), Kansas, Kentucky, Michigan, Minnesota (2), Washington

2.3.1.2  UCEDDs

Westat held several Working Group meetings prior to distributing draft documents to all UCEDD programs (Table 2-1). The purpose of the UCEDD Working Group meetings was to obtain feedback on draft documents. Revisions were made on draft documents based on Working Group feedback.

Westat distributed two documents at the UCEDD technical assistance meeting on May 31, 2007: (1) benchmarks/indicators/examples of performance standards for the UCEDD network; and (2) benchmarks/indicators/examples of performance standards on collaboration among the three DD Network programs. The UCEDDs met in small groups to discuss the draft documents and then reported back to the full group. A subset of UCEDDs (who referred to themselves as the "Doubletree 11") later met to discuss the draft documents further. This group developed general comments, shared them with the full UCEDD network, and sent finalized comments on the evaluation in a letter to the Commissioner of ADD, Pat Morrissey. ADD provided a response to these comments, with input from Westat.

Telephone calls were scheduled for July 25 and 26 and August 2, 2007 for all UCEDD State programs. A total of six individuals from four states attended these calls (Table 2-2).

2.3.1.3  DD Councils

DD Council Working Group meetings took place in May 2007 to discuss draft documents that were intended for distribution to all DD Councils. Westat made revisions based on Working Group feedback and provided the Working Group with a rationale for why changes were and were not made.

Westat distributed two documents at the DD Council technical assistance meeting on June 11, 2007: (1) benchmarks/indicators/examples of performance standards for DD Council programs; and (2) benchmarks/indicators/examples of performance standards for collaboration among the three DD Network programs. DD Councils met in three small groups to discuss the draft documents that were distributed. ADD arranged to have notes taken in each group and compiled a summary.

Westat set up telephone calls with DD Council programs on July 9, July 11, and July 12, 2007. Twenty-one individuals from 20 states attended these calls (Table 2-2). In addition, the President of the National Association of Councils on Developmental Disabilities (NACDD) wrote to Pat Morrissey, ADD Commissioner, to summarize DD Council comments. ADD responded to each comment, many of which were similar to those expressed by the UCEDDs (see Section 2.3.4).

2.3.2  ADD Feedback

In addition to the detailed feedback ADD provided on draft documents in February 2007, Westat met with the ADD Commissioner, Pat Morrissey, and Jennifer Johnson in June 2007. Dr. Morrissey voiced her continued commitment to the evaluation. She noted that the evaluation is important not only for accountability and for more formally gauging how the programs are doing, but also as an excellent opportunity to demonstrate the important work that the DD Network programs are doing for people with developmental disabilities and their families.

Dr. Morrissey also expressed her commitment to being as transparent as possible with the programs and asked that Westat obtain feedback along the way from stakeholders, including the DD Network programs themselves.

2.3.3  Advisory Panel Feedback

Due to retirement, one Advisory Panel member (Marie Danforth) left the panel during Year 2 of the project. She was replaced by Karen Armstrong of the Substance Abuse and Mental Health Services Administration (SAMHSA). The remainder of the Advisory Panel stayed the same (Appendix B).

Westat continued to obtain advice from Advisory Panel members in a number of ways. Two in-person meetings were held (one in October 2006 and one in March 2007), as well as two telephone meetings (in August and September 2007). Advisory Panel members were updated on DDPIE project activities and given the opportunity to make comments and suggestions.

Advisory Panel members also wished to have input into draft documents, so a number of opportunities were made available for them to do so (Table 2-1). Telephone meetings were scheduled in the spring of 2007 to obtain feedback on draft UCEDD documents before they were distributed to state programs. In addition, telephone conference calls were scheduled and conducted during the summer of 2007 so that Advisory Panel members could comment on the draft documents distributed to all DD Network programs.

2.3.4  Nature of Feedback on ADD Independent Evaluation

The DD Network programs and the Advisory Panel have had a variety of opportunities to ask questions, have their questions answered, and provide comments on the DDPIE project. Many of the same questions continue to be asked (Why is ADD doing this? Will we still have to do Monitoring and Technical Assistance Reviews (MTARS) and Program Performance Reports (PPRs)?).

In addition, many of the same comments were raised by each DD Network program (Exhibit 2-1).  More detailed comments provided by each DD Network program are contained in Appendix C.

2.4 Developing the Process and Materials for the Pilot Study

ADD's original Request for Proposal called for a pilot study to take place after the development of validated performance standards. After discussions with Commissioner Morrissey and further planning with Jennifer Johnson, it was concluded that tasks should take place in a different order. Thus, tasks in Year 3 will be conducted as follows:

    1.  Further revise draft benchmarks and indicators. These will be used to develop data collection instruments.

    2.  Develop the methodology and data collection instruments for a pilot study. The purpose of the pilot study will be to test data collection instruments and inform the further development of performance standards.

    3.  Meet with a validation group toward the end of Year 3 to present findings from data collection, recommendations for revised benchmarks and indicators, and examples of performance standards. The validation meeting will consist of discussion of these recommendations and examples, as well as discussion of the methodology for developing more performance standards.

    4.  Write a final report to ADD consisting of recommendations on benchmarks, indicators, data collection for a full-scale independent evaluation, and further work on performance expectation levels (standards).

Westat has already begun to revise the draft benchmarks and indicators based on many of the comments provided by DD Network programs and the Advisory Panel. One recommendation we have not followed is to reduce the number of indicators; as suggested by the Advisory Panel on September 27, 2007, we will wait until after data collection and discussion with the validation group to do so.

This next version of benchmarks and indicators is being used to develop data collection instruments. The data collection methodology is described in Section 4.1.

Exhibit 2-1.  Summary of general comments provided by all three DD Network programs

    • The ADD independent evaluation will place a burden on programs that are already overburdened by ADD reporting requirements.
    • The ADD independent evaluation needs to consider existing data (e.g., data from MTARS visits, PPRs, State Plans, and NIRS).
    • Indicators need to be consistent with the DD Act.
    • The evaluation should consider only outcome indicators, not structural, process, and output indicators.
    • The process of developing the methodology and tools for the evaluation needs to be transparent.
    • The evaluation materials need to contain consistent definitions and terms.
    • The process and materials need to consider how the smaller programs/territories are affected.
    • The evaluation should define collaboration as collaboration with all entities, not just among the three DD Network programs.
    • The materials should contain many fewer indicators.


3.  BARRIERS AND PROBLEMS ENCOUNTERED/SOLUTIONS

Over the course of Year 2, three barriers were identified: (1) continued concerns about the Developmental Disabilities Program Independent Evaluation (DDPIE) project among Developmental Disabilities (DD) Network programs; (2) the timing of developing performance standards; and (3) the time schedule and cost implications of providing additional opportunities for DD Network program input and feedback. These are discussed below.

3.1  Continued DD Network Program Concerns

The most significant and persistent problem encountered during Year 2 of the DDPIE project is DD Network program concern regarding the independent evaluation. In the Year 1 report, we listed the concerns expressed by DD Network programs. They were:

  • Westat and the Administration on Developmental Disabilities (ADD) do not fully appreciate the complexity of their programs;
  • The work programs do cannot easily be quantified;
  • It will be difficult to measure the impact of the work they do on the lives of people with developmental disabilities and their families because they are often only one small part of a collaborative effort;
  • The evaluation will be used to make comparisons of all P&As, DD Councils, and UCEDDs;
  • The evaluation will be burdensome;
  • They (the P&As) are already being evaluated by other Federal agencies (e.g., the Protection and Advocacy for Individuals with Mental Illness (PAIMI) evaluation).

These concerns continue to be raised, and a number of more specific concerns have been added to the list (see Exhibit 2-1 and Appendix C).

To alleviate some of these concerns, the Westat team worked extensively with Working Groups, made itself available at technical assistance meetings to answer questions about the evaluation and obtain feedback on draft documents from all State programs, met with the Advisory Panel in person and by telephone, scheduled and conducted telephone meetings for DD Network programs, and summarized changes in documents and the rationale for making (and not making) changes. We have also established an eRoom to give DD Network programs and Advisory Panel members the opportunity to better understand the ADD independent evaluation, to provide comments on the process and individual documents, and to see and comment on the comments of others. To date, however, DD Network programs and Advisory Panel members have made little use of the eRoom, which was expensive to set up and maintain. Although 191 passwords were issued to DD Network programs, Advisory Panel members, and ADD, the eRoom has been visited only 57 times by 20 individuals, and only one person has provided comments through it.

Despite the added transparency of the process and the many opportunities to provide input, there still appears to be an underlying mistrust of the process and concern about the use of data collected as part of the ADD independent evaluation, a lack of understanding of the need for the evaluation, and a feeling that the full scale evaluation will be a major burden to programs. In Year 3 of the project we will attempt to alleviate these concerns by continuing to inform all programs about the progress of the project and making revised materials available for review. We also expect that there will be some State programs that will never become completely comfortable with this evaluation.

3.2  Timing of Performance Standards Development

A concern was raised in the Year 1 report that the development of performance levels requires knowledge of the range and distribution of performance levels across all State programs. Although the collection of data during the pilot study will help to address this concern, it is doubtful that Westat will be able to recommend a full complement of performance standards by the end of this contract on September 29, 2008. Nevertheless, we feel that the data collected in the pilot study will help to inform the process of developing reasonable and more acceptable performance standards and that we will be able to make useful recommendations on the next steps for developing them.
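To make concrete why distributional data are needed, consider a standard of the form "at least XX%": the cut point XX can be chosen sensibly only once per-program values are known. The sketch below is purely illustrative (the values are invented; no pilot data exist yet) and shows one way a candidate standard could be pegged to a percentile of observed performance:

    # Illustrative only: derive a candidate performance standard from
    # the distribution of one indicator across programs. The values
    # are hypothetical placeholders, not pilot data.
    from statistics import quantiles

    # One value per program, e.g., percent of participants reporting
    # a positive outcome on some indicator.
    values = [62, 71, 55, 80, 67, 74, 90, 58, 69]

    # A standard set at the 25th percentile would be attainable by
    # roughly three-quarters of programs at current performance levels.
    q1, median, q3 = quantiles(values, n=4)
    print(f"Candidate standard: at least {q1:.0f}%")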

3.3 Time Schedule and Cost Implications

Over the course of the first year of the project, a number of changes were made to the original plan that had implications for the rest of the project. The decision to organize measurement matrices according to key function added complexity to the process. The use of program and collaboration Working Groups has been a great asset to Westat's understanding of the programs and has helped to improve the draft benchmarks and indicators. However, the process has added several months to the schedule and several thousand dollars to the cost of the project. We have also made important efforts to make the process more transparent (e.g., the eRoom, telephone conference calls, web casts, in-person meetings, Advisory Panel meetings lasting 2 days instead of 1) and to include more stakeholders. This too has added time and cost.

Figure 3-1.  DDPIE Project, revised schedule, July 1, 2007 - September 29, 2008

Task 1.  Attend meetings: ongoing throughout the period (July 2007 - September 2008)
Task 2.  Conduct initial activities: no activity shown for this period
Task 3.  Develop data collection tools
    3.1  Develop tools: 7/2 - 10/15
    3.2  Seek IRB approval: 10/15 - 11/2
Task 4.  Develop measurement matrices
    4.1  Discussion with and feedback from DD Network programs
         DD Council: 7/9 - 9/13
         P&A (Working Group and all P&As): 7/24 - 9/28
         UCEDD: 7/24 - 9/28
    4.2  Revise benchmarks/indicators: 7/10 - 10/12, 4/3 - 5/28, and 7/2 - 8/22
    4.3  Organize and conduct validation meeting: 5/16 - 6/27
Task 5.  Organize and conduct pre-test/pilot test
    5.1  Recruitment of participants: 9/23 - 10/15
    5.2  Arrangements of logistics: 10/8 - 11/26
    5.3  Data collection: 11/5 - 1/31
Task 6.  Synthesis of findings
    6.1  Analysis of data: 12/3 - 3/26
Task 7.  Prepare reports
    7.1  Quarterly reports: due 7/10, 10/10, 1/10, 4/10, and 7/10
    7.2  Write draft final report: 7/1 - 8/29
    7.3  Final report: 9/1 - 9/27

4.  ACTIVITIES AND DELIVERABLES PLANNED FOR NEXT REPORTING YEAR

To date, Westat has developed a framework for the measurement matrices and a large portion of the benchmarks and indicators for each key function and for collaboration. Year 2 has consisted of refining the benchmarks and indicators based on a considerable amount of interaction with DD Network programs, the project Advisory Panel, and the Administration on Developmental Disabilities (ADD) in an attempt to ease concerns, respond to requests to make the process more transparent, and obtain input and feedback on draft documents. Although we will continue to attempt to ease concerns about the DDPIE project as much as possible, Year 3 will consist primarily of: (1) collecting data as part of a pilot study to inform further revisions of benchmarks and indicators; (2) summarizing findings; (3) conducting a final validation meeting; and (4) submitting a final report to ADD. Each of these tasks is described below.

4.1  Conducting a Pilot Study

The purpose of the pilot study will be to test the feasibility of implementing the methodology and data collection instruments for a full-scale evaluation. A second purpose is to inform the revision of benchmarks and indicators and to make recommendations for performance standards and how to develop them further.

In Year 3, we will be conducting several activities in preparation for the pilot study. The first is to determine the feasibility of using existing data from MTARS site visits, PPRs, State Plans, and other documents. We will also develop the data collection methodology and interview protocols, identify programs to participate in the pilot study, and determine whether the pilot study requires IRB approval. If so, we will obtain it.

We will conduct the pilot study at a total of nine state programs -- three P&A programs, three UCEDDs, and three DD Councils. Three of the programs will reside in the same state. ADD has recommended both inclusion and exclusion criteria for identifying the programs to select (Exhibit 4-1). As much as possible, we will stratify according to those criteria and randomly select programs. ADD will have final approval.
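For illustration only, the selection logic (apply exclusion criteria, then sample within each program type) could look like the sketch below. This is a simplified version that omits the stratification step; the field names, predicates, and candidate records are hypothetical placeholders, and the actual selection will follow ADD's criteria and be subject to ADD approval:

    # Illustrative only: filter out excluded programs, then randomly
    # sample a fixed number per program type. Field names and the
    # exclusion rules shown are hypothetical simplifications.
    import random

    def eligible(program):
        # program: dict such as {"type": "P&A", "state": "XX",
        #   "on_working_group": False, "mtars_2008": False,
        #   "new_director": False}
        return not (program["on_working_group"]
                    or program["mtars_2008"]
                    or program["new_director"])

    def select_pilot(candidates, per_type=3, seed=None):
        rng = random.Random(seed)
        picks = []
        for ptype in ("P&A", "DD Council", "UCEDD"):
            pool = [p for p in candidates if p["type"] == ptype and eligible(p)]
            picks.extend(rng.sample(pool, per_type))  # raises if pool < per_type
        return picks  # final selection subject to ADD approval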

Data collection itself will be both quantitative and qualitative; in person and by telephone; and individual and in groups. In-person data collection will take place with the Executive Director of each program and staff members, as appropriate, and with DD Council members, including the Chair.

Exhibit 4-1.  Exclusion and inclusion criteria for selecting programs for the pilot study

Exclusion criteria:
  • Programs that had members on the Advisory Panel and Working Groups
  • States that will have an MTARS visit in 2008 (Arizona, Rhode Island, North Dakota, Indiana)
  • New UCEDDs (Texas A&M, University of South Florida, Vanderbilt University in Tennessee, University of California-Davis, University of Arizona, Marcus Institute in Georgia)
  • Programs with no or a new Executive Director (Arizona DD Council and P&A; Alaska DD Council; Mississippi DD Council; Rhode Island DD Council; Ohio Nisonger Center)

Inclusion criteria (at least one program that...):
  • Has a high percentage of residents that live in rural areas
  • Is from each geographic region in the United States (west, midwest, east)
  • Participated in MTARS in 2005-2007 (UCEDD, P&A, and DD Council)
  • Has a minimum allotment (DD Council and P&As)
  • Is and is not a medical school (UCEDD)
  • Is and is not a LEND program (UCEDD)
  • Is and is not its own Designated State Agency (DSA) (DD Council)
  • Is and is not in a State agency (P&A)
  • Uses a legal model and uses an advocacy model (P&A)

For the P&As, we are also considering group interviews with policy makers, collaborators, and recipients of public education. We are also contemplating group interviews with UCEDD colleagues, peer researchers, and recipients of community services, as well as individual telephone interviews with former students and graduates. DD Council group interviews will take place with policy makers, collaborators, and contractors, as well as with participants in advocacy and leadership training and in DD Council-supported training and technical assistance.

4.2 Revisions of Benchmarks/Indicators

Prior to data collection, benchmarks and indicators will be revised according to many of the comments and suggestions recently provided by the Advisory Panel, ADD, and DD Network programs. This version will guide the development of data collection instruments. We will make these revisions and provide all DD Network programs with the revised versions and a rationale for why revisions were and were not made. Because programs do not appear to be using the eRoom, we are contemplating shutting it down. Instead, we will make documents available to DD Network programs as attachments to emails.

The major revision of benchmarks/indicators will take place after data collection, based on data collection findings, as well as feedback at the final validation meeting.

4.3 Final Validation Meeting

In June 2008, Westat will organize a final validation meeting consisting of Advisory Panel members, selected Working Group members, ADD, and additional stakeholders, for a total of approximately 40 individuals. The purpose of this meeting, expected to be held over a period of 2 to 3 days, is threefold: (1) to review revised benchmarks and indicators in light of data collection and data collection methods; (2) to rate the benchmarks and indicators as to their importance or value to the ADD independent evaluation goals; and (3) to recommend a design for the full-scale national evaluation. At the end of the meeting, we expect to have selected only those benchmarks, indicators, and example performance expectation levels (standards) that reflect the ability of the DD Network programs, and of collaboration among the programs, to have an impact on people with developmental disabilities and their families, State systems, and service providers. We also expect to be able to recommend to ADD the next steps in developing all performance expectation levels (standards) and integrating the ADD independent evaluation with MTARS, PPRs, and other ADD data collection requirements.

4.4 Final Report

Westat will provide ADD with a draft and final report. The report will summarize Westat activities over the 3 years of the project, contain recommended benchmarks, indicators, and selected performance standards, and make recommendations to ADD on the next steps in developing all performance standards. The information we glean from crosswalks of indicators with existing data and pilot study data will also enable us to make recommendations to ADD on ways to integrate the ADD independent evaluation with MTARS, PPRs, NIRS data, and other ADD data collection requirements.


APPENDIX A
PROGRAM, WORKING GROUP MEMBERS, AND STATE

Protection and Advocacy System
  Mary Faithfull (Texas)
  Tom Gallagher (Indiana)
  Bob Joondeph * (Oregon)
  Tim Shaw (Nebraska)
  Jeanne Thobro * (Wyoming)
  Sarah Wiggins-Mitchell (New Jersey)

State Developmental Disabilities Councils
  Waynette Cabral (Hawaii)
  Vendella Collins (Michigan)
  Debra Dowd (Florida)
  Becky Harker * (Iowa)
  Bill Lynch * (Oregon)
  Dick Weathermon (Nevada)
  Jamie Wolfe ** (Delaware)

National Network of UCEDDs
  Carl Calkins (Missouri)
  Tawara Goode (District of Columbia)
  Gloria Krahn * (Oregon (Health and Sci. Univ.))
  David Mank (Indiana)
  Fred Orlove * (Virginia)
  Fred Palmer (Tennessee)
  Lu Zeph (Maine)

* Also on Collaboration Working Group
** Chair, DD Council


APPENDIX B
ADVISORY PANEL MEMBERS AND AFFILIATION

Karen Armstrong, Center for Mental Health Services, SAMHSA
Richard Carroll, Director, Northern Arizona University Institute of Human Development (UCEDD)
Elmer Cerano, Executive Director, Michigan Protection and Advocacy Service
Robin Foley, Director, Special Education Projects, Federation for Children with Special Needs
Shawn Fulton, Self-advocate
Nora Fox Handler, Family advocate
Andy Imparato, President and CEO, American Association of People with Disabilities
Adele Robinson, Senior Director, Public Policy and Communications, National Association for the Education of Young Children
William Strang, Vice President, AMSAQ, Inc.
Matthew Wangeman, Chairperson, Arizona Governor's Council on Developmental Disabilities
Wanda Willis, Executive Director, Tennessee Council on Developmental Disabilities


APPENDIX C
SUMMARY OF COMMENTS ON ADD INDEPENDENT EVALUATION *

DD Network Program General Comments
UCEDD
  • Primary audience for this evaluation is OMB; secondary audience may be Congress, programs, public.
  • Need to clarify what is needed for the PART process—keep this evaluation focused on that requirement.
  • Clarify what qualifies as "independent" evaluation for OMB (e.g., recognize that independent evaluation may only require primary data collection to answer those questions for which you don't already have reliable, valid data).
  • Evaluate DD Act programs—this feedback focuses on the UCEDD Network.
  • Base the evaluation on the DD Act; to the extent possible, be cognizant of the work on reauthorization.
  • Align the evaluation with compliance with DD legislation ("are programs doing what they are funded to do?").
  • Evaluate systems impact of the Network ("is the legislative intent of the DD Act being accomplished?").
  • In keeping with OMB policy, minimize the burden of data collection on grantees—use existing data sources that we have already invested in (e.g., MTARS, NIRS, and annual progress reports).
  • Feedback from telephone calls: Many measures are already captured in data from MTARS, or from NIRS. Sources of funding, for example, are well-documented.
  • Use a criterion-based, stratified sampling method of approximately 20-30% of UCEDDs.
  • Add rigorous, qualitative evaluation to tell the story (e.g., multiple case studies, document analysis, focus groups, and key informant interviews).
  • Ensure transparency of the process at all levels—responsive communication on all steps along the way.
  • Validation needs to occur sooner and more often in the process.
  • Assessment of collaboration needs to be defined more broadly to include collaboration not only with other ADD funded partners but with non-ADD funded partners. This broader view of collaboration is consistent with the leveraging and systems change requirements of the DD Act.
  • Reduce the number of measures (benchmarks, indicators and performance standards); recommend two for each core function and one for collaboration—select those applicable to most UCEDDs.
  • Feedback from telephone calls: Reduce the number of indicators for each benchmark (there are now 117 indicators for the UCEDD benchmarks and 18 indicators for the benchmarks under Collaboration).
  • Eliminate measures of structure and process unless they are requirements for compliance.
  • Develop a data definition for each measure and evaluate it in terms of applicability to the majority of the Network.
  • In developing the benchmarks, use consistent language and framework (e.g., "the Network of UCEDDs…").
  • Focus now on design and benchmarks—wait until there is agreement on benchmarks before reviewing indicators.
DD Councils
  • NACDD believes that both ADD and Projects of National Significance (PNS) should be included in the evaluation. Sections 104 and 105 of the Developmental Disabilities Act refer to requirements of the Administration on Developmental Disabilities in relation to the DD Councils, and Section 161 refers to PNS.
  • NACDD believes that the basis for the evaluation should be alignment with the Developmental Disabilities Assistance and Bill of Rights Act. Since the Act is the charter of a DD Council, the measure of success is the degree to which a Council meets the mandate in the Act. Every data element in the evaluation should obviously link back to the DD Act.
  • Feedback from TA meeting: Maintain flexibility that is in the DD Act and focus on outcomes that are in the DD Act (e.g., integration, inclusion, independence, etc.).
  • NACDD: Currently, ADD requires Councils to report on performance through the annual Program Performance Reports and periodic Monitoring and Technical Assistance Reviews. The integration of these activities into an evaluation system would be relevant and efficient. The DD Councils firmly believe that answers to many of the current questions posed in the draft Westat documents can be obtained through the PPRs, MTARS and the MTARS self-assessment checklist. Please consider these documents as a starting point to obtain the data needed to respond to the PART.
  • Enhance communication about the project with grantees.
  • Systems change requires much more than collaboration with the UCEDDs and P&As. We request that collaboration measures not be limited to collaboration with DD network partners. Additionally, we request that the focus on collaboration be on outcomes rather than the process because of the variability of collaborative ventures and the differences between and among the Councils.
  • Reduce the number of benchmarks and indicators for each key function (e.g., 2-3 indicators per key function); exclude indicators that are not required in the DD Act.
  • NACDD shares with ADD the goal of obtaining outcome oriented data. To date, the Westat drafts concentrate on a Council's methodology to accomplish its work rather than impact and outcomes. We request that Westat focus on outcome measures.
  • NACDD letter: During the Technical Assistance meeting DDPIE breakouts, there was confusion as to some of the terms used by Westat staff. It would be very helpful to have a definition of terms. Please include "promising practices" in the definitions particularly because the term is not included in the DD Act.
  • TA meeting feedback: Provide definitions for terms used in the study, such as "outputs" and "outcomes."
  • Ensure terminology is consistent and that people-first language is used throughout all documents.
  • Remove the phrase "the network of" from each benchmark.
  • NACDD letter: An evaluation process and tool must recognize the individuality of State and Territorial Councils. Each operates in different economic, political, cultural, geographic and financial environments. Many of the draft indicators will be particularly burdensome for smaller Councils and territories to track because they will have to collect data not currently needed for the PPR. Again, the "test" of a data element is its reference in the DD Act. If it is not mentioned in the Act, it should not be required.
  • Process indicators may not be necessary and can be problematic:
    • DD Councils achieve things through different processes that still might result in a good outcome.
    • The environment of the State, the role of Council members, and funding levels vary from state to state.
    • DD Councils are different sizes and work in different cultural contexts (e.g., the territories).
  • Some indicators are not measurable or impossible to measure (e.g., page 21: Extent to which policy makers contacted by Council know about the developmental disabilities community).
  • There is concern that outcomes of DD Council work may not be captured during the independent evaluation period (e.g., systems and practice changes).
  • Suggest adding a key function of "Governance and Management" to address diversity of Council members and qualifications of Council staff.
P&As
  • Outcomes suggested as important:
    • Is the statutory language adequate?
    • Are the regulations consistent with the statute?
    • Is funding adequate?
    • Is ADD doing their oversight adequately?
    • Are P&As in compliance with the DD Act?
  • Baseline of performance needs to be consistent with principles of the DD Act.
  • The evaluation should look at what outcomes for people were achieved as a result of the P&A work and at what cost.
  • Would like to see ongoing clarification and communication about how the data will be reported and used.
  • The evaluation should not look at input and process components.
  • Much of P&A work is covered by attorney-client confidentiality and has legal/ethical implications. Thus, it will be very difficult to collect information on P&A work. Independent evaluation staff might be subject to subpoena in any action involving P&A clients or customers with whom they have contact.
  • Having a "one-size-fits all" set of standards will never be appropriate for the variety there is among state programs.
  • There is concern that P&As will only take on cases that will be able to meet the performance standards, instead of taking on risky cases that could have a more significant impact.
  • Outcome indicators should focus on:
    • changes in policies or practices (in PPRs)
    • dollars P&A has secured for people with developmental disabilities
    • Extent to which people with developmental disabilities handle problems more effectively as a result of the actions of the P&A
  • If problems are recurring, do people with developmental disabilities handle them more efficiently and are they using the information obtained from the P&A and skills developed through interaction with the P&A?
  • Another approach is to evaluate the extent to which ADD has established mechanisms to ensure consumers provide feedback and determine if the feedback is adequate, and if not, what kind of improvements need to be put in place.
  • Are P&As turning over customer satisfaction surveys in terms of people being served? Is this productive and cost effective?
  • P&A key functions break down into two groups: (1) Procedural (which can be evaluated with current tools): Planning, Priority Setting and Implementation; Governance and Management; and Intake; and (2) Outcome-based (which should be the focus of the independent evaluation): Individual Advocacy, which looks at the impact of legal advocacy; Systemic Advocacy, which looks at the impact on systemic issues; and Outreach and Public Education, which looks at the impact on prime targets for increased awareness.
  • Why is ADD developing the independent evaluation?
  • Who will be responsible for collecting and reporting the information?

* These comments were extracted from letters sent to ADD by the UCEDDs and the National Association of Councils on Developmental Disabilities (see sections 2.3.1.2 and 2.3.1.3), as well as from feedback at Advisory Committee meetings, Working Group meetings, National technical assistance meetings, on telephone conference calls with State programs, and from emails sent to Westat subsequent to National technical assistance meetings. Dates of all meetings (in-person and telephone) are listed in Table 2-1.

COMMENTS SPECIFIC TO KEY FUNCTIONS/BENCHMARKS/INDICATORS *

UCEDDs

  1. Key function: Interdisciplinary Pre-Service Preparation and Continuing Education
    • Benchmark A1. "Former students use what they learned …in their professional lives."

      The benchmark is too specific. As one reviewer put it, training prepares students for generic jobs, and not necessarily for jobs with a disability focus.

      Benchmark A2, third indicator, "Presence of distance learning"

      Accessibility, an appropriate component of the benchmark, does not translate into an indicator on the availability of distance learning. Distance learning is one of many possible approaches to accessibility.

      Benchmark A3, regarding an interdisciplinary faculty that is knowledgeable about developmental disabilities

      The second indicator under A3, "Ways in which and extent to which UCEDD faculty members have life experiences that provide them with firsthand knowledge of the developmental disabilities community …"

      Life experiences are not necessary for UCEDD faculty to understand disabilities and should not be an indicator except as a "for instance."

      An additional indicator for Benchmark A3 might be "Number of disciplines represented on the faculty."

      Benchmark A4, The network of UCEDDs supports a diverse student body …

      With regard to the example of a performance standard for the following indicator of student satisfaction: "At least XX% of pre-service interdisciplinary students report they are satisfied or very satisfied with the amount of time available to them to meet with UCEDD faculty on a one-to-one or small group basis."

      Student ratings are notoriously unreliable.

      Benchmark A5, The network of UCEDDs maintains influence in and access to other programs in the university.

      Indicators for Benchmark A5 should be phrased as "such as..." and should not be prescriptive.
  2. Key Function: Conduct Basic and/or Applied Research
    • Benchmark B.8, regarding a UCEDD's infrastructure for supporting research, evaluation, and/or public policy analysis

      The performance standard "At least XX% of UCEDDs in the network of UCEDDs provide the CAC with an executive summary of finalized research papers and reports" should be changed to: "UCEDDs provide information on final versions of research papers." UCEDDs might provide a briefing, a copy of the entire paper, or information in some other form.
  3. Key Function: Provide Community Services
    • Benchmark C9, regarding the impact of UCEDDs on families, service providers, and the community at large.

      Suggestion for a new indicator: How well-known are the UCEDDs and/or their staffs in decision making circles?

* These comments were made at telephone meetings with State programs to discuss draft documents (see Table 2-1) and in emails to Westat subsequent to National technical assistance meetings.

COMMENTS SPECIFIC TO KEY FUNCTIONS/BENCHMARKS/INDICATORS *

DD COUNCILS

  1. State Plan Development, Maintenance, and Implementation
      1.  Change the wording of Benchmark 3:
      • Remove "all aspects of" from benchmark #3 (Benchmark 3: Council members are integrally involved in all aspects of the development of State Plans)
      • Or combine Benchmarks 2 and 3 and restate the benchmark as "State Plans reflect the needs of the developmental disabilities community by involving Council members in State Plan development."

        Benchmark 2: State Plans reflect the needs of the developmental disabilities community.
      2.  Add additional indicators that measure Council members' full participation and DD Councils’ supports for Council members to participate in Council activities
      • Ways in which DD Councils support Council members to participate in Council meetings and activities (see MTARS)
      • Consistent and full participation of Council members
      • Council members’ satisfaction with State Plan development process
      3.  State Plan development is an important activity that DD Councils undertake but it is not a key function. It is a way that DD Councils use to achieve other key functions.

      4.  Take out an indicator, "Methods of dissemination of State Plan (e.g., mail, posted on website)" (page 4): The use of more methods does not mean better outcomes. What matters is the intent of reaching a wide audience, not the methods used for dissemination.

      5.  Three indicators on page 4 (see below) do not fit the intent of the key function.
      • Methods of dissemination of State Plan
      • Members of the community to which State Plan is disseminated
      • Characteristics of disseminated State Plan to make it accessible to a large audience
      6.  It is difficult for Council members to become familiar with State Plan requirements, Council issues, and the needs of the DD community as stated on page 6 (i.e., 4th bullet point – Council members' familiarity with the principles and goals of the Act and familiarity with State Plan requirements)
  2. Advocacy and Leadership Development
      7.  The indicator on page 9 (1st bullet point, Extent to which participants in advocacy programs become members or the chair of the Council) is not relevant, since Council members are appointed by the State governor. Furthermore, this has never been the intent of some DD Councils' advocacy or training programs.

      8.  Comment on two indicators (page 9): Advocacy and leadership training programs are conducted by contractors. Therefore, DD Councils face potential challenges in collecting follow-up data from program participants (e.g., time, cost, and confidentiality).
      • Extent to which participants in advocacy programs become members or the chair of the Council (page 9)
      • Extent to which participants in self-advocacy programs become members or leaders (page 9)
      9.  Take out one indicator on page 11 (3rd bullet point) and two indicators on page 13 (2nd and 3rd bullet points): the wording is too prescriptive and would harm the relationship DD Councils have with self-advocacy associations. It is self-advocates who decide their own directions, not the DD Council's State Plan.
      • Content of Council-supported educational advocacy programs (page 11)
      • Extent to which the goals, objectives, and expected outcomes of a Council-funded organization led by people with DD reflect (are consistent with) the goals, objectives, and expected outcomes in the State Plan (page 13)
      • Ways in which and extent to which the Council requires Council-funded organizations led by people with DD to be accountable (e.g., requires organization to report progress to the Council) (page 13)
      10.  Delete the phrase "have a low income" from the sample performance standard (p. 12, 3rd column):
      • At least XX% of advocacy and leadership educational and training participants are members of traditionally unserved or underserved populations or communities (e.g., are members of a racial/ethnic minority, have a low income, reside in a rural community, require assistive technology).
  3. Identification, Testing, and Promotion of Promising Practices
      11.  Combine key function C with key function D: identifying, testing, and promoting promising practices are ways used by DD Councils to conduct systems design, redesign, maintenance, and improvement.
      • Key Function C: Identification, Testing, and Promotion of Promising Practices
      • Key Function D: Systems Design and Improvement.
      12.  The word "testing" may be irrelevant for many DD Councils, since they are likely to conduct promoting, informing, and training activities instead of demonstration projects (page 14, 1st and 2nd bullet points):
      • Ways in which and extent to which Council uses positive findings from testing and promoting efforts for its own planning and other activities
      • Ways in which and extent to which Council uses negative findings from testing and promoting efforts to inform further planning or other activities
      13.  Combine the first two bullet points (page 14); both positive and negative findings are used.
      • Ways in which and extent to which Council uses positive findings from testing and promoting efforts for its own planning and other activities
      • Ways in which and extent to which Council uses negative findings from testing and promoting efforts to inform further planning or other activities
      14.  Need a definition for "promising practices" (1st bullet point, page 15)
      • Methods used to identify promising practices
      15.  Comment on an indicator (3rd bullet point, page 15) --The results of promising practices that "reflect the goals and objectives in the State Plan" may take a long time to happen.
      • Extent to which promising practices identified reflect (are consistent with) the goals and objectives in the State Plan
      16.  Benchmark 9 and its indicators are not meaningful since DD Councils are state agencies and have access to state procurement policies. (pages 15-16)

      Benchmark 9: The network of State Councils on Developmental Disabilities uses fair and effective approaches to identify contractors to test and promote promising practices that reflect the goals and objectives of the State Plan.
      • Council has procurement policies or access to state procurement policies that focus on fairness and the ability to identify competent and experienced contractors to implement demonstration projects
      • Presence of documented evidence that Council follows the policies related to procurement of contractors
      • Criteria for evaluating proposals
      • Extent to which funded projects are unsolicited
      • Composition of review committee for reviewing proposals
      17.  A definition of the term "projects are unsolicited" is needed (2nd bullet point, page 16)
      • Extent to which funded projects are unsolicited
      18.  The use of a percentage as a performance standard is difficult (2nd bullet point, page 16 – "No more than XX% of Council-funded projects to promote and test promising practices are unsolicited"). It was recommended to ask whether DD Councils have a policy on unsolicited projects instead of asking about the number of unsolicited projects.
  4. Systems Design and Improvement
    • 19.  Add "Maintenance" to the key function - "Systems Design, Improvement, and Maintenance"

      20.  Delete an indicator (4th bullet point, page 20) or just take out its example – "are familiar with their current positions on DD issues"
      • Extent to which Council members know the policy makers in the State or federally (e.g., are able to name some of the policy makers, are familiar with their current positions on developmental disabilities issues)
      21.  Replace "and" with "or" (2nd bullet point, page 21). This indicator also has budgetary implications: some states may have different cultures and smaller budgets to cover the associated costs.
      • Extent to which Council members and staff sit on committees, panels, and other venues that put them in touch with other members of the developmental disabilities community
      22.  Indicators have budgetary implications and are hard for DD Councils to carry out (2nd and 3rd bullet points, page 22):
      • Ways in which and extent to which past, current, and potential collaborators are kept apprised of Council activities and achievements
      • Presence of a local (grassroots) structure that facilitates information sharing and active participation in policy and legislation development
  5. Community Capacity Development
    • 23.  It is difficult for people with intellectual disabilities to achieve an outcome indicator such as being "familiar with issues affecting people with DD" (2nd bullet point, page 24).
      • Extent to which education and training participants are familiar with issues affecting people with developmental disabilities as a result of attending education and training activities
      24.  Combine Benchmarks 16 and 17
      • Benchmark 16: The network of State Councils on Developmental Disabilities supports education and training on developmental disabilities that address issues affecting people with developmental disabilities to a diverse group of organizations, individuals, and the general public.
      • Benchmark 17: Informational materials used in community capacity efforts supported by the network of State Councils on Developmental Disabilities are appropriate for a diverse audience.
      25.  Take out an indicator, "Types of audiences targeted by Council-supported education and training programs" (2nd bullet point, page 25)

      26.  The indicator listed on page 26 (1st bullet point, "Use of representatives from target audiences to review training materials") is too prescriptive.
  6. Collaboration
    • 27.  Benchmarks 1, 2, and 3 should be re-conceptualized. They should look at whether the collaboration infrastructure exists and whether effective collaborative efforts are undertaken.
      • Benchmark 1: Developmental Disabilities Network (DDN) collaboration achieves common goals that have an impact on people with developmental disabilities and their families, service providers, and/or State systems
      • Benchmark 2: DDN programs collaborate to achieve common goals
      • Benchmark 3: DDN programs have an infrastructure that supports collaborative efforts.
      28.  Add one word "maintain" to sub-benchmark 1.1 (page 3): Ways in which and extent to which DDN collaborators achieve (or prevent/maintain) system changes

      29.  A definition of Collaboration is needed (e.g., number of DDN programs participating in collaborative efforts).

      30.  According to the DD Act, collaboration with other DD Network programs is encouraged but not mandatory.

      31.  Collaboration can be at different levels (e.g., monthly meetings among directors).

      32.  Collaboration can be informal.

      33.  There is a concern about involving board members in every collaboration related meeting.

      34.  It is highly likely that Program Directors and Chairpersons are satisfied with whatever will meet the ADD reporting requirements, not with the activities themselves. The time and effort that goes into DD Network collaboration tends to be a staff function, and it is really up to the staff to make sure that the collaborative efforts are in sync with the State Plan. (A comment on Indicator 2.3 and its performance standard.)
      • Indicator 2.3: DDN programs and the developmental disabilities community are satisfied with the collaboration process, outputs, and outcomes (page 4)
      • Extent to which DDN programs are satisfied with the process

        Each program director and Chair of the State Council on Developmental Disabilities is satisfied or fully satisfied with the nature and level of resources his/her program contributes to collaborative efforts.

* These comments were made at telephone meetings with State programs to discuss draft documents (see Table 2-1) and in emails to Westat subsequent to the National technical assistance meeting.

COMMENTS SPECIFIC TO KEY FUNCTIONS/BENCHMARKS/INDICATORS *

P&As

  1. Planning, Priority Setting, and Implementation
    • General Comments: The P&A's job is not to second-guess the input it receives but to explore and facilitate options based on identified interests and priorities. P&As must balance the interests of individual advocacy clients, systemic issues the P&A believes are in the interest of the DD community, and individual and systemic issues promoted by groups that represent key elements of the DD community (e.g., Voice of the Retarded) but not necessarily the viewpoint of the P&A. In other words, whose opinion counts? P&As can't just ignore the input they don't like to hear. (Indicators related to consistency of the SGP with the principles of the DD Act and the input received would address this.)
  2. Governance and Management
    • General Comments: P&As vary in how they are set up.

      1.  The majority are private, non-profit organizations/entities. Boards can be self-appointed and, as such, need to plan for the continuing life of the board (e.g., they need to make sure that the board's composition has a balance of legal, volunteer, financial, leadership, and other needed personnel). In this sense, the board is self-perpetuating, but also consistent. Self-perpetuation places a major responsibility on the board to select, prepare, and support new members to sustain the work of the board and the P&A.

      2.  Approximately six to eight P&As are part of state government and have no Board of Directors. Some have commissions. Some have advisory boards, but the boards do not participate in governance. Since litigation against the state is always a possibility, P&As that are state government entities must have a way to ensure they can act as independently as possible so they are able to pursue all remedies (e.g., a process for engaging in litigation, a process that covers hiring and the need for clearance for hiring, and a process regarding use of the media).

      3.  P&As need to have a method for dealing with potential and actual conflicts of interest. The P&As need to balance the importance of having board members with knowledge of and interest in P&A principles but potential conflicts of interest against members with no conflicts of interest but also no knowledge.

      4.  Some P&As are working to develop a volunteer program, prepare volunteers to assume roles within the P&A, and support them as they assume those roles.
  3. Individual Advocacy
    • Work within this key function must recognize the distinction between the individual's choices and systemic types of activities.

      Benchmark 5: P&A individual advocacy is client-centered and meets identified objectives.

      5.1: P&As seek to meet clients' objectives.
      5.2: P&A attorneys/advocates/contractors communicate with clients in clients' preferred mode of communication.

      To measure whether clients' objectives were met, P&As/evaluators can do data runs on the DADS system on the case-closing questions. They can ask staff directly about the process for getting retainer agreements (5.1) in the client's preferred language (5.2).

      5.3: P&As maintain the confidentiality of all individual advocacy clients.

      Questions to ask the P&As:

      1.  How do you maintain client confidentiality? What policies and procedures are in place? Are staff trained on the policies and procedures? What assurances do you have that staff are following policies and procedures? Are the policies and procedures sufficient? How do you know?

      2.  How do you handle "client-to-client" potential confidentiality issues, conflicts of interest, and safety issues (e.g., balancing one client's safety against another's right to confidentiality)? (One possible procedure related to client-to-client conflicts of interest is to use the DADS system to check for potential client conflicts when calls come in.)

      3.  How do you handle inquiries for information about your clients?

      4.  How do you protect your paper files?

      5.  How do you protect your electronic files?

      5.4:  P&As have the capacity to represent clients effectively on a wide range of issues.
      (Third bullet: Ways in which staff and contractual personnel access TASC and other information resources on legal issues related to people with developmental disabilities and their families [e.g., Legal Backup Centers, TASC listserv, TASC documents, other content-related resources] when needed.)

      This implies that if the P&A doesn't use these specific resources, the P&A won’t meet the standard.

      7.1:  P&As represent their individual advocacy clients vigorously and ethically. Everything the P&A does is legal advocacy and should be overseen by an attorney. Should 7.1 read: P&As have a policy that supports professional and ethical representation of individual advocacy clients? (What is the difference between professional and ethical representation?)
  4. Systemic Advocacy
    • P&As have a broader obligation to look at the larger picture and create new options from which individuals with developmental disabilities can choose.

      Benchmark 8: P&A systemic advocacy efforts facilitate the advancement or maintenance of legislative, administrative, and/or organization actions, policies, and practices that reflect the issues, priorities, and needs of people with developmental disabilities and their families.

      8.1: System changes (or prevention of changes) are facilitated by P&A systemic advocacy efforts.

      The extent to which elimination and prevention of barriers to access for people with developmental disabilities are being facilitated through P&A systemic advocacy efforts would be felt at the state level. How can this be rolled up to the national level if the evaluation is looking at impact in the states?

      Benchmark 9: P&As respond to systemic advocacy issues identified in the Statement of Goals and Priorities (SGP) and to emerging systemic advocacy issues.

      9.2: As part of systemic advocacy efforts, P&As engage in ongoing interaction and collaboration with policy makers, DD Councils, UCEDDs, community organizations, and other groups with common goals on behalf of people with developmental disabilities and their families.
      Systemic advocacy needs to address how the P&A uses the media; 9.2 is a good place to address it. If the outcome is systemic change, the P&A needs to coalesce with a few right (i.e., correct) players. Coalition composition needs to be fluid and depends on the circumstances. For example, P&As have a lot of good ideas but few grassroots connections; DD Councils have important grassroots connections but less access to information than the P&As. How well can the P&A identify the coalitions that are important? How does the evaluation determine how much participation in coalitions and other collaborations is "enough"? What do the P&As need to do to demonstrate they are involved in the right coalitions at the right time? (Would they have criteria for making such decisions?)

      Questions to ask the P&As:

      1.  How do the P&As determine what coalitions they need to join (i.e., how do they know they are coalescing with the right people at the right time)?

      2.  Another way to word this question might be: How well are P&As identifying what coalitions they need to be part of and the expected result?

      3.  Has the expected result been achieved?
  5. Intake
    • General Comments:
      • Intake in itself is not a key function. Intake is the way people get access to the P&A and is essentially process-oriented. The benchmarks focus more on information and referral. The benchmarks should focus on the outcome of getting quality information and advice to individuals, since that is the first step toward self-advocacy.
      • Following up on Intake is difficult. States vary in what information they collect through the Intake process on people who contact the P&A. Some states collect no data and cannot follow up with those contacts. For example, some P&As do not collect information on people they refer to another resource; they only record contact information on inquiries that fall within the purview of the P&A. However, in some states, every call is recorded (i.e., logged). Does this mean there is contact information for every caller?
      • The evaluation needs to make the distinction between Intake and Individual Advocacy clear. Is this distinction the same in all P&As? In some states, any service request is entered into the DADS system. Anytime a person is given advice about a problem specific to that individual, the request becomes an Individual Advocacy issue. If the P&A just sends a publication, this does not open a service request in the DADS system. Short-term assistance can be anything that falls short of entering into a client retainer service but goes beyond a phone call. We need to look at how Intake/I&R instances are reported on the PPR and how “non case-directed services” are defined in the PPR.
      Benchmark 10: P&As respond appropriately to requests for information, resources, and assistance related to issues affecting people with developmental disabilities and their families.

      10.1: The P&A intake process meets the needs of P&A clients or constituents.
      (Second bullet: Extent to which P&A clients or constituents are satisfied with the type of assistance they receive from the P&A.)
      • Satisfaction responses will be skewed by whether the individuals got what they wanted.
      • Remove intervention on behalf of groups and non case-directed services. These do not come to the P&As through Intake. They are more a result of systemic advocacy or outreach.
      10.2: Information gathered by the P&A through the intake process addresses the needs of P&As for monitoring and planning.

      Does 10.2 belong under Planning, Priority Setting, and Implementation?

      Questions to ask the P&As:

      1.  How do the P&As get the information they need and how do they use what they get?

      2.  Do they have systems in place for gathering the information they need in a useable format?

      3.  Are they getting what they need and using what they get?

      4.  How do they know they are getting or not getting the information and when they need to try another strategy?

      Benchmark 11: P&As respond promptly to requests for information, resources, and assistance related to issues affecting people with developmental disabilities and their families.

      11.1: P&A intake staff members follow a documented intake process on which they have been trained.

      In an effort to make sure calls are returned, a P&A will try to return calls within a 2-3 day period; however, this cannot be rigid. The presence and nature of policies and procedures on Intake can be burdensome for the P&A. Procedures need to "mirror" the community. For example, time constraints can vary depending on what is going on in the community (e.g., trainings or outreach sessions the P&A held, the beginning of the school year, cases of abuse and neglect being uncovered).

      11.2: P&A intake staff is able to communicate with people from diverse populations and communities in a culturally competent manner.

      Issues of composition of Intake staff, ability to communicate with individuals from a variety of racial/ethnic and language groups, and staff training relate to access people have to the P&A. Procedures to ensure access need to "mirror" the community (e.g., language, customs, needs, etc.).
  6. Outreach and Public Education
    • Benchmark 13: P&As are a resource on developmental disability issues for the developmental disability community, including those in traditionally unserved or underserved populations and communities.

      13.1: P&As receive requests from individuals and organizations throughout the state for information, training, and technical assistance on issues affecting people with developmental disabilities and their families.

      Questions to ask the P&As:

      1.  What types of trainings do the P&As do?

      2.  How do they evaluate the results of their trainings (i.e., what kind of analyses of their trainings do the P&As do)?

      3.  Who is requesting training and what kind of training?

* These comments were made at telephone meetings with State programs to discuss draft documents (see Table 2-1) or in emails to Westat subsequent to the National technical assistance meeting.

APPENDIX D

DATA COLLECTION OVERVIEWS


UCEDD Data Collection Overview

Data collection activities:
  • Interview with UCEDD Director
  • Interview with Chair and members, Consumer Advisory Committee
  • Telephone interviews – peer researchers and colleagues
  • In-person focus groups – community organizations and other recipients of community services
  • Telephone focus groups – former students and graduates

Purpose: The interviews with the UCEDD Director and the Consumer Advisory Committee will test data collection instruments and determine the descriptive elements listed below. The telephone interviews and focus groups will assist in further developing indicators and data collection instruments.
Interdisciplinary pre-service preparation and continuing education
 
  • Nature of UCEDD curricula, training materials, and teaching.
  • How UCEDD uses input from different disciplines in developing curricula and courses.
  • Whether and how UCEDD draws on expertise from community experts in developing curricula and course content.
  • Disciplinary backgrounds, diversity, and experience of UCEDD faculty.
  • Disciplinary focus and diversity of student body.
  • Whether and how CAC is involved in the development of curricula, training materials, or syllabi.
  • Quality of training UCEDD provides to students.
  • Involvement of UCEDD faculty and staff in other university programs.
  • Impact of UCEDD pre-service preparation and continuing education programs on the community.
  • Level of knowledge and skills evidenced by any UCEDD-trained employees.
  • Classes taken in UCEDD programs; class objectives and nature of the content.
  • Usefulness of course content.
  • Extent to which training was interdisciplinary.
  • How graduates have used material from the courses in their work or in their personal lives.
  • Whether students looked for or obtained a job with a disability focus.
Basic and/or applied research
 
  • Description of the infrastructure for supporting research.
  • Number/types of professional committees, advisory panels, review panels on which UCEDD faculty are serving.
  • Whether and how the CAC is involved in the development of plans for research, evaluation, or public policy analysis.
  • Extent to which CAC members think that UCEDD research has an impact on people with disabilities, their families, service providers, and state systems.
  • Quality of UCEDDs’ research, evaluation, and/or public policy analysis.
  • Extent to which UCEDD research has supported changes to practices/services.
  • Extent to which UCEDD research is cited by others.
  • Extent to which UCEDD research, evaluation, or public policy analysis has had an influence on the community.
  • Whether and how students were involved with or benefited from UCEDD basic or applied research.
Community services
 
  • Description of community services provided to people with developmental disabilities and their families, and to the community at large.
  • Whether and how the CAC is involved in the development of plans for provision of community services.
  • Extent to which CAC members think that UCEDD community services activities benefit the community.
  • Extent to which model programs developed by UCEDDs are known and respected.
  • Degree to which UCEDD community services strengthened the capacity of service providers and others to serve persons with developmental disabilities and their families.
  • Assessment of the impact of UCEDD services on people with developmental disabilities and their families.
  • Whether students participated in UCEDD community service activities and their assessment of the experience.
Dissemination of Information
 
  • Description of methods of dissemination and types of materials disseminated; frequency and methods of updating.
  • Whether and how CAC reviews/comments on materials for dissemination.
  • Quality and usefulness of publications, presentations and other materials disseminated by the UCEDD.
  • Extent to which community organizations and others access information disseminated by the UCEDD, and whether they find it useful.
  • Extent to which students access information disseminated by the UCEDD or continue to access it after graduation.
  • How students/graduates use the information, and how valuable they find it compared with other sources.
Governance and Management
 
  • Description of mission statement and structure of the organization
  • Description of the process of developing, implementing and updating goals, objectives, priorities and strategies for components of the program to address each of the four main core functions.
  • Composition and role of the Consumer Advisory Committee
  • Support received from the university.
  • Sources and amounts of funding.
  • Composition and role of Consumer Advisory Committee.
  • Whether and how CAC participates in the development and updating of goals, objectives, priorities and strategies of the UCEDD programs.
     
Logistics:
  • UCEDD Director: 1 interview with the director of the state UCEDD program; in person, semi-structured; 3 hours.
  • Consumer Advisory Committee: up to 3 interviews with the Chair (and members) of the UCEDD Consumer Advisory Committee; in person, semi-structured; 2 hours.
  • Peer researchers and colleagues: 9 telephone interviews, semi-structured, with researchers specializing in areas of focus for the UCEDD and other colleagues at the university; 1 hour each.
  • Community organizations and other recipients of community services: 2 focus groups of up to 8 members each, drawn from directors of community organizations served by UCEDDs and individual recipients of UCEDD services; 2 hours each.
  • Former students and graduates: 2 telephone focus groups of up to 6 members each, drawn from students who have graduated from the university and completed courses in the UCEDD program (representing a diverse set of majors) and former students who may still be at the university; 1.5 hours each.

 

DD Council Data Collection Overview

Data collection activities:
  • Interview with Executive Director
  • Interview with DD Council Chair and members
  • Interview with DD Council staff
  • Focus group – policy makers/collaborators/contractors
  • Focus group – participants of advocacy/leadership training
  • Focus group – participants of community capacity training

Purpose: The interviews with the Executive Director, the Council Chair and members, and Council staff will test data collection instruments and determine the descriptive elements listed below. The focus groups will assist in further developing indicators and data collection instruments and provide an understanding of the elements listed below.
State Plan Development, Maintenance, and Implementation
 
  • Ways in which and extent to which goals and objectives reflect the needs of people with developmental disabilities and their families
  • Ways in which and extent to which Council members are involved in State Plan development efforts
  • Ways in which State Plan is carried out and maintained
  • Ways in which State Plans are used
  • Ways in which and extent to which Council members participate in State Plan development, implementation, and maintenance
  • Ways in which Council members become familiar with State Plan requirements, Council issues, and the needs of the developmental disabilities community
  • Ways in which comprehensive review and analysis is conducted
  • Ways in which State Plan is developed, approved, implemented, maintained, and disseminated
     
Advocacy and Leadership Development
 
  • Ways in which advocacy and leadership training programs are provided to target audiences
  • Ways in which program participants use knowledge and skills learned
  • Ways in which Council members are involved in advocacy and leadership development efforts
  • Ways in which training materials are developed
  • Ways in which training participants are recruited
  • Ways in which training programs are delivered
  • Ways in which advocacy and leadership training program is delivered by DD Council
  • Ways in which recipients use knowledge and skills learned
  • Components of advocacy and leadership training that are important and useful to recipients
   
Identification, Testing, and Promotion of Promising Practices
 
  • Ways in which promising practices are identified
  • Process of selecting competent contractors to test and promote promising practices
  • Ways in which promising practices are carried out efficiently and effectively
  • Ways in which results of testing and promoting efforts are used
  • Ways in which Council members are involved in Council-supported demonstration projects
  • Knowledge of State supports and service systems (e.g., access barriers)
  • Ways in which DD Council identifies and selects competent and experienced contractors
  • Processes used to make selection approaches fair and effective
  • Ways in which DD Council selects contractors and monitors contracts so they are carried out efficiently and effectively
  • Perspective of contractors on selection and monitoring processes
  • Outcomes of DD Council testing and promoting promising practices from perspective of contractors, policy makers, and collaborators
   
Systems Design and Improvement
 
  • Ways in which policy makers are educated on issues related to developmental disabilities
  • Ways in which collaborative efforts are built and maintained
  • Involvement of people with developmental disabilities and their families in systems design and improvement efforts
  • Results of systems design and improvement efforts
  • Ways in which Council members participate in systems design and improvement efforts
  • Extent to which Council members are familiar with policy makers in the State and federally
  • Ways in which DD Council becomes familiar with policy makers in the State
  • Ways in which Council staff build and maintain on-going relationships with members of various community organizations
  • Ways in which DD Council contacts policy makers and collaborators
  • Methods used by DD Council to educate and change the behavior of policy makers as seen by policy makers
  • Methods used by DD Council to build and maintain ongoing relationships with collaborators as seen by collaborators
  • Outcomes of DD Council systems design and improvement as seen by policy makers and collaborators
   
Community Capacity Development
 
  • Characteristics of training programs
  • Ways in which training materials are appropriately developed
  • Ways in which program participants use knowledge and skills learned
  • Ways in which Council members participate in community capacity development efforts
  • Ways in which training materials are developed
  • Ways in which training participants are recruited
  • Ways in which training programs are delivered
   
  • Ways in which community capacity training is delivered by DD Council
  • Ways in which recipients use knowledge and skills learned
  • Components of community capacity training that are important and useful to recipients
Collaboration
 
  • Resources for collaboration
  • Process of collaboration
  • Results of collaboration
  • Collaborators
  • Ways in which Council members are involved in DDN collaborative efforts
  • Extent to which Council members are satisfied with goals, objectives, process, products, and results of collaborative efforts among DDN programs
       
Logistics:
  • Executive Director: 1 interview with the Executive Director of the DD Council; in person (or telephone), semi-structured; 2-3 hours.
  • DD Council Chair and members: 3 interviews with the Council Chair and 2 other Council members; in person, semi-structured; 1-2 hours.
  • DD Council staff: 3 interviews with Council staff who are in charge of Council activities related to key functions; in person, semi-structured; 1-2 hours.
  • Policy makers/collaborators/contractors: focus group of up to 8 members (e.g., senior staff from State or service provider organizations, legislative liaisons, collaborators on recent [within the past year] systems design and improvement efforts, and contractors of recent demonstration projects); 2 hours.
  • Participants of advocacy/leadership training: focus group of up to 8 recent training participants (within the past year); 2 hours.
  • Participants of community capacity training: focus group of up to 8 recent training participants (within the past year); 2 hours.

 

P&A Data Collection Overview

Data collection activities:
  • Interview with Executive Director
  • Interview with Board of Directors
  • Interview with P&A staff
  • Focus group – policymakers/collaborators
  • Focus group – recipients of public education

Purpose: The interviews with the Executive Director, the Board of Directors, and P&A staff will test data collection instruments and determine the descriptive elements listed below. The focus groups will assist in further developing indicators and data collection instruments and provide an understanding of the elements listed below.
Planning, Priority Setting, and Implementation
 
  • Process for planning and priority setting (how, who, when, use of input)
  • Consistency of P&A activities with SGP or rationale for addressing additional issues
  • Process for monitoring progress toward goals and priorities and use of feedback
  • Responsiveness of planning and priority setting to needs of people with developmental disabilities and their families
  • Consistency of SGP priorities with those of DD Council and UCEDDs
       
Governance and Management
 
  • Ways in which P&As are efficient and effective
  • Nature of relationships between P&A and State, Board of Directors, ADD, and other DD Network programs
  • Use of feedback to inform ongoing activities and operations
  • Accessibility of P&A activities
  • Familiarity with P&A mission, SGP, and DD Act.
  • Adherence of P&As to principles and goals of DD Act.
  • Capacity to fulfill roles and responsibilities.
  • Relationship between Board of Directors and Executive Director.
  • Staff qualifications, backgrounds, and experience
  • Nature and extent to which intake staff are appropriately trained and monitored
   
Individual Advocacy
 
  • Ways in which P&As represent their clients vigorously and ethically (accommodations, confidentiality, responsiveness to individual needs)
  • Extent to which client caseloads reflect priorities and needs of people with DD and their families
  • Nature and extent to which individual advocacy is client-centered and meets identified objectives
 
  • Ways in which P&As represent their clients vigorously and ethically
  • Extent to which client caseloads reflect priorities and needs of people with DD and their families
  • Nature and extent to which individual advocacy is client-centered and meets identified objectives
   
Systemic Advocacy
 
  • Identification of systemic advocacy issues
  • Nature of systemic advocacy efforts (how, with whom)
  • Systemic changes (or prevention of changes) facilitated by the P&A
   
  • Inputs that influence policymaking
  • Identification of systemic advocacy issues
  • Process used by P&As for systemic advocacy
  • Nature of advocacy issues in which P&A participates
  • Roles of P&As and collaborators in systemic advocacy activities
  • Outcomes of P&A systemic advocacy
 
Intake
 
  • Intake process, procedures, and policies
  • Responsiveness of intake process to the needs of those contacting the P&A
  • Process for monitoring intake
  • Ways in which and extent to which data collected during intake is used
  • Capacity of intake staff to respond efficiently, effectively, and in a culturally competent manner
  • Nature and extent to which intake staff are appropriately trained and monitored
 
  • Intake process, procedures, and policies
  • Whether P&A staff are efficient and effective (record keeping system, adherence to regulations)
   
Outreach and Public Education
 
  • Types of P&A outreach and public education activities
  • Consistency of P&A activities and efforts with targets set in SGP
  • P&A ability to reach target audiences
  • Nature of P&A outreach and public education activities (e.g., culturally competent)
  • Responsiveness of P&A outreach and public education activities to needs of people with developmental disabilities and their families
 
  • Types of P&A outreach and public education activities
  • Nature of P&A outreach and public education activities (e.g., culturally competent)
  • Responsiveness of P&A outreach and public education activities to needs of people with developmental disabilities and their families
 
  • Ways in which public education is delivered by P&As
  • Ways in which public education (provision of training, education) is used by recipients
  • Components of public education that are important to recipients of public education
  • Satisfaction with services received
Collaboration
 
  • Resources for collaboration
  • Process of collaboration
  • Results of collaboration
  • Collaborators
  • Ways in which Board members are involved in DDN collaborative efforts
  • Extent to which Board members are satisfied with goals, objectives, process, products, and results of collaborative efforts among DDN programs
     
Logistics:
  • Executive Director: 1 interview with the Executive Director of the P&A for Individuals with Developmental Disabilities (PADD); in person, semi-structured; 2-3 hours.
  • Board of Directors: 3 telephone interviews, semi-structured, with the Chair and 2 other Board members; 15-30 minutes each.
  • P&A staff: 3 interviews (an advocate or lawyer responsible for individual advocacy clients, an intake staff person, and a provider of public education); in person, semi-structured; 1-2 hours.
  • Policymakers/collaborators: focus group of up to 8 members (e.g., senior staff from State service provider organizations, legislative liaisons, and collaborators on recent [within the past year] systemic advocacy activities); 2 hours.
  • Recipients of public education: focus group of up to 8 recent training participants (within the past year); 2 hours.