Learn about theories of change, logic models, evaluation plans, and other research tools that help IECMHC teams measure program fidelity and impact.

Vision Statement on Research and Evaluation

The Center's vision is that solid research and evaluation will help establish infant and early childhood mental health consultation (IECMHC) as a necessary and essential service for children, families, and programs across the nation. Research and evaluation on IECMHC will be equity-informed. Data will be gathered at multiple levels and used to sustain programs and monitor fidelity. Evaluators will work with their community and state partners to ensure that they are measuring policy-relevant outcomes. Additional information on working with tribal communities is also available.

Evaluation Planning

Program developers, implementers, evaluators, and practitioners must take several steps to demonstrate the success, impact, and fidelity of IECMHC programs. Learn about the steps of the program evaluation process and the evidence base for IECMHC:

- Designing an IECMHC Evaluation Plan
- Logic Models
- Selecting Measures for IECMHC Evaluation
- Equity-informed IECMHC Evaluation
- Using Data for Sustainability
- IECMHC Evaluation Examples
- Evidence Base for IECMHC

Designing an IECMHC Evaluation Plan

Logic Models

A well-designed evaluation begins with a program logic model, created in collaboration with stakeholders, program evaluators, and program designers. Logic models are visual depictions of an IECMHC program's purpose, processes, and outcomes. A logic model states the program's basic assumptions, intended target populations, specific IECMHC services, and the processes needed to support the program. Logic models are also an essential tool for guiding measurement of the program's effects.
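One way for a team to move from a visual logic model toward measurement planning is to capture its components as structured data, so each intended outcome is paired with a candidate measure. The sketch below is only an illustration; all class, field, and example names are invented, not part of any published IECMHC model.

```python
# Hypothetical sketch: recording a logic model's components as data so that
# every planned outcome is explicitly tied to a measurement source.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    assumptions: list = field(default_factory=list)
    target_populations: list = field(default_factory=list)
    services: list = field(default_factory=list)       # specific IECMHC services
    processes: list = field(default_factory=list)      # supports needed to deliver them
    outcomes: dict = field(default_factory=dict)       # outcome -> candidate measure


# Invented example content for illustration only.
model = LogicModel(
    assumptions=["Consultation builds adult capacity to support children"],
    target_populations=["Preschool classrooms in high-need communities"],
    services=["Classroom-level consultation", "Child-specific consultation"],
    processes=["Reflective supervision for consultants"],
    outcomes={"Reduced expulsions": "Program expulsion-rate records"},
)

# Every outcome named in the logic model now has a measure attached,
# which is the alignment the evaluation plan needs.
print(model.outcomes["Reduced expulsions"])
```

Structuring the model this way makes the later step, selecting measures that align with the logic model, a matter of filling in the `outcomes` mapping rather than starting from scratch.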
Logic Model Toolbox Resources

- Maryland's Project LAUNCH (school-based) – 2017 (PDF | 408 KB)
- Alaska's Head Start Program (rural) – 2017 (PDF | 130 KB)
- Multnomah County, Oregon LAUNCH (home visiting) – 2017 (PDF | 146 KB)

The following documents can also help your team develop a program logic model:

- A Roadmap for Collaborative and Effective Evaluation in Tribal Communities, from the Children's Bureau at the Administration for Children and Families (ACF), is designed to create a shared vision for evaluating the welfare of tribal children. While ACF uses the term "roadmap" here, the premise is comparable to a logic model.
- The W.K. Kellogg Foundation Logic Model Development Guide presents the components of a logic model and shows how they can be adapted to a variety of programs.
- Alaska's IECMHC Framework – 2017 (PDF | 449 KB) identifies a number of successful strategies that programs can use to implement IECMHC services in rural settings.

Selecting Measures for IECMHC Evaluations

Best practice requires IECMHC evaluations to collect data at multiple levels, including program characteristics; consultant demographics; teacher, classroom, and home visitor variables; and child- and family-level outcomes. If you are starting to plan an evaluation, ensure that your team collects data that can be disaggregated by gender, race, ethnicity, and other characteristics of the service recipients.

It is essential to examine the fidelity of your IECMHC model. Fidelity is defined as how closely a program follows the intended protocol or procedures. Data systems are also critical to conversations about fidelity, program results, and future funding. Data can also highlight areas of the state where out-of-school discipline is disproportionate and may need to be addressed through better access to IECMHC services.

IECMHC is a complex intervention that can affect outcomes at multiple levels, depending on the model and approach.
Examples of individual outcomes associated with IECMHC include improvements in children's behavior, teachers' practices, and parental stress. Program outcomes may be measured by population reach, service provision across sites, expulsion rates, and disproportionality across target populations. Measures selected for IECMHC evaluations should align with the logic model.

Evaluation Toolbox Resources

Use the following documents and resources to help your team select program measures:

- Measures Used to Evaluate Outcomes in IECMHC – 2017 (PDF | 916 KB) is a table that summarizes many of the tools evaluators use to measure the impact of IECMHC.
- The Observation Toolkit for Mental Health Consultants, from the Center for Early Childhood Mental Health Consultation, is designed for infant and early childhood mental health consultants working in preschool classrooms. It contains materials and strategies for measuring implementation at the program and teacher levels, as well as strategies for collecting data on child behavior.

Equity-informed IECMHC Evaluation

As demonstrated in data on school discipline gathered by the Department of Education's Office for Civil Rights – 2014 (PDF | 2.1 MB), there are ongoing disparities in out-of-school discipline for preschool boys and African American preschoolers. While access to IECMHC has been associated with overall reductions in expulsions from early care and education settings, it remains to be seen whether IECMHC can directly address this disproportionality. Evaluators are encouraged to collect information on race and ethnicity, gender, suspensions, and expulsions so that better data on this equity issue can be analyzed. Learn more about promoting equity and reducing disparities when engaging in IECMHC.
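The guidance above, collecting record-level data that can be disaggregated and examined for disproportionality, can be sketched in a few lines of analysis code. This is a minimal illustration, not a prescribed method: the column names, the tiny dataset, and the simple risk-ratio index are all invented for the example.

```python
# Hypothetical sketch: disaggregating expulsion data and computing a simple
# risk ratio to flag possible disproportionality. All data are invented.
import pandas as pd

# Each row represents one enrolled child; `expelled` is 1 if expelled.
records = pd.DataFrame({
    "race":     ["Black", "Black", "White", "White", "White", "Black", "White", "Black"],
    "gender":   ["M", "F", "M", "M", "F", "M", "F", "F"],
    "expelled": [1, 0, 0, 1, 0, 1, 0, 0],
})

# Disaggregate: expulsion rate for each race-by-gender group.
rates = records.groupby(["race", "gender"])["expelled"].mean()
print(rates)

# A simple disproportionality index: one group's expulsion rate divided by
# the rate for all other children. A ratio well above 1.0 flags a disparity
# that warrants closer examination.
group = records["race"] == "Black"
risk_ratio = records.loc[group, "expelled"].mean() / records.loc[~group, "expelled"].mean()
print(round(risk_ratio, 2))
```

With real program data, the same disaggregation would extend to suspensions and other characteristics of service recipients, and a flagged ratio would prompt deeper analysis rather than serve as a conclusion on its own.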
State Snapshots: Equity in IECMHC Evaluations – 2017 (PDF | 259 KB) describes the purposes of equity-informed evaluation, offers key recommendations, and provides snapshots of equity-informed evaluation in five states:

- Louisiana's IECMHC program prioritizes equity by requiring all affiliated staff, including the evaluation team, to participate in "undoing racism" seminars and training.
- Maryland's detailed IECMHC program monitoring system allows administrators to disaggregate data by race, gender, community, and provider. This helps ensure that communities affected by disproportionate suspension and expulsion rates receive IECMHC services.
- Connecticut's statewide IECMHC information system builds transparency into its data collection and program monitoring. The system includes a feedback loop in which mental health consultants share outcomes with consultees and families, involving them in decisions about moving forward with IECMHC services.
- Arkansas' commitment to equity in IECMHC includes new and stronger initiatives that strengthen non-expulsion policies throughout the childcare system, along with data tracking to monitor adherence to these policy requirements.
- Arizona's evaluation of its Smart Support IECMHC program allowed evaluators to disaggregate child-level outcomes for African American and Latino children to examine changes in expulsion and suspension patterns.

Using Data for Sustainability

Strong evaluation data are often a powerful tool for communicating with stakeholders. Securing even a small amount of funds for an external evaluation can help make the case for scaling up pilot sites and for securing state general revenue after a federal grant or seed money ends. Evaluators should work with their community and state partners to ensure they are measuring policy-relevant outcomes. The video Using Data to Show Effectiveness and Promote Sustainability (three minutes) can help your team use data for sustainability.
In this video, key staff from Maryland explain how they used data to demonstrate IECMHC efforts and outcomes, leading to an effective message to share with decision-makers.

IECMHC Evaluation Examples

Evaluations of IECMHC programs serve many purposes, including:

- Tracking programmatic outcomes in a statewide data system
- Documenting the reach of a targeted population, such as children in foster care settings
- Fostering the use of innovative evaluation tools to demonstrate social and emotional support for children in IECMHC programs

Rigorous evaluations produce credible data that can help make the case for sustainability and for securing additional funds from sponsors. Use the following evaluation examples to inform your own evaluation plan:

- Development of Maryland's IECMHC Outcomes Monitoring System – 2017 (PDF | 243 KB) shows how Maryland's system for monitoring IECMHC program outcomes allows administrators to disaggregate data by race, gender, community, and provider. This system is used in part to ensure that communities affected by disproportionate rates of suspensions and expulsions receive IECMHC services.
- IECMHC to Support Children in Foster Care – 2017 (PDF | 530 KB) profiles Arkansas's unique IECMHC program for addressing the needs of children in foster care.
- Using Evaluation Data to Make the Case for Securing or Expanding IECMHC Funding is a three-and-a-half-minute video in which the Maryland team discusses its Outcomes Monitoring System. The team explains how data demonstrating positive program outcomes and proven effectiveness led to expanded funding for IECMHC services.
- Arizona's Smart Support Evaluation Report: The First Four Years examines the role of IECMHC in promoting the social and emotional development of children in Arizona's early care and education programs.
- An Interdisciplinary Evaluation Report of Michigan's Childcare Expulsion Prevention Initiative provides an example of how one state measured program fidelity and outcomes for children, parents, and providers.
- Early Childhood Consultation Partnership (ECCP): Results Across Three Statewide Random-Controlled Evaluations – 2016 summarizes and examines findings from this unique program.

IECMHC program evaluations often rely on quasi-experimental methods, most often pre-/post- designs; very few evaluations are funded to include a comparison or control group, which adds expense and complexity to the design. An important exception is Connecticut's ECCP program, which partnered with Yale University to conduct several randomized controlled trials of its statewide program.

Evidence Base for IECMHC

During a September 2014 SAMHSA meeting of federal agencies and experts in the field of IECMHC, researchers identified a substantial body of evidence for the effectiveness of IECMHC across a range of outcomes. In a variety of early care, education, and home visiting settings, IECMHC has been shown to reduce behavior problems in children and to increase adults' awareness of the need to focus on social and emotional health. While this is welcome news, there is still a need for more equity-based evaluations that examine whether IECMHC can reduce differential treatment of children based on factors such as race and gender. Access documents and tools on the evidence base for IECMHC.

Find additional resources in the IECMHC Toolbox, including guidance on:

- Systems and Policy
- Models
- Competencies
- Workforce Development
- Communications
- Financing