Evaluation Briefs
No. 11 | November 2006

Preparing an Evaluation Report

Many audiences want to learn about and understand evaluation results. Dissemination is the process of communicating procedures, evaluation results, programmatic achievements, or lessons learned from an evaluation in a timely, unbiased, and non-technical manner. This Brief provides a general outline for an evaluation report that can be adapted to present evaluation results and tailored to address the questions and concerns of different audiences.

Components of the Evaluation Report

An evaluation report clearly, succinctly, and impartially communicates all aspects of the evaluation. Additional guidance on writing evaluation reports, available on the Internet, is listed under Resources at the end of this Brief. Your report should include eight sections:

Executive summary
Background and purpose
   § Program background
   § Purpose of the evaluation
   § Brief program description
Evaluation methods
   § Data collection methods
   § Data sources
   § Sampling procedures and/or description of respondents
   § Data processing and analysis techniques, if appropriate
   § Data limitations
Results
Discussion of the results
Conclusions and recommendations
References
Appendices

Executive summary. This short section, usually two pages or less at the beginning of the report, provides a brief picture of the program and the most significant evaluation findings and recommendations. State the evaluation questions, the data collection methods, and the results of the evaluation. If space permits, you may also provide recommendations.

Background and purpose. Describe the history of the program, its goals and objectives, and its major strategies. Highlight parts of the program that are unique. Define the purpose of the evaluation and the program's target population.

Evaluation methods. Describe the methods in sufficient detail to enable others to replicate your approach. Include information on the timing and frequency of data collection, from whom the data were collected, any sampling procedures used, the data sources (records, questionnaires, interviews, etc.), how data were collected, and who was responsible for data collection. Describe any limitations of your evaluation approach, problems you encountered, and how you resolved them.

Results. Present key evaluation results without much interpretation. Consider using tables or cross-tabulations, examples, quotes, illustrations, photos, and graphics to emphasize important findings and create a memorable and personalized account of your program for readers. (See the evaluation brief entitled "Disseminating Program Achievements and Evaluation Findings to Garner Support.") Briefly explain the major findings revealed by the data. For example, a table might list the different groups of school staff who attended training on HIV/AIDS policies, the percentage of each group you trained, and the percentage of each group that you expected to train, as specified in your program logic model. Comment on the differences between expected and actual percentages.

Discussion of the results. If you have explanations or insights about what occurred and why, state your opinions and interpret the data in this section. Even when your findings are not what you had originally expected, your insights may help others who plan a similar program.

Conclusions and recommendations. This section should not contain any new information but should restate the findings concisely.
This is also the place to make recommendations about program effectiveness, improvements, financial support, or policy changes based on the results. Moving from data to recommendations can be difficult. It is critical to identify different audiences in the early stages of the evaluation and determine what information is relevant to them, so that your recommendations are more likely to be adopted. Making realistic recommendations requires input not only from the evaluator and program staff but also from primary decision makers, who will use the results to generate their own recommendations. Specific audiences include program advisory boards, state legislators, coalition members, CDC and other funding agencies, teachers and school administrators, and state and local school boards. All of these audiences have different interests and decision-making responsibilities and will use the evaluation report in different ways.

References. Provide complete citations for any reports or publications cited in the body of the report.

Appendices. If you wish to encourage others to replicate your evaluation, provide a copy of all data collection tools (e.g., questionnaires and interview protocols). You may also include a copy of your program logic model to provide additional details on your activities, anticipated outputs, and outcomes.

Resources

Online Evaluation Resource Library (OERL). Frequently Asked Questions about Reports. Available for download at: http://oerl.sri.com/reports/reportsfaq.html. (Accessed 11/7/06)

Tell Your Story: Guidelines for Preparing an Evaluation Report. Available for download at: http://www.dhs.ca.gov/ps/cdic/tcs/documents/eval/EvaluationReport.pdf. (Accessed 11/7/06)

Online Evaluation Resource Library (OERL). Quality Criteria for Reports. Available for download at: http://oerl.sri.com/reports/reportscrit.html. (Accessed 11/7/06)

For further information or assistance, contact the Evaluation Research Team at ert@cdc.gov. You can also contact us via our website: http://www.cdc.gov/healthyyouth/evaluation/index.htm.