What Now? Communicating Effectively About Prevention Data

Evaluation Reports

The following is a suggested outline for organizing and presenting your evaluation findings. You will need a report that clearly describes the effectiveness of your program in order to maintain current support, garner additional support, and apply for continued funding.

Front Cover

Make sure that your front cover looks neat and professional -- this is the first thing readers will see, so you want to make a good impression. The front cover should include the following:

  • Title and location of your program or initiative
  • Name(s) of evaluator(s)
  • Period covered by the report
  • Date of the report

Section I. Executive Summary

This brief (two- to three-page) overview of the evaluation should outline major findings and recommendations. Since many people only read the executive summary (and ignore the rest of the report), make sure that it is as clear and complete as possible. The executive summary should answer these questions:

  • What was evaluated?
  • Why was the evaluation conducted?
  • What are the major findings and recommendations?

If space permits, it should also describe:

  • The report's intended audience
  • Others who might find the report to be of interest or importance
  • Decisions that have been made, or need to be made, based on the evaluation results

Section II. Background Information about the Program

Write this section assuming that readers know nothing about your program. Typically, this section should include and/or identify the following:

  • Origins of your program or initiative
  • Program goals
  • Target audience
  • Administrative/organizational structure
  • Program activities and services
  • Materials used and produced by the program
  • Program staff

Section III. Description of the Evaluation

This section explains why you conducted the evaluation and what you hoped to learn from it. It should also explain anything the evaluation was not intended to do (e.g., if it was a process evaluation, it was not meant to assess program effectiveness). This section should include the following information:

  • Name of organization requesting the evaluation
  • Any evaluation restrictions (e.g., money, time)
  • Evaluation design and why it was selected
  • Timetable for collecting data
  • Type of data collected (for each separate measure)
  • Methods used to gather data and why they were chosen
  • Steps taken to ensure accuracy

Section IV. Evaluation Results

This section is where you present your findings. To be complete, this section should include the following:

  • All of the data collected during the evaluation, analyzed, recorded, and organized so that it is easily understood (use charts, tables, and graphs, as appropriate)
  • Excerpts from interviews
  • Testimonials from participants and clients
  • Questionnaire results
  • Test scores
  • Anecdotal evidence

Section V. Discussion of Results

Here is your chance to assign meaning to your results and place them in the context of your overall initiative. These are some questions that this section might answer:

  • How sure are you that your program or initiative caused these results?
  • Were there any other factors that could have contributed to the results?
  • If your program did not exist, how would the results differ?
  • Based on these results, what are the strengths and weaknesses of your program?

Section VI. Costs and Benefits

This section is optional, but it can provide you with an opportunity to justify your program's budget and financial choices. It may include:

  • Costs associated with the initiative (e.g., resources, staff/volunteer hours)
  • Methods used to develop the budget
  • Program benefits (both financial and non-financial)

Section VII. Conclusions

After writing this entire report, you may be tempted to dash off a brief conclusion. Resist that temptation! This is where you make your recommendations, so take your time and think through what you want to say. This section should include the following:

  • Major conclusions, based on the evaluation results
  • Recommendations for future program activities
  • Things about the evaluation that did and did not work well
  • Recommendations for future program assessments

From:

Hampton, C. (2002). Communicating Information to Funders for Support and Accountability. University of Kansas: Community Toolbox. Available online at http://ctb.lsi.ukans.edu/tools/EN/sub_section_tools_1376.htm.


Last Modified: 11/06/2008