Centers for Disease Control and Prevention


CDC Evaluation Working Group
Eval Home | Overview | News and Notes | Contact Us


Framework for Program Evaluation

Citation: Centers for Disease Control and Prevention.  Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11).

View as: PDF | HTML


Summary
Effective program evaluation is a systematic way to improve and account for public health actions that involves procedures that are useful, feasible, ethical, and accurate.  The framework guides public health professionals in their use of program evaluation. It is a practical, nonprescriptive tool, designed to summarize and organize essential elements of program evaluation. The framework comprises steps in program evaluation practice and standards for effective program evaluation. Adhering to the steps and standards of this framework will allow an understanding of each program's context and will improve how program evaluations are conceived and conducted.

Evaluation can be tied to routine program operations when the emphasis is on practical, ongoing evaluation that involves all program stakeholders, not just evaluation experts.  Informal evaluation strategies may be adequate for ongoing program assessment.  However, when the stakes of potential decisions or program changes increase, employing evaluation procedures that are explicit, formal, and justifiable becomes important. 

Understanding the logic, reasoning, and values of evaluation that are reflected in this framework can lead to lasting impacts, such as basing decisions on systematic judgments instead of unfounded assumptions.


Purposes
The framework was developed to:

  • Summarize and organize the essential elements of program evaluation
  • Provide a common frame of reference for conducting evaluations
  • Clarify the steps in program evaluation
  • Review standards for effective program evaluation
  • Address misconceptions about the purposes and methods of program evaluation


Scope
Throughout this report, the term "program" is used to describe the object of evaluation; it applies to any organized public health action. This definition is deliberately broad because the framework can be applied to almost any public health activity, including:

  • Direct service interventions
  • Community mobilization efforts
  • Research initiatives
  • Surveillance systems
  • Policy development activities
  • Outbreak investigations
  • Laboratory diagnostics
  • Communication campaigns
  • Infrastructure building projects
  • Training and education services
  • Administrative systems
  • Others

Additional terms defined in this report were chosen carefully to create a basic evaluation vocabulary for public health professionals.


How to Assign Value
Questions regarding values, in contrast with those regarding facts, generally involve three interrelated issues:

  • Merit (i.e., quality)

  • Worth (i.e., cost-effectiveness)

  • Significance (i.e., importance)

Assigning value and making judgments regarding a program on the basis of evidence requires answering the following questions:

  • What will be evaluated? (i.e., what is "the program" and in what context does it exist?)

  • What aspects of the program will be considered when judging program performance?

  • What standards (i.e., type or level of performance) must be reached for the program to be considered successful?

  • What evidence will be used to indicate how the program has performed?

  • What conclusions regarding program performance are justified by comparing the available evidence to the selected standards?

  • How will the lessons learned from the inquiry be used to improve public health effectiveness?

These questions should be addressed at the beginning of a program and revisited throughout its implementation. The framework provides a systematic approach for answering these questions.


Steps and Standards
The following table summarizes the steps in program evaluation practice, with the most important subpoints for each, as well as the standards that govern effective program evaluation. Follow the links to see brief definitions for each concept.

Steps in Evaluation Practice


Engage stakeholders
Those involved, those affected, primary intended users

Describe the program
Need, expected effects, activities, resources, stage, context, logic model

Focus the evaluation design
Purpose, users, uses, questions, methods, agreements

Gather credible evidence
Indicators, sources, quality, quantity, logistics

Justify conclusions
Standards, analysis/synthesis, interpretation, judgment, recommendations

Ensure use and share lessons learned
Design, preparation, feedback, follow-up, dissemination

Standards for "Effective" Evaluation


Utility
Serve the information needs of intended users

Feasibility
Be realistic, prudent, diplomatic, and frugal

Propriety
Behave legally, ethically, and with due regard for the welfare of those involved and those affected

Accuracy
Reveal and convey technically accurate information

The steps and standards are used together throughout the evaluation process. For each step, a subset of the standards is generally most relevant to consider. These are linked in the table entitled "Cross Reference of Steps and Relevant Standards."
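Because the framework is a fixed set of named steps, subpoints, and standards, it lends itself to a simple machine-readable form, for example to generate a planning checklist for an evaluation team. The sketch below is illustrative only and is not part of the CDC publication; the step, subpoint, and standard names are copied from the tables above, and the `checklist` helper is a hypothetical convenience.

```python
# Illustrative sketch (not from the CDC publication): the six steps and four
# standards from the tables above, captured as plain Python data so a team
# could, for example, generate a planning checklist from them.

STEPS = {
    "Engage stakeholders": ["those involved", "those affected", "primary intended users"],
    "Describe the program": ["need", "expected effects", "activities", "resources",
                             "stage", "context", "logic model"],
    "Focus the evaluation design": ["purpose", "users", "uses", "questions",
                                    "methods", "agreements"],
    "Gather credible evidence": ["indicators", "sources", "quality", "quantity",
                                 "logistics"],
    "Justify conclusions": ["standards", "analysis/synthesis", "interpretation",
                            "judgment", "recommendations"],
    "Ensure use and share lessons learned": ["design", "preparation", "feedback",
                                             "follow-up", "dissemination"],
}

STANDARDS = {
    "Utility": "serve the information needs of intended users",
    "Feasibility": "be realistic, prudent, diplomatic, and frugal",
    "Propriety": "behave legally, ethically, and with due regard for the welfare "
                 "of those involved and those affected",
    "Accuracy": "reveal and convey technically accurate information",
}

def checklist():
    """Yield one checklist line per step/subpoint pair, in framework order."""
    for step, subpoints in STEPS.items():
        for sub in subpoints:
            yield f"[ ] {step}: {sub}"

for line in checklist():
    print(line)
```

Keeping the framework as plain data like this makes it easy to render the same content as a worksheet, a report outline, or a review rubric without restating the framework itself.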


Applying the Framework

Conducting Optimal Evaluations

Public health professionals can no longer question whether to evaluate their programs; instead, the appropriate questions are:

  • What is the best way to evaluate?
  • What are we learning from evaluation?
  • How will we use the learning to make public health efforts more effective?

The framework for program evaluation helps answer these questions by guiding its users in selecting evaluation strategies that are useful, feasible, ethical, and accurate.

To use the recommended framework in a specific program context requires skill in both the science and art of program evaluation. The challenge is to devise an optimal — as opposed to an ideal — strategy. An optimal strategy is one that accomplishes each step in the framework in a way that accommodates the program context and meets or exceeds all relevant standards.


Assembling an Evaluation Team

Harnessing and focusing the efforts of a collaborative group is one approach to conducting an optimal evaluation. A team approach can succeed when small groups of carefully selected persons decide what the evaluation must accomplish, and they pool resources to implement the plan. Stakeholders might have varying levels of involvement on the team that correspond to their own perspectives, skills, and concerns. A leader must be designated to coordinate the team and maintain continuity throughout the process; thereafter, the steps in evaluation practice guide the selection of team members. For example,

  • Those who are diplomatic and have diverse networks can engage other stakeholders and maintain involvement.
  • To describe the program, persons are needed who understand the program's history, purpose, and practical operation in the field. In addition, those with group facilitation skills might be asked to help elicit unspoken expectations regarding the program and to expose hidden values that partners bring to the effort. Such facilitators can also help the stakeholders create logic models that describe the program and clarify its pattern of relationships between means and ends.
  • Decision makers and others who guide program direction can help focus the evaluation design on questions that address specific users and uses. They can also set logistic parameters for the evaluation’s scope, time line, and deliverables.
  • Scientists, particularly social and behavioral scientists, can bring expertise to the development of evaluation questions, methods, and evidence gathering strategies. They can also help analyze how a program operates in its organizational or community context.
  • Trusted persons who have no particular stake in the evaluation can ensure that participants’ values are treated fairly when applying standards, interpreting facts, and reaching justified conclusions.
  • Advocates, creative thinkers, and members of the power structure can help ensure that lessons are learned from the evaluation and that the new understanding influences future decision-making regarding program strategy.

All organizations, even those that are able to find evaluation team members within their own agency, should collaborate with partners and take advantage of community resources when assembling an evaluation team. This strategy increases the available resources and enhances the evaluation’s credibility. Furthermore, a diverse team of engaged stakeholders has a greater probability of conducting a culturally competent evaluation (i.e., one that understands and is sensitive to the program’s cultural context). Although challenging for the coordinator and the participants, the collaborative approach is practical because of the benefits it brings (e.g., reduces suspicion and fear, increases awareness and commitment, increases the possibility of achieving objectives, broadens knowledge base, teaches evaluation skills, strengthens partnerships, increases the possibility that findings will be used, and allows for differing perspectives).


Addressing Common Concerns

Common concerns regarding program evaluation are clarified by using this framework. Evaluations might not be undertaken because they are misperceived as having to be costly. However, the expense of an evaluation is relative; the cost depends on the questions being asked and the level of certainty desired for the answers. A simple, low-cost evaluation can deliver valuable results.

Rather than discounting evaluations as time-consuming and tangential to program operations, the framework encourages conducting evaluations that are timed strategically to provide necessary feedback. This makes integrating evaluation with program practice possible.

Another concern centers on the perceived technical demands of designing and conducting an evaluation. Although circumstances exist where controlled environments and elaborate analytic techniques are needed, most public health program evaluations do not require such methods. Instead, the practical approach endorsed by this framework focuses on questions that will improve the program by using context-sensitive methods and analytic techniques that summarize accurately the meaning of qualitative and quantitative information.

Finally, the prospect of evaluation troubles some program staff because they perceive evaluation methods as punitive, exclusionary, and adversarial. The framework encourages an evaluation approach that is designed to be helpful and engages all interested stakeholders in a process that welcomes their participation.


Documents and Manuals Based on the Framework
To facilitate learning, the framework is presented in the following formats for use with different audiences.

  • Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide (92 pages) [PDF]
  • MMWR Recommendations and Reports (42 pages + references + boxes) [PDF | HTML]
  • Overview of the Framework (3 pages) [PDF]
  • Summary of the Framework (10 pages + boxes) [PDF]
  • Dissemination and Adaptation (Sept-Dec, 1999) (article published in Health Promotion Practice) [PDF]
  • Adapted Version for Community Stakeholders (67 pages) [PDF]
  • Instructional Video and Workbook (a 5-hour distance learning course) [PDF]
  • Gateway to the Community Tool Box (CTB) (framework linked to CTB sections) [HTML]


Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide
CDC's Office of Strategy and Innovation (OSI) produced a self-study manual organized around the steps in the framework, "Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide." This is a public domain document that can be shared without restriction.

This version was developed to provide a practical tool for each step in the CDC framework. The manual presents the same content as the CDC publication, but with a more user-friendly layout, cross-cutting case examples, and in-depth instructions and worksheets.


Adapted Version for Community Stakeholders
The Center for the Advancement of Community-based Public Health (CBPH) produced an adapted version of the framework entitled, "An Evaluation Framework for Community Health Programs."  This is a public domain document that can be shared without restriction. 

This version was developed to provide a practical tool for engaging community stakeholders in program evaluation activities.  Community stakeholders are often prevented from participating because explanations of evaluation are written mainly for academic and professional readers.  This document explains evaluation by speaking directly to people who live and work in communities.  Adaptations were based on feedback gathered systematically from front-line practitioners and community members across the country. The result is a retooled version of the framework that is more accessible to community members and staff of community-based organizations. The CBPH version presents essentially the same content as the CDC publication using less technical language, more graphics, and more user-friendly layout. It also includes case examples and quotes provided by community-based practitioners.


Instructional Video and Workbook
"Practical Evaluation of Public Health Programs" (course # VC0017) is a five-hour distance-learning course organized around CDC's recommended framework for program evaluation. Developed through CDC’s Public Health Training Network, the course consists of two videotapes and a workbook, which can be used by individuals for self-study or by small groups with optional enrichment activities. Continuing education credit is available for this course. For more information, visit the Public Health Training Network website or call 1-800-41-TRAIN (1-800-418-7246).

Course materials may be purchased from the Public Health Foundation by calling the toll-free number 1-877-252-1200 or by using their online order form. The cost is approximately $40.00.

For informational purposes, the workbook can be viewed free of charge on the internet.


Gateway to the Community Tool Box
The Community Tool Box (CTB) is a highly acclaimed internet resource for health promotion and community development. It contains a wealth of practical information about how to do the work of public health and social change at the community level. Because the CTB team considers program evaluation a critical part of successful community-based work, they used the basic elements of the framework to create a unique gateway to evaluation ideas and tools.
