Agency for Toxic Substances and Disease Registry (ATSDR) 
Evaluation Primer on
Health Risk Communication Programs
Elements of an Evaluation Design


Every formal design, whether formative, process, outcome, impact, or a combination of these, must contain eight basic elements (NCI 1992).

1. A Statement of Communication Objectives

Evaluation cannot measure achievements that have not been adequately defined. Evaluators need clear, well-defined objectives in order to measure program effects.

2. Definition of Data To Be Collected

This is the determination of what is to be measured in relation to the objectives.

3. Methodology

A study design is formulated to permit measurement in a valid and reliable manner.
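
One concrete piece of such a design is deciding how many people to survey. The following is a minimal sketch in Python, assuming a pre/post survey that compares the proportion of respondents aware of a risk message before and after the program; the awareness rates, significance level, and power shown are illustrative assumptions, not figures from the primer.

from math import ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Approximate respondents needed per survey wave to detect a change
    from proportion p1 to proportion p2 with a two-sided test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # about 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # about 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical example: detect an increase in awareness from 30% to 45%.
print(sample_size_two_proportions(0.30, 0.45))   # prints 160 respondents per wave

A calculation of this kind helps ensure that the data collected later in the design are sufficient to measure program effects reliably.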

4. Instrumentation

Data collection instruments are designed and pretested. These instruments range from simple tally sheets for counting public inquiries to complex survey and interview forms.
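
As a small illustration of the simplest instrument mentioned above, the sketch below keeps a running tally of public inquiries by topic; the topics and entries are invented for the example.

from collections import Counter

# Each entry records the topic of one public inquiry (hypothetical data).
inquiry_log = [
    "drinking water", "soil contact", "drinking water",
    "fish consumption", "drinking water", "soil contact",
]

tally = Counter(inquiry_log)
for topic, count in tally.most_common():
    print(f"{topic}: {count}")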

5. Data Collection

The actual process of gathering information.

6. Data Processing

Putting the information into a usable form for analysis.
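
The sketch below illustrates one possible processing step, assuming the raw data are survey records with a survey wave and an awareness answer; incomplete records are screened out and the rest are recoded into counts that the analysis step can use. The field names and answer codes are illustrative assumptions.

from collections import Counter

# Hypothetical raw survey records; "aware" is blank when a respondent
# skipped the question.
raw_records = [
    {"wave": "baseline",  "aware": "yes"},
    {"wave": "baseline",  "aware": "no"},
    {"wave": "follow-up", "aware": "yes"},
    {"wave": "follow-up", "aware": ""},
    {"wave": "follow-up", "aware": "yes"},
]

# Keep only records with a usable answer, then count responses by wave.
clean = [r for r in raw_records if r["aware"] in ("yes", "no")]
counts = Counter((r["wave"], r["aware"]) for r in clean)

for (wave, answer), n in sorted(counts.items()):
    print(f"{wave:>9}  {answer:>3}  {n}")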

7. Data Analysis

The application of statistical techniques to the information to discover significant relationships.
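
As one example of such a technique, the sketch below applies a two-proportion z-test to compare awareness of a risk message at baseline and at follow-up; the counts are invented for illustration, and the choice of test is an assumption, not a method prescribed by the primer.

from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for the difference between
    proportions x1/n1 and x2/n2, using the pooled-variance approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 54 of 180 baseline respondents aware vs. 81 of 180
# at follow-up; z is about -2.9 and p is about 0.003.
z, p = two_proportion_z_test(54, 180, 81, 180)
print(f"z = {z:.2f}, p = {p:.4f}")

With these illustrative counts, the increase in awareness would be statistically significant at the 0.05 level.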

8. Report

Compiling and recording evaluation results. These results rarely pronounce a program a complete success or failure; to some extent, every program has both good and bad elements. Lessons can be learned from both if the results are properly analyzed, and those lessons should be applied to improve the existing program or to guide the planning of new efforts.



Revised May 1997.