Fundamentals of Evaluating Partnerships:
Evaluation Guide

Background

In 1998, the U.S. Congress provided funding for the Centers for Disease Control and Prevention (CDC) to initiate a national, state-based program, the Heart Disease and Stroke Prevention (HDSP) Program. State health departments are eligible for funds at two levels, Capacity Building (CB) and Basic Implementation (BI). Capacity Building states convene a state partnership, define the state’s heart disease and stroke burden, develop a comprehensive state plan, and provide training and technical assistance for partners. Basic Implementation states receive additional funding to implement heart disease and stroke prevention policy and system changes to improve the quality of care, improve emergency response, improve control of high blood pressure and high blood cholesterol, increase knowledge of signs and symptoms, and eliminate disparities. Because many factors increase the risk of developing heart disease and stroke, state-based programs must use strategies that target multiple risk factors in many different settings, including health care settings, work sites, and communities.

Purpose

The evaluation guides are a series of evaluation technical assistance tools developed by the CDC Division for Heart Disease and Stroke Prevention (DHDSP) for use by state HDSP programs. The guides clarify approaches to and methods of evaluation, provide examples specific to the scope and purpose of state HDSP programs, and recommend resources for additional reading. The guides are intended to offer guidance and a consistent definition of terms. The guides are also intended to aid in skill building on a wide range of general evaluation topics while recognizing that state HDSP programs differ widely in their experience with, and resources for, program evaluation. Although the guides were developed for use by state HDSP programs, the information will also benefit other state health department programs, especially chronic disease programs. State Well-Integrated Screening and Evaluation for Women Across the Nation (WISEWOMAN) programs may find the guides useful for evaluation activities as well.

The guides supplement existing program guidance and program evaluation documents such as the CDC State Heart Disease and Stroke Prevention Program Evaluation Framework, which is available on the Internet at http://www.cdc.gov/DHDSP/library/evaluation_framework/index.htm. As they are developed, guides are posted on the DHDSP website at http://www.cdc.gov/DHDSP/state_program/evaluation_guides/index.htm. State program staff are encouraged to provide feedback to the Applied Research and Evaluation Branch on the usefulness of the guides and to suggest additional guide topics.

State Heart Disease and Stroke Prevention (HDSP) programs are expected to identify, consult with, and appropriately involve multiple state partners in developing and implementing a comprehensive state plan and in developing strategies to leverage resources and coordinate interventions. Specific guidance on partnership selection is provided in the Program Funding Opportunity Announcement, which emphasizes that partners should represent the priorities identified by DHDSP. These include

  • Priority populations identified by geography, gender, race/ethnicity, or socioeconomic status.
  • Priority settings such as worksite and health care settings.
  • Priority areas including quality of care, hypertension and high blood cholesterol control, and emergency response.

Partners should also represent

  • Other state health department programs.
  • State and local government agencies that address heart disease and stroke, related risk factors or conditions, priority populations or settings, or that determine policy, such as Medicaid policy.
  • State voluntary organizations that address heart disease and stroke or related risk factors, improve health and quality of life, or that provide access to a setting or a priority population.
  • Private medical practices, health care providers, insurers, federally qualified health centers, and quality improvement organizations.
  • Private organizations, such as an emergency medical services association or a state black nurses’ association.
  • Businesses and employer groups such as a business coalition on health or the state chamber of commerce.
  • Universities.
  • Media.

Once the partners are established, states are to sustain and enhance partnerships. It is likely that the number of partners, partners’ activities and responsibilities, and relationships will change over time as the needs of the program change. Enhancing partnerships encompasses

  • Expanding membership to include new and needed partners;
  • Building the knowledge and skills of partners;
  • Improving the functioning and effectiveness of the partnership; and
  • Fully engaging partners in program planning, implementation, and evaluation.

State HDSP programs are expected to evaluate their partnership(s) on a regular basis. Evaluation is “the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.” (Patton, 1997) DHDSP proposes a tiered approach to partnership evaluation. This approach includes

  • An annual assessment of the partnership which involves verifying the number, diversity, and participation of partners.
     
  • Basic evaluation activities, which build upon annual assessment activities, correspond roughly to process evaluation. Process evaluation is conducted once a program or intervention is underway to assess the implementation of that program or intervention. It determines whether the program is implemented as intended, as well as the quantity and quality of processes, activities, and products.
     
  • Enhanced evaluation activities are more focused on outcomes and require more complex methods and more resources. Enhanced evaluation activities build upon basic evaluation and annual assessment activities.

All states should engage in partnership evaluation. States should start by documenting a basic annual assessment and initiating some basic evaluation activities. As state evaluation capacity increases, and funding is available, states will want to enhance partnership evaluation by taking on more complex evaluation activities, while still including the assessment and basic evaluation activities. BI states should be positioned to periodically conduct enhanced evaluations of their partnerships. Sample evaluation questions and activities for basic assessment and for basic and enhanced evaluation are provided in Appendix 1. States may select from and add to evaluation questions from this list on the basis of input from stakeholders, their specific needs, and available resources.

Partnerships can vary substantially in size and scope of work. State HDSP program partnerships may range from a small workgroup tasked with completing a very specific project to a large group of state-level stakeholders who come together to develop and implement a state heart disease and stroke prevention plan. Evaluation activities must therefore be appropriate to the size, scope, and purpose of the partnership. In this guide, not all evaluation methods or all elements of a single method apply to all partnerships. Many apply only to partnerships with a large number of members and high-level tasks.

Partnership evaluation planning should be part of planning the partnership. Evaluation activities should be conducted throughout the life of the partnership and can include relatively simple activities, such as meeting-effectiveness surveys or informal interviews to identify barriers to participation. Identifying critical partners who are not participating, and activities in which partners are not engaged, is especially important. As a program’s capacity and partnerships grow, a plan for more in-depth assessments of the partnerships’ accomplishments will be needed.


Resources

Conducting partnership evaluation requires both staff and fiscal resources. Before planning such an evaluation, first identify funds in the program budget and staff who can lead the work. It is not unusual to dedicate 5-10% of a project budget to evaluation; for example, a program budgeted at $200,000 might set aside $10,000 to $20,000. For help with budgeting, talk with colleagues in the state health department and state contracting offices about the costs of similar evaluation activities conducted previously.

Partnership evaluation is a good collaborative activity for state chronic disease programs, who can share development and implementation costs. State colleagues may already have partnership evaluation tools or strategies they would be willing to share. Partners may also have evaluation staff that could help plan and conduct evaluation activities.

Universities and Prevention Research Centers (http://www.cdc.gov/prc/) are also good evaluation resources. Check for evaluation classes or programs that require class projects, a master’s thesis, or an internship. Student energy and faculty leadership on these projects make for a winning combination. Ask about consulting services or community service projects as well.

The American Evaluation Association is an association of professional evaluators that is “devoted to the application and exploration of program evaluation, personnel evaluation, technology, and many other forms of evaluation” (http://www.eval.org). American Evaluation Association affiliates are located throughout the United States. Check with a local affiliate for potential resources.

This guide applies the CDC Evaluation Framework (http://www.cdc.gov/eval/evalguide.pdf) [PDF–1.3M] to evaluating your partnership. The Framework lays out a six-step process for the decisions and activities involved in conducting an evaluation. While the framework provides “steps” for program evaluation, the steps are not always linear; some can be completed concurrently. In some cases, it makes more sense to skip a step and come back to it. The important thing is that all the steps are addressed. The steps and a brief description of each are listed below. Each is described in more detail on the pages that follow. Sections of the guide are linked to this outline and the CDC framework by a “bubble” graphic in which the highlighted bubble identifies the corresponding point in the framework.

Develop an Evaluation Plan

As you work through the next sections of the guide and begin planning your partnership evaluation activities, remember to add evaluation questions and methods to an evaluation plan. Additional guidance and a template are provided in the “Developing an Evaluation Plan” guide located at http://www.cdc.gov/DHDSP/state_program/evaluation_guides/pdfs/evaluation_plan.pdf. [PDF–176K] The elements of the evaluation plan to be identified through this planning process are

  • Evaluation questions.
  • Indicators – measures needed to answer the evaluation questions.
  • Data sources.
  • Data collection methods.
  • Time frame for evaluation activities.
  • Data analysis.
  • Communicating results – to whom and in what format.
  • Lead person responsible for overseeing the work.

[Image: binder-style tab dividers labeled Use & User, Stakeholders, Describe, Focus, Evidence, Justify, and Use & Share.]

Determine how the evaluation results will be used and by whom. Identify resources available for the evaluation, including money, staff time, and expertise. Begin developing an evaluation plan.

  1. Identify and engage evaluation stakeholders. Plan for how they will be involved in, and will contribute to, the evaluation.
     
  2. Describe the partnership’s members, activities, products, expectations, and outcomes. Develop a logic model to depict the partnership’s theory of change (i.e., how activities will accomplish goals). Identify the stage of development of the partnership. Identify contextual factors that will impact effectiveness. These will be helpful in developing evaluation questions.
     
  3. Brainstorm and then finalize a list of questions the evaluation will answer related to effective processes, partnership activities, and expected outcomes. These will form the basis of an evaluation plan.
     
  4. Determine how you will answer the evaluation questions by identifying indicators, data sources, how you will collect data, and a timeline for data collection. Identify who is responsible for seeing that the work gets done. Pilot test tools. Collect the data.
     
  5. Enter and check the data for errors. Analyze the data in a way that will make sense to the program partners. Interpret the data to reflect the current context. Consider and document factors that may affect or bias the findings. Compare findings with benchmarks or with what others have found.
     
  6. Distribute and use evaluation results. Report often along the way. Tailor the format and the mechanism of reporting to the specific audience.

As you make decisions, information can be added to a table similar to the following:

Objective:
Evaluation Questions | Indicators/Measures | Data Sources | Data Collection | Time Frame | Data Analysis | Communicate Results | Lead

A completed example of an evaluation plan is provided below. A blank planning template is provided in this document as Appendix 2.
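
For programs that track their plan electronically, the same table can be represented as structured data. The following is a minimal, hypothetical sketch in Python; the field names mirror the plan elements listed above, and the example values are illustrative, not from this guide.

    from dataclasses import dataclass

    @dataclass
    class PlanRow:
        evaluation_question: str
        indicators: list          # measures needed to answer the question
        data_sources: list
        data_collection: str      # how the data will be collected
        time_frame: str
        data_analysis: str
        communicate_results: str  # to whom and in what format
        lead: str                 # person responsible for overseeing the work

    plan = [
        PlanRow(
            evaluation_question="Do partners actively participate in meetings?",
            indicators=["Meeting participation rate"],
            data_sources=["Meeting minutes"],
            data_collection="Document review",
            time_frame="Every 6 months",
            data_analysis="Percentage of partners attending each meeting",
            communicate_results="Annual partnership report",
            lead="Partnership coordinator",
        ),
    ]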


Use and User: How Will the Evaluation Results Be Used and by Whom?

Before any other evaluation planning takes place, the purpose of the evaluation and the end user of the evaluation should be clearly understood. These two aspects of the evaluation serve as a foundation for evaluation planning, design, and interpretation of results. The purpose of an evaluation influences the identification of stakeholders for the evaluation, selection of specific evaluation questions, and the timing of evaluation activities. If evaluation findings are intended for use in funding or planning decisions, the evaluation activities will have to be timed to meet that expectation.

Some potential uses of partnership evaluation include

  • Improve the functioning and productivity of state partnerships. Evaluation can identify partnership strengths and areas for improvement in operating processes, structure, planning, and activity implementation.
     
  • Improve and guide partnership activities. Evaluation can be used to assess partnership interventions and activities so that successful strategies can be supported and replicated.
     
  • Determine whether goals or objectives have been met. Achieving objectives provides a sense of accomplishment to members and demonstrates to funders that the partnership is a good investment.
     
  • Promote the public image of the partnership. A partnership with a positive public image may find it easier to recruit new members, retain existing members, secure additional resources, gain access to needed data, etc.
     
  • Build capacity for evaluation within the partnership. People unfamiliar with evaluation may be uncomfortable with the idea of “being evaluated.” However, engaging partnership members in evaluation may help reduce this “evaluation anxiety”. Engaging partners in evaluation tasks may increase their appreciation of the usefulness of evaluation and provide partners with evaluation skills they can apply to the partnership or their own organization.
     
  • Provide accountability to funders and partners. Accountability applies not only to achieving results but also to managing resources. It also applies to valuing the partners’ time and opinions.

Evaluating partnerships can be resource intensive; therefore, it is critical that mutual uses and benefits of such an effort be clearly understood by all involved. Otherwise, partners may see evaluation as taking time away from the “real” work of the group.

The intended user of an evaluation will influence many aspects of the evaluation as well, including the prioritization of evaluation questions and how evaluation results are communicated. Identifying effective communication strategies early in the evaluation process facilitates planning, especially when multiple stakeholders are involved and multiple communication methods are needed.

Examples of potential users of the partnership evaluation include:

  • Partnership leadership.
  • Partnership organizers.
  • Partnership members.
  • Funders.
  • People affected by partnership activities.
  • Potential partners.

The evaluation sponsor (such as the partnership funder or leader) should work with the evaluator to ensure that the intended use and users of the evaluation are agreed upon. The evaluator will use this information to direct and focus evaluation activities, set timelines, and select communication strategies.

Engage Stakeholders

Stakeholders are essential to conducting a successful evaluation. In this context, stakeholders include people who can contribute to or facilitate the specific evaluation project, as opposed to an evaluation advisory group who might contribute to and facilitate general evaluation planning, or programmatic stakeholders. They include people who will use the evaluation results, who support or implement the partnership, and who are affected by the evaluation results. The number of stakeholders will depend greatly on the complexity of the evaluation, what’s at stake from the evaluation, and the importance or complexity of using the evaluation recommendations. Keeping the group a manageable size (maximum of 6 to 10 people) is also a consideration. In a partnership evaluation, stakeholders might include

  • The entities that provide financial support and HDSP program staff.
     
  • At-large partnership members who can support the use of the evaluation.
     
  • Partnership leadership and planning staff.
     
  • Representatives of affected or disparate populations that will be a focus of the evaluation. This may include representatives of specific racial or ethnic groups to reinforce cultural competence in evaluation activities.
     
  • Key leaders in the health area (such as American Heart Association or emergency services) or the health department who can inform the evaluation and use the findings.
     
  • Individuals or organizations that can ensure use of the evaluation.
     
  • Individuals or organizations respected by key users and funders that will enhance the credibility of the evaluation.
     
  • Individuals or organizations that may prevent or discredit the evaluation.

As you identify and engage stakeholders, think about specific areas in which they will provide input or assist with your evaluation. Make a general plan for how stakeholders will be involved throughout the course of the evaluation and in interpreting and reporting findings. To make the best use of their time, stakeholders’ participation may be concentrated in specific steps of the evaluation, such as interpreting data. However, a core group of evaluation stakeholders should be engaged in all phases of the evaluation to ensure continuity. Stakeholder roles or activities in an evaluation may include

  • Clarify the goals and objectives of the partnership.
  • Identify and prioritize evaluation questions.
  • Help develop and pretest evaluation materials.
  • Ensure evaluation results are used.
  • Help develop a data collection plan and collect data.
  • Interpret and report findings.
  • Provide resources for evaluation, including staff, supplies, expertise, etc.

Report back often to stakeholders to ensure their continued support and engagement. Keep stakeholders advised on progress of the evaluation, barriers as they arise, and findings when appropriate.

Other partnership members can be engaged in the evaluation without being a member of the core stakeholder group. Members can be recruited to pretest evaluation tools, participate in data collection, participate in the reporting of findings, develop a utilization plan, etc.

Evaluation stakeholders have an important role in identifying and prioritizing evaluation questions, interpreting evaluation findings, and ensuring use of the evaluation.

Describe the Partnership

A description of the partnership should include the purpose, resources, current and planned activities, expected outcomes, stage of development of the partnership, and the political and social context. A logic model is one way to describe your partnership. Developing or revisiting a partnership logic model at this time can help unify stakeholders’ expectations as well as describe the partnership. You can also use a narrative description to accomplish the same purpose.

Partnership Logic Model

The partnership logic model forms the basis for and can provide a starting place for your evaluation. If there is no partnership logic model, collaboratively developing one while planning an evaluation will foster understanding and general agreement on partnership goals, activities, and expected products. If there is a partnership logic model, evaluation planning is a good time to revisit it. The logic model can be used to identify processes and outcomes for evaluation, guide the development of evaluation questions, and demonstrate a link between workgroup efforts, larger partnership goals, and state program priorities. (See the evaluation guide “Developing and Using a Logic Model” for more information.) Remember that a logic model is a fluid tool and will likely change over time. Logic models are beneficial not only for large partnerships that take on long-term commitments (example in Figure 1 below), but also for small, task-oriented partnerships.

Partnership Stage of Development

The second descriptive assessment you will need to make is the stage of development of your partnership. This is different from the evolution of group dynamics (forming, storming, norming, performing), although you may want to look at your partnership dynamics as well. The developmental stages that partnerships typically move through are formation (assessment and partner selection), building, and maintenance.

The stage of development is important for determining the appropriate focus for the evaluation.

For instance, evaluation of a partnership in the formation stage should focus on partnership development rather than partnership accomplishments.

Formation Stage

  • Needs assessment is what you do to determine the need for and feasibility of the partnership. This stage includes identifying gaps in work in your area, determining what resources are needed and available to develop and sustain the partnership, and assessing the political and social context in which the partnership will operate. This stage will include defining the vision, mission, and core strategy for forming the partnership.
     
  • Formation also involves identifying and recruiting partnership members who are representative of the population, area, and setting, and have the influence and access necessary to accomplish the mission.

Building Stage

  • The building stage of a partnership includes training partners and ensuring that processes, such as communication, decision-making, and reporting are in place. Building your partnership encompasses developing infrastructure and capacity and fostering commitment.

Maintenance Stage

  • As partnerships mature and move into a maintenance stage, partnership activities focus more on achieving outcomes and ensuring sustainability, and on maintaining attention on processes like communication and leadership. You may even have to go back to formation activities if changes occur in the area of program goals/direction, member representation, or funding.

Figure 1

Inputs: Funds; staff time; training.

Activities: Recruit members; facilitate meetings and committees; facilitate communication and decisions; develop and publish state plan; implement state plan objectives; evaluate partnership and interventions.

Outputs: Formal agreement; active committees; published state plan; HDSP interventions in place; partnership improvement plan.

Outcomes: Increased collaboration and reach; leveraged resources for HDSP; increased state activity in HDSP priorities; policy and system level change; reduced heart disease and stroke risk factors; improvements in HDSP priority areas (e.g., emergency response, knowledge of signs and symptoms, quality of care); reduced CVD burden; reduced CVD disparity.

Contextual factors: State and federal funding; newly formed partnership; competing partner and government priorities; rising rates of risk factors; political support; shift in population demographics.

Focus the Evaluation Design

Focusing the evaluation includes determining the evaluation questions you will ask, deciding how and when you will collect data, and choosing the evaluation design you will use.

Determine the Evaluation Questions

Brainstorming a list of potential evaluation questions with partnership stakeholders is the best way to begin. When developing evaluation questions, you have to consider two things simultaneously

  • Purpose of the evaluation (refer to the “Use and User” section above).
  • Stage of development of the partnership.

Taking these into consideration, you can start developing questions that evaluate

  • The number, diversity, and participation of partners (annual assessment). Appendix 3 provides a tool that can be used for new and existing partnerships to assess membership.
     
  • Partnership processes. These include elements such as leadership, resources, characteristics of members, and training. They also include operational elements such as agreement on defined purpose and objectives, communication practices, internal reporting, recruitment, meeting organization, and decision making.

    Appendices 4 and 5 provide more detail on two ways of thinking about partnership processes and outcomes. Appendix 4 discusses work done by Paul Mattessich, PhD, and the Wilder Foundation to identify partnership success factors. Appendix 5 organizes evaluation planning by stage of development and three larger domains—capacity, operations, and expectations/outcomes. Use these appendices to help generate outcome evaluation questions.
     
  • Activities and outcomes of the partnership described in the logic model. These items might include progress toward achieving objectives, leveraged resources, policy or systems changes, and partnership growth. (The evaluation guide, “Developing and Using a Logic Model,” provides a good foundation for identifying evaluation questions from your logic model.)

Evaluation Questions on Activities and Outcomes of the Partnership

Referring to the partnership logic model will be most helpful in developing questions that evaluate the quantity and quality of the partnership’s activities and products (outputs) such as documents produced and distributed, events conducted, etc.

HDSP program partnership outcomes will generally focus on changes in

  • Relationships.
  • Leveraged resources.
  • Policy development and implementation.
  • Systems and the environment.
  • Health status as a longer-term outcome or impact.

Long-term outcomes or impacts can be very complex and are often affected by multiple factors, making them hard to measure and hard to link to partnership activities. Therefore, consider documenting your partnership’s contributions to health outcomes, rather than trying to attribute change to your partnership’s activities. By focusing on short and intermediate outcomes that are linked by sound theory to distal outcomes, you can document your progress toward those longer-term outcomes.

Prioritize Evaluation Questions

After you have developed a list of evaluation questions, including questions that focus on how to improve the partnership, rank them based on

  • The questions most important to you and your key stakeholders (the “must answer” questions).
  • Questions that provide results that you can use (e.g., for improvement).
  • Questions you can answer fully with available data.
  • Questions within your resources to answer.

Stakeholders are invaluable in prioritizing questions. Information that your stakeholders need should be a priority. Having stakeholders participate in the selection of questions increases the likelihood of their securing evaluation resources, providing access to data, and using the results.

Evaluation Design

For many HDSP partnership evaluations, either a pre-post or case study design will provide sufficient information for program improvement or accountability. Each design has strengths and weaknesses and requires a different level of resources.

A pre-post design uses baseline data to assess strengths, areas for improvement, and other indicators and compares those data to a measurement after improvement strategies are implemented. Data may be compared to benchmarks or expected performance.

For example:
A baseline assessment indicates that 25% of partners have a clear understanding of their roles and responsibilities within the partnership. Once partnership leadership recognizes this, they initiate several subcommittee meetings designed to clarify how the subcommittees interact with the larger partnership and the role of each subcommittee member. In addition, subcommittee members have the opportunity to become engaged in intervention activities. Twelve months later, this item is reassessed by leadership and they learn that 75% of partners have a clear understanding of their roles and responsibilities within the partnership. While there is still room for improvement, reviewing the membership roster indicates that the partnership has increased substantially in membership providing a reasonable explanation for the data.
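
The arithmetic behind a pre-post comparison like this one is simple to automate. Below is a minimal, hypothetical sketch in Python using counts chosen to match the example above (5 of 20 partners clear at baseline, 15 of 20 at follow-up); it is an illustration, not part of the guide’s method.

    def percent_clear(responses):
        # Percentage of partners reporting a clear understanding of their role.
        return 100 * sum(responses) / len(responses)

    # Hypothetical survey results: True = partner reports a clear understanding.
    baseline = [True] * 5 + [False] * 15    # 25% at baseline
    follow_up = [True] * 15 + [False] * 5   # 75% twelve months later

    change = percent_clear(follow_up) - percent_clear(baseline)
    print(f"Baseline: {percent_clear(baseline):.0f}%")
    print(f"Follow-up: {percent_clear(follow_up):.0f}%")
    print(f"Change: {change:+.0f} percentage points")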

A case study design is an in-depth description of the partnership based on data and observations. A case study provides the opportunity to fully describe the partnership’s work, either in total or in a specific area, as well as provide a historical perspective. A case study would describe the partnership’s current structure, operation, and context. It describes and reports the current status of indicators such as participation rates, representativeness of members, progress toward achieving objectives, influence of the partnership, how resources are leveraged, etc. It may include both quantitative and qualitative data that answer specific evaluation questions and identify barriers, gaps, and successes.

Consider the example of a regional partnership to improve and coordinate emergency services. The case study collects data on identified process and outcome measures such as participation, engagement, influence, and implementation of policy or system change facilitated by the regional partnership. In addition, a series of interviews are conducted with stakeholders to gather information on social and political context, how well the partnership operates, understanding of goals and objectives, barriers and facilitators, perceived individual gain, and so on. A case study report is developed that describes the partnership and its context, and themes and key elements of the interviews. The case study also reports baseline indicator data and trends over time.

No matter which evaluation design is used, a manageable number of indicators should be selected and monitored over time to ensure that processes of the partnership are functioning well and the partnership is continuing to accomplish its objectives. These might include (a sketch for tracking one such indicator follows this list)

  • Meeting participation rates.
  • Fulfillment of key roles and responsibilities.
  • Proportion of members actively engaged in workgroups or implementation of objectives.
  • Leveraged resources.
  • Influence of the partnership.
  • Completion of objectives or projects.
  • Contributions to policy or system change.
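
As a hedged illustration of monitoring one of these indicators over time, the sketch below tracks meeting participation rates and flags meetings that fall below a target. The attendance figures and the 60% target are hypothetical.

    # Hypothetical attendance records: meeting label -> partners attending.
    attendance = {
        "Meeting 1": 14,
        "Meeting 2": 11,
        "Meeting 3": 16,
    }
    total_partners = 20
    target_rate = 60  # hypothetical benchmark, in percent

    for meeting, present in attendance.items():
        rate = 100 * present / total_partners
        flag = " <- below target" if rate < target_rate else ""
        print(f"{meeting}: {rate:.0f}% participation{flag}")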

In general, partnership evaluation should

  • Be participatory. The evaluation should involve the stakeholders and partnership members in planning and implementation as much as is reasonable. Members can help pretest evaluation tools, provide guidance on how to best reach audiences, help collect data, “talk up” the evaluation, and so on. The more buy-in created among members, the more likely they are to value and use the findings.
     
  • Use a mixed-method approach when feasible, i.e., use a combination of quantitative (numbers such as percentages or proportions) and qualitative (thoughts, opinions, and ideas) data. Combining these two approaches provides the “numbers” to justify conclusions, supported by the richness and deeper understanding of “why” and “how.”

Gather Credible Evidence

The next step of the CDC Evaluation Framework is to gather credible evidence, in other words, to collect accurate and valid data to answer your evaluation questions.

To do so, you must identify

  • Indicators (what you will measure).
  • Data sources (where you will find the data).
  • Data collection methods (how you will collect the data).

There is a wide range of possible indicators, data sources, and data collection methods. It will be helpful to talk with colleagues about data sources and methods that have been successful.

For each evaluation question to be answered, identify at least one indicator. Indicators are the specific information, or data, needed to answer the evaluation question. Examples of indicators for partnerships include

  • Number of members.
  • Partner participation rate.
  • Proportion of partners engaged in activities.
  • State plan objectives completed.
  • Leveraged resources.
  • Advocacy activities.
  • Policies adopted or refined.

Numerous methods and sources can be used to collect data. Common methods for partnership evaluation include

  • Document reviews of meeting minutes and attendance.
  • Observation of partnership meetings and partner interactions.
  • Surveys of partners.
  • Interviews of key partners.
  • Meeting effectiveness assessments (Appendix 6) from workgroup or general meeting participants.
  • Focus groups with partners and other stakeholders.
  • Monitoring behavior, health care quality, and health status data.

Often, using a mixed methods approach (i.e., using both quantitative and qualitative methods) is the best approach to answering your evaluation questions, especially when evaluation questions are complex.

Example:
Suppose your evaluation question is: “Are partnership meetings productive? Why or why not?” The indicator for this question is meeting productivity. Before you can answer this question, you will have to decide what you mean by “productivity.” Does productivity mean the number of tasks accomplished during the meeting? Is it new information learned? Is it decisions made at the meeting?

To answer this question you could

  • Conduct a document review of the past 2 years of meeting minutes. From this review, you might determine that activities are not being completed at meetings.
     
  • Conduct a meeting-effectiveness survey at several meetings to determine members’ perceptions of the meetings.
     
  • Follow up with interviews of selected members to probe what productive means to them, what their expectations are for productivity, and how the meetings could be more productive.

Appendix 1 provides sample evaluation questions and related evaluation activities to collect information. This list can be used to start identifying evaluation questions or to begin brainstorming and prioritizing with stakeholders.

Justify Conclusions

Justifying conclusions includes analyzing the information you collect, interpreting what the data mean, and drawing conclusions based on the data. Before beginning an analysis, you will want to ensure that you have good data. This includes ensuring there are no errors in the entries. Also, you must decide how you will handle outlying and missing data. If you have a substantial amount of missing data, consult with an expert in methodology about what to do.

Data analysis includes the following steps (a small tabulation sketch follows this list)

  • Entering the data into a spreadsheet or data analysis program such as SPSS or Excel and checking for correct entries. If you have qualitative data, enter the responses into a qualitative data analysis software package or a word processing program.
     
  • Tabulating the data. Basic tabulations are probably all you will need for a partnership evaluation: calculations such as the number or percentage of members who answered a certain way. For qualitative data, the most common themes or thoughts should be identified.

    It may be meaningful for you to tabulate responses by member characteristics, such as government versus non-government members or members who attend regularly versus those who don’t.
     
  • Comparing data over time, to similar situations, to what you expect, or to what is reasonable. For example, you may find that participation rates for your partnership are x%. While you may have wanted higher rates, you find through talking with experts that x% is a reasonable participation rate for your type of partnership.
     
  • Presenting data in terms that are familiar and clear to members. Use graphs and charts whenever possible.
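
The tabulation step rarely needs more than counts and percentages. The following minimal sketch in Python tabulates a hypothetical survey item and counts recurring themes in open-ended comments; the data and category labels are invented for illustration.

    from collections import Counter

    # Hypothetical responses to one survey item and to an open-ended question.
    survey = ["agree", "agree", "disagree", "agree", "neutral"]
    comments = ["more time for discussion", "clear agenda", "clear agenda"]

    n = len(survey)
    for answer, count in Counter(survey).most_common():
        print(f"{answer}: {count}/{n} ({100 * count / n:.0f}%)")

    # The most common themes in the qualitative responses.
    print("Top themes:", Counter(comments).most_common(2))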

Interpreting data is giving meaning to the numbers or responses, or putting those numbers into a context that has meaning to those who will use them. You may compare your results to those of other activities that are similar, or you may interpret your results in light of your particular situation or your intended goals. Contextual factors, such as members’ obligations to competing partnerships, will likely affect involvement in the partnership. When interpreting data, be sure to describe any limitations inherent in the data, such as response rates or biases.

Review evaluation findings with your stakeholders to ensure that your conclusions make sense for the partnership. This involvement of others will help ensure that your findings are valid and will also increase the use of those findings.


Ensure Use and Share Lessons Learned

The intended use of evaluation results should be determined during evaluation planning and considered throughout the evaluation process. Using the results of your evaluation will help correct identified weaknesses, help the partnership grow and improve, and justify the resources expended, supporting future resource needs. To improve the likelihood of the evaluation findings being used

  • Share information regularly with partnership leaders and coordinators during the course of the evaluation. Providing periodic feedback will help ensure that your evaluation is on track and will limit the chances of your stakeholders being surprised.
     
  • Incorporate findings into an improvement plan.
     
  • Keep stakeholders involved so they are better prepared to share lessons learned.
     
  • Tailor the information and method used to share findings to the specific audience. Use multiple ways to share findings.
     
  • Present information in a timely manner.
     
  • Avoid jargon; present data in a clear and understandable way.

Evaluation results can be shared through a written report, an oral presentation, or even through a media event, whichever is appropriate for the partners or funders to whom you owe accountability.

An evaluation report should include

  • An executive summary.
  • A description of the evaluation purpose.
  • Methods used for the evaluation, including the design of the evaluation and the data collection methods.
  • Key findings, using a mix of tables, graphs, charts, quoted remarks, and stories.
  • Discussion, limitations of the evaluation, and recommendations for action.

Recommendations for improving the partnership should be shared with the leadership and management staff of the partnership. Such communication can be accomplished through an oral presentation or informal discussion. Findings can be incorporated into an improvement plan and shared with the rest of the partners in that same format. While the evaluation may tell you what needs to be improved, further inquiry may be necessary to determine how to improve those aspects of your program.

What do you do if the results of your partnership evaluation are unfavorable? What if the results shed a negative light on a member? In these circumstances, it is important to be sensitive and positive in presenting data. Negative findings on processes, such as communication, can be presented as opportunities for improvement and can provide an impetus for developing an improvement plan. When presenting negative results of an evaluation, it is important that the contextual factors (political climate, budgetary realities, competing priorities, etc.) be included so that mitigating circumstances are understood. Findings that reflect negatively on one partner can be presented in general terms publicly and discussed in detail privately with that partner. In a report, findings can be presented without using names, using instead such statements as “in general” or “in one case.”

Example: Evaluation Plan for Evaluating Your Partnership

The following is an example of a partnership evaluation plan that applies the principles and concepts described in the previous sections.

Activity: By January 15, 20__, evaluate the processes and short-term outcomes of the State HDSP Partnership. Use the results to develop a performance improvement plan.
Stakeholders: State health department leadership, HDSP program manager, HDSP partnership coordinator, partnership leadership, AHA liaison

Evaluation Question 1: Are there at least 10 diverse partners representing priority areas and settings?
  Indicators/Measures: Annual assessment of number of partners by setting.
  Data Sources: Partnership roster; annual partnership assessment.
  Data Collection: Review partnership roster.
  Time Frame: Annually in July.
  Data Analysis: Stratify list by setting, area, and population represented; tabulate by setting; identify gaps.
  Communicate Results: Orally report gaps to partnership membership committee; include in annual partnership report.
  Lead: Partnership coordinator.

Evaluation Question 2: Do partners actively participate in meetings and partnership activities?
  Indicators/Measures: Meeting participation rates overall and by partner type; number of state plan or state work plan activities to which partners are contributing; number of partners that present at partnership meetings.
  Data Sources: Partnership meeting minutes; annual partnership assessment.
  Data Collection: Document review; collate partner participation rates for each meeting over the previous 12 months; identify number and type of activities assigned to partners at each meeting; identify number of presentations or topic discussions hosted by partners.
  Time Frame: Every 6 months (for the previous 6 months), beginning in January.
  Data Analysis: Calculate percentage of partners that participate at each meeting; graph trend over time.
  Communicate Results: Report to partnership leadership; include in annual partnership report.
  Lead: Partnership coordinator.

Evaluation Question 3: Are partnership meetings productive, focused, and effective?
  Indicators/Measures: Meeting productivity.
  Data Sources: Meeting-effectiveness survey results.
  Data Collection: Conduct meeting survey after each meeting, including workgroup meetings; revise tool as appropriate.
  Time Frame: Continuously.
  Data Analysis: Calculate response rate; calculate percentage of respondents who agree with each item.
  Communicate Results: Orally report to meeting planners; include in annual partnership report.
  Lead: Partnership coordinator.

Evaluation Question 4: Is the partnership operating successfully? If not, where are the weaknesses?
  Indicators/Measures: Number of partnership success factors scored above 4 in the Wilder Inventory.
  Data Sources: Wilder Foundation Inventory results from state partnership members.
  Data Collection: Conduct baseline survey with annual follow-up; track improvement annually.
  Time Frame: Annually in January.
  Data Analysis: Using methods described in the Wilder guide, identify areas of strength and areas of weakness.
  Communicate Results: Include in annual partnership report.
  Lead: Local university.

Evaluation Question 5: Is the partnership influencing policies, practices, or systems? If not, where are the barriers?
  Indicators/Measures: Changes made through partnership intervention; number of new legislative policies for heart disease and stroke.
  Data Sources: Partners; state plan progress reports.
  Data Collection: Conduct focus groups after the annual meeting to collect partner success stories; review progress on the HDSP state plan to identify policy, practice, and system changes.
  Time Frame: At the end of year 3.
  Data Analysis: Qualitative analysis for themes and barriers; track number and reach of changes made by priority area.
  Communicate Results: Include in annual partnership report; publish success stories on partnership web page; press report.
  Lead: Local university.

Increase the Success of Your Evaluation

You can take several steps to increase the success of your partnership evaluation:

  • Establish an evaluation plan during your partnership planning.
  • Start small. Be creative and flexible.
  • Engage partners and staff in the evaluation process.
  • Allow staff time and allocate resources for evaluation.
  • Match evaluation methods to evaluation questions.
  • Use and adapt existing tools.
  • Report results clearly and often.
  • Be sensitive to partners' time and needs.

There are many partnership and collaboration assessment tools available on the Internet and in manuals. Although you can find good ideas for questions or the phrasing of questions in these materials (and you really should consult them), the content of your instrument needs to be specific to your partnership evaluation. If you do choose to use an off-the-shelf assessment, pretest it with a small group of partners to be sure it is understandable and gathers the information you expect. If it does not, perhaps it can be customized to address your specific partnership. Following are some partnership evaluation tools you may want to review

  • The Wilder Foundation’s Collaboration Factors Inventory is a 40-item survey that solicits level of agreement with a series of statements. A limited number of participants may be selected by the partnership or state HDSP program to complete the inventory. State HDSP programs may choose to have all members, workgroup leaders, or just key partners complete the inventory. The inventory includes instructions for scoring and interpreting the results. HDSP programs have permission from the author to use this assessment to evaluate their partnership. Copies of the Wilder Foundation assessment can be obtained from the HDSP Project officers. (Be sure to credit the Wilder Foundation if you use the tool.)
     
  • A sample partnership satisfaction survey is provided in “Evaluation Concepts” (pages 34-39), published in 2000 by the Division for Heart Disease and Stroke Prevention. Copies of the survey are available by request from the Evaluation Team or the CDC HDSP project officers.
     
  • A sample meeting-effectiveness survey is provided in this guide as Appendix 6.
     
  • Partnership Self-Assessment Tool. This tool gives a partnership another way to assess how well its collaborative process is working and to identify specific areas on which its partners can focus to make the process work better. The tool is provided at no cost by the Center for the Advancement of Collaborative Strategies in Health at The New York Academy of Medicine, with funding from the W. K. Kellogg Foundation. The website includes a “Coordinator’s Guide,” “Instructions for Using the Tool,” and the questionnaire. Instructions explain how to analyze the information collected. The tool can also be used to track partnership progress over time. The tool can be accessed at http://www.partnershiptool.net.
     
  • A Coalition Effectiveness Inventory provided by Fran Butterfoss at the 2006 HDSP Program Management and Evaluation Training is provided as Appendix 7. The tool is used by partners to rate the partnership on a number of process and outcome indicators.
     
  • Social Network Analysis is a newer, more complex theory and set of tools for studying social networks. It maps and measures relationships and communication between people, groups, and organizations. Links show the strength of relationships or communication between people or organizations. Through use of special software, it provides both a visual and a mathematical analysis of human relationships. Many software applications are available, such as UCINET 6 and libSNA, as well as commercial analysis software. Search the Internet for “social network analysis software” for a wide range of resources. (A brief illustrative sketch follows this list.)
     
  • A collection of partnership assessment tools is provided at http://www.coalitioninstitute.org/Evaluation-Research/Coalition_Assessment_Tools.htm.
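
To make the idea concrete, here is a minimal, hypothetical sketch of one network measure using the open-source Python package networkx (our choice for illustration; the guide itself names UCINET 6 and libSNA). The partner names and links are invented.

    import networkx as nx

    G = nx.Graph()
    # Each edge records a reported working relationship between two partners.
    G.add_edges_from([
        ("Health Dept", "AHA"),
        ("Health Dept", "EMS Assoc"),
        ("Health Dept", "University"),
        ("AHA", "EMS Assoc"),
    ])

    # Degree centrality: the share of other partners each one is linked to.
    centrality = nx.degree_centrality(G)
    for partner, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{partner}: {score:.2f}")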

To read more about evaluating partnerships, consult the following resources

  • Mattessich PW, Murray-Close M, Monsey BR. Collaboration: What Makes It Work. 2nd edition. St. Paul, MN: Amherst H. Wilder Foundation; 2004. This is an up-to-date and in-depth review of collaboration research. The edition also includes The Collaboration Factors Inventory.
     
  • Butterfoss FD. Coalitions and Partnerships in Community Health. San Francisco, CA: Jossey-Bass; 2007.
     
  • Evaluating Collaboratives, University of Wisconsin Cooperative Extension. Available at: http://learningstore.uwex.edu/Evaluating-Collaboratives-Reaching-the-Potential-P1032C238.aspx. The site also includes an organizational assessment tool at http://www.uwex.edu/ces/pdande/evaluation/evalinstruments.html.
     
  • Gajda R. Utilizing collaborative theory to evaluate strategic alliances. American Journal of Evaluation. 2004;25(1):65–77. This article provides a framework for assessing the level of collaboration of a partnership, a theory and process to evaluate the level of collaboration over time, and assessment tools.

To learn more about surveys, interviewing, and focus groups, consult

To learn more about Social Network Analysis, consult

  • Introduction to social network methods. This web page, which is part of an on-line text by Robert A. Hanneman (University of California, Riverside) and Mark Riddle (University of Northern Colorado), is available at http://faculty.ucr.edu/~hanneman/nettext/C1_Social_Network_Data.html.
     
  • Social Network Analysis, A Brief Introduction. Available at http://www.orgnet.com/sna.html. This site has a simple description of social network analysis and sells software and consulting services. (Commercial products are not endorsed by DHDSP.)
     
  • Luke D. Using network analysis to evaluate tobacco control programs. Presented at the 2005 APHA Meeting. The PowerPoint presentation is available at http://ctpr.wustl.edu/.

Software for Qualitative Analysis

Appendices

Appendix 1: Sample Evaluation Questions and Methods

The following is a chart of sample evaluation questions and suggested activities for answering those questions. Keep in mind that these are just examples. Each state’s HDSP program partnership evaluation questions and activities will depend on the partnership stage of development, stakeholder input, specific needs, and available resources. This list can be used as a starting point to strategize and form a basis for a final list.

Questions are divided into three sections—basic assessment, basic evaluation, and enhanced evaluation—that correspond roughly to the annual assessment of the partnership, the process evaluation, and the outcome evaluation. Evaluation at a particular level should include some elements of the previous levels, just as a good outcome evaluation includes a thorough process evaluation.

Partnership Evaluation Questions & Activities

Annual Assessment
Evaluation Questions
  • Are there at least 10 diverse partners representing priority areas and settings?
     
  • Do partners actively participate in meetings? In planning and implementation of the state plan? In the HDSP work plan?
     
  • Is there adequate HDSP program staff support for the partnership?
     
  • What training do partners need to actively and productively participate in partnership activities?
Evaluation Activities

  • List the number of partners, the sector each represents, and how the partner participates with the state HDSP program.
     
  • Track the number of partners that sign a Memorandum of Understanding or Agreement. Track follow-through on commitments.
     
  • Maintain meeting minutes or the Memorandum of Understanding to document the partnerships, activities, and delineation of tasks.
     
  • Evidence may also include lists of work group members, products of partnership, documents that demonstrate collaboration on cardiovascular health activities, and program activities with partners.
     
  • Log critical events. Critical events may be changes in resources, events facilitated by the partnership, events in support of partnership activity, or events that are barriers to partnership goals.
     
  • Debrief after partnership meetings for positive aspects and areas for improvement. Identify resources needed for improvement.
     
  • Conduct periodic training needs assessments.

 

Partnership Evaluation Questions & Activities

Basic Evaluation
Evaluation Questions
  • Is there adequate representation from stakeholder organizations, priority areas, and priority population(s)? Is there a method for identifying membership gaps?
     
  • Are partnership meetings successful, i.e., productive, focused, and effective?
     
  • Is the partnership operating successfully?

    • How well have goals for the partnership been defined and communicated? Are roles and responsibilities of leaders and members clear?
       
    • Are partners knowledgeable of group process and HDSP priorities?
       
    • Is communication efficient and timely?
       
    • Do workgroups function well?
       
    • Is the partnership mutually beneficial to partners? How could partners’ needs and priorities be better met?
       
  • What proportion of partnership activities are focused on priority strategies?
     
  • Are the partnership members satisfied with the functioning, progress and leadership of the partnership?
     
  • Is the partnership on track to accomplish goals and objectives?
     
  • Is training provided to partners beneficial?
Evaluation Activities

  • Review processes for recruiting and placing members in the partnership.
     
  • Conduct participant evaluations after meetings to assess meeting processes, participation, expectations, leadership, etc.
     
  • Track measures such as the number of meetings and number of organizations representing priority populations that participate.
     
  • Conduct individual interviews to determine members’ awareness of and commitment to goals, roles, and communication processes, and recognition of how their participation fits into the larger plan.
     
  • Interview workgroup leaders or assemble a focus group of active workgroup participants to solicit feedback on workgroup effectiveness and methods to improve.
     
  • Review workgroup minutes and progress.
     
  • Conduct quarterly reviews of accomplishments.
     
  • Review meeting minutes for actions and decisions.
     
  • Maintain and review activity progress logs.
     
  • Track and monitor activity on state plan objectives.
     
  • Conduct a satisfaction/needs assessment of partnership members. Assessment could be completed by written surveys, focus groups, or interviews.
     
  • Identify a partnership success for story development.
     
  • Assess training benefits received by partners.
     
  • Conduct post-training follow-up at 3 months to determine if partners used training in their organization.

 

Partnership Evaluation Questions & Activities

Enhanced Evaluation
Evaluation Questions

  • Is the partnership successful in accomplishing its goals? Is the partnership making a difference? If not, why not?
     
  • Is the partnership influencing policies, practices, or systems?
     
  • What unintended outcomes are occurring?
     
  • Which external factors affect partnership work?
     
  • Which strategies are effective (have achieved identified performance measures)?
     
  • Is membership sustained over time? What are the reasons members leave the partnership? What are the reasons that members stay?
     
  • Who are the influencers in the partnership? Where are the strong communication links? Where are relationships strongest and weakest? (social network analysis)
     
  • What is the level of collaboration (integration) of the partnership? What is the ideal level of collaboration? What steps should be taken to achieve the ideal? (See “Utilizing collaborative theory to evaluate strategic alliances,” Gajda, referenced on page 18.)

Evaluation Activities

  • Interview community key informants to identify impacts, barriers, and unintended outcomes.
     
  • Conduct an assessment of the impact of the partnership. Consider accomplishments, policy and system changes enacted, indicators, effect on health status, etc.
     
  • Ask partners to submit “success stories” written from their perspective.
     
  • Document partnership activities, and conduct pre- and post-activity assessments of state-level policies.
     
  • Document partnership activities, and conduct pre- and post-activity assessments of system and environmental enhancements in priority settings related to priority areas.
     
  • Conduct phone interviews with nonparticipating members and dropouts to determine their reasons for leaving. If they are essential partners, solicit feedback on how their involvement could be revived in a way that benefits both parties. Assess awareness of partner goals and initiatives among key decision makers.
     
  • Use social network analysis techniques.
     
  • Use the collaboration rubric, theory, and process proposed by Rebecca Gajda (see reference, page 18).
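
For the social network analysis questions listed above, even a small network summary can point to influencers and weak communication links. The following is a minimal sketch in Python using the open-source NetworkX library; the partner names and communication links are illustrative assumptions, and real data would come from interviews or surveys about who communicates with whom.

    import networkx as nx  # open-source network analysis library

    # Each edge records a reported communication link between two partners.
    G = nx.Graph()
    G.add_edges_from([
        ("HDSP program", "Hospital Association"),
        ("HDSP program", "American Heart Association"),
        ("Hospital Association", "American Heart Association"),
        ("American Heart Association", "Worksite coalition"),
    ])

    # Degree centrality flags likely influencers; low scores flag
    # weakly connected partners.
    centrality = nx.degree_centrality(G)
    for partner, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{partner}: {score:.2f}")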

Appendix 2: Partnership Evaluation Plan Template

Why are you evaluating the partnership?

__________________________________________________________________

Who will use the results?

__________________________________________________________________

Who are the key stakeholders?

__________________________________________________________________

How can you engage your stakeholders?

__________________________________________________________________

At what stage of development is the partnership? What contextual factors affect the work of the partnership?

__________________________________________________________________

What do you expect the partnership to accomplish?

__________________________________________________________________

What resources do you need to conduct your evaluation?

__________________________________________________________________

What resources do you have to conduct your evaluation?

__________________________________________________________________

 

Partnership Evaluation Plan

Evaluation Questions | Indicators | Data Source | Data Collection Time Frame | Data Analysis | Report Results
             
             
             
             

Appendix 3: Partnership Membership Assessment Tool

An annual assessment of the membership and roles of Heart Disease and Stroke Prevention (HDSP) partnership(s) can keep the partnership group focused and ensure that the partnership has the skills and expertise needed to accomplish planned tasks. States may have multiple partnerships for different purposes that can be combined in the assessment process. This strategy works both for planning new partnerships and for assessing existing ones.

An annual partnership assessment should include the following three steps:

Step 1. Identify the roles or functions, skills, areas of expertise, and representation needed for a successful partnership.

Step 2. Review the partnership membership, the roles members and staff fill, and the skills and expertise members bring to the partnership.

Step 3. Compare the “wanted” attributes with the attributes the partnership has.

As you begin to assess the membership or composition of the partnership, the following key questions must be answered first:

  • What is the purpose of the partnership (e.g., state plan development and implementation, advisory group for a specific task or objective)?
    ______________________________________________________________________
     
  • What does success look like for the partnership? Are there specific activities or objectives for the partnership?
    ______________________________________________________________________
     
  • What roles do members need to fulfill? What resources or skills do they need to provide to ensure the success of your partnership? Table A lists roles, skills/expertise, and state-level groups that could be represented on the partnership. Use this list as a starting point, and review and customize it as needed.
    ______________________________________________________________________
     
  • What organizations, agencies, and leaders need to be represented to ensure success? What assets are needed?
    ______________________________________________________________________

Step 1

After you have considered the key questions, use the lists in Table A to brainstorm membership needs with your state program members, key stakeholders, and partnership leadership. The needs of the partnership will vary depending on its scope and tasks. Add these needs to the lists in Table A as they are identified.

  • In Table A, check the “Want” column in the “Roles,” “Skills/Expertise,” and “Representation” sections for each attribute that corresponds to your brainstormed list.
     
  • Once you have expanded the list, it might be helpful to narrow the list to those most relevant to the success of your partnership. This step will help you prioritize your efforts as you work to recruit new members or further develop or restructure an existing partnership.

Step 2

Table B is a tool to help you inventory existing partnership members or those being considered for membership.

  • In column 1, list the individuals or groups that are HDSP partners. In column 2, write the name of the partnership or the intervention on which the partner participates.
     
  • For each partner, identify the specific role or task the partner has in the partnership arrangement and/or the expertise or skill the partner brings to the group or the organization represented. Partners may have multiple roles and multiple skills, as well as represent an organization.
     
  • Identify the specific contribution the partner brings to the partnership or the specific tasks the partner will accomplish. This may be based on how the partner contributes to the state plan or the state work plan, or on a specific function of the partnership. For new partnerships, these will be expected contributions; for existing partnerships, they will be based on actual contributions.

    This process will identify partners that are carrying much of the workload and help HDSP programs to engage members not actively involved in the partnership.
     
  • Go back to Table A. For each partner in Table B, check off each of the roles, skills, expertise, and groups represented in the “Have” column of the “Roles”, “Skills/Expertise” and “Representation” sections. Add new elements to the list as needed.

Although Table B is for existing partnerships, it also could be used as an ongoing partnership inventory as you develop a partnership, planning group, or committee.

Step 3

Compare the roles, skills/expertise, and representation desired on the partnership to those provided by partnership members. If your partnership is new, use Table A to identify partnership roles, member skills and expertise, and represented groups needed for success. With existing partnerships, use Table A to compare what the partnership needs with what it has.

For example, compare columns 1 and 2 to assess partnership roles. The partnership has the “needed” role in the rows where both columns are checked. The partnership does not have the “needed” role in the rows where column 1 is checked, but column 2 is not. These rows identify gaps that need to be filled in future recruitment efforts.
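
If the Want/Have checklist is maintained electronically, this comparison can be automated with simple set operations. The following is a minimal sketch in Python; the role names come from Table A, but the contents of the two sets are illustrative assumptions.

    # Roles checked in Table A; both sets are illustrative examples.
    wanted = {"Leader", "Meeting facilitator", "Funder", "Spokesperson"}
    have = {"Leader", "Meeting facilitator"}

    gaps = wanted - have    # "Want" checked but "Have" not: recruit for these
    extras = have - wanted  # present but not identified as needed
    print("Recruitment gaps:", sorted(gaps))
    print("Unplanned assets:", sorted(extras))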

Partnership Membership Assessment Tool

Table A.  Partner Roles, Skills/Expertise and Representation Checklist*

Want | Have | Roles                             | Want | Have | Skills/Expertise                | Want | Have | Representation
     |      | Partnership roles:                |      |      | Data analysis, worksites        |      |      | State Emergency Services
     |      |   Leader                          |      |      | Data analysis, healthcare       |      |      | State Obesity Program
     |      |   Committee leader                |      |      | Reviewer, medical content       |      |      | State Diabetes Program
     |      |   Task leader                     |      |      | Writer                          |      |      | State Tobacco Program
     |      |   Meeting planner                 |      |      | Advocate for stroke             |      |      | State Epidemiology
     |      |   Meeting facilitator             |      |      | Advocate for heart disease      |      |      | State Office of Minority Health
     |      |   Strategic planner               |      |      | Legislative advocate            |      |      | Hospital Association
     |      | Communications                    |      |      | Medical expert                  |      |      | Primary Care Association
     |      | Training                          |      |      | Cardiologist                    |      |      | State legislature/policy makers
     |      | Financial support                 |      |      | Neurologist                     |      |      | Schools (as worksites)
     |      | Content reviewer                  |      |      | Healthcare quality improvement  |      |      | Community health clinics
     |      | Budget management                 |      |      | Nursing                         |      |      | Private insurers
     |      | Spokesperson                      |      |      | Pharmacy                        |      |      | Medicaid/Medicare
     |      | Funder                            |      |      | Media communications            |      |      | Prevention Research Center
     |      | Champion, healthcare              |      |      | Workplace wellness              |      |      | Chambers of commerce
     |      | Champion, public health           |      |      | State policy change             |      |      | Unions
     |      | Champion, worksites               |      |      | Community policy change         |      |      | Business coalition on health
     |      | Strategy implementer              |      |      | Training for healthcare         |      |      | State American Heart Association
     |      | Resource linker (connection to groups with influence or resources) |      |      | Evaluation |      |      | Disparate groups (race/ethnicity, geographic, gender, SES, etc.)
     |      |                                   |      |      | Marketing                       |      |      |
     |      |                                   |      |      |                                 |      |      |
     |      |                                   |      |      |                                 |      |      |
     |      |                                   |      |      |                                 |      |      |

* Items are examples, not a required or complete list.  You can add your own in the blank cells.

Table B.  Partners, Roles, Skills, Expertise and Activities*

Partner Name (Name, Title, Organization) | Partnerships (purpose, title, or intervention project) | Role, skill, expertise | Actual or Planned Tasks/Contributions
Example: American Heart Association, Health Alliance staff, Mary Smith | State Coalition (Develop State Plan) | Chairman, represents state-level advocacy group | Prepares agenda and facilitates meetings; provides meeting space
State Hospital Association | State Coalition (Develop State Plan); registry intervention | Membership committee chair; project manager | Attends meetings; manages budget, collects data, prepares reports
 | | | 
 | | | 
 | | | 
 | | | 

* Table adapted with permission from Crump C, Emery J.  Competency-based curricula to shape health promotion policy. Prepared for the Directors of Health Promotion and Education and presented at: Centers for Disease Control and Prevention; February 27, 2008; Atlanta, GA.

Appendix 4: Processes of Partnership Operation

Paul Mattessich, Ph.D., and the Wilder Foundation identified 20 collaboration success factors based on a synthesis of research evidence about partnership and collaboration. The success factors apply to partnerships formed by non-profit and government agencies. The 20 factors focus on processes of partnership operation and fall into six categories. The publication entitled Collaboration: What Makes It Work provides details on each of the factors and describes measures of success for each.

Identifying weaknesses in these key areas through evaluation activities and addressing them should lead to a more effective partnership and improved collaborative activities. Focus on the areas that are most relevant to your particular partnership. To get a general sense of areas of weakness, you can use the partnership inventory developed by the Wilder Foundation to assess these areas; the instrument also provides a scoring methodology. See the “Tools” section (page 16) or go to http://surveys.wilder.org/public_cfi/index.php.*

The six categories and 20 success factors identified through the Wilder Foundation review are:
  • Environment
    Favorable social and political climates, positive history of collaboration, perceived leadership.
     
  • Membership characteristics
    Right partners, mutual respect, understanding and trust, self-interest met, and ability to compromise.
     
  • Process and structure
    Clear roles and responsibilities, clear method of decision making, flexible and adaptable, invested interest, multiple layers of participation, and comfortable pace of development.
     
  • Communication
    Multiple methods, open and frequent, and informal and formal communication.
     
  • Purpose
    Clear and attainable goals and objectives, shared vision and purpose, and unique purpose.
     
  • Resources
    Capable leadership and sufficient staff, materials, funds, influence, and time.

Appendix 5: Evaluation Content by Stage of Development

Table C lists evaluation question topics (inside the table cells) sorted by partnership stage of development (rows) and three common evaluation domains (columns): capacity, operations, and expectations/outcomes. To use the table, first identify your partnership’s stage of development. Evaluation questions can be developed around any of the content areas in that row or in the row(s) directly above it. You may choose to focus on one of the evaluation domains, such as operations, or on all domains. Keep in mind that as you look at expectations and outcomes, evaluating the processes necessary to support the outcomes is important for explaining your results. The table does not contain a comprehensive list of topics, but it can help you start focusing evaluation questions appropriate to your partnership’s stage of development. You can use this guide to narrow a list of evaluation questions or to begin generating one. You will probably identify additional areas for evaluation that are unique to your partnership.

Table C.  Evaluation Content, by Domain and Stage of Development

Formation: Assessment
  Capacity/Abilities: Environment; Resources
  Operations: Purpose (defined vision and mission)
  Expectations/Outcomes: Identified need

Formation: Partner Selection
  Capacity/Abilities: Member characteristics (skills and expertise); and capacities listed above
  Operations: Recruitment strategy (interview protocols, member orientation, identified expectations); and operations listed above
  Expectations/Outcomes: Sphere of influence or reach

Building
  Capacity/Abilities: Resources; Training; and capacities listed above
  Operations: Processes and structures in place and functioning (communication, defined work, etc.); Plans for operation; and operations listed above
  Expectations/Outcomes: Engaged partners; Committed partners; Change in relationships; and expectations listed above

Maintenance
  Capacity/Abilities: Changing needs for training and staffing; Member contributions/participation; Sustainable resources; and all capacities listed above
  Operations: Information feedback loop; Accountability and reporting; and operations listed above
  Expectations/Outcomes: Policy and systems change; Expansion or spread; Member longevity; Outreach efforts; Progress in achieving goals; Sustainability; and expectations listed above

Appendix 6: Sample Meeting Effectiveness Survey

Please indicate your level of agreement with the following statements about today's meeting:

  Strongly Disagree | Disagree | Agree | Strongly Agree
The goals of the meeting were clear to me.        
My level of participation was comfortable for me.        
Most attendees participated in meeting discussion.        
Leadership during the meeting provided clear direction.        
Meeting participants worked well together.        
Discussion at the meeting was productive.        
The meeting was well organized.        
The meeting was a productive use of my time.        
The presentation by _________ enhanced my ability to participate in the meeting.        
Decisions were made by only a few people.        
Decisions were made in accordance with the established rules.        
The meeting objectives were met.        

Comments:

____________________________________________________________________
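
If responses are entered electronically, a quick tally by statement can feed the post-meeting debrief. The following is a minimal sketch in Python; the statements come from the survey above, and the response counts are illustrative assumptions, not real data.

    # Responses per statement; the counts are made-up illustrations.
    responses = {
        "The goals of the meeting were clear to me.":
            {"Strongly Disagree": 0, "Disagree": 2, "Agree": 7, "Strongly Agree": 4},
        "The meeting was well organized.":
            {"Strongly Disagree": 1, "Disagree": 1, "Agree": 8, "Strongly Agree": 3},
    }

    # Report the share of respondents who agreed with each statement.
    for statement, counts in responses.items():
        total = sum(counts.values())
        agree = counts["Agree"] + counts["Strongly Agree"]
        print(f"{agree / total:.0%} agreement - {statement}")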

Appendix 7: Coalition Effectiveness Inventory

The following Coalition Effectiveness Inventory lists partnership characteristics that members can use to assess the functioning of the partnership or coalition.

To use the inventory, partners should independently answer the questions and score their responses. Scores can be summarized by section and across partners to develop an improvement plan.
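
For programs that compile ratings electronically, a short script can produce those summaries. The following is a minimal sketch in Python; the section names follow the inventory, while the raters and their 0-2 item scores are illustrative assumptions (items rated N/A are simply omitted from a rater's list).

    from statistics import mean

    # Scores (0-2) per rater and section; N/A items are omitted.
    ratings = {
        "rater_1": {"Lead Agency": [2, 1, 2, 2, 1], "Staff": [1, 1, 2, 0, 2, 2]},
        "rater_2": {"Lead Agency": [1, 1, 2, 1, 1], "Staff": [2, 1, 1, 1, 2, 2]},
    }

    # Pool the scores for each section across all raters.
    sections = {s for r in ratings.values() for s in r}
    for section in sorted(sections):
        scores = [x for r in ratings.values() for x in r.get(section, [])]
        print(f"{section}: mean {mean(scores):.2f} of 2 "
              f"({len(scores)} rated items)")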

The Coalition Effectiveness Inventory (CEI)

Based on your experience, please complete the following inventory as a self-assessment tool to evaluate the strengths of your coalition and its stage of development. Using the assessment scheme on the instrument, place a check in the box that best corresponds to your rating of the particular characteristic. Based on your coalition’s stage of development, you might not be able to rate each characteristic.

Following the inventory, you can summarize strengths and opportunities for improvement.

Coalition Effectiveness Inventory (CEI)
Self-Assessment Tool

Name of Coalition: ____________________
Name of Rater: ____________________
Date of Assessment: ____________________
Score: ____________________

 

Assessment Scheme: Check one choice for each characteristic
0 Characteristic is absent
1 Characteristic is present but limited
2 Characteristic is present
N/A Characteristic not applicable at this stage of coalition

 

COALITION CHARACTERISTICS

Assessment: 0 | 1 | 2 | N/A | Score (0-2)
I. Coalition Participants

Lead Agency

         
1. Decision-makers are committed to and supportive of coalition          
2. Commits personnel and financial resources to coalition          
3. Knowledgeable about coalitions          
4. Experienced in collaboration          
5. Replaces agency representatives if vacancy occurs          

Staff

         
1. Knowledgeable about coalition-building process          
2. Skillful in writing proposals and obtaining funding/resources          
3. Trains members as appropriate          
4. Competent in needs assessment and research          
5. Encourages collaboration and negotiation          
6. Communicates effectively with members          

Leaders

         
1. Committed to coalition's mission          
2. Provide leadership and guidance in maintaining coalition          
3. Have appropriate time to devote to coalition          
4. Plan effectively and efficiently          
5. Knowledgeable about content area          
6. Flexible in accepting different viewpoints          
7. Demonstrate sense of humor          
8. Promote equity and collaboration among members          
9. Adept in organizational and communication skills          
10. Work within influential political and community networks          
11. Competent in negotiating, solving problems and resolving conflicts          
12. Attentive to individual member concerns          
13. Effective in managing meetings          
14. Adept in garnering resources          
15. Value members' input          
16. Recognize members for their contributions          

 


Members

         
1. Share coalition's mission          
2. Offer variety of resources and skills          
3. Clearly understand their roles          
4. Actively plan, implement, and evaluate activities          
5. Assume lead responsibility for tasks          
6. Share workload          
7. Regularly participate in meetings and activities          
8. Communicate well with each other          
9. Feel a sense of accomplishment          
10. Seek out training opportunities          
II. COALITION STRUCTURES          
1. Bylaws/rules of operation          
2. Mission statement in writing          
3. Goals and objectives in writing          
4. Provides for regular, structured meetings          
5. Establishes effective communication mechanisms          
6. Organizational chart          
7. Written job descriptions          
8. Core planning group (e.g., steering committee)          
9. Subcommittees          
III. COALITION PROCESSES          
1. Has mechanism to make decisions, e.g., voting          
2. Has mechanism to solve problems and resolve conflicts          
3. Allocates resources fairly          
4. Employs process and impact evaluation methods          
5. Conducts annual action planning session          
6. Assures that members complete assignments in a timely manner          
7. Orients new members          
8. Regularly trains new and old members          

 


Formation

         
1. Permanent staff designated          
2. Broad-based membership includes community leaders, professionals, and grass-roots organizers representing target population          
3. Designated office and meeting space          
4. Coalition structures in place          

Implementation

         
1. Coalition processes in place          
2. Needs assessment conducted          
3. Strategic plan for implementation developed          
4. Strategies implemented as planned          

Maintenance

         
1. Strategies revised as necessary          
2. Financial and material resources secured          
3. Coalition broadly recognized as authority on issues it addresses          
4. Number of members maintained or increased          
5. Membership benefits outweigh costs          
6. Coalition accessible to community          
7. Accomplishments shared with members and community          
Institutionalization          
1. Coalition included in other collaborative efforts          
2. Sphere of influence includes state and private agencies and governing bodies          
3. Coalition has access to power within legislative and executive branches of agencies/government          
4. Activities incorporated within other agencies or institutions          
5. Long-term funding obtained          
6. Mission is refined to encompass other issues and populations          

 

Take Home Lessons from the CEI

What stage is your coalition in now?

 

 

In what areas does your coalition excel (i.e., in which major categories did your coalition receive scores of "2")?

1.

2.

3.

 

In what areas does your coalition need to improve (i.e., in which major categories did your coalition receive scores of "0" or "1")?

1.

2.

3.

 

What specific and feasible steps should your coalition take to address the challenges identified in the question above?

1.

2.

3.

 

Used with permission. Butterfoss FD (1998). Coalition Effectiveness Inventory (CEI). Norfolk, VA: Eastern Virginia Medical School. Revised from Butterfoss FD and Center for Health Promotion, South Carolina Department of Health and Environmental Control (1994). Coalition Self-Assessment Tool. Columbia, SC.

Bibliography

Butterfoss FD, Francisco VT. Evaluating community partnerships and coalitions with practitioners in mind. Health Promotion Practice 2004; 5(2):108-114.

Canadian Coalition on Seniors’ Mental Health. Overview of the Current Literature on Coalition Development. 2003 October. Available at http://www.ccsmh.ca/pdf/LiteratureCoalitionDev.pdf.* [PDF–42K]

Centers for Disease Control and Prevention. Evaluation Guide: Developing an Evaluation Plan. Atlanta, GA: U.S. Department of Health and Human Services; 2006. Available at http://www.cdc.gov/DHDSP/state_program/evaluation_guides/index.htm.

Centers for Disease Control and Prevention. Evaluation Guide: Developing and Using a Logic Model. Atlanta, GA: U.S. Department of Health and Human Services; 2006. Available at http://www.cdc.gov/DHDSP/state_program/evaluation_guides/index.htm.

Centers for Disease Control and Prevention. Evaluation Guide: Writing SMART Objectives. Atlanta, GA: U.S. Department of Health and Human Services; 2006. Available at http://www.cdc.gov/DHDSP/state_program/evaluation_guides/index.htm.

Center for the Advancement of Collaborative Strategies in Health. Partnership Self Assessment Tool. Available at http://www.cacsh.org/psat.html.*

Halliday J, Asthana SNM, Richardson S. Evaluating partnerships: the role of formal assessment tools. Evaluation 2004; 10(3): 285-303.

Himmelman A. Collaboration for a Change, Definitions, Decision-making Models, Roles, and Collaboration Process Guide. Available through Partnering Intelligence at http://www.partneringintelligence.com/documents/5.02_Collaboration%20for%20a%20Change.doc.

Himmelman A. Collaborative Leadership Self-Assessment. Available through Partnering Intelligence at http://www.partneringintelligence.com/resources_articles.cfm.

International Institute for Sustainable Development. Knowledge Networks: Guidelines for Assessment. Available at http://www.iisd.org/pdf/2004/networks_guidelines_for_assessment.pdf.* [PDF–350K]

Mattessich PW, Murray-Close M, Monsey BR. Collaboration: What Makes It Work: A Review of Research Literature on Factors Influencing Successful Collaboration. 2nd ed. St. Paul, MN: Amherst H. Wilder Foundation; 2001.

McMorris LE, Gottlieb NH, Sneden GG. Developmental stages in public health partnerships: a practical perspective. Health Promotion Practice. 2005; 6(2):219-226.

National Community Anti-Drug Coalition Institute. Breaking Through: Taking Your Evaluation to the Next Level: Advanced Issues in Coalition Evaluation. Available at http://www.coalitioninstitute.org/SPF_Elements/Evaluation/MYTI%20Advanced%20Evaluation%20Guidebook.pdf.* [PDF–384K]

National Community Anti-Drug Coalition Institute. Coalition Assessment Tools. Available at http://www.coalitioninstitute.org/Evaluation-Research/Coalition_Assessment_Tools.htm.*

National Community Care Network. Evaluating Community Based Partnerships. Available at http://www.hret.org/hret/programs/content/Winter02.pdf.*  [PDF–119K]

Patton MQ. Utilization-Focused Evaluation: The New Century Text. 3rd ed. Thousand Oaks, CA: Sage Publications; 1997. p. 23.

Prevention Institute. Developing Effective Coalitions: An Eight Step Guide. Available at http://www.preventioninstitute.org/pdf/eightstep.pdf.*  [PDF–202K]

University of Kansas. Community Toolbox. Our Evaluation Model: Evaluating Comprehensive Community Initiatives. Available at http://ctb.ku.edu/tools/EN/sub_section_main_1007.htm.*

University of South Carolina Prevention Research Center. Inventory of Measurement Tools for Evaluating Community Coalition Characteristics and Functioning. 2003 April. Available at http://prevention.sph.sc.edu/Tools/coalitionevalinvent.pdf.*  [PDF–249K]

University of Wisconsin Extension. Evaluating Collaboratives: Reaching the Potential. Available at http://learningstore.uwex.edu/pdf/G3658-8.PDF.*  [PDF–1.5M]

U.S. Department of Health and Human Services. CDC. Office of the Director, Office of Strategy and Innovation. Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Atlanta, GA: CDC; 2005. Available at http://www.cdc.gov/eval/evalguide.pdf.*  [PDF–1.3M]

W.K. Kellogg Foundation. Community Partnership Toolkit. Available at http://www.wkkf.org/Pubs/CustomPubs/CPtoolkit/CPToolkit/.*

Wallerstein N, Polascek M, Maltrud K. Participatory evaluation model for coalitions: the development of systems indicators. Health Promotion Practice 2002; 3(3):361-373.

Download the Publication

Download the Fundamentals of Evaluating Partnerships Evaluation Guide [PDF–2.7M]

 
*Links to non–Federal organizations are provided solely as a service to our users. Links do not constitute an endorsement of any organization by CDC or the Federal Government, and none should be inferred. The CDC is not responsible for the content of the individual organization Web pages found at this link.
 


Page last reviewed: October 20, 2008
Page last modified: October 20, 2008
Content source: Division for Heart Disease and Stroke Prevention, National Center for Chronic Disease Prevention and Health Promotion
