
Archive for July, 2008

Evaluation 2008

The American Evaluation Association’s 2008 annual meeting, with its “Evaluation Policy and Evaluation Practice” theme, will be held November 5-8 in Denver.  There will be three days chock full o’ presentations about evaluating almost any kind of program you can think of (including health promotion, but not health sciences librarianship).  To see if this is the meeting for you, take a look at the schedule.

What Is “Appreciative Inquiry”?

Christie, C.A. “Appreciative inquiry as a method for evaluation: an interview with Hallie Preskill.”  American Journal of Evaluation. Dec 2006. 27(4): 466-474.

In this interview, Preskill defines appreciative inquiry as “…a process that builds on past successes (and peak experiences) in an effort to design and implement future actions.” (p. 466)  She points out that when we look for problems we find them, and that this deficit-based approach can lead to feelings of powerlessness.  In appreciative inquiry the focus is on what has worked well, and the use of affirmative, strengthening language improves morale.  She suggests keeping the focus positive through interviews that ask for descriptions of “peak experiences” that left people feeling energized and hopeful, and for information about what they value most.  She cautions that skeptics will see this as a “Pollyanna” approach that lacks scientific rigor.

What Does “Effective” Mean?

Schweigert, F.J. “The meaning of effectiveness in assessing community initiatives.” American Journal of Evaluation. Dec 2006. 27(4):416-426.

Evaluators have a way of coming up with answers to questions we didn’t know we had, such as, “what does ‘effective’ mean?” This article points out that the meaning varies with context. In some contexts a positive judgment means the changes that occurred were the ones that were expected; in others it requires that the changes were better than what would have occurred without any intervention (which calls for evidence of cause and effect). In true academic evaluator fashion, the author presents three different meanings of “effectiveness”:

  • increased understanding through clarifying assumptions, documenting influences, identifying patterns, assessing expected and unexpected results, etc.
  • accountability through making decisions based on performance expectations and standards, such as in benchmarking.
  • demonstration of causal linkages through experimental and quasi-experimental evidence showing what works. “Although randomized experiments have been called the ‘gold standard’ of social science research and evaluation, evaluators are well aware that experimental designs are not always possible, feasible, necessary, or even desirable.” (p. 427)

Nuggets from the Health Program Evaluation Field

Grembowski, D.  The Practice of Health Program Evaluation.  Sage, 2001.  Information about this book is available from Google Books.

Not a new book, but an interesting one, with information of potential use to us in thinking about evaluating health information outreach.  Some general perspectives from the book:

  • Most evaluations are conducted to answer two questions:  Is the program working?  Why or why not?
  • All evaluation is political since judging worth is based on attaching values.
  • Evaluation as a 3-act play:  Act 1 is asking questions; Act 2 is answering them; Act 3 is using these answers in decision-making.
  • Evaluators’ roles range from objective researcher through participant, coach, and advocate.
  • Evaluations look at the “theories” behind programs, such as the causes and effects of implementing activities.
  • Premises underlying cost-effectiveness analysis: health care resources are scarce, resources have alternate uses, people have different priorities, and there are never enough resources to satisfy everyone (a small worked example of the basic arithmetic follows this list).
  • Evaluation standards include utility (results are intended to be used), feasibility (methods should be realistic and practical), propriety (methods should be ethical, legal, and respectful of the rights and interests of all participants), and accuracy (the evaluation should produce sound information and conclusions that follow logically from the data).
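
As a concrete illustration of the cost-effectiveness premise above, the basic calculation compares what a program costs to the outcome it produces, and compares programs by the extra cost of each extra unit of outcome. The sketch below is not from Grembowski’s book; the options, costs, and outcome counts are invented purely to show the arithmetic, in Python.

    # Illustrative cost-effectiveness comparison (all numbers are invented).
    # Because resources are scarce and have alternate uses, the question is
    # what each additional unit of outcome costs under each option.

    def cost_effectiveness_ratio(cost, outcome_units):
        """Average cost per unit of outcome (e.g., per person reached or trained)."""
        return cost / outcome_units

    def incremental_ratio(cost_a, outcome_a, cost_b, outcome_b):
        """Extra cost per extra unit of outcome when choosing option B over option A."""
        return (cost_b - cost_a) / (outcome_b - outcome_a)

    # Hypothetical outreach options: A reaches 200 people for $10,000,
    # B reaches 350 people for $20,000.
    print(cost_effectiveness_ratio(10_000, 200))        # 50.00 dollars per person
    print(cost_effectiveness_ratio(20_000, 350))        # about 57.14 dollars per person
    print(incremental_ratio(10_000, 200, 20_000, 350))  # about 66.67 extra dollars per extra person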

More from MLA on Library Value

This year’s Medical Library Association annual meeting in Chicago had several good sessions in which speakers presented their experiences with, and approaches to, assigning dollar values to library services and activities. These included:

  • “A Calculator for Measuring the Impact of Health Sciences Libraries and Librarians” presented by Betsy Kelly and Barb Jones of the MidContinental Region, National Network of Libraries of Medicine. Their calculators include the Valuing Library Services Calculator and the Cost Benefit and ROI Calculator. These have the potential to be very useful tools (a rough sketch of the kind of arithmetic such calculators automate appears after this list).
  • “Connecting with Administrators: Demonstrating the Value of Library Services” presented by Edward J. Poletti of the Central Arkansas Veterans Health Care System in Little Rock, AR. He and VA Library colleagues conducted value studies of shared electronic resources, ILL, and literature searches. His presentation included a list of sources of dollar values, such as Fortney’s “Price History for Core Clinical Journals in Medicine and Nursing 2003-2007” and “Doody’s core titles in the health sciences 2007: list overview and analysis.” This paper received honorable mention for the MLA Research Award, and a summary is available at the MLA Federal Libraries Section blog.
  • “Bridging the Gap: Using Dollar Values to Demonstrate the Value of Library Services” presented by Julia Esparza of Louisiana State University Health Sciences Center in Shreveport, LA. Her experience with assigning and tracking dollar values included analysis of copying/printing costs and article costs.
  • “Quantum Physics and Hospital Library Assessment” presented by Michele Klein-Fedyshin of UPMC Shadyside, Pittsburgh, PA. Assessment must be locally relevant, and there are various possible foci, such as the financial impact of local consortia, the impact of library services on nursing certification, prevention of hospital-acquired infections, cost savings from library contributions to pay-for-performance, the library as a drug information center, etc.
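
For readers who have not tried these calculators, here is a minimal sketch, in Python, of the kind of arithmetic such tools automate: multiplying annual usage counts by estimated per-use market values, totaling the result, and comparing the total to library operating costs as a cost-benefit ratio and an ROI. The service categories, dollar figures, and function names below are hypothetical placeholders, not values taken from the MLA presentations or the MidContinental Region calculators.

    # Hypothetical sketch of a library value / ROI calculation.
    # The service categories and per-use dollar values are illustrative
    # placeholders, not figures from the calculators described above.

    # Estimated market value of one use of each service (what a user might
    # pay elsewhere) and annual counts of use, both made up for illustration.
    PER_USE_VALUE = {"mediated_search": 75.00, "article_delivery": 30.00, "reference_question": 15.00}
    ANNUAL_USE = {"mediated_search": 400, "article_delivery": 2500, "reference_question": 1200}

    def total_value(per_use_value, annual_use):
        """Sum (annual uses x per-use value) across all services."""
        return sum(per_use_value[s] * annual_use[s] for s in per_use_value)

    def cost_benefit_and_roi(value, operating_cost):
        """Return (cost-benefit ratio, ROI) for a given total value and library cost."""
        cb_ratio = value / operating_cost                 # dollars of value per dollar spent
        roi = (value - operating_cost) / operating_cost   # net return per dollar spent
        return cb_ratio, roi

    value = total_value(PER_USE_VALUE, ANNUAL_USE)
    cb, roi = cost_benefit_and_roi(value, operating_cost=90_000.00)
    print(f"Estimated value: ${value:,.2f}  Cost-benefit: {cb:.2f}  ROI: {roi:.1%}")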

The “LIMB” Model: Lay Information Mediary Behavior

Abrahamson, J.A.; Fisher, K.E. “‘What’s past is prologue’: towards a general model of lay information mediary behaviour.” Information Research. October 2007. 12(4).

Health information outreach is often aimed at information mediaries in addition to primary information seekers. The article defines lay information mediaries as “those who seek information in a non-professional or informal capacity on behalf (or because) of others without necessarily being asked to do so, or engaging in follow-up.” These individuals are also known as gatekeepers, change agents, communication channels, links, navigators, and innovators. The authors present a generalized model of information mediary characteristics, activities, motivations, barriers, and facilitators, and they raise the question of what differences exist between primary information seekers and information mediaries, since “the caregiver-as-person may have information needs that vary from the caregiver-as-caregiver.” These are factors we can take into account in community assessment activities.

What Do Administrators Want?

Back in May at the 2008 Medical Library Association meeting in Chicago, a group of health care administrators presented a panel discussion titled “Connecting with Leaders: What Do They Expect?” in which they shared their expectations for the health sciences library. This was a group of library supporters, and their comments made clear that they expect library leaders to think broadly and creatively about their libraries’ roles. Suggestions included:

  • Participate in community outreach to serve the greater good of the institution and its communities
  • Work with IT to find ways that the library complements IT
  • Develop allegiances; although forming partnerships isn’t easy, a fundamental component of administration is relationship building
  • Stay connected and aligned with operational opportunities and priorities
  • Participate! In the “journey” toward magnet status; in research to improve patient care; in the institution’s constant staff retooling and retraining; in instructional delivery; in grant proposal creation; in benchmarking to learn what similar institutions have and what admired institutions have
  • Think in terms of dollars but remember other values

This session was on Monday, May 19 at 10:35 a.m. and, if you have access to the MLA ’08 CD-ROM, it’s definitely a worthwhile listen.

The STAR Model for Developing Health Promotion Web Sites

Skinner, H.A.; Maley, O.; Norman, C.D. “Developing Internet-based ehealth promotion programs: The Spiral Technology Action Research (STAR) Model.” Health Promotion Practice 2006; 7(4):406-417.

The STAR model combines technology development with community involvement and continuous improvement through five cycles: listen, plan, do, study, and act. The “listen” cycle corresponds to community assessment: learning about needs and opportunities, and building partnerships and stakeholder buy-in. The “plan” and “do” cycles involve identifying objectives and strategies, followed by prototyping and design to address identified community needs. The “study” cycle corresponds to process evaluation of web sites or prototypes, followed by the “act” cycle, in which decisions are made and actions taken based on evaluation results (promotion, ongoing feedback collection and continued refinement, and sustainability). The article presents a case study of using the model, along with methods for approaching each of the five cycles.
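
To picture the model’s structure, here is a minimal sketch, in Python, of one pass around the spiral, with evaluation hooked into the “study” cycle. The cycle names follow the article; the activity notes paraphrase the summary above, and the evaluate() hook is a hypothetical stand-in for whatever process evaluation a project actually runs.

    # A minimal sketch of one pass around the STAR spiral. The cycle names come
    # from the article; the activity notes paraphrase the summary above, and
    # the evaluate() hook is a hypothetical stand-in for process evaluation.

    STAR_CYCLES = [
        ("listen", "community assessment: needs, opportunities, partnerships, stakeholder buy-in"),
        ("plan",   "identify objectives and strategies"),
        ("do",     "prototype and design to address identified community needs"),
        ("study",  "process evaluation of the web site or prototype"),
        ("act",    "decide and act on results: promote, collect feedback, refine, sustain"),
    ]

    def run_spiral(evaluate):
        """Walk the five cycles once, calling the evaluation hook at 'study'."""
        findings = None
        for name, activity in STAR_CYCLES:
            print(f"{name}: {activity}")
            if name == "study":
                findings = evaluate()  # results feed the 'act' decisions
        return findings

    # Example: a trivial stub standing in for real process evaluation.
    print(run_spiral(lambda: {"usability_issues": 3, "partner_feedback": "positive"}))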

Storytelling and Behavior Change

Hinyard, L.J.; Kreuter, M.W. “Using narrative communication for health behavior change: a conceptual, theoretical, and empirical overview.” Health Education & Behavior 2007; 34(5):777-792.

This article advocates the use of narrative communication to motivate people to change their health behaviors, pointing out that “understanding any situation involves storing and retrieving stories from memory.” The authors speculate that narrative ways of learning and knowing may be especially useful for issues where reason and logic have limitations, such as morality, religion, values, and social relationships. Narratives can help overcome resistance to a message, facilitate observational learning, and foster identification with characters. Stories can be combined with more “scientific” methods to achieve optimum results.

Health Promotion Facilitators and Barriers

Robinson, K.L.; Driedger, M.S.; Elliott, S.J.; Eyles, J. “Understanding facilitators of and barriers to health promotion practice.” Health Promotion Practice 2006; 7:467-476.

The authors state that although the “field of health promotion has shifted to embrace a socioecological model of health recognizing the role of environmental and contextual factors on health promotion practice and health outcomes,” most health promotion research “continues to focus on behavioral or risk factor outcomes.” Published studies of health promotion facilitators and barriers have tended to focus on one of the three linked stages of health promotion practice: capacity building for planning and development; delivery of health promotion activities; and evaluation and/or research. Barriers to evaluation and research include: health promotion activities rarely have simple, direct cause-effect relationships to test; health interventions involve many factors and processes that cannot easily be quantified; monitoring in rural areas or at the community level poses significant logistical and financial barriers; and tension exists between “scientific rigor” and participatory evaluation processes that aim to influence practice.

The article characterizes facilitators and barriers to health promotion practice as internal (leadership, staffing, resources, priority/interest, infrastructure, and organization of teams and groups) and external (community buy-in, turnover of local contacts, partnerships or collaboration, socioeconomic/demographic/political contexts, and funding opportunities or cuts).