1997 Partnerships for Networked Consumer Health Information Conference
Transcripts of Plenary Sessions and Breakout Sessions
Post-Conference Workshop C: "Evaluation Issues in Consumer Health Informatics"
Thursday, April 17

Moderators: Robert Hawkins, Ph.D., Professor of Journalism and Mass Communication, University of Wisconsin, Madison, WI, and
Speakers: Farrokh Alemi, Ph.D., Associate Professor of Health Administration, Cleveland State University, Cleveland, OH, and

Hawkins: What new issues does interactive technology present with regard to evaluation? Can we measure whether consumer health applications are effective in improving health and reducing unnecessary medical care? Why, when, and how can we evaluate interactive health communication? These are some of the questions posed by the emergence of this technology. A lot of the situations we're discussing today are inherently problematic. I want to discuss issues, and instead of offering solutions, we will work through a set of questions. So given that these things are problematic, let's have everyone speak their opinions and then proceed to the questions. Let's go around the room so we know who is here.

Alemi: I am at Cleveland State University. I started two computer software companies. We have developed a group of systems of evaluation.

Pingree: At the University of Wisconsin we develop software for breast cancer research. We confront the issues, listed here, for discussion. I'm not sure there are answers. We are both developers and evaluators. I hope what we have to say is going to be of interest to the Government. It seems that the Government is a bit over-represented here, but perhaps that can be a good sign.

Audience: I think it is a good sign to see such high levels of Government participation. The Government, for the first time, is trying to assume a greater role in evaluating what it is doing. We want to be a part of the process by evaluating ourselves to make the technology better able to serve the public.

Hawkins: Suzanne and I are colleagues. I am also involved in mass communication; that's how I got into computers; they allow people to communicate over vast areas. Something I've learned is that evaluation does not happen best after the product is developed. It should take place from the beginning.

Jimison: We just finished an evaluation, and we helped consumers generate better decisions.
There was a more qualitative setting, and even though we utilized needs assessment and feasibility studies, we've had to go through additional, post-project studies to learn how we can make evaluation more feasible.

Hawkins: Let me take the position of devil's advocate. Maybe we are incorrect to assume that evaluation is necessary. I wonder if we are hamstringing the developers. And for those in academic settings who are excited about the new idea -- are we slowing them down? That is a real issue.

Sample evaluation questions:
Pingree: I can't imagine adding new parts of the Comprehensive Health Enhancement Support System (CHESS) project without needs assessment. I don't think evaluation should be done only after implementation.

Rippen: You may have to do an initial evaluation of the product, but once the technology is there, it won't be necessary. Once you have tested the interface and how it works with the consumer, then I think the dilemma is reduced.

Audience: I remember when "techno-geeks" were the only people interested in developing and monitoring the advances of the new technology. A lot is being developed by idiot savants who know everything about technology and nothing about people. There has to be some element of evaluation and testing first.

Audience: I think you have to look at the issue of whether the new material being developed is self-tutorial and whether it is accessible.

Jimison: We think we understand the Net and the browser. But we don't. We really don't know how to carry out the task of finding something easily. There are no experts out there who understand the novice users of the Internet.

Alemi: The issue here is that people think these technologies won't do any harm. But actually there is a lot of potential harm. I think the Wisconsin project is interesting because studies have shown that there can be an improvement in the delivery of care and education for AIDS patients. But what we found is that you have to evaluate these new technologies disease by disease. What works for prostate cancer will not necessarily work for other types of cancer. You use existing patients for the outcomes of the project. There are methods for evaluation, but how do you do it? I think evaluation is necessary. After 8 years, I don't have a theoretical model to help me create a hypothesis. How can I explain that talking online can reduce health costs? Homeless, undereducated drug users -- if they use our system three times a week, we've seen that there is a positive impact.
Audience: Can you put a dollar figure on the quality of life? Maybe you should tie the quality of life to cultural factors.

Alemi: I think we should evaluate the mechanism that affects the quality of life. We have to show how online discussions lead to a better quality of life. Until we know this effect, we can't develop evaluation tools.

Pingree: I have written some ideas regarding issues pertaining to what we have to evaluate. I was just trying to note some of the themes being raised in our discussion. [chart on blackboard]
Hawkins: What's the question? What does it mean when you say one wants to evaluate the system? What are a breast cancer patient's needs? It is a researcher's nightmare because, until you have a prototype, the responses are so varied.

Audience: The thing I like about evaluation is that it forces us to step back and think a bit. After 15 years of being one of those idiot savants focused on technology, the reflection time has helped me to make fewer mistakes in all the new programs I develop.

Alemi: I come from a different background. I think people don't know what they need. We need to determine whether you should go out and ask people what they want or whether you should go out and evaluate and then convince people of what they need. There are many stereotypes. We were going to call each patient once a week. The computer was going to call, but the President said no. I insisted that we do it for five patients for five months. At the conclusion, we got a barrage of calls saying that we should not have stopped. We have a new technology, and then we try to fit it into our existing lifestyle. But maybe we should change the lifestyles to fit the new technology.

Audience: Your comments remind us that there are two movements here. There's the power movement, with patients doing self-evaluation and working with their doctors to make decisions. We also have technology to aid the patient movement. Then we hear you say that perhaps there are people from the University who are pushing for a movement where the professionals determine what is best for an individual. That is, if the technology is as good as it has been claimed to be.

Jimison: Well, actually, speaking as a researcher involved in academia and development, I think I can reasonably state that most people from the University are out of touch. There needs to be more evaluation and feedback, and we may need to try it out on a few and then go on to develop the new technology after receiving feedback.
Audience: Is there a mechanism where, after those first five people were assisted in the learning, they can be trained to be midlevel or entrepreneurial workers so they can be left to pass on what they learned in their community? Is there a theory that a community would have fewer health problems if some people were trained to educate about the system after the needs assessment is finished?

Audience: I think the technology should be sorted out. It is the end users who decide what they need. I have a telephone because I need a service. It is more than a needs assessment of the technology itself.

Hawkins: I think it is important to keep a balance. We need to have open communication between those idiot savants who know the technology and those who communicate with the end users. Good projects usually have a perpetual advisory group that brings new, ongoing ideas for changes and improvements, so those who have technical expertise can refine their new developments.

Alemi: I also agree that any intervention should empower the community and leave something behind. For instance, in one experiment, there were 300 people on the system. We worked with them, and those people are probably still continuing to use PCs. A conventional approach is one where entrepreneurs have new ideas and everyone says that they don't make sense. Then, eventually, after much pushing on the part of the entrepreneur, people begin to come around and accept the new idea. I think that, despite some of the resistance in our urban risk project, there were gains to be made.

Audience: Can you give us a better overview of these projects you're referring to?

Alemi: The first was for low-income youths in housing projects, and the other was with drug-addicted pregnant women. The papers told us that we didn't belong there. There was a lot of resistance, but you have to constantly evaluate and monitor. I am an entrepreneur, and I will stick to an idea even if it is a bad idea.
It is all part of reevaluating new technology and redesigning it for the users.

Hawkins: We don't really understand the technology we're playing with, and it seems to be different from books and other printed educational tools. But when you are looking at new technology, there may be new questions to be raised.

Audience: I remember when I was in college typing term papers, and now I cannot even think of doing work without a word processor. We learned three things through a project from my department. We set up touch-screen kiosks in Wal-Mart. We figured that at Wal-Mart we would get access to a broad segment of the public. We learned that even the simplest document had to be put into a different form so people could better understand the content. Things that were not moving in print became very accessible to those who used the new technology. People were motivated because the point-and-touch keypads made the information more enjoyable to use. Also, when we look at intermediaries, there are issues beyond just putting out public service announcements.

Audience: I am always conscious of the decontextualization of the material we are trying to get out there, especially as we try to increase immunizations. Another thing I learned is that fine-tuning the results of the research is necessary. We must ask ourselves, who does this project or program work for? People in urban or rural locations? The context is very important -- who did it work for and why?

Audience: Very often the results of these evaluations are not fully appreciated by the Government. Evaluation helps us to redo the procedures more accurately. Usually we find that the doctor wants the nuggets of the guidelines. The doctor would be more responsive to the information that the Government provides.

Hawkins: That brings up the question of whether we give doctors factoids or complete nuggets of guidelines and information.
We know that you can tailor things more successfully with computers, and you can tailor the information for each patient.

Audience: Marketing strategy is important. We are talking about strategies to provide that information to people.

Audience: People have values for each type of technology. How is informatics dealing with the fact that people read just specific journals for their work? How can you get people to place more value in certain information when they have entrenched ways of getting their information?

Pingree: The web is enormous, and there is a lot to be gotten from it, but there is a lot of junk. At CHESS we let people get in and not get out. Yet now we are thinking of providing restricted links to other sites. But I don't think we should constrain the movement of people to other sites.

Jimison: Something should be said about the heuristics of quality. The empowerment comes from getting what you want, and maybe the Government should be involved.

Audience: How does the Government plan to make these evaluations? I do evaluation, but we are actually the third-party contractors. A university group works with the evaluators, and then we come in. Then we give our findings to the Government.

Alemi: We have discussed coming up with guidelines for evaluators. I think there is potential for automation of evaluation, especially in health care.

Audience: There is a white-paper summit group trying to bring together groups working on guidelines for evaluation of such things as alternative medicine web sites. Hopefully we will have an unbiased view. We are attempting to address how the technology can be effectively used to adjust interaction. Another issue is content.

Alemi: What about how society is changing? We are talking about evaluating technology that is going to be different tomorrow, and the people are going to be different as well.

Pingree: I agree.
Kids who are high TV viewers are also high Internet users, and the level of computer literacy in these kids is staggering. But it is also a misconception to think that this new technology is only for the young.

Audience: I don't know if this is fully understood, but only 6 percent of blacks and minorities have access to the Internet. It is a substantial resource for information and learning, but the accessibility of this new, useful technology has been limited to certain segments of the population. When we talk about who is connected, there is a stratification of the demographics. When you talk about the sophistication of computer literacy, what segment are you talking about?

Pingree: That is certainly a decisive point. There is a great access gap.

Audience: There is another major problem. We talk about empowering, but we don't know who we are able to reach with the tools for empowerment.

Alemi: In our work at that Chicago housing project, we gave homeless people voice mail capability. They didn't have homes, but they stayed connected to the outside and utilized this technology much more than those who had homes. There was a difference in who appreciated the voice mail. I don't want to leave out minorities at all. In fact, they were the majority of beneficiaries of this project. I think even giving access to what is considered lower technology -- phone, voice mail, e-mail -- is going to significantly aid those who otherwise would be cut out of the system.

Audience: There is an underrepresentation of those who have access and those who are using the Internet. I received an e-mail just yesterday -- I don't know if anyone else received it as well -- asking me to pass on a message of inclusion to other black professionals who use e-mail. And I thought this was highly interesting because it shed light on the fact that the sender of that message was not aware of how many black professionals use the Internet.
Certainly, the black professionals are in themselves a minority because they are usually from the middle class.

Audience: There was a project where we wanted to get computers donated to certain schools. It kept people off the streets during the free time they had every afternoon. My daughter goes to a school where the media center is "to die for." It's technology-rich. This woman set up centers for them. It has a bulletin board, and these inner-city kids loved it. They picked it up very quickly. It had a touch-screen system. People will use that. Technology is part of the evaluation. It has been said that one of the most effective ways to reach inner-city people is through billboards. We have to decide if technology is a barrier. As I like to tell my husband, people don't want someone to explain how a computer works; they just want it to work.

Hawkins: Let's go ahead and reconvene. Let's move to the next issue. It's in two parts, in that it addresses the problem of who should do the evaluation and the challenge of objectivity and believability. The people who are developing the systems have an intellectual and financial stake. If they do the evaluation, there are bound to be questions about it.

Audience: I don't see how it should make a difference. The other point is that, over a number of years, we developed a lot of information technology. Ultimately, there was a study started in the early development stages.

Pingree: The evaluation wasn't good, or wasn't successful?

Audience: There are just worlds of systems out there that we contributed to, and they couldn't make it through peer review. It wouldn't go anywhere beyond that initial stage.

Pingree: What about the situation where the developer does have a meritorious project and has funding?

Audience: I'm of the opinion that if they get through peer review, then it's something good.

Audience: There were a couple of things not mentioned. I was interested in talking about consumer evaluation of products.
There is a difference between patient education for people in a medical setting versus education for people with disabilities. People with disabilities consider themselves different from people in a health setting. Kiosks are fine except if you're blind. The Trace Center in Madison is one of our grantees. Its system is designed for everyone to use. On one system, you put your hand down the left side, and it tells you what's there. To me that's the ultimate test.

Audience: The links browser is always a big issue. What effect will this browser have?

Jimison: I have a concern now. Most people here are from the Government. We think of the Government as only a funding source, but we must think of them as developers in order to promote advancement in the creation of an effective system. Academic medical centers are losing funding dramatically. HMOs aren't going to share results. The same with U.S. HealthCare and others. We need to think more broadly. We have to think of some new models. Today we have to determine what's different in this area of study compared to, say, a drug study. Some early efforts are going under. That Time-Life book project didn't work. I don't get that.

Audience: I haven't worked that closely with development. The company failed, as everybody read on the front page of the Wall Street Journal. The only thing I know was that sales weren't what was projected. It's easy to say in hindsight that the books were put in the wrong setting. The consumers didn't want to buy the product.

Audience: That's the ultimate motivation.

Audience: Wanting it and paying for it are two different things.

Audience: The bottom line needs attention. We were talking to a guy who was marketing a new program. Some of the largest pharmaceutical companies have gotten together. They will have an electronic Physicians' Desk Reference for doctors. Doctors can download information and have it available. It's all for the convenience of the physician.
Obviously, what the physicians want is something that saves time. This is the reason that small companies go belly up. They don't address the issue of the user.

Alemi: We don't always go belly up, but we don't always do what we intended to do. Society and organizations are changing right under these companies. If you are independent on the web, then you will always have to advertise your existence. The practice changes are far more than the technological changes. We have so many patients here, so we have to reschedule our system. In the Time-Life situation, those videotapes will make sense in the process of care and with clinicians on staff to use them. I'm just thinking the health information itself is not sufficient. We have to redo the health care system.

Hawkins: Informatics is not just information. It can be about emotional support. When you're designing something to go somewhere, you also have to be designing the dissemination system as well.

Audience: Often when we design, we have to show that the money is being used wisely. That's not an integrated thought.

Alemi: If I were going to recommend something to the Government, it would be that they not evaluate online services, but evaluate the health care system that has online services embedded in it. Why do we all want to lose weight? Why on the first of the year do we make promises that we don't keep? We have to look at the health care delivery system, the prescription that comes in the mailbox, and everything else. There's more to quality dieting than putting health care information about it online. Being online is just a minor component.

Hawkins: I'd like to get back to the question of who should carry out evaluations. I'm getting a sense that funding of evaluations tends to focus on end-run items -- on things that are already set up. There are types of evaluations where you ask, what is it that we are doing? Is this going to do what we think it is going to do? You can get some answers very cheaply.
Promise to get people bagels and cream cheese and get them to come in to give the feedback necessary to set up a good system. There are lots of nickel-and-dime ways to get information, but they won't be very scientific. My mind goes back to what my grandfather used to say -- that a thing worth doing is worth doing well. Is it worth it? Is gathering a little tainted, subjective information better than flying blind?

Audience: What I've been hearing here is that what it comes down to is that success and failure are human experiences. When you talk about a qualitative narrative, is anybody going to use it? Is it going to mean something to someone?

Audience: In Health and Human Services, biostatistics is the primary tool, and evaluations are seen as soft. I'm a little pessimistic about what data public health officials are going to pay attention to.

Jimison: We have a wrong definition of science. Journal articles are not necessarily the end product. I think we should question this arbitrary standard that has nothing to do with quality decision management.

Audience: I am not only concerned about the quantity of information, but the penalty for wrong information. How serious is it?

Audience: I can speak to that. We had a survey that was quickly put together. It was put together with an independent consulting firm. I couldn't believe some of the results. There were so many issues involved. People who market the product don't understand research. The people who are making decisions are physicians. They were extremely put off by the way data collection was handled. If I were there, they wouldn't have gotten any data unless it were good, solidly framed data. I still think it's very valuable, and I want to know what's happening. What are women asking their doctors by e-mail? I want to know at this point. There can be tremendous value; you just have to be careful about how you deliver it.
Alemi: If you look at what's happening on TV recently with various court cases, there is some evidence beyond reasonable doubt, and some that is gathered with reasonable doubt. Let's all remember, Hitler was elected by the popular vote, so you can make mistakes. In the absence of controls, you could mislead yourselves.

Audience: The side I err on is getting information. How about simply asking: did you use this tool? How did you use it? Did you get feedback? We're having an online evaluation of our web site. I used to not hesitate to call nine random people and get off-the-cuff feedback. An e-mail system on the web site is "in-your-face" democracy in action. They are telling me the most intimate details of their lives on the Internet. When you change the way people communicate, you change the very structures of their society. Changes will occur too quickly for the researchers to take their time and quantify information.

Hawkins: As a developer, I'm not as worried about this. If I'm right 51 percent of the time, I'm ahead. Casinos work on less margin than that. The developer makes a decision, like let's talk to nine people, let's get the product out the door. The information is probably only 95 percent accurate. But it isn't science.

Audience: It's social marketing.

Audience: The patients are looking for something conclusive. They want one answer. We have to say this is the best information we can provide at this particular time. I work with breast cancer patients. When some erroneous data came up, it caused a tremendous uproar in that community, as women were making decisions based on it. This is the best information we can provide at this time, but the level of confidence has to be stressed as well.

Audience: One of the things we know is that people are very reluctant to share information early. There is a need for that direct information. Who's controlling it -- it's really an incredible time. Science has a new role in the world.
It has entertainment value now. How do you package that? How the Government stays pure and yet responsive is a difficult question.

Audience: This is the strength of informatic technologies. We want to find a solution, and we do an incremental update. The pill is the ultimate symbol. People say, we want the pill. The one objective for me is to get people to check information on a regular basis. If you're checking information often, then that one article in the paper won't be as upsetting to you. Is anybody looking at the social study? How do people handle information intake?

Pingree: In one study, we can watch what people do with their computer -- one piece of information connects to another. It's a theoretical goal to define computer use and information use. It's an open field, but not very much is happening in that area at all. You can also look at who is accessing information.

Alemi: You can try to analyze what people are saying. In one such open discussion group, we looked to see if people were talking about issues or emotions, or if they were just flaming each other. We found 50 percent were exchanging facts, 3 percent were talking about problems, and no one was flaming anyone. In a confidential arena, a significant number of the questions asked did not have to do with medical problems, but with relationship problems. A higher percentage of people were asking about highly controversial issues like sex, race, or suicide. People seem to use these media in different ways.

Jimison: I wanted to bring up the issue of uncertainty. We have to develop realistic expectations and move people away from the black-and-white attitude in all aspects of life. There have to be new methods to do this, but it's a very difficult concept to get across to people.

Audience: When you respond to an invisible threat, like radiation, people feel trapped because of a knowledge gap. There was a story about smoking that I saw recently in an Ann Landers column.
A child said to a parent, "I want to start smoking." The parent said, "There's something that I want you to do first. I want you to do a research paper and look at the health risks, and then, if you still decide you want to, you can smoke." The kid did the research and decided not to smoke. We struggle with this at healthfinder. Do people take the information down and do anything, or do they just look at it and choose not to change their actions? In this culture we look for a silver bullet or a magic pill. At one conference, I talked to an older lady about arthritis and how it could never go away, but you could do certain things to help the situation. The lady came up to me after the presentation and said, "Thanks for the talk. Now could you give me the name of a doctor who could cure my arthritis?"

Audience: People have to know we are tracking them. Confidentiality becomes very important, because in some ways it scares them.

Pingree: Actually, Internet tracking with keystrokes through the CHESS program is quite clear. Through the home page, the tracking is not so clear. There's no way to get informed consent from users of a home page.

Hawkins: We've mentioned a number of times the differences between people. We tend to lump these differences demographically. Computer access may be a very good way to track a demographic. Demographic surveys would never pick up information styles. The point is, the differences between people are distinguished by the kinds of questions you ask. What's your question?

Alemi: There is a lot of controversy over the issue of information style. I was at a conference where we were studying design styles. Once we finished the measurement, we didn't know what to do. Suppose I come in and say I want details. Do we give you what you want? Do we match your style, or do we give you the opposite?

Hawkins: What is the goal? Here is the tool. Here is a service, and we have to match.
Audience: Let's not measure the information style, but let's measure the impact of that style on this distribution. That's what you want to impact.

Hawkins: That's a very good way to think about it, but that model will not work as well as it used to. It came out of the 1950s, and it is essentially a model that presumes a top-down diffusion of innovation. One of the things in play now is a competing model, with people seeking in a different way. I'm not saying replacing it, but augmenting it. The problem with laggards is that you have to beat the message into them harder to get it across. Maybe we can find a different way to approach those laggards.

Audience: I think of an old African proverb: if you want to understand the power of healing waters, you don't use the waters; you ask the people who use those waters. How do we do that without a well-designed, quasi-theoretical survey in place? I don't want to just put out any research. Let's get some theoretical research going.

Alemi: If you do have research, you don't have to evaluate every circumstance until someone proves that version of the model wrong. Sometimes it increases visits, and sometimes it decreases visits.

Audience: I would also argue that the other side is just as important. You need to know what that healing water is. Many of the advances we've made are from understanding at that level. I hate to see this given up.

Pingree: What other issues do we need to bring up here?

Audience: I don't come from the evaluation field, so this may seem an obvious point. But it would seem to me that if you have a product and you want to evaluate it, you would ask, what's the purpose of the product? What do you want it to do? It seems to me it would be easy to do an evaluation.

Pingree: I think you're right. When I began working on CHESS in the early '90s, we developed personal stories in CHESS on AIDS and breast cancer. We focused on the question, what are you trying to get at with your breast cancer research? It's very important.
Just defining goals was a big step. I don't think people are self-reflective enough to think about this when they're designing.

Jimison: If you are expecting purchasers to provide quality, HMOs will look at cost. If they're looking at consumer satisfaction, it's to develop more growth. It's important to us to look at society as a whole, rather than leave it entirely up to the HMOs.

Audience: What you get from an evaluation is serendipitous. It depends on how you analyze the data from an evaluation. If you have a removed group, they may come up with serendipitous information also. You have to be able to let the ego go.

Pingree: I'm not convinced it has to be someone outside of the project.

Alemi: You have to be inside the project to really know what's going on.

Pingree: In one of our studies with CHESS, we compared the people who changed the most and who changed the least during their use of the system. There was one person who didn't change at all. He used a lot of links, and he should have been a changer in our study, but then we discovered we didn't ask the right questions. It turned out he was an alcoholic when he started, and the program stopped him from drinking.

Audience: Do we know if he stopped because of CHESS?

Alemi: We don't know that.

Pingree: He said thank you in his e-mail. He said that if it hadn't been for the system, he wouldn't have changed. Now why exactly he changed, we don't know.
URL: http://odphp.osophs.dhhs.gov/confrnce/partnr97/transcripts/workshopc.htm
Last updated on June 26, 2003
National
Health Information Center
P.O. Box 1133
Washington, DC 20013-1133