Transcript

A Systems Approach to Improving Safety: Translating Lessons from the Aviation Industry


This is part three of a series of National Audio Teleconferences sponsored by the User Liaison Program, Agency for Healthcare Research and Quality, conducted on June 5, 2000.


CINDY DIBIASI: Good afternoon and welcome. My name is Cindy DiBiasi, and I'll be your moderator for today's session.

This is the third and final call of this User Liaison Program, ULP, teleconference series on medical errors.

Today's call will address the issue of a systems approach to improving safety: translating lessons from the aviation industry. We're going to examine the roles States play as purchasers and regulators of the healthcare system. By all accounts, the airline industry has one of the most extensive systems for investigating, analyzing, and preventing errors. And today, we'll discuss that system to see what lessons it may offer us to reduce medical errors. We'll also discuss initiatives by the Clinton administration to reduce errors, as well as what's happening at the State level.

We have a panel of four experts with us today: Dr. John Eisenberg is the Director of the Agency for Healthcare Research and Quality; Linda Connell is the Director of the NASA Aviation Safety Reporting System, and she's also a research psychologist at the NASA Ames Research Center; Benjamin Berman is the Chief of Major Investigations at the National Transportation Safety Board, an agency we hear a lot about; and finally, Trish Riley is the Executive Director of the National Academy for State Health Policy.

Welcome to all of you.

Before we begin our discussion, I have a few housekeeping items to take care of. Later in this call, our fine panel of experts will be taking your questions. Now, there are three ways you can communicate your questions to us. You may ask your question over the air by pressing one on your phone pad; this will put you in the queue. If your question has been answered or you want to drop out of the queue, press the pound sign. You may also fax us your questions at 301-594-0380, or you may E-mail your question to info@ahrq.gov. If you prefer not to use your name, that's fine. But we would like to know from what department or organization you're calling, so please indicate that in your E-mail or fax. We will have tapes of the teleconference available for purchase after the series is completed, and I'll give further details about this at the end of today's show.

But now, I think we're ready to turn to our panelists.

Dr. Eisenberg, the IOM report stirred tremendous interest in medical errors after it was released last November. President Clinton directed Federal agencies to study solutions. Could you update us on the work and recommendations of the Federal agencies? For example, I've heard about the QuIC committee.

JOHN EISENBERG, DIRECTOR, AGENCY FOR HEALTHCARE RESEARCH AND QUALITY: The QuIC is the Quality Interagency Coordination Task Force. And like all things in government, it has to have a good acronym. So that's the one we're using. And it is, in fact, working quickly to try to respond to the President's mandate, that we show how the Federal Government can work with the States and with the private sector in decreasing errors and improving patient safety. The QuIC is made up of a number of...

UNIDENTIFIED PARTICIPANT: (INAUDIBLE) even though...

EISENBERG: Pardon?

DIBIASI: Oh, I think you might have—that wasn't me, but I think that you may have a little problem with your cell phone.

EISENBERG: OK. Why don't you go to somebody else and let me fix this, OK?

DIBIASI: Well, you could actually—can you turn this—you want to turn the phone off because you're still on the other line, right?

EISENBERG: It's not near me, but...

DIBIASI: Oh, OK, OK. No; go ahead—we're—you're OK; go ahead.

EISENBERG: OK. The Quality Interagency Coordination Task Force is made up of all the Federal agencies that are involved in healthcare. That includes the purchasers, like the Healthcare Financing Administration (HCFA) and the Office of Personnel Management (OPM); HCFA runs Medicare and Medicaid, and OPM runs the Federal employees health benefits plan. It includes the providers, like the Defense Department, the VA (Veterans Administration), the Indian Health Service, the Bureau of Prisons, and the Coast Guard. It includes research agencies like the Agency for Healthcare Research and Quality, regulators like the Food and Drug Administration, and others like the Centers for Disease Control. It really includes a tremendous array of different perspectives on quality, and that gives us an opportunity to look at the issue of quality from a lot of different perspectives. So, the QuIC has over 100 items on our agenda that we're going to be undertaking in order to decrease the patient safety problem and improve the quality of care throughout the country.

DIBIASI: And what are some of the first steps that we can expect to see from this report?

EISENBERG: Well, one of the first was announced recently, and that is the Veterans Health Administration program. The Veterans Health Administration is working with Linda Connell and her colleagues to develop a mechanism for reporting on errors, so that they can learn from those errors. But they also have a number of systems in place that will help to prevent the errors from occurring in the first place: some computerized systems of order entry and bar coding in the hospitals.

The Health Care Financing Administration has announced that it will require that all hospitals have in place a patient safety program that they can tell patients about. The Office of Personnel Management has announced, in its most recent call to health plans that are going to offer coverage to Federal employees, that they need to have patient safety programs in place. The Agency for Healthcare Research and Quality is initiating a research agenda in this area and has a call for proposals out. Those are some of the first, but there is a long list.

DIBIASI: I'm assuming there's a timetable for action. What does that look like?

EISENBERG: Well, the first action is immediate, and that is that we need to start right away with some of the programs that involve activities that each agency can do by itself, but also with some collaborations. The ones I described are already under way. We've already asked the National Quality Forum, for example, to come up with a list of those key practices that are likely to improve patient safety. We think that list is going to be helpful not only to the Federal Government, but also to the States and the private sector.

We've also asked for a list of measures of patient safety that could be used by those who want to see how well they're doing, or who want to be prudent purchasers of care and go to those places that are taking care to improve patient safety in their hospitals and their healthcare systems.

DIBIASI: Now, the QuIC recommendations call for responses and cooperation by the States in many areas; for example, encouraging error reduction criteria in the re-licensing of doctors by State medical boards. Are Federal agencies preparing to advise and assist the States? And if so, how and when can we expect evidence of such collaboration?

EISENBERG: The first thing that we want to do is to help the States to understand what is already going on. And in that regard, you'll hear later from Trish Riley about some early work that her organization has done to understand what some of the States are doing. What we want to do is to work with Trish, her organization, and others like hers to understand which practices work best in the States, so that the States can learn from one another, with us serving more of a convening function.

We also believe that we can help by defining the errors that are most easily measured by developing quantitative measures of errors, as well as qualitative measures of errors that States can adopt if they choose to do so. We've had seminars for the State leaders in order to understand what's being done in other States and in order to understand the issue of patient safety.

DIBIASI: Now, reporting has been a very hot buzzword since this report came out. And I know the QuIC recommendation calls for both voluntary and mandatory reporting systems to register and analyze medical errors. But doctors and hospital boards are understandably concerned about the confidentiality of the information to be reported. How can the State agencies and officials respond to these concerns from providers?

EISENBERG: The way you frame the question is exactly the right way to frame it, which is that the issue of mandatory versus voluntary reporting is an important one. That is to say voluntary reporting would be done volitionally by those who have either made a mistake, observed a mistake, or had a near miss; whereas, mandatory reporting expects to get the full array of events that have occurred, which will help us to understand rates of events, as well as the most important events that can be learned from.

In either case, the question is whether that information is publicly disclosed or retained by those who are trying to use that information to improve the care that is provided, and that's a very controversial topic. Those who are in the healthcare system believe that it's important to keep the information confidential because that'll improve reporting, make people feel safe about their reporting, and help them to learn from their errors.

On the other hand, there are a number of people who represent the consumer side of this discussion who feel that the information must be made public, so that people can use the information in order to choose the safest healthcare possible, and so that there's more accountability in the system.

We're going to have to find the right balance between confidentiality and public disclosure, whether we're talking about mandatory reporting or voluntary reporting. One of the things that's going to need to happen in the States, as well as in the Federal Government, is a very careful look at the peer review protections that exist; so that when we do decide that information ought to remain confidential, it can; that it would not be discoverable; that it could remain confidential and safe; and that people can feel comfortable that, when they are told their reports will be confidential, they remain confidential.

In many States, we don't have those kinds of peer review protections. In some, we have them, but they're being challenged. And almost every bill that's been introduced Federally in the area of patient safety has addressed the issue of peer review protections and confidentiality. It's going to be central and critical to making something happen to improve the reporting of errors and near misses.

DIBIASI: How will this initiative, this patient safety initiative affect medical malpractice suits in State courts?

EISENBERG: I don't think that it will at all, except that it might enable us to provide safer care, higher quality care, which will decrease the events that people are suing about in the first place. There's nothing about these proposals which makes the original information in the chart secret.

The only thing that would be confidential would be the secondary data sets. That is the data that's collected by people reporting the events that are already reported in the medical record.

In addition to that, everybody in this field, from the American Medical Association to the Federal Government, has said that when an event occurs to an individual patient or to a family member that represents a medical error, a mistake, a mishap, that the patient and the family need to be told about that.

Therefore, there's nothing about this proposal that makes any of us more, or for that matter less, susceptible to a malpractice suit, as long as the kinds of confidentiality constraints that we're describing are in place.

If we don't have confidentiality when we promise confidentiality, then there is the risk that somebody who wants to troll for cases that could be brought to suit, perhaps a legal firm or somebody like that, would go through a long list and then go around to the patient saying: Well, why didn't you sue? Don't you think you should sue? And that's the kind of thing that we want to avoid. But we do want to be sure that the patients and the families have all the information they need to make the decision themselves.

DIBIASI: Do you see any effect, any potential effect? Will there be any kind of effort to change State tort laws?

EISENBERG: That's really a separate issue. We don't think that it's necessary for tort laws to be changed in order for a patient safety program to have a big impact on medical errors. We just see it as a totally separate issue.

DIBIASI: Now, some States already have systems in place to report medical errors. How will a national initiative affect those systems in those States?

EISENBERG: I think it's going to be the other way around. How will those systems affect the national program? What we believe should happen is that we should look at the 50 States and the District as laboratories of innovation; that we should be looking at what they're doing and what we can learn from them with regard to definitions, with regard to collecting the data, feeding it back, privacy protections, confidentiality, all the key elements of this program. We suspect that we're going to learn a lot from the States, and that other States will learn from the States.

Simultaneously, we're going to be initiating these Federal programs, like the one at the VA. And we believe that the States will be able to learn from the Federal experience, from the Federal providers and the Federal regulators. But this is going to be a process of learning. Nobody knows exactly how to do this right yet, but we do know that it's absolutely essential that we have a program in place and that we put it in place quickly.

DIBIASI: Once you gather your information and understand what there is to understand about what's going on in the States, do you see an effort to standardize reporting systems?

EISENBERG: There ought to be consistency in these reporting systems. We ought to be able to know that, when one State is talking about medical errors, it's consistent with what another State is talking about; and, from one hospital to another, that we're talking about the same sort of thing. We ought to have some case mix adjustments in place, so that adverse events that occur because of patients being very sick aren't misconstrued as adverse events occurring because of low quality care.

That kind of consistency and uniformity is going to be important. But it's also going to be important to have flexibility in place, because we are not at a stage right now, where we're ready to codify all of the events into one framework that would be the final framework.

But I do believe that, over time with the work of the National Quality Forum and with the States, that we're going to come together with a common vocabulary and a common set of definitions that we can all understand.

DIBIASI: Dr. Eisenberg, thank you. We'll be back to you in a little bit.

EISENBERG: OK.

DIBIASI: I'd like to turn to some of our other panelists today.

Linda Connell, how does NASA (National Aeronautics and Space Administration) address safety and incidents?

LINDA CONNELL, DIRECTOR, NASA AVIATION SAFETY REPORTING SYSTEM: NASA, long ago, 24 years ago, made a decision to collect data that we really didn't have available to us. And that is kind of a direct line from the front-line participants: pilots, air traffic controllers, maintenance personnel; the ability to speak about what they deal with every day, the life they live out there in the aviation system assuring safety, and to let us know what kinds of things they're coming across every day, so that they could report to us and we could make safety changes.

DIBIASI: And what—and how was the aviation safety reporting system created?

CONNELL: As in many things in our world and our country, nothing seems to happen until there is a major tragedy. And this is not unlike the ASRS, in that the reason it exists today is because of an aircraft accident back in 1974, where a TWA (Trans World Airlines) airplane on approach to Dulles Airport in the Washington, DC, area hit a mountain and killed everyone on board.

In the investigation of that accident through the NTSB (National Transportation Safety Board), which you'll hear more about from them, it was discovered that United Airlines, 6 weeks prior to this accident, had experienced the same chain of events that would have led to them hitting that mountain, if it had not been for a crew member who detected the ambiguity between the controller's language for the clearance and the written documents they had in front of them. And they pulled the nose of the aircraft up in time to miss that mountain.

That was when it was decided that that was valuable information that should be available to the system, so that everyone can be informed. United had a company reporting system at that time, and all of their pilots learned about the problem with the Dulles Airport approach, and they were able to inform everyone of how to take the action, how to interpret the documents. The FAA (Federal Aviation Administration) was informed at that time, but the word never got out, as you can tell, to other airlines, including TWA. That is why it was determined, at that time, that a national-level reporting system needed to be put in place, and 2 years later we had one here at NASA.

DIBIASI: So what kinds of data is ASRS collecting?

CONNELL: We are truly our name: Aviation Safety Reporting System. We expect and encourage people to send us anything that they see as related to safety. And we will listen and, you know, work with that kind of information, plus incidents, plus involuntary or unintentional violations of the Federal air regulations. We wholly exclude from this system any criminal activities or any accidents, because criminal activities in this country, of course, I'm sure you all know, go to the Department of Justice, and accidents in the aviation system are within the purview of the National Transportation Safety Board. So, those are excluded from this system. And we have everything kind of below that, which is all safety reporting, all near misses, all incidents, all hazards, those kinds of information.

Now, the people who report all those are those front-line people: pilots, air traffic controllers, maintenance, cabin, dispatch, gate and ramp agents, some Flight Standards District Offices, et cetera. The people that are out there every day, working in the system.

DIBIASI: How is ASRS using this information to improve aviation safety?

CONNELL: Well, ASRS is one piece of the kind of whole picture in aviation safety, but it is a piece that has been found to be vital. It is the learning piece. It is not the accountability piece. It is the heart of the safety information chain that provides people the tools and the information to take proactive action to prevent an accident.

We have a variety of projects ourselves to get this information into the hands of the people who can act on it to improve the system. We have what we call "Alert Messages." My analysts here are airline captains who have retired after 30 years of illustrious service; air traffic controllers who have retired from the air traffic system; cabin attendants and maintenance technicians. And these people look at every report that comes in the front door, which is running at around 36,000 a year now. They read each one, and then they make an expert judgment, based on their lengthy careers, about the hazards that might potentially lead to an accident. So, they flag those for further review. Then, a set of those go out as "Alerts" to the system, and they basically inform all interested parties, all stakeholders with an interest in seeing some kind of change occur in the system; the information to solve that problem is put in their hands.

Now, other products we have: we will search our database for anyone; it's a search-request program. We will do special, deeper analyses of data for the FAA, the NTSB, and NASA research, as well as some other government organizations. We also put out a newsletter, and this is to really get the information out into the hands of the people who report it to us. It's a monthly newsletter that brings highlights of about five different types of incidents we've heard about during the previous 2 months to remind everybody what the issues were, how that person handled it, how they detected the problem, how they recovered from it, so that we all can improve our own performance.

We also have lengthier publications, and we get money for research. We do deeper research projects, but our purpose is to get information to people's hands, so that we can all learn from it when we are moving towards safety changes.

DIBIASI: Do you have any evidence, Linda, that suggests that the reporting is actually improving aviation safety?

CONNELL: Well, as I said, if you look at safety in aviation, we tend to do statistics by how many accidents, or how many fatal accidents, we have in the system, controlled for how many departures we have in the system, or how many actions are occurring, and what the probabilities are for accidents. It's hard to say how much ASRS has really had an impact, because, in aviation, safety is multi-disciplinary, multi-pronged, and it's out there making changes. And we have driven this accident rate down to such a low level that it is really a rate that we can be proud of.

Now, ASRS...

DIBIASI: What is the rate, Linda?

CONNELL: Oh, I really couldn't tell you. Ben would probably be better at telling you the accident rates. But we know we've contributed to it because—and I can give you specific examples.

We were able to alert on situations like problems with an approach procedure into mountainous territory where an airport was located, or wrong-runway takeoffs at an airport that seemed to happen repeatedly, showing that there was a flaw in the system where the human being kept falling into the same kind of pitfall along the way. We were able to alert on both of those, and we repeatedly alert and we call people. We keep the pressure up. Although ASRS is paid for by the FAA, they realized early on that they could not run this system, because they are the regulator, and people would not be willing to risk their career or any problem there. So NASA, as a research organization, was able to take up the role of confidential monitor and guarantee confidentiality.

We were trusted. However, when we send information out, we are not a regulator; we have no authority or power in that sense. So we count on the goodwill of others to make those changes. And over the years, we've been able to work very collaboratively with the people that make these proactive changes. So, for instance, we had changes come from those "Alerts," and I know we potentially saved an accident in those arenas.

DIBIASI: Well—and that's got to be worth it (INAUDIBLE).

CONNELL: Oh, it is.

DIBIASI: Linda, we'll get back to you in a minute.

I'd like to go over to Ben Berman now, who's the Chief of Major Investigations at the National Transportation Safety Board. And certainly Ben, the Board has extensive experience investigating and responding to incidents. So, can you describe how the system works?

BENJAMIN BERMAN, CHIEF OF MAJOR INVESTIGATIONS, NATIONAL TRANSPORTATION SAFETY BOARD: Hi, there.

Well, I always like to be on the program with Linda Connell and the Aviation Safety Reporting System, because, in a sense, we're the— we're the other half of the story. We're the mandatory reporting system and the very public reporting system and investigation system.

First of all, I'd like to just talk a little bit about what kind of incidents we work with. Most of them are the ones that we call accidents. And in the world of aviation, whenever there's a death or a serious injury, it's classified as an accident. It's investigated by the NTSB. And in that world, nothing is confidential. When it's a situation where nobody has been badly injured or the airplane hasn't been damaged badly, then there's a classic incident that we look into, and that the Federal Aviation Administration, which is the regulator of the industry, looks into, and a broader class of aviation safety reporting (INAUDIBLE). And those are mostly public; our work is public. But there's also the confidential area there.

Now, our system starts with a mandatory report. That's in our legislation. The airline, the pilots have to report when they've been involved in an accident. Generally, it's a very public event; it's on the news before they report it.

So there's actually no question that it's going to be reported (INAUDIBLE). We take that as a starting point. And we're required to determine the cause. And in investigating the accident, we've got really quite broad powers and authority. And like I was saying a second ago, we conduct our work in public. That's mainly how our system works.

DIBIASI: What difference has the mandatory reporting system made?

BERMAN: Well, I think that there are a couple of ways that mandatory reporting has made the system work. And one is, I think, that the aviation industry and the whole aviation system worldwide have developed a confidence on the part of the traveling public in what used to be viewed as a very risky and chancy endeavor.

Part of the situation is that aviation is very safe. On commercial flights, you've got millions of flights before there's a fatal accident. The accident rate is something like 0.1 per 100,000 flight hours, or per hundred million miles traveled.

But in any case, you can certainly go a long time, or a lot of flights, before there's a fatal accident. Of course, there are so many flights now that current projections show that, in the next few years, there'll probably be a major accident that's widely reported about every week, if travel continues to grow as it is expected to.

So even though they're infrequent, they're going to be very much in the public mind. And it's easy to shake public confidence. So the public knowledge that the accident is going to be investigated, and findings are going to be made, and corrective action taken plays, I think, a big part in the public's confidence in aviation.

DIBIASI: You talked a little bit before about accidents versus incidents. What exactly has to be reported? How do you figure that out?

BERMAN: Well, this is pretty much set up in the definitions that make up our legislation. An accident is something where there was serious injury or major damage. That has to be reported to us. There's also a class of other events that have to be reported to the NTSB, and those involve dual engine failures, problems with the flight control systems, things that you would think of as serious stuff that we want somebody to look into.

We have the ability to look into anything else that we—that we're interested in. And we follow some safety issues and look into some extra things too.

DIBIASI: And what do you do with that information?

BERMAN: First of all, we gather the facts. And that's the starting point for everything. We make the facts public. We release the entire factual record of our investigations. That feeds into not only general public information; that part of our work can also feed into the world of litigation.

Then, we go along and analyze the facts and try to determine the cause of the accident. That part of our work, our analytical work, is protected from the world of litigation. And that's part of what gives us a lot of our independence as an agency. We can go ahead and do our work and find the cause of the accident without too much influence from what's going to happen later. (INAUDIBLE).

DIBIASI: And do you also publish the information when it's an incident, not an accident?

BERMAN: We do publish it. We're currently putting everything on the Web. We reach a finding of fact and cause for most of the events we look at. One of the important things that needs to be known about the Safety Board's work is that we don't stop at the cause.

People always used to talk about how the cause of the accident was pilot error. In fact, pilot or operator errors are involved in more than 80 percent of all of the accidents we look at, and that hasn't changed. It's probably not going to change, because the humans are the ones who are doing most of the stuff, and they have the last chance to reverse the accident.

But we also go quite extensively into what we call "contributing factors." Those are the underlying factors that, in a sense, set up the humans who are riding on the pointy end of the airplane to do their thing. And those allow us to try to get at the root causes of these events. And what we try to do with our findings and causes and contributing factors is to make recommendations for change. We make hundreds of recommendations every year. We track them for fulfillment. We've got no powers to change anything.

But because we do everything in the public eye, we get a very good acceptance rate. And more than 90 percent of our recommendations ultimately get fulfilled. Most of them are to the industry's regulators. So we've got kind of a difficult relationship with the regulator, the FAA. We actually have some oversight over their activities and make recommendations to them for change.

DIBIASI: Now, NTSB used to be part of the Department of Transportation (DOT). But obviously, the powers that be decided that it really had to become an independent agency. Is that right? Let's talk about that a little.

BERMAN: We go way back to when we were part of the Department of Commerce in the 1920s. And then, going up through history, aircraft accidents immediately had a tremendous amount of public interest and following. There were some pretty significant crashes, one that killed Knute Rockne in the 1930s, and one that killed a senator. And it became more apparent that air safety was going to have to be looked at by some kind of a special agency, so one was set up.

And then ultimately, we were part of the Department of Transportation. In the 1970s, there was a large accident, at that time the worst accident that had ever happened, involving a DC-10. And there was some question about the design of the airplane. Well, the design was approved by a part of the Department of Transportation. And some serious questions were raised about the independence of the accident investigation with regard to the regulator and the DOT.

As a result of that, Congress took us out from under the DOT's wing, made us a completely independent agency, and gave us more of an oversight function over the DOT.

DIBIASI: Ben, we'll get back to you in a bit.

But I'd like to move now to Trish Riley, who's the Executive Director of the National Academy for State Health Policy.

Trish, thanks for being here today. Do States have reporting systems like what we just heard about in the aviation industry?

TRISH RILEY, EXECUTIVE DIRECTOR, NATIONAL ACADEMY FOR STATE HEALTH POLICY: Somewhat. States have had a fair amount of experience (INAUDIBLE) nursing home facilities for many, many years. And some States have reporting systems for abductions and abuse and neglect that are related to errors in hospitals.

Several months ago, the National Academy for State Health Policy did a survey of all the States to try to get a handle on just what was going on in reporting adverse events and medical errors, the terms used in the IOM (Institute of Medicine) report. And we limited our look to hospitals.

We found a couple of things. John pointed out the importance of definitions, and we found that States don't define things as medical errors or adverse events. So we just aggregated those things and asked States: Do you report unexpected death, wrong-site surgery, loss of functioning, errors in medications? We asked for a laundry list. And we found from that survey, to which all States and the District of Columbia responded, that 15 States and DC do have mandatory reporting of something like what we would define as an adverse event or a medical error. Five (INAUDIBLE), and those are around (INAUDIBLE); 13 did report to us that they also look at (INAUDIBLE) settings. But we didn't pursue more information on that.

DIBIASI: Now, what are the lessons from the aviation industry (INAUDIBLE) safe?

RILEY: Well, it's interesting to hear that this aviation program has been developed over 24 years. I guess the IOM study is to patient safety what the TWA accident was to aviation. It put the States on notice that we really have to take a harder look. And just as they needed data 24 years ago, we need data now. And it's really very much a developing kind of field.

I'd say that the lessons are maybe three. One is that, as Linda nicely pointed out, you need both a system for learning and a system for accountability. And that translates, for the States, into needing both voluntary and mandatory systems. Mandatory systems afford States accountability to protect patient safety, and voluntary systems are those systems that help the industry itself look at errors and make preventive activities a reality.

I think the other lesson is that you need to examine data in order to change behavior and in order to improve quality. You really need good data systems, and you need to take that data, aggregate it, look at it, find out what trends exist, and figure out how to intervene and do something to really improve quality.

And finally, as I listen to it, it strikes me that successful systems take resources. And States don't now have the kind of resources to really put in place very aggressive programs.

DIBIASI: How do you think the States could translate the aviation model to State-based reporting for medical errors?

RILEY: Well, the first invitation for a translation, I think, is the fact that, if you go by the IOM studies, Lucian Leape, in the first one of these calls, suggested that the IOM numbers are probably smaller than the actual number; they're probably conservative.

Even if they're conservative, the translation is that the number of people who die from medical errors is equivalent to a jet crashing every single day in America. So this is a very serious problem, and I think it's a big wake-up call to the States that we have some responsibility here.
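A rough back-of-the-envelope check, assuming the IOM report's published estimate of 44,000 to 98,000 deaths per year from medical errors (a range not restated in this call), bears that comparison out:

\[
\frac{44{,}000}{365} \approx 120 \text{ deaths per day}, \qquad \frac{98{,}000}{365} \approx 268 \text{ deaths per day},
\]

which is roughly the passenger load of a single mid-size to large jetliner every day.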

And I think there's also a difference. In aviation, accidents, as Ben points out, are on the nightly news. Medical errors are much more dispersed and not necessarily public. As my friend Art Levin from the Consumers Union in New York points out, in medicine it happens one patient at a time. And so it's not as visible and not as well known. It's, of course, an unkind way to say it, but it is a big issue and it's a tough issue. It's a tough nut to crack.

I think mandatory reporting in some ways has become a whipping boy. There are many who charge that mandatory reporting will have a chilling effect, that it will hide errors, that people won't be willing to report. And I guess I have some questions about that. In fact, it seems to me it's not mandatory reporting that hides errors, but maybe the current system.

Ben, at our workshop, told a story that was pretty amazing to me. And it was (and Ben, correct me if I get this wrong), but as I remember, you said that in most airline accidents the pilot is flying, and you made the conjecture (INAUDIBLE) that co-pilots are less willing to criticize the pilot. So when the pilot is in the co-pilot seat, he'll comment on problems and (INAUDIBLE); when it's the other way around, a co-pilot is unwilling to comment on the pilot. If that's the case, that is pretty scary for (INAUDIBLE) and for healthcare, because, if a co-pilot is reluctant to question a pilot as they're about to go into a mountain (INAUDIBLE). (INAUDIBLE) I think it's an issue of professional turf and really changing systems from the inside. That's a big piece of (INAUDIBLE) in the healthcare profession.

DIBIASI: What do we know about what does work at the State level to improve patient care?

RILEY: We don't know much. Again, this is a real developing field in the States. And I think there's even some question about whether mandatory reporting is the place to start. I think it's one piece of the bigger situation.

We're now conducting interviews and site visits in the 15 States that have mandatory reporting. So we'll know more in the fall, and we'll issue a report in the fall. And we invite people to offer up questions that they'd like us to look at, because we'll be putting out a report.

But I think the focus on medical errors has revealed a bigger problem in the States, and that's the lack of oversight generally of hospital quality. States have historically relied on the Joint Commission on Accreditation of Healthcare Organizations; they've deemed it to do most of the oversight of hospitals. And I think it raises very serious questions about whether that's a wise policy, and whether States have been placing strong enough oversight on quality (INAUDIBLE) generally but (INAUDIBLE) mandatory reporting (INAUDIBLE).

DIBIASI: (INAUDIBLE) the first, and we're also going to be opening our phone lines for questions.

Just to get the audience up to speed on how they can ask their questions: you can ask them over the air by pressing the number one key on your touch-tone phone. And if you are in line and feel that your question has been asked and you want to drop out of the queue, you press the pound key on the touch-tone phone to drop out.

You may also send us your question via fax. The fax number is 301-594-0380. Or you can E-mail your questions to info@ahrq.gov.

Before going to the questions, however, I want to say a few words about AHRQ and the User Liaison Program.

The AHRQ mission is to develop and disseminate research-based information that will help clinicians and other healthcare stakeholders make decisions to improve healthcare quality and promote efficiency in the way that healthcare is delivered. The User Liaison Program (ULP) serves as a bridge between researchers and State and local policymakers. We not only take research information to policymakers so they're better informed; we also take the policymakers' questions back to researchers so they're aware of the priorities. And hundreds of State and local officials participate in ULP workshops every year.

As a new addition to the ULP portfolio of products, we hope that today's call and the other calls in this audio teleconference series will provide a forum for discussion between (INAUDIBLE) policy makers and researchers, like (INAUDIBLE) our discussion today. And we certainly appreciate any feedback you have on these teleconferences. You may E-mail your comments to the AHRQ User Liaison Program at info@ahrq.gov.

And our first question, which has been E-mailed from Susan from the County Medical Services program in Sacramento is: Trish, how are States (INAUDIBLE) punitive aspects of reporting?

RILEY: Well, to be frank, until we go into a State, we don't really know. And we're going to go into the States and talk to health commissioners, licensing people, the hospital associations, consumers. But again, these are mandatory reporting systems for adverse events. So it's more like the National Transportation Safety Board. It's definitely more of an accountability model than it is a learning model.

I think there's another area that we need to (INAUDIBLE) around voluntary reporting that's more of a non-punitive, learning kind of environment. So I think we have to keep the two separate and recognize that very little is going on now on voluntary reporting, and I think that's where you get the learning and the non-punitive approach.

States are, however, protecting information. And we know that 13 of the 15 States are protecting information from discovery at some point, either at filing or during investigation. So there's a fair amount of peer review protection and protection from discoverability.

DIBIASI: Let me ask you, Linda, or Ben to comment on how the aviation industry has dealt with the punitive aspect. Have you seen that?

CONNELL: Yes, I'll speak first (INAUDIBLE) get that out of the way. It was recognized early on, after NASA basically developed a program, set up the protocols, and made a proposal to the industry and the rest of the (INAUDIBLE), that the (INAUDIBLE) was to function as a collection point for information and would be voluntary and would be confidential.

However, the administrator of the Federal Aviation Administration at that time said this system will not work if the people out there do not realize that the regulator supports the concept so thoroughly that they are willing to give any reporter to the system limited immunity from any disciplinary or certificate action, because the regulator has the power to do that. So the provisions then evolved, due to the administrator's lead, that any report submitted to the system would have to be submitted within 10 days of the event, and the event would have to be inadvertent and not deliberate on the part of the person involved. They would have to not have any previous competency problems, their licenses and everything had to be up to date, and they were considered to be proficient in the system by the regulator.

Once that was accomplished, they would take this information into account. We provide the reporter a strip showing that they've submitted to the system, in case the FAA is looking into that exact same event to also look at why it happened and what can be done. And they may determine that the person, in an enforcement sense, probably would have had maybe a 30-day suspension of their license, which is not uncommon in aviation.

But if they submitted the information to the ASRS, then it would be considered to be (and these are the legal terms in the document) evidence of a constructive attitude toward safety, in that they were willing to take the step to confide in NASA about the incident. The incident report is protected; (INAUDIBLE) it never goes into the actual judgments of these things. But it shows that the person submitted it, and that's all they needed to do.

Then, the NTSB law judges that look at these cases, as well as the FAA, understand that they cannot take certificate action against that person, because it was probably a human error, something in the interface between humans and the system that created the problem, and we are better off learning from that than punishing for it.

DIBIASI: And is that what (INAUDIBLE) the advantage of the voluntary reporting system because ASRS is entirely voluntary?

CONNELL: It is entirely voluntary.

DIBIASI: OK.

Well, Ben, your comments on this?

BERMAN: I'd like to make a couple of quick observations. One is that the regulator in aviation, the FAA, has a large system of enforcement and penalties. And there's a huge movement that's sweeping that system; and actually it's sweeping it, I think, largely away. Part of it, in the beginning, was the ASRS, but there's a lot more going on.

And one of the biggest potential benefits, I think, of the enforcement system now is that it's just a stick to get people to report their errors to the ASRS and to the other reporting systems that are getting going. So there's a huge movement towards voluntary reporting and almost getting away from the enforcement penalty system of taking people's licenses away for 30 or 60 days.

But I also need to point out that the NTSB is not part of that system. We are not an enforcement system; we are really a learning system. But we're connected to a mandatory reporting system.

And I'd like to challenge all of you out there in the medical world to think about breaking the connection that seems to exist, I think, largely in your industry between a mandatory reporting system and one that brings about enforcement.

Now, sure enough, when the Safety Board learns about an accident, investigates, publicizes, and finds a cause, eventually that factors down to some issues related to litigation; it's a very difficult problem.

But our goal and our effect is to learn and to prevent future accidents.

I think that I'd like to hear what other folks think about that.

DIBIASI: Let's—let me go to Dr. Eisenberg for a second.

Are you—are you on the phone doctor? Hello?

UNIDENTIFIED PARTICIPANT: On his way to (INAUDIBLE).

DIBIASI: And (INAUDIBLE) OK.

Trish, do you want to comment on this?

RILEY: I think, you know, at one level, all reporting is voluntary, especially in medical areas, because we don't really know what the "N" is. We don't really know how many errors occurred and whether people come forward or not.

I'd like to agree with Ben. But I think, for our purposes at a State level, mandatory reporting isn't (INAUDIBLE). At this point, it's the only regulatory enforcement and regulatory kind of action that the States have, absent a different regulatory oversight system for hospitals.

The JCAHO (Joint Commission on Accreditation of Healthcare Organizations), which is a peer review oversight body that the States have deemed to take over that responsibility for them, does have a voluntary sentinel event reporting system. But I don't really know; I think the jury is out about how effective that is in improving patient (INAUDIBLE).

DIBIASI: Let's talk about the States' perception up until now, even up until the IOM report. Are they taking this issue seriously? Have they been?

And does the IOM report make them take it more seriously?

RILEY: I think they're definitely taking the issue of patient (INAUDIBLE) very seriously. I think there is some concern about whether mandatory reporting is the place to begin and how—is that (INAUDIBLE) a place to begin with (INAUDIBLE).

I think one evidence of State interest is the fact that, on very short notice, we got a 100-percent response on a survey. The States were interested, and told us what they were doing in mandatory reporting, and asked questions about what they were interested in knowing more about. We're having a forum with commissioners and legislators in a couple of weeks to talk about the issue and the broad array of what States can do about patient safety as purchasers, which you talked about in the last call, as well as regulators.

And finally, you see a flurry of legislative activity. Although this was a special session in most States, or a short session, where you couldn't introduce very many new pieces of legislation, nevertheless, in 15 States, over 40 pieces of legislation around patient safety were introduced and debated.

DIBIASI: How much of an impact do you think the States can actually have on a Federal level?

And how do they exercise that (INAUDIBLE)?

RILEY: Well, you know, most health reform lately has been generated from the States. We didn't have the (INAUDIBLE) health insurance portability until after virtually all the States had passed insurance reform. We didn't have the Children's Health Insurance Program until more than half the States had children's health plans.

And I suspect, similarly, that the States, as John suggested, can experiment and think about how to do oversight and educate the Federal Government. The Federal Government is, however, on the fast track. And I think the States do not have the resources to do some of this work, and I'm not 100 percent certain about which way to proceed. And clearly, there's a need for standardization and the capacity to share information across State lines.

But in the end, healthcare is local. And healthcare delivery systems are very local. So the States are very well positioned to be able to have that kind of oversight responsibility.

And I think the issue would be: How does that get standardized and shared? How do we make sure that we have the functional equivalent of TWA being able to talk to United about the problems? But I think there is a good invitation for a Federal role to collect the voluntary data that's gathered at the hospital level.

DIBIASI: Everybody has been talking about reporting, reporting, reporting. And that seems to be where everybody's paying attention when it comes to this report.

But if you look at it on the broad screen, what do you think is the biggest challenge? And potentially, you know, the biggest positive movement? Where should we be focusing?

RILEY: Well, I think a couple of things. I think there need to be coalitions and collaborations here. It's too easy to dismiss this and to let the, sort of, traditional lobbying take hold: the hospital association comes forward with its perspective, the medical association with theirs.

And I think, increasingly in the States, you see some coalitions beginning to form, and task forces, to say there's a broad array of responsibilities here. We need to solve this issue of patient safety. How do we do it? And the States do, in the end, always have a responsibility for patient safety and public accountability.

So there's always going to be something equivalent to an FAA rule, I think. But States are going to need to work out with many stakeholders what the proper role is.

DIBIASI: Let me go back to Linda and—or Ben on this, because you obviously have been through some of this.

Is there anything that we can learn? I mean, I'm assuming you've—you've made some of the mistakes. Is there anything we can do to prevent making them?

And how do we get to where we want to go faster and more effectively?

CONNELL: I guess one thing that comes to my mind is—I've been asked to consult to (INAUDIBLE) medical groups for probably the last 2 to 3 years, being asked to present this model and the philosophy of the model.

And I agree with Trish as I look at it. Now, I'm formerly an R.N., so I have some familiarity with medicine after several years working in hospitals. And I see a direct translation of aviation safety kinds of approaches to medicine.

Now, they won't look exactly the same, I'm sure. But the underlying principles and philosophies and the reasons why we do it, I think, can be used to great success in medicine.

I agree with Trish that, at some level, it has to be at least in the voluntary reporting zone. It has to be to an organization that is so, kind of, unconditionally trusted that people will begin to feel safe enough to talk about the errors that they have experienced due to their interface with the system or the dynamic and complex environment that they work in.

I think that we started out in aviation where the FAA, having seen the results of the NTSB investigation (and they're intricately involved in those accident investigations), felt, in all good conscience as a Federal agency, that they should step up to collecting information that could be shared throughout the country. And they could not do it. It did not take them long to figure out that they could not be in the position of collecting this information. Less than 6 months after kind of putting the whole idea out, they had received no information. The industry advised them strongly that it was (INAUDIBLE) impossible.

So, you do have to begin to take this independent kind of look at this. And I think it will be an interesting step for medicine to feel able to report. Being raised in medicine myself, a lot of my training was in how to word reports so that they weren't legally incriminating. We think legally, I think, from the beginning. I don't think it ever influenced the care I gave; I gave the care I felt, as a professional, I must give and had the ability to give. But I know that there was always a specter of legal issues surrounding us. There's going to be a step there that has to be taken in everyone's mind, that it's OK to talk to a confidential system, so that other people can learn from our mistakes or our errors or those things that interface poorly in our own everyday world.

DIBIASI: Linda, let's get—thank you—let's get back to that in just a second. We have a caller on the line who I'd like to get to.

William from the L. A. County Department of Public Health, hello? Hello?

Is William on the air?

OK, we seem to have lost him temporarily.

You talked a little bit about this idea of the mindset and permission, that it's OK, that we're all part of the same system here, and we have to talk about this. There seems to be a fair amount of, you know, changing attitudes and changing behavior when it comes to this. How did the airline industry change pilot behavior?

CONNELL: I had an opportunity to talk to the people who originated the system, who were here on the day that the first report arrived in the mail. And they said there was similar language back then, from the people who felt it would never work. The comments were: you'll never get the captain of a 747 to talk confidentially about any problems that they have.

There was a lot of that kind of discussion. Well, I've—I've already heard that from medicine. Oh, you'll never get a physician to take the time to write a report that confidentially exposes their errors.

Well, that was proved wrong. People were willing to move forward, although gradually. And many people and individuals have since reached out and begun to trust over time. It takes time to develop these systems. But I think the other crucial thing is that you can't begin to succeed at this until you take the first step.

And I'm hoping that, through my direct involvement with the VA and some of their safety efforts, they (INAUDIBLE) to governing agencies, we were able to (INAUDIBLE) an interagency agreement now. And I can (INAUDIBLE) literally try to develop a prototype that could work in medicine.

DIBIASI: And certainly, as Trish pointed out, I mean the IOM report is certainly going to be the catalyst. I mean, it's going to be happening whether (INAUDIBLE) ready or not.

Let me just get to Deborah, who's on the phone from Mass Medicaid.

DEBORAH MORAN, MA MEDICAID: Yes, hi. This is Deborah Moran. I'm the Associate Medical Director at Massachusetts Medicaid.

And just a couple of points, or I guess questions. One has to do with the aviation system as a model: it seems to me that the only platform it has operated from is the Federal level, versus a Federal and State level. And because of the broad, sweeping impact of having powers from the Federal Government only, there's been a certain amount of consistency that you've been able to, I guess, take advantage of. So that's one, I guess, as opposed to every State being (INAUDIBLE) Medicaid, other private health insurers, that kind of thing.

And then the other is the culture in aviation versus the culture of medicine. Aviation is a relatively new enterprise, where medicine is, you know, historic. And the relationship of a patient and doctor is sacrosanct, you know, culturally. And there's something about patients almost never knowing from their physicians what's going on that makes it extremely difficult, if not impossible, to really challenge the system from a sick person's point of view, just from a personal advocacy point of view. And when there's a catastrophe in the air, there is so much hue and cry that the aviation industry knows they have to do something, for their livelihoods depend upon it. And that's not really true in medicine.

DIBIASI: Trish, would you like to comment on this?

RILEY: I'll comment on the first part, the issue of standardization, whether, you know, Federal or State. It's real clear from the preliminary work we did that there is no standard set of definitions on this issue.

But I'm not sure that this necessarily suggests a Federal role. I think the National Association of Insurance Commissioners has a model, for insurance regulation, that has been pretty successful, where they convene working groups of State officials to come up with consensus about what definitions should be and what approaches should be the policy. And that's been extraordinarily successful in a voluntary kind of way with the States, and that may be what's required here, at least as a first step. And from that work, there may then be some more national activities to begin to look at that issue.

The cultural issues, I think, are really challenging because I think they are quite different. And the fundamental difference for me in aviation is I always feel a little bit safer because the pilot gets on the plane with me.

MORAN: You bet, you bet.

RILEY: (INAUDIBLE). So I think that's also where we have to be careful not to reinvent the problems. Frankly, some of the nursing home regulations (INAUDIBLE) simply too regulatory, not sufficiently collaborative, not focused on quality improvement. And we need to learn from those lessons and build collaborative efforts with the medical community to try to figure this one out.

But I'd also argue that we can't just sit back, because I think if there are no State pressures, no sort of State activity, no sort of lever external to this system, then it will become so internalized that we will have no guarantees and no knowledge that there is quality improvement going on.

MORAN: Well, on a State—I mean, part of the State difficulty is that every State's Medicaid structure is very different.

In Massachusetts, we happen to have a very active medical director with a lot of, again, sort of oversight and clinical integrity mandates. Well, that's not necessarily true in all States, some of which do not have medical directors, for example. And this seems to be, you know, a very tough issue.

RILEY: Yeah. And your question invites a real need for States to be able to work collaboratively in ways that they haven't, because the Medicaid agency, as a purchaser, can require a set of activities in hospitals from which it purchases. Well, the health agency as the licenser may have a very different set of activities. So, how those two will relate and whether you can speak with one voice, I think, will be important as we go forward.

MORAN: But of course, from the Medicaid perspective, there's the question in every State of who really owns being a purchaser—who can negotiate for the best for their people, who's responsible for that. That's a cultural issue also.

RILEY: Yep, it sure is.

EISENBERG: This is John Eisenberg. Let me add one thing to this discussion about the States, and that is that States have the authority for licensing physicians and institutions. And that does give them a substantial amount of leverage over change. In addition to which, even though we decry the fact that there are such variations in medical practice, we do recognize that there are some local idiosyncrasies in medical practice. And therefore, there are some merits to asking the States to look at the issue of patient safety as opposed to leaving it to the Federal Government.

In the case of aviation, in most instances, a flight by definition is interstate.

MORAN (?): Exactly.

EISENBERG: And so it would be much more difficult for the States to have this responsibility in that area. But we're going to have to strike the balance between the (INAUDIBLE) responsibilities in the area of healthcare. And frankly, among the Federal agencies, we're going to have to strike a balance because so many different agencies are involved in the healthcare industry.

DIBIASI (?): Dr. Eisenberg, while you're on the phone, let me ask you: Do you have some examples of specific steps and practices that are going to be recommended for hospitals? Or that we think are going to be recommended for hospitals to undertake to reduce medical errors?

EISENBERG: We're relying on the National Quality Forum to tell us what steps are going to be recommended, partly because of the issue we just left—we want to be sure that, when the Federal Government has a set of steps that ought to be followed, they're consistent with the ones being used by the States and by the private sector. And so it will give us more of an ability to have symmetry across the Federal, State, and private sectors.

But let me give you some examples of the ones people are talking about, at least. People are talking about not having concentrated chemicals on the ward, where they might be inadvertently administered to the patient through an intravenous line when they should have been diluted, like potassium chloride. People are talking about (INAUDIBLE) computerized order entry of pharmaceuticals—of drugs. People are talking about bar coding blood transfusions to be sure the blood is being given to the right patient. People are talking about having a system in place for marking the leg that's supposed to be operated on so that the correct leg is operated on. It's really simple stuff like that.

That's what's so painful about this: The solutions are pretty simple, and they're system solutions, yet we aren't doing them.

DIBIASI (?): Right. And these will be mandatory for all hospitals? I mean, once the practices are agreed upon, will hospitals be required to adopt them?

EISENBERG: That's not the plan for Medicare. The plan for Medicare is to require that the hospitals describe the systems that they have in place, and then to be sure that the public has access to that information—not to require the practices themselves.

Now, it may be several years from now that the healthcare community will come together in agreement that there are certain procedures which are just so necessary that they ought to be required.

But at the beginning, we believe it's accountability through information to the purchasers and the patients and their families that's most important.

DIBIASI (?): Why not require them?

EISENBERG: The first reason is that we don't have enough evidence about what works well in this area. The research is all of about 15 or 20 articles, if you look at the really carefully done research on patient safety. Much of it was sponsored by our agency in the mid-1990s. But there hasn't been much of an investment.

If we had a disease that was causing as many deaths—44,000 to 98,000 deaths a year—and as much in expenditures—$29 billion a year—as this problem is, we would have invested a lot more in research than we have. What we've got to do in the next few years is to make that investment and to get some experience on looking at what works and what doesn't. Then (INAUDIBLE) be prepared to make some of these practices required. But I don't think we're ready.

UNIDENTIFIED PARTICIPANT: And you know, on the flip side of that, John, it speaks to why there have to be some external levers to push change. If mandatory reporting has no other value, it seems to me it's to make clear to the industry that there's a seriousness about this—that while we can't impose practice requirements, because we don't know what works, we also can't accept what you just identified as painfully simple and embarrassing kinds of errors that lead to patient death.

So I think the role of the State, as a regulator and as a policymaker, is really critical in this one to keep the pressure on for reform. And not to do the wrong thing, but to say we're not going to be patient forever: This just has to change.

And we've been really focused on doctors, doctors, doctors. But we're also talking about, aren't we, nurses and pharmacists and everybody in the healthcare system.

EISENBERG: We're talking about everybody, including the dietician who gives pork chops to the patient who should be on a low salt diet. We're talking about everybody.

UNIDENTIFIED PARTICIPANT: Which is why this could be somewhat more complicated and extensive than the model.

UNIDENTIFIED PARTICIPANT: By my anecdotal research—which will horrify all the researchers here—but (INAUDIBLE) since working on this issue, it's really interesting: When you begin to talk about medical errors to doctors—and this is true of all the, you know, people I've (INAUDIBLE)—they immediately start to talk about nurses, and what goes on, that nurses did this and nurses did that.

When you talk to nurses, they immediately say: Yes, but the surgeon did it. So there's not an ownership of the issue as a system of care.

DIBIASI: But that also gets back, I think, to what we were talking about earlier about changing the culture and changing the acceptance. And I'm assuming that's what the aviation industry had to do as well, in terms of the pilots, the co-pilots, and everybody from the people on the ground up—is it as inclusive?

Let me ask this to Ben and Linda: When it comes to your system, have you noticed—besides the co-pilot anecdote that Trish talked about earlier, have you noticed a reluctance among certain groups to report and talk and deal with this issue?

CONNELL: I can speak for ASRS, in that it's hard for me to assess it. At 24 years of maturity, I find very few people reticent to talk. The air traffic controllers, of any one group, might be the ones who hesitate the most, because they are employees of the FAA. And they don't know, or don't feel—it's a stronger case to argue with them to build their trust.

DIBIASI: And I would imagine they would be a really tremendous resource of information?

CONNELL: Well, in our—in our view, they're the other 100 percent of anything that happens. And so you need everyone's perspective.

EISENBERG: Let me add something to this. This is John again—Eisenberg. There's a group of reporters in the healthcare system who doesn't exist in the airline industry quite as much, and that is the patient. They also are part of the solution. I mean, very few of us would suggest that we send some passengers up to the cockpit to help out in reducing aviation errors.

But we are definitely saying that the patients have to get into the healthcare cockpit. This is a problem that the patients share with the providers. It's not just those dieticians and nurses and doctors. We've got to get patients and their families involved in reducing errors because they're part of the system. And therefore, they can help to make the system work better and they can do the reporting as well.

DIBIASI: And John, what is your experience, then, with patients in terms of either how knowledgeable or how willing they are to ask the probing questions and then to turn around and report?

EISENBERG: I think they're reluctant for several reasons. I think one of them is that they believe that they would be betraying their physician's trust, or the hospital's trust, if they were to report an error, when, in fact, what they'd be doing—especially in a confidential system—is helping the physician and the hospital.

Secondly, it's an awfully technical field. And I think they sometimes might think that they don't understand what's going on, when the truth is they probably understand better than anybody else.

Part of the effort is to get them engaged by educating them. There's a wonderful study that looked at one aspect of this: educating patients to ask the people who walked into their rooms whether they had washed their hands. And you know, the use of soap in that hospital went up dramatically.

DIBIASI: Well, it does seem from what everybody is saying here today that there is going to have to be a huge education component to this, in terms of getting information out and changing culture.

EISENBERG: And the States can play a big role in that.

CONNELL: And—this is Linda—one thing I would mention—and Ben, I'll let you speak to this—is the effort that we have in aviation for CRM.

BERMAN: CRM is an acronym for crew resource management. We recognize that aviation requires a huge (INAUDIBLE) operating on time and safely. And every member of that team has (INAUDIBLE).

And the point of CRM is to have the people who are in charge, usually the captain, draw in the resources from all over the system and use them effectively, and to make the more junior members—the co-pilots, flight attendants, and sometimes mechanics—feel that they can speak up and correct the errors that the person in charge is making.

And it's something that has been a big effort over the past 15 years or so. A lot of progress has been made. But it's not anything that you can inoculate people with. You can't give them an hour of training or a week of training and say they've mastered the art of managing their resources. It's something that we need to pay attention to all the time and encourage all the time. And we need to evaluate ways in which we can make it work better.

One of the important things we recognize in aviation is that errors are really inevitable. They also occur all the time. If you ask what the accident rate is, it's very, very low, but the error rate is very high. I'll tell you from my own experience as an airline pilot that not a flight goes by that you don't make an error. So the error rate is 100 percent. And the point of crew resource management and a lot of the safety efforts in aviation is to accept that errors occur and then work to minimize the consequences.

DIBIASI: Well, Ben, you've succeeded in worrying everyone on this call when it comes to aviation. Let me just—let me get back to you in a minute because we have a couple of people who have been patiently waiting on the line.

I think we may have William back from L. A. County's Department of Public Health.

Are you there?

WILLIAM HADDOCK, L. A. COUNTY PUBLIC HEALTH: Yes.

DIBIASI: Hi.

HADDOCK: Hi.

I'm a physician, and I'm working with quality assurance and public health. And I've been looking at (INAUDIBLE) therapy for tuberculosis. And we've found that many of the medications look alike and sound alike. It's prompted some brainstorming on our part, but we're not quite sure where to take that from here and where an appropriate venue is to talk about it and get solutions. I know that, on the airline side, that would be similar to the design of aircraft. But I just wanted to see if you might be able to help me address this.

EISENBERG: This is John Eisenberg. The Food and Drug Administration has taken this on—and, in fact, it was part of the report from the Quality Interagency Coordination Task Force. Janet Woodcock, who is the head of that component of the Food and Drug Administration, is working on it right now. And I think what you're going to see very soon is an effort from the FDA to eliminate those sound-alike, look-alike drug names to address the issue that you're describing.

HADDOCK: That will, you know, certainly help us to have patients more involved, because they'll be able to look at the drugs and know the sound of them. And I think that will facilitate that.

EISENBERG (?): There's also a voluntary (INAUDIBLE) medications; Michael Cohen is the Director. It collects information about drugs, and it's a bandwagon they're promoting to sort of clarify the look-alike drugs. It's a good resource for the States and counties.

DIBIASI: OK, our next caller is Carolyn from Maryland.

Hello, is Carolyn there?

Well, we might get to her in a little bit.

Actually, Dr. Eisenberg, I do have a question that might help all of the callers on this call, and that is: How can State agency administrators best keep themselves informed about developments in the patient safety initiative—especially the proposals that are going to affect them most?

EISENBERG: Right. Right now, there isn't a vehicle for them to come together other than the one we're using right now, the User Liaison Program, and certain organizations, like the National Academy for State Health Policy that Trish leads.

We're hoping that, when Congress passes the appropriations bill this year, we'll get some funds for the creation of a (INAUDIBLE) and vehicles for the States to talk to one another, because we do think that one of the roles the Federal Government ought to play is that of a convener, to help the States come together (INAUDIBLE) these kinds of activities.

So if people have ideas about ways in which the States could come together, or ways that a Federal agency like ours could help, we'd certainly like those ideas—in the next few months we're going to be putting those projects together.

DIBIASI: How long do you think this is going to take before, you know, we really get to a point where we're able to effect some change and really help (INAUDIBLE)? Are we talking, you know, years? Are we talking months?

EISENBERG: I think we have effected change already, by making this an issue and by having people discuss it. I am confident that more hospitals are now paying attention to it and more health plans are paying attention to it. When will we eliminate the errors? Never.

And so the question is: When will we get to the point where we have a substantial decrease? We've said that, within 3 years, we think we can have a 50-percent reduction in the errors that are made. And I think that's realistic. But it's already started.

DIBIASI: And do you think in 3 years we're going to have a system that's going to be nationally adopted and standardized?

EISENBERG: I don't think that we're going to have one single system within 3 years. I think that we're going to have State systems in place. We're going to have a coordinated Federal system with the public health agencies, the Health Care Financing Administration, and the other Federal agencies, like the VA. And I think then we'll be able to look at this and say: How much do we want to make uniform? And how much do we want to continue to experiment?

DIBIASI: We're now going to take final comments—a final wrap-up from all of the panelists. And that was actually a good jumping-off point.

Dr. Eisenberg, is there anything you'd like to add just in terms of wrapping up?

And where do you think we are?

And where we're going with this issue?

EISENBERG: Well, let me just say that I apologize for the static, since I'm using a car phone. I'm on my way to Dulles Airport to get on a very safe plane.

DIBIASI: I'm sure you'll be fine.

EISENBERG: And I'm going to be sitting up there in the cockpit, watching everything (INAUDIBLE). No.

I do want to thank the people on the call for their participation, because we are all working through this together to learn from the aviation area. We're also looking at other industries that we can learn from, as well as the aviation industry.

If people come across lessons, then we want to hear about them, because we think that we've all got to share those ideas. And if people would get in touch with me through the AHRQ Web site to let us know about their ideas, about ways in which we can help, I would appreciate that, because I think it's going to be a very exciting prospect for decreasing medical errors and learning from our colleagues in other industries.

DIBIASI: Thank you—Linda.

CONNELL: Yes, I am really so pleased to hear all of you come together to talk about a common issue that is going to improve healthcare in this country. I am proud, as a nurse, to see that there is this much interest in doing it. And I think doing it the right way—using models from other industries, aviation and others, as John mentioned—is the way to do it. And the motivation I hear from everyone is really heartening in terms of solving these ongoing problems. The future looks much brighter.

DIBIASI: Good—Benjamin Berman.

BERMAN: Well, I'm very pleased to be on the program. And I'm kind of the reverse of John. I have a lot of fears and trepidations in my usual experience with healthcare. But I feel very safe doing that, and you should feel safe about flying as well.

I think that the experience of the aviation industry shows that we very much need both voluntary and mandatory reporting systems. And I'd like to ask everybody to think about something that I feel very strongly about, which is that mandatory reporting is really not equal to punishment or enforcement. We need to think about how those two can diverge and how we can get the safety benefits—the significant ones—out of a mandatory reporting system.

I think one of the main issues that both your industry and mine have is really not anything to do with the reporting; it's what you do with the information once it's reported to you. And in that regard, oftentimes, you know, what's revealed is a need for some very important and expensive system changes. It's nothing that a single report from one doctor or one pilot is going to prove the need for, or even reveal. And once that work's done, you can face some very difficult changes. Sometimes, in order to get those changes moving, you need to enlist the support (INAUDIBLE). And you need to do those in order to keep the confidence of the public.

Thanks a lot.

DIBIASI: OK, thank you—Trish Riley.

RILEY: I'd also like to thank everybody. It's been a good discussion.

I guess the challenge is that it's taken the healthcare system several hundred years to develop these behaviors that create these problems. And it's going to be a difficult thing to change those behaviors, and we have to be sanguine about that.

On the other hand, we need to keep the pressure up. This is a very, very serious problem. And States have a huge responsibility; the rubber hits the road for them. They're the ones that are supposed to protect patients' safety. We don't really know how to do that. We know it's a huge problem. And how do you balance educating the public about the severity of this problem without frightening them to death about their healthcare system? I think it places an enormous responsibility on the States, particularly.

DIBIASI: Especially since, in one of our first calls, we mentioned the statistics. When the report came out, more than 50 percent of the media nationwide picked up the story. So the pressure is certainly going to be on from the public point of view.

Well, thank you all for joining us this afternoon. It's really been a wonderful discussion.

And if we didn't get to your question today, or if you want to take us up on the invitation to talk more about this or give us your insights, you can E-mail us your questions or comments at info@ahrq.gov. Depending on the number of questions, we'll try to answer you directly.

We will soon be posting a summary of the medical errors workshop on our Web site. And we also encourage you to send us any researchable questions you're facing at the State or local levels for our consideration, because we're planning the agency's research priorities.

I mentioned at the beginning of the show that we will have copies of the audio teleconference available for purchase several weeks after the series is completed. The cost for this series will be $6. Or for $25, you can order a complete tape set of all sessions recorded at the March 2000 workshop.

To get information about ordering, go to the AHRQ Web site. Again, that's www.ahrq.gov. Then, select State and Local Policymakers.

You may also call the AHRQ Publications Clearinghouse at 1-800-358-9295. Again, that number is 1-800-358-9295. For the workshop audio tapes, ask for AHRQ-00AV11. For the audio conference tapes, which are entitled "How Safe Is Our Healthcare System? What States Can Do to Improve Patient Safety and Reduce Medical Errors," ask for AHRQ-00AV11A.

Also, we will soon be posting on the AHRQ Web site a summary of the March 2000 ULP workshop about medical errors.

We thank you for joining us today. And we hope you found this series of teleconferences useful.

As a final reminder, we would appreciate any feedback you may have about this AHRQ teleconference series by E-mailing comments to info@ahrq.gov.

We hope you'll join us for future AHRQ activities.

Have a good day.


Internet Citation:

A System's Approach to Improving Safety: Translating Lessons from the Aviation Industry. Transcript of part three of the series of National Audio Teleconferences sponsored by the User Liaison Program, Agency for Healthcare Research and Quality, conducted on June 5, 2000. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/news/ulp/trulp605.htm

