U.S. Department of Health and Human Services
Agency for Healthcare Research and Quality


Introduction to the QI Learning Institute

This is the transcript of a Webinar titled Introduction to the QI Learning Institute that took place on July 30, 2008.

Select to access the PowerPoint® presentation (950 KB).

Dr. Irene Fraser: Good morning, everybody. This is Irene Fraser, and I wanted to welcome everybody to the first step in what I think is going to be a really exciting collaboration among all of you and us on using the Quality Indicators.

Before we get started, let me just mention that there should be captioning available soon. We're having a little bit of technical difficulty getting that up, but that should be up shortly, and there also is going to be a full transcript later.

I was looking through the list of people who are participating and noticed quite a few familiar names from some of our work with the State hospital associations, State data organizations, coalitions, and others. So for those of you who are familiar with AHRQ [Agency for Healthcare Research and Quality] and have worked with us in the past, welcome; it is good to be working with you again.

For those who are less familiar with AHRQ, I will just mention that we are a small Federal agency that is part of the Department of Health and Human Services with a not small mission, which is to improve the safety and effectiveness and efficiency and quality of health care for all Americans, a rather large mandate. One of the many ways in which we do this is through the development and implementation of quality measures, and measures of patient experience of care, and to a growing extent measures for efficiency, and through data to populate these measures.

The Quality Indicators are one of the tools that we have developed in this way.  Actually we developed these measures quite a few years ago, in fact, in response to folks like yourselves, data organizations and hospital associations, who had been collecting hospital discharge data and were interested in working with us on ways to use those data for purposes of quality improvement, and that was the genesis of the Quality Indicators. They have now gone on to have a lot of use, including in the last several years a growing use for public reporting at the hospital level. There have been State data organizations and hospital associations and communities and others that have used them for that effort.  As a result of that effort there have been increased requests to us for support in that endeavor, and that's what we're really about here.

So I am going to start by providing a little bit of background and I will turn it over to others on the team who will fill you in a little bit more about the purpose of the Learning Institute, give you some background in terms of the NQF [National Quality Forum] endorsement status and CMS [Centers for Medicare & Medicaid Services] use of the Quality Indicators.  Then we will talk a little bit further about developments of the Learning Institute, and hopefully we'll have a good bit of time for questions and feedback because we would like the whole process to be a continuing dialog.

By way of background, I mentioned the Quality Indicators were developed largely in response to the desires of States and communities. What we eventually did was formalize this process; we had kind of a home-grown version that we were using for several years. However, as the topic of quality, and particularly quality reporting, grew in national prominence, we went back through and did a significant fine-tuning of the Quality Indicators through a contract with the UCSF [University of California-San Francisco]-Stanford Evidence-based Practice Center.

What all of the Indicators have in common is that they use hospital discharge data based on readily available data elements, and we have incorporated severity adjustments, comorbidity groupings, etc., into the Inpatient Quality Indicators. Our goal originally, as I mentioned, was to use them for quality improvement and national tracking, but there has been a growing use for public reporting and pay-for-performance. There are several modules to the Quality Indicators. The original ones were the Prevention Quality Indicators (PQIs), the Patient Safety Indicators (PSIs), and the Inpatient Quality Indicators (IQIs). The Prevention Quality Indicators we won't be talking about on this Web conference because they are area-level measures that aren't really measures of hospital performance. They're, in a sense, windows on the community. A more recent module is the Neonatal Quality Indicators, and then there are also the Pediatric measures, which were really pulled together from the Inpatient, Patient Safety, and Prevention Quality Indicators.

A couple of years ago we held a user meeting—and I know several of you were at that meeting—where there was some early sharing, particularly by those who were using the measures for public reporting, but also some of the ones using them for quality improvement.  And there was a big focus on how you guys were using them and what kind of issues you were having.  This was something that Mamatha Pancholi put together and led, and it was a really exciting session.  You will be hearing more from Mamatha in a moment.  One of the things that came out of that meeting was a very strong desire for a mechanism for networking and education around use of the AHRQ Quality Indicators for public reporting.  We set up some of that through our support center, but it seemed the time was really ripe to do something in a more systematic fashion.

At this point, to our knowledge, there are 12 States that are using the Quality Indicators for public hospital reporting. We may find out on the call from some of you that this has expanded. The main way we find out is from you all. My sense, having talked to several States that are not highlighted here, is that many of them were waiting for the National Quality Forum to go through its endorsement process before launching use of them, and so my guess is the next time we meet this map will be different.

So now I want to turn it over to Mamatha Pancholi, who is going to fill you in a little bit more on the purpose of the Learning Institute that we created.

Mamatha Pancholi: Hello. I would like to also add my welcome. As the Program Officer for the Quality Indicators, I actually get to hear from many of you on how you're using the Quality Indicators for public reporting and quality improvement and on many other issues that arise. So I find this to be a unique opportunity to bring together many different users from the user community to talk about how the Quality Indicators could be used in your own public reporting efforts.

Recently—and my colleague Marybeth Farquhar will talk about this in more detail—the National Quality Forum has endorsed several measures within the AHRQ Quality Indicators set, so we thought this would be a good opportunity and good timing for us to gather and discuss how best you could use that information to support reporting efforts within your own State. So as we were thinking about putting together the Learning Institute, we really wanted to think about our target audience. From past experience, what we've seen is that, for the most part, many reporting efforts are led by State agencies, various task forces within agencies, business coalitions, and State hospital associations. When thinking about how to conduct this Webinar and the series of events, we thought these were the groups we would be focusing on. There may be others, but what we are trying to do is focus our efforts on individual organizations and agencies that either have a mandate to do public reporting or are in some way actively engaged in that effort.

To do this we've asked several of our colleagues who are engaged in various activities with public reporting or the Quality Indicators to help us think about how best to plan for these events. So we've asked our colleagues Dale Bratzler, Denise Love, Kim Streit, Shoshanna Sofaer, and Jeff Geppert.  You can see their affiliations on the screen there.  These folks have got a lot to offer us in terms of understanding the Quality Indicators and understanding the world out there in terms of public reporting.  So we are looking to them to help us create a format, think about discussion, and help us reflect on what is going on out there in the real world. 

Here at AHRQ and AcademyHealth, my colleagues Irene Fraser and Marybeth Farquhar and I have put a lot of thought into the need for this type of forum. My colleagues Marjorie Shofer and Maggie Rutherford are also helping us think about the Webinar series and the Learning Institute overall. Our colleague at AcademyHealth, Katherine Griffith, who I am sure some of you have already talked to in some capacity, is available for assistance as well.

The Learning Institute, as we envision it, is a series of activities over the course of a year that's going to focus on various topics, including measure selection, reporting formats, and so forth, and we see that being conducted through a variety of methods. Webinars are one, although we do envision an in-person meeting sometime in the near future. In particular, I will mention the AHRQ Annual Meeting is upcoming in September, and I hope to see many of you there as well. We have the capability of conducting online discussions, live chats, and we have a Web site that we're hoping many of you will join, and have that as another forum for discussion as well.

One of the things that we get a lot of questions about is a tool that we call the AHRQ Quality Indicator Model Report. This is an effort we embarked upon several years ago in response to the fact there were many States who were actively thinking about using and engaging the Quality Indicators for public reporting, but really didn't have any standardized way of doing so. So we contracted with Shoshanna Sofaer to create a standard model report using the Quality Indicators. We will be highlighting the two Model Reports as part of this Webinar series. We wanted to raise it as a particular item here because there are obviously many issues that we could be discussing within this Institute. That is just one.

If you look at the next slide, there are several possible topics we raised on our own that we thought would be useful to those of you interested in public reporting. Now, this is just a quick list that came off the top of our heads. We're actually interested in knowing, among these topics and others, what you would like us to consider addressing during the course of the year. I believe there should be at this time a little poll that pops up toward the right of your screen. There we go, and if you could take just a few seconds and highlight for us any or all of the topics that are of interest to you. I believe at the bottom of the screen there is a text box; there you could enter something that we may not have thought about but would be of help to you to address over the course of the year. This will actually assist us as we move forward in planning future Webinars and activities to facilitate this information sharing. We want to make sure this Learning Institute really is as helpful to you as possible. If you could take a few seconds to fill out the poll, that would be great. Thank you. Please fill out the top three in terms of priority, and we'll capture that information and, depending on timing, may actually be able to communicate it back to you before the end of the session.

Okay. What I would like to do now is introduce my colleague Marybeth Farquhar. Dr. Farquhar has spent a lot of her time recently with the National Quality Forum efforts on behalf of AHRQ, so I will turn it over to her to give you a quick update on where the Quality Indicators stand with the National Quality Forum and how CMS is using our measures as well.

Marybeth Farquhar: Good afternoon. Welcome. I do spend a lot of time with the National Quality Forum. We initiated this project in July of 2006. It was supposed to end July 31st of 2008; however, we are extending it to December 31st. But the majority of the work, at least the measure work, has been completed. What we ended up doing was selecting certain Indicators to submit to the National Quality Forum under their Hospital Care: Additional Priorities consensus standards project, and we also did an open call for other measures that could potentially be used for hospital care. We selected about 34 of the Indicators to submit to NQF. We developed four composite measures that we also submitted, and we also submitted the model report, or template, as we call it. We also have other submissions that are ongoing through the process: the perinatal measures, surgical measures, and any other project that has a call for measures. Okay. Next slide.

This project was pretty extensive. It had a lot of moving parts. Next slide, please. It was pretty complicated. We started out with five technical advisory panels, which are called TAPs. Eventually that was whittled down to four technical advisory panels and one overall steering committee, and then an additional steering committee was developed for the composite measures. The surgery and anesthesia TAP advanced 11 of the 16 measures that we submitted to them; the patient safety TAP advanced 13 of the 14 measures; and the pediatric TAP advanced three of the four measures, and I will talk a little bit more about some of the other things that are going on with regard to that. The next slide, please, shows the NQF-endorsed QIs. The ones in yellow or white are the Quality Indicators. The ones in black are the additional ones that were also endorsed during this extensive 2-year project.

What I want to call your attention to is that we ended up doing some revisions and refinements of the Indicators based on the TAPs' input and the comments we got back from the open comment period. One thing that we ended up doing was changing PSI-4, which used to be called failure to rescue. That was the name that Jeff Silber coined, and he felt that his measure should be called that and not our measure, because ours was essentially different, so we renamed it "death among surgical inpatients with serious reportable or treatable complications." This measure was also harmonized with the current NQF "failure to rescue" measure, Jack Needleman's measure, so that's one significant change that went on.

A few other significant changes (if you want to page through to the next slide): we also have changes in patient safety/pediatrics and just pediatrics. One of the other changes that we made was to convert Indicators that have a small sample size into basically counts. Essentially, there isn't a denominator any longer, so that's one point that we had to clarify, and that was the recommendation, a pretty strong one too, from the TAPs as well as from the steering committee. So that's how we have reported and refined that Indicator. Next one, please. And these are the other Quality Indicators that were also approved at the time. It is just for your information. All of the AHRQ QIs are available through the AHRQ Web site. The other ones are available through the various sponsors, and I believe you're going to be able to get through the NQF Web site to places like Leapfrog, the Joint Commission, etc., the others that have submitted those measures.

The composite measures: We ended up submitting four composite measures, and you can see them there, derived from the Quality Indicators themselves. We have an overall patient safety one, one for pediatrics, and one for adults. Right now the composite measures steering committee, which is looking at both hospital and ambulatory composite measures, is developing a framework, and they're still vetting the framework itself among the steering committee members. What's holding it up, I believe, is basically whether we should have an all-or-none phenomenon, meaning that every measure gets into the composite or doesn't. (You don't get partial credit, shall I say.) Weighting is an issue that they're getting into, and what constitutes a composite measure. What will come out of this committee is basically a framework that will be vetted through the NQF process, and once that framework is vetted, then they're going to look at the AHRQ Quality Composite Measures and determine whether they receive endorsement or not. So that's the juncture we're at. I expect they'll be completing their work shortly. They were supposed to have their work done by the end of July, but I think they will go a bit into August and basically will have the framework out for people to take a look at and comment on.

We also submitted the AHRQ template for the Quality Indicators, as Mamatha had said. Shoshanna Sofaer and her colleagues at Baruch College (The City University of New York) worked on this pretty extensively and did a number of focus groups and literature searches and did some consumer testing with regard to the language that is in the report. If you have seen the report, it basically has a sponsor guide; it is a Web-based type of medium, and people can print out portions of the report to have a paper version as well. The sponsor guide instructs people on how to construct their Web sites, and then we have two templates. One basically has the hospital topics and is a little more consumer friendly. It looks at things like cardiac care, care for surgical procedures, those kinds of things. So consumers can go and quickly find what is of interest to them, select the measures that they want to see, and scroll down to see the various measures for each of the topics.

Shoshanna did testing for things such as "iatrogenic pneumothorax," which does not resonate at all with consumers, but "collapsed lung" does.  So those are the types of things they did.  She looked at graphic displays as well, and basically the staging and tiering of the hospitals themselves, so that will be available shortly on our Web site.

The other template that we're looking at is the Composite Measure Template. It is basically the composite measures we have developed through our expert panels and through a lot of volunteer work from folks out there who are knowledgeable in composite development. Right now, what the NQF technical advisory panel on model reporting is doing is developing some guidance for people who want to do public reporting. The reason it is not an endorsed framework is that right now the field is pretty new, so basically they didn't want to lock into any particular wording or way of doing things. So that, I believe, is coming out very shortly. That actually should be out by the end of the week, from what I understand. Basically it is a guidance document with several guidelines that are enumerated, and what they have done for AHRQ is to look at the guidelines, compare them to what the AHRQ report has in it, and determine whether it met the criteria, didn't meet the criteria, or partially met the criteria. And that is where we are with that. What will be coming out on our Web site is basically a revision to the original guidance document for public reporting and pay for performance.

What we have planned is to give folks a little more direction on which measures should be used for public reporting. What we ended up doing as part of the NQF process was tiering the Quality Indicators based on the evidence, based on how well each Indicator captured what it was supposed to capture, and we included that tiering in the new guidance. Also included in there will be copies of the sponsor guide and the two templates. One issue that came up quite frequently during the NQF endorsement process was why we didn't submit all of the Quality Indicators, and the reason is that once the NQF has endorsed a measure, particularly one on CABG [coronary artery bypass graft] or AMI [acute myocardial infarction] or something to that effect, they will not endorse another. So that narrowed what we could submit, and based on what was left, we ended up submitting the additional measures.

Next slide, please. As some of you may know, with regard to the new hospital market basket update, CMS intends to use the nine Quality Indicators that are listed right there for their public reporting initiative. We're talking with CMS to help us obtain more information about the Quality Indicators as they come into use, and based on that we'll be doing some refinements and additional validation work. CMS is also suggesting that we do a dry run with these Indicators, which is what they usually do with measures before they go on the Hospital Compare Web site.

At this time I would like to turn it over to Kim Streit and to Sylvia Cook. Kim Streit is the Vice President for Healthcare Research and Information at the Florida Hospital Association, and Sylvia Cook is the Team Leader for the Texas Health Care Information Collection, Center for Health Statistics, Texas Department of State Health Services. Kim.

Kim Streit: Thank you, Marybeth. I would like to commend AHRQ and AcademyHealth for developing this Learning Institute. It would have come in very handy for us a couple of years ago, and I know it is going to benefit us as we develop future enhancements to the Florida Web site. Just to give you background on some of the issues we faced in Florida: in 2004 our State legislature passed transparency legislation requiring the State agency, the Agency for Health Care Administration, to develop a consumer Web site to help consumers with their health care decisionmaking. Initially the time frames that were given to our State agency were almost, I would say, impossible. They gave them less than a year to start from scratch to create a Web site that included information in areas where the State hadn't really done a lot of work. They had a Web site that showed volumes, charges, and average length of stay, but the legislation was very, very specific and said that this Web site had to be consumer friendly and had to include information on infection rates, mortality rates, complication rates, and readmission rates. But they also put parameters on the State, saying that you've got to use nationally accepted measures and methodologies, that it must be risk adjusted, and that the State needed to minimize additional reporting requirements on the hospitals. AHRQ was also specifically named in the legislation, for the State to look at what AHRQ was doing in terms of quality reporting.

So what the State did is they created several work groups, technical work groups, to quickly respond to this, and Florida Hospital Association was very involved with the Agency's implementation of the Web site because our members support transparency, and we wanted to work with the State to get the consumer Web site going as quickly as possible.  With the technical work group, we ran into a bunch of different roadblocks with trying to decide which measures really are appropriate for public reporting. We read everything on the AHRQ Web site. The AHRQ QI team was absolutely fantastic and extremely helpful, but there were a lot of questions that just came up on an ad hoc basis as we were trying to identify how we were going to post stuff on the Web site.

One of our challenges was, How many cases did a hospital need to have for a QI or patient safety measure to be valid, and how few cases would skew the data? We struggled with how to display the data in a format that consumers could use and understand that also fit within the parameters of a Web site. We tried to figure out, Do we show the actual data, the expected values, the observed values, the rates? Do we include the numerators and denominators, or do we just show better than expected, as expected, or worse than expected? Ultimately we want consumers to use the data, and so we sat in work groups, and the Agency staff worked behind the scenes to come up with some verbiage for the work group to use to help consumers understand it, and so all of this was basically starting at square one.

Then another issue we faced was, How do we educate the hospitals on the Quality Indicators and the Patient Safety Indicators? At the time that the State started looking at this, it was new to our hospitals. They had never heard of the AHRQ QIs or PSIs, and so that was another challenge we faced. Then there were the challenges with administrative data. Through the process we found that, as we started using the data and the hospitals looked closely at it, there were some reporting inconsistencies, with the hospitals interpreting the data differently. What we did do is look at the other States and organizations that had published data before us, but they didn't answer all the questions that we were facing in Florida. So we had a lot of discussion, and I would say debate, at the work group level, and our work groups were composed of not just Agency staff and hospitals; we had health plans at the table. We had the business community at the table, and AARP, and they were not necessarily knowledgeable about hospital administrative data or the AHRQ measures, and so there was some inherent lack of trust. As we would bring up a valid issue, in some cases we were distrusted by the others on the work group in terms of whether it was valid or not. Being able to go to the AHRQ QI team was extremely helpful in justifying the decisions that were being proposed within the work group. I think the biggest thing that would have helped us is the opportunity to discuss and share among others who are trying to do the same thing; it would have made our task easier, faster, and probably less painful. Our State and the State agency staff—and many of you all might know Beth Eastman—put their heart and soul into this.
There was a lot of starting from the beginning, and I think that this type of Learning Institute is going to help all of us, even those with Web sites up, or those trying to get Web sites up. I think it is going to help us develop more thorough consumer Web sites.  That's our Florida perspective. So I guess I will turn it over to Sylvia to talk about the State agency perspective.

Sylvia Cook: Thanks, Kim. This is Sylvia Cook with the Texas Department of State Health Services. We collect inpatient discharge data from most hospitals in the State, and we have released discharge data since 1999. One of the first reports that we published was by hospital referral region on the utilization of procedures, using the first HCUP [Healthcare Cost and Utilization Project] Quality Indicators. We have a statutory requirement to publish reports on the quality of care in Texas hospitals, and the methodology that we use is required by statute to include risk adjustment. The problem that we faced when we started looking at the possibility of doing a hospital report was that we have a very small staff, and we also knew that any methodology developed in-house would be criticized by our stakeholders.

So as we were working through that, we became aware of the next version of the Quality Indicators that AHRQ was developing, the IQIs. That methodology included risk adjustment, which met our statutory requirement. It also used APR DRGs [all patient refined diagnosis related groups], which we already had in our data. So our statutory requirement was met, the methodology was developed by a reputable third party, and that helped to allay fears from the hospitals and our stakeholders. As we worked through that process, we worked with technical advisory committees that were also required by statute. In particular, we worked with a committee composed primarily of medical professionals, who reviewed all of the measures and made some recommendations on the release. We also worked with a technical advisory committee composed primarily of hospital public relations personnel, and that committee helped to develop the user's guide that was released with the report. We were aware that AHRQ recommended that the IQIs not be used for public reporting on hospitals, but the methodology allowed us to publicly report on hospitals.

The first report that we released was for 2000 data, and it was released in October of 2002; since then the report has been released annually. One of the important lessons that we learned in that process was that working with stakeholders helps to make the report a better report. That was certainly not an easy process, but it helped to make our report better. We have reported on the Pediatric Heart Surgery Indicators since that first report in 2002, but there were continuing discussions with children's hospitals on the need for better reporting on children's hospitals. We began more organized discussions with those hospitals in November of 2005, and the goal was to find more meaningful reporting on children's hospitals. These discussions coincided with AHRQ's development of the Pediatric Quality Indicators, and remembering those lessons we learned from our first release of the IQI report, we worked with the children's hospitals. We also had a methodology in place that was made available to us by NACHRI (National Association of Children's Hospitals and Related Institutions) to use their chart review tool, so that each hospital could do chart reviews for cases that were included in each one of the measures. The goal for this was to find measures that were meaningful and the least affected by our not collecting present on admission indicators. This collaborative selected six indicators appropriate for public release. Hospital staff helped to develop a user's guide that was based on the IQI guide released with the report, and the Indicators were reviewed by the hospitals in the summer of 2007. We also have a statutory requirement to allow hospitals to review and comment on the report before it is publicly released, and during that review one hospital discovered that it had incorrectly coded cases that affected one of the Indicators, so that one Indicator was not released.
We expect to release it later after the hospitals have had an opportunity to correct their data.

The first report, for 2005, was released in December of 2007, and we expect the second report, for 2006, to be released in August; this process was recently described in an article in the Journal for Healthcare Quality. We also use the Prevention Quality Indicators, and the public health community is particularly interested in those. We do not use the Patient Safety Indicators, and we don't plan to until we're able to collect present on admission indicators. We're willing to share what we learned: we have shared our user's guide, our methodology for producing bar charts, and our query tool for searching our IQI database. We shared all of those with other organizations, and we recognize that the situation in each State is unique, but we can still learn from each other. We're always looking for better ways to do things and better ways to communicate information to the public, and we see the Learning Institute as a potential source for those tools and the expertise that we think we need. Thanks.

Marybeth Farquhar: Thank you, Kim. Thank you, Sylvia. We appreciate it. It sounds like you both have a lot of experience that you're willing to share with us, which we really appreciate, and in working with both of you in the past, you've provided a lot of input that has helped us refine the Quality Indicators.

Right now, Greg, can you show the polling results of the previous poll, please? We're supposed to have results up here. I don't see them yet.

Well, why don't we go ahead and move on to the next slide, please. There is another poll; hopefully we'll have the results of the first poll, about the topics, before you sign off. For this poll, three polling questions will pop up on the right of the screen regarding State public reporting initiatives and requirements. If you could fill those out briefly for us, that would be really helpful.

While you do that (next slide, please), we're going to go ahead and move on to some questions. If you have questions, you can either write them in the text box or signal the operator, who will put you through live. So the floor is now open for questions.

Operator: (Operator Instructions) Any questions about the Institute?

Marybeth Farquhar: Okay. Does anybody have any questions about what we're proposing for the Learning Institute, particularly whether it would meet your needs and be beneficial for you to participate in? I am not sure from the topics slide whether we covered all of your issues, so it would be good if you could speak to the issues and topics that might be addressed through Webinars, seminars, papers, or other tools you could use for public reporting. And we do have a question.

Mamatha Pancholi: So we do have a question regarding present on admission. The question is, Do you see re-doing the QIs with present on admission?

To some extent we've already done that. Based on analytic work, the measures have incorporated present on admission where the evidence shows it makes a real difference. We are continually looking at that data element, and insofar as there are data organizations out there actually collecting it, we will be incorporating it in the future. I do think that's where we're going. A lot of this depends on AHRQ resources as well, so where possible, yes, we will do our best to incorporate the data element as appropriate.

Marybeth Farquhar: A lot of the NQF measures also require present on admission. Right now POA is available as a toggle, but not for those NQF-endorsed measures that require that element. Also, the QI team did work with California and New York data on present on admission, and based on the results of that study we did some refining of the Indicators. Another question?

Mamatha Pancholi: Yes. There is another question regarding how often we'll be holding training events. Our plan right now is to hold them monthly over the course of a year. There may be stretches where we have just one or two, or we may hold them even more frequently; it really depends on demand. And the medium may not always be a Webinar; it may be a chat room on a particular topic or some other format. We're hoping to use technology to our benefit in this area. So our plan is to do something monthly, but my guess is we'll end up doing something more frequently.

Marybeth Farquhar: Okay. Any other questions? By the phone? There is one. Okay.

Mamatha Pancholi: There is a question: Will we be providing Webinars on data validation processes and how to ensure data are accurately reported publicly? I believe that was one of the topics in the polling question, and it was an idea we raised as well; data quality is certainly an important part of a reporting strategy. So we are interested in having that discussion, and we would hope it becomes a future Webinar topic.

Marybeth Farquhar: Or part of another Webinar. We could do that as well.

Mamatha Pancholi: We have another question. Given the collaborative nature of the QI efforts, who do you see leading and maintaining the QI Learning Institute effort?

To the extent AHRQ is able and appropriately funded, we do see that as our role, given that we support and maintain the Quality Indicators. One of our objectives is to facilitate accurate use of the QIs and to provide as much information as possible to support that effort, so I do see this as an AHRQ effort. At the moment I can only formally commit to the next year of the Institute; a second year would depend on demand and on how this Institute is received.

This question might be for Sylvia and Kim. Do Texas and Florida require POA [present on admission] on non-Medicare admissions? If so, is it through State regulations or statutes? Kim or Sylvia, would one of you be able to respond, please?

Kim Streit: In Florida, the State started requiring hospitals to report present on admission for discharges after January 1, 2007, so we do have POA. We have looked at the AHRQ measures, and there is a POA component, and as a result the State pulled down three of the Patient Safety measures. So we have four quarters of present on admission data. We are collecting it, and it was done through the rule that requires hospitals to report inpatient data.

Sylvia Cook: This is Sylvia from Texas. We currently do not collect present on admission. It is our intention to do that, and it will require a change in our rules.

Marybeth Farquhar: Okay. Thank you, Kim and Sylvia. We know that there are several States that already have reporting programs up and running that are on the call. Do any of you have any comments about your experiences and your hopes for the Learning Institute?

We have a couple more questions. In the meantime, for those of you who already have reporting projects up and running, if you could respond, that would be great and very helpful for us. Go ahead.

Mamatha Pancholi: One question goes back to the focus of the Institute itself: Is it focused solely on public reporting issues, or on issues for all users? As we envision the Learning Institute, and given the timing of the NQF endorsement and the fact that CMS will be using nine of the Quality Indicators as well, we would like to focus the Institute on public reporting. So it is our intent to really focus on that aspect of use.

There were a couple of questions about getting more detailed information on what will be covered over the life of the Institute, and truthfully that is completely dependent on you. As we go through the information from the polls, we will assign Webinars and other activities to the different topics to make sure the information gets conveyed. Our goal is to make sure there is a dialog on those topics. What I learned from the first QI users meeting is that a lot of people had very similar issues but were working in silos, and this is one way of trying to break down those silos. As we identify the topics, we hope to bring all of you together around a question or topic, and we will offer experts in the field. My guess is we'll have some technical discussions around risk adjustment or validation; our Project Director, Jeff Geppert, will participate in those discussions, and folks like Shoshanna Sofaer may do one or two Webinars on the public reporting model as well. We'll have a variety of people with experience and information to bring to bear to facilitate the use of the Quality Indicators, and my guess is that the issues you have had, or will have, are quite similar.

Irene Fraser: Just to add to the question about the Institute's focus: the focus will be on public reporting, because we realize that's a big preoccupation right now and a major new application. We do have a support network and a support system that will continue to support all the other uses as well. We recognize there are many other uses, including quality improvement within hospitals, and we'll still support that effort, but through different means.

Marybeth Farquhar: More questions, Mamatha?

Mamatha Pancholi: A very basic one: Is there a cost for participating in the Learning Institute? The answer is a resounding no. This is a free public service, so we hope that organizations are able to take advantage of it. We know there are limited resources out there, and this is our way of trying to help alleviate some of that.

Marybeth Farquhar: Okay. Another question?

Mamatha Pancholi: There are several questions of a more technical nature, and I am not going to take time out of this call to address them, but I will make sure they get responded to individually. If you sent in a question and I have not responded to it on the call, please know we will respond directly to you via E-mail.

Marybeth Farquhar: Okay. I am just told that you'll see the results of the first poll in the chat box shortly indicating the top three issues that were selected by the group, so if there are no more questions, we're coming to closure here. Do we have one more?

Mamatha Pancholi: There is one more; I think it was a question for Sylvia, about present on admission. It is not a national requirement, and this user would like to understand why Texas is not doing this. Oh, I am sorry, I think that question was already answered.

Marybeth Farquhar: Okay.

Mamatha Pancholi: I think we're actually good. I guess I would like to thank everyone for participating.  

Marybeth Farquhar: Next slide, please. Here are a couple of items about staying connected to the Learning Institute. We have a Web site, and we send an E-mail update with instructions and related information. We need you to sign up by September 30th so we can determine how many folks will be participating.

Okay. Then we have the AHRQ Annual Meeting, September 7th through 10th. That also requires registration, so at the least let us know; the AHRQ Annual Meeting is free. The Web site is listed there for you. The Learning Network lunch will take place on September 9th during the Annual Meeting, and we'll also have a QI Users meeting on September 10th from 8 a.m. to 1 p.m. Next slide, please. Please take a minute to complete the feedback form that will pop up after you sign off, covering your experiences with this Webinar and your suggestions for the Learning Institute.

Mamatha Pancholi: Just one quick note before we leave the topic of the survey questions and polling. From the poll, it looks like sessions on selecting measures, integrating QI ratings with other hospital data, correlation between Quality Indicators, and risk adjustment made the top tier. So my guess is that, as we move forward in planning these events, we will focus some of them on those particular topics. We look forward to having you participate.

Marybeth Farquhar: Okay. Thanks. There is more information there: the QI Learning Institute Web site and its link, the E-mail address for the Learning Institute, and the QI Web site, which I am sure many of you are already familiar with.

At this point I think that we can bring our time together to a close. Irene, did you want to say anything in closing?

Irene Fraser: No, just to thank everybody again. This is really, I think, going to be an exciting venture we're going to do together.

Marybeth Farquhar: We look forward to hearing from you. Thank you for your participation and your attention.

Operator: Thank you for your participation in today's conference. This concludes the presentation. You may now disconnect. Good day. [Event concluded.]

Current as of August 2008

Internet Citation:

Introduction to the QI Learning Institute. Webinar transcript. August 2008. Agency for Healthcare Research and Quality, Rockville, MD.
