[This Transcript is Unedited]

Department of Health and Human Services

National Committee on Vital and Health Statistics

Workgroup on Quality

Working Session

November 29, 2000

Hubert H. Humphrey Building
200 Independence Avenue, S.W.
Washington, D.C.

Proceedings By:
CASET Associates, Ltd.
10201 Lee Highway, Suite 160
Fairfax, Virginia 22030
(703)352-0091

PARTICIPANTS:

Workgroup:

Staff:


P R O C E E D I N G S (8:05 a.m.)

MS. COLTIN: So what I had hoped to do today is think about what gaps exist, and what we have heard to date, and the charge that we have set for ourselves, and then think about future panels that we might want to have in order to fill those gaps.

Given that the February meeting schedule seems extremely tight, I also wanted to raise the notion of if there are significant gaps and significant numbers of perspectives that we want to hear from and include in our recommendations, that we consider having perhaps a one day meeting, maybe in late March or early April, or sometime around that time frame, between the February meeting and the June meeting, to have a series of panels, and cover those areas.

So that in June we could actually try to lay out what the report would look like in some depth, and engage someone to go through the testimony and organize the materials for us. I would actually like to get started on that process before the June meeting if possible. So if we do have a one day meeting, I would probably want to reserve part of it to discuss a draft outline for that report that we would get someone started on.

I am scheduled actually to rotate off the committee, as is Barbara, at the end of June. So I would really prefer if we could actually have a draft to review in June. And there seems to, most of the time, be a delay in replacing people, so we might end up being around for one more meeting after that anyway, but I don't want to set the schedule assuming that that will be the case.

So anyway, in the full meeting books was a copy of our charge and our work plan. And I went back to it last night just to say, where have we fulfilled what we set out for ourselves, and where have we not. And there are a couple of gaps. So one of the first things I want to do is see whether there are gaps that we want to plug, or whether we want to just change the work plan, and say, no, we're not going to do that.

So the first thing is that we had talked about addressing the data needs of state Medicaid agencies to support quality assessment and improvement in the managed care industry, and make recommendations for improving those data. And what we had decided to do was actually fold that into the Medicaid managed care report, so there were sections in that report that talked about the needs of state data agencies for assessing quality as well.

So it wasn't a separate report, but it was integrated into that report. We did have specific questions that we had laid out for those who testified, around quality measurement issues. So I think we are really done in terms of what we would be doing with regard to that particular objective.

The second objective was to evaluate data needs to support quality assessment and improvement as it relates to the delivery of post-acute care, and to the continuum of care, and make recommendations for improving those data. That objective was laid out at the time when the Subcommittee on Populations was actually setting a somewhat broader charge for itself around post-acute care and the continuum of care than it ended up pursuing. We ended up really narrowing that down to looking at functional status measurement, and then in particular to looking at the viability of ICIDH as a means to accomplish that.

And so in order to really go back and try to address this charge in full, I think would be a major work effort. And we did take testimony. As part of one of the quality assessment panels, we did bring people in to talk about the quality measures that are being constructed from the data sets that are being collected in post-acute settings in particular.

And we talked about the measures that are constructed from the MDS. We had David Zimmerman come and talk about that. So I think we have addressed it somewhat, although not completely. We had testimony about OASIS, and metrics that could be constructed from that actually a long time ago, like maybe a year after I joined the committee.

And I don't now if we want to reach back that far, but I think we certainly, as a committee, have a record of looking at those issues. And it wouldn't be unreasonable, I don't think, if we have a contractor try to pull information together for a report, to go back to that testimony and include that as well.

Are others comfortable with that, even though it occurred a couple of years ago? It really was before this workgroup was even formed. So that's the dilemma; that it really wasn't looked at by this workgroup, but was looked at by the broader subcommittee.

DR. STARFIELD: Well, the other issue, particularly with regard to continuum of care is linkage. That gets back to the personal identifier, which we can't work on.

MS. COLTIN: Well, it also gets back to the issue that we were actually on record on this, which was the OASIS-MDS incompatibility, where we sent a letter to the secretary about that. And about the fact that we didn't think it was a good idea, and that they really ought to hold off on implementing OASIS.

DR. STARFIELD: What has happened with that? Have they held up?

MS. COLTIN: No, OASIS has been implemented. It was mandated by Congress. I think they kind of took it out of the hands of HHS and mandated it.

Anyway, that's the one that is a dilemma. So I guess we have a couple of options there. One is we can say okay, we're not going to pursue that objective. We're going to change our work plan and delete it. Or the other is to go back and include testimony that we have taken before the work group was formed, but that many of us actually did participate in, and include that as part of the broader look at these issues.

People have opinions on one side or another?

DR. STEINDEL: If the testimony was given before actual implementation, and talks about the conflicts that can happen after implementation, I don't think we should include the testimony. If we are going to look at it, I think we should look at what happened after implementation, and did our testimony predict what was correct? So I would be in favor of leaving it out.

DR. STARFIELD: It was mostly Lisa that was involved in the OASIS.

MS. COLTIN: Yes, well, I was there at the meeting.

DR. STARFIELD: I was at the meeting too, but it's not something I --

MS. COLTIN: Well, Andy was really the expert on that, I think.

DR. EDINGER: I guess the other option is to have them update the testimony on the OASIS and the MDS, and some of the more recent of what happened.

DR. STEINDEL: That means reopening it, and having another hearing.

MS. COLTIN: But if we are going to have a day, we could have a panel where we do talk about those issues. It really has implications in terms of tracking individuals and their progress through an episode of illness across different care settings, which is an important aspect of quality of care assessment. Right now most of the measures occur in each setting.

DR. STARFIELD: OASIS doesn't do that.

MS. COLTIN: Well, the problem, the reason for looking at it is that it is incompatible with MDS. And if in fact if there is an MDS for home care as well, and if in fact they had adopted that instrument, there would have been an opportunity to track the same measures across the settings, because patients do go back and forth between home care and short stay nursing homes in particular.

DR. STARFIELD: Well, it seems to me it's worthwhile having something very small, four or five pages, that lays out the problem as we saw it with OASIS, and then says what was done. It lays the groundwork for further efforts in evaluation.

DR. LUMPKIN: I'm kind of struggling with the approach, because I think the reality has changed somewhat since we started this workgroup. And that's because of what the Quality Forum is looking to do, and they have a fair bit of interest in areas related to quality. And so I do know that at least in the document I showed you, that one of the framework board is beginning to address some of these issues in continuity between data collection and the burden of collection, and all that.

And as I think about the hierarchy, that to the extent that the Quality Forum may be dealing with some of these issues in detail, then that may mean that what we should be doing is dealing with things more in at a conceptual level, at a higher level. And the issues of integration across the continuum of care is a very important one to be addressed.

And to the extent that the Forum may be doing stuff on that, we should bring that in and incorporate that. Which may preclude the need for a hearing. We can refer to their documents and their findings, and say on this particular issue as reported in this document by the Quality Forum.

MS. COLTIN: And simply endorse their recommendation, if we agree?

DR. LUMPKIN: Right. And then spend our time looking at areas that they may not be addressing at that particular point in time.

DR. STARFIELD: So we could have a hearing with them.

DR. LUMPKIN: I'm not sure that hearing is really their approach. They are really using expert panels more so than hearings to come to their conclusions.

DR. STARFIELD: Who heads that Forum?

DR. LUMPKIN: Ken Kaiser.

DR. STARFIELD: Is that at AHRQ?

DR. LUMPKIN: This is the President's Commission on Quality recommended the formation of this public/private partnership. And this is the public/private partnership that came into being. Gail Warden is the chair, and Bill Roper is the vice chair, and Lisa and I are on the board. NCQA and JCAHO and AMA are kind of liaison members. Then they have got this group called the Framework Board, which essentially are people like Molly Coye and others who are spending 25 percent time working for the Forum on the Framework Board, and developing their documents and expert panel pieces.

DR. STARFIELD: So they are doing some conceptual work if they are doing framework.

DR. LUMPKIN: Yes.

DR. STARFIELD: Maybe it's worthwhile.

DR. LUMPKIN: I think it might be worthwhile to have -- I'm not sure if a hearing setting is the best way, as much as maybe sitting down and figuring out how we are going to work together with them, and maybe the IOM, and where we would see, given where the directions of the IOM -- so maybe a hearing with the IOM.

DR. STARFIELD: And AHRQ too.

DR. EDINGER: Richard Bunting(?) asked me to do a study with IOM, essentially to look at how the federal government manages the oversight of quality, which I'm assuming, since the IOM does lots of these studies, that they will have to look at some of these issues in one way or another, which I assume would include some look at things like OASIS and MDS.

So they are starting that now. They are selecting their advisory panel and their executive director and what not. So they should have some information available on that too. So it's probably the nuts and bolts. I'm assuming they will look at some of that. So I'm not sure it's necessary to go into some of that again, the nuts and bolts issues.

DR. LUMPKIN: But if they are looking at how they are doing it, then our job should be to say how we should be doing it, based upon the information.

DR. EDINGER: Right. We could feed into that, as well as the other, but I don't want to duplicate what they are doing.

DR. STARFIELD: And this is different from the IOM effort on the quality report, right?

DR. EDINGER: Yes, it's another study we got mandated to do. So there is a bunch of stuff that IOM is doing later, but the Quality Forum is doing some of this work, so we probably should avoid --

MS. COLTIN: Well, it's an interesting dilemma, because when we started doing this, it predated even the formation of the Forum. Our work was actually approved in February 1999, and we had laid out a number of these things. But we had also cited the president's advisory commission, their report and recommendations, and said that what we wanted to do was look at what some of the data implications were regarding their ability to meet some of the objectives that they had laid out. And where might some of the barriers be that would lie in data inadequacy or unavailability of the data that we currently had, and what would we need to develop in order for that to occur?

And at that time, as I said, the Forum hadn't even been formed, and so we had no clue that they were even going to get into issues of data, as opposed to measures. I think the expectation when they were first formed was more that they were going to be looking at measures, and not the underlying infrastructure.

And we had actually communicated that when we had had that communication. We sent a letter to the secretary about the report and said what we were planning to do. So it was a little surprising actually to me, that they were going in that direction, and they were getting into the data issues.

I think that in that area, it would actually make a great deal of sense for there to be some collaboration. Now what's the best way for that to occur? Are they now further along in this than we are? We have actually taken a lot of testimony that could be useful to them in this area. And if we did put together a report based even only on the testimony we have taken so far, that report could be useful to them in making their recommendations.

So I don't care who takes the lead on this. I don't care if we have a report that we make available to them and they use it, or whether we write a report, and we look at it and say, yes, we agree, and endorse this and encourage the department to take whatever steps are under their power to implement.

But whatever route we go, I think we need to have some sort of a plan for what is it that we are going to do. And so it sounds like there at least ought to be a meeting of some of the individuals involved, to think about what would make the most sense.

DR. STARFIELD: Okay, who are the individuals? Are we talking about the chairs?

MS. COLTIN: Well, I don't know. I have to rely more on John, because I don't know who is working specifically on the information systems and data aspects.

DR. LUMPKIN: I'm not sure I know that answer. Ken Kaiser was a big force behind GCPR and the medical --

DR. STARFIELD: Behind what?

DR. LUMPKIN: GCPR, the government's computerized patient medical record, and behind the VA, which was a big component of that. What we have been hearing lately is that since he left VA, there is not really as much interest there.

DR. STEINDEL: That's the vibe I'm getting.

DR. LUMPKIN: I haven't heard that specifically, but that's kind of the vibe I've been getting. And I think kind of picked up the circles around that the last time we had a briefing on the GCPR project. So my guess would be is that this push is probably coming from Ken. My interactions is that I don't get the feeling that Ken is into the details. He just wants it to happen. So I'm not sure if there anybody on staff at the Forum who is following through on that, as much as they are sort of moving in that direction.

One of the issues that came up recently was a resolution which will be discussed by the board, which initially said the Forum should take the lead on developing standards for computerized patient records. I think I was able to convince them, and they modified the document that we have got our document out there. There is all the SDOs are doing it. And what they have changed in their approach is to have sort of a conference to build support for computerized patient records as a means to get to better measurement of quality.

I think within that context, what is different between us and them is that we advise the secretary, and they are really a place that many entities that are involved in quality go to sort of develop a unified approach. And so given what they are doing, and what direction they are going, what is it that we should recommend to the secretary that HHS should do?

Should they participate in this process, and look at the areas within control of HHS that may need to be aligned so that there is consistency in data definitions and how data is collected? I think that certainly could be one of the things we could do based upon hearing from them and the IOM on what the role of the Department of Health and Human Services should be.

The second area that I think we need to look at, which I don't get the feeling that they are really addressing, and that goes to the nonclinical aspect of quality, meaning the things that you don't pull out of the medical record, whether it's CAPs or all of these other kinds of survey methodologies. Whether or not some suggested consistent data approach to these ought to be looked at.

DR. STARFIELD: It's really the population focus rather than the strictly individual clinical focus?

DR. LUMPKIN: In the interfaces, because some of these surveys, I don't think I would call them population focus, but part of the problem is they aren't. So if you do a CAP survey in Chicago for one managed care, and then you go to do another managed care, you have a comparison to the nonmanaged care environment.

Yet has anyone done the studies -- which could be our recommendation -- to determine the validity of making that into a population-based approach, where you actually survey not only managed care and nonmanaged care, and then you have a reference population to evaluate.

MS. COLTIN: Well, HCFA is doing that with Medicare CAPs. They have a fee-for-service comparison group that they have developed. There is a pilot study that has been going on to do that I think in six cities around the country, where they were trying to figure out how could you define the fee-for-service equivalent population in order to be able to do that.

DR. LUMPKIN: That may be what we want to do, to have someone come in and talk about that.

MS. COLTIN: They are doing that on the clinical measures side as well. They have the project, I think it's called the HERS project.

DR. STARFIELD: We had testimony from HCFA on something very nice they were doing, but I can't remember it.

MS. COLTIN: You mean when Steve Klaus(?) was here? I think Steve did talk about that.

DR. LUMPKIN: So anyway, my thought is that we can look at this higher level of issues, which I don't believe that we can use things from HCFA and from the Forum and others to kind of put it into an overall conceptual recommendation to the secretary of the direction that quality data collection, and being able to follow people through the continuum of care, as they move from one location to the other, to have a more individual perspective on quality.

That may be the other piece, is to the look at the model that has been developed around NHII, which we sort of had some discussion about, but one of the dimensions that really hasn't been explored is the quality dimension as an overlay to let's say how can a person manage their own quality? How can providers manage their quality of service in providing community-based quality?

So I think there are a number of --

MS. COLTIN: But looking at it this way, most of the work on quality measurement has been a systems approach to the quality measurement. Look at various parts of the health care system and saying, are they doing what they need to be doing? Or what's the short-term outcome of what they did?

DR. LUMPKIN: Can I change your term? A component approach.

MS. COLTIN: All right, a component approach. Yes, there is no system. But what I'm saying is it has proceeded from the provider viewpoint -- provider, practitioner, health plan, whatever, as opposed to from either of the population viewpoint or the individual viewpoint, which kind of cuts across all of those, and across the continuum.

And I don't know which approach the Forum is taking, or both, but it seems to me that even just talking about that conceptually, and talking about the area where the need appears to be the greatest, it really is this ability to follow a population across settings, and individuals until we get outcomes for individual across settings of care.

DR. STEINDEL: We were just discussing how to resolve the dichotomy between the three systems. In the individual this idea of quality is being expressed today is binary. It's good and bad, and that's definitely not the other two.

DR. STARFIELD: You know we could make our work relate to the disparities issue, because I don't think the Forum is doing that. Maybe that's what we could add. What do we need to add the disparities dimension?

MS. COLTIN: Well, one of the things that might be interesting on a panel in that regard is there was a very nice session held almost a year ago now that was led by the Office of Minority Health where they brought health plans together from around the country, as well as some people from NCQA, and a number of people from the government.

They talked about what were the possibilities for health plans to measure HEDIS separately for different racial and ethnic groups so that we could look at these measures, not just for a health plan as a whole, but for subpopulations within a health plan, based on potential vulnerable categorizations.

So race and ethnicity was the one that they were focusing on, but one could conceivably also look at economic disparities or gender disparities, or other kinds of disparities. That particular conference focused on racial and ethnic disparities. We were saying yesterday we wanted to be broader in thinking about what kinds of disparities.

There has been some work done that has looked within managed care plans at the same measures for the Medicaid population versus the commercial and insured population, which looks a little bit at the economic disparity issue, but of course they both have access.

DR. STARFIELD: But I was thinking of it as a population point of view. What about quality for those in managed care, and those not in managed care?

DR. LUMPKIN: There are a couple of ways we could deal with disparity. One is looking at -- and I don't know if it's possible yet, but certainly can we differentiate between the disparities in quality based upon socio-economic status versus that of ethnic and racial? And the studies that have been done on cardiovascular disease, which clearly differentiate the two. That there are people of the same socio-economic status, but of different racial groups, have different access to bypass surgery and angioplasty, and all these other things.

That's an issue that quality measurement and the quality improvement system has to address, which is the decisional bias that goes in, not based upon decisions that are influenced by what resources you think are available to the patient, but just because of the influence of the interaction between the two people and the care.

MS. COLTIN: And some of the same arguments can be made regarding gender. Some of those same studies looked at the same biases in terms of women and men and treatment for heart disease in particular. So I think I'm not quite sure what our role is there though in terms of the data, other than to say gee, we ought to collect information about race and ethnicity, so that we can look at these issues. We've been around that loop a number of times.

DR. LUMPKIN: And I think we've been around it a little, but maybe the answer is different depending upon what we want to look at. When we went around the loop the first time, our concern was that the observers' definition of the race and ethnicity of an individual frequently was different than what the individual may classify themselves.

But in this particular instance as it relates to disparities, it's actually the observers' perception that is probably the most important influence. And that the measurement and the collection of race and ethnicity data by the observer may actually be very valid.

MS. COLTIN: But didn't we have a whole panel where we heard both sides of the argument for collecting race and ethnicity? We talked about collecting it on enrollment. We talked about collecting it on encounter or discharge or whatever. Yes, we examined the issue of who decides or who identifies the race, whether it is the individual or the provider.

But we also talked about just the issues involved in collecting it, with just the issues from the standpoint of some providers, and what it would take to do it. As well as from members of the minority community around their attitudes toward having it collected. And really, there is this division of opinion. There was no consensus over those who were very strong advocates of collecting it, and there were those who were very much opposed to it. And that cut across whether you were looking at --

DR. STEINDEL: I think there was a very strong consensus that you couldn't collect it correctly, no matter who did it.

DR. STARFIELD: But then we had Nancy Krieger(?) talk to us about geocoding. We never wrote a report on any of that stuff, although it's highly relevant to what we do. We just never pulled it together, and we could pull it together.

DR. LUMPKIN: But part of it is that it wasn't in context. We were really looking -- the issue was measuring race and ethnicity. And in a context of trying to have good data that would be meaningful from one location to another. I think we would get one answer, which is the answer we got, people all over the place. But if we were to look at it in collecting data that it's meaningful for evaluating quality in relationship to race and ethnicity, I'm not sure we would get the same answer.

MS. COLTIN: But let's think about how you would do it. How would you collect the information to be used for quality? Don't you go back then to the same issues about data, collecting it on an encounter or collecting it on an enrollment, and how people feel about having it collected? Because otherwise, you are saying every time you are going to do a quality study, you have to go out and try for those subjects in your study to find out what their race and ethnicity is, which suggests a survey approach.

DR. STARFIELD: If you look at it from a population viewpoint, then the geocoding approach makes a lot of sense. You are not looking at individuals' quality, you are looking at population quality. It makes sense to do it by areas. Which I think is what this committee can add, because the other quality things are really looking at an individual.

DR. EDINGER: Maybe the new secretary -- maybe Gail, which basically changes your dynamics, but whoever it is, will probably have to set an agenda, which will be one of the first things they will probably do in February or so. So one of the things we should do, this is supposed to be an advisory committee, is to provide input into that agenda, especially on the issue of quality.

Whether it's accepted or rejected is another issue, but at least we will have made recommendations to the new secretary that he or she can use in setting their agenda. And since they can't be on board before the January 22, no matter why, they can't be confirmed until February probably, we will have some time at least in the process, or at least to notify the secretary that we are working on something that could be of use to them, and not a political agenda, but basically a data agenda, or a nonpolitical type of an issue.

In other words, what are the gaps in data, and the needs for data collection, what is the best way of approaching the quality issue? What gaps exist if you wanted to measure it? So you don't get into the problem of whoever the secretary is and their politics, because that would cause them an issue. I'm sure if somebody like Gail, should have a great interest in the cost of some of these outcomes, which is one of our interests. But if it's somebody else, it could be a different focus.

But at least we're on board that we are doing something, and we are having something moving out, instead of becoming irrelevant in a sense. Because if we wait too long to even be perceived as something, they will set an agenda. They will set their goals, their legislative goals, their priorities, their reorganizing, whatever the are doing in the department, which I'm sure they will do, whoever comes in, and the advice probably will be a little bit late, as far as that goes.

They will be looking for advice and places to seek advice. There are people like Gail, who have been around the field for 25 years, and probably know it as well as most people sitting here, who have sat here. But there will be people from various agencies coming to HCFA and looking for help and assistance, and places to turn. And maybe have something to provide them when they come in. At least at that level we can certainly influence decisions.

DR. LUMPKIN: I don't see us being in a position get something out that quickly.

DR. EDINGER: No, I'm saying we decide what we want to do. At least if we have something that is coming out, that might be of help to them, we are gathering information, or having hearings, or meeting with people, that will help them in their process, and not duplicate it. We are talking about maybe June or July. I'm sure the train will have left the platform by then as far as that issue.

But at least we will be perceived as doing something, which is important; at the perception of doing something. We're not an irrelevant body, which would be another problem.

DR. LUMPKIN: Depending upon the new administration, we may be too relevant.

DR. EDINGER: That's another problem we have.

DR. LUMPKIN: And then that's because of the HIPAA side of what we are doing. That's something we are going to need to find out about.

I think that as we are teasing some of these issues out in relationship to the quality, it may again position us to provide that advice as those priorities start getting set. We just had a whole discussion about the Quality Forum, and the issue is whether or not that is seen as a Clinton initiative.

DR. EDINGER: Well, it is.

DR. LUMPKIN: There could be the potential that if the new head of AHRQ and the new head of HCFA don't give the same importance to the Quality Forum that the current leadership does, the Quality Forum is going to go away, because there will be no money.

DR. EDINGER: And to be honest with you, the current administration defaulted more on waste, fraud, abuse, and cost than it has been on the quality side. And if the quality side, like what's our risk because we've got to make the public happy, we've got an election coming up in two years, that's probably -- it could very well go away.

So assuming this body stays, as you say it doesn't become too irrelevant, it doesn't become a problem. This body may be one of the few places that can actually provide that advice that is independent of sort of the agencies.

DR. STARFIELD: So who is the chief staff of the Forum?

DR. LUMPKIN: Ken Kaiser.

DR. STARFIELD: Why don't we have a meeting with Ken and Jennifer Curry(?), who I guess really heads up all the IOM stuff on quality, and someone from AHRQ. Who is the person from AHRQ?

DR. EDINGER: I guess it would be Greg or John.

DR. STARFIELD: Who was the first person?

DR. EDINGER: Greg Meyer(?).

DR. STARFIELD: And talk about the population and disparities issues.

MS. COLTIN: I think there are three issues that we've been involved with, and that we have got a track record in, and that we're also talk about as a focus going forward that relate to what we were just describing. And so thinking about a population and an individual approach, as opposed to a component approach. And disparities is certainly one of them, because when you are taking a population approach, looking at which populations, and how are some of populations faring differently in terms of quality.

Another is the whole issue of linkage, privacy, the individual identifier, the ability to link information and to track individuals in their progress, and populations in their progress over time and across settings.

And then the third is the issue of functional status, because that's one of the most important quality indicators that one would want to look at if you are trying to look at an episode of care across time or across settings. And we have been doing work in all of those areas.

DR. STARFIELD: Wasn't there a linkage of the continuum of care issue, the functional status, and what's the third one?

MS. COLTIN: The disparities. So it was disparities, linkage, and functional status.

DR. STARFIELD: You know we don't even know -- I don't know, it might be Jeff's workgroup work on things that provide data on race, ethnicity, geocoding.

DR. LUMPKIN: Which?

DR. STARFIELD: Computerization of the record.

DR. LUMPKIN: I don't think they have -- the work they have done so far really has been a little bit higher level than specific data elements. And really what they are going to be looking to do is to do more of what we did in the transactions, which is to anoint specific standards that have been developed by the SDOs, as opposed to actually looking at individual data elements.

DR. STARFIELD: But to the extent that some of these data elements we are talking about are not in the loop in terms of the SDOs -- I don't think people address this.

DR. LUMPKIN: Well, I think the approach is to figure out what it is that we want to do, and toss that at the SDOs.

DR. STARFIELD: Okay, well, that has to come through Jeff's work.

MS. COLTIN: This came up yesterday when we were talking about functional status, and talking about ICIDH, and saying if you went to a medical record, could you actually, from the information there, code the ICIDH? And everyone was pretty much in agreement that from our experience doing medical record reviews, that the answer is no. That the information you want is generally not there.

Well, in an environment with an electronic medical record, what changes? Is that information any more likely to be there, and if so, what might make it more likely to be there?

DR. LUMPKIN: The answer is that's where the terminology becomes important. If we are talking about the patient medical record information, and the electronic transmission thereof, which is really the parameters of which we are looking at our recommendations, then the answer is no.

But what we are talking about with the NHII vision is not just a record, but in fact an intelligent record. So in Illinois we have a system that we have had for about 8 or 9 years now called Cornerstone, which has intelligent decision support in doing assessments.

So if you now have a patient who had -- you have marked somewhere on the record that they have a disability or some injury that would normally be expected to result in a function, it would put up an assessment piece that would then collect all the information that is important in determining what this patient needs in order to reach the maximum functioning, but also affect all the data you need in order to be able to measure the functional status.

DR. STARFIELD: That's really what we are looking for, is what's the screening question that you use.

DR. LUMPKIN: And the link in what I saw yesterday, and I'm struggling with in the document, the link is not ICIDH, which is a classification tool, but it's the measurement tool of functional status. And whatever tool, whatever screening questions, or whatever measurements need to be done, they need to be done to assess where the patient is in order to provide the best care.

And as a byproduct of that process that allows us to adequately classify their functional status. And that the intelligence is in assuring that that data is collected, which on a quality basis, gives us the highest quality. If you don't fully assess their functionality, you are not getting quality care.

DR. STARFIELD: Did you say you are already doing it in Illinois?

DR. LUMPKIN: No, I'm sorry, our system is a maternal and child health system. And so if you ask a question in the general screening exam that would lead to let's say a high suggestibility of domestic violence, then there is a whole panel of questions on domestic violence that need to be answered.

DR. STARFIELD: So we are looking for the question on functional status, which we don't have yet.

MS. COLTIN: Well, if wanted to be able to classify individuals, you wanted to be able to code the ICIDH, the record is actually fairly strong in terms of the body function components of the ICIDH. It's the activities and participation dimensions that you seldom find in a record.

So it seems that if there was a question about activity limitations, and a question about social role or participation, whatever, it might elicit the information that would then be necessary to code. And so it seems if we were thinking about what would an ideal medical record contain, I think most people would agree that it ought to contain information about limitations in activities that people have, or their restrictions on participating in their social environment, but most of the time it doesn't. Even if the question is asked, and the conversation is had, it doesn't always make it into the medical record. Lots of times the conversation never occurs.

DR. LUMPKIN: So are we talking about data and information issues, or knowledge issues? Because I think what we are talking about is really a combination of the two.

MS. COLTIN: It is a combination.

DR. LUMPKIN: And to that extent, we need to find an intelligent way essentially move that knowledge to the point of service. And if there are -- with functional status there are assessments that ought to be made, and maybe as significant as whether or not an ACE inhibitor is given to someone, as opposed to denied, then how do you insure that that occurred? One way obviously is to measure it.

MS. COLTIN: Well, I think part of it is making it easy to do as well. And that's where I was thinking about the potential for the new electronic medical record, to make it easy through a structured text and point and click kinds of situations, to enter that kind of information about a patient in a pretty routine way.

I know the medical records system that we have had for years in some of our settings, there were the capabilities to bring up prompts of the kind of thing you are talking about. You enter a particular piece of information, and based on what you enter, it actually might prompt you for additional information.

DR. LUMPKIN: My guess would be we could probably talk to Jeff and Simon about this, is that the work that we have been doing on electronic patient medical record information alludes to this, but does not discuss it. And if we are looking for something that would be a contribution, maybe something that would assess the state-of-the-art of okay, now you've got it digitized, what do you do with it?

Of looking at these studies like Yukon and other studies, that actually demonstrate where you put some intelligence in the system, you actually can yield cost savings and improvements in quality, and make some recommendations about directions in research, directions in system development so that the hooks are there for these kind of decision support systems.

DR. STEINDEL: The groups have done a lot of looking at functional status measurements, and questions on functional status of the nurses.

DR. STARFIELD: We had testimony to that. That's what I meant the other day when I thought that the disparities were going to be a committee function, because it seemed to me that standards and security had to get involved in that.

DR. LUMPKIN: Yes, but I think the way to get to that, and looking at what they are doing, what I'm suggesting is that maybe the thing that this workgroup does is say that our major focus is going to be using -- instead of looking one or two years down the road, let's look five or ten years down the road, to make sure that the infrastructure is in place, that when we can electronically transmit in meaningful ways, electronic medical information from one location to the other, that we also have the capabilities and the infrastructure to bring quality to each individual encounter.

Right now our system is focused on measuring quality. And then because we determined that there may be deficits in quality, we are going to help not Patients A and B, whose quality is being measured, we are going to help Patients C and D, because we are going to change behaviors.

DR. STARFIELD: It's not encounter-based. That's the problem. That's why we need a continuum of care focus, and we need to bring that to them.

DR. LUMPKIN: But the infrastructure changes that we are talking about doing is instead of measuring the quality of A and B, to affect the quality on C and D, is to actually have the systems in real-time, evaluate the quality of A and B, and affect their particular outcomes. And it is moving from that kind of change, is where we might be able to make some contribution on directions that we need to go.

DR. STARFIELD: Who from the Standards and Security group is on the NHII?

DR. LUMPKIN: Jeff.

MS. COLTIN: Well, one of the things I'm getting concerned about as we are talking is that might be the next charge and the next work plan for this workgroup, but when this group met, our focus was on measurement, and what we needed to be able to understand where the system was working, and where it wasn't, and who was getting good quality, and who wasn't.

And not so much on what we needed to do from a data standpoint, or from an information standpoint to change that, to influence quality and improve it. I'm all for it. I think that's a very important thing to do, but that is probably phase 2. We don't even know necessarily within any great confidence, where all the problems are right now, because we are not collecting some of the information that we need to be able to identify the problems.

And so I was thinking more about the fact that you could make a recommendation that says if you are developing a system, and it is collecting information about a patient's vital signs, should smoking be a vital sign? Should the system prompt you to ask about that, and make sure you know it, because it's a factor then that is important in other decision-making down the road, as well as your ability to understand and evaluate what has happened with a patient.

So we are trying to say, yes, it's been shown it's cost effective that if physicians advise who smoke to quit, that more of them will quit. But who should they be advising? They should be advising people who smoke. Well, we don't know who smokes, because we are not collecting that information as a vital sign in order to know whether they got the advice.

Another thing is that along the same vein, all of the therapy codes that you use the terms and everything that is available tend to have grown out of a lot of the coding systems and constructs that we currently have about what's a therapy. And if you look through CPT, there aren't counseling codes in there. There aren't individual codes for yes, I talked to this patient about the risk of sexually transmitted disease, or I counseled this adolescent about the risks of particular behaviors. And the codes don't allow it. But within an electronic medical system, the recodes in ICD-10, aren't most of those recodes going away?

DR. LUMPKIN: I don't know.

DR. STARFIELD: That's what I was reading in ICD-10, they are supposed to go away.

MS. COLTIN: That was my understanding.

DR. STEINDEL: I think they are supposed to be folded in.

MS. COLTIN: Some of them I think are supposed to be folded in. And some of them are actually procedures, and they are supposed to be moved out, because they are -- so that's the issue. So owns this? Who owns counseling? Is it a diagnosis? It is a therapy. It belongs in a procedure coding system.

DR. LUMPKIN: Yes, the problem is a lot of the time it's not paid for, so it doesn't exist as a procedure code, because it is done by social workers or --

MS. COLTIN: It's paid for. It depends on how define it getting paid for. When it happens, it pays for itself, because things get avoided that would otherwise cost. So I think that those are the kinds of issues I'm talking about in terms of data gaps. And if we are thinking about an electronic medical record, and you are saying, okay, if you are designing it to capture therapies, make sure that it captures various aspects of counseling.

If you are designing it to collect vital signs, make sure that things like smoking are captured. Or that you have the category of behaviors that you capture. If you don't want to call it vital, call it a behavior or objective data, or some other term. But there are some key things that should be asked of a patient, just as there are key elements of a physical element that are key, behaviors that should be asked and incorporated into a record.

Because that's where a lot of the problems came up in quality measurement, was that there was no documentation that could be relied on either to identify the denominator population for whom an action should occur, like smokers, or to identify whether appropriate actions took place, if those actions were of the type of counseling, anticipatory guidance, and so forth.

DR. LUMPKIN: What I'm hearing, and what it makes me think of is to what extent -- and I suspect that there is a fair body of literature that highlights this, and I don't know that it is going to pull together in one place -- that we have information is gathered, information that should be gathered. And that the development of electronic medical records should facilitate bridging that gap.

MS. COLTIN: Right.

DR. LUMPKIN: Now to what extent are the SDOs looking at automating what is, as opposed to what should be? And that having some gap analysis might be a useful thing to provide into the mix as the SDOs are developing standards for electronic medical records. Our recommendation can either be to have someone as a report of the committee, pull together that body of literature, to at least highlight what we do know are the gaps that exist in current practice. Or in a letter to AHRQ or to the secretary, to suggest that this ought to be something that they do.

DR. STEINDEL: Is this a function of the SDOs, or is it just the function of the SDOs to provide a slot to put it in, which they already have in most cases.

DR. STARFIELD: I'm not sure they do.

DR. STEINDEL: In some cases, like HL7 has a blanket slot in their message. You can put anything in it. That's what I'm saying, is it a function of the SDO? We are making statements that this data, this data, and this data should be collected, and is that a function of the SDO, or is it a function of the medical care profession?

DR. LUMPKIN: It's a function of the SDO to make sure that there is a capability to collect it. And one of the problems in not knowing which particular HL7 you are talking about, is that they will provide a wild card slot, but if we find that there are five things --

DR. STEINDEL: It's like the thing that we were talking about yesterday where there are four categories for self, spouse, and it expands out to 23 in X12.

DR. STARFIELD: I was going to go I think where you were. I think we could do by the end of the year, a report on data gaps for elucidating disparities in quality of care in populations by having a meeting altogether with the Quality Forum, with the IOM, to identify whether they are even thinking about this, or how they are thinking about the issue, with our own workgroup, and with our NHII vision process. And sort of bring it altogether and say this is where we are now in data needs, and what we think should be done.

DR. STEINDEL: Could be or should be?

DR. STARFIELD: We can decide that once we have collected the information.

DR. STEINDEL: I think the vision thing needs to say a little bit more of what should be.

DR. STARFIELD: But not the vision project. The vision project is really not -- should be or could be.

DR. STEINDEL: I think we will know in about two weeks.

DR. STARFIELD: So do we want to have all those groups together? Or do we want to have a mini symposium, or do a separate thing?

DR. EDINGER: I'll do a symposium, but I guess it depends on what you want to achieve, because if you want an informal discussion, you really couldn't discuss it in open session with tape recorders, because part of the problem is that with some of these groups, nobody knows who may necessarily be coming and going out, and what the new flavor of -- it might be easier just to have an informal discussion with the players. We could probably set up something in March or April.

DR. LUMPKIN: I don't know that we, as a committee, can do anything informally.

DR. EDINGER: You have to have hearings?

DR. LUMPKIN: Right.

DR. STARFIELD: Should we devote half a day, whatever it is, an hour to each of these?

MS. COLTIN: I started out adding more things to the list in terms of kinds of the examples I was talking about, because I think we talked about disparities, which relies on collecting more information about patients in order to be able to identify what population subgroup they may belong to, black/white, high income/low income, whatever. So that's one thing.

And we mentioned the linkage and functional status, and what you would need, the kinds of questions you would need for functional status. But also behavior information is a gap. It's not collected widely or in any standardized way. And counseling, that kind of medical advice is also not collected.

So when you start thinking about the information that you need, what do you have, what do you need? These things fall out as major problems, and they come up again and again in the descriptions of measure developers' problems. When they try to develop measures, what do they run up against? And as I said, there was a whole day conference on what would it take to try to collect the HEDIS measures by race and ethnicity. There was a very nice report that was produced from that conference.

I think it would be good to have Olivia or somebody come and give a presentation on that, and sort of what are the problems, and why couldn't the health plans do this? Because basically, that was the conclusion, we have a lot of trouble doing this.

And now there is a project that has been funded. David Nerrins(?) at Henry Ford, and he is working with health plans around the country to try to test innovative ways to try to get at race and ethnicity, looking at some of the quality measures by race and ethnicity through linkage being one possibility.

Or using surveys where you collect that information, and linking it back with the administrative data, so now you have race that you have gotten from the CAPs survey for example, link the CAP sample to the administrative data. Use the race that you got from that, and look at other aspects of care that the patients received.

What are the issues in trying to do that? What are the privacy issues? People giving permission for you to use that data for other purposes and so forth. There are all kinds of issues that come up around some of the methods that are being proposed. But there are some innovative things that are being talked about.

So if we wanted to talk specifically about issues around race/ethnicity disparities and quality measurements, both of those, the report and the project goals. But that is managed care. That is health plans, and there are clearly broader based population issues around this. I know that there has been some work done recently in looking at the accuracy of the race and ethnicity data that is collected by the records and other kinds of systems. And there is a major source of information that could be used for quality -- birth certificate information, and so forth. And how reliable is that information?

DR. EDINGER: Maybe in February, since we'll probably know who the new secretary is by then, at least who is nominated, maybe that would be a good point at least to have the secretary or some representative of the secretary, to at least give them a flavor of what this group does. Actually invite them down, whoever that is, and maybe discuss some of the future plans, so they maybe get a feeling at least how interesting this is to the new group, and to that new person.

If it's Gail, I'm sure we'll hear what her interests are, but if it's somebody who is not as familiar with the department and what they want to do yet, that would be the opportunity in February to set a slot of time available for that person or her representative down. Because my concern is we might -- as you say, we're not sure what may be in tune with the new department's desires.

And you certainly don't want to be out of step with them, and have a hearing set up, and find out we're in the awkward position of having groups down that are not necessarily -- well, the IOM is always in favor -- in tune with the new thinking, the new policies are something.

MS. COLTIN: We are supposed to be apolitical.

DR. LUMPKIN: Right. But I think in February they won't even know where the bathrooms are. And they are going to be thinking about things like who is going to head up HCFA, and who is going to head up HRSA and all those other things. I don't think we will even pop up on the radar. We could invite them down. I don't see them showing up. So in that regards, I think what we do as a hearing, I think we should do our best call and just go from there.

MS. COLTIN: Well, we have a little bit of time between now and February to think about outlining some questions, and then presentations that we might like to invite. We can try and finalize that at our February meeting. But if there are people that are going to want to ask late February for a day of hearings, and late March or April is even late to get some people.

DR. STARFIELD: I think we should start right now on it.

DR. EDINGER: If you want Janet Corrigan, it's not a problem if we ask her now, because you know the IOM will be around doing stuff, or Ken.

DR. STARFIELD: Why don't we target for late February?

DR. LUMPKIN: Isn't there a meeting in February?

DR. STEINDEL: It's the third week.

DR. STARFIELD: So should we add on to that meeting?

MS. COLTIN: A lot of people don't like to stay over a day. I don't either. We could ask Patrice, or someone, your replacement, to poll committee members about their willingness to add another day. Lisa is not going to be here. Elizabeth is not going to be here. It may just be the three of us. I don't know which of the new members might want to come.

DR. STARFIELD: Vickie might want to come. But she decided she wanted to go to the privacy group this morning, but I think she would be interested.

MS. UPCHURCH: If you're thinking about having a breakout session for that second day of the full committee meeting?

MS. COLTIN: No, I don't think the full committee meeting on the schedule is going to allow for that. It's possible that if the subcommittee is going to meet to try to do what we were describing yesterday, which is take a look at what do we mean by health, and what do we mean by disparities, and try and scope those in terms of its work, that this could fall under that somewhat, because at least for the disparities issues it relates.

But this goes beyond just the disparities issue. Information about individuals that will enable us to measure disparities is just one type of information gap that we have.

DR. STARFIELD: So we are talking about trying to get Meyers, Corrigan, Kaiser, John Lumpkin, Dan Friedman, Jeff Blair, plus the couple of people you mentioned, David Nerrins, Olivia.

MS. COLTIN: I think Olivia Carter-Pokras, she was one of the people who organized that meeting that I was mentioning, so she would have the report, and would be able to either give the presentation, or ask someone else to give it. But I think it was important. It was sobering in terms of what plans said they were able to do. And how little access plans have to race and ethnicity data, because it's not on the enrollment form.

DR. STARFIELD: We really need to get Nancy back, because she can give us an alternative to doing it.

DR. LUMPKIN: It's like I have a mixed opinion about that. If they don't know, it's harder for them to discriminate. On the other hand, discrimination is occurring. It's just like everything else with measurement.

DR. STARFIELD: What do you need from us?

MS. COLTIN: What is interesting is the discrimination is occurring, but it is occurring at the provider level, the individual practitioner level in terms of making clinical recommendations that are -- it's not happening at the plan level. And maybe it is because the plans are blind to it for the most part, I don't know.

DR. STARFIELD: You're thinking of the most recent New England Journal article?

MS. COLTIN: There has been a whole slew of articles. Joel LaBliceman(?) -- he's done work in this area and Mark Vinneker(?) has.

MS. COLTIN: Well, when we look at the 23rd, we would be looking at the extra day.

DR. STARFIELD: So hold that, right?

MS. COLTIN: That's all right with me. If it's all right with the three of us, we may be just be the core of this work group.

DR. LUMPKIN: I teach class on Fridays.

DR. STARFIELD: Maybe we can start early and end early. We want to get out early on Friday anyhow. What time is your flight? Well, you can't do it that Friday.

MS. COLTIN: I wrote down the day before, the 20th.

DR. LUMPKIN: That's Tuesday, which is the Monday being President's Day, so that's a holiday. But if we start at ten, we could do that.

MS. COLTIN: So we can explore the 20th.

DR. EDINGER: In this case, most of the people you would invite are DC-based, so they are available. There wouldn't be travel problems. They'll all be here, at least most of them will.

DR. STARFIELD: What about Ken Kaiser?

DR. EDINGER: He's here.

MS. COLTIN: Well, we should try to find out whether any of the new members have an interest in joining this work group also.

DR. LUMPKIN: I'm going to try to spend a little bit of time talking to each one of the new member.

MS. UPCHURCH: So this is essentially a one day workgroup meeting?

MS. COLTIN: Right. And one of the things I would like to do is actually invite, even though it's the Quality Workgroup, I would like to invite Simon and Jeff, and I would like to invite --

DR. STARFIELD: Well, they have to talk.

MS. COLTIN: And I would like to invite whoever is going to lead the confidentiality.

DR. STARFIELD: I was going to ask them, to what extent are you considering this in your work?

DR. LUMPKIN: I think we're going to have a hearing portion, and then we're going to have a discussion portion with the committee, and I think we should invite them to the discussion portion. I think that we have not tended to have members testify before the committee.

MS. COLTIN: No, but I think they need to hear the issues, and then participate in the discussion. And I think that would be true for the confidentiality chair as well. I have no objection to inviting all the subcommittee members from standards and security and privacy and confidentiality, but I would at least like to see the chairs there to be a liaison or a link.

DR. LUMPKIN: And if we are able to schedule it the day before, there really is no additional cost to the committee for other members to be there. I'll probably come in the night before anyway. The travel is going to rough, because they will be traveling for the meetings, and the cost of having other people come would be negligible.

DR. EDINGER: And the room would be basically probably in this building. It wouldn't cost them anything. And the presenting speakers are government people in most cases, or private sector people who are here, so it shouldn't cost them anything to get those people here.

MS. COLTIN: Is Paul a member of this workgroup? Sometimes he will come to the meetings, and sometimes not, so I wasn't sure. I don't know.

MS. UPCHURCH: Paul Newacheck? No, he isn't.

MS. COLTIN: He has stayed sometimes when we have had this meeting at the end of the populations meeting, but I didn't know whether he was actually a member.

DR. LUMPKIN: I know he's got a fair bit of interest in quality measures.

MS. COLTIN: Well, we could certainly invite him. He is the only one who have to come out an additional night.

DR. LUMPKIN: Jeff probably would.

MS. COLTIN: Oh, Jeff too, that's right. So it would cost something, but not a lot.

DR. EDINGER: We have to end around one, but then we still have the problem that they still couldn't get here that early for their neck of the woods.

DR. STARFIELD: What about the other two staff members, Leon and Leroy?

DR. EDINGER: Leon was here once or twice, and Leroy I don't think I have ever seen.

MS. COLTIN: Well, I think it would be important to invite Leon, because I think some of the stuff we are talking about relates to work that HCFA is doing too, and hopefully he would be familiar with that. I think that's another area where we should have someone come and talk.

HCFA actually -- when plans do the HEDIS measures for the Medicare population, we don't just submit the numerator and denominator numbers to HCFA. We actually submit them a file of every beneficiary who was in the denominator, and every beneficiary who is in the numerator. In other words, it's a file of the denominator with a flag for whether they were in the numerator or not.

And they have done some work with trying to link those up to their eligibility file, where they have race and ethnicity data, and to take a look at that. And I don't think they were terribly successful, but I think it's another example of we ought to hear about this, because these are important attempts. So if you could give Leon a call. And if you can mention that particular initiative, if we could figure out who was involved with, and whether they could come and tell us about what they tried to do, and what problems they ran into.

There have been a number of kind of innovative attempts to try to get at looking at race and ethnicity and quality, and most of them that I'm aware of haven't been terribly successful, but there are lessons to be learned from that.

All right, at least we have plans. At least those of here could do the 20th, and then we need to check with who else -- actually, the new members. John, you are going to follow-up on that. I guess what I will do is I'll send an e-mail to Simon and Jeff and Dan. And who is going to chair the privacy committee?

DR. LUMPKIN: Richard in the interim, but he's on the bubble too. His appointment is up in February, and he is a Senate appointee.

DR. STARFIELD: Does that mean he probably will be continuing or he probably won't be continuing?

MS. COLTIN: He was appointed by Strom Thurmond.

DR. LUMPKIN: So this is the first time we have actually come to the end of the terms of the legislatively appointed people. So we're not sure exactly how that process is going on.

[Whereupon, the meeting was adjourned at 9:33 a.m.]