THIS TRANSCRIPT IS UNEDITED

National Committee on Vital and Health Statistics

June 23, 1999

Hubert H. Humphrey Building
Room 505A
200 Independence Avenue, S.W.
Washington, D.C.

Proceedings By:
CASET Associates, Ltd.
10201 Lee Highway, Suite 160
Fairfax, Virginia 22030
(703) 352-0091

PARTICIPANTS:

Committee Members:

Liaison Representative:


TABLE OF CONTENTS

Call to Order, Welcome and Introductions
Review of Agenda

Update from the Department

Committee Process for Addressing Issues Between Meetings, Dates for NCVHS Meetings in 2000

Panel Discussion on Data for Measuring Quality of Care

1996-1998 NCVHS Report -- Inclusion of Member Views

Presentation of Reports by the Subcommittee on Populations:
Medicaid Managed Care Data Collection and Reporting

Health Data Needs of the Pacific Insular Areas, Puerto Rico and the U.S. Virgin Islands

Discussion with HCFA Administrator

Presentation on Committee on National Statistics Report on Public Sector Performance Measurement


P R O C E E D I N G S [9:00 A.M.]

Agenda Item: Call to Order, Welcome and Introductions, Review of Agenda

DR. LUMPKIN: We will call to order the June meeting of the National Committee on Vital and Health Statistics.

As in prior meetings, this meeting is being broadcast over the Internet. So, I would like to encourage members of the committee and others who are speaking to speak into the microphones so that those on the other side of the cyber world can hear you when you make your comments, unless you don't think your comments are worth anything, in which case feel free not to use the microphone.

It seems like it has been a long while since we have met. I know a lot of the committees have been very busy. We have quite a full agenda. So, we are going to try to work through this agenda so that we can pay appropriate attention to each issue that is raised and the various presenters who are before us.

We have a slightly revised agenda. If everyone will look at the agenda that was in front of them at their place and all my notes, which are on my old agenda, I will try to transfer.

As per our new instructions in relationship to conflict of interest at this particular time, if there are any members who have conflicts of interest with items that are on the agenda, now is the time to so declare.

MR. BLAIR: What if we are taxpayers, does that qualify as a conflict of interest?

DR. LUMPKIN: No. Actually it counts as being a good citizen.

My name is John Lumpkin. I am the chair and I am also the director of the Illinois Department of Public Health.

We will go around and do introductions starting with my right.

MS. GREENBERG: I am Marjorie Greenberg from the National Center for Health Statistics, Centers for Disease Control and Prevention. I am the executive secretary to the committee.

DR. COHN: I am Simon Cohn. I am the national director for data warehousing for Kaiser Permanente and member of the committee.

MS. FRAWLEY: Kathleen Frawley, vice president of legislative and public policy services, the American Health Information Management Association.

MS. COLTIN: I am Kathy Coltin. I am director of external affairs and measurement systems at Harvard Pilgrim Health Care in Boston and a member of the committee.

MS. FYFFE: Kathleen Fyffe. I work for the Health Insurance Association of America and I am a member of the committee.

MR. GELLMAN: I am Bob Gellman. I am a privacy and information policy consultant.

MR. FANNING: I am John Fanning. I am a senior policy analyst in the Office of the Assistant Secretary for Planning and Evaluation of HHS and the privacy advocate of the Department. And I am here this morning to address the committee.

MR. BLAIR: I am Jeff Blair, vice president of the Medical Records Institute and a member of the committee.

DR. NEWACHECK: Paul Newacheck, University of California and a member of the committee.

MS. WARD: Elizabeth Ward from the Washington State Department of Health and a member of the committee.

DR. FRIEDMAN: Dan Friedman, Massachusetts Department of Public Health and a member of the committee.

DR. AMARO: Hortensia Amaro, Boston University School of Public Health and a member of the committee.

DR. STARFIELD: Barbara Starfield, Johns Hopkins University and member of the committee.

MR. SCANLON: I am Jim Scanlon from HHS's Office of Planning and Evaluation. I am executive staff director for the committee.

DR. LUMPKIN: We will now have those in attendance identify themselves. Because we don't have microphones that reach all around the room, those of you on the Internet may not hear them.

Why don't you introduce yourself and then we will --

DR. MC DONALD: Clement McDonald, Indiana University and Regenstrief Institute.

[Further introductions off microphone.]

DR. LUMPKIN: And Vince.

DR. MOR: Vince Mor, Brown University, member of the committee.

DR. LUMPKIN: Welcome.

Everyone has had an opportunity, I hope, to look at the agenda. Are there any changes -- I know that Dr. Peterson is stuck in Houston, I believe, and will not be able to be with us for this meeting.

I just want to note that it is Houston and not Chicago that she is stuck in.

Any other changes or -- okay. Then at this time we will go with the update from the Department.

Jim.

Agenda Item: Update from the Department

MR. SCANLON: Thank you, John.

Bill Braithwaite and John Fanning will be updating you in a couple of minutes on where we are with HIPAA, administrative simplification standards and privacy.

Let me take a couple of minutes to talk about some of the other data policy issues and activities in HHS.

There are about three areas that I wanted to update the committee on. Race and ethnicity data is the first one that the Department and the Data Council have been active on. As you know, race and ethnicity data continue to be a focus of a number of data planning efforts in HHS. And we are expecting in July a major report and recommendations from the Data Council's Working Group on Race and Ethnicity.

Some time ago, the council established a working group to look at all previous recommendations in this area, how much progress we had made and what the data gaps continued to be, and to come forward with some forward-looking strategies and recommendations for improving race and ethnicity data and related data in HHS.

This is to be a comprehensive data strategy when they do come forward. Apparently, the group has just about finished its draft report and will be presenting it to the Data Council in July. So, we will vet it a bit within the Department, have everybody react to it and cost it out and we will clearly have a presentation and share it with the committee at that point as well.

I think I reported at our previous meeting as well on an initiative that the White House had created dealing with improvements in the ability to -- statistical ability to measure discrimination and track discrimination in U.S. society. That has moved along a little bit further. Let me bring you up to date on that.

This initiative is coming out of the White House and it is a follow-up to the President's Initiative on Race. It is an effort to try to improve the ability to track and measure discrimination in key sectors of U.S. society and these sectors include health care, education, labor, criminal justice and housing. So, it includes the federal executive departments from those agencies.

The initiative is being coordinated by the Office of Management and Budget and the Council of Economic Advisers. This was seen as a major economic issue as well. The status of the initiative is as follows: The focus, again, is on improving the nation's capability for measuring and tracking discrimination. Obviously, this goes beyond just measuring disparities.

We have strong systems for measuring disparities but getting at the motive is a little more complicated and it gets you into the legal sphere very quickly. So, OMB and the White House have called a couple of meetings and one of the main points of moving forward here will be to ask the National Academy of Sciences to form a panel, an expert panel, and to hold a workshop, maybe a two-day workshop in November, it looks like it will turn out to be, where the focus will be on what do we know about measuring and tracking discrimination in these areas, what are the methods used and what are the areas of research to move forward.

As I said, that research conference is tentatively scheduled for, I think, November of this year. OMB has also asked the agencies to begin looking at what they could do on a shorter term basis to improve this whole area and within HHS, we have started a couple of steps. First of all, we have -- I will tell you a little bit more about this -- we have updated the HHS meta-directory of data resources and systems within HHS. I guess it is more of a meta-directory than anything else.

I will tell you a little bit more. That directory actually has a fairly extensive amount of information about what race/ethnicity data is included in our various systems. But in addition to that, HHS has asked our Data Council's Working Group on Race/Ethnicity to take a look at all of our current surveys and administrative data systems to see whether there is the potential for some measures there or to see what changes might have to be made to move improvements forward in that area.

Finally, we are probably going to initiate a literature review, a scientifically based literature review: what do we know from the health area and from other areas about how to measure, how to track and what sort of indicators have been used for tracking discrimination.

You are all aware that in -- there is a variety of techniques and some are much further developed than others. In the housing area, for example, HUD actually sends paired testers in to apply for rentals or mortgages and this sort of thing and they can literally see what the difference is when different individuals are sent in. So, they have in one way a very hard measure. I don't think you could apply this to the health area -- whom would you send in to what health plan or doctor's office -- but there may be other ways of thinking about this.

Labor also can do this through -- by applying for jobs. They can send different sorts of applicants in for jobs and see what the difference is. So, at any rate, this will be moving forward.

If members of the committee know of any individuals that could help us with the literature review, please let me know. Now, again, this is focusing not just on disparities. I think the interest here is on why are there disparities and to what extent can they be attributed to discrimination.

Please let me know if you know of anyone who has done research in this area. You are probably familiar with the book recently published by David Smith, I think, Health Care Divided, which is an excellent history of discrimination in health care, brings it up to the current time and looks at areas even now where there may be some discrimination.

And he actually includes in that review some of the indicators that have been used to sort of look further at what might be discrimination. But if you know of anyone or individuals who could help with the literature review or would be good members of the panel at the Academy or even speakers or experts, let me know, because I will be planning for that as well.

I mentioned earlier the directory, the meta-directory of HHS data resources and major data collection systems. This was developed under the Data Council's auspices and it includes virtually all of the major data collection systems and the major analytic databases in HHS. It includes a fairly lengthy description of each of those systems, including the kind of information that is included. As I said, it has a fair amount of information about the race and ethnicity detail and geographic detail that is included.

This year, it has been expanded to include a lot more information on how do you get the data, access to the data, including links to the agency that sponsors the data collection and in some cases links to the data itself. So, we will be putting that up probably in the next week. I think the agencies are fine tuning their entries. We will make this available publicly.

I think the Department plans to announce this more generally and to probably see if we can direct some of our research grant announcements towards analyzing that data. So, we will try to make it more permanent as well.

Finally, let me talk a little bit about data planning and budget planning. The Federal Government, as you know, is on a fiscal year basis that ends in September and we are already working, as many of you are, on the year 2001 budget. This is assuming we all survive Y2K. So, the Department and all federal agencies are now looking at the year 2001 budget. We are still awaiting a fiscal year 2000 budget as well, which looks like it may go right up to September.

But in the context of fiscal 2001 budget planning, there are a couple of activities relating to data strategy and planning. First of all, the Data Council has taken a look at -- a sort of across the Department look at the survey proposals and enhancements that are included in the budget request. They are doing this in the context of about four or five major data gaps and needs that continually occur and can be seen in the future as data gaps as well.

They will be discussing that analysis and priorities at the Data Council meeting tomorrow afternoon and they will probably be forwarding that analysis and recommendations to our Budget Review Board, Policy Board.

Secondly, as one of the budget planning themes for this budget, along with mental health and prevention, the Secretary has identified the theme of using information for decision-making. It is a very broad category and it is a very formative stage but this will be a Department-wide initiative that looks at information management, including gaps and so on in technology and may be able to come forward with some recommendations for how to move forward and get some budget support for that area as well. That is still in a very formative stage, but in about the next two or three months, that will be developed as a theme as well.

Let me stop there.

DR. LUMPKIN: Any questions? I have one. Evidently, I heard a rumor that yesterday the Vice President announced an issue related to making health data available the same way crime statistics are with geomapping and geocoding. Is there any -- has there been any discussion on that item yet?

MR. SCANLON: Well, the issue of geocoding clearly has come up in HHS and it comes up in two areas. One is on the -- can we include geocoding information in most of our major surveys, for example, in vital statistics and so on, so that you can map indicators to communities and counties and metropolitan areas and states and so on. That has started, but I think the Department is still working on a consistent policy there.

The other area where it has come up and it is a little more complicated is that can you -- if you want to look at the impact of grants and other public investments on health and so on, these could be in other areas as well, housing and education, is there a way to use geocoding information on the grant information; in other words, where these services are delivered and so forth, so that you would then be able to map the investment versus the outcomes, the indicators.

It could be infant mortality. It could be other measures of health status and so on. That requires a little bit more thought because often the grant is made to a state, for example, and what you would be looking at for Illinois would be Springfield would have a tremendous amount of dollars, but that is not where the services are necessarily delivered. So, you need to get to this next level of where the services might actually be delivered or where the -- and then you can look at matching the investments with some sort of indicators.

Otherwise, John, you may have heard whether -- about the -- the Vice President is a very sophisticated person and he has a sophisticated staff in terms of technology. We do have to be careful about privacy issues when we start mapping things like this. It is one thing in the crime area, obviously, where this is public information already and even there it runs into trouble.

But the basic idea of geocoding information to map and to follow indicators and even possibly to look at the investments is not a bad idea and actually HHS is moving in that direction.

DR. LUMPKIN: Perhaps, as this, in fact, unfolds as a direction, HHS will follow and we can get a report at the next meeting.

MR. SCANLON: Sure.

DR. LUMPKIN: Are there other questions?

[There was no response.]

John.

MR. FANNING: All right. I will give a brief update on two items. First, the status of legislation and regulations relating to confidentiality of health information under the requirements of the Health Insurance Portability and Accountability Act. As you know, that act has said that if Congress didn't pass confidentiality legislation applicable to the information in electronic financial and administrative transactions by August 20th, the Secretary was obliged to promulgate regulations covering that.

We have for some time been working on those regulations simply because there would not be enough time to do it adequately if we waited until August 20th. We are continuing with that task and if Congress does not pass legislation, we are committed to publish such a regulation.

Are there any questions about that?

DR. MC DONALD: It is an observation. The rumor I hear is that there is a move afoot to formally delay the date of all these HIPAA legislation issues, including perhaps the standard. Do you know anything about that?

MR. FANNING: People have raised the question as to whether Congress would put off this deadline. I don't know anything about it, although Senator Leahy has written to the President to counsel him to veto any putting off of that deadline or eliminating the Secretary's responsibility in that area, but I am not familiar with any other details.

MR. BLAIR: I have heard the same rumor and I can share with you the name of an individual in the Senate where the source was for that and it was specific enough to say that the deferral would be until September of next year. I can pass that information on to you if you want to follow up on it.

MR. FANNING: Thank you.

MR. GELLMAN: I don't know if you can answer this question, but I would like to ask you about the scope of the Secretary's authority to write privacy regulations. If you read HIPAA narrowly, it seems to suggest that the Secretary's authority covers the privacy of electronic health care transactions, but it isn't at all clear that the Secretary could not read HIPAA, as well as other authority that the Secretary has, much more broadly to cover all health information.

I wonder if a decision has been made yet that you can share with us about how far the regulations are going to go.

MR. FANNING: We are exploring the exact boundaries of our authority in this area and the reach of our regulations and we are not prepared to say exactly what those boundaries are. I think the goal that we are seeking is as rational and comprehensive coverage as can be obtained under the authorities that we have. But we don't have any more refined answer to give at this point.

Okay? One other item, as you know, in Congress -- this has to do with making information developed under grants with federal money available to the public. Last year, Congress directed the Office of Management and Budget to amend OMB Circular A-110, which deals with grant activities of federal agencies, to require that any organization that developed information with federal funds make it available under the same procedures as apply to the Federal Government under the Freedom of Information Act.

Now, the Office of Management and Budget promulgated a proposed text for an amendment to Circular A-110 and is now working -- is now reviewing the responses. There were 10,000 -- about 10,000 responses divided roughly evenly between those who said, yes, push hard to make all of this available and, on the other hand, those who said in general that this is not a rational way of accomplishing this task and the information should not be available; it will cause all kinds of trouble.

The Office of Management and Budget has assured us that all the federal agencies that deal with research grants and resulting data will be consulted before final action in this regard is taken. We have nothing specific to react to or to report. OMB is still analyzing the 10,000 comments.

There has been some interest in the Congress in changing this, but it has met with some resistance by those who put the provision there in the first place. So, that is under some discussion, but there does not seem to be any positive step toward changing it at the moment.

DR. MC DONALD: I had understood that the initial motivation for this legislation was scientific data that was used in a policy and couldn't be examined, and that the original interpretation of the bill was that it applied to every darn piece of science. It gets very complicated because then there is the issue of just the time and cost of managing all this stuff if you have kind of harassment sort of searches.

But I had heard that the final regulations or the final kind of way this would evolve would apply only to those things as they had been used to set regulations or legislation, which would significantly constrain the scope of it and maybe the bother side of it. Is that true?

MR. FANNING: Yes. As proposed, the OMB proposal did have that limitation in it. We said that we thought that was the wise distinction. On the other hand, very likely some people among the 10,000 wrote in and said that is not a correct interpretation of the statute.

So, it is hard to say how it will actually turn out at this point. Apropos of the business of making the information available, we did make the point that, you know, within feasible constraints, as much information as possible about the -- well, Jim can address this in greater detail and it might be wise to indicate that we are certainly in favor of sharing research data under appropriate conditions and controls and we do not want to be portrayed here as opposing making data available.

MR. SCANLON: I think in our own HHS comments to OMB, we stressed the point that we thought this was -- the goal was correct of making research data available, but there may be better ways to do this than what we think of as kind of a blunt instrument.

On the other hand, it is clear that at least half the commenters thought OMB went too far in restricting their interpretation and there were a lot of comments. In fact, there were systematic letter writing campaigns to balance the other side, including the Senate. I think the Senate leadership has expressed some concern that OMB, they believe OMB narrowed the interpretation beyond what they thought. So, I am not clear where it is going to go.

But you are right. It presents -- well, everyone agrees with the goals and I think the researcher community itself got itself into trouble by making it so difficult with what was an EPA environmental regulation. When the industry tried to get a hold of the data, the researchers made it very difficult. So, the industry went to the Hill understandably and the reaction was a fairly blunt approach.

So, I don't know where we are heading, but OMB, in fact, may end up having to issue another proposed rule before it can actually be --

DR. STARFIELD: Just for clarification, you used the words, both words, "information" and "data." The legislation refers to data. Is that right? Data not -- does the distinction -- at least we make a distinction between data and information, but we are referring to data.

MR. FANNING: I think we are referring to data.

DR. MC DONALD: It really wants the books to bear the whole -- I mean, everything there is and that is what makes it hard, especially when you have made promises to people about various things, like confidentiality. That, I think, is covered. They would give information, which they wouldn't give if it was -- When we are dealing with hospitals, they are giving us business information and they would not do that if it would be subject to --

MR. GELLMAN: And there is an exemption in the Freedom of Information Act for confidential business information.

MR. SCANLON: It is actually a plausible approach. I mean, the Freedom of Information Act actually would protect a lot of this information. The problem is you couldn't necessarily make an absolute assurance probably because it is sort of a case-by-case basis, I guess.

MR. GELLMAN: Well, one thing to remember is that any research sponsored today by the Federal Government directly -- I don't mean grants, but I mean operated by federal agencies subject to the Freedom of Information Act, the world hasn't come to an end. The same thing is true with any research that is being paid for by states, which also has open access laws. I am not saying that this provision doesn't have a lot of procedural problems, but it is not the end of the world either.

DR. LUMPKIN: Other questions?

MR. FANNING: That is all I have, Mr. Chairman.

DR. LUMPKIN: Thank you.

Dr. Braithwaite.

DR. BRAITHWAITE: Well, while the privacy activity in the Congress has been attracting all the attention, we have been working away underground analyzing the comments on the four rules for HIPAA standards that we put out last year, the one for transactions and code sets, for the national provider identifier, for the employer identifier and for security. Those are being drafted and we are planning to publish them all by the end of this year.

We are planning to publish the one on transactions and code sets first. If there is any conflict in resources, we are going to put our efforts into getting that one out first because the feedback from the industry is that that is the one they need first in order to start their implementation.

As an aside, the X12N working groups on the implementation guides for these transactions have taken to heart all of the comments that reflected on those implementation guides, have updated them and have now posted those updates on the Washington Publishing Web site, so that the implementation guides, which will be adopted as federal law by these regulations when they are published by the end of this year, are already available to people on the Web site. So there is a little early warning about what will have to be done, and we get a little more than two years to implement.

With respect to the individual identifier, for which we should have published an NPRM last year, as you all know, the push back on that with respect to privacy reached the Vice President, and he told us that we weren't going to publish a standard in that area until privacy was controlled either by legislation or by regulation, and the Congress told us that we couldn't spend any money on publishing a final rule to that effect until they approved the standard. So, we are on hold pending developments in the privacy standards area on that one.

The plan identifier, which had started through the clearance process, went into some difficulties and we have redrafted that and we will be reviewing it and putting that back into clearance pretty soon. Again, we plan to publish an NPRM for the plan identifier by the end of this year.

The attachments standard, at least for the first six types of attachments, has been drafted and is in the clearance process. We are hoping to publish that by October of this year. It may be a little bit later if things -- if other things get hot and prevent us from clearing it earlier, but we are planning to put that out very soon.

Our current plans are to publish in those final rules that come out by the end of this year a discussion of how to continue the process of developing and maintaining standards. I guess I say this every time I report to you. The adoption of standards under HIPAA is not a one time, here it is forever kind of process. It is a continuing process where business needs have to be recognized; implementation guides have to be written. Standards have to be adapted and adopted and then they have to be adopted again by the Federal Government as national standards.

That process has to be smooth. It has to be relatively rapid in order that those business needs be met by the national standards. So, we are working with the content committees and the standards developing organizations around the country and trying to work out this sort of smooth transition for continuing the development of the standards over time. We will be publishing at least a preliminary mechanism for how we see doing that as part of the final rules, but, again, it is a continuing process and we expect that process to be refined as the rules are refined in years to come.

We are also discussing the code set issues. We are having discussions with the code set developers about how to make sure that those code set developments continue in a public open process and that low cost electronic distribution mechanisms are available for all of those code sets as required by the HIPAA law.

We are looking at trying to assist the industry in their efforts to plan and implement these standards. We have identified five areas of activities where they are going to need some help. Obviously, publicity, making sure that the world out there knows that they are required by law to follow these standards.

We are still getting phone calls from people who just heard a rumor that something had happened and that there was a law that they were going to have to follow and what was it anyway. So, publicity is still an important thing and I spend a great deal of my time in an airplane going around and talking to various groups of people to inform them about what is happening and what they are required to do.

So, that is sort of publicity and education. Technical assistance and implementation assistance is something that organizations are going to need because they have not yet implemented these kinds of standards and the law requires every single health plan in the country to be able to accept electronic transactions even if they don't even have a computer yet.

Now, they can hire a clearinghouse to do it for them, but they at least have to understand enough to write a contract with a clearinghouse. So, assistance in doing those implementations is needed. There is sort of a monitoring and evaluation aspect that this committee is well aware of because you are required by the federal law to report back to Congress about how things are going and you have to have some means of evaluating that.

Of course, the industry needs some means of evaluating that as well. All of those issues are things which we as a department are not funded to do. There was no money in this bill for us to do these things, but we will do what we can and we are planning and already working with industry in trying to publicize it and maybe we will set up a central sort of clearinghouse of educational and technical assistance activities that people can draw on. But the industry is stepping up to the plate. Various organizations from industry are already putting on courses and doing evaluations and accrediting organizations as having met the standards.

So, those things are going on and we intend to support that as best we can. Enforcement was an issue that was not addressed in the final rule. We will be looking at enforcement separately and will publish a separate NPRM, a regulation on how these standards will be enforced next year. That is our schedule for doing enforcement. Again, there was no funding in the bill to do enforcement. So, it has to be a relatively low key thing, but we really believe that the industry with our help will be able to police itself.

After all, the standards that we have put forward are intended to make the activities in the health care field more efficient, less expensive and the marketplace will have a certain amount of force in forcing people to adopt those standards. Otherwise, they will be left out in the cold by the rest of the industry.

Well, they will squeal on each other and then we can go after them. That is the enforcement strategy. Anyway, we will put that in official writing and come out with an NPRM next year about how this will be enforced.

That is a summary of where we are going with the standards activities. Any questions?

MS. FYFFE: Thanks, Bill.

You mentioned the low cost of distribution of code sets. Do the regulations address possible usage fees of code sets that are developed by any of the professional societies? For example, if a code set is copywritten by an organization, do the regulations say anything about a recurring annual fee for organizations to use those codes?

DR. BRAITHWAITE: That is a question that was brought up quite directly in the comments to these rules. The comments ran the gamut from "it is illegal for us to adopt a copyrighted code set" to "we can't afford it, so why are you doing this to us?" Our lawyers assure us that it is legal for us to adopt as a national rule that copyrighted code sets be used.

From the perspective of the code set developers, who have to carry on a continuing process of maintaining, developing and keeping their code sets up to date, there has to be some financial income for them to do that. Most of the comments understood that, but wanted to make sure that the low cost aspect of the law means that what people have to spend to get copies of, or licenses to use, these private code sets is not excessive.

That is, they should pay for the maintenance of it, but they shouldn't be used to fund some other activities of an organization or something like that. That seemed to be the general gist of the comments.

As far as I know, at least, we will not be addressing in the final rule a specific cost for these things, nor are we going to say they are free, nor is the government in a position, particularly without an appropriation, to buy them and give them to the public, although that was suggested in many of the comments.

But we will work with the private organizations that develop these code sets so that they can sign agreements with us assuring that they will, in fact, allow people to use these code sets at low cost -- exactly what that means is up for debate, but it will be a low cost -- and that very low cost electronic distribution of those code sets is available to everybody who needs access to them.

So, those are the negotiations that are underway.

DR. LUMPKIN: Jeff.

MR. BLAIR: That is okay. That covered my question. Thanks.

DR. MC DONALD: Two things. First, I would like to clarify. When you say copyright and you say cost, they are totally different, independent things. So, being copyrighted doesn't mean it costs anything. In fact, not being copyrighted is probably bad, because then you have no protection against ending up with 52 different variants, like the 1500 and the other forms.

So, I think it might be a good idea to have copyrighted but no cost or low cost code sets. The second thing: in the bill I thought there was something about equitable, as well as low cost. Does that imply everyone has to pay the same? I mean, there are current codes that are not sold that way. That is, no one knows what the costs are to different parties, and it is probably true that they are different for different kinds of parties.

DR. BRAITHWAITE: I am sure that is true. I don't remember the equitable clause in the --

DR. MC DONALD: I might have made it up, but I thought --

DR. BRAITHWAITE: The principles of HIPAA would indicate that we should try and get at least publicly available pricing for those code sets, so the people know what they are getting in for and can see that they are low cost as opposed to having secret agreements.

And, again, that is something that we are working with the code set developers on to make sure that everybody understands what it costs and that it is relatively, at least, low cost.

DR. MC DONALD: You said something that I hadn't heard before; that is, low cost would be no money expended, except to maintain.

DR. BRAITHWAITE: That is what the comments told us should happen.

DR. MC DONALD: The challenge in this is that you basically make this a monopoly when you require it, and then, you know, the cost can become infinite because they get paid a lot more now for doing code set maintenance. So, there is a real challenge to keep a counterforce in place when we don't have a competing process.

On the other hand, we don't want to have 52 different codes. So, it is a challenge.

DR. BRAITHWAITE: It certainly is a competing process in the sense that these rules are open every year for change and if some organization suddenly, you know, quadrupled the cost of its code set, the industry would say, hey, we want you to change what code set we use to something else if there is a competitor out there.

Now, that is a very expensive decision to make and the industry is not -- would not easily come to that consensus, but if they did, HIPAA would require us to consider changing the standard --

DR. MC DONALD: One of the code sets has increased tenfold in the last year. Maybe that is a different set of issues.

I worry a lot about this issue. We want to have one code set, but how in the world do you get there? Are there other examples and other initiatives where you can get to that situation, with all the constraints that we have, without having an infinity machine, even though there is no money floating anywhere else?

Some foundations -- not foundations -- a lot of non-profits do pretty well even though they are not for profit.

DR. BRAITHWAITE: We agree. It is an issue that needs to be looked at very carefully.

DR. COHN: I just want to ask another question. I certainly agree that the issue of a framework for coding is an issue for the committee to consider over the next short while.

I think the legislation is very clear about what happens in terms of updating standards once they are actually implemented, sometime in the year 2002 or whatever, but I observe that there is actually general confusion in the industry about what happens during that initial two year period, where sometime later this year we will see final regulations and the health care industry will begin to mobilize to implement those standards.

Yet, my understanding is that there will be at least some minor modifications to those standards as we move into final implementation, required implementation. Can you comment on this sort of two year implementation period and the likely changes?

DR. BRAITHWAITE: Well, as you say the law gives the industry in general, except for very small plans two years to comply with the final rules that we publish.

In terms of changes, the law has three different clauses about changes to these standards. The general rule is that we aren't allowed to publish changes to the standards any more frequently than once a year. Now, if the industry has a two year window in which to comply, that means, in fact, that we could publish another updated standard one year into implementation and a second one before everybody has to comply.

So, it is possible that we could adapt or adopt standards within the implementation period if by industry consensus, that seemed to be necessary to make it easier or better or whatever to implement these standards.

Then there is an emergency clause, which allows the Secretary to adopt changes to the standards, which are required for implementation. That is, when the industry starts to implement these standards, if they run into something that just doesn't make sense, somehow it slipped through all of the process that we have put in place to make sure that these are implementable and they just can't do it, we have the authority to suddenly issue a new standard, which takes care of that problem on an emergency basis without waiting for the year to pass.

Then the third mechanism for change is change in the code sets themselves, which does not have the once-a-year requirement. So, code sets could, in fact, be maintained and codes could be added every day if necessary to keep a code set up to date, and it would be up to the code set developer and the industry consensus to decide how often that code set gets updated, but we are not limited to the once-a-year update for code sets.

DR. LUMPKIN: Time for one more question.

Jeff.

MR. BLAIR: Dr. Braithwaite, have you been able to gather information with respect to the implementation of the standards and code sets to determine whether the majority of this implementation will be the development of new vendor systems that incorporate these standards as a new function, or whether there is going to be a significant amount of homegrown code in user institutions that is being rewritten? The question is intended to get a feeling for whether it is going to be a little bit easier for people to install a new system that will give them this function and help them be more efficient, or whether they are going to have to spend a lot of time rewriting old systems, which may be more expensive and time consuming.

Do you have a feeling for how that balances out?

DR. BRAITHWAITE: Well, I don't have any hard data on it. I haven't done that research or study but I have been out there talking to industry a lot about this issue and the provider side of the industry is, in general, supported by companies that provide products. So, the vendors will be either updating their systems or providing new systems with that functionality.

The plans, although some of them buy vendor products, tended to indicate in their feedback to me at least that they have a lot of either homegrown stuff or huge modifications to vendor systems that essentially make them homegrown systems, and those would require a fair amount of tweaking. And then, of course, there are those organizations who don't have any systems yet, or any systems that do anything close to the standards that we are requiring.

So, they would have to purchase systems or contract with clearing houses to do the work for them. So, there is a broad range of different organizational needs that will have to be met in this two year implementation phase.

Unfortunately, the vendors -- well, of course, if I have talked to them, then they know about it, but the ones I have talked to are already making progress in implementing these standards because, of course, this has been a consensus process and they have been part of the process of consensus. They know what is in them already and, of course, now it is all available on the Web and they can get a head start on making the changes.

For many people the changes are not large because they have already implemented the current version of the ANSI standards for many of the transactions, if not all of them, so the additional requirements for many of these progressive companies are not large. The others, of course, have a lot of work to do.

DR. LUMPKIN: Thank you.

I would like to thank all three of you for the update. It was very helpful and informative.

Agenda Item: Committee Process for Addressing Issues Between Meetings

We are going to move on to the next item on the agenda, which actually came out of the A1-10 Circular and that relates to the issue of how the committee responds. That is being handed out to all of you.

In the instance where an issue arises, obviously, it is the intent and the desire for the functioning of the committee that issues on which the committee would need to comment should be processed by the committee through advance planning. However, there are issues that may fall outside of our radar screen, or a committee member may raise an issue that has been reviewed by the executive subcommittee or other entities, and at the time that review occurred, someone said, well, okay, that issue is out there; we don't necessarily need to respond. But an individual member may contact the committee chair and say, you know, I think the committee should have a position on that.

So, what we have outlined in this is a process to deal with those kinds of issues. Essentially, if you skip down to Item 5: Items 1 through 4 describe how we will try to prevent this from happening. Item 5 describes a process by which, if an issue is raised, it could be sent to a subcommittee, which may meet by conference call and forward it to the full committee, which would then meet by conference call.

All of these would have to occur in a way that the public could participate or at least monitor the conference call. And then a decision could be made to take some action or position on an issue that is a late-breaking issue.

Item 6 deals with the case where the issue is raised not by the subcommittee or the executive subcommittee, but by an individual member to the chair. Then Item No. 7 talks about some of the time frames, two weeks being the minimum turnaround time to review and comment on documents, unless there is some concurrence otherwise.

Finally, Item No. 9 says that if we get an item that is raised, if we can't respond in a timely fashion, then individual members will be encouraged to respond to the issue in their professional capacities.

Any questions or concerns about the proposed process? I will give you a few minutes to mull through it.

[Pause.]

DR. HARDING: Just a clarification. Richard Harding. In No. 7, timely response: it requests that a subcommittee or the executive subcommittee take responsibility for developing a response, and then members will be given -- the members meaning members of the committee or the subcommittees? It is members of the committee as a whole you are talking about there, who will be given a minimum of two weeks to review, and this time period can be shortened by concurrence of a majority of the members of the entire committee or the subcommittee? Just a clarification.

DR. LUMPKIN: Full committee.

DR. HARDING: Full committee. Okay.

MS. GREENBERG: The idea is that one subcommittee would take responsibility, and the members of the full committee would be asked to send their comments to that subcommittee, which would then develop a recommendation within the two weeks. And, again, if there were a desire to shorten that, it would have to be by a majority of the full committee members. We can make that clear.

DR. HARDING: Thank you.

DR. LUMPKIN: Bob.

MR. GELLMAN: I have a question on the whole premise of this rule, and that is: is there anything that comes up that is so important that we have to deal with it between meetings? We don't make any substantive decisions in this committee. We give advice. We don't cover the waterfront of all health policy issues or health data issues or whatever, because we can't. We are very selective in what we do.

I don't really see the need to have a procedure where we have to hurry up and comment on something in between the meetings. If we miss an opportunity to say something, so be it. We can say it after the fact. I don't see the need for this at all.

DR. FITZMAURICE: I guess I can see that sometimes the Secretary or the Data Council might have a need to get a quick resolution to a question, whether it be about a HIPAA standard, such as how fast to proceed on the uniform health identifier once Congress and the Vice President give their approval. It is useful, in my opinion, to have a procedure that sets out how we will go about it. If it is never used, then we have wasted two sides of one sheet of paper. If it is used, everybody knows the rules by which we are playing.

MR. GELLMAN: I think there is more danger here than you see, because by having two different procedures for approval of committee activities you run the risk of game playing. You can have issues come up in the regular way or in the expedited way, and that is really the concern that I have. I mean, what happened with the FOIA rules is a good example. That came up -- and I am not attributing any kind of malice or manipulation to anybody; it just happened this way -- but this is an issue that came up at the last meeting. It was put on the table.

There were a few comments on it and we let it go, and then at some point -- I don't remember the timing of things -- much later on, as it got closer to the deadline for comments, the word came around that there was interest in having comments. Well, that could have been brought up in the first place, and when you have an expedited procedure, issues can be left to the last minute.

I very much -- I mean, I don't have major objections to the idea here. I just don't see the need. I do have major objections to the notion that the time period can be reduced. If you are going to do something like this, you have to have a fixed minimum period of time for people to do it. I am not sure two weeks is enough, but with the notion that people by majority can decide to do this in a day -- first of all, I don't know how you collect opinions from people on the matter. Everybody gets together, a majority says let's do it this way, we are all agreed, let's just go ahead and do it right now. That is very unfair. There is not enough public notice.

There is not enough notice for people on -- members of the committee to make decisions. That is what I am concerned about that we are going to find things all of the sudden coming up in a different way. The Secretary knows when we meet. If the Secretary wants the opinion of the committee, she is going to have to ask in the regular order. If the Secretary wants the opinion of members of the committee, she can ask them individually.

DR. LUMPKIN: Let me perhaps segment the discussion. We have two issues really before us. One is should we do it. And the second one is how we do it. I think we should resolve the first one, which is should we do it, before we decide on how we do it. So, I will entertain discussion on should we.

Clem.

DR. MC DONALD: I would say we should do it, but I also think it is important to take out that line about the emergency ones, because otherwise -- we have an easier time deciding that, I think. I mean, I think it makes sense to have this process. I also think the point is right that if you squeeze it down to a day, it is just not going to work. So, if it is going to be a process, we should fix it with the --

MR. BLAIR: I think that restricting our decisions to our meetings has certain benefits. It also -- we pay a certain price for that in terms of expediting issues and moving more quickly. If we wind up having this proposed procedure, it offers certain benefits and as Bob Gellman has indicated, it offers certain risks.

My thinking is that we should maximize the benefits of this procedure because it does give us some advantages and that we should carefully review the procedure to minimize the risks, but it does give us an option to move ahead more expeditiously and I think there are a number of issues that may be coming up during this next year or so, where our ability to be effective in committee activities will be impaired if we don't have this as an option.

DR. LUMPKIN: Okay. We have had discussion on the issue of whether or not we should do it. I would like to entertain a motion to move forward with this particular document, because I can't think of any other way to vote on whether or not we should do it, but we have had that discussion. Then we can talk about how we do it. Those who believe that we shouldn't do it should vote the modified motion down, and those who believe that we should do it, but have now modified the way that we will do it, should vote it up.

DR. MC DONALD: Do we modify it to take out that flexible time window?

DR. LUMPKIN: I just heard your motion, which was that you move this document minus that sentence.

DR. MC DONALD: Yes.

DR. LUMPKIN: Then that, in fact, is the case.

DR. STARFIELD: I second it.

DR. LUMPKIN: It is moved and seconded. It is on the floor.

Jeff.

MR. BLAIR: I support the idea of going forward with developing a process like this. I do think that we should have -- we should give time to carefully edit and revise and try to minimize the exposures that Bob Gellman has articulated. So, I think that the vote should be in terms of, yes, we will go forward and we will give some period of time, whether it is a week or two or three for it to be edited and modified and brought back to the committee for approval once we are sure that this maximizes the benefits and minimizes the risks.

There is a motion seconded on the floor. Correct?

DR. LUMPKIN: That is correct. The function is to try to get to the issues, and I think what Jeff is raising is that since there are two issues, maybe we should segment the discussion. We would first vote on whether or not we want to do it, let people look at the document, bring it back at our next meeting -- we wouldn't be able to implement it between now and the next meeting -- and then take a vote on it at the next meeting.

Unless someone knows of something coming down the pike between now and the next meeting that would require us to implement that, then I think that that would be certainly a quite acceptable approach.

DR. HARDING: Just a point of information again.

In the first sentence of the thing, it mentions that we may need to comment, one, or take a position. There seems to be a difference between those two. I would have no -- myself, no objection to this process being implemented for a comment to somebody. That is not an official statement. That is a comment to somebody about what we think about something or what a subcommittee thinks. But taking a position is, to me, an official act and that would have a higher standard wanting this body to discuss it and so forth.

DR. AMARO: I think another possibility to consider, in order to guard against the misuse of this, is to perhaps set a higher standard for the expedited process in terms of the votes required, so that there would be a disincentive, you know, for leaving things for the last minute, and to ensure that not just a majority but almost the entire committee approved.

So, I don't know if you want to have a three-quarter, you know, approval requirement or something like that.

DR. LUMPKIN: So, if it is agreeable then, since the withdrawal of the motion has been approved by the maker and the seconder, we would take up two motions. One is that we would send out to members a redraft of this, and the two motions would split the issue, as Richard raised: one being to comment, and the second being to take a position. I think we can just vote on those two issues separately, then get a revised document out to everyone and bring it back.

DR. COHN: Perhaps it is just a question of clarification. I am not a lawyer, as you know, so some of these concepts may elude me. I am not sure what the difference is between the committee making a comment on something and taking a position. It seems to me that it is effectively the same thing.

If an individual was making a comment, then an individual is making a comment. To my knowledge also, subcommittees aren't empowered to make comments or take positions on things without full committee attention anyway. So, I am just not sure what the point of making that distinction is. But I would ask the chair to clarify that for me.

DR. LUMPKIN: Actually, I interpreted this as follows: if there is a regulation or something that requires a response, that would be a comment, but a position may be on an issue that has arisen for which there is no legislation or documentation that we are responding to, but where we put the position of the committee forward. That may be splitting hairs and I don't know, Richard, if you --

DR. HARDING: Something like that, but it is in the document. The two are separated, so there must be some difference between the two.

DR. LUMPKIN: So we can in the document if it is agreeable, Jeff -- and then I will call on you -- if it is agreeable, then we will come up with some wording that kind of clarifies that, if it is agreeable that we want to have a policy to take action. We will not use the word "comment" or "position" between meetings.

Is there a motion that the committee --

MR. GELLMAN: Can I make a suggestion?

DR. LUMPKIN: Please.

MR. GELLMAN: I don't see any reason to vote on anything today. Go away, revise this thing in accordance with what people suggested, circulate it if you want, get other comments and then come back. There is no point in approving, having a vote to approve something when we are not approving it. We don't have a particular proposal in front of us. Just come back with a proposal and deal with it then. There is clearly some interest in this process. So, there is no point in approving a process unless you have a process in front of you.

DR. LUMPKIN: Actually, I am not asking that people approve the process, but the issue was raised by you, Bob, that we shouldn't do it at all and if that is the feeling of the committee, then let's not bring this back. Let's just not do anything on this issue.

So, I am just looking for a feeling from the committee. Do you want something to come back? I am seeing enough heads shaking that that would constitute enough people.

MR. GELLMAN: I think that is fair enough. I mean, come back with something -- maybe you will come back with something better and I won't object to it anymore. I don't know.

DR. LUMPKIN: I don't think we need to have a motion. I see enough heads shaking that it is worth doing the effort to do that. I will ask that individuals who have specific comments on the draft before you, please submit those within two weeks or shorter if you decide that that is more important.

DR. FITZMAURICE: Excuse me, John. Just to clarify that for the people out on the Internet and for the recorded minutes, when you said "head shake," you meant heads nodding "yes." Right?

DR. LUMPKIN: Yes, that was correct.

MS. GREENBERG: Please send your comments to me. I do need some clarification here. Does the majority want a distinction between -- it just may have been an overuse of words. Do you want a distinction on commenting and taking a position or not?

PARTICIPANTS: No.

DR. LUMPKIN: Okay. We are now scheduled --

MR. GELLMAN: John, before we move on, could I ask -- I don't expect anyone to have an answer to this. What do other advisory committees do about this? Anyone know? Maybe someone could give a call over to the GSA secretariat and see if there are any precedents or policies that might be relevant.

MS. GREENBERG: I do know that other advisory committees do develop recommendations, positions, what have you, and make comments through conference calls, which are clearly between meetings. So there is precedent, and when that is done, the process generally described here is what is used, with an opportunity for public participation in the conference calls.

MR. BLAIR: An indirect answer to Bob's question: I don't know about other government agencies, but certainly in health care information standards committees and boards, there has been more pressure for us to take advantage of e-mail and electronic systems to expedite things, and a lot of business activity goes on, at least in that community, taking advantage of electronic communications. That makes me ask: do all committee members right now have regular access to e-mail? If not, then maybe that is part of the issue, and it may make it a little bit difficult for us to assume that that existed.

Marjorie, do you know, is everybody available on e-mail?

MS. GREENBERG: Everyone is on e-mail. The problem is that just using e-mail does not allow for public input to the decision-making process. As for commenting back and forth and developing ideas -- obviously, we do that now and we will continue to do that. But you can't just come to a position through e-mail.

DR. LUMPKIN: We are scheduled for a --

MS. GREENBERG: There is one other little item here, if you don't mind, on the dates. You all received a sheet to fill out about the dates for the full committee meetings in 2000. If you had a chance to look at the executive subcommittee minutes, you would notice that we had agreed to definitely establish a date for the June 2000 meeting, because that would be the meeting with the symposium in honor of the 50th anniversary of the committee -- the anniversary is actually in 1999, I believe, the committee having been established in 1949, but by then we will have had our 50 years, and then we will have this symposium.

We plan to have that at --

[Multiple discussions.]

We are planning to have the symposium and reception at the National Academy of Sciences. Jim Scanlon had gotten available dates and then we had queried all of you through e-mail and we have selected the 20th through the 22nd. It may be -- it will definitely be the 20th and the 21st, I think, and Jim will follow up with the Academy as to when the best time for the reception is, but I would ask you to hold the three days because we may also have some subcommittee meetings or -- just to give us a little flexibility.

And we have narrowed down dates for the other three meetings: we will have either a February or a March meeting, and we have dates for that, as well as for a September meeting and a November meeting. We have taken into account HL7 -- at least from what they have planned, they don't have conflicting dates -- as well as the APHA, the AHSR, and the major secular and religious holidays, whatever.

But anyway, if you are able to fill this out today, that would be great and I believe that Deena is collecting them. I realize that some of you may not be able to. So, get it back to Kathy Jones at NCHS. In fact, we could e-mail it to them.

We will e-mail it to you and then you can fill it out and then we will select the dates. Because, as you know, many groups do select their dates well in advance and we want to have ours on the table.

DR. STARFIELD: How soon will we have decisions?

MS. GREENBERG: I would hope that if you can't fill it out here, you could do it as soon as you get back to your office and send it back within a week.

DR. LUMPKIN: Thank you.

We will take a break until 10:30.

[Brief recess.]

DR. LUMPKIN: The next item on the agenda is our panel discussion of data for measuring quality of care. We have two presenters and we have Dr. Gregg Meyer, who is director for the Center for Quality Measurement at AHCPR. We have Dr. Steven Clauser, who is director of Quality Measurement and Health Assessment Group with HCFA.

Agenda Item: Panel Discussion on Data for Measuring Quality of Care

DR. MEYER: Thank you for the opportunity to address the committee. I wanted to start by putting just a little bit in context of a conversation I had with Kathryn Coltin regarding what she was looking for from this presentation. She asked me to address three main issues.

The first is describe the data limitations we have experienced in trying to measure quality of care particularly in the systems that we and our researchers work with.

The second and third are actually the "three wishes" questions. The second was: if you could change three things with respect to how administrative data are currently captured, what would they be?

The final one is if you could capture three additional types of information on administrative, encounter or enrollment data, what would those be? I am going to use those to guide my comments here.

The first point, in terms of looking at what the agency is trying to do on quality: I think our quality agenda writ large is quite clear, and anyone can get access to that by looking at the President's Quality Commission Report. So, if you look at the very specific aims for quality improvement for the nation, some of them really are an important component of our basic quality agenda at AHCPR.

Those include expanding research on treatments and effectiveness, and that is done through ongoing work with our own projects and new work that we are doing in translating research into practice. Also, a new focus on reducing health care errors: the agency has put together a task force from across the agency to try to see how we engage that process, which is moving quite quickly in the private sector, and how we can play an important and supportive role there.

In terms of quality measurement reporting, one of the first things the President's Commission said was to identify core measures and standardize reporting -- and I would like to get back to that at the close of my comments -- and to support industry measure development. And we are doing quite a bit on our own in terms of measure development.

We have just released an RFA last January and now we are evaluating responses to that to look at quality measurements for vulnerable populations and usable comparative information for consumers. So, not just thinking about how to measure it but how to report it out to achieve the goals that we have in terms of quality improvement. And finally building the capacity to improve quality.

In terms of quality measurement, we have to think about the different information needs, and for us the main point is that we are trying to provide information to our patients, our purchasers and clinicians. This is very important here: if we talk about what the NCVHS could potentially do to improve quality measurement, we have different goals for measurement. Those goals include measurement for improvement, for example, at an individual clinician level; measurement for comparison, for a consumer making a choice between health plans or perhaps a purchaser making a choice about whom to contract with; and measurement for accountability, for example, for the government to look across health programs and take a look at the performance of various contractors, health plans and other providers.

To accomplish these, we rely on different sources. The first source is administrative data, and the difficulty with administrative data that we constantly face is that we are always trying to make that silk purse -- that quality purse -- from the reimbursement ear, from a system that was built around reimbursement.

Second is the use of clinical data and the good thing about clinical data is that it is much richer than the administrative data. The bad thing about it is that if you have seen one chart -- and I have gone through this personally as a researcher, looking at charts from across the country, if you have seen one chart, you have seen one chart and that the organization of charts and what is contained in charts varies very widely.

Finally, there is the issue of using survey data. Survey data is often the port in what I would call the data paucity storm: whenever we are looking for data on quality, it is not captured in administrative data, and it is either unobtainable in charts or it would be difficult or expensive to get.

The obvious fallback is, let's use a survey to capture this. The good news about that is that it provides a mechanism to do it. The bad news is that there is an increasing issue with respondent burden, and Kathryn Coltin can speak for many minutes to us about how much that burden plays out in terms of what health plans have to do and what we are asking our patients and the subjects in our studies to do.

So, although it is a short term solution, it is not the best solution over the long haul. The basic questions that we want to be able to answer through our research at AHCPR are relatively simple and if you take a step back from them, these are ones that all of you should be able to answer about your own programs or what you are doing in your careers and your own work.

Are things getting better or worse? Simple question. What methods do we have that are available to answer the question? What can we do to provide measures for decision-making at various levels, and what do we need to change to drive quality improvement? Very simple issues. The truth of the matter is, as all of you recognize, that current data availability and the way that data are organized right now prevent us from answering these on a large scale.

This is complicated by the fact that there is a change in the way that we measure quality. Quality in the past was relatively simple. When I was a house officer, I learned that quality health care was what the best attending physician provided. If you looked at what he or she did and you behaved in that manner and took care of patients that way, that was high quality health care.

Quality was also the result of an individual effort. You had a really high quality doctor or a high quality hospital. We have come to realize that quality is much more complicated than that and that quality now comes from multiple sources -- the whole notion of evidence-based medicine, getting all these different sources to come together to define quality. It is not just the result of an individual, but the result of an individual in a system. And I would argue to you that most of our data systems, in terms of being able to measure quality, are built around those past notions of quality: we have built systems that, if they can measure quality at all, measure it in these former ways, looking at knowledge from individuals as opposed to looking at systems.

Specifically, what we want to be able to achieve and where we are having difficulties: first of all, there is a lack of comparability across sectors and geography. This is something that we at AHCPR are going to need to face head on because we have a call from the Secretary's Quality Initiative to work with our colleagues at the CDC to develop a National Health Care Quality Report to look at the quality of health care delivery in the United States.

To be able to do that, we need to be able to look at population-based quality data, and there is an extreme paucity of that data right now. We need help from you in advising us in terms of what data sources are developing or what work the NCVHS could do to help us in that quest.

The second is that there is a lack of a core measurement set, and Steve Clauser and I faced this issue head on about nine months ago at one of the first meetings of the Quality Interagency Coordinating Task Force Work Group on Quality Measures, which we co-chair. In that group we brought up the issue of whether or not the task force should consider in its mandate developing a core set of quality measures. There was a very, very clear consensus, I think, that reflected the opinion of everyone in the room, and that is that we should really wait and see what the private sector is going to do.

In that vein, what we are doing now is we are keeping our eye on what is going on with the Health Care Forum to see if they are going to help us out of the core measurement set morass.

Finally, there is a lack of the data elements that we need for use in quality measurement. For example, if you go through the HEDIS measurement system, HEDIS 2000, and think about writing the code for those individual performance measures, what you will very quickly encounter is the fact that our data systems, our data code sets, have not kept up with the need to measure quality.

Specifically, there is a measure that is going to be integrated looking at the treatment of patients with chronic asthma and whether or not they are on certain medications like steroids and leukotriene inhibitors. We don't have a separate ICD-9 code for chronic asthma.

Without that code, it is very difficult to do that with an administrative set. Doing that on a population would be all the more difficult.
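To make the coding gap concrete, here is a minimal sketch (not actual HEDIS specification logic; the diagnosis codes, field names and visit threshold are illustrative assumptions) of computing a HEDIS-style "appropriate asthma medications" rate from administrative claims. Because ICD-9 has no chronic-asthma code, the denominator must be approximated by a proxy such as repeated asthma claims.

```python
# Hypothetical sketch of a HEDIS-style asthma medication measure computed
# from administrative claims. Codes, field names and the visit threshold
# are illustrative, not actual HEDIS specifications.

ASTHMA_DX = {"493.00", "493.90"}          # ICD-9 asthma codes; none is chronic-specific
CONTROLLER_MEDS = {"inhaled_steroid", "leukotriene_inhibitor"}

def persistent_asthma_members(claims, min_visits=2):
    """Proxy denominator: members with >= min_visits asthma claims in the
    measurement year, since ICD-9 cannot directly flag chronic asthma."""
    counts = {}
    for c in claims:
        if c["dx"] in ASTHMA_DX:
            counts[c["member"]] = counts.get(c["member"], 0) + 1
    return {m for m, n in counts.items() if n >= min_visits}

def medication_rate(claims, pharmacy, min_visits=2):
    """Share of the proxy denominator with a controller-medication fill."""
    denom = persistent_asthma_members(claims, min_visits)
    if not denom:
        return None
    numer = {p["member"] for p in pharmacy
             if p["drug_class"] in CONTROLLER_MEDS and p["member"] in denom}
    return len(numer) / len(denom)
```

A dedicated chronic-asthma code would replace the frequency proxy in the denominator with a direct diagnosis test, which is the point of the wish.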

All of you know the myriad of concerns that we have in terms of data privacy and we are specifically instructed not to touch on that, but there are other issues in terms of standardization versus flexibility, redundant effort, measuring everything once and the release and publication of results, something that we touched on a bit earlier, global versus specific measures and the public sector role versus the private sector role.

I think your guidance on where the agency should be going on all these issues would be extraordinarily helpful.

A couple of other final data needs that we have before we get to my three wishes. The first is the ability to integrate patient care data with system data. As I said before, under the old definition of quality, when you looked at individual efforts, it was much easier to measure; but when you recognize that quality is not just an individual but an individual working within a system, you need to integrate the data on both the individual's performance and the system.

Without that ability, we really don't have the ability to measure quality. We also need to be able to cut and reassemble data for new purposes. You heard this morning about some conversations regarding the rates in health issues that we are trying to address at a department level. We need to be able to cut the data. We need to be able to make those data cuts in terms of measuring quality. Without it, our ability to close those gaps is going to be extraordinarily limited.

Value for measurement: when is the investment in measurement worth what we are getting from that quality measurement activity? Again, a very acute issue in the public sector. This is something that HCFA faces constantly; there is a lot of push back and forth. The OASIS data set, I think, would be an excellent example of that.

In the private sector, at the National Committee for Quality Assurance, this is a constant issue. What is the value of this measurement? What are we going to get from adding the extra dollars to do this next survey or to try to improve our data collection systems to capture this new performance measure?

Integrating data collection into work: again, this is something where we at the agency are trying to support some grants to start to look at how this can be done -- how to make quality improvement or quality measurement part of what you do every day, so that you are capturing the data on a real time basis. And then, finally, considering presentation formats for stakeholders in designing data elements. One of the key points I would like to make to this committee is to think very long and hard about what the data elements, when they are finally assembled into information that is going to be presented to various stakeholders, should look like to convey that message.

I will give you something from our research which to me was incredibly startling. A group of patients was asked to rate their health plan on a five point scale, using a standard question going from very good to pretty poor. Patients went ahead and came up with those ratings on two health plans, Health Plan A and Health Plan B.

Another group of consumers was shown that quality information on Health Plan A and Health Plan B on that five point scale, and they then made a choice: which health plan would you go with? The researchers then took the exact same data from that five point scale and collapsed it down to three: this health plan is good, it is okay, it is not so good. Exact same data, just presented differently. They went back to the group of consumers and said, which one would you choose, Health Plan A or Health Plan B? Fifty percent of the consumers changed their choice. Same data -- there is no difference in the data -- but the presentation was different.

So, I would ask you to always consider what the presentation will be of the data elements that you are speaking about in this committee.
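The collapsing described here is just a recoding: the underlying ratings never change, only the categories shown to consumers. A small sketch makes that visible (the particular 1-2 / 3 / 4-5 grouping below is an assumption for illustration; the study's actual grouping is not specified in the transcript).

```python
# Illustrative recoding of five-point plan ratings into three display
# categories. The grouping chosen (1-2, 3, 4-5) is an assumption; the
# point is that the data are identical under both presentations.

FIVE_TO_THREE = {5: "good", 4: "good", 3: "okay",
                 2: "not so good", 1: "not so good"}

def distribution(ratings, mapping=None):
    """Tally ratings, optionally recoding each one through `mapping`."""
    tally = {}
    for r in ratings:
        key = mapping[r] if mapping else r
        tally[key] = tally.get(key, 0) + 1
    return tally

plan_a = [5, 4, 4, 3, 2]                         # hypothetical ratings
five_point = distribution(plan_a)                # shown on five points
three_point = distribution(plan_a, FIVE_TO_THREE)  # same data, collapsed
```

Both tallies summarize the identical five ratings, yet a consumer comparing plans sees a different-looking picture in each format.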

I would like to close with my three wishes that were granted me. And the first three wishes have to do with administrative data changes. These come from a quick poll of people at the AHCPR. What are the three things that we would like to change? And I talked to both researchers and people involved primarily on the extramural side as well.

The first one was codes that capture performance measures and I spoke about that earlier, that we need to have coding systems that are going to be able to capture the nuances that have to be reflected in quality of care measures; for example, chronic versus other types of asthma; for example, conditions in children versus conditions in adults.

The second is standard linkages. The ability, for example, to measure performance in asthma care requires not only identifying the patient with chronic asthma but also linking that patient with the medications they are taking. We need the ability to use a standard linkage to hook up laboratory data, facility data, provider data, resource data. We need to be able to pull those together with some kind of common key.

Then finally, you need codes that allow you to follow patients across the continuum of care -- across both time, to follow them out over time, and space, over geography, over different facilities, over different levels of care.
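The second and third wishes can be sketched together: if laboratory, claims and other records all carried the same patient key, assembling a longitudinal view across settings would reduce to a join plus a sort. The field names and key format here are hypothetical, purely to illustrate the idea of a common key.

```python
# Hypothetical sketch of linking records from separate data sources on a
# shared patient key, then ordering them by date to follow the patient
# across settings over time. Field names are illustrative only.

def link_records(key, *sources):
    """Pull every record matching `key` from each source and return them
    in chronological order, regardless of the originating system."""
    linked = [rec for source in sources for rec in source
              if rec["patient_key"] == key]
    return sorted(linked, key=lambda rec: rec["date"])

labs = [{"patient_key": "P1", "date": "1999-01-05", "event": "HbA1c 8.1"}]
claims = [{"patient_key": "P1", "date": "1999-03-02", "event": "office visit"},
          {"patient_key": "P2", "date": "1999-02-01", "event": "ER visit"}]

history = link_records("P1", labs, claims)
# history holds P1's lab result followed by the later office visit
```

Without the shared key, each of those joins becomes a probabilistic match on names and dates, which is exactly the fragility the wish is meant to remove.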

In terms of the additions wish list, we came up with three. The first and foremost is a measure of functional status. Our ability to measure quality meaningfully is hampered by our inability to capture data across a population on their functional status. That would be first and foremost on our list.

The second is a short list of tailored lab and radiology results dictated by conditions. One of the things that is quite remarkable is to step back and look at the quality improvement efforts of the Department of Veterans Affairs, the Department of Defense and the Health Care Financing Administration. If you look at their lists of key conditions, you will see extraordinary overlap. Diabetes is on all three lists. Mental health in some form is on two of the lists.

Cancer is on two of the lists. Cardiovascular disease is on all three lists. There is a lot of overlap there -- different processes targeted at the same group of conditions. One thing that would be very, very helpful is if, for that short list of conditions on which there does seem to be some general agreement, we were able to get some specific clinical information, captured through electronic means, that would allow a broad look at quality across a number of providers.

The final is standard integrated surveys -- surveys, again, that are keyed to the other forms of data: to the electronic medical record and to the administrative claims and encounter data, where we can tie individuals' responses to their experience in their health care system. I think some of the work that HCFA has done with CAHPS goes a long way in terms of making a down payment on this process, but we need to see that with other survey measures as well.

With that, I will stop.

DR. LUMPKIN: I just have one comment before we go to Steve Clauser and that is that I think all of us found your comments about that survey very interesting. If you have anything written up on that that you could provide to the committee --

DR. MEYER: In terms of --

DR. LUMPKIN: The scale.

DR. MEYER: The scale. It is actually one of our grantees and that was at a -- I actually can give you some tables from a conference that we held last December, but it is not yet published. So, I think it would probably be best to wait a few months.

It was truly striking. It was one of those moments where you kind of just sat back in the chair.

DR. LUMPKIN: Steve Clauser.

DR. CLAUSER: Thank you very much.

I appreciate the opportunity to be here this morning and have a chance to comment on some of the challenges that this committee has been facing on addressing issues of quality of data and particularly as it relates to development of performance measures.

As you know, at HCFA we are quite involved in the work that is taking place in this committee, and we really do applaud your efforts to keep a continuing national focus on data quality, particularly, from our point of view, because of its importance in improving the care that we provide our beneficiaries under Medicare and Medicaid.

We really do appreciate the opportunity to help you with your work. I want to start today by taking a little different tack from what Gregg did and talk about this from the notion of a purchaser's perspective: talk very briefly about the importance of performance measurement in what we do, because I think understanding the different uses of performance measurement really does help ground some discussions about data issues, and then talk about some of the specific initiatives that we have to improve data and some suggestions, I think, for improving the completeness of data collected.

We do consider performance measurement a foundation for our evolving purchasing strategy in trying to improve the health of the beneficiaries we serve. This notion of value-based purchasing, which is used rather loosely, at least involves some notion of measuring both cost and quality, and we are committed to improving our data systems in terms of the clinical information they provide, to get a better understanding of quality of care for both fee-for-service and managed care.

Operationally, it really falls out into three main activities. First, we are beginning to use performance measurement to try to facilitate what we call better monitoring and enforcement of clinical standards of quality. For example, we have major developmental efforts going on to test the utility of clinical indicators and performance measures in improving and streamlining our nursing home survey process.

We are also very much involved in beginning a process of working with the end stage renal disease community to develop measures that will improve survey procedures for dialysis facilities, and we are beginning an initiative to evaluate the utility of HEDIS measures to assist our regional offices in managed care oversight.

I think that through projects like this, we aim to use performance measurement as a means of doing a better job of assuring the public and our beneficiaries that the care provided by the providers and plans that participate in Medicare and Medicaid represents an acceptable level of quality, at least as represented in our Medicare conditions of participation and requirements.

The other thing I want to talk about is that we are also using performance measurement to support our quality improvement initiatives and this is true in both our managed care and fee-for-service lines of business. These initiatives really are cooperative projects with health plans and providers to really try to raise the bar of quality based on the use of best practice and using performance measurement and quality data to do baseline measurement, intervention and follow-up work.

Our primary activity in the fee-for-service arena is through our health care quality improvement program, which is part of our peer review organization scope of work, where we have six national quality improvement projects that are designed to improve care for Medicare beneficiaries with diabetes, heart failure, acute myocardial infarction, pneumonia, breast cancer and atrial fibrillation. We have developed a series of performance measures, both process and outcome, that we are going to literally hold the PROs accountable for improving in their states through performance-based contracting incentives.

Also, as part of our quality improvement system for managed care, which is referred to as QISMC, health plans that participate in our Medicare Plus Choice program must participate in one national quality improvement project and in one project of their own choosing, drawn from a broad list of priority areas that are established in QISMC.

These standards are developed for both Medicare and Medicaid programs. Basically, the first project that will be worked on this year is in the area of diabetes.

Finally, we are beginning to use performance measurement -- and I think Gregg commented on this -- as part of a public information campaign, really to assist beneficiaries in making choices about the plans and providers that they seek care from in our programs. We have published the HEDIS 1996 data on our HCFA Web site and we will be publishing a few quality measures in our forthcoming Medicare Compare handbook. Those measures will be both managed care and fee-for-service comparisons. We are currently working feverishly on similar projects for reporting clinical effectiveness information in skilled nursing facilities and in renal dialysis facilities.

We hope to be ready with the information for skilled nursing facilities sometime next summer. I think the point is that to improve the quality of data you have to have a clear business case for wanting that information. One thing we have been trying to do at HCFA is to create a very clear business case for how we want to use quality information to improve the way our programs operate.

I think that also the message is because we are investing so heavily in this, we do share your concerns about improving the data and the quality of data that particularly support these operational activities.

Let me just step back for a minute and talk about a couple of things we are trying to do to improve the quality of data and its use in our performance measurement activity. First of all, as a purchaser, we are putting a lot more emphasis now on trying to standardize measures of care, as opposed to emphasizing the development of new measures and new indicators of care. One of our key efforts is to try to maximize the value of the information that is presently required and presently available in the health care community that supports these programs.

One example of that, I think, is the Diabetes Quality Improvement Project, called DQIP, which really exemplifies the strategy. DQIP was a project where the aim wasn't to develop new measures de novo, but to try to reach consensus across purchasers and providers on a common, uniform measure set that drew upon the strengths of other measurement efforts and past data collection experience.

We are pleased that this effort has been well received. The National Committee for Quality Assurance has adopted our DQIP measures in HEDIS 2000. Both the American Diabetes Association and the Foundation for Accountability have endorsed this measure set. We will be using DQIP as part of our peer review organization Health Care Quality Improvement Program, and several other federal agencies, including the Department of Veterans Affairs, have endorsed the DQIP measures.

We believe that through a standardization approach, data quality will be improved, if for no other reason than that health plans and providers will be using the same measures and the same specifications for all their lines of business, whether managed care or fee-for-service, public sector or private sector.

The second thing I want to talk a little bit about is that we are creating standardized data collection and reporting tools to support the measure sets that we endorse, based on what we call our MedQuest system. Basically, if you know how a computer works, it is like the little wizard that helps you navigate through databases.

The tools, which have been developed over a process that I think has gone on for more than ten years, really facilitate much better control on the front end of measurement collection and reporting by creating algorithms to help improve data collection at the point of service. We are also working on preparing analytical tools to complement the data collection set, so that we can enhance the value added of these data sets and measures for front end users at the point of service.
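One way to read "algorithms to help improve data collection at the point of service" is as front-end edit checks that flag implausible values before they enter the data set. The sketch below is a generic illustration of that idea, not MedQuest's actual logic; the field names and plausibility ranges are assumptions.

```python
# Generic sketch of front-end edit checks applied as data are entered,
# in the spirit of (but not taken from) MedQuest-style collection tools.
# Field names and ranges are illustrative assumptions.

RULES = {
    "hba1c": lambda v: 3.0 <= v <= 20.0,      # plausible HbA1c percent
    "systolic_bp": lambda v: 50 <= v <= 300,  # plausible mmHg reading
}

def validate(record):
    """Return the field names whose values fail their plausibility rule;
    fields with no rule, or absent fields, are not flagged."""
    return [field for field, ok in RULES.items()
            if field in record and not ok(record[field])]

errors = validate({"hba1c": 42.0, "systolic_bp": 120})
# the HbA1c entry fails its range check and would be flagged for correction
```

Catching the error at entry time, rather than in a downstream audit, is what makes such checks a data quality tool rather than just a reporting tool.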

All these modules that we have developed to date have been tested for reliability and validity. MedQuest itself will be required for all performance measurement initiatives that the peer review organizations will use in the six national projects that I talked about earlier and these tools will be made available free of charge to providers and plans who desire to participate in our quality improvement initiatives.

We are in the process of developing MedQuest modules for the minimum data set for nursing homes, for the Outcome and Assessment Information Set for home health, and for our ESRD core indicator set -- again, to build upon this platform to create more uniformity and standardization in the way data are collected and specified in our performance measurement activities. We think the benefit of what we call this freeware approach to tool development is that standardized collection and reporting tools really do two things to improve quality.

One is that these tools and specifications are in the public domain, which allows clear scrutiny of the measures themselves and of the analytical logic of the data collection and reporting protocols. It really starts moving us away from some of the black boxes that often occur in proprietary measure sets, which I think have inhibited to some extent our ability to standardize these kinds of reporting efforts on a national basis.

These tools and specifications are also publicly copyrighted, which reduces the cost of improving information systems, since they are free of charge, while maintaining quality control through a coordinated process that will improve tool refinement over time.

The third thing we are doing is that we are committed to auditing performance measurement data in both managed care and fee-for-service. For example, over the past three years we have been very involved in working with the National Committee for Quality Assurance to improve audit standards for managed care plans that report clinical HEDIS data, and when we compared our findings from our 1998 audit to the 1997 audit, we were very encouraged by the progress plans had made in improving both the accuracy and completeness of Medicare HEDIS data. Improvement occurred in such areas as eye exams for people with diabetes, our breast cancer screening measure and our beta blocker after heart attack measure.

We believe that the HEDIS audit findings and the feedback to plans really contributed to improving the quality of the data and the reporting of these measures. So, as a result, we are continuing to require an audit of selected HEDIS measures. We are also in the process of working with ten plans to get more at the accuracy issue by looking at the connection between data that is actually in the medical record and tracing it directly through to what is actually reported, to see how we can improve that process.

We are also going to have audit protocols as part of our other clinical data sets that have come on line, both the minimum data set for nursing homes and the OASIS system for home health.

Now, briefly, talking a little bit about the challenges for the future: there clearly are several activities underway to improve the standardization of data across purchasers and systems of care. I think the national committee's subcommittee work on developing standards for computerized patient records, and some of the work that is being done on the analysis of Medicaid managed care data collection, are going a long way toward improving that effort and really assisting at least us in our data quality and performance measurement activities.

Gregg talked a lot about some of the particular data items that are of primary concern and I kind of endorse most of his wish list with one exception. I think the issue for functional status measures is not that we don't have it. It is that we have more than one. That is our problem. Right, Vince.

I never forgot our last meeting.

What I want to talk a little bit about is how we enhance the value of what we are already requiring, because one of the issues we see is trying to enhance at least the completeness of the data that are reported, so we can do a much better job of understanding where the gaps are in data accuracy and data validity. That will allow us to move forward in trying to enhance data collection efforts of the type that Gregg mentioned.

One of the challenges we see is that the introduction of new payment methodologies at HCFA will have a direct impact on the availability and quality of clinical data. In addition to improving quality of care through the initiatives that I mentioned, clinically focused data are becoming important for developing risk adjustment models to support prospective payment systems, for enhancing our capitation payment under Medicare Plus Choice, and for providing depth and validation in analyses of administrative data for program integrity.

Now, the good news is that when we tie the quality of data to payment, the completeness of the reporting should improve. Our analysis of HEDIS data suggests that how health plans pay for a particular service has a great effect on data completeness. Not surprisingly, data capture rates were much better under fee-for-service payment arrangements than they were where providers were capitated.

If these new payment systems are implemented, however, there will be new challenges in understanding this data completeness issue. One is trying to differentiate between better coding and what we call coding creep in clinical data.

Our experience in the Medicare skilled nursing facility demonstrations -- and think back to the hospital prospective payment system -- has been that once clinical data are used for payment purposes, case mix all of a sudden increases. Part of this clearly is due to more complete and better coding, but part of it may be due to coding manipulations in order to maximize payment.

It is interesting that in our experience auditing HEDIS data -- and this is where we do not tie clinical data to payment -- plans are just as likely to underreport as to overreport when we look at errors. Now, how this will change under Medicare Plus Choice and as we move to post-acute care PPS data systems really needs to be better understood, and our ability to distinguish between these two factors, better coding and coding creep, will hopefully help our understanding of what the data mean and the limitations of their uses.
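The distinction between genuinely better coding and payment-driven "coding creep" is often screened for with a simple before/after comparison of a case-mix index once clinical data begin driving payment: a jump larger than any plausible change in the underlying population warrants audit follow-up. The weights, case categories and threshold in this sketch are invented for illustration, not HCFA's actual methodology.

```python
# Illustrative screen for coding creep: compare the average case-mix
# weight before and after clinical data are tied to payment. Weights,
# categories and the flag threshold are invented for this sketch.

WEIGHTS = {"uncomplicated": 1.0, "with_complications": 1.6}

def case_mix_index(cases):
    """Average payment weight across a list of coded cases."""
    return sum(WEIGHTS[c] for c in cases) / len(cases)

def creep_flag(before, after, threshold=0.10):
    """Flag if the index rose more than `threshold` (10%) year over year,
    returning (flagged, relative change rounded to 3 places)."""
    change = case_mix_index(after) / case_mix_index(before) - 1.0
    return change > threshold, round(change, 3)

pre_pps = ["uncomplicated"] * 8 + ["with_complications"] * 2
post_pps = ["uncomplicated"] * 5 + ["with_complications"] * 5

flagged, change = creep_flag(pre_pps, post_pps)
# the roughly 16% jump exceeds the 10% screen, so it warrants follow-up
```

A screen like this cannot by itself separate better coding from manipulation; as the speaker notes, that separation requires record-level audit of the kind done for the HEDIS measures.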

A second challenge will be how to improve the completeness of quality of care data that are not directly related to payment. Clearly, part of this is better auditing tools and specific quality initiatives tied to these data, as we are doing in our peer review organization program, but I think a couple of things are very important.

One is that we need to think not only of enhancing ICD-9 -- the issue of asthma was a good example -- but also of enhancing the way we approach the specification of the coding systems themselves for administrative data, which are largely procedure driven and largely payment driven. We should be thinking a little bit about those coding systems for other uses.

As we start running into the realities of building encounter coding systems and matching them, at least in Medicare, to our fee-for-service system, those are issues on which we will be looking for leadership, and I think it is something that this committee can provide.

I also think that we should seriously look at the role of public reporting of data in and of itself as a basis to enhance the completeness of reporting. I think some of our experience in the past, particularly the experience in Pennsylvania and in New York, has been that as data moves to the public domain, people take that activity much more seriously.

I think new data systems and new data elements are very important for the future, and I think some of the wishes that Gregg laid up there are things that I would heartily endorse. But I hope we also, in the near term, as we think of quality of data, look to maximize our investment in the clinical data we are already requiring, because I think this is going to be extremely important for us to make that business case, that collecting clinical data is important to improving the nation's health. I hope that also is a very important part of your mission as you move forward to provide guidance literally to the nation on these issues.

DR. MC DONALD: Can we ask the same question about getting -- you have described a data set in your talk. Can we get copies of that for diabetes?

DR. CLAUSER: Oh, yes, the Diabetes Quality Improvement. Yes, I can provide that.

DR. LUMPKIN: Kathy, did you want to make any comments before we went to general discussion questions?

MS. COLTIN: Only that I appreciate your raising just about all of the issues that have been raised in our work group and in other forums where I have participated. I think that there are a number of things that you have mentioned that have been of interest to committee members here, some of the issues around survey integration, for instance, that Gregg brought up and that HCFA has done some work with around the MCBS, for instance, linking survey information with administrative claims.

The kind of capabilities that you have had for data in the Medicare population have actually been quite broad compared with the kinds of capabilities we have for other populations. I wondered if you might talk a little bit more about that, particularly thinking about the Medicaid population, or thinking about what we might do to be able to develop some of the same kinds of capabilities that you have been able to develop for the Medicare population.

DR. CLAUSER: That is a challenge because state Medicaid programs have, you know, inherent administrative responsibilities in terms of the relationship between HCFA and the states. The one thing that I hear, in discussion at least with some of the state Medicaid directors, is a real interest in standardization of data, and one of the difficulties is that how program design differs in, you know, the public and private sectors creates real challenges in trying to standardize information across payers.

For example, in many states, they have enrollment requirements where they need enrollment limitations of six month, seven month or ten month cycles, where in most of the work that we do, for example, at NCQA on the Committee on Performance Measurement, everything is a minimum of twelve months. Those kinds of issues, just in terms of the enrollment realities of how the programs differ, present some real challenges. But beyond that, I think there is a real desire in areas of common interest to try to standardize some of those measurement activities to the extent they can, and I think it is important to try to open up a dialogue with, you know, the state Medicaid director representatives to try to see where that common ground is.

DR. MEYER: Two quick points. One is that this actually is an issue that is raised in multiple fora, and I have had people literally stand up in a meeting and say, where is health care quality for the rest of us, outside of the Medicare population, because of the really wonderful data that is available on the Medicare side, compared to the paucity of data for the rest of the population.

Tomorrow you are going to hear about the Nationwide Inpatient Sample from Anne Elixhauser and she is going to talk a little bit about the project at AHCPR, which on the administrative data side goes in that direction for hospital care, with plans to expand that to emergency room care and ambulatory surgical center care as more and more states start to collect that data. So, that ability is growing.

With that said, short of a major push on the equivalent of a PRO type program or quality improvement organization type program on the general population, the real impetus to move these measures out and to collect this data on the rest of the population is going to be lacking. So, that is a key direction that we want to push in.

DR. STARFIELD: I appreciate your presentations and I especially appreciate, Gregg, your recognition of the need to get to a population focus. But I am thinking, you know, if I closed my eyes and listened to you, I would think I was back in the 1950s or 1960s, because you are using the same paradigm for quality as Paul Lembcke used and Mildred Morehead used in her OEO evaluations, the disease by disease approach to quality of care for populations. It doesn't work. You are never going to get to populations if you do it by diabetes, hypertension, et cetera, et cetera. And you are not even going to do it if you do it patient by patient.

So, we have got to start breaking out of that mold and getting to something that really does get at populations. I think we really just have to start thinking about why the old approach, the 50 year old approach, didn't work. I think it didn't work because we didn't focus on improving data to do a better job of conceptualizing quality, you know, using our capacity now to collect data and to analyze data that we didn't have then, but we have it now.

Let me just, you know, make some observations on your second or third slide on the national aims for improvement. The second one is expand research on treatments and effectiveness. What you haven't got in there is the fact that we don't have a good basis to even conceptualize the research because most research we have on treatment effectiveness is generalizable to maybe 10 percent of the population.

You know, you have people who are excluded from randomized clinical trials because they have got comorbidity, and most people have comorbidity. Okay? And then only 15 percent of the eligibles were on here.

That was a problem we had 50 years ago with the guidelines that Mildred Morehead used. You know, they just really weren't generalizable. So, we have got to do something about thinking about how you conceptualize effectiveness and particularly the whole notion of harm. You have here reduce health care errors, but a lot of harm isn't due to errors.

I am reminded of the article, the deaths from drugs, where they, I think, relatively reliably estimated that the fourth to the sixth leading cause of death is harm from drugs and they are not errors necessarily. They are inadvertent harm.

So, I think we -- you know, I wrote about this rather extensively in an editorial in JAMA last September. It was called Internal Elegance and External Relevance and I think we have to get off this internal elegance and get to external relevance.

You didn't mention, for example, reduction in disparities or elimination of disparities across population groups, which is a national goal. So, that, at least, ought to be added to your list. So, we could go on at great length. I think there is a lot this committee would like to work with you on in both agencies.

DR. MEYER: I do think, actually, some of those were addressed, but perhaps not as directly as I should have, in particular the need to be able to cut and reassemble data in multiple ways to get at exactly the issue you were talking about, which is moving away from condition-based or procedure-based measures of quality and looking at the entire population.

That is why I think one of the most exciting aspects of the notion of trying to start from ground zero and think about how to develop a national health care -- quality of health care delivery system report involving both the CDC and AHCPR is so exciting because that allows us to start to think about the population-based focus instead of condition by condition.

In terms of eliminating race and health disparities, absolutely, a key issue and, again, it gets to this notion of being able to cut the data, having the data available to make those measurements across populations.

Finally, in terms of expanding research and treatment effectiveness, I couldn't be in more agreement with you in terms of the fact that if you look at most guidelines, they apply to very, very thin slices of the population and this is -- I think was best articulated by some colleagues at the agency, who started to talk about how we can develop an initiative to address people like me, to look at not the individual that fits into one of these randomized controlled trials or would be the type of person that would be followed in a PORT study, but the average person, who may have prostate disease and who may also have four or five other conditions or who may have a single condition or who is interested in wellness.

So, I think that is a major challenge to us as an agency to rethink that way. It is on our radar screen and your guidance would be helpful.

DR. CLAUSER: I would just like to comment on that a little bit because I think that we need to pursue both. I disagree that in all respects a disease-based approach has been a failure. We have pretty good documentation within the peer review program, for example, in AMI, where at least on some process measures, we have really enhanced the pretty well evidence-based clinical process measures that definitely have a correlation in terms of reducing mortality.

I agree, though, that the broader objective of trying to move to more of an outcome-based, population-based measure and measurement strategy is really kind of the next frontier. From our own point of view in managed care, we have implemented the Health Outcome Survey, which is a population-based measure at least for managed care plans. We are looking at improving the physical and mental health status of our Medicare beneficiaries. We did the baseline data collection last year and next year we are going to do the follow-up measure.

We are working very feverishly with the managed care industry to try to develop interventions that can improve care on a population basis, measuring against some of these broader outcome measures. The challenge there from our point of view is the issue of what we can legitimately hold a health plan accountable for in terms of improvement as a basis for trying to understand our relationship between the health plan and the purchaser and trying to drive quality.

That has been a real challenge as we move to population-based measures.

DR. MC DONALD: I just was concerned a little bit that both of you described the systems sort of from an outsider's view in terms of data; that is, you are looking through the peephole of almost a little capillary thread of data that flows from the ICD-9 codes or the CPT-4 codes. It sounded like it is a chart review process these quality measures go through, and it also sounds like most of them are derived measures, and if you step down a layer and look at the raw data on which they depend, it could be less work for everyone, given the way systems are evolving.

You mentioned the asthma, chronic asthma. There are surrogates for that. It is much more -- and continuous variables for hemoglobin A1C, I am sure, must be in the diabetes one. But, well, you don't have to have that all coded because you have the raw data, and I think -- as a general internist trying to see patients and then do extra stuff, it is like, you know, it doesn't fit. There is no physics there to do the things. So, if we are going to have extra work, we are going to break the system.

But most of the data you want -- a lot of it is being recorded. It is just not in the form you want. I just kind of urge you to sort of see about stepping down to a lower level and just yanking the raw data, because in the short term it will be easier to do that than to have people reviewing charts and figuring out how to arrive at it.

DR. MEYER: The notion of capturing data in the routine of care is absolutely --

DR. MC DONALD: Well, that is not what I am saying. I mean, it may not be what I am saying. That is, having the physician ask one or more questions --

DR. MEYER: I am saying that it is all -- without having to do a separate step, without an extra form to fill out.

DR. FRIEDMAN: I have a couple of different comments, one of which -- obviously, developing mechanisms for presenting the data to the public in a useful way is a huge challenge, and the Medicare Compare Web site, where the CAHPS data are presented -- if folks have not looked at that, I really highly recommend it, at the risk of sounding like a shill for HCFA. I mean it is really --

DR. MEYER: We really appreciate that.

DR. FRIEDMAN: It is essentially a one click means of comparing health plans at the zip code level. You basically go in. You put in your zip code and you have got some really nice graphs. And it is -- simplicity, I think, in these presentations is essential and that is a really nice clear site.

Secondly, like Barbara, I certainly applaud Dr. Meyer's emphasis on population-based quality data, and one of the things, though, that I found interesting is that we are talking about the sources of data. You list administrative data, clinical data and survey data. At least from my point of view, in public health, one of the things that is sort of conspicuously left out there is public health surveillance data, which I think both in and of itself, as well as in conjunction with either administrative or clinical data, both now, even without any kind of unique identifiers, or possibly in the future, presents a really rich source for quality data.

I mean, certainly in terms of the Medicare-SEER linkage, certainly in terms of the BRFSS, every state now essentially has cancer incidence data and every state, obviously, has birth data, et cetera, et cetera, et cetera. And there is a lot there. I think it would be really helpful if we started thinking of that as an additional source of population-based quality data as well.

DR. MEYER: We are actually looking to that source as a key part of any quality report that is eventually produced. The focus I took in terms of survey data wasn't meant to be population-based data; it was to follow the question I was asked to address.

DR. LUMPKIN: I have so far that the order will be Dan, Paul, Vince and Jeff.

DR. NEWACHECK: My question is for Dr. Meyer. When you were presenting your slides on the national aims for quality measurement, you briefly mentioned vulnerable populations and Barbara brought it up in the context of disparities or looking at disparities between populations or among populations.

Can you tell us what vulnerable populations you are considering looking at and also whether there will be an attempt to provide data on those vulnerable populations across all of these systems, the administrative systems, the clinical data, the survey data and that -- what your thinking is there?

DR. MEYER: The definition for vulnerable populations and the one that we actually use in the RFA we published last January was lifted directly from the President's Quality Commission Report. When you read that, it is very broad. It not only includes race issues, but also includes ethnicity issues, includes some cultural competence issues and also chronic disease.

So, when I look across the responses that we had to those RFAs, the RFA that we published, there was a wide spectrum that included measures for children with chronic illness to measures aimed directly at looking at race and health disparities.

So, again, it is written very, very broadly and that is the one that we went with for our RFA because we thought it would generate the most interest.

The second part of your question, in terms of trying to get these included into data sets, that is really what we are looking to you for. As a research agency, we are able to help fund the development of measures and get the research community to start to pay attention to these groups in their research work, but in terms of getting them implemented out there on a large scale, that is where working with you and others and the private sector is most helpful.

DR. LUMPKIN: That breakdown includes socioeconomic status?

DR. MEYER: Yes, it does.

DR. NEWACHECK: And the uninsured as well?

DR. MEYER: I believe it includes the uninsured as well, although I would have to check the definition, but this was the definition right from the President's Quality Commission Report. I don't think it could get any broader.

DR. MOR: I am going to be a shill for the Health Care Financing Administration in this particular instance, because I think that the paradigm or the vision that Dr. Meyer outlined -- a multilevel view of having ongoing data to make determinations about quality, use of information from a multiplicity of sources -- actually exists. It exists for a very limited field, in the health services world, the nursing home context.

In addition to all the standard Medicare claims data that is available for people who are residents of, or moving in and out of, nursing homes for their Medicare use, there is information now collected on a quarterly basis containing a vast array of clinical and functional information. This information, once integrated with diagnostic and other information from the hospital or the skilled nursing facility, has enormous utility for quality improvement, for purchasing, and for accountability, all those purposes as well.

The Health Care Financing Administration is pushing the envelope to try to make those data real. In relation to Clem's comment about actually capturing the data in the normal course of activity: in the nursing home world -- and I think it probably applies in the acute care sector or the ambulatory care sector as well -- the application or the creation of this assessment instrument, or minimum data set, basically created a common language, and it is a common language not in the code set world, but actually in the parlance of what the record is.

So that at some point, the issue of wasting people's time to sit down and fill out silly forms or do data entry is going to, hopefully, go the way of other problems of data capture and the common information will be there to make it possible to do some aggregation.

Now, I am not going to say that this is a perfect system. I am in the midst of actually trying to pull together data on ten states -- every single nursing home resident that has ever been in those facilities in ten states -- matched with Medicare data, matched with the information on the nursing home as well as the hospitals as well as the area. You can pull all that together, but you will still end up asking why the rate of observed or reported pain is twice as high in one state as in another state. You know, there is some difference. What does it mean? How does it go?

But we also know that exactly the same thing is true for ICD-9 codes, and it is a question of moving that agenda along to try to understand something about what the sources of variation are, and not just saying it is lousy and it doesn't behave the way it should behave in some uniform manner and tossing it out. But I think the paradigm is there. It is unfortunately restricted right now to one particular sector, and maybe there will be another sector at some point.

But how to make it really population-based, the way Barbara would like it and the way I think we would all like it, is the real struggle. The Agency for Health Care Policy and Research had a Q-SPAN RFA and so on and so forth, and each one of those was a minor little sort of solution to try to look at some overlapping. I don't know that anything revolutionary will come out of those, but that is unfortunately where we stand.

But for this sector it is really possible, with all the attendant dangers of trying to push -- probably prematurely -- towards accountability and reporting in some way. But it still has a possibility.

DR. LUMPKIN: Thank you.

Any responses from the panel?

DR. CLAUSER: Thank you, Vince. No.

The other thing I was going to mention is that that vision is -- we are trying to move very quickly with a very similar vision, although we are not quite doing it the same way through the assessment and care planning tool, but to create a very detailed clinical automated database in the end stage renal disease community. We are moving extremely fast and working together with the nephrology community, the dialysis community and the beneficiary community to create that system.

We have a core indicator set, which we now put out nationally, which is population-based, to look at exactly the clinical care of ESRD patients at what is called a network level, which covers several states. We have worked very closely in trying to take that indicator set, and we have developed several performance measures that will be reported on a population basis. We will be talking about working with the networks, like we work with the peer review organizations, to try to stimulate quality improvement based on those population reports. And, like I mentioned earlier, we are moving with both the provider community and the beneficiary community and state survey agencies to begin to develop and drill down to dialysis facility specific reporting, based on that clinical data set, for use in internal quality improvement and for public information, both for survey agencies and for beneficiaries, whose lives literally depend on the quality of care they get in dialysis facilities.

DR. LUMPKIN: Jeff.

MR. BLAIR: I think you are probably aware that within the NCVHS, we have a work group, the CPR work group, which is focused on studying uniform data standards for patient medical record information. One of the focus areas within this work group is business case issues and I thought that I heard you mention that you go through a process before you do a survey or, I gather, set forth your budget for data gathering, where you do some form of a business case.

The implication that I got was that the benefits of quality have to somehow meet or exceed the cost of gathering the data. If you do have those types of business case studies or reports, are you able to share them with us?

DR. CLAUSER: The case that has been made largely to date -- and this is a continuing debate between the agency and the health care community generally -- has been largely focused on where the opportunities for improvement are, you know, based on evidence, and where we actually think we can drive quality because we have the intervention tools to do so. Or it has been largely based on legitimate quality of care concerns, as in the nursing home area, where clearly there was a public cry to get this information and integrate it directly into care planning and reporting.

Largely, the effort to date has been on trying to directly relate evidence-based quality improvement opportunities to the value of collecting the data. I think as we continue, we want to start looking at the relationship between cost and quality data and to try to begin to do that. We are only in the early stages of trying to figure out how to do that well. And it is a very complicated issue.

DR. LUMPKIN: Dr. Meyer.

DR. MEYER: One of the exercises I did when I first came to the agency, which I found extraordinarily valuable, was to revisit this notion of the fact that quality can cost less or quality has to cost less in a strict business case analysis.

The source I went to to kind of expand my thinking on this was the brochures from each of the Malcolm Baldrige Quality Award winners. What I found there, uniformly across all of them, is that it is very rare for any of them to make a single mention that there was a business case analysis done before they decided to make their investment and develop their system to reward quality.

This was a decision that was made because they felt quality was the right thing to do. Perhaps in the long run, it would be something they could sell. If you look at one of the most recent winners, Boeing, which got it for the development of their tanker systems and Air Force One, there is not a big market for Air Force One. They can sell one of those. So, I think the business case analysis there would fail, and it would always fail.

So, although I think that it is important for us to think about the value -- and I mentioned that earlier -- it is important for us to think about the value that we get in measurement, at the same time, recognize that the business case analysis, strictly speaking, may not be there, particularly in the short run, and that still shouldn't detract from our efforts to pursue quality. It hasn't in other industries.

MR. BLAIR: Maybe I didn't articulate my question clearly. That was part of what I was looking for, but explicitly, do you wind up doing any kind of a cost benefit analysis to justify the cost of conducting the survey or working with the PROs or anything like that, based on what you feel the value of collecting quality data would be?

DR. MEYER: As a research agency, we have the capability, although we have not.

MR. BLAIR: Okay.

MS. WARD: I just wanted to reiterate Dr. Clauser's comment. I agree with Barbara's comment that we have got to do the visioning and we have got to move out of some of our categorical work, but the reality also is that with EDI, we are now doing things with categorical treatment that we have never been able to do before in terms of immediate feedback about condition specific treatment. And I live in public health. I am a population-based person. I am also a diabetic. I want to go to the clinic that is using EDI that tells my physician immediately whether he or she is treating me correctly and is going to prevent the tertiary complications that we now know you can prevent.

So, we have got -- to me, it is not an either or -- we have got to push both worlds at the same time.

DR. LUMPKIN: I actually have a couple of comments, questions or something like that. You can figure out what they are after I am done.

One of the issues in listening to the discussion that I have some concerns about is the cost of data collection. I think that our technology for measuring quality, much as it is in its infant state, outstrips our ability to collect data that is relevant to quality.

And I think this is of concern because right now there is some backlash against quality measurement, particularly in industry and amongst health plans, when you have individuals like the medical director of Blue Cross/Blue Shield in Illinois saying that it costs them $700,000 for each HEDIS indicator they measure. Whether or not that is true, it begins to raise the specter that the current climate in health care is perhaps not conducive to furthering the quality movement, particularly with margins being as narrow as they are, and we are beginning to see increases in premiums.

So, to what extent have you looked at the issue of the cost of quality? And there is another piece to that, which is that we haven't standardized the collection of quality data. So, part of the cost is that if you have a non-automated area and you operate, let's say, a two person OB/GYN office, and each of the health plans you work with has a separate form to collect the same information, the additional cost of collecting that data and the burden of collection become quite high. That certainly is something that we need to look at ways of addressing.

Before I ask you to respond to that, I do want to applaud the development of tools like CAHPS, where there is an attempt to routinize the collection of information from enrollees and where you are beginning to at least develop a unified tool. Unfortunately, it doesn't tell us much about fee for service.

The second part of my question -- which actually is, because it is a two part question, one that doesn't have anything to do with the first part -- is a concern from my public health side: there was some mention of being able to measure health in a community. I really don't have a good feeling, looking at how we measure quality in managed care, for how we can measure community, because there is still a significant portion of non-health plan-related care that is going on, whether it be managed care, PPOs or fee for service.

DR. MEYER: Let me start by addressing a couple of the points that you brought up. In terms of the costs of data collection, I wholeheartedly agree. I think there is a backlash there. I think that as margins narrow or disappear in many cases, everyone has to beat the drums especially loud for quality or else it won't be heard any longer.

At AHCPR, what we can do and what we are attempting to do is to fund some demonstrations which look at developing systems that are able to collect the data in the more real time way that I discussed. With time, we will have some results from those. But in the interim, I think that that is a major issue.

In terms of measuring health in a community, specifically with CAHPS, a couple of things. One of them is that the CAHPS project continues to evolve and starts to try to broaden out to other populations. If you look at what CAHPS was originally designed for, in terms of measuring satisfaction with health plans for people who have insurance, that is relatively narrow, and so the development of the fee-for-service version, with the help of our HCFA colleagues, is one important piece of that. But even beyond that is developing CAHPS for additional new populations, and those are directions that we are trying to go with that.

It is probably safe to say that an assessment of the health care system by a consumer or by the population is going to need different metrics than an assessment of what they think about a health plan, a health plan that cares for them. So, we are making investments in that direction.

DR. CLAUSER: I was just going to comment on that issue. We are sensitive to the issues of burden when we think about our measurement strategy. I mean, to the extent that administrative data can do the job, we use administrative data. Then we go down the hierarchy to survey and then, God forbid, medical records if we have to go there. A good example of that is in diabetes, where we received some criticism because in our hemoglobin A1C measure, we set a threshold that everyone could agree wouldn't require risk adjustment. A lot of the peers in the field thought it should have been set at a lower level, but we believed that would have created a lot more data burden in order to create comparable measurement across systems.

I mean, those are some of the tradeoffs that are made at the margin in measurement development efforts that are important. I agree that there is a concern in the marketplace for, you know, the burden of data collection, but in the concern over cost, there is concern for a lot of things, and a lot more than just data and data measurement affects the burden of quality, whether it be, you know, survey and certification, whether it be other kinds of requirements -- you know, just the physician certification requirements that are duplicated over and over again.

I think -- I hope that when we think of the contribution that the quality area can make, we don't just focus on the data issue, because it is not clear to me that that is necessarily the most burdensome of the requirements that either health plans or providers have to face. So, I hope we look at it as a broader strategy. Also, we are hearing from our beneficiaries that they are still very concerned about quality and they have been very pleased that we have taken on a measurement strategy to try to understand what is going on.

So, we are hearing it from both levels, and so I still think there is a constituency out there, but we have to make the case in terms of the relevance to that constituency. You hear many discussions around here about, you know, how to get information out to beneficiaries -- the importance of trying to kind of get quality measures and data out of the closet, so to speak, so that we can make the public case for this information -- and we are very interested in working with the committee to do that.

Oh, you mentioned measurement of healthy communities. Clearly, from a quality improvement point of view, in our peer review organization scope of work, we clearly are taking a population-based approach as it relates to Medicare. Obviously, that is where we have authorities, to collect data on Medicare, in that we are going to hold the PROs accountable for improvement at the state level. They are going to be measured statewide on quality improvement targets and objectives.

So, they are going to have the flexibility to work within the state to build the kind of coalitions that they want to build, hopefully, that will transcend both the private health sector and the public health sector. We are teaming up with CDC on cooperative initiatives. In some states we are working with, you know, major purchasers, again, to try to bring communities together to pursue quality improvement initiatives that will have spillover effects and will leverage providers' interest in providing, you know, better care to our beneficiaries as well.

We are going to be monitoring those state-based improvement rates. At least it is a way to start to get to the issue of thinking about it. I also think that in the area of fee for service, particularly in Medicare, we may move faster on more of a community level measurement strategy than on a provider specific measurement strategy.

I think there is a real opportunity to move out quickly in our fee for service area on a community-based measurement strategy, and we are doing that in an experimental contract with Health Economics Research, where we are actually measuring fee for service performance rates with HEDIS-type measures, both at a national level, state level, kind of a community level, although we haven't quite got that figured out yet, and a group practice level, and are looking at the differences in measuring quality and what it means in terms of how we look at it, so that we can have a discussion about at what level we can really get interventions that are going to stimulate change.

DR. STARFIELD: Clarifying question.

I am not sure we are using the same definition of population. But in any case, are you doing things like geocoding, looking at population health by area of residence? There is a whole lot of literature in the last five years about the importance of that kind of thing.

DR. CLAUSER: I think we have some research that is going on looking at how well we can kind of integrate that into kind of an administrative system, but that is still at a research level. We aren't doing that at a programmatic level yet.

DR. LUMPKIN: Thank you.

DR. FITZMAURICE: A quick question and then maybe a quick answer to Steve.

Do you sense that HCFA is going to continue to be willing to make personally identifiable health information available to researchers when you need that information, that identification for the specific research project under conditions that protect the confidentiality of data? HCFA currently does that and in recent discussions, I think they are continuing to do that. Do you sense that that is going to continue to be the philosophy of HCFA?

DR. CLAUSER: Well, the area of patient confidentiality with our administrative data sets is an area that is not my direct area of responsibility. I have heard nothing to the contrary that we are changing our policies in terms of providing data, of course, with the appropriate protections for confidentiality that protect our beneficiaries, who provide this information to us in a public trust.

DR. FITZMAURICE: Great. Thank you.

DR. LUMPKIN: I would like to thank the panel. It has been, I think, a very useful discussion. Kathy, do you want to make any summary comments?

MS. COLTIN: No. I think that -- well, I will, yes.

I want to comment on a point that Steve raised. I think it is important to recognize that there is a spillover effect, that when you implement these changes within a provider system and you change provider behavior, that behavior doesn't change just for the one segment of the population. It tends to change for all of the patients that they are caring for, provided other barriers don't differ across those populations.

So, I think that is important, but I do think we need to recognize that a lot of the changes being proposed and pushed in the private sector are being pushed primarily through HMOs. We have to be careful when we talk about managed care. Managed care broadly is certainly becoming, if not already, the dominant form of private health insurance, but within it HMOs really account for a minority of enrollees in managed care, and yet the measurement and reporting that we are talking about, whether it is CAHPS or HEDIS, is at this point really limited primarily to HMOs.

Most of the PPOs are not producing those measures, and some of them don't have the data capabilities to produce them, and our ability to address those inconsistencies really rests on the backs of voluntary effort in the private sector, by purchasers and others. I think that even with spillover effects, there are other things that have to be done in the private sector to push those kinds of initiatives into other segments of the population.

DR. LUMPKIN: Thank you. Thank you very much for the panel. I think this is obviously a discussion that is not complete and will be an ongoing process as we discuss this issue.

Thank you.

Agenda Item: 1996-1998 NCVHS Report -- Inclusion of Member Views

The next item on our agenda is a follow-up to discussions that we have had about our report for 1996-1998. We submitted our annual report to Congress, we have had some discussion in the executive subcommittee, and I wanted to bring it back to the full body.

Basically, what I would like to propose, since there is interest among members, is that when we submit this report, we give each member 30 days to submit additional, personal comments if they have them, and that those be included in a section of this report, and of other annual reports, on individual member comments. If that is agreeable to the group, we will do that. We will have 30 days.

Bob.

MR. GELLMAN: That is perfectly agreeable to me. I just want to make a broader point here. When I heard the original proposal from the executive committee, which would not have included other views in our report, I was outraged, and I consulted, not necessarily in this order: with a lawyer, to see about suing the committee to enjoin the publication of this report -- I am not sure there is much of a cause of action there, but there might be; with a reporter, to see if this was a newsworthy issue -- this was an off-the-record discussion, and the answer was that it was; and with a congressional staffer, to see if there would be congressional interest in this. And the answer from this one staffer was that there was.

I think this is a very important issue. An advisory committee is required by law to include different points of view, whether this committee qualifies under the statutory standard of being balanced is an open question that I don't want to debate right now, but if a committee does not have other views expressed, at least from time to time, then it is either not doing its job or it is not taking up the right issues or it is not paying attention.

I am doing my best to help. And I think that the committee needs to have a process in which other views are included in committee products, that they are recognized, that they are given what I would call equal time in some fashion. So, I think that the solution being proposed here is perfectly reasonable and the right answer.

I didn't know what was going to happen today, and I prepared a motion, which I am going to read but not offer at the moment. I think maybe we can talk about this some more. The motion was to the effect that it is the policy of the NCVHS that no report, letter or other document adopted by or on behalf of the committee be made public or otherwise distributed to an intended recipient unless it fairly includes any dissenting, additional or other views that a member of the committee reasonably asks to have included.

I don't know that this has been a problem in the past. Maybe it won't be a problem in the future. If it is, I am going to return to this point. I think that the committee has to recognize and incorporate other views in whatever it does. I think it is central. I think it is required by law, although I will admit that it is hard to find anything specific in the law on that point.

It hasn't been litigated and I think that from -- if this issue ever became a significant point of controversy, that my prediction is that Congress could be induced to amend the Federal Advisory Committee Act to require it because I don't think anyone ever contemplated that an advisory committee would, in fact, ignore views that were contrary to those that were adopted by the committee.

I am willing to leave the discussion here. I think that the point that we have on this report has been adequately dealt with and fairly and maybe we can talk about this off line at some other point and see. I don't know whether there is a need to get formal about this. As I said, we can talk about it another time.

MS. GREENBERG: What I will do is e-mail to all of you the final version of the annual report -- obviously, it will also include the additional views and, if there is no objection, any additional views that have already been submitted -- because, if you recall, at the February meeting there were a number of suggestions made. We did a revision and then sent that out to you, and the decision at that meeting was that the comments received would be adjudicated by the executive subcommittee.

That has been done, but you haven't received the final version -- what came out of that process. So, I would think your 30 days could start when you receive that. I will make that available to you as soon as I am back in the office.

DR. HARDING: John, could you just clarify -- you mentioned it briefly, but you are talking about each report that comes out will have an addendum where people can make comments. Would you clarify that just a little bit? When you summarized, you did it very quickly.

DR. LUMPKIN: Well, this report would and other annual reports would have a section for member comments and --

DR. HARDING: So, it would be an addendum to the report.

DR. LUMPKIN: Correct.

DR. HARDING: And would there be any filtering of that? For instance, what if somebody put an outrageous, inaccurate report in there? That would just be under their name and that would just stand or would there be anybody who would look at accuracy and so forth to that --

DR. LUMPKIN: It would have to be rated G. But I think other than that, I don't believe that it would be the role of the executive committee or other body to filter that report.

DR. HARDING: So, that individual would stand with what they wrote?

DR. LUMPKIN: That is correct.

DR. HARDING: Okay.

DR. LUMPKIN: Assuming that is okay with everybody.

Okay. Then that is what we will do.

DR. MOR: So, this is not just a minority opinion about something. It is just all kinds of stuff.

DR. LUMPKIN: Yes. If somebody wants to write in and say -- I think people should have the right to make -- what we are talking about is giving people the right to make additional comments because there are issues that sometimes are raised in annual reports, where either someone disagrees with the position that the group took or they may disagree with how the report characterizes the position.

DR. MOR: Just to follow up on Richard's point, presumably it would have to have -- in some way be germane to the topics raised by the committee in the last year. Yes or no? I mean, you know -- baseball score?

DR. LUMPKIN: I am assuming, given the people on this committee, that that is an issue that will not come up. I am assuming that there will be some relevance to the issues at hand.

We are now scheduled to take a lunch break and if we could start at 1 o'clock, that will give us an opportunity to cover the two reports that we have and not be in danger of running into -- of not being ready when the HCFA administrator comes.

[Whereupon, at 12:00 noon, the meeting was recessed, to reconvene at 1:05 p.m., the same day, Wednesday, June 23, 1999.]


A F T E R N O O N  S E S S I O N [1:05 p.m.]

DR. LUMPKIN: We are going to try to get started.

Some of you may have noticed a familiar face in an unfamiliar seat at the table. Stewart Streimer has taken a new position at HCFA, so the acting HCFA liaison, Karen Trudel, is here sitting at the table. Welcome.

If it is acceptable to the committee, I would like to have a letter sent to Stewart thanking him for his work with us in support of our activities.

MR. SCANLON: I would like to make an administrative announcement. Several of the committee members have photo IDs that expire this month, June of 1999. We need to get you renewed. The office that does that is open on Thursdays from 9:00 to 12:00. I realize that we have two subcommittee meetings starting early and then we start the full committee meeting at 10:00.

So, what you could do -- apparently, you don't need anyone from HHS to accompany you if you have a photo ID or if you are in the system. If you could go down to Room 107D tomorrow morning at your convenience -- it is in this building, Room 107D -- and show them your old photo ID or, I guess, if you lost it or anything like that, just have some photo ID available with you, they will renew you. You don't have to go across the street. It is Room 107D. Take your old photo ID or have another photo ID with you. "D" as in David.

Agenda Item: Presentation of Reports by the Subcommittee on Populations

DR. LUMPKIN: We have presentations of reports by the Subcommittee on Populations. As we have done in the past with these reports, any action items from these presentations will be taken care of tomorrow during the committee presentations.

MS. COLTIN: So, I am going to start with the Medicaid Managed Care Report.

First, I just want to remind people why we did this, and that is that state Medicaid programs across the country are increasingly enrolling beneficiaries in managed care organizations and are enrolling broader cross sections of their beneficiary populations in managed care programs. So, not just the Aid to Families with Dependent Children population, but the disabled and other more vulnerable subsets of the Medicaid population are being moved into managed care.

That heightened concerns about having adequate data to be able to evaluate the quality of care that those beneficiaries were getting in those settings. The other concern had to do with potential data loss: in the fee for service environment, providers would submit claims information for Medicaid beneficiaries that would capture the services those individuals were receiving, and the concern was that when these beneficiaries were enrolled in managed care, those service level data would evaporate and would not be available for state Medicaid agencies to use as one source of information for evaluating care.

That certainly has become less of an issue as we learned -- as we began to get into this with HCFA requiring Medicaid agencies to collect and report encounter data from managed care organizations, but at the time that we began this initiative, it wasn't clear that that would actually occur.

What we did was go out and visit, have site visits in a couple of different states and took testimony here in Washington and at those site visits from people who are either involved in administering state Medicaid programs or who are secondary users of that information or are advocacy groups for beneficiaries and so forth, and put together this report, which summarizes the sort of state of the art out there in terms of collection and reporting of data on Medicaid managed care and offered some recommendations regarding improvements that we think can be made.

For those of you who either didn't read it as carefully as you might or who read it quite some time ago, there were four purposes in the report. They were to consider the importance of data collection and use in Medicaid managed care; to present the legal and operational framework for data collection, both existing federal and state requirements; to, as I said, synthesize the results of the hearings that were held; and, lastly, to present our recommendations.

What I thought I would do is just walk you briefly through the six key recommendations that we made and then ask for your high level endorsement of the report, noting that there may still be some minor wording or other modifications made by the subcommittee because some of the report is built on a set of contract specifications and language around contract specifications that state Medicaid agencies can use in trying to outline the data collection and reporting requirements of managed care organizations.

And as those specifications are going through a vetting process and may alter to some degree, this report will need to reflect those alterations. So, we may bring it back to you or if those alterations are really minor or minimal, then we would go with a broad endorsement today if we get it.

With regard to the recommendations, there were two umbrella recommendations. One was that HCFA should try to encourage standardization by specifying the manner and format of data that would be routinely collected from managed care organizations. So, while HCFA requires states to collect encounter data, there are no specific rules governing the format of the data -- that it has to be HIPAA compliant, for instance -- or any specific guidelines, and, in fact, what we learned is that states were collecting data in all sorts of different formats, with different data elements from different data sets in different states and so forth.

We found that that really would not be conducive to compiling this information to look at policy implications of different decisions that state Medicaid agencies might make as they might affect the kinds of care that were delivered in managed care settings in those states.

So, that was the first recommendation is to encourage greater standardization around the data. The second was to offer a menu of consistent contractual language that would enable states to implement provisions to have the managed care organizations collect and report this data.

So, one was a means to achieving the other. The second was a means to achieving the first.

One of the things I do want to point out is -- I am going to inject a couple of my biases as I present this, given that I have the opportunity. There are just two places in the report where I think there are statements made that I don't believe were actually fully discussed and agreed on by the subcommittee, and I did want to point those out. One is with regard to the standard contractual language, and it is on page 48, in the second paragraph, the last sentence, which reads, "While the model specifications are specific to Medicaid managed care, the ultimate goal is to encourage the use of similar model specifications across all managed care contracts, including employer sponsored and commercial contracts, because many data issues, such as lack of uniformity, are not unique to Medicaid."

That is not something that the subcommittee discussed and agreed to, and, in fact, I have considerable problems with it because in this context we are talking about data that plans would make available to employers, and I think there is a whole different range of confidentiality concerns that arises in that context. There are provisions in the state Medicaid laws requiring this type of information; there are not similar provisions in general laws requiring the reporting of this information to employers. In fact, if you look at the Secretary's recommendations to Congress around data being provided to employers, there are real concerns raised about providing individual level data and identifiable data and so forth.

So, there are some very different considerations and that was not even discussed. So, I am not quite sure how that language got into the report, but I would encourage it to be deleted.

Going on to the specific recommendations, there were six of them. One was for a standardized minimum data set that is consistent with HIPAA transaction standards and privacy and confidentiality requirements. There were two subsets to that recommendation. One was that the recommendation should provide states a standard definition of what an encounter is, so that when encounters are being counted, as they frequently are for reporting purposes, we would in fact be counting the same thing.

The data that are being submitted are really service level data. They are claim level data, so that a patient that has a service ordered that may be performed on a different day and come in on a different claim is not necessarily linked and made part of a single encounter.

We felt that there should be a definition of an encounter as a contact with a patient, to which the services that are either rendered or ordered can be linked, rather than having them all counted as separate encounters. So, in other words, a claim does not equal an encounter. An encounter is an aggregation of claims.
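The distinction drawn here -- a claim is not an encounter; an encounter is an aggregation of claims -- can be sketched in a few lines of Python. This is purely illustrative: the field names (`member_id`, `visit_id`) are hypothetical and not part of any actual Medicaid encounter-data specification; the point is only that claims sharing the same originating patient contact are grouped into one encounter, so a service ordered at a visit but performed on a later day is counted as part of that visit rather than as a separate encounter.

```python
from collections import defaultdict


def group_claims_into_encounters(claims):
    """Group claim-level records into encounters.

    Claims sharing the same member and originating patient contact
    (visit_id) are linked into a single encounter.
    """
    encounters = defaultdict(list)
    for claim in claims:
        key = (claim["member_id"], claim["visit_id"])
        encounters[key].append(claim)
    return dict(encounters)


claims = [
    {"member_id": "M1", "visit_id": "V1", "service": "office visit", "date": "1999-06-01"},
    # Lab test ordered at visit V1 but performed two days later:
    # still part of the same encounter, not a new one.
    {"member_id": "M1", "visit_id": "V1", "service": "lab test", "date": "1999-06-03"},
    {"member_id": "M1", "visit_id": "V2", "service": "office visit", "date": "1999-06-10"},
]

encounters = group_claims_into_encounters(claims)
print(len(claims))      # 3 claims submitted...
print(len(encounters))  # ...but only 2 encounters
```

Counting the dictionary keys rather than the raw claim records is what makes encounter counts comparable across states, which is the point of the standard definition the recommendation asks for.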

So, that was 1(a). 1(b) had to do with enrollment data and specifically stated that for Medicaid beneficiaries, race and ethnicity data should be routinely collected as part of the enrollment and eligibility data that are collected by the states and then made available to the managed care organization, as well as information about language and reason for eligibility.

The report on page 49 says "Reason for Enrollment." That was just an error. It should have been "Reason for Eligibility."

The second recommendation was that HCFA should encourage state Medicaid agencies to adopt a standardized survey of patients' experiences in managed care -- for example, the Consumer Assessment of Health Plans Survey, which many of them have already adopted, or some other survey that may become the standard for that population. It would be very helpful to have comparative data across states about patients' experiences, particularly as they could be compared with state policies around benefits, coverage and so forth, along with standardized encounter data, so that we would have the ability to link those data and see whether policy decisions made at the state level, which vary, have an impact on measures that could be constructed from encounter data, as well as on patient survey reports of their experiences.

The third recommendation was that services that are covered under Medicaid but not through the managed care organization should be reported to the state Medicaid agency in the same way that they would be reported if they were covered through the managed care organization.

So, an example might be pharmacy data. Some states have carved out pharmacy services from their managed care contracts. If pharmacy services are a part of the managed care contract, the MCO would be expected to report on those services. If they are not part of the contract, then the pharmacy benefit manager should be required to report to the state Medicaid agency on those services.

So, while our focus was on managed care, we felt that, to have a level playing field, even carved-out services should be reported to the state. We further felt that, to the extent that the states had developed performance guarantees or particular quality improvement standards that they were requiring managed care organizations to adhere to, and the MCO's ability to improve or to demonstrate adherence to those standards rested on the types of data being collected by carve-out programs, the state should be encouraged to share that information with the managed care organization. An example might be asthma, which is a very common problem in a Medicaid population. For plans, making improvements and reducing emergency room usage and unnecessary hospitalizations in that population often rests very heavily upon their ability to influence prescribing practices and to increase the percentage of children who are on appropriate medications, such as inhaled steroids.

They can't even measure that if they don't have access to the pharmacy data. They can't identify who the physicians are who aren't prescribing it so that they could intervene or who the patients are who aren't receiving it so that they could intervene. So that in cases where they are being held accountable for performance, they need to also make sure that the MCO has the data to be able to deliver on those accountabilities.

The fourth recommendation had to do with notifiable diseases, and the recommendation was that HCFA should encourage states to make sure that managed care organizations collect and report data on notifiable diseases to the appropriate state agency. This was, I think, a somewhat contentious one, and this is another place where I will editorialize. There are two alternative options represented there: one is when the managed care organization is a provider, a direct provider of care, versus when the managed care organization functions as a payer and sort of promoter and assurer of care and service.

I think that the recommendation in the report that when a managed care organization is a direct provider, it should submit data on notifiable diseases as would any other provider be required to do. So, therefore, the laws that apply to providers in general would apply to managed care organizations that are direct providers. I certainly can support that recommendation. I think that is appropriate.

But the second recommendation had two parts. It says that the managed care organization when they are not a direct provider, when they contract for services through other providers, that they need to enforce -- and the word "enforce," I think, is important here -- the duty on contracted providers to report to the appropriate state agency.

I think the temptation here is to try to make the -- put the managed care organization in a role that really belongs to the state's public health department. If the public health department can't enforce the law, what makes them think that the managed care organization can enforce it?

We can put language in our contract that mirrors the language that is in the state law with our providers that says you will comply with the state law and that is our expectation. But when it comes to enforcement, penalties, sanctions, whatever, managed care is not the favorite of most providers now. You are guaranteeing that this will create further animosity.

So, I really caution against taking that type of an approach and, furthermore, I think that if you look at what types of managed care organizations Medicaid agencies are contracting with, it is mostly HMOs. It is not PPOs. So, again, you are hanging on one small sector of the health care industry the responsibility for enforcing what is a public health requirement. So, I do take issue with that.

The recommendation even goes further and says the managed care organization needs to provide the infrastructure for the providers to be able to report, and I am thinking, where did that come from? Clearly, you know, we are not the ones who are going to be out there providing all of that kind of technology for reporting to state registries and so forth.

The fifth recommendation has to do with data dissemination, availability and sharing among state agencies and basically says that the raw managed care organization data that get submitted to the state Medicaid agency should be available to HCFA for policy analysis and to designated contractors of the state Medicaid agency. That is one recommendation.

And second is that it should be available for approved research, as is currently the case with much state Medicaid data. The recommendation further goes on to say that state agencies should be encouraged to share data that would be mutually beneficial so that to the extent that the encounter data that are submitted by the managed care organizations could provide insights around public health issues, sharing that data with the state public health department would be a reasonable thing to do.

The last recommendation was that the Federal Government and the private sector be encouraged to partner and invest in training programs to increase the analytic capabilities of the people working in the state agencies who analyze these data.

So, I think the reason for the private sector being involved as well is that these analytic capabilities also need to be encouraged in the private sector, but also they are the ones providing the data and they need the people who can actually look at the data from the other side.

So, there were a number of different options for how this type of training could be provided from traineeship grants to applied training courses, tuition reimbursement programs and so forth and those are laid out in Recommendation 6.

DR. LUMPKIN: We are going to try to work our way through these recommendations one at a time.

Simon.

DR. COHN: I just had sort of a general comment, and I actually have specific comments on most of the recommendations. I just wanted to get a clear idea of what the outcome of this discussion is because, as I listened to this, knowing that the presenter could not, it seemed, even support a number of the recommendations without significant modification, I wasn't sure whether we were dealing with a document that we were going to vote on as a finished document tomorrow or whether this was an early draft.

Not having been part of the discussion, I just wanted to get clarification on that.

MS. COLTIN: Well, this is an action item for tomorrow. So, I think the subcommittee is going to review this again this afternoon and take a look at it. It was reviewed in a conference call and a lot of the thoughts that were shared at that time were imbedded in this version, but not all of them got translated, I think, exactly as was intended and there were some things put in that I didn't recall seeing before, that first being an example, the first concern that I mentioned on page 48.

So, we will be reviewing them and what we were hoping to do is take your comments -- is to be able to take your comments and concerns into account as we go through this this afternoon and be able to make modifications to the wording.

DR. LUMPKIN: And then the subcommittee will decide whether or not they are ready to move this forward for a vote tomorrow afternoon. That will be part of their discussion this afternoon.

DR. STARFIELD: So, the question is we are not asking for comments from the subcommittee members, but just from people who aren't on the subcommittee. Is that right?

MS. COLTIN: Well, I think to the extent that someone who is on the subcommittee wants to make a comment that others may want to react to and it leads to a fuller discussion of an issue, I think you should feel free to make it. I certainly made mine.

DR. LUMPKIN: Is this a general comment?

MR. GELLMAN: Yes.

DR. LUMPKIN: Okay.

MR. GELLMAN: I have two objections -- two kinds of objections to this report. First, I don't think the report should be presented as an action item. It is not complete. There is an entire section that is missing. I don't see how we can be asked to vote on it until we have had a chance to see that section.

With respect to the missing section, which is the contract clauses, those were passed out at the last full committee meeting. We were asked on a very short term basis to make comments on it. I filed three pages of comments. I don't know whether they were passed out to people or not.

The report has been prepared. The comments that I made have not been addressed. The contract clauses have not been revised and I don't see why we are being asked to vote on this with the notion that the contract clauses will come later.

I am going to read a paragraph from my comment because I think this is important. In general, I find almost nothing in the contract clauses recognizing the rights or interests of patients. I also see many provisions that strike me as inconsistent with the privacy policies approved by NCVHS and with policies supported by the Department. I am confused about why NCVHS is involved with this effort at all. The effect is to take patient data from the health care provider and convey it lock, stock and barrel to another health care establishment without the consent of or notice to patients and without any real regard for privacy interests.

Indeed, it appears that the contract clauses could, among other things, result in the transfer of patient data and ownership of that data to a patient's employer with no restrictions on the dissemination of the data by the employer. That is what is involved here. I see this report -- aside from my procedural objections, I see this report as another classic NCVHS, we have never met a use of data or sharing of data that we didn't like.

This report is filled with collecting new data and spreading it around all over the place. The states are to collect data. They are to give it to HCFA. They are to give it to other states. They are to give it to contractors. They are to give it to researchers.

All you find in this report is a tip of the hat in one or two places to, oh, we have to worry about privacy and confidentiality rules. Well, there aren't any privacy and confidentiality rules and most of the users who would get this data will be totally unrestricted in what they can do with it.

I think that if you look at the substance of the report, the report criticizes the State of Iowa for collecting data without patient identifiers, as if this is some kind of terrible thing. The report notes that the states lack resources for dealing with encounter data and the solution is to collect more and more encounter data and give it to the states. I don't understand this. This is just inconsistent.

There is no discussion in the report of technologies of anonymity, what can be done without using identifiable data. There is no discussion of organizational techniques to protect patient privacy. There is no discussion of the risks to patients from giving more of their data to the states. There is no discussion of the lack of state legal and technical protection. And this is an area where we know that there we are dealing in an environment with a lack of protections.

There is no discussion of the gaps in current laws. There is no discussion of the value of differences in the states. Maybe there are some benefits to letting states do things differently instead of making everyone do it uniformly. There is no discussion of the preemption concerns that have been a hot issue in privacy laws. So, I totally question the whole fundamental premise of this report and the purpose of this report.

I don't think any attempt has been made at all to address this totally one-sided report that says, gee, wouldn't it be great if we could find more uses of data and some more sharing of data. I think it is completely inadequate and I intend to oppose it.

DR. LUMPKIN: Further discussion on the report from the committee?

PARTICIPANT: That is at the general level, correct?

DR. LUMPKIN: Right.

PARTICIPANT: Okay.

MS. GREENBERG: Just for clarification, in response to your question as to whether your comments had been shared, if I understand, the subcommittee asked the Subcommittee on Privacy and Confidentiality to look at the draft specifications and we have sent out to them both the specifications and your comments and they will be discussing that tomorrow morning.

MR. ZUBELDIA: I haven't seen that report before, but one thing that struck me during the listing of the recommendations is that there is no recommendation that the managed care organizations accept this data electronically. One thing that we have seen is that the managed care organizations are taking data, but a lot of them are small entities that only take it on paper. We are seeing a mass migration from electronically submitted data at the state level to paper submitted data when they migrate to managed care.

So, the non-managed care claims can be filed electronically. When the state moves to maybe 20 managed care contracts, 18 of them will have no electronic capability and the data have to be submitted on paper. So, I would like to see a recommendation added that these managed care organizations be required to take the data electronically.

DR. LUMPKIN: Thank you.

Let's move on to No. 1, standardized set of elements. Are there comments that people would like to make on that?

DR. COHN: I thought this -- I actually liked the title. I did think it was, however, mischaracterized in the recommendations in that the -- it said in the second paragraph that HCFA should adopt a standardized set of data elements and it mentions it should be based on the NCVHS recommendations consistent with standard administrative transactions.

Probably this is in line with what our last commenter was just saying: really what we want is for this to be consistent with the HIPAA administrative transactions, and it is really the job of the national committee to make sure that the NCVHS core data set and code set elements get into these administrative transactions, the EDI transactions.

So, I want to make sure that gets in, because my feeling is that it needs to be aligned appropriately with the major activities of the committee.

DR. LUMPKIN: Let me just follow up. What you are suggesting is not that, let's say, there were 20 or 30 elements, but that there are two parts to it. One is that it should be consistent with HIPAA and, second, if it is not in HIPAA, then part of the recommendations from the committee would be that they would be subsequently included in one of the transaction sets. Is that what you are saying?

DR. COHN: Well, the second one was just sort of a comment that what we wanted -- what we need to do is to have this data consistent with the overall national administrative transaction standards under administrative simplification.

That is the standard we want them to go to. What we need to do in relationship to our NCVHS core data set is that we need to be dealing with those that are doing the administrative transaction standards to get that in there, but that is something outside of the scope of this report.

DR. LUMPKIN: Okay. Other comments on --

DR. STARFIELD: I didn't quite understand that. What kind of wording do you want changed?

MS. COLTIN: It is the --

DR. STARFIELD: No. I see the wording but how do you want it changed?

MS. COLTIN: Where we say we recommend that HCFA adopt a standardized set of data elements?

DR. COHN: Yes. I think that what we are really talking about is, for example, the 837 transactions are really what we are talking about and maybe the 835, which is the enrollment -- thank you -- 834 and that that is really what we are trying to talk about as opposed to an NCVHS core data set.

DR. LUMPKIN: I think the issue that Simon is raising refers to those data elements that are in the core data set that was developed by the NCVHS but are not in the transactions. They are not in the transactions for a number of reasons, some of which may be due to the fact that those data elements haven't been well enough developed to be able to insert. So, if, in fact, this document would recommend these data elements, then there is some obligation to work to put them in a form in which they can be included in 837 or 834.

DR. STARFIELD: But the wording says "and consistent with standard administrative transactions --

DR. COHN: But it also says "should be based on." I think it is just not a question of being consistent. I think that the primary issue is HIPAA.

MS. COLTIN: I think the language about based on was intended to say it needn't be every data element that is in the 837 or the 834. It might be a minimal data set, which is a subset of data elements that are in those standard transactions.

So, that is a content issue, but from a format standpoint, that they would be produced consistent with the HCFA standards.

DR. LUMPKIN: Perhaps, you know, it -- because, obviously, we are going to need to do a little bit more work on this report but maybe Simon could help put the language -- it may be that in trying to make it be concise, it may not be quite complete in conveying what people seem to be agreeing on.

DR. COHN: We are having a side conversation about whether, in an encounter environment, you are submitting the whole 837 or a part. I think it is probably coming from the implementation guide issue. So, I think you really are submitting the whole 837, but according to whatever the implementation guide rules are. So -- anyway, that is --

MS. WARD: I would reinforce what Kathryn said: I don't think we are making any assumptions on the entire set, because that is contrary to something -- I think Bob had some good points about it because we have been debating that in terms of that other side of what we were looking at. Kathryn and several of us, when we went to these different sites, were very concerned at the amount of data that is being collected, because nowhere are we seeing that, in fact, the questions that HCFA is trying to answer need all of that.

In fact, I would hope that they would reduce the amount of data they are asking states to report. What we are asking them to do is figure out what the questions are and, therefore, what they need, and to standardize that. We still think they are collecting more than they need.

MS. COLTIN: Well and the 837 contains all of the detailed identifiers. They may not need all those identifiers and we may not want them.

MS. WARD: We are not trying to decide for HCFA what their questions are and, therefore, what they need to collect. What we are saying is make sure that they are part of that.

DR. COHN: I apologize. Maybe this is just my understanding, but I read this as being for use by state Medicaid agencies, not by HCFA. Did I miss something here? Is this really -- what you are talking about is a minimum data set that gets sent on to HCFA?

MS. COLTIN: They are required to send the data onto HCFA is my understanding, the state Medicaid agencies.

DR. LUMPKIN: Okay. I am going to suggest that we need to move off of this point. I think we have identified kind of where we want to go on that. We do have six other points and another report to do in the next 20 minutes.

Again, I remind you that we are not making a decision. We are identifying issues before the full committee and, obviously, this one is going to need a little bit more work than we may suspect. So, the committee, or the subcommittee, really needs to decide whether or not they have time within the time frame to be able to bring it back tomorrow. Okay?

Any comments on common definition of "encounter"?

DR. COHN: Actually I did have a question only because I looked at the footnote and the footnote didn't seem to support the -- to me, the contentious part of this one is not that there should be a definition, which I am completely in support of. It is just this issue of linking an encounter to all services provided, which is tough for anyone, be it a Medicaid MCO or just a regular old HMO.

When I was looking at the footnote, it says -- which is the proposed definition -- it describes -- let's see, where was this? No. 2. It says in the following services and supplies furnished to a member in direct relation to an encounter or otherwise. So, I wasn't sure whether -- it seemed like it was just a statement of any service as opposed to a statement of linkage.

Maybe I am misreading the --

MS. COLTIN: I think the key was indirect relation to an encounter. I think that wording probably needs to be tightened up because it doesn't say order that for --

DR. COHN: Yes. Once again, I was just sort of looking through to see what they meant and it just seemed that they didn't specify what the position was.

DR. LUMPKIN: If you could clarify that -- is that footnote from the subcommittee or is that describing somebody else's draft specifications?

MS. COLTIN: I think that is in the draft specifications for the contract language.

DR. LUMPKIN: Kepa.

MR. ZUBELDIA: I would like to express a concern here that, after the 837 implementation guides for claims and encounters, we are trying to come up with a new definition of what goes into an encounter that is, in fact, contrary to what implementation guides can do. An implementation guide for pharmacy, using the NCPDP standard, cannot contain encounter information for the physician encounter and vice versa.

So, we need to be careful with not contradicting previous work.

MS. COLTIN: I think we need to clarify what we meant. We are talking about when one takes the raw data, which is at the level that it would come in on an 837 or a subset of an 837: how does one count how many actual encounters there were? And there, the definitional question is: is an encounter a visit? I mean, in some settings if you have a mammogram, that is counted as an encounter, as well as the visit at which the mammogram was ordered. So, you would be counting two encounters.

In others, they say, well, the mammogram was ordered at that visit. We just link it back to that visit and what they are really counting is visits. So, we felt that that was really fuzzy and that for statistical purposes it made it difficult to compare information. So, we just wanted a definition created that said if you are going to report on encounters and I think within the managed care industry we often use the term "encounter" and the term "visit" synonymously. So, it is a little different than the notion of a service, like a pharmacy prescription.

So, I think that is where a lot of this muddiness comes in, that people are using words and interpreting them differently and assigning different meanings to them. What we are asking for here is that there be a consistent definition applied to what is meant by that term when you are going to analyze and create a statistical report.

DR. LUMPKIN: We are running into a time problem here. Okay. I am sorry. What we are going to do is finish this report in the time allotted. When Nancy Ann Min DeParle comes, we will stop the discussion and we will proceed to our next presentation and then before the break, we will take the second report.

Anything on enrollment? Comments?

[There was no response.]

Anything on patient experiences with access to quality care?

DR. STARFIELD: I do. I do have just one concern about specifying CAHPS. I don't have any problem with patient experience surveys, but I don't think we should lock ourselves into one. I just intend to discuss that in the subcommittee.

DR. LUMPKIN: Okay. Any other concerns?

No. 3, services covered under state Medicaid plan but not under MCO contract.

Okay. No. 5 -- No. 4. Simon.

DR. COHN: Actually, Kathy, I think you did most of my work for me in terms of your questions about that piece. I looked at this when I was reviewing it on the plane out and thought this was an odd set of recommendations in the sense that every state that I know of requires providers to do this and I think it is -- actually, I think, fundamentally it is the responsibility of the providers to do this in the sense that otherwise you don't get timely reporting and you also deal with issues of accurate reporting.

So, I don't know even why this was here, I guess, is the question.

MS. COLTIN: I think sometimes a managed care organization is a direct provider and then we are saying when -- you know, in that case, then, fine, this should apply to them. I think the area of contention is when they are not the direct provider, when they are a contractor.

DR. COHN: But this is an MCO contract, not a provider contract focus. I thought that the whole thing could do without this whole section completely. So, that was just my comment.

DR. FRIEDMAN: I just have a clarification question, which is did you say that there were two alternative formulations for -- you know, those two paragraphs in here or --

MS. COLTIN: Yes. The two alternative options are: first, when they are direct provider organizations, they would have to report; and second, they would have to enforce reporting on the providers with whom they contract. So, therefore, it would be a subcontracting arrangement, the MCOs' subcontracts with providers that don't work for them, that aren't their own -- they would have to put provisions in those contracts that those providers would report, and then enforce them. That is the one I have the biggest issue with.

DR. FRIEDMAN: Do we have both of those formulas -- I am sorry I don't -- I am missing this -- do we have both of those formulations in this draft of the report?

MS. COLTIN: What it says here at the bottom of page 51 is states may wish to impose such a duty directly on MCOs, when MCOs are direct provider organizations, as is the case with the staff model. Then there is the "or" and the "or" is the second option, or states may want to require that MCOs enforce this duty on their participating providers, who are obligated to report it under state law. So, the part that comes before the word "or" is the first alternative and the part that comes after the word "or" is the second. You are right. It is not clear and it should be made clearer, but those are the two options that were proposed.

MS. FRAWLEY: This is to follow up on some of the earlier comments. I would just like to make a recommendation to the subcommittee this afternoon that they think about eliminating this recommendation from their report. I see it as very problematic. I am very concerned about the whole enforcement piece, and I think this is something we have to leave to the statutes and regulations that require providers to report communicable diseases; I don't think this is something we need to step into.

I would just, you know, suggest that maybe the subcommittee discuss it this afternoon in their meeting, but I see this as a problem recommendation.

DR. MOR: I will second that. Departments of human services in the state governments don't talk to departments of health as it is and this is a totally different organizational structure, which would compound that communication.

DR. LUMPKIN: I think I would just like to comment that I guess I am comfortable with eliminating this recommendation as I am not sure it fits just here, but it is something that we may want to look at in that particular committee as infectious disease reporting, particularly as part of the process.

Most states require not only the providers of care, but also laboratories to report. Reporting, as it exists in most states, is woefully inadequate, and there generally are no penalties for not reporting. But managed care can play an important role by encouraging participating providers to report.

It doesn't necessarily -- and putting it in the contract may be one way of just saying that this is one of the criteria in the contract, particularly if it is not necessarily worded so that there is a penalty. Many managed care organizations will have a sliding scale for physicians, depending upon how well they meet certain criteria in the plan. They have different tiers of payment, and it can be included as a criterion to go from Tier 3 to Tier 4, and so forth. But I think that there are ways to explore how managed care can enhance reporting of infectious diseases without getting into contractual issues.

So, I guess I just wanted to toss it back to the committee and say I am comfortable with removing it, but not to take this particular issue off the table for some future work.

DR. STARFIELD: I want to support that because the hearings didn't surface this issue as something that was important to people. It is true that lots of countries just pay physicians differently if they do or don't comply with reporting.

MS. COLTIN: I think the big issue is we don't have the data to know whether they complied. They are reporting it to the state not to us.

DR. LUMPKIN: And many managed care plans, particularly the IPO and IPA models, will do audits in the physicians' offices. Then there is a score that they will get, and they have to keep up to a certain level in order to be able to stay in the plan. This just might be one of the criteria in the audit: whether or not cases are reported. That is just the kind of thing that we can look at. Obviously, any recommendation that does come down as a separate recommendation needs to recognize the reality and the broad spectrum of managed care that exists in this country, from staff model plans to loosely associated IPOs.

Five? I think Bob already made his comments about this one. Any other comments? You know, as I was looking through this -- Bob made a number of comments, but also we need to toss in some of the things that we have discussed before about personally identifiable information and what degree of it is needed -- do you actually need record-level cases? Many times, this data can be aggregated before it is moved up. So, I think there are ways we should explore that.

No other discussion on 5?

[There was no response.]

Six, training?

DR. STARFIELD: I guess I wasn't sure where that one came from because the hearings made it clear that the states didn't feel they needed much technical help. So, the recommendation seems much stronger than is warranted by what was obtained from the hearing.

MS. COLTIN: Actually, I guess I heard it differently. I heard that they did --

DR. STARFIELD: It is just what is in here. I actually wasn't at the hearings, but it is what I read in here.

DR. MOR: In Boston, we heard a lot of it.

MS. COLTIN: They couldn't recruit qualified staff. They weren't out there to --

DR. STARFIELD: Then that needs to be rewritten in here because it makes it sound like they didn't want that. I will find the section.

DR. COHN: Actually, this has nothing to do with 6. I was just going to make a summary comment. Is that in order?

DR. LUMPKIN: Well, it is, after we get the last comments on the table.

I have one and this is one, which actually links off of something that was said earlier. State Medicaid agencies may not have the expertise, but state government very likely does in one way or another. So, if we are going to look at these kind of recommendations, we should encourage closer collaboration between state health data organizations, state health departments and state Medicaid agencies in developing the resources to do the data analysis, as well as training.

Simon -- unless there is more on 6.

DR. COHN: This is just a general comment. I think this is an interesting document. I agree with some of the comments Bob Gellman made that it is not a complete document and, certainly based on our discussion, it is going to need relatively significant revision. I guess it was my hope that we not come back for an action tomorrow, since it is hard for me to imagine that it will be cleaned up to a place where I could support it by 12:00 noon or 1 o'clock tomorrow afternoon. Just sort of a recommendation that we may want to consider this in the September time frame, with, since it is such a lengthy document, it being sent out more than just a couple of days before the meeting.

DR. LUMPKIN: Any disagreement with that? I see everybody's head shaking up -- everybody is nodding agreement.

Okay. I don't think we want to start on the other document. Let's just -- we are going to take a three minute break.

[Brief recess.]

Agenda Item: Discussion with HCFA Administrator

DR. LUMPKIN: I would like to welcome you to our meeting of the National Committee on Vital and Health Statistics. It certainly is a pleasure to have you visiting with us and have a chance to chat. Our committee's charge for 50 years has been to give advice to the Department on various issues related to health information.

In the last four or five years since we have been reconstituted by HIPAA, we have been faced with the challenge of balancing the two halves of our brain: the half that has been focused on standards and security and privacy issues and the half which still continues to address population issues.

So, we are pleased to have you here to give us the HCFA perspective and have a chance to have some exchange of information and ideas.

Welcome.

MS. DE PARLE: You are talking about the two halves of your brain and I think those two pieces really are at a crossroads and some would say even a collision point within the Health Care Financing Administration right now. Actually, I had been told that the group was interested in hearing how you could be of assistance to us and I think everyone knows that we need a lot of help. So, I appreciate that.

I know Dr. Lumpkin, in particular, does because we have worked together before. But the particular place that we need your help, I think, is at that crossroads because we are many years into the development of some data collection instruments and information gathering tools that we believe are critical either to our efforts to try to pay providers more accurately and to protect the Medicare Trust Fund through doing that or to making sure that we are overseeing the quality that beneficiaries are receiving. Both of those things have been to some extent question marks in the first couple of decades of Medicare.

Now we are at a point where the data that we are able to collect and the instruments that we have, we might be able to move forward and do a better job on both of those. But we are running up against some collision points with the tension between those things and those activities and privacy and confidentiality.

The most recent example of that is the OASIS tool, which I don't know if this group was involved in, but as some of you may know, it was developed -- I guess it started in 1987, but there has been a requirement from the beginning of Medicare that home health agencies are supposed to assess patients when they come into care and to make from that judgments about what kind of care they need.

In 1987, the Congress directed us to develop a standardized tool for doing that and we contracted with the University of Colorado and clinicians and researchers there developed this tool and it was then tested around the country. I think around a hundred home health agencies used it and clinicians refined it and we put it out in the Federal Register in 1997 and all this was going forward and, again, you, perhaps, were more involved in this than I was, but as far as I know, there were never any questions raised.

Then two months ago, there was suddenly a big explosion over the kinds of questions that were in the instrument, some of which went to things like mental health status, which on the one hand my colleagues in the mental health community tell me it is very important that those things are assessed, both for accurate payment, to make sure that a beneficiary with those conditions is being properly cared for, but also, you know, so that they get the assistance that they need, and that we make sure that if they had those conditions, they might be more vulnerable than others and so it might be something that the survey agencies for the state would want to take a look at or whatever.

But there is a tremendous sensitivity, I think. The privacy bills that are under consideration by Congress, I think, are evidence that there is tremendous sensitivity and tremendous concern about this tension between the potential for moving forward on the quality front and on the payment front and on a number of fronts for those of us who are public health professionals with more data.

On the other hand, the potential for misuse of that data or concern about security for the systems that we run here at the Department and that those of you who work in the states or in other -- who do research, you know, and use our data could create more potential for misuse or abuse of it.

So, that is really a problem that we have, and we are very aware of that now as we look forward to some of the other things we are trying to do. It is a place where a group with some calm, cool heads and some time to do some thinking about what the smart way is to proceed could really be helpful.

So, I lay this very easy issue at your feet and encourage you to help us think it through because it is a problem that I think we will face more and more as we see that tension not really getting resolved.

So, John, I didn't want to -- I always when I see you give you a problem.

I am supposed to also talk to you about what our priorities are right now. Although I think you will see that -- and everyone knows we are an operations place. The essence of our business is getting claims paid for beneficiaries and making sure that people get health insurance coverage. So, it is not as though thinking about data and vital and health statistics is a daily part of our existence, but it is a tool and it is very much an important part of what we do.

We have a number of major projects going on. The number one priority right now is the one that goes to the essence of our business, which is making sure that claims get paid. We were faced with a tremendous challenge because of the year 2000 computer problem and the fact that Medicare, in particular, pays its claims through contractors around the country through systems that we don't have control over and through systems, which were not compliant as was the case for many insurance companies, but it was more of a problem for us because we had 60 different contractors with different systems and it all added up to about a hundred different systems that we have had to work on over the past year.

What that has meant is that some of the other information systems projects we might have done, some of the data that we might have been sharing and other things like that have been delayed because we have been putting our full attention from top to bottom in the agency. I mean I have probably spent as much as a quarter of my time on this over the past year just making sure that we could get all the millions of lines of code renovated and make sure that the systems would be paying providers January 1, 2000.

The good news is we are well past that and well along our way to being there, and I am so very sure that we are going to be paying claims in January 2000. What I am not so sure of, and those of you who are physicians or work in hospitals can help us with this, is whether or not all the providers who bill Medicare will be ready. There have been a number of surveys, and some of you have seen that, I suppose.

I have talked to colleagues who run academic health centers who tell me, as I have said, that they are spending half their time on this or whatever and that it has been a major undertaking for them as well. So, I believe in the end, the health care sector will come through and we will be ready. But it certainly hasn't looked good at different points along the way. So, that is a major priority of ours, to do outreach to all those groups as well. It has been the agency's No. 1 priority. It was not something that I think any of us thought we would be spending so much time on, but it has become a major issue for us.

I guess I would say in addition that we are working on administrative simplification under HIPAA. That is something where we have worked together with this group, I think, quite well, and the process that we undertook together worked well in terms of I think the lack of surprises and the formal comments that we have gotten to the regulation, even with the thousands of comments we have received. The major issues had already been raised and discussed around the country through the hearings that were conducted and through the interactions that we have with your group.

This, too, of course, has been delayed somewhat by the year 2000 work. Putting aside our own problems, many of the folks who would be affected by this urged us to slow down the process so that they would not be involved both in trying to make changes for administration simplification at the same time they were trying to fix their computer systems for the year 2000.

So, we have a number of next steps to undertake, including policies and procedures to enforce standards, examining the Department's role in industry implementation activities, monitoring the implementation process and reassessing standards based on new technologies and industry/business needs.

We look forward to working with you as we move toward making these standards a reality.

We are also looking forward to working with you in using clinically focused data. And some of you know, we have a computer-based record project. Clinically-focused data is becoming increasingly important to our program goals for development of risk adjustment models to support prospective payment systems and the kind of thing we are doing with managed care to provide depth and validation to the analyses of administrative data for program integrity and to help us to measure and improve the quality of care provided to our beneficiaries.

Both the size of our beneficiary population for Medicare, 39 million beneficiaries, and the scope of our provider relationships make the collection of clinical data difficult and costly. So, we are basing our national clinical performance measurement strategy on the principle that the content and the collection of data and performance measures should be standardized wherever possible.

We think that leads to more useful information for consumers and purchasers and reduces the burden for providers and plans. So, that is our goal. We, therefore, have become an active participant in health care data standards development. We are the lead agency, as I mentioned, for the development of the HIPAA data standards. We are providing resources and staff to other federal efforts, including the government computer-based patient record project, as well as participating through interagency agreements with agencies working in specific areas.

We also participate in standards development organizations, such as the International Organization for Standardization, the American National Standards Institute's Health Informatics Standards Board and others. Some of you may know that the Quality Forum Planning Committee, which was announced basically a year ago, has now evolved into a quality forum -- I am not sure actually what its title is -- but we are on the board of that group, and our goal in being there is to try as much as possible to be part of an effort to create standardized quality assessment tools, and not to have HCFA and Medicare going in one direction and the private sector and others going in other directions.

So, all of those things work together. I mentioned at the beginning, privacy and confidentiality and I guess I would like to close with that as well because I really do think that is the place where we could most use your help and where we would like to work with you and get your thinking on what we should be doing.

We, I think, have a very good record of protecting the confidentiality of health information. In fact, in general, I think our protections for personal medical information are stronger than what exists in the private sector, but that doesn't mean that it is strong enough. And particularly right now, I think, there is a very deep appetite for making sure that everything the government does and, for that matter, everything the private sector does protects the confidentiality of health information.

So, we are actively participating in the administration's interagency process to look at the recommendations for medical privacy that Secretary Shalala has put forward, to look at what the Congress is examining to make sure that they can work on the operational level for the programs that we are responsible for.

We are also continuing to look for ways to improve privacy protection afforded to beneficiaries and the data that we have. So, we are very eager to hear from you any suggestions that you may have on how we could do a better job in that area.

I think you know that Karen Trudel is sitting in as Stewart Streimer has moved on to another spot in the organization and we will be soon finalizing who the permanent representative will be, but we appreciate Karen's work on our behalf in the meantime.

Thanks.

DR. LUMPKIN: Thank you.

I, too, would also like to thank you for allowing us to use Stewart and Karen and Bob Moore and a number of other people from HCFA, who have been very important to our ability as a committee to meet our charge. I think that we have welcomed the close working relationship that has developed under HIPAA and other activities, where we as an advisory committee for those outside of government have had an opportunity to work closely. I think that has yielded very positive results and we hope that we can maintain that relationship in the future so that as issues come up, they can get public airing and public discussion prior to the rulemaking process.

MS. DE PARLE: You know, you are right. And I want to emphasize what I said before about the HIPAA process that we worked on because there have been a lot of articles about things that haven't gone well in the past year relating to data collection or data efforts. That is one that I think -- of course, you haven't read any articles about it because it has gone so well. Working with the industry, you know, it took longer than maybe we wanted and we weren't quite on the time frames that the law said, but I think we made a lot of progress in preventing what otherwise could have been a big, you know, explosion or lack of understanding or misunderstandings over what was trying to be accomplished. So, that is a success story, I think.

DR. LUMPKIN: We certainly look forward to the opportunity. I did want to just share one concept with you and then toss it open to the members of the committee. Often this committee gets involved in very minute details of data standards, but I think that we have an overarching vision, one embodied by the committee's charge: that we have to develop a national health information infrastructure.

The goal of that is -- last week I was here when Rosa Parks received her gold medal and I was fortunate enough to be invited to that ceremony. What struck me -- really struck me at that ceremony was when the comment was made -- I think it was made by President Clinton -- that Rosa Parks didn't get on the bus to start a movement. She got on the bus to go home.

Those of us who are involved in health information standards aren't involved to set standards and we are not involved because we are really fascinated by data, although some of us are. But our goal is to make the health delivery system better and we recognize health information and health informatics as being the transforming technology that will make a dramatic change in the quality, efficiency and the cost effectiveness of health care.

The basic day-to-day work of sitting down and working over these standards is one of the ways that that will be achieved. So, I think that certainly our efforts and our goals move in the same direction as HCFA, which is the quality care for the people they are responsible for.

MS. DE PARLE: Yes. I agree with you so much, John. I think that is a good anecdote to keep in mind as we go down this track because the reaction, as I said, to Oasis was very surprising, and we probably hadn't done everything that we should have done to educate people about it. That is an area, again, where perhaps this committee could give us some guidance on how to make sure people understand the tremendous potential that this data has to help especially vulnerable populations, like home health beneficiaries.

DR. STARFIELD: One of the things that this committee is struggling with is how to make our data plans consistent with national goals -- right now, the year 2010 goals, one of which is to eliminate disparities across, you know, race, ethnicity, SES and others.

We had understood from our work with HCFA before that, in fact, there is not a high priority to collect these kind of data, not a priority to get this kind of information. I wonder what your perspective on that was for the future.

MS. DE PARLE: I don't know who told you that. The data itself is one thing, but we now have goals under the Government Performance and Results Act that include reducing disparities among our beneficiary population on things like flu shots and mammography. There is a whole host of particular benefits where we want to increase use among the population as a whole. If any of you saw Dr. Wennberg's recent study -- which was, to be fair, based on data from before the new preventive benefits went into effect -- it showed a fairly dismal picture of what some of our beneficiaries are getting, even with health insurance.

So, those of us who believe that health insurance is a big thing and a positive thing have some explaining to do. So, we want to increase it among our population as a whole and certainly with certain groups that have not had the benefit of those things, we want to do an even better job. So, I am surprised that someone would have told you the data wasn't important.

DR. STARFIELD: No, no, no, no. It is not that it was not a priority, but, in fact, you don't collect the data to make it possible to examine whether you are reducing the disparities or eliminating them.

MS. DE PARLE: I know that in some cases we depend on others to collect that data. That may be what you are thinking of. I am sorry. I am not prepared to answer that. I would be interested in who told you that.

DR. STARFIELD: I mean, the data sets don't -- I think you are right. You depend on Social Security to collect the data. But that is one of the things I think we want to work with you on.

MR. GELLMAN: Your comments on Oasis were very interesting. I am going to tell you why Oasis blew up in your face. First of all, despite all the testing and careful work you did, all you did was talk to the health data establishment. That is why you are not going to get any help on privacy out of this committee, because this is the health data establishment here.

You are going to have to talk to somebody else to get sensitivity on privacy issues. The second reason it blew up in your face is because the world has changed and for 50 years, whatever anyone wanted to do with data was okay because nobody paid attention. Now the public is paying attention and it is not enough to say, oh, this is good, so we ought to be able to do it.

You are going to have to confront the hard issues a lot more carefully.

MS. DE PARLE: That is why I think you weren't -- you came in late. You didn't hear me give the history of this but --

MR. GELLMAN: I heard you.

MS. DE PARLE: It is my understanding it started, you know, ten years ago or so. I think when it started people weren't focusing on this the way they are right now.

DR. COHN: First of all, thank you very much for coming and joining us. I have been on the national committee for two years. I am from Kaiser Permanente.

I have become very impressed by HCFA in the sense that I don't think you have enough staff to do what you need to do, but, yet, you do a very credible job of it anyway. So, thank you.

I do observe there seem to be a lot of various data mandates, many of them legislatively required, that you are involved with. In the spirit of the year 2000 and all of this, I am sort of sensing that a lot of those are falling off the plate, being deferred or delayed, and I think creating a lot of confusion in the health care industry. APCs in the hospital outpatient area are one example of something that was deferred, but with no date set.

This whole area of Medicare risk and requirements for physician encounter data is another example where, for instance, it is in a contract for Medicare+Choice, but yet there are no specifics, and that contract is supposedly for next year.

Do you have any thoughts, ideas about when all of these things are going to begin to come out --

MS. DE PARLE: Part of the problem is what you alluded to at the beginning. These things have been congressionally mandated, and when we miss a deadline, the timeline for implementation is as soon as possible. So, we are in the middle of assessing right now what we can do when, after the year 2000 computer changes have all been made and tested.

Our expert consultants have advised us that we can't just sort of flip a switch after January 1, 2000, and say everything starts again. That, itself, would put too much pressure on the system, both for us and, I think, for many providers. So, we are trying to weigh things out and come up with a schedule for doing things in order of urgency, and it has just taken some time to do that.

DR. COHN: Is that a strategy that you could share with us as it begins to evolve and -- it is something that I think everybody would like to understand better just because you really do have a lot on your plate right now and the ordering, I think, would be very helpful to allow the health care industry to help.

MS. DE PARLE: I can't promise anything imminent. I mean, we are in the middle of working that out and then I need to brief the Secretary on what our plans are. But, sure, when we have plans, I would be happy to share them.

DR. LUMPKIN: Other questions or comments?

[There was no response.]

This has truly been a special opportunity for us to begin to have the dialogue. We started this as a new process of the committee to invite some of our key partners that we are charged with advising to have an opportunity to share their thoughts with us and us to share some of our thoughts and concerns.

We are not a committee that has shied away from taking positions and speaking our opinion. I think that is a helpful process, but we certainly hope and want to continue in partnership moving toward our common goal, which I think is reflected not so much in the administrative tasks that we have to perform as in the overall charge of improving the health of the people in this country.

MS. DE PARLE: I think you are right and I know many of you. I know that you are very busy professionally outside of this committee and want to thank you for the public service that you provide to the Department and to the country by being willing to serve.

DR. LUMPKIN: Thank you very much.

I think we are going to press on. The next presentation is on the IOM report on public sector performance measurements, an Institute of Medicine report?

PARTICIPANT: No. I think this was an error.

DR. LUMPKIN: I just would like to say that I had an opportunity to read it. I thought it was one of the best documents I have read in a long time.

Agenda Item: Presentation on Committee on National Statistics Report on Public Sector Performance Measurement

MS. DURCH: Well, I am pleased to be here to talk about this new report. It is from the Committee on National Statistics, which is part of the National Research Council of the National Academy of Sciences. The IOM is a part of the National Academy of Sciences and there was clearly a lot of overlap of interests in this report.

I believe all the members of the committee got a copy of the report and there is a handout that was distributed that has some material from the report that I thought might be helpful for people to have in case you didn't have the report with you while I was talking.

The report was released at the end of April and Dr. Lumpkin was a member of our panel. So, he got to watch the process from the beginning and was an important contributor as we went along.

It was a project that was funded by the Department of Health and Human Services and got under way in the fall of 1995, sort of in response to the GPRA initiatives and to the proposals for performance partnership grants. There had been a set of program areas, funded on a grant basis, that had been proposed as areas where the usual categorical or block grant mechanism would be changed into a performance partnership grant, with requirements for specific performance objectives and measurement. The panel at the Committee on National Statistics was assembled to serve in a technical advisory capacity on the kinds of measures that might be appropriate as those performance measures.

In the handout is a list of the members of the panel. It was actually a good mix of people from academics. We had people with federal experience and then we had a variety of people with state and local health agency experience from public health departments and from mental health and substance abuse agencies.

The first report, which you may be familiar with, I don't know, was released in 1997. Assessment of Performance Measures for Public Health, Substance Abuse and Mental Health was really the first report that the Department had requested and it provided an analytic framework that the Department and states and communities could use in assessing measures that might be appropriate for use in this performance partnership context.

It included examples of measures that might be used in each of the grant areas that had been part of the initial discussions, including specific examples of outcome measures and some intermediate outcome risk reduction measures. It really didn't try to specify in any detail process or capacity measures, although it recommended use of those sorts of measures; it emphasized that they had to be selected in the context of the work being done and the strategies being adopted to carry out the goals of the grant activities.

This first report also talked a little bit about data sources for the measures that had been proposed. In the course of working on this first report, the panel felt that there were some broader issues that really deserved further attention, and the Department gave us the opportunity to extend the life of the panel. That was the work that led to this second report, Health Performance Measurement in the Public Sector: Principles and Policies for Implementing an Information Network.

It really went beyond the framework that was initially set out for the project. It was an opportunity to look at data and information system issues at the federal, state and local levels. We went beyond the initial federal/state partnership context that had framed the panel's work in the first report and saw that there was real applicability to relationships between state and local health departments, and that there was certainly a range of issues beyond the specific grant areas that had been the focus of the first report.

It retains an emphasis on activities in the public sector -- publicly funded health programs. In the course of our discussions, it became clear that some of these issues probably have relevance in the private sector as well, but the panel didn't feel that it was the appropriate group to extend its comments beyond the public sector framework.

There were several themes that really emerged in the panel's work and are reflected in the report. One is an emphasis on the appropriate use of performance measurement. In the context of health programs, there is a lot that is not yet known, and there are a lot of influences beyond the scope of health departments or publicly funded health programs that have to be taken into account. So, treating performance measurement as a cause-and-effect, either-you-got-it-right-or-you-didn't kind of approach really was not viewed as appropriate. It was more a tool to guide program management, to point toward work that needed to be done, and to guide efforts in directions that achieve the kinds of outcomes you are hoping to achieve.

Another theme that emerged was building on current data and data systems to the extent possible. Performance measurement was not an activity that ought to generate new data collection activities specifically and narrowly for its own purposes; it should as much as possible build on resources that were there. It should perhaps contribute to improving those resources, but it wasn't viewed as an activity that should be self-contained and create its own sort of superstructure.

Another important issue that really received a lot of continuing attention from the panel and in the report is the need for collaboration across all levels of government in trying to do these kinds of activities.

Finally, this is an activity that requires continuing and ongoing attention. You can't really expect to come up with one set of measures, do this once, say you have finished your job and go home. Knowledge is continually increasing, program priorities change, and experience with using performance measurement has to be taken into account. There needs to be a framework for continuing to look at measures that may have been selected, at the performance measurement process, at how to use it effectively, and at whether measures are doing what you had hoped they would do.

There needs to be a way to revise and improve the process over time. These themes were reflected in a couple of ways. The report includes some specific principles for implementing performance measurement: emphasizing program goals as a starting point for thinking about the measures you want to use, so that they are related to what you are trying to achieve; developing identifiable sets of measures, working from similar sets so that you are not faced with the prospect of inventing measures at every step and so that there is some opportunity for comparison over time or across states or communities; and recognizing that information needs vary for different purposes and at the different levels at which performance measurement might be of interest.

Program management interests -- in the sense of, say, a federal agency that is looking at the performance of grantees -- are going to be somewhat different than those of a state health department or a local health department that is trying to actually use this kind of information to improve its direct delivery of services, whatever those services may be.

It is important, in working on performance measurement, to consider the feasibility of data collection and analysis as you are proposing it, and there needs to be an opportunity for periodic evaluation of the performance monitoring system, of the measures that you are using, and of the measurement process.

Again, performance measurement should be viewed as a developmental activity -- we really don't have enough experience with it to know exactly how best to use it as a tool -- and there were some observations again on performance partnership agreements. I suspect you are all familiar with the history of the performance partnership grant notion: the initial idea was that there would be a legislative framework within these grant programs mandating the performance partnership mechanism.

That did not take place, but this whole performance measurement idea is still quite viable and in fact is reflected in the performance measures that the Maternal and Child Health Bureau has adopted for its Title V grants. So, this is still an ongoing process, and there are opportunities to continue to apply this notion.

The work of the panel then also pointed toward the need for better information systems and toward evolving a national health information network -- the term the panel adopted -- and, specifically, a national framework rather than a federal framework. This is something that had to bridge all levels and the public and private sectors, not as a formal, consolidated, single data system, but by improving the ability of the various data systems at various levels, and in the public and private sectors, to work together. This network would, in fact, support a range of health data activities; the panel was looking beyond just the performance measurement tasks and envisioning that a stronger health information network would support a variety of other uses for health data. In thinking about this health information network, the panel emphasized that there is a shared responsibility in supporting it across all of those who would expect to benefit from such an improved data system. Activities that are quite familiar to this group are going to be necessary, particularly in the area of standardization of data and data systems, and any information network of this sort is going to have to be adaptable to change. It, again, is not a static system that you would achieve and then assume the work was done.

The panel developed recommendations in four areas to promote this work in performance measurement and to encourage a broader health information network: policy actions to promote collaboration and coordination; operational activities; investment in data systems and in training health department staff out in states and communities and at the federal level as well; and a research agenda, which is almost generally expected from an Academy report.

In terms of the policy areas, national collaboration really was a key feature. In the handout there is a complete list of the recommendations; I will just highlight a few of these. The issues of national collaboration really emphasized opportunities for participation and, at the same time, responsibility on the part of the participants -- that the states and communities needed a better framework in which to work with the Department of Health and Human Services and the Federal Government in working toward the kinds of information, data systems and measures that would be needed.

In terms of data system integration, the panel recommended that the Department ought to work towards promoting integration of data systems at the state and local level across categorical programs. There is considerable frustration from overlapping or incompatible systems that might, in fact, operate more effectively if grant-related obstacles could be removed.

The panel also recommended that the Department try to identify areas where specific restrictions existed that were preventing such collaboration and work towards removing those. In terms of operational principles, it was viewed as particularly important that the process involve, or be open to, more than just the health department at all levels. There are important contributions to be made from a variety of agencies that have an important influence on health outcomes, whether it is the criminal justice system, transportation or things like highway safety, and there needed to be opportunities for participation from those sectors.

The panel also recommended that a process be developed for ongoing review of performance measures and to promote development of standards for measures, for definitions of measures, and for data collection and data quality. It was important to have a process for assembling sets of measures that could be identifiable and recognizable as resources for people wanting to undertake performance measurement activities, but it was also important to be evaluating those measures.

In terms of investment, the panel's recommendations emphasized investing both in data systems themselves and in reviewing data collection activities to ensure that they were making as efficient use of resources as possible, and also investing in training and technical assistance to ensure that the people in federal, state and local health agencies, or other agencies, had the resources and the analytic capacity to make use of the information that they were being asked to collect and to make effective use of the performance measurement process.

In terms of the research agenda, this is the place where there is an opportunity for federal agencies, and perhaps foundations as well, to become involved in developing the evidence base on which performance measurement should rest -- sort of the science base for this -- but also in testing measures and evaluating the performance measurement process. We really do need to accumulate evidence about that as well.

Finally, there is an opportunity for states and communities to contribute to this process -- not by funding research but by being willing to contribute to research activities, participating in them and helping to encourage the use and dissemination of evidence-based information.

So, I think the panel viewed performance measurement as an opportunity to address a variety of information issues that are relevant for a much broader range of interests and topics. Specifically, given the interest in performance measurement, it was important to ensure that these issues received attention, that this kind of activity was done appropriately, and that it provided the kind of benefit it was intended to.

DR. FRIEDMAN: I have a comment and a question. It is specifically directed to the part of your recommendations that deals with the National Health Information Network. It seems to me that we have sort of a lot of millennial epiphenomena going on, and one of them is that a lot of us are sort of wandering around visioning, or in my case sometimes hallucinating. It is true in health information nationally and internationally, and certainly in the U.S. there seems to be agreement at the very most basic level on goals and at the very high-level content areas, but there seems to be very little agreement on how we get there in any kind of specific sense.

There seem to be several different schools of thought. One is sort of a creationist school of thought, which is sort of benevolent guiding hands, sort of New Testament deity. And I think that is sort of our NHII, where sort of a federal government is going to step in and, you know, help move us there and I think a second is sort of much more of a dramatic big bang theory. Somehow or other it is just going to get done.

The third is more sort of evolutionary, which seems to be the attitude that the panel took. I guess this is directed both to you, John, as well as to Jane. Having said this, it is not clear to me what the next steps are for any of us in a very concrete sense. In some ways it is not fair directing it at you two, since the report is done. So, I guess it is just as fair directing it to ourselves, because we are just beginning the process.

So, I guess my question is in terms of a National Health Information Network, hey, what do we do next?

DR. LUMPKIN: Thank you for the opportunity to respond to that question. I think the answer is, in fact, included in the question, which is that one of the neat things about being on the committees at the National Academy of Sciences is that you don't always have to answer that sort of question.

However, this advisory body doesn't have that leeway, and really it is important, I have found, that one of the biggest obstacles we have in health informatics, whether it be a public health system or a system related to health care, is that it is very difficult for people to accept something if they can't get the vision. When you can't quite understand what benefit it is going to be for society to have that, then all the concerns, many of them related to privacy, some of them related to confidentiality, others to the cost and reality, take the forefront. We see that issue related to immunization registries.

We have kind of been faced with that in Illinois, where as we are trying to envision and convince people that this is a good idea, all the other concerns come forward. But no one asks the parent who brings their child in and they say, well, my doctor keeps those records. Well, where are they? Well, why don't you ask my last doctor? And if it is the case of my son, it happened to be a managed care organization, which no longer had his record. So, they didn't have the records. We didn't have the records. My wife worked for the HMO, so we assumed that they would still be around.

Parents have those problems all the time and when the information system can solve those problems, then they are willing to say "yes," there is a benefit for this risk and part of our goal is to make sure that we have the vision at the same time that we have the concrete tasks.

So, that was kind of a long way around but I have to tell you that I am in politics in government in Illinois and I don't always answer the question directly, if at all.

DR. AMARO: You know, we have a project in Massachusetts, which I am a PI on and Dan has collaborated with us, that the CDC is funding that has brought together sort of holders of lots of different data sets throughout the state with the idea of developing these what are called HIV prevention indicators. We are almost at the end of the project and there are three other sites funded and so we have been trying to develop some common ones, as well as site specific indicators, population-based indicators, using items that exist already in data systems.

One of the difficulties that I have had with this performance measure approach is that I think it is really a misnomer to call it performance-based, because you cannot conclude anything, or very little, about performance based on it. You can look at trends over time. You can make hypotheses about what may have influenced the ups and downs, but it is very difficult to really make attributions as to what causes the ups and downs, because generally the processes involved, you know, which are referred to in your other report, and even the resources and capacity -- the level of data that you would need to make perhaps some educated observations about how, when those fluctuate, performance might fluctuate -- often aren't available or are limited.

So, I have struggled a lot, you know, with this: as much as I would like to see us develop population-based performance measures, I find that in the end it is still a useful endeavor, I think, in that it allows people to come together, to collaborate, to identify data points that can be used. You don't have to start gathering new data -- in this case we are interested in HIV -- but it is very difficult to jump from that to any statement about how, you know, our prevention programs have impacted that.

So, I struggle with that and I am sure the committee struggles with it, too, and I know that sometimes the answer is, well, you have to look at it in context. That is very vague and it still doesn't leave me with much specific about what to do. And I wonder, you know, sort of where the committee ended up on that.

MS. DURCH: That was very much part of the message about using performance measurement appropriately, and it is difficult to make the link between what the health department does and the outcomes that you are interested in. So it is important, if you are going to be using this process, to use it as a tool to help you think about whether things are going the way you think they ought to be going and hope they will be going. Outcomes give you one indicator, but they aren't sufficient by themselves. Part of the message of the first report was that relying only on outcome measures isn't adequate because there are so many other things going on between whatever the health department may be doing and those outcomes.

So part of the concern of the panel, seeing some interest in trying to use performance measurement as a real resource allocation tool -- either you achieve this outcome or you lose your money, that sort of thing -- was that the tool was not good enough to support that kind of application. It is a mix of caution about how far you can push it and looking back toward the sort of process and capacity side, or the intermediate short term outcomes, as ways of helping you get a better sense of whether things are contributing to the right kind of outcome.

It is not as satisfying as everyone would like it to be but I think it is -- it is a way of encouraging people to look systematically at what is going on, but cautiously in terms of what the tools allow you to do. And it is part of the emphasis on continuing to examine what tools you are using that as your information increases, maybe there are better tools out there or learning more about how you can best use performance measurement as a tool to ensure that it is not pushing things in the wrong direction or being used inappropriately.

DR. LUMPKIN: Thank you. I think we are about out of time. Thank you for the presentation and bringing this issue before the committee. We will continue to struggle with this issue and I think we will need to look at some of the recommendations particularly as it relates to information and information structures.

We have spent a lot of time in our quality committee looking at how we are going to measure performance of the health care system, but I think it is also important for those of us who are in government to be held accountable. Certainly, I know our legislature tries to do that to us, and it is better that they hold us accountable on reasonable measures, rather than measures that they may dream up. So, it really is, I think, an important thing for us to keep in the forefront of our minds.

Thank you.

We have one other item before we break for the work groups and subcommittees. We have the health data needs of the Pacific insular areas, Puerto Rico and the U.S. Virgin Islands.

Agenda Item: Health Data Needs of the Pacific Insular Areas, Puerto Rico and the U.S. Virgin Islands

MS. WARD: I don't think this has the same level of debatable issues that our other committee report has, but unfortunately that is part of why we selected this as an issue. What we discovered, with Hortensia pushing our subcommittee, is that there is an area of the world that has sort of been suffering from a lot of benign neglect. That is why we looked into the whole assessment of the health data needs of the Pacific insular areas, Puerto Rico and the Virgin Islands, to get a sense of that issue.

We held hearings, and I was very impressed with the amount of interest that representatives from these areas demonstrated by traveling for days, some of them across multiple time zones, for the hearings, and we learned a lot. You have here additional copies of the report that have been cleaned up a little bit from what was sent out to everybody.

Our intent in having it here is that we would like, as with the previous report, that the recommendations come out from the whole committee. So, as you read it, there is a tremendous amount of information about a part of the world you were probably as unfamiliar with as we as a subcommittee were, but please pay attention to those recommendations. That is what we are hoping the whole committee will endorse.

One of the things that we do have to say is that we do still have to send this back to all of those representatives from those areas for them to review it, to be sure that we have not misrepresented any of the things that we learned from them. Because of that, if there are significant changes to the recommendations -- which I don't think is going to happen, but if that were the case -- then I would ask that we review that again. We would obviously change the report and come back to you.

But we don't think that that will change the recommendations. I don't want to read those recommendations to you. Please read those tonight and look at those, and we will ask that you discuss them or ask us, the subcommittee, questions about those recommendations. They all are tied up in allowing more flexibility for those areas to use the federal dollars. We are asking for some more direct assistance to those areas so that they can do a more adequate job of collecting data and analyzing the data so that they can manage their health needs.

DR. LUMPKIN: So, we will be asked tomorrow to approve the recommendations --

MS. WARD: That is correct.

DR. LUMPKIN: -- for the report and they will be made a report of the committee pending -- unless there is major objection from the involved parties for the territories and insular -- I would just like to have us walk through the recommendations. Maybe if we can just go through one at a time just to make sure that there are no issues that may rear their head tomorrow.

MS. WARD: Okay. One of the things you will find as you read through these, that we don't have recommendations that are for all of these areas and that is -- I will perhaps also point out to you because one of the things we discovered as a subcommittee is that each one of these areas has a very different government-to-government relationship with the United States. So, recommendations differ according to the areas of the world of which we are speaking.

Recommendation 1 is particularly related to Puerto Rico, the Virgin Islands and Pacific insular areas in terms of asking Health and Human Services to improve health information infrastructure. This is in the way of tools and trained professionals so that those areas have more skills in being able to collect and use data.

What we found is over time there had been a central office that had been sort of a coordinator of services, training, skills, assistance, and that has been eliminated. So, that is one of the things over time that has provided a significant problem for some of these areas.

DR. LUMPKIN: Any comments on No. 1?

[There was no response.]

Okay.

MS. WARD: Recommendation No. 2 is specific to the Puerto Rico Department of Health, and what we are asking is that the National Center for Health Statistics, which is the federal agency we are speaking to here, consult directly with the Puerto Rico Department of Health to learn the status of recommendations that we discovered had been made as a result of the site visit and to see if there are specific things that they can do for Puerto Rico to implement the recommendations that had come out of an earlier visit.

DR. LUMPKIN: Any questions on that one?

[There was no response.]

3?

MS. WARD: We are again asking the National Center for Health Statistics to expand its vital statistics program into American Samoa, the Commonwealth of the Northern Marianas and Guam. What we found is those are areas that are truly miles and miles away from anyone else and are out there on their own and have a tremendous problem in terms of getting vital statistics systems up and running in their jurisdictions, and we heard about the difficulty of even being able to document births or deaths. All we are asking for is to see what we could do to help them out.

And part of that comes where you talk about curriculum and training because they have obviously had some unique problems about getting people trained. If you look at the map we found that as a committee we would look at the map and say, well, why don't you three groups over there sort of spend more time talking to each other and share staff and then they would describe the fact that even though they are on the map in proximity, it takes you 12 days because you have to go from here to here to Australia and back up to Micronesia and you can't just leap across because of transportation routes and those kinds of things.

We were really overwhelmed, I think, as a subcommittee by the difficulties of these people trying to run basic vital statistics and data collection and being able to respond then to any of the federal monies that are available to them. They are, in fact, being left out because they don't have the basic data to compete. I was also very impressed as a subcommittee member with the people who were there from the National Committee, thinking out loud with us and with the participants about what could be done to try to get more assistance. I was very pleased with the federal entities who came to the table with us. We saw as a committee interest from every one of those federal agencies that were participating in the discussion.

If anything, I think that is probably one of the biggest benefits I saw from having the hearings: we did in many ways substitute for a part of the Federal Government that is no longer there, convening a group to bring these people from very isolated areas together to talk about what their problems are, and bringing the federal agencies to the table so that they could do some common problem solving, which we found had not occurred for a long time because of the difficulty of anyone being accountable for trying to do that kind of problem solving.

No. 4 was directly related to what we discovered about the National Institute on Drug Abuse and mental health issues. We are again asking Health and Human Services to encourage SAMHSA to continue its plan for developing some particular services for these areas.

DR. LUMPKIN: Anything on 4?

[There was no response.]

MS. WARD: No. 5 is very similar: because of the association between drug and alcohol abuse and so many of their other health conditions, particularly communicable disease, again to see what SAMHSA and CDC can do to provide more training and integration of their related data collection activities.

DR. LUMPKIN: Comments on 5?

[There was no response.]

6.

MS. WARD: We found the Healthy People 2010 approach might really help them -- that process, which is new for everybody, could be an impetus for expanding assistance to the Pacific insular areas, Puerto Rico and the Virgin Islands. So, that objective talks about making the Healthy People 2010 process available to them.

And No. 7 is related to that: in the Healthy People 2010 effort, make sure that they are including and reporting on these areas, which are considered U.S. populations.

Another area we got into about how they are being left out, as I sort of referenced before, was the issue of categorical and block grant funding, and, again, we have asked Health and Human Services to try to encourage those grant setting entities to work out whether block grants could be directed to and be more inclusive of these areas.

We also asked the Census Bureau and Health and Human Services to help them with official population figures. As I said, some of those areas are really struggling with making sure that they have good denominator data.

No. 10 is another Health and Human Services essentially related issue, looking at helping them produce accurate and timely census data.

No. 11 is developing standards for health data requirements, and it notes that some of the problems are that so many are actually separate nations. So, you have nation-to-nation issues related to getting assistance to them.

And at the data and policy table, we sort of put another category under there: trying to see if there is some way, even though this particular office has been eliminated, that we can encourage getting, at some point, representatives from these islands and jurisdictions together again, to still be able to participate, talk to each other, share concerns and see what is available for helping.

No. 13 is just another, more detailed version of that: to get these areas together and to look for extending activities to them.

14. Here we have targeted the Department of the Interior and Department of Commerce to also get involved in health data systems and they all actually began -- as we began to see how different these areas were, we began to expand into other kinds of federal entities that we don't normally ask to get involved, but found that because of their particular makeup and the fact that some of these are Pacific nations and commonwealth, that we ask these other agencies to become involved in assisting them.

DR. LUMPKIN: Any questions or comments on the 14 recommendations?

Actually I have one. You mentioned a couple of times an office that was no longer in existence.

MS. WARD: It is in --

MR. HITCHCOCK: It is partially still in existence. It was a major player, apparently, in the Federal Government in coordinating activities with the Pacific insular areas and the Virgin Islands, though not Puerto Rico. Some of the folks are still there, but during the reinvention and downsizing period it was eliminated and got subsumed by a larger office, where it has neither the clout nor the funds, as I understand it, nor the staff to really do the job that they used to do.

MS. WARD: It was interesting to hear them talk about as the different representatives say, well, we used to be able to do and then we used to be able to do. So, we were struck that that was a theme through a lot of the testimony about the fact that this budget cleaving had unfortunately really cut off access to some of these areas.

We didn't want to presume, in having the committee make a recommendation, that that kind of budget solution be created, because it obviously got done at a time when there were some significant budget cuts, but rather to get a better sense of how, with the existing resources that are still there in the Department of the Interior, they hopefully could pay a little bit more attention.

DR. LUMPKIN: I know we are running a little bit over schedule. I am a little bit concerned that, looking at this document in total, it basically says to me that within HHS there is no one who is concerned about the health data needs of these areas. There is a piece of recommendation here and a piece of recommendation there. Maybe we need to make some recommendation that there ought to be at least one central focus, a point of contact within HHS for the health data responsible organizations in each one of these territories and insular areas. It just doesn't seem to hold together. We could go another five or six years until we have another report that would come up with the same recommendations because there is no one who is responsible.

So, perhaps -- I mean, if it is agreeable if the committee could come up with some sort of recommendation that could be added to the report for our consideration tomorrow.

PARTICIPANT: Responsible and accountable.

MS. WARD: We can certainly take that back. I think part of our dilemma was that because of the differences of all these entities that we brought together for one presentation, I am not sure that we can go back and say this is where it should be. But I think we could generally rediscuss that because I am trying to remember the times that we had on our subcommittee, but, yes, we can address this in this afternoon's --

MR. BLAIR: It might be possible that you could assign responsibility or target to the individual recommendations in some cases even if you can't do it globally. At least that way there is some tag for who would go forward. I didn't mean to dilute what you are saying. I think she was just saying that there may not be the ability to have global responsibility. Make it as global as you can, but, you know, in defaulting that, then at least tag the individual ones with some responsibility.

DR. LUMPKIN: I actually had a stroke of genius whispered into my ear and that is that perhaps the recommendation which would be No. 15 would ask the Data Council to address this issue.

MS. WARD: That would be very --

DR. LUMPKIN: You know, to identify within HHS a lead responsibility.

MS. WARD: Trying in a few minutes to go back and recall the discussion, part of the dilemma is that, as I said before -- and the staff were able to correct me -- the point of contact that used to do this was in the Department of the Interior, and that was part of our dilemma. If we wanted to have it work the way it used to, we would be making a recommendation to the Department of the Interior. It is not a department we are used to interacting with.

DR. LUMPKIN: And we are charged to advise the Secretary of HHS. We can make a recommendation that HHS should rely on the Interior, but since that Interior office went away, I don't think there was necessarily a compensatory response by HHS to say these issues now require our attention.

MR. BLAIR: Did you -- were you able to determine any idea of what a budget would be to be able to accomplish and support these recommendations?

MS. WARD: My vague memory is that was a million dollar office, but I don't know why that stuck in my head. That is what I think the person from the Department of Interior said that is what that office used to cost.

DR. AMARO: But that doesn't mean we would be able to accomplish these --

[Multiple discussions.]

DR. LUMPKIN: But that doesn't mean in order to do this it costs a million dollars because there really is -- it may be one FTE or someone who is going to coordinate currently existing programs. The point is that our recommendations don't hold together because there is no one that we said should be tracking the issue and that is really what we are going to be asking the Data Council --

MS. WARD: I think that is a very different issue than what we found the Department of the Interior was doing. I think that is why we will need to be very careful, because, having heard them talk about what they used to do in terms of the travel, the skills, the consultations, the technical assistance, I am not surprised that was a million dollar office. But that is an entirely different function than asking Health and Human Services to look in its agencies at what they are doing about those areas. That we can certainly look at.

MR. SCANLON: Within HHS, there are actually -- there are a couple of possibilities. There is a Pacific Basin Health Initiative, but that wouldn't include Puerto Rico and the Virgin Islands. At the same time our Office of Minority Health feels that they have kind of a central role. In fact, they helped to pay for the hearing that we did have in this area.

So, you may want to think -- and HHS is not going to encroach on the authorities given to DOI for actually dealing with the governments of these programs or census in terms of the basic statistical capability. But you may just want to -- you might want to consider OMH or you might just want to say that HHS should designate a focal point and let the Department -- I mean they could designate a committee like the Data Council, but that is not going to help you in terms of a national focal point that has day-to-day responsibility.

DR. LUMPKIN: Okay. Anything else?

[There was no response.]

I would like to thank the committee, because clearly the nature of the recommendations indicates that there certainly is an issue here that needed to be addressed, and it is important that we get it out.

At this point, we are going to break up into the two subcommittee meetings. I would like to remind everyone that tomorrow morning that the Subcommittee on Privacy and Confidentiality and the Work Group on National Health Information Infrastructure will be meeting.

Do we have the room number for privacy?

PARTICIPANT: Here.

DR. LUMPKIN: Oh, okay. So, we will meet here and the NHII group will be in 440D, and then we will convene the full committee meeting at 10 o'clock.

Thank you.

[Whereupon at 3:34 p.m., the meeting was recessed, to reconvene at 10:00 a.m., the following morning, Thursday, June 24, 1999.]