[This Transcript is Unedited]

DEPARTMENT OF HEALTH AND HUMAN SERVICES

National Committee on Vital and Health Statistics

June 26, 2002

The Wyndham City Center Hotel
Potomac Room
1143 New Hampshire Avenue, N.W.
Washington, D.C. 20037

Proceedings By:
CASET Associates, Ltd.
10201 Lee Highway, Suite 160
Fairfax, Virginia 22030
(703) 352-0091

P R O C E E D I N G S: [9:00am]

Agenda Item: Call to Order, Welcome and Introductions, Review of Agenda

DR. LUMPKIN: Good morning. We are going to call the meeting to order. We have a quorum.

First of all, I think that for those of you who are looking at the agenda, we have some very interesting items as we begin to look forward to issues related to -- there have been a number of recent announcements.

There is a lot of pessimism in the world about whether we will use the informatics on a transformation of -- so, this should be a very interesting discussion.

Let's start off with introductions. My name is John Lumpkin. I am the chair of the NCVHS and also the director of the Illinois Department of Public Health.

MS. GREENBERG: I am Marjorie Greenberg from the National Center for Health Statistics, Centers for Disease Control and Prevention, and executive secretary to the committee.

MS. TRUDEL: Karen Trudel, Centers for Medicare and Medicaid Services, acting liaison to the committee.

MS. COLTIN: Kathryn Coltin, Harvard Pilgrim Health Care, member of the committee.

MR. NEWACHECK: Paul Newacheck, University of California, San Francisco, member of the committee.

DR. COHN: Simon Cohn, national director for health information policy for Kaiser Permanente and a member of the committee.

I should take this opportunity to inform the committee -- I guess you all know that I am a member of the AMA CPT Editorial Panel; therefore, I will be recusing myself from voting on a letter related to code sets coming before the committee for discussion today. I guess I would ask the permission of the chair if I can participate in that discussion, however.

DR. LUMPKIN: Can I just ask a question? Do you get paid for doing that?

DR. COHN: No.

DR. LUMPKIN: So, that is a voluntary position.

DR. COHN: Yes.

DR. LUMPKIN: So, actually the conflict is not based upon financial issues. It is just the fact that you have been participating in a voluntary fashion on that committee.

DR. COHN: Yes.

DR. LUMPKIN: I think we need to clarify that. It is your choice whether or not you choose to vote, but I don't think that taints your participation in any discussions or actions of this committee, now that we know that.

MS. GREENBERG: It might be good -- again, maybe at the next meeting -- to have the staff from the ethics office at CDC review with us the conflict issues, but it is -- Dr. Cohn does have a limited waiver, which specifically says that on issues that are related to his role on the CPT Editorial Panel, he has recused himself from discussion and voting.

I think a general discussion is probably -- you know, about the code sets, he certainly doesn't have to generally recuse himself from, but anything specific to CPT, he would need to recuse himself. These types of decisions are not just based on whether a person is paid. In a sense, I mean, I guess probably your travel expenses are paid.

DR. COHN: Yes.

MS. GREENBERG: But because the conflict of interest issues are broader than whether you are actually gathering -- we can have this explained in fuller detail, but he is correct to recuse himself.

DR. LUMPKIN: These things are always simpler at the state level where -- because as Jim says we have the Federal Government watching over us.

Okay. I don't think that is the way we look at it. Barbara.

DR. STARFIELD: I am Barbara Starfield from the Johns Hopkins University and member of the committee.

MR. ROTHSTEIN: I am Mark Rothstein from the University of Louisville School of Medicine and member of the committee.

MS. KAMINSKI: I am Stephanie Kaminski from the Office for Civil Rights, lead staff to the Subcommittee on Privacy and Confidentiality.

DR. FRIEDMAN: I am Dan Friedman with the Massachusetts Department of Public Health and a member of the committee. The fan down here is sort of loud. So, you folks who are at that end, if you do want us to hear -- if you don't want us to hear, just speak in a normal voice.

DR. LUMPKIN: Dan, all I can say is that the people at this end of the committee had control over the seating chart.

[Laughter.]

DR. FRIEDMAN: Thank you, John.

DR. SHORTLIFFE: I am Ted Shortliffe from Columbia University Department of Medical Informatics and a member of the committee.

DR. ZUBELDIA: I am Kepa Zubeldia, member of the committee and chairman and CEO of Claredi Corporation. I am also chair of the Association for Electronic Health Care Transactions.

Since tomorrow we will be talking about record linkage, I do have a patent on a specific way of anonymously linking records that was assigned to Arcamus(?) Corporation a couple of years ago. I have no interest in it, but I do have special knowledge in anonymously linking records.

MS. GREENBERG: Unless that particular activity is discussed --

DR. ZUBELDIA: I have no financial interest in the patent at all.

DR. LUMPKIN: I think this committee would be in trouble if we excluded people based upon special knowledge.

[Laughter.]

DR. MC DONALD: Clem McDonald from Indiana University and the Regenstrief Institute. I am chairman of the LOINC Committee and I am a member of HL7, ASTM and ISO, none of which pay anything.

DR. FITZMAURICE: Michael Fitzmaurice, senior science advisor for information technology at the Agency for Healthcare Research and Quality, lead staff on the Secretary's Council on Private Sector Initiatives to Improve Security, Safety and Quality of Healthcare, liaison to the National Committee and staff to the Subcommittee on Standards and Security.

MR. BLAIR: I am Jeff Blair, vice president of the Medical Records Institute and member of the committee.

MR. SCANLON: I am Jim Scanlon from the Office of the Assistant Secretary for Planning and Evaluation in HHS and I am the HHS staff director for the committee.

MR. RENNER: I am Phil Renner, director of measures development at NCQA. I will be one of the presenters today.

DR. MAYS: Good morning. Vickie Mays, University of California, Los Angeles.

DR. LUMPKIN: Now for the audience.

MS. CHALK: I am Mady Chalk, director of the Office of Quality Improvement and Financing in the Center for Substance Abuse Treatment in SAMHSA and presenting today.

MR. GANJU: I am Vijay Ganju, the director of the Center for Mental Health Quality and Accountability at the NASMHPD, NASMHPD being the National Association of State Mental Health Program Directors. I am a presenter today, as well.

MR. BETA: Matt Beta(?) with Kennedy Creger(?).

MS. GRADY: Debbie Grady. I am a graduate student at the University of Maryland.

MS. BOIS: Danielle Bois(?). I am with the American Association of Health Plans.

MS. CANAN: Susan Canan(?), writer for the committee.

MS. BARTLETT: Melissa Bartlett with the American Association of Health Plans.

MS. AULD: Vivian Auld, National Library of Medicine.

MR. TATE: Michael Tate, the American Dental Association.

MS. HORLICK: Gail Horlick, Centers for Disease Control and Prevention.

MS. JANE: Gail Jane(?), CDC, Atlanta, staff to the committee.

MR. ETTINGER: Stan Ettinger, AHRQ, and staff to the Quality Work Group.

MS. WILLIAMSON: Michelle Williamson, National Center for Health Statistics, CDC.

MS. PICKETT: Donna Pickett, NCHS, CDC.

MS. ARSWAGA: Pat Arswaga(?), Blue Cross and Blue Shield Association.

MS. BEBEE: Susie Bebee, NCHS, CDC, staff to the Subcommittee on Standards and Security.

MS. JACKSON: Debbie Jackson, National Center for Health Statistics, committee staff.

MS. EISEN: Sue Eisen. I am a health research scientist at the Center for Health Quality Outcomes and Economic Research.

MS. SQUIRE: Marietta Squire, NCHS, CDC.

MS. WILHYDE: Carol Wilhyde(?) with Magna Systems.

MS. ADLER: Jackie Adler from NCHS.

MS. WEISS: Chris Weiss(?) from Health Link, Incorporated.

MS. DONOVAN: Nancy Donovan from the U.S. General Accounting Office.

MS. SKIVASS: Joanne Skivass(?), SNOMED International.

MS. ASHMANS: Diane Ashmans(?), SNOMED International.

MR. WINE: Mark Wine(?), Department of Veterans Affairs. I am from the Office of the CIO, program manager for Health Information Technology Sharing.

MR. GIROSA: Clem Girosa(?), Mirec Systems, Inc. We develop medical software.

MS. LEON-CHISEN: Nelly Leon-Chisen, American Hospital Association.

MS. CLAUS: Linda Claus(?), AHIMA, American Health Information Management Association.

MS. PROFITT: Sue Profitt, American Health Information Management Association.

MR. RODEY: Dan Rodey(?), AHIMA.

MS. WHITE: Tracy White, NCHS.

DR. LUMPKIN: Thank you. We are going to start off with the updates from the Department. Jim.

MR. SCANLON: Thank you very -- sorry?

DR. LUMPKIN: I did forget something. Just a reminder. We are, as usual in our meetings, broadcasting, simulcast and delayed simulcast, across the Internet. So, I may at some point interrupt you if you are speaking and I get a high sign from the folks running the system that they can't be heard. So, I may ask you to move the microphone closer to your mouth while you are speaking.

It is not being rude. It is just so everybody can hear, including Dan down at the other end of the table. Thank you.

Jim.

Agenda Item: Update from the Department

MR. SCANLON: Thank you, John.

Well, a number of developments have occurred since we met as a full committee, I guess, in April -- February and even more things and we are going to have -- Stephanie Kaminski will be talking about the privacy regulation in a few minutes. Karen will be talking about the data standards, actually some progress there as well.

Later, this morning, we will be talking about the Federal Interagency Consolidated Health Informatics Initiative. So, I am not going to talk about those. I am going to talk about some other things of more general interest.

Specifically, I am going to talk about the bioterrorism grants to the states. I am going to talk about the status of our National Academy of Sciences study on the adequacy of race and ethnicity data. I am going to talk a little bit about our -- every federal agency has to issue data quality guidelines.

We have some draft guidelines up on our web. I want to talk a little bit about that and ask the committee to look at those. I am going to talk a little bit about -- further about our HHS strategic plan, which is at the present being revised and I think we would like the committee to take a look at some of that as well.

So, let me start with the bioterrorism grants. On June 6th, HHS announced the award of nearly $1.1 billion in grants to the states for bioterrorism preparedness and prevention for public health protection, as well.

There are basically two parts to these grants. One is the CDC grant, which deals with strengthening the public health system in the states to prepare for and prevent these eventualities. The other part was really through HRSA, which was a grant to the states to promote hospital regional preparedness plans.

The progress reports are due on October 1st, but as you can imagine, this was a very complicated -- this was basically 50 states sending in two plans, which each contained about 18 different -- would address about 18 different themes and areas. One of the areas that the states could choose to address was the area of information infrastructure and information systems planning in either of those plans.

So, this could, to the extent that states are using funding for that purpose, actually help to create the market for some of the information systems and standards that we have been talking about. This is probably the biggest infusion into the public health system, the national public health system, in quite a while, probably in a couple of decades.

On the other hand, I think we are all a little bit worried that if every state and every hospital goes its own way, we will basically be in the same place in five years with regard to information infrastructure systems as we are now. The connectivity and the interoperability won't necessarily be solved unless there is kind of a basic framework.

To that end, CDC and HRSA both issued guidance for information infrastructure planning and it was basically a NEDSS oriented -- a standards oriented kind of an approach. So, we are having at our Data Council meeting in July -- we are having a review of sort of the highlights and summaries of what the states actually proposed. So, we will look at the strengths and limits of what has been proposed and we will try to see where there are some general issues.

Secondly -- any questions on that, John? I know Illinois got a public health grant as well and is working on the HRSA grant. Three cities as well received funding; Los Angeles, Chicago and New York City because of the sensitivity there as well. So, I think we are all hoping that this will create some of the capital in the public health system to actually move forward with some of these approaches.

But, again, without a kind of a national framework for standards approach, these could end up being very expensive silos as well. Let me go on to the National Academy of Sciences study.

You will remember that Congress last year directed HHS to fund a study at the National Academy of Sciences to look at the adequacy of race and ethnicity data in the Health and Human Services data systems, but also in other private sector and other public and other agency data systems as well.

You have a copy of the -- the panel has now been put together. Ed Perrin, I think many of you know Ed Perrin, is chairing the panel. Our distinguished chairman is also a member of the panel and we are still working on a liaison between the committee and the panel at the National Academy of Sciences.

So, you can take a look at the panel composition and their basic mission as well. It is basically about an 18 month study. So, I think they plan to get the committee involved and we hope to hear some recommendations from them. I would stress, though, that we are asking them to look at private sector data capacity as well.

I think in HHS we have known for a long time sort of the right thing to do in this area. It is purely a matter of resources. On the private health sector side, though, I think there really is some confusion about what is allowed and what is not allowed. When you rely on third party systems, like medical records and bills, it is not clear. HHS doesn't have that much authority over those.

So, we have asked the panel to look specifically at what are the -- what is the current state of the art in private sector data systems, what are the barriers to further progress and to make some recommendations there as well. I think the IOM will be called in to help us with that.

I think they are thinking of a one or two day -- I won't use the word "summit" because we have had so many now that it has become meaningless, but a one or two day workshop where we can work through some of these missions. Any questions on the Academy study?

MS. GREENBERG: I would just note that another former member of the committee, David Williams, is also on the panel.

MR. SCANLON: That is right. David Williams is there as well.

DR. MC DONALD: Could I ask a question about the previous subject that you said, the HRSA report?

MR. SCANLON: Sure.

DR. MC DONALD: Is that available, those guidelines and recommendations, about the NEDSS --

MR. SCANLON: Sure. It is a public --

DR. LUMPKIN: It is on the CDC web site. Just do a search for N-E-D-S-S. And the standards -- there is also a public health data conceptual model that Denise Koo --

DR. MC DONALD: Yes, I am actually aware of that. I just -- what specific recommendations did HRSA make? Is that a separate document?

MR. SCANLON: Guidelines.

DR. LUMPKIN: Yes, they are guidelines and those are also available, I think, on the HRSA web site.

MR. SCANLON: Let me make sure, but we can certainly make them available. They were part of the guidance to states -- by the way, you have a press release that summarizes the grants to the states and the two parts as well.

DR. STARFIELD: What is the length of the IOM study? It is going to have five meetings, but I don't know over what period of time. I scanned this. I didn't see it.

DR. LUMPKIN: They promised 18 months.

MR. SCANLON: By next summer basically we would hope to have -- we in HHS would hope to have some preliminary report. I think the panel -- you are probably thinking about having four or five meetings, a workshop and some papers.

Let me talk a little bit now about -- I think I reported briefly at the beginning of this process the Shelby 2, the famous Shelby 2, statute that recently -- well, last year passed and it requires basically all of the agencies, federal agencies, cabinet agencies to have and publish guidelines for ensuring the quality of the information that they disseminate to the public.

Shelby 1, you remember -- there are two parts to the Shelby -- Shelby 1, basically made the information that an extramural researcher holds subject to FOIA -- remember, that was last year. That went into effect last year.

I am not sure if there have actually been any requests, but if someone asked the University of California at Los Angeles -- if someone was interested in the study Vickie was doing, someone from the regulated industry, and they wanted to see the actual raw data, they could actually request that from Vickie as a FOIA request, not necessarily the identifiers, but basically the unit information. That is already a statute.

Shelby 2 went further and said that OMB was to issue guidelines providing -- well, basically, directing all of the federal agencies to have guidelines of their own to assure the quality of the information and data disseminated to the public. I am going to talk a little bit about what information means and what dissemination means.

But, basically, this is a fairly -- this could have tremendous implications. Quality includes objectivity, utility and integrity as well. So, it encompasses a lot of issues.

The time line for this was that OMB issued their final guidelines in January and all of the federal agencies were required to put up their draft guidelines on their web sites by May 1st.

We in HHS did that. So, our guidelines are on the HHS web site. It is HHS.GOV/INFOQUALITY. They have been up there for a 30 day comment period and we have received about 18 comments so far. We were asked to extend the comment period. So, we are going basically until -- it is now 60 days, but it ends next Monday.

But I think for the committee we could probably get -- if there are any comments from the committee, we could certainly factor them in as well. Within HHS, we approached the guidelines in the following way. We developed HHS-wide umbrella guidelines that apply to all of our agencies and then specific agencies within HHS developed guidelines within the umbrella.

So, every single agency from the Administration on Aging to the Food and Drug Administration to NIH to NCHS has guidelines that are under -- specific guidelines under our umbrella guidelines as well. The guidelines must include a process for the public to request correction of any information they believe is not concordant with the guidelines.

We have described what those various request for correction processes would be. You actually have to be fairly specific about what information it is you are asking to correct. Why do you believe that things were -- what do you think the correct version would be? So, all federal agencies have to do this.

Let me say a few words about what we mean by information because it is not necessarily every piece. It basically applies to substantive information. So, it is basically the reports, statistics and authoritative information. In HHS we issue a lot of authoritative public health information. It is not basic internal operations or administrative kinds of information.

Secondly, dissemination means that the agency had to initiate or sponsor the dissemination. So, it would not apply in most cases to most of the extramural research that NIH sponsors, for example, where the dissemination of the information is the sole responsibility of the author. That is not considered to be agency-initiated or agency-sponsored dissemination. If the agency does embrace it, it puts its own imprimatur on it in some way. Then it does become an agency-initiated dissemination and it is subject to the guidelines.

The guidelines don't cover hyperlinks. For example, a lot of our web sites have links to other sources of data. It is not our data. We are doing it for the convenience of the user. So, it wouldn't include those links and it would not include opinions, as you would imagine, where the presentation makes it clear that what is offered is someone's opinion rather than an agency view. So, there has to be some sort of a disclaimer.

In the case of intramural research, statistical programs and so on, it kind of depends because most of the time intramural analysts and researchers giving papers are often doing it as part of a formal dissemination program. It is hard to say that is not an agency dissemination. But it would depend on what the disclaimer was and the specific role of what was being done.

Dissemination does not include the following: It doesn't include sort of limited distribution to government employees or contractors or grantees. It does not include intra-agency or interagency use or sharing of information. It doesn't include individualized kinds of responses like FOIA requests or FACA or Privacy Act requests. It doesn't include sort of basically correspondence to individuals rather than the general public. It does not include press releases. It doesn't include library activities, obviously, and some other traditional kinds of processes.

So, even with the exemptions, though, you can see this applies to virtually all of the scientific and statistical information, analytic and evaluation information that the agencies directly sponsor. So, in HHS, it would include the following: results of our scientific research studies; statistical and analytic studies and products; programmatic and regulatory information, including evaluation; public health surveillance; epidemiological studies and risk assessment studies and, finally, authoritative health and medical information.

When CDC or HHS or FDA issues a warning or an alert or guidance, we basically consider that authoritative health information. That is subject to these guidelines as well, although we have reserved the right to waive the guidelines -- or some of the guidelines -- in the event of urgent or emergency kinds of situations. So, the time line now is that the comment period ends in about a week.

All the agencies have to get their revised guidelines back to OMB by August 1st. So, we have basically a month to do that. OMB will then give us comments and then we will finalize the guidelines, post them on our web site by October 1st.

They will be in effect October 1st, including the correction mechanism. So, at any rate, that might have -- we have got about 18 comments so far from fairly big organizations and we are hearing from both sides of the spectrum. But this could have, depending on where it goes, as you could imagine, a very big impact on data policy within all the federal agencies. Let me -- I am sorry?

DR. LUMPKIN: Where should comments be directed?

MR. SCANLON: Send them to me. We are taking comments electronically or via mail, though our mail is actually going off for irradiation, really to Ohio. So, we probably lose about two weeks. So, if you go right to the web site -- I will give you the web site -- and you could just give us electronic comments. The web site is HHS.GOV/INFOQUALITY.

DR. MC DONALD: I don't have a good sense of what the issues are on this. Could someone just sort of say what are the sides that might oppose it from either side or what is new about this that would be --

MR. SCANLON: Well, it is -- in fact, at HHS, I think we already -- we actually have sort of a philosophy of data access and quality planning for these sorts of things anyway. It is not really a new requirement on HHS, but it has now become a formal requirement. It originated -- the impetus, I think, tells you where it is coming from. The impetus for this arose in a regulatory setting.

EPA had published regulations a number of years ago and they had cited a study that was performed -- it was a grant from NIH and it was performed by researchers at the Harvard School of Public Health; the Six Cities Study. I think you all are familiar with that; the particulate, the air quality studies.

That information was used at EPA to propose the standard for air quality and the industry understandably wanted to get a look at the information and for whatever reason, the researchers just didn't make it available. It led to lawsuits and finally it led to a number of industry groups finding a sympathetic ear in the Congress. Senator Shelby was on the Senate side. So, it was basically that lack of access to information that was proposed for use in a regulation, more or less by the opposing side.

I mean, there are always two sides to these things. They had a very difficult time getting it and it led to that. So, basically, Shelby won because that information at the Harvard School of Public Health is now FOIA, is now subject to FOIA. Secondly, federal agencies have to have these guidelines. So, what we are seeing in our comments is basically the consumer protection side of the House saying that you have to be careful with these guidelines.

We think they have only been set up to kind of slow down the regulatory process. And the industry and the lobby groups that say these are a good idea, you know, and we are happy with this or that. The HHS guidelines, we are -- some of the comments are directed at how OMB interprets what is in and what is out and so on. In terms of the HHS guidelines, the research community is very happy with the way we came out.

Folks are happy with the way we came out on FDA, but they always have specific comments, tweaks of how to do this or that. But it arose as a regulatory issue. It basically arose as a regulatory issue. There is some desire to have -- and, obviously, information that was used to support a regulation is subject to these guidelines.

Let me just finish up quickly now with the HHS strategic plan. You remember, GPRA, the Government Performance and Results Act requires all federal agencies to have strategic plans and we have had one, obviously, over the years. We have begun the process within HHS of revising and updating the HHS strategic plan to reflect the current leadership. The other one was developed about three years ago. The plan basically will include broad goals, broad HHS goals, objectives under each of those goals and it will include the basic strategies that HHS uses to achieve those goals.

I am happy to say that we are working on -- you never know how these come out. We are actually working on some NHII kinds of strategies and goals for the strategic plan. Obviously, these would have to be what HHS has the authority to do. We can't just say we are going to do something in general, but if it makes it through the revision process, that will actually add that dimension.

The draft strategic plan will be available for comment, public comment, later in the summer and we will have it up on our web site but we can actually have the committee take a look individually or collectively as well. So, this will be an updating of the HHS strategic plan covering all of the programs and all of the agencies. So let me stop there.

MR. BLAIR: Question. You used the word "some" of the NHII recommendations. Could you give us some feeling of what you are able to include within that?

MR. SCANLON: No. No.

There are a lot of ideas about the NHII and the committee's recommendations -- the committee's framework, I think, is very helpful in understanding people and the PMRI report. But how you get there, I think, there is a fair amount of -- there is not all that much consensus. Some people really believe in an entirely market-based approach. Other folks believe in various things you could do.

I think in direction, it will be more directional, Jeff, in terms of promoting and accelerating. But it will be in terms of a goal or a broad objective if it survives at all. This is a very complicated area and there may or may not be a consensus, as you know. I think everyone -- the vision resonates with everyone. But there is quite a bit of difference about how to get there and the role of the government in the U.S. and health care.

MS. GREENBERG: I think in Tab 7 we have several responses to the committee's reports, I believe, Jim.

MR. SCANLON: Yes. You can go through the most recent ones.

Stephanie is going to talk -- the April 25th letter from the committee that included comments on the proposed rule is in the hands of OCR now and I think it is being considered as the rule is being revised along those lines. I think basically all of the privacy and confidentiality recommendations were looked at very carefully and actually found their way into the preamble you will remember for the proposed rule.

I think they were taken very seriously and considered in the formulation of the proposed rule. Stephanie will update us in a couple minutes on where we are. So, I think the recommendations were actually very helpful.

On the PMRI and -- well, the recommendations on NEDSS, you remember that the committee supported the basic concept of a standards framework for NEDSS, asked for additional -- asked for a couple of -- recommended some activities to make it a bit stronger, including national technical assistance, national conformance testing and so on.

We actually -- I was surprised to see this morning that we hadn't actually answered because we had a draft quite a while ago. I think the issue is -- that should be coming out shortly -- the issue ended up being the same issue that we raised about NEDSS previously. How much of this is mandatory and how much of this is voluntary?

I think we just -- really I thought there was no way to make it mandatory, though you could tie it to funding, but I think we are still trying to resolve that internally. Otherwise, that letter will come as well.

There was a lot of support from the committee in terms of the NEDSS framework in moving forward. On the PMRI standards, we have a -- Dr. Lumpkin has presented those to the Data Council. We have circulated those to all of the agencies and you will hear from Jared in a few minutes, too. Our Working Group on Consolidated Health Informatics is looking at those message format standards and the next set of standards as well.

The NHII, I think, you know, clearly the committee gave everyone a lot to think about and people are sorting it through. It has been presented to the Data Council. It is circulating in the agencies. We are doing internal briefings.

I think, John, we are trying to get you to meet with the Secretary, hopefully, very shortly. As I said, there is a lot of interest in the NHII. Our Secretary has often said that your local supermarket probably has more information infrastructure capability than the health care system does. He is desirous of sort of moving -- helping that all move forward and he has often talked about what the health care system could look like if there were more involvement of information technology. It is almost a future way of looking at this.

But, again, how to get there is another matter, but I think we are working on a couple of things that could -- if they all survive would actually boost this forward a fair amount.

MS. GREENBERG: I don't think the committee ever received a response on the internal status report. Do you know where that is?

MR. SCANLON: I don't know. Did we? I will check into it.

DR. LUMPKIN: Dan, did you have a quick question?

DR. FRIEDMAN: Yes. Just a request, which is, Jim, at a future meeting would you be able to spend a few minutes on the information portal --

MR. SCANLON: On the gateway, sure.

DR. FRIEDMAN: And perhaps even have a little demonstration?

MR. SCANLON: Actually, I think we are just about to finish a beta test internally and it actually held up pretty well. This was the gateway to statistics, Health and Human Services statistics, that I talked about at the previous meeting. We did a little beta test and it actually held up pretty well.

We are making a couple of tweaks and then I think I would be happy to brief the full committee. We will probably open it up in the summer, in August, probably.

DR. LUMPKIN: Great. We are going to just do a little bit of a flip-flop on the schedule because Jared has to leave for another meeting. So, if we can move into that point on the agenda and then come back to the updates on privacy and data standards.

Agenda Item: Consolidated Health Informatics

MS. ADAIR: I thank you for making that adjustment. I am Jared Adair and I am with the Centers for Medicare and Medicaid Services within Health and Human Services.

I have had the opportunity to meet with some of you through another one of the responsibilities that I have, which is the privilege of working with people such as Karen Trudel on HIPAA.

But I am also here today to talk about another initiative that we are taking a look at and would like to talk to you about potentially working together on it. It is called Consolidated Health Care Informatics. It seems some folks are handing out the slides now. I apologize I did not get them out in advance to be in your book.

Let me give you a little bit of background before I go into exactly what it is. Some of you may be familiar that about a year ago the Office of Management and Budget was taking a look at how we in the Federal Government could move more efficiently and effectively into the electronic world and they have come up with a number of -- they came up with actually 24. I guess that is the number, but they came up with 24 areas they would like to be pursued. The last one in my mind that they came up with was that of consolidated health care informatics.

They asked for a business case, which is kind of our federal way of saying justify why you would do something in this area, and they just understood that there was a lot of activity going on in the Federal Government having to do with that and they thought, boy, if we could pull some of this together and do it as one, that would be a much more effective utilization of the taxpayer money. They asked HHS to take the lead on that activity and it went to our assistant secretary -- changed titles, so I will work on it, but her name is Janet Ailen(?). She is the assistant secretary for budget, technology and finance. It is a new title. So, I apologize if I didn't do it smoothly.

She then in turn asked me to be the project manager, I believe, is the appropriate term. That is a little bit of the background. A little bit more is that we got together with folks, such as the Department of Defense and Veterans Administration, and said if we were going to do something here, what is the best thing that we could do.

We all came to terms with it. We have to overcome a threshold issue. That threshold issue is something I believe is near and dear to your hearts and that is truly standards -- that if you are not speaking the same vocabulary, if you don't have interoperability, you really are not going to move no matter what your business need is. You naturally move forward on that.

The proposal that the federal partners have put together -- and notably, I would reiterate, that is the Department of Defense and the Veterans Administration, as well as HHS -- is that, in order to enable the sharing of health information that people think is a good idea, first we have to come to terms with what the standards are. So we need to be identifying that. I am never very good at following slides -- and I apologize -- but it is in here. I just don't think as logically as the people who put the slides together for me.

What we are proposing -- and I have to reiterate that we are still in a proposal period for the President's Management Council; we will be formalizing our business case and getting it up. As I am sure you can appreciate, there are lots of reviews and lots of stakeholders, and trying to corral them into a document is pretty difficult.

But what we are really proposing in this document is that we adopt government-wide interoperability, vocabulary, whatever -- we have spent a lot of time on those words. They mean a lot of different things to different people, but to really just come to terms with in the Federal Government what we should be using and we would in turn build it into each of our departments' architecture.

This is a little bit different notion than HIPAA. HIPAA, as you are very well aware, mandated, legislated what the standards were and that everybody in the health care sector should be using them. Our proposition, our proposal, is that we, the folks in the Federal Government that touch health care -- we provide it, we pay for it, we are a plan.

We are a direct provider or whatever -- that we would, in essence, as good business partners, say to the rest of the health care community in the United States, these are, in fact, the standards we are going to be using and that we believe by leading and influencing that way, it will help us get to a standardized place, without some of the -- it is just a different approach, if you will allow me, than HIPAA. We also would not get there on a date certain.

As you all are very well aware, when we pass the legislation, it ends up with a date, you know, two years, two months after we have promulgated this thing. What we would basically say is that this is what we would be building to. It is part of our -- in the Federal Government we are required to have an architecture and it would be part of our target architecture that we would be moving to.

So, in our opinion, what this obviates is a lot of expense that is built into -- when you have to go back and change legacy systems and things along those lines, this is something that we would be moving to. These slides go through a lot of the benefits. I feel I would be preaching to the choir if I dared to say to you what we thought the benefits of this were.

I think not only are you, because of what you do in your day-to-day business, cognizant of that; I believe you, in documents that you have recommended to the Secretary, are on record as to what you believe the benefits of those standards should be. So, I will move past that and kind of go to the "stool has four legs" slide because I do want you to know that we appreciate this does not happen in isolation.

This, in essence, has to go along with what is going on in HIPAA, having to do with the transactions, the code sets, as well as the identifier. It also has to -- nobody could think about doing anything in the health care arena without the other piece of security.

Then the last tool that we recognize is truly that of change management, that, in essence, if we move towards this, it will, in fact, bring about a change management process that not only happens within the federal enterprise, but would happen in everybody that touches us. We provide some milestones that we would be working towards.

I think the one of note to you would be that we cannot conceive of moving ahead on something like that and something like this without an acute awareness of the evergreening process.

To me, that basically means that you might come up with a standard but that doesn't mean that you just live in that world forever and ever, but you do, in fact, have to have a process, have to have an open process that talks about when is it time to move to the next iteration, the next generation. So, we are very cognizant that we will have to come up with an evergreening process.

The tools that we use, certainly, as I pointed out, two of the things I think we want to learn are some of the lessons learned out of HIPAA, you know, working with industry, not coming up with our own standards, but truly finding what is the best of breed within the industry at the moment and adopting that.

We are not looking to develop things on our own. We are not looking for something archaic that everybody would scratch their head and ask this question of why did you choose that. I do imagine that there would be people scratching their heads, saying I understand why they picked this one. I just wish they would have picked a different one. I don't want these people to think that is a very strange thing.

I would be remiss if I did not say one of the tools that we will use is the recommendations that this committee has, in fact, brought forth to the Secretary. Even though we are still waiting for approval for the business case and all of that, we believe strongly that this is such a good thing to be doing, we have started having groups take a look and one of the documents that they are reviewing is the document that you sent to the Secretary with your recommendations.

That is a great place for us to start. We are not looking to recreate any wheels. We are looking to build on what people such as yourselves have done and seeing if we can just say we are adopting this. This is the document that was sent to us and we in the federal enterprise are adopting this as a standard. Even, I think, maybe a little broader than the recommendations that you had set forth. That brings me to the last slide and why I wanted to really be having a conversation with you all today.

Obviously, one of the things that we have to figure out how to do is to have conversations on this, not just with our federal sector partners and to hear from industry and to hear from yourselves. We are here to ask you if you would be interested in helping us, in working with us. It is an open-ended question. I certainly have a bias as to what I would like the answer to be, which is a positive.

I have gone through these slides for what I wanted to say very quickly. I did that very purposefully because I believe the folks that I am speaking to here today are -- this is a topic that is near and dear to your heart. I didn't need to go through a lot of background.

My intent here was to say that we in the federal sector are taking a look at how we could adopt this in the federal sector and we are interested if you all were willing to help us on our path. If I have gone through it too quickly, if you have questions, I am ready to try to answer them.

I should point out -- I might, however, point to the people that are around the walls because, as I look here, there are a lot of folks working on this project with us who happen to also be support to you. So, that is the good news.

DR. LUMPKIN: Great. I think we have a lot of questions. Good questions. The easy answer to your question is "yes."

MS. ADAIR: Oh, thank you. I needed that.

DR. LUMPKIN: But what I would like to do is maybe sort of do this as a two part discussion and if I can ask Karen, who has to leave with you, to give a short update on where we are in the HIPAA standards, which I think fits into then the role that the committee can play in fleshing out what is not going through the standard process and then we will get into the questions.

MS. ADAIR: Thank you.

DR. LUMPKIN: Karen.

Agenda Item: Data Standards

MS. TRUDEL: Thank you. Okay.

I just want to take a couple of minutes to update everyone on what is happening with the non-privacy aspects of HIPAA. Most notably since the last meeting, we have published two proposed rules and a final rule. The final rule was on the employer identifier. The two proposed rules had to do with modification to the initial transactions and code sets.

One adopted the so-called addenda, which are modifications to the transactions themselves that the industry groups felt were necessary in order to facilitate implementation. The other had to do with changes to the pharmacy transaction and the national drug codes as a standard and essentially these are pretty much exactly the way the committee recommended to the Secretary that these issues be addressed.

There is a 30 day comment period on each one of them and that comment period is over on July 1st. We already have a team together fleshing out what the final rules will look like and we are going to push those through as quickly as we can. Also working very hard on the remaining four rules that we have in the Department.

Those are the security final rule, the provider identifier final rule and proposed rules for the claims attachment standard and the plan identifier. I believe that the security and attachment documents are farther along in the process than the two identifiers and we are targeting, I believe, some time late summer or early fall for those publications.

An update on the Administrative Simplification Compliance Act, ASCA, or the one year delay. We have so far received upwards of 20,000, I believe, compliance extension requests. The vast majority of them, thankfully, come in electronically. It is very interesting, we are up at about 800 or so a day and we are getting them seven days a week. The volume goes down over weekends, but we are receiving them seven days a week.

We also have a few -- this is really, I think, a tangible success point. We have some Medicare contractors that are in production with the 837 claim, actually in production. Not very many, but we are moving along very well and the worry that our contractors are having is making sure that we increase the volume of people who are coming forward to test with us, so that we can get everything done by October of 2003.

MR. BLAIR: Can we applaud for that?

MS. TRUDEL: Yes. Yes, definitely.

[Applause.]

The other thing that we at CMS are spending a lot of time on right now is outreach and we have just aired our second HIPAA video. We did that by satellite broadcast and WebCast a few weeks ago and the sense we had was that we reached somewhere around 7,000 people in just that one effort.

We are thinking about expanding that, doing additional broadcasts and, of course, the video is available free of charge for people who would like to have it. We have also held our second in a series of round table sessions and I think we had upwards of 700 people on a very large audio call, again, a few weeks ago. Sorry?

MS. ADAIR: 700 connections.

MS. TRUDEL: Connections. We don't know how many people actually were at each site. We gave people a short update and then just an opportunity to give us their problems, tell us what they were seeing as issues.

I think those were very, very well-received and we are hearing a lot of good feedback. Also, we are getting some good feedback on our web site. I don't get very many chances to pat us on the back. So, I am just going to take advantage of it.

We are getting feedback on the CMS administrative simplification web site and people are telling us that the compliance extension plan is actually not bad at all to complete. So, I think we got that one right in terms of having information that the committee can use and also something that is usable by people.

One person at the round table said that some physicians had told her that it was practically painless and I think that is pretty good. So, I think that is all I have to update on right now.

MS. ADAIR: Well, speaking about pat on the backs, I feel obligated to do this. I don't know if any of you happened to be listening to the round table, but I think Karen did an excellent job. It was over an hour of questions basically to her.

It is kind of like your final exam and people across the country are getting to ask the questions. And she did a wonderful job.

DR. LUMPKIN: But for those of us who have worked with Karen --

[Applause.]

those of us on the committee, who have worked with Karen, that is not a surprise.

MS. ADAIR: Not a surprise, but -- the other thing we are getting great response on is the web site on frequently asked questions.

We are trying to keep that up to date and actually we are encouraging people -- every once in a while when Karen and I are out speaking, we ask them to send it in as a frequently asked question so that we have the opportunity to answer it for a larger audience than just the person we are speaking to.

DR. LUMPKIN: Simon, you had a question.

DR. COHN: Actually, I guess, I should make one or two comments and then maybe a question and additional comments.

First of all, Karen, thank you very much for your work and I think it has been going very well. I do want to say, just from my view as a representative of a large health care agency -- organization -- that it has taken us longer to figure out who to send the delay requests to than it has to fill out the requests. We really do appreciate that.

Now, Jared, thank you very much, obviously, for your presentation. Jared and I have had some discussions previously, but I think what you are doing is very exciting work. I know the subcommittee is very interested in working with you.

I guess the key issues, one of the -- there are two issues coming before the Subcommittee on Standards and Security over the next six months and these may be things that you want to participate in. One is the beginning of work on clinical terminologies and seeing that there are some directions and recommendations that we can point to. That work, depending on the subcommittee's interest, may start as early as late August.

The other piece that I think we are all very interested in looking at is the issue of sort of lessons learned and possible improvements to the HIPAA process, recognizing that many of us are into six or seven years of this now. It is probably a good time to ask the industry how we can do these things better. Once again, I think we will be asking those questions over the next six months.

Now, I didn't get a real good sense of your time frame. You talk about near, medium and long term deliverables, but I -- does this fit into your overall project plan or your view?

MS. ADAIR: What was the time period you were referring to, Simon?

DR. COHN: What do you mean, for us or for you?

MS. ADAIR: You. You said did it fit into. I didn't --

DR. COHN: Well, I think that we are going to be beginning the clinical terminology, as I said, very shortly. I can't tell you when we are going to have the answer or the recommendations, but certainly within the next, I would imagine -- actually, I am looking to Mike Fitzmaurice and Jeff on that one.

I would imagine within the next eight to ten months, I would imagine. Certainly, I don't know that the lessons learned discussion is going to be a multi-part discussion. Hopefully, that would be a single discussion.

MS. ADAIR: I would think that, you know -- and, again, this is -- we don't have approval, you know. You know me well. You know I am kind of cautious.

So, I will with every opportunity, you know, remind people that, you know, we are still in the business case kind of mode. But I would believe that we are talking within a two year kind of a window.

I mean, it is a pretty short window, but I think that it is a good one for us because I think that if you go too much further than that, you know, you are back to revisiting where you started. So, I think that is the kind of time period we are talking about. We would like, though, to -- it is not one of those things that you deliver at the end, you know.

It is pieces that we would be looking to do in an incremental kind of a place, taking a look at what could we move on sooner and let people know. I think that is an important part that we don't necessarily -- we are going to let people know what our decisions -- you know, share with folks what they are. I mean, I think that we would hope that it has a tendency then to become a standard outside of even where we work, even outside of the federal sector.

DR. LUMPKIN: I think it would be helpful because, obviously, we are very interested in partnering and I think one of the lessons that we as a committee have learned since HIPAA changed our charter and added some new areas of work is the value of a strong partnership between us and the Department in developing the agenda and moving the agenda forward. So, I think to the extent that as you start to see some items that may be a higher priority on your list than they may necessarily be on our list, I think we can negotiate that and maybe move things up.

It also helps, I think, to the extent that we work together in partnership identifying the issues that we want to bring out at the hearing. So that if you have issues that you think are important, we believe it is our role as a committee to try to bring those out and make sure that you can hear what you need to hear and then obviously we will then come forward with recommendations as the committee that will dovetail with your process. So, I think that there is real room for the partnership.

Mike?

DR. FITZMAURICE: I have one question for Karen and two for Jared. Let me take Karen first.

Karen, you mentioned that you wanted an increased volume for testing. Where are the testing facilities? Are these available to the public; that is, to private sector firms to come in and test their products as well? Is there any plan for certifying those in the private sector who certify systems as compliant -- some kind of a seal that says, yes, you are certifying the other people correctly?

MS. TRUDEL: I will take the first part first, I think. Start at the beginning.

I was referring to the testing facilities of the Medicare carriers and the fiscal intermediaries, and they have gone through a process already of certification with an external entity. Some of them are finished. Some of them are still in the process. But we are at this point able to open the doors to our providers and their submitters to come in and test and then go into production, just as we did when we implemented new electronic formats for Y2K.

As far as certifying external entities, certifying certifiers, we don't have any plans to do that at this time. Part of that is just the sheer complexity of doing it.

DR. FITZMAURICE: Okay. Two questions for Jared.

The first question is whose architecture are you talking about? You have some very valuable partners, as I am sure you know. You have got CDC. You have got VA and DOD. All of them have architectures of a sort.

Do you have a plan to look at the existing architectures and try to make something that will pull them all together or take the best one? There are private sector architectures. There is HL7. There is an Australian model. Are you looking at how they fit together as we develop the government architecture?

MS. ADAIR: When I speak of the architecture, each of us is required, as are each of our operating divisions, such as CDC or CMS, as well as HHS, as well as VA and I am sure that there are DOD components -- each of us is required under the Clinger-Cohen Act to have an architecture. So, our intent is that each of us would build into our own architecture these interoperability standards, these vocabularies, as an inherent part of our architecture.

We would not through this be proposing that we merged as applications and things like that together. This is merely -- we would be building into what are the standards when you go about doing your business.

DR. FITZMAURICE: So, they might have different buildings but you want the building blocks to be the same?

MS. ADAIR: Yes.

DR. FITZMAURICE: The second question is are these plans built into the fiscal 2004 budget? That is, we are working on our 2004 budgets now, getting them sent up.

Are there dollars in there to back up what we would like to do under the CHI? You can be general. I don't expect you to have specific answers.

MS. ADAIR: The E-GOV initiatives are going through a little bit of a different kind of a process and I think that, in essence, most of what we fund our activities on this initiative with is truly the resources that we have already, because it is taking a look at -- it is folks that are already working on other kinds of activities and taking a look at adopting what is the appropriate one.

Each of us have activities or line items announced for how much we spend on licensing anyway. So, we are taking a look at --

DR. FITZMAURICE: So, there is a hope that all the partners will put it in their 2004 budgets?

MS. ADAIR: It is running a little differently than that because the thing is that these are a lot of activities that we do anyway. What we are trying to do is work together instead of working separately. So, it is not necessarily --

DR. FITZMAURICE: I think I understand.

DR. LUMPKIN: Can I just follow-up so I understand? We are not really talking, let's say, that CMS is going to be developing System A. What we are trying to get them to do is to develop System A following the uniform standards that this project is trying to develop.

MS. ADAIR: Right.

DR. LUMPKIN: So, I think that System A would show up in the budget rather than necessarily the activities of the project.

DR. FITZMAURICE: Existing System A would show up in the budget. Any changes to System A that require resources, I think I understand, are not in the budget.

MS. ADAIR: I think the point is that they are going into our target architecture. It is not necessarily that people are going to go back and revisit things that are already built or things along those lines.

It is what we will build, too, and it is exactly as Dr. Lumpkin said. It is basically that you choose to use these standards to move it forward.

DR. SHORTLIFFE: I would like to follow up on this same point because I have been trying to think about the specifics of the federal situation with regard to its role as a care provider and what the kinds of applications are that will drive the enthusiasm for this.

You say the vision is to enable the sharing of health information in a secure environment and improve health. I mean, hallelujah, we all clearly have a great interest in that vision.

But there is a kind of segmentation of the patient populations that you are talking about in the various partnering groups -- the military hospitals, the VA hospitals, with not a lot of movement of patients, you know, between those institutions and facilities, and the whole Medicare population, where you have much less control over the way in which data are managed in the care provider environment, of course. So I wonder how gradual adherence to standards, government-wide, will, in fact, allow you to do new things by bringing together data across those various care provider organizations, i.e., the military, the VA and the Medicare base.

I mean, you can sort of imagine some of the things that might be able to be done. They seem to be more on the data analysis side than on the actual care providing side. I just wonder to what extent you have been, you know, specific about, you know, when this finally happens, here is what the government is going to be able to do with these data sets and the kinds of data that are there, that we can't do now because these standards have not been adhered to in the way in which our own operating groups have, in fact, built systems to date.

It seems like there has to be something more than the abstract sense that interoperability is good. There have to be very specific notions of what we are now going to be able to do that we can't do today. I am sure those exist, but it didn't quite come out in your --

MS. ADAIR: No, and it didn't.

You are right, and I mean that there are a lot of things and we are going to sit down and, you know, really list them out -- because some of it will have to do with what the business needs of each of the participating components are, you know, that there are going to be specific business needs, as there are in the DOD and the VA, for example, in how they work together in those types of activities.

There is certainly the -- it will all be laid out in the business case. You are right. We have to get very specific. I don't know if your intent was for me to go further down that path right now or in future conversations with you.

DR. SHORTLIFFE: I guess partly I am asking to what extent that has already happened and to what extent it is sort of clear to you that that needs to happen and it is part of the process that you are embarking on because I think it is clear that there are things you will be able to do that you can't do now if you are able to come up with consensus in this CHI process.

But, you know, as Mike was pointing out, there may be budgetary issues that come up. You are going to have to sell specific ideas for trying to -- even going forward, augment systems so that they adhere to the standards that get defined under the CHI process.

That means catching someone's imagination and excitement about what can be done that they can't do now. Those need to be gripping examples, as we have all learned in the real world.

MS. ADAIR: And I think that there are.

I think that we do have examples that we can put out, like in our business case, and when I come back and have the conversation in more depth about what we are looking forward to -- you know, DOD, VA, HHS -- I will come and give you some of the examples that we are building the business case on.

But I think the other point is -- and I think you should point out that those come at the end -- that there is a lot of work that just has to be done to come up with what those standards are, and this is the incentive that you have out there, that you are continuing to remind people why it is you are doing this.

But there is something I have to point out, and we need to talk about this, I guess, which is that we are really talking about making this part of our target architecture, so that the investments people were talking about are really just investments we would make anyway for our own business needs in the future -- to build an application, to work with DOD and VA, to work within HHS in activities where we need to be sharing data to move forward in research, to move forward in our activities.

We are not going to go back. Our intent is not to go back and rebuild -- reestablish. It is to use them in the future.

It is applications we would have been doing anyway because of our business needs, but just using those standards. It will not be as fast. It will not be as quick.

DR. SHORTLIFFE: I am finding myself wondering as I think about this -- I mean, it is a wonderful thing to do. Nothing I am saying here is meant to be critical about it, but when I think about what drives so many of the interoperability issues that arise in health care data standards, it has to do with the fragmented health care system that is out there for most of us.

It is different in the federal health care environment. I mean, you have got many more opportunities for uniformity within the whole VA system, for example, than you have in the city of New York, where a patient moves around from one emergency room to another and sees pharmacies in three different chains. It is that kind of fragmentation that many of these standards are designed to try to address. So, having the CHI effort in the government in some way or another really bring along that private sector, which is one of your points here, is really crucial to achieving the overall goal here.

Any way in which we can help you sort of make sure that your effort is also part of something that is more widely perceived in the more fragmented parts of our health care system, I think, would be a very positive part of this whole activity.

MS. ADAIR: That is part of our hope.

I mean, you have choices when you want to move to standardization.

I mean, there are a number of choices. One is that, you know, before things get out of the gate, you kind of all come to terms with what they are. That opportunity has passed. Another opportunity is you go the approach of HIPAA, you know, which is you legislate and you require people to come along by a date certain.

We are taking a different tack. The tack we are taking, as the slide indicates, is that of leading and influencing, that when you pull together people in DOD, VA and HHS -- and there are others, by the way, within the Federal Government, but I think those are the ones that you all normally tend to think of --

You don't think quite as much about the other entities that deal with it, but there are others. But anyway, when you start to say that this group is going to come to agreement on what our standards are and, hopefully, opening ourselves up to public kinds of conversations about what that is, we purposely say these are the standards that we are going to be moving to.

These are the ones we are going to be adhering to. There is certainly a hope that by that we will -- it is a different tack than HIPAA. I don't believe we would even think about this had the standard not been established by HIPAA. Pardon the pun there.

But because I think that, you know, it made such a demonstrative statement, you know, it made such a change, that we are hopeful that we can take this kind of approach and reap some success.

DR. LUMPKIN: Jim and then Kepa and then I think you guys have to leave, if not before then.

MS. ADAIR: I apologize. We made an 11 o'clock commitment and that was my fault.

DR. LUMPKIN: And I just wanted to make one real quick statement, that the discussion we are having, that is exactly the approach that we recommended in our most recent communications. We are thrilled.

MR. SCANLON: I think just to follow up on that point, the basic notion here follows out of the experience with the government computer-based patient record initiative, which was DOD and VA. They looked at what they couldn't do, or had to do in difficult ways, because of interoperability problems. Now the circle is widening. So, it is several other health care programs, including the Indian Health Service, and it is getting at the public health side as well.

Again, I want to emphasize, the working group is looking at the standards that have already been recommended by this group and others. The PMRI report that is provided to the group, that is kind of a framework and the recent recommendations on message format standards are also being looked at by the group. So, I think the idea here is a convergence among the federal agencies with where the private sector appears to be heading anyway.

It is not so much as Jared said a regulatory approach. It is hard to require where there is not much penetration to begin with. But it is more of a leadership and an influence and a market --

DR. SHORTLIFFE: Several recent reports have recommended precisely the role that the government could play in showing how these standards can, in fact, be put into use. So, I totally support --

MR. SCANLON: So, I think this approach is foursquare with what everyone seems to be heading for. The other point, the process point, is that I think the work group is asking for the committee to serve sort of as its FACA arm when hearings are needed or experts need to be brought in to brief in a public setting.

If the committee would be willing -- and the subcommittee probably, as it did with the GCPR, remember -- would be willing to serve in that capacity, and I think everyone is agreeing pretty much with that.

DR. ZUBELDIA: I think this is fantastic. As I see it, it is the first step in the road to the national health information infrastructure. We need to start taking one step at a time and leading in that role.

I would like to recommend that we keep the NHII recommendations always in mind and try to avoid a payer-centric type of approach, which would be way too easy to do, given the environment in which you will be working in these architectures, and enable more of a provider-to-provider approach, a provider-to-patient approach, a community approach -- and even though you probably don't have the budget for it, in defining the architectures, keep in mind that the interoperability also has to be with the outside world as part of the NHII. I think this is a fantastic first step, moving towards the NHII. I really applaud the Department for that.

MS. ADAIR: I think that with the participation that we have within HHS, and the responsibilities that it has with community grants, and then DOD and VA with a different type of role -- and, obviously, with the request that we have here, that I come to you and that type of thing -- I actually believe that we are trying not to have kind of a myopic view.

We really are trying to have a broad view. I take your comments.

DR. ZUBELDIA: A question I have is: are you going to look at communications interoperability, which is part of what is left out of HIPAA and is now surfacing as one of the big problems -- we have these standard transactions, but no standard way of sending or receiving them? That would also help with all the other standards.

Are you going to be looking at the Internet, for instance, for secure interoperability? Then I will yield the rest of my minutes to the gentleman from Indiana.

MS. ADAIR: We can come back, Kepa. There is a very small slide where we talk about what it addresses and what it is not limited to.

We can come back at a later time and talk to you about what is the breadth of what it is we are talking about. I kind of wanted to come today and ask -- to be candid with you, ask for assistance, to ask for help. So, we have got a very short period of time on the agenda and so I didn't come completely ready to answer all of your questions, but I will take them away.

I missed an opportunity in responding to your question earlier, in that we really do see this as a great opportunity for us, and we believe that if we can come to terms on such a thing as standardization, there are going to be opportunities out there, in addition to needs that we have now, that we could not have taken advantage of without being on that kind of standards platform.

DR. MC DONALD: I wanted to give the same, you know, sort of positive feedback. This is really what that one bullet in our letter said: that the government would sort of induce and support these things by taking these kinds of actions, which I think are probably the most powerful actions because you don't get a backlash.

You know, there are things you can control, and it is amazing how strongly things line up -- you know, like when you put a magnetic field on, all those magnets line up -- when the feds give directions. So, I think it will be tremendously powerful.

I have two cautions. I wouldn't actually try to shoot for the whole world. I would take these, you know, a bite at a time, because you are going to -- I think you will get some nice line-ups behind them. That is No. 1.

The other one is -- well, maybe I won't do that other one. Never mind. I mean, we all want to do everything, but I think we defeat ourselves, because you end up getting confusion because the discussion gets so wide. You end up with oppositions. You end up with, well, we are starting all over. There is a lot you can do fairly rapidly that would sort of do this lining up of the magnetic -- you know, the little iron filings. I think you would see some real effects outside the governmental world.

DR. SHORTLIFFE: You are talking about the notion of interacting with the private sector as part of this process? The down side of that -- well, I think we have a discussion to have on that, independent of CHI because there is a lot going on suddenly in that general area.

I think CHI needs to be informed by all that is going on, not just NCVHS activities. There is a lot of other stuff.

DR. LUMPKIN: I think given the character of that, we should spend some time at our next meeting talking about what is going on with the E-Health Initiative, the Numerical Initiative, the AHA Initiatives in conjunction and revisit where this is going to fit in with that.

MS. ADAIR: And I agree that there is -- and I apologize for interrupting -- there is a significant amount going on and I think that we need to listen to it and we need an organized way in which to listen.

DR. MC DONALD: There is one other piece, that sometimes the government agencies wave the flag strongly and back off in reality. I won't name names or organizations, but one of the ones you have -- the words came out in your discussion. They tend to, you know, do all this and really when you get back to it, they are not doing this. They are not doing the standards. One of them is.

MS. ADAIR: I will say that it is -- that we at the table are all -- we all believe that this is the right thing to be doing. I think that we are all there hoping to move this forward, feeling very good about the process and that it is kind of like -- a lot of things are converging at the same time.

As you point out, there is a significant amount of discussion going on in the industry and maybe a little hard time deciding on which are the right standards. We are hoping to have a positive --

DR. SHORTLIFFE: The government can play a big role in getting that right.

MS. ADAIR: We are hoping to have a positive impact. There are a lot of people who have started calling and asking me about this, and I wish I had a way to channel that -- a positive way and an organized way of hearing that kind of input -- and I mean "I" there as reflective of the CHI project, not necessarily me personally.

DR. SHORTLIFFE: There are also legislative initiatives that could have an impact in this area that are underway. A lot of talk in both the House and the Senate, including bills that have been introduced. So, I think from a whole bunch of directions there are opportunities for a lot of interaction with new ventures.

DR. LUMPKIN: I would like to thank you very much for coming. We will probably want you to come back and give an update at a future time.

Of course, in between times, Simon and the Standards and Security Committee will be very interested in partnering and moving forward. We are thrilled and glad you are doing it.

MS. ADAIR: Thank you for allowing me to speak and thank you for agreeing to help us. Thank you.

DR. LUMPKIN: I think we need to take a break and, Stephanie, can we take you right after the break? Is that okay? Right.

We are going to do a ten minute break.

[Brief recess.]

DR. LUMPKIN: You can continue to get beverages, but, please, if we can get started. So that we can keep the side conversations down, if you absolutely need to talk, then please do so outside. We are going to finish up the updates with Stephanie and then we will move into the panel discussion.

Agenda Item: Privacy Regulation

MS. KAMINSKI: I just want to very, very quickly give people a little bit of an update on the status of the privacy rule in part because we are pressed for time here and in part because now that I have been working for OCR for several months, I have learned to work at an incredible breakneck speed. I will try to use those skills here.

We have been working very, very, very hard, from, I would say, even before the moment the comment period closed, which was April 26 -- but certainly as of the time it closed -- to digest and analyze the comments that we received. We received in excess of 11,000 comments on this round of the rule.

I didn't realize until I was walking out the door last night that I probably could have given you -- that I was allowed to disclose to you -- the breakdown on the topics and how they were commented on. I didn't pull that information from our database, but I do know that over 9,000 comments were on the topic of consent. So, that gives you some sense about how that particular piece of the rule played out.

But, clearly, we have had comments on all of the issues. We are working on a rigorous and I would have to say unforgiving schedule to try to get this rule published ASAP, with the goal of having the publication in time to make the compliance date for this final, final rule converge with the compliance date of the December 2000 final rule for implementation purposes. So, that does mean that we are on a really, really tight schedule.

MR. BLAIR: Could you say that again? I didn't follow you with the December 2000 piece.

MS. KAMINSKI: The December 2000 piece was the final rule that was published in December of 2000.

MR. BLAIR: Right.

MS. KAMINSKI: As you know, after that, we opened up the rule for additional comment and we came out with our NPRM on those additional topics this past March. The December 2000 final rule, as you probably are aware, has a compliance date of April of 2003.

That has not changed. That rule has an effective date that is -- I am getting lost on the effective date piece. I am sorry -- but the compliance date is April 14th, 2003. That is not changing.

We are now trying to put forward a finalization of these modifications, which all covered entities will have to comply with, and we want to do our best to make sure that the compliance date for these modifications syncs, converges, whatever you will, with the compliance date for that original HIPAA privacy rule, so that anybody who is trying to put together a compliance plan and who is trying to come into compliance, more importantly, will have sort of one set, one comprehensive rule to be working under.

MR. BLAIR: You did actually tell me something here that I was not aware of, because when people would ask me for information about the privacy rule or regulation, I would always refer them to April 14th.

But apparently, that is not the date that you are considering to be the final rule. That was only the -- I guess maybe you call it a reaffirmation of the December rule. Is that --

MS. KAMINSKI: No. The April 14th, 2003 is the compliance date for the December 2000 final privacy rule.

That is correct. That has not been changed. So, the only thing that is new is that we have now made some proposed modifications and we are in the process of finalizing those modifications.

We will come out with a sort of additional final rule, a complementary final rule. Those two final rules, the December 2000 and what we publish will work in tandem and covered entities will be required to comply with both pieces.

DR. LUMPKIN: Were you done with your report?

MS. KAMINSKI: Not quite.

DR. LUMPKIN: Okay. Why don't you finish your report.

MS. KAMINSKI: I will hurry, though. The Secretary is very interested and involved in this privacy rule, has been part of the process and is committed to putting forth a rule with strong privacy protections but also a workable rule.

We anticipate that the White House and OMB will also be deeply involved in the final decision-making process. I guess I just wanted to say a word about outreach, which is we have all hands on board right now really working hard on this rule.

We have stopped accepting speaking engagements right now until the fall. But we are also contracting for outside consultants to assist us to develop technical assistance.

That is it.

DR. LUMPKIN: We are running about half an hour behind. So, we can have a few questions and then we will move on.

Simon.

DR. COHN: Well, okay.

Stephanie, thank you for the update. I guess I am just a little confused about -- I mean, what you are really talking about is a modification to a final rule.

MS. KAMINSKI: Correct.

DR. COHN: Which is really what we are talking about. And that modified final rule will be implemented at the date you have described for the initial final rule.

MS. KAMINSKI: Nicely said.

DR. COHN: Okay.

Having said that -- I was going to pull up my HIPAA law -- can you explain: is there a time frame within which the Secretary has to get it out to allow the industry to implement it and be in compliance with the law, or can this be two months before the date, before April? I mean, is there a six month period or how does that work?

MS. KAMINSKI: There is a six month period between the effective date and the compliance date, but I do not believe that there is a particular date certain that the Secretary has to get this out by.

DR. LUMPKIN: See, Simon, I think the issue is that HIPAA says that there is a period of time after the final rule is published on privacy that the industry has to comply within. That is the 24 months and 36 months. This is a modification to that rule. So, just like it is with the transaction standards, once the standards are in place, the clock is ticking for compliance.

The Department may change any of those rules along the way. That doesn't mean that all of a sudden you don't have to comply with the regular rules. So, trying to get this rule in place quickly will keep the industry from having to comply with the original rule rather than with the modified rule. Is that a fair description of the situation?

DR. SHORTLIFFE: And you would like to have at least six months lead time.

MS. KAMINSKI: That is statutory. There is a six month window that is built into HIPAA between when the effective date happens and when the compliance date happens, even with modifications to any of these --

DR. LUMPKIN: But the six month compliance -- I mean, that window -- the problem is that we believe that the modified rule is much better than the original rule. So, if you invoke the six months wait time, it could be a situation where people would be required to comply with the original rule because the six months hasn't expired for the new modified rule.

I think that is what I am hearing: OCR is trying to move the process forward so that it will be clear what rule people need to comply with.

MS. KAMINSKI: Correct.
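As a rough illustration of the timeline arithmetic being discussed, the following minimal sketch works backward from the April 14, 2003 compliance date using the six month window described above. It only illustrates the dates mentioned in this exchange; the statutory rules are more involved than simple date subtraction.

    from datetime import date

    # Compliance date discussed above for the December 2000 privacy rule.
    compliance_date = date(2003, 4, 14)

    # Subtract six calendar months (ignoring day-of-month edge cases) to find
    # roughly the latest effective date that would let the modifications'
    # compliance date converge with April 14, 2003.
    month = compliance_date.month - 6
    year = compliance_date.year
    if month <= 0:
        month += 12
        year -= 1
    latest_effective_date = date(year, month, compliance_date.day)

    print(latest_effective_date)  # 2002-10-14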

DR. ZUBELDIA: My question is on that very same topic. If I follow the time line correctly, in order for the six months to end on April 14th -- do the providers then have a choice as to which rule they follow, the original rule or this rule, during that time frame?

MS. KAMINSKI: They have to follow neither until the compliance date for the final rule.

DR. LUMPKIN: But the important point is that if you are a provider, you are trying to get ready for compliance. Then you comply with the modified rule as the Department publishes it.

Let's say it happens November 1st. If I am a provider out there, I am preparing for the modified rule because that is really what the intent is. I think we have to be careful, so that we understand what the purpose of the rule is.

The rule is to tell people who are playing the game what the rules are. It is not used to try to trip people up. So, the Department is trying to clearly say to everyone out there, here is what we are trying to do. We have heard that there is a problem and we are trying to fix it.

I think it is very similar to the NDC codes, for which there is a revision in the new transactions NPRM that came out. I am a provider. The Department says, okay, we heard you about the NDC. We are fixing it. I am starting to do my HIPAA compliance even with the extension. I am going to act as if that has been fixed, because the Department has clearly said we have heard you.

I can't imagine any comments the Department is going to get that are going to make it say, all of a sudden, oh, we changed our minds, because the comment period said NDC is the best thing since sliced bread. So, what I hear the Department saying is that it is trying to give as much guidance as it can, given the process, to the industry as we are trying to implement HIPAA under the clocks that have already been set in motion.

Thank you.

MS. KAMINSKI: Sure.

DR. LUMPKIN: With that, we are going to move on to our panel.

Kathryn.

Agenda Item: Data Issues in Measuring Quality of Mental Health and Substance Abuse Services

MS. COLTIN: As you know, we have been having a series of quality panels at the full committee meetings to discuss data issues in measuring quality of care. Today we have a panel that is going to be speaking to the data issues involved in measuring the quality of mental health and substance abuse services.

This is a relatively broad set of issues that I have asked the panel to address. The committee members should have a one page sheet at their places of the questions that the panelists were given. Not every panelist will be addressing every question, but, hopefully, collectively, these questions will be addressed across the panelists.

They range from the adequacy of existing federal data systems for providing direction to organizations around what topics are important to look at in measuring quality of care and where quality measurement would be most useful, to how adequate existing data systems in the public and private sector are for actually developing and implementing quality measures, to what some of the privacy issues are in collecting this type of information from various data sources, including patients themselves.

So, I am going to ask each of the panelists to introduce themselves, but to say in advance that we will follow the order that is laid out in the agenda and that each panelist has been given approximately ten minutes to make their remarks. I would ask that you each try to adhere to that so that all of the panelists will have an adequate time to speak. I have asked Mady Chalk and Eric Goplerud from SAMHSA to take the lead and sort of set the stage for this panel. So, if you would begin and introduce yourselves and set the stage and then as each panelist will proceed, would you also introduce yourselves.

DR. LUMPKIN: And just in fairness, because we do have a large panel and what usually happens is that if the panelists in the beginning run over, then it takes away from the panelists at the end, I do have a timer for ten minutes. You will hear it go off.

MR. GOPLERUD: Hello. I am Eric Goplerud. I direct quality and finance areas at the Substance Abuse and Mental Health Services Administration.

It was about five years ago I appeared before this committee and talked about the kind of scary position that we were in the public sector as Medicaid was using managed care and managed care technologies to organize and contain cost -- to organize care and contain costs.

I talked about what some of our concerns were about the impact that managed care was likely to have on the availability of publicly accessible data. I currently have a study in the field that we call the Cassandra Study, which is based on the old myth that Cassandra always told the truth and no one believed her.

I think that many of us believe what was happening around the field, that we were losing data and, in fact, that has happened. The issue of non-financial measures of quality of care for mental health and substance abuse has been an issue that SAMHSA and the behavioral health field has been very concerned with for some time.

In 1995, I believe it was, or 1994, SAMHSA commissioned the Institute of Medicine to take a look at how you manage for quality in behavioral health. What the IOM came out with was "Managing Managed Care," which laid out a set of questions and responsibilities across the government, across payers and providers for the production of common performance metrics.

Unfortunately, that was a call which mostly went out to a perhaps agnostic field that did not believe this was an important issue. The national organizations representing state mental health, substance abuse and Medicaid directors worked together to come up with a common framework for performance measurement. That has really served as a grounding for a lot of later development work. Vijay Ganju will be talking some about that, as will, I believe, probably some of the other panelists who will be talking from the substance abuse side.

In addition, there have been a number of pilot studies, one which started with five states looking at common mental health measures, which was expanded to 16 states, and which has finally come together, looking at the feasibility of selecting common quality measures of performance in mental health. On the substance abuse side, SAMHSA has also sponsored several treatment outcome studies. We only know them by acronyms in the Federal Government.

I don't even know what they stand for -- TOPS 1 and TOPS 2 -- and particularly some very important pioneering work that Mady Chalk and, I believe, probably Connie Horgan will be talking about, the Washington Circle Group that has been working to develop common metrics for substance abuse treatment.

One of the holes in the panel: in preparing for this, we somewhat jokingly talked about having some of our prevention colleagues appear with masking tape over their mouths, because prevention didn't even make the panel and, yet, prevention and mental health promotion are important parts for which the development of performance metrics is critical.

There is very good work that has been done on this and we distributed to the committee members, the minimum data set, which our Center for Substance Abuse Prevention has been developing. So, our sort of silent partners are not even in the room, but there is very good work that is being done in the prevention area, which you need to know about. It really links in.

It is very hard to do treatment without also thinking about early intervention and prevention. Probably the most interesting activity right now that is going on is something that goes by the name of the Carter Center Forum on Performance Measures.

When we have taken a look at the diversity of performance measures, non-financial performance measures, that exist out in the behavioral health field, we find that there are so many different ways of measuring depression or measuring symptoms or measuring access that, instead of there being a focused concern, what you fear is chaos. So, leaders of the mental health, substance abuse prevention and treatment fields have gotten together around the Carter Center Forum to develop common, well-specified performance measures that can be used across payers, across provider platforms and that can be consistently specified and benchmarked.

There are currently, depending on how you count them, four content groups. There is an adult mental health treatment group, an adult substance abuse treatment group, a child and adolescent mental health treatment group, an adolescent substance abuse treatment subgroup and a prevention group, all working under a coordinating committee and with a methodology work group that is also providing consultation.

When we got together looking at what were the most well-developed areas, it was clear that adult mental health treatment and adult substance abuse treatment were the most mature in their development. We found that there was remarkable agreement between the two areas in their areas of focus and how data would be collected.

There was agreement in each of these areas that there was commonality around access, quality or processes of care, and outcomes that can be and should be measured from both administrative data sets and from consumer or family experience with care. This is serving as a foundation for the development of commonly specified performance metrics. They are not the core set that you would want in order to understand everything about a particular program, but they would be common across programs.

Some of this will match up, or should well match up, with what NCQA is doing through HEDIS or what JCAHO is doing with ORYX, but it is not tied to any particular vendor or accrediting organization. Also, we are convinced that behavioral health will make progress only to the extent that we are integrated in and consider ourselves, and are considered, part of the more general health enterprise. So, the IOM "Crossing the Quality Chasm" is the basic bible that we are using.

The six criteria of effective quality care, the ten principles -- I feel like I am holding up Mao's Red Book -- the six principles, the ten and the four components, but that really is what is guiding this effort as well, which is behavioral health is really part of health. As far as where we are answering some of the questions, as I predicted about five years ago, publicly available data sets are disappearing all over the place.

We are doing analyses of Medicaid data. We have reasonable data from the tape to tape that goes back to -- we are now doing 1995, 1996 and 1997. As states go into managed care, those data sets become unavailable. We don't have benchmarkable data and remarkably, the managed care firms have not been amazingly forthcoming with their wonderful data sets.

Performance partnership grants are a new grant program. That is, our block grants have been modified in our reauthorization to promote the collection and common reporting of data by the states on mental health and substance abuse services. We have put out in Federal Register notices what those common elements are, as well as developmental measures and individual measures that states can report.

This will put some emphasis behind the Carter Center Forum common measures -- that somebody here is actually going to require the collection of and reporting on at least some of those common measures. I wanted to talk about some problems that we have encountered in administrative data sets and to give you a couple of examples. One of the big areas of concern is how we integrate care between primary health and behavioral health, and, in particular, we have been working on the identification and treatment of depression in primary care.

DR. LUMPKIN: Finish up in a couple minutes.

MR. GOPLERUD: Yes, a couple minutes. One quick issue, we have been working on HIPAA transaction codes.

We took a look at the transaction codes, because we have been concerned that if you can't bill for it, it won't get done. We hadn't looked at it before: do we have the codes that are necessary to implement evidence-based practice?

When you take a look at what the components of evidence-based practice around the treatment of depression in primary care are, you often find that you don't have coding for the use of standardized screening instruments.

You don't have a code for consultation with a behavioral health specialist about a patient who is not that specialist's patient, and you don't have coding for telephonic case management. So, an important look at HIPAA that needs to be taken is whether the transaction codes that we are going to be using and implementing actually support payment for evidence-based practice.

At a conversation with leadership in Medicare around what is effective care for the treatment of depression in primary care, one of the nation's experts on depression went through his research. When he got to the point at which he very clearly lined up about 12 studies showing that care management is the critical component of making treatment for depression either effective or ineffective, the Medicare official said, well, you can stop right there.

Medicare doesn't pay for ambulatory case management. So, I guess we might as well just stop. One of the difficulties is that if you can't pay for it and you can't bill for it, good quality evidence-based care won't happen.

In working with employers, who are very concerned about the cost of psychopharmacology and trying to manage cost and quality, with antidepressant medications being extraordinarily expensive, for 70 percent plus of the antidepressant scripts that are written by primary care doctors, you can find no mention of depression symptomatology or of a depression diagnosis.

On the script pad itself, there is no way to code diagnosis. So, there is no way that you can tie a script for an antidepressant to what the actual diagnosis is.

There are a number of different instances where if we just inserted a metric or a measure, we could start tracking and then do quality improvement on it. I have a whole rap about what we could do with the National Quality Forum nursing home set in which we have just lost the only behavioral health measure on treated depression because the metric was not quite tuned up enough.

We will go on and we will talk about why this is an important area and you ought not wait another five years to have another panel.

DR. LUMPKIN: Thank you.

MS. CHALK: I will make up for the extra time Eric took by being brief.

What I wanted to talk a bit about -- and Connie Horgan will be speaking about it in depth -- is a group which was formed in about 1998 with some support from the Center for Substance Abuse Treatment. You will have these, I believe. They were handed out.

DR. COHN: I am sorry. We still don't know who you are.

MS. CHALK: I am Mady Chalk. I am the director of the Office of Quality Improvement and Financing at the Center for Substance Abuse Treatment.

DR. COHN: Thank you.

MS. CHALK: The Washington Circle Group after much due deliberation began to focus on measuring processes of care, rather than measuring treatment outcomes, the rationale being that -- we are talking about for alcohol and drug treatment.

We know more than we think we do about the processes of care that are central to delivery of appropriate treatment and to the promotion of positive outcomes. There may be, as there is in any audience, considerable skepticism about that, but I think the research suggests otherwise.

We also thought that we needed to use measures that captured the range of necessary treatment services and a model of addiction as often a chronic relapsing condition. So, we came up with a minimum set of underlying principles about what processes of care we knew were linked to outcomes, such as facilitating identification of individuals with substance use disorders.

They are all on the handout. I won't go through all of them, facilitating access to care and the like.

We also knew that there are a number of services that need to be provided to people in addition to addiction services while they are in treatment and that those need to be measured as well, services that are related to improving their health and social problems, social functioning problems.

Using the process of care measures as an intermediate set of measures, we thought showed promise for being measurable at the program and systems level. Connie will be talking about how that has developed. And they allow for identification of areas of improvement and facilitate comparisons across health plans. Nevertheless, there are significant barriers: to wit, stigma and will.

Let's begin with the fact that overcoming the issue of stigma to even arrive at the possibility of measuring identification is a significant issue, which I think you all can understand easily. The undiagnosis, the underdiagnosis, the lack of diagnosis, the lack of screening, the unwillingness to screen -- which is part of will -- is pervasive, let alone the question of diagnostic codes and other kinds of codes, which I am not even going to get into.

There is also a lack of needed data infrastructure. In the substance abuse treatment field, you may be aware that only 50 percent of treatment providers have a computer. That does not bode well either for HIPAA or for measurement in general. It is an issue that CSAT is very involved right now in attempting to address and in raising with other levels of the Department.

DR. STARFIELD: Do you mean mental health providers or all providers?

MS. CHALK: Mental health providers are not much better, but they are a little bit better off. But alcohol and drug treatment providers, there are only 50 percent who have computers. The cost of data infrastructure development is not minor, as we move along this path toward measurement. Let me -- in addition, data fragmentation across systems is a major issue.

Medication assisted treatment for alcohol and drug dependence is on the rise. Data are fragmented across the pharmacy databases and the substance abuse treatment database. Treatment for substance abuse disorders in primary care, Eric spoke about depression. It is the same issue.

Treatment for mental illnesses and substance abuse disorders together requires integration of databases. You may be interested to know that CSAT and the Center for Mental Health Services worked with three states to integrate their databases -- mental health, substance abuse and Medicaid data -- quite successfully over a four year period.

We have asked the states now -- we put a Federal Register notice out about who was interested in volunteering to do this and have now got upwards of 32 requests from the states to help them do this. That would break the bank at CSAT if we actually undertook to help 32 states.

Some examples of treatment processes that need to be measured: assuring that patients get from detoxification into treatment. Being able to measure that is difficult, and not only because the numbers are small on the private side.

They are high on the public side, but detoxification is in the medical benefit and the clinical treatment is in the behavioral health benefit, and the two data systems often are unable to be married. Medicaid is unable to measure who gets from detox into treatment.

There was an Office of the Inspector General study a couple of years ago. Very difficult, as is the whole issue of transition across health systems.

Other examples: assuring that patients get a clinical assessment as a precursor to treatment planning; measuring clinical assessment as distinct from screening is virtually impossible. There are no measures for measuring level of care determination.

Finally, measuring movement across levels of care within alcohol and drug treatment is virtually impossible at the present time. Those are just some of the barriers. Connie will be talking in more depth about where we have gotten to with the Washington Circle Group.

I will say that we have asked a mental health group to test the Washington Circle measures. That is underway as we speak, so that we will be able to come together on this, and we view evidence-based clinical guidelines as a central component in helping quality measurement move ahead.

DR. LUMPKIN: Thank you.

MR. RENNER: Am I up next? Okay. I am Phil Renner. I am director of measures development for NCQA, the National Committee for Quality Assurance.

I just want to give a little bit of background on NCQA, what we do and what we measure to try and wrap a little bit of context around why we might have obstacles around measuring performance and behavioral health. We are a national health care quality oversight organization. What we do is we measure and report on health care quality.

We do this through three main oversight methodologies. We look at accreditation surveys, going on site, looking at process and structure within an organization. We also do HEDIS measurement, which is a set of process and outcome measures at the managed care organization level.

We also have a series of surveys. We have the CAHPS survey. We also have a newly designed ECHO(?) survey for use in behavioral health care organizations and we access the HEDIS set patient survey.

We look at administrative data systems and we look at medical record data. The evaluation programs that we have -- I have several listed here.

The key thing is that the first two, the managed care organizations and the MBHOs, relate to the behavioral health issues here. Some of the issues also deal with the bridge, bridging the gap between these two types of organizations.

I will skip on ahead to the page that has the behavioral health measures that we currently use.

We have four measures currently in place in the managed care organization set, looking at effectiveness of antidepressant medication management and at whether patients who have been hospitalized for mental illness are followed up within either a seven day or a 30 day period. Then we have some utilization statistics for both chemical dependency and mental health utilization.

We also have a number of measures that are under development right now. The first one there relates to the work that Connie is going to be talking about in a little bit. It is treatment initiation and engagement for patients with substance abuse issues. We are also looking at appropriate treatment and evaluation for children with ADHD. Screening for depression is something that we want to engage on.

Then we are also looking at how we can measure the timeliness and appropriateness of access to behavioral health services, in particular how patients are able to get across the primary to specialty care gap. So, as we think through trying to design performance measures in this area for public accountability, the question comes to mind -- and I think, you know, the question Kathy raised with me was -- what are the things that stop you or slow you down or might be obstacles in measuring effectiveness of behavioral health care.

There are three major components to that. The issues are around confidentiality, around the adequacy of coding and then around the availability of data. You all know all about confidentiality, but it is an obstacle for us if we need to get any data out of the chart. It makes it a lot more difficult for us to do a measure that is based on confirming a diagnosis, confirming what happened in the chart. Access to behavioral health charts is very sensitive.

So that drives us -- plus, you know, there is a cost component to it as well -- toward creating measures that are based on administrative data, you know, that come out of the electronic systems. One example of that is that we recently changed the specifications for the follow-up after hospitalization for mental illness measure, from being able to take the data out of either administrative systems or charts to administrative only, for this reason.
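To make the shift toward administrative-only measures concrete, here is a minimal sketch of how a follow-up-after-hospitalization rate might be computed purely from claims-style records. The record layout, member identifiers and dates are hypothetical illustrations, not the NCQA specification; the 7 and 30 day windows follow the measure described above, and the real HEDIS specification adds many eligibility and exclusion rules.

    from datetime import date, timedelta

    # Hypothetical, simplified claim records; field names are illustrative only.
    discharges = [
        {"member": "A", "discharge_date": date(2002, 3, 1)},
        {"member": "B", "discharge_date": date(2002, 3, 10)},
    ]
    ambulatory_visits = [
        {"member": "A", "service_date": date(2002, 3, 5)},   # day 4: counts for both windows
        {"member": "B", "service_date": date(2002, 4, 20)},  # outside both windows
    ]

    def followup_rate(discharges, visits, window_days):
        """Share of discharges with an ambulatory follow-up visit within the window."""
        met = 0
        for d in discharges:
            window_end = d["discharge_date"] + timedelta(days=window_days)
            if any(v["member"] == d["member"]
                   and d["discharge_date"] < v["service_date"] <= window_end
                   for v in visits):
                met += 1
        return met / len(discharges) if discharges else 0.0

    print(followup_rate(discharges, ambulatory_visits, 7))   # 0.5
    print(followup_rate(discharges, ambulatory_visits, 30))  # 0.5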

So, if we are only able to look at now -- if we are not able to look at the chart, if we are focusing on looking at administrative data, what do we know about the adequacy of the administrative coding? We rely primarily on, you know, CPT-4, ICD-9, NDC coding for this. These systems lack the precision that we might need to be able to really differentiate what is going on in a treatment setting.

One example -- and I hope I am not stealing some of your thunder, Connie -- is if we want to look at what sort of substance abuse treatment a patient received as an inpatient. Is it detox? Is it treatment? You can't tell that based on the administrative coding that you have. The CPT-4 and ICD-9 coding really don't allow the precision that we need.

Also, back to Eric's point, they don't allow us to -- there is no administrative coding for a lot of screening and assessments that might go on, screening for depression, screening for substance abuse, other issues. If we wanted to build a measure that allows for follow-up in a way that is other than face to face with the providers, there is no coding for that or a follow-up where there might be evidence that that is an appropriate way to manage the patient.

Then I will go into the data availability piece, which Eric had maybe hit on a little bit. Structurally, managed care organizations -- the HMO will carve out to the managed behavioral health care organizations: take care of all of the behavioral health care, or part of the behavioral health care, for our enrollees. That creates silos of data and it degrades our ability to measure this. It also really degrades our ability to measure at the level of managed behavioral health care organizations.

For example, if we wanted to move our antidepressant medication measure over to the managed behavioral health care organization setting, the MBHOs don't process the pharmacy claims. They would not be able to go to a system and say, okay, here are all of the patients who are on antidepressants based on pharmacy data. They would have to go back to their client, the managed care organization, to do that, creating a lot of friction back and forth that way.

Another is if we wanted to look at some of these Washington Circle Group type measures on initiation of treatment and engagement in treatment. Any initiation of treatment or identification that happens in the primary care setting prior to referral into the behavioral health care setting -- the MBHO isn't going to have those primary care claims and won't really know what happened prior to the patient entering that system. So, it makes it really difficult for us to design measures that can look both ways at the process.

So, to wrap up, and maybe not to just sit here and come up with complaints, but maybe to throw out a couple of suggestions on what we might do, or where any input that you have would help: one is continuing to ensure that enactment of HIPAA doesn't impact our ability to transfer the sort of information that is needed for performance measurement and accreditation. This, you know, would come down to not only the way we transfer data, but also confidentiality.

How do you set up a sample frame to survey patients on their experience of care in receiving behavioral health care services? Are we able to collect the date of birth that we need, or zip code or gender, to know whether there are differences in these groups, or even to specify who ought to be in the sample based on an age range?

The other thing that we really need to have happen is that, if we are going to be driven toward more measures based on administrative data, we need to supplement the coding system -- either CPT Category II coding or other similar systems -- so that we can collect data on those things that aren't necessarily billed for, again getting back to the screening services and to non-face-to-face types of follow-up. That would help our ability to measure these processes of care.

Thank you.

DR. HORGAN: Hi. I am Connie Horgan. I am a professor at the Heller School for Social Policy and Management at Brandeis. We do a lot of research in the mental health and substance abuse area. I would like to acknowledge one of my colleagues, Deborah Garnick(?), who does a lot of work in performance measurement.

The next overhead shows the outline of today's presentation. You should all have copies of the handouts. They were distributed.

Today I am going to focus on two areas. I am going to describe these Washington Circle measures that have been mentioned a couple of times, but first I should tell you about the Washington Circle.

You are probably thinking what is this group and I will say, one, there are substance abuse experts that comprise it. I will not tell you who they are and our first meeting was held at the Washington Circle Hotel down the road; hence, the name. We are all very grateful that it wasn't held at the Comfort Inn or something like that.

[Laughter.]

There are two things that I am going to be focusing on in Washington Circle. One is our framework, which we consider very important, our continuum of care framework. I am going to actually only address three of the measures that are based on administrative data.

The second part of my presentation is going to talk about some of the challenges of using administrative data and this is based on our experience in pilot testing these three administrative measures.

I will be focusing on three challenges. One, which Phil has spoken to, and maybe both of you have spoken to a little bit, is that the organizational structure of mental health and substance abuse services is different from medical care.

Second, some peculiarities with respect to patient and provider behavior, and then I will end with some comments on coding issues.

First, our framework, we believe in having a set of measures that describe the process of care starting with prevention and education, population-based activities designed to raise awareness of substance abuse as a problem.

The second area is recognition. These would relate to measures that relate to case findings, screening, assessment and referral to treatment.

The third area is, in fact, treatment. That is where most of the measures have focused, and it covers activities associated with rehabilitation of individuals who actually have a substance abuse diagnosis.

Finally, part of the process of care is maintenance and that is measures that look at activities related to long term positive outcomes. Using this process of care framework, the Washington Circle Group has identified seven measures. They are listed there on the -- in the second column.

There is a prevention measure, which is education. The recognition measure is identification rates and then we get into measures on treatment.

We start with initiation of treatment, linkage of detox to treatment services, treatment engagement, and interventions for family members and significant others. In the maintenance area are measures of maintenance of treatment effects.

Today, I am only speaking about three measures: the identification rate, which relies on administrative data; the initiation of services, again administrative data; and treatment engagement, also with administrative data. I will just mention that we actually also tested the detox measure with administrative data, but I won't speak about it anymore, just because detox is never there. It was missing so often in private sector databases that we couldn't really test it; we couldn't deal with it.

I will say that the other measures rely on a patient survey, and Sue Eisen will be speaking later about the ECHO Survey, which does include substance abuse items. I would like to point out that these three measures relying on administrative data are nested.

The first measure is identification. It basically asks how well the plan does in identifying those with alcohol and other drug problems. I will not read the definition; it is on your handouts. That is how we calculate it.

The second measure is initiation: how well does the plan do in getting the patient to actually begin treatment? The definition is up here. I would like you to note that the numerator from the identification measure has become the denominator in the initiation measure -- the nested concept again.

The third measure is engagement. We know that substance abuse treatment has extremely high dropout rates, and we know that the longer someone stays in treatment, the more successful the long term outcomes. This measure looks at how well the plan does in engaging someone, getting someone to stay in treatment. Again, the definition is here, and again note that the numerator from the initiation measure has become the denominator in the engagement measure.
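[To make the nesting concrete, the following is a minimal sketch of how such nested rates could be computed from an administrative claims extract. The pandas layout, the column names and the 14- and 30-day follow-up windows are illustrative assumptions, not the actual Washington Circle specifications, which are in the witness's handouts rather than in this testimony.]

```python
# A minimal sketch of the nested rates described above, using pandas.
# The claims layout, column names and the 14- and 30-day windows are
# illustrative placeholders, not the Washington Circle specifications.
import pandas as pd

def nested_rates(members: pd.DataFrame, claims: pd.DataFrame) -> dict:
    """members: one row per enrollee (column: member_id)
    claims:  one row per service (columns: member_id, service_date, aod_dx)"""
    aod = claims[claims["aod_dx"]].sort_values("service_date")

    # Identification: enrollees with any alcohol or other drug (AOD) service.
    index_dates = aod.groupby("member_id")["service_date"].min()

    def follow_up_count(member_id, start, days):
        # Number of further AOD services within `days` of the index service.
        window = aod[(aod["member_id"] == member_id)
                     & (aod["service_date"] > start)
                     & (aod["service_date"] <= start + pd.Timedelta(days=days))]
        return len(window)

    # Initiation: at least one further service within 14 days of the index
    # service. The identification numerator is the denominator here.
    initiated = {m: follow_up_count(m, d, 14) >= 1 for m, d in index_dates.items()}

    # Engagement: at least two further services within 30 days of the index
    # service, among those who initiated. The initiation numerator becomes
    # the denominator here.
    engaged = {m: follow_up_count(m, index_dates[m], 30) >= 2
               for m, ok in initiated.items() if ok}

    n_id, n_init, n_eng = len(index_dates), sum(initiated.values()), sum(engaged.values())
    return {
        "identification_rate": n_id / len(members),
        "initiation_rate": n_init / n_id if n_id else None,
        "engagement_rate": n_eng / n_init if n_init else None,
    }
```

[The point the sketch illustrates is simply that each measure's numerator becomes the next measure's denominator.]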

I am not going to show you any of the pilot testing results; I am just going to describe some of the challenges that we faced as we were doing the pilot testing. The bottom line is that these three measures worked. It is possible to calculate and distinguish identification, initiation and engagement.

However, there were major challenges. The first challenge is organizational structure. Employee assistance programs are very common in the workplace, particularly with medium and large employers.

These present some problems in measurement. Employee assistance programs frequently provide all kinds of services, personal counseling, legal counseling. They frequently provide mental health and substance abuse services. These are frequently outside of the medical plan. Some are becoming integrated with managed behavioral health organizations, but for the most part EAPs are separate from the medical system.

What problems does this cause in performance measurement? It means that if someone is using services in an EAP and not using the medical plan for treatment, the identification rates will be lower.

That is just one example. It could also distort utilization patterns. If someone is using EAP services and then starts using the plan, what shows up as an identification visit or an initiation visit may, in fact, be further down the chain in terms of service delivery.

A second organizational challenge relates to benefit design. Sometimes -- I won't say frequently -- sometimes services that we are interested in are excluded from the benefit package. For example, a Bureau of Labor Statistics survey of medium and large employers showed that 20 percent of employee plans did not cover some of those services. There are also deductibles and limits, which are similar to the medical area.

However, in the absence of parity in many states, deductibles, limits and cost sharing are much more common for mental health and substance abuse services, and deductibles and limits typically lead to underreporting of data. Others have raised the issue of the role of carve-outs, and I would like to spend a little more time on it because it is complicated.

The next overhead shows a diagram of the pathways to care, and I would like to walk you through it because it has major implications for where organizational responsibility lies for the provision of substance abuse and, for that matter, mental health services, and for where the data reside. Basically, what this shows is that there is frequently a bifurcation in the provision of behavioral health services from medical services.

Let's first look at a payer. An employer makes a decision: are they going to provide behavioral health services through their managed care plan or medical care plan, or are they going to carve them out and provide them through what we call a managed behavioral health vendor, through an external contract?

We know that about a third of Fortune 500 companies carve out their behavioral health services. This is not a minor issue in the private sector. In the public sector, many Medicaid programs, for example, carve out behavioral health. It is simply not part of the medical plan.

There was some discussion earlier about fragmentation in terms of getting data to link up and how difficult it is. This is an example where the data would reside in a different location. I have described the payer making decisions. It is even more complicated, because in the world of managed behavioral health care, the managed care plan or the insurance plan is also making that same decision.

They are making the decision either to provide the services internally or to buy the services, to contract out to these vendors. This is a big issue in behavioral health care, and it is increasing.

I know you will be hearing more about the numbers from Pam Greenberg, but from surveys that we have done, we know that 80 percent of HMO products now carve out to vendors. It is lower for PPO products.

I believe the number is more on the order of 40 to 50 percent. So this is a major issue, even within a medical plan, as to where the data reside.

DR. LUMPKIN: It has been a little bit over 12 minutes. So, if you could wrap up, please.

DR. HORGAN: Oh, okay. Quickly.

Second challenge: privacy concerns. I will just quickly point out that in some cases in substance abuse we are dealing with illegal behavior, so the issues related to privacy -- an employer finding out -- are monumental concerns.

This particularly has ramifications if you have court-ordered treatment; there are major concerns there. I will skip over this. Another challenge is the misreporting of services by clinicians. Why would there be misreporting? Perhaps to deal with patient stigma -- miscoding out of concern for the patient, to avoid labeling.

Another reason for misreporting is that the service is not a covered benefit. Perhaps a medical code would be given if there is no coverage for the behavioral health service.

Finally, financial incentives: particularly in the carve-out situation, if a primary care provider is not going to be reimbursed for a behavioral health diagnosis, it probably will not be coded. This may happen when the care has been carved out.

I will just give one example: in a check of medical records that showed a diagnosis of alcoholism, only 77 percent also showed alcoholism in the claims data.

The third challenge: coding issues. There is a big problem with incomplete or inconsistent coding, which can distort conclusions. The coding issues fall into two areas. First, diagnostic coding. Frequently the full range of substance abuse codes is not used -- just a general drug abuse, not specified.

If you want to do an outcome measure and you can't distinguish someone who has a cocaine problem from someone who has a marijuana problem, and you are dealing with different types of people, that is a problem.

Another problem is a tendency among providers not to code multiple diagnoses. Frequently a medical code is given and the secondary substance abuse diagnosis is not.

There is also flipping back and forth between mental health and substance abuse diagnoses. Second, procedure coding is a big problem, as others have mentioned. The CPT psychiatric codes are used for substance abuse; they are not specific to substance abuse. There are missing procedures in --

DR. LUMPKIN: It is 15 minutes. If you could go to your conclusion.

DR. HORGAN: And I will end with my concluding slide: the lack of procedure codes specific to the substance abuse treatment area is a major barrier, and the second barrier, which is improving, is the lack of standard data formats from different sources, which is problematic when you try to merge the data to create outcome measures.

Thank you.

MS. EISEN: Hi. I am Sue Eisen. I am a health research scientist at the Center for Quality Outcomes and Economic Research, which is a VA health services research and development center. We are affiliated with the Boston University School of Public Health.

I am going to focus my remarks on three of the questions that Kathy distributed to us before today, the major one being the challenges and issues in the development and implementation of quality measures derived from patient-reported survey data. I am pleased to see there is a common thread running through everybody's presentations. I will go over those quickly, but nobody has really talked about data collected from patient surveys yet.

I do want to acknowledge the many colleagues you will see in my written materials. I am not going to name them all. Paul Cleary is the principal one I want to mention, because of his commitment to making sure that behavioral health and substance abuse didn't get left out of a lot of these national efforts altogether. There was a lot of danger of that.

I am going to start with limitations that apply to consumer reported surveys generally. Then I am going to focus on a couple of commonly used methodologies to collect consumer data and talk about limitations associated with each of those.

First, I want to say that research has shown that well-designed consumer self-reports can be reliable and valid measures of quality of care.

But now to the limitations. First, response rates to consumer surveys about quality of health care will never be a hundred percent. For this reason, multiple measures of quality, reporting on different aspects of quality from different perspectives, are important.

Response rates are affected by many factors and I will discuss some of these when I get to the different methodologies that have been used to collect these data.

Second, consumers or patients are not the best sources of information about every aspect of quality. So, when we decide to ask them about care, we should focus on things that they know most about.

Those generally include information about interpersonal aspects of care, the art of care, the relationship between the provider and the patient, communication between provider and patient, receiving information that is important for their care, access to services and overall evaluation of services.

Another limitation is the possible inaccuracy of patient-reported information; the accuracy of reports, especially from people with mental illnesses, has been called into question for a long time. There is a lot of research showing, as I said before, that even people with severe mental illnesses can provide good information about care, but it is not always going to be a hundred percent accurate. It can be affected by lapses of memory, cognitive impairment and clinical status.

But I also want to say that information collected from other perspectives also has limitations and biases, and you have already heard a lot about the limitations of data from administrative databases. The point I want to make is that the patient represents one perspective, and it is an important perspective in obtaining information about mental health care quality, but other people represent other perspectives -- clinicians, family members -- as do medical records and administrative databases. Each perspective should be evaluated as representing the viewpoint of the particular respondent.

Not obtaining consumer evaluations at all I think of as akin to testing school performance by asking the teachers how well they taught their students, rather than testing the students themselves. No one would ever think of doing that.

Another limitation: even when the same data elements are collected in the same standardized format, which frequently does not happen, differences in patient samples may make comparisons difficult to interpret. Consequently, there is a need for case mix adjustment, sometimes known as risk adjustment, which statistically adjusts for demographic and clinical characteristics that may affect perceived quality of services.

Before any such adjustment is done, we need to identify the factors that correlate with perceptions of quality -- things like age, gender, race, severity or chronicity of illness. This means that kind of information would have to be obtained as well.
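[As an illustration of the kind of case mix adjustment being described, here is a minimal sketch assuming a hypothetical survey extract. The variable names and the simple linear model are illustrative assumptions, not a method specified in the testimony.]

```python
# A minimal sketch of case mix (risk) adjustment for consumer-reported
# quality scores. The data frame, column names and linear model are
# illustrative assumptions, not part of the testimony.
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_plan_effects(df: pd.DataFrame):
    """df has one row per respondent with hypothetical columns:
    quality_score, plan, age, gender, race, severity."""
    # Regress the reported score on case mix factors plus plan indicators;
    # the plan coefficients are then differences net of case mix.
    model = smf.ols(
        "quality_score ~ age + C(gender) + C(race) + severity + C(plan)",
        data=df,
    ).fit()
    # Keep only the plan terms for reporting.
    return model.params.filter(like="C(plan)")
```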

Another limitation is the lack of standardization on consumer reported quality indicators. We know there are a number of national efforts that have tried to reach consensus on quality indicators. The Carter Center is a major one.

The private sector has tried to get together through NAPHS, which I think stands for the National Association of Psychiatric Health Systems, something like that. Of course, NCQA has made an effort, and the Joint Commission on Accreditation of Healthcare Organizations, JCAHO, has made an effort. In fact, JCAHO is now requiring core indicators for medical care facilities, but not yet for behavioral health care facilities.

Okay. I want to focus on the two widely used types of protocols that I have experience with for collecting patient-reported survey data.

The first is the one we have used in implementing and field testing the ECHO Survey. That involves an annual survey in which enrollees in a health care plan or a managed behavioral health care plan are surveyed once a year and asked about all the services they have received over the past year.

The second protocol involves contacting individuals either during the treatment period or shortly thereafter, usually from within the provider organization. This kind of procedure is very widely used by behavioral health facilities, especially in assessing what they think of as consumer satisfaction; the survey has been implemented this way.

In this case the survey is linked to an episode of care, and the assessment is limited to the care received at that facility during that episode. First, I want to mention some limitations of the methodology that involves the annual survey. I will just mention the ECHO Survey briefly -- are people very familiar with it? It is basically a mental health module of the CAHPS survey that NCQA is now using.

Limitations. The methodologies that have been used to implement these annual surveys of health plan members are, first, limited to individuals who have insurance through a health plan and who are living in the community. So, automatically excluded from this methodology are uninsured people, who we know number around 40 million in the U.S., or something like that. It also doesn't represent people who are institutionalized -- who are in a hospital or some kind of residential treatment setting.

Second limitation. The protocol involves mail and/or telephone contact, which relies on current address and telephone information. Our experience has shown that attempts to reach people by mail or phone are limited.

In our field testing of the ECHO Survey, the proportion of addresses and telephone numbers that were not usable -- that is, we couldn't reach the person at the address or phone number provided by the managed care company -- ranged from 12 percent up to 60 percent. In most cases it wasn't a question of the mailing address; that was the more minor problem, because we do follow-up phone calling. More common was that we had phone numbers and, when we called, the people were no longer there or the phone was disconnected.

We know this is more common in Medicaid samples than in commercially insured samples. So, that becomes a big limitation. In part because of unusable contact information, response rates to mail and phone surveys have ranged from as low as 6 percent up to 65 percent, depending on the specific samples and the methodology used. The more effort that goes into trying to reach people, with multiple follow-up phone calls and mailings, the better you do, but there is still a lot you don't get.

In addition, among the individuals you do finally reach and who respond to the ECHO Survey, about 25 percent say they haven't used services in the past 12 months. We don't know why they are saying that -- whether it is because of stigma or privacy issues, they don't want to tell us, they don't want to respond to the survey, or maybe there are even errors in the utilization database that determined that they had used services.

Another problem is that because surveys like ECHO focus on all care received, people who receive lots of care from different sources, this tends to be the severe and persistent mentally ill, are sometimes frustrated with having to average all the care they received over the year, which they feel dilutes, you know, possibly good care received from one provider and poor care received from another.

I am going to move on to the other protocol and just mention some of the limitations of obtaining survey data from consumers in treatment settings. A main one is that the resources for this kind of data collection are often lacking or inadequate. It requires staff involvement and administrative involvement, and that is frequently not there.

The second big concern is privacy -- whether patients being treated in a mental health facility are going to feel comfortable criticizing the care they are receiving while they are still there and returning questionnaires to their providers, although anonymity has been attempted with varying success.

Even though theoretically response rates should be greater -- you are not losing people who are institutionalized, and you are not losing people whose addresses and phone numbers you don't have -- response rates are still not that great. In surveys administered by provider organizations, we have seen response rates range from 0 to 80 percent.

DR. LUMPKIN: You are now two minutes over.

MS. EISEN: Okay. Just one more comment: confidentiality is always a concern, and it seems to be a much bigger concern for mental health than for other kinds of health care.

A lot of providers are sort of aware of HIPAA, but no one is too sure what the implications are going to be in terms of their ability to collect these data. We do recommend confidentiality agreements with provider organizations, and we recommend various other methods of trying to ensure confidentiality.

Again, on the types of data: we do believe that federal data systems should ask standardized questions specific to identification, treatment, outcomes and insurance coverage for mental health and substance abuse services. The rest of my comments you have in writing.

DR. LUMPKIN: Thank you.

MS. P. GREENBERG: Good morning. I am Pamela Greenberg, executive director of the American Managed Behavioral Healthcare Association. Unfortunately, Joann Albright had a family emergency and could not be here with me today.

However, I do have with me Suzanne Lumpkin, senior director of evaluation and outcome surveys at Magellan Behavioral Health. To my knowledge, neither of us is related to the two committee members who have our last names. We would both like to thank you for the opportunity to speak today about soliciting, designing and implementing quality measures for mental health and substance abuse treatment services.

The American Managed Behavioral Healthcare Association --

DR. LUMPKIN: It is Tab 5 in the agenda book. We have your comments in case people are looking for it.

MS. P. GREENBERG: The American Managed Behavioral Healthcare Association is an association of the nation's leading managed behavioral healthcare organizations, and its member companies, of which Magellan is one, are collectively responsible for managing mental health and substance abuse services for over 110 million individuals across the country.

Approximately 170 million Americans who have either commercial or public insurance coverage for mental health and substance abuse have this coverage through a managed behavioral health care organization.

I will address some of the panelist questions related to HIPAA, and Suzanne will address the data issues related to measuring quality. AMBHA members are currently preparing to implement the HIPAA regulations. In this process we have identified gaps in revenue codes and in the Healthcare Common Procedure Coding System, HCPCS, specifically related to behavioral health.

We have worked with the National Association of State Mental Health Program Directors, the National Association of State Alcohol and Drug Abuse Directors and other behavioral health care organizations to submit suggested behavioral health codes to the HCPCS panel. We are also in the process of making recommendations to the National Uniform Billing Committee regarding needed behavioral health revenue codes.

We see the lack of specificity of the codes for behavioral health as a barrier for behavioral health managed care organizations. Within the behavioral health care delivery system there are many settings and types of treatment, not all of which are reflected in the code sets.

For example, in the current code set there is no code for intensive in-home therapy, mobile therapy or EAP services. AMBHA would like to see the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders, 4th edition, designated as a descriptor code set for the diagnosis and reporting of mental health disorders under HIPAA.

We understand that the APA is making a request to have the DSM-IV included as a descriptor code set. Failure to designate DSM-IV as the mental health descriptor code set under HIPAA will require mental health practitioners, agencies and institutions to abandon their use of DSM-IV diagnostic criteria for Medicare reporting in October 2003. In addition, the proposed transactions do not allow us to capture case-closing information on employee assistance program services.

Employers ask for specific summary information about the nature of an employee's well-being at the close of treatment. This information is not captured in the code set or in the claims transaction. We are currently working with the X12 steering committee to address our EAP transaction concerns, and we are also participating in the WEDI-SNIP work group.

We will be presenting a position paper on this issue to WEDI-SNIP, and at some point we would also like to talk further with your committee about our concerns. It is our hope that this issue can be addressed in the claims attachment transaction currently under development. Measuring quality using medical records also represents a major challenge, primarily because of concerns from practitioners.

Many MBHOs sample providers' treatment records for quality of documentation, treatment outcomes and adherence to clinical practice guidelines. However, many providers are resistant to providing treatment records for review, even records that have been edited to exclude patient-identifying information. In explaining this resistance, providers cite concerns about patients' privacy and confidentiality.

We are very concerned that providers will vary in their interpretation of the HIPAA privacy rules and use the rules to defend their position that they will not furnish records for review. This may hinder collaborative care initiatives and our ability to assess the quality of services provided by our providers. It may also potentially increase the animosity that exists between some providers and managed care companies. We believe the "minimum necessary" standard for data sharing is open to a wide range of interpretations and potential conflicts.

I would also like to take this opportunity to note that AMBHA supports the proposed modification to the privacy NPRM that gives health care providers the option of obtaining the prior consent of patients to use or disclose identifiable health information for treatment, payment and health care operations. Now, I will turn it over to Suzanne.

MS. S. LUMPKIN: Good afternoon. I am Suzanne Lumpkin, the senior director for evaluation and outcome surveys at Magellan Behavioral Health and I am also here as a representative of AMBHA.

I would like to again echo many of the things that have been said today, but put that into a context of a managed behavioral health care organization's struggle as we try to work across many of these obstacles regarding collection of vital health care statistics.

Just a bit about Magellan. Magellan is currently the largest managed behavioral health care organization in the nation right now, serving nearly 70 million people across the United States. We also serve 3,000 client organizations representing health plans, government agencies, unions, large corporations and small group employers, as well. Additionally, we contract with over 40,000 behavioral health providers and practitioners, who treat our members.

I would like to address some of the questions that were asked by the committee and then give a brief overview of some additional issues that you might want to address in the future as well. Starting with how the behavioral health care industry uses data: data are used primarily for the transaction of business, as well as for measurement of performance -- quality improvement measures, as we call them.

They are measures that address structure, such as financial performance; process, such as clinical protocols; and outcomes, such as patient satisfaction or health data. The Joint Commission, as was mentioned earlier, covers med-surg issues such as clinical performance, health status, satisfaction, and administrative and financial measures.

AMBHA and other organizations have also determined that they need to measure the same types of categories or areas, so we are coming closer together. AMBHA has also previously developed a set of its own performance measures known as PERMS. So, this effort has been going on for quite some time.

Specifically, some of the data sources that we have difficulties with are, for instance, claims data. Using claims data as a measure of quality outcomes presents specific challenges. Those challenges can be constraints in the way we capture the information; our company, like many other managed behavioral health care organizations, does not always pay claims, so we don't have the claims data and information.

Another issue is that sometimes the information that we get, for instance from the Medicare system or from community mental health centers, does not reach us: if we are not the primary payer, claims are not submitted to us; they are submitted to the primary payer instead. So, cross-company negotiations have to occur.

Other issues are that contracts may cause providers to include only the first authorization for care or outpatient visits, not the information that is needed to do follow-up. So, things such as ambulatory follow-up are very difficult for us to come by.

There are also issues with encounter data: being able to track the timeline for patients in care and where they are in treatment may become difficult.

One of the areas where we have to do this is the HEDIS data set, the Health Plan Employer Data and Information Set that NCQA spoke about previously. Coordinating with the health plans to capture this information creates some difficulty.

Patient surveys, as mentioned by Susan Eisen, are where we get a lot of our outcomes data and information. They are a proxy for quite a bit of that information; it is very difficult to get the necessary clinical outcomes information, but the perceptions are very important for us.

As Susan mentioned, there is a reluctance to participate that creates an issue for us, as well as a reluctance among members to identify themselves as even having used mental health services -- the stigma that was mentioned previously. So, when you send information to someone's home or contact them about services, they may not be very willing even to say that they had a contact with you, even saying "no" when we have records that they did.

Another measurement issue that creates complexity is the variation in treatment and treatment outcomes. Using these surveys it becomes very difficult to identify the actual treatment, and our databases don't always contain the actual treatment that was rendered. The nature of mental health disorders may also limit patients' ability to understand the rating scales or concepts in the instruments themselves.

In some states we are asked to bring the instruments down to a fourth-grade reading level, and that in and of itself makes it very difficult to get across the concepts we need in order to collect the information we want. There are additional obstacles, such as coordinating current membership data; you can imagine that we don't get all of the membership data we need from the companies, employers or managed care organizations we work with as carve-ins.

They may update that information regularly or infrequently, but when we get those downloads of data, it may create contact and follow-up issues at a later date. Even when we attempt to update the information, it may not be accurate and timely, depending on when you implement your outcomes measures or survey follow-up. So, that continues to be a major barrier for us.

Again, difficulties in the timing of access to those clients are important. You may start trying to identify a client six months later; not only could they have moved, but they may no longer have accurate recall of their experience. So, again, collecting accurate information becomes an issue.

Having addressed the committee's specific questions, I would like to briefly cover some general topics that others have covered, but put them on the record as AMBHA's concerns, as well as Magellan's as one of the companies in the industry. Managed care information systems require an enormous investment and are extremely complex. Changing them, modifying them over time and keeping up with all of the current changes requested in transaction code sets -- HIPAA, which everybody is familiar with -- becomes very difficult.

We provide transactions with some 3,000 client organizations and those relationships need to be managed. Changes to those systems are costly and labor intensive.

We work across many treatment settings and with a wide array of providers in our network, and we have limited access to data from some providers, for instance primary care physicians, who we know are a very significant portion of the providers actually identifying, for instance, depression in the population. So, there is information that we are missing, and we know there are large gaps as a result. Coordination of information exchange is very challenging, and addressing it across privacy and confidentiality concerns becomes even more difficult.

The nature of our provider panel, as I have mentioned before, is varied in behavioral health.

It can range from clinical psychologists to social workers, psychiatric nurses and marriage and family therapists, and this poses challenges to us in identifying performance measures. As for quality-based interventions, coordinating with institutes to actually develop interventions is time-consuming as well as very difficult. I am probably going to need those two minutes.

DR. LUMPKIN: You have had your two minutes actually. If you can conclude, please.

MS. S. LUMPKIN: I just want to bullet-point a few other issues then.

Geographic issues become apparent as well in measurement and quality. AMBHA and its members are very much in favor of using a standardized set of measures, but there is a wide range of disagreement about what those should be.

Thank you very much for the opportunity to present today.

DR. LUMPKIN: You are sure we are not related?

MR. GANJU: Hi. I am Vijay Ganju and I am with the NASMHPD Research Institute.

What I am going to try to do is race through a sort of panoramic view of some of the issues related to data and try to link some of the needs for data to the quality initiatives going on primarily within the public mental health sector.

I have been involved with a lot of the performance measurement initiatives that you have heard about, the ECHO, the NCQA, the PERMS and also coordinated several performance measurement initiatives related to state public mental health initiatives, as well as some federal activities. I want to sort of put this in a particular context that distinguishes mental health, not only from substance abuse, but also from health in general.

That is that state general revenue dollars constitute a major part of mental health spending -- as much as Medicaid and Medicare -- and together with Medicaid and Medicare they constitute a majority of mental health spending. So, a lot of the control of mental health is really tied to the states; state mental health agencies, through state general revenue dollars outside Medicaid and Medicare, account for a very large chunk of expenditures, and this creates a whole ethos for some of these quality initiatives that is very different from any other health sector.

I think the other contextual piece is that the Surgeon General's report on mental health identified that there are huge gaps between what we know and what actually goes on in terms of practice.

There is an opportunity there that also has data implications. The third piece is that even within the public mental health sector there is huge fragmentation. In Texas, for example, the criminal justice system serves more people with mental health problems than the number of state hospital residents at a particular point in time.

Then you have child welfare with its own responsibilities, and there isn't, even within the public sector, the kind of coordination that is needed across all these different sectors. So, quickly: we have come a long way. There has been a lot of work over the last five years to develop standardized performance measures built on consumer concerns, and the MHSIP report card was implemented.

Right now, 48 or 50 states are implementing parts of the report card, and they are all using similar kinds of measures. This resulted in a framework established by the state mental health agencies and in a couple of federal initiatives, the Five State Study and the Sixteen State Study, so that you could have standardized performance measures across states.

That has now been expanded to a 47-state initiative. In these 47 states, they will be required to report the same data elements and do the same consumer survey, the MHSIP consumer survey, which is different from the ECHO Survey. Through all these processes there is also a developmental effort underway, which takes the learning from all the initiatives you have heard about and tries to move into the next generation of activity; it involves a lot of the people sitting around this table right now. But we have miles to go, and that is really what I want to identify: all the ground we need to traverse to really have quality embedded in our data systems.

A lot of what we have been working on through this public mental health initiative is a focus on outcomes: reduced symptoms, reduced symptom distress, increased functioning -- work, school performance, independent community living -- and then aspects such as reduced hospitalization, reduced substance abuse and reduced criminal justice involvement.

For some of these, people raise issues about how directly the services you provide are related to the outcomes, but these are the important issues that have been identified by legislatures and policy makers, as well as consumers.

I think if you look at the report card on data available for mental health quality, in terms of quality management, we have moved somewhat in the direction of standardized definitions, but there is still a lot of work to be done.

Even under the label of the same service, there is a lot of variation in what services are provided under the same nomenclature. So, this is also an issue for HIPAA coding: as you go forward, you can have the same code, but it may mean different things in different contexts.

Standardized outcomes: most states have some kind of measurement system in place, but they are not necessarily the same across states. As you look at some of the issues we are talking about, the focus is very much on people who are actually receiving services. But if you look at some of the data that exist, we cannot locate a lot of the people who have been recognized as having a problem in terms of whether they are receiving services.

What I mean by that -- that wasn't very well stated -- is that there are lots of people who have been identified as having mental health problems who don't seem to show up in any of the data on receiving services. We have good prevalence data, but we don't have good models of demand data, and when we look at utilization data, what we have is good data for particular sectors, organizations and funding streams. When you try to get data that cut across funding streams or across sectors, we cannot really count the number of people who are receiving services.

I can tell you how many people receive services through Medicaid or through non-Medicaid funding, but if you ask me to unduplicate that, there are some studies that have been done, but there is no routine kind of information. So, we really don't know how to count the number of people in the country who are receiving mental health services.
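[As an illustration of what "unduplicating" across funding streams would involve, here is a minimal sketch assuming hypothetical Medicaid and non-Medicaid extracts that share a person-level linking key. In practice, as the speaker notes, such a key usually does not exist across systems, which is exactly the barrier being described.]

```python
# A minimal sketch of an unduplicated count across funding streams,
# assuming hypothetical extracts that share a person-level linking key
# (an assumption, not something the testimony says exists).
import pandas as pd

def unduplicated_count(medicaid: pd.DataFrame, other: pd.DataFrame,
                       key: str = "person_key") -> dict:
    combined = pd.concat([medicaid[[key]], other[[key]]])
    return {
        "medicaid": medicaid[key].nunique(),
        "non_medicaid": other[key].nunique(),
        "naive_sum": medicaid[key].nunique() + other[key].nunique(),
        "unduplicated": combined[key].nunique(),  # each person counted once
    }
```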

Then in terms of planning, we don't have good models of fit in terms of who should be getting what -- and I don't mean this in a prescriptive clinical sense, but in a planning and budgeting sense, in terms of how we account and plan for some of the services that need to take place.

Some of the data -- on benchmarking, on the relationship of costs to benefits, on sanctions -- we simply don't have. Cultural competence, which is now very much tied to quality initiatives: again, there is not a lot in the way of people looking at health disparities, although something is emerging there.

If you look at the implementation of evidence-based practices, there are still issues of definitions. There is the issue of how you track fidelity: one of the big issues is that as people implement things that have an evidence base, they are not always conforming to the procedures that actually produced the positive outcomes found in the effectiveness studies, and there is the question of how you monitor the implementation.

Now, there is an increasing move to get data in all these areas, but given the fragmentation that exists in the mental health system, there is the issue of data integration -- and you have already heard about some of these initiatives -- across health and mental health, private and public, mental health and substance abuse, mental health and other social services, mental health and criminal justice.

You have piecemeal kinds of studies in these areas, but you don't have a systematic approach to addressing these issues.

I think if you look at the public mental health quality agenda, right now there is a three-pronged approach that is essentially being implemented: performance measurement, the implementation of evidence-based practices, and a more systems-level quality management and quality improvement.

These are the three pillars on which people are trying to build new models of quality and accountability. These are supposed to be check marks -- these little Cyrillic-looking marks -- but what I mean is, in terms of what we have for standardization, implementation and useful quality management: if you look at penetration and utilization rates, we have pretty good data, I think, as more sectors move forward.

In terms of process, most states -- and I think you heard this from Phil -- have follow-up within seven days of discharge from hospitalization. In terms of the evidence base, there are still issues of standardized definition, although within different sectors people are using these for implementation and quality management.

Readmissions, seclusion and restraint are all currently being used, either through the JCAHO system or through what states are implementing. In terms of outcomes, there are lots of initiatives where people have gone forward in trying to implement outcome measures, but how these relate to improvements in clinical performance or in systems performance is, I think, still a question mark.

I think if you look at consumer surveys, you have heard about these being standardized and about attempts to develop one kind of survey across systems. These are being implemented, but there are issues, which you heard Sue just talk about, in the implementation.

Let me quickly sort of go through the limitations of data. I have mentioned the unduplication, standardized definitions and I am not going to dwell on these.

I do want to talk a little bit about the issue of -- I am just going to jump ahead because I know the time is up. But I think there are several actions that can be taken at the federal level to help this along.

Working with some of the people around this table, we have tried to get new mental health procedure codes into HIPAA to replace the 2,300 or so local codes that were being used in Medicaid.

We have submitted 50 codes and have gone through an iterative process with the HCPCS panel to get these submitted. Part of the issue is that there doesn't seem to be a lot of understanding of the different, fragmented ways that mental health services are provided.

There is the issue of standardization of definitions and uses. There are several federal initiatives going on -- a lot of the activities I laid out earlier are essentially federally sponsored -- but there is not a strategic approach to addressing the kinds of issues that have been identified.

It has been more piecemeal and fragmented. The efforts have been cumulative, but there hasn't been a concerted strategic approach. I think we have got to have data integration pilots; some have been implemented, but they need to be more widely disseminated for ongoing use. Again, on data and performance measures related to evidence-based practices and their implementation, the more you can develop some kind of basis to facilitate that implementation, the more helpful it would be.

I think what we really need is not just performance measurement systems, but attention to how you use these systems and the effect that may have on the quality of services itself. I have outlined some areas in which work could be done that would be very profitable for moving the quality agenda forward. These include performance measurement as a quality intervention and what kinds of reports and feedback you provide.

I think there is still the issue of whether we are measuring the right things, whether we are measuring what we want to measure, and whether the measures in fact reflect the realities we are trying to monitor. On the issues Sue talked about in terms of methodology, there are huge variations in methodology, which may in fact result in differences that are greater than the actual differences in performance. Then there is perspective: whose perspective do we want, and how do the perspectives relate to each other?

We started to look at consumer reports in relation to clinical performance, and what we find is that the only element that relates to the clinical perception of care is the outcomes element; the process parts don't seem to correlate very well. Finally, we have got to look at how these measures and these data inform both clinical and systems performance and practice. I know I have raced through this, but I have tried to respect the 12 minutes that we had.

DR. LUMPKIN: Thank you. Actually everyone had 10 minutes.

Most everyone took pretty close to 15. So, we are not going to have much time for questions because we are bumping up against the lunch break and we have business we need to do this afternoon. So, if there are maybe a couple of questions, Paul.

MR. NEWACHECK: One dimension of quality of care in mental health is the extent to which consumers who need care receive that care. The flip side of that is unmet need, or the prevalence of unmet need, in mental health. Some of our population-based surveys, like the National Health Interview Survey, attempt to provide national estimates of unmet need for mental health.

It appears, at least in my area, the children's area, that the estimates that come out of those surveys are very low and unrealistic and have very limited face validity. For example, in the kids' area, the national estimates from the Health Interview Survey suggest that only three tenths of 1 percent of kids have unmet mental health needs, whereas about 5 percent report unmet medical care needs and about 10 percent report unmet dental care needs. So, if you take that logic and extend it, it suggests that unmet dental care needs are 30 times as prevalent.

Unmet medical care needs would be 15 times as prevalent. It makes it appear that mental health needs are really being well met, at least for children, and I think that probably would apply for adults as well. I think that basically we are just not capturing the degree to which unmet needs exist, either because of the stigma attached or because of the awareness of families.
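[For reference, the ratios implied by the figures just cited are, before the speaker's rounding, approximately:]

\[
\frac{10\%}{0.3\%} \approx 33
\qquad\text{and}\qquad
\frac{5\%}{0.3\%} \approx 17,
\]

[which the speaker rounds down to "30 times" and "15 times" as prevalent as unmet mental health needs.]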

I am wondering if there is research you all are aware of that would inform us as to better measurement of unmet need in the mental health area, or whether we should even try to measure it -- if we are collecting data that have such low reliability and validity, maybe we shouldn't even be trying to do that.

MR. GANJU: I will start. It seems to me that the national prevalence studies, whether the Epidemiologic Catchment Area study or the National Comorbidity Survey, suggest the opposite: we actually have a lot of people with symptomatology that relates to specific diagnoses. I think the National Comorbidity Survey showed an annual prevalence rate of almost 20 or 21 percent.

The proportion of people who were actually receiving services was closer to 8 or 9 percent. So, there is that huge unmet need of people not showing up in any mental health sector, whether through the health system or through the specialty mental health sector.

Then there is another form of unmet need, which is that when people enter the behavioral health system or the mental health system, a lot of the people who initially establish contact don't come back.

So, sometimes they are counted as a contact, but whether they have actually received adequate services is a huge issue. The modal number of contacts in mental health is one.

So, I think there is a lot of documentation to show that there is unmet need. If you look at the WHO studies, in terms of disability-adjusted life years, three behavioral health diagnoses are among the top ten health problems that account for the majority of lost productivity.

MS. EISEN: To add to what Vijay said -- I agree with everything he said -- I think a major problem is that there is no standardized definition of unmet need. If you look at something like the ECA study or the National Comorbidity Survey, the definition you can come up with assumes that anyone who reports certain levels of symptoms has a need, but there is no consensus on that definition.

Just from my experience in performance measurement and clinical settings, in speaking with patients and consumers, providers feel there is a lot more unmet need than consumers feel there is. I mean, providers feel a need to treat a lot more conditions than consumers feel need to be treated. So, I think that is a major problem.

DR. MAYS: I will try and be brief because of the time, but believe me, I have lots of questions and concerns about some of the things that have been presented.

But let me start with the data part, because part of the problem is that in the two studies you talked about, the ECA and the National Comorbidity Survey, the measurement of mental health is very complex. The instruments are very long and very expensive.

Some of the recommendations we are talking about, like the ability to put such measurement into things like the NHIS, are very difficult. In some of the places where you would want to put it, I am not sure it actually can occur, because of some of the issues around it being a mental health disorder and what happens if there is a problem.

There is a lot of IRB involvement and there is a lot of complexity to the measurement issue. So, I think it is a very complex issue.

The other is the use of the DSM-IV, and I think part of what we are doing here is talking about the health care system more than the data -- we are starting with the data, but it is the health care system, to some extent, that is the problem. In terms of the assessment of mental health disorders, the DSM-IV does tend to work.

I mean, there are arguments about whether it is evidence-based enough, but that is not what is used in the health care system, particularly when a person goes into primary care. You are using a different system, and it is not that there are so many problems with the ability to diagnose.

The problem is the disjunction that exists between the various systems. Even in the states, you have a department of health services, you have alcohol and drugs -- that is a separate budget -- and then sometimes mental health is yet a third. Each has a different set of procedures, rules, data collection and so on. Some of that is that way for a reason.

What worries me a little is that in trying to get the data, there are issues where patients will be affected. We might push for quality data, but the question will be what happens to people seeking treatment, because of issues of stigma and because some people do not want treatment information to be available. Those individuals may, for example, pay for treatment themselves, and then what you have is the individuals who are in Medicare or in the federally captured data sets looking sicker.

There are a lot of complex issues here, and I have some concern about the direction we want to go in terms of collecting the data, what system it resides in and why it has been separate. I think there are some reasons why it has been separate that we need to think about before we just move to integrate it all.

DR. FRIEDMAN: I have a question that can probably be answered by a "yes" or a "no" in the interest of time.

This is directed, I think, to Dr. Renner. Does CAHPS or the ECHO Survey that you referred to include any enrollment-based measures of out-of-plan utilization?

MR. RENNER: I don't believe ECHO does -- or I don't believe CAHPS does. ECHO has supplemental questions around having used up all your benefits and needing to seek care outside of the plan benefit.

DR. FRIEDMAN: Well, that is half of it. The other half of it would be enrollees who opt to go out of plan either because they are dissatisfied with plan services or for confidentiality reasons.

MR. RENNER: We don't have any items that address that.

DR. FRIEDMAN: Thank you.

MR. BLAIR: Kathy, I wanted to thank you for pulling together this panel because it has just been a treasure trove of information, not only on quality but on information that probably will help us in the Subcommittee on Standards and Security with respect to making recommendations on clinical terminologies.

In that respect, I just wanted to ask is there someone who is assuming responsibility for capturing all of these recommendations and -- yes? The answer is "yes." And that is?

MS. COLTIN: Well, the Quality Work Group is going to be preparing a report that tries to summarize the data gaps and data issues in various categories. Those are cut by sectors of populations and settings, but also in terms of characteristics of the data -- administrative data, survey data and so forth.

DR. LUMPKIN: They are meeting tomorrow at 8:30. I think, as before, anyone who is interested in the work of one of the subcommittees can get on the e-mail list and so forth. So, if you are interested, just let Kathy --

MR. BLAIR: Yes, please.

MS. GREENBERG: I will try to be really quick, and I think this is a "yes" or "no" also. From the point of view of coding, you mentioned a number of limitations, primarily, it seemed, in service item or procedure data.

You said a little bit about diagnoses, but the last speaker in particular mentioned that the real nut of the issue is outcomes and had a slide up there about functioning in various areas.

I just wondered if any of you were familiar with the International Classification of Functioning, Disability and Health and whether you had any thoughts about being able to collect functional data, particularly using that classification or some other -- although that is a classification that this committee, at least, has noted.

MR. GANJU: The way a lot of these outcomes were developed was by consumers and policy makers identifying them as key targets of what they really wanted to accomplish as a product of the mental health system. A lot of these different systems were reviewed as part of how this would fit, but no existing system captured the array of outcomes that were identified, which I put up on the slide.

That is part of the reason people have gone forward with trying to develop their own ways of measuring these different outcomes instead of adopting existing systems; they have built on existing systems, but they haven't adopted them.

MS. EISEN: I think I know -- I am somewhat familiar with the ICF, too -- okay, I am not that familiar with it.

But from what I understand, and maybe you can enlighten us more, Marjorie, it is a taxonomy of disability, but there aren't really yet outcome measures developed that are associated with that taxonomy. Is that -- although I know there is a lot of research interest --

MS. GREENBERG: It is a classification. You are right.

Then there is work going on now mapping various assessment tools to the classification. We can talk offline about it. I refer you to the committee's report on functional status.

DR. FITZMAURICE: I have a short question. As we have talked about health care data standards, I have seen that CDC has been active in public health and that CMS has been active in data standards for claims and even for clinical use, and the testifiers today have given us a lot of good knowledge about the problems that mental health has.

So, I would ask: is SAMHSA active, or does it plan to be active, in the same SDOs, the standards development organizations that develop the requirements and supply the codes and vocabularies for claims and clinical uses for behavioral health? I am speaking of X12, HL7, the APA, SNOMED and others. I guess this would be directed toward Mady Chalk and Eric Goplerud.

DR. LUMPKIN: Both of whom have left.

MS. WATTENBERG: Do you want me to respond?

DR. LUMPKIN: Oh, sure, please.

MS. WATTENBERG: Mady and Eric are gone. I am Sarah Wattenberg and I work at the Center for Substance Abuse Treatment. We have been sponsoring the NASMHPD and NASADAD(?) code set groups and we have been encouraging our states to volunteer and get onto all of these work groups to represent our interests.

We do feel that behavioral health and mental health, there is a tremendous lack of expertise. I think part of the problem with getting the code sets developed that we want, having distinct mental health and substance abuse codes, part of the problem is that there is not a tremendous amount of clinical expertise on those committees.

There is some and they are working with us. But we are trying our best to -- it is essentially a manpower issue at this point. But I could certainly talk more with you, Mike, about that.

DR. LUMPKIN: I think that is an issue certainly we would want the subcommittee to take a look at because our recommendations may address the issue of resources that are allocated for participation of those with clinical expertise on the SDOs.

We have made similar recommendations in other areas. I would like to thank the panel. I really apologize for having to be such a strict taskmaster, but we did have a very full panel. We have a lot of interest.

I had to cut off questions, but thank you for opening up this discussion for us as a committee. We are going to take a one hour break for -- 45 minute break for lunch. We are going to try to get started at roughly 1:30. I know you all like to come late. So, that is the reason why I didn't say 50 minutes. So, roughly around 1:30 we are going to get started.

[Whereupon, at 12:45 p.m., the meeting was recessed, to reconvene at 1:30 p.m., the same afternoon, Wednesday, June 26, 2002.]


A F T E R N O O N S E S S I O N

DR. LUMPKIN: Right on time. The first item on our agenda for the afternoon is the health statistics for the 21st century report. We've seen the entire report itself before. What's new I think are the recommendations and Dan, if you will take it away.

DR. FRIEDMAN: Sure. Let me just take a couple of minutes of introduction and then very quickly try to plow through an overview.

I say quickly because the report, as I'm looking at it, is 83 single-spaced pages and trying to summarize it or even summarize the recommendations is impossible. I do want to thank Barbara and Paul and Vicky for their patience and for their repeated reviews, suggestions, help and so forth.

Page, very quickly, page five of the report has a graphic that summarizes the process that we've been going through since 1998, and to remind you the report itself has been a joint production of NCVHS, NCHS and the data council. The report was written by Ed Hunter and Gib Parrish of CDC or now of CDC, shortly of CDC, and me.

It's been, the report is a culmination of a remarkably enough, three to four year long consultative process that's included local discussion groups in Albuquerque, Harrisburg, New Orleans, regional public hearings, expert discussion groups here in Washington, a National Academy of Sciences workshop, numerous presentations at, participatory presentations at professional meetings, an interim report that was published a couple of years ago and so forth.

And then a couple of weeks ago we did a presentation at the data council. The report itself is divided into four chapters. The first chapter is on what health statistics are, which is an attempt to define health statistics, the purposes and uses of health statistics, and to define the health statistics enterprise.

The second chapter focuses on a model of influences on the population's health and uses that model essentially as a template to summarize comments that we've received throughout the consultative process on health statistics needs and gaps.

The third chapter focuses on a model of the health statistic cycle and, again, we use that model as a template to review comments that we've received during the consultative process and to summarize gaps.

And the fourth chapter contains the vision for health statistics including the mission, overarching conceptual framework, core values and guiding principles as well as a statement about the national health information infrastructure and its relationship to health statistics.

Now, those four chapters constitute the joint report of NCHS, NCVHS and the data council. Then, in addition to that, there is, there are recommendations, there is a recommendation section which would be only an NCHS -- NCVHS excuse me, thank you, Jim, only an NCVHS process.

The recommendation chapter is organized, presents recommendations flowing from the 10 guiding principles presented in chapter four and does anybody want me to very quickly go through the guiding principles or can I skip that?

DR. BLAIR: You can't really understand the recommendations otherwise, because they are based on and refer to the guiding principles.

DR. FRIEDMAN: Okay, thank you, Jeff. I'll do it briefly then. Ten guiding principles.

The first one refers to enterprise-wide planning. For us this is really essential and lies at the core of all the other recommendations.

Broad collaboration among data users, producers and suppliers; rigorous policies and procedures for protecting the privacy of data subjects and the confidentiality and security of data; flexibility to identify and address emergent health issues and needs; use of data standards to facilitate sharing and comparability of data; sufficient data at different levels of aggregation to support policy and programmatic decision making; integrated, streamlined data collection for multiple purposes; timely production of valid and reliable health statistics; proper access to and ease of use of health statistics; and then finally, continuous evaluation of the completeness, accuracy and timeliness of health statistics and of the ability of the health statistics enterprise to support their production.

Now, as I said, the recommendations are organized according to those ten principles.

At the same time we also have a cross-cutting classification system where we classify each of the roughly -- I've lost count -- three dozen recommendations as to whether the recommendation relates to data access, data set improvement, data standards, health statistics enterprise structure, evaluation, policy, research agenda or training.

Rather than go through each of the recommendations, let me just quickly summarize the four essentially core recommendations. It's not so much that these are the most important recommendations but we view these as necessary but not sufficient conditions for achieving the other recommendations.

The first recommendation relates to developing a reconstituted National Center for Health Statistics with a strong board of scientific counselors. And our feeling is that NCHS as presently constituted needs to be substantially strengthened and its agenda and mission need to be substantially broadened.

Our model of the health statistics cycle has a little hub in there, a little hub cap essentially and that function, the coordinating, planning function is right now being done in fits and starts and it's really essential that we have strong federal leadership to perform and to help perform some of those functions.

Part of that is also emphasizing the need of NCHS to focus even more broadly than it already does and to focus as an integrator of health statistics data which it does to some extent now, but that needs to be broadened.

A second core recommendation relates to the development of an autonomous health statistics planning board. This is something that we discussed at length in the work group and we've discussed at length with John and others.

It, the recommendation, flowed from a recommendation from the National Academy of Sciences workshop and our feeling is that planning and coordination needs to occur between private and public sector and between the three levels of government on an ongoing basis and we feel it's important not only that all parties to the enterprise sit on the board as equals, including federal agencies.

The hope would be that over time, this could be, as I said, an autonomous board, perhaps like, as it were, the FDOs. And the planning board would be responsible for setting the agenda for the enterprise, periodically evaluating the extent to which the enterprise is fulfilling its mission.

Re-setting the vision as necessary, engaging in consensus planning, recommending resources, proposing changes to existing data collection systems and fostering the development of enterprise-wide standards.

Now, we acknowledge in the report the potential overlap of an autonomous health statistics planning board with NCVHS as well as with a board of scientific counselors for NCHS.

Having said that, we feel that the functions, although somewhat overlapping, are also distinct. The function of an autonomous health statistics planning board cannot be fulfilled by the NCHS Board of Scientific Counselors, which is charged with providing advice to NCHS and would focus on NCHS activities, and it would also be distinct from the functions of NCVHS.

A third core recommendation is coordination of state health statistics activities in each state by a single state agency, supported by state health statistics planning boards. And then fourth, we have a brief recommendation emphasizing the importance of expanded graduate training in health statistics; in effect, this training really does not exist now.

There are a few schools of public health that occasionally offer courses in surveillance. We are certainly not aware of any course work offered in health statistics per se. There is also a need for expanded in-service training and for continuing education opportunities, particularly for people who are active practitioners in the field.

So, with that, why don't I stop and turn it over for any questions. Comments?

DR. LUMPKIN: Let me mention a couple of other things. One of which is there are a couple of summaries that are not here, one of which is going to be a chapter by chapter straight executive summary, which fortunately Susan Kanon has agreed to write, a second of which is going to be what I think of as a pop summary, five to ten pages, and as a matter of fact, I've been having fantasies of having our models in three dimensional pop-ups in the summary.

If Marjorie can arrange a budget for it. So that's going to be forthcoming and then there's also a little bit of minor text editing that we'll also be doing. We had a very good editor work on it earlier, who's the scientific editor for Public Health Reports, and hopefully we'll be bringing her back in. But as I say, we don't anticipate anything resembling substantive changes unless, of course, Simon or any of the others on the committee suggest them. So, with that assignment --

DR. COHN: Dan, thank you for putting me in my place here. First of all, I want to congratulate you.

I think that the body of the report, I mean, I had a chance to read it over at my leisure on a long plane flight yesterday, and I think over all it's very well written and I don't have any changes to the body of the report.

I do have a couple of questions and clarifications about the recommendations and I was trying to remember and I was looking to other people here to understand whether we had seen the recommendations previously because I don't remember seeing them and so clearly the first time I saw these recommendations was three days ago, four days ago when the full packet from the NCVHS came to me so I actually do have a number of questions as I went through only because that's the part that's ours and I want to make sure that we are getting it right and understand all the ramifications here.

So, will you bear with me as I ask a couple of questions?

DR. FRIEDMAN: Absolutely.

DR. COHN: Okay. Number one is in recommendation 6.2, data set development and geocoding. Page number 62. They are all in the recommendations.

These are a series of questions that I came up with. Now, tell me, basically you want to do geocoding at the census block group level.

Now, I will apologize, but what does that mean in terms of specificity of size of the population that relates to a census block group level? Is that large enough that there will be an anonymity of people that are in this or is it going to be any unusual disease is going to stand out like a very obvious anomaly?

DR. FRIEDMAN: I'm looking at quality here. I believe a block group usually varies in size.

I believe it usually has around a thousand people, but we are also saying that just because something is address-matched or geocoded, even if it's latitude and longitude where we actually have the spatial file, it doesn't mean those data then become publicly available.

I mean, we have name and address for most of our data sets and those data are not publicly available.

DR. NEWACHECK: I think that was going to be my point, that the data would be, the confidentiality would be preserved under any circumstances.

DR. COHN: I guess the one issue I have here, and I am not a statistician and Brady is not here with us today, but it gets to cell size issues and whether or not, by the time you start drilling down, even if the name is blocked and all this stuff -- it's something you routinely deal with in health care organizations -- I just wasn't sure whether, despite the exhortations of confidentiality and privacy, there was an issue there.

DR. LUMPKIN: But for the health statistics system, collecting data, the problem is that a lot of the health statistics are kept at the national level. We've got 102 counties in the state of Illinois, and for the purposes of planning, those health officials need to know what the health status is of their population, and in many of the current health statistics systems, health data systems, they can't.

So for instance, there's a survey that's done, the behavioral risk factor survey; in Illinois we do a special expenditure to get the data down to the county level. So what this would be is, in that particular system, within the public health rubric they could look at the data. That doesn't mean that they would release it because, again, you have the same restrictions on cell size which generally we use with cancer registries and other kinds of data systems within the public health area.

So your issue would be a concern if it were a publicly available data set.
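
[A minimal illustrative sketch, not part of the testimony, of the kind of small-cell restriction Dr. Lumpkin describes before public release; the threshold, table and column names are hypothetical:]

    import pandas as pd

    MIN_CELL = 5  # hypothetical suppression threshold; actual rules vary by program and state

    # Hypothetical county-level case counts from a reportable-disease system.
    counts = pd.DataFrame({
        "county": ["Adams", "Brown", "Cook"],
        "condition": ["salmonella", "salmonella", "salmonella"],
        "cases": [12, 3, 240],
    })

    def suppress_small_cells(table, threshold=MIN_CELL):
        # Blank out any count below the threshold so small communities
        # cannot be singled out in the publicly released table.
        out = table.copy()
        out["cases"] = out["cases"].where(out["cases"] >= threshold)
        return out

    print(suppress_small_cells(counts))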

DR. COHN: Okay, but aren't abstracts of this going to be available?

DR. FRIEDMAN: The way the abstracts are generally made available -- one purpose of the geocoded data is what John described. Another purpose, Simon, is that typically our data sets have a limited number of demographic variables.

Virtually no, well, very few data sets have any data at all on economic position.

Many data sets don't even have decent data on education, race or ethnicity, let alone sort of the contextual effects -- housing, median income, etc., etc. And by geocoding to the block group level, what one can then do is essentially attach the expanded census data to the individual records, and that then enriches the data set.

So you are actually, you are not doing analysis, you are not drilling down to my census block group, but you are using those census block group descriptors as a variable in an aggregate analysis.
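
[A minimal sketch, not part of the testimony, of the block-group enrichment Dr. Friedman describes: census descriptors are attached to geocoded records and then used only in aggregate. The identifiers, values and column names are made up:]

    import pandas as pd

    # Individual records already geocoded to a census block group (identifiers, not addresses).
    records = pd.DataFrame({
        "record_id": [1, 2, 3, 4],
        "block_group": ["BG-001", "BG-002", "BG-001", "BG-003"],
        "outcome": [1, 0, 1, 0],
    })

    # Block-group descriptors drawn from census data (values here are invented).
    census = pd.DataFrame({
        "block_group": ["BG-001", "BG-002", "BG-003"],
        "median_income": [34000, 61000, 47000],
        "pct_poverty": [0.22, 0.06, 0.11],
    })

    # Attach the contextual variables to each record, then analyze in aggregate
    # rather than drilling down to any one person's block group.
    enriched = records.merge(census, on="block_group", how="left")
    enriched["high_poverty_area"] = enriched["pct_poverty"] > 0.10
    print(enriched.groupby("high_poverty_area")["outcome"].mean())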

DR. COHN: Okay, as opposed to ZIP code or something like that or county.

DR. FRIEDMAN: Well, yes, the descriptors, but yes.

DR. COHN: Anyway, that was one issue, and I'm still mulling over your, I mean, I'm hearing what you are saying. I'm still a little bit concerned.

DR. FRIEDMAN: We can clarify this.

DR. COHN: Yes. And probably it sort of relates to this is recommendation three on page 65 where you talk about development of a longitudinal data set.

I read through this one and I found the description to be of a constructed data set on a person basis that includes basically the integration of virtually all the data you could ever have on a person basis -- and maybe I'm misunderstanding -- including person, family, date of birth, death, marriages, divorces, any sort of surveillance data sets on a person basis, additional survey data sets, family and household surveys, data collected by multiple means, all into a single data set.

Are we convinced, I mean, I found myself, my hair standing on end a little bit only from, once again, the privacy and confidentiality issues. Are we, I mean, I'm trying to think what the right word is here. I guess even though I don't consider myself to be a privacy activist, certainly this sort of integration of everything all together in an easily accessible way on the person basis, is there a concern, do you have a concern about that?

About either unintended or unauthorized use when something like this becomes quite so easily available? I mean, I see this as somewhat different than, for example, a data warehouse where with sophisticated queries you can reconstruct all this. I mean, you are doing something where basically you type in whatever the identifier for a person is and you get the complete history out.

Any comments, thoughts? Was that discussed during your testimony and hearings?

DR. FRIEDMAN: First of all, I think we can clarify the recommendations. We were not intending that this be the big data base in the sky.

Having said that, I mean I do think that there are, and we thought and it was identified during the process numerous times that there are a huge number of lost opportunities in the US for longitudinal, person based longitudinal analysis because we just don't, we just never put the data together and there are a few surveys, one of which we'll be hearing about this afternoon that are phenomenally expensive but as a longer term strategy, those surveys only meet a little bit of the need.

So I think what we need to do is clarify that recommendation.

DR. NEWACHECK: One thing we might want to do is think about the fact that right now NCHS operates under very strict confidentiality and privacy regulations that are Congressionally mandated in its charter, and we can add some language to the effect that both of these recommendations would be subject to existing NCHS privacy and confidentiality rules or procedures.

DR. LUMPKIN: But let me just give an example.

Probably the most familiar longitudinally linked data set is creating the rate for infant mortality, where birth records and death records have to be linked in order to calculate that rate.

We frequently link death certificates and cancer registries, so I think Paul's suggestion is something that we ought to look at -- that this is not something that ought to be easily accessible, it ought to be well protected, but it has certain important health status and health monitoring roles in order to be able to make the connections.

We link, for instance, our adverse pregnancy outcome which is a birth defects registry and very important research in relationship to neural tube defects occurred with the registry they had in Texas leading to the recommendations on use of folic acid.

That kind of research, as part of the surveillance role of health agencies, is what would be supported by that, but I think the privacy piece needs to be spelled out.
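
[A minimal sketch, not part of the testimony, of the birth-death linkage Dr. Lumpkin mentions for calculating an infant mortality rate; the file and field names are hypothetical:]

    import pandas as pd

    # Birth certificates for one year, and infant death records that have been
    # linked back to the matching birth certificate.
    births = pd.DataFrame({"birth_cert_id": ["B001", "B002", "B003", "B004", "B005"]})
    infant_deaths = pd.DataFrame({"birth_cert_id": ["B003"], "age_at_death_days": [41]})

    # Link the two files on the shared certificate identifier.
    linked = births.merge(infant_deaths, on="birth_cert_id", how="left")

    # Infant mortality rate: deaths under one year of age per 1,000 live births.
    deaths_under_one = (linked["age_at_death_days"] < 365).sum()
    rate = deaths_under_one / len(births) * 1000
    print(f"infant mortality rate: {rate:.1f} per 1,000 live births")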

DR. SCANLON: The other issue is, I mean, you don't want this to be confused with a dossier kind of idea, and it may be that you don't want to talk about a data set as much as you want to talk about data linkage.

You don't want to talk about a data set as much as about capabilities, analytic capabilities because it does sound like a data set.

DR. ROTHSTEIN: I want to support what Jim just said.

There's a difference between privacy and confidentiality that we need to focus on, and what we've been talking about is making sure that the information, once accumulated, is not wrongfully re-disclosed.

Well, that's confidentiality but the privacy question is whether we ought to be in the business of routinely compiling data sets of marriages and divorces and all these other things.

There may be a compelling public health reason for maintaining the capability of doing that while assuring the public that we are not compiling this big data base in the sky.

So I want to add my support to Simon and Jim on this.

DR. HUNTER: Can I point out a reference that might be useful in thinking about how to change this one? Recommendation 3.1 is on the issue of privacy and a code of fair information practices and, in fact, there are some specific references in this not only to these issues, but to things like developing a privacy impact statement for new linkages, to address the privacy question as opposed to the confidentiality question, which is separately addressed.

So it may be that part of the way to deal with this is to cross reference the two, that a pre-requisite for longitudinal analytic files is the capability to link but within the context of the information privacy policies that are advocated actually in text before that one sort of already implies a pre-requisite relationship.

DR. LUMPKIN: Personally I'm completely comfortable with all these suggestions. Vicky?

DR. MAYS: I just want to pick up on a point that Mark was making and that is about whether or not there's a compelling case, particularly in the public health for some of this.

I think what the report does, and it's very well done, is it's really about statistics and population and all that but from the side of the general public or the side of allocating dollars, I don't see the case for doing some of this, and I guess my concern is if one would go out and cost a lot of the things you are recommending, I don't know what's going to happen to the NCHS project.

I mean, when Ed Sondik comes in, he's telling us constantly he doesn't have enough resources for the data set that we are already currently collecting.

We now want a new board created, and there's a cost to it, and then some of the other things here are very costly, and we know that some of the very populations that we are concerned about, racial and ethnic groups, are still not being measured well. So when we advocate some of this, it isn't fixing some problems but it's adding more, and I'm afraid that for some of the groups we are concerned about, they still won't be measured as well.

How would you address that?

DR. LUMPKIN: If I could maybe respond to that, because I think all of this report has to be taken in the context of other material that we've prepared, including the NHII report.

In Illinois our hospital discharge data set, the organization that collected that was abolished or will be abolished as of Friday. The authority to collect that was transferred to the state health department without a significant number of restrictions that existed before. So we are now looking at a hospital discharge set that may be based upon the 837.

So we move into a new environment where the cost of collecting that from the industry drops dramatically, because right now, because it's an abstract of the UB-92, they have to spend extra effort in order to create this document which is now submitted to the state. With the new environment of HIPAA, the 837, when they send in the bill, they send us a copy. It costs them nothing, essentially, to send us that transaction.

So if we can see this perhaps as a blueprint as we start developing this new system in the new environment, we have to ask ourselves the question, what capabilities do we want this environment to have? What should we build towards? I think that's what this document addresses.

DR. MC DONALD: Maybe I didn't read this carefully enough to know -- it may already be in there -- but traditionally the states collect data and the federal government gets roll-ups, and whenever one hears about a national data base, everybody gets very nervous, and I'm not clear whether this might be viewed as a national data base with not just roll-ups.

I'm not for or against that. I'm just, politically, though, you get into big mine fields when you get into that and it might be easier not to set ourselves up in that way.

Second thing, on the other side, and actually it's counter to what Simon is worried about: I almost think it didn't go far enough.

I was glad to hear what you just said. I mean, in terms of the descriptive stuff, this is traditional public health data. There's nothing 21st century about it.

It's the same stuff we've been trying to collect forever, but there are some 21st century capabilities that we could really think about -- it might be really hard, but I mean, the relationship between drugs and side effects and early detection of drug effects -- and you could go on and on and think about things you could detect with the kind of data that might be easy to get in terms of cost on a small basis.

I don't know where I'm going with that, but I did like the idea that you get 837s because you are obviously thinking that way. And then the third thing is that I didn't notice any mention about when to do sampling and when not, because that is a way to save money and sometimes to do more penetrating stuff.

Is that outside of the box at this point?

DR. LUMPKIN: I think the issue of sampling goes back to Simon's earlier question about what level of, what's the level of measurement and the problem with sampling technologies is that you have to have a sample that's meaningful for local planning and so there are a number of ways that we collect data in certain surveys that are done but the real challenge is to get them meaningful at the community planning level.

DR. MC DONALD: But historically, that has not required highly granular data. It sounds like you are going for the more granular stuff so that you might be able to test your stuff better. I mean, historically, that's what the national center has.

DR. FRIEDMAN: Clem, yes, I think both are done on a regular basis, depending upon the need. I mean, it's worth pointing that out so people don't get all worried that you are going to be doing all of this without sampling.

DR. MC DONALD: Yes, again, I think, again, I think that's a well-taken point and I think it's easy to, certainly worth making it in there.

Regarding the nature of the data that are discussed in here, my sense is that the data that are worked on on an ongoing basis within probably virtually every state public health department in the country, and most federal agencies that deal with population health outcomes, are extremely restricted.

And while the data that are discussed in here such as in figure two, may not seem particularly 21st century, in fact when we look at some of the 19th century work done on population health in Britain, it's exactly the same kind of data that are discussed.

Having said that, the data are not now used, and in many cases are not now collected, on anything resembling that basis; they are just not used.

DR. MC DONALD: Well, one of the ways to get reporting is from lab systems and the NHII. There are obvious ways you can get surveillance of infectious disease and you can get bioterrorism detection from lab and other kinds of things. These aren't highlighted here.

Now, that might have been deliberate because you are already worried about people kind of jumping on you and making a big data base in the sky, but I think there really are opportunities to do grand and wondrous things with public health, with population health, via the information systems capturing a lot of patient-specific data.

DR. SCANLON: Well, I think Clem is sort of bringing up the issue I was going to raise. It's the definition and boundary issue.

I know you have defined health statistics and you have tried to define it in a way that doesn't include other things, but it sounds like it doesn't include public health surveillance and it sounds like it doesn't include health data. It includes statistics relatively narrowly defined.

Am I correct it doesn't include national surveillance or it doesn't include public health surveillance? How are you defining administrative data systems as well? Would they be included or would they be outside?

DR. FRIEDMAN: They are included based upon use, so administrative data, like surveillance data, can be used for multiple purposes when they are used, or you have got a --

DR. STARFIELD: It's on page seven.

DR. FRIEDMAN: Thank you, Barbara.

The definition that we use, that we developed, is numerical data that characterize the health of a population and the influences and actions that affect the health of a population, and I think one of the points that we were trying to make in the report is that you can essentially have the same data set with multiple views of those data.

This is one view and we need to get more multiple views of the same data rather than generating more and more data sets which is what we tend to do.

DR. SCANLON: But what is the system exactly? That is, is it the data collection nervous system or is it the analytic side -- it doesn't matter what the information was collected for, it would somehow be made available for analysis?

I'm trying to figure out how does the whole public health surveillance network fit into this definition or this whole framework or the administrative data?

DR. MC DONALD: Again, Jim, it's based upon use. I mean, certainly administrative data fit in when they are used for these purposes. They don't fit in when they are used for -- I don't see a clear definition of the purposes in here.

DR. STARFIELD: Page seven showed how the use overlays the data. To distinguish between that and this.

DR. MC DONALD: Well, the data could come from anywhere, I guess. It could come from any network, and it's the analytics and the use.

DR. STARFIELD: What's on page seven is what's included within it.

DR. MC DONALD: What I see it as, I mean, I see it as if you are not in the field you won't know what this is about. That is, if you have been in public health and collecting this data, you know what you mean by public health statistics, but it really is kind of operational.

Everybody knows what it is, but I think it's constrained, and I don't think you need to be that constrained. And I guess the other question is, statistics is the easy part. It's the gathering that's the hard part, and I think there's a lot of overlap with the NHII.

This is different and we need a different one. I like the word population. I mean, the population health I thought I heard you talking about, I think I can hang onto that pretty hard. Statistics on, you know, I could do statistics on my stock market. I can do statistics on economic trends.

It isn't as well characterized. It is not a word that just briskly defines what you are talking about.

DR. LUMPKIN: But if I could perhaps try to differentiate between this and let's say something like NEDSS: the fact that there've been, and I'm going to make up a number so don't quote me on this one, a thousand cases of salmonella in Illinois in a year -- because I just made up that number; we haven't had that since the big milk borne thing in 1985.

That is an issue of health statistics, but the fact that we have five cases and all associated with a single product, that is surveillance, because what our system does when we get our reports in is we conduct an investigation. We have people who actually go out and visit the people and fill out our survey, and they say, okay, here's the pattern of this particular epidemic, and we start trying to remove it.

So there is an interconnection, there's an overlap, but they are distinct activities.

DR. BLAIR: I think the point made by Jim and Clem is not that they don't understand it, but that there will be other people who read the report where the overlap with the NHII and the overlap with NEDSS is not brought out clearly to show where it's complementary and where it overlaps. It just might be helpful to have either a graphic or a chart or something to clarify that. So, I think -- more pictorials are always good in these reports, so I agree with Jeff about having that.

I do think, however, in my own mind, one can imagine this being a repository of all this data, given that within, some of the recommendations talk about making the data available within one year after collection, maybe begin to make one wonder about how much surveillance capabilities this will have. This is clearly a different sort of piece that we are talking about.

DR. DANAHER: Hey, Dan and Jim, I'm kind of the least common denominator here, so please pardon me. I guess I don't have a good sense of where this document goes.

Is this going to be submitted to the Secretary? Is it supposed to be a strategy document that is going to be implemented? Is this supposed to be a framework for setting up the public health surveillance system? And kind of the underpinning of my questions really gets to Vicky's points, which for me were: what immediate need is this fulfilling?

We could all sit around here and agree that this is a wonderful, great thing to have, but what immediate need is this fulfilling and how does it get funded vis-a-vis competing resources? What's the intended purpose of this document? What are we supposed to do with it, or what is whoever receives it supposed to do with it?

DR. FRIEDMAN: The genesis of the document was a request from Ed Sondik, who directs the National Center for Health Statistics, to develop a 10 to 15 year, what he called vision for health statistics. Health statistics and population health here in the US have essentially been based on a relatively narrow set of legacy data systems with a relatively narrow set of population health outcome measures.

And in addition to that, even the distinction between individual health on one hand and population health on the other hand, even within the field, hasn't been particularly keenly recognized. So the purpose of the effort and of the document was, in fact, a broad one, not to be a planning document per se, but essentially to build a consensus framework for people who engage in health statistics and people who use health statistics, which could then be used to move ahead with more detailed planning and implementation.

DR. DANAHER: So the idea would be to take this, so is that planned to go up to the Secretary or is this to kind of get interdepartmental, buy off of a common vision of health statistics?

DR. LUMPKIN: I would think it would be both but also, I mean, not just the governmental vision and certainly not just a federal governmental vision.

DR. DANAHER: So it's kind of the first step in eventually getting to a broad scale policy.

DR. LUMPKIN: Yes. Now, just to put this in context because we have had some discussions in the four years that we've been working on this project. This fits within the overall context of the national health information infrastructure.

The patient medical record information recommendations that the Subcommittee on Standards and Security is working on flesh out a good portion of the health care provider dimension. This fleshes out a part of the population health dimension. NEDSS fleshes out another part.

So really it is consistent with the overall direction that we've been kind of working on as a committee for a number of years.

Mike has had his hand up for a while.

DR. FITZMAURICE: Two points. One on page 53, the other on page 56. On page 53, it says implementation of the following NCVHS recommendations pertaining to enterprise-wide planning and coordination is essential, and it's the first bullet that I'm unclear about.

It says assignment of over all responsibility for health statistics activities within DHHS to a reconstituted National Center for Health Statistics. Does that mean they collect all of the data in health statistics? Does that mean they provide advice to other agencies? I'm thinking of my agency, thinking of SAMHSA who also collects health statistics.

I don't know about the overall assignment unless they take all these data gathering efforts and consolidate them within NCHS.

DR. LUMPKIN: Well, that's just the short summation of what is referred to as GP 1.1, which is the two-page description of what that is. It could be rewritten something like overall responsibility for advising on public health statistics activities within DHHS.

DR. FRIEDMAN: Certainly what was intended was not necessarily taking all data collection.

DR. FITZMAURICE: The next one is on recommendation 1.2, enterprise structure, an autonomous health statistics planning board. When I look at the membership of it and the functions, you know, NCHS already has a good deal of this, like the health data standards consortium.

NCHS already has a good deal of the health statistics planning board in the PHDSC, the Public Health Data Standards Consortium I think is the title of it. I don't see any mention of that -- that a good part of this already exists, and it could give impetus for expanding the role of that consortium into being a statistics planning board and show the experience of NCHS in this.

DR. HUNTER: There certainly is mention of the consortium earlier in the document when integrating and planning is discussed but I don't think it's the same breadth of influence.

DR. MC DONALD: I kind of rationalized the document after I read some of the sections and actually recommendation three, data set development, longitudinal data sets, is kind of that more 21st century kind of view.

DR. MAYS: Can you speak up? I'm told that nobody can hear you over there.

DR. MC DONALD: I'm turning my mouth, that's the problem. I'll not talk to the chair. I will talk right into the microphone.

But I think really the problem is the title. I mean, what this is really talking about throughout is the population health and measures of population health and the statistics are sort of a side bar. They are important but they really are, I mean, that's an add-on. You are running it through the programs.

So I don't say you don't have to do that, it's just whether one might say, at least put population health statistic, somehow get the cue in the title that that's really the subject of the interest. It's not motor vehicle statistic. It's not economic statistics.

It's not -- well, you are saying health, but I think population health is what would tie it together better.

DR. MAYS: I want to talk a little bit about, again, this here, about the training that you are recommending. For example, on page 67, recommendation, CF recommendation five, training in terms of population health and then earlier you also talk about the board, for example, doing a need assessment.

One of the things that's ongoing right now is there seems to be a lot of activity by schools of public health and medicine of working with state departments of health to do the kind of training that you are talking about which actually customizes it.

They are trying, for example, to increase people's ability to get MPHs, there are specific on-line courses, so it would seem that some of this is actually underway, and it may be that in the hands of the people who are certifying people you actually gain more, as opposed to having CE kinds of activities.

That's one thought. And the other is when you talk about the board doing a needs assessment.

I mean, to me it's almost like -- which is page 57, 1.4 -- but would you not want to think more loftily about asking the IOM to look at, in general, what the training needs are in terms of population health, as well as kind of assessing whether or not our current degree granting programs are actually preparing people.

So you have a work force problem as well as a training problem and you may want to think about, that would be very timely given the other data that's being collected by IOM, to think about that.

Because when they did this in terms of schools of public health and the kind of work force preparation, that made a big impact in terms of public health practice.

DR. LUMPKIN: Well, actually, there are two current IOM committees.

The first is the IOM committee on the public health work force, which is actually addressing that issue, so I think this recommendation becomes enabling for the recommendations that will be coming out of that committee in the next year. The second is in relationship to the training on page 67.

I think it's a question of how much detail you go into in each one of these things, but I can just tell you, running an agency that's trying to hire and have that kind of expertise, we are going through an early retirement program in our state and we are just going to see an incredible brain drain, and the schools of public health are not producing the kinds of people that we need to be able to do the kind of statistical analysis on populations.

It's a very niche kind of educational process. So I think that to the extent that we can enhance the capability of doing population estimates using these sort of data to better describe what's going on, it has been under funded and under resourced and under trained and I think we do need these kinds of recommendations.

DR. MAYS: I think it's better done as not one shot CE type deals. It's better done within a program.

I mean, I think that's my point, that it's really better done within programs, and I think pushing schools of public health and medicine, where people are being trained, to undertake this as an area of study is in the long run, I think, very productive, as opposed to the CE type of courses, although those do get people up and going.

DR. LUMPKIN: Let me see if I understand. You are suggesting then that this particular recommendation be enhanced to talk about training in place as well as basic training or something like that.

DR. MAYS: Right. I think to put it in the schools of public health and medicine, to include this as an important part of the curriculum so that people are taught about population health.

Some universities -- we will talk about Johns Hopkins -- do this well. Some universities don't. But I think there's no push. I think that if IOM comes out with it or if there's some strong body that demonstrates that this is the future, then you are going to see the universities be able to move in that direction but there's not a push to do that right now.

DR. LUMPKIN: Let's maybe mark this because we are actually running over time. We do have a work group meeting tomorrow.

DR. HUNTER: I was going to refer back. There is an earlier place where training is discussed in a broader context.

On page 57, I think in this case it was more specific to that part of the report, and I think that may be enhanced where it already talks about graduate training and other things as well.

DR. LUMPKIN: What I'm going to suggest that we do, because we are running way over time, is identify other issues for the work group to work on tomorrow and bring it back to us, and then we can look at it and say are we ready to move this forward, or is there more work that needs to be done, which is what we'll have to decide tomorrow.

Are there other issues to raise for the document for the work group? Mike?

DR. FITZMAURICE: I just found another sentence in here. Achieving the goal -- this is the bottom of page 54 -- requires centralizing responsibility for conducting DHHS's health statistics planning, its data collection, analysis and translation functions, and it says that NCHS is ideally positioned to serve as the home for this new enterprise.

That says to me you pull out all the data collection activities of DHHS agencies, you pull out the analytic power of those DHHS agencies and put them in NCHS. I don't think that's the original intent of the paper.

DR. LUMPKIN: No, it's not the intent of the paper.

DR. SCANLON: I think there would be little resistance to the idea of NCHS as the designated general purpose, multipurpose health statistics agency, and actually I would look even more toward not so much a centralized data collection agency as the analytic capacity, but that's up to the committee.

But I think if you use the words, in that first recommendation, responsibility for general purpose health statistics and carried it through, you would have less of this issue that you see coming up. I'm not sure anyone would really agree to the others.

DR. LUMPKIN: But we are not going to try to write that section right now.

DR. SCANLON: No, no, no, but that's a boundary.

DR. LUMPKIN: And I've been reminded that this is so important that we've moved from tomorrow morning's agenda to this afternoon's agenda, later today.

So all my statements about it being tomorrow are wrong. It's at 5:15 this evening. Any other items for this? Okay, we will bring it back again tomorrow and take another whack at it.

Simon?

DR. COHN: On that note, actually we want to just mention two things.

First of all, there is a letter that we are drafting, the subcommittee in response to the notice of proposed rule, electronic transactions. It's going to be a relatively short letter, primarily thanking HHS for finally publishing the NPRM and encouraging them to put it into final rules as quickly as possible.

Plus there will be some comments made on specific questions asked by HHS, especially in relationship to whether we should be stating that HCPCS should be a code set for particular uses in relationship to drugs and biologics, or whether we should leave it ambiguous for the moment.

I unfortunately do not have the version -- we are still drafting it -- it will come back tomorrow for the full committee and hopefully will not be terribly controversial. There is a time issue on all this stuff that we do really need to get this thing finished by the end of our meeting tomorrow, recognizing that HHS is requiring all submissions and comments be in by July 1st.

So that's item for action tomorrow number one.

Number two is we have a letter -- actually I asked Kepa to take the lead on going through it -- in relationship to recommendations on code sets that have come out of basically our last two sets of hearings. I do want to apologize to the full committee that none of you have seen this before.

As I understand it, it was a combination of the fact that the subcommittee reviewed and made modifications to this letter up until the end of the day Friday -- I think the expectation was to get it out to all of you by e-mail before -- and I understand there were some structural emergencies at NCHS that prevented this from actually being sent out yesterday by e-mail.

I heard the water system was closed and the building was evacuated, as I understood it, so I do want you to just recognize that. I also want to -- I'm not sure whether to apologize or at least make note -- because I understand not all of our review comments made it around. I do understand that today Dr. McDonald sent off comments to the person putting all of this together, but somehow these comments did not make it to all the subcommittee members.

DR. MC DONALD: In fairness, I did that, like Monday.

DR. COHN: You did that like Monday, okay. So I think we just have to be aware that we have not taken in all the comments from subcommittee members, and during our subcommittee deliberation I'm sure we'll have the chance to hear your concerns.

DR. ZUBELDIA: The letter addresses the issue of the migration from ICD-9 to ICD-10, and we heard extensive testimony that ICD-9 -- both volumes one and two for diagnoses and volume three for procedures -- is incapable of supporting even current business needs for administrative, public health and other uses of the ICD-9, and that it should be migrated to ICD-10 as quickly as possible.

So the letter is reflecting that testimony and the findings from the testimony and recommending to replace ICD-9, volumes one and two with ICD-10-CM and ICD-9, volume three with ICD-10 PCS as soon as possible.

However, during the testimony it was clear that to do that on an emergency basis, as soon as possible, would not be wise, because people need time to implement this transition, and we heard that people wanted at least two years after they are done with HIPAA to do a transition from ICD-9 to ICD-10. So the recommendation is to implement it as soon as possible but no sooner than October of 2005.

DR. LUMPKIN: Okay, I would like to take care of this in two parts. The first part is the recommendation which is to migrate to ICD-10.

If we can just see if there are any concerns with that general concept, and then we can address the letter itself specifically.

DR. ZUBELDIA: Let me add one thing.

In the letter, at the end of the second part, it says the committee will continue deliberations on code set issues and provide additional recommendations in the future.

We believe that we may still need to hear from small insurance companies and some of the other constituents, but we realize that it's important to do this migration as soon as possible, and given the time frames it takes for regulations to go through and so on, we want to put this letter out to start the movement right away, even though there may be some additional hearings that we need to hold on this issue.

DR. LUMPKIN: And probably, recognizing our experience in dealing with the federal system and how things move, saying as soon as possible and 2005 are about the same thing.

DR. ZUBELDIA: I realize it could even be 2006, as soon as possible.

DR. MC DONALD: End of the second paragraph on the first page. Clarification on what you are asking.

Were you saying ICD-10, the diagnostic codes, or ICD-10-PCS when you used the word ICD-10?

DR. LUMPKIN: The letter specifically talks about three things.

DR. MC DONALD: You first said we agree, we want to go to ICD-10. I didn't know whether you meant both pieces of ICD-10.

DR. LUMPKIN: Yes.

DR. MC DONALD: That's actually what I, I mean, unfortunately, I wasn't at the hearing so I probably have no standing. I'm really asking do we need to pull out any of the pieces of it for --

DR. ZUBELDIA: I would like to argue about the wisdom of ICD-10-PCS at this time.

There's actually three recommendations. One for ICD-10-CM, one is for ICD-10-PCS and the other is for the deadline.

DR. MC DONALD: I don't have any -- I mean, we have got to go to 10-CM some day. I mean, the rationale for it might be a little bit off, because it's my understanding it doesn't really enrich the whole world of codes we would like to have tremendously.

There are certain areas where it's enriched, and whether it's a huge improvement I think is -- I had not heard that it was a huge improvement, but I think we've got to do it because of the treaty, and it's probably better. So the issue of PCS is the one that's pulled out.

Are there concerns with the other two issues? Can I address it? It's CM -- no, I guess there are two, CM and PCS.

DR. ZUBELDIA: Does that comfort you maybe a little bit -- even though there's not a lot of new functionality to make it completely different, there is a lot of new room for growth that ICD-9 doesn't have.

And now when they are trying to insert a new procedure code or a new diagnosis code in ICD-9, because some of the categories are full, they have to put it in other places where it doesn't belong, and you try to look for it and can't find it.

DR. LUMPKIN: You are talking about CM.

DR. ZUBELDIA: CM and PCS. I'm talking about both.

I'm talking about CM and volume three, procedure codes today don't have the room to add new codes that we need today.

DR. MC DONALD: It just was a side comment on that ICD-10-CM.

The thing I would like to have some discussion about is the other one so if you are going to do some voting, I would prefer to have a chance to discuss some of the complexities of the second decision.

DR. BLAIR: Okay, could I just say one other observation related to Clem's? Clem, I think there was a sense that we needed to go forward on these to fix existing problems with ICD-9.

Not that ICD-10 was perfect, but it addressed some of those problems and it made some of the improvements and there didn't seem to be a viable alternative to going forward with the ICD-10-CM and ICD-10-PCS. If you, in your mind right now are thinking there's an alternative, that would begin to change the possibilities.

DR. MC DONALD: We don't want to have that discussion yet. I mean, the chair I thought wanted to hold off the discussion.

DR. LUMPKIN: No, actually, what we are going to try to decide is this, so we can split up into the work groups. Are there any concerns related to migrating from ICD-9-CM to ICD-10-CM? Speak now or forever hold your peace.

DR. BLAIR: There's a lot of concerns but I just don't know of a better alternative.

DR. COHN: John, you are referring to volume one and two diagnosis?

DR. BLAIR: Yes, recommendation one. I'm trying to summarize it.

Recommendation one in the letter -- replace ICD-9-CM, volumes one and two, with ICD-10-CM for disease, injury, impairments and other health problems and manifestations, and cause of injury, impairment and other health problems.

DR. LUMPKIN: Seeing none, I'm going to take that off the table. The issue is, we now have the question of replacing volume three of ICD-9-CM with ICD-10-PCS for procedures or other actions taken for diseases, injuries and impairments on hospital in-patients, reported by hospitals.

And here's your options. We can discuss that now and make a decision. We can have some abbreviated discussion, ask the subcommittee to review, have that continued discussion and bring it back to us tomorrow or we could decide we just don't want to do it right now.

I think we have to look at some way to expedite it, and the concern I have is I'm just kind of getting the feeling that there are a few people who are concerned about it, but the rest of the committee -- like, for instance, me -- doesn't have a clue, since I don't work with either one. I do know that when we switched as an agency to ICD-10, because that's how we report to NCHS, our nosologists retired because they didn't want to do the conversion.

But we are back to our work force but I think the issue is do we want to have this discussion in the subcommittee or do we want to do it at the full committee?

DR. NEWACHECK: John, just one point of clarification. I'm not sure what PCS is.

DR. MC DONALD: PCS has nothing to do with the World Health Organization and ICD-10.

It's just sort of a convenient name, and what it is is a newly invented procedure code which is much like ICD-9 in that it has a vector: there's a certain set of characters assigned to each part of the procedure code -- I think there's six or seven -- and there's just a certain number of characters assigned to each, and actually there's not going to be space there either.

In some of these space areas they will run out of space. So they have an embedded meaning.

The second thing is, in a given sort of hierarchical code through A -- I forget what they mean, but it goes through everything from radiology to nuclear medicine or something like that -- the meanings of the sub codes change, so the letter codes don't mean the same thing as you go down different parts of the tree.

It's long; there's two million of them, two million-plus codes, compared to the 35,000 or 10,000 codes we have now. I did hear testimony in one meeting where there was some support from some of the coders, but the only data we saw showed it took one and a half times or twice as long -- it took longer to code.

This is going to be -- I view that we are going to get into the same situation we got into with NDC with this, because there will not be places to put the fields. There will be this huge upheaval.

There hasn't been much introspection about this or much depth. You would almost like to float the plan to the people who have to use it and see what happens, because I hate to have us in the situation where we are retracting. We lose our credibility.

So that's my concern, and the other part about it is it really pre-judges the decision about CPT because, to me, the alternative is to use CPT. That would be what I would actually think about talking about, but that's a long, hard discussion. And the reason is -- I'm not even a member of the AMA, which I just recently got in trouble with by mistake, so I don't have any specific connection on that point -- but I did notice with the surgeons in our hospital --

They think of the surgical codes as the CPT numbers. These things are really distributed into the brains of people all over the country, including the physicians -- wow, gee, I thought that was an appendectomy, or something like that. So there will be some upheaval if we have to retrain everybody on earth on this. So those are the two reasons why I'm worried about that decision.
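
For readers unfamiliar with the structure being described, the following is a minimal Python sketch of a seven-character, position-dependent code of the kind discussed above. The axis names are the standard ones for the ICD-10-PCS Medical and Surgical section; the sample code, the 34-values-per-character figure, and the function name are illustrative assumptions rather than official tooling or counts.

# A minimal sketch, not part of the testimony, of the seven-character,
# position-dependent structure described above.

AXES = [
    "Section",
    "Body System",
    "Root Operation",
    "Body Part",
    "Approach",
    "Device",
    "Qualifier",
]

def split_pcs_code(code: str) -> dict:
    """Split a seven-character ICD-10-PCS-style code into its named axes."""
    if len(code) != 7:
        raise ValueError("expected exactly seven characters")
    return dict(zip(AXES, code.upper()))

# Illustrative code only; its clinical meaning is not asserted here.
print(split_pcs_code("0DTJ0ZZ"))

# Each position carries its own meaning, and what a value means can change
# depending on the values to its left -- the "different parts of the tree"
# concern raised above. With roughly 34 usable values per character (digits
# plus letters other than I and O -- an assumption for illustration), the raw
# combinatorial space is enormous, even though the published tables restrict
# which combinations are valid codes.
print(34 ** 7)  # upper bound on raw combinations, far above the codes in use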

DR. BLAIR: Do you feel like that upheaval will still occur, given the time frames that we laid out in the recommendation letter, that the transition would not take place until after compliance was achieved with the financial administrative codes?

DR. LUMPKIN: Marjorie just wants to give a little bit more background and I'm going to toss this back to the subcommittee.

MS. GREENBERG: I agree that it should go back to the subcommittee, but I just wanted to provide a little bit more information to Clem, since he wasn't at the hearing, and to the others as well.

The testimony that was heard was that, well, first of all, the proposal was not that physicians should start using 10-PCS. This is for in-patient reporting by the hospitals.

Right now, the hospitals use volume three and physician reporting uses CPT, and that would continue -- the physician reporting would continue to be CPT, and the hospital reporting of procedures would migrate from 9-CM volume three. So this would not replace --

DR. MC DONALD: No, I understand that, but it really doesn't make sense to do different code systems in those two different places to start with.

MS. GREENBERG: That's another issue. I think what came out at the meeting was that the industry really wanted to migrate to a replacement -- that 9-CM volumes one, two and three had to be replaced.

There seemed to be consensus on that in all the testimony, and that the industry wanted to make that replacement and migration together. They didn't want to do it separately -- first do it for diagnoses and then do it for procedures.

The sense was that coming to closure on a single procedure classification system -- which, of course, this committee has recommended going all the way back to, I don't know, 1993 or something -- was still a goal and a good one, but not one that could probably be resolved in any kind of short term. The recommendation is actually to publish an NPRM, which then, of course, allows for a complete comment period, which could result in the department re-thinking whether it wants to go ahead with both or de-couple the two of them.

So there is the NPRM process; on the other hand, we could continue holding hearings for the next four or five years, and meanwhile we are living with broken and inadequate systems. So the thinking was to move ahead with an NPRM, which, as has been noted, is not a short-term process.

This then gets it out for comment, and based on the comments the final decisions could be different. So I think that kind of context is helpful.

DR. COHN: First of all, thank you, Marjorie, for providing that context, because I think it helps us remember why we came up with the letter as it developed.

I do want to comment. I think the subcommittee is happy to take this and further discuss it.

I guess, just to make sure we have our options sort of on the table, it appears to me that one option is to come back and say go forward with the letter as it is, one is to divide the issue, and another is to ask for more public input about the letter and then come to a decision at some point in the near future.

I mean, I guess the final option is to defeat the issue. Obviously, we'll talk about them.

Those are the ones that came to me, anyway, so we will talk about it in the subcommittee and come back with a recommendation, hopefully tomorrow morning.

DR. ZUBELDIA: John, on this third recommendation, are there arguments or discussion on the date of October 2005?

DR. LUMPKIN: I think we had better decide what it is that we want to do before we can talk about a date.

So we can move to the subcommittees. We do have speakers for the next subcommittee session who are outside, and that's why I'm trying to --

The subcommittee on standards will be in the Georgetown room and the subcommittee on populations will stay here.

DR. MAYS: Can I ask you -- just because we actually have speakers and part of the group in here, so I'm going to ask --

MS. GREENBERG: Oh, and don't forget that you are all invited to a group dinner -- pay your own way, but I think it will be fun. It's right across the street, really.

(Whereupon, the session was recessed.)