Public Health Service

NATIONAL COMMITTEE ON VITAL AND HEALTH STATISTICS

Subcommittee on Standards and Security

October 22-23, 2002

Washington, D.C.


- Minutes -

The Subcommittee on Standards and Security of the National Committee on Vital and Health Statistics (NCVHS) held hearings on October 22-23, 2002, at the Hubert H. Humphrey Building in Washington, D.C. The meeting was open to the public. Present:

Subcommittee members:

Staff and Liaisons

Others


EXECUTIVE SUMMARY

October 22-23, 2002

The Subcommittee on Standards and Security held hearings on October 22-23 in an ongoing process focused on the Health Insurance Portability and Accountability Act (HIPAA) and administrative simplification and implementation. In keeping with NCVHS's responsibilities under both HIPAA and the Administrative Simplification Compliance Act (ASCA) to track implementation, identify barriers and solutions, and publish reports on effective solutions to compliance problems, members focused on the HIPAA electronic transaction standards final rule and on issues and opportunities related to implementation. In the afternoon, members discussed next steps related to the International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10), including a proposed cost-benefit study, and synthesized issues in areas of agreement and disagreement for discussion during the next full Committee meeting. Work underway related to drug codes and terminologies was considered during the final part of the afternoon. The next day, members discussed testimony on Patient Medical Record Information (PMRI) terminologies and changes occurring in the Centers for Medicare and Medicaid Services (CMS) in relation to HIPAA implementation and compliance. During the two days, the Subcommittee heard nine presentations and talked with two panels regarding HIPAA readiness and drug terminologies under development.

Ms. Trudel reported that on October 15 the Secretary announced that responsibility for enforcement of the administrative simplification provisions, excluding privacy and including HIPAA portability, would be assigned to CMS and that a new office was being created for this and other HIPAA outreach activities. CMS will continue to work closely with the Office for Civil Rights. The Secretary's announcement clarified that the enforcement process would be complaint driven and progressive; penalties would be imposed only after all other avenues were exhausted.

Overview

Panel 1: HIPAA Readiness - Observations

Mr. Tennant said the model compliance form assisted the enormous number of providers who applied for the extension and raised awareness about compliance. He cautioned that providers relied heavily on vendors to bring them into compliance. Group practices reported that major national vendors weren't providing a software solution and that practices had to send HIPAA-compliant claims to payers through a clearinghouse. A coalition of 14 medical specialty societies established an online directory to report practice management system (PMS) vendors' HIPAA readiness. The Workgroup for Electronic Data Interchange (WEDI) focused on identifying roadblocks and creating solutions (e.g., vendor forums assisting in reaching compliance). Mr. Tennant said top-level payers expected 80 percent of health plans to be ready with the claim (837), remittance (835), and some eligibility (270/271) standards by next October, encompassing 90 percent of transactions. Community and rural health centers serving the poorest segment of society were most in peril. Industry uncertainty topped the list of roadblocks. The cost of systems also hampered compliance. Missing standard elements in the CMS 1500 paper claim, concerns about fines for non-compliance, and, until recently, a lack of direction from the federal government plagued providers. Mr. Tennant emphasized the need to provide constructive advice on how to move forward.

Mr. Gilligan gave CMS kudos for its implementation efforts and outreach. He noted a problem with different interpretations of the HIPAA implementation guides and reported that answering provider clients' questions about when their software would be HIPAA compliant left vendors open to liabilities; the vendor community was being judicious in the language of its responses. CMS, vendors, and medical specialty societies will meet October 30 to try to resolve this issue. The vendor community wanted more done to educate providers about the respective roles of providers and vendors in implementing the transaction and code sets final rule. Vendors also wanted help telling customers and clients they had to change the way they did business. Mr. Gilligan said there was still reason to be confident that compliance would be substantially achieved by next October.

Mr. McLaughlin said participation in industry groups helped drive McKesson Information Solutions and its partners toward administrative simplification by utilizing industry-accepted standards and protecting information through privacy and security standards. McKesson formed a HIPAA Project Office for all HIPAA-related development coordination, employee education, and customer communication, and published a readiness disclosure on its own Web site and on the public Web site Mr. Tennant mentioned. He said the extension, coupled with uncertainties about the addenda and National Drug Codes (NDC), complicated the implementation process. He noted roadblocks to compliance: NDC, taxonomy, addenda, and extension code issues; lack of final or proposed rules for the remaining identifiers, security, and transactions; implementation guide interpretation issues; and trading partners with unknown status. Noting that implementing the privacy final rule without the matching security rule was a near impossibility, he urged HHS to finalize the addenda, NDC, privacy, and security rules. He encouraged HHS/CMS participation in a workgroup initiated by WEDI SNIP and X12 aimed at eliminating confusion and inconsistent answers related to the implementation guides. Noting that many trading partners still hadn't declared testing and compliance dates, he recommended developing a directory documenting the readiness of partners.

HIPAA Extension Request Status

Ms. Trudel reported that over half a million people who probably didn't know about HIPAA six months ago demonstrated awareness by submitting an extension request online. She anticipated that 50,000 paper submissions would be received and keyed into the database. Most electronic requests were from providers and reported a Medicare number. Ms. Trudel said she'd be able to report statistics and a sense of common problems or trends at the December meeting.

What Needs to be Done to Move the Industry Implementation Forward, Roy Rada, Professor of Health Care Information Systems, University of Maryland

Noting that different entity types had different practices, Dr. Rada suggested sorting through the data from the extension request forms by type to see what each entity did. Dr. Cohn said he hoped this data helped the Subcommittee meet its responsibility to identify compliance problems (they'd already heard some in today's testimony) and determine what had to be published or shared around best practices.

Preparation for ICD-10 Presentation for Full Committee in November

The Subcommittee considered a proposal for a cost-benefit analysis that HHS staff prepared. Members noted there were a variety of suitable ways to get this done. Ms. Greenberg expressed hope for a preliminary analysis by the February Committee meeting. She advocated initiating work on a notice of proposed rule making (NPRM) and having an outside contractor facilitate the impact analysis, addressing both costs and benefits. She delineated the discussion and analysis as studying the costs and benefits of moving from the clinical modification of ICD-9 (ICD-9-CM, volumes 1, 2, and 3) to ICD-10-CM and the ICD-10 Procedure Coding System (PCS). She suggested that looking at other classifications as alternatives was out of scope. Members agreed that they needed enough specificity to identify how conclusions were reached.

The Subcommittee potentially will look at preliminary data in February and final data in May. Members discussed reporting now to the full Committee and writing the Secretary after the cost-benefit analysis and the November discussion. Ms. Greenberg reiterated that one position that could come out of that discussion was proceeding with development of an NPRM. Meanwhile, the study could go on with the expectation that it would feed into any final NPRM. Dr. Fitzmaurice noted that an NPRM in tandem with a cost-benefit analysis would give the Secretary enough information to make a decision. Members discussed simultaneously developing the cost-benefit analysis and recommending that the Department work on its own impact analysis and on the regulatory language it would use if it chose to go forward with an NPRM. Mr. Blair suggested that the study seek recommendations on education, implementation tools, and other things to mitigate identified costs. Mr. Scanlon said putting forward the NPRM while still collecting basic information troubled him. Noting the Department looked to the Subcommittee to place all this in perspective, he encouraged members to think about all the other demands on providers, plans, and the Department, to consider alternatives, and not to suddenly pull another regulatory action. Ms. Greenberg noted the American Hospital Association (AHA) advised that neither the hospitals that reported inpatient procedures, nor CMS, which used inpatient procedures in DRGs, nor others using ICD-9-CM, volume 3, for statistical purposes considered the Physicians' Current Procedural Terminology (CPT) a viable alternative.

The Subcommittee was prepared for a scope of work that didn't include CPT or a single procedure classification analysis and agreed to hold off on a final decision about expanding the scope of the cost-benefit analysis in order to seek guidance from the Committee after the CMS presentation and discussion. If the Committee chose, it could do a separate study. Work will begin on the draft proposal, which wouldn't be initiated until after meeting with CMS. If necessary, CPT could be added, but it was evident that the issues of ICD-10-CM and ICD-10-PCS had to be split. The cost-benefit impact had to look at ICD-10-CM and ICD-10-PCS separately, while gauging the costs and benefits of replacing ICD-9-CM, volumes 1 and 2, at the same or different times.

The Subcommittee considered how to synthesize issues of contention for presentation in the full Committee discussion. The background document laid out the history of each classification, why it was developed, and a definitive timeline. A companion document outlined issues identified during discussions about single procedure coding. Members assembled an initial list of issues: the urgency for a new code set; the extent to which ICD-9-CM was broken; whether there were alternative candidate code sets; CPT and timing, including linking; whether to link changes in CM and PCS; adequacy of testing; cost versus benefit; and availability of help/assistance/education/support tools and programs for migration. Dr. Cohn recommended adding the GAO Report's worldview and the 1993 NCVHS study's perspective on classifications, along with its recommendations. Ms. Pickett will draft pros and cons. Members will augment the issues paper off-line. A draft will circulate next week. Comments are due November 5. The Subcommittee chose to facilitate the Committee's discussion by dividing up the bulleted pros and cons in the issues document and assigning them to members who will lead the exploration of each item, reflecting both views, with other members joining in if they felt a view needed expansion.

Panel 2: Drug Terminologies Under Development

Mr. Brown reported the July 2000 NCVHS report had been taken to heart and movement was underway to reformulate and distribute information already under government stewardship. The National Library of Medicine (NLM), FDA, and VA worked together, and NCI agreed to collaborate with information on drug terminology. VA and CDC had also drafted an agreement. FDA was developing NDCs and active ingredient classifications. NLM utilized RxNorm for enhanced mapping capabilities between systems. And VA was active with NDF-RT, a reference terminology based on the National Drug File (NDF) that expands the RxNorm model.

Discussions in HL7 meetings about defining a clinical drug so pharmacy knowledge base vendors could better interact led to a new semantic type in the UMLS: a clinical drug as something with an ingredient and either a form or strength or both. This RxNorm form was intended as a standard representation of what was meant at a clinical level, relating all UMLS clinical drugs to that standard format and facilitating crosswalks between pharmacy knowledge base vendors. A set of UMLS concepts was built, starting off with ingredients. Formulations were created combining the drug component (an ingredient in a strength) and the standard drug forms HL7 proposed. The UMLS will contain relationships among the ingredient, component, clinical drug, and dose form. Things that didn't match would have a relationship to the RxNorm form they could be mapped to; everything could be represented in a graph. VA provided an ingredient list and NLM mapped those into semantic normal forms. Drug vocabularies from the five major clinical drug sources were tested to gauge the scalability of the approach and refine it. The first experiment began with 80,000 drugs and ended up with 11,300 clinical drugs, 10,000 components, and some 8,000 RxNorm forms. A 70 percent reduction in the workload was achieved algorithmically. A second experiment resulted in about 25,000 edited and created forms merged down to 14,000. At that point, 13 clinical drugs that had been separate UMLS concepts merged. About 35 UMLS concepts from previous additions were embodied in the same concept. Some 20 graphs establishing and labeling the full set of relationships for a list of highly prescribed drugs were distributed in the May release of the UMLS Metathesaurus. Mr. Nelson predicted that by the end of 2002 all RxNorm forms needed would be created and that by spring a full set of graphs would relate every clinical drug in the UMLS to a normalized form.
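The structure described in that testimony, a clinical drug composed of an ingredient, a strength, and an HL7-style dose form, linked by named relationships into a graph, can be pictured with a small sketch. The code below is purely illustrative and is not NLM's implementation; the concept names, semantic types, relationship labels, and the example drug string are simplified placeholders chosen only to show the shape of the normal form.

```python
# Illustrative sketch (not NLM's RxNorm code): composing a normalized
# "clinical drug" from an ingredient, a strength, and a dose form, and
# recording the graph of relationships described in the testimony.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Concept:
    name: str
    semantic_type: str  # e.g., "Ingredient", "Component", "Clinical Drug", "Dose Form"

@dataclass
class RxGraph:
    edges: list = field(default_factory=list)  # (source, relationship, target) triples

    def add(self, source, relationship, target):
        self.edges.append((source, relationship, target))

def normalize(ingredient: str, strength: str, dose_form: str, graph: RxGraph) -> Concept:
    """Build the normal form: ingredient + strength (the component) + dose form."""
    ing = Concept(ingredient, "Ingredient")
    component = Concept(f"{ingredient} {strength}", "Component")
    clinical_drug = Concept(f"{ingredient} {strength} {dose_form}", "Clinical Drug")
    form = Concept(dose_form, "Dose Form")
    graph.add(component, "has_ingredient", ing)
    graph.add(clinical_drug, "has_component", component)
    graph.add(clinical_drug, "has_dose_form", form)
    return clinical_drug

graph = RxGraph()
# A source vocabulary's drug string would then be linked ("has_normal_form")
# to the concept returned here, giving the crosswalk described above.
print(normalize("Acetaminophen", "325 MG", "Oral Tablet", graph))
```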

Mr. Nelson cited the need for a method to ensure the updating of things used on a regular basis at a clinical level. FDA provided access to much NDC code information and VA cooperated with NLM, which used those NDC codes as attributes of these concepts. Considerable interest was demonstrated in working together to deal with the maintenance burden. Multiple problems were noted with non-uniqueness of codes.

Mr. Levin explained that the NDC comprised three different codes: labeler, product, and package. FDA assigns the four- or five-digit labeler code to a company; companies assign themselves the other codes. Each was unique, but bar codes couldn't interpret hyphens or distinguish among the three codes. Dr. Zubeldia noted that under HIPAA the Secretary was adopting an 11-digit number as the NDC standard. Mr. Levin reiterated that five-four-two codes without hyphens weren't unique. FDA knew about the problem and was working on improvements. Mr. Levin noted much of the drug information CDC and NLM needed was on the product label. FDA changed the organization of the labeling to make it more people-, computer-, and information-system friendly. Key information was highlighted in concise bullet form at the top of the label. The proposed rule required companies to provide labeling in electronic format. Each section of the label (e.g., pediatric use, use in pregnancy, drug interactions) and key elements within the text (e.g., active ingredients) were computer readable.
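The non-uniqueness point is easier to see with a toy example. The sketch below uses hypothetical labeler/product/package digits (not real NDCs) to show how two codes with different segment splits can collapse into the same digit string once the hyphens that mark the segment boundaries are removed.

```python
# Simplified illustration of the segmentation problem described above.
# The labeler/product/package values are hypothetical, not real NDCs.
def strip_hyphens(ndc: str) -> str:
    return ndc.replace("-", "")

# 4-4-2 split: labeler 1234, product 5678, package 90
ndc_a = "1234-5678-90"
# 5-3-2 split: labeler 12345, product 678, package 90
ndc_b = "12345-678-90"

# Once the hyphens are dropped, the two different codes are indistinguishable.
assert strip_hyphens(ndc_a) == strip_hyphens(ndc_b) == "1234567890"
print("Without hyphens, both codes read:", strip_hyphens(ndc_a))
```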

Mr. Levin said FDA partnered with NLM to disseminate information so people could put it into systems that fed the information to the public. FDA will provide NLM with new product ingredient codes, strengths and forms. NLM will note and update the RxNorm and provide information about imprint codes. NDC codes will be computer-readable in the label. FDA will work with the NLM to build an international ingredient coding system. The computer-readable label form would be added to information already distributed about drug products so people could import it into their information system. Updated information on all marketed products would come directly from FDA and the manufacturer and be freely distributed.

Day Two

Review of Agenda

Mr. Blair explained that HIPAA's administrative simplification provisions directed NCVHS to study and make recommendations to the Secretary on issues related to adoption of uniform data standards for PMRI and the electronic exchange of that information. That report (which was on the Web site) set forth criteria for evaluation and selection of PMRI standards, including recommendations for selecting them. In August 2002, the Subcommittee formally began evaluating recommendations on PMRI message format standards. Today's discussion centered on a matrix summarizing the August 28 testimony of 14 testifiers who'd helped define the scope and criteria for selection of PMRI terminology standards and develop a framework for this decision-making process. The Subcommittee had to resolve four open issues about terminologies that would be in or out of scope in criteria for selection for different groups. Dr. Steindel noted that NLM and the College of American Pathologists (CAP) had reopened negotiations and either would reach agreement by January 15 or the federal government would initiate other steps.

Presentation on PMRI Terminologies, Walter Sujansky, independent advisor

Dr. Sujansky noted the Subcommittee had solicited and summarized testimony about the scope of terminologies the Subcommittee should address and criteria for selecting terminologies within that scope. The first issue addressed was the approach and method the Subcommittee would use to organize and consider terminologies. Input from testifiers in August broke out suggestions for how to define the terminology effort's scope into four or five categories. One category organized terminologies by clinical function and/or the messages addressed by the standardization effort, including suggestions about domain-oriented groupings. There were also recommendations about functional organizations (i.e., types of tasks supported by different terminologies). Mr. Blair had noted that this was a pragmatic, efficient approach that developed terminologies that directly addressed specific functions and uses. It led to terminologies that fit into a reference information model (RIM) designed to support messages and provided messages that were easily stored and retrieved. It seemed less expensive than strategies that included a reference terminology. This approach's weaknesses were that it wasn't well suited for decision-support applications that used and compared data from multiple domains, and it didn't appear to address the need for clinically specific outcomes data to improve clinical processes.

Another approach organized terminologies with respect to a convergent or reference terminology that was the core or first layer; it entailed some variation of specifying a single, relatively comprehensive core terminology that would be used as the initial primary terminology for accessing and analyzing clinically specific data. Dr. Sujansky explained that other layers of terminology might include more limited and perhaps functionally specific terminologies. One layer would be for administrative terminologies. In the center was a comprehensive core reference terminology. Dr. Sujansky noted that this approach provided a unifying mechanism for all terminologies, facilitated clinical decision support to improve quality of care and patient safety, improved clinical outcomes data to facilitate improvements in clinical processes, did this more effectively and at lower cost relative to functionally specific terminologies, and had the potential to improve the accuracy and lower the costs of clinical research. Its weakness was the relatively high cost of developing and maintaining a convergent terminology. Dr. Sujansky noted inconclusive information about the marketplace's limited acceptance of the vast number of existing candidate convergent terminologies. He noted that this approach didn't offer complete coverage for PMRI terminologies, due in part to development costs of additional domains. Difficulties were often encountered when mapping administrative terminologies to such a core reference terminology.

A third approach developed a web of terminologies blended between information and terminology models. Dr. Sujansky viewed this as a natural, necessary interaction between terminologies and, for example, message and medical record structures. Achieving objectives of PMRI standardization required recognizing that there was overlap in the information content of terminologies and message structures.

Another approach retained the current representation for the scope of PMRI terminologies with modifications. While the fastest, simplest way to characterize the scope of PMRI terminologies, this approach didn't provide a strategic approach to unify, prioritize or address needs for new PMRI terminologies.

The last approach endorsed PMRI terminologies used by the government or private sector. It avoided objections from existing developers of PMRI terminologies, but it offered no strategies or paths to unify, prioritize, or address needs for new PMRI terminologies, and it didn't provide clinical specificity or address the need for clinically specific terminologies. While this group was more pragmatic than the first, both shared characteristics.

Dr. Sujansky observed that these approaches weren't necessarily mutually exclusive, but represented conceptual buckets the testimony could be put into. The buckets, like their contents, could be combined.

Discussion and Prioritization

Dr. McDonald said one couldn't presume that single or multi-axial solutions were the exclusive preserve of either model; the real and complex question was between compositional and non-compositional approaches. While the pragmatic approach was good for immediate measurable results, Mr. Blair questioned whether it created an information infrastructure that would move them to that new paradigm. He considered the second category more capable of moving toward a broader information infrastructure. He thought Dr. Chute saw value in both the information and terminology models and that the Committee should strive for the best of both. Dr. Sujansky advised them to consider and articulate the purpose of standardizing clinically specific terminologies. He said it might be useful for them to consider what was missing that required a standardization effort. Dr. Sujansky also encouraged considering the role of terminology as well as messaging standards in the cross- or super-facility aggregation and sharing of information. He recommended considering the context of this work. Personally, he believed the prime reason for PMRI standards was to standardize computerized access to that data across patients, time, providers, and facilities.

Members considered use cases and the kind of functionality they sought. Dr. Sujansky asked members to e-mail suggestions and objectives that a standard terminology would enable so he could incorporate them into the discussion. He emphasized considering the meaning of the Subcommittee's recommendations and how they could make them relevant; the role of government versus industry; and how to leverage the government's role in enabling, fostering and encouraging functionality.

Mr. Brown had pointed out that interlocking models could meet many purposes and there probably would be more than a single core reference terminology. Mr. Blair suggested looking at terminology development works referenced in that first category as being represented in the second layer of Dr. Campbell's model. Dr. Cohn suggested that almost any model could be preliminarily acceptable--it would evolve as they moved forward. Acknowledging they didn't know whether this core described one or multiple terminologies, he noted it did need to be relatively tightly integrated and mapped and, hopefully, would have little information loss. Mr. Blair proposed that the Health Level Seven (HL7) RIM might provide harmonization and coordination among different terminologies in the second layer. Dr. Cohn said he struggled with how the information model fit into the terminology solar system; he thought of it as almost another dimension. Ms. Greenberg questioned the relationship between the second and third layers. Dr. Sujansky encouraged the Subcommittee to think about fundamental differences between the layers. The first or reference layer had no unrepresented overlap of content. He suggested LOINC and RxNorm might create a relatively comprehensive non-overlapping terminology at the second layer.

Dr. Cohn said unique patient-centric concepts deserved to be part of the core, though he found many of those code sets more nurse- than patient-centric. Ms. Bickford pointed out that other clinicians used nursing's patient-centric interventions and outcomes. Dr. Sujansky agreed that there were important patient-centric terms in the nursing terminologies; some already appeared in SNOMED and other medical terminologies. He didn't believe that the two-level model excluded patient-centric terms and concepts in a core terminology. He pictured a core level and another that included Dr. Spackman's "limited scope terminologies." A middle-layer core terminology might include several terminologies with non-overlapping content. A third level was for administrative terminologies. Information system applications could use different terminologies, but mappings existed to the core terminology. Mr. Brown had suggested one could see both an information and terminology model in the core; Dr. Cohn viewed Dr. Campbell's representation as a dimension that could represent an information model.

Members agreed that determining (at least conceptually) how to get the information and terminology models working together would help with strategy and direction. Dr. Sujansky's understanding of the RIM was that any one of many terminologies could be used in a given field, so long as it was identified, providing some standardization but not enough to enable foreign applications to know they could interoperate using one of these terminologies. He noted a way to gain a complementary relationship with the RIM and HL7 terminology efforts was to use the core-compliant terminology for that field as well as HL7. Dr. Steindel concurred with Dr. Sujansky's description of the RIM and its use of terminology. He noted the informatics community was realizing the complexity of clinical concepts and the blending and continuum between terminology and information models. He said the introduction of nursing terminologies into SNOMED was another blurring of terminology and information models. And he recommended seeking guidance on how to make recommendations for selection of terminologies in the context of the patient medical record and reconciling an information and terminology model.
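The point about the RIM leaving the choice of terminology open can be illustrated with a small sketch. The example below is hypothetical: the codes and the "LOCAL-LAB" code system are invented, and the structure is only loosely modeled on an HL7-style coded element. It shows that a coded field that identifies its source terminology is structurally valid regardless of which vocabulary a sender chose, so two senders' data can only be compared if both vocabularies map into a common core terminology.

```python
# Minimal sketch of the coded-field point discussed above. The codes below are
# hypothetical placeholders, not drawn from the hearing or from any real value set.
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedElement:
    code: str
    display: str
    code_system: str  # identifies which terminology the code belongs to

# Two senders encode the same clinical idea with different, locally chosen vocabularies.
sender_a = CodedElement("12345-6", "Body temperature", "LOINC")      # hypothetical code
sender_b = CodedElement("TEMP01", "Temperature, body", "LOCAL-LAB")  # hypothetical local code

def comparable(a: CodedElement, b: CodedElement, core_map: dict) -> bool:
    """A receiver can compare the two only if both map into one core terminology."""
    return core_map.get((a.code_system, a.code)) == core_map.get((b.code_system, b.code))

core_map = {("LOINC", "12345-6"): "CORE:temp", ("LOCAL-LAB", "TEMP01"): "CORE:temp"}
print(comparable(sender_a, sender_b, core_map))  # True only because a core mapping exists
```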

Next Steps

Noting the big obstacle to interoperability in HL7 messages was terminology, Dr. Cohn proposed talking in December with HL7 about requirements to gain interoperability in HL7 messages. Mr. Blair said they needed to make sure their framework was broad enough to accommodate emerging applications. Dr. Steindel noted patient medical record systems and how they were used were an important component. Mr. Blair observed that they had to make sure there was an information infrastructure with terminology standards in place to support new applications that HL7 couldn't address, at the same time they provided a framework where HL7 could continue to advance and flourish.

Dr. Cohn noted that the information model was ripe for discussion in December. He suggested the issue was about pragmatic and near-term versus long-term needs, and he reflected that the big problem with decision support was that it hadn't been diffused into the industry. Mr. Blair asked Dr. Sujansky to provide guidance at the December meeting on options and approaches for coordinating and harmonizing information and terminology models. He also asked him to develop a synthesis of where the Subcommittee was now, based on today's discussion, and envision an additional step with Dr. Campbell's construct to aid them in considering how to move forward.

Members discussed technical issues about developing a work plan for Dr. Sujansky's activity. Members and staff will talk after the meeting about establishing parameters and resolve issues regarding the scope of Dr. Sujansky's work by e-mail within several days. WEDI will submit their list of best practices and publishable pieces; others were asked to submit what they considered useful. Members specified interest in creating a link to the Utah Health Information Network's HIPAA Savings Calculator.

The Subcommittee identified two key issues: vendors and education; moving forward on both will be discussed at the November meeting. Time permitting, members will also discuss the analysis of the data related to compliance delays. A brief discussion based on today's conversation about next steps will be held at the breakout meeting. Sessions on December 10-11 will focus on reports from the DSMOs, issues related to smoothing administrative simplification and making the process work better, and testimony and a discussion about PMRI next steps. Hearings are set for January 29-30, March 25-26, and May 21-22. Discussion about another hearing was deferred pending specifics of the work plan.


DETAILED HEARING SUMMARY

Day One

Overview

Mr. Blair noted the testifiers would provide feedback on the scope of the Subcommittee's PMRI proposal and whether the criteria for selection of terminologies were appropriate, pragmatic, realistic, and sufficient. Both were derived from the Committee's July 2000 report to the Secretary that set forth priorities for going forward with PMRI standard recommendations and a set of recommendations including the selection of standards. The report also addressed accelerating development of PMRI standards and guiding principles for selection, as well as other guidance the Subcommittee applied to PMRI terminology. In February 2002, the Committee applied these principles to the selection of PMRI message format standards and made recommendations.

Ms. Trudel reported that on October 15 the Secretary announced that responsibility for enforcement of the administrative simplification provisions, excluding privacy and including HIPAA portability, would be assigned to CMS. A new office was being created for this and for other HIPAA outreach activities for which CMS already had responsibility, including development of regulations, working with industry groups to maintain standards, and industry outreach. Ms. Trudel said CMS would continue to work closely with the Office for Civil Rights to coordinate outreach and guidance that crossed the privacy and non-privacy boundaries. The Secretary's announcement clarified that the enforcement process would be complaint driven. Only a complaint would begin the enforcement process, and anyone filed against would have an opportunity to submit additional information and a corrective action plan and to demonstrate an attempt to comply in good faith. The process would be progressive; penalties would be imposed only after all other avenues were exhausted. CMS was working hard to quickly flesh out this process.

Panel 1: HIPAA Readiness - Observations

Mr. Tennant credited CMS for their work with ASCA legislation and the model compliance form. An enormous number of providers applied for the extension and the model compliance form assisted providers in developing their plans and raised awareness within the provider community about the issue of compliance.

While noting a movement towards compliance, Mr. Tennant reported that providers relied heavily on vendors to bring them into compliance. Group practices reported that many major national vendors (including large, national PMS vendors) said they wouldn't provide a software solution now and that practices would have to send HIPAA-compliant claims to payers through a clearinghouse. Clients weren't given a timeframe for when vendors would bring their software systems into compliance.

Recently 14 medical specialty societies, including MGMA, formed a loose coalition to address these issues. One thing the group searched for was a way to determine the vendor community's HIPAA readiness. With Dr. Zubeldia's help, a directory of PMS vendors on a non-commercial Web site (www.hipaa.org/pmsdirectory) already had information on the HIPAA readiness of a number of vendors' systems.

Mr. Tennant said the concept of partnership was critical to HIPAA readiness. Anyone who wasn't ready produced problems for all partners. When a vendor wasn't ready, the providers weren't ready and couldn't test with their health plan, which then had trouble paying claims. WEDI focused on identifying roadblocks and coming up with solutions such as vendor forums, where vendors could seek help in reaching compliance.

Mr. Tennant had spoken to a number of top-level payers who were relatively confident that the majority of major health plans would be ready with the 837, 835, and some 270/271 standards by next October. The rest of the standard transactions were lower priorities for the health plans. The sense at the top level was that 80 percent of health plans would be ready, encompassing some 90 percent of transactions. Mr. Tennant said one reason some providers moved ahead more quickly was that they had more clout with vendors. They'd been forced to customize many of their systems in order to meet their business needs, but they'd begin testing early in 2003. The groups most in peril were community and rural health centers that served the poorest segment of society. They had little money for compliance, few vendors served their needs (some were national vendors who weren't yet offering a full compliance solution), and they faced the daunting task of finding some way to meet the standards' requirements and still remain financially viable.

Mr. Tennant said industry uncertainty topped the list of roadblocks. He said he'd be confused, if he were a vendor, about which standards to move forward with, and he suggested this was one reason vendors weren't yet prepared to offer a HIPAA-compliant solution. MGMA members said the cost of these systems was extremely high, up to $80,000 even for smaller practices. Vendors told them their current PMS systems weren't compliant and they'd have to buy a system that wouldn't run on their current hardware, and so also incur hardware costs. They had to absorb this in an environment where, according to MGMA data released last week, health care costs for group practices had increased five percent. At the same time, Medicare reimbursement plummeted. Mr. Tennant said all these converging figures signaled to a provider that it was expensive to move forward, hampering compliance.

Mr. Tennant said missing standard elements in the CMS 1500 paper claim form (previously the HCFA 1500) were an issue for many providers. Those with claim systems based on that form had to do a gap analysis, lacked expertise, and couldn't find vendors to provide it. Providers were also concerned about fines for non-compliance. WEDI believed fines should be deferred until the industry had an opportunity to fully implement and test all HIPAA transactions. Mr. Tennant also noted concern that, until recently, there had been a lack of direction from the federal government. He commended CMS for doing all it had with its limited budget, but stressed the need to reach hundreds of thousands of providers. Noting providers had demonstrated their interest with their applications for extension, Mr. Tennant emphasized the need to follow up with constructive advice on how to move forward.

Panel 1: HIPAA Readiness - Observations

Mr. Gilligan explained that Association for Electronic Health Care Transactions (AFEHCT) was a vendor-oriented industry action group with a special focus on federal public policy as it related to the application of EDI, e-commerce and the Internet to solving problems associated with the financing, delivery and administration of health care in both the public and private sectors. AFEHCT members include vendors of software and services including clearinghouses and other companies actively involved in remediation and in the electronic processing, maintenance and transmission of health care clinical, financial and administrative transactions.

Mr. Gilligan gave CMS kudos for their implementation efforts, noting CMS responded quickly to ASCA's requirements and made the application for extension as easy as possible. ASCA Questions and Answers, Ask HIPAA, and HIPAA Hot Line were helpful. He said CMS also deserved credit for outreach, noting AFEHCT would ask for even more. He mentioned a problem with different interpretations in the HIPAA implementation guides.

Mr. Gilligan noted CMS and others involved in outreach to the vendor community had an endless task. The size and make-up of the vendor community wasn't knowable. A search for medical PMS returned 836,000 results on AOL's search engine, a million on Google, and 500,000 on MSN. In 1999 and 2000, Vincent Hudson's directory of PMS listed over 1,500. He reported that many small PMS vendors couldn't afford to upgrade to HIPAA and dropped out of the market, which he currently estimated at 800-900.

Mr. Gilligan said software vendors in AFEHCT's membership tended to be the largest ones in the community, and member clearinghouses processed an overwhelming percentage of transactions. These vendors knew what they had to do, were doing it, and were confident about providing solutions that would enable clients to become compliant with the transaction and code set standards. Mr. Gilligan noted, however, that answering provider clients' questions about when their software would be HIPAA compliant left vendors open to liabilities, and the vendor community was being judicious in the language of its responses. One company said the language it used even affected when it could recognize revenue from software already sold. Mr. Gilligan said CMS, vendors, and medical specialty societies would meet October 30 to try to resolve these issues. Another problem was provider education. The vendor community wanted more done to educate providers about the role of providers and vendors in implementing at least the transaction and code sets final rule. Vendors told Mr. Gilligan they wanted CMS to absorb some of the heat and help tell customers and clients that they had to change the way they did business.

Mr. Gilligan emphasized that, despite what he and others said today, there was reason to be confident that next October compliance would be substantially achieved. AFEHCT pledged cooperation.

Panel 1: HIPAA Readiness - Observations

McKesson Information Solutions' software applications include enterprise-wide patient care, clinical, financial, and strategic management software, as well as Internet-based and networking technologies, electronic commerce, outsourcing, and other services to health care organizations throughout the world. Mr. McLaughlin said McKesson Information Solutions had worked diligently toward the goals of HIPAA, even before the inception of the law. Participation in industry groups, such as WEDI and AFEHCT, helped drive the industry toward administrative simplification by utilizing industry-accepted standards and protecting information through privacy and security standards. The McKesson Regulatory Affairs Department monitored and commented on the regulations as they became available.

McKesson Information Solutions formed a HIPAA Project Office for all HIPAA-related development coordination, employee education, and customer communication. The Project Office was composed of four full-time staff members with representation from every product and service area in McKesson Information Solutions. Product representation included representatives for transaction and privacy-and-security related issues from each product line. The Project Office also contained representation from Regulatory Affairs, Legal, Human Resources, and Marketing. Staff held regularly scheduled meetings with senior management acting as a Steering Committee. Regularly scheduled meetings were also held with a select group of customers who provided input on product direction as it related to the HIPAA rules. The Project Office also participated on and regularly reported to the Enterprise HIPAA Compliance Office, an internal department actively involved in providing direction on HIPAA requirements to all of McKesson's business and service functions.

McKesson had published a readiness disclosure on the For Customers section of its Web site giving information about each product's progress toward general availability, including gap analysis. The McKesson clearinghouse was able to accept and send claims, remittances, and eligibility today in the HIPAA standardized electronic formats and was evaluating other transactions (e.g., claim status, referral certification, and authorization). McKesson conducted testing and certification through a third party and used a third-party testing tool for internal product tests. The McKesson clearinghouse and several McKesson products had been certified through a third party.

McKesson voluntarily entered information on the public Web site Mr. Tennant mentioned that listed the vendor readiness status of practice management software. McKesson development was moving forward with integration of addenda items from the DSMO fast-track process. Mr. McLaughlin said McKesson expected to issue software updates as those addenda became final.

Mr. McLaughlin said that, although the extension was viewed as a necessity facilitating seamless integration of these standards among all trading partners, it complicated the implementation process. Coupled with uncertainties about the addenda and NDC codes, the extensions made matters complex. It was necessary to communicate with all trading partners in order to determine their needs; McKesson had thousands of partners, and managing the direction of each was nearly an insurmountable task. As a hybrid entity, McKesson had covered entity as well as non-covered entity components. Mr. McLaughlin said the clearinghouse was prepared today to send and receive standard-compliant transactions, but saw no downside in filing for an extension.

McKesson had been actively involved in implementation for some time and saw the following roadblocks to compliance arise in the industry: NDC, taxonomy, addenda, and extension code issues; lack of final or proposed rules for the remaining identifiers, security, and transactions; implementation guide interpretation issues; and trading partners with unknown status. Mr. McLaughlin emphasized that the complexity of dealing with thousands of trading partners, each with its own philosophy on implementation (e.g., whether or not to file for an extension, use NDC codes, utilize taxonomy codes, or incorporate the many changes from the addenda version of the guides), was overwhelming. Knowing the current addenda might change, providers, clearinghouses, and payers couldn't implement their final system versions without risking having to revise those changes across the customer base, costing McKesson and its customers additional time and expense. Mr. McLaughlin urged HHS to finalize the addenda and NDC proposed rules as soon as possible, narrowing the complexity as NDC, taxonomy, and addenda issues were removed from the equation.

Mr. McLaughlin said implementing the privacy final rule without the matching security rule would be a near impossibility, given that the privacy rule required appropriate security safeguards. Implementing standard systems without the standard identifiers would increase the number of times providers had to implement and update their health care software systems. To resolve this issue, HHS would need to propose and/or finalize these rules as soon as possible. By issuing the proposed rules, HHS would aid the industry in beginning the comment and finalization process; by issuing final rules, HHS would assist the industry in gaining closure on current unknowns. Mr. McLaughlin noted that HHS and CMS were actively working on that.

Interpretation of the implementation guides was an area of contention. Trading partners phased in their implementations, and the transactions required today weren't necessarily the HIPAA-compliant versions. Many partners required elements that, under the HIPAA standards, were clearly marked unusable, adding values to internal implementation guide tables that their existing systems still needed. Each partner placed its own requirements on the interchange and looping structures of the interim transactions as currently allowed; however, no certain resolution was reached about when the fully compliant version would be available. WEDI SNIP and X12 had initiated a workgroup aimed at eliminating confusion and inconsistent answers related to the implementation guides. Mr. McLaughlin said HHS/CMS participation would allow authoritative answers to be readily distributed to the industry using a mechanism such as the SNIP Issues Database.

While many trading partners were phasing in HIPAA implementation, many others still hadn't declared testing and compliance dates. Mr. McLaughlin suggested that a directory similar to the one listing PMS vendors could show the readiness of trading partners. Utilizing industry and HHS listserves also would help. Mr. McLaughlin expected there would be issues to contend with as next October drew near, but he said, over time, these issues would be resolved.

Discussion

Mr. Tennant clarified that he was referring to the health plans when he said 80 percent of those he surveyed indicated that they'd be compliant by next October. His sense from speaking with the upper echelon of the payer community (e.g., United, Cigna, Aetna, the Blues) was that they were confident that the top national payers would be ready to begin testing to meet the April 15 deadline. He noted they'd prioritized the 837, 835, and 270/271 standards, and this varied state-to-state and plan-to-plan (e.g., some made a business decision that their customers were more interested in claims status). Mr. Tennant said no one gave him a sense that the plans would be fully ready for testing all the transactions by the April date.

Dr. Cohn said he'd heard from everyone about all the pieces (e.g., the final regulations, vendor readiness, education) that still had to move forward. He suggested that they refer to earlier letters and draft another to the Secretary. Dr. Cohn expressed concern that this was really about administrative simplification, but people appeared to be focusing on it as just a compliance maneuver. Noting that reducing the cost of administrative overhead (which took 25 to 30 percent of a normal physician practice's expenses) was fundamental, he asked what was being done to help everybody realize those benefits.

Mr. Tennant remarked that one of the mixed messages providers had received over the last few years concerned this issue of administrative simplification. Major national groups, including some provider groups, had argued against it and sponsored legislation to kill administrative simplification. WEDI focused on the cost savings and, from the beginning, MGMA pushed hard to move these standards forward. But it was an uphill battle, fighting disinformation campaigns launched by some industry groups and a preemptive focus on privacy. Mr. Tennant said one positive outcome of ASCA was refocusing back on administrative simplification. He agreed with Mr. Gilligan that the government had to convey to the provider community that this wasn't just another compliance issue, but that there were substantial benefits. The practical implications of these standards weren't getting out to the industry. He said there had to be a return on investment (ROI) argument and a reason for providers to demand a HIPAA compliance solution (checking eligibility online as opposed to spending time on the phone) from their vendor.

Mr. McLaughlin pointed out that the number of transactions and scope of implementation were similar to the NSF implementation from CMS for the Medicare contractors, which initially had been an effort to bring all Medicare contractors into compliance with a national standard format. Mr. McLaughlin said initially there'd be fluctuation, as they'd seen with NSF, which became a national standard format and simplified the implementations. Speaking from a vendor and clearinghouse perspective, he noted that as the industry got more in tune with what NSF was and what the data content should be, it became easier to implement a new NSF and make changes. He said they were in the beginning stages of something similar but on a grander scale.

Asked about the disinformation campaign Mr. Tennant had said MGMA was fighting, Mr. McLaughlin explained that an effort led by a national provider trade association had introduced legislation in the Senate a year and a half ago that would have essentially pushed implementation back six years or more. Many saw that as an effort to kill administrative simplification and send a mixed message to the industry, which greatly concerned MGMA and WEDI, who for years had pushed for these efficiencies. Noting studies sponsored by some industry groups projecting billions of dollars in costs, Mr. Tennant said a clear message had to come from the federal government that, while there were costs and transition issues, HIPAA's overall goal was cost savings.

Dr. Fitzmaurice noted that software programs like TurboTax and fill-in forms downloaded from the IRS, where one inserted numbers into the variables and the program cranked through all the rules, could make things easier. He asked if there was enough certainty for the creation of a vendor product (e.g., a TurboTax for claims) for HIPAA. He said he thought there was, but it would take time for both those who set the rules and those who complied with them to pin them down. Dr. Fitzmaurice asked if anyone felt an urgency about whether CMS's new office was adequately staffed and funded to provide the impetus for the government to develop these rules and models. He noted that legal models had to be developed that precisely defined the variables. And he pointed out that trying to keep up with all the complexities would reduce the number of smaller vendors or associations. He suggested that associations could interpret these complexities in a legal format for the vendors. He emphasized it was a question of development and resources at the government and private levels.

Mr. Tennant said a problem right now was that people liked the tax system more than they liked HIPAA. WEDI SNIP was formed to develop a template, best-practice approach that allowed the industry to move forward in a uniform manner, but it was hampered by a lack of resources. He said this effort had to be driven by the federal government, much as it developed the tax code. He noted they'd started to move that way with the announcement of a point person at CMS and a vision of how they wanted to roll this out. But he cautioned it might be too late. The original compliance date had already passed and they hadn't gotten far down the road. There was a need for TurboTax-type programs. Everyone and every approach would be different within every organization, but there were commonalities they needed to point to, and that was WEDI SNIP's role. Mr. Tennant asked for more interaction between the government, WEDI SNIP, and other groups so a consistent message went out to the industry.

Mr. McLaughlin suggested that the new task group he'd mentioned was a step in that direction. Many different types of organizations (including payers, vendors, and physicians) were represented within X12, and WEDI collaborated to begin funneling through issues related to the implementation guides, plugging in a numbers scenario or at least agreeing on what should go into those elements.

Mr. Gilligan noted there were fewer than 245 working days until the compliance date and that this was the largest system conversion ever made. He expressed confidence that compliance would happen shortly thereafter. If an answer didn't come back when a transaction was sent, Mr. Gilligan asserted they'd work it out quickly. Being paid was an essential component of this process; the political situation would handle it. Mr. Tennant cautioned that if cash flow was disrupted, there could be disruption in patient care. Mr. Gilligan said he had faith that the hospitals, doctors, and people who administered health insurance plans wouldn't let that happen. When Medicaid management information systems were implemented in the late 1970s, contractor payments sometimes stopped for as long as six months, but nobody was denied care.

Ms. Trudel reported CMS received about 1,700 calls a day on its hot line by the end of this process. She said it was hard to make a case for ROI for administrative simplification when there was no economically feasible product to help basic practitioners reap those returns. There was an enormous gap in low-end, low-cost software (e.g., a TurboTax equivalent) for the basic physician. The only software she knew of was what the Medicare program gave providers who submitted electronic claims. The HIPAA-compliant version wouldn't help people who had to bill others. There was enormous need.

Ms. Trudel recounted a long and, on the other end, heated telephone conversation with an optometrist who'd asked why he should do this. She'd explained that he could check the status of claims automatically, check eligibility, and get remittance advice back directly into a system that balanced and took care of patient accounts. The optometrist said his software didn't do that, his vendor would charge $10,000, and hung up. Dr. Cohn questioned whether they should ask CMS or others in the federal government to produce compliant public domain software. Ms. Trudel said that would do a disservice to the whole health care industry. She said much of this came about because some larger players, who probably could make this happen most easily, hadn't seen an economic business case for getting into this market. She said that would change.

Mr. McLaughlin said he was continually surprised to hear providers say they didn't have systems that McKesson had had in place for ten years. He said typically it was because people hadn't purchased a system to bring their remittance back in and automatically balance it, which would save resources. Noting the need to spread these types of solutions throughout the industry, he agreed this was about simplification.

Mr. Tennant said a framework might be provided for how software should look and the functionality it should have, and vendors could offer their solutions. He suggested that Mr. McLaughlin was saying their software solution handled that, though Mr. Tennant contended many didn't. But he said one thing that could come from WEDI and other groups, including CMS, was guidance on what one should expect from vendors. By demanding these other capabilities from vendors, everyone could begin to reap the benefits of administrative simplification, not simply the claim.

Noting the various interpretations of the guides, Dr. Zubeldia pointed out that whereas in Dr. Fitzmaurice's TurboTax model the IRS rules were created and interpreted by the IRS, under HIPAA the Secretary adopted transactions that weren't developed by the Secretary or CMS but by X12. Interpreting transactions developed by X12 involved a constant tension around avoiding favoring one constituent over another. He questioned whether X12-WEDI's new consensus-seeking group had the force that came with official interpretation of the guides. Without that, no matter what vendors did, they'd end up with different results. McKesson's solution might not work for ophthalmologists, because their interpretations might be different. Mr. McLaughlin agreed this was an issue. The task group had asked for HHS' and CMS' involvement so authoritative answers could come out of the resolutions produced. Placing those resolutions and authoritative answers in the SNIP Issues Database would make them available to the industry.

Mr. Tennant said Dr. Zubeldia's point went beyond the implementation guide issue to permeate all of HIPAA. Industry groups worked to develop best practices, but no one could say the government, or worse, the courts would accept them as compliant. He suggested that Dr. Zubeldia's point was to have a channel so best practices could be reviewed by the government and given a stamp of approval signaling the industry to implement them. Dr. Zubeldia said it would be similar to the process by which the standards were interpreted and adopted. Mr. Tennant suggested the Committee might act as a way station, reviewing and commenting on best practices to the Secretary.

Dr. Fitzmaurice commented that they were in a transition period. Unlike taxes, where Congress said implement this tax and the IRS told taxpayers how to do it, Congress told NCVHS to work with the industry to develop formats and requirements for different kinds of claims. In five to ten years, Dr. Fitzmaurice envisioned a similar form and instructions for transactions. The difference was that the industry worked on those and, unlike tax rates, it could change these forms and instructions. He emphasized the importance of working with the industry and adopting forms and instructions containing certainty and precision.

Mr. Blair acknowledged that he'd been nervous about hearing predictions of a catastrophe. Instead, he'd heard that, for the most part, attempts to change the legislation and gain additional delays were behind them. Things were moving forward, some late, and maybe not all the transactions would be implemented by October 13. But by next October, Mr. Blair said, he'd heard most health plans, providers and certainly claims clearinghouses would be at least compliant on the health claim and eligibility transactions and well on their way toward compliance on the remaining HIPAA transactions. Mr. Gilligan said he heard there were problems to deal with in a time-sensitive situation, but he expressed confidence in their ability to get there.

Speaking as a commissioner for the Electronic Health Care Network Accreditation Commission that accredits clearinghouses, Mr. McLaughlin predicted that, while next October payers might be ready to accept compliant transactions and clearinghouses able to send compliant transactions, due to the scope of pre-production testing, some who'd exhibited due diligence would still have to convert transactions. A clearinghouse might have 100 more conversions and progress toward them in the subsequent six months. Unless people thumbed their nose, CMS had said there'd be flexibility; people wouldn't be fined for non-compliance.

Noting Mr. McLaughlin confirmed his perception, Mr. Blair asked if they'd see, after next October, considerable pressure from providers who'd already made the investment for their vendors and health plans to become compliant as quickly as possible. Mr. Tennant said he wasn't as confident as other panelists. Patient care and the health care system wouldn't grind to a halt, but he recommended that group practices develop contingency plans, similar to Y2K, to avoid interruptions in patient care in case of a disruption in cash flow. Rural health clinics (RHC) couldn't go 90-180 days without being paid; they needed lines of credit and an arrangement to, temporarily, funnel claims through a clearinghouse. Noting he didn't think they'd see many standard transactions beyond the 837 and 835 next October, Mr. Tennant recommended everyone demand the additional transactions from health plans and vendors. He predicted there'd be more pressure as information got out, and he reiterated they hadn't yet made the case for an ROI document that showed providers what they'd save when they could take staff away from checking eligibility and claims status on the phone. Mr. Gilligan said MGMA had that argument for its members.

Returning to the issue Mr. Tennant raised earlier about vendors being HIPAA ready versus HIPAA compliant, Dr. Cohn observed that they'd discussed for years that vendors weren't sufficient but were a necessary component of implementation. He asked if a correct interpretation of "HIPAA-ready" was that they had a clearinghouse that they could send non-standard transactions to. Mr. Tennant replied that there were more interpretations of the word "compliant" than there were of the 837 guide. Vendors didn't have to be HIPAA-compliant, their clients and customers did. Providing a HIPAA-ready solution was different from providing a HIPAA-compliant solution. Noting vendor Web sites set no date for providing compliant software and uncertainty about whether vendors would charge a per-transaction fee for going through their proprietary clearinghouse, Mr. Tennant expressed hope that this was a temporary solution to the deadline. He emphasized that the vendor directory would get this information in a public forum where everybody could see where vendors were.

Mr. McLaughlin agreed with Mr. Tennant that their HIS products couldn't be compliant per se. He said McKesson took the position that they could enable customers to have the data content and elements necessary to be compliant. And they had the ability to supply data content files out of their system so that they could be taken to other clearinghouses, although they certainly would prefer to use their integrated solution. Mr. Gilligan noted that the October 30 meeting was to discuss and sort out the language providers and vendors could use to talk to each other and come away with a meaningful process.

Ms. Burke-Beebe questioned if the impact on clearinghouses would be an issue. She said it sounded as though health plans were ahead of the curve on being compliant or ready, providers possibly behind the curve, and that in some cases there would be a reliance on clearinghouses. She asked if clearinghouses could handle the flow. Mr. McLaughlin said he believed the clearinghouses were at least as far ahead as the payers. Noting the flow would increase in magnitude, Ms. Burke-Beebe asked if having enough clearinghouses was an issue. Mr. McLaughlin said that went back to what he'd said about the clearinghouses having all their payers in production on October 16, 2003. The chances were there wouldn't be enough availability, but they'd move toward that within a short period.

Mr. Gilligan noted that their use of the word provider had to be refined. The Mayo Clinic was probably going to be ready. Small doctors' offices were another matter. Those doctors had the most trouble realizing they had to do this and that they had a deadline. Mr. Tennant said the readiness of the clearinghouses was a huge issue and that the demand for clearinghouse services would increase a hundredfold next October. It might last only six months or a year, but the demand had to be met. He suggested that the Subcommittee ask the clearinghouses what would happen if the number of claims doubled. WEDI was looking into producing a directory for health plans and clearinghouses similar to what it had done with the PMS vendors.

Dr. Zubeldia remarked on the need to keep the HIPAA transition in perspective. Today the health care system worked with a tolerable degree of imperfection: about five percent of the claims filed to payers electronically weren't perfect and needed development. About 30 percent of the clean claims had to be tended. Dr. Zubeldia observed that no one should expect HIPAA to be 100 percent perfect, because of the nature of the data.

He noted that a medical PMS vendor with its own private clearinghouse told him it took three years to remediate all their sites for Y2K and another three years for an update that involved a significant amount of testing with trading partners. This same vendor considered sending all its customers to a clearinghouse because that was the only way it perceived it could even attempt to be compliant on time. As part of the maintenance contract, one either got updated software or the transactions were sent to a clearinghouse for free. Dr. Zubeldia asked if that was a concern or an acceptable solution. Mr. Tennant said WEDI found it an acceptable short-term solution, but he noted it didn't get at Mr. Blair's point, which was that this was about administrative simplification. That was a workaround and fine short term, but it didn't provide solutions for eligibility and claims status, except through the clearinghouse. It was a proprietary solution, and he didn't see a problem as long as they could use the same standard to determine eligibility with all the payers, the system worked, and they didn't charge fees--which he noted was a point in question.

HIPAA Extension Request Status

Ms. Trudel reported that the on-line application system was brought up on the Web site April 16 and received 134 requests that day. By the end of the extension period, 40,000 requests were received per day. Over 500,000 electronic extension requests were received (including some duplicates). Some 39,000 paper submissions (which had to be post-marked by October 15) had been received to date. Ms. Trudel anticipated receiving 50,000 paper submissions that would be keyed into the electronic database. De-duplicated, edited final statistics were expected to be on the database and ready for statistical reporting by early next month. Most electronic requests were from providers and reported a Medicare number. Ms. Trudel said the benefits of this process were that over half-a-million people who probably didn't know about HIPAA six months ago demonstrated awareness by submitting an extension request. In addition, a number of small providers and health plans that hadn't been certain they were a covered entity had gone through the process, knew, and were thinking about what they had to do to be compliant. The hot line received 1,700 telephone calls daily; the Ask HIPAA Mailbox received 200 emails a day.

Ms. Trudel said they hadn't tabulated because keyers were still entering paper forms and the database hadn't been un-duplicated, so they didn't yet have a sense of common problems or trends. Ms. Trudel would report statistics at the December meeting. She explained that all the requests that came in electronically were added to the database before October 16. They had a 100 percent sample and could differentiate by that date. Noting they knew virtually everybody had applied for an extension, Dr. Cohn asked what other clear pointers they might come out with to guide them in identifying compliance issues and publishing best practices. Mr. Augustine said he went by the credo that the data had a story to tell and one had to wait until it told its story. Dr. Cohn asked if they would have some data to look at for the November meeting. Ms. Trudel said she'd ask for some preliminary runs on reasons for delay. From her experience with people on the phone, she thought the top two reasons were, "Waiting for my vendor" and "I just heard about HIPAA." Dr. Cohn noted they'd heard those reasons in testimony today.

Dr. Zubeldia said he'd like to see if there was a significant difference between payers' and providers' answers to when they thought they'd be compliant and when they'd start testing. It would be encouraging if payers' answers were significantly earlier than providers' compliance dates, but there would be a problem if their answers were the same.

Asked about narratives, Ms. Trudel said the only place they accepted narrative electronically was when someone chose the word "Other" as the reason for delay. Then there was a box to fill in. Entries were handled as text strings and could be sampled or categorized with other responses.

What Needs to be Done to Move the Industry Implementation Forward, Roy Rada, Professor of Health Care Information Systems, University of Maryland

Roy Rada from the University of Maryland pointed out that a definition of best practice accepted in professional circles (a practice that was quantifiably successful over a prolonged period and repeatable with modification in similar organizations) wasn't what folks talked about as best practice. He contended that data from the form indicated common, rather than best, practices, which he argued was what the Subcommittee actually sought. Noting it was implicit in questions (e.g., Dr. Zubeldia's question about payers versus providers) that different entity types had different practices, Dr. Rada suggested sorting through the data by type to see what each entity did. Dr. Cohn said he hoped the data helped the Subcommittee meet its responsibility to identify compliance problems (they'd already heard some in today's testimony) and determine what had to be published or shared about best-practice solutions to those problems. Dr. Rada noted the charge in ASCA to come up with a model compliance plan. Dr. Cohn explained that was done in generating the form itself. Dr. Cohn believed the Subcommittee's responsibility was to survey and analyze the data, identify compliance problems, and publish effective solutions.

Discussion

Dr. Cohn said that day's conversation had begun to determine what community resources they might leverage: Ms. Trudel had commented on what the data might show and they were aware of issues about education, vendor readiness and demonstration of ROI that needed to be brought to the health care community's awareness. Dr. Cohn said he thought this initial conversation about what was already in the industry and might be brought forward was an opportunity to jump start this by potentially identifying some candidates and resources.

Mr. Augustine suggested that scenarios built around a small group practice with three physicians, a six-person staff, and a certain volume of claims could, with real numbers and values, help people empathize and associate with the situation and bring home the advantages administrative simplification would make to the bottom line.

Dr. Zubeldia described a model UHIN put together for a small practice on the administrative savings coming from HIPAA. Essentially an Excel spreadsheet, the tool walked one through all the HIPAA transactions (e.g., inputting how long one stayed on hold to get an eligibility inquiry answered, how long it took to get a referral, how one paid employees) and, based on the number of transactions per month, automatically calculated the cost with HIPAA transactions. UHIN provided the tool free with a request that the results be sent to UHIN and incorporated in a database of results from different practices, specialties and locations in Utah. Dr. Zubeldia said he didn't believe the model had been extended beyond the medical provider to the institutional level.
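A minimal sketch of the kind of arithmetic such a savings calculator performs is shown below, assuming a simple per-transaction comparison of manual and electronic handling time. The transaction types, time figures, and labor rate are hypothetical placeholders, not values from the UHIN model.

```python
# Hypothetical sketch of a HIPAA administrative-savings calculator, loosely
# modeled on the kind of spreadsheet described above. All names and figures
# are illustrative placeholders, not UHIN's actual model or values.

def monthly_savings(volume, manual_minutes, electronic_minutes, cost_per_hour):
    """Estimated monthly savings from automating one transaction type."""
    minutes_saved = (manual_minutes - electronic_minutes) * volume
    return (minutes_saved / 60.0) * cost_per_hour

STAFF_COST_PER_HOUR = 18.00  # hypothetical loaded labor rate

practice = {
    # transaction type: (volume per month, manual minutes, electronic minutes)
    "eligibility inquiry (270/271)":  (400, 12.0, 1.0),
    "claim status inquiry (276/277)": (150, 10.0, 1.0),
    "remittance posting (835)":       (300,  4.0, 0.5),
}

total = 0.0
for name, (volume, manual, electronic) in practice.items():
    saving = monthly_savings(volume, manual, electronic, STAFF_COST_PER_HOUR)
    total += saving
    print(f"{name}: ${saving:,.2f} per month")
print(f"Estimated total savings: ${total:,.2f} per month")
```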

Dr. Rada remarked that best practices and the database were different views people took and he disagreed with Mr. Augustine about not hypothesizing about what they'd see. He suggested it would help to put themselves in others' positions and work through their scenarios. Mr. Augustine agreed. A lot of tests could be run and comparisons and cross tabs done, but he advocated stipulating statistical hypotheses before gathering data; not post hoc.

Mr. Tennant said WEDI recognized the need for all these tools. WEDI's chair, Stephen Lazarus, had come up with a HIPAA calculator similar to UHIN's model. The ROI had always been debated within WEDI, and the foundation intended to replicate the report that provided the foundation for the legislation and produce new numbers showing the industry the potential savings. The problem was resources. An analysis of the health care industry reflecting the ROI would be a massive undertaking. Mr. Tennant said WEDI had focused on a number of business issues with transactions and also privacy and security. An early draft of a reader-friendly small provider implementation white paper was on the WEDI Web site and an updated version was in circulation. WEDI also had technical papers on front-end edits, translations, and sequencing aimed at the converted.

WEDI collaborated with the Council for Affordable Quality Healthcare (CAQH), a consortium of health plans, to provide a template health plans could use to show where they were in their readiness and testing. The template would drive CAQH's and WEDI's educational efforts. WEDI also had outreach programs with CMS. Mr. Tennant said he'd heard there was a need for WEDI to do other work. He noted that Mr. Augustine's concept of providing a scenario (two-physician practice, ten-physician small clinic, hospital, rural health center) was something he could take back to WEDI and say, "Here's what I'm hearing, what can we do about it?"

Dr. Zubeldia asked about the problem Mr. McLaughlin and Mr. Gilligan addressed earlier concerning the legal liability and impact on market valuation of public disclosures about states of readiness. He recalled that Senator Bennett introduced legislation that essentially said anyone who made a Y2K disclosure by a certain date couldn't be sued, and asked if something similar might help HIPAA. Mr. Tennant agreed that would free up a lot of entities to come forward, but he questioned whether there was time to pass legislation. Noting the Y2K effort had been driven from the top levels of government, Mr. Tennant said it would take that type of action, which he didn't see with HIPAA. But he said Dr. Zubeldia's point was valid: there had to be a way to get information to flow. He reiterated that this was all about partnerships. It would be easier if a provider knew where its vendor, clearinghouse, and payer were in this process. Dr. Zubeldia clarified that they weren't suggesting it was ok to miss deadlines, but encouraging everyone to get ready ahead of deadline. If everybody waited, it wouldn't work. Dr. Cohn asked how this differed from the ASCA requirement to begin testing in April. Mr. Gilligan explained that after the ASCA legislation passed there had been debate about what testing meant. Some specifically included external testing, but it ended up being just "testing." People could say they were testing internally. Mr. Augustine noted another problem was there were so many dependencies. It was a cascading effect, and if there wasn't open disclosure some people would be hung out to dry when the compliance date came around.

Noting Mr. McLaughlin was involved with both the vendor and clearinghouse sides, Dr. Zubeldia asked what benefit or implication knowing when payers were ready to start accepting compliant transactions had for the clearinghouse. Mr. McLaughlin noted they'd already discussed that payers might not be connected for production on October 16, 2003. McKesson's clearinghouse had gone through four iterations of calling and asking payers when they could test and go into production without getting many positive responses. Noting a payer directory with that information would be valuable, he emphasized that the critical point was the date they could go into production. Mr. McLaughlin said legislative protection would be especially helpful with predictions on compliance.

Mr. Blair said Dr. Zubeldia raised a valid point, but added that avoiding saying when one would be ready to test or be compliant could leave one with no way to manage accommodation over time, only the October 15 date--a worse business risk. He suggested no one had to guarantee they'd be ready; they could give target dates. Business issues would be the driving forces. Mr. Gilligan countered that Wall Street stock analysts could announce that a date was missed. He pointed out that the stock value of the vendor industry had decreased 50 percent since the previous April. Dr. Cohn agreed that legislation wouldn't help. Mr. Gilligan added that no legislative solution could be passed in time to save anybody by the compliance date.

Mr. McLaughlin noted that many vendor sites told customers who logged in with a customer number and password about readiness and gap assessment analysis. Mr. Augustine remarked that activities of quality improvement committees within his organization weren't discoverable. He didn't know how that would work externally, but suggested considering it. Dr. Cohn said he wasn't sure how much of a problem there was and the Subcommittee hadn't offered much of a solution, though Dr. Zubeldia would share with them what came out of subsequent meetings. Mr. Gilligan reinforced Mr. McLaughlin's observation that vendors maintained Web sites for their clients. Mr. Tennant said the goal was to let market forces work: the more information providers had, the more they were able to make corporate decisions about vendors. He urged them to remember Dr. Zubeldia's query about how they could facilitate that information flow. And he suggested HHS might distribute the HIPAA.org PMS directory through CMS to the providers; the realization that providers saw the listings would force vendors to include their information. He reiterated that the goal, whether with legislation or CMS activity, was to get everyone to share information on readiness. Members discussed that vendors listed whatever information they chose on the Web site and that, hopefully, the marketplace would vet it.

Mr. Gilligan confirmed that clearinghouses that were members of AFECHT processed about 80 percent of clearinghouse transactions and were ready. Dr. Cohn asked about lists of system requirements and questions for vendors that could be used by institutional and non-institutional providers who were new to HIPAA, had various software platforms, grappled with partial solutions, and were confused. James Schuping said WEDI had just finished a series of audiocasts aimed at small providers; cassettes were available through WEDI. He noted that several of the white papers Mr. Tennant referred to walked people through what they needed to think about in getting ready. He said there had been so many requests for case studies that the report Mr. Tennant alluded to WEDI updating had been temporarily parked while WEDI focused its energies and human resources on generating more urgently needed products. Mr. Tennant encouraged everyone to go through the resource lists on the WEDI site.

Mr. Gilligan said anyone searching for what they ought to be thinking about only had to visit several vendor Web sites to spark his or her imagination. Mr. Tennant remarked that the small physician office Mr. Augustine was interested in wasn't likely to be visiting many Web sites; many didn't even have Internet access. He noted that CMS had positive results with a video featuring office managers talking about HIPAA and implementation issues. CMS also participated in WEDI's audio conference aimed at small providers. He said the way to reach providers was through a partnership between industry and CMS focused on outreach.

Noting local practitioners and smaller health plans didn't find their way to national programs, Mr. Schuping said WEDI launched a series of regional initiatives to reach under-served communities. A collaboration with CMS included pilots that used CMS's regional infrastructure to coordinate and deliver programs aimed at the small provider. A one-day program walked providers through what had to be done, questions to ask, how to prepare, and a general orientation to HIPAA. A second level of seminars, put on through WEDI's foundation and regional network of some 30 affiliates, was a day-and-a-half program at the intermediate and advanced levels with vendor participation, software and tools. Both programs were aimed at providers that weren't exposed to national programs or Web sites. Dr. Cohn asked Mr. Schuping and other groups for a list of face-to-face activities and white papers the Committee could catalogue in a listing of educational resources. Ms. Burke-Beebe noted that Mr. Tennant's written testimony listed other initiatives that might address these issues. Mr. Tennant explained that these were initiatives WEDI had considered and was inclined to do.

Noting they seemed to be talking about two things: ROI from educational material and applications available in databases, Dr. Rada recalled that Mr. Tennant had talked about a return-on-investment analysis, similar to what was done with the transactions rule. He suggested they might go over the spreadsheet they'd discussed. And he pointed out that many entities struggled with whether to send application forms in separately or as a single covered entity. He said studying examples of how people submitted applications could not only help providers understand this situation, but might also reveal patterns of what was happening nationally. Dr. Cohn said the Subcommittee would be going through that database and thanked everyone for his or her suggestions. Noting implementation was less than a year away and that, by January, people should be full scale into implementation, he said it might be too late to come up with educational tools to assist them. Dr. Cohn emphasized, instead, utilizing best practice documents to jump start people.

Dr. Zubeldia asked what the Committee could do to support the recommendations for best or common practices in WEDI's white papers: should CMS implement WEDI recommendations, were regulations required to implement some solutions, was CMS's leadership needed to resolve controversies such as those WEDI's direct data entry (DDE) white paper generated? What next steps did testifiers see taking place? Mr. Tennant noted they'd talked during the break about a specific implementation guide issue: minutes versus units in anesthesiology. If the industry couldn't reach consensus, he said the government had to make a decision, because the industry needed to move forward. Noting most major payers in DDE had their own proprietary systems, which flew in the face of standardization, Mr. Tennant said industry had to at least identify where it needed some direction. Dr. Cohn said he'd love to see a list of problems where industry hadn't been able to reach consensus.

Dr. Steindel questioned whether government leadership was what was required when industry didn't agree with a position the government took following the Medicare rules. He believed industry had to realize a solution was required and reach consensus. Noting how long it took to wade through the controversy of home health care billing and whether payment for insulin syringes should be billed on an institutional or professional billing form, Dr. Steindel said there had to be better solutions to these problems.

Ms. Burke-Beebe noted that Mr. Tennant's written testimony addressed the unreadiness of community health centers (CHC) and RHCs that traditionally treat under-served populations. She asked if WEDI's regional efforts addressed this and what the Subcommittee might do in its analysis of the data. Mr. Tennant noted MGMA worked with the Health Resources and Services Administration (HRSA) to provide outreach to grantees and commented on that community's staggering level of unreadiness. He expressed concern about bad outcomes in patient care if these people didn't get funded. Over the last several months, HRSA realized their grantees were under the gun and began an education program to reach 300 of the thousands of grantees. This community didn't have money to go to conferences or buy expensive compliance solutions, and hadn't been within the WEDI fold. Mr. Tennant emphasized the need to focus on the folks most in danger and be proactive, going to them with education and resources. This effort could only be driven by resources, and WEDI didn't have enough to provide free education to five thousand CHCs.

Ms. Greenberg remarked that at the first privacy hearing they'd heard that these organizations were tremendously under-resourced and incapable of even translating notices into the languages of the people they served. She cautioned that the combination of these transactions and privacy (which was only six months away) put them at extreme risk. Dr. Steindel added that there was an undercurrent of discussion within the public health community about a rising HIPAA awareness that some funded programs (e.g., those that finance screening) were covered entities but hadn't begun to become HIPAA compliant.

Dr. Fitzmaurice mentioned the need for CMS's new office to have additional resources to work more closely with the industry and discuss solutions that CMS had found for the Medicare program with those still looking for solutions in the private sector. Dr. Fitzmaurice emphasized education, outreach, and being open about processes. Ms. Greenberg noted that, in addition to education, CHCs and RHCs needed resources to accomplish what they had to do. While she was glad to hear that vendors were aware, she doubted anyone had resolved this.

Dr. Cohn said he hadn't heard any grave issues today, but that they needed to send a congratulatory letter regarding consolidation of these efforts into a single CMS office that also indicated the accompanying need for funding, especially as implementation was only a year away. They also had to point out the need for special attention to rural health care providers and community health clinics and that HHS had to look inward through its public health entities and others and ensure that everyone moved toward HIPAA compliance. He envisioned a short letter reiterating the need to have the final regulations out.

Noting Mr. Tennant mentioned minutes-versus-units for anesthesiologists as a place for resolution, Dr. Fitzmaurice suggested looking for opportunities to speed up existing processes. Dr. Cohn said they'd hold that for December. He noted they wanted lists of "irreconcilables" from WEDI and others and that the DSMOs would talk in December about their processes, results from this year, and recommendations. They could then talk with other entities about how to improve and streamline the processes, resolve issues, and provide leadership.

Preparation for ICD-10 Presentation for Full Committee in November

The Subcommittee considered a proposal for a cost benefit analysis that Subcommittee staff and HHS prepared and whether there should be any modifications or changes. Dr. Cohn began by saying if they decided to go forward, they probably would want to discuss that with the full Committee. Ms. Greenberg said she hoped that didn't mean that he felt they had to wait until the Committee meeting. Dr. Cohn said that could be decided based on what everyone thought of the proposal and suggested changes. If they were in agreement, they probably could go forward.

Ms. Greenberg said that following the last Committee meeting she'd reiterated her earlier suggestion that, in response to all the discussion about the need for a cost benefit analysis, the Committee sponsor an analysis as a neutral party. She'd had preliminary discussions with Dr. Yasnoff and Ms. Pickett. What they were reviewing was a brief, preliminary skeleton of what was needed, and input was welcomed. Recalling that the idea of an impact study developed in the context of RAND, Mr. Scanlon noted the Committee had access to a number of contractors; if the Subcommittee decided on the tasks, outcome and objective, they could find a number of suitable ways to get it done. Dr. Cohn suggested the questions for the Subcommittee were whether this represented all their thoughts about what ought to be done, whether things were missing, and whether any should have greater emphasis, knowing that, based on what Ms. Greenberg said, this would be further fleshed out over subsequent weeks, assuming it made sense to move forward. Dr. Cohn read a draft, Mandating a Switch from ICD-9 to ICD-10 Codes: Costs and Benefits Project Proposal, and members offered several clarifying comments.

Dr. Cohn said he believed the Subcommittee's intent, if this went forward, would be to have it done by late spring. Ms. Greenberg said her hope was that they could have a preliminary analysis by the February Committee meeting. She said what needed to be discussed if they wanted to go ahead with an impact analysis was the implications. She advised that something like this had to be done with an NPRM and that one would like the impact analysis to be as robust as possible, which would be facilitated by having an outside contractor address both costs and benefits. There were requests on the floor from various parties that the Committee not make any decisions until such an analysis was completed. Ms. Greenberg said how swiftly one wanted to get this done (and done well) was related to whether the Subcommittee decided all recommendations would be on hold until this was completed. If that were the case and this wasn't done until May, they'd delay an entire year making a recommendation that the Subcommittee was initially prepared to make in June 2002. She noted another view was that it was necessary to do and would strengthen the impact analysis and the NPRM, but the Committee wouldn't have to wait for the final results before recommending an NPRM.

Dr. Cohn remarked that the proposed timing sought a preliminary briefing of findings by the February meeting so the full Committee could get acquainted with this. Noting this didn't fully meet all the needs the industry identified in its letters, Dr. Cohn said at least it gave everyone an opportunity early on to see the findings. And the Subcommittee had expressed its willingness to receive additional cost benefit studies developed by industry and, by then, would have interviewed and identified ideas about costs and benefits. Mr. Scanlon advised that the Subcommittee also could structure actions so people benefited from early-completed tasks (e.g., the contractor could be asked to provide any existing literature or studies on similar experiences, and a set of steps or requirements could be laid out for the contractor to carry through).

Dr. McDonald expressed nervousness at the proposal's lack of awareness of ICD and PCS, how they differed, and the assumption that experiences with ICD-10 in Canada would sufficiently inform them about ICD-10-PCS. He asked if there was a standard process for obtaining a contractor. Mr. Scanlon said HHS had access to a stable of about a dozen pre-qualified contractors for this kind of analysis and cost estimation. The quickest way (30-60 days) would be to choose one and issue a fairly full-blown task order so they knew what was wanted and the timeframe. The other way would be to announce it as any other contractor procurement on a full competition basis, which would take at least six months.

Ms. Greenberg envisioned beginning with "Would one like to have this done?" and "What did one want it to do?", then finding someone who could do it. The discussion and analysis would study the costs and benefits of moving from 9-CM, volumes 1, 2, and 3, to 10-CM and 10-PCS. It wouldn't include looking at other classifications as alternatives: there was no alternative to 10-CM except staying with 9-CM for diagnoses, and it wouldn't look at alternatives to 10-PCS. There would be more (and more accurate) background and a description of the tasks. Participants agreed that, if the Subcommittee settled on the tasks and deliverables, the actual scope of the activity was something HHS could prepare and get feedback on. Dr. Zubeldia said the results of the project would be directly proportional to the quality of the proposal and what they asked to be done. Ms. Greenberg noted they had to be more specific. She envisioned various scenarios about the costs and benefits and interviewing people outside a standard interviewing protocol, which would have forced a clearance process that could delay this a few years.

Noting one of Dr. McDonald's big questions was the actual cost for an institution to code in the new systems, Dr. Steindel asked if he would accept the literature on coding costs or if he felt determining coding costs was part of this task. Dr. McDonald said he didn't believe there was any data in the literature, but at the very least any available data could be critiqued. He preferred doing a time-motion sample, but he didn't know if that was feasible. Ms. Greenberg said she thought the intent would be to gather information from all the testing done so far. At the end of the day, they might not have found enough to answer that question. The intent wasn't to include actual coding or testing of code sets; whoever they got to do this probably wouldn't be the appropriate person. Dr. McDonald said he'd like to make sure they'd comment on the adequacy of the existing data and sample size. He noted he'd stated his criticisms earlier. Mr. Scanlon said there were actuarial approaches to estimate these, but obviously they'd get better estimates with empirical data than with actuarial assumptions. There were ways to build this into a fairly thorough study, but it would take time.

Mr. Blair asked Dr. McDonald to propose an initial list of questions, methods and a sample size. Dr. McDonald said he'd make suggestions, but noted he didn't think it should be a solo task. He said he was only suggesting that when a reviewer considered available data he or she comment on the adequacy of the sample size for giving confident answers. Dr. Cohn agreed that they needed enough specificity to identify how conclusions were made. He suggested they check via email as this drilled down to make sure the questions addressed met all their needs and that none were omitted that affected their ability to draw a conclusion.

Asked if NCVHS had done something like this before, Ms. Greenberg noted this wasn't out of scope: they'd done contracts, assessments, evaluations and various studies. Dr. Steindel and Mr. Augustine concurred that this was a reasonable approach. Dr. Zubeldia recommended that the proposal draw on the methodologies and populations used in cost benefit analyses by private industry. Remarking that they hadn't heard any testimony about cost benefit analysis, Dr. Cohn suggested they might have to go forward with a letter to the Secretary in November indicating they were doing an analysis and calling upon private industry to provide input or submit their own cost benefit studies for the Committee's review. Dr. Zubeldia recalled in August inviting the Blue Cross Blue Shield Association to bring forward its cost benefit analysis.

Ms. Greenberg noted no other country had done a cost benefit analysis before moving from ICD-9 to ICD-10. There was international experience implementing ICD-10 and a new procedure classification, but none doing an associated cost benefit analysis. Other countries just felt it was the right and timely thing to do. Mr. Scanlon pointed out that, in light of HIPAA requirements, this wasn't an option. Dr. Cohn reflected that the likelihood of getting a multi-million dollar plus effort funded in his own company wouldn't be great if his bosses asked why and he said, "Because it's the right thing to do." Dr. Cohn and Ms. Greenberg established that they were both trying to identify the business case.

Dr. Cohn said he heard general agreement to proceed with a cost benefit study. Dr. McDonald, Ms. Greenberg, and Mr. Augustine recommended that, while doing the study, the Secretary begin the regulatory process by developing an NPRM. Remarking that the notion of a regulatory process scared people, Dr. Steindel suggested "regulatory document" was a more neutral term. What he heard being recommended was to simultaneously develop the cost benefit analysis and recommend that the Department work on the regulatory language it would use if it chose to go forward with an NPRM. Ms. Greenberg added that the Subcommittee also wanted the Department to start work on its own impact analysis; whatever a contractor did would be on a broader level than an in-depth analysis of costs and benefits in Medicare. A lot had to be done to develop an NPRM, and all this could be concurrent. Dr. Steindel expressed concern that the Committee's and the Department's cost benefit analyses might conflict. Ms. Greenberg pointed out that one couldn't do a case control study where one segment implemented and another didn't. There would always be a range of costs and benefits.

While noting he leaned toward a letter indicating they'd recommend initiating a process for an NPRM in parallel with the cost benefit study, Mr. Blair suggested checking the underlying assumption that the analysis wouldn't conclude they shouldn't go forward. An analysis could conclude that if HHS committed to providing education and migration and implementation tools to facilitate the industry moving to ICD-10-CM and PCS, they might wind up with many of the same difficulties faced with the HIPAA financial and administrative transactions. Mr. Blair advised that the study should seek recommendations for education, implementation tools, and other things to mitigate the costs identified. Ms. Greenberg agreed. She added that those anticipating implementation were thinking about this; contractors wouldn't have to make this up out of whole cloth. Mr. Blair said the key thing was that the cost benefit study would be completed before the NPRM and those recommendations could be reflected in it.

Noting the Department would look to the Subcommittee to place all this in perspective, Mr. Scanlon encouraged them to think about everything going on (including HIPAA and its implementation) and all the other demands on providers, plans and the Department. He suggested they owed it to everyone to put everything in perspective, consider alternatives, and not suddenly pull another regulatory action. Mr. Scanlon noted, too, that the decision to pursue a regulatory solution, absent a statutory mandate (e.g., HIPAA), was a serious matter for any Department. He emphasized that suggesting an NPRM, in parallel with the cost study, might or might not be enough for the Department to do anything. Everything had to be taken into consideration.

Mr. Scanlon reflected that there was no such thing as "the impact study." Both those who wanted one and those who didn't drew on the body of knowledge common to research and actuarial work, as well as this. Whatever recommendations were made, if the Subcommittee felt great pressure to move to an NPRM, they should state so. Ultimately, it would have to compete with all other requirements on providers and plans as well as the resources Mr. Blair mentioned that went into HIPAA and other standards implementation. Mr. Scanlon encouraged the Subcommittee to step back and look at overall timing and impact issues in a broader sense.

Dr. Cohn said he heard Mr. Scanlon cautioning the Subcommittee about being sure of the priorities and how much they recommended happening at one time and asked for clarification. Mr. Scanlon said he didn't understand the rush to recommend an NPRM. Doing pre-regulatory information gathering was one thing, and what he thought they were saying was, ultimately, they wanted to move towards adopting ICD-10 and it most logically involved an NPRM. But Mr. Scanlon also heard that they didn't have enough information, found the industry divided, and didn't know enough to estimate the costs or the steps. What troubled him was putting forward the NPRM while still collecting basic information. He said the logic of regulatory development suggested the cost benefit as the next step and recommendations around an NPRM after that. He acknowledged there might be other demands and advised them to look at alternatives, estimate the cost of impact and what could be achieved, doing it in parallel. But that would be unusual, unless there were other circumstances.

Ms. Greenberg pointed out that they hadn't discussed other documents provided to the Subcommittee that needed to be brought before the Committee in November, documents that not only revealed the industry's lack of agreement but also documented through several hearings the fact that the systems currently in use were broken and getting worse. The reason for expeditious behavior was that the current situation was bad and deteriorating. Dr. Cohn acknowledged that there was a tremendous amount of controversy and perhaps no common understanding about a lot of these issues. He said they probably needed to flesh out this issue for the full Committee. And he added that another legitimate question to explore with the Committee was the case for and against there being so much pressure that they couldn't do a cost benefit study before an NPRM. Noting an NPRM wouldn't be published without including that analysis, which was another way of saying a cost benefit study, Ms. Greenberg said nobody recommended publishing without one. Dr. Fitzmaurice urged the Committee to consider that, in putting together an NPRM, the Department's cards were on the table. He recommended issuing the NPRM at the time the study was initiated, so they'd have more information than if they strung it out over three or four years.

Recalling Ms. Greenberg mentioned that the cost benefit analysis would be more global and wouldn't include impact on Medicare or look at alternatives, Dr. Zubeldia said the cost benefit analysis had to look at the entire impact, including the impact on Medicare, in order to understand why it would have restrictions. And if it looked at the full impact, he said it should look at alternatives. Ms. Greenberg replied that it would look at the impact on Medicare and other payers; what she'd said was that this didn't get the Department off the hook from doing what it had to do to create the NPRM. Medicare couldn't just wait for what this study came out with. She explained that both AHA and NCVHS had stated that neither the hospitals who reported in-patient procedures, nor CMS, which used in-patient procedures in DRGs, nor people who used 9-CM, volume 3, for statistical purposes considered CPT a viable alternative. In that sense, regardless of the cost, it wasn't a classification that could be used for that purpose.

Dr. McDonald said he found it disingenuous that every hospital procedure was coded with CPT. Ms. Greenberg noted that CPT was designed to capture the physician component, not the facility costs. Dr. McDonald said it was only being used for hospital physician, surgical, and medical procedures--for all practical purposes, by hospitals. Dr. McDonald commented that, although the hospitals said the problem was they needed richer codes, they weren't using the codes for analysis in most of their systems because the hospital surgical systems typically captured CPT codes. Ms. Greenberg said they weren't questioning whether CPT codes had the support of the physician community or captured what it felt was necessary. They were talking about facilities. Dr. McDonald said they were told that hospitals weren't doing anything except physicians' or practitioners' procedures for practical purposes. They had two coding procedures, one already richer than the other, and two groups battling for reasons he didn't understand, but which he didn't think were necessarily for the good of everybody.

Nellie Leon-Chison, speaking for AHA, clarified how hospitals used surgical and other coding. Every procedure in a hospital did get coded, but there was a difference between the codes a physician used to submit his or her claim and the codes a facility used. The physician used a CPT code to report his or her services to the payer, whereas the hospital, for the in-patient side, used an ICD-9-CM, volume 3, code to report the same procedure. They were entirely different systems. Much of the time, CPT referred to services in relation to how much work was involved on the part of the physician. There were situations where the CPT code included services, pre- and post-surgery, that were part of the physician's fee but didn't relate to what the facility received, because hospitals were paid on a DRG for in-patient services and on the Ambulatory Payment Classification (APC) for out-patient services. Hospitals now used CPT codes for out-patient services, something fairly new with APCs, and had problems implementing them together: the descriptors didn't always fit for facility coding, and CPT descriptors with certain phrases (e.g., "performed by physician") might not apply for hospital billing. The descriptors didn't always make sense to the facility. Members told AHA that they preferred 10-PCS: it was more straightforward, easier to use, and didn't make distinctions about who performed the service, but described the service done.

Dr. Cohn said he didn't have an opinion on this particular issue in relationship to hospital and patient coding, but he noted it came up repeatedly. And if they didn't include it in a cost benefit study, he wondered whether it would continually come up. Ms. Greenberg said CMS would address this at the November meeting, and she suggested they defer this topic until then. She added that the analysis they were discussing shouldn't include looking at alternatives between CPT and ICD-10-PCS in the in-patient environment, which weren't about cost. Dr. McDonald clarified that he wasn't pleading for CPT per se, but he said it didn't seem rational that the only principal coding being done was mostly for professional procedures. And he questioned how they could reasonably talk about cost benefit without addressing other coding everyone was doing as well as ICD-9. Ms. Greenberg reiterated that she didn't see the cost benefit study addressing the issue of a single procedure classification. She noted the Committee could commission a separate study. But she reminded them that they'd heard testimony that this was unlikely to be resolved in a way timely enough to move past out-of-date classifications that couldn't meet future needs.

Noting they'd tried to make a decision and go forward for a long time, Mr. Blair recalled Dr. Cohn commenting that if they didn't get the data to facilitate a decision it kept haunting them. He said in order to have the information to decide and move on they either had to expand this or add an additional study. Dr. Cohn suggested they ask the full Committee how it wanted to deal with this. Ms. Greenberg concurred. She said they should ask CMS to present the case for why they didn't believe CPT could be used in the in-patient environment. It was obviously used for professional services; if they felt it could be used in the in-patient environment they wouldn't have gone to the expense of developing 10-PCS. If the Subcommittee wasn't convinced by CMS's argument, Ms. Greenberg said there might need to be a separate study on single procedure classification. But she said she'd like to proceed with this cost benefit study and the terms generally laid out.

Dr. Zubeldia contended that the clash between 10-PCS and CPT wasn't just a financial argument. He said CMS could probably explain those other reasons best. Regardless of the reasons, Dr. Zubeldia said there should be a cost benefit analysis of that alternative taking all the reasons into consideration. Ms. Greenberg noted they'd have to assume it was an appropriate alternative to do a cost benefit analysis. Dr. Zubeldia identified two choices: either start the cost benefit analysis, excluding CPT, after CMS presented its case that CPT was an invalid alternative, or start the CPT analysis simultaneously with the cost benefit analysis, including CPT. If CMS proved CPT was an invalid alternative, that part could be stopped and the rest continued. Rather than delay the entire cost benefit analysis until they had enough information from CMS to know whether or not to include CPT, Dr. Zubeldia said it was best to start both in parallel, even if one had to be stopped in November.

Ms. Greenberg noted it was unlikely that this would get awarded by November. She said they were prepared for the scope of work that didn't include CPT or single procedure classifications analysis and that the Committee could do a separate study on single procedure, if it chose to, after Tom Gustafson presented CMS's position on the diagnosis side. Dr. Cohn agreed that the November meeting would be a good time to reach agreement. Dr. Fitzmaurice suggested AMA, AHA, and AHIMA could also present their views. Ms. Greenberg said they'd already heard from them and they had agreed not to get outside testimony, because there was no end to it. Dr. Cohn noted that in this context CMS was considered outside testimony. He said the question was whether anyone on the Subcommittee needed to present commentary; further debate wasn't needed between the various sides.

The Subcommittee agreed to hold off a final decision on expanding the scope of the cost benefit study in order to seek guidance from the Committee after the CMS presentation and the Subcommittee's subsequent discussion. Dr. Cohn said he understood that CMS would explicitly justify its reasons for moving forward with ICD-10-PCS. Ms. Greenberg pointed out that CMS already had made its case; what CMS hadn't addressed explicitly was whether CPT was a viable alternative. Dr. Zubeldia said he concurred, based on the fact that Ms. Greenberg had discussed this issue with CMS and her recommendation was to proceed with a cost benefit analysis that didn't consider the CPT codes an alternative. He supported moving in that direction, but said he still wanted to see the information from CMS. The Subcommittee could begin drafting the proposal, which wouldn't be initiated until after the meeting with CMS. If necessary, the CPT part could be added. But he emphasized that it had become evident through all the discussion that the issues of ICD-10-CM and ICD-10-PCS needed to be clearly split.

Ms. Greenberg agreed. Obviously, the cost benefit impact had to look at 10-CM and 10-PCS separately, while also gauging the costs and benefits of replacing all of ICD-9-CM at the same time or at different times--to that extent, they had to be viewed together. Dr. Zubeldia suggested doing the cost benefit in three parts: one for ICD-9-CM and ICD-10-CM, another for ICD-9, volume 3, and ICD-10-PCS, and a third to answer the question of whether additional synergies existed in doing them together. He said they could start the proposal now, knowing that after CMS reviewed the situation, if necessary, they could add the single coding option. The Subcommittee deferred the issue to the full Committee. Dr. Cohn will query Dr. Lumpkin about facilitating the discussion.

Noting they'd talked about a number of issues, the Subcommittee considered how it wanted to conduct the discussion with the full Committee, synthesizing the issues of contention. Ms. Pickett explained that the background document laid out the history of each classification, why it was developed, and a definitive timeline of steps to develop it or a clinical modification related to ICD-10 diagnosis. The other document outlined issues identified during this discussion about single procedure coding: (1) whether the issue needed to be readdressed with information about NCVHS's history related to single procedure coding, (2) advantages of waiting for decisions versus the immediate needs felt by various providers in not having a classification that represented what they really did, (3) the cost benefit analysis and what should be a part of it, including some, not necessarily tangible or quantifiable, benefits. Ms. Pickett said she already had an outline of additional information and would include others mentioned during the discussion. Ms. Greenberg remarked that although they had at least the expectation that this independent cost benefit impact analysis would be commissioned, whether all recommendations about how to proceed awaited completion of the analysis was still an open question and should be in the issues paper. Dr. Cohn said they could give the pros and cons of moving forward now versus waiting for the cost benefit piece to be completed.

Ms. Trudel introduced a beginning list of issues. She noted discussions about the urgency of moving to a new code set: the extent to which ICD-9 was broken, in terms of CM versus the procedure codes in volume 3; whether there were other alternative candidate code sets, such as CPT; the timing, including whether to link a change in CM and PCS; the adequacy of testing to date; and cost versus benefit. Dr. Zubeldia added another issue: the availability of help and assistance tools (e.g., a crosswalk between ICD-9 and ICD-10). Mr. Blair rephrased that more broadly to include migration tools and education and support programs (e.g., mapping) for migration.

Ms. Greenberg asked for comments on the background document within the next week. Dr. Cohn recommended adding the GAO Report and its useful world view, as well as the section of the 1993 NCVHS study that dealt with classifications, along with its set of recommendations, to the two documents they'd been discussing. Dr. Cohn said the study beautifully summarized Dr. McDonald's observation that this was about control and ownership as well as coding, and helped make sense of the problems.

Ms. Pickett will draft the pros and cons and members will review and augment the issues paper off-line. A draft would circulate the middle of next week. Comments had to be in by November 5, so it could be redrafted by the end of the week and put to bed by November 12, in time to go out with the agenda books.

The Subcommittee considered how to facilitate the Committee's discussion of pros and cons. Mr. Augustine recommended dividing up the bulleted pros and cons in the issues document and assigning them to members who would lead the discussion on each item, reflecting both views. Other members could join in if they felt a view needed to be expanded before the Committee could weigh it.

Panel 2: Drug Terminologies Under Development

Mr. Brown said VA wrestled with having no common drug terminologies among the many governmental agencies. He noted the number of memoranda of agreement between agencies collaborating on drug information. NLM, FDA and VA worked together. NCI agreed to collaborate on drug terminology information. And VA and CDC had drafted an agreement that hopefully was in the final phases of approval.

Mr. Brown said the July 2000 NCVHS report had been taken to heart and movement was being made. FDA was developing NDCs and active ingredient classifications. NLM utilized RxNorm for enhanced mapping capabilities between systems. And VA was active with NDF-RT, a reference terminology based on the NDF that expanded the RxNorm model. Mr. Brown said the overall purpose of this collaborative initiative was to reformulate and distribute information already under government stewardship, making the whole bigger than any of its individual pieces.

Mr. Nelson noted there had been a lot of discussion in HL7 meetings about defining a clinical drug so pharmacy knowledge-base vendors could better interact. As a result, a new semantic type was established in the UMLS: a clinical drug was something with an ingredient and either a form or a strength or both. The clinical drug would be expressed at a different level of discourse than the level at which manufacturers produced things: the level of clinical discourse. Mr. Nelson said the concern was about the solution administered to the patient, not the lyophilized powder reconstituted for solution. Using the usual techniques in the UMLS, NLM ended up with some 80,000 clinical drugs and a problem with unrecognized synonymy. Lexical processes used to identify things with the same or similar meanings didn't work. Mr. Nelson said the RxNorm form was conceived with the goal of developing a standard representation of what was meant at a clinical level, relating all UMLS clinical drugs to that standard format and facilitating crosswalks between different pharmacy knowledge-base vendors so that each, with its island of information, could negotiate among them.

Mr. Nelson said it was decided to build a set of UMLS concepts, starting with ingredients. Many were already concepts in UMLS's Medical Subject Headings (MeSH). A formulation would be created combining the drug component (an ingredient in a strength) and the standard drug forms HL7 proposed. The strength of each component would be represented in a standard way, following a set of rules, and relationships would be established with all other names in the UMLS. The UMLS would contain the relationships of the ingredient, the component, the clinical drug, and the dose form. Other things in the UMLS that didn't match exactly in meaning would have a relationship to the RxNorm form that could be mapped to. When completed, everything could be represented in a graph: e.g., Zyrtec 5 mg tablet was a trade name of Cetirizine HCl 5 mg oral tablet. Its dose form was an oral tablet. It consisted of the component Cetirizine HCl 5 mg, which had the ingredient Cetirizine HCl, a form of Cetirizine. Zyrtec was a trade name of Cetirizine HCl. Each of these was a concept within the UMLS, and each labeled relationship within the UMLS allowed a system to negotiate from one portion of that graph to any other portion, identifying permissible relationships and what was known.
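The labeled-relationship graph Mr. Nelson described can be pictured with a small sketch. The following Python fragment is illustrative only; the relationship labels and the way concepts are stored are paraphrased from the testimony and are not actual UMLS or RxNorm data structures.

```python
# Illustrative sketch (not NLM's implementation) of the labeled-relationship
# graph described in the testimony, using the Zyrtec/Cetirizine example.
from collections import defaultdict

edges = defaultdict(list)

def relate(source, relation, target):
    """Record a labeled, directed relationship between two concepts."""
    edges[source].append((relation, target))

relate("Zyrtec 5 mg tablet", "tradename_of", "Cetirizine HCl 5 mg oral tablet")
relate("Cetirizine HCl 5 mg oral tablet", "has_dose_form", "oral tablet")
relate("Cetirizine HCl 5 mg oral tablet", "consists_of", "Cetirizine HCl 5 mg")
relate("Cetirizine HCl 5 mg", "has_ingredient", "Cetirizine HCl")
relate("Cetirizine HCl", "form_of", "Cetirizine")
relate("Zyrtec", "tradename_of", "Cetirizine HCl")

def neighbors(concept):
    """Walk the permissible relationships from one portion of the graph."""
    return edges.get(concept, [])

# A system holding only the trade-name tablet can navigate toward the
# generic clinical drug, its component, ingredient, and base form.
for relation, target in neighbors("Zyrtec 5 mg tablet"):
    print(relation, "->", target)
```

The point of the sketch is simply that each concept is a node and each labeled relationship is an edge a computer system can follow, which is how one vendor's island of information can be related to another's.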

Collaboration began with a memorandum of agreement with the VA National Drug Formulary. Some 80,000 drugs in the NDF were expressed at the NDC code level. VA provided an ingredient list and NLM mapped those into semantic normal forms. Next, drug vocabularies from the five major sources for clinical drugs (the National Drug Formulary, Multum, MediSpan, Micromedex, and First DataBank) were tested to gauge the scalability of the approach and refine it. Work was under way fine-tuning the model and achieving a full map for the most common interactions.

The first experiment with VA began with 80,000 drugs and ended up with 11,300 clinical drugs, 10,000 components and some 8,000 RxNorm forms. A 70 percent reduction in the amount of work was achieved algorithmically. Another 3,000 drugs required human effort. Mr. Nelson noted that enough could not be achieved algorithmically, making it worthwhile to leverage the human effort involved in building the system up. Results of the experiment and the first version of the Metathesaurus were released in 2002.

The second experiment, which tried to parse all the sources and approach them algorithmically, ended up with about 25,000 forms edited and created. These were merged down to 14,000, and at that point 13 clinical drugs that had been separate UMLS concepts were merged together. About 35 UMLS concepts from previous editions were now merged into the same concept. Some 20 graphs establishing and labeling the full set of relationships for a list of highly prescribed drugs were distributed in the May release of the UMLS Metathesaurus. Noting this was mostly done with software built and modified on the fly, Mr. Nelson predicted that by the end of 2002 all needed RxNorm forms would be created and that by spring a full set of graphs would relate every clinical drug in the UMLS to a normalized form.

Mr. Nelson cited the need for a method to ensure that things used on a regular basis at a clinical level were updated. He noted there were many forms of changes. There were thousands of changes to NDC codes each month. Some 30 or 40 new molecular entities showed up every year. And there were numerous repackaging reformulations of new clinical drugs.

Another thing looked at was where NDC code information was available. NLM was dependent upon people who made their living tracking these codes and didn't necessarily want to release proprietary information. But FDA provided access to much NDC code information and VA cooperated with NLM. Where NDC code information was available, NLM could use it as attributes of these concepts, saying those codes fulfilled the criteria for a given RxNorm form.

Mr. Nelson said knowledge-base vendors had spent a great deal of money trying to track what was happening with NDC codes. FDA had lost control of the process of NDC code creation, and pharmacy knowledge-base vendors asked manufacturers each week what new NDC codes had been issued and put them into their systems.

Asked whether the problem was that one couldn't easily get to the active ingredients within an NDC code, Mr. Nelson said that, with a good knowledge source, one could eventually go from the split code (without the hyphens, it wasn't unique) to the active ingredient. Mr. Nelson remarked that when physicians ordered they didn't care how many pills had been in the bottle their prescription came from or about an NDC code. For a generic, they weren't even concerned about the manufacturer. All they cared about was that the patient got the right drug in the right form at the right strength. The NDC code carried information about all of that.

Mr. Nelson said RxNorm was used to address clinical drugs at the level of clinical equivalence and similarity. He suggested RxNorm might serve as a basis for computerized physician order entry.

Mr. Brown reported that VA providers did about 90 percent of all medication orders electronically. VA used its NDF as an interlingua between local data dictionaries at 170 medical centers. The NDF, centrally maintained and distributed, was used for decision support, rules, interactions, and VA's mail-out pharmacies, which filled 57 million prescriptions in 2001. VA looked at this primarily from the patient safety and decision support points of view, recognizing that it faced real issues in maintaining terminologies and mapping between sites (e.g., the health data repository, a national roll-up) if it didn't have standardized formal terminologies that let it do things algorithmically. VA also had to interoperate among installations within its own system as well as with partners and with patients seen in DOD facilities and academic affiliates.

Mr. Brown said VA took an ingredient-centric approach rather than a stock-the-pharmacy's-shelves-centric approach and was interested in computable definitions so operations could return specific answers. VA's goal was to do the knowledge engineering up front and re-use it at all sites. A clinical perspective prevailed. VA sought more information and was interested in collaborating to reduce the workload.

Mr. Brown said VA considered NDF-RT a reference terminology in Dr. Campbell's sense, using description logics, as opposed to something used just for reference. VA added reference hierarchies that included mechanism of action, physiologic effect, and therapeutic intent (which was different from indication) and was considering how much kinetics to treat as definitional in the terminology model. VA kept its drug classes because they were needed and much software was written around them. Some 80,000 NDC drugs and 3,000 active ingredients were in the NDF. Active ingredients were referenced most; definitions were done relatively few times and characteristics were inherited. Mr. Brown noted that, in most cases, the mechanisms of action for an active ingredient were the same for different manufacturers' products, as long as both were the same chemical substance.

Many drug classification schemes liberally mixed in mechanisms of action or chemical class (do not give a drug concurrently with an MAO inhibitor or organic nitrate) or physiologic effects (don't give it with a negative inotrope). Using VA's classification scheme, one knew and could choose what was a beta-blocker, ACE inhibitor, or anti-hypertensive. Abstracting these ideas into a multi-hierarchical system, VA sought to enhance its decision support. VA's work on advanced reference hierarchies for mechanism of action included the chemical structure hierarchy from MeSH, physiologic effect, therapeutic intent, and additional kinetics modeling. Currently NDF-RT was used as the underlying data source for the pharmacogenomics knowledge base data management done at Stanford. NIGMS was funding a group of academic centers to collaborate with VA to expand kinetics and other areas.
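A minimal sketch may help show how the multi-hierarchical abstraction Mr. Brown described supports this kind of rule. The drug names, properties, and rule below are hypothetical and do not reflect NDF-RT content; the point is only that a check can be written against a mechanism-of-action class rather than against a list of individual products.

```python
# Minimal sketch with hypothetical data: an active ingredient carries several
# reference hierarchies (mechanism of action, physiologic effect, therapeutic
# intent), so a decision-support rule can target a class, not a product list.
drug_properties = {
    "phenelzine": {"mechanism_of_action": {"MAO inhibition"},
                   "therapeutic_intent": {"depression"}},
    "metoprolol": {"mechanism_of_action": {"beta-adrenergic blockade"},
                   "physiologic_effect": {"negative inotropy"},
                   "therapeutic_intent": {"hypertension"}},
}

def has_property(ingredient, axis, value):
    """Check one axis of an ingredient's reference hierarchies."""
    return value in drug_properties.get(ingredient, {}).get(axis, set())

def interaction_warning(new_ingredient, current_ingredients):
    """Example rule: flag a drug ordered concurrently with an MAO inhibitor."""
    warnings = []
    for current in current_ingredients:
        if has_property(current, "mechanism_of_action", "MAO inhibition"):
            warnings.append(f"{new_ingredient}: patient already on MAO inhibitor {current}")
    return warnings

print(interaction_warning("metoprolol", ["phenelzine"]))
```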

Active ingredients for the 80,000 NDC-level drugs were algorithmically defined and undergoing human review. The algorithmic initializations were about 80 percent right on therapeutic intent, based on algorithmic mining of the literature. Mr. Brown said the name of the game was to do the best VA could up front using lexical and semantic techniques and then have humans refine the results, minimizing costs while achieving a quality product.

VA utilized a browser as part of its internal strategy for creating enterprise terminology, which Mr. Brown defined as terminology that applied across all systems in the enterprise. VA wanted one way of expressing medications at 170 medical centers and some 150 VistA implementations. Having separate islands with different data dictionaries made it hard to roll up data and share decision support, and VA was moving to take terminology services out of the applications, where they were mixed into the code, and create abstracted terminology services, which fit into VA's HealthePeople initiative. Mr. Brown noted several advantages: the ability to roll up data, compare sites, and realize a highly reusable way to reduce maintenance costs and produce more comparable data. When new drugs came out without an NDF update, each site had to discover the drug was there and put it in, and when there was an update, pay the local update penalty. Not only medications, but also things such as labs and document titles could be knowingly shared between sites and partners.

He noted considerable interest in working together to deal with the maintenance burden. VA worked to develop new use cases and extend the model to meet some of the needs of a multi-institute NIH pharmacogenomics research network. DOD did information modeling on some label sections. And VA was talking with Dr. McDonald about the needs of Regenstrief for order entry, which paralleled needs at other places like Vanderbilt. Some 32 new drugs and 4,000 NDC-level changes in 2001 were a challenge, and VA, in collaboration with FDA and NLM, had started a project for new drug transactions to prototype messaging standards and reflect them back through the standards organizations.

Mr. Brown said once a code was assigned it shouldn't change, but codes did get reused eventually and could also be split different ways and reinterpreted. There were multiple problems with non-uniqueness. He added this was more a marketing than a molecular entity issue: old codes could stay, but, with some 4,000 new codes a month, a lot of new product came out. Mr. Nelson said about 250,000 codes were currently active. VA had 80,000. Once the hyphens were taken out, they weren't necessarily unique.


Mr. Levin explained that the NDC was started in the late 1960s as a way to handle Medicare reimbursement. The NDC comprised three different codes: the labeler, product, and package codes. FDA assigned the four- or five-digit labeler code to a company. Companies assigned the second and third codes themselves. He noted that drug companies, distributors, repackagers and relabelers assigned NDCs, didn't always list those products with FDA, and sometimes assigned NDCs to non-drug products. Even the product and package codes weren't controlled.

Mr. Levin clarified that the three codes were unique to that particular box. But bar codes didn't understand hyphens and couldn't distinguish between the five-digit labeler, package and product codes. Bar codes only read ten-digit numbers, which conceivably could be three different NDCs. Dr. Zubeldia pointed out that under HIPAA the Secretary was adopting an 11-digit number as the NDC standard. Noting that many people added a lead-in zero for the eleventh digit, Mr. Levin reiterated that five-four-two codes without hyphens weren't unique. Ten billion ten-digit codes were available and there were only 250,000 NDCs, so probably there weren't many duplications. But companies controlled the numbers and someone else might have inserted a zero elsewhere in that sequence. FDA knew about problems with the code and was working on ways to improve it. Codes were reusable five years after that product was withdrawn completely. FDA was looking at ways to address issues about uniqueness and reusability of the numbers so that NDC became a reliable code for the packaged product.
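The segmentation problem described above can be illustrated with a short sketch. The example code below is made up, not a real NDC; the three layouts shown are the conventional hyphenated NDC formats, and the padding function reflects the common practice, noted above, of inserting a zero to reach 11 digits.

```python
# Sketch of the NDC segmentation problem (made-up example code, not a real NDC).
# A hyphenated NDC is labeler-product-package in one of three layouts; stripping
# the hyphens leaves a 10-digit string that could match any of them.
POSSIBLE_LAYOUTS = [(4, 4, 2), (5, 3, 2), (5, 4, 1)]

def possible_segmentations(ten_digits):
    """All ways a 10-digit, hyphen-free string could be split into an NDC."""
    out = []
    for a, b, c in POSSIBLE_LAYOUTS:
        out.append((ten_digits[:a], ten_digits[a:a + b], ten_digits[a + b:]))
    return out

def to_11_digit(labeler, product, package):
    """Common 11-digit (5-4-2) padding: left-pad each segment with zeros."""
    return labeler.zfill(5) + product.zfill(4) + package.zfill(2)

code = "1234567890"
for labeler, product, package in possible_segmentations(code):
    print(f"{labeler}-{product}-{package} -> {to_11_digit(labeler, product, package)}")
# The three readings pad to three different 11-digit strings, which is why the
# hyphens (or a fixed, centrally controlled format) matter for uniqueness.
```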

Noting NDF-RT was based on the VA formulary, Dr. Sujansky asked if it included over-the-counter, pediatric, ob-gyn and other medications that might not be relevant to that formulary. Mr. Brown said there were plenty of ob-gyn medications, some items that could overlap with over-the-counter medications, and not many unique pediatric meds. The differences were more the size of the tablets than the level of active ingredients. Mr. Brown noted that women comprised a large and rapidly increasing portion of VA's patients.

Mr. Levin remarked that much of the drug information Dr. Steindel and Mr. Nelson needed was on the product label. He noted that currently labels were printed in a small font on tissue-thin paper and inserts put in sealed cartons weren't replaced when updating was needed. He noted that FDA had samples of package inserts for every product marketed in the United States, but nothing in text or word processing form.

Mr. Blair asked if FDA would gather information about the active ingredients in food supplements and herbal medicines and require accountability and publication codes. Mr. Levin said his presentation would show how FDA handled ingredients for all the products it had information on including many food supplements. Mr. Nelson said RxNorm didn't make a distinction between over-the-counter medications and supplements. Mr. Levin explained that FDA didn't consider a food supplement a drug until the manufacturer made a drug claim: a Vitamin E that was claimed to ease dementia would be considered a drug.

Three efforts were initiated to improve access to FDA's medication information. FDA strove to make it more people-, computer- and information-system-friendly and had changed the organization of the labeling. Key information was highlighted in bullet form at the top of the label: e.g., indications, contraindications, warnings, dosage and how the product was supplied.

A proposed rule requiring companies to submit labeling in electronic format had been published and the comment period was over. The few comments were being addressed and FDA was working on the final form. Mr. Levin noted the pharmaceutical companies didn't like the paper package insert and were interested in moving to electronic labeling. Asked about options for electronic labeling, Mr. Levin said FDA was considering putting the text-based, PDF-formatted content (figures, tables and text) into another electronic format.

One way FDA was trying to make labeling more people-friendly was to take key information and put it into a highlighted, concise section, so the prescriber could get much of the information quickly and, if needed, refer to the body of the label. Medication information was also being made computer-readable. Each section of the label (e.g., pediatric use, usage in pregnancy, drug interactions) and key elements within the text (e.g., active ingredients) were tagged so the computer could find them.
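A toy fragment may make the tagging idea concrete. The element and attribute names below are invented for illustration and are not FDA's actual structured-labeling schema; the point is only that named sections and key elements become addressable to software once they are tagged.

```python
# Illustrative only: a toy XML fragment showing how tagged label sections and
# key elements can be found by software. Tag names are invented for the example.
import xml.etree.ElementTree as ET

label_xml = """
<label product="Example Drug 10 mg tablet">
  <section name="indications">Treatment of example condition.</section>
  <section name="pediatric_use">Safety not established under age 6.</section>
  <section name="drug_interactions">
    Do not combine with <activeIngredient code="X0001">examplemycin</activeIngredient>.
  </section>
</label>
"""

root = ET.fromstring(label_xml)

# A computer can pull a named section, or every tagged active ingredient,
# without scanning free text.
pediatric = root.find("./section[@name='pediatric_use']")
print(pediatric.text.strip())
for ingredient in root.iter("activeIngredient"):
    print(ingredient.get("code"), ingredient.text)
```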

Every time FDA had a new product, it would provide NLM with the ingredient code, strength and form. NLM would note the RxNorm and keep it up to date. FDA also would provide information about the imprint code. The NDC code would be in the label and computer readable.

Mr. Levin described how FDA wanted to work with NLM to build an ingredient coding system that could be used internationally. When a company informed FDA of the structure of a drug it was going to investigate, FDA assigned the drug a unique code and entered it into the system. FDA never released those codes, only using them internally. But it was working on a system to share them with NLM, which would publish them so people could use them. One problem was that FDA could only release the codes of marketed products. Products under investigation were confidential, and many foreign products were never investigated in the U.S., so FDA wouldn't have jurisdiction over them. But NLM, which could see any code in the public domain, could ask FDA to assign a code and disclose it to them. Mr. Levin said the intent was that NLM would link the system to things outside the U.S. domain and it could be used by anyone. He noted that FDA was also interested in coding and labeling other things (e.g., interactions). Key information in the structured highlights would be modeled and put in a computer-readable format so it was easy to update and available as part of the usual process.

Mr. Levin said FDA was trying to make the labeling information-system friendly. FDA didn't know the best ways to present the information, but wanted to make it accessible so people could put it into systems that fed the information to the public. FDA partnered with NLM to disseminate this information. Labels in computer-readable form would be added to information already distributed about drug products so people could import it into their information system. Updated information on all marketed products would come directly from FDA and the manufacturer and be freely distributed.

Dr. Fitzmaurice commented that, knowing the NDC code for each label or package insert, one could do a cross map and go right to the package insert. Mr. Levin said FDA would provide all this information and hoped people would build systems that let them answer their questions. FDA would have labeling on its Web site and address the issues of those who came to the site, but Mr. Levin noted people wanted all sorts of information about the product, and FDA wanted to find out how to disseminate that information.

Mr. Blair asked what MedDRA codes had that NDC codes couldn't provide. Mr. Levin said MedDRA was a controlled terminology for adverse events. It was used for post-marketing safety reports put into an adverse event reporting system where FDA did data mining and surveillance of adverse events. It wasn't used to code a product. Mr. Levin said the codes enabled building better dictionaries (avoiding misspellings and confusion over brand names) for use in exchanging adverse events internationally, improving surveillance capabilities. Mr. Levin explained the DailyMed Initiative, a collaborative effort between NLM, FDA, manufacturers and health information suppliers: changes in labeling submitted by a manufacturer to FDA would be entered into a database utilizing common terminologies developed by NLM, VA, DoD and others, and then distributed by NLM to health information suppliers, who would make them available to the public. Dr. Fitzmaurice remarked that this relationship was exactly what public/private partnerships ought to be. Responding to an impression that MedDRA dealt with international drug enforcement, Mr. Levin explained that the International Conference on Harmonization, which included FDA, the European and Japanese regulatory health authorities, and major pharmaceutical associations, developed MedDRA as the medical terminology everyone would use as a standard way to report adverse events received from manufacturers. Dr. McDonald noted MedDRA was a big terminology with many symptoms and diagnoses.

Dr. Zubeldia expressed concern because the recommendation for the HIPAA standards' adoption of the NDC was to use a series of five, four and two digits, without hyphens but with a zero inserted to make a total of 11 digits. Asked if it would work without hyphens, Mr. Levin explained that only ten digits would fit on current linear product bar codes. He said he would check the HIPAA rule, but he didn't think the NDC, which was three different codes, was meant to be without hyphens. Companies couldn't assign NDCs without the hyphens; the codes would have to be centrally assigned. Mr. Levin also noted more control was needed over the NDC to ensure there weren't redundancies.

Mr. Levin doubted hyphens were needed if the qualifier transmitting the data unit specified a fixed five-four-two format. He explained, though, that adding a zero to ten digits without hyphens could cause overlap. Dr. Cohn remarked that NCPDP would have commented if there were major problems with drug identification and the rule, but he said the Subcommittee would ask them. Mr. Levin said as long as the three codes were considered separately there was no problem with redundancy. That separation was lost without hyphens, and with 11 digits bar coding became an issue. Noting FDA was working on bar coding as a way to reduce drug errors, Dr. Fitzmaurice asked how FDA would solve this problem when it pushed for bar coding of drugs. Mr. Levin said one option was to take more control over the NDC, addressing the issues of the hyphens and the bar coding that couldn't be read.

Dr. Cohn recalled encountering considerable resistance when, in doing the original HIPAA rules, they'd said the industry should abandon J codes and go to the NDC. With the recent proposed rule, they'd suggested the industry go back to J codes, while noting work being done on a drug terminology that might meet needs better than J codes. Acknowledging he wasn't certain he saw a replacement for J codes, Dr. Cohn asked about ways to identify that somebody got an injection of 25 mg of a drug and whether that was only a partial unit. Mr. Nelson advised that the RxNorm form addressed the prescription. He said he wasn't quite sure how to deal with Dr. Cohn's problem of the 25-mg order coming from a 50-mg vial, but suggested that saying they took half the vial, which represented the concentration rather than the vial, was reasonable.

Dr. McDonald remarked that the J codes were a special case for billing that in five years he'd look back on as a historical remnant; whether this approach could replace them or not, unifying the information from the package insert to the prescriber was a magnificent opening. They could start with prescriptions, medications, and writing orders and unify all the knowledge. Dr. Cohn said Mr. Brown's comments led him to believe this might be a solution and he hoped, eventually, it would be referenced in an NPRM. Mr. Brown noted a variety of codes could be applied at different levels. VA pharmacists cared about packages and products; when he wrote for a medication, he didn't. VA's order entry system should accommodate both his needs at the clinical-drug level, using the RxNorm form, and the pharmacist's, who pulled it from a different package. The computer kept straight the entities and abstractions that needed different codes. Noting that a number of things referred to as codes were voluminous, Ms. Trudel asked about assigning an analogous, non-intelligent numerical identifier with the same meaning as the RxNorm form (which might be sizeable) that could go in a claim and be less than 30 bytes long. Mr. Nelson pointed out that the unique identifier in the UMLS was a meaningless nine-digit number.
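The two levels discussed here, the clinician's order at the clinical-drug level and the pharmacy's dispensing at the package level, can be sketched as follows. The structures, codes and amounts are hypothetical; the fragment only illustrates how Dr. Cohn's 25-mg order against a 50-mg vial might be reconciled when the two are kept as distinct entities.

```python
# Rough sketch (hypothetical codes and structures): an order expressed at the
# clinical-drug level versus a dispense recorded at the package (NDC) level.
from dataclasses import dataclass

@dataclass
class ClinicalDrugOrder:          # what the prescriber cares about
    ingredient: str
    dose_mg: float
    form: str                     # e.g., "injectable solution"

@dataclass
class DispensedPackage:           # what the pharmacy pulls off the shelf
    ndc: str                      # package-level identifier
    ingredient: str
    vial_strength_mg: float

def fraction_of_vial(order: ClinicalDrugOrder, package: DispensedPackage) -> float:
    """Express the administered dose as a fraction of the dispensed vial."""
    if order.ingredient != package.ingredient:
        raise ValueError("dispensed package does not match the ordered drug")
    return order.dose_mg / package.vial_strength_mg

order = ClinicalDrugOrder("exampledrug", dose_mg=25, form="injectable solution")
vial = DispensedPackage(ndc="99999-9999-99", ingredient="exampledrug", vial_strength_mg=50)
print(fraction_of_vial(order, vial))  # 0.5 -- half the 50-mg vial covers the 25-mg order
```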

Dr. Cohn said he was confused because the finished dosage form wasn't part of the UMLS RxNorm project, but was a piece of the VA reference terminology. Mr. Levin explained that they wouldn't have the finished dosage form coding until all three efforts came together. Mr. Nelson noted that where there were names in the UMLS for finished dosage forms (e.g., Valium 5 mg tablet) there were established relationships to an RxNorm form. They were trying to do RxNorm forms at the generic equivalence level, but physicians essentially ordered a trade name and that information also would be there. Dr. Cohn said it sounded as if they were getting closer, but there was still a ways to go before they had a drug terminology that might be a viable replacement. Mr. Nelson suggested they ask again in a year.

Mr. Blair said Mr. Nelson's description of using the UMLS as a basis for RxNorm sounded like a major evolution. He understood that the UMLS originally was established to facilitate literature searches and was a mapping to many other code sets. While some people considered it a possible data dictionary for electronic health records, others contended that wasn't its purpose. But, at least with RxNorm and drugs, it seemed to be adapted for patient care. Mr. Nelson pointed out that the UMLS's original purpose was to facilitate interoperation with a wide range of electronic systems for all sorts of information and data related to health, which accounted for all the vocabularies. The UMLS was always broader than just literature. He recalled discussions when the UMLS was built about navigating between vocabularies and using it to find the right name in the appropriate vocabulary. He said nobody had ever used the UMLS that way to navigate between vocabularies because the relationships hadn't been established, but now there was a concerted effort to establish them in the drug vocabulary. It had been hard to consider developing mapping for 800,000 concepts, but the drug vocabulary was a circumscribed portion of the UMLS. With 80,000 concepts, computer applications could be leveraged. It looked like a reasonable target.

Dr. Cohn thanked the presenters, noting the Subcommittee needed to follow their efforts and that it sounded like they might hear about definite progress in the next six to eight months. They'd identified a variety of interests, including the electronic medical record, order entry, and solving problems with HIPAA transactions.

Day Two

Review of Agenda

Mr. Blair explained that HIPAA's administrative simplification provisions directed NCVHS to study and make recommendations to the Secretary on issues related to adoption of uniform data standards for PMRI and that information's electronic exchange. The Subcommittee's report of July 2000 (which was on the Web site) set forth criteria for evaluation and selection of PMRI standards, including recommendations for selecting them. In August of 2002, the Subcommittee formally began evaluating recommendations on PMRI terminology standards. Dr. Steindel, Ms. Beebe, and Mr. Blair had created a matrix summarizing the testimony of the 14 testifiers who'd helped at the August 28 meeting to define the scope and criteria for selection of PMRI terminology standards and to develop a framework to assist in the decision-making process; today's discussion centered on that matrix. The Subcommittee had to resolve four open issues about which terminologies would be in or out of scope and the criteria for selection for different groups. Dr. Steindel noted that NLM and the College of American Pathologists (CAP) had reopened negotiations and either would reach agreement by January 15 or the federal government would initiate other steps.

Mr. Blair noted the Subcommittee had received Dr. McDonald's written comments on the matrix and the discussion framework. Dr. Cohn suggested that considering the various steps in the work plan would help determine the need for additional hearings. Mr. Blair noted they'd just heard about the SNOMED situation and wouldn't have that answer until January. Dr. Cohn reflected that the existence of the contract didn't necessarily affect the Subcommittee's work. The question was how a low-cost license should be used: what domain should it function in, should there be others, how could the industry and health care make maximum use of PMRI terminologies, and what did the federal government and the Subcommittee need to do to assist the industry and country with administrative simplification? Dr. Steindel said those in the federal system looking at SNOMED would appreciate the Subcommittee identifying domains that had to be covered and the priorities. Mr. Augustine concurred.

Ms. Greenberg reported that participants at the annual meeting of the WHO Collaborating Centres for the Family of International Classifications expressed keen interest in how the U.S. would go forward with vocabulary development and use, both domestically and internationally. Noting SNOMED International was either used or being considered in numerous countries and that the U.K., Australia and other countries were interested in working with the U.S., she expressed hope that the negotiations could encompass a broad perspective and solutions that worked internationally. Dr. Steindel pointed out that the sole-source justification published in the Federal Register was limited to U.S. use.

Presentation on PMRI Terminologies, Walter Sujansky, independent advisor

Dr. Sujansky noted the Subcommittee had solicited and summarized testimony from 14 individuals about the scope of terminologies the Subcommittee should address and criteria for selecting the terminologies in that scope. He said the first issue to address was the approach and method by which the Subcommittee would organize and consider terminologies. Input from testifiers on August 28 about how to define the terminology effort's scope broke the suggestions into four or five categories. The first organized the terminologies by clinical function and/or the messages addressed by the standardization effort, including suggestions about domain-oriented groupings (e.g., categorizing according to terminologies that address pharmacy, laboratory, diagnosis, history or physical exam). There were also recommendations about functional organizations based on the types of tasks terminologies supported (e.g., nursing versus physician tasks). Mr. Blair had noted that this was a pragmatic, efficient approach that developed terminologies that directly addressed specific functions and uses. It led to terminologies that fit into a RIM designed to support messages and provided messages that were easily stored and retrieved. It seemed less expensive than strategies that included a reference terminology. The approach's weaknesses were that it wasn't well suited for decision-support applications that had to use and compare data from multiple domains; it didn't appear to address the need for clinically specific outcomes data to improve clinical processes when compared to a multi-axial reference terminology; it didn't address the need for clinically specific data to advance clinical research; and interlocking individual terminologies might prove more expensive than mapping to a single reference terminology. Mark Overhage, Stuart Nelson, David Laroux and Steve Brown advocated this approach.

The second approach entailed organizing terminologies with respect to a convergent or reference terminology that formed the core or first layer; a variation specified a single, relatively comprehensive core terminology used as the initial primary terminology for accessing and analyzing clinically specific data. Dr. Sujansky explained that other layers might include more limited and perhaps functionally specific terminologies related to things like bar coding or certain clinical functions such as nursing or emergency departments. Another layer would hold administrative terminologies used for billing, administrative reporting or disease surveillance. In the center would be a comprehensive core reference terminology. Dr. Sujansky noted this approach provided a unifying mechanism for all terminologies, facilitated clinical decision support to improve quality of care and patient safety, improved clinical outcomes data to facilitate improvements in clinical processes, did this more effectively and at lower cost relative to functionally specific terminologies, and had potential to improve the accuracy and lower the costs of clinical research. Its weakness was the relatively high cost of developing and maintaining a convergent terminology. Dr. Sujansky indicated the Subcommittee had received inconclusive information about the marketplace's limited acceptance of the candidate convergent terminologies. He noted this approach didn't offer complete coverage of PMRI terminologies, due in part to the development costs of additional domains. Difficulties were often encountered when mapping administrative terminologies to such a core reference terminology; Dr. Sujansky suggested these mapping difficulties extended to any terminology developed independently and created a challenge in harmonizing terminology models with information models. Kent Spackman, Jim Campbell, Peter Elkin and Colin Price advocated this second approach.

Noting a recent draft document reporting the amount of information lost in mapping, Dr. Cohn predicted this was a central issue they'd have to consider: by definition, there still wasn't a single comprehensive terminology that handled all potential uses and didn't require mapping. Dr. Sujansky said some published literature, or at least white papers, were available (e.g., mapping SNOMED to ICD-9). They could investigate whether there was a quantitative measure of how much information was lost per se. Recalling that many testifiers stated that, given its importance, the mapping process should be treated with as much rigor as development of the vocabulary, Dr. Steindel suggested this could provide an estimate of the cost of mapping. While that cost would be significant, Dr. Cohn said, hopefully, it would become an add-on year-to-year or moment-to-moment change. The bigger issue was how they dealt with information loss. Noting that ICD was a critical piece of all this, Dr. Cohn emphasized the need to be aware of all parts of the framework, relating both to administrative codes and things on the fringes of both small and large terminologies. He reiterated that a single terminology wouldn't handle everything. Dr. Steindel agreed this was an important issue to explore and suggested it might be worthwhile to hold a hearing session. Dr. Sujansky said he would do some research and report back at the next Committee meeting. Often when SNOMED found problems with mapping to ICD codes, he said they just added an equivalent SNOMED code.

Mr. Blair said he'd heard two dimensions of this issue: the degree of missing data and cost. Dr. Cohn said he thought the missing data was a key architectural issue and that, whatever they put forward, they needed to provide guidance. The Committee was responsible for determining what ought to happen and mitigating barriers and issues. Ms. Greenberg reported that the annual meeting of the WHO Collaborating Centres had established an international work group on vocabularies and their relationships with the family of international classifications. The U.K. chaired the work group and the Australian center would take the lead on mapping between SNOMED and ICD-10. Dr. Berglund would participate. Dr. Cohn bookmarked this issue, noting that the Subcommittee's initial recommendations in 2000 identified the need to handle mapping.

Dr. Sujansky observed that these approaches to organizing and considering the terminologies weren't necessarily mutually exclusive, but represented conceptual buckets the testimony could be put into, and the buckets could be combined. The third approach, which Dr. Chute had articulated, was development of a web of terminologies blended between information and terminology models. Dr. Sujansky viewed this as a natural, necessary interaction between terminologies and, for example, message and medical record structures. In order to achieve the objectives of PMRI standardization, he said it was important to recognize a certain amount of overlap in the information content of terminologies and message structures. Dr. Sujansky illustrated the kind of problem Dr. Chute's web of terminologies, which integrated information and terminology models, was meant to address: a term "family history of coronary artery disease" in a terminology and a "family history diagnosis" field with the coded term "coronary artery disease" in a patient record structure would result in two ways of representing the same information, an ongoing problem with standardized medical records.
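The overlap problem can be sketched briefly. The codes and the equivalence table below are placeholders invented for illustration, not real terminology identifiers; the fragment only shows that the same fact can arrive either as one precoordinated term or as a record field plus a code, and that some explicit reconciliation is needed for a receiving system to treat them as identical.

```python
# Sketch of the terminology/information-model overlap (placeholder codes only).
precoordinated = {
    "code": "T-0001",                       # "family history of coronary artery disease"
    "display": "family history of coronary artery disease",
}

post_coordinated = {
    "field": "family_history_diagnosis",    # structure supplied by the record model
    "code": "T-0002",                       # "coronary artery disease"
    "display": "coronary artery disease",
}

# A hypothetical equivalence table is one way a receiving system could recognize
# that the two representations carry the same information.
equivalences = {
    ("family_history_diagnosis", "T-0002"): "T-0001",
}

def normalize(entry):
    """Map a field+code pair onto the equivalent precoordinated term, if any."""
    if "field" in entry:
        return equivalences.get((entry["field"], entry["code"]), None)
    return entry["code"]

print(normalize(post_coordinated) == normalize(precoordinated))  # True
```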

Another approach was to retain the current representation for the scope of PMRI terminologies with modifications. Dr. Sujansky noted the recommended modifications included adding and removing certain terminologies from certain categories or consideration. Some terminologies were considered message specific codes: e.g., the emergency department's DEEDS terminology was considered more of a record structure with limited terminology specifications. Dr. Sujansky said this approach might be the fastest, simplest way to characterize the scope of PMRI terminologies. Its weakness was that it didn't provide a strategic approach to unify, prioritize or address the need for new PMRI terminologies.

The last approach, advocated by Captain Brian Kelly, Michael Beebe, and Margaret Haber, was endorsement of PMRI terminologies used by the government or the private sector (e.g., ICD-9, CPT-4, NDC). This approach avoided any objections from existing developers of PMRI terminologies, but offered no strategy or path to unify, prioritize or address needs for new PMRI terminologies. Remarking that standard terminologies currently used by the government or the private sector didn't provide much clinical specificity, Dr. Sujansky added that this approach also didn't address the need for clinically specific terminologies. Dr. Steindel recalled that Captain Kelly and Ms. Haber didn't feel the terminologies could exist de novo and advocated mapping and interrelating terminologies as well. While the last group was more pragmatic than the first, both shared characteristics. Mr. Beebe was the only one advocating a specific terminology in the linkage. Mr. Blair suggested that, if they looked closely at the discussion framework, they'd find other examples where it fell short and, as Dr. Sujansky suggested, wasn't necessarily mutually exclusive. But Mr. Blair said it was intended to help them group their thoughts and determine the right direction. Dr. Sujansky pointed out a pragmatic aspect: attributes of the first group also were embodied in the last.

Discussion and Prioritization

Dr. Sujansky read Dr. McDonald's e-mailed suggestions into the record. Dr. McDonald recommended citing the law authorizing this work as a way of providing focus and direction. He reduced the dialogue to two or three themes: (1) most pragmatists, and those proposing targeting low-hanging fruit, hinted at a federated model (Dr. Bakken, Dr. Berglund and those advocating retaining terminologies already used by the government and private sector); (2) others argued for a more central authority and unified vocabulary, with SNOMED as the core (Drs. Spackman, Campbell, and Price); and (3) Dr. McDonald noted that both Drs. Elkin and Chute addressed issues relating to the data model and its ties to vocabulary, but espoused a tight federation or overarching modeling rather than a major central coding system. Noting that CPT would exist for some time, no matter what they decided, Dr. McDonald favored the current pragmatic federated approach with ICD-10 and the emergence of RxNorm. Dr. McDonald characterized the statement that the first approach (grouping functionally specific terminologies) wasn't well suited for decision support as an exaggeration. For decades, systems that did decision support had managed to deal with mixed vocabularies, where the codes didn't overlap conceptual space, by stating the logic in terms of a set of codes and/or internal mapping.

Dr. McDonald considered the statement that a federation couldn't solve clinical outcomes or advanced clinical research a criticism of individual code systems: one might criticize ICD-9 for not being granular enough for some outcome measures or advanced clinical research, but that assertion depended upon the collection of codes included in the federation. It would be hard to make the criticism stick if MedDRA or SNOMED represented diagnoses and findings. He added that the statement about outcomes and advanced research failed to consider that many outcomes were measured as quantitative results: e.g., ejection fraction for cardiac output, vital capacity for pulmonary function, and number of hospitalizations. Other code standards (e.g., HUGO gene names and evolving SNP nomenclatures) existed and could join the federation to cover some of the advanced research vocabulary.

Dr. McDonald observed that one couldn't presume that single- or multi-axial solutions were the exclusive preserve of either model: a federated model could also have multi-axial codes. The real, complex question was between compositional and non-compositional. Dr. Sujansky observed that Dr. McDonald raised interesting points about the historical use of existing terminologies in real-life decision support systems and their apparent utility today. Dr. Fitzmaurice interpreted Dr. McDonald's comments as supporting pooling terminologies and codes and making the best of what they had. He pointed out that focusing on mapping, rather than starting with a core reference terminology and working outward, wasn't addressed. Dr. Sujansky said the comments omitted the overlap in meaning among existing terminologies, which was one of the federated approach's biggest problems. Pursuing an approach that included a lot of terminology standards without carving out particular terminologies to cover specific areas left considerable possibilities for overlap and redundancy. Dr. Cohn said he didn't think anyone should be against synonyms, which was another word for redundancy. Remarking that he thought there had been a fair amount of posturing at the August hearings, Dr. Cohn said he found himself screening some comments.

Noting that they were in the midst of transition and uncertainty and that the low-hanging fruit could vary from moment to moment, Dr. Cohn defined the question as what to do near-term versus looking far out on the horizon at the perfect united reference terminology. He said the Subcommittee probably wouldn't do well looking ten years ahead and needed to be pragmatic and do things useful near-term while ensuring that what they did was extensible and moved toward that grand united whole. Mr. Blair observed that the first two categories Dr. Sujansky presented reflected important constituencies and real needs. He said he greatly appreciated the pragmatist approach and supported advancement of pragmatic solutions.

Mr. Blair said he felt pragmatic solutions often were based on what was available today and the business needs exerting the most pressure. The Committee's view was broader than what was important to a particular institution, government agency, professional association, or standards development organization (SDO). Consistent with what Dr. Cohn said, though phrasing it differently, Mr. Blair said the nation had a major problem: health care was almost 14 percent of the Gross Domestic Product, but lacked the data to manage and control it. Noting the NHII work, paradigm shifts, and how most industries had moved into the information age and could improve quality and add services, doing both at lower cost, Mr. Blair emphasized that the information infrastructure still wasn't in place. While the pragmatic approach was good for immediate measurable results, he questioned whether it created the information infrastructure to move to that new paradigm. He saw the second category as more capable of moving them toward a broader information infrastructure. It might be more expensive, time consuming and difficult with mapping and involve more challenging political issues, but he thought Dr. Chute saw value in both the information and terminology models and that the Committee should consider how to get the best of both.

Dr. Cohn remarked that another big conundrum, along with mapping, was how the terminology and other information models fit together. He queried whether their end goal was to ensure that an HL7 message could be sent, or decision support. As he considered what might make the rate of increase in spending more manageable, he said he hoped decision support would be part of the equation. Dr. Steindel concurred. He asked the Committee to endorse that its role wasn't to sanction terminologies for use in messaging, but for use in PMRI systems. Their charge under the law wasn't to look for the best terminologies to appear in an HL7 message. In some cases, the PMRI standards they'd recommended weren't suitable for HL7 messaging, but encompassed what could be done inside an institution rather than between institutions. HL7 was designed for both. In looking at recommendations for the Consolidated Health Informatics initiative, they'd differentiated between those suitable for exchange between and within institutions. Dr. Steindel emphasized the need to keep aware of that distinction and focus on the selection of terminology outside the realm of messaging. Asked about a middle ground, Mr. Blair replied that he didn't think of this as a compromise. The two addressed different problems, could contribute to each other, and had to work together. Dr. Cohn noted that HL7 did a lot of work on standards for electronic health records, a central issue the Subcommittee had to consider. Mr. Blair explained that much of this work in HL7 was driven by the time frame within England's national health system, which was developing a national health record in an ambulatory environment.

Returning to the discussion about decision support versus messaging, Ms. Beebe recalled Mr. Brown's comments about the work VA, FDA and NLM did with RxNorm. Mr. Brown had said not to worry about decision support: computers did that. Ms. Beebe emphasized getting the terminology right, remembering the end result was to use the terminology for decision support at the application level. Noting HL7 was also mapping between the HIPAA X12 transactions and the RIM, she suggested remembering, in considering mapping, that they could probably leverage SDOs and others engaged in these efforts.

Reflecting on Dr. McDonald's comments and the discussion points raised, Dr. Sujansky advised the Subcommittee to articulate the purpose of standardizing clinically specific terminologies. He noted Dr. McDonald seemed to suggest these terminologies were necessary and already used for institution-specific decision support and reporting, and that other institutions had sophisticated decision support, reporting, surveillance and tracking in place. Some terminologies worked, and it had been demonstrated that these systems were effective and had positive ROI and clinical outcomes. Dr. Sujansky suggested the Subcommittee consider what was missing that required a standardization effort (e.g., this happened only at a few institutions with the resources and knowledge to develop their own sophisticated systems: why weren't vendors providing this on a wide scale, and how might standards facilitate that?).

Dr. Sujansky also suggested that, independent of decision support or institution-specific functionality, it might be useful to consider the role of terminology as well as messaging standards in the cross- or super-facility aggregation and sharing of information. Noting the most compelling area of need for standards might be organizations that did things differently and wished to share information processed by computer applications or contribute data to pooling or aggregation of data in order to analyze it reasonably, Dr. Sujansky said no terminologies or effective standards were in place for that and suggested this might be a compelling reason for standardizing terminologies and other areas of medical information representation.

Dr. Sujansky recommended that the Subcommittee consider the context of this work. Personally, he believed the reason for PMRI standards was to standardize computerized access to that data across patients, time, providers and facilities. He emphasized the importance of remembering, when evaluating terminologies and their qualities, that, as Dr. Lee Min Lau had said, terminologies weren't about medical knowledge bases or organizing medical knowledge, but about facilitating access to data. Standardization allowed computerized access to data that otherwise would be represented differently across patients and across time (e.g., those visiting multiple facilities or providers, the same provider at different points in time, and providers and facilities with different systems). He depicted a view of the model that showed clinicians, lab systems, devices and other categories of information as many different databases across patients, time and facilities feeding patient databases. Dr. Sujansky said standardized models were needed to create a standardized, uniform interface so other processes, applications and systems could access, share, and incorporate this data into other systems where it was shared and computer-processible without manual intervention. He noted standards also might be important for widespread, automated, computable decision support.

Decision support could be as simple as a drug-to-drug interaction, which might be accommodated by existing coding systems like RxNorm, but Mr. Blair noted that one cut across existing terminologies in going to the patient record when ordering a medication and checking against allergies, whether a patient had rheumatic fever as a child, or other symptoms, signs or items in the problem list. He expressed concern that, although existing coding systems might accommodate that with mapping or interlocking things, clinical specificity might be lost without a convergent reference terminology. Dr. Sujansky noted that they'd again raised the point of something that worked in a particular facility and system, and existing terminologies that might support that now, versus a truly standardized representation. Dr. Sujansky agreed that there were different levels of decision support that required standardization of a variety of clinical information (e.g., allergies, past medical history, lab results). Guidelines were also decision-support vehicles that got into findings and symptomatology. Statistical inference (e.g., research, outcomes research, and clinical trials pooling much data across patients, time, providers and facilities, requiring comparability and aggregation of data in order to draw valid statistical inferences) was a third category of applications that could benefit from standard PMRI models. Dr. Sujansky suggested considering more detailed requirements for the selection of standards in terminologies and other areas of standardization, as well as drilling down a level deeper than the high-level model (to the level addressed in the July 2000 report) and considering the types of information that had to be standardized. He presented more use cases as examples of standardization-enabled scenarios they might use as a touchstone to identify strategies that provided a solution. One was an example of interoperability in which documentation of a patient encounter in the emergency department was electronically submitted to the patient's primary care physician and automatically incorporated into the physician's EMR. Without manual intervention, two different systems exchanged electronically represented, processable information. The diagnosis from the ED went into the patient's problem list at the PCP. Meds prescribed in the ED went into the medication list in the PCP system. Procedures performed in the ED went into the past medical history portion of the EMR. Dr. Sujansky noted this was an example of the interoperability enabled by standards and wasn't possible today, given systems developed independently. He said terminology standards were necessary but not sufficient to do most things PMRI standards had to do; a holistic approach was needed. That was more difficult, less practical, and took longer, but Dr. Sujansky said the Subcommittee had to consider it.
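The ED-to-PCP scenario can be sketched in a few lines. The document structure, destination lists and codes below are invented for illustration; the sketch only shows that, once the ED document carries standard coded entries, a receiving EMR can file each entry in the right list without manual intervention.

```python
# Hypothetical sketch of the interoperability use case: standard codes let the
# receiving EMR route each coded entry automatically. Structure and codes are invented.
ed_document = [
    {"kind": "diagnosis",  "code": "DX-123", "display": "acute asthma exacerbation"},
    {"kind": "medication", "code": "RX-456", "display": "albuterol inhaler"},
    {"kind": "procedure",  "code": "PR-789", "display": "nebulizer treatment"},
]

pcp_emr = {"problem_list": [], "medication_list": [], "past_medical_history": []}

DESTINATION = {
    "diagnosis": "problem_list",
    "medication": "medication_list",
    "procedure": "past_medical_history",
}

for entry in ed_document:
    pcp_emr[DESTINATION[entry["kind"]]].append(entry)

for section, entries in pcp_emr.items():
    print(section, [e["display"] for e in entries])
```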

Another specific use case was for comparability and aggregation of data. An outcome analysis of treatment with beta blockers versus ACE inhibitors in congestive heart failure (CHF) patients over a two-year period would require considering diagnoses, medications, classes, and the subjective and objective findings that gauged the outcome of a CHF patient (e.g., quantitative measures like ejection fraction and echocardiograph results, and qualitative measures like edema and shortness of breath). Ideally, all were available electronically in a standard format, and one could draw statistical inferences. Dr. Cohn noted a related CMS QISMC indicator and study where there was no way to identify contraindications other than manual chart review. Mr. Blair said the example was extremely important, appropriate and relevant; CMS was one of a few institutions in a position to make a difference in an information infrastructure that could affect the cost and quality of care in the nation, but when they looked at what they might do with outcomes, they were so constrained by the lack and cost of data they tended to do one-to-one correlations (e.g., beta blockers). Mr. Blair noted a need for a view broader than today's pragmatic outlook with messages; attending to clinical and outcomes studies in a multifaceted manner; and looking at the patient's age, other complications, medications and environmental conditions at the same time one looked at the efficacy of a particular beta blocker. He emphasized that this was where they'd gain meaningful, important improvements in outcomes and where he envisioned that reference terminology began to have power and importance to the nation.

Dr. Cohn noted they might need to talk to the industry about where the real value was. One group indicated an outcomes analysis would reduce health care expenditures over the next decade, while others pointed out they'd already done outcomes analyses in multiple institutions and the big issue was implementing the lessons and remembering them at the point of care. Dr. Steindel concurred that decision analysis using specific terminologies had been done for a long time. The information obtained from those databases depended on the quality of the queries. Similarly, using a reference terminology for decision support depended on how concepts were modeled in that reference terminology and its biases. Dr. Steindel said if they could substitute multiple models in the reference terminologies they might have the ubiquitous answer Mr. Blair envisioned, but in general they looked at specific models in the reference terminologies that might or might not provide researchers with information.

Dr. Steindel observed that public health usually wasn't interested in queries about clinical situations, which were what SNOMED was designed for. Either some models had to be redone and imposed on SNOMED for public health queries, or they'd do what Dr. McDonald proposed and code queries against databases encoded in SNOMED. Dr. Steindel noted a balance between the two. He recalled that Dr. Chute pointed out in his web of terminologies that multiple interlocking views of the terminologies were needed to gain maximum information. Even with a convergent or reference terminology, they'd probably still need multiple interlocking views. Dr. Fitzmaurice said Dr. Cohn had identified the two concepts most important to the Agency for Healthcare Research and Quality (AHRQ): developing good evidence for the evidence-based practice of medicine and translating research into practice.

Mr. Augustine pointed out that most outcomes analysis was retrospective and observational, and some type of risk and severity adjustment was needed to compare outcomes, which required good patient information (e.g., comorbidities) that they hadn't mentioned. Dr. Sujansky agreed, noting that implicit in this was the availability of that level of clinically specific data, and an argument for improved terminologies with that level of detail for comorbidities. Some of this could be captured in ICD-9, but without that adjustment, clinically specific information wasn't available. Addressing Mr. Brown's point, Mr. Augustine said no terminology would ever be developed that could support every outcomes study imagined. ICD-9 supported certain studies, but what could be done with ICD-9 data was relatively limited. If SNOMED or something else clinically specific was adopted and coded data were available, a broader range of studies would be enabled. There would still be missing pieces and studies one couldn't do without going to the charts, but Mr. Augustine said a process could be in place to feed input back into the terminology development process and enable future studies.

Dr. Fitzmaurice added to what Dr. Sujansky had said: if one had the terminology, codes and an electronic patient record, one could conduct a study by putting flags on patient subjects so the needed information was entered into the medical record. Mr. Augustine noted that some studies based on ICD-9 that had guided the way the industry practiced gave out false positives. Dr. Graham reported that VA had similar problems with ICD-9 and their clinical reminders and decision support. Differing views and rules between inpatient and outpatient settings resulted in rule-outs being coded as positive on inpatients and not on outpatients. Dr. Graham also noted that a chart review was done in the example posed and the information was based on a very small subset. Inclusion of the vocabularies would give most people the ability to make decisions on a population rather than on small, segregated subsets. He added that VA spent upwards of $20 million annually on chart review to get access to data for this subset. Mr. Augustine commented on the money spent in the managed care industry doing chart reviews for HEDIS numbers. Dr. Cohn said he was a strong believer in the importance of clinical terminologies, but cautioned against over-selling them. He noted CMS had commissioned and developed some good risk adjustment models based on ICD and rolled them up, because ICD was too granular. The same thing occurred at the level of DRGs. Dr. Cohn said he had to be convinced that this level of granularity would necessarily achieve better scores. Mr. Augustine said he believed the adjusters accounted for different amounts of variability and were actually looking at the aspects of interest. While he qualified some existing adjusters as being good given better data, he contended that with better clinical terminology the adjusters would be more accurate and make for better decisions.

Dr. Steindel noted CDC's interest in the terminologies was population-based. In rolling out these systems with both a population point of view and a focus on the primary care setting, CDC hoped to get primary clinical information that triggered surveillance systems. Dr. Steindel reported that the first talks on syndromic surveillance a month earlier in New York City concluded that clinicians couldn't input this information; it had to come from an electronic medical record system. Good decision support was needed to provide the warning bells and whistles that triggered syndromic surveillance. As Mr. Augustine had pointed out, this involved risk adjustment. In the general population, the incidence of anthrax was essentially non-existent; when an event occurred, it changed risk adjustments and syndromic surveillance. They had to look at their terminology systems' flexibility, even extending this into general medicine.

Noting they'd spent a lot of time on use cases to create a context for the kind of functionality they sought and a touchstone to reference in considering recommendations, Dr. Sujansky presented two more use cases: surveillance (monitoring instances of respiratory ailments seen in outpatient offices and emergency departments compared to normal trends) and decision support (involving guidelines and alert rules that used specific clinical data and could be shared across electronic medical record systems, clinical data repositories, and other clinical information systems). Dr. Sujansky said this was a sophisticated and difficult function, requiring a standard patient record model, including standard record structure, terminology, and models of context, negation and time, as well as everything relevant to decision support at various levels. Less sophisticated decision support modalities (e.g., drug-drug interaction checking) didn't require as sophisticated a system. But Dr. Sujansky noted the point wasn't simply that guidelines, alert rules and applications could be created; it was to do it efficiently on a wider scale, to standardize the data model systems used to access the data driving them, and to standardize so systems were common, efficient and cheaper.
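A minimal sketch of what a shareable alert rule might look like once the data model and terminology are standardized, so that one rule definition could be evaluated by different compliant systems without local translation; the rule structure, codes, and values below are hypothetical illustrations, not content from the hearing:

```python
# Hypothetical sketch: an alert rule expressed only in terms of standard codes
# and a standard record structure, so it can be distributed to and evaluated
# by different EMR systems. Codes and thresholds are illustrative only.

rule = {
    "name": "ACE inhibitor with elevated potassium",
    "requires_medication_class": "ace_inhibitor",
    "requires_lab": {"code": "LOINC:2823-3", "above": 5.5},  # illustrative serum potassium code
    "alert": "Hyperkalemia risk: review ACE inhibitor therapy.",
}

def evaluate(rule, record):
    """Fire the alert when the coded record satisfies the rule's conditions."""
    on_drug = rule["requires_medication_class"] in record["medication_classes"]
    lab_value = record["labs"].get(rule["requires_lab"]["code"])
    lab_high = lab_value is not None and lab_value > rule["requires_lab"]["above"]
    return rule["alert"] if on_drug and lab_high else None

record = {"medication_classes": {"ace_inhibitor"}, "labs": {"LOINC:2823-3": 5.9}}
print(evaluate(rule, record))
```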

Dr. Sujansky asked members to e-mail suggestions and objectives a standard terminology would enable so he could incorporate them into the discussion. He emphasized considering what the Subcommittee's recommendations in this area would mean and how they could make them relevant; the role of government versus industry (i.e., SDOs and vendors of information systems); and how to leverage the government's role in enabling, fostering and encouraging the functionality they'd described, given that industry had to implement much of that functionality. Dr. Sujansky emphasized key questions. What was the role of government funding? What should the Subcommittee recommend about funding: should it fund SDOs' efforts, research, or education, and what was the best way to leverage the government's capabilities? How could the government espouse a model for enabling these functionalities that industry would embrace and work to realize? How effective could the government be as a purchaser in fostering and encouraging standards, and creating de facto standards, if standards weren't going to be mandated? From a practical point of view, Dr. Sujansky advised it was important to improve, extend and integrate existing terminologies without re-inventing the wheel. How could they balance what existed, which everyone said was imperfect and inadequate, with the reality of having to start there and the importance of being practical?

Mr. Blair noted that they'd put Dr. Campbell's model in the second category, but that he'd referred to its first layer as a convergent reference terminology. Mr. Brown had pointed out that there probably would be more than a single core reference terminology; interlocking models could meet many purposes, and the Subcommittee might want to expand the concept of that core. Dr. Campbell's model had a second layer, which tended to be domains. Mr. Blair said much of the progress within that layer was driven by people motivated by the clinical functions or transaction needs placed in the first category. He asked whether it was possible within Dr. Campbell's model to incorporate that first category into the second layer, and suggested looking at the terminology development work referenced in the first category as being represented in the second layer of Dr. Campbell's model. Mr. Augustine said existing terminologies would be in that second layer; administrative terminologies would be in the third. Dr. Sujansky said that had merit as a general concept. Other terminologies would continue to exist and be used for some time, regardless of what the Subcommittee did, and a model that included them would be more readily accepted. Declaring that no one could use an emergency department terminology anymore wasn't practically achievable; it was better to include it at a different level of the standardization process. Dr. Cohn concurred. CMS could designate things if it reimbursed on the basis of them, and he had to follow state or federal reporting mandates, but beyond that the Committee wouldn't get far telling people what to use internally. Dr. Steindel agreed: one could do whatever one wanted within one's institution.

Dr. Cohn noted that the industry was looking for decision support in implementing terminologies that required considerable expense and discomfort to everyone in initial training and acclimation. Mr. Augustine said people griped about changing their code sets, local codes and terminology, but ultimately were happy they didn't have to maintain those relationships and definitions.

Dr. Sujansky noted that, doing it this way, a core terminology was informally endorsed as the reference terminology and standard and the industry and health care community would gravitate towards it. Given a lot of overlap between that terminology and some more functionally specific terminologies, there could be economic and functional incentives to move over time towards the core terminology. Vendors built more tools and functionalities around a prevailing terminology and more purchasers of systems gravitated towards it.

Mr. Blair said his thoughts were complementary to Mr. Brown's and Dr. Cohn's, but from the viewpoint of a vendor of health care information systems. Vendors competed to meet user needs and, if the Subcommittee set a framework for the evolution of clinically specific standards, including clinically specific terminologies that enabled vendors to produce additional functions for their customers, vendors would incorporate that in their software development plan and take next steps built on that direction. In this sense, there was a partnership. The Subcommittee didn't have to mandate.

Dr. Cohn agreed. He said almost any model would be preliminarily acceptable in terms of organizing things, because it would evolve as they moved forward. They needed to test with customers, vendors and others, continuing to refine and evolve this model used to organize PMRI terminologies and determine scope. With only one set of hearings with terminology experts and none yet from the industry, the Subcommittee wasn't ready to put anything in stone. It was a matter of what helped them move forward. Acknowledging they didn't know whether this core described one or multiple terminologies, Dr. Cohn noted it did need to be relatively tightly integrated and mapped and, hopefully, not have much information loss in moving across it. He said the problem was deciding what was core in health care and decision making. Dr. Cohn also noted that the outer, administrative and billing layer (the HIPAA standards) had a lot of other terminologies that they hadn't mentioned. He questioned the need to standardize the second layer, saying he'd have to be convinced there was a case for standardizing the particular perspectives represented by those terminologies. In a sense, everybody was in the second layer, like orbiting planets. He contended the issue was that they needed an agreed-upon set of principles for how terminologies played into the model, along with quality mappings that were agreed to and maintained.

Mr. Blair proposed that the HL-7 RIM might begin to provide harmonization and coordination among terminologies in the second layer. He suggested a reference terminology like SNOMED for now, while in the future others might serve as multiple terminology models at the core. Mr. Blair said he found himself adding plurals to Dr. Chute's visualization of a terminology and information model and suggested that if they put it into Dr. Campbell's overall framework, they'd have a way to work together and resolve issues. Dr. Cohn said he struggled with how the information model fit into the terminology solar system and tended to view it as almost another dimension.

Noting that lots of questions had been raised, and questioning herself the relationship between the second and third layers, Ms. Greenberg expressed concern about embracing that model. Dr. Steindel said many others, including himself, felt that way. But he pointed out that a lot of people also enunciated the basic ideas behind Dr. Campbell's model. There were different uses of terminologies and different ways they related and should be considered. The profession could discuss whether a given terminology belonged in the administrative or the middle layer, or in the convergent or the middle layer. But Dr. Steindel said he found this three-layered concept useful for conceptualizing discussions about terminologies.

Dr. Sujansky encouraged the Subcommittee to think about fundamental differences between the layers. The first layer, the reference layer, had no overlapping content; in fact, he noted this could be a design criterion. Dr. Sujansky suggested LOINC and RxNorm were included because SNOMED didn't have good vocabulary or content for lab tests and medications. The combination might create a relatively comprehensive, non-overlapping core terminology. The second layer contained many terminologies for different purposes (e.g., nursing and emergency departments), and Dr. Sujansky noted considerable overlap in the semantic content of those terminologies, overlap that probably would and should continue in this two-layered model.

Carol Bickford said she'd been trying to figure out how to take this information back to the American Nurses Association's work group that shepherded the nursing languages recognition program. The medical model of diagnosis and procedures was clearly evident in the discussion, but Ms. Bickford noted others had a different perspective of their health care initiatives and relationships with and descriptions of their patients. Considerable work had been done to describe that, but Ms. Bickford didn't find that richness represented in the day's discussions about diagnosis, interventions, and outcomes; relationships between them; the resourcing component; the context for care delivery experiences and how the Nursing Management Minimum Data Set supported it. She cautioned that the model presented might not fit the rest of the world. Many nursing concepts were integrated into the core discussed as convergent terminology, but there was redundancy because they also appeared at the second level or farther out.

Dr. Cohn said he, too, struggled with the overall concept. Years ago when parts of SNOMED were initially put together, the aim was a patient-centric model and core. Unique patient-centric concepts were discovered because of the role nurses play in health care and Dr. Cohn agreed they deserved to be part of the core. But he noted there were many nursing code sets and a lot were more nurse- than patient-centric. Ms. Bickford pointed out that other clinicians used nursing's patient-centric interventions and outcomes. From the context of convergent terminology in three dimensions, Ms. Greenberg said none of the terminologies, including SNOMED, dealt adequately with functional status. Obviously, these were issues of importance to nursing terminologies, but Ms. Greenberg emphasized that in looking at outcomes and being patient-centric, one had to go beyond diagnoses, symptoms, and procedures.

Noting Dr. Sujansky had asked for specific terms, Dr. Steindel said there were many discussions about this at SNOMED editorial board meetings. He suggested the term "depression," which was modeled strongly from a clinical point of view. He noted that neither nurses nor public health professionals felt they were adequately represented. Dr. Sujansky agreed that there were important patient-centric terms in the nursing terminologies; some already appeared in SNOMED and other medical terminologies. He didn't believe that the two-level model excluded having and adding patient-centric terms and concepts (that today might exist only in one or more nursing terminologies) to a core terminology. Dr. Sujansky said there was nothing synonymous about core and medical; what they'd advocated and discussed wasn't mutually exclusive. He said this was true of any level-two terminology. If there was an argument to include them in a core model, they should be added. Once added, they'd be incorporated into the model and there would be no redundancy. Dr. Sujansky said he didn't know that the two- or three-level model was bad just because some terms remained that weren't in the core terminology.

Dr. Sujansky explained that his use of the model followed an approach advocated by Drs. Campbell and Spackman. There were multiple levels. One was a core level. Another level included what Dr. Spackman called limited-scope terminologies. The middle-layer core terminology might include several terminologies with non-overlapping content. A third level was for administrative terminologies that weren't necessarily clinically relevant, but were important for logistical purposes. He noted that actual information system applications could use different terminologies. Anyone could continue to use a nursing system that used one of the nursing terminologies, but mappings should exist to the core terminology, because other applications operated against the core terminology. Dr. Cohn doubted that being more specific helped at this point. He acknowledged that sometimes this was useful as a high-level conceptual abstraction, but drilling down too far one got into trouble, as they'd discovered with nursing. Dr. Cohn noted there were many issues having to do with the interfaces.
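A minimal sketch of the mapping arrangement described here, in which an application keeps using its limited-scope terminology locally while exchange and queries operate against the core; the terminology names, codes, and mappings below are invented for illustration:

```python
# Hypothetical sketch: a limited-scope (e.g., nursing) terminology stays in
# local use, with a maintained mapping to core concepts so applications built
# against the core can still consume the data. All codes here are invented.

LOCAL_TO_CORE = {
    "NURSING:0101": "CORE:4001",  # activity intolerance
    "NURSING:0102": "CORE:4002",  # impaired skin integrity
}

def to_core(local_code: str) -> str:
    """Translate a local code to its core concept, or flag a mapping gap."""
    try:
        return LOCAL_TO_CORE[local_code]
    except KeyError:
        raise KeyError(f"no maintained core mapping for {local_code}") from None

print(to_core("NURSING:0101"))  # CORE:4001
```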

Mr. Blair said he found the morning's discussion helpful. He noted a number of points of consensus and a few others with loose ends that he'd like to bring to agreement. Mr. Brown had suggested it was possible to see in the core both an information and a terminology model; Dr. Cohn had viewed Dr. Campbell's representation as a dimension that could represent an information model. Dr. Cohn said it wasn't immediately clear to him how this worked together. He could see HL-7 using a core terminology, but couldn't see ICD, SNOMED or LOINC in every field. He imagined HL-7 would want to specify use of a specific set of terminology to answer a given question in a given field, avoiding headaches around interoperability. Noting this was an outstanding issue that they had to investigate further, Dr. Cohn said there was probably thinking on how to take this and leverage it more recent than Alan Rector's perspective, which they might also ask to hear.

Remarking that determining, at least conceptually, how to get the information and terminology models working together would help with strategy and direction, Mr. Blair asked Dr. Sujansky if he could suggest several investigators to testify. Dr. Sujansky said the first thing to do was to look at the RIM and consider what it said and didn't say about terminology. His understanding at this point was that it said basically what Dr. Cohn had said: one of many terminologies could be used in a given field, so long as you identified it. That provided a measure of standardization, but didn't allow two applications unfamiliar with each other to know they could use one of these terminologies to interoperate. One way to gain a complementary relationship between the RIM and terminology efforts was to use not only HL-7 but also a core-compliant terminology for that particular field. Using both also produced a tighter, more prescriptive standard. Noting that SNOMED had 300,000 concepts, Dr. Cohn said they'd need to be more prescriptive than that to solve the problem.
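A minimal sketch contrasting the two binding styles discussed: a coded field that merely identifies which terminology it uses, versus a tighter binding that also constrains the field to a core-compliant terminology. The data structure and terminology names below are illustrative assumptions, not the RIM's actual representation:

```python
# Hypothetical sketch: a looser binding (any identified terminology is fine)
# versus a tighter, core-compliant binding for a particular coded field.

from dataclasses import dataclass

@dataclass
class CodedElement:
    code: str
    coding_system: str  # which terminology the code is drawn from
    display: str = ""

# Looser style: any terminology is acceptable so long as it is identified.
def is_identified(element: CodedElement) -> bool:
    return bool(element.code and element.coding_system)

# Tighter style: the field's binding also names the allowed terminologies.
PROBLEM_FIELD_BINDING = {"SNOMED-CT"}  # illustrative choice for this field

def conforms(element: CodedElement, allowed=PROBLEM_FIELD_BINDING) -> bool:
    return is_identified(element) and element.coding_system in allowed

problem = CodedElement("233604007", "SNOMED-CT", "Pneumonia")
print(is_identified(problem), conforms(problem))  # True True
```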

Dr. Steindel said Dr. Sujansky's description of the RIM and its use of terminology was correct. The vocabularies it specified in a given area of messaging might be a broad set, a small set, or one that HL-7 itself defined, depending on where it appeared in the message structure. The informatics community was realizing the complexity of clinical concepts and that there was a blending and continuum between terminology and information models. Dr. Steindel said Dr. Chute reached for that in talking about his web of terminologies. Dr. Spackman also grappled with it within SNOMED as he looked at concepts in the findings axis that might actually be disorders. The terminology model wasn't clean, there were overlaps between axes, and how to handle that with respect to the terminology model was part of the continuum that drove SNOMED. The introduction of the nursing terminologies into SNOMED was another blurring of the terminology and information models, because similar concepts were used differently. Dr. Steindel recommended seeking guidance on how to make recommendations for selections of terminology in the context of the patient medical record and on reconciling an information and a terminology model.

Dr. Cohn said he absolutely agreed. He noted that, by Dr. Steindel's description, the discussions hadn't changed much in the couple of years they'd grappled with this. Dr. Cohn said that although the primary requirement of the Committee's work might not be to make sure it worked well in HL-7 message format standards and transmission, it had better do a good job there. Noting the big obstacle to interoperability in HL-7 messages was terminology, Dr. Cohn proposed talking in December with HL-7 about the requirements necessary to gain interoperability in HL-7 messages. Mr. Blair agreed, but added that in his mind the business and clinical requirements that drove the development of transaction standards, which HL-7 led extremely well, were only part of the scope they had to focus on. In addition, there were applications that hadn't yet been developed because of a lack of clinically specific terminology to support them: applications for outcomes research, multifaceted clinical decision support, and the ability to have concurrent decision support. Mr. Blair said they needed to make sure that they represented emerging applications in that scope, so their framework was broad enough to accommodate them as they moved forward with message-driven terminologies.

Dr. Steindel agreed, but cautioned that they had to heed what Dr. Price had done in the National Health Service over the last decade with the Read codes. He said it was important to realize that when version three of the Read codes was developed, a convergent terminology intended to support decisions, the acceptance rate was about zero. Patient medical record systems and how they were used were an important component they needed to talk about. SNOMED realized that when it talked about developing navigational hierarchies and user-based, in addition to clinically based, scenarios.

Mr. Blair said he believed in being pragmatic and thought Dr. Steindel's observation was correct. Almost no one used television in 1950, but the standards enabled vendors to move forward and within ten years there was a new environment. In addition to staying closely focused on being pragmatic, Mr. Blair said there was an information infrastructure with terminology standards that they had to make sure was in place to support new applications that HL-7 couldn't address, at the same time they provided a framework where HL-7 could continue to advance and flourish.

Dr. Cohn said he wasn't sure anyone really disagreed, but he thought the issue was pragmatic, near-term needs versus long-term needs. He said he was uncomfortable looking too far out too quickly, because he wanted to make sure obvious near-term needs were met. Noting all the years HL-7 had worked on an information model in moving to version three, Dr. Cohn said the interoperability issue wouldn't be completely solved by version three unless the terminology was straightened out.

Dr. Cohn said the use case Dr. Sujansky mentioned provided something concrete that they should explore, not avoiding or missing these other issues. He noted the problem with asking for requirements around vaporware was that it was just that: vaporware. Everyone wanted everything until a product was actually developed. Sitting down and determining what could work and be useful and helpful for HL-7 might provide a real near-term view, because much of this was almost a use-case discussion. Mr. Blair clarified that he wasn't arguing against the use case discussion, but simply saying that shouldn't be the only way to proceed.

Dr. Cohn observed that the big problem with decision support wasn't that it didn't exist, but that it hadn't been diffused into the industry. Once again, it wasn't the next generation, but a question of what were the impediments and how to make these things move further. A lot existed that was practical. He noted that the information model was ripe for discussion in December. Dr. Cohn reported that he'd talked with HL-7 about what they saw as issues and requirements and how all this might work in a way that supported interoperability. Dr. Sujansky said it was worthwhile talking to HL-7 on that front; in a global sense, they helped represent the vendor perspective. But he noted it was important to recognize that HL-7 was focused on backward compatibility and had a different view on standardization.

Next Steps

Mr. Blair said he was pleased that Dr. Sujansky was available to help and confirmed that he had enough material from this session, plus background information, to give some guidance at the December meeting on options and approaches for coordinating and harmonizing information and terminology models. He suggested it would also be helpful if Dr. Sujansky took Dr. Campbell's construct and envisioned an additional step with it that might help the Subcommittee consider how to move forward. Dr. Cohn concurred. Dr. Fitzmaurice addressed the issue of the scope of Dr. Sujansky's work. He recommended that the Subcommittee look at how it chose to proceed over the next six to nine months and decide how Dr. Sujansky's time could best achieve the end result. Ms. Greenberg expressed appreciation for Dr. Sujansky's coming in for these two days and committed to expeditiously completing their contractual relationship. She noted the Subcommittee had to recognize that it was hard for him to commit to anything until they completed the contract. Ms. Greenberg advised that any systematic information collection (e.g., the questionnaires referenced) couldn't be done under the contract because of clearance issues; however, analysis, follow-up interviewing and gathering of information could be done. The Subcommittee agreed to discuss after the meeting the implications and how consistent the scope of work laid out for Dr. Sujansky was with the Committee's needs over the upcoming months.

Dr. Cohn said he didn't see either the December or the January meeting as necessarily related to the scope of Dr. Sujansky's work, but he thought they needed to hear soon from the health care industry to make sure they were on the right track. He didn't know yet whether that was an activity for Dr. Sujansky, except perhaps to help put together the hearing. Dr. Cohn asked Mr. Blair if he considered gathering information from PMRI terminology developers a critical aspect of this work. Mr. Blair said the Subcommittee first needed to settle on its own how to step forward, and the approach and agenda for doing so, within that discussion framework. He suggested that Dr. Sujansky come back with a synthesis of where they were now. They could then ask terminology developers and major message format standard developers for comment and critique on any problems or issues with the proposed work plan and the direction they'd take.

Dr. Sujansky noted that there was an issue about developing a work plan for his activity within and for the Subcommittee. He asked the members to consider if they wanted to develop a work plan for the next nine months, or if they preferred to proceed on a month-to-month or interim basis, achieving more clarity about the time frame and intensity of work required. He said they needed to resolve an air of uncertainty before moving forward. Mr. Blair said the Subcommittee had a general work plan, but there were other issues and they tried to maintain flexibility and accommodate them. That left uncertainty. He suggested they try to close those gaps. Dr. Sujansky noted the work plan included a fair bit of information gathering. The extent that was in scope for what he could be involved with obviously affected the work plan. All these issues needed to be resolved.

Mr. Blair said that was similar to what they did with the message format standards. But he noted Ms. Greenberg had pointed out that, from a technical standpoint, they had issues as to who did the questionnaire and how it was distributed. Dr. Sujansky would be in a position to do that analysis, but they'd have to modify the work plan. Dr. Cohn said he had no doubt that they'd utilize Dr. Sujansky over the coming months. Dr. Sujansky said he had flexibility in what those work activities were. Mr. Blair suggested resolving issues regarding the scope of Dr. Sujansky's work by e-mail over the next several days. Members and staff agreed to talk briefly about the issues after the meeting, establishing parameters for moving forward.

Dr. Cohn said they might have a brief discussion based on today's conversation about next steps at the November break out meeting and they'd spend a fair amount of time in December talking about next steps. They'd come up with a number of issues that they had to mull over, which might be a session during the hearing. They'd also need information from outside sources during the December 10-11 hearings. Dates had also been set for hearings on January 29-30, March 25-26, and May 21-22.

Dr. Cohn deferred discussion about having another hearing until specifics of the work plan evolved. He noted many sessions on December 10-11 related to reports from the DSMOs, as well as a discussion with interested parties on issues related to smoothing administrative simplification and making the process work better. Hopefully, they'd have at least half a day for testimony and discussing PMRI next steps.

Mr. Blair thanked everybody for a valuable, productive session. They'd reached consensus on a couple of issues: looking at the different categories in the discussion framework, they'd concluded that characteristics and attributes of both category one and category two were needed and offered directions that should be accommodated. They'd decided both an information and a terminology model were needed. Dr. Sujansky would provide guidance on approaches for accommodating that, and in December testifiers would give additional guidance. Dr. Sujansky pointed out that use cases were a vehicle that could keep them from veering too far from pragmatic, real issues as they homed in on their direction and strategy for going forward.

Members also concluded that other aspects needed to be represented in an expansion or next generation of Dr. Campbell's model, including how to ensure it accommodated a nursing as well as a medical perspective on patient focus, and that it probably would include more than a single convergent terminology. Dr. Sujansky would take the Subcommittee to at least the next conceptualization of what that might be.

Ms. Greenberg noted that the Committee was expected to provide effective solutions for complying with the administrative simplification standards. At the December meeting, CMS would report on its preliminary analysis of the ASCA submissions. Obviously, if everyone had to start testing by April and implementing by October, providing solutions needed to be timely. Dr. Cohn agreed. WEDI would submit its list of best practices and publishable pieces, and others were asked to submit what they considered useful. Members also expressed interest in creating a link to the Utah Health Information Network's HIPAA savings calculator. The Subcommittee identified two key issues, vendors and education; moving forward on both would be discussed in November when, hopefully, they would have more data from CMS. Time permitting, the Subcommittee would discuss results of the analysis of the data related to the compliance delays. Dr. Cohn and Mr. Blair adjourned the meeting at 12:30 p.m.


I hereby certify that, to the best of my knowledge, the foregoing summary of minutes is accurate and complete.

/s/ June 3, 2003

_________________________________________________
Chair Date