U.S. Department of Health and Human Services

State Nursing Home Quality Improvement Programs: Site Visit and Synthesis Report

Alan White, Ph.D., Barbara Manard, Ph.D., Deborah Deitz, BSN, Terry Moore, MPH, RN, Donna Hurd, MSN, Christine Landino, MSW, MPH, Jennie Harvell, M.Ed.

Abt Associates, Inc.

May 15, 2003



This report was prepared under contract #282-98-0062 between the U.S. Department of Health and Human Services (HHS), Office of Disability, Aging and Long-Term Care Policy (DALTCP) and Abt Associates. For additional information about the study, you may visit the DALTCP home page at http://aspe.hhs.gov/daltcp/home.shtml or contact the ASPE Project Officer, Jennie Harvell, at HHS/ASPE/DALTCP, Room 424E, H.H. Humphrey Building, 200 Independence Avenue, SW, Washington, DC 20201. Her e-mail address is: Jennie.Harvell@hhs.gov.



TABLE OF CONTENTS

EXECUTIVE SUMMARY
1.0 POLICY CONTEXT AND STUDY DESCRIPTION
1.1 Policy Context
1.2 Study Description
2.0 APPROACHES TO TA PROGRAMS: MOTIVATION AND PROGRAM DESIGN
2.1 Choosing the TA Route
2.2 Approaches to Program Design
3.0 TA PROGRAMS IN THE STUDY STATES: OVERVIEW AND CRITICAL DECISIONS
3.1 The TA Programs in Brief
3.2 Critical Decisions in the Design and Implementation of TA Programs
3.3 Summary
4.0 THE WIDER CONTEXT OF STATE-INITIATED QUALITY IMPROVEMENT
4.1 Public Reporting
4.2 Best Practice Dissemination Programs
4.3 Training/Joint Training Programs
4.4 Facility Recognition Programs
5.0 FUNDING MECHANISMS FOR QUALITY IMPROVEMENT PROGRAMS
5.1 Federal Funding for Survey, Certification, and Enforcement
5.2 Federal Funding Sources for TA and Other Quality Improvement Programs Being Used by Study States
5.3 Potential Funding Sources Not Being Used by Study States
5.4 Funding Sources Used for Identified Quality Improvement Programs
5.5 Proposed Legislation Affecting Funding for Quality Improvement Programs
6.0 EFFECTIVENESS OF TECHNICAL ASSISTANCE PROGRAMS
6.1 Previous Studies of the Impact of Nursing Home Quality Improvement Programs
6.2 Formal Assessment of TA Impact Among Study States
6.3 Informal Assessment of TA Impact Among Study States
7.0 SUGGESTIONS FROM STUDY STATES TO OTHERS CONSIDERING QUALITY IMPROVEMENT PROGRAMS
7.1 TA Programs
7.2 Other Quality Improvement Initiatives
8.0 SUGGESTIONS FROM STUDY STATES TO THE FEDERAL GOVERNMENT
8.1 Federal Program Provisions
8.2 Other Suggestions
9.0 CONCLUSIONS
9.1 Technical Assistance Programs
9.2 Other State Quality Improvement Initiatives
NOTES
REFERENCES
APPENDIX A. State Reports (separate file)
APPENDIX B. Public Reporting Systems (separate file)
APPENDIX C. Technical Assistance Programs (separate file)
APPENDIX D. Best Practices (separate file)
APPENDIX E. Provider Training Programs (separate file)
APPENDIX F. Facility Recognition Programs (separate file)
APPENDIX G. Funding of Quality Improvement Programs (separate file)


EXECUTIVE SUMMARY

Nursing home quality continues to be a major policy concern for both state and federal policymakers. In response to this concern, some states are using consultative, collaborative technical assistance (TA) programs in an effort to improve nursing home quality, in addition to the traditional regulatory approach embedded in the survey and enforcement process. As part of these TA programs, states provide on-site consultation, training, and/or sharing of best practices in an effort to improve nursing home quality of care. These state-initiated technical assistance programs are one way that states can meet facility needs for assistance in improving nursing home quality while continuing the adversarial regulatory focus inherent in the survey and certification process.

The purpose of this study is to inform state and federal policymakers about state-initiated quality improvement programs, with the particular goal of providing information to states that may wish to develop similar programs in their state. We focus primarily on activities under way in seven states--Florida, Iowa, Maine, Maryland, Missouri, Texas, and Washington. Our information is based on in-person and telephone discussions with key stakeholders in each state.

It was not the intent of the study to evaluate how effective the state-initiated quality improvement programs we reviewed have been in improving quality of care. For several reasons, it was not possible to draw definitive conclusions about the effectiveness of these programs. First, most programs have been operating for only a short period. Second, in most states several different types of quality improvement programs were introduced at around the same time, and it is not possible to measure the impact of individual programs. Third, and most fundamental, with the potential exception of Texas, none of the programs that we reviewed are collecting the type of evaluation information necessary for a rigorous impact analysis. Even so, some important lessons can be learned from these states that are applicable to other states considering quality improvement programs.

Key Decisions Regarding Quality TA Program Design

The design and focus of TA programs vary across states, but the programs share several defining characteristics. First, TA program staff provide on-site consultation, training, and/or sharing of best practices with nursing facility staff. Second, the programs emphasize a collaborative approach between facilities and the TA staff, which often contrasts with the relationship between facilities and long-term care (LTC) surveyors. Third, the programs are non-punitive, and results from the visit are typically not shared with the survey and certification agency unless serious violations are observed.

The circumstances leading to a particular state's decision to implement a TA program were unique to that state. But underlying the decision process in every study state was the same catalyst--a widespread desire to "try something new," to provide a positive stimulus to quality improvement in addition to the potentially more adversarial LTC survey process. In reviewing the quality improvement programs in our study states, we identified a series of key decisions that shaped the way these programs operate and could influence their probable impact.

The Primary Goal of Quality Improvement Programs: Promoting Regulatory Compliance or Improving Nursing Home Care Practices

While all of the programs that we studied had the common underlying goal of improving quality of care, they differed in whether they pursued this goal primarily by improving the care furnished by nursing homes or by promoting regulatory compliance. This choice of program focus is the most fundamental choice a state must make in designing its quality improvement program, as it heavily influences other key program design decisions.

The TA programs in Maine, Maryland, Missouri and Texas have a direct focus on improving nursing home care practices, for example by providing facilities with clinical practice care guidelines or training in how to care for residents with particular conditions. Maine's program has the narrowest focus, dealing only with particular nursing home residents with behavior problems. The Texas program also has a narrow scope, focusing on three issues (restraints, nutrition, and toileting) that were previously identified as key issues for the state. The goal of the Missouri TA program--improvement in quality indicators--is broader. The TA program in Maryland also has a broader focus that includes quality assurance, technical assistance, and sharing of best practices.

Underlying the choice of program focus in these states was a general belief that regulatory compliance, while important, was separate from quality improvement, and that compliance with survey and certification requirements would not necessarily ensure that facilities are furnishing high quality care. These states believe that tying quality improvement activities to the LTC survey conflicts with the fundamental aim of their TA program--to help facilities understand the principles and practice of quality care in a non-adversarial atmosphere. Many of the programs with this focus have been able to build collaborative relationships with facilities that may serve as the foundation for more honest communication and, therefore, potentially more productive information exchange. Throughout the rest of this paper, we refer to state programs using this model as TA programs with a focus on nursing home care practices.

One goal of the TA programs in Florida and Washington is to inform facilities of potential regulatory compliance and enforcement issues, enhancing facility compliance with survey and certification requirements. The Washington TA program emphasizes facility compliance with survey and certification requirements. Florida's quality monitors combine a care practice and regulatory focus--they will note areas where the facility could be cited, but they also cover care issues. Underlying the choice of program focus in these states was a belief that an emphasis on monitoring and enforcement is the best way to improve quality. This focus, in effect, increases the number of times the survey agency is evaluating facility performance, giving the state greater knowledge of facility operations. Providers in these states said they found these programs valuable. We refer to state programs using this model as TA programs with a focus on promoting regulatory compliance.

Content of TA--Technical Assistance and Training

States electing to design a TA program focused primarily on improving nursing home care practices varied with respect to the information sources used during the TA visit. One state uses evidence-based practice guidelines exclusively. However, the more usual practice is for TA staff to use a variety of sources, typically recognized reference material, with varying degrees of freedom for staff to use examples from their own experience. In some states, best practices are obtained from facilities that represent their experiences to be "best practices." Some stakeholders expressed concern that the latter approach does not always represent exemplary care and that superior facilities may not share information on their care practices, assuming that what they do in their facility is "normal" care delivery.

In addition, all of the study states include informal provider education during facility visits, and all but one include some type of formalized training. Discussants reported that training sessions are usually well received and well attended. In most states, training topics are chosen by identifying the areas where providers are having the most difficulty, as determined by survey and certification or TA staff. Two states provide joint training to providers and surveyors. Participants said that there is some resistance to joint training from both providers and survey staff. However, some also said that this training is valuable because (a) providers and surveyors receive the same information, and (b) though stressful, such sharing may ultimately improve provider-surveyor relations.

Mandatory or Voluntary TA Programs

Most of the TA programs in the study are mandatory. Maine and Missouri, the two states with voluntary programs, chose that route to encourage provider trust. The major concern with a voluntary approach is that the facilities that most need help may be the ones that choose not to participate. It is not coincidental that the two voluntary programs are focused on improvement through consultation rather than regulation. An emphasis on compliance is obviously not well served by a program that allows facilities to determine when, and even if, they are visited.

Structure and Length of the TA Visit

States vary with respect to the nature of the information shared during TA visits. An emphasis of the programs in Florida, Maryland, Missouri, and Texas is the sharing of best practices. In Maryland, Missouri, and Texas, this includes best practices based on clinical guidelines. In Florida the information that tends to be shared deals with care practices observed at other facilities. In Maine, the focus is on care plans for individual residents, and information on best practices is typically not shared. Washington TA staff avoid sharing information on best practices with facilities, instead encouraging facilities to network with one another to share best practices.

The length of the TA visit varied greatly. Visit length in Maine and Missouri, the two states with voluntary programs, tended to be shorter than visits in other states, typically lasting between 2 and 4 hours. In Maryland, which had the longest visit length, TA visits last for two days, with the TA program consisting of a legislatively mandated facility survey--called the "Second Survey" to distinguish it from the federally required certification survey.

Relationship Between the Technical Assistance and Survey Programs

The design and operation of state-initiated technical assistance programs depends, in part, on the relationship between the TA and survey programs and staff. States differed with respect to how closely the two programs are linked, whether TA staff also function as surveyors, whether findings from TA visits are reported to the survey agency, and whether TA staff are required to have surveyor training.

A close relationship between TA and survey programs is more important in states that have a program that is focused primarily on regulatory issues. In states where the TA program is closely linked to identifying compliance issues, surveyor training of program staff is an obvious asset. TA staff who also function as surveyors (i.e., have dual roles) can be perceived as having greater authority and more regulatory knowledge and, for these reasons, may be better able to effect positive changes in resident care. Regulatory information given by TA staff who also function as surveyors may be more consistent with survey findings.

However, there are some potential negative implications of a dual role for TA staff. The dual role has led to the diversion of TA staff to survey functions, reducing the frequency of TA visits. Some stakeholders also noted that closer relationships between the survey agency and TA programs can give rise to provider concerns about the extent to which information provided to the TA staff is shared with, and potentially acted on by, the survey and certification staff. This may inhibit honest and open assessment of programs and, thus, limit innovative ideas to improve quality. Keeping the findings from TA visits confidential may help achieve a more open and honest relationship with facilities.

In states where TA staff do not perform survey tasks but are required to have survey experience, some discussants commented that it was often hard for former surveyors to "change hats" from a regulatory and enforcement approach to an emphasis on facility care practices.

In states with TA programs that have no link to the survey agency, some providers said it was troublesome when TA staff cannot provide interpretive regulatory guidance and when advice given by TA staff is inconsistent with surveyor findings.

Other Quality Improvement Programs

In none of the study states was a TA program instituted in a vacuum, but along with a variety of other quality improvement initiatives. Most of these fall into one of two types:

Funding for Quality Improvement

Federal law makes federal funding available for certain quality improvement activities, and states avail themselves of these funds for quality improvement activities related to training and facility recognition. The study states, however, make limited use of federal funds for their technical assistance programs. States typically fund their technical assistance activities out of general revenue funds, often supplemented by the state portion of civil monetary penalty (CMP) funds or fees levied on facilities. Some states explained that there were "too many strings attached" to use federal funding for these TA activities.

Pending before Congress are two legislative proposals that, if passed, would fund state-initiated quality improvement efforts--the Nursing Home Staffing and Quality Improvement Act of 2001, and the Medicare and Medicaid Nursing Facility Quality Improvement Act of 2002. The Nursing Home Staffing and Quality Improvement Act is aimed at promoting staff recruitment and retention and improving nursing home quality of care. The Medicare and Medicaid Nursing Facility Quality Improvement Act of 2002 would permit alternatives to the federal survey and certification process for nursing facilities in up to eight states and includes language that would allow survey and certification staff to provide TA to facilities.

Lessons Learned

Feedback from those stakeholders with whom we spoke in the states we visited indicates a significant interest in and desire for TA and other collaborative programs. Many nursing facility staff seem to value the opportunity to have an open dialogue with TA staff about problems and issues in resident care, to obtain information on good clinical practices, and to receive training and feedback on how they can improve their care processes. A few stakeholders reported problems when TA advice conflicted with what surveyors told the facility, but these appear to be isolated instances. There are, as noted, many differences across the study states in the design and goals of their TA programs. But several clear lessons emerge.

Defining the Relationship Between TA and the Survey Program is a Critical Decision Point

The principal reasons for choosing whether the TA program should emphasize improving care practices or promoting regulatory compliance appear to be primarily related to the philosophy of the state and the availability of federal funding. In states where the relationship between the technical assistance and survey programs is close, programs tend to focus their TA less on facility care practices and more on regulatory and compliance issues. While many facilities welcome this type of assistance, in states where the TA has a regulatory focus, the distinction between the two programs tends to become blurred. This may affect the types of information that facilities are willing to share with TA staff, which may reduce the ability of the program to impact nursing facility care practices. During the period when a new TA program is being implemented, a clear separation between the TA program and the survey process was perceived to be particularly important.

Non-Mandatory TA Programs May Not Reach Facilities Most in Need of Help

A problem with implementing a voluntary TA program is that the facilities most in need of help may decline the assistance. Study participants reported that the facilities with the lowest quality are often the ones that do not participate in TA programs. These facilities may not benefit from programs with mandatory participation either, however, given that they may be too overwhelmed by trying to comply with requirements to be able to participate in quality improvement initiatives.

The facilities that do participate in voluntary programs are likely to be those that want to improve their care practices based on what they learn during the TA visit. A non-mandatory program may be the only option for some states with budget limitations that allow for only a small program that cannot reach every facility.

Evaluation Needs to be Part of the Initial Program Design

As noted, evaluating how well the TA programs work at improving the quality of care will be particularly difficult. Of particular concern from an evaluation perspective is the simultaneous statewide implementation of several quality improvement programs. It is understandable that states have many ideas about ways to improve nursing home quality and a desire to try new programs. But states planning to implement TA or other quality improvement programs should consider the potential need for evaluation--which program funders are increasingly demanding in the current fiscal environment--and design their programs so that their evaluation needs can be met.


1.0 POLICY CONTEXT AND STUDY DESCRIPTION

The quality of nursing home care is a major concern for state and federal policymakers, and regulators as well as consumers and industry representatives. This concern has prompted many public policy initiatives intended to improve the quality of care.

1.1 Policy Context

The traditional approach to ensuring adequate quality of nursing home care is regulatory--through the long-term care (LTC) survey and certification process. The Omnibus Budget Reconciliation Act (OBRA) of 1987 strengthened federal LTC survey and enforcement requirements, establishing a set of minimum standards that nursing homes must meet in order to gain (and retain) Medicare and Medicaid certification. The Centers for Medicare and Medicaid Services (CMS), formerly the Health Care Financing Administration, contracts with state survey agencies to monitor compliance with these standards through annual facility surveys, and states are primarily responsible for regulating the quality of nursing homes. The Federal Government pays 100 percent of the costs of Medicare skilled nursing facility surveys and 75 percent of the costs of Medicaid nursing facility surveys.

Despite the survey process, quality of care in nursing homes continues to be a concern, and the effectiveness of the survey process continues to be debated.1 Enforcement regulations have been criticized by providers and consumer advocates alike as either too stringent or not stringent enough. Many critics say the problem is the lack of consistency in how the survey, certification, and enforcement processes are implemented--that wide intra- and inter-state variation exists in the number and type of deficiencies issued, scope and severity ratings assigned, and penalties imposed.2

Some states have established programs to improve nursing home quality through information and guidance to nursing homes on ways to improve quality of care--both generally and in relation to a facility's particular problems. In some states, these programs are intended to "raise the bar" by providing technical assistance to facilities so that they can perform at levels that exceed regulatory standards.

Similarly, the Federal Government has recently implemented nursing home quality improvement programs provided by the Quality Improvement Organizations (QIOs, formerly known as Peer Review Organizations) under contract to CMS. The CMS effort also includes a public reporting component. As of November 2002, CMS made available, through the QIOs, technical assistance to nursing homes in all states and began posting quality measures for nursing homes, in addition to other facility-level information, for nursing facilities nationwide through the Nursing Home Compare website (http://www.medicare.gov/NHCompare/Home.asp).

The impetus for this recent federal initiative is similar to that of some of the states-- to stimulate the nursing facilities to improve performance through the provision of technical assistance and to furnish consumers with comparative information with which to make an informed choice about initial or continued residence in a given facility. How these federal nursing home quality improvement efforts will interact with state TA programs has not yet been determined.

1.2 Study Description

The purpose of this study is to inform state and federal policymakers about the characteristics, objectives, and implementation of the quality improvement programs states have implemented. A particular study goal is to provide information to states that may wish to develop such programs in their state.

Originally, the study was to focus solely on Technical Assistance (TA) programs that provide on-site consultation, training, and/or sharing of best practices with nursing facility staff. Eight states (Florida, Maryland, Maine, Michigan, Missouri, Texas, Virginia, and Washington) currently have active TA programs.3 The design and focus of these TA programs vary across states, but they share several defining characteristics: program staff provide on-site consultation, training, and/or sharing of best practices with nursing facility staff; the programs emphasize a collaborative, non-punitive approach; and findings from TA visits are typically not shared with the survey and certification agency unless serious violations are observed.

Our study focus expanded, however, as our research revealed state-initiated quality improvement initiatives in addition to TA. In addition to providing TA, some states also train nursing home providers on compliance with regulations and other topics, and make information available to consumers through public reporting of information.

To select states to be included in this study, we collected basic information about the quality improvement programs in states through a combination of discussions with stakeholders and a review of relevant written information. The study focused on a group of states that had state-initiated quality improvement programs that included aspects of technical assistance and that were not reimbursement or payment related. The states we ultimately selected were Florida, Iowa, Maine, Maryland, Missouri, Texas, and Washington.4 All except Iowa have formal TA programs in place. Iowa was added because it had particularly interesting other quality improvement initiatives.5

Our data are from structured discussions with key stakeholders in each study state. Key representatives from the state agency responsible for the quality improvement programs were contacted to arrange face-to-face meetings with stakeholders. Participants in these discussions included state Survey and Certification Agency Directors and staff; Directors of Quality Improvement Projects and staff; state Medicaid Agency Directors; representative(s) of for-profit and not-for-profit nursing home associations; nursing home providers; and consumer advocacy representatives and the state's long-term care Ombudsmen. Most discussions lasted about two hours. Our research team encouraged the organization, agency, or nursing facility involved to include as many of their staff as they thought would be interested or have valuable information to share. In several states, the research team was able to observe a portion of a TA survey visit on site. Typically at least two researchers participated in each site visit--one researcher would guide the discussion; the other would take notes on participants' responses.

The discussions focused on the following topics:

Appendix A contains summary reports documenting each state visit.

We found a range of philosophical influences combining to shape quality improvement efforts in particular states. Major influences include state legislatures, personal involvement of individual state legislators in long-term care issues, campaigning by consumer advocacy organizations, complaints from the industry about "over-regulation" by both state and Federal Governments, and a considerable body of research documenting the inadequacy of care delivered to residents of U.S. nursing facilities.6 These issues are often interrelated--an interrelation that serves as the catalyst for a state's decision to embark on its own quality initiative.


2.0 APPROACHES TO TA PROGRAMS: MOTIVATION AND PROGRAM DESIGN

We were interested in two particular questions related to program design: (1) the motivation for states to implement a TA program rather than some other type of quality improvement initiative; and (2) the extent to which states used a formalized design approach to guide development of their TA programs.

2.1 Choosing the TA Route

Although each state had its own set of reasons for designing and implementing its particular quality improvement programs, a similar driving force seemed typically to be behind the decision to implement a TA program--dissatisfaction with the survey process--stimulating a desire to "try something new" or focus attention on quality in a way other than regulation. This was particularly true for states with a TA program focused on improving care practices, and the cases of Missouri and Maryland illustrate this point.

The impetus in Missouri came from a set of pilot tests run in 1999 to study the impact of using advanced practice nurses to improve resident outcomes through technical assistance. This research showed that providing feedback on quality through reports and education was insufficient to improve clinical practices and resident outcomes.7 It found, further, that a stronger intervention of expert clinical consultation coupled with comparative feedback was needed to improve resident outcomes. Missouri also noted that TA visits were beneficial because they (1) recognize that facility staff are stretched to the limit, making it difficult for them to keep current on the latest clinical information; and (2) provide support to facility staff who want to do a good job, but need some ideas and encouragement (see Appendix A for more details on the Missouri TA program).

The impetus for Maryland's quality improvement programs, enacted in 2000, was a series of events and activities both within and outside the state over the preceding ten years. In 1989, the media reported on deplorable conditions in a Maryland nursing facility; subsequent scandals and multiple nursing facility closures over the next three years precipitated a 1999 General Accounting Office (GAO) study that found the complaint investigation process was unacceptably slow (the GAO made similar findings in other states). In 1999, the negative personal experiences of several influential state senators with Maryland nursing homes, along with damaging testimony before the state legislature by Maryland Department of Health and Mental Hygiene/Office of Health Care Quality (OHCQ) staff on the issue of complaints, were influential in leading the legislature to tie passage of a nursing home funding bill to the creation of a Nursing Home Task Force to study quality and oversight in Maryland.

The Task Force began meeting during the summer of 1999 and presented its recommendations in January 2000. In May 2000, a broad Nursing Home Reform Package was enacted in Maryland that did not focus simply on strengthening regulations and sanctions, but also included provisions specifically addressing quality improvement, such as the addition of a technical assistance program through a required "Second Survey".

2.2 Approaches to Program Design

The literature on quality improvement strategies includes several potential design frameworks or paradigms for use in designing an effective quality improvement program.8 While differing in detail, all include a series of logical steps to (1) assess or identify the nursing facility quality problem at hand; (2) evaluate or analyze the issue in order to determine the best approach to resolving it; (3) create a plan for implementing the program design or activity intended to improve the problem; (4) define the interaction between TA staff and the survey agency; and (5) evaluate whether the intervention as designed and implemented actually resulted in quality improvement.

In an effort to categorize the quality improvement programs in the study states, we looked at the extent to which each program had been developed with this general sequence of steps in mind. We found only two states (Texas and Missouri) that had followed such a strategy in full, with rigorous program designs that included an evaluation component. Other state programs were developed through an essentially ad hoc process.9


3.0 TA PROGRAMS IN THE STUDY STATES: OVERVIEW AND CRITICAL DECISIONS

This chapter provides brief overviews of the six technical assistance programs we studied, and the critical decisions program designers and implementers must make. Chapter 4 places these TA programs within the wider context of state quality improvement programs more generally.

3.1 The TA Programs in Brief

All of the technical assistance programs we reviewed, with the exception of programs in Washington and Maine, have been in existence for less than two years. It is important to keep in mind that the relatively short life of these programs, combined with the fact that many of them were introduced at the same time as other quality improvement initiatives, limits our ability to draw firm conclusions about how program characteristics relate to quality of care outcomes.

Florida (Quality of Care Monitoring Program): The Quality of Care Monitoring Program was established in 2000, and is part of and administered by the Florida Agency for Health Care Administration (AHCA). AHCA also includes the state survey and certification agency. The Quality of Care Monitoring Program was designed to create "a positive partnership between the state regulatory agency and nursing homes and ultimately yield improved quality of care to residents." Technical assistance is provided by Quality Monitors who make quarterly, mostly unannounced, visits to facilities, and offer educational resources and performance intervention models designed to improve care. Quality Monitors also interpret and clarify state and federal rules and regulations governing nursing facilities, and seek to identify conditions that are potentially detrimental to the health, safety, and welfare of nursing home residents. The role of the monitors has expanded since the program was first implemented, to include a number of more regulatory-related processes. Quality Monitor staff now review compliance with minimum staffing and risk management requirements; preside over facility closures; and train new surveyors. Funding for the Florida technical assistance program is split between state general revenues and a portion of punitive damage awards that are set aside to improve nursing home quality.

Maryland (State Technical Assistance Unit--Quality Assurance Survey): The State Technical Assistance Unit was established in 2000 to monitor compliance efforts and provide information about best practices. The unit performs a required, unannounced, annual Quality Assurance Survey (the so-called "Second Survey") at each Maryland nursing facility. The Quality Assurance Survey Unit Team, which is separate from and independent of the survey staff, consists of five nurses, one dietician, and a manager. The Second Survey is intended to be collegial and consultative rather than punitive, and its separation from the survey and certification process is intended to preserve confidentiality. Funding for the Maryland Quality Assurance Survey is obtained from state general revenues.

Washington (Quality Assurance Nurses): The Washington state Quality Assurance Nurse (QAN) program has been in operation since the late 1980s. QAN visits are made to all nursing homes in the state. In addition to providing technical assistance (or "information transfer," as the state calls it), 31 nurses conduct reviews of MDS accuracy; operate as surveyors, both conducting regular surveys and occasionally serving as complaint investigators; conduct discharge reviews to determine if resident rights are maintained when residents are discharged/transferred; and serve as monitors of facilities in compliance trouble. Washington is unique in that it is the only state that has implemented a nursing home technical assistance program as part of its Medicaid "medical and utilization review or quality review" program (for further discussion of this financing mechanism see Chapter 5). Under this funding authority the state receives a 75 percent federal match rate.

Maine (Consultant Nurse for Problem Behavior Residents): The technical assistance program in Maine is the smallest program in our study. In existence since 1994, the program in Maine consists of a single nurse, who provides statewide consultation and educational in-services to any facility on problem resident behaviors. The goals of the program are to (1) help facilities provide better services and reduce the risk of abuse and neglect, especially for those residents with problem behaviors who are more at risk; and (2) reduce the number of residents discharged because a facility cannot deal with their behavior. Maine financially supports the Consultant Nurse program by drawing on funds from fines collected through the imposition of civil money penalties (CMPs).

Missouri (Quality Improvement Program for Missouri): The Quality Improvement Program for Missouri was developed, and is implemented and operated, by the University of Missouri-Columbia Sinclair School of Nursing. The location of the Quality Improvement Program at the University of Missouri supports and underscores the independence of the program from the State Survey Agency. The Quality Improvement Program has seven nurses who provide confidential consultation to assist nursing homes with their quality improvement programs. The Quality Improvement Program is not mandatory. Since the program began in 2000, 45 percent of the nursing homes in Missouri have elected to receive this assistance. Funding for the program comes from the Missouri Department of Health and Senior Services and is financed through a combination of nursing home bed taxes, annual licensing fees, and fines collected through CMPs.

Texas (Quality Monitoring Program): The Quality Monitoring Program in Texas was implemented in April 2002 and is mandatory for all nursing homes. The Quality Monitoring team includes registered nurses, pharmacists, and nutritionists, who conduct unannounced and unsolicited visits to facilities. Quality monitoring visits are scheduled based on a determination of the level of risk at each facility. Quality Monitors conduct individual resident and facility-level reviews to assess the quality and appropriateness of care in selected areas (e.g., restraint use, incontinence care, and toileting plans). The Texas Quality Monitoring Program is unique in that it has developed evidence-based protocols for quality improvement. Within the Quality Monitoring program, there is also a rapid response team, made up of one or more quality monitors. The Rapid Response Teams sometimes make unannounced visits to facilities that have been identified as being particularly problematic. They also visit facilities that request their assistance. The funding for the first two years of the Texas Quality Monitoring program was $2.7 million, with the program funded with 50 percent state funds and 50 percent federal funds.10 In order to fund its share of this program, the State transferred 50 FTEs from the survey program to this new program. As part of the legislation that established the Quality Monitoring program, an additional 32 FTEs were transferred from actual survey work to other components of the state's Quality Outreach Program, including the state's Rapid Response teams, provider education, and liaison with providers.

Table 1 provides more detail on these state TA programs. Additional details on the programs in each study state can be found in Appendix A.

3.2 Critical Decisions in the Design and Implementation of TA Programs

States have a series of critical decisions to make as they develop and implement a TA-type program to improve nursing facility quality of care. Our discussion here reviews how our study states made these decisions. In so doing we highlight the range of choices the study states made and the implications of those choices for program operation, focus, and likely impact.

Program Focus: Improving Care Practice or Regulatory Compliance

The focus of a state's TA program is a fundamental choice that influences all the subsequent program design decisions. States tended to choose one of two directions. One group of states created programs focused on direct promotion of quality improvement through efforts to assist facilities in improving their care practices. In the other group of states, the TA programs promoted quality through an emphasis on monitoring compliance with survey and certification requirements. Programs in this second group of states do offer technical assistance to facilities on quality-related issues beyond the scope of the survey and do not have the punitive aspects of the survey process. However, they tend to focus more on monitoring care and regulatory compliance than on helping facilities to improve their care processes.

The distinction between the foci of the two groups of states was conspicuous, and state representatives, providers, and consumer advocates talked extensively about the orientation of the TA program. Although not explicitly stated by any of the stakeholders with whom we spoke, several statements taken together made it clear that some states believe that emphasizing monitoring and enforcement of survey requirements can and does raise the level of care quality. For example, in Washington, a state with a TA program that emphasizes regulatory compliance, virtually all of those with whom we spoke--state personnel, providers and consumer representatives--reported that one of the best things about the state's QAN program was its close ties to the survey. These stakeholders expressed a belief that TA programs should emphasize regulatory compliance, and be linked with survey activities and staff. Other states viewed such linkages as conflicting with what they saw as the primary aim of the TA program--providing an alternative to the survey process. In states that focused on improving care practices, the belief was that when the focus was on improving quality of care for residents, regulatory compliance would logically follow (rather than the other way around).

Programs with a Focus on Directly Improving Care Practices

The majority of our study states (Maine, Maryland, Missouri, and Texas) have chosen to focus their TA programs directly on helping nursing facilities to improve their care practices, using an approach that is separate from the LTC survey process.

Programs with a More Regulatory Focus

The focus of the TA programs in Washington and Florida is more on promoting regulatory compliance.

While the primary focus of the types of programs (i.e., those with a focus on improving care practices vs. those with a focus on promoting regulatory compliance) is clear, there is a certain overlap between these two types of programs. For example:

Relationship Between the Survey and TA Programs

Close Ties

In two of our study states we found a close relationship between the TA program and the state survey agency.

"Relative" Independence

In some of the study states there was relatively more independence or separation between the TA program and the state survey agency.

Total Separation

In our study states, Missouri was the only one in which there was total separation between the TA program and the State Survey Agency.

Reporting of Findings from TA Visits to the Survey Agency

Study states fell into two groups here. In more than half of them (Florida, Maryland, Missouri, and Texas), TA findings are not formally reported to long-term care survey staff. Not surprisingly, the states that have steered clear of regulatory-based TA fall into this group.

No Formal Reporting to the Survey Agency

Maryland TA staff do not share findings with the State Survey Agency unless very serious violations are identified (i.e., situations where conditions in the facility are causing residents actual harm or placing them in immediate jeopardy). At the time of our visit, TA staff reported this had happened only once. The regular process when violations are identified during a TA visit is to have the Quality Assurance team bring these to the attention of the nursing home staff and require a plan of correction.

In Missouri, TA visits are also confidential (except in the rare cases of immediate jeopardy or actual harm to residents). No details are reported to the survey agency (not even which facilities were visited). State law mandates that the TA nurses report any situations where there is actual harm or immediate jeopardy: they must inform the facility about the issue of concern and then contact the LTC survey agency to discuss it. TA staff report that such a situation has never come up.

In Florida, TA staff do not share information gathered during the TA visit with surveyors, but they will bring concerns about facilities that are performing poorly to their supervisors within the state survey office, as well as report on non-compliance related to staffing and risk management. TA staff are advised to call the state hotline to report instances of immediate jeopardy.

Formal Reporting to the LTC Survey

In Maine, copies of the TA reports go to the TA supervisor (who works in the survey office) and are available to surveyors. In Washington, TA staff report all serious violations to, and share all findings with, survey staff. In Texas, Quality Monitor reports are available over the intranet to surveyors and are reviewed as part of preparation for surveys.

Requiring TA Staff to have Surveyor Training

States span the spectrum on the issue of whether TA staff should have surveyor training. In Maryland and Washington, TA staff are required to have surveyor training, while Maine and Missouri have purposely chosen not to hire surveyors. In Florida and Texas, surveyor training is not required but some TA staff who were previously surveyors have been hired as part of the quality improvement program.

States Requiring Surveyor Training

Some states use TA staff who have survey expertise, surveyor training, or both.

States Not Requiring Surveyor Training

Some states do not require that TA staff have survey expertise or surveyor training.

Facility Participation in TA Programs

Mandatory Programs

In all our study states except Maine and Missouri, TA initiatives were mandatory for all Medicare and Medicaid certified long-term care facilities in the state. This decision is legislatively imposed in some states, such as Florida. In other states, such as Washington, the mandate is part of state utilization review requirements, which necessarily apply to all Medicaid facilities but not to Medicare-only facilities. In Maryland, there is no legislation specifically mandating a quality-related survey, but state regulations require two annual surveys to be performed for each facility, and the state has chosen to focus its "Second Survey" on quality improvement activities that include technical assistance and sharing of best practices.

The frequency of TA visits in states with mandatory programs varies. In Maryland, TA visits are performed yearly at each facility. In Texas, all facilities have at least one TA visit per year, with additional visits prioritized to target those considered likely to be at risk for a poor survey, based on factors such as quality indicator data and previous survey results. Facilities can also request a site visit if they need guidance about an area of care. Florida also ties the frequency of visits to quality concerns. Florida's original legislation was similar to that of Texas, calling for annual TA visits to all facilities, with more frequent visits to troubled facilities. Current legislation mandates quarterly visits to all facilities and continues the policy of providing additional visits to poorly performing facilities. In Washington state, Quality Assurance Nurses are required by regulation to visit each Medicaid nursing facility at least quarterly.

Voluntary Programs

The two states with voluntary TA programs in the study are Maine and Missouri. These programs pursue quality improvement through consultation aimed at helping facilities improve their care practices, rather than through regulatory compliance.

In Missouri, TA visits are provided by nurses employed by the University of Missouri and are voluntary, confidential, and consultative. The consultative focus allows TA nurses to emphasize standards of care and to work with facility staff on improvement efforts that are specific to their facility and resident needs. In 2001, there were 459 site visits in 212 different facilities. This included 164 nursing homes, 20 intermediate care facilities, and 85 residential care facilities (note that some facilities fell into multiple categories). Since the program began in mid-2000, about 270 of the 600 (45 percent) nursing facilities in Missouri have participated in the TA program.12

Missouri's QIPMO (Quality Improvement Program for Missouri) encourages facility participation through the efforts of the staff to publicize the program. The TA staff in Missouri believe that their involvement in support group activities helps increase provider awareness of and interest in the TA program. TA staff coordinate and facilitate monthly MDS Coordinator support group meetings. These meetings aim to (1) improve MDS coding accuracy, (2) enhance job satisfaction for MDS Coordinators, and (3) increase overall staff retention rates. In addition, the program receives referrals from surveyors.

Maine's TA program provides behavioral consultation statewide to any long-term care facility upon request. Its focus is on improving resident outcomes through a combination of consultative and educational support. There are 126 nursing facilities in Maine, with 7,309 residents reported as of Spring 2001. Maine's TA nurse reports visiting 181 residents from July 2000 through June 2001, and 169 residents from July 2001 through June 2002. No records have been kept to indicate the number of facilities that have been visited.

In Maine, nursing home providers appreciate that the TA is free, that it is not connected to the LTC survey, and involves all facility staff in the process. Some referrals come through the Ombudsman caseworker, who contacts the TA nurse directly or suggests that the facility contact her. But the majority of referrals come from facilities themselves. The TA nurse describes the goals of her services as "to assist staff in dealing more effectively with difficult behaviors by giving them a better understanding of the resident and why the behaviors are occurring, making recommendations, involving them in team problem solving where their input is valued, and providing them the education that will enable them to do their jobs more effectively and safely--as well as improving quality of care and ultimately quality of life for the resident."13 She prioritizes responses to facility requests based on the severity of the problem. Visits are generally made within two weeks of the request.

Focus of TA Visits

The focus of TA visits varied across states.

The Nature of the TA Intervention

Dissemination of Best Practices

"Best practices" as applied to nursing facilities is a general term that refers to a range of activities centered on identifying excellence in clinical practice. The methods by which the study states identify best practices and disseminate this information, and the audience for whom they are intended, vary significantly.

Study states varied in what they described as best practices--in how best practices are defined, where they originate, and how these practices are used by the state's other quality improvement programs. Some states define a best practice as an expert-derived protocol that should be adopted by facilities to raise standards of practice. Others define a best practice as an innovative idea originating at the facility level that is seen as potentially valuable to other facilities. Still other states use both definitions. Examples of best practice protocols disseminated by study states are included in Appendix C.

In Texas, a panel of academic, clinical, and medical experts was used to develop evidence-based clinical practice guidelines that are a core feature of the Quality Monitoring Program. The initial focus has been limited to a small number of areas (e.g., restraint use, incontinence care, hydration). The intent is for the assistance provided by TA staff to reflect the consensus of pooled experts, not the opinion of an individual TA nurse or the survey agency. Quality Monitors provide information regarding best practices and how to achieve them, give feedback to facilities regarding the degree to which the facility is providing care consistent with the best practice protocols, and help the facility identify system changes that could result in greater use of best practices. The best practices are also posted on the QM website (described in more detail in Section 4.2).

TA staff in Maryland, Florida, Washington, and Missouri also disseminate best practice information. In these states, however, this consists of information that the TA staff has collected from personal reading, interactions with other facilities, and personal networking. None of the information has been formally endorsed by the state or collected together and posted in a single location.

TA Staff Composition

Florida, Missouri, Washington, and Maine require their TA staff to be registered nurses (though not necessarily experienced in long-term care). Only the Texas and Maryland TA teams mirror the composition of the survey teams, which include other disciplines as well as nursing.

Visit Structure

The structure of the TA visits varies widely across states and, in some states, across geographic regions within a state. The latter is true of Maryland, where the TA visit is still evolving, and of Washington, where TA staff have the flexibility to organize the visit according to the specific issues to be addressed that day. The facility personnel they meet with also vary. TA staff may meet with the facility risk manager (Florida), for example, or QA coordinator (Maryland), as well as with other members of the facility quality assurance team (e.g., social workers, nurses, therapy, administration) during each visit. Texas has a formal debriefing session (or exit conference) that TA staff conduct with each facility visited. Visit length also varies, by state and by issue being addressed on-site. For example, a Maine TA visit lasts about four hours. A Maryland visit takes two days, with about six hours spent in resident medical record review to reconcile what the staff is saying with what has been recorded in the charts. The remaining time is spent reviewing the QA plan and interviewing key facility staff. Staff may be interviewed to assess the facility's concurrent review process (a requirement related to the QA plan). In Florida, the TA staff nurse places signs in facilities she is visiting, inviting residents and families to speak with her. The Maryland, Texas, and Washington TA visits may involve resident interviews and observation as well.

3.3 Summary

Participants noted that there are both positive and negative aspects of having the TA program affiliated with the state survey program. TA staff who also function as surveyors are perceived as having greater authority and more regulatory knowledge, and as being better able to effect positive changes in resident care. Regulatory-related information given by TA staff who also function as surveyors is expected to be more consistent among TA staff and between TA staff and surveyors. Sharing TA reports with survey staff may help inform and focus both survey and TA efforts.

However, housing the TA program within LTC survey agencies, having TA staff function in both TA and survey roles, and/or sharing information between the TA and survey programs gives rise to understandable provider concerns. In states with close ties between survey and TA staff, providers were less willing to be involved with the TA program. They reported being less forthright during visits, and less willing to give honest feedback on TA evaluation forms, given that the same TA staff might be performing their agency's next survey or complaint investigation. In addition, in states where TA staff acted in both roles, many participants noted that TA staff are sometimes diverted to survey tasks, reducing both the regularity and frequency of TA visits.

Whether TA staff should have surveyor training depends, in part, on whether or not there is a significant regulatory component to the TA program. In states where the TA program is closely linked to the survey agency, TA staff obviously need surveyor training. Interestingly, in states where TA staff do not perform survey tasks but have been recruited from the survey agency, discussants commented that former surveyors often have trouble "changing hats." In states that are unambiguously focused on quality first, clinical expertise is seen as more important than knowledge of regulations. However, facilities in these states say it bothers them when TA staff are unable to provide interpretive regulatory guidance. We also learned that some providers were overwhelmed by the amount and complexity of the TA information provided, particularly in states where evidence-based practice was a goal (Texas and Missouri).

The frequency of visits is another design decision states must make. Providing quarterly visits to all facilities in a state, as Washington and Florida are required to do, is a Herculean task given current TA staff levels. In fact, in both states, state officials and some providers reported that facilities were not receiving quarterly visits. Some Washington providers said that TA visits occur much less frequently than quarterly, and state program administrators agreed that certain geographic regions have experienced fewer visits due to the demands of the LTC survey and certification schedule. In Florida, high TA staff turnover and the increasing demands on TA staff time for survey-related tasks were blamed for the quarterly TA schedule slipping in some regions.

According to providers and other stakeholders we talked with during our visits, several factors probably contribute to facilities not participating in voluntary TA programs: (1) Some nursing home chains have their own quality improvement program and they feel that additional consultation is unnecessary and/or potentially confusing. (2) Some facilities do not understand the purposes and goals of the program, or are not aware that the program exists. (3) Some facilities associate TA with the LTC survey process and do not wish to be subjected to what they assume will be additional scrutiny. (4) Some facilities are focused only on survey and certification and lack interest in a program whose goals are not focused on improving survey results. (5) Some facilities do not have the resources either to devote to non-mandated quality improvement efforts or to allow staff to benefit from TA activities.

The nature of the TA intervention varied across states, but in each case was intended to promote what the state defined as best practices. Interventions disseminated by the states included evidence-based care practices, expert opinion and information gathered by TA staff, and/or facility-nominated best practices.

These programs are too new, and the data are insufficient, for any conclusion to be drawn as to which approach is more effective in promoting quality (which all agree is the ultimate goal). Only Missouri, and to a lesser extent Maryland, had made any attempt to evaluate their programs at the time of our visit, and no state has tested the effectiveness of one approach over another.15 On the one hand, states that focus primarily on regulatory compliance have, in effect, increased the number of times the state agency is in the facility evaluating facility performance. This gives the state greater knowledge of day-to-day facility operations, but may not improve the relationship between providers and the regulatory agency, which historically has been troublesome in many states. On the other hand, states that focus primarily on improving nursing home care practices encourage consultation between monitors and providers, allowing facility staff to enter into collaborative relationships with state staff. These collaborative relationships may enhance problem recognition and solving. Providers, especially those not part of a larger network, appreciate the expertise and knowledge that can be provided by TA staff, who are not part of the potentially adversarial survey and certification process.


4.0 THE WIDER CONTEXT OF STATE-INITIATED QUALITY IMPROVEMENT

In addition to the TA programs reviewed in Chapter 3, the states we studied had all initiated other quality improvement efforts. The four most commonly initiated were public reporting, best practice dissemination, provider training (including joint provider-surveyor training), and facility recognition programs.

This section presents information on each of these four program categories, including the differing approaches states have taken to implement them, the nature of their interaction with TA programs, the perceived positive and negative aspects of each program, and their potential impact on quality. Readers interested in learning more about these programs, as well as the other activities listed in Table 2, are directed to the state reports included in the Appendices at the end of this document.

4.1 Public Reporting

Over the past several years, a number of initiatives aimed at giving consumers and other members of the public access to information about nursing home quality have been implemented. In November 2002, as part of its Nursing Home Quality Initiative, CMS began posting on its Nursing Home Compare website [www.medicare.gov/NHCompare/Home.asp] information for each Medicare and Medicaid certified nursing home. The information includes indicators of each facility's performance as measured by ten quality measures. The Nursing Home Compare website benchmarks the facility's performance on these indicators against all nursing home providers in a state and nationally. The Nursing Home Compare website also includes provider-reported staffing information and was recently expanded to include complaint information.

In addition to public reporting efforts by CMS, 20 states have instituted their own public reporting initiatives.16 Of the seven states reviewed for this project, four (Florida, Iowa, Maryland, Texas) have developed a public reporting system. Each of these states makes the data accessible over the Internet. (Internet website addresses and examples of the data reported by these states are shown in Appendix B.) The public reporting systems in these states vary in the type and amount of information posted. Each is intended to provide information to assist consumers in understanding the quality of care provided in each Medicare or Medicaid certified facility in that state. In Florida, Iowa, and Texas, the websites allow access to information about survey results, giving users the ability to drill down to increasingly detailed data about each nursing home--including lists of deficiencies on the most recent survey and a summary of the facility's regulatory compliance history.

Interaction with TA Programs

The public reporting systems in Florida, Maryland, and Texas are used to help inform quality improvement efforts discussed in Chapter 3.

Positives and Negatives of Publicly Reported Information

Stakeholders with whom we spoke discussed the positive and negative implications of publicly reporting information on nursing home quality. State officials believe the greatest benefit of publicly available nursing home quality reports is to help nursing home residents, their families, and informal caregivers make informed decisions when selecting a nursing home or evaluating the care provided in a particular facility. Some stakeholders in most of the states indicated that the report cards had increased consumer access to public information. However, consumer advocates noted that consumers frequently do not know that the reports exist, may not have Internet access, or may not be proficient in navigating the Internet. There has been no analysis of how often report cards actually influenced decisions about nursing home placement.

Some stakeholders also expressed concern that websites may not be designed to optimize consumer access to, and use of, these sites. Some provider associations suggested that more collateral materials should be included on websites to aid consumer understanding of the information posted. States reported difficulty in balancing the provision of sufficient information to assist consumers in making more informed decisions against the risk of overloading consumers with data. For example:

The websites were also reported to provide easy access to information on nursing home quality to advocates, the provider industry, legislators, and other public policy makers. The websites in Florida, Iowa, Texas, and Maryland each include a disclaimer that the information on the website should not be used as the sole basis for nursing home selection. However, some stakeholders expressed concern that users of these websites do not sufficiently explore the meaning of posted information. For example,

While some stakeholders indicated that the information reported on a state's website was generally current and accurate, others expressed concern that some websites presented old information while other sites simply could not be kept current. For example:

Consumer representatives were concerned that a good rating on a report card--or even a bad one--could misinform consumers. For example, some advocates in Florida believe that giving the worst facilities in the state even a one-star rating was misleading. In Texas, the lowest ranking indicates facilities that have the 'most disadvantages' with respect to quality indicators or a 'substandard quality of care' with respect to survey findings, so this is less of a concern.

Many providers indicated that the greatest benefit of public reporting was the ability to use a good quality rating as a marketing tool. Providers in several states said the reports allow good nursing homes an opportunity to receive the praise they deserve and to distinguish themselves from poorer performing facilities.

While CMS and some of the states have posted nursing home performance information for the last several years, providers expressed concern about the impact of posting this information on the availability and costs of nursing home liability insurance. Providers and their associations in Iowa, Florida, and Texas reported that some liability insurance companies were choosing not to write policies for facilities with a higher number of deficiencies or that have poor quality indicator scores, and others have increased rates to the point where facilities report they can no longer afford this insurance. While the survey deficiency information has always been public, the availability of this information on state public reporting systems makes it easier and less costly for insurers to identify poor performing facilities. The states of Iowa, Florida, and Texas have convened task forces to examine the liability insurance issue.

Potential Impact on Quality

In the study states, state officials expressed their hope that public reporting of deficiencies will improve quality by stimulating competition and sparking change in facility culture. Of the states we studied, however, none has formally evaluated the impact of its public reporting program on quality of care. Maryland plans to analyze the impact of its public reporting initiative, and the state has made some modifications to the public report based upon feedback received.

Doubts were already being voiced in several states we visited, however, about the potential effectiveness of public reporting to effect change. As discussed above, some stakeholders questioned whether the report cards could have an impact on consumer decisions, since the public is not sufficiently aware that the report cards exist. In most states, agency staff are able to measure how many people use the website, although they cannot identify whether these are consumers, policymakers, researchers, or others. Further, as suggested above, additional education may be necessary to raise consumer awareness of the report cards and promote consumer use of available nursing home quality information more generally.

Another factor that may limit the impact of report cards on quality improvement is that nursing home placement choices are limited in some states. However, some providers and other stakeholders voiced the opinion that access to quality reports is increasingly important in states where falling nursing home bed occupancy rates are expanding consumer choice.

Of most fundamental importance is the concern that public reporting of inadequately risk-adjusted quality indicators could limit access for heavy-care patients, even at the best performing facilities. For example:

Although public reporting has been promoted as a means for facilities to identify problem areas and target initiatives aimed at improving quality of care, none of the providers we spoke with identified it as such. Some stakeholders expressed concern that it is primarily the facilities already considered to be top-performing that will make necessary changes, while a certain percentage of providers in each state simply do not have the resources to initiate or sustain these improvement programs. In Florida, for example, consumer advocates noted that some facilities have been on the Watch List many times, and that this does not appear to have provided sufficient motivation for those facilities to do a better job. Nonetheless, some stakeholders with whom we spoke suggested that public reporting is a necessary, but not sufficient, step to improve nursing home quality.

4.2 Best Practice Dissemination Programs

As discussed in section 3.2, study states varied in terms of what each described and promoted as "best practices" and how these practices are incorporated into their quality improvement/technical assistance programs. In addition to best practice dissemination through the TA program, many of the study states also initiated additional activities to recognize and disseminate information about best practices in nursing homes in their state.

Potential Impact on Quality

As with public reporting, none of the study states has made any systematic attempt to measure the impact their best practices programs have had on quality. During discussions with providers and state program staff we received several comments on their potential impact, however.

4.3 Training/Joint Training Programs

As discussed in section 3.2, all study states include informal provider education during facility visits as one component of the technical assistance offered to caregivers and administrators, and all but one include some type of formalized training in their quality improvement efforts. This section describes state-initiated training programs directed at improving the quality of nursing home care that are separate from the quality improvement/technical assistance programs described in section 3.2.

States use different methods to determine training topics. A common approach is to select topics by identifying areas where providers were perceived to be experiencing the greatest difficulties. In some states (e.g., Texas), at least part of the training focuses on the areas most frequently cited as deficient. In some states, political pressures created the impetus for specific training initiatives (e.g., the Alzheimer's training program in Florida--see below). Generally, most states reported that training sessions are well attended, even though they are mostly voluntary.

Two of the states visited, Iowa and Texas, have made provision of joint training to providers and surveyors a key part of their quality improvement program. Examples of training programs used by study states can be found in Appendix E. When joint training is offered, the goals include an effort to provide a common knowledge base for surveyors and providers. Participants in these joint training programs reported that having both surveyors and providers in the same room has met with some resistance from both sides and may have had a chilling effect on discussion. Despite this, many said they believe joint training is essential, so that both providers and surveyors receive the same information--and that such sharing, even though stressful at the time, may ultimately help improve the surveyor-provider relationship, leading to better communication during the survey process.

In addition to the joint training described above, the Texas Ombudsman and his staff, who already have a presence in facilities, are conducting training on resident-centered care. The issue of restraint use was chosen as a focus of this training because it is a long-standing issue with consumer advocates, because restraint use is notably high in Texas and currently a major concern of the Texas Department of Health, and because the Texas Department of Insurance identifies restraint use as a risk factor for liability issues. The program is intended to dispel myths about the perceived benefits of restraints for resident safety and to help educate staff and families about alternative options. Program content has been coordinated with the best practice protocols developed for the Quality Monitor program. The program is set up in three modules: first, all ombudsman volunteers are trained (60 staff oversee the 850 volunteers); those volunteers then train facility administrators and key staff; and finally the volunteers and facility staff educate families on the topic.

Facilities are not required to participate. The goal of the program is to have 10 percent of facilities adopt the program by August 2003. Texas will compare the use of restraints in nursing homes before and after its joint training. The training program will be considered a success if restraint use decreases in 10 percent of the facilities that participated in the joint training program. It will not be possible, however, to separate the effects of this training from those of other quality improvement efforts in the state.

Florida requires that all nursing home employees expected to have direct contact with residents with Alzheimer's Disease and related dementias complete a state-approved training program. To provide this training, Florida employs a train-the-trainer model in which one individual in each facility is trained by staff from the University of South Florida (USF) and then becomes the staff person responsible in that nursing home for training all other staff who may have contact with residents with Alzheimer's Disease and related dementias. USF has also developed a compact disc aimed at training licensed practical nurses in dementia-related care issues and also disseminates best practices via the web. Providers reported that they found the training program most helpful for nursing aides and for facilities that do not have a specific dementia care unit. Some expressed the opinion that facilities should be able to choose for themselves the training that would most benefit their facility. Some providers said mandatory training felt more like a "big brother is watching" regulatory approach than a valuable educational program.

Maine, a state with many rural facilities spread over a wide geographic area, brings training to the facilities. The single nurse who staffs the TA program developed this approach. While participating in a facility closure, she observed that educational programs available to long-term care staff were generally held outside the facility, requiring a facility representative to travel to the program and then carry the information back to the staff. She envisioned a program that would provide educational and support services in the environment of the residents and the direct care staff. She has developed seven such in-service programs, which she conducts at facilities on request. Topics include Practical Hints for Caregivers of Alzheimer's Disease and Elopement Risk Factors and Prevention. These programs are very popular and are often scheduled six months ahead. The state Licensing and Certification Division reported that 90 percent of all homes in the state sent staff to one of the workshops held in the past two years. Discussant comments on provider training tend to be positive, expressing the idea that the sharing of knowledge should at least provide facilities with useful information related to quality improvement.

Potential Impact on Quality

No state included in our study has yet done any formal analysis of the impact of state-sponsored training programs. Anecdotally, nursing home administrators and clinical staff reported that training combined with regulatory interpretation and practical applications in nursing home care improved quality. Providers reported making changes in their caregiving practice after participating in a seminar in which a surveyor provided interpretation of regulations, followed by a panel discussion and presentations by facilities of their best practices in that particular clinical area. Some stakeholders said they thought training was a critical but insufficient element of good quality care.

4.4 Facility Recognition Programs

Two of the states we studied (Florida and Iowa) have developed and initiated reward and recognition programs as part of their quality improvement efforts. The goal of these programs is to recognize facilities doing exemplary work. Examples of Facility Recognition Programs can be found in Appendix F.

Florida and Iowa use a similar process for selecting facilities for quality awards. Residents, family members, members of resident advocacy committees, or other health care facilities can make nominations for the awards. In Florida, nominations can be also made by the state Agency for Health Care Administration, provider organizations, ombudsman, or any member of the community. Nominations are presented to a governor-appointed committee that includes the state's long-term care ombudsmen and other consumer advocates, and health care provider and direct care worker representatives. Both states make efforts to eliminate conflict of interest among committee members.

Both states specify criteria that must be met for a provider to receive a "recognition" award. Nominees must provide a description of the facility's best practices and the resulting positive resident outcomes, or the unique or special care or services (nursing care, personal care, rehabilitative or social services) provided by the facility to enhance the quality of life for its residents. Performance data (e.g., the facility's "report card" or assigned "quality of care rank" within the applicant's geographic region) are used in determining the facility's quality.

Florida facilities must meet a number of additional rigorous criteria to qualify for the quality award, including: strict standards of performance on survey inspection results (i.e., no Class I or Class II deficiencies within the 30 months preceding application), no history of complaints, a high level of family involvement, satisfied consumers as measured by an assessment of consumer satisfaction, low staff turnover rates, and the provision of in-service training. Further, facilities are required to demonstrate financial soundness as evidenced by a formal financial audit. Many stakeholders believe that this latter criterion eliminates most facilities from consideration, because most facilities may be unable to afford such an audit and because providers that have been the subject of bankruptcy proceedings (or whose parent organizations have been the subject of bankruptcy proceedings) during the preceding 30 months are disqualified.

In both states, following selection of the finalists by the awards panel, onsite reviews are made to verify the accuracy of the information on the nomination form. When the awards are confirmed, the governor presents a certificate to the facility administrator in a recognition ceremony. Some consideration has been given to providing additional rewards to award-winning facilities, such as an extended survey cycle, but these have not been implemented due to federal policies mandating that nursing facilities be surveyed every 12 to 15 months.

Despite Florida's more detailed and complex requirements for consideration, a similar percentage of facilities in both states (between one and two percent) have received the quality awards. Iowa's numbers are limited because the state legislation permits only two facilities from each congressional district to be recognized as award winners each year.

In addition to the quality award described above, Iowa also presents a Certificate of Recognition to any facility that receives a deficiency-free survey. The certificate is intended to acknowledge the "hard work and dedication" of the facility's staff in meeting the established standards of care, and is considered a way of providing positive feedback to providers with good survey results.

Positive and Negative Responses

In general, the response to the quality award programs has been positive. State nursing home regulators assert that the awards provide facilities with incentives to focus on quality improvement and create a benchmark for others to strive to meet. Providers, who appreciate any program that rewards good facilities, see the awards as a powerful marketing tool that can boost revenues and possibly reduce liability insurance costs. Advocates welcome any type of information that can help consumers make informed decisions about nursing home placement.

However, a number of concerns were also voiced about the award programs:

Potential Impact on Quality

Whether the quality recognition programs have any effect on promoting quality resident care remains unanswered. Both programs are relatively new, and neither state has performed any formal analysis of their impact on quality. Interestingly, however, most stakeholders expressed the opinion that the programs are unlikely to affect quality. "Window dressing" and "a warm fuzzy for providers" were typical of the comments received. Many with whom we spoke were concerned that the programs focused on high-performing facilities instead of the facilities most in need of assistance. One stakeholder noted, "Only 5 percent of facilities are eligible--we worry about the other 95 percent."

Some stakeholders voiced the opinion that the awards, like a good rating on the facility report card, are a marketing tool which becomes increasingly relevant when bed occupancy is lower. When occupancy rates are lower, consumers may have more choice about where to go, and, thus, providers may compete by improving quality.


5.0 FUNDING MECHANISMS FOR QUALITY IMPROVEMENT PROGRAMS

Typically, states are focused on "quality assurance" activities in nursing homes--that is, monitoring and enforcing compliance with nursing home requirements. Most states have avoided nursing home quality improvement activities, particularly technical assistance programs, in large part due to the limited availability of federal funds for quality improvement and confusion about which funding sources may or may not be used to support such programs.

This chapter reviews the current funding mechanisms used by states to fund state-initiated quality improvement, including technical assistance programs. It also provides a guide to potential funding sources for states considering quality improvement programs, by describing current and possible future legislation that may provide federal funding for such programs. We start this discussion by reviewing the requirements for, and limits on, the Medicare and Medicaid survey and certification programs.

5.1 Federal Funding for Survey, Certification, and Enforcement

Funding for Survey, Certification, and Enforcement

CMS pays for Medicare and Medicaid nursing home survey, certification, and enforcement activities using a price-based budgeting process. Under the price-based methodology, national standard measures of workload and costs are used to project individual state workloads and budgets. Payments to states are based on allowable costs up to a ceiling of 115 percent of the national average. If a state's costs exceed this ceiling, its payments are frozen at the previous year's level unless the state can successfully justify the costs in excess of 115 percent.17 At the time of our study, no state had argued that its costs in excess of the 115 percent ceiling should be allowable.18 The federal budget for fiscal year 2003 includes almost $250 million for state survey and certification activities.
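
As a rough illustration of the ceiling rule described above, the sketch below shows the basic logic in Python. It is a simplified reading only: the dollar figures, function name, and inputs are hypothetical, and the actual CMS methodology projects workloads and budgets from national standard measures (and allows justified exceptions) that are not modeled here.

```python
# Illustrative sketch only: a simplified reading of the price-based budgeting
# rule described in the text. All figures are hypothetical; the real CMS
# methodology uses national standard workload and cost measures not shown here.

def projected_payment(state_cost: float,
                      national_average_cost: float,
                      prior_year_payment: float) -> float:
    """Return a state's survey/certification payment under the 115% ceiling.

    Allowable costs at or below 115 percent of the national average are paid;
    costs above the ceiling freeze the payment at the prior year's level
    (a successful state justification of the excess is not modeled).
    """
    ceiling = 1.15 * national_average_cost
    if state_cost <= ceiling:
        return state_cost
    return prior_year_payment


# Hypothetical example: a state whose costs exceed the ceiling is held to its
# prior-year payment level.
print(projected_payment(state_cost=4_800_000,
                        national_average_cost=4_000_000,
                        prior_year_payment=4_300_000))  # -> 4300000
```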

Survey Requirements--Sections 1819 and 1919(g)

The Social Security Act specifies the federal requirements for monitoring compliance of Medicare and Medicaid nursing home providers under Sections 1819 and 1919(g). Compliance with these statutory requirements and implementing regulations is assessed using a survey, certification and enforcement process defined in statute and regulation. Medicare and/or Medicaid certified nursing homes are surveyed at least once every 15 months.

The Federal Government is required to conduct surveys of Medicare SNFs. The Federal Government contracts with state survey agencies to perform this activity and pays 100 percent of the allowable state survey costs for Medicare SNFs (Section 1864(b)). In addition, as permitted by statute, the Federal Government contracts with states to conduct Medicaid surveys. The federal law requires that the Federal Government pay states 75 percent of survey, certification, and enforcement costs for Medicaid facilities (Section 1903(a)(2)(D)).
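
The following minimal sketch illustrates the federal match rates cited above (100 percent of allowable Medicare SNF survey costs under Section 1864(b); 75 percent of Medicaid survey, certification, and enforcement costs under Section 1903(a)(2)(D)). The dollar amounts, names, and structure are assumptions made for illustration only, not an actual accounting method.

```python
# Illustrative sketch only: federal share of allowable survey, certification,
# and enforcement costs at the match rates cited in the text. Amounts are
# hypothetical.

MATCH_RATES = {
    "medicare_snf": 1.00,  # federal government pays 100% of allowable costs
    "medicaid_nf": 0.75,   # federal government pays 75%; the state pays the rest
}

def federal_share(program: str, allowable_cost: float) -> float:
    """Return the federal portion of allowable survey costs for a program."""
    return MATCH_RATES[program] * allowable_cost

# Hypothetical example: $1,000,000 in allowable Medicaid survey costs draws
# $750,000 in federal funds, leaving $250,000 to the state.
print(federal_share("medicaid_nf", 1_000_000))   # -> 750000.0
print(federal_share("medicare_snf", 1_000_000))  # -> 1000000.0
```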

CMS restricts the amount of technical assistance that surveyors can provide. According to a December 2002 program memorandum (see Appendix G), surveyors "should not act as consultants to nursing homes…" but should "provide information to the facility about care and regulatory topics that would be useful to the facility for understanding and applying best practices in the care and treatment of the long-term care residents." This information exchange is not considered by CMS to be consultation with the facility, but rather "a means of disseminating information that may be of assistance to the facility in meeting long-term care requirements."

In addition, the memorandum refers to Section 2727 of the CMS State Operations Manual (see Appendix G), which states: "It is not the surveyor's responsibility to delve into the facility's policies and procedures to determine the root cause of the deficiency or to sift through various alternatives to suggest an acceptable remedy. When the State Agency conducts a revisit, it is to confirm that the facility is in compliance with the cited deficiencies, not whether it implemented the suggested best practices, and has the ability to remain in compliance." Reference information regarding best practices may be provided to "assist facilities in developing additional sources and networking tools for program enhancement," but surveyors are instructed not to "act as consultants to nursing homes."

Guidance on the types of allowable survey and certification activities that may be eligible for a federal matching payment is found in the State Operations Manual (Section 4100-4109). There is no provision that explicitly permits use of federal survey and certification funds for any technical assistance or quality improvement programs like the programs in the states that we visited.

Educational Programs--Sections 1819 and 1919(g)(1)(b)

As part of the statutory Medicare and Medicaid nursing home survey and certification requirements, each state must "conduct periodic educational programs for the staff and residents (and their representatives) of [nursing facilities] in order to present current regulations, procedures, and policies under this section." Technical assistance programs that include a regulatory focus may be considered such "educational programs." For Medicaid, a 75 percent federal match is available for approved costs. The Federal Government pays 100 percent of the costs of such programs for Medicare SNFs.

Nursing Home Enforcement--Sections 1819 and 1919(h)

Federal law enumerates several remedies that may be used to promote compliance with nursing home requirements. In Medicaid, the remedies range from penalties to incentives for high quality. Some of the Medicaid remedies may be applicable to state-initiated quality improvement programs. These are discussed below.

Medicaid Civil Monetary Penalty (CMP) Funding--Section 1919(h)(2)(A)(ii)

States collect CMP funds from Medicaid nursing facilities and from the Medicaid part of dually certified skilled nursing facilities (SNFs) not in compliance with federal conditions of participation. Federal CMP funds are collected from Medicare-only facilities and the Medicare portion of dually participating nursing facilities. The Social Security Act (Section 1919(h)(2)(A)(ii)) provides that CMP funds collected by a state from nursing homes19 must be applied to the protection of the health or property of residents of nursing facilities that the state finds to be deficient.20 CMS has given states flexibility in determining the appropriate uses of CMP funds as long as those funds are used "in accordance with the law and in a consistent manner." (Source: August 8, 2002 Memorandum from Steve Pelovitz, Director of CMS Survey and Certification Group, to State Survey Agency Directors, see Appendix G).21

Some states have used CMP funds for their technical assistance or other quality improvement programs. CMP funds must be applied to residents in facilities that have been found deficient. CMS has given states flexibility in determining when a facility must have been deficient to be eligible for a CMP-funded program. According to the August 2002 program memorandum:

"The law does not specify when a facility must have been determined to be deficient to qualify for benefits under a state project funded by CMPs. Most nursing facilities have had one or more deficiencies either recently or in the past. Rather than setting forth rigid criteria on when it is that a facility must have been deficient to be an eligible target for the application of CMP revenues, we believe that the best course is to offer states maximum flexibility to make this determination. Apart from this, we believe that projects funded by CMP collections should be limited to funding on hand and should be relatively short-term projects."

These CMP funds are state, not federal, funds. States may use the state-share of CMP collected from Medicaid-only certified nursing facilities and from the Medicaid part of dually participating facilities for any project that directly benefits facility residents in facilities that have been found deficient.

These CMP funds could be used to prevent continued noncompliance by nursing facilities through educational or other means including the development and dissemination of videos, pamphlets, or other publications providing best practices. Other uses could include the use of consultants to provide expert training to deficient facilities.

CMP funds collected from Medicare-only facilities, the Medicare part of dually-participating facilities, and the federal share of state collected CMPs are returned to the Treasury.

Incentives for High Quality Care in Medicaid--Section 1919(h)(2)(F)

The Social Security Act describes the enforcement tools that may be used to promote compliance with requirements. One tool, for which federal funding is available, is a state-established public recognition program that recognizes facilities providing the highest quality of care to Medicaid residents. According to the statute, "a state may establish a program to reward, through public recognition, incentive payments, or both, nursing facilities that provide the highest quality of care to residents..." The law indicates that expenses incurred in such incentive programs "shall be considered to be expenses necessary for the proper and efficient administration of the state plan under this title." These costs are eligible for a 50 percent federal match.

5.2 Federal Funding Sources for TA and Other Quality Improvement Programs Being Used by Study States

There are other Medicare and Medicaid provisions that could provide federal funding for TA or other quality improvement programs. These provisions are described below.

Medical and Utilization or Quality Review--Section 1903(a)(3)(C)(i)

This section provides for a 75 percent federal match for the costs incurred "for the performance of medical and utilization or quality review by a utilization and quality control peer review organization." This section covers activities performed by state Quality Improvement Organizations (QIOs), which have characteristics similar to the TA programs in several study states (see Chapter 8 for more details on this program). Washington uses this provision to secure matching funds for its technical assistance program, which is operated as part of the state's medical utilization program.

Funding for Skilled Professionals and Support Staff--Section 1903(a)(2)(A)

This provision provides for a 75 percent federal match for costs "attributable to compensation or training of skilled professional medical personnel, and staff directly supporting such personnel, of the state agency or any other public agency." Iowa uses this provision to maximize federal funding for its public reporting, joint training, and provider recognition programs.

Funding for Nurse Aide Training--Section 1903(a)(2)(B)

This provision allows a 50 percent federal match of the costs associated with nurse aide training and competency evaluation programs, regardless of whether the programs are provided in or outside nursing facilities. Florida uses this provision to maximize federal funding for the Florida Alzheimer's Training Program for nurse aides who are employed by or have an offer of employment in a nursing home.

5.3 Potential Funding Sources Not Being Used by Study States

State Consultative Services--Section 1902(a)(24)

This section provides that funding is available to nursing facilities (and other provider types) for "consultative services by health agencies and other appropriate agencies of the state" to assist them in qualifying for payment under the Medicare and Medicaid programs, or establishing the fiscal records needed to determine payment on "account of care and services furnished to individuals." This provision could be used to support programs, for example, related to the MDS (e.g., training in completing the MDS accurately). A 50 percent federal match rate is available for such consultative services. This section of the Social Security Act was not used to obtain federal funding by any of our study states.

Assuring Service Delivery in the Best Interest of Medicaid Recipients--Section 1902(a)(19)

According to this section, Medicaid state plans must "provide such safeguards as may be necessary to assure that eligibility for care and services under the plan will be determined, and such care and services will be provided, in a manner consistent with simplicity of administration and the best interests of the recipients." This section provides a 50 percent federal match and potentially could be used to fund state-established web pages or other sources of consumer information on nursing homes, although we are not aware of any states that have actually received federal funding for such efforts under this section.

Proper and Efficient Administration of the State Plan--Section 1903(a)(7)

This provision allows federal funding, subject to Section 1919(g)(3)(B), for 50 percent of the amounts expended by states (as approved by the Secretary) for the proper and efficient administration of the state plan.

Information, Counseling, and Assistance Grants--Section 4360 of OBRA '90

This provision permits states to receive funding for grants for programs related to providing education to Medicare beneficiaries. The law indicates that the purpose of such grants is to provide "information, counseling, and assistance relating to the procurement of adequate and appropriate health insurance coverage" to Medicare beneficiaries, including providing "information that may assist in obtaining benefits …under titles XVIII and XIX…" One potential use of these grant funds may be for public reporting systems that provide consumers with information regarding nursing homes.

The FY 2002 appropriation for this program was $12.5 million. State allocations are made using a formula that takes into account the number of beneficiaries in rural areas and the number of Medicare beneficiaries relative to the state's total population. For large states (e.g., Florida and California) the average grant award is about $500,000. For smaller states (e.g., North Dakota and Missouri) the grant award is about $125,000. None of the study states indicated using this section to secure federal funds for their quality improvement programs.

5.4 Funding Sources Used for Identified Quality Improvement Programs

In general, we found that in the states we studied, federal survey funds had not been used for technical assistance programs but had been used for other types of quality improvement activities. As discussed in Chapter 3, only two of the technical assistance programs in our study--the programs in Washington and Maine--receive any federal funding. The funding sources and amounts for each study state's technical assistance, best practice, training, and facility recognition programs are discussed below.

5.5 Proposed Legislation Affecting Funding for Quality Improvement Programs

Two bills currently before the U.S. Congress would, if passed, provide additional authorization for funding state-initiated technical assistance programs.

Nursing Home Staffing and Quality Improvement Act of 2001 (H.R.118)

The Nursing Home Staffing and Quality Improvement Act, introduced in the House Committees on Ways and Means and Energy and Commerce, would authorize the Secretary of the Department of Health and Human Services (HHS) to provide grants to states for the purpose of improving the quality of care furnished in nursing homes operating in the state.

The bill would provide financial assistance for recruiting, retaining, or training nursing staff. State technical assistance programs may qualify for funding under these grant programs, since the legislation would permit funds to be used for bonuses to nursing homes that meet state quality standards and for any other nursing home staffing and quality improvement initiative approved by HHS. Under the bill, Title XI of the Social Security Act would be amended to establish a Nursing Facility Civil Money Penalties Collection Account that would be used for awarding grants under the Act.

This bill was introduced in January 2001. In February 2001, it was referred to the House Subcommittee on Health, and there has been no further action on it since then.

Medicare and Medicaid Nursing Facility Quality Improvement Act of 2002 (H.R.4030)

This legislation, introduced in March 2002 by Rep. Dave Camp (MI), would amend the Medicare and Medicaid statutes to modify the federal survey and certification process for nursing facilities. The bill would allow states to apply for waivers to the survey and certification requirements specified in Section 1819(g) of the Social Security Act. These states would develop "innovative quality measurement and oversight systems that differ from those presently required by federal law." According to the language of the bill, waiver requests are to be approved if they demonstrate "significant potential for improving the quality of care, quality of life, and safety of residents." According to the bill, up to eight states could receive authorization to create such alternative systems. The bill would eliminate rules that prohibit surveyors from making recommendations to improve nursing care. The legislation does not authorize a specific funding amount or source.

The bill (H.R.4030) has 15 co-sponsors and has been endorsed by both the American Association of Homes and Services for the Aging (http://www.aahsa.org/public/press_release/PR221.htm) and the American Health Care Association (http://www.ahca.org/brief/nr020322.htm).

In April 2002, the bill was referred to the House Subcommittee on Health in the Committee on Energy and Commerce, for a period to be determined by the committee chairman, and there was no action on this legislation between April 2002 and May 2003.


6.0 EFFECTIVENESS OF TECHNICAL ASSISTANCE PROGRAMS

A rigorous assessment of the effectiveness of state-initiated technical assistance programs is not possible at this time for several reasons:

In addition, while states expressed a general interest in measuring effectiveness of their quality improvement efforts, most have not developed a systematic evaluation plan and have been unable to identify acceptable criteria for measuring the impact. Although, intuitively, states believed that TA has a positive impact, uncertainty about an appropriate measure, along with the unknown influences of other ongoing programs, may mean that the impact of these TA programs on quality is never known.

As an example, Florida officials said they have considered looking at changes in deficiencies, but have not been able to arrive at a suitable measure. A decrease in the number of deficiencies cited, a decrease in overall scope and severity, and a decrease in the number of citations have all been considered as possible measures, but none has proven reliable. The known inconsistency of survey results on these and similar measures adds to the state's reluctance to use any of them. Florida is also aware of the impact staff turnover has had on program effectiveness and sustainability, making the state hesitant to begin an evaluation that does not take turnover into account.

6.1 Previous Studies of the Impact of Nursing Home Quality Improvement Programs

Previous studies have provided mixed evidence regarding the effectiveness of nursing home quality improvement programs similar to the TA programs that we studied. A CMS study (1998) evaluated two nursing home quality improvement programs that were accompanied by reasonably strong evaluation designs. One program, an extremely labor intensive intervention to reduce incontinence, resulted in a reduction in incontinence rates, but these gains were not sustained after the external research staff stopped providing feedback to the participating nursing homes. The study found evidence that the other intervention, the Ohio Pressure Ulcer Prevention Initiative, was not effective. A Commonwealth Fund evaluation of the Wellspring quality improvement model27 found several positive outcomes (e.g., improvement on federal survey and lower staff turnover), but there was no clear evidence of improvements in clinical outcomes based on Minimum Data Set (MDS) quality indicators. These results suggest that it may be difficult to change the organizational and care practices within nursing homes that impact resident outcomes.

However, it is not possible to tell whether the mixed results of these previous evaluations are the result of an actual inability of the programs to result in improvements in quality or an inability of the available data to measure changes that may have actually occurred. A major challenge in measuring the effectiveness of any nursing home intervention is the difficulty in constructing valid quality measures. Absent any primary data collection, the two data sources that are available for measuring program effectiveness are the MDS and survey deficiency data. Both of these data sources have significant limitations for measuring quality of care, making it nearly impossible to draw definitive conclusions about the impact of specific interventions. These data limitations also limit the ability to compare the relative impact of nursing home programs with a quality improvement focus vs. those that focus on the survey and certification process.

The MDS has two potentially significant types of limitations:

As noted by Walshe (2001), differences in deficiency rates across states (or regions within states) and changes in deficiency rates across time may reflect real differences in quality of care.28 But they also may be the result of differences in the stringency, scope, or implementation of the survey process.29 It is not possible to disentangle these two effects. According to an OIG report (1999), inconsistency in the survey process results from unclear guidelines that may contribute to different interpretations by surveyors when citing deficiencies, differences in the level of supervisory review for survey reports, and high turnover among surveyors.

Due to these data limitations, little is known about the effectiveness of either TA programs or the survey and certification process, or about whether quality is improved more by investments in quality improvement or enforcement programs.30

6.2 Formal Assessment of TA Impact Among Study States

Missouri is far ahead of other states in using systematic data to measure the impact of its TA program. Missouri's TA program began in 1999, when a pilot test demonstrated that providing written reports to nursing facilities on their quality improvement status was not enough to motivate changes in processes that would improve resident outcomes. The researchers who performed the pilot test noted that on-site expert TA, particularly when delivered as a series of on-going visits, was most effective in changing resident outcomes.

Since the program's inception, staff have used the MDS-based quality indicators developed by the Center for Health Systems Research and Analysis (CHSRA) to measure the impact of their TA program on resident quality of care and quality of life. Although the quality of MDS data has improved, as familiarity with the tool has increased and data edits have been implemented by individual states and CMS, there is still considerable confusion around the coding of some items. Recognizing the potential for problems in the MDS data early on, Missouri developed standardized training materials for the MDS and mandates that anyone offering MDS training in the state utilize those materials. Their TA nurses also provide monthly support groups for MDS coordinators, as a forum to clarify issues regarding MDS coding.

In addition to analyzing median quality indicator scores, the program staff analyze trends at the 90th and 95th percentiles, so that the effectiveness of the program in improving outcomes for low-quality facilities can be understood. Analysis of data since the implementation of the TA program, across all facilities participating in the program, demonstrated improvement in 16 quality indicators and declines in only six.31 The following are the indicators that have improved:

Several quality indicators have gotten worse in Missouri since the implementation of QIPMO, including behavior problems for high-risk residents, patients receiving nine or more medications, range of motion training/practice, and antipsychotic use in the absence of an appropriate diagnosis. Preliminary investigations by QIPMO staff suggest that these declines may reflect MDS coding issues rather than an actual decline in care.

Maryland is the only other state that has attempted to formally evaluate the impact of its program on a select number of quality indicators. According to the Maryland Department of Health and Mental Hygiene/Office of Health Care Quality (OHCQ), the eventual evaluation will look at complaint rates, correlations between deficiency citations and areas targeted for facility quality improvement, and facility satisfaction with the Second Survey.

6.3 Informal Assessment of TA Impact Among Study States

During state site visits, the research team asked about the perceived impact of the TA program in each state on the quality of life or quality of care of residents. Only one state was able to report any empirical analysis of the outcomes of its efforts. Thus, most of the information we present in this section is anecdotal, gathered during our discussions with stakeholders.

State program staff and stakeholders were also asked to describe aspects of the program that worked well, aspects that could be improved, sustainability, and lessons learned. Combining this information helped us understand how the programs had been able to effect change in facility systems or processes related to quality improvement--although it was typically difficult to attribute those changes solely to the TA program. Here we describe respondents' impressions of how the various TA programs have improved resident outcomes, which factors make them effective, and what difficulties they see as inherent in measuring program effectiveness.

Ongoing Feedback Mechanisms

The informal feedback TA programs received from facility staff generally took the form of a paper questionnaire given to facility staff at the end of a TA visit, asking them to rate the performance of the TA staff and how helpful the visit had been. Some facility staff in Florida and Washington, where TA staff also function as surveyors, told the research team that they are hesitant to give any negative feedback on these questionnaires for fear that the staff member making the TA visit might be conducting their next LTC survey or complaint investigation. Texas was the only state that reported using the Internet to collect feedback on its program. In Washington, the survey staff holds quarterly forums with executives from the nursing home industry to discuss issues related to quality. Maryland state officials reported using the feedback collected during the TA program's first year to establish the focus for the second year's visits.

Informal Assessment of Impact on Quality of Care and Quality of Life

Maine, Texas, Florida, Maryland and Washington all reported anecdotal comments on the impact of their TA program on resident quality of life and quality of care issues. For long running TA programs like Maine's and Washington's, participants made relatively strong statements on the impact of their programs. Maine's program was praised by every participant as improving the quality of life for the affected residents. Providers believed that the quality of life for the residents referred to the behavioral consultation program was definitely improved, because staff were able to provide better care to a difficult population. Anecdotal feedback from survey staff, the ombudsman, and facilities indicated that the consultations have led to changes in plans of care that have had positive results for both residents and staff. LTC survey staff from the state indicated that, based on informal feedback, the education and support given to staff has decreased medication use among the residents and the number of discharges due to behavioral issues.

In Washington, program staff reported that they believe the TA program is positively affecting outcomes and quality because of informal feedback they receive from providers and stakeholders. Providers and ombudsmen with whom we held discussions noted ways in which they thought the TA program positively affected quality. For example, one provider stated that a good TA nurse can help facilities prioritize quality problems and can help new Directors of Nursing and facility staff to improve quality. An Ombudsman stated that the TA program has a positive effect because it promotes taking care of problems at an early stage. Many respondents viewed good performance on the survey as indicating better quality and indicated that TA visits helped facilities perform better on the survey.

Comments on programs implemented more recently were more tentative, especially in Texas, with many respondents adopting a "wait and see" attitude.

On the positive side, in every state there were participants who said the TA program was helpful, was a good resource for clinical and/or regulatory information, had taught or helped providers improve a skill, and represented a welcome change from the traditional adversarial relationship between provider and LTC survey staff. Providers reported that in many cases they value the consultative advice provided, saying that for some it has changed the relationship between the state and providers for the better. Participants reported learning investigative and analytic skills from TA that they are then able to use to review current facility processes. The shift in focus from deficiencies to quality improvement is also seen as positive. Some survey agencies even reported that providers have fewer complaints about the survey process.

Negative comments were more specific to the individual state program. Lack of consistency between surveyor and TA information was noted as a problem in Washington and Florida. In Florida particularly, providers noted that TA staff hired when the program was initially legislated were former surveyors receiving a promotion, but that those brought in as part of subsequently legislated program changes were not experienced in long-term care, geriatric clinical issues, or the regulations--and thus were less helpful to providers. Florida providers also noted that the value and usefulness of the TA program, which reflects program staff and leadership, appears to vary considerably by region. Both Florida and Washington discussion participants reported problems with the frequency and regularity of TA visits. In each state, visits are mandated to occur on a regular basis, but sometimes do not, leading to distrust of program staff and perceptions of reduced effectiveness. In both these states, TA staff are also utilized for surveyor tasks. Lastly, in Missouri and Texas, providers said they are occasionally overwhelmed by the amount and complexity of information provided by the TA program. Missouri TA staff are advanced practice nurses employed by the university school of nursing, who use clinical studies as guidance for providers. In Texas, the TA staff promote evidence-based practice guidelines developed by academic, clinical, and medical experts. Respondents in Texas reported often being uncertain how to use all the information and how much of it they will be held accountable for.

The Florida and Washington programs, as noted, both involve TA staff functioning in multiple roles. Washington's TA staff act as surveyors on occasion, and Florida's TA staff monitor facilities that are closing or in immediate jeopardy. In these states, facilities said they need to be aware of these differing functions and that, depending on the situation, the role of the TA staff and their relationship with the facility may change. These seeming areas of overlap between TA and enforcement are seen by some to have a positive impact on quality, adding "teeth to be able to penalize facilities that don't perform." But others see them as negatively impacting the relationship and any atmosphere of openness between the facility and agency staff. Respondents from states where TA staff performed multiple roles made the point that where there are competing demands on staff who perform both roles, the TA role is often the one that suffers. More work is needed to evaluate which strategies most effectively change the culture of caregiving.


7.0 SUGGESTIONS FROM STUDY STATES TO OTHERS CONSIDERING QUALITY IMPROVEMENT PROGRAMS

We asked providers, state program administrators, and consumer representatives in each of our study states for the general guidance they would offer other states considering quality improvement programs. We also asked for specific suggestions based on lessons they learned from programs initiated in their states. The sections below summarize this guidance for states considering developing a QI initiative.

7.1 TA Programs

Many respondents offered advice related to the structure and function of TA programs, particularly regarding the relationship between TA and survey. The majority of respondents reported that they believe the TA programs are worthwhile and have a positive impact on facility quality of care. However, they varied in their opinions regarding which facilities should be targeted to receive technical assistance. Some consumer advocates said TA programs should focus primarily on small independent facilities that have fewer of their own resources from which to draw. Other stakeholders thought TA programs should either be mandatory for all providers or should focus primarily on poor performers.

Strong, but by no means unanimous, opinions were expressed about whether states should maintain separation between their TA programs and their LTC survey and certification process. States that had preserved that separation felt strongly that it is critical to the fundamental purpose of TA--i.e., to help facilities improve the care they deliver. Stakeholders from both the state survey agencies and the TA programs holding this view emphasized that any blurring of the lines between survey and TA could cause providers to become skeptical about confidentiality and to fear that information shared during TA sessions will be reported to surveyors. This perceived lack of confidentiality, they felt, has the potential to chill the relationship between technical assistance staff and facilities, resulting in a loss of candor on the part of facilities and, with it, lost opportunities for TA assistance.

In contrast, most program staff and many providers that we talked to in states with closely tied TA/survey programs recommended that TA staff also function as surveyors, for the reasons discussed in Section 3.2: the association with survey gives TA staff greater authority and more regulatory knowledge, and therefore a better ability to effect positive changes in resident care.

In several states, respondents representing both TA programs and facilities stressed how important the quality and personality of TA staff are to the success of their efforts. To be effective, it was generally agreed, staff members should be experienced in long-term care and sufficiently flexible to work collaboratively with facility staff. It was also agreed that the standards used and the training given to TA staff must be consistent, to avoid subjective consulting that varies across facilities.

7.2 Other Quality Improvement Initiatives

Administrators of quality improvement programs in study states also offered some specific advice for state officials interested in developing other QI initiatives:

Awards and Recognition Programs and Best Practice Initiatives. Participants thought it important to ensure that there is a consumer advocate position on the selection panel, and that this position is well defined so it does not default to "an industry representative who has a relative in a nursing home." They also recommended that the selection panel visit any facility nominated for an award, to validate nomination criteria and make sure the facility is in fact "doing something special" and not merely meeting minimum criteria. Stakeholders said it was important that the selection process be seen as objective--so that the award, in turn, is seen as truly recognizing outstanding quality. Stakeholders recommended that consideration be given to the criteria used to select facilities for awards. States advised caution about setting criteria too low or evaluating facilities over too short a period to ascertain whether the facilities chosen were maintaining good practice on a consistent basis. This is important to avoid the inevitable bad publicity and diminished consumer trust that result when facilities singled out for recognition later experience quality problems.

Training Initiatives. Several stakeholders asserted that the most effective training programs were those that included both interpretation of regulations and practical examples of how to integrate care principles into practice. Some also recommended joint training for providers and LTC surveyors. This admittedly leads to some discomfort in both groups, but it provides an effective medium for dialogue between providers and surveyors, has the potential to promote greater understanding and cooperation, and ensures that both groups receive the same information. This, in turn, decreases the problem of differing interpretations of the guidance offered. With respect to education more generally, some participants noted the need to educate (a) the public about realistic expectations regarding nursing home care outcomes and (b) facilities to better manage the expectations of patients and families.

Public Reporting Programs. Comments by some stakeholders suggest skepticism about consumer use of public report cards on nursing home quality. Nonetheless, in states that invest in public reporting, it became apparent during our discussions that a balance must be struck between providing enough information to consumers to assist them in making more informed decisions and overloading them with information and data that become too cumbersome to decipher. One solution recommended by several states is to develop a scoring system that incorporates multiple quality measures (e.g., survey and deficiency information and/or quality indicators). The advantage of such a system is that it reduces information overload and is easy for the consumer to understand. States caution, however, that the accuracy of these scoring systems as predictors of real quality is subject to considerable dispute and has not been empirically validated. States also advised caution regarding the potential negative impact on access if facilities begin turning away heavier-care residents because they fear their "consumer report cards" will be adversely affected by scoring systems that do not take sufficient account of facility differences in types of patients (and their differing care needs).


8.0 SUGGESTIONS FROM STUDY STATES TO THE FEDERAL GOVERNMENT

During the case studies we asked stakeholders whether there were any suggestions they wished to offer the Federal Government with respect to nursing home quality improvement. The comments we received addressed perceived federally imposed barriers to state-initiated quality improvement programs, as well as federal policies related to regulation, staffing, and quality.

In general, the states said they wanted to improve their relationships with the Federal Government. Officials in one state described the relationship between CMS and the state as "hostile." Providers in that state were especially upset by their belief that a deficiency-free state survey often triggered a federal survey. They encouraged the Federal Government to implement a policy that rewards good nursing homes with less frequent surveys and to focus resources on poorly performing facilities. Officials in another state said the Federal Government should be more flexible in allowing states to be innovative and to make their own attempts to improve quality. Stakeholders across states expressed a desire to either implement or expand technical assistance programs or other quality improvement initiatives--but believe that federal funding for such initiatives needs to be expanded.

8.1 Federal Program Provisions

CMS Public Reporting Initiative and Quality Improvement Organization (QIO) Involvement

Washington was one of the pilot states for the recent federal initiative to publicly report national quality measures (QMs). Respondents there had very mixed opinions of QM public reporting, although those who commented generally agreed that "quality indicator" would be a more accurate descriptor than "quality measure," since they did not believe that the QMs are the only aspect of quality that should be considered when making judgments about facility quality.

Some consumers in Washington were also skeptical of the QM initiative, saying that the QMs are too clinical and that they did not believe there was a good correlation between performance on QMs and "real quality." Consumers also argued that the Federal Government should do more to ensure consumer (resident) representation on federal quality initiatives such as the QM and QIO projects.

Officials in another state believed that information on the CMS Nursing Home Compare website was too general and that the website needed to post more details to be really helpful to states. They thought it would be preferable to post all CHSRA QIs for each nursing home. Program staff in one state thought that CMS should post five years of survey and complaint data plus selected QIs. Respondents were also concerned about the timeliness of the data, since timeliness heavily affects the value of the posted information to consumers.

Regarding the new QIO initiative, many respondents from state survey agencies believed that the QIO program was an untapped resource that could work with the state survey agency to bring about the changes in facility practices needed to improve quality. Officials in one state believed CMS would be better served by awarding that responsibility (and the associated funding) directly to the states. Some respondents suggested that the role of the QIO as an "improver" may be undermined by the QIO's required function as an "enforcer." Officials in another state were more concerned about the QIO's lack of experience with nursing facilities.

Overregulation

Many respondents felt that the current level of federal regulation is too demanding, although facility representatives generally felt that the state was even more demanding than the Federal Government in its expectations for high quality performance. Others were less concerned about the amount of oversight and more concerned about a need for more understandable regulations. Finally, one state's for-profit providers indicated that the federal regional offices should be doing a more diligent job overseeing the local state field offices to make sure they were doing their jobs fairly.

Staffing

Stakeholders were universally concerned about staff turnover and the related issues of maintaining adequate staffing in facilities. All complained of staffing shortages, high turnover, lack of mid-level staff with management skills, and pervasive use of contract staff. One state's consumer representative said that while she was not opposed to new quality improvement programs, the main issues at hand concerned inadequate staffing of the programs currently operating. Some stakeholders, particularly consumers, believe that the best thing the Federal Government can do to improve nursing home quality is to do "whatever it takes to improve staffing." On the other hand, some providers expressed concern that requiring minimum staffing ratios would not be appropriate, particularly if there were not significant reimbursement increases to pay for the higher staffing levels. There is concern about the ability to staff at the required level, given the nursing shortages that exist in many parts of the country, and also concern about how to account for differences in facility case mix in determining the required minimum staffing level for each facility.

8.2 Other Suggestions

A variety of other suggestions were also directed to the Federal Government.


9.0 CONCLUSIONS

The backbone of the nation's system for monitoring nursing home quality of care is the LTC survey and certification process, which focuses on facility compliance with the regulations governing Medicare and Medicaid certification. This regulatory focus sharply limits the amount and types of consultative advice LTC surveyors can provide, as reflected in Section 4018 of the State Operations Manual:

"It is not the surveyor's job to examine the facility's policies and procedures to determine or speculate on the root cause of deficiencies, or to sift through various alternatives to prescribe one acceptable remedy."33

Survey and certification staff are directed not to assist facilities with in-depth problem solving on ways of improving the quality of care delivered. They are allowed to disseminate information that may be of assistance to the facility in meeting long-term care requirements, but they do not provide training to nursing home staff on quality-related issues.34

This limited focus, combined with continuing concerns about nursing home quality, has led some states to supplement their quality assurance standards with consultative, collaborative programs that directly address quality improvement. The goal of the study reported here is to examine these state-initiated quality improvement efforts and, more specifically, to identify their characteristics and look for information that might be helpful to other states considering such initiatives.

We focused on seven states with quality improvement programs: Florida, Iowa, Maryland, Maine, Missouri, Texas, and Washington. For each of these states, we collected detailed information on their quality improvement programs through both in-person and telephone discussions with stakeholders.

9.1 Technical Assistance Programs

While we cannot systematically evaluate the effectiveness of technical assistance programs in improving quality of care, feedback from providers in the states we visited indicates a need for this type of program. All discussants agreed that technical assistance programs fill an important gap, and the majority of stakeholders we talked to, including officials from state survey agencies, provider representatives, and consumer advocates, believe these programs have had a positive impact on improving nursing home quality. It is also abundantly clear that, in all the states we visited, the technical assistance staff have been able to establish a more collaborative, less adversarial relationship with nursing facilities than is typical for surveyors.

Many nursing facility staff seem to value the opportunity to have an open dialogue with technical assistance staff about problems and issues in residents' care, to obtain information on good clinical care practices, and to receive training and feedback on how they can improve their care processes. Some providers, however, seem to misunderstand TA programs that do not focus on regulatory issues or survey performance. Many facilities consider this lack of regulatory focus a disadvantage, because achieving good survey outcomes is an important goal for them. Some facilities, indeed, are primarily interested in receiving advice on survey preparation. These facilities generally are not receptive to the types of quality improvement oriented assistance provided as part of the technical assistance programs in the majority of states we studied. As discussed below, however, there are also potential disadvantages in having a TA program that is closely tied to the survey process.

The enforcement process does not appear to have been compromised in states with technical assistance programs. In some states this is because technical assistance and survey activities are separated from one another. The technical assistance programs in Maine, Maryland, Missouri, and Texas, for example, do not directly deal with compliance issues. In the states where the two functions are not as distinctly separated, Florida and Washington, the technical assistance programs have more of a regulatory focus and direct consultation on care processes is typically not provided. We heard a few reports of problems when advice from the TA staff conflicted with what the facility heard from surveyors, but these incidents appeared to be isolated. TA programs are clearly able to provide a constructive complement to the enforcement-related survey and certification activities.

To date, only Missouri has formally assessed the effectiveness of its program. Its analysis has shown improvements since program implementation in the majority of the quality indicators the state selected for comparative measurement. In coming years, we expect additional analyses of program effectiveness. Such analyses may allow more definitive conclusions to be drawn regarding which types of TA programs are most successful in improving quality.

In spite of considerable differences across states in the design and goals of their technical assistance programs, several common issues emerged that states planning technical assistance programs need to consider.

Separation Between Technical Assistance Program and Survey Process

The typical reaction of nursing facility staff is to distrust technical assistance programs, particularly if they are run by the state survey agency or staffed by former or current surveyors. Many administrators want to avoid having surveyors in the facility any more frequently than is required by law. It takes time to educate facility staff about the potential benefits of technical assistance programs, and a major component of this educational process involves convincing facility staff that it is "safe" to have an open discussion with technical assistance staff and that results of technical assistance visits will not lead to survey deficiencies. Separating the technical assistance function from the survey process almost certainly helps achieve this purpose.

The degree of separation between technical assistance and survey staff varied across states. Missouri and Maryland have the greatest separation. In Missouri, there is little interaction between the state's technical assistance staff, who are employed by the University of Missouri, and the survey agency. This separation seems to facilitate the emphasis of these programs on providing consultation to facility staff, including reviewing care plans for individual residents and providing training to staff. Technical assistance staff in Missouri deliberately avoid enforcement and regulatory issues. LTC survey staff, in turn, avoid any consultative role. Acceptance of Missouri's program by nursing facilities was reportedly slowed because, when the program started, it was more closely linked to the survey process.

In Maryland, the state's technical assistance nurses report only the most extreme quality of care violations to the state survey agency. When technical assistance staff identify routine violations, they bring such violations to the attention of the nursing home staff, require a plan of correction, and provide ongoing compliance monitoring. The state believes this level of separation is necessary in order to get providers to accept the technical assistance program.

In states like Washington, where the distinction between technical assistance staff and the survey agency is not clear, this lack of separation likely causes some distrust of the technical assistance staff among nursing home providers, resulting in a reluctance to have an open discussion with technical assistance staff about quality improvement issues. We were not able to evaluate whether this affects program effectiveness, but comments from providers suggest that this close association between TA and survey staff can present real problems.

Making a Choice between a Focus on Directly Improving Care Practices versus Improving Regulatory Compliance

The choice between an approach that emphasizes nursing home care practices and one that emphasizes regulatory compliance appears to be driven primarily by the state's philosophy and by the availability of federal funding for programs based in LTC regulation. In Washington State in particular, there is a belief that the monitoring and enforcement of federal requirements for facilities can and does result in higher quality of care delivery. It is clear that many nursing facilities value technical assistance that is focused on improving survey outcomes, and that some value this type of assistance more than technical assistance directly focused on improved quality of care. There may be greater potential for conflict of interest in the programs with a regulatory focus, where TA staff who often work as part of the state survey agency provide advice on issues related to regulatory compliance, but there are no data that permit determination of which type of approach is more effective in improving quality.

It is also the case that in states where technical assistance programs have a primarily regulatory focus, the distinction between technical assistance and LTC survey tends to become blurred. In Florida and Washington, for example, technical assistance staff occasionally act as surveyors, sometimes having to clarify for facilities which role they are playing on a particular day. This would seem to have an obvious impact on the type of information shared between facility and technical assistance staff, which can be expected to mute the effectiveness of any technical assistance whose intended focus is quality improvement outside the realm of regulation.

Importance of TA Program Staffing

Across all the study states, TA staff tend to be experienced and highly trained. Florida's quality monitors were initially recruited from the best surveyors in the state. Washington's QANs are all masters-prepared nurses. Most of Missouri's technical assistance staff have advanced nursing degrees and many have been personally recruited by the director of the technical assistance program. It is noteworthy that, in all the study states, the technical assistance staff tend to be more experienced than most of the surveyors. This gives them the clinical knowledge they need to address the variety of topics that may be covered during a technical assistance visit.

In addition to clinical experience, the personality of technical assistance staff was considered important to the success of a quality improvement effort. Our discussants said that technical assistance staff need to be good teachers, good communicators, and good listeners. They need a personality that allows them to build trust with facilities and enables them to encourage facilities to be active participants in the technical assistance program. These "soft skills" could well be as important to technical assistance staff success as their clinical background.

States varied with respect to whether technical assistance staff had survey experience, and we could not draw any conclusions about the importance of this type of experience. On the one hand, we heard reports that it may be difficult for surveyors to change from emphasizing enforcement issues to focusing on nursing home care practices. On the other hand, experienced surveyors may have insights from their experience as to best practices observed at other facilities that they can share. Having survey experience was clearly important for technical assistance programs that have a regulatory focus.

The Trade-Off Between Regulatory and Care Practice Focus

The technical assistance programs in Florida and Washington, which emphasized regulatory compliance issues more than the programs in other states, provided only a limited amount of direct consultation to nursing homes. Florida's quality monitors are deliberately careful to keep suggestions very general, forcing the facility to select the processes they feel are most appropriate to the needs of their residents. In Washington, technical assistance staff advise facilities to network with one another, but they avoid telling facilities how to fix problems. Reasons for the limited consultation provided in these states include (1) avoiding the danger of facilities being cited for doing something technical assistance staff told them to do; (2) limiting the potential liability of the technical assistance program for any advice they may give; (3) Federal restrictions on the types of consultation that can be provided as part of the survey and certification process; and (4) in Washington's case, preserving the perception that they are not providing "technical assistance" in order to maintain eligibility for federal funding.

In Maine, Missouri, and Texas, where the explicit intent is provision of direct consultation with facilities that is unrelated to regulatory issues, technical assistance staff appear to be comfortable sharing advice with facilities on how to treat particular conditions and individual residents. The Maine technical assistance nurse actually drafts care plans for inclusion in the medical record. Missouri technical assistance staff bring along many resource materials to the facilities they visit and provide guidance on a variety of topics. Texas technical assistance staff disseminate evidence-based best practice guidelines. Stakeholders in these states told us they greatly value the types of direct consultation provided under these technical assistance programs.

Trade-off of Mandatory Program Participation

In most study states, facility participation in technical assistance programs is mandatory. Participation in the technical assistance programs in Maine and Missouri, however, is voluntary. About 45 percent of nursing facilities received on-site consultation from Missouri's technical assistance program. Detailed facility statistics are not available for the Maine program since they track interventions by resident rather than by facility, but it is believed that a majority of the state's 126 nursing facilities have participated.

Voluntary programs allow facilities that do not want technical assistance to opt out and not receive this assistance. This runs the obvious danger that the facilities most in need of help may not receive it. Study discussants suggested that facilities with the worst quality do not participate, in part because they either do not understand the program or do not have the systems in place to benefit from it. This is certainly a plausible result of voluntary participation. The state survey agency in Missouri did not contradict this position, but was not troubled by such a possibility, arguing that the problems at the facilities with the most severe quality issues should most properly be addressed through the enforcement process rather than through TA.

On the other hand, even in states with mandatory technical assistance programs, it is likely that some facilities do not benefit from the programs, because they are either unwilling or unable to use the advice received during the technical assistance visit to make changes to care processes. Some discussants believe that high staff turnover has resulted in facility staff actually having less contact with technical assistance staff. It is not clear that focusing on poor performing facilities would maximize the impact of technical assistance programs, given that these facilities may be too overwhelmed by the tasks involved in providing basic care to be able to undertake new quality improvement initiatives.

The Value of Focusing TA Visits on Quality Indicators

Maryland, Missouri, and Washington all incorporate quality indicators into their protocols. These states use quality indicators as: (1) a means of targeting clinical areas of focus (Washington and Missouri); (2) a foundation for measuring how well both facilities and the technical assistance program are performing (Maryland and Missouri); and (3) a basis for facility improvement plans that can then be reviewed as part of the TA visit (Maryland).

The Need to Make Evaluation Part of the Program Design

There have been few systematic evaluations of the effectiveness of state technical assistance programs, and the designs of the current technical assistance initiatives--even when they have been in operation long enough to permit evaluation--will make it difficult to estimate how well the programs work. Of particular concern from an evaluation perspective is the simultaneous statewide implementation of several quality improvement programs. This is understandable, given the perceived urgent need to improve nursing home quality. However, a strategy that concurrently implements multiple interventions makes it virtually impossible to measure the effectiveness of any particular type of technical assistance. Given the current fiscal environment, states planning to implement quality improvement programs should recognize the increasing importance of evaluating these programs and should make evaluation part of the program design.

Federal Funding for Quality Improvement Programs

Federal funding is not generally available for programs that have a consultative or quality improvement focus. The study states make limited use of federal funds for their technical assistance programs, typically funding their programs from state general revenue funds, sometimes supplemented by the state portion of Civil Monetary Penalty (CMP) awards and/or penalties or fees levied on facilities. Some states explained that there were "too many strings attached" to using federal funding for these TA activities.

9.2 Other State Quality Improvement Initiatives

Other quality improvement programs in the study states fall mostly into one of four categories--training programs, programs that provide recognition to high-performing facilities, best practices programs, and public reporting programs. The same staff are generally responsible for both the TA and these other programs (with the exception of public reporting programs), and the two are often operationally indistinguishable.

The effectiveness of these programs has not been explored, and measuring their impact on quality of care would be difficult if not impossible. But feedback from provider and consumer groups indicates that they have generally been well received and are viewed positively, even if they are not perceived as producing large changes in quality.

Training

Almost all states had some type of formal training as part of their quality improvement programs. In general, these training sessions have been well attended and feedback has been positive. It is not possible to determine whether these training programs have led to quality improvements, although there is some anecdotal evidence of practice changes that were made following training sessions. The experience of states that have conducted joint surveyor-provider training programs is mixed. Having both surveyors and providers at the same training session often inhibits discussion, and there is often resistance from both sides. Such sessions do, however, ensure that both providers and surveyors receive the same information and may ultimately help to improve the surveyor-provider relationship, leading to better communication during the survey process. With respect to training programs, some participants noted the need to educate the public about realistic expectations regarding nursing home care outcomes and the need for facility training to help facilities better manage the expectations of patients and families.

Best Practices

Our research team noted a great deal of variation in what the study states described as best practices. Programs varied in how best practices were defined, where they originated, and how they were positioned among the state's other quality improvement programs. Some states defined best practice simply as an innovative idea originating at the facility level that was seen as potentially valuable to other facilities. For example, Iowa posts on its website innovative best practices deemed to be among the best in the state. Other states define best practices based on expert-derived, clinical protocols that should be adopted by facilities so as to raise the standards of practice. This is the approach used in Maryland, Texas and Missouri.

Facility Recognition

Florida and Iowa have developed and initiated reward and recognition programs as part of their quality improvement efforts. The goal of these programs is to recognize facilities doing exemplary work. These programs received positive feedback from both providers and consumer advocates. Providers view them as tools for combating the negative stereotype of nursing homes so often presented to the public. Consumer advocates present them as potentially useful sources of information for elders and their families making long-term care decisions. There is concern, however, that these types of programs are focused on facilities that already deliver quality care, and may divert state attention from the facilities with the quality problems.

Public Reporting

There was some concern about whether public reporting serves primarily as useful consumer information or as a marketing tool for nursing homes. However, given the increasing use of this type of information, all discussants agreed that public reporting programs must strike a balance between providing information to consumers to assist them in making more informed decisions and not overloading them with information and data too complicated for them to use. Many discussants expressed concern that publicly reported data need to be timely, valid, and sufficiently risk adjusted to provide meaningful information. In addition, provider groups expressed strong opposition to posting survey results that are under appeal. Given that the appeals process can take years to reach a final resolution, however, not posting results until appeals are resolved would yield data that are too outdated to be useful for consumers needing to make placement decisions. Research is needed to understand the extent to which (a) public reporting systems are used by consumers to guide nursing home placement decisions, and (b) public reporting of information on facility quality actually leads to quality improvements.


NOTES

  1. For example, see GAO (2000, 2002), OIG (1999a, 1999b).

  2. See OIG (1999c), CMS (1998).

  3. California, Indiana, North Carolina, Ohio, and Wisconsin plan to implement TA programs, but these programs had not yet started at the time we were collecting data for this report.

  4. Michigan and Virginia were not included in the study because of the limited number of facilities that have participated in their TA programs.

  5. Despite the absence of a technical assistance program, the project's Technical Advisory Group believed that the study should include Iowa, as its programs may be substitutes for a technical assistance program and may include quality improvement models that other states may wish to replicate, potentially improving our study's ability to provide guidance to states considering implementing quality improvement projects. Iowa's quality improvement programs involve a wide variety of efforts including an internet web-based Nursing Home Report Card, recognition programs for exemplary practices and performance on licensure and certification surveys, training for providers and surveyors, feedback on surveys/surveyors and an alternative survey process for state-only licensed facilities meeting certain criteria. (See Appendix A for more details).

  6. For example, a 2002 report from the Office of the Inspector General concluded that "problems with quality of care continue to exist in nursing homes." The study found that the number of quality-of-care deficiencies has increased in recent years, as has the number of nursing home workers excluded from the Medicare and Medicaid programs as a result of patient abuse or neglect (OEI, 2002). A 2001 report prepared by the U.S. House of Representatives Special Investigations Division found that more than 30 percent of nursing homes had been cited for abuse violations between 1999 and 2001.

  7. See Rantz MJ, Popejoy L, Petroski GF, Madsen RW, Mehr DR, Zygart-Stauffacher M, Hicks LL, Grando V, Wipke-Tevis DD, Bostick J, Porter R, Conn VS, Maas M (2001). "Randomized clinical trial of a quality improvement intervention in nursing homes." The Gerontologist 41(4), 525-538.

  8. The Plan, Do, Study, Act cycle of improvement (also referred to as Shewhart's Cycle for Learning and Improvement) is one that is commonly cited by organizations, such as CMS's Quality Improvement Organizations, that conduct continuous quality improvement activities. Another framework (see Massoud, 2001) specifies a different set of steps: identify, analyze, develop, and test/implement. Yet another uses the standard steps of the nursing process: assessment, planning, implementation, reassessment, and evaluation.

  9. Maine did create what it called a "vision" for what quality improvement programs should look like, though this was developed too recently to be relevant for the Maine programs included in this study.

  10. See Chapter 5 for more information on the provisions of the Social Security Act that Texas used to secure federal funds for its Quality Monitoring program.

  11. Originally, TA staff went through the risk management training offered by the University of South Florida. However, risk management training was not provided for those hired when subsequent legislation increased the number of Quality Monitors.

  12. This is a much higher participation rate than the much smaller programs in Michigan and Virginia, which also have voluntary TA programs but were not included in the study.

  13. Laura Cote. Description of Behavior Management Consultation. September 2002.

  14. See Appendix B for an example of the Missouri Show-Me QI report.

  15. For more details on the results of Missouri's evaluation, see Chapter 6.

  16. Nursing Home Quality: A National Overview of Public Reporting Programs. Rhode Island Department of Health, Health Care Quality Series, Number 11, January 2002.

  17. In 2001, the average number of hours required per survey was 108. Across states, the average ranged from 66 hours per survey in Maine to 195 hours in Delaware. Based on last year's budget, any state taking more than 131 hours would be frozen at the previous year's funding levels. (Source: Interview with Steven Pelovitz, Director, Survey and Certification Group, Centers for Medicare and Medicaid Services).

  18. Source: Discussion with Steve Pelovitz, Director of CMS Survey and Certification Group, January 2002.

  19. This includes CMPs assessed against nursing facilities for non-compliance with federal requirements, against individuals who make false statements in a resident assessment (or who cause another person to make such false statements), and against individuals who notify a nursing facility when a standard survey is scheduled to be conducted.

  20. This includes payment for the costs of relocation of residents to other facilities, maintenance of operation of a facility pending correction of deficiencies or closure, and reimbursement of residents for personal funds lost.

  21. This memorandum is available on-line at http://cms.hhs.gov/medicaid/ltcsp/sc0242.pdf.

  22. According to analysis by the State Senate, licensure fees were expected to cover $783,000 of the costs of the quality monitoring program for FY 2001-2002 and $721,000 for FY 2002-2003.

  23. Source: Senate Staff Analysis and Economic Impact Statement, http://www.leg.state.fl.us/data/session/2001/Senate/bills/analysis/pdf/2001s1202.ap.pdf.

  24. Licensing fees or bed taxes can only lead to quality improvement to the extent that the nursing facility payment rate is sufficient to meet basic nursing facility needs. A potential concern is that these fees or taxes may not be used to improve quality but instead are used to shift costs from states to the Federal Government.

  25. Source: Texas Legislature On-Line website (http://www.capitol.state.tx.us/cgi-bin/tlo/textframe.cmd?LEG=77&SESS=R&CHAMBER=S&BILLTYPE=B&BILLSUFFIX=01839&VERSION=1&TYPE=F).

  26. This includes the costs for the 75 percent of their time that QAN nurses dedicate to QAN activities. (The remaining 25 percent of their time is allocated to survey activities.)

  27. The Wellspring quality improvement model is very labor intensive and incorporates, with additional resources, nearly every intervention that could plausibly impact quality. It has two primary goals: (1) to make the nursing home a better place for residents to live by improving the clinical care provided to residents, and (2) to create a better working environment by giving employees the skills that they need to do their jobs. (See http://www.cmwf.org/programs/elders/stone_wellspringevaluation_550.pdf.)

  28. Nationally, the average deficiency rate for nursing homes surveyed in 2001 was 6.2 per nursing home; this ranged from 2.9 deficiencies per nursing home in Vermont to 11.2 deficiencies in California (Source: OIG, 2003).

  29. An OIG review of 310 survey reports revealed that different deficiency tags are being used to cite the same problem. In five of the six standard surveys it observed, the OIG found inconsistency across surveyors in how deficiencies were cited, and also found differences across states in how many deficiencies they will cite for a single problem of non-compliance.

  30. While the CMS study found clear evidence of some important improvements in nursing home quality that resulted from the changes to the survey and certification process introduced as part of OBRA 87, this improvement is not relevant for assessing whether the marginal impact of additional resources on quality is higher for enforcement-oriented programs or for quality improvement (TA) programs.

  31. Missouri program staff have not compared outcomes for TA participants vs. non-participants because such a comparison would confound programmatic effects with selection effects, due to the non-random selection of facilities.

  32. Note that CMS is currently working on an updated RAI manual and clarified instructions for coding the MDS.

  33. Source: CMS (http://cms.hhs.gov/manuals/pub07pdf/part-04.pdf).

  34. Further clarification on the role of surveyors regarding consultation, technical assistance, and sharing best practice information can be found in a CMS memorandum dated 12/12/02, available on-line at http://www.cms.hhs.gov/medicaid/ltcsp/sc0308.pdf.


REFERENCES

Centers for Medicare and Medicaid Services. Study of Private Accreditation (Deeming) of Nursing Homes, Regulatory Incentives and Non-Regulatory Incentives, and Effectiveness of the Survey and Certification System, Report to Congress, 1998.

General Accounting Office. California Nursing Homes: Care Problems Persist Despite Federal and State Oversight. GAO-HEHS-98-202. Washington, DC: GAO, 1998.

General Accounting Office. Nursing Homes: Complaint Investigation Processes Often Inadequate to Protect Residents. GAO-HEHS-99-80. Washington, DC: GAO, 1999.

General Accounting Office. Nursing Homes: Quality of Care More Related to Staffing than Spending. GAO/HEHS-02-431R Washington, DC: GAO, 2002.

General Accounting Office. Nursing Homes: Sustained Efforts Are Essential to Realize Potential of the Quality Initiatives. GAO/HEHS-00-197. Washington, DC: GAO, 2000.

Institute of Medicine. Takeuchi, J., Burke, R., and McGeary, M., eds. Improving the Quality of Care in Nursing Homes. Washington, DC: National Academy Press, 1986.

Massoud, MRF. 2001. Advances in Quality Improvement: Principles and Framework. QA Brief--The Quality Assurance Project's Information Outlet. Spring, 9:1.

Minority Staff, Special Investigations Division, Committee on Government Reform, U.S. House of Representatives. Abuse of Residents Is a Major Problem in U.S. Nursing Homes, July 2001.

Office of the Inspector General. Nursing Home Survey and Certification: Overall Capacity, OEI-02-98-00330. Washington, DC: OIG, 1999(a).

Office of the Inspector General. Quality of Care in Nursing Homes, An Overview, OEI-02-98-00060. Washington, DC: OIG, 1999(b).

Office of the Inspector General. Nursing Home Survey and Certification: Deficiency Trends, OEI-02-98-00331. Washington, DC: OIG, 1999(c).

Popejoy, L.L., Rantz, M.J., Conn, V., Wipke-Tevis, D., Grando, V., & Porter, R. (2000). Improving quality of care in nursing facilities: The gerontological clinical nurse specialist as a research nurse and consultant. Journal of Gerontological Nursing, 26(4), 6-13.

Rantz MJ, Popejoy L, Petroski GF, Madsen RW, Mehr DR, Zygart-Stauffacher M, Hicks LL, Grando V, Wipke-Tevis DD, Bostick J, Porter R, Conn VS, Maas M (2001). "Randomized clinical trial of a quality improvement intervention in nursing homes." The Gerontologist 41(4), 525-538.

Shewhart, W.A. Economic Control of Quality of Manufactured Product. ASQC, Van Nostrand, New York, NY, 1980.

State of Florida, Senate Staff Analysis and Economic Impact Statement, Bill CS/CS/CS/SB 1202 Long Term Care. http://www.leg.state.fl.us/data/session/2001/Senate/bills/analysis/pdf/2001s1202.ap.pdf.

Walshe K. (2001). Regulating U.S. nursing homes: Are we learning from experience? Health Affairs 20(6), 128-144.


TABLE 1. Description of State-Initiated Technical Assistance Programs
Florida
  Program Administration: Health Standards & Quality Unit, Division of Managed Care and Health Quality within the Florida Agency for Health Care Administration (AHCA).
  Facility Involvement: Mandatory visits for all facilities (approx. 700 facilities).
  Funding Source(s): The Quality of Long-Term Care Facility Improvement Trust Fund, which supports activities and programs directly related to the care of nursing home and assisted living facility residents, is funded through a combination of general revenues and 50 percent of any punitive damages awarded as part of a lawsuit against nursing homes or related health care facilities (Florida law 400.0238).
  Program Staff: 19 Quality Monitors in 8 geographic regions; each monitor has a caseload of approx. 30 facilities.
  Frequency of Visits: The design is for all facilities to be visited at least quarterly, plus additional visits to facilities on the Watch List, facilities with a history of non-compliance, facilities whose QI reports reflect potential weaknesses, facilities that have recently changed ownership, administrator, or Director of Nursing Services, and all new facilities.
  Year Established: 1999.
  Evaluation (Informal vs. Formal): Feedback forms are collected from facilities visited by Quality Monitors to assess the helpfulness of the visit and rate the performance of the TA staff; no formal analysis has been done to date.
  Relationship to LTC Survey: The QOC Monitor program is administered by the survey branch. QOC Monitors do not share findings of monitoring visits with LTC Survey, except when conditions threaten the health or safety of a resident. Monitors perform the following surveyor responsibilities: monitoring facilities' compliance with the internal risk management program and minimum staffing standards, and coordination with the Field Office Managers in visiting facilities that are being financially monitored, closing, or in immediate jeopardy.

Maine
  Program Administration: Maine Department of Human Services, Bureau of Medical Services, Division of Licensing and Certification.
  Facility Involvement: Voluntary--no records kept of the number of facilities visited.
  Funding Source(s): The cost of this program is the single TA staff member's salary and administrative support, which is part of the Licensing and Certification budget. The funding for the state's best practices program comes from CMP fines.
  Program Staff: 1 RN Long Term Care Behavior Management Consultant.
  Frequency of Visits: Consultation upon request.
  Year Established: 1994.
  Evaluation (Informal vs. Formal): Informal evaluation of the perceived usefulness of the visit, conducted by the nurse.
  Relationship to LTC Survey: Reports to the Assistant Director of the Division of Licensing and Certification; reports are available to surveyors.

Maryland
  Program Administration: Maryland Department of Health and Mental Hygiene/Office of Health Care Quality.
  Facility Involvement: Mandatory for all facilities (approx. 250).
  Funding Source(s): All state general funds, approx. $400,000.
  Program Staff: 1 Manager, 5 RNs, 1 Dietician.
  Frequency of Visits: Annual visits.
  Year Established: 2000.
  Evaluation (Informal vs. Formal): A standardized tool was developed to examine compliance with regulations requiring facilities to implement a Quality Assurance Plan, which includes internal monitoring of falls, malnutrition and dehydration, pressure ulcers, medication administration, accidents and injuries, changes in physical/mental status, QIs, and other important aspects of care. Internal measures are reviewed by surveyors during the Second Survey. At the time of our visit, all nursing homes had been surveyed once, and baseline data had been collected and were being analyzed.
  Relationship to LTC Survey: The Second Survey program is administered by the survey branch. TA nurses do not share findings of monitoring visits with LTC Survey, although they do report egregious conditions that threaten patient safety.

Missouri
  Program Administration: Quality Improvement Program for MO Long-Term Care Facilities (QUIP-MO), administered by the University of Missouri-Columbia School of Nursing.
  Facility Involvement: Available to all facilities on a voluntary basis. As of July 2002, 345 site visits in 163 different facilities had been conducted by MU QI nurses.
  Funding Source(s): Funding provided by (1) a nursing facility QI fund derived from a facility tax based on number of residents, (2) an annual nursing facility licensing fee, and (3) Civil Money Penalty fines. In 2001-2002, the University received a $625,947 grant for its quality improvement programs; in 2000-2001, it received $743,424, and in 1998-1999, $492,258.
  Program Staff: Director, Statistician, Research Nurse, 7 QIPMO Nurses.
  Frequency of Visits: Voluntary program--visits scheduled based on facility request.
  Year Established: Pilot in 1999, official start in mid-2000.
  Evaluation (Informal vs. Formal): An anonymous evaluation instrument is completed at the conclusion of each site visit. Comparison of the distribution of CHSRA QI scores for all nursing facilities prior to the QIPMO start with 2001 scores (2 years into the program) shows improvement in multiple QIs.
  Relationship to LTC Survey: No relationship to survey agency.

Texas
  Program Administration: Texas Department of Human Services.
  Facility Involvement: Mandatory for all facilities (approx. 1,250).
  Funding Source(s): TA and other quality improvement programs are financed by a combination of state and federal matching funds and a facility licensing fee. The total budget for the first two years of the program is $2.7 million.
  Program Staff: 36 TA staff (RNs, pharmacists, and dieticians); 14 liaisons with providers; 16 FTEs for joint training.
  Frequency of Visits: It is the intention of the program to visit all facilities annually. Facilities are targeted for a visit based on priority as determined by indicators in a DHS Early Warning System that identifies the facility as being at risk for a poor survey. Facilities can also solicit a site visit from the rapid response team.
  Year Established: 2001.
  Evaluation (Informal vs. Formal): DHS anticipates evaluating the program in 2003, after it has been in place approx. 12 months. The evaluation will be based on comparing measures such as the number of pressure ulcers before and after initiation of the Quality Monitoring Program.
  Relationship to LTC Survey: Results of TA visits are discussed with survey.

Washington
  Program Administration: Department of Social and Human Services, Division of Residential Care Services.
  Facility Involvement: Mandatory visits for all facilities (approx. 275).
  Funding Source(s): The state receives a 75 percent match on TA staff salary and benefit costs. Costs for the program are approx. $2.8 million.
  Program Staff: 30 nurses, each with a caseload of 8-12 facilities.
  Frequency of Visits: Quarterly visits.
  Year Established: 1988.
  Evaluation (Informal vs. Formal): No formal evaluation of the impact of the program has been conducted.
  Relationship to LTC Survey: TA staff work within the LTC survey agency and share findings with surveyors. TA staff conduct LTC surveys as well as complaint investigations and monitoring of facilities that are in compliance trouble. TA staff may also write deficiency citations during a quality monitoring visit.


TABLE 2. Non-TA Quality Improvement Programs, by State
  Public Reporting: Florida, Iowa, Maryland, Texas, Washington
  Best Practice Dissemination: Iowa, Maine, Missouri
  Facility Recognition Program: Florida, Iowa
  Provider and/or Joint Provider/Surveyor Training: Florida, Iowa, Maine, Missouri, Texas
  Clinical Alerts Newsletter: Maryland
  Consumer Satisfaction Survey: Maine
  Rapid Response Teams: Florida, Texas
  Risk Management Program: Florida, Maryland
  Medical Director Requirements: Florida, Maryland
  Teaching Nursing Home Research and Training Program: Florida
  Corporate Visit Program: Washington
  Pet Therapy Program: Maryland
  Wellspring Project: Maryland
  Decubitus Ulcer: Maryland
  Family Council Project: Maryland

