Selecting Quality and Resource Use Measures: A Decision Guide for Community Quality Collaboratives

Part I. Introduction to Performance Data

Health care performance data can be obtained from multiple sources, including State and Federal governments, national accrediting bodies, research organizations, professional associations, health plans, employers, vendors that pool data from multiple plans or employers, and directly from providers. The answers to Questions 1-6 provide a framework for evaluating and selecting existing data from these various sources. Our emphasis is on data that are ready to present as accepted performance measures or can readily be converted into such measures, rather than on data that could be used to develop entirely new measures. These performance data can be delivered as ready-to-use, provider-specific results or as raw data requiring data management and analytic expertise to generate such results.


Question 1. What data, including both national and State sources, are readily available to collaboratives for performance measurement at the hospital and physician levels?

Within the last several years, key organizations have led a national public-private effort to harmonize measures to reduce the data collection burden on providers and to minimize confusion among stakeholders.10,11 Thus, many data reporting organizations overlap in the measures they collect and the results they provide. This answer summarizes the variety of available data sources for measuring hospital and physician performance. (Refer to the responses to Questions 8 and 9 for related information about measures.)

Most public agencies that offer off-the-shelf, summarized data at the provider level use measurement and data collection processes that have undergone rigorous review by committees of stakeholders. Using these existing sources obviates the need for a collaborative to undertake the time-consuming and costly task of validation. In addition, the data from these sources are frequently free. However, such data sources have several disadvantages, including inadequate coverage of some performance domains (domains described further in response to Question 20) and "one-size-fits-all" presentation of data, which may not reflect local or regional priorities.

To fill these gaps, some private entities have developed and implemented their own data collection and analysis systems, as further described in Table 1. However, the validity of these systems has generally not been established, and voluntary provider participation is a major limitation. Lacking any mechanism for auditing or any penalties for incorrect reporting, these systems are likely to suffer from selective nonparticipation by poorly performing providers and selective overreporting of performance.

Hospital Data

Table 1 summarizes off-the-shelf, publicly available sources of data available for hospital performance measurement at the national or State level. Some are available for free from government agencies, whereas others may require contracts with proprietary organizations.

Table 1. Organizations providing hospital-level data

Organization | URL | Description
National Data Sources

Centers for Medicare & Medicaid Services (CMS): Hospital Compare

www.hospitalcompare.hhs.gov

CMS, in collaboration with the Hospital Quality Alliance, supports the Hospital Compare Web site, which reports hospital-specific data for clinical process-of-care indicators (e.g., most of The Joint Commission's "Core Measures"), clinical outcomes indicators (e.g., risk-adjusted mortality and readmission rates for selected conditions), and patient experience measures derived from HCAHPS (the Hospital Consumer Assessment of Healthcare Providers and Systems survey). More information on this patient experience survey can be found at www.hcahpsonline.org. In addition, CMS piloted the reporting of selected AHRQ QIs in 2009. All of these efforts have been undertaken under the auspices of a program known as Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU), which links annual market basket payment updates to hospital participation.

Comment: Much of the process-of-care data available on the Hospital Compare Web site is also available from The Joint Commission. The outcomes data are limited in distinguishing provider quality, because they are based only on Medicare fee-for-service beneficiaries who received inpatient care at nonfederal hospitals.

The Commonwealth Fund: "Why Not the Best?"

www.whynotthebest.org

Announced in 2008, The Commonwealth Fund's hospital performance reporting Web site repackages the information provided through Hospital Compare and offers hospital-specific composite scores for heart attack care, pneumonia care, heart failure care, surgical care improvement, and patient experience, and an overall performance score.

Comment: For collaboratives just starting to consider public reporting, the Commonwealth Fund's "opportunity-weighted" composites offer a relatively easy-to-understand approach for presenting complex data on processes of care.
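
The "opportunity" approach can be illustrated with a small sketch (hypothetical numbers; this is a simplified illustration of opportunity-based scoring in general, not the Commonwealth Fund's exact methodology). Each eligible patient-measure pairing counts as one opportunity, and the composite is the share of opportunities on which recommended care was delivered:

```python
# Simplified sketch of an opportunity-weighted composite for one hospital.
# Assumption: for each process-of-care measure we know how many patients were
# eligible (denominator) and how many received the recommended care (numerator).
# Measure names and counts below are invented for illustration.
measures = [
    # (measure, numerator, denominator)
    ("Aspirin at arrival",         95, 100),
    ("Beta-blocker at discharge",  40,  50),
    ("Pneumococcal vaccination",  160, 200),
]

met_opportunities = sum(num for _name, num, _den in measures)
total_opportunities = sum(den for _name, _num, den in measures)

composite = met_opportunities / total_opportunities
print(f"Opportunity-weighted composite: {composite:.1%}")  # 84.3% in this example
```

Because every opportunity counts equally, measures that apply to more patients automatically carry more weight in the composite.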

HealthGrades®

www.healthgrades.com

HealthGrades® is perhaps the best known comparative reporting system. It assigns star ratings of "best," "as expected," or "poor" for at least 29 procedures and diagnoses using HealthGrades Hospital Report Card™ Mortality and Complication Based Outcomes Methodology. Most of these analyses are performed by applying proprietary risk-adjustment models to publicly available Medicare claims or all-payer hospital discharge data. HealthGrades also confers its Distinguished Hospital Awards for superior performance on a composite of selected AHRQ Patient Safety Indicators, Specialty Excellence Awards for superior performance on relevant risk-adjusted mortality or complication measures, and Outstanding Patient Experience Awards for superior performance on a composite of HCAHPS® items.

Comment: HealthGrades offers very limited information for public use through its Web site, although more detailed information may be available for purchase. The limited available information about HealthGrades' methods is a potential concern, and most measures are based only on Medicare fee-for-service claims.

The Joint Commission: Quality Check™

www.qualitycheck.org

The Joint Commission is a not-for-profit hospital accrediting body that provides data on its "Core Measures," most of which focus on specific evidence-based processes of care, as well as HCAHPS® survey data. Hospitals are required to submit data quarterly and to meet other specified standards to receive accreditation and certification. Quality data can be downloaded directly through the Quality Check Web site.

Comment: There is overlap between the measures reported by The Joint Commission's Quality Check and CMS's Hospital Compare Web sites. Because the Core Measures are divided into seven sets and hospitals are currently required to report only four, not all measures are available from all hospitals.

Leapfrog Group

www.leapfroggroup.org/cp

Leapfrog conducts a voluntary annual survey of hospitals nationwide to assess performance based on four quality and safety practice domains that are believed to reduce preventable medical mistakes. Endorsed by the National Quality Forum (NQF), these practice domains include computerized physician order entry, intensive care unit staffing by "intensivist" physicians, evidence-based hospital referral for selected high-risk procedures, and a targeted subset of 34 "safe practices" endorsed by the NQF. The resulting data are currently summarized for 10 different conditions and procedures, two adverse outcomes (hospital-acquired pressure ulcers and fall-related injuries), and a composite measure of "steps to avoid harm." In addition, Leapfrog assigns a rating of hospitals' implementation of its Policy Statement on Serious Reportable Events/"Never Events" (which identifies events that should never happen [e.g., removing the wrong limb or leaving surgical equipment inside a patient after surgery]).

Comment: The Leapfrog survey is conducted by Thomson Healthcare. Voluntary responses were submitted in 2008 from 1,276 hospitals in 37 major U.S. metropolitan areas, representing 48% of urban, general acute-care hospitals. Most components of the survey are not audited, so reporting bias is a potential concern.

Thomson Reuters

www.100tophospitals.com

Thomson Reuters (formerly Solucient) recognizes "100 Top Hospitals" based on both risk-adjusted clinical outcomes and financial performance, using publicly available Medicare claims or all-payer hospital discharge data. In addition, Thomson Reuters and other health care consultants contract with large national employers and their coalitions to analyze commercial claims data. These data may be particularly useful in midsize communities dominated by some of these large national employers.

Comment: Very limited information for public use is available through this Web site, although more detailed information may be available for purchase. The limited available information about Thomson's methods is a potential concern.

U.S. News & World Report

http://health.usnews.com/sections/health/best-hospitals/

U.S. News & World Report annually evaluates the performance of more than 1,500 U.S. hospitals in at least 16 specialties. In four of these specialties, evaluations are based purely on reputation, as reported by a national sample of physicians. In the other 12, evaluations are also based on risk-adjusted mortality from Medicare claims data (estimated using 3M® Health Information Systems APR-DRG* software), some AHRQ QIs, and other care-related factors, such as volume and nurse staffing (ascertained using the American Hospital Association's annual survey of hospitals).

Comment: Very limited information for public use is available through this Web site, although more detailed information may be available for purchase. The limited available information about U.S. News' methods is a potential concern.

*All Patient Refined Diagnosis Related Groups

State or Regional Data Sources

Statewide Health Data Organizations

 

Sources of hospital data at the State level include statewide health data organizations, hospital associations, private data organizations (e.g., Hawaii and Virginia), and State departments of health. These entities often collect hospital utilization, financial, and structural data relevant to hospital performance, in addition to all-payer inpatient hospital discharge data that go into AHRQ's multistate Healthcare Cost and Utilization Project (HCUP) databases. Some States (e.g., Pennsylvania) are collecting data on hospital-acquired infections or serious reportable events, while others are collecting data on nurse staffing levels or nursing skill mix. Most of these data collection efforts incorporate definitions established by the Centers for Disease Control and Prevention or the National Quality Forum, but the definitions may still vary slightly by State.

Inviting representatives from the State's health data organization to partner with the community quality collaborative may yield useful data for local or regional analysis that would be unavailable through a national source. A relatively low-cost approach to analyzing State-level data is to use AHRQ's Inpatient Quality Indicators, Pediatric Quality Indicators, and Patient Safety Indicators (refer to Question 9), because Windows-compatible software and documentation can be downloaded for free from the Quality Indicators Web site (http://qualityindicators.ahrq.gov); a simplified illustration of this kind of rate calculation appears after the comment below. In 2010, AHRQ will release a downloadable program (MONAHRQ) that allows organizations to load their hospital discharge data and generate a Web site providing query-driven information on all AHRQ QIs, including potential safety-related events, preventable hospitalizations, and volume or utilization of specific hospital services.

Comment: Rules regarding the release of hospital-specific data differ across States. Community quality collaboratives need to contact their statewide health data organization (www.hcup-us.ahrq.gov/partners.jsp) to determine such availability. Other limitations of this data source are that the data may not be available on a timely basis, depending on the State, and the quality of diagnosis coding for adverse events may vary across hospitals (which is particularly relevant to the AHRQ Patient Safety Indicators). However, these data offer the important advantage of being population based and including all payers (not just Medicare or Medicaid).
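
For collaboratives weighing what such an analysis involves, the sketch below shows the basic shape of a provider-level rate calculation on a discharge file. It is a deliberately simplified, hypothetical illustration, not the AHRQ QI software itself, which applies detailed inclusion, exclusion, and risk-adjustment logic; the file layout and column names are invented.

```python
import csv
from collections import defaultdict

# Hypothetical all-payer discharge file with one row per discharge.
# Columns "hospital_id", "in_denominator", and "had_event" are invented;
# real AHRQ QI logic derives them from diagnosis and procedure codes.
counts = defaultdict(lambda: {"events": 0, "eligible": 0})

with open("discharges.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["in_denominator"] == "1":            # discharge qualifies for the measure
            counts[row["hospital_id"]]["eligible"] += 1
            if row["had_event"] == "1":             # outcome of interest occurred
                counts[row["hospital_id"]]["events"] += 1

for hospital, c in sorted(counts.items()):
    if c["eligible"] >= 30:                         # suppress small denominators
        rate = c["events"] / c["eligible"]
        print(f"{hospital}: {rate:.1%} ({c['events']}/{c['eligible']} discharges)")
```

Observed rates of this kind are only a starting point; risk adjustment is needed before hospitals can be fairly compared.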

State and Regional Coalitions

 

Some States and regional coalitions have developed their own mechanisms for repackaging and rescoring hospital-specific data from Hospital Compare and from The Joint Commission. For example, the California Hospital Assessment and Reporting Task Force (CHART: www.calhospitalcompare.org) adds State-level all-payer data on risk-adjusted coronary artery bypass surgery, adult critical care, and inpatient pneumonia mortality, along with rates of breastfeeding without formula supplementation (based on requests for genetic disease screening), and then rescores hospital performance as superior, above average, average, below average, or poor. Different reporting organizations can use different cut points and benchmarks, resulting in the assignment of different symbols or ratings (stars, checkmarks, etc.) even though the underlying numeric scores are equal.

Comment: If a community quality collaborative chooses to draw upon multiple data sources in this manner, the scoring methodologies used by the reporting organizations should be carefully evaluated. Using the same cut points or benchmarks as the original data sources may lead to internal inconsistencies in how performance data are presented, but changing cut points or benchmarks may lead to inconsistencies with other sites reporting the same indicators on the same hospitals.
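
To see how cut points alone can change reported ratings, consider this small hypothetical sketch (the scores, cut points, and labels are invented):

```python
# Hypothetical composite scores (0-100) for three hospitals.
scores = {"Hospital A": 91.0, "Hospital B": 87.5, "Hospital C": 79.0}

def rate(score, cut_points, labels):
    """Return the first label whose cut point the score meets or exceeds."""
    for cut, label in zip(cut_points, labels):
        if score >= cut:
            return label
    return labels[-1]

labels = ["superior", "above average", "average", "below average"]
site_one_cuts = [90, 80, 70]   # cut points used by one reporting organization
site_two_cuts = [85, 75, 65]   # different cut points used by another

for name, score in scores.items():
    print(name, rate(score, site_one_cuts, labels), "vs.",
          rate(score, site_two_cuts, labels))
# Hospital B is rated "above average" under the first set of cut points but
# "superior" under the second, even though its underlying score is identical.
```

The same tension applies to benchmarks: keeping the source's cut points preserves comparability with that source, while choosing local cut points preserves internal consistency within the collaborative's own report.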

Proprietary Data (Providers/Business Coalitions/Health Plans/Consultants)

 

Proprietary data from health plans, providers, and actuarial and health care decision support firms may be available to community quality collaboratives. Inviting these types of organizations to work with the collaborative may provide access to unique data and data management expertise. Successful partnerships with such organizations have been established by Chartered Value Exchanges (CVEs) in Wisconsin, Oregon, and California. In addition, some CVEs collaborate closely with local employer coalitions to assemble rich data sets that capture a significant portion of the commercially insured population. For example, Wisconsin's The Alliance has produced performance data in this manner for years, with resulting improvements in targeted domains of health care quality.

Comment: If a community quality collaborative has not already done so, we recommend inviting these organizations as partners in performance measurement.

Physician Data

As Table 2 illustrates, national efforts to collect quality data at the physician group or individual physician level are still experimental, due in part to technical challenges such as small sample sizes and a lack of standardized information technology systems. National and regional initiatives are underway to enhance the usefulness of existing data for physician performance measurement. For example, the Physician Quality Reporting Initiative (PQRI) (www.cms.hhs.gov/PQRI) offers incentive payments to physicians who report at least some quality measures (from a list of approximately 179 in fiscal year 2010) applicable to Medicare fee-for-service beneficiaries. However, only 16% of eligible professionals participated in 2007, and only 52% of those who participated met the program and reporting requirements. Data results are not yet publicly available, and provider participation will need to improve before CVEs will find the data useful. In particular, the very limited number of measures on which any individual physician must report is a major limitation for comparative reporting.

Table 2. Organizations providing physician-level data

Organization | URL | Description

National Data Sources

Generating Medicare Physician Quality Performance Measurement Results

www.cms.hhs.gov/GEM

As part of a project called "Generating Medicare Physician Quality Performance Measurement Results" (GEM), the Centers for Medicare & Medicaid Services (CMS) contracted with Masspro to calculate medical group practice performance results based on 2006-2007 Part B claims for fee-for-service Medicare beneficiaries. Information is presented at the population level, by State and ZIP code, and at the national level. These data are publicly available and may be downloaded from http://www.cms.hhs.gov/GEM/. Results were calculated for the following measures, using the National Committee for Quality Assurance's (NCQA) Healthcare Effectiveness Data and Information Set (HEDIS) definitions: Breast Cancer Screening, LDL Testing for Diabetics, Retinal Eye Exam for Diabetics, HbA1c Testing for Diabetics, Cardiovascular LDL Testing, Colorectal Cancer Screening, Nephropathy Testing for Diabetics, Persistence of Beta-Blocker Therapy Post-MI, Annual Monitoring for Patients on Persistent Medications, Antidepressant Medication Management (Acute Phase), Beta-Blocker Treatment After Heart Attack, and Disease-Modifying Anti-Rheumatic Drug Therapy.

Comment: The data formats were designed to allow performance results at the group practice level (i.e., Taxpayer Identification Number) to be aggregated with similar data from commercial sources, but participating CVEs encountered difficulties with provider attribution. The future of this effort is unclear, because the CMS unit of reporting, Taxpayer Identification Number, does not correspond to individual physicians, physician practice sites, physician organizations, or any other unit recognizable to consumers.

Consumers' Checkbook

www.checkbook.org/doctors/pageone.cfm

Given the relative dearth of publicly available data on physician performance from official sources, other organizations are attempting to meet the market need. Consumers' Checkbook surveys roughly 260,000 physicians, asking which specialists they would want to care for a loved one. The survey responses are used to construct a "Top Doctors" database containing the names of more than 20,000 doctors who were mentioned most often, across 30 specialties and 50 metropolitan areas. Consumers' Checkbook is also piloting an abbreviated version of the Clinician/Group (C/G) CAHPS® tool in three sites, although the reliability and validity of this abbreviated tool are unknown.

Comment: The reliability and validity of peer assessments of physician performance are not well established, given the poor response rate for most physician surveys.

HealthGrades®

http://www.healthgrades.com

HealthGrades offers an interactive survey tool (based loosely on C/G CAHPS®) for users to describe their experiences with individual physicians and whether they would recommend the physician to family or friends. These data are offered along with information about physicians' board certification (also available from the American Board of Medical Specialties at http://www.abms.org/WC/Login.aspx), group practice and hospital affiliations, insurance plans accepted, and licensure and disciplinary actions by State medical boards (also available from the Federation of State Medical Boards or at http://www.docboard.org/docfinder.html).

National Committee for Quality Assurance Physician Recognition Program

www.ncqa.org/

The National Committee for Quality Assurance (NCQA) Physician Recognition Program publicly recognizes individual physicians who meet clinical requirements for appropriate care in back pain, heart/stroke, and diabetes, and who establish a primary care medical home. In addition, NCQA offers a Physician and Hospital Quality certification program to health plans that evaluate the cost and quality of physicians and hospitals. The list of certified health plans that evaluate physician care can be found at www.ncqa.org/tabid/954/Default.aspx (login required).

Comment: NCQA's reporting tool, Quality Compass®, does not report HEDIS measures for physicians. However, some local collaboratives have adapted the measures for physician performance measurement, most notably the Wisconsin and California CVEs.

Vitals.com

http://www.vitals.com

Vitals.com aims to present a 360° view of physicians, including factual information about their background, consumer reviews, peer reviews and awards, and office information. Vitals.com also offers an interactive survey tool with questions about the appointment process, waiting time, staff professionalism, accuracy, bedside manner, adequacy of time, and followup. Consumers are encouraged to write free-text reviews. Similar to HealthGrades, this site provides information about board certification, hospital affiliations, and insurance plans accepted, but it adds a publication list for physicians who have authored peer-reviewed papers.

Comment: HealthGrades' and Vitals.com's patient survey tools offer different response options than C/G CAHPS, even when the questions are similar, limiting one's ability to compare results across tools. Other proprietary sites, including Angie's List and Zagat, have more recently entered this market. For example, the managed care company WellPoint recently contracted with Zagat to encourage members of its "consumer-driven" Blue Cross plans in Los Angeles, Cincinnati, Dayton, and Connecticut to rate physicians on four distinct attributes: Trust, Communication, Availability, and Environment. Both Zagat and Angie's List encourage free-text ratings, although the latter site is available only to paid subscribers.

State Data Sources

Community Quality Collaboratives (e.g., Chartered Value Exchanges) and Local Health Care Coalitions

 

Across the country, community quality collaboratives and local health care coalitions are laying the foundation for collecting and reporting physician performance data. These organizations build their own networks of local health plans, employers, and physician organizations to coordinate data collection and analysis. For example, Washington's Puget Sound Health Alliance reports clinic and medical group performance using 21 indicators. California's Integrated Healthcare Association created a pay-for-performance system that integrates 12-13 HEDIS measures based on claims data, 4-5 measures of information technology-enabled "systemness," 6 new resource use measures, 9 patient experience measures, and 8-9 measures of coordinated diabetes care. Similarly, the Wisconsin Collaborative for Healthcare Quality reports a mixture of HEDIS measures and its own measures. Some health plans use internal claims data to rate contracted physicians or physician organizations by applying HEDIS definitions. However, the availability and content of these ratings vary across plans and across States.

Comment: Community quality collaboratives are encouraged to partner with health plans and employers to build multipayer claims databases that can be used to evaluate process-of-care measures at the physician and physician organization levels. Such databases should be constructed to include as many payers as possible to improve the reliability of the resulting estimates and to reduce the possibility of conflicting data on the same physicians from different sources.
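
As a rough illustration of what a physician-level process measure computed from pooled claims might look like, consider the sketch below. It is a simplified, hypothetical example: the field names, the pre-attributed patient records, and the measure logic are invented and do not represent an actual HEDIS specification.

```python
from collections import defaultdict

# Hypothetical patient-level records pooled from several payers, one row per
# patient after de-duplication and attribution to a physician (NPI).
patients = [
    # (attributed_physician_npi, has_diabetes, had_hba1c_test)
    ("1111111111", True,  True),
    ("1111111111", True,  False),
    ("2222222222", True,  True),
    ("2222222222", False, False),
]

rates = defaultdict(lambda: [0, 0])        # npi -> [numerator, denominator]
for npi, has_diabetes, tested in patients:
    if has_diabetes:                        # patient is in the measure denominator
        rates[npi][1] += 1
        if tested:                          # recommended care was delivered
            rates[npi][0] += 1

for npi, (num, den) in sorted(rates.items()):
    print(f"NPI {npi}: {num}/{den} diabetic patients with an HbA1c test")
```

Each additional payer that contributes claims enlarges the denominator behind each physician's rate, which is precisely what improves the reliability of the resulting estimates.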

State or Local Registries

 

A handful of States maintain disease or procedure registries, most commonly focused on coronary artery bypass graft (CABG) surgery and related "open heart" procedures, to enable public reporting of surgeon-specific risk-adjusted outcomes. These States include New York, California, and Pennsylvania. Other States could establish similar registries, building on data management infrastructures established by medical specialty organizations, such as the Society of Thoracic Surgeons and the American College of Cardiology.

Comment: Registries for CABG surgery have been in place for more than a decade, and the risk-adjustment models using these data have been repeatedly refined and validated. However, these registries are generally limited to narrowly defined subsets of patients, which may limit their utility.

State medical boards

 

State medical boards generally maintain their own databases with licensure and disciplinary actions, including basic information submitted as part of the licensure process (e.g., medical school and year of graduation, residency training, and board certification). The format and structure of these data vary by State. Commercial Web sites, such as Vitals.com and HealthGrades.com, frequently list this information as part of the quality information they present. In addition, this information is used to populate the American Medical Association's DoctorFinder site (http://webapps.ama-assn.org/doctorfinder) and the Administrators in Medicine (Association of State Medical Board Executive Directors) DocFinder site (http://www.docboard.org/docfinder.html).

Comment: These sources provide very basic information about physicians' training and experience but do not directly address the quality of care that physicians provide.
