Appendix B: Summaries of PFQ Grantee Activities (continued)
PFQ Grant Summary: Closing the Gap: Partnering for Change
Lead Organization: American College of Physicians (ACP)
Partner Team: Northwestern University, Abington Memorial Hospital
Title: Closing the Gap: Partnering for Change
Topic Area: Process Continuing Medical Education to Improve Quality of Care
Principal Investigators: Vincenza Snow, MD
AHRQ Project Officer: Charlotte Mullican
Total Cumulative Award: $848,736
Funding Period: 9/2002–9/2005 (project funds not released until February 2003)
Project Status: Completed 9/29/2006
1. Project Description
Goals. The aim of this project was to (1) develop and test a team-oriented, practice-based
continuing medical education (CME) strategy that trains teams of doctors,
nurses, and office administrators in how to improve quality of care and
outcomes for patients with chronic diseases, and (2) design a business case
that would help spread the adoption of team-oriented, practice-based CME by the
ACP and other professional societies. The project team hoped to show that the
new team-oriented, practice-based approach to learning would be a better way to
promote physician adoption of clinical practice guidelines and improve quality
of care for patients. The team also intended to establish this type of CME as
a viable alternative to traditional CME, which is physician centered and based
on passive learning. For this trial, the prototype CME learning strategy
focused on educating physician practices on type 2 diabetes care.
Activities and Progress. In the first funding year (September 2002-September 2003),
despite problems gaining IRB approval that delayed grant work by about six
months, the project team established partnerships with key national
stakeholders to create a project advisory board (see table for members). This
group helped to design the education program and develop a training manual on
learning collaboratives and a team-oriented toolkit for diabetes. Together, the
training materials are called "Closing the Gap Diabetes Modules."
The project also recruited four ACP practices in Pennsylvania and Illinois to
participate in the pilot test of the practice-based learning model for
diabetes. The pilot test began in October of the second funding year (October
2003-September 2004). Each practice that participated in the pilot project
chose a team composed of one doctor, one nurse, and one administrator to attend
three training sessions over a six- to nine-month period. One session was held
on each of the following: performance improvement, the Plan-Do-Study-Act (PDSA)
cycle, and the fundamentals of the Chronic Care Model. During this time
period, the teams returned to their practices to train other staff and
implement the team-oriented diabetes toolkit, which included clinical,
administrative, and patient tools intended to redesign practice workflow. In
between the three training sessions, the primary program trainer, Dr. Kevin
Weiss of Northwestern University, held two conference calls lasting two hours
with the practices to keep them on track and guide them through operational
changes. Information from the pilot practices' learning experiences, responses
to barriers, and perceptions on how the team functioned differently informed
revisions that the research team made to the trial intervention.
Following the pilot study, during the third funding year (October 2004-September 2005),
the research team began to implement the pseudo-randomized trial intervention.
The team successfully identified and recruited 25 practices in Philadelphia
(randomized into 13 intervention practices and 12 control practices) and 6
practices in Chicago (randomized into 3 intervention and 3 control practices)
to participate in the study. Rather than conduct a true randomized trial, in
which control practices would not receive training, the study design was
changed to allow the control practices to receive the intervention as soon as
the experimental practices completed the training program. This change was
prompted by the insistence of one hospital system that had volunteered 25
internal medicine and family practices to the study and wanted all of them to
benefit from it.
The first training session occurred in October 2004, and the intervention proceeded
as it had in the pilot, except that the three full-day training sessions were
reduced to one full-day and two half-day training sessions, and the training
materials were revised to include only the most relevant and useful ones. The
research team designed an evaluation to measure three sets of outcomes: (1)
patient outcomes and practice patterns, (2) patient satisfaction, and (3)
practice teams' perceptions of the program. To collect patient outcomes and
practice pattern data, each practice (both experimental and control) enrolled
15 patients with diabetes and extracted data on HbA1C levels, blood pressure
levels, blood glucose, and lipid control from patient charts three times during
the study. The practices sent this data to the Data Coordinating Center at
Northwestern, where it was cleaned, analyzed, and used to create reports for
each practice on its patients' status at baseline, during the intervention, and
afterward.
Data collection in the trial study was delayed by the slow pace of recruiting
patients and extracting data from charts. By June 2006, however, the research
team received all three rounds of data from the Philadelphia practices and
about 80 percent from the Chicago practices. To collect patient satisfaction
data, practices helped recruit patients with diabetes to participate in a
telephone survey, which staff at Northwestern had planned to conduct three
times during the study: before, during, and after intervention. However,
because of problems enrolling patients in some practices, and errors in sending
the correct consent forms to control groups in Philadelphia, the interviews
were delayed. As of June 2006, the patient surveys were complete and
researchers were analyzing the results.
2. Partnership Structure/Function
There were three levels of partnership in this project. The first involved ACP
and Northwestern University, whose staff formed the core research team,
including a project
principal investigator from ACP and a co-investigator from Northwestern. This
team spoke regularly and together designed the pilot test, the trial
intervention, and the teaching materials. They also provided the training and
support to practices, and collected and analyzed the data. The second
partnership involved the ACP-Northwestern research team and the physician
practices that participated in the pilot study and the trial intervention
training programs. Practices had regular contact with Dr. Weiss at
Northwestern, who provided them with ongoing technical support.
The third level of
partnership involved the ACP-Northwestern research team and members of an
advisory group, who provided input to the project's design and teaching tools
(Institute for Healthcare Improvement [IHI], Institute of Chronic Illness Care
[ICIC]), offered avenues to disseminate outputs from the project, and
facilitated participation of practice-based health providers (American Medical
Association [AMA], AHIP, American Nurses Association [ANA]). In the first year,
the project had one in-person advisory board meeting at which members could
cement relationships and reach agreement on a conceptual model of the
team-oriented, practice-based diabetes prototype. The project also created
working groups—one on the business case and another on implementation and
barriers—composed of advisory board members and other key partners. The
project held mini-strategic planning teleconference calls with the working
groups to develop different modules of the training program.
Table 1. Major Partner Organizations and Roles in the Project

Lead Organization (grant recipient)
- American College of Physicians (PI: Vincenza Snow, MD): Provided overall
leadership and direction to the program; guided the design of the CME
intervention and training materials; developed and implemented training
programs and developed the evaluation plan on project impact; assessed
opportunities for expansion and sustainability of project outcomes.

Key Collaborators
- Northwestern University (Co-PI: Kevin Weiss, MD): Guided design of the CME
intervention and training materials; provided training and technical
assistance to participating practices; collected and analyzed data for the
pilot test and randomized controlled trial.
- Institute for Healthcare Improvement (IHI) and Institute of Chronic Illness
Care (ICIC): Participated in advisory board; assisted in developing training
materials for practices (training manual on learning collaboratives: IHI;
toolkit for diabetes: IHI and ICIC).
- American Medical Association (AMA) and American Association of Health
Plans: Participated in advisory board; assisted in identifying opportunities
for dissemination of project outcomes and sustainability of project
activities.
- American Diabetes Association and American Nurses Association (ANA):
Participated in advisory board; assisted in gaining participation from nurses
by providing CE credit (ANA).

Target Organizations
- Four practices from Pennsylvania and Illinois for the pilot test (one was a
Lehigh Valley practice in PA identified through another PFQ project), and 31
practices (experimental and control; 25 in Philadelphia and 6 in Chicago) for
the trial intervention: Participated in the team-oriented, practice-based
diabetes CME prototype; attended training sessions; participated in
conference calls; implemented changes to practice workflow based on training;
performed data extractions and sent data to the Data Coordinating Center;
recruited patients for the patient satisfaction survey.
3. Project Evaluation and Outcomes/Results
The project successfully created a set of diabetes
training modules for practice-based teams; pilot tested the modules with 4
practices; recruited 35 practices (4 for the pilot and 31 for the trial),
along with patients from those practices, to participate in the randomized
controlled trial intervention; and gathered clinical trial data. While the research team did not
have information at the time this summary was written (September 2006) on the
impact of the program on patient clinical outcomes or patient satisfaction, it
did complete the qualitative evaluation of the practice teams' program
experience and level of collaboration. The results showed that practices were
willing to attend training in, learn from, and participate in the project's
team-based learning model, in spite of the cost involved in sending three
employees to training sessions. The trial intervention had about an 85 percent
participation rate from the experimental group, with 15 percent (about one to
two practices) showing inconsistent participation.
The research team evaluated the practice teams'
experience and the level of team collaboration with a pre- and postintervention
survey of the practices. Despite the program's intensity, participants rated it
highly, even while complaining about the demands it placed on them. Over
the three training sessions, 94 percent of participants rated the program as
"very good" or "excellent." But "very good" to "excellent" ratings dropped from
96.7 percent of participants for sessions 1 and 2 to 88.2 percent for session
3, possibly reflecting fatigue. When asked about their most "eye-opening
experience," participants ranked "working as a team" highest, followed by
"interacting with the other teams," "learning improvement
strategies," and "reviewing their charts." The first two relate to the
in-person meetings, but the team interactions were also part of the conference
calls and could be accomplished via an on-line community. Program participants
rated the binder contents as most useful to learning, and within these, the
care models, patient tools, and chart tools were of greatest value. They also
rated the conference calls highly as a learning experience. Participants rated
measuring their practice, progress reports on the conference calls, and patient
satisfaction data as having the greatest impact on their ability to improve
practice, followed by the binder materials and the learning sessions.
The project was found to be helpful to nurses and
office managers. These practice staff indicated that the learning model helped
to integrate them into the care process by opening up dialogue between
physicians and staff. One physician practice noted that staff members felt a
renewed sense of purpose because the project gave them tools for comanaging
patients. Office managers often played a key role in the project at the
practice level by keeping track of patients in the project.
The project established "face validity" for the
learning model with physicians. Feedback and testimony from physicians were
positive; some practices indicated that the program changed the way they
practice by showing them the benefits of incorporating program tools, such as
new forms and databases, into everyday workflow. For example, one practice
introduced a scorecard that the nurse fills out with information on patient
health status, diabetes care status, and instructions for self-care. The
practice gives a copy of the scorecard to the patient and keeps a copy from
which to enter patient data into its computer registry to track performance
over time. Other practices made changes in office procedures, such as having
nurses help patients take off their shoes as a reminder to physicians to check
their feet, or instituted new patient education initiatives.
ACP and other organizations like the AMA and the
American Board of Internal Medicine (ABIM) had positive reactions to the new
CME model. Partly due to the success of the ACP project, which was part of an
AMA pilot to test practice-based CME, the AMA decided to award 20 category 1
CME credits to physicians participating in practice-based programs like ACP's
Closing the Gap, and ACP is now accredited to provide practice-based CME. In
addition, ABIM now accepts participation in ACP's Closing the Gap as fulfilling
part 4 of its requirements for Maintenance of Certification. The program was featured
at an ABIM Quality Summit as a "premier project for the ACP in helping members
achieve higher levels of quality care and become eligible for pay for
performance projects" (ACP, Mid-Year Progress Report to AHRQ, June 2006). The
ABIM considers Closing the Gap as the "gold standard" against which all other
practice-based CME programs are measured. ANA also approved CE credit for
nurses involved in the program. Finally, many ACP state and local chapters,
which were initially hesitant to participate in the study, are now eager to
do so.
4. Major Products
- Closing the Gap
Diabetes Modules, including a Manual on Learning Collaboratives for the
practice teams, and a toolkit for diabetes care.
- Summary report on
the pilot test experiences and barriers.
- Presentation of
the project's experiences at the ACP's annual session in 2005.
- News articles in
ACP newsletters and electronic newsletters, distributed to 70,000 ACP members
(www.acponline.org/journals/news/may06/quality.htm).
- Patient data
registries, scorecards, and other tools that practices created to track
diabetic patients.
5. Potential for Sustainability/Expansion after PFQ Grant Ends
ACP's Closing the Gap project led to larger projects
that are further testing the team-oriented, practice-based learning model
through follow-up pilots. The project has received funding from two
pharmaceutical companies to conduct two more rounds of Closing the Gap training
programs, one in diabetes (funded by Novo Nordisk for $9 million) and one in
cardiovascular disease, with 20 practices in each group. Several physicians who
received training in the initial study have become faculty for the new Closing
the Gap programs and will teach the training sessions for new practices.
The research team is working to develop a sustainable
business case and financing for the program. The two biggest costs to practices
are those related first to measurement and workflow changes, and second to the
time staff spends being trained. For ACP to expand this program, it also needs
to find external funding. One option involves ACP's charging fees for the
program, supplemented by contributions from local and state partners of ACP
chapters. ACP is also considering ways to build the program into its internal
budget and create its own data coordinating center, but this would also require
external funding. Finally, researchers are considering the development of a
web-based version of the program that would be less costly and time-consuming
for physicians—a "Closing the Gap 101" to teach the PDSA cycle—as a way to
disseminate it more broadly. The more intensive training in this program would
be the next step, a "Closing the Gap 102" that would concentrate on the
practice improvement and measurement components.
PFQ Grant Summary: Improving Care for the Dying: Transforming Patients' Wishes into the
Reality of High-Quality Palliative Care
Lead Organization: American Hospital Association (AHA), Health Research and Educational
Trust
Partner Team: Three Pennsylvania-based hospitals/hospital systems and four
hospitals/hospital systems based outside Pennsylvania (national)
Title: Improving Care for the Dying: Transforming Patients' Wishes into the
Reality of High-Quality Palliative Care
Topic Area: Palliative Care
Principal Investigators: John Richard Combes, President and Chief Operating Officer, Center for
Healthcare Governance, AHA
AHRQ Project Officer: Ronda Hughes
Total Cumulative Award: $1,282,703
Funding Period: 9/2002–9/2006
Project Status: Completed 9/29/2006
1. Project Description
Goals. This project sought to promote the establishment of hospital-based
palliative care by creating centers of learning for other hospitals, and to
accelerate the translation of research findings into improved quality and
delivery of end-of-life care. In phase I, the project planned to establish
three palliative care learning centers at Pennsylvania-based hospitals to host
site visits by other hospitals interested in planning and developing similar
palliative care units. In phase II, the project planned to expand the number
of learning centers to hospital-based palliative care centers in other parts of
the country, selected from among recipients of the AHA's Circle of Life Award.
Activities and Progress. The first six months were devoted to planning and developing
the core curriculum of the site visits with the initial three learning centers
in Pennsylvania: Geisinger Health System, Danville; Center for Palliative Care
in Thomas Jefferson University's Department of Family Medicine and the
Jefferson Health System, Philadelphia; and the University of Pittsburgh Medical
Center.
Phase I began during the second half of year 1 and expanded into year 2. The project
aimed for each of the three facilities to accommodate five site visits the
first year and eight site visits per year for years 2 through 4, for a total of
29 site visits at each. During year 2, phase II began with the establishment
of four national learning centers (Connecticut Hospice in Branford, CT; Detroit Receiving Hospital in Detroit, MI; Palo Alto VA in Palo Alto, CA; and St. John's Regional Health Center in Springfield, MO). The four were chosen among AHA Circle of Life Award winners and finalists, and represented different types of settings
for palliative care (i.e., VA hospital, safety net hospital, Catholic hospital,
and hospice).
The lead organization (initially Hospital and Health System Association of
Pennsylvania) recruited hospitals or hospital systems to participate in site
visits and matched up visitors with the learning centers. The learning centers
contacted the hospitals to schedule the site visit and to conduct a preliminary
needs assessment, in which staff members were interviewed to assess their
unique clinical and community situation, areas of interest, and palliative care
goals. During the visit, discussion was guided by the data gathered during
these pre-site interviews. The learning centers tailored the site visit
curriculum and schedule to the visitors' identified needs. After the site
visits, the lead organization followed up with the visiting hospitals to assess
the effectiveness of the site visit and provide ongoing support and technical
assistance.
As of early October 2006, approximately 60-70 site visits had been conducted.
Site visits lasted a full day and were hosted by a team of professionals,
including physicians, a palliative care project coordinator, nurse clinicians,
hospital administrators, clergy, social service professionals, and volunteer
coordinators. Members of the host organization team provided tours of the
facility, supplemented by formal and interactive presentations. Each site
visit included a presentation on how research collected during the
developmental stages, covering both challenges and successes, was translated
into improved palliative care services and procedures. The host team encouraged
visitors to share their research findings and solicit approaches to translating
them into successful practices. Discussions focused on how to ensure that
systemic change, including policy change, occurred, and on how to create a
supportive environment so that established palliative care services could be
sustained. Host organizations shared data used for benchmarking, internal and
external marketing strategies, reimbursement and funding challenges, outcome
measurements, evaluation process, and views of how systemic change holistically
influenced the delivery of health care within their organization.
2. Partnership Structure/Function
During the initial planning phase, the three Pennsylvania-based hospitals/hospital
systems spoke with the principal investigator (PI) by phone every other week and
in person once per quarter to build the site visit curriculum. The PI, project
director, and seven learning centers (called "learning labs") did planning via
conference calls held approximately every six weeks. These conversations
provided the team with the opportunity to evaluate the effectiveness of the
program process, brainstorm on continued marketing and training strategies, and
continue group discussion and work on collaborative projects such as survey
development and refinement of curriculum and site visits. In addition, member
listservs, the Hospital-Based Palliative Care Consortium Web site, and
conference calls facilitated communication among the lead organization,
participating hospitals, and learning labs.
Table 1. Major Partner Organizations and Roles in the Project

Lead Organization (grant recipient)
- Health Research and Educational Trust, AHA: To provide overall project
leadership; to identify and recruit learning labs; to develop the core
curriculum for the site visits and companion toolkit; to recruit
participating hospitals (through Web sites, electronic newsletters, learning
lab institution publications, and various meetings and conferences); to
develop assessment tools to evaluate the usefulness of the learning labs for
the visiting/participating hospitals.

Key Collaborators
- Palliative care programs in 3 PA-based hospitals and hospital systems
(Phases I and II) [Note: By the end of the grant, one of the PA-based
learning labs had dropped out of the program.]: To assist in developing the
core curriculum for the site visits and companion toolkit (Phase I
hospitals/hospital systems only); to conduct and assess pre-site-visit
surveys filled out by the visiting hospitals/hospital systems; to coordinate
and host site visits.
- Palliative care programs based at 4 hospitals and hospital systems,
national (Phase II): To respond to follow-up questions/inquiries from
visiting hospitals.

Target Organizations
- Hospitals and hospital systems throughout the U.S.: To complete the
pre-site-visit assessment; to visit learning labs and adapt evidence-based
models of change to incorporate palliative care services into
hospitals/hospital systems.
3. Project Evaluation and Outcomes/Results
The program planned to evaluate its success according
to the number of new hospital-based palliative care programs created in
targeted hospitals,[2] and the number of enhancements made to current
programs as a result of the training program. About 60-70 site visits had been
completed at the time this summary was written (October 2006). Initially, the
evaluation intended to measure outcomes such as reduced length of stay, patient
and family satisfaction, and the financial effects of instituting hospital-based
palliative care services. However, the learning labs were concerned about
measuring patient satisfaction. Specifically, they felt that while those
patients and families who participated in the palliative care program would
report positive effects, patients and families who did not receive palliative
care services might skew the results. As a result, the three Pennsylvania
pilot hospitals serving as learning labs provided only baseline clinical and
financial data prior to the initiation of phase I. During phase II, AHA-HRET
staff surveyed state and national learning labs to evaluate the impact of the
palliative care programs on these outcomes. These data will be compared to the
baseline data collected from the three Pennsylvania-based learning labs prior
to phase I.
In addition, AHA-HRET staff conducted follow-up with
visitors approximately six months to one year after the site visit to explore
whether expectations were met, what was learned from the visit, what new
services were developed as a result, and how those services were functioning.
Project staff planned to analyze this information at the end of summer 2006
(as of October 2006, it was not clear whether this analysis was completed as
scheduled).
The project has also produced less tangible but nonetheless
important lessons. For example, many hospitals have been reluctant to adopt
the program because revenues are reduced if people spend less time in the
hospital, even though use and cost of inappropriate services are also
decreased. One of the learning labs taught visitors how to capture allowable
charges. The project also found that each set of stakeholders (hospital CEOs,
CFOs, physicians, and nursing staff) has different concerns that must be
addressed to gain their support for a palliative care program.
4. Major Products
- "Back to School: A Unique Education Program
Provides Hands-On Experience with Palliative Care." Hospitals and Health
Networks, November 2004.
- Implementation of
Hospital-Based End-of-Life and Palliative Care. Poster presented at AHRQ's 2004
TRIP Conference, July 12-14, 2004.
- Recruiting-oriented
presentations: American Academy of Medical Administrators, Boston, MA, November 2002; Partners for Quality, Rockville, MD, March 2003; Medical Advisory Board
Lehigh Valley Hospice and Home Health, Allentown, PA, April 2003.
5. Potential for Sustainability/Expansion after PFQ Grant Ends
While there is no funding in place for sustaining this project, some learning
labs may continue to host scaled-down versions of the site visits if
approached by hospitals/hospital systems. Further support could also arise
from AHA policy leaders' concerns about the disproportionate amount spent on
end-of-life care: AHA leadership has discussed support for palliative care as
a way to reduce that spending but has not taken any steps in this direction
other than the Circle of Life Awards.
2. The Center to Advance Palliative Care (CAPC) at Mt. Sinai Hospital in New
York, funded by the Robert Wood Johnson Foundation, used a similar
approach to promote hospital-based palliative care programs. It targeted larger
hospital systems and university-based hospitals, however, whereas this AHA-HRET
program targeted smaller community hospitals, VA hospitals, and safety net
hospitals. Also, CAPC charged hospitals to participate in its learning
programs, while AHA-HRET did not.