PERFORMANCE EVALUATION OF ACCREDITATION BODIES
UNDER THE
MAMMOGRAPHY QUALITY STANDARDS ACT OF 1992
as amended by the
MAMMOGRAPHY QUALITY STANDARDS REAUTHORIZATION ACT OF 1998


January 1, 2002 through December 31, 2002

A Report to Congress

Purpose

The Mammography Quality Standards Act (MQSA) of 1992 (P.L. 102-539), as amended by the Mammography Quality Standards Reauthorization Act (MQSRA) of 1998 (P.L. 105-248), establishes standards for high quality mammography and requires all facilities to be accredited by a Food and Drug Administration (FDA) approved accreditation body (AB) in order to demonstrate that they meet these standards. FDA may approve either private nonprofit organizations or state agencies to serve as ABs. The MQSA also requires FDA to submit an annual performance evaluation of the approved ABs to the Senate Committee on Health, Education, Labor, and Pensions and the House Committee on Energy and Commerce under 42 USC 263b(e)(6).

Currently, there are five ABs: the American College of Radiology (ACR), a private nonprofit organization, and the state ABs of Arkansas (SAR), California (SCA), Iowa (SIA), and Texas (STX). This report covers the performance of ABs under the MQSA from January 1, 2002 through December 31, 2002.

Status of Accreditation Body Approvals

FDA approved the ACR, the SAR, the SIA, and the STX as ABs under the MQSRA of 1998 and the final regulations. The SCA was approved under the interim regulations and has applied for approval under the final regulations. However, its application remains pending until the State’s final mammography standards are signed and in effect. FDA approved SCA’s draft standards, which are currently moving through the State’s emergency legislative process.

Standards

MQSA requires that each AB develop (or adopt by reference) standards that are substantially the same as the quality standards established by FDA under subsection (f) of the Act to assure the safety and accuracy of mammography. Regarding state laws, nothing in the Act limits the authority of any state to enact and enforce laws about matters covered by the Act that are at least as stringent as the Act or the standards promulgated under the Act.

American College of Radiology AB, State of Arkansas AB, State of Iowa AB, State of Texas AB, and State of California AB

The ACR, the SAR, the SIA, and the STX have either adopted the final MQSA standards by reference or have developed standards that are substantially the same as the quality standards established by FDA. Each AB incorporated the standards into its own accreditation processes.

FDA gave preliminary acceptance to the SCA’s draft standards, which are currently moving through the State’s emergency legislative process. Once the SCA standards (which are at least as stringent as those of MQSA) are published, FDA expects to grant approval to the SCA’s renewal application.

Methodology

FDA evaluates its ABs through: (1) examination of responses to questionnaires developed by the FDA addressing performance indicators, (2) analysis of quantitative accreditation and inspection information, (3) review of selected files (including clinical and phantom images), (4) interviews with AB staff and management to answer questions or clarify issues, and (5) onsite visits. FDA uses the following eight performance indicators (as outlined in the final MQSA regulations) to assess performance: administrative resources, data management, reporting and record keeping processes, accreditation review and decision-making processes, AB onsite visits to facilities, random clinical image reviews, additional mammography reviews, and accreditation revocations and suspensions.

FDA staff analyzes unit accreditation pass and fail data along with data that describes the reasons for each AB failure decision. Significant differences in pass and fail rates or reasons for accreditation denial among ABs could, for example, indicate that one AB is interpreting the significance of a particular quality standard more or less strictly than another.

To complement the information submitted by ABs, FDA analyzes information from its Mammography Program Reporting and Information System (MPRIS) database of annual facility inspections. Accredited facility performance during inspections is measured by average phantom image scores, average radiation dose values, and average processor speeds. Collectively, these measures reflect the overall functioning of all components of the mammography system.

Performance Indicators

(1) Administrative Resources and Funding

AB staffs generally include management, mammography radiologic technologists, MQSA inspectors, health physicists, information technology program application specialists, and administrative assistants. All ABs continue to maintain adequate funding for their respective programs.

(2) Data Management (Process/Errors)

All ABs provide the FDA with electronic transmissions of accreditation data in a secure and appropriately maintained manner. The majority of the ABs reduced their percentage of data management errors from those noted in the previous year. Nevertheless, FDA continues to work individually with ABs to (a) further minimize the number of data errors, (b) emphasize the importance of routinely performing quality assurance and quality control practices to correct errors before transmitting the data, and (c) provide reports that outline errors and the frequency with which they occur.

(3) Reporting and Recordkeeping

FDA’s review of the ABs’ reporting and recordkeeping practices includes examining procedures for handling serious consumer complaints and appeals of accreditation decisions.

(a) Serious Consumer Complaints

MQSA requires ABs to develop and administer a consumer complaint mechanism whereby all facilities that an AB accredits must file serious unresolved complaints with their AB. By regulation, each AB must submit to the agency an annual report summarizing all serious complaints received during the previous calendar year, their resolution status, and any actions taken in response to them.

All ABs have an appropriate serious consumer complaint mechanism in place. Each AB submitted its serious consumer complaint report to FDA for the year 2002, indicating that the ABs follow acceptable procedures when resolving complaints.

(b) Appeals

Each AB must have an appeals process through which facilities can contest an adverse accreditation decision. In CY 2002, only ACR received appeals of its accreditation decisions. ACR overturned more than half (62.5%) of its original decisions on appeal. FDA is concerned about such a high percentage of overturns because a facility cannot operate during the appeals process, and the ultimate decision in over half of the cases indicates that the facility could have been operating.

In CY 2001, ACR overturned almost half (41.5%) of its original decisions on appeal. As a result, FDA required ACR to analyze and report the reason(s) why almost half of its original decisions were overturned during the appeals process. ACR gave FDA the reasons it thought might be contributing to the high percentage. However, the overturn rate increased again in 2002. In light of this continuing increase (CY 2000 – 32%; CY 2001 – 41.5%; CY 2002 – 62.5%), FDA asked ACR to review its current clinical image review process and its appeals process to determine why it overturns its original accreditation decisions in such a high percentage of appeals, and to provide its conclusions to FDA along with a performance improvement plan. The deadline for ACR to respond to this action item was September 30, 2003, which ACR met.

(4) Accreditation Review and Decision-Making Processes

Review of the ABs’ accreditation and decision-making processes includes evaluating procedures for clinical image review, phantom image review, mammography equipment evaluation, and medical physicist annual survey review.

(a) Clinical Image Review

As part of the accreditation process, mammography facilities must submit clinical images to their ABs for review. To evaluate the ABs’ performance in the clinical image review area, FDA’s MQSA qualified interpreting physicians (IPs) annually review clinical images from a sample of facilities that submit cases to the ABs for clinical image review. Two FDA IPs independently conduct clinical image reviews for each facility in the sample for each of the ABs that perform clinical image review, evaluating each examination on the eight attributes listed in the final regulations using a five-point scale.

The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images. A summary of the FDA clinical image reviews follows.

American College of Radiology AB

FDA performed its evaluation of ACR’s clinical image review process on September 25, 2002. FDA found good agreement between the FDA IPs and the ACR clinical image reviewers at the attribute evaluation level, with generally no more than a one-point variation between reviewers. In reviewing the exams and summary evaluation forms, FDA reviewers agreed with the final overall assessments (pass and fail) in all the cases. FDA determined that this spot review of cases indicates that the quality of clinical image review by the ACR remains high and has not deviated from past performance.

State of Arkansas AB

FDA performed its evaluation of SAR’s clinical image review process on October 3 and 4, 2002. In reviewing the exams, FDA reviewers agreed with the final overall assessments (pass and fail) in all the cases. FDA’s IPs indicated that the quality of clinical image review performed by the SAR remains high and has not deviated from past performance. FDA recommended to SAR, however, that it make changes in its assessment form to clarify certain review criteria.

State of Iowa AB

On October 17, 2002 and October 22, 2002, FDA performed its evaluation of SIA’s clinical image review process. The FDA IPs found consistent agreement among the SIA reviewers and agreed with the SIA reviewers’ final overall assessments (pass/fail) in all the cases reviewed. The review indicated that the SIA continues to maintain high quality standards concerning clinical image review. FDA recommended to SIA, however, that it make a change in its assessment form to include a section where the reviewer can indicate when the clinical images are of such poor quality that additional mammography review for the facility should be considered.

Summary of Audits and Training of Clinical Image Reviewers by ABs

Clinical image review quality control activities that promote consistency among the various clinical image reviewers exist at the ACR (and STX and SCA via ACR contract), the SAR, and the SIA. Each of these ABs conducts training sessions at which clinical image reviewers evaluate clinical images and discuss findings, including the application of AB clinical image review evaluation criteria. To ensure uniformity and to identify potential problems, each of these ABs analyzes agreement and nonagreement rates for all individual clinical image reviewers to provide each reviewer with the necessary data to compare his or her results to the rest of the review group.

(b) Phantom Image Review

As part of the accreditation process, mammography facilities must submit phantom images to their ABs for review. To evaluate the ABs’ performance in the phantom image review area, FDA’s MQSA expert staff annually reviews phantom images from facilities that submit cases to the ABs for phantom image review. Two FDA staff, working independently, review 10 randomly selected phantom images from each of the ABs that perform phantom image review. FDA evaluates all test objects (fibers, specks, masses) on these images as part of the review. Scores for these test objects should agree with the AB reviewers’ scores within the acceptable limit of +/- 0.5.
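For illustration only, the short sketch below applies the +/- 0.5 agreement check described above to a single phantom image; the data layout and function name are assumptions for this example and do not represent FDA’s actual review software.

```python
# Illustrative sketch of the +/- 0.5 agreement check described above; the data
# layout and function name are hypothetical, not FDA's actual review software.

OBJECT_GROUPS = ("fibers", "specks", "masses")
TOLERANCE = 0.5  # acceptable difference between reviewers per test-object group


def flag_disagreements(fda_scores, ab_scores, tolerance=TOLERANCE):
    """Return the test-object groups whose scores differ by more than the tolerance."""
    return [group for group in OBJECT_GROUPS
            if abs(fda_scores[group] - ab_scores[group]) > tolerance]


# Example: an FDA reviewer and an AB reviewer scoring the same phantom image.
fda = {"fibers": 4.5, "specks": 3.5, "masses": 3.0}
ab = {"fibers": 4.0, "specks": 2.5, "masses": 3.0}
print(flag_disagreements(fda, ab))  # ['specks'] -> a 1.0 difference exceeds 0.5
```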

The STX has a contract with the ACR to conduct its phantom image reviews. The remaining four ABs have their own phantom image reviewers to evaluate their facilities’ phantom images. A summary of the phantom image reviews follows.

American College of Radiology AB

FDA reviewed ACR’s 10 phantom images on October 10, 2002. Most of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the ACR reviewers. For one phantom image, the fiber scores between the two ACR reviewers diverged by more than 0.5. In another phantom image, there was a 1.0 difference between the ACR reviewers and the FDA reviewers in the speck group score. FDA believes that this difference is acceptable since the ACR reviewers were more stringent in their scoring. FDA reviewers noted that both ACR reviewers failed to comment on a plus density artifact that was visible in one of the phantom images.

State of Arkansas AB

FDA reviewed SAR’s 10 AB phantom images in November and December of 2002. Most of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the SAR reviewers. One phantom image failed the fiber score and the FDA reviewers were uncertain as to whether the problem was with the image or the phantom itself. Therefore, FDA requested that the AB obtain two additional images from the facility. After reviewing the additional images, the FDA reviewers again could not conclude whether the problem was due to the imaging process or due to the phantom itself. Additionally, because the FDA reviewers disagreed with the SAR reviewers on their assessment of the phantom images, the FDA oversight team discussed the issue with the SAR during its January 25, 2003 onsite visit. Subsequently, the SAR AB followed up this discussion and had the two original SAR phantom image reviewers (along with the SAR tie-breaker) independently review the original phantom image for a second time. Both the tie-breaker reviewer and one of the original reviewers agreed with FDA’s assessment.

State of California AB

FDA completed its review of SCA’s 10 AB phantom images in October 2002. For one of the phantom images, the FDA reviewer’s score differed from the SCA MQSA inspector’s score in one of the object groups by more than 0.5. The FDA reviewer failed the image while the SCA reviewer passed it. FDA addressed this issue through its internal QA process via remedial training for SCA phantom image reviewers.

During the first quarter of 2003, the SCA implemented its revised phantom image review procedures. FDA believes that the benefits afforded from a second reviewer’s evaluation (and a tie-breaker as needed), as provided for in SCA’s revised procedures, will help ensure a more balanced phantom image review for the accreditation process.

State of Iowa AB

FDA reviewed SIA’s 10 AB phantom images in October 2002. All of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the SIA reviewers.

Summary of Audits and Training of Phantom Image Reviewers by ABs

An audit of phantom image reviewers ensures uniformity, identifies any potential problems, and provides all individual phantom image reviewers with the necessary data to compare their results to the rest of the review group. A summary of this activity and reviewer training for the ABs follows.

Audits

Audit results are used to enhance reviewer training by emphasizing any performance issues. The ACR (and STX via ACR contract), the SAR, and the SIA conducted audits of their phantom image reviewers to collect statistics on reviewer agreement and nonagreement rates. In 2002, SCA was unable to conduct an agreement and nonagreement audit because it had been using only a single reviewer during its phantom image review process as discussed under the section on phantom image review. However, SCA AB implemented a revised phantom image review process in the first quarter of 2003 that it will use to audit its phantom image reviewers.

Training

ACR, SAR, and SIA conducted training sessions for their phantom image reviewers in CY 2002. FDA approved SCA’s revised policy and procedure for its phantom image review in January 2003. The policy and procedure calls for periodic refresher training for the phantom image reviewers. SCA expects to begin offering training in CY 2003.

(c) Mammography Equipment Evaluation (MEE) and Medical Physicist Survey Report Reviews

The final regulations state that ABs shall require every facility applying for accreditation to submit an MEE with its initial accreditation application and, prior to accreditation, to submit a medical physicist survey on each mammography unit at the facility (21 CFR §900.4(e)(i)). All of the ABs have policies and procedures established for the review of both the MEE and the medical physicist survey report.

(5) AB Onsite Visits to Facilities

The final MQSA regulations (21 CFR §900.4(f)(1)(i)) require that each AB annually conduct onsite visits to at least five percent of the facilities the body accredits to monitor and assess facility compliance with the standards established by the body for accreditation. However, a minimum of five facilities shall be visited, and visits to no more than 50 facilities are required. During such visits, the AB is required to evaluate eight core elements: (a) assessment of quality assurance activities; (b) review of mammography reporting procedures; (c) clinical image review; (d) review of medical audit system; (e) verification of personnel duties; (f) equipment verification; (g) verification of consumer complaint mechanism; and (h) other identified concerns.

At least 50 percent of the facilities visited shall be selected randomly and the other facilities visited shall be selected based on problems identified through state or FDA inspections, serious complaints received from consumers or others, a previous history of noncompliance, or other information in the possession of the AB, MQSA inspectors, or FDA (i.e., visits for cause).
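As a rough illustration of how the onsite visit requirement scales with an AB’s size, the sketch below applies the five percent rule together with its floor of five visits and ceiling of 50 visits; the function name and the convention of rounding the five percent figure up to a whole visit are assumptions made for this example.

```python
import math

# Hypothetical helper applying the onsite visit rule described above: five percent
# of accredited facilities, but no fewer than 5 and no more than 50 visits per year.
# Rounding the five percent figure up to a whole visit is an assumption here.


def required_onsite_visits(facilities_accredited: int) -> int:
    five_percent = math.ceil(0.05 * facilities_accredited)
    return min(50, max(5, five_percent))


# Examples: a large AB is capped at 50 visits; a small AB still owes at least 5.
print(required_onsite_visits(9000))  # 50
print(required_onsite_visits(480))   # 24
print(required_onsite_visits(40))    # 5
```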

American College of Radiology AB

Based on the number of facilities ACR accredits, it is required to conduct onsite visits to 50 facilities. During CY 2002, ACR fulfilled its AB onsite visit obligation by completing 50 onsite visits (39 random, 11 for cause).

State of Arkansas AB

SAR conducted seven onsite visits (four random, three for cause) in CY 2002, thus exceeding the minimum of four onsite visits required by regulation.

State of California AB

Based on the number of facilities SCA accredits, it is required to conduct onsite visits to 24 facilities. In CY 2002, SCA fulfilled its AB onsite visit obligation by completing 24 onsite visits (20 random, four for cause).

State of Iowa AB

SIA conducted 49 onsite visits (45 random, four for cause) in CY 2002, thus exceeding the minimum of seven onsite visits required by regulation.

State of Texas AB

STX conducted 11 onsite visits (five random, six for cause) in CY 2002, thus exceeding the minimum of seven onsite visits required by regulation.

(6) Random Clinical Image Review

The final MQSA regulations (21 CFR §900.4(f)(2)(i)) require that each AB annually conduct random clinical image reviews (RCIRs) of at least three percent of the facilities the body accredits to monitor and assess facility compliance with the standards established by the body for accreditation.

American College of Radiology AB

During CY 2002, ACR fulfilled its obligation by completing 280 RCIRs, a slightly higher number than the 253 required by regulation.

State of Arkansas AB

SAR conducted 13 RCIRs in CY 2002, thus exceeding the minimum of three required by regulation.

State of California AB

In CY 2002, SCA conducted 20 RCIRs, thereby completing the 20 RCIRs required by regulation.

State of Iowa AB

The SIA conducted 45 RCIRs in CY 2002, thus exceeding the minimum of four required by regulation.

State of Texas AB

STX conducted five RCIRs in CY 2002, slightly higher than the four RCIRs required by regulation.

(7) Additional Mammography Review

If FDA believes that mammography quality at a facility has been compromised and may present a serious risk to human health, the facility must provide clinical images and other relevant information, as specified by FDA, for review by its AB (21 CFR §900.12(j)). This additional mammography review (AMR) helps the agency determine whether there is a need to notify affected patients, their physicians, or the public that the quality of mammograms may have been compromised. The request for an AMR may also be initiated by an AB or a State Certifying Agency (SAC). When an AB initiates an AMR, FDA encourages it to discuss the case with the agency prior to implementing the AMR.

The following chart summarizes the number of AMRs conducted by each AB during CY 2002:

AB     AMRs Conducted or Initiated*   With Deficiency or Serious Risk   Completed Corrective Action and/or Notification
ACR    18                             3                                 3
SAR    0                              0                                 0
SCA    4                              2**                               2
SIA    0                              0                                 0
STX    3                              2                                 1***

*Note: The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews during an AMR. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images.
**The results for two of SCA’s AMRs were pending at the writing of SCA’s 2002 Performance Evaluation.
***At the writing of STX’s 2002 Performance Evaluation, the corrective actions were pending for one facility.

(8) Accreditation Revocation and Suspension

The MQSA final regulations (21 CFR §900.3(b)(3)(iii)(I)) require that each AB have policies and procedures for suspending or revoking a facility’s accreditation. If a facility cannot correct deficiencies to ensure compliance with the standards or if a facility is unwilling to take corrective actions, the AB shall immediately notify the FDA, and shall suspend or revoke the facility’s accreditation.

American College of Radiology AB, State of Arkansas AB, State of Iowa AB, and State of Texas AB

None of the ACR, SAR, SIA, or STX ABs revoked or suspended any facility’s accreditation in 2002.

State of California AB

According to SCA’s interpretation of its own State authority, it currently lacks the authority to suspend or revoke accreditation. The SCA’s draft standards, which are currently moving through the State’s emergency legislative process, will grant the SCA AB the specific authority to revoke or suspend accreditation. In order to accomplish the same end result until its draft standards are signed and in effect, SCA uses the State’s “cease and desist” authority to force facilities to cease operations. SCA issued no cease and desist orders in CY 2002.

(9) Quantitative Accreditation and Inspection Information

As additional performance indicators, FDA analyzed quantitative accreditation and inspection information related to (a) unit accreditation pass/fail data, (b) reasons for denial of accreditation, and (c) accredited facility performance during inspections. Note: There are a relatively small number of state-accredited facilities compared to ACR-accredited facilities. Therefore, small variations in state-accredited facility performance may lead to differences across accreditation bodies that do not reflect actual differences in accreditation body performance.

(a) Unit Accreditation Pass/Fail Data Sorted by AB

Number of Units         ACR             SAR          SCA            SIA          STX
Total                   5,224           28           265            59           86
Passed Accreditation    5,209 (99.7%)   28 (100%)    262 (98.9%)    59 (100%)    85 (98.8%)
Failed Accreditation*   15 (0.3%)       0            3 (1.1%)       0            1 (1.2%)

*Units that were still denied accreditation as of December 31, 2002.

At the conclusion of the reporting period, the accreditation pass rate of mammography units among the accreditation bodies ranged from 98.8 to 100 percent. In general, the rates for units that failed accreditation decreased from those in the last reporting period. The unit fail rate usually reflects a facility’s second and third attempts at unit accreditation. Most facilities whose units fail the first attempt at accreditation initiate corrective action and subsequently pass.

(b) Reasons for Mammography Unit Denial

All the ABs denied mammography unit accreditation almost solely due to clinical image review failure, while ACR also denied unit accreditation for phantom image review failure. Most of the facilities that receive a denial in the accreditation process complete rigorous corrective action plans under the ABs’ reinstatement protocols and eventually successfully achieve the levels of quality needed for accreditation.

(c) Facility Performance During Inspections Sorted by AB

In CY 2002, over half (62.6 percent) of the accredited mammography facilities had no MQSA violations while only 2.4 percent of the facilities had a violation characterized as “most serious.” FDA actively works with these facilities on corrective measures, or takes regulatory measures if a facility cannot improve its performance.

                               ACR      SAR      SCA      SIA      STX
Average Phantom Image Score*   12.3     12.8     12.4     11.3     12.9
Average Dose (in millirads)    178.1    177.8    169      154.6    173.1
Average Processor Speed        104.6    108.8    108.5    102.7    107.2

*The maximum possible phantom image score is 16. Four fibers, three masses, and three speck groups must be visible on the image for a minimum passing score.
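For illustration only, the sketch below applies the scoring convention in the footnote above: the total phantom score is capped at 16, and at least four fibers, three speck groups, and three masses must be visible for an image to pass. The score layout and function name are assumptions for this example.

```python
# Illustrative check of the passing criteria in the footnote above: the total
# phantom score is capped at 16, and at least four fibers, three speck groups,
# and three masses must be visible for a passing image. The score layout and
# function name are assumptions for this example.

MINIMUM_SCORES = {"fibers": 4.0, "specks": 3.0, "masses": 3.0}
MAX_TOTAL_SCORE = 16.0


def phantom_image_passes(scores: dict) -> bool:
    """Return True when every test-object group meets its minimum score."""
    if sum(scores.values()) > MAX_TOTAL_SCORE:
        raise ValueError("total phantom score cannot exceed 16")
    return all(scores[group] >= minimum for group, minimum in MINIMUM_SCORES.items())


print(phantom_image_passes({"fibers": 4.5, "specks": 3.5, "masses": 3.5}))  # True
print(phantom_image_passes({"fibers": 3.5, "specks": 3.5, "masses": 3.5}))  # False (fibers below 4)
```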

There were no significant differences in average phantom image scores among the facilities accredited by the five ABs. In general, average phantom image scores increased from those reported in the 2001 Report. As phantom images are an indirect measurement of image quality, this rise might suggest that the clinical image quality throughout mammography facilities improved in 2002.

In general, the average doses increased slightly from those reported in the 2001 report, but still remain well below the dose limit of 300 millirads mandated by the MQSA final regulations. This dose limit has the advantage of permitting flexibility for the optimization of technique factors used during examinations to achieve improved image quality.

The average processing speeds among the facilities of all the ABs remained within the range needed to produce satisfactory clinical images and, in fact, increased from the speeds reported in the 2001 Report. The evaluation of a mammography facility’s film processing speed is an important quality assurance measure. Film processing speed directly affects not only the resulting image quality of the mammogram but also the dose administered to the patient. If a mammography facility is processing film in accordance with the film manufacturer’s recommendations, the processing speed should be close to 100 (80 – 120 is considered normal processing speed). If the processing speed falls significantly, the clinical image is not completely developed, appears too light, and the quality of the mammographic image can be significantly compromised. Moreover, the facility may not realize its film processor is the source of the problem and may compensate by increasing the dose administered to the patient.
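To make these limits concrete, the sketch below screens a facility’s inspection measurements against the 300-millirad dose ceiling and the 80 – 120 normal processing speed range discussed above; the record layout and function name are illustrative assumptions.

```python
# Illustrative screening of facility inspection measurements against the limits
# discussed above: average dose must stay below the 300-millirad ceiling, and
# processing speed should be close to 100 (80 - 120 is considered normal). The
# function name and finding wording are assumptions for this example.

DOSE_LIMIT_MRAD = 300.0
SPEED_NORMAL_RANGE = (80.0, 120.0)


def screen_inspection(avg_dose_mrad, processor_speed):
    """Return a list of findings that merit follow-up with the facility."""
    findings = []
    if avg_dose_mrad > DOSE_LIMIT_MRAD:
        findings.append(f"average dose {avg_dose_mrad} mrad exceeds the {DOSE_LIMIT_MRAD} mrad limit")
    low, high = SPEED_NORMAL_RANGE
    if not low <= processor_speed <= high:
        findings.append(f"processing speed {processor_speed} is outside the normal {low}-{high} range")
    return findings


# Example using values in line with the CY 2002 averages reported above.
print(screen_inspection(avg_dose_mrad=178.1, processor_speed=104.6))  # [] -> no findings
print(screen_inspection(avg_dose_mrad=320.0, processor_speed=70.0))   # two findings
```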

Status of the Action Items From the 2001 Report to Congress

In almost all instances, the ABs successfully completed their CY 2001 action items. In the rare instances where they did not, FDA is actively working with each AB to ensure that it successfully completes the requirements of each action item.

Conclusion

FDA’s AB oversight program promotes collaboration and cooperation. Therefore, each AB, in concert with FDA, is currently addressing all action items cited in this report. FDA and the ABs, working in partnership with the certified mammography facilities in the United States as well as the states participating in inspection and other MQSA activities, are ensuring quality mammography across the nation.
