

Environmental Data Quality at DOD Superfund Sites in Region 9

TABLE OF CONTENTS

Executive Summary

CHAPTERS

1 - Introduction

Purpose

Background

Scope and Methodology

Prior Audit Coverage

2 - Laboratory Quality Assurance Oversight at DOD Sites Needs Improvement

3 - Sacramento Army Depot

4 - March Air Force Base

5 - Travis Air Force Base

6 - Hunters Point Naval Shipyard

7 - Luke Air Force Base

APPENDICES

Appendix A - Region 9 Response to Draft Report

Appendix B - OIG Evaluation of Region 9 Response

Appendix C - Superfund Cleanup Process

Appendix D - Regulations, Orders, Directives, and Guidance

Appendix E - Activities Contacted During the Audit

Appendix F - EPA Data Analytical Levels

Appendix G - Definitions of Quality Assurance Techniques

Appendix H - Acronyms

Appendix I - Report Distribution

Chapter 1

Introduction

PURPOSE

EPA has an important role in overseeing the Department of Defense's (DOD) Superfund cleanups. This role includes ensuring that laboratory data is of acceptable quality. EPA is also responsible for expediting cleanups at closing and downsizing bases by resolving critical technical issues, such as laboratory data quality problems, before they become impediments to cleanup and reuse.

In 1992, serious problems began surfacing with the quality of laboratory data used in the cleanup of DOD Superfund sites located in Region 9 (the Region). These problems caused millions of dollars of laboratory data to be rejected and cleanups to be delayed.

The objective of the audit was to determine if the Region was ensuring that laboratory data was of known and acceptable quality under Federal facility agreements with DOD.

BACKGROUND

Characterized as the nation's largest industrial organization, DOD has 100 bases on EPA's Superfund National Priorities List (NPL), a register of the nation's most seriously contaminated hazardous waste sites.

Twenty-four of these bases are located within the Region. DOD bases received $2.7 billion in Fiscal 1995 to clean up their contaminated sites.

Nearly half of the NPL bases in the Region are scheduled for closure. However, hazardous waste contamination on many of these bases must be cleaned up before the property can be returned to the community for economic development.

For decades, DOD facilities generated, stored, and disposed of hazardous wastes, which often contaminated nearby soil and groundwater. Hazardous waste contamination can significantly contribute to serious illness or death, or pose a hazard to the environment. The types of hazardous wastes found at most DOD installations include solvents and corrosives; paint strippers and thinners; metals, such as lead, cadmium, and chromium; and unexploded ordnance. Contamination usually results from disposal, leaks, or spills.

Under the Comprehensive Environmental Response, Compensation, and Liability Act and its amendments, commonly known as Superfund, DOD is required to carry out its hazardous waste cleanups according to the same guidelines as other facilities. Executive Order 12580 further delegates certain cleanup authorities under Superfund to DOD.

Federal Facility Agreements

Once a base is placed on the NPL, EPA, DOD, and the state must negotiate a Federal facility agreement to govern the cleanup. The state is a party to the agreement because Superfund requires Federal agencies to consult with states regarding cleanups. Also, Federal agencies must comply with state environmental laws, which sometimes are more stringent than Federal rules.

Federal facility agreements set requirements and enforceable schedules for completing studies, reports, and cleanup records of decision. EPA is responsible for overseeing these agreements and has final approval authority over fundamental documents, such as the risk assessments and the feasibility studies. It also has final decision-making authority for selecting, on the basis of environmental data, the cleanup remedy. EPA also oversees cleanups at closing bases not on the NPL. (The Superfund cleanup process is described in Appendix C of this report.)

Environmental Data Quality Requirements

Environmental data is obtained primarily by sampling contaminated water, soil, air, and other materials, and having the samples analyzed by a laboratory. EPA Order 5360.1 requires environmental measurements to be of known quality, verifiable, and defensible. The quality of environmental data may be adversely impacted by weaknesses in sampling, laboratory analysis, and the validation of results. Poor quality data can negatively impact or delay the decision making process. Further, incorrect decisions can lead to inadequate health protection or expenditures for unneeded cleanup remedies.

The primary goal of the QA program is to ensure that all environmentally related measurements...[Laboratory Analysis] produce data of known quality. The quality of the data is known when all components...are thoroughly documented, such documentation being verifiable and defensible.

-EPA Order 5360.1

In focusing our audit on laboratory data quality activities, we determined that, unlike EPA, DOD does not have a centrally-managed contract laboratory program for monitoring laboratory quality. Typically, DOD's cleanup efforts are carried out by contractors, who procure the laboratory services.

Steps to Ensure Laboratory Analyses Quality

There are two major steps to ensure the quality of laboratory analyses for a site.

First, data quality objectives must be determined. Such objectives define how data will be used, and establish corresponding quality objectives before data is collected, thereby resulting in a defensible decision-making process.

Second, a quality assurance project plan (QAPP) must be developed in accordance with 40 CFR 300.430. The quality assurance activities necessary to achieve the data quality objectives are incorporated into a QAPP. This plan is a blueprint for ensuring the laboratory analyses produce data of sufficient quality and quantity for decision-making.

EPA's Oversight Responsibilities

EPA's regulations and EPA Office of Solid Waste and Emergency Response (OSWER) directives define regional oversight responsibilities in the Federal facility area. (The applicable guidance is listed in Appendix D of this report.)

In overseeing a Superfund cleanup, EPA is responsible for:

Ensuring environmental data is of known and acceptable quality for decision making;

Approving the QAPP and ensuring that it is acceptable; and

Verifying that the QAPP is implemented.

The Hazardous Waste Management Division is responsible for implementing the Superfund program in the Region. The Division's Federal Facilities Cleanup Office works with Federal, state and local agencies to oversee investigations and cleanups at Federal facilities. It is also responsible for initiating enforcement actions.

[Organization chart: The Regional Administrator oversees the Hazardous Waste Management Division, which includes the Federal Facilities Cleanup Office, and the Office of Policy and Management, which includes the Environmental Services Branch.]
The Region's Environmental Services Branch, under the Office of Policy and Management, helps ensure that "key environmental decisions...are based on environmental measurements of the correct type and of sufficient quality and quantity." It reviews QAPPs and provides technical assistance to program offices.

SCOPE AND METHODOLOGY

We performed our audit in accordance with Government Auditing Standards issued by the Comptroller General. Our field work was conducted between November 17, 1994 and June 30, 1995. Our audit covered the period Fiscal 1991 to 1994 and management procedures in effect as of December 31, 1994.

We reviewed the Region's oversight of laboratory data quality assurance activities at 5 of the 24 DOD bases on the NPL. This included the following facilities:

March Air Force Base;

Travis Air Force Base;

Hunters Point Naval Shipyard;

Luke Air Force Base; and

Sacramento Army Depot.

The first four facilities were selected because of known or possible problems with the quality of laboratory data. The Sacramento Army Depot was selected because some data validation procedural problems were noted in our audit survey. In addition, some limited audit survey work was done at Mather Air Force Base.

Internal Controls

We reviewed the internal controls associated with the regional oversight of laboratory data used by DOD facilities, including the Federal facility agreements, the QAPPs, and implementing guidance. The internal control weaknesses noted in our audit are described in Chapters 2 and 3 of this report, along with recommendations for corrective action.

FMFIA

In planning our audit, we reviewed EPA's 1994 Integrity Act Report to The President and Congress, issued in December 1994. This report, which documents EPA's compliance with the Federal Managers' Financial Integrity Act, identified environmental data quality as an existing material weakness. It also stated that previously reported problems with EPA's oversight of Federal facility cleanups had been corrected. Our audit report identifies additional problems related to EPA's oversight of Federal facility cleanups.

To accomplish our audit objective, we obtained:

Applicable laws, regulations, directives, and other guidance, including DOD laboratory quality assurance guidance;

Federal facility agreements;

QAPPs and related review comments; and

Evidence of quality assurance oversight.

We interviewed responsible EPA, DOD, and state officials. (A complete list of the entities contacted during the audit is included in Appendix E of this report.)

To evaluate the effectiveness of laboratory quality assurance, we identified quality assurance activities used in EPA's Contract Laboratory Program, and those activities that detected laboratory data problems at DOD sites.

These activities were used to evaluate the adequacy of the QAPPs. We determined compliance with key quality assurance activities in plans by reviewing supporting documentation. The audit also identified other key quality assurance activities that were not called for in the plans.

We also evaluated regional procedures for improving laboratory data quality at Federal facilities.

PRIOR AUDIT COVERAGE

There have been no prior audits of environmental laboratory quality at DOD by the EPA Office of Inspector General, the Department of Defense Office of Inspector General, the General Accounting Office, or the military service audit agencies. However, in a related audit, the Air Force Audit Agency evaluated environmental contract oversight at selected bases. Its report, issued on November 16, 1994, determined that Air Force oversight of environmental compliance contracts was not adequate.

Chapter 2

Laboratory Quality Assurance Oversight At DOD Sites Needs Improvement

SUMMARY

In 1992, serious problems began surfacing with laboratory analyses related to Superfund cleanups at DOD sites in Region 9. At the same time, a regional survey identified quality assurance deficiencies at Federal facilities in the Region. However, at the time of our review, the Region had not significantly strengthened its laboratory data quality oversight program at its DOD NPL sites.

The integrity of the data used to make cleanup decisions cannot be compromised if we are to successfully fulfill our joint responsibilities to protect public health and the environment.

-Region 9's Federal Facilities Cleanup Office in a May 1994 letter to DOD

QAPPs, the primary tool for controlling laboratory quality, were generally not designed to detect the laboratory quality problems found at the five DOD sites reviewed. We also found that QAPPs were not always officially approved or fully implemented. Further, the Region did not effectively monitor compliance with QAPPs or perform sufficient oversight to determine the quality of laboratory data. In addition, QAPP inadequacies were not corrected after data problems were found. We found many of these problems manifested at Sacramento Army Depot, resulting in serious data quality deficiencies. In our opinion, the extent of these deficiencies compromises the validity of the determination that a portion of the Depot had been cleaned up.

We also believe that a more effective quality assurance oversight program would have avoided a portion or all of the $5.5 million of rejected laboratory work and cleanup delays of up to 2 years.

DOD'S LABORATORY PROBLEMS

In 1992, the Region became aware of serious problems with the quality of laboratory work performed by DOD contractors. In May 1992, the Navy informed the Region that $2.5 million of data analyzed for the Hunters Point Naval Shipyard cleanup had been rejected. In September 1992, the Air Force informed the Region that there were significant problems with the laboratory analyses for Travis Air Force Base (AFB). Subsequently, EPA determined that $2 million of data was of unknown quality, setting the Travis AFB cleanup back 2 years.

Problems with Eureka Laboratories

In January 1993, the Region identified significant problems with laboratory analyses for the March AFB cleanup, eventually resulting in the rejection of $1 million of data. The laboratory performing the poor analyses for March Air Force Base, Eureka Laboratories, Inc., was eventually convicted of fraud based on its analyses for March AFB and another DOD site in Region 8.

Commendably, the Region took aggressive action to help prevent further data problems once the Eureka analyses deficiencies were discovered. At the Region's request, EPA suspended Eureka Laboratories in December 1993 from all future Federal direct procurements. In May 1994, the Region wrote to DOD asking for a "...comprehensive analysis of the impacts of the Eureka case...in Region 9..." In July 1994, Eureka Laboratories and two of its managers were indicted on 71 counts of fraud.

In August 1994, DOD responded to the Region's request, and identified 24 DOD installations that used Eureka Laboratories in the Region since 1989. However, DOD's response was not a complete analysis of the use of Eureka Laboratories' data. For example, DOD reported that Tracy Defense Distribution Region used Eureka Laboratories. However, DOD did not identify the extent of the laboratory's work done for Tracy or steps taken to ensure the reliability of data.

Eureka confirmed, by letter to ... the prime contractor, that none of the analyses were under question by EPA or were subject to fraudulent practices.

-Letter from DOD to Region 9,

August 12, 1994

The steps DOD reported it took to determine the reliability of data at other sites were clearly inadequate. For example, DOD said the Naval Station Guam and Pearl Harbor Naval Shipyard were not affected by the Eureka Laboratories problems. They indicated that Eureka Laboratories confirmed that none of the analyses for these bases were under question by EPA or were subject to any fraudulent practices. It is logical to assume that a laboratory under investigation for fraud would not admit to any fraudulent practices.

Also, DOD's response for these two sites did not indicate what types of cleanup decisions the data was used for. Additionally, DOD did not identify how it verified that the data was valid. Further, DOD failed to report at least three installations that used Eureka Laboratories.

In spite of DOD's incomplete response to the Eureka concerns, the Region did not request any further data from DOD. In our opinion, the impacts of the Eureka fraud need to be accurately determined to ensure that the laboratory data at affected sites is of the quality needed for decision making.

RELIABILITY OF DOD'S LABORATORY QA PROGRAM UNKNOWN

We also found that the Region's remedial project managers were unfamiliar with DOD's specific laboratory quality assurance procedures. The Region explained that DOD had the principal responsibility for data quality assurance. However, the Region is ultimately responsible for ensuring data is of known quality. Thus, the Region must perform sufficient oversight to ascertain the quality of the laboratory data.

It shall be the policy of all EPA organizational units to ensure that ... environmentally related measurements are of known quality.

-EPA Order 5360.1

The degree of regional oversight of DOD sites should depend on the effectiveness of DOD's laboratory quality assurance program. If DOD has an effective quality program, the Region can reduce its oversight procedures. Conversely, if DOD has a marginal quality assurance program, the Region needs to increase its oversight efforts.

We found the Region accepted at face value the description of laboratory quality assurance procedures provided by the DOD activities. However, our review disclosed that these procedures were not accurately portrayed. For example, DOD purported that a laboratory must successfully analyze performance evaluation samples before it receives samples. In actuality, performance evaluation samples were not always used. Further, laboratories were provided more than two opportunities to "pass" a performance evaluation sample.

We also question whether DOD's quality assurance procedures can be relied upon. This was illustrated by the results of laboratory audits of Eureka Laboratories.

Standard operating procedures...are available throughout...[Eureka Laboratories]. However, most of them are incomplete (with regards to QC and calibration requirements and acceptance criteria) and unapproved... Metals preparation, sample custody, and GC [gas chromatography]-pesticide SOPs do not represent what is actually being done...

-Air Force contractor,

February 11, 1991

An Air Force laboratory audit found major problems with Eureka Laboratories in January 1991 and again in October 1992. Apparently unaware of these problems, the Army Corps of Engineers (the Corps) validated Eureka Laboratories for an 18-month period starting January 1992. The validation was based on a determination that performance evaluation samples had been passed successfully and that there were no other major or minor deficiencies.

A Navy contractor also reviewed Eureka Laboratories in April 1991 and determined that Eureka Laboratories:

Has both the technical ability and sample capacity to meet the analytical and reporting requirements for the volume of work proposed for Navy...projects.

Subsequently, both the Army and Navy used Eureka Laboratories. In our opinion, the obvious difference in quality of laboratory audits between the military services impacts the degree of oversight the Region must perform.

No Tracking System

In 1986, EPA's Office of Solid Waste and Emergency Response issued a directive which required the oversight and monitoring of all Superfund laboratories, not just those under EPA contract. However, to date, the Region does not have a data base or tracking system to identify and oversee laboratories used at DOD NPL sites. To illustrate, when the Eureka Laboratories fraud was discovered, the Region could not determine the extent that Eureka's laboratories had been used at DOD NPL sites.

We believe this is a serious deficiency that warrants the development of a database or tracking system for Federal sites that identifies the laboratory used, the quality assurance measures performed, and any data quality deficiencies.
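
To illustrate the concept, the sketch below shows one minimal way such a tracking record could be structured. The table and field names are hypothetical; they are offered only as an example of the kind of information the Region could capture for each laboratory used at a Federal facility site.

    # Illustrative sketch only: a hypothetical tracking database recording the
    # laboratories used at Federal facility NPL sites, the quality assurance
    # measures performed, and any data quality deficiencies identified.
    import sqlite3

    conn = sqlite3.connect("federal_facility_lab_tracking.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS lab_usage (
            site           TEXT NOT NULL,   -- NPL base or facility name
            operable_unit  TEXT,
            laboratory     TEXT NOT NULL,   -- analytical laboratory used
            qa_measures    TEXT,            -- e.g., 'data validation; double-blind PE'
            deficiencies   TEXT             -- data quality problems identified
        )
    """)
    conn.execute(
        "INSERT INTO lab_usage VALUES (?, ?, ?, ?, ?)",
        ("Example AFB", "Groundwater", "Example Laboratories, Inc.",
         "20% data validation; pre-award laboratory audit", "none identified to date"),
    )
    conn.commit()

    # If a laboratory's work is later called into question, every site and
    # operable unit where that laboratory was used can be listed at once.
    for row in conn.execute(
            "SELECT site, operable_unit, qa_measures, deficiencies "
            "FROM lab_usage WHERE laboratory = ?",
            ("Example Laboratories, Inc.",)):
        print(row)

With even this simple structure, a question such as which sites used a particular laboratory could have been answered immediately when that laboratory's work came into question.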

REGIONAL SURVEY IDENTIFIES LABORATORY QUALITY ASSURANCE DEFICIENCIES

In a 1992 survey of laboratory quality assurance oversight practices, the Region found vulnerabilities in Federal facility Superfund cleanups. The survey found that data validation, split sampling and performance evaluation sampling were used to a much lesser degree at Federal facility sites than at EPA lead sites.

The survey concluded that:

...the ultimate quality of the data generated was not known at a number of sites. As demonstrated by recent events at Hunters Point, this is a program vulnerability.

In spite of this acknowledged vulnerability, we were advised, at the beginning of the audit, that no actions had been taken as a result of the survey. To illustrate, at Sacramento Army Depot, we found that since 1992 the Region had not performed any independent oversight of laboratory quality using procedures such as data validation, performance evaluation samples, or magnetic tape audits.

QAPPS LACKING EVIDENCE OF EPA APPROVAL

We found the Region frequently could not provide evidence that the QAPPs were approved. Both 40 CFR 300 and EPA Order 5360.1 require EPA approval of the QAPPs. Regional guidance document no. 9QA-03-89 calls for both the Region's remedial project manager and quality assurance officer to approve each QAPP.

We reviewed 12 QAPPs for the 5 DOD bases in our audit. The review identified that only two QAPPs had documented approvals by both the regional program office and the Environmental Services Branch's Quality Assurance Section, which was responsible for quality assurance. Both of these QAPPs were for Sacramento Army Depot. Also, a QAPP for the Groundwater Operable Unit at the Depot was not prepared.

Unlike other investigations currently being conducted at [Sacramento Army Depot], ...the ground water investigation therefore lacks a Quality Assurance Project Plan.

-Letter from Region 9's Environmental Services Contractor, December 1989

The Region advised us that it did not need formal documentation to approve QAPPs. It pointed out that, under Federal facility agreements, QAPPs would automatically be considered "final" documents if the Region did not provide comments to DOD within its 30-day review period.

In our opinion, formal approval is essential for two reasons. First, without formal approval, it is not clear if the QAPP being used is the most current version. In fact, a recently issued EPA quality assurance regulation (EPA QA/R-5) calls for making sure all personnel involved in the cleanup work have copies of the approved QAPP. We found several versions of QAPPs during our reviews, but it was not apparent which one was the "approved" version. Indeed, one regional remedial project manager provided us an "approved" QAPP that turned out to be a draft version.

Second, formal approval from the Region's Quality Assurance Section would help ensure its comments were solicited, received, and considered by the remedial project managers. EPA QA/R-5 provides that the appropriate content in the QAPP may be best achieved by having the QAPP reviewed and confirmed by the EPA remedial project manager, with the assistance and approval of the quality assurance manager.

However, it did not appear that the Quality Assurance Section recommendations were always given full consideration. For example, the March AFB QAPP was not approved by the Region's Quality Assurance Section. The section had previously recommended that the QAPP address the selection of subcontract laboratories and include more detail on the conduct of laboratory oversight. However, these recommendations were not included in the QAPP. In this case, DOD's laboratory selection process for March AFB had vulnerabilities, which were fully manifested by the selection of Eureka Laboratories. This issue is further discussed in Chapter 4 of this report.

It is recommended that the basewide plan [for March Air Force Base] address the selection of subcontract laboratories and how laboratory oversight...will be conducted throughout the duration of the project.

-Regional quality assurance staff recommendations,

September 13, 1991

QAPPS NOT WELL DESIGNED

Our audit disclosed that the QAPPs for the five DOD bases evaluated were typically not designed to detect poor laboratory work. To determine the most effective quality assurance techniques, we identified those that detected problems at the sites included in our report. As shown in the chart below, we found that certain techniques were highly effective in detecting poor or fraudulent laboratory data.

Quality Assurance Techniques Used

To Identify Data Quality Problems

  Quality Assurance Technique     Bases Where Problems Were Detected (of 5 reviewed)
  Full Data Validation            3
  "Double-Blind" PE Samples       1
  Magnetic Tape Audits            2
  Laboratory Audits               2

  (Bases reviewed: March Air Force Base, Hunters Point Naval Shipyard, Travis Air Force Base, Sacramento Army Depot, and Luke Air Force Base)

As discussed below and in Chapters 3 through 7 of this report, we found that the above four quality assurance techniques were generally not required by the QAPP. In those instances where the techniques were required, they were not sufficiently monitored by DOD or EPA.

Except for Travis AFB, we found that the QAPPs were not revised after laboratory data analyses problems were found. To illustrate, the QAPP for March AFB was not revised to require double-blind performance evaluation samples or magnetic tape audits, even though the Region used these techniques to identify data problems.

We believe the QAPPs could be better designed to detect poor quality data by incorporating techniques that have been successful in identifying prior data problems. This should be done at the time of the annual QAPP review required by EPA QA/R-5:

For programs or projects of long duration,...the QAPP shall be reviewed at least annually by the Project Manager, revised if necessary to reflect current needs, and resubmitted for review and approval.

DATA VALIDATION REQUIREMENTS

Data validation is used to ensure that laboratory data is of known quality. It involves reviewing data against a set of criteria to provide assurance that data is adequate for its intended use.

KEY DECISION POINTS

Requiring Data Validation

Determining contaminants of concern.

Identifying the extent of contamination.

Determining a cleanup is complete.

-"Basic Requirements for Quality Assurance at Region 9 Superfund Sites,"

Region 9, July 1988

Data validation is absolutely essential at key decision points, such as determining the boundaries of groundwater contamination.

EPA has data validation guidelines, known as national functional guidelines, for its own contract laboratory program. Generally, the QAPPs reviewed called for data validation that corresponded with EPA data validation guidelines. Data validation is typically performed on 100 percent of the data produced by EPA's contract laboratories. However, for DOD NPL sites, typically a much lower percentage is required.

According to EPA guidelines, data validation includes a review of documentation such as raw data, instrument printouts, chain of custody records, and instrument calibration logs. In order to perform an EPA-type validation, the data package should be an EPA "Level IV" package that includes the necessary documentation. (See Appendix F for description of data levels.) Without a Level IV package, it is not possible to evaluate many aspects of a laboratory's analyses or even verify the analyses were performed.
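
As a rough illustration of the completeness check implied above, the sketch below flags a data package that lacks the deliverables needed before an EPA-type validation can even be attempted. The deliverable names are simplified assumptions, not EPA's actual Level IV checklist.

    # Illustrative sketch only: check whether a data package contains the kinds
    # of deliverables needed to attempt an EPA-type (Level IV) data validation.
    # The deliverable list is a simplified assumption, not EPA's checklist.
    REQUIRED_DELIVERABLES = {
        "raw data",
        "instrument printouts",
        "chain of custody records",
        "instrument calibration logs",
    }

    def missing_deliverables(data_package):
        """Return the required deliverables absent from a data package."""
        provided = {item.lower() for item in data_package.get("deliverables", [])}
        return sorted(REQUIRED_DELIVERABLES - provided)

    # Example: a package that omits calibration logs cannot support full validation.
    package = {
        "laboratory": "Example Laboratories, Inc.",
        "deliverables": ["raw data", "instrument printouts", "chain of custody records"],
    }
    gaps = missing_deliverables(package)
    if gaps:
        print("Not a Level IV package; missing:", ", ".join(gaps))
    else:
        print("Package contains the deliverables needed for full validation.")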

...a percentage of data representing all matrices, analysis types, and lab contractors for decision points should be selected for ... documentation and data validation. A bare minimum would be 10 percent.

-Region 9 Quality Assurance Management Section

To illustrate how effective data validation can be in determining the quality of laboratory data, we noted that the results of the data validation at Hunters Point and Sacramento Army Depot caused data to be rejected, making it unusable for most significant decision-making, such as risk assessments. In fact, the problems with data quality were so serious at the Sacramento Army Depot that, in our opinion, they compromised the final cleanup decision.

Limited Data

Our review of 12 QAPPs at 5 DOD facilities showed that 8 QAPPs did not require a specific percentage of data to be validated using EPA's national functional guidelines (or an equivalent).

For example, two Luke AFB QAPPs required data validation only when "substantial quality control problems" were found. Since no quality control problems were found, data validation was not performed until fraud was alleged at the laboratory.

Sacramento Army Depot's Basewide QAPP also required data validation according to EPA national functional guidelines, and called for a percentage to be set in each operable unit QAPP. However, the percentage requirement was incorporated into only one of the three operable unit QAPPs.

Data Validation Was Deficient

We found that, when data validation was performed, it was not always performed according to EPA national functional guidelines, completed promptly, or performed by an independent third party.

Our review noted serious problems were overlooked when data validation did not use procedures comparable to EPA national functional guidelines. For example, the engineering contractor for Sacramento Army Depot advised us that some of its laboratory data was being validated. However, the OIG's Engineering and Science staff found that data validation, comparable to EPA national functional guidelines, had not been completed.

In our opinion, this was a material quality assurance weakness. Therefore, we asked the Region to perform data validation using EPA functional guidelines for two operable units at Sacramento Army Depot. Both of these validations found major problems. The validation for the Burn Pits Operable Unit rejected all analyses for volatile organic compounds for one round of sampling.

The validation for the Groundwater Operable Unit found serious deficiencies, such as missed holding times, which compromised the laboratory's conclusion that contaminants of concern were not present. As discussed in Chapter 3, we believe that any decision that cleanup actions are completed at the Depot should be deferred until a positive determination can be made as to the quality of the environmental data used in decision-making.
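
Holding-time exceedances of the kind noted above also lend themselves to a simple automated screen. The sketch below is a minimal illustration; the 14-day limit is an assumed example, since holding times vary by analyte and analytical method.

    # Illustrative sketch only: flag analyses run after the allowed holding time.
    # The 14-day limit is an assumed example, not a method-specific requirement.
    from datetime import date

    def missed_holding_time(collected, analyzed, limit_days=14):
        """Return True if the sample was analyzed after its holding time expired."""
        return (analyzed - collected).days > limit_days

    print(missed_holding_time(date(1994, 3, 1), date(1994, 3, 10)))  # False: within limit
    print(missed_holding_time(date(1994, 3, 1), date(1994, 3, 25)))  # True: holding time missed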

Although the Hunters Point QAPP required data validation according to EPA national functional guidelines, it was not performed promptly.

A large number of the Hunters Point samples were taken between October and December of 1990. In January 1991, the engineering contractor started performing "cursory" validation of this data and decided the laboratory analytical results were, "at best," estimates. In May 1992, the engineering contractor rejected all organic analyses for one phase of the project after performing data validation according to EPA guidelines. When data is rejected, the presence or absence of the analyte cannot be verified and the data is unusable.

In this case, well over one year passed before the data validation determined the data was unusable. In our opinion, much of the delay in determining the usability of the data could have been avoided if this validation had been done promptly.

Another problem observed at Travis AFB was a lack of independence by the party performing the data validation. In this case, the engineering contractor used its own laboratory. When the engineering contractor performed data validation, it found most of the data was usable. When the Air Force's quality assurance contractor reviewed the validation, it determined the same data was unusable. Thus, it appears the lack of independence may have adversely affected the reliability of the data validation. A similar situation occurred at Mather AFB where the contractor performing the data validation had used its own laboratory to complete the laboratory analytical work.

Data validation is one of the most important quality assurance techniques in a QAPP. We are recommending that the Region ensure the QAPPs require independent and prompt validation of DOD laboratory data, in accordance with EPA guidelines, for a minimum of 20 percent of the data. This percentage was established in the Sacramento Army Depot QAPP for the Burn Pits Operable Unit. We also noted that over 20 percent data validation was performed at Travis AFB. This validation should include all matrices, analysis types, and laboratories.
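
One straightforward way to satisfy such a coverage requirement is sketched below: at least 20 percent of the samples are selected within every combination of matrix, analysis type, and laboratory, so that no stratum is overlooked. The sample records and field names are hypothetical.

    # Illustrative sketch only: select at least 20 percent of samples from every
    # (matrix, analysis type, laboratory) group so that validation coverage spans
    # all strata. The sample records and field names are hypothetical.
    import math
    import random
    from collections import defaultdict

    def select_for_validation(samples, fraction=0.20, seed=0):
        """Return a subset covering at least `fraction` of each group."""
        groups = defaultdict(list)
        for s in samples:
            groups[(s["matrix"], s["analysis"], s["laboratory"])].append(s)
        rng = random.Random(seed)
        selected = []
        for members in groups.values():
            count = max(1, math.ceil(fraction * len(members)))  # never skip a group
            selected.extend(rng.sample(members, count))
        return selected

    samples = (
        [{"id": f"W{i}", "matrix": "water", "analysis": "VOC", "laboratory": "Lab A"} for i in range(10)]
        + [{"id": f"S{i}", "matrix": "soil", "analysis": "VOC", "laboratory": "Lab A"} for i in range(10)]
        + [{"id": f"M{i}", "matrix": "water", "analysis": "metals", "laboratory": "Lab B"} for i in range(5)]
    )
    print(len(select_for_validation(samples)), "of", len(samples), "samples selected for validation")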

Computer Assisted Data Validation Should Be Implemented

EPA has developed a computerized data validation program called Computer-Aided Data Review and Evaluation (CADRE). Computerized data validation is a relatively new quality assurance technique that is more efficient than traditional manual data validation.

CADRE More Efficient

According to regional data, CADRE is much more efficient than manual data validation as shown below:

COMPARISON OF DATA VALIDATION METHODS

                        CADRE       Manual
  Time to Validate      4 hours     35 hours
  Turnaround Time       1 week      1 month
  Cost to Validate      $150        $1,200
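
Using the regional figures above, the following rough arithmetic illustrates what the difference could mean for a hypothetical batch of 50 data packages.

    # Illustrative arithmetic only, based on the regional figures in the table above.
    packages = 50                               # hypothetical validation workload
    cadre_hours, manual_hours = 4, 35           # hours to validate one package
    cadre_cost, manual_cost = 150, 1200         # dollars to validate one package

    print("Hours:", packages * cadre_hours, "with CADRE vs.", packages * manual_hours, "manually")
    print("Cost: $", packages * cadre_cost, "with CADRE vs. $", packages * manual_cost, "manually")
    # For this hypothetical batch: 200 vs. 1,750 hours and $7,500 vs. $60,000.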

CADRE More Effective

During an in-house test, the Region found that CADRE not only identified the same problems that manual data validation did, but was more objective and consistent. A drawback to CADRE was that, as a computer program, it could not visually inspect raw data to identify anomalies.

We believe an electronic data validation program, such as CADRE, could be of great assistance in accelerating DOD cleanups. Under the President's Fast Track Cleanup Program for closing and down-sizing military bases, EPA is charged with forging a partnership with DOD that hastens cleanups. One of EPA's missions in this program is to provide the environmental expertise to streamline the cleanup process. Consistent with its mission, we believe the Region should ensure that DOD uses electronic validation techniques to the maximum extent possible.

In this regard, the Region conducted CADRE training for DOD staff in April 1995. However, no one from the Air Force attended. Further, CADRE was only distributed to the individuals attending the training.

In our opinion, QAPPs for the DOD sites should be modified to require the use of electronic validation techniques. We recognize that an upfront investment in software may be necessary. However, we believe the advantages of efficiently performing data validation outweigh any additional implementation costs.

"DOUBLE-BLIND" PE SAMPLES NOT REQUIRED

As discussed in Chapter 4, "double-blind" performance evaluation (PE) samples were used by the Region at March AFB to identify poor quality laboratory data. However, only 2 of the 12 QAPPs reviewed required "double-blind" PE sample analyses.

The PE sample results provide a point-in-time evaluation of data quality related to the program quality assurance objectives.

-Travis AFB Model QAPP

PE samples are prepared by "spiking" a known concentration of chemicals into a contaminant-free medium, such as water or soil. A laboratory's analyses of PE samples are used to evaluate its ability to produce accurate results.

PE samples can be administered by two methods: "blind" or "double-blind". When a PE sample is blind, the laboratory is aware the sample is a PE, but does not know the chemical concentration levels. Since the laboratory is aware the sample is a PE, it can make an extra effort to perform a thorough analysis.

When a sample is double-blind, the PE sample is submitted as part of a field sample shipment. In this situation, the laboratory is not only unaware of the concentration levels, it is also unaware that the sample is a PE. Double-blind PEs involve more logistical considerations, since the PEs must be inserted into a batch of samples in the field, using the same sample containers and sample tags.

The [double-blind] PE samples are made to look as similar to field samples as possible, and are submitted as part of a field sample shipment so that the laboratory is unable to distinguish between them.

-Travis AFB Model QAPP

In addition to double-blind PEs generally not being required, those that were used were not always evaluated by a party independent of the laboratory.

After data quality problems were found at Travis AFB, one of the QAPPs was revised to require double-blind PE samples for each analytical method and field sampling program. According to this QAPP,

The concentrations reported for the PE samples will be compared to the known or expected concentrations...The percent recovery will be calculated and the results assessed according to the accuracy criteria...If the accuracy criteria are not met, the cause of the discrepancy will be investigated and a second PE sample will be submitted.

We believe this is an excellent quality assurance technique that should be incorporated into each DOD site QAPP as early in the project as possible.
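
A minimal sketch of the percent recovery check described in the Travis QAPP excerpt is shown below. The spiked concentrations and the 75 to 125 percent acceptance window are assumptions used for illustration, not the QAPP's actual accuracy criteria.

    # Illustrative sketch only: compare a laboratory's reported result for a
    # double-blind PE sample against the known spiked concentration. The 75-125
    # percent acceptance window is an assumed example, not an actual criterion.
    def percent_recovery(reported, spiked):
        return 100.0 * reported / spiked

    def assess_pe_result(analyte, reported, spiked, low=75.0, high=125.0):
        recovery = percent_recovery(reported, spiked)
        status = "acceptable" if low <= recovery <= high else "investigate and resubmit PE"
        return f"{analyte}: {recovery:.0f}% recovery -> {status}"

    # Hypothetical double-blind PE results (concentrations in ug/L).
    print(assess_pe_result("trichloroethene", reported=48.0, spiked=50.0))
    print(assess_pe_result("benzene", reported=12.0, spiked=50.0))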

MAGNETIC TAPES SHOULD BE AVAILABLE

None of the QAPPs required magnetic tape audits.

Magnetic tape audits are routinely conducted by EPA in monitoring its contract laboratories to help determine whether the laboratory is complying with the contract, assess the integrity of the laboratory's computer systems, and evaluate the appropriateness of any software editing.

At the time of our review of the five DOD bases, we found that only March AFB used magnetic tape audits, as described in Chapter 4. In this case, the laboratory provided the magnetic tapes, although not required by the contract.

Although not suitable for all types of analytical data, we believe magnetic tape audits should be performed if major deficiencies are found by other methods, such as data validation or performance evaluation samples. However, in order to do so, DOD must be able to obtain the magnetic data. This means including the requirement in the QAPP. To illustrate the importance of the requirement, we noted that for Travis AFB, the laboratory would not provide the necessary magnetic tapes so that the Region could evaluate data quality.

LABORATORY AUDIT RESULTS NOT SHARED

All but 2 of the 12 QAPPs required some type of laboratory audit. However, none of the QAPPs required the results of the laboratory audits to be shared with the Region.

On-site audits are designed to identify technical areas which may cause laboratories to improperly identify or quantitate chemicals. The audits normally evaluate a laboratory's technical expertise, standard operating procedures, facility and equipment sufficiency, and possible sources of sample contamination.

As discussed earlier in this chapter, an Air Force laboratory audit of Eureka Laboratories in January 1991 found major problems with the Laboratory. It recommended that no further samples be sent to Eureka Laboratories. This audit recommendation was not shared with the Region. In the future, the Region should ensure that it obtains the most recent laboratory audits from each service as part of its oversight function.

We believe such audits are a good management technique and should be incorporated into QAPPs. In summary, our review identified the following weaknesses in laboratory audit requirements:

WEAKNESSES IN LABORATORY AUDIT

REQUIREMENTS FOR 5 DOD BASE QAPPS

  Weakness                                                   Sites Affected
  No laboratory audit required                               Hunters Point, Travis East Operable Unit
  Audit results not required to be shared with Region 9      All
  Audit not required before work is started                  Sacramento, March, Luke, Travis
  Audit not conducted by firm independent of laboratory      Travis

QAPPS NOT IMPLEMENTED OR CURRENT

As described in the following chapters of this report, we found that the QAPPs were not fully implemented by DOD. According to EPA Order 5360.1, EPA project managers are responsible for making sure QAPPs are implemented and adhered to.

To illustrate, the QAPP for the Burn Pits Operable Unit at Sacramento Army Depot required 20 percent of the data to be validated according to EPA national functional guidelines. We found that no data was validated. In another example, the QAPP for Luke AFB required the engineering contractor to do a laboratory audit at least semi-annually. We found the engineering contractor did not perform any laboratory audits.

We also found that some QAPPs were not current. For example, the Sacramento Army Depot QAPP named a quality assurance officer who was no longer employed by the depot's engineering contractor.

It is absolutely essential that the QAPP be kept current and that all personnel involved in the work effort have a current version of the QAPP available.

-EPA QA/R-5

DATA QUALITY OBJECTIVES WERE DEFICIENT

Data quality objectives were not sufficiently defined for four of the five DOD bases reviewed. Thus, it was difficult to determine whether data of sufficient quality and quantity was collected to support the decision-making process at these sites.

According to OSWER Directive 9355.9-01, EPA developed the data quality objective process because:

...it is the goal of EPA and the regulated community to minimize expenditures related to data collection by eliminating unnecessary, duplicative, or overly precise data. At the same time, they would like to collect data of sufficient quantity and quality to support defensible decision making. The most efficient way to accomplish both of these goals is to begin by ascertaining the type, quality, and quantity of data necessary to address the problem before the study begins.

The data quality objectives process was developed to avoid the collection of data that is inconsequential to the decision making process. The process allows decision makers to define their data requirements and acceptable levels of decision errors during planning, before any data is collected.

Data quality objectives are the basis for preparing the QAPP. The QAPP should show how the intended measurements or data acquisition methods are appropriate for achieving the data quality objectives. Thus, when data quality objectives are inadequately specified, the project runs the risk of not collecting sufficient quality data or expending too much on sampling.

EPA guidance concerning data quality objectives was updated in 1993. However, we evaluated the adequacy of data quality objectives using guidance prepared in 1987, since all of the QAPPs reviewed, except for the Travis Model QAPP, were prepared before the new guidance was issued. With the exception of this QAPP and the QAPP for Luke AFB, we found data quality objectives were not developed for each site investigation as required.

QUALITY ASSURANCE OFFICERS NOT APPOINTED

We found that DOD did not appoint quality assurance officers in a timely fashion and, in one case, appointed a quality assurance officer who was a contractor.

Generally, the Federal facility agreements or QAPPs required DOD to appoint a quality assurance officer to monitor project quality. As discussed in the chapters on Travis AFB and Hunters Point Naval Shipyard, these officers were sometimes not appointed for periods up to 3 years after the agreement was signed. In addition, we found there was no quality assurance officer at Sacramento Army Depot.

We also noted that, in one instance, the quality assurance officer appointed was an engineering contractor. Since the engineering contractors performed field sampling and sometimes were involved in laboratory analyses, we do not believe that they were in a position to independently evaluate data quality. The quality assurance officer position should be assigned to a government employee.

BEST PRACTICES

We identified some best practices in a Travis AFB QAPP which should be considered for inclusion in all DOD QAPPs. The Travis "Model" QAPP required double-blind performance evaluation samples. It also included a very usable format for the quality assurance report that showed findings, corrective actions required, and the overall effect on data quality assurance.

We also identified a Federal facility agreement, for a site not included in our audit, that did a good job of specifying quality assurance requirements. This agreement, for the Marine Corps Air Station Yuma, called for the following:

In order to provide quality assurance and maintain quality control regarding all field work and sample collection performed pursuant to this Agreement, the Marine Corps agrees to designate a Quality Assurance Officer (QAO) who will ensure that all work is performed in accordance with approved work plans, sampling plans, and QAPPs. The QAO shall maintain for inspection a log of quality assurance field activities and provide a copy to the Parties upon request.

...To ensure compliance with the QAPP, the Marine Corps shall, upon request by EPA or the State, use its best efforts to obtain access to all laboratories performing analysis on behalf of the Marine Corps pursuant to this Agreement. If such access is not obtained for any laboratory, EPA or the state may reject all or portions of the data generated by such laboratory and require the Marine Corps to have the same or comparable data analyzed by a laboratory that will grant such access.

RECOMMENDATIONS

We recommend that the Regional Administrator, Region 9:

1. Fully identify the impacts of the Eureka Laboratories fraudulent practices on Federal facility investigations, studies, and cleanups in the Region.

2. Determine the reliability of DOD's laboratory quality assurance programs and perform sufficient oversight to assure the quality of the laboratory data.

3. Develop a data base or tracking system for laboratory data quality at DOD facilities that identifies the laboratory used, the quality assurance measures performed, and data quality deficiencies identified.

4. Ensure that a QAPP is prepared for each DOD site and that the QAPPs are kept current.

5. Require QAPPs to be formally approved by the Region, after review and concurrence of the quality assurance staff. In addition, the remedial project managers should be required to document the reasons quality assurance staff's recommendations have not been included in final QAPPs.

6. Review the DOD QAPPs at least annually and require appropriate revisions to include, as a minimum, the following quality assurance measures:

a. The use of EPA national functional guidelines for at least 20 percent of the data representing all matrices, analysis types, and laboratories for decision points.

b. The completion of data validation requirements by a party independent of both the laboratory and its parent company.

c. The use of electronic data validation techniques.

7. Require the QAPPs to be amended to include the requirement for independent, double-blind PE samples.

8. Revise the QAPPs to require that magnetic data is maintained and made available to the Region. In addition, magnetic tape audits should be required if major deficiencies are found by other quality assurance methods, such as data validation or performance evaluation samples.

9. Improve the use of laboratory audits by amending the QAPPs to:

a. Require laboratory audits to be done before work is started and periodically throughout the project.

b. Specify that the audits will be conducted by an activity independent of the laboratory.

c. Require the laboratory audit results to be provided promptly to the Region.

10. Ensure the QAPPs are fully implemented by DOD facilities. In making this determination, the Region should obtain appropriate supporting documentation from the Federal facilities to substantiate their actions.

11. Make sure reasonable data quality objectives are included for each site investigation.

12. Ensure quality assurance officers are appointed in a timely manner and that they are government employees.

13. Publicize best practices used in DOD QAPPs and Federal facility agreements to make remedial project managers and quality assurance managers aware of them.

REGION 9 COMMENTS AND OIG EVALUATION

The Region did not respond to the specific recommendations, but said it was pursuing the following actions:

1. Region 9 has recently sought, and will continue to seek, to require federal facilities, through our comments on their Quality Assurance Project Plans (QAPjPs), to follow the Agency's Data Quality Objective Process for Superfund Interim Final Guidance dated September 1993. Region 9 will also ensure that, whenever possible, numeric data quality objectives on precision, accuracy, completeness, and detection limits are set prior to the approval of each QAPjP.

2. The OIG draft report found that data validation comparable to the EPA National Functional Guidelines was not performed at DOD sites. The Region has requested that several NPL federal facilities submit representative data packages to EPA. Region 9 will perform our standard data validation on these packages using EPA's Functional Guidelines for Data Validation. Data packages not containing sufficient analytical deliverables, along with all other data quality problems associated with the remaining data packages, will be identified to DOD for action.

3. The Region has been working with DOD for the last 3 years to raise awareness of the quality assurance problem and to help define an improved DOD quality assurance program that will meet or exceed our minimum standards. We will continue to participate in these efforts. As part of this effort, the Region will continue to encourage DOD facilities to develop and implement, as part of their own quality systems, a self sufficient performance evaluation program, and will seek to require DOD to submit PE study results to EPA. The Region plans to score and validate as many PE sample results as our limited resources allow and will make recommendations to DOD on follow up corrective actions when necessary.

4. In addition to recommending that Federal facilities conduct PE samples, Region 9 QAMS has recently conducted performance evaluation sampling for on-site and off-site labs at MCA Tustin (monthly), Yuma MCA (twice), George AFB, McClellan AFB (twice), and March AFB. Additional performance evaluation activities are planned.

5. Through the review of work plans and QAPjPs, the Region will seek to require DOD facilities to perform both pre-award and routine audits of their contract labs. We will also seek to require federal facilities to furnish all audit reports to EPA.

6. Through the review of work plans and QAPjPs, the Region will seek to require DOD facilities to address how they will ensure the authenticity of the data. At a minimum, we will seek to require that all organic data for GC and GC/MS magnetic tapes be archived and available to DOD and EPA upon request. This action would provide a strong deterrent to the fraudulent activities and poor quality laboratory performance that have been observed in the past. Region 9 QAMS has conducted magnetic tape audits at March AFB, Travis AFB, Luke AFB, and Yuma MCA. Additional tape audits will be conducted as resources allow. The Region is committing resources for purchasing equipment necessary to develop a regional tape audit capability, and will suggest DOD do the same.

7. Region 9 and OERR provided training on the use of the Computer Assisted Data Review and Evaluation (CADRE) software on April 10-13, 1995, to the U.S. Army Corps of Engineers, the U.S. Navy, and their contractors. An additional CADRE training session will be provided to the U.S. Air Force later this fall. Since the first training, QAMS has continued to distribute copies of CADRE and provide technical assistance to the attendees. QAMS received confirmation that the Army and Navy will begin to implement data validation using CADRE at a number of facilities. Implementation of CADRE would help remediate the vulnerability regarding data quality deficiencies identified in the IG report.

8. As appropriate, the Region will seek modifications by DOD of their previously approved QA Plans still in use to incorporate the IG comments regarding data quality objectives, data validation, the use of PE samples, etc.

9. One of the IG concerns is that federal facilities have not followed the requirements set forth in FFAs and approved QAPjPs. The Region's QAMS has developed a QA check list to assist the remedial project managers in their oversight of the quality system at federal facilities. The QA checklist can be used during the Technical Review Committee meetings to ensure that federal facilities comply with the requirements in the approved QAPjPs and appropriate oversight follow up is conducted.

10. The Region is exploring the idea of providing training for federal facilities and their contractors aimed at improving DOD's overall quality systems and reporting, developing expertise in the data quality objectives process, and managing DOD's contractors' laboratories.

The Region also indicated that:

Its staff were informed and directed to discuss with DOD officials the extent of Eureka's use at DOD sites, and to assess the criticality of this particular data to remedy selection at the site.

It agreed that "some" tracking system for data quality at Federal sites should be in place.

The Navy was revising the Hunters Point Naval Shipyard QAPP for use in the groundwater monitoring program.

OIG Evaluation

We accept the Region's comments as a strong commitment to improve the QAPPs and related data quality oversight at DOD activities. Since the Region's response did not specifically address each recommendation in the draft report, the recommendations must be addressed during the final audit resolution process.

Chapter 3

Sacramento Army Depot

SUMMARY

We found the Region had performed virtually no data quality oversight at the Sacramento Army Depot (the Depot) since 1992. As a result, the environmental data obtained for the Depot Superfund site was of unknown quality because: (i) the data packages the laboratories provided were insufficient to verify the quality of the analyses; and (ii) analyses were not validated using EPA national functional guidelines. Also, in some instances, environmental data used in the decision-making process was subsequently rejected due to unacceptable quality.

The overall quality objectives for the...QAPP...[are] to obtain data that are scientifically and legally defensible...

-Sacramento Army Depot Quality Assurance Project Plan

As a result, we believe any decision that cleanup actions are completed should be deferred until a positive determination can be made as to the quality of the environmental data used in decision-making. This includes the Region reevaluating its June 1994 certification that the Depot's Tank 2 Operable Unit was cleaned up. Until the reevaluation is completed, the Army should be informed that the certification is being withdrawn. Further, we recommend that the Depot's QAPP be revised to better oversee laboratory quality for the remainder of the cleanup effort.

BACKGROUND

The Depot, located in Sacramento, California, is a nearly 500-acre former military operation that was established in 1945. The Depot handled communications equipment and supplies. Operations at the Depot involved the use of solvents, oils and grease, fuels, caustic solutions, and metal-plating baths. The Depot was closed in March 1995. Part of the Depot property has already been transferred to the City of Sacramento.

The Depot was added to the NPL in 1987. EPA, the Depot, and the State of California entered into a Federal facility agreement in 1988 to investigate and clean up the site. In January 1995, a Basewide record of decision was agreed to by the Army, the State of California, and EPA to clean up both the soil and groundwater at the Depot. The record of decision was completed more than a year ahead of schedule.

Depot Includes Four Operable Units

The table below shows the Depot's four operable units and the types of contaminants found.

  Operable Unit        Contaminants of Concern
  Burn Pits            Volatile organic compounds, metals
  Tank 2               Volatile organic compounds
  Groundwater          Volatile organic compounds
  Oxidation Lagoons    Metals

NO QAPP FOR GROUNDWATER OPERABLE UNIT

QAPPs were prepared for 3 of the 4 operable units, and a Basewide QAPP was prepared. However, a QAPP was not prepared for the Groundwater Operable Unit, even though one was required by EPA Order 5360.1. While there was a "monitoring plan" for the Groundwater Operable Unit, it was not subject to the same requirements as a QAPP. For example, EPA has responsibility for ensuring that a QAPP is implemented, but there is no similar requirement for a monitoring plan.

QAPP APPROVAL PROCESS DEFICIENT

We noted there were no documented regional approvals for the QAPPs for the Oxidation Lagoons or Burn Pits Operable Units. Such approvals are important to ensure that the approved version of the QAPP is being used. We also found that recommendations made by the Region's Quality Assurance Section, including one calling for the preparation of a QAPP for the Groundwater Operable Unit, were not followed. The Region's files did not document why this recommendation was not followed.

QAPP DATA VALIDATION REQUIREMENT NOT MET

The Depot's Basewide QAPP required data validation according to EPA national functional guidelines. The percentage of samples requiring validation was to be incorporated into the operable unit QAPPs. Our review disclosed that three of four operable units had no percentage requirement for data validation in the QAPP or monitoring plan. Only the Burn Pits Operable Unit had a percentage requirement.

A percentage of all samples will be subjected to a complete statistical data validation procedure. The samples selected for the statistical analysis will be those that are nearest to the anticipated boundaries of contamination. Valid data will be used to evaluate whether the [data quality objectives] of the QAPP have been met.

-Basewide QAPP,

Sacramento Army Depot

DATA OF UNKNOWN QUALITY

EPA Order 5360.1 requires data to be of known quality, verifiable, and defensible. In our opinion, the laboratory data at the Depot was of unknown quality for several reasons.

Data Packages Are Insufficient

The OIG Engineering and Science Staff (ESS) determined the data packages that the laboratory provided to the Depot were insufficient to perform data validation in accordance with EPA national functional guidelines. One of the key items not provided by the laboratory was detailed equipment calibration records. These records are needed to perform data validation using EPA guidelines.

Although planned, data validation comparable to [EPA's] National Functional Guideline data validation has not ever been completed at the Depot. In addition, the data packages that the contractor required of the analytical laboratories are insufficient to perform EPA data validation.

-EPA OIG's Engineering and Science Staff, June 2, 1995

The data packages needed for data validation are normally called EPA Level IV data packages. (See Appendix F for an explanation of data package levels.) Without a Level IV package, it is not possible to confirm that the laboratory even performed the analyses.

Data Not Validated

The Depot's engineering contractor advised us that it had validated some of its contract laboratory data. However, the OIG ESS found it had not. In fact, we were not able to document any data validation performed by the Army or EPA that was comparable to EPA national functional guidelines for any of the operable units.

Samples Related to Tank 2 Cleanup

The Region accepted that the Tank 2 cleanup was complete in June 1994. As noted above, we determined that there was no documentation that data validation was performed in accordance with EPA national functional guidelines. However, we were advised that 12 "confirmation" samples were used to determine the Tank 2 Operable Unit cleanup had been completed. We subsequently determined that these samples had not been validated.

According to 1988 regional guidance, data validation is required for determining if cleanup is complete. In a May 1993 technical review committee meeting, the Depot agreed to validate 6 of the 12 confirmation samples.

Both the Corps, which administered the Depot's contracts, and the Region's project manager were under the impression that the Depot's engineering contractor had performed the data validation. However, when we asked for the data validation report, we found that the validation was never completed.

The OIG ESS found the data package submitted by the laboratory was incomplete for data validation using EPA national functional guidelines. For example, the package did not include critical information for calibrations and performance check samples. Further, the OIG ESS determined that the concentration units were consistently reported incorrectly: the soil sample results were reported in liquid concentrations.

The Region's determination that the Tank 2 Operable Unit cleanup was complete only referenced the 12 confirmation samples. However, in response to our concerns about the lack of the data validation report, the Region advised us it had considered other data, such as contaminant removal rates, to determine the cleanup was complete. Further, the Region stated that it was working with the Depot to validate the confirmation samples to the "extent feasible."

The data must be evaluated per EPA's functional guidelines. It is not acceptable to take the data at face value without additional validation.

-Region 9 discussing the requirements for cleanup completion of the Tank 2 site with Sacramento Army Depot officials, May 25, 1993

In our opinion, and consistent with the requirements agreed to by the Depot, the sample analyses were of unknown quality without data validation. Consequently, the analyses did not meet the requirement for determining that the cleanup is complete.

Request for Regional Review

Because of the lack of data validation and its impact on the status of this NPL site, we asked the Region to perform data validation using EPA guidelines for two operable units at Sacramento Army Depot, the Burn Pits and Groundwater Operable Units.

Burn Pits Operable Unit Data Rejected

Data validation for the Burn Pits Operable Unit found that volatile organic compound laboratory analyses had to be rejected. Specifically, the Region's quality assurance staff performed a validation of 33 selected samples from the 137 samples included in the March 1991 round of sampling at the Burn Pits. In a letter to us dated May 19, 1995, the staff reported that its validation found:

The sample results [for volatile organic compounds] are rejected due to serious deficiencies in the ability to analyze the sample and meet quality control criteria. The presence or absence of the analyte cannot be verified.

The 33 samples selected from the March 1991 sampling round were critical to the cleanup decision-making process at the site. These samples were used in the public health risk assessment, the remedial investigation, the feasibility study, and the record of decision. The data was also used to identify the contaminants of concern, determine the cleanup levels for those contaminants, and select the cleanup remedy. Further, the OIG ESS concluded that all 137 samples taken in March 1991 should be rejected because of a defect in the sampling technique. Specifically, the samples were not segregated into appropriate containers.

Groundwater Operable Unit Data Validation Finds Problems

The Region's quality assurance section found serious problems in its validation of the volatile organic compound data for the Groundwater Operable Unit including:

Missed holding times on three samples, possibly causing false assumptions that contaminants were absent ("false negatives"), or biasing the minimum values of all compounds; and,

Unacceptable calibrations impacting numerous compound analyses, including one contaminant of concern in all samples, and another contaminant of concern in 10 samples.

As a result, the samples could not be used to determine how much contamination was present or whether some contaminants of concern were absent. This was significant, since most of the sample analyses showed that contaminants were not present. Because the samples were used in the decision-making process, including determination of the cleanup levels and selection of the cleanup remedy, it is important that accurate sample analyses be obtained.

QUALITY ASSURANCE PROJECT PLANS CAN BE STRENGTHENED

In addition to correcting the weaknesses relating to data validation, the Depot's QAPPs could be strengthened to improve the opportunity for quality data. This could be done by: (i) using double-blind performance evaluation (PE) samples; (ii) requiring that laboratory audit results be provided to the Region; (iii) requiring the laboratories to provide electronic data information; and (iv) identifying site specific data quality objectives.

Double-Blind PE Samples Not Used

Neither the Region nor the Army used double-blind PEs at Sacramento Army Depot to evaluate the quality of data. Double-blind PE samples were successfully used to detect laboratory analyses problems at the March Air Force Base NPL site, as discussed in Chapter 4. Further, the Travis Air Force Base Model QAPP required the Air Force to use double-blind PEs.

Laboratory Audit Results Not Provided To Region 9

We believe laboratory audit results should be available to the Region. According to the Basewide QAPP, a laboratory audit was to be performed at the beginning of the project, and each time a data set was received. However, the QAPP did not require the audit results to be provided to the Region, and the Region was not receiving the results. In our opinion, this is not consistent with EPA guidance, which requires the Region to verify QAPP implementation. Further, had these results been made available to EPA, the Region would have been alerted to the problems at the laboratories performing analyses of Depot samples.

Corps Laboratory Inspections Find Problems

Generally, we found the Corps, which administered the engineering contract, performed a laboratory inspection before a laboratory was used. These inspections were normally not as thorough as audits and were not a substitute for them. Nevertheless, the inspections disclosed major deficiencies that may have impacted data quality, yet the results were not provided to EPA.

For example, a Corps May 1992 inspection of the laboratory that analyzed the Tank 2 Operable Unit confirmation samples found:

There was one major and 3 minor deficiencies noted during the lab inspection. The major deficiency noted during the lab inspection pertained to traceability. There was a consistent absence of initialed and dated reagents, log book entries and standards and spiking solutions ...minor deficiencies...included...lack of consistent measurement and recording of storage temperatures and a lack of consistent calibration of analytical balances.

The laboratory inspector recommended that any further validation for tank work be handled by the Corps' Sacramento District.

[The] lab has documentation problems...[and] QC [quality control] problems with temperatures and balances.

-Army Corps Inspection Report,

May 27, 1992

A Corps August 1992 inspection of the laboratory that performed the groundwater analyses found:

One major deficiency that would adversely affect [the laboratory's] ability to conduct the required chemical determinations was noted during the inspection.

This major deficiency dealt with a retention time window being incorrectly set. Nine minor deficiencies were also disclosed, including retention time calculations not performed in accordance with method protocols. The report stated that these deficiencies could adversely affect the laboratory's ability to conduct the required analyses.

[Retention time] windows have been corrected, ...74 previously reported samples may have been affected by this error.

-Army Corps of Engineers Inspection,

September 1, 1992

We also could not find any evidence that the Corps performed a laboratory inspection each time a data set was received, as required by the QAPP.

Magnetic Tapes Not Required

The QAPPs did not require the laboratory to provide electronic data, such as magnetic tapes of raw data, needed to perform audits. Magnetic tape audits were instrumental in detecting laboratory fraud at March Air Force Base, as discussed in Chapter 4.

...[Data quality objectives] were not adequately addressed in either the basewide QAPP or in either of the [other] QAPPs reviewed.

-OIG Engineering and Science Staff, June 2, 1995

Site Specific Data Quality Objectives Not Provided

The QAPPs did not present any site specific data quality objectives. Data quality objectives are the basis for determining the level of quality assurance activities.

QUALITY ASSURANCE OFFICER NOT REQUIRED

The Federal facility agreement did not require a quality assurance officer. We recognize this is not a mandatory requirement, but believe it would help affix DOD responsibility for data quality.

While the QAPP identified a quality assurance officer, the individual named worked for a subcontractor that was no longer employed by the Depot's engineering contractor.

IMPACTS AT SACRAMENTO ARMY DEPOT

According to State and Army personnel, it is anticipated that the Depot will be taken off the NPL within two years. OSWER Directive 9320.2-3A specifies that a Superfund site is considered completed when all cleanup actions identified in the record of decision have been successfully implemented and cleanup levels have been achieved. Only after the satisfaction of this requirement can a closeout report be prepared. A closeout report is required before a site can be deleted from the NPL.

In view of the data validation problems noted in our review, and the general absence of other validated data, it is our opinion the Depot cleanup should not be considered complete until a positive assurance can be provided that the data was of sufficient quantity and quality to support the decisions made.

RECOMMENDATIONS

We recommend that the Regional Administrator, Region 9:

1. Inform the Army that the cleanup certification for the Tank 2 Operable Unit is being withdrawn until the required data validation is performed.

2. Require all environmental data used for decision making to be validated using EPA national functional guidelines.

Note: Additional recommendations are addressed in Chapter 2 of this report.

REGION 9 COMMENTS AND OIG EVALUATION

The Region generally agreed with the recommendations and provided the following comments:

Sacramento Army Depot has agreed to resample the Tank 2 Operable Unit and perform the necessary data validation on the new samples. We propose advising the Army that if resampling indicates that cleanup levels have not been achieved, EPA will withdraw its cleanup certification and additional remediation will be required.

Concur with the recommendation; however, we would prefer that the recommendation be flexible enough to allow for a cost-effective approach to the validation of all previous data. We suggest that the recommendation read: "Require the Army to develop and implement an EPA-approved plan to validate previous data used for decision making using EPA national functional guidelines or equivalents."

OIG Evaluation

The Region's response meets the intent of the recommendations.

Chapter 4

March Air Force Base

SUMMARY

Despite objections raised by the Region, the engineering contractor for the AFB cleanup hired Eureka Laboratories in 1992 to analyze samples for part of the NPL site cleanup. Subsequently, in late 1993, the Region rejected $1 million worth of Eureka Laboratories analyses. This action delayed the project by more than a year. In December 1993, EPA suspended the laboratory from further Federal procurement actions, and in May 1995, the laboratory pleaded guilty to criminal charges that its test results were falsified.

Our audit noted that the QAPPs for March AFB did not contain the primary techniques eventually used by the Region to detect the laboratory fraud. These techniques included performing a laboratory audit before the start of sampling and requiring that magnetic tapes be provided upon request.

We believe that Eureka Laboratories probably would not have been used, if the laboratory audit had been required before the start of sampling. This belief is based on the fact that a 1991 Air Force laboratory audit of Eureka Laboratories under another project recommended that samples not be sent to Eureka Laboratories.

BACKGROUND

March AFB covers 7,100 acres near Riverside, California. It served as a training, bomber, and air-to-air refueling operations base. Contaminants include heavy metals and volatile organic compounds, such as benzene. The base was listed on the NPL in 1989. The Air Force, the Region, and the State of California signed the Federal facility agreement for the cleanup in September 1990. Under the agreement, there is a Basewide QAPP and QAPPs for the base's three operable units. A Basewide remedial investigation and feasibility study report was due in August 1995. The Air Force Center for Environmental Excellence (AFCEE), in San Antonio, Texas, issued most of the NPL site-related contracts for March AFB.

DATA PROBLEMS FOUND BY REGION 9

Although the AFCEE decided to use Eureka Laboratories in spite of warnings from the regional staff, the Region deserves credit for taking aggressive actions to minimize project setbacks caused by the laboratory's deficient work.

The Region learned that the Air Force planned to use Eureka Laboratories in early 1992, after EPA received the Air Force's QAPP for Operable Unit 2. The QAPP identified Eureka Laboratories as the contractor for the sample analyses.

Use of Eureka Laboratories Discouraged

Because another EPA Region had experienced problems with Eureka Laboratories, the Region verbally recommended that Eureka not be used. However, the Region's advice was not heeded. Instead, the AFCEE decided that it would continue to use Eureka Laboratories, along with two other laboratories, for laboratory analyses at Operable Unit 2.

The Region, sensitive to the potential problems associated with Eureka Laboratories, approved the Operable Unit 2 QAPP in August 1992 after deciding to use double-blind PE samples as part of its quality assurance oversight. The use of PE samples and subsequent regional audits of magnetic tapes from Eureka Laboratories disclosed that the laboratory work was deficient and fraudulent work was pervasive. This led to Eureka Laboratories pleading guilty to falsifying test results and two of its chemists being convicted of fraud in May 1995.

A Sacramento laboratory [Eureka Laboratories] admitted it routinely falsified results of water, soil and air tests done for the government to determine levels of pollutants and toxins at some of the worst hazardous waste sites in the country.

-Sacramento Bee

May 2, 1995

Meanwhile, EPA rejected all sample analyses work performed by Eureka Laboratories at the base. By January 1994, the Air Force had developed a corrective action plan for replacing the rejected data.

QUALITY ASSURANCE PROJECT PLANS WERE INADEQUATE

While the Region's oversight at March AFB was successful in detecting the problems with Eureka Laboratories, we concluded that the Basewide QAPP and Operable Unit 2 QAPP were inadequate to detect the problems. Specifically, the QAPPs did not require:

the use of PE samples;

the submission of magnetic tapes of raw data;

a laboratory audit before any analysis work began or that subsequent laboratory audits be provided to the Region; and,

the performance of data validation in accordance with EPA's national functional guidelines.

Further, we found that the QAPPs were not revised after the data problems were found. In addition, there was no record that the Basewide QAPP had been approved by the Region.

We also found that data quality objectives, which were the basis for determining the level of necessary data quality, were not sufficiently established in the QAPPs.

Performance Evaluation Samples Not Required

Neither of the QAPPs required the Air Force to use PE samples to judge laboratory quality. Instead, the QAPPs stated that the laboratory would "participate" in PE sample programs, and that the results would be made available during audits.

Magnetic Tapes Not Provided

The QAPPs did not require the laboratory to provide electronic data, such as magnetic tapes of raw data, needed to perform audits of the validity of the data. Magnetic tape audits were instrumental in the Region's detection of the Eureka Laboratories fraud.

Laboratory Audits Not Required

The Region did not require a laboratory audit before a laboratory was used. While laboratory audits are not always required by QAPPs, they have been effectively used in reviewing a laboratory's procedures. In our opinion, Eureka Laboratories probably would not have been selected if the Region had required an audit before the laboratory was used.

Instead of requiring an audit before selection of a laboratory, the QAPPs required the engineering contractor to perform one laboratory systems audit during the project. We consider this a weak quality assurance measure because it does not ensure that a laboratory is capable of performing the required analyses before work begins. In addition, the QAPPs did not require the results of the audit to be submitted to the Region.

As previously noted, AFCEE contracted for a laboratory audit of Eureka Laboratories in 1991 under another Air Force project. The audit found major problems and recommended that no samples be sent to Eureka Laboratories for analysis. In February 1992, and again in October 1992, followup reviews by the AFCEE contractor found many of the problems discovered in the 1991 audit remained unresolved. It was during this time frame that the QAPP for Operable Unit 2 was submitted to EPA for approval. The QAPP named Eureka Laboratories as the laboratory which would perform the sample analyses for Operable Unit 2. The sample analyses commenced in August 1992. During our discussions, AFCEE representatives were unable to explain why Eureka Laboratories was used at March AFB, when the laboratory audit recommended against sending samples to Eureka.

[The contractor] recommends that no further samples be sent to Eureka until the aforementioned corrective actions have been completed.

-AFCEE contractor

February 11, 1991

No Data Validation

The QAPPs did not require that data validation be performed in accordance with EPA's national functional guidelines. The QAPPs merely stated that there would be data validation "where applicable." However, they did not define where it was applicable or what percentage of the total samples analyzed would be validated. In addition, the QAPPs did not require the submission of detailed supporting sample documentation, known as "Level IV" data packages. These packages were necessary to perform data validation according to EPA national functional guidelines.

Failure to Establish Data Quality Objectives

Adequate data quality objectives were not established in the QAPPs, although they are the basis for determining the level of quality assurance activities. Specifically, the data quality objectives were necessary to:

Establish objectives for each type of data collection and analysis effort; and

Identify when "Level IV" data packages were needed. The Region recognizes that some "Level IV" data is a necessary prerequisite for data validation.

Additional Costs and Delays Associated With Rejected Work

March AFB officials advised us that the base paid Eureka Laboratories about $1 million for the laboratory analyses which were rejected. An additional $250,000 was expended for resampling. All but $60,000 of this amount was paid for by the engineering contractor. Further, because of the deficient and fraudulent laboratory analyses, the due date for the Remedial Investigation Report slipped from January 11, 1994 to April 10, 1995. Thus, the remedial investigation was delayed by more than a year.

RECOMMENDATIONS AND REGION 9 RESPONSE

Our recommendations are addressed in Chapter 2 of this report. The Region's complete response to this chapter and our evaluation of the response are included as Appendices A and B of this report.

Chapter 5

Travis Air Force Base

SUMMARY

After the Federal facility agreement was signed in 1990, the Air Force selected Roy F. Weston, Inc. (Weston) as the engineering contractor for the basewide remedial investigation and feasibility study. In 1991 and 1992, Weston completed three rounds of sampling to determine the extent of contamination at Travis AFB. The samples were sent to its own laboratories for analyses.

In January 1992, an Air Force laboratory audit of one of Weston's laboratories found major problems with its analytical practices. Because of the analytical problems, the Region determined that about $2 million of Weston's analyses were of unknown quality. In addition, because of these problems, one of the operable units has experienced a cleanup delay of more than 2 years.

We found the QAPP was not designed to detect the laboratory problems. After the data problems were disclosed, the Air Force developed two new QAPPs for Travis AFB. These QAPPs included many of the key data quality assurance ingredients missing from the original QAPP. However, in our opinion, the new QAPPs could be strengthened to improve the quality of laboratory analyses.

BACKGROUND

Travis AFB is located about 45 miles southwest of Sacramento, California, and covers over 5,000 acres. Its primary mission is to provide worldwide airlift capacity for cargo and troops.

Contaminants include volatile organic compounds, heavy metals, and polynuclear aromatic hydrocarbons. The base was listed on the NPL in 1990.

After the Federal facility agreement was signed in September 1990, the base was set up as one operable unit, and a basewide QAPP was prepared. The Air Force Center for Environmental Excellence (AFCEE) in San Antonio, Texas contracted with Weston to perform the remedial investigation and feasibility study. A subsequent AFCEE laboratory audit of one of Weston's laboratories disclosed major problems, which led to the Region's July 1993 determination that the data analyses were of unknown quality.

Operable Units

In 1993, after data problems were found, Travis AFB was divided into four operable units covered by two QAPPs as shown below:

REVISED QAPPS

Operable Unit                   QAPP

East Industrial                 East Industrial QAPP
North                           Model QAPP
West Industrial                 Model QAPP
West, Annexes, and Basewide     Model QAPP

A remedial investigation has not been completed for any of the operable units as of June 1995.

Weston is the prime contractor for the East Industrial Operable Unit, but no longer uses its own laboratories for sample analyses. Two other engineering contractors are doing investigative work at the other three operable units.

AFCEE administers the contracts for both the North and West Industrial operable units. The Sacramento District of the U.S. Army Corps of Engineers administers the contracts for the East Industrial, and West, Annexes, and Basewide operable units.

DATA PROBLEMS ENCOUNTERED

Weston performed sampling in June 1991, November 1991, and March 1992. Weston's laboratories in Stockton, California and Lionville, Pennsylvania were used to analyze the samples. The following events led to the Region's determination that the laboratory analyses were of unknown quality:

In January 1992, an AFCEE laboratory audit of the Stockton laboratory found major problems. The laboratory audit identified that instruments were not properly calibrated and acceptable laboratory quality control procedures were not followed.

It was not until September 1992, nearly 9 months later, that the Air Force notified the Region of the potential data problems. This was about the same time that Air Force officials met with Weston to discuss the data problems and determine what should be done to salvage the data. One of the decisions that came out of this meeting was that Weston would validate its own data according to EPA national functional guidelines, instead of having an independent third party do this. As discussed in this chapter, we consider this approach inadequate.

The consensus was to have Weston...conduct a validation process...A validation by another third party would be too long, too costly, too involved, and unnecessary because of the integrity expected from Weston.

-Minutes of September 25, 1992 meeting between the Air Force and Weston

In July 1993, the Region determined that the quality of the data was unknown and could not be used for its intended purpose.

QUALITY ASSURANCE PROJECT PLANS ARE DEFICIENT

Our review disclosed that the original QAPP was inadequate to detect the laboratory problems cited above.

After the base was divided into four operable units, two new QAPPs were developed to replace the original QAPP. In our opinion, the East Industrial QAPP omitted several important quality assurance features. We concluded that the Model QAPP is the most comprehensively designed QAPP for any of the five bases included in our audit. The strengths and weaknesses of the original QAPP and the new QAPPs are summarized below and discussed in detail in the ensuing paragraphs.

DATA QUALITY ASSURANCE ITEMS IN QAPPS

Data Quality Assurance Item                    Original   East OU   Model
                                                 QAPP       QAPP     QAPP

Double-Blind Performance Evaluation Samples                            X

Independent Laboratory Audits

Data Quality Objectives                                                X

Data Validation According to EPA Guidelines                  X        X

Magnetic Data Required Upon Request

Format for Quality Assurance Report                                    X

        X - Item is adequately described in QAPP.

ORIGINAL QAPP INADEQUATE

As explained previously, the original QAPP was not adequately designed to detect serious data problems. In addition, we found Weston made changes to the QAPP without the approval of either the Air Force or EPA.

Laboratory Audit Requirement Was Insufficient

The original QAPP did not contain an adequate requirement for laboratory audits. Specifically, the QAPP did not require an independent laboratory audit, but instead provided that the audit would be performed by Weston. In our opinion, this resulted in a conflict of interest situation, since Weston would be reviewing its own data. Further, the QAPP did not establish any procedures for the Air Force to review the Weston audits. Moreover, we could find no evidence the audits were performed.

Even though external laboratory audits were not required under the QAPP, the Air Force contracted for audits of Weston's Lionville, Pennsylvania and Stockton, California laboratories in the summer of 1990. The audits were initiated before the sampling started at Travis AFB in June 1991.

The audits found the laboratories had not fully implemented in-house control limits. These control limits are needed to ensure that data of known quality is produced throughout the sampling and analysis programs.

The laboratory audits concluded that the in-house control limits must be in use by the fourth quarter of 1990. The contractor reaudited both laboratories in 1992, and found the control limits were still not being used for all methods of analyses. Even though the control limits were not established at either laboratory, the Air Force continued to submit samples to both Weston laboratories.

In-house control limits for both precision and accuracy must be in use by the fourth quarter of 1990 as projected.

- Air Force contractor

August 13, 1990

The Air Force contractor's reaudit of Weston's Stockton laboratory in January 1992 also found other major problems. The contractor recommended that no more samples be sent to the laboratory until Weston implemented an effective laboratory quality control system, and had demonstrated that it was capable of ensuring the quality of data. The Air Force did not follow the contractor's audit recommendations, and Weston initiated its third and final round of sampling in March 1992.

Since the Air Force was not providing the results of its laboratory audits to EPA, the Region was unaware of the problems. If the information had been available, the Region would have had a sufficient basis to prevent Travis AFB from submitting the final round of samples to Weston's Stockton laboratory for analysis.

Split Sampling Used By EPA to Find Problems

The original QAPP did not require split sampling, a technique where identical samples are sent to two different laboratories for analysis and the results are compared. However, the Region performed some split sampling in June 1991, during the first round of Weston's sampling. Unfortunately, the Region did not compare the results of the split sampling until June 1993. The comparison disclosed major problems.

The results reported by EPA and Travis AFB for both soil and groundwater samples show almost no correspondence, particularly for metals.

- Report on Comparison

of Split Samples

June 18, 1993

If the sample comparison had been done promptly, the Region would have known about the data problems early enough to have avoided sending one or both of the remaining two rounds of samples to Weston's laboratory for analysis.
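The 1993 comparison report does not state which statistic was used to compare the two laboratories' results. As an illustrative sketch only, one common measure for comparing paired results for the same sample is the relative percent difference (RPD), where S1 and S2 are the concentrations reported by the two laboratories:

\[
\mathrm{RPD} = \frac{\lvert S_1 - S_2 \rvert}{(S_1 + S_2)/2} \times 100
\]

For example, paired results of 10 and 30 micrograms per liter yield an RPD of 100 percent, a disparity that would ordinarily prompt further review.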

Data Validation Not Done

The original QAPP did not have a firm requirement for independent data validation. The QAPP stated the data would be reviewed according to EPA national functional guidelines, but did not:

Establish how often the data validation would be performed;

Require an independent third party to perform the validation; and

Require the necessary analysis documentation, known as EPA Level IV data packages. These packages are necessary to perform data validation in accordance with EPA national functional guidelines.

As a result, the initial data validation, performed by Weston's in-house staff in October 1992, determined that the data was usable for its intended purpose. The Air Force quality assurance contractor reviewed Weston's validation report and found major problems that made much of the data unusable.

The analytical data from Weston are insufficient to conduct an EPA Level IV Review.

- EPA contractor's review

of Weston's data validation

January 27, 1993

Data Quality Objectives Not Specific

Data quality objectives were not established for each type of data collection and analysis effort. Further, the objectives did not specify the data package requirements necessary to complete data validation by EPA national functional guidelines.

Performance Evaluation Samples Not Done

The original QAPP requirement for PE samples called for Weston's laboratory quality assurance coordinator to provide double-blind PE samples to its own laboratories at least semi-annually. Weston did not provide the PE samples to its laboratories; moreover, even if it had, we would consider this quality assurance procedure inadequate for two reasons.

First, PE sample evaluations should have been provided and judged by an activity independent of the laboratory. Second, there should have been a requirement to submit the PE results to either the Air Force or the Region.

No Requirement to Provide Magnetic Tapes

The original QAPP did not require Weston to provide electronic data, such as magnetic tapes of raw data, needed to perform audits. We were advised that Weston refused to provide the magnetic tapes when they were requested by the Region. It should be noted that magnetic tape audits were instrumental in detecting the fraud at Eureka Laboratories, as discussed in Chapter 4.

QAPP Revised Without Approval

We also noted that Weston made changes to the original QAPP without Air Force or the Region's approval.

Weston requested QAPP deviations from the Air Force involving evaluating data for usability, rather than rerunning analyses, and widening acceptance criteria. Weston apparently assumed the deviations had been approved based upon discussions with AFCEE chemists. However, the deviations were never formally approved.

Variances requested by Weston have not been approved in writing and can't be until they can be justified to the regulators.

- Minutes of meeting between

Air Force and Weston

October 6, 1992

MODEL QAPP IMPROVED

Travis AFB finalized a Model QAPP for three operable units in May 1994. The Region did an excellent job of ensuring that most of the key ingredients for data quality assurance were both included in this QAPP and complied with. The only two quality assurance activities that we believe could have been added to further ensure data quality were:

provisions for independent laboratory audits before samples are provided, and

access to magnetic tapes.

Best Practices

The Model QAPP incorporated most of the items that we believe are critical to ensuring data quality. It adequately defined data quality objectives, and required double-blind PE samples and data validation. It also required that data quality assurance reports be provided for EPA review.

Our review of PE samples for the North Operable Unit showed Travis AFB performed the required PE samples. We also reviewed data validation requirements for the North and West Industrial Operable Units and found about 20 percent of the samples were being validated. Further, an independent third party was performing the data validation.

Quality Assurance Report Is Meaningful

The Model QAPP established an effective format for the quality assurance report. The report showed the results of the PE samples, laboratory audits, and data validation. The report included information on the findings, corrective actions required, and the effects on data quality assurance. The report was included with the remedial investigation reports for EPA's review.

EAST INDUSTRIAL QAPP DEFICIENT

After the data problems became known, Travis AFB developed a QAPP for the East Industrial Operable Unit. Although the QAPP satisfied the documentation requirements for data validation using EPA national functional guidelines, many of the problems with the original QAPP remained.

The East Industrial Operable Unit QAPP, like the original QAPP, did not require:

Independent laboratory audits;

PE samples (although it noted that PE samples are an additional type of independent quality control);

The laboratory to provide electronic data, such as magnetic tapes of raw data, needed to perform audits.

We also noted the QAPP did not identify data quality objectives for each data collection activity.

However, the QAPP did require third-party data validation according to EPA national functional guidelines on 10 percent of the data, and our review found that the data validation requirements were met.

QUALITY ASSURANCE OFFICER NOT APPOINTED

The Federal facility agreement, signed in September 1990, required the Air Force to appoint a quality assurance officer. This officer has the responsibility to ensure that work is performed in accordance with the QAPP. However, a quality assurance officer was not appointed until 1993.

To date, nearly one and a half years after the effective date of the [federal facility agreement], the Air Force has not designated a QAO [for Travis Air Force Base].

-EPA Region 9 letter to Air Force, April 15, 1992

COSTS AND DELAYS IN CLEANUP

The Air Force informed us that it paid Weston about $2 million for its laboratory data analyses. However, the quality and usability of this data is unknown. Further, because of the laboratory analyses problems, the due date for the original remedial investigation report for one of the operable units slipped from May 1993 to February 1996, a delay of more than 2 years.

RECOMMENDATIONS AND REGION 9 RESPONSE

Our recommendations are addressed in Chapter 2 of this report. The Region's complete response to this chapter and our evaluation of the response are included as Appendices A and B of this report.

Chapter 6

Hunters Point Naval Shipyard

SUMMARY

Between October 8, 1990 and December 18, 1990, the Navy's engineering contractors collected over 1,200 soil samples and dozens of groundwater samples to initiate remedial investigations at Hunters Point Naval Shipyard. These samples were sent to two laboratories. Nearly a year and a half later, in May 1992, all organic analyses from these samples were rejected due to poor-quality laboratory data.

Throughout the RI/FS process, the [Region 9 remedial project manager] will be primarily responsible for overseeing that the Navy's decisions have been based on data of sufficient quality.

-Region 9's Data Quality Oversight Plan,

March 1991

According to the engineering contractor, the cost of the rejected data was $2.5 million. It cost the Navy another $1 million to replace the rejected data and the cleanup was set back 2 years.

We found that the Region had prepared a data quality oversight plan to help it assess the quality of the data and the performance of the laboratories. However, the plan was not implemented. If it had been, and had the QAPP been better designed and implemented, we believe the 2-year delay could have been substantially reduced.

BACKGROUND

Hunters Point Naval Shipyard is located in San Francisco and covers 965 acres, of which about half is on land and the remainder is on San Francisco Bay. It has continuously operated as a shipyard or ship repair facility since 1869. Contaminants include paints, solvents, fuels and oils, acids and bases, metals, polychlorinated biphenyls (PCBs), and asbestos.

The base was listed on the NPL in 1989 and a Federal facility agreement to cleanup the site was signed in 1990. Hunters Point is divided into five operable units covered by one Basewide QAPP. In addition to the QAPP, the Region developed an oversight plan for monitoring data quality.

DATA PROBLEMS

From October 8, 1990 to December 18, 1990, the Naval Facilities Engineering Command had an engineering contractor collect over 1,200 soil samples and dozens of groundwater samples to initiate remedial investigations at Hunters Point. These samples were sent to two laboratories: Eagle Picher Environmental Services and National Environmental Testing (NET) Pacific Laboratories.

Problems were first noted by the engineering contractor in November 1990, when preliminary data received from both laboratories indicated that holding times were missed. By December 1990, the contractor concluded that the laboratories' analytical reports were going to be significantly late, and stopped work on this phase of the field work.

"Cursory" data reviews found more problems and the engineering contractor decided to discontinue sending samples to the two laboratories. However, the engineering contractor thought that previously analyzed data could be used for the remedial investigation.

When the contractor finished data validation using EPA national functional guidelines in March 1992, it found serious data quality deficiencies, such as instrument calibration problems.

Laboratory problems identified included difficulties meeting holding times because of sample overloading, difficulties in delivering results in electronic format, and problems submitting ...full validation packages. Additional lab problems... included instrument calibration problems, and calculation errors.

-Hunters Point Technical Meeting,

June 2, 1992

Consequently, in May 1992, the contractor recommended, and the Navy agreed, that all organic compound analyses be rejected. At the May 1992 technical review committee meeting, the Naval Facilities Engineering Command informed the Region and the State of California of the data quality problems and its decision to reject the data.

REGIONAL OVERSIGHT PLAN NOT IMPLEMENTED

In March 1991, the Region developed a data quality oversight plan to help it assess the quality of the data and laboratory performance. These plans are not required and were not developed for the other bases included in our review. We believe that the plan could have been a good tool for overseeing data quality; however, it was never implemented.

Data Validation Oversight Not Performed

One of the major oversight areas in the plan was data validation. According to the plan, the following steps were to be accomplished:

The Navy would validate 100 percent of the data.

The Region's data validation contractor would review at least 10 percent of the Navy's validation.

If validation problems arose, the Region's contractor could review another 10 percent of the Navy's validation to identify the causes of problems.

If the Region found major problems with data validation, it would meet with the Navy to discuss the problems.

As discussed later in this chapter, the QAPP did not require the 100 percent data validation specified in the oversight plan. It only required that 10 percent of the data be validated. Also, there was no evidence that the Region's contractor reviewed 10 percent of the Navy's data validation. Had either of these levels of data validation been performed, we believe the quality control problems that led to the rejected data could have been detected about one year earlier than May 1992.

Laboratory Audits Not Performed

The second major oversight area in the plan pertained to laboratory audits. According to the plan, the Region's project manager was responsible for implementing laboratory audits. The plan included a detailed audit checklist and recommended an audit before the start of analytical work and during the course of the program. However, the Region did not perform any laboratory audits.

An audit of the laboratories used for the Navy's Hunters Point Annex Remedial Investigation/Feasibility Study activities will assist the U.S. EPA in verifying the quality of the data.

-Hunters Point's Data

Quality Oversight Plan

QUALITY ASSURANCE PROJECT PLANS DEFICIENT

We reviewed the Basewide QAPP and found it included some good features, such as requiring data validation using EPA national functional guidelines. However, it did not include any data quality objectives, an important tool for determining the level of data quality needed.

Further, it did not establish time limits for submitting data packages, completing data validation, or submitting data validation reports. In addition, it did not require laboratory audits, double-blind PE samples, or magnetic tape availability. We also noted the Basewide QAPP was not formally approved by the Region.

Data Quality Objectives Not Defined

The OIG ESS found the QAPP did not adequately identify data quality objectives. The Region's data quality oversight plan also commented that the QAPP did not:

Clearly present data quality objectives;

Establish the analytical (data package) levels needed to meet the data quality objectives; or,

Establish acceptable error rates or confidence requirements for determining the sample size to reach the data quality objectives.

The lack of data quality objectives may have caused many of the data problems because the objectives are used to determine the type, quantity, and quality of data to be collected. No samples should be collected until data quality objectives have been determined.

...the allowable error or confidence needs to be documented in the...QAPP. The acceptable error and/or confidence is used to establish the number of samples required and the number of [quality control] samples required.

-Data Quality Oversight Plan,

March 1991

Data Validation Requirements Deficient

The QAPP required data to be validated using EPA national functional guidelines. It was through the use of data validation that the data quality problems at Hunters Point were found. However, the QAPP did not include time limits for providing data packages or performing the data validation.

Data was not rejected until more than a year after sampling because data validation was not performed promptly. A large number of samples were taken between October and December of 1990. In January 1991, the engineering contractor started performing "cursory" validation of this data and found "serious, but not fatal," holding time deficiencies. Holding times are time limits for analyzing samples. After the limits are exceeded, results are considered less accurate.

Cursory validation was not completed until August 1991 because laboratory data was missing. Based on this validation, the engineering contractor decided the laboratory analytical results were, "at best," estimates. The engineering contractor erroneously assumed that the "degree of confidence in data quality would not get worse after completion of full validation." It did.

Data validation, in accordance with EPA national functional guidelines or the equivalent, did not start until about 10 months after sampling was completed. After completing the data validation in March 1992, the engineering contractor decided, in May 1992, to reject all organic analyses for three of the operable units.

Problems Found by Data Validation

. Missed holding times
. Incomplete data validation packages
. Instrument calibration problems
. Calculation errors

At a June 1992 meeting, the State requested that, "...all future reports include only fully validated data..." While this was an appropriate request, we believe that timely validation should also have been required. In this case, well over one year passed before full data validation determined that all organic analyses had to be rejected.

Matrix Spike Results Excluded

Although the QAPPs generally required the use of EPA national functional guidelines, data validation reports did not contain all required elements. Our review of two data validation reports found that matrix spike results were not included in either report.

A matrix spike is prepared in the laboratory by adding a known amount of target analytes to a field sample prior to laboratory analysis. Matrix spikes are generated to determine the long-term accuracy of the analytical method and to demonstrate acceptable compound recovery by the laboratory at the time of sample analysis. We believe that matrix spikes and all other items included in the national functional guidelines should be included in the data validation reports to ensure the data is of known quality.
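Although the specific formula applied at Hunters Point is not documented in the validation reports we reviewed, matrix spike performance is conventionally expressed as a percent recovery, where SSR is the spiked sample result, SR is the unspiked sample result, and SA is the amount of spike added:

\[
\%R = \frac{SSR - SR}{SA} \times 100
\]

For example, if 50 micrograms per liter of an analyte is added and the spiked result exceeds the unspiked result by 40 micrograms per liter, the recovery is 80 percent; recoveries outside the method's acceptance limits would normally be flagged during data validation.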

Data Validation Percentage Not Met

We also found that the percentage requirement for data validation was not fully met. The QAPP required data validation according to national functional guidelines and that validation be performed on 10 percent of the samples. We found just 8.3 percent of the 12,437 samples taken from 1991 to 1994 were validated.
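Assuming the 10 percent requirement applied to all 12,437 samples, the shortfall is roughly:

\[
0.10 \times 12{,}437 \approx 1{,}244 \text{ samples required}, \qquad
0.083 \times 12{,}437 \approx 1{,}032 \text{ samples validated},
\]

or a shortfall of roughly 212 samples.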

Laboratory Audits Not Required

The QAPP did not require the Navy to perform laboratory audits. As a result, one of the two initial laboratories, NET Pacific, was not audited. Further, one of the primary replacement laboratories that analyzed nearly 3,000 samples, Kennedy/Jenks/Chilton Laboratory, was not audited.

Nonetheless, Navy contractors did some audits of laboratories performing analyses for the Hunters Point site. However, these audits did not ensure problems were found or resolved.

In April and May 1991, after the Navy realized there was a data problem, the engineering contractor audited two of the three replacement laboratories: Enseco, Inc.-California Analytical Laboratories and Anametrix. We concluded that these laboratory audits were of marginal value because:

They appeared to be cursory. The audits took only three hours to complete compared to the three days generally needed for a thorough audit.

The engineering contractor performing the audit did not report any problems. This was the same contractor that audited Eureka Laboratories on April 25, 1991 and did not find any major problems, although the laboratory later confessed to falsifying test results.

Moreover, when deficiencies were identified in a more thorough audit, they apparently remained uncorrected. The Navy contracted to have an audit of Eagle Picher Environmental Services laboratory conducted one month before the Navy started sending samples to the laboratory. The audit found several deficiencies related to calibration and acceptance criteria. Based on written comments received from the laboratory, the Navy's contractor felt the deficiencies were satisfactorily addressed. Subsequent data problems at the laboratory tend to contradict this conclusion.

Eagle Picher Audit Findings

. Need acceptance criteria for calibration of balances.

. Need acceptance criteria for water system.

. Calibration standards for gas chromatography/mass spectrometry volatile analysis cannot be traced to standards.

. No policy for documenting out of control events.

-September 1990 Navy Audit

The QAPP should be modified to require laboratory audits, and to ensure that they are thorough, completed before the work starts on the project, and performed periodically throughout the project. The QAPP also should require the results of the audits be provided to the Region. We believe that laboratory audits are a key piece of the laboratory quality picture and must be considered in defensible decision-making.

No Double-Blind PE Samples

Double-blind PE samples were not required by the QAPP and were generally not used.

Magnetic Tapes Not Required

The Basewide QAPP did not require the laboratories to provide electronic data, such as magnetic tapes of raw data, needed to perform audits.

QUALITY ASSURANCE OFFICER

Although the September 1990 Federal facility agreement required the appointment of a quality assurance officer, the Navy did not appoint one until July 1994, over 3 years after the agreement was signed. Further, the quality assurance officer, who oversaw 15 other Navy facilities, did not start work until September 1994.

COST AND DELAY CAUSED BY REJECTED DATA

According to the Navy's engineering contractor and a corresponding newspaper report, the Navy paid about $2.5 million for the laboratory work that was rejected. Another $1 million was spent to replace the rejected data. In addition, we believe the data problems caused at least 2 years of the 3-year delay in completing the basewide remedial investigation.

RECOMMENDATIONS AND REGION 9 RESPONSE

Our recommendations are addressed in Chapter 2 of this report. The Region's complete response to this chapter and our evaluation of the response are included as Appendices A and B of this report.

Chapter 7

Luke Air Force Base

SUMMARY

Since the Federal facility agreement was signed in 1990, Luke Air Force Base (AFB) has used Analytical Technologies Laboratory, Inc.'s (ATI) Phoenix laboratory to perform most of its laboratory analyses. In September 1994, the Air Force submitted a Remedial Investigation (RI) Report to EPA that was based on analyses performed by the ATI-Phoenix laboratory. Just before the submission of the RI Report, EPA suspended the ATI-Phoenix laboratory from further Federal work for allegedly using faulty test equipment and reporting false results under another EPA program.

The Region subsequently determined that the RI Report is not sufficiently supported. As a result, cleanup has been delayed by nearly one year thus far, and may require a major resampling effort. In our opinion, the delay could have been avoided if the QAPPs had been more effectively designed and implemented.

...We believe that the quality of the data from the remedial investigation at Luke Air Force Base cannot be determined to the extent that a Superfund Basewide Record of Decision...can be signed and finalized.

-Region 9's letter to

Luke Air Force Base, June 2, 1995

BACKGROUND

Luke AFB is a 4,200-acre base near Phoenix, Arizona, with a primary mission of advanced fighter training. The contaminants found at the site include petroleum products and volatile organic compounds, such as benzene. The base was listed on the NPL in 1990, and has been divided into two operable units (1 and 2). Three QAPPs were prepared: a Basewide QAPP and one for each operable unit. The base is using the services of the Omaha District of the U.S. Army Corps of Engineers (the Corps) to issue and monitor its contracts for the environmental investigation and cleanup.

The base's engineering contractor contracted with the ATI-Phoenix laboratory in 1991 to perform most of the analyses for the RI Report. In early 1994, a former ATI-Phoenix employee alleged that fraud was occurring at the laboratory. On September 1, 1994, EPA suspended ATI-Phoenix from performing any additional Federally funded work. The suspension was due to the laboratory's alleged use of faulty testing equipment and reporting false test results for analyses performed for the Glendale, Arizona Water System under the Safe Drinking Water Act.

The quality of laboratory data is of the utmost concern to us.

-EPA's Press Release

on ATI's Suspension

September 1, 1994

Although Luke AFB was aware of the suspension, it submitted the RI Report for Operable Unit 1 to EPA on September 2, 1994. However, neither the Air Force, the Corps, nor the Region has been able to fully ascertain the usability of ATI-Phoenix's analyses. In August 1995, the Region determined that some of ATI-Phoenix's data was manipulated and therefore is of unknown quality. The Region, the Air Force, and the state have agreed on the outline of a risk-based management approach that will determine the quality of the data. Because the data is of unknown quality, the Region has been unable to approve the RI Report, delaying the cleanup by nearly one year thus far.

QUALITY ASSURANCE PROJECT PLANS WERE DEFICIENT

We reviewed the Basewide and Operable Unit 1 QAPPs at Luke AFB. Although these QAPPs included some good quality assurance features, such as split sampling and laboratory audits, they did not require:

Double-blind PE samples;

Data validation, according to EPA national functional guidelines (unless problems were found);

Magnetic tapes to be provided upon request; and

Disclosure of the primary laboratory used at the site.

In our opinion, all of the above are essential features for an effective laboratory quality assurance program and should have been included in the QAPPs. We also noted that none of the QAPPs were approved by the Region.

Performance Evaluation Samples Not Fully Effective

The QAPPs provided that PE samples may be used if data problems were found. We noted the Corps used single-blind PE samples as part of its process to validate laboratories. The Corps sent single-blind PE samples to ATI-Phoenix in August 1991 and again in March 1993. However, we do not believe that single-blind PE samples are fully effective because the laboratory is aware that it is analyzing a PE sample.

The use of double-blind PE samples would be a more effective way of detecting problems, since the laboratory would not be aware that it was being evaluated. A good example of the effectiveness of this evaluation approach occurred at March AFB, as discussed in Chapter 4 of this report.

Data Validation Not Required

The QAPPs did not routinely require data validation according to EPA's national functional guidelines. The Region, in a review of quality assurance procedures in 1991, advised the Corps that:

...The minimum percentage of data package selection for validation...has not been proposed by the Corps. EPA recommends that the Corps outline a routine approach for identifying a minimum percentage of data validation plus a strategy for selecting data.

Although this was good advice, the Corps did not incorporate a validation percentage into the QAPPs. Instead, the QAPPs stated that data validation was required only when "substantial quality control problems" were detected. According to the Region's Quality Assurance Management Section, data validation should be done for a minimum of 10 percent of the samples, whether quality control problems are found or not.

Further, the QAPPs did not require the laboratory to provide data packages needed for data validation, unless problems were found. As a result, Luke AFB has experienced difficulty in obtaining the necessary data packages. Without these packages, the quality and usability of ATI-Phoenix's data could not be determined, thereby delaying decisions on how the cleanup should proceed.

Even after learning of the potential laboratory analyses problems with ATI-Phoenix, the Corps performed data validation on only 3 of 2,073 samples analyzed by the laboratory. The validation, performed in mid-1994, did not disclose any problems. However, we consider this sample size (less than 1 percent of the samples, compared to the Region's 10 percent minimum) too small to effectively evaluate data quality.

I do not believe ATI is responding in a timely manner with our/EPA's request for raw data.

-Luke Air Force Base

Restoration Section Chief

February 25, 1995

In September 1994, after suspending ATI-Phoenix, the Region requested ATI-Phoenix's data packages from Luke AFB. ATI-Phoenix did not furnish these packages promptly even though they were required to be provided within 30 days. In a February 22, 1995 letter, Luke AFB's cleanup manager told the Corps that:

I will not tolerate any further delays in ATI's responsiveness to this request. Our data is only usable if EPA determines it so via an audit of the ENTIRE list of accession numbers which I compiled. Failure to provide this entire list in a timely matter will result in a default determination that our data is not usable. I have determined that the definition of "a timely manner" runs out 10 March 95.

Although data packages were delivered by March 10, 1995, some of the packages were incomplete and could not be used to perform data validation in accordance with EPA guidelines. Consequently, the cleanup has been delayed because the majority of the work cannot proceed until the usability of the data has been determined.

Magnetic Tapes Not Available

The QAPPs did not require the laboratory to provide electronic data, such as magnetic tapes of the raw data, needed to perform magnetic tape audits.

ATI-PHOENIX LABORATORY NOT IDENTIFIED IN QAPP

The QAPP for Operable Unit 1 included the organizational structure and selected resumes for the ATI laboratories in San Diego and Pensacola. However, it did not identify or discuss the ATI-Phoenix laboratory. According to Luke AFB personnel, the QAPP was written by the engineering contractor's Tampa, Florida office, which may have been unaware that ATI had a Phoenix laboratory. In order for the Region to effectively evaluate a QAPP, it must have complete information about the laboratory and its capabilities.

LACK OF QUALITY ASSURANCE OFFICER

Luke AFB, the Region, and the State of Arizona signed the Federal facility agreement in September 1990. We believe that one of the weaknesses in the agreement was that it did not require the appointment of a quality assurance officer. Such an appointment would have helped affix responsibility for oversight of laboratory quality.

DELAY IN CLEANUP

The potential data quality problems have already delayed the cleanup by nearly one year. To illustrate, the Region's review of the RI Report for Operable Unit 1, originally scheduled for completion by November 1994, has slipped to October 1995.

Further, it appears resampling will be required. According to a June 2, 1995 letter from the Region to Luke Air Force Base:

...we believe at this time that results for volatile organic compounds..., semi-volatile organic compounds..., and pesticides are, at best qualified...Qualified data alone is not sufficient to support a baseline risk assessment...Based on available information, we strongly anticipate that confirmatory sampling will be needed to support the Basewide ROD [Record of Decision].

RECOMMENDATIONS AND REGION 9 RESPONSE

Our recommendations are addressed in Chapter 2 of this report. The Region's complete response to this chapter and our evaluation of the response are included as Appendices A and B of this report.

APPENDIX A

Region 9 Response to Draft Report

Region 9's complete response to the draft report is attached. The Region's specific comments were referenced to the page numbers of our draft report. These page numbers have changed in the final report. Our evaluation of the Region's comments is included in Appendix B.


UNITED STATES ENVIRONMENTAL PROTECTION AGENCY

REGION 9

75 Hawthorne Street

San Francisco, CA 94105

SEP 20, 1995

MEMORANDUM

SUBJECT: Response to Draft Audit Report No. E1SKF5-09-0031 Environmental Data Quality at DOD Superfund Sites in Region 9

FROM: Nora L. McGee, Assistant Regional Administrator

for Policy and Management (P-1)

TO: Truman R. Beeler

Divisional Inspector General for Audits

Western Division (I-1)

The attached memo, dated September 3, 1995, titled "Audit of EPA Region 9's Oversight Of Data Quality Assurance at Federal facilities, Draft Audit Report No. E1SKF5-09-0031," and signed by Julie Anderson-Rubin, Chief, Federal Facilities Cleanup Office, constitutes our official management decision on the subject audit. We feel confident that this memo is responsive to your concerns but understand from the exit conference that we still need to better correlate our planned corrective actions to the draft audit report's recommendations. We will provide an improved corrective action plan in our response to the final audit report.

In the interim, should you or your staff have any comments or questions, please contact Rich Hennecke, Regional Audit Followup/Management Controls Coordinator at (415) 744-1630.

Attachment

cc: Geary Pena, Audit Manager (I-2)

Keith Takata (H-1-S)

Julie Anderson-Rubin (H-9)

Terry Stumph (P-3)

Vance Fong (P-3-2)

Rich Hennecke (P-2-1)

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY

REGION IX

75 Hawthorne Street

San Francisco, CA 94105

September 3, 1995

MEMORANDUM

SUBJECT: Audit of EPA Region 9's Oversight of Data Quality Assurance at Federal Facilities; Draft Audit Report No. E1SKF5-09-0031

FROM: Julie Anderson, Director

Federal Facilities Cleanup Office

TO: Geary H. Pena

Acting Divisional Inspector General for Audit

Thank you for the opportunity to provide comments on the Draft Audit Report No. E1SKF5-09-0031, Environmental Data Quality at DOD Superfund Sites in Region 9. Included in this response are general clarifications and comments about the report, followed by comments on specific findings and recommendations in the report and updates on EPA Region 9's follow up to earlier findings.

Draft Report Overview

As we stated in our previous written comments, we in EPA Region 9 have always recognized the essential role that data quality assurance plays in our environmental programs. We work hard to instill in the federal facilities we oversee an awareness of their lead agency responsibilities.

EPA's oversight responsibilities at Federal facilities are substantial and diverse. They encompass scoping and approval of all investigations and other studies (workplans, risk assessments, ecological assessments, removal plans, feasibility studies, treatability studies), community involvement (including attendance at frequent Restoration Advisory Board meetings), environmental justice awareness, resolution of the numerous issues that often arise in remedy selection (cleanup levels, unexploded ordnance, endangered species protection, natural resource protection, historical preservation), general compliance with CERCLA and the National Contingency Plan, and data quality assurance. These oversight responsibilities are compounded by the vast acreage and, frequently, hundreds of contaminated areas involved at each facility. At closing bases, additional efforts must be directed to land use considerations, to quickly identify properties most suitable for early reuse and make those properties available to the community as expeditiously as possible. Given this considerable workload at Federal facilities, EPA does not (and likely never will) have sufficient resources to oversee all aspects of Superfund cleanup at each facility as we ideally would, let alone to actually perform DOD's lead agency responsibilities. Instead, EPA must leverage our very limited oversight resources to provide maximum benefit with minimum investment. One way we have historically done this is to identify to DOD issues we have observed that may affect remedy selection -- particularly if these issues have the potential to cross multiple facilities. We then work with these lead agencies to help them address these problems systematically.

To date, our general approach in overseeing DOD's quality assurance responsibilities has been to work with DOD to establish basic quality assurance plans for each activity, and require DOD to ensure these plans are implemented through establishment of quality assurance officers dedicated to this function. We still believe this fundamental approach, well executed, is sound. However, improvements to the actual quality assurance plans and to DOD's execution are obviously necessary. Recent data quality problems at DOD sites (in some cases discovered by EPA Region 9) confirm that there have been breakdowns in this approach, leading to data losses or data of limited usability. In some cases, DOD did not assign quality assurance officers, and as a result did not implement all aspects of the approved quality assurance plans. While EPA needs to check in more routinely on DOD's implementation of the program, simply increasing our "policing" of DOD won't prevent or solve the fundamental problem. Instead, DOD must assign a high priority to developing and implementing a sound Department-wide approach to assuring data quality from its contract laboratories, as EPA has done with our Contract Lab Program. Since EPA Region 9 became aware of this DOD problem, we have worked closely with all Services to assist DOD in building an improved program that will fulfill their lead agency responsibility to provide data of acceptable quality.

The report does not provide the reader a clear differentiation between EPA's role and DoD's role at DoD Superfund sites. The report does not reference or explain the effect of Executive Order 12580, which delegated lead-agency authority and responsibility to DOD at DOD CERCLA sites. In the absence of explanation, many readers will assume incorrectly that EPA's role at federal facility Superfund sites is analogous to its role at enforcement-lead Superfund sites, where potentially responsible parties are performing Superfund work. At enforcement-lead Superfund sites, EPA is the lead agency with primary responsibility for assuring proper implementation of NCP requirements, even though the work is being performed by a PRP. At DOD sites, DOD retains this primary responsibility. This intrinsically places EPA in a different, secondary position with regard to our role and responsibility at Federal sites.

While EPA retains certain authorities at federal facility sites (listing and delisting sites on the NPL, selecting remedies jointly with DOD and DOE, entering into interagency agreements), the resources we are provided to carry out our responsibilities are commensurate with very broad-brush oversight rather than a lead role. The lead agency functions of carrying out all aspects of the National Contingency Plan, preparing documents, performing fieldwork and construction, and ensuring data quality remain with DOD. In several locations in the draft report, problems and recommendations are described in terms that imply that EPA is the lead agency (e.g. "The Region did not...revise the QAPPs"). We recognize that the focus of this audit was to evaluate EPA Region 9, but where a perceived deficiency is attributable to DoD it should be so stated. Also, we suggest that the report recommend actions DoD should take to improve its QA programs, which EPA could recommend to DOD in our capacity as overseer. For instance, EPA should recommend to DOD that their contract documents be improved to establish QA requirements that assure adequate prime contractor review of subcontracted laboratories despite the inherent disincentives.

While the report addresses in great detail the perceived vulnerabilities in Region 9's oversight of DOD's quality assurance program, it fails to acknowledge our many competing responsibilities or the substantial effort and resource investment Region 9 has made over the last 2 1/2 years to detect and prosecute lab fraud at DOD labs, raise awareness of DOD's QA problem, and assume a national leadership role in working with DOD to attempt to improve this situation. We have found ourselves greatly frustrated in our many attempts to influence this larger issue, and were hopeful the Inspector General's review would assist in assigning a higher DOD priority to improving the situation. We are disappointed that the review doesn't appear to acknowledge the basis of the problem; as written it instead seems to portray DOD's failure to provide data of adequate quality as a problem caused by EPA's limited policing powers.

Executive Summary

1. Page i, paragraph 1

We recommend the following language changes:

At one DOD base, the Region uncovered extensive laboratory fraud.

Also, the source of the $5.5 Million/2.5 year loss should be referenced and supported.

2. Page ii, paragraph 3:

It is not EPA's role to directly oversee DOD laboratories as is implied in this paragraph; it is EPA's role to oversee DOD's quality assurance program, which may include occasional direct EPA involvement in reviewing laboratories under contract to DOD, or data produced by those labs. However, DOD has generally refused to sign FFAs that provide EPA independent authority to conduct audits and inspections of DOD contract laboratories.

We recommend the following language changes:

Although serious laboratory problems were identified, the Region had not significantly strengthened its oversight program over DOD laboratories. Nor had it only required that DOD modify the quality assurance project plans at one of the five sites audited ...

Further, it should be stated that the Region has increased its federal facility QA/QC oversight efforts in other ways. The Region has instituted a PE sampling program for DOD sites, increased the number of field and lab audits conducted (with DOD concurrence), has used several occasions to highlight the importance of QA/QC to DOD's contractor community, has raised the DOD QA/QC problem at the highest organizational levels of DOD and to all Service Branches, has worked collaboratively with DOD to develop and begin utilizing electronic data validation methods, and has participated in DOD's triservice data quality workgroup, recommending improvements to DOD's departmental data quality program. Further, the Region's debarment of fraudulent labs and subsequent criminal prosecution of laboratory personnel have sent effective messages to DOD and the laboratory community.

3. Page iii, paragraph 2

DQOs were a required element in all QAPPs. Often the DQOs stated in QAPPs were only statements of method requirements or laboratory capabilities, but this element had to be addressed in some fashion in all QAPPs undergoing QAMS review. Prior to 1994, there was no Agency guidance for defining quantitative project data quality goals (and there is still no practical method of deriving laboratory quality goals from statistical confidence levels for overall (field + lab) data quality). The 1987 Superfund DQO guidance stopped at defining data quality parameters (PARCC) and 5 categories of analytical data, which were erroneously termed levels of data quality. We acknowledge that site-specific numerical data quality objectives (DQOs) have not been prepared. However, the 1987 data quality objective guidance document was not considered helpful in this regard by Region 9 staff.

4. Page iii, paragraph 3

We recommend the following language changes:

While the Region took action to reject data, address impacts of lost data, and increase certain QA oversight activities such as expanding its PE sample and lab audit programs, the Region did not effectively monitor compliance with QAPPs, or recommend that DOD revise the QAPPs after laboratory data problems were found.

5. Page iv, paragraph 4

The intent of this paragraph is unclear. Eureka laboratory was audited several times before being used by March AFB, and while one of the audits identified serious problems with the lab, none of the audits detected fraud. Further, this paragraph doesn't acknowledge that data quality problems as a result of lab fraud were finally detected and prosecuted by EPA Region 9. While it is easily said in hindsight that stronger provisions in the QAPP to detect fraud could have prevented this problem before the fact, the Region feels that our identification of the problem and insistence on taking corrective actions should receive equal billing with the finding that the QAPP was not sufficient.

Chapter One

1. Page 2, paragraph 1: The report should note that the Region is also responsible for oversight of twelve closing bases which are not on the NPL, and may soon be expected to provide oversight to 8-10 more closing bases added as a result of the fourth round of base closures.

2. Page 3, paragraph 2: The report would benefit by specifying what the IG learned about DOD's own data quality assurance program -- i.e., you found they don't have a CLP and use multiple contractors. What did you learn about DOD's program to fulfill its responsibility to provide data of acceptable quality from these many contract and subcontract labs?

3. Page 5, paragraph 1: The report should indicate that the first four facilities were identified by EPA Region 9 as sites which had experienced known data quality problems. The report should also mention that, in the initial IG review, when asked which issues were of greatest concern to Region 9 with regard to pace and quality of DOD's cleanup program, Region 9 expressed our continuing frustration with repeated data quality problems, some of which we had uncovered.

Chapter Two

1. Page 9 & 10, Reliability of DOD's Laboratory QA Program Unknown

The statement in Chapter 2 of the audit report that the Region is "unfamiliar with DOD's laboratory quality assurance program" is incorrect. It is recognized that some confusion exists concerning the individual laboratory QA programs for the Navy, Army, and Air Force. The text in Chapter 1 (Introduction, Background, Environmental Data Quality Requirements, Page 3) acknowledges that "unlike EPA, DOD does not have a centrally-managed contract laboratory program for monitoring laboratory quality." As a result, all three services operate somewhat differently.

However, it is unfair to conclude that the Region lacks expertise in understanding DOD's laboratory quality assurance program. The Region's Remedial Project Managers (RPMs) and Quality Assurance Management Section (QAMS) staff routinely handle issues related to DOD's program. Any criticisms of the Region's expertise in this matter should be accompanied by acknowledgement of problems with DOD's internal consistency. In addition, DOD staff are themselves often unfamiliar with their own QA organizations and methods for quality assurance.

2. Page 9, paragraph 3: The report should indicate that, wherever the Rivers response indicated Eureka lab had been used at an NPL site in Region 9, EPA staff were informed and directed to discuss the extent of Eureka's use at the site and to assess the criticality of this particular data to remedy selection at the site. While it may be true that EPA didn't request in writing more data, this misrepresents our awareness and interest in the data generated by this lab, and its implications for decision making.

3. Page 10, Laboratory Quality Assurance Oversight Activities at DOD Sites Need Improvement

The text in one section of Chapter 2 (Reliability of DOD's Laboratory QA Program Unknown) criticizes the Region for providing laboratories with more than one opportunity to "pass" a performance evaluation (PE) sample. However, the text in another section of Chapter 2 ("Double-Blind" PE Samples not Required, page 21) implies that submission of a second PE sample, after a laboratory has failed to meet the accuracy criteria for a first PE sample, is considered to be acceptable. The audit report provides an example of language which was added to a QAPP for Travis Air Force Base (AFB) that requires the analysis of double-blind PE samples. The text in this example states that "If the accuracy criteria are not met, the cause of the discrepancy will be investigated and a second PE sample will be submitted." This apparent discrepancy should be addressed.

4. Page 11, paragraph 2: We agree "some tracking system" should be in place, but it should be the primary responsibility of DOD as lead agency to track its own laboratory use and the quality assurance measures employed by those labs to validate data. For EPA to create and maintain a similar data base is duplicative and wasteful of limited national resources.

5. Pages 11 and 15

A figure compares the 100% data validation and PE sample use at EPA-lead sites to the % at federal sites, implying that the lower % at federal sites is inadequate. 100% data validation is not a requirement, and a question remains as to whether it is in fact desirable. The data user, considering the site- or project-specific need, may decide, through the data quality objective process, what percentage of data validation is adequate. A uniform 20% data validation may be adequate for one site and not adequate for another.

6. Page 16, Laboratory Quality Assurance Oversight Activities at DOD Sites Need Improvement, Data Validation was Deficient

The discussion of data for the Hunters Point Superfund site states that data rejected in validation generally are unusable. It should be noted that rejected data are always considered to be unusable.

7. Pages 18 & 19, Laboratory Quality Assurance Oversight Activities at DOD Sites Need Improvement, Computer Aided Validation Should be Implemented

The text in Chapter 2 of the audit report recommends that QAPPs for DOD sites be modified to require the use of the EPA-developed computerized data validation program, Computer-Aided Data Review and Evaluation (CADRE). The use of computerized data validation is recognized to provide a more efficient, consistent, and cost effective method of data review than traditional manual validation, and its use should be encouraged whenever feasible.

However, the audit report fails to address critical implementation problems associated with the use of CADRE for validation of data generated for DOD sites. CADRE was expressly developed for application to data generated in Agency Standard Format, which is the electronic data reporting format used by the EPA Contract Laboratory Program (CLP). In order for CADRE to be useful to DOD, DOD contract laboratories would need to generate data deliverables in Agency Standard Format. It is unclear whether EPA has the authority to prescribe specific data reporting requirements for such laboratories. On April 10, 1995, QAMS offered the first CADRE training for Federal facility representatives. QAMS subsequently requested approval from OERR to release CADRE to Federal facilities and their contractors. QAMS also marketed CADRE at the California Base Closure Committee meeting as a recommendation to better assess data quality and save resources.

At this time, it is considered to be more realistic to recommend, rather than require, that QAPPs for DOD sites incorporate the use of CADRE. Alternatively, the specific model of electronic data validation software that will be used should be left to the discretion of DOD.

6. Page 19

The report indicated that the Region conducted CADRE training for DOD staff in April 1995. However, no one from the Air Force attended. QAMS has since scheduled a CADRE training for the Air Force for Fall 1995.

7. Page 21, Laboratory Quality Assurance Oversight Activities at DOD Sites Need Improvement, Magnetic Tapes Should be Available

The discussion of magnetic tapes in Chapter 2 of the audit report should be expanded to describe the limited application of tape audits. It is agreed that magnetic tapes provide a valuable tool for disclosing analytical deficiencies. However, it should be clarified that, in general, primarily data generated by analytical methods which involve the use of gas chromatography and gas chromatography/mass spectrometry [GC/MS] (i.e., volatile and semivolatile organic compound and dioxin/furan data, most commonly) are suitable for tape auditing purposes. The information presented in the audit report is misleading in that it implies that magnetic tape audits have application to all types of analytical data.

8. Pages 25-27, Recommendations: See our conclusions at the end of this document.

Chapter 3, Sacramento Army Depot:

1. Data Validation

Sacramento Army Depot has agreed to resample the Tank 2 Operable Unit and perform the necessary data validation on the new samples. We propose advising the Army that if resampling indicates that cleanup levels have not been achieved, EPA will withdraw its cleanup certification and additional remediation will be required.

2. Recommendation

Concur with the recommendation; however, we would prefer that the recommendation be flexible enough to allow for a cost-effective approach to the validation of all previous data. We suggest the recommendation read: "Require the Army to develop and implement an EPA approved plan to validate previous data used for decision making using EPA national functional guidelines or equivalents."

Chapter 4, March AFB:

1. Page ii states, "...the Region had not significantly strengthened its oversight program..."

This statement is untrue. Four other laboratories in use at March AFB were sent PE samples following the identification of the Eureka problem. Due to either problems in preparation of the PE samples or analysis of the samples by our laboratory, the results were not conclusive and additional PE samples could not be prepared by EPA QAMS in a timely manner.

2. Page iii states, "...the lack of site-specific data quality objectives was one reason serious problems were found with environmental data quality."

This finding is a non-sequitur for March AFB. There is no relationship between data quality objectives and fraud. This would also be the case for at least one other base reviewed by the IG (Luke).

3. Page iv, second paragraph under March AFB, which begins, "We found..."

The intent of this paragraph is unclear. None of the QAPPs for any of the FF in the past have had measures to protect against fraud. So this finding is not unique to March. There were two "laboratory audits" of Eureka by the Air Force technical support. While those audits certainly identified problems with use of the lab, they did not detect fraud, nor were they designed to detect fraud.

4. Page vi, second to the last bullet.

I do not know of any evidence to support the concept that the quality of QAPPs could be assured by the use of only government employees as quality assurance officers. Such a statement diminishes the quality of the argument you wish to promote. EPA has had excellent technical support from a number of contractors, and there is no reason why this will not continue to be the case. Also, "...independent laboratory audits..." needs definition.

5. Page 39, first paragraph

Add the word, "criminal" to the sentence ending, "...laboratory pleaded guilty to criminal charges that its test results were falsified."

6. Page 39, second paragraph

This paragraph is unclear. If the intent is to say that the QAPP did not contain a requirement to do a laboratory audit nor did it require magnetic tape audits, this is true. However, it is also not unusual. None of the FF QAPPs have required either of these functions. In fact, magnetic tape audits by EPA of PRP work are not SOP in any EPA guidance. They are done only if fraud is suspected, and the submittal of magnetic tape is probably enforceable only through Consent Decrees or Section 104 action. It is doubtful we could require it at FF unless it is in our guidance (The FF agrees to follow EPA guidance in the model FFA). Perhaps the IG wished to recommend that the guidance be strengthened, but this is not the impression I have when reading the draft.

7. Page 42, first full paragraph

The basewide QAPP was approved as part of the approval of the basewide Workplan and Sampling and Analysis Plan. Approval of the Workplan and the Sampling and Analysis Plan was given in our letter of December 21, 1991, in which we gave additional comments on the draft final. The Air Force agreed to our comments in their letter dated December 30, 1991. Copies of both those letters were sent to Dan Cox of the IG's office by mail on August 22, 1995.

8. Page 42, first paragraph under Laboratory Audits Not Required

EPA Region IX QAMS has not seen much value in "laboratory audits" in the past. Furnishing PE samples probably has more value on a dollar-for-dollar basis. Also, the Eureka Lab problem was fraud. Fraud, as we carefully explained to the IG staff, cannot be detected by a "laboratory audit". The type of fraud performed by Eureka would not have been detected by a "laboratory audit".

9. Page 43, first full paragraph

It would appear that the real candidate for an audit would be the Air Force. Why was a sub-contract given to Eureka when the Air Force contract technical auditor recommended, after two lab audits, that samples not be sent to that lab? It would appear that the largest contribution the EPA IG could make to this story is to push the Air Force to conduct an internal audit to discover the full extent of their internal QA problem.

10. Page 43, No Data Validation

This paragraph is a non-sequitur. Laboratory fraud will not be detected by Level IV data packages.

11. Page 44, Costs

Eureka Labs was paid about $845,000 for analyses of March AFB samples. An additional $144,000 was expended for all costs associated with resampling and analysis of samples. All of the $144,000 for resampling and analysis was paid by the engineering contractor.

Chapter 5, Travis Air Force Base

No additional comments.

Chapter 6, Hunters Point Annex:

1. Rejected Data Cost Estimates

The Navy engineering contractor, PRC, informed EPA Region 9 that it estimates that the rejected data cost approximately $1.21M, and the cost to replace this data was $945,906, not $2.5M and $1.1M as stated in the report. Please see the attached documentation provided by PRC, the Navy's engineering contractor.

2. Federal Facility Agreement. On page 56, Position Paper for Hunters Point, the following quotation is inaccurate: "Although the base was listed on the NPL in 1989, a federal facility agreement to cleanup the site was not signed until 1992."

The original FFA for Hunters Point was signed September 1990. In January 1992, the FFA was revised with new schedules and the addition of the Regional Water Quality Control Board as a signatory. A copy of this earlier FFA is enclosed for your records.

3. Data Validation. On page 58 of the Report the following statement is made: "Also, there was no evidence that the Region's contractor reviewed 10 percent of the Navy's validation. If this had been done, we believe the quality control problems that led to the rejected data could have been detected by May 1991, about one year earlier."

We believe it is important to note that according to the Navy's engineering contractor, the sample holding time exceedence problems were detected early enough in the process, during the contractor's cursory review of the Phase 1A data, to result in termination of use of the offending labs and initiation of use of new labs in the summer of 1991 for the Phase 2A data analysis.

4. Page 58, Laboratory Audits Not Performed.

The Navy is the lead agency for the project. As the IG report acknowledges, the Navy did perform some lab audits. These audits were already part of the Navy's and their contractor's procedures; therefore, the Region did not use its oversight resources to perform its own lab audits, which would have been duplicative of the Navy's efforts.

5. Page 61, Laboratory Audits Not Required.

The Region 9 staff are uncomfortable with the language used in the first sentences of this page: "The QAPP did not require the Navy to perform laboratory audits. Nonetheless, Navy contractors did some audits of laboratories doing analyses for the Hunters Point site." This statement could be rephrased: "Although the QAPP did not include a requirement that the Navy conduct lab audits, the Navy, in accordance with their own internal procedures, did perform audits of laboratories doing analyses for the Hunters Point site. It is for this reason, that EPA Region 9 did not use its oversight resources to perform its own lab audits which would have been duplicative of the Navy's efforts."

6. Page 61, Laboratory Audits Not Required. The last sentence of the first bulleted item reads, "This conclusion appears to be contradicted by the subsequent data problems at the laboratory."

This statement supports our belief that laboratory audits cannot be relied upon to detect all quality assurance problems that may arise and calls into question the Report's recommendations for increased emphasis on laboratory audits.

7. Estimated Schedule Delay

EPA does not agree with the estimated 2 year delay in the overall project schedule. It would be more appropriate to state that full characterization of the sites in question was delayed. Again, there have been many factors which have contributed to delays in the overall cleanup schedule at Hunters Point, most particularly, availability of sufficient capacity and funds for Navy contracts.

8. Split Sampling

During the most recent sampling of groundwater from a parking lot spring on Parcel A, a Field Sampling Plan was prepared by EPA Region 9 for the Navy with complete review and approval of the Region 9 Quality Assurance Management Section. This sampling occurred in late 1994 and early 1995. In addition, EPA collected and analyzed split groundwater samples to confirm the validity of the Navy's sampling effort.

9. Revised Basewide QAPP

Remedial investigation fieldwork is near completion. The Navy, with the assistance of its engineering contractor, is in the process of revising its QAPP for use in the long term groundwater monitoring program at Hunters Point.

Chapter 7, Luke AFB:

1. Page vi, 1st paragraph: "To date, neither the Region or the Air Force have been able to determine if environmental data is of sufficient quality to support remedial action."

The statement should be updated to reflect the fact that, to date, the State, EPA, and the Air Force have agreed (7/11/95) on the outline of a risk-based management approach that will determine the quality of the data. The plan that will implement the approach is going through regulatory review and should be finalized by October 1995.

2. Chapter 7, Page 63, Summary: "The Region has not yet determined whether the RI report is sufficiently supported, or if the report should be rejected."

The statement should be updated. To date, evidence has been acquired (EPA Region 9, 8/4/95) indicating that some of the Luke AFB RI data have been manipulated by ATI-Phoenix to the extent that the quality of the data is unknown, and therefore cannot be used for remedial decisions. Taking the data manipulation into account with the EPA lab suspension and State of Arizona suspension of contracts, the Region has determined that the RI report is not sufficiently supported. Negotiations are currently underway to put together the Plan (with enforceable dates) to address the data quality issues and for the Air Force to resubmit the RI report.

3. Chapter 7, Page 64, Summary, Last paragraph: "..Neither the Air Force,...or the Region have been able to substantiate the quality of ATI-Phoenix's analyses...Consequently, the Region has been unable to approve the RI report, delaying the cleanup by nearly one year thus far."

The statement is not accurate and may give the impression that the Region has been primarily responsible for delaying the cleanup for one year. The statement also implies that if the Region had approved the RI report, the cleanup would not have been delayed. Lastly, the statement makes it appear that the Region has done nothing to determine the quality of the RI data. The statement should be rewritten as follows:

However, the Air Force, the Army Corps of Engineers, and the Region have yet to fully ascertain the usability of ATI's analyses. Consequently, the Region has not approved the RI report more than one year after its submittal, which will ultimately delay the cleanup. While laboratory fraud was a primary cause of delays at this site, lack of cooperation on the part of ATI and the Air Force with EPA and State efforts to determine whether Luke data was impacted by fraud has further exacerbated the delays.

The Region should be given appropriate credit for its efforts to identify problems with the quality of the data.

The statements should be revised to be more accurate. First, it should be made clear that the Region is not primarily responsible for the delay in the cleanup. Primary responsibility for the delay should be given to the Air Force as the lead agency, which subcontracted the analyses work to ATI-Phoenix through the Army Corps of Engineers. Second, EPA Region 9 approval of the RI report (for the purpose of not delaying the cleanup) would have been an irresponsible action that would have compromised EPA's mission to protect public health. The latest evidence (indicating direct manipulation of data by ATI-Phoenix) shows that the Region made the right decision to withhold approval of the RI report.

The Region did provide preliminary comments on the RI report (EPA Region 9, 11/7/94). The Region's comments focused only on the metals analyses, since no evidence of fraud in that area of analyses was found.

Among the State of Arizona (AZ), the Air Force, and EPA, the Region has taken the lead in identifying and determining that the RI data for Luke AFB was affected by ATI-Phoenix fraud (EPA Region 9, 6/2/95 and 8/4/95). In addition, the Region has taken the lead in starting the process of auditing the tapes to determine the extent of manipulation (Region 9, 6/8/95 & 6/23/95). As indicated in the draft report, data for the audit was not provided by the Air Force in a timely manner. Most importantly, the Region was the lead in proposing a risk-based management approach to address the data-quality issue to the Air Force and State of AZ (Project Manager meeting 7/11/95). The outline of the plan for implementing the approach was submitted on 8/10/95 and is under review at this time.

Lastly, the Region has conducted a preliminary review of the Army Corps of Engineers (ACE) QA/QC program (8/9/95). Our review focused on the ACE QA/QC program's ability to detect fraud with its split sampling program. Our initial review concludes that the ACE split samples cannot be used to discriminate between manipulated and untainted accession numbers (see comment #8).

4. Page 64, Last paragraph: "We also noted that none of the QAPPs were approved by the Region."

The statement is inaccurate. The statement should be revised to indicate that the Base-wide QAPP is a primary document under the Luke Air Force Base Federal Facilities Agreement (FFA) and was reviewed by the Region in accordance with the FFA.

5. Page 66, 4th paragraph: "Consequently, the cleanup has been delayed because work cannot proceed until useability of the data is determined."

This statement is inaccurate because cleanup work is progressing at some sites at Luke AFB. For example, a bioventing treatability study is being conducted at site SS-42, and the remedy for the OU-2 ROD (ex-situ biological treatment) is in the remedial action phase at site DP-23.

6. Page 66, last paragraph: The QAPPs did not require the laboratory to provide electronic data...to perform laboratory audits.

The statement should be clarified. The QAPP indicates that tapes will be provided upon request from the Air Force at any time. In fact, the Region has acquired tapes through the Air Force for the purpose of conducting a tape audit.

7. Page 64, QAPP features that were lacking.

It should be noted that the type of fraud committed by ATI-Phoenix would not have been detected by the features identified as lacking from the QAPP. Based on meetings with Region 9 QAMS (Steve Remaley), the best features that could be added to the QAPP, and conducted by the Air Force, would be: 1) random tape audits (with no prior warning to the lab); and 2) the use of more than one lab for Remedial Investigation sample analyses (to minimize the impact to decision making).

8. Page 67, QAPP Has Good Practices

The section should be updated to reflect the fact that EPA Region 9 has conducted a preliminary review of the Army Corps of Engineers' (ACE) QA/QC Program in support of the Installation Restoration Program (IRP) at Luke Air Force Base. The review (EPA Region 9, 8/9/95) focused on the split samples and on the program's ability to identify manipulated IRP data at Luke Air Force Base.

Region 9 concluded that the criteria used to compare ACE split sample data and ATI-Phoenix data are too broad to discriminate differences caused by manipulation (fraud). Therefore, the ACE split sampling program was not able to detect fraud.

Appendix B

These tables should include the dates that the regulations, orders, directives, and guidances were finalized to clarify which were available at different stages in the Superfund process for the sites audited. Executive Order 12580, which delegates lead agency authority to DOD, should be included.

Ongoing and Planned Corrective Actions

1. Region 9 has recently sought, and will continue to seek, to require federal facilities, through our comments on their Quality Assurance Project Plans (QAPjPs), to follow the Agency's Data Quality Objective Process for Superfund Interim Final Guidance dated September 1993. Region 9 will also ensure that, whenever possible, numeric data quality objectives on precision, accuracy, completeness, and detection limits are set prior to the approval of each QAPjP.

2. The OIG draft report found that data validation comparable to the EPA National Functional Guidelines was not performed at DOD sites. The Region has requested that several NPL federal facilities submit representative data packages to EPA. Region 9 will perform our standard data validation on these packages using EPA's Functional Guidelines for Data Validation. Data packages that do not contain sufficient analytical deliverables, as well as any other data quality problems associated with the data packages, will be identified to DOD for action.

3. The Region has been working with DOD for the last 3 years to raise awareness of the quality assurance problem and to help define an improved DOD quality assurance program that will meet or exceed our minimum standards. We will continue to participate in these efforts. As part of this effort, the Region will continue to encourage DOD facilities to develop and implement, as part of their quality systems, a self-sufficient performance evaluation program, and will seek to require DOD to submit PE study results to EPA. The Region plans to score and validate as many PE sample results as our limited resources allow and will make recommendations to DOD on follow up corrective actions when necessary.

4. In addition to recommending that Federal facilities conduct PE samples, Region 9 QAMS has recently conducted performance evaluation samples for on-site and off-site labs at MCA Tustin (monthly), Yuma MCA (twice), George AFB, McClellan AFB (twice), and March AFB. Many other performance evaluation activities are planned.

5. Through the review of work plans and QAPjPs, the Region will seek to require DOD facilities to perform both pre-award and routine audits of their contract labs. We will also seek to require federal facilities to furnish all audit reports to EPA.

6. Through the review of work plans and QAPjPs, the Region will seek to require DOD facilities to address how they will ensure the authenticity of the data. At a minimum, we will seek to require that magnetic tapes for all GC and GC/MS organic data be archived and made available to DOD and EPA upon request. This action would provide a strong deterrent to the fraudulent activities and poor quality laboratory performance that have been observed in the past. Region 9 QAMS has conducted magnetic tape audits on March AFB, Travis AFB, Luke AFB, and Yuma MCA. Additional tape audits will be conducted as resources allow. The Region is committing resources for purchasing equipment necessary to develop a regional tape audit capability, and will suggest DOD do the same.

7. Region 9 and OERR provided training on the use of the Computer Assisted Data Review and Evaluation (CADRE) software on April 10-13, 1995 to the U.S. Army Corps of Engineers, the U.S. Navy, and their contractors. An additional CADRE training session will be provided to the U.S. Air Force later this fall. Since the first training, QAMS has continued to distribute copies of CADRE and provide technical assistance to the attendees. QAMS received confirmation that the Army and Navy will begin to implement data validation using CADRE at a number of facilities. Implementation of CADRE would help address the data quality vulnerabilities identified in the IG report.

8. As appropriate, the Region will seek modifications by DOD of their previously approved QA Plans still in use to incorporate the IG comments regarding data quality objectives, data validation, the use of PE samples, etc.

9. One of the IG concerns is that federal facilities have not followed the requirements set forth in FFAs and approved QAPjPs. The Region's QAMS has developed a QA check list to assist the remedial project managers in their oversight of the quality system at federal facilities. The QA check list can be used during the Technical Review Committee meetings to ensure that federal facilities comply with the requirements in the approved QAPjPs and that appropriate oversight follow up is conducted.

10. The Region is exploring the idea of providing training for federal facilities and their contractors aimed at improving DOD's overall quality systems and reporting, developing expertise in the data quality objectives process, and managing DOD's contractors' laboratories.

APPENDIX B

OIG Evaluation of Region 9 Response

A draft report was provided to the Region on July 20, 1995 and the Region responded on September 5, 1995. This response is included as Appendix A. An exit conference was held with regional officials on September 18, 1995.

We agreed with many of the Region's specific comments, and have modified our report accordingly. The modifications did not impact our overall conclusions.

The Region's response to our draft report (see Appendix A) included a Draft Report overview, and comments on the report's executive summary and individual chapters. Our reaction to the Region's Draft Report overview is incorporated into the Executive Summary section of our final report. For the balance of the Region's response, the following chart (referenced to note numbers in the Region's response) summarizes those comments where we did not agree with the Region. Reasons for our disagreement are discussed in the ensuing paragraphs.

Chapter                           Note

Executive Summary                 2, 5

1                                 2, 3

2                                 1, 2, 3, 4, 5

4                                 1, 2, 3, 4, 6, 7, 8, 9, 10, 11

6                                 1, 3, 4, 5, 6, 7, 8, 9

7                                 4, 6, 7

Executive Summary, Note 2

The Region's actions to elevate the data quality problems to the highest levels of DOD are commendable. However, at the time of our audit, we do not agree that the Region had significantly strengthened its own oversight program.

Our review of regional actions for the five sites included in the audit showed that the Region had not intensified its oversight of these DOD sites since the discovery of fraud at March AFB in January 1993. To illustrate, the Region did not use available oversight tools such as performance evaluation samples or laboratory audits in 1993 or 1994 at Sacramento Army Depot, Hunters Point Naval Shipyard, or Luke Air Force Base. Further, the Region did not require the Air Force to revise the QAPP for March AFB to require performance evaluation samples, although this activity detected the original fraud.

Executive Summary, Note 5

The Region's actions to detect the fraud at March AFB were outstanding, and have been so recognized in our audit report. The thrust of our report was to recommend ways that the QAPPs could be improved to readily identify other poor performing laboratories. Specifically, it is our opinion that the coordination of laboratory audit results between EPA and DOD would be an effective, yet inexpensive, management technique.

Chapter 1, Note 2

Chapter 2 generally describes what we learned about DOD's quality assurance program. The objective of our audit was to determine if the Region was ensuring that laboratory data was of acceptable quality under Federal facilities agreements with DOD. As such, we only evaluated DOD's program to the extent necessary to evaluate the Region's oversight.

Chapter 1, Note 3

This audit started as a special review of two Federal facility agreements: Sacramento Army Depot and Mather Air Force Base. During our entrance conference for this review, the Region advised us that the data quality was not being compromised during the accelerated cleanup process.

Although the Region admitted data quality assurance was "spotty" during this meeting, it did not inform us about problems with Eureka Laboratories. In this regard, Eureka had been suspended, at the Region's request, 6 months prior to our meeting. In fact, we first learned of problems with Eureka Laboratories from the local newspaper. Moreover, the Region did not identify any other Federal facility data quality problems during this meeting.

Chapter 2, Note 1

We believe our assertion that the Region is unfamiliar with DOD's laboratory program is correct.

The Region advised us, during the course of the audit, that it accepted DOD's description of its quality assurance procedures "at face value." As detailed in Chapter 2, we found that the procedures described by DOD were sometimes inaccurate. The Region also acknowledged that "some confusion exists concerning the individual QA programs for the Navy, Army, and Air Force..."

Chapter 2, Note 2

The Region's stated action was not adequate to fully assess the impacts of the Eureka Laboratories fraud. As noted in this chapter, one of the problems in assessing the effect of Eureka analyses was the identification of the sites that were impacted. DOD has yet to adequately identify all the NPL sites that used Eureka Laboratories, including McClellan AFB and Sacramento Army Depot. Further, the Region was responsible for overseeing other closing DOD sites not on the NPL, including Mare Island Naval Shipyard and the Presidio of San Francisco. Some of these sites may have also used Eureka Laboratories, and the impacts of such usage would also need to be determined.

Chapter 2, Note 3

The performance evaluation samples discussed in Chapter 2 are of two types. The first type, discussed in the DOD laboratory QA section, is a preaward, single-blind sample. In other words, the laboratory knows it is analyzing a performance sample. DOD purported its procedures were the same as EPA's contract laboratory program procedures which allow a laboratory one opportunity to pass this known test. However, DOD procedures were not consistent with EPA procedures. To the contrary, some laboratories were given more than two opportunities to "pass" these tests.

The performance evaluation samples discussed later (in the double-blind PE samples section), are post-award, double-blind samples. Both EPA's contract laboratory program and DOD allow a laboratory another opportunity to "pass" these evaluations.

Chapter 2, Note 4

Since the DOD activities have not maintained a tracking system, we disagree that it would be duplicative or wasteful for EPA to maintain a data base of laboratories used by DOD. The establishment of a tracking system would be a valuable tool for helping the Region determine where to focus its resources.

Further, as noted by EPA's Office of Solid Waste and Emergency Response (OSWER), there is a need to monitor "all Superfund analytical activities, not just those analyses provided by [EPA]... laboratories." OSWER further pointed out that non-EPA laboratory work played a significant role in supporting Superfund.

Chapter 2, Note 5

The referenced graph compares the percentage of sites using data validation, not the percentage of data validated. In contrast, the subsequent chart shows the percentage of data that is required to be validated based on our review of 12 QAPPs. As noted in our recommendations, we believe that 20 percent of the data representing all matrices, analysis types, and laboratories for decision points should be validated, at a bare minimum. We agree that the data user may require a higher percentage.

Chapter 4, Note 1

It remains our position that the Region did not take adequate action to strengthen its oversight program after problems with Eureka Laboratories were discovered. As noted in the Region's response, there were problems with the EPA laboratory's preparation or analysis of data and the laboratory results were not conclusive. Further, the Region did not submit any additional PE samples to evaluate data quality or perform any other data quality assurance activities. Also, the QAPP was not changed to add or strengthen the data quality assurance activities.

Chapter 4, Note 2

We agree data quality objectives do not necessarily prevent fraud. However, they help ensure that data is of adequate quality. The adequacy of data quality objectives at specific sites is discussed in Chapter 2.

Chapter 4, Note 3

We disagree with the Region's comments. PE samples, magnetic tape audits, and laboratory audits can be used to help detect fraud, as is evidenced by the March AFB situation. PE samples at March AFB led to detection of the fraud, by identifying major data problems. As a result of the PE samples, magnetic tape audits were performed and the fraud was confirmed. Also, a laboratory audit performed before the start of the analyses found major problems that warranted further investigation. Unfortunately, none of these data quality activities were required by the QAPP and the QAPP has not been changed to include any of these items.

Chapter 4, Note 4

As indicated in Chapter 2, we believe that the Quality Assurance Officer position should be assigned to government employees, since they are in a position to independently evaluate data quality.

Chapter 4, Note 6

Our recommendations concerning magnetic tape audits are fully discussed in Chapter 2. We disagree that the Region cannot require specific quality assurance activities in QAPPs. The Region's authority to approve QAPPs is clearly defined in 40 CFR Chapter 1, 300 and EPA Order 5360.1. The model Federal facility agreement requires a QAPP, and QA/R-5 defines the QAPP requirements.

The Region is also incorrect in stating that the QAPP did not require laboratory audits. As discussed in this chapter, laboratory audits were required by the QAPP; the weakness was that it did not require an audit before sampling started. It is also incorrect that none of the Federal facility QAPPs required laboratory audits: as discussed in Chapter 2 of this report, 60 percent of the QAPPs we reviewed required laboratory audits. It should be noted that, in response to our draft report, the Region agreed to require DOD facilities to perform both pre-award and routine audits.

Chapter 4, Note 7

The QAPP was not approved as part of the Basewide Sampling and Analysis Plan approval. After reviewing the Plan, the Region gave the Air Force additional comments to be incorporated into the QAPP. The Air Force then replied in a letter that it had incorporated all of the suggested changes. However, the Region did not provide us with any documentation showing that it approved the QAPP after the Air Force made the final changes, or that it otherwise ensured that the changes were made to the QAPP.

Chapter 4, Note 8

While we agree that a laboratory audit, in itself, may not normally detect fraud, it can provide an indication of poor laboratory practices. It is our understanding that a DOD laboratory audit of Eureka Laboratories detected fraud before problems surfaced at March AFB. However, due to inadequate communication within DOD, no action was taken to suspend or debar the laboratory.

Chapter 4, Note 9

The purpose of our audit was to evaluate Region 9's oversight of data quality. Making recommendations to the Air Force was outside the scope of the audit. We have suggested that the DOD Inspector General evaluate DOD's contract laboratory program.

Chapter 4, Note 10

Again, we agree that data validation, in itself, would not normally detect fraud. It was never our contention that it would. However, data validation is clearly a valuable tool for detecting unreliable data as evidenced at Hunters Point, Travis AFB, and Sacramento Army Depot. Data validation is essential at key decision points, if EPA is to make sound, supportable environmental decisions.

Chapter 4, Note 11

In its response to the draft report, the Region provided costs for the laboratory analyses and resampling work that differed from those it previously presented. However, the Region did not provide us with any additional documentation to support these revised cost amounts, although this information was requested. Our report, therefore, remains unchanged.

Chapter 6, Note 1

The Region's response relating to the estimated cost of rejected data and the cost to replace data is acknowledged. However, since the Region's response also represents an unsupported estimate, we have chosen not to revise the report.

Chapter 6, Note 3

Our report acknowledges that "cursory" data validation found problems, and that the same laboratories were dropped from the project. The purpose of this part of the report was to show that full data validation, although delayed, resulted in the rejection of all organic analyses. Thus, if full data validation had been done promptly, the rejected data could have been found up to one year earlier.

Chapter 6, Notes 4 and 5

A regional laboratory audit would not be duplicative, since the Navy did not audit NET Pacific, one of the laboratories that had serious problems. It also did not audit one of the three primary replacement laboratories, which performed 24 percent of the analyses.

Chapter 6, Note 6

Good quality, comprehensive laboratory audits can be relied on to detect serious quality assurance problems, as illustrated in Chapter 4 by the Air Force's audit of Eureka Laboratories. Also, when a laboratory audit recommends against using a laboratory, a potential problem can be avoided.

Nonetheless, our point is that the Region must obtain laboratory audit reports, and consider the depth and scope of the audit before deciding not to conduct its own laboratory audit.

Chapter 6, Note 7

The due date for the Basewide remedial investigation has been delayed three years, from January 30, 1993, to February 15, 1996. We estimate that bad data contributed two years to this delay: 17 months for rejecting data and 7 months for resampling. The Region did not provide us an estimate of the length of delays due to the bad data, although this information was requested.

We recognize that other factors may have contributed to the three-year delay. The lack of available laboratory capacity, which the Region cited as another major issue, may itself be part of the bad data problem. According to the Navy contractor, "Laboratory nonperformance started impacting [Hunters Point]...in November 1990 when, first, the inferred capacities [of both laboratories]...were exceeded." These two laboratories provided data that was eventually rejected 17 months later.

We believe an effective laboratory audit would have prevented samples from being sent to laboratories without sufficient capacity. It is unclear why an engineering contractor would contract with a laboratory that lacked sufficient capacity.

Chapter 6, Notes 8 and 9

The report did not mention the split samples referenced in the Region's response because they were taken in 1995; our audit scope covered quality assurance activities performed as of December 31, 1994. Further, it is our understanding that the QAPP will be modified by October 2, 1995. We commend the Region for any improvements in quality assurance activities, such as split sampling and performance evaluation samples, which are now being used at this site.

Chapter 7, Note 4

Even though the Basewide QAPP went through the same review process as a primary document, the Region did not provide any documentation showing that it approved the final version of the QAPPs for Luke AFB.

Chapter 7, Note 6

Section 9.3.3.4 of the Basewide QAPP is the only part of the QAPP that addresses magnetic data. However, it did not specifically require the laboratory to provide the magnetic data needed to perform the tape audits. Instead, it called for providing electronic data for maintaining a data base of sample results. It should be noted that the Air Force used the provisions of its laboratory contracts, not QAPP provisions, to obtain the magnetic tapes from ATI-Phoenix.

Chapter 7, Note 7

It is debatable whether the features lacking from the QAPP would detect fraud. However, they would certainly detect unreliable data and provide indicators of fraud. The purpose of this audit was to determine whether the Region was ensuring that laboratory data was of known and acceptable quality. The QAPP was prepared to ensure the quality of laboratory data; ensuring data quality, not just detecting fraud, is the goal of EPA's quality assurance program.

We agree that conducting random tape audits and requiring the use of more than one laboratory are good quality assurance activities and encourage their inclusion in the QAPP.

APPENDIX C

Superfund Cleanup Process

Preliminary Assessment

The initial stage of the cleanup program is an installation-wide study to determine if sites are present that pose hazards to public health or the environment. Available information is collected on the source, nature, extent, and magnitude of actual and potential hazardous substance releases at sites on the installation.

Site Assessment

The next step consists of sampling and analysis to determine the existence of actual site contamination. Information gathered is used to evaluate the site and determine the response action needed. Uncontaminated sites do not proceed to later stages of the process.

Remedial Investigation

Remedial investigation may include a variety of site investigative, sampling, and analytical activities to determine the nature, extent, and significance of the contamination. The focus of the evaluation is determining the risk to the general population posed by the contamination.

Feasibility Study

Concurrent with the remedial investigations, feasibility studies are conducted to evaluate remedial action options for the site to determine which would provide the protection required.

Remedial Design

Detailed design plans for the remedial action option chosen are prepared.

Remedial Action

The chosen remedial option is implemented.

Interim Remedial Action

Remedial actions can be taken at any time during the cleanup process to protect public health or to control contaminant releases to the environment.

Remedy in Place and Functioning

The remedial action is functioning properly and performing as designed. Such actions include the operation of pump-and-treat systems that will take decades to complete cleanup.

APPENDIX D

Regulations, Orders, Directives, and Guidance

OSWER Directives and Publications

OSWER Directive / Title
9240.0-2 Analytical Support for Superfund
9240.0-2A Further Guidance on OSWER Directive 9240.0-2
9240.0-2B Extending the Tracking of Analytical Services to Potentially Responsible Party-Lead Superfund Sites
9320.2-3A Procedures for Completion and Deletion of National Priorities List Sites
9355.0-07 Data Quality Objectives For Remedial Response Activities
9355.1-1 Superfund Federal-Lead Remedial Project Management Handbook
9355.3-01 Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA
9355.9-01 Guidance on Implementing the Data Quality Objectives Process for Superfund
9830.2 Regional Oversight of Federal Facility Cleanups Under CERCLA
9837.2B Enforcement Project Management Handbook
9992.4 Federal Facilities Hazardous Waste Compliance Manual
OSWER Publication
9240.1-05 US EPA Contract Laboratory Program National Functional Guidelines for Inorganic Data Review
9240.1-05-01 US EPA Contract Laboratory Program National Functional Guidelines for Organic Data Review

Regulations, Orders, Directives, and Guidance

Number / Title
40 CFR Chapter 1, 300 National Oil and Hazardous Substances Pollution Contingency Plan
EPA Order 5360.1 Policy and Program Requirements to Implement the Mandatory Quality Assurance Program
EPA QA/G-4 Guidance for Planning for Data Collection in Support of Environmental Decision Making Using the Data Quality Objectives Process
EPA QA/R-5 EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations
QAMS-005/80 Interim Guidelines and Specifications for Preparing QA Project Plans
(No Number) Guidance on Accelerating CERCLA Environmental Restoration at Federal Facilities, August 1994

Region 9 Guidance

Number / Title
9QA-03-89 US EPA Region 9 Guidance for Preparing Quality Assurance Project Plans for Superfund Remedial Projects
(No Number) Basic Requirements for Quality Assurance at Region 9 Superfund Sites, July 1988

APPENDIX E

Activities Contacted During the Audit

Activity / Location

Environmental Protection Agency, Region 9
    Federal Facilities Cleanup Office - San Francisco, CA
    Quality Assurance Management Section - San Francisco, CA

Environmental Protection Agency, Headquarters
    Office of Solid Waste and Emergency Response, Analytical Operations Branch - Washington, DC
    Office of Research and Development, Quality Assurance Management Division - Research Triangle Park, NC

Sacramento Army Depot
    Remedial Project Manager - Sacramento, CA
    Engineering Contractor - Sacramento, CA

March Air Force Base
    Remedial Project Manager - Riverside, CA

Travis Air Force Base
    Remedial Project Manager - Fairfield, CA
    Engineering Contractor - Sacramento, CA

Hunters Point Naval Shipyard
    Remedial Project Manager - San Bruno, CA
    Engineering Contractor - San Francisco, CA

Luke Air Force Base
    Remedial Project Manager - Glendale, AZ

Mather Air Force Base
    Remedial Project Manager - Rancho Cordova, CA

Air Force Center for Environmental Excellence - San Antonio, TX

Army Corps of Engineers
    Missouri River Division - Omaha, NE
    Omaha District - Omaha, NE
    Sacramento District - Sacramento, CA

Naval Facilities Engineering Command
    Naval Facilities Engineering Service Center - Port Hueneme, CA
    Engineering Field Activity, West - San Bruno, CA

General Accounting Office - Washington, DC, and Los Angeles, CA

Department of Defense, Office of Inspector General - Alexandria, VA

Air Force Audit Agency - Washington, DC, and March AFB, CA

Army Audit Agency - Alexandria, VA

Naval Audit Service - Falls Church, VA

State of California
    Department of Toxic Substance Control, Region 1 - Sacramento, CA
    Regional Water Quality Control Board, Central Valley Region - Sacramento, CA

APPENDIX F

EPA Data Analytical Levels

Level I

This level of data is typically used for screening data in the field. It is characterized by the use of portable instruments which provide real-time data to assist in determining sampling locations. Results are often not compound specific or quantitative. It is the least costly of the analytical levels, but data is not accurate enough to be used for public health risk assessments, evaluating cleanup alternatives, or remedial design.

Level II

Typically employing portable instruments or on-site mobile laboratories, this level of analysis is used for analyzing data in the field. A wide range of data quality can be generated, depending on the types of contaminants, calibration standards used, reference materials, and personnel skills. This type of analytical data is limited mostly to volatiles and metals and is not considered accurate enough for public health risk assessments or remedial design.

Level III

This analytical level is laboratory analysis using standard EPA-approved procedures. Some procedures may be equivalent to EPA's contract laboratory program, but validation and documentation procedures are normally not equivalent to EPA's program. This level of analysis is used for engineering studies and, in specific cases, can be used to provide data for public health risk assessments.

Level IV

Level IV is laboratory analysis using EPA's contract laboratory program protocols. This data level is characterized by rigorous quality assurance and quality control protocols and specific documentation requirements. Level IV data packages contain information on initial and continuing calibration, gas chromatography/mass spectrometry tuning, surrogate recovery percentages, matrix spike duplicates, internal chain of custody, and holding times. Level IV documentation is needed for data validation under EPA's national functional guidelines.

Without Level IV documentation, it is not possible to (1) evaluate many aspects of laboratory analysis, such as the effect of interferences; (2) confirm the accuracy of quality control summaries provided by the laboratory; or (3) verify that the analyses were actually performed.

This level of analytical data is relatively expensive and is commonly used for confirming low levels of contamination, making public health risk assessments, and obtaining highly documented data.

Level V

Level V consists of special laboratory analyses obtained through EPA's contract laboratory program. Data is analyzed by non-standard methods, and method development or method modification may be required. Data quality is method specific, and data can be used in public health risk assessments.

APPENDIX G

Definitions of Quality Assurance Techniques

Computerized Data Validation

Computerized data validation is a relatively new quality assurance measure that is more efficient than traditional manual data validation. EPA has developed an automated data validation program called Computer-Aided Data Review and Evaluation (CADRE) that has been issued to the field.

Data Validation

Data validation is a method for ensuring laboratory data is of known quality. It involves reviewing data against a set of criteria to provide assurance that data is adequate for its intended use.

EPA has data validation guidelines, known as national functional guidelines, for its own contract lab program. According to EPA guidelines, data validation includes a review of documentation such as raw data, instrument printouts, chain of custody records, and instrument calibration logs.
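As a simplified illustration only, the following Python sketch applies two hypothetical criteria, a holding-time limit and a surrogate recovery window, and assigns the kinds of qualifiers validators use ("J" for estimated, "R" for rejected). The limits shown are assumptions for the example, not values taken from the national functional guidelines.

    from datetime import date, timedelta

    MAX_HOLDING_TIME = timedelta(days=14)        # assumed limit for this example
    SURROGATE_RECOVERY_LIMITS = (70.0, 130.0)    # assumed acceptance window, in percent

    def qualify(result):
        """Return a qualifier: None (usable as reported), "J" (estimated), or "R" (rejected)."""
        if result["analysis_date"] - result["collection_date"] > MAX_HOLDING_TIME:
            return "R"                           # holding time grossly exceeded
        low, high = SURROGATE_RECOVERY_LIMITS
        if not low <= result["surrogate_recovery"] <= high:
            return "J"                           # poor surrogate recovery; flag as estimated
        return None

    sample = {"collection_date": date(1994, 6, 1),
              "analysis_date": date(1994, 6, 30),
              "surrogate_recovery": 95.0}
    print(qualify(sample))                       # "R": analyzed 29 days after collection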

Double-Blind Performance Evaluation Samples

Performance evaluation samples can be administered by two methods: "blind" or "double-blind". In both cases, the samples are prepared by "spiking" a known concentration of chemicals into a contaminant-free medium, such as water or soil. When a PE sample is blind, the laboratory is aware the sample is a PE, but does not know the chemical concentration levels.

When a sample is double-blind, the PE sample is submitted as part of a field sample shipment, so the laboratory not only is unaware of the concentration levels, but also is unaware that the sample is a PE. A laboratory's analyses of PE samples are used to evaluate its ability to produce accurate results.
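For illustration, a PE result is typically judged by its percent recovery of the spiked (true) concentration. The short Python sketch below shows the calculation; the 75 to 125 percent acceptance window is an assumption for the example, since actual limits depend on the analyte and method.

    def percent_recovery(measured, spiked):
        """Reported concentration as a percentage of the known spiked concentration."""
        return 100.0 * measured / spiked

    def passes(measured, spiked, low=75.0, high=125.0):
        """Assumed acceptance window; real limits vary by analyte and method."""
        return low <= percent_recovery(measured, spiked) <= high

    print(percent_recovery(42.0, 50.0))   # 84.0
    print(passes(42.0, 50.0))             # True under the assumed window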

Laboratory Audits

Laboratory audits are on-site audits designed to identify technical areas which may cause laboratories to improperly identify or quantitate chemicals. Audits normally evaluate a laboratory's technical expertise, standard operating procedures, facility and equipment sufficiency, and possible sources of sample contamination.

Magnetic Tape Audits

Electronic data, often in the form of magnetic tapes, are an output of laboratory analyses. By obtaining magnetic tapes (or other electronic data) from a laboratory, audits can be conducted to help determine:

Whether the laboratory is complying with its contract;

The integrity of the laboratory's computer systems; and

The appropriateness of any software editing.
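One check a tape audit might make, sketched below in Python under assumed data structures (dictionaries keyed by sample number), is to compare the raw electronic results with the results the laboratory reported and flag discrepancies that could indicate undisclosed software editing. Actual tape audits also re-examine raw instrument files, which this sketch does not attempt.

    def compare_tape_to_report(tape_results, reported_results, tolerance=0.01):
        """Flag samples whose reported value differs from the raw electronic value
        by more than the tolerance, or that are missing from the report."""
        flagged = []
        for sample_id, raw_value in tape_results.items():
            reported = reported_results.get(sample_id)
            if reported is None:
                flagged.append((sample_id, "missing from report"))
            elif abs(reported - raw_value) > tolerance * abs(raw_value):
                flagged.append((sample_id, f"raw {raw_value}, reported {reported}"))
        return flagged

    tape = {"S-001": 12.4, "S-002": 3.1, "S-003": 0.8}
    report = {"S-001": 12.4, "S-002": 1.0}          # S-002 edited, S-003 not reported
    print(compare_tape_to_report(tape, report))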

Split Samples

Split samples are samples collected in the field and divided into two portions. One portion is sent to the contract laboratory and the other to an independent laboratory. The results from the two laboratories are then compared and the differences analyzed. Split samples can be used to verify the use of proper analytical methodology and to detect unusual data trends.
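A common way to compare split sample results, shown in the Python sketch below, is the relative percent difference (RPD) between the two laboratories' values; the 30 percent screening limit used here is an assumption for the example.

    def relative_percent_difference(result_a, result_b):
        """RPD between the contract laboratory and independent laboratory results."""
        mean = (result_a + result_b) / 2.0
        if mean == 0:
            return 0.0
        return 100.0 * abs(result_a - result_b) / mean

    rpd = relative_percent_difference(12.0, 19.0)   # about 45 percent
    if rpd > 30.0:                                  # assumed screening limit
        print("Investigate: split sample results disagree")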

APPENDIX H

Acronyms

Acronym / Name
AFCEE Air Force Center for Environmental Excellence
CERCLA Comprehensive Environmental Response, Compensation, and Liability Act
CLP EPA's Contract Laboratory Program
DOD Department of Defense
ESS (OIG) Engineering and Science Staff
NPL National Priorities List
OSWER Office of Solid Waste and Emergency Response
PE Performance Evaluation (Samples)
QA Quality Assurance
QAPP Quality Assurance Project Plan
QAO Quality Assurance Officer
QC Quality Control
RI Remedial Investigation


APPENDIX I

Report Distribution

Distribution / Individual or Activity

Office of Inspector General
    Inspector General (2410)
    Deputy Inspector General (2410)
    Assistant Inspector General for Audit (2410)

EPA Headquarters
    Assistant Administrator for Enforcement and Compliance Assurance (2211)
    Assistant Administrator for Research and Development (8101)
    Assistant Administrator for Solid Waste and Emergency Response (5101)
    Associate Administrator for Regional Operations and State/Local Relations (1501)
    Associate Administrator for Congressional and Legislative Affairs (1301)
    Associate Administrator for Communication, Education, and Public Affairs (1701)
    Headquarters Library (3304)
    Director, National Center for Environmental Research and Quality Assurance (8201)
    Director, Office of Federal Facilities (2261)
    Director, Federal Facilities Restoration and Reuse Office (5101)
    Agency Followup Official, Attn: Director, Resource Management Division (3304)

Region 9
    Director, Hazardous Waste Management Division (H-1)
    Assistant Regional Administrator, Office of Policy and Management (P-1)
    Chief, Federal Facilities Cleanup Office (H-9)
    Chief, Environmental Services Branch (P-3)
    Audit Management Integrity Officer (P-2-1)

External
    General Accounting Office
    DOD Inspector General
    Air Force Audit Agency


Created January 6, 1997
