

Environmental Data Quality at DOD Superfund Sites in Region 9

Report No. 5100505

EXECUTIVE SUMMARY

PURPOSE

Starting in 1992, serious problems began surfacing with the quality of laboratory data supporting the cleanup effort at Department of Defense (DOD) Superfund sites in Region 9 (the Region). These problems eventually caused $5.5 million of environmental data to be rejected and cleanups to be delayed by up to 2 years. Laboratory data problems may also be far more widespread. At one DOD base, the Region uncovered extensive laboratory fraud; DOD later determined that this laboratory had performed analyses for at least 24 military installations in the Region. At another DOD site, the Region has certified a portion of the facility as cleaned up even though the laboratory data relied upon is of unknown quality.

...Indeed, data losses due to inadequate quality assurance of DOD contract labs have resulted in rework, delays, and/or additional costs at other California bases within the last year...

-Letter from Region 9 to DOD,

May 25, 1994

The objective of the audit was to determine if the Region was ensuring that laboratory data was of known and acceptable quality under Federal facility agreements with DOD.

BACKGROUND

Characterized as the nation's largest industrial organization, DOD has 100 bases on the Superfund national priorities list (NPL), a register of the nation's worst contaminated sites. Twenty-four of these NPL bases are located in the Region.

Superfund Laws

Under Superfund, DOD is required to carry out its hazardous waste cleanups according to the same guidelines as other facilities. Once a military base is proposed for the NPL, EPA, DOD, and the state must negotiate a Federal facility agreement to govern the cleanup. EPA is responsible for overseeing these agreements and has final decision-making authority for selecting the cleanup remedy. The cleanup decisions are based on the analyses of environmental data.

Decisions by management rest on the quality of environmental data...The primary goal of the QA program is to ensure that all environmentally related measurements... produce data of known quality.

-EPA Order 5360.1

Environmental Data Quality Requirements

Typically, environmental data is collected by sampling water or soil and having the samples analyzed by a laboratory. EPA Order 5360.1 requires environmental measurements to be of known quality, verifiable, and defensible.

Steps to Ensure Suitable Quality

To ensure environmental data is sufficient and of appropriate quality, data quality objectives are established first. A quality assurance project plan is then developed to specify the quality control requirements for data collection, sampling, and analysis.

RESULTS IN BRIEF

Although serious laboratory problems were identified, the Region had not significantly strengthened its oversight of DOD laboratory data. Nor had it required DOD to modify four of the five quality assurance project plans (QAPPs) we reviewed to increase the likelihood of detecting data quality problems.

The ultimate success of an environmental program or project depends on the quality of the environmental data collected and used in decision-making, and this may depend significantly on the adequacy of the QAPP and its effective implementation.

-EPA QA/R-5

In our opinion, the Region could better fulfill its oversight role and assist DOD in avoiding future data quality problems and cleanup delays by:

Strengthening oversight of quality assurance activities;

Including key quality assurance activities in QAPPs; and,

Ensuring QAPPs are complied with.

PRINCIPAL FINDINGS

Our principal findings are summarized below. Chapter 2 summarizes the problems we found with the Region's oversight of environmental data quality at DOD bases. Chapters 3 through 7 discuss the details for each of the five DOD bases included in our review.

Site-specific data quality objectives, a prerequisite to QAPPs, were generally not prepared. In our opinion, the lack of specific data quality objectives was one reason serious problems were found with environmental data quality.

The [Data Quality Objective] process is...designed to ensure that the type, quantity, and quality of environmental data used in decision making are appropriate for the intended application.

-OSWER Directive 9355.9-01

QAPPs, the primary tool for controlling laboratory quality, were not well designed or effectively implemented at the five DOD sites reviewed. While the Region took action to reject data and address the impacts of the lost data, it did not effectively monitor compliance with QAPPs. Further, the QAPPs were not always revised after laboratory data problems were found.

Sacramento Army Depot

We concluded that the environmental data for Sacramento Army Depot was of unknown quality because:

* The data packages the laboratories provided were insufficient to verify the quality of the analyses.

* Analyses were not validated using EPA national functional guidelines.

* Analyses were not defensible because, owing to inadequate quality assurance activities, rejected data was used in decision making.

Problems Found at

Sacramento Army Depot

* No independent regional quality assurance activities

* Data packages were insufficient to verify quality

* Analyses were not validated using EPA guidelines

* Rejected data not identified by DOD

As a result, we do not believe that any portion of the Depot should be considered cleaned up until the data used for decision-making is validated using EPA national functional guidelines.

We also noted that in June 1994, the Region certified that one of the operable units was cleaned up based on data of unknown quality. In our opinion, the Region should withdraw this certification until it reevaluates whether the supporting data is of sufficient quality. Further, the QAPPs should be redesigned to better oversee laboratory quality for the rest of the cleanup.

March Air Force Base

Despite objections raised by the Region, the engineering contractor for the March AFB cleanup hired Eureka Laboratories in 1992 to analyze samples for part of the project. In late 1993, the Region rejected $1 million of Eureka Laboratories' analyses after finding potential fraud. The rejected data caused a project delay of about 1 year. In May 1995, the laboratory pleaded guilty to charges that test results were falsified.

We found the Base's QAPPs did not contain the quality assurance measures the Region used to detect the laboratory fraud. These measures included performance evaluation samples and magnetic tape audits.

...The [magnetic tape] audit determined that Eureka's lab deficiencies and fraudulent work were pervasive throughout the analysis of all fifty-nine [March Air Force Base] samples.

-Region 9's Request for Suspension of Eureka Laboratory

A 1991 Air Force audit of Eureka Laboratories, under another project, recommended that samples not be sent to Eureka Laboratories. This recommendation was not implemented, nor was the Region advised of the audit results.

Travis Air Force Base

In 1991 and 1992, the engineering contractor for Travis AFB completed three rounds of sampling to determine the extent of contamination at the base. The samples were sent to the contractor's own laboratories for analyses. In January 1992, an Air Force audit of the laboratories found major problems. Because of the problems, the Region was not able to determine the quality of about $2 million of analyses. This delayed the cleanup by more than 2 years.

We found the QAPP for Travis AFB was not designed to detect the laboratory problems. After the data problems were disclosed, Travis developed two new QAPPs, which included many of the key data quality assurance ingredients missing from the original QAPP.

Hunters Point Naval Shipyard

Between October 8, 1990 and December 18, 1990, the Navy's engineering contractors collected over 1,200 samples to initiate remedial investigations at Hunters Point Naval Shipyard. About a year and a half later, in May 1992, all organic analyses were rejected due to poor quality laboratory data.

Problems Found at

Hunters Point

* Unclear data quality objectives

* Incomplete data packages

* Late data validation

* No requirement for laboratory audits

* No magnetic tape audits or performance evaluation samples

The rejected data cost the Navy about $2.5 million, and another $1 million was spent to replace it. Further, the cleanup was set back by about 2 years.

We found that the Region prepared a data quality oversight plan to help it assess the data quality and laboratory performance at Hunters Point. However, the plan was not fully implemented. Had it been, and had the Navy been required to perform data validation promptly, we believe the 2-year delay could have been substantially reduced.

Luke Air Force Base

Since the Federal facility agreement was signed in 1990, Luke AFB has used a Phoenix, Arizona laboratory to do most of its analyses. In September 1994, the Air Force submitted a remedial investigation report to EPA that was based on analyses made by the laboratory. Just before submission of this report, EPA suspended the laboratory from further Federal work for allegedly using faulty test equipment and reporting false results under another EPA program.

...the quality of the data from the remedial investigation at Luke Air Force Base cannot be determined to the extent that a Superfund Basewide Record of Decision...can be signed...

-Region 9, June 2, 1995

The Region determined that some of the data was manipulated by the laboratory, and therefore was of unknown quality, and could not be used for remedial decisions. As a result of the data quality problems, the cleanup has been delayed by nearly one year thus far and will likely require resampling. We believe the delay could have been avoided if the QAPP had been better designed and implemented.

BEST PRACTICES

We identified some notable "best practices" in one DOD QAPP that should be considered for inclusion in all DOD QAPPs. The Travis Model QAPP adequately defined the data quality objectives and required "double-blind" performance evaluation samples, an effective laboratory evaluation tool. The QAPP also required that data quality assurance reports be provided to EPA for review.

RECOMMENDATIONS

Our detailed recommendations follow the findings in Chapters 2 and 3. In summary, we are recommending the following procedural changes to the Regional Administrator, Region 9, to improve the Region's ability to obtain environmental data of known quality at Superfund sites:

Establish a database or tracking system for laboratory data quality at DOD installations, and perform sufficient oversight to ensure the quality of laboratory data.

Ensure QAPPs are designed to detect laboratory data problems, are fully implemented, are kept current, and are reviewed annually.

Improve the quality of the QAPPs by ensuring that quality assurance officers are government employees and requiring independent laboratory audits.

Publicize best practices used in DOD QAPPs.

REGION 9 COMMENTS

A draft report was provided to the Region on July 20, 1995, and the Region responded to the draft on September 5, 1995. We held an exit conference with regional officials on September 18, 1995.

While the Region's response did not address our specific recommendations, it identified 10 ongoing and planned corrective actions to improve data quality. These actions included requiring DOD to follow EPA's data quality objective process and ensuring that DOD complies with its QAPPs.

The Region also responded that it had:

...always recognized the essential role that data quality assurance plays in our environmental programs. We work hard to instill in the federal agencies we oversee an awareness of their lead agency responsibilities with regard to quality assurance, and a comparable sense of vigilance...

The Region advised that its "oversight responsibilities at Federal facilities are substantial and diverse." The Region further commented that, because of its workload at Federal facilities:

...EPA does not (and likely never will) have sufficient resources to oversee all aspects of Superfund cleanup at each facility as we ideally would...

The Region also said its "general approach" was to work with DOD to establish quality assurance plans and make sure these plans were implemented. It said improvements to the plans were "obviously necessary." The Region went on to suggest that:

...simply increasing our 'policing' of DOD won't prevent or solve the fundamental problem. Instead DOD must assign a high-priority to developing and implementing a sound Department-wide approach to assuring data quality from its contract laboratories...

The Region also commented that the report did not clearly identify that DOD had the lead role at Superfund sites, in accordance with Executive Order 12580. It pointed out that DOD had the primary responsibility for carrying out all aspects of the National Contingency Plan, including assuring the quality of data.

The Region also believed that the report did not sufficiently acknowledge its "competing responsibilities," efforts to detect and prosecute laboratory fraud at DOD bases, and assumption of a national leadership role in working with DOD to improve the laboratory situation.

OIG EVALUATION

We recognize that DOD has a responsibility for environmental data quality at its sites. We also agree that DOD needs a sound data quality program for its contract laboratories. However, DOD's responsibilities for Superfund cleanups do not relieve the Region from its paramount responsibility for ensuring the quality of environmental data.

Executive Order 12580

Under Executive Order 12580, it is EPA, and not DOD, that must make the final cleanup decision at a DOD Superfund site. This responsibility has not been delegated to DOD. Thus, the Region must ultimately ensure that environmental data is of sufficient quality for decision-making, as required by EPA Order 5360.1. The Region cannot delay its decisions until DOD fixes its data quality system problems.

Competing Responsibilities

We also recognize that the Region has numerous, competing responsibilities in its oversight role at DOD Superfund sites. However, we question whether any of these responsibilities can be successfully fulfilled if environmental data, the foundation of a Superfund cleanup, is unreliable. In our opinion, the Region's responsibility to ensure sufficient data quality is absolutely critical and cannot be sacrificed.

Resources

Contrary to its comments on this report, the Region stated in October 1994, in response to a U.S. General Accounting Office report on environmental cleanups, that "...their staffing situation had improved and they were providing the required oversight for federal facilities on the NPL." Under OSWER Directive 9830.2, oversight includes ensuring data is of sufficient quality. This directive also states that, "Federal facility oversight should not suffer for a lack of resources."

EPA also indicated in its 1994 Integrity Act Report to the President and Congress that its resources for overseeing Superfund cleanups at Federal facilities were sufficient. In this report, EPA stated that previous "vulnerabilities in Agency oversight and enforcement of Federal facility cleanup" had been corrected. This conclusion was largely based on DOD providing EPA with 100 staff years and $7 million in funding to accelerate and oversee cleanups at closing and realigning bases. The Region received about one-third of these staff years.

Laboratory Fraud

Finally, we recognize and commend the Region for its aggressive pursuit of laboratory fraud. However, it remains our position that the Region needs to make substantive procedural changes to its quality assurance program to ensure environmental data at DOD Superfund sites is of sufficient quality for decision making.

The Region's complete response is included as Appendix A of this report. Our report has been modified to incorporate many of the Region's comments and concerns. In those areas where we disagreed with the Region's comments, our position is detailed in Appendix B of this report.


Created January 6, 1997

To request a hard copy, please contact EPA, Office of Inspector General, Office of Audit at 202-260-7784
