U.S. Fish & Wildlife Service
Analytical Control Facility
Division of Environmental Quality

Introduction

The Analytical Control Facility (ACF) is responsible for assuring the quality of the chemical analyses it provides through its in-house laboratory and through the contract laboratories. ACF can assure the quality of a chemical analysis only from the time the sample arrives at the laboratory. It is the responsibility of the sample submitter to properly collect, contain, label, preserve, document, and ship samples to the laboratory designated to perform the analysis. Unless a gross problem is evident, the laboratory and ACF assume the samples were properly handled and labeled prior to receipt.

The objectives of the program are to maintain continuing assessments of accuracy and precision, identify procedural problems, and provide analytical data that will withstand legal scrutiny. In performing these objectives, ACF is able to recognize, quantitate, and minimize errors. All analyses are performed in accordance with written standard operating procedures that comply with Good Laboratory Practices.

Quality Assurance

A system of activities whose purpose is to provide the producer or user of a product or a service the assurance that it meets the defined standards of quality with a stated level of confidence.
Taylor, John K., "Quality Assurance of Chemical Measurements", 1987

The quality of a chemical analysis is considered assured when the analysis is performed in a technically competent manner, by qualified personnel using appropriate methods and equipment, and the precision and accuracy of the measurement are within the expected ranges for the technique. Acceptable quality can vary by analyte, matrix, or analysis technique.

Quality assurance is achieved through a system of quality control and quality assessment procedures. John K. Taylor defines quality control as:

"A system of inspections, testing, and remedial actions applied to a process or operation so that, by inspecting a small portion (a sample) of the product currently produced, an estimate can be made of its quality and whether or not, or what if any, changes need to be made to achieve or maintain a predetermined or required level of quality."

He defines quality assessment as:

"A system of activities whose purpose is to provide assurance that the overall quality control job is in fact being done effectively. The system involves a continuing evaluation of the adequacy and effectiveness of the overall quality control program with a view to having corrective measures initiated where necessary. For a specific product or service, this involves verifications, audits, and the evaluation of the quality factors that affect the specification, production, inspection and use of the product or service."

Quality Control and Assessment

The overall system of activities whose purpose is to control the quality of a product or service so that it meets the needs of users.
Taylor, John K., "Quality Assurance of Chemical Measurements", 1987

Quality Control Techniques

The following is a description of the quality control techniques applied to each catalog submitted for analysis under the ACF contracts:

  • Procedural blanks are analyzed to determine the level of the target analyte in the reagents, acids, sample vessels, glassware, or solvents used in the analysis. All reagents, acids, or solvents used in the analysis are added to an empty vessel for analytical determination. Procedural blanks are performed at a frequency of 1 per 10 samples.
  • Duplicate samples are analyzed to assess the precision of the methods used for analysis. After the sample is homogenized, two separate subsamples are taken and analyzed. Duplicate analysis is performed at a frequency of 1 per 10 samples. Duplicate analyses are evaluated against the criteria listed in Table 1 below.
  • Spiked samples are analyzed to assess the accuracy of the methods used for analysis. After the sample is homogenized, two separate subsamples are taken. One is processed as a sample. The second subsample has a known quantity of the target analyte added before digestion. Analysis of the matrix spike and the sample generates a percent recovery. Spike analysis is performed at a frequency of 1 per 10 samples. The control limit is exceeded if the percent recovery deviates from the expected average by more than 2 standard deviations. Spike recoveries are evaluated against the criteria listed in Table 2 below.
  • Reference materials are analyzed to provide evidence that the method produces results comparable to those obtained by an independent organization. Reference materials are pre-homogenized samples certified to contain a stated amount of analyte. They are evaluated against the precision and accuracy expected for the amount of analyte present and within the error stated on the certificate. The run is in control and the data are accepted if the reported SRM value is within 2 standard deviations of the certified mean.
  • Calibration curves are required for each instrumental analysis of a sample batch. Each calibration curve is verified immediately with a mid-level calibration standard to determine whether the initial calibration is acceptable for the analytical run.
  • Calibration checks are done after every ten samples to verify that the calibration curve remains valid for the continuing sample group. The normal check is to reanalyze the mid-level standard used in the initial calibration; the coefficient of variation on this analysis must be <= 10%, as illustrated in the sketch following this list.
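
The numeric criteria above lend themselves to simple automated checks. The following Python sketch illustrates only the continuing calibration check; it assumes the coefficient of variation criterion is applied to the set of mid-level standard readings collected during a run, and the function name, variable names, and example values are hypothetical.

    # Illustrative continuing-calibration check (not an official ACF procedure).
    from statistics import mean, stdev

    def calibration_check(midlevel_readings, max_cv_percent=10.0):
        """Return (CV in percent, True if the run is in control)."""
        cv = 100.0 * stdev(midlevel_readings) / mean(midlevel_readings)
        return cv, cv <= max_cv_percent

    # Mid-level standard (nominal 50 ug/g) reanalyzed after every ten samples.
    readings = [50.2, 49.1, 51.3, 48.7]
    cv, in_control = calibration_check(readings)
    print(f"CV = {cv:.1f}% -> {'in control' if in_control else 'recalibrate'}")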

Quality Assessment Procedures

The following is a description of the quality assessment procedures used for specific tests of laboratory performance:

  • Crosscheck analysis - In the crosscheck, samples that have been analyzed by the contractor are recalled to the ACF laboratory, where they are reanalyzed. The results of the analyses are compared, and if significant differences occur corrective action is initiated. Each contract laboratory is aware that any catalog can be called back for crosscheck. Approximately 5% of the catalogs submitted to ACF are crosschecked.
  • Round robin - On an annual basis a round robin is conducted. In the round robin each laboratory analyzes a portion of the same sample. The results are compared, and if any significant differences occur, corrective action is initiated. The round robin yields the same type of information as the crosscheck, but involves all laboratories.
  • Data audits - Audits occur on an as-needed basis, usually to resolve a problem with the analyses in a particular catalog or in conjunction with a crosscheck. In the data audit, the laboratory is required to submit copies of all raw data collected during the analyses of a particular catalog. ACF then attempts to regenerate the final report. If any anomalies are discovered, corrective action is initiated.
Table 1. Acceptable Precision
Analyte                                            Concentration Range (1)   95% Confidence Interval   Average Relative Percent Difference (2)
Metals-ICP (4)                                     2-10 LOD (3)              30%                       17.3%
Metals-ICP (4)                                     > 10 LOD                  15%                       8.64%
Metals-Atomic Absorption (5)                       2-10 LOD                  20%                       11.5%
Metals-Atomic Absorption (5)                       > 10 LOD                  10%                       5.75%
OC Scan (6) & Petroleum Hydrocarbon Scan (7)       2-10 LOD                  30%                       17.3%
OC Scan (6) & Petroleum Hydrocarbon Scan (7)       > 10 LOD                  15%                       8.64%
Dioxin / Furan Scan & Congener Specific PCB Scan   2-10 LOD                  70%                       40.3%
Dioxin / Furan Scan & Congener Specific PCB Scan   > 10 LOD                  35%                       20.1%

Table Notes

(1) The range, in multiples of the limit of detection, within which the sample concentration falls. For samples with a concentration less than two times the limit of detection, the 95% confidence interval is assumed to be ± 2 LOD.

(2) The relative percent difference needed to produce the 95% confidence interval listed in the table. This is the average of all the relative percent differences for a given laboratory in a given matrix.

(3) Limit of Detection.

(4) Inductively coupled plasma emission spectroscopy, including direct and preconcentrated scans.

(5) Atomic absorption spectroscopy, including cold vapor, hydride generation, and graphite furnace techniques.

(6) Organochlorine pesticides scan including PCBs.

(7) Petroleum Hydrocarbon scan, including aliphatic and aromatic compounds.

Table 2. Acceptable Accuracy
Analyte                          Average Recovery (1)
Metals-ICP (2)                   80-120%
Metals-Atomic Absorption (3)     85-115%
OC Scan (4)                      80-120%
Petroleum Hydrocarbon Scan (5)   80-120%
Dioxin / Furan Scan              60-140%
Congener Specific PCB Scan       60-140%

Table Notes

(1) The average recovery of all spiked samples should be within the stated limits. Additionally, the laboratory must demonstrate consistency in the recoveries. These recoveries apply to animal tissues; other matrices can have different acceptable recoveries.

(2) Inductively coupled plasma emission spectroscopy including direct and preconcentrated scans.

(3) Atomic Absorption Spectroscopy including cold vapor, hydride generation, and graphite furnace techniques.

(4) Organochlorine pesticides scan including PCBs.

(5) Petroleum Hydrocarbon scan including aliphatic and aromatic compounds.

Analytical Precision and Accuracy

ACF maintains a continuing assessment of the accuracy and precision data generated from laboratories in order to ensure that the laboratories are in a state of statistical control and to identify procedural problems. Precision and accuracy are measures of the reliability of the results.

Precision is the ability to produce the same result in repeated tests of the same sample and is expressed as the relative percent difference between the results of repeated tests. Samples are analyzed in duplicate at a rate of 10%, with at least one duplicate per matrix per analytical run. This requirement is waived when there is insufficient sample to perform a duplicate analysis. Precision is calculated using Equation 1 below; a worked sketch follows the equation.

Equation 1: Precision Calculation

RPD = 2 * |A - B| / (A + B) * 100

Where:

  • RPD is the relative percent difference between duplicate determinations.
  • A and B are the results for the duplicate determinations.
  • |A - B| is the absolute difference between the determinations.
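
As a worked example of Equation 1, the Python sketch below computes the relative percent difference for a pair of duplicate determinations; the values and the 17.3% threshold used at the end (the Table 1 average RPD for metals by ICP at > 10 LOD) are purely illustrative.

    # Equation 1 as code; example values are hypothetical.
    def relative_percent_difference(a, b):
        """RPD = 2 * |A - B| / (A + B) * 100 for duplicate determinations A and B."""
        return 2.0 * abs(a - b) / (a + b) * 100.0

    a, b = 4.1, 3.8                      # duplicate results, same units
    rpd = relative_percent_difference(a, b)
    print(f"RPD = {rpd:.1f}%, meets 17.3% average-RPD criterion: {rpd <= 17.3}")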

Accuracy is the degree to which the obtained result agrees with the true value. Accuracy check samples are performed at a rate of 10%, with at least one spike per matrix per analytical run. A sample is fortified with a known quantity of analyte and analyzed as part of the run. The spike level is between 10 and 50 times the limit of detection, or the expected analyte concentration, whichever is higher. If there is insufficient sample to analyze a spike of the submitted samples, the laboratory can spike a similar material from another source. Spike accuracy is calculated using Equation 2 below; a worked sketch follows the definitions.

Equation 2: Spike Accuracy

%R = 100 * (OV - BV) / KV

Where:

  • %R is the percentage recovery.
  • OV - Observed Value is the analytical result after spiking.
  • BV - Background value is the analytical result of the matrix before spiking.
  • KV - Known value is the concentration of the spike.
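
A minimal sketch of Equation 2 follows. The observed, background, and spike values are hypothetical, and the 80-120% range applied at the end is the Table 2 limit for an OC scan, used here only as an example.

    # Equation 2 as code; example values are hypothetical.
    def spike_recovery(observed, background, known_spike):
        """%R = 100 * (OV - BV) / KV."""
        return 100.0 * (observed - background) / known_spike

    ov, bv, kv = 12.4, 2.1, 10.0         # ug/g
    r = spike_recovery(ov, bv, kv)
    print(f"Recovery = {r:.0f}%, within 80-120%: {80 <= r <= 120}")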

Standard Reference Materials are also incorporated into each batch of samples as another form of accuracy data. Standard Reference Material accuracy is calculated using Equation 3 below and is usually expressed as percent recovery; a short sketch follows the definitions.

Equation 3: Standard Reference Material (SRM) Accuracy

%R = 100 * (OV / KCV)

Where:

  • %R is the percentage recovery.
  • OV - Observed Value is the analytical result obtained for the SRM.
  • KCV - Known Certified Value is the certified concentration of the SRM.
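
The sketch below applies Equation 3 together with the acceptance rule quoted earlier (the run is in control if the reported SRM value is within 2 standard deviations of the certified mean); the certified mean, its standard deviation, and the observed value shown are hypothetical.

    # Equation 3 plus the 2-standard-deviation acceptance check; values are hypothetical.
    def srm_recovery(observed, certified_value):
        """%R = 100 * OV / KCV."""
        return 100.0 * observed / certified_value

    observed, certified_mean, certified_sd = 0.93, 1.00, 0.05
    print(f"SRM recovery = {srm_recovery(observed, certified_mean):.0f}%")
    print("Run in control:", abs(observed - certified_mean) <= 2 * certified_sd)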

The desire of ACF is to produce results that are both highly accurate and highly precise.

Detection Limit

The minimum concentration of a substance that can be identified, measured, and reported with 99% confidence that the analyte concentration is greater than zero.
USEPA 40 CFR Part 136

Instrument detection limit (IDL) is a measure of the normal instrument noise. It is a guideline for determining whether a result reflects noise or a real instrumental signal from the target analyte. The IDL is an evaluation of the maximum sensitivity of an instrument for an analysis: the ability of the instrument to distinguish small differences in analyte concentration. The instrument detection limit is set at three times the standard deviation of the instrument noise. The IDL is determined by the following procedure (EPA 40 CFR Part 136, Appendix B, July 1993); see Appendix D, and note the short calculation sketch that follows the procedure below. The analyst should consult the 40 CFR reference, as there are a number of specific conditions and choices that must be made during the IDL procedure.

Determining Instrument Detection Limits

  1. Prepare a calibration curve for the test.
  2. Analyze 7 laboratory blanks.
  3. Record the response of each blank.
  4. Calculate the mean and standard deviation of the blank responses.
  5. The IDL is three times the standard deviation of the blank responses, expressed as concentration using the calibration curve.
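
The steps above reduce to a short calculation. The sketch below assumes a linear calibration so that the standard deviation of the blank responses can be converted to a concentration with the calibration slope; the blank responses and slope are hypothetical, and 40 CFR Part 136, Appendix B remains the authoritative procedure.

    # Illustrative IDL calculation from seven blank responses.
    from statistics import stdev

    def instrument_detection_limit(blank_responses, calibration_slope):
        """IDL = 3 * standard deviation of the blank responses, as concentration."""
        return 3.0 * stdev(blank_responses) / calibration_slope

    blanks = [0.8, 1.1, 0.9, 1.2, 0.7, 1.0, 1.1]   # instrument response units
    slope = 2.5                                     # response per ug/L (hypothetical)
    print(f"IDL = {instrument_detection_limit(blanks, slope):.2f} ug/L")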

Method detection limit (MDL) is the minimum level of an analyte that can be determined with 99% confidence. The limit depends upon the ratio of the magnitude of the analytical signal to the size of the statistical differences within the blanks. The method detection limit assures that the analyst is reporting the result of the target analyte and not instrument noise. The MDL procedure is also taken from EPA 40 CFR Part 136, Appendix B (July 1993); see Appendix D, and note the calculation sketch that follows the procedure below.

Determining Method Detection Limits

  1. Prepare a spike of the target analyte at a concentration 2 to 5 times the IDL obtained or the expected MDL.
  2. Take seven aliquots of the spiked solution and process them through the sample preparation procedure.
  3. Analyze each of the samples.
  4. Calculate the standard deviation of the results.
  5. The MDL is the one-tailed Student's t value at the 99% confidence level for the number of replicates analyzed (3.143 for seven) multiplied by the standard deviation.
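
For seven replicates the one-tailed t value at the 99% confidence level is 3.143 (six degrees of freedom), so the final step is a single multiplication. The sketch below looks the t value up with SciPy; the replicate results are hypothetical.

    # Illustrative MDL calculation for seven spiked replicates (requires SciPy).
    from statistics import stdev
    from scipy.stats import t

    def method_detection_limit(replicate_results, confidence=0.99):
        """MDL = t(confidence, n - 1) * standard deviation of the replicate results."""
        n = len(replicate_results)
        return t.ppf(confidence, n - 1) * stdev(replicate_results)

    replicates = [2.1, 1.9, 2.3, 2.0, 2.2, 1.8, 2.1]   # seven spiked aliquots, ug/g
    print(f"MDL = {method_detection_limit(replicates):.2f} ug/g")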