National Rivers and Streams Assessment December 2010
QA Project Plan Page i of xiii
United States Environmental Protection Agency Office of Water
Office of Environmental Information Washington, DC
EPA 841-B-07-007
National Rivers and Streams Assessment
Quality Assurance
Project Plan
Rivers and Streams Assessment
Final Document December 2010
Project Leaders:
Ellen Tarquinio
U.S. Environmental Protection Agency
Office of Wetlands, Oceans, and Watersheds
1200 Pennsylvania Avenue, NW
4503T
Washington, DC 20460
National Rivers and Streams Assessment (NRSA)
Quality Assurance (QA) Project Plan
Management Approvals
Signature indicates that this QAPP is approved and will be implemented in conducting this
project.
Steve Paulsen, Ph.D.
Technical Advisor signature date
Western Ecology Division, NHEERL
U.S. EPA Office of Research and Development
Corvallis, Oregon
Sarah Lehmann
NRSA Project QA Officer signature date
U.S. EPA Office of Water
Office of Wetlands, Oceans, and Watersheds
Assessment and Watershed Protection Division
Washington, DC
Ellen Tarquinio
NRSA Project Leader signature date
U.S. EPA Office of Water
Office of Wetlands, Oceans, and Watersheds
Assessment and Watershed Protection Division
Washington, DC
Treda Smith
NRSA Project Leader signature date
U.S. EPA Office of Water
Office of Wetlands, Oceans, and Watersheds
Assessment and Watershed Protection Division
Washington, DC
Susan Holdsworth
Monitoring Branch signature date
U.S. EPA Office of Water
Office of Wetlands, Oceans, and Watersheds
Assessment and Watershed Protection Division
Washington, DC
Margarete Heber
OWOW QA Officer signature date
U.S. EPA Office of Water
Office of Wetlands, Oceans, and Watersheds
Washington, DC
-------
National Rivers and Streams Assessment December 2010
QA Project Plan Page iii of xiii
QUALITY ASSURANCE PROJECT PLAN
REVIEW & DISTRIBUTION ACKNOWLEDGMENT AND
COMMITMENT TO IMPLEMENT
for
National Rivers and Streams Assessment
We have read the QAPP and the methods manuals for the National Rivers and Streams
Assessment listed below. Our agency/organization agrees to abide by their requirements
for work performed under our cooperative agreement for Demonstration of Randomized
Design for Assessment of National Rivers and Streams (under CWA 104(b)(3)).
Quality Assurance Project Plan EPA-841-B-07-007
Site Evaluation Guidelines EPA-841-B-07-008
Field Operations Manual EPA-841-B-07-009
Laboratory Methods Manual EPA-841-B-07-010
Print Name
(Principal Investigator)
Title
Signature Date
Address:
Phone: Fax: E-mail:
Please return the signed original to the EPA QA officer for this cooperative agreement:
Sarah Lehmann
U.S. EPA (4503T)
1200 Pennsylvania Ave, NW
Washington, DC 20460
202-566-1379 (phone)
202-566-1331 (fax)
Retain a copy for your files.
NOTICE
The complete documentation of overall NRSA project management, design, methods, and
standards is contained in four companion documents:
National Rivers and Streams Assessment: Quality Assurance Project Plan EPA-841-B-07-007
National Rivers and Streams Assessment: Site Evaluation Guidelines EPA-841-B-07-008
National Rivers and Streams Assessment: Field Operations Manual EPA-841-B-07-009
National Rivers and Streams Assessment: Laboratory Methods Manual EPA 841-B-07-010
This document (Quality Assurance Project Plan) contains elements of the overall project
management, data quality objectives, measurement and data acquisition, and
information management for the NRSA, and is based on the guidelines developed and
followed in the Western Environmental Monitoring and Assessment Program (Peck et al.
2003). Methods described in this document are to be used specifically in work relating to
the NRSA. All Project Cooperators must follow these guidelines. Mention of trade names
or commercial products in this document does not constitute endorsement or
recommendation for use. More details on specific methods for site evaluation, field
sampling, and laboratory processing can be found in the appropriate companion
document(s) listed above.
The suggested citation for this document is:
USEPA. 2008 (draft). National Rivers and Streams Assessment: Integrated Quality Assurance
Project Plan. EPA/841/B-07/007. U.S. Environmental Protection Agency, Office of Water
and Office of Research and Development, Washington, DC.
DISTRIBUTION LIST
This QA Project Plan and associated manuals or guidelines will be distributed to the following
EPA, Tetra Tech, Inc. (Tt), and Great Lakes Environmental Center (GLEC) senior staff
participating in the NRSA and to State Water Quality Agencies or cooperators who will
perform the field sampling operations. The Tt and GLEC QA Officers will distribute the
QA Project Plan and associated documents to participating project staff at their
respective facilities and to the project contacts at participating laboratories, as they are
determined.
Regional Monitoring Coordinators

Tom Faber, Region 1
(617) 918-8672
Faber.Tom@epa.gov
U.S. EPA - Region I
11 Technology Drive
North Chelmsford, MA 01863-2431

Darvene Adams, Region 2
(732) 321-6700
Adams.Darvene@epa.gov
U.S. EPA - Region II
2890 Woodbridge Avenue
Edison, NJ 08837-3679

Louis Reynolds, Region 3
(304) 234-0244
Reynolds.Louis@epa.gov
U.S. EPA - Region III
303 Methodist Building
11th and Chapline Streets
Wheeling, WV 26003

Larinda Tervelt, Region 4
(404) 562-9265
Tervelt.Larinda@epa.gov
U.S. EPA - Region IV
61 Forsyth Street, S.W.
Atlanta, GA 30303-8960

Sarah Lehmann, Region 5
(312) 353-5784
Lehmann.Sarah@epa.gov
U.S. EPA - Region V
77 West Jackson Boulevard
Chicago, IL 60604-3507

Mike Schaub, Region 6
(214) 665-7314
Schaub.Mike@epa.gov
U.S. EPA - Region VI
1445 Ross Avenue, Suite 1200
Dallas, TX 75202-2733

Gary Welker, Region 7
(913) 551-7177
Welker.Gary@epa.gov
U.S. EPA - Region VII
901 North Fifth Street
Kansas City, KS 66101

Tina Laidlaw, Region 8
(406) 457-5016
Laidlaw.Tina@epa.gov
U.S. EPA - Region VIII
10 West 15th Street, Suite 3200
Helena, MT 59626

Janet Hashimoto, Region 9
(415) 972-3452
Hashimoto.Janet@epa.gov
U.S. EPA - Region IX
75 Hawthorne Street
San Francisco, CA 94105

Gretchen Hayslip, Region 10
(206) 553-1685
Hayslip.Gretchen@epa.gov
U.S. EPA - Region X, ES-098
1200 Sixth Avenue
Seattle, WA 98101
Contractor Support
Dennis McCauley, GLEC
(231) 941-2230
dmccauley@glec.com
739 Hastings Street
Traverse City, MI 49686

Michael Barbour, Tetra Tech
(410) 356-8993
Michael.Barbour@tetratech.com
400 Red Brook Blvd., Ste 200
Owings Mills, MD 21117

Jennifer Pitt, Tetra Tech
(410) 356-8993
Jennifer.Pitt@tetratech.com
400 Red Brook Blvd., Ste 200
Owings Mills, MD 21117

Marlys Cappaert, CSC
(541) 754-4467
Cappaert.Marlys@epa.gov
200 S.W. 35th St.
Corvallis, OR 97330

Esther Peters, Tetra Tech
(703) 385-6000
Esther.Peters@tetratech-ffx.com
10306 Eaton Pl., Ste. 340
Fairfax, VA 22030
TABLE OF CONTENTS
DISTRIBUTION LIST
LIST OF FIGURES
LIST OF TABLES
1.0 PROJECT PLANNING AND MANAGEMENT
1.1 Introduction
1.2 NRSA Project Organization
1.2.1 Project Schedule
1.3 Scope of QA Project Plan
1.3.1 Overview of Field Operations
1.3.2 Overview of Laboratory Operations
1.3.3 Data Analysis and Reporting
2.0 DATA QUALITY OBJECTIVES
2.1 Data Quality Objectives for NRSA
2.2 Measurement Quality Objectives
2.2.1 Method Detection Limits
2.2.2 Sampling Precision, Bias, and Accuracy
2.2.3 Taxonomic Precision and Accuracy
2.2.4 Completeness
2.2.5 Comparability
2.2.6 Representativeness
3.0 SURVEY DESIGN
3.1 Probability-Based Sampling Design and Site Selection
4.0 INFORMATION MANAGEMENT
4.1 Data Policy
4.2 Overview of System Structure
4.2.1 Design and Logistic Data Bases
4.2.2 Sample Collection and Field Data Recording
4.2.3 Laboratory Analyses and Data Recording
4.2.4 Data Review, Verification and Validation Activities
4.3 Data Transfer
4.4 Core Information Management Standards
4.4.1 Metadata
4.4.2 Data Directory
4.4.3 Data Catalog
4.4.4 Data Formats
4.4.5 Parameter Formats
4.4.6 Standard Coding Systems
4.5 Hardware and Software Control
4.6 Data Security
5.0 INDICATORS
5.1 Indicator Summaries
5.2 In situ Water Quality Measurements
5.2.1 Introduction
5.2.2 Sampling Design
5.2.3 Sampling and Analytical Methodologies
5.2.4 Quality Assurance Objectives
5.2.5 Quality Control Procedures: Field Operations
5.2.6 Quality Control Procedures: Laboratory Operations
5.2.7 Data Management, Review and Validation
5.2.8 Data Analysis Plan
5.3 Water Chemistry
5.3.1 Introduction
5.3.2 Sampling Design
5.3.3 Sampling and Analytical Methodologies
5.3.4 Quality Assurance Objectives
5.3.5 Quality Control Procedures: Field Operations
5.3.6 Quality Control Procedures: Laboratory Operations
5.3.7 Data Management, Review and Validation
5.3.8 Data Analysis Plan
5.4 Sediment Enzymes
5.4.1 Introduction
5.4.2 Sampling Design
5.4.3 Sampling and Analytical Methodologies
5.4.4 Quality Assurance Objectives
5.4.5 Quality Control Procedures: Field Operations
5.4.6 Quality Control Procedures: Laboratory Operations
5.4.7 Data Management, Review and Validation
5.4.8 Data Analysis Plan
5.5 Chlorophyll a
5.5.1 Introduction
5.5.2 Sampling Design
5.5.3 Sampling and Analytical Methodologies
5.5.4 Quality Assurance Objectives
5.5.5 Quality Control Procedures: Field Operations
5.5.6 Quality Control Procedures: Laboratory Operations
5.5.7 Data Management, Review and Validation
5.5.8 Data Analysis Plan
5.6 Periphyton
5.6.1 Introduction
5.6.2 Sampling Design
5.6.3 Sampling and Analytical Methodologies
5.6.4 Quality Assurance Objectives
5.6.5 Quality Control Procedures: Field Operations
5.6.6 Quality Control Procedures: Laboratory Operations
5.6.7 Data Management, Review and Validation
5.6.8 Data Analysis Plan
5.8 Benthic Macroinvertebrates
5.8.1 Introduction
5.8.2 Sampling Design
5.8.3 Sampling Methodologies
5.8.4 Quality Assurance Objectives
5.8.5 Quality Control Procedures: Field Operations
5.8.6 Quality Control Procedures: Laboratory Operations
5.8.7 Data Management, Review, and Validation
5.9 Fish Community Indicator
5.9.1 Introduction
5.9.2 Sampling Design
5.9.3 Sampling and Analytical Methodologies
5.9.4 Quality Assurance Objectives
5.9.5 Quality Control Procedures: Field Operations
5.9.6 Quality Control Procedures: Laboratory Operations
5.9.7 Data Reporting, Review, and Management
5.10 Physical Habitat
5.10.1 Introduction
5.10.2 Sampling Design
5.10.3 Sampling and Analytical Methodologies
5.10.4 Quality Assurance Objectives
5.10.5 Quality Control Procedures: Field Operations
5.10.6 Quality Control Procedures: Laboratory Operations
5.11 Fish Tissue
5.11.1 Introduction
5.11.2 Sampling Design
5.11.3 Sampling and Analytical Methodologies
5.11.4 Quality Assurance Objectives
5.11.5 Quality Control Procedures: Field Operations
5.11.6 Quality Control Procedures: Laboratory Operations
5.12 Fecal Indicator
5.12.1 Introduction
5.12.2 Sampling Design
5.12.3 Sampling and Analytical Methodologies
5.12.4 Quality Assurance Objectives
5.12.5 Quality Control Procedures: Field Operations
5.13 General Site Assessment
5.13.1 Introduction
5.13.2 Sampling Design
5.13.3 Sampling and Analytical Methodologies
5.13.4 Quality Assurance Objectives
5.13.5 Quality Control Procedures: Field Operations
6.0 BIOLOGICAL FIELD AND LABORATORY QUALITY EVALUATION AND ASSISTANCE
VISITS
6.1 Field Quality Evaluation and Assistance Visit Plan
6.2 Laboratory Quality Evaluation and Assistance Visit Plan
7.0 REFERENCES
LIST OF FIGURES
Figure 1. NRSA Project Organization
Figure 2. Timeline of NRSA project activities
Figure 3. Site verification activities for river and stream field surveys
Figure 4a. Summary of field activities for non-wadeable stream and river sampling
Figure 4b. Summary of field activities for wadeable stream and river sampling
Figure 5. Organization of information management system modeled after EMAP-W for the NRSA
Figure 6a. Sampling design for the benthic indicator at wadeable sites
Figure 6b. Sampling design for the benthic indicator at non-wadeable sites
Figure 7. Laboratory processing activities for the benthic indicator
Figure 8a. Stream and river index sampling design for the water chemistry indicator for non-wadeable sites
Figure 8b. Stream and river index sampling design for the water chemistry indicator for wadeable sites
Figure 9. Sample processing activities for water chemistry samples
Figure 10. Analysis activities for water chemistry samples
LIST OF TABLES
Table 1. Critical logistics elements
Table 2. General guidelines for analytical support laboratories
Table 3. Sample and field data quality control activities
Table 4. Laboratory data quality control activities
Table 5. Biological sample quality control activities
Table 6. Data review, verification and validation quality control activities
Table 7. Field and laboratory methods: benthic indicator
Table 8. Measurement data quality objectives: benthic indicator
Table 9. Laboratory quality control: benthic macroinvertebrate sample processing
Table 10. Laboratory quality control: benthic macroinvertebrate taxonomic identification
Table 11. Data validation quality control: benthic indicator
Table 12. Research issues: benthic indicator
Table 13. Field measurement methods: physical habitat indicator
Table 14. Measurement data quality objectives: physical habitat indicator
Table 15. Field quality control: physical habitat indicator
Table 16. Data validation quality control: physical habitat indicator
Table 17. Research questions and hypotheses: water chemistry indicator
Table 18. Analytical methodologies: water chemistry indicator
Table 19. Measurement data quality objectives: water chemistry indicator
Table 20. Sample processing quality control: water chemistry indicator
Table 21. Laboratory quality control samples: water chemistry indicator
Table 22. Data validation quality control: water chemistry indicator
Table 23. Data reporting criteria: water chemistry indicator
Table 24. Constants for converting major ion concentrations
Table 25. Factors to calculate equivalent conductivities
1.0 PROJECT PLANNING AND MANAGEMENT
1.1 Introduction
Several recent reports have identified the need for improved water quality monitoring and
analysis at multiple scales. In 2000, the General Accounting Office (USGAO, 2000)
reported that EPA and states cannot make statistically valid inferences about water
quality (via 305[b] reporting) and lack data to support key management decisions. In
2001, the National Research Council (NRC, 2000) recommended that EPA and states
promote a uniform, consistent approach to ambient monitoring and data collection to
support core water quality programs. In 2002, the H. John Heinz III Center for Science,
Economics, and the Environment (Heinz Center, 2002) found that there are inadequate data
for national reporting on freshwater, coastal, and ocean water quality indicators. The
National Academy of Public Administration (NAPA, 2002) stated that improved water
quality monitoring is necessary to help states make more effective use of limited
resources. EPA's Report on the Environment 2003 (USEPA, 2003) states that there is
insufficient information to provide a national answer, with confidence and scientific
credibility, to the question, "What is the condition of U.S. waters and watersheds?"
In response to this need, the U.S. Environmental Protection Agency (EPA) Office of Water
(OW), in concert with EPA's Office of Research and Development (ORD) and the 10
EPA Regions, conceived of the National Aquatic Resource Surveys (NARS), which
includes the National Rivers and Streams Assessment (NRSA) - a national assessment
of the condition of rivers and streams in the conterminous U.S. NRSA is the first
assessment of flowing waters to use consistent field and laboratory protocols and a
statistical survey design that allows inferences about all waters from a sample of the
rivers and streams across the country. EPA intends to implement this effort in
cooperation with the States and other entities eligible for Section 106 funding. NRSA
builds upon the Environmental Monitoring and Assessment
Program's (EMAP) Western Study implemented by ORD, the EPA Regions, States and
Tribal nations in 12 western states and the Wadeable Streams Assessment (WSA)
undertaken in 2004. NRSA will provide the baseline for rivers and streams across the
country and regionally across many indicator types, as well as a comparison of stream
information to the original WSA.
The NRSA Quality Assurance Project Plan (QAPP) is designed to support the participants in
this project and to ensure that the final assessment is based on high quality data and
information. The QAPP contains elements of the overall project management, data
quality objectives, measurement and data acquisition, and information management for
the NRSA. The participants in the NRSA have agreed to follow this QAPP and the
protocols and design laid out in this document.
The NRSA is designed to answer key questions asked by Congress, the public, and decision
makers, such as:
• What is the extent of waters that support healthy ecosystems, recreation, and fish
consumption?
• How widespread are the most significant water quality problems?
Over time, as additional surveys are implemented, these data will also help
answer questions such as:
• Is water quality improving?
• Are we investing in restoration and protection wisely?
Ecological assessments via the NRSA will provide estimates (with quantifiable uncertainty) of
the biological integrity of macroinvertebrate, fish, phytoplankton and periphyton
communities in streams and rivers. Recreational indicators such as fecal contaminants
and fish tissue will be collected to examine human health-related issues. Additionally,
indicators of physical habitat condition such as bank stability, channel alterations, and
invasive species; basic water chemistry; and watershed characteristics will also be
collected to assist in explaining the patterns found in biological communities across the
country.
1.2 NRSA Project Organization
The major areas of activity and responsibilities are described here and illustrated in Figure 1.
The overall coordination of the project will be provided by EPA's Office of Water (OW) in
Washington, DC, with technical support from the Western Ecology Division (WED) of the
Office of Research and Development (ORD) in Corvallis, Oregon and the ten EPA
Regional Offices. This comprehensive quality assurance (QA) program has been
established to ensure data integrity and provide support for the reliable interpretation of
the findings from this project.
Program-level QA will be the responsibility of the OWOW QA Officer and the Project QA
Officer. A QA records system will be used to maintain a permanent hardcopy file of all
NRSA documentation, from site selection to data analysis, indefinitely. This file will
be housed in the OW Headquarters office.
The primary responsibilities of the principals and cooperators are as follows:
Project Management.
EPA Project Leader - provides overall coordination of the project and makes decisions
regarding the proper functioning of all aspects of the project. Makes assignments and
delegates authority, as needed, to other parts of the project organization.
EPA Project QA Lead - provides leadership, development, and oversight of project-level
quality assurance for NRSA in the Office of Water.
EPA ORD Technical Advisor - advises the Project Leader on the relevant experience
and technology developed within ORD's EMAP that are to be used in this project.
Serves as primary point-of-contact for project coordination in the absence or
unavailability of the Project Leader.
Project Coordination - a contractor providing day-to-day coordination of field
implementation as well as technical development and analysis of data.
[Organization chart:]
Project Management: Project Leads Ellen Tarquinio and Treda Smith (OW); Project QA Sarah Lehmann (OW); Technical Advisor Steve Paulsen (ORD)
OWOW QA Oversight and Review: Margarete Heber (OW)
Field Protocols: State & Tribal Steering Committee, ORD, OW
Field Logistics: Implementation Coordinator
Training: ORD, EPA Regions, Contractors
Field Implementation: State and Tribal Water Quality Agencies, Contractors
Indicator Leads: ORD, OW
Information Management: WED-CSC (Marlys Cappaert)
Final Data: STORET (OW); EMAP (ORD-AED); States
Assessment: OW (lead); ORD, Regional Coordinators, States, Tribes, Cooperators, and other partners
Figure 1. NRSA Project Organization
Study Design:
Objectives: The study is designed to sample 1800 probability-based sites, 200 repeat
sites, and 200 reference sites (2200 total) on rivers and streams across the country.
The objectives, or design requirements, for the National Rivers and Streams Assessment are to
produce:
1. Estimates of the 2008-2009 status of flowing waters nationally and regionally (9
aggregated Omernik ecoregions),
2. Estimates of the 2008-2009 status of wadeable streams and non-wadeable rivers
nationally and regionally (9 aggregated Omernik ecoregions),
3. Estimates of the 2008-2009 status of urban flowing waters nationally,
4. Estimates of the change in status in wadeable streams between 2008-2009 and 2004,
nationally and regionally (9 aggregated Omernik ecoregions).
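The status estimates called for above are design-based: each sampled site carries a survey weight representing the amount of the target population (e.g., stream kilometers) it stands for. A minimal sketch of how a weighted extent estimate is computed from such a sample follows; the site IDs, weights, and condition classes are invented for illustration and are not NRSA data:

```python
# Hedged sketch (not NRSA code): design-based estimate of the extent of each
# condition class from a probability sample, where each site's weight is the
# stream length it represents in the target population.

def weighted_condition_extent(sites):
    """Return the weighted fraction of the target population falling in
    each condition class."""
    total = sum(s["weight"] for s in sites)
    extent = {}
    for s in sites:
        extent[s["condition"]] = extent.get(s["condition"], 0.0) + s["weight"]
    return {cls: w / total for cls, w in extent.items()}

# Illustrative sites and weights only; real weights come from the survey design.
sample = [
    {"site": "S-001", "weight": 120.0, "condition": "good"},
    {"site": "S-002", "weight": 80.0,  "condition": "fair"},
    {"site": "S-003", "weight": 200.0, "condition": "good"},
    {"site": "S-004", "weight": 100.0, "condition": "poor"},
]
extents = weighted_condition_extent(sample)
```

In practice, the uncertainty attached to such estimates is computed with variance estimators appropriate to the spatially balanced survey design, not with simple random-sample formulas.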
Target population: The target population consists of all streams and rivers within the 48
contiguous states that have flowing water during the study index period, excluding
portions of tidal rivers up to the head of salt (defined as 0.05 ppt measured in the field). The
study index period extends from May to October and is characterized by low flow or
base flow conditions. The target population includes the Great Rivers (e.g., the main stem of
the Mississippi River). Run-of-the-river ponds and pools are included, while reservoirs
(those with a retention period greater than 7 days) are excluded.
Sample Frame: The sample frame was derived from the National Hydrography Dataset (NHD),
in particular NHD-Plus. Attributes from NHD-Plus, and additional attributes added to the
sample frame, that are used in the survey design include: (1) state, (2) EPA Region, (3)
NAWQA Mega Region, (4) Omernik Ecoregion Level 3 (NACEC version), (5) WSA
aggregated ecoregions (nine and three regions), (6) Strahler order, (7) Strahler order
categories (1st, 2nd, ..., 7th and 8th+), (8) FCode, (9) Urban, and (10) Frame07.
Expected sample size: The expected sample size is 1800 flowing-water sites: 450 sites revisited
from the WSA, 450 new sites from 1st to 4th order, and 900 new sites from 5th to 10th
order.
Oversample: No oversample sites were selected for the WSA_Revisit design. The expectation
is that all, or almost all, of the 450 sites selected will be sampled, given that they were
sampled previously. For the NRSA design, the oversample is nine times the expected
sample size within each state. The large oversample was chosen to accommodate
those states that may want to increase the number of sites sampled within their state for
a state-level design.
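Operationally, the oversample behaves like an ordered replacement list: when a base site proves non-target or inaccessible during site evaluation, the next unused oversample site (in design order) takes its place, which preserves the spatial balance of the original draw. A hedged sketch of that bookkeeping, with invented site IDs and evaluation results:

```python
# Hedged sketch (not NRSA code): replace base sites that fail evaluation
# with oversample sites, consumed in design order.

def select_sites(base_sites, oversample_sites, evaluate):
    """Return the list of sampleable sites. `evaluate(site)` returns True
    if the site is target and accessible; failed sites are replaced by the
    next unused oversample site."""
    replacements = iter(oversample_sites)
    selected = []
    for site in base_sites:
        while not evaluate(site):
            try:
                site = next(replacements)
            except StopIteration:
                return selected  # oversample exhausted
        selected.append(site)
    return selected

# Illustrative only: real statuses come from site reconnaissance.
status = {"S1": True, "S2": False, "S3": True, "O1": False, "O2": True}
sites = select_sites(["S1", "S2", "S3"], ["O1", "O2"], status.get)
```

With the statuses above, S2 is replaced (O1 also fails, so O2 is used), yielding S1, O2, S3.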
[Map of NRSA base sites across the conterminous United States. Projection: USA Contiguous Albers Equal Area Conic, USGS version.]
Figure 2. NRSA Base Sites
Field Protocol Development: The field sampling protocols are based on protocols developed
by ORD for use in the EMAP program and were developed with the purpose of providing
consistent and representative information across the country. During the initial design
phase of the project, collaborators and partners worked to refine those protocols for use in
the NRSA. This involved modifications to the original EMAP protocols for use in the
Great Rivers, tidal systems, and sites that fall between wadeable and boatable systems.
New advances in the field, such as the incorporation of surveyor's levels for a more
accurate measure of slope at wadeable sites, were also incorporated based on the
consensus of the partners' indicator workgroups. In addition,
OWOW directed development of fecal bacteria (Enterococci) indicator sampling
protocols and OST developed field protocols for the fish tissue indicator.
Field Logistics Coordinator- a contractor who functions on behalf of the Project Leader
to support all phases of the field implementation of the project. Primary
responsibility is to ensure all aspects of the project, i.e., technical, logistical,
organizational, are operating as smoothly as possible. Serves as point-of-contact
for questions from field crews and cooperators for all activities.
Training - Ten training sessions will be conducted in various locations throughout the
US per field year (ten in 2008 and ten in 2009). An initial training session focusing on
training the trainers was held in March 2008 and in March 2009. Headquarters,
GLEC/Tetra Tech (contract), and participants from the train the trainers session
conducted the remaining training sessions. When possible, a monitoring specialist
from each EPA Regional Office also participated in each of the trainings. Each field
crew must have a crew leader who has received 3 days of lecture and field training to
prepare them for this study. They must also have a fish technical lead who has
participated in the training and received prior approval from the EPA Project Lead. At
the end of the training period, each team will conduct a day-long sampling exercise on its own
under the observation of the trainers. This field readiness review serves as the final QA check
on the training sessions. Additionally, all field crews will be audited early in their
sampling schedule so that any needed corrections are made at the onset of sampling.
Field Implementation - States, Tribes, Interstate Agencies, and contract crews will conduct the
field implementation to collect samples using the NRSA protocols.
Field Quality Evaluation and Assistance Reviews (auditing) - Each field team will be
visited by a trained team from either an EPA Region, Headquarters, GLEC, or Tetra
Tech. The purpose of this field evaluation and assistance review is to observe the
crews implementing the protocols as trained and provide any assistance or
corrections necessary. This is intended to catch deviations from the protocols before
they become widespread.
Sample Flow: Field samples will be shipped by the crews to one of several locations. All water
samples will be sent to the Western Ecology Division laboratory staffed by Dynamac. All
biological samples will be sent to a national contract laboratory, or a previously
approved state biological laboratory, for analysis. Enterococci samples will be sent to the
Region 1 laboratory staffed by TechLaw for analysis. The fish tissue samples will be sent to GLEC for
homogenization and filleting. The field data sheets will be shipped to the Western
Ecology Division information management team staffed by CSC for scanning and entry
into the database. Each of the organizations processing samples will electronically
transfer the results to CSC using the naming conventions and standards provided by
CSC.
Information Management: The first stage of data processing will be to take the input from each
of the responsible laboratories and enter them into a common database for final
verification and validation. Once the final data sets are made available for the
assessment, copies of the data will be transferred to EPA's STORET and EPA's EMAP
dataset for long-term storage and access. Working copies of the final data sets will be
distributed to the States and Cooperators and maintained at WED for analysis leading to
the assessment.
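This first processing stage amounts to pooling per-laboratory submissions into one table under a common schema, setting aside records that fail basic completeness checks before formal verification and validation. The column names and file layout below are assumptions for illustration only, not the actual CSC conventions:

```python
import csv
import io

# Assumed minimal common schema for pooled results (illustrative only).
REQUIRED = ("site_id", "parameter", "value")

def pool_lab_results(named_files):
    """Merge per-lab CSV files into one record list, tagging each row with
    its source lab and setting aside rows missing a required field."""
    pooled, rejects = [], []
    for lab, handle in named_files.items():
        for row in csv.DictReader(handle):
            if all(row.get(k) for k in REQUIRED):
                pooled.append({**row, "lab": lab})
            else:
                rejects.append((lab, row))
    return pooled, rejects

# In-memory file standing in for a lab submission; the second record lacks a
# value and is therefore set aside for follow-up rather than pooled.
chem = io.StringIO("site_id,parameter,value\nS-001,pH,7.2\nS-002,pH,\n")
pooled, rejects = pool_lab_results({"WED": chem})
```

Keeping rejected records, rather than silently dropping them, mirrors the verification step described above: every submitted result is either accepted into the common database or flagged for resolution with the originating laboratory.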
Assessment: The final assessment will be developed by a team, led by OW, that will include
Office of Water, Office of Environmental Information, several ORD research facilities,
EPA Regional Monitoring Coordinators, interested States/Tribes, and Cooperators. All
States/Tribes will be invited to participate in a collaborative process to interpret results
and shape the data assessment and report. The final assessment will include an
appendix describing the quality of the data used in the assessment.
1.2.1 Project Schedule
The U.S. EPA has responded to a State and OW goal to report on the quality of the Nation's
rivers and streams by no later than December 2011. Tasks leading up to the final report
are described throughout the QAPP.
1.3 Scope of QA Project Plan
This QA Project Plan addresses all aspects of the data acquisition efforts of the NRSA,
which focuses on the 2008 and 2009 sampling of 2200 river and stream sites in the
contiguous United States. This QA plan also deals with the data integration
necessary between the WSA, NRSA, and EMAP Western Pilot Study (2001-2004) to
create one complete report on the ecological status of the Nation's rivers and
streams.
Relevant companion documents to this QAPP are: NRSA: Site Evaluation Guidelines, NRSA:
Field Operations Manual, and NRSA: Laboratory Methods Manual (see the introductory
pages for citation information for each document).
1.3.1 Overview of Field Operations
Field data acquisition activities are implemented for the NRSA (Table 1-1), based on guidance
developed for earlier EMAP studies (Baker and Merritt 1990). Survey preparation is
initiated with selection of the sampling locations by the EMAP Design group (WED in
Corvallis). The list of sampling locations is distributed to the EPA Regional Monitoring
Coordinators and all Cooperators. With the sampling location list, Cooperators' field
crews can begin site reconnaissance on the primary sites and alternate replacement
sites and begin work on obtaining access permission to each site. Specific procedures
for evaluating each sampling location and for replacing non-target sites are
documented in the NRSA: Site Evaluation Guidelines. Scientific collecting permits from
State and Federal agencies will be procured, as needed by the respective State or
cooperating organization. The field teams will use standard field equipment and
supplies which are being provided by EPA and GLEC. Field logistic coordinators
(GLEC and Tetra Tech) will work with Regional Monitoring Coordinators, Cooperators,
States, and Contractors to make certain the field crews have the equipment and
supplies they require in a timely fashion. Detailed lists of equipment required for each
field protocol, as well as guidance on equipment inspection and maintenance, are
contained in the Field Operations Manual.
Table 1-1. Critical logistics elements (from Baker and Merritt, 1990)

Logistics Plan Component            Required Elements
Project Management                  Overview of Logistic Activities
                                    Staffing and Personnel Requirements
                                    Communications
Access and Scheduling               Sampling Schedule
                                    Site Access
                                    Reconnaissance
Safety                              Safety Plan
                                    Waste Disposal Plan
Procurement and Inventory Control   Equipment, Supplies, and Services Requirements
                                    Procurement Methods and Scheduling
Training and Data Collection        Training Program
                                    Field Operations Scenario
                                    Laboratory Operations Scenarios
                                    Quality Assurance
                                    Information Management
Assessment of Operations            Field Crew Debriefings
                                    Logistics Review and Recommendations
Field measurements and samples are collected by trained teams. Each Crew Leader, along with
as many crew members as possible, will be trained at an EPA-sponsored training session prior
to the start of the field season. Half of each field team must have participated in an official
NRSA training. Fish leads must also attend the training and receive prior approval
from the EPA Project Lead to serve in this role. Field quality evaluation and assistance review
visits will be completed for each team. Typically, each team comprises 4-5 members. The
number and size of teams depend on the duration of the sampling window, the geographic
distribution of sampling locations, the number and complexity of samples and field
measurements, and other factors. The training program stresses hands-on practice of methods,
comparability among crews, collection of high-quality data and samples, and safety. Training will
be provided in ten central locations for cooperators and contractors each year. Project
organizations responsible for training oversight are identified in Figure 1. Training
documentation will be maintained by EPA HQ and the Tetra Tech and GLEC Training Support Team.
For each sampling location, the field crew will prepare a dossier containing the
following information, as applicable: road maps, copies of written access permissions,
scientific collection permits, coordinates of index sites, information brochures on the
program for interested landowners, a topographic map with the index site location
marked, and local-area emergency numbers. Team leaders will contact landowners at
least 2 days before the planned sampling date. Because the design requires repeat visits to
selected sampling locations, field teams must do everything possible
to maintain good relationships with landowners. This includes prior contact, respecting
special requests, closing gates, minimizing site disturbance, and removing all materials,
including flagging and trash.
A variety of methods may be used to access a site, including vehicles and boats. Some
sampling locations require teams to hike in, transporting all equipment in backpacks. For
this reason, ruggedness and weight are important considerations in the selection of
equipment and instrumentation. Teams may need to camp out at the sampling location
and, if so, must be equipped with the necessary camping equipment.
The site verification process is shown in Figure 3. Upon arrival at a site, the location is verified
by a Global Positioning System (GPS) receiver, landmark references, and/or local
residents. Samples and measurements for various indicators are collected in a specified
order (Figure 4). This order has been set up to minimize the impact of sampling for one
indicator upon subsequent indicators; for example, water chemistry samples from rivers
and streams are collected before collecting benthic invertebrates as the benthic
invertebrate method calls for kicking up sediments. All methods are fully documented in
step-by-step procedures in the NRSA: Field Operations Manual (USEPA 2008). The
manual also contains detailed instructions for completing documentation, labeling
samples, any field processing requirements, and sample storage and shipping. Any
revision of methods must be approved in advance by the EPA Project Leader. Field
communications will be available through Field Coordinators, regularly scheduled
conference calls, a Communications Center, or an electronic distribution.
Site Verification Activities

PRE-VISIT PREPARATION
• Contact landowner to inform of visit and confirm access
• Review site dossier and maps for directions and access requirements

SITE VERIFICATION DATA
• Record directions to site
• Confirm identity of stream or river
• Describe the site
• Determine location with GPS
• Determine sampling status

LOCATE SAMPLING & MEASUREMENT SITES (STREAMS)
• Locate index site and determine location with GPS
• Locate upper and lower ends of sampling reach (40 channel widths)
• Establish habitat transects across channel (11 per reach)

Figure 3. Site verification activities for river and stream field surveys.
Standardized field data forms are provided to the field crews as the primary means of data
recording. On completion, the data forms are reviewed by a field crew member other
than the person who initially entered the information. Prior to departure from the field
site, the field team leader reviews all forms and labels for completeness and legibility
and ensures that all samples are properly labeled and packed. Each site has a unique
identifier (Site ID) provided by the design. All jars from a site have a predetermined
sample number that is preprinted on the labels provided to the field crews. If additional
jars are needed, extra labels are provided.
On return from a field sampling site (either to the field team's home office or to a motel),
completed data forms are sent to the information management staff at WED for entry
into a computerized data base. At WED, electronic data files are reviewed
independently to verify that values are consistent with those recorded on the field
data form or original field data file.
Samples are stored or packaged for shipment in accordance with instructions contained in the
field manual. Samples which must be shipped are delivered to a commercial carrier.
The recipient is notified to expect delivery; thus, tracking procedures can be initiated
quickly in the event samples are not received. Tracking forms and chain-of-custody
forms are completed for all transfers of samples maintained by the labs, with copies
also maintained by the field team. The information coordinator maintains a centralized
tracking system of all shipments.
The field operations phase is completed with collection of all samples or expiration of the
sampling window. Following completion of all sampling, a debriefing session will be
scheduled (see Table 1-1). These debriefings cover all aspects of the field program and
solicit suggestions for improvements.
1.3.2 Overview of Laboratory Operations
Holding times for samples vary with the sample types and analytes. Thus, some
analyses (e.g., water chemistry) begin as soon as sampling begins, while others are not
initiated until sampling has been completed (e.g., benthic macroinvertebrates).
Analytical methods are summarized in the Laboratory Methods Manual that is a
companion document to this QAPP. When available, standard methods are used and
are referenced. Where experimental methods are used or standard methods are
modified, these methods are documented in the laboratory methods manual or in
internal documentation, and may be described in SOPs developed by the analytical
laboratories.
[Figure 4a is a flowchart. The whole crew locates the X-site, verifies the site as target, and
determines the launch site and sets up a staging area. One group prepares forms, equipment,
and supplies while a second calibrates the multi-probe meter and loads equipment and supplies
onto the boat (if non-wadeable). The crew measures Secchi depth and in situ temperature, pH,
DO, and conductivity, and collects water chemistry samples; travels to the physical habitat
stations to collect periphyton, benthic, and sediment enzyme samples, conduct habitat
characterization, and conduct the fish assessment; and returns to the staging area to preserve
and prepare the benthic, phytoplankton, periphyton, sediment enzyme, and fish tissue samples
for transport, collect and filter the fecal indicator sample at the X-site, and filter the
chlorophyll-a sample. Before departure, the crew inspects and cleans the boat, motor, and
trailer to prevent transfer of nuisance species and contaminants; reviews data forms for
completeness; cleans and organizes equipment for loading; reports back to the Field Logistics
Coordinator and Information Management Coordinator; and ships the samples.]

Figure 4a. Summary of field activities for boatable stream and river sampling.
[Figure 4b is a flowchart. The crew locates the X-site, verifies the site as target, and sets up a
staging area; prepares forms, equipment, and supplies; calibrates the multi-probe meter; and
lays out the sampling reach from the X-site to Transect A and from the X-site to Transect K.
Sampling activities begin at Transect A. At Transect F (the X-site), the crew conducts habitat
characterization; measures in situ temperature, pH, DO, and conductivity; collects benthic
macroinvertebrate, periphyton, and sediment enzyme samples; collects water chemistry
samples; and collects the fecal indicator sample at Transect K. The crew then travels to
Transect A to conduct the fish assessment and collect fish tissue samples, and returns to the
staging area to preserve the benthic macroinvertebrate, periphyton, and sediment enzyme
samples and prepare them for transport; filter the fecal indicator, chlorophyll-a, and AFDM
samples and prepare them for transport; and preserve and prepare the fish tissue samples for
transport. Before departure, the crew reviews data forms for completeness, cleans and
organizes equipment for loading, reports back to the Field Logistics Coordinator and
Information Management Coordinator, and ships the samples.]

Figure 4b. Summary of field activities for wadeable stream sampling.
Water chemistry and chlorophyll-a samples will be analyzed by the contract laboratory,
Dynamac, maintained by ORD Western Ecology Division. Benthic macroinvertebrate
samples will be processed by a national contractor and a few pre-approved state
laboratories. Sediment enzyme and periphyton APA samples will be analyzed by the
EPA's National Health and Environmental Effects Research Laboratory in Duluth, MN
(NHEERL-Dul). Periphyton ID samples will be analyzed by the Philadelphia Academy of
Natural Sciences, Michigan State University, and the state of
Wisconsin. Enterococci samples will be analyzed by the EPA's New England Regional
Laboratory (NERL). Fish tissue samples will be analyzed by the EPA's National
Exposure Research Laboratory in Cincinnati, OH (NERL-Cin). Fish identification
vouchers will be verified by the Philadelphia Academy of Natural Sciences and Oregon
State University. The physical habitat measurements are made in the field and recorded
on the field data sheets and then scanned into a database at the information
management center at ORD Western Ecology Division. Laboratories providing analytical
support must have the appropriate facilities to properly store and prepare samples, and
appropriate instrumentation and staff to provide data of the required quality within the
time period dictated by the project. Laboratories must conduct operations using
approved laboratory practices (Table 1-2).
All laboratories providing analytical support to the NRSA (water chemistry, chlorophyll a,
fish tissue, fish community, benthic macroinvertebrates, sediment enzymes,
enterococci, and periphyton) must adhere to the provisions of this integrated
QAPP and NRSA Laboratory Manual. Laboratories will provide information
documenting their ability to conduct the analyses with the required level of data
quality. Such information will include results from interlaboratory comparison
studies, analysis of performance evaluation samples, control charts and results of
internal QC sample or internal reference sample analyses to document achieved
precision, bias, accuracy, and method detection limits. Contracted laboratories will
be required to provide copies of their SOPs and audit reports. Water chemistry
laboratories may also be required to successfully analyze at least one
performance evaluation sample for target analytes before routine samples can be
analyzed. Laboratory operations will be evaluated by technical systems audits,
performance evaluation studies, and by participation in interlaboratory sample
exchange.
Table 1-2. Guidelines for analytical support laboratories
A program of scheduled maintenance of analytical balances, water purification systems, microscopes,
laboratory equipment, and instrumentation.
Checking and recording the composition of fresh calibration standards against the previous lot.
Acceptable comparisons are ± 2 percent of the theoretical value.
Recording all analytical data in bound logbooks in ink, or on standardized recording forms.
Monitoring and recording (in a logbook or on a recording form) temperatures and performance of cold
storage areas and freezer units. During periods of sample collection operations, monitoring must
be done on a daily basis.
Verifying the efficiency of fume hoods.
If needed, having a source of reagent water meeting American Society for Testing and Materials (ASTM)
Type I specifications for conductivity (< 1 µS/cm at 25 °C; ASTM 1984) available in sufficient
quantity to support analytical operations.
Appropriate microscopes or other magnification for biological sample sorting and organism identification.
Labeling all containers used in the laboratory with date prepared, contents, and initials of the individual
who prepared the contents.
Dating and storing all chemicals safely upon receipt, and disposing of them properly once the
expiration date has passed.
Using a laboratory information management system to track the location and status of any sample
received for analysis.
Reporting results using standard formats and units compatible with the information management system.
1.3.3 Data Analysis and Reporting
A technical workgroup convened by and under the leadership of the EPA Project Leader is
responsible for outlining the final assessment report. Data analysis to support this report
will be conducted by the EMAP team at the Western Ecology Division and other
experts. Information management activities in support of this effort are discussed further
in Section 4. Data in the database are available to Cooperators for their own use upon
completion of the final verification and validation. The final data from the NRSA will be
transferred to the OW STORET system.
2.0 DATA QUALITY OBJECTIVES
It is a policy of the U.S. EPA and its laboratories that Data Quality Objectives (DQOs) be
developed for all environmental data collection activities. Data quality objectives are
statements that describe the level of uncertainty that can be associated with
environmental data for their intended use. Data quality objectives thus provide the
criteria to design a sampling program within cost and resource constraints or technology
limitations imposed upon a project or study.
2.1 Data Quality Objectives for the NRSA
Target DQOs established for the NRSA relate to the goal of describing the current status of
selected indicators of the condition of rivers and streams in the
conterminous U.S. and in subregions of interest. The formal statement of the DQO for
national estimates is as follows:
Estimate the proportion of river and stream length (± 5%) in the conterminous U.S. that falls
below the designated threshold for good conditions for selected measures with 95%
confidence.
For the subregions of interest (Omernik Level II Ecoregions) the DQO is:
Estimate the proportion of river and stream length (± 15%) in a specific Level II Ecoregion that
falls below the designated threshold for good conditions for selected
measures with 95% confidence.
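To make the precision target concrete, the DQO statements above can be related to sample size through the usual normal-approximation margin of error for an estimated proportion. The sketch below is illustrative only: it assumes simple random sampling, whereas the NRSA's probability design uses design-based variance estimators that account for weighting and spatial balance, so achieved precision will differ. The function name and numbers are hypothetical, not part of the survey design.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of a 95% normal-approximation confidence interval for a
    proportion p estimated from n sites (simple random sampling assumed)."""
    return z * math.sqrt(p * (1 - p) / n)

# The worst case is p = 0.5; roughly 385 sites hold the half-width to about 5%.
print(round(margin_of_error(0.5, 385), 3))  # → 0.05
```

Under these simplifying assumptions, a few hundred sites per reporting unit suffice for a ±5% margin; the smaller subregional site counts are why the Level II Ecoregion DQO relaxes the margin to ±15%.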
2.2 Measurement Quality Objectives
For each indicator, performance objectives (associated primarily with measurement error) are
established for several different attributes of data quality (following Smith et al., 1988).
Specific objectives for each indicator are presented in the indicator section of this
QAPP. The following sections define the data quality attributes and present
approaches for evaluating them against acceptance criteria established for the
program.
2.2.1 Method Detection Limits
For chemical measurements, requirements for the method detection limit (MDL) are
established. The MDL is defined as the lowest level of analyte that can be
distinguished from zero with 99% confidence based on a single measurement
(Glaser et al., 1981). The MDL for an individual analyte is calculated as:

MDL = t(α = 0.01, ν = n − 1) × s

where t is the Student's t value at a significance level (α) of 0.01 with n − 1 degrees of freedom
(ν), and s is the standard deviation of a set of n measurements of a standard solution.
The standard contains analyte concentrations between two and three times the MDL
objective, and is subjected to the entire analytical method (including any preparation
or processing stages). At least seven non-consecutive replicate measurements are
required to calculate a valid estimate of the MDL. Replicate analyses of the standard
should be conducted over a period of several days (or several different calibration
curves) to obtain a long-term (among-batch) estimate of the MDL.
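As a worked illustration (not part of the required laboratory procedures), the MDL calculation above can be scripted as follows. The replicate concentrations are hypothetical, and the one-sided Student's t values at α = 0.01 are hard-coded for the small numbers of replicates typically used.

```python
import statistics

# One-sided Student's t values at alpha = 0.01, keyed by n - 1 degrees of freedom
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821, 10: 2.764}

def mdl(replicates: list[float]) -> float:
    """MDL = t(alpha = 0.01, df = n - 1) * s for n replicate measurements
    of a standard prepared at two to three times the expected MDL."""
    n = len(replicates)
    if n < 7:
        raise ValueError("at least seven replicate measurements are required")
    return T_99[n - 1] * statistics.stdev(replicates)

# Seven hypothetical replicate analyses of a low-level standard (mg/L)
replicates = [0.021, 0.025, 0.019, 0.023, 0.020, 0.024, 0.022]
print(round(mdl(replicates), 4))  # → 0.0068
```

In practice the replicates should come from several batches or calibration curves, as noted above, so that the standard deviation reflects among-batch variability.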
Laboratories should periodically monitor MDLs on a per-batch basis. Suggested procedures for
monitoring MDLs are: (1) analyzing a set of serial dilutions of a low-level standard to
determine the lowest dilution that produces a detectable response; and (2) repeatedly
analyzing (at least seven measurements) a low-level standard within a single batch.
Estimates of MDLs (and how they are determined) are required to be submitted with analytical
results. Analytical results associated with MDLs that exceed the detection limit
objectives are flagged as being associated with an unacceptable MDL. Analytical data
that are below the estimated MDL are reported, but are flagged as being below the
MDL.
2.2.2 Sampling Precision, Bias, and Accuracy
Precision and bias are estimates of random and systematic error in a measurement process
(Kirchmer, 1983; Hunt and Wilson, 1986). Collectively, precision and bias provide an
estimate of the total error or uncertainty associated with an individual measurement or
set of measurements. Systematic errors are minimized by using validated methodologies
and standardized procedures. Precision is estimated from repeated measurements of
samples. Net bias is determined from repeated measurements of solutions of known
composition, or from the analysis of samples that have been fortified by the addition of a
known quantity of analyte. For analytes with large ranges of expected concentrations,
objectives for precision and bias are established in both absolute and relative terms,
following the approach outlined in Hunt and Wilson, 1986. At lower concentrations,
objectives are specified in absolute terms. At higher concentrations, objectives are
stated in relative terms. The point of transition between an absolute and relative
objective is calculated as the quotient of the absolute objective divided by the relative
objective (expressed as a proportion, e.g., 0.10 rather than as a percentage, e.g., 10%).
Final estimates will be calculated by the analysis staff at WED.
Precision in absolute terms is estimated as the sample standard deviation when the number of
measurements is greater than two:

SD = √[ Σ(xi − x̄)² / (n − 1) ]

where
xi is the value of the ith replicate measurement,
x̄ is the mean of the repeated sample measurements,
and n is the number of replicates.
Relative precision for such measurements is estimated as the relative standard deviation
(RSD, or coefficient of variation [CV]):

RSD = (s / x̄) × 100
where
s is the sample standard deviation of the set of measurements,
and x̄ is the mean value for the set of measurements.
Precision based on duplicate measurements is estimated from the range of measured
values (which, for two measurements, equals their difference). The relative percent
difference (RPD) is calculated as:

RPD = ( |A − B| / [(A + B) / 2] ) × 100

where
A is the first measured value,
and B is the second measured value.
Precision objectives based on the range of duplicate measurements can be calculated as:

Critical Range = s × √2

where
s represents the precision objective in terms of a standard deviation.

Range-based objectives are calculated in relative terms as:

Critical RPD = RSD × √2

where
RSD represents the precision objective in terms of a relative standard deviation.
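The precision statistics defined above translate directly into code. The following sketch, with hypothetical values, computes the RSD, the RPD for a duplicate pair, the critical range for duplicates, and the transition concentration between absolute and relative objectives; the function names are illustrative, not taken from the QAPP.

```python
import statistics

def rsd(values: list[float]) -> float:
    """Relative standard deviation (coefficient of variation), in percent."""
    return statistics.stdev(values) / statistics.fmean(values) * 100

def rpd(a: float, b: float) -> float:
    """Relative percent difference between duplicate measurements."""
    return abs(a - b) / ((a + b) / 2) * 100

def critical_range(s_objective: float) -> float:
    """Acceptance limit for the range of duplicates, given a precision
    objective expressed as a standard deviation."""
    return s_objective * 2 ** 0.5

def transition_point(absolute_obj: float, relative_obj: float) -> float:
    """Concentration above which the relative objective applies;
    relative_obj is a proportion (e.g., 0.10 for 10%)."""
    return absolute_obj / relative_obj

print(round(rpd(10.2, 9.8), 1))     # → 4.0
print(transition_point(5.0, 0.10))  # → 50.0
```

For example, with a hypothetical absolute objective of 5 units and a relative objective of 10%, the relative objective governs above a concentration of 50 units.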
For repeated measurements of samples of known composition, net bias (B) is estimated in
absolute terms as:

B = x̄ − T

where
x̄ equals the mean value for the set of measurements,
and T equals the theoretical or target value of a performance evaluation sample.

Bias in relative terms (B[%]) is calculated as:

B(%) = [(x̄ − T) / T] × 100

where
x̄ equals the mean value for the set of measurements,
and T equals the theoretical or target value of a performance evaluation sample.
Accuracy is estimated for some analytes from fortified or spiked samples as the percent
recovery. Percent recovery is calculated as:

% recovery = [(Cs − Cu) / Csp] × 100

where
Cs is the measured concentration of the spiked sample,
Cu is the concentration of the unspiked sample,
and Csp is the concentration of the spike.
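The bias and recovery formulas can likewise be sketched in a few lines. The concentrations below are hypothetical and the function names are illustrative.

```python
import statistics

def net_bias(measurements: list[float], target: float) -> float:
    """Absolute net bias B = mean(x) - T for repeated analyses of a
    performance evaluation sample with target value T."""
    return statistics.fmean(measurements) - target

def percent_bias(measurements: list[float], target: float) -> float:
    """Bias relative to the target value, in percent."""
    return net_bias(measurements, target) / target * 100

def percent_recovery(spiked: float, unspiked: float, spike: float) -> float:
    """Spike recovery: (Cs - Cu) / Csp * 100."""
    return (spiked - unspiked) / spike * 100

pe_results = [9.7, 9.9, 9.8, 10.0]  # hypothetical PE sample results
print(round(percent_bias(pe_results, 10.0), 1))  # → -1.5
print(percent_recovery(14.5, 10.0, 5.0))         # → 90.0
```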
2.2.3 Taxonomic Precision and Accuracy
For the NRSA, taxonomic precision will be quantified by comparing whole-sample identifications
completed by independent taxonomists or laboratories. Accuracy of taxonomy will be
qualitatively evaluated through specification of target hierarchical levels (e.g., family,
genus, or species); and the specification of appropriate technical taxonomic literature or
other references (e.g., identification keys, voucher specimens). To calculate taxonomic
precision, 10% of the biological samples from each participating laboratory will be
randomly selected by EPA HQ and sent to an independent taxonomist for
re-identification. Comparison of the results of whole-sample re-identifications will provide a
Percent Taxonomic Disagreement (PTD), calculated as:

PTD = [1 − (comppos / N)] × 100

where comppos is the number of agreements (positive comparisons), and N is the total number
of individuals in the larger of the two counts. The lower the PTD, the more similar the
taxonomic results and the better the overall taxonomic precision. A measurement quality
objective (MQO) of 15% is recommended for taxonomic difference or disagreement (an overall
mean ≤ 15% is acceptable, based on similar projects) for benthic macroinvertebrates and fish.
Individual samples exceeding 15% are examined for taxonomic areas of substantial
disagreement, and the reasons for the disagreement are investigated. Periphyton and algal
samples have a higher PTD because of the variance among species.
Sample enumeration is another component of taxonomic precision. Sample enumeration
agreement will be checked with the same 10% of samples used to check taxonomic
precision. Final specimen counts for samples are dependent on the taxonomist, not the
rough counts obtained during the sorting activity. Comparison of counts is quantified by
calculating the percent difference in enumeration (PDE):

PDE = [ |n1 − n2| / (n1 + n2) ] × 100

where n1 and n2 are the total numbers of specimens counted by the two independent
laboratories.
An MQO of 5% is recommended (an overall mean of ≤ 5% is acceptable) for several biological
sample types, while others will have higher PDEs. This is based on the laboratory
approaches used and the nature of the indicator. Specific PDE objectives are given in each
indicator section.
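The PTD and PDE calculations above can be sketched as follows. The taxa tallies are hypothetical, and comppos is computed here as the count-weighted overlap of the two identifications, which is one reasonable reading of "number of agreements" for whole-sample comparisons.

```python
from collections import Counter

def ptd(ids_a: Counter, ids_b: Counter) -> float:
    """Percent taxonomic disagreement between two whole-sample
    identifications, given taxon -> count tallies from each taxonomist."""
    comppos = sum((ids_a & ids_b).values())           # positive agreements
    n = max(sum(ids_a.values()), sum(ids_b.values()))  # larger total count
    return (1 - comppos / n) * 100

def pde(count_a: int, count_b: int) -> float:
    """Percent difference in enumeration between two total counts."""
    return abs(count_a - count_b) / (count_a + count_b) * 100

primary = Counter({"Baetis": 12, "Chironomidae": 30, "Hydropsyche": 8})
qc      = Counter({"Baetis": 11, "Chironomidae": 31, "Hydropsyche": 8})
print(round(ptd(primary, qc), 1))  # → 2.0
print(round(pde(237, 243), 2))     # → 1.25
```

Both values are well under the 15% and 5% MQOs, so this hypothetical sample pair would pass without corrective action.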
Corrective actions for samples exceeding these MQOs can include defining the taxa for
which re-identification may be necessary (potentially by a third party), identifying the
samples (even outside of the 10% lot of QC samples) for which re-identification is
necessary, and determining where there may be nomenclatural or enumeration problems.
Taxa lists will be changed when disagreements are resolved by a third party.
Taxonomic accuracy is evaluated by having individual specimens representative of selected
taxa identified by recognized experts, usually contractor or university-affiliated taxonomists
who have peer-reviewed publications on the taxonomic group they are reviewing.
Samples will be identified using the most appropriate technical literature that is accepted
by the taxonomic discipline and reflects the accepted nomenclature. The Integrated
Taxonomic Information System (ITIS, http://www.itis.usda.gov/) will be used to verify
nomenclatural validity and reporting. A reference collection will be compiled by each lab
as the samples are identified. Specialists in several taxonomic groups will verify selected
individuals of different taxa, as determined by the NRSA workgroup.
2.2.4 Completeness
Completeness requirements are established and evaluated from two perspectives. First, valid
data for individual indicators must be acquired from a minimum number of sampling
locations in order to make subpopulation estimates with a specified level of confidence
or sampling precision. The objective of this study is to complete sampling at 95% or
more of the 1800 initial sampling sites and the 200 reference sites. Percent
completeness is calculated as:
%C = (V / T) × 100
where V = number of measurements/samples judged valid, and T = total number of planned
measurements/samples. Within each indicator, completeness objectives are also
established for individual samples or individual measurement variables or analytes.
These objectives are estimated as the percentage of valid data obtained versus the
amount of data expected based on the number of samples collected or number of
measurements conducted. Where necessary, supplementary objectives for
completeness are presented in the indicator-specific sections of this QAPP.
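The completeness calculation is simple enough to sketch directly; the counts below are hypothetical.

```python
def percent_completeness(valid: int, planned: int) -> float:
    """%C = (V / T) x 100, where V is the number of valid
    measurements/samples and T is the total number planned."""
    return valid / planned * 100

# Hypothetical: 1,725 of 1,800 planned sites yield valid samples
print(round(percent_completeness(1725, 1800), 1))  # → 95.8
```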
2.2.5 Comparability
Comparability is defined as the confidence with which one data set can be compared to another
(Stanley and Verner, 1985; Smith et al., 1988). For all indicators, comparability is
addressed by the use of standardized sampling procedures, sampling equipment and
analytical methodologies by all sampling crews and laboratories. These are the same
procedures used to collect data in the EMAP-West and WSA studies. Comparability of data within
and among indicators is also facilitated by the implementation of standardized quality
assurance and quality control techniques and standardized performance and
acceptance criteria. For all measurements, reporting units and format are specified,
incorporated into standardized data recording forms, and documented in the information
management system. Comparability is also addressed by providing results of QA sample
data, such as estimates of precision and bias; conducting methods comparison studies
when requested by the grantees; and conducting interlaboratory performance evaluation
studies among state, university, and NRSA contract laboratories. If an incompatibility
between sampling crews comes to light, the affected data will be rejected.
2.2.6 Representativeness
Representativeness is defined as "the degree to which the data accurately and precisely
represent a characteristic of a population parameter, variation of a property, a process
characteristic, or an operational condition" (Stanley and Verner, 1985; Smith et al.,
1988). At one level, representativeness is affected by problems in any or all of the
other attributes of data quality.
At another level, representativeness is affected by the selection of the target surface water
bodies, the location of sampling sites within that body, the time period when samples are
collected, and the time period when samples are analyzed. The probability-based
sampling design should provide estimates of condition of surface water resource
populations that are representative of the region. The individual sampling programs
defined for each indicator attempt to address representativeness within the constraints of
the sampling design and index sampling period. Holding time requirements for analyses
ensure analytical results are representative of conditions at the time of sampling. Use of
QC samples which are similar in composition to samples being measured provides
estimates of precision and bias that are applicable to sample measurements.
3.0 SURVEY DESIGN
Many of the questions which USEPA's Office of Water, States and Tribes are attempting to
address fundamentally require information about large numbers of systems rather than
individual systems. ORD has studied the role of monitoring surveys, their evolution and
the nature of existing federal monitoring programs, and can provide information and
assistance to the States and Tribes in this area.
The survey design for the NRSA is the same as used for EMAP-West plus the Great Rivers and
the tidal systems. The design is a sample survey design (a.k.a. probability design) that
ensures a representative set of sample sites from which inferences can be made about
the target population. For the NRSA, the target population is all National rivers and
streams in the conterminous US, excluding sites below the head of salt or reservoirs.
There is a large body of statistical literature dealing with sample survey designs which
addresses the problem of making statements about many by sampling the few (e.g.,
Cochran 1977, Kish 1965, Kish 1987, Särndal et al. 1992). Sample surveys have been
used in a variety of fields (e.g., election polls, monthly labor estimates, forest inventory
analysis, national wetlands inventory) to determine the status of populations (large
groups of sites) of interest, especially if the population is too numerous to census or if it
is unnecessary to census the population to reach the desired level of precision for
describing the population's status. A key point in favor of probability based designs is
that they allow lower cost sampling programs because a smaller number of sites are
able to support conclusions with known accuracy and precision about status and trends
of a region.
Probability sampling surveys have been used consistently in some natural resource fields. The
National Agricultural Statistics Service (NASS) surveys conducted by the U.S. Department of
Agriculture and the Forest Inventory Analysis (FIA) conducted by the U.S. Forest
Service (Bickford et al. 1963, Hazard and Law 1989) have both used probability-based
sampling concepts to monitor and estimate the condition and productivity of agricultural
and forest resources from a commodity perspective. National Resources Inventory (NRI)
was instituted initially because of concerns about the impact of soil erosion on crop
production. More recently, the National Wetland Inventory (NWI) developed by the U.S.
Fish and Wildlife Service (Wilen 1990) to estimate the extent of wetland acreage in the
United States has used a probability-based sampling design. However, no thorough
review of all national programs has occurred until recently.
The survey designs used in EMAP to date have been documented in published reports for each
resource group and in the peer-reviewed literature. A brief description of the design
concepts and their specific application to riverine systems is provided below. Much of
this is extracted from various publications and from Stevens (1994) which provides an
excellent overview of the design concepts, issues and applications for the entire
program. The EMAP sampling design strategy is based on the fundamental requirement
for a probability sample of an explicitly defined regional resource population, where the
sample is constrained to reflect the spatial dispersion of the population.
A key property of a probability sample is that every element in the population has some chance
of being included in the sample. If this were not the case, then some parts of the
population might as well not exist, since no matter what, their condition could have no
influence on estimates of population characteristics. This property has a side benefit, in
that it forces an explicit and complete definition of the population being described. This
may seem trivial; however, in practice, it is almost never easy to tightly delimit a real,
physical population. For example, "river" is a concept that has meaning for most people,
and the notion of "all rivers in the continental United States" would seem to define a
population. Nevertheless, an operational definition of membership is missing. The
operational definition must be complete enough to establish any flowing water, from a
headwater stream up to the Mississippi River, as either in or out of the population. Thus,
the definition must address such aspects as size limits (at least lower limits on flow),
natural rivers versus constructed channels, temporal fluctuation (If a "river" dries up
during a drought, is it still a river? Was it a river before the drought?), and amount of
flowing water and riparian zone. Without such an operational definition, any statement
about "all rivers in the United States" has an unquantifiable vagueness.
The river and stream resource does not fall neatly into either the discrete or extensive category.
The National Stream Survey (Messer et al., 1986; Overton, 1985) split streams into
reaches defined as the length of stream between confluences, or from the headwaters
down to the first confluence. Thus, streams were treated as a finite discrete population.
A grid was used to sample stream reaches by randomly placing a grid over a
topographic map of the area of interest, and then proceeding downhill along the fall line
until a stream reach was intersected. This approach avoided the necessity of delimiting
areal resource units. The approach of EMAP-West is somewhat
different. The program focuses on the population of stream miles rather than stream
reaches. We wish to characterize the population in terms of the condition of length of
rivers and streams rather than numbers of river or stream reaches. Therefore, we want a
sampling method that samples a river or stream in proportion to its length; this is
accomplished by viewing rivers and streams as an extensive resource with length. The
method described here is currently being used in a pilot study, which, among other
goals, will examine the suitability of the method for a larger study. Stream and river
traces are identified on 1:100,000-scale Digital Line Graphs, and a Geographical
Information System is used to intersect these with the sampling templates. Each river
and stream segment within a template is identified and its length determined. The
endpoints of a segment are defined as confluences, headwaters ends, or intersections
with a template edge. Sets of connected segments of the same order are always kept
together in the sample selection process. The appropriate Strahler stream order is also
determined for each segment.
Some differential weighting by size is necessary because of the predominance of lower-order
streams. The sample selection proceeds with inclusion probability for a segment
proportional to its length times the weight for its order. The total inclusion probability for
each template is calculated as the weighted sum of stream lengths in the template, the
templates are partitioned into groups using the partitioning algorithm described for lakes,
and the samples are selected in an analogous manner: The partitions are randomized,
the templates are randomized within the partitions, and the sets of connected segments
are randomized within the templates. The same systematic selection protocol is used;
however, in this case, the selection not only identifies the stream segment to be
sampled, but also identifies the point on that segment where the sample is to be located.
This is accomplished by recording the relative distance from the beginning of the
segment to the selected point on the segment.
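The selection protocol just described can be sketched in a few lines of code: a generic systematic unequal-probability draw in which each segment's selection chance is proportional to its length times a weight for its Strahler order. The segments, weights, and sample size below are invented for illustration; this is not the NRSA implementation.

```python
import random

# Generic sketch of systematic unequal-probability selection: a segment's
# chance of selection is proportional to its length times a weight for its
# Strahler order. Segments and weights below are invented.
segments = [
    {"id": "A", "length_km": 8.0, "order": 1},
    {"id": "B", "length_km": 3.0, "order": 2},
    {"id": "C", "length_km": 5.0, "order": 1},
    {"id": "D", "length_km": 9.0, "order": 3},
]
order_weight = {1: 1.0, 2: 2.0, 3: 4.0}  # up-weight scarcer higher orders

def select(segments, n, rng):
    sizes = [s["length_km"] * order_weight[s["order"]] for s in segments]
    total = sum(sizes)
    step = total / n              # systematic sampling interval
    start = rng.uniform(0, step)  # random start within the first interval
    hits, cum, i = [], 0.0, 0
    for k in range(n):
        target = start + k * step
        while cum + sizes[i] <= target:   # walk to the segment holding target
            cum += sizes[i]
            i += 1
        frac = (target - cum) / sizes[i]  # relative distance along segment
        hits.append((segments[i]["id"], round(frac, 3)))
    return hits

print(select(segments, 2, random.Random(1)))
```

Each hit identifies both the segment and the sample point's relative position along it, mirroring the protocol described above.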
The types of questions posed by various State and Tribal agencies suggest
that they would like to make statements about all streams and rivers. Clearly, sampling
every mile of river and stream in the country is neither economically feasible nor
necessary. Probability designs have been used in a wide range of disciplines to address
this need (Converse 1987).
The primary objectives of this study are to estimate the condition of the Nation's mapped
perennial rivers and streams, and the extent (total length) of mapped channels, in the
conterminous United States. These objectives specify an interest in the target population
of wadeable and non-wadeable perennial streams and rivers.
One estimate of extent is provided by the National Hydrography Dataset Plus (NHD-Plus), which
is based on digitized blue lines from 1:100,000-scale maps. From prior information, it
is known that NHD-Plus incorrectly codes some stream segments. Incorrect coding
occurs in (1) designating Strahler stream order; (2) delineating perennial
versus intermittent segments; (3) defining natural versus constructed channels, including newly
modified channels; and (4) distinguishing irrigation return flows from irrigation delivery
channels. In some cases, NHD-Plus includes stream channels that are not actually
present, due to (1) no definable channel present, (2) location is wetland/marsh with no
defined channel, or (3) channel may be an impoundment. NHD-Plus may also exclude
some stream channels due to (1) mapping inconsistencies in construction of 1:100,000
maps, (2) digitization of map blue lines, or (3) inadequacy of photo information used to
develop maps, e.g. heavily forested areas with low order streams. This study assumes
that NHD-Plus includes all stream channels specified by the definition of the target
population. That is, if stream channels exist that are not included in NHD-Plus, they will
not be addressed by this study.
A secondary outcome of estimating the extent of the stream channel resource will be estimates
on the amount of miscoding present in NHD-Plus. Those stream segments actually
selected in the survey sample that are found to be miscoded will be submitted to NHD-
Plus staff for correction.
3.1 Probability-Based Sampling Design and Site Selection
Target Population: Within the conterminous U.S., all stream and river channels (natural and
constructed) mapped at 1:100,000 scale
Sample Frame: NHD-Plus stream and river channel segments coded as R, S, T, N, W, (412,
413, 999) and U (414, 415).
This frame is subdivided into two major parts: (1) all NHD-Plus stream, river and canal
segments coded as perennial, and (2) all NHD-Plus stream, river and canal segments
coded as non-perennial, i.e., all other stream, river and canal segments. The purpose of
subdividing the frame is to allow a sampling focus on systems that have an exceedingly
high probability of being flowing waters during the index sampling period.
Sites were selected for the NRSA project using the hierarchical randomization design process
described by Stevens and Olsen (1999, 2003, 2004). The National Hydrography
Dataset (NHD) served as the frame representing streams and rivers in the U.S. Data
from approximately 1,800 river and stream sites, sampled over a two-year index period,
will be used in the assessment. This total sample size will allow national reporting as
well as regional reporting at the scale of nine aggregated Omernik Level II ecoregions,
the ten EPA Regions, and 10-15 major drainage basins. Several States have added
sites so that they can report on the condition of streams and/or rivers within their
boundaries.
Key features of the approach are (1) utilizing survey theory for continuous populations within a
bounded area, (2) explicit control of the spatial dispersion of the sample through
hierarchical randomization, (3) unequal probability of selection by Strahler order, and (4)
nested subsampling to incorporate intensified sampling in special study regions.
Revisit Sites: Of the sites visited in the field and found to be target sites, a total of 10% will be
revisited; these will be the first 10% of the sites visited. The primary purpose of this
revisit set is to support variance estimates indicating the extent to which the population
estimates might vary. In addition, 450 WSA streams will be revisited during the 2008
and 2009 sampling seasons to evaluate change since the WSA.
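The role of revisit data can be illustrated with a simple components-of-variance sketch: paired visits to the same sites let visit-to-visit variability be separated from site-to-site variability. The indicator values below are invented, and this is a generic illustration rather than the NRSA variance estimator.

```python
import statistics

# Generic components-of-variance sketch using revisit pairs.
# Indicator values are invented; not NRSA data.
pairs = [(7.1, 7.3), (6.4, 6.0), (8.2, 8.1), (5.9, 6.3)]  # (visit 1, visit 2)

# Within-site variance from paired differences: Var_within = mean(d^2) / 2.
within = sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))

# Variability among site means (still carries a small within-site component;
# a fuller analysis would subtract within / 2).
site_means = [(a + b) / 2 for a, b in pairs]
among = statistics.variance(site_means)

print(within < among)  # True: sites differ far more than repeat visits do
```

When within-site variance is small relative to among-site variance, as here, population estimates are robust to the choice of visit date within the index period.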
Site Evaluation Sites: The number of sites that must be evaluated to achieve the expected
number of field sites that can be sampled can only be estimated based on assumptions
concerning expected error rates in RF3, percent of landowner refusals, and percent of
physically inaccessible sites. Based on estimates from previous studies, a list of
alternate sites was selected at the same time as the base sites. These alternate sites will
be used, in order, until the desired sample size designated for the state has been achieved.
4.0 INFORMATION MANAGEMENT
Like QA, information management (IM) is integral to all aspects of the NRSA from initial
selection of sampling sites through dissemination and reporting of final, validated data.
QA and QC measures implemented for the IM system are aimed at preventing
corruption of data at the time of their initial incorporation into the system and maintaining
the integrity of data and information after incorporation into the system. The general
organization of, and QA/QC measures associated with, the IM system are described in
this section.
Long-term data from the NRSA will be maintained in STORET/WQX and the EMAP data
system at ORD (formerly Surface Water Information Management System). Project data
management activities will be handled at EPA's Western Ecology Division and will be
compliant with all relevant EPA and Federal data standards. Data will be shipped from
sample processing laboratories to WED no later than May 2011.
4.1 Data Policy
The NRSA requires a continuing commitment to the establishment, maintenance, description,
accessibility, and long-term availability of high-quality data and information. All data used
in the NRSA will be maintained, following final verification and validation of the dataset,
in EPA's STORET/WQX and EMAP data systems.
Full and open sharing of the complete suite of data and published information produced by the
study is a fundamental objective. Data and information will be available without restriction for no
more than the cost of reproduction and distribution. Where possible, the access to the
data will be via the World Wide Web through STORET and EMAP to keep the cost of
delivery to a minimum and to allow distribution to be as wide as possible. All data
collected by this study will be publicly available following verification and validation of the
dataset.
Organizations and individuals participating in the project will ship all samples on a timeline
consistent with the field operations manual. Field data sheets will be sent directly to
WED for data entry. All laboratories processing samples will send final electronic datasets
to WED by May 2011. Data and metadata will be available for assessment preparation
by July 2011. The final dataset with metadata will be available via STORET and EMAP at
the time of delivery of the final report, December 2011.
All data sets and published information used in the study will be identified with a citation; for
data sets an indication of how the data may be accessed will be provided. Data from this
study will be maintained indefinitely. All EPA data policies will be followed including EPA
data standards, GIS, etc., as discussed in section 4.3.
4.2 Overview of System Structure
At each point where data and information are generated, compiled, or stored, the information
must be managed. Thus, the IM system includes all of the data-generating activities, all
of the means of recording and storing information, and all of the processes which use
data. The IM system includes both hardcopy and electronic means of generating,
storing, and archiving data. All participants in the NRSA have certain responsibilities and
obligations which make them a part of the IM system. In its entirety, the IM system
includes site selection and logistics information, sample labels and field data forms,
tracking records, map and analytical data, data validation and analysis processes,
reports, and archives. IM staff supporting the NRSA at WED provide support and
guidance to all program operations in addition to maintaining a central data base
management system for the NRSA data.
The central repository for data and associated information collected for use by the NRSA is a
DEC Alpha server system located at WED-Corvallis. The general organization of the
information management system is presented in Figure 5. Data are stored and
managed on this system using the Statistical Analysis System (SAS) software package.
This centrally managed IM system is the primary data management center for the NRSA
research conducted at WED and elsewhere. The IM staff receives, enters, and maintains
data and information generated by the site selection process (see Section 3), field
sample and data collection, map-based measurements, laboratory analyses, and
verification and validation activities completed by the states, cooperators and
contractors. In addition to this inflow, the IM system provides outflow in provision of data
files to NRSA staff and other users. The IM staff at WED is responsible for maintaining
the security and integrity of both the data and the system.
[Figure 5. Organization of the EMAP-West information management system. The figure groups
the system's holdings into sample site information (list frame, logistics data, site
verification data, laboratory data, and sample tracking data), indicator research and
development information, and assessment and reporting information (annual population
status data, population trend data, spatial (GIS) data, QA/QC data, and stressor data).]
The following subsections describe how the IM system handles the data and information
collected for EMAP surface waters research projects. Activities to maintain the integrity
and assure the quality of the contents of the IM system are also described.
4.2.1 Design and Logistics Data Bases
The site selection process described in Section 3 produces a list of candidate sampling
locations, inclusion probabilities, and associated site classification data (e.g., target
status, ecoregion, stream order, etc.). This "design" data base is provided to the IM staff,
implementation coordinators, and field coordinators. Field coordinators determine
ownership and contacts for acquiring permission to access each site, and conduct
reconnaissance activities. Ownership and reconnaissance information for each site are
compiled into a "logistics" data base. Generally, standardized forms are used during
reconnaissance activities. Information from these forms may be entered into a SAS-
compatible data management system. Whether in electronic or hardcopy format, a copy
of the logistics data base is provided to the IM staff for archival storage.
4.2.2 Sample Collection and Field Data Recording
Prior to initiation of field activities, the IM staff develops standardized field data forms and
sample labels. Preprinted adhesive labels having a standard recording format are
completed and affixed to each sample container. Precautions are taken to ensure that
label information remains legible and the label remains attached to the sample.
Examples of sample labels are presented in the field operations manual.
Field sample collection and data forms are designed in conjunction with IM staff to ensure the
format facilitates field recording and subsequent data entry tasks. All forms which may
be used onsite are printed on water-resistant paper. Copies of the field data forms and
instructions for completing each form are documented in the field operations manuals.
Recorded data are reviewed upon completion of data collection and recording activities
by a person other than the one who completed the form. Field crews check completed
data forms and sample labels before leaving a sampling site to ensure information and
data were recorded legibly and completely. Errors are corrected if possible, and data
considered as suspect are qualified using a flag variable. The field crew enters
explanations for all flagged data in a comments section. Completed field data forms are
transmitted to the IM staff at WED for entry into the central data base management
system.
All samples are tracked from the point of collection. Hardcopy tracking and custody forms are
completed by the field crews. Copies of the shipping and custody record accompany all
sample transfers; other copies are transmitted to the IMC and applicable indicator lead.
Samples are tracked to ensure that they are delivered to the appropriate laboratory, that
lost shipments can be quickly identified and traced, and that any problems with samples
observed when received at the laboratory are reported promptly so that corrective action
can be taken if necessary. Detailed procedures on shipping and sample tracking can be
found in Appendix C of the Field Operations Manual.
Procedures for completing sample labels and field data forms, and for using PCs, are covered
extensively in training sessions. General QC checks and procedures associated with
sample collection and transfer, field measurements, and field data form completion for
most indicators are listed in Table 4-1. Additional QA/QC checks or procedures specific
to individual indicators are described in the indicator sections in Section 5 of this QAPP.
4.2.3 Laboratory Analyses and Data Recording
Upon receipt of a sample shipment, analytical laboratory receiving personnel check the
condition and identification of each sample against the sample tracking record. Each
sample is identified by information written on the sample label and by a barcode label.
Any discrepancies, damaged samples, or missing samples are reported to the IM staff
and indicator lead by telephone. The laboratory receiving personnel log in the samples
and post the log-in information for the IM staff at WED, who track all sample shipping,
custody, and disposition.
Table 4-1. Sample and field data quality control activities

Contamination Prevention: All containers for an individual site sealed in plastic bags until
use; specific contamination-avoidance measures covered in training.

Sample Identification: Pre-printed labels with a unique ID number for each sample.

Data Recording: Data recorded on pre-printed forms of water-resistant paper; field crew
reviews data forms for accuracy, completeness, and legibility.

Data Qualifiers: Defined qualifier codes used on the data form; additional qualifiers
explained in the comments section of the data form.

Sample Custody: Unique sample ID and tracking form information entered in an electronic
laboratory information management system (LIMS); sample shipment and receipt confirmed.

Sample Tracking: Sample condition inspected upon receipt and noted on the tracking form,
with copies sent to the Indicator Lead, Communications Center, and/or IM staff.

Data Entry: Data entered using customized entry screens that resemble the data forms;
entries reviewed manually or by automated comparison of double entry.

Data Submission: Standard format defined for each measurement, including units, significant
figures, decimal places, accepted code values, and required field width.

Data Archival: All data archived in an organized manner for a period of seven years or
until written authorization for disposition has been received from the Surface Waters
Technical Director.
Most of the laboratory analyses for the NRSA indicators, particularly chemical and physical
analyses, follow or are based on standard methods. Standard methods generally include
requirements for QC checks and procedures. General laboratory QA/QC procedures
applicable to most NRSA indicators are described in Table 4-2. Additional QA/QC
samples and procedures specific to individual indicator analyses are described in the
indicator sections in Section 5 of this QAPP. Biological sample analyses are generally
based on current acceptable practices within the particular biological discipline. Some
QC checks and procedures applicable to most NRSA biological samples are described
in Table 4-3. Additional QA/QC procedures specific to individual biological indicators are
described in the indicator sections in Section 5 of this QAPP.
A laboratory's IM system may consist of only hardcopy records such as bench sheets and
logbooks, an electronic laboratory information management system (LIMS), or some
combination of hardcopy and electronic records. Laboratory data records are reviewed
at the end of each analysis day by the designated laboratory onsite QA coordinator or by
supervisory personnel. Errors are corrected if possible, and data considered as suspect
by laboratory analysts are qualified with a flag variable. All flagged data are explained in
a comments section. Private contract laboratories generally have a laboratory quality
assurance plan and established procedures for recording, reviewing, and validating
analysis data. Once analytical data have passed all of the laboratory's internal review
procedures, a submission package is prepared and transferred to the IM staff. The
contents of the submission package are largely dictated by the type of analysis
(physical, chemical, or biological), but generally include at least the elements listed in
Tables 4-2 or 4-3. All samples and raw data files (including logbooks, bench sheets, and
instrument tracings) are to be retained for a period of seven years or until authorized for
disposal, in writing, by the NRSA Project Leader.
Table 4-2. Laboratory data quality control activities

Instrument Maintenance: Follow manufacturer's recommendations and specific guidelines in
the methods; maintain a logbook of maintenance/repair activities.

Calibration: Calibrate according to manufacturer's recommendations and the guidelines given
in Section 6; recalibrate or replace before analyzing any samples.

QC Data: Maintain control charts, determine MDLs and achieved data attributes; include a
QC data summary in the submission package.

Data Recording: Use software compatible with the EMAP-SWIM system; check all data entered
against the original bench sheet to identify and correct entry errors; review other QA
data (e.g., condition upon receipt) for possible problems with samples or specimens.

Data Qualifiers: Use defined qualifier codes; explain all additional qualifiers.

Data Entry: Automated comparison of double entry or 100% manual check against the
original data form.

Submission Package: Includes a letter by the laboratory manager; data, data qualifiers, and
explanations; an electronic format compatible with the EMAP-SWIM system; documentation
of file and data base structures, variable descriptions, and formats; and a summary
report of any problems and corrective actions implemented.
Table 4-3. Biological sample quality control activities

Sorting/Enumeration: Re-sort 10% of samples and check counts of organisms.

Taxonomic Nomenclature: Use accepted common and scientific nomenclature and unique
entry codes.

Taxonomic Identifications: Use standard taxonomic references and keys; maintain a
bibliography of all references used.

Independent Identifications: Uncertain identifications to be confirmed by an expert in the
particular taxa.

Duplicate Identifications: At least 5% of all samples completed per taxonomist reidentified
by a different analyst; less than 15% assigned a different ID.

Taxonomic Reasonableness Checks: Species or genera known to occur in the given
conditions or geographic area.
4.2.4 Data Review, Verification, and Validation Activities
Raw data files are created from entry of field and analytical data, including data for QA/QC
samples and any data qualifiers noted on the field forms or analytical data package.
After initial entry, data are reviewed for entry errors by either a manual comparison of a
printout of the entered data against the original data form or by automated comparison of
data entered twice into separate files. Entry errors are corrected and reentered. For
biological samples, species identifications are corrected for entry errors associated with
incorrect or misspelled codes. Errors associated with misidentification of specimens are
corrected after voucher specimens have been confirmed and the results are available.
Files corrected for entry errors are considered to be raw data files. Copies of all raw data
files are maintained in the centralized IM system.
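The double-entry comparison mentioned above can be sketched as a record-by-record, field-by-field comparison of the two keyed files; the site IDs and field names below are invented.

```python
# Sketch of automated comparison of double-keyed data: the same forms are
# entered twice and the two files compared field by field. Any mismatch is
# resolved against the original hardcopy form. Records below are invented.
entry1 = [{"site": "S001", "ph": "7.2"}, {"site": "S002", "ph": "6.8"}]
entry2 = [{"site": "S001", "ph": "7.2"}, {"site": "S002", "ph": "8.6"}]

mismatches = []
for rec1, rec2 in zip(entry1, entry2):
    for field in rec1:
        if rec1[field] != rec2[field]:
            mismatches.append((rec1["site"], field, rec1[field], rec2[field]))

print(mismatches)  # [('S002', 'ph', '6.8', '8.6')]
```

Each mismatch pinpoints the site, the variable, and both keyed values, so the correction can be made from the original form rather than by guessing which entry is right.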
Some of the typical checks made in the processes of verification and validation are described in
Table 4-4. Automated review procedures may be used. The primary purpose of the initial
checks is to confirm that a data value present in an electronic data file is accurate with
respect to the value that was initially recorded on a data form or obtained from an
analytical instrument. In general, these activities focus on individual variables in the raw
data file and may include range checks for numeric variables, frequency tabulations of
coded or alphanumeric variables to identify erroneous codes or misspelled entries, and
summations of variables reported in terms of percent or percentiles. In addition,
associated QA information (e.g., sample holding time) and QC sample data are reviewed
to determine if they meet acceptance criteria. Suspect values are assigned a data
qualifier until they can be corrected or confirmed as unacceptable and replaced with a
new acceptable value from sample reanalysis.
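The range checks and frequency tabulations described above might look like the following sketch; the variable names, acceptance ranges, and habitat codes are invented and are not NRSA acceptance criteria.

```python
from collections import Counter

# Sketch of two initial verification checks: a numeric range check and a
# frequency tabulation of coded values to surface erroneous codes.
records = [
    {"site": "S001", "do_mgl": 8.4, "habitat": "RI"},
    {"site": "S002", "do_mgl": 45.0, "habitat": "PO"},  # value out of range
    {"site": "S003", "do_mgl": 7.1, "habitat": "RX"},   # unrecognized code
]
valid_habitat = {"RI", "PO", "GL"}  # riffle, pool, glide (illustrative)

flags = []  # suspect values get a qualifier until corrected or confirmed
for r in records:
    if not 0.0 <= r["do_mgl"] <= 20.0:
        flags.append((r["site"], "do_mgl", "range"))
    if r["habitat"] not in valid_habitat:
        flags.append((r["site"], "habitat", "code"))

print(Counter(r["habitat"] for r in records))  # tabulation reveals "RX"
print(flags)  # [('S002', 'do_mgl', 'range'), ('S003', 'habitat', 'code')]
```

Flagged values stay qualified, as described above, until corrected or confirmed as unacceptable and replaced through reanalysis.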
Table 4-4. Data review, verification, and validation quality control activities

Review any qualifiers associated with a variable: Determine if the value is suspect or
invalid; assign validation qualifiers as appropriate.

Summarize and review replicate sample data: Identify replicate samples with large
variance; determine if analytical error or a visit-specific phenomenon is responsible.

Determine if data quality objectives have been achieved: Determine the potential impact on
achieving research and/or program objectives.

Exploratory data analyses (univariate, bivariate, multivariate) utilizing all data: Identify
outlier values and determine if analytical error or a site-specific phenomenon is
responsible.

Confirm assumptions regarding specific types of statistical techniques being utilized in
development of metrics and indicators: Determine the potential impact on achieving
research and/or program objectives.
A second review is conducted after all analyses have been completed and the raw data file is
created. The internal consistency among different analyses or measurements conducted
on a sample is evaluated. Examples of internal consistency checks include calculation of
chemical ion balances or the summation of the relative abundances of taxa. Samples
identified as suspect based on internal consistency checks are qualified with a flag
variable and targeted for more intensive review. Data remain qualified until they can be
corrected, are confirmed as acceptable in spite of the apparent inconsistency, or until
new acceptable values are obtained from sample reanalysis. Upon completion of these
activities, copies of the resultant data files are transmitted for archival storage.
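An ion balance, one of the internal consistency checks mentioned above, can be sketched as follows; the concentrations and the 10% flagging threshold are invented for illustration.

```python
# Sketch of a chemical ion balance check: the percent difference between
# total cations and total anions (in microequivalents per liter). A large
# imbalance flags the sample for more intensive review.
def ion_balance_pct(cations_ueq, anions_ueq):
    total_cat = sum(cations_ueq.values())
    total_an = sum(anions_ueq.values())
    return 100.0 * (total_cat - total_an) / (total_cat + total_an)

# Invented concentrations for a single water chemistry sample.
cations = {"Ca": 400.0, "Mg": 150.0, "Na": 120.0, "K": 30.0}
anions = {"HCO3": 500.0, "SO4": 120.0, "Cl": 60.0}

pct = ion_balance_pct(cations, anions)
print(round(pct, 2))    # 1.45 -> well balanced
print(abs(pct) > 10.0)  # False: not flagged under this illustrative threshold
```

A sample failing the check would be qualified with a flag variable and targeted for the more intensive review described above.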
In the final stage of data verification and validation, exploratory data analysis techniques may be
used to identify extreme data points or statistical outliers in the data set. Examples of
univariate analysis techniques include the generation and examination of box-and-
whisker plots and subsequent statistical tests of any outlying data points. Bivariate
techniques include calculation of Spearman correlation coefficients for all pairs of
variables in the data set with subsequent examination of bivariate plots of variables
having high correlation coefficients. Recently, multivariate techniques have been used
to detect extreme or outlying values in environmental data sets (Meglen, 1985; Garner
et al., 1991; Stapanian et al., 1993). A software package, SCOUT, developed by EPA
and based on the approach of Garner et al. (1991) may be used for validation of
multivariate data sets.
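A minimal sketch of the box-and-whisker style univariate screen described above (data invented): values beyond 1.5 times the interquartile range from the quartiles are set aside for review.

```python
import statistics

# Flag candidate outliers using the standard box-plot rule:
# anything outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] is set aside for review.
values = [6.8, 7.0, 7.1, 7.2, 7.3, 7.4, 7.5, 12.9]  # invented pH-like data

q1, _, q3 = statistics.quantiles(values, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = [v for v in values if v < low or v > high]
print(outliers)  # [12.9]
```

A flagged point is not automatically discarded; as the text notes, it is reviewed to determine whether analytical error or a genuine site-specific phenomenon is responsible.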
Suspect data are reviewed to determine the source of error, if possible. If the error is
correctable, the data set is edited to incorporate the correct data. If the source of the
error cannot be determined, data are qualified as questionable or invalid. Data qualified
as questionable may be acceptable for certain types of data analyses and interpretation
activities. The decision to use questionable data must be made by the individual data
users. Data qualified as invalid are considered to be unacceptable for use in any
analysis or interpretation activities and will generally be removed from the data file and
replaced with a missing value code and explanatory comment or flag code. After
completion of verification and validation activities, a final data file is created, with copies
transmitted for archival and for uploading to the centralized IM system.
Once verified and validated, data files are made available for use in various types of
interpretation activities, each of which may require additional restructuring of the data
files. These restructuring activities are collectively referred to as "data enhancement." In
order to develop indicator metrics from one or more variables, data files may be
restructured so as to provide a single record per stream or river site. To calculate site
population estimates based on individual measurements or indicators, missing values
and suspect data points may need to be replaced with alternate data (such as a value
from a replicate measurement) or values calculated from predictive relationships based
on other variables.
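The "data enhancement" restructuring to a single record per site can be sketched as a long-to-wide pivot; the site IDs and variable names below are invented.

```python
# Sketch of restructuring long-format results (one row per site/variable)
# into a single record per site, the shape needed for metric calculation.
long_rows = [
    ("S001", "ph", 7.2), ("S001", "cond", 150.0),
    ("S002", "ph", 6.8), ("S002", "cond", 210.0),
]

wide = {}
for site, var, value in long_rows:
    wide.setdefault(site, {})[var] = value

print(wide["S001"])  # {'ph': 7.2, 'cond': 150.0}
```

With one record per site, missing or suspect values stand out as absent keys and can be filled from replicate measurements or predictive relationships, as described above.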
4.3 Data Transfer
Field crews may transmit data electronically via email or CD; original hardcopies of completed
data and sample tracking forms must be transmitted to the IM staff at WED via express
courier service. Copies of raw, verified, and validated data files are transferred from
states, cooperators, and contractors to the IM staff for inclusion in the central IM system.
All transfers of data are conducted using a means of transfer, file structure, and file
format that has been approved by the IM staff. Data files that do not meet the required
specifications will not be incorporated into the centralized data access and management
system.
4.4 Core Information Management Standards
Participants will adhere to the "Core Information Management Standards for the EMAP Western
Study." National and international standards will be used to the greatest extent possible.
This section details a list of standards pertaining to information management that all
participants in the NRSA agree to follow. The goal of these core standards is to
maximize the ability to exchange data with other studies conducted under the monitoring
framework of the Committee on the Environment and Natural Resources (CENR 1997).
The main standards are those of the Federal Geographic Data Committee (FGDC 1999),
the National Spatial Data Infrastructure (NSDI 1999), and the National Biological
Information Infrastructure (NBII 1999).
4.4.1 Metadata
Federal Geographic Data Committee Content standard for digital geospatial metadata, version
2.0. FGDC-STD-001-1998 (FGDC 1998), including the Biological Data Profile and the
Biological Names and Taxonomy Data Standards developed by the National Biological
Information Infrastructure (NBII 1999).
For tabular data, metadata that meet the FGDC content standard are contained by a
combination of the EMAP Data Directory and the EMAP Data Catalog. For ARC/INFO
coverages, the metadata are in the .DOC file embedded in the coverage. This file stays
with the coverage. When the coverage is moved to the EMAP public web sites, it will be
duplicated to an ASCII text file.
4.4.2 Data Directory
The EMAP Data Directory is maintained as an Oracle database. The guidelines are given in
Frithsen and Strebel (1995), Frithsen (1996a, b) and USEPA (1996b).
EMAP Directory entries are periodically uploaded to the Environmental Information
Management system (EIMS 1999). The EIMS will become EPA's node for the National
Spatial Data Infrastructure and will make directory information available to other federal
agencies through the Z39.50 protocol in accordance with the US Global Change
Research Program (USGCRP 1998).

4.4.3 Data Catalog
Data catalog standards are given in Frithsen and Strebel (1995), Frithsen (1996a), and USEPA
(1996c).
4.4.4 Data Formats
Attribute data: ASCII files (comma-separated values, space-delimited, or fixed column);
SAS export files; Oracle.
GIS data: ARC/INFO export files; compressed .tar file of an ARC/INFO workspace;
Spatial Data Transfer Standard (SDTS) (FGDC 1999) format available on request.
4.4.5 Parameter Formats
Sampling site (EPA Locational Data Policy (USEPA 1991)):
Latitude and longitude in decimal degrees (+/- 7.4), negative longitude values (west of
the prime meridian), NAD83
Date: YYYYMMDD (year, month, day)
Time: HHMMSS (hour, minute, second), Greenwich Mean Time or local time
Data loaded to STORET will take on the STORET formats upon loading.
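For illustration, the date and time formats above can be produced from a timestamp with standard library calls (the example timestamp is arbitrary):

```python
from datetime import datetime, timezone

# Format a sampling timestamp to the parameter formats listed above:
# YYYYMMDD for the date and HHMMSS for the time.
dt = datetime(2010, 11, 5, 14, 30, 0, tzinfo=timezone.utc)  # Greenwich Mean Time
date_str = dt.strftime("%Y%m%d")
time_str = dt.strftime("%H%M%S")
print(date_str, time_str)  # 20101105 143000
```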
4.4.6 Standard Coding Systems
Chemical Compounds: Chemical Abstracts Service (CAS 1999)
Species Names: Integrated Taxonomic Information system (ITIS 1999)
Land cover/land use codes: Multi-Resolution Land Characteristics (MRLC 1999)
4.5 Hardware and Software Control
All automated data processing (ADP) equipment and software purchased for or used in the
NRSA surface waters research is subject to the requirements of the federal government,
the particular Agency, and the individual facility making the purchase or maintaining the
equipment and software. All hardware purchased by EPA is identified with an EPA
barcode tag label; an inventory is maintained by the responsible ADP personnel at the
facility. Inventories are also maintained of all software licenses; periodic checks are
made of all software assigned to a particular PC.
The development and organization of the IM system is compliant with guidelines and standards
established by the EMAP Information Management Technical Coordination Group, the
EPA Office of Environmental Information (OEI), and the EPA Office of Administration
and Resources Management (OARM). Areas addressed by these policies and guidelines
include, but are not limited to, the following:
• Taxonomic Nomenclature and Coding
• Locational data
• Sampling unit identification and reference
• Hardware and software
• Data catalog documentation
The NRSA is committed to compliance with all applicable regulations and guidance concerning
hardware and software procurement, maintenance, configuration control, and QA/QC.
As new guidance and requirements are issued, the NRSA information management staff
will assess the impact upon the IM system and develop plans for ensuring timely
compliance.
4.6 Data Security
All data files in the IM system are protected from corruption by computer viruses, unauthorized
access, and hardware and software failures. Guidance and policy documents of EPA
and management policies established by the IM Technical Coordination Group for data
access and data confidentiality are followed. Raw and verified data files are accessible
only to the NRSA collaborators. Validated data files are accessible only to users
specifically authorized by the EPA Project Leader. Data files in the central repository
used for access and dissemination are marked as read-only to prevent corruption by
inadvertent editing, additions, or deletions.
Data generated, processed, and incorporated into the IM system are routinely stored as well as
archived on redundant systems. This ensures that if one system is destroyed or
incapacitated, IM staff will be able to reconstruct the databases. Procedures developed
to archive the data, monitor the process, and recover the data are described in IM
documentation.
Several backup copies of all data files and of the programs used for processing the data are
maintained, with backups of the entire system kept off-site. The central database is
backed up and archived according to procedures already established for WED. All
laboratories generating data and developing data files must have established
procedures for backing up and archiving computerized data.
5.0 INDICATORS
5.1 Description of NRSA Indicators
5.1.1 In Situ Water Quality Measurements
Measurements for temperature, pH, dissolved oxygen (DO), and conductivity will be taken with
a calibrated water quality meter or multi-probe sonde at the X-site (center)
transect in each river or stream. This information will be used to detect extremes in
condition that might indicate impairment.
5.1.2 Secchi Disk Transparency
A Secchi disk is a black and white patterned disk commonly used to measure water clarity as
the depth at which the disk is no longer visible. It will be used in the boatable systems to
determine transparency.
5.1.3 Water Chemistry and Associated Measurements
Water chemistry measurements will be used to determine acid-base status and nutrient
enrichment, as well as to classify water chemistry type.
5.1.4 Chlorophyll-a
Chlorophyll-a is the pigment that makes plants and algae green. Its measurement is used to
determine algal biomass in the water.
5.1.5 Sediment Enzymes
Benthic organisms are in intimate contact with river sediments, and they are influenced by the
physical and chemical properties of the sediment. Sediment enzyme activity serves as a
functional indicator of key ecosystem processes.
5.1.6 Periphyton Assemblage
Periphyton are diatoms and soft-bodied algae that are attached or otherwise associated with
channel substrates. They can contribute to the physical stability of inorganic substrate
particles, and provide habitat and structure. Periphyton are useful indicators of
environmental condition because they respond rapidly and are sensitive to a number of
anthropogenic disturbances, including habitat destruction, contamination by nutrients,
metals, herbicides, hydrocarbons, and acidification.
5.1.7 Benthic Macroinvertebrate Assemblage
Benthic macroinvertebrates are bottom-dwelling animals without backbones ("invertebrates")
that are large enough to be seen with the naked eye ("macro"). Examples of
macroinvertebrates include: crayfish, snails, clams, aquatic worms, leeches, and the
larval and nymph stages of many insects, including dragonflies, mosquitoes, and
mayflies. Populations in the benthic assemblage respond to a wide array of stressors in
different ways so that it is often possible to determine the type of stress that has affected
a macroinvertebrate assemblage (Klemm et al., 1990). Because many
macroinvertebrates have relatively long life cycles of a year or more and are relatively
immobile, the structure and function of the macroinvertebrate assemblage is a response
to exposure of present or past conditions.
5.1.8 Fish Assemblage
Monitoring of the fish assemblage is an integral component of many water quality management
programs. The assessment will measure specific attributes of the overall structure and
function of the ichthyofaunal community to evaluate biological integrity and water quality.
5.1.9 Physical Habitat Assessment
The physical habitat assessment of the sampling reach and the riparian zone (the region lying
along a bank) will serve three purposes. First, habitat information is essential to the
interpretation of what ecological condition is expected to be like in the absence of many
types of anthropogenic impacts. Second, the habitat evaluation is a reproducible,
quantified estimate of habitat condition, serving as a benchmark against which to
compare future habitat changes that might result from anthropogenic activities. Third, the
specific selections of habitat information collected aid in the diagnosis of probable
causes of ecological degradation in rivers and streams. For example, some of the data
collected will be used to calculate relative bed stability (RBS). RBS is an estimate of
stream stability that is calculated by comparing the mean sediment size present to the
sediment size predicted by channel and slope.
In addition to information collected in the field by the physical habitat assessment, the physical
habitat description of each site includes many map-derived variables such as stream
order and drainage area. Furthermore, an array of information, including watershed
topography and land use, supplements the physical habitat information. Together with
water chemistry, the habitat measurements and observations describe the variety of
physical and chemical conditions that are necessary to support biological diversity and
foster long-term ecosystem stability.
5.1.10 Fecal Indicator (Enterococci)
Enterococci are bacteria that are endemic to the guts of warm-blooded creatures. These
bacteria, by themselves, are not considered harmful to humans but often occur in the
presence of potential human pathogens (the definition of an indicator organism).
Epidemiological studies of marine and fresh water bathing beaches have established a
direct relationship between the density of enterococci in water and the occurrence of
swimming-associated gastroenteritis. This analysis will not serve as an exact equivalent
of a water quality test, since it includes dead organisms as well as living, but it will serve
as a surrogate of potential exposure. Enterococci samples will be taken from the last
transect one meter off the bank.
5.1.11 Fish Tissue
The NRSA fish tissue indicator will provide information on the national distribution of selected
persistent, bioaccumulative, and toxic (PBT) chemical residues (e.g., mercury and
organochlorine pesticides) in predator fish species from large (non-wadeable) streams
and rivers of the conterminous United States. In addition, samples collected from a
national statistical subset of NRSA urban sites (approximately 150 sites) located on
large (non-wadeable) rivers will be analyzed for pharmaceutical and personal care
product compounds that can persist through the wastewater treatment process. Various
studies have been conducted on fish tissue contaminants focusing on different parts of
the fish (e.g., whole fish, fillets, livers); however, the NRSA will focus on analysis of fillet
tissue because of associated human consumption and health risk implications.
5.1.12 Other Indicators / Site Characteristics
Observations and impressions about the site and its surrounding catchment by field teams will
be useful for ecological value assessment, development of associations and stressor
indicators, and data verification and validation.
Table 5-1. Summary table of indicators

Indicator | Specs/Location in Sampling Reach
In situ measurements (pH, DO, temperature, conductivity) | One set of measurements taken at midpoint of the river; readings are taken at 0.5 m depth
Secchi disk transparency | Measurements taken at midpoint of the river; readings are taken at 0.5 m depth
Water chemistry (TP, TN [NH4, NO3], basic anions and cations, alkalinity [ANC], DOC, TOC, TSS, conductivity) | Collected from a depth of 0.5 m at the midpoint of the river
Chlorophyll-a | Collected as part of the water chemistry and periphyton samples
Sediment enzymes | Collected from 11 locations systematically placed at each site and combined into a single composite sample
Periphyton | Collected from 11 locations systematically placed at each site and combined into a single composite sample
Benthic macroinvertebrate assemblage (littoral) | Collected from 11 locations systematically placed at each site and combined into a single composite sample
Fish assemblage | Sampled throughout the sampling reach at specified locations
Physical habitat assessment | Measurements collected throughout the sampling reach at specified locations
Fecal indicator (enterococci) | Collected at the last transect one meter off the bank
Fish tissue | Target species collected throughout the sampling reach
Drainage area | Done at desktop, and used in target population selection
Characteristics of watershed | Done at desktop using GIS and verified by state agencies
5.2 Water Chemistry
5.2.1 Introduction
Ecological indicators based on river and stream water chemistry information attempt to evaluate
stream condition with respect to stressors such as acidic deposition and other types of
physical or chemical contamination. Data are collected for a variety of physical and
chemical constituents to provide information on the acid-base status of each stream,
water clarity, primary productivity, nutrient status, mass balance budgets of constituents,
color, temperature regime, and presence and extent of anaerobic conditions.
At each wadeable stream and boatable river site, crews fill one 4-L Cubitainer and a 2-L brown
plastic bottle. These samples are stored in a cooler packed with resealable plastic bags
filled with ice and shipped to the analytical laboratory within 24 hours of collection. Field
crews also measure DO, pH, conductivity, and temperature using a multi-parameter
water quality meter. Secchi disk depth is only measured at non-wadeable sites. The
primary function of the water chemistry information is to determine:
• Acid-base status
• Trophic state (nutrient enrichment)
• Chemical stressors
• Classification of water chemistry type
5.2.2 Sampling Design
The plot designs for stream and river sampling are shown in Figures 6 and 7. The plot design
for water chemistry sampling is based on that used for the National Rivers and Streams
Assessment (Kaufmann et al., 1988). At each stream and river, a single sampling site is
located at the midpoint of Transect F (the middle transect).
NON-WADEABLE SITES
Sampling points:
• L = left; R = right
• First point (transect A) determined randomly
• Subsequent points assigned systematically
Total reach length = 40 x mean wetted width (minimum = 150 m; maximum = 4 km)
Figure 6. Stream and river index sampling design for the water chemistry indicator for non-wadeable
sites.
WADEABLE SITES
Sampling points:
• L = left; C = center; R = right
• First point (transect A) determined at random
• Subsequent points assigned in order L, C, R
Distance between transects = 4 x mean wetted width at X-site
Total reach length = 40 x mean wetted width at X-site (minimum = 150 m)
Figure 7. Stream and river index sampling design for the water chemistry indicator for wadeable sites.
5.2.3 Sampling and Analytical Methodologies
Sample Collection: At wadeable and non-wadeable index sites, a water sample is collected at
the midpoint to fill a 4-L Cubitainer. A multi-probe sonde is also used at the midpoint to
measure DO, pH, temperature, and conductivity. Secchi disk depths (the depths at which
the disk disappears and reappears) are recorded at the X-site. Detailed procedures for
sample collection and handling are described in the field operations manual. Figure 8
presents the process for collecting water chemistry samples and obtaining field
measurements.
Analysis: Table 5.2-1 summarizes performance requirements for water chemistry and
chlorophyll-a analytical methods. Table 5.2-2 summarizes the analytical methods for the
water chemistry indicator. Analytical methods are based on EPA-validated methods,
modified for use with aqueous samples of low ionic strength. Modified methods are
thoroughly documented in the laboratory methods handbook prepared for the Aquatic
Effects Research Program (U.S. EPA, 1987).
5.2.4 Quality Assurance Objectives
Measurement data quality objectives (measurement DQOs or MQOs) are given in Table 5.2-3.
General requirements for comparability and representativeness are addressed in
Section 2. The MQOs given in Table 5.2-3 represent the maximum allowable criteria for
statistical control purposes. Method detection limits are monitored over time by repeated
measurements of low level standards and calculated using Equation 2-1. For major
cations and anions, the required MDLs are approximately equivalent to 1.0 µeq/L (0.5
µeq/L for nitrate). The analytical laboratory may report results in mg/L; these results are
converted to µeq/L for interpretation. For total suspended solids determinations, the
"detection limit" is defined based on the required sensitivity of the analytical balance.
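The mg/L-to-µeq/L conversion mentioned above can be sketched as follows. This is standard equivalent-weight arithmetic (µeq/L = mg/L divided by equivalent weight, times 1000); the analyte keys and function name are illustrative:

```python
# Sketch of the mg/L -> ueq/L conversion used for interpreting major-ion
# results: ueq/L = (mg/L) / (molar mass / |charge|) * 1000.
EQUIV_WEIGHT = {            # g/equivalent = molar mass / |charge|
    "Ca": 40.08 / 2,
    "Mg": 24.31 / 2,
    "Na": 22.99 / 1,
    "K": 39.10 / 1,
    "Cl": 35.45 / 1,
    "SO4": 96.06 / 2,
    "NO3-N": 14.007 / 1,    # nitrate reported as N
}

def mg_per_l_to_ueq_per_l(analyte, mg_per_l):
    return mg_per_l / EQUIV_WEIGHT[analyte] * 1000.0

# 0.05 mg Ca/L is about 2.5 ueq/L
print(round(mg_per_l_to_ueq_per_l("Ca", 0.05), 1))
```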
For precision, the objectives presented in Table 5.2-3 represent the 99% confidence intervals
about a single measurement and are thus based on the standard deviation of a set of
repeated measurements (n > 1). Precision objectives at lower concentrations are
equivalent to the corresponding MDL. At higher concentrations, the precision objective is
expressed in relative terms, with the 99% confidence interval based on the relative
standard deviation (Section 2). Objectives for accuracy are equal to the corresponding
precision objective, and are based on the mean value of repeated measurements.
Accuracy is generally estimated as net bias or relative net bias (Section 2). For total
phosphorus and total nitrogen measurements, accuracy is also determined from
analyses of matrix spike samples (also sometimes called fortified samples) as percent
recovery (Section 2). Precision and bias are monitored at the point of measurement
(field or analytical laboratory) by several types of QC samples described in the Section
5.2.6, and from performance evaluation (PE) samples.
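The accuracy statistics named above can be sketched as follows. The formulas are standard QC definitions, assumed here since Section 2's equations are not reproduced, and the numeric inputs are invented:

```python
# Sketch of two accuracy checks: net bias from repeated measurements of a
# reference sample, and percent recovery from a matrix spike (used for the
# total phosphorus and total nitrogen measurements).
def net_bias(measured, target):
    """Mean measured value minus the target (true) value."""
    return sum(measured) / len(measured) - target

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Matrix-spike recovery, as a percent of the amount added."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

bias = net_bias([9.8, 10.1, 10.3], 10.0)       # small positive bias
recovery = percent_recovery(58.0, 10.0, 50.0)  # 96% recovery
print(bias, recovery)
```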
Table 5.2-1. Performance requirements for water chemistry and chlorophyll-a analytical methods.

Analyte | Units | Potential Range of Samples (1) | Long-Term MDL Objective (2) | Laboratory Reporting Limit (3) | Transition Value (4) | Precision Objective (5) | Bias Objective (6)
Conductivity | µS/cm at 25 °C | 1 to 15,000 | NA | 2.0 | 20 | ±2 or ±10% | ±2 or ±5%
Turbidity | NTU | 0 to 44,000 | 1 | 2.0 | 20 | ±2 or ±10% | ±2 or ±10%
pH | pH units | 3.7 to 10 | NA | NA | <5.75 and >8.25 | ±0.08 or ±0.15 | ±0.05 or ±0.10
Acid Neutralizing Capacity (ANC) | µeq/L (20 µeq/L = 1 mg/L as CaCO3) | -300 to +75,000 (-16 to 3,750 mg/L as CaCO3) | NA | NA | ±50 | ±5 or ±10% | ±5 or ±10%
Total and Dissolved Organic Carbon (TOC/DOC) | mg C/L | 0.1 to 109 (as DOC) | 0.10 | 0.20 | 1 | ±0.10 or ±10% | ±0.10 or ±10%
Ammonia (NH3) | mg N/L | 0 to 17 | 0.01 (0.7 µeq/L) | 0.02 (1.4 µeq/L) | 0.10 | ±0.01 or ±10% | ±0.01 or ±10%
Nitrate-Nitrite (NO3-NO2) | mg N/L | 0 to 360 (as nitrate) | 0.01 | 0.02 | 0.10 | ±0.01 or ±10% | ±0.01 or ±10%
Total Nitrogen (TN) | mg/L | 0.1 to 90 | 0.01 | 0.02 | 0.10 | ±0.01 or ±10% | ±0.01 or ±10%
Total Phosphorus (TP) | µg P/L | 0 to 22,000 | 2 | 4 | 20 | ±2 or ±10% | ±2 or ±10%
Ortho-phosphate | µg P/L | NA | 2 | 4 | 20 | ±2 or ±10% | ±2 or ±10%
Sulfate (SO4) | mg SO4/L | 0 to 5,000 | 0.25 (5 µeq/L) | 0.50 (10 µeq/L) | 2.5 | ±0.25 or ±10% | ±0.25 or ±10%
Chloride (Cl) | mg Cl/L | 0 to 5,000 | 0.10 (3 µeq/L) | 0.20 (6 µeq/L) | 1 | ±0.10 or ±10% | ±0.10 or ±10%
Nitrate (NO3) | mg N/L | 0 to 360 | 0.01 (1 µeq/L) | 0.02 (4 µeq/L) | 0.1 | ±0.01 or ±10% | ±0.01 or ±10%
Calcium (Ca) | mg Ca/L | 0.04 to 5,000 | 0.05 (2.5 µeq/L) | 0.10 (5 µeq/L) | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Magnesium (Mg) | mg Mg/L | 0.1 to 350 | 0.05 (4 µeq/L) | 0.10 (8 µeq/L) | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Sodium (Na) | mg Na/L | 0.08 to 3,500 | 0.05 (2 µeq/L) | 0.10 (4 µeq/L) | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Potassium (K) | mg K/L | 0.01 to 120 | 0.05 (1 µeq/L) | 0.10 (2 µeq/L) | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Silica (SiO2) | mg SiO2/L | 0.01 to 100 | 0.05 | 0.10 | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Total Suspended Solids (TSS) | mg/L | 0 to 27,000 | 1 | 2 | 10 | ±1 or ±10% | ±1 or ±10%
True Color | PCU | 0 to 350 | NA | 5 | 50 | ±5 or ±10% | ±5 or ±10%
Chlorophyll a | µg/L (in extract) | 0.7 to 11,000 | 1.5 | 3 | 15 | ±1.5 or ±10% | ±1.5 or ±10%
1 Estimated from samples analyzed at the WED-Corvallis laboratory between 1999 and 2005 for TIME, EMAP-West, and WSA streams from across the U.S.
2 The long-term method detection limit is determined as a one-sided 99% confidence interval from repeated measurements of a low-level standard across several calibration curves, based on USGS Open File Report 99-193. These represent values that should be achievable by multiple labs analyzing samples over extended periods with comparable (but not necessarily identical) methods.
3 The minimum reporting limit is the lowest value that needs to be quantified (as opposed to just detected), and represents the value of the lowest nonzero calibration standard used. It is set to 2x the long-term detection limit, following USGS Open File Report 99-193, New Reporting Procedures Based on Long-Term Method Detection Levels and Some Considerations for Interpretations of Water-Quality Data Provided by the U.S. Geological Survey National Water Quality Laboratory.
4 Value at which performance objectives for precision and bias switch from absolute (< transition value) to relative (> transition value). Two-tiered approach based on Hunt, D.T.E. and A.L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England.
5 For duplicate samples, precision is estimated as the pooled standard deviation (calculated as the root-mean-square) of all samples at the lower concentration range, and as the pooled percent relative standard deviation of all samples at the higher concentration range. For standard samples, precision is estimated as the standard deviation of repeated measurements across batches at the lower concentration range, and as the percent relative standard deviation of repeated measurements across batches at the higher concentration range.
6 Bias (systematic error) is estimated as the difference between the mean measured value and the target value of performance evaluation and/or internal reference samples at the lower concentration range, measured across sample batches, and as the percent difference at the higher concentration range.
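The duplicate-sample precision estimate in footnote 5 can be sketched as follows, under the assumption that each duplicate pair contributes a variance of (difference)²/2 to the pooled estimate:

```python
import math

# Pooled precision from duplicate (paired) samples: the pooled standard
# deviation is the root-mean-square of the per-pair standard deviations;
# above the transition value the same statistic is expressed as a percent
# relative standard deviation (%RSD).
def pooled_sd(pairs):
    """pairs: list of (result, duplicate) tuples.
    For a duplicate pair, variance = (difference ** 2) / 2."""
    k = len(pairs)
    return math.sqrt(sum((a - b) ** 2 / 2.0 for a, b in pairs) / k)

def pooled_percent_rsd(pairs):
    sd = pooled_sd(pairs)
    mean = sum(a + b for a, b in pairs) / (2.0 * len(pairs))
    return 100.0 * sd / mean

pairs = [(10.0, 10.2), (9.8, 10.0), (10.1, 9.9)]  # invented duplicate results
print(round(pooled_sd(pairs), 3), round(pooled_percent_rsd(pairs), 2))
```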
Table 5.2-2. Analytical methodologies: water chemistry indicator

Analyte | QA | Expected Range | Summary of Method | References
Acid Neutralizing Capacity (ANC) | C | -100 to 5,000 µeq/L | Acidimetric titration to pH < 3.5, with modified Gran plot analysis | EPA 310.1 (modified); U.S. EPA (1987)
Carbon, dissolved inorganic (DIC), closed system | N | 0.1 to 50 mg C/L | Sample collected and analyzed without exposure to atmosphere; acid-promoted oxidation to CO2, with detection by infrared spectrophotometry | U.S. EPA (1987)
Carbon, dissolved organic (DOC) | C | 0.1 to 30 mg C/L | UV-promoted persulfate oxidation, detection by infrared spectrophotometry | EPA 415.2; U.S. EPA (1987)
Conductivity | C | 1 to 500 µS/cm | Electrolytic (conductance cell and meter) | EPA 120.6; U.S. EPA (1987)

Major cations (dissolved):
Calcium | C | 0.02 to 76 mg/L (1 to 3,800 µeq/L) | Atomic absorption spectroscopy (flame) | EPA 200.6; U.S. EPA (1987)
Magnesium | C | 0.01 to 25 mg/L (1 to 2,000 µeq/L) | Atomic absorption spectroscopy (flame) | EPA 200.6; U.S. EPA (1987)
Sodium | C | 0.01 to 75 mg/L (0.4 to 3,300 µeq/L) | Atomic absorption spectroscopy (flame) | EPA 200.6; U.S. EPA (1987)
Potassium | C | 0.01 to 10 mg/L (0.3 to 250 µeq/L) | Atomic absorption spectroscopy (flame) | EPA 200.6; U.S. EPA (1987)
Ammonium | N | 0.01 to 5 mg/L (0.5 to 300 µeq/L) | Colorimetric (automated phenate) | EPA 350.7; U.S. EPA (1987)

Major anions (dissolved):
Chloride | C | 0.03 to 100 mg/L (1 to 2,800 µeq/L) | Ion chromatography | EPA 300.6; U.S. EPA (1987)
Nitrate | C | 0.06 to 20 mg/L (0.5 to 350 µeq/L) | Ion chromatography | EPA 300.6; U.S. EPA (1987)
Sulfate | C | 0.05 to 25 mg/L (1 to 500 µeq/L) | Ion chromatography | EPA 300.6; U.S. EPA (1987)

Phosphorus, total | C | 0 to 1,000 µg/L | Acid-persulfate digestion with automated colorimetric determination (molybdate blue) | USGS I-4600-78; Skougstad et al. (1979); U.S. EPA (1987)
Nitrogen, total | N | 0 to 25,000 µg/L | Alkaline persulfate digestion with determination of nitrate by cadmium reduction and determination of nitrite by automated colorimetry (EDTA/sulfanilamide) | EPA 353.2 (modified); U.S. EPA (1987)
Turbidity | N | 1 to 100 Nephelometric Turbidity Units (NTU) | Nephelometric | APHA 214A; EPA 180.1; U.S. EPA (1987)
Total Suspended Solids (TSS) | N | 1 to 200 mg/L | Gravimetric | EPA 160.3; APHA (1989)
Table 5.2-3. Measurement data quality objectives: water chemistry indicator

Variable or Measurement | Method Detection Limit | Precision and Accuracy | Transition Value (a) | Completeness
Oxygen, dissolved | NA | ±0.5 mg/L | NA | 95%
Temperature | NA | ±1 °C | NA | 95%
Acid Neutralizing Capacity | NA | ±5 µeq/L or ±5% | 100 µeq/L | 95%
Carbon, dissolved organic | 0.1 mg/L | ±0.1 mg/L or ±10% | 1 mg/L | 95%
Conductivity | NA | ±1 µS/cm or ±2% | 50 µS/cm | 95%
Major cations:
Calcium | 0.02 mg/L | ±0.02 mg/L or ±5% | 0.4 mg/L | 95%
Magnesium | 0.01 mg/L | ±0.01 mg/L or ±5% | 0.2 mg/L | 95%
Sodium | 0.02 mg/L | ±0.02 mg/L or ±5% | 0.4 mg/L | 95%
Potassium | 0.04 mg/L | ±0.04 mg/L or ±5% | 0.8 mg/L | 95%
Ammonium | 0.02 mg/L | ±0.02 mg/L or ±5% | 0.4 mg/L | 95%
Major anions:
Chloride | 0.03 mg/L | ±0.03 mg/L or ±5% | 0.6 mg/L | 95%
Nitrate | 0.03 mg/L | ±0.03 mg/L or ±5% | 0.6 mg/L | 95%
Sulfate | 0.05 mg/L | ±0.05 mg/L or ±5% | 1 mg/L | 95%
Phosphorus, total | 1 µg/L | ±1 µg/L or ±5% | 20 µg/L | 95%
Nitrogen, total | 1 µg/L | ±1 µg/L or ±5% | 20 µg/L | 95%
Turbidity | NA | ±2 NTU or ±10% | 20 NTU | 95%
Total Suspended Solids | 0.1 mg | ±1 mg/L or ±10% | 10 mg/L | 95%

NA = not applicable
a Represents the value above which precision and bias are expressed in relative terms.
5.2.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
Water chemistry field measurements should be made with a calibrated multiprobe. The DO,
pH, and conductivity probes should be calibrated in the field prior to each sampling event. It is
recommended to periodically compare the DO probe reading against a chemical DO
analysis procedure. Also conduct a quality control check with a different pH and
conductivity standard to verify the calibration and periodically evaluate instrument
precision. Test the temperature probe against a thermometer that is traceable to the
National Institute of Standards and Technology (NIST) at least once per sampling
season. Field crews should check the calibrated sounding rod and measuring tape
attached to the Secchi disk before each sampling event. Field crews should verify that
all sample containers are uncontaminated and intact, and that all sample labels are
legible and intact. A summary of field quality control procedures for water chemistry is
presented in Table 5.2-4.
Check the label to ensure that all written information is complete and legible. Place a strip of
clear packing tape over the label and bar code, covering the label completely. Record
the bar code assigned to the water chemistry sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any
problems in collecting the sample or if conditions occur that may affect sample integrity.
Store the sample on wet ice in a cooler. Recheck all forms and labels for completeness
and legibility. Additionally, duplicate (replicate) samples will be collected at 10% of sites
sampled.
Table 5.2-4. Field quality control: water chemistry

Check Description | Frequency | Acceptance Criteria | Corrective Actions
Check calibration of multiprobe | Prior to each sampling day | Specific to instrument | Adjust and recalibrate; redeploy gear
Check calibrated sounding rod and measuring tape attached to Secchi disk | Each site | Depth measurements for all sampling points | Obtain best estimate of depth where actual measurement not possible
Check integrity of sample containers and labels | Each site | Clean, intact containers and labels | Obtain replacement supplies
5.2.6 Quality Control Procedures: Laboratory Operations
5.2.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.2-5. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received. The
general scheme for processing stream and river water chemistry samples for analysis
is presented in Figure 9. Several additional aliquots are prepared from the bulk water
samples. Ideally, all analyses are completed within a few days after processing to allow
for review of the results and possible reanalysis of suspect samples within seven days.
Critical holding times (Table 5.2-6) for the various analyses are the maximum allowable
holding times, based on current EPA and American Public Health Association (APHA)
requirements (American Public Health Association, 1989). Analyses performed after the
critical holding time is exceeded will likely not provide representative data.
Table 5.2-5. Sample receipt and processing quality control: water chemistry indicator

Quality Control Activity | Description and Requirements | Corrective Action
Sample log-in | Upon receipt of a sample shipment, laboratory personnel check the condition and identification of each sample against the sample tracking record. | Discrepancies, damaged, or missing samples are reported to the IM staff and indicator lead
Sample storage | Store samples in darkness at 4 °C; monitor temperature daily. |
Holding time | Complete processing of bulk samples within 48 hours of collection. | Qualify sample as suspect for all analyses
Aliquot containers and preparation | Rinse collection bottles 2 times with the stream or river water to be sampled. |
Filtration | 0.4 µm polycarbonate filters required for all dissolved analytes except DIC (0.45 µm). Rinse filters and filter chamber twice with 50-mL portions of deionized water, followed by a 20-mL portion of sample. Repeat for each filter used on a single sample. Rinse aliquot bottles with two 25- to 50-mL portions of filtered sample before use. |
Preservation | Use ultrapure acids for preservation. Add sufficient acid to adjust to pH < 2. Check pH with indicator paper. Record volume of preservative on container label. Store preserved aliquots in darkness at 4 °C until analysis. | Qualify samples
Holding times for preserved aliquots | Holding times range from 3 days to 6 months, based upon current APHA criteria. | Sample results are qualified as being in violation of holding time requirements
Table 5.2-6. Analyte holding time for various sampling methods

Analyte | Method | Preservative | Holding time
Total Phosphorus (TP) | USGS I-4600-78 | |
Total Nitrogen (TN) | EPA 353.2 | |
Total ammonia-nitrogen (NH4) | ? | |
Nitrate (NO3) | EPA 300.6 | |
Anions | EPA 300.6 | |
Cations | EPA 200.6 | |
Total Suspended Solids (TSS) | EPA 160.3 | |
Turbidity | EPA 180.1 | |
Acid Neutralizing Capacity (ANC, alkalinity) | EPA 310.1 | |
Dissolved Organic Carbon (DOC) | EPA 415.2 | |

Note: the source lists "Cool to 4 °C" as preservative (four entries) and holding times of 48 hours, 7 days, 4 hours, and 14 days, without unambiguous assignment to specific analytes.
5.2.6.2 Analysis of Samples
QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and the analytical stage of the measurement system is maintained in a state of
statistical control. Most of the QC procedures described here are detailed in the
references for specific methods. However, modifications to the procedures and
acceptance criteria described in this QAPP supersede those presented in the methods
references. Information regarding QC sample requirements and corrective actions is
summarized in Table 5.2-7. Figure 9 illustrates the general scheme for analysis of a
batch of water chemistry samples, including associated QC samples.
5.2.7 Data Reporting, Review, and Management
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.2-8. Data reporting units and significant figures are given in Table 5.2-9. The
Indicator Lead is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.
[Flowchart, FIELD MEASUREMENT PROCESS: WATER CHEMISTRY INDICATOR: pre-departure check (probe inspection, electronic checks, test calibration and/or instrument); field calibration (QC sample measurement, performance evaluation measurement); conduct measurements and record data (QC sample measurement, duplicate measurement); review data form (qualify data, correct errors); accept for data entry.]

Figure 8: Field measurement process for water chemistry samples.
Table 5.2-7. Laboratory quality control samples: water chemistry indicator

QC Sample Type (Analytes) and Description | Frequency | Acceptance Criteria | Corrective Action
Laboratory Blank (all analyses, total suspended solids [TSS]); Reagent Blank (DOC; Al [total, monomeric, and organic monomeric]; ANC; NH4+; SiO2) | Once per batch, prior to sample analysis | Control limits < ±MDL | Prepare and analyze new blank. Determine and correct problem (e.g., reagent contamination, instrument calibration, or contamination introduced during filtration) before proceeding with any sample analyses. Reestablish statistical control by analyzing three blank samples.
Filtration Blank (all dissolved analytes, excluding syringe samples): ASTM Type II reagent water processed through filtration unit | Prepare 1/week and archive | Measured concentrations < MDL | Measure archived samples if review of other laboratory blank information suggests the source of contamination is sample processing.
Detection Limit Quality Control Check Sample (QCCS) (all analyses except true color, turbidity, and TSS): prepared so concentration is approximately 4-6 times the required MDL | Once per batch | Control limits < ±MDL | Confirm achieved MDL by repeated analysis of appropriate standard solution. Evaluate affected samples for possible re-analysis.
Calibration QCCS: for turbidity, QCCS is prepared at one level for routine analyses (USEPA 1987); additional QCCS are prepared as needed for samples having estimated turbidities >20 NTU. For TSS determinations, QCCS is a standard weight having mass representative of samples | Before and after sample analyses | Control limits < precision objective; mean value < bias objective | Repeat QCCS analysis. Recalibrate and analyze QCCS. Reanalyze all routine samples (including PE and field replicate samples) analyzed since the last acceptable QCCS measurement.
Internal Reference Sample (suggested when available for a particular analyte) | One analysis in a minimum of five separate batches | Control limits < precision objective; mean value < bias objective | Analyze standard in next batch to confirm suspected imprecision or bias. Evaluate calibration and QCCS solutions and standards for contamination and preparation error. Correct before any further analyses of routine samples are conducted. Reestablish control by three successive acceptable reference standard measurements. Qualify all sample batches analyzed since the last acceptable reference standard measurement for possible reanalysis.
Laboratory Replicate Sample (all analyses): for closed-system analyses, a replicate sample represents a second injection of sample from the sealed syringe | One per batch | Control limits < precision objective | If results are below MDL: prepare and analyze split from different sample (volume permitting). Review precision of QCCS measurements for batch. Check preparation of split sample. Qualify all samples in batch for possible reanalysis.
Matrix spike samples (only prepared when samples with potential for matrix interferences are encountered) | One per batch | Control limits for recovery cannot exceed 100±20% | Select two additional samples and prepare fortified subsamples. Reanalyze all suspected samples in batch by the method of standard additions. Prepare three subsamples (unfortified, fortified with solution approximately equal to the endogenous concentration, and fortified with solution approximately twice the endogenous concentration).
[Flowchart: prepare QC samples (laboratory blank, fortified sample, laboratory split sample); sample processing; prepare QC samples (QC check samples [QCCS], internal reference sample); insert LT-MDL QCCS randomly into sample batch; analyze calibration QCCS; on contamination or biased calibration, recheck laboratory blank, re-calibrate, and re-analyze previous samples; accept batch for entry and verification, or qualify batch for possible re-analysis.]

Figure 9. Analysis activities for water chemistry samples.
Table 5.2-8. Data review, verification, and validation quality control: water chemistry indicator

Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid.
Review holding times | Qualify value for additional review.
Ion balance: Calculate percent ion balance difference (%IBD) using data from cations, anions, and ANC. | If total ionic strength < 100 µeq/L, %IBD < ±25%. If total ionic strength > 100 µeq/L, %IBD < ±10%. Determine which analytes, if any, are the largest contributors to the ion imbalance. Review suspect analytes for analytical error and reanalyze. If analytical error is not indicated, qualify sample to attribute imbalance to unmeasured ions; reanalysis is not required. Flag: %IBD outside acceptance criteria due to unmeasured ions.
Conductivity check: Compare measured conductivity of each sample to a calculated conductivity based on the equivalent conductances of major ions in solution (Hillman et al., 1987). | If measured conductivity < 25 µS/cm, ([measured − calculated] ÷ measured) < ±25%. If measured conductivity > 25 µS/cm, ([measured − calculated] ÷ measured) < ±15%. Determine which analytes, if any, are the largest contributors to the difference between calculated and measured conductivity. Review suspect analytes for analytical error and reanalyze. If analytical error is not indicated, qualify sample to attribute conductivity difference to unmeasured ions; reanalysis is not required.
Aluminum check: Compare results for organic monomeric aluminum, total monomeric aluminum, and total dissolved aluminum. | [organic monomeric] < [total monomeric] < [total dissolved]. Review suspect measurement(s) to confirm whether analytical error is responsible for inconsistency.
ANC check: Calculate ANC based on pH and DIC; compare to measured ANC. | Review suspect measurements for samples with results outside of acceptance criteria. Determine whether analytical error or non-carbonate alkalinity is responsible for lack of agreement.
Review data from QA samples (laboratory PE samples, and interlaboratory comparison samples) | Compare with results from other years to determine comparability. Determine impact and possible limitations on overall usability of data.
Table 5.2-9. Data reporting criteria: water chemistry indicator

Measurement | Units | Significant Figures | Maximum Decimal Places
Dissolved Oxygen | mg/L | 2 | 1
Temperature | °C | 2 | 1
pH | pH units | 3 | 2
Carbon, dissolved organic | mg/L | 3 | 1
Acid neutralizing capacity | µeq/L | 3 | 1
Conductivity | µS/cm at 25 °C | 3 | 1
Calcium, magnesium, sodium, potassium, ammonium, chloride, nitrate, and sulfate | µeq/L | 3 | 1
Total phosphorus and total nitrogen | µg/L | 3 | 0
Turbidity | NTU | 3 | 0
Total suspended solids | mg/L | 3 | 1
The ion balance for each sample is computed using the results for major cations, anions, and
the measured acid neutralizing capacity. The percent ion balance difference (%IBD) for a
sample is calculated as:

    %IBD = 100 × [(Σ cations − Σ anions) − ANC] / [ANC + Σ anions + Σ cations + 2[H+]]

where ANC is the acid neutralizing capacity; the cations are the concentrations of calcium,
magnesium, sodium, potassium, and ammonium (converted from mg/L to µeq/L); the
anions are chloride, nitrate, and sulfate (converted from mg/L to µeq/L); and [H+] is the
hydrogen ion concentration calculated from the antilog of the sample pH. Factors to
convert major ions from mg/L to µeq/L are presented in Table 5.2-10. For the conductivity
check, equivalent conductances for major ions are presented in Table 5.2-11.
Table 5.2-10. Constants for converting major ion concentrations from mg/L to µeq/L

Analyte | Conversion from mg/L to µeq/L (a)
Calcium | 49.9
Magnesium | 82.3
Potassium | 25.6
Sodium | 43.5
Ammonium | 55.4
Chloride | 28.2
Nitrate | 16.1
Sulfate | 20.8

a Measured values are multiplied by the conversion factor.
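Under the definitions above, the %IBD computation can be sketched in a few lines. This is an illustrative sketch, assuming ion concentrations in mg/L, ANC in µeq/L, and [H+] in µeq/L derived from pH; the function and variable names are not from the source.

```python
# Conversion factors from Table 5.2-10 (multiply mg/L by factor -> ueq/L).
TO_UEQ = {
    "Ca": 49.9, "Mg": 82.3, "K": 25.6, "Na": 43.5, "NH4": 55.4,
    "Cl": 28.2, "NO3": 16.1, "SO4": 20.8,
}
CATIONS = ("Ca", "Mg", "Na", "K", "NH4")
ANIONS = ("Cl", "NO3", "SO4")

def pct_ibd(conc_mg_l, anc_ueq_l, ph):
    """Percent ion balance difference for one sample."""
    ueq = {ion: mg * TO_UEQ[ion] for ion, mg in conc_mg_l.items()}
    sum_cat = sum(ueq[i] for i in CATIONS)
    sum_an = sum(ueq[i] for i in ANIONS)
    h = 10 ** (-ph) * 1e6  # hydrogen ion in ueq/L, from the antilog of pH
    return 100.0 * ((sum_cat - sum_an) - anc_ueq_l) / (
        anc_ueq_l + sum_an + sum_cat + 2 * h)
```

A sample whose measured ANC exactly equals the cation-anion difference yields %IBD of zero; values outside the ±10% or ±25% criteria of Table 5.2-8 would be reviewed for analytical error.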
Table 5.2-11. Factors to calculate equivalent conductivities of major ions (a)

Ion | Equivalent Conductance per mg/L (µS/cm at 25 °C)
Calcium | 2.60
Magnesium | 3.82
Potassium | 1.84
Sodium | 2.13
Ammonium | 4.13
Chloride | 2.14
Nitrate | 1.15
Sulfate | 1.54
Hydrogen | 3.5 × 10^5 (b)
Hydroxide | 1.92 × 10^5 (b)
Bicarbonate | 0.715
Carbonate | 2.82

a From Hillman et al. (1987).
b Specific conductance per mole/L, rather than per mg/L.
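The conductivity check of Table 5.2-8 can likewise be sketched using these factors. This is an illustrative sketch, assuming ion concentrations in mg/L and the hydrogen-ion term computed per mole/L; function names are not from the source.

```python
# Equivalent conductances from Table 5.2-11 (uS/cm at 25 C per mg/L).
EQ_COND = {
    "Ca": 2.60, "Mg": 3.82, "K": 1.84, "Na": 2.13, "NH4": 4.13,
    "Cl": 2.14, "NO3": 1.15, "SO4": 1.54, "HCO3": 0.715, "CO3": 2.82,
}
H_COND_PER_MOL = 3.5e5  # uS/cm per mole/L of H+ (footnote b)

def calculated_conductivity(conc_mg_l, ph):
    """Conductivity calculated from major ion concentrations and pH."""
    ionic = sum(EQ_COND[ion] * mg for ion, mg in conc_mg_l.items())
    return ionic + H_COND_PER_MOL * 10 ** (-ph)

def conductivity_flag(measured, calculated):
    """True if the sample fails the Table 5.2-8 acceptance criterion."""
    pct_diff = 100.0 * (measured - calculated) / measured
    limit = 25.0 if measured < 25.0 else 15.0
    return abs(pct_diff) > limit
```

Samples flagged by this check would have their largest-contributing analytes reviewed for analytical error, as described in Table 5.2-8.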
5.3 Chlorophyll-a Indicator
5.3.1 Introduction
Data are collected for chlorophyll-a to provide information on the algal loading and gross
biomass of blue-greens and other algae within each stream and river.
5.3.2 Sampling Design
The samples are collected at the index site located at the midpoint of the center transect of the
reach (transect F) on wadeable and non-wadeable sites. The plot design for sampling
locations is shown in Figure 6.
5.3.3 Sampling and Analytical Methods
Sample Collection: At the index site, collect a 2-L water sample from the surface using the
Nalgene beaker and transfer the sample immediately to the 2-L brown bottle. The sample
should be preserved immediately on ice and placed in a cooler away from direct light.
After returning to shore, the sample is filtered in subdued light to minimize degradation.
The filter is then stored in a centrifuge tube on ice before being shipped to the laboratory
for chlorophyll-a analysis. Detailed procedures for sample collection and processing are
described in the Field Operations Manual.
Analysis: A performance-based methods approach, which defines a set of laboratory method
performance requirements for data quality, is used for chlorophyll-a analysis.
Following this approach, participating laboratories may choose which analytical method
they will use to determine chlorophyll-a concentration, as long as they are able to achieve
the performance requirements listed in Table 5.2-1.
5.3.4 Quality Assurance Objectives
MQOs are given in Table 5.2-1. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.2-1 represent the maximum
allowable criteria for statistical control purposes. LT-MDLs are monitored over time by
repeated measurements of low level standards and calculated using Equation 1a.
For precision, the objectives presented in Table 5.2-1 represent the 99% confidence intervals
about a single measurement and are thus based on the standard deviation of a set of
repeated measurements (n > 1). Precision objectives at lower concentrations are
equivalent to the corresponding LRL. At higher concentrations, the precision objective is
expressed in relative terms, with the 99% confidence interval based on the relative
standard deviation (Section 2). Objectives for accuracy are equal to the corresponding
precision objective, and are based on the mean value of repeated measurements.
Accuracy is generally estimated as net bias or relative net bias (Section 2). Precision
and bias are monitored at the point of measurement (field or analytical laboratory) by
several types of QC samples described in Table 5.2-7, where applicable, and from
performance evaluation (PE) samples.
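The precision and bias statistics described above can be illustrated with a short sketch. The function name and return convention are assumptions for illustration; the project's formal definitions are those of Section 2.

```python
from statistics import mean, stdev

# Illustrative computation for n > 1 repeated measurements of a standard:
# absolute precision (standard deviation), relative precision (relative
# standard deviation, used above the transition value), and net bias.
def precision_bias(measured, true_value):
    """Return (sd, rsd_percent, net_bias) for repeated measurements."""
    sd = stdev(measured)
    rsd = 100.0 * sd / mean(measured)
    net_bias = mean(measured) - true_value
    return sd, rsd, net_bias
```

For example, repeated measurements of 9.0, 10.0, and 11.0 against a true value of 10.0 give a standard deviation of 1.0, a relative standard deviation of 10%, and zero net bias.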
5.3.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
Chlorophyll can degrade rapidly when exposed to bright light. It is important to keep the sample
on ice and in a dark place (cooler) until it can be filtered. If possible, prepare the sample
in subdued light (or shade) by filtering as quickly as possible to minimize degradation. If
the sample filter clogs and the entire sample in the filter chamber cannot be filtered,
discard the filter and prepare a new sample, using a smaller volume.
Check the label to ensure that all written information is complete and legible. Place a strip of
clear packing tape over the label and bar code, covering the label completely. Record
the bar code assigned to the chlorophyll-a sample on the Sample Collection Form. Also
record the volume of sample filtered on the Sample Collection Form. Verify that the
volume recorded on the label matches the volume recorded on the Sample Collection
Form. Enter a flag code and provide comments on the Sample Collection Form if there
are any problems in collecting the sample or if conditions occur that may affect sample
integrity. Store the filter sample in a 50-mL centrifuge tube (or other suitable container)
wrapped in aluminum foil and freeze using dry ice or a portable freezer. Recheck all
forms and labels for completeness and legibility. Additionally, duplicate (replicate)
samples will be collected at 10% of sites sampled. A summary of field quality control
procedures for the chlorophyll-a sample is presented in Table 5.3-1.
Table 5.3-1. Sample collection and field processing quality control: chlorophyll-a indicator
Quality Control Activity | Description and Requirements | Corrective Action
Check integrity of sample containers and labels | Clean, intact containers and labels | Obtain replacement supplies
Sample Storage (field) | Store sample on wet ice and in a dark place (cooler) | Discard and recollect sample
Sample Processing (field) | Filter the sample quickly in a shaded area to minimize degradation | Qualify samples
Filtration (done in field) | Whatman GF/F (or equivalent) glass fiber filter; filtration pressure should not exceed 7 psi to avoid rupture of fragile algal cells | Discard and refilter
Duplicate samples | Duplicate samples must be collected at 10% of sites |
Holding time | Frozen filter must be shipped on wet ice immediately | Qualify samples
5.3.6 Quality Control Procedures: Laboratory Operations
5.3.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.3-2. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.
Table 5.3-2. Sample receipt and processing quality control: chlorophyll-a indicator
Quality Control Activity | Description and Requirements | Corrective Action
Sample Log-in | Upon receipt of a sample shipment, laboratory personnel check the condition and identification of each sample against the sample tracking record | Discrepancies, damaged, or missing samples are reported to the IM staff and indicator lead
Sample Storage | Store samples in darkness and frozen (-20 °C); monitor temperature daily | Qualify sample as suspect for all analyses
5.3.6.2 Analysis of Samples
QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and the analytical stage of the measurement system is maintained in a state of
statistical control. Most of the QC procedures described here are detailed in the
references for specific methods. However, modifications to the procedures and
acceptance criteria described in this QAPP supersede those presented in the methods
references. QC activities associated with sample analysis are presented in Table 5.3-3.
Table 5.3-3. Sample analysis quality control: chlorophyll-a indicator
Quality Control Activity | Description and Requirements | Corrective Action
5.3.7 Data Reporting, Review, and Management
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.3-4. Data reporting units and significant figures are given in Table 5.3-5. The
Indicator Lead is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members. Once
data have passed all acceptance requirements, computerized data files are prepared in
a format specified for the NRSA. The electronic data files are transferred to the NRSA
IM Coordinator at WED-Corvallis for entry into a centralized data base. A hard copy
output of all files will also be sent to the NRSA IM Coordinator.
Table 5.3-4. Data review, verification, and validation quality control: chlorophyll-a indicator
Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid
Review data from QA samples (e.g., laboratory PE samples or other standards or replicates) | Determine impact and possible limitations on overall usability of data
Table 5.3-5. Data reporting criteria: chlorophyll-a indicator
Measurement | Units | No. Significant Figures | Maximum No. Decimal Places
Chlorophyll-a | µg/L | 2 | 1
5.4 Sediment Enzymes Indicator
5.4.1 Introduction
Benthic organisms are in intimate contact with river sediments and are influenced by the
physical and chemical properties of the sediment. Sediment enzyme activity serves as a
functional indicator of key ecosystem processes. Sediment samples are collected,
preserved, and analyzed to determine extracellular enzyme activity using a Bio-Tek
microplate fluorescence/luminescence reader.
5.4.2 Sampling Design
The samples are collected at the 11 sampling stations at each site and combined, resulting in a
single 500 mL composite sample per site. The transect and plot design for sampling
locations is shown in Figure 6.
5.4.3 Sampling and Analytical Methods
Sample Collection: Collect sediment samples at the 11 transect sampling stations at each site
and combine all subsamples at a site, resulting in a single 500 mL composite sample per
site. Collect fine surface sediments (top 5 cm) using a stainless steel spoon or dredge.
Store the samples on wet ice in the field. If not shipped immediately, samples may be
stored in a refrigerator for no more than 2 weeks until shipment to the analytical
laboratory for processing. Samples will be analyzed for available DIN, NH4, DIP, TP, TN,
total carbon (TC), and enzyme activity. Detailed procedures for sample collection and
processing are described in the Field Operations Manual.
Analysis: Sediment samples are collected in clean ziplock bags and frozen until analysis.
Subsamples are weighed (0.5-2.0 g wet weight) into 125 mL Nalgene bottles and either
refrozen until analysis or used immediately. Seventy-five (75) mL of acetate buffer is
added to the sample, which is homogenized and then quantitatively transferred to a
300 mL sterile wide-mouth glass jar. An additional 125 mL of buffer is added, and the
sample is re-homogenized if necessary. Prepared samples are stored in the refrigerator
and stirred with a stir bar during sample pipetting. Samples are run (or diluted and run)
on the Bio-Tek fluorescence detector. Detailed procedures are contained in the
laboratory operations manual and cited references.
5.4.4 Quality Assurance Objectives
MQOs are given in Table 5.4-1. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.4-1 represent the maximum
allowable criteria for statistical control purposes. LT-MDLs are monitored over time by
repeated measurements of low level standards and calculated using Equation 1a.
Table 5.4-1. Measurement data quality objectives: sediment enzymes indicator

Variable or Measurement | Method Detection Limit | Precision and Accuracy | Transition Value (a) | Completeness
DIN | | | |
NH4 | | | |
DIP | | | |
TP | | | |
TN | | | |
total carbon (TC) | | | |
enzyme activity | | | |

NA = not applicable
a Represents the value above which precision and bias are expressed in relative terms.
5.4.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
It is important to keep the individual sediment subsamples on wet ice and in a dark place
(cooler) as each subsequent subsample is collected. After the subsamples are
composited, the composite sample is stored on wet ice and in a dark place (cooler in
field; refrigerator in lab). The composited samples must be shipped to the analytical
laboratory within 2 weeks of collection.
Check the sample label to ensure that all written information is complete and legible. Place a
strip of clear packing tape over the label and bar code, covering the label completely.
Record the bar code assigned to the sediment sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any
problems in collecting the sample or if conditions occur that may affect sample integrity.
Recheck all forms and labels for completeness and legibility. Additionally, duplicate
(replicate) samples will be collected at 10% of sites sampled. A summary of field quality
control procedures for sediment enzyme samples is presented in Table 5.4-2.
Table 5.4-2. Sample collection and field processing quality control: sediment enzymes indicator
Quality Control Activity | Description and Requirements | Corrective Action
Check integrity of sample containers and labels | Clean, intact containers and labels | Obtain replacement supplies
Sample Storage (field) | Store sediment samples on wet ice and in a dark place (cooler) | Discard and recollect sample
Duplicate samples | Duplicate samples must be collected at 10% of sites | Qualify samples
Holding time | Refrigerated samples must be shipped on wet ice within 2 weeks of collection | Qualify sample as suspect for all analyses
Sample Storage (lab) | Sediment samples are collected in clean ziplock bags and frozen until analysis |
5.4.6 Quality Control Procedures: Laboratory Operations
5.4.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.4-3. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.
Table 5.4-3. Sample receipt and processing quality control: sediment enzymes indicator
Quality Control Activity | Description and Requirements | Corrective Action
Sample Log-in | Upon receipt of a sample shipment, laboratory personnel check the condition and identification of each sample against the sample tracking record | Discrepancies, damaged, or missing samples are reported to the IM staff and indicator lead
5.4.6.2 Analysis of Samples
QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and the analytical stage of the measurement system is maintained in a state of
statistical control. Most of the QC procedures described here are detailed in the
references for specific methods. However, modifications to the procedures and
acceptance criteria described in this QAPP supersede those presented in the methods
references. Replicate lab samples should be analyzed on at least 10% of total number
of samples analyzed. Replicate lab samples should agree within 20-30% of each
determination. QC activities associated with sample receipt and processing are
-------
National Rivers and Streams Assessment
QA Project Plan
November 2010
Page 11 of 120
presented in Table 5.4-4. (There is very little QA/QC info in the Lab SOP; need more
info for this section)
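The replicate-agreement check described above is commonly expressed as a relative percent difference (RPD) between paired determinations. The sketch below assumes that convention; the exact agreement limit (20-30%) is analyte-specific, and the function names are illustrative.

```python
# Relative percent difference between two replicate lab results,
# flagged against an agreement limit (20-30% per the text above).
def rpd(x1, x2):
    """Relative percent difference of a replicate pair."""
    return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

def replicates_agree(x1, x2, limit_pct=20.0):
    """True if the replicate pair meets the stated agreement limit."""
    return rpd(x1, x2) <= limit_pct
```

For example, replicate results of 9.0 and 11.0 give an RPD of 20%, at the edge of a 20% limit, while 8.0 and 12.0 (RPD 40%) would be qualified for possible reanalysis.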
Table 5.4-4. Sample analysis quality control: sediment enzymes indicator
Quality Control Activity | Description and Requirements | Corrective Action
5.4.7 Data Reporting, Review, and Management
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.4-5. Data reporting units and significant figures are given in Table 5.4-6. The
Indicator Lead is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members. Once
data have passed all acceptance requirements, computerized data files are prepared in
a format specified for the NRSA. The electronic data files are transferred to the NRSA
IM Coordinator at WED-Corvallis for entry into a centralized data base. A hard copy
output of all files will also be sent to the NRSA IM Coordinator.
Table 5.4-5. Data review, verification, and validation quality control: sediment enzymes indicator
Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid
Review data from QA samples (e.g., laboratory PE samples or other standards or replicates) | Determine impact and possible limitations on overall usability of data
Table 5.4-6. Data reporting criteria: sediment enzymes indicator
Measurement | Units | No. Significant Figures | Maximum No. Decimal Places
DIN | | |
NH4 | | |
DIP | | |
TP | | |
TN | | |
total carbon (TC) | | |
enzyme activity | | |
5.5 Periphyton
5.5.1 Introduction
Periphyton are diatoms and soft-bodied algae that are attached or otherwise associated with
channel substrates. They can contribute to the physical stability of inorganic substrate
particles, and provide habitat and structure. Periphyton are useful indicators of
environmental condition because they respond rapidly and are sensitive to a number of
anthropogenic disturbances, including habitat destruction, contamination by nutrients,
metals, herbicides, hydrocarbons, and acidification.
5.5.2 Sampling Design
The samples are collected at the 11 sampling stations at each site and combined, resulting in a
single 500 mL composite sample per site. Four individual samples are prepared from
this composite sample. The transect and plot design for sampling locations is shown in
Figure 6.
5.5.3 Sampling and Analytical Methodologies
Sample Collection: At each transect, within the littoral zone, crews collect periphyton
samples from coarse substrate. A 12 cm delimiter is used to define the sampling area on
the substrate. An aspirator is used if no coarse substrate is available. The sample is a
composite from each of the 11 transects throughout the reach. In the post-sampling
activities, periphyton composite samples will be separated into a 50 mL community
sample, a filtered ash-free dry mass sample, a filtered chlorophyll-a sample, and a 50 mL
acid phosphatase activity sample.
Analysis: Community identification samples are preserved, processed, enumerated, and
organisms identified to the lowest possible taxonomic level (generally genus, see
Laboratory Methods Manual) using specified standard keys and references. Processing
and archival methods are based on USGS NAWQA methods (Charles et al. 2003).
Detailed procedures are contained in the laboratory methods manual and cited
references. There is no maximum holding time associated with preserved periphyton
samples. Chlorophyll-a samples will be filtered on a Whatman GF/F 0.7-µm filter, frozen
in the field, and shipped to the Dynamac lab. The sample analysis and QC will follow those
previously described for water column chlorophyll-a in Section 5.2. Acid Phosphatase
Activity (APA) samples will be frozen in the field and shipped on ice to the analysis lab in
Duluth, MN. Ash free dry mass samples will be filtered in the field, and filters shipped to
the analytical lab.
5.5.4 Quality Assurance Objectives
MQOs are given in Table 5.5-1. General requirements for comparability and representativeness
are addressed in Section 2. Precision is calculated as percent efficiency, estimated from
independent identifications of organisms in randomly selected samples. The MQO for
accuracy is evaluated by having individual specimens representative of selected taxa
identified by recognized experts.
Table 5.5-1. Measurement data quality objectives: periphyton indicator
Variable or Measurement    Precision    Accuracy    Completeness
Enumeration                85%          90%a        99%
Identification             85%          90%a        99%
a Taxonomic accuracy, as calculated using Equation 9 in Section 2.
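The precision objective above is a percent agreement ("percent efficiency") between two independent identifications of the same sample. The exact formula is Equation 9 in Section 2 and is not reproduced in this section, so the count-weighted agreement below is only an illustrative assumption; the taxon names and counts are hypothetical.

```python
# Illustrative sketch (not Equation 9): precision as count-weighted percent
# agreement between two independent identifications of one composite sample.

def percent_taxonomic_agreement(ids_a, ids_b):
    """ids_a, ids_b: dicts mapping taxon name -> organism count."""
    taxa = set(ids_a) | set(ids_b)
    # Agreement for each taxon is the count both analysts have in common.
    agreed = sum(min(ids_a.get(t, 0), ids_b.get(t, 0)) for t in taxa)
    total = max(sum(ids_a.values()), sum(ids_b.values()))
    return 100.0 * agreed / total if total else 0.0

# Hypothetical primary count and independent recount of the same sample.
primary = {"Navicula": 120, "Achnanthidium": 60, "Gomphonema": 20}
recount = {"Navicula": 115, "Achnanthidium": 65, "Gomphonema": 18}
pta = percent_taxonomic_agreement(primary, recount)
print(f"{pta:.1f}% agreement; 85% MQO met: {pta >= 85}")
```

A result at or above the 85% objective would indicate the enumeration and identification are in statistical control for that sample.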
5.5.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
It is important to keep the individual periphyton subsamples on wet ice and in a dark place
(cooler) as each subsequent subsample is collected. After the 500-mL bottle has been
filled, the composite sample is processed (filtered or preserved) in the field. The sample
must be thoroughly mixed before processing to ensure that the sample material is evenly
distributed throughout the composite. The crews must be careful to use the appropriate
filter or preservative for each type of sample prepared from the composite.
The sample labels should be checked to ensure that all written information is complete and
legible, and that the label has been completely covered with clear packing tape. It
should be verified that the bar code assigned to the periphyton samples is recorded
correctly on the Sample Collection Form. The presence of preservative in the sample
should be noted on the Sample Collection Form to assure the integrity of the sample. A
flag code should be recorded and comments provided on the Sample Collection Form to
denote any problems encountered in collecting the sample or the presence of any
conditions that may affect sample integrity. Recheck all forms and labels for
completeness and legibility. Additionally, duplicate (repeat) samples will be collected at
10% of sites sampled. A summary of field quality control procedures for periphyton
samples is presented in Table 5.5-2.
Table 5.5-2. Sample collection and field processing quality control: periphyton indicator
Check integrity of sample containers and labels
  Description and Requirements: Clean, intact containers and labels
  Corrective Action: Obtain replacement supplies

Sample Storage (field)
  Description and Requirements: Store samples on wet ice and in a dark place (cooler)
  Corrective Action: Discard and recollect sample

Homogenize composite
  Description and Requirements: Thoroughly mix samples before processing to ensure that the sample material is evenly distributed throughout the composite.
  Corrective Action: Discard and recollect sample

Preparing samples
  Description and Requirements: Use the appropriate filter or preservative for each type of sample prepared from the composite.
  Corrective Action: Discard and prepare a replacement subsample from the composite

Duplicate samples
  Description and Requirements: Duplicate samples must be collected at 10% of sites

Holding times
  Description and Requirements: The frozen chlorophyll and AFDM filters are shipped immediately on wet ice. The APA sample may be held frozen and shipped on wet ice within 2 weeks of collection. The ID sample preserved with Lugol's solution is held in a refrigerator and must be shipped on wet ice within 2 weeks of collection.
  Corrective Action: Qualify samples
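The 2-week shipping windows in the holding-time row above lend themselves to a simple automated check. The sketch below is a hypothetical illustration, not part of the NRSA information management system; the sample types and dates are invented for the example.

```python
# Illustrative holding-time check for periphyton subsamples (Table 5.5-2):
# APA and ID samples must be shipped within 2 weeks of collection.
from datetime import date, timedelta

HOLDING_LIMITS = {"APA": timedelta(weeks=2), "ID": timedelta(weeks=2)}

def within_holding_time(sample_type, collected, shipped):
    """Return True/False for types with a defined window, None otherwise
    (chlorophyll and AFDM filters are simply shipped immediately)."""
    limit = HOLDING_LIMITS.get(sample_type)
    if limit is None:
        return None
    return (shipped - collected) <= limit

ok = within_holding_time("APA", date(2009, 7, 1), date(2009, 7, 10))
late = within_holding_time("ID", date(2009, 7, 1), date(2009, 7, 20))
print(ok, late)  # a sample exceeding the window would be qualified (flagged)
```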
5.5.6 Quality Control Procedures: Laboratory Operations
5.5.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.5-3. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.
Table 5.5-3. Sample receipt and processing quality control: periphyton indicator
Sample Log-in
  Description and Requirements: Upon receipt of a sample shipment, laboratory personnel check the condition and identification of each sample against the sample tracking record.
  Corrective Action: Discrepancies, damaged, or missing samples are reported to the IM staff and indicator lead

Sample Storage
  Corrective Action: Qualify sample as suspect for all analyses

Holding time
  Corrective Action: Qualify samples

Filtration
  Corrective Action: Qualify samples

Preservation
  Corrective Action: Qualify samples
5.5.6.2 Analysis of Samples
It is critical that, prior to taking a small portion of the subsample, the sample be thoroughly
mixed so that macroscopic or visible forms are evenly dispersed.
5.5.7 Data Management, Review, and Validation
The Indicator Lead is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members. Once
data have passed all acceptance requirements, computerized data files are prepared in
a format specified for the NRSA project. The electronic data files are transferred to the
Rivers and Streams Survey IM Coordinator at WED-Corvallis for entry into a centralized
database. A hard copy output of all files will also be sent to the Rivers and Streams
Survey IM Coordinator.
Sample residuals, vials, and slides are archived by each laboratory until the EPA Project Leader
has authorized, in writing, the disposition of samples. All raw data (including field data
forms and bench data recording sheets) are retained permanently in an organized
fashion by the Indicator Lead in accordance with EPA records management policies.
5.6 Benthic Macroinvertebrates
5.6.1 Introduction
The benthic macroinvertebrate assemblage found in sediments and on substrates of streams
and rivers reflects an important aspect of the biological condition of the stream or river.
The response of benthic communities to various stressors can often be used to
determine the type of stressor and to monitor trends (Klemm et al., 1990). The overall
objectives of the benthic macroinvertebrate indicators are to detect stresses on
community structure in the nation's rivers and streams and to assess and monitor the
relative severity of those stresses. The benthic macroinvertebrate indicator procedures
are based on recent bioassessment literature (Barbour et al. 1999, Hawkins et al. 2000,
Peck et al. 2003).
5.6.2 Sampling Design
Benthic macroinvertebrates are collected at randomly selected sampling locations on the 11
cross-sectional transects established along the stream reach. A composite sample is
collected from a multi-habitat approach and consists of sampling pools, riffles, runs, and
glides. See field manual for more details.
5.6.3 Sampling and Analytical Methodologies
Sample Collection: Benthic macroinvertebrate composite samples are collected using a
D-frame net with 500-µm mesh openings. The samples are taken from the randomly
selected sampling stations at the 11 transects equally distributed along the targeted
reach. Benthic macroinvertebrates are collected from an approximately 1 ft2 area in
wadeable systems and from 1 linear meter in non-wadeable systems. Samples are
field-processed to remove large detritus (rinsed and inspected for organisms) and
preserved in ethanol. Detailed sampling and processing procedures are described in the
field operations manual. A condensed description of key elements of the field activities is
provided for easy reference onsite.
Analysis: Preserved composite samples are sorted, enumerated, and invertebrates identified to
the genus level (see Attachment 6 of the Laboratory Methods Manual) using specified
standard keys and references. Processing and archival methods are based on standard
practices. Detailed procedures are contained in the laboratory methods manual and
cited references. There is no maximum holding time associated with preserved benthic
macroinvertebrate samples. A 500-organism count is the target number, matching the
EMAP West protocol. A 10% external check is standard QA for EMAP
West. For operational purposes of the NRSA, laboratory sample processing should be
completed by March 2010. Table 5.6-1 summarizes field and analytical methods for the
benthic macroinvertebrates indicator.
Table 5.6-1. Field and laboratory methods: benthic indicator

Sample Collection (QA: C; Expected Range/Units: NA)
  Summary of Method: D-frame kick net (500-µm mesh) used to collect organisms, which are composited from 11 transects
  References: Barbour et al. 1999, Peck et al. 2003, WSA Field Operations Manual 2004

Sorting and Enumeration (QA: C; Expected Range/Units: 0 to 500 organisms)
  Summary of Method: Random systematic selection of grids with target of 500 organisms from sample
  References: WSA Benthic Laboratory Methods 2004

Identification (QA: C; Expected Range/Units: genus)
  Summary of Method: Specified keys and references
  References: WSA Benthic Laboratory Methods 2004
C = critical, N = non-critical quality assurance classification.
5.6.4 Quality Assurance Objectives
Measurement quality objectives (MQOs) are given in Table 5.6-2. General requirements for
comparability and representativeness are addressed in Section 2. The MQOs given in
Table 5.6-2 represent the maximum allowable criteria for statistical control purposes.
Precision is calculated as percent efficiency, estimated from examination of randomly
selected sample residuals by a second analyst and from independent identifications of
organisms in randomly selected samples. The MQO for picking accuracy is estimated
from examinations (repicks) of randomly selected residues by experienced taxonomists.
Table 5.6-2. Measurement data quality objectives: benthic indicator
Variable or Measurement    Precision    Accuracy    Completeness
Sort and Pick              95%          90%         99%
Identification             85%          90%a        99%
NA = not applicable
aTaxonomic accuracy, as calculated using Equation 10 in Section 2.
The completeness objectives are established for each measurement per site type (e.g.,
probability sites, revisit sites, etc.). Failure to achieve the minimum requirements for a
particular site type results in regional population estimates having wider confidence
intervals. Failure to achieve requirements for repeat and annual revisit samples reduces
the precision of estimates of index period and annual variance components, and may
impact the representativeness of these estimates because of possible bias in the set of
measurements obtained.
5.6.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
It is important to keep the individual benthic macroinvertebrate subsamples wet while in the
sieve bucket as each subsequent subsample is collected. It is recommended that teams
carry a sample bottle containing a small amount of ethanol with them to enable them to
immediately preserve larger predaceous invertebrates such as hellgrammites and water
beetles. Doing so will help reduce the chance that other specimens will be consumed or
damaged prior to the end of the field day. Once the composite sample from all stations is
sieved and reduced in volume, store in a 1-liter jar and preserve with 95% ethanol. Do
not fill jars more than 1/3 full of material to reduce the chance of organisms being
damaged or crushed during transport. The composite sample is stored in a cool, dark
place until it is shipped to the analytical laboratory.
Check the sample label to ensure that all written information is complete and legible. Place a
strip of clear packing tape over the label and bar code, covering the label completely.
Record the bar code assigned to the benthic sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any
problems in collecting the sample or if conditions occur that may affect sample integrity.
Recheck all forms and labels for completeness and legibility. Additionally, duplicate
(replicate) samples will be collected at 10% of sites sampled. Specific quality control
measures are listed in Table 5.6-3 for field operations.
Table 5.6-3. Sample collection and field processing quality control: benthic indicator
Check integrity of sample containers and labels
  Description and Requirements: Clean, intact containers and labels
  Corrective Action: Obtain replacement supplies

Sample Collection
  Description and Requirements: Keep the individual benthic macroinvertebrate subsamples wet while in the sieve bucket as each subsequent subsample is collected.
  Corrective Action: Qualify samples

Sample Collection
  Description and Requirements: Carry a small amount of ethanol to immediately preserve larger predaceous invertebrates to reduce the chance that other specimens will be consumed or damaged.

Sample Processing (field)
  Description and Requirements: Preserve with 95% ethanol. Fill jars no more than 1/3 full of material to reduce the chance of organisms being damaged.
  Corrective Action: Discard and recollect sample

Sample Storage (field)
  Description and Requirements: Store benthic samples in a cool, dark place until shipment to the analytical lab

Duplicate samples
  Description and Requirements: Duplicate samples must be collected at 10% of sites

Holding time
  Description and Requirements: Preserved samples can be stored indefinitely; periodically check jars and change the ethanol if sample material appears to be degrading.
  Corrective Action: Qualify samples
5.6.6 Quality Control Procedures: Laboratory Operations
5.6.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.6-4. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.
Table 5.6-4. Sample receipt and processing quality control: benthic macroinvertebrate indicator
Sample Log-in
  Description and Requirements: Upon receipt of a sample shipment, laboratory personnel check the condition and identification of each sample against the sample tracking record.
  Corrective Action: Discrepancies, damaged, or missing samples are reported to the IM staff and indicator lead

Sample Storage
  Corrective Action: Qualify sample as suspect for all analyses

Holding time
  Corrective Action: Qualify samples

Preservation
  Corrective Action: Qualify samples
5.6.6.2 Analysis of Samples
Specific quality control measures are listed in Table 5.6-5 for laboratory operations. Figure 11
presents the general process for analyzing benthic invertebrate samples. Specific
quality control measures are listed in Table 5.6-6 for laboratory identification operations.
Table 5.6-5. Laboratory Quality Control: benthic macroinvertebrate sample processing
SAMPLE PROCESSING (PICK AND SORT)

Sample residuals examined by different analyst within lab
  Frequency: 10% of all samples completed per analyst
  Acceptance Criteria: Efficiency of picking >90%
  Corrective Action: If <90%, examine all residuals of samples by that analyst and retrain analyst

Sorted samples sent to independent lab
  Frequency: 10% of all samples
  Acceptance Criteria: Accuracy of contractor laboratory picking and identification >90%
  Corrective Action: If picking accuracy <90%, all samples in batch will be reanalyzed by contractor
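The pick-and-sort check described above estimates picking efficiency from organisms a second analyst recovers in the residual. The sketch below is an assumed form of that calculation (the governing formula is Equation 10 in Section 2, not shown here), with invented counts for illustration.

```python
# Illustrative sketch of the Table 5.6-5 sort-and-pick QC check: a second
# analyst re-picks the residual, and efficiency is estimated from what the
# first analyst recovered versus the combined total. This is an assumption
# about the calculation, not a reproduction of Equation 10.

def picking_efficiency(originally_picked, found_in_residual):
    total = originally_picked + found_in_residual
    return 100.0 * originally_picked / total if total else 100.0

def qc_action(efficiency, threshold=90.0):
    """Return the corrective action implied by the acceptance criterion."""
    if efficiency >= threshold:
        return "pass"
    return "examine all residuals by that analyst and retrain"

# Hypothetical sample: 485 organisms picked, 15 found on recheck.
eff = picking_efficiency(originally_picked=485, found_in_residual=15)
print(f"{eff:.1f}% -> {qc_action(eff)}")
```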
Table 5.6-6: Laboratory Quality Control: benthic macroinvertebrate taxonomic identification
Duplicate identification by different taxonomist within lab
  Frequency: 10% of all samples completed per laboratory
  Acceptance Criteria: Efficiency >85%
  Corrective Action: If <85%, reidentify all samples completed by that taxonomist

Independent identification by outside taxonomist
  Frequency: All uncertain taxa
  Acceptance Criteria: Uncertain identifications to be confirmed by expert in particular taxa
  Corrective Action: Record both tentative and independent IDs

Use widely/commonly accepted taxonomic references
  Frequency: For all identifications
  Acceptance Criteria: All keys and references used must be on bibliography prepared by another laboratory
  Corrective Action: If other references desired, obtain permission to use from Project QA Officer

Prepare reference collection
  Frequency: Each new taxon per laboratory
  Acceptance Criteria: Complete reference collection to be maintained by each individual laboratory
  Corrective Action: Lab Manager periodically reviews data and reference collection to ensure reference collection is complete and identifications are accurate
[Figure 11 is a flow chart of laboratory processing activities for the benthic indicator: the
sample is received and logged, then cleaned and spread on a gridded screen. Three grid
squares (A-F; 1-6) are randomly selected and placed in a white picking tray, and grids are
sorted under up to 10x magnification until the target number of organisms is achieved,
choosing and respreading new grid squares as needed. If more than 600 organisms would be
reached with the final grid, that grid is respread and sorted until the target number is
achieved. Organisms are verified under a microscope, and the unsorted sample remains and
sort residue are bottled and labeled. Subsampling should result in at least three labeled
jars: one jar of organisms, one jar of sort residue, and one jar of unsorted remains.]
Figure 11: Laboratory Processing Activities for the benthic indicator
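The grid-subsampling loop in Figure 11 can be sketched in code. The version below is a simplified illustration: it randomly selects grid squares and accumulates organisms until the 500-organism target is reached, with uniform simulated grid counts. It omits the respreading of the final grid that the actual protocol requires when the last square would overshoot the target.

```python
# Illustrative sketch of the Figure 11 subsampling procedure: randomly
# select grid squares (rows A-F, columns 1-6) and sort them until the
# 500-organism target is reached. Grid counts are simulated.
import random

def subsample_grids(grid_counts, target=500, seed=1):
    """grid_counts: dict mapping (row, col) -> organisms in that square."""
    rng = random.Random(seed)
    squares = list(grid_counts)
    rng.shuffle(squares)           # random systematic selection of squares
    sorted_squares, total = [], 0
    for sq in squares:
        if total >= target:        # stop once the target count is achieved
            break
        sorted_squares.append(sq)
        total += grid_counts[sq]
    return sorted_squares, total

# 36 grid squares, each assumed to hold 20 organisms for this example.
grids = {(row, col): 20 for row in "ABCDEF" for col in range(1, 7)}
picked, count = subsample_grids(grids)
print(len(picked), count)
```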
5.6.7 Data Management, Review, and Validation
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.6-7. The Project Facilitation Team is ultimately responsible for ensuring the
validity of the data, although performance of the specific checks may be delegated to
other staff members. Once data have passed all acceptance requirements,
computerized data files are prepared in a format specified for the NRSA project by
EMAP and copied onto a CD. The CDs are transferred to the NRSA IM Coordinator
(Marlys Cappaert) for entry into a centralized database. A hard copy output of all files
accompanies each data CD.
A reference specimen collection is prepared as new taxa are encountered in samples. This
collection consists of preserved specimens in vials and mounted on slides and is
provided to the responsible EPA laboratory as part of the analytical laboratory contract
requirements. The reference collection is archived at the responsible EPA laboratory.
Sample residuals, vials, and slides are archived by each laboratory until the NRSA Project
Leader has authorized, in writing, the disposition of samples. All raw data (including field
data forms and bench data recording sheets) are retained in an organized fashion
indefinitely or until written authorization for disposition has been received from the NRSA
Project Leader.
Table 5.6-7: Data review, verification, and validation quality control: benthic indicator
Taxonomic "reasonableness" checks
  Frequency: All data sheets
  Acceptance Criteria: Genera known to occur in given stream or river conditions or geographic area
  Corrective Action: Second or third identification by expert in that taxon
5.6.8 Data Analysis Plan
Specific research issues to be addressed from this year's activities and the ecological attributes
or metrics associated with the benthic indicator are summarized in Table 5.6-8.
Table 5.6-8. Research issues: benthic indicator
Variance Estimates
  Design Strategy: Obtain estimates of variance components from duplicate samples and revisits to sites.

Indicator Development and Evaluation
  Design Strategy: Identify the best set of ecological attributes or metrics that are broadly applicable to assessing biological condition and are informative as to detection and characterization of impairment. Candidate attributes are selected measures of richness, O/E, and representation of sensitive taxa. These are based on EPA's biological condition gradient attributes as part of the aquatic life use initiative.

Methods Comparability
  Design Strategy: Use standardized guidelines (from the NWQMC Methods and Data Comparability Board) for methods comparability studies (to measure precision and sensitivity along environmental and disturbance gradients), and select ecological attributes best suited to compare performance of methods (e.g., compositional metrics, or richness adjusted for reference).

Threshold Development for Assessment
  Design Strategy: Develop general expectations for each attribute (for each ecoregion) from a collection of reference sites sampled with NRSA methods. Supplement with information from states and existing data where methods differences are not an issue. Combining data for an integrated assessment is based on minimizing sampling bias. Explore the use of thresholds based on percent difference, e.g., 20% deviation from reference, as a consistent means of evaluating biological condition across ecoregions.

Biological Condition
  Design Strategy: Develop an ordinal scale related to a biological condition gradient to reflect varying degrees of quality.
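The percent-difference threshold explored above (e.g., 20% deviation from reference) reduces to a simple comparison. The sketch below is a hypothetical illustration of that screening rule; the metric values and reference expectation are invented, and the real assessment would apply ecoregion-specific expectations.

```python
# Illustrative sketch of a percent-difference screening rule: flag a site
# metric that deviates more than 20% from the reference-site expectation
# for its ecoregion. Values are hypothetical.

def deviates_from_reference(value, reference_expectation, threshold=0.20):
    """Return True when |value - expectation| exceeds threshold * expectation."""
    deviation = abs(value - reference_expectation) / reference_expectation
    return deviation > threshold

flag_low = deviates_from_reference(value=18, reference_expectation=25)   # 28% below reference
flag_ok = deviates_from_reference(value=23, reference_expectation=25)    # within 20%
print(flag_low, flag_ok)
```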
5.7 Fish Community Structure
5.7.1 Introduction
Monitoring of the fish assemblage is an integral component of many water quality management
programs. The assessment will measure specific attributes of the overall structure and
function of the ichthyofaunal community to evaluate biological integrity and water quality.
5.7.2 Sampling Design
The fish sampling method is designed to provide a representative sample of the fish community,
collecting all but the rarest fish inhabiting the site. It is assumed to accurately represent
species richness, species guilds, relative abundance, and anomalies. The goal is to
collect fish community data that will allow the calculation of an Index of Biotic Integrity
(IBI) and Observed/Expected (O/E) models. Backpack or barge electrofishing is the
preferred method. If electrofishing is not possible due to safety concerns, high turbidity,
or extremes in conductivity, complete the "Not Fished" section of the field form and
comment why.
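The fish community data support Observed/Expected (O/E) scoring of the kind mentioned above: O is the number of expected taxa actually captured and E is the sum of modeled capture probabilities. The sketch below assumes a common O/E convention (only taxa with capture probability at or above 0.5 contribute); the taxon names and probabilities are illustrative, not from an actual NRSA model.

```python
# Illustrative O/E score sketch: O = count of expected taxa observed,
# E = sum of modeled capture probabilities. The pc >= 0.5 cutoff is a
# common convention assumed here, not an NRSA specification.

def oe_score(captured_taxa, predicted_probs, pc=0.5):
    """predicted_probs: taxon -> modeled probability of capture at the site."""
    expected = {t: p for t, p in predicted_probs.items() if p >= pc}
    observed = sum(1 for t in expected if t in captured_taxa)
    total_expected = sum(expected.values())
    return observed / total_expected if total_expected else float("nan")

# Hypothetical site: three taxa expected above the cutoff, two captured.
probs = {"Catostomus": 0.9, "Rhinichthys": 0.8, "Salmo": 0.6, "Cottus": 0.3}
score = oe_score({"Catostomus", "Rhinichthys"}, probs)
print(round(score, 2))
```

Scores near 1 indicate the site supports roughly the taxa a reference-condition model predicts; scores well below 1 indicate taxa loss.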
5.7.3 Sampling and Analytical Methods
5.7.3.1 Wadeable Streams
Streams with mean wetted widths less than 12.5 m will be electrofished in their entirety,
covering all available habitats. However, the time and effort necessary to sample
reaches greater than 12.5 m wide is prohibitive in the context of the survey, so
sub-sampling is required. Sub-sampling is defined by 5-10 sampling zones, each starting at a
transect. In all instances, electrofishing in wadeable systems should proceed in an
upstream direction using a single anode. Identification and processing of fish should
occur at the completion of each transect.
5.7.3.2 Non-wadeable Streams
The time and effort necessary to sample the reach in its entirety is prohibitive in the context of
the survey, thus sub-sampling is required. Electrofishing will occur in a downstream
direction at all habitats along alternating banks over a length of 20 times the mean
channel width (5 transects - A through E). Collection of a minimum of 500 fish is
required. If this target is not attained, sampling will continue until 500 individuals are
captured or the downstream extent of the site (transect K) is reached. Identification and
processing of fish should occur at the completion of each transect.
5.7.4 Quality Assurance Objectives
MQOs are given in Table 5.7-1. General requirements for comparability and representativeness
are addressed in Section 2. Precision is calculated as percent efficiency, estimated from
independent identifications of organisms in randomly selected samples. The MQO for
accuracy is evaluated by having individual specimens representative of selected taxa
identified by recognized experts.
Table 5.7-1. Measurement data quality objectives: fish community indicator
Variable or Measurement    Precision    Accuracy    Completeness
NA = not applicable
a Taxonomic accuracy, as calculated using Equation 10 in Section 2.
5.7.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements.
Review all collecting permits to determine if any sampling restrictions are in effect for the site. In
some cases, you may have to cease sampling if you encounter certain listed species.
An experienced fisheries biologist sets up the electrofishing equipment. After selecting
the initial voltage setting and pulse rate, the crew starts electrofishing. If fishing success
is poor, increase the pulse width first and then the voltage to sample effectively and
minimize injury and mortality. Increase the pulse rate last to minimize mortality or injury
to large fish. If mortalities occur, first decrease pulse rate, then voltage, then pulse width.
Fishing begins with a cleared clock to document button time. If button time is not
metered, estimate it with a stop watch and flag the data.
Crews may choose to have more than one person holding a net, but no more than one person
should be netting at any one time. To reduce stress and mortality, immobilized fish
should be netted immediately and deposited into a live-well for processing. If fish show
signs of stress (e.g., loss of righting response, gaping, gulping air, excessive
mucus), change the water or stop fishing and initiate processing as soon as
possible. Similarly, State- and Federally-listed threatened or endangered species or
large game fish should be processed and released as they are captured. If periodic
processing is required, fish should be released in a location that prevents the likelihood
of their recapture. For safety, all crew members are required to wear non-breathable
waders and insulated gloves. Polarized sunglasses and caps to aid vision are also
required.
An experienced fisheries biologist will identify the collected fish specimens in the field. All
specimens must be identified by common name as listed in Appendix D of the Field
Operations Manual. The biologist may choose to retain certain specimens for
identification or verification in the laboratory. These samples are retained at the
discretion of the fisheries biologist and are separate from the official voucher specimens
that must be collected at 10% of each field crew's sites to be re-identified by an
independent taxonomist.
Check the sample labels for all voucher and laboratory ID specimens to ensure that all written
information is complete and legible. Place a strip of clear packing tape over the label
and bar code, covering the label completely. Record the bar code assigned to the
voucher sample on the Sample Collection Form. Enter a flag code and provide
comments on the Sample Collection Form if there are any problems in collecting the
samples or if conditions occur that may affect sample integrity. Preserve all voucher
samples with 10% buffered formalin and store them in a sturdy container (i.e., cooler)
until shipment to the analytical laboratory. Recheck all forms for completeness and
legibility. Additionally, duplicate (replicate) samples will be collected at 10% of sites
sampled. A summary of field quality control procedures for the fish community indicator
is presented in Table 5.7-2.
Table 5.7-2. Sample collection and field processing quality control: fish community indicator
Check integrity of sample containers and labels
  Description and Requirements: Clean, intact containers and labels
  Corrective Action: Obtain replacement supplies

Set up electrofishing equipment
  Description and Requirements: An experienced fisheries biologist sets up the unit. If results are poor, adjustments are made to the pulse width and voltage to sample effectively and minimize injury/mortality.

Comparable effort
  Description and Requirements: Reset unit clock to document button time (700 seconds per transect). If button time is not metered, estimate it with a stop watch and flag the data.

Comparable effort
  Description and Requirements: No more than 1 person is netting at any one time.

Field Processing
  Description and Requirements: Immobilized fish are netted immediately and deposited into a livewell. Process before fish show signs of stress. State- or federally-listed threatened or endangered species or large game fish should be processed and released as they are captured.

Field Processing
  Description and Requirements: Fish should be released in a location that prevents the likelihood of their recapture.

Field Processing
  Description and Requirements: The fisheries biologist will identify specimens in the field using a standardized list of common names (App. D of the Field Operations Manual).

Sample Collection
  Description and Requirements: The biologist may retain uncertain specimens for ID or verification in the laboratory. These samples are retained at the discretion of the biologist and are separate from the official voucher specimens that must be collected at 10% of each field crew's sites to be re-identified by an independent taxonomist.

Sample Collection - Taxonomic QC samples
  Description and Requirements: 10% of each field crew's sites are randomly selected for re-identification by an independent taxonomist. A minimum of 1 complete voucher is required for each field taxonomist and will consist of either preserved specimen(s) or digital images representative of all species in the sample, even common species.

Sample Preservation
  Description and Requirements: Fish retained for lab ID or vouchers are preserved with 10% buffered formalin. All personnel must read the MSDS (App. D of QAPP).

Safety
  Description and Requirements: All crew members are required to wear insulated gloves and non-breathable waders. Caps and polarized sunglasses to aid vision are also required.

Safety
  Description and Requirements: Wear vinyl or nitrile gloves and safety glasses, and always work in a well-ventilated area.

Duplicate samples
  Description and Requirements: Duplicate samples must be collected at 10% of sites
5.7.5.1 Sample Preservation
Fish retained for laboratory identification or as vouchers should be preserved in the field with
10% buffered formalin. The specimens should be placed in a large sample jar
containing a 10% buffered formalin solution in a volume equal to or greater than the total
volume of specimens. Individuals larger than 200 mm in total length should be slit along
the right side of the fish in the lower abdominal cavity to allow penetration of the solution.
All personnel handling 10% buffered formalin must read the MSDS (Appendix D).
Formalin is a potential carcinogen and should be used with extreme caution, as vapors
and solution are highly caustic and may cause severe irritation on contact with skin,
eyes, or mucous membranes. Wear vinyl or nitrile gloves and safety glasses, and always
work in a well-ventilated area.
5.7.5.2
Laboratory Identification
Fish that are difficult to identify in the field are kept for laboratory identification or to verify
difficult field identifications. Table 6.5-5 in the Field Operations Manual outlines the
laboratory identification process and the completion of the Fish Collection Form. Field
crews must retain the Fish Collection Form(s) for all sites until the laboratory
identification process is complete. Crews should retain the fish verification samples;
contact your regional EPA coordinator if you cannot store the samples at your facility.
5.7.5.3
Voucher Specimens
Approximately 10% of each field crew's sites will be randomly pre-selected for re-identification
by an independent taxonomist. A minimum of one complete voucher is required for each
person performing field taxonomy and will consist of either preserved specimen(s) or
digital images representative of all species in the sample, even common species.
Multiple specimens per species can be used as vouchers, if necessary (i.e., to document
different life or growth stages, or sexes). Note that a complete sample voucher does not
mean that all individuals of each species will be vouchered, only enough so that
independent verification can be achieved.
For species that are retained, specimen containers should be labeled with the sample number,
site ID number, site name, and collection date. There should be no taxonomic
identification labels in or on the container.
Digital images should be taken as voucher documentation for species that are recognized as
Rare, Threatened, or Endangered (RTE) - they should not be harmed or killed. Very
common and well-known, or very large-bodied species should also be recorded by
digital images; however, these can be preserved at the discretion of the taxonomist.
Labeling, within the image, should be similar to that used for preserved samples and not
include taxonomic identification. Guidance for naming photo files is provided below in the
photovouchering section.
5.7.5.4 Photovouchering
Digital imagery should be used for fish species that cannot be retained as preserved specimens
(e.g., RTE species; very large bodied; or very common). Views appropriate and
necessary for an independent taxonomist to accurately identify the specimen should be
the primary goal of the photography. Additional detail for these guidelines is provided in
Stauffer et al. (2001), and is provided to all field crews as a handout.
The recommended specifications for digital images to be used for photovouchering include: 16-
bit color at a minimum resolution of 1024x768 pixels; macro lens capability allowing for
images to be recorded at a distance of less than 4 cm; and built-in or external flash for
use in low-light conditions. Specimens should occupy as much of the field of view as
possible, and the use of a fish board is recommended to provide a reference to scale
(i.e., ruler or some calibrated device) and an adequate background color for
photographs. Information on Station ID, Site Name, Date and a unique species ID (i.e.,
A, B, C, etc.) should also be captured in the photograph, so that photos can be identified
if file names become corrupted. All photovouchered species should have at least a full-
body photo (preferably of the left side of the fish) and other zoom images as necessary
for individual species, such as lateral line, ocular/oral orientation, fin rays, gill arches, or
others. It may also be necessary to photograph males, females, or juveniles.
Images should be saved in medium- to high-quality jpeg format, with the resulting file name of
each picture noted on the Fish Collection Form. It is important that time and date
stamps are accurate as this information can also be useful in tracking the origin of
photographs. It is recommended that images stored in the camera be transferred to a
PC or storage device at the first available opportunity. At this time the original file should
be renamed to follow the logic presented below:
F01_CT003_20080326.jpg
where F=fish, 01=tag number, CT003=state (Connecticut) and site number, and 20080326=date
(yyyymmdd).
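The file-naming rule above lends itself to a mechanical check before images are transferred. The sketch below is illustrative only and is not part of the NRSA procedures; in particular, it assumes the state-and-site field is always two letters followed by three digits, as in the example.

```python
import re
from datetime import datetime

# Voucher naming convention described above:
# F<2-digit tag>_<state+site>_<yyyymmdd>.jpg, e.g. F01_CT003_20080326.jpg
# (The two-letters-plus-three-digits site field is an assumption.)
VOUCHER_NAME = re.compile(r"^F(\d{2})_([A-Z]{2}\d{3})_(\d{8})\.jpg$")

def parse_voucher_name(filename):
    """Return (tag, site, date) if the name follows the convention, else None."""
    match = VOUCHER_NAME.match(filename)
    if match is None:
        return None
    tag, site, date_str = match.groups()
    # strptime raises ValueError for impossible dates such as 20081345.
    date = datetime.strptime(date_str, "%Y%m%d").date()
    return tag, site, date
```

A crew could run such a check when renaming files at the end of each sampling day.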
Field crews should maintain files for the duration of the sampling season. Notification regarding
the transfer of all images to the existing database will be provided at the conclusion of
the sampling.
5.7.6 Quality Control Procedures: Laboratory Operations (Voucher Specimens)
5.7.6.1 Sample Receipt and Processing
QC activities associated with sample receipt and processing are presented in Table 5.7-3. The
communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.
Table 5.7-3. Sample receipt and processing quality control: fish community indicator
Quality Control Activity: Sample Log-in
  Description and Requirements: Upon receipt of a sample shipment, laboratory personnel
      check the condition and identification of each sample against the sample tracking
      record.
  Corrective Action: Discrepancies, damaged, or missing samples are reported to the IM
      staff and indicator lead.

Quality Control Activity: Sample Storage
  Corrective Action: Qualify sample as suspect for all analyses.

Quality Control Activity: Holding time
  Corrective Action: Qualify samples.

Quality Control Activity: Preservation
  Corrective Action: Qualify samples.
5.7.6.2 Analysis of Samples
Specific quality control measures are listed in Table 5.7-4 for laboratory operations.
Table 5.7-4: Laboratory Quality Control: fish voucher taxonomic identification
Check or Sample Description: Independent identification by outside taxonomist
  Frequency: Complete voucher collection for 10% of all sites
  Acceptance Criteria: Uncertain identifications to be confirmed by expert in particular taxa
  Corrective Action: If <85%, reidentify all samples completed by that taxonomist

Check or Sample Description: Use widely/commonly accepted taxonomic references
  Frequency: For all identifications
  Acceptance Criteria: All keys and references used must be on bibliography prepared by
      another laboratory
  Corrective Action: If other references are desired, obtain permission to use them from
      the Project QA Officer
5.8 Physical Habitat Quality
5.8.1 Introduction
Naturally occurring differences in physical habitat structure and associated hydraulic
characteristics among surface waters contribute to much of the observed variation in
species composition and abundance within a zoogeographic province. Structural
complexity of aquatic habitats provides the variety of physical and chemical conditions to
support diverse biotic assemblages and maintain long-term stability. Anthropogenic
alterations of riparian physical habitat, such as channel alterations, wetland drainage,
grazing, agricultural practices, weed control, and streambank modifications such as
revetments or development, generally act to reduce the complexity of aquatic habitat
and result in a loss of species and ecosystem degradation.
For the NRSA, indicators derived from data collected on physical habitat quality will be used to
help explain or characterize stream and river conditions relative to biological response
and trophic state indicators. Specific groups of physical habitat attributes important in
stream and river ecology include: channel dimensions, gradient, and substrate; habitat
complexity and cover; riparian vegetation cover and structure; anthropogenic alterations;
and channel-riparian interaction (Kaufmann, 1993). Overall objectives for this indicator
are to develop quantitative and reproducible indices, using both multivariate and
multimetric approaches, to classify streams and rivers and to monitor biologically
relevant changes in habitat quality and intensity of disturbance.
5.8.2 Sampling Design
As the physical habitat indicator is based on field measurements and observations, there is no
sample collection associated with this indicator. Field crews are provided with 1:24,000
maps with the midpoint (index site) of the stream reach marked. At NRSA sites, eleven
cross-sectional measurement transects are spaced at equal intervals proportional to
baseflow channel width, thereby scaling the sampling reach length and resolution in
proportion to stream and river size. A systematic spatial sampling design is used to
minimize bias in the selection of the measurement sites. Additional measurements are
made at equally spaced intervals between the cross-sectional sites.
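As a rough illustration of width-proportional spacing, the sketch below lays out eleven equally spaced transect stations. The 40-times-width reach length and the 150 m minimum used here are assumed values for illustration only; the Field Operations Manual governs the actual reach-layout rules.

```python
def transect_stations(mean_wetted_width_m, n_transects=11,
                      width_multiplier=40, min_reach_m=150.0):
    """Distances (m) of equally spaced transects from the downstream
    end of a reach whose length scales with channel width.

    The 40x-width multiplier and 150 m floor are illustrative
    assumptions, not values taken from this QAPP.
    """
    reach_length = max(width_multiplier * mean_wetted_width_m, min_reach_m)
    spacing = reach_length / (n_transects - 1)
    return [round(i * spacing, 1) for i in range(n_transects)]
```

For a stream with a 10 m mean wetted width this yields a station every 40 m along a 400 m reach.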
5.8.3 Sampling Methodologies
Field Measurements: Field measurements, observations, and associated methodology for the
protocol are summarized in Table 5.8-1. Detailed procedures for completing the
protocols are provided in the field operations manual; equipment and supplies required
are also listed. All measurements and observations are recorded on standardized forms
which are later entered into the central EMAP surface waters information management
system at WED-Corvallis.
There are no sample collection or laboratory analyses associated with the physical habitat
measurements.
Table 5.8-1. Field measurement methods: physical habitat indicator
Variable or Measurement (Units; QA Class): Summary of Method

THALWEG PROFILE
  Thalweg depth (cm; C): Measure maximum depth at 100-150 points for wadeable or 200
      points for non-wadeable rivers along the reach with a surveyor's rod or sonar
      equipment.
  Wetted width (0.1 m; C): Measure wetted width with a range finder or measuring tape on
      a line perpendicular to the mid-channel line.
  Habitat class (none; N): Visually estimate channel habitat using defined class
      descriptions.
  References: Frissell et al., 1986

WOODY DEBRIS TALLY
  Large woody debris (number; N): Use pole drag and visually estimate amount of woody
      debris in baseflow channel using defined class descriptions.
  References: Robison and Beschta, 1990

CHANNEL AND RIPARIAN CROSS-SECTIONS
  Slope and bearing (% / degrees; C): Backsight between cross-section stations using
      clinometer, rangefinder compass, and tripod.
  Substrate size (mm; C): At 5 points on cross section, estimate size of one selected
      particle using defined class descriptions.
  Bank angle (degrees; N): Use clinometer and surveyor's rod to measure angle.
  Bank incision (0.1 m; N): Visually estimate height from water surface to first terrace
      of floodplain.
  Bank undercut (cm; N): Measure horizontal distance of undercut.
  Bankfull width (0.1 m; N): Measure width at top of bankfull height.
  Bankfull height (0.1 m; N): Measure height from water surface to estimated water
      surface during bankfull flow.
  Canopy cover (points; C): Count points of intersection on densiometer at specific
      points and directions on cross-section.
  Riparian vegetation structure (percent; N): Observations of ground cover, understory,
      and canopy types and coverage of area 5 m on either side of cross section and 10 m
      back from bank.
  Fish cover, algae, macrophytes (percent; C): Visually estimate in-channel features 5 m
      on either side of cross section.
  Human influence (none; C): Estimate presence/absence of defined types of anthropogenic
      features.
  References: Robison & Kaufmann, in prep.; Stack, 1989; Wolman, 1954; Bain et al., 1985;
      Plafkin et al., 1989; Platts et al., 1983; Lemmon, 1957; Mulvey et al., 1992

STREAM DISCHARGE
  Discharge (m3/s or L/s; N): Velocity-Area method, Portable Weir method, or timed bucket
      discharge method.
  References: Linsley et al., 1982
5.8.4 Quality Assurance Objectives
Measurement data quality objectives (measurement DQOs or MQOs) are given in Table 5.8-2.
General requirements for comparability and representativeness are addressed in
Section 2. The MQOs given in Table 5.8-2 represent the maximum allowable criteria for
statistical control purposes. Precision is determined from results of revisits by a different
crew (field measurements) and by duplicate measurements by the same crew on a
different day.
The completeness objectives are established for each measurement per site type (e.g., NRSA
sites, revisit sites, state comparability sites). Failure to achieve the minimum
requirements for a particular site type results in regional population estimates having
wider confidence intervals. Failure to achieve requirements for repeat and annual revisit
samples reduces the precision of estimates of index period and annual variance
components, and may impact the representativeness of these estimates because of
possible bias in the set of measurements obtained.
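A paired revisit measurement can be screened against the ±10% precision objective with a simple calculation. The relative percent difference used below is one common formulation; this QAPP does not prescribe an exact formula, so the sketch is illustrative only.

```python
def relative_percent_difference(first_visit, revisit):
    """Relative percent difference between a measurement and its
    revisit value (illustrative; not a formula specified in this QAPP)."""
    mean = (first_visit + revisit) / 2.0
    if mean == 0:
        return 0.0
    return abs(first_visit - revisit) / mean * 100.0

def meets_precision_mqo(first_visit, revisit, mqo_percent=10.0):
    """True when the pair of values satisfies the +/-10% objective."""
    return relative_percent_difference(first_visit, revisit) <= mqo_percent
```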
Table 5.8-2. Measurement data quality objectives: physical habitat indicator
Variable or Measurement                    Precision   Accuracy   Completeness
Field Measurements and Observations        ±10%*       NA         90%
Map-Based Measurements                     ±10%        NA         100%

NA = not applicable. *Not for RBP measures.
5.8.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All sampling teams will be required to view the training materials, read the
QAPP, and verify that they understand the procedures and requirements. Specific
quality control measures are listed in Table 5.8-3 for field measurements and
observations.
Table 5.8-3. Field quality control: physical habitat indicator
Check Description: Check totals for cover class categories (vegetation type, fish cover)
  Frequency: Each transect
  Acceptance Criteria: Sum must be reasonable (best professional judgment)
  Corrective Action: Repeat observations

Check Description: Check completeness of thalweg depth measurements
  Frequency: Each site
  Acceptance Criteria: Depth measurements for all sampling points
  Corrective Action: Obtain best estimate of depth where actual measurement not possible

Check Description: Check calibration of multiprobe
  Frequency: Prior to each sampling day
  Acceptance Criteria: Specific to instrument
  Corrective Action: Adjust and recalibrate, redeploy gear
5.8.6 Quality Control Procedures: Laboratory Operations
There are no laboratory operations associated with this indicator.
5.8.7 Data Management, Review, and Validation
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.8-4. The Indicator Lead is ultimately responsible for ensuring the validity of the
data, although performance of the specific checks may be delegated to other staff
members. All raw data (including all standardized forms and logbooks) are retained in an
organized fashion for seven years or until written authorization for disposition has been
received from the NRSA Project Coordinator.
Table 5.8-4. Data validation quality control: physical habitat indicator
Check Description: Estimate precision of measurements based on repeat visits by different
    crews
  Frequency: At least 2 teams visit stream and river 1 time each at 10% of streams and
      rivers (may be same team or different teams)
  Acceptance Criteria: Measurements should be within 10 percent
  Corrective Action: Review data for reasonableness; determine if acceptance criteria
      need to be modified
5.9 Fish Tissue
5.9.1 Introduction
Fish are time-integrating indicators of persistent pollutants, and contaminant bioaccumulation in
fish tissue has important human and ecological health implications. Contaminants in fish
pose risks to human consumers and to piscivorous wildlife. The NRSA fish tissue
indicator will provide information on the national distribution of selected persistent,
bioaccumulative, and toxic (PBT) chemical residues (e.g., mercury and organochlorine
pesticides) in predator fish species from large (non-wadeable) streams and rivers of the
conterminous United States. Recent studies show that an emerging group of
contaminants, pharmaceuticals and personal care products (PPCPs), can persist
through the wastewater treatment process and occur in municipal effluent, surface
water, and sediments. However, data on the accumulation of PPCPs in fish are scarce.
NRSA fish tissue samples will be used to address this data gap. Samples collected from
a national statistical subset of NRSA urban sites (approximately 150 sites) located on
large (non-wadeable) rivers will be analyzed for PPCPs.
The fish tissue indicator procedures are based on EPA's National Study of Chemical Residues
in Lake Fish Tissue (USEPA 2000a) and EPA's Guidance for Assessing Chemical
Contaminant Data for Use in Fish Advisories, Volume 1 (Third Edition) (USEPA 2000b).
5.9.2 Sampling Design
The NRSA crews will collect fish for the tissue indicator from all non-wadeable study reaches
sampled for the fish community structure indicator (Section 5.7). Fish tissue samples
must consist of a composite of fish (i.e., five individuals of one predator species that will
collectively provide greater than 500 grams of fillet tissue) from each site. Tissue
sampling may require additional effort (temporally and/or spatially) beyond that of the
fish community structure sampling. Fish retained for the tissue indicator may be
collected from anywhere between site transects A and K.
Field teams will consist of one experienced fisheries biologist and one field technician. The
experienced on-site fisheries biologist will select the most appropriate electrofishing gear
type(s) for a particular site. The appropriate sampling equipment will be based on the
size/depth of each site, and deployment will target recommended predator species
(Table 5.9.1). Accurate taxonomic identification is essential to prevent mixing of species
within composites. Five fish will be collected per composite at each site, all of which
must be large enough to provide sufficient tissue for analysis (i.e., 500 grams of fillets,
collectively). Fish in each composite must all be of the same species, satisfy legal
requirements of harvestable size (or be of consumable size if there are no harvest
limits), and be of similar size so that the smallest individual in the composite is no less
than 75% of the total length of the largest individual. If the recommended target species
are unavailable, the on-site fisheries biologist will select an alternative species (i.e., a
predator species that is commonly consumed in the study area, with specimens of
harvestable or consumable size, and in sufficient numbers to yield a composite).
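The composite rules above (five fish of one species, with the smallest individual at least 75% of the total length of the largest) can be expressed as a simple pre-shipment check. The sketch below is illustrative only and omits the harvestable-size and 500-gram fillet requirements.

```python
def composite_is_valid(species, lengths_mm,
                       required_count=5, min_length_ratio=0.75):
    """Check the same-species and 75%-length rules for one composite.

    species    -- species name recorded for each retained fish
    lengths_mm -- total length of each fish, in millimeters
    """
    if len(species) != required_count or len(lengths_mm) != required_count:
        return False                      # need exactly five fish
    if len(set(species)) != 1:
        return False                      # all fish must be one species
    # Smallest fish must be at least 75% of the largest fish's length.
    return min(lengths_mm) >= min_length_ratio * max(lengths_mm)
```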
Table 5.9.1. Recommended target species for fish tissue collection (in order of preference)
at non-wadeable sites

Common name (Family; Scientific name) - Length guideline (estimated minimum)
1. Largemouth bass (Centrarchidae; Micropterus salmoides) - ~280 mm
2. Smallmouth bass (Centrarchidae; Micropterus dolomieu) - ~300 mm
3. Black crappie (Centrarchidae; Pomoxis nigromaculatus) - ~330 mm
4. White crappie (Centrarchidae; Pomoxis annularis) - ~330 mm
5. Walleye/sauger (Percidae; Sander vitreus / S. canadensis) - ~380 mm
6. Yellow perch (Percidae; Perca flavescens) - ~330 mm
7. White bass (Percichthyidae; Morone chrysops) - ~330 mm
8. Northern pike (Esocidae; Esox lucius) - ~430 mm
9. Lake trout (Salmonidae; Salvelinus namaycush) - ~400 mm
10. Brown trout (Salmonidae; Salmo trutta) - ~300 mm
11. Rainbow trout (Salmonidae; Oncorhynchus mykiss) - ~300 mm
12. Brook trout (Salmonidae; Salvelinus fontinalis) - ~330 mm
5.9.3 Sampling and Analytical Methodologies
The fish tissue sample collection schedule will be consistent with the requirements specified in
this QAPP for all other NRSA indicators with the following exception: replicate fish tissue
samples will be collected at revisit sites only during the first round of sampling. The
sampling teams are responsible for providing fisheries sampling gear and sampling
vessels. Fish selected for compositing should be rinsed in ambient water, handled using
clean nitrile gloves, and placed in clean holding containers (e.g., livewells or buckets).
Each fish of the selected target species should be measured to determine total body
length (i.e., length from the anterior-most part of the fish to the tip of the longest caudal
fin ray when the lobes of the caudal fin are depressed dorsoventrally) recorded in
millimeters. When sufficient numbers of the target species have been identified to make
up a suitable composite (i.e., five individuals meeting the criteria presented above), the
species name, specimen lengths, and all other site sampling information should be
recorded on the fish tissue field form.
After initial processing to determine species and size, each of the five fish found to be suitable
for the composite sample will be individually wrapped in extra heavy-duty aluminum foil
(provided by EPA as solvent-rinsed, oven-baked sheets). A sample identification label
will be completed for each fish specimen. Each foil-wrapped fish and sample
identification label will be placed into waterproof plastic tubing that will be cut to fit the
specimen (i.e., heavy duty food grade polyethylene tubing provided by EPA), and each
end of the tubing will be sealed with a plastic cable tie. All five individually-wrapped
specimens from each site will be placed in a large plastic composite bag and sealed with
another cable tie.
EPA will provide fish tissue sample packing and shipping supplies (with the exception of dry
ice). A list of equipment and expendable supplies is provided in the NRSA Field
Operations Manual. Following collection, wrapping, and labeling, samples should be
immediately placed on dry ice for shipment. If samples will be carried back to an interim
location to be frozen before shipment, wet ice can be used to transport the samples in
coolers to that location. Each sampling team will ship all fish tissue samples in coolers
on dry ice (i.e., a recommended 50 pounds per cooler) via priority overnight delivery
service to a sample control center designated by EPA. All cooler vent holes must be
taped open to allow gasses to escape, and the cooler lids will be sealed with a custody
seal that has been signed and dated by the collector. The time of sample collection,
relinquishment by the sample team, and time of their arrival at the sample preparation
laboratory must be recorded on the NRSA chain-of-custody form.
5.9.4 Quality Assurance Objectives
The relevant quality objectives for fish tissue sample collection activities are primarily related to
sample handling issues. Types of field sampling data needed for the fish tissue indicator
are listed in Table 5.9.2. Methods and procedures described in this QAPP and the NRSA
Field Operations Manual are intended to reduce the magnitude of the sources of
uncertainty (and their frequency of occurrence) by applying:
• standardized sample collection and handling procedures, and
• use of trained scientists to perform the sample collection and handling activities.
Table 5.9.2. Field Data Types: Fish Tissue Indicator
Variable or Measurement          Measurement Endpoint or Unit
Fish specimen                    Species-level taxonomic identification
Fish length                      Millimeters (mm), total length
Composite classification         Composite identification number
Specimen count classification    Specimen number
5.9.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All fish tissue sampling teams will be required to view the training materials,
read the QAPP, and verify that they understand the procedures and requirements.
Specific quality control measures are listed in Table 5.9-3 for field measurements and
observations.
Table 5.9-3. Field quality control: fish tissue indicator
Quality Control Activity: Check integrity of sample containers and labels
  Description and Requirements: Clean, intact containers and labels
  Corrective Action: Obtain replacement supplies

Quality Control Activity: Set up electrofishing equipment
  Description and Requirements: An experienced fisheries biologist sets up the unit. If
      results are poor, adjustments are made to the pulse width and voltage to sample
      effectively and minimize injury/mortality.

Quality Control Activity: Field Processing
  Description and Requirements: The fisheries biologist will identify specimens in the
      field using a standardized list of common names (App. D of the Field Operations
      Manual).

Quality Control Activity: Sample Collection
  Description and Requirements: The biologist will retain 5 specimens of the same species
      to form the composite sample.

Quality Control Activity: Sample Collection
  Description and Requirements: The length of the smallest fish must be at least 75% of
      the length of the longest fish.
5.9.7 Data Management, Review, and Validation
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.9-4. The Indicator Lead is ultimately responsible for ensuring the validity of the
data, although performance of the specific checks may be delegated to other staff
members. All raw data (including all standardized forms and logbooks) are retained in an
organized fashion for seven years or until written authorization for disposition has been
received from the NRSA Project Coordinator. Once data have passed all acceptance
requirements, computerized data files are prepared in a format specified for the NRSA
project by EMAP and copied onto a CD. The CDs are transferred to the NRSA IM
Coordinator (Marlys Cappaert) for entry into a centralized data base. A hard copy output
of all files accompanies each data CD.
Table 5.9-4. Data validation quality control: fish tissue indicator
Check Description: Duplicate sampling
  Frequency: Duplicate composite samples collected at 10% of sites
  Acceptance Criteria: Measurements should be within 10 percent
  Corrective Action: Review data for reasonableness; determine if acceptance criteria
      need to be modified

Check Description: Taxonomic "reasonableness" checks
  Frequency: All data sheets
  Acceptance Criteria: Genera known to occur in stream or river conditions or geographic
      area
  Corrective Action: Second or third identification by expert in that taxon

Check Description: Composite validity check
  Frequency: All composites
  Acceptance Criteria: Each composite sample must have 5 fish of the same species
  Corrective Action: Indicator lead will review composite data and advise the lab before
      processing begins

Check Description: 75% rule
  Frequency: All composites
  Acceptance Criteria: Length of smallest fish in the composite must be at least 75% of
      the length of the longest fish
  Corrective Action: Indicator lead will review composite data and advise the lab before
      processing begins
5.9.8 Data Analysis Plan
Fish tissue concentration data from laboratory analysis of the fish composite samples will be
reported as percentiles, including the 50th percentile or median concentration, for each
target chemical. Cumulative distribution of fish tissue concentrations for the sampled
population of sites will be estimated using a procedure described by Diaz-Ramos et al.
(1996) entitled, "Estimation Method 1: Cumulative Distribution Function for Proportion of
a Discrete or an Extensive Resource." The estimated proportion (p_C) below a specific
value for a concentration (C) is:

    p_C = ( Σ w_i * x_i ) / ( Σ w_i ),  summing over i = 1, ..., n

where: x_i = 1 if the concentration for the ith lake is below C and equals 0 otherwise,
       w_i = the adjusted weight for the ith lake, and
       n = the total number of lakes sampled.
A cumulative distribution function (CDF) offers an approach to displaying statistical data that
correlates the results to the sampled population. In technical terms, a CDF
characterizes the probability distribution of a random variable. For the tissue indicator,
the random variable is the concentration of a particular chemical in fish tissue.
Variance estimates will be derived using the local neighborhood variance estimator described
by Stevens and Olsen (2003 and 2004). To complete these analyses, R statistical
software (R Development Core Team 2004) and an R contributed library will be utilized
for probability survey population estimation (spsurvey). The R library is available online
at the following Internet address: http://www.epa.gov/nheerl/arm/analysispages/software.
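The proportion estimator can be sketched directly from its definition. This is illustrative only; the actual NRSA analyses use the spsurvey library in R, as noted above.

```python
def weighted_cdf_proportion(concentrations, weights, c):
    """Estimated proportion of the sampled population with concentration
    below c: p_C = sum(w_i * x_i) / sum(w_i), where x_i = 1 when site i
    is below c (a sketch of Diaz-Ramos et al. Estimation Method 1)."""
    numerator = sum(w for conc, w in zip(concentrations, weights) if conc < c)
    return numerator / sum(weights)
```

Evaluating the function over a grid of concentration values traces out the cumulative distribution function.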
5.10 Fecal Indicator: Enterococci
5.10.1 Introduction
The primary function of collecting water samples for Pathogen Indicator Testing is to provide a
relative comparison of fecal pollution indicators for national rivers and streams. The
concentration of Enterococci (the current bacterial indicator for fresh and marine waters)
in a water body correlates with the level of more infectious gastrointestinal pathogens
present in the water body. While some Enterococci are opportunistic pathogens among
immuno-compromised human individuals, the presence of Enterococci is more
importantly an indicator of the presence of more pathogenic microbes (bacteria, viruses
and protozoa) associated with human or animal fecal waste. These pathogens can
cause waterborne illness in bathers and other recreational users through exposure or
accidental ingestion. Disease outbreaks can occur in and around beaches that become
contaminated with high levels of pathogens. Therefore, measuring the concentration of
pathogens present in river and stream water can help assess comparative human health
concerns regarding recreational use.
In this survey, a novel, Draft EPA Quantitative PCR Method (1606) will be used to measure the
concentration of genomic DNA from the fecal indicator group Enterococcus in the water
samples. While neither federal nor state Water Quality Criteria (standards) have been
formally established for the level of Enterococcus DNA in a sample, epidemiological
studies (Wade et al. 2005) have established a strong correlation between Enterococcus
DNA levels and the incidence of highly credible gastrointestinal illness (HCGI) among
swimmers. The Enterococcus qPCR results will serve as an estimate of the
concentration of total (culturable and non-culturable) Enterococci present in the
surveyed rivers and streams for the purpose of comparative assessment. This study
also has the potential to yield invaluable information about the inhibitory effects of water
matrices from the different regions of the nation upon the qPCR assay.
5.10.2 Sampling Design
A single "pathogen" water sample will be collected from one sampling location approximately 1
m offshore, in conjunction with the final physical habitat sampling station location.
5.10.3 Sampling Methods
Sample Collection: At the final physical habitat shoreline station (located approximately 1 m off
shore), a single 1-L water grab sample is collected approximately 6-12 inches below the
surface of the water. Detailed procedures for sample collection and handling are
described in the Field Operations Manual. Pathogen samples must be filtered and the
filters must be folded and frozen in vials within 6 hours of collection.
Analysis: Pathogen samples are filter concentrated, then shipped on dry ice to the New
England Regional Laboratory, where the filter retentates are processed and the DNA
extracts are analyzed using quantitative polymerase chain reaction (qPCR), a genetic
method that quantifies a DNA target via a fluorescently tagged probe, based on methods
developed by the USEPA National Exposure Research Laboratory. Detailed procedures
are contained in the laboratory operations manual. Table 5.10-1 summarizes field and
analytical methods for the pathogen indicator.
-------
National Rivers and Streams Assessment
QA Project Plan
November 2010
Page 43 of 120
Table 5.10-1. Field and laboratory methods: pathogen indicator (Enterococci)

Sample Collection (QA class: C; expected range and/or units: NA)
    Summary of Method: Sterile sample bottle submerged to collect 250-mL sample
    6-12" below surface at 1 m from shore.
    Reference: NRSA Field Operations Manual 2008

Sub-sampling: Sub-sample (& Buffer Blank) (QA class: N; expected range and/or units: NA)
    Summary of Method: 2 x 50-mL sub-samples poured into sterile 50-mL tube after
    mixing by inversion 25 times.
    Reference: NRSA Laboratory Methods Manual 2008

Filtration (QA class: N; expected range and/or units: NA)
    Summary of Method: Up to 50-mL sub-sample filtered through sterile polycarbonate
    filter. Funnel rinsed with minimal amount of buffer. Filter folded, inserted in
    tube, then frozen.
    Reference: NRSA Laboratory Methods Manual 2008

Preservation & Shipment (QA class: C; expected range and/or units: -40°C to +40°C)
    Summary of Method: Batches of sample tubes shipped on dry ice to lab for analysis.
    Reference: NRSA Laboratory Methods Manual 2008

DNA Extraction (Recovery) (QA class: C; expected range and/or units: 10-141%)
    Summary of Method: Bead-beating of filter in buffer containing Extraction Control
    (SPC) DNA. DNA recovery measured.
    Reference: EPA Draft Method 1606 (Enterococcus qPCR)

Method 1606 (Enterococcus & SPC qPCR) (QA class: C; expected range and/or units:
<60 (RL) to >100,000 ENT CCEs/100 mL)
    Summary of Method: 5-uL aliquots of sample extract are analyzed by ENT & Sketa
    qPCR assays along with blanks, calibrator samples & standards. Field and lab
    duplicates are analyzed at 10% frequency. Field blanks analyzed at end of
    testing only if significant detections observed.
    References: EPA Draft Method 1606 (Enterococcus qPCR); NERL NLPS2007 qPCR
    Analytical SOP

C = critical, N = non-critical quality assurance classification.
5.10.4 Quality Assurance Objectives
Measurement quality objectives (MQOs) are given in Table 5.10-2. General requirements for
comparability and representativeness are addressed in Section 2. Precision is
calculated as the percent relative standard deviation (RSD) of replicate
measurements of randomly selected samples. The MQO for accuracy is evaluated
against calibrator samples and standards.
Table 5.10-2. Measurement data quality objectives: pathogen-indicator DNA sequences

Variable or Measurement*: SPC & ENT DNA sequence numbers of Calibrators & Standards by AQM
    Method Precision: RSD = 50%    Method Accuracy: 50%    Completeness: 95%

Variable or Measurement*: ENT CCEs by dCt RQM
    Method Precision: RSD = 70%    Method Accuracy: 35%    Completeness: 95%

Variable or Measurement*: ENT CCEs by ddCt RQM
    Method Precision: RSD = 70%    Method Accuracy: 50%    Completeness: 95%

*AQM = Absolute Quantitation Method; RQM = Relative Quantitation Method;
SPC = Sample Processing Control (Salmon DNA / Sketa); CCEs = Calibrator Cell Equivalents
5.10.5 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid
procedures documented in the standard operating procedures detailed in the NRSA
Field Operations Manual. That quality is enhanced by the training and experience of
project staff and documentation of sampling activities. This QAPP, the NRSA Field
Operations Manual, and training materials will be distributed to all field sampling
personnel. Training sessions will be conducted by EPA to distribute and discuss project
materials. All field sampling teams will be required to view the training materials,
read the QAPP, and verify that they understand the procedures and requirements.
Specific quality control measures are listed in Table 5.10-3 for field measurements and
observations.
It is important that the sample container be completely sterilized and remain unopened until
samples are ready to be collected. Once the sample bottles are lowered to the desired
depth (6-12 in. below the surface), the sample bottles may then be opened and filled.
After filling the 1-L bottle, check the label to ensure that all written information is complete
and legible. Place a strip of clear packing tape over the label and bar code, covering the
label completely. Record the bar code assigned to the pathogen sample on the Sample
Collection Form. Enter a flag code and provide comments on the Sample Collection
Form if there are any problems in collecting the sample or if conditions occur that may
affect sample integrity. All samples should be placed in coolers and maintained on ice
during transport to the laboratory and maintained at 1-4°C during the time interval
before they are filtered for analysis. Recheck all forms and labels for completeness and
legibility.
Field blanks and duplicates will be collected at 10% of sites sampled. In addition, each field
crew should collect a blank sample over the course of the survey as a check on each
crew's aseptic technique and the sterility of test reagents and supplies.
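The 10% field duplicate and blank frequency above could be scheduled in advance, for example by randomly designating QC sites from the sampling draw. The sketch below is a hypothetical helper, not part of the NRSA design documents; the seed value is arbitrary and serves only to make the designation reproducible:

```python
import random

def pick_qc_sites(site_ids, fraction=0.10, seed=2010):
    """Randomly designate ~10% of scheduled sites for field
    duplicates and field blanks (at least one site).

    A fixed seed makes the designation reproducible across crews;
    the seed value itself is arbitrary.
    """
    sites = list(site_ids)
    n = max(1, round(len(sites) * fraction))
    return sorted(random.Random(seed).sample(sites, n))
```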
Table 5.10-3. Sample collection and field processing quality control: fecal indicator

Quality Control Activity: Check integrity of sample containers and labels
    Description and Requirements: Clean, intact containers and labels.
    Corrective Action: Obtain replacement supplies.

Quality Control Activity: Sterility of sample containers
    Description and Requirements: Sample collection bottle and filtering apparatus are
    sterile and must be unopened prior to sampling. Nitrile gloves must be worn
    during sampling and filtering.

Quality Control Activity: Sample collection
    Description and Requirements: Collect sample at the last transect to minimize
    holding time before filtering and freezing.

Quality Control Activity: Sample holding
    Description and Requirements: Sample is held in a cooler on wet ice until filtering.

Quality Control Activity: Field processing
    Description and Requirements: Sample is filtered and filters are frozen on dry ice
    within 6 hours of collection.

Quality Control Activity: Duplicate samples
    Description and Requirements: Duplicate samples must be collected at 10% of sites.

Quality Control Activity: Field blanks
    Description and Requirements: Field blanks must be filtered at 10% of sites.
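The 6-hour holding-time requirement lends itself to an automated check during data review. This is a minimal sketch, assuming collection and filtration timestamps are recorded on the sample forms; the function name is illustrative:

```python
from datetime import datetime, timedelta

# Pathogen samples must be filtered and frozen within 6 h of collection
MAX_HOLD = timedelta(hours=6)

def holding_time_ok(collected, filtered):
    """Flag samples whose filtration fell outside the 6-hour window."""
    return timedelta(0) <= (filtered - collected) <= MAX_HOLD
```

A sample collected at 09:00 and filtered at 14:30 the same day passes; one filtered after 15:00 would be flagged for a qualifier code.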
5.10.6 Quality Control Procedures: Laboratory Operations
Specific quality control measures are listed in Table 5.10-4 for laboratory operations.
Table 5.10-4. Laboratory quality control: pathogen-indicator DNA sequences

SAMPLE PROCESSING

Check or Sample: Re-process sub-samples (lab duplicates)
    Frequency: 10% of all samples completed per laboratory
    Acceptance Criteria: Percent congruence <70% RSD
    Corrective Action: If >70%, re-process additional sub-samples.

qPCR ANALYSIS

Check or Sample: Duplicate analysis by different biologist within lab
    Frequency: 10% of all samples completed per laboratory
    Acceptance Criteria: Percent congruence <70% RSD
    Corrective Action: If >70%, determine reason and, if cause is systemic,
    re-analyze all samples in question.

Check or Sample: Independent analysis by external laboratory
    Frequency: None
    Acceptance Criteria: Independent analysis TBD
    Corrective Action: Determine if independent analysis can be funded and conducted.

Check or Sample: Use single stock of E. faecalis calibrator
    Frequency: For all qPCR calibrator samples used for quantitation
    Acceptance Criteria: All calibrator sample Cp (Ct) values must have an RSD < 50%
    Corrective Action: If calibrator Cp (Ct) values exceed an RSD of 50%, the batch's
    calibrator samples shall be re-analyzed and replaced with new calibrators, to be
    processed and analyzed if the RSD is not back within range.

DATA PROCESSING & REVIEW

Check or Sample: 100% verification and review of qPCR data
    Frequency: All qPCR amplification traces, raw and processed data sheets
    Acceptance Criteria: All final data will be checked against raw data, exported
    data, and calculated data printouts before entry into LIMS and upload to the
    Corvallis, OR database. Second-tier review by contractor and third-tier review
    by EPA.
5.10.7 Data Management, Review, and Validation
Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.10-5. The Indicator Lead is ultimately responsible for ensuring the validity of the
data, although performance of the specific checks may be delegated to other staff
members. All raw data (including all standardized forms and logbooks) are retained in an
organized fashion for seven years or until written authorization for disposition has been
received from the NRSA Project Coordinator. Once data have passed all acceptance
requirements, computerized data files are prepared in a format specified for the NRSA
project by EMAP and copied onto a CD. The CDs are transferred to the NRSA IM
Coordinator (Marlys Cappaert) for entry into a centralized database. A hard copy output
of all files accompanies each data CD.
Table 5.10-5. Data validation quality control: fecal indicator

Check Description: Duplicate sampling
    Frequency: Duplicate composite samples collected at 10% of sites
    Acceptance Criteria: Measurements should be within 10 percent
    Corrective Action: Review data for reasonableness; determine if acceptance
    criteria need to be modified.

Check Description: Field filter blanks
    Frequency: Field blanks filtered at 10% of sites
    Acceptance Criteria: Measurements should be within 10 percent
    Corrective Action: Review data for reasonableness; determine if acceptance
    criteria need to be modified.
6.0 FIELD AND BIOLOGICAL LABORATORY QUALITY EVALUATION
AND ASSISTANCE VISITS
No national program of accreditation for biological sample collections and processing currently
exists. However, national standards of performance and audit guidance for biological
laboratories are being considered by the National Environmental Laboratory
Accreditation Conference (NELAC). For this reason, a rigorous program of field and
laboratory evaluation and assistance visits has been developed to support the National
Rivers and Streams Assessment Program.
Procedural review and assistance personnel are trained in the specific implementation and data
collection methods detailed in the NRSA Field Operations Manual. Plans and checklists
for field evaluation and assistance visits have been developed to reinforce the specific
techniques and procedures for both field and laboratory applications. The plans and
checklists are included in this section and describe the specific evaluation and corrective
action procedures.
It is anticipated that evaluation and assistance visits will be conducted with each Field Team
early in the sampling and data collection process, and that corrective actions will be
conducted in real time. These visits provide a basis for the uniform evaluation of the data
collection techniques, and an opportunity to conduct procedural reviews as required to
minimize data loss due to improper technique or interpretation of program guidance.
Through uniform training of field crews and review cycles conducted early in the data
collection process, sampling variability associated with specific implementation or
interpretation of the protocols will be significantly reduced. The field evaluations, while
performed by a number of different supporting collaborator agencies and participants,
will be based on the uniform training, plans, and checklists. This review and assistance
task will be conducted for each unique crew collecting and contributing data under this
program; hence no data will be recorded to the project database that were produced by
an 'unaudited' process or individual.
Similarly, laboratory evaluation and assistance visits will be conducted early in the project
schedule and soon after sample processing begins at each laboratory to ensure that
specific laboratory techniques are implemented consistently across the multiple
laboratories generating data for the program. Laboratory evaluation plans and checklists
have been developed to ensure uniform interpretation and guidance in the procedural
reviews. These laboratory visits are designed such that full corrective action plans and
remedies can be implemented in the case of unacceptable deviations from the
documented procedures observed in the review process without recollection of samples.
The Field and Laboratory Evaluation and Assistance Visit Plans are described in sections 6.1
and 6.2.
6.1 National Rivers and Streams Assessment Field Quality Evaluation and Assistance
Visit Plan
Evaluators: One or more designated EPA or Contractor staff members who are
qualified (i.e., have completed training) in the procedures of the NRSA field
sampling operations.
To Evaluate: Field Sampling Teams during sampling operations on site.
Purpose: To identify and correct deficiencies during field sampling operations.
1. Tetra Tech and GLEC project staff will review the Field Evaluation and Assistance Visit
Plan and Check List with each Evaluator during field operations training sessions.
2. The Tetra Tech and GLEC QA Officer or authorized designee will send a copy of the
final Plan and the final Check List pages, envelopes to return the Check Lists, a
clipboard, pens, and the NRSA Quality Assurance Project Plan and Field Operations
Manual to each participating Evaluator.
3. Each Evaluator is responsible for providing their own field gear sufficient to accompany
the Field Sampling Teams (e.g., protective clothing, sunscreen, insect repellent, hat, hip
boots or waders, water bottle, food, back pack, cell phone) during a complete sampling
cycle. Schedule of the Field visits will be made by the Evaluator in consultation with the
Tetra Tech or GLEC QA Officer and respective Field Crew Leader. Evaluators should
be prepared to spend additional time in the field if needed (see below).
4. Tetra Tech, GLEC, and the Regional Monitoring Coordinators will arrange the schedule
of visitation with each Field Team, and notify the Evaluators concerning site locations,
where and when to meet the team, and how to get there. Ideally, each Field Team will
be evaluated within the first two weeks of beginning sampling operations, so that
procedures can be corrected or additional training provided, if needed. EPA Evaluators
will visit Tetra Tech and GLEC Field Teams. Any EPA or Contractor Evaluator may visit
State Field Teams.
5. A Field Team for the NRSA consists of a four-person crew where, at a minimum, the
Field Crew Leader and one additional crew member are fully trained.
6. If members of a Field Team change, and a majority (i.e., two) of the members have not
been evaluated previously, the Field Team must be evaluated again during sampling
operations as soon as possible to ensure that all members of the Field Team understand
and can perform the procedures.
7. The Evaluator will view the performance of a team through one complete set of sampling
activities as detailed on the Field Evaluation and Assistance Check List.
a. Scheduling might necessitate starting the evaluation midway on the list of tasks at a site,
instead of at the beginning. In that case, the Evaluator will follow the team to the next
site to complete the evaluation of the first activities on the list.
b. If the Team misses or incorrectly performs a procedure, the Evaluator will note this on
the checklist and immediately point this out so the mistake can be corrected on the spot.
The role of the Evaluator is to provide additional training and guidance so that the
procedures are being performed consistent with the Field Operations Manual, all data
are recorded correctly, and paperwork is properly completed at the site.
c. When the sampling operation has been completed, the Evaluator will review the results
of the evaluation with the Field Team before leaving the site (if practicable), noting
positive practices as well as problems, weaknesses (might affect data quality), and
deficiencies (would adversely affect data quality). The Evaluator will ensure that the
Team understands the findings and will be able to perform the procedures properly in
the future.
d. The Evaluator will record responses or concerns, if any, on the Field Evaluation and
Assistance Check List.
e. If the Evaluator's findings indicate that the Field Team is not performing the procedures
correctly, safely, or thoroughly, the Evaluator must continue working with this Field Team
until certain of the Team's ability to conduct the sampling properly so that data quality is
not adversely affected.
f. If the Evaluator finds major deficiencies in the Field Team operations (e.g., less than
three members, equipment or performance problems) the Evaluator must contact one of
the following QA officials:
Dr. Esther Peters, Tetra Tech QA Officer (703-385-6000)
Ms. Robin Silva-Wilkinson, GLEC QA Officer (231-941-2230)
Mr. Richard Mitchell, EPA NRSA Project QA Officer (202-566-0644)
The QA official will contact the Project Implementation Coordinator (Ellen Tarquinio, 202-566-
2267) to determine the appropriate course of action.
Data records from sampling sites previously visited by this Field Team will be checked to
determine whether any sampling sites must be redone.
g. Complete the Field Evaluation and Assistance Check List, including a brief summary of
findings, and ensure that all Team members have read this and signed off before leaving
the Team.
8. The Evaluator will electronically scan and make a photocopy of the Field Evaluation and
Assistance Check List. The Evaluator will retain the photocopied checklist, and email
the scanned file and send the original checklist to
Richard Mitchell
USEPA Office of Wetlands, Oceans, and Watersheds
1200 Pennsylvania Avenue (4503-T)
Washington, DC 20460-0001
(202)-566-0644
6.2 National Rivers and Streams Assessment Laboratory Quality Evaluation and
Assistance Visit Plan
Evaluators: One or more designated Contractor staff members who are qualified (i.e.,
have completed training) in the procedures of the NRSA biological laboratory
operations.
To Evaluate: Biological laboratories performing subsampling, sorting, and taxonomic
procedures to analyze collected stream and river samples.
Purpose: To identify and correct deficiencies during laboratory operations.
1. Tetra Tech project staff will review the Laboratory Evaluation and Assistance Visit Plan
and Check List with each Evaluator prior to conducting laboratory evaluations.
2. The Tetra Tech QA Officer or authorized designee will send a copy of the final Plan and
final Check List pages, envelopes to return the Check Lists, a clipboard, pens, and the
NRSA Quality Assurance Project Plan and Laboratory Method Manual to each
participating Evaluator.
3. Schedule of lab visits will be made by the Evaluator in consultation with the Tetra Tech
QA Officer and the respective Laboratory Supervisor Staff. Evaluators should be
prepared to spend additional time in the laboratory if needed (see below).
4. Tetra Tech, GLEC, and the Regional Monitoring Coordinators will arrange the schedule
of visitation with each participating Laboratory, and notify the Evaluators concerning site
locations, where and when to visit the laboratory, and how to get there. Ideally, each
Laboratory will be evaluated within the first two weeks following initial receipt of samples,
so that procedures can be corrected or additional training provided, if needed.
5. The Evaluator will view the performance of the laboratory sorting process and QC Officer
through one complete set of sample processing activities as detailed on the Laboratory
Evaluation and Assistance Check List.
a. Scheduling might necessitate starting the evaluation midway on the list of tasks for
processing a sample, instead of at the beginning. In that case, the Evaluator will view the
activities of the Sorter when a new sample is started to complete the evaluation of the
first activities on the list.
b. If a Sorter or QC Officer misses or incorrectly performs a procedure, the Evaluator will
note this on the checklist and immediately point this out so the mistake can be corrected
on the spot. The role of the Evaluator is to provide additional training and guidance so
that the procedures are being performed consistent with the Benthic Laboratory Methods
manual, all data are recorded correctly, and paperwork is properly completed at the site.
c. When the sample has been completely processed, the Evaluator will review the results
of the evaluation with the Sorter and QC Officer, noting positive practices as well as
problems, weaknesses (might affect data quality), and deficiencies (would adversely
affect data quality). The Evaluator will ensure that the Sorter and QC Officer understand
the findings and will be able to perform the procedures properly in the future.
d. The Evaluator will record responses or concerns, if any, on the Laboratory Evaluation
and Assistance Check List.
e. If the Evaluator's findings indicate that Laboratory staff are not performing the
procedures correctly, safely, or thoroughly, the Evaluator must continue working with
these staff members until certain of their ability to process the sample properly so that
data quality is not adversely affected.
f. If the Evaluator finds major deficiencies in the Laboratory operations, the Evaluator must
contact one of the following QA officials:
Dr. Esther Peters, Tetra Tech QA Officer (703-385-6000)
Jennifer Hanson, GLEC QA Officer (231-941-2230)
Ms. Sarah Lehman, EPA NRSA Project QA Officer (202-566-1379)
The QA official will contact the Project Implementation Coordinator (Ellen Tarquinio - 202-566-
2267) to determine what should be done.
Data records from samples previously processed by this Laboratory will be checked to
determine whether any samples must be redone.
g. Complete the Laboratory Evaluation and Assistance Check List, including a brief
summary of findings, and ensure that the Sorter and QC Officer have read this and
signed off before leaving the laboratory.
6. The Evaluator will electronically scan and make a photocopy of the Laboratory
Evaluation and Assistance Check List. The Evaluator will retain the photocopied
checklist, and email the scanned file and send the original checklist to
Richard Mitchell
USEPA Office of Wetlands, Oceans, and Watersheds
1200 Pennsylvania Avenue (4503-T)
Washington, DC 20460-0001
(202)-566-0644
7.0 REFERENCES
American Public Health Association. 1989. Standard Methods for the Examination of Water
and Wastewater. Seventeenth Edition. American Public Health Association,
Washington, D.C.
Bain, M.B., J.T. Finn, and H.E. Booke. 1985. Quantifying stream substrate for habitat
analysis studies. North American Journal of Fisheries Management 5:499-500.
Baker, J.R. and G.D. Merritt, 1990. Environmental Monitoring and Assessment Program:
Guidelines for Preparing Logistics Plans. EPA 600/4-91-001. U.S. Environmental
Protection Agency. Las Vegas, Nevada.
Barbour, M.T., J. Gerritsen, B.D. Snyder, and J.B. Stribling. 1999. Rapid Bioassessment
Protocols for Use in Streams and Wadeable Rivers: Periphyton, Benthic
Macroinvertebrates, and Fish. Second Edition. EPA/841-B-99-002. U.S. Environmental
Protection Agency, Office of Water, Assessment and Watershed Protection Division,
Washington, D.C.
Bickford, C.A., C.E. Mayer, and K.D. Water. 1963. An Efficient Sampling Design for Forest
Inventory: The Northeast Forest Resurvey. Journal of Forestry. 61: 826-833.
CAS. 1999. Chemical Abstracts Service web site (http://www.cas.org)
Carlson, R.E. 1977. A trophic state index for lakes. Limnology and Oceanography 22(2):361-
369.
CENR. 1997. Integrating the Nation's Environmental Monitoring and Research Networks and
Programs: A Proposed Framework. Committee on Environment and Natural Resources,
National Science and Technology Council, Washington, DC, USA.
Chaloud, D.C., J.M. Nicholson, B.P. Baldigo, C.A. Hagley, and D.W. Sutton. 1989. Handbook of
Methods for Acid Deposition Studies: Field Methods for Surface Water Chemistry. EPA
600/4-89/020. U.S. Environmental Protection Agency, Washington, D.C.
Converse, J.M. 1987. Survey Research in the United States: Roots and Emergence 1890-1960.
University of California Press. Berkeley, CA. 564 pp.
EIMS. 1999. Environmental Information Management System (EIMS) web site.
(http://www.epa.gov/eims)
FGDC. 1998. Content standard for digital geospatial metadata, version 2.0. FGDC-STD-001-
1998. Federal Geographic Data Committee. Washington, DC. (http://www.fgdc.gov)
Frissell, C.A., W.J. Liss, C.E. Warren, and M.D. Hurley. 1986. A hierarchical framework for
stream habitat classification: viewing streams in a watershed context. Environ. Mgmt.
10(2): 199-214.
Frithsen, J.B. 1996a. Suggested modifications to the EMAP data set directory and catalog for
implementation in US EPA Region 10. Draft. June 10, 1996. Report prepared for the
U.S. Environmental Protection Agency. National Center for Environmental Assessment.
Washington, DC. By Versar, Inc., Columbia, MD.
Frithsen, J.B. 1996b. Directory Keywords: Restricted vs. unrestricted vocabulary. Draft, May, 21,
1996. Report prepared for the U.S. Environmental Protection Agency, National Center
for Environmental Assessment, Washington, DC., by Versar, Inc. Columbia, MD.
Frithsen, J.B., and D.E. Strebel. 1995. Summary documentation for EMAP data: Guidelines for
the information management directory. 30 April 1995. Report prepared for U.S.
Environmental Protection Agency, Environmental and Assessment Program (EMAP),
Washington, DC. Prepared by Versar, Inc., Columbia, MD.
Garner, F.C., M.A. Stapanian, and K.E. Fitzgerald. 1991. Finding causes of outliers in
multivariate environmental data. Journal of Chemometrics. 5: 241-248.
Glaser, J.A., D.L. Foerst, G.D. McKee, S.A. Quave, and W.L. Budde. 1981. Trace analyses
for wastewaters. Environmental Science & Technology. 15: 1426-1435.
Hawkins, C. P., R. H. Norris, J. N. Hogue, and J. W. Feminella. 2000. Development and
evaluation of predictive models for measuring the biological integrity of streams.
Ecological Applications 10:1456-1477.
Hazard, J.W., and B.E. Law. 1989. Forest Survey Methods Used in the USDA Forest
Service. EPA/600/3-89/065. NTIS PB89 220 594/AS. U.S. EPA Environmental
Research Laboratory. Corvallis, Oregon.
Hillman, D.C., S.H. Pia, and S.J. Simon. 1987. National Surface Water Survey: Stream Survey
(Pilot, Middle Atlantic Phase I, Southeast Screening, and Episode Pilot) Analytical
Methods Manual. EPA 600/8-87-005. U.S. Environmental Protection Agency, Las
Vegas, Nevada.
Heinz Center. 2002. The State of the Nation's Ecosystems. The Cambridge University Press.
Hunsaker, C. T., and D. E. Carpenter. 1990. Environmental Monitoring and Assessment
Program: ecological indicators. Office of Research and Development, U. S.
Environmental Protection Agency, Research Triangle Park, North Carolina. EPA-600-3-
90-060.
Hunt, D.T.E., and A.L. Wilson. 1986. The Chemical Analysis of Water: General Principles
and Techniques. Second edition. Royal Society of Chemistry, London, England.
683 pp.
ITIS. 1999. Integrated Taxonomic Information System web site
(http://www.itis.usda.gov/itis).
Kaufmann, P.R. (ed.). 1993. Physical Habitat. IN: R.M. Hughes (ed.) Stream Indicator and
Design Workshop. EPA600/R-93/138. U.S. Environmental Protection Agency,
Corvallis, Oregon.
Kaufmann, P.R., A.T. Herlihy, J.W. Elwood, M.E. Mitch, W.S. Overton, M.J. Sale, J.J.
Messer, K.A. Cougan, D.V. Peck, K.H. Reckhow, A.J. Kinney, S.J. Christie, D.D.
Brown, C.A. Hagley, and H.I. Jager. 1988. Chemical Characteristics of Streams in the
Mid-Atlantic and Southeastern United States. Volume I: Population Descriptions and
Physico-Chemical Relationships. EPA 600/3-88/021 a. U.S. Environmental Protection
Agency, Washington, D.C.
Kish, L. 1965. Survey Sampling. John Wiley & Sons. New York. 643 pp.
Kish, L. 1987. Statistical Design for Research. John Wiley & Sons. New York. 267 pp.
Kirchmer, C.J. 1983. Quality control in water analysis. Environmental Science &
Technology. 17: 174A-181A.
Klemm, D.J., P.A. Lewis, F. Fulk, and J.M. Lazorchak. 1990. Macroinvertebrate Field and
Laboratory Methods for Evaluating the Biological Integrity of Surface Waters. EPA
600/4-90/030. U.S. Environmental Protection Agency, Cincinnati, Ohio.
Lemmon, P.E. 1957. A new instrument for measuring forest overstory density. J. For. 55(9):
667-669.
Linsley, R.K., M.A. Kohler, and J.L.H. Paulhus. 1982. Hydrology for Engineers.
McGraw-Hill Book Co. New York.
Meglen, R.R. 1985. A quality control protocol for the analytical laboratory. Pp. 250-270 IN: J.J.
Breen and P.E. Robinson (eds). Environmental Applications of Chemometrics. ACS
Symposium Series 292. American Chemical Society, Washington, D.C.
Messer, J.J., C.W. Ariss, J.R. Baker, S.K. Drouse, K.N. Eshleman, P.R. Kaufmann, R.A.
Linthurst, J.M. Omernik, W.S. Overton, M.J. Sale, R.D. Schonbrod, S.M. Stambaugh,
and J.R. Tuschall, Jr. 1986. National Surface Water Survey: National Stream Survey,
Phase l-Pilot Survey. EPA-600/4-86/026. Washington, D.C: U.S. Environmental
Protection Agency.
MRLC. 1999. Multi-Resolution Land Characteristics web site (http://www.epa.gov/mrlc)
Mulvey, M., L. Cato, and R. Hafele. 1992. Oregon Nonpoint Source Monitoring Protocols
Stream Bioassessment Field Manual: For Macroinvertebrates and Habitat
Assessment. Oregon Department of Environmental Quality Laboratory Biomonitoring
Section. Portland, Oregon. 40pp.
NAPA. 2002. Environment.gov. National Academy of Public Administration.
ISBN: 1-57744-083-8. 219 pages.
NBII. 1999. The NBII Biological Metadata Standard. National Biological Information
Infrastructure web site (http://www.nbii.gov)
NSDI. 1999. National Spatial Data Infrastructure web site
(http://www.fgdc.gov/nsdi/nsdi.html)
NRC. 2000. Ecological Indicators for the Nation. National Research Council.
Washington, DC. National Academy Press.
Overton, W.S., 1985. Draft Sampling Plan for Streams in the National Surface Water Survey.
July 1986. Technical Report 114. Corvallis, Oregon: Department of Statistics, Oregon
State University.
Overton, W. S., D. White, and D. L. Stevens, Jr. 1991. Design Report for EMAP, the
Environmental Monitoring and Assessment Program. U. S. Environmental Protection
Agency, Office of Research and Development, Washington, D.C. EPA-600-3- 91-053.
Paulsen, S.G., D.P. Larsen, P.R. Kaufmann, T.R. Whittier, J.R. Baker, D. Peck, J. McGue, R.M.
Hughes, D. McMullen, D. Stevens, J.L. Stoddard, J. Lazorchak, W. Kinney, A.R. Selle,
and R. Hjort. 1991. EMAP - surface waters monitoring and research strategy, fiscal
year 1991. EPA-600-3-91-002. U.S. Environmental Protection Agency, Office of
Research and Development, Washington, D.C. and Environmental Research
Laboratory, Corvallis, Oregon.
Peck, D.V., J.M. Lazorchak, and D.J. Klemm (editors). 2003. Unpublished draft. Environmental
Monitoring and Assessment Program - Surface Waters: Western Pilot Study Field
Operations Manual for National Rivers and Streams. EPA/xxx/x-xx/xxxx. U.S.
Environmental Protection Agency, Washington, D.C.
Plafkin, J.L., M.T. Barbour, K.D. Porter, S.K. Gross, and R.M. Hughes. 1989. Rapid Bio-
assessment Protocols for Use in Streams and Rivers: Benthic Macroinvertebrates and
Fish. EPA 440/4-89/001. U.S. Environmental Protection Agency, Office of Water,
Washington, D.C.
Platts, W.S., W.F. Megahan, and G.W. Minshall. 1983. Methods for Evaluating Stream,
Riparian, and Biotic Conditions. USDA Forest Service, Gen. Tech. Rep. INT-183. 71pp.
Robison, E.G. and R.L. Beschta. 1990. Characteristics of coarse woody debris for several
coastal streams of southeast Alaska, USA. Canadian Journal of Fisheries and Aquatic
Sciences 47(9): 1684-1693.
Robison, E.G., and P.R. Kaufmann. (In preparation). Evaluating and improving an objective
rapid technique for defining pools in small streams.
Sarndal, C.E., B. Swensson, and J. Wretman. 1992. Model Assisted Survey Sampling.
Springer-Verlag. New York. 694pp.
Skougstad, M.W., M.J. Fishman, L.C. Friedman, D.E. Erdman, and S.S. Duncan (eds.). 1979.
Method I-4600-78, Automated Phosphomolybdate Colorimetric Method for Total
Phosphorus. IN: Methods for Determination of Inorganic Substances in Water and
Fluvial Sediments: Techniques of Water-Resources Investigations of the United States
Geological Survey. Book 5, Chapter A1. U.S. Government Printing Office,
Washington, D.C.
Smith, F., S. Kulkarni, L. E. Myers, and M. J. Messner. 1988. Evaluating and presenting
quality assurance data. Pages 157-68 in L.H. Keith, ed. ACS Professional Reference
Book. Principles of Environmental Sampling. American Chemical Society, Washington,
D.C.
Stack, B.R. 1989. Factors Influencing Pool Morphology in Oregon Coastal Streams.
M.S. Thesis, Oregon State University. 109pp.
Stanley, T.W., and S.S. Verner. 1986. The U.S. Environmental Protection Agency's quality
assurance program, pp. 12-19 IN: J.K. Taylor and T.W. Stanley (eds.). Quality
Assurance for Environmental Measurements. ASTM STP 867, American Society for
Testing and Materials, Philadelphia, Pennsylvania.
Stapanian, M.A., F.C. Garner, K.E. Fitzgerald, G.T. Flatman, and J.M. Nocerino. 1993.
Finding suspected causes of measurement error in multivariate environmental data.
Journal of Chemometrics. 7: 165-176.
Stevens, D.L., Jr. 1994. Implementation of a National Monitoring Program. Journal of
Environmental Management 42:1-29.
Stevens, D.L., Jr., and A.R. Olsen. 1999. Spatially restricted surveys over time for aquatic
resources. Journal of Agricultural, Biological and Environmental Statistics. 4:415-
428.
Stevens, D.L., Jr., and A.R. Olsen. 2003. Variance estimation for spatially balanced
samples of environmental resources. Environmetrics. 14:593-610.
Stevens, D.L., Jr., and A.R. Olsen. 2004. Spatially-balanced sampling of natural
resources. Journal of American Statistical Association. 99:262-278.
STORET. 1999. The STORET web site. (http://www.epa.gov/OWOW/STORET)
U.S. EPA. 1987. Handbook of Methods for Acid Deposition Studies: Laboratory Analyses for
Surface Water Chemistry. EPA/600/4-87/026. U.S. Environmental Protection Agency,
Office of Research and Development, Washington, D.C.
U.S. EPA. 1991. IRM Policy Manual. Chapter 13. Locational data.
U.S. EPA. 1996b. Addendum to: Guidelines for the information management directory. U.S.
EPA, NHEERL, Atlantic Ecology Division. Narragansett, RI.
U.S. EPA. 2003. Draft Report on the Environment. ORD and OEI. EPA-260-R-02-006.
USGCRP. 1998. Data Management for Global Change Research. Policy Statements for the
National Assessment Program. July 1998. U.S. Global Change Research Program.
National Science Foundation. Washington, DC.
Wilen, B.O. 1990. The U.S. Fish and Wildlife Service's National Wetlands Inventory. IN:
S.J. Kiraly, R.A. Cross, and J.D. Buffington (eds.). Federal Coastal Wetlands Mapping
Programs. U.S. Department of the Interior, Fish and Wildlife Service, Washington, D.C.
FWS Biological Report 90(18), pp. 9-20.
Wolman, M.G. 1954. A method of sampling coarse river-bed material. Transactions of the
American Geophysical Union 35(6):951-956.