United States Environmental Protection Agency
                  Office of Water
                  Office of Environmental Information
                  Washington, DC
                  EPA 841-B-07-003
      Survey of the Nation's Lakes
      Quality Assurance
           Project Plan
              August 2007  DRAFT
Survey of the Nation's Lakes (Lakes Survey)
Quality Assurance (QA) Project Plan


                       QUALITY ASSURANCE PROJECT PLAN
                 REVIEW & DISTRIBUTION ACKNOWLEDGMENT AND
                           COMMITMENT TO IMPLEMENT

                                        for

                            Survey of the Nation's Lakes

 We have read the QAPP and the methods manuals for the Lakes Survey listed below. Our
 agency/organization agrees to abide by their requirements for work performed under the Lakes
 Survey (under CWA Section 106).

 Quality Assurance Project Plan      [ ]
 Field Operations Manual             [ ]
 Site Evaluation Guidelines          [ ]
 Laboratory Methods Manual           [ ]

 Print Name

 Title
 (Cooperator's Principal Investigator)

 Organization

 Signature                                         Date


 NOTICE

 The complete documentation of overall Lakes Survey project management, design, methods,
 and standards is contained in four companion documents:

    •  Survey of the Nation's Lakes: Quality Assurance Project Plan (EPA 841-B-07-003)
    •  Survey of the Nation's Lakes: Lake Evaluation Guidelines (EPA 841-B-06-003)
    •  Survey of the Nation's Lakes: Field Operations Manual (EPA 841-B-07-004)
    •  Survey of the Nation's Lakes: Laboratory Methods Manual (EPA 841-B-07-005)

 This document (Quality Assurance Project Plan) contains elements of the overall project
 management, data quality objectives, measurement and data acquisition, and information
 management for the Lakes Survey.  Methods described in this document are to be used
 specifically in work relating to the Lakes Survey. All Project Cooperators should follow these
 guidelines. Mention of trade names or commercial products in this document does not
 constitute endorsement or recommendation for use. More details on specific methods for site
 evaluation, field sampling, and laboratory processing can be found in the appropriate
 companion document(s).

 The suggested citation for this document is:

       USEPA. 2006 (draft). Survey of the Nation's Lakes: Integrated
       Quality Assurance Project Plan. EPA/841-B-07-003.  U.S.  Environmental
       Protection Agency, Office of Water and Office of Research and Development,
       Washington, DC.

                                DISTRIBUTION LIST

This QA Project Plan and associated manuals or guidelines will be distributed to the following:
EPA, States, Tribes, universities, and contractors participating in the Lakes Survey.  EPA
Regional Lake Survey Coordinators are responsible for distributing the Lakes Survey QA
Project Plan to State and Tribal Water Quality Agency staff or other cooperators who will
perform the field sampling and laboratory operations.  The Tetra Tech and Great Lakes
Environmental Center QA Officers will distribute the QA Project Plan and associated documents
to participating project staff at their respective facilities and to the project contacts at
participating laboratories, as they are determined.
Susan Holdsworth
Office of Wetlands, Oceans and
Watersheds
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, NW (4503T)
Washington,  DC 20460
202-566-1187
holdsworth.susan@epa.gov

Steven G. Paulsen, Ph.D.
Aquatic Monitoring and Assessment Branch
Western Ecology Division, NHEERL, ORD, EPA
200 S.W. 35th St.
Corvallis, OR 97330
541-754-4428
Paulsen.Steve@epa.gov

Ellen Tarquinio
USEPA Office of Wetlands, Oceans and
Watersheds
1200 Pennsylvania Avenue, NW (4503T)
Washington, DC 20460
202-566-2267
Tarquinio.Ellen@epa.gov

Carol Peterson
USEPA Office of Wetlands, Oceans and
Watersheds
1200 Pennsylvania Avenue, NW (4503T)
Washington, DC 20460
202-566-1304
Peterson.Carol@epa.gov

Otto Gutenson
USEPA Office of Wetlands, Oceans and
Watersheds
1200 Pennsylvania Avenue, NW (4503T)
Washington, DC 20460
202-566-1183
Hilary Snook
EPA Region 1 Lakes Survey Coordinator
U. S. EPA Region 1
Ecosystems Assessment Unit
11 Technology Drive
North Chelmsford, MA 01863-2431
617-918-8670
snook.hilary@epa.gov

Darvene Adams
EPA Region 2 Lakes Survey Coordinator
Division of Envir. Science and Assessment
U. S. EPA Region 2
2890 Woodbridge Ave.
Edison, NJ 08837
732-321-6700
adams.darvene@epa.gov

Frank Borsuk, PhD
EPA Region 3 Lakes Survey Coordinator
U. S. EPA Region 3
Wheeling Office
1060 Chapline Street, Suite 303
Wheeling, WV 26003-2995
304-234-0241 Phone
304-234-0260 Fax
borsuk.frank@epa.gov

Marion Hopkins
EPA Region 4 Lakes Survey Coordinator
U.S. EPA Region 4
AFC Bldg., 15th Floor
61 Forsyth St., S.W.
Atlanta, GA 30303-8960
404-562-9144
hopkins.marion@epa.gov

Sarah Lehmann
EPA Region 5 Lakes Survey Coordinator
U.S. EPA Region 5
77 West Jackson Blvd.
Chicago, IL  60604-3507
312-353-4328
lehmann.sarah@epa.gov

Jessica Franks, PhD
EPA Region 6 Lakes Survey Coordinator
U.S. EPA - Region 6 (6WQ-EWM)
1445 Ross Avenue - Suite 1200
Dallas, TX 75202-2733
214-665-8335
franks.jessica@epa.gov

Gary Welker
EPA Region 7 Lakes Survey Coordinator
USEPA - Region 7
901 North Fifth Street
Kansas City, KS 66101
913-551-7177
Welker.gary@epa.gov

Sandra Spence
EPA Region 8 Lakes Survey Coordinator
USEPA - Region 8
999 18th Street - Suite 500
Denver,  CO 80202-2405
303-312-7753
spence.sandra@epa.gov

Janet  Hashimoto
EPA Region 9 Lakes Survey Coordinator
U.S. EPA - Region 9
75 Hawthorne Street
San Francisco, CA  94105
415-972-3452
hashimoto.janet@epa.gov

Lillian  Herger
EPA Region 10 Lakes Survey
Coordinator
USEPA - Region 10
1200 Sixth Avenue
Seattle, WA 98101
206-553-1074
herger.lillian@epa.gov
Michael T. Barbour, PhD
Tetra Tech, Inc.
400 Red Brook Blvd, Suite 200
Owings Mills, MD 21117
410-356-8993
Michael.Barbour@tetratech.com

Dennis J. McCauley
Great Lakes Environmental Center
739 Hastings St.
Traverse City, MI 49686
231-941-2230
dmccauley@glec-tc.com

Esther Peters
Tetra Tech, Inc.
10306 Eaton  Place, Suite 340
Fairfax, VA 22030
703-385-6000, ext. 196
peteres@tetratech-ffx.com


                                TABLE OF CONTENTS

 DISTRIBUTION LIST	iv

 LIST OF TABLES	4

 LIST OF FIGURES	4

 1.0    PROJECT PLANNING AND MANAGEMENT	5
       1.1    Introduction	5
       1.2    Lakes Survey Project Organization	6
             1.2.1   Project Schedule	9
       1.3    Scope of QA Project Plan	9
             1.3.1   Overview of Field Operations	10
             1.3.2   Overview of Laboratory Operations	12
             1.3.3   Data Analysis and Reporting	15
             1.3.4   Peer Review	15

 2.0    DATA QUALITY OBJECTIVES	16
       2.1    Data Quality Objectives for Lakes Survey	17
       2.2    Measurement Quality Objectives	17
             2.2.1   Method Detection Limits	17
             2.2.2   Sampling Precision, Bias, and Accuracy	18
             2.2.3   Taxonomic Precision and Accuracy	19
             2.2.4   Completeness	20
             2.2.5   Comparability	21
             2.2.6   Representativeness	21

 3.0    SITE SELECTION DESIGN	21
       3.1    Probability Based Sampling Design and Site Selection	22

 4.0    INFORMATION MANAGEMENT	23
       4.1    Overview of System Structure	23
             4.1.1   Design and Logistics Data Bases	24
             4.1.2   Sample Collection and Field Data Recording	25
             4.1.3   Laboratory Analyses and Data Recording	26
             4.1.4   Data Review, Verification, Validation Activities	28
       4.2    Data Transfer	29
       4.3    Hardware  and Software Control	30
       4.4    Data Security	30

 5.0    INDICATORS 	31
       5.1    Water Chemistry Indicator	31
             5.1.1   Introduction	31
             5.1.2   Sampling Design	31
             5.1.3   Sampling and Analytical Methods	31
             5.1.4   Quality Assurance Objectives	36
             5.1.5   Quality Control  Procedures: Field Operations	36
             5.1.6   Quality Control  Procedures: Laboratory Operations	38
                    5.1.6.1 Sample  Receipt and Processing	38

                    5.1.6.2 Analysis of Samples	38
              5.1.7  Data Reporting, Review, and Management	44
       5.2    Chlorophyll-a Indicator	48
              5.2.1  Introduction	48
              5.2.2  Sampling Design	48
              5.2.3  Sampling and Analytical Methods	48
              5.2.4  Quality Assurance Objectives	49
              5.2.5  Quality Control Procedures: Field Operations	49
              5.2.6  Quality Control Procedures: Laboratory Operations	51
                    5.2.6.1 Sample Receipt and Processing	51
                    5.2.6.2 Analysis of Samples	51
              5.2.7  Data Reporting, Review, and Management	51
       5.3    Sediment Diatom Indicator	51
              5.3.1  Introduction	51
              5.3.2  Sampling Design	52
              5.3.3  Sampling and Analytical Methods	52
              5.3.4  Quality Assurance Objectives	53
              5.3.5  Quality Control Procedures: Field Operations	54
              5.3.6  Quality Control Procedures: Laboratory Operations	54
              5.3.7  Data Reporting, Review, and Management	55
       5.4    Physical Habitat Quality Indicator	56
              5.4.1  Introduction	56
              5.4.2  Sampling Design	56
              5.4.3  Sampling and Analytical Methods	56
              5.4.4  Quality Assurance Objectives	58
              5.4.5  Quality Control Procedures: Field Operations	59
              5.4.6  Quality Control Procedures: Laboratory Operations	60
              5.4.7  Data Management, Review, and Validation	60
       5.5    Phytoplankton Indicator	60
              5.5.1  Introduction	60
              5.5.2  Sampling Design	60
              5.5.3  Sampling and Analytical Methods	60
              5.5.4  Quality Assurance Objectives	61
              5.5.5  Quality Control Procedures: Field Operations	62
              5.5.6  Quality Control Procedures: Laboratory Operations	62
              5.5.7  Data Management, Review, and Validation	62
       5.6    Zooplankton Indicator	64
              5.6.1  Introduction	64
              5.6.2  Sampling Design	64
              5.6.3  Sampling and Analytical Methods	64
              5.6.4  Quality Assurance Objectives	64
              5.6.5  Quality Control Procedures: Field Operations	65
              5.6.6  Quality Control Procedures: Laboratory Operations	66
              5.6.7  Data Management, Review, and Validation	66
       5.7    Pathogen Indicator	66
              5.7.1  Introduction	66
              5.7.2  Sampling Design	68
              5.7.3  Sampling and Analytical Methods	68

             5.7.4  Quality Assurance Objectives	68
             5.7.5  Quality Control Procedures: Field Operations	68
             5.7.6  Quality Control Procedures: Laboratory Operations	68
             5.7.7  Data Management, Review, and Validation	69
       5.8   Algal Toxin Indicator	69
             5.8.1  Introduction	69
             5.8.2  Sampling Design	69
             5.8.3  Sampling and Analytical Methods	69
             5.8.4  Quality Assurance Objectives	69
             5.8.5  Quality Control Procedures: Field Operations	69
             5.8.6  Quality Control Procedures: Laboratory Operations	70
             5.8.7  Data Management, Review, and Validation	70
       5.9   Benthic Macroinvertebrates	70
             5.9.1  Introduction	70
             5.9.2  Sampling Design	70
             5.9.3  Sampling and Analytical Methods	71
             5.9.4  Quality Assurance Objectives	71
             5.9.5  Quality Control Procedures: Field Operations	73
             5.9.6  Quality Control Procedures: Laboratory Operations	73
             5.9.7  Data Management, Review, and Validation	73

 6.0    FIELD AND BIOLOGICAL LABORATORY QUALITY EVALUATION AND ASSISTANCE
       VISITS       	75
       6.1   Field Quality Evaluation and Assistance Visit Plan for the Survey of the Nation's
             Lakes (Lakes Survey)	76
       6.2   Laboratory Quality Evaluation and Assistance Visit Plan for the Survey of the
             Nation's Lakes (Lakes Survey)	78

 7.0    REFERENCES	81

 APPENDIX A:  FIELD AUDIT CHECKLIST

                                    LIST OF TABLES
Table 1-1      Critical logistics elements (from Baker and Merritt, 1990)	10
Table 4-1      Sample and field data quality control activities	26
Table 4-2      Laboratory data quality control activities	27
Table 4-3      Biological sample quality control activities	27
Table 4-4      Data review, verification, and validation quality control activities	29
Table 5-1      Performance requirements for water chemistry and chlorophyll-a analytical
              methods	31
Table 5-2      Field quality control: water chemistry indicator	35
Table 5-3      Sample processing quality control activities: water chemistry indicator	35
Table 5-4      Laboratory quality control samples: water chemistry indicator	37
Table 5-5      Data validation quality control: water chemistry indicator	40
Table 5-6      Data reporting criteria: water chemistry indicator	41
Table 5-7      Constants for converting major ion concentrations from mg/L to µeq/L	42
Table 5-8      Factors to calculate equivalent conductivities of major ions	42
Table 5-9      Sample processing quality control: chlorophyll-a indicator	46
Table 5-10    Data validation quality control: chlorophyll-a indicator	46
Table 5-11    Data reporting criteria: chlorophyll-a indicator	46
Table 5-12    Field and laboratory methods: sediment diatom indicator	47
Table 5-13    Measurement quality objectives: sediment diatom indicator	48
Table 5-14    Sample processing quality control: sediment diatom indicator	49
Table 5-15    Laboratory Quality Control: sediment diatom indicator	50
Table 5-16    Field measurement methods: physical habitat indicator	51
Table 5-17    Measurement data quality objectives: physical habitat indicator	53
Table 5-18    Field quality control: physical habitat indicator	53
Table 5-19    Field and laboratory methods: phytoplankton indicator	55
Table 5-20    Measurement data quality objectives: phytoplankton indicator	55
Table 5-21    Laboratory Quality Control: phytoplankton indicator	57
Table 5-22    Field and laboratory methods: zooplankton indicator	58
Table 5-23    Measurement data quality objectives: zooplankton indicator	58
Table 5-24    Laboratory Quality Control: zooplankton indicator	59
Table 5-25    Field and laboratory methods: pathogen indicator (Enterococci)	62
Table 5-26    Measurement data quality objectives: Pathogen-Indicator DNA Sequences	62
Table 5-27    Laboratory Quality Control: Pathogen-Indicator DNA Sequences	64
Table 5-28    Example Layout of Samples and Controls on Microtiter Plate	66
Table 5-29    Sample analysis quality control activities: microcystin indicator	66
Table 5-30    Field and laboratory methods: benthic indicator	70
Table 5-31    Measurement data quality objectives: benthic indicator	71
Table 5-32    Laboratory Quality Control: benthic indicator	71


                                   LIST OF FIGURES
 Figure 1-1     Lakes Survey project organization chart	3
 Figure 1-2     Timeline of Lakes Survey project activities	9
 Figure 1-3     Site verification process	13
 Figure 1-4     Summary of field activities and lake sampling	14
 Figure 4-1     Organization of information management system modeled after EMAP-SW for
               Lakes Survey	24
 Figure 5-1     Sampling locations for Lakes Survey indicators	30
 Figure 5-2     Field measurement activities for the water chemistry indicator	34
 Figure 5-3     Sample processing activities for water chemistry samples	36
 Figure 5-4     Analysis activities for water chemistry samples	39
 Figure 5-5     Sample collection form	45
 Figure 5-6     Example data entry form for microcystins	69


                    1.0    PROJECT PLANNING AND MANAGEMENT

 1.1    Introduction

       Several recent reports have identified the need for improved water quality monitoring
 and analysis at multiple scales. In 2000, the General Accounting Office (USGAO 2000)
 reported that EPA, states, and tribes collectively cannot make statistically valid inferences about
 water quality (via 305[b] reporting) and lack data to support key management decisions.  In
 2001, the National Research Council (NRC 2000) recommended that EPA, states, and tribes
 promote a uniform, consistent approach to ambient monitoring and data collection to support
 core water quality programs. In 2002, the H. John Heinz III Center for Science, Economics, and
 the Environment (Heinz Center 2002) found that data are inadequate for national reporting on
 fresh water, coastal, and ocean water quality indicators. The National Academy of Public
 Administration (NAPA 2002) stated that improved water quality monitoring is necessary to help
 states and tribes make more effective use of limited resources.  EPA's Report on the
 Environment 2003 (USEPA 2003) states that there is not sufficient information to provide a
 national answer, with confidence and scientific credibility, to the question, 'What is the condition
 of U.S. waters and watersheds?'

        In response to this need, the U.S. Environmental Protection Agency (EPA) Office of
 Water, in partnership with states and tribes, has begun a program to assess the condition of the
 nation's waters via a statistically valid approach. The current assessment, the Survey of the
 Nation's Lakes (referred to as Lakes  Survey throughout this document), builds upon the
 Wadeable Streams Assessment implemented by EPA to  monitor and assess  the condition of
 the nation's wadeable stream resource. The Lakes Survey effort will provide  important
 information to states and the public about the condition of the nation's lake resource and key
 stressors  on a national and regional scale.

       EPA developed this Quality Assurance Project Plan (QAPP) to support the states and
 tribes participating in this project.  The plan contains elements of the overall project
 management, data quality objectives, measurement and data acquisition, and information
 management for the Lakes Survey. EPA recognizes that states and tribes may have  added
 elements, such as supplemental indicators, that are not covered in the scope  of this integrated
 Quality Assurance Project Plan. EPA expects that any supplemental elements are addressed in
 a separate approved QAPP or an addendum to this QAPP by the states and tribes or their
 designee.

        This cooperative effort among states, tribes, and federal agencies makes possible a
 broad-scale study that assesses the condition of the Nation's lakes with both confidence and
 scientific credibility. Through this survey, states and tribes have the opportunity to collect data
 which can be used to supplement their existing monitoring programs or to begin development of
 new programs. The Lakes Survey has two main objectives:

    •   Estimate the current status, trends, and changes in selected trophic, ecological, and
       recreational indicators of the condition  of the Nation's Lakes with known statistical
       confidence.

    •   Seek associations between selected indicators of natural and anthropogenic stresses
       and indicators of ecological condition.


 1.2    Lakes Survey Project Organization

       The responsibilities and accountability of the various principals and cooperators are
 described here and illustrated in Figure 1-1. The overall coordination of the project will be done
 by EPA's Office of Water (OW) in Washington, DC, with support from the Western Ecology
 Division (WED) of the Office of Research and Development (ORD) in Corvallis, Oregon.  Each
 EPA Regional Office has identified a Regional EPA Coordinator who is part of the EPA team
 providing a critical link with state and tribal partners. Cooperators will work with their Regional
 EPA Coordinator to address any technical  issues. A comprehensive quality assurance (QA)
 program has been established to ensure data integrity and provide support for the reliable
 interpretation of the findings from this project.  Technical Experts Workgroups will be convened
 to decide on the most appropriate approaches for key technical issues, such as: (1)
 the selection and establishment of reference conditions based on least-disturbed sites and
 expert consensus for characterizing benchmarks for assessment of ecological condition;  (2)
 selection and calibration of ecological endpoints and attributes of the biota and relationship to
 stressor indicators; (3) a data analysis plan for interpreting the data and addressing the
 objectives  in a nationwide assessment; and (4) a framework for the reporting of the condition
 assessment and conveying the information on the ecological status of the Nation's lakes.

       Contractor support is provided for all aspects of this project. Contractors will provide
 support for implementing the survey, sampling and laboratory processing, data
 management, data analysis, and report writing.  Cooperators will interact with their Regional
 EPA Coordinator and the EPA Project Leader regarding contractual services.

       The primary responsibilities of the principals and cooperators are as follows:

 EPA Project Leader - Carol Peterson
    •   Provides overall  coordination of the project and makes decisions regarding the proper
       functioning of all aspects of the project.
    •   Makes assignments and delegates authority, as needed to other parts of the project
       organization.

 Alternate EPA Project Leader - Steve Paulsen
    •   Assists EPA Project Leader with coordination and assumes responsibility for certain
       aspects of the project, as agreed upon with the EPA Project Leader.
    •   Serves as primary point-of-contact  for project coordination in the absence or
       unavailability of EPA Project Leader.
    •   Serves on the Technical Experts Workgroup and interacts with Project Facilitator  on
       technical, logistical, and organizational issues on a regular basis.

 Regional EPA Coordinator
    •   Assists EPA Project Leader with regional coordination activities.
    •   Serves on the Technical Experts Workgroup and interacts with Project Facilitator  on
       technical, logistical, and organizational issues on a regular basis.
    •   Serves as primary point-of-contact  for the Cooperators.


 Technical Experts Workgroup(s) - States, EPA, academics, other federal agencies
    •  Provides expert consultation on key technical issues as identified by the EPA
       Coordination team and works with Project Facilitator to resolve approaches and
       strategies to enable data analysis and interpretation to be scientifically valid.

 Tetra Tech (Tt) Project Facilitator - Michael Barbour
    •  A contractor who supports implementation of the project based on technical guidance
       established by the EPA Project Leader and Alternate EPA Project Leader.
    •  Primary responsibility is to ensure all aspects of the project, i.e., technical, logistical,
       organizational, are operating as smoothly as possible.
    •  Serves as point-of-contact for questions from field crews and cooperators for all
       activities.

 Great Lakes Environmental Center (GLEC) Technical Representative - Dennis McCauley
    •  Provides contractor support to the project and works with Project Facilitator to ensure all
       needs for contractor support are covered.

 Cooperator(s)
    •  Under the scope of their assistance agreement, plans and executes their individual
       study as part of the cross-jurisdictional Survey of the Nation's Lakes, and adheres to all
       QA requirements and standard operating procedures (SOPs).
    •  Interacts with the Grant Coordinator, Project Facilitator and EPA Project Leader
       regarding technical, logistical, organizational issues.

 Field Sampling Crew Leader
    •  Functions as the senior member of each Cooperator's field sampling crew and the point
       of contact for the Field  Logistics Coordinator.
    •  Responsible for overseeing all activities of the field sampling crew and ensuring that the
       Project field method protocols are followed during all sampling activities.

 Field Logistics Coordinator
    •  A contractor who supports implementation of the project based on technical guidance
       established by the EPA Project Leader and Alternate EPA Project Leader; serves as the
       point-of-contact for questions from field crews and cooperators for all activities.
    •  Tracks progress of field sampling activities.

 Information Management Coordinator
    •  A contractor who supports implementation of the project based on technical guidance
       established by the EPA Project Leader and Alternate EPA Project Leader; oversees all
       sample shipments and receives data forms from the Cooperators.
    •  Oversees all aspects of data entry and data management for the project.

 EPA QA Officer
    •  Functions as the primary officer overseeing all QA and quality control  (QC) activities.
    •  Responsible for ensuring that the QA program is implemented thoroughly and
       adequately to document the performance of all activities.


 QA Project Officer(s)
    •  Oversee(s) individual studies of cooperators (assistance recipients).
    •  Interacts with EPA Project Leader and Project Facilitator on issues related to sampling
       design, project plan, and schedules for conduct of activities.
    •  Collects copies of all official field forms, field evaluation checklists and reports.
    •  Oversees and maintains records on field evaluation visits, but is not a part of any one
       sampling team.

 Tetra Tech (Tt) QA  Officer
    •  The contractor QA Officer who will supervise the implementation of the QA program.
    •  Directs the field and laboratory audits and ensures the field and lab auditors are
       adequately trained to correct errors immediately to avoid erroneous data and the
       eventual discarding of information from the assessment.

 Great Lakes Environmental Center (GLEC) QA Officer
    •  Provides support to the Tt QA Officer in carrying out the QC checks and documenting
       the quality of the activities and adherence to specified procedures.

 EPA Headquarters  Indicator Team
    •  Oversees the transfer of samples and related records for each indicator.
    •  Ensures the validity of data for each indicator.

 1.2.1  Project Schedule

       Training and field sampling will be conducted in 2007. Sample processing and data
 analysis will be completed by 2008 in order to publish a report the following year.  Figure 1-2
 gives an overview of the major tasks leading  up to the final report.

 1.3    Scope of QA Project Plan

       This QA Project Plan addresses the data acquisition efforts of Lakes Survey, which
 focuses on the 2007 sampling of lakes across the United States.  Data from approximately 1000
 lakes (selected with a probability design) located within the contiguous 48 states will provide a
 comprehensive assessment of the Nation's lakes. Companion documents to this QAPP that are
 relevant to the overall project include Survey of the Nation's Lakes: Site Evaluation Guidelines,
 Survey of the Nation's Lakes: Field Operations Manual, and  Survey of the Nation's Lakes:
 Laboratory Methods Manual.

 1.3.1  Overview of Field Operations

       Field data acquisition activities are implemented for the Lakes Survey (Table 1-1), based
 on guidance developed by EMAP (Baker and Merritt 1990), through the direction of a steering
 committee composed of various state, tribal, and regional agencies. Funding for states and
 tribes to conduct field data collection activities is provided by EPA under Section 106 of the
 Clean Water Act. Survey preparation  is initiated with selection of the sampling locations by the
 Design Team  (ORD in Corvallis).  The list of sampling locations is distributed to the EPA
 Regional Lakes Survey Coordinators,  states, and tribes. With the sampling location list, state
 and tribal field crews can begin site reconnaissance on the primary sites and alternate
 replacement sites and begin work on obtaining access  permission to each site.  Specific

 [Figure 1-1 is an organization chart; its components are listed below.]

 Project Management: Project Lead - Susan Holdsworth, OW; Project QA - Otto Gutenson, OW;
   Technical Advisor - Steve Paulsen, ORD
 OWOW QA Oversight and Review: Margarete Heber, OW
 Study Design: Tony Olsen, ORD
 Field Protocols: State & Tribal Steering Committee, ORD, OW
 Field Logistics: Implementation Coordinator
 Training: ORD, EPA Regions, Contractors
 Field Implementation: State and Tribal Water Quality Agencies, Contractors
 Indicator Team: ORD, OW
 Sample Flow: zooplankton (central lab?); phytoplankton (central lab?); pathogens (NERL?);
   diatoms (central lab?); algal toxins (USGS Kansas)
 Information Management: WED-CSC - Marlys Cappaert
 Final Data: STORET/WQX - OW; EMAP - ORD-AED; States
 Assessment: OW lead, with ORD, Regional Coordinators, States, Tribes, Cooperators, and
   other partners

 Figure 1-1. Lakes Survey project organization chart.

 [Figure 1-2 is a timeline chart spanning 2007 through 2009 for the following activities: site
 evaluation/reconnaissance, field team training, field sampling/shipping, field evaluations,
 sample processing, lab evaluations, data management (QA/QC)/integration, data analysis,
 report preparation, report review, peer review, and final report production.]

 Figure 1-2. Timeline of Lakes Survey project activities.
procedures for evaluating each sampling location and for replacing non-sampleable sites are
documented in Survey of the Nation's Lakes: Site Evaluation Guidelines.  Scientific collecting
permits from State and Federal agencies will be procured, as needed. The field teams will use
standard field  equipment and supplies as identified in the Equipment and Supplies List
(Appendix A of the Field Operations Manual).  Field Team coordinators from  states and tribes
will work with EPA Regional Coordinators and the Information Management Center to
coordinate equipment and supply requirements.  This helps to ensure comparability of protocols
across states.  Detailed lists of equipment required for each field protocol, as well as guidance
on equipment inspection and maintenance, are contained in the Field Operations Manual.

       Field measurements and samples are collected by trained teams. The field team leaders
must be trained at EPA-sponsored training.  Ideally, all members of each field team should
attend one EPA-sponsored training session  before the field season in their state or tribal
jurisdiction.  Field sampling audits or evaluation visits will be completed for each field team.  The
training program stresses hands-on practice of methods, consistency among crews, collection
of high quality data and samples, and safety. Training documentation will be maintained by the
Project QA Officers.

       For each lake, a dossier is prepared  and contains the following applicable information:
road maps, copies of written access permissions, scientific collection permits, coordinates of
lake sites, information brochures on the program for interested land owners, a bathymetric map
with the index site location marked (if available), and local area emergency numbers.
Whenever possible, field team leaders attempt to contact landowners approximately two days
before the planned sampling date. Procedures for land owner notification can be found in the
Site Evaluation Guidelines.  As the design requires repeat visits to select sampling locations, it
is important for the field teams to do everything possible to maintain good relationships with


 landowners.  This includes prior contacts, respect of special requests, closing gates, minimal
 site disturbance, and removal of all materials, including trash, associated with the sampling visit.

       A variety of methods may  be used to access a lake, including vehicles and boats.  Some
 sampling locations require teams  to hike in, transporting all equipment in backpacks.  For this
 reason, ruggedness and weight are important considerations in the selection of equipment and
 instrumentation. Teams may need to camp out at the sampling location and so are equipped
 with the necessary camping equipment.

       The site verification  process is shown in Figure 1-3. Upon arrival at a site, the location is
 verified by a Global Positioning System (GPS) receiver, landmark references, and/or local
 residents. Samples and measurements for various parameters are collected in a specified
 order (Figure 1-4). This order has been set up to minimize the impact of sampling for one
 parameter upon subsequent parameters.  All methods are fully documented in step-by-step
 procedures in the Survey of the Nation's Lakes: Field Operations Manual (USEPA 2007).  The
 manual also contains detailed instructions for completing documentation, labeling samples, any
 field processing requirements, and sample storage and shipping.  Field communications will be
 through Field Team Coordinators, and may involve  regularly scheduled conference calls or
 contacts with the Communications Center.

       Standardized field data forms are the primary means of data recording. On completion,
 the data forms are reviewed by a  person other than the person who initially entered the
 information. Prior to departure from the field site, the field team leader reviews all forms and
 labels for completeness and legibility and ensures that all samples are properly labeled and
 packed.

       Upon return from field sampling to the office, completed data forms are sent to the
 information management staff at WED in Corvallis, Oregon for entry into  a computerized data
 base.  At WED, electronic data files are reviewed independently to verify that values are
 consistent with those recorded on the field data form or original field data file (see Section
 4.1.4).

       Samples are stored  or packaged for shipment in accordance with instructions contained
 in the Field Operations Manual. Precautions are taken so holding times are not exceeded.
 Samples which must be shipped are delivered to  a commercial carrier; copies of bills of lading
 or other documentation are maintained by the team. The Information Management Center is
 notified to track the sample  shipment; thus, tracing procedures can be initiated quickly in the
 event samples are not received.  Chain-of-custody forms are completed for all transfers of
 samples, with copies maintained  by the field team.

        The field operations phase is completed with collection of all samples or expiration of the
 sampling window, after which field crews are debriefed.  These debriefings cover all aspects of
 the field program and solicit suggestions for improvements.

 1.3.2  Overview of Laboratory  Operations

       Holding times for surface water samples vary with the sample types and analytes.  Thus,
 some analytical measurements begin during sampling (e.g., in situ profiles) while others are not
 initiated until sampling has been completed (e.g., phytoplankton, zooplankton). Analytical

methods are summarized in the specific Field and Laboratory SOPs that are companion
documents to this QAPP.
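
 Because holding times differ by sample type and analyte, laboratories track the interval between
 sample collection and the start of analysis. The following Python sketch illustrates such a check;
 the holding-time values and names used here are placeholders for illustration only, not the limits
 specified in the Laboratory Methods Manual.

# Illustrative sketch only: checking whether analysis began within a sample's maximum
# holding time. The holding-time values below are placeholders, not Lakes Survey limits.
from datetime import datetime, timedelta

MAX_HOLDING_TIME = {                     # hypothetical values for illustration
    "chlorophyll_a_filter": timedelta(days=28),
    "water_chemistry": timedelta(days=14),
}

def within_holding_time(sample_type: str, collected: datetime, analyzed: datetime) -> bool:
    """Return True if analysis began within the allowed holding time for this sample type."""
    return (analyzed - collected) <= MAX_HOLDING_TIME[sample_type]

# Example: a chlorophyll-a filter collected July 10 and analyzed August 5 is within 28 days.
print(within_holding_time("chlorophyll_a_filter",
                          datetime(2007, 7, 10), datetime(2007, 8, 5)))   # True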
 [Figure 1-3 is a flowchart: receive site packet; conduct desktop and/or field reconnaissance;
 determine whether the lake is part of the target population (if not, reject the site and select an
 alternate site); determine whether permission to sample has been obtained (if not, reject the
 site and select an alternate site); otherwise, sample the site.]

 Figure 1-3. Site verification process.
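
 The decision flow in Figure 1-3 can also be expressed as a short procedure. The sketch below is
 illustrative only: the field names (is_target, permission_obtained) and helper functions are
 hypothetical, and the authoritative criteria are those in the Site Evaluation Guidelines.

# Illustrative sketch only: the decision flow in Figure 1-3 expressed as code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Site:
    site_id: str
    is_target: bool            # lake belongs to the target population
    permission_obtained: bool  # access permission has been secured

def verify_site(site: Site) -> str:
    """Return 'sample' if the site can be sampled, otherwise 'replace' (select an alternate)."""
    if not site.is_target:
        return "replace"
    if not site.permission_obtained:
        return "replace"
    return "sample"

def select_sampleable_site(candidates: list) -> Optional[Site]:
    """Walk an ordered list of primary and alternate sites until one can be sampled."""
    for site in candidates:
        if verify_site(site) == "sample":
            return site
    return None

# Example: the primary site lacks permission, so the first sampleable alternate is chosen.
sites = [Site("NLA-001", True, False), Site("NLA-001-ALT1", True, True)]
print(select_sampleable_site(sites).site_id)   # NLA-001-ALT1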

 [Figure 1-4 divides tasks between Sampler A and Sampler B; the overall sequence of activities
 is listed below.]

 Verify lake as target and determine launch site; set up staging area.
 Prepare forms, equipment, and supplies; calibrate multi-probe meter.
 Load equipment and supplies onto boat.
 Locate index site and anchor boat (deepest point of lake; for lakes > 50 m deep, use a point
   50 m deep nearest the center of the lake).
 Measure in situ temperature, pH, and DO profile.
 Collect integrated water samples #1 and #2 (phytoplankton, chlorophyll-a, and algal toxin).
 Collect integrated water samples #3 and #4 (water chemistry).
 Collect zooplankton using Wisconsin nets (take each net tow from opposite sides of the boat).
 Collect sediment core; take mercury subsample and remove top and bottom slices for
   sediment diatoms.
 Locate and travel to physical habitat stations; conduct habitat characterizations; sample
   benthic macroinvertebrates in the littoral zone.
 Collect fecal indicator (Enterococci) sample at the 10th station.
 Return to shore.
 Preserve benthic sample and prepare for transport.
 Filter chlorophyll-a and fecal indicator (Enterococci) samples; prepare for transport.
 Check and prepare zooplankton, phytoplankton, and algal toxin samples for transport.
 Check and prepare water and sediment samples for transport.
 Clean and organize equipment for loading.
 Inspect and clean boat, motor, and trailer to prevent transfer of nuisance species and
   contaminants.
 Review data forms for completeness.
 Report back to Field Logistics Coordinator and Information Management Coordinator.

 Figure 1-4.   Summary of field activities and lake sampling.


       Chemical, physical, or biological analyses may be performed in-house or by contractor
 or cooperator laboratories.  Laboratories providing analytical support must have the appropriate
 facilities to properly store and prepare samples and appropriate instrumentation and staff to
 provide data  of the required quality within the time period dictated by the project.  Laboratories
 are expected to conduct operations using good laboratory practices. General guidelines for
 analytical support laboratories include the following:

    •  A program of scheduled maintenance of analytical balances, water purification systems,
       microscopes, laboratory equipment, and instrumentation.

    •  Verification of the calibration of analytical balances using class "S" weights which are
       certified by the National Institute of Standards and Technology (NIST).

    •  Verification of the calibration of top-loading balances using NIST-certified class "P"
       weights.

    •  Checking and recording the composition of fresh calibration standards against the
       previous lot. Acceptable comparisons are within 2 percent of the theoretical value.  (This
       acceptance criterion is tighter than the method calibration criteria; an illustrative check
       appears after this list.)

    •  Recording all analytical data in bound logbooks in ink, or on standardized recording
       forms.

    •  Verification of the calibration of uniquely identified daily use thermometers using NIST-
       certified thermometers.

    •  Monitoring and recording (in a logbook or on a recording form)  temperatures and
       performance of cold storage areas and freezer units (where samples, reagents, and
       standards may be stored).  During periods of sample collection operations, monitoring
       must be done on a daily basis.

    •  An overall program  of laboratory health and safety including periodic inspection and
       verification of presence and adequacy of first aid and spill kits;  verification of presence
       and performance of safety showers, eyewash stations, and fume hoods; sufficiently
       exhausted reagent storage units, where applicable; available chemical and hazardous
       materials inventory; and accessible material safety data sheets for all required materials.

    •  An overall program  of hazardous waste management and minimization, and evidence of
       proper waste handling and disposal procedures (90-day storage, manifested waste
       streams, etc.).

    •  If needed, having a source of reagent water meeting American Society for Testing and
       Materials (ASTM) Type I specifications for conductivity (< 1 µS/cm at 25 °C; ASTM
       1984) available in sufficient quantity to support analytical operations.

    •  Appropriate microscopes or other magnification for biological sample sorting  and
       organism identification.


    •  Labeling all containers used in the laboratory with date prepared, contents, and initials of
       the individual who prepared the contents.

    •  Dating and storing all chemicals safely upon receipt.  Chemicals are disposed of
       properly once they have passed their expiration date.

    •  Using a laboratory information management system to track the location and status of
       any sample received for analysis.

    •  Reporting results using standard formats and units compatible with the information
       management system.
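
 As an illustration of the calibration-standard check noted in the list above, the following Python
 sketch flags a freshly prepared standard whose measured value differs from the theoretical value
 by more than 2 percent. The function name and the example concentrations are hypothetical, not
 part of any laboratory's SOP.

# Minimal sketch of the fresh-calibration-standard check (2 percent acceptance criterion).
def check_calibration_standard(measured: float, theoretical: float,
                               tolerance_pct: float = 2.0) -> bool:
    """Return True if a fresh standard's measured value is within tolerance_pct percent
    of its theoretical value."""
    if theoretical == 0:
        raise ValueError("theoretical value must be non-zero")
    percent_difference = abs(measured - theoretical) / theoretical * 100.0
    return percent_difference <= tolerance_pct

# Example: a 10.00 mg/L standard measured at 10.15 mg/L passes (1.5% difference), while
# one measured at 10.35 mg/L fails (3.5%) and would be remade and re-verified.
assert check_calibration_standard(10.15, 10.00)
assert not check_calibration_standard(10.35, 10.00)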

       All laboratories providing analytical support to Lakes Survey must adhere to the
 provisions of this integrated QAPP. Laboratories will provide information documenting their
 ability to conduct the analyses with the required level of data quality before analyses begin.  The
 documentation will be sent to Otto Gutenson at EPA Headquarters. Such information might
 include results from interlaboratory comparison studies, analysis of performance evaluation
 samples, control charts and results of internal QC sample or internal reference sample analyses
 to document achieved precision, bias, accuracy, and method detection limits. Contracted
 laboratories will be required to provide copies of their Data Management Plan. Laboratory
 operations may be evaluated by technical systems audits, performance evaluation studies, and
 by participation in interlaboratory sample exchange.

 1.3.3  Data Analysis and Reporting

       A technical workgroup convened by the EPA Project Leader is responsible for
 development of a data analysis plan that includes a verification and validation strategy. These
 processes are described in the internal indicator research strategies and summarized in the
 indicator-specific sections of this QAPP. Validated data are transferred to the central data base
 managed by EMAP information management support staff located at WED in Corvallis.
 Information management activities are discussed further in Section 4.  Data in the WED data
 base are available to Cooperators for use in development of indicator metrics.  All validated
 measurement and indicator data from the Lakes Survey are eventually transferred to EPA's
 Water Quality Exchange (WQX), which will replace the STORET data management system.

 1.3.4  Peer Review

 The Survey will undergo a thorough peer review process, where the scientific community and
 the public will be given the opportunity to provide comments. Cooperators have been actively
 involved in the development of the overall project management, design, methods, and
 standards including the drafting of four key project documents:

    •  Quality Assurance Project Plan (EPA 841-B-07-003)
    •  Lake Evaluation Guidelines (EPA 841-B-06-003)
    •  Field Operations Manual (EPA 841-B-07-004)
    •  Laboratory Methods Manual (EPA 841-B-07-005)


 Outside scientific experts from universities, research centers, and other federal agencies have
 been instrumental in indicator development and will continue to play an important role in data
 analysis.

 The EPA will use a three-tiered approach for peer review of the Survey: (1) internal and
 external review by EPA, states, other cooperators and partners, (2) external scientific peer
 review, and (3) public review.

 Once data analysis has been  completed, cooperators will examine the results at regional
 meetings. Comments and feedback from the cooperators will be incorporated into the draft
 report. Public and scientific peer review will occur simultaneously. This public comment period
 is important to the process and will allow EPA to garner a broader perspective in examining the
 results before the final report is issued.  The public peer review is consistent with the Agency's and OMB's
 revised requirements for peer review.

 Below are the proposed measures EPA will implement for engaging in the peer review process:
    1)  Develop and maintain  a public website with links to standard operating procedures,
       quality assurance documents, fact sheets, cooperator feedback, and final report
    2)  Conduct technical workgroup meetings composed of scientific experts, cooperators, and
       EPA to evaluate and recommend data analysis options and indicators
    3)  Hold national meeting  where cooperators will provide input and guidance on data
       presentation and an approach for data analysis
    4)  Complete data validation on all  chemical, physical and biological data
    5)  Conduct final data analysis with workgroup to generate assessment results
    6)  Engage peer review contractor  to identify external peer review panel
    7)  Develop draft report presenting assessment results
    8)  Conduct regional meetings with cooperators to examine and comment on results
    9)  Develop final draft report incorporating input from cooperators and results from  data
       analysis group to be distributed for peer and public review
    10) Issue Federal Register (FR) Notice announcing document availability and hold
       scientific/peer review and public comment (30-45 days)
    11) Consider scientific and public comments and produce a final report

 The proposed peer review schedule is provided below and is contingent upon the timeliness of
 data validation and the availability of schedules for regional meetings and of experts for the data
 analysis workshop.

 May 2008 - December 2008         Data validation
 February 15, 2009                 Data analysis workshop
 February - March 2009             Internal peer review meetings with states, cooperators,
                                  participants
 August 15, 2009                   Release of draft report for external peer and public review

                          2.0    DATA QUALITY OBJECTIVES

       It is a policy of the U.S. EPA that Data Quality Objectives (DQOs) be developed for all
 environmental data collection activities following the prescribed DQO Process.  DQOs are
 qualitative and quantitative statements that clarify study objectives, define the appropriate types
 of data, and specify the tolerable levels of potential decision errors that will be used as the basis
 for establishing the quality and quantity of data needed to support decisions (EPA 2006).  Data


 quality objectives thus provide the criteria to design a sampling program within cost and
 resource constraints or technology limitations imposed upon a project or study. DQOs are
 typically expressed in terms of acceptable uncertainty (e.g., width of an uncertainty band or
 interval) associated with a point estimate at a desired level of statistical confidence (EPA 2006).
 The DQO Process is used to establish performance or acceptance criteria, which serve as the
 basis for designing a plan for collecting data of sufficient quality and quantity to support the
 goals of a study (EPA 2006). As  a general rule, performance criteria represent the full set of
 specifications that are needed to  design a data or information collection effort such that, when
 implemented, generate newly-collected data that are of sufficient quality and quantity to address
 the project's goals (EPA 2006). Acceptance criteria are specifications intended to evaluate the
 adequacy of one or more existing sources of information or data as being acceptable to support
 the project's intended use (EPA 2006).

 2.1    Data Quality Objectives  for Lakes Survey

        Target DQOs established for the Lakes Survey relate to the goal of describing the
 current status of selected indicators of the condition of lakes in the conterminous U.S. and in
 ecoregions of interest. The formal statement of the DQO for national estimates is as
 follows:

    •  Estimate the proportion of lakes (± 5%) in the conterminous U.S. that falls below the
       designated threshold for good conditions for selected measures with 95% confidence.

 For the ecoregions of interest the DQO is:

    •  Estimate the proportion of lakes (± 15%) in a specific ecoregion that fall below the
       designated threshold for good conditions for selected measures with 95% confidence.
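
 For illustration only, the following Python sketch shows how a margin of error of this kind relates
 to an approximate number of sampled lakes under a simple random sampling assumption. The
 Lakes Survey actually uses a probability-based design with survey weights (Section 3.0), so this
 is not the survey's design calculation; the function name and constants are assumptions of the
 sketch.

# Illustrative sketch only: normal-approximation margin of error for an estimated proportion.
import math

def approx_sample_size(margin: float, confidence_z: float = 1.96, p: float = 0.5) -> int:
    """Approximate number of lakes needed so a simple random sample estimates a
    proportion p within +/- margin at the stated confidence (z = 1.96 for 95%)."""
    n = (confidence_z ** 2) * p * (1.0 - p) / (margin ** 2)
    return math.ceil(n)

# National DQO: +/- 5% at 95% confidence -> roughly 385 lakes under these assumptions.
print(approx_sample_size(0.05))   # 385
# Ecoregion DQO: +/- 15% at 95% confidence -> roughly 43 lakes per ecoregion.
print(approx_sample_size(0.15))   # 43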

 2.2    Measurement Quality Objectives

       For each parameter, performance objectives (associated primarily with measurement
 error) are established for several different data quality indicators (following USEPA Guidance for
 Quality Assurance Project Plans, EPA/240/R-02/009).  Specific measurement quality objectives (MQOs)
 for each parameter are presented in the indicator section of this QAPP. The following sections
 define the data quality indicators and present approaches for evaluating them against
 acceptance criteria established for the program.

 2.2.1  Laboratory Reporting  Level (Sensitivity)

       For chemical measurements, requirements for the method detection limit (MDL) are
 typically established. The MDL is defined as the lowest level of analyte that can be
 distinguished from zero with 99 percent confidence based on a single measurement (Glaser et
 al., 1981).  USGS NWQL has developed a variant of the MDL called the long-term MDL (LT-
 MDL) to capture greater method variability (Oblinger Childress et al. 1999). Unlike MDL, it is
 designed to incorporate more of the measurement variability that is typical for routine analyses
 in a  production laboratory, such as multiple instruments, operators, calibrations, and sample
 preparation events (Oblinger Childress et al. 1999). The LT-MDL determination ideally employs
 at least 24 spiked samples prepared and analyzed by multiple analysts on multiple instruments
 over a 6- to 12-month period at a  frequency of about two samples per month (EPA 2004).  The


 LT-MDL uses "F-pseudosigma" (Fσ) in place of s, the sample standard deviation, used in the
 EPA MDL calculation. F-pseudosigma is a non-parametric measure of variability that is based
 on the interquartile range of the data (EPA 2004). The LT-MDL may be calculated using either
 the mean or median of a set of long-term blanks, or from long-term spiked sample results
 (depending on the analyte and specific analytical method). The LT-MDL for an individual analyte
 is calculated as:

Equation 1a                     LT-MDL = M + t(n-1, 0.99) x Fσ

where M is the mean or median of blank results; n is the number of spiked sample results;
t(n-1, 0.99) is the Student's t value for n-1 degrees of freedom at the 99 percent confidence
level; and Fσ is F-pseudosigma, a nonparametric estimate of variability calculated as:

Equation 1b                     Fσ = (Q3 - Q1) / 1.349

where Q3 and Q1 are the 75th percentile and 25th percentile of spiked sample results,
respectively.

       LT-MDL is designed to be used in conjunction with a laboratory reporting level (LRL;
Oblinger Childress et al. 1999). The LRL is designed to achieve a risk of <1% for both false
negatives and false positives (Oblinger Childress et al. 1999). The LRL is set as a  multiple of
the LT-MDL, and is calculated as follows:

                                   LRL = 2 x LT-MDL

Therefore, multiple measurements of a sample having a true concentration at the LRL should
result in the concentration being detected and reported 99 percent of the time (Oblinger
Childress et al. 1999).
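
       As a minimal illustration (not the laboratories' prescribed procedure), the LT-MDL and LRL
 calculations described above could be carried out as in the following Python sketch; the blank
 and spiked-sample results and the Student's t value shown are hypothetical.

        import statistics

        def f_pseudosigma(values):
            """F-pseudosigma (Equation 1b): interquartile range divided by 1.349."""
            q1, _, q3 = statistics.quantiles(values, n=4)
            return (q3 - q1) / 1.349

        def lt_mdl(blank_results, spiked_results, t_99):
            """Long-term MDL (Equation 1a): blank median plus the one-sided 99%
            Student's t value (n - 1 degrees of freedom) times F-pseudosigma."""
            return statistics.median(blank_results) + t_99 * f_pseudosigma(spiked_results)

        blanks = [0.001, 0.000, 0.002, 0.001]                              # mg/L
        spikes = [0.048, 0.052, 0.050, 0.047, 0.055, 0.051, 0.049, 0.053]  # n = 8
        mdl = lt_mdl(blanks, spikes, t_99=2.998)    # t for 7 degrees of freedom
        lrl = 2 * mdl                               # laboratory reporting level
        print(round(mdl, 4), round(lrl, 4))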

       All laboratories will develop calibration curves for each batch of samples that include a
calibration standard with an analyte concentration equal to the LRL. Estimates of LRLs (and
how they are determined) are required to be submitted with analytical results. Analytical results
associated with LRLs that exceed the objectives are flagged as  being associated with
unacceptable LRLs. Analytical data that are below the estimated LRLs are reported, but are
flagged as being below the LRLs.

2.2.2  Precision,  Bias, and Accuracy

       Precision and bias are estimates of random and systematic error in a measurement
process (Kirchmer, 1983; Hunt and Wilson, 1986; USEPA 2002). Collectively, precision and
bias provide an estimate of the total error or uncertainty associated with an individual
measurement or set of measurements. Systematic errors are minimized by using validated
methods and standardized procedures across all laboratories.  Precision is estimated from
repeated measurements of samples.  Net bias is determined from repeated measurements of
solutions of known composition, or from the analysis of samples that have been fortified by the
addition of a known quantity of analyte. For analytes with large  ranges of expected

 concentrations, MQOs for precision and bias are established in both absolute and relative
 terms, following the approach outlined in Hunt and Wilson (1986). At lower concentrations,
 MQOs are specified in absolute terms.  At higher concentrations, MQOs are stated in relative
 terms. The point of transition between an absolute and relative MQO is calculated as the
 quotient of the absolute objective divided by the relative objective (expressed as a proportion,
 e.g., 0.10 rather than as a percentage, e.g., 10%).

       Precision in absolute terms is estimated as the sample standard deviation when the
 number of measurements is greater than two:
Equation 1                      s = sqrt[ Σ(xi - x̄)² / (n - 1) ]

where xi is the value of each replicate, x̄ is the mean of the repeated sample measurements, and
n is the number of replicates.  Relative precision for such measurements is estimated as the
relative standard deviation (RSD, or coefficient of variation [CV]):

Equation 2                      RSD = (s / x̄) x 100

where s is the sample standard deviation of the set of measurements, and x̄ equals the
mean value for the set of measurements.

       Precision based on duplicate measurements is estimated based on the range of
measured values (which equals the difference for two measurements).  The relative percent
difference (RPD) is calculated as:
Equation 3                      RPD = [ |A - B| / ((A + B)/2) ] x 100

where A is the first measured value and B is the second measured value.
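
       The following Python sketch (illustrative only; the replicate values are hypothetical)
 applies Equations 1 through 3 to estimate precision from repeated and duplicate measurements.

        import statistics

        def relative_std_dev(values):
            """Relative standard deviation (Equation 2): 100 * s / mean, for n > 2 replicates."""
            return 100 * statistics.stdev(values) / statistics.mean(values)

        def relative_percent_difference(a, b):
            """Relative percent difference (Equation 3) for duplicate measurements."""
            return abs(a - b) / ((a + b) / 2) * 100

        replicates = [10.2, 10.5, 9.8, 10.1]           # hypothetical replicate results
        print(round(statistics.stdev(replicates), 3))  # s (Equation 1)
        print(round(relative_std_dev(replicates), 2))  # RSD, in percent
        print(round(relative_percent_difference(10.2, 10.6), 2))  # RPD, in percent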

       For repeated measurements of samples of known composition, net bias (B) is estimated
in absolute terms as:

Equation 4                          B = x̄ - T

where x̄ equals the mean value for the set of measurements, and T equals the theoretical
or target value of a performance evaluation sample.  Bias in relative terms (B[%]) is
calculated as:

Equation 5                      B(%) = [ (x̄ - T) / T ] x 100

 where x̄ equals the mean value for the set of measurements, and T equals the theoretical or
 target value of a performance evaluation sample.

       Accuracy is estimated for some analytes from fortified or spiked samples as the percent
 recovery.  Percent recovery is calculated as:


 Equation 6                   % recovery = [ (Cs - Cu) / Csp ] x 100

where Cs is the measured concentration of the spiked sample, Cu is the concentration of the
unspiked sample, and Csp is the concentration of the spike added.
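
       As an illustrative sketch of Equations 4 through 6 (the performance evaluation results,
 target value, and spike concentrations below are hypothetical), net bias, relative bias, and
 percent recovery could be computed as follows.

        import statistics

        def net_bias(measurements, target):
            """Net bias (Equation 4): mean of repeated measurements minus the target value."""
            return statistics.mean(measurements) - target

        def relative_bias(measurements, target):
            """Relative bias (Equation 5), as a percentage of the target value."""
            return net_bias(measurements, target) / target * 100

        def percent_recovery(spiked, unspiked, spike_added):
            """Percent recovery (Equation 6) for a fortified (spiked) sample."""
            return (spiked - unspiked) / spike_added * 100

        pe_results = [4.9, 5.1, 5.0, 4.8]              # PE sample results; target = 5.0
        print(round(net_bias(pe_results, 5.0), 3))
        print(round(relative_bias(pe_results, 5.0), 2))
        print(round(percent_recovery(spiked=14.8, unspiked=5.0, spike_added=10.0), 1))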
       Precision and bias within each laboratory are monitored for every sample batch by the
 analysis of internal QC samples.  Samples associated with unacceptable QC sample results are
 reviewed and re-analyzed if necessary.  Precision and bias across all laboratories will be
 evaluated after analyses are completed  by using the results of performance evaluation (PE)
 samples sent to all laboratories (3 sets of 3 PE samples, with each set consisting of a low,
 moderate, and high concentration sample of all analytes).

 2.2.3  Taxonomic Precision and Accuracy

       For the Lakes Survey, taxonomic precision will be quantified by comparing whole-
 sample identifications completed  by independent taxonomists or laboratories.  Accuracy of
 taxonomy will be qualitatively evaluated  through specification of target hierarchical levels (e.g.,
 family, genus, or species); and the specification of appropriate technical taxonomic literature or
 other references (e.g., identification keys, voucher specimens). To calculate taxonomic
 precision, 10 percent of the samples will be randomly-selected for re-identification by an
 independent, outside taxonomist  or laboratory. Comparison of the results of whole sample re-
 identifications will provide a  Percent Taxonomic Disagreement (PTD) calculated as:
Equation 7                   PTD = [ 1 - (comppos / N) ] x 100
where comppos is the number of agreements, and N is the total number of individuals in the
larger of the two counts. The lower the PTD, the more similar the taxonomic results and the
better the overall taxonomic precision. An MQO of 15% is recommended for taxonomic difference
(overall mean <15% is acceptable). Individual samples exceeding 15% are examined for
taxonomic areas of substantial  disagreement, and the reasons for disagreement investigated.
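
       The following Python sketch illustrates the PTD calculation for a whole-sample
 re-identification. It is a simplified example: agreements are counted here as the individuals
 assigned the same taxon by both identifications (per-taxon minimum counts), and the taxa and
 counts shown are hypothetical.

        from collections import Counter

        def percent_taxonomic_disagreement(primary_counts, qc_counts):
            """Percent taxonomic disagreement (Equation 7). Inputs map each taxon to the
            number of individuals assigned to it by the primary and QC identifications."""
            c1, c2 = Counter(primary_counts), Counter(qc_counts)
            comp_pos = sum(min(c1[t], c2[t]) for t in set(c1) | set(c2))
            n = max(sum(c1.values()), sum(c2.values()))
            return (1 - comp_pos / n) * 100

        primary = {"Daphnia pulex": 12, "Bosmina longirostris": 8, "Cyclops": 5}
        qc      = {"Daphnia pulex": 11, "Bosmina longirostris": 8, "Cyclops": 4, "Diaptomus": 2}
        print(round(percent_taxonomic_disagreement(primary, qc), 1))  # 8.0, within the 15% MQO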

       Where re-identification by an independent, outside taxonomist or laboratory is not
practical (i.e., phytoplankton, zooplankton, diatoms), percent similarity will be calculated.
Percent similarity is a measure of similarity between two communities or two samples
(Washington 1984). Values range from 0% for samples with no species in common, to 100% for
samples which are identical.  It is calculated as follows:

Equation 8                        PSC = 1 - 0.5 Σ|a - b|

where a and b are, for a given species, the relative proportions of the total samples A and B,
respectively, which that species represents, and the summation is taken over all K species
present in either sample. An MQO of >85% is recommended for percent
similarity of taxonomic identification. If the MQO is not met, the reasons for the discrepancies
between analysts should be discussed. If a major discrepancy is found in how the two analysts
have been identifying organisms, the last batch of samples that have been counted by the
analyst under review may have to be recounted.
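
       As a minimal sketch of the percent similarity calculation (Equation 8), with hypothetical
 taxa, counts converted to relative proportions, and the result expressed as a percentage:

        def percent_similarity(sample_a, sample_b):
            """Percent similarity of community composition (Equation 8); inputs map each
            taxon to its count, which is converted to a relative proportion."""
            taxa = set(sample_a) | set(sample_b)
            total_a, total_b = sum(sample_a.values()), sum(sample_b.values())
            overlap = 1 - 0.5 * sum(
                abs(sample_a.get(t, 0) / total_a - sample_b.get(t, 0) / total_b) for t in taxa
            )
            return overlap * 100

        analyst_1 = {"Asterionella": 40, "Fragilaria": 35, "Tabellaria": 25}
        analyst_2 = {"Asterionella": 45, "Fragilaria": 30, "Tabellaria": 20, "Synedra": 5}
        print(round(percent_similarity(analyst_1, analyst_2), 1))  # 90.0, above the 85% MQO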

       Additionally, percent similarity should be calculated for re-processed subsamples. This
provides a quantifiable measure of the precision of subsampling procedures employed for
various parameters (i.e., phytoplankton, zooplankton, diatoms). An MQO of >70% is
recommended for percent similarity of subsamples.  If a sample does not meet this threshold,
additional subsamples should be processed from that sample until the MQO is achieved.

       Sample enumeration is another component of taxonomic precision. Final specimen
counts for samples are dependent on the taxonomist, not the rough counts obtained during the
sorting activity.  Comparison of counts is quantified by calculation of percent difference in
enumeration (PDE), calculated as:

Equation 9                         PDE = [ |Lab1 - Lab2| / (Lab1 + Lab2) ] x 100

where Lab1 and Lab2 are the numbers of organisms counted in the original identification and the
QC re-identification of the sample, respectively.
An MQO of 5% is recommended (overall mean of <5% is acceptable). Individual samples
exceeding 5% are examined to determine reasons for the exceedance.
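
       A brief worked illustration of Equation 9 (the counts are hypothetical): with an original
 count of 312 organisms and a QC recount of 305, PDE = |312 - 305| / (312 + 305) x 100 = 1.1%,
 which is within the 5% MQO. In Python:

        def percent_difference_in_enumeration(lab1_count, lab2_count):
            """Percent difference in enumeration (Equation 9) between the original count
            (Lab1) and the QC recount (Lab2)."""
            return abs(lab1_count - lab2_count) / (lab1_count + lab2_count) * 100

        print(round(percent_difference_in_enumeration(312, 305), 2))  # 1.13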

       Corrective actions for samples exceeding these MQOs can include defining the taxa for
which re-identification may be necessary (potentially even by a third party), for which samples
(even outside of the 10% lot of QC samples) it is necessary, and where there may be issues of
nomenclatural or enumeration  problems.

       Taxonomic accuracy is evaluated by having individual specimens representative of
selected taxa identified by recognized experts. Samples will be identified using the most
appropriate technical literature that is accepted by the taxonomic discipline and reflects the
accepted nomenclature.  Where necessary, the Integrated Taxonomic Information System (ITIS,
http://www.itis.usda.gov/) will be used to verify nomenclatural validity and spelling. A reference
collection will be compiled as the samples are identified.  Specialists in several taxonomic
groups will verify selected individuals of different taxa, as determined by the Lakes Survey
workgroup.

2.2.4  Completeness

       Completeness requirements  are established and evaluated from two perspectives. First,
valid data for individual parameters must be acquired from a minimum number of sampling
locations in order to make subpopulation estimates with a specified  level of confidence or

 sampling precision. The objective of this study is to complete sampling at 95% or more of the
 1000 initial sampling sites. Percent completeness is calculated as:


 Equation 10                       %C = (V / T) x 100


 where V is the number of measurements/samples judged valid, and T is the total number of
 planned measurements/samples. Within each indicator, completeness objectives are also
 established for individual samples or individual measurement variables or analytes.  These
 objectives are estimated as the percentage of valid data obtained versus the amount of data
 expected based on the number of samples collected or number of measurements conducted.
 Where necessary,  supplementary objectives for completeness are presented in the indicator-
 specific sections of this QAPP.
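
       For example (hypothetical counts), if 962 of the 1000 planned sampling events yield valid
 results, Equation 10 gives %C = (962 / 1000) x 100 = 96.2%, which meets the 95% objective. In
 Python:

        def percent_completeness(valid, planned):
            """Percent completeness (Equation 10)."""
            return valid / planned * 100

        print(percent_completeness(valid=962, planned=1000))  # 96.2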

       The completeness objectives are established for each measurement per site type (e.g.,
 probability sites,  revisit sites, etc.). Failure to achieve the minimum requirements for a particular
 site type results in  regional population estimates having wider confidence intervals.  Failure to
 achieve requirements  for repeat sampling (10% of samples collected) and revisit samples (10%
 of sites visited) reduces the precision of estimates of index period and annual variance
 components, and may impact the representativeness of these estimates  because of possible
 bias in the set of measurements obtained.

 2.2.5  Comparability

       Comparability is defined as the confidence with which one data set can be compared to
 another (USEPA 2002).  A performance-based methods approach is being utilized for water
 chemistry and chlorophyll-a analyses that defines a set of laboratory method performance
 requirements for data  quality. Following this approach, participating laboratories may choose
 which analytical methods they will use for each target analyte as  long as  they are able to
 achieve the performance requirements as listed in Table 5-1. For all parameters, comparability
 is addressed by the use of standardized sampling procedures and analytical methods by all
 sampling crews and laboratories. Comparability of data within and  among parameters is also
 facilitated by the implementation of standardized quality assurance and quality control
 techniques and standardized performance and acceptance criteria. For all measurements,
 reporting units and format are specified, incorporated into standardized data recording forms,
 and documented in the information management system. Comparability  is also addressed by
 providing results of QA sample data, such as estimates of precision and bias, conducting
 methods comparison studies when requested by the grantees and conducting interlaboratory
 performance evaluation studies among state, university, and Lakes Survey contract
 laboratories.

 2.2.6  Representativeness

       Representativeness is defined as "the degree to which the data accurately and precisely
 represent a characteristic of a population parameter, variation of  a property, a process
 characteristic, or an operational condition" (USEPA 2002). At one level, representativeness is
 affected by problems in any or all of the other data quality indicators.

       At another level, representativeness is affected by the selection of the target surface
 water bodies, the location of sampling sites within that body, the time  period when samples are

 collected, and the time period when samples are analyzed. The probability-based sampling
 design should provide estimates of condition of surface water resource populations that are
 representative of the region. The individual sampling programs defined for each indicator
 attempt to address representativeness within the constraints of the response design, (which
 includes when, where, and how to collect a sample at each site).  Holding time requirements for
 analyses ensure analytical results are representative of conditions at the time of sampling.  Use
 of duplicate (repeat) samples which are similar in composition to samples being measured
 provides estimates of precision and bias that are applicable to sample measurements.
                    3.0    SAMPLING DESIGN AND SITE SELECTION

       The overall sampling program for the Lakes Survey project requires a randomized,
 probability-based approach for selecting lakes where sampling activities are to be conducted.
 Details regarding the specific application of the probability design to surface waters resources
 are described in Paulsen et al. (1991) and Stevens (1994). The specific details for the collection
 of samples associated with different indicators are described in the indicator-specific sections of
 this QAPP.

 3.1    Probability Based Sampling Design and Site Selection

       The target population for this project includes all lakes, reservoirs,  and ponds within the
 48 contiguous United States greater than 4 hectares (10 acres) in surface area that are
 permanent waterbodies. Lakes that are saline are excluded as are those  used for aquaculture,
 disposal-tailings, sewage treatment, evaporation, or other unspecified disposal use. The
 National Hydrography Dataset (NHD) was employed to derive a list of lakes for potential
 inclusion into the survey. The overall sample size was set to include 1000 lake sampling
 events, of which 909 are discrete lake samples and 91 are revisits.

       A Generalized Random Tessellation Stratified (GRTS) survey design for a finite resource
 was used for site selection. The design was developed to include a representative subset of the
 lakes that were sampled in EPA's National Lake Eutrophication Study (NES), which will allow for
 an extrapolation of changes to the full set of NES lakes. Lake selection for the survey provided
 for five size class categories (4-10 ha, 10-20 ha, 20-50 ha, 50-100 ha, >100 ha), as well as
 spatial distribution across the lower 48 states and nine aggregated Omernik Level 3 ecoregions.
 Small lakes (1-4 ha in size) were also included in the selection process  so that states
 may elect to include these  smaller lakes in state-level assessment efforts. An additional 4000
 lakes were selected as potential replacement lakes (oversample  sites).  The oversample is used
 to replace a candidate lake that is determined to be non-target or to replace a target lake that is
 not accessible due to landowner denials, physical barriers, or safety concerns. Replacement
 sites should be taken from  the Oversample list in order.

       Lakes were selected using a two-stage process employing a systematic grid of sampling
 points developed for use by all EMAP resource groups (Overton  et al. 1991). The selection
 process is automated, using digital maps and geographic information system (GIS) techniques
 and  equipment (Selle et al. 1991).

       QA for GIS methods is focused on aspects of accuracy (e.g., how well digitized maps
 represent what is actually present at a location) and the representativeness of this
 information.  Three basic types of errors have been identified by the EMAP design group:

   •   Map-related errors: These are errors due to inconsistencies between different types (or
       scales) of maps (e.g., paper maps versus digitized versions).

   •   Landscape-related errors: These are errors due to changes occurring at a site since the
       corresponding map was last revised. Such changes could be natural (due to natural
       successional processes) or anthropogenic (e.g., draining a manmade lake or reservoir).

   •   Other errors:  Software developed for digitizing maps or other associated GIS
       processing applications may introduce errors.

       The GIS staff at WED that support surface waters research in EMAP have developed
QC procedures for controlling some of these errors. Other types of errors are quantified as they
are discovered, essentially by using ground truthing as a standard for comparison.

       The first stage of the probability sample (termed the "Tier I" sample) is developed by
intersecting the spatial file of surface water body information with a second file containing spatial
information related to the EMAP systematic sampling  grid.  This information includes locational
information regarding the sampling points on the grid  and an associated 40-km2 hexagon area
centered on each sampling point. The Tier I sample represents all surface water bodies whose
digitized labeling points are located within the boundaries of one of the hexagons.

       A QC check is made by comparing a selected subset of the Tier I sample  against the
parent DLGs. Any noted discrepancies are reconciled by using the corresponding paper
topographic maps.  Error rates for the frame are extrapolated from the error rates found in the
Tier I sample.

       The second stage of site selection involves selecting a subset of the Tier I sample. This
subset (termed the "Tier II" sample),  represents sites  that are expected to be visited by field
sampling crews. The Tier II sample is selected through a process that incorporates the desired
Tier II sample size stratified into multi-density categories. Sites are selected randomly from the
Tier I sample, with the constraint that the spatial distribution of sites be preserved. Each Tier II
site has an associated inclusion probability with which any measured attribute can be related to
the target population of sites.
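
       As a simplified sketch of how the inclusion probabilities support design-weighted
 population estimates (a Horvitz-Thompson-type calculation; the actual survey analysis uses
 GRTS-specific estimators and variance calculations, and the site values shown are hypothetical):

        def estimated_proportion(site_results):
            """Design-weighted estimate of the proportion of the target population in a
            given condition class. Each entry is (inclusion_probability, in_class); the
            site weight is the reciprocal of its inclusion probability."""
            weights = [1 / p for p, _ in site_results]
            in_class = [w for (_, ok), w in zip(site_results, weights) if ok]
            return sum(in_class) / sum(weights)

        sites = [(0.002, True), (0.004, False), (0.002, True), (0.001, False), (0.003, True)]
        print(round(estimated_proportion(sites), 3))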

Revisit Sites: Of the sites visited in the field and found to be target sites, a total  of 10% will be
revisited. The primary purpose of this revisit set of sites is to allow variance estimates that
would provide information on the extent to which the population estimates might vary if they
were sampled at a different time.

Oversample Lake Sites: The number of sites that must be evaluated to achieve the expected
number of field sites that can be sampled can only be estimated based on assumptions
concerning expected error rates in NHD, percent of landowner refusals, and percent of
physically inaccessible sites. Based on the estimates gained in previous studies, a list of 4000
 alternate sites was selected at the same time as the base sites.  The large oversample size was
 chosen primarily to accommodate those states that may want to increase the number of lakes
sampled within their state for a state-level design.  Alternate sites must be used in order until the
desired sample size has been achieved.

                          4.0    INFORMATION MANAGEMENT

       Like QA, information management (IM) is integral to all aspects of the Lakes Survey
 from initial selection of sampling sites through dissemination and reporting of final, validated
 data. QA and QC measures implemented for the IM system are aimed at preventing corruption
 of data at the time of their initial incorporation into the system and maintaining the integrity of
 data and information after incorporation into the system. The general organization of, and
 QA/QC measures associated with, the IM systems are described in this section.

 4.1    Overview of System Structure

       At each point where data and information are generated, compiled, or stored, the
 information must be managed. Thus, the IM system includes all of the data-generating
 activities, all of the means of recording and storing information, and all of the processes which
 use data. The IM system includes both hardcopy and electronic means of generating,  storing,
 and archiving data.  All participants in the Lakes Survey have certain responsibilities and
 obligations which make them a part of the IM system.  In its entirety, the IM system includes site
 selection and logistics information, sample labels and field data forms, tracking records, map
 and analytical data, data validation and analysis processes, reports, and archives. IM staff
 supporting the Lakes Survey at WED provides support and guidance to all program operations
 in addition to maintaining a  central data base management system for the  Lakes Survey data.

       The central repository for data and associated information collected for use by the Lakes
 Survey is a secure,  access-controlled server located at WED-Corvallis. The general
 organization of the information management system is presented  in Figure 4-1.  Data are stored
 and managed on this system using the Statistical Analysis System (SAS) software package.
 This centrally managed IM system is the primary data management center for the Lakes Survey
 research conducted at WED and elsewhere. The IM staff receives, enters, and maintains data
 and information generated by the site selection process (see Section 3), field sample and data
 collection, map-based measurements, laboratory analyses, and verification and validation
 activities completed by the indicator leads.  In addition to this inflow, the IM system provides
 outflow in provision  of data files to Lakes Survey staff and other users.  The IM staff at  WED is
  responsible for maintaining the security and integrity of both the data and the system.

       The following sections describe the major inputs to the central data base and the
 associated QA/QC processes used to record, enter, and validate measurement and analytical
 data collected for EMAP surface waters research projects. Activities to maintain the integrity
 and assure the quality of the contents of the IM system are also described.

[Figure 4-1 is a schematic of the information management system. It shows sample site
information (Tier II list frame, logistics data, and site verification data); indicator research
and development information (field data, laboratory data, QA/QC data, tracking data, historical
data, and stressor data); assessment and reporting information by indicator (annual population
status data, population trend data, and spatial (GIS) data); and supporting meta-data (data base
documentation, quality assurance documentation, IM system user guides, and methods
documentation).]

Figure 4-1. Organization of information management system modeled after EMAP Surface Water
Information Management (SWIM) system for the Lakes Survey.

4.1.1   Design and Site Status Data Files

       The site selection process described in Section 3 produces a list of candidate sampling
locations, inclusion probabilities, and associated site classification data (e.g., target status,
ecoregion, etc.). This "design" data file is  provided to the IM staff, implementation coordinators,
and field coordinators. Field coordinators  determine ownership and contacts for acquiring
permission to access each site, and conduct site evaluation and reconnaissance activities.
Ownership, site evaluation, and reconnaissance information for each site are compiled into a
"site status" data file. Generally, standardized forms are used during reconnaissance activities.
Information from these forms may be entered into a SAS compatible data management system.
Whether in electronic or  hardcopy format,  a  copy of the logistics data base is provided to the IM
for archival.

4.1.2   Sample Collection and Field Data Recording

       Prior to initiation of field activities, the IM staff works with the indicator leads and
analytical support laboratories to  develop standardized field data forms and sample labels.
Preprinted adhesive labels having a standard recording format are completed and affixed to
each sample container.  Precautions are taken to ensure that label information remains legible
and the label remains attached to the sample.  Examples of sample labels are presented in the
Field Operations Manual.

       Field sample collection and data forms are designed  in conjunction with IM staff to
ensure the format facilitates field  recording and subsequent data entry tasks. All forms which
may be used onsite are printed on water-resistant paper.  Copies of the field data forms and
instructions for completing each form are documented in the Field Operations Manuals.
Recorded data are reviewed upon completion of data collection and recording activities by a
person other than the one who completed the form. Field crews check completed data forms

 and sample labels before leaving a sampling site to ensure information and data were recorded
 legibly and completely. Errors are corrected if possible, and data considered as suspect are
 qualified using a flag variable. The field sampling crew enters explanations for all flagged data
 in a comments section. Completed field data forms are transmitted to the IM staff at WED for
 entry into the central data base management system; indicator leads also receive copies of all
 field-recorded data.

       If portable PCs (or handheld data recorders) are to be used in the field, user screens are
 developed that duplicate the standardized form to facilitate data entry.  Specific output formats
 are available to print data for review and for production of shipping forms. Data may be
 transferred via modem on a daily basis. Each week CDs containing all down-loaded data for
 the week are mailed to the IMC.

       All samples are tracked from the point of collection. If field PCs are used, tracking
 records are generated by custom-designed software.   Hardcopy tracking and custody forms are
 completed if PCs are not available for use. Copies of the shipping and custody record
 accompany all sample transfers; other copies are transmitted to the IMC and applicable
 indicator lead. Samples are tracked to ensure that they are delivered to the appropriate
 laboratory,  that lost shipments can be quickly identified and traced, and that any problems with
 samples observed when received at the laboratory are reported promptly so that corrective
 action can be taken if necessary. Detailed procedures on shipping and sample tracking can be
 found in the Field Operations Manual.

       Procedures for completion of sample labels and field data forms, and use of PCs are
 covered extensively in training sessions.  General QC  checks and procedures associated with
 sample collection and transfer, field measurements, and field data form completion for most
 indicators are listed  in Table 4-1.  Additional QA/QC checks or procedures specific to individual
 indicators are described in the indicator sections  in Section 5 of this QAPP.

 4.1.3  Laboratory Analyses and Data Recording

       Upon receipt of a sample shipment, analytical laboratory receiving personnel check the
 condition and identification of each sample against the sample tracking record. Each sample is
 identified by information written on the  sample label and by a barcode label. Any discrepancies,
 damaged samples, or missing samples are reported to the IM staff and indicator lead by
 telephone.

       Most of the laboratory  analyses for the Lakes Survey indicators, particularly chemical
 and physical analyses, follow or are based on standard methods. Standard methods generally
 include requirements for QC checks and procedures.   General laboratory QA/QC procedures
 applicable to most Lakes  Survey indicators are described in Table 4-2. Additional QA/QC
 samples and procedures specific to individual indicator analyses are described in the indicator
 sections of this QAPP. Biological sample  analyses are generally based on current acceptable
 practices within the particular biological discipline. Some  QC checks and procedures applicable
 to most Lakes Survey biological samples are described in Table 4-3. Additional QA/QC
 procedures specific to individual parameters are described in the indicator section of this QAPP.
 Table 4-1. Sample and field data quality control activities

  Quality Control Activity     Description and/or Requirements
  Contamination Prevention     All containers for individual site sealed in plastic bags until use;
                               specific contamination avoidance measures covered in training
  Sample Identification        Pre-printed labels with unique ID number on each sample
  Data Recording               Data recorded on pre-printed forms of water-resistant paper; field
                               sampling crew reviews data forms for accuracy, completeness, and
                               legibility
  Data Qualifiers              Defined qualifier codes used on data form; qualifiers explained in
                               comments section on data form
  Sample Custody               Unique sample ID and tracking form information entered in LIMS;
                               sample shipment and receipt confirmed
  Sample Tracking              Sample condition inspected upon receipt and noted on tracking form
                               with copies sent to Indicator Lead and/or IM
  Data Entry                   Data entered using customized entry screens that resemble the data
                               forms; entries reviewed manually or by automated comparison of
                               double entry
  Data Submission              Standard format defined for each measurement including units,
                               significant figures, decimal places, accepted code values, and
                               required field width
  Data Archival                All data records, including raw data, archived in an organized
                               manner in compliance with EPA and Federal Government records
                               management policies; processed samples and reference collections of
                               taxonomic specimens submitted for cataloging and curation at an
                               appropriate museum facility
Table 4-2. Laboratory data quality control activities
  Quality Control Activity     Description and/or Requirements
  Instrument Maintenance       Follow manufacturer's recommendations and specific guidelines in
                               methods; maintain logbook of maintenance/repair activities
  Calibration                  Calibrate according to manufacturer's recommendations and guidelines
                               given in Section 5.1.5; recalibrate or replace before analyzing any
                               samples
  QC Data                      Maintain control charts, determine LT-MDLs and achieved data
                               attributes; include QC data summary (narrative and compatible
                               electronic format) in submission package
  Data Recording               Use software compatible with EMAP-SWIM system; check all data
                               entered against the original bench sheet to identify and correct
                               entry errors; review other QA data (e.g., condition upon receipt)
                               for possible problems with samples or specimens
  Data Qualifiers              Use defined qualifier codes; explain all qualifiers
  Data Entry                   Automated comparison of double entry or 100% manual check against
                               original data form
  Submission Package           Includes: letter by the laboratory manager; data, data qualifiers,
                               and explanations; electronic format compatible with EMAP-SWIM
                               system; documentation of file and data base structures, variable
                               descriptions and formats; summary report of any problems and
                               corrective actions implemented
Table 4-3. Biological sample quality control activities

  Quality Control Activity           Description and/or Requirements
  Taxonomic Nomenclature             Use accepted common and scientific nomenclature and unique
                                     entry codes
  Taxonomic Identifications          Use standard taxonomic references and keys; maintain
                                     bibliography of all references used
  Independent Identifications        Uncertain identifications to be confirmed by expert in
                                     particular taxa
  Duplicate Identifications          At least 5% of all samples completed per taxonomist
                                     re-identified by a different analyst; less than 10% assigned
                                     different ID
  Taxonomic Reasonableness Checks    Species or genera known to occur in given conditions or
                                     geographic area
  Reference Collections              Permanent mounts or voucher specimens of all taxa encountered
       A laboratory's IM system may consist of only hardcopy records such as bench sheets
and logbooks, an electronic laboratory information management system (LIMS), or some
combination of hardcopy and electronic records.  Laboratory data records are reviewed at the
end of each analysis day by the designated laboratory onsite QA coordinator or by supervisory
personnel.  Errors are corrected if possible, and data considered as suspect by laboratory
analysts are qualified with a flag variable. All flagged data are explained in a comments section.
Private contract laboratories generally have a laboratory quality assurance plan and established
procedures for recording, reviewing, and validating analysis data.

       Once analytical data have passed all of the laboratory's internal review procedures, a
submission package is prepared and transferred to the IM staff. The contents of the submission
package are largely dictated by the type of analysis (physical, chemical, or biological), but
generally includes at least the elements listed in the  Field and Laboratory Operations Manuals.

       Remaining sample material and voucher specimens may be transferred to EPA's
designated laboratory or facilities as directed by the  EPA Project Leader. All samples and raw
data files (including  logbooks, bench sheets, and  instrument tracings) are to be retained
permanently or until authorized for disposal, in writing, by the EPA Project Leader.
(Deliverables from contractors and cooperators, including raw data, are  permanent as per EPA
Record Schedule 258.  EPA's project records are scheduled 501 and are also permanent.)

4.1.4  Data Review, Verification, and Validation Activities

       Raw data files  are created from entry of field  and analytical data, including data for
QA/QC samples and any data qualifiers noted on the field forms or analytical data package.
After initial entry, data are reviewed for entry errors by either a manual comparison of a printout
of the entered data against the original data form  or by automated  comparison of data entered
twice into separate files. Entry errors are corrected and reentered.  For  biological  samples,
species identifications are corrected for entry errors associated with incorrect or misspelled
codes.  Errors associated with misidentification of specimens are corrected after voucher
specimens have been confirmed and the results are available. Files corrected for entry errors
are considered to be raw data files.  Copies of all  raw data files are maintained in the centralized
IM system.

       The Tetra Tech facilitation team will work with Indicator Leads and the IM (primary data
recipients) to ensure that sufficient QC activities are  engaged in the various data management
processes. A copy of the raw data files is maintained in the central IM system, generally in

active files until completion of reporting and then in archive files. Redundant copies are
maintained of all data files and all files are periodically backed up.

       Some of the typical checks made in the processes of verification and validation are
described in Table 4-4.  Automated review procedures may be used. The primary purpose of
the initial checks is to confirm that a data value present in an electronic data file is accurate with
respect to the value that was initially recorded on a data form or obtained from an analytical
instrument.  In general, these activities focus on individual variables in the raw data file and may
include range checks for numeric variables, frequency tabulations of coded or alphanumeric
variables to identify erroneous codes or misspelled entries, and summations of variables
reported in terms of percent or percentiles.  In addition, associated QA information (e.g., sample
holding time) and QC sample data are reviewed to determine if they meet acceptance criteria.
Suspect values are assigned a data qualifier.  They will either be corrected, replaced with a new
acceptable value from sample reanalysis, or confirmed suspect after sample reanalysis. Any
suspect data will be flagged for data qualification.

Table 4-4. Data review, verification, and validation quality control activities
  Quality Control Activity                           Description and/or Requirements
  Review any qualifiers associated with variable     Determine if value is suspect or invalid;
                                                     assign validation qualifiers as appropriate
  Summarize and review replicate sample data         Identify replicate samples with large
                                                     variance; determine if analytical error or
                                                     visit-specific phenomenon is responsible
  Determine if MQOs and project DQOs have been       Determine potential impact on achieving
  achieved                                           research and/or program objectives
  Exploratory data analyses (univariate, bivariate,  Identify outlier values and determine if
  multivariate) utilizing all data                   analytical error or site-specific phenomenon
                                                     is responsible
  Confirm assumptions regarding specific types of    Determine potential impact on achieving
  statistical techniques being utilized in           research and/or program objectives
  development of metrics and indicators
       In the final stage of data verification and validation, exploratory data analysis techniques
may be used to identify extreme data points or statistical outliers in the data set.  Examples of
univariate analysis techniques include the generation and examination of box-and-whisker plots
and subsequent statistical tests of any outlying data points. Bivariate techniques include
calculation of Spearman correlation coefficients for all pairs of variables in the data set with
subsequent examination of bivariate plots of variables having high correlation coefficients.
Multivariate techniques have also been used in detecting extreme or outlying values in
environmental data sets (Meglen, 1985; Garner et al., 1991; Stapanian et al.,  1993).  A software
package, SCOUT, developed by EPA and based on the approach of Garner et al. (1991) may
be used for validation of multivariate data sets.
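
       The following Python sketch illustrates the kinds of univariate and bivariate screening
 described above (box-and-whisker style flagging based on the interquartile range, and a Spearman
 rank correlation). It is illustrative only and is not the SCOUT software; the data values are
 hypothetical, and the rank correlation shown does not handle tied values.

        import statistics

        def iqr_outliers(values, k=1.5):
            """Flag values outside the whisker fences (Q1 - k*IQR, Q3 + k*IQR)."""
            q1, _, q3 = statistics.quantiles(values, n=4)
            iqr = q3 - q1
            return [v for v in values if v < q1 - k * iqr or v > q3 + k * iqr]

        def spearman_rho(x, y):
            """Spearman rank correlation (ties not handled, for brevity)."""
            def ranks(v):
                order = sorted(range(len(v)), key=lambda i: v[i])
                r = [0] * len(v)
                for rank, i in enumerate(order, start=1):
                    r[i] = rank
                return r
            rx, ry = ranks(x), ranks(y)
            n = len(x)
            d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
            return 1 - 6 * d2 / (n * (n ** 2 - 1))

        total_p = [12, 15, 14, 13, 16, 240, 11, 18]          # ug P/L; 240 is suspect
        chl_a   = [2.1, 2.6, 2.4, 2.2, 2.8, 9.8, 2.0, 3.0]   # ug/L
        print(iqr_outliers(total_p))                          # [240]
        print(round(spearman_rho(total_p, chl_a), 2))         # 1.0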

       Suspect data are reviewed to determine the source of error,  if possible.  If the error is
correctable, the data set is edited to incorporate the correct data.  If the source of the error
cannot be determined, data  are qualified as questionable or invalid.  Data qualified as
questionable may be acceptable for certain types of data analyses and interpretation activities.
The decision to use questionable data must be made by the individual data users.  Data
qualified as invalid are considered to be unacceptable for use in any analysis or interpretation

 activities and will generally be removed from the data file and replaced with a missing value
 code and explanatory comment or flag code.  After completion of verification and validation
 activities, a final data file is created, with copies transmitted for archival and for uploading to the
 centralized IM system.

       Once verified and validated, data files are made available for use in various types of
 interpretation activities, each of which may require additional restructuring of the data files.
 These restructuring activities are collectively referred to as "data enhancement." In order to
 develop indicator metrics from one or more variables, data files may be restructured so as to
 provide a single record per lake.

 4.2    Data Transfer

       Field crews may transmit data electronically via modem  or floppy disc;  hardcopies of
 completed data and sample tracking forms may be transmitted to the IM staff at WED via
 portable facsimile (FAX) machine or via express courier service. Copies of raw, verified, and
 validated data files are  transferred from indicator leads to the IM staff for inclusion in the central
 IM system. All transfers of data are conducted using a means of transfer, file structure, and file
 format that has been approved by the IM staff. Data files that do not meet the required
 specifications will not be incorporated into the centralized data access and management
 system.

 4.3    Hardware and Software Control

       All automated data processing (ADP) equipment and software purchased for or used in
 Lakes  Survey surface waters research is subject to the requirements of the federal government,
 the particular Agency, and the individual facility making the purchase or maintaining the
 equipment and software.  All hardware purchased by EPA is identified with an EPA barcode tag
 label; an inventory is maintained by the responsible ADP personnel at the facility.  Inventories
 are also maintained of all software licenses; periodic checks are made of all software assigned
 to a particular PC.

       The development and organization of the IM system is compliant with guidelines and
 standards established by the EMAP Information Management Technical Coordination Group,
 the EPA Office of Technology, Operations, and Planning (OTOP), and the  EPA Office of
 Administrative Resources Management (OARM). Areas addressed  by these policies and
 guidelines include, but are not limited to, the following:

    •  Taxonomic Nomenclature and Coding
    •  Locational data
    •  Sampling unit identification and reference
    •  Hardware and software
    •  Data catalog documentation

       The Lakes Survey is committed to compliance with all applicable regulations and
 guidance concerning hardware and software procurement, maintenance, configuration control,
 and QA/QC.  As new guidance and requirements are issued, the Lakes Survey information
 management staff will assess the impact upon the IM system and develop  plans for ensuring
 timely compliance.

4.4    Data Security

       All data files in the IM system are protected from corruption by computer viruses,
unauthorized access, and hardware and software failures.  Guidance and policy documents of
EPA and management policies established by the IM Technical Coordination Group for data
access and data confidentiality are followed. Raw and verified data files are accessible only to
the Lakes Survey collaborators. Validated data files are accessible only to users specifically
authorized by the EPA Project Leader. Data files in the central repository used for access and
dissemination are marked as read-only to prevent corruption by inadvertent editing, additions, or
deletions.

       Data generated, processed, and incorporated into the IM system are routinely stored as
well as archived on redundant systems. This ensures that  if one system is destroyed or
incapacitated, IM staff will be able to  reconstruct the data bases. Procedures developed to
archive the data, monitor the process, and recover the data are described in IM documentation.

       Several backup copies of all data files and of the programs  used for processing the data
are maintained.  Backups of the entire system are maintained off-site. System backup
procedures are utilized. The central data base is backed up and archived according to
procedures already established for WED. All laboratories generating data and developing data
files must have established procedures for backing up and  archiving computerized data.

4.5    Data Archive

       All data will  be transferred to U.S. EPA's agency-wide WQX (Water Quality Exchange)
data management system for archival purposes.  WQX is a repository for water quality,
biological, and physical data and is used by state environmental agencies, EPA  and  other
federal agencies, universities, private citizens, and many others. Revised from STORET, WQX
provides a centralized system for storage of physical, chemical, and biological data and
associated analytical  tools for data analysis. Data from the Lakes Survey project in an Excel
format will be run through an Interface Module and uploaded to WQX. Once uploaded, states
and tribes will be able to download data (using Oracle software) from their region.
                                  5.0    INDICATORS

5.1     Water Chemistry Indicator

5.1.1   Introduction

       Trophic indicators based on lake water chemistry information attempt to evaluate lake
condition with respect to stressors such as acidic deposition and nutrients as well as other types
of physical or chemical contamination.  Data are collected for a variety of physical and chemical
constituents to provide information on the acid-base status of each lake, water clarity, primary
productivity, nutrient status, mass balance budgets of constituents, color, temperature regime,
and presence and extent of anaerobic conditions.
       There are two components to collecting water chemistry information:  collecting samples
of lake water for laboratory analysis, and field or in situ measurements of dissolved oxygen
(DO), pH and water temperature. At each site, crews fill one 4-L cubitainer using a depth-

 integrated sampler device.  All samples are stored in a cooler packed with resealable plastic
 bags filled with ice and shipped to the analytical laboratory within 24 hours of collection. In situ
 measurements are made using field meters and recorded on standardized data forms.  The
 primary function of the water chemistry information is to determine:

    •  Acid-base status
    •  Trophic state (nutrient enrichment)
    •  Chemical stressors
    •  Classification of water chemistry type

 5.1.2  Field Collection

        A single index site is located at the deepest point of the lake. At the index site, a single
 4-L composite sample is collected for laboratory analysis. In addition, a vertical profile  of in situ
 or field measurements (temperature, pH and DO) at various depths is conducted to provide a
 representation of the lake's condition with respect to stratification throughout the water  column.
 The response design for sampling locations is shown in Figure 5-1.

 5.1.3  Sampling and Analytical Methods

       Sample Collection: At the lake index site, a depth-integrated water chemistry sample is
 collected from the surface to a depth of 2 m using an integrated sampler device. The entire
 sample is combined into a single bulk water composite sample. Enough sample should be
 collected to fill a 4-L cubitainer.  Detailed procedures for sample collection and handling are
 described in the Field  Operations Manual.

       Field Measurements: At the lake index site, vertical profiles of temperature, pH and DO
 are measured at predetermined depth intervals. For shallow lakes (<3 m), DO, pH and
  temperature are measured at the surface and at 0.5-m intervals, until 0.5 m above the bottom.
  For lakes deeper than 3.0 m, DO, pH and temperature are measured at the surface and at
  every meter thereafter through 20 m (or until reaching 0.5 m above the bottom). After the
  measurement at 20 m, measurements are recorded every 2 m starting at 22 m (or until 0.5 m
  above the bottom).
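
       The depth schedule described above can be summarized with the following Python sketch
 (illustrative only; the lake depths shown are hypothetical):

        def profile_depths(lake_depth):
            """Measurement depths (m): 0.5-m intervals for lakes shallower than 3 m;
            otherwise 1-m intervals through 20 m and 2-m intervals below that,
            always stopping 0.5 m above the bottom."""
            limit = lake_depth - 0.5
            step = 0.5 if lake_depth < 3 else 1.0
            depths, d = [0.0], step
            while d <= limit:
                depths.append(d)
                d += 2.0 if (lake_depth >= 3 and d >= 20) else step
            return depths

        print(profile_depths(2.4))    # shallow lake: [0.0, 0.5, 1.0, 1.5]
        print(profile_depths(26.0))   # 0-20 m at 1-m intervals, then 22 m and 24 m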

       Analysis: A performance-based methods approach is being utilized for water chemistry
 analysis that defines a set of laboratory method performance requirements for data quality.
 Following this approach, participating laboratories may choose which analytical methods they
 will use for each target analyte as long as they are able to achieve the performance
 requirements as listed in Table 5-1.

 5.1.4  Quality Assurance Objectives

       Measurement  quality objectives (MQOs) are given in Table 5-1. General requirements
 for comparability and representativeness are addressed in Section 2. The MQOs given in  Table
 5-1 represent the  maximum allowable criteria for statistical control purposes.

       For duplicate samples, precision across batches is estimated as the pooled standard
 deviation (calculated as the root-mean square) of all samples at the lower concentration range,
 and as the pooled percent relative standard deviation of all samples at the higher concentration
 range.  For samples of known composition, precision is estimated as the standard  deviation of

repeated measurements across batches at the lower concentration range, and as percent
relative standard deviation of repeated measurements across batches at the higher
concentration range (see Section 2).  Bias (systematic error) is estimated as either net bias or
relative net bias (Section 2).  Net bias is estimated as the difference between the mean
measured value and the target value of a performance evaluation and/or internal reference
samples at the lower concentration range measured across sample batches, and relative bias
as the percent difference at the higher concentration range.  Precision and bias are monitored at
the point of measurement (field or analytical laboratory) by several types of QC samples
 described in Section 5.1.6, and from performance evaluation (PE) samples.
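
       As an illustrative sketch of the pooled precision estimates described above (the duplicate
 pairs shown are hypothetical, and each pair is assumed to contribute a two-measurement standard
 deviation):

        import math

        def pooled_sd(pairs):
            """Pooled standard deviation from duplicate pairs; for a duplicate pair the
            variance is (A - B)^2 / 2 (lower concentration range)."""
            return math.sqrt(sum((a - b) ** 2 / 2 for a, b in pairs) / len(pairs))

        def pooled_percent_rsd(pairs):
            """Pooled percent RSD: each pair's standard deviation relative to the pair
            mean, pooled as a root-mean square (higher concentration range)."""
            rsds = [math.sqrt((a - b) ** 2 / 2) / ((a + b) / 2) * 100 for a, b in pairs]
            return math.sqrt(sum(r ** 2 for r in rsds) / len(rsds))

        low_range_pairs  = [(21.0, 23.5), (18.2, 17.0), (25.1, 26.0)]     # e.g., ueq/L
        high_range_pairs = [(140.0, 151.0), (98.0, 95.0), (210.0, 204.0)]
        print(round(pooled_sd(low_range_pairs), 2))
        print(round(pooled_percent_rsd(high_range_pairs), 2))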
[Figure 5-1 is a diagram of the lake response design. It shows the index site at the deepest
point of the lake (chosen using a bathymetric map and/or sonar), where the depth-integrated and
in situ water chemistry, chlorophyll a, phytoplankton, zooplankton, sediment diatom, and algal
toxin samples are collected, and ten equidistant physical habitat and benthic sampling stations
(A-J) around the shoreline, with the starting point randomly selected a priori. Benthic samples
are collected from the dominant habitat within the littoral zone at each station.]

    Figure 5-1.  Sampling locations for Lakes Survey indicators.

Table 5-1 . Performance requirements for water chemistry and chlorophyll-a analytical
Analyte
Conductivity
Turbidity
PH
Acid Neutralizing
Capacity (ANC)
Total and Dissolved
Organic Carbon
(TOC/DOC)
Ammonia (NH3)
Nitrate-Nitrite (NO3-NO2)
Total Nitrogen (TN)
Total Phosphorus (TP)
Ortho-phosphate
Sulfate (SO4)
Chloride (Cl)
Nitrate (NO3)
Units
|xS/cmat25'C
NTU
pH units
neq/L
(20 ueq/L=1 mg
as CaCO3)
mg C/L
mg N/L
mg N/L
mg/L
ugP/L
ugP/L
mg SOVL
mg CI/L
mg N/L
Potential Range
of Samples
1 to 15,000
0 to 44,000
3.7 to 10
-300 to +75,000
(-16 to 3,750 mg
as CaCO3)
0.1 to 109 (as
DOC)
Oto17
0 to 360 (as
nitrate)
0.1 to 90
0 to 22,000

0 to 5,000
OtoS.OOO
0 to 360
Long-Term
MDL
Objective2
NA
1
NA
NA
0.10
0.01
(0.7 ueq/L)
0.01
0.01
2
2
0.25
(5 ueq/L)
0.10
(3 ueq/L)
0.01
(1 ueq/L)
Laboratory
Reporting Transition
Limit3 Value4
2.0 20
2.0 20
NA 5.75 and>8.25
NA ±50
0.20 < 1
> 1
0.02 0.10
(1 .4 ueq/L)
0.02 0.10
0.02 0.10
4 20
4 20
0.50 2.5
(10 ueq/L)
0.20 1
(6 ueq/L)
0.02 0.1
(4 ueq/L)
methods.
Precision
Objective5
±2 or ±10%
±2 or ±10%
± 0.08 or ±0.1 5
±5 or ±10%
±0.10 or ±10%
±0.01 or ±10%
±0.01 or ±10%
±0.01 or ±10%
±2 or ±10%
±2 or ±10%
±0.25 or ±10%
±0.10 or ±10%
±0.01 or ±10%

Bias
Objective6
± 2 or 5%
±2 or ±10%
± 0.05 or± 0.10
±5 or ±10%
±0.10 or ±10%
±0.01 or ±10%
±0.01 or ±10%
±0.01 or ±10%
±2 or ±10%
±2 or ±10%
±0.25 or ±10%
±0.10 or ±10%
±0.01 ±10%

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
     Revision No. 1
Date: August 2007
      Page 32 of 87

Table 5-1. (Continued).

Analyte | Units | Potential Range of Samples (1) | Long-Term MDL Objective (2) | Laboratory Reporting Limit (3) | Transition Value (4) | Precision Objective (5) | Bias Objective (6)
Calcium (Ca) | mg Ca/L | 0.04 to 5,000 | 0.05 (2.5 µeq/L) | 0.10 | 0.5 (5 µeq/L) | ±0.05 or ±10% | ±0.05 or ±10%
Magnesium (Mg) | mg Mg/L | 0.1 to 350 | 0.05 (4 µeq/L) | 0.10 | 0.5 (8 µeq/L) | ±0.05 or ±10% | ±0.05 or ±10%
Sodium (Na) | mg Na/L | 0.08 to 3,500 | 0.05 (2 µeq/L) | 0.10 | 0.5 (4 µeq/L) | ±0.05 or ±10% | ±0.05 or ±10%
Potassium (K) | mg K/L | 0.01 to 120 | 0.05 (1 µeq/L) | 0.10 | 0.5 (2 µeq/L) | ±0.05 or ±10% | ±0.05 or ±10%
Silica (SiO2) | mg SiO2/L | 0.01 to 100 | 0.05 | 0.10 | 0.5 | ±0.05 or ±10% | ±0.05 or ±10%
Total Suspended Solids (TSS) | mg/L | 0 to 27,000 | 1 | 2 | 10 | ±1 or ±10% | ±1 or ±10%
True Color | PCU | 0 to 350 | NA | 5 | 50 | ±5 or ±10% | ±5 or ±10%
Chlorophyll a | µg/L (in extract) | 0.7 to 11,000 | 1.5 | 3 | 15 | ±1.5 or ±10% | ±1.5 or ±10%
(1) Estimated from samples analyzed at the WED-Corvallis laboratory between 1999 and 2005 for TIME, EMAP-West, and WSA streams from across the U.S.
(2) The long-term method detection limit (LT-MDL) is determined as a one-sided 99% confidence interval from repeated measurements of a low-level standard across several calibration curves, based on USGS Open-File Report 99-193. These represent values that should be achievable by multiple labs analyzing samples over extended periods with comparable (but not necessarily identical) methods.
(3) The minimum reporting limit is the lowest value that needs to be quantified (as opposed to just detected) and represents the value of the lowest nonzero calibration standard used. It is set to 2x the long-term detection limit, following USGS Open-File Report 99-193, New Reporting Procedures Based on Long-Term Method Detection Levels and Some Considerations for Interpretations of Water-Quality Data Provided by the U.S. Geological Survey National Water Quality Laboratory.
(4) Value at which performance objectives for precision and bias switch from absolute (< transition value) to relative (> transition value). Two-tiered approach based on Hunt, D.T.E. and A.L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England.
(5) For duplicate samples, precision is estimated as the pooled standard deviation (calculated as the root-mean-square) of all samples at the lower concentration range, and as the pooled percent relative standard deviation of all samples at the higher concentration range. For standard samples, precision is estimated as the standard deviation of repeated measurements across batches at the lower concentration range, and as the percent relative standard deviation of repeated measurements across batches at the higher concentration range.
(6) Bias (systematic error) is estimated as the difference between the mean measured value and the target value of performance evaluation and/or internal reference samples at the lower concentration range, measured across sample batches, and as the percent difference at the higher concentration range.
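To make the two-tiered objectives in footnotes (4) through (6) concrete, the following Python sketch shows one way a reviewer might screen duplicate results against them. It is an illustration only: the function names are invented here, the duplicate values are hypothetical, and the thresholds used are the total phosphorus objectives from Table 5-1.

```python
import math

def pooled_sd(pairs):
    """Pooled standard deviation of duplicate pairs, computed as the
    root-mean-square of the per-pair standard deviations (n = 2 per pair)."""
    # For a duplicate pair (x1, x2), the sample SD is |x1 - x2| / sqrt(2).
    return math.sqrt(sum(((x1 - x2) ** 2) / 2 for x1, x2 in pairs) / len(pairs))

def pooled_percent_rsd(pairs):
    """Pooled percent relative standard deviation (root-mean-square of per-pair %RSDs)."""
    rsds = []
    for x1, x2 in pairs:
        mean = (x1 + x2) / 2
        sd = abs(x1 - x2) / math.sqrt(2)
        rsds.append(100.0 * sd / mean)
    return math.sqrt(sum(r ** 2 for r in rsds) / len(rsds))

def meets_precision_objective(pairs, transition, abs_obj, rel_obj_pct):
    """Two-tiered check: pairs whose mean is at or below the transition value
    are judged against the absolute objective; pairs above it are judged
    against the relative (%) objective."""
    low = [p for p in pairs if (p[0] + p[1]) / 2 <= transition]
    high = [p for p in pairs if (p[0] + p[1]) / 2 > transition]
    ok_low = (not low) or pooled_sd(low) <= abs_obj
    ok_high = (not high) or pooled_percent_rsd(high) <= rel_obj_pct
    return ok_low and ok_high

# Hypothetical total phosphorus duplicates (ug P/L); objectives from Table 5-1
# (transition value 20, precision objective +/-2 ug P/L or +/-10%).
tp_duplicates = [(5.0, 6.5), (18.0, 19.0), (150.0, 162.0)]
print(meets_precision_objective(tp_duplicates, transition=20, abs_obj=2, rel_obj_pct=10))
```

Bias against a performance evaluation or reference sample could be screened the same way, using the mean difference from the target value below the transition value and the percent difference above it.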

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 33 of 87

5.1.5   Quality Control Procedures: Field Operations

       For in situ measurements, each field instrument (e.g., multi-probe) must be calibrated,
inspected prior to use, and operated according to manufacturer specifications. Measurements
are taken from the surface downward, continuing until the probe is 0.5 m above the bottom or a
maximum depth of 50 m is reached. Figure 5-2 illustrates the general scheme for field
chemistry measurement procedures. If problems with any field instrument are encountered, the
user should consult the manufacturer's manual and/or call the manufacturer prior to sampling.
In addition to daily calibrations, the DO probe should periodically be checked against a Winkler
titration kit to ensure that it is properly calibrated. For pH and conductivity, the calibration of pH
electrodes and conductivity probes should be checked using an independent standard that is
similar in ionic strength and pH to the lake samples being measured (e.g., Peck and Metcalf
1991, Metcalf and Peck 1993). Specific quality control measures for field measurements are
listed in Table 5-2. Additionally, duplicate samples will be collected at 10% of lakes sampled.

       Throughout the water chemistry sample collection process it is important to take
precautions to avoid contaminating the sample. Many lakes in some regions have very low
ionic strength (i.e., very low levels of chemical constituents), and samples can be contaminated
quite easily by perspiration from hands, sneezing, smoking, suntan lotion, insect repellent,
fumes from gasoline engines, or chemicals used during sample collection.

5.1.6   Quality Control Procedures: Laboratory Operations

5.1.6.1 Sample Receipt and Processing

       QC activities associated with sample receipt and processing are presented in Table 5-3.
The communications center and information management staff are notified of sample receipt and
any associated problems as soon as possible after samples are received. The general
scheme for processing lake water chemistry samples for analysis is presented in Figure 5-3.
Several aliquots are prepared from bulk water samples and preserved accordingly. Ideally, all
analyses are completed within a few days after processing to allow for review of the results and
possible reanalysis of suspect samples within seven days. Critical holding times for the various
analyses are the maximum allowable holding times, based on current EPA and American Public
Health Association (APHA) requirements (American Public Health Association, 1989).
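As an illustration of how the holding-time review might be automated, the sketch below flags analyses run after their allowable holding time. The holding times shown are the subset given in Table 5-3 and Figure 5-3; the function and dictionary names are hypothetical and are not part of the Survey's information management system.

```python
from datetime import datetime, timedelta

# Holding times drawn from Table 5-3 / Figure 5-3 (illustrative subset).
HOLDING_TIMES = {
    "turbidity": timedelta(hours=72),
    "anc": timedelta(days=28),
    "conductivity": timedelta(days=28),
    "total_phosphorus": timedelta(days=28),
    "total_nitrogen": timedelta(days=28),
}

def holding_time_flags(collection_time, analysis_times):
    """Return the analytes whose analysis exceeded the allowable holding time.
    `analysis_times` maps analyte name -> datetime the analysis was run."""
    flags = []
    for analyte, run_at in analysis_times.items():
        limit = HOLDING_TIMES.get(analyte)
        if limit is not None and run_at - collection_time > limit:
            flags.append(analyte)
    return flags

collected = datetime(2007, 7, 10, 9, 0)
analyzed = {"turbidity": datetime(2007, 7, 14, 9, 0),            # 4 days -> flagged
            "total_phosphorus": datetime(2007, 7, 30, 9, 0)}     # 20 days -> acceptable
print(holding_time_flags(collected, analyzed))  # ['turbidity']
```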

5.1.6.2 Analysis of Samples

       QC protocols are an integral part of all analytical procedures to ensure that the results
are reliable and that the analytical stage of the measurement system is maintained in a state of
statistical control. Information regarding QC sample requirements and corrective actions is
summarized in Table 5-4. Figure 5-4 illustrates the general scheme for analysis of a batch of
water chemistry samples, including associated QC samples.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                                      Revision No. 1
                                                                  Date: August 2007
                                                                      Page 34 of 87
[Figure 5-2 is a flow chart of the field measurement process for the water chemistry indicator: a pre-departure check (probe inspection, electronic checks, and test calibration, with the probe and/or instrument replaced if necessary), field calibration (QC sample measurement and performance evaluation measurement), conducting measurements and recording data (QC sample measurement and duplicate measurement), and review of the data form (qualifying data and correcting errors) before the form is accepted for data entry.]
      Figure 5-2. Field measurement activities for the water chemistry indicator.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                                    Revision No. 1
                                                                Date: August 2007
                                                                    Page 35 of 87
Table 5-2. Field quality control: water chemistry indicator

Check Description | Frequency | Acceptance Criteria | Corrective Actions
Check calibration of instrument | Prior to sampling each day | Specific to each instrument | Adjust and recalibrate, redeploy gear
Verify performance of temperature probe using wet ice | Prior to initial sampling, daily thereafter | Functionality = ±2°C | See manufacturer's directions
Check DO calibration against Winkler titration | Weekly | ±1.0 mg/L | Adjust and recalibrate
Check calibration of pH and conductivity with an independent standard solution | Weekly | pH: >5.75 and <8.25: ±0.15; <5.75 or >8.25: ±0.08. Conductivity: ±2 µS/cm or ±10% | Recalibrate or repair/replace electrode or probe
Table 5-3. Sample processing quality control activities: water chemistry indicator

Quality Control Activity | Description and Requirements | Corrective Action
Sample Storage | Store samples in darkness at 4°C. Monitor temperature daily. | Qualify sample as suspect for all analyses
Holding time | Complete processing of bulk samples within 48 hours of collection if possible, or ASAP after receipt. | Qualify samples
Aliquot Containers and Preparation | HDPE bottles. Rinse bottles and soak for 48 h with ASTM Type II reagent water; test water for conductivity. Prepare bottles to receive acid as preservative by filling with a 10% HCl solution and allowing to stand overnight. Rinse six times by filling with deionized water. Determine the conductivity of the final rinse of every tenth bottle. Conductivity must be < 2 µS/cm. | Repeat the deionized water rinsing procedure on all bottles cleaned since the last acceptable check. Check conductivity of final rinse on every fifth bottle.
Filtration | 0.4 µm polycarbonate filters required for all dissolved analytes. Rinse filters and filter chamber twice with 50-mL portions of deionized water, followed by a 20-mL portion of sample. Repeat for each filter used on a single sample. Rinse aliquot bottles with two 25- to 50-mL portions of filtered sample before use. |

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                                      Revision No. 1
                                                                   Date: August 2007
                                                                       Page 36 of 87
Table 5-3. Continued.

Quality Control Activity | Description and Requirements | Corrective Action
Preservation | Use ultrapure acids for preservation. Add sufficient acid to adjust to pH < 2. Check pH with indicator paper. Record volume of preservative on container label. Store preserved aliquots in darkness at 4°C until analysis. |
Holding Times for preserved aliquots | Holding times range from 3 days to 6 months, based upon current APHA criteria. | Sample results are qualified as exceeding the specified holding time.

NOTE: Does not include analysis for ortho-phosphate. Also, we may end up with a glass bottle for TOC and DOC analyses. Assumes everyone will be measuring pH in the field.
[Figure 5-3 is a flow chart of sample processing for the 4-L bulk water chemistry samples. Upon receipt, samples are inspected, the tracking form is completed, and samples are stored at 4°C in darkness and processed within 48 hours. Aliquots are prepared either by filtration (0.4 µm) or directly from unfiltered sample, in HDPE bottles that are acid washed and preserved with HNO3 or H2SO4, or not acid washed with no preservative, depending on the analysis. Analyses include total phosphorus, total nitrogen, and total organic carbon (28-day holding time), turbidity (72-hour holding time), and ANC and conductivity (28-day holding time).]

  No analysis for orthophosphate because of the 24-hour holding time.

  Figure 5-3. Sample processing activities for water chemistry samples.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                          Revision No. 1
                                                      Date: August 2007
                                                           Page 37 of 87
Table 5-4. Laboratory quality control samples: water chemistry indicator

QC Sample Type (Analytes), and Description | Frequency | Acceptance Criteria | Corrective Action
Laboratory/Reagent Blank: (For all analyses except total suspended solids [TSS]. For TSS, the lab will filter a known volume of reagent water and process the filters per method.) | Once per day prior to sample analysis | Control limits < LRL | Prepare and analyze new blank. Determine and correct problem (e.g., reagent contamination, instrument calibration, or contamination introduced during filtration) before proceeding with any sample analyses. Reestablish statistical control by analyzing three blank samples.
Filtration Blank: (All dissolved analytes; ASTM Type II reagent water processed through filtration unit.) | Prepare once per week and archive. Prepare a filter blank for each box of 100 filters, and examine the results before any other filters are used from that box. | Measured concentrations < LRL |

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                           Revision No. 1
                                                       Date: August 2007
                                                           Page 38 of 87
Table 5-4. (Continued).

QC Sample Type (Analytes), and Description | Frequency | Acceptance Criteria | Corrective Action
Laboratory Duplicate Sample: (All analyses) | One per batch | Control limits < precision objective | If results are below LRL, prepare and analyze a split from a different sample (volume permitting). Review precision of QCCS measurements for batch. Check preparation of split sample. Qualify all samples in batch for possible reanalysis.
Standard Reference Material: (When available for a particular analyte) | One analysis in a minimum of five separate batches | Manufacturer's certified range | Analyze standard in next batch to confirm suspected imprecision or bias. Evaluate calibration and QCCS solutions and standards for contamination and preparation error. Correct before any further analyses of routine samples are conducted. Reestablish control by three successive acceptable reference standard measurements. Qualify all sample batches analyzed since the last acceptable reference standard measurement for possible reanalysis.
Matrix spike samples: (Only prepared when samples with potential for matrix interferences are encountered) | One per batch | Control limits for recovery cannot exceed 100±20% | Select two additional samples and prepare fortified subsamples. Reanalyze all suspected samples in batch by the method of standard additions. Prepare three subsamples (unfortified, fortified with solution approximately equal to the endogenous concentration, and fortified with solution approximately twice the endogenous concentration).
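For illustration, the short sketch below applies the matrix-spike acceptance criterion from Table 5-4 (recovery within 100 ± 20%); the example concentrations and function names are hypothetical.

```python
def percent_recovery(unspiked, spiked, spike_added):
    """Percent recovery of a matrix spike: the measured increase over the
    unspiked sample, expressed as a percentage of the amount added."""
    return 100.0 * (spiked - unspiked) / spike_added

def spike_acceptable(unspiked, spiked, spike_added, limit_pct=20.0):
    """True if recovery falls within 100 +/- limit_pct percent (Table 5-4)."""
    return abs(percent_recovery(unspiked, spiked, spike_added) - 100.0) <= limit_pct

# Hypothetical ammonia results in mg N/L: 0.10 endogenous, spiked with 0.10.
print(percent_recovery(0.10, 0.21, 0.10))   # 110.0
print(spike_acceptable(0.10, 0.21, 0.10))   # True
```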

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                   Revision No. 1
                                               Date: August 2007
                                                    Page 39 of 87
[Figure 5-4 is a flow chart of the analysis of a batch of water chemistry samples. QC samples (laboratory blank, fortified sample, laboratory split sample, QC check samples [QCCS], and an internal reference sample) are prepared and inserted randomly into the sample batch. Calibration QCCS results are checked: if they fail, the LT-MDL QCCS is rechecked and, where contamination or a biased calibration is indicated, the instrument is re-calibrated and previous samples are re-analyzed. Batches that pass are accepted for data entry and verification; otherwise the batch is qualified for possible re-analysis.]
Figure 5-4. Analysis activities for water chemistry samples.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 40 of 87
5.1.7   Data Reporting, Review, and Management

       Checks made of the data in the process of review and verification are summarized in
Table 5-5.  Data reporting units and significant figures are given in Table 5-6. The Indicator
Lead is ultimately responsible for ensuring the validity of the data, although performance of the
specific checks may be delegated  to other staff members.
Table 5-5. Data validation quality control: water chemistry indicator

Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid.
Review holding times | Qualify value for additional review.
Ion balance: Calculate percent ion balance difference (%IBD) using data from cations, anions, pH, and ANC. | If total ionic strength < 100 µeq/L, %IBD < ±25%. If total ionic strength > 100 µeq/L, %IBD < ...
Conductivity check: Compare measured conductivity of each sample to a calculated conductivity based on the equivalent conductances of major ions in solution (Hillman et al., 1987). | If measured conductivity < 25 µS/cm, ([measured - calculated] ÷ measured) < ±25%. If measured conductivity > 25 µS/cm, ([measured - calculated] ÷ measured) < ±15%. Determine which analytes, if any, are the largest contributors to the difference between calculated and measured conductivity. Review suspect analytes for analytical error and reanalyze. If analytical error is not indicated, qualify the sample to attribute the conductivity difference to unmeasured ions. Reanalysis is not required.
Review data from QA samples (laboratory PE samples, and interlaboratory comparison samples) | Determine impact and possible limitations on overall usability of data.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
                                                       Revision No. 1
                                                    Date: August 2007
                                                        Page 41 of 87
Table 5-6. Data reporting criteria: water chemistry indicator

Measurement | Units | No. Significant Figures | Maximum No. Decimal Places
Dissolved Oxygen | mg/L | 2 | 1
Temperature | °C | 2 | 1
pH | pH units | 3 | 2
Carbon, total & dissolved organic | mg/L | 3 | 1
Acid neutralizing capacity | µeq/L | 3 | 1
Conductivity | µS/cm at 25 °C | 3 | 1
Calcium, magnesium, sodium, potassium, ammonium, chloride, nitrate, and sulfate | µeq/L | 3 | 1
Silica | mg/L | 3 | 2
Total phosphorus | µg/L | 3 | 0
Total nitrogen | mg/L | 3 | 2
Nitrate-Nitrite | mg/L | 3 | 2
Ammonia | mg/L | 3 | 2
Turbidity | NTU | 3 | 0
True color | PCU | 2 | 0
Total suspended solids | mg/L | 3 | 1
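The reporting rules in Table 5-6 combine a significant-figure limit with a maximum number of decimal places. The sketch below shows one plausible way to apply both constraints to a reported value; it is illustrative only and is not part of the Survey's information management system.

```python
from math import floor, log10

def report_value(value, sig_figs, max_decimals):
    """Round to the given number of significant figures, then enforce the
    maximum number of decimal places allowed (Table 5-6)."""
    if value == 0:
        return round(0.0, max_decimals)
    # Decimal places needed to keep the requested significant figures.
    decimals_for_sig = sig_figs - int(floor(log10(abs(value)))) - 1
    rounded = round(value, decimals_for_sig)
    # Apply the decimal-place ceiling on top of the significant-figure rounding.
    return round(rounded, min(decimals_for_sig, max_decimals))

# Total phosphorus (ug/L): 3 significant figures, 0 decimal places.
print(report_value(123.456, 3, 0))   # 123.0 -> reported as 123
# pH: 3 significant figures, 2 decimal places.
print(report_value(7.0567, 3, 2))    # 7.06
```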
       The ion balance for each sample is computed using the results for major cations, anions,
and the measured acid neutralizing capacity. The percent ion balance difference (%IBD) for a
sample is calculated as:

Equation 11

   %IBD = 100 x [(Σ cations - Σ anions) - ANC] / [ANC + Σ anions + Σ cations + 2(H+)]

where ANC is the acid neutralizing capacity; cations are the concentrations of calcium,
magnesium, sodium, potassium, and ammonium (converted from mg/L to µeq/L); anions are
chloride, nitrate, and sulfate (converted from mg/L to µeq/L); and H+ is the hydrogen ion
concentration calculated from the antilog of the sample pH. Factors to convert major ions from
mg/L to µeq/L are presented in Table 5-7. For the conductivity check, equivalent conductivities
for major ions are presented in Table 5-8.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 42 of 87
Table 5-7. Constants for converting major ion concentrations from mg/L to µeq/L

Analyte | Conversion from mg/L to µeq/L (a)
Calcium | 49.9
Magnesium | 82.3
Potassium | 25.6
Sodium | 43.5
Ammonia | 55.4
Chloride | 28.2
Nitrate | 16.1
Sulfate | 20.8

(a) Measured values are multiplied by the conversion factor.
Table 5-8. Factors to calculate equivalent conductivities of major ions (a)

Ion | Equivalent Conductance per mg/L (µS/cm at 25 °C)
Calcium | 2.60
Magnesium | 3.82
Potassium | 1.84
Sodium | 2.13
Ammonia | 4.13
Chloride | 2.14
Nitrate | 1.15
Sulfate | 1.54
Hydrogen (b) | 3.5 x 10^5
Hydroxide (b) | 1.92 x 10^5
Bicarbonate | 0.715
Carbonate | 2.82

(a) From Hillman et al. (1987).
(b) Specific conductance per mole/L, rather than per mg/L.
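To make the ion balance and conductivity checks concrete, the Python sketch below applies Equation 11 and the conductivity comparison from Table 5-5, using the conversion factors in Table 5-7 and the equivalent conductances in Table 5-8. It is a simplified illustration: the example concentrations are hypothetical, the hydrogen-ion term is derived from pH as shown, and the hydroxide, bicarbonate, and carbonate contributions are omitted.

```python
# Conversion factors (Table 5-7): mg/L -> ueq/L
TO_UEQ = {"Ca": 49.9, "Mg": 82.3, "K": 25.6, "Na": 43.5,
          "NH4": 55.4, "Cl": 28.2, "NO3": 16.1, "SO4": 20.8}

# Equivalent conductance per mg/L (Table 5-8), uS/cm at 25 C.
COND_PER_MG = {"Ca": 2.60, "Mg": 3.82, "K": 1.84, "Na": 2.13,
               "NH4": 4.13, "Cl": 2.14, "NO3": 1.15, "SO4": 1.54}
H_COND_PER_MOLE = 3.5e5  # hydrogen ion, per mole/L (Table 5-8, note b)

CATIONS = ("Ca", "Mg", "K", "Na", "NH4")
ANIONS = ("Cl", "NO3", "SO4")

def percent_ibd(conc_mg_l, anc_ueq_l, ph):
    """Percent ion balance difference (Equation 11).
    conc_mg_l: dict of major ion concentrations in mg/L."""
    cations = sum(conc_mg_l.get(i, 0.0) * TO_UEQ[i] for i in CATIONS)
    anions = sum(conc_mg_l.get(i, 0.0) * TO_UEQ[i] for i in ANIONS)
    h_ueq = 10 ** (-ph) * 1e6  # hydrogen ion in ueq/L from the antilog of pH
    return 100.0 * ((cations - anions) - anc_ueq_l) / (anc_ueq_l + anions + cations + 2 * h_ueq)

def conductivity_difference(conc_mg_l, ph, measured_us_cm):
    """Percent difference between measured and calculated conductivity."""
    calc = sum(conc_mg_l.get(i, 0.0) * f for i, f in COND_PER_MG.items())
    calc += 10 ** (-ph) * H_COND_PER_MOLE  # hydrogen ion contribution
    return 100.0 * (measured_us_cm - calc) / measured_us_cm

# Hypothetical low-ionic-strength lake sample (mg/L), ANC in ueq/L.
sample = {"Ca": 2.0, "Mg": 0.5, "Na": 1.5, "K": 0.4,
          "NH4": 0.02, "Cl": 1.8, "NO3": 0.05, "SO4": 2.5}
print(round(percent_ibd(sample, anc_ueq_l=120.0, ph=6.8), 1))
print(round(conductivity_difference(sample, ph=6.8, measured_us_cm=25.0), 1))
```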
5.2    Chlorophyll-a Indicator

5.2.1   Introduction

       Trophic indicators based on algal community information attempt to evaluate lake
condition with respect to stressors such as nutrient loading.  Data are collected for chlorophyll-a
to provide information on the algal loading and gross biomass of blue-greens and other algae
within each lake.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 43 of 87

5.2.2  Sampling Design

       At the index site located at the deepest point of the lake, a single depth-integrated water
sample is collected from the euphotic zone to provide a representation of the lake's trophic
condition with respect to its algal loads.  The response design for sampling locations is shown in
Figure 5-1.

5.2.3  Sampling and Analytical Methods

       Sample Collection: At the lake index site, collect a 2-L depth-integrated water sample
from the surface within the photic zone (determined for each lake by multiplying the Secchi
depth by 2, with a maximum depth of 2 m) using an integrated sampler device. The sample
should be  preserved immediately on ice and placed in a cooler away from direct light.  After
returning to shore, the sample is filtered in subdued light to minimize degradation.  The filter is
then stored in a centrifuge tube on ice before being shipped to the laboratory for chlorophyll-a
analysis. Detailed procedures for sample collection and processing are described in the Field
Operations Manual.

       Analysis: A performance-based methods approach is being utilized for chlorophyll-a
analysis that defines a set of laboratory method performance requirements for data quality.
Following this approach, participating laboratories may choose which analytical method they will
use to determine chlorophyll-a concentration as long as they are able to  achieve the
performance requirements as listed in Table 5-1.

5.2.4  Quality Assurance Objectives

       MQOs are given in Table 5-1. General requirements for comparability and
representativeness are addressed in Section 2. The MQOs given in Table 5-1 represent the
maximum  allowable criteria for statistical control purposes.  LT-MDLs are monitored over time
by repeated measurements  of low level standards and calculated using Equation 1a.

       For precision, the objectives presented in Table  5-1 represent the 99 percent confidence
intervals about a single measurement and are thus based on the standard deviation of a set of
repeated measurements (n > 1).  Precision objectives at lower concentrations are equivalent to
the corresponding LRL.  At higher concentrations, the precision objective is expressed in
relative terms, with the 99 percent confidence interval based on the relative standard deviation
(Section 2).  Objectives for accuracy are equal to the corresponding precision objective, and are
based on the mean value of repeated measurements. Accuracy is generally estimated as net
bias or relative net bias (Section 2). Precision and bias are monitored at the point of
measurement (field or analytical laboratory) by several types of QC samples described in
Section 5.1.6, where applicable, and from performance evaluation (PE) samples.
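Equation 1a is not reproduced here; as an assumption based on the USGS Open-File Report 99-193 approach cited in the notes to Table 5-1, the sketch below estimates an LT-MDL as the one-sided 99% confidence bound on repeated low-level standard results. The abridged t-table, example data, and function name are illustrative only.

```python
from statistics import stdev

# One-sided 99% Student's t values for selected degrees of freedom.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821, 10: 2.764, 19: 2.539, 23: 2.500}

def lt_mdl(low_level_results):
    """Long-term method detection limit estimated as the one-sided 99%
    confidence bound on repeated low-level standard measurements:
    LT-MDL = t(0.99, n-1) * s."""
    n = len(low_level_results)
    t = T_99[n - 1]  # raises KeyError if the table needs another df value
    return t * stdev(low_level_results)

# Hypothetical chlorophyll-a low-level standard results (ug/L) across batches.
results = [0.9, 1.2, 0.8, 1.1, 1.0, 1.3, 0.7, 1.1]
print(round(lt_mdl(results), 2))
```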

5.2.5  Quality Control Procedures: Field Operations

       Chlorophyll can degrade rapidly when exposed to bright light. It is important to keep the
sample on ice and in a dark place (cooler) until it can be filtered.  If possible, prepare the sample

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 44 of 87

in subdued light (or shade) by filtering as quickly as possible to minimize degradation.  If the
sample filter clogs and the entire sample in the filter chamber cannot be filtered, discard the filter
and prepare a new sample, using a smaller volume.

       Check the label to ensure that all written information is complete and legible. Place a
strip of clear packing tape over the label and bar code, covering the label completely. Record
the bar code assigned to the chlorophyll-a sample on the Sample Collection Form (Figure 5-5).
Also record the volume of sample filtered on the Sample Collection Form.  Verify that the
volume recorded on the label matches the volume recorded on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any
problems in collecting the sample or if conditions occur that may affect sample integrity.  Store
the filter sample in a 50-mL centrifuge tube (or other suitable container) wrapped in aluminum
foil and freeze using dry ice or a portable freezer. Recheck all forms and labels for
completeness and legibility. Additionally, duplicate (replicate) samples will be collected at 10%
of lakes sampled.

5.2.6   Quality Control Procedures: Laboratory Operations

5.2.6.1 Sample Receipt and Processing

       QC activities associated with sample receipt and processing are presented in Table 5-9.
The communications center and information management staff are notified of sample receipt
and any associated problems as soon as possible after samples are received.

5.2.6.2 Analysis of Samples

       QC protocols are an integral part of all analytical procedures to ensure that the results
are reliable and the analytical stage of the measurement system is maintained in a state of
statistical control.  Most of the QC procedures described here are detailed in the references for
specific methods.  However, modifications to the procedures and acceptance criteria described
in this QAPP supersede those presented in the methods references.  Information regarding QC
sample requirements, where applicable, and corrective actions are summarized in Table 5-5.

5.2.7   Data Reporting, Review, and Management

       Checks made of the data in the process of review, verification, and validation are
summarized in Table 5-10. Data reporting units and significant figures are given  in Table 5-11.
The Indicator Lead is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.  Once data have
passed all acceptance requirements, computerized data files are prepared in a format specified
for the Lakes Survey project.  The electronic data files are transferred to the Lakes Survey IM
Coordinator at WED-Corvallis for entry into a centralized data base.  A hard copy output of all
files will also be sent to the Lakes Survey IM Coordinator.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
    Revision No. 1
Date: August 2007
     Page 45 of 87
[Figure 5-5 reproduces the Sample Collection Form - Lakes. The form records the lake name, lake ID, site ID, team ID, visit number, and date of collection, and has sections for: Secchi disk transparency (depth at which the disk disappears and reappears, with a check box for clear to bottom, and comments); water chemistry (4-L cubitainer and 4 syringes; sample ID barcode, sample type, depth collected, flag, comments); chlorophyll (target volume = 500 mL; sample ID, sample type, depth collected, sample volume, flag, comments); zooplankton (coarse and fine mesh sizes; sample ID, sample type, length of tow, number of containers preserved, flag, comments; fill to mark on bottle, approximately 80 mL); and sediment core samples (target core length = 35 to 40 cm; collected at the index site or another location, with direction and distance from the index site recorded if other; top and bottom sample classes, sample ID, sample type, core length, interval, flag, comments). Flag codes: K = no measurement or sample collected; U = suspect measurement or sample; F1, F2, etc. = miscellaneous flags assigned by each field crew, with all flags explained in the comments section. The form is initialed by the reviewer.]
                 Figure 5-5. Sample collection form

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 46 of 87
Table 5-9. Sample processing quality control: chlorophyll-a indicator

Quality Control Activity | Description and Requirements | Corrective Action
Filtration (done in field) | Whatman GF/F (or equivalent) glass fiber filter. Filtration pressure should not exceed 7 psi to avoid rupture of fragile algal cells. | Discard and refilter
Sample Storage | Store samples in darkness and frozen (-20 °C). Monitor temperature daily. | Qualify sample as suspect for all analyses
Table 5-10. Data validation quality control: chlorophyll-a indicator

Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid
Review data from QA samples (e.g., laboratory PE samples or other standards or replicates) | Determine impact and possible limitations on overall usability of data
Table 5-11. Data reporting criteria: chlorophyll-a indicator

Measurement | Units | No. Significant Figures | Maximum No. Decimal Places
Chlorophyll-a | µg/L | 2 | 1
5.3    Sediment Diatom Indicator

5.3.1   Introduction

       Ecological indicators based on sediment diatoms provide an indication of both current
and historical lake condition with respect to stressors such as nutrients and sediment loadings.
The diatom indicator is unique in that it can potentially provide insight to the "original" or pristine
condition of the lake. Diatoms are collected from  bottom sediments to provide information on
temporal and spatial trends in eutrophication and  to provide a historical perspective for
comparisons.

5.3.2   Sampling Design

       At the index site located  at the deepest point of the lake, a single core is collected from
the bottom  by lowering a core sampler into the sediment. The collection goals for the diatom
sample are to obtain a sample of undisturbed surface sediments, and to  obtain a deeper sample

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 47 of 87
(representing past conditions) that is uncontaminated with the shallower sediments. The
response design for sampling locations is shown in Figure 5-1.

5.3.3   Sampling and Analytical Methods

       Sample Collection: At the lake index site, a single sediment core is collected by lowering
a core sampler into the bottom sediments. The target length for a core sample is 35-45 cm. If
the target length cannot be obtained after two consecutive attempts, the maximum obtainable
core should be used. When sampling natural lakes, one sectioned sample is collected from the
top 1 cm of the core and another from the bottom 1 cm. When sampling reservoirs, only the top
1 cm of the core is collected. Each sample is placed in a separate sealable container with a
label indicating the depth of the sample and is kept moist with a wet paper towel to prevent
desiccation. Detailed procedures for sample collection and handling are described in the Field
Operations Manual.

       Analysis: Sediment samples are cleaned of organic matter with strong oxidizing agents,
and slides are prepared. The analysis consists of identifying and counting 600 individual cells.
Detailed procedures for sample processing and enumeration are described in the Laboratory
Methods Manual. Table 5-12 summarizes field and analytical methods for the sediment diatom
indicator.
Table 5-12. Field and laboratory methods: sediment diatom indicator

Variable or Measurement | QA Class | Expected Range and/or Units | Summary of Method | References
Sample Collection | C | NA | Core sampler used to collect a 35-45 cm core of sediments | Glew et al. 2001; Lakes Survey Field Operations Manual 2006
Sample Digestion and Concentration | N | NA | Add acid and heat at 200°C for 2 hrs. Allow to settle, siphon off supernatant, repeat until final volume is between 25-50 mL | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Slide preparation | N | NA | Prepare coverslips and mount on slide using Naphrax | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Enumeration | C | 0 to 600 organisms | Random systematic selection of rows and fields with target of 600 organisms from sample | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Identification | C | genus | Specified keys and references |

C = critical, N = non-critical quality assurance classification.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 48 of 87

5.3.4  Quality Assurance Objectives

       MQOs are given in Table 5-13.  General requirements for comparability and
representativeness are addressed in Section 2. Precision is calculated as percent efficiency,
estimated from independent identifications of organisms in randomly selected samples.  The
MQO for accuracy is evaluated by having individual specimens representative of selected taxa
identified by recognized experts.

Table 5-13. Measurement quality objectives: sediment diatom indicator

Variable or Measurement | Precision | Accuracy | Completeness
Enumeration | 85% | 90% (a) | 99%
Identification | 85% | 90% (a) | 99%

(a) Taxonomic accuracy, as calculated using Equation 8 in Section 2.
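As a rough illustration of how two independent identifications of the same sample might be compared against the percent similarity criteria used in Table 5-15, the sketch below computes a percent community similarity from two taxonomists' counts. The formula choice and the example counts are assumptions for illustration, not the Survey's prescribed calculation.

```python
def percent_similarity(counts_a, counts_b):
    """Percent community similarity between two independent counts of the
    same sample: the sum, over all taxa, of the smaller of the two relative
    abundances (expressed as percentages)."""
    total_a = sum(counts_a.values())
    total_b = sum(counts_b.values())
    taxa = set(counts_a) | set(counts_b)
    return sum(min(100.0 * counts_a.get(t, 0) / total_a,
                   100.0 * counts_b.get(t, 0) / total_b) for t in taxa)

# Hypothetical diatom counts (per 600-valve count) by two taxonomists.
taxonomist_1 = {"Achnanthidium": 210, "Aulacoseira": 150, "Cyclotella": 140, "Fragilaria": 100}
taxonomist_2 = {"Achnanthidium": 225, "Aulacoseira": 140, "Cyclotella": 145, "Fragilaria": 90}

ps = percent_similarity(taxonomist_1, taxonomist_2)
print(round(ps, 1), "acceptable" if ps >= 85 else "re-identify")
```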


5.3.5  Quality Control Procedures: Field Operations

       Any contamination of the samples can produce significant errors in the resulting
interpretation. Great care must be taken by the samplers not to contaminate the bottom sample
with material from higher levels of the core, with lake water, or with the tools used to collect the
sample (i.e., the corer, core tube, and spatulas), and not to mix the surface layer with the deeper
sediments. Prior to sampling, the corer device and collection tools should be examined to
ensure that they are clean and free of contaminants from previous sampling activities. After the
first (top) core section is removed, the sectioning apparatus should be removed and rinsed in DI
water. This procedure prevents contamination of the bottom sediment layer with diatoms from
the upper portion of the core.

       After each sample is sectioned and placed in a separate container, the labels should be
checked to ensure that the depth of each core is recorded and all written information is
complete and legible, and that the label has been completely covered with clear packing tape.  It
should be verified that the bar code assigned to the sediment diatom sample is recorded
correctly on the Sample Collection Form (Figure 5-5). A flag code should be recorded and
comments provided  on the Sample Collection Form to denote any problems encountered in
collecting the sample or the  presence of any conditions that may affect sample integrity.

       Additionally, duplicate (replicate) samples will be collected at 10% of lakes sampled.

5.3.6  Quality Control Procedures: Laboratory Operations

       Specific quality control measures are listed in Table 5-14 for laboratory operations.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 49 of 87

Table 5-14. Sample processing quality control: sediment diatom indicator

Quality Control Activity | Description and Requirements | Corrective Action
Sample Storage | Store samples in darkness at 4°C. Monitor temperature daily. | Qualify sample as suspect for all analyses
5.3.7  Data Reporting, Review, and Management

Checks made of the data in the process of review, verification, and validation are summarized in
Table 5-15.  The Indicator Lead is ultimately responsible for ensuring the validity of the data,
although performance of the specific checks may be delegated to other staff members. Once
data have passed all acceptance requirements, computerized data files are prepared in a format
specified for the Lakes Survey project.  The electronic data files are transferred to the  Lakes
Survey IM Coordinator at WED-Corvallis for entry into a centralized data  base.  A  hard copy
output of all files will also be sent to the Lakes Survey IM Coordinator.

       Sample residuals, vials, and slides are archived by each laboratory until the EPA Project
Leader has authorized, in writing, the disposition of samples. All raw data (including field data
forms and bench data recording sheets) are retained in an organized fashion by the Indicator
Team permanently or until written authorization for disposition has been received from the EPA
Project Leader.

5.4    Physical Habitat Quality Indicator

5.4.1   Introduction

       The physical habitat shoreline and littoral surveys that the Lakes Survey field teams
conduct serve three purposes. First, this habitat information is absolutely essential to the
interpretation of what lake biological assemblages "should" be like in the  absence of many types
of anthropogenic impacts. Second, the habitat evaluation is a reproducible, quantified estimate
of habitat condition, serving as a benchmark against which to compare future habitat changes
that might result from anthropogenic activities. Third, the specific selections of habitat
information collected aid in the diagnosis of probable causes of ecological impairment in lakes.

       In addition to information collected in the field by the shoreline and littoral surveys, the
physical habitat description  of each lake includes many map-derived variables such as lake
surface area, shoreline length, and shoreline complexity.  Furthermore, an array of information,
including watershed topography and land use, supplements the physical  habitat information.
The shoreline and littoral surveys concentrate on information best derived "on the ground." As
such, these survey results provide the all-important linkage between large watershed-scale
influences and those forces that directly affect aquatic organisms day to day.  Together with
water chemistry, the habitat measurements and observations describe the variety of physical
and chemical conditions that are necessary to support biological diversity and foster long-term

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 50 of 87
ecosystem stability. These characteristics of lakes and their shorelines are the very aspects that
are often changed as a result of anthropogenic activities.

Table 5-15. Laboratory Quality Control: sediment diatom indicator

Check or Sample Description | Frequency | Acceptance Criteria | Corrective Action

SAMPLE PROCESSING
Re-process subsamples (replicate analysis) | 10% of all samples completed per laboratory | Percent similarity >70% | If <70%, re-analyze all samples since last acceptable replicate

IDENTIFICATION
Duplicate identification by different taxonomist within lab | 10% of all samples completed per laboratory | Percent similarity >85% | If <85%, determine reason and, if cause is systemic, re-identify all samples completed by that taxonomist
Independent identification by outside taxonomist | All uncertain taxa | Uncertain identifications to be confirmed by expert in particular taxa | Record both tentative and independent IDs
Use standard taxonomic references | For all identifications | All keys and references used must be on bibliography prepared by another laboratory | If other references are desired, obtain permission to use them from the Project Facilitator
Prepare reference collection | Each new taxon per laboratory | Complete reference collection to be maintained by each individual laboratory | Lab Manager periodically reviews data and reference collection to ensure reference collection is complete and identifications are accurate

DATA VALIDATION
Taxonomic "reasonableness" checks | All data sheets | Genera known to occur in given lake or geographic area | Second or third identification by expert in that taxon
5.4.2   Response Design

       As the physical habitat indicator is based on field measurements and observations, there
is no sample collection associated with this indicator. The shoreline and littoral habitat surveys
employ a randomized, systematic design with 10 equally spaced observation stations located

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 51 of 87
around the shore of each sample lake.  Teams go to the field with pre-marked lake outlines
showing these stations. The response design for sampling locations is shown in Figure 5-1.

5.4.3   Sampling  Methods

       Field Measurements: Field measurements, observations, and associated methodology
for the protocol are summarized in Table 5-16. The observations at each station include
quantitative and semiquantitative observations of vegetation structure, anthropogenic
disturbances, and bank substrate onshore.  In-lake littoral measurements and observations deal
with littoral water depth, bottom substrate, nearshore fish cover, and aquatic macrophyte cover.
With quantifiable confidence, investigators condense these observations into descriptions
applicable to the whole lakeshore and littoral zone.  Detailed procedures for completing the
protocol are provided in the Field Operations Manual; equipment and supplies required are also
listed. All measurements and observations are recorded on standardized forms, which are later
entered into the central EMAP surface waters information management system (SWIM) at
WED-Corvallis.

       There is no sample collection or laboratory analysis associated with the physical habitat
measurements.
Table 5-16. Field measurement methods: physical habitat indicator.

Variable or Measurement | Units | Summary of Method

RIPARIAN ZONE
Vegetation type | none | Record the dominant vegetation type in the canopy and understory layers within plot
Riparian vegetation structure | percent | Visually estimate areal coverage of ground cover, understory, and canopy types within plot
Substrate type | percent | Visually estimate areal coverage of substrate types present in area 1 m back from water
Bank angle | none | Describe the angle of the shoreline bank back 1 m from the edge of the water
Bank features | 0.1 m | Visually estimate the vertical and horizontal distances between the present lake level and the high water line
Human influence | none | Estimate presence/absence of defined types of anthropogenic features
-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 52 of 87
Table 5-16. (Continued).

Variable or Measurement | Units | Summary of Method

LITTORAL ZONE
Substrate type | percent | Visually estimate areal coverage of substrate types present within the 10- by 15-m area between the boat and shoreline
Station depth | m | Measure depth at 10 m offshore
Surface film | none | Indicate presence/absence of defined types of surface films
Sediment color | none | Note sediment color if a sample can be seen or collected
Sediment odor | none | Note sediment odor if a sample can be collected
Macrophyte cover | percent | Estimate areal coverage of aquatic macrophyte types (submerged, emergent, and floating) within the 10- by 15-m area between the boat and shoreline
Fish cover | none | Indicate presence/absence of fish cover types within the 10- by 15-m area between the boat and shoreline
Littoral habitat and cover | none | Classify littoral habitat according to the following: disturbance regime, cover class, cover type, and substrate type for the 10-m by 15-m littoral area
5.4.4   Quality Assurance Objectives

       MQOs are given in Table 5-17. General requirements for comparability and
representativeness are addressed in Section 2. The MQOs given in Table 5-17 represent the
maximum allowable criteria for statistical control purposes. Precision is determined from results
of revisits (field measurements) taken on a different day and  by duplicate measurements taken
on the same day.

5.4.6   Quality Control Procedures: Laboratory Operations

       There are no laboratory operations associated with this indicator.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 53 of 87
Table 5-17. Measurement data quality objectives: physical habitat indicator

Variable or Measurement | Precision | Accuracy | Completeness
Field Measurements and Observations | ±10% | NA | 90%

NA = not applicable in most cases. This would apply if the field auditor did a separate assessment and compared the results to the crew's.
5.4.5   Quality Control Procedures: Field Operations

       Specific quality control measures for field measurements and observations are listed in
Table 5-18.

Table 5-18. Field quality control: physical habitat indicator

Check Description | Frequency | Acceptance Criteria | Corrective Actions

QUALITY CONTROL
Check totals for cover class categories (vegetation type, substrate, cover) | Each station | Sum must be reasonable | Repeat observations
Check completeness of station depth measurements | Each station | Depth measurements for all stations | Obtain best estimate of depth where actual measurement not possible

DATA VALIDATION
Estimate precision of measurements based on repeat visits | 2 visits | Measurements should be within 10 percent | Review data for reasonableness; determine if acceptance criteria need to be modified
5.4.7   Data Management, Review, and Validation

       Checks made of the data in the process of review, verification, and validation are
summarized in Table 5-18.  The Indicator Lead is ultimately responsible for ensuring the validity
of the data, although performance of the specific checks may be delegated to other staff
members. All raw data (including all standardized forms and logbooks) are retained
permanently in an organized fashion in accordance with EPA records management policies.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 54 of 87

5.5    Phytoplankton Indicator

5.5.1   Introduction

       Phytoplankton are free-floating algae suspended in the lakes' water column, which
provide the base of most lake food webs. Excessive nutrient and organic inputs from human
activities in lakes and their watersheds lead to eutrophication, characterized in part by increases
in phytoplankton biomass.  Both species composition and abundance respond to water quality
changes caused by nutrients, pH, alkalinity, temperature, and metals.

5.5.2   Response Design

       At the index site located at the deepest point of the lake, a single depth-integrated
phytoplankton sample is collected from the euphotic zone of sufficient volume to ensure
adequate phytoplankton biomass for analysis. The response design for sampling locations is
shown in Figure 5-1.

5.5.3   Sampling Methods

       Sample Collection: An integrated sampling device is used to collect a depth-integrated
water sample from the euphotic zone. Typically, samples are collected from the surface down
to a depth of 2 m. However, if the Secchi depth is less than 1 m, the sampler should be held at
an oblique angle down to a depth of twice the Secchi depth.  This is to ensure that the sample is
collected from the upper epilimnion.  From the 2-L composite sample collected, an aliquot of 1-L
is transferred to a bottle for settling and preserved with Lugol's solution.  The remaining 1-L
aliquot is used for the algal toxin sample (see section 5.8). Detailed procedures for sample
collection and handling are described in the Field Operations Manual.

       Analysis: Preserved samples are processed, enumerated, and organisms identified to
the lowest possible taxonomic level (generally genus, see Laboratory Methods Manual) using
specified standard keys and references.  Processing and archival methods are based on USGS
NAWQA methods (Charles et al. 2003).  Detailed procedures are contained in the laboratory
operations manual and cited references. There is no maximum holding time associated with
preserved phytoplankton samples. Table 5-19 summarizes field and analytical methods for the
phytoplankton indicator.

5.5.4   Quality Assurance Objectives

       MQOs are given in Table 5-20. General requirements for comparability and
representativeness are addressed in Section 2.  Precision is calculated as percent efficiency,
estimated from independent identifications of organisms in randomly selected samples.  The
MQO for accuracy is evaluated by having individual specimens representative of selected taxa
identified by recognized experts.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 55 of 87
Table 5-19. Field and laboratory methods: phytoplankton indicator

Variable or Measurement | QA Class | Expected Range and/or Units | Summary of Method | References
Sample Collection | C | NA | Depth-integrated sampler used to collect 1-L water sample from euphotic zone | Lakes Survey Field Operations Manual 2006
Concentrate Subsamples | N | NA | Concentrated by settling and decanting or by centrifugation to 5-10 times the original whole-water sample | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Counting cell/Chamber preparation | N | NA | Prepare either Palmer-Maloney counting cell or Utermöhl sedimentation chamber | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Enumeration | C | 0 to 300 organisms | Random systematic selection of field or transect with target of 300 organisms from sample | Charles et al. 2003; Lakes Survey Laboratory Methods Manual 2006
Identification | C | genus | Specified keys and references |

C = critical, N = non-critical quality assurance classification.
Table 5-20. Measurement data quality objectives: phytoplankton indicator

Variable or Measurement | Precision | Accuracy | Completeness
Enumeration | 85% | 90% (a) | 99%
Identification | 85% | 90% (a) | 99%

(a) Taxonomic accuracy, as calculated using Equation 9 in Section 2.
5.5.5   Quality Control Procedures: Field Operations

       After the 1-L bottle has been filled and Lugol's preservative has been added, the label
should be checked to ensure that all written information is complete and legible, and that the
label has been completely covered with clear packing tape. It should be verified that the bar
code assigned to the phytoplankton sample is recorded correctly on the Sample Collection Form
(Figure 5-5). The presence of preservative in the sample should be noted on the Sample
Collection Form to assure the integrity of the sample.  A flag code should be recorded and
comments provided on the Sample Collection Form to denote any problems encountered in
collecting the sample or the presence of any conditions that may affect sample integrity.
       Additionally, duplicate (repeat) samples will be collected at 10% of lakes sampled.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
	Page 56 of 87

5.5.6  Quality Control Procedures: Laboratory Operations

       It is critical that, prior to taking a small portion of the subsample, the sample be
thoroughly mixed so that macroscopic or visible forms are evenly dispersed. Specific quality
control measures for laboratory identification operations are listed in Table 5-21.

5.5.7  Data Management, Review, and Validation

       Checks made of the data in the process of review, verification, and validation are
summarized in Table 5-21.  The Indicator Lead is ultimately responsible for ensuring the validity
of the data, although performance of the specific checks may be delegated to other staff
members.  Once data have passed  all acceptance requirements, computerized data files are
prepared in a format specified for the Lakes Survey project.  The electronic data files are
transferred to the Lakes Survey IM Coordinator at WED-Corvallis for entry into a centralized
data base. A hard copy output of all files will also be sent to the Lakes Survey IM Coordinator.

       Sample residuals, vials, and slides are archived by each laboratory until the EPA Project
Leader has authorized, in writing, the disposition of samples. All raw data (including field data
forms and bench data recording sheets) are retained permanently in an organized fashion  by
the Indicator Lead in accordance with EPA records management policies.

5.6    Zooplankton Indicator

5.6.1   Introduction

       Zooplankton are important components of the open water environment of lakes and
ponds. Most species are microscopic and consist of crustaceans, rotifers, pelagic insect larvae,
and aquatic mites.  Zooplankton are important elements of the food chain since they transfer
energy from algae (primary producers) to larger invertebrate predators and fish.  The
zooplankton species assemblage responds to environmental stressors such as nutrient
enrichment, acidification, and fish stocks. The effects of environmental stress can be detected
through changes in species composition and abundance,  body  size distribution, and food web
structure.

5.6.2  Response Design

       At  the index site located at the deepest point of the lake, a single zooplankton sample is
collected to provide a representation of the lake's condition with respect to its biota. The
response design for sampling  locations is shown in Figure 5-1.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 57 of 87
Table 5-21. Laboratory Quality Control: phytoplankton indicator

Check or Sample Description | Frequency | Acceptance Criteria | Corrective Action

SAMPLE PROCESSING
Re-process subsamples using final dilution/concentration factor | 10% of all samples completed per laboratory | Percent similarity >70% | If <70%, re-process additional subsamples

IDENTIFICATION
Duplicate identification by different taxonomist within lab | 10% of all samples completed per laboratory | Percent similarity >85% | If <85%, determine reason and, if cause is systemic, re-identify all samples completed by that taxonomist
Independent identification by outside taxonomist | All uncertain taxa | Uncertain identifications to be confirmed by expert in particular taxa | Record both tentative and independent IDs
Use standard taxonomic references | For all identifications | All keys and references used must be on bibliography prepared by another laboratory | If other references are desired, obtain permission to use them from the Project Facilitator

DATA VALIDATION
Taxonomic "reasonableness" checks | All data sheets | Genera known to occur in given lake or geographic area | Second or third identification by expert in that taxon
5.6.3   Sampling Methods

       Sample Collection: Zooplankton samples are collected using a Wisconsin net sampler
with a fine (73 µm) mesh net towed vertically from near the bottom to the surface.  A calibrated
chain is used to make and measure the vertical tow. The chain is attached to the Wisconsin net
so that depth is measured from the mouth of the net. The net is hauled from about 0.5 m off the
bottom to the surface.  In clear, shallow lakes (less than 2-m deep, where the Secchi disk can
be seen on the bottom), a second tow is performed to collect a sufficient number of individuals
to adequately characterize the assemblage. Detailed procedures for sample collection and
handling are described in the Field Operations Manual.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 58 of 87
       Analysis: Preserved samples are processed, enumerated, and organisms identified to
the lowest possible taxonomic level (generally genus, see Laboratory Methods Manual) using
specified standard keys and references. Processing and archival methods are based on
standard methods.  Detailed procedures are contained in the Laboratory Methods Manual and
cited references. There is no maximum holding time associated with preserved zooplankton
samples. Table 5-22 summarizes field and analytical methods for the zooplankton indicator.

5.6.4   Quality Assurance Objectives

       Measurement quality objectives (MQOs) are given in Table 5-23.  General requirements
for comparability and representativeness are addressed in Section 2. Precision is calculated as
percent similarity, estimated from independent identifications of organisms in randomly selected
samples. The MQO for accuracy is evaluated by having individual specimens representative of
selected taxa identified by recognized experts.
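
The percent similarity calculation itself is defined in Section 2 of this QAPP. For illustration only,
the following Python sketch (not taken from the Survey documents) computes one common form of
the metric, the sum of the minimum relative abundances shared by two independent identifications
of the same sample, and compares it to the 85% MQO; the taxa and counts shown are hypothetical.

# Illustrative sketch only: percent similarity between two independent taxonomic
# counts of the same zooplankton sample, computed as the sum of the minimum
# relative abundances shared by the two counts. Taxa and counts are hypothetical.

def percent_similarity(count_a: dict, count_b: dict) -> float:
    """Return percent similarity (0-100) between two taxon-count dictionaries."""
    total_a = sum(count_a.values())
    total_b = sum(count_b.values())
    taxa = set(count_a) | set(count_b)
    shared = 0.0
    for taxon in taxa:
        rel_a = 100.0 * count_a.get(taxon, 0) / total_a
        rel_b = 100.0 * count_b.get(taxon, 0) / total_b
        shared += min(rel_a, rel_b)
    return shared

# Hypothetical duplicate identifications of one sample by two taxonomists.
first_id  = {"Daphnia": 120, "Bosmina": 200, "Keratella": 80}
second_id = {"Daphnia": 110, "Bosmina": 215, "Keratella": 60, "Cyclops": 15}

ps = percent_similarity(first_id, second_id)
print(f"Percent similarity = {ps:.1f}%  (MQO: >= 85%)")
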
Table 5-22.  Field and laboratory methods: zooplankton indicator

Variable or Measurement | QA Class | Expected Range and/or Units | Summary of Method | References
Sample Collection | C | NA | Wisconsin net with 73 µm mesh towed vertically from 0.5 m above bottom to surface | Lakes Survey Field Operations Manual 2006
Subsampling | N | NA | Pipette from graduated cylinder/Imhoff cone or Folsom plankton splitter | Lakes Survey Laboratory Methods Manual 2006
Counting cell/chamber preparation | N | NA | Prepare counting cell for small organisms and counting chamber for larger organisms | Lakes Survey Laboratory Methods Manual 2006
Enumeration | C | 400 organisms | Random systematic selection of fields with target of 400 organisms from sample | Lakes Survey Laboratory Methods Manual 2006
Identification | C | genus | Specified keys and references |

 C = critical, N = non-critical quality assurance classification.
Table 5-23.  Measurement data quality objectives: zooplankton indicator

Variable or Measurement | Precision | Accuracy | Completeness
Enumeration | 85% | 90%a | 99%
Identification | 85% | 90%a | 99%

 NA = not applicable
 a Taxonomic accuracy, as calculated using Equation 9 in Section 2.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
                                                                               Page 59 of 87

5.6.5   Quality Control Procedures: Field Operations

       After the sample is collected and dispensed into 125 mL jars, the labels should be
checked to verify that all written information is complete and legible, and that the label has been
completely covered with clear packing tape.  It should be verified that both the bar codes
assigned to the sample and the tow length have been recorded correctly on the Sample
Collection Form (Figure 5-5).  The presence of preservative in the sample should be noted on
the Sample Collection Form to assure the integrity of the sample.  A flag code should be
recorded and comments provided on the Sample Collection Form to denote any problems
encountered in collecting the  sample or the presence of any conditions that may affect sample
integrity.

       Additionally, duplicate (repeat) samples will be collected at 10%  of lakes sampled.

5.6.6   Quality Control Procedures: Laboratory Operations

       Specific quality control measures are listed in Table 5-24 for laboratory operations.

5.6.7   Data Management, Review, and Validation

       Checks made of the data in the process of review, verification, and validation are
summarized in Table 5-24. The Indicator Lead is ultimately responsible for ensuring the validity
of the data, although performance of the specific checks may be delegated to other staff
members. Once data have passed all acceptance requirements, computerized data files are
prepared in a format specified for the Lakes Survey project. The electronic data files are
transferred to the Lakes Survey IM Coordinator at WED-Corvallis for entry into a centralized
data base. A hard copy output of all files will also be sent to the Lakes Survey IM Coordinator.

       Sample residuals and vials are archived by each laboratory until the EPA Project
Leader has authorized, in writing, the disposition of samples. All  raw data (including field data
forms and bench data recording sheets) are retained permanently in an  organized fashion by
the Indicator Lead in accordance with EPA records management policies.

Table 5-24. Laboratory Quality Control: zooplankton indicator

Check or Sample | Frequency | Acceptance Criteria | Corrective Action

SAMPLE PROCESSING
Re-process subsamples | 10% of all samples completed per laboratory | Percent Similarity >70% | If <70%, re-process additional subsamples

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 60 of 87
IDENTIFICATION
Duplicate identification by different taxonomist within lab | 10% of all samples completed per laboratory | Percent Similarity >85% | If <85%, determine reason and, if cause is systemic, re-identify all samples completed by that taxonomist
Independent identification by outside taxonomist | All uncertain taxa | Uncertain identifications to be confirmed by expert in particular taxa | Record both tentative and independent IDs
Use standard taxonomic references | For all identifications | All keys and references used must be on bibliography prepared by another laboratory | If other references desired, obtain permission to use from Project Facilitator

DATA VALIDATION
Taxonomic "reasonableness" checks | All data sheets | Genera known to occur in given lake or geographic area | Second or third identification by expert in that taxon

5.7    Pathogen Indicator

5.7.1   Introduction

       The primary function of collecting water samples for Pathogen Indicator Testing is to
provide a relative comparison of fecal pollution indicators for national lakes and ponds. The
concentration of Enterococci (the current bacterial indicator for fresh and marine waters) in a
water body correlates with the level of more infectious gastrointestinal pathogens present in the
water body.  While some Enterococci are opportunistic pathogens among immuno-
compromised human individuals, the presence of Enterococci is more importantly an indicator of
the presence of more pathogenic microbes (bacteria, viruses and protozoa) associated with
human or animal fecal waste. These pathogens can cause waterborne illness in bathers and
other recreational users through exposure or accidental ingestion. Disease outbreaks can occur
in and around beaches that become contaminated with high levels of pathogens. Therefore,
measuring the concentration of pathogens present in lake and pond water can help assess
comparative human health concerns regarding recreational use.

       In this survey, a novel Draft EPA Quantitative PCR Method (1606) will be used to
measure the concentration of genomic DNA from the fecal indicator group Enterococcus in the
water samples. While neither federal nor state Water Quality Criteria (standards) have been
formally established for the level of Enterococcus DNA in a sample, epidemiological studies
(Wade et al. 2005) have established a strong correlation between Enterococcus DNA levels and
the incidence of highly credible gastrointestinal illness (HCGI) among swimmers.  The
Enterococcus qPCR results will serve as an estimate of the concentration of total (culturable
and non-culturable) Enterococci present in the surveyed lakes and ponds for the purpose of
comparative assessment. This study also has the potential to yield  invaluable information about
the inhibitory effects of water matrices from the different regions of the nation  upon the qPCR
assay.

-------
Survey of the Nation's Lakes                                                  Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 61 of 87
5.7.2  Sampling Design

       A single "pathogen" water sample will be collected from one sampling location
approximately 10m offshore, in conjunction with the final physical habitat sampling station
location.  The plot design for sampling locations is shown in Figure 5-1.

5.7.3  Sampling Methods

       Sample Collection:  At the final physical habitat shoreline station (located approximately
10 m off shore), a single 1-L water grab sample is collected approximately 6-12 inches below
the surface of the water. Detailed procedures for sample collection and handling are described
in the Field Operations Manual.  Pathogen samples must be filtered and the filters must be
folded and frozen in vials within 6 hours of collection.

       Analysis: Pathogen samples are filter concentrated, then shipped on dry ice to the New
England Regional Laboratory, where the filter retentates are processed and the DNA extracts
are analyzed using quantitative polymerase chain reaction (qPCR), a genetic method that
quantifies a DNA target via a fluorescently tagged probe, based on methods developed by the
USEPA National Exposure Research Laboratory. Detailed procedures are contained in the
laboratory operations manual. Table 5-25 summarizes field and analytical methods for the
pathogen indicator.

5.7.4  Quality Assurance Objectives

       Measurement quality objectives (MQOs) are given in Table 5-26. General requirements for
comparability and representativeness are addressed in Section 2.  Precision is expressed as the
relative standard deviation (RSD) of replicate qPCR results for field and laboratory duplicate
samples, and accuracy is evaluated using calibrator and standard samples containing known
DNA sequence numbers, as summarized in Table 5-26.
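
Draft EPA Method 1606 specifies the actual quantitation calculations for this indicator. As a purely
illustrative sketch of the relative (comparative-Ct) quantitation idea referenced in Table 5-26, the
Python fragment below estimates calibrator cell equivalents (CCEs) from hypothetical Enterococcus
(ENT) and sample processing control (SPC/Sketa) Ct values; the function name, amplification
factor, and all numbers are assumptions for illustration, not values taken from the method.

# Illustrative sketch only -- Draft EPA Method 1606 defines the actual calculations.
# Generic comparative-Ct (delta-delta-Ct) estimate of calibrator cell equivalents (CCEs)
# from ENT and SPC (Sketa) qPCR Ct values. All numbers below are hypothetical.

def cce_per_reaction(ct_ent_sample, ct_spc_sample,
                     ct_ent_calibrator, ct_spc_calibrator,
                     calibrator_cells, amplification_factor=2.0):
    """Estimate target cell equivalents relative to a calibrator sample."""
    # Normalize each ENT result to its SPC recovery control, then compare the
    # sample to the calibrator (delta-delta-Ct).
    d_ct_sample = ct_ent_sample - ct_spc_sample
    d_ct_calibrator = ct_ent_calibrator - ct_spc_calibrator
    dd_ct = d_ct_sample - d_ct_calibrator
    return calibrator_cells * amplification_factor ** (-dd_ct)

# Hypothetical Ct values and calibrator cell number.
cce = cce_per_reaction(ct_ent_sample=32.1, ct_spc_sample=24.8,
                       ct_ent_calibrator=28.4, ct_spc_calibrator=24.5,
                       calibrator_cells=1.0e4)
print(f"Estimated ENT calibrator cell equivalents: {cce:.0f}")
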

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
    Revision No. 1
Date: August 2007
    Page 62 of 87
Table 5-25.  Field and laboratory methods: pathogen indicator (Enterococci)

Variable or Measurement | QA Class | Expected Range and/or Units | Summary of Method | References
Sample Collection | C | NA | Sterile sample bottle submerged to collect 250-mL sample 6-12" below surface at 10 m from shore | Lakes Survey Field Operations Manual 2006
Sub-sampling | N | NA | 2 x 50-mL sub-samples poured into sterile 50-mL tube after mixing by inversion 25 times | Lakes Survey Laboratory Methods Manual 2006
Sub-sample (& Buffer Blank) Filtration | N | NA | Up to 50-mL sub-sample filtered through sterile polycarbonate filter. Funnel rinsed with minimal amount of buffer. Filter folded, inserted in tube, then frozen. | Lakes Survey Laboratory Methods Manual 2006
Preservation & Shipment | C | -40 °C to +40 °C | Batches of sample tubes shipped on dry ice to lab for analysis. | Lakes Survey Laboratory Methods Manual 2006
DNA Extraction (Recovery) | C | 10-141% | Bead-beating of filter in buffer containing Extraction Control (SPC) DNA. DNA recovery measured. | EPA Draft Method 1606 Enterococcus qPCR
Method 1606 (Enterococcus & SPC qPCR) | C | <60 (RL) to >100,000 ENT CCEs/100 mL | 5-µL aliquots of sample extract are analyzed by ENT & Sketa qPCR assays along with blanks, calibrator samples & standards. Field and lab duplicates are analyzed at 10% frequency. Field blanks analyzed at end of testing only if significant detections observed. | EPA Draft Method 1606 Enterococcus qPCR; NERL NLPS2007 qPCR Analytical SOP

 C = critical, N = non-critical quality assurance classification.
Table 5-26.  Measurement data quality objectives: Pathogen-Indicator DNA Sequences

Variable or Measurement* | Method Precision | Method Accuracy | Completeness
SPC & ENT DNA sequence numbers of Calibrators & Standards by AQM | RSD = 50% | 50% | 95%
ENT CCEs by dCt RQM | RSD = 70% | 35% | 95%
ENT CCEs by ddCt RQM | RSD = 70% | 50% | 95%

*AQM = Absolute Quantitation Method; RQM = Relative Quantitation Method;
 SPC = Sample Processing Control (Salmon DNA / Sketa); CCEs = Calibrator Cell Equivalents

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
	Page 63 of 87

5.7.5  Quality Control Procedures: Field Operations

       It is important that the sample container be sterilized and remain unopened until the
sample is ready to be collected.  Once the sample bottle has been lowered to the desired depth
(6-12 in. below the surface), it may be opened and filled.  After filling the 1-L bottle, check the
label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label and bar code, covering the label completely.
Record the bar code assigned to the pathogen sample on the Sample Collection Form (Figure
5-5).  Enter a flag code and provide comments on the Sample Collection Form if there are any
problems in collecting the sample or if conditions occur that may affect sample integrity.  All
samples should be placed in coolers and maintained on ice during transport to the laboratory
and maintained at 1-4°C during the time interval before they are filtered for analysis.  Recheck
all forms and labels for completeness and legibility.

       Field blanks and duplicates will be collected at 10% of sites sampled. In addition, each
field crew should collect a blank sample over the course of the survey as a check on each
crew's aseptic technique and the sterility of test reagents and supplies.

5.7.6  Quality Control Procedures: Laboratory Operations

       Specific quality control measures are listed in Table 5-27 for laboratory operations.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 64 of 87
Table 5-27. Laboratory Quality Control: Pathogen-Indicator DNA Sequences

Check or Sample | Frequency | Acceptance Criteria | Corrective Action

SAMPLE PROCESSING
Re-process sub-samples (Lab Duplicates) | 10% of all samples completed per laboratory | Percent Congruence <70% RSD | If >70%, re-process additional sub-samples

qPCR ANALYSIS
Duplicate analysis by different biologist within lab | 10% of all samples completed per laboratory | Percent Congruence <70% RSD | If >70%, determine reason and, if cause is systemic, re-analyze all samples in question.
Independent analysis by external laboratory | None | Independent analysis TBD | Determine if independent analysis can be funded and conducted.
Use single stock of E. faecalis calibrator | For all qPCR calibrator samples for quantitation | All calibrator sample Cp (Ct) values must have an RSD < 50%. | If calibrator Cp (Ct) values exceed an RSD of 50%, the batch's calibrator samples shall be re-analyzed and replaced with new calibrators to be processed and analyzed if the RSD is not back within range.

DATA PROCESSING & REVIEW
100% verification and review of qPCR data | All qPCR amplification traces, raw and processed data sheets | All final data will be checked against raw data, exported data, and calculated data printouts before entry into LIMS and upload to Corvallis, OR database. | Second tier review by contractor and third tier review by EPA.

5.7.7   Data Management, Review, and Validation

       Once data have passed all acceptance requirements, computerized data files are
prepared in a format specified by the 2007 NLPS.  The electronic data files are transferred to
the Lakes Survey IM Coordinator at WED-Corvallis for entry into a centralized data base.  A
hard copy output of all files will also be sent to the  Lakes Survey IM Coordinator.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 65 of 87

5.8    Microcystin Indicator

5.8.1   Introduction

       Cyanobacteria, also known as blue-green algae, are photosynthetic bacteria found in
eutrophic waters. Hundreds of bioactive compounds have been isolated from cyanobacteria
including numerous cyanotoxins, which have been known to threaten human health due to
contaminated drinking water and consumption of contaminated aquatic organisms. The most
common of the toxin groups produced and released by cyanobacteria are microcystins.
Measuring the concentration of microcystins in the water provides an indication of the safety of
the lake water for recreational purposes.

5.8.2   Response Design

       At the index site located at the deepest point of the lake, a single depth-integrated water
sample is collected from the euphotic zone to provide an indication of the presence and
concentration of potentially hazardous algal toxins. The response design for sampling locations
is shown in Figure 5-1.

5.8.3   Sampling Methods

       Sample Collection: An integrated sampling device is used to collect a depth-integrated
water sample from  the surface down to a depth of 2 m. However, if the Secchi depth is less
than 1 m, the sampler should be  held at an oblique angle down to a depth of twice the Secchi
depth.  From the 2-L composite sample collected, an aliquot of 400 mL should be collected in a
pre-rinsed 500 mL HDPE or amber glass bottle and placed immediately in a cooler with ice, or
frozen using dry ice if possible. More detailed procedures for sample collection and handling
are described in the Field Operations Manual.
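
The following minimal sketch simply restates the depth rule above in code form; the helper
function name is hypothetical, and the procedure itself is governed by the Field Operations
Manual.

# Minimal sketch of the depth rule described above (hypothetical helper): integrate
# from the surface to 2 m, unless the Secchi depth is less than 1 m, in which case
# the sampler is lowered (at an oblique angle) to twice the Secchi depth.

def integration_depth_m(secchi_depth_m: float) -> float:
    """Return the target depth (m) for the depth-integrated microcystin sample."""
    if secchi_depth_m < 1.0:
        return 2.0 * secchi_depth_m
    return 2.0

for secchi in (0.4, 0.9, 1.5, 3.0):
    print(f"Secchi {secchi:.1f} m -> integrate to {integration_depth_m(secchi):.1f} m")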

       Analysis:  A performance-based approach is being utilized for the microcystin analysis
that defines a set of laboratory method performance requirements for data quality.  Preserved
samples are processed and concentrations reported using a microtiter plate enzyme-linked
immunosorbent assay (ELISA) with the Abraxis kit.  Performance requirements are listed in
Table 5-29.  Laboratory work will be performed by the USGS Organic Geochemistry Research
Group (OGRG) Laboratory in Lawrence, Kansas.

5.8.4   Quality Assurance Objectives

       Measurement quality objectives (MQOs) are given in Table 5-???.  The MQOs given in
Table 5-??? represent the maximum allowable criteria for statistical control purposes.  Results
from water samples are reported as concentrations between 0.10 µg/L and 5.0 µg/L without
dilution; if a dilution is performed, higher concentrations can be reported.  Non-detects are
reported as "<0.10 µg/L".

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 66 of 87
Table 5-28.  Example Layout of Samples and Controls on Microtiter Plate

       1    2    3    4    5    6    7    8    9    10   11   12
  A    S1   S3   U4   P10  U16  U21  U27  U32  U39  U44  P50  U56
  B    S2   S4   U5   C3   L16  L21  U28  U33  U40  U45  C7   L56
  C    S3   S5   U6   U11  U17  U22  U29  U34  P40  U46  U51  U57
  D    S4   C2   L6   L11  U18  U23  U30  U35  C6   L46  L51  U58
  E    S5   U1   U7   U12  U19  U24  P30  U36  U41  U47  U52  U59
  F    C1   L1   U8   U13  U20  U25  C5   L36  L41  U48  U53  QC3
  G    S1   U2   U9   U14  P20  U26  U31  U37  U42  U49  U54  P59
  H    S2   U3   U10  U15  C4   L26  L31  U38  U43  U50  U55  C8

       S = standard; C = 0.75 µg/L control - supplied with ELISA kit; QC = quality control; U =
       unknown (sample); L = unknown duplicate (sample); P = spiked duplicate unknown
       (sample)
Table 5-29.  Sample analysis quality control activities: microcystin indicator

Quality Control Activity | Description and Requirements | Corrective Action
Laboratory Duplicate | Every first and fifth sample are duplicate samples analyzed for QC purposes. | Samples are re-analyzed if results do not agree or standard deviation curves are bad
Laboratory Spiked Sample | Every tenth sample analyzed is a laboratory spiked duplicate sample. | Samples are re-analyzed if results do not agree or standard deviation curves are bad
Identical Sample | Identical sample designated by a letter S attached to the log number. Final concentration will be 0.75 µg/L of Microcystin-LR plus the ambient concentration. | Samples are re-analyzed if results do not agree or standard deviation curves are bad
Project Quality Control Sample | Designated project archive sample is re-analyzed with every run set for the project. Control charts are maintained for these samples. | Samples are re-analyzed if results do not agree or standard deviation curves are bad

-------
Survey of the Nation's Lakes                                                  Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 67 of 87

5.8.5  Quality Control Procedures: Field Operations

       It is important that the sample bottle be rinsed with sample water three times before
collecting the sample. After collecting 400 mL of sample water and sealing the lid with electrical
tape, check the label to ensure that all written information is complete and legible. Place a strip
of clear packing tape over the label and bar code, covering the label completely.  Record the bar
code assigned to the algal toxin sample on the Sample Collection Form (Figure 5-5).  Enter a
flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.  All samples should
be placed  in coolers and maintained on ice during transport to the laboratory and frozen
immediately upon return  to the lab. Recheck all forms and labels for completeness and
legibility.  Additionally, field duplicate and field replicate samples will be collected at 10% of
lakes sampled.

5.8.6  Quality Control Procedures: Laboratory Operations

5.8.6.1 Sample  Receipt and Processing

       The communications center and information management staff is notified of sample
receipt and any associated problems as soon as possible after samples are received.

5.8.6.2 Analysis of Samples

QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and that the analytical stage of the measurement system is maintained in a state of
statistical control. QC sample requirements and corrective actions are summarized in Table 5-29.

Samples will be analyzed by the USGS using a microtiter plate enzyme-linked immunosorbent
assay (ELISA) with the Abraxis kit. An example layout of this kit is shown in Table 5-28.  The
SoftMax Pro software resides on the immunoassay computer and is used to control the microtiter
plate reader and to calculate results. The software calculates sample values from the calibration
(standard) curve and averages the duplicate results for each sample. The standard curve should
have a correlation coefficient of 0.99. The absorbance of the blank must be greater than 1.400.
The lower reporting limit is 0.10 µg/L; samples below this concentration are flagged as
non-detects.
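
The SoftMax Pro software performs the actual curve fitting and concentration calculations for the
Survey. The sketch below is only a generic illustration of how a four-parameter logistic (4PL)
standard curve can relate ELISA absorbance to microcystin concentration; the standards,
absorbances, starting parameters, and use of scipy are all assumptions made for illustration.

# Illustrative sketch only -- not the Survey's calculation. A four-parameter logistic
# (4PL) standard curve fit and its inversion; standards and absorbances are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL response: a = max response, d = min response, c = inflection, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical kit standards (ug/L) and mean absorbances (competitive assay:
# absorbance decreases as concentration increases).
std_conc = np.array([0.15, 0.40, 1.0, 2.0, 5.0])
std_abs  = np.array([1.35, 1.10, 0.80, 0.55, 0.30])

params, _ = curve_fit(four_pl, std_conc, std_abs, p0=[1.5, 1.0, 1.0, 0.1], maxfev=10000)

def conc_from_abs(absorbance):
    """Invert the fitted 4PL curve to estimate concentration from absorbance."""
    a, b, c, d = params
    return c * ((a - d) / (absorbance - d) - 1.0) ** (1.0 / b)

print(f"Estimated concentration at A = 0.90: {conc_from_abs(0.90):.2f} ug/L")
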

Laboratory duplicates should have a percent relative standard deviation (%RSD) of less than 20
percent when compared to each other. Laboratory spiked duplicates must have a measured value
within ±20 percent of the theoretical concentration of the spiked sample. The theoretical
concentration is determined by adding 0.75 µg/L to the concentration of the unspiked sample.

For each set of ten samples, the first and fifth samples are duplicates, and the tenth sample is a
spiked duplicate. A designated archived project sample is re-analyzed with every set that is run.
Control charts are maintained for these samples. A running historical average is maintained

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 68 of 87

of the concentration from each run. The concentration of the QC samples for each successive
run must be within ±20 percent of that average to be acceptable.
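
As an illustrative summary of the acceptance checks described above (duplicate %RSD below 20
percent, spike recovery within ±20 percent of the theoretical concentration, and project QC results
within ±20 percent of the running historical average), the following Python sketch applies those
limits to hypothetical results; it is not part of the laboratory's procedures.

# Illustrative sketch of the duplicate, spike-recovery, and control-chart checks
# described above; acceptance limits are from the text, all data are hypothetical.

def rsd_percent(values):
    """Percent relative standard deviation of replicate results."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * (var ** 0.5) / mean

# Laboratory duplicates: %RSD must be < 20 percent.
dup = [0.76, 0.73]                       # ug/L, hypothetical
print("Duplicate %RSD OK:", rsd_percent(dup) < 20.0)

# Laboratory spiked duplicate: measured value within +/-20 percent of
# (unspiked result + 0.75 ug/L).
unspiked, spiked_measured = 0.73, 1.49   # ug/L, hypothetical
theoretical = unspiked + 0.75
print("Spike recovery OK:", abs(spiked_measured - theoretical) <= 0.20 * theoretical)

# Project QC sample: result within +/-20 percent of the running historical average.
historical = [0.71, 0.74, 0.77, 0.73]    # prior runs, hypothetical
current = 0.75
avg = sum(historical) / len(historical)
print("QC sample OK:", abs(current - avg) <= 0.20 * avg)
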
5.8.7  Data Management, Review, and Validation

       Once data have passed all acceptance requirements, computerized data files are
prepared in a format specified for the Lakes Survey project. An example data entry sheet is
shown below in Figure 5-6. The electronic data files are transferred to the Lakes Survey IM
Coordinator (Marlys Cappaert) for entry into a centralized data base.  A hard copy output of all
files will also be sent to the Lakes Survey IM Coordinator.

5.9    Benthic Macroinvertebrates

5.9.1   Introduction

       Benthic invertebrates inhabit the sediment (infauna) or live on the bottom substrates or
aquatic vegetation (epifauna) of lakes.  The benthic macroinvertebrate assemblage in lakes is
an important component of measuring the biological condition of the aquatic community and the
overall ecological condition of the lake. Monitoring this assemblage is useful in assessing the
status of the water body and detecting trends in ecological condition.  Populations in the benthic
assemblage respond to a wide array of stressors in different ways so that it is often possible to
determine the type of stressor that has affected a macroinvertebrate assemblage (e.g., Klemm
et al., 1990).  Because many macroinvertebrates have relatively  long life cycles of a year or
more and are relatively immobile, the structure and function of the macroinvertebrate
assemblage reflect exposure to cumulative disturbance.

       For the Lakes Survey,  the epibenthos will be the primary  benthic indicator. Benthos are
collected using a semi-quantitative sampling of multiple habitats  in the littoral zone of lakes
using a D-frame dip net.  The  lake littoral zone is  made up of many microhabitat types, which
have a strong influence on the macroinvertebrate assemblage. Therefore, sample collection is
stratified on the following three specific habitat types: rocky/cobble/large woody debris;
macrophyte beds; and organic fine muds.  Targeted components of the macroinvertebrate
assemblage for these habitat types are rocky-littoral epibenthos,  macrophytic epibenthos, and
muddy-littoral epi- and infaunal benthos, respectively.

5.9.2  Response Design

       Benthic macroinvertebrates are collected  from the dominant habitat within the littoral
zone of each of the 10 P-Hab  stations established along the shoreline. A composite sample of
macroinvertebrates is prepared from a multi-habitat approach and consists of three specific
habitat types:  (1) rocky/cobble/large woody debris; (2) macrophyte beds; and (3) organic fine
muds or sand. The response  design for sampling locations is shown in Figure 5-1.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
    Revision No. 1
Date: August 2007
     Page 69 of 87
                 Microcystin Immunoassay (IMN)

[Figure 5-6 presents an example laboratory data entry form with columns for Sample Tube #,
Project Code, Lab ID (MT#), Calculated* Conc. (µg/L), Dilution, Analysis Date, Plate Name,
Remarks, Tech ID, and Data Entry. The example rows include the kit standards (S0-S5),
0.75 ppb "CCV" control samples, field samples, laboratory duplicates ("L" suffix on the Lab ID),
and spiked samples ("S" suffix); QC rows are marked "QC sample; DO NOT ENTER."]

* The calculated concentration incorporates any dilutions to give the final concentration of the original sample.
Concentration range for this analysis is 0.10-5.0 µg/L without dilution.
Figure 5-6.  Example data entry form for microcystins.

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 70 of 87
5.9.3   Sampling and Analytical Methods

       Sample Collection:  Benthic macroinvertebrates are collected from the dominant habitat
type within each of 10 P-Hab stations. Samples are collected using a modified D-frame kick-net
(500 µm mesh) procedure and are combined to produce a single composite sample for
the lake.  Samples  are field-processed to remove large detritus and preserved in 70% ethanol.
Detailed sampling and processing procedures are described in section 5.4 of the Field
Operations Manual. A condensed description of key elements of the field activities is provided
for easy reference onsite.

       Analysis: Preserved composite samples are sorted, enumerated, and invertebrates
identified to the lowest possible taxonomic level (generally genus, see Laboratory Methods
Manual) using specified standard keys and references. Processing and archival methods are
based on standard  practices. Detailed procedures are contained in the laboratory operations
manual and cited references. There is no maximum holding time associated with preserved
benthic invertebrate samples. Table 5-30 summarizes field and analytical methods for the
benthic invertebrates indicator.

5.9.4   Quality Assurance Objectives

       Measurement quality objectives (MQOs) are given in Table 5-31.  General requirements
for comparability and representativeness are addressed in Section 2.  The MQOs given in Table
5-31 represent the maximum allowable criteria for statistical control purposes.  Precision is
calculated as percent efficiency, estimated from examination of randomly selected sample
residuals by a second analyst and independent identifications of organisms in randomly
selected samples.  The MQO for picking accuracy is estimated from examinations (repicks) of
randomly selected residues by experienced taxonomists.
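
The exact equations are given in Section 2. As an illustration only, the Python sketch below
computes picking efficiency in one straightforward way, as the percentage of organisms
recovered by the original sorter out of all organisms ultimately found, and compares it to the
>90% criterion in Table 5-32; the counts shown are hypothetical.

# Illustrative sketch only: picking efficiency expressed as the percentage of organisms
# recovered by the original sorter out of all organisms ultimately found (original pick
# plus organisms found in the residual by the QC analyst). Counts are hypothetical.

def picking_efficiency(picked_by_sorter: int, found_in_residual: int) -> float:
    """Percent of organisms recovered by the original sorter."""
    total = picked_by_sorter + found_in_residual
    return 100.0 * picked_by_sorter / total if total else 100.0

picked = 487   # organisms removed from the sample by the original sorter (hypothetical)
missed = 21    # organisms found in the residual by the checking analyst (hypothetical)

efficiency = picking_efficiency(picked, missed)
print(f"Picking efficiency = {efficiency:.1f}% (criterion: >90%)")
if efficiency < 90.0:
    print("Corrective action: examine all residuals sorted by this analyst and retrain.")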

Table 5-30.  Field and laboratory methods: benthic indicator

Variable or Measurement | QA Class | Expected Range and/or Units | Summary of Method | References
Sample Collection | C | NA | One-man D-frame kick net (500 µm mesh) used to collect organisms, which are composited from 10 stations | Kamman 2005 (Draft); Lakes Survey Field Operation Manual 2006
Sorting and Enumeration | C | 0 to 500 organisms | Random systematic selection of grids with target of 500 organisms from sample | Lakes Survey Benthic Laboratory Methods 2006
Identification | C | genus | Specified keys and references |

 C = critical, N = non-critical quality assurance classification.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
                                                                               Page 71 of 87

Table 5-31.  Measurement data quality objectives: benthic indicator

Variable or Measurement | Precision | Accuracy | Completeness
Sort and Pick | 95% | 90% | 99%
Identification | 85% | 90%a | 99%

 NA = not applicable
 a Taxonomic accuracy, as calculated using Equation 9 in Section 2.

5.9.5  Quality Control Procedures: Field Operations

       Specific quality control measures are listed in Table 5-?? for field operations.
Additionally, duplicate (replicate) samples will be collected at 10% of lakes sampled.

5.9.6  Quality Control Procedures: Laboratory Operations

       Specific quality control measures are listed in Table 5-32 for laboratory operations.

5.9.7  Data Management, Review, and Validation

       Checks made of the data in the process of review, verification, and validation are
summarized in Table 5-32.  The Indicator Lead is ultimately responsible for ensuring the validity
of the data, although performance of the specific checks may be delegated to other staff
members. Once data have passed all acceptance requirements, computerized data files  are
prepared in a format specified for the  Lakes Survey  project by EMAP and copied onto a floppy
diskette. The diskettes are transferred to the Lakes  Survey IM Coordinator at WED-Corvallis for
entry into a centralized data base. A hard copy output of all files accompanies each diskette.

       A reference specimen collection is prepared  as new taxa are encountered in samples.
This collection consists of preserved specimens in vials and mounted on slides and is provided
to the responsible EPA laboratory as part of the analytical laboratory contract requirements.
The  reference collection is archived at the responsible EPA laboratory or other suitable facility.

       Sample residuals, vials, and slides are archived by each laboratory until the EPA Project
Leader has authorized, in writing, the disposition of samples. All raw data (including field  data
forms and bench data recording sheets) are retained permanently in an organized fashion by
the Indicator Lead in accordance with EPA records management policies.

Table 5-32.  Laboratory Quality Control: benthic indicator

Check or Sample | Frequency | Acceptance Criteria | Corrective Action

SAMPLE PROCESSING (PICK AND SORT)

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 72 of 87
Sample residuals examined by different analyst within lab | 10% of all samples completed per analyst | Efficiency of picking >90% | If <90%, examine all residuals of samples by that analyst and retrain analyst
Sorted samples sent to independent lab | 10% of all samples | Accuracy of contractor laboratory picking and identification >90% | If picking accuracy <90%, all samples in batch will be reanalyzed by contractor

IDENTIFICATION
Duplicate identification by different taxonomist within lab | 10% of all samples completed per laboratory | Efficiency >85% | If <85%, re-identify all samples completed by that taxonomist
Independent identification by outside taxonomist | All uncertain taxa | Uncertain identifications to be confirmed by expert in particular taxa | Record both tentative and independent IDs
Use standard taxonomic references | For all identifications | All keys and references used must be on bibliography prepared by another laboratory | If other references desired, obtain permission to use from Project Facilitator
Prepare reference collection | Each new taxon per laboratory | Complete reference collection to be maintained by each individual laboratory | Benthic Lab Manager periodically reviews data and reference collection to ensure reference collection is complete and identifications are accurate

DATA VALIDATION
Taxonomic "reasonableness" checks | All data sheets | Genera known to occur in given lakes or geographic area | Second or third identification by expert in that taxon

      6.0   FIELD AND BIOLOGICAL LABORATORY QUALITY EVALUATION AND
                                ASSISTANCE VISITS

      No national program of accreditation for phytoplankton, zooplankton, sediment diatom,
algal toxin, or benthic macroinvertebrate collection and sample processing currently exists.
However, national standards of performance and audit guidance for biological laboratories are
being considered by the National Environmental Laboratory Accreditation Conference (NELAC).

-------
Survey of the Nation's Lakes                                                    Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 73 of 87

For this reason, a rigorous program of field and laboratory evaluation and assistance visits has
been developed to support the Survey of the Nation's Lakes Program.

       Procedural review and assistance personnel are trained to the specific implementation
and data collection methods detailed in the Lakes Survey Field Operations Manual. Plans and
checklists for field evaluation and assistance visits have been developed to reinforce the specific
techniques and procedures for both field and laboratory applications. The plans and checklists
are included in this section and describe the specific evaluation and corrective actions
procedures.

       It is anticipated that evaluation and assistance visits will be conducted with each Field
Team early in the sampling and data collection process, and that corrective actions will be
conducted in real time.  These visits provide a basis for the uniform evaluation of the data
collection  techniques, and an opportunity to conduct procedural reviews as required to minimize
data loss due to improper technique or interpretation of program guidance. Through uniform
training of field crews and review cycles conducted early in the data collection process,
sampling variability associated with specific implementation or interpretation of the protocols will
be significantly reduced. The field evaluation visits, while performed by a number of different
supporting collaborator agencies and participants, will be based on the uniform training, plans,
and checklists.  This  review and assistance task will be conducted for each unique crew
collecting  and contributing data under this program; hence no data will be recorded to the
project database that were produced by an 'unaudited' process, or individual.

       Similarly, laboratory evaluation and assistance visits will be conducted early in the
project schedule and soon after sample processing begins at each laboratory to  ensure that
specific laboratory techniques are implemented consistently across the multiple laboratories
generating data for the  program. Laboratory evaluation and assistance visit plans and
checklists have been developed to ensure uniform interpretation and guidance in the procedural
reviews. These laboratory visits are designed such that full corrective action plans and
remedies  can be implemented in the case of unacceptable deviations from the documented
procedures observed in the review process without recollection of samples.

The  Field  and Laboratory Evaluation and Assistance Visit Plans are as follows:

6.1     Field Quality Evaluation and Assistance Visit Plan for the Survey of the Nation's
       Lakes (Lakes Survey)

Evaluators: One or more designated EPA or Contractor staff members who are qualified (i.e.,
have completed training) in the procedures of the Lakes Survey field sampling operations.

To Evaluate: Regional Monitoring Coordinator-appointed Field Sampling Teams
during sampling operations on site.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 74 of 87

Purpose:  To identify and correct deficiencies during field sampling operations.

1. Tetra Tech project staff will review the Field Evaluation and Assistance Visit Plan and Check
   List with each Evaluator during field operations training sessions.

2. The Tetra Tech QA Officer or authorized designee will send a copy of the final Plan and 4-
   part carbonless copy versions of the final Check List pages, envelopes to return the Check
   Lists, a clipboard, pens, and Lakes Survey QAPP and Field Operations Manual to each
   participating Evaluator.

3. Each Evaluator is responsible for providing their own field gear sufficient to accompany the
   Field Sampling Teams (e.g., protective clothing, sunscreen, insect repellent, hat, water
   bottle, food, back pack, cell phone) during a complete sampling cycle.  Schedule of the Field
   visits will be made by the Evaluator in consultation with the Tetra Tech QA Officer and
   respective Field sampling crew Leader. Evaluators should be prepared to spend
   additional time in the field if needed (see below).

4. Tetra Tech and the Regional Coordinators will arrange the schedule of visitation with each
   Field Team, and notify the Evaluators concerning site locations, where and when to meet
   the team, and how to get there.  Ideally, each Field Team will be evaluated within the first
   two weeks of beginning sampling operations, so that procedures  can be corrected or
   additional training provided,  if needed. GLEC or EPA Evaluators will visit Tetra Tech Field
   Teams and Tetra Tech or EPA Evaluators will visit GLEC Field Teams. Any EPA or
   Contractor Evaluator may visit State/Tribal Field Teams.

5. A Field Team for the Lakes Survey consists of a two- to four-person crew where, at a
   minimum, the Field sampling crew Leader is fully trained.

6. If members of a Field Team change, and a majority (i.e., two) of the members have not
   been evaluated previously, the Field Team must be evaluated again during sampling
   operations as soon as possible to ensure that all members of the Field Team understand
   and can perform the procedures.

7. The Evaluator will view the performance of a team through one complete set of sampling
   activities as detailed on the Field Evaluation and Assistance Check List.

   a.  Scheduling might necessitate starting the evaluation midway on the list of tasks at a site,
       instead of at the beginning.  In that case, the Evaluator will follow the team to the  next
       site to complete the evaluation of the first activities on the list.

   b.  If the Team misses or incorrectly performs a procedure, the Evaluator will note this on
       the checklist and immediately point this out so the mistake can be corrected on the spot.
       The role of the Evaluator is to provide additional training and guidance so that the
       procedures are being performed consistent with the Field Operations Manual, all data
       are recorded correctly, and paperwork is properly completed at the site.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 75 of 87

   c.  When the sampling operation has been completed, the Evaluator will review the results
       of the evaluation with the Field Team before leaving the site (if practicable), noting
       positive practices and problems (i.e., weaknesses [might affect data quality]; deficiencies
       [would adversely affect data quality]).  The Evaluator will ensure that the Team
       understands the findings and will be able to perform the procedures properly in the
       future.

   d.  The Evaluator will  record responses or concerns, if any, on the Field Evaluation and
       Assistance Check List. They will review this list with the field sampling crew at the site.

   e.  If the Evaluator's findings indicate that the Field Team is not performing the procedures
       correctly, safely, or thoroughly, the Evaluator must continue working with this Field Team
       until certain of the  Team's ability to conduct the sampling properly so that data quality is
       not adversely affected.

   f.   If the Evaluator finds major deficiencies in the Field Team operations (e.g., less than
       three members, equipment or performance problems) the Evaluator must contact one of
       the following QA officials:

            i.  Dr. Esther Peters, Tetra Tech QA Officer (703-385-6000)
            ii.  Ms. Robin Silva-Wilkinson, GLEC QA Officer (231-941-2230)
            iii. Mr. Otto Gutenson, EPA Lakes Survey Project QA Officer (202-566-1183)

            The QA official will contact the EPA Project Leader (Carol Peterson - 202-566-
            1304) or Alternate EPA Project Leader (Steve Paulsen - 541-754-4428) to
            determine the appropriate course of action.

8. Data records from sampling sites previously visited by this Field Team will be checked to
   determine whether any sampling sites must be redone.

9. Complete the Field  Evaluation and Assistance Check List, including a brief summary of
   findings, and ensure that all Team members have read this and signed off before leaving the
   Team.

10. Retain the back copy of  each page of the Field Evaluation and Assistance Check List (color:
   	). Fasten the pages of the check list for each Field Team together with a
   paper clip.

11. Mail the remaining pages of each completed Field Evaluation and Assistance Check List to

              Dr. Esther  Peters
              Tetra Tech, Inc.
              10306 Eaton Place, Suite 340
              Fairfax, VA 22030

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 76 of 87

12. The Tetra Tech QA Officer or authorized designee will review the returned Field Evaluation
   and Assistance Check Lists, note any issues, check off the completion of the evaluation for
   each Field Team, and distribute the remaining pages of each check list as follows:

   Original:         Tetra Tech QA Officer file, Fairfax, VA

   Color:	Tetra Tech Project Manager file, Owings Mills, MD

   Color:	Lakes Survey QA Officer file, Washington, DC

6.2    Laboratory Quality Evaluation and Assistance Visit Plan for the Survey of the
       Nation's Lakes (Lakes Survey)

Evaluators: One or more designated Contractor staff members who are qualified (i.e., have
completed training) in the procedures of the Lakes Survey laboratory operations.

To Evaluate: Laboratories performing chemical, pathogen or algal toxin analysis or
subsampling, sorting, and taxonomic procedures to analyze lake samples.

Purpose:  To identify and correct deficiencies during laboratory operations and procedures.

   1. Tetra Tech project staff will review the Laboratory Evaluation and Assistance Visit Plan
      and Check List with each Evaluator prior to conducting laboratory evaluations.

   2. The Tetra Tech QA Officer or authorized designee will send a copy of the final Plan and
      4-part carbonless copy versions of the final Check List pages, envelopes to return the
      Check Lists, a clipboard, pens, and Lakes Survey QAPP  and Laboratory Methods
       manual to each participating Evaluator.

   3. Schedule of lab visits will be made  by the Evaluator in consultation with the Tetra Tech
      QA Officer and the respective Laboratory Supervisor Staff.  Evaluators should be
       prepared to spend additional time in the laboratory if needed (see below).

   4. Tetra Tech will arrange the schedule of visitation with each  participating Laboratory, and
       notify the Evaluators concerning site locations, where and when to visit the laboratory,
      and how to get there. Ideally, each Laboratory will be evaluated within the  first two
      weeks following initial receipt of samples, so that procedures can be corrected or
      additional training  provided, if needed.

   5. The Evaluator will  view the performance of the laboratory procedures and QC Officer
      through  one complete set of sample processing activities as detailed on the Laboratory
       Evaluation and Assistance Check List.

          a. Scheduling might necessitate starting the evaluation midway on the list of tasks
             for processing a sample,  instead of at the beginning. In that case, the Evaluator
             will view the activities of the laboratory personnel  when a new sample is started

-------
Survey of the Nation's Lakes                                                    Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 77 of 87

              to complete the evaluation of the first activities on the list.

          b.  If laboratory personnel miss or incorrectly perform a procedure, the Evaluator will
              note this on the checklist and immediately point this out so the mistake can be
              corrected on the spot.  The role of the Evaluator is to provide additional training
              and guidance so that the procedures are being performed consistent with the
              Laboratory Methods manual, all data are recorded correctly, and paperwork is
              properly completed at the site.

          c.  When the sample has been  completely processed or analyzed, the Evaluator will
              review the results of the evaluation with laboratory personnel and QC Officer,
              noting positive practices and problems (i.e., weaknesses [might affect data
              quality]; deficiencies [would  adversely affect data quality]).  The Evaluator will
              ensure that the laboratory personnel and QC Officer understand the findings and
              will be able to perform the procedures properly in the future.

          d.  The Evaluator will record responses or concerns, if any, on the Laboratory
              Evaluation and Assistance Check List.

          e.  If the Evaluator's findings indicate that Laboratory staff are not performing the
              procedures correctly, safely, or thoroughly, the Evaluator must continue working
              with these staff members until certain  of their ability to process the sample
              properly so that data quality is not adversely affected.

          f.   If the Evaluator finds major deficiencies in the Laboratory operations, the
              Evaluator must contact one  of the following QA officials:

                  i.  Dr. Esther Peters, Tetra Tech QA Officer (703-385-6000)
                 ii.  Mr. Dennis McCauley, GLEC QA Officer (231-941-2230)
                 iii.  Mr. Otto Gutenson, EPA Lakes Survey Project QA Officer (202-566-1183)

                   The QA official will contact the EPA Project Leader (Carol Peterson - 202-
                   566-1304) or Alternate EPA Project Leader (Steve Paulsen - 541-754-
                   4428) to determine what should  be done.

   6.  Data records from samples previously processed by this Laboratory will be checked to
       determine whether any samples must be redone.

   7.  Complete the Laboratory Evaluation and Assistance Check List, including a brief
       summary of findings, and ensure that the Sorter and QC Officer have read this and
       signed off before leaving the Laboratory.

   8.  Retain the back copy of each page  of the Laboratory  Evaluation and Assistance Check
       List (color:	). Fasten the pages of the check list for each Sorter together
       with a paper clip.

-------
Survey of the Nation's Lakes                                                    Revision No. 1
Quality Assurance Project Plan                                               Date: August 2007
	Page 78 of 87

    9.  Mail the remaining pages of each completed Laboratory Evaluation and Assistance
       Check List to:

                   Dr. Esther Peters
                   Tetra Tech, Inc.
                   10306 Eaton Place, Suite 340
                   Fairfax, VA 22030

    10. The Tetra Tech QA Officer or authorized designee will review the returned Laboratory
       Evaluation and Assistance Check Lists, note any issues, check off the completion of the
       evaluation for each participating Laboratory, and distribute the remaining pages of each
       check list as follows:

    Original:          Tetra Tech QA Officer file, Fairfax, VA

    Color:	Tetra Tech Project Manager file, Owings Mills, MD

    Color:	Lakes Survey QA Officer file, Washington, DC


                             7 .0   DATA ANALYSIS PLAN

The Data Analysis Plan describes the general process used to evaluate the data for the survey.
It outlines the steps taken to assess the condition of the nation's lakes and identify the relative
impact of stressors on this condition. Results from the analysis will be included in the final report
and used in future analysis. This is the first analysis of lakes of this scope and scale, so the data
analysis plan will likely be refined and clarified as the data are analyzed  by EPA and states.

7.1    Data Interpretation Background

       The basic intent of data interpretation is to evaluate the occurrence and distribution of
parameters throughout the population of lakes in the United States within the context of regionally
relevant expectations for least disturbed reference conditions. This is presented using a
cumulative distribution function or similar graphic. For most indicators the analysis will also
categorize  the condition of water as good, fair, or poor. Because of the large-scale and
multijurisdictional nature of this effort, the key issues for data interpretation are unique and
include: the scale of assessment, selecting the best indicators, defining the least impacted
reference conditions, and determining thresholds for judging condition.

       Scale of assessment. This will be the first national report on the ecological condition of
the nation's lakes using comparable methods. EPA selected the sampling locations for  the
survey using a probability based design, and developed rules for selection to meet certain
distribution criteria, while ensuring that the design yielded a set of lakes that would provide for
statistically valid conclusions about the condition of the population of lakes across the nation. A
challenge that this mosaic of waterbodies poses is developing a data analysis plan that allows
us to interpret data and present results at a large, aggregate scale.

-------
Survey of the Nation's Lakes                                                    Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 79 of 87

       Selecting the best indicators. Indicators should be applicable across all reporting units,
and must be able to differentiate a range of conditions. As part of the indicator selection
process, EPA obtained input from state experts at a conference co-sponsored by the Agency
and the National Association of Lakes Managers (the National Conference on Planning a Survey
of the Nation's Lakes, held in April 2005). The Agency also formed a steering committee with state and regional
representatives to develop and refine indicators and sampling methodologies.

       EPA developed screening and evaluation criteria which included indicator applicability
on a national scale, the ability of an indicator to reflect various aspects of ecological condition,
and cost-effectiveness.

       Defining least impacted reference condition. Reference condition data are necessary to
describe expectations for biological conditions in least disturbed settings. EPA has identified and
will sample 132 reference lakes, stratified by nine aggregate ecoregions (based on Omernik
Level III ecoregions) and representing both natural lakes and reservoirs. EPA followed a three-
step a priori screening approach, proposed by Alan Herlihy under an EPA cooperative
agreement, for identifying candidate reference lakes in four of the aggregate ecoregions
(Northern Appalachians, Upper Midwest, Western Mountains, and Xeric). The approach involves
screening for chemical constituents, screening GIS coverages for land use and road density, and
screening for evidence of human disturbance based on evaluation of air photos. For the
remaining five ecoregions, EPA:

   (1) Compiled lists of candidate reference lakes from Regions 3-9 based on best professional
       judgment from the states and/or regions; existing candidate lists were available for EPA
       Regions 1, 2, and 10. Allocation of candidate lakes to be sampled was based on natural
       vs. reservoir class, EPA Region, and national ecoregion, according to the table below.

   (2) Examined candidate reference lakes for disturbances using aerial photographs of a
       100 m buffer around the lake shoreline. Disturbances were scored from 0-3 in seven
       categories (residential, agricultural, recreational, industrial, forestry, water development,
       roads), and the category scores were summed into one "total photo" score for use as an
       overall disturbance index (0 = no noted disturbances); a brief illustration follows this list.

   (3) Gave lakes with low "total photo" scores higher preference for inclusion. Lakes were
       stratified by lake surface area and Omernik Level III ecoregion, and latitude/longitude
       were used to spread the sample spatially. In cases of "ties" (similar total photo scores),
       lakes with agricultural and industrial disturbances were dropped first (as opposed to
       road/recreation type disturbances); after that, "tie" lakes were picked randomly to fill out
       cells in the table. In addition to the primary reference lakes, alternates were listed in
       case of limited access to the primary lakes; when replacing a primary lake with an
       alternate, a lake of similar ecoregion and size was selected.

   (4) Determined the number and types of reference lakes appropriate and feasible for each
       region and selected the reference lakes for inclusion in the 2007 sampling effort (see
       table below).

-------
Survey of the Nation's Lakes
Quality Assurance Project Plan
   Revision No. 1
Date: August 2007
    Page 80 of 87
              Allocation of Reference Lakes by EPA Region and Ecoregion
              (L = Natural Lake, R = Reservoir; 117 Total Reference Lakes*)

 EPA Region   NAP    SAP    CPL    UMW    TPL    NPL    SPL     XER    WMT    TOTAL
 1            20L     --     --     --     --     --     --      --     --       20
 2             2L     --     --     --     --     --     --      --     --        2
 3             --    10R     5R     --     --     --     --      --     --       15
 4             --     --     5L     --     --     --     --      --     --        5
 5             --     --     --    15L    10L     --     --      --     --       25
 6             --     --     3R     --     --     --     6R      1R     --       10
 7             --     5R     --     --    10R     --    10R      --     --       25
 8             --     --     --     --     --     --     --      --     --        0
 9             --     --     --     --     --     --     --   2L/2R     --        4
 10            --     --     --     --     --     --     --      1L    10L       11
 Total Lake    22      0      5     15     10      0      0       3     10       65
 Total Res.     0     15      8      0     10      0     16       3      0       52

*EPA Region 1 agreed to sample 15 additional lakes, bringing the total to 132 reference lakes.

       Determining thresholds for judging condition. This reference site approach is then used
to set expectations and benchmarks for interpreting the data on lake condition. The range of
conditions found in the reference sites for an ecoregion describes a distribution of those
biological or stressor values expected for least disturbed condition. The benchmarks used to
define distinct condition classes  (e.g., good, fair, poor/ least disturbed, intermediate, most
disturbed) are drawn from this reference distribution. Our approach is to examine the range of
values for a biological  or stressor indicator in all of the  reference sites  in a region, and to use the
5th percentile of the reference distribution for that indicator to separate the most disturbed of all
sites from moderately  disturbed  sites.  Using the 5th percentile means that lakes in the most
disturbed category are worse than 95% of the best sites used to define reference condition.
Similarly, the 25th percentile of the reference distribution can be used  to distinguish between
moderately disturbed sites and those in least disturbed condition. This means that lakes
reported as least disturbed are as good as 75% of the  sites used to define reference condition.
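
       The following minimal sketch (Python; for illustration only) shows how these percentile
benchmarks might be derived and applied to individual lakes. It assumes an indicator for which
higher values reflect better condition; the reference-site scores are hypothetical.

    import numpy as np

    def condition_benchmarks(reference_values):
        """5th and 25th percentiles of the reference-site distribution, used as
        benchmarks separating most disturbed, moderately disturbed, and least
        disturbed condition classes."""
        ref = np.asarray(reference_values, dtype=float)
        return np.percentile(ref, 5), np.percentile(ref, 25)

    def classify_lake(value, p5, p25):
        """Assign a condition class to one lake's indicator score."""
        if value < p5:
            return "most disturbed"
        elif value < p25:
            return "moderately disturbed"
        return "least disturbed"

    # Hypothetical index scores for reference lakes in one ecoregion
    reference_scores = [62, 70, 75, 80, 81, 84, 85, 88, 90, 93]
    p5, p25 = condition_benchmarks(reference_scores)
    print(classify_lake(68, p5, p25))   # moderately disturbed
    print(classify_lake(91, p5, p25))   # least disturbed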

7.2    Datasets Available for the Report

       Methods used in the survey stem from discussions, input, and feedback provided by the
Survey of the Nation's Lakes Steering Committee. Many of the methods are an outgrowth of the
testing and refinement of existing and newly developed methods and of the logistical foundation
constructed during the implementation of the Environmental Monitoring and Assessment
Program (EMAP) studies from 1991 through 1994, from a New England pilot study conducted in
2005, from focused pilot studies for methods development, and from various State water quality
agency methods currently in use.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 81 of 87

       The survey will use indicators to assess trophic status, ecological integrity, and the
recreational  value of lakes:

       Trophic status. Lakes are typically classified according to their trophic state. Three
variables, chlorophyll, Secchi disk depth,  and total phosphorus, are most often used to estimate
biomass and define trophic state of a particular lake. Other variables will be measured in
conjunction with the trophic state variables to supplement and enhance understanding of lake
processes that affect primary productivity.

       Ecological integrity. Ecological integrity describes the ecological condition of a lake
based on different assemblages of the aquatic community and their physical habitat. The
indicators include plankton (phytoplankton and zooplankton), benthic macroinvertebrates,
diatoms, and the physical habitat of the shoreline and littoral zone.

       Recreational value. Recreational indicators address the ability of the population to
support recreational uses such as swimming, fishing, and boating. The protection of these uses
is one of the requirements of the Clean Water Act under Section 305(b). The extent of a fecal
indicator (enterococci), algal toxins (microcystin), and mercury will serve as the primary
indicators of recreational value.

7.3    Benthic Macroinvertebrate and  Zooplankton Assemblages

       Benthic macroinvertebrate and zooplankton assemblages will be analyzed using both
multimetric indices (MMI) and observed/expected (O/E) predictive models. The MMI approach
summarizes various assemblage attributes, such as composition, tolerance to disturbance, and
trophic and habitat preferences, as individual metrics or measures of the biological community.
Candidate metrics are evaluated for aspects of performance, and a subset of the best-performing
metrics is combined into an index known as a Macroinvertebrate Index of Biotic Condition. This
index is then used to rank the condition of the resource.

       The predictive model, or O/E, approach estimates the expected taxonomic composition of
an assemblage in the absence of human stressors, using a set of least-disturbed sites and other
variables related to natural gradients, such as elevation, lake size, latitude, and longitude. The
resulting models are then used to estimate the expected taxonomic composition (taxa richness) at
each site sampled. The number of expected taxa actually observed at a site is compared to the
number of expected taxa as an observed/expected (O/E) ratio or index. Departures from a ratio of
one indicate that the taxonomic composition in the sample differs from that expected under least
disturbed conditions.
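
       The sketch below (Python; hypothetical taxa and capture probabilities) illustrates one
common way an O/E score is formed once a predictive model has produced the list of expected
taxa and their modeled probabilities of capture for a site; the model-building step itself is not
shown.

    def observed_expected_ratio(capture_probs, observed_taxa):
        """One common O/E formulation: E is the sum of modeled capture probabilities
        for the expected taxa, and O is how many of those taxa were actually
        collected at the site."""
        expected = sum(capture_probs.values())
        observed = sum(1 for taxon in capture_probs if taxon in observed_taxa)
        return observed / expected if expected else float("nan")

    # Hypothetical expected taxa (capture probability >= 0.5) from a predictive model
    probs = {"Baetis": 0.9, "Chironomus": 0.8, "Hyalella": 0.6, "Caenis": 0.5}
    sample = {"Baetis", "Chironomus", "Physa"}   # Physa was collected but not expected
    print(round(observed_expected_ratio(probs, sample), 2))   # 0.71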

7.4    Phytoplankton Assemblages

       Phytoplankton will be collected as an integrated sample in open water. Both abundance
and biovolume on a species-specific basis will be determined. The raw data will be used in
multiple data analysis techniques, metrics, and indices, such as Centrales/Pennales ratios,
Palmer's WQ Index, and other diversity indices.
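
       For illustration, the sketch below (Python; hypothetical counts) computes two of the kinds
of metrics named above: a Centrales/Pennales (centric-to-pennate) ratio and a Shannon diversity
index, which is one common diversity index.

    import math

    def shannon_diversity(counts):
        """Shannon diversity (H') from species abundance counts."""
        total = sum(counts.values())
        return -sum((n / total) * math.log(n / total)
                    for n in counts.values() if n > 0)

    def centrales_pennales_ratio(counts, centrales_taxa):
        """Ratio of centric (Centrales) to pennate (Pennales) diatom abundance."""
        centric = sum(n for t, n in counts.items() if t in centrales_taxa)
        pennate = sum(n for t, n in counts.items() if t not in centrales_taxa)
        return centric / pennate if pennate else float("inf")

    # Hypothetical species counts from an integrated open-water sample
    counts = {"Cyclotella": 120, "Aulacoseira": 80, "Fragilaria": 60, "Asterionella": 40}
    print(round(shannon_diversity(counts), 2))                                # about 1.31
    print(centrales_pennales_ratio(counts, {"Cyclotella", "Aulacoseira"}))    # 2.0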

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 82 of 87

7.5    Diatom Data Analysis

       Sediment diatoms will be sampled in the deepest part of the lake (up to 50 meters), or at
the midpoint of a reservoir, using a sediment core sampling device. Diatoms will be analyzed and
identified in the sediment surface fraction and in a deep fraction (i.e., 35 to 45 cm). Comparison of
these fractions provides an indication of both current and historical lake condition with respect to
stressors such as nutrients (phosphorus) and sediment loadings. Comparison of the diatoms
found in deep and surface fractions can also provide insight into the structure and composition of
algal communities under pristine conditions, as well as information on the temporal and spatial
trends of eutrophication.

       Mercury levels will also be determined in the sediment core samples to compare to
existing national mercury distribution databases.

7.6    Enterococci Data Analysis

       The presence of certain levels of enterococci is associated with pathogenic bacterial
contamination of the resource. A single enterococci water sample will be collected at each lake,
then filtered, processed, and analyzed using quantitative polymerase chain reaction (qPCR).
Bacterial occurrence and distribution will be reported. Data interpretation will be enhanced by
comparison to USEPA qPCR pilot studies as well as to thresholds recommended from the Great
Lakes qPCR studies. In addition, some states are conducting parallel studies with better-known
culturing techniques that have a vast historical database against which to compare.

7.7    Water Chemistry, Chlorophyll-a and Secchi Depth

       A wide array of water chemistry parameters will be measured, such as DO, pH, total N,
total P, clarity, TOC/DOC, color, ANC, and primary productivity. Values for these parameters
and their distributions will be reported. Water chemistry analysis is critical for interpreting the
biological indicators. Chlorophyll-a, Secchi depth, and nutrient measurements will be used to
determine trophic level indices, such as the Carlson Index. Temperature profiles will be used to
determine the degree of lake stratification.
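
       As an example of a trophic level index, the sketch below (Python) evaluates the three
Carlson (1977) trophic state index equations for a hypothetical lake; Secchi depth is in meters,
and chlorophyll-a and total phosphorus are in micrograms per liter.

    import math

    def tsi_secchi(secchi_m):
        """Carlson (1977) trophic state index from Secchi depth (m)."""
        return 60.0 - 14.41 * math.log(secchi_m)

    def tsi_chlorophyll(chl_ug_l):
        """Carlson TSI from chlorophyll-a (ug/L)."""
        return 9.81 * math.log(chl_ug_l) + 30.6

    def tsi_total_p(tp_ug_l):
        """Carlson TSI from total phosphorus (ug/L)."""
        return 14.42 * math.log(tp_ug_l) + 4.15

    # Hypothetical lake: Secchi 2.5 m, chlorophyll-a 8 ug/L, total P 24 ug/L
    print(round(tsi_secchi(2.5), 1), round(tsi_chlorophyll(8.0), 1), round(tsi_total_p(24.0), 1))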

7.8    Algal Toxin Data Analysis

       Cyanobacterial (blue-green algal) blooms are common midsummer to late-fall events
that occur in many lakes and reservoirs throughout the United States.  Algal toxin production
has been identified as a significant potential human health problem that has been associated
with many of these bloom events. However, little is known about the general occurrence of algal
toxins in the pelagic zones of these water bodies, where extensive blooms are less likely to
occur than in near-shore areas.

       The USGS Kansas Water Science Center will analyze total (whole water) concentrations
of microcystins in lakes and reservoirs throughout the United States using a standardized
immunoassay test. The USGS will also perform quantitative LC/tandem MS

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 83 of 87

analysis of 2% of the samples for microcystin. These data will be used to verify the
immunoassay results and support the scientific integrity of the data.

       The USGS will analyze and interpret the data for microcystin occurrence and
concentration, and with respect to other environmental data that are collected as part of the lake
assessment (e.g., nutrients, phytoplankton, chlorophyll, turbidity, specific conductance, pH).
Data interpretation by the USGS will be reviewed by the EPA and accepted through a letter of
concurrence.

7.9    Physical Habitat Assessment

7.9.1   Lakeshore and Littoral Physical Habitat Observations and Metric Definitions

Shoreline human disturbances
       The presence or absence of 12 predefined types of human land use or disturbance was
recorded for each of the 10 stations.  In 1993, additional human disturbances were separately
identified outside of but adjacent to the plots.  For each of the 12  disturbance categories, we
calculated the  proportion of lakeshore stations where the disturbance was observed on each
lake. In 1993, the proportions were weighted according to the proximity of the disturbance
before computing the whole-lake metrics. Weightings were 1.0 for disturbance observations
within the riparian sample plots and 0.33 for those behind or adjacent to the plots. Two types of
summary metrics were calculated by synthesizing all the  human disturbance observations. The
first, a measure of the extent of shoreline disturbance, was calculated as the proportion of
stations at which one or more human disturbances were observed.  The second, a measure of
disturbance intensity, was calculated as the mean number of human disturbance types
observed at each of the 10 shoreline stations.
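
       A minimal sketch of these two whole-lake calculations follows (Python; the station
observations are hypothetical, and the 1.0/0.33 proximity weights are those described above).

    def disturbance_metrics(station_observations, weights=(1.0, 0.33)):
        """Whole-lake shoreline disturbance metrics from per-station observations.

        station_observations: one dict per station mapping a disturbance category
            to 'in_plot' or 'adjacent' (categories not observed are omitted).
        Returns (extent, intensity): the proportion of stations with any disturbance,
        and the mean (proximity-weighted) number of disturbance types per station."""
        in_plot_w, adjacent_w = weights
        n = len(station_observations)
        station_scores = []
        for obs in station_observations:
            score = sum(in_plot_w if loc == "in_plot" else adjacent_w
                        for loc in obs.values())
            station_scores.append(score)
        extent = sum(1 for s in station_scores if s > 0) / n
        intensity = sum(station_scores) / n
        return extent, intensity

    # Hypothetical three-station example (a full lake visit has 10 stations)
    stations = [{"buildings": "in_plot", "lawns": "adjacent"}, {"docks": "in_plot"}, {}]
    print(disturbance_metrics(stations))   # approximately (0.67, 0.78)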

Riparian vegetation
       Riparian vegetation type and areal cover were visually estimated in three layers:  the
canopy (>5 m high), mid-layer (0.5-5 m high) and ground cover (<0.5 m high). Coniferous and
deciduous vegetation was distinguished in the canopy and mid-layer; woody and herbaceous
vegetation was distinguished in the mid-layer and ground cover.  In 1991 and 1992, cover was
estimated  in four classes: absent (0),  sparse (0-10%), moderate (10-40%) and heavy (>40%).
In 1993, another cover class was added to improve precision and interpretation, redefining
"heavy" as 40-75% and "very heavy" as >75%.

       Simple whole-lake metrics were calculated by assigning the cover class mid-point value
to each station's observations and then averaging those cover  values across all 10 stations.
Summary  metrics were calculated for each lake by summing the areal cover or tallying the
presence of defined combinations of riparian vegetation layers  or vegetation types.
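
       The cover-class midpoint calculation can be sketched as follows (Python); the midpoint
values assigned to each class are illustrative assumptions based on the class limits given above.

    # Assumed midpoints (fractional areal cover) for the five 1993 cover classes
    COVER_MIDPOINTS = {
        "absent": 0.0,        # 0
        "sparse": 0.05,       # 0-10%
        "moderate": 0.25,     # 10-40%
        "heavy": 0.575,       # 40-75%
        "very heavy": 0.875,  # >75%
    }

    def whole_lake_cover(station_classes):
        """Average the cover-class midpoints across shoreline stations to obtain a
        whole-lake areal cover estimate for one vegetation layer or type."""
        values = [COVER_MIDPOINTS[c] for c in station_classes]
        return sum(values) / len(values)

    # Hypothetical canopy cover classes recorded at 10 stations
    canopy = ["moderate", "heavy", "sparse", "moderate", "absent",
              "heavy", "very heavy", "moderate", "sparse", "moderate"]
    print(whole_lake_cover(canopy))   # 0.3125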

Aquatic macrophytes
       Using the same cover classes  as for riparian vegetation, areal covers of nearshore
emergent,  floating, and submerged aquatic macrophytes  were  each estimated visually. In 1993,
the same cover class redefinition was  applied in aquatic macrophytes as was used for riparian

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 84 of 87

vegetation. Simple and summary aquatic macrophyte metrics were calculated for each lake in
the same fashion as for riparian vegetation.

Fish concealment features
       In 1991 and 1992, the presence or absence of eight specified types of fish concealment
features was recorded within each 10-m x 15-m littoral plot.  In 1993, the areal cover of each
type also was assigned to one of three cover classes (0, 0-10%, >10%).  Simple metrics for
each type of fish concealment feature were calculated as the proportion of littoral stations with
the particular concealment feature present.  Summary metrics were calculated as the mean
number of concealment types per station. In 1993, we used the areal cover class designation to
down-weight very sparse cover in the calculation of both simple and summary fish cover metrics
(i.e., the areal cover designations in the previous paragraph were respectively assigned values
of 0, 0.2, and 1.0).

Shoreline and littoral bottom substrate
       Visual estimates of areal  cover of seven defined substrate types were made separately
for the 1-m shoreline band and the bottom within the 10-m x 15-m littoral plot. Cover classes
were the same as for riparian vegetation, with the same modification to include an additional
higher cover class in 1993. In cases where the bottom substrate could not be observed directly,
observers used a clear plastic viewing bucket, a 3-m plastic (PVC) sounding tube, or an anchor
to examine or obtain samples of bottom sediments.

       Simple metrics describing the lakewide mean cover of littoral and shoreline substrate in
each size category were obtained by averaging the cover estimates at each station, using the
cover class midpoint approach described for riparian vegetation. Three substrate summary
metrics were calculated for both  shoreline and  littoral  bottom substrates.  First was the mean
cover of the dominant substrate type.  Second  and third were measures of the central tendency
and variety of substrate size. Because the size categories are approximately logarithmic, we
calculated a cover-weighted mean substrate size class and its standard deviation: we ranked
the substrate classes by size from 1 to 6, weighted each class by its lakewide mean cover, and
then computed the weighted mean size class and its variance across size classes.
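
       A sketch of the cover-weighted size class calculation follows (Python); the ranks assigned
to each substrate type and the lakewide mean covers are illustrative assumptions.

    import math

    # Assumed size-class ranks (1 = finest, 6 = coarsest)
    SIZE_RANKS = {"silt/clay/muck": 1, "sand": 2, "gravel": 3,
                  "cobble": 4, "boulder": 5, "bedrock": 6}

    def weighted_substrate_size(mean_covers):
        """Cover-weighted mean substrate size class and its standard deviation.

        mean_covers: substrate type -> lakewide mean areal cover."""
        total = sum(mean_covers.values())
        weights = {k: v / total for k, v in mean_covers.items()}
        mean_rank = sum(SIZE_RANKS[k] * w for k, w in weights.items())
        var = sum(w * (SIZE_RANKS[k] - mean_rank) ** 2 for k, w in weights.items())
        return mean_rank, math.sqrt(var)

    # Hypothetical lakewide mean covers
    covers = {"silt/clay/muck": 0.40, "sand": 0.30, "gravel": 0.20, "cobble": 0.10}
    print(weighted_substrate_size(covers))   # (2.0, 1.0)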

Littoral depth, bank characteristics and other observations
       Lake depth 10 m offshore was measured using SONAR, a sounding line, or a sounding
tube.  Field crews estimated the  bank angle and, based on  high and low water marks, the
vertical and lateral range in lake  water level fluctuation. They also noted the presence of water
surface scums, algal mats, oil slicks, and sediment color and odor.  In  1993, bank angle
estimates were specifically confined to the 1-m-wide shore zone; limits for those observations
were ambiguous the previous two years.  Whole-lake metrics for littoral depth and water level
fluctuations were calculated as arithmetic averages and standard deviations.  For bank angle
classes and qualitative observations of water surface condition, sediment color, and odor, we
simply calculated the proportion  of stations with presence of the described features.

Composite habitat quality indices
       A riparian bird habitat quality index was calculated as the mean of four physical habitat
metrics, each scaled as a continuous variable from 0 (low quality) to 1  (high quality).  Included

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 85 of 87

as subcomponents were summary metrics characterizing the extent and intensity of riparian
human disturbance, the extent of riparian canopy presence, and the complexity of riparian
vegetation structure.  The index integrates 22 separate simple metrics, because the 4
component variables are themselves summaries. Human disturbances were incorporated as
negative habitat quality factors; the rest were treated as positive factors. A fish habitat quality
index was calculated in the same manner, but it also included the abundance of fish
concealment features. The fish habitat index contains information from 30 separate simple
metrics, because the 5 subcomponent metrics are summary variables.
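
       The averaging step can be sketched as follows (Python); the subcomponent names, their
values, and the simple reversal used to treat disturbance as a negative factor are illustrative
assumptions, since each subcomponent is itself a summary metric scaled from 0 to 1.

    def habitat_quality_index(metrics, negative=("disturbance_extent",
                                                 "disturbance_intensity")):
        """Mean of physical habitat metrics scaled from 0 (low quality) to 1 (high).

        Metrics named in `negative` are human-disturbance measures and are
        reversed (1 - value) before averaging so that higher always means better."""
        scores = [(1.0 - v) if name in negative else v
                  for name, v in metrics.items()]
        return sum(scores) / len(scores)

    # Hypothetical subcomponent values for one lake, each already scaled 0-1
    riparian_metrics = {
        "disturbance_extent": 0.30,
        "disturbance_intensity": 0.20,
        "canopy_presence": 0.80,
        "vegetation_complexity": 0.60,
    }
    print(habitat_quality_index(riparian_metrics))   # 0.725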

7.9.2     Human Disturbances in Riparian/Littoral

•   12 Simple metrics describe presence (proportion of shore) of: buildings, commercial land
    use, lawns, developed parkland, roads/railroads, docks/boats, trash/landfill,
    seawalls/revetments, rowcrop agriculture, pasture, orchards, and other human activities.
•   2 Summary metrics describe mean number of disturbance types observed per station and
    proportion of shoreline with human disturbance of any type.

7.9.3     Riparian Vegetation Structure

•  10 Simple metrics describe areal cover of trees >0.3 m DBH and <0.3 m DBH in canopy
   layer; woody and herbaceous vegetation in mid-layer;  barren ground and woody,
   herbaceous, and inundated vegetation in ground cover layer.
•  6 Summary metrics describe aggregate covers in canopy + mid-layer, woody vegetation in
   canopy + mid-layer, and canopy + mid-layer + ground  cover layers; presence of vegetation in
   canopy layer; presence in  both canopy and mid-layer.

7.9.4     Littoral Aquatic Macrophytes

•  Simple metrics describe cover of emergent, floating, and submergent macrophytes; and
   presence of macrophytes  lakeward from the shoreline observation plot.
•  2 Summary metrics describe mean combined cover and  proportion of shoreline with
   macrophytes present.

7.9.5     Shoreline and Littoral Substrate Type and Size

•   14 Simple metrics separately describing shoreline and littoral substrate: areal cover
    estimates of bedrock (>4000 mm), boulder (250-4000 mm),  cobble (64-250 mm), gravel (2-
    64 mm), sand (0.06-2.0 mm), soil or silt/clay/muck (<0.06 mm), and vegetation or woody
    debris (if concealing substrate).
•   6 Summary metrics (3 for shore and 3 for littoral bottom) estimating cover-weighted mean
    size class, size class variance, and the areal cover of the dominant substrate type.

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 86 of 87

7.9.6      Littoral Fish Cover

•   8 Simple metrics estimating proportion of shore zone with various fish cover types:  boulder,
    rock ledge, brush, inundated live trees, overhanging vegetation, snags >0.3 m diameter,
    aquatic macrophytes, and human structures (e.g., docks, enhancement structures).
•   Summary metrics describing the mean number of vegetation-related, rock-related, non-
    anthropogenic, and all fish cover types.

7.9.7      Littoral Depth, Banks, and Level Fluctuations

•   7 Simple metrics describing mean depth and depth variation among sampling stations, bank
    angle, and apparent height and extent of vertical and horizontal lake water level fluctuations.
•   1 Summary metric describing spatial variation of station depths on the lake.

7.9.8      Miscellaneous Habitat Variables

•   7 Simple metrics describing proportion of sampling sites with sediment odor (petrol, H2S),
    sediment colors (black, brown, other), and water surface films (oil, algal mat, other).
•   1 Summary metric describing proportion of sampling sites with surface film of any type.
                                 8.0    REFERENCES

American Public Health Association.  2006.  Standard Methods for the Examination of Water
       and Wastewater. 21st Edition.  American Public Health Association, Washington, D.C.

Bain, M.B., J.T. Finn, and H.E. Booke. 1985. Quantifying stream substrate for habitat analysis
       studies. North American Journal of Fisheries Management 5:499-500.

Baker, J.R. and G.D. Merritt, 1990. Environmental Monitoring and Assessment Program:
       Guidelines for Preparing Logistics Plans. EPA 600/4-91-001. U.S. Environmental
       Protection Agency. Las Vegas, Nevada.

Carlson, R.E. 1977.  A trophic state index for lakes.  Limnology and Oceanography 22(2):361-
       369.

Chaloud, D.C., J.M.  Nicholson, B.P. Baldigo, C.A. Hagley, and D.W. Sutton.  1989. Handbook
       of Methods for Acid Deposition Studies: Field Methods for Surface Water Chemistry.
       EPA 600/4-89/020. U.S. Environmental Protection Agency, Washington, D.C.

Charles, D. F., C. Knowles, and R.S. Davis. 2003. Protocols for the analysis of algal samples
       collected as part of the U.S. Geological Survey National Water-Quality Assessment
       program. Patrick Center For Environmental Research, The Academy Of Natural
       Sciences. Report No. 02-06

-------
Survey of the Nation's Lakes                                                   Revision No. 1
Quality Assurance Project Plan                                              Date: August 2007
	Page 87 of 87

Frissell, C.A., W.J. Liss, C.E. Warren, and M.D. Hurley.  1986.  A hierarchical framework for
       stream habitat classification: viewing streams in a watershed context. Environ. Mgmt.
       10(2):  199-214.

Garner, F.C., M.A. Stapanian, and K.E. Fitzgerald. 1991. Finding causes of outliers in
       multivariate environmental data. Journal of Chemometrics. 5: 241-248.

Glew, J.R., Smol, J.P. and Last, W.M. 2001. Sediment core collection and extrusion, pp. 73-
       105. In: Last, W.M. and Smol, J.P. [Editors]. Tracking Environmental Change Using Lake
       Sediments. Vol 1: Basin Analysis, Coring, and Chronological Techniques. Kluwer
       Academic Publishers, Dordrecht.

Hawkins, C. P., R. H. Morris, J. N. Hogue, and J. W. Feminella. 2000. Development and
       evaluation of predictive models for measuring the biological  integrity of streams.
       Ecological Applications 10:1456-1477.

Heinz Center. 2002.  The State of the Nation's Ecosystems. The Cambridge University Press.

Hillman, D.C.,  S.H. Pia, and S.J.  Simon. 1987. National Surface Water Survey:  Stream Survey
       (Pilot, Middle Atlantic Phase I, Southeast Screening, and Episode Pilot) Analytical
       Methods Manual. EPA 600/8-87-005.  U.S. Environmental Protection Agency, Las
       Vegas, Nevada.

Hunt, D.T.E., and A.L. Wilson. 1986. The chemical analysis of water: general principles and
       techniques. 2nd edition. Royal Society of Chemistry, London, England.

Kaufmann, P.R. (ed.).  1993.  Physical Habitat. IN:  R.M. Hughes (ed.) Stream Indicator and
       Design Workshop. EPA600/R-93/138. U.S. Environmental  Protection Agency,
       Corvallis, Oregon.

Kaufmann, P.R., A.T. Herlihy, J.W. Elwood, M.E. Mitch, W.S. Overton, M.J. Sale, J.J. Messer,
       K.A. Cougan, D.V. Peck, K.H. Reckhow, A.J. Kinney, S.J. Christie, D.D. Brown, C.A.
       Hagley, and H.I. Jager. 1988.  Chemical Characteristics of Streams in the Mid-Atlantic
       and Southeastern United States. Volume I: Population Descriptions and Physico-
       Chemical Relationships.  EPA 600/3-88/021 a. U.S. Environmental Protection Agency,
       Washington,  D.C.

Kirchmer,  C.J. 1983. Quality control in water analysis. Environmental Science &
       Technology. 17: 174A-181A.

Klemm, D.J., P.A. Lewis, F. Fulk, and J.M. Lazorchak.  1990. Macroinvertebrate Field and
       Laboratory Methods for Evaluating the Biological Integrity of Surface Waters. EPA
       600/4-90/030. U.S. Environmental Protection Agency, Cincinnati, Ohio.

Lemmon,  P.E. 1957. A new instrument for measuring forest overstory density.  J. For. 55(9):
       667-669.

-------
Survey of the Nation's Lakes                                                  Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
	Page 88 of 87

Meglen, R.R. 1985. A quality control protocol for the analytical laboratory. Pp. 250-270
       IN: J.J.  Breen and P.E. Robinson (eds).  Environmental Applications of
       Chemometrics. ACS Symposium Series 292. American Chemical Society,
       Washington, D.C.

Peck, D. V., and R. C. Metcalf. 1991.  Dilute, neutral pH standard of known conductivity and
       acid neutralizing capacity.  Analyst 116:221-231.

Metcalf, R. C.,  and D. V. Peck. 1993.  A dilute standard for pH, conductivity, and acid
       neutralizing capacity measurement. Journal of Freshwater Ecology 8:67-72.

Mulvey, M., L. Cato, and R. Hafele.  1992. Oregon Nonpoint Source Monitoring Protocols
       Stream Bioassessment Field Manual: For Macroinvertebrates and Habitat Assessment.
       Oregon Department of Environmental Quality Laboratory Biomonitoring Section.
       Portland, Oregon. 40pp.

NAPA. 2002. Environment.gov. National Academy of Public Administration.  ISBN: 1-57744-
       083-8. 219 pages.

NRC. 2000. Ecological Indicators for the Nation. National Research Council.

Oblinger Childress, C.J., Foreman, W.T., Connor, B.F., and T.J. Maloney. 1999. New reporting
       procedures based on long-term method  detection levels and some considerations for
       interpretations of water-quality data provided by the U.S. Geological Survey National
       Water Quality Laboratory. U.S.G.S Open-File Report 99-193, Reston, Virginia.

Overton, W.S., White, D., and Stevens, D.L., Jr. 1991. Design report for EMAP,
       the Environmental Monitoring and Assessment Program. EPA/600/3-
       91/053, U.S. Environmental Protection Agency, Washington, D.C.

Paulsen, S.G.,  D.P. Larsen, P.R. Kaufmann, T.R. Whittier, J.R. Baker,  D. Peck, J.
       McGue, R.M. Hughes, D. McMullen, D. Stevens, J.L. Stoddard, J. Lazorchak, W.
       Kinney, A.R. Selle, and R.  Hjort. 1991. EMAP - surface waters monitoring and
       research strategy, fiscal year 1991. EPA-600-3-91-002. U.S. Environmental
       Protection Agency, Office of Research and Development, Washington,  D.C. and
       Environmental Research Laboratory, Corvallis, Oregon.

Peck, D.V., J.M. Lazorchak, and D.J. Klemm (editors). 2003. Unpublished draft.
       Environmental Monitoring and Assessment Program - Surface  Waters: Western Pilot
       Study Field Operations Manual for Wadeable Streams. EPA/xxx/x-xx/xxxx. U.S.
       Environmental Protection Agency, Washington, D.C.

Plafkin, J.L., M.T. Barbour, K.D. Porter, S.K. Gross, and R.M. Hughes. 1989. Rapid
       Bioassessment Protocols for Use in Streams and Rivers:  Benthic Macroinvertebrates
       and Fish. EPA 440/4-89/001. U.S. Environmental Protection Agency, Office of Water,
       Washington, D.C.

-------
Survey of the Nation's Lakes                                                  Revision No. 1
Quality Assurance Project Plan                                             Date: August 2007
	Page 89 of 87
Platts, W.S., W.F. Megahan, and G.W. Minshall.  1983.  Methods for Evaluating Stream,
       Riparian, and Biotic Conditions.  USDA Forest Service, Gen. Tech. Rep. INT-183. 71pp.

Robison, E.G. and R.L. Beschta. 1990. Characteristics of coarse woody debris for several
       coastal streams of southeast Alaska, USA. Canadian Journal of Fisheries and Aquatic
       Sciences 47(9): 1684-1693.

Robison, E.G., and P.R. Kaufmann. (In preparation). Evaluating and improving an objective
       rapid technique for defining pools in small wadeable streams.

Selle, A.R., D.P. Larsen, S.G. Paulsen.  1991.  GIS procedure to create a national lakes frame
       for environmental monitoring.  In: Proceedings of the 11th Annual Environmental
       Systems Research Institute User Conference;  1991  May 20-24; Palm Springs, CA.
       Corvallis, OR:  U.S. Environmental Protection Agency,  Environmental Research
       Laboratory.

Skougstad, M.W., M.J. Fishman, L.C.  Friedman, D.E.  Erdman, and S.S. Duncan (eds.).  1979.
       Method I-4600-78, Automated  Phosphomolybdate Colorimetric Method for Total
       Phosphorus. Methods for Determination of Inorganic Substances in Water and Fluvial
       Sediments:  Techniques of Water-Resources Investigations of the United States
       Geological Survey.  Book 5, Chapter A1.  U.S. Government Printing Office, Washington,
       D.C.

Smith,  F., S. Kulkarni, L. E. Myers, and M. J. Messner. 1988. Evaluating and presenting
       quality assurance data. Pages 157-68 in L.H. Keith,  ed. ACS Professional Reference
       Book. Principles of Environmental Sampling. American Chemical Society,
       Washington, D.C.

Stanley, T.W., and S.S. Verner. 1986. The U.S. Environmental Protection Agency's
       quality assurance program, pp. 12-19 IN: J.K. Taylor and T.W. Stanley (eds.). Quality
       Assurance for Environmental Measurements. ASTM STP 867, American Society for
       Testing and Materials, Philadelphia, Pennsylvania.

Stapanian, M.A.,  F.C. Garner, K.E. Fitzgerald, G.T. Flatman, and J.M. Nocerino. 1993.
       Finding suspected causes of measurement error in multivariate environmental data.
       Journal of Chemometrics. 7: 165-176.

Stevens, D. L., Jr., 1994. Implementation of a National Monitoring Program. Journal
       Environ. Management 42:1-29.

U.S. EPA. 1987. Handbook of Methods for Acid Deposition Studies:  Laboratory Analyses for
       Surface Water Chemistry. EPA/600/4-87/026. U.S. Environmental Protection Agency,
       Office of Research and Development, Washington, D.C.

-------
Survey of the Nation's Lakes                                                 Revision No. 1
Quality Assurance Project Plan                                            Date: August 2007
	Page 90 of 87

U.S. EPA. 2002. Guidance for Quality Assurance Project Plans. EPA/240/R-02/009. U.S.
       Environmental Protection Agency, Office of Environmental Information, Washington, D.C.

U.S. EPA. 2003. Draft Report on the Environment. ORD and OEI. EPA-260-R-02-006.

U.S. EPA. 2004. Revised Assessment of Detection and Quantitation Approaches. EPA-821-B-
       04-005.  U.S. Environmental Protection Agency, Office of Science and Technology,
       Washington, D.C.

U.S. EPA. 2006.  Guidance on Systematic Planning Using the Data Quality Objectives Process.
       EPA/240/B-06/001.  U.S. Environmental Protection Agency, Office of Environmental
       Information, Washington, D.C.

USEPA.  2007.  Survey of the Nation's Lakes.  Field Operations Manual. EPA 841-B-07-004.
       U.S. Environmental  Protection Agency, Washington, DC.

U.S. GAO. 2000. Water Quality. GAO/RCED-00-54.

Washington, H.G. 1984. Diversity, biotic, and similarity indices. Water Research 18(6): 653-694.

-------
Survey of the Nation's Lakes                        Revision No. 1
Quality Assurance Project Plan                     Date: August 2007
	Page 91 of 87
             APPENDIX A

     NATIONAL LAKES SURVEY FIELD
      EVALUATION AND ASSISTANCE
          SITE VISIT CHECKLIST

-------
National Lakes Survey             Draft Field Evaluation and Assistance Site Visit Checklist               Jun. 12, 08
       NATIONAL LAKES SURVEY FIELD EVALUATION AND ASSISTANCE
                                SITE VISIT CHECKLIST
Evaluation Date(s):

Evaluation Team Member(s):
    Name                            Organization                    Phone



Lake ID:                Lake Name:                        Location:

Field Team ID:
Field Team Members:
    Name                            Organization                    Phone



Other Observers Present During Evaluation:
    Name                            Organization                    Phone



-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
BASE SITE ACTIVITIES

Global Positioning System Receiver
Were the batteries checked?    Y    N    N/A
Was a re-initialization check required?    Y    N    N/A
Were other tests or checks required as recommended in operating manual?    Y    N    N/A

Multi-Probe
Was the electrode stored properly?    Y    N    N/A
Were the meter red lines, zeroes, readings steady?    Y    N    N/A
Membrane inspection: temperature, DO and pH checks conducted correctly
before using?    Y    N    N/A
Was the DO calibration done at the lake (in accordance with 3.1.2)?    Y    N    N/A
Was the multi-probe calibrated for pH and conductivity at the base location
or before traveling to the site (whichever is appropriate for the unit)?    Y    N    N/A
Was pH and conductivity (if measured) checked for performance against a
QCCS solution (at the beginning of the week whenever sampling is occurring,
as described in the field manual, minimum of 2x, before first and after
last lake sampled)?    Y    N    N/A

Containers/Labels
Were labels affixed to containers when required?    Y    N    N/A
Were labels completed where feasible and appropriate (before or after
collection) using a permanent marker (pencil for benthos inside jar label)
and covered with clear tape?    Y    N    N/A

Preservatives and Other Solutions
Were stock preservatives prepared if required (recipes available)?    Y    N    N/A
Were the benthic invertebrate and zooplankton preservatives ready for
transport?    Y    N    N/A
Was the preservative for phytoplankton ready for transport?    Y    N    N/A
Was dry ice present?    Y    N    N/A
Was the pH/conductivity quality control check sample solution ready for
transport?    Y    N    N/A

Other Equipment and Supplies
Was the current version of the Lake Visit Checklist used at the base
location?    Y    N    N/A
Was the Supply Needs List sent or phoned in to "home base" or directly to
the Field Logistics Coordinator?    Y    N    N/A
Were additional "custom" items added to the checklist?    Y    N    N/A
Were equipment and supplies clean, in verified working order, and organized
for transport?    Y    N    N/A

Site Information and Access
Were individual site packets, including directions to the site and
topographic maps, available and organized?    Y    N    N/A
Was the site access information/permission letter available?    Y    N    N/A

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
Was the landowner contacted prior to the site visit?    Y    N    N/A
Were other key contact persons notified (e.g., Regional Coordinator, State
or Tribal contacts)?    Y    N    N/A

Vehicle
Was the tire pressure checked and OK?    Y    N    N/A
Was the fuel level checked and OK?    Y    N    N/A
Were the vehicle lights, turn signals, and brake lights checked?    Y    N    N/A
Were there any operational problems?    Y    N    N/A
Were emergency kit items (jumper cables, first aid kit, etc.) available?    Y    N    N/A
Was there an extra set of keys for the vehicle available and with a
different person?    Y    N    N/A

Boat
Were the trailer and hitch inspected prior to departing to the site to
ensure that the trailer was securely fastened?    Y    N    N/A
Were the electronic connection and brake lights for the trailer checked?    Y    N    N/A
Was the boat(s) in good working order and inspected before departure?    Y    N    N/A
Was there any additional emergency equipment (e.g., shovel, fire
extinguisher, etc.)?    Y    N    N/A
Were PFDs available for all passengers?    Y    N    N/A

Conductivity (OPTIONAL)
Was the QC check conducted correctly before field measurement, using a DI
water rinse, rinse bottle, and test bottle of QC solution?    Y    N    N/A
Was the measured conductivity of the QCC solution recorded?    Y    N    N/A
Does the crew understand what to do in case of an unacceptable QC check?    Y    N    N/A
Was the temperature of the solution recorded (if the meter does not provide
temperature-corrected values)?    Y    N    N/A
Was the QC solution recently replaced (2-3 weeks)?    Y    N    N/A
Was the conductivity measurement made at a representative location within
the lake (near the index site, mid-depth, etc.)?    Y    N    N/A
Was a measured conductivity value recorded correctly on the field form?    Y    N    N/A
Were the meter and probe stored correctly after use?    Y    N    N/A

-------
National Lakes Survey                 Draft Field Evaluation and Assistance Site Visit Checklist                   Jun. 12, 08





 NOTES

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
LAKE VERIFICATION

Lake Verification at the Launch Site
Was the site information sheet available for the lake?    Y    N    N/A
Were the lake coordinates recorded on the verification form?    Y    N    N/A
Was a detailed description of the final part of the route to the lake
recorded?    Y    N    N/A
Was the lake classified correctly (e.g., target vs. nontarget vs.
inaccessible)?    Y    N    N/A
Was the Verification Form completed for sites not visited and for sites
visited but not sampled?    Y    N    N/A
Was a rough sketch of the lake outline available for Side 2 of the
verification form?    Y    N    N/A
Does the map sketch of the lake outline include shoreline station locations
and the launch site location?    Y    N    N/A

Lake Verification at the Index Site Location
Was the lake verified via GPS coordinates or map information and recorded
on the verification form?    Y    N    N/A
Was the lake evaluated to see if it meets study requirements (e.g., > 1 m
deep)?    Y    N    N/A
Was the deepest point, or index location, (< 50 m) determined using sonar
or a bathymetric map?    Y    N    N/A
Were the GPS coordinates of the index location recorded on the form?    Y    N    N/A
Were photographs of the site taken (if appropriate)?    Y    N    N/A
Was the index site location marked on the lake outline map?    Y    N    N/A

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
INDEX SITE SAMPLING

Temperature, Dissolved Oxygen, and pH
Was the depth measured at the index location, and the intervals calculated
before the probe was placed in the water?    Y    N    N/A
Were the site conditions properly recorded?    Y    N    N/A
Was the probe calibrated during the initial site activities?    Y    N    N/A
Was an operation manual available for the meter?    Y    N    N/A
Were the measurements at each depth interval conducted and recorded
according to the protocol on the Lake Profile Form?    Y    N    N/A
Did the probe touch the bottom of the lake?    Y    N    N/A
Was a duplicate reading taken at the surface after the profile was
completed?    Y    N    N/A
Was the probe stored correctly after the measurement?    Y    N    N/A
Were the top and bottom of the metalimnion (where the water temperature
changes 1 degree per meter) marked on the form?    Y    N    N/A

Secchi Disk Transparency
Was the Secchi disk being used the black and white patterned disk?    Y    N    N/A
Was the calibrated sounding line visibly marked in half-meter intervals?    Y    N    N/A
Was the measurement taken from the shady side of the boat?    Y    N    N/A
Was the recorder wearing sunglasses or a hat?    Y    N    N/A
Was a viewscope used?    Y    N    N/A

Water Sample Collection and Preservation
Were gloves worn?    Y    N    N/A
Was the integrated sampler rinsed three times at the index point?    Y    N    N/A
Was the euphotic zone correctly defined by the team based on Secchi depth
measurements?    Y    N    N/A
Was the euphotic zone calculated on the lake index site sample collection
form?    Y    N    N/A
If the euphotic zone was < 2 m, was the sample collected from within the
euphotic zone only?    Y    N    N/A
Were labels for all containers securely attached and covered with clear
tape?    Y    N    N/A
Was the Lake ID correctly labeled on each container?    Y    N    N/A
Was the cubitainer expanded by water pressure, not by inflating or pulling
the sides apart?    Y    N    N/A
Were fingers kept away from the inner surface of the cap and container
opening during sample collection?    Y    N    N/A
Was the first cubitainer mixed thoroughly before pouring off into the 2 L
bottle for chlorophyll-a filtration, the 1 L bottle for phytoplankton, and
the 500 ml bottle for the microcystin sample?    Y    N    N/A

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
Water Sample Collection and Preservation
Are the sample jars clearly labeled for each indicator?    Y    N    N/A
Were approximately 10 ml of Lugol's added to the 1 L bottle for
phytoplankton preservation?    Y    N    N/A
Was the sample a "weak-tea" color?    Y    N    N/A
For the microcystin sample, was the 500 ml bottle filled with water from
the 4 L cubitainer?    Y    N    N/A
For the microcystin sample, was the bottle placed in the cooler with wet
ice?    Y    N    N/A
Was the cubitainer placed in a dark plastic bag?    Y    N    N/A
Was the cubitainer placed in a cooler or in a black gallon bag on ice until
the site work was complete?    Y    N    N/A

Zooplankton Sample Collection
Were the mesh sizes clearly marked on the two Wisconsin nets and buckets
(80 µm and 243 µm)?    Y    N    N/A
Were the nets inspected before use for holes or tears?    Y    N    N/A
Were the nets each attached to a line visibly marked every 0.5 m?    Y    N    N/A
Was the net carefully lowered through the water in an upright position?    Y    N    N/A
Was the net stopped 0.5 m from the bottom?    Y    N    N/A
If the lake was < 2 m deep and the Secchi disk was visible at the bottom of
the lake, was a second tow conducted?    Y    N    N/A
Was the net pulled to the surface at a steady, constant rate (about 1 ft or
0.3 m per second)?    Y    N    N/A
At the surface, was the net dipped into the water to rinse organisms to the
cod end?    Y    N    N/A
Was the outside of the net carefully rinsed at the surface with a squirt
bottle or similar tool?    Y    N    N/A
Was the second net towed from the other side of the boat or the opposite
end?    Y    N    N/A
Was the lake ID pre-recorded on the sample label?    Y    N    N/A
Was the mesh size (80 µm or 243 µm) marked on the jar?    Y    N    N/A
Were the samples collected from each net mesh size treated as two unique
samples (different sample ID numbers)?    Y    N    N/A
Did the 500 ml bottle contain the CO2 tablets?    Y    N    N/A
Was EtOH water used to rinse the zooplankton from the net into the sample
bottle?    Y    N    N/A
If the volume of zooplankton in the bucket exceeded 125 ml, was a second
jar used?    Y    N    N/A
If so, were the jars labeled properly (i.e., "Extra jar" and "2 of 2"
added)?    Y    N    N/A
Was approximately 80 ml of ethanol added to the jar?    Y    N    N/A
Was the length of the tow recorded on the label and sample collection
form?    Y    N    N/A
Was the lid wrapped in electrical tape?    Y    N    N/A
Was this procedure followed separately for each net?    Y    N    N/A

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
Zooplankton Sample Collection
Was the Sample Collection Form completed correctly for zooplankton? Does
the information on the form match the information on the label for each
sample?    Y    N    N/A

Sediment Diatom and Mercury Sample Collection
Were the containers properly labeled for top, bottom, and sediment cores?    Y    N    N/A
Was the corer cleaned from the last site visit and rinsed with tap water
after arrival at this lake site?    Y    N    N/A
Were gloves (powderless) worn throughout this procedure?    Y    N    N/A
Was the core extruded from an area of undisturbed sediments?    Y    N    N/A
Was the core 35 cm to 45 cm in length?    Y    N    N/A
Was the water-sediment interface maintained while placing the stopper in
the bottom of the corer?    Y    N    N/A
Was the corer kept in a vertical position while the slices were extracted?    Y    N    N/A
Was the total length of the core measured to the nearest 0.1 cm?    Y    N    N/A
Was the water at the top of the core carefully removed with a siphoning
tube, so the top sediments were not disturbed?    Y    N    N/A
Was the crew careful to ensure that the sampling kit did not come in
contact with anything other than the sediment sample?    Y    N    N/A
Was the sediment from the center of the core (for mercury analysis)
transferred to the vial without rinsing?    Y    N    N/A
Was the sediment sample placed immediately on dry ice?    Y    N    N/A
Was the top 1 cm of the core transferred to the sample container labeled
"TOP"?    Y    N    N/A
Was the interval recorded on the Sample Collection Form?    Y    N    N/A
For natural lakes, was the sectioning apparatus rinsed before the bottom
slice was extracted?    Y    N    N/A
For natural lakes, was the sediment extruded until the bottom of the
stopper was 5 cm from the top of the coring tube? Was the tube marked at
5 cm?    Y    N    N/A
For natural lakes, were the next 2 cm extruded and discarded?    Y    N    N/A
For natural lakes, was the next 1 cm extruded and kept as the "BOTTOM"
slice? Was this interval correctly recorded on the Sample Collection form?    Y    N    N/A
Were the labels secured with clear plastic tape?    Y    N    N/A
Was the corer cleaned and rinsed with lake water after all samples were
collected?    Y    N    N/A

-------
National Lakes Survey                 Draft Field Evaluation and Assistance Site Visit Checklist                   Jun. 12, 08





 NOTES

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
PHYSICAL HABITAT EVALUATION

Site Selection and Location
Were habitat sites selected randomly and distributed evenly around the lake
perimeter?    Y    N    N/A
Were habitat sites located accurately (using GPS, lake outline, or
topography) and the plots properly laid out?    Y    N    N/A
Were habitat sites adjusted reasonably and only when necessary?    Y    N    N/A
Was the lake outline map on the verification form marked appropriately for
the adjusted stations?    Y    N    N/A
Was an observation vantage point established at 10 m off the shore and on
the centerline of the plot?    Y    N    N/A
Was the water depth at 10 m off shore measured with a sounding or sonar and
recorded accurately (including units)?    Y    N    N/A

Bottom Substrates
Were bottom substrates visually observed or probed with a sounding pole
throughout the littoral plot?    Y    N    N/A
Were the categories of bottom substrates interpreted correctly?    Y    N    N/A
Did the categorical levels of bottom substrates potentially add up to 100%?    Y    N    N/A

Aquatic Macrophytes
Were aquatic macrophytes correctly categorized and characterized?    Y    N    N/A
Was the total macrophyte coverage consistent with coverage in the
individual categories?    Y    N    N/A

Fish Cover
Were the elements of fish cover properly identified and quantified?    Y    N    N/A

Riparian Vegetation
Were the canopy, understory, and ground cover correctly and completely
characterized?    Y    N    N/A
Were the vegetative types consistent with coverage categories?    Y    N    N/A

Shoreline Substrate Zone
Were the shoreline substrates in the first landward meter properly
identified and quantified?    Y    N    N/A

Human Influence
Were the human influences properly identified within or near the plot?    Y    N    N/A

Littoral Fish Macrohabitat
Were all fields complete?    Y    N    N/A
Were selections consistent with information on the front of the form?    Y    N    N/A

Bank Features
Was the bank angle correctly interpreted in the first landward meter and
recorded?    Y    N    N/A
Was the high water mark correctly identified?    Y    N    N/A

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
Were the horizontal and vertical distances from the current waterline
correctly estimated or measured and recorded (in meters)?    Y    N    N/A

Invasives
Were the species correctly marked, or "none observed" marked, in both the
littoral and riparian columns?    Y    N    N/A

Whole Form
Were the site and date information complete?    Y    N    N/A
Was one habitat form completed per station (additional forms included for
new sites, e.g., islands)?    Y    N    N/A
Were data flags used appropriately and explained adequately throughout the
form?    Y    N    N/A
Was the form reviewed and initialed?    Y    N    N/A
Were the comments legible?    Y    N    N/A

Benthic Macroinvertebrate Sample Collection
After locating the sample site, was the dominant habitat type identified
within the plot?    Y    N    N/A
Was a D-frame dip net (equipped with 500 µm mesh) used to sweep through 1
linear meter of the dominant habitat type at a single location within the
10 m x 15 m littoral zone sampling area, making sure to disturb the
substrate enough to dislodge organisms?    Y    N    N/A
If the dominant habitat was rocky/cobble/large woody debris, did the crew
member conducting the sampling exit the boat and disturb the substrate
(e.g., overturn rocks, logs) using his/her feet while sweeping the net
through the disturbed area?    Y    N    N/A
After completing the 1-meter sweep, were organisms and debris removed from
the net and placed in a bucket?    Y    N    N/A
Were the organisms and detritus collected at each station on the lake
combined in a single bucket to create a single composite sample for the
lake?    Y    N    N/A

Fecal Indicator (Enterococci) Sample Collection
Were gloves worn?    Y    N    N/A
Was the sodium thiosulfate tablet transferred from the pre-sterilized
250 ml sample bottle to a sterile screw-cap 50 ml PP tube?    Y    N    N/A
Was the sampling location 1 m deep and approached slowly from downstream or
downwind?    Y    N    N/A
Was the 250 ml sample bottle lowered uncapped and inverted to a depth of
0.3 meters below the water surface, avoiding surface scum, vegetation, and
substrates?    Y    N    N/A
Was the mouth of the container pointed away from the body or boat?    Y    N    N/A
Was the bottle righted and raised through the water column, allowing the
bottle to fill completely?    Y    N    N/A

-------
National Lakes Survey
                                    Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
Fecal Indicator (Enterococci) Sample Collection
After removing the container from the water, was a small portion of the
sample discarded to allow for proper mixing before analyses?    Y    N    N/A
Was the sodium thiosulfate tablet added along with the cap, and the bottle
shaken 25 times?    Y    N    N/A
Was the sample stored in a cooler on ice to chill (not freeze)?    Y    N    N/A
Was the sample chilled for at least 15 minutes and held for less than 8
hours before filtration?    Y    N    N/A

 NOTES

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
FINAL LAKE ACTIVITIES

General Lake Assessment
Were any of the sources of potential stressors recorded that were observed
while on the lake, while driving or walking through the lake catchment, or
while flying over the lake and catchment?    Y    N    N/A
For activities and stressors that the crew observed, was their abundance or
influence rated as low (L), moderate (M), or heavy (H) on the line next to
the listed disturbance?    Y    N    N/A
Was the box on the assessment forms checked to denote blanks as zeros?    Y    N    N/A
Was the section "Lake Site Activities and Disturbances Observed" completed,
including residential, recreational, agricultural, industrial, and lake
management categories?    Y    N    N/A
Were observations regarding the general characteristics of the lake
recorded?    Y    N    N/A
Was the hydrologic lake type recorded?    Y    N    N/A
Were flight hazards noted that might interfere with either low-altitude
fly-overs by aircraft (for future aerial photography or videography) or
landing on the lake for sampling purposes (either by float plane or
helicopter)?    Y    N    N/A
When estimating the intensity of motor boat usage, in addition to the
actual number of boats observed on the lake during the visit, were other
observations such as the presence of boat houses, docks, and idle craft
recorded?    Y    N    N/A
Were all six characteristics estimated and the section "General Lake
Information" completed?    Y    N    N/A
When the extent of major vegetation types was estimated, was the assessment
limited to the immediate lake shoreline (i.e., within 20 m of the water)?    Y    N    N/A
Was the percentage of the immediate shoreline that has been developed or
modified by humans estimated?    Y    N    N/A
Were all eight shoreline categories completed and the section "Shoreline
Characteristics" estimated?    Y    N    N/A
Was the areal percentage of macrophyte coverage for the three categories
estimated and the section "Qualitative Macrophyte Survey" completed?    Y    N    N/A
Was the waterbody character rated?    Y    N    N/A
Was the waterbody character defined by using degree of human development
and aesthetics attributes?    Y    N    N/A
Were the three ecological values (i.e., trophic state, ecological
integrity, and recreation) assessed?    Y    N    N/A
For ecological values, was the overall impression of the "health" of the
biota in the lake recorded, noting any possible causes of impairment?    Y    N    N/A
For trophic status, was a visual impression of the trophic status,
including an overall impression of algal abundance and general type,
provided?    Y    N    N/A
For trophic status, were any observed potential nutrient sources to the
lake listed?    Y    N    N/A
For recreation, was the overall impression of the lake as a site for
recreation recorded?    Y    N    N/A

-------
National Lakes Survey
                                    Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
FINAL LAKE ACTIVITIES

General Lake Assessment
For recreation, were possible causes of impairment, or the presence or
absence of people using the lake for recreational activities, recorded?    Y    N    N/A
Was the comments section used on the Lake Assessment Form to note any other
pertinent information about the lake or its catchment?    Y    N    N/A

 NOTES

-------
National Lakes Survey
                                      Draft Field Evaluation and Assistance Site Visit Checklist
Jun. 12, 08
FINAL LAKE ACTIVITIES

Processing the Fecal Indicator
Were non-powdered surgical gloves worn?    Y    N    N/A
Were the Filter Extraction tubes with beads chilled on dry ice?    Y    N    N/A
Were the 4 PC filters aseptically transferred from the filter box to the
base of the opened Petri dish?    Y    N    N/A
Was the cellulose nitrate filter removed from the funnel and discarded?    Y    N    N/A
Was the filtration funnel loaded with a sterile PC filter on the support
pad (shiny side up)?    Y    N    N/A
Was the sample bottle(s) shaken 25 times to mix well?    Y    N    N/A
Was the 25 ml of the mixed water sample measured in the sterile graduated
PP tube and poured into the filter funnel?    Y    N    N/A
Was it pumped until all liquid was in the filtrate collection flask?    Y    N    N/A
If the first 25 ml volume passed readily through the filter, was another
25 ml added and the filtration continued?    Y    N    N/A
If the filter clogged before completely filtering the first or second 25 ml
volume, was the filter discarded and the filtration repeated using a lesser
volume?    Y    N    N/A
Was a quarter (approx. 25 ml) of the chilled Dilution Buffer poured into
the graduated PP tube used for the sample?    Y    N    N/A
Was the tube capped and shaken 5 times?    Y    N    N/A
Was the cap removed and the rinsate poured into the filter funnel to rinse
the filter?    Y    N    N/A
Was the rinsate filtered and the rinse repeated with another 25 ml of
Dilution Buffer?    Y    N    N/A
Was the filter funnel removed from the base without disturbing the filter?    Y    N    N/A
Were sterile disposable forceps used to remove the filter (touching only
the filter edges)?    Y    N    N/A
Was the filter folded in half, in quarters, and then in eighths?    Y    N    N/A
Was the filter inserted into the chilled filter extraction tube (with
beads)?    Y    N    N/A
Was the screw cap replaced and tightened?    Y    N    N/A
Was the tube(s) inserted into a ziplock bag on dry ice for preservation
during transport and shipping?    Y    N    N/A
Was the volume of water sample filtered through each filter recorded?    Y    N    N/A
If 25 ml of dilution buffer was not used, was this flagged and noted on the
collection form?    Y    N    N/A
Were the filtration start time and finish time recorded for each sample?    Y    N    N/A
Were the steps repeated for the remaining three 50 ml sub-sample volumes to
be filtered?    Y    N    N/A

 NOTES

-------
FINAL LAKE ACTIVITIES
Processing the Chlorophyll-a Sample
Were surgical gloves worn?                                                          | Y | N | N/A
Was a glass fiber filter placed in the graduated filter holder apparatus?           | Y | N | N/A
Was the filter handled with forceps?                                                | Y | N | N/A
Was 250 ml of water poured into the filter holder, the cap replaced, and the
sample pumped through the filter?                                                   | Y | N | N/A
If 250 ml of lake water did not pass through the filter, was the filter changed,
the apparatus rinsed with DI water, and the procedure repeated using 100 ml of
lake water?                                                                         | Y | N | N/A
Was the upper portion of the filtration apparatus rinsed thoroughly with DI water
to include any remaining cells adhering to the sides, and the rinse water pumped
through the filter?                                                                 | Y | N | N/A
Was the level of water in the lower chamber monitored to ensure that it did not
contact the filter or flow into the pump?                                           | Y | N | N/A
Was the filter observed for visible color?                                          | Y | N | N/A
If no color was visible, did filtration continue until color was visible on the
filter or until a maximum of 2,000 ml was filtered?                                 | Y | N | N/A
Was the actual sample volume filtered recorded on the Sample Collection Form and
on the sample label?                                                                | Y | N | N/A
Was the bottom portion of the apparatus removed and the water poured off from the
bottom?                                                                             | Y | N | N/A
Was the filter removed from the holder with clean forceps?                          | Y | N | N/A
Was the filter folded in half, with the colored side folded inward?                 | Y | N | N/A
Was the folded filter placed into a 50 ml screw-top centrifuge tube and capped?     | Y | N | N/A
Was the sample volume filtered recorded on a chlorophyll label and the label
attached to the centrifuge tube?                                                    | Y | N | N/A
Was all written information complete and legible?                                   | Y | N | N/A
Was the label covered with a strip of clear tape?                                   | Y | N | N/A
Does the "total volume of water filtered" on the Sample Collection Form match the
total volume recorded on the sample label?                                          | Y | N | N/A
Was the tube wrapped in aluminum foil and placed in a self-sealing plastic bag?     | Y | N | N/A
Was this bag placed between two small bags of ice in a cooler?                      | Y | N | N/A
Were the filter chambers rinsed with DI water?                                      | Y | N | N/A

 NOTES

-------
FINAL LAKE ACTIVITIES
Data Forms and Sample Inspection
After the Lake Assessment Form was completed, did the Field Team Leader review
all of the data forms and sample labels for accuracy, completeness, and
legibility?                                                                         | Y | N | N/A
Did the other team member inspect all sample containers and packages in
preparation for transport, storage, or shipment?                                    | Y | N | N/A
Did the team ensure that all required data forms for the lake were completed?       | Y | N | N/A
Was it confirmed that the LAKE-ID and date of visit were correct on all forms?      | Y | N | N/A
On each form, was it verified that all information was recorded accurately, the
recorded information was legible, and any flags were explained in the comments
section?                                                                            | Y | N | N/A
Was it ensured that written comments were legible, with no "shorthand" or
abbreviations?                                                                      | Y | N | N/A
After reviewing each form, was the upper right corner of each page of the form
initialed?                                                                          | Y | N | N/A
Was it ensured that all samples were labeled, all labels were completely filled
in, and each label was covered with clear plastic tape?                             | Y | N | N/A
Were all sample containers checked to ensure that they were properly sealed?        | Y | N | N/A
Will the coolers be shipped with fresh bags of ice, with the ice bags labeled as
"ICE"?                                                                              | Y | N | N/A
Will the coolers be shipped by overnight courier ASAP after collection (generally
the next day)?                                                                      | Y | N | N/A
If samples will be held after collection, will they be kept cold and in darkness?   | Y | N | N/A
Were the Wisconsin nets and buckets rinsed at least three times with DI water?      | Y | N | N/A
 NOTES

-------
FINAL LAKE ACTIVITIES
Launch Site Cleanup
Were the boat, motor, and trailer inspected for evidence of weeds and other
macrophytes?                                                                        | Y | N | N/A
Were the boat, motor, and trailer cleaned as completely as possible before
leaving the launch site?                                                            | Y | N | N/A
Were all nets inspected for pieces of macrophytes or other organisms, and was as
much as possible removed before packing the nets for transport?                     | Y | N | N/A
Were all equipment and supplies packed in the vehicle and trailer for transport
and kept as organized as presented in the equipment checklists?                     | Y | N | N/A
Was all waste material at the launch site cleaned up and disposed of, or
transported out of the site if a trash can was not available?                       | Y | N | N/A
 NOTES

-------
Miscellaneous
Do the team members know the Communications Center phone number by heart, have
the number saved in a cell phone, or know where to find the number in the Field
Operations Manual?                                                                  | Y | N | N/A
Do the team members have suggestions or problems concerning the sampling
procedures, forms, lodging, logistics, etc.?                                        | Y | N | N/A
-------
                              FINAL EVALUATION ACTIVITIES
                                         Areas of Concern

-------
                                  FINAL EVALUATION ACTIVITIES
 Was the team debriefed on the results of the evaluation by the evaluator?          | Y |  N  | N/A
                     COMMENTS OF THE TEAM BEING EVALUATED

-------
                                       SIGNATURES
Evaluator                               Date           Field Team Leader                  Date
Field QC Officer (if assigned by site)  Date           Field Team Member                  Date
Field Team Member                       Date           Field Team Member                  Date
Field Team Member                       Date           Field Team Member                  Date
Field Team Member                       Date           Field Team Member                  Date

-------
       APPENDIX B
LAKES SURVEY LABORATORY LIST

-------
        Appendix B
Lakes Survey Laboratory List
Support: Field Sampling
    Contact:                 Michael Barbour, 410-356-8993
    Contractor:              TetraTech, Inc., 10306 Eaton Place, Fairfax, VA 22030, 703-385-6000
    Contract No. & Task No.: AWPD 68-C-02-108, Task No. 167
    Project Officer:         Carol Peterson, EPA/OW/OWOW (4303T), 1200 Pennsylvania Ave. NW,
                             Washington, DC 20460, 202-566-1304

Laboratory: Water Chemistry Analysis
    Contact:                 Dave Peck, EPA/COR, 541-754-4463
    Contractor:              Dynamac Corp., c/o U.S. EPA, 200 SW 35th St., Corvallis, OR 97333,
                             541-754-4463
    Contract No. & Task No.: EP-D-06-013, Work Assignment 1-06
    Project Officer:         Kathy Martin, U.S. EPA, NHEERL-WED, 200 S.W. 35th St.,
                             Corvallis, OR 97333, 541-754-4502

Laboratory: Sediment Diatom Analysis
    Contact:                 Dennis McCauley, 231-941-2230
    Contractor:              R. Jan Stevenson, Ph.D., Co-Director, Center for Water Sciences,
                             and Professor, Department of Zoology, 203 Natural Science
                             Building, Michigan State University, East Lansing, MI 48824,
                             Phone: 517-432-8083, FAX: 517-432-2789, www.msu.edu/~rjstev;
                             Kociolek, Patrick, Diatom Collection, California Academy of
                             Sciences, 875 Howard Street, San Francisco, CA 94103;
                             Mark A. Schadler, Phycology Project Manager, Patrick Center for
                             Environmental Research, Academy of Natural Sciences,
                             1900 Benjamin Franklin Pkwy, Philadelphia, PA 19103, 215-299-3792
    Contract No. & Task No.: HECD 68-C-04-006, Work Assignment 3-58
    Project Officer:         Carol Peterson, EPA/OW/OWOW (4303T), 1200 Pennsylvania Ave. NW,
                             Washington, DC 20460, 202-566-1304

Laboratory: qPCR for enterococci Analysis
    Contact:                 Jack Paar, 617-918-8300
    Contractor:              TechLaw, Inc., 14500 Avion Parkway, Chantilly, VA 20151-1101,
                             703-816-1000
    Contract No. & Task No.: ESAT Contract No. EP-W-06-17, Task Order 08 (non-Superfund PCR
                             support)
    Project Officer:         Pat Svetaka, U.S. EPA Regional Lab (EQA), 11 Technology Dr.,
                             N. Chelmsford, MA 01863-2431, 617-918-8396

Laboratory: Algal Toxin Analysis
    Contact:                 Keith Loftin, 785-832-3543
    Contractor:              USGS, Kansas Water Science Center, 4821 Quail Crest Place,
                             Lawrence, KS 66049, 785-832-3511
    Contract No. & Task No.: IAG No. DW 14922508-01-0
    Project Officer:         Susan Holdsworth, EPA/OW/OWOW (4303T), 1200 Pennsylvania Ave. NW,
                             Washington, DC 20460, 202-566-1187

Laboratory: Zooplankton Analysis
    Contact:                 Ellen Tarquinio, 202-566-2267
    Contractor:              EcoAnalysts, Inc., 105 E 2nd St., Suite 1, Moscow, ID 83843,
                             208-882-2588
    Contract No. & Task No.: BPA 07-03
    Project Officer:         Ellen Tarquinio, EPA/OW/OWOW (4305T), 1200 Pennsylvania Ave. NW,
                             Washington, DC 20460, 202-566-2267

Laboratory: Phytoplankton Analysis
    Contact:                 Ellen Tarquinio, 202-566-2267
    Contractor:              EcoAnalysts, Inc., 105 E 2nd St., Suite 1, Moscow, ID 83843,
                             208-882-2588
    Contract No. & Task No.: BPA 07-02
    Project Officer:         Ellen Tarquinio, EPA/OW/OWOW (4305T), 1200 Pennsylvania Ave. NW,
                             Washington, DC 20460, 202-566-2267

-------