National Primary Drinking Water Regulations: Long Term 2 Enhanced
Surface Water Treatment Rule
[Federal Register: August 11, 2003 (Volume 68, Number 154)]
[Proposed Rules]
[Page 47639-47688]
From the Federal Register Online via GPO Access [wais.access.gpo.gov]
[DOCID:fr11au03-39]
-----------------------------------------------------------------------
ENVIRONMENTAL PROTECTION AGENCY
40 CFR Parts 141 and 142
[FRL-7530-5]
RIN 2040-AD37
National Primary Drinking Water Regulations: Long Term 2 Enhanced
Surface Water Treatment Rule
AGENCY: Environmental Protection Agency.
ACTION: Proposed rule.
-----------------------------------------------------------------------
SUMMARY: In this document, the Environmental Protection Agency (EPA) is
proposing National Primary Drinking Water Regulations that require the
use of treatment techniques, along with monitoring, reporting, and
public notification requirements, for all public water systems (PWSs)
that use surface water sources. The purposes of the Long Term 2
Enhanced Surface Water Treatment Rule (LT2ESWTR) are to improve control
of microbial pathogens, including specifically the protozoan
Cryptosporidium, in drinking water and to address risk-risk trade-offs
with the control of disinfection byproducts. Key provisions in today's
proposed LT2ESWTR include the following: source water monitoring for
Cryptosporidium, with reduced monitoring requirements for small
systems; additional Cryptosporidium treatment for filtered systems
based on source water Cryptosporidium concentrations; inactivation of
Cryptosporidium by all unfiltered systems; disinfection profiling and
benchmarking to ensure continued levels of microbial protection while
PWSs take the necessary steps to comply with new disinfection byproduct
standards; covering, treating, or implementing a risk management plan
for uncovered finished water storage facilities; and criteria for a
number of treatment and management options (i.e., the microbial
toolbox) that PWSs may implement to meet additional Cryptosporidium
treatment requirements. The LT2ESWTR will build upon the treatment
technique requirements of the Interim Enhanced Surface Water Treatment
Rule and the Long Term 1 Enhanced Surface Water Treatment Rule.
EPA believes that implementation of the LT2ESWTR will significantly
reduce levels of Cryptosporidium in finished drinking water. This will
substantially lower rates of endemic cryptosporidiosis, the illness
caused by Cryptosporidium, which can be severe and sometimes fatal in
sensitive subpopulations (e.g., AIDS patients, the elderly). In
addition, the treatment technique requirements of this proposal are
expected to increase the level of protection from exposure to other
microbial pathogens (e.g., Giardia lamblia).
DATES: EPA must receive public comment on the proposal by November 10,
2003.
ADDRESSES: Comments may be submitted by mail to: Water Docket,
Environmental Protection Agency, Mail Code 4101T, 1200 Pennsylvania
Ave., NW., Washington, DC 20460, Attention Docket ID No. OW-2002-0039.
Comments may also be submitted electronically or through hand delivery/
courier by following the detailed instructions as provided in section
I.C. of the SUPPLEMENTARY INFORMATION section.
FOR FURTHER INFORMATION CONTACT: For technical inquiries, contact
Daniel Schmelling, Office of Ground Water and Drinking Water (MC
4607M), U.S. Environmental Protection Agency, 1200 Pennsylvania Ave.,
NW., Washington, DC 20460; telephone (202) 564-5281. For regulatory
inquiries, contact Jennifer McLain at the same address; telephone (202)
564-5248. For general information contact the Safe Drinking Water
Hotline, Telephone (800) 426-4791. The Safe Drinking Water Hotline is
open Monday through Friday, excluding legal holidays, from 9 a.m. to
5:30 p.m. Eastern Time.
SUPPLEMENTARY INFORMATION:
I. General Information
A. Who Is Regulated by This Action?
Entities potentially regulated by the LT2ESWTR are public water
systems (PWSs) that use surface water or ground water under the direct
influence of surface water (GWUDI). Regulated categories and entities
are identified in the following chart.
------------------------------------------------------------------------
              Category                  Examples of regulated entities
------------------------------------------------------------------------
Industry...........................  Public Water Systems that use
                                      surface water or ground water
                                      under the direct influence of
                                      surface water.
State, Local, Tribal or Federal      Public Water Systems that use
 Governments.                         surface water or ground water
                                      under the direct influence of
                                      surface water.
------------------------------------------------------------------------
This table is not intended to be exhaustive, but rather provides a
guide for readers regarding entities likely to be regulated by this
action. This table lists the types of entities that EPA is now aware
could potentially be regulated by this action. Other types of entities
not listed in this table could also be regulated. To determine whether
your facility is regulated by this action, you should carefully examine
the definition of public water system in Sec. 141.3 of Title 40 of the
Code of Federal Regulations and applicability criteria in Sec. Sec.
141.76 and 141.501 of today's proposal. If you have questions regarding
the applicability of the LT2ESWTR to a particular entity, consult one
of the persons listed in the preceding section entitled FOR FURTHER
INFORMATION CONTACT.
B. How Can I Get Copies of This Document and Other Related Information?
1. Docket. EPA has established an official public docket for this
action under Docket ID No. OW-2002-0039. The official public docket
consists of the documents specifically referenced in this action, any
public comments received, and other information related to this action.
Although a part of the official docket, the public docket does not
include Confidential Business Information (CBI) or other information
whose disclosure is restricted by statute. The official public docket
is the collection of materials that is available for public viewing at
the Water Docket in the EPA Docket Center, (EPA/DC) EPA West, Room
B102, 1301 Constitution Ave., NW., Washington, DC. The EPA Docket
Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday
through Friday, excluding legal holidays. The telephone number for the
Public Reading Room is (202) 566-1744, and the telephone number for the
Water Docket is (202) 566-2426. For access to docket material, please
call (202) 566-2426 to schedule an appointment.
2. Electronic Access. You may access this Federal Register document
electronically through the EPA Internet under the ``Federal Register''
listings at http://www.epa.gov/fedrgstr/.
An electronic version of the public docket is available through
EPA's electronic public docket and comment system, EPA Dockets. You may
use EPA Dockets at http://www.epa.gov/edocket/ to submit or view public
comments, access the index listing of the contents of the official
public docket, and to access those documents in the public docket that
are available electronically. Once in the system, select ``search,''
then key in the appropriate docket identification number.
Certain types of information will not be placed in the EPA Dockets.
Information claimed as CBI and other
information whose disclosure is restricted by statute, which is not
included in the official public docket, will not be available for
public viewing in EPA's electronic public docket. EPA's policy is that
copyrighted material will not be placed in EPA's electronic public
docket but will be available only in printed, paper form in the
official public docket. Although not all docket materials may be
available electronically, you may still access any of the publicly
available docket materials through the docket facility identified in
section I.B.1.
For public commenters, it is important to note that EPA's policy is
that public comments, whether submitted electronically or in paper,
will be made available for public viewing in EPA's electronic public
docket as EPA receives them and without change, unless the comment
contains copyrighted material, CBI, or other information whose
disclosure is restricted by statute. When EPA identifies a comment
containing copyrighted material, EPA will provide a reference to that
material in the version of the comment that is placed in EPA's
electronic public docket. The entire printed comment, including the
copyrighted material, will be available in the public docket.
Public comments submitted on computer disks that are mailed or
delivered to the docket will be transferred to EPA's electronic public
docket. Public comments that are mailed or delivered to the Docket will
be scanned and placed in EPA's electronic public docket. Where
practical, physical objects will be photographed, and the photograph
will be placed in EPA's electronic public docket along with a brief
description written by the docket staff.
C. How and to Whom Do I Submit Comments?
You may submit comments electronically, by mail, or through hand
delivery/courier. To ensure proper receipt by EPA, identify the
appropriate docket identification number in the subject line on the
first page of your comment. Please ensure that your comments are
submitted within the specified comment period. Comments received after
the close of the comment period will be marked ``late.'' EPA is not
required to consider these late comments.
1. Electronically. If you submit an electronic comment as
prescribed below, EPA recommends that you include your name, mailing
address, and an e-mail address or other contact information in the body
of your comment. Also include this contact information on the outside
of any disk or CD ROM you submit, and in any cover letter accompanying
the disk or CD ROM. This ensures that you can be identified as the
submitter of the comment and allows EPA to contact you in case EPA
cannot read your comment due to technical difficulties or needs further
information on the substance of your comment. EPA's policy is that EPA
will not edit your comment, and any identifying or contact information
provided in the body of a comment will be included as part of the
comment that is placed in the official public docket, and made
available in EPA's electronic public docket. If EPA cannot read your
comment due to technical difficulties and cannot contact you for
clarification, EPA may not be able to consider your comment.
a. EPA Dockets. Your use of EPA's electronic public docket to
submit comments to EPA electronically is EPA's preferred method for
receiving comments. Go directly to EPA Dockets at
http://www.epa.gov/edocket, and follow the online instructions for
submitting comments.
Once in the system, select ``search,'' and then key in Docket ID No.
OW-2002-0039. The system is an ``anonymous access'' system, which means
EPA will not know your identity, e-mail address, or other contact
information unless you provide it in the body of your comment.
b. E-mail. Comments may be sent by electronic mail (e-mail) to
OW-Docket@epa.gov, Attention Docket ID No. OW-2002-0039. In contrast to
EPA's electronic public docket, EPA's e-mail system is not an
``anonymous access'' system. If you send an e-mail comment directly to
the Docket without going through EPA's electronic public docket, EPA's
e-mail system automatically captures your e-mail address. E-mail
addresses that are automatically captured by EPA's e-mail system are
included as part of the comment that is placed in the official public
docket, and made available in EPA's electronic public docket.
c. Disk or CD ROM. You may submit comments on a disk or CD ROM that
you mail to the mailing address identified in section I.C.2. These
electronic submissions will be accepted in WordPerfect or ASCII file
format. Avoid the use of special characters and any form of encryption.
2. By Mail. Send three copies of your comments and any enclosures
to: Water Docket, Environmental Protection Agency, Mail Code 4101T,
1200 Pennsylvania Ave., NW., Washington, DC, 20460, Attention Docket ID
No. OW-2002-0039.
3. By Hand Delivery or Courier. Deliver your comments to: Water
Docket, EPA Docket Center, Environmental Protection Agency, Room B102,
1301 Constitution Ave., NW., Washington, DC, Attention Docket ID No. OW-
2002-0039. Such deliveries are only accepted during the Docket's normal
hours of operation as identified in section I.B.1.
D. What Should I Consider as I Prepare My Comments for EPA?
You may find the following suggestions helpful for preparing your
comments:
1. Explain your views as clearly as possible.
2. Describe any assumptions that you used.
3. Provide any technical information and/or data you used that
support your views.
4. If you estimate potential burden or costs, explain how you
arrived at your estimate.
5. Provide specific examples to illustrate your concerns.
6. Offer alternatives.
7. Make sure to submit your comments by the comment period deadline
identified.
8. To ensure proper receipt by EPA, identify the appropriate docket
identification number in the subject line on the first page of your
response. It would also be helpful if you provided the name, date, and
Federal Register citation related to your comments.
Abbreviations Used in This Document
AIPC All Indian Pueblo Council
ASDWA Association of State Drinking Water Administrators
ASTM American Society for Testing and Materials
AWWA American Water Works Association
AWWARF American Water Works Association Research Foundation
°C Degrees Centigrade
CCP Composite Correction Program
CDC Centers for Disease Control and Prevention
CFE Combined Filter Effluent
CFR Code of Federal Regulations
COI Cost-of-Illness
CT The Residual Concentration of Disinfectant (mg/L) Multiplied by the
Contact Time (in minutes)
CWS Community Water Systems
DAPI 4',6-Diamidino-2-phenylindole
DBPs Disinfection Byproducts
DBPR Disinfectants/Disinfection Byproducts Rule
DE Diatomaceous Earth
DIC Differential Interference Contrast (microscopy)
EA Economic Analysis
EPA United States Environmental Protection Agency
GAC Granular Activated Carbon
GWUDI Ground Water Under the Direct Influence of Surface Water
HAA5 Haloacetic acids (Monochloroacetic, Dichloroacetic,
Trichloroacetic, Monobromoacetic and Dibromoacetic Acids)
HPC Heterotrophic Plate Count
ICR Information Collection Request
ICRSS Information Collection Rule Supplemental Surveys
ICRSSM Information Collection Rule Supplemental Survey of Medium
Systems
ICRSSL Information Collection Rule Supplemental Survey of Large Systems
IESWTR Interim Enhanced Surface Water Treatment Rule
IFA Immunofluorescence Assay
Log Logarithm (common, base 10)
LRAA Locational Running Annual Average
LRV Log Removal Value
LT1ESWTR Long Term 1 Enhanced Surface Water Treatment Rule
LT2ESWTR Long Term 2 Enhanced Surface Water Treatment Rule
MCL Maximum Contaminant Level
MCLG Maximum Contaminant Level Goal
MGD Million Gallons per Day
M-DBP Microbial and Disinfectants/Disinfection Byproducts
MF Microfiltration
NCWS Non-community Water Systems
NF Nanofiltration
NODA Notice of Data Availability
NPDWR National Primary Drinking Water Regulation
NTNCWS Non-transient Non-community Water System
NTTAA National Technology Transfer and Advancement Act
NTU Nephelometric Turbidity Unit
OMB Office of Management and Budget
PE Performance Evaluation
PWS Public Water System
QC Quality Control
QCRV Quality Control Release Value
RAA Running Annual Average
RFA Regulatory Flexibility Act
RO Reverse Osmosis
RSD Relative Standard Deviation
SAB Science Advisory Board
SBAR Small Business Advocacy Review
SERs Small Entity Representatives
SDWA Safe Drinking Water Act
SWTR Surface Water Treatment Rule
TCR Total Coliform Rule
TTHM Total Trihalomethanes
TNCWS Transient Non-community Water Systems
UF Ultrafiltration
UMRA Unfunded Mandates Reform Act
Table of Contents
I. Summary
A. Why Is EPA Proposing the LT2ESWTR?
B. What Does the LT2ESWTR Proposal Require?
1. Treatment Requirements for Cryptosporidium
2. Disinfection Profiling and Benchmarking
3. Uncovered Finished Water Storage Facilities
C. Will This Proposed Regulation Apply to My Water System?
II. Background
A. What Is the Statutory Authority for the LT2ESWTR?
B. What Current Regulations Address Microbial Pathogens in
Drinking Water?
1. Surface Water Treatment Rule
2. Total Coliform Rule
3. Interim Enhanced Surface Water Treatment Rule
4. Long Term 1 Enhanced Surface Water Treatment Rule
5. Filter Backwash Recycle Rule
C. What Public Health Concerns Does This Proposal Address?
1. Introduction
2. Cryptosporidium Health Effects and Outbreaks
a. Health Effects
b. Waterborne Cryptosporidiosis Outbreaks
3. Remaining Public Health Concerns Following the IESWTR and
LT1ESWTR
a. Adequacy of Physical Removal To Control Cryptosporidium and
the Need for Risk-Based Treatment Requirements
b. Control of Cryptosporidium in Unfiltered Systems
c. Uncovered Finished Water Storage Facilities
D. Federal Advisory Committee Process
III. New Information on Cryptosporidium Health Risks and Treatment
A. Overview of Critical Factors for Evaluating Regulation of
Microbial Pathogens
B. Cryptosporidium Infectivity
1. Cryptosporidium Infectivity Data Evaluated for IESWTR
2. New Data on Cryptosporidium Infectivity
3. Significance of New Infectivity Data
C. Cryptosporidium Occurrence
1. Occurrence Data Evaluated for IESWTR
a. Filtered Systems
b. Unfiltered Systems
2. Overview of the Information Collection Rule and Information
Collection Rule Supplemental Surveys (ICRSS)
a. Scope of the Information Collection Rule
b. Scope of the ICRSS
3. Analytical Methods for Protozoa in the Information Collection
Rule and ICRSS
a. Information Collection Rule Protozoan Method
b. Method 1622 and Method 1623
4. Cryptosporidium Occurrence Results from the Information
Collection Rule and ICRSS
a. Information Collection Rule Results
b. ICRSS Results
5. Significance of New Cryptosporidium Occurrence Data
6. Request for Comment on Information Collection Rule and ICRSS
Data Sets
D. Treatment
1. Overview
2. Treatment Information Considered for the IESWTR and LT1ESWTR
a. Physical Removal
b. Inactivation
3. New Information on Treatment for Control of Cryptosporidium
a. Conventional Filtration Treatment and Direct Filtration
i. Dissolved Air Flotation
b. Slow Sand Filtration
c. Diatomaceous Earth Filtration
d. Other Filtration Technologies
e. Inactivation
i. Ozone and Chlorine Dioxide
ii. Ultraviolet Light
iii. Significance of New Information on Inactivation
IV. Discussion of Proposed LT2ESWTR Requirements
A. Additional Cryptosporidium Treatment Technique Requirements
for Filtered Systems
1. What Is EPA Proposing Today?
a. Overview of Framework Approach
b. Monitoring Requirements
c. Treatment Requirements
i. Bin Classification
ii. Credit for Treatment in Place
iii. Treatment Requirements Associated With LT2ESWTR Bins
d. Use of Previously Collected Data
2. How Was This Proposal Developed?
a. Basis for Targeted Treatment Requirements
b. Basis for Bin Concentration Ranges and Treatment Requirements
i. What Is the Risk Associated With a Given Level of
Cryptosporidium in a Drinking Water Source?
ii. What Degree of Additional Treatment Should Be Required for a
Given Source Water Cryptosporidium Level?
c. Basis for Source Water Monitoring Requirements
i. Systems Serving at Least 10,000 People
ii. Systems Serving Fewer Than 10,000 People
iii. Future Monitoring and Reassessment
d. Basis for Accepting Previously Collected Data
3. Request for Comment
B. Unfiltered System Treatment Technique Requirements for
Cryptosporidium
1. What Is EPA Proposing Today?
a. Overview
b. Monitoring Requirements
c. Treatment Requirements
2. How Was This Proposal Developed?
a. Basis for Cryptosporidium Treatment Requirements
b. Basis for Requiring the Use of Two Disinfectants
c. Basis for Source Water Monitoring Requirements
3. Request for Comment
C. Options for Systems to Meet Cryptosporidium Treatment
Requirements
1. Microbial Toolbox Overview
2. Watershed Control Program
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
3. Alternative Source
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
4. Off-stream Raw Water Storage
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
5. Pre-sedimentation With Coagulant
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
i. Published Studies of Cryptosporidium Removal by Conventional
Sedimentation Basins
ii. Data Supplied by Utilities on the Removal of Spores by
Presedimentation
c. Request for Comment
6. Bank Filtration
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
7. Lime Softening
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
8. Combined Filter Performance
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
9. Roughing Filter
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
10. Slow Sand Filtration
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
11. Membrane Filtration
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
12. Bag and Cartridge Filtration
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
13. Secondary Filtration
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
14. Ozone and Chlorine Dioxide
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comments
15. Ultraviolet Light
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
16. Individual Filter Performance
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
17. Other Demonstration of Performance
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
D. Disinfection Benchmarks for Giardia lamblia and Viruses
1. What Is EPA Proposing Today?
a. Applicability and Schedule
b. Developing the Disinfection Profile and Benchmark
c. State Review
2. How Was This Proposal Developed?
3. Request for Comments
E. Additional Treatment Technique Requirements for Systems with
Uncovered Finished Water Storage Facilities
1. What Is EPA Proposing Today?
2. How Was This Proposal Developed?
3. Request for Comments
F. Compliance Schedules
1. What Is EPA Proposing Today?
a. Source Water Monitoring
i. Filtered Systems
ii. Unfiltered Systems
b. Treatment Requirements
c. Disinfection Benchmarks for Giardia lamblia and Viruses
2. How Was This Proposal Developed?
3. Request for Comments
G. Public Notice Requirements
1. What Is EPA Proposing Today?
2. How Was This Proposal Developed?
3. Request for Comment
H. Variances and Exemptions
1. Variances
2. Exemptions
3. Request for Comment
a. Variances
b. Exemptions
I. Requirements for Systems To Use Qualified Operators
J. System Reporting and Recordkeeping Requirements
1. Overview
2. Reporting Requirements for Source Water Monitoring
a. Data Elements To Be Reported
b. Data System
c. Previously Collected Monitoring Data
3. Compliance With Additional Treatment Requirements
4. Request for Comment
K. Analytical Methods
1. Cryptosporidium
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
2. E. coli
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
3. Turbidity
a. What Is EPA Proposing Today?
b. How Was This Proposal Developed?
c. Request for Comment
L. Laboratory Approval
1. Cryptosporidium Laboratory Approval
2. E. coli Laboratory Approval
3. Turbidity Analyst Approval
4. Request for Comment
M. Requirements for Sanitary Surveys Conducted by EPA
1. Overview
2. Background
3. Request for Comment
V. State Implementation
A. Special State Primacy Requirements
B. State Recordkeeping Requirements
C. State Reporting Requirements
D. Interim Primacy
VI. Economic Analysis
A. What Regulatory Alternatives Did the Agency Consider?
B. What Analyses Support Selecting the Proposed Rule Option?
C. What Are the Benefits of the Proposed LT2ESWTR?
1. Non-quantifiable Health and Non-health Related Benefits
2. Quantifiable Health Benefits
a. Filtered Systems
b. Unfiltered Systems
3. Timing of Benefits Accrual (latency)
D. What Are the Costs of the Proposed LT2ESWTR?
1. Total Annualized Present Value Costs
2. Water System Costs
a. Source Water Monitoring Costs
b. Filtered Systems Treatment Costs
c. Unfiltered Systems Treatment Costs
d. Uncovered Finished Water Storage Facilities
e. Future Monitoring Costs
f. Sensitivity Analysis: Influent Bromide Levels on Technology
Selection for Filtered Plants
3. State/Primacy Agency Costs
4. Non-quantified Costs
E. What Are the Household Costs of the Proposed Rule?
F. What Are the Incremental Costs and Benefits of the Proposed
LT2ESWTR?
G. Are There Benefits From the Reduction of Co-occurring
Contaminants?
H. Are There Increased Risks From Other Contaminants?
I. What Are the Effects of the Contaminant on the General
Population and Groups Within the General Populations That Are
Identified as Likely to be at Greater Risk of Adverse Health
Effects?
J. What Are the Uncertainties in the Baseline, Risk, Benefit,
and Cost Estimates for the Proposed LT2ESWTR as well as the Quality
and Extent of the Information?
K. What Is the Benefit/Cost Determination for the Proposed
LT2ESWTR?
L. Request for Comment
VII. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
1. Summary of UMRA Requirements
2. Written Statement for Rules With Federal Mandates of $100
Million or More
a. Authorizing Legislation
b. Cost-benefit Analysis
c. Estimates of Future Compliance Costs and Disproportionate
Budgetary Effects
d. Macro-economic Effects
e. Summary of EPA Consultation With State, local, and Tribal
Governments and Their Concerns
f. Regulatory Alternatives Considered
g. Selection of the Least Costly, Most Cost-effective, or Least
Burdensome Alternative That Achieves the Objectives of the Rule
3. Impacts on Small Governments
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation and Coordination With
Indian Tribal Governments
G. Executive Order 13045: Protection of Children from
Environmental Health and Safety Risks
H. Executive Order 13211: Actions that Significantly Affect
Energy Supply, Distribution, or Use
I. National Technology Transfer and Advancement Act
J. Executive Order 12898: Federal Actions to Address
Environmental Justice in Minority Populations or Low-Income
Populations
K. Consultations With the Science Advisory Board, National
Drinking Water Advisory Council, and the Secretary of Health and
Human Services
L. Plain Language
VIII. References
I. Summary
A. Why Is EPA Proposing the LT2ESWTR?
EPA is proposing the Long Term 2 Enhanced Surface Water Treatment
Rule (LT2ESWTR) to provide for increased protection against microbial
pathogens in public water systems that use surface water sources. The
proposed LT2ESWTR focuses on Cryptosporidium, which is a protozoan
pathogen that is widespread in surface water. EPA is particularly
concerned about Cryptosporidium because it is highly resistant to
inactivation by standard disinfection practices like chlorination.
Ingestion of Cryptosporidium oocysts can cause acute gastrointestinal
illness, and health effects in sensitive subpopulations may be severe,
including risk of mortality. Cryptosporidium has been identified as the
pathogenic agent in a number of waterborne disease outbreaks across the
U.S. and in Canada (details in section II).
The intent of the LT2ESWTR is to supplement existing microbial
treatment requirements for systems where additional public health
protection is needed. Currently, the Interim Enhanced Surface Water
Treatment Rule (IESWTR) requires large systems that filter to remove at
least 99% (2 log) of Cryptosporidium (63 FR 69478, December 16, 1998)
(USEPA 1998a). The Long Term 1 Enhanced Surface Water Treatment Rule
(LT1ESWTR) extends this requirement to small systems (67 FR 1812,
January 14, 2002) (USEPA 2002a). Subsequent to promulgating these
regulations, EPA has evaluated significant new data on Cryptosporidium
infectivity, occurrence, and treatment (details in section III). These
data indicate that current treatment requirements achieve adequate
protection for the majority of systems, but there is a subset of
systems with higher vulnerability to Cryptosporidium where additional
treatment is necessary.
Specifically, national survey data show that average
Cryptosporidium occurrence in filtered systems is lower than previously
estimated. However, these data also demonstrate that Cryptosporidium
concentrations vary widely among systems, and that a fraction of
filtered systems have relatively high levels of source water
Cryptosporidium contamination. Based on this finding, along with new
data suggesting that the infectivity (i.e., virulence) of
Cryptosporidium may be substantially higher than previously understood,
EPA has concluded that the current 2 log removal requirement does not
provide an adequate degree of treatment in filtered systems with the
highest source water Cryptosporidium levels. Consequently, EPA is
proposing targeted additional treatment requirements under the LT2ESWTR
for filtered systems with the highest Cryptosporidium risk.
Under current regulations, unfiltered systems are not required to
provide any treatment for Cryptosporidium. New occurrence data suggest
that typical Cryptosporidium levels in the treated water of unfiltered
systems are substantially higher than in the treated water of filtered
systems. Hence, Cryptosporidium treatment by unfiltered systems is
needed to achieve equivalent public health protection. Recent treatment
studies have allowed EPA to develop criteria for systems to inactivate
Cryptosporidium with ozone, ultraviolet (UV) light, and chlorine
dioxide. As a result, EPA has concluded that it is feasible and
appropriate to propose under the LT2ESWTR that all unfiltered systems
treat for Cryptosporidium.
In addition to concern with Cryptosporidium, the LT2ESWTR proposal
is intended to ensure that systems maintain adequate protection against
microbial pathogens as they take steps to reduce formation of
disinfection byproducts (DBPs). Along with the LT2ESWTR, EPA is also
developing a Stage 2 Disinfection Byproducts Rule (DBPR), which will
further limit allowable levels of trihalomethanes and haloacetic acids.
The proposed LT2ESWTR contains disinfection profiling and benchmarking
requirements to ensure that microbial protection is maintained as
systems comply with the Stage 2 DBPR. Also in the proposed LT2ESWTR are
requirements to limit risk associated with existing uncovered finished
water storage facilities. Uncovered storage facilities are subject to
contamination if not properly managed or treated.
Today's proposed LT2ESWTR reflects consensus recommendations from
the Stage 2 Microbial and Disinfection Byproducts (M-DBP) Federal
Advisory Committee. These recommendations are set forth in the Stage 2
M-DBP Agreement in Principle (65 FR 83015, December 29, 2000) (USEPA
2000a).
B. What Does the LT2ESWTR Proposal Require?
1. Treatment Requirements for Cryptosporidium
EPA is proposing risk-targeted treatment technique requirements for
Cryptosporidium control in filtered systems that are based on a
microbial framework approach. Under this approach, systems that use a
surface water or ground water under the direct influence of surface
water (referred to collectively as surface water systems) will conduct
source water monitoring to determine an average Cryptosporidium
concentration. Based on monitoring results, filtered systems will be
classified in one of four possible risk categories (bins). A filtered
system's bin classification determines the extent of any additional
Cryptosporidium treatment requirements beyond the requirements of
current regulations.
EPA expects that the majority of filtered systems will be
classified in Bin 1, which carries no additional treatment
requirements. Those systems classified in Bins 2-4 will be required to
provide from 1.0 to 2.5 log of treatment (i.e., 90 to 99.7 percent
reduction) for Cryptosporidium in addition to conventional treatment
that complies with the IESWTR or LT1ESWTR (details in section IV.A).
Filtered systems will meet additional Cryptosporidium treatment
requirements by using one or more treatment or control steps from a
``microbial toolbox'' of options (details in section IV.C). In lieu of
monitoring, filtered systems may elect to comply directly with the
treatment requirements of Bin 4.
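The log-credit figures used throughout this proposal convert to percent reduction as 1 - 10^(-log credit); for example, 1.0 log is a 90 percent reduction and 2.5 log is approximately a 99.7 percent reduction. A minimal sketch of that arithmetic (the helper name is illustrative, not part of the rule):

```python
def log_to_percent(log_credit):
    """Percent reduction corresponding to a log10 treatment credit."""
    return (1.0 - 10.0 ** (-log_credit)) * 100.0

# 1.0 log -> 90%, 2.0 log -> 99%, 2.5 log -> ~99.7%, 3.0 log -> 99.9%
for credit in (1.0, 2.0, 2.5, 3.0):
    print(f"{credit} log = {log_to_percent(credit):.2f} percent reduction")
```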
Under the proposed LT2ESWTR, all surface water systems that are not
required to filter (i.e., unfiltered systems) must provide at least 2
log (i.e., 99 percent) inactivation of Cryptosporidium. In addition,
unfiltered systems will monitor for Cryptosporidium in their source
water and must achieve at least 3 log (i.e., 99.9 percent) inactivation
of Cryptosporidium if the mean level exceeds 0.01 oocysts/L.
Alternatively, unfiltered systems may elect to provide 3 log
Cryptosporidium inactivation directly, instead of monitoring. All
requirements established under the Surface Water Treatment Rule (SWTR)
(54 FR 27486, June 29, 1989) (USEPA 1989a) for unfiltered systems will
remain in effect, including 3 log inactivation of Giardia lamblia and 4
log inactivation of viruses. However, the LT2ESWTR proposal requires
that unfiltered systems achieve their overall inactivation requirements
using a
[[Page 47645]]
minimum of two disinfectants (details in section IV.B).
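For unfiltered systems that monitor, the requirement described above reduces to a simple threshold rule on the mean source water concentration. The sketch below is illustrative only (the function name is an assumption, and the full criteria are in section IV.B):

```python
def required_inactivation_log(mean_oocysts_per_liter):
    """Illustrative: an unfiltered system must achieve at least 2 log
    (99 percent) Cryptosporidium inactivation, rising to 3 log
    (99.9 percent) if the mean source water level exceeds
    0.01 oocysts/L."""
    return 3.0 if mean_oocysts_per_liter > 0.01 else 2.0

print(required_inactivation_log(0.005))  # 2.0
print(required_inactivation_log(0.02))   # 3.0
```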
2. Disinfection Profiling and Benchmarking
The purpose of disinfection profiling and benchmarking is to ensure
that when a system makes a significant change to its disinfection
practice, it does not compromise the adequacy of existing microbial
protection. EPA established the disinfection benchmark under the IESWTR
and LT1ESWTR for the Stage 1 M-DBP rules, and the LT2ESWTR proposal
extends disinfection benchmark requirements to apply to the Stage 2 M-
DBP rules.
The proposed profiling and benchmarking requirements are similar to
those promulgated under IESWTR and LT1ESWTR. Systems that meet
specified criteria must prepare disinfection profiles that characterize
current levels of virus and Giardia lamblia inactivation over the
course of one year. Systems with valid operational data from profiling
conducted under the IESWTR or LT1ESWTR are not required to collect
additional data. If a system that is required to prepare a profile
proposes to make a significant change to its disinfection practice, the
system must calculate a disinfection benchmark and must consult with
the State regarding how the proposed change will affect the current
benchmark (details in section IV.D).
3. Uncovered Finished Water Storage Facilities
The proposed LT2ESWTR also includes requirements for systems with
uncovered finished water storage facilities. The IESWTR and LT1ESWTR
require systems to cover all new storage facilities for finished water,
but these rules do not address existing uncovered finished water
storage facilities. Under the LT2ESWTR proposal, systems with uncovered
finished water storage facilities must cover the storage facility or
treat the storage facility discharge to achieve 4 log virus
inactivation unless the State determines that existing risk mitigation
is adequate. Where the State makes such a determination, systems must
develop and implement a risk mitigation plan that addresses physical
access, surface water run-off, animal and bird wastes, and on-going
water quality assessment (details in section IV.E).
C. Will This Proposed Regulation Apply to My Water System?
All community and non-community water systems that use surface
water or ground water under the direct influence of surface water are
affected by the proposed LT2ESWTR.
II. Background
A. What Is the Statutory Authority for the LT2ESWTR?
This section discusses the Safe Drinking Water Act (SDWA or the
Act) sections that direct the development of the LT2ESWTR.
The Act, as amended in 1996, requires EPA to publish a maximum
contaminant level goal (MCLG) and promulgate a national primary
drinking water regulation (NPDWR) with enforceable requirements for any
contaminant that the Administrator determines may have an adverse
effect on the health of persons, is known to occur or there is a
substantial likelihood that the contaminant will occur in public water
systems (PWSs) with a frequency and at levels of public health concern,
and for which in the sole judgment of the Administrator, regulation of
such contaminant presents a meaningful opportunity for health risk
reduction for persons served by PWSs (section 1412 (b)(1)(A)).
MCLGs are non-enforceable health goals, and are to be set at a
level at which no known or anticipated adverse effect on the health of
persons occurs and which allows an adequate margin of safety (sections
1412(b)(4) and 1412(a)(3)). EPA established an MCLG of zero for
Cryptosporidium under the IESWTR (63 FR 69478, December 16, 1998)
(USEPA 1998a). The Agency is not proposing any changes to the current
MCLG for Cryptosporidium.
The Act also requires that at the same time EPA publishes an NPDWR
and MCLG, it must specify in the NPDWR a maximum contaminant level
(MCL) which is as close to the MCLG as is feasible (sections 1412(b)(4)
and 1401(1)(C)). The Agency is authorized to promulgate an NPDWR that
requires the use of a treatment technique in lieu of establishing an
MCL if the Agency finds that it is not economically or technologically
feasible to ascertain the level of the contaminant (sections
1412(b)(7)(A) and 1401(1)(C)). The Act specifies that in such cases,
the Agency shall identify those treatment techniques that would prevent
known or anticipated adverse effects on the health of persons to the
extent feasible (section 1412(b)(7)(A)).
The Agency has concluded that it is not currently economically or
technologically feasible for PWSs to determine the level of
Cryptosporidium in finished drinking water for the purpose of
compliance with a finished water standard (the performance of available
analytical methods for Cryptosporidium is described in section III.C;
the treated water Cryptosporidium levels that the LT2ESWTR will achieve
are described in section IV.A). Consequently, today's proposal for the
LT2ESWTR relies on treatment technique requirements to reduce health
risks from Cryptosporidium in PWSs.
When proposing an NPDWR that includes an MCL or treatment technique,
the Act requires EPA to publish and seek public comment on an analysis
of health risk reduction and cost impacts. This includes an analysis of
quantifiable and nonquantifiable costs and health risk reduction
benefits, incremental costs and benefits of each alternative
considered, the effects of the contaminant upon sensitive
subpopulations (e.g., infants, children, pregnant women, the elderly,
and individuals with a history of serious illness), any increased risk
that may occur as the result of compliance, and other relevant factors
(section 1412 (b)(3)(C)). EPA's analysis of health benefits and costs
associated with the proposed LT2ESWTR is presented in ``Economic
Analysis of the LT2ESWTR'' (USEPA 2003a) and is summarized in section
VI of this preamble. However, the Act does not authorize the
Administrator to use additional health risk reduction and cost
considerations to establish MCL or treatment technique requirements for
the control of Cryptosporidium (section 1412 (b)(6)(C)).
Finally, section 1412 (b)(2)(C) of SDWA requires EPA to promulgate
a Stage 2 Disinfectants and Disinfection Byproducts Rule within 18
months after promulgation of the LT1ESWTR, which occurred on January
14, 2002. Consistent with statutory requirements for risk balancing
(section 1412(b)(5)(B)), EPA will finalize the LT2ESWTR with the Stage
2 DBPR to ensure parallel protection from microbial and DBP risks.
B. What Current Regulations Address Microbial Pathogens in Drinking
Water?
This section summarizes the existing regulations that apply to
control of pathogenic microorganisms in surface water systems. These
rules form the baseline of regulatory protection that will be
supplemented by the LT2ESWTR.
1. Surface Water Treatment Rule
The SWTR (54 FR 27486, June 29, 1989) (USEPA 1989a) applies to all
PWSs using surface water or ground water under the direct influence
(GWUDI) of surface water as sources (Subpart H systems). It established
[[Page 47646]]
MCLGs of zero for Giardia lamblia, viruses, and Legionella, and
includes treatment technique requirements to reduce exposure to
pathogenic microorganisms, including: (1) Filtration, unless specified
avoidance criteria are met; (2) maintenance of a disinfectant residual
in the distribution system; (3) removal and/or inactivation of 3 log
(99.9%) of Giardia lamblia and 4 log (99.99%) of viruses; (4) combined
filter effluent turbidity of 5 nephelometric turbidity units (NTU) as a
maximum and 0.5 NTU at the 95th percentile monthly for treatment plants
using conventional treatment or direct filtration (with separate
standards for other filtration technologies); and (5) watershed
protection and source water quality requirements for unfiltered
systems.
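For conventional and direct filtration plants, the turbidity provision above amounts to a two-part monthly screen: no reading may exceed the 5 NTU maximum, and at least 95 percent of readings must be at or below 0.5 NTU. A hypothetical sketch (function name and input format are assumptions, not regulatory language):

```python
def swtr_turbidity_ok(monthly_ntu):
    """Illustrative SWTR combined filter effluent screen for
    conventional/direct filtration: no measurement above 5 NTU, and at
    least 95 percent of the month's measurements at or below 0.5 NTU."""
    if max(monthly_ntu) > 5.0:
        return False
    within = sum(1 for v in monthly_ntu if v <= 0.5)
    return within / len(monthly_ntu) >= 0.95

print(swtr_turbidity_ok([0.3] * 30))           # compliant month
print(swtr_turbidity_ok([0.3] * 29 + [6.0]))   # single reading over 5 NTU
```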
2. Total Coliform Rule
The Total Coliform Rule (TCR) (54 FR 27544, June 29, 1989) (USEPA
1989b) applies to all PWSs. It established an MCLG of zero for total
and fecal coliform bacteria, and an MCL based on the percentage of
positive samples collected during a compliance period. Coliforms are
used as a screen for fecal contamination and to determine the integrity
of the water treatment process and distribution system. Under the TCR,
no more than 5 percent of distribution system samples collected in any
month may contain coliform bacteria (no more than 1 sample per month
may be coliform positive in those systems that collect fewer than 40
samples per month). The number of samples to be collected in a month is
based on the number of people served by the system.
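The TCR monthly MCL described above is a percentage test with a small-system carve-out; a minimal sketch of that logic (names are illustrative, and this omits the acute fecal coliform provisions of the rule):

```python
def tcr_mcl_exceeded(samples, positives):
    """Illustrative TCR monthly screen: more than 5 percent
    coliform-positive samples exceeds the MCL; a system collecting
    fewer than 40 samples per month exceeds it with more than one
    positive sample."""
    if samples < 40:
        return positives > 1
    return positives / samples > 0.05

print(tcr_mcl_exceeded(30, 2))    # small system, 2 positives
print(tcr_mcl_exceeded(100, 5))   # exactly 5 percent
```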
3. Interim Enhanced Surface Water Treatment Rule
The IESWTR (63 FR 69478, December 16, 1998) (USEPA 1998a) applies
to PWSs serving at least 10,000 people and using surface water or GWUDI
sources. Key provisions established by the IESWTR include the
following: (1) An MCLG of zero for Cryptosporidium; (2) Cryptosporidium
removal requirements of 2 log (99 percent) for systems that filter; (3)
strengthened combined filter effluent turbidity performance standards
of 1.0 NTU as a maximum and 0.3 NTU at the 95th percentile monthly for
treatment plants using conventional treatment or direct filtration; (4)
requirements for individual filter turbidity monitoring; (5)
disinfection benchmark provisions to assess the level of microbial
protection provided as facilities take steps to comply with new DBP
standards; (6) inclusion of Cryptosporidium in the definition of GWUDI
and in the watershed control requirements for unfiltered public water
systems; (7) requirements for covers on new finished water storage
facilities; and (8) sanitary surveys for all surface water systems
regardless of size.
The IESWTR was developed in conjunction with the Stage 1
Disinfectants and Disinfection Byproducts Rule (Stage 1 DBPR) (63 FR
69389; December 16, 1998) (USEPA 1998b), which reduced allowable levels
of certain DBPs, including trihalomethanes, haloacetic acids, chlorite,
and bromate.
4. Long Term 1 Enhanced Surface Water Treatment Rule
The LT1ESWTR (67 FR 1812, January 14, 2002) (USEPA 2002a) builds
upon the microbial control provisions established by the IESWTR for
large systems by extending similar requirements to small systems.
The LT1ESWTR applies to PWSs using surface water or GWUDI as sources
that serve fewer than 10,000 people. Like the IESWTR, the LT1ESWTR
established the following: 2 log (99 percent) Cryptosporidium removal
requirements for systems that filter; individual filter turbidity
monitoring and more stringent combined filter effluent turbidity
standards for conventional and direct filtration plants; disinfection
profiling and benchmarking; inclusion of Cryptosporidium in the
definition of GWUDI and in the watershed control requirements for
unfiltered systems; and the requirement that new finished water storage
facilities be covered.
5. Filter Backwash Recycle Rule
EPA promulgated the Filter Backwash Recycling Rule (FBRR) (66 FR
31085, June 8, 2001) (USEPA 2001a) to increase protection of finished
drinking water supplies from contamination by Cryptosporidium and other
microbial pathogens. The FBRR requirements will reduce the potential
risks associated with recycling contaminants removed during the
filtration process. The FBRR provisions apply to all systems that
recycle, regardless of population served. In general, the provisions
include the following: (1) Recycling systems must return certain
recycle streams prior to the point of primary coagulant addition unless
the State specifies an alternative location; (2) direct filtration
systems recycling to the treatment process must provide detailed
recycle treatment information to the State; and (3) certain
conventional systems that practice direct recycling must perform a one-
month, one-time recycling self-assessment.
C. What Public Health Concerns Does This Proposal Address?
This section presents the basis for the public health concern
associated with Cryptosporidium in drinking water by summarizing
information on Cryptosporidium health effects and outbreaks. This is
followed by a description of the specific areas of public health
concern that remain after implementation of the IESWTR and LT1ESWTR and
that are addressed in the LT2ESWTR proposal. More detailed information
about Cryptosporidium health effects may be found in the following
criteria documents: Cryptosporidium: Human Health Criteria Document
(USEPA 2001b), Cryptosporidium: Drinking Water Advisory (USEPA 2001c),
and Cryptosporidium: Risks for Infants and Children (USEPA 2001d).
1. Introduction
While modern water treatment systems have substantially reduced
waterborne disease incidence, drinking water contamination remains a
significant health risk management challenge. EPA's Science Advisory
Board in 1990 cited drinking water contamination, particularly
contamination by pathogenic microorganisms, as one of the most
important environmental risks (USEPA 1990). This risk is underscored by
information from the Centers for Disease Control and Prevention (CDC)
which indicates that between 1980 and 1998 a total of 419 outbreaks
associated with drinking water were reported, with more than 511,000
estimated cases of disease. A number of agents were implicated in these
outbreaks, including viruses, bacteria, and protozoa, as well as
several chemicals (Craun and Calderon 1996, Levy et al. 1998, Barwick
et al. 2000). The majority of cases were associated with surface water,
and specifically with the 1993 Cryptosporidium outbreak in Milwaukee,
WI with an estimated 403,000 cases (Mac Kenzie et al. 1994). A recent
study by McDonald et al. (2001), which used blood samples from
Milwaukee children collected during and after the 1993 outbreak,
suggests that Cryptosporidium infection, including asymptomatic
infection, was more widespread than might be inferred from the illness
estimates by Mac Kenzie et al. (1994).
It is important to note that the number of identified and reported
outbreaks in the CDC database is believed to substantially understate
the actual incidence of waterborne disease outbreaks and cases (Craun
and
[[Page 47647]]
Calderon 1996, National Research Council 1997). This underreporting is
due to a number of factors. Many people experiencing gastrointestinal
illness do not seek medical attention. Where medical attention is
provided, the pathogenic agent may not be identified through routine
testing. Physicians often lack sufficient information to attribute
gastrointestinal illness to any specific origin, such as drinking
water, and few States have an active outbreak surveillance program.
Consequently, outbreaks are often not recognized in a community or, if
recognized, are not traced to a drinking water source.
In addition, an unknown but probably significant portion of
waterborne disease is endemic (i.e., isolated cases not associated with
an outbreak) and, thus, is even more difficult to recognize. The
Economic Analysis for the proposed LT2ESWTR (USEPA 2003a) uses data on
Cryptosporidium occurrence, infectivity, and treatment to estimate the
baseline endemic incidence of cryptosporidiosis attributable to
drinking water, as well as the reductions projected as a result of this
rule.
Most waterborne pathogens cause gastrointestinal illness with
diarrhea, abdominal discomfort, nausea, vomiting, and other symptoms.
The effects of waterborne disease are usually acute, resulting from a
single or small number of exposures. Such illnesses are generally of
short duration in healthy people. However, some pathogens, including
Giardia lamblia and Cryptosporidium, may cause disease lasting weeks or
longer in otherwise healthy individuals, though this is not typical for
Cryptosporidium. Waterborne pathogens also cause more serious disorders
such as hepatitis, peptic ulcers, myocarditis, paralysis,
conjunctivitis, swollen lymph glands, meningitis, and reactive
arthritis, and have been associated with diabetes, encephalitis, and
other diseases (Lederberg 1992).
There are populations that are at greater risk from waterborne
disease. These sensitive subpopulations include children (especially
infants), the elderly, the malnourished, pregnant women, the disease
impaired (e.g., diabetes, cystic fibrosis), and a broad category of
those with compromised immune systems, such as AIDS patients, those
with autoimmune disorders (e.g., rheumatoid arthritis, lupus
erythematosus, multiple sclerosis), transplant recipients, and those on
chemotherapy (Rose 1997). This sensitive segment represents almost 20%
of the population in the United States (Gerba et al. 1996). The
severity and duration of illness is often greater in sensitive
subpopulations than in healthy individuals, and in a small percentage
of such cases, death may result.
2. Cryptosporidium Health Effects and Outbreaks
Cryptosporidium is a protozoan parasite that exists in warm-blooded
hosts and, upon excretion, may survive for months in the environment
(Kato et al., 2001). Ingestion of Cryptosporidium can lead to
cryptosporidiosis, a gastrointestinal illness. Transmission of
cryptosporidiosis often occurs through consumption of feces
contaminated food or water, but may also result from direct or indirect
contact with infected persons or animals (Casemore 1990). Surveys
(described in Section III) indicate that Cryptosporidium is common in
surface waters used as drinking water supplies. Sources of
Cryptosporidium contamination include animal agriculture, wastewater
treatment plant discharges, slaughterhouses, birds, wild animals, and
other sources of fecal matter.
EPA is particularly concerned about Cryptosporidium because, unlike
pathogens such as bacteria and most viruses, Cryptosporidium oocysts
are highly resistant to standard disinfectants like chlorine and
chloramines. Consequently, control of Cryptosporidium in most treatment
plants is dependent on physical removal processes. Finished water
monitoring data indicate that Cryptosporidium is sometimes present in
filtered, treated drinking water (LeChevallier et al. 1991; Aboytes et
al. 2002). Moreover, as noted later, many of the individuals sickened
by waterborne outbreaks of cryptosporidiosis were served by filtered
surface water supplies (Solo-Gabriele and Neumeister, 1996). In some
cases, these outbreaks were attributed to treatment deficiencies, while
in other cases the cause was unidentified (see Table II-1).
These data suggest that surface water systems that filter and
disinfect may still be vulnerable to Cryptosporidium, depending on the
source water quality and treatment effectiveness. Today's proposed rule
addresses concern with passage of Cryptosporidium through physical
removal processes during water treatment, as well as in systems lacking
filtration.
a. Health effects. Cryptosporidium infection is characterized by
mild to severe diarrhea, dehydration, stomach cramps, and/or a slight
fever. Symptoms typically last from several days to two weeks, though
in a small percentage of cases, the symptoms may persist for months or
longer in otherwise healthy individuals. Human feeding studies have
demonstrated that a low dose of Cryptosporidium parvum (C. parvum) is
sufficient to cause infection in healthy adults (DuPont et al. 1995,
Chappell et al. 1999, Messner et al. 2001). Studies of immunosuppressed
adult mice have demonstrated that a single viable oocyst can induce
patent C. parvum infections (Yang et al. 2000).
There is evidence that an immune response to Cryptosporidium
exists, but the degree and duration of this immunity is not well
characterized. In a study by Chappell et al. (1999), individuals with a
blood serum antibody (IgG), which can develop from exposure to C.
parvum, demonstrated immunity to low doses of oocysts. The
investigators found the ID50 dose (i.e., dose that infects 50% of the
challenged population) of one C. parvum isolate for adult volunteers
who had pre-existing serum IgG to be 1,880 oocysts in comparison to 132
oocysts for individuals reported as serologically negative. However,
the implications of these data for studies of Cryptosporidium
infectivity are unclear. Earlier work did not observe a correlation
between the development of antibodies after Cryptosporidium exposure
and subsequent protection from illness (Okhuysen et al. 1998). A
subsequent investigation by Muller et al. (2001) observed serological
responses to Cryptosporidium antigens in samples from individuals
reported by Chappell et al. as serologically negative.
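One common way to interpret the reported ID50 values is through the exponential dose-response model frequently used in microbial risk assessment; the model choice here is an assumption made for illustration only, not the analysis used by the cited investigators:

```python
import math

def p_infection(dose, id50):
    """Exponential dose-response, P = 1 - exp(-r*dose), with r chosen so
    that P(ID50) = 0.5 (i.e., r = ln(2)/ID50). Assumed model, for
    illustration only."""
    r = math.log(2.0) / id50
    return 1.0 - math.exp(-r * dose)

# ID50 values reported by Chappell et al. (1999): 1,880 oocysts for
# volunteers with pre-existing serum IgG, 132 for seronegative volunteers
for id50 in (132.0, 1880.0):
    print(f"ID50 {id50:.0f}: P(infection | 30 oocysts) = "
          f"{p_infection(30.0, id50):.3f}")
```

Under this assumed model, the same low dose carries a markedly lower infection probability for the seropositive group, consistent with the immunity interpretation discussed above.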
Cryptosporidium parvum was first recognized as a human pathogen in
1976 (Juranek 1995). Cases of illness from Cryptosporidium were rarely
reported until 1982 when documented disease incidence increased due to
the AIDS epidemic (Current 1983). As laboratory diagnostic techniques
improved during subsequent years, outbreaks among immunocompetent
persons were recognized as well. Human, cattle, dog and deer types of
C. parvum have been found in healthy individuals (Ong et al. 2002,
Morgan-Ryan et al. 2002). Other Cryptosporidium species (C. felis, C.
meleagridis, and possibly C. muris) have infected healthy individuals,
primarily children (Xiao et al. 2001, Chalmers et al. 2002, Katsumata
et al. 2000). Cross-species infection occurs. The human type of C.
parvum (now named C. hominis (Morgan-Ryan et al. 2002)) has infected a
dugong and monkeys (Spano et al. 1998). The cattle type of C. parvum
infects humans, wild animals, and other livestock, such as sheep, goats
and deer (Ong et al. 2002).
As noted earlier, there are sensitive populations that are at
greater risk from pathogenic microorganisms.
[[Page 47648]]
Cryptosporidiosis symptoms in immunocompromised subpopulations are much
more severe, including debilitating voluminous diarrhea that may be
accompanied by severe abdominal cramps, weight loss, and low grade
fever (Juranek 1995). Mortality is a significant threat to the
immunocompromised infected with Cryptosporidium, and the duration and
severity of the disease are significant: whereas 1 percent of the
immunocompetent population may be hospitalized with very little risk of
mortality, Cryptosporidium infections are associated with a high rate
of mortality in the immunocompromised (Rose 1997).
A follow-up study of the 1993 Milwaukee, WI outbreak reported that
at least 50 Cryptosporidium-associated deaths occurred among the
severely immunocompromised (Hoxie et al. 1997).
b. Waterborne cryptosporidiosis outbreaks. Cryptosporidium has
caused a number of waterborne disease outbreaks since 1984 when the
first one was reported in the U.S. Table II-1 lists reported outbreaks
in community water systems (CWS) and non-community water systems
(NCWS). Between 1984 and 1998, nine outbreaks caused by Cryptosporidium
were reported in the U.S., with approximately 421,000 associated cases
of illness (CDC 1993, 1996, 1998, 2000, and 2001). Solo-Gabriele
and Neumeister (1996) characterized water supplies associated with U.S.
outbreaks of cryptosporidiosis. They determined that almost half of the
outbreaks were associated with ground water (untreated or chlorinated
springs and wells), but that the majority of affected individuals were
served by filtered surface water supplies (rivers and lakes). They
found that during outbreaks involving treated spring or well water, the
chlorination systems were apparently operating satisfactorily, with a
measurable chlorine residual.
Although the occurrence of Cryptosporidium in U.S. drinking water
supplies has been substantiated by data collected during outbreak
investigations, the source and density of oocysts associated with the
outbreak have not always been detected or reported. Furthermore,
because of limitations and uncertainties of the immunofluorescence
assay (IFA) method used in earlier studies, negative results in source
or finished water during these outbreaks do not necessarily mean that
there were no oocysts in the water at the time of sampling.
Table II-1.--Outbreaks Caused by Cryptosporidium in Public Water Systems: 1984-1998
----------------------------------------------------------------------------------------------------------------
Year State Cases System Deficiency Source
----------------------------------------------------------------------------------------------------------------
1984......................... TX 117 CWS 3 Well.
1987......................... GA 13,000 CWS 3 River.
1991......................... PA 551 NCWS 3 Well.
1992......................... OR \1\ CWS 3 Spring.
1992......................... OR \1\ CWS 3 River.
1993......................... NV 103 CWS 5 Lake.
1993......................... WI 403,000 CWS 3 Lake.
1994......................... WA 134 CWS 2 Well.
1998......................... TX 1,400 CWS 3 Well.
----------------------------------------------------------------------------------------------------------------
\1\ Total estimated cases for the two 1992 Oregon outbreaks were 3,000; the locations were nearby and the cases
overlapped in time.
Definitions of deficiencies: (1) untreated surface water; (2) untreated ground water; (3) treatment
deficiency (e.g., temporary interruption of disinfection, chronically inadequate disinfection, and inadequate
or no filtration); (4) distribution system deficiency (e.g., cross connection, contamination of water mains
during construction or repair, and contamination of a storage facility); and (5) unknown or miscellaneous
deficiency.
3. Remaining Public Health Concerns Following the IESWTR and LT1ESWTR
This section presents the areas of remaining public health concern
following implementation of the IESWTR and LT1ESWTR that EPA proposes
to address in the LT2ESWTR. These are as follows: (a) Adequacy of
physical removal to control Cryptosporidium and the need for risk based
treatment requirements; (b) control of Cryptosporidium in unfiltered
systems; and (c) uncovered finished water storage facilities.
EPA recognized each of these issues as a potential public health
concern during development of the IESWTR, but could not address them at
that time due to the absence of key data. Accordingly, this section
begins with a description of how EPA considered these issues during
development of the IESWTR, including the data gaps that were identified
at that time. This is followed by a statement of the extent to which
new information has filled these data gaps, thereby allowing EPA to
address these public health concerns in the LT2ESWTR proposal.
a. Adequacy of physical removal to control Cryptosporidium and the
need for risk based treatment requirements. A question that received
significant consideration during development of the IESWTR is whether
physical removal by filtration plants provides adequate protection
against Cryptosporidium in drinking water, or whether certain systems
should be required to provide inactivation of Cryptosporidium based on
source water pathogen levels. As discussed in the proposal, notice of
data availability (NODA), and final IESWTR, EPA and stakeholders
concluded that data available during IESWTR development were not
adequate to support risk based inactivation requirements for
Cryptosporidium. However, the Agency maintained that a risk based
approach to Cryptosporidium control would be considered for the
LT2ESWTR when data collected under the Information Collection Rule were
available and other critical information needs had been addressed.
The IESWTR proposal (59 FR 38832, July 29, 1994) (USEPA 1994)
included two treatment alternatives, labeled B and C, that specifically
addressed Cryptosporidium. Under Alternative B, the level of required
treatment would be based on the density of Cryptosporidium in the
source water. The proposal noted concerns with this approach, though,
due to uncertainty in the risk associated with Cryptosporidium and the
feasibility of achieving higher treatment levels through disinfection.
Consequently, EPA also proposed Alternative C, which would require 2
log (99%) removal of Cryptosporidium by filtration. This was based on
the determination that 2 log Cryptosporidium removal is feasible using
conventional treatment.
In the 1996 Information Collection Rule (61 FR 24354, May 14, 1996)
(USEPA 1996a), EPA concluded that the analytical method prescribed for
measuring Cryptosporidium was
[[Page 47649]]
adequate for making national occurrence estimates, but would not
suffice for making site specific source water density estimates. This
finding further contributed to the rationale supporting Alternative C
under the proposed IESWTR.
The NODA for the IESWTR (62 FR 59498, Nov. 3, 1997) (USEPA 1997a)
presented the recommendations of the Stage 1 MDBP Federal Advisory
Committee for the IESWTR. As stated in the NODA, the Committee engaged
in extensive discussions regarding the adequacy of relying solely on
physical removal to control Cryptosporidium and the need for
inactivation. There was an absence of consensus on whether it was
possible at that time to adequately measure Cryptosporidium
inactivation efficiencies for various disinfection technologies. This
was a significant impediment to addressing inactivation in the IESWTR.
However, the Committee recognized that inactivation requirements may be
necessary under future regulatory scenarios, as shown by the following
consensus recommendation from the Stage 1 M-DBP Agreement in Principle:
EPA should issue a risk based proposal of the Final Enhanced
Surface Water Treatment Rule for Cryptosporidium embodying the
multiple barrier approach (e.g., source water protection, physical
removal, inactivation, etc.), including, where risks suggest
appropriate, inactivation requirements (62 FR 59557, Nov. 3, 1997)
(USEPA 1997a).
The preamble to the final IESWTR (63 FR 69478, Dec. 16, 1998)
(USEPA 1998a) states that EPA was unable to consider the proposed
Alternative B (treatment requirements for Cryptosporidium based on
source water occurrence levels) for the IESWTR because occurrence data
from the Information Collection Rule survey and related analysis were
not available in time to meet the statutory promulgation deadline. The
Agency affirmed, though, that further control of Cryptosporidium would
be addressed in the LT2ESWTR.
In today's notice, EPA is proposing a risk based approach for
control of Cryptosporidium in drinking water. Under this approach, the
required level of additional Cryptosporidium treatment relates to the
source water pathogen density. EPA believes many of the data gaps that
prevented the adoption of this approach under the IESWTR have been
addressed. As described in Section III of this preamble, information on
Cryptosporidium occurrence from the Information Collection Rule and
Information Collection Rule Supplemental Surveys, along with new data
on Cryptosporidium infectivity, have provided EPA with a better
understanding of the magnitude and distribution of risk for this
pathogen. Improved analytical methods allow for a more accurate
assessment of source water Cryptosporidium levels, and recent
disinfection studies with UV, ozone, and chlorine dioxide provide the
technical basis to support Cryptosporidium inactivation requirements.
b. Control of Cryptosporidium in unfiltered systems. There is
particular concern about Cryptosporidium in the source waters of
unfiltered systems because this pathogen has been shown to be resistant
to conventional disinfection practices. In the IESWTR, EPA extended
watershed control requirements for unfiltered systems to include the
control of Cryptosporidium. EPA did not establish Cryptosporidium
treatment requirements for unfiltered systems because available data
suggested an equivalency of risk in filtered and unfiltered systems.
This is described in the final IESWTR as follows:
it appears that unfiltered water systems that comply with the source
water requirements of the SWTR have a risk of cryptosporidiosis
equivalent to that of a water system with a well operated filter
plant using a water source of average quality (63 FR 69492, Dec. 16,
1998) (USEPA 1998a)
The Agency noted that data from the Information Collection Rule
would provide more information on Cryptosporidium levels in filtered
and unfiltered systems, and that Cryptosporidium treatment requirements
would be re-evaluated when these data became available.
In today's notice, EPA is proposing Cryptosporidium inactivation
requirements for unfiltered systems. These proposed requirements stem
from an assessment of Cryptosporidium source water occurrence in both
filtered and unfiltered systems using data from the Information
Collection Rule and other surveys, as described in Section III of this
preamble. These new data do not support the finding described in the
IESWTR of equivalent risk in filtered and unfiltered systems. Rather,
Cryptosporidium treatment by unfiltered systems is necessary to achieve
a finished water risk level equivalent to that of filtered systems. In
addition, the development of Cryptosporidium inactivation criteria for
UV, ozone, and chlorine dioxide in the LT2ESWTR has made it feasible
for unfiltered systems to provide Cryptosporidium treatment.
c. Uncovered finished water storage facilities. In the IESWTR
proposal, EPA solicited comment on a requirement that systems cover
finished water storage facilities to reduce the potential for
contamination by pathogens and hazardous chemicals. Potential sources
of contamination to uncovered storage facilities include airborne
chemicals, runoff, animal carcasses, animal or bird droppings, and
growth of algae and other aquatic organisms (59 FR 38832, July 29,
1994) (USEPA 1994).
The final IESWTR established a requirement to cover all new storage
facilities for finished water for which construction began after
February 16, 1999 (63 FR 69493, Dec. 16, 1998) (USEPA 1998a). In
the preamble to the final IESWTR, EPA described future regulation of
existing uncovered finished water storage facilities as follows:
EPA needs more time to collect and analyze additional
information to evaluate regulatory impacts on systems with existing
uncovered reservoirs on a national basis . . . EPA will further
consider whether to require the covering of existing reservoirs
during the development of subsequent microbial regulations when
additional data and analysis to develop the national costs of
coverage are available.
EPA continues to be concerned about contamination resulting from
uncovered finished water storage facilities, particularly the potential
for virus contamination via bird droppings, and now has sufficient data
to estimate national cost implications for various regulatory control
strategies. Therefore, EPA is proposing control measures for all
systems with uncovered finished water storage facilities in the
LT2ESWTR. New data and proposed requirements are described in section
IV.E of this preamble.
D. Federal Advisory Committee Process
In March 1999, EPA reconvened the M-DBP Federal Advisory Committee
to develop recommendations for the Stage 2 DBPR and LT2ESWTR. The
Committee consisted of organizational members representing EPA, State
and local public health and regulatory agencies, local elected
officials, Indian Tribes, drinking water suppliers, chemical and
equipment manufacturers, and public interest groups. Technical support
for the Committee's discussions was provided by a technical workgroup
established by the Committee at its first meeting. The Committee's
activities resulted in the collection and evaluation of substantial new
information related to key elements for both rules. This included new
data on pathogenicity, occurrence, and treatment of microbial
contaminants, specifically including Cryptosporidium, as well as new
data on DBP health risks, exposure, and control. New information
relevant to the
[[Page 47650]]
LT2ESWTR is summarized in Section III of this proposal.
In September 2000, the Committee signed an Agreement in Principle
reflecting the consensus recommendations of the group. The Agreement
was published in a December 29, 2000 Federal Register notice (65 FR
83015, December 29, 2000) (USEPA 2000a). The Agreement is divided into
Parts A & B. The entire Committee reached consensus on Part A, which
contains provisions that directly apply to the Stage 2 DBPR and
LT2ESWTR. The full Committee, with the exception of one member, agreed
to Part B, which has recommendations for future activities by EPA in
the areas of distribution systems and microbial water quality criteria.
The Committee reached agreement on the following major issues
discussed in this notice and the proposed Stage 2 DBPR:
LT2ESWTR: (1) Additional Cryptosporidium treatment based on source
water monitoring results; (2) Filtered systems that must comply with
additional Cryptosporidium treatment requirements may choose from a
``toolbox'' of treatment and control options; (3) Reduced monitoring
burden for small systems; (4) Future monitoring to confirm source water
quality assessments; (5) Cryptosporidium inactivation by all unfiltered
systems; (6) Unfiltered systems meet overall inactivation requirements
using a minimum of 2 disinfectants; (7) Development of criteria and
guidance for UV disinfection and other toolbox options; (8) Cover or
treat existing uncovered finished water reservoirs (i.e., storage
facilities) or implement risk mitigation plans.
Stage 2 DBPR: (1) Compliance calculation for total trihalomethanes
(TTHM) and five haloacetic acids (HAA5) revised from a running annual
average (RAA) to a locational running annual average (LRAA); (2)
Compliance carried out in two phases of the rule; (3) Performance of an
Initial Distribution System Evaluation; (4) Continued importance of
simultaneous compliance with DBP and microbial regulations; (5)
Unchanged MCL for bromate.
III. New Information on Cryptosporidium Health Risks and Treatment
The purpose of this section is to describe information related to
health risks and treatment of Cryptosporidium in drinking water that
has become available since EPA developed the IESWTR. Much of this
information was evaluated by the Stage 2 M-DBP Federal Advisory
Committee when considering whether and to what degree existing
microbial standards should be revised to protect public health. It
serves as a basis for the recommendations made by the Advisory
Committee and for provisions in today's proposed rule. This section
begins with an overview of critical factors that EPA considers when
evaluating regulation of microbial pathogens. New information is then
presented on three key topics: Cryptosporidium infectivity, occurrence,
and treatment.
A. Overview of Critical Factors for Evaluating Regulation of Microbial
Pathogens
When proposing a national primary drinking water regulation that
includes a maximum contaminant level or treatment technique, SDWA
requires EPA to analyze the health risk reduction benefits and costs
likely to result from alternative regulatory levels that are being
considered. For assessing risk, EPA follows the paradigm described by
the National Academy of Science (NRC, 1983) which involves four steps:
(1) Hazard identification, (2) dose-response assessment, (3) exposure
assessment, and (4) risk characterization. The application of these
steps to microbial pathogens is briefly described in this section,
followed by a summary of how EPA estimates the health benefits and
costs of regulatory alternatives.
Hazard identification for microbial pathogens is a description of
the nature, severity, and duration of the health effects stemming from
infection. Under SDWA, EPA must consider health effects on the general
population and on subpopulations that are at greater risk of adverse
health effects. See section II.C.2 of this preamble for health effects
associated with Cryptosporidium.
Dose-response assessment with microorganisms is commonly termed
infectivity and is a description of the relationship between the number
of pathogens ingested and the probability of infection. Information on
Cryptosporidium infectivity is presented in section III.B of this
preamble.
Exposure to microbial pathogens in drinking water is generally a
function of the concentration of the pathogen in finished water and the
volume of water ingested (exposure also occurs through secondary routes
involving infected individuals). Because it is difficult to directly
measure pathogens at the low levels typically present in finished
water, EPA's information on pathogen exposure is primarily derived from
surveys of source water occurrence. EPA estimates the concentration of
pathogens in treated water by combining source water pathogen
occurrence data with information on the performance of treatment plants
in reducing pathogen levels. Data on the occurrence of Cryptosporidium
are described in section III.C of this preamble and in Occurrence and
Exposure Assessment for the LT2ESWTR (USEPA 2003b). Cryptosporidium
treatment studies are described in section III.D of this preamble.
Risk characterization is the culminating step of the risk
assessment process. It is a description of the nature and magnitude of
risk, and characterizes strengths, weaknesses, and attendant
uncertainties of the assessment. EPA's risk characterization for
Cryptosporidium is described in Economic Analysis for the LT2ESWTR
(USEPA 2003a).
Estimating the health benefits and costs that would result from a
new regulatory requirement involves a number of steps, including
evaluating the efficacy and cost of treatment strategies to reduce
exposure to the contaminant, forecasting the number of systems that
would implement different treatment strategies to comply with the
regulatory standard, and projecting the reduction in exposure to the
contaminant and consequent health risk reduction benefits stemming from
regulatory compliance. EPA's estimates of health benefits and costs
associated with the proposed LT2ESWTR are presented in Economic
Analysis for the LT2ESWTR (USEPA 2003a) and are summarized in section
VI of this preamble.
B. Cryptosporidium Infectivity
This section presents information on the infectivity of
Cryptosporidium oocysts. Infectivity relates the probability of
infection by Cryptosporidium with the number of oocysts that a person
ingests, and it is used to predict the disease burden associated with
different Cryptosporidium levels in drinking water. Information on
Cryptosporidium infectivity comes from dose-response studies where
healthy human subjects ingest different numbers of oocysts and are
subsequently evaluated for signs of infection and illness.
Data from a human dose-response study of one Cryptosporidium
isolate (the IOWA study, conducted at the University of Texas-Houston
Health Science Center) had been published prior to the IESWTR (DuPont
et al. 1995). Following IESWTR promulgation, a study of two additional
isolates (TAMU and UCP) was completed and published (Okhuysen et al.
1999). This study also presented a
[[Page 47651]]
reanalysis of the IOWA study results. As described in more detail later
in this section, this new study indicates that the infectivity of
Cryptosporidium oocysts varies over a wide range. The UCP oocysts
appeared less infective than those of the IOWA study while the TAMU
oocysts were much more infective. Although the occurrence of these
isolates among environmental oocysts is unknown, a meta-analysis of
these data conducted by EPA suggests the overall infectivity of
Cryptosporidium may be significantly greater than was estimated for the
IESWTR (USEPA 2003a).
This section begins with a description of the infectivity data
considered for the IESWTR. This is followed by a presentation of
additional data that have been evaluated for the proposed LT2ESWTR and
a characterization of the significance of these new data.
1. Cryptosporidium Infectivity Data Evaluated for IESWTR
Data from the IOWA study (DuPont et al. 1995) were evaluated for
the IESWTR. In that study, 29 individuals were given single doses
ranging from 30 oocysts to 1 million oocysts. This oocyst isolate was
originally obtained from a naturally infected calf. Seven persons
received doses above 500, and all were infected. Eleven of the
twenty-two individuals receiving doses of 500 or fewer were classified as
infected based on oocysts detected in stool samples.
The IOWA study data were analyzed using an exponential dose-
response model established by Haas et al. (1996) for Cryptosporidium:
    Probability(Infection | Dose) = 1 - e^(-Dose/k)
Based on the maximum likelihood estimate of k (238), the
probability of infection from ingesting a single oocyst (1/k) is
approximately 0.4% (4 persons infected for every 1,000 who each ingest
one oocyst). Based on the same estimate, the dose at which 50% of
persons become infected (known as the median infectious dose or ID50)
is 165.
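The two derived quantities can be verified with a short calculation (a sketch using only the maximum likelihood estimate k = 238 quoted above):

```python
import math

# Exponential dose-response model for Cryptosporidium (Haas et al. 1996):
# P(infection | dose) = 1 - exp(-dose / k)
K = 238.0  # maximum likelihood estimate of k from the IOWA study data

def p_infection(dose, k=K):
    """Probability of infection for a given ingested oocyst dose."""
    return 1.0 - math.exp(-dose / k)

# Probability of infection from a single oocyst (approximately 1/k)
print(round(p_infection(1), 4))   # 0.0042, i.e., about 0.4%

# Median infectious dose (ID50): the dose at which P(infection) = 0.5,
# which for this model is k * ln(2)
id50 = K * math.log(2)
print(round(id50))                # 165
```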
2. New Data on Cryptosporidium Infectivity
A study of two additional Cryptosporidium isolates was conducted at
the University of Texas-Houston Health Science Center (Okhuysen et al.
1999). One of the isolates (UCP) was originally collected from
naturally infected calves. The other isolate (TAMU) was originally
collected from a veterinary student who became infected during necropsy
on an infected foal.
The TAMU and UCP studies were conducted with 14 and 17 subjects,
respectively. Because thousands of oocysts per gram of stool can go
undetected, researchers elected to use both stool test results and
symptoms as markers of infection (only stool test results had been used
for the IOWA study). Under this definition, two additional IOWA
subjects were regarded as having been infected. As shown in Table III-
1, all but two of the TAMU subjects were presumed infected and all but
six of the UCP subjects were presumed infected following ingestion of
the indicated oocyst doses.
Table III-1.--Cryptosporidium Parvum Infectivity in Healthy Adult
Volunteers
------------------------------------------------------------------------
                                                 Number of      Number
      Isolate and dose (# of oocysts)          subjects \1\  infected \1\
------------------------------------------------------------------------
IOWA:
    30..........................................      5             2
    100.........................................      8             4
    300.........................................      3             2
    500.........................................      6             5
    1,000.......................................      2             2
    10,000......................................      3             3
    100,000.....................................      1             1
    1,000,000...................................      1             1
TAMU:
    10..........................................      3             2
    30..........................................      3             2
    100.........................................      3             3
    500.........................................      5             5
UCP:
    500.........................................      5             3
    1,000.......................................      3             2
    5,000.......................................      5             2
    10,000......................................      4             4
------------------------------------------------------------------------
\1\ The two right columns list the number of subjects belonging to each
  category.
EPA conducted a meta-analysis of these results in which the three
isolates were considered as a random sample (of size three) from a
larger population of environmental oocysts (Messner et al. 2001). This
meta-analysis was reviewed by the Science Advisory Board (SAB). In
written comments from a December 2001 meeting of the Drinking Water
Committee, SAB members recommended the following: (1) two assumed
infectivity distributions (of parameter r = 1/k as logit normal and
logit-t) should be used in order to characterize uncertainty and (2)
EPA should consider excluding the UCP data set because it seems to be
an outlier (see Section VII.K). In response, EPA has used the two
recommended distributions for infectivity and has conducted the meta-
analysis both with and without the UCP data due to uncertainty about
whether it is appropriate to exclude these data.
Table III-2 presents meta-analysis estimates of the probability of
infection given one oocyst ingested. Results are shown for the four
different analysis conditions (logit normal and logit-t distributions; with
and without UCP data) as well as a combined result derived by sampling
equally from each distribution. A more complete description of the
infectivity analysis is provided in Economic Analysis for the LT2ESWTR
(USEPA 2003a).
Table III-2.--Risk of Infection, Given One Oocyst Ingested
------------------------------------------------------------------------
                                              Probability of infection,
          Basis for analysis                     one oocyst ingested
------------------------------------------- ----------------------------
                                                           80% Credible
   Studies used        Distributional model     Mean         interval
------------------------------------------------------------------------
IOWA, TAMU, and UCP... Normal.............      0.07       0.007-0.19
IOWA, TAMU, and UCP... Student's t (3df)\1\     0.09       0.015-0.20
IOWA and TAMU......... Normal.............      0.09       0.011-0.23
IOWA and TAMU......... Student's t (3df)\1\     0.10       0.014-0.25
------------------------------------------------------------------------
Equal mix of the four above...............      0.09       0.011-0.22
------------------------------------------------------------------------
\1\ Student's t distribution with 3 degrees of freedom (3df).
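The "equal mix" step can be sketched as a simple Monte Carlo calculation: draw logit(r) from each assumed distribution, pool the draws equally, and average. The (mu, sigma) values below are illustrative placeholders, NOT the fitted parameters from the EPA meta-analysis:

```python
import math
import random

random.seed(1)

def inv_logit(x):
    """Map a logit-scale draw back to a probability r in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sample_t3():
    """Student's t with 3 df as z / sqrt(chi-square_3 / 3)."""
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(3))
    return z / math.sqrt(chi2 / 3.0)

# Four analysis conditions (distribution family, mu, sigma on the logit
# scale). Parameter values are placeholders for illustration only.
conditions = [
    ("normal", -2.6, 1.0),
    ("t3",     -2.6, 1.0),
    ("normal", -2.4, 1.0),
    ("t3",     -2.4, 1.0),
]

# Sample equally from each condition and pool the draws
draws = []
for kind, mu, sigma in conditions:
    for _ in range(25000):
        x = random.gauss(0, 1) if kind == "normal" else sample_t3()
        draws.append(inv_logit(mu + sigma * x))

mean_r = sum(draws) / len(draws)  # mean single-oocyst infection probability
print(round(mean_r, 3))
```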
[[Page 47652]]
The results in Table III-2 show that the mean probability of
infection from ingesting a single infectious oocyst ranges from 7% to
10% depending on the assumptions used. In comparison, the best estimate
in the IESWTR of this probability was 0.4%, based on the IOWA isolate
alone, and using the earlier definition of infection. Thus, these data
suggest that both the range and magnitude of Cryptosporidium
infectivity is higher than was estimated in the final IESWTR.
It should be noted that although significantly more data on
Cryptosporidium infectivity are available now than when EPA established
the IESWTR, there remains uncertainty about this parameter in several
areas. It is unknown how well the oocysts used in the feeding studies
represent Cryptosporidium naturally occurring in the environment, and
the analyses do not fully account for variability in host
susceptibility and the effect of previous infections. Furthermore, the
sample sizes are relatively small, and the confidence bands on the
estimates span more than an order of magnitude. Another limitation is
that none of the studies included doses below 10 oocysts, whereas
exposure through drinking water typically involves ingestion of only a
single oocyst.
3. Significance of New Infectivity Data
The new infectivity data reveal that oocysts vary greatly in their
ability to infect human hosts. Moreover, due to this variability and
the finding of a highly infectious isolate, TAMU, the overall
population of oocysts appears to be more infective than assumed for the
IESWTR. The meta-analysis described earlier indicates the probability
of infection at low Cryptosporidium concentrations may be about 20
times as great as previously estimated (which was based on the IOWA
isolate alone and using the earlier definition of infection (stool-
confirmed infections)).
C. Cryptosporidium Occurrence
This section presents information on the occurrence of
Cryptosporidium oocysts in drinking water sources. Occurrence
information is important because it is used in assessing the risk
associated with Cryptosporidium in both filtered and unfiltered
systems, as well as in estimating the costs and benefits of the
proposed LT2ESWTR.
For the IESWTR, EPA had no national survey data and relied instead
on several studies that were local or regional. Those data suggested
that a typical (median) filtered surface water source had approximately
2 Cryptosporidium oocysts per liter, while a typical unfiltered surface
water source had about 0.01 oocysts per liter, a difference of two
orders of magnitude.
Subsequent to promulgating the IESWTR, EPA obtained data from two
national surveys: the Information Collection Rule and the Information
Collection Rule Supplemental Surveys (ICRSS). These surveys were
designed to provide improved estimates of occurrence on a national
basis. As described in more detail later in this section, the
Information Collection Rule and ICRSS results show three main
differences in comparison to Cryptosporidium occurrence data used for
the IESWTR:
(1) Average Cryptosporidium occurrence is lower. Median oocyst
levels for the Information Collection Rule and ICRSS data are
approximately 0.05/L, which is more than an order of magnitude lower
than IESWTR estimates.
(2) Cryptosporidium occurrence is more variable from location to
location than was shown by the data considered for the IESWTR. This
indicates that although median occurrence levels are below those
assumed for the IESWTR, there is a subset of systems whose levels
are considerably greater than the median.
(3) There is a smaller difference in Cryptosporidium levels
between typical filtered and unfiltered system water sources. The
Information Collection Rule data do not support the IESWTR finding
that unfiltered water systems have a risk of cryptosporidiosis
equivalent to that of a filter plant with average quality source
water.
This section begins with a summary of occurrence data that were
used to assess risk under the IESWTR (these data were also used in the
main risk assessment for the LT1ESWTR). This is followed by a
discussion of the Information Collection Rule and ICRSS that covers the
scope of the surveys, analytical methods, results, and a
characterization of how these new data impact current understanding of
Cryptosporidium exposure. A more detailed description of occurrence
data is available in Occurrence and Exposure Assessment for the Long
Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003b).
1. Occurrence Data Evaluated for IESWTR
Occurrence information evaluated for the IESWTR is detailed in
Occurrence and Exposure Assessment for The Interim Enhanced Surface
Water Treatment Rule (USEPA 1998c). This information is summarized in
the next two paragraphs.
a. Filtered systems. In developing the IESWTR, EPA evaluated
Cryptosporidium occurrence data from a number of studies. Among these
studies, LeChevallier and Norton (1995) produced the largest data set
and data from this study were used for the IESWTR risk assessment. This
study provided estimates of mean occurrence at 69 locations from the
eastern and central U.S. Although limited by the small number of
samples per site (one to sixteen samples; most sites were sampled five
times), variation within and between sites appeared to be lognormal.
The study's median measured source water concentration was 2.31
oocysts/L and the interquartile range (i.e., 25th and 75th percentile)
was 1.03 to 5.15 oocysts/L.
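The lognormal characterization can be sanity-checked against the reported quantiles: for a lognormal distribution, the quartiles sit symmetrically about the median on a log scale. A short check using only the summary statistics quoted above:

```python
import math

# Reported source water summary statistics (LeChevallier and Norton 1995)
median = 2.31       # oocysts/L
q25, q75 = 1.03, 5.15

# If concentrations are lognormal, ln(concentration) is normal, so
# ln(q75) - ln(median) should approximately equal ln(median) - ln(q25).
upper_gap = math.log(q75) - math.log(median)
lower_gap = math.log(median) - math.log(q25)
print(round(upper_gap, 3), round(lower_gap, 3))  # ~0.80 each

# Implied log-scale standard deviation, using the 0.6745 standard normal
# quartile factor
sigma = (upper_gap + lower_gap) / 2.0 / 0.6745
print(round(sigma, 2))
```

The near-equal gaps support the study's observation that variation appeared lognormal.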
b. Unfiltered systems. To assess Cryptosporidium occurrence in
unfiltered systems under the IESWTR, EPA evaluated Cryptosporidium
monitoring results from several unfiltered water systems that had been
summarized by the Seattle Water Department (Montgomery Watson, 1995).
The median (central tendency) of these data was approximately 0.01
oocysts/L. Thus, the median concentration in this data set was about 2
orders of magnitude less than the median concentration in the data set
used for filtered systems. These data, coupled with the assumption that
filtered systems will remove at least 2 log of Cryptosporidium as
required by the IESWTR, suggested that unfiltered systems that comply
with the source water requirements of the SWTR may have a risk of
cryptosporidiosis equivalent to that of a filter plant using a water
source of average quality (62 FR 59507, November 3, 1997) (USEPA
1997a).
2. Overview of the Information Collection Rule and Information
Collection Rule Supplemental Surveys (ICRSS)
The Information Collection Rule and the Information Collection Rule
Supplemental Surveys (ICRSS) were national monitoring studies. They
were designed to provide EPA with a more comprehensive understanding of
the occurrence of microbial pathogens in drinking water sources in
order to support regulatory decision making. The surveys attempted to
control protozoa measurement error through requiring that (1)
laboratories meet certain qualification criteria, (2) standardized
methods be used to collect data, and (3) laboratories analyze
performance evaluation samples throughout the duration of the study to
ensure adequate analytical performance. Information Collection Rule
monitoring took place from July 1997 to December 1998; ICRSS
Cryptosporidium monitoring
[[Page 47653]]
began in March 1999 and ended in February 2000.
a. Scope of the Information Collection Rule. The Information
Collection Rule (61 FR 24354, May 14, 1996) (USEPA 1996a) required
large PWSs to collect water quality and treatment data related to DBPs
and microbial pathogens over an 18-month period. PWSs using surface
water or ground water under the direct influence of surface water as
sources and serving at least 100,000 people were required to monitor
their raw water monthly for Cryptosporidium, Giardia, viruses, total
coliforms, and E. coli. Approximately 350 plants monitored for
microbial parameters.
b. Scope of the ICRSS. The ICRSS were designed to complement the
Information Collection Rule data set with data from systems serving
fewer than 100,000 people and by employing an improved analytical
method for protozoa (described later). The ICRSS included 47 large
systems (serving greater than 100,000 people), 40 medium systems
(serving 10,000 to 100,000 people) and 39 small systems (serving fewer
than 10,000 people). Medium and large systems conducted 1 year of
twice-per-month sampling for Cryptosporidium, Giardia, temperature,
pH, turbidity, and coliforms. Other water quality measurements were
taken once a month. Small systems did not test for protozoa but tested
for all other water quality parameters.
3. Analytical Methods for Protozoa in the Information Collection Rule
and ICRSS
This subsection describes analytical methods for Cryptosporidium
that were used in the Information Collection Rule and ICRSS.
Information on Cryptosporidium analytical methods is important for the
LT2ESWTR for several reasons: (1) It is relevant to the quality of
Cryptosporidium occurrence data used to assess risk and economic impact
of the LT2ESWTR proposal, (2) it provides a basis for the statistical
procedures employed to analyze the occurrence data, and (3) it is used
to assess the adequacy of Cryptosporidium methods to support source-
specific decisions under the LT2ESWTR.
The Information Collection Rule and ICRSS data sets were generated
using different analytical methods. The Information Collection Rule
Protozoan Method (ICR Method) was used to analyze water samples for
Cryptosporidium during the Information Collection Rule. For the ICRSS,
a similar but improved method, EPA Method 1622 (later 1623), was used
for protozoa analyses (samples were analyzed for Cryptosporidium using
Method 1622 for the first 4 months; then Method 1623 was implemented so
that Giardia concentrations could also be measured).
a. Information Collection Rule Protozoan Method. With the
Information Collection Rule Method (USEPA 1996b), samples were
collected by passing water through a filter, which was then delivered
to an EPA-approved Information Collection Rule laboratory for analysis.
The laboratory eluted the filter, centrifuged the eluate, and separated
Cryptosporidium oocysts and Giardia cysts from other debris by density-
gradient centrifugation. The oocysts and cysts were then stained and
counted. Differential interference contrast (DIC) microscopy was used
to examine internal structures.
The Information Collection Rule Method provided a quantitative
measurement of Cryptosporidium oocysts and Giardia cysts, but it is
believed to have generally undercounted the actual occurrence
(modeling, described later, adjusted for undercounting). This
undercounting was due to low volumes analyzed and low method recovery.
The volume analyzed directly influences the sensitivity of the
analytical method and the Information Collection Rule Method did not
require a specific volume analyzed. As a result, sample volumes
analyzed during the Information Collection Rule varied widely,
depending on the water matrix and analyst discretion, with a median
volume analyzed of only 3 L.
Method recovery characterizes the likelihood that an oocyst present
in the original sample will be counted. Loss of organisms may occur at
any step of the analytical process, including filtration, elution,
concentration of the eluate, and purification of the concentrate. To
assess the performance of the Information Collection Rule Method, EPA
implemented the Information Collection Rule Laboratory Spiking Program.
This program involved collection of duplicate samples on two dates from
70 plants. On each occasion, one of the duplicate samples was spiked
with a known quantity of Giardia cysts and Cryptosporidium oocysts (the
quantity was unknown to the laboratory performing the analysis), and
both samples were processed according to the method. Recovery of spiked
Cryptosporidium oocysts ranged from 0% to 65% with a mean of 12% and a
standard deviation nearly equal to the mean (relative standard
deviation (RSD) approximately 100%) (Scheller et al. 2002).
b. Method 1622 and Method 1623. EPA developed Method 1622 (detects
Cryptosporidium) and 1623 (detects Cryptosporidium and Giardia) to
achieve higher recovery rates and lower inter- and intra-laboratory
variability than previous methods. These methods incorporate
improvements in the concentration, separation, staining, and microscope
examination procedures. Specific improvements include the use of more
effective filters, immunomagnetic separation (IMS) to separate the
oocysts and cysts from extraneous materials present in the water
sample, and the addition of 4, 6-diamidino-2-phenylindole (DAPI) stain
for microscopic analysis. The performance of these methods was tested
through single-laboratory studies and validated through multiple-
laboratory validation (round robin) studies.
The per-sample volume analyzed for Cryptosporidium during the ICRSS
was larger than in the Information Collection Rule, due to a
requirement that laboratories analyze a minimum of 10 L or 2 mL of
packed pellet with Methods 1622/23 (details in section IV.K). To assess
method recovery, matrix spike samples were analyzed on five sampling
events for each plant. The protozoa laboratory spiked the additional
sample with a known quantity of Cryptosporidium oocysts and Giardia
cysts (the quantity was unknown to the laboratory performing the
analysis) and filtered and analyzed both samples using Methods 1622/23.
Recovery in the ICRSS matrix spike study averaged 43% for
Cryptosporidium with an RSD of 47% (Connell et al. 2000). Thus, mean
Cryptosporidium recovery with Methods 1622/23 under the ICRSS was more
than 3.5 times higher than mean recovery in the Information Collection
Rule lab spiking program and relative standard deviation was reduced by
more than half.
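The comparison drawn above can be checked directly from the reported summary statistics. This is a simple illustrative calculation, not new data.

```python
# Illustrative check of the comparison stated above, using only the
# summary statistics reported in the text: 12% mean recovery with
# ~100% RSD for the Information Collection Rule method, versus 43%
# mean recovery with 47% RSD for Methods 1622/23 under the ICRSS.
icr_mean_recovery, icr_rsd = 0.12, 100.0
icrss_mean_recovery, icrss_rsd = 0.43, 47.0

recovery_ratio = icrss_mean_recovery / icr_mean_recovery
rsd_fraction = icrss_rsd / icr_rsd

print(round(recovery_ratio, 2))  # 3.58 -- "more than 3.5 times higher"
print(rsd_fraction)              # 0.47 -- "reduced by more than half"
```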
Although Methods 1622 and 1623 have several advantages over the
Information Collection Rule method, they also have some of the same
limitations. These methods do not determine whether a cyst or oocyst is
viable and infectious, and both methods require a skilled microscopist
and several hours of sample preparation and analyses.
4. Cryptosporidium Occurrence Results from the Information Collection
Rule and ICRSS
This section describes Cryptosporidium monitoring results from the
Information Collection Rule and ICRSS. The focus of this discussion is
the national distribution of mean Cryptosporidium occurrence levels in
the sources of filtered and unfiltered plants.
[[Page 47654]]
The observed (raw, unadjusted) Cryptosporidium data from the
Information Collection Rule and ICRSS do not accurately characterize
true concentrations because of (a) the low and variable recovery of the
analytical method, (b) the small volumes analyzed, and (c) the
relatively small number of sample events. EPA employed a statistical
treatment to estimate the true underlying occurrence that led to the
data observed in the surveys and to place uncertainty bounds about that
estimation.
A hierarchical model with Bayesian parameter estimation techniques
was used to separately analyze filtered and unfiltered system data from
the Information Collection Rule and the large and medium system data
from the ICRSS. The model included parameters for location, month,
source water type, and turbidity. Markov Chain Monte Carlo methods were
used to estimate these parameters, producing a large number of estimate
sets that represent uncertainty. This analysis is described more
completely in Occurrence and Exposure Assessment for the Long Term 2
Enhanced Surface Water Treatment Rule (USEPA 2003b).
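The role of such a model can be conveyed with a deliberately simplified sketch. The code below infers a posterior mean concentration from a single observed count under a flat prior and a Poisson count assumption; it is a one-parameter illustration of the inference concept only, not EPA's hierarchical model, which included location, month, source water type, and turbidity terms and was fit with Markov Chain Monte Carlo.

```python
# Highly simplified sketch of the idea behind the statistical treatment
# described above: infer a source water concentration from counts that
# are depressed by small sample volumes and low method recovery.
# Assumptions (ours, for illustration): flat prior on [0, 5] oocysts/L,
# observed count ~ Poisson(conc * volume * recovery).
import math

def posterior_mean_conc(count: int, volume_L: float, recovery: float,
                        grid_max: float = 5.0, n: int = 5000) -> float:
    """Posterior mean concentration (oocysts/L) via grid approximation."""
    concs = [grid_max * (i + 0.5) / n for i in range(n)]
    weights = []
    for c in concs:
        lam = c * volume_L * recovery
        # Unnormalized Poisson likelihood of the observed count
        weights.append(math.exp(-lam) * lam ** count)
    total = sum(weights)
    return sum(c * w for c, w in zip(concs, weights)) / total

# Even a non-detect (count = 0) in a 3 L sample at 12% recovery remains
# consistent with substantial source water concentrations.
print(posterior_mean_conc(count=0, volume_L=3.0, recovery=0.12))
```

The result for a non-detect depends strongly on the assumed prior, which is precisely why the hierarchical model pools information across locations, months, and water quality covariates.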
a. Information Collection Rule results. Figure III-1 presents
plant-mean Cryptosporidium levels for Information Collection Rule
plants as a cumulative distribution. Included in Figure III-1 are
distributions of both the observed raw data adjusted for mean
analytical method recovery of 12% and the modeled estimate of the
underlying distribution, along with 90% confidence bounds. The two
distributions (observed and modeled) are similar for plants where
Cryptosporidium was detected (196 of 350 Information Collection Rule
plants did not detect Cryptosporidium in any source water samples). The
modeled distribution allows for estimation of Cryptosporidium
concentrations in sources where oocysts may have been present but were
not detected due to low sample volume and poor method recovery (this
concept is explained further later in this section).
[GRAPHIC: Figure III-1; TIFF TP11AU03.000 omitted]
The results shown in Figure III-1 indicate that mean
Cryptosporidium levels among Information Collection Rule plants vary
widely, with many plants having relatively little contamination and a
fraction of plants with elevated source water pathogen levels. The
median and 90th percentile estimates of Information Collection Rule
plant-mean Cryptosporidium levels are 0.048 and 1.3 oocysts/L,
respectively. These levels are lower than Cryptosporidium occurrence
estimates used in the IESWTR (USEPA 1998c), and the distribution of
Information
[[Page 47655]]
Collection Rule data is broader (i.e., more source-to-source
variability). Also, the occurrence of Cryptosporidium in flowing stream
sources was greater and more variable than in reservoir/lake sources
(shown in USEPA 2003b).
The fact that only 44% of Information Collection Rule plants had
one or more samples positive for Cryptosporidium and that only 7% of
all Information Collection Rule samples were positive for
Cryptosporidium suggests that oocyst levels were relatively low in many
source waters. However, as noted earlier, it is expected that
Cryptosporidium oocysts were present in many more source waters at the
time of sampling and were not detected due to poor analytical method
recovery and low sample volumes.
This concept is illustrated by Figure III-2, which shows the
likelihood of no oocysts being detected by the Information Collection
Rule method as a function of source water concentration (assumes median
Information Collection Rule sample volume of 3 L). As can be seen in
Figure III-2, when the source water concentration is 1 oocyst/L, which
is a relatively high level, the probability of no oocysts being
detected in a 3 L sample is 73%; for a source water with 0.1 oocyst/L,
which is close to the median occurrence level, the probability of a
non-detect is 97%. Consequently, EPA has concluded that it is
appropriate and necessary to use a statistical model to estimate the
underlying distribution.
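The non-detect probabilities discussed above can be approximated with a simple Poisson model. The sketch below assumes Poisson-distributed counts, a 3 L sample, and the 12% mean recovery reported earlier; it is not necessarily the exact model behind Figure III-2 and reproduces the cited percentages only approximately.

```python
# Sketch of the non-detect probability concept illustrated by Figure
# III-2. Assumption (ours): the number of oocysts counted is Poisson
# with mean conc * volume * recovery.
import math

def p_non_detect(conc_per_L: float, volume_L: float, recovery: float) -> float:
    """P(zero oocysts counted) under the Poisson assumption."""
    return math.exp(-conc_per_L * volume_L * recovery)

# 3 L sample at 12% mean recovery:
#   1 oocyst/L   -> ~0.70 (the text cites 73%)
#   0.1 oocyst/L -> ~0.96 (the text cites 97%)
print(p_non_detect(1.0, 3.0, 0.12))
print(p_non_detect(0.1, 3.0, 0.12))
```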
EPA modeled Cryptosporidium occurrence separately for filtered and
unfiltered plants that participated in the Information Collection Rule
because unfiltered plants comply with different regulatory requirements
than filtered plants. As shown in Table III-3, the occurrence of
Cryptosporidium was lower for unfiltered sources.
[GRAPHIC: Figure III-2; TIFF TP11AU03.001 omitted]

[[Page 47656]]
Table III-3.--Summary of Information Collection Rule Cryptosporidium
Modeled Source Water Data for Unfiltered and Filtered Plants
------------------------------------------------------------------------
                      Information Collection Rule modeled plant-mean
                                       (oocysts/L)
     Source          ---------------------------------------------------
                         Mean         Median        90th percentile
------------------------------------------------------------------------
Unfiltered.........      0.014        0.0079             0.033
Filtered...........      0.59         0.052              1.4
------------------------------------------------------------------------
The median Cryptosporidium occurrence level for unfiltered systems
in the Information Collection Rule was 0.0079 oocysts/L, which is close
to the median level of 0.01 oocysts/L reported for unfiltered systems
in the IESWTR (Montgomery Watson, 1995). However, the Information
Collection Rule data do not show the 2 log difference in median
Cryptosporidium levels between filtered and unfiltered systems that was
observed for the data used in the IESWTR. The ratio of median plant-
mean occurrence in unfiltered plants to filtered plants is about 1:7
(see Table III-3). Thus, based on an assumption of a minimum 2 log
removal of Cryptosporidium by filtration plants (as required by the
IESWTR and LT1ESWTR), these data indicate that, on average, finished
water oocyst levels are higher in unfiltered systems than in filtered
systems.
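The reasoning above can be made concrete with a short worked calculation. The sketch below uses the Table III-3 median plant-mean values, credits filtered plants with only the minimum 2 log (100-fold) removal, and does not credit any Cryptosporidium inactivation that unfiltered systems may achieve through disinfection; it is an illustration of the comparison, not an EPA risk estimate.

```python
# Worked sketch of the finished water comparison described above.
# Source water values are median plant-mean levels from Table III-3;
# 2 log = 100-fold is the minimum removal required of filtration
# plants under the IESWTR and LT1ESWTR.
unfiltered_source = 0.0079  # oocysts/L
filtered_source = 0.052     # oocysts/L

filtered_finished = filtered_source / 10**2  # minimum 2 log removal
unfiltered_finished = unfiltered_source      # no physical removal credited

print(round(filtered_finished, 5))                     # 0.00052 oocysts/L
print(round(unfiltered_finished / filtered_finished))  # ~15-fold higher
```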
b. ICRSS results. Figures III-3 and III-4 present plant-mean
Cryptosporidium levels for ICRSS medium and large systems,
respectively, as cumulative distributions. Medium and large system data
were analyzed separately to identify differences between the two data
sets. Similar to the Information Collection Rule data plot, Figures
III-3 and III-4 include distributions for both the observed raw data
adjusted for mean analytical method recovery of 43% and the modeled
estimate of the underlying distribution, along with 90% confidence
bounds. The observed and modeled distributions are similar for the 85%
of ICRSS plants that detected Cryptosporidium, and the modeled
distribution allows for estimation of Cryptosporidium concentrations
for source waters where oocysts may have been present but were not
detected.
Plant-mean Cryptosporidium concentrations for large and medium
systems in the ICRSS are similar at the mid and lower range of the
distribution and differ at the upper end. ICRSS medium and large
systems both had median plant-mean Cryptosporidium levels of
approximately 0.05 oocysts/L, which is close to the median oocyst level
in the Information Collection Rule data set as well. However, the 90th
percentile plant-mean was 0.33 oocysts/L for ICRSS medium systems and
0.24 oocysts/L for ICRSS large systems. Note that in the Information
Collection Rule distribution, the 90th percentile Cryptosporidium
concentration is 1.3 oocysts/L, which is significantly higher than
either the ICRSS medium or large system distribution.
The reasons for different results between the surveys are not well
understood, but may stem from year-to-year variation in occurrence,
systematic differences in the sampling or measurement methods employed,
and differences in the populations sampled. This topic is discussed
further at the end of this section.
[[Page 47657]]
[GRAPHIC: Figure III-3; TIFF TP11AU03.002 omitted]
[[Page 47658]]
[GRAPHIC: Figure III-4; TIFF TP11AU03.003 omitted]
5. Significance of New Cryptosporidium Occurrence Data
The Information Collection Rule and ICRSS data substantially
improve overall knowledge of the occurrence distribution of
Cryptosporidium in drinking water sources. They provide data on many
more water sources than were available when the IESWTR was developed
and the data are of more uniform quality. In regard to filtered
systems, these new data demonstrate two points:
(1) The occurrence of Cryptosporidium in many drinking water
sources is lower than was indicated by the data used in IESWTR.
Median plant-mean levels for the Information Collection Rule and
ICRSS data sets are approximately 0.05 oocysts/L, whereas the median
oocyst concentration in the LeChevallier and Norton (1995) data used
in the IESWTR risk assessment was 2.3 oocysts/L.
(2) Cryptosporidium occurrence is more variable from plant to
plant than was indicated by the data considered for the IESWTR
(i.e., occurrence distribution is broader). This is illustrated by
considering the ratio of the 90th percentile to the median plant-
mean concentration. In the LeChevallier and Norton (1995) data used
for the IESWTR, this ratio was 4.6, whereas in the Information
Collection Rule data, this ratio is 27.
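The spread ratio cited above follows directly from the reported percentiles. The calculation below uses the Information Collection Rule values from the text (90th percentile 1.3 and median 0.048 oocysts/L); the 4.6 ratio for the LeChevallier and Norton (1995) data is quoted as reported, since its underlying percentiles are not given here.

```python
# Illustrative computation of the 90th-percentile-to-median spread
# ratio for the Information Collection Rule plant-mean distribution.
icr_p90 = 1.3      # oocysts/L, 90th percentile plant-mean
icr_median = 0.048 # oocysts/L, median plant-mean

icr_ratio = icr_p90 / icr_median
print(round(icr_ratio, 1))  # 27.1, versus 4.6 for the IESWTR data set
```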
These data, therefore, support the finding that Cryptosporidium
levels are relatively low in most water sources, but there is a subset
of sources with relatively higher concentrations where additional
treatment may be appropriate.
In regard to unfiltered plants, the Information Collection Rule
data are consistent with the Cryptosporidium occurrence estimates for
unfiltered systems in the IESWTR. However, due to the lower occurrence
estimates for filtered systems noted previously, the Information
Collection Rule data do not support the IESWTR finding that unfiltered
water systems in compliance with the source water requirements of the
SWTR have a risk of cryptosporidiosis equivalent to that of a well-
operated filter plant using a water source of average quality (63 FR
69492, December 16, 1998) (USEPA 1998a). Rather, these data indicate
that Agency conclusions regarding the risk comparison between
unfiltered and filtered drinking waters must be revised. For protection
equivalent to that provided by filtered systems, unfiltered systems
must take additional steps to strengthen their microbial barriers.
6. Request for Comment on Information Collection Rule and ICRSS Data
Sets
EPA notes that there are significant differences in the Information
Collection Rule and ICRSS medium and large system data sets. The median
values for these data sets are 0.048, 0.050, and 0.045 oocysts/L,
respectively, while the 90th percentile values are 1.3, 0.33, and 0.24
oocysts/L. The reasons for these differences are not readily apparent.
The ICRSS used a newer method with better quality control that yields
significantly higher recovery, and this suggests that these data are more
[[Page 47659]]
reliable for estimating concentrations at individual plants. However,
the Information Collection Rule included a much larger number of plants
(350 v. 40 each for the ICRSS medium and large system surveys) and,
consequently, may be more reliable for estimating occurrence
nationally. The surveys included a similar number of samples per plant
(18 v. 24 in the ICRSS). The two surveys cover different time periods
(7/97-12/98 for the Information Collection Rule and 3/99-2/00 for the
ICRSS).
In order to better understand the factors that may account for the
differences in the three data sets, EPA conducted several additional
analyses. First, EPA compared results for the subset of 40 plants that
were in both the Information Collection Rule and ICRSS large system
surveys. The medians for the two data sets were 0.13 and 0.045 oocysts/
L, respectively, while the 90th percentiles were 1.5 and 0.24 oocysts/
L. Clearly, the discrepancy between the two surveys persists for the
subsample of data from plants that participated in both surveys. This
suggests that the different sample groups in the full data sets are not
the primary factor that accounts for the different results.
Next, EPA looked at the six month period (July through December)
that was sampled in two consecutive years (1997 and 1998) during the
Information Collection Rule survey to investigate year-to-year
variations at the same plants. Estimated medians for 1997 and 1998 were
0.062 and 0.040 oocysts/L, respectively, while the 90th percentiles
were 1.1 and 1.3 oocysts/L. While these comparisons show some interyear
variability, it is less than the variability observed between the
Information Collection Rule and ICRSS data sets. EPA has no data
comparing the same plants using the same methods for the time periods
in question (1997-98 and 1999-2000) so it is not known if the variation
between these time periods was larger than the apparent variation
between 1997 and 1998 in the Information Collection Rule data set.
The choice of data set has a significant effect on exposure, cost,
and benefit estimates for the LT2ESWTR. Due to the lack of any clear
criterion for favoring one data set over the other, EPA has conducted
the analyses for this proposed rule separately for each, and presents a
range of estimates based on the three data sets. EPA requests comment
on this approach. EPA will continue to evaluate the relative strengths
and limitations of the three data sets, as well as any new data that
may become available for the final rule.
D. Treatment
1. Overview
This section presents information on treatment processes for
reducing the risk from Cryptosporidium in drinking water. Treatment
information is critical to two aspects of the LT2ESWTR: (1) estimates
of the efficiency of water filtration plants in removing
Cryptosporidium are used in assessing risk in treated drinking water
and (2) the performance and availability of treatment technologies like
ozone, UV light, and membranes that effectively inactivate or remove
Cryptosporidium impact the feasibility of requiring additional
treatment for this pathogen.
The majority of plants treating surface water use conventional
filtration treatment, which is defined in 40 CFR 141.2 as a series of
processes including coagulation, flocculation, sedimentation, and
filtration. Direct filtration, which is typically used on sources with
low particulate levels, includes coagulation and filtration but not
sedimentation. Other common filtration processes are slow sand,
diatomaceous earth (DE), membranes, and bag and cartridge filters.
For the IESWTR (and later the LT1ESWTR), EPA evaluated results from
pilot and full scale studies of Cryptosporidium removal by various
types of filtration plants. Based on these studies, EPA concluded that
conventional and direct filtration plants meeting IESWTR filter
effluent turbidity standards will achieve a minimum 2 log (99%) removal
of Cryptosporidium. The Agency reached the same conclusion for slow
sand and DE filtration plants meeting SWTR turbidity standards.
Treatment credit for technologies like membranes and bag and cartridge
filters was to be made on a product-specific basis.
Subsequent to promulgating the IESWTR and LT1ESWTR, EPA has
reviewed additional studies of the performance of treatment plants in
removing Cryptosporidium, as well as other micron size particles (e.g.,
aerobic spores) that may serve as indicators of Cryptosporidium
removal. As discussed later in this section, the Agency has concluded
that these studies support an estimate of 3 log (99.9%) for the average
Cryptosporidium removal efficiency of conventional treatment plants in
compliance with the IESWTR or LT1ESWTR. Section IV.A describes how this
estimate of average removal efficiency is used in determining the need
for additional Cryptosporidium treatment under the LT2ESWTR. Further,
this estimate is consistent with the Stage 2 M-DBP Agreement in
Principle, which states as follows:
The additional treatment requirements in the (LT2ESWTR) bin
requirement table are based, in part, on the assumption that
conventional treatment plants in compliance with the IESWTR achieve
an average of 3 logs removal of Cryptosporidium.
In addition, the Agency finds that available data support an
estimate of 3 log average Cryptosporidium removal for well operated
slow sand and DE plants. Direct filtration plants are estimated to
achieve a 2.5 log average Cryptosporidium reduction, in consideration
of the absence of a sedimentation process in these plants.
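The log removal figures used throughout this section follow a standard convention: n log removal corresponds to a 10^n-fold reduction in concentration. The minimal sketch below converts the log credits discussed above into percent removal.

```python
# Minimal sketch of the log removal convention used in this section:
# n log removal = 10**n-fold reduction = 100 * (1 - 10**-n) percent.
def percent_removal(log_removal: float) -> float:
    return 100.0 * (1.0 - 10.0 ** -log_removal)

print(round(percent_removal(2.0), 2))  # 99.0  (minimum, conventional/direct)
print(round(percent_removal(2.5), 2))  # 99.68 (direct filtration average)
print(round(percent_removal(3.0), 2))  # 99.9  (conventional/slow sand/DE average)
```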
The most significant developments in the treatment of
Cryptosporidium since IESWTR promulgation are in the area of
inactivation. During IESWTR development, EPA determined that available
data were not sufficient to identify criteria for awarding
Cryptosporidium treatment credit for any disinfectant. As presented in
section IV.C.14, EPA has now acquired the necessary data to specify the
disinfectant concentrations and contact times necessary to achieve
different levels of Cryptosporidium inactivation with chlorine dioxide
and ozone. Additionally, recent studies have demonstrated that UV light
will produce high levels of Cryptosporidium and Giardia lamblia
inactivation at low doses. Section IV.C.15 provides criteria for
systems to achieve credit for disinfection of Cryptosporidium, Giardia
lamblia, and viruses by UV.
This section begins with a summary of treatment information
considered for the IESWTR and LT1ESWTR, followed by a discussion of
additional data that EPA has evaluated since promulgating those
regulations. Further information on treatment of Cryptosporidium is
available in Technologies and Costs for Control of Microbial
Contaminants and Disinfection Byproducts (USEPA 2003c), Occurrence and
Exposure Assessment for the Long Term 2 Enhanced Surface Water
Treatment Rule (USEPA 2003b) and section IV.C of this preamble.
2. Treatment Information Considered for the IESWTR and LT1ESWTR
Treatment studies that were evaluated during development of the
IESWTR are described in the IESWTR NODA (62 FR 59486, November 3, 1997)
(USEPA 1997a), the Regulatory Impact Analysis for the IESWTR (USEPA
1998d), and Technologies and Costs for the Microbial Recommendations of
the M/DBP Advisory Committee (USEPA 1997b). Treatment information
considered in development of the
[[Page 47660]]
LT1ESWTR is described in the proposed rule (65 FR 19046, April 10,
2000) (USEPA 2000b). Pertinent information is summarized in the
following paragraphs.
a. Physical removal. EPA evaluated eight studies on removal of
Cryptosporidium by rapid granular filtration for the IESWTR. These were
Patania et al. (1995), Nieminski and Ongerth (1995), Ongerth and
Pecoraro (1995), LeChevallier and Norton (1992), LeChevallier et al.
(1991), Foundation for Water Research (1994), Kelley et al. (1995), and
West et al. (1994). These studies included both pilot and full scale
plants.
Full scale plants in these studies typically demonstrated 2-3 log
removal of Cryptosporidium, and pilot plants achieved up to almost 6
log removal under optimized conditions. In general, the degree of
removal that can be quantified in full scale plants is limited because
Cryptosporidium levels following filtration are often below the
detection limit of the analytical method. Pilot scale studies overcome
this limitation by seeding high concentrations of oocysts to the plant
influent, but extrapolation of the performance of a pilot plant to the
routine performance of full scale plants is uncertain.
Cryptosporidium removal efficiency in these studies was observed to
depend on a number of factors including: water matrix, coagulant
application, treatment optimization, filtered water turbidity, and the
filtration cycle. The highest removal rates were observed in plants
that achieved very low effluent turbidities.
EPA also evaluated studies of Cryptosporidium removal by slow sand
(Schuler and Ghosh 1991, Timms et al. 1995) and DE filtration (Schuler
and Ghosh 1990) for the IESWTR. These studies indicated that a well
designed and operated plant using these processes could achieve 3 log
or greater removal of Cryptosporidium.
After considering these studies, EPA concluded that conventional
and direct filtration plants in compliance with the effluent turbidity
criteria of the IESWTR, and slow sand and DE plants in compliance with
the effluent turbidity criteria established for these processes by the
SWTR, would achieve at least 2 log removal of Cryptosporidium.
Recognizing that many plants will achieve more than the minimum 2 log
reduction, EPA estimated median Cryptosporidium removal among
filtration plants as near 3 log (99.9%) for the purpose of assessing
risk.
The LT1ESWTR proposal included summaries of additional studies of
Cryptosporidium removal by conventional treatment (Dugan et al. 1999),
direct filtration (Swertfeger et al. 1998), and DE filtration (Ongerth
and Hutton 1997). These studies supported IESWTR conclusions stated
previously regarding the performance of these processes. The LT1ESWTR
proposal also summarized studies of membranes, bag filters, and
cartridge filters (Jacangelo et al. 1995, Drozd and Schartzbrod 1997,
Hirata and Hashimoto 1998, Goodrich et al. 1995, Collins et al. 1996,
Lykins et al. 1994, Adham et al. 1998). This research demonstrated that
these technologies may be capable of achieving 2 log or greater removal
of Cryptosporidium. However, EPA concluded that variation in
performance among different manufacturers and models necessitates that
determinations of treatment credit be made on a technology-specific
basis (65 FR 19065, April 10, 2000) (USEPA 2000b).
b. Inactivation. In the IESWTR NODA (62 FR 59486) (USEPA 1997a),
EPA cited studies that demonstrated that chlorine is ineffective for
inactivation of Cryptosporidium at doses practical for treatment plants
(Korich et al. 1990, Ransome et al. 1993, Finch et al. 1997). The
Agency also summarized studies of Cryptosporidium inactivation by UV,
ozone, and chlorine dioxide. EPA evaluated these disinfectants to
determine if sufficient data were available to develop prescriptive
disinfection criteria for Cryptosporidium.
The studies of UV disinfection of Cryptosporidium that were
available during IESWTR development were inconclusive due to
methodological factors. These studies included: Lorenzo-Lorenzo et al.
(1993), Ransome et al. (1993), Campbell et al. (1995), Finch et al.
(1997), and Clancy et al. (1997). A common limitation among these
studies was the use of in vitro assays, such as excystation and vital
dye staining, to measure loss of infectivity. These assays subsequently
were shown to overestimate the UV dose needed to inactivate protozoa
(Clancy et al. 1998, Craik et al. 2000). In another case, a reactor
vessel that blocked germicidal light was used (Finch et al. 1997).
EPA evaluated the following studies of ozone inactivation of
Cryptosporidium for the IESWTR: Peeters et al. (1989), Korich et al.
(1990), Parker et al. (1993), Ransome et al. (1993), Finch et al.
(1997), Daniel et al. (1993), and Miltner et al. (1997). These studies
demonstrated that ozone could achieve high levels of Cryptosporidium
inactivation, albeit at doses much higher than those required to
inactivate Giardia. Results of these studies also exhibited significant
variability due to factors like different infectivity assays and
methods of dose calculation.
The status of chlorine dioxide inactivation of Cryptosporidium
during IESWTR development was similar to that of ozone. EPA evaluated a
number of studies that indicated that relatively high doses of chlorine
dioxide could achieve significant inactivation of Cryptosporidium
(Peeters et al. 1989, Korich et al. 1990, Ransome et al. 1993, Finch et
al. 1995 and 1997, and LeChevallier et al. 1997). Data from these
studies showed a high level of variability due to methodological
differences, and the feasibility of high chlorine dioxide doses was
uncertain due to the MCL for chlorite that was established by the Stage
1 DBPR.
After reviewing these studies, EPA and the Stage 1 Federal Advisory
Committee concluded that available data were not adequate to award
Cryptosporidium inactivation credit for UV, ozone, or chlorine dioxide.
3. New Information on Treatment for Control of Cryptosporidium
a. Conventional filtration treatment and direct filtration. This
section provides brief descriptions of seven recent studies of
Cryptosporidium removal by conventional treatment and direct
filtration, followed by a summary of key points.
Dugan et al. (2001) evaluated the ability of conventional treatment
to control Cryptosporidium under varying water quality and treatment
conditions, and assessed turbidity, total particle counts (TPC), and
aerobic endospores as indicators of Cryptosporidium removal. Fourteen
runs were conducted on a small pilot scale plant that had been
determined to provide equivalent performance to a larger plant. Under
optimal coagulation conditions, oocyst removal across the sedimentation
basin ranged from 0.6 to 1.8 log, averaging 1.3 log, and removal across
the filters ranged from 2.9 to greater than 4.4 log, averaging greater
than 3.7 log. Removal of aerobic spores, TPC, and turbidity all
correlated with removal of Cryptosporidium by sedimentation, and these
parameters were conservative indicators of Cryptosporidium removal
across filtration. Sedimentation removal under optimal conditions
related to raw water quality, with the lowest Cryptosporidium removals
observed when raw water turbidity was low.
Suboptimal coagulation conditions (underdosed relative to jar test
predictions) significantly reduced plant
[[Page 47661]]
performance. Oocyst removal in the sedimentation basin averaged 0.2
log, and removal by filtration averaged 1.5 log. Under suboptimal
coagulation conditions, low sedimentation removals of Cryptosporidium
were observed regardless of raw water turbidity.
Nieminski and Bellamy (2000) investigated surrogates as indicators
of Giardia and Cryptosporidium in source water and as measures of
treatment plant effectiveness. The study involved sampling for microbial
pathogens (Giardia, Cryptosporidium, and enteric viruses), potential
surrogates (bacteria, bacteria spores, bacterial phages, turbidity,
particles), and other water quality parameters in the source and
finished waters of 23 surface water filtration facilities and one
unfiltered system.
While Giardia and Cryptosporidium were found in the majority of
source water samples, the investigators could not establish a
correlation between either occurrence or removal of these protozoa and
any of the surrogates tested. This was attributed, in part, to low
concentrations of Giardia and Cryptosporidium in raw water and high
analytical method detection limits. Removal of Cryptosporidium and
Giardia averaged 2.2 and 2.6 log, respectively, when conservatively
estimated using detection limits in filtered water. Aerobic spores were
found in 85% of filtered water samples and were considered a measure of
general treatment effectiveness. Average reduction of aerobic spores
was 2.84 log. Direct filtration plants removed fewer aerobic spores
than conventional or softening plants.
McTigue et al. (1998) conducted an on-site survey of 100 treatment
plants for particle counts, pathogens (Cryptosporidium and Giardia),
and operational information. The authors also performed pilot scale
spiking studies. Median removal of particles greater than 2 [mu]m was 2.8
log, with values ranging from 0.04 to 5.5 log. Removal generally
increased with increasing raw water particle concentration. Results
were consistent with previously collected data. Cryptosporidium and
Giardia were found in the majority of raw water sources, but
calculation of their log removal was limited by the concentration
present. River sources had a higher incidence of pathogen occurrence.
Direct filtration plants had higher levels of pathogens in the filtered
water than others in the survey.
Nearly all of the filter runs evaluated in the survey exhibited
spikes where filtered water particle counts increased, and pilot work
showed that pathogens are more likely to be released during these spike
events. Cryptosporidium removal in the pilot scale spiking study
averaged nearly 4 log, regardless of the influent oocyst concentration.
Pilot study results indicated a strong relationship between removal of
Cryptosporidium and removal of particles (>=3 [mu]m) during
runs using optimal coagulation and similar temperatures.
Patania et al. (1999) evaluated removal of Cryptosporidium at
varied raw water and filter effluent turbidity levels using direct
filtration. Runs were conducted with both low (2 NTU) and high (10 NTU)
raw water turbidity. Targeted filtered water turbidity was either 0.02
or 0.05 NTU. At equivalent filtered water turbidity, Cryptosporidium
removal was slightly higher when the raw water turbidity was higher.
Also, Cryptosporidium removal was enhanced by an average of 1.5 log
when steady-state filtered water turbidity was 0.02 NTU compared to
0.05 NTU.
Huck et al. (2000) evaluated filtration efficiency during optimal
and suboptimal coagulation conditions with two pilot scale filtration
plants. One plant employed a high coagulation dose for both total
organic carbon (TOC) and particle removal, and the second plant used a
low dose intended for particle removal only. Under optimal operating
conditions, which were selected to achieve filtered water turbidity
below 0.1 NTU, median Cryptosporidium removal was 5.6 log at the high
coagulant dose plant and 3 log at the low dose plant. Under suboptimal
coagulation conditions, where the coagulant dose was reduced to achieve
filtered water turbidity of 0.2 to 0.3 NTU, median Cryptosporidium
removals dropped to 3.2 log and 1 log at the high dose and low dose
plants, respectively. Oocyst removal also decreased substantially at
the end of the filter cycle, although this was not always indicated by
an increase in turbidity. Runs conducted with no coagulant resulted in
very little Cryptosporidium removal.
Emelko et al. (2000) investigated Cryptosporidium removal during
vulnerable filtration periods using a pilot scale direct filtration
system. The authors evaluated four different operational conditions:
stable, early breakthrough, late breakthrough, and end of run. During
stable operation, effluent turbidity was approximately 0.04 NTU and
Cryptosporidium removal ranged from 4.7 to 5.8 log. In the early
breakthrough period, effluent turbidity increased from approximately
0.04 to 0.2 NTU, and Cryptosporidium removal decreased significantly,
averaging 2.1 log. For the late breakthrough period, where effluent
turbidity began at approximately 0.25 NTU and ended at 0.35 NTU,
Cryptosporidium removal dropped to an average of 1.4 log. Two
experiments tested Cryptosporidium removal during the end-of-run
operation, when effluent turbidities generally start increasing.
Turbidity started at about 0.04 NTU for both experiments and ended at
0.06 NTU for the first experiment and 0.13 NTU for the second. Reported
Cryptosporidium removal ranged from 1.8 to 3.3 log, with an average of
2.5 log for both experiments.
Harrington et al. (2001) studied the removal of Cryptosporidium and
emerging pathogens by filtration, sedimentation, and dissolved air
flotation (DAF) using bench scale jar tests and pilot scale
conventional treatment trains. In the bench scale experiments, all run
at optimized coagulant doses, mean log removal of Cryptosporidium was
1.2 by sedimentation and 1.7 by DAF. Cryptosporidium removal was
similar in all four water sources that were evaluated and was not
significantly affected by lower pH or coagulant aid addition. However,
removal of Cryptosporidium was greater at 22[deg]C than at 5[deg]C, and
was observed to be higher with alum coagulant than with either
polyaluminum hydroxychlorosulfate or ferric chloride.
In the pilot scale experiments, mean log removal of Cryptosporidium
was 1.9 in filtered water with turbidity of 0.2 NTU or less. Removal
increased as filtered water turbidity dropped below 0.3 NTU. There was
no apparent effect of filtration rate on removal efficiency. In
comparing Cryptosporidium removal by sand, dual media (anthracite/
sand), and trimedia (anthracite/sand/garnet) filters, no difference was
observed near neutral pH. However, at pH 5.7, removal increased
significantly in the sand filter and it outperformed the other filter
media configurations. The authors found no apparent explanation for
this behavior. There was no observable effect of a turbidity spike on
Cryptosporidium removal.
Significance of Conventional and Direct Filtration Studies
The performance of treatment plants under current regulations is a
significant factor in determining the need for additional treatment. As
described in section IV.A, the proposed Cryptosporidium treatment
requirements associated with LT2ESWTR risk bins for filtered systems
are based, in part, on an estimate that conventional plants in
compliance with
the IESWTR achieve an average of 3 log Cryptosporidium removal. The
following discussion illustrates why EPA believes that available data
support this estimate.
While Cryptosporidium removal at full scale plants is difficult to
quantify due to limitations with analytical methods, pilot scale
studies show that reductions in aerobic spores and total particle
counts are often conservative indicators of filtration plant removal
efficiency for Cryptosporidium (Dugan et al. 2001, McTigue et al. 1998,
Yates et al. 1998, Emelko et al. 1999 and 2000). Surveys of full scale
plants have reported average reductions near 3 log for both aerobic
spores (Nieminski and Bellamy, 2000) and total particle counts (McTigue
et al. 1998). Consequently, these findings are consistent with an
estimate that average removal of Cryptosporidium by filtration plants
is approximately 3 log.
Pilot scale Cryptosporidium spiking studies (Dugan et al. 2001,
Huck et al. 2000, Emelko et al. 2000, McTigue et al. 1998, Patania et
al. 1995) suggest that a conventional treatment plant has the potential
to achieve greater than 5 log removal of Cryptosporidium under optimal
conditions. However, these high removals are typically observed at very
low filter effluent turbidity values, and the data show that removal
efficiency can decrease substantially over the course of a filtration
cycle or if coagulation is not optimized (Dugan et al. 2001, Huck et
al. 2000, Emelko et al. 2000, Harrington et al. 2001). Removal
efficiency also appears to be impacted by source water quality (Dugan
et al. 2001, McTigue et al. 1998). Given these considerations, EPA
believes that 3 log is a reasonable estimate of average Cryptosporidium
removal efficiency for conventional treatment plants in compliance with
the IESWTR or LT1ESWTR.
The Stage 2 M-DBP Advisory Committee did not address direct
filtration plants, which lack the sedimentation basin of a conventional
treatment train, but recommended that EPA address these plants in the
LT2ESWTR proposal (65 FR 83015, December 29, 2000) (USEPA 2000a). While
some studies have observed similar levels of Cryptosporidium removal in
direct and conventional filtration plants (Nieminski and Ongerth, 1995,
Ongerth and Pecoraro 1995), EPA has concluded that the majority of
available data support a lower estimate of Cryptosporidium removal
efficiency for direct filtration plants.
As described in section IV.C.5, pilot and full scale studies
demonstrate that sedimentation basins, which are absent in direct
filtration, can achieve 0.5 log or greater Cryptosporidium reduction
(Dugan et al. 2001, Patania et al. 1995, Edzwald and Kelley 1998,
Payment and Franco 1993, Kelley et al. 1995). In addition, Patania et
al. (1995) observed direct filtration to achieve less Cryptosporidium
removal than conventional treatment, and McTigue et al. (1998) found a
higher incidence of Cryptosporidium in the treated water of direct
filtration plants. Given these findings, EPA has estimated that direct
filtration plants achieve an average of 2.5 log Cryptosporidium
reduction (i.e., 0.5 log less than conventional treatment).
i. Dissolved air flotation. Dissolved air flotation (DAF) is a
solid-liquid separation process that can be used in conventional
treatment trains in place of gravity sedimentation. DAF takes advantage
of the buoyancy of oocysts by floating oocyst/particle complexes to the
surface for removal. In DAF, air is dissolved in pressurized water,
which is then released into a flotation tank containing flocculated
particles. As the water enters the tank, the dissolved air forms small
bubbles that collide with and attach to floc particles and float to the
surface (Gregory and Zabel, 1990).
In comparing DAF with gravity sedimentation, Plummer et al. (1995)
observed up to 0.81 log removal of oocysts in the gravity sedimentation
process, while DAF achieved 0.38 to 3.7 log removal, depending on
coagulant dose. Edzwald and Kelley (1998) demonstrated a 3 log removal
of oocysts using DAF, compared with a 1 log removal using gravity
sedimentation in the clarification process before filtration. In bench
scale testing by Harrington et al. (2001), DAF averaged 0.5 log higher
removal of Cryptosporidium than gravity sedimentation. Based on these
results, EPA has concluded that a treatment plant using DAF plus
filtration can achieve levels of Cryptosporidium removal equivalent to
or greater than a conventional treatment plant with gravity
sedimentation.
b. Slow sand filtration. Slow sand filtration is a process
involving passage of raw water through a bed of sand at low velocity
(generally less than 0.4 m/h) resulting in substantial particulate
removal by physical and biological mechanisms. For the LT2ESWTR
proposal, EPA has reviewed two additional studies of slow sand
filtration.
Fogel et al. (1993) evaluated removal efficiencies for
Cryptosporidium and Giardia with a full scale slow sand filtration
plant. The removals ranged from 0.1-0.5 log for Cryptosporidium and
0.9-1.4 log for Giardia. Raw water turbidity ranged from 1.3 to 1.6 NTU
and decreased to 0.31-0.35 NTU after filtration. The authors attributed
the low Cryptosporidium and Giardia removals to the relatively poor
grade of filter media and the low water temperature. The sand had a
higher uniformity coefficient than recommended by design standards,
which creates larger pore spaces within the filter bed and reduces its
biological removal capacity. The low water temperature (1 [deg]C) also
decreased biological activity in the filter media.
Hall et al. (1994) examined the removal of Cryptosporidium with a
pilot scale slow sand filtration plant. Cryptosporidium removals ranged
from 2.8 to 4.3 log after filter maturation (at least one week after
filter scraping), with an average of 3.8 log. Raw water turbidity
ranged from 3.0 to 7.5 NTU for three of the four runs and was 15.0 NTU
for the fourth run. Filtered water turbidity was 0.2 to 0.4 NTU, except
for the fourth run, which had 2.5 NTU filtered water turbidity.
included an investigation of Cryptosporidium removal during filter
start-up where the filtration rate was slowly increased over a 4 day
period. Results indicate that filter ripening did not appear to affect
Cryptosporidium removal.
The study by Fogel et al. (1993) is significant because it indicates
that a slow sand filtration plant may achieve less than 2 log removal
of Cryptosporidium while in compliance with the effluent turbidity
requirements of the IESWTR and LT1ESWTR. The authors
attributed this poor performance to the filter being improperly
designed, which, if correct, illustrates the importance of proper
design for removal efficiency in slow sand filters. In contrast, the
study by Hall et al. (1994) supports other work (Schuler and Ghosh
1991, Timms et al. 1995) in finding that slow sand filtration can
achieve Cryptosporidium removal greater than 3 log. Overall, this body
of work appears to show that slow sand filtration has the potential to
achieve Cryptosporidium removal efficiencies similar to that of a
conventional plant, but proper design and operation are critical to
realizing treatment goals.
c. Diatomaceous earth filtration. Diatomaceous earth filtration is
a process in which a precoat cake of filter media is deposited on a
support membrane and additional filter media is continuously added to
the feed water to maintain the permeability of the filter cake. Since
the IESWTR and LT1ESWTR, EPA has reviewed one new study of DE
filtration (Ongerth and Hutton 2001). It supports the findings of
earlier studies (Schuler and Ghosh 1990, Ongerth and Hutton 1997) in
showing that a well designed and operated DE plant can achieve
Cryptosporidium removal equivalent to a conventional treatment plant
(i.e., average of 3 log).
d. Other filtration technologies. In today's proposal, information
about bag filters, cartridge filters, and membranes, including criteria
for awarding Cryptosporidium treatment credit, is presented in section
IV.C as part of the microbial toolbox. Section IV.C also addresses
credit for pretreatment options like presedimentation basins and bank
filtration.
e. Inactivation. Substantial advances in understanding of
Cryptosporidium inactivation by ozone, chlorine dioxide, and UV have
been made following the IESWTR and LT1ESWTR. These advances have
allowed EPA to develop criteria to award Cryptosporidium treatment
credit for these disinfectants. Relevant information is summarized
next, with additional information sources noted.
i. Ozone and chlorine dioxide. With the completion of several major
studies, EPA has acquired sufficient information to develop standards
for the inactivation of Cryptosporidium by ozone and chlorine dioxide.
For both of these disinfectants, today's proposal includes CT tables
that specify a level of Cryptosporidium treatment credit based on the
product of disinfectant concentration and contact time.
For ozone, the CT tables in today's proposal were developed through
considering four sets of experimental data: Li et al. (2001), Owens et
al. (2000), Oppenheimer et al. (2000), and Rennecker et al. (1999).
Chlorine dioxide CT tables are based on three experimental data sets:
Li et al. (2001), Owens et al. (1999), and Ruffell et al. (2000).
Together these studies provide a large body of data that covers a range
of water matrices, both laboratory and natural. While the data exhibit
variability, EPA believes that collectively they are sufficient to
determine appropriate levels of treatment credit as a function of
disinfection conditions. CT tables for ozone and chlorine dioxide
inactivation of Cryptosporidium are presented in Section IV.C.14 of
this preamble.
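The CT calculation itself is straightforward: CT is the product of residual disinfectant concentration (mg/L) and contact time (minutes), and treatment credit is read from the CT table. The sketch below shows only the lookup pattern; the threshold values in it are placeholders invented for illustration, not the proposed CT values (those appear in section IV.C.14):

```python
# CT = disinfectant concentration (mg/L) x contact time (min).
# The table below is a HYPOTHETICAL placeholder; the actual proposed
# CT values for ozone and chlorine dioxide appear in section IV.C.14.
HYPOTHETICAL_CT_CREDITS = [  # (minimum CT in mg-min/L, log credit)
    (4.0, 0.5),
    (8.0, 1.0),
    (16.0, 2.0),
]

def ct_credit(concentration_mg_l: float, contact_time_min: float) -> float:
    """Return the highest log inactivation credit whose CT threshold is met."""
    ct = concentration_mg_l * contact_time_min
    credit = 0.0
    for threshold, log_credit in HYPOTHETICAL_CT_CREDITS:
        if ct >= threshold:
            credit = log_credit
    return credit

print(ct_credit(0.8, 12.0))  # CT = 9.6 -> 1.0 log under these placeholder values
```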
ii. Ultraviolet light. A major recent development is the finding
that UV light is highly effective for inactivating Cryptosporidium and
Giardia at low doses. Research prior to 1998 had indicated that very
high doses of UV light were required to achieve substantial
disinfection of protozoa. However, as noted previously, these results
were largely based on the use of in vitro assays, which were later
shown to substantially overestimate the UV doses required to prevent
infection (Clancy et al. 1998, Bukhari et al. 1999, Craik et al. 2000).
Recent research using in vivo assays (e.g., neonatal mouse infectivity)
and cell culture techniques to measure infectivity has provided strong
evidence that both Cryptosporidium and Giardia are highly sensitive to
low doses of UV.
[Figure III-5 (graphic TP11AU03.004, omitted): UV dose-response data
for Cryptosporidium inactivation from selected studies.]
Figure III-5 presents data from selected studies of UV inactivation
of Cryptosporidium. While the data in Figure III-5 show substantial
scatter, they are consistent in demonstrating a high level of
inactivation at relatively low UV doses. These studies generally
demonstrated at least 3 log Cryptosporidium inactivation at UV doses of
10 mJ/cm\2\ and higher. In comparison, typical UV doses for
drinking water disinfection are 30 to 40 mJ/cm\2\. A recent
investigation by Clancy et al. (2002) showed that UV light at 10
mJ/cm\2\ provided at least 4 log inactivation of five strains of
Cryptosporidium that are infectious to humans. Studies of UV
inactivation of Giardia have reported similar results (Craik et al.
2000, Mofidi et al. 2002, Linden et al. 2002, Campbell and Wallis 2002,
Hayes et al. 2003).
In addition to efficacy for protozoa inactivation, data indicate
that UV disinfection does not promote the formation of DBPs (Malley et
al. 1995, Zheng et al. 1999). Malley et al. (1995) evaluated DBP
formation in a number of surface and ground waters with UV doses
between 60 and 200 mJ/cm\2\. UV light did not directly form DBPs, such
as trihalomethanes (THM) and haloacetic acids (HAA), and did not alter
the concentration or species of DBPs formed by post-disinfection with
chlorine or chloramines. A study by Zheng et al. (1999) reported that
applying UV light following chlorine disinfection had little impact on
THM and HAA formation. In addition, data suggest that photolysis of
nitrate to nitrite, a potential concern with certain types of UV lamps,
will not result in nitrite levels near the MCL under typical drinking
water conditions (Peldszus et al. 2000, Sharpless and Linden 2001).
These studies demonstrate that UV light is an effective technology
for inactivating Giardia and Cryptosporidium, and that it does not form
DBPs at levels of concern in drinking water. Section IV.C.15 describes
proposed criteria for awarding treatment credit for UV inactivation of
Cryptosporidium, Giardia lamblia, and viruses. These criteria include
UV dose tables, validation testing, and monitoring standards. In
addition, EPA is preparing a UV Disinfection Guidance Manual with
information on design, testing, and operation of UV systems. A draft of
this guidance is available in the docket for today's proposal
(http://www.epa.gov/edocket/).
iii. Significance of new information on inactivation. The research
on ozone, chlorine dioxide, and UV light described in this proposal has
made these disinfectants available for systems to use in meeting
additional Cryptosporidium treatment requirements under LT2ESWTR. This
overcomes a significant limitation to establishing inactivation
requirements for Cryptosporidium that existed when the IESWTR was
developed. The Stage 1 Advisory Committee recognized the need for
inactivation criteria if EPA were to consider a risk based proposal
for Cryptosporidium in future rulemaking (62 FR 59498, November 3,
1997) (USEPA 2000b). The CT tables for ozone and chlorine dioxide
provide such criteria. In addition, the availability of UV furnishes
another relatively low cost tool to achieve Cryptosporidium
inactivation and DBP control.
While no single treatment technology is appropriate for all
systems, EPA believes that these disinfectants, along with the other
management and treatment options in the microbial toolbox presented in
section IV.C, make it feasible for systems to meet the additional
Cryptosporidium treatment requirements in today's proposal.
IV. Discussion of Proposed LT2ESWTR Requirements
A. Additional Cryptosporidium Treatment Technique Requirements for
Filtered Systems
1. What Is EPA Proposing Today?
a. Overview of framework approach. EPA is proposing treatment
technique requirements to supplement the existing requirements of the
SWTR, IESWTR, and LT1ESWTR (see section II.B). The proposed
requirements will achieve increased protection against Cryptosporidium
in public water systems that use surface water or ground water under
the direct influence of surface water as sources. Under this proposal,
filtered systems will be assigned to one of four risk categories (or
``bins''), based on the results of source water Cryptosporidium
monitoring. Systems assigned to the lowest risk bin incur no additional
treatment requirements, while systems assigned to higher risk bins must
reduce Cryptosporidium levels beyond IESWTR and LT1ESWTR requirements.
Systems will comply with additional Cryptosporidium treatment
requirements by selecting treatment and management strategies from a
``microbial toolbox'' of control options.
Today's proposal reflects recommendations from the Stage 2 M-DBP
Federal Advisory Committee (65 FR 83015, December 29, 2000) (USEPA
2000a), which described this approach as a ``microbial framework''.
This approach targets additional treatment requirements to those
systems with the highest source water Cryptosporidium levels and,
consequently, the highest vulnerability to this pathogen. In so doing,
today's proposal builds upon the current treatment technique
requirement for Cryptosporidium under which all filtered systems must
achieve at least a 2 log reduction, regardless of source water quality.
The intent of this proposal is to ensure that public water systems with
higher risk source waters achieve a level of public health protection
commensurate with that of systems with less contaminated source water.
b. Monitoring requirements. Today's proposal requires systems to
monitor their source water (influent water prior to the treatment
plant)
for Cryptosporidium, E. coli, and turbidity. The purpose of the
monitoring is to assess source water Cryptosporidium levels and,
thereby, classify systems in different risk bins. Proposed monitoring
requirements for large and small systems are summarized in Table IV-1
and are characterized in the following discussion.
Large Systems
Large systems (serving at least 10,000 people) must sample their
source water at least monthly for Cryptosporidium, E. coli, and
turbidity for a period of 2 years, beginning no later than 6 months
after LT2ESWTR promulgation. Systems may sample more frequently (e.g.,
twice-per-month, once-per-week), provided the same sampling frequency
is used throughout the 2-year monitoring period. As described in
section IV.A.1.c, systems that sample more frequently (at least twice-
per-month) use a different calculation that is potentially less
conservative to determine their bin classification.
The purpose of requiring large systems to collect E. coli and
turbidity data is to further evaluate these parameters as indicators to
identify drinking water sources that are susceptible to high
concentrations of Cryptosporidium. As described next, these data will
be applied to small system LT2ESWTR monitoring.
Small Systems
EPA is proposing a 2-phase monitoring strategy for small systems
(serving fewer than 10,000 people) to reduce their monitoring burden.
This approach is based on Information Collection Rule and ICRSS data
indicating that systems with low source water E. coli levels are likely
to have low Cryptosporidium levels, such that additional treatment
would not be required under the LT2ESWTR. Under this approach, small
systems must initially conduct one year of bi-weekly sampling (one
sample every two weeks) for E. coli, beginning 2.5 years after LT2ESWTR
promulgation. Small systems are triggered into Cryptosporidium
monitoring only if the initial E. coli monitoring indicates a mean
concentration greater than 10 E. coli/100 mL for systems using a
reservoir or lake as their primary source or greater than 50 E. coli/
100 mL for systems using a flowing stream as their primary source.
Small systems that exceed these E. coli trigger values must conduct one
year of twice-per-month Cryptosporidium sampling, beginning 4 years
after LT2ESWTR promulgation.
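The trigger logic above amounts to comparing the annual mean E. coli concentration against a source-type threshold. A minimal sketch (function and key names are invented for illustration; only the 10/100 mL and 50/100 mL values come from the proposal):

```python
# Annual mean E. coli triggers (per 100 mL) from the proposal:
# >10 for lake/reservoir sources, >50 for flowing stream sources.
TRIGGERS = {"lake_reservoir": 10.0, "flowing_stream": 50.0}

def crypto_monitoring_required(ecoli_samples, source_type: str) -> bool:
    """True if the mean E. coli level (per 100 mL) exceeds the trigger
    for the system's primary source type."""
    mean = sum(ecoli_samples) / len(ecoli_samples)
    return mean > TRIGGERS[source_type]

# A stream system averaging 42 E. coli/100 mL stays below the 50 trigger.
print(crypto_monitoring_required([30, 40, 56], "flowing_stream"))  # False
```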
The analysis supporting the proposed E. coli values that trigger
Cryptosporidium monitoring by small systems is presented in Section
IV.A.2. However, as recommended by the Stage 2 M-DBP Advisory
Committee, EPA will evaluate Cryptosporidium indicator relationships in
the LT2ESWTR monitoring data collected by large systems. If these data
support the use of different indicator levels to trigger small system
Cryptosporidium monitoring, EPA will issue guidance with
recommendations. The proposed LT2ESWTR allows States to specify
alternative indicator values for small systems, based on EPA guidance.
Table IV-1.--LT2ESWTR Monitoring Requirements
--------------------------------------------------------------------------------------------------------------------------------------------------------
Large systems (serving 10,000 or more people):
    Monitoring begins.......  6 months after promulgation of the LT2ESWTR a
    Monitoring duration.....  2 years
    Cryptosporidium.........  minimum 1 sample/month b
    E. coli.................  minimum 1 sample/month b
    Turbidity...............  minimum 1 measurement/month b
--------------------------------------------------------------------------------------------------------------------------------------------------------
Small systems (serving fewer than 10,000 people):
    Monitoring begins.......  30 months (2-1/2 years) after promulgation of the LT2ESWTR
    Monitoring duration.....  1 year
    Cryptosporidium.........  See following entry
    E. coli.................  1 sample every two weeks
    Turbidity...............  N/A
--------------------------------------------------------------------------------------------------------------------------------------------------------
Possible additional monitoring requirement for Cryptosporidium: if small systems exceed E. coli trigger levels c, then * * *
--------------------------------------------------------------------------------------------------------------------------------------------------------
Small systems (serving fewer than 10,000 people) c:
    Monitoring begins.......  48 months (4 years) after promulgation of the LT2ESWTR
    Monitoring duration.....  1 year
    Cryptosporidium.........  2 samples/month
    E. coli.................  N/A
    Turbidity...............  N/A
--------------------------------------------------------------------------------------------------------------------------------------------------------
a Public water systems may use equivalent previously collected (grandfathered) data to meet LT2ESWTR requirements. See section IV.A.1.d for details.
b Public water systems may sample more frequently (e.g., twice-per-month, once-per-week).
c Small systems must monitor for Cryptosporidium for one year, beginning 6 months after completion of E. coli monitoring, if the E. coli annual mean
concentration exceeds 10/100 mL for systems using lake/reservoir sources or 50/100 mL for systems using flowing stream sources.
N/A = Not applicable. No monitoring required.
Sampling Location
Source water samples must be representative of the intake to the
filtration plant. Generally, sampling must be performed individually
for each plant that treats a surface water source. However, where
multiple plants receive all of their water from the same influent
(e.g., multiple plants draw water from the same pipe), the same set of
monitoring results may be applicable to each plant. Typically, samples
must be collected prior to any treatment, with exceptions for certain
pretreatment processes. Directions on sampling location for plants
using off-stream storage, presedimentation, and bank filtration are
provided in section IV.C.
Systems with plants that use multiple water sources at the same
time must collect samples from a tap where the sources are combined
prior to treatment, if such a tap is available. If a blended source tap is not
available, systems must collect samples from each source and either
analyze a weighted composite (blended) sample or analyze samples from
each source separately and determine a weighted average of the results.
Sampling Schedule
Large systems must submit a sampling schedule to EPA within 3
months after promulgation of the LT2ESWTR. Small systems must submit a
sampling schedule for E. coli monitoring to their primacy agency within
27 months after rule promulgation; small systems required to monitor
for Cryptosporidium must submit a Cryptosporidium sampling schedule
within 45 months after promulgation. The sampling schedules must
specify the calendar date on which the system will collect each sample
required under the LT2ESWTR. Scheduled sampling dates should be evenly
distributed throughout the monitoring period, but may be arranged to
accommodate holidays, weekends, and other events when collecting or
analyzing a sample would be problematic.
Systems must collect samples within 2 days before or 2 days after a
scheduled sampling date. If a system does not sample within this 5-day
window, the system will incur a monitoring violation unless either of
the following two conditions apply:
(1) If extreme conditions or situations exist that may pose
danger to the sample collector, or which are unforeseen or cannot be
avoided and which cause the system to be unable to sample in the
required time frame, the system must sample as close to the required
date as feasible and submit an explanation for the alternative
sampling date with the analytical results.
(2) Systems that are unable to report a valid Cryptosporidium
analytical result for a scheduled sampling date due to failure to
comply with analytical method quality control requirements
(described in section IV.K) must collect a replacement sample within
14 days of being notified by the laboratory or the State that a
result cannot be reported for that date. Systems must submit an
explanation for the replacement sample with the analytical results.
Where possible, the replacement sample collection date should not
coincide with any other scheduled LT2ESWTR sampling dates.
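The 5-day window described above (the scheduled date plus or minus 2 days) reduces to a simple date comparison; a minimal sketch with illustrative dates:

```python
from datetime import date

def within_sampling_window(scheduled: date, collected: date,
                           days: int = 2) -> bool:
    """True if the sample was collected within `days` days before or
    after the scheduled date (a 5-day window when days=2)."""
    return abs((collected - scheduled).days) <= days

print(within_sampling_window(date(2004, 3, 10), date(2004, 3, 12)))  # True
print(within_sampling_window(date(2004, 3, 10), date(2004, 3, 13)))  # False
```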
Approved Analytical Methods and Laboratories
To ensure the quality of LT2ESWTR monitoring data, today's proposal
requires systems to use approved methods for Cryptosporidium, E. coli,
and turbidity analyses (see section IV.K for sample analysis
requirements), and to have these analyses performed by approved
laboratories (described in section IV.L).
Reporting
Because source water monitoring by large systems will begin 6
months after promulgation of the LT2ESWTR, EPA is proposing that
monitoring results for large systems be reported directly to the Agency
through an electronic data system (described in section IV.J), similar
to the approach currently used under the Unregulated Contaminants
Monitoring Rule (64 FR 50555, September 17, 1999) (USEPA 1999c). Small
systems will report data to EPA or States, depending on whether States
have assumed primacy for the LT2ESWTR.
Previously Collected Monitoring Results
EPA is proposing to allow systems to use previously collected
(i.e., grandfathered) Cryptosporidium monitoring data to meet LT2ESWTR
monitoring requirements if the data are equivalent to data that will be
collected under the rule (e.g., sample volume, sampling frequency,
analytical method quality control). Criteria for acceptance of
previously collected data are specified in section IV.A.1.d.
Providing Additional Treatment Instead of Monitoring
Filtered systems are not required to conduct source water
monitoring under the LT2ESWTR if the system currently provides or will
provide a total of at least 5.5 log of treatment for Cryptosporidium,
equivalent to meeting the treatment requirements of Bin 4 as shown in
Table IV-4 (i.e., the maximum required in today's proposal). Systems
must notify EPA or the State not later than the date the system is
otherwise required to submit a sampling schedule for monitoring and
must install and operate technologies to provide a total of at least
5.5 log of treatment for Cryptosporidium by the applicable date in
Table IV-23. Any filtered system that fails to complete LT2ESWTR
monitoring requirements must meet the treatment requirements for Bin 4.
Ongoing Source Assessment and Second Round of Monitoring
Because LT2ESWTR treatment requirements are related to the degree
of source water contamination, today's proposal contains provisions to
assess changes in a system's source water
quality following initial risk bin classification. These provisions
include source water assessment during sanitary surveys and a second
round of monitoring.
Under 40 CFR 142.16(b)(3)(i), source water is one of the components
that States must address during the sanitary surveys that are required
for surface water systems. These sanitary surveys must be conducted
every 3 years for community systems and every 5 years for non-community
systems. EPA is proposing that if the State determines during the
sanitary survey that significant changes have occurred in the watershed
that could lead to increased contamination of the source water, the
State may require systems to implement specific actions to address the
contamination. These actions include implementing options from the
microbial toolbox discussed in section IV.C.
EPA is proposing that systems conduct a second round of source
water monitoring, beginning six years after systems are initially
classified in LT2ESWTR risk bins. To prepare for this second round of
monitoring, the Advisory Committee recommended that EPA initiate a
stakeholder process four years after large systems complete initial bin
classification. The purpose of the stakeholder process would be to
review risk information, and to determine the appropriate analytical
method, monitoring frequency, monitoring location, and other criteria
for the second round of monitoring.
If EPA does not modify LT2ESWTR requirements through issuing a new
regulation prior to the second round of monitoring, systems must carry
out this monitoring according to the requirements that apply to the
initial round of source water monitoring. Moreover, systems will be
reclassified in LT2ESWTR risk bins based on the second round monitoring
results and using the criteria specified in this section for initial
bin classification. However, if EPA changes the LT2ESWTR risk bin
structure to reflect a new analytical method or new risk information,
systems will undergo a site specific risk characterization in
accordance with the revised rule.
c. Treatment Requirements
i. Bin classification. Under the proposed LT2ESWTR, surface water
systems that use filtration will be classified in one of four
Cryptosporidium concentration categories (bins) based on the results of
source water monitoring. As shown in Table IV-2, bin classification is
determined by averaging the Cryptosporidium concentrations measured for
individual samples.
Table IV-2.--Bin Classification Table for Filtered Systems
------------------------------------------------------------------------
  If your average Cryptosporidium           Then your bin classification
  concentration \1\ is . . .                is . . .
------------------------------------------------------------------------
Cryptosporidium < 0.075/L.................  Bin 1.
0.075/L <= Cryptosporidium < 1.0/L........  Bin 2.
1.0/L <= Cryptosporidium < 3.0/L..........  Bin 3.
Cryptosporidium >= 3.0/L..................  Bin 4.
------------------------------------------------------------------------
\1\ All concentrations shown in units of oocysts/L.
The approach that systems will use to average individual sample
concentrations to determine their bin classification depends on the
number of samples collected and the length of the monitoring period.
Systems serving at least 10,000 people are required to monitor for 24
months, and their bin classification must be based on the following:
(1) The highest twelve-month running annual average, if the system
conducts monthly sampling; or
(2) The two-year mean, if the system conducts twice-per-month or more
frequent sampling for 24 months (i.e., at least 48 samples).
Systems serving fewer than 10,000 people are required to collect 24
Cryptosporidium samples over 12 months if they exceed the E. coli
trigger level, and their bin classification must be based on the mean
of the 24 samples. As noted earlier, systems that fail to complete the
required Cryptosporidium monitoring will be classified in Bin 4.
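The bin-assignment arithmetic described above can be sketched in a few lines. This is an illustrative reading of the proposal only, assuming a large system sampling monthly for 24 months; the function names are hypothetical, not regulatory language.

```python
# Illustrative sketch of LT2ESWTR bin classification (Table IV-2).
# Function names are hypothetical; thresholds are from the proposal.

def max_running_annual_average(monthly_conc):
    """Highest 12-month running average of 24 monthly results (oocysts/L)."""
    assert len(monthly_conc) == 24
    return max(sum(monthly_conc[i:i + 12]) / 12 for i in range(13))

def bin_classification(mean_conc):
    """Map an average Cryptosporidium concentration (oocysts/L) to a bin."""
    if mean_conc < 0.075:
        return 1
    elif mean_conc < 1.0:
        return 2
    elif mean_conc < 3.0:
        return 3
    return 4

# A system serving at least 10,000 people, sampling monthly for 24 months
# (illustrative concentrations, oocysts/L):
samples = [0.0] * 20 + [0.3, 0.2, 0.1, 0.4]
print(bin_classification(max_running_annual_average(samples)))
```

A small system that monitors for 12 months would instead pass the plain mean of its 24 samples to the same threshold function.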
When determining LT2ESWTR bin classification, systems must
calculate individual sample concentrations using the total number of
oocysts counted, unadjusted for method recovery, divided by the volume
assayed (see section IV.K for details). As described in Section IV.A.2,
the ranges of Cryptosporidium concentrations that define LT2ESWTR bins
reflect consideration of analytical method recovery and the percent of
Cryptosporidium oocysts that are infectious. Consequently, sample
analysis results will not be adjusted for these factors.
ii. Credit for treatment in place. A key parameter in determining
additional Cryptosporidium treatment requirements is the credit that
plants receive for treatment currently provided (i.e., treatment in
place). For baseline treatment requirements established by the SWTR,
IESWTR, and LT1ESWTR that apply uniformly to filtered systems, the
Agency has awarded credit based on the minimum removal that plants will
achieve. Specifically, in the IESWTR and LT1ESWTR, EPA determined that
filtration plants, including conventional, direct, slow sand, and DE,
meeting the required filter effluent turbidity criteria will achieve at
least 2 log removal of Cryptosporidium. Consequently, these plants were
awarded a 2 log Cryptosporidium removal credit, which equals the
maximum treatment required under these regulations.
The LT2ESWTR will supplement existing regulations by mandating
additional treatment at certain plants based on site specific
conditions (i.e., source water Cryptosporidium level). When assessing
the need for additional treatment beyond baseline requirements for
higher risk systems, the Agency has determined that it is appropriate
to consider the average removal efficiency achieved by treatment
plants. As described in section III.D, EPA has concluded that
conventional, slow sand, and DE plants in compliance with the SWTR,
IESWTR, and LT1ESWTR achieve an average Cryptosporidium reduction of 3
log. Consequently, EPA is proposing to award these plants a 3 log
credit towards Cryptosporidium treatment requirements under the
LT2ESWTR. As noted previously, this approach is consistent with the
Stage 2 M-DBP Agreement in Principle.
For other types of filtration plants, treatment credit under the
LT2ESWTR differs. Conventional treatment is defined in 40 CFR 141.2 as
a series of processes including coagulation, flocculation,
sedimentation, and filtration, with sedimentation defined as a process
for removal of solids before filtration by gravity or separation. Thus,
plants with separation (i.e., clarification) processes other than
gravity sedimentation between flocculation and filtration, such as DAF,
may be regarded as conventional treatment for purposes of awarding
treatment credit under the LT2ESWTR. However, for direct filtration
plants, which lack a sedimentation process, EPA is proposing a 2.5 log
Cryptosporidium removal credit. Studies that support awarding direct
filtration plants less treatment credit than conventional plants are
summarized in section III.D.
EPA is unable to estimate an average log removal for other
filtration technologies like membranes, bag filters, and cartridge
filters, due to variability among products. As a result, credit for
these devices must be determined by the State, based on product
specific testing described in section IV.C or other criteria approved
by the State.
[[Page 47668]]
Table IV-3 presents the credit proposed for different types of
plants towards LT2ESWTR Cryptosporidium treatment requirements. As
described in section IV.C.18, a State may award greater credit to a
system that demonstrates through a State-approved protocol that it
reliably achieves a higher level of Cryptosporidium removal.
Conversely, a State may award less credit to a system where the State
determines, based on site specific information, that the system is not
achieving the degree of Cryptosporidium removal indicated in Table IV-
3.
        Table IV-3.--Cryptosporidium Treatment Credit Towards LT2ESWTR Requirements \1\
----------------------------------------------------------------------------------------------------------------
                                    Conventional
                                     treatment                              Slow sand or         Alternative
          Plant type                 (includes        Direct filtration    diatomaceous earth    filtration
                                     softening)                            filtration            technologies
----------------------------------------------------------------------------------------------------------------
Treatment credit................  3.0 log..........  2.5 log.............  3.0 log.............  Determined by
                                                                                                  State \2\.
----------------------------------------------------------------------------------------------------------------
\1\ Applies to plants in full compliance with the SWTR, IESWTR, and LT1ESWTR, as applicable.
\2\ Credit must be determined through product or site specific assessment.
iii. Treatment requirements associated with LT2ESWTR bins
The treatment requirements associated with LT2ESWTR risk bins are
shown in Table IV-4. The total Cryptosporidium treatment required for
Bins 2, 3, and 4 is 4.0 log, 5.0 log, and 5.5 log, respectively. For
conventional (including softening), slow sand, and DE plants that
receive 3.0 log credit for compliance with current regulations,
additional Cryptosporidium treatment of 1.0 to 2.5 log is required when
classified in Bins 2-4. Direct filtration plants that receive 2.5 log
credit for compliance with current regulations must achieve 1.5 to 3.0
log of additional Cryptosporidium treatment in Bins 2-4.
For systems using alternative filtration technologies, such as
membranes or bag/cartridge filters, and classified in Bins 2-4, the
State must determine additional treatment requirements based on the
credit awarded to a particular technology. The additional treatment
must be such that plants classified in Bins 2, 3, and 4 achieve the
total required Cryptosporidium reductions of 4.0, 5.0, and 5.5 log,
respectively.
Table IV-4.--Treatment Requirements Per LT2ESWTR Bin Classification
----------------------------------------------------------------------------------------------------------------
                                    And you use the following filtration treatment in full compliance with the
                                     SWTR, IESWTR, and LT1ESWTR (as applicable), then your additional treatment
                                                         requirements are . . .
                                 -------------------------------------------------------------------------------
 If your bin classification is      Conventional
             . . .                   filtration                             Slow sand or         Alternative
                                     treatment        Direct filtration    diatomaceous earth    filtration
                                     (includes                             filtration            technologies
                                     softening)
----------------------------------------------------------------------------------------------------------------
Bin 1..........................  No additional      No additional        No additional         No additional
                                  treatment.         treatment.           treatment.            treatment.
Bin 2..........................  1 log treatment    1.5 log treatment    1 log treatment       As determined by
                                  \1\.               \1\.                 \1\.                   the State \1\ \3\.
Bin 3..........................  2 log treatment    2.5 log treatment    2 log treatment       As determined by
                                  \2\.               \2\.                 \2\.                   the State \2\ \4\.
Bin 4..........................  2.5 log treatment  3 log treatment      2.5 log treatment     As determined by
                                  \2\.               \2\.                 \2\.                   the State \2\ \5\.
----------------------------------------------------------------------------------------------------------------
\1\ Systems may use any technology or combination of technologies from the microbial toolbox.
\2\ Systems must achieve at least 1 log of the required treatment using ozone, chlorine dioxide, UV, membranes,
bag/cartridge filters, or bank filtration.
\3\ Total Cryptosporidium removal and inactivation must be at least 4.0 log.
\4\ Total Cryptosporidium removal and inactivation must be at least 5.0 log.
\5\ Total Cryptosporidium removal and inactivation must be at least 5.5 log.
Plants can achieve additional Cryptosporidium treatment credit
through implementing pretreatment processes like presedimentation or
bank filtration, by developing a watershed control program, and by
applying additional treatment steps like UV, ozone, chlorine dioxide,
and membranes. In addition, plants can receive additional credit for
existing treatment through achieving very low filter effluent turbidity
or through a demonstration of performance. Section IV.C presents
criteria for awarding Cryptosporidium treatment credit to a host of
treatment and control options, including those listed here and others,
which are collectively termed the ``microbial toolbox''.
Systems in Bin 2 can meet additional Cryptosporidium treatment
requirements through using any option or combination of options from
the microbial toolbox. In Bins 3 and 4, systems must achieve at least 1
log of the additional treatment requirement through using ozone,
chlorine dioxide, UV, membranes, bag filtration, cartridge filtration,
or bank filtration.
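For the plant types with fixed credits, the additional treatment in Table IV-4 is simply the total bin requirement minus the credit for treatment in place. A minimal sketch, with hypothetical names and the credits and totals taken from the tables above:

```python
# Additional treatment = (total log required for bin) - (credit in place),
# per Tables IV-3 and IV-4. Illustrative only; alternative filtration
# technologies are excluded because their credit is State-determined.

TOTAL_REQUIRED = {2: 4.0, 3: 5.0, 4: 5.5}            # total log, by bin
CREDIT = {"conventional": 3.0, "slow sand": 3.0,
          "diatomaceous earth": 3.0, "direct": 2.5}  # log credit in place

def additional_treatment(bin_number, plant_type):
    """Additional log Cryptosporidium treatment required (0 for Bin 1)."""
    if bin_number == 1:
        return 0.0
    return TOTAL_REQUIRED[bin_number] - CREDIT[plant_type]

print(additional_treatment(2, "conventional"))  # 1.0
print(additional_treatment(4, "direct"))        # 3.0
```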
d. Use of previously collected data. Today's proposal allows
systems with previously collected Cryptosporidium data (i.e., data
collected prior to the required start of monitoring under the LT2ESWTR)
that are equivalent in sample number, frequency, and data quality to
data that will be collected under the LT2ESWTR to use those data in
lieu of conducting new monitoring. Specifically, EPA is proposing that
Cryptosporidium sample analysis results collected prior to promulgation
of the LT2ESWTR must meet the following criteria to be used for bin
classification:
• Samples were analyzed by laboratories using validated
versions of EPA Methods 1622 or 1623 and meeting the quality control
criteria specified in these methods (USEPA 1999a, USEPA 1999b, USEPA
2001e, USEPA 2001f).
• Samples were collected no less frequently than each
calendar month on a regular schedule, beginning no earlier than January
1999 (when EPA Method 1622 was first released as an interlaboratory-
validated method).
• Samples were collected in equal intervals of time over the
entire collection period (e.g., weekly,
[[Page 47669]]
monthly). The allowances for deviations from a sampling schedule
specified under IV.A.1.b for LT2ESWTR monitoring apply to grandfathered
data.
• Samples were collected at the correct location as specified
for LT2ESWTR monitoring. Systems must report the use of bank
filtration, presedimentation, and raw water off-stream storage during
sampling.
• For each sample, the laboratory analyzed at least 10 L of
sample, or at least 2 mL of packed pellet volume, or as much volume as
two filters could accommodate before clogging (applies only to filters
that have been approved by EPA for use with Methods 1622 and 1623).
• The system must certify that it is reporting all
Cryptosporidium monitoring results generated by the system during the
time period covered by the previously collected data. This applies to
samples that were (a) collected from the sampling location used for
LT2ESWTR monitoring, (b) not spiked, and (c) analyzed using the
laboratory's routine process for Method 1622 or 1623 analyses.
• The system must also certify that the samples were
representative of a plant's source water(s) and the source water(s)
have not changed.
If a system has at least two years of Cryptosporidium data
collected before promulgation of the LT2ESWTR and the system does not
intend to conduct new monitoring under the rule, the system must submit
the data and the required supporting documentation to EPA no later than
two months following promulgation of the rule. EPA will notify the
system within four months following LT2ESWTR promulgation as to whether
the data are sufficient for bin determination. Unless EPA notifies the
system in writing that the previously collected data are sufficient for
bin determination, the system must conduct source water Cryptosporidium
monitoring as described in section IV.A.1.b of this preamble.
If a system intends to grandfather fewer than two years of
Cryptosporidium data, or if a system intends to grandfather 2 or more
years of previously collected data and also to conduct new monitoring
under the rule, the system must submit the data and the required
supporting documentation to EPA no later than eight months following
promulgation of the rule. Systems must conduct monitoring as described
in section IV.A.1.b until EPA notifies the system in writing that it
has at least 2 years of acceptable data. See section IV.J for
additional information on reporting requirements associated with
previously collected data.
2. How Was This Proposal Developed?
The monitoring and treatment requirements for filtered systems
proposed under the LT2ESWTR stem from the data and analyses described
in this section and reflect recommendations made by the Stage 2 M-DBP
Federal Advisory Committee (65 FR 83015) (USEPA 2000a).
a. Basis for targeted treatment requirements. Under the IESWTR, EPA
established an MCLG of zero for Cryptosporidium at the genus level
based on the public health risk associated with this pathogen. The
IESWTR included a 2 log treatment technique requirement for medium and
large filtered systems that controlled for Cryptosporidium as close to
the MCLG as was then deemed technologically feasible, taking costs into
consideration. The LT1ESWTR extended this requirement to small systems.
Given the advances that have occurred subsequent to the IESWTR in
available technology to measure and treat for Cryptosporidium, a key
question for the LT2ESWTR was the extent to which Cryptosporidium
should be further controlled to approach the MCLG of zero, considering
technical feasibility, costs, and potential risks from DBPs.
The data and analysis presented in Section III of this preamble
suggest wide variability in possible risk from Cryptosporidium among
public water systems. This variability is largely due to three factors:
(1) The broad distribution of Cryptosporidium occurrence levels among
source waters, (2) disparities in the efficacy of treatment provided by
plants, and (3) differences in the infectivity among Cryptosporidium
isolates. EPA and the Advisory Committee considered this wide range of
possible risks and the desire to address systems where the 2 log
removal requirement established by the IESWTR and LT1ESWTR may not
provide adequate public health protection.
A number of approaches were evaluated for furthering control of
Cryptosporidium. One approach was to require all systems to provide the
same degree of additional treatment for Cryptosporidium (i.e., beyond
that required by the IESWTR and LT1ESWTR). This approach could ensure
that treatment at most systems, including those with poor quality
source water, would be adequately protective. The uniformity of this approach has the
advantage of minimizing transactional costs for determining what must
be done by a particular system to comply. However, a significant
downside is that it may require more treatment, with consequent costs,
than is needed by many systems with low source water Cryptosporidium
levels. In addition, there were concerns with the feasibility of
requiring almost all surface water treatment plants to install
additional treatment processes for Cryptosporidium.
A second approach was to base additional treatment requirements on
a plant's source water Cryptosporidium level. Under this approach,
systems monitor their source water for Cryptosporidium, and additional
treatment is required only from those systems that exceed specified
oocyst concentrations. This has the advantage of targeting additional
public health protection to those systems with higher vulnerability to
Cryptosporidium, while avoiding the imposition of higher treatment
costs on systems with the least contaminated source water. In
consideration of these advantages, the Advisory Committee recommended
and EPA is proposing this second approach for filtered systems under
the LT2ESWTR.
b. Basis for bin concentration ranges and treatment requirements.
The proposed LT2ESWTR will classify plants into different risk bins
based on the source water Cryptosporidium level, and the bin
classification will determine the extent to which additional treatment
beyond IESWTR and LT1ESWTR is required. Two questions were central in
developing the proposed bin concentration ranges and additional
treatment requirements:
• What is the risk associated with a given level of
Cryptosporidium in a drinking water source?
• What degree of additional treatment should be required for
a given source water Cryptosporidium level?
This section addresses these two questions by first summarizing how
EPA assessed the risk associated with Cryptosporidium in drinking
water, followed by a description of how EPA and the Advisory Committee
used this type of information in identifying LT2ESWTR bin concentration
ranges and treatment requirements. For additional information on these
topics, see Economic Analysis for the LT2ESWTR (USEPA 2003a).
i. What is the risk associated with a given level of
Cryptosporidium in a drinking water source? The risk of infection from
Cryptosporidium in drinking water is a function of infectivity (i.e.,
dose-response associated with ingestion) and exposure. Section III.B
summarizes available data on Cryptosporidium infectivity. EPA conducted
a meta-analysis of reported infection rates from human feeding
[[Page 47670]]
studies with 3 Cryptosporidium isolates. This analysis produced an
estimate for the mean probability of infection given a dose of one
oocyst near 0.09 (9%), with 10th and 90th percentile confidence values
of 0.011 and 0.22, respectively.
Exposure to Cryptosporidium depends on the concentration of oocysts
in the source water, the efficiency of treatment plants in removing
oocysts, and the volume of water ingested (exposure can also occur
through interactions with infected individuals). Based on data
presented in section III.D, EPA has estimated that filtration plants in
compliance with the IESWTR or LT1ESWTR reduce source water
Cryptosporidium levels by 2 to 5 log (99% to 99.999%), with an average
reduction near 3 log. For drinking water consumption, EPA uses a
distribution, derived from the United States Department of
Agriculture's (USDA) 1994-96 Continuing Survey of Food Intakes by
Individuals, with a mean value of 1.2 L/day. Average annual days of
exposure to drinking water in CWS, non-transient non-community water
systems (NTNCWS), and transient non-community water systems (TNCWS) are
estimated at 350 days, 250 days, and 10 days, respectively. (The
Economic Analysis for the LT2ESWTR (USEPA 2003a) provides details on
all parameters listed here, as well as morbidity, mortality, and other
risk factors.)
Using an estimate of 1.2 L/day consumption and a mean probability
of infection of 0.09 for one oocyst ingested, the daily risk of
infection (DR) is as follows:
DR = (oocysts/L in source water) x (fraction remaining after treatment)
x (1.2 L/day) x (0.09).
The annual risk (AR) of infection for a CWS is
AR = 1 - (1 - DR)^350
where the exponent 350 represents annual days of exposure in a CWS.
Table IV-5 presents estimates of the mean annual risk of infection
by Cryptosporidium in CWSs for selected source water infectious oocyst
concentrations and filtration plant removal efficiencies.
Table IV-5.--Annual Risk of Cryptosporidium Infection in CWSs That
Filter, as a Function of Source Water Infectious Oocyst Concentration
and Treatment Efficiency
------------------------------------------------------------------------
 Source water     Mean annual risk of infection for different levels of
 concentration           treatment efficiency (log removal) \1\
  (infectious   --------------------------------------------------------
  oocysts per
    liter)          2 log        3 log        4 log        5 log
------------------------------------------------------------------------
0.0001            3.8E-05      3.8E-06      3.8E-07      3.8E-08
0.001             3.7E-04      3.8E-05      3.8E-06      3.8E-07
0.01              3.7E-03      3.7E-04      3.8E-05      3.8E-06
0.1               3.7E-02      3.7E-03      3.7E-04      3.8E-05
1                 0.31         3.7E-02      3.7E-03      3.7E-04
10                0.89         0.31         3.7E-02      3.7E-03
------------------------------------------------------------------------
\1\ Scientific notation (E-x) designates 10^-x.
For example, Table IV-5 shows that if a filtration plant had a mean
concentration of infectious Cryptosporidium in the source water of 0.01
oocysts/L, and the filtration plant averaged 3 log removal, the mean
annual risk of infection by Cryptosporidium is estimated as 3.7 x
10^-4 (3.7 infections per 10,000 consumers).
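The DR and AR arithmetic above can be reproduced directly; this is a minimal sketch using only the parameter values stated in the text (1.2 L/day, 0.09 infectivity, 350 days), with hypothetical function names.

```python
# Sketch of the preamble's risk arithmetic; parameter defaults are the
# values given in the text, and function names are illustrative.

def daily_risk(source_oocysts_per_L, log_removal,
               liters_per_day=1.2, p_infect=0.09):
    """DR = source concentration x fraction remaining x intake x infectivity."""
    return source_oocysts_per_L * 10 ** (-log_removal) * liters_per_day * p_infect

def annual_risk(dr, exposure_days=350):
    """AR = 1 - (1 - DR)^350 for a community water system."""
    return 1.0 - (1.0 - dr) ** exposure_days

# 0.01 infectious oocysts/L in source water with 3 log removal:
ar = annual_risk(daily_risk(0.01, 3))
print(f"{ar:.1e}")  # about 3.8e-04, consistent with Table IV-5 to rounding
```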
ii. What degree of additional treatment should be required for a
given source water Cryptosporidium level? In order to develop targeted
treatment requirements for the LT2ESWTR, it was necessary to identify a
source water Cryptosporidium level above which additional treatment by
filtered systems would be required. Based on the type of risk
information shown in Table IV-5, EPA and Advisory Committee
deliberations focused on mean source water Cryptosporidium
concentrations in the range of 0.01 to 0.1 oocysts/L as appropriate
threshold values for prescribing additional treatment.
Analytical method and sampling constraints were a significant
factor in setting the specific Cryptosporidium level that triggers
additional treatment by filtered systems. The number of samples that
systems can be required to analyze for Cryptosporidium is limited.
Consequently, if the bin threshold concentration for additional
treatment was set near 0.01 oocysts/L, systems could exceed this level
due to a very low number of oocysts being detected. For example, if
systems took monthly 10 L samples and bin classification was based on a
maximum running annual average, then a system would exceed a mean
concentration of 0.01 oocysts/L by counting only 2 oocysts in 12
samples. Given the variability associated with Cryptosporidium
analytical methods, the Advisory Committee did not support requiring
additional treatment for filtered systems based on so few counts.
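The count sensitivity described above is simple arithmetic: with 12 monthly 10 L samples, the mean concentration is the total oocysts counted divided by 120 L. A brief check, with a hypothetical helper name:

```python
# Smallest oocyst count whose resulting mean meets or exceeds a candidate
# bin threshold, for 12 monthly 10 L samples (120 L total assayed).
# Illustrative helper, not from the rule.

import math

def min_counts_to_reach(threshold_per_L, n_samples=12, liters_per_sample=10):
    total_volume = n_samples * liters_per_sample
    return math.ceil(threshold_per_L * total_volume)

print(min_counts_to_reach(0.01))    # 2 oocysts, as noted in the text
print(min_counts_to_reach(0.075))   # 9 oocysts for the proposed Bin 1 boundary
```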
Another concern related to analytical method limitations was
systems being misclassified in a lower bin. For example, if a system
had a true mean concentration at or just above 0.1 oocysts/L, the mean
that the system would determine through monitoring might be less than
0.1 oocysts/L. Thus, if the bin threshold for additional treatment was
set at 0.1 oocysts/L, a number of systems with true mean concentrations
above this level would be misclassified in the lower bin with no
additional treatment required. This type of error, described in more
detail in the next section, is a function of the number of samples
collected and variability in method performance.
In consideration of the available information on Cryptosporidium
risk, as well as the performance and feasibility of analytical methods,
EPA is proposing that the source water threshold concentration for
requiring additional Cryptosporidium treatment by filtered systems be
established at a mean level of 0.075 oocysts/L. This is the level
recommended by the Advisory Committee, and it affords a high likelihood
that systems with true mean Cryptosporidium concentrations of 0.1
oocysts/L or higher will provide additional treatment under the rule.
Beyond identifying this first threshold, it was also necessary to
determine Cryptosporidium concentrations that would demarcate higher
risk bins. With respect to the concentration range that each bin should
comprise, EPA and the Advisory Committee dealt with two opposing
factors: bin misclassification and equitable risk reduction.
As described in the next section, a monthly monitoring program
involving EPA Methods 1622 or 1623 can characterize a system's mean
Cryptosporidium concentration within a
[[Page 47671]]
0.5 log (factor of 3.2) margin with a high degree of accuracy. However,
the closer a system's true mean concentration is to a bin boundary, the
greater the likelihood that the system will be misclassified into the
wrong bin due to limitations in sampling and analysis. Accordingly, by
establishing bins that cover a wide concentration range, the likelihood
of system misclassification is reduced.
However, a converse factor relates to equitable protection from
risk. Because identical treatment requirements will apply to all
systems in the same bin, systems at the higher concentration end of a
bin will achieve less risk reduction relative to their source water
pathogen levels than systems at the lower concentration end of a bin.
Thus, bins with a narrow concentration range provide a more uniform
level of public health protection.
In balancing these factors and to account for the wide range of
possible source water concentrations among different systems as
indicated by Information Collection Rule and ICRSS data, the Advisory
Committee recommended and EPA is proposing a second bin threshold at a
mean level of 1.0 oocysts/L and a third bin threshold at a mean level
of 3.0 oocysts/L. Information Collection Rule and ICRSS data indicate
that few, if any, systems would measure mean Cryptosporidium
concentrations greater than 3.0 oocysts/L, so there was not a need to
establish a bin threshold above this value. Thus, the LT2ESWTR proposal
includes the following four bins for classifying filtered systems: Bin
1: <0.075/L; Bin 2: >=0.075/L to <1.0/L; Bin 3: >=1.0/L to <3.0/L; and
Bin 4: >=3.0/L (oocysts/L).
With respect to additional Cryptosporidium treatment for systems in
Bins 2-4, values were considered ranging from 0.5 to 2.5 log and
greater. As recommended by the Advisory Committee, EPA is proposing 1.0
log additional treatment for conventional plants in Bin 2. This level
of treatment will ensure that systems classified in Bin 2 will achieve
treated water Cryptosporidium levels comparable to systems in Bin 1,
the lowest risk bin. In contrast, if systems in Bin 2 provided only 0.5
log additional treatment then those systems with mean source water
concentrations in the upper part of Bin 2 would have higher levels of
Cryptosporidium in their finished water than systems in Bin 1.
In consideration of the much greater potential vulnerability of
systems in the highest risk bins, the Advisory Committee recommended
additional treatment requirements of 2.0 log and 2.5 log for
conventional plants in Bins 3 and 4, respectively. The Agency concurs
with these recommendations and has incorporated them in today's
proposal.
An important aspect of the proposed additional treatment
requirements is that they are based, in part, on the current level of
treatment provided by filtration plants. As noted earlier, the Advisory
Committee assumed when developing its recommendations that conventional
treatment plants in compliance with the IESWTR achieve an average of 3
log removal of Cryptosporidium. EPA has determined that available data,
discussed in section III.D, support this assumption and has proposed a
3 log Cryptosporidium treatment credit for conventional plants under
the LT2ESWTR. Thus, the additional treatment requirements for
conventional plants in Bins 2, 3, and 4 translate to total requirements
of 4.0, 5.0, and 5.5 log, respectively.
The Advisory Committee did not address additional treatment
requirements for plants with treatment trains other than conventional,
but recommended that EPA address such plants in the proposed LT2ESWTR
and take comment. Based on treatment studies summarized in section
III.D, EPA has concluded that plants with slow sand or DE filtration
are able to achieve 3 log or greater removal of Cryptosporidium when in
compliance with the IESWTR or LT1ESWTR. Because these plants can
achieve comparable levels of performance to conventional treatment
plants, EPA is proposing that slow sand and DE filtration plants also
apply 1 to 2.5 log of additional treatment when classified in Bins 2-4.
Direct filtration differs from conventional treatment in that it
does not include sedimentation or an equivalent clarification process
prior to filtration. As described in section III.D, EPA has concluded
that a sedimentation process can consistently achieve 0.5 log or
greater removal of Cryptosporidium. The Agency is proposing that direct
filtration plants in compliance with the IESWTR or LT1ESWTR receive a
2.5 log Cryptosporidium removal credit towards LT2ESWTR requirements.
Accordingly, proposed additional treatment requirements for direct
filtration plants in bins 2, 3, and 4 are 1.5 log, 2.5 log, and 3 log,
respectively.
Section IV.C of this notice describes proposed criteria for
determining Cryptosporidium treatment credits for other filtration
technologies like membranes, bag filters, and cartridge filters. Due to
the proprietary and product specific nature of these filtration
devices, EPA is not able to propose a generally applicable credit for
them. Rather, the criteria in section IV.C focus on challenge testing
to establish treatment credit. Systems using these technologies that
are classified in Bins 2-4 must work with their States to assess
appropriate credit for their existing treatment trains. This will
determine the level of additional treatment necessary to achieve the
total treatment requirements for their assigned bins. EPA has developed
guidance on challenge testing of bag and cartridge filters and
membranes, which is available in draft form in the docket
(http://www.epa.gov/edocket/).
In order to give systems flexibility in choosing strategies to meet
additional Cryptosporidium treatment requirements, the Advisory
Committee identified a number of management and treatment options,
collectively called the microbial toolbox. The toolbox, which is
described in section IV.C, contains components relating to watershed
control, intake management, pretreatment, additional filtration
processes, inactivation, and demonstrations of enhanced performance.
As recommended by the Advisory Committee, EPA is proposing that
systems in Bin 2 can meet additional Cryptosporidium treatment
requirements under the LT2ESWTR using any component or combination of
components from the microbial toolbox. However, systems in Bins 3 and 4
must achieve at least 1 log of the additional treatment requirement
using inactivation (UV, ozone, chlorine dioxide), membranes, bag
filters, cartridge filters, or bank filtration. These specific control
measures are proposed due to their ability to serve as significant
additional treatment barriers for systems with high levels of
pathogens.
c. Basis for source water monitoring requirements. The goal of
monitoring under the LT2ESWTR is to correctly classify filtration
plants into the four LT2ESWTR risk bins. The proposed sampling
frequency, time frame, and averaging procedure for bin classification
are intended to ensure that systems are accurately assigned to
appropriate risk bins while limiting the burden of monitoring costs.
The basis for the proposed monitoring requirements for large and small
systems is presented in the following discussion.
i. Systems serving at least 10,000 people.
Sample Number and Frequency
Systems serving at least 10,000 people have two options for
sampling under the
[[Page 47672]]
LT2ESWTR: (1) They can collect 24 monthly samples over a 2 year period
and calculate their bin classification using the highest 12 month
running annual average, or (2) They can collect 2 or more samples per
month over the 2 year period and use the mean of all samples for bin
classification.
These proposed requirements reflect recommendations by the Advisory
Committee and are based on analyses of misclassification rates
associated with different monitoring programs that were considered. EPA
is concerned about systems with high concentrations of Cryptosporidium
being misclassified in lower bins as well as systems with low
concentrations being misclassified in higher bins. The first type of
error could lead to systems not providing an adequate level of
treatment while the second type of error could lead to systems
incurring additional costs for unnecessary treatment.
A primary way that EPA analyzed misclassification rates was by
considering the likelihood that a system with a true mean
Cryptosporidium concentration that is a factor of 3.2 (0.5 log) above
or below a bin boundary would be assigned to the wrong bin.
Probabilities were assessed for two cases:
• False negative: a system with a mean concentration of 0.24
oocysts/L (i.e., a factor of 3.2 above the Bin 1 boundary of 0.075
oocysts/L) is misclassified low in Bin 1.
• False positive: a system with a mean concentration of 0.024
oocysts/L (i.e., a factor of 3.2 below the Bin 1 boundary of 0.075
oocysts/L) is misclassified high in Bin 2.

Table IV-6 provides false negative and false positive rates as
defined previously for different approaches to monitoring and bin
classification that were evaluated. Results are shown for the following
approaches:
• 48 samples with bin assignment based on arithmetic mean
(i.e., average of all samples).
• 24 samples with bin assignment based on highest 12 sample
average, equivalent to the maximum running annual average (Max-RAA).
• 24 samples with bin assignment based on arithmetic mean.
• 12 samples with bin assignment based on the second highest
sample result.
• 8 samples with bin assignment based on the maximum sample
result.
These estimated misclassification rates were generated with a Monte
Carlo analysis that accounted for the volume assayed, variation in
source water Cryptosporidium occurrence, and variable method recovery.
See Economic Analysis for the LT2ESWTR (USEPA 2003a) for details.
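The structure of such a Monte Carlo analysis can be sketched as below. This is a simplified illustration, not EPA's model: the lognormal occurrence parameters, the recovery distribution, and the use of the Max-RAA statistic are assumptions chosen for the sketch, so its rates will not reproduce Table IV-6.

```python
# Simplified Monte Carlo sketch of bin-misclassification estimation.
# This is NOT EPA's model: the occurrence and recovery distributions
# below are assumed for illustration only.
import math
import random

BIN1_BOUNDARY = 0.075  # oocysts/L

def sample_poisson(rng, lam):
    """Poisson draw: Knuth's method for small lam, normal approximation
    for large lam."""
    if lam > 30:
        return max(0, round(rng.gauss(lam, math.sqrt(lam))))
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def false_negative_rate(true_mean=0.24, volume_l=10.0, n_samples=24,
                        sigma=1.0, trials=2000, seed=1):
    """Fraction of simulated plants with a long-run mean of `true_mean`
    (on the uncorrected-count scale) that the 24 sample Max-RAA
    nevertheless places below the Bin 1 boundary."""
    rng = random.Random(seed)
    misses = 0
    mu = math.log(true_mean) - 0.5 * sigma ** 2  # month-to-month variation
    for _ in range(trials):
        results = []
        for _ in range(n_samples):
            month_mean = rng.lognormvariate(mu, sigma)
            recovery_ratio = max(rng.gauss(1.0, 0.5), 0.0)  # ~50% RSD recovery
            lam = month_mean * volume_l * recovery_ratio
            count = sample_poisson(rng, lam)
            results.append(count / volume_l)  # uncorrected, oocysts/L
        max_raa = max(sum(results[i:i + 12]) / 12
                      for i in range(n_samples - 11))
        if max_raa < BIN1_BOUNDARY:
            misses += 1
    return misses / trials
```

A false positive rate could be estimated the same way by setting `true_mean` below the boundary and counting trials whose Max-RAA exceeds it.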
Table IV-6.--False Positive and False Negative Rates for Monitoring and
Binning Strategies Considered for the LT2ESWTR
[In percentages]
------------------------------------------------------------------------
                                                False          False
                Strategy                     positive \1\   negative \2\
------------------------------------------------------------------------
48 sample arithmetic mean...................      1.7            1.4
24 sample Max-RAA...........................      5.3            1.7
24 sample arithmetic mean...................      2.8            6.2
12 sample second highest....................     47              1.1
8 sample maximum............................     66              1.0
------------------------------------------------------------------------
\1\ False positive rates calculated for systems with Cryptosporidium
  concentrations 0.5 log below the Bin 1 boundary of 0.075 oocysts/L.
\2\ False negative rates calculated for systems with Cryptosporidium
  concentrations 0.5 log above the Bin 1 boundary of 0.075 oocysts/L.
The first two of these approaches, the 48 sample arithmetic mean
and 24 sample Max-RAA, were recommended by the Advisory Committee and
are proposed for bin classification under the LT2ESWTR because they
have low false positive and false negative rates. As shown in Table IV-
6, these strategies have false negative rates of 1 to 2%, meaning there
is a 98 to 99% likelihood that a plant with an oocyst concentration 0.5
log above the Bin 1 boundary would be correctly assigned to Bin 2. The
false positive rate is near 2% for the 48 sample arithmetic mean and 5%
for the 24 sample Max-RAA. These rates indicate that a plant with an
oocyst concentration 0.5 log below the Bin 1 boundary would have a 95
to 98% probability of being correctly assigned to Bin 1. Bin
misclassification rates across a wide range of concentrations are shown
in Economic Analysis for the LT2ESWTR (USEPA 2003a).
The 24 sample arithmetic mean had a slightly lower false positive
rate than the 24 sample Max-RAA (2.8% vs. 5.3%) but the false negative
rate of the arithmetic mean was almost 4 times higher. Consequently, a
plant with a mean Cryptosporidium level above the Bin 1 boundary would
be much more likely to be misclassified in Bin 1 using a 24 sample
arithmetic mean than with a 24 sample Max-RAA. In order to increase the
probability that systems with mean Cryptosporidium concentrations above
0.075 oocysts/L will provide additional treatment, EPA is proposing
that if only 24 samples are taken, the maximum 12 month running annual
average must be used to determine bin assignment.
Monitoring strategies involving only 12 and 8 samples were
evaluated to determine if lower frequency monitoring could provide
satisfactory bin classification. The results of this analysis indicate
that these lower sample numbers are not adequate and could unfairly
impose excessive treatment requirements. For example, results in Table
IV-6 show that if plants were classified in bins based on the second
highest of 12 samples or the highest of eight samples then low false
negative rates could be achieved. A system with a mean Cryptosporidium
level 0.5 log above the Bin 1 boundary would have a 99% chance of being
appropriately classified in a bin requiring additional treatment under
either strategy. However, the false positive rates associated with
these low sample numbers are very high. A system with a mean oocyst
concentration 0.5 log below the Bin 1 boundary would have a 47%
probability of being incorrectly classified in Bin 2 using the second
highest result among 12 samples, or a 66% likelihood of being
misclassified in Bin 2 using the maximum result among 8 samples. Due to
high false positive rates, these strategies are not proposed.
EPA also evaluated lower frequency monitoring strategies that had
lower false positive rates, such as bin classification based on the
mean of 12 samples, the third highest result of 12 samples, and the
second highest of 8 samples. Each of these strategies, though, had an
unacceptably high false negative rate, meaning that many systems with
mean oocyst concentrations greater than the Bin 1 boundary would be
misclassified low in Bin 1. Consequently, these strategies are
inconsistent with the public health goal of the LT2ESWTR for systems
with mean levels above 0.075 oocysts/L to provide additional treatment.
Increasing the number of samples used to compute the maximum
running annual average above 24 also increased the number of annual
averages computed, so it did not reduce the likelihood of false
positives. Raising the number of samples used to compute an arithmetic
mean above 48 did reduce bin misclassification rates, but the rates
were already very small (1 to 2% for plants with levels 0.5 log above
or below bin boundaries). For sources with Cryptosporidium
concentrations at or very near a bin boundary, increasing the number of
samples did not markedly improve the error rates, which remained near
50%.
In summary, EPA believes that the proposed sampling designs perform
well for the purpose of classifying plants in LT2ESWTR risk bins and,
[[Page 47673]]
thereby, achieving the public health protection intended for the rule.
More costly designs, involving more frequent sampling and analysis,
provide only marginally improved performance. Less frequent sampling,
though lower in cost, creates unacceptably high misclassification rates
and would not provide for the targeted risk reduction goals of the
rule.
No Adjustments for Method Recovery or Percent of Oocysts That Are
Infectious
Two considerations in using Cryptosporidium monitoring data to
project risk are (1) Fewer than 100% of oocysts in a sample are
recovered and counted by the analyst and (2) Not all the oocysts
measured with Methods 1622/23 are viable and capable of causing
infection. These two factors are offsetting in sign, in that oocyst
counts not adjusted for recovery tend to underestimate the true
concentration, while the total oocyst count may overestimate the
infectious concentration that presents a health risk. Based on
information described in this section, EPA is proposing that
Cryptosporidium monitoring results be used directly to assign systems
to LT2ESWTR risk bins and not be adjusted for either factor.
As described in section III.C, ICRSS matrix spike data indicate
that average recovery of Cryptosporidium oocysts with Methods 1622/23
in a national monitoring program will be about 40%. There is no similar
direct measure of the fraction of environmental oocysts that are
infectious, but information related to this value can be derived from
two sources: (1) A study where samples were analyzed with both Method
1623 and a cell culture-polymerase chain reaction (CC-PCR) test for
oocyst infectivity, and (2) the structure of oocysts counted with
Methods 1622 and 1623.
LeChevallier et al. (2003) conducted a study in which six natural
waters were frequently tested for Cryptosporidium using both Method
1623 and a CC-PCR method to test for infectivity. Cryptosporidium
oocysts were detected in 60 of 593 samples (10.1%) by Method 1623 and
infectious oocysts were detected in 22 of 560 samples (3.9%) by the CC-
PCR procedure. Recovery efficiencies for the two methods were similar.
According to the authors, these results suggest that approximately 37%
(22/60) of the Cryptosporidium oocysts detected by Method 1623 were
viable and infectious.
In regard to oocyst structure, Cryptosporidium oocysts counted with
Methods 1622/23 are characterized in one of three ways: (1) Internal
structures, (2) amorphous structures, or (3) empty. Oocysts with
internal structures are considered to have the highest likelihood of
being infectious, while empty oocysts are believed to be non-viable
(LeChevallier et al. 1997). During the ICRSS, 37% of the oocysts
counted were characterized as having internal structures, 47% had
amorphous structures, and 16% were empty. If it is assumed that empty
oocysts could not be infectious, then the fraction of counted oocysts
that could have been infectious ranges from 0 to 84%, and the mid-point
of this range is 42%.
After considering this type of information, the Advisory Committee
recommended that monitoring results not be adjusted upward for percent
recovery, nor adjusted downward to account for the fraction of oocysts
that are not infectious. While it is not possible to establish a
precise value for either factor in individual samples, the data suggest
that they may be of similar magnitude. EPA concurs with this
recommendation and is proposing that systems be classified in bins
under the LT2ESWTR using the total Cryptosporidium oocyst count,
uncorrected for recovery, as measured using EPA Method 1622/23. The
proposed LT2ESWTR risk bins are constructed to reflect this approach.
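As a rough numeric illustration of this offsetting-factors argument, using the approximate average values cited in this section (not per-sample values):

```python
# Rough numeric illustration of why no adjustments are applied.
# Values are the approximate averages cited in this section; the
# true total concentration of 1.0 oocysts/L is hypothetical.
true_total = 1.0            # hypothetical true total concentration (oocysts/L)
avg_recovery = 0.40         # ~40% mean recovery for Methods 1622/23
infectious_fraction = 0.40  # ~37-42% of counted oocysts potentially infectious

measured_uncorrected = true_total * avg_recovery    # what monitoring reports
true_infectious = true_total * infectious_fraction  # health-relevant level
# Both come out near 0.4 oocysts/L: the uncorrected count is roughly a
# direct estimate of the infectious concentration, and the proposed
# risk bins are constructed on that uncorrected scale.
print(measured_uncorrected, true_infectious)  # 0.4 0.4
```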
Data Collection To Support Use of a Microbial Indicator by Small
Systems
As described in the next section, small systems will monitor for an
indicator, currently proposed to be E. coli, to determine if they are
required to sample for Cryptosporidium. The proposed E. coli levels
that will trigger Cryptosporidium monitoring are based on Information
Collection Rule and ICRSS data. However, to provide for a more
extensive evaluation of Cryptosporidium indicator criteria, EPA is
proposing that large systems measure E. coli and turbidity in their
source water when they sample for Cryptosporidium. This was recommended
by the Advisory Committee and will allow for possible development of
alternative indicator levels or parameters (e.g., turbidity in
combination with E. coli) to serve as triggers for small system
Cryptosporidium monitoring.
Time Frame for Monitoring
In recommending a time frame for LT2ESWTR monitoring, the Agency
considered the trade-off between monitoring over a long period to
better capture year-to-year fluctuations, and the desire to prescribe
additional treatment quickly to systems identified as having high
source water pathogen levels. Reflecting Advisory Committee
recommendations, EPA is proposing that large systems evaluate their
source water Cryptosporidium levels using 2 years of monitoring. This
will account for some degree of yearly variability, without
significantly delaying additional public health protection where
needed.
ii. Systems serving fewer than 10,000 people.
Indicator Monitoring
In recognition of the relatively high cost of analyzing samples for
Cryptosporidium, EPA and the Advisory Committee explored the use of
indicator criteria to identify drinking water sources that may have
high levels of Cryptosporidium occurrence. The goal was to find one or
more parameters that could be analyzed at low cost and identify those
systems likely to exceed the Bin 1 boundary of 0.075 oocysts/L. Data
from the Information Collection Rule and ICRSS were evaluated for
possible indicator parameters, including fecal coliforms, total
coliforms, E. coli, viruses (Information Collection Rule only), and
turbidity. Based on available data, E. coli was found to provide the
best performance as a Cryptosporidium indicator, and the inclusion of
other parameters like turbidity was not found to improve accuracy.
The next part of this section presents data that support E. coli
mean concentrations of 10/100 mL and 50/100 mL as proposed screening
levels that will trigger Cryptosporidium monitoring in reservoir/lake
and flowing stream systems, respectively. It describes how E. coli and
Cryptosporidium data from the Information Collection Rule and ICRSS
were analyzed and shows the performance of different concentrations of
E. coli as an indicator for systems that will exceed the Bin 1 boundary
of 0.075 oocysts/L.
Information Collection Rule data were evaluated as maximum running
annual averages (Information Collection Rule samples were collected
once per month for 18 months) while ICRSS data were evaluated using an
annual mean (ICRSS samples were collected twice per month for 12
months). In addition, as indicators were being evaluated it became
apparent that it was necessary to analyze plants separately based on
source water type, due to a significantly different relationship
between E. coli and Cryptosporidium in reservoir/lake systems compared
to flowing stream systems.
Analyzing the performance of an E. coli level as a screen to
trigger Cryptosporidium monitoring under the proposed LT2ESWTR involved
[[Page 47674]]
evaluating each water treatment plant in the data set relative to two
factors: (1) Did the plant E. coli level exceed the trigger value being
assessed? and (2) Did the plant mean Cryptosporidium concentration
exceed 0.075 oocysts/L? Accordingly, plants were sorted into four
categories, based on Cryptosporidium and E. coli concentrations:
• Plants with Cryptosporidium < 0.075 oocysts/L that did not
exceed the E. coli trigger level (Figure IV-1, box A)
• Plants with Cryptosporidium < 0.075 oocysts/L that exceeded
the E. coli trigger level (Figure IV-1, box B)
• Plants with Cryptosporidium >= 0.075 oocysts/L
that did not exceed the E. coli trigger level (Figure IV-1, box C)
• Plants with Cryptosporidium >= 0.075 oocysts/L
that exceeded the E. coli trigger level (Figure IV-1, box D)
Summary data with E. coli trigger concentrations ranging from 5 to 100
per 100 mL are presented for Information Collection Rule and ICRSS data
in Figures IV-2 and IV-3.
The performance of each E. coli level as a trigger for
Cryptosporidium monitoring was evaluated based on false negative and
false positive rates. False negatives occur when plants do not exceed
the E. coli trigger value, but exceed a Cryptosporidium level of 0.075
oocysts/L. False positives occur when plants exceed the E. coli trigger
value but do not exceed a Cryptosporidium level of 0.075 oocysts/L. The
false negative rate is critical because it characterizes the ability of
the indicator to identify those plants with high Cryptosporidium
levels. In general, low false negative rates can be achieved by
lowering the E. coli trigger concentration. However, when the E. coli
trigger concentration is decreased, more plants with low
Cryptosporidium levels in their source water exceed it. As a result,
more plants incur false positives. Consequently, identifying an
appropriate E. coli concentration to trigger Cryptosporidium monitoring
involves balancing false negatives and false positives to minimize
both.
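The screening evaluation described above can be sketched as follows. The box labels follow Figure IV-1; the plant data in the example are hypothetical, not values from the Information Collection Rule or ICRSS.

```python
# Sketch of the four-box (A-D) screen evaluation described in the text.
# Example plant data are hypothetical.
BIN1_BOUNDARY = 0.075  # oocysts/L

def screen_rates(plants, ecoli_trigger):
    """Return (false_negative_rate, false_positive_rate) for an E. coli
    trigger concentration, given (mean E. coli per 100 mL, mean
    Cryptosporidium oocysts/L) pairs. Assumes both high- and low-
    Cryptosporidium plants are present in the data set."""
    boxes = {"A": 0, "B": 0, "C": 0, "D": 0}
    for ecoli, crypto in plants:
        high_crypto = crypto >= BIN1_BOUNDARY
        triggered = ecoli > ecoli_trigger
        if high_crypto:
            boxes["D" if triggered else "C"] += 1
        else:
            boxes["B" if triggered else "A"] += 1
    false_negative = boxes["C"] / (boxes["C"] + boxes["D"])  # missed high plants
    false_positive = boxes["B"] / (boxes["A"] + boxes["B"])  # triggered low plants
    return false_negative, false_positive

# Six hypothetical flowing-stream plants evaluated at a 50/100 mL trigger.
plants = [(5, 0.01), (60, 0.02), (8, 0.005),    # below the Bin 1 boundary
          (80, 0.12), (20, 0.10), (120, 0.30)]  # above the Bin 1 boundary
fn, fp = screen_rates(plants, 50)
print(round(fn, 2), round(fp, 2))  # 0.33 0.33
```

Raising the trigger moves plants out of boxes B and D, trading false positives for false negatives, which is the balancing described above.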
Results of the indicator analysis for plants with flowing stream
sources are shown in Figure IV-2. An E. coli trigger concentration of
50/100 mL produced zero false negatives for both data sets. This means
that in these data sets, all plants that exceeded mean Cryptosporidium
concentrations of 0.075 oocysts/L also exceeded the E. coli trigger
concentration and would, therefore, be required to monitor. However,
this trigger concentration had a significant false positive rate (i.e.,
it was not highly specific in targeting only those plants with high
Cryptosporidium levels). False positive rates were 57% (24/42) and 53%
(9/17) with Information Collection Rule and ICRSS data, respectively.
At a higher E. coli trigger concentration, such as 100/100 mL, the
false negative rate increased to 12.5% (3/24) with Information
Collection Rule data and 50% (2/4) with ICRSS data, while the false
positive rate decreased to 43% (18/42) and 35% (6/17), respectively.
Consequently, EPA is proposing a mean E. coli concentration of 50/100
mL as a trigger for Cryptosporidium monitoring by small systems with
flowing stream sources.
Results of the indicator analysis for plants with reservoir/lake
sources are shown in Figure IV-3. An E. coli trigger of 10/100 mL
resulted in a false negative rate of 20% (2/10) with Information
Collection Rule data and 67% (2/3) with ICRSS data (misclassified 2 out
of 3 plants over 0.075 oocysts/L). Lowering the E. coli trigger
concentration to 5 per 100 mL decreased the false negative rate in both
the Information Collection Rule and ICRSS data sets by one plant, but
increased the false positive rate from 20% to 43% (13/30) in
the ICRSS data and from 24% to 39% (44/114) in the Information
Collection Rule data. Based on these results, EPA is proposing that a
mean E. coli concentration of 10/100 mL trigger small systems using
lake/reservoir sources into monitoring for Cryptosporidium. While the
false negative rate associated with this trigger value in the ICRSS
data set is high, the ICRSS data set contains only 3 reservoir/lake
plants that exceeded a Cryptosporidium level of 0.075 oocysts/L.
Due to limitations in the available data, the Advisory Committee
did not recommend that large systems use the E. coli indicator screen,
as Cryptosporidium monitoring is less of an economic burden for large
systems. Rather, the Advisory Committee recommended that large systems
sample for E. coli and turbidity when they monitor for Cryptosporidium
under the LT2ESWTR. These data will then be used to verify or, if
necessary, further refine the proposed indicator trigger values for
small systems. EPA concurs with these recommendations and they are
reflected in today's proposal.
The proposed monitoring schedule under the LT2ESWTR is set up to
allow EPA and stakeholders to evaluate large system monitoring data for
indicator relationships prior to the start of small system E. coli
monitoring. After one year of large system monitoring is completed, EPA
will begin analyzing monitoring data to assess whether alternative
indicator strategies would be appropriate. Depending on the findings of
this analysis, EPA may issue guidance to States on approving
alternative indicator trigger strategies for small systems. Therefore,
the proposed rule is written with the allowance for States to approve
alternative indicator strategies.
BILLING CODE 6560-50-P
[[Page 47675]]
[GRAPHIC: Figure IV-2. Indicator analysis results for plants with
flowing stream sources. TIFF omitted, TP11AU03.005]
[[Page 47676]]
[GRAPHIC: Figure IV-3. Indicator analysis results for plants with
reservoir/lake sources. TIFF omitted, TP11AU03.006]
[[Page 47677]]
BILLING CODE 6560-50-C
Cryptosporidium Monitoring
Small systems that exceed the E. coli trigger must conduct
Cryptosporidium monitoring, beginning 6 months after completion of E.
coli monitoring. As recommended by the Advisory Committee, EPA is
proposing that small systems collect 24 Cryptosporidium samples over a
period of one year. This number of samples is the same as required for
large systems, but the monitoring burden is targeted only at those
plants that E. coli monitoring indicates have elevated levels of
fecal matter in the source water. By completing Cryptosporidium
monitoring in one year, small systems will conduct a total of 2 years
of monitoring to determine LT2ESWTR bin classification (including the
one year of E. coli monitoring). This time frame is equivalent to the
requirement for large systems, which monitor for Cryptosporidium, E.
coli, and turbidity for 2 years.
The Stage 2 M-DBP Agreement in Principle recommended that EPA
explore the feasibility of alternative, lower frequency,
Cryptosporidium monitoring criteria for providing a conservative mean
estimate in small systems. As described earlier, EPA has evaluated
smaller sample sizes, such as systems taking 12 or 8 samples instead of
24 (see Table IV-6). However, EPA has concluded that these smaller
sample sizes result in unacceptably high misclassification rates. For
example, bin classification based on the second highest of 12 samples
produces an estimated false positive rate of 47% for systems with a
mean Cryptosporidium concentration 0.5 log below the Bin 1 boundary of
0.075 oocysts/L. In comparison, bin classification based on the mean of 24
samples achieves a false positive rate of 2.8% for systems at this
Cryptosporidium concentration. Consequently, EPA is proposing no
alternatives to the requirement that small systems take at least 24
samples.
Small system bin classification will be determined by the
arithmetic mean of the 24 samples collected over one year. Because the
bin structure in the LT2ESWTR is based on annual mean Cryptosporidium
levels, it is necessary that bin classification involve averaging
samples over at least one year. Consequently, small systems will
determine their bin classification by averaging results from all
Cryptosporidium samples collected during their one year of monitoring.
iii. Future monitoring and reassessment. EPA is proposing that
beginning 6 years after the initial bin classification, large and small
systems conduct another round of monitoring to determine if source
water conditions have changed to a degree that may warrant a revised
bin classification. The Advisory Committee recommended that EPA convene
a stakeholder process within 4 years after the initial bin
classification to develop recommendations on how best to proceed with
implementing this second round of monitoring. Unless EPA modifies the
LT2ESWTR to allow for an improved analytical method or a revised bin
structure based on new risk information, the second round of monitoring
will be conducted under the same requirements that apply to the initial
round of monitoring.
In addition, EPA is proposing to use the required assessment of the
water source during sanitary surveys as an ongoing measure of whether
significant changes in watersheds have occurred that may lead to
increased contamination. Where the potential for increased
contamination is identified, States must determine what follow-up
actions by the system are necessary, including the possibility of the
system providing additional treatment from the microbial toolbox.
d. Basis for accepting previously collected data. Members of the
Advisory Committee had multiple objectives in recommending that EPA
allow the use of previously collected (grandfathered) Cryptosporidium
data. These include (1) giving credit for data collected by proactive
utilities, (2) facilitating early determination of LT2ESWTR compliance
needs and, thereby, allowing for early planning of appropriate
treatment selection, (3) increasing laboratory capacity to meet demand
for Cryptosporidium analysis under the LT2ESWTR, and (4) allowing
utilities to improve their data set for bin determination by
considering more than 2 years of data (i.e., include data collected
prior to effective date of LT2ESWTR). The latter objective incorporates
the assumption that occurrence can vary from year to year, so that if
more years of data are used in the bin determination, the source water
concentration estimate will be a more accurate representation of the
overall mean.
A significant issue with accepting previously collected data for
making bin determinations is ensuring that the data are of equivalent
quality to data that will be collected following LT2ESWTR promulgation.
As noted previously, EPA is establishing requirements so that data
collected under the LT2ESWTR will be similar in quality to data that
were generated under the ICRSS. These requirements include the use of
approved analytical methods and compliance with method quality control
(QC) criteria, use of approved laboratories, minimum sample volume, and
a sampling schedule with minimum frequency. For example, under the
ICRSS, laboratories analyzed 10 L samples and (considered collectively)
achieved a mean Cryptosporidium recovery of approximately 43% in spiked
source water with a relative standard deviation (RSD) of 50%. EPA
anticipates that laboratories conducting Cryptosporidium analysis for
the LT2ESWTR will collectively achieve similar analytical method
performance. Consequently, EPA expects previously collected data sets
used under the LT2ESWTR to meet these standards and has established
criteria for accepting previously collected data accordingly (see
section IV.A.1.d).
Systems are requested, but not required, to notify EPA prior to
promulgation of the LT2ESWTR of their intent to submit previously
collected data. This will help EPA allocate the resources that will be
needed to evaluate these data in order to make a decision on adequacy
for bin determination. Systems that have at least 2 years of previously
collected data to grandfather when the LT2ESWTR is promulgated and do
not intend to conduct new monitoring under the rule are required to
submit the previously collected data to EPA within 2 months following
promulgation. This will enable EPA to evaluate the data and report back
to the utility in sufficient time to allow, if needed, the utility to
contract with a laboratory to conduct monitoring under the LT2ESWTR.
Systems that have fewer than 2 years of previously collected data
to grandfather when the LT2ESWTR is promulgated, or that intend to
grandfather 2 or more years of previously collected data and also
conduct new monitoring under the rule, are required to submit the
previously collected data to EPA within 8 months following
promulgation. This will allow these utilities to continue to collect
previously collected data in the 6 month period between promulgation
and the date when monitoring under the LT2ESWTR must begin, plus a 2
month period for systems to compile the data and supporting
documentation. Utilities may submit the data earlier than 8 months
after promulgation if they acquire 2 years of previously collected data
before this date.
Submitted grandfathered data sets must include all routine source
water monitoring results for samples collected during the time period
covered by the
[[Page 47678]]
grandfathered data set (i.e., the time period between collection of the
first and last samples in the data set). However, systems are not
required under the LT2ESWTR to submit previously collected data for
samples outside of this time period.
3. Request for Comment
EPA requests comments on all aspects of the monitoring and
treatment requirements proposed in this section. In addition, EPA
requests comment on the following issues:
Requirements for Systems That Use Surface Water for Only Part of the
Year
Bin classification for the LT2ESWTR is based on the mean annual
source water Cryptosporidium level. Consequently, today's proposal
requires E. coli and Cryptosporidium monitoring to be conducted over
the full year. However, EPA recognizes that some systems use surface
water for only part of the year. This occurs with systems that use
surface water seasonally (e.g., during the summer) to supplement ground
water sources and with systems, such as campgrounds, that are in
operation for only part of the year. Year round monitoring for
these systems may present both logistic and economic difficulties. EPA
is requesting comment on how to apply LT2ESWTR monitoring requirements
to surface water systems that operate or use surface water for only
part of the year. Possible approaches that may be considered for
comment include the following:
Small public water systems that operate or use surface water for
only part of the year could be required to collect E. coli samples at
least bi-weekly during the period when they use surface water. If the
mean E. coli concentration did not exceed the trigger level (e.g., 10/
100 mL for reservoirs/lakes or 50/100 mL for flowing streams), systems
could apply to the State to waive any additional E. coli monitoring.
The State could grant the waiver, require additional E. coli
monitoring, or require monitoring of an alternate indicator. If the
mean E. coli concentration exceeded the trigger level, the State could
require the system to provide additional treatment for Cryptosporidium
consistent with Bin 4 requirements, or require monitoring of
Cryptosporidium or an indicator, with the results potentially leading
to additional Cryptosporidium treatment requirements.
Large public water systems that operate or use surface water for
only part of the year could be required to collect Cryptosporidium
samples (along with E. coli and turbidity) either twice per month
during the period when they use surface water or 12 samples per year,
whichever is fewer. Samples would be collected during the two years
of the required monitoring period, and bin classification would be
based on the highest average of the two years.
EPA requests comment on these and other approaches for both small
and large systems.
Previously Collected Monitoring Data That Do Not Meet QC Requirements
EPA is proposing requirements for acceptance of previously
collected monitoring data that are equivalent to requirements for data
generated under the LT2ESWTR. The Agency is aware that systems will
have previously collected Cryptosporidium data that do not meet all
sampling and analysis requirements (e.g., quality control, sample
frequency, sample volume) proposed for data collected under the
LT2ESWTR. However, the Agency has been unable to develop an approach
for allowing systems to use such data for LT2ESWTR bin classification.
This is due to uncertainty regarding the impact of deviations from
proposed sampling and analysis requirements on data quality and
reliability. For example, Methods 1622 and 1623 have been validated
within the limits of the QC criteria specified in these methods. While
very minor deviations from required QA/QC criteria may have only a
minor impact on data quality, the Agency has not identified a basis for
establishing alternative standards for data acceptability.
EPA requests comment on whether or under what conditions previously
collected data that do not meet the proposed criteria for LT2ESWTR
monitoring data should be accepted for use in bin determination.
Specifically, EPA requests comment on the sampling frequency
requirement for previously collected data, and whether EPA should allow
samples collected at lower or varying frequencies to be used as long as
the data are representative of seasonal variation and include the
required number of samples. If so, how should EPA determine whether
such a data set is unbiased and representative of seasonal variation?
How should data collected at varying frequency be averaged?
Monitoring for Systems That Recycle Filter Backwash
Plants that recycle filter backwash water may, in effect, increase
the concentration of Cryptosporidium in the water that enters the
filtration treatment train. Under the LT2ESWTR proposal, microbial
sampling may be conducted on source water prior to the addition of
filter backwash water. EPA requests comment on how the effect of
recycling filter backwash should be considered in LT2ESWTR monitoring.
Bin Assignment for Systems That Fail To Complete Required Monitoring
Today's proposal classifies systems that fail to complete required
monitoring in Bin 4, the highest treatment bin. EPA requests comment on
alternative approaches for systems that fail to complete required
monitoring, such as classifying the system in a bin based on data the
system has collected, or classifying the system in a bin one level
higher than the bin indicated by the data the system has collected. The
shortcoming of these alternative approaches is that bin
becomes more uncertain, and the likelihood of bin misclassification
increases, as systems collect fewer than the required 24
Cryptosporidium samples. Consequently, the proposed approach is for
systems to collect all required samples.
Note that under today's proposal, systems may provide 5.5 log of
treatment for Cryptosporidium (i.e., comply with Bin 4 requirements) as
an alternative to monitoring. Where systems notify the State that they
will provide treatment instead of monitoring, they will not incur
monitoring violations.
Monitoring Requirements for New Plants and Sources
The proposed LT2ESWTR would establish calendar dates when the
initial and second round of source water monitoring must be conducted
to determine bin classification. EPA recognizes that new plants will
begin operation, and that existing plants will access new sources,
after these dates. EPA believes that new plants and plants switching
sources should conduct monitoring equivalent to that required of
existing plants to determine the required level of Cryptosporidium
treatment. The monitoring could be conducted before a new plant or
source is brought on-line, or initiated within some time period
afterward. EPA requests comment on monitoring and treatment
requirements for new plants and sources.
Determination of LT2ESWTR Bin Classification
In today's proposal, EPA expects that systems will be assigned to
LT2ESWTR risk bins based on their reported Cryptosporidium monitoring
results and the calculations proposed for bin
assignment described in this section. EPA requests comment on whether
bin classifications should formally be made or reviewed by States.
Source Water Type Classification for Systems That Use Multiple Sources
In today's proposal, the E. coli concentrations that trigger small
system Cryptosporidium monitoring are different for systems using lake/
reservoir and flowing stream sources. However, EPA recognizes that some
systems use multiple sources, potentially including both lake/reservoir
and flowing stream sources, and that the use of different sources may
vary during the year. Further, some systems use sources that are ground
water under the direct influence (GWUDI) of surface water. EPA requests
comment on how to apply the E. coli criteria for triggering
Cryptosporidium monitoring to systems using multiple sources and GWUDI
sources.
B. Unfiltered System Treatment Technique Requirements for
Cryptosporidium
1. What Is EPA Proposing Today?
a. Overview. EPA is proposing treatment technique requirements for
Cryptosporidium in unfiltered systems. Today's proposal requires all
unfiltered systems using surface water or ground water under the direct
influence of surface water to achieve at least 2 log (99%) inactivation
of Cryptosporidium prior to the distribution of finished water.
Further, unfiltered systems must monitor for Cryptosporidium in their
source water, and where monitoring demonstrates a mean level above 0.01
oocysts/L, systems must provide at least 3 log Cryptosporidium
inactivation. Disinfectants that can be used to meet this treatment
requirement include ozone, ultraviolet (UV) light, and chlorine
dioxide.
All current requirements for unfiltered systems under 40 CFR 141.71
and 141.72(a) remain in effect, including requirements to inactivate at
least 3 log of Giardia lamblia and 4 log of viruses. In addition,
unfiltered systems must meet their overall disinfection requirements
using a minimum of two disinfectants. These proposed requirements
reflect recommendations of the Stage 2 M-DBP Federal Advisory
Committee. Details of the proposed requirements are described in the
following sections.
b. Monitoring requirements. Requirements for Cryptosporidium
monitoring by unfiltered systems are similar to requirements for
filtered systems of the same size, as given in section IV.A.1.
Unfiltered systems serving at least 10,000 people must sample their
source water for Cryptosporidium at least monthly for two years,
beginning no later than 6 months after promulgation of this rule.
Samples may be collected more frequently (e.g., semi-monthly, weekly)
as long as a consistent frequency is maintained throughout the
monitoring period.
Unfiltered systems serving fewer than 10,000 people must conduct
source water sampling for Cryptosporidium at least twice per month for
one year, beginning no later than 4 years following promulgation of
this rule (i.e., on the same schedule as small filtered systems).
However, unlike small filtered systems, small unfiltered systems cannot
monitor for an indicator (e.g., E. coli) to determine if they are
required to monitor for Cryptosporidium. EPA has not identified
indicator criteria that can effectively screen for plants with
Cryptosporidium concentrations below 0.01 oocysts/L. Consequently, all
small unfiltered systems must conduct Cryptosporidium monitoring.
As described in sections IV.K and IV.L, Cryptosporidium analyses
must be performed on at least 10 L per sample with EPA Methods 1622 or
1623, and must be conducted by laboratories approved for these methods
by EPA. Analysis of larger sample volumes is allowed, provided the
laboratory has demonstrated comparable method performance to that
achieved on a 10 L sample. Section IV.J describes requirements for
reporting sample analysis results. All Cryptosporidium samples must be
collected in accordance with a schedule that is developed by the system
and submitted to EPA or the State at least 3 months prior to initiation
of sampling. Refer to section IV.A.1 for requirements pertaining to any
failure to report a valid sample analysis result for a scheduled
sampling date and procedures for collecting a replacement sample.
Unfiltered systems are required to participate in future
Cryptosporidium monitoring on the same schedule as filtered systems of
the same size. Future monitoring requirements for filtered systems are
described in section IV.A.1.
Unfiltered systems are not required to conduct source water
Cryptosporidium monitoring under the LT2ESWTR if the system currently
provides or will provide a total of at least 3 log Cryptosporidium
inactivation, equivalent to meeting the treatment requirements for
unfiltered systems with a mean Cryptosporidium concentration of greater
than 0.01 oocysts/L. Systems must notify the State not later than the
date the system is otherwise required to submit a sampling schedule for
monitoring. Systems must install and operate technologies to provide a
total of at least 3 log Cryptosporidium inactivation by the applicable
date in Table IV-24.
c. Treatment requirements. All unfiltered systems must provide
treatment for Cryptosporidium, and the degree of required treatment
depends on the level of Cryptosporidium in the source water as
determined through monitoring. Unfiltered systems must calculate their
average source water Cryptosporidium concentration using the arithmetic
mean of all samples collected during the required two year monitoring
period (or one year monitoring period for small systems). For
unfiltered systems with mean source water Cryptosporidium levels of
less than or equal to 0.01 oocysts/L, 2 log Cryptosporidium
inactivation is required. Where the mean source water level is greater
than 0.01 oocysts/L, 3 log inactivation is required.
In addition, unfiltered systems are required to use at least two
different disinfectants to meet their overall inactivation requirements
for viruses (4 log), Giardia lamblia (3 log), and Cryptosporidium (2 or
3 log). Further, each of the two disinfectants must achieve by itself
the total inactivation required for one of these three pathogen types.
For example, a system could use UV light to achieve 2 log inactivation
of Cryptosporidium and Giardia lamblia, and use chlorine to inactivate
1 log Giardia lamblia and 4 log viruses. In this case, chlorine would
achieve the total inactivation required for viruses while UV light
would achieve the total inactivation required for Cryptosporidium, and
the two disinfectants together would meet the overall treatment
requirements for viruses, Giardia lamblia, and Cryptosporidium. In all
cases unfiltered systems must continue to meet disinfectant residual
requirements for the distribution system.
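The two-part test in the preceding paragraph can be expressed compactly. The sketch below is illustrative only; the pathogen labels and per-disinfectant log credits are hypothetical inputs, and actual credits would come from the CT and UV dose criteria in sections IV.C.14-15:

```python
# Illustrative check of the proposed two-disinfectant requirement:
# (1) combined credits must meet every inactivation requirement, and
# (2) each disinfectant must by itself achieve the total inactivation
#     required for at least one pathogen type. Hypothetical values.

REQUIRED = {"virus": 4.0, "giardia": 3.0, "crypto": 2.0}  # log inactivation

def meets_requirements(credits_by_disinfectant):
    """credits_by_disinfectant maps disinfectant -> {pathogen: log credit}."""
    totals = {p: sum(d.get(p, 0.0) for d in credits_by_disinfectant.values())
              for p in REQUIRED}
    if any(totals[p] < REQUIRED[p] for p in REQUIRED):
        return False  # combined treatment falls short for some pathogen
    # Each disinfectant must fully cover at least one pathogen on its own.
    return all(any(d.get(p, 0.0) >= REQUIRED[p] for p in REQUIRED)
               for d in credits_by_disinfectant.values())

# The UV + chlorine example from the text above:
combo = {"UV":       {"crypto": 2.0, "giardia": 2.0},
         "chlorine": {"giardia": 1.0, "virus": 4.0}}
print(meets_requirements(combo))  # True: UV covers crypto, chlorine viruses
```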
EPA has developed criteria, described in sections IV.C.14-15, for
systems to determine Cryptosporidium inactivation credits for chlorine
dioxide, ozone, and UV light. Unfiltered systems are allowed to use any
of these disinfectants to meet the 2 (or 3) log Cryptosporidium
inactivation requirement. The following paragraphs describe standards
for demonstrating compliance with the proposed Cryptosporidium
treatment technique requirement. For systems using ozone and chlorine
dioxide, these standards are similar to current standards for
compliance with Giardia
lamblia and virus treatment requirements, as established by the SWTR in
40 CFR 141.72 and 141.74. However, for systems using UV light, modified
compliance standards are proposed, due to the different way in which UV
disinfection systems will be monitored.
Each day a system using ozone or chlorine dioxide serves water to
the public, the system must calculate the CT value(s) from the system's
treatment parameters, using the procedures specified in 40 CFR
141.74(b)(3). The system must determine whether these values are
sufficient to achieve the required inactivation of Cryptosporidium
based on the CT criteria specified in section IV.C.14. The disinfection
treatment must ensure at least 99 percent (or 99.9 percent if required)
inactivation of Cryptosporidium every day the system serves water to
the public, except any one day each month. Systems are required to
report daily CT values on a monthly basis, as described in section
IV.J.
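The chemical disinfection compliance standard above ("every day except any one day each month") can be sketched as follows. This is an illustrative reading only; whether a given day's CT achieves the required inactivation is determined from the CT criteria in section IV.C.14, and here a precomputed daily pass/fail record is assumed:

```python
# Illustrative sketch: monthly compliance for ozone/chlorine dioxide
# systems, where at most one day per month may fail to achieve the
# required Cryptosporidium inactivation. Inputs are hypothetical.

def month_in_compliance(daily_ct_sufficient):
    """daily_ct_sufficient: one boolean per day water was served,
    True if that day's CT achieved the required inactivation."""
    failures = daily_ct_sufficient.count(False)
    return failures <= 1   # any one day per month is tolerated

print(month_in_compliance([True] * 29 + [False]))         # True
print(month_in_compliance([True] * 28 + [False, False]))  # False
```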
Each day a system using UV light serves water to the public, the
system must monitor for the parameters, including flow rate and UV
intensity, that demonstrate whether the system's UV reactors are
operating within the range of conditions that have been validated to
achieve the required UV dose, as specified in section IV.C.15. Systems
must monitor each UV reactor while in use and must record periods when
any reactor operates outside of validated conditions. The disinfection
treatment must ensure at least 99 percent (or 99.9 percent if required)
inactivation of Cryptosporidium in at least 95 percent of the water
delivered to the public every month. Systems are required to report
periods when UV reactors operate outside of validated conditions on a
monthly basis, as described in section IV.J.
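The UV standard differs in that it is volumetric rather than daily. A minimal sketch of the 95 percent test described above, with hypothetical monthly volumes:

```python
# Illustrative sketch of the proposed UV compliance standard: at least
# 95 percent of the water delivered to the public each month must be
# treated under validated operating conditions. Volumes are hypothetical.

def uv_month_in_compliance(validated_volume, total_volume):
    """Volumes in any consistent unit (e.g., million liters)."""
    return validated_volume / total_volume >= 0.95

print(uv_month_in_compliance(970.0, 1000.0))  # True  (97% validated)
print(uv_month_in_compliance(940.0, 1000.0))  # False (94% validated)
```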
Unfiltered systems currently must comply with requirements for DBPs
as a condition of avoiding filtration under 40 CFR 141.71(b)(6). As
described earlier, EPA is developing a Stage 2 DBPR, which will further
limit allowable levels of certain DBPs, specifically trihalomethanes
and haloacetic acids. EPA intends to incorporate new standards for DBPs
established under the Stage 2 DBPR into the criteria for filtration
avoidance.
2. How Was This Proposal Developed?
a. Basis for Cryptosporidium treatment requirements. The intent of
the proposed treatment requirements for unfiltered systems is to
achieve public health protection against Cryptosporidium equivalent to
filtration systems. As described in section III.C, an assessment of
survey data indicates that under current treatment requirements,
finished water Cryptosporidium levels are higher in unfiltered systems
than in filtered systems.
Information Collection Rule data show an average plant-mean
Cryptosporidium level of 0.59 oocysts/L in the source water of filtered
plants and 0.014 oocysts/L in unfiltered systems. Median plant-mean
concentrations were 0.052 and 0.0079 oocysts/L in filtered and
unfiltered system sources, respectively. Thus, these results suggest
that typical Cryptosporidium occurrence in filtered system sources is
approximately 10 times higher than in unfiltered system sources.
In translating these data to assess finished water risk, EPA and
the Advisory Committee estimated that conventional plants in compliance
with the IESWTR achieve an average Cryptosporidium removal of 3 log
(see discussion in section III.D). Hence, if the median source water
Cryptosporidium level at conventional plants is approximately 10 times
higher than at unfiltered systems, and it is estimated that
conventional plants achieve an average reduction of 3 log (99.9%), then
the median finished water Cryptosporidium concentration at conventional
plants is lower by a factor of 100 than at unfiltered systems.
Therefore, to ensure equivalent public health protection, unfiltered
systems should reduce Cryptosporidium levels by 2 log.
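The arithmetic behind the 2 log figure is worth making explicit. The following sketch simply restates the reasoning above in code; the ratios are the document's own estimates, not new data:

```python
import math

# Arithmetic behind the 2 log equivalency derived above (illustrative).
source_ratio = 10          # filtered sources hold ~10x more Cryptosporidium
filtered_removal = 10**3   # conventional plants average ~3 log (1000x) removal

# If unfiltered finished water is at level s, filtered finished water is
# (10 * s) / 1000 = s / 100, i.e., 100-fold lower. Unfiltered systems
# therefore need a 100-fold (2 log) reduction for equivalent protection.
relative_factor = filtered_removal / source_ratio
print(math.log10(relative_factor))  # 2.0
```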
Due to the development of criteria for Cryptosporidium inactivation
with ozone, chlorine dioxide, and UV light, EPA has determined that it
is feasible for unfiltered systems to comply with a Cryptosporidium
treatment technique requirement. Consequently, EPA is proposing that
all unfiltered systems provide at least 2 log inactivation of
Cryptosporidium.
The proposed treatment requirements for unfiltered systems with
higher source water Cryptosporidium levels are consistent with proposed
treatment requirements for filtered systems. As discussed previously,
EPA is proposing that filtered plants with mean source water
Cryptosporidium levels between 0.075 and 1.0 oocysts/L, as measured by
Methods 1622 and 1623, provide at least a 4 log reduction (with greater
treatment required for higher source water pathogen levels). These
requirements will achieve average treated water Cryptosporidium
concentrations below 1 oocyst/10,000 L in filtered systems. An
unfiltered system with a mean source water Cryptosporidium
concentration above 0.01 oocyst/L would need to provide more than 2 log
inactivation in order to achieve an equivalent finished water oocyst
level. Therefore, EPA is proposing that unfiltered systems provide at
least 3 log inactivation where mean concentrations exceed 0.01 oocysts/
L.
For unfiltered systems using UV disinfection to meet these proposed
Cryptosporidium treatment requirements, EPA is proposing that
compliance be based on a 95th percentile standard (i.e., at least 95
percent of the water must be treated to the required UV dose). This
standard is intended to be comparable with the ``every day except any
one day per month'' compliance standard established by the SWTR for
chemical disinfection (see 40 CFR 141.72(a)(1)). Because UV
disinfection systems will typically consist of multiple parallel
reactors that will be monitored continuously, the Agency has determined
that it is more appropriate to base a compliance determination on the
percentage of water disinfected to the required level, rather than a
single daily measurement. The UV Disinfection Guidance Manual (USEPA
2003d) will provide advice on meeting this proposed standard. A draft
of this guidance is available in the docket for today's proposal
(http://www.epa.gov/edocket/).
b. Basis for requiring the use of two disinfectants. EPA is
proposing that unfiltered systems use at least two different
disinfectants to meet the 2 (or 3), 3, and 4 log inactivation
requirements for Cryptosporidium, Giardia lamblia, and viruses,
respectively. The purpose of this requirement is to provide for
multiple barriers of protection against pathogens. One benefit of this
approach is that if one barrier were to fail then there would still be
one remaining barrier to provide protection against some of the
pathogens that might be present. For example, if a plant used UV to
inactivate Cryptosporidium and Giardia lamblia, along with chlorine to
inactivate viruses, and the UV system were to malfunction, the chlorine
would still meet the treatment requirement for viruses and would
provide some degree of protection against Giardia lamblia.
Another benefit of multiple barriers is that they will typically
provide more effective protection against a broad spectrum of pathogens
than a single disinfectant. Because the efficacy of disinfectants
against different pathogens varies widely, using multiple disinfectants
will generally provide more efficient inactivation of a wide
range of pathogens than a single disinfectant.
EPA is aware, though, that this requirement would not result in a
redundant barrier for each type of pathogen. In the example of a plant
using chlorine and UV, the chlorine would provide essentially no
protection against Cryptosporidium and might achieve only a small
amount of Giardia lamblia inactivation if it was designed primarily to
inactivate viruses. However, since the watersheds of unfiltered systems
are required to be protected (40 CFR 141.71), the probability is low
that high levels of Cryptosporidium or Giardia lamblia would occur
during the time frame necessary to address a short period of treatment
failure.
Note the request for comment on this topic at the end of this
section.
c. Basis for source water monitoring requirements. Monitoring by
unfiltered systems is necessary to identify those with mean source
water Cryptosporidium levels above 0.01 oocysts/L. In order to allow
for simultaneous compliance with other microbial and disinfection
byproduct regulatory requirements, EPA is proposing that unfiltered
systems monitor for Cryptosporidium on the same schedule as filtered
systems of the same size. Because EPA was not able to identify
indicator criteria, such as E. coli, that can discriminate among
systems above and below a mean Cryptosporidium concentration of 0.01
oocysts/L, EPA is proposing that all unfiltered systems monitor for
Cryptosporidium.
Consistent with requirements for filtered systems, unfiltered
systems are required to analyze at least 24 samples of at least 10 L
over the two year monitoring period (one year for small systems).
However, if an unfiltered system collected and analyzed only 24 samples
of 10 L then a total count of 3 oocysts among all samples would result
in a source water concentration exceeding 0.01 oocysts/L. To reduce the
likelihood that a small number of oocyst counts will trigger an
additional treatment requirement, unfiltered systems may consider
sampling or analyzing larger sample volumes (e.g., 50 L). Since the
water sources of unfiltered systems tend to have very low turbidity
(compared to average sources in filtered systems), it is typically more
feasible to analyze larger sample volumes in unfiltered systems.
Filters have been approved for Cryptosporidium analysis of 50 L
samples. Note that analysis of larger sample volumes would not reduce
the required sampling frequency.
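The counting arithmetic discussed above can be verified directly. This is an illustrative sketch only; the function name is hypothetical:

```python
# Illustrative sketch: mean source water concentration from a total
# oocyst count spread across the full monitored volume.

def mean_conc(total_oocysts, n_samples, volume_per_sample_L):
    """Mean concentration (oocysts/L) over the monitoring period."""
    return total_oocysts / (n_samples * volume_per_sample_L)

# 24 samples of 10 L: a total of 3 oocysts already exceeds 0.01 oocysts/L...
print(mean_conc(3, 24, 10))   # 0.0125
# ...while with 50 L samples the same total count stays well below it.
print(mean_conc(3, 24, 50))   # 0.0025
```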
3. Request for Comment
EPA solicits comment on the proposed monitoring and treatment
technique requirements for unfiltered systems. Specifically, the Agency
seeks comment on the following issues:
Use of Two Disinfectants
EPA requests comment on the proposed requirement for unfiltered
systems to use two disinfectants and for each disinfectant to meet by
itself the inactivation requirement for at least one regulated
pathogen. The requirement for unfiltered systems to use two
disinfectants was recommended by the Advisory Committee because (1)
disinfectants vary in their efficacy against different pathogens, so
that the use of multiple disinfectants can provide more effective
protection against a broad spectrum of pathogens, and (2) multiple
disinfectants provide multiple barriers of protection, which can be
more reliable than a single disinfectant.
An alternate approach would be to allow systems to meet the
inactivation requirements using any combination of one or more
disinfectants that achieved the required inactivation level for all
pathogens. This would give systems greater flexibility and could spur
the development of new disinfection techniques that would be applicable
to a wide range of pathogens. However, this approach might be less
protective against unregulated pathogens. A related question is whether
the proposed requirements for use of two disinfectants establish an
adequate level of multiple barriers in the treatment provided by
unfiltered systems.
Treatment Requirements for Unfiltered Systems With Higher
Cryptosporidium Levels
Under the proposed LT2ESWTR, a filtered system that measures a mean
source water Cryptosporidium level of 0.075 oocysts/L or higher is
required to provide a total of 4 log or more reduction of
Cryptosporidium. However, if an unfiltered system meeting the criteria
for avoiding filtration were to measure Cryptosporidium at this level,
it would be required to provide only 3 log treatment. Available
occurrence data indicate that very few, if any, unfiltered systems will
measure mean source water Cryptosporidium concentrations above 0.075
oocysts/L. However, EPA requests comment on whether or how this
possibility should be addressed.
C. Options for Systems To Meet Cryptosporidium Treatment Requirements
1. Microbial Toolbox Overview
The LT2ESWTR proposal contains a list of treatment processes and
management practices for water systems to use in meeting additional
Cryptosporidium treatment requirements under the LT2ESWTR. This list,
termed the microbial toolbox, was recommended by the Stage 2 M-DBP
Advisory Committee in the Agreement in Principle. Components of the
microbial toolbox include watershed control programs, alternative
sources, pretreatment processes, additional filtration barriers,
inactivation technologies, and enhanced plant performance. The intent
of the microbial toolbox is to provide water systems with broad
flexibility in selecting cost-effective LT2ESWTR compliance strategies.
Moreover, the toolbox allows systems that currently provide additional
pathogen barriers or that can demonstrate enhanced performance to
receive additional Cryptosporidium treatment credit.
A key feature of the microbial toolbox is that many of the
components carry presumptive credits towards Cryptosporidium treatment
requirements. Plants will receive these credits for toolbox components
by demonstrating compliance with required design and implementation
criteria, as described in the sections that follow. Treatment credit
greater than the presumptive credit may be awarded for a toolbox
component based on a site-specific or technology-specific demonstration
of performance, as described in section IV.C.17.
While the Advisory Committee made recommendations for the degree of
presumptive treatment credit to be granted to different toolbox
components, the Committee did not specify the design and implementation
conditions under which the credit should be awarded. EPA has identified
and is proposing such conditions in today's notice, based on an
assessment of available data. For certain toolbox components, such as
raw water storage and roughing filters, the Agency concluded that
available data do not support the credit recommended by the Advisory
Committee. Consequently, EPA is not proposing a presumptive credit for
these options.
For each microbial toolbox component, EPA is requesting comment on:
(1) Whether available data support the proposed presumptive credits,
including the design and implementation conditions under which
the credit would be awarded, (2) whether available data are consistent
with the decision not to award presumptive credit for roughing filters
and raw water off-stream storage, and (3) whether additional data are
available on treatment effectiveness of toolbox components for reducing
Cryptosporidium levels. EPA will consider modifying today's proposal
for microbial toolbox components based on new information that may be
provided.
EPA particularly solicits comment on the performance of alternative
filtration technologies that are currently being used, as well as ones
that systems are considering for use in the future, specifically
including bag filters, cartridge filters, and bank filtration, in
removing Cryptosporidium. The Agency requests both laboratory and field
data that will support a determination of the appropriate level of
Cryptosporidium removal credit to award to these technologies. In
addition, the Agency requests information on the applicability of these
technologies to different source water types and treatment scenarios.
Data submitted in response to this request for comment should include,
where available, associated quality assurance and cost information.
This preamble discusses bank filtration in section IV.C.6 and bag and
cartridge filters in section IV.C.12.
Table IV-7 summarizes presumptive credits and associated design and
implementation criteria for microbial toolbox components. Each
component is then described in more detail in the sections that follow.
EPA is also developing guidance to assist systems with implementing
toolbox components. Pertinent guidance documents include: UV
Disinfection Guidance Manual (USEPA 2003d), Membrane Filtration
Guidance Manual (USEPA 2003e), and Toolbox Guidance Manual (USEPA
2003f). Each is available in draft form in the docket for today's
proposal (http://www.epa.gov/edocket/).
Table IV-7.--Microbial Toolbox: Proposed Options, Log Credits, and
Design/Implementation Criteria \1\
------------------------------------------------------------------------
         Toolbox option            Proposed Cryptosporidium log credit
                                     with design and implementation
                                              criteria\1\
------------------------------------------------------------------------
Watershed control program......... 0.5 log credit for State-approved
program comprising EPA specified
elements. Does not apply to
unfiltered systems.
Alternative source/Intake No presumptive credit. Systems may
management. conduct simultaneous monitoring for
LT2ESWTR bin classification at
alternative intake locations or
under alternative intake management
strategies.
Off-stream raw water storage...... No presumptive credit. Systems using
off-stream storage must conduct
LT2ESWTR sampling after raw water
reservoir to determine bin
classification.
Pre-sedimentation basin with 0.5 log credit with continuous
coagulation. operation and coagulant addition;
basins must achieve 0.5 log
turbidity reduction based on the
monthly mean of daily measurements
in 11 of the 12 previous months;
all flow must pass through basins.
Systems using existing pre-sed
basins must sample after basins to
determine bin classification and
are not eligible for presumptive
credit.
Lime softening.................... 0.5 log additional credit for two-
stage softening (single-stage
softening is credited as equivalent
to conventional treatment).
Coagulant must be present in both
stages--includes metal salts,
polymers, lime, or magnesium
precipitation. Both stages must
treat 100% of flow.
Bank filtration (as pretreatment). 0.5 log credit for 25 ft. setback;
1.0 log credit for 50 ft. setback;
aquifer must be unconsolidated sand
containing at least 10% fines;
average turbidity in wells must be
< 1 NTU. Systems using existing
wells followed by filtration must
monitor well effluent to determine
bin classification and are not
eligible for presumptive credit.
Combined filter performance....... 0.5 log credit for combined filter
effluent turbidity <= 0.15 NTU in
95% of samples each month.
Roughing filters.................. No presumptive credit proposed.
Slow sand filters................. 2.5 log credit as a secondary
filtration step; 3.0 log credit as
a primary filtration process. No
prior chlorination.
Second stage filtration........... 0.5 log credit for second separate
filtration stage; treatment train
must include coagulation prior to
first filter. No presumptive credit
for roughing filters.
Membranes......................... Log credit equivalent to removal
efficiency demonstrated in
challenge test for device if
supported by direct integrity
testing.
Bag filters....................... 1 log credit with demonstration of
at least 2 log removal efficiency
in challenge test.
Cartridge filters................. 2 log credit with demonstration of
at least 3 log removal efficiency
in challenge test.
Chlorine dioxide.................. Log credit based on demonstration of
log inactivation using CT table.
Ozone............................. Log credit based on demonstration of
log inactivation using CT table.
UV................................ Log credit based on demonstration of
inactivation with UV dose table;
reactor testing required to
establish validated operating
conditions.
Individual filter performance..... 1.0 log credit for demonstration of
filtered water turbidity < 0.1 NTU
in 95 percent of daily max values
from individual filters (excluding
15 min period following backwashes)
and no individual filter > 0.3 NTU in two consecutive
measurements taken 15 minutes
apart.
Demonstration of performance...... Credit awarded to unit process or
treatment train based on
demonstration to the State, through
use of a State-approved protocol.
------------------------------------------------------------------------
\1\ Table provides summary information only; refer to following preamble
and regulatory language for detailed requirements.
2. Watershed Control Program
a. What is EPA proposing today? EPA is proposing a 0.5 log credit
towards Cryptosporidium treatment requirements under the LT2ESWTR for
filtered systems that develop a State-approved watershed control
program designed to reduce the level of Cryptosporidium. The watershed
control program credit can be added to the credit awarded for any other
toolbox component. However, this credit is not available to unfiltered
systems, as they are currently required under 40 CFR 141.71 to
maintain a watershed control
program that minimizes the potential for contamination by
Cryptosporidium as a criterion for avoiding filtration.
There are many potential sources of Cryptosporidium in watersheds,
including sewage discharges and non-point sources associated with
animal feces. The feasibility, effectiveness, and sustainability of
control measures to reduce Cryptosporidium contamination of water
sources will be site-specific. Consequently, the proposed watershed
control program credit centers on systems working with stakeholders in
the watershed to develop a site-specific program, and State review and
approval are required. In the Toolbox Guidance Manual (USEPA 2003f),
available in draft in the docket for today's proposal, EPA provides
information on management practices that systems may consider in
developing their watershed control programs.
Initial State approval of a system's watershed control program will
be based on State review of the system's proposed watershed control
plan and supporting documentation. The initial approval can be valid
until the system completes the second round of Cryptosporidium
monitoring described in section IV.A (systems begin a second round of
monitoring six years after the initial bin assignment). During this
period, the system is responsible for implementing the approved plan
and complying with other general requirements, such as an annual
watershed survey and program status report. These requirements are
further described later in this section.
The period during which State approval of a watershed control
program is in effect is referred to as the approval period. Systems
that want to continue their eligibility to receive the 0.5 log
Cryptosporidium treatment credit must reapply for State approval of the
program for each subsequent approval period. In general, the re-
approval will be based on the State's review of the system's
reapplication package, as well as the annual status reports and
watershed surveys. Subsequent approval(s) by the State of the watershed
control program typically will be for a time equivalent to the first
approval period, but States have the discretion to renew approval for a
longer or shorter time period.
Requirements for Initial State Approval of Watershed Control Programs
Systems that intend to pursue a 0.5 log Cryptosporidium treatment
credit for a watershed control program are required to notify the State
within one year following initial bin assignment that the system
proposes to develop a watershed control plan and submit it for State
approval.
The application to the State for initial program approval must
include the following minimum elements:
? An analysis of the vulnerability of each source to
Cryptosporidium. The vulnerability analysis must address the watershed
upstream of the drinking water intake, including: A characterization of
the watershed hydrology, identification of an ``area of influence''
(the area to be considered in future watershed surveys) outside of
which there is no significant probability of Cryptosporidium or fecal
contamination affecting the drinking water intake, identification of
both potential and actual sources of Cryptosporidium contamination, the
relative impact of the sources of Cryptosporidium contamination on the
system's source water quality, and an estimate of the seasonal
variability of such contamination.
? An analysis of control measures that could address the
sources of Cryptosporidium contamination identified during the
vulnerability analysis. The analysis of control measures must address
their relative effectiveness in reducing Cryptosporidium loading to the
source water and their sustainability.
? A plan that specifies goals and defines and prioritizes
specific actions to reduce source water Cryptosporidium levels. The
plan must explain how actions are expected to contribute to specified
goals, identify partners and their role(s), present resource
requirements and commitments including personnel, and include a
schedule for plan implementation.
The proposed watershed control plan and a request for program
approval and 0.5 log Cryptosporidium treatment credit must be submitted
by the system to the State no later than 24 months following initial
bin assignment.
The State will review the system's initial proposed watershed
control plan and either approve, reject, or ``conditionally approve''
the plan. If the plan is approved, or if the system agrees to implement
the State's conditions for approval, the system will be
awarded 0.5 log credit towards LT2ESWTR Cryptosporidium treatment
requirements. A final decision on approval must be made no later than
three years following the system's initial bin assignment.
The initial State approval of the system's watershed control
program can be valid until the system completes the required second
round of Cryptosporidium monitoring. The system is responsible for
taking the required steps, described as follows, to maintain State
program approval and the 0.5 log credit during the approval period.
Requirements for Maintaining State Approval of Watershed Control
Programs
Systems that have obtained State approval of their watershed
control program are required to meet the following ongoing requirements
within each approval period to continue their eligibility for the 0.5
log Cryptosporidium treatment credit:
? Submit an annual watershed control program status report to
the State during each year of the approval period.
? Conduct an annual State-approved watershed survey and
submit the survey report to the State.
? Submit to the State an application for review and re-
approval of the watershed control program and for a continuation of the
0.5 log treatment credit for a subsequent approval period.
The annual watershed control program status report must describe
the system's implementation of the approved plan and assess the
adequacy of the plan to meet its goals. It must explain how the system
is addressing any shortcomings in plan implementation, including those
previously identified by the State or as the result of the watershed
survey. If substantial changes to the approved watershed control
program become necessary during implementation, the system must notify
the State and provide a rationale prior to making any such changes. If
any change is likely to reduce the level of
source water protection, the system must also include the actions it
will take to mitigate the effects in its notification.
The watershed survey must be conducted according to State
guidelines and by persons approved by the State to conduct watershed
surveys. The survey must encompass the area of the watershed that was
identified in the State-approved watershed control plan as the area of
influence and, as a minimum, assess the priority activities identified
in the plan and identify any significant new sources of
Cryptosporidium.
The application to the State for review and re-approval of the
system's watershed control program must be provided to the State at
least six months before the current approval period expires or by a
date previously determined by the State. The request must include a
summary of activities and issues identified during the previous
approval period and a revised
[[Page 47684]]
plan that addresses activities for the next approval period, including
any new actual or potential sources of Cryptosporidium contamination
and details of any proposed or expected changes from the existing
State-approved program. The plan must address goals, prioritize
specific actions to reduce source water Cryptosporidium, explain how
actions are expected to contribute to achieving goals, identify
partners and their role(s), resource requirements and commitments, and
the schedule for plan implementation.
The annual program status reports, watershed control plan and
annual watershed sanitary surveys must be made available to the public
upon request. These documents must be in a plain language format and
include criteria by which to evaluate the success of the program in
achieving plan goals. If approved by the State, the system may withhold
portions of the annual status report, watershed control plan, and
watershed sanitary survey based on security considerations.
b. How was this proposal developed? The M-DBP Advisory Committee
recommended that systems be awarded 0.5 log Cryptosporidium treatment
credit for implementing a watershed control program. This
recommendation was based on the Committee's recognition that some
systems will be able to reduce the level of Cryptosporidium in their
source water by implementing a well-designed and focused watershed
control program. Moreover, the control measures used in the watershed
to reduce levels of Cryptosporidium are likely to reduce concentrations
of other pathogens as well.
EPA concurs that well designed watershed control programs that
focus on reducing levels of Cryptosporidium contamination of water
sources should be encouraged, and that implementation of such programs
will likely reduce overall microbial risk. A broad reduction in
microbial risk will occur through the application of control measures
and best management practices that are effective in reducing fecal
contamination in the watershed. In addition, plant management practices
may be enhanced by the knowledge systems acquire regarding the
watershed and factors that affect microbial risk, such as sources,
fate, and transport of pathogens.
Given the highly site-specific nature of a watershed control
program, including the feasibility and effectiveness of different
control measures, EPA believes that systems should demonstrate their
eligibility for 0.5 log Cryptosporidium treatment credit by developing
targeted programs that account for site-specific factors. As part of
developing a watershed control program, systems will be required to
assess a number of these factors, including watershed hydrology,
sources of Cryptosporidium in the watershed, human impacts, and fate
and transport of Cryptosporidium. Furthermore, EPA believes that the
State is well positioned to judge whether a system's watershed control
program is likely to achieve a substantial reduction of Cryptosporidium
in source water. Consequently, EPA is proposing that approval of
watershed control programs and allowance for an associated 0.5 log
treatment credit be made by the State on a system-specific basis.
A watershed control program could include measures such as (1) the
elimination, reduction, or treatment of wastewater or storm water
discharges, (2) treatment of Cryptosporidium contamination at the sites
of waste generation or storage, (3) prevention of Cryptosporidium
migration from sources, or (4) any other measures that are effective,
sustainable, and likely to reduce Cryptosporidium contamination of
source water. EPA recognizes that many public water systems do not
directly control the watersheds of their sources of supply. EPA expects
that systems will need to develop and maintain partnerships with
landowners within watersheds, as well as with State governments and
regional agencies that have authority over activities in the watershed
that may contribute Cryptosporidium to the water supply. Stakeholders
that have some level of control over activities that could contribute
to Cryptosporidium contamination include municipal government and
private operators of wastewater treatment plants, livestock farmers and
persons who spread manure, individuals with failing septic systems,
logging operations, and other government and commercial organizations.
EPA has initiated a number of programs that address watershed
management and source water protection. In 2002, EPA launched the
Watershed Initiative (67 FR 36172, May 23, 2002) (USEPA 2002b), which
will provide grants to support innovative watershed based approaches to
preventing, reducing, and eliminating water pollution. In addition, EPA
has recently promulgated new regulations for Concentrated Animal
Feeding Operations (CAFOs), which through the NPDES permit process will
limit discharges that contribute microbial pathogens to watersheds.
SDWA section 1453 requires States to carry out a source water
quality assessment program for the protection and benefit of public
water systems. EPA issued program guidance in August of 1997, and
expects that most States will complete their source water assessments
of surface water systems by the end of 2003. These assessments will
establish a foundation for watershed vulnerability analyses by
providing the preliminary analyses of watershed hydrology, a starting
point for defining the area of influence, and an inventory and
hierarchy of actual and potential contamination sources. In some cases,
these portions of the source water assessment may fully satisfy those
analytical needs.
As noted earlier, EPA has published and is continuing to develop
guidance material that addresses contamination by Cryptosporidium and
other pathogens from both non-point sources (e.g., agricultural and
urban runoff, septic tanks) and point sources (e.g., sewer overflows,
POTWs, CAFOs). The Toolbox Guidance Manual, available in draft with
today's proposal, includes a list of programmatic resources and
guidance available to assist systems in building partnerships and
implementing watershed protection activities. In addition, this
guidance manual incorporates available information on the effectiveness
of different control measures to reduce Cryptosporidium levels and
provides case studies of watershed control programs. This guidance is
intended to assist water systems in developing their watershed control
programs and States in their assessment and approval of these programs.
In addition to guidance documents, demonstration projects, and
technical resources, EPA provides funding for watershed and source
water protection through the Drinking Water State Revolving Fund
(DWSRF) and Clean Water State Revolving Fund (CWSRF). Under the DWSRF
program, States may provide funding directly to public water systems
for source water protection, including watershed management and
pathogen source reduction plans. CWSRF funds have been used to develop
and implement agricultural best management practices for reducing
pathogen loading to receiving waters and to fund directly, or provide
incentives for, the replacement of failing septic systems. EPA
encourages the use of CWSRF for source protection and has developed
guidelines for the award of funds to address non-point sources of
pollution (CWA section 319 Non Point Source Pollution Program).
Further, the Agency is promoting the broader use of
[[Page 47685]]
SRF funds to implement measures to prevent and control non-point source
pollution. Detailed sanitary surveys, with a specific analysis of
sources of Cryptosporidium in the watershed, will facilitate the
process of targeting funding available under SRF programs to eliminate
or mitigate these sources.
c. Request for comment. EPA requests comment on the proposed
watershed control program credit and associated program components.
? Should the State be allowed to reduce the frequency of the
annual watershed survey requirement for certain systems if systems
engage in alternative activities like public outreach?
? The effectiveness of a watershed control program may be
difficult to assess because of uncertainty in the efficacy of control
measures under site-specific conditions. In order to provide
constructive guidance, EPA welcomes reports on scientific case studies
and research that evaluated methods for reducing Cryptosporidium
contamination of source waters.
? Are there confidential business information (CBI) concerns
associated with making information on the watershed control program
available to the public? If so, what are these concerns and how should
they be addressed?
? How should the ``area of influence'' (the area to be
considered in future watershed surveys) be delineated, considering the
persistence of Cryptosporidium?
3. Alternative Source
a. What is EPA proposing today? Plant intake refers to the works or
structures at the head of a conduit through which water is diverted
from a source (e.g., river or lake) into the treatment plant. Plants
may be able to reduce influent Cryptosporidium levels by changing the
intake placement (either within the same source or to an alternate
source) or managing the timing or level of withdrawal.
Because the effect of changing the location or operation of a plant
intake on influent Cryptosporidium levels will be site specific, EPA is
not proposing any presumptive credit for this option. Rather, if a
system is concerned that Cryptosporidium levels associated with the
current plant intake location and/or operation will result in a bin
assignment requiring additional treatment under the LT2ESWTR, the
system may conduct concurrent Cryptosporidium monitoring reflecting a
different intake location or different intake management strategy. The
State will then make a determination as to whether the plant may be
classified in an LT2ESWTR bin using the alternative intake location or
management monitoring results.
Thus, systems that intend to be classified in an LT2ESWTR bin based
on a different intake location or management strategy must conduct
concurrent Cryptosporidium monitoring. The system is still required to
monitor its current plant intake in addition to any alternative intake
location/management monitoring, and must submit the results of all
monitoring to the State. In addition, the system must provide the State
with supporting information documenting the conditions under which the
alternative intake location/management samples were collected. The
concurrent monitoring must conform to the sample frequency, sample
volume, analytical method, and other requirements that apply to the
system for Cryptosporidium monitoring as stated in Section IV.A.1.
If a plant's LT2ESWTR bin classification is based on monitoring
results reflecting a different intake location or management strategy,
the system must relocate the intake or implement the intake management
strategy within the compliance time frame for the LT2ESWTR, as
specified in section IV.F.
b. How was this proposal developed? In the Stage 2 M-DBP Agreement
in Principle, the Advisory Committee identified several actions related
to the intake which potentially could reduce the concentration of
Cryptosporidium entering a treatment plant. These actions were included
in the microbial toolbox under the heading Alternative Source, and
include: (1) Intake relocation, (2) change to alternative source of
supply, (3) management of intake to reduce capture of oocysts in source
water, (4) managing timing of withdrawal, and (5) managing level of
withdrawal in water column.
It is difficult to predict in advance the efficacy of any of these
activities in reducing levels of Cryptosporidium entering the treatment
plant. However, if a system relocates the plant intake or implements a
different intake management strategy, it is appropriate for the plant
to be assigned to an LT2ESWTR bin using monitoring results reflecting
the new intake strategy.
EPA believes that the requirements specified for monitoring to
determine bin placement are necessary to characterize a plant's mean
source water Cryptosporidium level. Consequently, any concurrent
monitoring carried out to characterize a different intake location or
management strategy should be equivalent. For this reason, the sampling
and analysis requirements which apply to the current intake monitoring
also apply to any concurrent monitoring used to characterize a new
intake location or management strategy.
EPA also recognizes that if a plant's bin assignment is based on a
new intake operation strategy, then it is important for the plant to
continue to use this new strategy in routine operation. Therefore, EPA
is proposing that the system document the new intake operation strategy
when submitting additional monitoring results to the State and that the
State approve that new strategy.
c. Request for comment. EPA requests comment on the following
issues:
? What are intake management strategies by which systems
could reduce levels of Cryptosporidium in the plant influent?
? Can representative Cryptosporidium monitoring to
demonstrate a reduction in oocyst levels be accomplished prior to
implementation of a new intake strategy (e.g., monitoring a new source
prior to constructing a new intake structure)?
? How should this option be applied to plants that use
multiple sources which enter a plant through a common conduit, or which
use separate sources which enter the plant at different points?
4. Off-Stream Raw Water Storage
a. What is EPA proposing today? Off-stream raw water storage
reservoirs are basins located between a water source (typically a
river) and the coagulation and filtration processes in a treatment
plant. EPA is not proposing presumptive treatment credit for
Cryptosporidium removal through off-stream raw water storage. Systems
using off-stream raw water storage must conduct Cryptosporidium
monitoring after the reservoir for the purpose of determining LT2ESWTR
bin placement. This will allow reductions in Cryptosporidium levels
that occur through settling during off-stream storage to be reflected
in the monitoring results and consequent LT2ESWTR bin assignment.
The use of off-stream raw water storage reservoirs during LT2ESWTR
monitoring must be consistent with routine plant operation and must be
recorded by the system. Guidance on monitoring locations is provided in
Public Water System Guidance Manual for Source Water Monitoring under
the LT2ESWTR (USEPA 2003g), which is available in draft in the docket
for today's proposal.
b. How was this proposal developed? The Stage 2 M-DBP Agreement in
Principle recommends a 0.5 log credit for off-stream raw water storage
[[Page 47686]]
reservoirs with detention times on the order of days and 1.0 log credit
for reservoirs with detention times on the order of weeks. After a
review of the available literature, EPA is unable to determine criteria
that provide reasonable assurance of achieving a 0.5 or 1 log removal
of oocysts. Consequently, EPA is not proposing a presumptive treatment
credit for this process.
This proposal for off-stream raw water storage represents a change
from the November 2001 pre-proposal draft of the LT2ESWTR (USEPA
2001g), which described 0.5 log and 1 log presumptive credits for
reservoirs with hydraulic detention times of 21 and 60 days,
respectively. These criteria were based on a preliminary assessment of
reported studies, described later in this section, that evaluated
Cryptosporidium and Giardia removal in raw water storage reservoirs.
Subsequent to the November 2001 pre-proposal draft, the Science
Advisory Board (SAB) reviewed the data that EPA had acquired to support
Cryptosporidium treatment credits for off-stream raw water storage (see
section VII.K). In written comments from a December 2001 meeting of the
SAB Drinking Water Committee, the panel concluded that the available
data were not adequate to demonstrate the treatment credits for off-
stream raw water storage described in the pre-proposal draft, and
recommended that no presumptive credits be given for this toolbox
option. The panel did agree, though, that a utility should be able to
take advantage of off-stream raw water storage by sampling after the
reservoir for appropriate bin placement. EPA concurs with this finding
by the SAB and today's proposal is consistent with their
recommendation.
Off-stream raw water storage can improve the microbial quality of
water in a number of ways. These include (1) reduced microbial and
particulate loading to the plant due to settling in the reservoir, (2)
reduced viability of pathogens due to die-off, and (3) dampening of
water quality and hydraulic spikes. EPA has evaluated a number of
studies that investigated the removal of Cryptosporidium and other
microorganisms and particles in raw water storage basins. These studies
are summarized in the following paragraphs, and selected results are
presented in Table IV-8.
Table IV-8.--Studies of Cryptosporidium and Giardia Removal From Off-Stream Raw Water Storage
----------------------------------------------------------------------------------------------------------------
       Researcher                    Reservoir                    Residence time               Log reductions
----------------------------------------------------------------------------------------------------------------
Ketelaars et al. 1995.....  Biesbosch reservoir system:   24 weeks (average)...........  Cryptosporidium-1.4;
                             man-made pumped storage                                      Giardia-2.3.
                             (Netherlands).
Van Breemen et al. 1998...  Biesbosch reservoir system:   24 weeks (average)...........  Cryptosporidium-2.0;
                             man-made pumped storage                                      Giardia-2.6.
                             (Netherlands).
                            PWN (Netherlands)..........   10 weeks (average)...........  Cryptosporidium-1.3;
                                                                                         Giardia-0.8.
Bertolucci et al. 1998....  Abandoned gravel quarry       18 days (theoretical)........  Cryptosporidium-1.0;
                             used for storage (Italy).                                    Giardia-0.5.
Ongerth, 1989.............  Three impoundments on         40, 100 and 200 days           No Giardia removal
                             rivers with limited public    (respectively).                observed.
                             access (Seattle, WA).
----------------------------------------------------------------------------------------------------------------
Ketelaars et al. (1995) evaluated Cryptosporidium and Giardia
removal across a series of three man-made pumped reservoirs, named the
Biesbosch reservoirs, with reported hydraulic retention times of 11, 9,
and 4 weeks (combined retention time of 24 weeks). To prevent algal
growth and hypolimnetic deoxygenation, the reservoirs were destratified
by air-injection. Based on weekly sampling over one year, mean influent
and effluent concentrations of Cryptosporidium were 0.10 and 0.004
oocysts/100 L, respectively, indicating an average removal across the
three reservoirs of 1.4 log. Mean removal of Giardia was 2.3 log.
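The log reductions reported throughout this section follow from the base-10 logarithm of the ratio of mean influent to mean effluent concentration. As an illustration only (the function name below is hypothetical, not part of the rule or the cited studies), the Ketelaars et al. figure can be reproduced with a short calculation:

```python
import math

def log_removal(influent, effluent):
    # Log10 reduction implied by mean influent and effluent concentrations
    # (both in the same units, e.g., oocysts/100 L).
    return math.log10(influent / effluent)

# Ketelaars et al. (1995): mean Cryptosporidium of 0.10 oocysts/100 L
# entering and 0.004 oocysts/100 L leaving the three Biesbosch reservoirs.
print(round(log_removal(0.10, 0.004), 1))  # 1.4
```

The same arithmetic reproduces the other results in this section, e.g., log10(6.3/0.064) for the Van Breemen et al. (1998) Cryptosporidium data yields approximately 2.0 log.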
Van Breemen et al. (1998) continued the efforts of Ketelaars et al.
(1995) in evaluating pathogen removal across the Biesbosch reservoir
system. Using a more sensitive analytical method, Van Breemen et al.
measured mean Cryptosporidium levels of 6.3 and 0.064 oocysts/100 L at
the inlet and outlet, respectively, indicating an average removal of
2.0 log. For Giardia, the average reduction was 2.6 log. In addition,
Van Breemen et al. (1998) evaluated removal of Cryptosporidium,
Giardia, and other microorganisms in a reservoir designated PWN, which
had a hydraulic retention time of 10 weeks. Passage through this
storage reservoir was reported to reduce the mean concentration of
Cryptosporidium by 1.3 log and of Giardia by 0.8 log.
Bertolucci et al. (1998) investigated removal of Cryptosporidium,
Giardia, and nematodes in a reservoir derived from an abandoned gravel
quarry with a detention time reported as around 18 days. Over a 2 year
period, average influent and effluent concentrations of Cryptosporidium
were 70 and 7 oocysts/100 L, respectively, demonstrating a mean
reduction of 1 log. Average Giardia levels decreased from 137 cysts/
100L in the inlet to 46 cysts/100L at the outlet, resulting in a mean
0.5 log removal.
Ongerth (1989) studied concentrations of Giardia cysts in the Tolt,
Cedar, and Green rivers, which drain the western slope of the Cascade
Mountains in Washington. The watersheds of each river are controlled by
municipal water departments for public water supply, and public access
is limited. The Cedar, Green, and Tolt rivers each have impoundments
with reported residence times of 100, 30-50, and 200 days,
respectively, in the reach studied. Ongerth found no statistically
significant difference in cyst concentrations above and below any of
the reservoirs. Median cyst concentrations above and below the Cedar,
Green, and Tolt reservoirs were reported as 0.12 and 0.22, 0.27 and
0.32, and 0.16 and 0.21 cysts/L, respectively. It is unclear why no
decrease in cyst levels was observed. It is possible that contamination
of the water in the impoundments by Giardia from animal sources, either
directly or through run-off, may have occurred.
EPA has also considered results from studies which evaluated the
rate at which Cryptosporidium oocysts lose viability and infectivity
over time. Two studies are summarized next, with selected results
presented in Table IV-9.
[[Page 47687]]
Table IV-9.--Studies of Cryptosporidium Die-Off During Raw Water Storage
------------------------------------------------------------------------
     Researcher          Type of experiment           Log reduction
------------------------------------------------------------------------
Medema et al. 1997..  River water was inoculated  0.5 log reduction over
                       with Cryptosporidium and    50 days at 5 [deg]C;
                       bacteria and incubated.     0.5 log reduction
                                                   over 20-80 days at 15
                                                   [deg]C.
Sattar et al. 1999..  Synthetic hard water and    In vitro conditions
                       natural water from          showed 0.7 to 2.0 log
                       several rivers inoculated   reduction over 30
                       with Giardia and            days at 20 [deg]C;
                       Cryptosporidium.            little reduction at 4
                                                   [deg]C. In situ
                                                   conditions showed 0.4
                                                   to 1.5 log reduction
                                                   at 21 days.
------------------------------------------------------------------------
Medema et al. (1997) conducted bench scale studies of the influence
of temperature and the presence of biological activity on the die-off
rate of Cryptosporidium oocysts. Die-off rates were determined at
5[deg]C and 15[deg]C, and in both natural and sterilized (autoclaved)
river water. Both excystation and vital dye staining were used to
determine oocyst viability. At 5[deg]C, the die-off rate under all
conditions was 0.010 log10/day, assuming first-order
kinetics. This translates to 0.5 log reduction at 50 days. At 15[deg]C,
the die-off rate in natural river water approximately doubled to 0.024
log10/day (excystation) and 0.018 log10/day (dye
staining). However, in autoclaved water at 15[deg]C, the die-off rate
was only 0.006 log10/day (excystation) and 0.011
log10/day (dye staining). These results suggest that oocyst
die-off is more rapid at higher temperatures in natural water, and this
behavior may be caused by increased biological or biochemical activity.
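Under the first-order kinetics assumed by Medema et al., the cumulative log reduction over a storage period is simply the die-off rate multiplied by the storage time. A minimal sketch of that relationship (the function name is illustrative, not from the study):

```python
def die_off_log_reduction(rate_log10_per_day, days):
    # First-order die-off: cumulative log10 reduction is rate x time.
    return rate_log10_per_day * days

# Medema et al. (1997): 0.010 log10/day at 5 [deg]C over 50 days of storage.
print(die_off_log_reduction(0.010, 50))  # 0.5
```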
Sattar et al. (1999) evaluated factors impacting Cryptosporidium
and Giardia survival. Microtubes containing untreated water from the
Grand and St. Lawrence rivers (Ontario) were inoculated with purified
oocysts and cysts. Samples were incubated at temperatures ranging from
4[deg]C to 30[deg]C, and viability of oocysts and cysts was measured
by excystation. At 20[deg]C and 30[deg]C, reductions in viable
Cryptosporidium oocysts ranged from approximately 0.6 to 2.0 log after
30 days. However, relatively little inactivation took place when
oocysts were incubated at 4[deg]C (as low as 0.2 log at 100 days).
To evaluate oocyst survival under dynamic environmental conditions,
Sattar et al. seeded dialysis cassettes with Cryptosporidium oocysts
and placed them in overflow tanks receiving water from different rivers
in Canada and the United States. Reductions in the concentration of
viable oocysts ranged from approximately 0.4 to 1.5 log after 21 days.
Survival of oocysts was enhanced by pre-filtering the water, suggesting
that microbial antagonism was involved in the natural inactivation of
the parasites.
Overall these studies indicate that off-stream storage of raw water
has the potential to effect significant reductions in the concentration
of viable Cryptosporidium oocysts, both through sedimentation and
degradation of oocysts (i.e., die-off). However, these data also
illustrate the challenge in reliably estimating the amount of removal
that will occur in any particular storage reservoir. Removal and die-
off rates reported in these studies varied widely, and were observed to
be influenced by factors like temperature, contamination, hydraulic
short circuiting, and biological activity (Van Breemen et al. 1998,
Medema et al. 1997, Sattar et al. 1999). Because of this variability
and the relatively small amount of available data, it is difficult to
extrapolate from these studies to develop nationally applicable
criteria for awarding removal credits to raw water storage.
c. Request for comment. EPA requests comment on the finding that
the available data are not adequate to support a presumptive
Cryptosporidium treatment credit for off-stream raw water storage, and
that systems using off-stream storage should conduct LT2ESWTR
monitoring at the reservoir outlet. This monitoring approach would
account for reductions in oocyst concentrations due to settling, but
would not provide credit for die-off, since non-viable oocysts could
still be counted during monitoring. In addition, EPA would also
appreciate comment on the following specific issues:
? Is additional information available that either supports or
suggests modifications to this proposal concerning off-stream raw water
storage?
? How should a system address the concern that water in off-
stream raw water storage reservoirs may become contaminated through
processes like algal growth, run-off, roosting birds, and activities on
the watershed?
5. Pre-Sedimentation With Coagulant
a. What is EPA proposing today? Presedimentation is a preliminary
treatment process used to remove particulate material from the source
water before the water enters primary sedimentation and filtration
processes in a treatment plant. EPA is proposing to award a presumptive
0.5 log Cryptosporidium treatment credit for presedimentation that is
installed after LT2ESWTR monitoring and meets the following three
criteria:
(1) The presedimentation basin must be in continuous operation and
must treat all of the flow reaching the treatment plant.
(2) The system must continuously add a coagulant to the
presedimentation basin.
(3) The system must demonstrate on a monthly basis at least 0.5 log
reduction of influent turbidity through the presedimentation process in
at least 11 of the 12 previous consecutive months. This monthly
demonstration of turbidity reduction must be based on the arithmetic
mean of at least daily turbidity measurements in the presedimentation
basin influent and effluent, and must be calculated as follows:
Monthly mean turbidity log reduction = log10(monthly mean of daily
influent turbidity) - log10(monthly mean of daily effluent turbidity).
If the presedimentation process has been in operation for fewer than
12 months, the system must verify on a monthly basis at least a 0.5 log
reduction of influent turbidity through the presedimentation process,
calculated as specified in this paragraph, in all but at most one of
the months of operation.
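The monthly compliance check in criterion (3), together with the provision for basins operating fewer than 12 months, can be sketched as follows. This is an illustrative calculation only; the function names and data layout are our own and are not part of the proposed rule.

```python
import math

def monthly_log_reduction(influent_ntu, effluent_ntu):
    """Monthly mean turbidity log reduction, computed from at-least-daily
    turbidity measurements (in NTU) at the presedimentation basin
    influent and effluent, per the formula in the proposed rule."""
    mean_in = sum(influent_ntu) / len(influent_ntu)
    mean_out = sum(effluent_ntu) / len(effluent_ntu)
    return math.log10(mean_in) - math.log10(mean_out)

def meets_presed_credit(monthly_reductions, min_log=0.5):
    """Check the 11-of-12-previous-consecutive-months criterion, or the
    all-but-one-month provision for a basin that has operated for fewer
    than 12 months."""
    months = len(monthly_reductions)
    passing = sum(1 for r in monthly_reductions if r >= min_log)
    required = 11 if months >= 12 else months - 1
    return passing >= required
```

For example, a month in which influent turbidity averaged 10 NTU and effluent turbidity averaged 1 NTU would show a 1.0 log reduction, and a system meeting the 0.5 log threshold in 11 of the previous 12 months would satisfy criterion (3).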
Systems with presedimentation in place at the time they begin
LT2ESWTR Cryptosporidium monitoring are not eligible for the 0.5 log
presumptive credit; instead, when the basin is in use, these systems
must sample after it for the purpose of determining their bin
assignment. The use of
presedimentation during LT2ESWTR monitoring must be consistent with
routine plant operation and must be recorded by the system. Guidance on
monitoring is provided in Public Water System Guidance Manual for
Source Water Monitoring under the LT2ESWTR (USEPA 2003g), which is
available in draft in the docket for today's proposal.
b. How was this proposal developed? Presedimentation is used to
remove gravel, sand, and other gritty material
[[Page 47688]]
from the raw water and dampen particle loading to the rest of the
treatment plant. Presedimentation is similar to conventional
sedimentation, except that presedimentation may be operated at higher
loading rates and may not involve use of chemical coagulants. Also,
some systems operate the presedimentation process periodically and only
in response to periods of high particle loading.
Because presedimentation reduces particle concentrations, it is
expected to reduce Cryptosporidium levels. In addition, by dampening
variability in source water quality, presedimentation may improve the
performance of subsequent treatment processes. In general, the efficacy
of presedimentation in lowering particle levels is influenced by a
number of water quality and treatment parameters including surface
loading rate, temperature, particle concentration, coagulation, and
characteristics of the sedimentation basin.
The Stage 2 M-DBP Agreement in Principle recommends 0.5 log
presumptive Cryptosporidium treatment credit for presedimentation with
the use of coagulant. Today's proposal is consistent with this
recommendation. However, the proposed requirement for demonstrated
turbidity reduction as a condition for presedimentation credit
represents a change from the November 2001 pre-proposal draft of the
LT2ESWTR (USEPA 2001g). Rather than a requirement for turbidity
removal, the 2001 pre-proposal draft included criteria for maximum
overflow rate and minimum influent turbidity as conditions for the 0.5
log presedimentation credit.
The Science Advisory Board (SAB) reviewed the criteria and
supporting information for presedimentation credit in the November 2001
pre-proposal draft (see section VII.K). In written comments from a
December 2001 meeting of the SAB Drinking Water Committee, the panel
concluded that the available data were too limited to support a 0.5 log
presumptive credit and recommended that no credit be given for
presedimentation. The panel further stated that performance criteria
other than overflow rate would need to be included if credit is to be
given for presedimentation.
In response to this finding by the SAB, EPA further reviewed data on
removal of aerobic spores (as an indicator of Cryptosporidium removal)
and turbidity in full-scale presedimentation basins. As shown later in
this section, these data indicate that presedimentation basins
achieving a monthly mean reduction in turbidity of at least 0.5 log
have a high likelihood of reducing mean Cryptosporidium levels by 0.5
log or more. Consequently, EPA has determined that it is appropriate to
use turbidity reduction as a performance criterion for awarding
Cryptosporidium treatment credit to presedimentation basins. The Agency
believes this performance criterion addresses the concerns raised by
the SAB.
The Agency has concluded that it is appropriate to limit
eligibility for the 0.5 log presumptive Cryptosporidium treatment
credit to systems that install presedimentation after LT2ESWTR
monitoring. Systems with presedimentation in place prior to initiation
of LT2ESWTR Cryptosporidium monitoring may sample after the
presedimentation basin to determine their bin assignment. In this case,
the effect of presedimentation in reducing Cryptosporidium levels will
be reflected in the monitoring results and bin assignment. Systems that
monitor after presedimentation are not subject to the operational and
performance requirements associated with the 0.5 log credit. The SAB
agreed that a system should be able to sample after the
presedimentation treatment process for appropriate bin placement.
In considering criteria for awarding Cryptosporidium removal credit
to presedimentation, EPA has evaluated both published studies and data
submitted by water systems using presedimentation. There is relatively
little published data on the removal of Cryptosporidium by
presedimentation. Consequently, EPA has reviewed studies that
investigated Cryptosporidium removal by conventional sedimentation
basins. These studies are informative regarding potential levels of
performance, the influence of water quality parameters, and correlation
of Cryptosporidium removal with removal of potential surrogates.
However, removal efficiency in conventional sedimentation basins may be
greater than in presedimentation due to lower surface loading rates,
higher coagulant doses, and other factors. To supplement these studies,
EPA has evaluated data provided by utilities on removal of other types
of particles, primarily aerobic spores, in the presedimentation
processes of full scale plants. Data indicate that aerobic spores may
serve as a surrogate for Cryptosporidium removal by sedimentation
(Dugan et al. 2001).
i. Published studies of Cryptosporidium removal by conventional
sedimentation basins. Table IV-10 summarizes results from published
studies of Cryptosporidium removal by conventional sedimentation
basins.
Table IV-10.--Summary of Published Studies of Cryptosporidium Removal by Conventional Sedimentation Basins
----------------------------------------------------------------------------------------------------------------
Author(s)                    Plant/process type                               Cryptosporidium removal by sedimentation
----------------------------------------------------------------------------------------------------------------
Dugan et al. (2001)          Pilot scale conventional                         0.6 to 1.6 log (average 1.3 log).
States et al. (1997)         Full scale conventional with primary and         0.41 log.
                             secondary sedimentation
Edzwald and Kelly (1998)     Bench scale sedimentation                        0.8 to 1.2 log.
Payment and Franco (1993)    Full scale conventional (2 plants)               3.8 log and 0.7 log.
Kelly et al. (1995)          Full scale conventional (two stage lime          0.8 log.
                             softening)
                             Full scale conventional (two stage               0.5 log.
                             sedimentation)
Patania et al. (1995)        Pilot scale conventional (3 plants)              2.0 log (median).
----------------------------------------------------------------------------------------------------------------
Dugan et al. (2001) evaluated the ability of conventional treatment
to control Cryptosporidium under different water quality and treatment
conditions on a small pilot scale plant that had been demonstrated to
provide equivalent performance to a larger plant. Under optimal
coagulation conditions, oocyst removal across the sedimentation basin
ranged from 0.6 to 1.6 log, averaging 1.3 log. Suboptimal coagulation
conditions (underdosed relative to jar test predictions) significantly
reduced plant performance with oocyst removal in the
[[Continued on page 47689]]