The prospect of assessing human health risks from exposure to chemical
mixtures looms as a nightmare for many scientists, especially toxicologists
charged with coming up with the necessary basic data. Indeed, exposure at
a variety of levels to large numbers of chemical compounds, either concurrently
or sequentially via multiple pathways, is the environmental reality for
just about everyone on the planet.
Mixtures of chemicals are ubiquitous in ground and surface water, in
our air, food, and drinking water, as well as in soil surrounding leaking
toxic waste disposal sites. Examples of environmentally prevalent chemical
mixtures are cigarette smoke, diesel and automobile exhaust, disinfection
by-products from chlorination, and dioxin and dioxinlike compounds formed
as by-products of incomplete combustion of hospital and municipal waste.
Despite this potential for exposure to environmental mixtures, the vast
majority of established exposure standards are for single compounds. Moreover,
the vast majority of toxicology studies examine the cancer and noncancer
effects of single chemicals. Currently, more than 95% of the resources in
toxicology are devoted to single-chemical studies. "For most chemical
mixtures and multiple chemical exposures, adequate data on exposure and
toxicity are lacking," says toxicologist Victor J. Feron, senior scientist
with TNO Nutrition and Food Research Institute, in the Netherlands.
Harold Zenick: "Complicating the issue of chemical mixtures is the question of multiple mechanisms."
A number of factors may account for this data shortfall. Harold Zenick,
deputy director of EPA's Health Effects Research Laboratory (HERL), points
to the sheer difficulty of the problem. "There are those who argue that [the question
of chemical mixtures] is too difficult a topic to be undertaken in a research
venue. This is based on the belief that it's difficult enough for us to
address all the uncertainties associated with single chemical risk assessment."
Mixtures research, Feron says, adds a further layer of complexity to risk
assessment because of the potential for multiple mechanisms functioning
simultaneously. This, he explains, complicates the ability to extrapolate
to other dose levels and exposure scenarios, to other mixtures, and to other
species. "And given that you may have either multiple chemical exposures
via single or multiple pathways, you may also be eliciting multiple effects.
And what is less apparent is whether those effects are independent or interactive."
Choosing the Approach
Basic to the study of chemical mixtures is the issue of what approach
to take. A bottom-up approach is typically aimed at identifying mechanistic
interactions of simple mixtures to predict their effects, while top-down
studies examine the effects of complex mixtures to determine the underlying
mechanisms. Strict adherents to the bottom-up approach may be criticized
for lacking "real world" relevance, since actual environmental exposures
are to complex mixtures. Critics of the top-down route counter that
scientists could keep testing mixtures and their components forever.
Jane Ellen Simmons: "Development of mechanistic models is a realistic possibility."
"The challenge for toxicology is development of a database necessary for the risk assessment process for chemical mixtures," says HERL toxicologist Jane Ellen Simmons. "However, toxicity assessment by itself is not a feasible approach. There are quite simply too many mixtures and multiple chemical exposures for us to realistically think we can assess the toxicity
of each of them." Simmons, a HERL team leader for chemical mixtures and interactions, describes how quickly an experimental study of just three chemicals at five different dose levels (including a zero dose) can become very cumbersome and problematic. It would require 125 treatment groups, she explains, and at 10 animals per group would require 1,250 animals. Such a study would only be able to look at toxicity at a one time point and at only one dosing regimen relevant to that exposure. "Given there are too many mixtures for toxicity assessment to be a reasonable or viable approach, a realistic possibility is the development of mechanistic models," Simmons says. "The evaluation of mechanisms can be incorporated into both top-down and bottom-up approaches," she adds. Thus, according
to Simmons, understanding the mechanism of action based on simple mixtures
should lead to improvement in assessing risks of complex mixtures. Likewise,
top-down toxicological evaluation of complex mixtures provides not only
valuable and similar information for other mixtures, but also a context
to interpret mechanistic understandings based on simple mixtures.
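Simmons's arithmetic is easy to verify. The short Python sketch below simply
reproduces the combinatorics she describes (three chemicals, five dose levels
including a zero dose, ten animals per group); the numbers are her example,
not a recommendation for study design.

    # Illustrative sketch of how fast a full-factorial mixture study grows.
    # Values follow Simmons's example; they are not design recommendations.

    def factorial_design_size(n_chemicals, dose_levels, animals_per_group):
        """Return (treatment groups, total animals) for a full-factorial design."""
        groups = dose_levels ** n_chemicals   # every dose combination gets its own group
        return groups, groups * animals_per_group

    groups, animals = factorial_design_size(n_chemicals=3, dose_levels=5,
                                            animals_per_group=10)
    print(groups, animals)                    # 125 treatment groups, 1,250 animals

    # Adding a fourth chemical multiplies the burden by another factor of 5:
    print(factorial_design_size(4, 5, 10))    # (625, 6250)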
Lingua Franca
Inconsistent usage of certain key terms within the toxicology literature
has sometimes made communication of findings on chemical mixtures problematic.
This is especially true in fields such as toxicology and biostatistics where
many synonyms have been employed for certain words or where there has been
lack of agreement over the precise meaning of terms. "Such lack of
communication and frequent anarchy creates a high baseline of confusion
within not only the scientific and regulatory communities but also the
general public," says Edward J. Calabrese of the School of Public Health
of the University
of Massachusetts.
What are we talking about here? The
first step in developing an approach to mixtures research is clarifying
the terms of the debate. (Source: E.J. Calabrese, Multiple Chemical
Interactions, Boca Raton, FL: Lewis Publishers, 1991.)
Calabrese and others now propose three fundamental classes of joint interaction
of chemicals, defined as follows:
* Additivity--the effect of a combination is exactly what is expected.
For example, the combination of one chemical with a toxicity level of 1,
with another compound also having a toxicity of 1 would equal a toxicity
level of 2. This general classification of additivity implies nothing about
how the addition occurs.
* Synergy--a positive interaction such that the response is greater than
expected. Simply put, a combination of two compounds with individual toxicity
levels of 1 might yield a toxicity level of 10, for example.
* Antagonism--a negative interaction such that the response is less than
expected. Here, the mixture of two compounds with a toxicity level of 1
each might give a toxicity level of 1.5.
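Read literally, Calabrese's three classes amount to comparing an observed
combined effect against the effect expected under additivity. The small
Python sketch below illustrates that bookkeeping; the toy toxicity numbers
and the fixed tolerance are hypothetical, and real analyses test departures
from additivity statistically rather than with a simple cutoff.

    # Minimal sketch of the three classes of joint action described above.
    # 'expected' is the effect predicted by simple additivity; 'observed' is
    # what the mixture actually produces. Numbers and tolerance are hypothetical.

    def classify_joint_action(observed, expected, tol=0.05):
        """Label a mixture response relative to the additive expectation."""
        if observed > expected * (1 + tol):
            return "synergy (greater than additive)"
        if observed < expected * (1 - tol):
            return "antagonism (less than additive)"
        return "additivity"

    # The article's toy numbers: two components each with toxicity level 1.
    expected = 1 + 1
    print(classify_joint_action(observed=2.0, expected=expected))   # additivity
    print(classify_joint_action(observed=10.0, expected=expected))  # synergy
    print(classify_joint_action(observed=1.5, expected=expected))   # antagonism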
Whether these suggested definitions will take hold remains to be seen.
Today, throughout the literature and at conferences, a plethora of terms
are used, such as "greater than additive" or "superadditivity,"
"less than additive" or "subadditivity," and "potentiation,"
"augmentation," and "independence." Calabrese wrote
in Multiple Chemical Interactions: "The time has come to seek
the lowest common denominator around which most will agree. In such cases,
simplicity is often the path to greater clarity and scientific sanity."
Additivity by Default
Associated with the three fundamental joint interactions described above
are three possible results in assessing risk of chemical mixtures: overestimation,
correct estimation, and underestimation. How these results might play out
can be understood in terms of current risk assessment practices.
Guidelines of national and international organizations involved in setting
exposure standards typically suggest the use of simple "dose addition"
or "response addition" models for assessing the hazard of a chemical
mixture. To derive a best estimate of risk, EPA guidelines say it is preferable
to have toxicological data on the mixture of concern itself. In the absence of this
information, the initial default is to use data on a similar mixture. However,
such information is rarely available. Risk is then estimated based upon
knowledge of the mixture's known components. In the absence of data to the
contrary, the health risk of any given mixture is estimated by adding the
risks of the individual components. Thus, the additivity default typically
embraced by EPA for risk assessment of chemical mixtures is based on single
chemicals.
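In practice, the two default models mentioned in such guidelines reduce to
simple formulas: dose addition scales each component's exposure by its own
acceptable level and sums the results (a hazard index), while response
addition combines the components' individual risk probabilities as if they
acted independently. The Python sketch below, with invented exposures,
reference doses, and component risks, is only meant to make those defaults
concrete; it is not a reproduction of any agency's worksheet.

    # Hedged sketch of the two default mixture models: dose addition
    # (hazard index) and response addition. All exposures, reference doses,
    # and component risks below are invented solely for illustration.

    def hazard_index(exposures, reference_doses):
        """Dose addition: sum of each component's exposure / acceptable dose.
        A value above 1 flags a potential concern for the mixture."""
        return sum(e / rfd for e, rfd in zip(exposures, reference_doses))

    def response_addition(component_risks):
        """Response addition: combined probability of effect assuming the
        components act independently."""
        p_no_effect = 1.0
        for p in component_risks:
            p_no_effect *= (1.0 - p)
        return 1.0 - p_no_effect

    # Hypothetical three-component mixture
    print(hazard_index(exposures=[0.02, 0.10, 0.01],
                       reference_doses=[0.05, 0.50, 0.10]))   # 0.7 -> below 1
    print(response_addition([1e-5, 2e-5, 5e-6]))              # ~3.5e-5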
Demonstrable additivity of mixture components makes assessing risks much
easier. For example, if two structurally similar chemicals have similar
toxicity (dose-response) characteristics, it is possible to set an exposure
standard for the two as a mixture based on the toxicity of either
component alone. The presence of chemical A is not affecting the toxicity
of chemical B, so there's no concern about possible interactive, or greater
than additive, effects. Exposure to a mixture with effects predictable by
a linear dose-addition model would not present a greater risk than exposure
to its chemical components alone. When there is predictable antagonism between
chemicals, or a less than additive effect, regulation based on the toxicity risks
of single chemicals can still provide adequate protection for exposure to
a combination of those chemicals. For example, the presence of one chemical
may suppress the action of another.
Risk Overestimation
There are potential problems with the additivity approach, however. It
may greatly overestimate the risk when chemicals act by mechanisms for which
additivity assumptions are invalid. For example, according to Feron, essential
nutrients (vitamins, trace elements, essential amino and fatty acids) possess
relatively small margins of safety between the dose people need (the recommended
daily allowance) and the dose that may be toxic. Consuming these chemicals
simultaneously at their recommended daily allowances would be considered
unhealthy when toxicity of the mixture is assessed on the basis of dose
addition. This of course is not a valid conclusion, as people routinely
take multivitamins and other dietary supplements without problems.
Ralph Kodell: "Default assumptions...may overestimate the risk of mixtures."
A similar problem arises when estimates are made of excess cancer risks
posed by exposure to mixtures of chemical carcinogens. Animal bioassays
are frequently used to assess carcinogenicity associated with exposure to
individual chemicals. Statistically derived upper bounds on risk at specific
exposure levels are generally used to characterize low-dose risk.
"In the absence of a formal procedure for calculating upper bounds
for mixtures under the additivity assumption, regulatory agencies have adopted
the common practice of summing upper-bound risk estimates for individual
components," says Ralph L. Kodell, National Center for Toxicological
Research deputy director for biometry and risk assessment. This conservative
approach, which is taken by the FDA on food additives and by the EPA on
hazardous waste-site cleanup, can overstate the true underlying risk associated
with a given mixture, Kodell says.
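Kodell's point can be illustrated with a small Monte Carlo exercise: when
each component's risk estimate carries its own statistical uncertainty, the
sum of the components' individual 95% upper bounds generally exceeds the 95%
upper bound on the summed risk. The Python sketch below uses arbitrary
lognormal uncertainty distributions chosen only to show the direction of the
effect; it is not a regulatory procedure.

    # Monte Carlo sketch: summing per-component upper bounds vs. an upper
    # bound on the summed risk. Distributions and parameters are arbitrary
    # illustrations, not a regulatory method.
    import numpy as np

    rng = np.random.default_rng(0)
    n_draws = 100_000

    # Hypothetical uncertainty in three components' low-dose risk estimates
    # (lognormal: median risk, geometric standard deviation).
    medians = np.array([1e-6, 5e-7, 2e-6])
    gsd = 3.0
    draws = np.array([rng.lognormal(np.log(m), np.log(gsd), n_draws)
                      for m in medians])

    sum_of_upper_bounds = sum(np.quantile(d, 0.95) for d in draws)
    upper_bound_of_sum = np.quantile(draws.sum(axis=0), 0.95)

    print(f"sum of 95% upper bounds:    {sum_of_upper_bounds:.2e}")
    print(f"95% upper bound of the sum: {upper_bound_of_sum:.2e}")
    # The first number exceeds the second: adding per-component upper bounds
    # overstates the combined upper-bound risk when uncertainties are independent.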
Additivity Verified
An assumption of additivity often holds up experimentally and offers
support for a shared underlying mechanism among a mixture's compounds. A
case in point arises from recent findings by HERL and SRI International,
a private research institution, on the combined effects of paired ototoxic
organic solvents on the auditory system of rats. William K. Boyse, HERL
chief of neurophysiological toxicology, explains that the absence of data
on neurotoxicity endpoints for many classes of neurotoxicants was among
the reasons for this investigation. Organic solvents were also chosen because
they are prevalent in hazardous waste sites and because a number of these
compounds cause hearing loss.
The findings of this study were consistent with the EPA default assumption
for noncancer endpoints, says Boyse. "We could not detect any changes
from additivity. No outcome was predictive of superadditive or subadditive
effects. All effects were as predicted by a linear dose-addition model.
The implication is that these ototoxic solvents operate through the same
or similar mechanisms."
Risk Underestimation
Recent studies in rodents seem to underscore the dangers of generally
applying the additivity assumption to risk assessment of chemical mixtures
because the assumption may lead to an underestimation of risk.
Among the studies are several subacute toxicity studies of a combination
of nine chemicals (including aspirin, cadmium chloride, stannous chloride,
formaldehyde, and dichloromethane), all of which are highly relevant to
the general human population in terms of use pattern, dose level, and frequency
of exposure. According to TNO's John Groten, a four-week inhalation study
at the no-adverse-effect level (NAEL) for each of the chemicals revealed
pathological changes in the nose and liver. "This suggests that combined
exposures to compounds even at their NAEL will not necessarily result in
a NAEL for the combination," says Groten. He also points out that,
interestingly, even at one-third the NAEL of the individual chemicals, some
minor adverse effects were found.
"Quantitative risk assessment to a large degree is still based on
assumptions," Kodell says. "There are a lot of critical assumptions
that go into it that have yet to be verified biologically. It remains a
goal to strive toward. Still, I don't think you should wait for all the
information before doing something. That's why EPA and FDA use the best-available
assumptions to produce some appropriate regulations."
Interactive Mechanisms and Toxicokinetics
Within organisms, chemicals can interact at a number of different levels,
through absorption, metabolism, and distribution, and at the site of action.
Melvin Anderson, a toxicologist with ICF Kaiser Systems, says that it's
through studies of pharmacokinetics that we begin to understand the behavior
of mixtures. "Unless we learn what the mechanisms of these interactions
are, we have little hope of extrapolating to lower doses and from one species
to another." Rather than refer to chemical interactions strictly in
terms of additivity, synergy, or antagonism, Anderson says he prefers to
use the terms "pharmacokinetic" and "pharmacodynamic"
interactions. Anderson defines the terms thus: "Pharmacokinetic interactions
are when the tissue dose of a chemical per unit of exposure is altered by
co-exposure to another chemical. A pharmacodynamic interaction is when tissue
response to a unit concentration of the chemical is altered due to co-exposure
to other chemicals." Anderson says that over the past 20 years, toxicologists
have been heavily cautioned not to equate responses to administered dose.
"We really have to know what kind of chemical gets to the tissues and
in what form, and the intensity of exposure, to correlate outcome to a particular
exposure."
Linda Birnbaum: "TEFs should be viewed as an interim approach."
EPA toxicologist Linda Birnbaum agrees. Her work with toxic equivalency
factors in risk assessment for dioxin and dioxinlike chemical compounds
involves interactions at a molecular level. "The ability of a chemical
to interact through the receptor just tells you that it has the ability
to do that; it doesn't tell you what happens when the chemical actually
gets into the animal," she says. "And in fact, pharmacokinetic
factors play a very major role in tempering the potency of a number of compounds.
PCB-77 has a very good ability to interact with the receptor, but it is metabolized
and eliminated almost immediately upon entering an animal's body."
Through animal models of pharmacokinetic interactions of specific chemical
compounds, scientists hope to make predictions for occupational exposure,
leading to improved regulatory standards for workplace safety. For example,
there are experiments that simulate human exposures to atmospheric mixtures
of styrene and butadiene (a probable human carcinogen, according to the
EPA) that may occur during processing and production of styrene-butadiene
polymers. In one such study, toxicologists led by Gyorgy A. Csanady at GSF-Institut
fur Toxikologie in Neuherberg, Germany, found that the metabolism of butadiene
was inhibited by simultaneous exposure to styrene, whereas butadiene had
no detectable effect on the kinetics of styrene. Findings of antagonistic
metabolic interactions between these compounds have also been reported by
toxicologists at the Chemical Industry Institute of Toxicology (CIIT). The
CIIT work showed inhibition both of the oxidative metabolism of butadiene and
of the further oxidation and detoxification of an important reactive
metabolite, butadiene monoepoxide, which is thought to be partly responsible
for the genotoxicity of butadiene.
An interesting example of a chemical mixture with metabolic interactions
that pose a health risk is the interaction between trichloroethylene (TCE)
and ethanol. TCE is found at most hazardous waste sites and is the most
common groundwater contaminant near those sites. Ethanol both induces and
competes with TCE metabolism, and according to M. Moiz Mumtaz and Jo Ann
Freedman of the Agency for Toxic Substances and Disease Registry, the effect
of this interaction depends on the exposure protocol; that is, the timing
of exposure. The ATSDR scientists found that simultaneous exposure causes
competition for enzymes and co-factors in TCE metabolism, with consequent
decreased potentiation of TCE-induced central nervous system depression.
Induction predominates if the interval between exposure to ethanol and exposure
to TCE is three hours. Simultaneous exposure potentiates cardiac arrhythmias,
while increasing the exposure interval decreases this potentiation, but
increases liver toxicity. "Degreaser's flush" is a severe and
sometimes fatal intoxication that can occur in habitual alcohol drinkers
exposed to TCE.
Knowledge of antagonism between compounds may prove useful for reducing
toxicity. Robert Snyder, a professor of toxicology at Rutgers University,
has been exploring ways to reduce benzene toxicity by modifying its metabolism
in rats through co-exposure to toluene. Exposure to high doses of benzene
(in excess of 25 parts per million) over prolonged periods has been associated
with the development of aplastic anemia among workers in the printing, shoemaking,
and plioform (a saran precursor) industries. "Toluene is a competitive
inhibitor of benzene metabolism," Snyder says. "Toxic metabolites
of benzene aren't produced, so you reduce its toxicity. Any way you can
prevent metabolism, you can protect against the effect." He points
out, however, that while toluene is antagonistic toward benzene metabolism,
the chemicals act additively to produce central nervous system depression.
Interspecies Extrapolation
In terms of using interspecies extrapolation as the basis for risk assessment,
what holds true for single-chemical studies seems to apply with equal force
to mixtures. Zenick points out that equally relevant to mixtures studies
are questions pertaining to whether mechanisms are homologous among species
and questions about what happens to the mixture as a result of pharmacodynamics.
"You are potentially faced with multiple mechanisms elicited simultaneously
within each species under consideration; thus the ability to tease out the
issue of mechanisms becomes more complex." Zenick adds, "Even
if you have homologous mechanisms that are present and in operation at high
doses across species, you cannot be confident those same mechanisms would
be there at low doses more appropriate to human exposure levels."
Indeed, interspecies differences in metabolic activation and deactivation
of single compounds are common. It should not come as a surprise that rats
and mice respond differently to concurrent exposures to certain chemicals.
Such is the case with chloroform and TCE as studied in mice by HERL's Simmons
and David J. Svendsgaard, along with University of North Carolina toxicologist
Hui-Min Yang. They note that previous reports have shown reductions in chloroform-induced
liver and kidney toxicity in rats co-exposed to chloroform and TCE compared
to rats treated with chloroform alone. In their more recent study, concurrent
oral exposure to chloroform and TCE in mice "suggests synergistic liver
toxicity in higher dose regions and additive toxicity in lower dose regions."
Kidney toxicity, they state, appeared additive.
Contributions from Epidemiology
"Determining the health risks of complex mixtures poses equally
daunting challenges to toxicologists using experimental methods and to epidemiologists
using observational methods," says Jonathan Samet, chair of the Department
of Epidemiology at Johns Hopkins University. "Some of the weaknesses
of epidemiologic methods for investigating chemical mixtures are also evident,"
Samet adds. "Exposure assessment may be particularly challenging. Random
and nonrandom errors in the estimation of exposures may blunt the sensitivity
of epidemiologic studies and constrain interpretation of findings. And large,
expensive studies may be indicated."
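Samet's caution about exposure estimation can be made concrete with a small
simulation: when random ("classical") error is added to a true exposure, the
estimated exposure-response slope shrinks toward zero by roughly the
reliability of the measurement. The Python sketch below is a generic
statistics illustration with fabricated numbers, not a reanalysis of any
study.

    # Sketch: classical exposure measurement error attenuates an estimated
    # exposure-response slope. Purely illustrative numbers.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000

    true_exposure = rng.normal(10.0, 2.0, n)             # unobserved "true" dose
    measured = true_exposure + rng.normal(0.0, 2.0, n)   # classical random error
    outcome = 0.5 * true_exposure + rng.normal(0.0, 1.0, n)

    slope_true = np.polyfit(true_exposure, outcome, 1)[0]
    slope_measured = np.polyfit(measured, outcome, 1)[0]

    reliability = true_exposure.var() / measured.var()   # var(X) / var(X + error)
    print(f"slope using true exposure:     {slope_true:.3f}")      # ~0.50
    print(f"slope using measured exposure: {slope_measured:.3f}")  # ~0.25
    print(f"reliability ratio:             {reliability:.2f}")     # ~0.50
    # The estimated effect shrinks toward zero, blunting the study's sensitivity.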
"Epidemiologic data have the implicit strength of directly addressing
risks of exposures in human populations and, for this reason, the findings
of epidemiological research have received prominence in the development
of regulations," Samet says. Samet also points out that in regard to
chemical mixtures, epidemiology studies can offer information on the consequences
of community and workplace exposure to the mixtures present. And when laboratory
replication is not feasible or even possible, epidemiologic studies can
provide the supporting evidence.
Epidemiological investigations have proved highly informative for identifying
adverse consequences of diverse environmental exposures, including such
chemicals as benzene and vinyl chloride. Epidemiology has also been central
in identifying adverse health effects of mixtures such as tobacco smoke
and outdoor and indoor air pollution. Samet says that such highly variable
mixtures of gaseous and particulate agents have not been readily investigated
using toxicologic approaches. "Epidemiologic research has been less
informative in characterizing the effects of exposures to relatively low
levels of mixtures, in determining the components of mixtures that may be
most relevant to disease causation, and in understanding the interactions
among components of mixtures," he says.
Study designs such as nested case-control and case-cohort studies
involve sampling from populations to
enhance feasibility and reduce costs. These designs should still yield estimates
of effect that are unbiased and reasonably precise compared to those obtained
by studying the entire population. New tools offering promise for mixtures
research include methods for time-activity assessment, area and personal
monitoring of exposures, and biomarkers.
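A brief simulation shows why such sampling designs are attractive: all of
the cases plus a modest random subcohort recover essentially the same risk
ratio as analyzing the entire cohort, while requiring exposure data on only
a small fraction of its members. The Python sketch below uses a fabricated
cohort and a single binary exposure; it is a conceptual illustration, not an
analysis protocol.

    # Sketch: a case-cohort sample (all cases + a random subcohort) recovers
    # roughly the same risk ratio as the full cohort. Fabricated data.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000

    exposed = rng.random(n) < 0.3
    risk = np.where(exposed, 0.006, 0.002)     # true risk ratio = 3
    case = rng.random(n) < risk

    # Full-cohort risk ratio (requires exposure data on all 200,000 people)
    rr_full = case[exposed].mean() / case[~exposed].mean()

    # Case-cohort estimate: exposure odds in cases vs. in a random subcohort
    subcohort = rng.choice(n, size=2_000, replace=False)
    def odds(p):
        return p / (1 - p)
    rr_cc = odds(exposed[case].mean()) / odds(exposed[subcohort].mean())

    print(f"people needing exposure assessment: ~{case.sum() + 2_000} vs. {n}")
    print(f"full-cohort risk ratio: {rr_full:.2f}")
    print(f"case-cohort estimate:   {rr_cc:.2f}")
    # Both estimates sit near the true value of 3, but the case-cohort design
    # measures exposure for only a small fraction of the cohort.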
Eula Bingham: "Scientists should use immunological data in assessing mixtures."
Eula Bingham, a professor of environmental health at the University of
Cincinnati, urges a multidisciplinary approach to mixtures research that
would allow toxicologists to exploit information from human disease. "What
are some of the diseases that concern us regarding health effects of mixtures?"
she asks. "Cancer and specific types of cancer." According to Bingham,
there is immunological evidence that some agents work together in ways that
go beyond additivity. Examples include radon and smoking in lung cancer,
asbestos and smoking in lung cancer, and alcohol and smoking in pharyngeal
cancer. She says that
revised EPA guidelines on chemical mixtures risk assessment should emphasize
the importance of synergism, rather than additivity. Bingham suggests "that
we worry about adding up over time, adding to the immunologic burden, and
to the estrogenic burden."
"Mixtures are tough for everybody," Samet says. "Epidemiologic
approaches represent the only way to look at the consequences of mixtures
as they are experienced by people."
New Approaches
An approach to improving the assessment of potential hazard for complex
chemical mixtures still under development is the use of toxic equivalency
factors (TEFs). The use of TEFs involves development of a potency ranking
scheme that relies on existing data and scientific judgment. The TEF for a
chemical is derived by reviewing the available data for that compound,
examining its dose-response characteristics, and comparing them to the
dose-response characteristics observed for a prototypical compound. Thus, each
chemical in a mixture has a TEF assigned to it. Says Birnbaum, "Multiply
that fractional potency value by the mass [of the mixture], sum it all up,
and that's the total toxic equivalency."
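Birnbaum's description translates directly into a weighted sum. The Python
sketch below computes a total toxic equivalency (TEQ) for a hypothetical
mixture; the TEF values and amounts are invented for illustration and are
not the consensus factors used in regulation.

    # Sketch of the toxic equivalency calculation Birnbaum describes:
    # TEQ = sum over components of (TEF_i x amount_i).
    # TEFs and amounts below are hypothetical, not official consensus values.

    def total_teq(components):
        """components: iterable of (name, tef, amount) tuples.
        Returns the mixture's total toxic equivalency in the amount's units."""
        return sum(tef * amount for _, tef, amount in components)

    mixture = [
        ("reference congener (e.g., TCDD)", 1.0,   0.5),    # TEF defined as 1
        ("dioxinlike congener A",           0.1,   12.0),
        ("dioxinlike congener B",           0.001, 300.0),
    ]

    print(f"total TEQ: {total_teq(mixture):.2f}")   # 0.5 + 1.2 + 0.3 = 2.00
    # The mixture behaves, to a first approximation, like 2.0 units of the
    # reference compound -- an order-of-magnitude screen, not a precise potency.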
Birnbaum says TEFs are used for observing differences in orders of magnitude
and that they are not precise estimates of relative potency. "I think
it's important that for risk assessment this is viewed by at least our agency
[EPA] as an interim approach until we might develop something that will
work better."
What is in the future for risk assessment of chemical mixtures? At the
HERL symposium on chemical mixtures and risk assessment in November, William
Greco of the Roswell Park Cancer Institute predicted, "By the beginning
of the next millennium, routine assessment of the effects of chemical mixtures,
for both toxic and therapeutic agents, will be very different from approaches
commonly used today." The future paradigm, according to Greco and Roswell
Park's Leonid A. Khinkis, will include assays that are more automated and
robotized; routine study of multicomponent mixtures; empirical models that
will routinely be fitted to data with sophisticated user-friendly software on
fast, inexpensive computer workstations; insightful computer-based graphical
exploratory analysis procedures; routine combined pharmacokinetic-pharmacodynamic
modeling of chemical mixtures; and standardization of nomenclature and approaches.
"The seeds of a brave new world have already been planted, and spring
is approaching," said Greco.
Leslie Lang
Leslie Lang is a freelance journalist in Chapel Hill, North Carolina.