Fear of Phenolphthalein?
Unwanted ingredient? New research shows phenolphthalein, a drug found in many laxatives, may have carcinogenic side effects in animals.
In December scientists announced that phenolphthalein, a substance used for almost a century in over-the-counter laxatives, showed clear evidence of carcinogenicity in rodent studies and may present a risk to humans, particularly individuals who ingest amounts greatly exceeding recommended doses.
The determination was made by the National Toxicology Program, which initiated toxicology and carcinogenicity studies of phenolphthalein because no long-term animal studies were available to allow evaluation of the potential risks to humans from prolonged use of the drug. The National Cancer Institute had nominated phenolphthalein for study.
In the NTP studies, rats and mice were fed phenolphthalein over a period of two years, at doses of 12,000, 25,000, and 50,000 parts per million (ppm) for rats and 3,000, 6,000, and 12,000 ppm for mice. The rodents were then examined for the presence of cancerous and noncancerous pathology. The study results are summarized as follows:
- clear evidence of carcinogenic activity in male F344/N rats based on markedly increased incidences of benign neoplasms of the adrenal medulla and benign and malignant neoplasms of the kidneys;
- some evidence of carcinogenic activity in female rats based on increased incidences of benign pheochromocytoma in the 12,000 ppm dose group, and of benign or malignant neoplasms of the adrenal medulla;
- clear evidence of carcinogenic activity in male mice based on histiocytic sarcoma and malignant lymphoma;
- clear evidence of carcinogenic activity in female mice based on increased incidences of histiocytic sarcoma, malignant lymphoma of all types, lymphoma of thymic origin, and benign ovarian tumors.
According to the NTP, phenolphthalein may cause cellular alterations in animals by a number of mechanisms including chromosomal damage, and through estrogenlike activity. Additional studies are underway to further understand the mechanisms by which phenolphthalein acts. The NTP technical report on phenolphthalein stresses, however, that it is difficult to extrapolate human risk from animal studies, and no population studies of phenolphthalein users have shown an increased risk for disease. This does not mean, however, that the drug is necessarily risk-free for humans. George Lucier, director of the Environmental Toxicology Program at the NIEHS, says, "Although we can't precisely determine the relevance of the NTP animal findings for human risk, they do provide a red flag of caution."
Alternatives in Animal Testing
The three "R's" of animal testing are refine, reduce, and replace. Respectively, they denote modifying toxicological test procedures to lessen or eliminate animals' pain, curtailing the number of animals required for a test, and replacing test animals with non-animal methods or phylogenetically lower species. Total replacement would mean eliminating the use of animals by using microbes, cells, tissues, and other in vitro methods, as well as using computerized information databases and mathematical models.
Applying the three Rs to the development and validation of new and improved testing methods is mandated by Section 1301 of the National Institutes of Health Revitalization Act of 1993 (PL 103-43). The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) was established at the NIEHS in 1994 to fulfill this mandate by establishing criteria for the validation of alternative testing methods, and recommending processes by which they can be accepted for regulatory use.
This daunting task was discussed at the National Toxicology Program Workshop on Validation and Regulatory Acceptance of Alternative Toxicological Test Methods, held in Arlington, Virginia, 11-12 December 1995. The workshop was organized by ICCVAM, which consists of ad hoc representatives of 15 federal scientific and regulatory agencies. The purpose of the public workshop was to obtain comments and recommendations from experts and interested stakeholders from industry, academia, government, public interest groups, and animal welfare organizations on a draft report prepared by the committee. The draft report and workshop report were presented at an Organization for Economic Cooperation and Development (OECD) workshop on toxicological test alternatives held in January 1996 in Stockholm, Sweden. Neil Wilcox, special assistant to the associate commissioner of the FDA, said, "The forum successfully brought key individuals together to discuss the criteria and primary stages necessary for the eventual validation, regulatory acceptance, and implementation of test methods."
Workshop participants sought to identify more efficient ways for developers to communicate information and gain acceptance of their new test methods by scientists in various regulatory agencies without having to systematically approach each agency. For instance, a test method to characterize the acute toxicity of a chemical to be used commercially might be of interest to the EPA, the Consumer Product Safety Commission, the DOT, and OSHA. On the other hand, a new test method for a drug may be important to the FDA, but of no interest to any other agency. A question posed at the meeting was how communications between stakeholders might be streamlined to facilitate the review and acceptance process.
Participants at the workshop breakout session on future directions proposed the creation of a clearinghouse through which a developer could bring a new test method to the attention of the appropriate federal regulatory agency or agencies. The clearinghouse was envisioned as an interagency coordinating committee that would facilitate communication between test method developers and agency scientists regarding new alternative test methods for various endpoints. However, at what point in the development and validation process should communications be made with the clearinghouse? After a method has been developed and a validation study designed, consultation with the clearinghouse might determine that the design will not generate sufficient data to be adequately evaluated by regulatory agencies. But if consultation does not occur until after validation--an expensive procedure--the intended regulatory agencies may find the data inadequate for evaluating the method's usefulness, and much time and money may have been wasted. The breakout group concluded that a process was needed to facilitate communications among all stakeholders at all stages of development, validation, and acceptance. Oliver Flint, a principal scientist at Bristol-Myers Squibb and executive secretary of the breakout session, supports the idea of a clearinghouse. Said Flint, "From the scientist's point of view, this will have the advantage that individual tests will not have to be revalidated for each regulatory agency; and from the public's point of view, that one agency will not have lower or eccentrically different standards from those of another."
The breakout panel suggested interposing a stage called prevalidation between the stages of development and validation. Prevalidation would generate enough data for clearinghouse scientists to assess the likelihood that the test method would meet regulatory testing requirements. A negative finding would stop the process. A positive finding would warrant the allocation of additional time and funds for validation, peer review, and, hopefully, implementation. The clearinghouse might even assist in identifying potential funding sources.
As discussed by the breakout group, the clearinghouse would communicate information about test methods, not products, and would have only an advisory function to the participating agencies. Its membership would include scientists from all of the relevant federal agencies. Several questions remained, however, such as whether the clearinghouse would be a revamped ICCVAM or a new interagency coordinating committee, how the clearinghouse would be funded, and whether it would eventually be set up to communicate information efficiently to foreign and international regulatory bodies.
ICCVAM co-chairperson William Stokes, associate director of Animal and Alternative Resources at the NIEHS, agrees with the concepts of the clearinghouse and prevalidation and thinks they would be useful. "Fifteen agencies have worked well together in the ICCVAM to develop the draft report so the proposed clearinghouse functions could be the logical next step. It could have representatives from each [ICCVAM member] agency and 'go international' down the road. Proposals for enhanced international coordination at all stages of validation and acceptance are more likely to emerge at the OECD workshop, where the ICCVAM draft report and workshop report will serve as working documents. The aim is to harmonize our report with that of an OECD guidance document to be developed in Stockholm."
Some workshop participants would like to see animal tests phased out entirely and as soon as possible. For instance, Michael Balls, director of the European Center for the Validation of Alternative Methods, of the European Union Joint Research Center's Environment Institute, called for greater efforts to find replacements for all animal tests as soon as possible.
Many scientists at the meeting expressed the view that totally eliminating the use of animals in testing appears unlikely for the foreseeable future. Stokes explained that despite some limitations of current animal models used for testing, they remain useful and necessary for the protection of human health. "The degree of relevance of information [to humans] depends on the specific animal model used, our understanding of the model, and toxicity endpoint studied," he pointed out. Balls, however, believes that animal testing can be entirely replaced in the future. "When I'm optimistic, I think it can be done in 25 years. When I feel pessimistic, I think in terms of 50 years."
Although a specific deadline for replacing animal models was not an outcome of the meeting, George Lucier, director of the Environmental Toxicology Program at the NIEHS, believes the meeting was a significant step in the right direction for several reasons. "This meeting produced a significant broadening of the definition of alternative models to include such things as mechanistic data and mathematical models," he said. "And most importantly, all of the stakeholders bought into the ideas put forth for improving, especially streamlining, the approaches by which we determine human toxicity."
Great Lakes on the Mend
The Great Lakes are on the mend, according to recent reports on the overall health of the world's largest body of fresh water. Yet persistent toxic compounds continue to be dumped into the lakes, and PCBs, DDT, and other chemicals can still be detected in lake sediments and fish, causing potential health problems for people who eat the fish. New health studies indicate that pollution of the Great Lakes could be contributing to subtle health concerns in the region including a reduction in human sperm counts, higher rates of breast cancer, low birth-weight babies, and learning disabilities in children.
The U.S. EPA and Environment Canada concluded that water quality and human health are improving in the Great Lakes region, but the results are mixed. The agencies presented these findings in the State of the Great Lakes 1995, a biennial report released in September.
"I think it's safe to say this: there's no doubt that contaminants are declining," says Harold Humphrey, veteran research scientist for the Michigan Department of Public Health. "Things are a bit better. But the question remains, what is going to happen with all of these subtle health effects? No one can answer that question yet."
The Great Lakes have been a receptacle for a wide variety of pollutants for decades, including DDT, PCBs, pesticides, dioxin, and more. It is a heavily industrialized region: about one-fifth of American industry and one-half of Canadian industry are located along the Great Lakes or tributary streams. Forty-two shoreline areas in the Great Lakes, such as Indiana Harbor, Milwaukee Harbor, and Green Bay Harbor, have been designated as "areas of concern"--the most degraded sites in the basin--by the EPA and Environment Canada. Thirty-five of the areas of concern have public advisories against fish consumption.
In view of the number of people consuming fish and the potential human health impacts, the International Joint Commission (IJC) Great Lakes Water Quality Board called for a zero discharge of persistent toxicants into the Great Lakes in its seventh biennial report titled 1993-1995 Priorities and Progress Under the Great Lakes Water Quality Agreement. "Society must adopt a clear and comprehensive action plan to virtually eliminate persistent toxic substances that are threatening human health and the future of the Great Lakes ecosystem," the report said. That goal has yet to be achieved.
On the bright side, research detailed in the State of the Great Lakes 1995 shows that levels of DDT in women's breast milk (from women living in a number of Canadian cities in the Great Lakes area) declined 87%, from more than 150 parts per million in 1967 to 20 ppm in 1986. The levels of PCBs in women's breast milk have also declined. Results of a Lake Michigan study conducted by the Wisconsin Department of Natural Resources show the amount of PCBs in lake trout and salmon has decreased by 80%.
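The reported percentage is consistent with the figures given; a minimal arithmetic check (using only the 150 ppm and 20 ppm values from the report) confirms the roughly 87% decline:

```python
# Check of the DDT decline reported in State of the Great Lakes 1995:
# breast-milk DDT fell from more than 150 ppm (1967) to 20 ppm (1986).
before_ppm = 150
after_ppm = 20

percent_decline = (before_ppm - after_ppm) / before_ppm * 100
print(f"Decline: {percent_decline:.1f}%")  # prints "Decline: 86.7%", i.e. roughly 87%
```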
In its report, the IJC advocates pollution prevention as an immediate step that industry can take to help reduce persistent toxicants in the Great Lakes. The water quality board recommended that binational initiatives be adopted to build on those gains, including new benchmarks, management guidelines, and increased monitoring.
John Westendorf, manager of water quality and corporate environmental affairs for Occidental Chemical Corporation in Niagara Falls, New York, said his company's Niagara Falls plant has reduced its air, water, and hazardous waste discharges by 73% in the past five years.
"Industry is committed to doing a better job as long as we're given some flexibility," Westendorf says.
But Occidental opposes a zero-discharge standard, Westendorf says. At minimum, chemicals are needed to control exotic species such as the sea lamprey and zebra mussel, he says. Sea lampreys prey on lake trout and other game fish, and zebra mussels clog water-intake pipes for utilities and industries. Both exotic species have proliferated in the Great Lakes by the millions.
Burkhard Mausberg, executive director of Great Lakes United in Buffalo, New York, says his organization supports zero discharge and ecosystem management to improve the Great Lakes. But he sees few advances toward those goals in the politically charged policy arena.
Mausberg notes that the U.S. Congress has pondered deep cuts in renewing the Clean Water Act and EPA enforcement. "What we're finding is the environmental programs that are most progressive are being cut. We don't need that right now," he says.
The Great Lakes Initiative, launched by former President Bush and carried on by President Clinton, envisions uniform pollution-discharge standards for all the Great Lakes states. It has been opposed by most states thus far for economic reasons, officials say.
While politics muddies the waters for now, the IJC's Great Lakes Science Advisory Board has called for more research into the impacts of chemicals on the reproductive, developmental, and immune systems in animals and humans. New research indicates that the discharge of PCBs, pesticide residues, and dioxin into the Great Lakes could be causing hormonal changes in some fish, birds, and mammals.
The research is preliminary, according to Theo Colborn of the World Wildlife Fund, who is authoring a book on environmental estrogens. But the effects on fish could also materialize in humans, she says. According to Humphrey, it's too early to tell how substantial the risk is for humans. "The concern among public health professionals is that we don't cry wolf too much so people believe us when the real wolf comes along," he says.
East Meets West for Improved Antimalarials
Malaria is one of the world's greatest public health problems, striking 200-300 million people yearly and killing at least a million. The intracellular protozoan parasites that cause the disease are endemic in tropical areas and are spread by mosquitoes. In many areas, especially Southeast Asia, the parasite has become resistant to existing antimalarials such as chloroquine.
Now, with some help from Western science, an ancient Chinese herbal remedy may serve as the prototype for a new family of antimalarial drugs. In 1972, Chinese scientists rediscovered qinghaosu, a plant extract derived from the leaves of Artemisia annua, a prolific shrub related to wormwood that has been used against fever for 2,000 years. The Western name for the active component is artemisinin. When the Chinese found that it fought malaria, they put it into clinical use; at least three million malaria patients have since been treated with the folk medicine. So far, the malaria parasites have shown no resistance to the substance, nor is artemisinin toxic at clinical doses. And it works quickly. But it is far from a perfect remedy, and Western scientists hope to improve the drug by changing its chemical makeup.
"It's very good for treating severe, life-threatening malaria," says Steven R. Meshnick, a parasitologist, biochemist, and associate professor of epidemiology at the University of Michigan School of Public Health. Artemisinin revives comatose patients much faster than quinine. But it's not a cure. It's less useful in milder cases and the disease frequently returns. The drug is also poorly water soluble and difficult to administer orally. And the body eliminates it quickly, so it must be taken frequently.
However, "Once you know exactly where the drug works, and how, you can then design it to work better," says Meshnick. The essential part of the artemisinin molecule is a peroxide bridge, a chemical structure that's unusual and often unstable. The molecule's selective toxicity is due to the malaria parasite's diet of hemoglobin. The parasite digests the globin portion but can't metabolize the iron-containing heme structure, which it stores in hemozoin granules. The peroxide bridge interacts with the iron and heme exposed in these granules to produce short-lived, highly reactive free radicals. These free radicals or related reactive intermediates damage critical proteins in the parasite, killing it. Though heme exists throughout the human body, it's tucked inside proteins and thus protected from this reaction.
Artemisinin is a complex, multiringed molecule, but all the rings aren't necessary for the antimalarial effect. "We've tried to simplify the structure and arrive at compounds that are equally potent and yet are much easier to prepare in the lab," says Gary Posner, a medicinal chemist at Johns Hopkins University. Posner and others have used this mechanism-based design strategy to synthesize hundreds of artemisinin analogs, some of which are as effective as artemisinin in animal studies.
Michael Bentley, chairperson of the Department of Chemistry at the University of Maine, has employed a different tactic. Bentley attached artemisinin to polymers of polyethylene glycol, a nontoxic, nonallergenic compound used in foods and drugs. The resulting compounds tend to be soluble both in water and in nonpolar solvents. Such structural modifications may lengthen artemisinin's stay in the body. Plus, each polymer carries two peroxide bridges. In studies with mice, some of Bentley's compounds show improved activity over either artemisinin or chloroquine alone.
Artemisinin-based drugs are promising, experts agree, but the coevolutionary arms race with the malaria parasite will continue. Given the opportunity, the parasite will develop resistance. "Nothing is a cure-all, magic bullet for all time," says Bentley. Meshnick concurs, but notes that although penicillin wasn't perfect either, it served as a prototype for a whole new family of antibiotics. "I really think the same thing can happen for artemisinin," he says.
What's Causing Parkinson's?
The cause of Parkinson's disease has baffled doctors ever since this chronic neurological syndrome was first described by James Parkinson in 1817. Now scientists may finally be closing in on the culprits. The disease is likely to be caused by "some admixture of genetic predisposition, aging, and exposure to environmental toxicants," says Doyle Graham, chair of the pathology department at Vanderbilt University Medical Center in Nashville.
Sorting out multiple potential causal agents and the interactions among them will never be easy. However, thanks to increasingly sophisticated research techniques, it may now be possible. A prime example of the kind of meticulously designed research needed for this purpose is a case-control study currently being completed at Henry Ford Health System in Detroit. "This is one of only two population-based studies to date in Parkinson's epidemiology, and it's the only one of which I'm aware in which an industrial hygienist's assessment of exposures, based on detailed occupational histories, has been used," says Jay M. Gorell, lead investigator and director of the hospital's Parkinson's disease center.
The study, now in the final data analysis stage, included 144 Parkinson's patients and 469 control subjects who were matched for age, race, and sex. Preliminary analyses, based on all but 10 of the patients and all but six of the controls, suggest an increased risk of Parkinson's disease associated with exposure to manganese, copper, and lead, as well as exposure to herbicides and insecticides used at work. However, it may take some combination of factors to produce Parkinson's disease. "It's possible that an agent might act in a cumulative way over time to partially disable a cell, but it might take several agents together to cause the cell either to fail to function or to die prematurely," says Gorell.
While some researchers are devoting their energies to determining which environmental agents contribute to Parkinson's disease, others are more concerned with discovering how they do so. The spectrum of possibilities ranges "from the enhanced metabolism of substances into their toxic form to the diminished protection of cells from these kinds of toxic products," says Graham. One hypothesis he is pursuing relates to the potential role of transition metals, such as manganese. Manganese poisoning produces symptoms similar to Parkinson's disease, although it affects a different site in the brain. Graham and his colleague, Thomas Montino, are currently studying the effect of transition metals on neuroglial cells in tissue culture.
Graham and others believe that transition metals may contribute to the oxidation of catecholamines, compounds that carry signals between nerve cells. One way that catecholamines are destroyed is by the action of the enzyme monoamine oxidase that results in the production of hydrogen peroxide, a known toxic compound. Catecholamine oxidation can also occur by metal-catalyzed processes that produce their own toxic by-products. Of course, cells have to cope with such natural toxins on a regular basis, so they develop the ability to protect themselves from cellular damage with enzymes that destroy toxins. However, if the rate of toxin production is increased or the ability of a cell to protect itself is lessened, this could lead to cellular injury over time.
As research presents new evidence for a link between environmental agents and Parkinson's disease, still other scientists are trying to understand how such factors may interact with a given individual's genetic makeup. The most likely scenario seems to be that a person can inherit a predisposition to develop the disease that is only later activated by a toxic exposure. For example, "a person may have a genetic defect in the body machinery that deals with a toxic compound, but if that person doesn't ever come across this toxic agent, he may never develop the disease," says Donato Di Monte, director of biochemical toxicology at the Parkinson's Institute in Sunnyvale, California.
One major obstacle to all Parkinson's research, however, is the lack of an objective test to diagnose the condition. At present, diagnosis is based on the clinical assessment of a given patient's symptoms, a method that can be inaccurate. The often conflicting results of many older Parkinson's studies may, in fact, be partly due to a lack of standardization in diagnostic criteria. Di Monte is among the scientists now searching for biological markers of Parkinson's disease, in the hopes of one day developing a simple, reliable test for the condition. Some of his work is aimed at studying the products of oxidative metabolism in spinal fluid.
The picture of Parkinson's remains complicated. Says Di Monte, "We have to start looking at the interactions between environmental and genetic factors. And among the environmental factors, we have to start looking at interactions between neurotoxins. Naturally, we would like to have a simple experimental model, but unfortunately that may not be feasible with Parkinson's disease."
EHPnet
Reliance on petroleum comes with a price. Because of spills such as that of the Exxon Valdez, and the many barrels of oil that leak from petroleum pipelines, cleaning up petroleum-polluted water and soil is big business. The most common mechanical remediation techniques involve drumming contaminated soil and water, transporting it, and disposing of it remotely at special dump sites and treatment plants.
Recently, an alternative technique called bioremediation has been developed. Bioremediation is the process of using a mix of living microorganisms, nutrients, and biological catalysts to rapidly break down hydrocarbons in soil or water into nonhazardous, nonregulated, organic fertilizer-like compounds. Bioremediation methods are estimated to cost one-fifth to one-tenth as much as mechanical methods of remediation.
A World Wide Web site dedicated to bioremediation was created by Oettco Products Corporation, a company specializing in petroleum-oxidizing products. The site provides an overall explanation of bioremediation methods, materials, and techniques that have been used by professional bioremediation contractors over the last fifteen years. The topics outlined include basic concepts, typical commercial bioremediation materials, the bioremediation industry, safety issues, soil remediation topics, eliminating absorbed petroleum, cleaning coastal soil, and oil slicks on open water.
Bioremediation is not only used to clean up petroleum spills, but is also used to support basic sanitation infrastructures. Algae have been used to treat waste water for over a century, but only recently have certain algae species been actively cultivated to "digest waste." These methods are cost effective and produce little waste, as the leftover algae can be dried and used as fertilizer. Bioremediation methodology and techniques may provide a more environmentally sound way to help clean up the environment.
Last update: May 15, 1997