
EHP

Featured | News | Science Selection | Volume 122 | Issue 11 | November 2014

Environ Health Perspect; DOI:10.1289/ehp.122-A311

Thinking One Step Ahead: Strategies to Strengthen Epidemiological Data for Use in Risk Assessment

Carrie Arnold is a freelance science writer living in Virginia. Her work has appeared in Scientific American, Discover, New Scientist, Smithsonian, and more.

About This Article

Citation: Arnold C. 2014. Thinking one step ahead: strategies to strengthen epidemiological data for use in risk assessment. Environ Health Perspect 122:A311; http://dx.doi.org/10.1289/ehp.122-A311

News Topics: Research Issues and Initiatives, Risk Assessment

Published: 1 November 2014


Related EHP Article

Evaluating Uncertainty to Strengthen Epidemiologic Data For Use in Human Health Risk Assessments

Carol J. Burns, J. Michael Wright, Jennifer B. Pierson, Thomas F. Bateson, Igor Burstyn, Daniel A. Goldstein, James E. Klaunig, Thomas J. Luben, Gary Mihlan, Leonard Ritter, A. Robert Schnatter, J. Morel Symons, and Kun Don Yi

Risk assessment is a cornerstone of environmental health research and policy making.1 A commentary in this issue of EHP presents a set of recommendations and guidelines to help researchers more effectively characterize uncertainty in epidemiological findings.2 Not only will this provide more transparency for the science itself, says coauthor Jennifer Pierson, a scientific program manager at the ILSI Health and Environmental Sciences Institute, it should also lead to more sound policies when those findings are integrated into risk assessments.

“Risk assessment is nothing magical; it’s a process to guide decision making. As with any kind of scientific question, it’s important to know how certain we are of our data,” says Thomas Burke, director of the Johns Hopkins Risk Science and Public Policy Institute, who was not involved with the commentary. “We can’t ever fully eliminate uncertainty, but we can describe it and put bounds around it with statistics.”

[Image: A DAG superimposed over a crowded thoroughfare.] Directed acyclic graphs (DAGs) can be an effective way to visualize relationships between the variables in a study.

© Joseph Tart; Tomislav Pinter/Shutterstock

Experimental data have traditionally formed the basis for most human health risk assessments, but increasingly regulators are recognizing the value of epidemiological data for this purpose. “Different types of studies, like toxicology studies in animals and epidemiological studies in humans, can help compensate for each other’s inherent weaknesses,” says Michael Dourson, director of Toxicology Excellence for Risk Assessment, a public health organization located in Cincinnati, Ohio. Dourson was not involved with the commentary.

In October 2012 Pierson and colleagues convened a workshop with more than 30 environmental health researchers to develop recommendations for characterizing uncertainty in epidemiological studies. Their recommendations and guidelines form the basis of the new commentary.

To Pierson, one of the most important first steps is to get study authors out of their respective silos. “Oftentimes, the epidemiologists don’t work with the toxicologists, who don’t work with the risk assessors,” she says. By working together from the earliest planning stages of a study, researchers can pool their knowledge to limit uncertainty through study design, rather than scrambling to fix problems at the end.

Addressing uncertainty is critical when writing up results so that policy makers can factor it into their appraisal of the literature. Pierson and colleagues recommend that investigators assess and comment on the uncertainty in their findings using a tiered system developed by the National Research Council. This system enables policy makers to rate the quality of epidemiological data and how well study findings can be generalized to larger populations. This further allows them to weigh the uncertainties from different studies based on the quality of the research, creating more accurate and nuanced risk assessments. For authors, applying the system to their own work can point to areas of uncertainty that would benefit from further analysis.2

Validation studies and sensitivity analyses of epidemiological data, combined with a better understanding and disclosure of the sources of uncertainty, can help authors explore such areas. These methods can transform the discussion of uncertainty from its usual qualitative form3 into a quantitative measurement. This allows scientists to clearly communicate their results and accompanying uncertainties in the numbers-driven language of policy makers.2
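One common form of such a sensitivity analysis is a simple quantitative bias analysis, in which plausible values for measurement error are assumed and the observed estimate is recalculated under those assumptions. The sketch below illustrates the idea for nondifferential exposure misclassification in a 2×2 table; the counts, sensitivity, and specificity are hypothetical placeholders, not values from the commentary or any cited study.

```python
# Minimal sketch of a simple quantitative bias analysis for nondifferential
# exposure misclassification. All numbers below are hypothetical.

def correct_2x2(a, b, c, d, se, sp):
    """Back-calculate corrected cell counts from observed counts, given an
    assumed sensitivity (se) and specificity (sp) of exposure classification."""
    def corrected(exposed, unexposed):
        total = exposed + unexposed
        true_exposed = (exposed - total * (1 - sp)) / (se + sp - 1)
        return true_exposed, total - true_exposed

    A, B = corrected(a, b)   # cases: exposed, unexposed
    C, D = corrected(c, d)   # noncases: exposed, unexposed
    return A, B, C, D

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical observed counts: exposed/unexposed cases, exposed/unexposed noncases
a, b, c, d = 45, 55, 150, 250
print(f"Observed OR:  {odds_ratio(a, b, c, d):.2f}")

# Assume exposure was classified with 85% sensitivity and 95% specificity
A, B, C, D = correct_2x2(a, b, c, d, se=0.85, sp=0.95)
print(f"Corrected OR: {odds_ratio(A, B, C, D):.2f}")
```

Repeating the correction over a range of assumed sensitivities and specificities turns a qualitative caveat such as "exposure may be misclassified" into a bounded set of estimates that can be reported alongside the main result.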

“You need to communicate what you’ve done, and you’ve got to be able to state your results in a way that managers can get their head around,” Dourson says.

The authors of the commentary recommend several additional techniques for presenting data more clearly and accurately. Among others, they suggest the use of directed acyclic graphs as a way to visualize the sometimes complex relationships among confounders. They also emphasize the need to distinguish between correlation and causation in describing study results, to ensure scientists and policy makers don't draw incorrect conclusions about risk.
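As a rough illustration of how a DAG encodes those relationships, the sketch below uses the Python networkx library; the variable names (Age, Smoking, Exposure, Outcome) are hypothetical placeholders, not variables from any study discussed here.

```python
# Minimal sketch: encoding study assumptions as a directed acyclic graph
# with networkx. Variable names are hypothetical examples.
import networkx as nx

dag = nx.DiGraph()
dag.add_edges_from([
    ("Age", "Exposure"),      # Age influences who is exposed...
    ("Age", "Outcome"),       # ...and the outcome, making it a confounder
    ("Smoking", "Exposure"),
    ("Smoking", "Outcome"),
    ("Exposure", "Outcome"),  # the causal effect of interest
])

# A DAG must contain no cycles
assert nx.is_directed_acyclic_graph(dag)

# Variables pointing into both Exposure and Outcome are candidate confounders
confounders = set(dag.predecessors("Exposure")) & set(dag.predecessors("Outcome"))
print(confounders)  # {'Age', 'Smoking'}
```

Drawing the graph before data collection makes it explicit which variables must be measured and adjusted for, rather than leaving confounding to be argued qualitatively after the fact.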


References

1. National Research Council. Science and Decisions: Advancing Risk Assessment. Washington, DC: The National Academies Press (2009).

2. Burns CJ, et al. Evaluating uncertainty to strengthen epidemiologic data for use in human health risk assessments. Environ Health Perspect 122(11):1160–1165 (2014); doi: 10.1289/ehp.1308062.

3. Jurek AM, et al. Exposure-measurement error is frequently ignored when interpreting epidemiologic study results. Eur J Epidemiol 21(12):871–876 (2006); doi: 10.1007/s10654-006-9083-0.

