MIDAS Consultation on Modeling and Public Policy Meeting Report

Virginia Bioinformatics Institute
September 25-26, 2006
Holiday Inn, Blacksburg, VA

Informing Public Policy with Models of Complex Systems

Models in Support of Decision Making

Chris Barrett
Planning exercises are often designed to explain how things should work in a perfect world rather than to help us understand how things really work. Further, models often do not address what decision makers need or want to know.

One goal of this meeting is to understand how our research does and does not address policy makers' issues.

We would also like to understand how the focus of policy changes, often independently of rationality (e.g., concern about anthrax morphed into concern about smallpox), and how policy makers react to different presentations of the same information (e.g., presenting outcomes as mortality versus survival elicits different responses).

Perspectives on Preparing for Pandemic Influenza

Richard Hatchett
DHHS is planning to establish a modeling hub that will incorporate, aggregate, and coordinate various models and data. This suggests that modeling could have a far bigger impact in the future, but this does depend on our ability to speak "policy."

A short and incomplete history of modeling
Feb - Sept 2001: UK foot-and-mouth disease (FMD) outbreak
June 2001: Dark Winter exercise (smallpox attack)
Sept 2001: Anthrax mailing
April 2002: AHRQ funds development of the Bioterrorism Emergency Response Model (BERM)
Aug 2002: Kaplan/Wein PNAS article; not developed with subject-matter experts and came under heavy fire
Fall 2002 - Spring 2004: HHS Council on Public Health Preparedness modeling initiatives; a smallpox exercise with policymakers and subject-matter experts worked well
Dec 2003 - June 2004: Cities Readiness Initiative planning and development drew on BERM projects and focused on operational capabilities
May 2004: MIDAS funded
April 2006: First MIDAS runs for Targeted Layered Containment (TLC)
May 2006: Briefings of senior advisors focused on
  • Defining objectives
  • Combating fatalism
  • Taking certain strategies off the table
  • Justifying models and explaining model dynamics
  • Reporting results
  • Defining a strategy

Briefings of senior policy makers dispensed with explaining and justifying models
June 2006: Outreach to public health, the private sector, and education, focusing on combating fatalism, summarizing concepts, presenting a well-rounded argument, answering predictable questions, and enlisting support
June 2006: Critical concerns raised
July 2006: Focus on firming up scientific, economic, and historical analyses
August 2006: Briefings to DOD and state and local partners
September 2006: Preparation of publications

The most helpful parts of the Targeted Layered Containment or Combined Scenario analyses were the sensitivity analyses, the intermodel comparisons, and the iterative work to understand how the models were operating. Bob Glass's simple models were very helpful for their speed and for refining the questions. Historical analyses were used to test some hypotheses suggested by modeling. Telling stories was very helpful, especially when real people were involved.

In the future, presentations may focus on responses at different organizational levels - individual, community, and international.

General Observations
  • Modeling has contributed to every stage of policy development, but in different ways at different times
  • Modeling provides insights, not answers
  • Needs and expectations of policy makers differ depending on their responsibilities, expertise, and seniority
  • Modeling needs advocates within policy circles who understand what it can and can't do; it should be pushed, not pulled
  • The value of a model is determined by the question being asked and is not necessarily a function of the model's complexity
  • Modeling results presented in isolation will never drive policy decisions
  • The recent experience was a unique opportunity. In the future, modelers may need to do the hard work of interfacing with decision and policy makers.

Interfacing Models and Public Policy

Sally Phillips
Models can condense large quantities of difficult data into simple representations, generate graphical depictions, and allow no-cost planning scenarios. Models cannot give the right answer, predict correct scenario consequences and actions, accommodate all possible confounding variables (especially human behavior), guarantee that the best action will be the one set into policy, or assure that interpretations will be accurate.

There are several reasons for doing model research:
  • Research for knowledge and truth
  • Translating knowledge (we usually translate to ourselves, not to people who need to use the information)
  • Modeling at the nexus of basic research data, representations of the truth, prediction of actions and consequences, and decision support tools
  • Research support for evidence based policy
The Bioterrorism and Epidemic Outbreak Response Model (BERM) was developed for New York City in response to the anthrax attacks. The tool provides information for mass prophylaxis calculations: how many people are needed, staffing, timing, length of the campaign, and so on. Such tools let planners play with the numbers and support allocation of scarce resources and other planning activities.
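
The report does not give BERM's formulas, but a minimal sketch of the throughput arithmetic behind such a tool might look like the following (Python); the function name, population size, campaign length, and per-staff throughput are illustrative assumptions, not BERM parameters.

    import math

    def staff_needed(population, campaign_days, hours_per_day, patients_per_staff_hour):
        """Staff required on duty at any one time to prophylax the whole
        population within the campaign window (illustrative sketch)."""
        total_staff_hours = population / patients_per_staff_hour
        hours_available = campaign_days * hours_per_day
        return math.ceil(total_staff_hours / hours_available)

    # Example: 8 million people, a 2-day around-the-clock campaign, and
    # 12 patients dispensed per staff member per hour.
    print(staff_needed(8_000_000, campaign_days=2, hours_per_day=24,
                       patients_per_staff_hour=12))   # -> 13889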

The National Mass Patient, Regulating, and Tracking System is for disasters that require a large Federal response (e.g., determining who is affected, what types of patients there are and where they are going, and how many people are in shelters).

All Models are Wrong: Learning from Models in Complex Environments

John Sterman
For most models of complex systems, validation is impossible; confidence must instead be built through iterative testing. Modelers must strike a balance between scope and detail.

Pragmatics
  • The fastest model wins
  • The coolest interface wins
  • The model consistent with preconceptions wins
  • The model that favors your organization wins
  • The mental model ALWAYS wins.
The Implementation Paradox
  • The greatest potential for improvement comes when formal modeling leads to change in deeply held mental models.
  • The more fundamental the mental model you challenge, the greater people's defensiveness may be.
  • Policymakers must discover the insights for themselves…meaning that they need to be involved in the modeling process.
Modeling may be protective rather than reflective; models may be used to prove a point, hide motives, use data selectively, support preconceptions and cover them up, or promote the authority of the modeler and build client dependence.

Testing models at extreme parameter values may expose assumptions that bias the model.

Example: economic models of energy production that assumed a fixed fraction of energy would be delivered by certain sectors, even when the sectors' cost was reduced to zero.
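
As a hedged illustration of such an extreme-value test, the toy fixed-fraction and logit market-share models below are invented for this example, not drawn from the energy models in question:

    import math

    def share_fixed_fraction(cost, baseline_share=0.10):
        # Biased model: the sector's share is constant, independent of cost.
        return baseline_share

    def share_logit(cost, rival_cost=1.0, sensitivity=4.0):
        # Alternative model: share responds to relative cost (logit choice).
        return 1.0 / (1.0 + math.exp(sensitivity * (cost - rival_cost)))

    # Drive cost to the extreme value of zero and compare behaviors.
    for cost in (1.0, 0.5, 0.0):
        print(f"cost={cost:.1f}  fixed={share_fixed_fraction(cost):.2f}  "
              f"logit={share_logit(cost):.2f}")
    # At cost=0 the fixed-fraction model still reports a 10% share,
    # behavior no plausible market would produce, exposing the assumption.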

Examining modeling assumptions is crucial, continuous work. On several occasions, whole communities of modelers have been shut out of the policy process for years because of bad assumptions in some models. These assumptions had not come to light during internal or external reviews. Braithwaite also emphasized similar points.

Reflective modeling promotes inquiry, exposes assumptions, motivates range of empirical tests, challenges preconceptions and supports multiple viewpoints, and involves the community of stakeholders. It empowers the clients and enhances their capabilities.

It might be very important to parameterize a set of equation-based models for vest-pocket use.

"Perhaps the most common among abuses are situations where mathematical models are constructed with an excruciating abundance of detail in some aspects, whilst other important facets of the problem are misty or a vital parameter is uncertain to within, at best, an order of magnitude. It makes no sense to convey a beguiling sense of reality with irrelevant detail when other equally important factors can only be guessed at." (Robert May, Science 303:790-793)

Robust Inference for Model Based Policy Analysis

Steven Bankes
In engineering, the specification of the system boundary and model boundary are both under our control. In classical practice, we choose boundaries carefully to facilitate predictive analysis. In policy analysis, the boundary of the system is generally imposed on us, is open, and is analytically difficult. In general, the intersection of situations where decisions make a difference and situations where we can accurately forecast is empty.

Validation, strictly understood, fails as a strategy for two reasons:
  • Global integrated models are not predictively accurate
  • Validating the pieces is helpful but some important pieces cannot be validated.
Our tendency is to ignore information we understand poorly, akin to looking for keys where the light is good. In doing so, we make other, often implicit, assumptions about the behaviors of those pieces.

Inference is key to understanding the role of modeling, and an analysis of uncertainty is important. In the policy arena, uncertainties are usually large, and the implications of policies are often unstudied, controversial, and data-free. Sensitivity analyses are not enough because they can miss important phenomena and underestimate the uncertainties.

A helpful approach is to ask "What is the universe of stuff we want to know and don't?" Defining the unknowns can help us think more explicitly about the universe of real world situations we want to address.
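
A minimal sketch of that exploratory approach, assuming a made-up two-parameter outcome model and two hypothetical policies (nothing here is drawn from an actual MIDAS model):

    import itertools

    def deaths(r0, compliance, policy):
        # Toy outcome model; the numbers are illustrative, not calibrated.
        effect = {"stockpile_only": 0.1, "school_closure": 0.5}[policy]
        return 1000 * r0 * (1 - effect * compliance)

    r0_values = [1.4, 1.8, 2.2]          # uncertain transmissibility
    compliance_values = [0.2, 0.5, 0.9]  # uncertain public compliance

    # Sweep the full space of unknowns and judge each policy by its worst
    # case, rather than probing one parameter at a time.
    for policy in ("stockpile_only", "school_closure"):
        worst = max(deaths(r0, c, policy)
                    for r0, c in itertools.product(r0_values, compliance_values))
        print(f"{policy}: worst-case deaths across the ensemble = {worst:.0f}")

A one-at-a-time sensitivity analysis around a single base case could miss the interactions that this kind of full sweep reveals.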

One can also delineate the ways the model could fail and use this information to refine the model or the question, or to develop a challenge set of issues. Robust software should allow for testing of robustness and failure modes. Every project should include a section on how to break the model or argument.
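
One hedged sketch of what such a "break the model" section might contain is a set of automated invariant checks run at extreme parameter values; the simple SIR step below is a stand-in for a real simulation, not a model discussed at the meeting.

    def sir_step(s, i, r, beta, gamma):
        # One Euler step of a toy SIR model (compartments as fractions).
        new_infections = beta * s * i
        new_recoveries = gamma * i
        return (s - new_infections,
                i + new_infections - new_recoveries,
                r + new_recoveries)

    def check_invariants(beta, gamma, steps=100):
        s, i, r = 0.99, 0.01, 0.0
        for _ in range(steps):
            s, i, r = sir_step(s, i, r, beta, gamma)
            assert min(s, i, r) >= 0, "negative compartment"
            assert abs((s + i + r) - 1.0) < 1e-9, "population not conserved"

    # Probe extreme corners of parameter space; each failure documents a
    # limit of the model (here, the need for a smaller time step).
    for beta in (0.1, 0.5, 2.0, 10.0):
        try:
            check_invariants(beta, gamma=0.2)
            print(f"beta={beta}: invariants hold")
        except AssertionError as e:
            print(f"beta={beta}: FAILED ({e})")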

A trick for presenting material is to bound the unknown variables and assess where the value matters. For example, the value of a life is between x and y, but with respect to some question, the only place it matters for this study is above y-1.
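
A sketch of that bounding argument, with invented payoffs standing in for x and y:

    def preferred_policy(value_of_life):
        # Two hypothetical policies: A saves 100 lives at cost 400; B saves
        # 40 lives at cost 50 (all units invented for illustration).
        benefit_a = 100 * value_of_life - 400
        benefit_b = 40 * value_of_life - 50
        return "A" if benefit_a > benefit_b else "B"

    # Sweep the plausible range of the unknown value.
    for v in range(1, 21):
        print(v, preferred_policy(v))
    # The choice flips at a single threshold (v > 35/6, about 5.8), so the
    # analysis need only establish which side of the threshold the true
    # value lies on, not pin down the value itself.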

Advice
  • Model policy levers as well as problem physics
  • Confess the true degree of uncertainty
  • Use exploration to discover most informative policy tradeoffs
  • Leave choice in hands of policy audience

Modeling and Simulation: Government Relations for Scientists

Karl Braithwaite
Braithwaite has helped scientists create presentations for policy makers, especially around the Clean Air Act and auto emission standards. Policy makers are often confused by the plethora of models that groups create to advance their own interests. Developing good presentation skills is critical if one is to have an impact.

Advice:
Pick something you can score on at the start. Figure out where you can have an impact and throw yourself into it.

The following guidelines for scientists interacting with policymakers were taken from John Kingdon's book, Agendas, Alternatives, and Public Policies.

Understand context and timing
It is impossible to get on the policy agenda until there is an alignment of several important interests. A window of opportunity occurs when the problem, policy interests, and politics are aligned. Progress requires a champion to advocate for the cause.

This process can be very frustrating to scientists.

Study your policy network
There is a lot of homework to do on the stances and responsibilities of many groups, including Congress, Federal agencies, interest groups, the media, and trusted advisors. The Iron Triangle refers to a coalition of a congressional subcommittee, an agency, and an advocacy group. More recently, we have seen the emergence of advocacy coalitions, which include businesses, associations, and others.

In briefings, provide a layered briefing package
Prepare a 1-page summary, a 3-page briefing, and appendices. The legislative aide reads the 1-page summary, the senator reads the 3-page briefing, and the appendices add credibility.

Beware of the friendly briefing with no tough questions
The person being briefed should be asking tough questions; bring them up yourself if they do not.

Different briefings for supporters vs detractors
Do your homework to ensure you know the positions of the people you are briefing and prepare what they need to know.

Stories and examples
These may be the most important part of the presentation.

Find partners if possible
Develop the role of a scientific spokesman who can communicate with policymakers and understands policy
Modeling results are only one of many factors affecting decisions (Karas, 2004, pp 14-15)
Work with those who influence policymakers
Remember Federalism
States have considerable power, and one must be aware of what various levels of government and various agencies really can do.

Use your institution's government relations office
Braithwaite's handout is an appendix to these minutes.

Discussion Points:

MIDAS's connection to policy and decision makers is vital, although we need to ensure that there is a safe haven inside MIDAS for basic science and for work on social networks and computational and statistical problems.

Is it time to create a meta-model that consolidates strategies and can be run quickly to address scenarios?

After discussion, there was a feeling that this project would be quite difficult and expensive, though the time is right to start. Points include:
  • One can envision MIDAS spinning off some software products, thus freeing groups for methodological research.
  • Building a general purpose platform is probably less desirable than creating a set of models that serve more specialized interests.
  • Our consumers are varied - policymakers at the national level, state and local health departments, researchers
  • We should consider the granularity of models; it may not be possible to create a model at a level finer than the state.
  • We should consider how useful a tool is rather than how accessible we can make it. In that regard, think about what will solve users' problems.
  • It is important to avoid too much simplicity and the resulting misuse of models.
  • Simple models are not tools for research, but they have utility.
  • Versioning is critical and never stops.