Computational and Theoretical Neuroscience: From Synapse to Circuitry

Sponsored by the National Institute of Neurological Disorders and Stroke

The Neuroscience Building, Room A
6001 Executive Blvd., Rockville, MD
April 28, 2000

Technical Report


Introduction

The advent of ubiquitous computing brings us two universal and complementary scientific capabilities - data analysis and modeling. Taking full advantage of these capabilities requires new experimental methods and theoretical skill sets, but ultimately offers biologists the power to achieve deeper levels of understanding than otherwise imaginable. Genomic, biochemical, morphological, electrophysiological, and imaging data are accumulating at rates exceeding the capacity of manual analysis. Quantitative models of genetic, molecular, and neuronal networks are assuming increasing responsibility for clarifying how and why biological systems operate as they do. Ultimately the twin capabilities of analysis and modeling will provide a clearer picture of biology at a systems level, and, coupled with inspired experimental design, they may resolve questions that are currently too complicated even to imagine. Perhaps even our most intricate neurological and psychiatric diseases, some of them as yet unnamed, will one day be rendered tractable through our modeling efforts.

At this workshop, held on April 28, 2000, many aspects of modeling were explored, and possible funding and policy initiatives for NIH and NINDS were suggested. In the following pages we interpret and summarize the principles and policies that emerged.


The Problems

In this section, by way of introduction, we discuss problems of increasing complexity that can benefit from modeling. Where relevant, we mention the work of people present at the meeting, to provide points of reference for the reader.

We begin our survey at the level of biochemistry. The brain is estimated to express over 30,000 genes that are found nowhere else in the human body, and it is certain that understanding the brain will require careful analyses of signal transduction, gene expression, and intercellular communication during processes related to development and plasticity. Computers are currently used for genetic sequence and protein structure analysis, in which evolutionary and statistical principles predict function from existing biological data, but experimental and theoretical methods still need to be developed to produce quantitative descriptions of how these molecules (some of which exist in copy numbers too small to be amenable to continuous-variable analysis) interact. Many properties of biochemical and genetic networks - their cooperativity, nonlinearity, robustness to noise, parallel nature, and feedback architecture - point to dynamics of such complexity that any nonquantitative analysis would miss the essential nature of these networks and at best provide an approximate taxonomy of interactions. Compartmental modeling using diffusion equations, kinetic interaction models, and Monte Carlo methods has provided insight into several interesting biochemical networks, including those underlying synaptic transmission and long-term potentiation. Models may also assist in the engineering of useful biochemical networks, which could, for example, be used by cells to metabolize environmentally dangerous substances. Finally, modeling may be the only feasible way to address extremely slow processes like normal human aging, which may be intractable to purely experimental approaches. Dr. David Tank gave two interesting examples - bacterial chemotaxis and sensory phototransduction - in which many components have been identified but much systems biology remains to be done at the subcellular level.
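
To make the Monte Carlo approach concrete, the sketch below applies a Gillespie-style stochastic simulation to a hypothetical reversible binding reaction A + B <-> AB - the regime where small molecule counts defeat continuous-variable analysis. The species, counts, and rate constants are illustrative assumptions, not values discussed at the workshop.

```python
import math
import random

# A minimal Gillespie-style stochastic simulation of a hypothetical
# reversible binding reaction A + B <-> AB. With only tens of molecules,
# discrete stochastic simulation captures fluctuations that a
# continuous-variable (ODE) treatment would average away.
# All counts and rate constants below are illustrative assumptions.

def gillespie(a=50, b=50, ab=0, k_on=0.005, k_off=0.1, t_end=100.0):
    t = 0.0
    trajectory = [(t, a, b, ab)]
    while t < t_end:
        r_bind = k_on * a * b       # propensity of A + B -> AB
        r_unbind = k_off * ab       # propensity of AB -> A + B
        total = r_bind + r_unbind
        if total == 0.0:
            break
        # Waiting time to the next reaction is exponentially distributed.
        t += -math.log(1.0 - random.random()) / total
        # Pick which reaction fires, weighted by its propensity.
        if random.random() * total < r_bind:
            a, b, ab = a - 1, b - 1, ab + 1
        else:
            a, b, ab = a + 1, b + 1, ab - 1
        trajectory.append((t, a, b, ab))
    return trajectory

# Print every 50th event of one stochastic run.
for t, a, b, ab in gillespie()[::50]:
    print(f"t={t:7.2f}  A={a:3d}  B={b:3d}  AB={ab:3d}")
```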

Another process of interest to computational neuroscientists is synaptic transmission. The estimated 10^15 synapses in the human brain transform the electrical activity of each presynaptic cell into currents and biochemical reactions in a postsynaptic cell. Synaptic release involves the fusion of a neurotransmitter-containing vesicle with the presynaptic membrane, which releases the vesicular contents onto receptors on a postsynaptic cell. Synaptic transmission depends on mobilization of vesicles to the 'docked' state at the presynaptic membrane, triggering of vesicle fusion by calcium, endocytosis via various mechanisms operating on different time scales, and recycling of vesicles for refilling and rerelease. The kinetics of each of these steps depends on the phosphorylation state of various proteins, the time course of presynaptic calcium levels, and the configurations of channels and protein assemblies in the presynaptic terminal. These dependencies produce temporal effects such as augmentation, depression, facilitation, and potentiation, which in turn shape the signals being computed at each synapse in ways that are poorly understood. Since the biochemistry and morphology of synapses differ from cell to cell (and even within a single cell), short-term plasticity may be drastically different at different synapses. Without realistic synaptic models, understanding the processing performed by any natural network of neurons may well be impossible. Several investigators presented synaptic models of this nature, including Dr. Richard Tsien, who described models of fusion pore modulation, short-term plasticity, and vesicle recycling, and Dr. Wade Regehr, who explained the temporal properties of three very different types of synapse in terms of a universal kinetic model with different levels of facilitation and depression. Other investigators not present at the meeting, such as Dr. Larry Abbott and Dr. Dean Buonomano, have described possible computations performed by these short-term forms of plasticity, including gain control, frequency-selective filtering, and detection of transitions between different neuronal firing patterns. Dr. Tom Bartol demonstrated a Monte Carlo simulation of a neuromuscular junction at the molecular level, showing how computational methods can help with quantitative visualization.
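
The flavor of such kinetic models can be conveyed with a short sketch. The code below combines a depleting pool of release-ready vesicles (depression) with a spike-driven increase in release probability (facilitation), in the style of the Tsodyks-Markram phenomenological model rather than any specific model presented at the workshop; all parameter values are illustrative assumptions.

```python
import math

# A sketch of a phenomenological short-term plasticity model in the
# Tsodyks-Markram style: each spike releases a fraction u of the
# available vesicle pool r (depression), while also transiently
# boosting u itself (facilitation). All parameters are illustrative.

def synaptic_responses(spike_times, U=0.1, tau_fac=0.5, tau_rec=0.3):
    """Relative response amplitude at each presynaptic spike (times in s)."""
    u, r = U, 1.0
    prev = None
    responses = []
    for t in spike_times:
        if prev is not None:
            dt = t - prev
            u = U + (u - U) * math.exp(-dt / tau_fac)      # facilitation decays
            r = 1.0 + (r - 1.0) * math.exp(-dt / tau_rec)  # pool recovers
        u += U * (1.0 - u)       # spike transiently raises release probability
        responses.append(u * r)  # amplitude ~ fraction of pool released
        r *= (1.0 - u)           # release depletes the available pool
        prev = t
    return responses

# A 20-Hz train shows an interplay of facilitation and depression.
print([round(x, 3) for x in synaptic_responses([i * 0.05 for i in range(8)])])
```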

Single neurons have been rich targets for the modeling world. Cable theory allows the modeling of dendrites, which canonically integrate electrical signals from many thousands of impinging synapses. Compartmental modeling similarly allows postsynaptic calcium levels, phosphorylation states, and protein concentrations to be computed from given patterns of presynaptic activity and empirically derived morphologies, synaptic locations, and channel distributions. Kinetic models of channels and receptors enable analysis of the dynamics of single neurons, including subthreshold oscillations and bursting. With these modeling tools and appropriate experimental technologies - optical uncaging and microiontophoresis of ligands, optical imaging of calcium- and voltage-sensitive dyes, dendritic electrophysiology, and chemical monitoring of various protein states - computational neuroscientists have begun to quantitatively reconstruct the behavior of single neurons. Dr. Tank described the very first neuronal simulation ever made - Hodgkin and Huxley's model of the action potential - and emphasized that one purpose of a model is ultimately to seek universal principles as powerful as theirs. Dr. John Rinzel described Dr. Wilfrid Rall's work on dendritic cable theory as significant not only because it worked, but also because it reflected good judgment and experience that new computational neuroscientists should strive to attain; furthermore, it provided falsifiable predictions, such as the existence of dendrodendritic synapses, that turned out to be correct.
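
Since the Hodgkin-Huxley model is the field's canonical example, a compact sketch of it follows, integrated with a simple forward-Euler scheme. The conductances and rate functions are the standard squid-axon formulation; the stimulus amplitude and step size are arbitrary illustrative choices.

```python
import math

# A compact Hodgkin-Huxley simulation: sodium, potassium, and leak
# currents with the standard squid-axon parameters, integrated by
# forward Euler. Stimulus and step size are illustrative choices.

C_m = 1.0                             # membrane capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3     # peak conductances (mS/cm^2)
E_Na, E_K, E_L = 50.0, -77.0, -54.4   # reversal potentials (mV)

def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)
def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))

def simulate(I_ext=10.0, dt=0.01, t_end=50.0):
    V, n, m, h = -65.0, 0.317, 0.053, 0.596   # approximate resting state
    trace = []
    for i in range(int(t_end / dt)):
        I_ion = (g_Na * m**3 * h * (V - E_Na)
                 + g_K * n**4 * (V - E_K)
                 + g_L * (V - E_L))
        V += dt * (I_ext - I_ion) / C_m
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        trace.append((i * dt, V))
    return trace

# Peak of the first action potential under sustained current injection.
print(max(V for t, V in simulate()))
```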

People have modeled neural networks of many different kinds - vestibuloocular reflex plasticity, short-term memory in the oculomotor integrator, lobster stomatogastric ganglion activity, locust antennal lobe oscillations, birdsong motor commands, and hundreds of other system behaviors. Often these models break with biophysical realism, for three reasons: simulating the network with realistic neurons could exceed current computational limits, measuring the necessary parameters for all the neurons in a real circuit could be impractical, and executing a comprehensive simulation could diminish the amount of insight gained from the model. Many system phenomena are emergent, in that they are not explained merely by the properties of individual molecules and neurons, but require knowledge of the configuration of the entire system. Drs. Tank and Sebastian Seung described their experiments and models relating to the goldfish oculomotor integrator, which sums series of brief motor command bursts (encoding phasic motions) into tonic firing rates (encoding sustained eye positions). They described several potential models, beginning with a very simple bistable neuron but focusing on a recurrent network model that recapitulates many dynamic properties of the oculomotor integrator; a sketch of this idea appears below. Dr. Gwen Jacobs also described combining anatomical reconstruction with simulation in order to understand the behavior of a sensory neural network in an insect. Dr. Eve Marder described another invertebrate system, the lobster stomatogastric ganglion, a circuit that exhibits very different patterns of activity in different situations. Dr. Steve Lisberger explained how, in collaboration with Dr. Terry Sejnowski, he built a model of the VOR that predicted the sites of learning.

Dr. Charles Wilson explained that many models have been based primarily on anatomical or morphological data, and that the next step is to go beyond anatomy. Dr. Ruzena Bajcsy mentioned that interpreting imaging data, such as that from fMRI, should be addressed with sophisticated systems-level models. Dr. John Donoghue explained that understanding how information is represented in the activity of neurons will require new ideas on representation and statistics, as well as new experimental techniques. One intriguing hypothesis is that synchrony between neurons 'binds' information together for further processing; experimentally investigating this hypothesis is a daunting challenge, and conceptually sophisticated models may be needed to interpret any results. Perhaps understanding the codes will help us design useful sensory and motor prostheses, giving sight to the blind and motion to the paralyzed, and perhaps even relieving chronic pain. For example, current simple models, which use linear combinations of firing rates to calculate arm trajectories, cannot account for the full richness of 3-D limb motion desirable in a naturalistic prosthetic arm; more sophisticated models of the neural code might extend these currently primitive capabilities.
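
The integrator idea can be illustrated with a toy firing-rate model: a leaky unit whose recurrent feedback is tuned to cancel its own decay, so that brief command bursts are summed into a persistent rate. This is only loosely in the spirit of the recurrent-network models discussed at the meeting; the time constant, feedback weight, and burst inputs are illustrative assumptions.

```python
# A toy neural integrator: a leaky firing-rate unit with recurrent
# feedback w. With w = 1 the feedback exactly cancels the leak, so the
# unit integrates its input and holds the result; with w < 1 the stored
# value decays. All parameters are illustrative assumptions.

def integrator(bursts, w=1.0, tau=0.02, dt=0.001, t_end=2.0):
    """bursts: list of (onset_s, duration_s, amplitude) command bursts."""
    r = 0.0
    trace = []
    for i in range(int(t_end / dt)):
        t = i * dt
        drive = sum(a for on, dur, a in bursts if on <= t < on + dur)
        r += (dt / tau) * (-r + w * r + drive)  # dr/dt = (-r + w*r + drive)/tau
        trace.append((t, r))
    return trace

# Two brief bursts step the rate up, and it holds between and after them.
for t, r in integrator([(0.1, 0.02, 5.0), (1.0, 0.02, 5.0)])[::400]:
    print(f"t={t:4.2f} s   rate={r:5.2f}")
```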

Ultimately, completely understanding a biological system means understanding how it works in a natural environment over time, since those are the criteria that guided the evolution of the system to its current state. Such conclusions are extremely difficult to make precise, since there are no 'controls' for what evolution has produced. As with astrophysicists and others who work on theories of origins, experimentation for evolutionary biologists often amounts to scrutiny of the present and modeling of the past. Here is another niche for modeling: short of waiting billions of years, there is no direct way to watch a nervous system evolve to deal with stimuli in the real world. Developmental neurobiology, while more amenable to experimentation than evolutionary neuroscience, still benefits from modeling as a way to understand the intricate interactions that go into wiring the brain, especially when activity-dependent modifications are involved.


The Methods

Data analysis is an enormous field, and any cursory survey would fail to capture the vast variety of methods used for interpreting experimental data. Many computerized data analysis programs offer no conceptual advantage - their primary purpose is to permit fast analysis of large data sets. Recently, though, people have begun to use the computer to visualize data sets and extract information from them in sophisticated ways, using ideas from modern statistics and from the field of human-computer interface design. In this way the frontier between analysis and modeling has become blurred.

Biological models can serve three main purposes. First, they can quantitatively describe data. In this sense they provide a framework for the interpretation of experimental results and offer proper levels of abstraction so that the salient features of the data are manifest. For example, a model of how a wave of seizure activity propagates across the cortex of an epileptic patient could be invaluable to a neurosurgeon trying to invert field potential data so as to locate an epileptic focus. Second, some models are designed to be validated or falsified, and as such can drive experimental inquiry. These are surprisingly rare, since experimental techniques are often not up to the challenge of addressing even the simplest theoretical results. Finally, some models are teleological: they provide insight into how and why a system exists or performs in a certain way, due to development, evolution, or other influential circumstances. These models are often hard to refine experimentally, but their purpose is to provide enlightenment. Experimentalists are carrying out research at many different levels, from molecular studies all the way up to psychology, and one rational way to integrate the results is to create suitable mathematical models linking the different levels. Such models often do not emphasize biophysical realism, but attempt to bring out salient aspects of the system at hand, frequently appealing to other disciplines, such as control theory or statistical mechanics, for explanatory power.

In addition to biological models, abstract models of neural networks are also useful for understanding how information can be processed. The reason is that the abstract algorithm used to solve a problem and its specific material implementation are in many ways independent: because a particular mathematical algorithm can be evaluated by hand, run on a pocket calculator, or executed on a powerful supercomputer, there is concrete value in understanding the algorithm itself, even if the machine it runs on is not well understood. Thus algorithms like Dr. Seung's nonnegative matrix factorization, while not based explicitly on biophysics, are interesting because they capture the essential nature of computational processes. In some fortunate cases, particular algorithms such as principal components analysis and backpropagation have also shaped the way people think about the brain, since they have very natural implementations in the form of neural networks.
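
As an illustration of such an abstract algorithm, the sketch below implements nonnegative matrix factorization with the multiplicative update rules published by Lee and Seung, which factor a nonnegative data matrix V into nonnegative parts W and H. The matrix sizes, rank, and iteration count are arbitrary illustrative choices.

```python
import numpy as np

# Nonnegative matrix factorization via the Lee-Seung multiplicative
# updates: find W, H >= 0 approximately minimizing ||V - W @ H||^2.
# Each update keeps all entries nonnegative while decreasing the
# reconstruction error. Sizes and rank are illustrative choices.

def nmf(V, rank, n_iter=500, eps=1e-9):
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis vectors
    return W, H

# Factor a random nonnegative 20 x 15 matrix into rank-3 parts.
V = np.random.default_rng(1).random((20, 15))
W, H = nmf(V, rank=3)
print("reconstruction error:", round(float(np.linalg.norm(V - W @ H)), 4))
```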


The Focus

This workshop arose from the desire of NIH to reach out to other fields, most notably computer science. A panel met in June 1999 to discuss BISTI, the Biomedical Informatics Science and Technology Initiative, and made four recommendations: to start National Programs of Excellence in biomedical computing, to develop bioinformatics tools, to create a national infrastructure, and to train a new generation of bioinformatics scientists. A subcommittee within NINDS focusing on neuroinformatics was also established, which led eventually to this workshop.

The stated purposes of this workshop were to educate the NIH staff on the significance of theoretical neuroscience (in particular, what modeling can do that experiment cannot), to identify the current needs and barriers relevant to progress in the field, and to make recommendations to NIH staff on how to promote research in the field. We will summarize the policy findings in two sections, one dealing with the nature of the science at hand, and the other dealing with the people and institutions involved.


The Policy: Scientific

Many of the attendees agreed that new computational tools are needed. New, insightful algorithms are needed to model the systems that people are interested in, and data format standardization and scalable modular code would facilitate sharing and collaboration among computational neuroscientists. Visualization tools, which are particularly computationally intensive because they must perform graphical manipulations in real time, were frequently mentioned as important for understanding data, especially spatial data such as that from fMRI studies.

However, the key problems in the field are most likely those of representation. Many questions in the field are not even being asked in the correct way due to the complexity of neural systems. Computational power is not the issue - understanding is the key ingredient that we lack. What data should we be looking at? What experiments should be done? What is an epiphenomenon, and what is causal? What aspects of the system are key to gain insight into its function? As in any new field, many scientists feel like the drunk who looks for his keys under a street lamp just because that's where the light is. But many of the best theoretical neuroscientists are now, so to speak, shedding new light on the matter - whether by choosing systems where relatively rich behavior is comprehensible, inventing new experimental tools, or bringing in ideas from other fields such as physics and statistics.

Several questions arose as to what kind of science should be funded. It was debated whether pure-modeling studies should receive money even when there are no explicit collaborations with experimentalists; this issue was left unresolved, although some attendees mentioned that they had seen pure-modeling studies that were very insightful. Engineering and abstract neural networks have traditionally not been funded by NIH either; it was also noted that physicists often have trouble getting NIH money because they are not yet skilled at grantsmanship, and some coaching might be necessary to encourage such scientists to make the crossover. Dr. Wilson explained that good computational grants must take into account the culture of the study section (for example, by being very precise about what one will do over the duration of the grant, and why), and that "fishing expedition" approaches, which simply try to find values for large numbers of parameters, are mostly rejected.

It was also concluded that more respect should be shown to studies involving simpler animals (invertebrates and nonmammalian vertebrates), since they often offer insights that are difficult or impossible to obtain in higher animals, while providing principles that generalize well. Classic studies on olfaction in slugs and insects, gastric motility in crabs and lobsters, dynamic filtering in electric fish, and plasticity in Aplysia, to name a few, have provided enormous insight into a wide variety of computationally interesting processes. For example, the thousands of published studies on hippocampal long-term potentiation and cerebellar long-term depression have benefited enormously from insight gained in the seemingly distant field of classical conditioning in Aplysia. As noted by Dr. Seung, it is silly to have the attitude that "If I'm going to fail to understand the brain, I might as well do it in a monkey." On the other hand, to analyze a sophisticated topic like decision-making, declarative memory, or advanced visual perception, it may be essential to use a primate: the important thing is to choose a system carefully.

The general intellectual goals arrived at during the meeting include creating hierarchical models to link different levels from molecules to systems, finding ways to analyze small changes (which are currently hard to address in biology, but are almost certainly important for many subtle illnesses), analyzing information representations and transformations (neural codes at the single-cell and ensemble levels), and coming up with general principles (as opposed to focusing on details for their own sake; this will almost certainly require some standardization of experimental techniques to be successful). Specific scientific topics that were said to need exploration with strong experimentation and modeling include:

  • chronic pain, which involves the spinal cord, thalamus, brainstem, and cortex at molecular and cellular levels,
  • functional remapping of receptive fields in sensory cortex, where a high-resolution understanding of the dynamics would help people understand the effects of strokes and amputations,
  • rhythmogenesis and neural network oscillations in systems ranging from slug olfaction to primate attention,
  • the production of speech, birdsong, and other sequential behaviors (and the ways animals interpret such sequences),
  • diseases like Parkinson's where the static anatomical circuitry is well known but there are no good dynamical models,
  • diseases like autism, schizophrenia, and manic depression where the changes are subtle, multigenic, and unlikely to be understood with traditional genetic, chemical, or neurological techniques,
  • short-term and long-term plasticity, and the interactions and interconversions between them, across varying time scales,
  • genetic and biochemical network behavior (subcellular computation),
  • the structure and function of circuit primitives, such as bursters, recurrent loops, cortical columns, gap-junction connected inhibitory networks, and Purkinje columns,
  • activity-dependent development, a general property of almost all neural systems, and
  • sensorimotor integration, which offers complete understanding of a system (such as the vestibuloocular reflex) from stimulus input to behavior output.

Many of these problems are already being analyzed, but not at the level of theoretical neuroscience. Take the example of functional remapping of sensory cortex: many locations along the input pathway (from sensory afferent to thalamus to cortex) are candidate sites of synaptic plasticity or changes in morphology; modeling the activity of the different neurons could help one infer the sites of plasticity. Furthermore, results from experiments that are easier to carry out in slice or culture (due to the better accessibility of those preparations), such as the induction of activity-dependent synaptic plasticity, the monitoring of neuronal morphology using 2-photon microscopy, and the application of pharmacological agents to particular neurons (say, to determine the relative roles of excitation and inhibition in the dynamics of the circuit), can then be extrapolated to the circuitry of living animals via a modeling study. Such insight could help scientists understand the cortical and thalamic reorganizations that result from such insults as strokes and limb loss. Many disorders that are currently poorly understood - thalamic pain, schizophrenia, certain dystonias, and phantom limb syndrome - are almost certainly due, at least in part, to improper cortical mapping.

In conclusion, the theoretical neuroscience agenda should attack hard problems that require lots of original ideas, in appropriately chosen systems, with attention to multiple time scales and levels of complexity. Solving these problems will require people equipped with new ways of thinking, which brings us conveniently to the next section of this paper, which summarizes the educational and institutional strategies discussed at this workshop.


The Policy: People and Institutions

One set of institutions that has worked well in the past is the Sloan Centers for Theoretical Neurobiology, of which there are five in the United States. The Sloan Centers act as training programs for engineers and physical scientists who want to work on problems in neuroscience. Dr. Lisberger explained how the postdocs and visiting scientists (such as Dr. Bialek) who went through the UCSF Sloan Center have radically changed the way many of the people at the Keck Center think about neuroscience. Dr. Marder explained how mathematicians and physicists in her lab are now entering the academic workforce, thus starting a positive feedback loop that will result in the training of many more theoretical neuroscientists. It was emphasized that the key ingredient that makes the Sloan Centers work is the willingness of the entire community to learn from one another - people with neurobiological problems and people with mathematical methods are working together to create good theoretical neuroscience. It was also suggested that 'bridging people' - people with explicit training in both fields - greatly facilitate such interactions and contribute to the science being done. Many people noted that results should not be expected right away: it always takes time for people from different intellectual cultures to learn from one another and to appreciate each other's talents, and immediately gratifying productivity should not be emphasized at the expense of long-lasting synergistic creativity. This matters not only because of the value of long-lasting collaborations, but also because judging pure research by its immediate utility is not always the best policy: inventions like the laser and the transistor were created without their current applications in mind, and only later attained their modern significance.

In addition, the Sloan sponsorship at Brandeis helped fund a faculty position that would not have been feasible otherwise. To overcome the natural inertia that keeps many universities within non-theoretical, monodisciplinary ranks, additional money can be an excellent incentive. Making departments of one discipline open to people from another can greatly facilitate science: forming large, open, collaborative centers like Stanford's Bio-X, which explicitly encourages interdisciplinary bioscience, may prove to be an excellent way to accomplish this openness. At this workshop it was generally agreed that the Sloan Centers were successful experiments in exploring computational neuroscience, and that they have worked well enough to deserve replication on a larger scale, perhaps with the assistance of NIH.

Undergraduate and graduate education is also a focal point for creating good theoretical neuroscientists. While many graduate students are comfortable with modeling, the question arose during this workshop as to whether biology undergraduates should be required to take, say, nonlinear dynamics classes. It was concluded that while much of biology can still be done with nonquantitative tools, that part of the field will become obsolete as the post-genomic age progresses, and quantitative descriptions will be paramount. Many undergraduate biology curricula are occupied with nonquantitative, memorization-intensive science - indeed, the curricula are designed for premedical students, who are often math-phobic. It was suggested that biological science students will take math voluntarily when they realize that they need it to do science in the modern world, just as many now opt for computer classes even though these are not explicitly required. In summary, although training grants for predoctoral students and support grants for postdocs are important, it is not yet clear whether undergraduate support or drastic curricular revision is appropriate at the current time. But the field is so young, and so in need of a critical mass of good thinkers, that freedom and daring must be encouraged, even at fairly early educational stages. Thus the creation of undergraduate and graduate programs aimed at computational neuroscience, while resource-intensive, is important for the future of the field.

Many biologists are learning computational neuroscience on their own: according to Dr. Wilson, one of the editors of the Journal of Computational Neuroscience, papers are being written at rates far greater than would be predicted simply from the influx of non-biologists into the field. One debate at this workshop was whether it is better to encourage existing biologists and theoreticians to learn from each other, perhaps by running short educational courses or creating environments suitable for cross-fertilization, or to educate the next generation in both disciplines simultaneously, perhaps from the very start of an undergraduate education. Certainly both strategies are essential: it is good to start students thinking about problems at an early age, and it would be foolish to abandon the talented scientists who already work in the separate disciplines - indeed, without them, how will the new students be trained? Providing for cross-disciplinary education at a late stage can be expensive, especially since physics postdocs earn much more than biology postdocs and it can take many years to cross-educate a person in the two fields. However, a key theme is that processes that encourage positive feedback are good, owing to the transitive nature of education: for example, training biology postdocs (who are about to become professors) in mathematics will cause their undergraduate and graduate students to be more quantitative in their thinking. Educational institutions are structured so that teaching an individual equates, after a time, to teaching everybody; the trick is to make that time as short as possible.

The question arose as to how to facilitate efficient cultural changes in institutions - how to make people want to spend money on theoretical neuroscience. This requires leadership and effort, but makes the long-term survival of interdisciplinary programs more likely: with sufficient inspiration and energy, many of the above ideas could become forces for scientific change. Money can give institutions and people the freedom to change; it is unrealistic, for example, to expect professors to design a new theoretical biology curriculum, or to learn a new field, while juggling research, traditional classes, and administrative duties. One suggestion made during this workshop was that money could be used to give professors sabbaticals to design new curricula, to let faculty take educational opportunities to explore different fields broadly, or to hire faculty who would drive departments in theoretical directions. Another suggestion was to create a hybrid program that sponsors a few years of postdoctoral research in an alternative field, followed by a highly catalytic faculty position in a department that wants to become more interdisciplinary. It was generally agreed that additional money could create time for thought, conversation, education, and collaboration at every stage of a scientist's career, although the quality of the scientists trained should perhaps be emphasized over their quantity.

It was suggested that different universities should candidly share their successes and failures as they try to implement the above ideas, especially in relation to education at the undergraduate and graduate levels. Inter-institution collaborations and regional networks could also be facilitated, perhaps through meetings, conventions, and local training programs where people can converge with ideas and interdisciplinary goals. These last few suggestions reflect a general theme of all the conclusions arrived at during this workshop: it is very important not to work in isolation - not at the individual, departmental, institutional, or intellectual level. Indeed, isolation is contrary to the essence of all modern biological research. Communication and the sharing of insight are therefore fundamental to every aspect of computational neuroscience.


Conclusion

In conclusion, the field of computational neuroscience is full of stunning challenges of incredible complexity. The rewards, however, are enormous - given our essential nature as human beings, perhaps no endeavor is as potentially satisfying as the mathematical analysis of the mind. To complete the loop of scientific knowledge - to understand neuroscience in terms of biology, biology in terms of physics, physics in terms of math, math in terms of cognitive psychology, and cognitive psychology in terms of neuroscience once again - will require insightful models at every step of the way. To complete this loop is to fulfill a dream humans have held ever since rationality became manifest. The medical promises are also enticing - useful prosthetic muscles and sensory organs, computer-enhanced memory and brainpower, and repair of systems damaged by age or disease. Some of these promises are futuristic enough that they are hard to imagine now, but the lag time between a scientific dream and its fulfillment is shrinking rapidly.

To guide this field into the 21st century will require creative leadership, energetic education, and the use of resources to encourage freedom, risk-taking, and close communication. It is the hope of this writer that this workshop has played an integral role in shaping this requisite vision.


Participants

Richard W. Tsien, PhD (Co-Chair)
Department of Molecular and Cellular Physiology
Stanford University School of Medicine
Beckman Center Room B105
Stanford, CA 94305-5345
Phone: (650) 725-7557
Fax: (650) 725-2504
email: rwtsien@leland.stanford.edu

Ruzena Bajcsy, PhD
Directorate for Computer and
Information Science and Engineering (CISE)
National Science Foundation
4201 Wilson Boulevard, Suite 1105
Arlington, VA 22230
Phone: 703-306-1900
Fax: 703-306-0577
Email: rbajcsy@nsf.gov

Tom M. Bartol, Jr., PhD
Computational Neurobiology Laboratory
The Salk Institute
10010 N. Torrey Pines Rd.
La Jolla, CA 92037
Phone: 858-453-4100 x1565
Fax: 858-587-0417
e-mail: bartol@salk.edu

William Bialek, PhD
NEC Research Institute
4 Independence Way
Princeton NJ 08540
Phone: 609-951-2643
Fax: 609-951-2496
email: bialek@research.nj.nec.com

Eve E. Marder, PhD (Co-chair)
Volen Center
Brandeis University
Waltham, MA 02454
Phone: 781-736-3140
Fax: 781-736-3142
email: marder@brandeis.edu

John P. Donoghue, PhD
Department of Neuroscience
Brown University
Box 1953
Providence, RI 02912
Phone: 401-863-2701
Fax: 401-863-1074
email: John_Donoghue@brown.edu

Gwen A. Jacobs, PhD
Center for Computational Biology
Montana State University
Bozeman, MT 59717-3505
Phone: 406-994-7334
Fax: 406-994-5122
email: gwen@nervana.montana.edu

Stephen G. Lisberger, PhD
Department of Physiology
University of California, San Francisco, School of Medicine
513 Parnassus Ave
Box 0444
San Francisco, CA 94143-0444
Phone: 415-476-1062
Fax: 415-502-4848
email: sgl@phy.ucsf.edu

Wade Regehr, PhD
Department of Neurobiology
Harvard Medical School
Boston, MA 02115
Phone: 617-432-0435
Fax: 617-734-7557
email: wregehr@hms.harvard.edu

John M. Rinzel, PhD
Center for Neural Science
New York University
Room 809
New York, NY 10003
Phone: 212-998-3308
Fax: 212-995-4011
email: rinzel@cns.nyu.edu

H. Sebastian Seung, PhD
Department of Brain and Cognitive Sciences
Massachusetts Institute of Technology
45 Carleton St
E25-210
Cambridge, MA 02139
Phone: 617-252-1693
Fax: 617-258-7978
email: seung@mit.edu

Karen Ann Sigvardt, PhD
Center for Neuroscience
University of California, Davis
1544 Newton Ct.
Davis, CA 95616
Phone: (530) 757-8520
Fax: (530) 757-8827
Email: kasigvardt@ucdavis.edu

David W. Tank, PhD
Bell Laboratories
Biological Computation Research
Lucent Tech Room 3L-408
600 Mountain Ave
Murray Hill, NJ 07974
Phone: 908-582-7058
Fax: 908-582-4702
email: dwt@physics.lucent.com

Charles Wilson, Ph.D.
Cajal Neuroscience Research Center
Division of Life Science
University of Texas at San Antonio
6900 N. Loop 1604 West
San Antonio, Texas 78249
Phone: (210) 458-5654
Fax: (210) 458-5658
email: cjwilson@utsa.edu

NIH:

Gerald D. Fischbach, MD
Director
NINDS, NIH
31 Center Dr.
Room 8A52
Bethesda, MD 20892
Phone: 301-496-9746
Fax: 301-496-0296
email: gf33n@nih.gov

Yuan Liu, PhD (Organizer)
Channels, Synapses and Circuits
NINDS/NIH
6001 Executive Blvd, Room 2110B
MSC 9523
Bethesda, MD 20892-9523
Phone: 301-496-1917
Fax: 301-480-2424
email: liuyuan2@ninds.nih.gov

Constance W. Atwell, PhD
Associate Director for Extramural Research
NINDS, NIH
Neuroscience Center
Suite 3009
6001 Executive Blvd
Bethesda, MD 20892
Phone: 301-496-9248
Fax: 301-402-4370
email: CA23C@NIH.GOV

Dennis L. Glanzman, PhD (Co-Organizer)
Theoretical and Computational Neuroscience Program
Division of Neuroscience
and Basic Behavioral Science
NIMH/NIH
6001 Executive Blvd, Room 7N-7171
Bethesda, MD 20892-9637
Phone: 301-443-1576
Fax: 301-443-4822
email: glanzman@helix.nih.gov

Prepared by:

Edward Boyden (Writer)
Department of Molecular and Cellular Physiology
Stanford University School of Medicine
Beckman Center Room B105
Stanford, CA 94305-5345
Phone: (650) 725-7564
Fax: (650) 725-8021
email: eboyden3@stanford.edu

Jason Pyle (Associate Writer)
Department of Molecular and Cellular Physiology
Stanford University School of Medicine
Beckman Center Room B105
Stanford, CA 94305-5345
Phone: (650) 725-8119
Fax: (650) 725-8021
email: jpyle@stanford.edu
