
Burkhard Militzer

BES Requirements Worksheet

1.1. Project Information - Quantum Monte Carlo Simulations of High Pressure Materials

Document Prepared By

Burkhard Militzer

Project Title

Quantum Monte Carlo Simulations of High Pressure Materials

Principal Investigator

Burkhard Militzer

Participating Organizations

University of California, Berkeley

Funding Agencies

 DOE SC  DOE NSA  NSF  NOAA  NIH  Other:

2. Project Summary & Scientific Objectives for the Next 5 Years

Please give a brief description of your project - highlighting its computational aspect - and outline its scientific objectives for the next 3-5 years. Please list one or two specific goals you hope to reach in 5 years.

Over the next five years, we will help establish quantum Monte Carlo (QMC) simulations as a standard tool for electronic structure computations in condensed matter physics. We will provide benchmark calculations by selecting important problems in high pressure physics and geophysics where DFT is known to fail or to be of limited accuracy. We want to develop and apply new QMC methods and then demonstrate that, with sufficient cycles provided, QMC can be performed on all systems for which ground state calculations are currently performed with DFT only.
The goal is to eliminate the dependence on approximate exchange-correlation functionals and the need to pick the right functional, or a good value for the U parameter, by comparing with experiments. Once this is achieved, we will have a much more robust description of structural and spin transitions in transition metal oxides, which make up a significant part of the Earth's lower mantle. Another application of QMC will be to characterize van der Waals bonding in solids and fluids. This will enable us to select materials for hydrogen storage and carbon sequestration.
The precision of such calculations may rival that of high pressure experiments for selected simple materials. In experiments, it may be difficult to control hydrostaticity and to maintain a well-defined chemical composition.

3. Current HPC Usage and Methods

3a. Please list your current primary codes and their main mathematical methods and/or algorithms. Include quantities that characterize the size or scale of your simulations or numerical experiments; e.g., size of grid, number of particles, basis sets, etc. Also indicate how parallelism is expressed (e.g., MPI, OpenMP, MPI/OpenMP hybrid)

1) Casino QMC code, maintained by the Cambridge group of Richard Needs
2) QMCPack, maintained by Jeongnim Kim in David Ceperley's group at the University of Illinois at Urbana-Champaign
 
The size of the simulation is best characterized by the number of valence electrons that are treated explicitly in QMC. Large simulations currently contain 500-1000 electrons.
 
The amount of communication in QMC is small but not zero. All parallelism is expressed using MPI.
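
To illustrate this communication pattern, the following is a minimal sketch (not code from Casino or QMCPack) of MPI walker parallelism: each rank advances its own walker population independently, and only one small reduction per block accumulates the global energy average. The walker-update routine and all numbers are placeholders.

// Minimal sketch of MPI walker parallelism in QMC (illustrative only;
// the propagation and local-energy routine below is a placeholder).
#include <mpi.h>
#include <cstdio>
#include <random>

// Placeholder: advance one walker by a Monte Carlo step and return its local energy.
double advance_walker_and_measure(std::mt19937_64 &rng) {
    std::normal_distribution<double> noise(0.0, 1.0);
    return -1.0 + 0.01 * noise(rng);   // stands in for a real local-energy evaluation
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    const int walkers_per_rank = 64;     // each rank owns an independent population
    const int steps_per_block  = 100;    // communication happens only once per block
    std::mt19937_64 rng(12345 + rank);   // independent random stream per rank

    double local_sum = 0.0;
    long   local_n   = 0;
    for (int step = 0; step < steps_per_block; ++step) {
        for (int w = 0; w < walkers_per_rank; ++w) {
            local_sum += advance_walker_and_measure(rng);
            ++local_n;
        }
    }

    // The only communication: reduce the block averages across all ranks.
    double global_sum = 0.0;
    long   global_n   = 0;
    MPI_Allreduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
    MPI_Allreduce(&local_n,   &global_n,   1, MPI_LONG,   MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        std::printf("block energy estimate: %f from %ld samples\n",
                    global_sum / global_n, global_n);
    MPI_Finalize();
    return 0;
}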

3b. Please list known limitations, obstacles, and/or bottlenecks that currently limit your ability to perform simulations you would like to run. Is there anything specific to NERSC?

1) The most severe limitation at NERSC is the wait time in the queues; 3-4 days is not unusual. This makes development work difficult, and it is also a problem for researchers with many other commitments. The combination of limited days available to work on the NERSC computers and the delayed turnaround makes progress slower than needed.
 
--> I would like to make a suggestion: the metric of NERSC's success should be changed from simply measuring how many hours were delivered. The wait time should be factored in: for every day of waiting, 10% should be subtracted from the allocation the user is charged and also from what NERSC counts as delivered (a small arithmetic sketch follows this list).
 
2) If new machines are bought with less memory per core, then we will have to rewrite our codes, which may cause some delay. There will also be a limit below which some QMC calculations can no longer be performed. I would recommend against less than 1 GB per core.
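
For concreteness, the sketch referenced above follows, under my interpretation that each full day a job waits removes 10% of its raw core-hours from both the user's charge and NERSC's delivered total; capping the discount at zero is an assumption.

// Illustrative arithmetic only: one possible reading of the proposed
// wait-time discount (10% of the raw charge removed per full day queued).
#include <algorithm>
#include <cstdio>

double discounted_hours(double raw_core_hours, double queue_wait_days) {
    double factor = 1.0 - 0.10 * queue_wait_days;   // 10% off per day of waiting
    return raw_core_hours * std::max(0.0, factor);  // never below zero (assumption)
}

int main() {
    // A job that used 10,000 core-hours but waited 3 days in the queue
    // would be charged (and counted as delivered) at 7,000 core-hours.
    std::printf("%.0f core-hours\n", discounted_hours(10000.0, 3.0));
    return 0;
}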

3c. Please fill out the following table to the best of your ability. This table provides baseline data to help extrapolate to requirements for future years. If you are uncertain about any item, please use your best estimate to use as a starting point for discussions.

Facilities Used or Using

 NERSC  OLCF  ALCF  NSF Centers  Other:

Architectures Used

 Cray XT  IBM Power  BlueGene  Linux Cluster  Other:  

Total Computational Hours Used per Year

1,500,000 Core-Hours

NERSC Hours Used in 2009

80,000 Core-Hours

Number of Cores Used in Typical Production Run

64-512

Wallclock Hours of Single Typical Production Run

12

Total Memory Used per Run

 40-1024 GB

Minimum Memory Required per Core

 2 GB

Total Data Read & Written per Run

 4 GB

Size of Checkpoint File(s)

1 GB

Amount of Data Moved In/Out of NERSC

4 GB per week

On-Line File Storage Required (For I/O from a Running Job)

 0.5 GB and  10000 Files

Off-Line Archival Storage Required

 1 GB and  20000 Files

Please list any required or important software, services, or infrastructure (beyond supercomputing and standard storage infrastructure) provided by HPC centers or system vendors.

none 

4. HPC Requirements in 5 Years

4a. We are formulating the requirements for NERSC that will enable you to meet the goals you outlined in Section 2 above. Please fill out the following table to the best of your ability. If you are uncertain about any item, please use your best estimate to use as a starting point for discussions at the workshop.

Computational Hours Required per Year

10,000,000

Anticipated Number of Cores to be Used in a Typical Production Run

1000-5000

Anticipated Wallclock to be Used in a Typical Production Run Using the Number of Cores Given Above

12

Anticipated Total Memory Used per Run

 2000-10000 GB

Anticipated Minimum Memory Required per Core

 2 GB

Anticipated total data read & written per run

 40 GB

Anticipated size of checkpoint file(s)

10 GB

Anticipated On-Line File Storage Required (For I/O from a Running Job)

 5 GB and 10000 Files

Anticipated Amount of Data Moved In/Out of NERSC

40 GB per week

Anticipated Off-Line Archival Storage Required

 10 GB and  20000 Files

4b. What changes to codes, mathematical methods and/or algorithms do you anticipate will be needed to achieve this project's scientific objectives over the next 5 years.

Limitations in communication between large runs are not a problem in QMC. I foresee, however, that our code will need to adapt to a reduced memory size per core. We will need to rewrite the storage routine for the wave function so that different cores on the same node can share a single copy.
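
One possible approach, sketched below purely as an assumption rather than a committed design, is to keep a single copy of the tabulated wave function per node in an MPI-3 shared-memory window, so that every core on a node reads the same table instead of holding its own copy. The table size and contents are placeholders.

// Sketch of sharing one wave-function table per node via MPI-3 shared memory.
// Illustrative only; the table size and contents are placeholders.
#include <mpi.h>
#include <cstdio>
#include <cmath>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    // Group the ranks that live on the same node.
    MPI_Comm node_comm;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node_comm);
    int node_rank;
    MPI_Comm_rank(node_comm, &node_rank);

    // Only rank 0 on each node allocates the (large) wave-function table;
    // the other ranks attach to the same physical memory.
    const MPI_Aint n_entries = 1 << 20;   // placeholder table size
    MPI_Aint local_bytes = (node_rank == 0)
                               ? (MPI_Aint)(n_entries * sizeof(double)) : 0;
    double *table = nullptr;
    MPI_Win win;
    MPI_Win_allocate_shared(local_bytes, sizeof(double), MPI_INFO_NULL,
                            node_comm, &table, &win);

    // Non-zero ranks query the base address of rank 0's segment.
    if (node_rank != 0) {
        MPI_Aint size;
        int disp_unit;
        MPI_Win_shared_query(win, 0, &size, &disp_unit, &table);
    }

    // Rank 0 fills the table once; everyone else just reads it.
    if (node_rank == 0)
        for (MPI_Aint i = 0; i < n_entries; ++i)
            table[i] = std::sin(1e-3 * i);   // stand-in for spline coefficients
    MPI_Win_fence(0, win);                   // make the fill visible to all ranks

    if (node_rank == 1)
        std::printf("shared table entry: %f\n", table[42]);

    MPI_Win_fence(0, win);
    MPI_Win_free(&win);
    MPI_Comm_free(&node_comm);
    MPI_Finalize();
    return 0;
}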

4c. Please list any known or anticipated architectural requirements (e.g., 2 GB memory/core, interconnect latency < 3 μs).

2 GB per core is reasonable. For lower values, some problems cannot be done without rewriting the code, which takes time.

4d. Please list any new software, services, or infrastructure support you will need over the next 5 years.

Jointly with a mathematician, we are developing linear-scaling QMC methods that can evaluate large determinants efficiently.
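
For context only, and not as a description of the linear-scaling method under development, the sketch below shows the standard Sherman-Morrison single-row update used in conventional QMC codes: when one electron moves, the ratio of Slater determinants and the updated inverse of the Slater matrix are obtained in O(N^2) operations instead of recomputing an O(N^3) determinant. The 2x2 example values are illustrative.

// Standard O(N^2) single-row determinant update used in conventional QMC
// (illustrative; not the linear-scaling scheme under development).
#include <cstdio>
#include <vector>

// Row-major N x N matrices stored as flat vectors; D[i][j] = phi_j(r_i).
using Matrix = std::vector<double>;

// Ratio det(D_new)/det(D_old) when row k of the Slater matrix is replaced by
// new_row (the orbitals at the proposed electron position), given the inverse
// Dinv of the old matrix: R = sum_j new_row[j] * Dinv[j][k].
double det_ratio(const Matrix &Dinv, const std::vector<double> &new_row, int k, int N) {
    double R = 0.0;
    for (int j = 0; j < N; ++j) R += new_row[j] * Dinv[j * N + k];
    return R;
}

// Sherman-Morrison update of Dinv after the row replacement is accepted.
void update_inverse(Matrix &Dinv, const std::vector<double> &new_row, int k, int N) {
    const double R = det_ratio(Dinv, new_row, k, N);
    // t[j] = (new_row^T * Dinv)_j, computed from the old inverse.
    std::vector<double> t(N, 0.0);
    for (int l = 0; l < N; ++l)
        for (int j = 0; j < N; ++j)
            t[j] += new_row[l] * Dinv[l * N + j];
    for (int i = 0; i < N; ++i) {
        const double cik = Dinv[i * N + k];      // old column-k entry of row i
        for (int j = 0; j < N; ++j) {
            if (j == k) continue;
            Dinv[i * N + j] -= cik * t[j] / R;   // columns other than k
        }
        Dinv[i * N + k] = cik / R;               // column k
    }
}

int main() {
    // 2x2 example: D = I, so Dinv = I. Replace row 0 by (3, 1): det ratio is 3.
    const int N = 2;
    Matrix Dinv = {1.0, 0.0,
                   0.0, 1.0};
    std::vector<double> new_row = {3.0, 1.0};
    std::printf("ratio = %f\n", det_ratio(Dinv, new_row, 0, N));
    update_inverse(Dinv, new_row, 0, N);
    std::printf("updated Dinv[0][0] = %f (expect 1/3)\n", Dinv[0]);
    return 0;
}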

4e. It is believed that the dominant HPC architecture in the next 3-5 years will incorporate processing elements composed of 10s-1,000s of individual cores, perhaps GPUs or other accelerators. It is unlikely that a programming model based solely on MPI will be effective, or even supported, on these machines. Do you have a strategy for computing in such an environment? If so, please briefly describe it.

 

New Science With New Resources

To help us get a better understanding of the quantitative requirements we've asked for above, please tell us: What significant scientific progress could you achieve over the next 5 years with access to 50X the HPC resources you currently have access to at NERSC? What would be the benefits to your research field if you were given access to these kinds of resources?

Please explain what aspects of "expanded HPC resources" are important for your project (e.g., more CPU hours, more memory, more storage, more throughput for small jobs, ability to handle very large jobs).

Our NERSC allocation is comparatively small (140,000 hours). With 7,000,000 hours per year, we could carry out three to four QMC projects like the ones we run at NSF centers (2,000,000 hours each). The impact on society as a whole would be modest.