Combining materials modeling and simulation with neutron scattering experiments at the Spallation Neutron Source
Peter F. Peterson
NDAV, Oak Ridge National Laboratory
Developing and applying the world’s
best tools for neutron scattering
High Flux Isotope Reactor:
Intense steady-state neutron flux
and a high-brightness cold neutron source
Spallation Neutron Source:
World’s most powerful
accelerator-based neutron source
Biology and Soft Matter
Chemical and Engineering Materials
Neutron Data Analysis and Visualization
Instrument and Source Design
Quantum Condensed Matter
2 ADD2013 P.F.Peterson
Neutron Data Analysis and Visualization Division
Thomas Proffen (Division Director)
Technology Advancement (matrixed from CCSD):
• Galen Shipman (Data Systems Architect) – Data Infrastructure, HPC Systems
Neutron Data Analysis and Visualization Group:
• Mark Hagen (Group Leader)
• Diffraction Software – Peter Peterson
• Inelastic Software – Stuart Campbell
• Low-Q Software – Mathieu Doucet
3 ADD2013 P.F.Peterson
Data Operations (matrixed from RAD)
Karen White (Manager)
• Accelerator Controls – Karen White (Group Leader): Software System Tools, IT Support
• Instrument Data Acquisition and Controls – Steven Hartman (Group Leader): Detector Acquisition, Data Translation, Experiment Acquisition
We are not alone…
C-LAB (Computational Laboratory)
• Mark Johnson / Institut Laue-Langevin
• Users can “apply” for simulation support
• Running since the 1990s
European Spallation Source (ESS)
• DMSC: Data Management & Scientific Computing
• ESS in Lund streams data to U. Copenhagen
In the US…
• Advanced Light Source (ALS) → NERSC [LBNL]
• Advanced Photon Source (APS) → TCS/ALCF [ANL]
4 ADD2013 P.F.Peterson
Our Mission… what do we aspire to do?
SNS is an experimental user facility.
We should aspire to create a data infrastructure that gives users:
o The ability to reduce and analyze the data as it is taken
o Data files created instantly after acquisition (no matter how big)
o The ability to reduce a data set post-acquisition in ~1 minute
o The resources for any user to do post-acquisition reduction, analysis, visualization, and modeling from anywhere…
Surely everyone signs up to these… but how does one make it happen?
5 ADD2013 P.F.Peterson
Let’s rewind for a moment…
[Diagram: the legacy data flow – Detectors and Sample Environment feed the DAS; data then moves to a “monster” analysis computer, and on to offsite analysis]
6 ADD2013 P.F.Peterson
DAS Needs Upgrading Anyway
• Parts of the SNS DAS must be upgraded for obsolescence/maintainability
• Other parts can be simplified or otherwise improved
• New functionality is needed for data analysis
7 ADD2013 P.F.Peterson
Collaboration – NDAV, Technology Integration Group (NCCS), DAS/RAD
[ADARA architecture diagram]
Per beam line resources:
• Detectors → Detector Electronics → Preprocessor(s) (1 or more): push events + timing
• Stream Management Service: aggregates event and environmental data; streams data to the Translation Services and to Data Reduction and Analysis
• Slow Controls Sample Environment & Scanning Services: manages sample environment, choppers, motors, etc.; pushes environmental data; receives control commands for processing and response
• User Interface: hosts analysis and environmental-controls GUI(s)
Shared resources:
• NeXus Translation Services
• Streaming Data Reduction & Analysis Services: receives the aggregate event stream and provides live reduction & analysis services
• Global high-performance file system
• Analysis Clusters and Analysis Workstations
• Data Portals
8 ADD2013 P.F.Peterson
The Stream Management Service
• A high-performance publish/subscribe system
• Data is aggregated from a number of sources, transformed, and streamed in a common network format
• As data is aggregated and transformed it is stored locally, allowing subscribers to join the stream at any time
• Can service a relatively large number of downstream subscribers concurrently
• Subscribers can consume neutron data at different rates; the SMS manages the buffering
• Provides a common communication backplane for reduction and archiving (a buffering sketch follows)
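A minimal sketch of the buffering idea (plain Python, invented class and names); this illustrates the concept only, not the ADARA wire protocol or the SMS implementation:

```python
class StreamBuffer:
    """Toy stand-in for SMS-style buffering: packets are retained locally,
    so subscribers can join at any time and consume at their own rates."""

    def __init__(self, max_packets=100_000):
        self.packets = []              # ordered log of retained packets
        self.max_packets = max_packets

    def publish(self, packet: bytes):
        # Aggregation/transformation to the common format would happen here.
        self.packets.append(packet)
        if len(self.packets) > self.max_packets:
            self.packets.pop(0)        # age out the oldest packet

    def subscribe(self, cursor: int = 0):
        """Each subscriber holds its own cursor, so a slow consumer never
        blocks a fast one; both read the same retained log."""
        while cursor < len(self.packets):
            yield self.packets[cursor]
            cursor += 1

buf = StreamBuffer()
for i in range(5):
    buf.publish(f"event packet {i}".encode())
# A subscriber joining late still sees the whole stream:
print([p.decode() for p in buf.subscribe(cursor=0)])
```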
9 ADD2013 P.F.Peterson
The Streaming Translation Service
• A downstream subscriber to the SMS
• During experiment run startup the SMS connects to the STS and negotiates translation processing
• The STS begins translation as data is received from the SMS, using a small amount of buffering, translating that buffer, and then writing it out within the NeXus HDF5 file
• A single NeXus file is written over the course of the run; upon completion of the run the NeXus file is archived/cataloged
• Translation of even extremely large datasets completes mere moments after the experiment is finished
• NeXus files are instantly available on a high-performance parallel file system, allowing parallel-capable reduction and analysis operators for post-processing (a sketch of the incremental write follows)
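A sketch of this incremental-write pattern with h5py, assuming only the common NXevent_data field names (event_id, event_time_offset); the file name, bank name, and chunk size are hypothetical, and the real STS consumes the ADARA stream rather than random numbers:

```python
import h5py
import numpy as np

with h5py.File("run_000123.nxs.h5", "w") as f:
    events = f.create_group("entry/bank1_events")
    ids = events.create_dataset("event_id", (0,), maxshape=(None,),
                                dtype="u4", chunks=(16384,))
    tofs = events.create_dataset("event_time_offset", (0,), maxshape=(None,),
                                 dtype="f4", chunks=(16384,))

    def append_buffer(pixel_ids, time_offsets):
        """Translate one buffered chunk of the stream into the file."""
        n0, n = ids.shape[0], len(pixel_ids)
        ids.resize((n0 + n,))
        tofs.resize((n0 + n,))
        ids[n0:] = pixel_ids
        tofs[n0:] = time_offsets

    # Simulated stream: three buffered chunks arriving during the run.
    for _ in range(3):
        append_buffer(np.random.randint(0, 1024, 10_000),
                      np.random.uniform(0.0, 16_666.0, 10_000))
# The file closes at end of run, so it is complete "instantly".
```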
10 ADD2013 P.F.Peterson
The Streaming Reduction Service
11 ADD2013 P.F.Peterson
Workflow Manager
• Handles data cataloging, automated reduction, and archiving of datasets post-acquisition
• Built on Apache ActiveMQ, allowing a loosely coupled architecture
• Resilience through durable messages and clustered message brokers
• Load balancing of tasks across resources through message queues
• Tasks within the workflow are handled by independent processes running on one or more systems
• Each task type maps to an ActiveMQ queue; to activate a task you simply send a message to that queue (see the sketch below)
• A director process manages the workflow, sending messages to these queues
• Workflows are defined and stored within the director’s redundant MySQL system
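Activating a task could look like the following stomp.py sketch (STOMP is one of the protocols ActiveMQ speaks); the broker host, credentials, queue name, and payload fields are all hypothetical:

```python
import json
import stomp  # stomp.py: a STOMP client that can talk to ActiveMQ

# Activating a task is nothing more than sending a message to its queue.
conn = stomp.Connection([("workflow-broker.example.org", 61613)])
conn.connect("wfuser", "wfpass", wait=True)
conn.send(
    destination="/queue/POSTPROCESS.DATA_READY",  # one queue per task type
    body=json.dumps({"facility": "SNS",
                     "instrument": "PG3",
                     "run_number": 12345}),
)
conn.disconnect()
```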
12 ADD2013 P.F.Peterson
Parallel Reduction – Low-Hanging Fruit
• People like automated reduction
• Why? It fits their modus operandi
• Get automated reduction working for “all”
• Parallel reduction:
  o Large files
  o Ensemble runs
13 ADD2013 P.F.Peterson
How can we do it…
• We stream the data (neutron and SE) out from the data acquisition…
• We modify Mantid (the data reduction program) to read from the data stream…
• We re-configure the data translation (file creation) to read from the data stream…
… and we create the files while the run is taking place: end of run = close file
[however big the file, it appears “instantly” at the end of the run]
• We stream the data to, and create the files in, a place where we put the resources to upload/reduce big files in minutes
• We put the resources for analysis there and give the users remote access
• It’s not crazy, others can do it… why not us?
[ADARA architecture diagram, as on slide 8]
14 ADD2013 P.F.Peterson
It’s not crazy… #1
• The current Data Acquisition System (DAS) streams neutron data internally… and writes it to disk on the DAS in the target building…
• The DAS “polls” the SE and… writes it to a file on the DAS…
• Create the “Stream Management Service” (SMS) – one that can stream the data live on a network to…
• The SNS instrument data rate is not that high; the whole target building at worst case is ~4 GB/s
15 ADD2013 P.F.Peterson
[ADARA architecture diagram, as on slide 8]
It’s not crazy… #2
• What about the “receivers” of the data?
• Streaming Translation Service (STS)
• Mantid (data reduction) already –
  o Reads events (stream format) and data files (HDF5)
  o (For SNS instruments) internally stores and manipulates data as event streams
• Streaming Reduction Service (SRS)
[ADARA architecture diagram, as on slide 8]
16 ADD2013 P.F.Peterson
It’s not crazy… #3
• Post-acquisition Reduction (& Analysis)
• NCCS post-processes large data sets all of the time
• Parallel file system for large data set upload
• Clusters [Chadwick, Fermi,…] for large data set reduction
• For reduction, enable MPI, OpenMP, and parallel I/O in Mantid (target: reduce a 1 TB file in ~1 minute; see the sketch below)
• Analysis workstation cluster for user logins
• Locate this in central computing, where they have the experience (+ floor loading & HVAC)
• Leverage the ASCR investment – networking, skills, HPC environment
[ADARA architecture diagram, as on slide 8]
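A toy sketch of that parallel pattern with mpi4py – local histogramming of events followed by one collective sum; how Mantid’s real MPI build partitions the work is not shown here:

```python
# Run with e.g.:  mpirun -n 32 python reduce_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank would read its own slice of the event file (parallel I/O);
# here we fake a local batch of time-of-flight events.
local_tof = np.random.uniform(0.0, 16_666.0, 1_000_000)

# Local histogramming, then a single collective sum onto the root rank.
bins = np.linspace(0.0, 16_666.0, 1001)
local_hist, _ = np.histogram(local_tof, bins=bins)
total_hist = np.zeros_like(local_hist)
comm.Reduce(local_hist, total_hist, op=MPI.SUM, root=0)

if rank == 0:
    print("reduced", total_hist.sum(), "events across", size, "ranks")
```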
17 ADD2013 P.F.Peterson
Data Reduction/Analysis History…
• Before 2010 we had a set of “disconnected” software: Java, IDL,…
• After April 2010 – “All Mantid, all the time”
• Generation I:
  o Stabilize the current software situation
  o Provide operating instruments with efficient data reduction software
  o Ensure that upcoming instruments will have reliable reduction software
• Generation II:
  o Create a single high-performance data reduction workbench for all instruments that works at SNS/HFIR and can be distributed to users
  o Build in automated reduction and parametric/real-time features
• Generation III: long-term outlook
  o Advanced analysis methods utilizing modeling, simulation, MD, DFT, etc.
18 ADD2013 P.F.Peterson
Mantid: http://www.mantidproject.org
• Data reduction, visualization, and analysis
• Collaboration with ISIS
• Cross-platform: Windows, Linux, and Mac
• C++, Qt, Python, ParaView
• OpenMP and MPI (a scripting sketch follows)
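A minimal sketch of the Mantid Python scripting this enables; Load, Rebin, and SumSpectra are standard Mantid algorithms, while the file name is hypothetical:

```python
# Load an event NeXus file, rebin in time-of-flight, sum the spectra.
from mantid.simpleapi import Load, Rebin, SumSpectra

ws = Load("PG3_12345_event.nxs")         # event workspace
ws = Rebin(ws, Params="300,10,16000")    # TOF bins: start, width, end
summed = SumSpectra(ws)
print(summed.readY(0)[:10])              # first few histogram counts
```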
19 ADD2013 P.F.Peterson
Mantid: VATES (ParaView-based visualization)
20 ADD2013 P.F.Peterson
Analysis Workstation Cluster
• Each beamline has 1-3 analysis computers
• Offline analysis: http://analysis.sns.gov/
• Load-balanced cluster of 5 workstations
  o Each with 256 GB memory, 32 cores
  o Same desktop as on the beam line
• Users can log in with their XCAMS username/password
• We also have the ability to SSH to NERSC from these machines…
• Software stack as on the beam line:
  Mantid, Horace, Tobyfit, GSAS, Fullprof, ZODS, McStas, ParaView, PDFGui, SANSview
  IDL codes – Ref_L, Ref_M, BASIS, VDRIVE, NOMAD
• Compilers/Languages: FORTRAN/C++, Python, IDL, Matlab, ParaView
21 ADD2013 P.F.Peterson
Simulation and the HPC Facilities
• First (baby) step in making the connection with HPC simulation…
• 4,000,000 CPU-hours at NERSC (National Energy Research Scientific Computing Center, LBNL) on Carver, Franklin, and Hopper
• Computational resources for the interpretation of neutron scattering data from the Spallation Neutron Source and HFIR (PI: M. Hagen)
Projects:
• Anharmonicity and electron-phonon coupling in thermoelectric materials studied with inelastic neutron scattering and first-principles computations – O. Delaire, I. Al-Qasir, S. Campbell, M. Doucet
• Nanoparticle structure from total scattering data – T. Proffen, K. Page, R. Neder
• Quantitative modeling of disorder in the light up-conversion material NaLnF4 – C. Hoffmann, V. Lynch, T. Michels-Clark, R. Harrison
• Dynamics of dihydrofolate reductase (DHFR) in aqueous organic solvents – J. Borreguero-Calvo, P. Agarwal, D. Myles et al.
• Hydrophobic matching of vesicle bilayers by melittin – W. Heller, J. Borreguero-Calvo, S. Qian
• Integrating theory, molecular dynamics simulations and small-angle neutron scattering to reveal pathways of molecular recognition in intrinsically disordered proteins – C. Stanley, M. Doucet, A. Ramanathan et al.
• Hydrodynamic interactions of soft colloidal solutions – W.R. Chen, T. Iwashita, C. Do, B. Wu
• Neutron data analysis development projects – M. Hagen, A.T. Savici, B. Fultz, J. Lin et al.
Hopper, Franklin, and Carver are ASCR/DOE’s “capacity” supercomputers; Jaguar (OLCF) and the ALCF machines are ASCR/DOE’s “INCITE” supercomputers.
22 ADD2013 P.F.Peterson
Enabling the future…
• ADARA lays foundations for things we can do in the future…
• Live data analysis/computational steering (multi-level):
  o Instrument level – a GSAS fit on POWGEN, or Lorentzians on BASIS, makes the decision to move to the next temperature, etc.
  o Run a simulation on Jaguar (or Titan) in real time to steer an experiment
• Develop a program that lets SNS/HFIR (experimental) users leverage HPC resources for analysis/modeling at OLCF/NERSC, etc.
• Engage theory/modeling/simulation to drive research at SNS/HFIR on the “big” problems…
  o ORNL has established simulation/theory groups
  o What about collaboration with non-ORNL groups?
  o Establish a (small) theory group and employ online collaboration tools similar to CASL?
  o Funding for collaborative projects with other theory groups
• Collaboration with CCSD on “applied math” topics
  o Fitting and optimization problems
  o Higher-dimensional visualization
23 ADD2013 P.F.Peterson
The future is now?
• Opportunity knocks…
• CAMM – Center for Accelerating Materials Modeling
• History:
  o October 2011 – BES/ASCR Workshop on Data & Communications: SNS, LCLS, APS, ALS, NSLS – same story; data reduction/analysis in near real time
  o March 2012 – BES call for “Predictive Modeling & Simulation”
  o ORNL pre-proposal for CAMM (invited for full proposal)
24 ADD2013 P.F.Peterson
CAMM – continues on from ADARA
• Center/project partners:
  o Neutron Sciences
  o Computational Sciences
  o Materials Science
  o Center for Nanophase Materials Science (CNMS)
• Computational framework:
  o User Interface
  o Resource Scheduling (on-demand clusters for CAMM, NERSC,… Jaguar)
  o Data Management (experiment & simulation)
  o Sensitivity (statistical) analysis of data/simulations
  o Optimal Experimental Design
  o Computational “Steering”
  o Materials simulations – converted to calculate neutron scattering laws
25 ADD2013 P.F.Peterson
High Level View
[Diagram: how each role splits its time]
Data Analysis Scientist: Infrastructure & Interfaces; User Program; Code Maintenance & Development; Research; Data Reduction; Modeling/Simulation
Beamline Scientist: User Program; Instrument Maintenance & Development; Research
26 ADD2013 P.F.Peterson
Firstly… Data Reduction will never end
• We will always be doing more and more sophisticated & complex experiments…
• Stephan Rosenkranz (et al.): sweep mode for inelastic…, cross-correlation for diffuse…
• Pulsed magnetic fields, pulsed electric fields, dynamic stress-strain, (temperature ramping), (parametric studies)
• More and more information is going to be “shoved” into the data stream and used to obtain S(Q), I(d), S(Q,E),…
• Remember: neutron scattering is an experimental technique; acquiring and reducing the data is important
[ADARA architecture diagram, as on slide 8]
27 ADD2013 P.F.Peterson
Data Acquisition and Reduction
• The “experimental” part is about data acquisition and reduction working together
• We need a data acquisition system [EPICS] that can extensibly (reliably, robustly) acquire neutron and SE data and “shove” it into the [ADARA] data stream
• We need a data reduction engine [Mantid] that gives us flexibility, event manipulation/filtering (see the sketch below), parallel capabilities (OpenMP, MPI), and integration with cataloguing [ICAT] – interactive (live) & automated/batch
• We need a User/Control Interface that is flexible/extensible, customizable to different experimental classes, able to send metadata to reduction/cataloguing, and able to receive “feedback” from reduction/analysis […]
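A sketch of the event filtering meant here, using Mantid’s FilterByLogValue algorithm (the file name and log name are hypothetical):

```python
from mantid.simpleapi import Load, FilterByLogValue

ws = Load("HYS_5678_event.nxs")          # hypothetical event file
# Keep only events recorded while the sample sat between 290 K and 300 K,
# according to a sample-environment log carried in the data stream.
filtered = FilterByLogValue(ws, LogName="SampleTemp",
                            MinimumValue=290.0, MaximumValue=300.0)
print(filtered.getNumberEvents(), "events survive the filter")
```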
[ADARA architecture diagram, as on slide 8]
28 ADD2013 P.F.Peterson
But still…
• The data is on disk. It’s a fancy (powerful) disk but… it’s still a disk.
• Unless we do something with the data it is… just occupying disk space.
• Users should analyze the data; it’s their data after all…
• However, we (the facilities) can help:
  o Download/transfer of data
  o Access to compute resources
  o Access to “standard” analysis packages
  o Access to software development tools
  o Developing “new” analysis packages
  o Assimilating/curating (user) analysis packages
  o Help/collaboration using analysis packages
(Analysis = modeling/simulation/fitting/statistics)
29 ADD2013 P.F.Peterson
Analysis with no name…
[ADARA architecture diagram, as on slide 8, annotated with the analysis resources:]
• Analysis Clusters: Chadwick (196 nodes), Fermi (512 nodes), OIC (256 nodes)
• Login-node workstations: 32 cores, 256 GB (looks like the beam line)
• Beam line workstations: 32 cores, 256 GB
• A bit like the ISS – we would like to attach new “modules”
• Would like to have a common software stack
30 ADD2013 P.F.Peterson
Offline Analysis
• As “before”… Mantid, SANSview, etc.
• Compilers/Languages – (Intel) FORTRAN/C++, Python, IDL, Matlab, ParaView
• NERSC (VASP, LAMMPS,…)
• VNF
  o Materials Studio, Amber, Charmm…
  o VASP
• Analysis software development:
  o Rietveld – GSAS “replacement”
  o DGS resolution & modeling
  o Integration with MD – e.g. Sassena
• User interfaces for simple-case modeling/simulations
• Resource scheduling
• Catalogue simulations in a similar way to data (ICAT)
• “Fitting” service for comparing models & data
• Open source, repository, unit tests…
31 ADD2013 P.F.Peterson
Live Analysis
• Neutron scattering is very “interactive”
• Simple fitting (Gaussians) is straightforward (a minimal sketch follows below), but what about modeling/simulation…
  o User Interface
  o Resource Scheduling (on-demand clusters vs NERSC)
  o Data Management (experiment & simulation)
  o Sensitivity (statistical) analysis of data/simulations [fitting]
  o Optimal Experimental Design
  o Computational “Steering” (Decision Support System)
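The “simple fitting” case, sketched with scipy’s curve_fit on synthetic peak data (the peak parameters are invented; in practice this could equally be Mantid’s Fit algorithm):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, sigma, background):
    return amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2) + background

# Fake "live" data: a Bragg-like peak on a flat background with noise.
x = np.linspace(0.9, 1.1, 200)
y = gaussian(x, 1000.0, 1.0, 0.01, 50.0) + np.random.normal(0, 10, x.size)

popt, pcov = curve_fit(gaussian, x, y, p0=[800.0, 1.0, 0.02, 40.0])
amp, center, sigma, bkg = popt
print(f"center = {center:.4f}, FWHM = {2.3548 * sigma:.4f}")
```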
32 ADD2013 P.F.Peterson
“Data on disk is useless” → Analysis
• Data on disk is “useless” – analyzed data leads to scientific discovery
• Users should analyze the data… but we (the facilities) will help:
  ― Download/transfer data
  ― Access to compute resources
  ― Access to software development tools
  ― Access to “standard” analysis packages
  ― Develop “new” analysis packages
  ― Assimilate/curate (user) analysis packages
  ― Help/collaboration using analysis packages
(Analysis = modeling/simulation/fitting/statistics)
• Mantid, DANSE software: Virtual Neutron Facility (VNF), SANSview, PDFgui
• Compilers/Languages – (Intel) FORTRAN/C++, Python, IDL, Matlab, ParaView
• Materials Studio, Amber, NAMD, Gromacs
• NERSC (VASP, LAMMPS,…)
33 ADD2013 P.F.Peterson
Using the ASCR Facilities
ERCAP award at NERSC:
• hopper.nersc.gov & carver.nersc.gov – capacity computational resources
• ERCAP award: Computational resources for the interpretation of neutron scattering data from the Spallation Neutron Source and HFIR (PI: M. Hagen)
• Grant of 4 million CPU-hours
• Time available on request for projects that need mid-level computational resources
• CNMS also has a user program allocation at NERSC
DD time project(s) on Smoky/Jaguar/Titan:
• We have a 4-million CPU-hour Director’s Discretion grant on the OLCF machines
• Intended for development of CAMM (later)
34 ADD2013 P.F.Peterson
Materials by Design
• BES proposal call (May 2012) in Predictive Theory and Modeling:
  Theoretical Condensed Matter Physics (MSED)
  Computational and Theoretical Chemistry (CSGBD)
• Center for Accelerating Materials Modeling (CAMM) – a partnership between:
  - Neutron Sciences: NDAV – Robert McGreevy & Mark Hagen
  - Physical Sciences: CNMS – Sean Smith & Bobby Sumpter; MSTD – Olivier Delaire
  - Computational Sciences: OLCF/CSMD – Galen Shipman & Bobby Sumpter
• Use (inelastic) neutron scattering to refine/define potentials for predictive theory; use predictive theory to feed back to (neutron scattering) experiments on where to measure
• Funded center proposals – CAMM & NMGC
• Nanoporous Materials Genome Center (NMGC) – led by U. Minnesota-Minneapolis (U. Washington, Rice, Berkeley,…) – MOFs by design ← needs a (genome of) MD force fields from ab initio & experiment (incl. neutron scattering)
35 ADD2013 P.F.Peterson
CAMM – Year 1
• Core component – refining MD force fields against inelastic & quasi-elastic neutron scattering data
• Develop refinement software & test with 2 science projects (a toy sketch of the loop follows below)
  Refinement package – DAKOTA (developed by Sandia National Lab/ASCR)
  Run an MD simulation → convert to S(Q,E) → apply neutron corrections → calculate the cost function
  Scientific workflow management – KEPLER (developed by ASCR)
• Initially starting with aqueous LiCl solution data from BASIS, with NAMD for simulation
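A self-contained toy of that loop, with scipy.optimize.minimize standing in for DAKOTA and a quasi-elastic Lorentzian standing in for the full MD → S(Q,E) → corrections pipeline (all names and values are illustrative, not CAMM code):

```python
import numpy as np
from scipy.optimize import minimize

E = np.linspace(-2.0, 2.0, 401)                 # energy transfer (meV)

def model_sqe(gamma):
    """Stand-in for: run MD with this force-field parameter, convert the
    trajectory to S(Q,E), and apply neutron corrections."""
    return gamma / (np.pi * (E**2 + gamma**2))  # quasi-elastic Lorentzian

# Fake "measured" BASIS-like data generated with gamma = 0.25 meV.
rng = np.random.default_rng(7)
measured = model_sqe(0.25) + rng.normal(0, 0.005, E.size)
errors = np.full(E.size, 0.005)

def cost(params):
    """Calculate the cost function: chi-square of model vs measurement."""
    return np.sum(((model_sqe(params[0]) - measured) / errors) ** 2)

result = minimize(cost, x0=[0.5], method="Nelder-Mead")
print("refined linewidth gamma =", result.x[0], "meV")
```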
• Coarse-grained MD (LAMMPS) on polyethylene oxide – acrylic acid (Bobby Sumpter)
[Figure: simulation snapshot (100 Å scale) showing Li+ counterions, acrylic acid, and ethylene oxide]
36 ADD2013 P.F.Peterson
CAMM: Phonons
• Thermoelectrics (PbTe) & ferroelectrics (KTN) – Olivier Delaire
• For single-crystal inelastic data we need to convolute S(Q,E) with the resolution function
• Fast convolution methods (see the sketch below)
• Spin waves – Randy Fishman (MSTD) & Georg Ehlers + Steven Hahn (QCMD)
[Figure: PbTe phonon dispersions, calculation (DFT) vs experiment (INS); E (meV) vs q from Γ]
Focus on the width of the dispersions: measured line widths need corrections for instrument resolution effects; this will support development of reliable line width predictions.
Delaire et al., Nature Materials (2011).
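A sketch of resolution broadening by fast FFT-based convolution (scipy.signal.fftconvolve); the mode position and resolution width below are assumed placeholders, not a real instrument’s resolution function:

```python
import numpy as np
from scipy.signal import fftconvolve

E = np.linspace(-5.0, 15.0, 2001)                  # energy axis (meV)
sqe = np.exp(-0.5 * ((E - 10.0) / 0.05) ** 2)      # sharp phonon mode at 10 meV

sigma = 0.5                                        # assumed resolution (meV)
dE = E[1] - E[0]
kernel = np.exp(-0.5 * (np.arange(-10 * sigma, 10 * sigma, dE) / sigma) ** 2)
kernel /= kernel.sum()                             # normalize the kernel

broadened = fftconvolve(sqe, kernel, mode="same")  # O(N log N) vs O(N^2)
print("peak height before/after:", sqe.max(), broadened.max())
```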
37 ADD2013 P.F.Peterson
Materials by Design (slide repeated – see slide 35)
38 ADD2013 P.F.Peterson
Acknowledgements
ADARA – funded by Laboratory Directed Research & Development at ORNL
PIs: Galen Shipman (CCSD), Mark Hagen (NScD)
David Dillow, Dale Stansberry, Ross Miller, Stuart Campbell, Russell Taylor, Mathieu Doucet, Shelly Ren, Jim Kohl, Marie Yao, Carol Tang, Andrei Savici, Michael Reuter, Peter Peterson, John Quigley, Rich Crompton, Clay England, Barry Winn
CAMM – funded by the Materials Science and Engineering Division, Office of Basic Energy Sciences, U.S. Dept. of Energy
PI: Mark Hagen; Co-PIs: Galen Shipman, Bobby Sumpter, Olivier Delaire
Jose Borreguero-Calvo, Vickie Lynch, Andrei Savici, Monojoy Goswami
Mantid – lots of people
39 ADD2013 P.F.Peterson
Questions?
40 ADD2013 P.F.Peterson