PIs in University of California, Davis active in the last 90 days
Allocations with low SU counts (10,000 or fewer) are usually educational allocations, startup allocations, or extensions. Allocations with fewer than 10 SUs are usually used for storage purposes.


Name Project Title TeraGrid Resource Discipline Board Type Base Allocation
Jawdat Al-Bassam Structural mechanisms of molecular assemblies regulating the polymerization and biogenesis of microtubules PSC Regular Memory (Bridges) Cell Biology Startup 55,064
" " SDSC Dell Cluster with Intel Haswell Processors (Comet) " " 20,000
" " SDSC Medium-term disk storage (Data Oasis) " " 2,000
" " TACC Dell/Intel Knight's Landing System (Stampede2 - Phase 1) " " 1,563
" " TACC Long-term tape Archival Storage (Ranch) " " 500
" " PSC Storage (Bridges Pylon) " " 500
" " IU/TACC Jetstream " " 0
" " TACC Dell PowerEdge C8220 Cluster with Intel Xeon Phi coprocessors (Stampede) " " 0
Hajar Amini Construction of De novo transcriptome assembly to identify candidate pathway involved in the production of medicinal compounds in Ferula assafoetida IU/TACC Jetstream Biological Sciences Startup 100,000
" " IU/TACC Storage (Jetstream Storage) " " 1,000
Varaprasad Bandaru Developing Spatially Explicit Regional Modeling Framework for Studying Impacts of Poplar based Bioenergy Systems SDSC Dell Cluster with Intel Haswell Processors (Comet) Ecological Studies Startup 50,000
" " PSC Regular Memory (Bridges) " " 50,000
" " SDSC Medium-term disk storage (Data Oasis) " " 500
" " PSC Storage (Bridges Pylon) " " 500
C. Titus Brown Compute Infrastructure to Support the Data Intensive Biology Summer Institute for Sequence Analysis at UC Davis IU/TACC Jetstream Biological Sciences Educational 432,000
Roland Faller Direct phase equilibrium simulation of NIPAM oligomers in water and optimization of the potential XStream/Stanford University GPU Supercomputer (Cray CS-Storm, Intel Ivy-Bridge, NVIDIA K80) Chemistry Startup 5,000
Richard Grosberg The evolution and implications of life history in marine invertebrates: a genomic and transcriptomic analysis PSC Regular Memory (Bridges) Systematic and Population Biology Research 1,604,929
" " PSC Storage (Bridges Pylon) " " 2,048
" " PSC Large Memory Nodes (Bridges Large) " " 1,781
Melissa Kardish The role of microbiota in mediating local adaptation and plant influence on ecosystem function in a marine foundation species, Zostera marina SDSC Dell Cluster with Intel Haswell Processors (Comet) Ecological Studies Startup 50,000
" " SDSC Medium-term disk storage (Data Oasis) " " 500
Louise Kellogg CIG Science Gateway and Community Codes for the Geodynamics Community TACC Dell/Intel Knights Landing System (Stampede2 - Phase 1) Geophysics Research 26,470
" " HP/NVIDIA Interactive Visualization and Data Analytics System (Maverick) " " 15,000
" " TACC Dell PowerEdge C8220 Cluster with Intel Xeon Phi coprocessors (Stampede) " " 12,063
" " TACC Long-term tape Archival Storage (Ranch) " " 10,000
John Naliboff Testing the scalability and numerical efficiency of long-term tectonic models of continental extension SDSC Dell Cluster with Intel Haswell Processors (Comet) Geophysics Startup 50,000
" " SDSC Medium-term disk storage (Data Oasis) " " 500
Daniel Standage Training in cyberinfrastructure for data intensive biology IU/TACC Jetstream Biological Sciences Startup 50,000
" " IU/TACC Storage (Jetstream Storage) " " 2,400
Dean Tantillo MECHANISMS OF BIOORGANIC AND ORGANOMETALLIC CYCLIZATION REACTIONS PSC Regular Memory (Bridges) Organic and Macromolecular Chemistry Research 1,565,000
" " SDSC Dell Cluster with Intel Haswell Processors (Comet) " " 1,079,000
" " PSC Storage (Bridges Pylon) " " 500
" " SDSC Medium-term disk storage (Data Oasis) " " 500
Igor Vorobyov Elucidation of molecular mechanisms of sex-dependent pro-arrhythmia through hERG block by drugs and steroid hormones SDSC Dell Cluster with Intel Haswell Processors (Comet) Biophysics Startup 50,000
" " XStream/Stanford University GPU Supercomputer (Cray CS-Storm, Intel Ivy-Bridge, NVIDIA K80) " " 5,000
" " SDSC Comet GPU Nodes (Comet GPU) " " 2,500
" " SDSC Medium-term disk storage (Data Oasis) " " 1,000
Andrew Wetzel Simulating the Local Group TACC Dell PowerEdge C8220 Cluster with Intel Xeon Phi coprocessors (Stampede) Extragalactic Astronomy and Cosmology Research 1,847,880
" " TACC Dell/Intel Knight's Landing System (Stampede2 - Phase 1) " " 56,296
" " TACC Long-term tape Archival Storage (Ranch) " " 50,000
Matthew Williamson Spatially explicit estimates of the likelihood of conservation action Open Science Grid (OSG) Ecological Studies Startup 100,000
" " PSC Regular Memory (Bridges) " " 10,000
" " PSC Large Memory Nodes (Bridges Large) " " 1,000
" " PSC Storage (Bridges Pylon) " " 500
Vladimir Yarov-Yarovoy State-dependent drug modulation of sodium channels Open Science Grid (OSG) Biophysics Startup 200,000
" " PSC Regular Memory (Bridges) " " 50,000
" " LSU Cluster (superMIC) " " 50,000
" " XStream/Stanford University GPU Supercomputer (Cray CS-Storm, Intel Ivy-Bridge, NVIDIA K80) " " 5,000
" " PSC Bridges GPU (Bridges GPU) " " 2,500
" " PSC Storage (Bridges Pylon) " " 1,000

Project Abstract

Structural mechanisms of molecular assemblies regulating the polymerization and biogenesis of microtubules

PI: Jawdat Al-Bassam



Microtubules are dynamic intracellular tubular protein polymers that regulate cell shape, form bipolar mitotic spindles, and regulate the migration of eukaryotic cells. Microtubules polymerize and depolymerize from alpha-beta tubulin heterodimers maintained in the cytoplasm of eukaryotic cells. A wide variety of intracellular molecular machines, conserved across eukaryotes, accelerate all aspects of microtubule polymerization, depolymerization, and the assembly of alpha-beta tubulin dimers. Microtubule-based motor assemblies mediate intracellular motility. These molecular assemblies mediate these activities by binding individual tubulin dimers and interacting with microtubule polymers. Our research group focuses on the physical mechanisms of these regulatory complexes. We use X-ray crystallography and cryo-electron microscopy to study the structures of these molecular assemblies in complex with microtubules or tubulin dimers. Structure determination requires extensive computational resources, using image-analysis programs such as RELION or SPARX that apply statistical and maximum-likelihood approaches to determine high-resolution structures of macromolecules. Our goal in this startup allocation is to use these programs to determine structures of macromolecular assemblies in different states, revealing their physical organization and the conformational changes they undergo during their activity cycles.

Project Abstract

Construction of De novo transcriptome assembly to identify candidate pathway involved in the production of medicinal compounds in Ferula assafoetida

PI: Hajar Amini



Ferula assafoetida is an important source of oleo-gum-resins such as asafoetida, which is used therapeutically for conditions such as inflammation, neurological disorders, digestive disorders, rheumatism, headache, arthritis, and dizziness. It is therefore important to determine the biological properties of oleo-gum-resin compounds isolated from F. assafoetida. However, in spite of the known medicinal attributes of compounds from F. assafoetida, most of these compounds, as well as the enzymes involved in their biosynthesis, remain uncharacterized at the molecular level. We therefore decided to evaluate the transcriptome and metabolome of different tissues of F. assafoetida to identify candidate mechanisms and pathways involved in the production of some important medicinal compounds. This proposal requests resources from the Jetstream cloud for assembling the Ferula transcriptome from RNA-Seq reads generated from three different plant species and four different tissues. A de novo transcriptome assembly will be constructed using Trinity after the reads have been subjected to quality trimming and digital normalization. De novo assembly is a highly memory-intensive and time-consuming process, and it involves several iterations of running the assembler with different k-mer sizes until an optimal assembly is generated. Once the assembly is generated, it will be assessed using tools such as Transrate and BUSCO. The final part of the analysis will be annotating the assembled transcriptome using the dammit software. Currently I am using the High Performance Computing cluster at UC Davis for the initial assembly, but there is a long wait time to start any analysis on the cluster. An allocation on the public Jetstream cloud, with persistent storage, will allow us to explore this data set further and run the whole pipeline easily and rapidly.
Our results from this analysis will facilitate studies on the functions of genes involved in secondary-metabolite biosynthesis pathways in other medicinal plants. Furthermore, the metabolic-pathway information from this transcriptome is very valuable for understanding the biosynthesis of oleo-gum-resins, such as where they are produced and which tissues transport them to other parts of the plant. Resource request: to achieve these goals, I request the following: 100,000 SUs on an s1.xxlarge instance (44 CPUs, 120 GB memory, 480 GB disk), and 1 TB of external volume space for storing my raw RNA-Seq reads as well as all the outputs and intermediate files generated from the analysis.
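The digital-normalization step in the pipeline above can be sketched in a few lines of Python. This is a simplified illustration of the khmer-style algorithm (discard a read once the median count of its k-mers reaches a coverage cutoff), not the actual tool used in the project; the k-mer size and cutoff here are arbitrary.

```python
from collections import defaultdict
from statistics import median

def kmers(seq, k=20):
    """All overlapping k-mers of a read."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def digital_normalization(reads, k=20, cutoff=20):
    """Keep a read only while the median count of its k-mers is below
    the coverage cutoff; redundant high-coverage reads are discarded,
    which shrinks memory use for the downstream Trinity assembly."""
    counts = defaultdict(int)
    kept = []
    for read in reads:
        km = kmers(read, k)
        if not km:
            continue
        if median(counts[x] for x in km) < cutoff:
            kept.append(read)
            for x in km:
                counts[x] += 1
    return kept
```

With a cutoff of 20, thirty identical reads collapse to the first twenty; coverage beyond the cutoff adds no new k-mer information.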

Project Abstract

Developing Spatially Explicit Regional Modeling Framework for Studying Impacts of Poplar based Bioenergy Systems

PI: Varaprasad Bandaru



As a renewable energy source, biofuels are expected to play an important role in sustainably meeting long-term U.S. energy goals. As part of a larger regional research and development initiative focused on sustainable ways of producing biofuel from poplar in the U.S. Pacific Northwest (for details, visit http://hardwoodbiofuels.org), we are interested in modeling hybrid poplar at the regional level, including 1) identifying potential locations for growing poplar plantations; 2) assessing the inherent biomass potential of suitable locations; and 3) evaluating the environmental and economic impacts of adopting hybrid poplar in place of current croplands and conserved grasslands. For this assessment, we plan to implement the Environmental Policy Integrated Climate (EPIC) model at high spatial resolution. EPIC is an integrated biophysical and biogeochemical simulation model that can be used to assess available bioenergy feedstock, water and soil quality, greenhouse gas emissions, and nutrient loss under various climate and management conditions. Because EPIC is a point-scale model, when it is applied spatially each pixel is treated as one simulation point. Using earlier allocations, we built a framework to run EPIC with parallel computing and simulated small areas of the Pacific Northwest, but the framework still needs to be applied to all croplands and grasslands in the region. We therefore need continued access to Gordon and request renewal of our project for another year.
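Because each pixel is an independent point simulation, the parallel framework described above reduces to a map over pixels. The sketch below illustrates that decomposition with Python's multiprocessing; the toy yield function is a placeholder, since a real driver would stage each pixel's soil and weather inputs and invoke the EPIC executable.

```python
from multiprocessing import Pool

def run_point_model(pixel):
    """Stand-in for a single EPIC point-scale simulation. A real driver
    would write the pixel's soil and weather inputs and call EPIC; a toy
    yield response keeps the sketch self-contained."""
    row, col, rainfall_mm = pixel
    biomass = rainfall_mm / 100.0   # illustrative only, not an EPIC result
    return (row, col, biomass)

def simulate_region(pixels, workers=None):
    """Map the point model over every pixel. Each pixel is independent,
    so the regional run is embarrassingly parallel."""
    if workers is None:             # serial fallback, handy for testing
        return [run_point_model(p) for p in pixels]
    with Pool(workers) as pool:
        return pool.map(run_point_model, pixels)
```

On a cluster the same pattern scales out with one scheduler task (or MPI rank) per batch of pixels; the decomposition itself does not change.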

Project Abstract

Compute Infrastructure to Support the Data Intensive Biology Summer Institute for Sequence Analysis at UC Davis

PI: C. Titus Brown



Large datasets have become routine in biology. However, performing a computational analysis of a large dataset can be overwhelming, especially for novices. From June 18 to July 21, 2017 (30 days), the Lab for Data Intensive Biology will run several computational training events at the University of California, Davis for 100 people and 25 instructors. In addition, there will be a week-long instructor training in how to reuse our materials, as well as focused workshops such as GWAS for veterinary animals, shotgun environmental -omics, Binder, non-model RNAseq, introduction to Python, and lesson development for undergraduates. The workshop materials were previously developed and tested by approximately 200 students on Amazon Web Services cloud compute at Michigan State University's Kellogg Biological Station between 2010 and 2016, with support from the USDA and NIH. Materials are and will continue to be CC-BY, with scripts and associated code under BSD; the material will be adapted for Jetstream cloud usage and made available for future use.

Project Abstract

Direct phase equilibrium simulation of NIPAM oligomers in water and optimization of the potential

PI: Roland Faller



Poly(N-isopropylacrylamide) (PNIPAM) is one of the best-studied thermoresponsive materials. It can be used in a wide range of applications, including catalysis, sensors, enzyme encapsulation, and drug delivery, which makes it very desirable to understand the molecular behavior of PNIPAM. In water, PNIPAM shows a lower critical solution temperature (LCST) at 305 K and a conformational transition of single chains at the same temperature. Below this temperature PNIPAM is completely soluble in water, but above the LCST water and PNIPAM separate into two pure phases. In recent years this behavior has been studied with atomistic molecular dynamics (MD) simulations to gain a deeper understanding of the mechanisms leading to the phase separation. Because the molecular mechanisms are very complex, to date the correct phase behavior has not been simulated without an error in the LCST. For better results in MD simulations, a modification of the potential for PNIPAM can be introduced such that the LCST is shifted to the experimentally observed value. The objective of this work is to adapt such a modification of the potential so that it fits the experimentally observed data. To achieve this goal, MD simulations of oligo-NIPAM using the Amber94 + TIP3P force fields will be performed. The modification parameter will then be fitted to match the experimental results. For this purpose, experimental results for NIPAM oligomers (ONIPAM), synthesized at the Leibniz-Institut für Interaktive Materialien (DWI), are available. The outcome will be a model that can reproduce the real LCST of ONIPAM.

Project Abstract

The evolution and implications of life history in marine invertebrates: a genomic and transcriptomic analysis

PI: Richard Grosberg



Our lab aims to develop genetic resources for non-model organisms in order to better understand the evolutionary causes and consequences of the diversification of life histories. We have six ongoing projects, each taking a different approach to determining the evolutionary and ecological effects of life history on different organisms. Our projects integrate field work, bench work, and computational work, and all of them depend on vast amounts of next-generation sequencing data. We are requesting computational time primarily to assemble transcriptomes and genomes for several species, as well as to analyze large RADseq datasets. Given the size of these data, we require access to computational resources beyond what we have in our laboratory.

Project Abstract

The role of microbiota in mediating local adaptation and plant influence on ecosystem function in a marine foundation species, Zostera marina

PI: Melissa Kardish



A growing body of research suggests that microbiota interact with plants and animals to alter host fitness and disease resistance. Furthermore, microbiome composition can vary among host genotypes and environments, and may contribute to observed variation in host phenotype. Individual variation in phenotype within key species, such as foundation plant species or keystone consumers, affects the structure and functioning of entire ecosystems, providing a potentially important mechanism by which microbiomes contribute to the functioning of macroscopic ecosystems. However, few experiments test causal links between host phenotype and microbiome composition, and, outside of a few model systems, virtually no studies examine the cascading effects of variation in a host's microbiome on communities or ecosystems. I conducted a series of reciprocal transplants of the marine angiosperm Zostera marina and have sequenced the V4-V5 region of the bacterial 16S rRNA gene from leaves, roots, and adjacent sediment. This will allow me to examine the sources of natural variation in the microbiome of Z. marina (eelgrass) and the potential consequences of microbiome composition for host fitness, host local adaptation, and the effect of eelgrass on ecosystem structure and functioning. To accomplish this analysis, I would like to use QIIME on the Gordon computing cluster to process the 16S data from these transplants as well as from temporal sampling.
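As an illustration of the kind of per-sample summary such a 16S analysis produces, the Shannon diversity index can be computed directly from a vector of taxon counts. This is a generic sketch of one standard alpha-diversity metric, not the QIIME workflow itself.

```python
from math import log

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over taxa with nonzero
    counts, where p_i is the relative abundance of taxon i in a sample."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in props)
```

A sample dominated by one taxon scores near zero, while a perfectly even community of n taxa scores ln(n).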

Project Abstract

CIG Science Gateway and Community Codes for the Geodynamics Community

PI: Louise Kellogg



The Computational Infrastructure for Geodynamics (CIG), an NSF cyberinfrastructure facility, aims to enhance the capabilities of the geodynamics community by developing software that can be used to address a range of challenging problems in geophysics. CIG supports code development and benchmarking, user training, and new users by providing small allocations of computing time along with user support for CIG codes. CIG supports these efforts in the following areas of activity: mantle dynamics, seismic wave propagation, geodynamo, and crustal and lithospheric dynamics on both million-year and earthquake time scales. These efforts have resulted in successful allocation requests by our community and in the involvement of international researchers in benchmarking the next generation of geodynamo codes, all of which were enabled by our community allocation.

Project Abstract

Testing the scalability and numerical efficiency of long-term tectonic models of continental extension

PI: John Naliboff



This request for a startup allocation on the XSEDE-supported cluster Comet follows preliminary scaling tests on Comet and prior work on the Stampede1 cluster related to my research at the Computational Infrastructure for Geodynamics (CIG). As a project scientist at CIG, my work centers on developing, testing, and applying the finite-element code ASPECT to simulations of long-term tectonic deformation (viscous and brittle behavior) in the solid earth. ASPECT is built on the open-source finite element library deal.II, which provides massive scalability across 10^3-10^4 cores, adaptive mesh-refinement capabilities, and robust linear and nonlinear solvers. To date, strong and weak scaling tests with ASPECT have been performed on a wide range of clusters, including Stampede1, Lonestar, HLRN (Berlin), and many smaller clusters. These scaling results have been published in multiple peer-reviewed articles and are also contained in the CIG proposal for computing on Stampede1: "CIG Science Gateway and Community Codes for the Geodynamics Community" (TG-EAR080022N). Here, I am applying for a startup allocation of 50,000 core-hours on Comet to perform additional scaling tests with ASPECT and to test the relative efficiency of different model configurations (nonlinear solver tolerances, linear solver schemes, etc.) for 3-D simulations of continental extension. These simulations of continental extension build on extensive 2-D and 3-D sensitivity tests for relatively small model sizes (< 10^7 degrees of freedom) and a limited number (< 5) of large 3-D simulations (> 10^8 degrees of freedom) run on Stampede1. Through a small trial allocation (1,000 SUs) on Comet, I have performed strong and weak scaling tests on up to 96 cores for models ranging from ~60,000 to ~16,000,000 degrees of freedom. The requested startup allocation will be used in part for scaling tests that examine models with up to 10^9 degrees of freedom run across hundreds or thousands of cores (up to 1,536).
Notably, these scaling tests are based on relatively simple models that require only linear solvers and do not contain large variations in material parameters. To ensure the code scales efficiently on Comet for models with a nonlinear rheology and large (orders-of-magnitude) variations in material properties, I will perform a second series of scaling tests with a model setup derived from earlier simulations of continental breakup. While these two series of scaling tests will likely require on the order of 10-20 thousand core-hours, those models are run for only a small number (1-2) of time steps. In contrast, the simulations of continental breakup require thousands of time steps, during which the dynamics can change significantly. To ensure that the predicted scaling behavior extends throughout the model duration, the remaining core-hours (30-40 thousand) will be used to run one large simulation to completion. This estimate is based on a preliminary 3-D simulation of continental extension (~10^8 degrees of freedom) run on Stampede1. As an example, one model required 10.333 hours on 960 cores (~9,920 core-hours) to run 25% of the simulation time planned for future models. The results of the scaling tests and trial simulation outlined above will form the basis of a proposal requesting further computing time on Comet for a series of production runs. If further details about ASPECT or the planned scaling tests are required, I will provide them promptly. Thank you for considering this startup allocation request of 50,000 core-hours; I look forward to hearing from you.
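The core-hour accounting and strong-scaling summaries used above are simple to make explicit. The sketch below is illustrative only; the timing table in the usage is made up, not measured ASPECT data.

```python
def core_hours(wall_hours, cores):
    """SU-style accounting: wall time times core count, e.g.
    ~10.333 h on 960 cores is ~9,920 core-hours, as cited above."""
    return wall_hours * cores

def strong_scaling(timings):
    """Speedup and parallel efficiency for a fixed problem size.
    `timings` maps core count -> measured wall time; the smallest
    core count serves as the baseline."""
    base_cores = min(timings)
    base_time = timings[base_cores]
    table = {}
    for cores, t in sorted(timings.items()):
        speedup = base_time / t
        table[cores] = (speedup, speedup * base_cores / cores)
    return table
```

For example, with hypothetical timings {96: 100.0, 192: 50.0} hours, doubling the cores halves the wall time, giving a parallel efficiency of 1.0 at 192 cores.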

Project Abstract

Training in cyberinfrastructure for data intensive biology

PI: Daniel Standage



This proposal requests computing resources on the IU/TACC Jetstream platform in support of training and outreach efforts in the Lab for Data Intensive Biology at UC Davis. Through our lab's research collaborations, workshops, rotation-student mentoring, and informal weekly "meet and analyze data" sessions, we frequently encounter scientists faced with bioinformatics computing problems beyond what their training has prepared them for. Accordingly, along with our research software tools we have published several protocols for the analysis of genome-scale data (http://khmer-protocols.readthedocs.org/). These protocols are now one of our go-to resources for on-boarding new students, collaborators, and colleagues. However, one lingering obstacle in these training efforts is that the laptop and desktop computers most scientists have access to are insufficient even for these basic introductory protocols, let alone for their ongoing computing needs. The immediate goal of this allocation is to address the need for compute resources in these training and outreach activities. As a secondary result, we hope and expect that experience with these resources will help our colleagues become more independent, empowered to write their own research allocation proposals and to use the diversity of cyberinfrastructure resources at their disposal.

Project Abstract

MECHANISMS OF BIOORGANIC AND ORGANOMETALLIC CYCLIZATION REACTIONS

PI: Dean Tantillo



The focus of the research proposed herein, a renewal of CHE030089N, is to apply modern quantum chemical methods to the elucidation of molecular mechanisms of organic chemical reactions that are used in the synthesis and biosynthesis of polycyclic organic molecules. During this award period, we will focus on cation-promoted polycyclization reactions involved in biosyntheses of terpene natural products (expanded from the previous grant period), and will broaden our efforts to include Rh-promoted cyclization reactions. During this grant period we will focus our efforts on direct/ab initio molecular dynamics calculations and extensive conformational searches, which are the most time-consuming calculations we carry out.

Project Abstract

Elucidation of molecular mechanisms of sex-dependent pro-arrhythmia through hERG block by drugs and steroid hormones

PI: Igor Vorobyov



Common and sometimes fatal heart rhythm disorders such as long QT syndrome (LQTS) have been linked to mutations in cardiac ion channels as well as to unwanted drug interactions with those proteins. Female sex has been shown to be an independent risk factor for both inherited and acquired LQTS and associated arrhythmias, tentatively correlated with differential levels of sex hormones (estradiol, progesterone, and testosterone) that play opposite roles in proclivities for heart rhythm disturbances. There is a critical need to understand cardiac ion channel modulation by drugs and/or sex hormones at the molecular level in order to develop safer and more effective therapeutics. We will focus on drug and/or hormone interactions with the human ether-a-go-go-related gene (hERG) potassium channel (KV11.1), a major contributor to cardiac action potential repolarization and an anti-target for diverse drug molecules. As a first step, we propose atomistic modeling and simulation approaches to compute binding affinities of hormones and LQTS-inducing drugs such as dofetilide. A recent cryo-EM structure of hERG will be used for these studies. We will use quantum mechanical (QM) calculations with the Gaussian software to develop and/or validate drug and hormone force field parameters. Drug and hormone binding to hERG will be tested using both long unbiased drug/hormone "flooding" simulations and multi-window restrained umbrella sampling (US) molecular dynamics (MD) simulations using NAMD. We therefore request the following XSEDE allocations: Comet (SDSC), 50,000 CPU hours (to be used for QM and US MD simulations); XStream (Stanford), 5,000 GPU hours (or Comet GPU when it becomes available) for "flooding" MD. These estimates are based on benchmarks on our local resources (a small 10-node GeForce GTX 1080 / Xeon E5-2620 GPU/CPU cluster and workstations). Equivalent resource substitution, or using the Open Science Grid up to the allowed maximum, is a reasonable alternative for this project as well.
The proposed allocation will be used for the few runs described above, providing preliminary scientific data (including feasibility and convergence) and benchmarking data for a larger scientific allocation request to be submitted in the near future.
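The multi-window umbrella sampling mentioned above divides a reaction coordinate (such as a drug's distance from the channel pore) into overlapping windows, each restrained by a harmonic bias. The sketch below only sets up illustrative window centers and the bias energy; the spacing and force constant are hypothetical, not the authors' actual NAMD settings.

```python
def umbrella_windows(start, stop, n_windows, k=2.5):
    """Evenly spaced window centers along a reaction coordinate (e.g.
    drug distance from the hERG pore, in angstroms), plus the harmonic
    bias U(x) = 0.5 * k * (x - x0)^2 applied in each window
    (k in kcal/mol/A^2; the value here is illustrative)."""
    step = (stop - start) / (n_windows - 1)
    centers = [start + i * step for i in range(n_windows)]

    def bias(x, x0, k=k):
        return 0.5 * k * (x - x0) ** 2

    return centers, bias
```

Each window is simulated independently, and a reweighting method such as WHAM then stitches the biased windows into a single potential of mean force.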

Project Abstract

Simulating the Local Group

PI: Andrew Wetzel



We request a renewal allocation to continue our ultra-high-resolution simulations of galaxy evolution, star formation, and stellar feedback, with which we will study the physics of the interstellar medium (ISM), the formation of stars, stellar feedback, galaxy formation, and the cosmological distribution of dark matter, with new physics and game-changing resolution. This renewal will allow us to build on the significant numerical and physical advances that our previous XSEDE research allocations have enabled, in order to run a targeted suite of simulations, using dark matter + gas + stars together with state-of-the-art treatments of stellar physics, to understand the Local Group, comprising the Milky Way (MW), Andromeda (M31), the Large Magellanic Cloud (LMC), and numerous satellite dwarf galaxies. A wealth of exciting ongoing and upcoming observational projects target near-field cosmology and galactic archaeology in the Local Group by measuring stellar populations and the phase-space distribution of stars in and around the Milky Way, Andromeda, and their satellite dwarf galaxies at unprecedented levels, including the Hubble Space Telescope, the SDSS-APOGEE survey, the Dark Energy Survey (DES), the Gaia mission (surveying 1 billion stars), and LSST. These observational campaigns are revolutionizing our understanding of galaxy formation, from massive galaxies like the Milky Way to the faintest known dwarf galaxies, as well as the nature of dark matter on the smallest cosmological scales. However, interpreting and understanding these results, including making predictions for upcoming observations, requires ultra-high-resolution cosmological simulations that can resolve structure on 1 pc scales, include the necessary physics of hydrodynamics, star formation, and feedback, and are carefully targeted to the environment of the Local Group.
Thus, we propose a suite of cosmological zoom-in simulations targeted to the Milky Way, Andromeda, and the LMC, all at unprecedentedly high resolution, to provide much-needed theoretical insight, guidance, and mock datasets for these observations. Each of our simulated systems will be resolved with > 200 million particles and followed self-consistently over its entire history to the present day in live cosmological settings carefully matched to the Local Group environment, using our state-of-the-art Feedback In Realistic Environments (FIRE) physics model. These simulations will not only represent a significant numerical improvement over previous work in terms of resolution, but also enable us to model the physics of the Local Group with unprecedented realism, as our simulations uniquely incorporate, in a self-consistent manner, all of the important stellar feedback mechanisms: radiation pressure in the ultraviolet, optical, and infrared; stellar winds; supernova explosions of Types I & II; and photoionization heating in HII regions. Because of their unprecedented resolution, physical realism, and careful matching to the Local Group environment, our proposed simulations will address a wide array of timely scientific questions. For galaxies like the Milky Way and Andromeda, we will study in detail (1) gas accretion, angular momentum transport, and its role in disk formation, including the impact of close pairs of galaxies like the Milky Way and Andromeda, (2) turbulent cascades in the ISM induced by cosmic accretion and stellar feedback, (3) stellar migration and chemical mixing within the disk, (4) the impact of massive satellites/subhalos on kinematic heating of the disk, and (5) how 6-D measurements of stellar orbits in the halo can be used to reconstruct the underlying mass/potential of the Milky Way.
Moreover, our simulations are the first that span the dynamic range needed to model self-consistently the satellite dwarf galaxies that are observed around the Milky Way and Andromeda, while including the relevant baryonic physics to predict the properties of their observed stars: from massive satellites like the LMC with M∗ = 2 × 10⁹ M⊙ to faint dwarf galaxies with M∗ ∼ 10⁵ M⊙. Because they are so faint and dark-matter dominated, such “dwarf” galaxies represent a key frontier field for testing (1) the Cold Dark Matter (CDM) paradigm of cosmology, (2) the epoch of re-ionization, and (3) the most extreme regimes of galaxy evolution. Our initial results from our previous XSEDE allocation have demonstrated that our physically motivated stellar feedback models can resolve several outstanding mysteries regarding the properties of these faint satellite galaxies. Our proposed simulations thus represent the culmination of several years of work supported by XSEDE, involving code development and optimization customized for the physical problems of galaxy formation. Our simulations also will be critical to enable the NSF-funded work of several graduate students and postdocs. In this renewal, we request 15 million SUs to run a suite of targeted simulations. Specifically, we will run the following simulations, carefully designed for targeted questions regarding the environment of the Local Group: 3 realizations of Local Group-like pairs of Milky Way- and Andromeda-like galaxies (10.5 million SUs), and 3 realizations of Milky Way-like galaxies with LMC-like satellites (3 million SUs). In each case, 3 realizations represent the minimum reasonable number for statistically significant results and for surveying the scatter in galaxy formation histories. To compare with Local Group observations, we must run each simulation across its entire formation history to the present day (z = 0).

Project Abstract

Spatially explicit estimates of the likelihood of conservation action

PI: Matthew Williamson



Conservation is an inherently human endeavor - initiated, designed, and deployed by humans to alter future behavior or undo previous impacts to effect positive changes in biodiversity. Predicting where conservation will occur in the future, however, remains a challenge. We propose a conceptual model in which conservation action is determined by gradients of ecological value, individual willingness, and institutional ability. Conservation action may occur at any location along this 3-dimensional continuum, but becomes increasingly likely as ecological value, individual willingness, and institutional ability simultaneously approach their maxima. We demonstrate this approach using available high-resolution, spatially explicit data on demographics, economic drivers, institutional characteristics, and environmental conditions to evaluate the degree to which the spatial coincidence of these factors affects the likelihood of conservation. We use Bayesian hierarchical models that treat past conservation actions as probabilistic outcomes of interacting ecological, institutional, and social covariates. Multi-model inference and hierarchical variance partitioning then identify the key explanatory variables and evaluate the relative importance of each factor in explaining past conservation. Finally, we implement these models in a GIS to generate probabilistic surfaces of the likelihood of conservation action, identifying where conservation is likely in the future.
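The core of the conceptual model - a probability of conservation action that rises as all three gradients approach their maxima - can be sketched as a simple logistic function. The weights and intercept below are illustrative placeholders, not fitted values; the actual models are hierarchical and estimated from data:

```python
import math

def conservation_likelihood(ecological_value, willingness, ability,
                            weights=(1.0, 1.0, 1.0), intercept=-3.0):
    """Illustrative logistic probability of conservation action.

    Inputs are the three gradients from the conceptual model,
    normalized to [0, 1]. The weights and intercept are hypothetical
    placeholders standing in for fitted model coefficients.
    """
    z = intercept + sum(w * x for w, x in
                        zip(weights, (ecological_value, willingness, ability)))
    return 1.0 / (1.0 + math.exp(-z))

# Likelihood increases as the three gradients jointly approach their maxima.
low = conservation_likelihood(0.1, 0.1, 0.1)
high = conservation_likelihood(0.9, 0.9, 0.9)
```

Applied cell-by-cell over spatially explicit covariate rasters, a function like this yields the probabilistic surfaces described above.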

Project Abstract

State-dependent drug modulation of sodium channels

PI: Vladimir Yarov-Yarovoy



The goal of this project is to study the molecular mechanisms of voltage-gated sodium (Nav) channel gating and modulation using molecular dynamics (MD) simulations. Our proposal will take advantage of several recent breakthroughs in the field of Nav channel structure: (1) a cryo-electron microscopy (cryoEM) structure of the first eukaryotic Nav channel (with the pore-forming domain in the closed state and voltage-sensing domains in either activated or intermediate states); (2) new X-ray structures of bacterial Nav channels (with the pore-forming domain in its open state); and (3) our own stable, ion-conductive open-state models of a bacterial Nav channel, generated using the Rosetta computational modeling software and MD simulations. We propose to simulate our new Rosetta structural models of human Nav channels in open, closed, and inactivated states. This will enable us to demonstrate the molecular mechanisms of channel activation and inactivation. Experimental studies have identified the structural regions forming the binding sites of small-molecule inhibitors on Nav channels, yet the molecular mechanisms of modulation remain unclear. The proposed simulations on XSEDE supercomputers will significantly advance our basic knowledge of Nav channel gating and modulation, providing new understanding that may lead to novel therapeutics for neurological, muscular, and cardiac diseases.