
Current Campus Champions

Current Campus Champions are listed by institution. Whether an institution is in an Established Program to Stimulate Competitive Research (EPSCoR) jurisdiction or is a minority-serving institution (MSI) is also indicated.

 

Campus Champion Institutions  
Total Academic Institutions 237
      Academic institutions in EPSCoR jurisdictions 73
      Minority Serving Institutions 47
      Minority Serving Institutions in EPSCoR jurisdictions 16
Non-academic, not-for-profit organizations 24
Total Campus Champion Institutions 261
Total Number of Champions 496

LAST UPDATED: August 15, 2018

See also the lists of the Leadership Team and Regional Leaders, Domain Champions, and Student Champions.

Institution Campus Champions EPSCoR MSI
Albany State University Olabisi Ojo  
Albert Einstein College of Medicine Jacob Pessin    
Arizona State University Sean Dudley, Barnaby Wasson, Johnathan Lee, Lee Reynolds, William Dizon, Jorge Henriquez, Ian Shaeffer, Dalena Hardy, Joshua Watts, Gil Speyer    
Arkansas State University Hai Jiang  
Auburn University Tony Skjellum  
Auburn University at Montgomery  
Austin Peay State University Justin Oelgoetz    
Bates College Kai Evenson  
Bentley University Jason Wells    
Bethune-Cookman University Ahmed Badi  
Boise State University Kyle Shannon, Jason Watt, Kelly Byrne  
Boston University Shaohao Chen, Brian Gregor, Katia Oleinik    
Brown University Helen Kershaw, Maximillian King  
California Baptist University Linn Carothers  
California Institute of Technology Tom Morrell    
California State Polytechnic University-Pomona Chantal Stieber  

Carnegie Institution for Science Floyd A. Fayton, Jr.    
Carnegie Mellon University Franz Franchetti    
Case Western Reserve University Emily Dragowsky    
Centre College David Toth  
Children's Research Institute, Children's Mercy Kansas City Shane Corder    
Citadel Military College of South Carolina John Lewis  
Claremont McKenna College Jeho Park    
Clark Atlanta University Dina Tandabany  
Clarkson University Brian Helenbrook, Jeeves Green    
Clemson University Marcin Ziolkowski, Xizhou Feng, Ashwin Srinath, Jeffrey Denton  
Cleveland Clinic Foundation Iris Nira Smith, Daniel Blankenberg    
Clinton College Terris S. Riley
Coastal Carolina University Will Jones, Thomas Hoffman  
Colby College Randall Downer  
College of Charleston Berhane Temelso  
College of William and Mary Eric Walter    
Colorado School of Mines Torey Battelle    
Colorado State University Tobin Magle    
Columbia University  George Garrett    
Complex Biological Systems Alliance Kris Holton    
Cornell University    
Dakota State University David Zeng  
Doane University Adam Erck, Mark Meysenburg  
Dominican University of California Randall Hall    
Drexel University David Chin    
Duke University Tom Milledge    
Earlham College Charlie Peck    

Federal Reserve Bank of Kansas City, Center for the Advancement of Data and Research in Economics (CADRE) BJ Lougee, Chris Stackpole, Brad Praiswater    
Felidae Conservation Fund Kevin Clark    
Fisk University  
Florida A&M University Hongmei Chi, Jesse Edwards  
Florida Atlantic University Rhian Resnick    
Florida International University David Driesbach, Cassian D'Cunha  
Florida Southern College Christian Roberson    
Florida State University Paul van der Mark    
George Mason University Jayshree Sarma, Dmitri Chebotarov    
George Washington University Adam Wong, Glen Maclachlan    
Georgia Southern University Brandon Kimmons    
Georgia State University Neranjan Edirisinghe Pathiran    
Georgia Institute of Technology Mehmet Belgin, Semir Sarajlic    
Gettysburg College Charles Kann    
Great Plains Network    
Harvard University Scott Yockel, Plamen Krastev, Francesco Pontiggia    
Harvard Medical School Jason Key    
Hood College Xinlian Liu    
Howard University  
Idaho National Laboratory Ben Nickell, Eric Whiting  
Idaho State University Dong Xu  
Indiana University Abhinav Thota, Junjie Li, Le Mai Weakley    
Indiana University of Pennsylvania John Chrispell    
Illinois Institute of Technology Jeff Wereszczynski     
Iowa State University James Coyle, Andrew Severin, Levi Baber    
The Jackson Laboratory Shane Sanders    
Jackson State University Carmen Wright
James Madison University Isaiah Sumner, Yasmeen Shorish    
Johns Hopkins University Anthony Kolasny, Jaime Combariza, Kevin Manalo    
Kansas State University  
Kennesaw State University Jon Preston, Dick Gayler    
Kentucky State University
KINBER Jennifer Oxenford    
Lafayette College Bill Thompson, Jason Simms    
Lamar University    
Langston University Abebaw Tadesse, Joel Snow
Lawrence Berkeley National Laboratory Andrew Wiedlea    
Lehigh University Alexander Pacheco    
Lock Haven University    
Louisiana State University Wei Feinstein  
Louisiana Tech University Don Liu  
Marquette University    
Marshall University Justin Chapman  
Massachusetts Green High Performance Computing Center (MGHPCC) Julie Ma    

Massachusetts Institute of Technology Christopher Hill, Lauren Milechin    
Medical University of South Carolina  
Michigan State University Andrew Keen, Yongjun Choi    
Michigan Technological University Gowtham    
Middle Tennessee State University Dwayne John    
Midwestern State University Eduardo Colmenares-Diaz    
Mississippi State University  
Missouri State University Matt Siebert    
Missouri University of Science and Technology Buddy Scharfenberg, Don Howdeshell    
Monmouth College    
Montana State University Jonathan Hilmer  
Montana Tech Bowen Deng  
Morehouse College Jigsa Tola, Doreen Stevens  
National University Ali Farahani  
Navajo Technical University
New Mexico State University Alla Kammerdiner, Diana Dugas
New York University    
North Carolina A & T State University Dukka KC  
North Carolina Central University Caesar Jackson, Alade Tokuta  
North Dakota State University Dane Skow, Nick Dusek, Oluwasijibomi Saula  
Northern Arizona University Christopher Coffey    
Northwest Missouri State University Jim Campbell    
Northwestern University Alper Kinaci    
Northwestern State University (Louisiana Scholars' College)  
Ohio State University Sandy Shew, Keith Stewart    
Ohio Supercomputer Center    
Oklahoma Innovation Institute John Mosher, George Louthan  
Oklahoma State University Dana Brunson, Jesse Schafer, Phillip Doehle, Evan Linde  
Old Dominion University Rizwan Bhutta, Wirawan Purwanto    
Oregon State University David Barber, CJ Keist, Chuck Sears, Todd Shechter    
OWASP Foundation Learning Gateway Project Bev Corwin, Laureano Batista, Zoe Braiterman, Noreen Whysel    
Penn State University Chuck Pavloski, Wayne Figurelle, Guido Cervone    
Portland State University William Garrick    
Princeton University Ian Cosden    
Purdue University Xiao Zhu, Tsai-wei Wu, Stephen Harrell, Marisa Brazil, Eric Adams    
Reed College Trina Marmarelli    
Rensselaer Polytechnic Institute Joel Giedt    
Rhodes College Brian Larkins    
Rice University Erik Engquist, Xiaoqin Huang, Clinton Heider    
Rutgers University Bill Abbott, Leslie Michelson, Paul Framhein, Galen Collier, Eric Marshall, Kristina Plazonic, Vlad Kholodovych    
SBGrid Consortium Jason Key    
Saint Louis University Eric Kaufmann    
Saint Martin University Shawn Duan    
San Diego State University Mary Thomas  
San Jose State University Sen Chiao  
Slippery Rock University of Pennsylvania Nitin Sukhija    
Smithsonian Conservation Biology Institute Jennifer Zhao    
Sonoma State University Mark Perri  
South Carolina State University Damian Clarke, Biswajit Biswal, Jagruti Sahoo    
South Dakota State University Kevin Brandt, Maria Kalyvaki  
Southeast Missouri State University Marcus Bond    
Southern Connecticut State University Yigui Wang    
Southern Illinois University Shaikh Ahmed, Chet Langin    
Southern Methodist University Amit Kumar, Merlin Wilkerson, Robert Kalescky    
Southern University and A & M College
Southwest Innovation Cluster Thomas MacCalla    
Southwestern Oklahoma State University Jeremy Evert  
Spelman College Yonas Tekle  
Stanford University Zhiyong Zhang    
Sustainable Horizons Institute Wendi Sapp    
Swarthmore College    
Temple University Richard Berger    
Tennessee Technological University Tao Yu    
Texas A & M University-College Station Rick McMullen, Dhruva Chakravorty    
Texas Southern University  
Texas State University Shane Flaherty  
Texas Wesleyan University Terrence Neumann    
The College of New Jersey Shawn Sivy    
The University of Tennessee-Chattanooga Craig Tanis, Ethan Hereth  
The University of Texas at Austin Kevin Chen    
The University of Texas at Dallas Gi Vania, Frank Feagans, Jaynal Pervez    
The University of Texas at El Paso Vinod Kumar  
The University of Texas at San Antonio Brent League, Jeremy Mann, Zhiwei Wang, Armando Rodriguez, Thomas Freeman  
Tinker Air Force Base Zachary Fuchs, David Monismith    
Trinity College Peter Yoon    
Tufts University Shawn Doughty    
Tulane University Hideki Fujioka, Hoang Tran  
United States Department of Agriculture - Agriculture Research Service Nathan Weeks    
United States Geological Survey Jeff Falgout, Janice Gordon, Natalya Rapstine    
United States Naval Academy    
University at Buffalo Cynthia Cornelius    
University of Alabama at Birmingham John-Paul Robinson  
University of Alaska Fairbanks Liam Forbes, Kevin Galloway
University of Arizona Cynthia Hart, Chris Deer, Ric Anderson, Todd Merritt, Chris Reidy, Adam Michel, Dima Shyshlov    
University of Arkansas Pawel Wolinski, Paras Pradhan, James McCartney  
University of Arkansas at Little Rock  
University of California-Berkeley Aaron Culich    
University of California-Davis Bill Broadley    
University of California-Irvine Harry Mangalam  
University of California-Los Angeles TV Singh    
University of California-Merced Jeffrey Weekley, Sarvani Chadalapaka  
University of California-Riverside Russ Harvey, Bill Strossman, Charles Forsyth  
University of California-San Diego Cyd Burrows-Schilling, Claire Mizumoto    
University of California-San Francisco Jason Crane    
University of California-Santa Barbara Sharon Solis    
University of California-Santa Cruz Shawfeng Dong  
University of Central Florida Paul Wiegand, Jason Nagin    
University of Central Oklahoma Evan Lemley  
University of Chicago Igor Yakushin    

University of Cincinnati Brett Kottmann    
University of Colorado Shelley Knuth    
University of Delaware Anita Schwartz  
University of Florida Alex Moskalenko    
University of Georgia Guy Cormier    
University of Guam Rommel Hidalgo, Eugene Adanzo, Randy Dahilig, Jose Santiago, Steven Mamaril

University of Hawaii Gwen Jacobs, Sean Cleveland
University of Houston Jerry Ebalunode  
University of Houston, Clear Lake    
University of Houston, Downtown Hong Lin  
University of Idaho Lucas Sheneman  
University of Illinois at Chicago Himanshu Sharma, Jon Komperda  
University of Indianapolis Steve Spicklemire    
University of Iowa Brenna Miller, Sai Ramadugu    
University of Kansas Riley Epperson  
University of Kentucky James Griffioen  
University of Louisiana at Lafayette  
University of Louisville  
University of Maine System Bruce Segee, Steve Cousins  
University of Massachusetts Amherst Johnathan Griffin    
University of Massachusetts Boston Jeff Dusenberry, Runcong Chen  
University of Massachusetts Dartmouth Scott Field, Gaurav Khanna    
University of Miami Warner Baringer, Dan Voss    
University of Michigan    
University of Minnesota Jim Wilgenbusch, Ben Lynch, Eric Shook, Joel Turbes, Doug Finley    
University of Missouri-Columbia Timothy Middelkoop, Susie Meloro, George Robb, Jacob Gotberg, Micheal Quinn    
University of Missouri-Kansas City    
University of Montana Tiago Antao  
University of Nebraska Jingchao Zhang  
University of Nebraska Medical Center Ashok Mudgapalli  
University of Nevada-Las Vegas Sharon Tettegah
University of Nevada-Reno Fred Harris  
University of New Hampshire Grace Wilson-Caudill  
University of New Mexico Hussein Al-Azzawi, Ryan Johnson
University of New Mexico-Gallup  
University of North Carolina    
University of North Carolina Wilmington Eddie Dunn, Ellen Gurganious    
University of North Dakota  
University of North Texas Charles Peterson, Damiri Young    
University of Notre Dame Dodi Heryadi, Scott Hampton    
University of Oklahoma Kali McLennan, Horst Severini, James Ferguson, David Akin, S. Patrick Calhoun, George Louthan, Jason Speckman  
University of Oregon Nick Maggio, Robert Yelle, Chris Hoffman, Michael Coleman    
University of Pennsylvania Gavin Burris    
University of Pittsburgh Kim Wong, Fangping Mu, Ketan Maheshwari, Matt Burton, Shervin Sammak    
University of Puerto Rico Mayaguez Ana Gonzalez
University of South Carolina Paul Sagona, Ben Torkian, Nathan Elger  
University of South Dakota Doug Jennewein  
University of Southern California Erin Shaw, Cesar Sul    
University of Southern Mississippi Brian Olson, Gopinath Subramanian   
University of Tulsa Peter Hawrylak  
University of Utah Anita Orendt    
University of Vermont Yves Dubief  
University of the Virgin Islands
University of Virginia Ed Hall    
University of Washington Chance Reschke    
University of Wisconsin-La Crosse David Mathias, Samantha Foley    
University of Wisconsin-Milwaukee    
University of Wyoming  
Utah Valley University George Rudolph    
Valparaiso University Nicholas S. Rosasco, Paul M. Nord, Paul Lapsansky    
Vanderbilt University Will French    
Vassar College Christopher Gahn    
Virginia Tech University James McClure, Alana Romanella, Srijith Rajamohan    
Washburn University Karen Camarda, Steve Black  
Washington State University Jeff White    
Washington University at St. Louis Xing Huang, Matt Weil, Matt Callaway    
Wayne State University Patrick Gossman, Michael Thompson, Aragorn Steiger    
Weill Cornell Medicine Joseph Hargitai    
West Chester University of Pennsylvania Linh Ngo    
West Virginia Higher Education Policy Commission Jack Smith  
West Virginia State University Sridhar Malkaram
West Virginia University  Nathan Gregg, Guillermo Avendano-Franco  
Wichita State University Terrance Figy  
Winston-Salem State University Xiuping Tao  
Yale University Kaylea Nelson, Benjamin Evans    
Youngstown State University Feng George Yu    



Domain and Student Champions

Campus Champions programs include Regional, Student, and Domain Champions.

 

 

Student Champions

Student Champion volunteer responsibilities vary by institution and by Mentor. Student Champions may work with their Campus Champion Mentor to provide outreach on campus, helping users find the advanced computing resources best suited to their research goals; provide training to users on campus; or work on special projects assigned by their Mentor. Student Champions are also encouraged to attend the annual PEARC conference, participate in the PEARC student program, and submit posters or papers to the conference.

Interested in applying to become a Student Champion? Fill out this form and someone will be in touch soon! (Please note that your institution must be part of the Champions program and you must have a Campus Champion mentor. To check whether your institution is part of the Champions program and to get in touch with a Champion on your campus, check here. Can't find your institution on the list? Fill out the application form and we will work to help you!)

INSTITUTION CHAMPION MENTOR GRADUATION YEAR
Florida A&M University George Kurian Hongmei Chi  
Florida A&M University Temilola Aderibigbe Hongmei Chi 2019
Florida A&M University Stacyann Nelsom Hongmei Chi 2020
Georgia State University Kenneth Huang Suranga Naranjan 2020
Georgia State University Thakshila Herath Suranga Naranjan 2018
Jackson State University Ebrahim Al-Areqi Carmen Wright 2018
Jackson State University Duber Gomez-Fonseca Carmen Wright 2019
New Mexico State University Mohammed Tanash Diana Toups Dugas 2020
Oklahoma State University Raj Shukla Dana Brunson 2018
Rensselaer Polytechnic Institute James Flamino Joel Giedt 2022
Southern Illinois University Sai Susheel Sunkara Chet Langin 2018
Southern Illinois University Majid Memari Chet Langin 2018
Southern Illinois University Alex Sommers Chet Langin  
Tufts University Georgios (George) Karamanis Shawn G. Doughty 2018
University of California - Merced Luanzheng Guo Sarvani Chadalapaka  
University of Central Florida Amit Goel Paul Wiegand  
University of Florida David Ojika Oleksandr Moskalenko 2018
University of Houston-Downtown Eashrak Zubair Hong Lin 2020
University of Michigan Simon Adorf Brock Palen 2019
University of Missouri Alexander Barnes Timothy Middelkoop 2018
University of North Carolina Wilmington Cory Nichols Shrum Eddie Dunn  
University of South Dakota Joseph Madison Doug Jennewein 2018
Virginia Tech University David Barto Alana Romanella 2020
       
GRADUATED      
Georgia State University Mengyuan Zhu Suranga Naranjan 2017
Mississippi State University Nitin Sukhija Trey Breckenridge 2015
Oklahoma State University Phillip Doehle Dana Brunson 2016
Rensselaer Polytechnic Institute Jorge Alarcon Joel Giedt 2016
Southern Illinois University Sai Susheel Sunkara Chet Langin 2018
Southern Illinois University Monica Majiga Chet Langin 2017
Southern Illinois University Sai Sandeep Kadiyala  Chet Langin 2017
Southern Illinois University Rezaul Nishat Chet Langin 2018
University of Arkansas Shawn Coleman Jeff Pummill 2014
University of Houston Clear Lake Tarun Kumar Sharma Liwen Shih 2014
University of Maryland Baltimore County Genaro Hernadez Paul Schou 2015
University of North Carolina Wilmington James Stinson Gray Eddie Dunn 2018
University of Pittsburgh Shervin Sammak Kim Wong 2016
Virginia Tech University Lu Chen Alana Romanella 2017

Updated: August 15, 2018

Domain Champions

Domain Champions act as ambassadors, spreading the word about what XSEDE can do to advance their field, based on their personal experience, and connecting interested colleagues to the right people and resources in the XSEDE community (XSEDE Extended Collaborative Support Services (ECSS) staff, Campus Champions, documentation/training, helpdesk, etc.). Domain Champions work within their discipline rather than within a geographic or institutional territory.

The table below lists our current domain champions. We are very interested in adding new domains as well as additional champions for each domain. Please contact domain-info@xsede.org if you are interested in a discussion with a current domain champion, or in becoming a domain champion yourself.

DOMAIN CHAMPION INSTITUTION
Astrophysics, Aerospace, and Planetary Science Matthew Route Purdue University
Data Analysis Rob Kooper University of Illinois
Finance Mao Ye University of Illinois
Molecular Dynamics Tom Cheatham University of Utah
Genomics Brian Couger Oklahoma State University
Digital Humanities Virginia Kuhn University of Southern California
Digital Humanities Michael Simeone Arizona State University
Genomics and Biological Field Stations Thomas Doak, Carrie L. Ganote, Sheri Sanders, Bhavya Nalagampalli Papudeshi Indiana University, National Center for Genome Analysis Support
Chemistry and Material Science Sudhakar Pamidighantam Indiana University
Fluid Dynamics & Multi-phase Flows Amit Amritkar University of Houston
Chemistry Christopher J. Fennell Oklahoma State University
Geographic Information Systems Eric Shook University of Minnesota


Last Updated: May 16, 2018


ECSS Projects

Explore ECSS's current and past projects, research teams, and science gateways.

Project List

For each of the following groups, the tables below list the Project, PI and Institution, the ECSS consultant(s) and the end date of the allocated support.

Research Teams

Project Name PI PI Institution ECSS Consultant(s)
An implicit, Chimera-based discontinuous Galerkin solver: development and application Paul David Orkwis University of Cincinnati Davide Del Vento, Shiquan Su
Statistical Analysis for Partially-Observed Markov Processes with Marked Point Process Obs, Y4 Yong Zeng University of Missouri, Kansas City Mitchel DeWayne Horton
Genome-Wide microRNAs and Single Gamete Based Genetic Profiling of Sweet Sorghum Varieties for Biofuel Production Ahmad Naseer Aziz Tennessee State University Alex Ropelewski
DISSCO, a Digital Instrument for Sound Synthesis and Composition, Y2 Sever Tipei University of Illinois at Urbana-Champaign Paul Rodriguez
Six Degrees of Francis Bacon, Y2 Christopher Norton Warren Carnegie Mellon University David Walling
Allocation Request on Bridges for Joint Analysis of Metagenomics and Metabolomics Data, Y4 Ping Ma University of Georgia Paul Rodriguez, Philip Blood
Analysis of human cortical electrophysiological data: database design for rapid analysis Max Novelli University of Pittsburgh Roberto Gomez
Assessment of Competition in the US Markets Based on Retail Scanner Data Philip Garland Gayle Kansas State University Kwai Wong, Od Odbadrakh
Simulation for 2D Semiconductor with Parallel Uniform and Adaptive Multigrid Method for Multi-component Phase Field Crystal Models, Y2 Zhen Guan University of California, Irvine David Bock, Dmitry Pekurovsky, Sudhakar Pamidighantam
Turbulent Mixing in a Magnetic Field and Flow Structure under Successive Axisymmetric Straining, Y2 Pui-kuen Yeung Georgia Institute of Technology Lars Koesterke
Modeling Heliospheric Phenomena with MS-FLUKSS and Observational Boundary Conditions Nikolai Pogorelov University of Alabama, Huntsville Laura Carrington
The "Morelli Machine": A Proposal Testing a Critical, Algorithmic Approach to Art History Christopher James Nygren University of Pittsburgh Alan Craig, Paul Rodriguez
New Frontiers of Direct Laser Acceleration in Megatesla Magnetic Fields Alex Arefiev University of California, San Diego Amit Chourasia, Shiquan Su
Method Development and Application of Electronic Structure Calculations for Complex Nanostructures Kaushik Dayal Carnegie Mellon University Yang Wang

Community Codes

Project Name PI PI Institution ECSS Consultant
Genome Assembly and Annotation of Red Abalone, Yellowtail, Soybean Cyst Nematode and Spiny Softshell Turtle, y2 Andrew J Severin Iowa State University David O'Neal, Philip Blood
Computational Support for Bioinformatics Projects on Assembly Analysis of Fungal Metagenomes for the Discovery of Genes Involved in Important Biological Processes Suping Zhou Tennessee State University Alex Ropelewski
Alpha-viscosity vs. GRMHD Patrick C Fragile College of Charleston Damon McDougall
Development of a High-Performance Parallel Gibbs Ensemble Monte Carlo Simulation Engine Jeffrey J. Potoff Wayne State University Junwen Li, Yifeng Cui
Numerical Simulations of Neutron Star Mergers David Radice Institute for Advanced Study Lars Koesterke
Bioinformatics Pipeline and Infrastructure Development for Genomics, Y2 Uma Chandran University of Pittsburgh Alex Ropelewski
Preparing for Checkpoint-Restart in the Exascale Generation, Y3 Gene Cooperman Northeastern University Cyrus Proctor
Modeling AGN Feedback Physics in Galaxy clusters with Ultra-High Resolution Cosmological Simulations Erwin Tin-Hay Lau Yale University Albert Lu
Coastal General Ecology Model (CGEM) optimization and training John Christopher Lehrter University of South Alabama Kent Milfeld
The Best Practices to Supported Softwares on TACC HPC Systems John Cazes TACC Hang Liu

Science Gateways

Project Name PI PI Institution ECSS Consultant(s)
Using Galaxy and Jetstream resources to analyze non-coding and unmapped regions of whole genomes of healthy and disease-carrying dogs Tendai Mutangadura University of Missouri, Columbia Philip Blood
Searching the SRA Robert Edwards San Diego State University Eroma Abeysinghe, Mats Rynge
Atmospheric Science in the Cloud: Enabling Data-Proximate Science Mohan Ramamurthy UCAR/Unidata Suresh Marru
NFIE Hydrological Property Computation at National Scale Zong-Liang Yang University of Texas at Austin Yan Liu
GISandbox: A Gateway for Research, Education, and Experimentation in Geographic Information Systems and Science (GIS) Eric Shook University of Minnesota Andrea Zonca, Davide Del Vento
Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) Gateway Benjamin John Winjum University of California, Los Angeles Eroma Abeysinghe, Suresh Marru
OpenTopography: A gateway to high resolution topography data and services Viswanath Nandigam San Diego Supercomputer Center Choonhan Youn
High Resolution Spatial Temporal Analysis of Whole-Head 306-Channel Magnetoencephalography and 66-Channel Electroencephalograpy Brain Imaging in Humans During Sleep David Shannahoff-Khalsa University of California, San Diego Mona Wong
Interoperating CyberGIS and HydroShare for Scalable Geospatial and Hydrologic Sciences Shaowen Wang University of Illinois at Urbana-Champaign Yan Liu
Science and Engineering Applications Grid (SEAGrid): A Gateway for Simulation of Molecular and Material Structures and Dynamics Sudhakar V. Pamidighantam Indiana University Feng Kevin Chen
Understanding sustainability issues in the global farm-food system using a global gridded model of agriculture URIS BALDOS Purdue University Lan Zhao
Exploring the full simulation modeling workflow on XSEDE Richard P Signell US Geological Survey Andrea Zonca
Gateway for SimVascular Educational Users Alison Lesley Marsden Stanford University Eroma Abeysinghe, Marcus Christie

Past Projects

Project Name PI PI Institution ECSS Consultant(s)
Laser-based Structural Sensing and Damage Assessment, Y2 Jerome F. Hajjar Northeastern University DJ Choi
Computation and testing of effective degree of eQTL Networks Sheila Marie Gaynor Harvard University Roberto Gomez
A Coupled Lattice Boltzmann and Spectrin Link Method for Modeling Fluid-Structure Interaction Cyrus K Aidun Georgia Institute of Technology Anirban Jana, Darren Adams
Exploring the universe with Advanced LIGO's detections of compact-object binaries Duncan Brown Syracuse University Lars Koesterke
Hi-Fidelity Simulations of Shear Flow of Polymeric Melts: Flow Induced Disentanglement and Shear Banding Bamin Khomami University of Tennessee, Knoxville DJ Choi
Image Analysis of the 1935-1944 Farm Security Administration - Office of War Information Photography Collection Elizabeth Wuerffel Valparaiso University Alan Craig, Paul Rodriguez, Sandeep Puthanveetil Satheesan
Inter-cloud Bursting: Decreasing Time-to-Science with a Multi-Stack Cloud Federation Adam M Brazier Cornell University Mats Rynge
Investigating morphology evolution in thin-film polymer blends, Y1 Baskar Ganapathysubramanian Iowa State University Amit Chourasia, Christopher Thompson
Investigating morphology evolution in thin-film polymer blends, Y2 Baskar Ganapathysubramanian Iowa State University Amit Chourasia, Christopher Thompson
Tanaka: Structural Estimation of Equilibrium Models, Y3 Chao Fu University of Wisconsin-Madison Victor Eijkhout
Deep Learning for drug-protein interaction prediction Gil Alterovitz Harvard Medical School/Boston Children's Hospital Yang Wang
Multilingual FrameNet: Cross-lingual Alignment and Annotation Collin F Baker International Computer Science Institute Roberto Gomez
Omics approaches to understanding cellular and organismal responses to chronic stress in marine mammals Jane Khudyakov University of the Pacific Erik Ferlanti
Using semantic networks to generate novel hypotheses in biomedicine Jodi Schneider University of Illinois at Urbana-Champaign Paul Hoover
Speeding-up Agent-Based Simulation of Cattle Herd Infection Diseases Rebecca Lee Smith National Center for Supercomputing Applications Yifeng Cui
dREG Science Gateway Charles G Danko Cornell University Eroma Abeysinghe
Simulation of Quantum Circuits in the Clifford-T gate set Antia Lamas-Linares Texas Advanced Computing Center Cyrus Proctor
Computational Studies on Physical and Chemical Properties Yigui Wang Southern Connecticut State University Marcela Madrid, Yang Wang
Constructing Nanosecond Level Snapshot of Financial Markets Using Supercomputers, Y3 Mao Ye University of Illinois at Urbana-Champaign David O'Neal, Lei Huang
Deploying Multi-Facility Cyberinfrastructure in XSEDE cloud and storage facilities to serve the national geosciences community Timothy Keith Ahern Incorporated Research Institutions for Seismology Andrea Zonca
Fluid-Structure Interaction, Optimization, and Uncertainty Quantification Using High Performance Computing in Cardiovascular Applications, Y2 Alison Lesley Marsden Stanford University Eroma Abeysinghe, Suresh Marru
High-Performance Computing Resources for CAPS Realtime Storm-Scale Ensemble Forecasts in Support of Hazardous Weather Testbed and Hydrometeorological Testbed Fanyou Kong University of Oklahoma David O'Neal
Modeling the Effective Energy and Mass Transfer input to Earth's Critical Zone from sub-meter to global spatial scales and daily to millennial time scales Jon Damian Pelletier University of Arizona Mats Rynge, Yan Liu
Preparing for Checkpoint-Restart in the Exascale Generation, Y2 Gene Cooperman Northeastern University Jerome Vienne
Tuning Geodynamo Simulation to Paleomagnetic Observations, Y2 David Gubbins Scripps Institution of Oceanography Chad Burdyshaw, Shiquan Su
ARIEL: Analysis of Rare Incident-Event Languages Ravi Starzl Carnegie Mellon University Paola Buitrago
Data Investigation and Sharing Environment for Big Weather Web Carlos Maltzahn University of California, Santa Cruz Suresh Marru
Distributed MCMC for Bayesian Hierarchical Models Imran Currim University of California, Irvine Andrea Zonca, Paul Rodriguez
First-principles studies of electronic structure and transport in semiconductors and multifunctional molecular electronic materials Xiaoguang Zhang University of Florida Yang Wang
Carnegie Library of Pittsburgh: Digital Asset Management Brooke Sansosti Carnegie Library of Pittsburgh Roberto Gomez
Large Scale Biomechanics Simulations in ABAQUS Jonathan Pieter Vande Geest University of Pittsburgh Anirban Jana
DiaGrid: Delivering Science-as-a-Service for Collaborative Research and Education Communities Carol Xiaohui Song Purdue University Christopher Thompson
Assessing future flood risk from the emerging threat of Madden-Julian Oscillation amplification using a convection-permitting climate model Gabe Kooperman University of California, Irvine Si Liu
Computational Anatomy Gateway, Y1 Michael I Miller Johns Hopkins University Antonio Gomez
Computational studies of earthquake hazards: effects of complex geologic structure and topography on seismic wave propagation and quantification of uncertainties in seismic source models and probabilistic seismic-hazard Morgan Paul Moschetti US Geological Survey Lars Koesterke, Yifeng Cui
Predictability and Data Assimilation of Severe Weather and Tropical Cyclones, Y2 Fuqing Zhang Pennsylvania State University Greg Foss
Surface-induced forcing and decadal variability and change of the East Asian climate, surface hydrology and agriculture Yongkang Xue University of California, Los Angeles Shiquan Su
Simulating the microscopic properties of neutron star crust matter William George Newton Texas A&M University, Commerce Lars Koesterke
WaterHUB for Hydrologic Modeling and Education Venkatesh Merwade Purdue University Lan Zhao
Digital Humanities Projects: Processing XML Data Elisa Eileen Beshero-Bondar University of Pittsburgh David O'Neal, Roberto Gomez
XSEDE Insights: System Utilization Analyses Through XDMoD Data John Cazes Texas Advanced Computing Center Damon McDougall
Adaptive Finite-element Simulations of Complex Industrial Flow Problems Cameron Walter Smith Rensselaer Polytechnic Institute Lars Koesterke, Lei Huang
Allocation Request on Blacklight and Bridges for SNP Detection in Large Metagenomics Samples, y3 Ping Ma University of Georgia Philip Blood
Computational Design and Molecular Dynamics Simulations of Protein, Polymers and Host-Guest Interactions Jeffery G Saven University of Pennsylvania Victor Eijkhout
Computational fluid-structure interaction of biological systems, Y3 Haoxiang Luo Vanderbilt University Hang Liu
Direct Numerical Simulations and Analysis of Compressible And Incompressible Turbulence, Y3 Diego Donzis Texas A&M University Xiao Zhu
DNS of droplet-laden isotropic turbulence, y4 Antonino Ferrante University of Washington Darren Adams
Multiscale Simulation and Complex Data Fitting of HIV Latency-Reactivation Control and Viral Dynamics Alan Stuart Perelson Los Alamos National Laboratory Laura Carrington, Paul Hoover
Simulation for 2D Semiconductor with Parallel Uniform and Adaptive Multigrid Method for Phase Field Crystal Models, Y1 Zhen Guan University of California, Irvine David Bock, Dmitry Pekurovsky, Sudhakar Pamidighantam
Simulation for the IceCube telescope data analysis and detector upgrade studies Francis Halzen University of Wisconsin-Madison Mats Rynge
Six Degrees of Francis Bacon, Y1 Christopher Norton Warren Carnegie Mellon University David Walling, Roberto Gomez
Providing a Science Gateway to Existing Simulations of Galaxy Cluster Mergers, Y2 Marios Chatzikos University of Kentucky Christopher Thompson
Optimizing the Multi-State Modified Embedded Atom Method Code Addon for LAMMPS on Stampede Srinivasan Srivilliputhur University of North Texas Kent Milfeld
Machine Assisted Rhetorical Analysis James Ian Wynn Carnegie Mellon University Roberto Gomez
Micromechanical analysis of stress-strain inhomogeneities with Fourier transforms (MASSIF) Anthony D Rollett Carnegie Mellon University Anirban Jana, Antonio Gomez
Decomposing Bodies Alison Langmead University of Pittsburgh Alan Craig, Paul Rodriguez, Sandeep Puthanveetil Satheesan
Large Scale First Principles Study of Anderson Localization Mark Jarrell Louisiana State University Yang Wang
BioTeam Science Gateway for Life Sciences Applications Ari Berman BioTeam, Inc. Suresh Marru
Bioinformatics Pipeline and Infrastructure Development for Cancer Genomics, Y1 Uma Chandran University of Pittsburgh Alex Ropelewski
Hydrological model development and application for a refined understanding of Arctic hydrology Anna Katarina Liljedahl University of Alaska, Fairbanks Laura Carrington
Improving Seasonal Prediction of the Indo-Pacific SST and Indian Monsoon with Multiple Ocean Analyses Ensemble Bohua Huang George Mason University Shiquan Su
Investigating 4-D Dynamics of Subduction and Continental Evolution using Adjoint Convection Models Lijun Liu University of Illinois at Urbana-Champaign Yifeng Cui
Large-eddy simulations of pulverized coal combustion Xinyu Zhao University of Connecticut Laura Carrington
Large-shared-memory supercomputing for game-theoretic analysis with fine-grained abstractions, and novel tree search algorithms, Y3 Tuomas Sandholm Carnegie Mellon University John Urbanic
Laser-based Structural Sensing and Damage Assessment, Y1 Jerome F. Hajjar Northeastern University DJ Choi
Multi-Physics Computational Modeling of Cardiac Flows and Heart Murmurs Using a Parallelized Immerse Rajat Mittal Johns Hopkins University Antonio Gomez
Numerical Modeling of High-Impact Weather: Severe Storms, Tornadoes and Winter Cyclones, Y4 Brian F Jewett University of Illinois at Urbana-Champaign David O'Neal
Investigating the relationship between Identity Politics and Discourse using visualizations Nicole Brown National Center for Supercomputing Applications Mark Van Moer, Paul Rodriguez
Research Initiation Awards: Globalization Strategy and Parallelism in PDE-constrained optimization for inverse source problems Widodo Samyono Jarvis Christian College Victor Eijkhout
Unidata Cyberinfrastructure in a Cloud Computing Environment Mohan Ramamurthy UCAR/Unidata Suresh Marru
Finding Real Differences Between Semi-supervised Learning Algorithms Using PJ2 George Rudolph The Citadel Jerome Vienne
Disclosure Scriptability Matthew DeAngelis Georgia State University Kwai Wong
Numerical Methods and Models Using Stochastic (Partial) Differential Equations in Biology Christopher Vincent Rackauckas University of California, Irvine DJ Choi
Simulating collisions of white dwarfs as a primary channel for type Ia supernovae Doron Kushnir Institute for Advanced Study David Bock 12/31/16
DISSCO, a Digital Instrument for Sound Synthesis and Composition, Y1 Sever Tipei University of Illinois at Urbana-Champaign Paul Rodriguez, Robert Sinkovits
Terrain Analysis using digital elevation models, Y3 David Tarboton Utah State University Yan Liu
Image Analysis of fMRI Data for Medical Applications Frank Michael Skidmore University of Alabama, Birmingham Junqi Yin
Population Level Benchmarking Variant Calling Pipelines and Building a Community Resource with 3K Rice Genomes Liya Wang Cold Spring Harbor Laboratory Mats Rynge
Optimization and uncertainty quantification in cardiovascular flow simulations using high performance computing Alison Lesley Marsden Stanford University Kwai Wong
Large eddy simulations of extended wind-farms Charles Meneveau Johns Hopkins University Darren Adams, David Bock
Mining, Coordinating and Visualizing Curricula in STEM and Social Science Sharon Tettegah University of Illinois at Urbana-Champaign Pragneshkumar Patel
XSEDE: Prediction and Control of Compressible Turbulent Flows, y4 Daniel Joseph Bodony University of Illinois at Urbana-Champaign Lonnie D. Crosby
Ab initio study on the glycosidation of 5-Fluorouracil (5-FU) Yigui Wang Southern Connecticut State University DJ Choi
Analyzing the Influence of Socioeconomic Factors on Online Game Advancement, Y2 Min Zhan University of Illinois at Urbana-Champaign Lonnie D. Crosby
Numerical Modeling of High-Impact Weather: Winter Snowstorms, Severe Thunderstorms and Tornadoes, Y3 Brian F Jewett University of Illinois at Urbana-Champaign David O'Neal, Lei Huang
Constructing Nanosecond Level Snapshot of Financial Markets Using Supercomputers Mao Ye University of Illinois at Urbana-Champaign David O'Neal
Direct numerical simulation of droplet-laden homogeneous isotropic turbulence - Y3 Antonino Ferrante University of Washington Darren Adams
NOAA Climate and Weather Research Allocation Request Frank Indiviglio National Oceanic and Atmospheric Administration Pragneshkumar Patel

Key Points
ECSS has assisted on more than 60 XSEDE projects
Contact Information

August 2018 | Science Highlights, Announcements & Upcoming Events
The Extreme Science and Engineering Discovery Environment (XSEDE) is an NSF-funded single virtual organization that connects scientists with advanced digital resources. People around the world use these resources and services — things like supercomputers, collections of data, expert staff and new tools — to improve our planet. Learn how XSEDE can enable your research here.
Program Announcements
XSEDE allocates $7.3M worth of computing time to U.S. researchers in latest allocation cycle

XSEDE has awarded 145 deserving research teams at 109 universities and other institutions access to nearly two dozen NSF-funded computational and storage resources, as well as other services unique to XSEDE, such as the Extended Collaborative Support Services (ECSS). Total allocations this cycle, running July 1, 2018 through June 30, 2019, represent an estimated $7.3 million of time on multi-core, many-core, GPU-accelerated, and large-memory computing resources (which does not include additional consulting resources) – all at no cost to the researchers. Since its founding in 2011, XSEDE and XSEDE 2.0 have allocated an estimated $270M of computing time.

Campus Champions celebrate 10 years!

Since the Campus Champions program began in 2008, it has grown to 450 champions, reaching over 200 institutions spanning all 50 states, the District of Columbia, Puerto Rico, Guam and the U.S. Virgin Islands. Thanks to sustained funding from XSEDE and the National Science Foundation (NSF) over the last ten years, Campus Champions have helped support research that will change the world, developed optimized workflows for high-performance computing, and continue to improve research computing networks with a nationwide impact.

Ruby Mendenhall presents findings from XSEDE-sponsored research on lives of African American women in keynote address at PEARC18

Using HPC resources allocated to researchers through XSEDE, Mendenhall and a multidisciplinary team are gleaning lessons on how black women related to and affected the larger society during periods when their written voices were underrepresented or even illegal.

XSEDE names 2018-2019 Campus Champions Fellows

Five researchers from American universities will work with cyberinfrastructure and high-performance computing experts from XSEDE and U.S. research teams to tackle real-world science and engineering projects over the next year in the 2018 Campus Champions Fellows program.

Community Announcements
TACC's NSF-funded volunteer computing project is now live!

The project leverages the BOINC software infrastructure to bring the power of volunteer computing to TACC and XSEDE users. By taking advantage of computing cycles donated by volunteers, researchers can supplement the compute cycles granted to them as part of the TACC/XSEDE allocation process. Sign up now to help science succeed!

Upcoming Events
Tune in for the ECSS Symposium on August 21!

All are welcome to attend the ECSS symposium on Tuesday, August 21 at 1 p.m. Eastern/10 a.m. Pacific featuring "OpenTopography: A gateway to high resolution topography data and services" with Choonhan Youn of SDSC. Details on the second presenter to come. Click the link below for symposium coordinates and additional details.

Training Registration Open for 2018 NSF Cybersecurity Summit

Registration for training sessions for the 2018 NSF Cybersecurity Summit for Large Facilities and Cyberinfrastructure is now open. The training day sessions will take place on Tuesday, August 21st. If interested, please register for this summit here.

Once you have registered for the summit and if you are planning to attend training day sessions, please use this form to reserve your seat for the available training sessions prior to the Tuesday, August 14 deadline. A list of training session descriptions is available here.

Gateways 2018 poster deadline extended to Wednesday, August 15!

There's still time to submit an abstract for the Gateways 2018 poster session! Deadline for poster submissions has been extended to August 15. Interested individuals also have until Friday, September 7 to register for the conference or reserve a space at the Resource Expo.

 

Questions? Email help@sciencegateways.org.

Globus Workshop at NCAR: Wednesday, Sept. 5, 2018

Register for this workshop to learn how the Globus platform simplifies development of web applications for researchers and exchange ideas with your peers on ways to apply Globus technologies. This day of tutorials includes sessions on endpoint administration, best practice for sharing, automating workflows, Jupyter + Globus, and much more. Register before September 1!



XSEDE Partnerships

Check out the partner institutions that are funded by the XSEDE Project.

XSEDE is led by the University of Illinois' National Center for Supercomputing Applications. The partnership includes the following institutions:

XSEDE Leadership Committees

XSEDE Leadership Team

John Towns

National Center for Supercomputing Applications
University of Illinois at Urbana-Champaign
jtowns@ncsa.illinois.edu

Kelly Gaither

Texas Advanced Computing Center
University of Texas at Austin
kelly@tacc.utexas.edu

Questions from the public and the media about the XSEDE project should be directed to:

Kristin Williamson

National Center for Supercomputing Applications
University of Illinois at Urbana-Champaign
kwillia8@illinois.edu / 217-343-1594

Collaborations with Funded Activities

NSF Proposal Title PI/Contact Award Number Amount
Mainstreaming Volunteer Computing David Anderson 1105572 $548,546.00
SI2-SSI: SciDaaS -- Scientific data management as a service for small/medium labs Ian Foster 1148484 $2,398,999.00
Collaborative Research: Integrated HPC Systems Usage and Performance of Resources Monitoring and Modeling (SUPReMM- SUNY Buffalo) Abani Patra 1203560 $458,561.00
Collaborative Research: Integrated HPC Systems Usage and Performance of Resources Monitoring and Modeling (SUPReMM- UT-Austin) Abani Patra 1203604 $457,919.00
Center for Trustworthy Scientific Cyberinfrastructure (CTSC) Von Welch 1234408 $4,518,845.00
Latin America-US Institute 2013: Methods in Computational Discovery for Multidimensional Problem Solving Kevin Franklin 1242216 $99,986.00
EAGER proposal: Toward a Distributed Knowledge Environment for Research into Cyberinfrastructure: Data, Tools, Measures, and Models for Multidimensional Innovation Network Analysis Nicholas Berente 1348461 $204,935.00
Multiscale Software for Quantum Simulations in Materials Design, Nano Science and Technology Jerzy Bernholc 1339844 $500,000.00
MRI: Acquisition of SuperMIC-- A Heterogeneous Computing Environment to Enable Transformation of Computational Research and Education in the State of Louisiana Seung-Jong Park 1338051 $3,924,181.00
Open Gateway Computing Environments Science Gateways Platform as a Service (OGCE SciGaP) Marlon Pierce 1339774  $2,500,000.00
Sustaining Globus Toolkit for the NSF Community (Sustain-GT) Steven Tuecke 1339873 $1,200,000.00
CC-NIE Integration: Developing Applications with Networking Capabilities via End-to-End SDN (DANCES) Kathy L. Benninger 1341005 $650,000.00
A Large-Scale, Community-Driven Experimental Environment for Cloud Research Dr. Kate Keahey 1419141 $10,049,765.00
MRI: Acquisition of a National CyberGIS Facility for Computing- and Data-Intensive Geospatial Research and Education Shaowen Wang 1429699 $1,787,335.00
Acquisition of an Extreme GPU cluster for Interdisciplinary Research Todd Martinez 1429830 $3,500,000.00
The Centrality of Advanced Digitally-ENabled Science: CADENS Donna Cox 1445176 $1,499,535.00
CloudLab: Flexible Scientific Infrastructure to Support Fundamental Advances in Cloud Architectures and Applications Robert Ricci 1419199 $9,999,999.00
RUI: CAREER Organizational Capacity and Capacity Building for Cyberinfrastructure Diffusion Dr. Kerk F. Kee, Almadena Y. Chtchelkanova 1453864  $519,753.00
Fostering Successful Innovative Large-Scale, Distributed Science and Engineering Projects through Integrated Collaboration Nicholas Berente 1551609  $100,000.00
EarthCube RCN: Collaborative Research: Research Coordination Network for High-Performance Distributed Computing in the Polar Sciences Allen Pope 1541620 $299,977.00
MRI Collaborative Consortium: Acquisition of a Shared Supercomputer by the Rocky Mountain Advanced Computing Consortium Thomas Hauser 1532236  $2,730,000.00
BD Hubs: Midwest: SEEDCorn: Sustainable Enabling Environment for Data Collaboration Edward Seidel 1550320  $1,499,999.00
Secure Data Architecture: Shared Intelligence Platform for Protecting our National Cyberinfrastructure Alexander Withers 1547249  $499,206.00
CILogon 2.0 James Basney 1547268  $499,973.00
DIBBs: Merging Science and Cyberinfrastructure Pathways: The Whole Tale Bertram Ludaescher 1541450 $4,986,951.00
Associated Universities, Inc. (AUI) and the National Radio Astronomy Observatory (NRAO) Philip J. Puxley 1519126  $1.00
SI2-SSE: Multiscale Software for Quantum Simulations of Nanostructured Materials and Devices J. Bernholc 1615114 $29,232.00
Collaborative Research: SI2-SSI: Adding Volunteer Computing to the Research Cyberinfrastructure David Anderson 1550601 $259,999.00
Molecular Sciences Software Institute (MolSSI) Thomas Crawford 1547580 $5,880,491.00
Science Gateways Software Institute Nancy Wilkins-Diehr 1547611 $6,599,000.00
CC* Compute: BioBurst Ron Hawkins 1659104 $494,066.00
CC* Networking Infrastructure: Building HPRNet (High-Performance Research Network) for advancement of data intensive research and collaboration Farzad Mashayek 1659255 $499,745.00
Cybertraining:CIP – Professional Training for CyberAmbassadors Dirk Colbry 1730137 $498,330.00
SI2-SSI: Pegasus: Automating compute and data intensive science Ewa Deelman 1664162 $2,500,000.00
Quantum Mechanical Modeling of Major Mantle Materials Renata Wentzcovitch 0635990  $805,227.00
MRI: Acquisition of the Lawrence Supercomputer to Advance Multidisciplinary Research in South Dakota Doug Jennewein 1626516 $504,911.00
Collaborative Research: CyberTraining: CIU: Hour of Cyberinfrastructure: Developing Cyber Literacy for Geographic Information Science Eric Shook 1829708 $373,990.00
CC* NPEO: A Sustainable Center for Engagement and Networks Jennifer M. Schopf 1826994 $1,166,667.00
Collaborative Research: Building the Community for the Open Storage Network Alex Szalay 1747493 $165,185.00

 

Key Points
XSEDE comprises partnerships with 19 institutions

Champion Leadership Team

This page includes the Champions Leadership team and Regional Champions

Champion Leadership Team
Name Institution Position
Dana Brunson Oklahoma State University Campus Engagement Co-manager
Henry Neeman University of Oklahoma Campus Engagement Co-manager
Marisa Brazil Purdue University Champion Coordinator
Jeff Pummill University of Arkansas Champion Science Coordinator
Jay Alameda University of Illinois Urbana-Champaign Champion Technology Coordinator
Aaron Culich University of California-Berkeley Champion Leadership Team (2017-2019)
Alla Kammerdiner New Mexico State University Champion Leadership Team (2017-2019)
Doug Jennewein University of South Dakota Champion Leadership Team (2018-2020)
Timothy Middelkoop University of Missouri Champion Leadership Team (2018-2020)
Julie Ma MGHPCC Champion Leadership Team (2018-2020)
Hussein Al-Azzawi University of New Mexico Champion Leadership Team (2018-2020)
     
Leadership Team Alumni    
Jack Smith West Virginia Higher Education Policy Commission  Champion Leadership Team (2016-2018)
Dan Voss University of Miami Champion Leadership Team (2016-2018)
Erin Hodges University of Houston Champion Leadership Team (2017-2018)

Updated: August 6, 2018

Regional Champions

The Regional Champion Program is built upon the principles and goals of the XSEDE Champion Program. The Regional Champion network facilitates education and training opportunities for researchers, faculty, students and staff in their regions, helping them make effective use of local, regional and national digital resources and services. The program also provides oversight and assistance within a predefined geographic region, ensuring that all Champions in that region receive the information and assistance they require. It establishes a bi-directional conduit between Champions in the region and the XSEDE Champion staff, enabling more efficient dissemination of information and finer-grained support. Finally, Regional Champions act as regional points of contact and coordination, helping to scale up the Champion Program by working with Champion staff to identify opportunities for expanding outreach to the user community.

Regional Champions are coordinated by Jeff Pummill.

CHAMPION INSTITUTION DEPUTY CHAMPION INSTITUTION REGION
Ben Nickell Idaho National Labs Nick Maggio University of Oregon 1
Ruth Marinshaw Stanford University Aaron Culich University of California, Berkeley 2
Kevin Brandt South Dakota State University  Chet Langin Southern Illinois University 3
Dan Andresen Kansas State University BJ Lougee Federal Reserve Bank Of Kansas City CADRE  4
Mark Reed University of North Carolina Craig Tanis University of Tennessee, Chattanooga 5
Scott Hampton University of Notre Dame Stephen Harrell Purdue University 6
Scott Yockel Harvard University Scott Valcourt University of New Hampshire 7
Anita Orendt University of Utah Shelley Knuth University of Colorado 8

Updated: August 6, 2018


 

Key Points
Leadership table
Regional Champions table

Campus Champions Fellows Program

The Fellows Program partners Campus Champions with Extended Collaborative Support Services (ECSS) staff and research teams to work side by side on real-world science and engineering projects.

 

See Fellows Projects Here

The cyberinfrastructure expertise developed by high-end application support staff in XSEDE's Extended Collaborative Support Services (ECSS) program can be difficult to disseminate to the large numbers of researchers who would benefit from it. Among their many roles, XSEDE Campus Champions serve as local experts on national cyberinfrastructure resources and organizations such as XSEDE, and they are closely connected to those doing cutting-edge research on their campuses. The goal of the Campus Champions (CC) Fellows program is to increase cyberinfrastructure expertise on campuses by including Champions as partners in ECSS projects.

The Fellows program partners Campus Champions with ECSS staff and research teams to work side by side on real-world science and engineering projects. In 2015, the types of projects offered expanded beyond ECSS projects: Champions now have the opportunity to work with XSEDE Cyberinfrastructure Integration (XCI) to develop on-ramps from campuses to XSEDE, to work with Community Engagement & Enrichment staff to help create a formal undergraduate or graduate minor, concentration, or certificate program at their institution, or to design a project of their choosing. Fellows develop expertise within varied areas of cyberinfrastructure, and they are already well positioned to share their advanced knowledge through their roles as established conduits to students, administrators, professional staff, and faculty on their campuses. A directory of Fellows expands this influence even further by creating a network of individuals with these unique skill sets. In addition to the technical knowledge gleaned from their experiences, individual Fellows benefit from personal interactions with ECSS staff and acquire the skills necessary to manage similar user or research group project requests on their own campuses. The Campus Champions Fellows program is a rare opportunity for a select group of individuals to learn first-hand about the application of high-end cyberinfrastructure to challenging science and engineering problems.

A volunteer partner opportunity - ECSS Affiliates

Accepted Fellows make a 400-hour time commitment and are paid a $15,000 annual stipend for their efforts. The program includes funding for two one- to two-week visits to an ECSS or research team site to enhance the collaboration and also funding to attend and present at a Fellows symposium at an XSEDE conference.

The following are the types of skills that may be developed, depending on project assignments:

  • Use of profiling and tracking tools to better understand a code's characteristics
  • CUDA programming
  • Hybrid (MPI/OpenMP) programming
  • Optimal use of math libraries
  • Use of visualization tools, with a particular focus on large data sets
  • Use of I/O tools and software such as HDF5, MPI IO, and parallel file system optimization
  • Optimal use of scientific application software such as AMBER, ABAQUS, RaxML, MrBayes, etc.
  • Application of high-performance computing and high-throughput computing to non-traditional domains such as computational linguistics, economics, genomics
  • Single-processor optimization techniques
  • Benchmarking, including concepts of sockets, binding processes to cores, and impacts of these in optimization
  • Optimal data transfer techniques including the use of grid-ftp and Globus Online
  • Cluster scheduling, tuning for site-specific goals
  • Managing credentials, security practices
  • Monitoring resources with information services
  • Understanding failure conditions and programming for fault tolerance
  • Automated work and data flow systems
  • Web tools and libraries for building science gateways or portals that connect to high-end resources
  • Data management through tools such as iRODS and SAGA

For questions about the program please contact outreach-ccfellows@xsede.org.

 

Download a PDF (1.5MB) of the Memo of Understanding for the Fellows program

Key Points
Campus Champions as local experts
Campus Champions work with ECSS staff and research teams
Campus Champions responsibilities and skills



Campus Champions

Computational Science & Engineering makes the impossible possible; high-performance computing makes the impossible practical

Campus Champions Celebrate Ten Year Anniversary 

What is a Campus Champion?

A Campus Champion is an employee of, or affiliated with, a college, university, or other institution engaged in research, whose role includes helping the institution's researchers, educators and scholars (faculty, postdocs, graduate students, undergraduates, and professionals) with their computing-intensive and data-intensive research, education, scholarship and/or creative activity, including helping them use advanced digital capabilities to improve, grow and/or accelerate these achievements.

What is the Campus Champions Program?

The Campus Champions Program is a group of 400+ Campus Champions at 200+ US colleges, universities, and other research-focused institutions, whose role is to help researchers at their institutions to use research computing, especially (but not exclusively) large scale and high end computing.

Campus Champions peer-mentor each other, to learn to be more effective. The Campus Champion community has a very active mailing list where Champions exchange ideas and help each other solve problems, regular conference calls where we learn what's going on both within the Champions and at the national level, and a variety of other activities.

Benefits to Campus Champion Institutions

  • A Campus Champion gets better at helping people use computing to advance their research, so their institution's research becomes more successful.
  • There is no charge to the Campus Champion institution for membership.

Benefits to the Campus Champion

  • A Campus Champion becomes more valuable and more indispensable to their institution's researchers, and therefore to their institution.
  • The Campus Champions Program is a lot of fun, so Champions can enjoy learning valuable strategies.

What does a Campus Champion do as a member of the CC Program?

  • Participate in Campus Champions Program information sharing sessions such as the Campus Champions monthly call and email list.
  • Participate in peer mentoring with other Campus Champions, learning from each other how to be more effective in their research support role.
  • Provide information about national Cyberinfrastructure (CI) resources to researchers, educators and scholars at their local institution.
  • Assist their local institution's users to quickly get start-up allocations of computing time on national CI resources.
  • Serve as an ombudsperson, on behalf of their local institution's users of national CI resources, to capture information on problems and challenges that need to be addressed by the resource owners.
  • Host awareness sessions and training workshops for their local institution's researchers, educators, students, scholars and administrators about institutional, national and other CI resources and services.
  • Participate in some or all of the Campus, Regional, Domain, and Student Champion activities.
  • Submit brief activity reports on a regular cadence.
  • Participate in relevant national conferences, for example the annual SC supercomputing conference and the PEARC conference.
  • Participate in education, training and professional development opportunities at the institutional, regional and national level, to improve the Champions' ability to provide these capabilities.

What does the Campus Champions program do for the Campus Champions?

  • Provide a mailing list for sharing information among all Campus Champions and other relevant personnel.
  • Provide the Campus Champions with regular correspondence on new and updated CI resources, services, and offerings at the national level, including but not limited to the XSEDE offerings.
  • Provide advice to the Campus Champions and their institutions on how to best serve the institution's computing- and data-intensive research, education and scholarly endeavors.
  • Provide education, training and professional development for Campus Champions at conferences, Campus Champion meetings, training events, and by use of online collaboration capabilities (wiki, e-mail, etc.).
  • Help Champions to pursue start-up allocations of computing time on relevant national CI resources (currently only XSEDE resources, but we aspire to expand that), to enable Campus Champions to help their local users get started quickly on such national CI resources.
  • Record success stories about impact of Campus Champions on research, education and scholarly endeavors.
  • Maintain a web presence and other social media activity that promotes the Campus Champions Program and lists all active Campus Champions and their institutions.
  • Raise awareness of, and recruit additional institutions and Campus Champions into the Campus Champions Program.
  • Provide Campus Champions with the opportunity to apply for the XSEDE Champion Fellows Program (and aspirationally other programs), to acquire in-depth technical and user support skills by working alongside XSEDE staff experts.
  • Provide Campus Champions information to participate in subgroup activities, such as the Regional Champion initiative.

Become a Champion

  • Write to champion-info@xsede.org and ask to get involved
  • We'll send you a template letter of collaboration
  • Ask questions, add signatures, send it back, and join the community

In addition to traditional Campus Champions, the Champion Program now includes the following types of specialized Champions:

 

  • Student Champions - offering a unique student perspective on use of digital resources and services
  • Regional Champions - regional point of contact to help scale the Champion Program
  • Domain Champions - spread the word about what XSEDE can do to boost the advancement of their specific research domains

Key Points
Program serves more than 200 US colleges and universities
Aimed at making the institution's research more successful
Free membership

Campus Bridging History

When XSEDE started in 2011, it created a very small group called the XSEDE Campus Bridging group

As described in a recent workshop report (http://ieeexplore.ieee.org/document/7307557/), campus bridging encompassed a number of different kinds of issues: some financial (e.g. network capacity to the campus), some related to authentication frameworks, some computer science, and some computer engineering. What the problems addressed in the 2011 report had in common was that they were all perceived or encountered from the viewpoint of a researcher or student on a campus trying to interoperate with the national cyberinfrastructure. Many of these campus bridging problems have been addressed, or are being addressed, by NSF solicitations issued since the 2011 ACCI task force report.

When XSEDE started in 2011, it created a very small group called the XSEDE Campus Bridging group (originally 0.35 FTEs that grew to 2.5 FTEs over the course of five years of XSEDE). Challenges identified in the ACCI taskforce and met by XSEDE Campus Bridging Group, working with other areas of XSEDE, include:

  • Ease of file transfer. Prior to XSEDE, simply moving files around within the US research cyberinfrastructure ecosystem was a significant obstacle. Globus transfer provides a file and data movement service with "best possible" data throughput between any two CI systems (ranging from a laptop to a massive data archive) and an easy-to-use web interface. Globus transfer as implemented by XSEDE and XSEDE-allocated SPs has created a situation in which the large majority of XSEDE users are satisfied or happy with their ability to move files from local campuses to the national NSF-funded cyberinfrastructure.
  • Ease of management of local campus CI resources. The two cluster building kits supported by XCI - the XSEDE-Compatible Basic Cluster (XCBC) and the XSEDE National Integration Toolkit (XNIT) - allow campus cyberinfrastructure sysadmins to automate routine tasks, focusing their own precious time on meeting needs that are particular to their own local resources and local user needs.

XSEDE had a significant impact on the national CI ecosystem, as evidenced by the following statistics as of the end of the first five years of XSEDE:

  • Total aggregate TeraFLOPS of systems running XCBC and / or XNIT: 732
  • Total number of CI resources on which one or more XCI tools are used: 594
  • Total PetaBytes of files moved via Globus transfer since the start of XSEDE: 148
  • Total number of partnership interactions between XCRI and SPs, campus and national CI providers since the start of XSEDE: 72

Many other campus bridging problems were addressed by other entities. Examples of campus bridging challenges that went well beyond the scope of XSEDE, but which have been addressed since the start of XSEDE include:

  • The NSF CC* solicitation addresses many of the issues of local campus cyberinfrastructure and the capacity of campus network connections to national cyberinfrastructure
  • The XSEDE-allocated Wrangler system provides a data analytics and massive data disk storage and analysis resource allocated via XSEDE and available to the national research community.
  • The Service Provider Forum creates a place for information exchange between operators of campus CI systems and those providing services at the national level.

To better understand the beginnings and history of Campus Bridging as a concept, you can read an assessment of the general area of campus bridging as of the end of the first five years of XSEDE. You can read the report from a workshop run in conjunction with IEEE Cluster 2015 here: https://scholarworks.iu.edu/dspace/handle/2022/20538.

There is also a YouTube playlist for Campus Bridging videos at https://www.youtube.com/channel/UCdzP0orHEoZHtNhql98cbtw

Key Points
Different issues Campus Bridging faced
XSEDE had a significant impact on the national CI ecosystem
Contact Information

XCRI's Mission

The XSEDE Cyberinfrastructure Resource Integration group is dedicated to providing tools that will help campus cyberinfrastructure staff more easily manage their local computing and storage resources.

On the following three points, there is nearly universal agreement:

  • There are not enough people on campuses to meet all the demands for supporting researcher use of advanced cyberinfrastructure
  • There is simply not enough computational power in the US to support all of the meaningful and important research
  • The people on campuses managing local campus resources have too much to do and too little time in which to do it.

To dig a bit deeper into the challenges we face as a research community, let's look at some of the causes of the situation we have today (beyond the issue of simply not enough funding to meet demand):

  • Many sysadmins are doing tasks locally, by hand, in ways that are re-invented over and over again, when those tasks could largely be made consistent across campuses and automated
  • When users switch between local computing environments and national resources such as those supported by XSEDE, the transition often feels disconcerting, even though the systems are advertised as being similar
  • Writing documentation is hard and often the last task done when building a system; because of staffing shortages, busy sysadmins are frequently the ones asked to write documentation for end users
  • The state of software for managing and using our many and diverse cyberinfrastructure resources is not good enough and does not effectively support resource sharing

XCRI's goal is to provide as much aid to the US research community on the above four challenges as possible.

XCRI Toolkits and services include:

  • The XSEDE-Compatible Basic Cluster (XCBC) software toolkit enables campus CI resource administrators to build a local cluster from scratch, which is then easily interoperable with XSEDE-supported CI resources. XCBC is very simple in concept: pull the lever, have a cluster built for you complete with an open source resource manager / scheduler and all of the essential tools needed to run a cluster, and have those tools set in place in ways that mimic the basic setup on an XSEDE-supported cluster. The XCBC is based on the OpenHPC project, and consists of XSEDE-developed Ansible playbooks and templates designed to ease the work required to build a cluster. Consult the XSEDE Knowledge Base for complete information about how to use XCBC to set up a cluster.
  • The XSEDE National Integration Toolkit (XNIT). Suppose you already have a cluster that you are happy with, and you want to add software tools that let users run open-source software like that on XSEDE, or other particular pieces of software that you think are important, but you don't want to blow up your cluster to add that capability? XNIT is for you. You can add all of the basic software that is in XCBC, as relocatable RPMs (RPM Package Manager packages), via a YUM repository (YUM stands for Yellowdog Updater, Modified). The RPMs in XNIT allow you to expand the functionality of your cluster in ways that mimic the setup on an XSEDE cluster. XNIT packages include specific scientific, mathematical, and visualization applications that have been useful on XSEDE systems. Systems administrators may pick and choose what they want to add to their local cluster; updates may be configured to run automatically or manually. Currently the XNIT repository is available for x86_64 systems running CentOS 6 or 7. Consult the XSEDE Knowledge Base for more information.
  • Optional software that can be added to clusters running XCBC or XNIT is available. This includes a toolkit for installing a local Globus connect server. Globus transfer is the recommended method for transferring data to any XSEDE system.
  • XCRI staff will travel in person to your campus to help implement XNIT, XCBC, or any other XCRI tools on your campus. After an initial phone consultation, we can assist onsite with configuration and ensure that you have the knowledge you need to maintain your system. That's right: XSEDE will pay to fly XSEDE staff to your campus to help you with your campus cluster, even if you have no particular relationship to XSEDE. You can see the list of places we have gone to give talks or help people set up clusters, and you can read detailed descriptions of past visits to campuses in the XSEDE 2016 paper Implementation of Simple XSEDE-Like Clusters: Science Enabled and Lessons Learned. These site visits are funded by XSEDE, including staff travel and lodging.
  • XCRI offers a Cluster Monitoring toolkit - compatible with the XCBC or any OpenHPC cluster running Warewulf, that allows sites to install tools that monitor both cluster health and job statistics. The toolkit allows administrators to generate fine-grained reports of usage based on users, projects, and job types, which can be a great aid in keeping track of ROI or justifying future funding.
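The Ansible-based approach behind XCBC can be illustrated with a minimal playbook sketch. This is not one of the actual XSEDE playbooks; the host group name and the package names (drawn from the OpenHPC repository) are assumptions for illustration only.

```yaml
---
# Hypothetical sketch of an XCBC-style Ansible playbook (not the actual
# XSEDE playbooks): install an open-source scheduler and base tools on
# the head node, then ensure the scheduler controller is running.
- hosts: head_node
  become: yes
  tasks:
    - name: Install OpenHPC base packages and the Slurm server
      yum:
        name:
          - ohpc-base
          - ohpc-slurm-server
        state: present

    - name: Start and enable the Slurm controller daemon
      service:
        name: slurmctld
        state: started
        enabled: yes
```

Encoding cluster setup as playbooks like this is what makes the process repeatable: the same configuration can be re-applied to a rebuilt node or a new campus cluster without hand-typed steps.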
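To illustrate the kind of per-user usage report such a monitoring toolkit can produce, here is a minimal Python sketch. The job-record format below is a hypothetical stand-in; a real toolkit would pull records from the scheduler's accounting logs rather than a hard-coded list.

```python
from collections import defaultdict

# Hypothetical completed-job records (user, project, core-hours consumed);
# a real monitoring toolkit would read these from scheduler accounting data.
jobs = [
    {"user": "alice", "project": "bio101", "core_hours": 120.0},
    {"user": "bob",   "project": "chem42", "core_hours": 64.5},
    {"user": "alice", "project": "bio101", "core_hours": 35.5},
]

def usage_by_user(records):
    """Aggregate total core-hours per user across all job records."""
    totals = defaultdict(float)
    for record in records:
        totals[record["user"]] += record["core_hours"]
    return dict(totals)

print(usage_by_user(jobs))  # {'alice': 155.5, 'bob': 64.5}
```

Reports aggregated this way (per user, per project, or per job type) are exactly the kind of evidence useful for demonstrating ROI or justifying future funding.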

XCRI Impact

We're having an impact on the national CI ecosystem, as evidenced by the following statistics from the first five years of XSEDE:

  • Total aggregate TeraFLOPS of systems running XCBC and / or XNIT: 732
  • Total number of CI resources on which one or more XCI tools are used: 594
  • Total PetaBytes of files moved via Globus transfer since the start of XSEDE: 148
  • Total number of partnership interactions between XCRI and SPs, campus and national CI providers since the start of XSEDE: 72

A YouTube library of videos about XCI and its organizational forerunner the XSEDE1 Campus Bridging Group is available.

For a look at how XCRI's toolkits have helped campuses put resources in researchers' hands, see the YouTube video XSEDE Helps Small University Connect With Nation's Cyberinfrastructure.

Key Points
Shortage of time and resources to meet research demands
Causes of the challenges we face
XCRI's goal is to help the US research community adapt and leverage resources in response to these challenges
Contact Information

Newsletter

To receive the most up-to-date news and events from XSEDE, please subscribe to IMPACT by XSEDE, a monthly e-newsletter sent directly to your inbox.

Key Points
Stay up to date with XSEDE Newsletters
Contact Information

Organization

Learn about the breakdown of the XSEDE organizational structure. Questions? Contact us at info@xsede.org.

XSEDE is divided into four organizational levels:

  • Level 1 (Project Level):
    • encompasses all functional areas as well as external components (e.g. XAB, SPF, UAC)
    • led by the Principal Investigator, John Towns
  • Level 2 (L2 Functional Areas):
    • represents the six functional areas of the project: Community Engagement & Enrichment (CEE), Extended Collaborative Support Services (ECSS), XSEDE Cyberinfrastructure Integration (XCI), Operations (Ops), Resource Allocations Service (RAS), and the Program Office
    • led by the L2 Directors
  • Level 3 (L3 Focus Areas):
    • Each L2 Functional Area is separated into focus areas, called L3 teams, that, collectively, represent the functional responsibilities of that L2 area
    • led by L3 Managers
  • Level 4 (Working Level):
    • Individual team members complete assigned individual and group activities for the L3 team(s) they report into

A Work Breakdown Structure (WBS) approach is used to designate Levels 1 through 3 of the organizational structure.

Organizational level 2 teams are specifically set up to align with the project's strategic goals.

Work Breakdown Structure diagram

The breakdown of the XSEDE project is shown in the figure below.

XSEDE Organization Chart

Key Points
Four organizational levels
Organizational structure aligned with strategic goals
Contact Information

XSEDE Governance

Learn about XSEDE's governance and leadership as well as decision-making and performance metrics

The XSEDE project is a collaborative partnership of 19 institutions, led by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign.

The additional partners that strongly complement these cyberinfrastructure centers with expertise in science, engineering, technology and education can be found on the XSEDE Collaborations & Partnerships page.

Vision

The XSEDE vision is of a world of digitally enabled scholars, researchers, and engineers participating in multidisciplinary collaborations while seamlessly accessing advanced computing resources and sharing data to tackle society's grand challenges.

Mission

XSEDE exists to enhance the productivity of a growing community of scholars, researchers, and engineers through access to advanced digital services that support open research by coordinating and adding value to the leading cyberinfrastructure resources funded by the NSF and other agencies.

Strategic Goals

  1. Deepen and extend use for existing and new communities through workforce development and efforts that raise awareness of the value of advanced digital services.
  2. Advance the Ecosystem by creating an open and evolving infrastructure and enhancing the array of technical expertise and support services.
  3. Sustain the Ecosystem by providing reliable, efficient and secure infrastructure, excellent user support services, and an innovative, effective and productive virtual organization.

Governance

XSEDE governance model diagram: https://www.xsede.org/documents/1477968/1578972/XSEDEGovModel.png/59aecc25-c8b5-42dc-bb4c-7ce424ed59ec?t=1499435290000

XSEDE uses a balanced governance model that includes strong central management, providing rapid response to issues and opportunities, and openness to genuine stakeholder participation from NSF, the User Advisory Committee (UAC), the Service Provider Forum (SPF), and external advisors who form the XSEDE Advisory Board (XAB).

 

XSEDE Staff Wiki

The XSEDE Staff Wiki has a section dedicated to KPIs and Metrics. This section is accessible from the Wiki home page in the Project Execution section and is intended for general purpose use.

 

XSEDE Metrics Dashboard

The XSEDE Metrics Dashboard was created to support in-depth metrics analysis, including trend analysis. It presents multiple views and supports customized views, allowing very detailed metrics analysis. The dashboard is accessible via the XSEDE website and the XSEDE User Portal.

 

Key Points
Strong central management
Delegated decision-making authority
Stakeholder participation
Contact Information

Previous years' ECSS seminars may be accessed through these links:

2017

2016

2015

2014

August 21, 2018

OpenTopography: A gateway to high resolution topography data and services

Presenter(s): Choonhan Youn (SDSC)

Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic and bathymetric data for scientific, environmental, engineering and planning purposes. Because of the richness of these data sets, they are often extremely valuable beyond the application that drove their acquisition and thus are of interest to a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies such as lidar, it is often difficult to distribute these datasets. Furthermore, the data can be technically challenging to work with, requiring software and computing resources not readily available to many users. Some of these complex algorithms require high performance computing resources to run efficiently, especially in an on-demand processing and analysis environment. With the steady growth in the number of users and in the complexity and resource intensity of the algorithms that generate derived products from these invaluable datasets, HPC resources are becoming more necessary to meet the increasing demand. By utilizing the Comet XSEDE resource, OpenTopography aims to democratize access to and processing of these high-resolution topographic data.

Development of multiple scattering theory method: the recent progress and applications

Presenter(s): Yang Wang (PSC)

Multiple scattering theory is an ab initio electronic structure calculation method in the framework of density functional theory. It differs from other ab initio methods in that it is an all-electron method and is not based on a variational approach. Its easy access to the Green function makes it a unique tool for the study of random alloys and electronic transport. In this presentation, I will give a brief overview of multiple scattering theory, and will discuss the recent ECSS projects relevant to the development and applications of the multiple scattering theory method.


Ruby Mendenhall Charts Progress Using HPC, Big Data to Flag Unidentified Historical Sources on African American Women's Lives

Using HPC resources allocated to researchers through XSEDE, Mendenhall and a multidisciplinary team are gleaning lessons on how Black women related to and affected the larger society during periods when their written voices were underrepresented or even illegal.

By Ken Chiacchia, Pittsburgh Supercomputing Center

Information about the lives and experiences of Black women can be gleaned from surprising historical literary sources, Ruby Mendenhall of the University of Illinois Urbana-Champaign said on July 25 in a plenary talk at the PEARC18 conference in Pittsburgh, Pa. Using HPC resources allocated to researchers through XSEDE, Mendenhall and a multidisciplinary team are gleaning lessons on how Black women related to and affected the larger society during periods when their written voices were underrepresented or even illegal.

"We're using advanced computing to recover Black women's history," said Mendenhall, who is an Associate Professor in Sociology and African American Studies at Urbana-Champaign and newly appointed Assistant Dean for Diversity and Democratization of Innovation at the Carle Illinois College of Medicine. "How is inequality expressed or hidden in the everyday lives of African American women? How do they seek to challenge that inequality?"

The annual Practice and Experience in Advanced Research Computing (PEARC) conference—with the theme Seamless Creativity—stresses key objectives for those who manage, develop and use advanced research computing throughout the U.S. and the world. This year's program offered tutorials, plenary talks, workshops, panels, poster sessions and a visualization showcase.

Mendenhall said she first became aware of the resources available for the social sciences from exposure to the National Center for Supercomputing Applications (NCSA) at Urbana-Champaign, a member of XSEDE, an NSF-funded virtual organization that integrates and coordinates access to advanced cyberinfrastructure. Working with Michael Simeone and other staff at NCSA, she obtained a series of allocations in the XSEDE system that have allowed her group to analyze about 800,000 documents in the JSTOR and HathiTrust databases of documents from 1746 to 2014.

"Our motivation was that often literature by or about Black women was inaccessible or illegal," she said. "A lot of the voices and experiences are either not in the literature early on, or are under-represented."

Mendenhall and her collaborators analyzed the databases with two sets of keywords, those referring to race and those referring to gender. They conducted their analyses within the theoretical framework of standpoint theory, which posits that social and political experiences shape individuals' perspectives and positions. They queried the databases with two computational tools: latent Dirichlet allocation (LDA), a statistical model that infers the collection of topics found in a text; and comparative text mining (CTM), which identifies similarities and differences among topics under which words fall.
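The two-keyword-set querying step can be sketched in a few lines of Python. The keyword sets and sample documents below are illustrative placeholders, not the team's actual search terms or corpus, and this sketch shows only the simple keyword-matching stage, not the LDA or CTM modeling.

```python
# Illustrative sketch: flag documents that contain at least one term from
# each of two keyword sets. The terms and documents are placeholders,
# not the study's actual queries or data.
RACE_TERMS = {"black", "african"}
GENDER_TERMS = {"woman", "women", "girl", "mother"}

def matches_both(text, set_a, set_b):
    """True if the text contains at least one term from each keyword set."""
    words = set(text.lower().split())
    return bool(words & set_a) and bool(words & set_b)

docs = [
    "the black woman appeared before the court",
    "a report on crop yields in the county",
]
hits = [d for d in docs if matches_both(d, RACE_TERMS, GENDER_TERMS)]
print(len(hits))  # 1
```

At the scale of 800,000 documents, a filtering pass like this narrows the corpus before the far more expensive topic-modeling and human close-reading stages.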

Using XSEDE allocations on the former Blacklight, Greenfield and current Bridges systems at Pittsburgh Supercomputing Center (PSC), the group trained their algorithms using a subset of 20,000 texts known to be about Black women, then used those algorithms to identify potentially relevant texts in the larger databases. As a next step they reviewed the metadata from the positive results. Such a review of metadata is called an "intermediate reading." Many of those that passed that step received a traditional "close reading" of the full text by human subject experts to verify the works contained content relating to Black women.

"Our results unfortunately supported the idea of writing as an act of privilege," Mendenhall said. "We often had to go through [writings by] Black men or White women" to glean information about Black women's lives. "You wouldn't think you would read about Black women or their lived experience in some of these works, but when we did a close reading there was information about that."

One intriguing result stemmed from a topic derived in the LDA analysis revolving around court proceedings and property.

"It was unclear whether the property referred to land or to Black women held as slaves," Mendenhall said, but the close readings subsequently confirmed that the result was both valid and corresponded to a known historical period: the "golden age" before 1846 in which enslaved Black women had some success challenging their own status and their children's status via the U.S. legal system.

Their analysis was consistent with this historical period, which saw 575 freedom suits, in 60% of which enslaved people won their freedom.

"We're capturing some of the real experiences of Black women" at a time when it was illegal for them to be literate, she said.

Another, darker phenomenon concerned the use of Black women and their children as subjects in medical studies with incomplete or often absent consent. One article the Urbana-Champaign team identified in the American Journal of Diseases of Children in 1918 described the case of an undernourished 5-year-old Black child with chronic diarrhea. While the child's mother was only indirectly referenced in the paper, it offered some provocative insights into her relationship with the medical profession: she didn't or couldn't bring her child in for care for a year, until blood had appeared in the stool; the doctors referred them to a charity hospital, at which they were likely to receive inferior care relative to the hospital where the child had been assessed; and the child's reported diet prior to the symptoms suggested a typical diet for African Americans at that time and place.

Mendenhall said her team will next focus on the current state of Black women in the U.S. in the aftermath of the Great Recession, housing crisis, police shooting controversies, and other factors in what has been called a "new nadir in Black history" by historian Dr. Cha-Jua. The group is recruiting "citizen-scientists" to collect health data in real time and personal reporting via written or online journals to examine how gun violence affects public life and public health.  Her team is hoping to collaborate with higi, a consumer health data tracking service with access to more than 217 million health measurements from over 6.9 million account holders at 11,000 centers around the U.S.

"We're asking how we can use cyberinfrastructure to capture unheard stories about violence," she said, stressing the importance of investigating correlations between violence and Black maternal and infant mortality, diabetes, cancer and other medical problems.

Mendenhall sees the new research as an integral part of her new appointment at the College of Medicine. "We want to see the community at the table" in charting a course for medical research at the college in which the flow of information moves in both directions. In addition to helping design studies that engage and earn community support, "I'm hoping that community members will come forward with health issues that they would like solved."


Ruby Mendenhall gives her keynote speech at PEARC18 in Pittsburgh, PA.


XSEDE announces 2018-2019 Campus Champions Fellows

XSEDE has selected five Campus Champions Fellows for the next year, studying everything from agriculture to computer science.

Five researchers from American universities will work with cyberinfrastructure and high-performance computing experts from XSEDE and U.S. research teams to tackle real-world science and engineering projects over the next year in the 2018 Campus Champions Fellows program.

 

The 2018 cadre are current XSEDE Campus Champions: faculty, staff and researchers at over 200 U.S. institutions who advise others on their local campuses on the use of high-end cyberinfrastructure, as Champions have done for the past 10 years. The goal of the Campus Champions Fellows program is to increase expertise on campuses by including Campus Champions as partners in XSEDE's Extended Collaborative Support Services (ECSS) projects, which provide vital domain expertise to interested researchers.
 

  • Peter Hawrylak, University of Tulsa, "Workforce Development: Education"

    • ABSTRACT:

    • The XSEDE education program seeks to expand a capable and innovative advanced digital resource workforce across the country by providing access to example programs, course syllabi, and computational science education materials as well as guidance from XSEDE's education staff. Through the education program, Champions may propose a project to help create a formal undergraduate or graduate minor, concentration, or certificate program at their institution. This requires working with the faculty to identify the courses that would be part of such a program, locating and testing computational projects that would become parts of those courses, and working with the appropriate academic committees to prepare the materials needed to obtain program approval.

  • Chet Langin, Southern Illinois University at Carbondale, "Understanding sustainability issues in the global farm-food system using a global gridded model of agriculture"

    • ABSTRACT:

    • The challenge of attaining sustainability in the global farm-food system is quite daunting. By 2050, global population is expected to reach 9.7 billion persons, placing further pressure on global agriculture. If left unchecked, extensive expansion in the global farm-food system could encroach on hotspots of threatened natural systems, resulting in greater clearing of key forestlands and unsustainable withdrawals of water for irrigation. Climate change further compounds this problem as extreme temperature and precipitation dampen crop yields within and across countries. To untangle the complex food and environmental issues faced by the world's farm-food system, we developed a global gridded computational model of agriculture (SIMPLE-G: a Simplified International Model of agricultural Prices, Land use and the Environment). This agricultural model is open-source, geospatial (around 36,000+ grid cells) and flexible enough to accommodate a wide variety of interdisciplinary assessments (climate impacts, water scarcity, biodiversity, terrestrial carbon stocks and food security). A version of the model is implemented as a HUBzero tool and hosted on MyGeoHub (https://mygeohub.org).

  • Xinlian Liu, Hood College, "Interoperating CyberGIS and HydroShare for Scalable Geospatial and Hydrologic Sciences"

    • ABSTRACT:

    • CyberGIS and HydroShare are two NSF Sustainable Software Integration (SSI) projects that support separate but closely related domain science areas. Both projects move computation off the desktop into advanced cyberinfrastructure based on service-oriented architecture enabling computation on big data, avoiding platform dependency and software installation requirements and serving as gateways to high performance computing. Interoperability between the two systems will enable the coupling of data and multi-scale and multidisciplinary modeling capabilities from both communities and empower scalable geospatial and hydrologic sciences. As both projects grow to integrate big data analytics and advanced modeling capabilities, we face two major challenges in interoperability: 1. How to establish a user environment that seamlessly integrates distributed data, software, and computation from both CyberGIS and HydroShare in a sandbox where users can focus on domain research? 2. How to achieve interoperability features as extensible, reproducible, and reusable software solutions for scalable development, deployment, and operation in order to support broader collaboration of various multidisciplinary research in our communities?

  • Gil Speyer, Arizona State University, "Simulation for 2D Semiconductor with Parallel Uniform and Adaptive Multigrid Method for Multi-component Phase Field Crystal Models"

    • ABSTRACT:

    • Two-dimensional (2D) semiconductors and their heterostructures hold promise to yield revolutionary new technologies, ranging from nanosized transistors and efficient light emitting diodes to highly sensitive chemical sensors. 2D materials exhibit unique properties due to confinement in the third dimension. To fully exploit them, it is essential to develop techniques for growing large-area films while precisely controlling the nano/microscale morphology and defects. But due to the difficulty of fully characterizing such systems at the atomic scale, fundamental understanding of the relation between these properties and the growing conditions remains unclear. Computational modeling is essential for filling this gap. However, current state-of-the-art methods are not well suited to modeling the evolution of 2D materials on the mesoscale during growth while reflecting the influence of atomic-scale interactions. We devised a highly efficient parallel multigrid solver with P3DFFT, and implemented it to simulate large-scale HCP crystals modeled by the structural phase field crystal (XPFC) model. We also devised advanced data processing and visualization algorithms for the XPFC model. We are pursuing the following objectives: 1) improve the efficiency of the parallel multigrid solver further by implementing the MPI/OpenMP hybrid technique and adapting it to the Intel KNL many-core architecture; 2) conduct large-scale simulations of the XPFC model and the multi-component phase field crystal (PFC) model for 3D advanced materials such as the hexagonal close-packed (HCP) crystal and graphene, and analyze the results with data processing and visualization algorithms; 3) investigate novel visualization techniques for 3D atomistic data with a custom rendering system; 4) develop a parallel adaptive multigrid solver.

  • Mohammed Tanash, New Mexico State University, "Cyberinfrastructure Resource Integration"

    • ABSTRACT:

    • The XSEDE Cyberinfrastructure Integration (XCI) team seeks Campus Champions Fellowship applications for projects in bridging activities between a local campus or campuses and XSEDE resources. These can include creating workflow submission systems that send jobs to XSEDE Service Provider resources from campus, the creation of shared virtual compute facilities that allow jobs to be executed on multiple resources, data management for researchers with Globus Connect, the creation of local XSEDE Compatible Cluster Systems, or other projects that utilize tools which reduce barriers for scaling analyses from campuses to national cyberinfrastructure.

Accepted Fellows, with the support of their home institution, make a 400-hour time commitment and are paid a stipend to allow them to focus time and attention on these collaborations. The program also includes funding for two visits, each ranging from one to two weeks, to an ECSS, PI or conference site to enhance the collaboration.

 

For more information on the XSEDE Campus Champions Fellows program, including all past cohorts, visit: https://www.xsede.org/ccfellows.



Since the Campus Champions program began in 2008, it has grown to 450 champions, reaching over 200 institutions spanning all 50 states, the District of Columbia, Puerto Rico, Guam and the U.S. Virgin Islands. Over the last decade, the Campus Champions' mission has changed to reflect the diversity and growth of this community, which is, "To promote and facilitate the effective participation of a diverse national community of institutions in the application of advanced digital resources and services to accelerate scientific discovery and scholarly achievement." Thanks to sustained funding from XSEDE and the National Science Foundation (NSF) over the last ten years, Campus Champions have helped support research that will change the world, developed optimized workflows for high-performance computing and continue to improve research computing networks with a nationwide impact.

 


The Campus Champions are a flourishing, one-of-a-kind community of peer mentors and facilitators who communicate virtually to share information, best practices, and experiences: asking and answering questions on an active mailing list, sharing technical expertise and resources, and participating in video conferences as well as face-to-face meetups at regional and national conferences such as PEARC and Supercomputing (SC).

 

The Campus Champions hope to increase scalable and sustainable access to advanced digital services from providers at all levels. With the rapid growth of the Campus Champions community, the program will continue to foster a more diverse nationwide cyberinfrastructure ecosystem by cultivating inter-institutional exchanges of resources, expertise and support for another ten years.

 

XSEDE allocates $7.3M worth of computing time to U.S. researchers in latest allocation cycle

XSEDE has awarded 145 deserving research teams at 109 universities

 

XSEDE has awarded 145 deserving research teams at 109 universities and other institutions access to nearly two dozen NSF-funded computational and storage resources, as well as other services unique to XSEDE, such as the Extended Collaborative Support Services (ECSS). Total allocations this cycle, running July 1, 2018, through June 30, 2019, represent an estimated $7.3 million of time on multi-core, many-core, GPU-accelerated, and large-memory computing resources (which does not include additional consulting resources) all at no cost to the researchers. Since its founding in 2011, XSEDE and XSEDE 2.0 have allocated an estimated $270M of computing time.

 

This round of XSEDE allocations will enable scientific discoveries from 48 different fields of science, ranging from biochemistry to materials research to astronomical sciences. The full list of those with current allocations and their research abstracts may be found here.

 

XSEDE offers several different types of allocations, so that everyone from the first-time HPC user to the career HPC expert can gain access to the appropriate level of services that they need. Though the majority of XSEDE's resources are allocated via Research Allocations, XSEDE also offers Startup Allocations, which are one of the fastest ways to gain access to and start using XSEDE-allocated resources. These allocations are designed to allow researchers to further develop their applications, conduct benchmarking on various resources, and other small-scale computational activities that require the unique capabilities of resources allocated through XSEDE. XSEDE also offers Education Allocations for academic courses or training activities which allow instructors to request up to three separate computational resources.

 

Research requests for XSEDE allocations are accepted and reviewed on a quarterly basis by the XSEDE Resource Allocations Committee (XRAC), a volunteer panel of approximately 40 computational science experts from academia and industry who represent many fields of science and engineering.

 

More information on the XSEDE allocations process may be found here. For those interested in serving on the XRAC, more information about that process may be found here.

 

For U.S.-based researchers who currently use, or want to use, advanced research computing resources and services, XSEDE can help. Whether you intend to use XSEDE-allocated resources or resources elsewhere, the XSEDE program works to make such resources easier to use and help more people use them. Once you are ready to take the next step, you can become an XSEDE user in a matter of minutes and be on your way to taking your computational activities to the next level. Begin discovering more with XSEDE here.

About XSEDE

 

The Extreme Science and Engineering Discovery Environment, more commonly known as XSEDE, is an NSF-funded program that scientists use to interactively share computing resources, data and expertise. People around the world use these resources and services — things like supercomputers, collections of data and new tools — to improve our planet.

 

XSEDE is supported by the National Science Foundation through awards ACI-1053575 and ACI-1548562.

 



An allocation provides access to the XSEDE resources, typically via a block of computational time and storage. This roadmap walks through the topics most needed by new users who need allocations to access resources in high-performance computing, scientific visualization, and data storage. (Target audience: new and current XSEDE users)

 


 
July 2018 | Science Highlights, Announcements & Upcoming Events
 
The Extreme Science and Engineering Discovery Environment (XSEDE) is an NSF-funded single virtual organization which connects scientists with advanced digital resources. People around the world use these resources and services — things like supercomputers, collections of data, expert staff and new tools — to improve our planet. Learn how XSEDE can enable your research here.
 
Science Highlights
 
 
XSEDE Supercomputers Help Design Mutant Enzyme that Eats Plastic Bottles
 
PET plastic, short for polyethylene terephthalate, is the fourth most-produced plastic and is used to make things like beverage bottles and carpets, most of which are not being recycled. Some scientists are hoping to change that, using XSEDE-allocated supercomputers to engineer an enzyme that breaks down PET. They say it's a step on a long road toward recycling PET and other plastics into commercially valuable materials at industrial scale.
 
 
Electrostatic potential distribution of PETase structure. Image courtesy of Gregg Beckham.
 
 
Creating Ultra-Accurate Molecular Models by Applying Machine Learning Techniques
 
With the help of XSEDE-allocated resources, specifically the GPU-computing power and capabilities provided by Comet at SDSC and Maverick at TACC, a team of researchers led by UC San Diego's Department of Chemistry and Biochemistry and the San Diego Supercomputer Center (SDSC) is pioneering the use of machine learning techniques to develop models for simulations of Earth's most critical element – water – that can be extended to other generic molecules with what researchers call "unprecedented accuracy."
 
 
Machine learning techniques predict quantum mechanical many-body interactions in water. Shown is an example using neural networks for a water trimer (top left) from a simulation of liquid water (top right). Molecular descriptors encode the structural environment around oxygen atoms (red) and hydrogen atoms (white). When used as input for neural networks (blue boxes for oxygen, orange boxes for hydrogen), many-body energies can be calculated accurately. Credit: Andreas Goetz and Thuong Nguyen, SDSC/UC San Diego
 
 
Brain Imaging Library Offers Data, Metadata for State-of-the-Art 3D Volume Images
 
On June 19, PSC's Derek Simmel addressed the XSEDE Extended Collaborative Support Service's (ECSS) monthly symposium on the Brain Image Library now being built by PSC and its collaborating institutions. The BIL, funded as part of the National Institutes of Health's Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, will archive 3D volume image data of animal brains obtained using state-of-the-art microscopes operated in laboratories across the U.S. The BIL will make the data and associated metadata available to scientists who are studying the function of all brain cells, tissues and structures to answer big questions about how brains work and how to protect brain health. 
 
 
Announcements
 
NSF Office of Advanced Cyberinfrastructure Announces Core Research Program - New Solicitation NSF 18-567
 
The Office of Advanced Cyberinfrastructure (OAC) is pleased to announce its core research program solicitation, with the goals of supporting all aspects of advanced cyberinfrastructure (CI) research that will significantly impact the future capabilities of advanced research CI, as well as the research career paths of cyber scientists and engineers. Through this solicitation, OAC seeks to foster the development of new knowledge in the innovative design, development, and utilization of robust research CI.
 
Potential principal investigators (PIs) are strongly encouraged to contact an OAC cognizant program director listed in this solicitation with a 1-page project summary (with project overview, intellectual merit, and broad impact) for further guidance. OAC plans to hold a webinar for potential PIs and will announce it in the near future. The proposal submission window is November 1-15, 2018.
 
 
XCI Updates
 
The XSEDE Cyberinfrastructure Integration (XCI) team released an upgrade to the Information Publishing Framework (IPF) software (version 1.4) that enables allocated and unallocated/campus service providers to publish GPU information along with existing CPU, software, and other system information. Other improvements also enable Science Gateways and Community Software Area (CSA) owners to advertise their software and support contact coordinates through XSEDE software search interfaces. Dan Voss, Campus Champions Fellow for XCI, is testing virtual cluster software developed with the help of staff at SDSC at a campus cluster in Nebraska to enable flexible configurations on campus resources.
 
 
XSEDE Seeking Satellite Sites for XSEDE HPC Monthly Workshop Series
 
XSEDE is pleased to announce a regular series of remote workshops on High Performance Computing topics. These hands-on workshops provide a convenient way for researchers to learn about the latest techniques and technologies of current interest in HPC. We are currently accepting satellite sites for:
  • October 2-3: MPI
  • November 6: GPU Computing Using OpenACC
  • December 4-5: BIG DATA
 
You, or another representative of your institution, should contact us if you are interested in hosting a remote site. Satellite sites MUST be able to provide a facility capable of two-way video for the duration of the workshop. There should also be a full-time TA available to assist with local technical issues. An AV system test will be scheduled before each workshop. Failure to complete a successful AV test prior to an event, and subsequent AV issues during the workshop, may affect the priority level of your site for future events. Address any questions regarding course content to John Urbanic (urbanic@psc.edu) and questions regarding registration to Tom Maiden (tmaiden@psc.edu).
 
Upcoming Events
 
 
Early-Bird registration for Gateways 2018 now open!
 
Early-bird registration for Gateways 2018, which will be held at the University of Texas at Austin, September 25-27, is open through Monday, August 6. (Regular registration closes Friday, September 7.) Book your hotel now for the best rates! Use the following hyperlinks to check out the conference program, submit an abstract for the poster session (by August 1), or reserve a space at the Resource Expo.
 
Questions? Email help@sciencegateways.org.
 
 
XSEDE supercomputers help design enzyme that breaks down plastic bottles

TACC Stampede2, SDSC Comet systems simulate PETase enzyme interactions

Supercomputers helped study the binding of a plastic-degrading enzyme, PETase, which could lead to developing industrial-scale plastic recycling for throw-away bottles and carpet. Electrostatic potential distribution of PETase structure courtesy of Gregg Beckham.

A plastics pollution problem might have met its match, thanks to scientific computing. PET, short for polyethylene terephthalate, is the fourth most-produced plastic in the world and is used to make things like beverage bottles and carpets, much of which is not recycled. Scientists want to change that, using XSEDE-awarded time on supercomputers to engineer an enzyme that breaks down PET. They say their computationally-assisted research, which has already uncovered one surprising twist, is a needed step on a long road toward recycling PET and other plastics into commercially valuable materials at industrial scale.

Gregg Beckham, National Renewable Energy Laboratory.

"We're ideally going from a place where plastics are hard to recycle, to a place where we use nature and millions of years of evolution to direct things in a way that make plastic easy to recycle," said Lee Woodcock, an Associate Professor of Chemistry at the University of South Florida. Woodcock co-authored a study that probed the structure of PETase, an enzyme to degrade PET, and was published March of 2018 in the Proceedings of the National Academy of Sciences (PNAS).

Supercomputers allowed the researchers to tackle tough science questions about PETase, such as the details of how it interacts with its substrate at the molecular scale – something beyond the scope of what could be determined from its crystal structure alone.

For this study, the researchers took advantage of computational resources allocated through XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation.

Lee Woodcock, Chemistry Department, University of South Florida.

"Having access to XSEDE resources really opens up the possibility of being able to model and being able to study what type of large-scale conformational or even local, small structural changes occur as a function of both binding to the substrate and, additionally, what are the structural changes the large-scale or local, small scale structural changes that occur in the enzyme after we make the mutations. That was a big part of what we were looking at," Woodcock said.

Woodcock explained that they simulated the system using a technique known as molecular dynamics, in which the motions of the individual atoms in the enzyme, substrate and surrounding water were tracked over long timescales. The work employed two software packages, NAMD (Nanoscale Molecular Dynamics) and CHARMM (Chemistry at Harvard Macromolecular Mechanics), with the forces between the atoms modeled using the CHARMM force fields.
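The molecular dynamics approach Woodcock describes can be illustrated with a minimal sketch. This is purely illustrative and is not the NAMD/CHARMM setup from the study: two particles interact through a simple Lennard-Jones potential, and their positions are advanced with the velocity Verlet integrator that production MD codes also use (reduced units, unit masses are assumptions of this toy example).

```python
import numpy as np

# Illustrative sketch only: NOT the study's NAMD/CHARMM setup, just the
# bare mechanics of molecular dynamics. Two particles interact through a
# Lennard-Jones potential; motions are advanced with velocity Verlet.

EPS, SIG, DT = 1.0, 1.0, 0.005  # LJ well depth, LJ radius, time step

def forces(pos):
    """Lennard-Jones forces on both particles."""
    r_vec = pos[0] - pos[1]
    r = np.linalg.norm(r_vec)
    f_mag = 24.0 * EPS * (2.0 * (SIG / r) ** 12 - (SIG / r) ** 6) / r
    f0 = f_mag * r_vec / r          # force on particle 0
    return np.array([f0, -f0])      # Newton's third law gives particle 1

def step(pos, vel):
    """One velocity-Verlet time step (unit masses)."""
    vel_half = vel + 0.5 * DT * forces(pos)
    pos_new = pos + DT * vel_half
    vel_new = vel_half + 0.5 * DT * forces(pos_new)
    return pos_new, vel_new

# Start the pair at rest, stretched past the potential minimum; the
# atoms then oscillate about their equilibrium separation.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros((2, 3))
for _ in range(1000):
    pos, vel = step(pos, vel)
```

Production codes such as NAMD apply this same integration idea to hundreds of thousands of atoms with full CHARMM force-field terms (bonds, angles, electrostatics), which is what makes supercomputer time essential.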

XSEDE awarded Gregg Beckham allocations on the Stampede1 and Stampede2 systems at the Texas Advanced Computing Center (TACC) and on the Comet system at the San Diego Supercomputer Center (SDSC).

"Our experience to date on Stampede2 has been absolutely wonderful," Beckham said. "For all the codes on there that we use, it's been a fantastic machine. We get through the queues quickly. We're producing a lot of great science across the spectrum of what our groups are collectively doing together using Stampede2 right now. Certainly, for the research on the plastics-degrading enzyme, we're using it for manuscripts and studies going forward on this same topic."

Electrostatic potential distribution of PETase structure. Image courtesy of Gregg Beckham.

"One nice thing about Comet," Woodcock said, "is that you have, for jobs that you need to get through in a high-throughput fashion, SDSC has a shared queue, which allows you to submit much smaller jobs but do it in a very high-throughput fashion, as they can share cores on the nodes at Comet. This was particularly helpful."

The study built on a discovery in 2016 by Yoshida et al. of a bacterium, Ideonella sakaiensis 201-F6, that feeds on PET plastic as its source of carbon and energy. The PNAS study authors focused on the bacteria's plastic-degrading enzyme, called PETase. Team members at the University of Portsmouth, led by Professor John McGeehan, used X-ray crystallography at the Diamond Light Source in the UK to solve the high-resolution crystal structure of PETase.

"We then used computer simulations to understand how a polymeric ligand like PET would be able to bind to the enzyme," said study co-author Gregg Beckham, a Senior Research Fellow and Group Leader at the US National Renewable Energy Laboratory (NREL). "We also conducted experimental work to show that indeed, the PETase can break down water or soda bottles, industrially relevant PET films, and another plastic, polyethylene furanoate."

After doing this work on the structure and function of the PETase enzyme, the authors next tried to understand its evolution and look to similar enzymes, a family of cutinases, which degrade the waxy polymer cutin found on the surface of plants.

"We developed the hypothesis that if we make the PETase enzyme more like a cutinase, then we should make the enzyme worse. When we did this work, in fact we ended up making the enzyme slightly better by doing that," Woodcock said.

"It was incredibly surprising to us," Beckham explained. "When we made it more cutinase-like, the enzyme was modestly improved. That's actually one of the key aspects of where computation came in, because it allowed us to essentially predict or suggest aromatic-aromatic interactions in the enzyme with the aromatic polyester PET could potentially be responsible for its improved activity. But it was quite a surprise to us," Beckham said.

Both researchers agreed that computation helps make scientific discoveries. "Experimentalists and computational scientists are working hand-in-hand ever more frequently," Woodcock said. "And without access to resources like this, this would really take us a step back, or multiple steps back, in producing the highest levels of science and really being able to address the world's most challenging problems, which is what we did in this particular study, done by partnering with top-level experimental groups like our collaborators in the UK and with us here in the US."

Beckham said that their work has just begun on enzymes that clean up plastic pollution. "We're just starting to understand how this enzyme has evolved," Beckham said. He wants to use computation to take advantage of large databases of genomics and metagenomics on enzymes to find the needles in the haystack that can degrade plastics.

"The other thing too that we're interested in," Beckham said, "is if we're able to do this at much higher temperature, that would be able to accelerate the degradation of PET and get us into realms that potentially could be industrially relevant in terms of using an enzyme to degrade PET and then convert that into the higher value materials, which could incentivize higher rates of reclamation, especially in the developing world where lots of plastic waste goes into the ocean."

Electron microscope images showing interaction of PETase enzyme with PET plastic. Image courtesy of Gregg Beckham.

Lee Woodcock sees new computational techniques as a game-changer in modeling non-druglike force fields that tackle polymer interactions more realistically than CHARMM and NAMD can today. "I'm working with colleagues at NREL on making sure that we can improve the force fields in a very rapid fashion, so that if somebody comes in and says that we need to look at this polymer next, we have confidence that we can put together a modeling strategy in a very short amount of time to get a quick turnaround when we have to model many different polymers."

The scientists are hopeful their work will one day make the world outside of the lab a better place. A dump truck's worth of plastic empties into the ocean every minute. Worldwide, humankind produces over 300 million tons of plastic each year, much of which is predicted to last centuries to millennia and pollutes both aquatic and terrestrial environments. "Understanding how we can better design processes to recycle plastics and reclaim them is a dire global problem and it's something that the scientific and engineering community has to come up with solutions for," Beckham said.

The study, "Characterization and engineering of a plastic-degrading aromatic polyesterase," was published in March 2018 in the Proceedings of the National Academy of Sciences. The authors are Harry P. Austin, Mark D. Allen, Alan W. Thorne, John E. McGeehan of the University of Portsmouth; Bryon S. Donohoe, Rodrigo L. Silveira, Michael F. Crowley, Antonella Amore, Nicholas A. Rorrer, Graham Dominic, William E. Michener, Christopher W. Johnson, Gregg T. Beckham of the National Renewable Energy Laboratory; Fiona L. Kearns, Benjamin C. Pollard, H. Lee Woodcock of the University of South Florida; Munir S. Skaf of the University of Campinas; Ramona Duman,Kamel El Omari, Vitaliy Mykhaylyk, Armin Wagner of the Diamond Light Source, Harwell Science and Innovation Campus. The National Renewable Energy Laboratory Directed Research and Development Program funded the study, with computer time provided by the Extreme Science and Engineering Discovery Environment (XSEDE) allocation MCB-090159.



Creating Ultra-Accurate Molecular Models by Applying Machine Learning Techniques

XSEDE Resources Provide the GPU Capability to Develop New Models

A team led by researchers at UC San Diego's Department of Chemistry and Biochemistry and the San Diego Supercomputer Center (SDSC) has pioneered the use of machine learning techniques to develop models for simulations of Earth's most critical element – water – that can be extended to other generic molecules with what researchers call "unprecedented accuracy."

Their work, published recently in The Journal of Chemical Physics, demonstrates how popular machine learning techniques can be used to construct predictive molecular models based on quantum mechanical reference data. Molecular simulations using modern high-performance computing systems such as the ones provided via XSEDE are key to the rational design of novel materials with applications ranging from fuel cells to water purification systems, atmospheric climate models, and computational drug design.

"This is a new methodology that could revolutionize computational chemistry," noted SDSC Director Michael Norman, who also is the principal investigators for the Comet supercomputer, an XSEDE resource based at SDSC.

The team relied on the GPU-computing power and capabilities provided by Comet as well as Maverick, based at the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Access to both systems was allocated through the Extreme Science and Engineering Discovery Environment (XSEDE), an NSF-funded program under which scientists can interactively share computing resources, data, and expertise.

"Although computer simulations have become a powerful tool for the modeling of water and for molecular sciences in general, they are still limited by a tradeoff between the accuracy of the molecular models and the associated computational cost," said Francesco Paesani, professor of chemistry and biochemistry at UC San Diego and the study's principal investigator.

"Now that we've proved this concept with a model of water using machine learning techniques, we are currently extending this novel approach to generic molecules," added Paesani. "Scientists will be able to predict the properties of molecules and materials with unprecedented accuracy."

Researchers used that term because these new models are more accurate than classical force fields that are currently used for molecular simulations, explained Andreas W. Goetz, a research scientist who directed the work at SDSC. Now researchers can make quantitative predictions of the properties of water, for example, where other models fail.

The new study builds on the highly accurate and successful "MB-pol many-body potential" for water developed in Paesani's lab, which recently has emerged as an accurate molecular model for water simulations from the gas to liquid to solid phases.

As reported in the paper, the researchers investigated the performance of three machine learning techniques – permutationally invariant polynomials, neural networks, and Gaussian approximation potentials – in representing many-body interactions in water. Machine learning typically involves 'training' a computer on large numbers of examples so that the computer learns how to derive insight and meaning from the data as time advances.

All three methods proved essentially equivalent in reproducing large quantum mechanical datasets involving the interaction of multiple particles – many-body phenomena such as two-body and three-body energies – as well as water cluster interaction energies, all with great accuracy.

"We have demonstrated that these different machine learning techniques can effectively be employed to encode the highly complex quantum mechanical many-body interactions that arise when molecules interact," said Thuong Nguyen, lead author of the study and a research scholar at UC San Diego when the research was conducted.

GPUs and Complex Neural Networks

As for future efforts, these findings are not only important because the models are highly accurate, but also because it means researchers can choose the algorithms that best map to the available hardware, according to SDSC's Goetz. "Modern many-core processors, for instance, are well-suited to evaluate the complex expressions of the permutationally invariant polynomials, while massively parallel graphics processing units (GPUs) perform exceptionally well for neural networks," he said.

The development of complex neural networks with associated optimization processes was performed on Comet and Maverick via XSEDE allocations. "Currently, there are not many GPU resources conveniently available," said Goetz, a long-time user of XSEDE resources. "So we are grateful for XSEDE for providing access to such systems in a manner that saved us both time and expense, in turn accelerating our time to published results."

Also participating in the study were researchers at the École Polytechnique Fédérale de Lausanne in Switzerland, Cambridge University in England, and the University of Göttingen in Germany. This research was supported by the National Science Foundation through grant # ACI-1642336.



Machine learning techniques predict quantum mechanical many-body interactions in water. Shown is an example using neural networks for a water trimer (top left) from a simulation of liquid water (top right). Molecular descriptors encode the structural environment around oxygen atoms (red) and hydrogen atoms (white). When used as input for neural networks (blue boxes for oxygen, orange boxes for hydrogen), many-body energies can be calculated accurately. Credit: Andreas Goetz and Thuong Nguyen, SDSC/UC San Diego

