Welcome to the XSEDE podcast page! Below are a few audio-recorded feature news segments about XSEDE science success stories, all written, directed and produced by Scott Gibson of the National Institute for Computational Sciences and Jorge Salazar from the Texas Advanced Computing Center, and members of the XSEDE external relations team. Make sure to check back for more on this page, through our "Impact" newsletter or via social media on Facebook and Twitter, @XSEDEscience.

Kelly Gaither Starts Advanced Computing for Social Change


XSEDE selected 20 graduate and undergraduate students to participate in a week-long event in November 2016 called Advanced Computing for Social Change. The event was hosted by XSEDE, TACC and SC16.
The SC16 Social Action Network student cohort tackled a computing challenge. They learned how to mine a variety of data sets, such as social media data spanning several years and large geographic regions. To complete their analysis in a timely fashion, they learned how to organize the large data sets to allow fast queries.
The students of the SC16 Social Action Network applied a computational modeling tool called risk terrain modeling, which has been used to predict crime from crime statistics. The technique was first introduced to TACC in work done with Cook County Hospital in Chicago, Illinois. That work used statistical data to predict child maltreatment in an effort to put programs in place to prevent it.
Podcast host Jorge Salazar interviewed Kelly Gaither, Director of Visualization at the Texas Advanced Computing Center (TACC). Gaither is also the Director of Community Engagement and Enrichment for XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation.
Kelly Gaither: Advanced Computing for Social Change is an initiative that we started to really use our collective capabilities, here at TACC and more broadly at supercomputing centers across the nation, to work on problems that we know have need for advanced computing. You can think of it as data analysis, data collection, all the way to visualization and everything in between to really work on problems of societal benefit. We want to make a positive change using the skill sets we already have.
The SC16 supercomputing conference took place in Salt Lake City, Utah November 13-18, 2016. The event showcases the latest in supercomputing to advance scientific discovery, research, education and commerce. 

How to See Living Machines


Scientists have taken the closest look yet at molecule-sized machinery called the human preinitiation complex, which opens up DNA so that genes can be copied and turned into proteins. The science team included researchers from Northwestern University, Lawrence Berkeley National Laboratory, Georgia State University, and UC Berkeley. They used a cutting-edge technique called cryo-electron microscopy and combined it with supercomputer analysis. They published their results in May 2016 in the journal Nature.
Over 1.4 million 'freeze frames' of the human preinitiation complex, or PIC, were obtained with cryo-electron microscopy. They were initially processed using supercomputers at the National Energy Research Scientific Computing Center. This sifted out background noise and reconstructed three-dimensional density maps that showed details in the shape of the molecule that had never been seen before.
Study scientists next built an accurate model that made physical sense of the density maps of the PIC. For that they used XSEDE, the eXtreme Science and Engineering Discovery Environment, funded by the National Science Foundation. Through XSEDE, the Stampede supercomputer at the Texas Advanced Computing Center modeled the human preinitiation complex for this study. The team's computational work on molecular machines also includes XSEDE allocations on the Comet supercomputer at the San Diego Supercomputer Center.
Podcast host Jorge Salazar interviews Eva Nogales, Professor in the Department of Molecular and Cellular Biology at UC Berkeley and Senior Faculty Scientist and Howard Hughes Medical Institute Investigator at Lawrence Berkeley National Laboratory; and Ivaylo Ivanov, Associate Professor in the Department of Chemistry at Georgia State University.

Supercomputing Black Hole Jets


Black holes pose one of the great mysteries in physics. You might know that black holes are so massive that nothing, not even light, can escape once it gets close enough.
The great physics mystery is that something is shooting out of some black holes at close to the speed of light. No one knows how these jets form. Supercomputer simulations of black hole jets are starting to shed light on them.
To get a better picture of what's happening, imagine gases, dust, and debris from stars spiraling into a black hole. They form a disc around it. Coming out the top and bottom of the disc are jets - one made of electrons and protons and the other of electrons and positrons - the antimatter twin brother of electrons.
On the podcast, host Jorge Salazar talks more about black hole jets with Ken-Ichi Nishikawa, a principal research scientist at the Center for Space Plasma and Aeronomic Research at the University of Alabama in Huntsville. Dr. Nishikawa's team has been awarded XSEDE allocations on several supercomputers, including time on Maverick, Ranch, and Stampede at the Texas Advanced Computing Center; Oasis and Gordon at the San Diego Supercomputer Center; and Nautilus at the National Institute for Computational Sciences. His simulations study how black hole jets interact with the plasma environment that surrounds them.
Nishikawa co-authored a study published in April 2016 in the Astrophysical Journal. The study's simulations showed for the first time structural differences between the two jets.


August 16, 2016

It takes a supercomputer to grow a better soybean, and a project called the Soybean Knowledge Base, or SoyKB for short, aims to do just that. Scientists at the University of Missouri-Columbia developed SoyKB as a publicly available web resource for all soybean data, from molecular data to field data, with several analytical tools built in.

SoyKB has grown to be used by thousands of soybean researchers in the U.S. and beyond, with the support of XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. The SoyKB team used XSEDE resources to sequence and analyze the genomes of over a thousand soybean lines, consuming about 370,000 core hours on the Stampede supercomputer at the Texas Advanced Computing Center. They've since moved that work from Stampede to Wrangler, TACC's newest data-intensive system. And they're getting more users onboard with an allocation on XSEDE's Jetstream, a fully configurable cloud environment for science.

Host Jorge Salazar interviews Trupti Joshi and Dong Xu of the University of Missouri-Columbia; and Mats Rynge of the University of Southern California.

Feature Story: www.tacc.utexas.edu/-/soybean-scien…-supercomputers

Recovering Lost History


The story in this podcast revolves around a collaboration of social scientists, humanities scholars, and digital researchers using advanced computing to find and understand the historical experiences of Black women by searching two massive databases (HathiTrust and JSTOR) for written works from the 18th through the 20th centuries. The team is also developing a common toolbox that can help other digital humanities projects.

The research is supported by the National Science Foundation's eXtreme Science and Engineering Discovery Environment (XSEDE), the preeminent collection of integrated digital resources and services in the world. (xsede.org)

Participating in the podcast's discussion are the following: Ruby Mendenhall, an associate professor at the University of Illinois, Urbana-Champaign, and the project's principal investigator; Nicole M. Brown, a postdoctoral researcher at the National Center for Supercomputing Applications (NCSA); Michael Black, an assistant professor of English at the University of Massachusetts, Lowell; and Mark Vanmoer, a senior visualization programmer at NCSA.


Tiny Zaps, Big Results: Laser–Materials Research with Guest Leonid Zhigilei


The "small talk" of researchers in nanotechnology is extremely small. Their interest is in the physical phenomena occurring with things one-billionth of a meter in size, a million times shorter than the length of an ant, and up to 100,000 times thinner than a human hair. But the benefits to society of the science, engineering, and technology they're doing at such tiny scales are huge.

In fact, more than 800 everyday commercial products rely on nanoscale materials and processes, according to the National Nanotechnology Initiative (nano.gov). A central aspect of nanotechnology is that it allows essential structures of materials to be tailored to achieve specific properties that improve a variety of applications in medicine, energy, information technology, and many other areas.

One method of nanotechnology research involves the use of short laser pulses at minuscule fractions of a second to produce structural changes in thin, localized surface regions of various materials, such as gold, silver, or silicon.

Leonid Zhigilei, who heads the Computational Materials Research Group at the University of Virginia, says via telephone in this podcast that what attracted him to this type of research is the ability of lasers to excite and change materials in ways not possible with any other technique.

Direct link to story: https://www.nics.tennessee.edu/zhigilei-laser-nanotechnology

Societal Impact of Earthquake Simulations at Extreme Scale 11-17-2015

On the podcast, host Jorge Salazar interviews Thomas Jordan, Professor of Earth Sciences at the University of Southern California and Director of the Southern California Earthquake Center, a large national collaboration of over a thousand earthquake experts at some 70 institutions.

Dr. Jordan uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to model earthquakes and help reduce their risk to life and property. Dr. Jordan was invited to speak at the SC15 supercomputing conference on the Societal Impact of Earthquake Simulations at Extreme Scale.

Thomas Jordan: One thing people need to understand is we need a lot of supercomputer time in order to be able to do these calculations. Some of our simulation models that are based on the simulation of earthquake physics can take hundreds of millions of hours of computer time to generate.

These are very complex system-level calculations. They're of similar complexity to trying to calculate what Earth's climate is going to be like in 50 years because of human activities and CO2 charging of the atmosphere. It's a similar scale of problem.

These problems that deal with natural hazards and the complexity of the Earth system really require very large computers to be able to simulate that activity. We're frankly looking forward to the day when computers are ten times or a hundred times or more faster than they are today.

Direct link to the story: https://soundcloud.com/usetacc/sc15-societal-impact-of-earthquake-simulations-at-extreme-scale

Wrangler supercomputer takes on big data 11-16-2015

Podcast host Jorge Salazar interviews Niall Gaffney, Director of Data Intensive Computing at the Texas Advanced Computing Center. Gaffney leads efforts at TACC to bring online a new data-intensive supercomputing system called Wrangler.

The National Science Foundation's Division of Advanced Cyberinfrastructure awarded TACC and its collaborators 11.2 million dollars in November of 2013 to build and operate the Wrangler supercomputer. Indiana University, TACC, and the University of Chicago worked together on the project.

In April of 2015, Wrangler began early operations for the open science community, where results are made freely available to the public. Wrangler will augment the Stampede supercomputer, one of the most powerful in the world. And Wrangler will join the cyberinfrastructure of NSF-funded XSEDE, the eXtreme Science and Engineering Discovery Environment.

Niall Gaffney: We went to propose to build Wrangler with (the data world) in mind. We kept a lot of what was good with systems like Stampede, but then added new things to it like a very large flash storage system, a very large distributed spinning disc storage system, and high speed network access to allow people who have data problems that weren't being fulfilled by systems like Stampede and Lonestar to be able to do those in ways that they never could before.

Direct link to the story: https://soundcloud.com/usetacc/wrangler-supercomputer-takes-on-big-data

Revealing the Hidden Universe with Supercomputer Simulations of Black Hole Mergers 11-18-2015

November 2015 marks 100 years since Einstein published his field equations, which describe space and time as one interwoven continuum and predict the existence of black holes and more.

On the podcast, host Jorge Salazar interviews Manuela Campanelli, a professor at the Rochester Institute of Technology and the Director of the Center for Computational Relativity and Gravitation. Dr. Campanelli was invited by the SC15 supercomputing conference to give a presentation titled "Revealing the Hidden Universe with Supercomputer Simulations of Black Hole Mergers." Dr. Campanelli uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to probe the mysteries of black holes. She spoke by Skype about the 100th anniversary of Einstein's field equations and about her work on accurately describing black hole mergers.

Manuela Campanelli: General Relativity is celebrating this year a hundred years since its first publication in 1915, when Einstein introduced his theory of General Relativity, which has revolutionized in many ways the way we view our universe. For instance, the idea of a static Euclidean space, which had been assumed for centuries and the concept that gravity was viewed as a force changed. They were replaced with a very dynamical concept of now having a curved space-time in which space and time are related together in an intertwined way described by these very complex, but very beautiful equations.

Direct link to the story: https://soundcloud.com/usetacc/sc15-campanelli

Earthquakes reveal deep secrets beneath East Asia 05-15-2015

Host Jorge Salazar of the Texas Advanced Computing Center interviews computational scientists Min Chen of Rice University and Jeroen Tromp of Princeton University about their XSEDE-enabled research, supported through the Campus Champions program.

An international science team reported the discovery of gigantic rock structures hidden deep under East Asia, centered on the Tibetan Plateau. Scientists used supercomputers to process earthquake data and make 3-D images down to depths of about 900 kilometers, or about 560 miles below ground. Scientists from China, Canada, and the U.S. worked together to publish their results in March 2015 in the American Geophysical Union's Journal of Geophysical Research: Solid Earth.

The study area is a hotspot for earthquakes, and it's surrounded by networks of seismographic stations, 1,869 stations in all. That's where the scientists got the data used to take CAT scans of the Earth with the supercomputer model they developed.

The science team says their research could potentially help discover hidden pockets of hydrocarbon resources like oil and gas. More broadly they say their work will help explore the Earth hidden miles under East Asia and elsewhere.

Direct link to the story: https://soundcloud.com/usetacc/earthquakes-reveal-deep-secrets-beneath-east-asia


An XSEDE Campus Bridging Success Story

This podcast episode focuses on a recent XSEDE Campus Bridging success story involving a site visit. The National Science Foundation-funded eXtreme Science and Engineering Discovery Environment (XSEDE) is a single virtual system that scientists can use to interactively share computing resources, data and expertise. People around the world use these resources and services-things like supercomputers, collections of data and new tools-to improve our planet.

XSEDE's Campus Bridging aims to make cyberinfrastructure-whether local, regional, national, or international-feel close at hand to students, researchers, and staff. Computational systems, data and information management, advanced instruments, visualization environments, and people are all linked together by software for a sense of virtual proximity akin to accessing a peripheral device plugged into the back of a laptop.

Campus Bridging helps institutions with their computer clusters, or sets of computers that work together, by providing an open-source toolkit called Rocks for deploying, managing, upgrading, and scaling applications in parallel across the cluster. Each release of Rocks has core and optional Rolls that provide the operating system and other applications integral to cluster computing. With these Rolls, institutions can build an XSEDE-compatible basic cluster, or XCBC.

Campus Bridging also offers on-site visits for software installations. The first such visit took place in April at Marshall University in Huntington, West Virginia. The project involved increasing the cluster's capabilities and enhancing system management.

Direct link: https://soundcloud.com/tennessee-supercomputing/an-xsede-campus-bridging-success-story-installation-at-marshall-university

Earth, Wind, & Fire: A Computer Model Combination Could Unravel the Mysteries of Sudden Firestorms 4-22-2015

Unanticipated firestorms can take lives and destroy property, equipment, and other resources. With help from the eXtreme Science and Engineering Discovery Environment (XSEDE), researchers are working to improve wildfire forecasts and give firefighting professionals a more powerful tool.

XSEDE is a single, virtual system funded by NSF for scientists to interactively share resources, data, and expertise. People around the world use XSEDE's resources and services-things like supercomputers, collections of data, and new tools-to improve our planet.

The Joint Institute for Computational Sciences (JICS) was established by the University of Tennessee and Oak Ridge National Laboratory (ORNL) to advance scientific discovery and leading-edge engineering, and to further knowledge of computational modeling and simulation. JICS realizes its vision by taking full advantage of petascale-and-beyond computers housed at ORNL and by educating a new generation of scientists and engineers well-versed in the application of computational modeling and simulation for solving the most challenging scientific and engineering problems. JICS operates the National Institute for Computational Sciences (NICS), which had the distinction of deploying and managing the Kraken supercomputer. NICS is a leading academic supercomputing center and a major partner in XSEDE. In November 2012, JICS sited the Beacon system, which set a record for power efficiency and captured the number one position on the Green500 list of the most energy-efficient computers.

Founded in 1985, the San Diego Supercomputer Center (SDSC) enables international science and engineering discoveries through advances in computational science and data-intensive, high-performance computing.

Direct link: https://soundcloud.com/tennessee-supercomputing/earth-wind-fire-a-computer-model-combination-could-unravel-the-mysteries-of-sudden-firestorms

Link to story on the website of the National Institute for Computational Sciences (NICS): www.nics.tennessee.edu/wildfire-modeling

Computer-Designed Rocker Protein World's First to Biomimic Ion Transport 04-07-2015

Host Jorge Salazar of the Texas Advanced Computing Center interviews Michael Grabe, an associate professor in the Department of Pharmaceutical Chemistry and the Cardiovascular Research Institute at the University of California, San Francisco.

For the first time ever, scientists designed completely from scratch a protein molecule that behaves like a slice of life. It mimics a natural protein found in living cells that transports ions across a cell membrane. The cell membrane surrounds living cells like an envelope. And ion transport through the membrane helps keep us alive. It lets nutrients in and waste out of cells, and it also transmits signals between nerve cells of the brain and spinal cord. 

Scientists used the Stampede supercomputer at TACC to model the stability and dynamics of the designed protein. They did this with an allocation through XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. The researchers published their results in the journal Science in December 2014.

This research has wide potential application, such as targeting medicines more specifically to cancer cells and driving charge separation, potentially for harvesting energy for batteries.

Direct link to the story: https://soundcloud.com/usetacc/computer-designed-rocker-protein-worlds-first-to-biomimic-ion-transport


Supercomputers Help Solve Puzzle-Like Bond for Biofuels 3-12-2015

Host Jorge Salazar of the Texas Advanced Computing Center interviews Klaus Schulten, professor of physics at the University of Illinois at Urbana-Champaign. Dr. Schulten used the Stampede supercomputer, through an XSEDE allocation, to discover one of life's strongest chemical bonds while doing research on biofuels.

The biomolecular interaction, which binds at about half the strength of a covalent chemical bond, holds together the pieces of a finger-like system of proteins called cellulosomes, which bacteria in cow stomachs use to digest plants. The researchers published their results in the journal Nature Communications in December 2014. Their finding could boost efforts to develop catalysts for biofuel production from non-food waste plants.

Direct link to the story: https://soundcloud.com/usetacc/supercomputers-help-solve-puzzle-like-bond-for-biofuels


Exotic States Materialize with Supercomputers 02-11-2015

Host Jorge Salazar of the Texas Advanced Computing Center interviews research scientist Xiaofeng Qian of Texas A&M University. Qian describes his discovery of materials with novel electrical properties. His science team made the discovery using the Stampede supercomputer, through an XSEDE allocation, and TACC's Lonestar supercomputer.

The scientists found a new class of materials that have an exotic state of matter known as the quantum spin Hall effect. The researchers published their results in the journal Science in December 2014, where they proposed a new type of transistor made from these materials. 

The computational allocation was made through XSEDE, the Extreme Science and Engineering Discovery Environment, a single virtual system funded by the National Science Foundation that scientists use to interactively share computing resources, data and expertise. The study itself was funded by the U.S. Department of Energy and the NSF.

Direct link to the story: https://soundcloud.com/usetacc/exotic-states-materialize-with-supercomputers


Spurring Scientific Exploration on the Great Plains with XSEDE and HPC 02-10-2015

Computer scientist Doug Jennewein's working environment is, in more ways than one, a great frontier. Situated at the University of South Dakota (USD) on America's wind-swept plain, he is in a job that is allowing him to pursue his passion for science and technology, and be a pioneer in growing the use of computationally assisted research.

In this podcast, he discusses how the Extreme Science and Engineering Discovery Environment (XSEDE) has enhanced his work as USD's research computing manager.

Direct Link: https://soundcloud.com/tennessee-supercomputing/spurring-scientific-exploration-on-the-great-plains-with-xsede-and-hpc

Link to the story on the XSEDE website: https://www.xsede.org/spurring-scientific-exploration


Oil Spill Research Gets a Boost from Supercomputing 02-03-2015

Environmental research performed since the Deepwater Horizon oil spill in the Gulf of Mexico has uncovered fundamental changes to the microbes in the Louisiana marshes. Supercomputing is advancing DNA analyses involved in the work.

This podcast features interview comments from Annette Engel, associate professor of earth and planetary sciences at the University of Tennessee, Knoxville.

Direct link: https://soundcloud.com/tennessee-supercomputing/oil-spill-research-gets-a-boost-from-supercomputing

Link to the story on the website of the National Institute for Computational Sciences (NICS): www.nics.tennessee.edu/gulf-coast-research



Interstellar mystery solved by supercomputer simulations 11-26-2014

Host Jorge Salazar of the Texas Advanced Computing Center interviews theoretical astrophysicist Philip Hopkins about supercomputer simulations of galaxy evolution enabled by an XSEDE allocation on the Stampede supercomputer.

Since the 1970s, astrophysicists have been puzzled by observations that only a small fraction of the matter in a star-forming cloud becomes a star and part of a galaxy. They have also found far less of the universe's mass than expected in the centers of galaxies.

Things changed when a multi-university collaboration produced a set of new supercomputer models of galaxies called FIRE, for Feedback In Realistic Environments. Philip Hopkins led a 2014 study of initial results, which found that stellar activity, such as supernova explosions or even just starlight, plays a big part in the formation of other stars and the growth of galaxies.

Direct link to the story: https://soundcloud.com/usetacc/interstellar-mystery-solved-by-supercomputer-simulations


Supercomputing beyond genealogy reveals surprising European ancestors 10-28-2014

Host Jorge Salazar of the Texas Advanced Computing Center interviews Joshua Schraiber, a National Science Foundation postdoctoral fellow at the University of Washington. Dr. Schraiber used an XSEDE allocation on TACC's Stampede supercomputer to compare the genomes of ancient humans with those of modern humans.

A group of scientists peered thousands of years back into Europe's murky past and found a mysterious ancestor. They published their results in September 2014 in the journal Nature. With state-of-the-art genetic tools, supercomputing simulations and modeling, they traced the origins of modern Europeans to three distinct populations.

One group they found were hunter-gatherers with olive skin and mainly blue eyes who came to the European continent about 12,000 years ago. Another distinct group were farmers with light-colored skin and brown eyes from the Near East, who migrated west and mixed with the hunter-gatherers. The mystery group of ancient Europeans recently revealed by scientists turned out to be northern Eurasians who migrated all the way from Siberia.

Direct link to the story: https://soundcloud.com/usetacc/supercomputing-beyond-genealogy-reveals-surprising-european-ancestors