Science Stories


Researchers Use Machine Learning to More Quickly Analyze Key Capacitor Materials

XSEDE Resources Speed Up Electronics Design


By: Jorge Salazar

Scientists at Georgia Tech are using machine learning with supercomputers to analyze the electronic structure of materials and ultimately find ways to build more capable capacitors. (Left) Density functional theory (DFT) charge density of a molecular dynamics snapshot of a benzene molecule. (Right) Charge density difference between the machine learning prediction and DFT for the same benzene structure. Credit: Rampi Ramprasad, Georgia Tech.


Capacitors, given their high energy output and recharging speed, could play a major role in powering the machines of the future, from electric cars to cell phones.

But the biggest hurdle for capacitors as energy storage devices is that they store much less energy than a battery of similar size.

Anand Chandrasekaran, a postdoctoral researcher, and Rampi Ramprasad, a professor in the School of Materials Science and Engineering, stand in a room with a high-powered computer dedicated to machine learning. (Credit: Allison Carter)

Researchers at the Georgia Institute of Technology (Georgia Tech) are tackling the problem in a novel way with the help of the National Science Foundation-funded Extreme Science and Engineering Discovery Environment (XSEDE) project. They've combined machine learning with XSEDE-allocated supercomputers to find ways to build more capable capacitors, which could lead to better power management for electronic devices.

The method was described in Nature Partner Journals Computational Materials, published in February 2019. The study involved teaching a computer to analyze, at the atomic level, two materials used to make some capacitors: aluminum and polyethylene.

The researchers focused on finding a way to more quickly analyze the electronic structure of the capacitor materials, looking for features that could affect performance.

"The electronics industry wants to know the electronic properties and structure of all of the materials they use to produce devices, including capacitors," said Rampi Ramprasad, a professor in the School of Materials Science and Engineering at Georgia Tech. 

For example, polyethylene is a very good insulator with a large band gap, the energy range forbidden to electrical charge carriers. But if it has a defect, unwanted charge carriers are allowed into the band gap, reducing efficiency, he said.  

"In order to understand where the defects are and what role they play, we need to compute the entire atomic structure, something that so far has been extremely difficult," said Ramprasad. "The current method of analyzing those materials using quantum mechanics is so slow that it limits how much analysis can be performed at any given time."

Analyzing the electronic structure of a material with quantum mechanics involves solving the Kohn-Sham equation of density functional theory, which generates data on wave functions and energy levels. That data is then used to compute the total potential energy of the system and atomic forces.
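In the usual notation, the Kohn-Sham equations determine a set of single-particle orbitals and energy levels from an effective potential that itself depends on the resulting electron density:

```latex
\left[-\frac{\hbar^{2}}{2m}\nabla^{2} + v_{\mathrm{eff}}(\mathbf{r})\right]\psi_{i}(\mathbf{r})
  = \varepsilon_{i}\,\psi_{i}(\mathbf{r}),
\qquad
n(\mathbf{r}) = \sum_{i\in\mathrm{occ}} \left|\psi_{i}(\mathbf{r})\right|^{2}
```

Because the effective potential depends on the density it produces, the equations must be iterated to self-consistency, and the cost grows rapidly with the number of atoms, which is why a machine-learned surrogate is so attractive.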

Ramprasad and colleagues were awarded allocations of supercomputer resources by XSEDE. "XSEDE was mostly used to develop the training dataset used in our work," said study co-author Deepak Kamal, a graduate student advised by Ramprasad at the Georgia Tech School of Materials Science and Engineering. 

Deepak Kamal, Georgia Tech School of Materials Science and Engineering.

"We had to do a lot of trials to choose the best dataset for the work," Kamal continued, "and later to develop and test our model. This involved hundreds of calculations for generating the structures that we used for the dataset. That included computationally expensive simulations like ab initio (first principles) molecular dynamics simulations on polyethylene slabs and crystals with about 500 atoms, and aluminum slabs and crystals with about 1,000 atoms. We used XSEDE because the compute nodes on XSEDE resources are fast and have a large memory associated with them, which makes them ideal for calculations on large structures."

The researchers used the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), part of the University of Texas at Austin. Stampede2 is an XSEDE-allocated resource capable of 18 petaflops, with 4,200 Intel Knights Landing nodes complemented with 1,736 Intel Xeon Skylake nodes. "We exclusively used Stampede2 for this work. The calculations were very fast and queue time was reasonable as well," Kamal said.

Ramprasad was also awarded supercomputing time through XSEDE on the Comet system of the San Diego Supercomputer Center (SDSC), an Organized Research Unit of the University of California San Diego. Comet is capable of 2.76 petaflops, achieved mainly through 1,944 Intel Haswell standard compute nodes. "In the work leading up to the study, we used the Comet cluster extensively for high-throughput polymer electronic property calculations, such as the effect of polymer morphology on the band gap of polymers," Kamal said. "We used Comet because it was fast and efficient at handling large numbers of calculations."

The quantum mechanical analyses of aluminum and polyethylene run on Stampede2 and Comet produced a body of sample data, which in turn was used to train a machine learning model to reproduce that analysis.

The new machine learning method developed by Ramprasad and colleagues produced similar results several orders of magnitude faster than the conventional technique based on quantum mechanics.

"This unprecedented speedup in computational capability will allow us to design electronic materials that are superior to what is currently out there," Ramprasad said. "Basically we can say, ‘Here are defects with this material that will really diminish the efficiency of its electronic structure.' And once we can address such aspects efficiently, we can better design electronic devices."

Overview of the process used to generate surrogate models for the charge density and density of states. The first step entails the generation of the training dataset by sampling random snapshots of molecular dynamics trajectories. First-principles calculations were then performed on these systems (shown in Figure S1) to obtain the training atomic configurations, charge densities, and local density of states. The scalar (S), vector (V), and tensor (T) fingerprint invariants are mapped to the local electronic structure at every grid point. For the charge density, this mapping is achieved using a simple fully connected neural network with one output neuron. The local density of states (LDOS) spectrum, on the other hand, is learned via a recurrent neural network architecture, wherein the LDOS at every energy window is represented as a single output neuron (linked via a recurrent layer to other neighboring energy windows). The trained model is then used to predict the electronic structure (i.e., DOS and charge density) of an unseen configuration. Credit: Rampi Ramprasad, Georgia Tech.
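The charge-density mapping described above can be sketched as a toy fully connected network with a single output neuron per grid point. Everything below (the 16-component fingerprint, the 32-unit hidden layer, and the synthetic target data) is an illustrative assumption, not the authors' code, which uses rotationally invariant scalar, vector, and tensor fingerprints as input features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 256 grid points, each described by a
# 16-component fingerprint vector (random here, invariants in the paper).
X = rng.normal(size=(256, 16))
y = np.tanh(X @ rng.normal(size=16))   # synthetic "charge density" target

# One hidden layer feeding a single output neuron, mirroring the
# simple fully connected charge-density model; sizes are arbitrary.
W1 = rng.normal(scale=0.1, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    return h, (h @ W2 + b2).ravel()     # one scalar output per grid point

_, pred0 = forward(X)
initial = float(np.mean((pred0 - y) ** 2))   # loss before training

lr = 0.05
for _ in range(500):
    h, pred = forward(X)
    err = pred - y
    # Plain gradient descent on the mean-squared error.
    gW2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
final = float(np.mean((pred - y) ** 2))      # loss after training
```

Once trained on DFT-generated samples like these, such a surrogate can be evaluated at every grid point of a new, unseen configuration far faster than re-solving the quantum mechanical equations.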

While the study focused on aluminum and polyethylene, machine learning could be used to analyze the electronic structure of a wide range of materials. Beyond analyzing electronic structure, other aspects of material structure now analyzed by quantum mechanics could also be hastened by the machine learning approach, Ramprasad said.

"In part we selected aluminum and polyethylene because they are components of a capacitor. But we also demonstrated that you can use this method for vastly different materials, such as metals that are conductors and polymers that are insulators," Ramprasad said.

The faster processing allowed by the machine learning method would also enable researchers to more quickly simulate how modifications to a material will impact its electronic structure, potentially revealing new ways to improve its efficiency.  

Said Kamal: "Supercomputing systems allow high-throughput computing, which enables us to create vast databases of knowledge about various material systems. This knowledge can then be utilized to find the best material for a specific application."

The study, "Solving the electronic structure problem with machine learning," was published in February 2019 in the journal Nature Partner Journals Computational Materials. The authors are Anand Chandrasekaran, Deepak Kamal, Rohit Batra, Chiho Kim, Lihua Chen, and Rampi Ramprasad of the School of Materials Science and Engineering, Georgia Institute of Technology. This research was supported by the Office of Naval Research under grant No. N00014-17-1-2656.

CITATION: Anand Chandrasekaran, Deepak Kamal, Rohit Batra, Chiho Kim, Lihua Chen, and Rampi Ramprasad, "Solving the electronic structure problem with machine learning," Nature Partner Journals Computational Materials (2019).

XSEDE awarded scientists access to the Comet supercomputer at the San Diego Supercomputer Center (left) and the Stampede2 supercomputer at the Texas Advanced Computing Center (right).

This story is based on the original press release from Georgia Tech.