https://next-gen.materialsproject.org/materials/mp-22851
Visualization created from the .traj outputs of an ASE molecular dynamics (MD) simulation.
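As a rough illustration of how such a visualization can be produced, the sketch below reads the frames of an ASE trajectory and exports them as an animation; the file name md.traj is a placeholder, not the actual simulation output used here.

```python
# Minimal sketch: load every frame of an ASE MD trajectory and export an
# animated GIF. "md.traj" is a placeholder file name.
from ase.io import read, write

frames = read("md.traj", index=":")          # load all frames of the trajectory
print(f"{len(frames)} frames, {len(frames[0])} atoms each")

# Write an animated GIF; ase.visualize.view(frames) opens an interactive viewer instead.
write("md_movie.gif", frames)
```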
Since the announcement of the Materials Genome Initiative by the Obama administration in 2011, much attention has been given to materials design as a way to accelerate the discovery of new materials with technological implications. Although its biggest impact has been on more applied materials such as batteries, there is increasing interest in applying these ideas to predict new superconductors. This is obviously a challenge, given that superconductivity is a many-body phenomenon, with whole classes of known superconductors lacking a quantitative theory. Given this caveat, various efforts to formulate materials design principles for superconductors are reviewed here, with a focus on surveying the periodic table in an attempt to identify cuprate analogues. https://arxiv.org/abs/1601.00709
Tibetan-style vajra/dorje. Based on references from https://www.himalayanart.org/search/set.cfm?setID=563.
The authors use a new optical device to drive metallic K3C60 with mid-infrared pulses of tunable duration, ranging between one picosecond and one nanosecond. The superconducting-like optical properties previously observed over short time windows for femtosecond excitation are shown here to become metastable under sustained optical driving, with lifetimes in excess of ten nanoseconds.
The study of superconductivity in compressed hydrides is of great interest due to measurements of high critical temperatures (Tc) in the vicinity of room temperature, beginning with the observations in LaH10 at 170-190 GPa. However, the pressures required to synthesize these high-Tc superconducting hydrides remain extremely high. Here, crystal structures and superconductivity in the La-B-H system under pressure are investigated with particle-swarm intelligence structure-search methods in combination with first-principles calculations. https://arxiv.org/abs/2107.02553
Realizing general inverse design could greatly accelerate the discovery of new materials with user-defined properties. However, state-of-the-art generative models tend to be limited to a specific composition or crystal structure. Herein, we present a framework capable of general inverse design (not limited to a given set of elements or crystal structures), featuring a generalized invertible representation that encodes crystals in both real and reciprocal space, and a property-structured latent space from a variational autoencoder (VAE). https://arxiv.org/abs/2005.07609
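To make the property-structured latent space idea concrete, here is a minimal, generic sketch of a VAE whose latent space is additionally organized by a scalar property. It is not the paper's invertible crystallographic representation; the 128-dimensional input vector and the single property head are illustrative assumptions.

```python
# Minimal sketch of a property-structured VAE: a property-regression head on
# the latent code encourages the latent space to organize by that property.
# Input dimension and property head are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PropertyVAE(nn.Module):
    def __init__(self, x_dim=128, z_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 64), nn.ReLU())
        self.mu = nn.Linear(64, z_dim)
        self.logvar = nn.Linear(64, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
        self.prop = nn.Linear(z_dim, 1)   # property head structures the latent space

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        return self.dec(z), self.prop(z), mu, logvar

def loss_fn(x, x_hat, y, y_hat, mu, logvar, beta=1.0, gamma=1.0):
    recon = F.mse_loss(x_hat, x)                                    # reconstruction term
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence
    prop = F.mse_loss(y_hat.squeeze(-1), y)                         # property regression
    return recon + beta * kld + gamma * prop
```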
Here, we report a universal interatomic potential (IAP) for materials based on graph neural networks with three-body interactions (M3GNet). The M3GNet IAP was trained on the massive database of structural relaxations performed by the Materials Project over the past 10 years and has broad applications in structural relaxation, dynamic simulations, and property prediction of materials across diverse chemical spaces. Chi Chen & Shyue Ping Ong, https://www.nature.com/articles/s43588-022-00349-3 (preprint version available on arXiv).
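A minimal usage sketch, assuming the open-source m3gnet package's documented Relaxer interface; the CsCl placeholder cell stands in for whatever structure is being relaxed.

```python
# Minimal sketch assuming the open-source `m3gnet` package (pip install m3gnet).
# The CsCl cell is only a placeholder input structure.
from pymatgen.core import Lattice, Structure
from m3gnet.models import Relaxer

structure = Structure(Lattice.cubic(4.1), ["Cs", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]])

relaxer = Relaxer()                       # loads the pretrained M3GNet IAP
result = relaxer.relax(structure)

print(result["final_structure"])
print("final energy (eV):", float(result["trajectory"].energies[-1]))
```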
This study employs the SuperCon dataset, the largest superconducting materials dataset. Various data pre-processing steps are performed to derive the clean DataG dataset, containing 13,022 compounds, and the CatBoost algorithm is then applied to predict the transition temperatures of superconducting materials. In addition, a package called Jabir was developed to generate 322 atomic descriptors, along with a hybrid feature-selection method, the Soraya package, to select the most critical features from the feature space. These yield R2 and RMSE values (0.952 and 6.45 K, respectively) superior to those previously reported in the literature. Finally, as a novel contribution to the field, a web application was designed for predicting the Tc values of superconducting materials.
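A minimal sketch of the regression setup: fit a CatBoost model on a descriptor matrix and score it with R2/RMSE. The random arrays stand in for the 322 Jabir descriptors and the measured Tc values, only so the snippet runs as written.

```python
# Minimal sketch of CatBoost regression on composition descriptors, scored with
# R2 and RMSE. X and y are random placeholders for the descriptor matrix and Tc.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

X = np.random.rand(1000, 322)            # placeholder descriptor matrix
y = np.random.rand(1000) * 100           # placeholder Tc values (K)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = CatBoostRegressor(iterations=500, learning_rate=0.05, depth=8, verbose=0)
model.fit(X_tr, y_tr, eval_set=(X_te, y_te))

pred = model.predict(X_te)
print("R2:  ", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```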
Data-driven methods, in particular machine learning, can help to speed up the discovery of new materials by finding hidden patterns in existing data and using them to identify promising candidate materials. In the case of superconductors, the use of data-science tools has to date been slowed by a lack of accessible data. In this work, we present a new and publicly available superconductivity dataset ('3DSC'), featuring the critical temperature Tc of superconducting materials in addition to tested non-superconductors.
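A quick loading sketch for a 3DSC-style table with pandas; the file name and the column names ("formula", "tc") are assumptions for illustration, so check the dataset's own documentation for the actual schema.

```python
# Minimal sketch of loading a 3DSC-style CSV. File name and column names
# ("formula", "tc") are assumptions; consult the dataset docs for the real schema.
import pandas as pd

df = pd.read_csv("3DSC_MP.csv")
superconductors = df[df["tc"] > 0]
print(len(df), "entries,", len(superconductors), "with finite Tc")
print(superconductors[["formula", "tc"]].head())
```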
For 4,000 “low-Tc” superconductors (i.e., non-cuprate and non-iron-based), Tc is plotted vs. the a) average atomic weight, b) average covalent radius, and c) average number of d valence electrons. A low average atomic weight and a low average number of d valence electrons are necessary (but not sufficient) conditions for achieving high Tc in this group. d) Scatter plot of Tc for all known superconducting cuprates vs. the mean number of unfilled orbitals. Panels c) and d) suggest that the values of these predictors place hard limits on the maximum achievable Tc.
a) Histogram of materials categorized by Tc (bin size is 2 K; only materials with finite Tc are counted). Blue, green, and red denote low-Tc, iron-based, and cuprate superconductors, respectively. In the inset: histogram of materials categorized by ln(Tc), restricted to those with Tc > 10 K. b) Performance of different classification models as a function of the threshold temperature (Tsep) that separates materials into two classes by Tc. Performance is measured by accuracy (gray), precision (red), recall (blue), and F1 score (purple). The scores are calculated from predictions on an independent test set, i.e., one separate from the dataset used to train the model. In the inset: the dashed red curve gives the proportion of materials in the above-Tsep set. c) Accuracy, precision, recall, and F1 score as a function of the size of the training set with a fixed test set. d) Accuracy, precision, recall, and F1 score as a function of the number of predictors.
Superconductivity has been the focus of enormous research effort since its discovery more than a century ago. Yet, some features of this unique phenomenon remain poorly understood; prime among these is the connection between superconductivity and chemical/structural properties of materials. To bridge the gap, several machine learning schemes are developed herein to model the critical temperatures (Tc) of the 12,000+ known superconductors available via the SuperCon database. Materials are first divided into two classes based on their Tc values, above and below 10 K, and a classification model predicting this label is trained. The model uses coarse-grained features based only on the chemical compositions. It shows strong predictive power, with out-of-sample accuracy of about 92%. https://www.nature.com/articles/s41524-018-0085-8
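A minimal sketch of that classification setup: composition-only features and a binary label for Tc > 10 K. The Magpie featurizer from matminer and the random-forest model are stand-ins rather than the paper's exact pipeline, and the tiny compound list is illustrative only.

```python
# Minimal sketch: composition-based features, binary label Tc > 10 K.
# Featurizer and model are stand-ins; the compound list is illustrative.
from matminer.featurizers.composition import ElementProperty
from pymatgen.core import Composition
from sklearn.ensemble import RandomForestClassifier

data = [("MgB2", 39.0), ("Nb3Sn", 18.0), ("Al", 1.2), ("NbTi", 9.8)]  # (formula, Tc in K)
featurizer = ElementProperty.from_preset("magpie")

X = [featurizer.featurize(Composition(f)) for f, _ in data]
y = [int(tc > 10.0) for _, tc in data]           # label: above/below 10 K

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict([featurizer.featurize(Composition("Nb3Ge"))]))
```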
Machine-learned force fields have transformed the atomistic modeling of materials by enabling simulations of ab initio quality on unprecedented time and length scales. However, they are currently limited by: (i) the significant computational and human effort that must go into development and validation of potentials for each particular system of interest; and (ii) a general lack of transferability from one chemical system to the next. Here, using the state-of-the-art MACE architecture we introduce a single general-purpose ML model, trained on a public database of 150k inorganic crystals, that is capable of running stable molecular dynamics on molecules and materials.
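A minimal sketch of running molecular dynamics with the pretrained MACE-MP foundation model through ASE, assuming the mace-torch package's mace_mp calculator; the bulk Cu cell, temperature, and run length are placeholders.

```python
# Minimal sketch: NVT dynamics with the pretrained MACE-MP model via ASE.
# Assumes the `mace-torch` package; structure and run length are placeholders.
from ase.build import bulk
from ase.md.langevin import Langevin
from ase import units
from mace.calculators import mace_mp

atoms = bulk("Cu", cubic=True) * (3, 3, 3)
atoms.calc = mace_mp(model="medium")          # downloads the pretrained foundation model

dyn = Langevin(atoms, timestep=1.0 * units.fs, temperature_K=300,
               friction=0.01 / units.fs)
dyn.run(200)                                  # 200 MD steps
print("potential energy (eV):", atoms.get_potential_energy())
```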
We present the second generation of our atoms-in-molecules neural network potential (AIMNet2), which is applicable to species composed of up to 14 chemical elements in both neutral and charged states, making it a valuable method for modeling the majority of non-metallic compounds. Using an exhaustive dataset of 2 × 10^7 quantum chemical calculations at the hybrid DFT level of theory, AIMNet2 combines ML-parameterized short-range and physics-based long-range terms to attain generalizability that reaches from simple organics to diverse molecules with “exotic” element-organic bonding.
Here we present the Crystal Hamiltonian Graph Neural Network (CHGNet), a graph neural network-based machine-learning interatomic potential (MLIP) that models the universal potential energy surface. CHGNet is pretrained on the energies, forces, stresses and magnetic moments from the Materials Project Trajectory Dataset, which consists of over 10 years of density functional theory calculations of more than 1.5 million inorganic structures. https://www.nature.com/articles/s42256-023-00716-3
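A minimal usage sketch, assuming the open-source chgnet package's documented predict_structure interface; the rock-salt NaCl cell is just a placeholder input.

```python
# Minimal sketch assuming the open-source `chgnet` package (pip install chgnet).
# The NaCl cell is only a placeholder; any pymatgen Structure works.
from chgnet.model import CHGNet
from pymatgen.core import Lattice, Structure

structure = Structure.from_spacegroup("Fm-3m", Lattice.cubic(5.69),
                                      ["Na", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]])

chgnet = CHGNet.load()                      # pretrained on the Materials Project Trajectory Dataset
pred = chgnet.predict_structure(structure)

print("energy per atom (eV):", pred["e"])
print("magnetic moments:", pred["m"])
```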
This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. https://www.nature.com/articles/s41467-022-29939-5
Training in 1.58b With No Gradient Memory. Preprint paper by wbrickner
Building Ouro, searching for room-temperature superconductors and rare-earth-free permanent magnets with machine learning.