Ouro

Files

6908 total
Scatter plots of Tc for superconducting materials with family-specific regression predictors

Image

For 4000 “low-Tc” superconductors (i.e., non-cuprate and non-iron-based), Tc is plotted vs. the a) average atomic weight, b) average covalent radius, and c) average number of d valence electrons. Low average atomic weight and a low average number of d valence electrons are necessary (but not sufficient) conditions for achieving high Tc in this group. d) Scatter plot of Tc for all known superconducting cuprates vs. the mean number of unfilled orbitals. Panels c) and d) suggest that the values of these predictors impose hard limits on the maximum achievable Tc.


SuperCon dataset and classification model performance

Image

a) Histogram of materials categorized by Tc (bin size is 2 K; only those with finite Tc are counted). Blue, green, and red denote low-Tc, iron-based, and cuprate superconductors, respectively. Inset: histogram of materials categorized by ln(Tc), restricted to those with Tc > 10 K. b) Performance of different classification models as a function of the threshold temperature (Tsep) that separates materials into two classes by Tc. Performance is measured by accuracy (gray), precision (red), recall (blue), and F1 score (purple). The scores are calculated from predictions on an independent test set, i.e., one separate from the dataset used to train the model. Inset: the dashed red curve gives the proportion of materials in the above-Tsep set. c) Accuracy, precision, recall, and F1 score as a function of the size of the training set with a fixed test set. d) Accuracy, precision, recall, and F1 score as a function of the number of predictors.
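The accuracy, precision, recall, and F1 scores shown in panels b)–d) all derive from the same binary confusion matrix over the above/below-Tsep split. A minimal pure-Python sketch (the sample labels below are illustrative, not from SuperCon):

```python
def classification_scores(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for a binary above/below-Tsep split.
    Labels: 1 = Tc above Tsep, 0 = Tc below Tsep."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Toy example: six materials, two misclassified
acc, prec, rec, f1 = classification_scores([1, 1, 0, 0, 1, 0],
                                           [1, 0, 0, 0, 1, 1])
```

Sweeping Tsep (as in panel b) just means recomputing `y_true` from the Tc values at each threshold and re-evaluating these four scores.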


Machine learning modeling of superconducting critical temperature

PDF

Superconductivity has been the focus of enormous research effort since its discovery more than a century ago. Yet some features of this unique phenomenon remain poorly understood; prime among these is the connection between superconductivity and the chemical/structural properties of materials. To bridge this gap, several machine learning schemes are developed herein to model the critical temperatures (Tc) of the 12,000+ known superconductors available via the SuperCon database. Materials are first divided into two classes based on their Tc values, above and below 10 K, and a classification model predicting this label is trained. The model uses coarse-grained features based only on the chemical compositions. It shows strong predictive power, with an out-of-sample accuracy of about 92%. https://www.nature.com/articles/s41524-018-0085-8
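A coarse-grained composition feature of the kind the abstract describes can be computed directly from a formula's stoichiometry. A hedged sketch (the element table is a tiny illustrative subset, and the averaging scheme is one simple choice, not necessarily the paper's exact featurization):

```python
import re

# Illustrative subset of elemental atomic weights (u); a real featurizer
# would use a full periodic-table lookup and many more elemental properties.
ATOMIC_WEIGHT = {"Mg": 24.305, "B": 10.811, "Nb": 92.906, "Sn": 118.71}

def parse_formula(formula):
    """Split a formula such as 'MgB2' into {'Mg': 1.0, 'B': 2.0}."""
    counts = {}
    for sym, num in re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula):
        if sym:
            counts[sym] = counts.get(sym, 0.0) + (float(num) if num else 1.0)
    return counts

def mean_atomic_weight(formula):
    """Composition-weighted average atomic weight: one coarse-grained feature."""
    counts = parse_formula(formula)
    total = sum(counts.values())
    return sum(ATOMIC_WEIGHT[s] * n for s, n in counts.items()) / total

# MgB2: (24.305 + 2 * 10.811) / 3 ≈ 15.31 u
feature = mean_atomic_weight("MgB2")
```

A vector of such composition-averaged properties per material is what a classifier would consume when predicting the above/below-10 K label.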


A foundation model for atomistic materials chemistry

PDF

Machine-learned force fields have transformed the atomistic modeling of materials by enabling simulations of ab initio quality on unprecedented time and length scales. However, they are currently limited by: (i) the significant computational and human effort that must go into development and validation of potentials for each particular system of interest; and (ii) a general lack of transferability from one chemical system to the next. Here, using the state-of-the-art MACE architecture we introduce a single general-purpose ML model, trained on a public database of 150k inorganic crystals, that is capable of running stable molecular dynamics on molecules and materials.


AIMNet2 Paper

PDF

The second generation of the atoms-in-molecules neural network potential (AIMNet2) is applicable to species composed of up to 14 chemical elements in both neutral and charged states, making it a valuable method for modeling the majority of non-metallic compounds. Using an exhaustive dataset of 2 × 10^7 quantum chemical calculations at a hybrid DFT level of theory, AIMNet2 combines ML-parameterized short-range and physics-based long-range terms to attain generalizability that reaches from simple organics to diverse molecules with “exotic” element-organic bonding.
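The short-range-ML-plus-long-range-physics split described above can be illustrated schematically. A toy sketch in which a stand-in short-range term is summed with a plain Coulomb tail (the functional forms, cutoff, and constants below are assumptions for illustration, not the AIMNet2 parameterization):

```python
import math

COULOMB_CONST = 14.3996  # eV·Å per unit charge pair, for energies in eV

def short_range_energy(r, cutoff=5.0):
    """Stand-in for the learned short-range term: a repulsive wall that
    goes to zero at the cutoff (a real model predicts this with a NN)."""
    if r >= cutoff:
        return 0.0
    return math.exp(-r) * (math.cos(math.pi * r / cutoff) + 1.0) / 2.0

def long_range_energy(r, qi, qj):
    """Physics-based long-range term: plain Coulomb interaction."""
    return COULOMB_CONST * qi * qj / r

def pair_energy(r, qi, qj):
    """Total pair energy = ML short-range + physics-based long-range."""
    return short_range_energy(r) + long_range_energy(r, qi, qj)

# Beyond the cutoff only the Coulomb tail contributes
e_far = pair_energy(6.0, 1.0, -1.0)
```

The design point this illustrates: the learned term handles chemistry inside the cutoff, while electrostatics beyond it follow a fixed physical form, which is what lets such models extrapolate to charged and polar systems.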


CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling

PDF

Here we present the Crystal Hamiltonian Graph Neural Network (CHGNet), a graph neural network-based machine-learning interatomic potential (MLIP) that models the universal potential energy surface. CHGNet is pretrained on the energies, forces, stresses and magnetic moments from the Materials Project Trajectory Dataset, which consists of over 10 years of density functional theory calculations of more than 1.5 million inorganic structures. https://www.nature.com/articles/s42256-023-00716-3


Equivariant graph neural networks for data-efficient and accurate interatomic potentials

PDF

This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. https://www.nature.com/articles/s41467-022-29939-5


noise_step

PDF

Training in 1.58-bit (ternary) precision with no gradient memory. Preprint by wbrickner.

6.91K files · 19.3K datasets · 135 services · 512 posts · 11 quests