MAE model idea I

Core idea: Train a GNN from scratch to predict the magnetic anisotropy energy (MAE) using CHGNet-derived features:

Node features: CHGNet latent embeddings (structural context) + CHGNet magmom predictions (explicit magnetic state)

Global features: Pooled CHGNet graph-level representations
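As a concrete sketch of this feature layout, the following minimal NumPy example shows how the node and global feature tensors could be assembled. The 64-dimensional latent size, the scalar magmom per site, and the random placeholder arrays are all assumptions for illustration; in practice both would come from a CHGNet forward pass, whose actual extraction API is not shown here.

```python
import numpy as np

# Hypothetical per-site features for a 5-atom structure.
# In practice these would be extracted from CHGNet; here they
# are random placeholders with assumed dimensions.
n_sites, latent_dim = 5, 64
latent = np.random.randn(n_sites, latent_dim)  # CHGNet latent embeddings
magmom = np.random.randn(n_sites, 1)           # predicted magnetic moments

# Node features: structural context + explicit magnetic state.
node_feats = np.concatenate([latent, magmom], axis=1)  # shape (5, 65)

# Global features: pooled graph-level representation (mean pool).
global_feats = node_feats.mean(axis=0)                 # shape (65,)
```

Mean pooling is just one choice of readout; sum or attention pooling would slot in at the same point.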

Key insight: We need to learn how explicit magnetic configurations couple with structural motifs to produce MAE. CHGNet's latent space has these entangled, but we need them separated: the latent embeddings supply the structural patterns, while the magmom predictions supply the magnetic configuration. Our new GNN learns the coupling function between them.
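One way such a coupling function could look is a single message-passing step that mixes each site's magmom-augmented features with its neighbors', followed by a pooled readout head that emits a scalar MAE estimate. This is a hedged NumPy sketch, not the proposed architecture: the fully connected toy graph, layer sizes, and random weights (standing in for trained parameters) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 65                       # sites, feature dim (64 latent + 1 magmom)
x = rng.standard_normal((n, d))    # node features (placeholder for CHGNet)
adj = np.ones((n, n)) - np.eye(n)  # fully connected toy graph, no self-loops
adj /= adj.sum(1, keepdims=True)   # row-normalize for mean aggregation

# One message-passing step: each site aggregates neighbor features,
# letting magnetic state and structural context interact.
w_msg = rng.standard_normal((d, d)) * 0.1
h = np.tanh(adj @ x @ w_msg + x)   # aggregate + residual connection

# Readout: pool over sites, then a linear head predicts MAE.
w_out = rng.standard_normal((d, 1)) * 0.1
mae_pred = (h.mean(axis=0) @ w_out).item()  # scalar MAE estimate
```

A trained version would stack several such layers and learn w_msg and w_out by regression against reference MAE values; the point here is only that the magmom channel enters the message function explicitly rather than staying entangled in the latent space.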

Why not just fine-tune CHGNet? Because we need magmoms as explicit input features to learn structure-magnetism coupling, which requires a new architecture.

Data efficiency: 2000 samples is reasonable since we're leveraging CHGNet's pretrained representations rather than learning structural features from scratch.
