Published March 24, 2026 | Version v1.0
Preprint | Open Access

TRIADS: Tiny Recursive Information-Attention with Deep Supervision

  • Independent Researcher

Description

TRIADS is a parameter-efficient recursive attention architecture for materials property prediction in small-data regimes. By combining weight-tied recursive reasoning, per-cycle deep supervision, and physics-informed features, TRIADS achieves strong performance across six Matbench benchmarks with fewer than 250K parameters. Results include 0.9655 ROC-AUC on matbench_expt_is_metal (44K–100K parameters), 0.3068 eV MAE on matbench_expt_gap, 35.89 meV/atom MAE on matbench_jdft2d, and 41.91 cm⁻¹ MAE on matbench_phonons, all without external pretraining. Controlled ablations show that deep supervision reduces MAE by 23.3% under an identical architecture, highlighting the importance of architecture-coupled training in small-data settings.
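To make the two key ingredients concrete, the following is a minimal sketch of weight-tied recursion with per-cycle deep supervision. It is not the authors' implementation: the hidden width, cycle count, update rule, and prediction head are all hypothetical stand-ins. The essential pattern is that one shared parameter set is reused at every recursion depth, and a loss is computed from the intermediate state after every cycle rather than only at the end.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8        # hidden width (hypothetical)
CYCLES = 3   # number of recursive cycles (hypothetical)

# One shared (weight-tied) parameter set, reused at every cycle.
W = rng.normal(scale=0.1, size=(D, D))
w_out = rng.normal(scale=0.1, size=D)  # shared per-cycle prediction head

def cycle(h, x):
    # One recursive refinement step: the *same* W at every depth.
    return np.tanh(h @ W + x)

def forward(x, y):
    h = np.zeros(D)
    losses = []
    for _ in range(CYCLES):
        h = cycle(h, x)
        pred = h @ w_out              # deep supervision: predict after every cycle
        losses.append((pred - y) ** 2)
    # The training loss averages per-cycle losses instead of
    # supervising only the final cycle's output.
    return sum(losses) / CYCLES, losses

x = rng.normal(size=D)
total_loss, per_cycle_losses = forward(x, y=1.0)
```

Because the weights are tied across cycles, parameter count is independent of recursion depth, which is what allows deep iterative refinement within a sub-250K-parameter budget.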

Files

triads_paper.pdf (562.2 kB)
md5:37864f10c06424ed8b309992ab530860

Additional details

Software

Repository URL
https://github.com/Rtx09x/TRIADS
Programming language
Python
Development Status
Active