ACEsuit/mace: v0.3.14
Creators
- Ilyes Batatia (1)
- davkovacs
- bernstei
- ttompa
- WillBaldwin0
- Janosh Riebesell
- Hatem Helal
- Matthew Avaylon
- Rokas Elijosius
- Vivek Bharadwaj (2)
- wcwitt
- EszterVU
- Alin Marin Elena
- Rhys Goodall (3)
- ThomasWarford
- ElliottKasoar
- Andrew S. Rosen (4)
- Cheuk Hin Ho (5)
- Felix Musil (6)
- Alexander Spears
- Hubert Beck (7)
- Eric Sivonxay (8)
- Nils Goennheimer
- Lars Schaaf
- Chaitanya Joshi (1)
- Sandip De (9)
- Harry Moore
- Tamas Stenczel
- samwaltonnorwood
- Leo (10)
- 1. University of Cambridge
- 2. @PASSIONLab
- 3. @Radical-AI
- 4. Princeton University
- 5. University of British Columbia
- 6. CuspAI
- 7. Charles University Prague
- 8. Lawrence Berkeley National Laboratory
- 9. BASF
- 10. Max Planck Institute for the Structure and Dynamics of Matter
Description
MACE v0.3.14 Release Notes
We are excited to announce MACE v0.3.14, featuring significant new capabilities: embeddings of charge, spin, and electronic temperature, new GPU acceleration backends, dielectric property prediction, and enhanced training options.
🏗️ Foundation Models
MACE-OMOL Foundation Models
Introduced support for MACE-OMOL models trained on the 100M OMOL dataset with charge and spin embeddings, providing improved accuracy for organic molecules with charges and spins.
Example usage:
from mace.calculators import mace_omol
# Load MACE-OMOL model with charge and spin support
calc = mace_omol(model="extra_large", device="cuda")
atoms.calc = calc
# Set charge and spin for the system
atoms.info["charge"] = 1.0 # +1 charge
atoms.info["spin"] = 1.0 # spin multiplicity
energy = atoms.get_potential_energy()
forces = atoms.get_forces()
Small OMAT Model
Added small-omat-0 to the foundation model shortcuts for faster inference when high accuracy is not critical.
Example usage:
from mace.calculators import mace_mp
calc = mace_mp(model="small-omat-0") # Faster, smaller model
🎯 Fine-tuning
Pseudo-label Finetuning
Added multihead pseudo-label fine-tuning, allowing models to be fine-tuned on their own predictions for improved accuracy and ease of use.
Example usage:
python run_train.py \
--foundation_model="medium" \
--train_file=real_data.xyz \
--pt_train_file=replay_data.xyz \
--multiheads_finetuning=True \
--pseudolabel_replay=True
Enhanced Multihead Finetuning
Fixed various issues with multihead fine-tuning and improved the robustness of the code.
Added the option to use any of the models from mace_mp as the --foundation_model key; for the available key names, see https://github.com/ACEsuit/mace/blob/0139da1b864b29054f07db6627887a847a42050e/mace/calculators/foundations_models.py#L19. For example, --foundation_model="small-omat-0".
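As a sketch, fine-tuning on top of one of the named foundation models might look as follows (file names here are placeholders; the flags are those shown in the pseudo-label example above):

```shell
# Fine-tune a named mace_mp foundation model with multihead fine-tuning
python run_train.py \
    --foundation_model="small-omat-0" \
    --train_file=my_dft_data.xyz \
    --multiheads_finetuning=True
```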
🔬 Models
Embedding Functionality for MACE
Added support for embedding additional properties like total charge, total spin, and electronic temperature into MACE models, enabling more accurate predictions for charged and magnetic systems.
Training example with custom embeddings:
python run_train.py \
--train_file=data.xyz \
--embedding_specs='{"charge": {"embed_type": "continuous", "min": -5, "max": 5}, "spin": {"embed_type": "continuous", "min": 0, "max": 4}, "temperature": {"embed_type": "continuous", "min": 0, "max": 3000}}' \
--use_embedding_readout \
--model=MACE
Dielectric MACE for Polarizability Prediction
Added support for the Dielectric MACE model to predict molecular polarizabilities alongside energies and forces.
Reference: Kapil et al., "First-principles spectroscopy of aqueous interfaces using machine-learned electronic and quantum nuclear effects"
Example usage:
# Train dielectric MACE model
python run_train.py \
--model="AtomicDielectricMACE" \
--train_file=data.xyz \
--loss="dipole_polar" \
--polarizability_key="REF_polarizability" \
--error_table="DipolePolarRMSE"
LES Models for Coulomb Interactions
Implemented support for LES models for improved description of Coulomb interactions.
Reference: Bingqing Cheng, "Latent Ewald summation for machine learning of long-range interactions"
Example usage:
# Train MACELES model
python run_train.py \
--model=MACELES \
--train_file=data.xyz
New Non-linear Blocks
Introduced improved non-linear readout blocks that provide better accuracy when training MACE on large datasets; see the repository for the implementation.
⚡ Performance Improvements
OpenEquivariance Support
Integrated OpenEquivariance support for MACE kernels, enabling acceleration on AMD GPUs and providing an alternative compute backend. Use --enable_oeq=True for training with OpenEquivariance, and pass enable_oeq=True to the ASE calculator for inference.
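For example, a training run with the OpenEquivariance backend enabled might look like this (file name is a placeholder; the flag is the one stated above):

```shell
# Train with OpenEquivariance-accelerated kernels
python run_train.py \
    --train_file=data.xyz \
    --enable_oeq=True
```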
CuEq Fused Operations
Added CuEq fused operations for tensor product and scatter sum operations, significantly improving training and inference speed on CUDA devices.
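A minimal sketch of enabling the CuEq backend at training time; the --enable_cueq flag name is assumed from the cuEquivariance integration and should be verified against run_train.py --help:

```shell
# Train with cuEquivariance fused tensor-product/scatter-sum kernels (CUDA only)
python run_train.py \
    --train_file=data.xyz \
    --enable_cueq=True
```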
Refactored Clebsch-Gordan Coefficients
Implemented option to use refactored CG coefficients with fewer computational paths for improved efficiency.
Example usage:
python run_train.py \
--train_file=data.xyz \
--use_reduced_cg
Element Agnostic Product Basis
Added option for element-agnostic product basis, reducing memory usage and improving scalability for systems with many element types.
Example usage:
python run_train.py \
--train_file=data.xyz \
--use_agnostic_product
🔧 Training and Infrastructure Improvements
Improved Multi-GPU Support
- Fixed support for torchrun and MPI multi-GPU training
- Dynamically set the backend for distributed training based on device type
- Better handling of distributed training configurations
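As a sketch, a torchrun launch of a multi-GPU run might look as follows; the --distributed flag is assumed from MACE's distributed-training support and the GPU count is a placeholder:

```shell
# Launch data-parallel training on 4 GPUs of one node via torchrun
torchrun --nproc_per_node=4 run_train.py \
    --train_file=data.xyz \
    --distributed
```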
Intel GPU Support
Added IPEX (Intel Extension for PyTorch) support for Intel GPU inference and training.
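A hypothetical invocation targeting an Intel GPU; the device string "xpu" is the conventional IPEX device name and is an assumption here, not confirmed by the release notes:

```shell
# Train on an Intel GPU via IPEX (device name "xpu" assumed)
python run_train.py \
    --train_file=data.xyz \
    --device=xpu
```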
🐛 Bug Fixes and Improvements
- Fixed bug preventing building 1-layer (no message passing) MACE models
- Improved handling of E0s specification via JSON in preprocessing
- Better pathlib usage for file suffix handling in finetuning_select
- Fixed OEQ import handling on macOS without CUDA
- Corrected loss output to log files during fine-tuning
- Improved behavior of auto-downloaded MP data when using pseudolabels
- Enhanced handling of atomic numbers in multihead training
📚 Additional Features
- Added support for calculating descriptors in eval_configs
- Enhanced node energy output capabilities
- Improved interaction energy plotting functionality
- Better filtering of training quantities with zero weights
🙏 Acknowledgments
We thank all contributors to this release, including new contributors @Alexsp32, @Enry99, @vbharadwaj-bk, @naik-aakash, and @Nilsgoe.
Full Changelog: https://github.com/ACEsuit/mace/compare/v0.3.13...v0.3.14
For detailed documentation and examples, visit our GitHub repository and documentation.
Files
- ACEsuit/mace-v0.3.14.zip (121.3 MB, md5:f095ccb23bff21e8a002fd77ed2e9c9b)
Additional details
Related works
- Is supplement to
- Software: https://github.com/ACEsuit/mace/tree/v0.3.14 (URL)
Software
- Repository URL
- https://github.com/ACEsuit/mace