Published April 12, 2022 | Version 1.0.0 | Software | Open
MALA
Creators
- 1. Center for Advanced Systems Understanding (CASUS), Helmholtz-Zentrum Dresden-Rossendorf e.V. (HZDR)
- 2. Oak Ridge National Laboratory (ORNL)
- 3. Sandia National Laboratories (SNL)
- 4. Helmholtz-Zentrum Dresden-Rossendorf e.V. (HZDR)
Description
Features
- Preprocessing of QE data using LAMMPS interface and LDOS parser (parallel via MPI)
- Networks can be created and trained using PyTorch (parallel via Horovod)
- Hyperparameter optimization using optuna, orthogonal array tuning and neural architecture search without training (NASWOT) supported
- The optuna interface supports distributed runs, and NASWOT can be run in parallel via MPI
- Postprocessing using QE total energy module (available as separate repository)
- Network inference is parallel up to the total energy calculation, which is currently still serial
- Reproducibility through a single Parameters object, with an easy interface to JSON for automated sweeps
- Modular design
- Full integration of the Sandia ML-DFT code into MALA (network architectures; misc. code still open)
- Parallelization of routines:
  - Preprocessing (both SNAP calculation and LDOS parsing)
  - Network training (via Horovod)
  - Network inference (except for the total energy calculation)
- Technical improvements:
  - Default parameter interface is now JSON-based
  - Internal refactoring
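The single-Parameters-object design lends itself to JSON-driven automated sweeps. Below is a minimal sketch of that idea using only the standard library; the parameter keys and file names are illustrative assumptions, not MALA's actual API:

```python
import itertools
import json

# Hypothetical base configuration; the keys below are illustrative
# and do not correspond to MALA's actual parameter names.
base = {"network": {"layer_sizes": [91, 100, 11]},
        "training": {"learning_rate": 1e-3}}

# Generate one parameter set per combination for an automated sweep.
runs = []
for lr, width in itertools.product([1e-3, 1e-4], [100, 200]):
    params = json.loads(json.dumps(base))  # deep copy via JSON round-trip
    params["training"]["learning_rate"] = lr
    params["network"]["layer_sizes"][1] = width
    runs.append(params)

# Writing each run's parameters to its own JSON file makes the
# configuration of every experiment reproducible and diffable.
for i, params in enumerate(runs):
    with open(f"run_{i}.json", "w") as f:
        json.dump(params, f, indent=2)
```

Because each run is a self-contained JSON file, a completed experiment can be reproduced later by loading that file back into a parameters object.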
Files
- mala-project/mala-v1.0.0.zip (447.8 kB)
  md5:5f9666f41439163b05ba5d7218afda70
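After downloading the archive, its integrity can be checked against the MD5 checksum listed above. A generic Python sketch (not part of MALA itself):

```python
import hashlib

def md5_of_file(path, chunk_size=65536):
    """Compute the MD5 checksum of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# After downloading, compare against the checksum listed above, e.g.:
# md5_of_file("mala-v1.0.0.zip") == "5f9666f41439163b05ba5d7218afda70"
```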
Additional details
Related works
- Is supplement to: https://github.com/mala-project/mala/tree/v1.0.0 (URL)