Amp: Machine Learning for Atomistics v0.3alpha
Description
Amp provides a modular approach to atomistic machine learning. Users can specify custom fingerprinting schemes and regression methods to suit their needs and development. This software is developed primarily by Andrew Peterson and Alireza Khorshidi, from the Brown University School of Engineering. Amp stands for Atomistic machine-learning potentials.
This is an early release of Amp; the newest version may offer improved stability and lives at https://bitbucket.org/andrewpeterson/amp.
This project is derived from and will replace our previous Neural project (http://dx.doi.org/10.5281/zenodo.12665); in the near term Neural may offer superior performance if modularity is not desired. We intend for v0.2.x to be the last stable release of Neural. Amp inherits Neural's version numbering, with v0.3 as the next successive release. The Neural project currently lives on in its own repository at https://bitbucket.org/andrewpeterson/neural. Neural provides both Behler-Parrinello and Cartesian neural network schemes, and will continue to receive small bug-fix updates for some time.
Amp is designed to integrate with the python-based Atomic Simulation Environment (ASE) from the Technical University of Denmark, and depends on ASE and the standard scientific python libraries (numpy and scipy).
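To illustrate the modular design described above, here is a minimal, self-contained sketch of how a swappable fingerprinting scheme and regression method can combine into a single energy calculator. All class and method names here are hypothetical illustrations of the concept, not Amp's actual API.

```python
# Hypothetical sketch of a modular atomistic ML potential: a descriptor
# and a regression model are independent, interchangeable components.
# These names are illustrative only; they are not Amp's real interface.

class CentroidDescriptor:
    """Maps a structure (list of (x, y, z) positions) to per-atom fingerprints."""
    def fingerprint(self, positions):
        # Toy fingerprint: each atom's distance from the structure's centroid.
        n = len(positions)
        cx = sum(p[0] for p in positions) / n
        cy = sum(p[1] for p in positions) / n
        cz = sum(p[2] for p in positions) / n
        return [((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2) ** 0.5
                for p in positions]

class LinearModel:
    """Maps per-atom fingerprints to per-atom energies and sums them."""
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias
    def energy(self, fingerprints):
        return sum(self.weight * f + self.bias for f in fingerprints)

class Calculator:
    """Combines any descriptor with any regression model."""
    def __init__(self, descriptor, model):
        self.descriptor, self.model = descriptor, model
    def get_potential_energy(self, positions):
        return self.model.energy(self.descriptor.fingerprint(positions))

# Usage: two atoms 2.0 apart; each lies 1.0 from the centroid,
# so the energy is 2 * (0.5 * 1.0 - 1.0) = -1.0.
calc = Calculator(CentroidDescriptor(), LinearModel(weight=0.5, bias=-1.0))
e = calc.get_potential_energy([(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
```

Because the descriptor and model share only a fingerprint interface, either component can be replaced independently; this mirrors the separation Amp's description draws between fingerprinting schemes and regression methods.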
Files
Archive (41.7 kB), md5:3460ef7bc0a6c5f5073cb1da9002e73c