
mlp-train

General machine learning potential (MLP) training for molecular systems in the gas phase and in solution

Available models (a minimal setup sketch follows this list):

  • GAP
  • ACE
  • MACE
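
Each model maps to a potential class of the same name in mlptrain.potentials. A minimal sketch of setting one up; the xyz filename and potential name are placeholders, and the constructor signatures should be checked against the documentation:

import mlptrain as mlt

# Build the system from an xyz file (filename is a placeholder)
system = mlt.System(mlt.Molecule('water.xyz'), box=None)

# Any of the three models is constructed analogously
mlp = mlt.potentials.MACE('water', system=system)  # or mlt.potentials.GAP / mlt.potentials.ACE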

Documentation and tutorial

Documentation for the mlp-train package (still incomplete) is hosted on Read the Docs.

Simple tutorials illustrating the use of mlp-train are available at: https://github.com/duartegroup/euchems_tutorial.

Install

Each model is installed into an individual conda environment:

# Install GAP
./install_gap.sh

# Install ACE
./install_ace.sh

# Install MACE
./install_mace.sh 
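
Each script creates its own conda environment, which must be activated before running mlp-train. The environment name below is an assumption; check the installer output for the actual name:

# Activate the environment created by the installer (name assumed)
conda activate mlptrain-mace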

The ACE environment requires Julia (version >= 1.6), which must be available in $PATH.

The MACE installation benefits from CUDA acceleration. Depending on your machine, you might need to prefix install_mace.sh with an override for conda:

CONDA_OVERRIDE_CUDA="11.2" ./install_mace.sh 

This is needed in two scenarios:

  • To ensure an environment that is compatible with your CUDA driver (checking the driver's supported version is sketched below).
  • To force CUDA builds to be installed even from a CPU-only machine, e.g. when installing from a head node without GPUs but intending to run on GPU nodes.
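
On a machine with a GPU, the driver's maximum supported CUDA version is reported in the header of the nvidia-smi output and can be used as the override value:

# The 'CUDA Version' field in the header is the maximum the driver supports
nvidia-smi
CONDA_OVERRIDE_CUDA="11.2" ./install_mace.sh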

Notes

  • Units are: distance (Å), energy (eV), force (eV Å⁻¹), time (fs); see the conversion sketch below
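
ASE (a dependency of mlp-train) provides a units module that is convenient for converting common quantities into this unit system; a minimal sketch:

from ase import units

# Convert a 10 kcal/mol barrier into the eV used by mlp-train
barrier_ev = 10 * units.kcal / units.mol
print(f'{barrier_ev:.3f} eV')  # ≈ 0.434 eV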

Using with OpenMM

The OpenMM backend only works with MACE at the moment. The necessary dependencies are installed automatically during MACE installation.

You should then be able to run water_openmm.py in ./examples, or run the Jupyter notebook water_openmm_colab.ipynb on Google Colab.

You can use OpenMM during active learning by passing the keyword argument md_program="OpenMM" to the al_train method, and run MD with OpenMM using mlptrain.md_openmm.run_mlp_md_openmm().
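
A minimal sketch of both entry points is below. The system setup, reference method name and MD parameters are illustrative, and the run_mlp_md_openmm signature is assumed to mirror mlt.md.run_mlp_md; see examples/water_openmm.py for a working script:

import mlptrain as mlt

# System and MACE potential (filenames/names are placeholders)
system = mlt.System(mlt.Molecule('water.xyz'), box=None)
mace = mlt.potentials.MACE('water', system=system)

# Active learning driven by OpenMM MD
mace.al_train(method_name='xtb', temp=300, md_program='OpenMM')

# Plain MLP-MD with OpenMM from a single configuration
config = system.random_configuration()
trajectory = mlt.md_openmm.run_mlp_md_openmm(
    configuration=config,
    mlp=mace,
    temp=300,     # K
    dt=0.5,       # fs
    interval=10,  # save every 10 steps
    fs=500,       # total simulation time
)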

For developers

We are happy to accept pull requests from users. Please first fork the mlp-train repository. We use pre-commit, Ruff and pytest to check the code, and your PR needs to pass these checks before it is accepted. Pre-commit is installed as one of the dependencies; to set it up in your local clone, run the following command in the mlp-train folder:

pre-commit install 

Pre-commit will then run automatically at each commit and will take care of installing and running Ruff.
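
The same checks can also be run on demand over the whole tree before opening a PR:

# Run all pre-commit hooks (including Ruff) and the test suite manually
pre-commit run --all-files
pytest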

Citations

If mlptrain is used in a publication, please consider citing the paper:

@article{MLPTraining2022,
  doi = {10.1039/D2CP02978B},
  url = {https://doi.org/10.1039/D2CP02978B},
  year = {2022},
  publisher = {The Royal Society of Chemistry},
  author = {Young, Tom and Johnston-Wood, Tristan and Zhang, Hanwen and Duarte, Fernanda},
  title = {Reaction dynamics of Diels-Alder reactions from machine learned potentials},
  journal = {Phys. Chem. Chem. Phys.}
}

Contact

For bugs or implementation requests, please use GitHub Issues.
