Models & Backends

Included (pre-parameterized) models

A model is the combination of a functional form with a set of parameters. Six pre-parameterized models can be selected: M3GNet-UP-2022 (Universal Potential), AIMNet2-B973c, AIMNet2-wB97MD3, ANI-2x, ANI-1ccx, and ANI-1x. The predictions from the AIMNet2-* and ANI-* models are calculated from committees (ensembles), meaning that the final prediction is an average over several independently trained neural networks.
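
In the engine input, a pre-parameterized model is selected simply by setting the Model keyword; for example, a minimal engine block using ANI-2x (the default) could look like this:

Engine MLPotential
    Model ANI-2x
EndEngine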

Table 1: Pre-parameterized models for the MLPotential engine

M3GNet-UP-2022
    Type: Neural network
    Committee size: 1
    Atomic environment descriptor: m3gnet
    Supported elements: 95 (H, He, Li, .., Am)
    Supports charged systems: No
    Supports 3D periodicity: Yes
    Predicts atomic charges: No
    Predicts dipole moment: No
    Predicts energy uncertainty: No
    Predicts force uncertainty: No
    Retrainable with ParAMS: Yes
    Training set structures: Materials Project
    Reference method: PBE, PBE+U
    Backend: m3gnet
    Reference: 1

AIMNet2-B973c
    Type: Neural network
    Committee size: 4
    Atomic environment descriptor: AIMNet2
    Supported elements: H, B, C, N, O, F, Si, P, S, Cl, As, Se, Br, I
    Supports charged systems: Yes
    Supports 3D periodicity: No
    Predicts atomic charges: Yes
    Predicts dipole moment: Yes
    Predicts energy uncertainty: Yes
    Predicts force uncertainty: Yes
    Retrainable with ParAMS: No
    Training set structures: molecules
    Reference method: B97-3c
    Backend: AIMNet2
    Reference: 2

AIMNet2-wB97MD3
    Type: Neural network
    Committee size: 4
    Atomic environment descriptor: AIMNet2
    Supported elements: H, B, C, N, O, F, Si, P, S, Cl, As, Se, Br, I
    Supports charged systems: Yes
    Supports 3D periodicity: No
    Predicts atomic charges: Yes
    Predicts dipole moment: Yes
    Predicts energy uncertainty: Yes
    Predicts force uncertainty: Yes
    Retrainable with ParAMS: No
    Training set structures: molecules
    Reference method: ωB97M-D3/Def2-TZVPP
    Backend: AIMNet2
    Reference: 2

ANI-2x
    Type: Neural network
    Committee size: 8
    Atomic environment descriptor: ACSF
    Supported elements: H, C, N, O, F, S, Cl
    Supports charged systems: No
    Supports 3D periodicity: Yes
    Predicts atomic charges: No
    Predicts dipole moment: No
    Predicts energy uncertainty: Yes
    Predicts force uncertainty: Yes
    Retrainable with ParAMS: No
    Training set structures: organic molecules
    Reference method: ωB97X/6-31G(d)
    Backend: TorchANI
    Reference: 3

ANI-1ccx
    Type: Neural network
    Committee size: 8
    Atomic environment descriptor: ACSF
    Supported elements: H, C, N, O
    Supports charged systems: No
    Supports 3D periodicity: Yes
    Predicts atomic charges: No
    Predicts dipole moment: No
    Predicts energy uncertainty: Yes
    Predicts force uncertainty: Yes
    Retrainable with ParAMS: No
    Training set structures: organic molecules
    Reference method: DLPNO-CCSD(T)/CBS
    Backend: TorchANI
    Reference: 4

ANI-1x
    Type: Neural network
    Committee size: 8
    Atomic environment descriptor: ACSF
    Supported elements: H, C, N, O
    Supports charged systems: No
    Supports 3D periodicity: Yes
    Predicts atomic charges: No
    Predicts dipole moment: No
    Predicts energy uncertainty: Yes
    Predicts force uncertainty: Yes
    Retrainable with ParAMS: No
    Training set structures: organic molecules
    Reference method: ωB97X/6-31G(d)
    Backend: TorchANI
    Reference: 5


Model
    Type: Multiple Choice
    Default value: ANI-2x
    Options: [Custom, AIMNet2-B973c, AIMNet2-wB97MD3, ANI-1ccx, ANI-1x, ANI-2x, M3GNet-UP-2022]
    Description: Select a particular parameterization. ANI-1x and ANI-2x are based on DFT (ωB97X); ANI-1ccx is based on DLPNO-CCSD(T)/CBS; M3GNet-UP-2022 is based on DFT (PBE and PBE+U) data; AIMNet2 is based on ωB97M-D3 or B97-3c data. ANI-1x and ANI-1ccx have been parameterized to give good geometries, vibrational frequencies, and reaction energies for gas-phase organic molecules containing H, C, N, and O. ANI-2x can also handle F, S, and Cl. M3GNet-UP-2022 is a universal potential (UP) for the entire periodic table and has been trained primarily on crystal data (energies, forces, stresses) from the Materials Project. AIMNet2 has been parameterized to give good geometries and reaction energies for gas-phase molecules and ions containing H, B, C, N, O, F, Si, P, S, Cl, As, Se, Br, and I. Set to Custom to specify the backend and parameter files yourself.

Custom models (custom parameters)

Tip

You can use Engine ASE to run any ASE calculator as the engine.

Tip

You can use ParAMS to train your own ML potential parameters.

Set Model to Custom and specify which backend to use with the Backend option. In a typical case, you would have used that backend to train your own machine learning potential.

The backend reads the parameters, and any other necessary information (for example neural network architecture), from either a file or a directory. Specify the ParameterFile or ParameterDir option accordingly, with a path to the file or directory. Read the backend’s documentation to find out which option is appropriate.

Some backends may require that an energy unit (MLEnergyUnit) and/or distance unit (MLDistanceUnit) be specified. These units correspond to the units used during the training of the machine learning potential.

Example:

Engine MLPotential
    Backend SchNetPack
    Model Custom
    ParameterFile ethanol.schnet-model
    MLEnergyUnit kcal/mol
    MLDistanceUnit angstrom
EndEngine
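
If the chosen backend reads its parameters from a directory rather than a file (for example M3GNet, see Table 2 in the Backends section below), use ParameterDir instead of ParameterFile. A minimal sketch, assuming a hypothetical retrained M3GNet model stored in the directory my_m3gnet_model and trained with energies in eV and distances in angstrom:

Engine MLPotential
    Backend M3GNet
    Model Custom
    # hypothetical directory containing a retrained M3GNet model
    ParameterDir my_m3gnet_model
    MLEnergyUnit eV
    MLDistanceUnit angstrom
EndEngine
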
Backend
    Type: Multiple Choice
    Options: [M3GNet, NequIP, SchNetPack, sGDML, TorchANI]
    Description: The machine learning potential backend.

MLDistanceUnit
    Type: Multiple Choice
    Default value: Auto
    Options: [Auto, angstrom, bohr]
    GUI name: Internal distance unit
    Description: Unit of distances expected by the ML backend (not by the ASE calculator). The ASE calculator may require this information.

MLEnergyUnit
    Type: Multiple Choice
    Default value: Auto
    Options: [Auto, Hartree, eV, kcal/mol, kJ/mol]
    GUI name: Internal energy unit
    Description: Unit of energy output by the ML backend (not the unit output by the ASE calculator). The ASE calculator may require this information.

ParameterDir
    Type: String
    Default value: (empty)
    GUI name: Parameter directory
    Description: Path to a set of parameters for the backend, if it expects to read from a directory.

ParameterFile
    Type: String
    Default value: (empty)
    Description: Path to a set of parameters for the backend, if it expects to read from a file.

Backends

Table 2: Backends supported by the MLPotential engine

M3GNet
    Reference: 1
    Methods: m3gnet
    Pre-built models: M3GNet-UP-2022
    Parameters from: ParameterDir
    Kernel-based: No
    ML framework: TensorFlow 2.9.1

SchNetPack
    Reference: 8
    Methods: HDNNPs, GCNNPs, …
    Pre-built models: none
    Parameters from: ParameterFile
    Kernel-based: No
    ML framework: PyTorch

sGDML
    Reference: 9
    Methods: GDML, sGDML
    Pre-built models: none
    Parameters from: ParameterFile
    Kernel-based: Yes
    ML framework: none, PyTorch

TorchANI
    Reference: 10
    Methods: [ensembles of] HDNNPs
    Pre-built models: ANI-1x, ANI-2x, ANI-1ccx
    Parameters from: ParameterFile
    Kernel-based: No
    ML framework: PyTorch

Note

Technically, there is also an AIMNet2 backend, but it can only be activated through the pre-parameterized models AIMNet2-B973c and AIMNet2-wB97MD3.

Note

Starting with AMS2023, PiNN [7] is only supported as a custom Calculator through Engine ASE [6].

Note

For sGDML, the order of the atoms in the input file must match the atom order that was used when the model was fitted.

Note

If you use a custom parameter file with TorchANI, the model specified via ParameterFile filename.pt is loaded with torch.load('filename.pt')['model'], such that a forward call should be accessible via torch.load('filename.pt')['model']((species, coordinates)). The energy shifter is not read from custom parameter files, so the absolute predicted energies will be shifted with respect to the reference data, but this does not affect relative energies (e.g., reaction energies).
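
As an illustrative sketch, such a custom TorchANI model could be used as follows, assuming a hypothetical parameter file mymodel.pt and a model trained with energies in hartree and distances in angstrom:

Engine MLPotential
    Backend TorchANI
    Model Custom
    # hypothetical custom parameter file saved as described in the note above
    ParameterFile mymodel.pt
    MLEnergyUnit Hartree
    MLDistanceUnit angstrom
EndEngine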

References

1. C. Chen, S. P. Ong, Nature Computational Science 2, 718–728 (2022). arXiv:2202.02450
2. D. M. Anstine, R. Zubatyuk, O. Isayev, ChemRxiv (2023). https://doi.org/10.26434/chemrxiv-2023-296ch
3. C. Devereux et al., J. Chem. Theory Comput. 16 (2020) 4192–4202. https://doi.org/10.1021/acs.jctc.0c00121
4. J. S. Smith et al., Nat. Commun. 10 (2019) 2903. https://doi.org/10.1038/s41467-019-10827-4
5. J. S. Smith et al., J. Chem. Phys. 148 (2018) 241733. https://doi.org/10.1063/1.5023802
6. ASE: https://wiki.fysik.dtu.dk/ase/index.html
7. Y. Shao et al., J. Chem. Inf. Model. 60 (2020) 1184–1193. https://doi.org/10.1021/acs.jcim.9b00994
8. K. T. Schütt et al., J. Chem. Theory Comput. 15 (2019) 448–455. https://doi.org/10.1021/acs.jctc.8b00908
9. S. Chmiela et al., Comput. Phys. Commun. 240 (2019) 38–45. https://doi.org/10.1016/j.cpc.2019.02.007
10. X. Gao et al., J. Chem. Inf. Model. (2020). https://doi.org/10.1021/acs.jcim.0c00451