ParAMS
3.12.2. Active Parameter Search

This class reduces the dimensionality of the parameter search space by performing a sensitivity analysis on each active parameter individually, or on small sets of parameters at once.

Synopsis

>>> from scm.params import ReaxParams, DataSet, JobCollection, ActiveParameterSearch  # import path may differ between ParAMS versions
>>> ff      = ReaxParams('path/to/ffield.ff')
>>> ds      = DataSet('path/to/dataset.yml')
>>> jc      = JobCollection('path/to/jobcol.yml')
>>> aps     = ActiveParameterSearch(ff, ds, jc)
>>> ids, fx = aps.scan(steps=[1.1], dim=1, verbose=True)
>>> ff.is_active = aps.get_is_active(n=20)

scan() returns the scanned ids of the active subset, and the respective loss function values.

>>> # Set only the first three parameters to active:
>>> ff.is_active = len(ff)*[False]
>>> for i in range(3):
...     ff[i].is_active = True
>>> len(ff.active)
3
>>> aps = ActiveParameterSearch(ff, ds, jc)
>>> aps.scan()
(array([[0],
        [1],
        [2]]), array([[[-0.16769481]],
        [[ 0.33069672]],
        [[-0.09795433]]]))

The first return value is the array of scanned ids, the second an array of loss function values.

The parameter search can also scan combinations of several active parameters at once, rather than scanning each one individually:

>>> aps.scan(dim=2)
(array([[0, 1],
        [0, 2],
        [1, 2]]), array([[-0.28081611],
        [ 0.02706811],
        [-0.2683532 ]]))

The step sizes and their number can be set with the steps argument. Each entry \(s\) is a multiplier applied to the initial parameters, generating a new set \(\boldsymbol{x}_\mathrm{scaled} = s\,\boldsymbol{x}_0\).

>>> aps.scan(steps=[0.9,1.2])
(array([[0],
        [1],
        [2]]), array([[-0.55754578, -0.26966971],
        [-0.21234735, -0.19213127],
        [-0.16746101, -0.19213127]]))

The results are also stored in the attributes fx0, ids and fx after scan() has been called:

>>> aps.ids
array([[0],
       [1],
       [2]])
>>> aps.fx
array([[[-0.55754578, -0.26966971]],
       [[-0.21234735, -0.19213127]],
       [[-0.16746101, -0.19213127]]])

For relative sensitivities, use the fx0 attribute:

>>> rel_fx = aps.fx[:,:,ds_id] / aps.fx0[ds_id]  # ds_id selects the data set of interest
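
The relative values can also be turned into a ranking by hand. Below is a minimal sketch, assuming NumPy and a scan over a single data set as above; the deviation-from-one criterion used here is just one possible choice, the built-in criteria are listed under get_is_active() in the API section:

>>> import numpy as np
>>> ds_id = 0                                         # hypothetical: index of the data set of interest
>>> rel_fx = aps.fx[:, :, ds_id] / aps.fx0[ds_id]     # relative loss per scanned parameter and step
>>> mean_rel = rel_fx.mean(axis=-1)                   # average over the scanned steps
>>> order = np.argsort(np.abs(mean_rel - 1.0))[::-1]  # largest relative change in loss first
>>> most_sensitive = aps.ids[order[:5]]               # e.g. ids of the most affected parameters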

Once a scan is complete, get_is_active() returns a list of booleans that can be assigned to the parameter interface's is_active attribute:

>>> ff.is_active = aps.get_is_active(n=20)

Multiple Data Sets can be evaluated with one Parameter Search instance, provided they can all be calculated with the same Job Collection. To do so, pass a list of data sets when instantiating. The attributes then have the shapes fx0.shape == (len(ds),) and fx.shape == (len(ff.active), len(ids), len(ds)).

>>> aps = ActiveParameterSearch(ff, [ds1, ds2], jc)
>>> ids, fx = aps.scan()
>>> fx_ds1 = fx[:,:,0] # select scanned results of the first  data set
>>> fx_ds2 = fx[:,:,1] # select scanned results of the second data set

In such cases, the dataset_id argument of get_is_active() specifies which data set's results are used for the evaluation:

>>> aps = ActiveParameterSearch(ff, [ds1, ds2], jc)
>>> ids, fx = aps.scan()
>>> active_based_on_ds1 = aps.get_is_active(10, dataset_id=0)
>>> active_based_on_ds2 = aps.get_is_active(10, dataset_id=1)

API

class ActiveParameterSearch(parameterinterface, datasets, jobcollection, file=None)

Scans for the most sensitive parameters of a ParameterInterface instance, given a Data Set.

Note

Will only scan the active subset of parameters.

The following are available after scan() has been called:

Attributes:
fx0 : float or ndarray
The loss function value(s) of the initial parameters
ids : ndarray
The ids returned by the most recent scan() call
fx : ndarray
The fx values returned by the most recent scan() call
__init__(parameterinterface, datasets, jobcollection, file=None)

Initialize a Parameter Search instance with the given interface, datasets and jobcollection.
Previous results can be loaded by providing the optional file argument.

The datasets argument can be either a single DataSet instance or a list of them. The latter assumes that all Data Sets in the list can be calculated from the jobcollection. If multiple Data Sets are provided, the dataset_id argument of get_is_active() specifies which set is used to select the most sensitive parameters.

scan(steps: Sequence = [1.05], dim=1, loss='sse', parallel=None, verbose=True)

Start the scan.

Note

Parameters that have a value of zero will be shifted by (step-1) instead (e.g., for a step of 1.1, a zero-valued parameter is set to 0.1).

After calling this method, the get_is_active() and save() methods can be called.

Parameters:
steps : Sequence[float]
Scaling factors applied to the parameters, one per scan step
dim : 1 <= int <= len(parameterinterface.active)
If dim > 1, dim parameters will be scanned at once, over all combinations of dim active parameters. Possibly costly, as \(N_\mathrm{evals} = \binom{N_\mathrm{params}}{dim}\).
loss : str, Loss
The Loss function to be used for the Data Set evaluation.
parallel : ParallelLevels
Calculate parallel.parametervectors parameter sets at once, each set running parallel.jobs jobs in parallel. Defaults to ParallelLevels(parametervectors=NCPU).
Returns:
self.ids : ndarray
2d array of indices into the parameterinterface.active subset of parameters; row i holds the index (or indices, for dim > 1) of the scanned parameter(s) in parameterinterface.active.
self.fx : ndarray
Array of shape (len(ids), len(steps), len(datasets)), in the same order as ids, holding the loss function values for the modified parameter sets. If len(steps) > 1, each entry contains one value per step.
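
For illustration, a hedged sketch of a full scan() call combining these arguments; the scm.params import path for ParallelLevels and the chosen step and parallelization values are assumptions, not prescriptions:

>>> from scm.params import ParallelLevels             # import path may differ between ParAMS versions
>>> par = ParallelLevels(parametervectors=4, jobs=2)  # 4 parameter sets at a time, 2 jobs each
>>> ids, fx = aps.scan(steps=[0.95, 1.05], dim=1, loss='sse', parallel=par, verbose=False)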
get_is_active(n: int, dataset_id: int = 0, mode: str = 'highest_absolute') → List

Can only be called after scan().
Given the initial parameter interface, return the ParameterInterface.is_active attribute with the n most sensitive parameters marked as active. The returned List can be used to set the parameter interface:

>>> ff.is_active = aps.get_is_active(10)
Valid mode argument values are:
  • 'lowest_relative': selects the parameters with the lowest values of (fx/fx0).mean(-1)
  • 'highest_absolute': selects the parameters with the highest values of abs(fx-fx0).mean(-1)

If dim > 1 was requested during the scan, the number of active parameters will equal the number of unique parameters among the n selected combinations (at most dim*n).

When multiple data sets have been provided at init, dataset_id can be used to specify which of the sets should be used for the evaluation.
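
A short usage sketch comparing the two modes; aps and the force field ff are the objects from the Synopsis, and n=10 is an arbitrary choice:

>>> active_abs = aps.get_is_active(10)                          # default mode='highest_absolute'
>>> active_rel = aps.get_is_active(10, mode='lowest_relative')  # rank by (fx/fx0).mean(-1) instead
>>> ff.is_active = active_abs                                   # activate the selected parameters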

save(fname)

Saves ids, fx0 and fx to fname.

static load(fname)

Loads and returns a triplet of ids, fx and fx0 from fname.
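
For example, a minimal save/load round-trip; the file name is hypothetical, and restoring a search through the file argument follows the __init__ description above:

>>> aps.save('aps_scan.results')                                       # hypothetical file name
>>> ids, fx, fx0 = ActiveParameterSearch.load('aps_scan.results')      # recover the raw scan results
>>> aps2 = ActiveParameterSearch(ff, ds, jc, file='aps_scan.results')  # restore a previous search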
