4.6. Training and validation sets

Important

First go through the Getting Started: Lennard-Jones tutorial.

With ParAMS, you can have a validation set in addition to the training set.

../../_images/val_running_loss.png

The optimizer minimizes the loss function on the training set. During the parametrization, you can also monitor the loss function on the validation set (see figure above), which helps you detect overfitting. As long as the loss function on the (appropriately chosen) validation set decreases similarly to the loss function on the training set, there is likely no overfitting.
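The check described above can be sketched in plain Python. This is a minimal illustration with made-up loss values, not part of the ParAMS API:

```python
# Illustrative sketch of the overfitting check described above.
# The loss values are made up; this is not part of the ParAMS API.

def overfitting_started(training_loss, validation_loss):
    """Return True if the training loss falls while the validation loss rises."""
    train_trend = training_loss[-1] - training_loss[0]
    val_trend = validation_loss[-1] - validation_loss[0]
    return train_trend < 0 and val_trend > 0

# Healthy run: both losses decrease together -> no sign of overfitting.
healthy = overfitting_started([1.0, 0.5, 0.2], [1.1, 0.6, 0.3])   # False

# Overfitting: training loss keeps falling, validation loss climbs back up.
overfit = overfitting_started([1.0, 0.4, 0.1], [0.6, 0.4, 0.7])   # True
```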

The validation set should be stored in validation_set.yaml.

4.6.1. Create a validation set

Make a copy of the directory $AMSHOME/scripting/scm/params/examples/LJ_Ar_validation_set.

Start the ParAMS GUI: SCM → ParAMS.
File → Open, and browse to the job_collection.yaml file in the example directory.
On the Training Set panel are four entries: two of type Energy and two of type Forces.
../../_images/val_training_set.png
On the Validation Set panel is one entry: the Forces for Ar32_frame001.
../../_images/val_validation_set.png
On the All panel, all validation set entries are marked in blue.
../../_images/val_all.png

Move an entry from the training set to validation set:

On the All or Training Set panel, select the entry for the Forces of Ar32_frame003.
Training Set → Move to Validation Set
The entry disappears from the Training Set panel, and appears on the Validation Set panel

Move an entry from the validation set to training set:

On the All or Validation Set panel, select the entry for the Forces of Ar32_frame003.
Training Set → Move to Training Set
The entry disappears from the Validation Set panel, and appears on the Training Set panel

You can also do a random training/validation split on selected entries:

On the All panel, select the two Energy entries and the three Forces entries
Training Set → Generate Validation Set…
Percentage of entries to use for validation: 40.0
Click OK
This randomly places 2 entries (40%) in the validation set and the remaining 3 in the training set.
../../_images/val_post_split.png
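The split the GUI performs can be sketched as follows. This is a plain-Python illustration with hypothetical entry names; ParAMS does its own bookkeeping internally:

```python
import random

# Illustrative sketch of the random 40% split: with 5 selected entries,
# 2 go to the validation set and 3 stay in the training set.
# Entry names are hypothetical; ParAMS handles this internally.
entries = ["energy_1", "energy_2", "forces_1", "forces_2", "forces_3"]
percentage = 40.0

n_validation = round(len(entries) * percentage / 100.0)  # 2
shuffled = random.sample(entries, len(entries))          # random order
validation_set = shuffled[:n_validation]
training_set = shuffled[n_validation:]

print(len(validation_set), len(training_set))  # 2 3
```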

Before continuing, revert all changes:

File → Revert Training/Validation/Jobs
../../_images/val_all.png

This places just the forces for the Ar32_frame001 job in the validation set again.

Note

There is only one job collection! It is used for both the training and validation sets. Here, the job Ar32_frame001 is needed for both:

  • energy('Ar32_frame001')-energy('Ar32_frame002') is part of the training set

  • forces('Ar32_frame001') is part of the validation set

4.6.2. Validation set settings

Switch to the Optimization panel in the bottom half
Details → DataSet
For the validation set, set Evaluate every to 5
../../_images/val_settings.png
Details → Output
Set the Logging General to 5.
  • LoggingInterval%General 5 means that information about the optimization is logged every 5 iterations (for the training and validation sets)

  • EvaluateEvery 5 means that the validation set will only be evaluated every 5 iterations (the training set must by definition be evaluated every iteration).

Tip

We recommend setting EvaluateEvery and LoggingInterval%General to the same value. If the validation set is very expensive to calculate and you want to evaluate it less often, set EvaluateEvery to a multiple of LoggingInterval%General; otherwise the validation error will not be logged!
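A minimal sketch of why the two settings should align, assuming both count optimizer iterations starting from 0 (illustrative only, not ParAMS internals):

```python
# Illustrative sketch (not ParAMS internals): which validation evaluations
# end up in the log, assuming both settings count iterations from 0.

def logged_validation_iterations(n_iterations, evaluate_every, logging_interval):
    evaluated = {i for i in range(n_iterations) if i % evaluate_every == 0}
    logged = {i for i in range(n_iterations) if i % logging_interval == 0}
    return sorted(evaluated & logged)

# EvaluateEvery equal to LoggingInterval: every validation evaluation is logged.
aligned = logged_validation_iterations(20, 5, 5)      # [0, 5, 10, 15]

# EvaluateEvery not a multiple of LoggingInterval: most evaluations are lost.
misaligned = logged_validation_iterations(20, 3, 5)   # [0, 15]
```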

4.6.3. Run the optimization

Make sure that there is at least one entry on the Validation Set panel.
File → Save As
Save the project in a new directory
File → Run

4.6.4. Training and validation set results

On the Validation Set panel, you can see the predictions and loss contributions just as on the Training Set panel.

On the Graphs panel, you can plot results for the training set, validation set, or both.

In one of the graph drop-downs, choose Loss → loss
This plots the loss function for both the training and validation set as a function of evaluation number
In the Data From: drop-down, you can toggle whether to plot the training loss, validation loss, or both.
../../_images/val_running_loss.png

In this case, both the training set and validation set losses decrease, so there is no sign of overfitting.

You can also plot the root-mean-squared error (RMSE) or mean absolute error (MAE) as a function of evaluation for both the training and validation sets:

In one of the graph drop-downs, choose Stats → Forces
This plots the RMSE of the forces for the training and validation sets
../../_images/val_stats_forces.png

Note

If you plot Stats → Energy then you do not get any validation set results, since there were no Energy entries in the validation set!

For scatter plots:

In one of the graph drop-downs, choose Forces.
This plots two curves with titles Training (134): forces and Validation (134): forces.
../../_images/val_forces_best_training.png

The (134) (the number may differ in your case) indicates the evaluation that the plotted parameters came from. Every set of parameters tried during the parametrization has a unique evaluation number.

The default plot is Best Training, which in this case corresponds to the parameters at evaluation 134.

In the Best Training drop-down, select Best Validation
This plots curves with titles Validation (140): forces and Training (140): forces.
Evaluation 140 corresponds to the parameters that gave the lowest loss function on the validation set.
../../_images/val_forces_best_validation.png
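The selection behind Best Training and Best Validation can be sketched like this. The loss values are hypothetical, chosen only to reproduce the evaluation numbers 134 and 140 from the figures:

```python
# Illustrative sketch of how Best Training / Best Validation are selected:
# the evaluation with the lowest loss on the respective set.
# The loss values are hypothetical.

training_loss = {130: 0.21, 134: 0.18, 140: 0.19, 145: 0.20}
validation_loss = {130: 0.30, 135: 0.28, 140: 0.25, 145: 0.27}

best_training = min(training_loss, key=training_loss.get)        # 134
best_validation = min(validation_loss, key=validation_loss.get)  # 140
```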

Note

Only after the parametrization has finished (from convergence, max_evaluations, or timeout) will you get the validation set results for the best training set parameters and vice versa. While the parametrization is running you will only be able to plot results for the training set using the best training set parameters, and for the validation set using the best validation set parameters.

You can also plot the latest evaluation:

In the Best Validation drop-down, select Latest Training
This plots a curve with the title Training (149).
In the Latest Training drop-down, select Latest Validation
This plots a curve with the title Validation (145).

Here, the latest training set evaluation was done with a different set of parameters than the latest validation set evaluation. This is because the validation set is only evaluated every 5 iterations (as specified by the EvaluateEvery keyword in the settings).
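Assuming the validation evaluations coincide with every 5th training iteration, the relation between the two latest evaluation numbers (149 and 145 above) can be sketched as:

```python
# Illustrative arithmetic, assuming validation evaluations coincide with
# every 5th training iteration: the latest validation evaluation number is
# then the largest multiple of 5 not exceeding the latest training one.

evaluate_every = 5
latest_training = 149
latest_validation = (latest_training // evaluate_every) * evaluate_every

print(latest_validation)  # 145
```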

Tip

You can double-click a plot axis to access the plot configuration settings.