Parameters

The Wikipedia definition of a hyperparameter is: “a parameter whose value is set before the learning process begins.” This definition separates the values that control how learning is conducted from parameters such as neural network weights, which are actively updated during training. In this sense, hyperparameters are a subset of parameters.

In the context of Valohai, all parameters to be recorded are defined before launching an execution. If you wish to record something that is defined or changes at runtime, use Valohai metadata.
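As a rough sketch, a script can report such runtime values by printing JSON objects, one per line, to standard output; Valohai collects these lines as execution metadata. The metric names below are purely illustrative.

import json

# Valohai parses each JSON object printed on its own line to stdout
# as metadata for the execution; "step" and "accuracy" are example names.
for step in range(1, 4):
    accuracy = 0.5 + 0.1 * step  # placeholder for a real metric value
    print(json.dumps({"step": step, "accuracy": accuracy}))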

Tip

Valohai supports many workloads besides standard machine learning training, such as data generation and feature extraction. Check out the What is Valohai? page to learn what else you can achieve. This is why we don’t always use the training-centric term hyperparameter.

Defining parameters

To keep track of parameters in Valohai, define them under step.parameters in your valohai.yaml.

Example of a valohai.yaml with a step defining 3 parameters:

- step:
    name: train
    image: python:3.6
    command:
      - python train.py {parameters}
    parameters:
      - name: max_steps
        pass-as: --max_steps={v}
        type: integer
        default: 300
      - name: learning_rate
        pass-as: --learning_rate={v}
        type: float
        default: 0.001
      - name: dropout
        pass-as: --dropout={v}
        type: float
        default: 0.9

The above would generate the following command by default:

python train.py --max_steps=300 --learning_rate=0.001 --dropout=0.9
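On the script side, train.py is expected to accept the generated flags; any argument parser will do. Below is a minimal sketch using Python's standard argparse module (the parsing approach itself is an assumption, not something the step requires):

import argparse

# Accept the flags generated from step.parameters; the names, types
# and defaults mirror the valohai.yaml example above.
parser = argparse.ArgumentParser()
parser.add_argument("--max_steps", type=int, default=300)
parser.add_argument("--learning_rate", type=float, default=0.001)
parser.add_argument("--dropout", type=float, default=0.9)
args = parser.parse_args()

print(f"Training for {args.max_steps} steps "
      f"(lr={args.learning_rate}, dropout={args.dropout})")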

Selecting values

When you are creating a new Valohai execution using the web user interface, look for the Parameters subsection at the bottom.

Execution parameters.

The default values of the parameters are defined in the valohai.yaml, but they can be overridden through the web user interface, the command-line client, or the API. All changes are version controlled as part of the Valohai execution.
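For example, with the valohai-cli installed and linked to a project, parameter defaults can be overridden when launching an execution from the command line. The option syntax below is an assumption based on the parameter names defined in the valohai.yaml; check the valohai-cli help for the options generated for your step. The values are illustrative.

vh exec run train --learning_rate=0.002 --dropout=0.8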