Parameters: Make Your Jobs Configurable

Turn your hardcoded values into parameters that you can change without touching code. This guide shows how to expose configuration values, hyperparameters, and other settings to Valohai.

💡 Already using argparse? You're 90% done. Just list your parameters in valohai.yaml.

Why Use Parameters?

  • Change values without code changes: adjust hyperparameters from the UI

  • Track experiments automatically: sort and filter by parameter values

  • Run parallel experiments: test multiple configurations at once

  • Reproduce results: every execution saves its exact parameters

Quick Example

Turn hardcoded values into command-line arguments:

Before

# Hardcoded values
iterations = 10
learning_rate = 0.01

After

# Parse from command line
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--iterations', type=int, default=10)
parser.add_argument('--learning_rate', type=float, default=0.01)
args = parser.parse_args()

iterations = args.iterations
learning_rate = args.learning_rate

Then declare them in valohai.yaml:

- step:
    name: train-model
    image: tensorflow/tensorflow:2.6.0
    command:
        - python train_model.py
    parameters:
        - name: iterations
          type: integer
          default: 10
        - name: learning_rate
          type: float
          default: 0.01

How Parameters Work

  1. Define in valohai.yaml: list parameters with types and defaults

  2. Parse in your code: use argparse (or read from config files)

  3. Override at runtime: change values via the UI, CLI, or API

Valohai passes parameters as command-line arguments:

python train_model.py --iterations=50 --learning_rate=0.001
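
To see the end result locally, you can feed the same argument strings to the parser from the Quick Example; parse_args accepts an explicit list, which stands in here for the command line Valohai builds:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--iterations', type=int, default=10)
parser.add_argument('--learning_rate', type=float, default=0.01)

# Simulate the command line Valohai builds for this execution
args = parser.parse_args(['--iterations=50', '--learning_rate=0.001'])
print(args.iterations, args.learning_rate)  # 50 0.001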

Common Patterns

Multiple Parameter Types

parameters:
    - name: batch_size
      type: integer
      default: 32
    - name: dropout_rate
      type: float
      default: 0.5
    - name: optimizer
      type: string
      default: "adam"
    - name: use_augmentation
      type: flag
      default: false

💡 Flags are not exactly booleans. When a flag's value is false, nothing is passed to the execution at all. If you want flags to behave more like real booleans, i.e. to pass an explicit true or false to the execution, use pass-true-as / pass-false-as in your valohai.yaml:

        - name: verbose
          type: flag
          default: false
          pass-true-as: --verbose=True
          pass-false-as: --verbose=False
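
With pass-true-as / pass-false-as as above, your script receives a literal string such as --verbose=True, which argparse's action='store_true' cannot handle. One way to parse it is a small converter function; str_to_bool below is our own helper, not part of argparse:

import argparse

def str_to_bool(value):
    # Interpret the string injected by pass-true-as / pass-false-as
    return str(value).lower() in ('true', '1', 'yes')

parser = argparse.ArgumentParser()
parser.add_argument('--verbose', type=str_to_bool, default=False)
args = parser.parse_args()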

Parse in Python

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--batch_size', type=int, default=32)
parser.add_argument('--dropout_rate', type=float, default=0.5)
parser.add_argument('--optimizer', type=str, default='adam')
parser.add_argument('--use_augmentation', action='store_true')
args = parser.parse_args()

Override When Running

# Via CLI
vh execution run train-model --batch_size=64 --optimizer=sgd

# Parameters appear in the UI for easy changes
# Also available via API for automation

Alternative: Read from Config Files

Don't use command-line arguments? Valohai creates config files you can read:

import json

# Read from JSON
with open('/valohai/config/parameters.json') as f:
    params = json.load(f)
    learning_rate = params['learning_rate']

# Or read from YAML
import yaml
with open('/valohai/config/parameters.yaml') as f:
    params = yaml.safe_load(f)

Valohai creates these files automatically and they are read-only, so you can't change the parameter values in them while the job is running.
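
If the same script should also run outside Valohai, where /valohai/config/parameters.json does not exist, one possible pattern is a small fallback loader; load_params and its defaults below are illustrative, not part of Valohai:

import json
import os

def load_params(defaults, path='/valohai/config/parameters.json'):
    # Use Valohai's parameter file when present, local defaults otherwise
    if os.path.exists(path):
        with open(path) as f:
            return {**defaults, **json.load(f)}
    return dict(defaults)

params = load_params({'iterations': 10, 'learning_rate': 0.01})
learning_rate = params['learning_rate']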

Optional: Use the valohai-utils Python helper tool

The valohai-utils helper library offers a simpler syntax:

import valohai

# No argparse needed
batch_size = valohai.parameters('batch_size').value
learning_rate = valohai.parameters('learning_rate').value
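
With valohai-utils you can also declare the step and its defaults directly in code via valohai.prepare(); the vh yaml step command can then generate the matching valohai.yaml section. A minimal sketch (the step name and default values here are illustrative):

import valohai

# Declare the step and its default parameters in code
valohai.prepare(
    step='train-model',
    default_parameters={
        'batch_size': 32,
        'learning_rate': 0.001,
    },
)

batch_size = valohai.parameters('batch_size').value
learning_rate = valohai.parameters('learning_rate').value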

Parameter Types Reference

Type     | YAML           | Python (argparse)    | Example
---------|----------------|----------------------|----------------
Integer  | type: integer  | type=int             | --epochs=100
Float    | type: float    | type=float           | --lr=0.001
String   | type: string   | type=str             | --model=resnet
Flag     | type: flag     | action='store_true'  | --augment

What's Next?

With parameters defined, you can:

  • Compare experiments: filter executions by parameter values

  • Run hyperparameter sweeps: test multiple values in parallel

  • Create reproducible pipelines: lock in successful parameter sets

