Notification Triggers

Launch executions and pipelines automatically when Valohai events occur

Notification triggers launch workloads automatically when events happen inside Valohai, making them ideal for building event-driven ML workflows that react to data uploads, model approvals, or execution completions.


Why Use Notification Triggers?

Event-driven automation: React immediately when things happen in Valohai

Zero manual intervention: Workflows advance automatically from step to step

Data freshness: Process new datasets the moment they're uploaded

ML lifecycle automation: Deploy models immediately when approved

Pipeline orchestration: Chain workflows without manual clicks


How Notification Triggers Work

Notification triggers have two key components:

1. Managed Trigger Channel

When you create a notification trigger, Valohai automatically creates a special notification channel for it. This channel receives notifications about events happening in your project.

You'll see this as Launches trigger: <trigger-name> in your notification settings.

2. Notification Routing

You connect specific events to your trigger channel through notification routing:

  1. Choose which event type to monitor (e.g., "dataset version created")

  2. Optionally filter by users (all users or specific ones)

  3. Route the event to your trigger's managed channel

When the event occurs, the notification fires and launches your configured action.


Supported Events

Notification triggers can respond to these Valohai events:

Dataset Events

  • Dataset version created: New data uploaded to a dataset

  • Useful for: Auto-preprocessing, validation, training on fresh data

Model Events

  • Model latest approved version changes: A model's approved version updates

  • Useful for: Auto-deployment, version tracking, downstream updates

Execution Events

  • Execution completes: An execution finishes successfully

  • Execution fails: An execution errors out

  • Useful for: Pipeline chaining, conditional logic, error handling

Pipeline Events

  • Pipeline completes: A pipeline finishes successfully

  • Pipeline fails: A pipeline errors out

  • Useful for: Multi-stage workflows, reporting, deployment triggers

Task Events

  • Task completes: A task finishes successfully

  • Task fails: A task errors out

Deployment Events

  • Deployment completes: A deployment finishes successfully

  • Deployment fails: A deployment errors out


Common Notification Trigger Patterns

Pattern: Auto-Process New Datasets

Goal: Every time a new dataset version is created, automatically preprocess it

Setup:

  1. Create notification trigger that runs preprocessing pipeline

  2. Configure trigger to receive dataset version as input

  3. Set notification routing: "dataset version created" → trigger channel

Result: Upload data → preprocessing starts automatically → clean data ready for training

Full dataset version trigger guide →


Pattern: Deploy Approved Models

Goal: When a model version is approved, automatically deploy it

Setup:

  1. Create notification trigger that runs deployment pipeline

  2. Configure trigger to receive model version as input

  3. Set notification routing: "model latest approved version changes" → trigger channel

  4. Add payload filter for specific model

Result: Approve model → deployment starts automatically → model live in production

Full model version trigger guide →


Pattern: Chain Pipeline Stages

Goal: When training completes, automatically run evaluation

Setup:

  1. Training pipeline outputs model files

  2. Create notification trigger that runs evaluation pipeline

  3. Configure evaluation to receive model as input

  4. Set notification routing: "pipeline completes" → trigger channel

  5. Add payload filter to only trigger for training pipeline

Result: Training finishes → evaluation launches → results available automatically


Pattern: Failure Recovery

Goal: When execution fails, notify team and launch diagnostic job

Setup:

  1. Create notification trigger that runs diagnostic execution

  2. Configure trigger to receive failure details

  3. Set notification routing: "execution fails" → trigger channel

  4. Also set up outgoing webhook to Slack for alerts

Result: Execution fails → team notified → diagnostics run automatically → faster debugging
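A diagnostic step might start from the failure payload along these lines. The field names below (`execution`, `counter`, `step`) are illustrative assumptions, not the documented failure-payload schema; inspect a real payload from the trigger logs before relying on any field:

```python
import json

# Illustrative failure payload; the real field names may differ,
# so treat this structure as a placeholder
payload = {
    "type": "execution_failure",
    "data": {"execution": {"counter": 42, "step": "train-model"}},
}

# Defensive lookups, since the exact schema is an assumption
execution = payload.get("data", {}).get("execution", {})
print(
    f"Diagnosing failed execution #{execution.get('counter')} "
    f"(step: {execution.get('step')})"
)
```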


Payload Filtering

Notification triggers can filter events to only launch for specific cases.

Filter by Model or Dataset

Only trigger for a specific resource using payload filters:

Condition type: Payload Filter

  • Lookup Path: data.model.slug (for models) or data.dataset.name (for datasets)

  • Operation: Equals

  • Invariant: your-model-name or your-dataset-name

This prevents the trigger from firing for every model/dataset in your project.
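Conceptually, a payload filter walks the Lookup Path through the JSON payload and compares the value it finds against the Invariant. A minimal sketch of that matching logic (the payload contents and model slug below are illustrative, not a real event):

```python
import json

# Illustrative event payload; the "type" value and model slug are placeholders
payload = json.loads("""
{
  "type": "model_approved_version_changed",
  "data": {"model": {"slug": "fraud-detector"}}
}
""")

def matches(payload: dict, lookup_path: str, invariant: str) -> bool:
    """Walk a dotted lookup path (e.g. 'data.model.slug') and test equality."""
    value = payload
    for key in lookup_path.split("."):
        if not isinstance(value, dict) or key not in value:
            return False
        value = value[key]
    return value == invariant

print(matches(payload, "data.model.slug", "fraud-detector"))  # True: trigger fires
print(matches(payload, "data.model.slug", "other-model"))     # False: trigger skipped
```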

Filter by User

In notification routing settings, choose:

  • All users: Trigger for events from any team member

  • Specific users: Trigger only for selected users

Useful for personal workflows without spamming the team.


Passing Data to Triggered Workflows

The Notification Payload

When a notification event occurs, Valohai generates a JSON payload with event details:

{
  "type": "dataset_version_created",
  "data": {
    "version": {
      "uri": "datum://dataset-name/version-id",
      "ctime": "2024-01-15T10:30:00Z"
    }
  }
}

This payload becomes an input file in your triggered execution or pipeline.

Parsing the Payload

Your code extracts data from the payload:

import json
import valohai

# Read the notification payload
with open(valohai.inputs('payload').path()) as f:
    payload = json.load(f)

# Extract the dataset version URI
dataset_uri = payload["data"]["version"]["uri"]

# Use it as needed
print(f"Processing dataset: {dataset_uri}")

Passing Data Through Pipelines

Use pipeline edges to pass extracted data to downstream nodes:

- pipeline:
    name: Dataset Handler
    nodes:
      - name: parse-notification
        step: parse-notification
      - name: process-data
        step: process-data
    edges:
      - [parse-notification.metadata.dataset, process-data.parameter.dataset_url]

The parsing node extracts the URI and outputs it as metadata. The edge connects it to the processing node's parameter.
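A sketch of what the parsing node might look like, assuming the payload arrives as an input named `payload` and that metadata is emitted as JSON printed to stdout (the file-writing below only simulates the input Valohai would deliver):

```python
import json

def extract_dataset_uri(payload_path: str) -> str:
    """Read the notification payload file and return the dataset version URI."""
    with open(payload_path) as f:
        payload = json.load(f)
    return payload["data"]["version"]["uri"]

# Simulate the payload file Valohai would deliver as the 'payload' input
# (in a real execution: valohai.inputs("payload").path() with valohai-utils)
with open("payload.json", "w") as f:
    json.dump({"data": {"version": {"uri": "datum://dataset-name/version-id"}}}, f)

dataset_uri = extract_dataset_uri("payload.json")

# Print as JSON so it is recorded as execution metadata; the pipeline
# edge then reads it as parse-notification.metadata.dataset
print(json.dumps({"dataset": dataset_uri}))
```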


Setting Up a Notification Trigger

Step 1: Create the Trigger

  1. Go to Project Settings → Triggers

  2. Click Create Trigger

  3. Trigger Type: Select "Notification"

  4. Title: Give it a descriptive name

  5. Conditions: Add payload filters if needed

  6. Actions: Choose what to launch

    • Run Execution: Launch a single step

    • Run Pipeline: Launch a multi-step workflow

  7. Payload Input Name: Specify where to send the notification data

    • For execution: input-name

    • For pipeline: node-name.input-name

  8. Save the trigger

A managed trigger channel is automatically created.

Step 2: Set Up Notification Routing

  1. Go to Project Settings → Notifications → Project Notifications

  2. Click Create new notification routing

  3. Event: Select the event type

  4. Filter by users: Choose "All users" or specific users

  5. Channel: Select Launches trigger: <your-trigger-name>

  6. Save the routing

Now when the event occurs, your trigger launches automatically!


Testing Your Trigger

Manual Testing

Before relying on real events, test your trigger:

  1. Create test payload: Make a JSON file matching the expected structure

  2. Create execution manually: Select your parsing step and upload the test payload

  3. Verify parsing: Check that your code extracts data correctly

  4. Test full pipeline: Run the pipeline manually with test inputs
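Creating the test payload can be as simple as writing a JSON file that mirrors the payload structure shown earlier; the values below are placeholders, not real identifiers:

```python
import json

# Minimal test payload mirroring the documented structure;
# the URI and timestamp are placeholder values
test_payload = {
    "type": "dataset_version_created",
    "data": {
        "version": {
            "uri": "datum://dataset-name/version-id",
            "ctime": "2024-01-15T10:30:00Z",
        }
    },
}

with open("test_payload.json", "w") as f:
    json.dump(test_payload, f, indent=2)

# Sanity-check that it parses back and contains the field your code reads
with open("test_payload.json") as f:
    loaded = json.load(f)
print(loaded["data"]["version"]["uri"])
```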

Trigger a Real Event

Once manual testing passes:

  1. For dataset triggers: Upload a new dataset version

  2. For model triggers: Approve a model version

  3. For execution triggers: Run a test execution

Check the execution list to verify the trigger launched automatically.

Debug Failed Triggers

If triggers don't launch:

  1. Check trigger logs: Project Settings → Triggers → ... menu → View Logs

  2. Verify routing: Confirm event type matches your routing configuration

  3. Check payload filters: Ensure filters aren't too restrictive

  4. Validate inputs: Make sure the step declares the payload input in its YAML
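If the triggered step doesn't declare the payload input in valohai.yaml, the launch has nowhere to deliver the notification data. A minimal sketch of such a declaration (the step name, image, and command are placeholders; the input name must match the trigger's Payload Input Name):

```yaml
- step:
    name: parse-notification
    image: python:3.11
    command: python parse.py
    inputs:
      - name: payload
```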


Notification Triggers vs. Webhook Triggers

Both respond to events, but from different sources:

| Feature        | Notification Triggers    | Webhook Triggers          |
|----------------|--------------------------|---------------------------|
| Event source   | Inside Valohai           | External services         |
| Setup          | Notification routing     | Webhook URL               |
| Authentication | Not needed               | Required                  |
| Use cases      | React to Valohai events  | Integrate external tools  |

Use notification triggers for Valohai-internal workflows. Use webhook triggers when external services need to start work in Valohai.

