Notification Triggers
Launch executions and pipelines automatically when Valohai events occur
Notification triggers launch workloads automatically when events happen inside Valohai. Perfect for building event-driven ML workflows that react to data uploads, model approvals, or execution completions.
Why Use Notification Triggers?
Event-driven automation: React immediately when things happen in Valohai
Zero manual intervention: Workflows advance automatically from step to step
Data freshness: Process new datasets the moment they're uploaded
ML lifecycle automation: Deploy models immediately when approved
Pipeline orchestration: Chain workflows without manual clicks
How Notification Triggers Work
Notification triggers have two key components:
1. Managed Trigger Channel
When you create a notification trigger, Valohai automatically creates a special notification channel for it. This channel receives notifications about events happening in your project.
You'll see this as Launches trigger: <trigger-name> in your notification settings.
2. Notification Routing
You connect specific events to your trigger channel through notification routing:
Choose which event type to monitor (e.g., "dataset version created")
Optionally filter by users (all users or specific ones)
Route the event to your trigger's managed channel
When the event occurs, the notification fires and launches your configured action.
Supported Events
Notification triggers can respond to these Valohai events:
Dataset Events
Dataset version created: New data uploaded to a dataset
Useful for: Auto-preprocessing, validation, training on fresh data
Model Events
Model latest approved version changes: A model's approved version updates
Useful for: Auto-deployment, version tracking, downstream updates
Execution Events
Execution completes: An execution finishes successfully
Execution fails: An execution errors out
Useful for: Pipeline chaining, conditional logic, error handling
Pipeline Events
Pipeline completes: A pipeline finishes successfully
Pipeline fails: A pipeline errors out
Useful for: Multi-stage workflows, reporting, deployment triggers
Task Events
Task completes: A task finishes successfully
Task fails: A task errors out
Deployment Events
Deployment completes: A deployment finishes successfully
Deployment fails: A deployment errors out
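If a single step handles more than one of these events, it can branch on the type field of the notification payload (the payload format is shown later on this page). A minimal sketch, assuming the payload arrives in an input named payload and that only the documented dataset_version_created type string is known:

import json

import valohai

# Read the notification payload delivered by the trigger
# (the input name "payload" is an assumption; use whatever you configured).
with open(valohai.inputs("payload").path()) as f:
    payload = json.load(f)

event_type = payload.get("type", "")

if event_type == "dataset_version_created":
    # Documented example event: a new dataset version was uploaded
    print(f"New dataset version: {payload['data']['version']['uri']}")
else:
    # Other event types carry different fields; inspect them before relying on them
    print(f"Unhandled event type: {event_type}")
    print(json.dumps(payload, indent=2))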
Common Notification Trigger Patterns
Pattern: Auto-Process New Datasets
Goal: Every time a new dataset version is created, automatically preprocess it
Setup:
Create notification trigger that runs preprocessing pipeline
Configure trigger to receive dataset version as input
Set notification routing: "dataset version created" → trigger channel
Result: Upload data → preprocessing starts automatically → clean data ready for training
Full dataset version trigger guide →
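As a sketch, the preprocessing step's entry point could iterate over the files of the new dataset version that the trigger maps in (the input name dataset and the copy-through "cleaning" are illustrative; valohai-utils is used as in the examples later on this page):

import os
import shutil

import valohai

# Files of the newly created dataset version, mapped in by the trigger
# (the input name "dataset" is an assumption; match your trigger configuration).
for path in valohai.inputs("dataset").paths():
    print(f"Preprocessing {path}")
    # ... clean / validate the file here ...
    # Save the result as an output so a training step can pick it up.
    shutil.copy(path, valohai.outputs().path(os.path.basename(path)))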
Pattern: Deploy Approved Models
Goal: When a model version is approved, automatically deploy it
Setup:
Create notification trigger that runs deployment pipeline
Configure trigger to receive model version as input
Set notification routing: "model latest approved version changes" → trigger channel
Add payload filter for specific model
Result: Approve model → deployment starts automatically → model live in production
Full model version trigger guide →
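The deployment job can read which model was approved from the notification payload. A sketch, assuming the payload arrives in an input named payload and that the model is identified by the data.model.slug field referenced in the payload filtering section below (the surrounding payload layout is an assumption; inspect a real event first):

import json

import valohai

# Read the notification payload delivered by the trigger
with open(valohai.inputs("payload").path()) as f:
    payload = json.load(f)

# data.model.slug is the lookup path used for model filters below;
# the nesting here is assumed, so fall back gracefully if it differs.
model_slug = payload.get("data", {}).get("model", {}).get("slug", "<unknown>")
print(f"Deploying newly approved version of model: {model_slug}")
# ... call your deployment tooling here ...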
Pattern: Chain Pipeline Stages
Goal: When training completes, automatically run evaluation
Setup:
Training pipeline outputs model files
Create notification trigger that runs evaluation pipeline
Configure evaluation to receive model as input
Set notification routing: "pipeline completes" → trigger channel
Add payload filter to only trigger for training pipeline
Result: Training finishes → evaluation launches → results available automatically
Pattern: Failure Recovery
Goal: When execution fails, notify team and launch diagnostic job
Setup:
Create notification trigger that runs diagnostic execution
Configure trigger to receive failure details
Set notification routing: "execution fails" → trigger channel
Also set up outgoing webhook to Slack for alerts
Result: Execution fails → team notified → diagnostics run automatically → faster debugging
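Because the exact fields of the failure payload are not documented on this page, a safe first version of the diagnostic job simply dumps whatever it received, so you can see what is available before wiring in real diagnostics:

import json

import valohai

# Read the "execution fails" notification payload
# (the input name "payload" is an assumption; match your trigger configuration).
with open(valohai.inputs("payload").path()) as f:
    payload = json.load(f)

# Print the full event for inspection; replace this with log collection,
# retries, or other diagnostics once the useful fields are known.
print(json.dumps(payload, indent=2))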
Payload Filtering
Notification triggers can filter events to only launch for specific cases.
Filter by Model or Dataset
Only trigger for a specific resource using payload filters:
Condition type: Payload Filter
Lookup Path: data.model.slug (for models) or data.dataset.name (for datasets)
Operation: Equals
Invariant: your-model-name or your-dataset-name
This prevents the trigger from firing for every model/dataset in your project.
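Conceptually, the filter walks the lookup path through the payload and compares the value it finds against the invariant. A rough illustration of that logic (not Valohai's implementation, just the idea):

def payload_filter_matches(payload: dict, lookup_path: str, invariant: str) -> bool:
    """Walk a dotted lookup path (e.g. 'data.dataset.name') through the payload
    and check whether the value found there equals the invariant."""
    value = payload
    for key in lookup_path.split("."):
        if not isinstance(value, dict) or key not in value:
            return False
        value = value[key]
    return value == invariant

# Example with an illustrative dataset name:
payload = {"type": "dataset_version_created", "data": {"dataset": {"name": "sales-data"}}}
print(payload_filter_matches(payload, "data.dataset.name", "sales-data"))  # True
print(payload_filter_matches(payload, "data.dataset.name", "other-data"))  # False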
Filter by User
In notification routing settings, choose:
All users: Trigger for events from any team member
Specific users: Trigger only for selected users
Useful for personal workflows without spamming the team.
Passing Data to Triggered Workflows
The Notification Payload
When a notification event occurs, Valohai generates a JSON payload with event details:
{
  "type": "dataset_version_created",
  "data": {
    "version": {
      "uri": "datum://dataset-name/version-id",
      "ctime": "2024-01-15T10:30:00Z"
    }
  }
}

This payload becomes an input file in your triggered execution or pipeline.
Parsing the Payload
Your code extracts data from the payload:
import json

import valohai

# Read the notification payload
with open(valohai.inputs('payload').path()) as f:
    payload = json.load(f)

# Extract the dataset version URI
dataset_uri = payload["data"]["version"]["uri"]

# Use it as needed
print(f"Processing dataset: {dataset_uri}")

Passing Data Through Pipelines
Use pipeline edges to pass extracted data to downstream nodes:
- pipeline:
    name: Dataset Handler
    nodes:
      - name: parse-notification
        type: execution
        step: parse-notification
      - name: process-data
        type: execution
        step: process-data
    edges:
      - [parse-notification.metadata.dataset, process-data.parameter.dataset_url]

The parsing node extracts the URI and outputs it as metadata. The edge connects it to the processing node's parameter.
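A sketch of what the parse-notification step could do: read the payload and print the dataset URI as a JSON metadata line, so the metadata edge above can pass it to the process-data node's dataset_url parameter (the metadata key dataset matches the edge):

import json

import valohai

# Read the notification payload delivered by the trigger
with open(valohai.inputs("payload").path()) as f:
    payload = json.load(f)

# Extract the new dataset version's datum:// URI
dataset_uri = payload["data"]["version"]["uri"]

# Printing a JSON line records it as Valohai execution metadata;
# the key "dataset" is what the metadata edge above refers to.
print(json.dumps({"dataset": dataset_uri}))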
Setting Up a Notification Trigger
Step 1: Create the Trigger
Go to Project Settings → Triggers
Click Create Trigger
Trigger Type: Select "Notification"
Title: Give it a descriptive name
Conditions: Add payload filters if needed
Actions: Choose what to launch
Run Execution: Launch a single step
Run Pipeline: Launch a multi-step workflow
Payload Input Name: Specify where to send the notification data
For execution: input-name
For pipeline: node-name.input-name
Save the trigger
A managed trigger channel is automatically created.
Step 2: Set Up Notification Routing
Go to Project Settings → Notifications → Project Notifications
Click Create new notification routing
Event: Select the event type
Filter by users: Choose "All users" or specific users
Channel: Select Launches trigger: <your-trigger-name>
Save the routing
Now when the event occurs, your trigger launches automatically!
Testing Your Trigger
Manual Testing
Before relying on real events, test your trigger:
Create test payload: Make a JSON file matching the expected structure (see the sketch after this list)
Create execution manually: Select your parsing step and upload the test payload
Verify parsing: Check that your code extracts data correctly
Test full pipeline: Run the pipeline manually with test inputs
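For the first step, a small script (or a hand-written file) can produce a payload that mirrors the structure shown earlier on this page; the dataset name, version ID, and timestamp below are placeholders:

import json

# A hand-crafted payload mirroring the documented dataset event structure.
test_payload = {
    "type": "dataset_version_created",
    "data": {
        "version": {
            "uri": "datum://my-dataset/0123456789abcdef",
            "ctime": "2024-01-15T10:30:00Z",
        }
    },
}

with open("test_payload.json", "w") as f:
    json.dump(test_payload, f, indent=2)

print("Wrote test_payload.json - upload it as the payload input of a manual execution.")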
Trigger a Real Event
Once manual testing passes:
For dataset triggers: Upload a new dataset version
For model triggers: Approve a model version
For execution triggers: Run a test execution
Check the execution list to verify the trigger launched automatically.
Debug Failed Triggers
If triggers don't launch:
Check trigger logs: Project Settings → Triggers → ... menu → View Logs
Verify routing: Confirm event type matches your routing configuration
Check payload filters: Ensure filters aren't too restrictive
Validate inputs: Make sure step declares the payload input in YAML
Notification Triggers vs. Webhook Triggers
Both respond to events, but from different sources:
                   Notification triggers      Webhook triggers
Event source       Inside Valohai             External services
Setup              Notification routing       Webhook URL
Authentication     Not needed                 Required
Use cases          React to Valohai events    Integrate external tools
Next Steps
Process datasets automatically: Dataset Version Trigger Guide
Auto-deploy models: Model Version Trigger Guide
Need scheduled automation? Scheduled Triggers
Integrate external services: Webhook Triggers