Model Version Trigger
Automatically launch workflows when a model's approved version changes
Launch executions or pipelines automatically when a model's latest approved version changes. Perfect for automating deployment workflows, validation checks, or downstream updates when models are approved.
Use Case
Deploy models immediately after they're approved, without manual intervention. When someone approves a model version (or rejects the current one, making an older version the latest), your deployment workflow launches automatically.
Perfect for:
Auto-deploying approved models to production
Running validation checks on newly approved models
Updating downstream systems when model versions change
Uploading model files to external repositories
How It Works
Model approved: User approves a model version in Valohai
Notification fires: "Model latest approved version changes" event triggers
Pipeline launches: Your deployment workflow starts automatically
Parse payload: First node extracts model version URI from notification
Deploy model: Second node receives URI and deploys the model
What Counts as "Latest Approved Version Changes"?
This event fires when:
A new version is approved (becomes the latest approved)
Current latest is rejected (an older approved version becomes latest)
In either case, the notification payload contains whichever version is now the latest approved one.
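As an illustration, the parts of the notification payload this guide relies on look roughly like this (a hedged sketch: only the `data.model.slug` and `data.model_version.uri` fields used later are shown; the real notification carries more data):

```python
import json

# Illustrative payload sketch: only the fields this guide uses;
# the real notification contains additional data.
payload_json = """
{
  "data": {
    "model": {"slug": "rock-formations"},
    "model_version": {"uri": "model://rock-formations/1"}
  }
}
"""

payload = json.loads(payload_json)
print(payload["data"]["model"]["slug"])         # matched by the trigger's payload filter
print(payload["data"]["model_version"]["uri"])  # consumed by the parsing step
```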
Step 1: Create a Model (or Use Existing)
You need a model to trigger automation for. You can use an existing model or create a new one.
Create New Model
Navigate to your project → Models tab
Click + Create Model
Enter model name (e.g., "Rock Formations")
Note the model slug from the URI (e.g., rock-formations from model://rock-formations)
Use the slug, not the name: The slug is the immutable identifier. If you rename the model, the slug stays the same, so your automation won't break.
Step 2: Create the Scripts
Parse Notification Payload
This script extracts the model version URI from the notification payload.
Create new_latest_version.py:

```python
"""Parse the latest model version from the payload.

The payload is a JSON file that Valohai sends to the step.
"""
import json

import valohai

# The notification payload is provided as a Valohai input file
input_file = valohai.inputs("payload").path()

# Load the JSON payload content from the input file
with open(input_file) as file:
    payload = json.load(file)

# Retrieve the new model version URI from the payload
model_version_uri = payload["data"]["model_version"]["uri"]

# Log the URI
print(f"The latest approved model version is: {model_version_uri}")

# Emit it as metadata for use in a pipeline edge later
print(json.dumps({"model": model_version_uri}))
```

Basic YAML Configuration
Add to valohai.yaml:

```yaml
- step:
    name: new_latest_version
    image: python:3.12
    command:
      - pip install valohai-utils
      - python ./new_latest_version.py {parameters}
    inputs:
      - name: payload
```

Step 3: Create the Trigger
Go to Project Settings → Triggers
Click Create Trigger
Configure:
Basic Settings:
Title: New model version handler
Trigger Type: Notification
Conditions: Add Payload Filter to only trigger for your specific model:
Lookup Path: data.model.slug
Operation: Equals
Invariant: rock-formations (use your model's slug)
Actions:
Action Type: Run Execution
Source Commit Reference: main
Execution Step: new_latest_version
Execution Title: (optional)
Payload Input Name: payload
Click Save Trigger
Step 4: Set Up Notification Routing
Go to Project Settings → Notifications → Project Notifications
Click Create new notification routing
Configure:
Event: model latest approved version changes
Filter events by users: All users
Channel: Select Launches trigger: New model version handler
Save
Step 5: Test the Trigger
Upload and Approve a Model Version
Go to your model in the Models tab
Click + Create Version
Upload a model file (any file works for testing)
After upload, click Approve on the version
Check Execution Launched
Navigate to Executions tab
Look for automatically created execution
Open execution and check logs:
The latest approved model version is: model://rock-formations/1

Success! Your trigger is working.
Advanced: Upload Model Files to External Service
Let's extend this to actually do something with the approved model: uploading it to an external repository.
Add Upload Script
Create upload_latest_model.py:

```python
import requests

import valohai

# Notice! This is a non-functioning sample URL; replace it with your own service endpoint.
# It should accept a single multipart form upload with the field name "model_file".
post_endpoint = "https://example.com/model_repository/"

for file_path in valohai.inputs("model").paths():
    print(f"Uploading {file_path}...")
    with open(file_path, "rb") as model_file:
        requests.post(
            post_endpoint,
            files={"model_file": model_file},
        ).raise_for_status()
```

Replace the URL: The example URL is a placeholder. Replace it with your actual model repository endpoint.
Update YAML for Pipeline
Replace your YAML with this pipeline configuration:

```yaml
- step:
    name: new_latest_version
    image: python:3.12
    command:
      - pip install valohai-utils
      - python ./new_latest_version.py {parameters}
    inputs:
      - name: payload

- step:
    name: upload_latest_model
    image: python:3.12
    command:
      - pip install valohai-utils requests
      - python ./upload_latest_model.py {parameters}
    parameters:
      - name: model_url
        type: string
    inputs:
      - name: model
        default: "{parameter:model_url}"

- pipeline:
    name: Automatically upload newest approved model version
    nodes:
      - name: new_latest_version
        step: new_latest_version
        type: execution
      - name: upload_latest_model
        step: upload_latest_model
        type: execution
    edges:
      - [new_latest_version.metadata.model, upload_latest_model.parameter.model_url]
```

How the Pipeline Works
Node 1: new_latest_version
Receives notification payload
Extracts model version URI
Outputs as metadata
Edge: Connect metadata to parameter
Takes new_latest_version.metadata.model
Passes it to upload_latest_model.parameter.model_url
Node 2: upload_latest_model
Receives model URI via parameter
Downloads model files from Valohai
Uploads each file to external repository
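The edge mechanism hinges on node 1's final print: Valohai collects JSON objects printed to stdout as execution metadata, and the edge reads the `model` key from them. A minimal sketch:

```python
import json

# What node 1 emits: a JSON line on stdout. Valohai collects such lines as
# execution metadata; the edge new_latest_version.metadata.model then reads
# the "model" key and passes its value to the model_url parameter.
model_version_uri = "model://rock-formations/1"  # illustrative value
print(json.dumps({"model": model_version_uri}))
```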
Update the Trigger
Edit your existing trigger
Change Actions:
Action Type: Run Pipeline (was: Run Execution)
Pipeline Name: Automatically upload newest approved model version
Pipeline Title: Model approved version uploader
Payload Input Name: new_latest_version.payload (was: payload)
Save trigger
Step 6: Test End-to-End
Upload a new model version
Approve it
Check Pipelines tab for automatically launched pipeline
Verify both nodes execute:
First node extracts model URI
Second node uploads to external service
Check external service to confirm files uploaded
Troubleshooting
Trigger Not Firing
Check model slug:
Verify payload filter uses correct slug
Check model page for actual slug in URI
Check notification routing:
Confirm event is "model latest approved version changes"
Verify channel is correctly selected
Check trigger status:
Ensure trigger is enabled
Check trigger logs for errors
Pipeline Fails at Upload Node
Common issues:
External endpoint URL is incorrect
Authentication missing or wrong
Network connectivity issues
Model files too large
Debug steps:
Test the external endpoint with curl or Postman first
Add authentication headers if required
Check execution logs for detailed error messages
Verify model files downloaded successfully before upload attempt
Wrong Model Version Triggered
Possible causes:
Multiple models without payload filters
Payload filter using name instead of slug
Model was renamed (name changed but slug stayed same)
Solutions:
Always filter by data.model.slug, not the name
Verify the slug matches the one in the model URI
Check trigger logs to see which model fired it
Multiple Triggers Firing
This is expected if:
You have multiple triggers without payload filters
Multiple models share the same slug (impossible, but check anyway)
Solutions:
Add or fix payload filters
Consider consolidating triggers if intentional
Best Practices
Use Model Slugs for Filtering
Always filter by data.model.slug, never by name:
✅ data.model.slug = my-model (stable, never changes)
❌ data.model.name = My Model (can be renamed)
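The difference is easy to see by simulating the filter's lookup path against a payload sketch (the `lookup` helper is an illustration of dotted-path lookup, not Valohai's own implementation):

```python
# Illustrative payload fragment; the real notification carries more fields.
payload = {"data": {"model": {"slug": "rock-formations", "name": "Rock Formations"}}}


def lookup(payload: dict, path: str):
    """Resolve a dotted lookup path such as 'data.model.slug'."""
    value = payload
    for key in path.split("."):
        value = value[key]
    return value


# The slug survives a rename; the name does not:
print(lookup(payload, "data.model.slug"))  # rock-formations (immutable)
print(lookup(payload, "data.model.name"))  # Rock Formations (can change)
```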
Validate Before Deploying
Don't blindly deploy approved models:
Run sanity checks (model loads, correct format)
Test on validation data
Verify expected outputs
Check model size and complexity
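A minimal sanity check could run before any deployment step (a sketch: the size threshold is an arbitrary assumption, and you should add format-specific checks such as actually loading the model in your framework):

```python
import os


def sanity_check(file_path: str, min_size_bytes: int = 1) -> None:
    """Fail fast on obviously broken model files before deploying."""
    if not os.path.isfile(file_path):
        raise FileNotFoundError(f"Model file missing: {file_path}")
    size = os.path.getsize(file_path)
    if size < min_size_bytes:
        raise ValueError(f"Model file suspiciously small ({size} bytes): {file_path}")
    print(f"OK: {file_path} ({size} bytes)")


# Example with a throwaway file:
with open("model.bin", "wb") as f:
    f.write(b"\x00" * 1024)
sanity_check("model.bin")
```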
Handle Deployment Failures
Build robust error handling:
Log detailed error information
Set up failure notifications
Test failure scenarios
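As an example, the upload loop from the script above can be wrapped so a failed POST is logged with context and still fails the execution, which in turn can drive a failure notification (a sketch; post_endpoint is the same placeholder URL):

```python
import logging

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("model-upload")

post_endpoint = "https://example.com/model_repository/"  # placeholder URL


def upload_with_logging(file_path: str) -> None:
    """Upload one model file, logging details on failure before re-raising
    so the execution is marked as failed."""
    try:
        with open(file_path, "rb") as model_file:
            response = requests.post(
                post_endpoint,
                files={"model_file": model_file},
                timeout=60,
            )
            response.raise_for_status()
        log.info("Uploaded %s (HTTP %s)", file_path, response.status_code)
    except requests.RequestException as err:
        log.error("Upload of %s failed: %s", file_path, err)
        raise
```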
Version Your Deployment Code
Your deployment pipeline is critical infrastructure:
Keep deployment scripts in version control
Test changes before deploying to production
Document deployment procedures
Use staging environments
Next Steps
Process datasets automatically: Dataset Version Trigger Guide
Schedule recurring jobs: Scheduled Triggers
Integrate external tools: Webhook Triggers