Logging Metrics
Why metrics matter
Valohai does not require a special logging library: any one-line JSON object your execution prints to standard output is registered as metadata for that execution. Logging metrics this way lets you track training progress and later sort, filter, and compare executions by their results.
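As a minimal sketch of the mechanism (the metric names here are arbitrary examples, not part of the training script below):

import json

# Valohai registers each JSON object printed on its own line as metadata
print(json.dumps({"epoch": 1, "loss": 0.35}))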
Print final metrics
The script below trains a YOLOv8 model with Ultralytics, collects the final validation metrics into a dictionary, prints them as a single JSON object, and saves the exported ONNX model to the Valohai outputs directory under a datum alias.
import argparse
import json
import os
import shutil

from ultralytics import YOLO
def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument("--epochs", type=int, default=3)
    # type=bool would treat any non-empty string (even "False") as True,
    # so expose --verbose as a flag instead
    parser.add_argument("--verbose", action="store_true")
    return parser.parse_args()
args = parse_args()
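# Valohai passes parameter values to the script as plain command-line
# arguments, e.g. python train.py --epochs 5 --verbose
# (an illustrative invocation, not part of the original example)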
# Load a model
model = YOLO("yolov8n.pt") # Load a pretrained model (recommended for training)
# Use the model
model.train(data="coco128.yaml", epochs=args.epochs, verbose=args.verbose) # Train the model
path = model.export(format="onnx") # Export the model to ONNX format
metadata = {}
# Loop through the final metrics
for metric_key, metric_value in model.metrics.results_dict.items():
    # Some metrics have a 'metrics/' prefix (e.g. metrics/precision);
    # split it off to isolate the actual metric name
    metric_name = metric_key.split("metrics/")[-1]
    metadata[metric_name] = metric_value
# Print the JSON dictionary to register metrics and their values in Valohai
print(json.dumps(metadata))
# Copy the exported model to the Valohai outputs directory
shutil.copy(path, "/valohai/outputs/")

# Define a JSON dictionary containing a friendly name (alias);
# you can then reference this file with datum://latest-model
file_metadata = {
    "valohai.alias": "latest-model",
}

# Attach the metadata to the file by saving it as a sidecar file
# named <output filename>.metadata.json next to the output
exported_name = os.path.basename(path)  # e.g. yolov8n.onnx
with open(f"/valohai/outputs/{exported_name}.metadata.json", "w") as f:
    json.dump(file_metadata, f)

Run and view metrics
When the execution runs on Valohai, the printed JSON line appears as metrics on the execution's Metadata tab, and the exported ONNX model is stored as an output that you can reference with datum://latest-model.
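For reference, with the metrics/ prefix stripped, the printed metrics line looks roughly like this (the keys come from Ultralytics' results_dict; the values are illustrative placeholders, not real results):

{"precision(B)": 0.61, "recall(B)": 0.52, "mAP50(B)": 0.57, "mAP50-95(B)": 0.41, "fitness": 0.43}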
Track metrics during training
Printing the final metrics gives you one data point per execution. To see metric curves while training is still running, print a JSON line after every epoch instead; Valohai turns the repeated values into time series.
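One way to do this with Ultralytics is a training callback. A minimal sketch, assuming the on_fit_epoch_end callback event and the trainer.metrics dictionary exposed by recent ultralytics versions:

import json

from ultralytics import YOLO

def log_epoch_metrics(trainer):
    # trainer.metrics holds the validation metrics of the epoch that just finished
    payload = {"epoch": trainer.epoch}
    for key, value in trainer.metrics.items():
        payload[key.split("metrics/")[-1]] = value
    # One JSON line per epoch; Valohai picks each one up as a metadata point
    print(json.dumps(payload))

model = YOLO("yolov8n.pt")
# Register the callback before starting the training run
model.add_callback("on_fit_epoch_end", log_epoch_metrics)
model.train(data="coco128.yaml", epochs=3)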
Compare executions
Once several executions have logged metrics this way, you can select them in the Valohai web UI and compare the logged values side by side to pick the best run.