In this tutorial, we will use Jupyhai to run a Valohai execution from your local Jupyter notebook.
A short recap on Notebooks
Jupyhai is a Jupyter notebook extension developed and maintained by Valohai.
Use the Run remote button, instead of the Run cell button to run your notebook on Valohai.
The Jupyhai addon will generate a valohai.yaml config file for each execution based on the contents of the Notebook. You don’t need to create the YAML file yourself.
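As a rough illustration, the generated config follows the standard valohai.yaml step structure. The step name, image, and command below are hypothetical placeholders; Jupyhai fills in the real values based on your notebook and settings:

```yaml
# Hypothetical sketch of a generated step; Jupyhai produces the actual values.
- step:
    name: jupyter_execution             # assumed step name
    image: tensorflow/tensorflow:2.6.0  # the Docker image chosen in Settings
    command: ...                        # generated command that executes the notebook
```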
Installing Jupyhai on your machine
pip install --upgrade pip
pip install notebook
pip install jupyhai
jupyhai install
Create a new folder valohai-notebook on your desktop and launch Jupyter in that directory.
mkdir valohai-notebook
cd valohai-notebook
jupyter notebook
Create a new Python 3 notebook
Open the new Notebook
Click on Settings on the toolbar
Login with your Valohai credentials
Update the Docker image
Close the settings tab
Add print('Hello Valohai Notebooks!') to the first cell
Click Run remote to run the notebook remotely
View the logs from the execution by clicking on the gizmo on the right side of the page.
Click on Settings on the toolbar to:
Connect your Notebook to a Valohai project
Choose the environment you want to run on
Define the Docker image you’d like to use
Open a Notebook from a previous execution
Each of the colored gizmos on the right side of the page signifies a single Valohai execution. You can click on any of the completed executions and select Notebook to inspect the Notebook version that was used to run the execution.
Download the sample MNIST dataset and place it in the same folder as the notebook.
Replace print('Hello Valohai Notebooks!') with the sample code below.
This is a simple MNIST sample that loads up the data from a local file, trains a model and saves the trained model on the local machine.
import tensorflow as tf
import numpy

myinput = 'mnist.npz'

with numpy.load(myinput, allow_pickle=True) as f:
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']

x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10)
])

predictions = model(x_train[:1]).numpy()
predictions
tf.nn.softmax(predictions).numpy()

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
loss_fn(y_train[:1], predictions).numpy()

model.compile(optimizer='adam', loss=loss_fn, metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
model.save('model.h5')
Parameterizing a notebook happens through cell tags. Tags are a standard Jupyter feature.
Create a new cell and define a new variable:

epoch_val = 6

Show cell tags by going to View->Cell Toolbar->Tags.
Add a new tag parameters to the first cell.
Update model.fit to set the epochs value from the variable.
model.compile(optimizer='adam', loss=loss_fn, metrics=['accuracy'])
model.fit(x_train, y_train, epochs=epoch_val)
model.save('model.h5')
You can easily download data to your notebook either from a public location (HTTP/HTTPS) or from private cloud storage.
Create a new cell at the top of your notebook.
Add the inputs tag to the new cell.
In the new cell, define a variable mydata and paste in the address of your dataset:

mydata = 's3://onboard-sample/tf-sample/mnist.npz'
Valohai will download the input data to /valohai/inputs/mydata/. Update myinput in your sample code to point to the downloaded file.
import tensorflow as tf
import numpy

myinput = '/valohai/inputs/mydata/mnist.npz'

with numpy.load(myinput, allow_pickle=True) as f:
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']
Save a trained model
Finally, you need to save the trained model as a Valohai output.
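A minimal sketch of saving to the Valohai outputs location. This assumes the standard /valohai/outputs directory and the VH_OUTPUTS_DIR environment variable that Valohai sets inside executions; verify both against your environment:

```python
import os

# Valohai uploads anything written under its outputs directory as an
# execution output; VH_OUTPUTS_DIR points there inside an execution.
# Falling back to '/valohai/outputs' is an assumption for this sketch.
output_dir = os.getenv('VH_OUTPUTS_DIR', '/valohai/outputs')
model_path = os.path.join(output_dir, 'model.h5')

# In the training code above, replace model.save('model.h5') with:
# model.save(model_path)
print(model_path)
```

With this change, the trained model.h5 appears as an output of the execution instead of staying on the worker's local disk.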
View in Valohai
Open your project on app.valohai.com
Open the latest execution
The details tab contains information about the execution
The logs tab contains all logs from the execution
You can click on the blue Notebooks button to view an executed Notebook
Using Tasks for hyperparameter tuning