Docker Images

Valohai uses Docker images to define your runtime environment. This means the platform can run any code, from C to Python, as long as it runs inside a Docker container.

You can use any Docker image available online. Once your initial versions are working, it makes sense to package your dependencies by building your own image.
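
For example, a custom image that layers your Python dependencies on top of a pinned base image could look roughly like this sketch (the requirements.txt file and the chosen base tag are placeholders for your own setup):

FROM python:3.8.0                                     # pin the base image for reproducibility
COPY requirements.txt .                               # your pinned dependency list
RUN pip install --no-cache-dir -r requirements.txt    # bake the dependencies into the image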

We recommend hosting your images on Docker Hub as it's the most straightforward option, but you can use any Docker registry. You can configure authenticated access under your organization settings.
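Building and pushing such an image to Docker Hub could then look like this (myorg/my-image is a placeholder repository name):

docker login                            # authenticate against the registry
docker build -t myorg/my-image:0.1 .    # build from your Dockerfile
docker push myorg/my-image:0.1          # publish the image so the platform can pull it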

Here are the most common Docker images currently used on the platform:

ufoym/deepo:all-py36                     # good for initial prototyping
ufoym/deepo:all-py36-cpu                 # good for initial prototyping
tensorflow/tensorflow:<VERSION>-gpu-py3  # e.g. 1.15.0, for GPU support
tensorflow/tensorflow:<VERSION>-py3      # e.g. 1.15.0, for CPU only
pytorch/pytorch:<VERSION>                # e.g. 1.3-cuda10.1-cudnn7-runtime
continuumio/miniconda:<VERSION>          # e.g. 4.7.10
python:<VERSION>                         # e.g. 3.8.0
r-base:<VERSION>                         # e.g. 3.6.1
julia:<VERSION>                          # e.g. 1.3.0
bash:<VERSION>                           # e.g. 5.0.11, for light scripting
busybox:<VERSION>                        # e.g. 1.13.1, for light scripting

Which image to use depends on your specific use case, but it usually makes sense to:

  • start with as minimal an image as possible

  • use a specific image tag (the :<VERSION> part) so everything stays reproducible, as shown in the sketch below
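
For example, an image pinned to a specific tag is then referenced per step in your valohai.yaml. A minimal sketch (the step name and command are placeholders):

- step:
    name: train-model
    image: tensorflow/tensorflow:1.15.0-gpu-py3
    command: python train.py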
