This page describes how to create a derivative container based on one of the standard available Deep Learning Containers images.
To complete the steps in this guide, you can use Cloud Shell or any environment where the Google Cloud CLI and Docker are installed.
Before you begin
Before you begin, make sure you have completed the following steps.
Complete the setup steps in the Before you begin section of Getting started with a local deep learning container.
Make sure that billing is enabled for your Google Cloud project.
Enable the Container Registry API.
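For example, you can enable the API with the gcloud CLI (it can also be enabled from the Google Cloud console):
gcloud services enable containerregistry.googleapis.com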
The Process
To create a derivative container, you follow a process similar to this:
Create the initial Dockerfile and run modification commands.
To start, you create a Deep Learning Containers container using one of the available image types. Then use conda, pip, or Jupyter commands to modify the container image for your needs.
Build and push the container image.
Build the container image, and then push it to a registry that your Compute Engine service account can access.
Create the initial Dockerfile and run modification commands
Use the following commands to select a Deep Learning Containers image type and make a small modification to the container image. This example shows how to start with the latest TensorFlow image and modify it with a custom TensorFlow wheel. The example assumes there's a file named tensorflow.whl in the same working directory as your Dockerfile. Write the following commands to the Dockerfile:
FROM gcr.io/deeplearning-platform-release/tf-gpu:latest
# Copy from local file system to container
COPY tensorflow.whl /tensorflow.whl
# Replace the preinstalled TensorFlow with the custom wheel
RUN pip uninstall -y tensorflow && \
    pip install /tensorflow.whl
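The modification doesn't have to be a custom wheel. As a minimal sketch, a derivative Dockerfile can instead install additional packages with pip or conda; the package names here are only illustrative:
FROM gcr.io/deeplearning-platform-release/tf-gpu:latest
# Install additional Python packages (illustrative package names)
RUN pip install xgboost lightgbm
# conda is typically available in these images as an alternative:
# RUN conda install -y -c conda-forge opencv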
Build and push the container image
Use the following commands to build the container image and push it to Container Registry, where your Compute Engine service account can access it.
export PROJECT=$(gcloud config list project --format "value(core.project)")
docker build . -f Dockerfile -t "gcr.io/${PROJECT}/tf-custom:latest"
docker push "gcr.io/${PROJECT}/tf-custom:latest"
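To verify the derivative image before you use it with Compute Engine, you can optionally run it locally. This is a minimal sketch that assumes the image serves JupyterLab on port 8080, as the standard Deep Learning Containers images do:
docker run -d -p 8080:8080 "gcr.io/${PROJECT}/tf-custom:latest"
If the container starts successfully, you should be able to reach JupyterLab at http://localhost:8080.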