Add a custom container to a managed notebooks instance


This page shows you how to add a custom container to a Vertex AI Workbench managed notebooks instance as a kernel that you can run your notebook files on.

Overview

You can add custom containers for use with your managed notebooks instance. These custom containers are then available as local kernels that you can run your notebook files on.

Custom container requirements

Vertex AI Workbench managed notebooks supports any of the current Deep Learning Containers container images.

To create a custom container image of your own, you can modify one of the Deep Learning Containers container images to create a derivative container image.
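A derivative image can be as small as a Dockerfile that starts from a Deep Learning Containers image and layers on extra packages. The following is an illustrative sketch; the base image tag and the added package are examples, not requirements.

```shell
# Write a Dockerfile for a derivative image (package choice is illustrative).
cat > Dockerfile <<'EOF'
# Start from a Deep Learning Containers image that already bundles
# JupyterLab and a valid kernelspec.
FROM gcr.io/deeplearning-platform-release/tf-gpu.2-8
# Layer on your own dependencies; the inherited kernelspec is preserved.
RUN pip install --no-cache-dir scikit-learn
EOF

# Build the image locally (requires Docker):
# docker build -t my-custom-kernel .
```

Because the base image already satisfies the kernelspec requirement described below, a derivative image usually needs no further Jupyter-related setup.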

To create a custom container image from scratch, make sure the container image meets the following requirements:

  • Use a Docker container image with at least one valid Jupyter kernelspec. This exposed kernelspec lets Vertex AI Workbench managed notebooks load the container image as a kernel. If your container image includes an installation of JupyterLab or Jupyter Notebook, the installation will include the kernelspec by default. If your container image doesn't have the kernelspec, you can install the kernelspec directly.

  • The Docker container image must support running the sleep infinity command, which keeps the container running so that it can serve as a kernel.

  • To use your custom container with the managed notebooks executor, ensure that your custom container has the nbexecutor extension.
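As a minimal sketch of the kernelspec requirement: a kernelspec is a directory containing a kernel.json file that Jupyter, and therefore the instance, can discover. The example below writes one by hand for illustration; in a real image you would typically run python -m ipykernel install instead, and the kernel name and display name here are hypothetical.

```shell
# A kernelspec is a directory containing a kernel.json file. Jupyter
# searches locations such as /usr/local/share/jupyter/kernels/.
mkdir -p kernels/my-kernel
cat > kernels/my-kernel/kernel.json <<'EOF'
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "My custom kernel",
  "language": "python"
}
EOF

# In a container image, ipykernel can generate this for you:
# python -m ipykernel install --name=my-kernel --display-name="My custom kernel"

# To check the sleep infinity requirement, the container should stay up:
# docker run -d --name smoke my-custom-kernel sleep infinity
# docker ps --filter name=smoke   # the container should still be running
```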

How custom containers become kernels in managed notebooks

For each custom container image provided, your managed notebooks instance identifies the available Jupyter kernelspecs on the container image when the instance starts. These kernelspecs appear as local kernels in the JupyterLab interface. When one of the kernelspecs is selected, the managed notebooks kernel manager runs the custom container as a kernel and starts a Jupyter session on that kernel.

Custom container image availability

Deep Learning Containers container images are available to all users.

If you want to use your own custom container image, it must be located in either Container Registry or Artifact Registry, and the container image must be publicly available.
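As a hedged sketch, pushing your image to Artifact Registry and making it publicly readable might look like the following; the project, repository, and image names are placeholders, and the cloud commands (shown as comments) require Docker and authenticated gcloud credentials.

```shell
# Placeholder names; substitute your own values.
PROJECT_ID="my-project"
REPO="my-repo"
IMAGE="us-docker.pkg.dev/${PROJECT_ID}/${REPO}/my-kernel:latest"

# Tag and push the image:
# docker tag my-custom-kernel "${IMAGE}"
# docker push "${IMAGE}"

# Grant public read access on the repository so the instance can pull it:
# gcloud artifacts repositories add-iam-policy-binding "${REPO}" \
#     --location=us --member=allUsers --role=roles/artifactregistry.reader

echo "${IMAGE}"
```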

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to check if billing is enabled on a project.

  4. Enable the Notebooks, Container Registry, and Artifact Registry APIs.

    Enable the APIs


Add a custom container while creating an instance

To add a custom container to a managed notebooks instance, the custom container image must be specified at instance creation.

To add a custom container while you create a managed notebooks instance, complete the following steps.

  1. In the Google Cloud console, go to the Managed notebooks page.

    Go to Managed notebooks

  2. Click New notebook.

  3. In the Notebook name field, enter a name for your instance.

  4. Click the Region list, and select a region for your instance.

  5. Click Advanced settings.

  6. In the Environment section, in Custom Docker images, select the Provide custom Docker images checkbox.

  7. Add a Docker container image in one of the following ways:

    1. Enter a Docker container image path. For example, to use the TensorFlow Enterprise 2.8 container image from Deep Learning Containers, enter gcr.io/deeplearning-platform-release/tf-gpu.2-8.
    2. Click Select to add a Docker container image from Container Registry or Artifact Registry. Then click either the Container Registry tab or Artifact Registry tab where your container image is stored, change the project to the project that includes your container image, and select your container image.
  8. Complete the rest of the Create a managed notebook dialog according to your needs.

  9. Click Create.

  10. Vertex AI Workbench automatically starts the instance. When the instance is ready to use, Vertex AI Workbench activates an Open JupyterLab link.
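For repeatable setups, the console steps above can also be approximated with the Notebooks API runtimes.create method, supplying the container image in the request body. The sketch below reflects my understanding of that request shape (the containerImages field of the runtime's virtualMachineConfig); the field values are illustrative, so verify names against the API reference before relying on them.

```shell
PROJECT_ID="my-project"   # placeholder
REGION="us-central1"      # placeholder

# Request body for runtimes.create (field names per the Notebooks API v1).
cat > runtime.json <<'EOF'
{
  "virtualMachine": {
    "virtualMachineConfig": {
      "machineType": "n1-standard-4",
      "dataDisk": {"initializeParams": {"diskSizeGb": "100"}},
      "containerImages": [
        {
          "repository": "gcr.io/deeplearning-platform-release/tf-gpu.2-8",
          "tag": "latest"
        }
      ]
    }
  }
}
EOF

# Submit the request (requires authenticated gcloud credentials):
# curl -X POST \
#     -H "Authorization: Bearer $(gcloud auth print-access-token)" \
#     -H "Content-Type: application/json" -d @runtime.json \
#     "https://notebooks.googleapis.com/v1/projects/${PROJECT_ID}/locations/${REGION}/runtimes?runtime_id=my-runtime"
```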

Set up a notebook file to run in your custom container

To open JupyterLab, create a new notebook file, and set it up to run on your custom container's kernel, complete the following steps.

  1. Next to your managed notebooks instance's name, click Open JupyterLab.

  2. In the Authenticate your managed notebook dialog, click the button to get an authentication code.

  3. Choose an account and click Allow. Copy the authentication code.

  4. In the Authenticate your managed notebook dialog, paste the authentication code, and then click Authenticate.

    Your managed notebooks instance opens JupyterLab.

  5. Select File > New > Notebook.

  6. In the Select kernel dialog, select the kernel for the custom container image that you want to use, and then click Select. Larger container images may take some time to appear as kernels. If the kernel that you want isn't there yet, try again in a few minutes. You can change the kernel whenever you want to run your notebook file on a different kernel.

    Your new notebook file opens.

What's next