Use Deep Learning VM Images and Deep Learning Containers with Vertex AI

This page describes the main features of Deep Learning VM and Deep Learning Containers, and helps you understand how you might use these products with Vertex AI.

Deep Learning VM

Overview

Deep Learning VM Images is a set of virtual machine images optimized for data science and machine learning tasks. All images come with key ML frameworks and tools pre-installed. You can use them out of the box on instances with GPUs to accelerate your data processing tasks.

Deep Learning VM images are available to support many combinations of framework and processor. There are currently images supporting TensorFlow Enterprise, TensorFlow, PyTorch, and generic high-performance computing, with versions for both CPU-only and GPU-enabled workflows.

To see a list of frameworks available, see Choosing an image.

To learn more, see the Deep Learning VM documentation.

Using Deep Learning VM

You can use a Deep Learning VM instance as part of your work in Vertex AI. For example, you can develop an application on a Deep Learning VM instance to take advantage of its optimized data-processing capabilities, or use a Deep Learning VM instance as the development environment for a self-managed distributed training system.

You can create Deep Learning VM instances on the Deep Learning VM Cloud Marketplace page in the Google Cloud Console.

Go to the Deep Learning VM Cloud Marketplace page
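If you prefer the command line, you can also create an instance directly from one of the public Deep Learning VM image families. A minimal sketch is shown below; the instance name, zone, machine type, accelerator, and image family are placeholder assumptions, so check the current list of image families before using them:

```shell
# Sketch: create a GPU-enabled Deep Learning VM instance with gcloud.
# The instance name, zone, and image family below are assumptions;
# list the currently available image families with:
#   gcloud compute images list --project deeplearning-platform-release
gcloud compute instances create my-dlvm-instance \
    --zone=us-west1-b \
    --image-family=tf-ent-2-11-cu113 \
    --image-project=deeplearning-platform-release \
    --machine-type=n1-standard-8 \
    --accelerator="type=nvidia-tesla-t4,count=1" \
    --maintenance-policy=TERMINATE \
    --metadata="install-nvidia-driver=True"
```

The `install-nvidia-driver` metadata key asks the image to install the GPU driver on first boot, and `--maintenance-policy=TERMINATE` is required for instances with attached GPUs.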

Deep Learning Containers

Overview

Deep Learning Containers are a set of Docker containers with key data science frameworks, libraries, and tools pre-installed. These containers provide you with performance-optimized, consistent environments that can help you prototype and implement workflows quickly.

To learn more, see the Deep Learning Containers documentation.
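Because these are standard Docker containers, you can try one locally before committing to a workflow. The sketch below assumes Docker is installed and uses a hypothetical image name and local path; list the actual images in the `deeplearning-platform-release` repository to pick one:

```shell
# Sketch: pull and run a Deep Learning Container locally with Docker.
# The image name (tf-cpu) and local path below are assumptions;
# list available images with:
#   gcloud container images list --repository="gcr.io/deeplearning-platform-release"
docker run -d -p 8080:8080 \
    -v /path/to/local/dir:/home/jupyter \
    gcr.io/deeplearning-platform-release/tf-cpu
```

The container serves a JupyterLab environment on the mapped port, so after it starts you can open http://localhost:8080 in a browser.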

Using Deep Learning Containers

You can use a Deep Learning Containers instance as part of your work in Vertex AI. For example, the pre-built containers available on Vertex AI are integrated Deep Learning Containers.

You can also build your Vertex AI model as a custom container-based application, which lets you deploy it in a consistent environment and run it wherever it needs to run.

To get started building your own custom container, follow these steps:

  1. Choose one of the available container images.

  2. See the relevant Vertex AI documentation on container requirements, such as Custom containers for training and Custom container requirements for prediction.

    Consider these requirements and prepare to modify your container accordingly.

  3. Create a local Deep Learning Containers instance, modifying the container as needed to meet the Vertex AI requirements.

  4. Push the container to Artifact Registry.
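The steps above can be sketched as a derived image plus a push to Artifact Registry. Everything below (the base image, region, project, repository, and image names, and the `trainer.task` entry point) is a placeholder assumption for illustration:

```shell
# Sketch: build a custom training container on a Deep Learning Containers
# base image and push it to Artifact Registry. All names below (base image,
# region, project, repository, image, entry point) are assumptions.

# Write a minimal Dockerfile (inlined here for illustration).
cat > Dockerfile <<'EOF'
FROM gcr.io/deeplearning-platform-release/tf-cpu
WORKDIR /root
# Copy the training code into the image.
COPY trainer/ /root/trainer/
# Vertex AI custom training runs this command when the job starts.
ENTRYPOINT ["python", "-m", "trainer.task"]
EOF

# Build the image, authenticate Docker to Artifact Registry, and push.
docker build -t us-docker.pkg.dev/my-project/my-repo/my-training-image:latest .
gcloud auth configure-docker us-docker.pkg.dev
docker push us-docker.pkg.dev/my-project/my-repo/my-training-image:latest
```

Once the image is in Artifact Registry, you can reference its URI when configuring a Vertex AI custom training job or model deployment.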

What's next