Pre-built containers for prediction

Vertex AI provides Docker container images that you run as pre-built containers for serving predictions from trained model artifacts. These containers, which are organized by machine learning (ML) framework and framework version, provide HTTP prediction servers that you can use to serve predictions with minimal configuration. In many cases, using a pre-built container is simpler than creating your own custom container for prediction.

This document lists the pre-built containers for prediction and describes how to use them with model artifacts that you created either with Vertex AI custom training or outside of Vertex AI.

Available container images

Each of the following container images is available in several Artifact Registry repositories, which serve the image from different locations. You can use any of the URIs for an image when you create a Model resource; each provides the same container image. If you use the Google Cloud Console to create a Model resource, the Cloud Console selects the URI that best matches the location where you are using Vertex AI, in order to reduce latency.
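
For illustration only, the following sketch mirrors that location-matching behavior with a hypothetical helper (the function name, default image, and region-prefix logic are assumptions, not part of any Vertex AI tooling); any of the three multi-regional prefixes works, the choice only affects image-pull latency:

```python
# Hypothetical helper: pick the Artifact Registry multi-region whose images
# are served closest to the Vertex AI region you use. Any prefix is valid;
# matching the region only reduces latency when the image is pulled.
def prebuilt_image_uri(vertex_region: str, image: str = "tf2-cpu.2-5") -> str:
    if vertex_region.startswith("europe-"):
        prefix = "europe-docker.pkg.dev"
    elif vertex_region.startswith("asia-"):
        prefix = "asia-docker.pkg.dev"
    else:
        prefix = "us-docker.pkg.dev"
    return f"{prefix}/vertex-ai/prediction/{image}:latest"

print(prebuilt_image_uri("europe-west4"))
# europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-5:latest
```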

TensorFlow

ML framework version | Use with GPUs? | URIs (choose any)
2.5 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-5:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-5:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-5:latest
2.5 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-5:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-5:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-5:latest
2.4 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-4:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-4:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-4:latest
2.4 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-4:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-4:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-4:latest
2.3 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-3:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-3:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-3:latest
2.3 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-3:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-3:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-3:latest
2.2 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-2:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-2:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-2:latest
2.2 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-2:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-2:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-2:latest
2.1 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-1:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-1:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-1:latest
2.1 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-1:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-1:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf2-gpu.2-1:latest
1.15 No
  • us-docker.pkg.dev/vertex-ai/prediction/tf-cpu.1-15:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf-cpu.1-15:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf-cpu.1-15:latest
1.15 Yes
  • us-docker.pkg.dev/vertex-ai/prediction/tf-gpu.1-15:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/tf-gpu.1-15:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/tf-gpu.1-15:latest

scikit-learn

ML framework version | Use with GPUs? | URIs (choose any)
0.24 No
  • us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-24:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-24:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-24:latest
0.23 No
  • us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-23:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-23:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-23:latest
0.22 No
  • us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-22:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-22:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-22:latest
0.20 No
  • us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-20:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-20:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.0-20:latest

XGBoost

ML framework version | Use with GPUs? | URIs (choose any)
1.4 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-4:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-4:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-4:latest
1.3 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-3:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-3:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-3:latest
1.2 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-2:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-2:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-2:latest
1.1 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-1:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-1:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-1:latest
0.90 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-90:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-90:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-90:latest
0.82 No
  • us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-82:latest
  • europe-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-82:latest
  • asia-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.0-82:latest

Using a pre-built container

To use a pre-built container, specify the following API fields of a Model:

  • containerSpec.imageUri: the URI of one of the pre-built container images listed earlier in this document.
  • artifactUri: the URI of the Cloud Storage directory that contains your model artifacts.

If you are using a TrainingPipeline resource to perform custom training and create the Model from the trained artifacts, then specify this Model in TrainingPipeline.modelToUpload. Otherwise, learn how to import model artifacts as a Model resource.
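
As a minimal sketch, the Vertex AI SDK for Python exposes these fields as parameters of Model.upload. The project ID, region, bucket path, and display name below are placeholder assumptions to replace with your own values, and the example assumes the google-cloud-aiplatform package is installed and that a TensorFlow 2.5 SavedModel has been exported to the given Cloud Storage directory:

```python
from google.cloud import aiplatform

# Placeholder project and region; replace with your own.
aiplatform.init(project="your-project", location="us-central1")

model = aiplatform.Model.upload(
    display_name="my-tf2-model",
    # artifactUri: Cloud Storage directory that holds the exported model artifacts.
    artifact_uri="gs://your-bucket/model/",
    # containerSpec.imageUri: one of the pre-built prediction images listed above.
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-5:latest"
    ),
)
```

After the upload completes, deploying the Model to an endpoint serves online predictions from the pre-built container's HTTP prediction server.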

What's next