This page documents production updates to Vertex AI. You can periodically check this page for announcements about new or updated features, bug fixes, known issues, and deprecated functionality.
To get the latest product updates delivered to you, add the URL of this page to your feed reader, or add one of the following feed URLs directly:
- For both Vertex AI and Vertex AI Workbench:
- For Vertex AI only:
- For Vertex AI Workbench only:
You can see the latest product updates for all of Google Cloud on the Google Cloud release notes page, browse and filter all release notes in the Google Cloud console, or programmatically access release notes in BigQuery.
June 30, 2022Vertex AI
Features supported by Experiments include:
- Vary and track parameters and metrics.
- Compare parameters, metrics, and artifacts between pipeline runs.
- Track steps and artifacts to capture the lineage of experiments.
- Compare Vertex AI Pipelines runs against notebook experiments.
June 28, 2022Vertex AI
Vertex AI Forecasting is available in GA. The following features are available:
June 17, 2022Vertex AI
May 27, 2022Vertex AI Workbench
The M93 release of Vertex AI Workbench managed notebooks includes the following:
- Fixed a bug that prevented kernels from shutting down properly in Vertex AI Workbench managed notebooks.
May 18, 2022Vertex AI
The ability to configure Vertex AI private endpoints is now generally available (GA). Vertex AI private endpoints provide a low-latency, secure connection to the Vertex AI online prediction service. You can configure Vertex AI private endpoints by using VPC Network Peering. For more information, see Use private endpoints for online prediction.
May 12, 2022Vertex AI Workbench
The M91 release of Vertex AI Workbench managed notebooks includes the following:
- Log streaming to the consumer project via Logs Viewer is now supported.
- Added the
- Regular package refreshes and bug fixes.
- Fixed an issue that caused Spark server networking errors when using Dataproc Serverless Spark and VPC Peering.
April 26, 2022Vertex AI
You can now train your custom models using the Cloud TPU architecture (TPU VMs).
April 21, 2022Vertex AI
You can now use a pre-built container to perform custom training with PyTorch 1.11.
April 06, 2022Vertex AI
Vertex AI Model Registry is available in Preview. Vertex AI Model Registry is a searchable repository where you can manage the lifecycle of your ML models. From the Vertex AI Model Registry, you can better organize your models, train new versions, and deploy directly to endpoints.
Vertex AI Workbench is generally available (GA). Vertex AI Workbench is a single notebook surface for all your data science needs that lets you access BigQuery data and Cloud Storage from within JupyterLab, execute notebook code in Vertex AI custom training and Spark, use custom containers, manage costs with idle timeout, and secure your instances with VPC Service Controls and customer managed encryption keys (CMEK).
Features supported include:
- Google-managed instances and the latest GPU support
- Idle shutdown for managed notebooks instances
- Custom containers
- End-user and service account authentication
- Native plug-ins for BigQuery and Cloud Storage
- In-notebook Spark connect to Dataproc clusters
- Jobs support via the managed notebooks executor on Vertex AI custom training and Spark
- One-click deploy for NGC containers
- VPC Service Controls
- Customer managed encryption keys (CMEK)
The Vertex AI Workbench managed notebooks executor is generally available (GA). Use the executor to run notebook files on a schedule or as a one-time execution. You can use parameters in your execution to make specific changes to each run. For example, you might specify a different dataset to use, change the learning rate on your model, or change the version of the model. For more information, see Run notebook files with the executor.
March 07, 2022Vertex AI
Vertex AI Feature Store online store autoscaling is available in Preview. The online store nodes automatically scale to balance performance and cost with different traffic patterns. The offline store already scales automatically.
You can now mount Network File System (NFS) shares to access remote files when you run a custom training job. For more information, see Mount an NFS share for custom training. This feature is in Preview.
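A minimal sketch of what a custom-job worker pool spec with an NFS mount might look like follows. The field names (nfsMounts, server, path, mountPoint) and the container image URI are assumptions for illustration; verify them against the current Vertex AI API reference.

```python
# Sketch: assembling a CustomJob worker pool spec that mounts an NFS share.
# Field names (nfsMounts, server, path, mountPoint) are assumptions based on
# the Vertex AI REST API; the image URI is a hypothetical placeholder.

def worker_pool_spec_with_nfs(server: str, share: str, mount_point: str) -> dict:
    """Build a worker pool spec dict that mounts an NFS share."""
    return {
        "machineSpec": {"machineType": "n1-standard-4"},
        "replicaCount": 1,
        "containerSpec": {"imageUri": "gcr.io/my-project/trainer:latest"},
        "nfsMounts": [
            {
                "server": server,          # IP or hostname of the NFS server
                "path": share,             # exported path on the server
                "mountPoint": mount_point, # where the share appears on the training VM
            }
        ],
    }

spec = worker_pool_spec_with_nfs("10.0.0.5", "/exports/data", "my_mount")
print(spec["nfsMounts"][0]["server"])
```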
Google Cloud Pipeline Components SDK v1.0 is now generally available.
February 16, 2022Vertex AI
You can now use a pre-built container to perform custom training with TensorFlow 2.8.
February 10, 2022Vertex AI
For Vertex AI featurestore resources, the online store is optional. You can set the number of online nodes to 0. For more information, see Manage featurestores.
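As a rough illustration, the request body for creating a featurestore with the online store disabled might be assembled like the following sketch. The field names (onlineServingConfig, fixedNodeCount) are assumptions based on the Vertex AI REST API shape, not confirmed by this page; check the current API reference before relying on them.

```python
# Sketch: a featurestore creation body with the online store disabled by
# setting the fixed node count to 0. Field names (onlineServingConfig,
# fixedNodeCount) are assumptions based on the Vertex AI REST API.

def featurestore_body(online_node_count: int = 0) -> dict:
    """Build a featurestore request body; 0 nodes disables the online store."""
    if online_node_count < 0:
        raise ValueError("node count must be >= 0")
    return {"onlineServingConfig": {"fixedNodeCount": online_node_count}}

print(featurestore_body(0))  # {'onlineServingConfig': {'fixedNodeCount': 0}}
```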
January 04, 2022Vertex AI
You can now use a pre-built container to perform custom training with PyTorch 1.10.
December 23, 2021Vertex AI
There are now three Vertex AI release note feeds. Add any of the following to your feed reader:
- For both Vertex AI and Vertex AI Workbench:
- For Vertex AI only:
- For Vertex AI Workbench only:
December 02, 2021Vertex AI
You can now use a pre-built container to perform custom training with TensorFlow 2.7.
December 01, 2021Vertex AI
November 19, 2021Vertex AI
The autopackaging feature of the gcloud ai custom-jobs create command is generally available (GA). Autopackaging lets you use a single command to run code on your local computer as a custom training job in Vertex AI.
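To illustrate how the autopackaging invocation fits together, the following sketch assembles the command as an argument list. The worker-pool-spec keys (executor-image-uri, local-package-path, script) and the image URI are assumptions; consult `gcloud ai custom-jobs create --help` for the authoritative flag names.

```python
# Sketch: building the autopackaging form of `gcloud ai custom-jobs create`.
# Flag and key names here are assumptions; the image URI is hypothetical.

def autopackage_command(region: str, display_name: str, image: str,
                        package_path: str, script: str) -> list[str]:
    """Build the argv for an autopackaged custom training job."""
    worker_spec = ",".join([
        "machine-type=n1-standard-4",
        "replica-count=1",
        f"executor-image-uri={image}",         # base image to package code into
        f"local-package-path={package_path}",  # local directory to upload
        f"script={script}",                    # entry point inside the package
    ])
    return [
        "gcloud", "ai", "custom-jobs", "create",
        f"--region={region}",
        f"--display-name={display_name}",
        f"--worker-pool-spec={worker_spec}",
    ]

cmd = autopackage_command("us-central1", "my-job",
                          "gcr.io/my-project/train:latest", ".", "task.py")
print(" ".join(cmd))
```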
November 09, 2021Vertex AI
November 04, 2021Vertex AI
Vertex Explainable AI Preview support available for AutoML image classification models
Vertex Explainable AI offers Preview support for the following model type:
November 02, 2021Vertex AI
You can use these interactive shells with VPC Service Controls.
October 25, 2021Vertex AI
October 11, 2021Vertex AI Workbench
The Notebooks product and all existing Notebooks instances are now part of Vertex AI Workbench as user-managed notebooks.
October 05, 2021Vertex AI
September 24, 2021Vertex AI
September 21, 2021Vertex AI
September 15, 2021Vertex AI
September 13, 2021Vertex AI
September 10, 2021Vertex AI
When you perform custom training, you can access Cloud Storage buckets by reading and writing to the local filesystem. This feature, based on Cloud Storage Fuse, is available in Preview.
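Since the Cloud Storage FUSE integration exposes buckets as local paths inside the training container, a small helper can translate gs:// URIs into their mounted equivalents. The /gcs mount prefix used below is an assumption for illustration; confirm it in the Vertex AI custom training documentation.

```python
import os

# Sketch: translating a gs:// URI into its FUSE-mounted local path inside a
# Vertex AI custom training container. The /gcs prefix is an assumption.

GCS_FUSE_ROOT = "/gcs"

def fuse_path(gcs_uri: str) -> str:
    """Translate gs://bucket/object into the mounted local filesystem path."""
    if not gcs_uri.startswith("gs://"):
        raise ValueError(f"not a Cloud Storage URI: {gcs_uri}")
    return os.path.join(GCS_FUSE_ROOT, gcs_uri[len("gs://"):])

print(fuse_path("gs://my-bucket/data/train.csv"))  # /gcs/my-bucket/data/train.csv
```

Inside the container you could then read or write that path with ordinary file I/O, for example `open(fuse_path("gs://my-bucket/out.txt"), "w")`.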
Due to a recent change, the iam.serviceAccounts.actAs permission on the specified service account for the notebook instance is required for users to continue to have access to their notebook instances. The Google internal Inverting Proxy server that provides access to notebook instances now verifies that this permission is present before allowing users access to the JupyterLab URL. The JupyterLab URL this update covers is:
This update only applies to notebook instances in Single User mode and verifies that the assigned single user is authorized to execute code inside the notebook instance. Notebook instances running in Service Account or Project Editor mode already perform this verification via the Inverting Proxy server.
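One way to check ahead of time whether a user holds the required permission is the IAM testIamPermissions method. The sketch below builds such a request; the project and service-account names are hypothetical placeholders, and the URL pattern follows the public IAM REST API.

```python
# Sketch: building a testIamPermissions request to check whether the caller
# holds iam.serviceAccounts.actAs on a notebook instance's service account.
# Names below are hypothetical; the URL pattern follows the IAM REST API.

ACT_AS = "iam.serviceAccounts.actAs"

def act_as_check_request(project: str, service_account: str) -> tuple[str, dict]:
    """Build the (url, body) pair for an IAM testIamPermissions call."""
    url = (f"https://iam.googleapis.com/v1/projects/{project}"
           f"/serviceAccounts/{service_account}:testIamPermissions")
    return url, {"permissions": [ACT_AS]}

url, body = act_as_check_request(
    "my-project", "notebooks-sa@my-project.iam.gserviceaccount.com")
print(url)
```

If the POST response echoes iam.serviceAccounts.actAs back in its permissions list, the caller would pass the proxy's new check.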
August 30, 2021Vertex AI
August 24, 2021Vertex AI
August 02, 2021Vertex AI
Vertex Pipelines is available in the following regions:
See all the locations where Vertex Pipelines is available.
July 28, 2021Vertex AI
July 27, 2021Vertex AI
The following features are generally available (GA):
- Access Transparency for Vertex AI
- Using a custom service account for custom training and prediction
- Using VPC Service Controls with Vertex AI
- Setting up VPC Network Peering with Vertex AI and using private IP for custom training (Using private IP for prediction and vector matching with Matching Engine remains in preview.)
July 26, 2021Vertex AI Workbench
When you use proxy single-user mode, the Notebooks API now verifies that the specified user (proxy-user-mail) has the required permissions on the service account. This check is performed during instance creation and registration.
July 20, 2021Vertex AI
Private endpoints for online prediction are now available in preview. After you set up VPC Network Peering with Vertex AI, you can create private endpoints for low-latency online prediction within your private network.
Additionally, the documentation for VPC Network Peering with custom training has moved. The general instructions for setting up VPC Network Peering with Vertex AI are available at the original link, https://cloud.google.com/vertex-ai/docs/general/vpc-peering. The documentation for custom training is now available here: Using private IP with custom training.
July 19, 2021Vertex AI
You can now use an interactive shell to inspect your custom training container while it runs. The interactive shell can be helpful for monitoring and debugging training.
This feature is available in preview.
July 14, 2021Vertex AI
You can now use the gcloud beta ai custom-jobs create command to build a Docker image based on local training code, push the image to Container Registry, and create a custom job.
July 08, 2021Vertex AI
You can now containerize and run your training code locally by using the new gcloud beta ai custom-jobs local-run command. This feature is available in preview.
June 25, 2021Vertex AI
June 18, 2021Vertex AI Workbench
Support for Compute Engine reservations: the Notebooks API now allows the use of reservations during instance creation.
June 11, 2021Vertex AI
You can now use a pre-built container to serve predictions from TensorFlow 2.5 models.
You can now use a pre-built container to serve predictions from XGBoost 1.4 models.
May 18, 2021Vertex AI
AI Platform (Unified) is now Vertex AI.
Vertex AI has added support for custom model training, custom model batch prediction, custom model online prediction, and a limited number of other services in the following regions:
May 03, 2021Vertex AI
You can now use a pre-built container to serve predictions from TensorFlow 2.4 models.
You can now use a pre-built container to serve predictions from scikit-learn 0.24 models.
You can now use a pre-built container to serve predictions from XGBoost 1.3 models.
April 27, 2021Vertex AI
AI Platform Vizier is now available in preview. Vizier is a feature of AI Platform (Unified) that you can use to perform black-box optimization. You can use Vizier to tune hyperparameters or optimize any evaluable system.
April 15, 2021Vertex AI
The Python client library for AI Platform (Unified) is now called the AI Platform (Unified) SDK. With the release of version 0.7 (Preview), the AI Platform (Unified) SDK provides two levels of support. The high-level aiplatform library is designed to simplify common data science workflows by using wrapper classes and opinionated defaults. The aiplatform.gapic library remains available for those times when you need more flexibility or control.
March 31, 2021Vertex AI
AI Platform (Unified) is now generally available (GA).
AI Platform (Unified) has added support for the following regions for custom model training, as well as batch and online prediction for custom-trained models:
- us-west1 (Oregon)
- us-east1 (South Carolina)
- us-east4 (N. Virginia)
- northamerica-northeast1 (Montreal)
- europe-west2 (London)
- europe-west1 (Belgium)
- asia-southeast1 (Singapore)
- asia-northeast1 (Tokyo)
- australia-southeast1 (Sydney)
- asia-northeast3 (Seoul)
March 26, 2021Vertex AI Workbench
Cross-project service accounts are now supported for user-managed notebooks.
March 15, 2021Vertex AI
You can now use a pre-built container to perform custom training with PyTorch 1.7.
March 04, 2021Vertex AI Workbench
New Notebooks instances add labels for the VM image (goog-caip-notebook) and the volume.
March 02, 2021Vertex AI
CMEK compliance using the client libraries
You can now use the client libraries to create resources with a customer-managed encryption key (CMEK).
For more information on creating a resource with an encryption key using the client libraries, see Using customer-managed encryption keys (CMEK).
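As a sketch of what a CMEK-enabled creation request carries, the snippet below builds an encryption spec pointing at a Cloud KMS key. The kmsKeyName resource-name format follows Cloud KMS conventions; the project, key ring, and key names are hypothetical.

```python
# Sketch: the encryption spec a Vertex AI resource might accept when created
# with a customer-managed key. The field name (kmsKeyName) is an assumption
# based on the REST API; all resource names below are placeholders.

def encryption_spec(project: str, location: str, key_ring: str, key: str) -> dict:
    """Build an encryptionSpec dict pointing at a Cloud KMS key."""
    kms_key_name = (f"projects/{project}/locations/{location}"
                    f"/keyRings/{key_ring}/cryptoKeys/{key}")
    return {"kmsKeyName": kms_key_name}

spec = encryption_spec("my-project", "us-central1", "my-ring", "my-key")
print(spec["kmsKeyName"])
```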
March 01, 2021Vertex AI
The client library for Java now includes enhancements to improve usage of training and prediction features. The client library includes additional types and utility functions for sending training requests, sending prediction requests, and reading prediction results.
To use these enhancements, you must install the latest version of the client library.
February 25, 2021Vertex AI
AI Platform (Unified) now supports Access Transparency in beta. Google Cloud organizations with certain support packages can use this feature. Learn more about using Access Transparency with AI Platform (Unified).
The client libraries for Node.js and Python now include enhancements to improve usage of training and prediction features. These client libraries include additional types and utility functions for sending training requests, sending prediction requests, and reading prediction results.
To use these enhancements, you must install the latest version of the client libraries.
explain method calls no longer require the use of a different service endpoint (for example, https://us-central1-prediction-aiplatform.googleapis.com). These methods are now available on the same endpoint as all other methods.
You can now use a pre-built container to perform custom training with TensorFlow 2.4.
You can now use a pre-built container to serve predictions from TensorFlow 2.3 models.
You can now use a pre-built container to serve predictions from XGBoost 1.2 models.
February 01, 2021Vertex AI
You can now use a pre-built container to perform custom training with PyTorch 1.6.
The Notebooks Terraform module now supports Notebooks API v1.
January 23, 2021Vertex AI Workbench
VPC-SC for Notebooks (now known as user-managed notebooks) is now Generally Available.
Notebooks API supports Shielded VM configuration.
January 19, 2021Vertex AI
Preview: Select AI Platform (Unified) resources can now be configured to use Customer-managed encryption keys (CMEK).
Currently you can only create resources with a CMEK key in the UI; this functionality is not currently available using the client libraries.
January 11, 2021Vertex AI
The default boot disk type for virtual machine instances used for custom training has changed from pd-standard to pd-ssd. Learn more about disk types for custom training and read about pricing for different disk types.
If you previously used the default disk type for custom training and want to continue training with the same disk type, make sure to explicitly specify the pd-standard boot disk type when you perform custom training.
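Pinning the disk type explicitly might look like the following sketch of a worker pool disk spec. The field names (diskSpec, bootDiskType, bootDiskSizeGb) are assumptions based on the Vertex AI REST API; check the current reference before use.

```python
# Sketch: explicitly pinning the boot disk type for custom training so a job
# keeps using pd-standard after the default changed to pd-ssd. Field names
# (bootDiskType, bootDiskSizeGb) are assumptions based on the REST API.

def disk_spec(disk_type: str = "pd-standard", size_gb: int = 100) -> dict:
    """Build a diskSpec dict; pass pd-standard to keep the old default."""
    if disk_type not in ("pd-standard", "pd-ssd"):
        raise ValueError(f"unsupported disk type: {disk_type}")
    return {"bootDiskType": disk_type, "bootDiskSizeGb": size_gb}

print(disk_spec())  # {'bootDiskType': 'pd-standard', 'bootDiskSizeGb': 100}
```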
January 06, 2021Vertex AI
You can now use a pre-built container to perform custom training with TensorFlow 2.3.
December 17, 2020Vertex AI
AI Platform (Unified) now stores and processes your data only in the region you specify for most features. Learn more.
November 16, 2020Vertex AI
AI Platform (Unified) is now available in Preview.
For more information, see the product documentation.
September 21, 2020Vertex AI Workbench
AI Platform Notebooks (now known as user-managed notebooks) API is now Generally Available. The API now includes an isUpgradable endpoint and adds manual and auto-upgrade functionality to notebook instances created using the API.
Granular IAM permissions for AI Platform Notebooks (now known as user-managed notebooks) is now Generally Available.
AI Platform Notebooks now supports E2 machine types.
The following new regions have been added:
March 31, 2020Vertex AI Workbench
AI Platform Notebooks (now known as user-managed notebooks) is now Generally Available. Some integrations with and specific features of AI Platform Notebooks are still in beta, such as Virtual Private Cloud Service Controls, Identity and Access Management (IAM) roles, and AI Platform Notebooks API.
February 04, 2020Vertex AI Workbench
VPC Service Controls now supports AI Platform Notebooks. Learn how to use a notebook instance within a service perimeter. This functionality is in beta.
February 03, 2020Vertex AI Workbench
AI Platform Notebooks now supports Access Transparency. Access Transparency provides you with logs of actions that Google staff have taken when accessing your data. To learn more about Access Transparency, see the Overview of Access Transparency.
September 12, 2019Vertex AI Workbench
You can now use customer-managed encryption keys (CMEK) to protect data on the boot disks of your AI Platform Notebooks (now known as user-managed notebooks) VM instances. CMEK in AI Platform Notebooks is generally available. For more information, see Using customer-managed encryption keys (CMEK).
September 09, 2019Vertex AI Workbench
AI Platform Notebooks now provides more ways for you to customize your network settings, encrypt your notebook content, and grant access to your notebook instance. These options are available when you create a notebook.
Now you can implement AI Platform Notebooks using custom containers. Use a Deep Learning Containers image or create a derivative container of your own, then create a new notebook instance using your custom container.
July 12, 2019Vertex AI Workbench
R upgraded to version 3.6.
R Notebooks are no longer dependent on a Conda environment.
June 03, 2019Vertex AI Workbench
You can now create AI Platform Notebooks instances with R and core R packages installed. Learn how to install R dependencies, and read guides for using R with BigQuery in AI Platform Notebooks and using R and Python in the same notebook.
March 01, 2019Vertex AI Workbench
AI Platform Notebooks is now available in beta. AI Platform Notebooks enables you to create and manage virtual machine (VM) instances that are pre-packaged with JupyterLab and a suite of deep learning software.