This page describes the interfaces that you can use to interact with Vertex AI and when you should use them. You can use these interfaces along with one of Vertex AI's notebook solutions.
Some Vertex AI operations are only available through specific interfaces, so you may need to switch between interfaces during your workflow. For example, in Vertex AI Experiments, you must use the API to log data to an experiment run, but you can view the results in the console.
The Google Cloud console is a graphical user interface that you can use to work with your machine learning resources.
In the Google Cloud console, you can manage your Vertex AI datasets, models, endpoints, and jobs. You can also access other Google Cloud services, such as Cloud Storage and BigQuery, through the console.
Use the Google Cloud console if you prefer to view and manage your Vertex AI resources and visualizations through a graphical user interface.
For more information, see the Dashboard page of the Vertex AI section in the Google Cloud console.
The Google Cloud command-line interface (CLI) is a set of tools for creating and managing Google Cloud resources from the command line.
Use the Google Cloud CLI when you want to manage your Vertex AI resources from the command line or through scripts and other automation.
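For example, the `gcloud ai` command group covers common Vertex AI resources. A minimal sketch, assuming the gcloud CLI is installed and a default project is configured; the region is a placeholder:

```shell
# List the models and endpoints in a region of the current project.
gcloud ai models list --region=us-central1
gcloud ai endpoints list --region=us-central1
```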
Terraform is an infrastructure-as-code (IaC) tool that you can use to provision the infrastructure, such as resources and permissions, for multiple Google Cloud services, including Vertex AI.
You can define the Vertex AI resources and permissions for your Google Cloud project in a Terraform configuration file. You can then use Terraform to apply the configuration to your project by creating new resources and updating existing resources.
Use Terraform if you want to standardize the infrastructure for Vertex AI resources in your Google Cloud project and update the existing Google Cloud project infrastructure while fulfilling resource dependencies.
To get started, see Terraform support for Vertex AI.
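As an illustration, a Terraform configuration can declare a Vertex AI resource directly. A minimal sketch, assuming the Google provider is configured and using a placeholder display name; the `google_vertex_ai_dataset` resource and schema URI shown are one possible choice:

```hcl
# Declares an image dataset in Vertex AI; `terraform apply`
# creates it and later updates it to match this definition.
resource "google_vertex_ai_dataset" "example" {
  display_name        = "my-dataset"
  metadata_schema_uri = "gs://google-cloud-aiplatform/schema/dataset/metadata/image_1.0.0.yaml"
  region              = "us-central1"
}
```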
Use the Vertex AI SDK for Python to programmatically automate your Vertex AI workflow.
The Vertex AI SDK for Python is similar to the Vertex AI Python client library, except the SDK is higher-level and less granular. For more information, see Understand the SDK and client library differences.
To get started, see Install the Vertex AI SDK.
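A typical workflow initializes the SDK once and then works with high-level resource classes. A minimal sketch, assuming `google-cloud-aiplatform` is installed, application default credentials are set up, and `PROJECT_ID` is replaced with your own project:

```python
from google.cloud import aiplatform

# Initialize the SDK with a project and region (placeholders here).
aiplatform.init(project="PROJECT_ID", location="us-central1")

# High-level resource classes wrap the underlying API calls;
# for example, list the models uploaded to the project.
for model in aiplatform.Model.list():
    print(model.display_name)
```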
Client libraries use each supported language's natural conventions to call the Vertex AI API and reduce boilerplate code that you have to write.
The following languages are supported for Vertex AI:
Python. The Vertex AI Python client library is installed when you install the Vertex AI SDK for Python.
For more information, see Install the Vertex AI client libraries.
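By contrast with the SDK, the client library exposes explicit service clients and request objects. A minimal sketch, assuming the same installed package and credentials, with `PROJECT_ID` as a placeholder:

```python
from google.cloud import aiplatform_v1

# Lower-level client: you pick the service client and the regional
# API endpoint yourself rather than calling aiplatform.init().
client = aiplatform_v1.ModelServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)

parent = "projects/PROJECT_ID/locations/us-central1"
for model in client.list_models(parent=parent):
    print(model.display_name)
```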
The Vertex AI REST API provides RESTful services for managing jobs, models, and endpoints, and for making predictions with hosted models on Google Cloud.
Use the REST API if you need to use your own libraries to call the Vertex AI API from your application.
To get started, see the Vertex AI API REST reference.
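For example, you can call the REST API directly with any HTTP client. A minimal sketch using curl, assuming the gcloud CLI is available to mint an access token and `PROJECT_ID` is a placeholder:

```shell
# List the models in a region by calling the regional REST endpoint.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://us-central1-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/models"
```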
- Set up a project and a development environment.
- Choose a training method.
- Follow tutorials for image, text, tabular, and video data types, and for custom training.
- Learn best practices for implementing custom-trained ML models on Vertex AI.