EndpointServiceClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Optional[Union[str, google.cloud.aiplatform_v1beta1.services.endpoint_service.transports.base.EndpointServiceTransport]] = None, client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
A service for managing Vertex AI's Endpoints.
Inheritance
builtins.object > EndpointServiceClient

Properties

transport
Returns the transport used by the client instance.

| Type | Description |
| --- | --- |
| EndpointServiceTransport | The transport used by the client instance. |

Methods
EndpointServiceClient
EndpointServiceClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Optional[Union[str, google.cloud.aiplatform_v1beta1.services.endpoint_service.transports.base.EndpointServiceTransport]] = None, client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
Instantiates the endpoint service client.
| Name | Type | Description |
| --- | --- | --- |
| credentials | Optional[google.auth.credentials.Credentials] | The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. |
| transport | Union[str, EndpointServiceTransport] | The transport to use. If set to None, a transport is chosen automatically. |
| client_options | google.api_core.client_options.ClientOptions | Custom options for the client. It won't take effect if a |
| client_info | google.api_core.gapic_v1.client_info.ClientInfo | The client info used to send a user-agent string along with API requests. If |

| Type | Description |
| --- | --- |
| google.auth.exceptions.MutualTLSChannelError | If mutual TLS transport creation failed for any reason. |
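As a minimal sketch (not taken from this reference), the client can be pointed at an assumed regional endpoint via client_options; the endpoint value is a placeholder, and credentials fall back to the environment:

from google.api_core.client_options import ClientOptions
from google.cloud import aiplatform_v1beta1

# Placeholder regional endpoint; substitute the region you actually use.
options = ClientOptions(api_endpoint="us-central1-aiplatform.googleapis.com")

# Credentials are resolved from the environment (Application Default
# Credentials) because none are passed explicitly.
client = aiplatform_v1beta1.EndpointServiceClient(client_options=options)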
__exit__
__exit__(type, value, traceback)
Releases underlying transport's resources.
Warning: ONLY use as a context manager if the transport is NOT shared with other clients! Exiting the with block will CLOSE the transport and may cause errors in other clients!
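A short sketch of the context-manager usage described in the warning above, assuming the transport is not shared with any other client; the parent value is a placeholder:

from google.cloud import aiplatform_v1beta1

# The underlying transport is closed automatically when the block exits.
with aiplatform_v1beta1.EndpointServiceClient() as client:
    request = aiplatform_v1beta1.ListEndpointsRequest(parent="parent_value")
    for endpoint in client.list_endpoints(request=request):
        print(endpoint.display_name)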
common_billing_account_path
common_billing_account_path(billing_account: str)
Returns a fully-qualified billing_account string.
common_folder_path
common_folder_path(folder: str)
Returns a fully-qualified folder string.
common_location_path
common_location_path(project: str, location: str)
Returns a fully-qualified location string.
common_organization_path
common_organization_path(organization: str)
Returns a fully-qualified organization string.
common_project_path
common_project_path(project: str)
Returns a fully-qualified project string.
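The common_*_path helpers above are static methods, so no client instance is required. A brief sketch with placeholder IDs; the printed values follow the usual projects/... resource-name pattern:

from google.cloud.aiplatform_v1beta1 import EndpointServiceClient

# Path helpers can be called directly on the class.
print(EndpointServiceClient.common_project_path("my-project"))
# projects/my-project
print(EndpointServiceClient.common_location_path("my-project", "us-central1"))
# projects/my-project/locations/us-central1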
create_endpoint
create_endpoint(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.CreateEndpointRequest, dict]] = None, *, parent: Optional[str] = None, endpoint: Optional[google.cloud.aiplatform_v1beta1.types.endpoint.Endpoint] = None, endpoint_id: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Creates an Endpoint.
from google.cloud import aiplatform_v1beta1


def sample_create_endpoint():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    endpoint = aiplatform_v1beta1.Endpoint()
    endpoint.display_name = "display_name_value"

    request = aiplatform_v1beta1.CreateEndpointRequest(
        parent="parent_value",
        endpoint=endpoint,
    )

    # Make the request
    operation = client.create_endpoint(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.CreateEndpointRequest, dict] | The request object. Request message for EndpointService.CreateEndpoint. |
| parent | str | Required. The resource name of the Location to create the Endpoint in. Format: |
| endpoint | google.cloud.aiplatform_v1beta1.types.Endpoint | Required. The Endpoint to create. This corresponds to the |
| endpoint_id | str | Immutable. The ID to use for the endpoint, which will become the final component of the endpoint resource name. If not provided, Vertex AI will generate a value for this ID. This value should be 1-10 characters, and valid characters are /[0-9]/. When using HTTP/JSON, this field is populated based on a query string argument, such as |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.api_core.operation.Operation | An object representing a long-running operation. The result type for the operation will be Endpoint. Models are deployed into it, and afterwards Endpoint is called to obtain predictions and explanations. |
delete_endpoint
delete_endpoint(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.DeleteEndpointRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Deletes an Endpoint.
from google.cloud import aiplatform_v1beta1


def sample_delete_endpoint():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.DeleteEndpointRequest(
        name="name_value",
    )

    # Make the request
    operation = client.delete_endpoint(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.DeleteEndpointRequest, dict] | The request object. Request message for EndpointService.DeleteEndpoint. |
| name | str | Required. The name of the Endpoint resource to be deleted. Format: |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.api_core.operation.Operation | An object representing a long-running operation. The result type for the operation will be `google.protobuf.empty_pb2.Empty`. A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); } The JSON representation for Empty is an empty JSON object {}. |
deploy_model
deploy_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.DeployModelRequest, dict]] = None, *, endpoint: Optional[str] = None, deployed_model: Optional[google.cloud.aiplatform_v1beta1.types.endpoint.DeployedModel] = None, traffic_split: Optional[Mapping[str, int]] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Deploys a Model into this Endpoint, creating a DeployedModel within it.
from google.cloud import aiplatform_v1beta1


def sample_deploy_model():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    deployed_model = aiplatform_v1beta1.DeployedModel()
    deployed_model.dedicated_resources.min_replica_count = 1803
    deployed_model.model = "model_value"

    request = aiplatform_v1beta1.DeployModelRequest(
        endpoint="endpoint_value",
        deployed_model=deployed_model,
    )

    # Make the request
    operation = client.deploy_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.DeployModelRequest, dict] | The request object. Request message for EndpointService.DeployModel. |
| endpoint | str | Required. The name of the Endpoint resource into which to deploy a Model. Format: |
| deployed_model | google.cloud.aiplatform_v1beta1.types.DeployedModel | Required. The DeployedModel to be created within the Endpoint. Note that Endpoint.traffic_split must be updated for the DeployedModel to start receiving traffic, either as part of this call, or via EndpointService.UpdateEndpoint. This corresponds to the |
| traffic_split | Mapping[str, int] | A map from a DeployedModel's ID to the percentage of this Endpoint's traffic that should be forwarded to that DeployedModel. If this field is non-empty, then the Endpoint's traffic_split will be overwritten with it. To refer to the ID of the Model being deployed by this call, "0" should be used, and the actual ID of the new DeployedModel will be filled in its place by this method. The traffic percentage values must add up to 100. If this field is empty, then the Endpoint's traffic_split is not updated. This corresponds to the |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.api_core.operation.Operation | An object representing a long-running operation. The result type for the operation will be DeployModelResponse, the response message for EndpointService.DeployModel. |
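The sample above does not set traffic_split. A hedged sketch of routing 100% of traffic to the model being deployed, using the "0" placeholder key described in the table above; the resource names and machine type are assumptions, not values from this reference:

from google.cloud import aiplatform_v1beta1

def deploy_with_full_traffic(client, endpoint_name, model_name):
    deployed_model = aiplatform_v1beta1.DeployedModel()
    deployed_model.model = model_name
    # Placeholder serving resources.
    deployed_model.dedicated_resources.machine_spec.machine_type = "n1-standard-2"
    deployed_model.dedicated_resources.min_replica_count = 1

    # "0" stands for the DeployedModel created by this call; the service fills
    # in the real ID once the deployment succeeds.
    operation = client.deploy_model(
        request=aiplatform_v1beta1.DeployModelRequest(
            endpoint=endpoint_name,
            deployed_model=deployed_model,
            traffic_split={"0": 100},
        )
    )
    return operation.result()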
endpoint_path
endpoint_path(project: str, location: str, endpoint: str)
Returns a fully-qualified endpoint string.
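A quick sketch of endpoint_path; the project, location, and endpoint IDs are placeholders:

from google.cloud.aiplatform_v1beta1 import EndpointServiceClient

name = EndpointServiceClient.endpoint_path("my-project", "us-central1", "1234567890")
# projects/my-project/locations/us-central1/endpoints/1234567890
print(name)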
from_service_account_file
from_service_account_file(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
| Name | Type | Description |
| --- | --- | --- |
| filename | str | The path to the service account private key json file. |

| Type | Description |
| --- | --- |
| EndpointServiceClient | The constructed client. |
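A minimal sketch of building the client from a key file; the path is a placeholder:

from google.cloud import aiplatform_v1beta1

# Loads the key file, constructs credentials, and returns a ready client.
client = aiplatform_v1beta1.EndpointServiceClient.from_service_account_file(
    "path/to/service-account.json"
)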
from_service_account_info
from_service_account_info(info: dict, *args, **kwargs)
Creates an instance of this client using the provided credentials info.
| Name | Type | Description |
| --- | --- | --- |
| info | dict | The service account private key info. |

| Type | Description |
| --- | --- |
| EndpointServiceClient | The constructed client. |
from_service_account_json
from_service_account_json(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
| Name | Type | Description |
| --- | --- | --- |
| filename | str | The path to the service account private key json file. |

| Type | Description |
| --- | --- |
| EndpointServiceClient | The constructed client. |
get_endpoint
get_endpoint(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.GetEndpointRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Gets an Endpoint.
from google.cloud import aiplatform_v1beta1


def sample_get_endpoint():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.GetEndpointRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_endpoint(request=request)

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.GetEndpointRequest, dict] | The request object. Request message for EndpointService.GetEndpoint. |
| name | str | Required. The name of the Endpoint resource. Format: |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.cloud.aiplatform_v1beta1.types.Endpoint | Models are deployed into it, and afterwards Endpoint is called to obtain predictions and explanations. |
get_mtls_endpoint_and_cert_source
get_mtls_endpoint_and_cert_source(
client_options: Optional[google.api_core.client_options.ClientOptions] = None,
)
Returns the API endpoint and client cert source for mutual TLS.
The client cert source is determined in the following order:
(1) If the GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is not "true", the client cert source is None.
(2) If client_options.client_cert_source is provided, use the provided one; if the default client cert source exists, use the default one; otherwise the client cert source is None.
The API endpoint is determined in the following order:
(1) If client_options.api_endpoint is provided, use the provided one.
(2) If the GOOGLE_API_USE_MTLS_ENDPOINT environment variable is "always", use the default mTLS endpoint; if the environment variable is "never", use the default API endpoint; otherwise, if the client cert source exists, use the default mTLS endpoint; otherwise use the default API endpoint.
More details can be found at https://google.aip.dev/auth/4114.
| Name | Type | Description |
| --- | --- | --- |
| client_options | google.api_core.client_options.ClientOptions | Custom options for the client. Only the |

| Type | Description |
| --- | --- |
| google.auth.exceptions.MutualTLSChannelError | If any errors happen. |

| Type | Description |
| --- | --- |
| Tuple[str, Callable[[], Tuple[bytes, bytes]]] | Returns the API endpoint and the client cert source to use. |
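A short sketch of inspecting the resolved endpoint and cert source before constructing a client; the actual values depend on the environment variables and client_options described above:

from google.api_core.client_options import ClientOptions
from google.cloud import aiplatform_v1beta1

options = ClientOptions()  # no explicit api_endpoint or client_cert_source
api_endpoint, cert_source = (
    aiplatform_v1beta1.EndpointServiceClient.get_mtls_endpoint_and_cert_source(options)
)
print(api_endpoint)  # default or mTLS endpoint, per the rules above
print(cert_source)   # None unless a client certificate is configured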
list_endpoints
list_endpoints(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.ListEndpointsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Lists Endpoints in a Location.
from google.cloud import aiplatform_v1beta1


def sample_list_endpoints():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ListEndpointsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_endpoints(request=request)

    # Handle the response
    for response in page_result:
        print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.ListEndpointsRequest, dict] | The request object. Request message for EndpointService.ListEndpoints. |
| parent | str | Required. The resource name of the Location from which to list the Endpoints. Format: |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.cloud.aiplatform_v1beta1.services.endpoint_service.pagers.ListEndpointsPager | Response message for EndpointService.ListEndpoints. Iterating over this object will yield results and resolve additional pages automatically. |
model_deployment_monitoring_job_path
model_deployment_monitoring_job_path(
project: str, location: str, model_deployment_monitoring_job: str
)
Returns a fully-qualified model_deployment_monitoring_job string.
model_path
model_path(project: str, location: str, model: str)
Returns a fully-qualified model string.
network_path
network_path(project: str, network: str)
Returns a fully-qualified network string.
parse_common_billing_account_path
parse_common_billing_account_path(path: str)
Parse a billing_account path into its component segments.
parse_common_folder_path
parse_common_folder_path(path: str)
Parse a folder path into its component segments.
parse_common_location_path
parse_common_location_path(path: str)
Parse a location path into its component segments.
parse_common_organization_path
parse_common_organization_path(path: str)
Parse an organization path into its component segments.
parse_common_project_path
parse_common_project_path(path: str)
Parse a project path into its component segments.
parse_endpoint_path
parse_endpoint_path(path: str)
Parses an endpoint path into its component segments.
parse_model_deployment_monitoring_job_path
parse_model_deployment_monitoring_job_path(path: str)
Parses a model_deployment_monitoring_job path into its component segments.
parse_model_path
parse_model_path(path: str)
Parses a model path into its component segments.
parse_network_path
parse_network_path(path: str)
Parses a network path into its component segments.
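The parse_*_path methods invert the corresponding path builders, returning the path's segments as a dict. A small sketch with a placeholder resource name:

from google.cloud.aiplatform_v1beta1 import EndpointServiceClient

segments = EndpointServiceClient.parse_endpoint_path(
    "projects/my-project/locations/us-central1/endpoints/1234567890"
)
# {'project': 'my-project', 'location': 'us-central1', 'endpoint': '1234567890'}
print(segments["endpoint"])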
undeploy_model
undeploy_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.UndeployModelRequest, dict]] = None, *, endpoint: Optional[str] = None, deployed_model_id: Optional[str] = None, traffic_split: Optional[Mapping[str, int]] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Undeploys a Model from an Endpoint, removing a DeployedModel from it, and freeing all resources it's using.
from google.cloud import aiplatform_v1beta1


def sample_undeploy_model():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.UndeployModelRequest(
        endpoint="endpoint_value",
        deployed_model_id="deployed_model_id_value",
    )

    # Make the request
    operation = client.undeploy_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.UndeployModelRequest, dict] | The request object. Request message for EndpointService.UndeployModel. |
| endpoint | str | Required. The name of the Endpoint resource from which to undeploy a Model. Format: |
| deployed_model_id | str | Required. The ID of the DeployedModel to be undeployed from the Endpoint. This corresponds to the |
| traffic_split | Mapping[str, int] | If this field is provided, then the Endpoint's traffic_split will be overwritten with it. If the last DeployedModel is being undeployed from the Endpoint, the [Endpoint.traffic_split] will always end up empty when this call returns. A DeployedModel will be successfully undeployed only if it doesn't have any traffic assigned to it when this method executes, or if this field unassigns any traffic to it. This corresponds to the |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.api_core.operation.Operation | An object representing a long-running operation. The result type for the operation will be UndeployModelResponse, the response message for EndpointService.UndeployModel. |
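When other DeployedModels remain on the Endpoint, the traffic assigned to the model being undeployed has to be reassigned, as described in the traffic_split row above. A hedged sketch with placeholder IDs that moves all traffic to a remaining DeployedModel:

from google.cloud import aiplatform_v1beta1

def undeploy_and_shift_traffic(client, endpoint_name, old_id, remaining_id):
    # The undeployed model must end up with no traffic assigned for the
    # call to succeed; route everything to the remaining DeployedModel.
    operation = client.undeploy_model(
        request=aiplatform_v1beta1.UndeployModelRequest(
            endpoint=endpoint_name,
            deployed_model_id=old_id,
            traffic_split={remaining_id: 100},
        )
    )
    return operation.result()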
update_endpoint
update_endpoint(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.endpoint_service.UpdateEndpointRequest, dict]] = None, *, endpoint: Optional[google.cloud.aiplatform_v1beta1.types.endpoint.Endpoint] = None, update_mask: Optional[google.protobuf.field_mask_pb2.FieldMask] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Updates an Endpoint.
from google.cloud import aiplatform_v1beta1


def sample_update_endpoint():
    # Create a client
    client = aiplatform_v1beta1.EndpointServiceClient()

    # Initialize request argument(s)
    endpoint = aiplatform_v1beta1.Endpoint()
    endpoint.display_name = "display_name_value"

    request = aiplatform_v1beta1.UpdateEndpointRequest(
        endpoint=endpoint,
    )

    # Make the request
    response = client.update_endpoint(request=request)

    # Handle the response
    print(response)
| Name | Type | Description |
| --- | --- | --- |
| request | Union[google.cloud.aiplatform_v1beta1.types.UpdateEndpointRequest, dict] | The request object. Request message for EndpointService.UpdateEndpoint. |
| endpoint | google.cloud.aiplatform_v1beta1.types.Endpoint | Required. The Endpoint which replaces the resource on the server. This corresponds to the |
| update_mask | google.protobuf.field_mask_pb2.FieldMask | Required. The update mask applies to the resource. See |
| retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
| timeout | float | The timeout for this request. |
| metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |

| Type | Description |
| --- | --- |
| google.cloud.aiplatform_v1beta1.types.Endpoint | Models are deployed into it, and afterwards Endpoint is called to obtain predictions and explanations. |
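The sample above omits update_mask. A hedged sketch of updating only the display name, where the field path follows google.protobuf.FieldMask semantics and the endpoint name is a placeholder:

from google.cloud import aiplatform_v1beta1
from google.protobuf import field_mask_pb2

def rename_endpoint(client, endpoint_name, new_display_name):
    endpoint = aiplatform_v1beta1.Endpoint()
    endpoint.name = endpoint_name
    endpoint.display_name = new_display_name

    # Only the fields listed in the mask are overwritten on the server.
    return client.update_endpoint(
        request=aiplatform_v1beta1.UpdateEndpointRequest(
            endpoint=endpoint,
            update_mask=field_mask_pb2.FieldMask(paths=["display_name"]),
        )
    )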