Class DeployedModel (1.6.2)

DeployedModel(mapping=None, *, ignore_unknown_fields=False, **kwargs)

A deployment of a Model. Endpoints contain one or more DeployedModels.
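
As a minimal sketch of the constructor above, fields of the message are passed as keyword arguments to the generated type (the model resource name and display name shown are placeholders, not values from this page):

from google.cloud import aiplatform_v1

# A minimal sketch: message fields are supplied as keyword arguments to the
# generated constructor. The resource name below is a hypothetical placeholder.
deployed_model = aiplatform_v1.DeployedModel(
    model="projects/my-project/locations/us-central1/models/1234567890",
    display_name="my-deployed-model",
)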

Attributes

dedicated_resources (google.cloud.aiplatform_v1.types.DedicatedResources)
    A description of resources that are dedicated to the DeployedModel and that need a higher degree of manual configuration.

automatic_resources (google.cloud.aiplatform_v1.types.AutomaticResources)
    A description of resources that are, to a large degree, decided by Vertex AI and require only modest additional configuration.

id (str)
    Output only. The ID of the DeployedModel.

model (str)
    Required. The name of the Model that this is the deployment of. Note that the Model may be in a different location than the DeployedModel's Endpoint.

display_name (str)
    The display name of the DeployedModel. If not provided upon creation, the Model's display_name is used.

create_time (google.protobuf.timestamp_pb2.Timestamp)
    Output only. Timestamp when the DeployedModel was created.

explanation_spec (google.cloud.aiplatform_v1.types.ExplanationSpec)
    Explanation configuration for this DeployedModel. When deploying a Model using EndpointService.DeployModel, this value overrides the value of Model.explanation_spec. All fields of explanation_spec are optional in the request. If a field of explanation_spec is not populated, the value of the same field of Model.explanation_spec is inherited. If the corresponding Model.explanation_spec is not populated, all fields of the explanation_spec are used for the explanation configuration.

service_account (str)
    The service account that the DeployedModel's container runs as. Specify the email address of the service account. If this service account is not specified, the container runs as a service account that doesn't have access to the resource project. Users deploying the Model must have the ``iam.serviceAccounts.actAs`` permission on this service account.

disable_container_logging (bool)
    For custom-trained Models and AutoML Tabular Models, the container of the DeployedModel instances sends the ``stderr`` and ``stdout`` streams to Stackdriver Logging by default. Note that these logs incur a cost, which is subject to Cloud Logging pricing. Set this field to true to disable container logging.

enable_access_logging (bool)
    If true, online prediction access logs are sent to Stackdriver Logging. These logs are like standard server access logs, containing information such as the timestamp and latency of each prediction request. Note that Stackdriver logs may incur a cost, especially if your project receives prediction requests at a high queries-per-second (QPS) rate. Estimate your costs before enabling this option.

private_endpoints (google.cloud.aiplatform_v1.types.PrivateEndpoints)
    Output only. Provides paths for users to send predict/explain/health requests directly to the deployed model services running on Cloud via private services access. This field is populated if network is configured.
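
Several of the attributes above (model, display_name, service_account, dedicated_resources, enable_access_logging) are typically set by the caller, while id, create_time, and private_endpoints are output only. The following is a hedged sketch of passing a DeployedModel to EndpointService.DeployModel; the project, endpoint ID, model ID, service account, and traffic split are illustrative assumptions, not values from this page:

from google.cloud import aiplatform_v1

# A sketch of deploying a DeployedModel to an existing Endpoint via
# EndpointService.DeployModel. All resource names below are hypothetical.
client = aiplatform_v1.EndpointServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)

deployed_model = aiplatform_v1.DeployedModel(
    model="projects/my-project/locations/us-central1/models/1234567890",
    display_name="my-deployed-model",
    service_account="model-runner@my-project.iam.gserviceaccount.com",
    enable_access_logging=True,
    dedicated_resources=aiplatform_v1.DedicatedResources(
        machine_spec=aiplatform_v1.MachineSpec(machine_type="n1-standard-4"),
        min_replica_count=1,
        max_replica_count=2,
    ),
)

operation = client.deploy_model(
    endpoint="projects/my-project/locations/us-central1/endpoints/9876543210",
    deployed_model=deployed_model,
    traffic_split={"0": 100},  # "0" routes traffic to the model being deployed
)
response = operation.result()       # blocks until the long-running operation completes
print(response.deployed_model.id)   # the server-assigned, output-only ID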

Inheritance

builtins.object > proto.message.Message > DeployedModel
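
Because DeployedModel inherits from proto.message.Message, the standard proto-plus helpers are available on the class itself. A small sketch, assuming only that a DeployedModel instance exists:

from google.cloud import aiplatform_v1

# A sketch of the proto-plus helpers inherited from proto.message.Message.
deployed_model = aiplatform_v1.DeployedModel(display_name="example")

as_dict = aiplatform_v1.DeployedModel.to_dict(deployed_model)   # plain Python dict
as_json = aiplatform_v1.DeployedModel.to_json(deployed_model)   # JSON string
restored = aiplatform_v1.DeployedModel.from_json(as_json)       # back to a message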