ModelServiceAsyncClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Union[str, google.cloud.aiplatform_v1beta1.services.model_service.transports.base.ModelServiceTransport] = 'grpc_asyncio', client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
A service for managing Vertex AI's machine learning Models.
Inheritance
builtins.object > ModelServiceAsyncClient
Properties
transport
Returns the transport used by the client instance.
Type | Description |
ModelServiceTransport | The transport used by the client instance. |
Methods
ModelServiceAsyncClient
ModelServiceAsyncClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Union[str, google.cloud.aiplatform_v1beta1.services.model_service.transports.base.ModelServiceTransport] = 'grpc_asyncio', client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
Instantiates the model service client.
Name | Description |
credentials | Optional[google.auth.credentials.Credentials]. The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. |
transport | Union[str, `.ModelServiceTransport`]. The transport to use. If set to None, a transport is chosen automatically. |
client_options | ClientOptions. Custom options for the client. It won't take effect if a |
Type | Description |
google.auth.exceptions.MutualTLSChannelError | If mutual TLS transport creation failed for any reason. |
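For illustration, a minimal sketch of constructing the async client with explicit client options; the regional endpoint shown is a placeholder, and credentials are assumed to come from the environment.

from google.api_core.client_options import ClientOptions
from google.cloud import aiplatform_v1beta1

# Placeholder regional endpoint; credentials are picked up from the environment.
client = aiplatform_v1beta1.ModelServiceAsyncClient(
    client_options=ClientOptions(api_endpoint="us-central1-aiplatform.googleapis.com"),
)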
common_billing_account_path
common_billing_account_path(billing_account: str)
Returns a fully-qualified billing_account string.
common_folder_path
common_folder_path(folder: str)
Returns a fully-qualified folder string.
common_location_path
common_location_path(project: str, location: str)
Returns a fully-qualified location string.
common_organization_path
common_organization_path(organization: str)
Returns a fully-qualified organization string.
common_project_path
common_project_path(project: str)
Returns a fully-qualified project string.
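The common_* helpers above are classmethods, so they can be called without constructing a client. A minimal sketch with placeholder project and location values:

from google.cloud import aiplatform_v1beta1

# Placeholder project and location for illustration.
path = aiplatform_v1beta1.ModelServiceAsyncClient.common_location_path(
    project="my-project", location="us-central1"
)
print(path)  # projects/my-project/locations/us-central1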
delete_model
delete_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.DeleteModelRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Deletes a Model.
A Model cannot be deleted if any Endpoint resource has a DeployedModel based on the model in its deployed_models field.
from google.cloud import aiplatform_v1beta1

def sample_delete_model():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.DeleteModelRequest(
        name="name_value",
    )

    # Make the request
    operation = client.delete_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.DeleteModelRequest, dict]. The request object. Request message for ModelService.DeleteModel. |
name | `str`. Required. The name of the Model resource to be deleted. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.api_core.operation_async.AsyncOperation | An object representing a long-running operation. The result type for the operation will be `google.protobuf.empty_pb2.Empty`, a generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method, for instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); }. The JSON representation of Empty is an empty JSON object, {}. |
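The sample above uses the synchronous client; a sketch of the same call on the async client documented here (assuming an event loop and ambient credentials) could look like this:

import asyncio

from google.cloud import aiplatform_v1beta1

async def sample_delete_model_async():
    client = aiplatform_v1beta1.ModelServiceAsyncClient()
    request = aiplatform_v1beta1.DeleteModelRequest(name="name_value")

    # Both the RPC and the operation result are awaited with the async client.
    operation = await client.delete_model(request=request)
    print("Waiting for operation to complete...")
    response = await operation.result()
    print(response)

asyncio.run(sample_delete_model_async())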
endpoint_path
endpoint_path(project: str, location: str, endpoint: str)
Returns a fully-qualified endpoint string.
export_model
export_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.ExportModelRequest, dict]] = None, *, name: Optional[str] = None, output_config: Optional[google.cloud.aiplatform_v1beta1.types.model_service.ExportModelRequest.OutputConfig] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Exports a trained, exportable Model to a location specified by the user. A Model is considered to be exportable if it has at least one supported export format (see Model.supported_export_formats).
from google.cloud import aiplatform_v1beta1

def sample_export_model():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ExportModelRequest(
        name="name_value",
    )

    # Make the request
    operation = client.export_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.ExportModelRequest, dict]. The request object. Request message for ModelService.ExportModel. |
name | `str`. Required. The resource name of the Model to export. This corresponds to the |
output_config | OutputConfig. Required. The desired output location and configuration. This corresponds to the |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.api_core.operation_async.AsyncOperation | An object representing a long-running operation. The result type for the operation will be ExportModelResponse, the response message of the ModelService.ExportModel operation. |
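As a sketch of how the required output_config argument might be populated; the export format ID and bucket path are placeholders, and GcsDestination is assumed to be re-exported at the aiplatform_v1beta1 top level like the other types used here.

from google.cloud import aiplatform_v1beta1

# Placeholder export format and destination for illustration.
output_config = aiplatform_v1beta1.ExportModelRequest.OutputConfig(
    export_format_id="custom-trained",
    artifact_destination=aiplatform_v1beta1.GcsDestination(
        output_uri_prefix="gs://my-bucket/exported-model/"
    ),
)
request = aiplatform_v1beta1.ExportModelRequest(
    name="name_value",
    output_config=output_config,
)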
from_service_account_file
from_service_account_file(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
Name | Description |
filename | str. The path to the service account private key json file. |
Type | Description |
ModelServiceAsyncClient | The constructed client. |
from_service_account_info
from_service_account_info(info: dict, *args, **kwargs)
Creates an instance of this client using the provided credentials info.
Name | Description |
info | dict. The service account private key info. |
Type | Description |
ModelServiceAsyncClient | The constructed client. |
from_service_account_json
from_service_account_json(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
Name | Description |
filename | str. The path to the service account private key json file. |
Type | Description |
ModelServiceAsyncClient | The constructed client. |
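A short sketch of building the client from a service account key file (the path is a placeholder); from_service_account_json behaves the same way.

from google.cloud import aiplatform_v1beta1

# Placeholder key file path for illustration.
client = aiplatform_v1beta1.ModelServiceAsyncClient.from_service_account_file(
    "service-account.json"
)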
get_model
get_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.GetModelRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Gets a Model.
from google.cloud import aiplatform_v1beta1

def sample_get_model():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.GetModelRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_model(request=request)

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.GetModelRequest, dict]. The request object. Request message for ModelService.GetModel. |
name | `str`. Required. The name of the Model resource. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.types.Model | A trained machine learning Model. |
get_model_evaluation
get_model_evaluation(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.GetModelEvaluationRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Gets a ModelEvaluation.
from google.cloud import aiplatform_v1beta1

def sample_get_model_evaluation():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.GetModelEvaluationRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_model_evaluation(request=request)

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.GetModelEvaluationRequest, dict]. The request object. Request message for ModelService.GetModelEvaluation. |
name | `str`. Required. The name of the ModelEvaluation resource. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.types.ModelEvaluation | A collection of metrics calculated by comparing Model's predictions on all of the test data against annotations from the test data. |
get_model_evaluation_slice
get_model_evaluation_slice(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.GetModelEvaluationSliceRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Gets a ModelEvaluationSlice.
from google.cloud import aiplatform_v1beta1

def sample_get_model_evaluation_slice():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.GetModelEvaluationSliceRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_model_evaluation_slice(request=request)

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.GetModelEvaluationSliceRequest, dict]. The request object. Request message for ModelService.GetModelEvaluationSlice. |
name | `str`. Required. The name of the ModelEvaluationSlice resource. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.types.ModelEvaluationSlice | A collection of metrics calculated by comparing Model's predictions on a slice of the test data against ground truth annotations. |
get_mtls_endpoint_and_cert_source
get_mtls_endpoint_and_cert_source(
client_options: Optional[google.api_core.client_options.ClientOptions] = None,
)
Return the API endpoint and client cert source for mutual TLS.
The client cert source is determined in the following order:
(1) If the GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is not "true", the client cert source is None.
(2) If client_options.client_cert_source is provided, use the provided one; if the default client cert source exists, use the default one; otherwise the client cert source is None.
The API endpoint is determined in the following order:
(1) If client_options.api_endpoint is provided, use the provided one.
(2) If the GOOGLE_API_USE_MTLS_ENDPOINT environment variable is "always", use the default mTLS endpoint; if the environment variable is "never", use the default API endpoint; otherwise, if a client cert source exists, use the default mTLS endpoint, otherwise use the default API endpoint.
More details can be found at https://google.aip.dev/auth/4114.
Name | Description |
client_options | google.api_core.client_options.ClientOptions. Custom options for the client. Only the |
Type | Description |
google.auth.exceptions.MutualTLSChannelError | If any errors happen. |
Type | Description |
Tuple[str, Callable[[], Tuple[bytes, bytes]]] | returns the API endpoint and the client cert source to use. |
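A sketch of inspecting the resolved endpoint and cert source with default client options; the printed endpoint depends on the environment variables described above.

from google.cloud import aiplatform_v1beta1

endpoint, cert_source = (
    aiplatform_v1beta1.ModelServiceAsyncClient.get_mtls_endpoint_and_cert_source()
)
print(endpoint)             # default API endpoint unless mTLS is configured
print(cert_source is None)  # True when no client certificate source applies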
get_transport_class
get_transport_class()
Returns an appropriate transport class.
import_model_evaluation
import_model_evaluation(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.ImportModelEvaluationRequest, dict]] = None, *, parent: Optional[str] = None, model_evaluation: Optional[google.cloud.aiplatform_v1beta1.types.model_evaluation.ModelEvaluation] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Imports an externally generated ModelEvaluation.
from google.cloud import aiplatform_v1beta1

def sample_import_model_evaluation():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ImportModelEvaluationRequest(
        parent="parent_value",
    )

    # Make the request
    response = client.import_model_evaluation(request=request)

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.ImportModelEvaluationRequest, dict]. The request object. Request message for ModelService.ImportModelEvaluation. |
parent | `str`. Required. The name of the parent model resource. Format: |
model_evaluation | ModelEvaluation. Required. Model evaluation resource to be imported. This corresponds to the |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.types.ModelEvaluation | A collection of metrics calculated by comparing Model's predictions on all of the test data against annotations from the test data. |
list_model_evaluation_slices
list_model_evaluation_slices(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.ListModelEvaluationSlicesRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Lists ModelEvaluationSlices in a ModelEvaluation.
from google.cloud import aiplatform_v1beta1

def sample_list_model_evaluation_slices():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ListModelEvaluationSlicesRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_model_evaluation_slices(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.ListModelEvaluationSlicesRequest, dict]. The request object. Request message for ModelService.ListModelEvaluationSlices. |
parent | `str`. Required. The resource name of the ModelEvaluation to list the ModelEvaluationSlices from. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.services.model_service.pagers.ListModelEvaluationSlicesAsyncPager | Response message for ModelService.ListModelEvaluationSlices. Iterating over this object will yield results and resolve additional pages automatically. |
list_model_evaluations
list_model_evaluations(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.ListModelEvaluationsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Lists ModelEvaluations in a Model.
from google.cloud import aiplatform_v1beta1

def sample_list_model_evaluations():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ListModelEvaluationsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_model_evaluations(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.ListModelEvaluationsRequest, dict]. The request object. Request message for ModelService.ListModelEvaluations. |
parent | `str`. Required. The resource name of the Model to list the ModelEvaluations from. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.services.model_service.pagers.ListModelEvaluationsAsyncPager | Response message for ModelService.ListModelEvaluations. Iterating over this object will yield results and resolve additional pages automatically. |
list_models
list_models(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.ListModelsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Lists Models in a Location.
from google.cloud import aiplatform_v1beta1

def sample_list_models():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    request = aiplatform_v1beta1.ListModelsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_models(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.ListModelsRequest, dict]. The request object. Request message for ModelService.ListModels. |
parent | `str`. Required. The resource name of the Location to list the Models from. Format: |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.services.model_service.pagers.ListModelsAsyncPager | Response message for ModelService.ListModels. Iterating over this object will yield results and resolve additional pages automatically. |
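The sample above iterates a synchronous pager; with the async client documented here the return value is a ListModelsAsyncPager, so a sketch of the iteration (placeholder parent value) uses async for:

import asyncio

from google.cloud import aiplatform_v1beta1

async def sample_list_models_async():
    client = aiplatform_v1beta1.ModelServiceAsyncClient()
    request = aiplatform_v1beta1.ListModelsRequest(parent="parent_value")

    page_result = await client.list_models(request=request)
    # Iterating the async pager resolves additional pages automatically.
    async for model in page_result:
        print(model)

asyncio.run(sample_list_models_async())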
model_evaluation_path
model_evaluation_path(project: str, location: str, model: str, evaluation: str)
Returns a fully-qualified model_evaluation string.
model_evaluation_slice_path
model_evaluation_slice_path(
project: str, location: str, model: str, evaluation: str, slice: str
)
Returns a fully-qualified model_evaluation_slice string.
model_path
model_path(project: str, location: str, model: str)
Returns a fully-qualified model string.
parse_common_billing_account_path
parse_common_billing_account_path(path: str)
Parse a billing_account path into its component segments.
parse_common_folder_path
parse_common_folder_path(path: str)
Parse a folder path into its component segments.
parse_common_location_path
parse_common_location_path(path: str)
Parse a location path into its component segments.
parse_common_organization_path
parse_common_organization_path(path: str)
Parse an organization path into its component segments.
parse_common_project_path
parse_common_project_path(path: str)
Parse a project path into its component segments.
parse_endpoint_path
parse_endpoint_path(path: str)
Parses an endpoint path into its component segments.
parse_model_evaluation_path
parse_model_evaluation_path(path: str)
Parses a model_evaluation path into its component segments.
parse_model_evaluation_slice_path
parse_model_evaluation_slice_path(path: str)
Parses a model_evaluation_slice path into its component segments.
parse_model_path
parse_model_path(path: str)
Parses a model path into its component segments.
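A sketch of round-tripping a model resource name with the path helpers above (placeholder project, location, and model IDs):

from google.cloud import aiplatform_v1beta1

client_cls = aiplatform_v1beta1.ModelServiceAsyncClient

# Placeholder IDs for illustration.
name = client_cls.model_path(project="my-project", location="us-central1", model="123")
print(name)                               # projects/my-project/locations/us-central1/models/123
print(client_cls.parse_model_path(name))  # {'project': 'my-project', 'location': 'us-central1', 'model': '123'}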
parse_training_pipeline_path
parse_training_pipeline_path(path: str)
Parses a training_pipeline path into its component segments.
training_pipeline_path
training_pipeline_path(project: str, location: str, training_pipeline: str)
Returns a fully-qualified training_pipeline string.
update_model
update_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.UpdateModelRequest, dict]] = None, *, model: Optional[google.cloud.aiplatform_v1beta1.types.model.Model] = None, update_mask: Optional[google.protobuf.field_mask_pb2.FieldMask] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Updates a Model.
from google.cloud import aiplatform_v1beta1

def sample_update_model():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    model = aiplatform_v1beta1.Model()
    model.display_name = "display_name_value"

    request = aiplatform_v1beta1.UpdateModelRequest(
        model=model,
    )

    # Make the request
    response = client.update_model(request=request)

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.UpdateModelRequest, dict]. The request object. Request message for ModelService.UpdateModel. |
model | Model. Required. The Model which replaces the resource on the server. This corresponds to the |
update_mask | `google.protobuf.field_mask_pb2.FieldMask`. Required. The update mask applies to the resource. For the |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.aiplatform_v1beta1.types.Model | A trained machine learning Model. |
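A sketch of supplying the update_mask alongside the model so that only selected fields are overwritten; the field chosen below is an illustrative placeholder.

from google.protobuf import field_mask_pb2

from google.cloud import aiplatform_v1beta1

model = aiplatform_v1beta1.Model(name="name_value", display_name="new display name")
request = aiplatform_v1beta1.UpdateModelRequest(
    model=model,
    # Only the fields listed in the mask are updated on the server.
    update_mask=field_mask_pb2.FieldMask(paths=["display_name"]),
)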
upload_model
upload_model(request: Optional[Union[google.cloud.aiplatform_v1beta1.types.model_service.UploadModelRequest, dict]] = None, *, parent: Optional[str] = None, model: Optional[google.cloud.aiplatform_v1beta1.types.model.Model] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Uploads a Model artifact into Vertex AI.
from google.cloud import aiplatform_v1beta1

def sample_upload_model():
    # Create a client
    client = aiplatform_v1beta1.ModelServiceClient()

    # Initialize request argument(s)
    model = aiplatform_v1beta1.Model()
    model.display_name = "display_name_value"

    request = aiplatform_v1beta1.UploadModelRequest(
        parent="parent_value",
        model=model,
    )

    # Make the request
    operation = client.upload_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
Name | Description |
request | Union[google.cloud.aiplatform_v1beta1.types.UploadModelRequest, dict]. The request object. Request message for ModelService.UploadModel. |
parent | `str`. Required. The resource name of the Location into which to upload the Model. Format: |
model | Model. Required. The Model to create. This corresponds to the |
retry | google.api_core.retry.Retry. Designation of what errors, if any, should be retried. |
timeout | float. The timeout for this request. |
metadata | Sequence[Tuple[str, str]]. Strings which should be sent along with the request as metadata. |
Type | Description |
google.api_core.operation_async.AsyncOperation | An object representing a long-running operation. The result type for the operation will be UploadModelResponse, the response message of the ModelService.UploadModel operation. |