Class ModelMonitoringJob (1.74.0)

ModelMonitoringJob(
    model_monitoring_job_name: str,
    model_monitor_id: typing.Optional[str] = None,
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None,
)

Initializer for ModelMonitoringJob.

Example Usage:

 my_monitoring_job = aiplatform.ModelMonitoringJob(
     model_monitoring_job_name='projects/123/locations/us-central1/modelMonitors/my_model_monitor_id/modelMonitoringJobs/my_monitoring_job_id'
 )
 or
 my_monitoring_job = aiplatform.ModelMonitoringJob(
     model_monitoring_job_name='my_monitoring_job_id',
     model_monitor_id='my_model_monitor_id',
 )

Parameters

Name Description
model_monitoring_job_name str

Required. The resource name for the Model Monitoring Job if provided alone, or the model monitoring job id if provided with model_monitor_id.

model_monitor_id str

Optional. The model monitor id. Required if model_monitoring_job_name is provided as a job ID rather than a full resource name.

project str

Optional. Project to retrieve the model monitoring job from. If not set, the project set in aiplatform.init will be used.

location str

Optional. Location to retrieve the model monitoring job from. If not set, the location set in aiplatform.init will be used.

credentials auth_credentials.Credentials

Optional. Custom credentials to use to initialize the model monitoring job. Overrides credentials set in aiplatform.init.

Properties

create_time

Time this resource was created.

display_name

Display name of this resource.

encryption_spec

Customer-managed encryption key options for this Vertex AI resource.

If this is set, then all resources created by this Vertex AI resource will be encrypted with the provided encryption key.

gca_resource

The underlying resource proto representation.

labels

User-defined labels containing metadata about this resource.

Read more about labels at https://goo.gl/xmQnxf

name

Name of this resource.

resource_name

Full qualified resource name.

state

Fetches the job again and returns the current JobState.

Returns
Type Description
state (job_state.JobState) Enum that describes the state of a Model Monitoring Job.

update_time

Time this resource was last updated.

Methods

ModelMonitoringJob

ModelMonitoringJob(
    model_monitoring_job_name: str,
    model_monitor_id: typing.Optional[str] = None,
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None,
)

Initializes class with project, location, and api_client.

Parameters
Name Description
project str

Optional. Project of the resource noun.

location str

Optional. The location of the resource noun.

credentials google.auth.credentials.Credentials

Optional. Custom credentials to use when interacting with the resource noun.

resource_name str

A fully-qualified resource name or ID.

create

create(
    model_monitor_name: typing.Optional[str] = None,
    target_dataset: typing.Optional[
        vertexai.resources.preview.ml_monitoring.spec.objective.MonitoringInput
    ] = None,
    display_name: typing.Optional[str] = None,
    model_monitoring_job_id: typing.Optional[str] = None,
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None,
    baseline_dataset: typing.Optional[
        vertexai.resources.preview.ml_monitoring.spec.objective.MonitoringInput
    ] = None,
    tabular_objective_spec: typing.Optional[
        vertexai.resources.preview.ml_monitoring.spec.objective.TabularObjective
    ] = None,
    output_spec: typing.Optional[
        vertexai.resources.preview.ml_monitoring.spec.output.OutputSpec
    ] = None,
    notification_spec: typing.Optional[
        vertexai.resources.preview.ml_monitoring.spec.notification.NotificationSpec
    ] = None,
    explanation_spec: typing.Optional[
        google.cloud.aiplatform_v1beta1.types.explanation.ExplanationSpec
    ] = None,
    sync: bool = False,
) -> vertexai.resources.preview.ml_monitoring.model_monitors.ModelMonitoringJob

Creates a new ModelMonitoringJob.

Parameters
Name Description
model_monitor_name str

Required. The parent model monitor resource name. Format: projects/{project}/locations/{location}/modelMonitors/{model_monitor}

target_dataset objective.MonitoringInput

Required. The target dataset for analysis.

display_name str

Optional. The user-defined name of the ModelMonitoringJob. The name can be up to 128 characters long and can comprise any UTF-8 character.

model_monitoring_job_id str

Optional. The unique ID of the model monitoring job run, which will become the final component of the model monitoring job resource name. The maximum length is 63 characters, and valid characters are /^[a-z]([a-z0-9-]{0,61}[a-z0-9])?$/. If not specified, it will be generated by Vertex AI.

project str

Optional. Project to create the model monitoring job in. If not set, the project set in aiplatform.init will be used.

location str

Optional. Location to create the model monitoring job in. If not set, the location set in aiplatform.init will be used.

credentials auth_credentials.Credentials

Optional. Custom credentials to use to create model monitoring job. Overrides credentials set in aiplatform.init.

baseline_dataset objective.MonitoringInput

Optional. The baseline dataset for the monitoring job. If not set, the training dataset in the ModelMonitor will be used as the baseline dataset.

output_spec output.OutputSpec

Optional. The monitoring metrics/logs export spec. If not set, the default output_spec defined in the ModelMonitor will be used.

notification_spec notification.NotificationSpec

Optional. The notification spec for the monitoring result. If not set, the default notification_spec defined in the ModelMonitor will be used.

explanation_spec explanation.ExplanationSpec

Optional. The explanation spec for feature attribution monitoring. If not set, the default explanation_spec defined in the ModelMonitor will be used.

sync bool

Optional. Whether to execute this method synchronously. If False, this method runs in a concurrent Future; any downstream object is returned immediately and synced once the Future completes. Defaults to False.

Returns
Type Description
ModelMonitoringJob The model monitoring job that was created.
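
The model_monitoring_job_id constraint above can be checked locally before calling create; a minimal sketch, assuming the standard GCP resource-ID pattern (lowercase letter start; lowercase letters, digits, and hyphens; letter-or-digit end; at most 63 characters) — the validator name is illustrative, not part of the SDK:

```python
import re

# Hypothetical local validator for a model monitoring job ID, assuming the
# common GCP resource-ID pattern: starts with a lowercase letter, contains
# only lowercase letters, digits, and hyphens, ends with a letter or digit,
# and is at most 63 characters long.
_JOB_ID_RE = re.compile(r"^[a-z]([a-z0-9-]{0,61}[a-z0-9])?$")

def is_valid_job_id(job_id: str) -> bool:
    return bool(_JOB_ID_RE.fullmatch(job_id))

# is_valid_job_id("my-monitoring-job-1")  -> True
# is_valid_job_id("My_Job")               -> False (uppercase and underscore)
# is_valid_job_id("1job")                 -> False (must start with a letter)
```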

delete

delete() -> None

Deletes a Model Monitoring Job.

done

done() -> bool

Method indicating whether a job has completed.

list

list(
    filter: typing.Optional[str] = None,
    order_by: typing.Optional[str] = None,
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None,
    parent: typing.Optional[str] = None,
) -> typing.List[google.cloud.aiplatform.base.VertexAiResourceNoun]

Lists all instances of this Vertex AI Resource.

Example Usage:

aiplatform.BatchPredictionJob.list(
    filter='state="JOB_STATE_SUCCEEDED" AND display_name="my_job"',
)

aiplatform.Model.list(order_by="create_time desc, display_name")

Parameters
Name Description
filter str

Optional. An expression for filtering the results of the request. For field names both snake_case and camelCase are supported.

order_by str

Optional. A comma-separated list of fields to order by, sorted in ascending order. Use "desc" after a field name for descending. Supported fields: display_name, create_time, update_time

project str

Optional. Project to retrieve list from. If not set, project set in aiplatform.init will be used.

location str

Optional. Location to retrieve list from. If not set, location set in aiplatform.init will be used.

credentials auth_credentials.Credentials

Optional. Custom credentials to use to retrieve list. Overrides credentials set in aiplatform.init.

parent str

Optional. The parent resource name if any to retrieve list from.
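
For model monitoring jobs, the parent is the owning model monitor. A minimal sketch of building that resource name (the helper name is illustrative, not part of the SDK):

```python
def model_monitor_parent(project: str, location: str, model_monitor_id: str) -> str:
    """Build the parent resource name used to list monitoring jobs under a monitor."""
    return f"projects/{project}/locations/{location}/modelMonitors/{model_monitor_id}"

parent = model_monitor_parent("123", "us-central1", "my_model_monitor_id")
# parent == "projects/123/locations/us-central1/modelMonitors/my_model_monitor_id"
# jobs = aiplatform.ModelMonitoringJob.list(parent=parent)  # actual SDK call
```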

to_dict

to_dict() -> typing.Dict[str, typing.Any]

Returns the resource proto as a dictionary.

wait

wait()

Helper method that blocks until all futures are complete.
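
The sync=False / done() / wait() lifecycle mirrors the standard concurrent-futures pattern. A local analogy using only the standard library (not the SDK itself):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_create():
    # Stand-in for the server-side work that create(sync=False) defers.
    time.sleep(0.1)
    return "created"

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(slow_create)  # like create(sync=False): returns immediately
    # At this point the work is still running in the background.
    result = future.result()           # like wait(): blocks until completion
    assert future.done()               # like done(): True once the job finishes

# result == "created"
```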