MonitoringInput(
vertex_dataset: typing.Optional[str] = None,
gcs_uri: typing.Optional[str] = None,
data_format: typing.Optional[str] = None,
table_uri: typing.Optional[str] = None,
query: typing.Optional[str] = None,
timestamp_field: typing.Optional[str] = None,
batch_prediction_job: typing.Optional[str] = None,
endpoints: typing.Optional[typing.List[str]] = None,
start_time: typing.Optional[google.protobuf.timestamp_pb2.Timestamp] = None,
end_time: typing.Optional[google.protobuf.timestamp_pb2.Timestamp] = None,
offset: typing.Optional[str] = None,
window: typing.Optional[str] = None,
)
Model monitoring data input spec.
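A minimal construction sketch follows, assuming the class is exposed under vertexai.resources.preview.ml_monitoring.spec; the import path, bucket path, and the event_ts column name are illustrative assumptions, so adjust them to your installed SDK version and data.

```python
# Sketch: a MonitoringInput that reads CSV files from Cloud Storage.
# The import path below is an assumption; locate MonitoringInput in your
# installed vertexai / google-cloud-aiplatform version.
from vertexai.resources.preview.ml_monitoring import spec

gcs_input = spec.MonitoringInput(
    gcs_uri="gs://my-bucket/monitoring/serving-data-*.csv",  # hypothetical bucket/path; wildcards allowed
    data_format="csv",           # provide when gcs_uri is set: "csv", "jsonl", or "tf-record"
    timestamp_field="event_ts",  # hypothetical column holding each row's timestamp
)
```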
Attributes

| Name | Type | Description |
|---|---|---|
| vertex_dataset | str | Optional. Resource name of the Vertex AI managed dataset. Format: projects/{project}/locations/{location}/datasets/{dataset}. At least one dataset source must be provided; if one of the source fields (vertex_dataset, gcs_uri, table_uri, query, batch_prediction_job, endpoints) is set, the others do not need to be set. |
| gcs_uri | str | Optional. Google Cloud Storage URI to the input file(s). May contain wildcards. |
| data_format | str | Optional. Data format of the Google Cloud Storage file(s). Should be provided if gcs_uri is set. Supported formats: "csv", "jsonl", "tf-record". |
| table_uri | str | Optional. BigQuery URI to a table, up to 2000 characters long. All columns in the table will be selected. Accepted form: a BigQuery path, for example bq://projectId.bqDatasetId.bqTableId. |
| query | str | Optional. Standard SQL query for BigQuery, used instead of table_uri. |
| timestamp_field | str | Optional. The timestamp field in the dataset. timestamp_field must be specified in order to use start_time, end_time, offset, or window. If you use query to specify the dataset, make sure timestamp_field is among the selected fields. (See the time-window sketch after this table.) |
| batch_prediction_job | str | Optional. Vertex AI Batch Prediction Job resource name. Format: projects/{project}/locations/{location}/batchPredictionJobs/{batch_prediction_job} |
| endpoints | List[str] | Optional. List of Vertex AI Endpoint resource names. Format: projects/{project}/locations/{location}/endpoints/{endpoint} |
| start_time | timestamp_pb2.Timestamp | Optional. Inclusive start of the time interval for which results should be returned. Should be set together with end_time. |
| end_time | timestamp_pb2.Timestamp | Optional. Exclusive end of the time interval for which results should be returned. Should be set together with start_time. |
| offset | str | Optional. The time difference from the cut-off time. For scheduled jobs, the cut-off time is the scheduled run time; for non-scheduled jobs, it is the time the job was created. Supported format: 'w' or 'W' for weeks, 'd' or 'D' for days, 'h' or 'H' for hours. For example, '1h' stands for 1 hour and '2d' stands for 2 days. |
| window | str | Optional. The scope of data selected for analysis, i.e. how much data to examine: the data time window prior to the cut-off time (or the cut-off time minus the offset). Supported format: 'w' or 'W' for weeks, 'd' or 'D' for days, 'h' or 'H' for hours. For example, '1h' stands for 1 hour and '2d' stands for 2 days. |
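To complement the field descriptions above, the sketch below selects a BigQuery table and restricts it first to an explicit time interval and then to a rolling window. The table URI, column name, and import path are the same illustrative assumptions as in the earlier example; only the protobuf Timestamp usage is standard.

```python
# Sketch: BigQuery-backed MonitoringInput limited to an explicit time range.
from google.protobuf import timestamp_pb2
from vertexai.resources.preview.ml_monitoring import spec  # assumed import path

start = timestamp_pb2.Timestamp()
start.FromJsonString("2024-01-01T00:00:00Z")  # inclusive start of the interval
end = timestamp_pb2.Timestamp()
end.FromJsonString("2024-01-08T00:00:00Z")    # exclusive end of the interval

bq_input = spec.MonitoringInput(
    table_uri="bq://my-project.my_dataset.serving_logs",  # hypothetical table
    timestamp_field="event_ts",  # required to use start_time/end_time, offset, or window
    start_time=start,
    end_time=end,
)

# Alternative: a rolling window relative to the cut-off time instead of
# absolute timestamps, e.g. the 24 hours ending 1 hour before the cut-off.
rolling_input = spec.MonitoringInput(
    table_uri="bq://my-project.my_dataset.serving_logs",
    timestamp_field="event_ts",
    window="1d",  # amount of data to analyze
    offset="1h",  # shift the cut-off time back by 1 hour
)
```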