MonitoringInput(
    vertex_dataset: typing.Optional[str] = None,
    gcs_uri: typing.Optional[str] = None,
    data_format: typing.Optional[str] = None,
    table_uri: typing.Optional[str] = None,
    query: typing.Optional[str] = None,
    timestamp_field: typing.Optional[str] = None,
    batch_prediction_job: typing.Optional[str] = None,
    endpoints: typing.Optional[typing.List[str]] = None,
    start_time: typing.Optional[google.protobuf.timestamp_pb2.Timestamp] = None,
    end_time: typing.Optional[google.protobuf.timestamp_pb2.Timestamp] = None,
    offset: typing.Optional[str] = None,
    window: typing.Optional[str] = None,
)

Model monitoring data input spec.
Attributes

| Name | Description |
|---|---|
| `vertex_dataset` | str. Optional. Resource name of the Vertex AI managed dataset. Format: `projects/{project}/locations/{location}/datasets/{dataset}`. At least one dataset source should be provided; if one of the source fields is set, there is no need to set the others (`vertex_dataset`, `gcs_uri`, `table_uri`, `query`, `batch_prediction_job`, `endpoints`). |
| `gcs_uri` | str. Optional. Google Cloud Storage URI to the input file(s). May contain wildcards. |
| `data_format` | str. Optional. Data format of the Google Cloud Storage file(s). Should be provided if `gcs_uri` is set. Supported formats: `"csv"`, `"jsonl"`, `"tf-record"`. |
| `table_uri` | str. Optional. BigQuery URI to a table, up to 2000 characters long. All columns in the table will be selected. Accepted form: a BigQuery path, for example `bq://projectId.bqDatasetId.bqTableId`. |
| `query` | str. Optional. Standard SQL query for BigQuery, used instead of `table_uri`. |
| `timestamp_field` | str. Optional. The timestamp field in the dataset. `timestamp_field` must be specified if you want to use `start_time`, `end_time`, `offset`, or `window`. If you use `query` to specify the dataset, make sure `timestamp_field` is among the selected fields. |
| `batch_prediction_job` | str. Optional. Vertex AI Batch Prediction Job resource name. Format: `projects/{project}/locations/{location}/batchPredictionJobs/{batch_prediction_job}`. |
| `endpoints` | List[str]. Optional. List of Vertex AI Endpoint resource names. Format: `projects/{project}/locations/{location}/endpoints/{endpoint}`. |
| `start_time` | timestamp_pb2.Timestamp. Optional. Inclusive start of the time interval for which results should be returned. Should be set together with `end_time`. |
| `end_time` | timestamp_pb2.Timestamp. Optional. Exclusive end of the time interval for which results should be returned. Should be set together with `start_time`. |
| `offset` | str. Optional. The time difference back from the cut-off time. For scheduled jobs, the cut-off time is the scheduled time; for non-scheduled jobs, it is the time when the job was created. Supported format: `'w'`/`'W'` for weeks, `'d'`/`'D'` for days, `'h'`/`'H'` for hours. E.g. `'1h'` stands for 1 hour, `'2d'` stands for 2 days. |
| `window` | str. Optional. The scope of data selected for analysis, i.e. the quantity of data you wish to examine: the data time window prior to the cut-off time (or the cut-off time minus the offset). Supported format: `'w'`/`'W'` for weeks, `'d'`/`'D'` for days, `'h'`/`'H'` for hours. E.g. `'1h'` stands for 1 hour, `'2d'` stands for 2 days. |
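The time-window fields interact: `start_time`/`end_time` take protobuf `Timestamp` values, while `offset` and `window` are duration strings interpreted relative to the cut-off time. A minimal sketch of preparing these values; the `parse_duration` helper and the window arithmetic are illustrative assumptions based on the field descriptions above, not part of the SDK:

```python
# Sketch: preparing MonitoringInput time-window values.
# start_time / end_time take google.protobuf.timestamp_pb2.Timestamp
# (the interval is [start_time, end_time): start inclusive, end exclusive).
from datetime import datetime, timedelta, timezone

from google.protobuf import timestamp_pb2


def to_timestamp(dt: datetime) -> timestamp_pb2.Timestamp:
    """Convert a timezone-aware datetime to a protobuf Timestamp."""
    ts = timestamp_pb2.Timestamp()
    ts.FromDatetime(dt)
    return ts


# Hypothetical helper: parse the 'Nw'/'Nd'/'Nh' strings used by offset/window.
_UNIT = {"w": timedelta(weeks=1), "d": timedelta(days=1), "h": timedelta(hours=1)}


def parse_duration(s: str) -> timedelta:
    return int(s[:-1]) * _UNIT[s[-1].lower()]


# Explicit interval: results in [2024-01-01, 2024-01-08).
start_time = to_timestamp(datetime(2024, 1, 1, tzinfo=timezone.utc))
end_time = to_timestamp(datetime(2024, 1, 8, tzinfo=timezone.utc))

# Relative interval (assumed semantics): with offset='1d' and window='2d',
# a job whose cut-off time is 2024-01-08 analyzes data in
# [2024-01-05, 2024-01-07).
cut_off = datetime(2024, 1, 8, tzinfo=timezone.utc)
window_end = cut_off - parse_duration("1d")
window_start = window_end - parse_duration("2d")
```

Whether `offset` and `window` combine exactly this way is an interpretation of the field descriptions; the service performs this computation server-side from the duration strings you pass.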