Reference documentation and code samples for the Vertex AI V1 API class Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.
Configuration defining how to transform batch prediction input instances to the instances that the Model accepts.
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
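As a quick orientation before the method reference, the following is a minimal sketch of constructing this message and attaching it to a BatchPredictionJob; the display name and field values are illustrative assumptions, not required settings.

```ruby
require "google/cloud/ai_platform/v1"

# Minimal sketch: the field values below (instance_type, excluded_fields,
# display_name) are illustrative assumptions, not required settings.
instance_config = Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.new(
  instance_type:   "object",
  excluded_fields: ["internal_id"]
)

# InstanceConfig is carried on the BatchPredictionJob message.
job = Google::Cloud::AIPlatform::V1::BatchPredictionJob.new(
  display_name:    "example-batch-job",
  instance_config: instance_config
)
```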
Methods
#excluded_fields
def excluded_fields() -> ::Array<::String>
Returns
- (::Array<::String>) — Fields that will be excluded from the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
#excluded_fields=
def excluded_fields=(value) -> ::Array<::String>
Parameter
- value (::Array<::String>) — Fields that will be excluded from the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
Returns
- (::Array<::String>) — Fields that will be excluded from the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
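A short, hedged usage sketch (the field names are hypothetical):

```ruby
require "google/cloud/ai_platform/v1"

config = Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.new

# Strip two hypothetical bookkeeping fields from each instance; since
# excluded_fields is populated, included_fields must remain empty.
config.excluded_fields = ["internal_id", "ingest_timestamp"]
config.excluded_fields.to_a #=> ["internal_id", "ingest_timestamp"]
```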
#included_fields
def included_fields() -> ::Array<::String>
Returns
- (::Array<::String>) — Fields that will be included in the prediction instance that is sent to the Model. If instance_type is array, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
#included_fields=
def included_fields=(value) -> ::Array<::String>
Parameter
- value (::Array<::String>) — Fields that will be included in the prediction instance that is sent to the Model. If instance_type is array, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
Returns
- (::Array<::String>) — Fields that will be included in the prediction instance that is sent to the Model. If instance_type is array, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
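A hedged sketch of how ordering interacts with instance_type (the column names are hypothetical):

```ruby
require "google/cloud/ai_platform/v1"

# With instance_type "array", the order of names in included_fields also
# fixes the order of values in each array instance sent to the Model.
config = Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.new(
  instance_type: "array"
)
config.included_fields = ["age", "income", "region"] # hypothetical columns
config.included_fields.to_a #=> ["age", "income", "region"]
```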
#instance_type
def instance_type() -> ::String
Returns
- (::String) — The format of the instance that the Model accepts. Vertex AI will convert compatible [batch prediction input instance formats][google.cloud.aiplatform.v1.BatchPredictionJob.InputConfig.instances_format] to the specified format.
  Supported values are:
  - object: Each input is converted to JSON object format.
    - For bigquery, each row is converted to an object.
    - For jsonl, each line of the JSONL input must be an object.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  - array: Each input is converted to JSON array format.
    - For bigquery, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated; included_fields must be populated to specify the field order.
    - For jsonl, if each line of the JSONL input is an object, included_fields must be populated to specify the field order.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  If not specified, Vertex AI converts the batch prediction input as follows:
  - For bigquery and csv, the behavior is the same as array. The order of columns is the same as defined in the file or table, unless included_fields is populated.
  - For jsonl, the prediction instance format is determined by each line of the input.
  - For tf-record/tf-record-gzip, each record will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the record.
  - For file-list, each file in the list will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the file.
#instance_type=
def instance_type=(value) -> ::String
Parameter
- value (::String) — The format of the instance that the Model accepts. Vertex AI will convert compatible [batch prediction input instance formats][google.cloud.aiplatform.v1.BatchPredictionJob.InputConfig.instances_format] to the specified format.
  Supported values are:
  - object: Each input is converted to JSON object format.
    - For bigquery, each row is converted to an object.
    - For jsonl, each line of the JSONL input must be an object.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  - array: Each input is converted to JSON array format.
    - For bigquery, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated; included_fields must be populated to specify the field order.
    - For jsonl, if each line of the JSONL input is an object, included_fields must be populated to specify the field order.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  If not specified, Vertex AI converts the batch prediction input as follows:
  - For bigquery and csv, the behavior is the same as array. The order of columns is the same as defined in the file or table, unless included_fields is populated.
  - For jsonl, the prediction instance format is determined by each line of the input.
  - For tf-record/tf-record-gzip, each record will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the record.
  - For file-list, each file in the list will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the file.
Returns
- (::String) — The format of the instance that the Model accepts. Vertex AI will convert compatible [batch prediction input instance formats][google.cloud.aiplatform.v1.BatchPredictionJob.InputConfig.instances_format] to the specified format.
  Supported values are:
  - object: Each input is converted to JSON object format.
    - For bigquery, each row is converted to an object.
    - For jsonl, each line of the JSONL input must be an object.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  - array: Each input is converted to JSON array format.
    - For bigquery, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated; included_fields must be populated to specify the field order.
    - For jsonl, if each line of the JSONL input is an object, included_fields must be populated to specify the field order.
    - Does not apply to csv, file-list, tf-record, or tf-record-gzip.
  If not specified, Vertex AI converts the batch prediction input as follows:
  - For bigquery and csv, the behavior is the same as array. The order of columns is the same as defined in the file or table, unless included_fields is populated.
  - For jsonl, the prediction instance format is determined by each line of the input.
  - For tf-record/tf-record-gzip, each record will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the record.
  - For file-list, each file in the list will be converted to an object in the format of {"b64": <value>}, where <value> is the Base64-encoded string of the content of the file.
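Since instance_type is a plain string field, setting it is a one-liner; a hedged sketch:

```ruby
require "google/cloud/ai_platform/v1"

config = Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.new

# "object" keys each instance by field name; "array" sends positional values.
config.instance_type = "object"
config.instance_type #=> "object"
```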
#key_field
def key_field() -> ::String
Returns
- (::String) — The name of the field that is considered as a key.
  The values identified by the key field are not included in the transformed instances that are sent to the Model. This is similar to specifying the name of the field in excluded_fields. In addition, the batch prediction output will not include the instances. Instead, the output will only include the value of the key field, in a field named key in the output:
  - For jsonl output format, the output will have a key field instead of the instance field.
  - For csv/bigquery output format, the output will have a key column instead of the instance feature columns.
  The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
#key_field=
def key_field=(value) -> ::String
Parameter
- value (::String) — The name of the field that is considered as a key.
  The values identified by the key field are not included in the transformed instances that are sent to the Model. This is similar to specifying the name of the field in excluded_fields. In addition, the batch prediction output will not include the instances. Instead, the output will only include the value of the key field, in a field named key in the output:
  - For jsonl output format, the output will have a key field instead of the instance field.
  - For csv/bigquery output format, the output will have a key column instead of the instance feature columns.
  The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
Returns
- (::String) — The name of the field that is considered as a key.
  The values identified by the key field are not included in the transformed instances that are sent to the Model. This is similar to specifying the name of the field in excluded_fields. In addition, the batch prediction output will not include the instances. Instead, the output will only include the value of the key field, in a field named key in the output:
  - For jsonl output format, the output will have a key field instead of the instance field.
  - For csv/bigquery output format, the output will have a key column instead of the instance feature columns.
  The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
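A hedged sketch of key_field in use (the field name row_id is hypothetical):

```ruby
require "google/cloud/ai_platform/v1"

# Echo only a key back with each prediction instead of the full instance;
# "row_id" is a hypothetical field name in the input data.
config = Google::Cloud::AIPlatform::V1::BatchPredictionJob::InstanceConfig.new
config.key_field = "row_id"
config.key_field #=> "row_id"
```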