public static interface BatchPredictionJob.InstanceConfigOrBuilder extends MessageOrBuilder
Implements
MessageOrBuilder
Methods
getExcludedFields(int index)
public abstract String getExcludedFields(int index)
Fields that will be excluded in the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified.
When excluded_fields is populated, included_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string excluded_fields = 4;
Parameter
| Name | Description |
|---|---|
| index | int The index of the element to return. |
Returns
| Type | Description |
|---|---|
| String | The excludedFields at the given index. |
getExcludedFieldsBytes(int index)
public abstract ByteString getExcludedFieldsBytes(int index)
Fields that will be excluded in the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified.
When excluded_fields is populated, included_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string excluded_fields = 4;
Parameter
| Name | Description |
|---|---|
| index | int The index of the value to return. |
Returns
| Type | Description |
|---|---|
| ByteString | The bytes of the excludedFields at the given index. |
getExcludedFieldsCount()
public abstract int getExcludedFieldsCount()
Fields that will be excluded in the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified.
When excluded_fields is populated, included_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string excluded_fields = 4;
Returns
| Type | Description |
|---|---|
| int | The count of excludedFields. |
getExcludedFieldsList()
public abstract List<String> getExcludedFieldsList()
Fields that will be excluded in the prediction instance that is sent to the Model. The excluded fields will be attached to the batch prediction output if key_field is not specified.
When excluded_fields is populated, included_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string excluded_fields = 4;
Returns
| Type | Description |
|---|---|
| List<String> | A list containing the excludedFields. |
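To make the excluded_fields semantics above concrete, here is a minimal standalone sketch (illustration only, not the Vertex AI implementation; `ExcludedFieldsDemo` and `applyExcludedFields` are hypothetical names) of how the named fields are dropped from each instance before it is sent to the Model:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the documented excluded_fields behavior: every listed field
// is removed from the transformed instance sent to the Model.
public class ExcludedFieldsDemo {
    static Map<String, Object> applyExcludedFields(Map<String, Object> row,
                                                   List<String> excludedFields) {
        Map<String, Object> instance = new LinkedHashMap<>(row);
        // Drop each excluded field from the instance.
        excludedFields.forEach(instance::remove);
        return instance;
    }

    public static void main(String[] args) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", "row-1");
        row.put("feature_a", 0.5);
        row.put("feature_b", 1.5);

        // "id" is excluded, so only the feature fields reach the Model.
        System.out.println(applyExcludedFields(row, List.of("id")));
        // {feature_a=0.5, feature_b=1.5}
    }
}
```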
getIncludedFields(int index)
public abstract String getIncludedFields(int index)
Fields that will be included in the prediction instance that is sent to the Model.
If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array.
When included_fields is populated, excluded_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string included_fields = 3;
Parameter
| Name | Description |
|---|---|
| index | int The index of the element to return. |
Returns
| Type | Description |
|---|---|
| String | The includedFields at the given index. |
getIncludedFieldsBytes(int index)
public abstract ByteString getIncludedFieldsBytes(int index)
Fields that will be included in the prediction instance that is sent to the Model.
If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array.
When included_fields is populated, excluded_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string included_fields = 3;
Parameter
| Name | Description |
|---|---|
| index | int The index of the value to return. |
Returns
| Type | Description |
|---|---|
| ByteString | The bytes of the includedFields at the given index. |
getIncludedFieldsCount()
public abstract int getIncludedFieldsCount()
Fields that will be included in the prediction instance that is sent to the Model.
If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array.
When included_fields is populated, excluded_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string included_fields = 3;
Returns
| Type | Description |
|---|---|
| int | The count of includedFields. |
getIncludedFieldsList()
public abstract List<String> getIncludedFieldsList()
Fields that will be included in the prediction instance that is sent to the Model.
If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array.
When included_fields is populated, excluded_fields must be empty.
The input must be JSONL with objects at each line, BigQuery, or TfRecord.
repeated string included_fields = 3;
Returns
| Type | Description |
|---|---|
| List<String> | A list containing the includedFields. |
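The ordering rule above — that included_fields fixes the order of values when instance_type is `array` — can be sketched as follows (illustration only; `IncludedFieldsDemo` and `toArrayInstance` are hypothetical names, not library API):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the documented included_fields behavior for the `array`
// instance type: the listed field names select the values and determine
// their order in the emitted array.
public class IncludedFieldsDemo {
    static List<Object> toArrayInstance(Map<String, Object> row,
                                        List<String> includedFields) {
        List<Object> values = new ArrayList<>();
        for (String field : includedFields) {
            values.add(row.get(field)); // order follows included_fields
        }
        return values;
    }

    public static void main(String[] args) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("feature_b", 1.5);
        row.put("feature_a", 0.5);

        // The array order follows included_fields, not the row's own order.
        System.out.println(toArrayInstance(row, List.of("feature_a", "feature_b")));
        // [0.5, 1.5]
    }
}
```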
getInstanceType()
public abstract String getInstanceType()
The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format.
Supported values are:
- `object`: Each input is converted to JSON object format.
  - For `bigquery`, each row is converted to an object.
  - For `jsonl`, each line of the JSONL input must be an object.
  - Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`.
- `array`: Each input is converted to JSON array format.
  - For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders.
  - For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders.
  - Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`.

If not specified, Vertex AI converts the batch prediction input as follows:
- For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated.
- For `jsonl`, the prediction instance format is determined by each line of the input.
- For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": <value>}`, where `<value>` is the Base64-encoded string of the content of the record.
- For `file-list`, each file in the list will be converted to an object in the format of `{"b64": <value>}`, where `<value>` is the Base64-encoded string of the content of the file.

string instance_type = 1;
Returns
| Type | Description |
|---|---|
| String | The instanceType. |
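The default `{"b64": <value>}` conversion described above for `tf-record` and `file-list` inputs can be sketched with plain `java.util.Base64` (illustration only; `B64InstanceDemo` and `toB64Instance` are hypothetical names, not the Vertex AI implementation):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch of the documented default conversion for tf-record / file-list
// inputs: each record or file becomes {"b64": <value>}, where <value> is
// the Base64-encoded string of the content.
public class B64InstanceDemo {
    static String toB64Instance(byte[] content) {
        return "{\"b64\": \"" + Base64.getEncoder().encodeToString(content) + "\"}";
    }

    public static void main(String[] args) {
        byte[] record = "hello".getBytes(StandardCharsets.UTF_8);
        System.out.println(toB64Instance(record)); // {"b64": "aGVsbG8="}
    }
}
```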
getInstanceTypeBytes()
public abstract ByteString getInstanceTypeBytes()
The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format.
Supported values are:
- `object`: Each input is converted to JSON object format.
  - For `bigquery`, each row is converted to an object.
  - For `jsonl`, each line of the JSONL input must be an object.
  - Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`.
- `array`: Each input is converted to JSON array format.
  - For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders.
  - For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders.
  - Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`.

If not specified, Vertex AI converts the batch prediction input as follows:
- For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated.
- For `jsonl`, the prediction instance format is determined by each line of the input.
- For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": <value>}`, where `<value>` is the Base64-encoded string of the content of the record.
- For `file-list`, each file in the list will be converted to an object in the format of `{"b64": <value>}`, where `<value>` is the Base64-encoded string of the content of the file.

string instance_type = 1;
Returns
| Type | Description |
|---|---|
| ByteString | The bytes for instanceType. |
getKeyField()
public abstract String getKeyField()
The name of the field that is considered as a key.
The values identified by the key field are not included in the transformed instances that are sent to the Model. This is similar to specifying the name of the field in excluded_fields. In addition, the batch prediction output will not include the instances. Instead the output will only include the value of the key field, in a field named `key` in the output:
- For `jsonl` output format, the output will have a `key` field instead of the `instance` field.
- For `csv`/`bigquery` output format, the output will have a `key` column instead of the instance feature columns.

The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
string key_field = 2;
Returns
| Type | Description |
|---|---|
| String | The keyField. |
getKeyFieldBytes()
public abstract ByteString getKeyFieldBytes()
The name of the field that is considered as a key.
The values identified by the key field are not included in the transformed instances that are sent to the Model. This is similar to specifying the name of the field in excluded_fields. In addition, the batch prediction output will not include the instances. Instead the output will only include the value of the key field, in a field named `key` in the output:
- For `jsonl` output format, the output will have a `key` field instead of the `instance` field.
- For `csv`/`bigquery` output format, the output will have a `key` column instead of the instance feature columns.

The input must be JSONL with objects at each line, CSV, BigQuery, or TfRecord.
string key_field = 2;
Returns
| Type | Description |
|---|---|
| ByteString | The bytes for keyField. |
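The key_field behavior — the key value replaces the instance in the batch prediction output — can be sketched as follows (illustration only; `KeyFieldDemo`, `buildOutputRow`, and the field names are hypothetical, not library API or the actual output schema):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the documented key_field behavior: the key value is removed
// from the instance sent to the Model, and the output row carries only a
// `key` field in place of the instance.
public class KeyFieldDemo {
    static Map<String, Object> buildOutputRow(Map<String, Object> row,
                                              String keyField,
                                              Object prediction) {
        Map<String, Object> out = new LinkedHashMap<>();
        out.put("key", row.get(keyField)); // only the key, not the instance
        out.put("prediction", prediction);
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", "row-1");
        row.put("feature_a", 0.5);

        // The output row identifies the input by key instead of echoing it.
        System.out.println(buildOutputRow(row, "id", 0.92));
        // {key=row-1, prediction=0.92}
    }
}
```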