public static final class BatchPredictionJob.OutputConfig.Builder extends GeneratedMessageV3.Builder<BatchPredictionJob.OutputConfig.Builder> implements BatchPredictionJob.OutputConfigOrBuilder
Configures the output of BatchPredictionJob. See Model.supported_output_storage_formats for supported output formats, and how predictions are expressed via any of them.

Protobuf type google.cloud.aiplatform.v1beta1.BatchPredictionJob.OutputConfig
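A minimal usage sketch of this builder, assuming the standard v1beta1 message shapes: the bucket path and the "jsonl" format value are illustrative, and GcsDestination.setOutputUriPrefix is assumed to be the generated accessor for its output_uri_prefix field.

import com.google.cloud.aiplatform.v1beta1.BatchPredictionJob;
import com.google.cloud.aiplatform.v1beta1.GcsDestination;

public class OutputConfigExample {
  public static void main(String[] args) {
    // Destination directory for the prediction files (illustrative bucket and path).
    GcsDestination gcsDestination =
        GcsDestination.newBuilder()
            .setOutputUriPrefix("gs://my-bucket/batch-predictions/")
            .build();

    // predictions_format must be one of the Model's supported_output_storage_formats;
    // "jsonl" is only an example value.
    BatchPredictionJob.OutputConfig outputConfig =
        BatchPredictionJob.OutputConfig.newBuilder()
            .setPredictionsFormat("jsonl")
            .setGcsDestination(gcsDestination)
            .build();

    System.out.println(outputConfig);
  }
}

The resulting OutputConfig is typically attached to a BatchPredictionJob through that message's own builder before the job is submitted.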
Inherited Members
com.google.protobuf.GeneratedMessageV3.Builder.getUnknownFieldSetBuilder()
com.google.protobuf.GeneratedMessageV3.Builder.mergeUnknownLengthDelimitedField(int,com.google.protobuf.ByteString)
com.google.protobuf.GeneratedMessageV3.Builder.mergeUnknownVarintField(int,int)
com.google.protobuf.GeneratedMessageV3.Builder.parseUnknownField(com.google.protobuf.CodedInputStream,com.google.protobuf.ExtensionRegistryLite,int)
com.google.protobuf.GeneratedMessageV3.Builder.setUnknownFieldSetBuilder(com.google.protobuf.UnknownFieldSet.Builder)
Static Methods
public static final Descriptors.Descriptor getDescriptor()
Methods
public BatchPredictionJob.OutputConfig.Builder addRepeatedField(Descriptors.FieldDescriptor field, Object value)
Overrides
public BatchPredictionJob.OutputConfig build()
public BatchPredictionJob.OutputConfig buildPartial()
public BatchPredictionJob.OutputConfig.Builder clear()
Overrides
public BatchPredictionJob.OutputConfig.Builder clearBigqueryDestination()
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BatchPredictionJob.OutputConfig.Builder clearDestination()
public BatchPredictionJob.OutputConfig.Builder clearField(Descriptors.FieldDescriptor field)
Overrides
public BatchPredictionJob.OutputConfig.Builder clearGcsDestination()
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public BatchPredictionJob.OutputConfig.Builder clearOneof(Descriptors.OneofDescriptor oneof)
Overrides
public BatchPredictionJob.OutputConfig.Builder clearPredictionsFormat()
Required. The format in which Vertex AI gives the predictions; must be one of the Model's supported_output_storage_formats.

string predictions_format = 1 [(.google.api.field_behavior) = REQUIRED];
public BatchPredictionJob.OutputConfig.Builder clone()
Overrides
public BigQueryDestination getBigqueryDestination()
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BigQueryDestination.Builder getBigqueryDestinationBuilder()
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BigQueryDestinationOrBuilder getBigqueryDestinationOrBuilder()
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BatchPredictionJob.OutputConfig getDefaultInstanceForType()
public Descriptors.Descriptor getDescriptorForType()
Overrides
public BatchPredictionJob.OutputConfig.DestinationCase getDestinationCase()
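gcs_destination and bigquery_destination form the destination oneof, and getDestinationCase() reports which of them, if either, is currently set. A brief sketch, assuming the usual generated enum constants for these field names and the illustrative URI below:

BatchPredictionJob.OutputConfig.Builder builder =
    BatchPredictionJob.OutputConfig.newBuilder()
        .setGcsDestination(
            GcsDestination.newBuilder().setOutputUriPrefix("gs://my-bucket/out/").build());

switch (builder.getDestinationCase()) {
  case GCS_DESTINATION:
    // The Cloud Storage destination is the active member of the oneof.
    System.out.println(builder.getGcsDestination().getOutputUriPrefix());
    break;
  case BIGQUERY_DESTINATION:
    System.out.println(builder.getBigqueryDestination().getOutputUri());
    break;
  case DESTINATION_NOT_SET:
    System.out.println("No destination configured yet.");
    break;
}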
public GcsDestination getGcsDestination()
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public GcsDestination.Builder getGcsDestinationBuilder()
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public GcsDestinationOrBuilder getGcsDestinationOrBuilder()
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public String getPredictionsFormat()
Required. The format in which Vertex AI gives the predictions; must be one of the Model's supported_output_storage_formats.

string predictions_format = 1 [(.google.api.field_behavior) = REQUIRED];

Returns
| Type | Description |
| --- | --- |
| String | The predictionsFormat. |
public ByteString getPredictionsFormatBytes()
Required. The format in which Vertex AI gives the predictions; must be one of the Model's supported_output_storage_formats.

string predictions_format = 1 [(.google.api.field_behavior) = REQUIRED];

Returns
| Type | Description |
| --- | --- |
| ByteString | The bytes for predictionsFormat. |
public boolean hasBigqueryDestination()
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;

Returns
| Type | Description |
| --- | --- |
| boolean | Whether the bigqueryDestination field is set. |
public boolean hasGcsDestination()
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;

Returns
| Type | Description |
| --- | --- |
| boolean | Whether the gcsDestination field is set. |
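Because both destinations live in the destination oneof, setting one clears the other, and the has* accessors reflect whichever was set last. A short sketch with illustrative URIs, assuming setOutputUri is the accessor for BigQueryDestination's output_uri field:

BatchPredictionJob.OutputConfig.Builder builder = BatchPredictionJob.OutputConfig.newBuilder();

builder.setBigqueryDestination(
    BigQueryDestination.newBuilder().setOutputUri("bq://my-project").build());
// hasBigqueryDestination() == true, hasGcsDestination() == false

builder.setGcsDestination(
    GcsDestination.newBuilder().setOutputUriPrefix("gs://my-bucket/out/").build());
// Setting the Cloud Storage destination replaces the BigQuery one:
// hasGcsDestination() == true, hasBigqueryDestination() == false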
protected GeneratedMessageV3.FieldAccessorTable internalGetFieldAccessorTable()
Overrides
public final boolean isInitialized()
Overrides
public BatchPredictionJob.OutputConfig.Builder mergeBigqueryDestination(BigQueryDestination value)
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BatchPredictionJob.OutputConfig.Builder mergeFrom(BatchPredictionJob.OutputConfig other)
public BatchPredictionJob.OutputConfig.Builder mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
Overrides
public BatchPredictionJob.OutputConfig.Builder mergeFrom(Message other)
Parameter
| Name | Description |
| --- | --- |
| other | Message |
Overrides
public BatchPredictionJob.OutputConfig.Builder mergeGcsDestination(GcsDestination value)
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public final BatchPredictionJob.OutputConfig.Builder mergeUnknownFields(UnknownFieldSet unknownFields)
Overrides
public BatchPredictionJob.OutputConfig.Builder setBigqueryDestination(BigQueryDestination value)
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
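A short sketch of directing output to a BigQuery project so that a prediction_<model-display-name>_<job-create-time> dataset is created there. The bq:// URI, project ID, and the "bigquery" format value are illustrative, and setOutputUri is assumed to be the accessor for BigQueryDestination's output_uri field.

BatchPredictionJob.OutputConfig outputConfig =
    BatchPredictionJob.OutputConfig.newBuilder()
        // Example format value; it must appear in the Model's supported_output_storage_formats.
        .setPredictionsFormat("bigquery")
        .setBigqueryDestination(
            BigQueryDestination.newBuilder()
                // Project-level URI: a new dataset is created under this project.
                .setOutputUri("bq://my-project")
                .build())
        .build();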
public BatchPredictionJob.OutputConfig.Builder setBigqueryDestination(BigQueryDestination.Builder builderForValue)
The BigQuery project or dataset location where the output is to be written. If a project is provided, a new dataset is created with the name prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made BigQuery-dataset-name compatible (for example, most special characters become underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format (based on ISO 8601). Two tables are created in the dataset: predictions and errors. If the Model has both instance and prediction schemata defined, the tables have the following columns: the predictions table contains the instances for which the prediction succeeded, with columns that are a concatenation of the Model's instance and prediction schemata; the errors table contains the rows for which the prediction failed, with the instance columns (as per the instance schema) followed by a single "errors" column, whose values are google.rpc.Status represented as a STRUCT containing only code and message.

.google.cloud.aiplatform.v1beta1.BigQueryDestination bigquery_destination = 3;
public BatchPredictionJob.OutputConfig.Builder setField(Descriptors.FieldDescriptor field, Object value)
Overrides
public BatchPredictionJob.OutputConfig.Builder setGcsDestination(GcsDestination value)
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public BatchPredictionJob.OutputConfig.Builder setGcsDestination(GcsDestination.Builder builderForValue)
The Cloud Storage location of the directory where the output is to be written. In the given directory a new directory is created. Its name is prediction-<model-display-name>-<job-create-time>, where the timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO 8601 format. Inside of it, files predictions_0001.<extension>, predictions_0002.<extension>, ..., predictions_N.<extension> are created, where <extension> depends on the chosen predictions_format and N may equal 0001, depending on the total number of successfully predicted instances. If the Model has both instance and prediction schemata defined, each such file contains predictions as per the predictions_format. If prediction for any instance failed (partially or completely), additional files errors_0001.<extension>, errors_0002.<extension>, ..., errors_N.<extension> are created (N depends on the total number of failed predictions). These files contain the failed instances, as per their schema, followed by an additional error field whose value is a google.rpc.Status containing only the code and message fields.

.google.cloud.aiplatform.v1beta1.GcsDestination gcs_destination = 2;
public BatchPredictionJob.OutputConfig.Builder setPredictionsFormat(String value)
Required. The format in which Vertex AI gives the predictions; must be one of the Model's supported_output_storage_formats.

string predictions_format = 1 [(.google.api.field_behavior) = REQUIRED];

Parameter
| Name | Description |
| --- | --- |
| value | String. The predictionsFormat to set. |
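A hedged sketch of choosing a value that the Model actually supports before calling setPredictionsFormat. getSupportedOutputStorageFormatsList() is assumed to be the generated accessor for Model.supported_output_storage_formats, and the preference for "jsonl" is purely illustrative.

static BatchPredictionJob.OutputConfig.Builder outputConfigFor(Model model) {
  java.util.List<String> supported = model.getSupportedOutputStorageFormatsList();
  // Prefer JSON Lines when the Model supports it, otherwise fall back to the
  // first format the Model declares.
  String format = supported.contains("jsonl") ? "jsonl" : supported.get(0);
  return BatchPredictionJob.OutputConfig.newBuilder().setPredictionsFormat(format);
}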
public BatchPredictionJob.OutputConfig.Builder setPredictionsFormatBytes(ByteString value)
Required. The format in which Vertex AI gives the predictions; must be one of the Model's supported_output_storage_formats.

string predictions_format = 1 [(.google.api.field_behavior) = REQUIRED];

Parameter
| Name | Description |
| --- | --- |
| value | ByteString. The bytes for predictionsFormat to set. |
public BatchPredictionJob.OutputConfig.Builder setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value)
Overrides
public final BatchPredictionJob.OutputConfig.Builder setUnknownFields(UnknownFieldSet unknownFields)
Overrides