Configures the output of BatchPredictionJob. See Model.supported_output_storage_formats for supported output formats, and how predictions are expressed via any of them.
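For illustration, a minimal sketch of building an OutputConfig and attaching it to a BatchPredictionJob, assuming the usual keyword-argument construction of the generated protobuf messages; the bucket, project, and model names below are placeholders.

require "google/cloud/ai_platform/v1"

# Write JSONL prediction files to a Cloud Storage prefix (placeholder values).
output_config = ::Google::Cloud::AIPlatform::V1::BatchPredictionJob::OutputConfig.new(
  predictions_format: "jsonl",
  gcs_destination:    ::Google::Cloud::AIPlatform::V1::GcsDestination.new(
    output_uri_prefix: "gs://my-bucket/batch-output/"
  )
)

# The output config becomes part of the BatchPredictionJob that is later
# submitted through the JobService.
job = ::Google::Cloud::AIPlatform::V1::BatchPredictionJob.new(
  display_name:  "example-batch-job",
  model:         "projects/my-project/locations/us-central1/models/1234567890",
  output_config: output_config
)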
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#bigquery_destination
def bigquery_destination() -> ::Google::Cloud::AIPlatform::V1::BigQueryDestination
Returns
-
(::Google::Cloud::AIPlatform::V1::BigQueryDestination) — The BigQuery project or dataset location where the output is to be
written to. If only a project is provided, a new dataset is created with name
prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made
BigQuery-dataset-name compatible (for example, most special characters become
underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format based on
ISO-8601. In the dataset two tables are created, predictions and errors.
#bigquery_destination=
def bigquery_destination=(value) -> ::Google::Cloud::AIPlatform::V1::BigQueryDestination
Parameter
-
value (::Google::Cloud::AIPlatform::V1::BigQueryDestination) — The BigQuery project or dataset location where the output is to be
written to. If only a project is provided, a new dataset is created with name
prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made
BigQuery-dataset-name compatible (for example, most special characters become
underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format based on
ISO-8601. In the dataset two tables are created, predictions and errors.
Returns
-
(::Google::Cloud::AIPlatform::V1::BigQueryDestination) — The BigQuery project or dataset location where the output is to be
written to. If only a project is provided, a new dataset is created with name
prediction_<model-display-name>_<job-create-time>, where <model-display-name> is made
BigQuery-dataset-name compatible (for example, most special characters become
underscores) and the timestamp is in YYYY_MM_DDThh_mm_ss_sssZ format based on
ISO-8601. In the dataset two tables are created, predictions and errors.
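A usage sketch of setting a BigQuery output destination on an OutputConfig; it assumes the generated BigQueryDestination message with its output_uri_prefix field, and the project URI shown is a placeholder.

require "google/cloud/ai_platform/v1"

output_config = ::Google::Cloud::AIPlatform::V1::BatchPredictionJob::OutputConfig.new
output_config.predictions_format = "bigquery"

# "bq://my-project" is a placeholder; an existing dataset can be targeted
# instead with a URI of the form "bq://my-project.my_dataset".
output_config.bigquery_destination =
  ::Google::Cloud::AIPlatform::V1::BigQueryDestination.new(
    output_uri_prefix: "bq://my-project"
  )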
#gcs_destination
def gcs_destination() -> ::Google::Cloud::AIPlatform::V1::GcsDestination
Returns
-
(::Google::Cloud::AIPlatform::V1::GcsDestination) — The Cloud Storage location of the directory where the output is
to be written to. In the given directory a new directory is created. Its name is
prediction-<model-display-name>-<job-create-time>, where the timestamp is in
YYYY-MM-DDThh:mm:ss.sssZ ISO-8601 format. Inside of it, files
predictions_0001.<extension>, predictions_0002.<extension>, ...,
predictions_N.<extension> are created, where <extension> depends on the chosen
predictions_format and N (which may be as small as 0001) depends on the total
number of successfully predicted instances. If the Model has both instance and
prediction schemata defined, then each such file contains predictions as per the
predictions_format. If prediction for any instance failed (partially or
completely), then additional errors_0001.<extension>, errors_0002.<extension>,
..., errors_N.<extension> files are created (N depends on the total number of
failed predictions). These files contain the failed instances, as per their
schema, followed by an additional error field whose value is a google.rpc.Status
containing only the code and message fields.
#gcs_destination=
def gcs_destination=(value) -> ::Google::Cloud::AIPlatform::V1::GcsDestination
Parameter
-
value (::Google::Cloud::AIPlatform::V1::GcsDestination) — The Cloud Storage location of the directory where the output is
to be written to. In the given directory a new directory is created. Its name is
prediction-<model-display-name>-<job-create-time>, where the timestamp is in
YYYY-MM-DDThh:mm:ss.sssZ ISO-8601 format. Inside of it, files
predictions_0001.<extension>, predictions_0002.<extension>, ...,
predictions_N.<extension> are created, where <extension> depends on the chosen
predictions_format and N (which may be as small as 0001) depends on the total
number of successfully predicted instances. If the Model has both instance and
prediction schemata defined, then each such file contains predictions as per the
predictions_format. If prediction for any instance failed (partially or
completely), then additional errors_0001.<extension>, errors_0002.<extension>,
..., errors_N.<extension> files are created (N depends on the total number of
failed predictions). These files contain the failed instances, as per their
schema, followed by an additional error field whose value is a google.rpc.Status
containing only the code and message fields.
Returns
-
(::Google::Cloud::AIPlatform::V1::GcsDestination) — The Cloud Storage location of the directory where the output is
to be written to. In the given directory a new directory is created. Its name is
prediction-<model-display-name>-<job-create-time>, where the timestamp is in
YYYY-MM-DDThh:mm:ss.sssZ ISO-8601 format. Inside of it, files
predictions_0001.<extension>, predictions_0002.<extension>, ...,
predictions_N.<extension> are created, where <extension> depends on the chosen
predictions_format and N (which may be as small as 0001) depends on the total
number of successfully predicted instances. If the Model has both instance and
prediction schemata defined, then each such file contains predictions as per the
predictions_format. If prediction for any instance failed (partially or
completely), then additional errors_0001.<extension>, errors_0002.<extension>,
..., errors_N.<extension> files are created (N depends on the total number of
failed predictions). These files contain the failed instances, as per their
schema, followed by an additional error field whose value is a google.rpc.Status
containing only the code and message fields.
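A minimal sketch of the setter and getter pair, assuming a placeholder Cloud Storage prefix; the service creates the prediction-<model-display-name>-<job-create-time> directory described above underneath it.

require "google/cloud/ai_platform/v1"

output_config = ::Google::Cloud::AIPlatform::V1::BatchPredictionJob::OutputConfig.new
output_config.predictions_format = "jsonl"

# Placeholder bucket; output files land under a timestamped subdirectory.
output_config.gcs_destination =
  ::Google::Cloud::AIPlatform::V1::GcsDestination.new(
    output_uri_prefix: "gs://my-bucket/batch-output/"
  )

# The getter returns the GcsDestination message (nil while unset).
puts output_config.gcs_destination.output_uri_prefix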
#predictions_format
def predictions_format() -> ::String
Returns
- (::String) — Required. The format in which Vertex AI gives the predictions; it must be one of the Model's supported_output_storage_formats.
#predictions_format=
def predictions_format=(value) -> ::String
Parameter
- value (::String) — Required. The format in which Vertex AI gives the predictions; it must be one of the Model's supported_output_storage_formats.
Returns
- (::String) — Required. The format in which Vertex AI gives the predictions; it must be one of the Model's supported_output_storage_formats.
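Since the accepted values depend on the Model, the following sketch picks a format from the model's supported_output_storage_formats list; the regional endpoint and model resource name are placeholders, and credentials are assumed to be configured in the environment.

require "google/cloud/ai_platform/v1"

model_name = "projects/my-project/locations/us-central1/models/1234567890"

client = ::Google::Cloud::AIPlatform::V1::ModelService::Client.new do |config|
  config.endpoint = "us-central1-aiplatform.googleapis.com"
end

model = client.get_model name: model_name

# supported_output_storage_formats lists values such as "jsonl", "csv", or
# "bigquery"; choose one the model actually supports.
output_config = ::Google::Cloud::AIPlatform::V1::BatchPredictionJob::OutputConfig.new
output_config.predictions_format = model.supported_output_storage_formats.first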