Cloud Dataproc V1 API - Class Google::Cloud::Dataproc::V1::Batch (v1.2.0)

Reference documentation and code samples for the Cloud Dataproc V1 API class Google::Cloud::Dataproc::V1::Batch.

A representation of a batch workload in the service.
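
For context, a minimal sketch of building a Batch message and submitting it through the companion BatchController client. The project, region, bucket URI, and batch ID below are placeholders, not values from this reference:

    require "google/cloud/dataproc/v1"

    # Build the Batch message locally; fields mirror the accessors documented below.
    batch = Google::Cloud::Dataproc::V1::Batch.new(
      pyspark_batch: Google::Cloud::Dataproc::V1::PySparkBatch.new(
        main_python_file_uri: "gs://my-bucket/job.py" # placeholder URI
      )
    )

    # Submit it; create_batch returns a long-running operation.
    client = Google::Cloud::Dataproc::V1::BatchController::Client.new
    operation = client.create_batch parent:   "projects/my-project/locations/us-central1",
                                    batch:    batch,
                                    batch_id: "example-batch" # placeholder ID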

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#create_time

def create_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. The time when the batch was created.

#creator

def creator() -> ::String
Returns
  • (::String) — Output only. The email address of the user who created the batch.

#environment_config

def environment_config() -> ::Google::Cloud::Dataproc::V1::EnvironmentConfig
Returns
  • (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.

#environment_config=

def environment_config=(value) -> ::Google::Cloud::Dataproc::V1::EnvironmentConfig
Parameter
  • value (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.
Returns
  • (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.
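
A hedged sketch of populating this field; the service account is a placeholder, and ExecutionConfig is one of the sub-messages EnvironmentConfig accepts:

    batch.environment_config = Google::Cloud::Dataproc::V1::EnvironmentConfig.new(
      execution_config: Google::Cloud::Dataproc::V1::ExecutionConfig.new(
        service_account: "batch-runner@my-project.iam.gserviceaccount.com" # placeholder
      )
    )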

#labels

def labels() -> ::Google::Protobuf::Map{::String => ::String}
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.

#labels=

def labels=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
  • value (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
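
A short sketch of attaching labels within the documented limits; the key and value strings are illustrative:

    batch = Google::Cloud::Dataproc::V1::Batch.new
    # Each key and each non-empty value: 1 to 63 characters, RFC 1035 compliant.
    batch.labels["env"]   = "staging"
    batch.labels["owner"] = "analytics"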

#name

def name() -> ::String
Returns
  • (::String) — Output only. The resource name of the batch.

#operation

def operation() -> ::String
Returns
  • (::String) — Output only. The resource name of the operation associated with this batch.

#pyspark_batch

def pyspark_batch() -> ::Google::Cloud::Dataproc::V1::PySparkBatch
Returns
  • (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#pyspark_batch=

def pyspark_batch=(value) -> ::Google::Cloud::Dataproc::V1::PySparkBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.
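
A minimal sketch of the mutually exclusive behavior described above (URIs are placeholders):

    batch = Google::Cloud::Dataproc::V1::Batch.new
    batch.pyspark_batch = Google::Cloud::Dataproc::V1::PySparkBatch.new(
      main_python_file_uri: "gs://my-bucket/job.py" # placeholder
    )

    # Populating another member of the set clears pyspark_batch.
    batch.spark_sql_batch = Google::Cloud::Dataproc::V1::SparkSqlBatch.new(
      query_file_uri: "gs://my-bucket/query.sql" # placeholder
    )
    batch.pyspark_batch # => nil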

#runtime_config

def runtime_config() -> ::Google::Cloud::Dataproc::V1::RuntimeConfig
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.

#runtime_config=

def runtime_config=(value) -> ::Google::Cloud::Dataproc::V1::RuntimeConfig
Parameter
  • value (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.
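
A hedged sketch of setting a runtime configuration; the version string and Spark property are illustrative, not defaults from this reference:

    batch.runtime_config = Google::Cloud::Dataproc::V1::RuntimeConfig.new(
      version:    "2.2", # placeholder runtime version
      properties: { "spark.executor.instances" => "4" }
    )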

#runtime_info

def runtime_info() -> ::Google::Cloud::Dataproc::V1::RuntimeInfo
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeInfo) — Output only. Runtime information about batch execution.

#spark_batch

def spark_batch() -> ::Google::Cloud::Dataproc::V1::SparkBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_batch=

def spark_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_r_batch

def spark_r_batch() -> ::Google::Cloud::Dataproc::V1::SparkRBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_r_batch=

def spark_r_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkRBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_sql_batch

def spark_sql_batch() -> ::Google::Cloud::Dataproc::V1::SparkSqlBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_sql_batch=

def spark_sql_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkSqlBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#state

def state() -> ::Google::Cloud::Dataproc::V1::Batch::State
Returns
  • (::Google::Cloud::Dataproc::V1::Batch::State) — Output only. The state of the batch.

#state_history

def state_history() -> ::Array<::Google::Cloud::Dataproc::V1::Batch::StateHistory>
Returns
  • (::Array<::Google::Cloud::Dataproc::V1::Batch::StateHistory>) — Output only. Historical state information for the batch.

#state_message

def state_message() -> ::String
Returns
  • (::String) — Output only. Batch state details, such as a failure description if the state is FAILED.

#state_time

def state_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. The time when the batch entered a current state.
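
A short sketch of inspecting these output-only state fields on a fetched batch; enum values surface as Ruby symbols:

    puts "Batch failed: #{batch.state_message}" if batch.state == :FAILED

    batch.state_history.each do |entry|
      puts "#{entry.state} entered at #{entry.state_entered_time}"
    end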

#uuid

def uuid() -> ::String
Returns
  • (::String) — Output only. A batch UUID (Universally Unique Identifier). The service generates this value when it creates the batch.