Index

BatchController (interface)
Batch (message)
Batch.State (enum)
Batch.StateHistory (message)
BatchOperationMetadata (message)
BatchOperationMetadata.BatchOperationType (enum)
CreateBatchRequest (message)
DeleteBatchRequest (message)
DiagnoseClusterResults (message)
EnvironmentConfig (message)
ExecutionConfig (message)
GetBatchRequest (message)
ListBatchesRequest (message)
ListBatchesResponse (message)
PeripheralsConfig (message)
PySparkBatch (message)
RuntimeConfig (message)
RuntimeInfo (message)
SparkBatch (message)
SparkHistoryServerConfig (message)
SparkRBatch (message)
SparkSqlBatch (message)
BatchController
The BatchController provides methods to manage batch workloads.
Methods | Description
---|---
CreateBatch | Creates a batch workload that executes asynchronously.
DeleteBatch | Deletes the batch workload resource. If the batch is not in a terminal state, the delete fails and the response returns FAILED_PRECONDITION.
GetBatch | Gets the batch workload resource representation.
ListBatches | Lists batch workloads.
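For orientation, here is a minimal sketch of calling CreateBatch through the generated Python client (google-cloud-dataproc). The project, region, bucket, and batch ID are placeholder values, not part of this reference.

```python
from google.cloud import dataproc_v1

region = "us-central1"  # placeholder region
# The Batch API is regional, so point the client at the regional endpoint.
client = dataproc_v1.BatchControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

# A Batch with exactly one framework config set (here, PySpark).
batch = dataproc_v1.Batch(
    pyspark_batch=dataproc_v1.PySparkBatch(
        main_python_file_uri="gs://my-bucket/app.py"  # placeholder URI
    )
)

# CreateBatch is asynchronous and returns a long-running operation.
operation = client.create_batch(
    parent=f"projects/my-project/locations/{region}",  # placeholder project
    batch=batch,
    batch_id="my-batch-0001",  # placeholder, 4-63 characters
)
finished = operation.result()  # block until the operation completes
print(finished.name, finished.state)
```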
Batch
A representation of a batch workload in the service.
Fields | Description
---|---
name | Output only. The resource name of the batch.
uuid | Output only. A batch UUID (Unique Universal Identifier). The service generates this value when it creates the batch.
create_time | Output only. The time when the batch was created.
runtime_info | Output only. Runtime information about batch execution.
state | Output only. The state of the batch.
state_message | Output only. Batch state details, such as a failure description if the state is FAILED.
state_time | Output only. The time when the batch entered its current state.
creator | Output only. The email address of the user who created the batch.
labels | Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
runtime_config | Optional. Runtime configuration for the batch execution.
environment_config | Optional. Environment configuration for the batch execution.
operation | Output only. The resource name of the operation associated with this batch.
state_history[] | Output only. Historical state information for the batch.
Union field batch_config. The application/framework-specific portion of the batch configuration. batch_config can be only one of the following: |
pyspark_batch | Optional. PySpark batch config.
spark_batch | Optional. Spark batch config.
spark_r_batch | Optional. SparkR batch config.
spark_sql_batch | Optional. SparkSql batch config.
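Because batch_config is a oneof, setting one framework config clears any other. A small sketch of that behavior with the Python proto-plus types; the WhichOneof check goes through the underlying protobuf message via the proto-plus pb() helper:

```python
from google.cloud import dataproc_v1

batch = dataproc_v1.Batch()
batch.pyspark_batch = dataproc_v1.PySparkBatch(main_python_file_uri="gs://b/app.py")
batch.spark_sql_batch = dataproc_v1.SparkSqlBatch(query_file_uri="gs://b/q.sql")

# Assigning spark_sql_batch cleared pyspark_batch: only one member of the
# batch_config oneof can be set at a time.
print(dataproc_v1.Batch.pb(batch).WhichOneof("batch_config"))  # "spark_sql_batch"
```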
State
The batch state.
Enums | Description
---|---
STATE_UNSPECIFIED | The batch state is unknown.
PENDING | The batch is created before running.
RUNNING | The batch is running.
CANCELLING | The batch is cancelling.
CANCELLED | The batch cancellation was successful.
SUCCEEDED | The batch completed successfully.
FAILED | The batch is no longer running due to an error.
StateHistory
Historical state information.
Fields | Description
---|---
state | Output only. The state of the batch at this point in history.
state_message | Output only. Details about the state at this point in history.
state_start_time | Output only. The time when the batch entered the historical state.
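As an illustrative sketch (client setup omitted; names are placeholders), polling a batch until it reaches one of the terminal states above and then reading its state history:

```python
import time

from google.cloud import dataproc_v1

TERMINAL = {
    dataproc_v1.Batch.State.SUCCEEDED,
    dataproc_v1.Batch.State.FAILED,
    dataproc_v1.Batch.State.CANCELLED,
}

def wait_for_batch(client: dataproc_v1.BatchControllerClient, name: str) -> dataproc_v1.Batch:
    """Poll GetBatch until the batch leaves PENDING/RUNNING/CANCELLING."""
    while True:
        batch = client.get_batch(name=name)
        if batch.state in TERMINAL:
            return batch
        time.sleep(10)

# After completion, state_history records each state the batch passed through:
# for entry in batch.state_history:
#     print(entry.state, entry.state_start_time, entry.state_message)
```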
BatchOperationMetadata
Metadata describing the Batch operation.
Fields | Description
---|---
batch | Name of the batch for the operation.
batch_uuid | Batch UUID for the operation.
create_time | The time when the operation was created.
done_time | The time when the operation finished.
operation_type | The operation type.
description | Short description of the operation.
labels | Labels associated with the operation.
warnings[] | Warnings encountered during operation execution.
BatchOperationType
Operation type for Batch resources.

Enums | Description
---|---
BATCH_OPERATION_TYPE_UNSPECIFIED | Batch operation type is unknown.
BATCH | Batch operation type.
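The long-running operation returned by CreateBatch carries this metadata; a sketch of reading it from the Python client's operation object (the operation comes from a create_batch call as in the earlier sketch):

```python
def describe_operation(operation) -> None:
    """Print the BatchOperationMetadata attached to a CreateBatch operation."""
    md = operation.metadata  # deserialized BatchOperationMetadata message
    print("batch:", md.batch)
    print("batch_uuid:", md.batch_uuid)
    print("operation_type:", md.operation_type.name)  # e.g. BATCH
    for warning in md.warnings:
        print("warning:", warning)
```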
CreateBatchRequest
A request to create a batch workload.
Fields | Description
---|---
parent | Required. The parent resource where this batch will be created. Authorization requires the following IAM permission on the specified resource parent: dataproc.batches.create
batch | Required. The batch to create.
batch_id | Optional. The ID to use for the batch, which will become the final component of the batch's resource name. This value must be 4-63 characters. Valid characters are /[a-z][0-9]-/.
request_id | Optional. A unique ID used to identify the request. If the service receives two CreateBatchRequests with the same request_id, the second request is ignored, and the Operation that corresponds to the first Batch created and stored in the backend is returned. Recommendation: Set this value to a UUID. The value must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). The maximum length is 40 characters.
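A sketch of the request_id recommendation: with a fixed UUID, a retried CreateBatch returns the original operation instead of creating a duplicate workload (placeholder names as before):

```python
import uuid

from google.cloud import dataproc_v1

request = dataproc_v1.CreateBatchRequest(
    parent="projects/my-project/locations/us-central1",  # placeholder
    batch=dataproc_v1.Batch(
        spark_sql_batch=dataproc_v1.SparkSqlBatch(query_file_uri="gs://b/q.sql")
    ),
    batch_id="nightly-report-0001",  # placeholder
    request_id=str(uuid.uuid4()),  # reuse this exact value when retrying
)
# client.create_batch(request=request) can now be retried safely: the service
# ignores a duplicate request_id and returns the original Operation.
```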
DeleteBatchRequest
A request to delete a batch workload.
Fields | Description
---|---
name | Required. The name of the batch resource to delete. Authorization requires the following IAM permission on the specified resource name: dataproc.batches.delete
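Deleting by resource name, sketched with a placeholder name (recall from the method table that the delete fails unless the batch is in a terminal state):

```python
from google.cloud import dataproc_v1

def delete_batch(client: dataproc_v1.BatchControllerClient, name: str) -> None:
    # name: "projects/{project}/locations/{location}/batches/{batch_id}"
    client.delete_batch(name=name)  # fails with FAILED_PRECONDITION if still running
```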
DiagnoseClusterResults
The location of diagnostic output.
Fields | Description
---|---
output_uri | Output only. The Cloud Storage URI of the diagnostic output. The output report is a plain text file with a summary of collected diagnostics.
EnvironmentConfig
Environment configuration for a workload.
Fields | Description
---|---
execution_config | Optional. Execution configuration for a workload.
peripherals_config | Optional. Peripherals configuration that the workload has access to.
ExecutionConfig
Execution configuration for a workload.
Fields | Description
---|---
service_account | Optional. Service account used to execute the workload.
network_tags[] | Optional. Tags used for network traffic control.
kms_key | Optional. The Cloud KMS key to use for encryption.
Union field network. Network configuration for workload execution. network can be only one of the following: |
network_uri | Optional. Network URI to connect the workload to.
subnetwork_uri | Optional. Subnetwork URI to connect the workload to.
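A sketch combining the two messages above: an EnvironmentConfig wrapping an ExecutionConfig that picks one member of the network oneof (all resource names are placeholders):

```python
from google.cloud import dataproc_v1

env = dataproc_v1.EnvironmentConfig(
    execution_config=dataproc_v1.ExecutionConfig(
        service_account="batch-runner@my-project.iam.gserviceaccount.com",  # placeholder
        network_tags=["dataproc-batch"],  # placeholder tag
        # network is a oneof: set either network_uri or subnetwork_uri, not both.
        subnetwork_uri="my-subnet",  # placeholder
    )
)
batch = dataproc_v1.Batch(
    pyspark_batch=dataproc_v1.PySparkBatch(main_python_file_uri="gs://b/app.py"),
    environment_config=env,
)
```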
GetBatchRequest
A request to get the resource representation for a batch workload.
Fields | Description
---|---
name | Required. The name of the batch to retrieve. Authorization requires the following IAM permission on the specified resource name: dataproc.batches.get
ListBatchesRequest
A request to list batch workloads in a project.
Fields | Description
---|---
parent | Required. The parent, which owns this collection of batches. Authorization requires the following IAM permission on the specified resource parent: dataproc.batches.list
page_size | Optional. The maximum number of batches to return in each response. The service may return fewer than this value. The default page size is 20; the maximum page size is 1000.
page_token | Optional. A page token received from a previous ListBatches call. Provide this token to retrieve the subsequent page.
ListBatchesResponse
A list of batch workloads.
Fields | Description
---|---
batches[] | The batches from the specified collection.
next_page_token | A token, which can be sent as page_token to retrieve the next page. If this field is omitted, there are no subsequent pages.
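Pagination sketch: the generated Python client returns a pager that follows next_page_token automatically, so explicit token handling is rarely needed (placeholder parent):

```python
from google.cloud import dataproc_v1

def list_all_batches(client: dataproc_v1.BatchControllerClient, parent: str):
    # parent: "projects/{project}/locations/{location}"
    request = dataproc_v1.ListBatchesRequest(parent=parent, page_size=50)
    # The pager transparently passes each next_page_token back as page_token.
    for batch in client.list_batches(request=request):
        yield batch.name, batch.state
```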
PeripheralsConfig
Auxiliary services configuration for a workload.
Fields | Description
---|---
metastore_service | Optional. Resource name of an existing Dataproc Metastore service. Example: projects/[project_id]/locations/[region]/services/[service_id]
spark_history_server_config | Optional. The Spark History Server configuration for the workload.
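A sketch wiring a workload to auxiliary services through PeripheralsConfig (both resource names below are placeholders following the documented formats; SparkHistoryServerConfig is defined later in this reference):

```python
from google.cloud import dataproc_v1

peripherals = dataproc_v1.PeripheralsConfig(
    metastore_service="projects/my-project/locations/us-central1/services/my-metastore",
    spark_history_server_config=dataproc_v1.SparkHistoryServerConfig(
        dataproc_cluster="projects/my-project/regions/us-central1/clusters/my-phs-cluster",
    ),
)
env = dataproc_v1.EnvironmentConfig(peripherals_config=peripherals)
```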
PySparkBatch
A configuration for running an Apache PySpark batch workload.
Fields | Description
---|---
main_python_file_uri | Required. The HCFS URI of the main Python file to use as the Spark driver. Must be a .py file.
args[] | Optional. The arguments to pass to the driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
python_file_uris[] | Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
jar_file_uris[] | Optional. HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
file_uris[] | Optional. HCFS URIs of files to be placed in the working directory of each executor.
archive_uris[] | Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
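A fuller PySparkBatch sketch exercising the fields above (all URIs are placeholders; note that args carries application arguments, while --conf style settings belong in RuntimeConfig.properties):

```python
from google.cloud import dataproc_v1

pyspark = dataproc_v1.PySparkBatch(
    main_python_file_uri="gs://my-bucket/jobs/etl.py",
    args=["--input", "gs://my-bucket/in/", "--output", "gs://my-bucket/out/"],
    python_file_uris=["gs://my-bucket/libs/helpers.zip"],    # .py, .egg, or .zip
    jar_file_uris=["gs://my-bucket/jars/spark-bigquery.jar"],
    file_uris=["gs://my-bucket/conf/app.yaml"],              # copied to each executor
    archive_uris=["gs://my-bucket/env/venv.tar.gz"],         # extracted on each executor
)
```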
RuntimeConfig
Runtime configuration for a workload.
Fields | Description
---|---
version | Optional. Version of the batch runtime.
container_image | Optional. Custom container image for the job runtime environment. If not specified, a default container image will be used.
properties | Optional. A mapping of property names to values, which are used to configure workload execution.
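Spark settings go through the properties map rather than driver args; a sketch (the version string and property values are illustrative, not prescribed by this reference):

```python
from google.cloud import dataproc_v1

runtime = dataproc_v1.RuntimeConfig(
    version="2.2",  # illustrative runtime version
    properties={
        "spark.executor.instances": "4",
        "spark.driver.memory": "8g",
    },
)
batch = dataproc_v1.Batch(
    pyspark_batch=dataproc_v1.PySparkBatch(main_python_file_uri="gs://b/app.py"),
    runtime_config=runtime,
)
```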
RuntimeInfo
Runtime information about workload execution.
Fields | Description
---|---
endpoints | Output only. Map of remote access endpoints (such as web interfaces and APIs) to their URIs.
output_uri | Output only. A URI pointing to the location of the stdout and stderr of the workload.
diagnostic_output_uri | Output only. A URI pointing to the location of the diagnostics tarball.
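Once a batch finishes, its RuntimeInfo can be read off the fetched resource; a sketch:

```python
from google.cloud import dataproc_v1

def show_runtime_info(batch: dataproc_v1.Batch) -> None:
    info = batch.runtime_info
    print("driver output:", info.output_uri)           # stdout/stderr location
    print("diagnostics:", info.diagnostic_output_uri)  # diagnostics tarball
    for label, uri in info.endpoints.items():          # e.g. web interfaces
        print(f"endpoint {label}: {uri}")
```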
SparkBatch
A configuration for running an Apache Spark batch workload.
Fields | Description
---|---
args[] | Optional. The arguments to pass to the driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
jar_file_uris[] | Optional. HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
file_uris[] | Optional. HCFS URIs of files to be placed in the working directory of each executor.
archive_uris[] | Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Union field driver. The specification of the main method to call to drive the Spark workload. Specify either the jar file that contains the main class or the main class name. To pass both a main jar and a main class in that jar, add the jar to jar_file_uris, and then specify the main class name in main_class. driver can be only one of the following: |
main_jar_file_uri | Optional. The HCFS URI of the jar file that contains the main class.
main_class | Optional. The name of the driver main class. The jar file that contains the class must be in the classpath or specified in jar_file_uris.
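The driver oneof gives two ways to name the entry point; a sketch of each (jar URIs and class name are placeholders):

```python
from google.cloud import dataproc_v1

# Option 1: point directly at the jar that contains the main class.
by_jar = dataproc_v1.SparkBatch(
    main_jar_file_uri="gs://my-bucket/jars/app.jar",
)

# Option 2: name the main class, supplying its jar via jar_file_uris.
by_class = dataproc_v1.SparkBatch(
    main_class="com.example.App",
    jar_file_uris=["gs://my-bucket/jars/app.jar"],
)
```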
SparkHistoryServerConfig
Spark History Server configuration for the workload.
Fields | Description
---|---
dataproc_cluster | Optional. Resource name of an existing Dataproc Cluster to act as a Spark History Server for the workload. Example: projects/[project_id]/regions/[region]/clusters/[cluster_name]
SparkRBatch
A configuration for running an Apache SparkR batch workload.
Fields | Description
---|---
main_r_file_uri | Required. The HCFS URI of the main R file to use as the driver. Must be a .R or .r file.
args[] | Optional. The arguments to pass to the Spark driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
file_uris[] | Optional. HCFS URIs of files to be placed in the working directory of each executor.
archive_uris[] | Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
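An analogous SparkR sketch (placeholder URIs):

```python
from google.cloud import dataproc_v1

sparkr = dataproc_v1.SparkRBatch(
    main_r_file_uri="gs://my-bucket/jobs/analysis.R",
    args=["gs://my-bucket/in/data.csv"],
    archive_uris=["gs://my-bucket/env/r-libs.tar.gz"],
)
```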
SparkSqlBatch
A configuration for running Apache Spark SQL queries as a batch workload.
Fields | Description
---|---
query_file_uri | Required. The HCFS URI of the script that contains Spark SQL queries to execute.
query_variables | Optional. Mapping of query variable names to values (equivalent to the Spark SQL command: SET name="value";).
jar_file_uris[] | Optional. HCFS URIs of jar files to be added to the Spark CLASSPATH.
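Finally, a Spark SQL sketch showing query_variables substitution (placeholder URIs and variable names):

```python
from google.cloud import dataproc_v1

spark_sql = dataproc_v1.SparkSqlBatch(
    query_file_uri="gs://my-bucket/sql/daily_report.sql",
    # Each entry behaves like the Spark SQL command: SET run_date="2024-01-01";
    query_variables={"run_date": "2024-01-01"},
    jar_file_uris=["gs://my-bucket/jars/udfs.jar"],
)
```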