Interface VirtualClusterConfigOrBuilder (3.1.2)

public interface VirtualClusterConfigOrBuilder extends MessageOrBuilder

Implements

MessageOrBuilder
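
Both VirtualClusterConfig and VirtualClusterConfig.Builder implement this interface, so read-only code can accept either form. Below is a minimal sketch of that pattern; the helper class and the bucket name are illustrative, not part of the API.

import com.google.cloud.dataproc.v1.VirtualClusterConfig;
import com.google.cloud.dataproc.v1.VirtualClusterConfigOrBuilder;

public class VirtualClusterConfigInspector {

  // Accepts either a built message or its Builder, since both implement
  // VirtualClusterConfigOrBuilder (standard protobuf-java behavior).
  static void describe(VirtualClusterConfigOrBuilder config) {
    System.out.println("staging bucket: " + config.getStagingBucket());
    System.out.println("temp bucket: " + config.getTempBucket());
  }

  public static void main(String[] args) {
    VirtualClusterConfig.Builder builder =
        VirtualClusterConfig.newBuilder().setStagingBucket("example-staging-bucket");
    describe(builder);         // works on the Builder
    describe(builder.build()); // and on the immutable message
  }
}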

Methods

getAuxiliaryServicesConfig()

public abstract AuxiliaryServicesConfig getAuxiliaryServicesConfig()

Optional. Configuration of auxiliary services used by this cluster.

.google.cloud.dataproc.v1.AuxiliaryServicesConfig auxiliary_services_config = 7 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: AuxiliaryServicesConfig
Description: The auxiliaryServicesConfig.

getAuxiliaryServicesConfigOrBuilder()

public abstract AuxiliaryServicesConfigOrBuilder getAuxiliaryServicesConfigOrBuilder()

Optional. Configuration of auxiliary services used by this cluster.

.google.cloud.dataproc.v1.AuxiliaryServicesConfig auxiliary_services_config = 7 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: AuxiliaryServicesConfigOrBuilder

getInfrastructureConfigCase()

public abstract VirtualClusterConfig.InfrastructureConfigCase getInfrastructureConfigCase()
Returns
Type: VirtualClusterConfig.InfrastructureConfigCase
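
The infrastructure_config oneof currently has a single variant, kubernetes_cluster_config; getInfrastructureConfigCase() reports which variant, if any, is set. A hedged fragment of dispatching on the case, assuming a VirtualClusterConfigOrBuilder variable named config:

// KUBERNETES_CLUSTER_CONFIG is the enum constant generated for the
// kubernetes_cluster_config field; the default branch covers the not-set case.
switch (config.getInfrastructureConfigCase()) {
  case KUBERNETES_CLUSTER_CONFIG:
    System.out.println("Runs on Kubernetes: " + config.getKubernetesClusterConfig());
    break;
  default:
    System.out.println("No infrastructure config set");
    break;
}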

getKubernetesClusterConfig()

public abstract KubernetesClusterConfig getKubernetesClusterConfig()

Required. The configuration for running the Dataproc cluster on Kubernetes.

.google.cloud.dataproc.v1.KubernetesClusterConfig kubernetes_cluster_config = 6 [(.google.api.field_behavior) = REQUIRED];

Returns
Type: KubernetesClusterConfig
Description: The kubernetesClusterConfig.

getKubernetesClusterConfigOrBuilder()

public abstract KubernetesClusterConfigOrBuilder getKubernetesClusterConfigOrBuilder()

Required. The configuration for running the Dataproc cluster on Kubernetes.

.google.cloud.dataproc.v1.KubernetesClusterConfig kubernetes_cluster_config = 6 [(.google.api.field_behavior) = REQUIRED];

Returns
Type: KubernetesClusterConfigOrBuilder

getStagingBucket()

public abstract String getStagingBucket()

Optional. A Cloud Storage bucket used to stage job dependencies, config files, and job driver console output. If you do not specify a staging bucket, Cloud Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's staging bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

string staging_bucket = 1 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: String
Description: The stagingBucket.
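
Because this field takes a bare bucket name rather than a gs:// URI, setting and reading it looks like the following fragment; the bucket name is a placeholder:

// staging_bucket expects a bucket name ("my-staging-bucket"), not a
// "gs://my-staging-bucket" URI. The name below is a placeholder.
VirtualClusterConfig config =
    VirtualClusterConfig.newBuilder()
        .setStagingBucket("my-staging-bucket")
        .build();
String bucket = config.getStagingBucket(); // "my-staging-bucket"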

getStagingBucketBytes()

public abstract ByteString getStagingBucketBytes()

Optional. A Cloud Storage bucket used to stage job dependencies, config files, and job driver console output. If you do not specify a staging bucket, Cloud Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's staging bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

string staging_bucket = 1 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: ByteString
Description: The bytes for stagingBucket.
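
The Bytes variant exposes the same field as a protobuf ByteString. A fragment, assuming a VirtualClusterConfigOrBuilder variable named config and an import of com.google.protobuf.ByteString:

// Both accessors read the same staging_bucket field; the Bytes form returns
// the raw UTF-8 value as a ByteString.
ByteString raw = config.getStagingBucketBytes();
String name = raw.toStringUtf8(); // equals config.getStagingBucket()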

getTempBucket()

public abstract String getTempBucket()

Optional. A Cloud Storage bucket used to store ephemeral cluster and jobs data, such as Spark and MapReduce history files. If you do not specify a temp bucket, Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's temp bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket. The default bucket has a TTL of 90 days, but you can use any TTL (or none) if you specify a bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

string temp_bucket = 2 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: String
Description: The tempBucket.

getTempBucketBytes()

public abstract ByteString getTempBucketBytes()

Optional. A Cloud Storage bucket used to store ephemeral cluster and jobs data, such as Spark and MapReduce history files. If you do not specify a temp bucket, Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's temp bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket. The default bucket has a TTL of 90 days, but you can use any TTL (or none) if you specify a bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

string temp_bucket = 2 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: ByteString
Description: The bytes for tempBucket.

hasAuxiliaryServicesConfig()

public abstract boolean hasAuxiliaryServicesConfig()

Optional. Configuration of auxiliary services used by this cluster.

.google.cloud.dataproc.v1.AuxiliaryServicesConfig auxiliary_services_config = 7 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: boolean
Description: Whether the auxiliaryServicesConfig field is set.
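
Because the field is optional, getAuxiliaryServicesConfig() returns the default instance when it is unset; checking hasAuxiliaryServicesConfig() first distinguishes "unset" from "set to an empty message". A fragment, assuming a VirtualClusterConfigOrBuilder variable named config and an import of com.google.cloud.dataproc.v1.AuxiliaryServicesConfig:

if (config.hasAuxiliaryServicesConfig()) {
  AuxiliaryServicesConfig aux = config.getAuxiliaryServicesConfig();
  // ... use the explicitly set configuration ...
} else {
  // The field was never set; getAuxiliaryServicesConfig() would return
  // AuxiliaryServicesConfig.getDefaultInstance() here.
}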

hasKubernetesClusterConfig()

public abstract boolean hasKubernetesClusterConfig()

Required. The configuration for running the Dataproc cluster on Kubernetes.

.google.cloud.dataproc.v1.KubernetesClusterConfig kubernetes_cluster_config = 6 [(.google.api.field_behavior) = REQUIRED];

Returns
Type: boolean
Description: Whether the kubernetesClusterConfig field is set.
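
Putting the required field together: a minimal construction-and-check sketch using only methods documented on this page plus standard protobuf builder methods, assuming imports of com.google.cloud.dataproc.v1.VirtualClusterConfig and com.google.cloud.dataproc.v1.KubernetesClusterConfig. The bucket names are placeholders, and the default KubernetesClusterConfig instance stands in for a real configuration:

VirtualClusterConfig config =
    VirtualClusterConfig.newBuilder()
        .setStagingBucket("my-staging-bucket") // placeholder name
        .setTempBucket("my-temp-bucket")       // placeholder name
        .setKubernetesClusterConfig(
            KubernetesClusterConfig.getDefaultInstance()) // placeholder config
        .build();

// kubernetes_cluster_config is REQUIRED by the API, so this should be true
// before the config is used in a cluster creation request.
boolean ready = config.hasKubernetesClusterConfig();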