Class VirtualClusterConfig (4.0.0)

public sealed class VirtualClusterConfig : IMessage<VirtualClusterConfig>, IEquatable<VirtualClusterConfig>, IDeepCloneable<VirtualClusterConfig>, IBufferMessage, IMessage

Dataproc cluster config for a cluster that does not directly control the underlying compute resources, such as a Dataproc-on-GKE cluster.

Inheritance

Object > VirtualClusterConfig

Namespace

Google.Cloud.Dataproc.V1

Assembly

Google.Cloud.Dataproc.V1.dll

Constructors

VirtualClusterConfig()

public VirtualClusterConfig()

VirtualClusterConfig(VirtualClusterConfig)

public VirtualClusterConfig(VirtualClusterConfig other)
Parameter
Name: other
Type: VirtualClusterConfig

Properties

AuxiliaryServicesConfig

public AuxiliaryServicesConfig AuxiliaryServicesConfig { get; set; }

Optional. Configuration of auxiliary services used by this cluster.

Property Value
Type: AuxiliaryServicesConfig

InfrastructureConfigCase

public VirtualClusterConfig.InfrastructureConfigOneofCase InfrastructureConfigCase { get; }

Indicates which field of the infrastructure_config oneof is currently set.

Property Value
Type: VirtualClusterConfig.InfrastructureConfigOneofCase
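A short sketch of how the oneof case property behaves, following standard protobuf-generated C# semantics (the config values here are placeholders, not required settings):

```csharp
using Google.Cloud.Dataproc.V1;

// Setting a member of the infrastructure_config oneof
// updates InfrastructureConfigCase accordingly.
var config = new VirtualClusterConfig
{
    KubernetesClusterConfig = new KubernetesClusterConfig(),
};

bool isKubernetes = config.InfrastructureConfigCase
    == VirtualClusterConfig.InfrastructureConfigOneofCase.KubernetesClusterConfig;
```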

KubernetesClusterConfig

public KubernetesClusterConfig KubernetesClusterConfig { get; set; }

Required. The configuration for running the Dataproc cluster on Kubernetes.

Property Value
Type: KubernetesClusterConfig

StagingBucket

public string StagingBucket { get; set; }

Optional. A Cloud Storage bucket used to stage job dependencies, config files, and job driver console output. If you do not specify a staging bucket, Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's staging bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

Property Value
Type: String

TempBucket

public string TempBucket { get; set; }

Optional. A Cloud Storage bucket used to store ephemeral cluster and jobs data, such as Spark and MapReduce history files. If you do not specify a temp bucket, Dataproc will determine a Cloud Storage location (US, ASIA, or EU) for your cluster's temp bucket according to the Compute Engine zone where your cluster is deployed, and then create and manage this project-level, per-location bucket. The default bucket has a TTL of 90 days, but you can use any TTL (or none) if you specify a bucket (see Dataproc staging and temp buckets). This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.

Property Value
Type: String
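A minimal sketch of building a VirtualClusterConfig using the properties above. The bucket names and namespace are hypothetical placeholders; note that the bucket fields take bare bucket names, not gs://... URIs:

```csharp
using Google.Cloud.Dataproc.V1;

var config = new VirtualClusterConfig
{
    // Bucket names only -- not "gs://..." URIs (hypothetical names).
    StagingBucket = "example-staging-bucket",
    TempBucket = "example-temp-bucket",
    // Required: the Kubernetes configuration for the virtual cluster.
    KubernetesClusterConfig = new KubernetesClusterConfig
    {
        KubernetesNamespace = "example-namespace", // hypothetical namespace
    },
};
```

The copy constructor shown earlier, `new VirtualClusterConfig(other)`, produces a deep clone of an existing instance, as does the `IDeepCloneable<VirtualClusterConfig>` implementation.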