This page shows you how to use an inline Google APIs Explorer template to update a Dataproc cluster, changing the number of workers in the cluster. You can learn how to do the same task using the Google Cloud Console in Quickstart using the Console or using the command line in Quickstart using the gcloud command-line tool.
Before you begin
This quickstart assumes you have already created a Dataproc cluster. You can use the APIs Explorer, the Google Cloud Console, or the Cloud SDK gcloud command-line tool to create a cluster.
Update a cluster
To update the number of workers in your cluster, fill in and execute the APIs Explorer template, below, as follows:
- Enter your project ID (project name) in the projectId field.
The following fields are filled in for you:
region= "us-central1". If you created your cluster (see APIs Explorer—Create a cluster) in a different region, replace "us-central1" with the name of your cluster's region.
clusterName= "example-cluster". This is the name of the Dataproc cluster (created in the previous quickstarts—see APIs Explorer—Create a cluster) that will be updated. Replace this name with the name of your cluster if it is different.
updateMask= "config.worker_config.num_instances". This is the query parameter's JSON path, relative to the cluster, of the field to update. Here it specifies that the number of worker instances in the cluster will be updated (see the next item).
- Patch body
config.workerConfig.numInstances= "5". This sets (updates) the number of workers in the cluster.
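Taken together, these fields correspond to a Dataproc clusters.patch REST request. The following Python sketch only assembles the request URL and Patch body for illustration; the project ID, region, and cluster name are placeholder assumptions, and no authenticated request is sent.

```python
import json
from urllib.parse import urlencode

# Placeholder values -- substitute your own project ID, region, and cluster name.
project_id = "my-project"
region = "us-central1"
cluster_name = "example-cluster"

# The clusters.patch endpoint; updateMask names the field being changed.
base = "https://dataproc.googleapis.com/v1"
url = (f"{base}/projects/{project_id}/regions/{region}/clusters/{cluster_name}?"
       + urlencode({"updateMask": "config.worker_config.num_instances"}))

# Patch body: set the cluster to five workers.
body = json.dumps({"config": {"workerConfig": {"numInstances": 5}}})

print(url)
print(body)
```

A real call would send this as an HTTP PATCH with an OAuth 2.0 bearer token, which is what the APIs Explorer does for you when you click EXECUTE.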
Click EXECUTE. A dialog will ask you to confirm the default https://www.googleapis.com/auth/cloud-platform scope. Click the dialog's ALLOW to send the request to the service. Typically in less than one second, the JSON response showing that example-cluster is pending appears below the template.
You can confirm that the number of workers in the cluster has been updated by going to Cloud Console—Clusters.
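You can also read the worker count back from a clusters.get-style JSON response, where it appears under config.workerConfig.numInstances. A minimal sketch, using an illustrative (not real) response dict:

```python
# Illustrative clusters.get-style response; a real response contains many
# more fields (status, labels, instance group details, and so on).
sample_response = {
    "clusterName": "example-cluster",
    "config": {"workerConfig": {"numInstances": 5}},
}

# The worker count lives at config.workerConfig.numInstances.
num_workers = sample_response["config"]["workerConfig"]["numInstances"]
print(num_workers)
```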
Congratulations! You've used the Google APIs Explorer to update a Dataproc cluster.
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this quickstart, follow these steps.
If you plan on using the cluster, you can use the above template to restore the cluster to its default configuration with two workers: in the Patch body field, change numInstances to "2", fill in the projectId, then click EXECUTE.
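The restore step reuses the same Patch body with a different worker count. A small hypothetical helper sketching how that body could be built for any count:

```python
import json

def patch_body(num_workers: int) -> str:
    """Build the clusters.patch body that sets the cluster's worker count."""
    return json.dumps({"config": {"workerConfig": {"numInstances": num_workers}}})

# Restore the default two-worker configuration.
print(patch_body(2))
```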
If you delete your cluster, you should also remove any Cloud Storage buckets that were created by the cluster. To do this, run the following command in a local terminal window or in Cloud Shell:
gsutil rm gs://bucket/subdir/**
What's next
- Learn how to write and run a Scala job.