This page shows you how to use an inline Google APIs Explorer template to call the Cloud Dataproc API to create a cluster, then run a simple Spark job in the cluster. It also shows you how to use the APIs Explorer template to call the Cloud Dataproc API to update a cluster.
Before you begin
- Sign in to your Google Account. If you don't already have one, sign up for a new account.
- Select or create a GCP project.
- Make sure that billing is enabled for your Google Cloud Platform project. Learn how to enable billing.
- Enable the Cloud Dataproc API.
Create a cluster
Before you can run Cloud Dataproc jobs, you need to create a cluster of virtual machines to run them on. To create a Cloud Dataproc cluster in your project, fill in and execute the APIs Explorer template, below, as follows:
- Insert your project ID (project name) in the `projectId` field.
- In the Request body `config.gceClusterConfig.zoneUri` field, replace the "[project-id]" placeholder with your project ID (project name). For example, if your project ID is "my-project-12345", the completed `zoneUri` field would contain "my-project-12345" in place of the placeholder.
You can also replace the `us-central1-a` zone with the name of another zone (see Available regions & zones). Note that the zone you choose for your cluster must be supported by the region selected for your cluster (see the next item on choosing a region). The `global` region supports all zones.
The following fields are filled in for you:
- `region` = "global". The `global` region is a special multi-region endpoint that is capable of deploying instances into any user-specified Compute Engine zone. You can also specify distinct regions, such as `europe-west1`, to isolate resources (including VM instances and Cloud Storage) and metadata storage locations utilized by Cloud Dataproc within the user-specified region. See Regional endpoints to learn more about the difference between global and regional endpoints. See Available regions & zones for information on selecting a region. You can also run the `gcloud compute regions list` command to see a listing of available regions.
- Request body `clusterName` = "example-cluster". This is the name of the Cloud Dataproc cluster that will be created. You will use this name to interact with your cluster, such as when submitting jobs or deleting the cluster.
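Putting the fields above together, the request the template sends can be sketched as plain JSON. The project ID, zone, and endpoint URL below are illustrative assumptions; substitute your own values:

```python
import json

# Hypothetical values -- replace with your own project ID and zone.
project_id = "my-project-12345"
region = "global"
zone = "us-central1-a"

# Sketch of the clusters.create request body; field names follow the
# Cloud Dataproc v1 REST API.
request_body = {
    "clusterName": "example-cluster",
    "config": {
        "gceClusterConfig": {
            # zoneUri takes a full Compute Engine zone URL, with the
            # "[project-id]" placeholder replaced by your project ID.
            "zoneUri": (
                "https://www.googleapis.com/compute/v1/projects/"
                f"{project_id}/zones/{zone}"
            )
        }
    },
}

# The clusters.create endpoint the Explorer template calls.
url = (
    "https://dataproc.googleapis.com/v1/"
    f"projects/{project_id}/regions/{region}/clusters"
)

print(json.dumps(request_body, indent=2))
```

This mirrors what the APIs Explorer assembles for you; only `clusterName` and `zoneUri` need to be filled in by hand.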
Click EXECUTE. A dialog will ask you to confirm the default `https://www.googleapis.com/auth/cloud-platform` scope. Click the dialog's ALLOW button to send the request to the service. After less than one second (typically), the JSON response showing that the example-cluster is pending appears below the template.
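Clicking EXECUTE amounts to sending an authorized POST to the clusters.create endpoint. Here is a minimal standard-library sketch, assuming a hypothetical OAuth 2.0 access token with the cloud-platform scope; the actual send is left commented out because it requires real credentials and a billing-enabled project:

```python
import json
import urllib.request

# Hypothetical placeholders -- substitute real values before running.
project_id = "my-project-12345"
access_token = "ya29.EXAMPLE_TOKEN"  # token with the cloud-platform scope

body = {"clusterName": "example-cluster"}

# Build the same POST the APIs Explorer sends on EXECUTE.
req = urllib.request.Request(
    url=(
        "https://dataproc.googleapis.com/v1/"
        f"projects/{project_id}/regions/global/clusters"
    ),
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send the request; the JSON response
# describes the pending cluster-creation operation.
```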
You can confirm that the cluster was created by going to the Clusters page in the GCP Console.
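You can also confirm the cluster through the API itself with a clusters.get request. This sketch reuses the same hypothetical project ID and token as above; the call itself is commented out:

```python
import json
import urllib.request

# Hypothetical values -- replace before running.
project_id = "my-project-12345"
access_token = "ya29.EXAMPLE_TOKEN"

# clusters.get returns the cluster resource, including its status.
url = (
    "https://dataproc.googleapis.com/v1/"
    f"projects/{project_id}/regions/global/clusters/example-cluster"
)

req = urllib.request.Request(
    url, headers={"Authorization": f"Bearer {access_token}"}
)

# with urllib.request.urlopen(req) as resp:
#     cluster = json.load(resp)
#     print(cluster["status"]["state"])  # "RUNNING" once the cluster is ready
```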
Congratulations! You've used the Google APIs Explorer to create a cluster.