APIs Explorer Quickstart—Create a cluster

This page shows you how to use an inline Google APIs Explorer template to call the Dataproc API to create a cluster, then run a simple Spark job in the cluster. It also shows you how to use the APIs Explorer template to call the Dataproc API to update a cluster.

You can find out how to create a cluster using the Google Cloud Console in Quickstart Using the Console, using the command line in Quickstart using the gcloud command-line tool, and programmatically using the Cloud Client Libraries.

Before you begin

  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to the project selector page

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the Dataproc API.

    Enable the API

Create a cluster

Before you can run Dataproc jobs, you need to create a cluster of virtual machines to run them on. To create a Dataproc cluster in your project, fill in and execute the APIs Explorer template below as follows:

  1. Insert your project ID (not the project name) in the projectId field.
  2. In the Request body config.gceClusterConfig.zoneUri field, replace the "[project-id]" placeholder with your project ID. For example, if your project ID is "my-project-12345", the completed zoneUri field would read as follows:

    https://www.googleapis.com/compute/v1/projects/my-project-12345/zones/us-central1-a
    You can also replace the us-central1-a zone with the name of another zone (see Available regions & zones). Note that the zone you choose for your cluster must be supported by the region selected for your cluster (see the next item on choosing a region). The global region supports all zones.
  3. The following fields are filled in for you:

    1. region = "global". The global region is a special multi-region endpoint that is capable of deploying instances into any user-specified Compute Engine zone. You can also specify distinct regions, such as us-east1 or europe-west1, to isolate resources (including VM instances and Cloud Storage) and the metadata storage locations used by Dataproc within the user-specified region. See Regional endpoints to learn more about the difference between global and regional endpoints, and Available regions & zones for information on selecting a region. You can also run the gcloud compute regions list command to see a list of available regions.
    2. Request body clusterName = "example-cluster". This is the name of the Dataproc cluster that will be created. You will use this name to interact with your cluster, such as submitting jobs or deleting the cluster.
  4. Click EXECUTE. A dialog asks you to confirm the default https://www.googleapis.com/auth/cloud-platform scope. Click the dialog's ALLOW to send the request to the service. Typically in less than a second, a JSON response showing that example-cluster is pending appears below the template.
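The request the template sends can be sketched as follows. This is an illustrative reconstruction of the clusters.create endpoint URL and request body, assuming the hypothetical project ID "my-project-12345" and the default zone from the steps above; it builds the JSON but does not make the network call.

```python
import json

# Hypothetical placeholder values; substitute your own project ID.
project_id = "my-project-12345"
region = "global"

# Request body matching the fields the template fills in.
request_body = {
    "clusterName": "example-cluster",
    "config": {
        "gceClusterConfig": {
            "zoneUri": (
                "https://www.googleapis.com/compute/v1/"
                f"projects/{project_id}/zones/us-central1-a"
            )
        }
    },
}

# The clusters.create REST endpoint the template calls.
endpoint = (
    "https://dataproc.googleapis.com/v1/"
    f"projects/{project_id}/regions/{region}/clusters"
)

print(endpoint)
print(json.dumps(request_body, indent=2))
```

Sending this body in a POST to the endpoint (with an OAuth token carrying the cloud-platform scope) is what the EXECUTE button does on your behalf.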

You can confirm that the cluster is created by going to Cloud Console—Clusters.
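You can also confirm the cluster's state programmatically by calling the clusters.get method and inspecting the status.state field of the response. The sketch below checks a hypothetical, heavily abbreviated response payload; real responses contain many more fields.

```python
import json

# A hypothetical, abbreviated clusters.get response used for illustration.
sample_response = """
{
  "clusterName": "example-cluster",
  "status": {"state": "RUNNING"}
}
"""

cluster = json.loads(sample_response)

# The cluster is ready for jobs once its state reaches RUNNING.
is_ready = cluster["status"]["state"] == "RUNNING"
print(is_ready)
```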

Congratulations! You've used the Google APIs Explorer to create a cluster.

What's next