Create a Dataproc cluster by using the Google Cloud console

This page shows you how to use the Google Cloud console to create a Dataproc cluster, run a basic Apache Spark job in the cluster, and then modify the number of workers in the cluster.



Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Enable the Dataproc API.

    Enable the API


Create a cluster

  1. In the Google Cloud console, go to the Dataproc Clusters page.

    Go to Clusters

  2. Click Create cluster.

  3. In the Create Dataproc cluster dialog, click Create in the Cluster on Compute Engine row.

  4. In the Cluster Name field, enter example-cluster.

  5. In the Region and Zone lists, select a region and zone.

    Select a region (for example, us-east1 or europe-west1) to keep the resources that Dataproc uses, such as virtual machine (VM) instances and Cloud Storage and metadata storage locations, within that region. For more information, see Available regions and zones and Regional endpoints.

  6. For all the other options, use the default settings.

  7. To create the cluster, click Create.

    Your new cluster appears in a list on the Clusters page. The status is Provisioning until the cluster is ready to use, and then the status changes to Running. Provisioning the cluster might take a couple of minutes.
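
If you prefer to script this step, the same operation is available through the Cloud Client Libraries. The following is a minimal sketch using the Python client library (pip install google-cloud-dataproc); the project ID and region are placeholders, and the machine types are an assumption standing in for the console defaults:

  from google.cloud import dataproc_v1

  project_id = "your-project-id"  # placeholder: replace with your project ID
  region = "us-east1"             # match the region you chose in the console

  # Dataproc uses regional endpoints, so point the client at the chosen region.
  cluster_client = dataproc_v1.ClusterControllerClient(
      client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
  )

  cluster = {
      "project_id": project_id,
      "cluster_name": "example-cluster",
      # Machine types below are assumptions standing in for the console defaults.
      "config": {
          "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
          "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-4"},
      },
  }

  # create_cluster returns a long-running operation; result() blocks until
  # the cluster reaches the Running state.
  operation = cluster_client.create_cluster(
      request={"project_id": project_id, "region": region, "cluster": cluster}
  )
  print(f"Cluster created: {operation.result().cluster_name}")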

Submit a Spark job

Submit a Spark job that estimates a value of Pi:

  1. In the Dataproc navigation menu, click Jobs.
  2. On the Jobs page, click Submit job, and then do the following:

    1. In the Cluster field, click Browse.
    2. On the row for example-cluster, click Select.
    3. In the Job ID field, use the default setting, or provide an ID that is unique to your Google Cloud project.
    4. For Job type, select Spark.
    5. In the Main class or jar field, enter org.apache.spark.examples.SparkPi.
    6. In the Jar files field, enter file:///usr/lib/spark/examples/jars/spark-examples.jar.
    7. In the Arguments field, enter 1000 to set the number of tasks.

    8. Click Submit.

      Your job is displayed on the Job details page. The job status is Starting or Running, and then it changes to Succeeded after the job finishes.

      To avoid horizontal scrolling in the output, click Line wrap: off to turn on line wrapping. The output is similar to the following:

      Pi is roughly 3.1416759514167594
      

      To view job details, click the Configuration tab.
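
To script this step instead, the same SparkPi job can be submitted through the Python client library. This is a minimal sketch assuming the example-cluster from the previous section; the project ID and region are the same placeholders as before:

  from google.cloud import dataproc_v1

  project_id = "your-project-id"  # placeholder
  region = "us-east1"

  job_client = dataproc_v1.JobControllerClient(
      client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
  )

  job = {
      "placement": {"cluster_name": "example-cluster"},
      "spark_job": {
          "main_class": "org.apache.spark.examples.SparkPi",
          "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
          "args": ["1000"],  # number of tasks, as in the console steps
      },
  }

  # submit_job_as_operation returns once the job reaches a terminal state.
  operation = job_client.submit_job_as_operation(
      request={"project_id": project_id, "region": region, "job": job}
  )
  job_result = operation.result()
  print(f"Job finished with state: {job_result.status.state.name}")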

Update a cluster

Update your cluster by changing the number of worker instances:

  1. In the navigation menu, click Clusters.
  2. In the list of clusters, click example-cluster.
  3. On the Cluster details page, click the Configuration tab.

    Your cluster settings are displayed.

  4. Click Edit.

  5. In the Worker nodes field, enter 5.

  6. Click Save.

Your cluster is now updated. To decrease the number of worker nodes to the original value, follow the same procedure.
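
For reference, the same resize can be done programmatically. This is a minimal sketch with the Python client library, again assuming example-cluster and placeholder project ID and region; the field mask limits the update to the worker count:

  from google.cloud import dataproc_v1
  from google.protobuf import field_mask_pb2

  project_id = "your-project-id"  # placeholder
  region = "us-east1"

  cluster_client = dataproc_v1.ClusterControllerClient(
      client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
  )

  # The field mask restricts the update to the worker instance count.
  operation = cluster_client.update_cluster(
      request={
          "project_id": project_id,
          "region": region,
          "cluster_name": "example-cluster",
          "cluster": {"config": {"worker_config": {"num_instances": 5}}},
          "update_mask": field_mask_pb2.FieldMask(
              paths=["config.worker_config.num_instances"]
          ),
      }
  )
  operation.result()  # blocks until the resize completes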

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

  1. On the Cluster details page for example-cluster, click Delete to delete the cluster.
  2. To confirm that you want to delete the cluster, click Delete.
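
To script the cleanup instead, here is a minimal sketch with the Python client library, using the same placeholder project ID and region as the earlier sketches:

  from google.cloud import dataproc_v1

  project_id = "your-project-id"  # placeholder
  region = "us-east1"

  cluster_client = dataproc_v1.ClusterControllerClient(
      client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
  )

  operation = cluster_client.delete_cluster(
      request={
          "project_id": project_id,
          "region": region,
          "cluster_name": "example-cluster",
      }
  )
  operation.result()  # blocks until the cluster is deleted
  print("Cluster deleted")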

What's next