Quickstart Using Java and Apache Maven

This page shows you how to set up your Google Cloud project, create a Maven project with the Apache Beam SDK for Java, and run an example pipeline on the Dataflow service.

This quickstart requires familiarity with installing and configuring Java and Maven in your local development environment. If you prefer to run a sample job without first installing these prerequisites locally, try the same Dataflow tutorial from the Cloud Console.

Go to the tutorial

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the Dataflow, Compute Engine, Cloud Logging, Cloud Storage, Cloud Storage JSON, BigQuery, Pub/Sub, Datastore, and Cloud Resource Manager APIs.

    Enable the APIs

  5. Create a service account:

    1. In the Cloud Console, go to the Create service account page.

      Go to Create service account
    2. Select a project.
    3. In the Service account name field, enter a name. The Cloud Console fills in the Service account ID field based on this name.

      In the Service account description field, enter a description. For example, Service account for quickstart.

    4. Click Create.
    5. Click the Select a role field.

      Under Quick access, click Basic, then click Owner.

    6. Click Continue.
    7. Click Done to finish creating the service account.

      Do not close your browser window. You will use it in the next step.

  6. Create a service account key:

    1. In the Cloud Console, click the email address for the service account that you created.
    2. Click Keys.
    3. Click Add key, then click Create new key.
    4. Click Create. A JSON key file is downloaded to your computer.
    5. Click Close.
  7. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file that contains your service account key. This variable only applies to your current shell session, so if you open a new session, set the variable again.

  8. Create a Cloud Storage bucket:
    1. In the Cloud Console, go to the Cloud Storage Browser page.

      Go to Browser

    2. Click Create bucket.
    3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.
      • For Name your bucket, enter a unique bucket name. Don't include sensitive information in the bucket name, because the bucket namespace is global and publicly visible.
      • For Choose where to store your data, do the following:
        • Select a Location type option.
        • Select a Location option.
      • For Choose a default storage class for your data, select the following: Standard.
      • For Choose how to control access to objects, select an Access control option.
      • For Advanced settings (optional), specify an encryption method, a retention policy, or bucket labels.
    4. Click Create.
  9. Download and install the Java Development Kit (JDK) version 11. (Note: Dataflow continues to support version 8.) Verify that the JAVA_HOME environment variable is set and points to your JDK installation.
  10. Download and install Apache Maven by following Maven's installation guide for your specific operating system.
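
The credential setup in step 7 can be sanity-checked from the shell. This is a minimal sketch; the key file path below is hypothetical, so substitute the location of the JSON file you actually downloaded:

```shell
# Hypothetical path; point this at the JSON key file downloaded in step 6.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/dataflow-quickstart.json"

# Confirm the variable is set for the current shell session.
echo "$GOOGLE_APPLICATION_CREDENTIALS"

# Confirm JAVA_HOME points at a JDK installation (step 9).
echo "$JAVA_HOME"
```

Remember that the variable only applies to the current shell session; add the export to your shell profile if you want it to persist.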

Get the WordCount code

The Apache Beam SDK is an open source programming model for data pipelines. You define these pipelines with an Apache Beam program and can choose a runner, such as Dataflow, to execute your pipeline.

Create a Maven project containing the Apache Beam SDK's WordCount examples, using the Maven Archetype Plugin. From a directory on your computer, run the mvn archetype:generate command in your shell or terminal as follows:

$ mvn archetype:generate \
      -DarchetypeGroupId=org.apache.beam \
      -DarchetypeArtifactId=beam-sdks-java-maven-archetypes-examples \
      -DarchetypeVersion=2.29.0 \
      -DgroupId=org.example \
      -DartifactId=word-count-beam \
      -Dversion="0.1" \
      -Dpackage=org.apache.beam.examples \
      -DinteractiveMode=false

After running the command, you should see a new directory called word-count-beam under your current directory. word-count-beam contains a simple pom.xml file and a series of example pipelines that count words in text files.

$ cd word-count-beam/

$ ls
pom.xml	src

$ ls src/main/java/org/apache/beam/examples/
DebuggingWordCount.java	WindowedWordCount.java	common
MinimalWordCount.java	WordCount.java

For a detailed introduction to the Apache Beam concepts used in these examples, see the WordCount Example Walkthrough. The following example executes WordCount.java.

Run WordCount locally

Run WordCount locally by running the following command from your word-count-beam directory:

$ mvn compile exec:java \
      -Dexec.mainClass=org.apache.beam.examples.WordCount \
      -Dexec.args="--output=counts"

The output files have the prefix counts and are written to the word-count-beam directory. They contain unique words from the input text and the number of occurrences of each word.
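
As a rough mental model (this is a toy shell approximation, not the Beam pipeline itself), the transformation WordCount performs looks like this: tokenize the input on whitespace, then count occurrences of each unique word, emitting "word: count" lines:

```shell
# Toy approximation of WordCount: split on whitespace, group, and count.
# The input sentence here is made up for illustration.
printf 'the quick fox the lazy fox\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2": "$1}'
```

The real pipeline does the same logical grouping and counting, but distributed across workers and able to scale to arbitrarily large inputs.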

Run WordCount on the Dataflow service

Build and run WordCount on the Dataflow service:

$ mvn -Pdataflow-runner compile exec:java \
      -Dexec.mainClass=org.apache.beam.examples.WordCount \
      -Dexec.args="--project=<PROJECT_ID> \
      --gcpTempLocation=gs://<STORAGE_BUCKET>/temp/ \
      --output=gs://<STORAGE_BUCKET>/output \
      --runner=DataflowRunner \
      --region=<REGION>"

  • For the --project argument, specify the Project ID of the Google Cloud project you created.
  • For the --gcpTempLocation and --output arguments, specify the name of the Cloud Storage bucket you created as part of the path.

  • For the --region argument, specify a Dataflow regional endpoint.
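
Before launching, it can help to confirm how the placeholders resolve. This sketch builds the pipeline argument string from shell variables and echoes it; the project ID, bucket name, and region shown are hypothetical examples, not values from this quickstart. Pass the same string to -Dexec.args when running mvn:

```shell
# Hypothetical values; substitute your own project ID, bucket, and region.
PROJECT_ID="my-project-id"
STORAGE_BUCKET="my-dataflow-bucket"
REGION="us-central1"

EXEC_ARGS="--project=${PROJECT_ID} \
--gcpTempLocation=gs://${STORAGE_BUCKET}/temp/ \
--output=gs://${STORAGE_BUCKET}/output \
--runner=DataflowRunner \
--region=${REGION}"

# Inspect the resolved arguments before handing them to -Dexec.args.
echo "$EXEC_ARGS"
```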

View your results

  1. Open the Dataflow Web UI.
    Go to the Dataflow Web UI

    The Jobs page lists all available jobs along with their status.
    You should see your wordcount job with a status of Running at first, and then Succeeded.

  2. Open the Cloud Storage browser in the Google Cloud Console.
    Go to the Cloud Storage browser

    The Storage browser page displays the list of all the storage buckets in your project.

  3. Click the storage bucket that you created.

    In the Bucket details page, you should see the output files and staging files that your job created.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this quickstart, follow these steps.

  1. In the Cloud Console, go to the Cloud Storage Browser page.

    Go to Browser

  2. Click the checkbox for the bucket that you want to delete.
  3. To delete the bucket, click Delete.

What's next