Processing Landsat satellite images with GPUs

This tutorial shows you how to use GPUs on Dataflow to process Landsat 8 satellite images and render them as JPEG files.


  • Build a Docker image for Dataflow that has TensorFlow with GPU support.
  • Run a Dataflow job with GPUs.


This tutorial uses billable components of Google Cloud, including:

  • Cloud Storage
  • Dataflow
  • Container Registry

Use the pricing calculator to generate a cost estimate based on your projected usage.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the Dataflow and Cloud Build APIs.

  5. Create a service account:

    1. In the Cloud Console, go to the Create service account page.

    2. Select a project.
    3. In the Service account name field, enter a name. The Cloud Console fills in the Service account ID field based on this name.

      In the Service account description field, enter a description. For example, Service account for quickstart.

    4. Click Create.
    5. Click the Select a role field.

      Under Quick access, click Basic, then click Owner.

    6. Click Continue.
    7. Click Done to finish creating the service account.

      Do not close your browser window. You will use it in the next step.

  6. Create a service account key:

    1. In the Cloud Console, click the email address for the service account that you created.
    2. Click Keys.
    3. Click Add key, then click Create new key.
    4. Click Create. A JSON key file is downloaded to your computer.
    5. Click Close.
  7. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file that contains your service account key. This variable only applies to your current shell session, so if you open a new session, set the variable again.

  8. To store the output JPEG image files from this tutorial, create a Cloud Storage bucket:
    1. In the Cloud Console, go to the Cloud Storage Browser page.

    2. Click Create bucket.
    3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.
      • For Name your bucket, enter a unique bucket name. Don't include sensitive information in the bucket name, because the bucket namespace is global and publicly visible.
      • For Choose where to store your data, do the following:
        • Select a Location type option.
        • Select a Location option.
      • For Choose a default storage class for your data, select the following: Standard.
      • For Choose how to control access to objects, select an Access control option.
      • For Advanced settings (optional), specify an encryption method, a retention policy, or bucket labels.
    4. Click Create.
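
Step 7 above amounts to a single export. A minimal sketch, assuming a hypothetical key path; substitute the actual location of your downloaded JSON key file:

```shell
# Path to the downloaded JSON key file (hypothetical; use your actual path).
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/dataflow-tutorial-key.json"
```

Remember that this applies only to the current shell session, so set it again in every new session.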

Preparing your working environment

Before you can work through this tutorial, you must set up your development environment and download the starter files.

  1. Clone the python-docs-samples repository.

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
  2. Navigate to the sample code directory.

    cd python-docs-samples/dataflow/gpu-workers
  3. Set up and activate a Python virtual environment. After you complete this tutorial, exit the virtual environment by running deactivate.

  4. Install the sample requirements.

    pip install -U pip
    pip install -r requirements.txt
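
For step 3 above, one way to create and activate the virtual environment, assuming Python 3's built-in venv module, is:

```shell
# Create a virtual environment in the "env" directory and activate it.
python3 -m venv env
source env/bin/activate
```

Run deactivate when you finish the tutorial to leave the virtual environment.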

Building the Docker image

Cloud Build allows you to build a Docker image using a Dockerfile and save it into Container Registry, where the image is accessible to other Google Cloud products.

export IMAGE="samples/dataflow/tensorflow-gpu:latest"
export PYTHON_VERSION=$(python -c 'import platform; print(platform.python_version())')

gcloud --project "$PROJECT" builds submit \
  --tag "gcr.io/$PROJECT/$IMAGE" \
  --timeout 20m .

Before you run this command, set the following environment variables:

  • PROJECT: the Google Cloud project name
  • BUCKET: the Cloud Storage bucket name (without the gs:// prefix)
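
For example, these variables could be set as follows; the values shown are hypothetical placeholders, so substitute your own project ID and bucket name:

```shell
# Hypothetical values; substitute your own project ID and bucket name.
export PROJECT="my-gcp-project"
export BUCKET="my-landsat-output-bucket"
```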

Running the Dataflow job with GPUs

The following code block demonstrates how to launch this Dataflow pipeline with GPUs.

export REGION="us-central1"
export GPU_TYPE="nvidia-tesla-t4"

python landsat_view.py \
    --output-path-prefix "gs://$BUCKET/samples/dataflow/landsat/" \
    --runner "DataflowRunner" \
    --project "$PROJECT" \
    --region "$REGION" \
    --worker_machine_type "custom-1-13312-ext" \
    --worker_harness_container_image "gcr.io/$PROJECT/$IMAGE" \
    --disk_size_gb 50 \
    --experiments "worker_accelerator=type:$GPU_TYPE;count:1;install-nvidia-driver" \
    --experiments "use_runner_v2"

After you run this pipeline, wait for the command to finish. If you exit your shell, you might lose the environment variables that you've set.
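
While the job runs, you can check its status from another shell. A sketch, assuming an authenticated gcloud CLI, a configured default project, and the REGION variable set as above:

```shell
# List active Dataflow jobs in the region (requires gcloud auth and project config).
gcloud dataflow jobs list --region "$REGION" --status active
```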

To avoid sharing the GPU between multiple worker processes, this sample uses a machine type with 1 vCPU. The memory requirements of the pipeline are addressed by using 13 GB of extended memory. For more information, read GPUs and worker parallelism.
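
The custom machine type name encodes the vCPU count and the memory in MB, so 13 GB of memory appears as 13312 in the name. A quick check of that arithmetic:

```shell
# custom-<vCPUs>-<memoryMB>-ext: 1 vCPU, 13 GB = 13 * 1024 MB of extended memory.
VCPUS=1
MEMORY_GB=13
MACHINE_TYPE="custom-${VCPUS}-$((MEMORY_GB * 1024))-ext"
echo "$MACHINE_TYPE"   # custom-1-13312-ext
```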

Viewing your results

The pipeline processes Landsat 8 satellite images and renders them as JPEG files. Use the following steps to view these files.

  1. List the output JPEG files with details by using gsutil.

    gsutil ls -lh "gs://$BUCKET/samples/dataflow/landsat/"
  2. Copy the files into your local directory.

    mkdir outputs
    gsutil -m cp "gs://$BUCKET/samples/dataflow/landsat/*" outputs/
  3. Open these image files with the image viewer of your choice.

Cleaning up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Deleting the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Cloud Console, go to the Manage resources page.

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next