Process Landsat satellite images with GPUs


This tutorial shows you how to use GPUs on Dataflow to process Landsat 8 satellite images and render them as JPEG files. The tutorial is based on the Processing Landsat satellite images with GPUs sample.

Objectives

  • Build a Docker image that has TensorFlow with GPU support for Dataflow.
  • Run a Dataflow job with GPUs.

Costs

This tutorial uses the following billable components of Google Cloud:

  • Cloud Storage
  • Dataflow
  • Artifact Registry

To generate a cost estimate based on your projected usage, use the pricing calculator.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.
  3. To initialize the gcloud CLI, run the following command:

    gcloud init
  4. Create or select a Google Cloud project.

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID

      Replace PROJECT_ID with your Google Cloud project name.

  5. Make sure that billing is enabled for your Google Cloud project.

  6. Enable the Dataflow, Cloud Build, and Artifact Registry APIs:

    gcloud services enable dataflow.googleapis.com cloudbuild.googleapis.com artifactregistry.googleapis.com
  7. If you're using a local shell, then create local authentication credentials for your user account:

    gcloud auth application-default login

    You don't need to do this if you're using Cloud Shell.

  8. Grant roles to your user account. Run the following command once for each of the following IAM roles: roles/iam.serviceAccountUser

    gcloud projects add-iam-policy-binding PROJECT_ID --member="USER_IDENTIFIER" --role=ROLE
    • Replace PROJECT_ID with your project ID.
    • Replace USER_IDENTIFIER with the identifier for your user account. For example, user:myemail@example.com.

    • Replace ROLE with each individual role.
  9. Grant roles to your Compute Engine default service account. Run the following command once for each of the following IAM roles: roles/dataflow.admin, roles/dataflow.worker, roles/bigquery.dataEditor, roles/pubsub.editor, roles/storage.objectAdmin, and roles/artifactregistry.reader.

    gcloud projects add-iam-policy-binding PROJECT_ID --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" --role=SERVICE_ACCOUNT_ROLE
    • Replace PROJECT_ID with your project ID.
    • Replace PROJECT_NUMBER with your project number. To find your project number, see Identify projects.
    • Replace SERVICE_ACCOUNT_ROLE with each individual role.
  10. To store the output JPEG image files from this tutorial, create a Cloud Storage bucket (a command-line alternative is sketched after these steps):
    1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    2. Click Create bucket.
    3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.
      • For Name your bucket, enter a unique bucket name. Don't include sensitive information in the bucket name, because the bucket namespace is global and publicly visible.
      • For Choose where to store your data, do the following:
        • Select a Location type option.
        • Select a Location option.
      • For Choose a default storage class for your data, select the following: Standard.
      • For Choose how to control access to objects, select an Access control option.
      • For Advanced settings (optional), specify an encryption method, a retention policy, or bucket labels.
    4. Click Create.
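
If you prefer the command line, the bucket from the last step can also be created with a single gcloud command. This is a sketch rather than part of the tutorial's steps; the location shown is only an example, so pick the region where you plan to run the Dataflow job.

# Creates a Standard-class bucket; replace BUCKET_NAME with a globally unique name.
gcloud storage buckets create gs://BUCKET_NAME \
    --location=us-central1 \
    --default-storage-class=STANDARD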

Prepare your working environment

Download the starter files, and then create your Artifact Registry repository.

Download the starter files

Download the starter files, and then change directories.

  1. Clone the python-docs-samples repository.

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
    
  2. Navigate to the sample code directory.

    cd python-docs-samples/dataflow/gpu-examples/tensorflow-landsat
    

Configure Artifact Registry

Create an Artifact Registry repository so that you can upload artifacts. Each repository can contain artifacts for a single supported format.

All repository content is encrypted with either Google-owned and Google-managed keys or customer-managed encryption keys. Artifact Registry uses Google-owned and Google-managed keys by default, and no configuration is required for this option.

You must have at least Artifact Registry Writer access to the repository.

Run the following command to create a new repository. The command uses the --async flag and returns immediately, without waiting for the operation in progress to complete.

gcloud artifacts repositories create REPOSITORY \
    --repository-format=docker \
    --location=LOCATION \
    --async

Replace REPOSITORY with a name for your repository. For each repository location in a project, repository names must be unique. Replace LOCATION with a region for the repository, such as us-central1.
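
Because the create command returns before the operation completes, you can optionally confirm that the repository is ready before you push images. One way is to describe it:

# Prints the repository's details once creation has finished.
gcloud artifacts repositories describe REPOSITORY --location=LOCATION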

Before you push or pull images, configure Docker to authenticate requests for Artifact Registry. To set up authentication for Docker repositories, run the following command:

gcloud auth configure-docker LOCATION-docker.pkg.dev

The command updates your Docker configuration. You can now connect with Artifact Registry in your Google Cloud project to push images.
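
If you want to verify the result, gcloud records a credential helper entry in your Docker configuration file. Assuming a default Docker setup, a quick check looks like this:

# The file should contain a credHelpers entry mapping
# LOCATION-docker.pkg.dev to "gcloud".
cat ~/.docker/config.json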

Build the Docker image

With Cloud Build, you can build a Docker image by using a Dockerfile and save the image in Artifact Registry, where it is accessible to other Google Cloud products.

Build the container image by using the build.yaml configuration file.

gcloud builds submit --config build.yaml
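
The build.yaml file is included in the sample directory, so you don't need to write it yourself. For orientation only, a Cloud Build config of this kind typically looks like the following sketch; the image path and substitution name are illustrative, not the sample's actual values.

# Illustrative Cloud Build config; the sample's build.yaml may differ.
steps:
  # Build the container image from the Dockerfile in the current directory.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', '${_IMAGE}', '.']
# Push the built image to Artifact Registry when the build finishes.
images:
  - '${_IMAGE}'
substitutions:
  # Hypothetical image path; point this at your own repository.
  _IMAGE: 'LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/tensorflow-landsat:latest'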

Run the Dataflow job with GPUs

The following code block demonstrates how to launch this Dataflow pipeline with GPUs. The pipeline is run by using the run.yaml configuration file.

export PROJECT=PROJECT_NAME
export BUCKET=BUCKET_NAME

export JOB_NAME="satellite-images-$(date +%Y%m%d-%H%M%S)"
export OUTPUT_PATH="gs://$BUCKET/samples/dataflow/landsat/output-images/"
export REGION="us-central1"
export GPU_TYPE="nvidia-tesla-t4"

gcloud builds submit \
    --config run.yaml \
    --substitutions _JOB_NAME=$JOB_NAME,_OUTPUT_PATH=$OUTPUT_PATH,_REGION=$REGION,_GPU_TYPE=$GPU_TYPE \
    --no-source

Replace the following:

  • PROJECT_NAME: the Google Cloud project name
  • BUCKET_NAME: the Cloud Storage bucket name (without the gs:// prefix)

After you run this pipeline, wait until the command finishes. If you exit your shell, you might lose the environment variables that you set.

To avoid sharing a GPU across multiple worker processes, this sample uses a machine type with 1 vCPU. The pipeline's memory requirements are addressed by using 13 GB of extended memory. For more information, see GPUs and worker parallelism.
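
For reference, the service-level options behind this configuration look roughly like the following sketch of a direct pipeline launch. The machine type, image path, and placeholders are illustrative; run.yaml and main.py define the authoritative values and the script's own arguments.

# A sketch only; run.yaml is authoritative for the actual sample.
python main.py \
    --runner=DataflowRunner \
    --project=$PROJECT \
    --region=$REGION \
    --temp_location="gs://$BUCKET/temp/" \
    --machine_type=custom-1-13312-ext \
    --sdk_container_image="LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/tensorflow-landsat:latest" \
    --dataflow_service_options="worker_accelerator=type:$GPU_TYPE;count:1;install-nvidia-driver"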

View the results

The pipeline in tensorflow-landsat/main.py processes Landsat 8 satellite images and renders them as JPEG files. Use the following steps to view the files.

  1. List the output JPEG files with details by using the Google Cloud CLI.

    gcloud storage ls "gs://$BUCKET/samples/dataflow/landsat/" --long --readable-sizes
    
  2. Copy the files into your local directory.

    mkdir outputs
    gcloud storage cp "gs://$BUCKET/samples/dataflow/landsat/*" outputs/
    
  3. Open these image files in the image viewer of your choice.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Google Cloud console, go to the Manage resources page.

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.
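
If you prefer the gcloud CLI, deleting the project is a single command and is equivalent to the console steps above:

# Shuts down the project and schedules all of its resources for deletion.
gcloud projects delete PROJECT_ID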

What's next