Before you begin
If you haven't already done so, set up a Google Cloud project and two (2) Cloud Storage buckets.
Set up your project
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Verify that billing is enabled for your Google Cloud project.
- Enable the Dataproc, Compute Engine, Cloud Storage, and Cloud Run functions APIs.
- Install the Google Cloud CLI.
- If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
- To initialize the gcloud CLI, run the following command:
gcloud init
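Optionally, you can set gcloud CLI defaults so that later commands don't need to repeat the project and region flags. This is a minimal convenience sketch, not part of the tutorial steps; your-project-id is a placeholder for your own project ID, and us-central1 is the region used throughout this tutorial:
gcloud config set project your-project-id
gcloud config set dataproc/region us-central1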
Create or use two Cloud Storage buckets in your project
Two Cloud Storage buckets are needed in your project: one for input files and one for output files.
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- Click Create.
- On the Create a bucket page, enter your bucket information. To go to the next
step, click Continue.
- In the Get started section, do the following:
- Enter a globally unique name that meets the bucket naming requirements.
- To add a bucket label, expand the Labels section, click Add label, and specify a key and a value for your label.
- In the Choose where to store your data section, do the following:
- Select a Location type.
- Choose a location where your bucket's data is permanently stored from the Location type drop-down menu.
- If you select the dual-region location type, you can also choose to enable turbo replication by using the relevant checkbox.
- To set up cross-bucket replication, select Add cross-bucket replication via Storage Transfer Service and follow these steps:
  - In the Bucket menu, select a bucket.
  - In the Replication settings section, click Configure to configure settings for the replication job. The Configure cross-bucket replication pane appears.
- To filter objects to replicate by object name prefix, enter a prefix that you want to include or exclude objects from, then click Add a prefix.
- To set a storage class for the replicated objects, select a storage class from the Storage class menu. If you skip this step, the replicated objects will use the destination bucket's storage class by default.
- Click Done.
- In the Choose how to store your data section, do the following:
- Select a default storage class for the bucket or Autoclass for automatic storage class management of your bucket's data.
- To enable hierarchical namespace, in the Optimize storage for data-intensive workloads section, select Enable hierarchical namespace on this bucket.
- In the Choose how to control access to objects section, select whether or not your bucket enforces public access prevention, and select an access control method for your bucket's objects.
- In the Choose how to protect object data section, do the following:
- Select any of the options under Data protection that you
want to set for your bucket.
- To enable soft delete, click the Soft delete policy (For data recovery) checkbox, and specify the number of days you want to retain objects after deletion.
- To set Object Versioning, click the Object versioning (For version control) checkbox, and specify the maximum number of versions per object and the number of days after which the noncurrent versions expire.
- To enable the retention policy on objects and buckets, click the Retention (For compliance) checkbox, and then do the following:
- To enable Object Retention Lock, click the Enable object retention checkbox.
- To enable Bucket Lock, click the Set bucket retention policy checkbox, and choose a unit of time and a length of time for your retention period.
- To choose how your object data will be encrypted, expand the Data encryption section, and select a Data encryption method.
- Click Create.
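If you prefer the command line to the console, the buckets can also be created with the gcloud storage CLI. A sketch, assuming the placeholder bucket names used elsewhere in this tutorial (your-input-bucket-name and your-output-bucket) and the us-central1 location:
gcloud storage buckets create gs://your-input-bucket-name --location=us-central1
gcloud storage buckets create gs://your-output-bucket --location=us-central1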
Create a workflow template
To create and define a workflow template, copy and run the following commands in a local terminal window or in Cloud Shell.
- Create the workflow template.
gcloud dataproc workflow-templates create wordcount-template \
    --region=us-central1
- Add the wordcount job to the workflow template.
  - Specify your output-bucket-name before running the command (your function will supply the input bucket). After you insert your output bucket name, the output bucket argument should read as follows: gs://your-output-bucket/wordcount-output.
  - The "count" step ID is required, and it identifies the added hadoop job.
gcloud dataproc workflow-templates add-job hadoop \
    --workflow-template=wordcount-template \
    --step-id=count \
    --jar=file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
    --region=us-central1 \
    -- wordcount gs://input-bucket gs://output-bucket-name/wordcount-output
- Use a managed, single-node cluster to run the workflow. Dataproc creates the cluster, runs the workflow on it, then deletes the cluster when the workflow completes.
gcloud dataproc workflow-templates set-managed-cluster wordcount-template \
    --cluster-name=wordcount \
    --single-node \
    --region=us-central1
- Click the wordcount-template name on the Dataproc Workflows page in the Google Cloud console to open the Workflow template details page. Confirm the wordcount-template attributes.
Parameterize the workflow template
Parameterize the input bucket variable to pass to the workflow template.
- Export the workflow template to a wordcount.yaml text file for parameterization.
gcloud dataproc workflow-templates export wordcount-template \
    --destination=wordcount.yaml \
    --region=us-central1
- Using a text editor, open wordcount.yaml, then add a parameters block to the end of the YAML file so that the Cloud Storage INPUT_BUCKET_URI can be passed as args[1] to the wordcount binary when the workflow is triggered. A sample exported YAML file is shown below. You can update the template in one of two ways:
  - Copy, then paste the entire file to replace the exported wordcount.yaml after replacing your-output-bucket with your output bucket name, or
  - Copy, then paste only the parameters section onto the end of the exported wordcount.yaml file.
jobs:
- hadoopJob:
    args:
    - wordcount
    - gs://input-bucket
    - gs://your-output-bucket/wordcount-output
    mainJarFileUri: file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
  stepId: count
placement:
  managedCluster:
    clusterName: wordcount
    config:
      softwareConfig:
        properties:
          dataproc:dataproc.allow.zero.workers: 'true'
parameters:
- name: INPUT_BUCKET_URI
  description: wordcount input bucket URI
  fields:
  - jobs['count'].hadoopJob.args[1]
- Import the parameterized wordcount.yaml text file. Type "Y" when asked to overwrite the template.
gcloud dataproc workflow-templates import wordcount-template \
    --source=wordcount.yaml \
    --region=us-central1
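Before wiring up the Cloud Function, you can optionally confirm that the parameterized template works by instantiating it directly from the CLI. This is an extra verification step, not part of the tutorial; it assumes a text file (for example, the rose.txt file copied later in this tutorial) already exists in your input bucket, and it creates and deletes a managed cluster just like a triggered run:
gcloud dataproc workflow-templates instantiate wordcount-template \
    --region=us-central1 \
    --parameters="INPUT_BUCKET_URI=gs://your-input-bucket-name/rose.txt"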
Create a Cloud Function
Open the Cloud Run functions page in the Google Cloud console, then click Create function.
On the Create function page, enter or select the following information:
- Name: wordcount
- Memory allocated: Keep the default selection.
- Trigger:
  - Cloud Storage
  - Event type: Finalize/Create
  - Bucket: Select your input bucket (see Create or use two Cloud Storage buckets in your project). When a file is added to this bucket, the function will trigger the workflow. The workflow will run the wordcount application, which will process all text files in the bucket.
Source code:
- Inline editor
- Runtime: Node.js 8
INDEX.JS tab: Replace the default code snippet with the following code, then edit the const projectId line to pass -your-project-id- (without a leading or trailing "-").
// Cloud Function that launches the parameterized Dataproc workflow template
// whenever a file is added to the input bucket.
const dataproc = require('@google-cloud/dataproc').v1;

exports.startWorkflow = (data) => {

  const projectId = '-your-project-id-'
  const region = 'us-central1'
  const workflowTemplate = 'wordcount-template'

  // Create a client that targets the regional Dataproc endpoint.
  const client = new dataproc.WorkflowTemplateServiceClient({
    apiEndpoint: `${region}-dataproc.googleapis.com`,
  });

  const file = data;
  console.log("Event: ", file);

  // Pass the URI of the file that triggered the function as the
  // INPUT_BUCKET_URI template parameter.
  const inputBucketUri = `gs://${file.bucket}/${file.name}`;

  const request = {
    name: client.projectRegionWorkflowTemplatePath(projectId, region, workflowTemplate),
    parameters: {"INPUT_BUCKET_URI": inputBucketUri}
  };

  client.instantiateWorkflowTemplate(request)
    .then(responses => {
      console.log("Launched Dataproc Workflow:", responses[1]);
    })
    .catch(err => {
      console.error(err);
    });
};
PACKAGE.JSON tab: Replace the default code snippet with the following code.
{ "name": "dataproc-workflow", "version": "1.0.0", "dependencies":{ "@google-cloud/dataproc": ">=1.0.0"} }
- Function to execute: Insert "startWorkflow".
Click Create.
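As an alternative to the console's inline editor, a function like this can also be deployed from the command line. The following is a sketch only: it assumes the index.js and package.json shown above are in your current directory, that your-input-bucket-name is your input bucket, and that the 1st gen gcloud functions deploy flags and the Node.js 8 runtime are still available in your gcloud version:
gcloud functions deploy wordcount \
    --runtime=nodejs8 \
    --entry-point=startWorkflow \
    --trigger-event=google.storage.object.finalize \
    --trigger-resource=your-input-bucket-name \
    --region=us-central1 \
    --source=.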
Test the function
Copy the public file rose.txt to your bucket to trigger the function. Insert your-input-bucket-name (the bucket used to trigger your function) in the command.
gcloud storage cp gs://pub/shakespeare/rose.txt gs://your-input-bucket-name
Wait 30 seconds, then run the following command to verify that the function completed successfully.
gcloud functions logs read wordcount
... Function execution took 1348 ms, finished with status: 'ok'
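The workflow itself can take a few minutes to finish because the managed cluster is created and then deleted. Once it completes, you can also inspect the wordcount results from the command line; a sketch, with your-output-bucket as a placeholder for your output bucket name (the exact part-file names may vary):
gcloud storage ls gs://your-output-bucket/wordcount-output/
gcloud storage cat gs://your-output-bucket/wordcount-output/part-*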
To view the function logs from the Functions list page in the Google Cloud console, click the wordcount function name, then click View logs on the Function details page.
You can view the wordcount-output folder in your output bucket from the Storage browser page in the Google Cloud console.
After the workflow completes, the job details persist in the Google Cloud console. Click the count... job listed on the Dataproc Jobs page to view workflow job details.
Clean up
The workflow in this tutorial deletes its managed cluster when the workflow completes. To avoid recurring charges, you can delete other resources associated with this tutorial.
Delete the project
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
Delete the Cloud Storage buckets
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- Click the checkbox for the bucket that you want to delete.
- To delete the bucket, click Delete, and then follow the instructions.
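If you prefer the command line, each bucket and its contents can also be deleted with the gcloud storage CLI; a sketch, using the placeholder bucket names from earlier in this tutorial:
gcloud storage rm --recursive gs://your-input-bucket-name
gcloud storage rm --recursive gs://your-output-bucket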
Delete the workflow template
gcloud dataproc workflow-templates delete wordcount-template \
    --region=us-central1
删除 Cloud Functions 函数
在 Google Cloud 控制台中打开 Cloud Run functions 页面,选中 wordcount
函数左侧的框,然后点击删除。