Trigger Workflows with direct events from Cloud Storage (gcloud CLI)
This quickstart shows you how to execute a workflow using an Eventarc trigger that receives events from Cloud Storage.
The trigger executes the workflow by listening for an object creation event in a Cloud Storage bucket and passes the event as a runtime argument to a destination workflow.
In this quickstart, you:
Create a Cloud Storage bucket as an event source.
Use Workflows to create and deploy a workflow that extracts and returns the name of the storage bucket and the name of an uploaded file.
Create an Eventarc trigger that connects the Cloud Storage bucket to the Workflows event receiver.
Generate an event by uploading a text file to the Cloud Storage bucket. This event is passed as a runtime argument to the destination workflow.
View the name of the bucket and the name of the text file as a result of the workflow execution.
Before you begin
Some of the steps in this document might not work correctly if your organization applies constraints to your Google Cloud environment. In that case, you might not be able to complete tasks like creating public IP addresses or service account keys. If you make a request that returns an error about constraints, see how to Develop applications in a constrained Google Cloud environment.
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Cloud project. Learn how to check if billing is enabled on a project.
- Install and initialize the Google Cloud CLI.
- Enable the Eventarc, Pub/Sub, and Workflows APIs:

gcloud services enable eventarc.googleapis.com \
    pubsub.googleapis.com \
    workflows.googleapis.com \
    workflowexecutions.googleapis.com
- Update gcloud components:

gcloud components update

- Log in using your account:

gcloud auth login
Set your environment variables
Set the environment variables used in this quickstart.
export PROJECT_ID=PROJECT_ID
export WORKFLOW_LOCATION=us-central1
export TRIGGER_LOCATION=us-central1
gcloud config set project ${PROJECT_ID}
gcloud config set workflows/location ${WORKFLOW_LOCATION}
gcloud config set eventarc/location ${TRIGGER_LOCATION}
Replace PROJECT_ID with your Google Cloud project ID.
You can find your project ID on the Dashboard page of the Google Cloud console.
Set up your service accounts
Grant the required permissions to the service accounts used in this quickstart.
Grant the pubsub.publisher role to the Cloud Storage service account:

SERVICE_ACCOUNT="$(gsutil kms serviceaccount -p ${PROJECT_ID})"

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${SERVICE_ACCOUNT}" \
    --role='roles/pubsub.publisher'
If you enabled the Pub/Sub service account on or before April 8, 2021, grant the iam.serviceAccountTokenCreator role to the Pub/Sub service account:

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com" \
    --role='roles/iam.serviceAccountTokenCreator'
Replace PROJECT_NUMBER with your Google Cloud project number. You can find your project number on the Dashboard page of the Google Cloud console.
Create a service account and give it a name; for example, my-service-account:

export MY_SERVICE_ACCOUNT=my-service-account
gcloud iam service-accounts create ${MY_SERVICE_ACCOUNT}
Grant the roles/workflows.invoker role to the service account:

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role='roles/workflows.invoker'
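The --member flags in the preceding commands all follow fixed string formats. As a small illustration only (the project ID and number below are hypothetical placeholders, not part of the quickstart), these Python helpers show how those member strings are assembled:

```python
# Sketch: assemble the IAM member strings used by the preceding
# gcloud commands. All values here are illustrative placeholders.

def sa_member(email: str) -> str:
    """Prefix a service account email as an IAM policy member."""
    return f"serviceAccount:{email}"

def pubsub_agent_email(project_number: str) -> str:
    """Pub/Sub service agent email, following the pattern in this quickstart."""
    return f"service-{project_number}@gcp-sa-pubsub.iam.gserviceaccount.com"

def user_sa_email(name: str, project_id: str) -> str:
    """User-managed service account email."""
    return f"{name}@{project_id}.iam.gserviceaccount.com"

print(sa_member(pubsub_agent_email("123456789012")))
print(sa_member(user_sa_email("my-service-account", "my-project")))
```

Substituting your real project ID and number yields exactly the strings passed to gcloud above.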
Create a Cloud Storage bucket
Create a Cloud Storage bucket to use as the event source:
gsutil mb -l us-central1 gs://${PROJECT_ID}-bucket/
Create and deploy a workflow
Create and deploy a workflow that is executed when an object creation event in the Cloud Storage bucket triggers it with an HTTP request.
In your home directory, create a new file called myEventWorkflow.yaml or myEventWorkflow.json. Copy and paste the following into the new file and save it:
YAML
main:
  params: [event]
  steps:
    - log_event:
        call: sys.log
        args:
          text: ${event}
          severity: INFO
    - extract_bucket_object:
        assign:
          - bucket: ${event.data.bucket}
          - object: ${event.data.name}
    - return_bucket_object:
        return:
          bucket: ${bucket}
          object: ${object}
JSON
{
  "main": {
    "params": ["event"],
    "steps": [
      {
        "log_event": {
          "call": "sys.log",
          "args": {
            "text": "${event}",
            "severity": "INFO"
          }
        }
      },
      {
        "extract_bucket_object": {
          "assign": [
            { "bucket": "${event.data.bucket}" },
            { "object": "${event.data.name}" }
          ]
        }
      },
      {
        "return_bucket_object": {
          "return": {
            "bucket": "${bucket}",
            "object": "${object}"
          }
        }
      }
    ]
  }
}
Deploy the workflow:
export MY_WORKFLOW=myEventWorkflow
gcloud workflows deploy ${MY_WORKFLOW} --source=myEventWorkflow.yaml
Replace .yaml with .json if you copied the JSON version of the example workflow.
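To see what the deployed workflow does with an incoming event, the following Python sketch mirrors its extract_bucket_object and return_bucket_object steps against a minimal, made-up event payload (the field names follow the event shown later in this quickstart; the bucket name is a placeholder):

```python
# Sketch: what the workflow's assign/return steps do with the
# incoming Cloud Storage event. The sample event is illustrative.
sample_event = {
    "type": "google.cloud.storage.object.v1.finalized",
    "data": {"bucket": "my-project-bucket", "name": "random.txt"},
}

def extract_bucket_object(event: dict) -> dict:
    """Mirror the workflow: pull the bucket and object name from event.data."""
    bucket = event["data"]["bucket"]   # ${event.data.bucket}
    obj = event["data"]["name"]        # ${event.data.name}
    return {"bucket": bucket, "object": obj}

print(extract_bucket_object(sample_event))
# {'bucket': 'my-project-bucket', 'object': 'random.txt'}
```

The workflow's return value has the same two-field shape, which is what appears later as the execution result.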
Create an Eventarc trigger
The Eventarc trigger sends events from the Cloud Storage bucket to the Workflows destination.
Create a trigger that filters Cloud Storage events:
gcloud eventarc triggers create storage-events-trigger \
    --destination-workflow=${MY_WORKFLOW} \
    --destination-workflow-location=${WORKFLOW_LOCATION} \
    --event-filters="type=google.cloud.storage.object.v1.finalized" \
    --event-filters="bucket=${PROJECT_ID}-bucket" \
    --service-account="${MY_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com"
This creates a trigger called storage-events-trigger.

To confirm that storage-events-trigger was successfully created, run:

gcloud eventarc triggers describe storage-events-trigger --location=${TRIGGER_LOCATION}
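Conceptually, the trigger delivers only events whose attributes match every --event-filters pair in the command above. This Python sketch is a rough illustration of that matching, not Eventarc's implementation; the bucket name stands in for ${PROJECT_ID}-bucket:

```python
# Sketch: how a trigger's event filters conceptually select events.
# Filter keys and values copied from the gcloud command above;
# "my-project-bucket" is an illustrative stand-in for ${PROJECT_ID}-bucket.
filters = {
    "type": "google.cloud.storage.object.v1.finalized",
    "bucket": "my-project-bucket",
}

def matches(filters: dict, event_attrs: dict) -> bool:
    """An event is delivered only if every filter key matches exactly."""
    return all(event_attrs.get(k) == v for k, v in filters.items())

create_event = {"type": "google.cloud.storage.object.v1.finalized",
                "bucket": "my-project-bucket"}
delete_event = {"type": "google.cloud.storage.object.v1.deleted",
                "bucket": "my-project-bucket"}

print(matches(filters, create_event))  # True: finalized object in our bucket
print(matches(filters, delete_event))  # False: event type does not match
```

This is why only object creation (finalized) events in the quickstart's bucket reach the workflow.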
Generate and view an event
To generate an event, upload a text file to Cloud Storage:
echo "Hello World" > random.txt
gsutil cp random.txt gs://${PROJECT_ID}-bucket/random.txt
The upload generates an event that is passed as a runtime argument to the workflow, which returns the names of the storage bucket and the uploaded file.

To verify that a workflow execution was triggered, list the last five executions:
gcloud workflows executions list ${MY_WORKFLOW} --limit=5
The output should be similar to the following, listing a NAME and a STATE equal to SUCCEEDED for each workflow execution:

NAME: projects/606789101455/locations/us-central1/workflows/myEventWorkflow/executions/8c02b8f1-8836-4a6d-99d9-fc321eb9668f
STATE: SUCCEEDED
START_TIME: 2021-10-13T03:38:03.019148617Z
END_TIME: 2021-10-13T03:38:03.249705805Z
NAME: projects/606789101455/locations/us-central1/workflows/myEventWorkflow/executions/a6319d9d-36a6-4117-904e-3d1118bdc90a
STATE: SUCCEEDED
START_TIME: 2021-10-13T17:28:51.492864252Z
END_TIME: 2021-10-13T17:28:52.227212414Z
Note that in the NAME field in the preceding example, a6319d9d-36a6-4117-904e-3d1118bdc90a is the ID of the workflow execution. Copy your execution ID, as it is used in the next step.

To view the event message, view the execution status:
gcloud workflows executions describe WORKFLOW_EXECUTION_ID --workflow=${MY_WORKFLOW}
Replace WORKFLOW_EXECUTION_ID with the ID of the workflow execution that corresponds to the time at which the file was uploaded to the bucket.

The output is similar to the following:

argument: '{"bucket":"PROJECT_ID-bucket","data":{"bucket":"PROJECT_ID-bucket","[...]","timeCreated":"2021-10-13T03:38:01.735Z","[...]","subject":"objects/random.txt","time":"2021-10-13T03:38:01.735384Z","type":"google.cloud.storage.object.v1.finalized"}'
endTime: '2021-10-13T03:38:03.249705805Z'
name: projects/218898424763/locations/us-central1/workflows/myEventWorkflow/executions/86d2567b-0f1e-49b3-8b10-cdac5d0f6239
result: '{"bucket":"PROJECT_ID-bucket","object":"random.txt"}'
startTime: '2021-10-13T03:38:03.019148617Z'
state: SUCCEEDED
workflowVersionId: '1'
Verify that the time at which the Cloud Storage bucket was updated ("timeCreated": "2021-10-13T03:38:01.735Z") and the startTime of the workflow execution correspond to each other.

Look for an event message similar to the following:
result: '{"bucket":"PROJECT_ID-bucket","object":"random.txt"}'
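The argument field shown in the execution status is a JSON string, and the result is the small object the workflow returns from it. As a rough check of that relationship, this Python sketch parses an argument-like string (trimmed to illustrative fields, with a placeholder bucket name) and rebuilds the result:

```python
import json

# Sketch: parse an execution argument like the one shown above
# (trimmed to illustrative fields; the bucket name is a placeholder)
# and rebuild the workflow's result from it.
argument = ('{"bucket":"my-project-bucket",'
            '"data":{"bucket":"my-project-bucket","name":"random.txt"},'
            '"type":"google.cloud.storage.object.v1.finalized"}')

event = json.loads(argument)
result = {"bucket": event["data"]["bucket"], "object": event["data"]["name"]}
print(json.dumps(result))
# {"bucket": "my-project-bucket", "object": "random.txt"}
```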
Congratulations, you have successfully generated a Cloud Storage event that has triggered a Workflows event receiver using Eventarc.
Clean up
To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.
Delete the workflow you created:
gcloud workflows delete ${MY_WORKFLOW}
When asked if you want to continue, enter y.

Delete your storage bucket:
gsutil rm -r gs://${PROJECT_ID}-bucket/
Delete the trigger created in this tutorial:
gcloud eventarc triggers delete storage-events-trigger
Alternatively, you can delete your Google Cloud project to stop billing for all the resources used within that project.
To delete your project:
gcloud projects delete PROJECT_ID_OR_NUMBER
Replace PROJECT_ID_OR_NUMBER with your Google Cloud project ID or number.