Trigger Workflows with direct events from Cloud Storage (gcloud CLI)

This quickstart shows you how to execute a workflow using an Eventarc trigger that receives events from Cloud Storage.

The trigger executes the workflow by listening for an object creation event in a Cloud Storage bucket and passes the event as a runtime argument to a destination workflow.

In this quickstart, you:

  1. Create a Cloud Storage bucket as an event source.

  2. Use Workflows to create and deploy a workflow that extracts and returns the name of the storage bucket and the name of an uploaded file.

  3. Create an Eventarc trigger that connects the Cloud Storage bucket to the Workflows event receiver.

  4. Generate an event by uploading a text file to the Cloud Storage bucket. This event is passed as a runtime argument to the destination workflow.

  5. View the name of the bucket and the name of the text file as a result of the workflow execution.


Before you begin

Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.
  3. To initialize the gcloud CLI, run the following command:

    gcloud init
  4. Create or select a Google Cloud project.

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID

      Replace PROJECT_ID with your Google Cloud project name.

  5. Make sure that billing is enabled for your Google Cloud project.

  6. Enable the Compute Engine, Eventarc, Pub/Sub, and Workflows APIs:

    gcloud services enable \
    compute.googleapis.com \
    eventarc.googleapis.com \
    pubsub.googleapis.com \
    workflows.googleapis.com \
    workflowexecutions.googleapis.com

  7. Update gcloud components:
    gcloud components update
  8. Sign in using your account:
    gcloud auth login

Set your environment variables

Set the environment variables used in this quickstart.

export PROJECT_ID=PROJECT_ID
export WORKFLOW_LOCATION=us-central1
export TRIGGER_LOCATION=us-central1
gcloud config set project ${PROJECT_ID}
gcloud config set workflows/location ${WORKFLOW_LOCATION}
gcloud config set eventarc/location ${TRIGGER_LOCATION}

You can find your project ID on the Welcome page of the Google Cloud console.

Set up your service accounts

Grant the required permissions to the service accounts used in this quickstart.

  1. If you are the project creator, you are granted the basic Owner role (roles/owner). By default, this Identity and Access Management (IAM) role includes the permissions necessary for full access to most Google Cloud resources and you can skip this step.

    If you are not the project creator, required permissions must be granted on the project to the appropriate principal. For example, a principal can be a Google Account (for end users) or a service account (for applications and compute workloads). For more information, see the Roles and permissions page for your event destination.

    Required permissions

    To get the permissions that you need to complete this quickstart, ask your administrator to grant you the required IAM roles on your project.

    For more information about granting roles, see Manage access.

    You might also be able to get the required permissions through custom roles or other predefined roles.

  2. The Compute Engine default service account is created automatically after you enable or use a Google Cloud service that uses Compute Engine.

    For test purposes, you can attach this service account to an Eventarc trigger to represent the identity of the trigger. Note the email format to use when creating a trigger:

    PROJECT_NUMBER-compute@developer.gserviceaccount.com
    

    Replace PROJECT_NUMBER with your Google Cloud project number. You can find your project number on the Welcome page of the Google Cloud console or by running the following command:

    gcloud projects describe PROJECT_ID --format='value(projectNumber)'

    The Compute Engine service account is automatically granted the basic Editor role (roles/editor) on your project. However, if automatic role grants have been disabled, refer to the applicable Roles and permissions instructions to create a new service account and grant it the required roles.

  3. Grant the Eventarc Event Receiver role (roles/eventarc.eventReceiver) on the project to the Compute Engine default service account so that the Eventarc trigger can receive events from event providers.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
        --role=roles/eventarc.eventReceiver
  4. Grant the Workflows Invoker role (roles/workflows.invoker) on the project to the Compute Engine default service account so that the account has permission to trigger your workflow execution.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
        --role=roles/workflows.invoker
  5. Grant the Logging Logs Writer role (roles/logging.logWriter) on the project to the Compute Engine default service account so that the workflow can send logs to Cloud Logging.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
        --role=roles/logging.logWriter
  6. Before creating a trigger for direct events from Cloud Storage, grant the Pub/Sub Publisher role (roles/pubsub.publisher) to the Cloud Storage service agent, a Google-managed service account:

    SERVICE_ACCOUNT="$(gsutil kms serviceaccount -p PROJECT_ID)"
    
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:${SERVICE_ACCOUNT}" \
        --role='roles/pubsub.publisher'
    
  7. If you enabled the Cloud Pub/Sub service agent on or before April 8, 2021, to support authenticated Pub/Sub push requests, grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the Google-managed service account. Otherwise, this role is granted by default:
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com \
        --role=roles/iam.serviceAccountTokenCreator

Create a Cloud Storage bucket

Create a Cloud Storage bucket to use as the event source:

  gsutil mb -l us-central1 gs://${PROJECT_ID}-bucket/

Create and deploy a workflow

Create and deploy a workflow that executes when an object is created in the Cloud Storage bucket, triggering the workflow with an HTTP request.

  1. In your home directory, create a new file called myEventWorkflow.yaml or myEventWorkflow.json.

  2. Copy and paste the following into the new file and save it:

    YAML

      main:
        params: [event]
        steps:
            - log_event:
                call: sys.log
                args:
                    text: ${event}
                    severity: INFO
            - extract_bucket_object:
                assign:
                - bucket: ${event.data.bucket}
                - object: ${event.data.name}
            - return_bucket_object:
                return:
                    bucket: ${bucket}
                    object: ${object}
      

    JSON

    {
      "main": {
        "params": ["event"],
        "steps": [
          {
            "log_event": {
              "call": "sys.log",
              "args": {
                "text": "${event}",
                "severity": "INFO"
              }
            }
          },
          {
            "extract_bucket_object": {
              "assign": [
                {"bucket": "${event.data.bucket}"},
                {"object": "${event.data.name}"}
              ]
            }
          },
          {
            "return_bucket_object": {
              "return": {
                "bucket": "${bucket}",
                "object": "${object}"
              }
            }
          }
        ]
      }
    }
  3. Deploy the workflow:

    export MY_WORKFLOW=myEventWorkflow
    gcloud workflows deploy ${MY_WORKFLOW} --source=myEventWorkflow.yaml
    

    Replace .yaml with .json if you copied the JSON version of the example workflow.
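
    The extraction the workflow performs on the event argument can be sketched outside of Workflows. The following Python snippet uses a hypothetical, heavily simplified event payload (a real Cloud Storage event carries many more fields) and mirrors what the extract_bucket_object and return_bucket_object steps do with event.data:

    ```python
    # Simplified sketch of the CloudEvent argument passed to the workflow.
    # The payload below is illustrative; only the fields the workflow reads
    # are shown.
    event = {
        "data": {
            "bucket": "my-project-bucket",  # hypothetical bucket name
            "name": "random.txt",           # name of the uploaded object
        }
    }

    # Mirrors the workflow's extract and return steps:
    # bucket <- event.data.bucket, object <- event.data.name
    result = {
        "bucket": event["data"]["bucket"],
        "object": event["data"]["name"],
    }
    print(result)
    ```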

Create an Eventarc trigger

The Eventarc trigger sends events from the Cloud Storage bucket to the Workflows destination.

  1. Create a trigger that filters Cloud Storage events:

    gcloud eventarc triggers create storage-events-trigger \
        --destination-workflow=${MY_WORKFLOW} \
        --event-filters="type=google.cloud.storage.object.v1.finalized" \
        --event-filters="bucket=${PROJECT_ID}-bucket" \
        --service-account="PROJECT_NUMBER-compute@developer.gserviceaccount.com"
    

    This creates a trigger called storage-events-trigger.

    Note that when creating an Eventarc trigger for the first time in a Google Cloud project, there might be a delay in provisioning the Eventarc service agent. This issue can usually be resolved by attempting to create the trigger again. For more information, see Permission denied errors.

  2. To confirm storage-events-trigger was successfully created, run:

    gcloud eventarc triggers describe storage-events-trigger --location=${TRIGGER_LOCATION}
    

    The output should be similar to the following, listing the time of creation and the trigger location:

    createTime: '2021-10-14T15:15:43.872360951Z'
    [...]
    name: projects/PROJECT_ID/locations/us-central1/triggers/storage-events-trigger
    

Generate and view an event

  1. To generate an event, upload a text file to Cloud Storage:

    echo "Hello World" > random.txt
    gsutil cp random.txt gs://${PROJECT_ID}-bucket/random.txt
    

    The upload generates an event that is passed as a runtime argument to the workflow, which returns the names of the storage bucket and the uploaded file.

  2. To verify that a workflow execution was triggered, list the last five executions:

    gcloud workflows executions list ${MY_WORKFLOW} --limit=5
    

    The output should be similar to the following, listing a NAME and STATE equal to SUCCEEDED for each workflow execution:

    NAME: projects/606789101455/locations/us-central1/workflows/myEventWorkflow/executions/8c02b8f1-8836-4a6d-99d9-fc321eb9668f
    STATE: SUCCEEDED
    START_TIME: 2021-10-13T03:38:03.019148617Z
    END_TIME: 2021-10-13T03:38:03.249705805Z
    NAME: projects/606789101455/locations/us-central1/workflows/myEventWorkflow/executions/a6319d9d-36a6-4117-904e-3d1118bdc90a
    STATE: SUCCEEDED
    START_TIME: 2021-10-13T17:28:51.492864252Z
    END_TIME: 2021-10-13T17:28:52.227212414Z
    

    Note that in the NAME field in the preceding example, a6319d9d-36a6-4117-904e-3d1118bdc90a is the ID of the workflow execution. Copy your execution ID as it is used in the next step.
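
    The execution ID is simply the final path segment of the fully qualified NAME value. A minimal Python sketch, using a hypothetical execution name of the same shape:

    ```python
    # The execution ID is the last segment of the fully qualified
    # execution name returned by `gcloud workflows executions list`.
    name = ("projects/606789101455/locations/us-central1/workflows/"
            "myEventWorkflow/executions/a6319d9d-36a6-4117-904e-3d1118bdc90a")

    execution_id = name.rsplit("/", 1)[-1]
    print(execution_id)  # the trailing UUID is the execution ID
    ```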

  3. To view the execution status, run the following command:

    gcloud workflows executions describe WORKFLOW_EXECUTION_ID --workflow=${MY_WORKFLOW}
    

    Replace WORKFLOW_EXECUTION_ID with the ID of the workflow execution that corresponds to the time at which the file was uploaded to the bucket.

    The output is similar to the following:

    argument: [...]
    name: projects/218898424763/locations/us-central1/workflows/myEventWorkflow/executions/86d2567b-0f1e-49b3-8b10-cdac5d0f6239
    result: '{"bucket":"PROJECT_ID-bucket","object":"random.txt"}'
    startTime: '2021-10-13T03:38:03.019148617Z'
    state: SUCCEEDED
    
  4. Verify that the time at which the Cloud Storage object was created (its "timeCreated": "2021-10-13T03:38" value) and the startTime of the workflow execution correspond to each other.
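
    Both values are RFC 3339 timestamps, so the check can also be done programmatically. A minimal Python sketch, with illustrative (hypothetical) timestamp values in place of the real timeCreated and startTime fields:

    ```python
    from datetime import datetime

    # Illustrative RFC 3339 timestamps; real values come from the object's
    # timeCreated field and the execution's startTime field. Fractional
    # seconds are truncated to microseconds so fromisoformat() accepts them.
    time_created = "2021-10-13T03:38:02.500000Z"  # object upload (hypothetical)
    start_time = "2021-10-13T03:38:03.019148Z"    # execution start (hypothetical)

    def parse_rfc3339(ts: str) -> datetime:
        # Older Python versions of fromisoformat() do not accept a "Z"
        # suffix, so replace it with an explicit UTC offset first.
        return datetime.fromisoformat(ts.replace("Z", "+00:00"))

    delta = parse_rfc3339(start_time) - parse_rfc3339(time_created)
    # The execution should start at, or shortly after, the upload time.
    print(delta.total_seconds())
    ```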

Congratulations, you have successfully generated a Cloud Storage event that has triggered a Workflows event receiver using Eventarc.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, either delete the Google Cloud project that contains the resources, or keep the project and delete the individual resources.

  1. Delete the workflow you created:

    gcloud workflows delete ${MY_WORKFLOW}
    

    When asked if you want to continue, enter y.

  2. Delete your storage bucket:

    gsutil rm -r gs://${PROJECT_ID}-bucket/
    
  3. Delete the trigger created in this tutorial:

    gcloud eventarc triggers delete storage-events-trigger
    
  4. Alternatively, you can delete your Google Cloud project to avoid incurring charges. Deleting your Google Cloud project stops billing for all the resources used within that project.

    Delete a Google Cloud project:

    gcloud projects delete PROJECT_ID

What's next