This document describes how to use Terraform and the google_eventarc_trigger resource to create Eventarc triggers for the following Google Cloud destinations:
- Cloud Run
- Google Kubernetes Engine (GKE)
- Workflows
For more information about using Terraform, see the Terraform on Google Cloud documentation.
The code samples in this guide route direct events from Cloud Storage but can be adapted for any event provider. For example, to learn how to route direct events from Pub/Sub to Cloud Run, see the Terraform quickstart.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the Cloud Resource Manager and Identity and Access Management (IAM) APIs.
- In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Terraform is integrated into the Cloud Shell environment and you can use Cloud Shell to deploy your Terraform resources without having to install Terraform.
Prepare to deploy Terraform
Before deploying any Terraform resources, you must create a Terraform configuration file. A Terraform configuration file lets you define your preferred end-state for your infrastructure using the Terraform syntax.
Prepare Cloud Shell
In Cloud Shell, set the default Google Cloud project where you want to apply your Terraform configurations. You only need to run this command once per project, and you can run it in any directory:
export GOOGLE_CLOUD_PROJECT=PROJECT_ID
Replace PROJECT_ID with the ID of your Google Cloud project.
Note that environment variables are overridden if you set explicit values in the Terraform configuration file.
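As a sketch of what "explicit values" means here, a provider block like the following takes precedence over the environment variable; the project ID and region are placeholders, not values from this guide:

```hcl
# Hypothetical provider block. An explicit project here overrides the
# GOOGLE_CLOUD_PROJECT environment variable set in Cloud Shell.
provider "google" {
  project = "my-project-id" # placeholder project ID
  region  = "us-central1"
}
```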
Prepare the directory
Each Terraform configuration file must have its own directory (also called a root module). In Cloud Shell, create a directory and create a new file within that directory:
mkdir DIRECTORY && cd DIRECTORY && touch main.tf
The filename must have the .tf extension. For example, in this document, the file is referred to as main.tf.
Define your Terraform configuration
Copy the applicable Terraform code samples into your newly created main.tf file. Optionally, you can copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.
Typically, you apply the entire configuration at once. However, you can also target a specific resource. For example:
terraform apply -target="google_eventarc_trigger.default"
Note that the Terraform code samples use interpolation for substitutions, such as referencing variables and resource attributes, and for calling functions.
Enable APIs
Terraform samples typically assume that the required APIs are enabled in your Google Cloud project. Use the following code to enable the APIs:
Cloud Run
GKE
Workflows
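The API-enablement sample is not shown above, but it can be sketched with google_project_service resources; the exact service list is an assumption and varies by destination (the list below assumes the Cloud Run destination):

```hcl
# Sketch: enable the APIs an Eventarc-to-Cloud-Run setup typically needs.
# The service list is an assumption; adjust it for your destination.
resource "google_project_service" "required" {
  for_each = toset([
    "eventarc.googleapis.com",
    "run.googleapis.com",
    "pubsub.googleapis.com",
  ])
  service            = each.key
  disable_on_destroy = false
}
```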
Create a service account and configure its access
Every Eventarc trigger is associated with an IAM service account at the time the trigger is created. Use the following code to create a dedicated service account and grant the user-managed service account specific Identity and Access Management roles to manage events:
Cloud Run
The Pub/Sub service agent is automatically created when the Pub/Sub API is enabled. If the Pub/Sub service agent was created on or before April 8, 2021, and the service account does not have the Cloud Pub/Sub Service Agent role (roles/pubsub.serviceAgent), grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the service agent. For more information, see Create and grant roles to service agents.

resource "google_project_iam_member" "tokencreator" {
  project = data.google_project.project.id
  role    = "roles/iam.serviceAccountTokenCreator"
  member  = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-pubsub.iam.gserviceaccount.com"
}
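The service account sample itself is not shown above; a sketch follows. The account ID is a placeholder, and the role set is an assumption based on the roles an Eventarc trigger targeting Cloud Run commonly needs (eventarc.eventReceiver to receive events, run.invoker to call the service):

```hcl
# Sketch (assumed names and roles): a dedicated trigger service account
# with the grants an Eventarc-to-Cloud-Run trigger commonly needs.
resource "google_service_account" "eventarc" {
  account_id   = "eventarc-trigger-sa" # placeholder
  display_name = "Eventarc trigger service account"
}

resource "google_project_iam_member" "eventreceiver" {
  project = data.google_project.project.id
  role    = "roles/eventarc.eventReceiver"
  member  = "serviceAccount:${google_service_account.eventarc.email}"
}

resource "google_project_iam_member" "runinvoker" {
  project = data.google_project.project.id
  role    = "roles/run.invoker"
  member  = "serviceAccount:${google_service_account.eventarc.email}"
}
```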
GKE
Before creating the service account, enable Eventarc to manage GKE clusters:
Create the service account:
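A sketch of that service account follows; the account ID is a placeholder, and the role shown is an assumption (eventarc.eventReceiver is the role an event-receiving trigger identity typically needs):

```hcl
# Sketch (assumed names and roles): service account for a
# GKE-destination Eventarc trigger.
resource "google_service_account" "eventarc_gke" {
  account_id   = "eventarc-gke-trigger-sa" # placeholder
  display_name = "Eventarc GKE trigger service account"
}

resource "google_project_iam_member" "gke_eventreceiver" {
  project = data.google_project.project.id
  role    = "roles/eventarc.eventReceiver"
  member  = "serviceAccount:${google_service_account.eventarc_gke.email}"
}
```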
Workflows
The Pub/Sub service agent is automatically created when the Pub/Sub API is enabled. If the Pub/Sub service agent was created on or before April 8, 2021, and the service account does not have the Cloud Pub/Sub Service Agent role (roles/pubsub.serviceAgent), grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the service agent. For more information, see Create and grant roles to service agents.

resource "google_project_iam_member" "tokencreator" {
  project = data.google_project.project.id
  role    = "roles/iam.serviceAccountTokenCreator"
  member  = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-pubsub.iam.gserviceaccount.com"
}
Create a Cloud Storage bucket as an event provider
Use the following code to create a Cloud Storage bucket, and grant the Pub/Sub Publisher role (roles/pubsub.publisher) to the Cloud Storage service agent.
Cloud Run
GKE
Workflows
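The bucket sample is not shown above; a sketch follows. The bucket name is a placeholder, and the google_storage_project_service_account data source is used to look up the Cloud Storage service agent that receives the Pub/Sub Publisher role:

```hcl
# Sketch (assumed bucket name): the event-source bucket, plus the grant
# that lets the Cloud Storage service agent publish events via Pub/Sub.
data "google_storage_project_service_account" "gcs_account" {}

resource "google_storage_bucket" "default" {
  name                        = "eventarc-bucket-example" # placeholder; must be globally unique
  location                    = "us-central1"
  uniform_bucket_level_access = true
}

resource "google_project_iam_member" "pubsubpublisher" {
  project = data.google_project.project.id
  role    = "roles/pubsub.publisher"
  member  = "serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"
}
```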
Create an event receiver to be the event target
Create an event receiver using one of the following Terraform resources:
Cloud Run
Create a Cloud Run service as an event destination for the Eventarc trigger:
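The service sample is not shown above; a sketch follows. The service name matches the hello-events service referenced later in this guide, and the container image is the prebuilt hello image used elsewhere in this guide; other settings are assumptions:

```hcl
# Sketch: Cloud Run service used as the event destination.
resource "google_cloud_run_v2_service" "default" {
  name     = "hello-events"
  location = "us-central1"

  template {
    containers {
      # Prebuilt image that logs received HTTP requests.
      image = "us-docker.pkg.dev/cloudrun/container/hello"
    }
  }
}
```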
GKE
To simplify this guide, you create the Google Kubernetes Engine service that serves as the event destination outside of Terraform, between applications of your Terraform configuration.
If you haven't created a trigger in this Google Cloud project before, run the following command to create the Eventarc service agent:
gcloud beta services identity create --service eventarc.googleapis.com
Create a GKE cluster:
Deploy a Kubernetes service on GKE that will receive HTTP requests and log events by using a prebuilt Cloud Run image, us-docker.pkg.dev/cloudrun/container/hello:

Get authentication credentials to interact with the cluster:

gcloud container clusters get-credentials eventarc-cluster \
    --region=us-central1
Create a deployment named hello-gke:

kubectl create deployment hello-gke \
    --image=us-docker.pkg.dev/cloudrun/container/hello
Expose the deployment as a Kubernetes service:
kubectl expose deployment hello-gke \
    --type ClusterIP --port 80 --target-port 8080
Make sure the pod is running:
kubectl get pods
The output should be similar to the following:
NAME                         READY   STATUS    RESTARTS   AGE
hello-gke-5b6574b4db-rzzcr   1/1     Running   0          2m45s
If the STATUS is Pending or ContainerCreating, the pod is deploying. Wait a minute for the deployment to complete, and check the status again.

Make sure the service is running:
kubectl get svc
The output should be similar to the following:
NAME         TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)   AGE
hello-gke    ClusterIP   34.118.230.123   <none>        80/TCP    4m46s
kubernetes   ClusterIP   34.118.224.1     <none>        443/TCP   14m
Workflows
Deploy a workflow that executes when an object is updated in the Cloud Storage bucket:
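The workflow sample is not shown above; a sketch follows. The workflow name matches the storage-workflow-tf workflow referenced later in this guide, but the workflow body is an assumption; it simply logs and returns the received event:

```hcl
# Sketch: a minimal workflow used as the event destination.
# The source_contents body is an assumption.
resource "google_workflows_workflow" "default" {
  name            = "storage-workflow-tf"
  region          = "us-central1"
  source_contents = <<-EOF
    main:
      params: [event]
      steps:
        - log_event:
            call: sys.log
            args:
              text: $${event}
        - return_event:
            return: $${event}
  EOF
}
```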
Define an Eventarc trigger
An Eventarc trigger routes events from an event provider to an event destination. Use the google_eventarc_trigger resource to specify CloudEvents attributes in the matching_criteria and filter the events. For more information, follow the instructions when creating a trigger for a specific provider, event type, and destination. Events that match all the filters are sent to the destination.
Cloud Run
Create an Eventarc trigger that routes Cloud Storage events to the hello-events Cloud Run service.
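The trigger sample is not shown above; a sketch follows. The trigger name and event type match the output shown later in this guide; the bucket name and service account email are placeholders you must replace:

```hcl
# Sketch (assumed placeholders): trigger routing object-finalized
# events from the bucket to the Cloud Run service.
resource "google_eventarc_trigger" "default" {
  name     = "trigger-storage-cloudrun-tf"
  location = "us-central1"

  matching_criteria {
    attribute = "type"
    value     = "google.cloud.storage.object.v1.finalized"
  }
  matching_criteria {
    attribute = "bucket"
    value     = "BUCKET_NAME" # replace with your bucket's name
  }

  destination {
    cloud_run_service {
      service = "hello-events"
      region  = "us-central1"
    }
  }

  service_account = "TRIGGER_SA_EMAIL" # replace with your trigger service account email
}
```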
GKE
Create an Eventarc trigger that routes Cloud Storage events to the hello-gke GKE service.
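The trigger sample is not shown above; a sketch follows. The trigger name matches the output shown later in this guide, and the destination gke block points at the cluster and service created earlier; the bucket name and service account email are placeholders:

```hcl
# Sketch (assumed placeholders): trigger routing object-finalized
# events from the bucket to the GKE service.
resource "google_eventarc_trigger" "default" {
  name     = "trigger-storage-gke-tf"
  location = "us-central1"

  matching_criteria {
    attribute = "type"
    value     = "google.cloud.storage.object.v1.finalized"
  }
  matching_criteria {
    attribute = "bucket"
    value     = "BUCKET_NAME" # replace with your bucket's name
  }

  destination {
    gke {
      cluster   = "eventarc-cluster"
      location  = "us-central1"
      namespace = "default"
      service   = "hello-gke"
      path      = "/"
    }
  }

  service_account = "TRIGGER_SA_EMAIL" # replace with your trigger service account email
}
```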
Workflows
Create an Eventarc trigger that routes Cloud Storage events to the workflow named storage-workflow-tf.
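The trigger sample is not shown above; a sketch follows. The trigger name matches the output shown later in this guide; the bucket name, project ID, and service account email are placeholders:

```hcl
# Sketch (assumed placeholders): trigger routing object-finalized
# events from the bucket to the workflow.
resource "google_eventarc_trigger" "default" {
  name     = "trigger-storage-workflows-tf"
  location = "us-central1"

  matching_criteria {
    attribute = "type"
    value     = "google.cloud.storage.object.v1.finalized"
  }
  matching_criteria {
    attribute = "bucket"
    value     = "BUCKET_NAME" # replace with your bucket's name
  }

  destination {
    # Fully qualified workflow ID; replace PROJECT_ID with yours.
    workflow = "projects/PROJECT_ID/locations/us-central1/workflows/storage-workflow-tf"
  }

  service_account = "TRIGGER_SA_EMAIL" # replace with your trigger service account email
}
```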
Apply Terraform
Use the Terraform CLI to provision infrastructure based on the configuration file.
To learn how to apply or remove a Terraform configuration, see Basic Terraform commands.
Initialize Terraform. You only need to do this once per directory.
terraform init
Optionally, to use the latest Google provider version, include the -upgrade option:

terraform init -upgrade
Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
terraform plan
Make corrections to the configuration as necessary.
Apply the Terraform configuration by running the following command and entering yes at the prompt:

terraform apply
Wait until Terraform displays the "Apply complete!" message.
Verify the creation of resources
Cloud Run
Confirm that the service has been created:
gcloud run services list --region us-central1
Confirm that the trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-cloudrun-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: Cloud Run service: hello-events
ACTIVE: Yes
LOCATION: us-central1
GKE
Confirm that the service has been created:
kubectl get service hello-gke
Confirm that the trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-gke-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: GKE: hello-gke
ACTIVE: Yes
LOCATION: us-central1
Workflows
Confirm that the workflow has been created:
gcloud workflows list --location us-central1
Confirm that the Eventarc trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-workflows-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: Workflows: storage-workflow-tf
ACTIVE: Yes
LOCATION: us-central1
Generate and view an event
You can generate an event and confirm that the Eventarc trigger is working as expected.
Retrieve the name of the Cloud Storage bucket you previously created:
gcloud storage ls
Upload a text file to the Cloud Storage bucket:
echo "Hello World" > random.txt
gcloud storage cp random.txt gs://BUCKET_NAME/random.txt
Replace BUCKET_NAME with the Cloud Storage bucket name you retrieved in the previous step. For example:

gcloud storage cp random.txt gs://BUCKET_NAME/random.txt
The upload generates an event and the event receiver service logs the event's message.
Verify that an event is received:
Cloud Run
Filter the log entries created by your service:
gcloud logging read 'jsonPayload.message: "Received event of type google.cloud.storage.object.v1.finalized."'
Look for a log entry similar to the following:
Received event of type google.cloud.storage.object.v1.finalized. Event data: { "kind": "storage#object", "id": "trigger-cloudrun-BUCKET_NAME/random.txt", ...}
GKE
Find the pod ID:
POD_NAME=$(kubectl get pods -o custom-columns=":metadata.name" --no-headers)
This command uses kubectl's formatted output.

Check the logs of the pod:
kubectl logs $POD_NAME
Look for a log entry similar to the following:
{"severity":"INFO","eventType":"google.cloud.storage.object.v1.finalized","message": "Received event of type google.cloud.storage.object.v1.finalized. Event data: ...}
Workflows
Verify that a workflow execution is triggered by listing the last five executions:
gcloud workflows executions list storage-workflow-tf --limit=5
The output should include a list of executions with a NAME, STATE, START_TIME, and END_TIME.

Get the results for the most recent execution:
EXECUTION_NAME=$(gcloud workflows executions list storage-workflow-tf --limit=1 --format "value(name)")
gcloud workflows executions describe $EXECUTION_NAME
Confirm that the output is similar to the following:
...
result: '"Received event google.cloud.storage.object.v1.finalized - BUCKET_NAME, random.txt"'
startTime: '2024-12-13T17:23:50.451316533Z'
state: SUCCEEDED
...
Clean up
Remove resources previously applied with your Terraform configuration by running the following command and entering yes at the prompt:

terraform destroy
You can also delete your Google Cloud project to avoid incurring charges. Deleting your Google Cloud project stops billing for all the resources used within that project.
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.