Trigger Workflows using Cloud Audit Logs (gcloud CLI)
This quickstart shows you how to execute a workflow using an Eventarc trigger that receives events from BigQuery using Cloud Audit Logs. BigQuery hosts public datasets for you to access and integrate into your applications. The trigger listens for completed BigQuery jobs and executes a destination workflow, passing the event as a runtime argument.
You can complete this quickstart using the Google Cloud CLI.
- Use Workflows to create and deploy a workflow that extracts and returns the email address of the user who ran the query, along with the query itself.
- Create an Eventarc trigger that connects a BigQuery job to a Workflows event receiver.
- Generate an event by running a BigQuery job using the bq command-line tool. This event is passed as a runtime argument to the destination workflow.
- View the email address of the user who ran the query and the query itself.
Before you begin
Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- Install the Google Cloud CLI.
- To initialize the gcloud CLI, run the following command:
gcloud init
- Create a Google Cloud project:
gcloud projects create PROJECT_ID
- Select the Google Cloud project that you created:
gcloud config set project PROJECT_ID
- Make sure that billing is enabled for your Google Cloud project.
- Update gcloud components:
gcloud components update
- Log in using your account:
gcloud auth login
- Enable the Compute Engine, Eventarc, Pub/Sub, and Workflows APIs:
gcloud services enable \
    compute.googleapis.com \
    eventarc.googleapis.com \
    pubsub.googleapis.com \
    workflows.googleapis.com \
    workflowexecutions.googleapis.com
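Optionally, you can confirm that the services are enabled before continuing. This is a quick sanity check rather than a required step; it simply greps the list of enabled services:
# Optional check: each of the required services should appear in the output.
gcloud services list --enabled | grep -E 'compute|eventarc|pubsub|workflows'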
- Set the configuration variables used in this quickstart:
export WORKFLOW_LOCATION=us-central1
export TRIGGER_LOCATION=us-central1
export PROJECT_ID=PROJECT_ID
gcloud config set project ${PROJECT_ID}
gcloud config set workflows/location ${WORKFLOW_LOCATION}
gcloud config set eventarc/location ${TRIGGER_LOCATION}
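If you want to confirm the values before moving on, gcloud config list prints the active project along with the workflows and eventarc location properties you just set. This is an optional check, not part of the quickstart steps:
gcloud config list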
- If you are the project creator, you are granted the basic Owner role (roles/owner). By default, this Identity and Access Management (IAM) role includes the permissions necessary for full access to most Google Cloud resources, and you can skip this step.
If you are not the project creator, required permissions must be granted on the project to the appropriate principal. For example, a principal can be a Google Account (for end users) or a service account (for applications and compute workloads). For more information, see the Roles and permissions page for your event destination.
Required permissions
To get the permissions that you need to complete this quickstart, ask your administrator to grant you the following IAM roles on your project:
- Eventarc Admin (roles/eventarc.admin)
- Logs View Accessor (roles/logging.viewAccessor)
- Project IAM Admin (roles/resourcemanager.projectIamAdmin)
- Service Account Admin (roles/iam.serviceAccountAdmin)
- Service Account User (roles/iam.serviceAccountUser)
- Service Usage Admin (roles/serviceusage.serviceUsageAdmin)
- Workflows Admin (roles/workflows.admin)
For more information about granting roles, see Manage access.
You might also be able to get the required permissions through custom roles or other predefined roles.
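If you are unsure which of these roles you already hold, one way to check is to filter the project's IAM policy for your own principal. This is an optional sketch; USER_EMAIL is a placeholder for your Google Account email:
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:user:USER_EMAIL" \
    --format="value(bindings.role)"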
The Compute Engine default service account is automatically created after enabling or using a Google Cloud service that uses Compute Engine.
For test purposes, you can attach this service account to an Eventarc trigger to represent the identity of the trigger. Note the email format to use when creating a trigger:
PROJECT_NUMBER-compute@developer.gserviceaccount.com
Replace PROJECT_NUMBER with your Google Cloud project number. You can find your project number on the Dashboard page of the Google Cloud console or by running the following command:
gcloud projects describe PROJECT_ID --format='value(projectNumber)'
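Because several of the following commands use the project number, you might find it convenient to capture it in an environment variable. This is an optional convenience rather than part of the quickstart; if you use it, substitute ${PROJECT_NUMBER} for the PROJECT_NUMBER placeholder in later commands:
# Store the project number for reuse in the IAM binding commands below.
export PROJECT_NUMBER=$(gcloud projects describe ${PROJECT_ID} --format='value(projectNumber)')
echo ${PROJECT_NUMBER}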
The Compute Engine service account is automatically granted the basic Editor role (roles/editor) on your project. However, if automatic role grants have been disabled, refer to the applicable Roles and permissions instructions on how to create a new service account and grant it the required roles.
- Grant the Eventarc Event Receiver role (roles/eventarc.eventReceiver) on the project to the Compute Engine default service account so that the Eventarc trigger can receive events from event providers:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --role=roles/eventarc.eventReceiver
- Grant the Workflows Invoker role (roles/workflows.invoker) on the project to the Compute Engine default service account so that the account has permission to trigger your workflow execution:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --role=roles/workflows.invoker
- Grant the Logging Logs Writer role (roles/logging.logWriter) on the project to the Compute Engine default service account so that the workflow can send logs to Cloud Logging:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --role=roles/logging.logWriter
- If you enabled the Cloud Pub/Sub service agent on or before April 8, 2021, to support authenticated Pub/Sub push requests, grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the Google-managed service account. Otherwise, this role is granted by default:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com \
    --role=roles/iam.serviceAccountTokenCreator
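To confirm that the grants above took effect, you can list the roles bound to the Compute Engine default service account. This is an optional check using the same IAM policy filtering pattern as before; replace the placeholders with your own values:
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --format="value(bindings.role)"
The output should include roles/eventarc.eventReceiver, roles/workflows.invoker, and roles/logging.logWriter.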
Create and deploy a workflow
Create and deploy a workflow that is executed when a completed BigQuery job triggers a workflow execution with an HTTP request.
- Open a terminal or Cloud Shell.
- In your home directory, create a new file called myFirstWorkflow.yaml or myFirstWorkflow.json.
- Copy and paste the following into the new file and save it:
YAML
main:
  params: [event]
  steps:
    - log_event:
        call: sys.log
        args:
          text: ${event}
          severity: INFO
    - extract_data:
        assign:
          - user: ${event.data.protoPayload.authenticationInfo.principalEmail}
          - query: ${event.data.protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.query.query}
    - return_data:
        return:
          user: ${user}
          query: ${query}
JSON
{ "main": { "params": [ "event" ], "steps": [ { "log_event": { "call": "sys.log", "args": { "text": "${event}", "severity": "INFO" } } }, { "extract_data": { "assign": [ { "user": "${event.data.protoPayload.authenticationInfo.principalEmail}" }, { "query": "${event.data.protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.query.query}" } ] } }, { "return_data": { "return": { "user": "${user}", "query": "${query}" } } } ] } }
- Deploy the workflow:
export MY_WORKFLOW=myFirstWorkflow
gcloud workflows deploy ${MY_WORKFLOW} --source=myFirstWorkflow.yaml
Replace .yaml with .json if you copied the JSON version of the example workflow.
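Before creating the trigger, you can optionally exercise the workflow directly with a hand-built event. The payload below is a minimal synthetic event containing only the two fields the workflow reads; it is an assumption made for testing and does not reflect the full shape of a Cloud Audit Logs event:
# Optional test run with a synthetic event; the result should echo the
# placeholder email and query back to you.
gcloud workflows run ${MY_WORKFLOW} \
    --data='{"data":{"protoPayload":{"authenticationInfo":{"principalEmail":"test@example.com"},"serviceData":{"jobCompletedEvent":{"job":{"jobConfiguration":{"query":{"query":"SELECT 1"}}}}}}}}'
If the deployment is healthy, the execution result should contain test@example.com and SELECT 1.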
Create an Eventarc trigger
The Eventarc trigger sends events from BigQuery to the Workflows destination.
Create a trigger that filters BigQuery events:
gcloud eventarc triggers create events-cal-trigger \
    --destination-workflow=${MY_WORKFLOW} \
    --destination-workflow-location=${WORKFLOW_LOCATION} \
    --event-filters="type=google.cloud.audit.log.v1.written" \
    --event-filters="serviceName=bigquery.googleapis.com" \
    --event-filters="methodName=jobservice.jobcompleted" \
    --service-account="PROJECT_NUMBER-compute@developer.gserviceaccount.com"
This creates a trigger called events-cal-trigger.
Note that when creating an Eventarc trigger for the first time in a Google Cloud project, there might be a delay in provisioning the Eventarc service agent. This issue can usually be resolved by attempting to create the trigger again. For more information, see Permission denied errors.
To confirm events-cal-trigger was successfully created, run:
gcloud eventarc triggers describe events-cal-trigger --location=${TRIGGER_LOCATION}
The output should be similar to the following listing the time of creation and trigger location:
createTime: '2021-10-14T15:15:43.872360951Z'
[...]
name: projects/PROJECT_ID/locations/us-central1/triggers/events-cal-trigger
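You can also list every trigger in the configured location, which is a convenient check once you manage more than one trigger in a project:
gcloud eventarc triggers list --location=${TRIGGER_LOCATION}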
Generate and view an event
Run a BigQuery job using the bq command-line tool to generate an event and trigger the workflow. The generated event is passed as a runtime argument to the workflow, which returns the user email and the query as the result of the workflow execution.
To trigger a workflow, run a BigQuery job that accesses a public dataset and retrieves information from it:
bq query --nouse_legacy_sql \
    'SELECT COUNT(*) FROM `bigquery-public-data`.samples.shakespeare'
The job completion generates an event that is passed as a runtime argument to the workflow, which returns the email of the user who ran the query and the query itself.
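If the workflow does not appear to fire, you can check whether the audit log entry that the trigger filters on was written at all. This optional sketch queries Cloud Logging for recent jobservice.jobcompleted entries and assumes your account is allowed to read the project's audit logs:
gcloud logging read 'protoPayload.methodName="jobservice.jobcompleted"' \
    --freshness=10m --limit=1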
To verify that a workflow execution was triggered, list the last five executions:
gcloud workflows executions list ${MY_WORKFLOW} --limit=5
The output should be similar to the following, listing a NAME and a STATE equal to SUCCEEDED for each workflow execution:
NAME: projects/218898424763/locations/us-central1/workflows/myFirstWorkflow/executions/a073ad6a-c76b-4437-8d39-2ab3ade289d2
STATE: SUCCEEDED
START_TIME: 2021-11-08T21:59:33.870561996Z
END_TIME: 2021-11-08T21:59:34.150034659Z

NAME: projects/218898424763/locations/us-central1/workflows/myFirstWorkflow/executions/35d7c730-7ba5-4055-afee-c04ed706b179
STATE: SUCCEEDED
START_TIME: 2021-10-14T19:32:39.908739298Z
END_TIME: 2021-10-14T19:32:40.147484015Z
Note that in the output, a073ad6a-c76b-4437-8d39-2ab3ade289d2 from the NAME field is the ID of the workflow execution. Copy your execution ID to use in the next step.
To view the execution status, run the following command:
gcloud workflows executions describe WORKFLOW_EXECUTION_ID --workflow=${MY_WORKFLOW}
Replace WORKFLOW_EXECUTION_ID with the ID of the workflow execution that corresponds to the time at which the BigQuery job completed.
The output should be similar to the following:
argument: [...]
duration: 0.277917625s
endTime: '2021-11-08T21:59:34.150034659Z'
name: projects/218898424763/locations/us-central1/workflows/myFirstWorkflow/executions/a073ad6a-c76b-4437-8d39-2ab3ade289d2
result: '{"query":"SELECT\n COUNT(*)\nFROM\n `bigquery-public-data`.samples.shakespeare","user":"USER_EMAIL"}'
startTime: '2021-11-08T21:59:33.870561996Z'
state: SUCCEEDED
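Rather than copying the execution ID by hand, you can chain the two commands. The sketch below grabs the most recent execution's ID (the last segment of its resource name, using the standard basename() projection) and prints only the result field; treat it as an optional convenience rather than part of the quickstart:
# Fetch the latest execution ID, then print only its result.
EXECUTION_ID=$(gcloud workflows executions list ${MY_WORKFLOW} \
    --limit=1 --format='value(name.basename())')
gcloud workflows executions describe ${EXECUTION_ID} \
    --workflow=${MY_WORKFLOW} --format='value(result)'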
Verify that the startTime of the workflow execution corresponds to the time at which the BigQuery job completed.
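If you want to cross-check against BigQuery's own record, the bq tool can list your recent jobs with their states and start times. This is an optional sketch, and the exact columns may vary by bq version:
# List the five most recent BigQuery jobs for comparison with the workflow's startTime.
bq ls -j -n 5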
Congratulations, you have successfully generated a BigQuery event that has triggered a Workflows event receiver using Eventarc.
Clean up
- Delete the workflow you created:
gcloud workflows delete ${MY_WORKFLOW}
When asked if you want to continue, enter y.
- Delete the trigger you created:
gcloud eventarc triggers delete events-cal-trigger
- Alternatively, you can delete your Google Cloud project to avoid incurring
charges. Deleting your Google Cloud project stops billing for all the
resources used within that project.
Delete a Google Cloud project:
gcloud projects delete PROJECT_ID