This tutorial is for DevOps teams who want to get notifications for important Google Cloud events pushed to their collaboration platforms, such as Google Chat, Slack, or Microsoft Teams. Notifications free teams from having to regularly log into the Google Cloud console to check for event updates.
As an example for this tutorial, notifications are generated after you take disk snapshots. Taking a disk snapshot backs up data from your zonal persistent disks or regional persistent disks. You can modify this tutorial to automatically push notifications to your collaboration systems when other important Google Cloud events occur—for example, when a virtual machine (VM) instance is created or deleted.
The code for this tutorial is available in a GitHub repository.
This tutorial is intended for engineers who specialize in DevOps and cloud technology. It assumes you are familiar with Cloud Logging, Pub/Sub, and Cloud Functions.
The State of DevOps reports identified capabilities that drive software delivery performance. This tutorial helps you develop several of those capabilities.
Architecture
The following diagram shows the different components that you use in this tutorial.
First, you create a persistent disk and take a disk snapshot. You then filter the log events that correspond to successful disk snapshots and export the events by publishing them to a Pub/Sub topic. A Cloud Function reads the message from the topic and sends a push notification to a webhook. For this tutorial, this webhook is represented by a Cloud Function.
Objectives
- Set up Cloud Logging to export selected events to Pub/Sub so that Cloud Functions can consume them.
- Deploy a Cloud Function that consumes events exported by Cloud Logging and then triggers a webhook.
Costs
In this document, you use the following billable components of Google Cloud:
To generate a cost estimate based on your projected usage, use the pricing calculator.
When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.
Before you begin
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- In the Google Cloud console, activate Cloud Shell.
You use Cloud Shell to run all commands in this tutorial.
Enable the Compute Engine, Cloud Logging, Cloud Functions, and Pub/Sub APIs:
gcloud services enable \
    compute.googleapis.com \
    logging.googleapis.com \
    cloudfunctions.googleapis.com \
    pubsub.googleapis.com
Preparing your environment
In Cloud Shell, set the gcloud default for the Compute Engine region and zone that you want to use for this tutorial:

gcloud config set compute/region us-central1
gcloud config set compute/zone us-central1-a
This tutorial uses the us-central1 region and the us-central1-a zone. You can change the region and zone to suit your needs. For more information, see Geography and regions.

Define the environment variables that you use for this tutorial:
PROJECT=$(gcloud config get-value project)
PUBSUB_TOPIC="gce-snapshots-events"
DISK="my-disk-1"
WEBHOOK_NAME="webhookEmulator"
WEBHOOK_URL="https://$(gcloud config get-value compute/region)-$PROJECT.cloudfunctions.net/$WEBHOOK_NAME"
Creating resources
In this section, you create the following Google Cloud resources for this tutorial:
- Pub/Sub topic
- Cloud Logging sink
- Cloud Function
Create a Pub/Sub topic
In Cloud Shell, create a Pub/Sub topic to publish event messages to:
gcloud pubsub topics create $PUBSUB_TOPIC
Create a Cloud Logging sink to Pub/Sub
Cloud Logging lets you store, search, and analyze events from Google Cloud. You can filter and export these logs to Cloud Storage, BigQuery, and Pub/Sub.
In Cloud Shell, export the logs generated by disk snapshots to Pub/Sub:
gcloud logging sinks create gce-events-sink \
    pubsub.googleapis.com/projects/$PROJECT/topics/$PUBSUB_TOPIC \
    --log-filter='resource.type="gce_disk" jsonPayload.event_type="GCE_OPERATION_DONE" jsonPayload.event_subtype="compute.disks.createSnapshot"'
The output contains the service account's email address that Cloud Logging uses when it publishes logs to the Pub/Sub topic that you created earlier. This email address takes the form of SERVICE_ACCOUNT_NAME@gcp-sa-logging.iam.gserviceaccount.com.

Copy the service account's email address. You need it in the next section.
Set up permissions
In Cloud Shell, grant the service account the Pub/Sub Publisher IAM role (roles/pubsub.publisher) so that it can publish messages to the Pub/Sub topic:

gcloud pubsub topics add-iam-policy-binding $PUBSUB_TOPIC \
    --member='serviceAccount:SERVICE_ACCOUNT_EMAIL_ADDRESS' \
    --role='roles/pubsub.publisher'

Replace SERVICE_ACCOUNT_EMAIL_ADDRESS with the email address that you copied in the previous section.
Create a webhook
Typically in production, you are notified of important Google Cloud events by push notifications sent to your collaboration platforms. Most of these platforms offer webhooks. In this tutorial, you create a Cloud Function to simulate a webhook from one of these platforms. This Cloud Function, which is triggered over HTTP, prints the content of the received messages.
By default, any user or service can invoke an HTTP Cloud Function. You can configure Identity and Access Management (IAM) on HTTP functions to restrict this behavior so that your HTTP function can be invoked only by providing authentication credentials in the request.
In Cloud Shell, clone the Git repository that contains the tutorial code:
git clone https://github.com/GoogleCloudPlatform/gcf-notification-forwarding
Switch to the working directory:
cd gcf-notification-forwarding/
Deploy the webhookEmulator webhook Cloud Function by using an HTTP trigger:

gcloud functions deploy $WEBHOOK_NAME \
    --runtime=nodejs8 \
    --trigger-http \
    --allow-unauthenticated \
    --source=webhook/
This command can take up to two minutes to complete.
Create a Cloud Function to push notifications
You create a Cloud Function that subscribes to the Pub/Sub topic that you created earlier in push mode. Then, each time Cloud Logging exports (or pushes) an event to the Pub/Sub topic, this Cloud Function is triggered. The function receives the event, processes it, and pushes it to the HTTP endpoint of the webhook that you created earlier.
In Cloud Shell, deploy the Cloud Function:
gcloud functions deploy pushEventsToWebhook \
    --runtime=nodejs8 \
    --trigger-topic=$PUBSUB_TOPIC \
    --set-env-vars=WEBHOOK_URL=$WEBHOOK_URL \
    --allow-unauthenticated \
    --source=push_notification/
Testing the setup
To test this setup, you take a disk snapshot and check whether the webhook receives the event log that the disk snapshot generates.
In Cloud Shell, create a zonal persistent disk. By default, this creates a 500 GB standard persistent disk.
gcloud compute disks create $DISK \
    --zone=$(gcloud config get-value compute/zone)
Trigger the creation of a snapshot of the disk that you created earlier:
gcloud compute disks snapshot $DISK \
    --zone=$(gcloud config get-value compute/zone) \
    --storage-location=$(gcloud config get-value compute/region)
This command can take up to two minutes to complete. After the snapshot is taken, an admin activity log entry is generated. The event log is filtered and pushed to the Pub/Sub topic. The subscribed Cloud Function picks it up, formats it, and pushes it to the webhook's HTTP endpoint.
After a few minutes, check whether the webhook received the event log:
gcloud functions logs read $WEBHOOK_NAME \
    --region=$(gcloud config get-value compute/region) \
    --limit=10
The output is similar to the following:
{
  "data": [
    {
      "type": "disk",
      "url": "https://console.cloud.google.com/compute/disksDetail/zones/us-central1-a/disks/my-disk-1?project=$PROJECT&supportedpurview=project",
      "name": "my-disk-1"
    },
    {
      "type": "project",
      "project_id": "$PROJECT",
      "project_url": "https://console.cloud.google.com/?project=$PROJECT"
    },
    {
      "zone": "us-central1-a"
    },
    {
      "date_time": "2020-04-15T09:07:21.205-06:00"
    }
  ]
}
The output shows that the webhook received the notification that a disk snapshot was taken.
Clean up
The easiest way to eliminate billing is to delete the project that you created for the tutorial.
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
What's next
- Read about Google Cloud Observability.
- Learn about routing logs for Cloud Logging.
- Read about Pub/Sub.
- Explore tutorials for Pub/Sub.
- Read about Cloud Functions.
- Explore tutorials for Cloud Functions.
- Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.
- Read our resources about DevOps.
- Learn more about the DevOps capabilities related to this tutorial:
- Take the DevOps quick check to understand where you stand in comparison with the rest of the industry.