Security & Identity

Using Cloud Logging as your single pane of glass

August 21, 2020
https://storage.googleapis.com/gweb-cloudblog-publish/images/Cloud_Identity.max-2600x2600.jpg
Josh Davis

Cloud Technical Resident

Jeanno Cheung

Strategic Cloud Engineer

Logs are an essential tool for helping to secure your cloud deployments. In the first post in this series, we explored Cloud Identity logs and how you can configure alerts for potentially malicious activity in the Cloud Identity Admin Console to make your cloud deployment more secure. Today, we’ll take it a step further and look at how you can centralize collection of these logs to view activity across your deployment in a single pane of glass. 

Our best practices for enterprises using Google Cloud Platform (GCP) encourage customers to centralize log management, operations, searching, and analysis in GCP’s Cloud Logging. However, sometimes customers use services and applications that may not automatically or fully log to Cloud Logging. One example of this is Cloud Identity.

Fortunately, there’s a way to get Cloud Identity logs into this central repository by using a Cloud Function that executes the open-source G Suite log exporter tool. A Cloud Scheduler job will trigger the execution of this Cloud Function automatically, on a user-defined cadence. Here’s a visual representation of this flow:

https://storage.googleapis.com/gweb-cloudblog-publish/images/Cloud_Identity_logs.max-1200x1200.jpg

Google Cloud Professional Services also provides resources that can help you automate the deployment of the GCP tools involved in this solution. Even better, the services used are fully-managed: no work is required post-deployment.

Is this solution right for me? 

Before proceeding, let’s decide if the tools in this post are right for your organization. Cloud Identity Premium has a feature that lets you export Cloud Identity logs straight to BigQuery. This may be sufficient if your organization only needs to analyze the logs in BigQuery. However, you may want to export the logs to Cloud Logging for retention or further processing as part of your normal logging processes.

GCP also has a G Suite audit logging feature which automatically publishes some Cloud Identity logs into Cloud Logging. You can explore which Cloud Identity logs this feature covers in the documentation. The G Suite log exporter tool we will explore in this post provides additional coverage for getting Mobile, OAuth Token, and Drive logs into Cloud Logging, and also allows the user to specify exactly which logs they want to ingest from Cloud Identity.

If either of these situations is relevant to your organization, keep reading!

The tools we use

The G Suite log exporter is an open-source tool developed and maintained by Google Cloud Professional Services. It pulls Cloud Identity log data by calling the G Suite Reports API, does some cleanup and reformatting, and then writes the results to Cloud Logging on GCP using the Cloud Logging API.
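Conceptually, that flow looks something like the sketch below. This is not the tool’s actual code, just a minimal illustration of the two API calls involved; the key file, admin address, and log name are placeholders, and the service account needs domain-wide delegation to read the Reports API.

  # Minimal sketch of the Reports API -> Cloud Logging flow (not the exporter's real code).
  from google.oauth2 import service_account
  from googleapiclient.discovery import build
  from google.cloud import logging as cloud_logging

  # Placeholder key file and admin user; the credentials need domain-wide delegation.
  creds = service_account.Credentials.from_service_account_file(
      "sa-key.json",
      scopes=["https://www.googleapis.com/auth/admin.reports.audit.readonly"],
  ).with_subject("admin@example.com")

  # Pull login activity from the G Suite Reports API...
  reports = build("admin", "reports_v1", credentials=creds)
  activities = reports.activities().list(
      userKey="all", applicationName="login"
  ).execute()

  # ...then write each record to Cloud Logging as a structured entry.
  logger = cloud_logging.Client().logger("login")
  for item in activities.get("items", []):
      logger.log_struct(item)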

One way to run this tool is to spin up a virtual machine using Google Compute Engine. You could import and execute the tool as a Python package and set up a cronjob that runs the tool on a cadence. We even provide a Terraform module that will automate this setup for you. It seems simple enough, but there are some things you must consider if you take this path, including how to secure your VM and what project and VPC it belongs to. 

An alternative approach is to use Google-managed services to execute this code. Cloud Functions gives you a serverless platform for event-based code execution—no need to spin up or manage any resources to run the code. Cloud Scheduler is Google’s fully managed enterprise-grade cronjob scheduler. You can integrate a Cloud Function with a Cloud Scheduler job so that your code executes automatically on a schedule, per the following steps:

  • Create a Cloud Function that subscribes to a Cloud Pub/Sub topic

  • Create a Pub/Sub topic to trigger that function

  • Create a Cloud Scheduler job that invokes the Pub/Sub trigger

  • Run the Cloud Scheduler job (a rough gcloud sketch of these steps follows below)
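If you want to see what that wiring looks like before reaching for our script or Terraform module, the gcloud commands below are a sketch of the same steps. The function, topic, and job names are placeholders, and the source directory is assumed to contain the exporter’s main.py and a requirements.txt.

  # Sketch only: resource names below are placeholders.

  # Create the Pub/Sub topic that will trigger the function
  gcloud pubsub topics create gsuite-exporter-topic

  # Deploy a Cloud Function that subscribes to that topic
  # (assumes the current directory holds main.py and requirements.txt)
  gcloud functions deploy gsuite-exporter-fn \
    --runtime python37 \
    --entry-point main \
    --trigger-topic gsuite-exporter-topic \
    --source .

  # Create a Cloud Scheduler job that publishes to the topic every 15 minutes
  gcloud scheduler jobs create pubsub gsuite-exporter-job \
    --schedule "*/15 * * * *" \
    --topic gsuite-exporter-topic \
    --message-body "run-export"

  # Run the job once to verify the end-to-end flow
  gcloud scheduler jobs run gsuite-exporter-job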

We also provide open-source examples that will help you take this approach, using a script or a Terraform module. Post-deployment, the recurring Cloud Scheduler job will trigger the Cloud Function, and the G Suite log exporter tool will keep running on your defined cadence indefinitely. That’s it! You now have up-to-date Cloud Identity logs in Cloud Logging. And since we’re using fully managed GCP services, there’s no further effort required.

Customizing the solution

The open-source examples above can also be customized to fit your needs. Let’s take a look at the one that uses a script.

In this example, the default deploy.sh script creates a Cloud Scheduler job that triggers the exporter tool every 15 minutes. But let’s say your organization needs to pull logs every 5 minutes to meet security requirements. Simply change the “--schedule” flag in this file so that the exporter tool is triggered as often as you’d like. The cadence is defined in unix-cron format.

https://storage.googleapis.com/gweb-cloudblog-publish/images/Customizing_the_solution_1.max-500x500.jpg
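For instance, if the job is created with a gcloud scheduler command like the one sketched earlier (job and topic names are placeholders), the only change needed is the cron expression:

  # Every 5 minutes instead of every 15 (unix-cron format)
  gcloud scheduler jobs create pubsub gsuite-exporter-job \
    --schedule "*/5 * * * *" \
    --topic gsuite-exporter-topic \
    --message-body "run-export"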

You may also want to customize main.py to control which specific Cloud Identity logs you grab. Our example pulls every log type currently supported by the exporter tool: Admin activity, Google Drive activity, Login activity, Mobile activity, and OAuth Token activity. The log types are defined in the sync_all function call in this file. Simply edit the “applications=” line (Line 34) to customize the log types you export (see below).

https://storage.googleapis.com/gweb-cloudblog-publish/images/Customizing_the_solution_2.max-800x800.jpg
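As a rough illustration only (not a verbatim copy of the repository’s main.py, whose exact signature may differ by version), the function might look something like the sketch below, with the applications list trimmed to just the log types you care about. The admin address and project ID are placeholders.

  # Illustrative sketch of a Pub/Sub-triggered entry point; parameter names
  # follow the open-source gsuite-exporter package but may differ by version.
  from gsuite_exporter.cli import sync_all

  def main(event, context):
      sync_all(
          admin_user="admin@example.com",           # placeholder super-admin user
          api="reports_v1",                         # G Suite Reports API
          project_id="my-logging-project",          # placeholder destination project
          exporter_cls="stackdriver_exporter.StackdriverExporter",
          applications=["admin", "login", "token"]  # e.g. drop 'drive' and 'mobile'
      )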

Next steps

A few minutes after running the script or executing the Terraform module, you will have a Cloud Function deployed that automatically pulls the logs you want from Cloud Identity and puts them into Cloud Logging on a schedule you define. Now you can integrate them into your existing logging processes: send them to Cloud Storage for retention, to BigQuery for analysis, or to a Pub/Sub topic to be exported to a destination such as Splunk.
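One way to route the exported logs onward is with log sinks, for example using gcloud. The sink names, destinations, and filters below are placeholders; adjust the filter to match the log names the exporter actually writes in your project.

  # Placeholders throughout; tune --log-filter to the log names you export.
  # Retention in Cloud Storage:
  gcloud logging sinks create gsuite-archive-sink \
    storage.googleapis.com/my-gsuite-log-archive \
    --log-filter='logName:"login" OR logName:"admin" OR logName:"token"'

  # Analysis in BigQuery:
  gcloud logging sinks create gsuite-bq-sink \
    bigquery.googleapis.com/projects/my-project/datasets/gsuite_logs \
    --log-filter='logName:"login" OR logName:"admin" OR logName:"token"'

  # Export to Splunk (or another SIEM) via Pub/Sub:
  gcloud logging sinks create gsuite-splunk-sink \
    pubsub.googleapis.com/projects/my-project/topics/gsuite-to-splunk \
    --log-filter='logName:"login" OR logName:"admin" OR logName:"token"'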

A Cloud Function integrated with a Cloud Scheduler job is a simple but effective way to collect Cloud Identity logs into Cloud Logging, so that your Google Cloud logs live behind a single pane of glass. The fully managed and easy-to-deploy examples we discussed today free up resources and time so your organization can further focus on keeping your cloud safe.
