Aggregate and store your organization's logs

This document describes how to manage the logs generated by the resources contained in your Google Cloud organization by using a non-intercepting aggregated sink.

You can configure an aggregated sink to be intercepting or non-intercepting, depending on whether you want to control which logs can be queried in, or routed through, the sinks in child resources. In this tutorial, you create a non-intercepting aggregated sink that routes your organization's audit logs to a log bucket. You can configure the sink's filter to route other types of log entries. For more information about aggregated sinks, see Collate and route organization- and folder-level logs to supported destinations.

In this tutorial, you perform the following steps:

  1. Create the Cloud Logging bucket for storing the aggregated logs.

  2. Create a non-intercepting aggregated sink at the organization level to route the logs to the new log bucket.

  3. Configure read access to the new log bucket.

  4. Query and view your logs from the Logs Explorer page.

Before you begin

Ensure the following:

  • You have one of the following IAM roles for the Google Cloud organization or folder from which you're routing logs:

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)

    The permissions contained in these roles allow you to create, delete, or modify sinks. For information about setting IAM roles, see the Logging Access control guide.

  • If you use VPC Service Controls, then you must add an ingress rule to the service perimeter. For more information about VPC Service Controls limitations, see Aggregated sinks and VPC Service Controls limitations.

Create a log bucket

Log buckets store the logs that are routed from other Google Cloud projects, folders, or organizations. For more information, see Configure log buckets.

To create the log bucket in the Google Cloud project that you want to aggregate logs into, complete the following steps:

  1. Navigate to the Google Cloud console, or click the following button:

    Go to Google Cloud console

  2. In a Cloud Shell terminal, run the following command to create a log bucket. Replace the example bucket name and project ID with your own values:

    gcloud logging buckets create all-audit-logs-bucket \
      --location=global \
      --project=logs-test-project
    
  3. Verify that the log bucket was created:

    gcloud logging buckets list --project=logs-test-project
    
  4. Optional: Set the retention period of the logs in the bucket. This example extends the retention of logs stored in the bucket to 365 days:

    gcloud logging buckets update all-audit-logs-bucket \
      --location=global \
      --project=logs-test-project \
      --retention-days=365
    

Create the sink

You can route logs to a log bucket by creating a sink. A sink includes an inclusion filter, an optional exclusion filter, and a destination. In this tutorial, the destination is a log bucket, all-audit-logs-bucket. For more information about sinks, see Route logs to supported destinations.
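
For example, an exclusion filter can be attached directly to the sink with the --exclusion flag of gcloud logging sinks create. The following sketch is illustrative rather than a step in this tutorial; the exclusion name and filter are assumptions:

```shell
# Hypothetical variant of this tutorial's sink: route audit logs to the
# bucket, but drop Data Access audit log entries before they are stored.
gcloud logging sinks create all-audit-logs-sink \
  logging.googleapis.com/projects/logs-test-project/locations/global/buckets/all-audit-logs-bucket \
  --log-filter='logName:cloudaudit.googleapis.com' \
  --exclusion=name=no-data-access,filter='logName:data_access' \
  --organization=12345 \
  --include-children
```

Log entries that match the inclusion filter but also match an exclusion filter are not routed to the destination.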

Set up the sink at the organization level

To create the sink, complete the following steps:

  1. Run the following command, replacing the example organization ID, project ID, and bucket name with your own values:

    gcloud logging sinks create all-audit-logs-sink \
      logging.googleapis.com/projects/logs-test-project/locations/global/buckets/all-audit-logs-bucket \
      --log-filter='logName:cloudaudit.googleapis.com' \
      --description="All audit logs from my org log sink" \
      --organization=12345 \
      --include-children
    

    The --include-children flag is important: it ensures that logs from all the Google Cloud projects and folders within your organization are also routed, not just logs generated in the organization resource itself. For more information, see Collate and route organization-level logs to supported destinations.

  2. Verify that the sink was created:

    gcloud logging sinks list --organization=12345
    
  3. Get the name of the service account:

    gcloud logging sinks describe all-audit-logs-sink --organization=12345
    

    The output looks similar to the following:

    writerIdentity: serviceAccount:o1234567890-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  4. Copy the entire string for writerIdentity starting with serviceAccount:.
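
If you'd rather capture the writer identity in a shell variable than copy it by hand, gcloud's --format flag can extract just that field. A convenience sketch, using the example sink and organization ID from this tutorial:

```shell
# Extract only the writerIdentity field from the sink description.
SINK_SA=$(gcloud logging sinks describe all-audit-logs-sink \
  --organization=12345 \
  --format='value(writerIdentity)')

# The value includes the serviceAccount: prefix, for example:
# serviceAccount:o1234567890-12345@gcp-sa-logging.iam.gserviceaccount.com
echo "${SINK_SA}"
```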

Grant access to the sink

After creating the sink, you must grant your sink permission to write to your log bucket. You can do this either through the Google Cloud console or by manually editing the Identity and Access Management (IAM) policy, as described in Set destination permissions.

In this tutorial, you set the permissions through the Google Cloud console by completing the following steps:

  1. In the navigation panel of the Google Cloud console, select IAM:

    Go to IAM

  2. Ensure you've selected the destination Google Cloud project that contains the organization-level bucket you're using to aggregate the logs.

  3. Click Grant access.

  4. In the New principals field, add the service account without the serviceAccount: prefix.

  5. In the Select a role menu, select Logs Bucket Writer.

  6. Click Save.
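
The console steps above also have a command-line equivalent. A sketch, assuming the destination project and the writer identity from the earlier example output; note that on the command line the member keeps its serviceAccount: prefix:

```shell
# Grant the sink's service account the Logs Bucket Writer role
# (roles/logging.bucketWriter) on the destination project.
gcloud projects add-iam-policy-binding logs-test-project \
  --member='serviceAccount:o1234567890-12345@gcp-sa-logging.iam.gserviceaccount.com' \
  --role='roles/logging.bucketWriter'
```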

Generate logs to assist in sink verification

If your sink routes audit logs, one way to validate that the sink is routing logs correctly is to start a VM in a different Google Cloud project, shut it down, and then check whether those events appear in the logs.

If you have many Google Cloud projects in your organization already, you might have enough audit log traffic that this step isn't needed.
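
For example, creating and then deleting a short-lived VM in another Google Cloud project in your organization emits Admin Activity audit log entries that the sink should route. The project ID, zone, and instance name here are assumptions; substitute your own:

```shell
# Create a throwaway VM; this emits an Admin Activity audit log entry.
gcloud compute instances create sink-test-vm \
  --project=another-test-project \
  --zone=us-central1-a \
  --machine-type=e2-micro

# Delete it again; this emits another audit log entry.
gcloud compute instances delete sink-test-vm \
  --project=another-test-project \
  --zone=us-central1-a \
  --quiet
```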

Configure read access to the new log bucket

Now that your sink routes logs from your entire organization into your log bucket, you can search across all of these logs. To restrict who has access to the logs within your log buckets, create a log view and grant principals the roles/logging.viewAccessor IAM role.

In this tutorial, you set the permissions through the Google Cloud console by completing the following steps:

  1. In the navigation panel of the Google Cloud console, select IAM:

    Go to IAM

    Make sure you've selected the Google Cloud project you're using to aggregate the logs.

  2. Click Grant access.

  3. In the New principals field, add your email account.

  4. In the Select a role menu, select Logs View Accessor.

    This role provides a principal with read access to log views on log buckets in the Google Cloud project. To limit a user's access, create a log view or select an existing one, and add a condition that lets the user read only from your new log bucket:

    1. Click Add condition.

    2. Enter a Title and Description for the condition.

    3. In the Condition type menu, select Resource, and then select Name.

    4. In the Operator menu, select Ends with.

    5. In the Value field, enter the name for the log view. For information about log views, see Configure log views on a log bucket.

      The name of a log view has the following format:

      locations/global/buckets/all-audit-logs-bucket/views/view_id
      
    6. Click Save to add the condition.

  5. Click Save to set the permissions.
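
The same restriction can be sketched from the command line: create a log view on the bucket with gcloud logging views create, then grant roles/logging.viewAccessor with an IAM condition that mirrors the Ends with condition above. The view ID, filter, and user email are assumptions for illustration:

```shell
# Create a log view on the bucket that exposes only audit log entries.
gcloud logging views create audit-view \
  --bucket=all-audit-logs-bucket \
  --location=global \
  --project=logs-test-project \
  --log-filter='logName:cloudaudit.googleapis.com'

# Grant a user read access, restricted to that view by a condition.
gcloud projects add-iam-policy-binding logs-test-project \
  --member='user:example-user@example.com' \
  --role='roles/logging.viewAccessor' \
  --condition='title=audit-view-only,expression=resource.name.endsWith("locations/global/buckets/all-audit-logs-bucket/views/audit-view")'
```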

Search logs from the Logs Explorer page

After setting the permissions in the previous section, go to the Google Cloud console and complete the following steps:

  1. In the navigation panel of the Google Cloud console, select Logging, and then select Logs Explorer:

    Go to Logs Explorer

  2. Select Refine scope.

  3. On the Refine scope panel, select Scope by storage.

  4. Select all-audit-logs-bucket.

  5. Click Apply.

    The Logs Explorer refreshes to show logs from your log bucket.

    For information on using the Logs Explorer, see Using the Logs Explorer.
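
If you prefer the command line to the Logs Explorer, gcloud logging read can query the bucket directly through one of its log views. A sketch, assuming the _AllLogs view that Cloud Logging creates on every log bucket by default:

```shell
# Read the ten most recent audit log entries from the bucket's
# default _AllLogs view.
gcloud logging read 'logName:cloudaudit.googleapis.com' \
  --project=logs-test-project \
  --bucket=all-audit-logs-bucket \
  --location=global \
  --view=_AllLogs \
  --limit=10
```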