Storing your organization's logs in a log bucket

This page describes how to store your logs in a single log bucket. A common way to manage logs is to aggregate them from across your organization into one bucket. This guide walks through that process, using the example of aggregating all audit logs.

This process involves the following steps:

  1. Creating the log bucket for storing the aggregated logs.

  2. Creating the sink at the organization level to route the logs to the new log bucket.

  3. Configuring read access to the new log bucket.

  4. Searching logs from the Logs Explorer page.

Before you begin

To complete the steps in this guide, you'll need to know the following:

  • What Cloud project do you want to aggregate the logs into? In the example in this guide, we use a Cloud project called logs-test-project.

  • What is the name and location of the log bucket you will aggregate logs into? In this example, the bucket name is all-audit-logs-bucket and the location is global.

  • Which logs do you want to include? In this example, we include all audit logs, logName:cloudaudit.googleapis.com.

  • What is the name for the sink that collects these logs? In this example, the sink name is all-audit-logs-sink.

  • What is the organization number? In this example, the organization number is 12345.
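If you plan to run the commands in this guide from a shell, it can be convenient to capture these values as variables up front. The values below are the example placeholders from this guide; substitute your own:

```shell
# Example values from this guide; replace each with your own.
PROJECT_ID="logs-test-project"       # Cloud project that stores the aggregated logs
LOCATION="global"                    # Log bucket location
BUCKET_NAME="all-audit-logs-bucket"  # Log bucket name
SINK_NAME="all-audit-logs-sink"      # Organization-level sink name
ORG_ID="12345"                       # Organization number
```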

Creating a log bucket

Log buckets store the logs that are routed from other Cloud projects, folders, or organizations. For more information, see Managing log buckets.

To create the bucket in the Cloud project that you want to aggregate logs into, complete the following steps:

  1. Open the Google Cloud Console in the Cloud project you're using to aggregate the logs.

    Go to Google Cloud Console

  2. In a Cloud Shell terminal, run the following command to create a bucket, replacing the example values with your own:

     gcloud logging buckets create all-audit-logs-bucket \
       --location=global \
       --project=logs-test-project
    
  3. Verify that the bucket was created:

    gcloud logging buckets list --project=logs-test-project
    
  4. Optionally, set the retention period of the logs in the bucket. This example extends the retention of logs stored in the bucket to 365 days:

    gcloud logging buckets update all-audit-logs-bucket \
      --location=global \
      --project=logs-test-project \
      --retention-days=365
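If you set a retention period, you can confirm it took effect by describing the bucket; the output includes the bucket's retentionDays setting:

```shell
# Show the bucket's configuration, including retentionDays.
gcloud logging buckets describe all-audit-logs-bucket \
  --location=global \
  --project=logs-test-project
```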
    

Creating the logs sink

You can route logs to a log bucket by creating a logs sink. A sink includes a filter, which selects the log entries to route through the sink, and a destination. In this guide, the destination is our bucket, all-audit-logs-bucket.

Setting up the sink at the organization level

To create a sink, complete the following steps.

  1. Run the following command, replacing the example values with your own:

    gcloud logging sinks create all-audit-logs-sink \
      logging.googleapis.com/projects/logs-test-project/locations/global/buckets/all-audit-logs-bucket \
      --log-filter='logName:cloudaudit.googleapis.com' \
      --description="All audit logs from my org log sink" \
      --organization=12345 \
      --include-children
    

    The --include-children flag ensures that logs from all the Cloud projects within your organization are also included. For more information, refer to the Aggregated sinks guide.

  2. Verify that the sink was created:

    gcloud logging sinks list --organization=12345
    
  3. Get the name of the service account:

    gcloud logging sinks describe all-audit-logs-sink --organization=12345
    

    The output looks similar to the following:

    writerIdentity: serviceAccount:p1234567890-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  4. Copy the entire string for writerIdentity starting with serviceAccount:.
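Alternatively, rather than copying the value by hand, you can extract just the writerIdentity field with a --format flag and keep it in a shell variable for the next section:

```shell
# Capture the sink's writer identity (it includes the serviceAccount: prefix).
WRITER_IDENTITY=$(gcloud logging sinks describe all-audit-logs-sink \
  --organization=12345 \
  --format='value(writerIdentity)')
echo "$WRITER_IDENTITY"
```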

Setting permissions for the sink

After creating the sink, you must grant it access to write to your bucket. You can do this either through the Cloud Console or by manually editing the Identity and Access Management (IAM) policy, as described in Managing log buckets.

In this guide, we set the permissions through the Cloud Console using the following steps.

  1. In the Cloud Console, go to the IAM page:

    Go to the IAM page

  2. Make sure you've selected the destination Cloud project that contains the organization-level bucket you're using to aggregate the logs.

  3. Click Add.

  4. In the New principal field, add the service account without the serviceAccount: prefix.

  5. In the Select a role drop-down menu, select Logs Buckets Writer.

  6. Click Save.
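If you prefer the command line, the equivalent grant can be sketched with gcloud. This assumes the sink's writer identity (including the serviceAccount: prefix) is stored in a WRITER_IDENTITY variable:

```shell
# Grant the Logs Bucket Writer role in the project that owns the bucket.
# WRITER_IDENTITY must include the serviceAccount: prefix.
gcloud projects add-iam-policy-binding logs-test-project \
  --member="$WRITER_IDENTITY" \
  --role="roles/logging.bucketWriter"
```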

Generating logs to assist in sink verification

If your sink routes audit logs, one way to validate that it is working is to start a VM in a different Cloud project, shut that VM down, and then check whether those events appear in the logs.

If you have many Cloud projects in your organization already, you might have enough audit log traffic that this step isn't needed.
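For example, creating and then deleting a throwaway VM produces Admin Activity audit log entries that the sink should route; the project name and zone below are placeholders:

```shell
# Create and delete a test VM in a different Cloud project to generate
# Admin Activity audit log entries. Project and zone are example values.
gcloud compute instances create sink-test-vm \
  --project=another-test-project \
  --zone=us-central1-a
gcloud compute instances delete sink-test-vm \
  --project=another-test-project \
  --zone=us-central1-a \
  --quiet
```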

Configuring read access to the new log bucket

Now that your sink routes logs from your entire organization into your bucket, you're ready to search across all of these logs. To see the views in the new bucket, users need read access, specifically the roles/logging.viewAccessor role.

In this guide, we set the permissions through the Cloud Console using the following steps.

  1. In the Cloud Console, go to the IAM page:

    Go to the IAM page

    Make sure you've selected the Cloud project you're using to aggregate the logs.

  2. Click Add.

  3. In the New principal field, add your email account.

  4. In the Select a role drop-down menu, select Logs Views Accessor.

    This role provides users with read access to all views. To limit user access to a specific bucket, add a condition based on the resource name.

    1. Click Add condition.

    2. Enter a Title and Description for the condition.

    3. In the Condition type drop-down menu, select Resource > Name.

    4. In the Operator drop-down menu, select Ends with.

    5. In the Value field, enter the location and bucket name portion of your bucket ID.

      For example:

      locations/global/buckets/all-audit-logs-bucket
      
    6. Click Save to add the condition.

  5. Click Save to set the permissions.
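The same grant, including the bucket-scoped condition, can also be sketched with gcloud; the email address below is a placeholder, and the condition expression uses Common Expression Language (CEL):

```shell
# Grant Logs Views Accessor, limited to views in the aggregation bucket.
gcloud projects add-iam-policy-binding logs-test-project \
  --member="user:user@example.com" \
  --role="roles/logging.viewAccessor" \
  --condition='expression=resource.name.endsWith("locations/global/buckets/all-audit-logs-bucket"),title=bucket-reader,description=Limit access to all-audit-logs-bucket'
```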

Searching logs from the Logs Explorer page

After setting the permissions in the previous section, go to the Cloud Console and complete the following steps.

  1. From the Logging menu for the Cloud project you're using to aggregate the logs, select Logs Explorer.

    Go to Logs Explorer

  2. Select Refine Scope.

  3. On the Refine scope panel, select Scope by storage.

  4. Select all-audit-logs-bucket.

  5. Click Apply.

    The Logs Explorer refreshes to show logs from your bucket.

    For information on using the Logs Explorer, refer to Using the Logs Explorer.
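As an alternative to the Logs Explorer, you can also read the bucket's built-in _AllLogs view from the command line:

```shell
# Read the five most recent entries from the bucket's _AllLogs view.
gcloud logging read \
  --project=logs-test-project \
  --bucket=all-audit-logs-bucket \
  --location=global \
  --view=_AllLogs \
  --limit=5
```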