Aggregate and store your organization's logs

To manage your Google Cloud organization's logs, you can aggregate them from across your organization into a single Cloud Logging bucket.

This document describes the process, using the example of aggregating an organization's audit logs.

This process involves the following steps:

  1. Creating the Cloud Logging bucket for storing the aggregated logs.

  2. Creating the sink at the organization level to route the logs to the new bucket.

  3. Configuring read access to the new bucket.

  4. Searching logs from the Logs Explorer page.

Before you begin

To complete the steps in this guide, you'll need to know the following:

  • What Cloud project do you want to aggregate the logs into? In the example in this guide, we use a Cloud project called logs-test-project.

  • What is the name and location of the Logging bucket you will aggregate logs into? In this example, the bucket name is all-audit-logs-bucket and the location is global.

  • Which logs do you want to include? In this example, we include all audit logs, logName:cloudaudit.googleapis.com.

  • What is the name for the sink that collects these logs? In this example, the sink name is all-audit-logs-sink.

  • What is the organization number? In this example, the organization number is 12345.
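
If you plan to run the gcloud commands in this guide from a terminal, you can optionally record these decisions up front as shell variables. This is just a convenience sketch using the example values above; the commands later in this guide spell out the literal values:

    # Example values from this guide; replace them with your own.
    PROJECT_ID=logs-test-project
    BUCKET_NAME=all-audit-logs-bucket
    LOCATION=global
    LOG_FILTER='logName:cloudaudit.googleapis.com'
    SINK_NAME=all-audit-logs-sink
    ORGANIZATION_ID=12345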

Create a bucket

Cloud Logging buckets store the logs that are routed from other Cloud projects, folders, or organizations. For more information, see Configure and manage buckets.

To create the bucket in the Cloud project that you want to aggregate logs into, complete the following steps:

  1. Open the Google Cloud Console in the Cloud project you're using to aggregate the logs.

    Go to Google Cloud Console

  2. In a Cloud Shell terminal, run the following command to create the bucket, replacing the example bucket name, location, and project ID with your own values:

     gcloud logging buckets create all-audit-logs-bucket \
       --location=global \
       --project=logs-test-project
    
  3. Verify that the bucket was created:

    gcloud logging buckets list --project=logs-test-project
    
  4. (Optional) Set the retention period of the logs in the bucket. This example extends the retention of logs stored in the bucket to 365 days:

    gcloud logging buckets update all-audit-logs-bucket --location=global --project=logs-test-project --retention-days=365
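
    To confirm that the retention setting took effect, you can describe the bucket. This is a minimal check using the example names from this guide; the output includes the bucket's retentionDays value:

    gcloud logging buckets describe all-audit-logs-bucket \
      --location=global \
      --project=logs-test-project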
    

Create the sink

You can route logs to a bucket by creating a sink. A sink includes an inclusion filter, which selects the log entries to route, and a destination. In this guide, the destination is our bucket, all-audit-logs-bucket.

Set up the sink at the organization level

To create a sink, complete the following steps:

  1. Run the following command, replacing the example sink name, destination, log filter, and organization number with your own values:

    gcloud logging sinks create all-audit-logs-sink \
      logging.googleapis.com/projects/logs-test-project/locations/global/buckets/all-audit-logs-bucket \
      --log-filter='logName:cloudaudit.googleapis.com' \
      --description="All audit logs from my org log sink" \
      --organization=12345 \
      --include-children
    

    The --include-children flag is important so that logs from all the Cloud projects within your organization are also included. For more information, see Configure aggregated sinks.

  2. Verify the sink was created:

    gcloud logging sinks list --organization=12345
    
  3. Get the name of the service account:

    gcloud logging sinks describe all-audit-logs-sink --organization=12345
    

    The output looks similar to the following:

    writerIdentity: serviceAccount:p1234567890-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  4. Copy the entire string for writerIdentity starting with serviceAccount:.
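
If you prefer to capture the writer identity in a shell variable instead of copying it by hand, a sketch like the following works; it assumes the example sink and organization from this guide, and the captured value keeps the serviceAccount: prefix:

    SINK_SERVICE_ACCOUNT=$(gcloud logging sinks describe all-audit-logs-sink \
      --organization=12345 \
      --format='value(writerIdentity)')
    echo "${SINK_SERVICE_ACCOUNT}"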

Set permissions for the sink

After creating the sink, you must grant your sink access to write to your bucket. You can do this either through the Cloud Console or by manually editing the Identity and Access Management (IAM) policy, as described in Configure and manage buckets.

In this guide, we set the permissions through the Cloud Console using the following steps.

  1. In the Cloud Console, go to the IAM page:

    Go to the IAM page

  2. Make sure you've selected the destination Cloud project that contains the organization-level bucket you're using to aggregate the logs.

  3. Click Add.

  4. In the New principal field, add the service account without the serviceAccount: prefix.

  5. In the Select a role drop-down menu, select Logs Bucket Writer.

  6. Click Save.
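
If you would rather grant this role from the command line, the following sketch is roughly equivalent to the Console steps above; it assumes the example project from this guide and that SINK_SERVICE_ACCOUNT holds the full writerIdentity value (including the serviceAccount: prefix) from the previous section:

    # Grant the sink's writer identity the Logs Bucket Writer role on the
    # destination project.
    gcloud projects add-iam-policy-binding logs-test-project \
      --member="${SINK_SERVICE_ACCOUNT}" \
      --role='roles/logging.bucketWriter'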

Generate logs to assist in sink verification

If your sink routes audit logs, one way to validate that the sink is routing logs correctly is to start a VM in a different Cloud project, shut that VM down, and then check whether those events appear in the logs.

If you have many Cloud projects in your organization already, you might have enough audit log traffic that this step isn't needed.
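
If you do want to generate a few audit log entries on demand, one minimal sketch is to create and then delete a small Compute Engine VM in another Cloud project; both operations produce Admin Activity audit log entries. The project ID, zone, and instance name below are placeholders:

    # Creating and deleting a VM writes Admin Activity audit log entries.
    gcloud compute instances create test-logging-vm \
      --project=another-test-project \
      --zone=us-central1-a \
      --machine-type=e2-micro

    gcloud compute instances delete test-logging-vm \
      --project=another-test-project \
      --zone=us-central1-a \
      --quiet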

Configure read access to the new bucket

Now that your sink routes logs from your entire organization into your bucket, you're ready to search across all of these logs. To see the logs, you need read access to the views in the new bucket, which you grant by adding the roles/logging.viewAccessor IAM role.

In this guide, we set the permissions through the Cloud Console using the following steps.

  1. In the Cloud Console, go to the IAM page:

    Go to the IAM page

    Make sure you've selected the Cloud project you're using to aggregate the logs.

  2. Click Add.

  3. In the New principal field, add your email account.

  4. In the Select a role drop-down menu, select Logs View Accessor.

    This role provides the newly added principal with read access to all views for any buckets in the Cloud project. To limit the principal's access to your new bucket only, add a condition that lets them read only from that bucket:

    1. Click Add condition.

    2. Enter a Title and Description for the condition.

    3. In the Condition type drop-down menu, select Resource > Name.

    4. In the Operator drop-down menu, select Ends with.

    5. In the Value field, enter the bucket's location, the bucket name, and the view ID.

      For example:

      locations/global/buckets/all-audit-logs-bucket/views/view_id
      
    6. Click Save to add the condition.

  5. Click Save to set the permissions.
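
A command-line alternative to the Console steps above is a sketch like the following; it assumes the example project and bucket from this guide, a placeholder user email, and that you want to restrict the grant to a single log view with an IAM condition:

    # username@example.com and view_id are placeholders; replace them with your
    # principal and the log view you want to expose.
    gcloud projects add-iam-policy-binding logs-test-project \
      --member='user:username@example.com' \
      --role='roles/logging.viewAccessor' \
      --condition='expression=resource.name.endsWith("locations/global/buckets/all-audit-logs-bucket/views/view_id"),title=all-audit-logs-bucket-reader'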

Search logs from the Logs Explorer page

After setting the permissions in the previous section, go to the Cloud Console and complete the following steps:

  1. From the Logging menu for the Cloud project you're using to aggregate the logs, select Logs Explorer.

    Go to Logs Explorer

  2. Select Refine Scope.

  3. On the Refine scope panel, select Scope by storage.

  4. Select all-audit-logs-bucket.

  5. Click Apply.

    The Logs Explorer refreshes to show logs from your bucket.

    For information on using the Logs Explorer, see Using the Logs Explorer.
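
As a quick command-line spot check, you can also read directly from the new bucket. The following sketch assumes the example project and bucket from this guide, the bucket's default _AllLogs view, and a gcloud version that supports reading from log views:

    gcloud logging read 'logName:cloudaudit.googleapis.com' \
      --project=logs-test-project \
      --bucket=all-audit-logs-bucket \
      --location=global \
      --view=_AllLogs \
      --limit=10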