Configure log buckets to store logs

This document describes how to create and manage Cloud Logging buckets by using the Google Cloud console, the Google Cloud CLI, and the Logging API. The instructions describe how to create and manage log buckets at the Cloud project level, but you can also create log buckets at the folder and organization levels.

You can upgrade log buckets to use Log Analytics. Log Analytics lets you run SQL queries on your log data, helping you troubleshoot security and networking issues. You can also use BigQuery to view the data stored in a log bucket when it is upgraded to use Log Analytics and when a linked dataset exists. Creating a linked dataset lets you join your log data with other data stored in BigQuery, and access data from other tools like Data Studio and Looker.

For a conceptual overview of buckets, see Routing and storage overview: Log buckets.

Before you begin

To get started with buckets, do the following:

  • Ensure that you've enabled billing for your Google Cloud project.

  • Ensure that your Identity and Access Management (IAM) role grants you the permissions necessary to create, upgrade, and link buckets.

    The Logs Configuration Writer (roles/logging.configWriter) role is the minimal predefined role that grants the permissions required to manage buckets. For the full list of permissions and roles, see Access control with IAM.

  • Understand the LogBucket formatting requirements, including the supported regions in which you can store your logs.

  • Consider setting a default resource location to apply a default storage region for the _Required and _Default buckets in your organization.

  • To use BigQuery to view the data stored in a log bucket, upgrade the bucket to use Log Analytics and then create a linked BigQuery dataset.

Create a bucket

You can create a maximum of 100 buckets per Cloud project.

To create a user-defined log bucket for your Cloud project, do the following:

Google Cloud console

To create a log bucket in your Cloud project, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. Click Create Logs Bucket.

  3. Enter a Name and Description for your bucket.

  4. Optional: If you aren't using Service Perimeters, then upgrade your bucket to use Log Analytics.

    1. Select Upgrade to use Log Analytics.

      When you upgrade a bucket to use Log Analytics, you can query your logs in the Log Analytics page by using SQL queries. You can also continue to view your logs by using the Logs Explorer.

      The region is automatically set to global and the retention period is set to the default value. These settings can't be modified.

    2. Optional: To view your logs in BigQuery, select Create a new BigQuery dataset that links to this bucket and enter a unique dataset name.

      When you select this option, BigQuery can read the data stored in your log bucket. You can then query the data in the BigQuery interface, where you can join your log data with other data, and access data from other tools like Data Studio and Looker.

  5. Optional: To select the storage region for your logs, click the Select logs bucket region menu and select a region. If you don't select a region, then the global region is used, which means that the logs could be physically located in any of the regions.

    You can't set the region for log buckets that are upgraded to use Log Analytics.

  6. Optional: To set a custom retention period for the logs in the bucket, click Next.

    If you set a custom retention period for your log bucket and you want to upgrade it to use Log Analytics, then you must change the retention period to the default value.

    In the Retention field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs. If you don't customize the retention period, the default is 30 days.

    You can also update your bucket to apply custom retention after you create it.

  7. Click Create bucket.

    After the log bucket is created, Logging upgrades the bucket and creates the dataset link if you selected those options.

    It might take a moment for these steps to complete.

gcloud

To create a log bucket in your Cloud project, run the gcloud logging buckets create command:

gcloud logging buckets create BUCKET_ID --location=LOCATION OPTIONAL_FLAGS

The variable LOCATION refers to the region in which you want your logs to be stored.

For example, if you want to create a bucket with the BUCKET_ID my-bucket in the asia-east2 region, your command would look like the following:

gcloud logging buckets create my-bucket --location asia-east2 --description "My first bucket"

API

To create a bucket, use projects.locations.buckets.create in the Logging API. Prepare the arguments to the method as follows:

  1. Set the parent parameter to be the resource in which to create the bucket: projects/PROJECT_ID/locations/LOCATION

    The variable LOCATION refers to the region in which you want your logs to be stored.

    For example, if you want to create a bucket for the project my-project in the asia-east2 region, your parent parameter would look like this: projects/my-project/locations/asia-east2

  2. Set the bucketId parameter; for example, my-bucket.

  3. Call projects.locations.buckets.create to create the bucket.
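
For example, combining these arguments into a request with curl might look like the following. This is a minimal sketch, assuming a project named my-project; it uses gcloud auth print-access-token for credentials, and the bucketId is passed as a query parameter:

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"description": "My first bucket", "retentionDays": 30}' \
  "https://logging.googleapis.com/v2/projects/my-project/locations/asia-east2/buckets?bucketId=my-bucket"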

After creating a bucket, create a sink to route log entries to your bucket and configure log views to control who can access the logs in your new bucket and which logs are accessible to them. You can also update the bucket to configure custom retention, restricted fields, or CMEK settings.

Track log ingestion

The Logs Storage page in the Google Cloud console tracks the volume of logs data ingested by log buckets.

To track the logs data ingestion for your Cloud project, go to the Logs Storage page in the console:

Go to Logs Storage

The Logs Storage page displays a summary of statistics for your Cloud project:

The summary statistics report the logs data ingestion amounts for the currently selected Cloud project.

The following statistics are reported:

  • Current total volume: The amount of logs data that your Cloud project has ingested since the first day of the current calendar month.

  • Previous month volume: The amount of logs data that your Cloud project ingested in the last calendar month.

  • Projected volume by EOM: The estimated amount of logs data that your Cloud project will ingest by the end of the current calendar month, based on current usage.

The log ingestion volume statistics don't include the _Required bucket. The logs in that bucket can't be excluded or disabled.

The Log Router page in the Google Cloud console gives you tools that you can use to minimize any charges for logs ingestion or storage that exceeds your monthly allotment. You can do the following:

  • Disable logs ingestion at the bucket level.
  • Exclude certain log entries from ingestion into Logging.

For more information, see Manage sinks.

Manage buckets

This section describes how to manage your log buckets using the Google Cloud CLI or the Google Cloud console.

Update a bucket

To update the properties of your bucket, such as the description or retention period, do the following:

Google Cloud console

To update your bucket's properties, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. For the bucket you want to update, click More.

  3. Select Edit bucket.

  4. Edit your bucket as needed.

  5. Click Update bucket.

gcloud

To update your bucket's properties, run the gcloud logging buckets update command:

gcloud logging buckets update BUCKET_ID --location=LOCATION UPDATED_ATTRIBUTES

For example:

gcloud logging buckets update my-bucket --location=global --description "Updated description"

API

To update your bucket's properties, use projects.locations.buckets.patch in the Logging API.
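
For example, a curl request that updates a bucket's description might look like the following. This sketch assumes a project named my-project and a bucket named my-bucket; the updateMask query parameter names the fields to change:

curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"description": "Updated description"}' \
  "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket?updateMask=description"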

Upgrade a bucket to use Log Analytics

The following restrictions apply when you upgrade an existing bucket to use Log Analytics:

  • The bucket can't be protected by Service Perimeters.
  • The bucket's retention period must be set to the default value.

Google Cloud console

To upgrade an existing bucket to use Log Analytics, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. Locate the bucket that you want to upgrade.

  3. When the Log Analytics available column displays Upgrade, you can upgrade the log bucket to use Log Analytics. Click Upgrade.

    A dialog box opens. Click Confirm.

gcloud

To upgrade a log bucket to use Log Analytics, you must use the Google Cloud console.

API

To upgrade a log bucket to use Log Analytics, you must use the Google Cloud console.

Create a linked BigQuery dataset

When you want to use the capabilities of BigQuery to analyze your log data, upgrade a log bucket to use Log Analytics, and then create a linked dataset. With this configuration, you can use BigQuery to read the logs stored in the log bucket.

Google Cloud console

To create a link to a BigQuery dataset for an existing log bucket, do the following:

  1. Review the Before you begin section of this document.

  2. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  3. Locate the log bucket and verify that the Log Analytics available column displays Open.

    If this column displays Upgrade, then the log bucket hasn't been upgraded to use Log Analytics. Configure Log Analytics:

    1. Click Upgrade.
    2. Click Confirm in the dialog.

    After the upgrade completes, proceed to the next step.

  4. On the log bucket, click More, and then click Edit bucket.

    The Edit log bucket dialog opens.

  5. Select Create a new BigQuery dataset that links to this bucket and enter the name for the new dataset. The dataset name must be unique for your Google Cloud project.

  6. Click Done and then click Update bucket.

    Logging displays the linked dataset name on the Logs Storage page; however, it might take several minutes before BigQuery recognizes the dataset.

gcloud

To create a linked dataset, you must use the Google Cloud console.

API

To create a linked dataset, you must use the Google Cloud console.

Lock a bucket

Locking a bucket against updates also locks the bucket's retention policy. After a retention policy is locked, you can't delete the bucket until every log entry in the bucket has fulfilled the bucket's retention period.

To prevent anyone from updating or deleting a log bucket, lock the bucket by doing the following:

Google Cloud console

The Google Cloud console doesn't support locking a log bucket.

gcloud

To lock your bucket, run the gcloud logging buckets update command with the --locked flag:

gcloud logging buckets update BUCKET_ID --location=LOCATION --locked

For example:

gcloud logging buckets update my-bucket --location=global --locked

API

To lock your bucket's attributes, use projects.locations.buckets.patch in the Logging API. Set the locked parameter to true.
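
For example, a curl request that locks a bucket might look like the following, assuming a project named my-project and a bucket named my-bucket:

curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"locked": true}' \
  "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket?updateMask=locked"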

List buckets

To list the log buckets associated with a Cloud project, and to see details such as retention settings, do the following:

Google Cloud console

Go to the Logs Storage page:

Go to Logs Storage

A table named Log buckets lists the buckets associated with the current Cloud project.

The table lists the following attributes for each log bucket:

  • Name: The name given to the bucket when it was created.
  • Description: The description given to the bucket when it was created.
  • Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
  • Region: The geographic location in which the bucket's data is stored.
  • Status: Whether the bucket is locked or unlocked.

If a bucket is pending deletion by Cloud Logging, its table entry is annotated with a warning indicator.

gcloud

Run the gcloud logging buckets list command:

gcloud logging buckets list

You see the following attributes for the log buckets:

  • LOCATION: The region in which the bucket's data is stored.
  • BUCKET_ID: The name given to the bucket when it was created.
  • RETENTION_DAYS: The number of days that the bucket's data will be stored by Cloud Logging.
  • LIFECYCLE_STATE: Indicates whether the bucket is pending deletion by Cloud Logging.
  • LOCKED: Whether the bucket is locked or unlocked.
  • CREATE_TIME: A timestamp that indicates when the bucket was created.
  • UPDATE_TIME: A timestamp that indicates when the bucket was last modified.

You can also view the attributes for just one bucket. For example, to view the details for the _Default log bucket in the 'global' region, run the gcloud logging buckets describe command:

gcloud logging buckets describe _Default --location=global

API

To list the log buckets associated with a Cloud project, use projects.locations.buckets.list in the Logging API.
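
For example, a curl request that lists the buckets in all locations of a project might look like the following. This sketch assumes a project named my-project; the hyphen (-) in the location segment matches all locations:

curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://logging.googleapis.com/v2/projects/my-project/locations/-/buckets"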

View a bucket's details

To view the details of a single log bucket, do the following:

Google Cloud console

Go to the Logs Storage page:

Go to Logs Storage

On the log bucket, click More > View bucket details.

The dialog box lists the following attributes for the log bucket:

  • Name: The name given to the bucket when it was created.
  • Description: The description given to the bucket when it was created.
  • Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
  • Region: The geographic location in which the bucket's data is stored.
  • Log analytics: Indicates whether your bucket is upgraded to use Log Analytics.
  • BigQuery analysis: Indicates whether a BigQuery dataset is linked to your bucket.
  • BigQuery dataset: Provides a link to your BigQuery dataset, which opens in the BigQuery SQL workspace page. The date that BigQuery linking was enabled is also shown.

gcloud

Run the gcloud logging buckets describe command:

gcloud logging buckets describe _Default --location=global

You see the following attributes for the log bucket:

  • createTime: A timestamp that indicates when the bucket was created.
  • description: The description given to the bucket when it was created.
  • lifecycleState: Indicates whether the bucket is pending deletion by Cloud Logging.
  • name: The name given to the bucket when it was created.
  • retentionDays: The number of days that the bucket's data will be stored by Cloud Logging.
  • updateTime: A timestamp that indicates when the bucket was last modified.

API

To view the details of a single log bucket, use projects.locations.buckets.get in the Logging API.
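
For example, a curl request that retrieves the details of the _Default bucket might look like the following, assuming a project named my-project:

curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/_Default"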

Delete a bucket

To delete a log bucket, do the following:

Google Cloud console

To delete a log bucket, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. Locate the bucket that you want to delete, and click More.

  3. If the Linked dataset in BigQuery column displays a link, then delete the linked BigQuery dataset:

    1. Click Edit bucket.

    2. Clear Create a new BigQuery dataset that links to this bucket, click Done, and then click Update bucket.

      After you return to the Logs Storage page, click More for the bucket you want to delete, then proceed to the next steps.

  4. Select Delete bucket.

  5. On the confirmation panel, click Delete.

  6. On the Logs Storage page, your bucket has an indicator that it's pending deletion. The bucket, including all the logs in it, is deleted after 7 days.

gcloud

To delete a log bucket, run the gcloud logging buckets delete command:

gcloud logging buckets delete BUCKET_ID --location=LOCATION

API

To delete a bucket, use projects.locations.buckets.delete in the Logging API.
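
For example, a curl request that deletes a bucket might look like the following, assuming a project named my-project and a bucket named my-bucket:

curl -X DELETE \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket"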

A deleted bucket stays in this pending state for 7 days, and Logging continues to route logs to the bucket during that time. To stop routing logs to a deleted bucket, you can delete the log sinks that have that bucket as a destination, or you can modify the filter for the sinks to stop routing logs to the deleted bucket.

Restore a deleted bucket

You can restore, or undelete, a log bucket that's in the pending deletion state. To restore a log bucket, do the following:

Google Cloud console

To restore a log bucket that is pending deletion, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. For the bucket you want to restore, click More.

  3. Select Restore deleted bucket.

  4. On the confirmation panel, click Restore.

  5. On the Logs Storage page, the pending-deletion indicator is removed from your bucket.

gcloud

To restore a log bucket that is pending deletion, run the gcloud logging buckets undelete command:

gcloud logging buckets undelete BUCKET_ID --location=LOCATION

API

To restore a bucket that is pending deletion, use projects.locations.buckets.undelete in the Logging API.
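
For example, a curl request that restores a bucket pending deletion might look like the following, assuming a project named my-project and a bucket named my-bucket; the undelete method takes an empty request body:

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket:undelete"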

Write to a bucket

You don't write logs directly to a log bucket. Rather, you write logs to a Google Cloud resource: a Cloud project, folder, or organization. The sinks in the parent resource then route the logs to destinations, including log buckets. A sink routes logs to a log bucket destination when the logs match the sink's filter and the sink has permission to route the logs to the log bucket. An example sink configuration follows.
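
For example, one way to route logs to a user-defined bucket is to create a sink whose destination is that bucket. The following gcloud command is a sketch; the sink name my-bucket-sink, the project and bucket names, and the filter are illustrative:

gcloud logging sinks create my-bucket-sink \
  logging.googleapis.com/projects/my-project/locations/asia-east2/buckets/my-bucket \
  --log-filter='severity>=WARNING'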

Read from a bucket

Each log bucket has a set of log views. To read logs from a log bucket, you need access to a log view on the log bucket. For more information on log views, see Managing log views.

To read logs from a log bucket, do the following:

Google Cloud console

You can refine the scope of the logs displayed in the Logs Explorer by using the Refine scope option. You can search only the logs within the current Cloud project, or you can search the logs in one or more storage views. To refine the scope of the Logs Explorer, do the following:

  1. From the Logging menu, select Logs Explorer.

    Go to Logs Explorer

  2. Select Refine scope.

  3. On the Refine scope dialog, select a Scope by option.

  4. Select Scope by storage and choose one or more buckets you want to view.

    The dialog lists storage views that meet the following conditions:

    • You have access to the storage view.
    • The log buckets belong to the selected Cloud project, or the selected Cloud project has previously routed logs to the log buckets.

gcloud

To read logs from a log bucket, use the gcloud logging read command and add a LOG_FILTER to select data:

gcloud logging read LOG_FILTER --bucket=BUCKET_ID --location=LOCATION --view=VIEW_ID
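
For example, to read up to 10 error-level entries from the bucket created earlier through its default _AllLogs view, a command might look like the following; the filter shown is only an illustration:

gcloud logging read 'severity>=ERROR' --bucket=my-bucket --location=asia-east2 --view=_AllLogs --limit=10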

API

To read logs from a log bucket, use the entries.list method. Set resourceNames to specify the appropriate bucket and log view, and set filter to select data.
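
For example, a curl request to entries.list might look like the following. This sketch assumes a project named my-project and a bucket named my-bucket, and reads from the bucket's default _AllLogs view:

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"resourceNames": ["projects/my-project/locations/asia-east2/buckets/my-bucket/views/_AllLogs"], "filter": "severity>=ERROR"}' \
  "https://logging.googleapis.com/v2/entries:list"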

For detailed information about the filtering syntax, see Logging query language.

Configure custom retention

When you create a log bucket, you have the option to customize the period for how long Cloud Logging stores the bucket's logs. You can configure the retention period for any user-defined log bucket and also for the _Default log bucket. If you set a custom retention period for your log bucket and you want to upgrade it to use Log Analytics, then you must change the retention period to the default value.

If you shorten the bucket's retention, then there is a 7-day grace period in which expired logs aren't deleted. You can't query or view those expired logs, but in those 7 days you can restore full access by extending the bucket's retention. Logs stored during the grace period count towards your storage costs.

To update the retention period for a log bucket, do the following:

Google Cloud console

To update a log bucket's retention period, do the following:

  1. From the Logging menu, select Logs Storage.

    Go to Logs Storage

  2. For the bucket you want to update, click More.

  3. Select Edit bucket.

  4. In the Retention field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs.

  5. Click Update bucket. Your new retention duration appears in the Logs bucket list.

gcloud

To update the retention period for a log bucket, run the gcloud logging buckets update command, after setting a value for RETENTION_DAYS:

gcloud logging buckets update BUCKET_ID --location=LOCATION --retention-days=RETENTION_DAYS

For example, to retain the logs in the _Default bucket in the 'global' location for a year, your command would look like the following:

gcloud logging buckets update _Default --location=global --retention-days=365

If you extend a bucket's retention period, then the retention rules apply going forward and not retroactively. Logs can't be recovered after the applicable retention period ends.

Troubleshoot common issues

If you encounter problems when using log buckets, refer to the following troubleshooting steps and answers to common questions.

Why can't I delete this bucket?

If you can't delete a bucket, do the following:

  • Ensure that you have the correct permissions to delete the bucket. For the list of the permissions that you need, see the Access control guide.

  • Determine whether the bucket is locked by listing the bucket's attributes. If the bucket is locked, check the bucket's retention period. You can't delete a locked bucket until all of the logs in the bucket have fulfilled the bucket's retention period.

Which service accounts are routing logs to my bucket?

To determine if any service accounts have IAM permissions to route logs to your bucket, do the following:

  1. Go to the Identity and Access Management page for the Cloud project that contains the bucket:

    Go to IAM

  2. From the Permissions tab, view by Roles. You see a table with all the IAM roles and principals associated with your Cloud project.

  3. In the table's Filter text box, enter Logs Bucket Writer.

    You see any principals with the Logs Bucket Writer role. If a principal is a service account, its ID contains the string gserviceaccount.com.

  4. Optional: To prevent a service account from routing logs to your Cloud project, select the check box next to it and click Remove.

Why do I see logs for a Cloud project even though I excluded them from my _Default sink?

You might be viewing logs in a log bucket in a centralized Cloud project, which aggregates logs from across your organization.

If you're using the Logs Explorer to access these logs and see logs that you excluded from the _Default sink, then your view might be scoped to the Cloud project level.

To fix this issue, select Scope by storage in the Refine scope panel and then select the _Default bucket in your Cloud project. You shouldn't see the excluded logs anymore.

Why can't I create logs-based metrics for a bucket?

Logs-based metrics apply only to a single Google Cloud project. You can't create them for log buckets or for other Google Cloud resources such as folders or organizations.

What's next

For information on the log bucket API methods, refer to the LogBucket reference documentation.

For information on addressing common use cases with log buckets, see the following topics: