This document describes how to create and manage Cloud Logging buckets using
the Google Cloud console, the Google Cloud CLI, and the
Logging API.
It also provides instructions for creating and managing log buckets at the Google Cloud project level. You can't create log buckets at the folder or organization level; however, Cloud Logging automatically creates _Default and _Required buckets at the folder and organization level for you.
You can upgrade log buckets to use Log Analytics. Log Analytics lets you run SQL queries on your log data, helping you troubleshoot application, security, and networking issues. You can also use BigQuery to view the data stored in a log bucket when the log bucket is upgraded to use Log Analytics and when a linked BigQuery dataset exists. Creating a linked dataset lets you join your log data with other data stored in BigQuery, and access data from other tools like Looker Studio and Looker.
For a conceptual overview of buckets, see Routing and storage overview: Log buckets.
Before you begin
To get started with buckets, do the following:
- Ensure that you've enabled billing for your Google Cloud project.
- Ensure that your Identity and Access Management (IAM) role grants you the permissions necessary to create, upgrade, and link buckets. The Logs Configuration Writer (roles/logging.configWriter) role is the minimal predefined role that grants the permissions required to manage buckets. For the full list of permissions and roles, see Access control with IAM.
- Understand the LogBucket formatting requirements, including the supported regions in which you can store your logs.
- Consider setting a default resource location to apply a default storage region for the _Required and _Default buckets in your organization.
- To use BigQuery to view the data stored in a log bucket, do the following:
  - Ensure that the BigQuery API is enabled. You can verify that the API is enabled by listing available services.
  - Ensure that your Identity and Access Management role includes the permissions that let you create a linked dataset. For more information, see Permissions for linked BigQuery datasets.
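The services check above can be scripted. The following is a minimal sketch that uses a placeholder project ID; the command string is assembled and printed for review rather than executed, because running it requires an authenticated gcloud session:

```shell
# Placeholder project ID; replace with your own.
PROJECT_ID="my-project"

# Assemble the command that verifies the BigQuery API is enabled.
CHECK_CMD="gcloud services list --enabled --project=${PROJECT_ID} --filter=bigquery"
echo "${CHECK_CMD}"
```

If the BigQuery API is enabled, the command's output includes bigquery.googleapis.com.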
For information about how to create a log bucket that uses a customer-managed encryption key (CMEK), see Configure CMEK for logs storage.
Create a bucket
You can create a maximum of 100 buckets per Google Cloud project.
To create a user-defined log bucket for your Google Cloud project, do the following:
Google Cloud console
To create a log bucket in your Google Cloud project, do the following:
From the Logging menu, select Logs Storage.
Click Create log bucket.
Enter a Name and Description for your bucket.
Optional: Upgrade your bucket to use Log Analytics.
Select Upgrade to use Log Analytics.
When you upgrade a bucket to use Log Analytics, you can query your logs in the Log Analytics page by using SQL queries. You can also continue to view your logs by using the Logs Explorer.
Not all regions are supported for Log Analytics. For more information, see Supported regions.
Optional: To view your logs in BigQuery, select Create a new BigQuery dataset that links to this bucket and enter a unique dataset name.
When you select this option, BigQuery can read the data stored in your log bucket. You can now query in the BigQuery interface where you can join your log data, and also access data from other tools like Looker Studio and Looker.
Optional: To select the storage region for your logs, click the Select log bucket region menu and select a region. If you don't select a region, then the global region is used, which means that the logs could be physically located in any of the regions.
Optional: To set a custom retention period for the logs in the bucket, click Next. In the Retention period field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs. If you don't customize the retention period, the default is 30 days. You can also update your bucket to apply custom retention after you create it.
Click Create bucket.
After the log bucket is created, Logging upgrades the bucket and creates the dataset link, if these options were selected.
It might take a moment for these steps to complete.
gcloud
To create only a log bucket, run the gcloud logging buckets create command. If you want to upgrade the log bucket to use Log Analytics, then include the --enable-analytics and --async flags, and ensure that you set the variable LOCATION to a region supported for Log Analytics:
gcloud logging buckets create BUCKET_ID --location=LOCATION --enable-analytics --async OPTIONAL_FLAGS
The --async flag forces the command to be asynchronous. The return of an asynchronous method is an Operation object, which contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.
If you don't want to upgrade the log bucket to use Log Analytics, then
omit the --enable-analytics
and
--async
flags. You can set the variable LOCATION to any supported region.
For example, if you want to create a bucket with the BUCKET_ID
my-bucket
in the asia-east2
region, your command would look like the
following:
gcloud logging buckets create my-bucket --location asia-east2 --description "My first bucket"
For example, to create a bucket with the BUCKET_ID
my-upgraded-bucket
in the us
location,
and then upgrade the log bucket to use Log Analytics,
your command would look like the following:
gcloud logging buckets create my-upgraded-bucket --location us \
  --description "My first upgraded bucket" \
  --enable-analytics --retention-days=45
API
To create a bucket, use the projects.locations.buckets.create or the projects.locations.buckets.createAsync method. Prepare the arguments to the method as follows:
- Set the parent parameter to be the resource in which to create the bucket: projects/PROJECT_ID/locations/LOCATION. The variable LOCATION refers to the region in which you want your logs to be stored. Not all regions are supported for Log Analytics. For more information, see Supported regions. For example, if you want to create a bucket for project my-project in the asia-east2 region, your parent parameter would look like this: projects/my-project/locations/asia-east2.
- Set the bucketId parameter; for example, my-bucket.
- Do one of the following:
  - To create a log bucket and then upgrade the log bucket to use Log Analytics:
    - Set the LogBucket.analyticsEnabled boolean to true.
    - Call the asynchronous method projects.locations.buckets.createAsync to create the bucket.
    The response to the asynchronous method is an Operation object. This object contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.
    The createAsync method takes several minutes to complete. This method doesn't generate an error message or fail when the analyticsEnabled boolean is set to true and the region isn't supported for upgraded buckets. For example, if you set the location to asia-east2, then the log bucket is created but the bucket isn't upgraded to use Log Analytics.
  - Otherwise, call the synchronous method projects.locations.buckets.create to create the bucket.
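As a sketch of the REST call described above, the following assembles (but doesn't send) a curl request to the buckets.create endpoint. The project, region, bucket ID, and request body are placeholders; sending the request requires an OAuth access token, for example from gcloud auth print-access-token:

```shell
# Placeholder values; substitute your own.
PROJECT_ID="my-project"
LOCATION="asia-east2"
BUCKET_ID="my-bucket"

# Request body for projects.locations.buckets.create (a LogBucket resource).
BODY='{"description": "My first bucket", "retentionDays": 45}'

# Assemble the request URL; bucketId is passed as a query parameter.
URL="https://logging.googleapis.com/v2/projects/${PROJECT_ID}/locations/${LOCATION}/buckets?bucketId=${BUCKET_ID}"
echo "curl -X POST -H \"Content-Type: application/json\" -d '${BODY}' \"${URL}\""
```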
After creating a bucket, create a sink to route log entries to your bucket and configure log views to control who can access the logs in your new bucket and which logs are accessible to them. You can also update the bucket to configure custom retention and restricted fields.
Track log ingestion
The Logs Storage page in the Google Cloud console tracks the volume of logs data ingested by log buckets.
To track the logs data ingestion for your Google Cloud project, go to the Logs Storage page in the Google Cloud console. The Logs Storage page displays a summary of statistics for your Google Cloud project. The following statistics are reported:
Current month ingestion: The amount of logs data that your Google Cloud project has ingested since the first day of the current calendar month.
Previous month ingestion: The amount of logs data that your Google Cloud project ingested in the last calendar month.
Projected ingestion by EOM: The estimated amount of logs data that your Google Cloud project will ingest by the end of the current calendar month, based on current usage.
Current month billable storage: The amount of logs data that has been retained for over 30 days that is billed.
The log ingestion volume statistics don't include the
_Required
bucket. The
logs in that bucket can't be excluded or disabled.
The Log Router page in the Google Cloud console gives you tools that you can use to minimize any charges for logs ingestion or storage that exceeds your monthly allotment. You can do the following:
- Disable logs ingestion at the bucket level.
- Exclude certain log entries from ingestion into Logging.
For more information, see Manage sinks.
Manage buckets
This section describes how to manage your log buckets using the Google Cloud CLI or the Google Cloud console.
Update a bucket
To update the properties of your bucket, such as the description or retention period, do the following:
Google Cloud console
To update your bucket's properties, do the following:
From the Logging menu, select Logs Storage.
For the bucket you want to update, click More.
Select Edit bucket.
Edit your bucket as needed.
Click Update bucket.
gcloud
To update your bucket's properties, run the
gcloud logging buckets update
command:
gcloud logging buckets update BUCKET_ID --location=LOCATION UPDATED_ATTRIBUTES
For example:
gcloud logging buckets update my-bucket --location=global --description "Updated description"
API
To update your bucket's properties, use
projects.locations.buckets.patch
in the Logging API.
Upgrade a bucket to use Log Analytics
To upgrade an existing bucket to use Log Analytics, the bucket must meet the following restrictions:
- The log bucket was created at the Google Cloud project level.
- Field-level access control isn't configured.
- The log bucket is unlocked unless it is the _Required bucket.
- There aren't pending updates to the bucket.
Not all regions are supported for Log Analytics. For more information, see Supported regions.
Google Cloud console
To upgrade an existing bucket to use Log Analytics, do the following:
From the Logging menu, select Logs Storage.
Locate the bucket that you want to upgrade.
When the Log Analytics available column displays Upgrade, you can upgrade the log bucket to use Log Analytics. Click Upgrade.
A dialog box opens. Click Confirm.
gcloud
To upgrade your log bucket to use Log Analytics, run the
gcloud logging buckets update
command. You must
set the --enable-analytics
flag, and we recommend that you also include the
--async
flag:
gcloud logging buckets update BUCKET_ID --location=LOCATION --enable-analytics --async
The flag --async
forces the
command to be asynchronous. The return of an asynchronous
method is an Operation
object, and it
contains information about the progress of the method. When the
method completes, the Operation
object contains the status. For more
information, see Asynchronous API methods.
API
To upgrade a log bucket to use Log Analytics, use the
projects.locations.buckets.updateAsync
method of the Cloud Logging API.
Prepare the arguments to the method as follows:
- Set the LogBucket.analyticsEnabled boolean to true.
- For the query parameter of the command, use updateMask=analyticsEnabled.
The response to the asynchronous methods is an
Operation
object. This object contains
information about the progress of the method. When the method
completes, the Operation
object contains the status. For more information,
see Asynchronous API methods.
The updateAsync method might take several minutes to complete.
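The updateAsync call described above can be sketched as an HTTP request. The following assembles (but doesn't send) the curl invocation; the project, region, and bucket ID are placeholders, and sending the request requires an OAuth access token:

```shell
# Placeholder values; substitute your own.
PROJECT_ID="my-project"
LOCATION="global"
BUCKET_ID="my-bucket"

# The updateMask query parameter names the field being changed.
URL="https://logging.googleapis.com/v2/projects/${PROJECT_ID}/locations/${LOCATION}/buckets/${BUCKET_ID}:updateAsync?updateMask=analyticsEnabled"

# Assembled request; shown for review rather than executed.
echo "curl -X POST -H \"Content-Type: application/json\" -d '{\"analyticsEnabled\": true}' \"${URL}\""
```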
Create a linked BigQuery dataset
When you want to use the capabilities of BigQuery to analyze your log data, upgrade a log bucket to use Log Analytics, and then create a linked dataset. With this configuration, you can use BigQuery to read the logs stored in the log bucket.
Google Cloud console
To create a link to a BigQuery dataset for an existing log bucket, do the following:
Review the Before you begin section of this document.
From the Logging menu, select Logs Storage.
Locate the log bucket and verify that the Log Analytics available column displays Open.
If this column displays Upgrade, then the log bucket hasn't been upgraded to use Log Analytics. Configure Log Analytics:
- Click Upgrade.
- Click Confirm in the dialog.
After the upgrade completes, proceed to the next step.
On the log bucket, click More, and then click Edit bucket.
The Edit log bucket dialog opens.
Select Create a new BigQuery dataset that links to this bucket and enter the name for the new dataset. The dataset name must be unique for your Google Cloud project.
Click Done and then click Update bucket.
After Logging displays the linked dataset name on the Logs Storage page, it might take several minutes before BigQuery recognizes the dataset.
gcloud
To create a linked dataset for a log bucket that is upgraded
to use Log Analytics, run the
gcloud logging links create
command:
gcloud logging links create LINK_ID --bucket=BUCKET_ID --location=LOCATION
The LINK_ID field must be unique for your Google Cloud project.
The links create
command is asynchronous. The return of an
asynchronous method is an Operation
object, and it
contains information about the progress of the method. When the
method completes, the Operation
object contains the status. For more
information, see Asynchronous API methods.
The links create
command takes several minutes to complete.
For example, the following command creates a linked dataset for the
bucket with the name my-bucket
:
gcloud logging links create mylink --bucket=my-bucket --location=global
If you attempt to create a linked dataset for a log bucket that isn't upgraded to use Log Analytics, then the following error is reported:
A link can only be created for an analytics-enabled bucket.
API
To create a linked BigQuery dataset for an existing log bucket that is upgraded to use Log Analytics, call the asynchronous projects.locations.buckets.links.create method of the Cloud Logging API.
Prepare the arguments to the method as follows:
- Create a Link object.
- For the query parameter of the command, use linkId=LINK_ID.
The response to the asynchronous methods is an
Operation
object. This object contains
information about the progress of the method. When the
method completes, the Operation
object contains the status. For more
information, see Asynchronous API methods.
The links.create
method takes several minutes to complete.
If you attempt to create a linked dataset for a log bucket that isn't upgraded to use Log Analytics, then the following error is reported:
A link can only be created for an analytics-enabled bucket.
Lock a bucket
When you lock a bucket against updates, you also lock the bucket's retention policy. After a retention policy is locked, you can't delete the bucket until every log in the bucket has fulfilled the bucket's retention period.
To prevent anyone from updating or deleting a log bucket, lock the bucket. To lock the bucket, do the following:
Google Cloud console
The Google Cloud console doesn't support locking a log bucket.
gcloud
To lock your bucket, run the gcloud logging buckets update
command with the --locked
flag:
gcloud logging buckets update BUCKET_ID --location=LOCATION --locked
For example:
gcloud logging buckets update my-bucket --location=global --locked
API
To lock your bucket's attributes, use
projects.locations.buckets.patch
in the Logging API. Set the locked
parameter to true
.
List buckets
To list the log buckets associated with a Google Cloud project, and to see details such as retention settings, do the following:
Google Cloud console
Go to the Logs Storage page:
A table named Log buckets lists the buckets associated with the current Google Cloud project.
The table lists the following attributes for each log bucket:
- Name: The name of the log bucket.
- Description: The description of the bucket.
- Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
- Region: The geographic location in which the bucket's data is stored.
- Status: Whether the bucket is locked or unlocked.
If a bucket is pending deletion by Cloud Logging, its table entry is annotated with a warning.
gcloud
Run the gcloud logging buckets list
command:
gcloud logging buckets list
You see the following attributes for the log buckets:
- LOCATION: The region in which the bucket's data is stored.
- BUCKET_ID: The name of the log bucket.
- RETENTION_DAYS: The number of days that the bucket's data will be stored by Cloud Logging.
- LIFECYCLE_STATE: Indicates whether the bucket is pending deletion by Cloud Logging.
- LOCKED: Whether the bucket is locked or unlocked.
- CREATE_TIME: A timestamp that indicates when the bucket was created.
- UPDATE_TIME: A timestamp that indicates when the bucket was last modified.
You can also view the attributes for just one bucket. For example, to view
the details for the _Default
log bucket in the global
region, run the
gcloud logging buckets describe
command:
gcloud logging buckets describe _Default --location=global
API
To list the log buckets associated with a Google Cloud project, use
projects.locations.buckets.list
in the Logging API.
View a bucket's details
To view the details of a single log bucket, do the following:
Google Cloud console
Go to the Logs Storage page:
On the log bucket, click More > View bucket details.
The dialog box lists the following attributes for the log bucket:
- Name: The name of the log bucket.
- Description: The description of the log bucket.
- Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
- Region: The geographic location in which the bucket's data is stored.
- Log analytics: Indicates whether your bucket is upgraded to use Log Analytics.
- BigQuery analysis: Indicates whether a BigQuery dataset is linked to your bucket.
- BigQuery dataset: Provides a link to your BigQuery dataset, which opens in the BigQuery SQL workspace page. The date that BigQuery linking was enabled is also shown.
gcloud
Run the gcloud logging buckets describe
command.
For example, the following command reports the details of the _Default
bucket:
gcloud logging buckets describe _Default --location=global
You see the following attributes for the log bucket:
- createTime: A timestamp that indicates when the bucket was created.
- description: The description of the log bucket.
- lifecycleState: Indicates whether the bucket is pending deletion by Cloud Logging.
- name: The name of the log bucket.
- retentionDays: The number of days that the bucket's data will be stored by Cloud Logging.
- updateTime: A timestamp that indicates when the bucket was last modified.
API
To view the details of a single log bucket, use
projects.locations.buckets.get
in the Logging API.
Delete a bucket
To delete a log bucket, do the following:
Google Cloud console
To delete a log bucket, do the following:
From the Logging menu, select Logs Storage.
Locate the bucket that you want to delete, and click More.
If the Linked dataset in BigQuery column displays a link, then delete the linked BigQuery dataset:
Click Edit bucket.
Clear Create a new BigQuery dataset that links to this bucket, click Done, and then click Update bucket.
After you return to the Logs Storage page, click More for the bucket you want to delete, then proceed to the next steps.
Select Delete bucket.
On the confirmation panel, click Delete.
On the Logs Storage page, your bucket has an indicator that it's pending deletion. The bucket, including all the logs in it, is deleted after 7 days.
gcloud
To delete a log bucket, run the
gcloud logging buckets delete
command:
gcloud logging buckets delete BUCKET_ID --location=LOCATION
You can't delete a log bucket when that bucket has a linked BigQuery dataset:
- To list the links associated with a log bucket, run the gcloud logging links list command.
- To delete a linked dataset, run the gcloud logging links delete command.
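The deletion sequence above can be sketched as a short script. The bucket, location, and link names are placeholders, and the commands are assembled and printed for review because running them requires an authenticated gcloud session:

```shell
# Placeholder values; substitute your own.
BUCKET_ID="my-bucket"
LOCATION="global"
LINK_ID="mylink"

# 1. Find any linked datasets on the bucket.
STEP1="gcloud logging links list --bucket=${BUCKET_ID} --location=${LOCATION}"
# 2. Delete the linked dataset first.
STEP2="gcloud logging links delete ${LINK_ID} --bucket=${BUCKET_ID} --location=${LOCATION}"
# 3. Now the bucket itself can be deleted.
STEP3="gcloud logging buckets delete ${BUCKET_ID} --location=${LOCATION}"
printf '%s\n' "${STEP1}" "${STEP2}" "${STEP3}"
```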
API
To delete a bucket, use
projects.locations.buckets.delete
in the Logging API.
It is an error to delete a log bucket if that bucket has a linked BigQuery dataset. You must delete the linked dataset before deleting the log bucket:
- To list the links associated with a log bucket, call the projects.locations.buckets.links.list method.
- To delete a linked dataset, call the projects.locations.buckets.links.delete method.
A deleted bucket stays in this pending state for 7 days, and Logging continues to route logs to the bucket during that time. To stop routing logs to a deleted bucket, you can delete the log sinks that have that bucket as a destination, or you can modify the filter for the sinks to stop routing logs to the deleted bucket.
Restore a deleted bucket
You can restore, or undelete, a log bucket that's in the pending deletion state. To restore a log bucket, do the following:
Google Cloud console
To restore a log bucket that is pending deletion, do the following:
From the Logging menu, select Logs Storage.
For the bucket you want to restore, click More.
Select Restore deleted bucket.
On the confirmation panel, click Restore.
On the Logs Storage page, the pending-deletion indicator is removed from your bucket.
gcloud
To restore a log bucket that is pending deletion, run the
gcloud logging buckets undelete
command:
gcloud logging buckets undelete BUCKET_ID --location=LOCATION
API
To restore a bucket that is pending deletion, use
projects.locations.buckets.undelete
in the Logging API.
Alert on monthly log bytes ingested
To create an alerting policy, on the Logs Storage page in the Google Cloud console, click Create usage alert. This button opens the Create alerting policy page in Monitoring, and populates the metric type field with logging.googleapis.com/billing/bytes_ingested.
To create an alerting policy that triggers when your monthly log bytes ingested exceeds your user-defined limit for Cloud Logging, use the following settings.
New condition

Field | Value
---|---
Resource and Metric | In the Resources menu, select Global. In the Metric categories menu, select Logs-based metric. In the Metrics menu, select Monthly log bytes ingested.
Filter | None.
Across time series: Time series aggregation | sum
Rolling window | 60 m
Rolling window function | max

Configure alert trigger

Field | Value
---|---
Condition type | Threshold
Alert trigger | Any time series violates
Threshold position | Above threshold
Threshold value | You determine the acceptable value.
Retest window | Minimum acceptable value is 30 minutes.
For more information about alerting policies, see Alerting overview.
Write to a bucket
You don't directly write logs to a log bucket. Rather, you write logs to a Google Cloud resource: a Google Cloud project, folder, or organization. The sinks in the parent resource then route the logs to destinations, including log buckets. A sink routes logs to a log bucket destination when the logs match the sink's filter and the sink has permission to route the logs to the log bucket.
Read from a bucket
Each log bucket has a set of log views. To read logs from a log bucket, you need access to a log view on the log bucket. For more information on log views, see Configure bucket-level access.
To read logs from a log bucket, do the following:
Google Cloud console
You can refine the scope of the logs displayed in the Logs Explorer through the Refine scope option. You can search only logs within the current Google Cloud project or search logs in one or more storage views. To refine the scope of the Logs Explorer, do the following:
From the Logging menu, select Logs Explorer.
Select Refine Scope.
On the Refine scope dialog, select a Scope by option.
Select Scope by storage and choose one or more buckets you want to view.
The dialog lists storage views that meet the following conditions:
- You have access to the storage view.
- The log buckets belong to the selected Google Cloud project, or the selected Google Cloud project has previously routed logs to the storage buckets.
gcloud
To read logs from a log bucket, use the
gcloud logging read
command and add
a LOG_FILTER
to select
data:
gcloud logging read LOG_FILTER --bucket=BUCKET_ID --location=LOCATION --view=VIEW_ID
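As a concrete sketch, the following assembles a read command that selects error-level entries from a hypothetical bucket's default view (_AllLogs is the view that Cloud Logging creates on each bucket). The command is printed for review rather than executed, because it requires an authenticated gcloud session:

```shell
# Placeholder values; substitute your own.
BUCKET_ID="my-bucket"
LOCATION="global"
VIEW_ID="_AllLogs"   # default view created on every log bucket

# Assemble the read command with a severity filter.
READ_CMD="gcloud logging read 'severity>=ERROR' --bucket=${BUCKET_ID} --location=${LOCATION} --view=${VIEW_ID} --limit=10"
echo "${READ_CMD}"
```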
API
To read logs from a log bucket, use the
entries.list method. Set
resourceNames
to specify the appropriate bucket and log view, and set
filter
to select data.
For detailed information about the filtering syntax, see Logging query language.
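The entries.list arguments above can be sketched as a request body. The project, location, bucket, and view names are placeholders; resourceNames addresses a log view on the bucket:

```shell
# Placeholder values; substitute your own.
PROJECT_ID="my-project"
LOCATION="global"
BUCKET_ID="my-bucket"
VIEW_ID="_AllLogs"

# resourceNames entry addressing a log view on the bucket.
RESOURCE="projects/${PROJECT_ID}/locations/${LOCATION}/buckets/${BUCKET_ID}/views/${VIEW_ID}"

# Request body for entries.list.
BODY="{\"resourceNames\": [\"${RESOURCE}\"], \"filter\": \"severity>=ERROR\", \"pageSize\": 10}"
echo "${BODY}"
```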
Configure custom retention
When you create a log bucket, you have the option to
customize the period for how long Cloud Logging stores the bucket's logs.
You can configure the retention period for any user-defined log bucket and also
for the _Default
log bucket.
If you shorten the bucket's retention, then there is a 7-day grace period in which expired logs aren't deleted. You can't query or view those expired logs but, in those 7 days, you can restore full access by extending the bucket's retention. Logs stored during the grace period count towards your storage costs.
To update the retention period for a log bucket, do the following:
Google Cloud console
To update a log bucket's retention period, do the following:
From the Logging menu, select Logs Storage.
For the bucket you want to update, click More.
Select Edit bucket.
In the Retention field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs.
Click Update bucket. Your new retention duration appears in the Logs bucket list.
gcloud
To update the retention period for a log bucket, run the
gcloud logging buckets update
command, after setting a value for
RETENTION_DAYS:
gcloud logging buckets update BUCKET_ID --location=LOCATION --retention-days=RETENTION_DAYS
For example, to retain the logs in the _Default
bucket in the
global
location for a year, your command would look like the following:
gcloud logging buckets update _Default --location=global --retention-days=365
If you extend a bucket's retention period, then the retention rules apply going forward and not retroactively. Logs can't be recovered after the applicable retention period ends.
Asynchronous API methods
The response of an asynchronous method like
projects.locations.buckets.createAsync
is an Operation
object.
Applications that call an asynchronous API method should poll
the operation.get
endpoint until the
value of the Operation.done
field is true
:
- When done is false, the operation is in progress. To refresh the status information, send a GET request to the operation.get endpoint.
- When done is true, the operation is complete and either the error or response field is set:
  - error: When set, the asynchronous operation failed. The value of this field is a Status object that contains a gRPC error code and an error message.
  - response: When set, the asynchronous operation completed successfully, and the value reflects the result.
To poll an asynchronous command by using the Google Cloud CLI, run the following command:
gcloud logging operations describe OPERATION_ID --location=LOCATION --project=PROJECT_ID
For more information, see gcloud logging operations describe.
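The polling described above can be sketched as a loop. The script is printed for review rather than executed, because it requires an authenticated gcloud session and a real operation ID; the --format flag and the printed True value are assumptions about gcloud's standard output formatting:

```shell
# Polling-loop sketch; OPERATION_ID, LOCATION, and PROJECT_ID are placeholders.
POLL_SCRIPT=$(cat <<'EOF'
until gcloud logging operations describe OPERATION_ID \
    --location=LOCATION --project=PROJECT_ID \
    --format="value(done)" | grep -q True; do
  sleep 10
done
EOF
)
echo "${POLL_SCRIPT}"
```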
Troubleshoot common issues
If you encounter problems when using log buckets, refer to the following troubleshooting steps and answers to common questions.
Why can't I delete this bucket?
If you're trying to delete a bucket, do the following:
Ensure that you have the correct permissions to delete the bucket. For the list of the permissions that you need, see Access control with IAM.
Determine whether the bucket is locked by listing the bucket's attributes. If the bucket is locked, check the bucket's retention period. You can't delete a locked bucket until all of the logs in the bucket have fulfilled the bucket's retention period.
Verify that the log bucket doesn't have a linked BigQuery dataset. You can't delete a log bucket with a linked dataset.
The following error is shown in response to a delete command on a log bucket that has a linked dataset:
FAILED_PRECONDITION: This bucket is used for advanced analytics and has an active link. The link must be deleted first before deleting the bucket
To list the links associated with a log bucket, run the gcloud logging links list command or call the projects.locations.buckets.links.list API method.
Which service accounts are routing logs to my bucket?
To determine if any service accounts have IAM permissions to route logs to your bucket, do the following:
Go to the Identity and Access Management page for the Google Cloud project that contains the bucket:
From the Permissions tab, view by Roles. You see a table with all the IAM roles and principals associated with your Google Cloud project.
In the table's Filter text box, enter Logs Bucket Writer.
You see any principals with the Logs Bucket Writer role. If a principal is a service account, its ID contains the string gserviceaccount.com.
Optional: If you want to remove a service account from being able to route logs to your Google Cloud project, select the check box for the service account and click Remove.
Why do I see logs for a Google Cloud project even though I excluded them from my _Default sink?
You might be viewing logs in a log bucket in a centralized Google Cloud project, which aggregates logs from across your organization.
If you're using the Logs Explorer to access these logs and see logs that you
excluded from the _Default
sink, then your view might be scoped to the
Google Cloud project level.
To fix this issue, select Scope by storage in the
Refine scope panel
and then select the _Default
bucket in your
Google Cloud project. You shouldn't see the excluded logs anymore.
Why can't I create log-based metrics for a bucket?
Log-based metrics apply only to a single Google Cloud project. You can't create them for log buckets or for other Google Cloud resources such as folders or organizations.
What's next
For information on the log bucket API methods, refer to the
LogBucket
reference documentation.
For information on addressing common use cases with log buckets, see the following topics: