This page describes how to store your logs in a Cloud Logging bucket in a designated region. To meet regulatory or contractual obligations, organizations might be required to store their logs in specific regions. This guide walks through this process using the example of redirecting all logs to the europe-west1 region.
This process involves the following steps:
Creating a logs bucket in the designated region for storing the logs.
Updating the _Default sink to route the logs to the new logs bucket.
Searching logs in the UI.
Optionally updating the log retention period.
Before you begin
To complete the steps in this guide, you need to know the following:
- Which project do you want to store the logs in? In this example, we use a project called logs-test-project.
- What are the name and location of the logs bucket you want to store the logs in? In this example, the bucket name is region-1-logs-bucket and the location is europe-west1. When you create your logs bucket, you can choose any region that Cloud Logging supports as its location.
- Which logs do you want to include? In this example, we include all logs routed through the _Default sink.
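The decisions above can be captured as shell variables so the commands in the rest of this guide are easy to adapt. This is a convenience sketch using the example values from this guide; substitute your own project, bucket name, and location.

```shell
# Example values used throughout this guide; replace with your own.
PROJECT_ID="logs-test-project"
BUCKET_NAME="region-1-logs-bucket"
LOCATION="europe-west1"

echo "Storing logs for $PROJECT_ID in $BUCKET_NAME ($LOCATION)"
```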
Creating the logs bucket
Logs buckets store the logs that are routed from other projects, folders, or organizations. For more information, see Logs buckets.
To create the bucket in the project that you want to store logs in, complete the following steps:
Open the Google Cloud Console in the project you're using to store the logs.
In a terminal, run the following command to create a bucket, replacing the example values with your own information:

gcloud logging buckets create region-1-logs-bucket \
    --location=europe-west1 \
    --project=logs-test-project
Verify that the bucket was created:
gcloud logging buckets list --project=logs-test-project
Updating the _Default logs sink
You route logs to a logs bucket by creating a logs sink. A sink includes a logs filter, which selects which log entries to export through the sink, and a destination. In this guide, we update the existing _Default sink to route logs to our bucket, region-1-logs-bucket.
To update the sink, run the following command, replacing the example values with your own information. The filter excludes audit logs, which Cloud Logging stores separately in the _Required bucket:

gcloud logging sinks update _Default \
    logging.googleapis.com/projects/logs-test-project/locations/europe-west1/buckets/region-1-logs-bucket \
    --log-filter='NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT LOG_ID("externalaudit.googleapis.com/activity") AND NOT LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT LOG_ID("externalaudit.googleapis.com/system_event") AND NOT LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT LOG_ID("externalaudit.googleapis.com/access_transparency")' \
    --description="Updated the _Default sink to route logs to the europe-west1 region"
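Because the exclusion filter repeats the same NOT LOG_ID(...) clause for each audit log ID, it can be easier to build it in the shell than to type it by hand. The snippet below is a sketch of that approach; it only constructs and prints the filter string, which you could then pass to --log-filter.

```shell
# Build the _Default sink's audit-log exclusion filter from a list of
# audit log IDs instead of writing the long expression by hand.
FILTER=""
for LOG_ID in \
    cloudaudit.googleapis.com/activity \
    externalaudit.googleapis.com/activity \
    cloudaudit.googleapis.com/system_event \
    externalaudit.googleapis.com/system_event \
    cloudaudit.googleapis.com/access_transparency \
    externalaudit.googleapis.com/access_transparency; do
  if [ -z "$FILTER" ]; then
    FILTER="NOT LOG_ID(\"$LOG_ID\")"
  else
    FILTER="$FILTER AND NOT LOG_ID(\"$LOG_ID\")"
  fi
done

echo "$FILTER"
```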
Creating a log entry to assist with sink verification
To verify that you updated the sink properly, complete the following steps:
Send a test log message to your regionalized bucket using the gcloud logging write command. For example:
gcloud logging write TEST_LOG_NAME "Test to route logs to region-1-logs-bucket" --project=logs-test-project
After a few minutes, view your log entry in Logs Explorer:
In the Log fields pane, select the Global resource type.
Your test log entry displays in the Query results panel.
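As an alternative to checking in the Logs Explorer, you can read the entry back from the command line. The snippet below constructs the full log name for the example values used above; the commented gcloud logging read invocation assumes an authenticated gcloud session and is not run here.

```shell
# Construct the fully qualified log name for the test entry written above.
PROJECT_ID="logs-test-project"
LOG_NAME="projects/$PROJECT_ID/logs/TEST_LOG_NAME"

echo "$LOG_NAME"
# gcloud logging read "logName=\"$LOG_NAME\"" --project="$PROJECT_ID" --limit=1
```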
Searching logs in the UI
After updating the sink, go to the UI and complete the following steps.
From the Logging menu for the project you're using to store the logs, select Logs Explorer.
Select Refine Scope.
On the Refine scope panel, select Scope by storage, and then select your logs bucket, region-1-logs-bucket.
The Logs Explorer refreshes to show logs from your bucket.
For information on using the Logs Explorer, refer to Using the Logs Explorer.
[Optional] Updating the bucket's log retention period
To change the retention period for your logs in your bucket, run the following command:

gcloud logging buckets update region-1-logs-bucket \
    --location=europe-west1 \
    --project=logs-test-project \
    --retention-days=14
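Before running the update, a small pre-check can catch an out-of-range value. The 1–3650 day bounds below reflect Cloud Logging's documented custom-retention limits; the check itself is just an illustrative shell sketch.

```shell
# Validate a retention period before passing it to --retention-days.
# Cloud Logging accepts custom retention from 1 to 3650 days (10 years).
RETENTION_DAYS=14

if [ "$RETENTION_DAYS" -ge 1 ] && [ "$RETENTION_DAYS" -le 3650 ]; then
  echo "retention ok: $RETENTION_DAYS days"
else
  echo "retention out of range: $RETENTION_DAYS days" >&2
  exit 1
fi
```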
Explore additional location-based concepts, such as zones, that apply to other Google Cloud services.
Read whitepapers that provide best practices for data governance.