This page describes how to store your logs in a Cloud Logging bucket in a designated region.
For conceptual information on logs data location, see Data regionality for Cloud Logging.
This guide walks through the process using the example of routing all logs to the europe-west1 region, which involves the following steps:
Create a log bucket in the designated region for storing the logs.
Update the _Default sink to route the logs to the new log bucket.
Search for logs in the Logs Explorer.
(Optional) Update the log retention period.
Before you begin
To complete the steps in this guide, you need to know the following:
In which Google Cloud project do you want to store the logs? In this guide, we use a Cloud project called logs-test-project.
What is the name and location of the log bucket in which you want to store the logs? In this guide, the bucket name is region-1-logs-bucket, and the location is europe-west1.
When you create your log bucket, you can choose any region that Cloud Logging supports as the storage location for your logs.
Which logs do you want to include? In this guide, we include all logs routed through the _Default sink.
Create the log bucket
Log buckets store the logs that are routed from other Google Cloud projects, folders, or organizations. For more information, see Manage log buckets.
To create the bucket in the Cloud project that you want to store logs in, complete the following steps:
Open the Google Cloud Console in the Cloud project you're using to store the logs.
In a terminal, run the following command to create a bucket, replacing the example values with your own information:

gcloud logging buckets create region-1-logs-bucket \
  --location=europe-west1 \
  --project=logs-test-project
Verify that the bucket was created:
gcloud logging buckets list --project=logs-test-project
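If you want to inspect the new bucket's settings, such as its location and default retention period, you can also describe it. The bucket and project names below are the example values used in this guide:

```shell
# Show the configuration of the example bucket, including its
# location and retention period. The bucket and project names
# are the example values from this guide.
gcloud logging buckets describe region-1-logs-bucket \
  --location=europe-west1 \
  --project=logs-test-project
```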
Update the _Default logs sink
You route logs to a log bucket by creating a sink. A sink includes a filter,
which selects which log entries to export through the sink, and a
destination. In this guide, we update the existing
_Default sink to route logs
to our bucket, region-1-logs-bucket.
To update the sink, run the following command, replacing the example values with your own information:

gcloud logging sinks update _Default \
  logging.googleapis.com/projects/logs-test-project/locations/europe-west1/buckets/region-1-logs-bucket \
  --log-filter='NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT LOG_ID("externalaudit.googleapis.com/activity") AND NOT LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT LOG_ID("externalaudit.googleapis.com/system_event") AND NOT LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT LOG_ID("externalaudit.googleapis.com/access_transparency")' \
  --description="Updated the _Default sink to route logs to the europe-west1 region"
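To confirm that the _Default sink now points at the regional bucket, you can describe the sink and check its destination field. The project name below is the example value from this guide:

```shell
# Print the _Default sink's configuration; the "destination" field
# should reference the regional bucket. The project name is the
# example value from this guide.
gcloud logging sinks describe _Default \
  --project=logs-test-project
```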
Create a log entry to test your sink
To verify that you updated the sink properly, complete the following steps:
Send a test log message to your regionalized bucket by using the gcloud logging write command. For example:
gcloud logging write TEST_LOG_NAME "Test to route logs to region-1-logs-bucket" --project=logs-test-project
After a few minutes, view your log entry in Logs Explorer:
In the Log fields pane, select the Global resource type.
Your test log entry appears in the Query results panel.
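As an alternative to the Logs Explorer, you can read the test entry back from the command line. TEST_LOG_NAME is the placeholder log name used in the write command above:

```shell
# Fetch recent entries for the test log written earlier.
# TEST_LOG_NAME is the placeholder from the write command.
gcloud logging read 'logName:"TEST_LOG_NAME"' \
  --project=logs-test-project \
  --limit=5
```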
Search logs in the Cloud Console
After updating the sink, go to the Cloud Console and complete the following steps:
From the Logging menu for the project you're using to store the logs, select Logs Explorer.
Select Refine scope.
On the Refine scope panel, select Scope by storage.
The Logs Explorer refreshes to show logs from your bucket.
For information on using the Logs Explorer, refer to Using the Logs Explorer.
(Optional) Update the bucket's log retention period
To change the retention period for your logs in your bucket, run the following command:
gcloud logging buckets update region-1-logs-bucket \
  --location=europe-west1 \
  --project=logs-test-project \
  --retention-days=14
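To verify that the new retention period took effect, you can describe the bucket again and check the retentionDays field. The bucket and project names below are the example values from this guide:

```shell
# Confirm the updated retention setting; look for "retentionDays: 14"
# in the output. The bucket and project names are the guide's
# example values.
gcloud logging buckets describe region-1-logs-bucket \
  --location=europe-west1 \
  --project=logs-test-project
```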
Explore additional location-based concepts, such as zones, that apply to other Google Cloud services.
Read whitepapers that provide best practices for data governance.