This document explains how to create and manage sinks to route log entries to supported destinations.
Overview
Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to the following destinations:
- Cloud Logging log buckets: Provides storage in Cloud Logging. A log bucket can store logs ingested by multiple Google Cloud projects. You can combine your Cloud Logging data with other data by upgrading a log bucket to use Log Analytics, and then creating a linked BigQuery dataset. For information about viewing logs stored in log buckets, see Query and view logs overview and View logs routed to Cloud Logging buckets.
- Google Cloud projects: Route log entries to a different Google Cloud project. When you route logs to a different Google Cloud project, the destination project's Log Router receives the logs and processes them. The sinks in the destination project determine how the received log entries are routed.
- Pub/Sub topics: Provides support for third-party integrations, such as Splunk. Log entries are formatted into JSON and then routed to a Pub/Sub topic. For information about viewing logs routed to Pub/Sub, see View logs routed to Pub/Sub.
- BigQuery datasets: Provides storage of log entries in BigQuery datasets. You can use big data analysis capabilities on the stored logs. To combine your Cloud Logging data with other data sources, we recommend that you upgrade your log buckets to use Log Analytics and then create a linked BigQuery dataset. For information about viewing logs routed to BigQuery, see View logs routed to BigQuery.
- Cloud Storage buckets: Provides storage of log data in Cloud Storage. Log entries are stored as JSON files. For information about viewing logs routed to Cloud Storage, see View logs routed to Cloud Storage.
Sinks belong to a given Google Cloud resource: Google Cloud projects, billing accounts, folders, and organizations. When the resource receives a log entry, it routes the log entry according to the sinks contained by that resource. The log entry is sent to the destination associated with each matching sink.
An aggregated sink is a type of sink that combines and routes log entries from the Google Cloud resources contained by an organization or folder. For instructions, see Collate and route organization-level logs to supported destinations.
To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, and the Google Cloud CLI. Using the Google Cloud console has the following advantages over the other methods:
- View and manage all of your sinks in one place.
- Preview which log entries are matched by your sink's filter before you create the sink.
- Create and authorize sink destinations for your sinks.
Before you begin
The instructions in this document describe creating and managing sinks at the Google Cloud project level, but you can create sinks (non-aggregated) for billing accounts, folders, and organizations.
To get started, ensure the following:
You have a Google Cloud project with logs that you can view in the Logs Explorer.
To create, modify, or delete a sink, you must have one of the following Identity and Access Management roles for the Google Cloud project from which you're routing logs:
- Logs Configuration Writer (roles/logging.configWriter)
- Logging Admin (roles/logging.admin)
- Owner (roles/owner)
For information about granting IAM roles, see the Logging Access control guide.
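As an illustrative sketch, you could grant the Logs Configuration Writer role with the Google Cloud CLI; the project ID and member below are placeholders, and the command is echoed rather than executed:

```shell
# Hypothetical values for illustration; replace with your own.
PROJECT_ID="my-logging-project"
MEMBER="user:alex@example.com"

# Grant the Logs Configuration Writer role on the project that routes the logs.
# (Shown with echo so this sketch has no side effects; remove echo to run it.)
echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="$MEMBER" \
  --role="roles/logging.configWriter"
```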
You have a resource in a supported destination or have the ability to create one.
To route logs to a destination, the destination must exist before you create the sink. You can create the destination in any Google Cloud project in any organization.
Some limitations might apply when you route your logs to other destinations. For more information, see Destination limitations.
Create a sink
Following are the instructions for creating a sink in a Google Cloud project. Instead of a Google Cloud project, you can specify a billing account, folder, or organization.
You can create up to 200 sinks per Google Cloud project.
After you create the sink, ensure that Logging has the appropriate permissions to write logs to your sink's destination; see Set destination permissions.
To create a sink, do the following:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select an existing Google Cloud project.
Select Create sink.
In the Sink details panel, enter the following details:
Sink name: Provide an identifier for the sink. After you create the sink, you can't rename it, but you can delete it and create a new sink.
Sink description (optional): Describe the purpose or use case for the sink.
In the Sink destination panel, select the sink service and destination by using the Select sink service menu.
To route logs to a service that is in the same Google Cloud project, select one of the following options:
- Cloud Logging bucket: Select or create a Logging bucket.
- BigQuery table: Select or create the particular dataset to receive the routed logs. You also have the option to use partitioned tables.
- Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed logs.
- Pub/Sub topic: Select or create the particular topic to receive the routed logs.
- Splunk: Select the Pub/Sub topic for your Splunk service.
- (Preview) Other project: Populate the Sink destination field as described in Destination path formats.
In the Choose logs to include in sink panel, do the following:
In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.
If you don't set a filter, all logs from your selected resource are routed to the destination.
For example, you might want to build a filter to route all Data Access logs to a single Logging bucket. This filter looks like the following:
LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
Note that the length of a filter can't exceed 20,000 characters.
To verify you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter prepopulated.
(Optional) In the Choose logs to filter out of sink panel, do the following:
In the Exclusion filter name field, enter a name.
In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.
You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.
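For example, an exclusion filter that drops most low-severity entries might combine a severity clause with the sample function. This snippet is a sketch: sample(insertId, 0.9) selects 90% of the entries that match the rest of the filter, and the exclusion then discards the selected entries.

```
severity<=INFO AND sample(insertId, 0.9)
```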
Select Create sink.
API
To create a logging sink in your Google Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:
- name: An identifier for the sink. After you create the sink, you can't rename it, but you can delete it and create a new sink.
- destination: The service and destination to which you want your logs routed. To route logs to a different project, or to a destination that is in another project, set the destination field with the appropriate path, as described in Destination path formats. For example, if your sink destination is a BigQuery dataset, then destination looks like the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
In the LogSink object, provide the appropriate optional information:
- filter: Set the filter property to match the log entries you want to include in your sink. If you don't set a filter, all logs from your Google Cloud project are routed to the destination. The length of a filter can't exceed 20,000 characters.
- exclusions: Set this property to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
- description: Set this property to describe the purpose or use case for the sink.
Call projects.sinks.create to create the sink.
If the API response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions. You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".
For more information on creating sinks using the Logging API, see the LogSink reference.
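As a sketch of the REST call, a projects.sinks.create request can be made with curl; all identifiers below are placeholder values, and the request is echoed so nothing is sent:

```shell
# Hypothetical identifiers; replace with your own values.
PROJECT_ID="my-project"
BODY='{
  "name": "my-api-sink",
  "destination": "logging.googleapis.com/projects/my-project/locations/global/buckets/my-bucket",
  "filter": "severity>=ERROR"
}'

# The request you would send (echoed here so the sketch has no side effects;
# remove echo to send it with a real access token).
echo curl -X POST \
  -H "Authorization: Bearer \$(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d "$BODY" \
  "https://logging.googleapis.com/v2/projects/$PROJECT_ID/sinks"
```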
gcloud
To create a sink, run the gcloud logging sinks create command.
Provide the appropriate values for the variables in the command as follows:
- SINK_NAME: An identifier for the sink. After you create the sink, you can't rename it, but you can delete it and create a new sink.
- SINK_DESTINATION: The service and destination to which you want your logs routed. To route logs to a different project, or to a destination that is in another project, set SINK_DESTINATION with the appropriate path, as described in Destination path formats. For example, if your sink destination is a BigQuery dataset, then SINK_DESTINATION looks like the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
- OPTIONAL_FLAGS includes the following flags:
  - --log-filter: Use this flag to set a filter that matches the log entries you want to include in your sink. If you don't set a filter, all logs from your Google Cloud project are routed to the destination.
  - --exclusion: Use this flag to set an exclusion filter for log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. This flag can be repeated; you can create up to 50 exclusion filters per sink.
  - --description: Use this flag to describe the purpose or use case for the sink.
gcloud logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS
For example, to create a sink to a Logging bucket, your command might look like this:
gcloud logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' \
  --description="My first sink"
For more information on creating sinks by using the Google Cloud CLI, including more flags and examples, see the gcloud logging sinks reference.
New log sinks to Cloud Storage buckets might take several hours to start routing logs. Sinks to Cloud Storage are processed hourly while other destination types are processed in real time.
Sinks don't define the schema for BigQuery datasets. Instead, the first log entry received by BigQuery determines the schema for the destination table. For more information, see BigQuery schema for routed logs.
For information about how to view logs in the sink destinations, see View logs routed to Cloud Logging buckets.
After creating the sink, you can view the number and volume of log entries received by using the logging.googleapis.com/exports/ metrics.
If you receive error notifications, see Troubleshoot routing and sinks.
Destination path formats
If you are routing to a destination that is in another project, then you must provide the Logging, BigQuery, Cloud Storage, or Pub/Sub service and destination information:
To route log entries to a Cloud Logging log bucket that is in a different Google Cloud project, the sink destination is the following:
logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/LOCATION/buckets/BUCKET
To route log entries to another Google Cloud project, the sink destination path is the following:
logging.googleapis.com/projects/DESTINATION_PROJECT_ID
To route log entries to a BigQuery dataset, the sink destination is the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
To route log entries to a Cloud Storage bucket, the sink destination is the following:
storage.googleapis.com/BUCKET
To route log entries to a Pub/Sub topic, the sink destination is the following:
pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
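The formats above can be sketched as shell string templates; every identifier below is a placeholder:

```shell
# Placeholder identifiers for illustration; replace with your own.
DEST_PROJECT="central-logging"
LOCATION="global"
BUCKET="audit-bucket"
DATASET="audit_logs"
TOPIC="audit-topic"
GCS_BUCKET="my-log-archive"

# One destination path per supported service, built from the placeholders.
LOG_BUCKET_DEST="logging.googleapis.com/projects/$DEST_PROJECT/locations/$LOCATION/buckets/$BUCKET"
PROJECT_DEST="logging.googleapis.com/projects/$DEST_PROJECT"
BQ_DEST="bigquery.googleapis.com/projects/$DEST_PROJECT/datasets/$DATASET"
GCS_DEST="storage.googleapis.com/$GCS_BUCKET"
PUBSUB_DEST="pubsub.googleapis.com/projects/$DEST_PROJECT/topics/$TOPIC"

echo "$LOG_BUCKET_DEST"
```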
Manage sinks
After your sinks are created, you can perform these actions on them:
- View sink details
- Update sink
- Disable sink
- Delete sink
- Troubleshoot sink
- View sink log volume and error rates
Before you delete a sink, consider the following:
- You can't delete the _Default and _Required sinks, but you can disable _Default sinks to stop routing logs to _Default Logging buckets.
- After a sink is deleted, it stops routing log entries.
- If the sink has a dedicated service account, then deleting that sink also deletes the service account. Sinks created before May 22, 2023 have dedicated service accounts. Sinks created on or after May 22, 2023 have a shared service account; deleting the sink doesn't delete the shared service account.
Any changes made to a sink might take a few minutes to apply.
Following are the instructions for managing a sink in a Google Cloud project. Instead of a Google Cloud project, you can specify a billing account, folder, or organization:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select the Google Cloud project that contains your sink by using the resource selector from anywhere in the Google Cloud console.
To view your aggregated sinks, select the organization, folder, or billing account that contains the sink.
The Log Router page contains a table summary of sinks. Each table row contains information about a sink's properties:
- Enabled: Indicates if the sink's state is enabled or disabled.
- Type: The sink's destination service; for example, Cloud Logging bucket.
- Name: The sink's identifier, as provided when the sink was created; for example, _Default.
- Description: The sink's description, as provided when the sink was created.
- Destination: Full name of the destination to which the routed log entries are sent.
- Created: The date and time that the sink was created.
- Last updated: The date and time that the sink was last edited.
Each table row has a menu more_vert and provides the following options:
- View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
- Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
- Disable sink: Lets you disable the sink and stop routing logs to the sink's destination. For more information on disabling sinks, see Stop logs ingestion.
- Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.
- Delete sink: Lets you delete the sink and stop routing logs to the sink's destination.
- Troubleshoot sink: Opens the Logs Explorer where you can troubleshoot errors with the sink.
- View sink log volume and error rates: Opens the Metrics Explorer where you can view and analyze data from the sink.
Clicking on any of the column names lets you sort data in ascending or descending order.
API
To view the sinks for your Google Cloud project, call projects.sinks.list.
To view a sink's details, call projects.sinks.get.
To update a sink, call projects.sinks.update. You can update a sink's destination, filters, and description. You can also disable or reenable the sink.
To disable a sink, call projects.sinks.update and set the disabled property to true. To reenable the sink, call projects.sinks.update and set the disabled property to false.
To delete a sink, call projects.sinks.delete.
For more information on any of these methods for managing sinks using the Logging API, see the LogSink reference.
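For example, a projects.sinks.get request can be sketched with curl; the project and sink names are placeholders, and the command is echoed rather than sent:

```shell
# Hypothetical names; replace with your own.
PROJECT_ID="my-project"
SINK_NAME="my-sink"

# Echoed so this sketch has no side effects; remove echo to send the request
# with a real access token.
echo curl -H "Authorization: Bearer \$(gcloud auth print-access-token)" \
  "https://logging.googleapis.com/v2/projects/$PROJECT_ID/sinks/$SINK_NAME"
```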
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:
gcloud logging sinks list
To view your list of aggregated sinks, use the appropriate flag to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID flag to list the sinks for the organization.
To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:
gcloud logging sinks describe SINK_NAME
To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update. You can update a sink to change the destination, filters, and description, or to disable or reenable the sink:
gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER
Omit NEW_DESTINATION or --log-filter if those parts don't change.
For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:
gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket
To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled flag:
gcloud logging sinks update _Default --disabled
To reenable the sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:
gcloud logging sinks update _Default --no-disabled
To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:
gcloud logging sinks delete SINK_NAME
For more information on managing sinks using the Google Cloud CLI, see the gcloud logging sinks reference.
Stop logs ingestion
For each Google Cloud project, Logging automatically creates two log buckets: _Required and _Default. Logging automatically creates two log sinks, _Required and _Default, that route logs to the correspondingly named buckets.
You can't disable the _Required sink; neither ingestion pricing nor storage pricing applies to the logs data stored in the _Required log bucket. You can disable the _Default sink to stop logs from being ingested into the _Default bucket. You can also disable any user-defined sinks.
When you stop logs ingestion for the _Default bucket by disabling all the sinks in your Google Cloud project that send logs to the _Default bucket, no new Cloud Logging ingestion charges are incurred by your Google Cloud project for the _Default bucket. The _Default bucket is empty when all of the previously ingested logs in the _Default bucket have fulfilled the bucket's retention period.
To disable your Google Cloud project sinks that route logs to the _Default bucket, complete the following steps:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
To find all the sinks that route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.
For each sink, select Menu more_vert and then select Disable sink.
The sinks are now disabled and your Google Cloud project sinks no longer route logs to the _Default bucket.
To reenable a disabled sink and restart routing logs to the sink's destination, do the following:
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
To find all the disabled sinks previously configured to route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.
For each sink, select Menu more_vert and then select Enable sink.
API
To view the sinks for your Google Cloud project, call the Logging API method projects.sinks.list. Identify any sinks that are routing to the _Default bucket.
For example, to disable the _Default sink, call projects.sinks.update and set the disabled property to true.
The _Default sink is now disabled; it no longer routes logs to the _Default bucket. To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the steps above.
To reenable a sink, call projects.sinks.update and set the disabled property to false.
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:
gcloud logging sinks list
Identify any sinks that are routing to the _Default bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:
gcloud logging sinks describe SINK_NAME
For example, to disable the _Default sink, use the gcloud logging sinks update command and include the --disabled flag:
gcloud logging sinks update _Default --disabled
The _Default sink is now disabled; it no longer routes logs to the _Default bucket. To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the steps above.
To reenable a sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:
gcloud logging sinks update _Default --no-disabled
Set destination permissions
This section describes how to grant Logging the Identity and Access Management permissions to write logs to your sink's destination. For the full list of Logging roles and permissions, see Access control.
Cloud Logging creates a shared service account for a resource when a log sink is created, unless the required service account already exists. The service account might exist because the same service account is used for all sinks in the underlying resource. Resources can be a Google Cloud project, an organization, a folder, or a billing account.
The writer identity of a sink is the identifier of the service account associated with that sink. All sinks have a writer identity except for sinks that write to a log bucket in the current Google Cloud project. When the destination of a sink is a log bucket in the current Google Cloud project, the sink doesn't require any additional destination permissions. Therefore, the value of the writer identity field is listed as None in the console, and it isn't reported by the API or the Google Cloud CLI commands.
Following are the instructions for setting Google Cloud project-level permissions for your sink to route to its destination. Instead of a Google Cloud project, you can specify a billing account, folder, or organization:
Console
To get the sink's writer identity—an email address—from the new sink, do the following:
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select Menu more_vert, and then select View sink details. The writer identity appears in the Sink details panel.
If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.
Click Copy content_copy to copy the sink's writer identity into your clipboard.
If you have Owner access to the destination, then add the service account as an IAM principal in the destination project:
- For Cloud Storage destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Storage Object Creator role (roles/storage.objectCreator).
- For BigQuery destinations, add the sink's writer identity as a principal by using IAM, and then grant it the BigQuery Data Editor role (roles/bigquery.dataEditor).
- For Pub/Sub destinations, including Splunk, add the sink's writer identity as a principal by using IAM, and then grant it the Pub/Sub Publisher role (roles/pubsub.publisher).
- For Logging bucket destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Bucket Writer role (roles/logging.bucketWriter).
- (Preview) For destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
API
Call the API method projects.sinks.list to list the sinks in your Google Cloud project.
Locate the sink whose permissions you want to modify. If the sink details contain a JSON key labeled "writerIdentity", then proceed to the next step. When the details don't include a "writerIdentity" field, you don't need to configure destination permissions for the sink.
If you have IAM Owner access to the destination, add the service account to the destination by doing the following:
- For Cloud Storage destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Storage Object Creator role (roles/storage.objectCreator).
- For BigQuery destinations, add the sink's writer identity as a principal by using IAM, and then grant it the BigQuery Data Editor role (roles/bigquery.dataEditor).
- For Pub/Sub destinations, including Splunk, add the sink's writer identity as a principal by using IAM, and then grant it the Pub/Sub Publisher role (roles/pubsub.publisher).
- For Logging bucket destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Bucket Writer role (roles/logging.bucketWriter).
- (Preview) For destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
gcloud
Get the service account from the writerIdentity field in your sink:
gcloud logging sinks describe SINK_NAME
Locate the sink whose permissions you want to modify. If the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.
The value of the SERVICE_ACCOUNT field in the following steps is the writer identity, which looks similar to the following:
serviceAccount:service-p-123456789012@gcp-sa-logging.iam.gserviceaccount.com
If you have IAM Owner access to the destination, add the service account to the destination by doing the following:
- For Cloud Storage destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Storage Object Creator role (roles/storage.objectCreator).
- For BigQuery destinations, add the sink's writer identity as a principal by using IAM, and then grant it the BigQuery Data Editor role (roles/bigquery.dataEditor).
- For Pub/Sub destinations, including Splunk, add the sink's writer identity as a principal by using IAM, and then grant it the Pub/Sub Publisher role (roles/pubsub.publisher).
- For Logging bucket destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Bucket Writer role (roles/logging.bucketWriter).
- (Preview) For destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
For example, if you're routing logs between Logging buckets in different Google Cloud projects, you would add roles/logging.bucketWriter to the service account as follows:
Get the Identity and Access Management policy for the destination Google Cloud project and write it to a local file in JSON format:
gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
Add an IAM condition that lets the service account write only to the Cloud Logging bucket you created. For example:
{
  "bindings": [
    {
      "members": [
        "user:username@gmail.com"
      ],
      "role": "roles/owner"
    },
    {
      "members": [
        "SERVICE_ACCOUNT"
      ],
      "role": "roles/logging.bucketWriter",
      "condition": {
        "title": "Bucket writer condition example",
        "description": "Grants logging.bucketWriter role to service account SERVICE_ACCOUNT used by log sink [SINK_NAME]",
        "expression": "resource.name.endsWith('locations/global/buckets/BUCKET_ID')"
      }
    }
  ],
  "etag": "BwWd_6eERR4=",
  "version": 3
}
Update the IAM policy:
gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
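Alternatively, the same conditional grant can be sketched as a single gcloud projects add-iam-policy-binding command; the project, service account, and bucket name below are placeholders, and the command is echoed rather than executed:

```shell
# Hypothetical values; replace with your own.
DEST_PROJECT="central-logging"
SA="serviceAccount:service-p-123456789012@gcp-sa-logging.iam.gserviceaccount.com"
EXPR="resource.name.endsWith('locations/global/buckets/audit-bucket')"

# Echoed so this sketch has no side effects; remove echo to apply the binding.
echo gcloud projects add-iam-policy-binding "$DEST_PROJECT" \
  --member="$SA" \
  --role="roles/logging.bucketWriter" \
  --condition="expression=$EXPR,title=log-bucket-writer"
```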
Destination limitations
Depending on the destination you route your logs to, there might be some limitations.
Route to log buckets in different Google Cloud projects
The following limitations apply when you route your logs to log buckets in different Google Cloud projects:
If you route logs to a log bucket stored in a different project, then Error Reporting can't analyze those logs.
Error Reporting is disabled on logs stored in log buckets that use customer-managed encryption keys.
Route to different Google Cloud projects
The following limitations apply when you route your logs to different Google Cloud projects:
If you route logs to a different project, then Error Reporting can't analyze those logs.
There is a one-hop limit. For example, if you route log entries from project A to project B, then you can't route the log entries from project B to a different project.
Audit logs aren't routed to the _Required bucket in the destination project. You must create another sink or bucket to store them.
If the organization or folder that contains the Google Cloud project that you are routing to has existing aggregated sinks, then the logs aren't routed by those aggregated sinks.
Code samples
To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.
Filter examples
Following are some filter examples that are particularly useful when creating sinks.
For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.
Restore the _Default sink filter
If you edited the filter for the _Default
sink, you might want to restore
its default filter. To do so, enter the following inclusion filter:
NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT
LOG_ID("externalaudit.googleapis.com/activity") AND NOT
LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT
LOG_ID("externalaudit.googleapis.com/system_event") AND NOT
LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT
LOG_ID("externalaudit.googleapis.com/access_transparency")
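If you prefer the Google Cloud CLI, the restore can be sketched with gcloud logging sinks update; the command is echoed here so the sketch has no side effects:

```shell
# The default _Default sink inclusion filter, as a single line.
FILTER='NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT LOG_ID("externalaudit.googleapis.com/activity") AND NOT LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT LOG_ID("externalaudit.googleapis.com/system_event") AND NOT LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT LOG_ID("externalaudit.googleapis.com/access_transparency")'

# Echoed so this sketch has no side effects; remove echo to apply it.
echo gcloud logging sinks update _Default --log-filter="$FILTER"
```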
Exclude Google Kubernetes Engine container and pod logs
To exclude Google Kubernetes Engine container and pod logs for GKE system namespaces, use the following filter:
resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")
To exclude Google Kubernetes Engine node logs for GKE system logNames, use the following filter:
resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")
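To attach such a filter as a sink exclusion from the CLI, a sketch using the --add-exclusion flag of gcloud logging sinks update might look like the following; the exclusion name and the shortened namespace list are placeholders, and the command is echoed rather than executed:

```shell
# Placeholder exclusion; the filter matches GKE system-namespace container and
# pod logs (shortened to two namespaces for illustration).
FILTER='resource.type=("k8s_container" OR "k8s_pod") AND resource.labels.namespace_name=("kube-system" OR "gke-system")'

# Echoed so this sketch has no side effects; remove echo to apply it.
echo gcloud logging sinks update _Default \
  --add-exclusion="name=gke-system-logs,filter=$FILTER"
```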
To view the volume of Google Kubernetes Engine node, pod and container logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.
Exclude Dataflow logs not required for supportability
To exclude Dataflow logs that aren't required for supportability, use the following filter:
resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"
To view the volume of Dataflow logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.
Supportability
While Cloud Logging provides you with the ability to exclude logs from being ingested, you might want to consider keeping logs that help with supportability. Using these logs can help you quickly troubleshoot and identify issues with your applications.
For example, GKE system logs are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These logs can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.
For Dataflow, we recommend that you, at a minimum, ingest your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability"). These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.
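As a sketch, a sink that keeps only these two Dataflow log types could set an inclusion filter like the following; the sink name my-dataflow-sink is a placeholder, and the command is echoed rather than executed:

```shell
# Keep only Dataflow system and supportability logs.
FILTER='resource.type="dataflow_step" AND (labels."dataflow.googleapis.com/log_type"="system" OR labels."dataflow.googleapis.com/log_type"="supportability")'

# Echoed so this sketch has no side effects; remove echo to apply it.
echo gcloud logging sinks update my-dataflow-sink --log-filter="$FILTER"
```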
What's next
If you encounter issues as you use sinks to route logs, see Troubleshoot routing logs.
To learn how to view your routed logs in their destinations, as well as how the logs are formatted and organized, see View logs in sink destinations.
To learn more about querying and filtering with the Logging query language, see Logging query language.