Configure and manage sinks


This document explains how to create and manage sinks to route log entries to supported destinations.

Overview

Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to supported destinations.

Sinks belong to a given Google Cloud resource: Cloud projects, billing accounts, folders, and organizations. When the resource receives a log entry, it routes the log entry according to the sinks contained by that resource. The log entry is sent to the destination associated with each matching sink.

An aggregated sink is a type of sink that combines and routes log entries from the Google Cloud resources contained by an organization or folder. For instructions, see Configure aggregated sinks.

To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, and the Google Cloud CLI. Using the Google Cloud console has the following advantages over the other methods:

  • View and manage all of your sinks in one place.
  • Preview which log entries are matched by your sink's filter before you create the sink.
  • Create and authorize sink destinations for your sinks.

Supported destinations

You can route logs to the following destinations:

  • Cloud Storage: JSON files stored in Cloud Storage buckets.
  • Pub/Sub: JSON messages delivered to Pub/Sub topics. Supports third-party integrations, such as Splunk, with Logging.
  • BigQuery: Tables created in BigQuery datasets.
  • Another Cloud Logging bucket: Log entries held in Cloud Logging log buckets.
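As a sketch, the destination path formats used throughout this document can be collected into one small helper. This is our own illustration (the `sink_destination` function and its service keys aren't part of any Google API); the path formats themselves are the ones Logging expects:

```python
def sink_destination(service, project, name):
    """Return the sink destination path for a supported service.

    Illustrative helper only; the path formats match the ones
    documented for Cloud Logging sinks.
    """
    formats = {
        # Log entries held in another Cloud Logging log bucket (global region shown).
        "logging": f"logging.googleapis.com/projects/{project}/locations/global/buckets/{name}",
        # Tables created in a BigQuery dataset.
        "bigquery": f"bigquery.googleapis.com/projects/{project}/datasets/{name}",
        # JSON files stored in a Cloud Storage bucket; bucket names are global,
        # so no project appears in the path.
        "storage": f"storage.googleapis.com/{name}",
        # JSON messages delivered to a Pub/Sub topic.
        "pubsub": f"pubsub.googleapis.com/projects/{project}/topics/{name}",
    }
    return formats[service]

print(sink_destination("bigquery", "myproject123", "my_dataset"))
# bigquery.googleapis.com/projects/myproject123/datasets/my_dataset
```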

Before you begin

The instructions in this document describe how to create and manage sinks at the Cloud project level, but you can also create non-aggregated sinks for billing accounts, folders, and organizations.

As you get started, ensure the following:

  • You have a Google Cloud project with logs that you can see in the Logs Explorer.

  • You have one of the following IAM roles for the source Cloud project from which you're routing logs.

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)

    The permissions contained in these roles allow you to create, delete, or modify sinks. For information on setting IAM roles, see the Logging Access control guide.

  • You have a resource in a supported destination or have the ability to create one.

    You must create the routing destination before you create the sink, by using the Google Cloud CLI, the Google Cloud console, or the Google Cloud APIs. You can create the destination in any Cloud project in any organization, but first ensure that the sink's service account has permissions to write to the destination.
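As an illustration, each destination type could be created ahead of time from the command line. The names below are placeholders of our own, and most optional flags are omitted:

```
# Cloud Logging log bucket in the destination project
gcloud logging buckets create BUCKET_NAME --location=global --project=DESTINATION_PROJECT

# BigQuery dataset
bq mk --dataset PROJECT_ID:DATASET_ID

# Cloud Storage bucket
gsutil mb gs://BUCKET_NAME

# Pub/Sub topic
gcloud pubsub topics create TOPIC_ID
```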

Create a sink

Following are the instructions for creating a sink in a Cloud project. Instead of a Cloud project, you can specify a billing account, folder, or organization.

You can create up to 200 sinks per Cloud project.

After you create the sink, ensure that Logging has the appropriate permissions to write logs to your sink's destination; see Set destination permissions.

To create a sink, do the following:

Console

  1. In the Google Cloud console, go to the Logs Router page:

    Go to Logs Router

  2. Select an existing Cloud project.

  3. Select Create sink.

  4. In the Sink details panel, enter the following details:

    • Sink name: Provide an identifier for the sink; note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.

    • Sink description (optional): Describe the purpose or use case for the sink.

  5. In the Sink destination panel, select the sink service and destination by using the Select sink service menu.

    If you are routing to a service that is in the same Cloud project, select one of the following options:

    • Cloud Logging bucket: Select or create a Logging bucket.
    • BigQuery table: Select or create the particular dataset to receive the routed logs. You also have the option to use partitioned tables.
    • Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed logs.
    • Pub/Sub topic: Select or create the particular topic to receive the routed logs.
    • Splunk: Select the Pub/Sub topic for your Splunk service.

    If you are routing to a destination that is in another project, then select Other project. You must provide the Logging, BigQuery, Cloud Storage, or Pub/Sub service and destination information.

    To route log entries to a Cloud Logging log bucket that uses the global region and is defined in a different Google Cloud project, the sink destination is the following:

    logging.googleapis.com/projects/DESTINATION_PROJECT/locations/global/buckets/BUCKET_NAME
    

    To route log entries to a BigQuery dataset, the sink destination is the following:

    bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
    

    To route log entries to a Cloud Storage bucket, the sink destination is the following:

    storage.googleapis.com/BUCKET_NAME
    

    To route log entries to a Pub/Sub topic, the sink destination is the following:

    pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
    

    Note that if you are routing logs between Cloud projects, you still need the appropriate destination permissions.

  6. In the Choose logs to include in sink panel, do the following:

    1. In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.

      If you don't set a filter, all logs from your selected resource are routed to the destination.

      For example, you might want to build a filter to route all Data Access logs to a single Logging bucket. This filter looks like the following:

      LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
      

      Note that the length of a filter can't exceed 20,000 characters.

    2. To verify you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter prepopulated.

  7. (Optional) In the Choose logs to filter out of sink panel, do the following:

    1. In the Exclusion filter name field, enter a name.

    2. In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.

    You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.

  8. Select Create sink.

API

  1. To create a logging sink in your Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:

    • name: An identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
    • destination: The service and destination to where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then destination would look like the following:

      bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
      
  2. In the LogSink object, provide the appropriate optional information:

    • filter: Set the filter property to match the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination. Note that the length of a filter can't exceed 20,000 characters.
    • exclusions: Set this property to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
    • description: Set this property to describe the purpose or use case for the sink.
  3. Call projects.sinks.create to create the sink.

  4. If the API response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions.

    You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".
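Putting the required and optional fields together, the projects.sinks.create request body can be sketched as a plain Python dict. The sink name, destination, and filter values below are illustrative examples of our own, not values from any real project:

```python
import json

# Illustrative LogSink request body; the field names are the LogSink
# fields described above, the values are made-up examples.
log_sink = {
    # Required: identifier (can't be renamed later) and destination.
    "name": "my-bigquery-sink",
    "destination": "bigquery.googleapis.com/projects/my-project/datasets/my_dataset",
    # Optional: inclusion filter (max 20,000 characters).
    "filter": 'resource.type="gce_instance"',
    # Optional: up to 50 exclusion filters per sink.
    "exclusions": [
        {
            "name": "drop-most-debug",
            # sample() selects a portion of matching entries to exclude.
            "filter": "severity=DEBUG AND sample(insertId, 0.99)",
        }
    ],
    # Optional: human-readable purpose.
    "description": "Routes Compute Engine instance logs to BigQuery",
}

# The dict serializes to the JSON body sent with projects.sinks.create.
print(json.dumps(log_sink, indent=2))
```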

For more information on creating sinks using the Logging API, see the LogSink reference.

gcloud

To create a sink, run the following gcloud logging sinks create command.

Provide the appropriate values for the variables in the command as follows:

  • SINK_NAME: An identifier for the sink. Note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.
  • SINK_DESTINATION: The service and destination to where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then SINK_DESTINATION would look like the following:

    bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
    
  • OPTIONAL_FLAGS includes the following flags:

    • --log-filter: Use this flag to set a filter that matches the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination.
    • --exclusion: Use this flag to set an exclusion filter for log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. This flag can be repeated; you can create up to 50 exclusion filters per sink.
    • --description: Use this flag to describe the purpose or use case for the sink.
gcloud logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS

For example, to create a sink to a Logging bucket, your command might look like this:

gcloud logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
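The --exclusion flag takes name and filter keys and can be repeated. As an illustrative sketch (the sink name, exclusion name, and resource type are placeholders of our own), the following command routes all logs while sampling out about 99% of load-balancer entries:

```
gcloud logging sinks create my-sink SINK_DESTINATION \
  --exclusion=name=lb-sample,filter='resource.type="http_load_balancer" AND sample(insertId, 0.99)' \
  --description="Routes all logs, keeping only ~1% of load-balancer entries"
```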

For more information on creating sinks using the Google Cloud CLI, including more flags and examples, see the gcloud logging sinks reference.

New log sinks to Cloud Storage buckets might take several hours to start routing logs. Sinks to Cloud Storage are processed hourly while other destination types are processed in real time.

For information about how to view logs in the sink destinations, see Find routed logs.

After creating the sink, you can view the number and volume of log entries received using the logging.googleapis.com/exports/ metrics.

If you receive error notifications, see Troubleshoot routing and sinks.

Route logs between log buckets in different Cloud projects

You can route logs to a destination in a different Cloud project than the one the sink is created in.

To do so, you must do one of the following:

  • Give your sink's service account the roles/logging.bucketWriter role to write to the destination; see Destination permissions for instructions.

  • Have one of the following IAM roles for the source Cloud project from which you're sending logs.

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)

    If you're creating a new Logging bucket in the destination Cloud project, you must have one of these roles.

Manage sinks

After your sinks are created, you can perform these actions on them:

  • View sink details
  • Update sink
  • Disable sink
  • Delete sink
  • Troubleshoot sink
  • View sink log volume and error rates

If you delete a sink, note the following:

  • You can't delete the _Default and the _Required sinks, but you can disable _Default sinks to stop routing logs to _Default Logging buckets.
  • After a sink is deleted, it stops routing log entries.
  • Sinks created before September 30, 2022 have a dedicated service account. When you delete that sink, the service account is deleted. Sinks created on or after September 30, 2022 have a shared service account. Deleting the sink doesn't delete the shared service account.

Any changes made to a sink might take a few minutes to apply.

Following are the instructions for managing a sink in a Cloud project. Instead of a Cloud project, you can specify a billing account, folder, or organization:

Console

You can view and manage your sinks in the Logs Router page:

Go to Logs Router

Select the Cloud project that contains your sink by using the resource selector from anywhere in the Google Cloud console:


To view your aggregated sinks, select the organization, folder, or billing account that contains the sink.

The Logs Router page contains a table summary of sinks. Each table row contains information about a sink's properties:

  • Enabled: Indicates if the sink's state is enabled or disabled.
  • Type: The sink's destination service; for example, Cloud Logging bucket.
  • Name: The sink's identifier, as provided when the sink was created; for example _Default.
  • Description: The sink's description, as provided when the sink was created.
  • Destination: Full name of the destination for where the routed log entries will be sent.
  • Created: The date and time that the sink was created.
  • Last updated: The date and time that the sink was last edited.

Each table row has a menu and provides the following options:

  • View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
  • Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
  • Disable sink: Lets you disable the sink and stop routing logs to the sink's destination. For more information on disabling sinks, see Stop logs ingestion.
  • Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.
  • Delete sink: Lets you delete the sink and stop routing logs to the sink's destination.
  • Troubleshoot sink: Opens the Logs Explorer where you can troubleshoot errors with the sink.
  • View sink log volume and error rates: Opens the Metrics Explorer where you can view and analyze data from the sink.

Clicking on any of the column names lets you sort data in ascending or descending order.


gcloud

  • To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

    gcloud logging sinks list
    

    To view your list of aggregated sinks, use the appropriate flag to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID flag to list the sinks for the organization.

  • To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

    gcloud logging sinks describe SINK_NAME
    
  • To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update.

    You can update a sink to change the destination, filters, and description, or to disable or reenable the sink:

    gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER

    Omit NEW_DESTINATION or --log-filter if those parts don't change.

    For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:

    gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket
    
  • To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled flag:

    gcloud logging sinks update _Default --disabled

    To reenable the sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:

    gcloud logging sinks update _Default --no-disabled
    
  • To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:

    gcloud logging sinks delete SINK_NAME
    

    For more information on managing sinks using the Google Cloud CLI, see the gcloud logging sinks reference.

Stop logs ingestion

For each Cloud project, Logging automatically creates two log buckets: _Required and _Default. Logging automatically creates two log sinks, _Required and _Default, that route logs to the correspondingly named buckets.

You can't disable the _Required sink; neither ingestion pricing nor storage pricing applies to the logs data stored in the _Required log bucket. You can disable the _Default sink to stop logs from being ingested into the _Default bucket. You can also disable any user-defined sinks.

When you stop logs ingestion for the _Default bucket by disabling all the sinks in your Cloud project that send logs to the _Default bucket, no new Cloud Logging ingestion charges are incurred by your Cloud project for the _Default bucket. The _Default bucket is empty when all of the previously ingested logs in the _Default bucket have fulfilled the bucket's retention period.

To disable your Cloud project sinks that route logs to the _Default bucket, complete the following steps:

Console

  1. In the Google Cloud console, go to the Logs Router page:

    Go to Logs Router

  2. To find all the sinks that route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.


  3. For each sink, select Menu and then select Disable sink.

The sinks are now disabled and your Cloud project sinks no longer route logs to the _Default bucket.

To reenable a disabled sink and restart routing logs to the sink's destination, do the following:

  1. In the Google Cloud console, go to the Logs Router page:

    Go to Logs Router

  2. To find all the disabled sinks previously configured to route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.

  3. For each sink, select Menu and then select Enable sink.

API

  1. To view the sinks for your Cloud project, call the Logging API method projects.sinks.list.

    Identify any sinks that are routing to the _Default bucket.

  2. For example, to disable the _Default sink, call projects.sinks.update and set the disabled property to true.

The _Default sink is now disabled; it no longer routes logs to the _Default bucket.

To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.

To reenable a sink, call projects.sinks.update and set the disabled property to false.
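In a minimal sketch, the request body for disabling or reenabling a sink is just the disabled field; pair it with the updateMask=disabled query parameter so that only that field changes:

```python
# Request body for projects.sinks.update to disable a sink
# (send with updateMask=disabled so only this field is modified).
disable_body = {"disabled": True}

# The opposite body reenables the sink.
enable_body = {"disabled": False}

print(disable_body, enable_body)
```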

gcloud

  1. To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

    gcloud logging sinks list
    
  2. Identify any sinks that are routing to the _Default bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

    gcloud logging sinks describe SINK_NAME
    
  3. For example, to disable the _Default sink, use the gcloud logging sinks update command and include the --disabled flag:

    gcloud logging sinks update _Default --disabled
    

The _Default sink is now disabled; it no longer routes logs to the _Default bucket.

To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.

To reenable a sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:

gcloud logging sinks update _Default --no-disabled

Set destination permissions

This section describes how to grant Logging the Identity and Access Management permissions to write logs to your sink's destination. For the full list of Logging roles and permissions, see Access control.

Cloud Logging creates a shared service account for a resource when a log sink is created, unless the required service account already exists. The service account might exist because the same service account is used for all sinks in the underlying resource. Resources can be a Google Cloud project, an organization, a folder, or a billing account.

The writer identity of a sink is the identifier of the service account associated with that sink. All sinks have a writer identity except for sinks that write to a log bucket in the current Cloud project. When the destination of a sink is a log bucket in the current Cloud project, the sink doesn't require any additional destination permissions. Therefore, the value of the writer identity field is listed as None in the console, and it isn't reported by the API and the Google Cloud CLI commands.

Following are the instructions for setting Cloud project-level permissions for your sink to route to its destination. Instead of a Cloud project, you can specify a billing account, folder, or organization:

Console

  1. To get the sink's writer identity—an email address—from the new sink, do the following:

    1. In the Google Cloud console, go to the Logs Router page:

      Go to Logs Router

    2. Select Menu, and then select View sink details. The writer identity appears in the Sink details panel.

  2. If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.

  3. Click Copy to copy the sink's writer identity into your clipboard.

  4. If you have Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

API

  1. Call the API method projects.sinks.list to list the sinks in your Google Cloud project.

  2. Locate the sink whose permissions you want to modify, and if the sink details contain a JSON key labeled "writerIdentity", then proceed to the next step. When the details don't include a "writerIdentity" field, you don't need to configure destination permissions for the sink.

  3. If you have IAM Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

gcloud

  1. Get the service account from the writerIdentity field in your sink:

    gcloud logging sinks describe SINK_NAME
    
  2. Locate the sink whose permissions you want to modify, and if the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.

    In the following steps, the value of the SERVICE_ACCOUNT field is the writer identity, which looks similar to the following:

    serviceAccount:service-p-123456789012@gcp-sa-logging.iam.gserviceaccount.com
    
  3. If you have IAM Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

    For example, if you're routing logs between Logging buckets in different Cloud projects, you would add roles/logging.bucketWriter to the service account as follows:

    1. Get the Identity and Access Management policy for the destination Cloud project and write it to a local file in JSON format:

      gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
      
    2. Add an IAM condition that lets the service account write only to the Cloud Logging bucket you created. For example:

      {
      "bindings": [
       {
         "members": [
           "user:username@gmail.com"
         ],
         "role": "roles/owner"
       },
       {
         "members": [
           "SERVICE_ACCOUNT"
         ],
         "role": "roles/logging.bucketWriter",
         "condition": {
             "title": "Bucket writer condition example",
             "description": "Grants logging.bucketWriter role to service account SERVICE_ACCOUNT used by log sink [SINK_NAME]",
             "expression":
               "resource.name.endsWith(\'locations/global/buckets/BUCKET_ID\')"
         }
       }
      ],
      "etag": "BwWd_6eERR4=",
      "version": 3
      }
    3. Update the IAM policy:

      gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
      

Code samples

To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.

Filter examples

Following are some filter examples that are particularly useful when creating sinks.

For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.

Restore the _Default sink filter

If you edited the filter for the _Default sink, you might want to restore its default filter. To do so, enter the following inclusion filter:

  NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT \
  LOG_ID("externalaudit.googleapis.com/activity") AND NOT \
  LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT \
  LOG_ID("externalaudit.googleapis.com/system_event") AND NOT \
  LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT \
  LOG_ID("externalaudit.googleapis.com/access_transparency")

Exclude Google Kubernetes Engine container and pod logs

To exclude Google Kubernetes Engine container and pod logs for GKE system namespaces, use the following filter:

resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")

To exclude Google Kubernetes Engine node logs for GKE system logNames, use the following filter:

resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")

To view the volume of Google Kubernetes Engine node, pod, and container logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.

Exclude Dataflow logs not required for supportability

To exclude Dataflow logs that aren't required for supportability, use the following filter:

resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"

To view the volume of Dataflow logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.

Supportability

While Cloud Logging provides you with the ability to exclude logs from being ingested, you might want to consider keeping logs that help with supportability. Using these logs can help you quickly troubleshoot and identify issues with your applications.

For example, GKE system logs are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These logs can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.

For Dataflow, we recommend that you, at a minimum, ingest your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability"). These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.

What's next