Configure sinks

This document explains how to create and manage sinks to route log entries using the Cloud Console, the Cloud Logging API, and the gcloud command-line tool.

In summary, you route logs by creating one or more sinks that include a filter expression and a destination. As Logging receives new log entries, they are compared against each sink. If a log entry matches a sink's filter, then a copy of the log entry is written to the sink's destination. For a broader conceptual overview of sinks, see Routing and storage overview: Sinks.

Using the Cloud Console, you can do the following:

  • View and manage all of your sinks in one place.
  • Preview which log entries are matched by your sink's filter before you create the sink.
  • Create and authorize sink destinations for your sinks.

However, the Cloud Console can create and view sinks only in Cloud projects. To create sinks in organizations, folders, or billing accounts, use the gcloud command-line tool or the Cloud Logging API; see Aggregated sinks.

Supported destinations

You can route logs within the same Cloud project or between Cloud projects to the following destinations:

  • Cloud Storage: JSON files stored in Cloud Storage buckets.
  • Pub/Sub: JSON messages delivered to Pub/Sub topics. Supports third-party integrations, such as Splunk, with Logging.
  • BigQuery: Tables created in BigQuery datasets.
  • Another Cloud Logging bucket: Log entries held in Cloud Logging log buckets.
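
When you specify a destination in the Cloud Logging API or the gcloud command-line tool, you identify it with a path-style string. As a sketch, using placeholder names in capitals, the formats look like the following:

  storage.googleapis.com/BUCKET_NAME
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
  logging.googleapis.com/projects/PROJECT_ID/locations/LOCATION/buckets/BUCKET_ID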

To create sinks in organizations, folders, or billing accounts, see Aggregated sinks.

Before you begin

Before you can create a sink, verify the following:

  • You have a Google Cloud project with logs that you can see in the Logs Explorer.

  • You have one of the following IAM roles for the source Cloud project from which you're sending logs:

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)

    The permissions contained in these roles allow you to create, delete, or modify sinks. For information on setting IAM roles, see the Logging Access control guide.

  • You have a resource in a supported destination or have the ability to create one.

    The destination for a log sink must exist before you create the sink. You can create it through the gcloud command-line tool, the Cloud Console, or the Google Cloud APIs, in any Cloud project in any organization, as long as the sink's service account has permissions to write to the destination. For example, you might create a destination as shown in the sketch below.
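
    As a minimal sketch, assuming hypothetical resource names such as my-sink-bucket (PROJECT_ID is a placeholder), the following commands each create one destination of a supported type:

      # Cloud Storage bucket
      gcloud storage buckets create gs://my-sink-bucket

      # Pub/Sub topic
      gcloud pubsub topics create my-sink-topic

      # Cloud Logging bucket
      gcloud logging buckets create my-log-bucket --location=global

      # BigQuery dataset
      bq mk --dataset PROJECT_ID:my_sink_dataset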

Create a sink

Following are the instructions for creating a sink in a Cloud project using the Cloud Console, the Cloud Logging API, or the gcloud command-line tool.

You can create up to 200 sinks per Cloud project.

To create a sink, do the following:

Console

  1. In the Cloud Console, go to the Logging > Log Router page.

    Go to the Log Router

  2. Select an existing Cloud project.

  3. Select Create sink.

  4. In the Sink details panel, enter the following details:

    • Sink name: Provide an identifier for the sink. Note that after you create the sink, you can't rename it, but you can delete it and create a new sink.

    • Sink description (optional): Describe the purpose or use case for the sink.

  5. In the Sink destination panel, select the sink service and destination:

    • Select sink service: Select the service where you want your logs routed.

    Based on the service that you select, you can select from the following destinations:

    • Cloud Logging bucket: Select or create a Logging bucket.
    • BigQuery table: Select or create the particular dataset to receive the routed logs. You also have the option to use partitioned tables.
    • Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed logs.
    • Pub/Sub topic: Select or create the particular topic to receive the routed logs.
    • Splunk: Select the Pub/Sub topic for your Splunk service.
    • Other Cloud project: Manually add the Logging, BigQuery, Cloud Storage, or Pub/Sub service and destination information in the following format:

      SERVICE.googleapis.com/projects/PROJECT_ID/SINK_DESTINATION/DESTINATION_ID
      

      For example, if your sink destination is a BigQuery dataset, the sink destination would be the following:

      bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
      

      Note that if you are routing logs between Cloud projects, you still need the appropriate destination permissions.

  6. Choose logs to include in the sink in the Build inclusion filter panel.

    1. Enter a filter expression that matches the log entries you want to include. If you don't set a filter, all logs from your Cloud project are routed to the destination.

      For example, you might want to build a filter to route all Data Access logs to a single Logging bucket. This filter looks like the following:

      LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
      

      Note that the length of a filter can't exceed 20,000 characters.

    2. To verify you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter prepopulated.

  7. (Optional) Choose logs to exclude from the sink in the Build an exclusion filter panel:

    1. Enter a name in the Exclusion filter name field.

    2. In the Build an exclusion filter section, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.

    You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters. For a sample-based exclusion filter, see the sketch after these steps.

  8. Select Create sink.
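
For example, an exclusion filter that uses the sample function to exclude 99% of high-volume load balancer request entries might look like the following (the resource type is illustrative):

  resource.type="http_load_balancer" AND sample(insertId, 0.99)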

API

  1. To create a logging sink in your Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:

    • name: An identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
    • destination: The service and destination where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then destination would look like the following:

      bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
      
  2. In the LogSink object, provide the appropriate optional information:

    • filter: Set the filter property to match the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination. Note that the length of a filter can't exceed 20,000 characters.
    • exclusions: Set this property to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
    • description: Set this property to describe the purpose or use case for the sink.
  3. Call projects.sinks.create to create the sink.

  4. Retrieve the service account name from the writer_identity field returned from the API response.

  5. Give that service account permission to write to your sink destination.

    If you don't have permission to make that change to the sink destination, then send the service account name to someone who can make that change for you.

    For more information about granting service accounts permissions for resources, see the set destination permissions section.
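
    As an end-to-end sketch, assuming placeholder PROJECT_ID and DATASET_ID values and a file named sink.json with the following LogSink body:

      {
        "name": "my-sink",
        "destination": "bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID",
        "filter": "severity>=WARNING",
        "description": "Routes WARNING and higher to BigQuery"
      }

    you could call the API with curl, requesting a unique writer identity:

      # POST the LogSink body to projects.sinks.create
      curl -X POST \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        -d @sink.json \
        "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks?uniqueWriterIdentity=true"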

For more information on creating sinks using the Logging API, see the LogSink reference.

gcloud

To create a sink, run the following gcloud logging sinks create command:

gcloud logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS

Provide the appropriate values for the variables in the command as follows:

  • SINK_NAME: An identifier for the sink. Note that after you create the sink, you can't rename it, but you can delete it and create a new sink.
  • SINK_DESTINATION: The service and destination where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then SINK_DESTINATION would look like the following:

    bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
    
  • OPTIONAL_FLAGS includes the following flags:

    • --log-filter: Use this flag to set a filter that matches the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination.
    • --exclusion: Use this flag to set an exclusion filter for log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. This flag can be repeated; you can create up to 50 exclusion filters per sink.
    • --description: Use this flag to describe the purpose or use case for the sink.

For example, to create a sink to a Logging bucket, your command might look like this:

gcloud logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
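
As another sketch, this time assuming a hypothetical BigQuery dataset named my_dataset and an exclusion filter that drops DEBUG entries, the command might look like this:

gcloud logging sinks create my-bq-sink \
  bigquery.googleapis.com/projects/myproject123/datasets/my_dataset \
  --log-filter='resource.type="gce_instance"' \
  --exclusion=name=drop-debug,filter='severity=DEBUG' \
  --description="GCE instance logs, minus debug entries"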

For more information on creating sinks using the gcloud command-line tool, including more flags and examples, see the gcloud logging sinks reference.
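
You can also sanity-check an inclusion filter from the command line before creating the sink by running it through gcloud logging read; for example, using the filter from the first example:

gcloud logging read 'logName="projects/myproject123/logs/matched"' --limit=5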

For information about how to view logs in the sink destinations, see Find routed logs.

After creating the sink, you can view the number and volume of log entries received using the logging.googleapis.com/exports/ metrics.

If you receive error notifications, see Troubleshoot routing and sinks.

Route logs between log buckets in different Cloud projects

You can route logs to a destination in a different Cloud project than the one the sink is created in.

To do so, you must do one of the following:

  • Give your sink's service account the roles/logging.bucketWriter role to write to the destination; see Destination permissions for instructions.

  • Have one of the following IAM roles for the source Cloud project from which you're sending logs:

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)

    If you're creating a new Logging bucket in the destination Cloud project, you must have one of these roles.

Manage sinks

After your sinks are created, you can perform the following actions on them:

  • View sink details
  • Update sinks
  • Disable sinks
  • Delete sinks

To view and manage your sinks, do the following:

Console

You can view and manage your sinks in the Log Router page:

Go to Log Router

Make sure you've selected the Cloud project that contains your sink by using the resource selector from anywhere in the Cloud Console.

To view your aggregated sinks, select the organization, folder, or billing account that contains the sink.

The Log Router page contains a table summary of sinks. Each table row contains information about a sink's properties:

  • Enabled: Indicates if the sink's state is enabled or disabled.
  • Type: The sink's destination service; for example, Cloud Logging bucket.
  • Name: The sink's identifier, as provided when the sink was created; for example, _Default.
  • Description: The sink's description, as provided when the sink was created.
  • Destination: The full name of the destination to which the routed log entries are sent.
  • Created: The date and time that the sink was created.
  • Updated: The date and time that the sink was last edited.

Each table row has a menu that provides the following options:

  • View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
  • Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
  • Disable sink: Lets you disable the sink and stop routing logs to the sink's destination. For more information on disabling sinks, see Stop logs ingestion.
  • Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.
  • Delete sink: Lets you delete the sink and stop routing logs to the sink's destination. The _Default and the _Required sinks can't be deleted, but the _Default sink can be disabled to stop routing logs to the _Default Logging bucket.

Clicking on any of the column names lets you sort data in ascending or descending order.

API

  • To view the sinks for your Cloud project, call projects.sinks.list.

  • To view a sink's details, call projects.sinks.get.

  • To update a sink, call projects.sinks.update.

    You can update a sink's destination, filters, and description. You can also disable or reenable the sink.

  • To disable a sink, call projects.sinks.update and set the disabled property to true.

    To reenable the sink, call projects.sinks.update and set the disabled property to false.

  • To delete a sink, call projects.sinks.delete.

    Note that if you delete a sink, no new log entries are routed by it.

    For more information on any of these methods for managing sinks using the Logging API, see the LogSink reference.

gcloud

  • To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

    gcloud logging sinks list
    

    To view your list of aggregated sinks, use the appropriate flag to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID flag to list the sinks for the organization.
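
    For example, to list the sinks created at the organization level:

    gcloud logging sinks list --organization=ORGANIZATION_ID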

  • To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

    gcloud logging sinks describe SINK_NAME
    
  • To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update.

    You can update a sink to change the destination, filters, and description, or to disable or reenable the sink:

    gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER

    Omit the NEW_DESTINATION or --log-filter if those parts don't change.

    For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:

    gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket
    
  • To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled flag:

    gcloud logging sinks update _Default --disabled
    

    To reenable the sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:

    gcloud logging sinks update _Default --no-disabled
    
  • To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:

    gcloud logging sinks delete SINK_NAME
    

    Note that if you delete a sink, no new log entries are routed by it.

    For more information on managing sinks using the gcloud command-line tool, see the gcloud logging sinks reference.

Stop logs ingestion

For each Cloud project, Logging automatically creates two log buckets: _Required and _Default. Logging automatically creates two log sinks, _Required and _Default, that route logs to the correspondingly named buckets.

You can't disable the _Required sink; neither ingestion pricing nor storage pricing applies to the logs data stored in the _Required log bucket. You can disable the _Default sink to stop logs from being ingested into the _Default bucket. You can also disable any user-defined sinks.

When you stop logs ingestion for the _Default bucket by disabling all the sinks in your Cloud project that send logs to the _Default bucket, no new Cloud Logging ingestion charges are incurred by your Cloud project for the _Default bucket. The _Default bucket is empty when all of the previously ingested logs in the _Default bucket have fulfilled the bucket's retention period.

To disable your Cloud project sinks that route logs to the _Default bucket, complete the following steps:

Console

  1. Go to the Log Router:

    Go to Log Router

  2. To find all the sinks that route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.

  3. For each sink, select Menu and then select Disable sink.

The sinks are now disabled and your Cloud project sinks no longer route logs to the _Default bucket.

To reenable a disabled sink and restart routing logs to the sink's destination, do the following:

  1. Go to the Log Router page:

    Go to Log Router

  2. To find all the disabled sinks previously configured to route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.

  3. For each sink, select Menu and then select Enable sink.

API

  1. To view the sinks for your Cloud project, call the Logging API method projects.sinks.list.

    Identify any sinks that are routing to the _Default bucket.

  2. For example, to disable the _Default sink, call projects.sinks.update and set the disabled property to true.
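
    A minimal sketch of that call using curl (PROJECT_ID is a placeholder, and this assumes gcloud credentials are available):

      curl -X PUT \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        -d '{"disabled": true}' \
        "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks/_Default?updateMask=disabled"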

The _Default sink is now disabled; it no longer routes logs to the _Default bucket.

To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.

To reenable a sink, call projects.sinks.update and set the disabled property to false.

gcloud

  1. To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

    gcloud logging sinks list
    
  2. Identify any sinks that are routing to the _Default bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

    gcloud logging sinks describe SINK_NAME
    
  3. For example, to disable the _Default sink, use the gcloud logging sinks update command and include the --disabled flag:

    gcloud logging sinks update _Default --disabled
    

The _Default sink is now disabled; it no longer routes logs to the _Default bucket.

To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.

To reenable a sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:

gcloud logging sinks update _Default --no-disabled

Set destination permissions

This section describes how to grant Logging the Identity and Access Management permissions to write logs to your sink's destination. For the full list of Logging roles and permissions, see Access control.

When you create a sink, Logging creates a new service account for the sink, called a unique writer identity. Your sink destination must permit this service account to write log entries. You can't manage this service account directly, because it is owned and managed by Cloud Logging. The service account is deleted when the sink is deleted.

If you're using a sink to route logs between Logging buckets in the same Cloud project, no new service account is created; the sink works without the unique writer identity. If you're using a sink to route logs between Logging buckets in different Cloud projects, a new service account is created.

To set permissions for your sink to route to its destination, do the following:

Console

  1. Obtain the sink's writer identity, an email address, from the new sink: go to the Log Router page and select Menu > View sink details. The writer identity appears in the Sink details panel.

  2. If you have Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

API

  1. Call the API method projects.sinks.create or projects.sinks.update to create or modify the sink.

    Set uniqueWriterIdentity to true. When updating a sink, you can change from using a shared writer to a unique writer. If the existing sink already uses a unique writer, the updated sink uses the same writer.

    The methods return the new sink, which contains the new writer identity.

  2. If you have IAM Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

gcloud

  1. Get the service account from the writerIdentity field in your sink:

    gcloud logging sinks describe SINK_NAME
    

    The service account looks similar to the following:

    serviceAccount:p123456789012-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  2. If you have IAM Owner access to the destination, add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.

    If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.

    For example, if you're routing logs between Logging buckets in different Cloud projects, you would add roles/logging.bucketWriter to the service account as follows:

    1. Get the Identity and Access Management policy for the destination Cloud project and write it to a local file in JSON format:

      gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
      
    2. Add an IAM condition that lets the service account write only to the Cloud Logging bucket you created. For example:

      {
        "bindings": [
          {
            "members": [
              "user:username@gmail.com"
            ],
            "role": "roles/owner"
          },
          {
            "members": [
              "[SERVICE_ACCOUNT]"
            ],
            "role": "roles/logging.bucketWriter",
            "condition": {
              "title": "Bucket writer condition example",
              "description": "Grants logging.bucketWriter role to service account [SERVICE_ACCOUNT] used by log sink [SINK_NAME]",
              "expression": "resource.name.endsWith('locations/global/buckets/BUCKET_ID')"
            }
          }
        ],
        "etag": "BwWd_6eERR4=",
        "version": 3
      }
    3. Update the IAM policy:

      gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
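
    The same pattern applies to the other destination roles; for example, a sketch that grants the Pub/Sub Publisher role on a topic (TOPIC_ID is a placeholder, and the member is the example service account shown in step 1):

      gcloud pubsub topics add-iam-policy-binding TOPIC_ID \
        --member='serviceAccount:p123456789012-12345@gcp-sa-logging.iam.gserviceaccount.com' \
        --role='roles/pubsub.publisher'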
      

Code samples

To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.

Filter examples

Following are some filter examples that are particularly useful when creating sinks.

For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.

Restore the _Default sink filter

If you edited the filter for the _Default sink, you might want to restore its default filter. To do so, enter the following inclusion filter:

  NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT \
  LOG_ID("externalaudit.googleapis.com/activity") AND NOT \
  LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT \
  LOG_ID("externalaudit.googleapis.com/system_event") AND NOT \
  LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT \
  LOG_ID("externalaudit.googleapis.com/access_transparency")

Exclude Google Kubernetes Engine container and pod logs

To exclude Google Kubernetes Engine container and pod logs for GKE system namespaces, use the following filter:

resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")

To exclude Google Kubernetes Engine node logs for GKE system logNames, use the following filter:

resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")

To view the volume of Google Kubernetes Engine node, pod, and container logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.

Exclude Dataflow logs not required for supportability

To exclude Dataflow logs that aren't required for supportability, use the following filter:

resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"

To view the volume of Dataflow logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.

Supportability

While Cloud Logging provides you with the ability to exclude logs from being ingested, you might want to consider keeping logs that help with supportability. Using these logs can help you quickly troubleshoot and identify issues with your applications.

For example, GKE system logs are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These logs can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.

For Dataflow, we recommend that you, at a minimum, ingest your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability"). These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.
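
If you route these Dataflow logs with a sink, a sketch of a matching inclusion filter looks like the following:

resource.type="dataflow_step" AND
(labels."dataflow.googleapis.com/log_type"="system" OR labels."dataflow.googleapis.com/log_type"="supportability")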

What's next