This document explains how to create and manage sinks to route log entries to supported destinations.
Overview
Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to the following destinations:
- Cloud Logging log buckets: Provides storage in Cloud Logging. A log bucket can store logs ingested by multiple Google Cloud projects. You specify the data retention period, the data storage location, and the log views on a log bucket. Log views let you control which logs in a log bucket a user can access. Log buckets are the recommended storage when you want to troubleshoot your applications and services, or analyze your log data. If you need to combine your Cloud Logging data with other data sources, then you can store your logs in log buckets that are upgraded to use Log Analytics, and then link that bucket to BigQuery. For information about viewing logs, see Query and view logs overview and View logs routed to Cloud Logging buckets.
- Pub/Sub topics: Provides support for third-party integrations, such as Splunk, with Logging. Log entries are formatted into JSON and then delivered to a Pub/Sub topic. For information about viewing these logs, their organization, and how to configure a third-party integration, see View logs routed to Pub/Sub.
- BigQuery datasets: Provides storage of log entries in BigQuery datasets. You can use big data analysis capabilities on the stored logs. If you need to combine your Cloud Logging data with other data sources, then you can route your logs to BigQuery. An alternative is to store your logs in log buckets that are upgraded to use Log Analytics and then linked to BigQuery. For information about viewing logs routed to BigQuery, see View logs routed to BigQuery.
- Cloud Storage buckets: Provides inexpensive, long-term storage of log data in Cloud Storage. Log entries are stored as JSON files. For information about viewing these logs, how they are organized, and how late-arriving logs are handled, see View logs routed to Cloud Storage.
Sinks belong to a given Google Cloud resource: Cloud projects, billing accounts, folders, and organizations. When the resource receives a log entry, it routes the log entry according to the sinks contained by that resource. The log entry is sent to the destination associated with each matching sink.
An aggregated sink is a type of sink that combines and routes log entries from the Google Cloud resources contained by an organization or folder. For instructions, see Collate and route organization-level logs to supported destinations.
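For example, a minimal sketch of creating an organization-level aggregated sink with the Google Cloud CLI looks like the following; the sink name, destination bucket, organization ID, and filter are illustrative:
gcloud logging sinks create my-aggregated-sink \
  storage.googleapis.com/my-central-audit-bucket \
  --organization=123456789012 --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
Without the --include-children flag, the sink routes only the logs generated by the organization resource itself, not the logs of its child folders and projects.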
To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, or the Google Cloud CLI. The Google Cloud console has the following advantages over the other methods:
- View and manage all of your sinks in one place.
- Preview which log entries are matched by your sink's filter before you create the sink.
- Create and authorize sink destinations for your sinks.
Before you begin
The instructions in this document describe creating and managing sinks at the Cloud project level, but you can create sinks (non-aggregated) for billing accounts, folders, and organizations.
As you get started, ensure the following:
- You have a Google Cloud project with logs that you can see in the Logs Explorer.
- You have one of the following IAM roles for the source Cloud project from which you're routing logs:
  - Owner (roles/owner)
  - Logging Admin (roles/logging.admin)
  - Logs Configuration Writer (roles/logging.configWriter)
  The permissions contained in these roles allow you to create, delete, or modify sinks. For information on setting IAM roles, see the Logging Access control guide.
- You have a resource in a supported destination or have the ability to create one.
You must create the routing destination before the sink, by using the Google Cloud CLI, the Google Cloud console, or the Google Cloud APIs. You can create the destination in any Cloud project in any organization, but first ensure that the sink's service account has permissions to write to the destination.
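For example, if you plan to route logs to Pub/Sub, you might first create the destination topic; this is a sketch, and the topic name is illustrative:
gcloud pubsub topics create my-logs-topic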
Create a sink
Following are the instructions for creating a sink in a Cloud project. Instead of a Cloud project, you can specify a billing account, folder, or organization.
You can create up to 200 sinks per Cloud project.
After you create the sink, ensure that Logging has the appropriate permissions to write logs to your sink's destination; see Set destination permissions.
To create a sink, do the following:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select an existing Cloud project.
Select Create sink.
In the Sink details panel, enter the following details:
Sink name: Provide an identifier for the sink; note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.
Sink description (optional): Describe the purpose or use case for the sink.
In the Sink destination panel, select the sink service and destination by using the Select sink service menu.
If you are routing to a service that is in the same Cloud project, select one of the following options:
- Cloud Logging bucket: Select or create a Logging bucket.
- BigQuery table: Select or create the particular dataset to receive the routed logs. You also have the option to use partitioned tables.
- Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed logs.
- Pub/Sub topic: Select or create the particular topic to receive the routed logs.
- Splunk: Select the Pub/Sub topic for your Splunk service.
If you are routing to a destination that is in another project, then select Other project. You must provide the Logging, BigQuery, Cloud Storage, or Pub/Sub service and destination information.
To route log entries to a Cloud Logging log bucket that uses the global region and is defined in a different Google Cloud project, the sink destination is the following:
logging.googleapis.com/projects/DESTINATION_PROJECT/locations/global/buckets/BUCKET_NAME
To route log entries to a BigQuery dataset, the sink destination is the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
To route log entries to a Cloud Storage bucket, the sink destination is the following:
storage.googleapis.com/BUCKET_NAME
To route log entries to a Pub/Sub topic, the sink destination is the following:
pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
Note that if you are routing logs between Cloud projects, you still need the appropriate destination permissions.
In the Choose logs to include in sink panel, do the following:
In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.
If you don't set a filter, all logs from your selected resource are routed to the destination.
For example, you might want to build a filter to route all Data Access logs to a single Logging bucket. This filter looks like the following:
LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
Note that the length of a filter can't exceed 20,000 characters.
To verify you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter prepopulated.
(Optional) In the Choose logs to filter out of sink panel, do the following:
In the Exclusion filter name field, enter a name.
In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude; see the example after these steps.
You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.
Select Create sink.
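For example, a sketch of an exclusion filter that uses the sample function to drop 99% of high-volume load balancer log entries looks like the following; the resource type is illustrative:
resource.type="http_load_balancer" AND sample(insertId, 0.99)
Because an exclusion filter matches the entries to remove, this sink keeps roughly 1% of the matching log entries.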
API
To create a logging sink in your Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:
- name: An identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
- destination: The service and destination to where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then destination would look like the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
In the LogSink object, provide the appropriate optional information:
- filter: Set the filter property to match the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination. Note that the length of a filter can't exceed 20,000 characters.
- exclusions: Set this property to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
- description: Set this property to describe the purpose or use case for the sink.
Call projects.sinks.create to create the sink.
If the API response contains a JSON key labeled "writerIdentity", then grant the sink's service account permission to write to the sink destination. For more information, see Set destination permissions.
You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".
For more information on creating sinks using the Logging API, see the LogSink reference.
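As a minimal sketch, a projects.sinks.create call with curl might look like the following; the project ID, sink name, destination, and filter are illustrative, and the command assumes that you can mint an access token with gcloud auth print-access-token:
curl -X POST "https://logging.googleapis.com/v2/projects/my-project/sinks" \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-sink",
    "destination": "bigquery.googleapis.com/projects/my-project/datasets/my_dataset",
    "filter": "severity>=ERROR"
  }'
If the response contains a writerIdentity, grant that service account access to the destination as described in Set destination permissions.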
gcloud
To create a sink, run the following gcloud logging sinks create command.
Provide the appropriate values for the variables in the command as follows:
- SINK_NAME: An identifier for the sink. Note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.
- SINK_DESTINATION: The service and destination to where you want your logs routed. For example, if your sink destination is a BigQuery dataset, then SINK_DESTINATION would look like the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
- OPTIONAL_FLAGS includes the following flags:
  - --log-filter: Use this flag to set a filter that matches the log entries you want to include in your sink. If you don't set a filter, all logs from your Cloud project are routed to the destination.
  - --exclusion: Use this flag to set an exclusion filter for log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. This flag can be repeated; you can create up to 50 exclusion filters per sink.
  - --description: Use this flag to describe the purpose or use case for the sink.
gcloud logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS
For example, to create a sink to a Logging bucket, your command might look like this:
gcloud logging sinks create my-sink \
  logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' \
  --description="My first sink"
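As another sketch, the following command adds an exclusion filter; the flag syntax shown assumes the name and filter keys of the --exclusion flag:
gcloud logging sinks create my-sink \
  pubsub.googleapis.com/projects/myproject123/topics/my-topic \
  --log-filter='resource.type="gce_instance"' \
  --exclusion=name=drop-debug,filter='severity<=DEBUG'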
For more information on creating sinks using the Google Cloud CLI, including more flags and examples, see the gcloud logging sinks reference.
New log sinks to Cloud Storage buckets might take several hours to start routing logs. Sinks to Cloud Storage are processed hourly while other destination types are processed in real time.
For information about how to view logs in the sink destinations, see Find routed logs.
After creating the sink, you can view the number and volume of log entries received by using the logging.googleapis.com/exports/ metrics.
If you receive error notifications, see Troubleshoot routing and sinks.
Route logs between log buckets in different Cloud projects
You can route logs to a destination in a different Cloud project than the one the sink is created in.
To do so, you must do one of the following:
- Give your sink's service account the roles/logging.bucketWriter role to write to the destination; see Set destination permissions for instructions.
- Have one of the following IAM roles for the source Cloud project from which you're sending logs:
  - Owner (roles/owner)
  - Logging Admin (roles/logging.admin)
  - Logs Configuration Writer (roles/logging.configWriter)
  If you're creating a new Logging bucket in the destination Cloud project, you must have one of these roles.
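For example, a minimal sketch of granting the roles/logging.bucketWriter role to the sink's writer identity on the destination Cloud project looks like the following; the project ID and service account are illustrative, and the Set destination permissions section shows a more restrictive conditional binding:
gcloud projects add-iam-policy-binding DESTINATION_PROJECT_ID \
  --member='serviceAccount:service-p-123456789012@gcp-sa-logging.iam.gserviceaccount.com' \
  --role='roles/logging.bucketWriter'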
Manage sinks
After your sinks are created, you can perform these actions on them:
- View sink details
- Update sink
- Disable sink
- Delete sink
- Troubleshoot sink
- View sink log volume and error rates
If you delete a sink, note the following:
- You can't delete the _Default and _Required sinks, but you can disable the _Default sink to stop routing logs to the _Default Logging bucket.
- After a sink is deleted, it stops routing log entries.
Any changes made to a sink might take a few minutes to apply.
Following are the instructions for managing a sink in a Cloud project. Instead of a Cloud project, you can specify a billing account, folder, or organization:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select the Cloud project that contains your sink by using the resource selector from anywhere in the Google Cloud console:
- To view your aggregated sinks, select the organization, folder, or billing account that contains the sink.
The Log Router page contains a table summary of sinks. Each table row contains information about a sink's properties:
- Enabled: Indicates if the sink's state is enabled or disabled.
- Type: The sink's destination service; for example, Cloud Logging bucket.
- Name: The sink's identifier, as provided when the sink was created; for example, _Default.
- Description: The sink's description, as provided when the sink was created.
- Destination: The full name of the destination to which the routed log entries are sent.
- Created: The date and time that the sink was created.
- Last updated: The date and time that the sink was last edited.
Each table row has a menu more_vert and provides the following options:
- View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
- Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
- Disable sink: Lets you disable the sink and stop routing logs to the sink's destination. For more information on disabling sinks, see Stop logs ingestion.
- Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.
- Delete sink: Lets you delete the sink and stop routing logs to the sink's destination.
- Troubleshoot sink: Opens the Logs Explorer where you can troubleshoot errors with the sink.
- View sink log volume and error rates: Opens the Metrics Explorer where you can view and analyze data from the sink.
Clicking on any of the column names lets you sort data in ascending or descending order.
API
To view the sinks for your Cloud project, call projects.sinks.list.
To view a sink's details, call projects.sinks.get.
To update a sink, call projects.sinks.update. You can update a sink's destination, filters, and description. You can also disable or reenable the sink.
To disable a sink, call projects.sinks.update and set the disabled property to true. To reenable the sink, call projects.sinks.update and set the disabled property to false.
To delete a sink, call projects.sinks.delete.
For more information on any of these methods for managing sinks using the Logging API, see the LogSink reference.
gcloud
To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:
gcloud logging sinks list
To view your list of aggregated sinks, use the appropriate flag to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID flag to list the sinks for the organization.
To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:
gcloud logging sinks describe SINK_NAME
To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update.
You can update a sink to change the destination, filters, and description, or to disable or reenable the sink:
gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER
Omit NEW_DESTINATION or the --log-filter flag if those parts don't change.
For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:
gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket
To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled flag:
gcloud logging sinks update _Default --disabled
To reenable the sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:
gcloud logging sinks update _Default --no-disabled
To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:
gcloud logging sinks delete SINK_NAME
For more information on managing sinks using the Google Cloud CLI, see the gcloud logging sinks reference.
Stop logs ingestion
For each Cloud project, Logging automatically creates two log buckets: _Required and _Default. Logging automatically creates two log sinks, _Required and _Default, that route logs to the correspondingly named buckets.
You can't disable the _Required sink; neither ingestion pricing nor storage pricing applies to the logs data stored in the _Required log bucket. You can disable the _Default sink to stop logs from being ingested into the _Default bucket. You can also disable any user-defined sinks.
When you stop logs ingestion for the _Default bucket by disabling all the sinks in your Cloud project that send logs to the _Default bucket, no new Cloud Logging ingestion charges are incurred by your Cloud project for the _Default bucket. The _Default bucket is empty when all of the previously ingested logs in the _Default bucket have fulfilled the bucket's retention period.
To disable your Cloud project sinks that route logs to the _Default bucket, complete the following steps:
Console
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
To find all the sinks that route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.
For each sink, select Menu more_vert and then select Disable sink.
The sinks are now disabled, and your Cloud project sinks no longer route logs to the _Default bucket.
To reenable a disabled sink and restart routing logs to the sink's destination, do the following:
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
To find all the disabled sinks previously configured to route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.
For each sink, select Menu more_vert and then select Enable sink.
API
To view the sinks for your Cloud project, call the Logging API method projects.sinks.list.
Identify any sinks that are routing to the _Default bucket.
For example, to disable the _Default sink, call projects.sinks.update and set the disabled property to true.
The _Default sink is now disabled; it no longer routes logs to the _Default bucket.
To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.
To reenable a sink, call projects.sinks.update and set the disabled property to false.
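As a minimal sketch, disabling the _Default sink with curl might look like the following; the project ID is illustrative, and the call assumes that the updateMask query parameter of projects.sinks.update accepts the disabled field:
curl -X PUT "https://logging.googleapis.com/v2/projects/my-project/sinks/_Default?updateMask=disabled" \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"disabled": true}'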
gcloud
To view your list of sinks for your Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:
gcloud logging sinks list
Identify any sinks that are routing to the _Default bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:
gcloud logging sinks describe SINK_NAME
For example, to disable the _Default sink, use the gcloud logging sinks update command and include the --disabled flag:
gcloud logging sinks update _Default --disabled
The _Default sink is now disabled; it no longer routes logs to the _Default bucket.
To disable the other sinks in your Cloud project that are routing to the _Default bucket, repeat the steps above.
To reenable a sink, use the gcloud logging sinks update command, remove the --disabled flag, and include the --no-disabled flag:
gcloud logging sinks update _Default --no-disabled
Set destination permissions
This section describes how to grant Logging the Identity and Access Management permissions to write logs to your sink's destination. For the full list of Logging roles and permissions, see Access control.
If you're using a sink to route logs between Logging buckets in the same Cloud project, no new service account is created; the sink works without the writer identity. If you're using a sink to route logs between Logging buckets in different Cloud projects, a new service account is created.
Following are the instructions for setting Cloud project-level permissions for your sink to route to its destination. Instead of a Cloud project, you can specify a billing account, folder, or organization:
Console
To get the sink's writer identity—an email address—from the new sink, do the following:
In the Google Cloud console, select Logging from the navigation menu, then click Log Router:
Go to Log Router
Select Menu more_vert, and then select View sink details. The writer identity appears in the Sink details panel.
If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.
Click Copy content_copy to copy the sink's writer identity into your clipboard.
If you have Owner access to the destination, add the service account to the destination in the following way:
- For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
- For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
- For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
- For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.
If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.
API
Call the API method projects.sinks.list to list the sinks in your Google Cloud project.
Locate the sink whose permissions you want to modify, and if the sink details contain a JSON key labeled "writerIdentity", then proceed to the next step. When the details don't include a "writerIdentity" field, you don't need to configure destination permissions for the sink.
If you have IAM Owner access to the destination, add the service account to the destination in the following way:
- For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
- For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
- For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
- For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.
If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.
gcloud
Get the service account from the writerIdentity field in your sink:
gcloud logging sinks describe SINK_NAME
Locate the sink whose permissions you want to modify, and if the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.
The value of SERVICE_ACCOUNT in the following steps is the writer identity, which looks similar to the following:
serviceAccount:service-p-123456789012@gcp-sa-logging.iam.gserviceaccount.com
If you have IAM Owner access to the destination, add the service account to the destination in the following way:
- For Cloud Storage destinations, add the sink's writer identity to your Cloud Storage bucket and give it the Storage Object Creator role.
- For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
- For Pub/Sub, including Splunk, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
- For Logging bucket destinations in different Cloud projects, add the sink's writer identity to the destination log bucket and give it the roles/logging.bucketWriter role.
If you don't have Owner access to the sink destination, send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the sink destination.
For example, if you're routing logs between Logging buckets in different Cloud projects, you would add roles/logging.bucketWriter to the service account as follows:
Get the Identity and Access Management policy for the destination Cloud project and write it to a local file in JSON format:
gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
Add an IAM condition that lets the service account write only to the Cloud Logging bucket you created. For example:
{ "bindings": [ { "members": [ "user:username@gmail.com" ], "role": "roles/owner" }, { "members": [ "SERVICE_ACCOUNT" ], "role": "roles/logging.bucketWriter", "condition": { "title": "Bucket writer condition example", "description": "Grants logging.bucketWriter role to service account SERVICE_ACCOUNT used by log sink [SINK_NAME]", "expression": "resource.name.endsWith(\'locations/global/buckets/BUCKET_ID\')" } } ], "etag": "BwWd_6eERR4=", "version": 3 }
Update the IAM policy:
gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
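The same pattern applies to the other destinations. For example, a sketch of granting the Pub/Sub Publisher role to the writer identity on a destination topic looks like the following; the topic name is illustrative:
gcloud pubsub topics add-iam-policy-binding my-logs-topic \
  --member='SERVICE_ACCOUNT' \
  --role='roles/pubsub.publisher'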
Code samples
To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.
Filter examples
Following are some filter examples that are particularly useful when creating sinks.
For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.
Restore the _Default sink filter
If you edited the filter for the _Default sink, you might want to restore its default filter. To do so, enter the following inclusion filter:
NOT LOG_ID("cloudaudit.googleapis.com/activity") AND NOT \
LOG_ID("externalaudit.googleapis.com/activity") AND NOT \
LOG_ID("cloudaudit.googleapis.com/system_event") AND NOT \
LOG_ID("externalaudit.googleapis.com/system_event") AND NOT \
LOG_ID("cloudaudit.googleapis.com/access_transparency") AND NOT \
LOG_ID("externalaudit.googleapis.com/access_transparency")
Exclude Google Kubernetes Engine container and pod logs
To exclude Google Kubernetes Engine container and pod logs for GKE system namespaces, use the following filter:
resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")
To exclude Google Kubernetes Engine node logs for GKE system logNames, use the following filter:
resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")
To view the volume of Google Kubernetes Engine node, pod, and container logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.
Exclude Dataflow logs not required for supportability
To exclude Dataflow logs that aren't required for supportability, use the following filter:
resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"
To view the volume of Dataflow logs data ingested into Cloud Logging, use Metrics Explorer in Cloud Monitoring.
Supportability
While Cloud Logging provides you with the ability to exclude logs from being ingested, you might want to consider keeping logs that help with supportability. Using these logs can help you quickly troubleshoot and identify issues with your applications.
For example, GKE system logs are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These logs can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.
For Dataflow, we recommend that you, at a minimum, ingest your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability"). These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.
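A sketch of an inclusion filter that keeps only these recommended Dataflow logs looks like the following:
resource.type="dataflow_step" AND
(labels."dataflow.googleapis.com/log_type"="system" OR
labels."dataflow.googleapis.com/log_type"="supportability")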
What's next
If you encounter issues as you use sinks to route logs, see Troubleshoot routing and sinks.
To learn how to view your routed logs in their destinations, as well as how the logs are formatted and organized, see View logs in sink destinations.
To learn more about querying and filtering with the Logging query language, see Logging query language.