Exporting logs with the Google Cloud Console

This page explains how to export log entries using the Cloud Console and the gcloud command-line tool.

You can also export log entries using the Cloud Logging API.

For a conceptual overview on exporting logs, see Overview of logs exports. In summary, you export logs by creating one or more sinks that include a logs filter and an export destination. As Cloud Logging receives new log entries, they are compared against each sink. If a log entry matches a sink's filter, then a copy of the log entry is written to the export destination.

You can export logs to the following destinations:

  • Cloud Logging logs buckets
  • BigQuery datasets
  • Cloud Storage buckets
  • Pub/Sub topics, including topics that feed Splunk
  • Destinations in other Google Cloud projects

To learn how exported logs are formatted and organized, as well as how to view your exported logs, go to Using exported logs.

Creating and managing sinks with the Logs Explorer and Logs Router

Using the Cloud Console, you can do the following:

  • View all of your sinks in one place.
  • View which log entries are matched by your sink query before you create a sink.
  • Create and authorize export destinations for your sinks.

However, the Cloud Console can only create or view sinks in projects. To create sinks in organizations, folders, or billing accounts, use the gcloud command-line tool or the Cloud Logging API; see Aggregated sinks.
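
For example, creating an organization-level aggregated sink is a gcloud operation. The following is a minimal sketch, assuming a Cloud Storage bucket named my-org-audit-bucket already exists; the sink name, organization ID, and filter are illustrative:

gcloud logging sinks create my-aggregated-sink \
  storage.googleapis.com/my-org-audit-bucket \
  --organization=ORGANIZATION_ID \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'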

Before you can create a sink, verify the following:

  • You have a Google Cloud project with logs that you can see in the Logs Explorer.

  • You have the Owner or Logs Configuration Writer (roles/logging.configWriter) IAM role on the Cloud project, which lets you create, delete, or modify a sink. Go to Permissions and roles for more information.

  • You have a destination service or have the ability to create a destination service.

Creating a sink

To create a sink from the Logs Explorer page, select Actions > Create sink.

To create a sink from the Logs Router page, select Create sink.

After selecting Create sink, complete the following steps in the Create logs routing sink panel:

  1. Enter sink details

    • Sink name: Provide an identifier for the sink.

    • Sink description (optional): Describe the purpose or use-case for the sink.

  2. Enter sink destination

    • Select sink service: Select the service where you want your logs routed.

    The following services and destinations are available:

    • Cloud Logging logs bucket: Select or create a logs bucket.
    • BigQuery: Select or create the dataset to receive the exported logs. You also have the option to use partitioned tables.
    • Cloud Storage: Select or create the Cloud Storage bucket to receive the exported logs.
    • Pub/Sub: Select or create the topic to receive the exported logs.
    • Splunk: Select the Pub/Sub topic for your Splunk service.
    • Other project: Add the Google Cloud service and destination in the following format:

      SERVICE.googleapis.com/projects/PROJECT_ID/DESTINATION/DESTINATION_ID
      

      For example, if your export destination is a BigQuery dataset, the sink destination would be the following:

      bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
      

  3. Choose logs to include in the sink

    • Build an inclusion filter: Enter a filter to select the logs you want routed to the sink's destination.

    For example, to route all Data Access audit logs to a single bucket, use a filter like the following:

    LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
    

    To verify that you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter pre-populated. You can also test a filter from the command line, as shown in the sketch after these steps. For information about creating a filter, see Logging query language.

  4. Choose logs to exclude from the sink (optional)

    • Build an exclusion filter: Select Add exclusion and enter a filter to select logs you don't want routed to the sink's destination.

    • Exclusion filter name: Provide an identifier for the exclusion filter.

    • Exclusion filter rate: Provide an integer between 0 and 100. The incoming logs matching the exclusion filter are sampled according to that value.

      A value of 0 samples none of the logs matching the filter, which is equivalent to disabling the exclusion filter. A value of 100 samples all matching logs, so every log entry that matches the exclusion filter is excluded from the destination. A value of 50 excludes half of the matching log entries and routes the other half to the destination.

      You can create up to 50 exclusion filters per sink.

  5. Select Create sink.
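
You can also test a filter from the command line before creating the sink. The following sketch uses gcloud logging read with the logName equivalent of the Data Access filter from step 3 (PROJECT_ID is a placeholder; --limit keeps the output short):

gcloud logging read \
  'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access"' \
  --limit=5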

New log entries that match your sink's filter are routed to the sink's destination. Log entries going to Logs Buckets, BigQuery, or Pub/Sub are streamed immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For information about how to view logs in the exported destinations, go to Using exported logs.

If Logging encounters errors when trying to export logs to your export destinations, the errors appear in your project's Activity Stream. Select Activity at the top of your project's home page in Google Cloud Console. To diagnose common errors, go to Troubleshooting below.

Exporting logs to another Google Cloud project

You can export logs to a destination in a different Cloud project than the one the sink is created in.

To do so, you must do one of the following:

  • Have one of the following IAM roles on the Cloud project that you are sending logs to:

    • Owner (roles/owner)
    • Logging Admin (roles/logging.admin)
    • Logs Configuration Writer (roles/logging.configWriter)
  • Give your sink's service account the roles/logging.logWriter role to write to the destination.

For more instructions on providing the sink's service account the right permissions, see Destination permissions. For the list of Logging roles and permissions, see Access control.
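
For example, granting the sink's service account the roles/logging.logWriter role on the destination project might look like the following sketch; the project ID and service account address are placeholders (use your sink's writerIdentity value):

gcloud projects add-iam-policy-binding DESTINATION_PROJECT_ID \
  --member='serviceAccount:SERVICE_ACCOUNT_EMAIL' \
  --role='roles/logging.logWriter'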

Managing sinks

Once your sink is created, you can view it in the Logs Router page, where you can perform the following actions:

  • View the sink's details
  • Edit the sink
  • Disable the sink
  • Delete the sink

To go to the Logs Router page from the Logging menu, select Logs Router.

The Logs Router page contains a table summary of sinks. Each table row contains information corresponding to some of the sink properties described in Logs exports:

  • Type: The sink's destination type.
  • Name: The sink's identifier in the current project.
  • Description: The sink's description.
  • Destination: The full name of the destination that receives the exported log entries.
  • State: Indicates if the sink is enabled or disabled.

Each table row has a menu and provides the following options:

  • View sink details: Displays the sink's name, description, service, destination, and inclusion and exclusion filters. Selecting Edit lets you change the properties of the sink.
  • Edit sink: Opens the Edit Sink panel where you can change the sink's parameters.
  • Disable sink: Lets you disable the sink and stop routing logs to the sink's destination.

  • Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.

  • Delete sink: Lets you delete the sink and stop routing logs to the sink's destination. The _Default and the _Required sinks can't be deleted, but the _Default sink can be disabled to stop routing logs to the _Default logs bucket.

Clicking on any of the column names lets you sort data in ascending or descending order. At the bottom of the table, you can also select the number of rows that you wish to display.

Creating and managing sinks with the gcloud command-line tool

To create a sink, run the gcloud alpha logging sinks create command:

gcloud alpha logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS

For example, to create a sink to a Cloud Logging logs bucket:

gcloud alpha logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
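
You can manage existing sinks with the same command group. The following is a hedged sketch, assuming a sink named my-sink in the current project; the filter and exclusion values are illustrative, and the --add-exclusion flag is available in newer gcloud releases:

# List all sinks in the current project; the output includes each sink's writer identity.
gcloud logging sinks list

# Show one sink's full configuration, including its writerIdentity.
gcloud logging sinks describe my-sink

# Change the sink's filter and add an exclusion for debug-level entries.
gcloud logging sinks update my-sink \
  --log-filter='severity>=WARNING' \
  --add-exclusion=name=debug-logs,filter=severity=DEBUG

# Delete the sink.
gcloud logging sinks delete my-sink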

Routing logs from one project to a bucket in a different project

To route your current project's logs to a bucket in a different project, complete the following steps.

Note that cross-project bucket sinks require adding the appropriate Identity and Access Management permissions to the service account that Logging creates for these sinks.

  1. If you haven't done so, create a bucket in the other project:

     gcloud alpha logging buckets create BUCKET_ID --location=global --project=DESTINATION_PROJECT_ID
    
  2. Create a sink to route logs to the other bucket:

     gcloud alpha logging sinks create SINK_NAME \
       logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/global/buckets/BUCKET_ID \
       --log-filter='FILTER_CONDITIONS'
    
  3. Get the service account from the writerIdentity field in your sink:

     gcloud alpha logging sinks describe SINK_NAME
    

    The service account looks similar to the following:

     serviceAccount:p123456789012-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  4. Grant the roles/logging.bucketWriter role to the service account.

    1. Get the Identity and Access Management policy for the destination project and write it to a local file in JSON format:

      gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
      
    2. Add an IAM condition that lets the service account write only to the bucket you created. For example:

       {
         "bindings": [
           {
             "members": [
               "user:username@gmail.com"
             ],
             "role": "roles/owner"
           },
           {
             "members": [
               "[SERVICE_ACCOUNT]"
             ],
             "role": "roles/logging.bucketWriter",
             "condition": {
                 "title": "Bucket writer condition example",
                 "description": "Grants logging.bucketWriter role to service account [SERVICE_ACCOUNT] used by log sink [SINK_NAME]",
                 "expression":
                   "resource.name.endsWith("locations/global/buckets/BUCKET_ID")"
             }
           }
         ],
         "etag": "BwWd_6eERR4=",
         "version": 3
       }

    3. Update the IAM policy:

      gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
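
Alternatively, a single command with an inline condition can make the same grant without editing a policy file. This is a sketch; replace SERVICE_ACCOUNT with the full writerIdentity value from step 3 (it already includes the serviceAccount: prefix):

gcloud projects add-iam-policy-binding DESTINATION_PROJECT_ID \
  --member='SERVICE_ACCOUNT' \
  --role='roles/logging.bucketWriter' \
  --condition='expression=resource.name.endsWith("locations/global/buckets/BUCKET_ID"),title=bucket-writer-condition'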
      

Stopping logs ingestion

When you disable logs ingestion for the _Default bucket by disabling the _Default sink and any other sinks routing to the _Default bucket, Cloud Logging stops ingesting and storing your logs data in that bucket. The _Default bucket then continues to contain logs until the following two conditions are met:

  • No sinks route to the _Default bucket.

  • The retention period on the bucket has expired.

Once you have disabled all the sinks that send logs to the _Default bucket, Cloud Logging no longer charges for ingesting new logs into the bucket.

To disable logs ingestion, complete the following steps:

  1. Go to the Logs Router.

  2. To find all the sinks that route logs to the _Default bucket, filter the sinks by destination, and then enter _Default.

  3. For each sink, select Menu and then select Disable sink.

The sinks are now disabled and Cloud Logging no longer routes logs to the _Default bucket.
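
If you manage sinks from the command line, you can disable them there as well. A minimal sketch, assuming your gcloud release supports the --disabled flag on sink updates:

# Disable the _Default sink so it stops routing logs to the _Default bucket.
gcloud logging sinks update _Default --disabled

# Re-enable the sink later if needed.
gcloud logging sinks update _Default --no-disabled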

Destination permissions

This section describes how you can grant Logging the Identity and Access Management permissions to write exported logs to your sink's export destination.

When you create a sink, Logging creates a new service account for the sink, called a unique writer identity. You cannot manage this service account directly, because it is owned and managed by Cloud Logging; the service account is deleted when the sink is deleted.

Your export destination must permit this service account to write log entries. To set up this permission, complete the following steps:

  1. Create the new sink using the Cloud Console, the gcloud command-line tool, or the Logging API.

  2. If you created your sink in the Cloud Console and you have Owner access to the destination, then Cloud Logging should have set up the necessary permissions on your behalf. If it did so, you are done. If not, continue.

  3. Obtain the sink's writer identity—an email address—from the new sink:

    • If you are using the Cloud Console, go to the Logs Router page, and select menu > View sink details. The writer identity appears in the Sink details panel.
    • If you are using gcloud logging, the writer identities appear when you list your sinks.
    • If you are using the Logging API, you can get the writer identity from the LogSink object.
  4. If you have Owner access to the destination, then add the service account to the destination in the following way (a command-line sketch follows this list):

    • For Cloud Storage destinations, add the sink's writer identity to your bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.
    • For Logs Buckets destinations, add the sink's writer identity to the logs bucket and give it the roles/logging.bucketWriter role.
  5. If you do not have Owner access to the export destination, then send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the export destination.
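
As a command-line sketch of steps 3 and 4, for Cloud Storage and Pub/Sub destinations (my-sink, my-bucket, and my-topic are placeholders, and WRITER_IDENTITY stands for the service account's email address without the serviceAccount: prefix):

# Print just the sink's writer identity.
gcloud logging sinks describe my-sink --format='value(writerIdentity)'

# Cloud Storage: grant the writer identity the Storage Object Creator role.
gsutil iam ch serviceAccount:WRITER_IDENTITY:roles/storage.objectCreator gs://my-bucket

# Pub/Sub: grant the writer identity the Pub/Sub Publisher role.
gcloud pubsub topics add-iam-policy-binding my-topic \
  --member='serviceAccount:WRITER_IDENTITY' \
  --role='roles/pubsub.publisher'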

Authorization delays

If a sink tries to export a log entry but lacks the appropriate IAM permission to the export destination, then the sink reports an error and skips the log entry. This continues until the permission is granted, at which time the sink begins exporting new log entries.

There is a delay between creating the sink and using the sink's new service account to authorize writing to the export destination. During the first 24 hours after sink creation, you might see permission-related error messages from the sink on your project's Activity page; you can ignore them.

Troubleshooting

Here are some common problems you may encounter when exporting logs, and what to do about them:

  • Errors from the destination: Check the specification of the export destination in the sink. Use projects.sinks.get to find the writer identity for your sink, and be sure that identity is permitted to write to your export destination.

  • No logs are being exported: Here are some possible reasons:

    • Your query is incorrect. Check your export query to verify that log entries matching your query have recently arrived in Logging; correct any misspellings or formatting errors.

    • No matching log entries have been received since you created or updated your sink; only new log entries are exported.

    There is a delay before you can view your exported logs in the destination; this is especially true of Cloud Storage destinations. For details, go to Exported logs availability.

    You can also check the export system metrics, which report how many log entries are exported and how many are dropped due to errors.

Your sink begins exporting logs when any errors are corrected.

To view your sink errors using the Legacy Logs Viewer, do the following:

  1. Go to the Activity Stream for the project or other resource where the sink was created.

  2. In the Filters panel, select Activity type > Configuration and Resource type > Logging export sink.

  3. Adjust Date/time to view sink errors for the appropriate timeframe.

    Your sink errors appear.

The following sections list possible service-specific errors and unexpected results, and explain what to do about them.

Errors exporting to Cloud Storage

The following table lists the most common errors when exporting logs to Cloud Storage:

Error: Permissions on bucket [YOUR_BUCKET] do not allow the logs group to create new objects.
Cause: The sink's writer identity does not have the correct permissions to the bucket.
Solution: Add the necessary permissions to the bucket or update your sink to use a different bucket. See Destination permissions.

Error: No bucket with name: [YOUR_BUCKET].
Cause: There might be an error in the bucket name, or the bucket might have been deleted.
Solution: Update your sink with the correct bucket destination.

Error: Sink successfully created. However, we were unable to grant correct permissions to the destination.
Cause: The access control model for the bucket was set to Uniform when the bucket was created. (This error message continues to be displayed after the addition of the service account.)
Solution: Select the Fine-grained access control model during bucket creation. For existing buckets, you can change the access control model for the first 90 days after bucket creation by using the Permissions tab.

Errors exporting to BigQuery

The following table lists the most common errors when exporting logs to BigQuery:

Error: Permissions on dataset [YOUR_DATASET] do not allow the logs group to create new tables.
Cause: The sink's writer identity does not have enough permissions to the dataset.
Solution: Add the permission to the dataset. See Destination permissions.

Error: No dataset with name: [YOUR_DATASET].
Cause: You might have an error in your sink's destination, or someone might have deleted the dataset.
Solution: Either re-create the dataset or update the export sink to use a different dataset.

Error: Logs streamed to table [YOUR_TABLE] in dataset [YOUR_DATASET] do not match the table schema.
Cause: You are trying to export logs that are incompatible with the current table's schema.
Solution: Make sure that your log entries match the table's schema. Common issues include sending log entries with different data types; for example, one field in the log entry is an integer, while the corresponding column in the schema has a string type. The activity stream contains a link to one of the invalid log entries. After you fix the source of the error, you can rename your current table and let Logging create the table again.

Error: Per-table streaming insert quota has been exceeded for table [YOUR_TABLE] in dataset [YOUR_DATASET].
Cause: You are exporting too many log entries too quickly. See the BigQuery default quota limits, which apply to logs streaming.
Solution: Decrease the amount of log data your sink generates. You can update your sink's query to match fewer log entries or use the sample() function; a sketch follows this table.

Error: Logs streamed to partitioned table [YOUR_TABLE] are outside the permitted time boundaries.
Cause: BigQuery does not accept logs that are too far in the past or future.
Solution: Logs outside permitted time boundaries cannot be exported with sinks. You can export those logs to Cloud Storage and use a BigQuery load job instead. See the BigQuery documentation for further instructions.

Error: Logs cannot be streamed to dataset [YOUR_DATASET] because that operation is prohibited by an organization policy.
Cause: An organization policy prevents writes to the selected dataset. See the documentation on organization policies for more details.
Solution: Modify your export sink to use a compliant dataset.
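
As a sketch of that sampling approach, the Logging query language's sample function keeps a stable fraction of matching entries. This illustrative filter would route roughly half of a project's Compute Engine instance logs:

resource.type="gce_instance" AND sample(insertId, 0.5)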

Errors exporting logs to Pub/Sub

The following table lists the most common errors when exporting logs to Pub/Sub:

Error: [ACCOUNT] needs edit permission on [PROJECT] to publish to [TOPIC].
Cause: The sink's writer identity does not have the correct permissions to the topic.
Solution: Add the necessary permissions to your project. See Destination permissions.

Error: Topic [TOPIC] does not exist.
Cause: You might have deleted the topic that was configured to receive your exported logs.
Solution: Either re-create the topic with the same name, or change the export configuration to use a different topic.

What's next