Exporting with the Logs Viewer

This page explains how to export log entries using the Cloud Console. You can also export log entries using the Cloud Logging API or the gcloud command-line tool.

For a conceptual overview on exporting logs in Logging, see Overview of logs exports. In summary, you export logs by creating one or more sinks that include a logs filter and an export destination. As Cloud Logging receives new log entries, they are compared against each sink. If a log entry matches a sink's filter, then a copy of the log entry is written to the export destination.
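
For example, a sink with the following filter (an illustrative query; the resource type and severity threshold are assumptions about your workload) would route only Compute Engine log entries at severity ERROR or above to its destination:

resource.type="gce_instance" AND severity>=ERROR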

Supported destinations for exported log entries are Cloud Storage, BigQuery, Pub/Sub, and Logs Buckets in Cloud Logging.

To learn how exported logs are formatted and organized, as well as how to view your exported logs, go to Using exported logs.

Before you begin

  • Project: You must have a Google Cloud project with logs that you can see in the Logs Viewer.

    You must also have the Owner role or the Logs Configuration Writer IAM role (roles/logging.configWriter) on the project to create, delete, or modify a sink. Go to Permissions and roles for more information.

  • Destination service: To export logs, you must sign up for the Google Cloud service to which you will write your logs: Cloud Storage, BigQuery, or Pub/Sub.

Getting started

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. Select an existing Google Cloud project at the top of the page.

Logs Router user interface

The Logs Router interface contains a table summary of sinks. Each table row contains information corresponding to some of the sink properties described in Logs exports:

  • Name: The sink's identifier in the current project.
  • Destination: Where the exported log entries will go.
  • State: Indicates whether the sink is enabled or disabled.

Each table row has a menu that offers the following options:

  • View sink details: Displays the sink's filter. Clicking Edit lets you change the sink's properties or filter.
  • Edit sink: Opens an Edit Sink panel where you can change the sink's parameters.
  • Disable sink: Lets you disable the sink and stop routing logs to the sink's destination.
  • Enable sink: Lets you enable a disabled sink and restart routing logs to the sink's destination.
  • Delete sink: Lets you delete the sink and stop routing logs to the sink's destination. _Default and _Required sinks can't be deleted, but the _Default sink can be disabled to stop routing logs to the _Default logs bucket.

Clicking on any of the column names lets you sort data in ascending or descending order. At the bottom of the table, you can also select the number of rows that you wish to display.

Creating sinks

You can create a sink using either the Logs Viewer or Logs Router.

Creating a sink using Logs Viewer (Classic)

To create an export sink, click Create Sink at the top of the Logs Router page. You can also do this at the top of the Logs Viewer page.

If you don't have permission to create sinks for the project, this option isn't available. Review Before you begin for more information.

The following screenshot shows the Edit Sink panel with some fields filled in:

The user interface showing the export editing panel.

To create a sink, fill in the Edit Sink panel as follows:

  1. (filter): Enter an advanced logs query. You don't need quotation marks around the query, and you can use multiple lines. The initial query is determined by the log entries being displayed when you click Create Sink.

    Whenever you edit the query, click Submit query to display the matched log entries. Click Jump to newest logs to fetch the most recent logs.

    If you wish to use the basic viewing interface to select the logs, use the drop-down menu in the search-query box.

  2. Sink name: Enter the identifier you want to assign to the sink.

  3. Sink Service: Select a destination service: Cloud Storage, Pub/Sub, BigQuery, or Custom Destination.

    A custom export destination still has to be in Cloud Storage, BigQuery, or Pub/Sub, but allows you to send logs to a sink in a different Google Cloud project. The sink's source and the destination don't have to be within the same Google Cloud organization.

  4. Sink Destination:

    1. Cloud Storage: Select or create the particular bucket to receive the exported logs.
    2. Pub/Sub: Select or create the particular topic to receive the exported logs.
    3. BigQuery: Select or create the particular dataset to receive the exported logs. You also have the option to use partitioned tables.
    4. Custom Destination: Add the Cloud Storage, Pub/Sub, or BigQuery Google Cloud project as a string. For information on project name formatting, read Sink properties.
  5. Click Create Sink to create the sink.

    As part of creating the sink, Logging attempts to grant the sink's writer identity permission to write to your destination. If you are exporting to a destination in a project other than the one owning your logs, then an administrator of the new destination must grant permission. You should send the administrator the sink's writer identity, which is listed with the sink in the Router page.

New log entries that match your sink's filter are routed to the sink's destination. Log entries going to BigQuery or Pub/Sub are streamed to those export destinations immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For more information, go to Using exported logs.

If Logging encounters errors when trying to export logs to your export destination, the errors appear in your project's Activity Stream. Select Activity at the top of your project's home page in Google Cloud Console. To diagnose common errors, go to Troubleshooting below.

Creating a sink using the Logs Router

To route logs to a logs bucket or a custom destination, complete the following steps.

GCLOUD

To create a sink to a destination, run the gcloud alpha logging sinks create command:

gcloud alpha logging sinks create SINK_NAME SINK_DESTINATION OPTIONAL_FLAGS

For example, to create a sink to a Cloud Logging logs bucket:

gcloud alpha logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
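
The same command shape applies to the other destinations. As a hedged sketch, a sink that routes to a BigQuery dataset might look like the following (the project, dataset, and sink names here are hypothetical):

gcloud alpha logging sinks create my-bq-sink \
  bigquery.googleapis.com/projects/myproject123/datasets/my_dataset \
  --log-filter='severity>=WARNING'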

CONSOLE

To create a sink to a destination, complete the following steps:

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. Click Create Sink.

  3. In the Select sink service panel, select the destination.

  4. Enter a Name and Description for your sink.

  5. Click Next.

  6. In Select sink destination, select the bucket you want to send logs to.

  7. Click Next.

  8. Build an inclusion filter to specify which logs to route to the bucket. As you build the filter, you can click Preview logs to make sure you are including the correct logs.

    One filter you might want to build is a filter to route all Data Access logs to a single bucket. This filter looks like the following:

    LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
    

  9. When you're finished, click Next.

  10. Optionally, you can build an exclusion filter to specify which logs to exclude from your bucket.

    For more information on filters, refer to Using exclusion filters and to Filters for common use cases.

  11. When you're finished, click Create sink. Your new sink appears in the Logs Routing Sinks list.

Routing logs from one project to a bucket in a different project

To route your current project's logs to a bucket in a different project, complete the following steps.

Note that cross-project bucket sinks require adding the appropriate Identity and Access Management permissions to the service account that Logging creates for these sinks.

GCLOUD

  1. If you haven't done so, create a bucket in the other project:

    gcloud alpha logging buckets create BUCKET_ID --project=DESTINATION_PROJECT_ID
    
  2. Create a sink to route logs to the other bucket:

    gcloud alpha logging sinks create SINK_NAME \
      logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/global/buckets/BUCKET_ID \
      --log-filter='FILTER_CONDITIONS'
    
  3. Get the service account from the writerIdentity field in your sink:

    gcloud alpha logging sinks describe SINK_NAME
    

    The service account looks similar to the following:

    serviceAccount:p123456789012-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  4. Grant the roles/logging.bucketWriter role to the service account.

    1. Get the Identity and Access Management policy for the destination project and write it to a local file in JSON format:

      gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json

    2. Add an IAM condition that lets the service account write only to the bucket you created. For example:

      {
        "bindings": [
          {
            "members": [
              "user:username@gmail.com"
            ],
            "role": "roles/owner"
          },
          {
            "members": [
              "[SERVICE_ACCOUNT]"
            ],
            "role": "roles/logging.bucketWriter",
            "condition": {
              "title": "Bucket writer condition example",
              "description": "Grants logging.bucketWriter role to service account [SERVICE_ACCOUNT] used by log sink [SINK_NAME]",
              "expression": "resource.name.endsWith(\"locations/global/buckets/BUCKET_ID\")"
            }
          }
        ],
        "etag": "BwWd_6eERR4=",
        "version": 3
      }
    3. Update the IAM policy:

    gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
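
As an alternative to editing the policy file by hand, you can usually attach the same conditional binding in a single step with the gcloud projects add-iam-policy-binding command. The following is a sketch that assumes your gcloud version supports the --condition flag on this command; SERVICE_ACCOUNT_EMAIL is the address portion of the writerIdentity value from step 3:

gcloud projects add-iam-policy-binding DESTINATION_PROJECT_ID \
  --member='serviceAccount:SERVICE_ACCOUNT_EMAIL' \
  --role='roles/logging.bucketWriter' \
  --condition='expression=resource.name.endsWith("locations/global/buckets/BUCKET_ID"),title=Bucket writer condition example'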
    

CONSOLE

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. Click Create Sink.

  3. In the Select sink service window, select Other project and click Next.

  4. In the Sink details section, enter the Sink name and Sink description.

  5. Click Next.

  6. In the Sink destination field, enter the bucket location for the destination project.

    logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/global/buckets/BUCKET_ID
    
  7. Build an inclusion filter to specify which logs to route to the bucket. As you build the filter, you can click Preview logs to make sure you are including the correct logs.

    One filter you might want to build is a filter to route all Data Access logs to a single bucket. This filter looks like the following:

    LOG_ID("cloudaudit.googleapis.com/data_access") OR LOG_ID("externalaudit.googleapis.com/data_access")
    

  8. When you're finished building the filter, click Next.

  9. Optional: Specify which log entries to exclude from the bucket.

    For more information on filters, refer to Using exclusion filters and to Filters for common use cases.

  10. When you're finished, click Create sink. Your new sink appears in the Logs Routing Sinks list.

  11. Get the service account from the writerIdentity field in your sink:

    gcloud alpha logging sinks describe SINK_NAME
    

    The service account looks similar to the following:

    serviceAccount:p123456789012-12345@gcp-sa-logging.iam.gserviceaccount.com
    
  12. Grant the roles/logging.bucketWriter role to the service account.

    1. Get the Identity and Access Management policy for the destination project and write it to a local file in JSON format:

      gcloud projects get-iam-policy DESTINATION_PROJECT_ID --format json > output.json
      
    2. Add an IAM condition that lets the service account write only to the bucket you created. For example:

      {
        "bindings": [
          {
            "members": [
              "user:username@gmail.com"
            ],
            "role": "roles/owner"
          },
          {
            "members": [
              "[SERVICE_ACCOUNT]"
            ],
            "role": "roles/logging.bucketWriter",
            "condition": {
              "title": "Bucket writer condition example",
              "description": "Grants logging.bucketWriter role to service account [SERVICE_ACCOUNT] used by log sink [SINK_NAME]",
              "expression": "resource.name.endsWith(\"locations/global/buckets/BUCKET_ID\")"
            }
          }
        ],
        "etag": "BwWd_6eERR4=",
        "version": 3
      }
    3. Update the IAM policy:

      gcloud projects set-iam-policy DESTINATION_PROJECT_ID output.json
      

Updating sinks

Update a sink in Logs Viewer (Classic)

To update a sink, select the Edit sink command from the menu to the right of the sink's name. You can change any of the following parameters:

  • Destination
  • Query

To change other parameters of your sinks, use the projects.sinks.update API method.
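
If you prefer the command line, the gcloud logging sinks update command covers the same parameters. A minimal sketch, assuming an existing sink named my-sink and a hypothetical replacement Cloud Storage bucket:

gcloud logging sinks update my-sink \
  storage.googleapis.com/my-new-bucket \
  --log-filter='severity>=ERROR'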

Update a sink in Logs Router

You can edit your existing sinks using the Logs Router by completing the following steps.

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. For the sink you want to edit, click More.

  3. Click Edit sink.

  4. Edit the sink as needed.

  5. When finished, click Update sink.

Deleting sinks

Deleting a sink removes it from Cloud Logging and prevents future logs from being routed to its destination.

Deleting a sink in Logs Viewer (Classic)

To delete a sink, select it on the Logs Router page and click Delete at the top of the page. Alternatively, select Delete sink from the menu to the right of the sink's name.
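
Deletion is also available from the command line. A minimal sketch, assuming a sink named my-sink in the current project:

gcloud logging sinks delete my-sink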

Deleting a sink in Logs Router

To delete your sink, complete the following steps.

CONSOLE

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. For the sink you want to delete, click More.

  3. Select Delete sink.

  4. In the confirmation panel, click Delete.

  5. The sink is deleted.

Disabling sinks

Disabling a sink prevents future logs from being routed to its destination but doesn't remove the sink from Cloud Logging.

Disabling your sink using Logs Router

To disable your sink, complete the following steps.

CONSOLE

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. For the sink you want to disable, click More.

  3. Select Disable sink.

  4. In the confirmation panel, click Disable.

  5. The sink State changes to Disabled.
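
You can also toggle this state from the command line. A sketch, assuming your gcloud release supports the --disabled flag on gcloud logging sinks update (use --no-disabled to re-enable the sink, matching the Enabling sinks steps below):

gcloud logging sinks update SINK_NAME --disabled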

Enabling sinks

Enabling a sink restarts a disabled sink and resumes routing logs to its destination.

Enabling your disabled sink

To enable your disabled sink, complete the following steps.

CONSOLE

  1. From the Logging menu, select Logs Router.

    Go to Logs Router

  2. For the sink you want to enable, click More.

  3. Select Enable sink.

  4. In the confirmation panel, click Enable.

  5. The sink State changes to Enabled.

Destination permissions

This section describes how you can grant Logging the Identity and Access Management permissions to write exported logs to your sink's export destination.

When you create a sink, Logging creates a new service account for the sink, called a unique writer identity. You cannot manage this service account directly, because it is owned and managed by Cloud Logging. The service account is deleted when the sink is deleted.

Your export destination must permit this service account to write log entries. To set up this permission, follow these steps:

  1. Create the new sink in the Cloud Console, the gcloud logging command-line interface, or the Logging API.

  2. If you created your sink in the Cloud Console and you have Owner access to the destination, then Cloud Logging should have set up the necessary permissions on your behalf. If it did so, you are done. If not, continue.

  3. Obtain the sink's writer identity—an email address—from the new sink:

    • If you are using the Cloud Console, the writer identity appears in the sink listing on the Router page.
    • If you are using the Logging API, you can get the writer identity from the LogSink object.
    • If you are using gcloud logging, the writer identities appear when you list your sinks.
  4. If you have Owner access to the destination, then add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Pub/Sub, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.

    This completes the authorization.

  5. If you do not have Owner access to the export destination, then send the writer identity service account name to someone who has that ability. That person should then follow the instructions in the previous step to add the writer identity to the export destination.
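
As an illustration of step 4 for a Cloud Storage destination, the grant can be made with gsutil. A sketch, assuming a hypothetical bucket named my-export-bucket; WRITER_IDENTITY_EMAIL is the address portion of the writer identity you obtained in step 3:

gsutil iam ch serviceAccount:WRITER_IDENTITY_EMAIL:objectCreator gs://my-export-bucket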

Authorization delays

If a sink tries to export a log entry but lacks the appropriate IAM permission to the export destination, then the sink reports an error and skips the log entry. This continues until the permission is granted, at which time the sink begins exporting new log entries.

There is a delay between creating the sink and using the sink's new service account to authorize writing to the export destination. During the first 24 hours after sink creation, you might see permission-related error messages from the sink on your project's Activity page; you can ignore them.

Advantages and limitations

The Cloud Console has the following advantages over using the Logging API:

  • The Cloud Console shows all of your sinks in one place.
  • The Cloud Console shows you which log entries are matched by your sink query before you create a sink.
  • The Cloud Console can create and authorize export destinations for your sinks.

However, the Cloud Console can only create or view sinks in projects. To create sinks in organizations, folders, or billing accounts, read Aggregated sinks.
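
For reference, an aggregated sink uses the same gcloud command with a scope flag added. A sketch, with a hypothetical organization ID and bucket; the --include-children flag routes logs from all projects and folders under the organization:

gcloud logging sinks create my-agg-sink \
  storage.googleapis.com/my-export-bucket \
  --organization=123456789 --include-children \
  --log-filter='severity>=ERROR'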

Troubleshooting

Here are some common problems you may encounter when exporting logs, and what to do about them:

  • Errors from the destination: Check the specification of the export destination in the sink. Use projects.sinks.get to find the writer identity for your sink, and be sure that identity is permitted to write to your export destination.

  • No logs are being exported: Here are some possible reasons:

    • Your query is incorrect. Check your export query to verify that log entries matching your query have recently arrived in Logging; correct any misspellings or formatting errors.

    • No matching log entries have been received since you created or updated your sink; only new log entries are exported.

      There is a delay before you can view your exported logs in the destination. This is especially true of Cloud Storage destinations. For details, go to Exported logs availability.

      You can also check the export system metrics. The export system metrics can tell you how many log entries are exported and how many are dropped due to errors.

Your sink resumes exporting logs once the errors are corrected.

To view your sink errors using the Logs Viewer, do the following:

  1. Go to the Activity Stream for the project or other resource where the sink was created:

    Go to Activity Stream

  2. In the Filters panel, select Activity type > Configuration and Resource type > Logging export sink.

  3. Adjust Date/time to view sink errors for the appropriate timeframe.

    Your sink errors appear.

The following sections list some possible service-specific errors and unexpected results, and explain what to do about them.

Errors exporting to Cloud Storage

The following are the most common errors when exporting logs to Cloud Storage:

  • Error: Permissions on bucket [YOUR_BUCKET] do not allow the logs group to create new objects.
    Cause: The sink's writer identity does not have the correct permissions to the bucket.
    Solution: Add the necessary permissions to the bucket or update your sink to use a different bucket. See Destination permissions.

  • Error: No bucket with name: [YOUR_BUCKET].
    Cause: There might be an error in the bucket name, or the bucket might have been deleted.
    Solution: Update your sink with the correct bucket destination.

  • Error: Sink successfully created. However, we were unable to grant correct permissions to the destination. (This error message continues to be displayed after the service account is added.)
    Cause: The access control model for the bucket was set to Uniform when the bucket was created.
    Solution: Select the Fine-grained access control model during bucket creation. For existing buckets, you can change the access control model for the first 90 days after bucket creation by using the Permissions tab.

Errors exporting to BigQuery

The following are the most common errors when exporting logs to BigQuery:

  • Error: Permissions on dataset [YOUR_DATASET] do not allow the logs group to create new tables.
    Cause: The sink's writer identity does not have enough permissions to the dataset.
    Solution: Add the permission to the dataset. See Destination permissions.

  • Error: No dataset with name: [YOUR_DATASET].
    Cause: You might have an error in your sink's destination, or someone might have deleted the dataset.
    Solution: Either re-create the dataset or update the export sink to use a different dataset.

  • Error: Logs streamed to table [YOUR_TABLE] in dataset [YOUR_DATASET] do not match the table schema.
    Cause: You are trying to export logs that are incompatible with the current table's schema.
    Solution: Make sure that your log entries match the table's schema. Common issues include sending log entries with different data types; for example, one of the fields in the log entry is an integer, while a corresponding column in the schema has a string type. The activity stream contains a link to one of the invalid log entries. After you fix the source of the error, you can rename your current table and let Logging create the table again.

  • Error: Per-table streaming insert quota has been exceeded for table [YOUR_TABLE] in dataset [YOUR_DATASET].
    Cause: You are exporting too many log entries too quickly. See the BigQuery default quota limits, which apply to logs streaming.
    Solution: Decrease the amount of log data your sink generates. You can update your sink's query to match fewer log entries or use the sample() function.

  • Error: Logs streamed to partitioned table [YOUR_TABLE] are outside the permitted time boundaries.
    Cause: BigQuery does not accept logs that are too far in the past or future.
    Solution: Logs outside the permitted time boundaries cannot be exported with sinks. You can export those logs to Cloud Storage and use a BigQuery load job instead. See the BigQuery documentation for further instructions.

  • Error: Logs cannot be streamed to dataset [YOUR_DATASET] because that operation is prohibited by an organization policy.
    Cause: An organization policy exists that prevents writes to the selected dataset. See the documentation for more details on organization policies.
    Solution: Modify your export sink to use a compliant dataset.

Errors exporting logs to Pub/Sub

The following are the most common errors when exporting logs to Pub/Sub:

  • Error: [ACCOUNT] needs edit permission on [PROJECT] to publish to [TOPIC].
    Cause: The sink's writer identity does not have the correct permissions to the topic.
    Solution: Add the necessary permissions to your project. See Destination permissions.

  • Error: Topic [TOPIC] does not exist.
    Cause: You might have deleted the topic that was configured to receive your exported logs.
    Solution: Either re-create the topic with the same name, or change the export configuration to use a different topic.

What's next