Exporting Logs (v2)

You can export copies of new log entries from Stackdriver Logging to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.

Before you begin

  • Project: You must have a Google Cloud Platform project with logs that you can see in the Logs Viewer. For getting-started information, see the Stackdriver Logging Logs Viewer.

  • Permissions: You need more permissions to export logs than you do to view them. You must have either the Owner or Logs Configuration Writer role in the project. For more information, see the IAM & Admin page of the Cloud Platform Console, and the Stackdriver Logging Access Control Guide.

  • Destination service: To export logs, you must be signed up for the service that will receive them: Cloud Storage, BigQuery, or Cloud Pub/Sub.

Understanding the user interface

Sinks are configured in the Exports page of the Logs Viewer.

Go to the Exports page

The following screenshot shows an example of the Logs Viewer's Exports page. The page shows two existing export sinks, one created with the former Logs Viewer's Exports page (v1) and one created with the current v2 Exports page:

[Screenshot: Exports page showing one v1 export sink and one v2 export sink]

Migrating v1 exports

If you have any export sinks created by the Stackdriver Logging API v1—including but not limited to those created in the v1 Logs Viewer and shown in the v2 Logs Viewer—then you must convert those v1 export sinks to v2 export sinks. For information about converting them, see Migration to the Stackdriver Logging API v2.

The remainder of this page discusses v2 export sinks in the v2 Logs Viewer.

Creating sinks

To create an export sink, click Create Export at the top of the page. The sample page shown below already has some fields filled in:

[Screenshot: Create Export panel with sample fields filled in]

Fill in the sink creation panel as follows:

(filter)

To select the logs to export, enter an advanced logs filter in the topmost text box. The filter in the sample page specifies the Nginx error logs from all your Google App Engine applications.
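
For example, a filter along the following lines matches those entries. The exact log name depends on how your application writes its Nginx logs, so treat this as an illustrative sketch rather than the precise filter in the screenshot:

    resource.type="gae_app" AND logName:"nginx.error"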

Whenever you edit the filter, press Submit Filter to display the matched log entries in the area at the bottom of the page. Press the Refresh icon at the top of the page to fetch the most recent logs.

If you wish to use the basic viewing interface to select the logs, use the ▾ menu at the right side of the filter box.

Sink name

The identifier you want to assign to the export sink.

Sink Service

Select the service that will receive your exported logs: Cloud Storage, Cloud Pub/Sub, or BigQuery.

Sink Destination

Select or create the particular bucket, topic, or dataset to receive the exported logs. The destination must be in the same project as your logs.
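
If you later manage the same sink through the Stackdriver Logging API or the command line, the destination is written as a path. The v2 formats are shown below; the bracketed names are placeholders for your own identifiers:

    storage.googleapis.com/[BUCKET_ID]
    bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET_ID]
    pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]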

Press Create Sink to create the sink.

As part of creating the sink, Stackdriver Logging will first grant itself (cloud-logs@google.com) permission to write to your destination.

New log entries that match your sink will start being exported. Log entries going to BigQuery or Cloud Pub/Sub are streamed to those destinations immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For more information, see Exported logs availability.
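
The same kind of sink can be created programmatically with the Stackdriver Logging API. The following is a minimal sketch using the google-cloud-logging Python client; the project ID, sink name, filter, and bucket are hypothetical:

    from google.cloud import logging

    client = logging.Client(project='my-log-project')       # hypothetical project ID

    # The destination uses the v2 path format; the bucket must already exist.
    sink = client.sink(
        'gae-nginx-errors',                                  # hypothetical sink name
        filter_='resource.type="gae_app" AND severity>=ERROR',
        destination='storage.googleapis.com/my-export-bucket',
    )
    sink.create()

Unlike sinks created in the Logs Viewer, a sink created this way is not automatically granted permission to write to its destination; see Destination authorization below.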

If you see any error messages from Cloud Storage, BigQuery, or Cloud Pub/Sub related to logs export, see Troubleshooting.

Deleting sinks

To delete a sink, select the sink in the Exports page and press Delete at the top of the page.

Editing sinks

To edit a sink, select the Edit command in the menu to the right of the sink's name.
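
Both operations are also available through the Stackdriver Logging API. A minimal sketch with the google-cloud-logging Python client, reusing the hypothetical names from the creation example above:

    from google.cloud import logging

    client = logging.Client(project='my-log-project')        # hypothetical project ID
    sink = client.sink('gae-nginx-errors')                   # hypothetical sink name

    sink.reload()                                            # fetch the current filter and destination
    sink.filter_ = 'resource.type="gae_app" AND severity>=WARNING'
    sink.update()                                            # edit the sink in place

    # sink.delete()                                          # uncomment to remove the sink entirely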

Destination authorization

Stackdriver Logging must have permission to write exported logs to the destination: a Cloud Storage bucket, a BigQuery dataset, or a Cloud Pub/Sub topic. There are two ways to assign permission:

  • For compatibility with the v1 API, you can grant permission to the group cloud-logs@google.com if you are exporting to a destination in the same project as the logs.

  • New in the v2 API is the option to create a unique writer identity for each sink. This lets you write to a destination in another project, or export logs from an organization. For more information, see [projects.sinks.create](/logging/docs/api/reference/rest/v2/projects.sinks/create#query-parameters). This option is not yet supported in the Logs Viewer Exports page.

The following permissions are needed:

  • In Cloud Storage, the group cloud-logs@google.com is given Owner permission to your bucket.
  • In BigQuery, the group cloud-logs@google.com is given Can edit permission to your dataset.
  • In Cloud Pub/Sub, the service account cloud-logs@system.gserviceaccount.com is given Editor permission in your project.

If you set up exports using the Stackdriver Logging API or the command line interface instead of the Logs Viewer Exports page, then you must manually give the export writer permission to write to the destination. Until you do, the exports will fail.
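
For example, for a Cloud Storage destination you can grant the Owner permission described above with the google-cloud-storage Python client; the project and bucket names are hypothetical:

    from google.cloud import storage

    client = storage.Client(project='my-log-project')        # hypothetical project ID
    bucket = client.get_bucket('my-export-bucket')           # hypothetical bucket name

    # Give the logs group Owner access so Stackdriver Logging can create objects.
    bucket.acl.group('cloud-logs@google.com').grant_owner()
    bucket.acl.save()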

Unimplemented v2 API features

The Logs Viewer Exports page does not presently support the following Stackdriver Logging API features:

  • The use of a unique writer identity in new sinks. This is required to export logs to a different project, or from an organization.

  • The use of start and end times on the sink, except in the special case of using the Logs Viewer to convert a v1 sink to a v2 sink. For more information, see Migration to the v2 API.

If you need these features, create your sinks using the Stackdriver Logging API v2.
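
For instance, a unique writer identity is requested through the uniqueWriterIdentity query parameter of projects.sinks.create. The following is a minimal sketch using the google-api-python-client discovery client; the project, sink, and topic names are hypothetical, and you should confirm the parameter surface against the projects.sinks.create reference:

    from googleapiclient import discovery

    # Assumes application default credentials are available.
    service = discovery.build('logging', 'v2')

    sink = service.projects().sinks().create(
        parent='projects/my-log-project',                    # hypothetical project
        uniqueWriterIdentity=True,                           # mint a per-sink service account
        body={
            'name': 'cross-project-sink',                    # hypothetical sink name
            'destination': 'pubsub.googleapis.com/projects/other-project/topics/exported-logs',
            'filter': 'severity>=ERROR',
        },
    ).execute()

    # Grant this identity write access on the destination before logs will flow.
    print(sink.get('writerIdentity'))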

Troubleshooting

This section lists some possible export errors and explains what to do about them.

General problems

Problem: Your new log entries are exported, but your older log entries are not.
Cause: Stackdriver Logging only exports log entries that are received after the export has been set up.
Solution: Use the entries.list API method to retrieve your older log entries, and use the destination service's API to write them to the export destination; a sketch follows this table.
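
A sketch of this backfill with the google-cloud-logging Python client is shown below; only the read side is shown, and the project ID and filter are hypothetical:

    from google.cloud import logging

    client = logging.Client(project='my-log-project')        # hypothetical project ID

    # entries.list is exposed as list_entries; iterate older entries that match the sink's filter.
    for entry in client.list_entries(filter_='resource.type="gae_app" AND severity>=ERROR'):
        payload = entry.payload
        # ... write `payload` to your bucket, dataset, or topic with that service's client ...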

Errors exporting to Cloud Storage

The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Storage:

Error: Permissions on bucket <your_bucket> do not allow the logs group to create new objects.
Cause: The group cloud-logs@google.com does not have Owner access to the bucket. Access is granted automatically when you configure log export using the Logs Viewer, but not when you use the command line interface or the Stackdriver Logging API. It is also possible that a project owner revoked the permissions.
Solution: Add the necessary permissions to the named bucket, or reconfigure the logs export to use another bucket. See Set access permissions for Cloud Storage.

Error: No bucket with name: <your_bucket>.
Cause: You might have deleted the bucket that was configured to receive your exported logs.
Solution: Either re-create the bucket with the same name, or set up Stackdriver Logging to use a different bucket for the logs export.

Error: Bucket <your_bucket> is not in project <your_project>.
Cause: The specified bucket does not belong to the project that is exporting the logs.
Solution: Either delete the bucket and re-create it in the correct project, or change the export configuration to use a different bucket.

Errors exporting to BigQuery

The following table lists the most common errors when you configure Stackdriver Logging to export logs to BigQuery:

Error: Permissions on dataset <your_dataset> do not allow the logs group to create new tables.
Cause: To write logs to a BigQuery dataset, the group cloud-logs@google.com must have the WRITER role on the dataset. Access is normally granted automatically when you configure log export to BigQuery; if you remove this permission, Stackdriver Logging is not able to write logs to that dataset.
Solution: Add the permission to the dataset. See manually setting access permissions for exported logs.

Error: No dataset with name: <your_dataset>.
Cause: You might have deleted the dataset that was configured to receive your exported logs.
Solution: Either re-create the dataset using the same name, or change the export configuration to use a different dataset.

Error: Logs streamed to table <your_table> in dataset <your_dataset> do not match the table schema.
Cause: You are trying to export logs that are incompatible with the current table's schema.
Solution: Make sure your log entries match the table's schema. You can also remove or rename the table and let Stackdriver Logging create it again.

Error: Per-table streaming insert quota has been exceeded for table <your_table> in dataset <your_dataset>.
Cause: You are exporting too much log data too quickly.
Solution: See the BigQuery default quota limits, which apply to logs streaming. If you exceed the quota, contact a sales representative to increase your per-table or per-project quotas.

Errors exporting logs to Cloud Pub/Sub

The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Pub/Sub:

Error: <account> needs edit permission on <project> to publish to <topic>.
Cause: The specified account does not have the Can edit permission in your project.
Solution: Add the necessary permissions to your project. See Set access permissions for Cloud Pub/Sub.

Error: Topic <topic> does not exist.
Cause: You might have deleted the topic that was configured to receive your exported logs.
Solution: Either re-create the topic with the same name, or change the export configuration to use a different topic.
