You can export copies of new log entries from Stackdriver Logging to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.
Where do you want to go?
- To use the Logs Export panel in the Logs Viewer, continue on this page.
- To use the command-line SDK to set up exports, see Exporting Logs in the SDK.
- To use the Stackdriver Logging API to set up exports, see Exporting Logs in the API, and the projects.sinks methods.
- To learn how log exporting works and how to find and use the logs you export, see Exported Logs.
Before you begin
Project: You must have a Google Cloud Platform project with logs that you can see in the Logs Viewer. For background and getting-started information, see the Stackdriver Logging Logs Viewer.
Permissions: You need more permissions to export logs than you do to view them. You must have either the Owner or Logs Configuration Writer role in the project. For more information, see the IAM & Admin page of the Cloud Platform Console, and the Stackdriver Logging Access Control Guide.
Destination service: To export logs, you must be signed up for the destination service: Cloud Storage, BigQuery, or Cloud Pub/Sub.
Understanding the user interface
Sinks are configured in the Exports page of the Logs Viewer.
The following screenshot shows an example of the Logs Viewer's Exports page. The page shows two existing export sinks, one created with the former Logs Viewer's Exports page (v1) and one created with the current v2 Exports page:
Migrating v1 exports
If you have any export sinks created by the Stackdriver Logging API v1—including but not limited to those created in the v1 Logs Viewer and shown in the v2 Logs Viewer—then you must convert those v1 export sinks to v2 export sinks. For information about converting them, see Migration to the Stackdriver Logging API v2.
The remainder of this page discusses v2 export sinks in the v2 Logs Viewer.
To create an export sink, click Create Export at the top of the page. The sample page shown below already has some fields filled in:
Fill in the sink creation panel as follows:
To select the logs to export, enter an advanced logs filter in the topmost text box. The filter in the sample page specifies the Nginx error logs from all your Google App Engine applications.
Whenever you edit the filter, press Submit Filter to display the matched log entries in the area at the bottom of the page. Press the Refresh icon at the top of the page to fetch the most recent logs.
If you wish to use the basic viewing interface to select the logs, use the ▾ menu at the right side of the filter box.
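The filter described above can be typed directly into the filter box. One plausible form, using the v2 advanced logs filter syntax with a placeholder project ID, is:

```
resource.type="gae_app"
logName="projects/[YOUR_PROJECT_ID]/logs/appengine.googleapis.com%2Fnginx.error"
```

Only log entries matching every line of the filter are exported.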
- Sink name
The identifier you want to assign to the export sink.
- Sink Service
Select the service that will receive your exported logs: Cloud Storage, Cloud Pub/Sub, or BigQuery.
- Sink Destination
Select or create the particular bucket, topic, or dataset to receive the exported logs. The destination must be in the same project as your logs.
Press Create Sink to create the sink.
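If you prefer the command line, the SDK offers an equivalent. The following is a sketch only, assuming the gcloud beta logging command group; [SINK_NAME] and [BUCKET_NAME] are placeholders:

```
# Create a v2 sink that writes matching entries to a Cloud Storage bucket.
gcloud beta logging sinks create [SINK_NAME] \
    storage.googleapis.com/[BUCKET_NAME] \
    --log-filter='resource.type="gae_app"'
```

See Exporting Logs in the SDK for the authoritative command reference.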
As part of creating the sink, Stackdriver Logging first grants itself (firstname.lastname@example.org) permission to write to your destination.
New log entries that match your sink will start being exported. Log entries going to BigQuery or Cloud Pub/Sub are streamed to those destinations immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For more information, see Exported logs availability.
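As an illustration of the hourly batching, exported objects in Cloud Storage are organized into dated folders. The layout below is a sketch; the exact bucket, log, and file names are assumptions for illustration:

```
[BUCKET_NAME]/syslog/2016/06/27/
    00:00:00_00:59:59_S0.json
    01:00:00_01:59:59_S0.json
```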
If you see any error messages from Cloud Storage, BigQuery, or Cloud Pub/Sub related to logs export, see Troubleshooting.
To delete a sink, select the sink in the Exports page and press Delete at the top of the page.
To edit a sink, select the Edit command in the menu to the right of the sink's name.
Stackdriver Logging must have permission to write exported logs to the destination: a Cloud Storage bucket, a BigQuery dataset, or a Cloud Pub/Sub topic. There are two ways to assign permission:
For compatibility with the v1 API, you can give permission to the group email@example.com if you are exporting to a destination in the same project as the logs.
New in the v2 API is the option to create a unique writer identity for each sink. This lets you authorize writing to a destination in another project, or export logs from an organization. For more information, see [projects.sinks.create](/logging/docs/api/reference/rest/v2/projects.sinks/create#query-parameters). This option is not yet supported in the Logs Viewer Exports page.
The following permissions are needed:
- In Cloud Storage, the group firstname.lastname@example.org must be given Owner permission to your bucket.
- In BigQuery, the group email@example.com must be given Can edit permission to your dataset.
- In Cloud Pub/Sub, the service account firstname.lastname@example.org must be given Editor permission in your project.
If you set up exports using the Stackdriver Logging API or the command-line interface instead of the Logs Viewer Exports page, then you must manually give the export writer permission to write to the destination. Until you do this, the exports will fail.
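For example, granting a group Owner access on a Cloud Storage bucket can be done with gsutil. A sketch, with placeholder group address and bucket name:

```
# Grant the logs-export group OWNER ("O") access on the destination bucket.
gsutil acl ch -g [GROUP_EMAIL]:O gs://[BUCKET_NAME]
```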
Unimplemented v2 API features
The Logs Viewer Exports page does not presently support the following Stackdriver Logging API features:
- The use of a unique writer identity in new sinks. This is required to export logs to a different project, or from an organization.
- The use of start and end times on the sink, except in the special case of using the Logs Viewer to convert a v1 sink to a v2 sink. For more information, see Migration to the v2 API.
If you need these features, create your sinks using the Stackdriver Logging API v2.
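As a sketch of what such a request might carry, the v2 sink resource is a small JSON object; you would pass uniqueWriterIdentity=true as a query parameter on projects.sinks.create. The sink name, bucket, and filter below are placeholders:

```python
import json

# Hypothetical values; substitute your own sink name, bucket, and filter.
sink = {
    "name": "my-sink",
    # A Cloud Storage destination; BigQuery and Cloud Pub/Sub use other URIs.
    "destination": "storage.googleapis.com/my-logs-bucket",
    # The same advanced logs filter syntax used in the Logs Viewer.
    "filter": 'resource.type="gae_app" severity>=ERROR',
}
print(json.dumps(sink, indent=2))
```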
Troubleshooting
This section lists some possible export errors and explains what to do about them.

| Symptom | Cause | What to do |
|---|---|---|
| Your new log entries are exported, but your older log entries are not. | Stackdriver Logging exports only the log entries it receives after the export has been set up. | Use the entries.list API method to retrieve your older log entries, then use the destination service's API to write them to the export destination. |
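To backfill older entries as suggested above, you would page through entries.list. A sketch of its request body, using v2 field names and a placeholder project ID:

```python
import json

# Hypothetical project ID; page through results with pageToken/nextPageToken.
request_body = {
    "projectIds": ["my-project"],
    "filter": 'resource.type="gae_app"',  # same filter syntax as export sinks
    "orderBy": "timestamp asc",           # oldest entries first
    "pageSize": 1000,
}
print(json.dumps(request_body, indent=2))
```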
Errors exporting to Cloud Storage
The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Storage:
| Error | Cause | What to do |
|---|---|---|
| Permissions on bucket | | Add the necessary permissions to the named bucket, or re-configure logs export to use another bucket. See Set access permissions for Cloud Storage. |
| No bucket with name: | You might have deleted the bucket that was configured to receive your exported logs. | Either re-create the bucket with the same name, or set up Stackdriver Logging to use a different bucket for logs export. |
| No bucket with name: | The specified bucket does not belong to the project that is exporting the logs. | Either delete the bucket and re-create it in the correct project, or change the export configuration to use a different bucket. |
Errors exporting to BigQuery
The following table lists the most common errors when you configure Stackdriver Logging to export logs to BigQuery:
| Error | Cause | What to do |
|---|---|---|
| Permissions on dataset | In order to write logs to a BigQuery dataset, you must grant the group Can edit permission to the dataset. | Add the permission to the dataset. See manually setting access permissions for exported logs. |
| No dataset with name: | You might have deleted the dataset that was configured to receive your exported logs. | Either re-create the dataset using the same name, or change the export configuration to use a different dataset. |
| Logs streamed to table | You are trying to export logs that are incompatible with the current table schema. | Make sure your log entries match the table schema. You can also remove or rename the table and let Stackdriver Logging create the table again. |
| Per-table streaming insert quota has been exceeded for table | You are exporting too much log data too quickly. See the BigQuery default quota limits, which apply to logs streaming. | If you exceed the quota, contact a sales representative to increase your per-table or per-project quotas. |
Errors exporting logs to Cloud Pub/Sub
The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Pub/Sub:
| Error | Cause | What to do |
|---|---|---|
| | The specified account does not have Can edit permission in your project. | Add the necessary permissions to your project. See Set access permissions for Cloud Pub/Sub. |
| | You might have deleted the topic that was configured to receive your exported logs. | Either re-create the topic with the same name, or change the export configuration to use a different topic. |