Exporting Logs (v1)

You can export copies of new log entries from Stackdriver Logging to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.

This guide explains how to use the Logs Export panel in the Logs Viewer to set up exports. To use the command-line SDK to set up exports, see Exporting Logs in the SDK. To use the Stackdriver Logging API to set up exports, see Exporting Logs in the API, and the projects.sinks methods. To learn how log exporting works and how to find and use exported logs, see Exported Logs.

Before you begin

Creating export destinations

If you use the Logs Export panel of the Logs Viewer, you can create your export destinations—buckets, datasets, and topics—in the Cloud Platform Console as you configure logs export.

If you use the command line interface or the API, you must create your destinations and assign the right permissions to them before you export your logs. See Permissions for writing exported logs for more information.
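As a sketch of the command-line approach, sinks can be created with the gcloud SDK. The sink names, bucket, dataset, and project below are placeholders, and the exact flags may differ across SDK versions:

```shell
# Sketch only: create export sinks with the gcloud CLI (v1 beta
# commands; resource names below are placeholders, not real resources).

# Export the syslog log to an existing Cloud Storage bucket:
gcloud beta logging sinks create my-gcs-sink \
    storage.googleapis.com/my-logs-bucket --log=syslog

# Export the same log to an existing BigQuery dataset instead:
gcloud beta logging sinks create my-bq-sink \
    bigquery.googleapis.com/projects/my-project/datasets/my_dataset \
    --log=syslog
```

Remember that the destinations must already exist and carry the permissions described in Set permissions for writing exported logs before these commands will succeed.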

Configuring log sinks

To configure logs export using the Logs Export panel, follow these steps:

  1. Go to the Logs Viewer and pick a project.

  2. Select Exports in the left-side navigation pane. You see the following panel:

    Configure export

  3. Select logs to export by following these steps.

    1. Using the Select service menu, choose a log service whose logs you want to export. For example, choose App Engine if you want to export your App Engine Request Log. The menu shows only the services that you have enabled.

    2. To export all the logs from the selected log service, check All logs and skip ahead to the Select export destinations step.

    3. To export only some of the logs from the log service, uncheck All logs. A box appears containing a large plus sign [+].

    4. For each log you want to export, click the plus sign [+] and then select a log from the drop-down menu. The menu shows only the logs that you have access to.

  4. Select export destinations. For each of the three possible export destination services—BigQuery, Cloud Storage, or Cloud Pub/Sub—select a destination from the service's dropdown menu:

    1. Choose Don't export to <service> if you do not want to export the selected logs to the destination service.

    2. Choose Add new <destination>... if you want to create a new destination—dataset, bucket, or topic—for the selected logs.

    3. Choose an existing destination to export the selected logs to that dataset, bucket, or topic.

    4. When you have configured all your exports, select Save.

  5. Export more log services. To export logs from another log service, repeat these steps beginning with Select logs to export. Your new export settings for a service will override all previous export settings for the service.

  6. Halting log exports. To stop exporting logs from a log service, select the service, check All logs, and then choose Don't export to <service> for each export destination service.

Stackdriver Logging immediately begins to export log entries according to your configuration. See Exported logs availability for more information.
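To double-check what is configured, you can list your sinks from the command line. This is a sketch assuming the v1 beta SDK; the command name may differ in other SDK versions:

```shell
# List the export sinks configured for the current project.
gcloud beta logging sinks list
```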

If you see any error messages from Cloud Storage, BigQuery, or Cloud Pub/Sub related to logs export, see Troubleshooting.

Set permissions for writing exported logs

There are two permissions involved in exporting logs:

  • You or your software must have permission to configure logs export.
  • Stackdriver Logging must have permission to write the logs to the selected destination.

To configure logs export, you or your software must have one of the following roles in the project exporting the logs:

  • Owner
  • Logs Configuration Writer

You can see and change permissions in the IAM & Admin page of the Cloud Platform Console. For more information, see the Stackdriver Logging Access Control Guide.

Stackdriver Logging must also have permission to write exported logs to their destination: a Cloud Storage bucket, a BigQuery dataset, or a Cloud Pub/Sub topic. The Cloud Platform Console assigns the following permissions for you when you configure export using the Logs Export panel of Logs Viewer:

  • In Cloud Storage, the group cloud-logs@google.com is given Owner permission to your bucket.
  • In BigQuery, the group cloud-logs@google.com is given Can edit permission to your dataset.
  • In Cloud Pub/Sub, the service account cloud-logs@system.gserviceaccount.com is given Editor permission in your project.

If you set up exports using the Stackdriver Logging API or the command line interface instead of the Logs Export panel, then you must create the destinations and grant the right permissions before you configure logs export. Instructions are given in the following sections.

No matter how the permissions are granted, if you stop exporting logs to these destinations, the permissions still remain in place until you remove them.

Setting permissions for Cloud Storage

You must give Stackdriver Logging Owner permission on each bucket to which you export logs:

  1. Navigate to the Cloud Storage Browser page in the Cloud Platform Console, Storage > Browser.

  2. Find your storage bucket and choose Edit bucket permissions from the More menu for the bucket:

    Storage browser

  3. If necessary, click + Add item and add cloud-logs@google.com as a Group with Owner permission on your bucket:

    Add cloud-logs as an owner

  4. Click Save.

For more information, see Cloud Storage access control.
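From the command line, the same grant can be sketched with gsutil; the bucket name below is a placeholder:

```shell
# Grant the cloud-logs group OWNER (O) access on the export bucket.
# The bucket name is a placeholder.
gsutil acl ch -g cloud-logs@google.com:O gs://my-logs-bucket
```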

Setting permissions for BigQuery

You must give Stackdriver Logging Editor permission to each BigQuery dataset to which you export logs:

  1. Navigate to the BigQuery Web UI in the Cloud Platform Console, Big Data > BigQuery.

  2. In the dropdown link to the right of your dataset's name, select Share dataset:

    Share dataset

  3. In the Add people section:

    1. Select Group by email in the dropdown list at the left of the dialog.
    2. Enter cloud-logs@google.com in the text box.
    3. Select Can edit in the dropdown list at the right of the dialog.
    4. Clear the Notify people via email checkbox. The dialog should look like the following:

      Add cloud-logs

  4. Click Add.

  5. Click Save changes.

For more information, see BigQuery access control.
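The same grant can be sketched with the bq command-line tool, which applies dataset access by round-tripping the dataset's JSON description; the dataset name below is a placeholder:

```shell
# Sketch: add cloud-logs@google.com as a WRITER on the dataset.
# The dataset name is a placeholder.
bq show --format=prettyjson my_dataset > dataset.json
# Edit dataset.json by hand: append
#   {"groupByEmail": "cloud-logs@google.com", "role": "WRITER"}
# to the "access" array, then apply the change:
bq update --source dataset.json my_dataset
```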

Setting permissions for Cloud Pub/Sub

You must give Stackdriver Logging the Pub/Sub Publisher role for any Pub/Sub topic to which you will export logs. If you give Stackdriver Logging the Pub/Sub Publisher role in your project, then you can export logs to any topic in your project. You can substitute any role that includes Pub/Sub Publisher, such as Editor.

To grant permission at the project level, follow these steps:

  1. Navigate to the Cloud Platform Console Permissions page for your project.

  2. If account cloud-logs@system.gserviceaccount.com is already listed, be sure it includes the role Pub/Sub Publisher or some other role that includes that permission. Add the role if necessary.

  3. If the account is not listed, then click Add Member at the top of the page. Fill in the Add members dialog as follows:

    1. Enter cloud-logs@system.gserviceaccount.com in the Members box.
    2. Select Pub/Sub > Publisher in the Select a role menu, or substitute a role that includes that permission.
    3. Click Add.

To grant permission at the topic level, follow these steps:

  1. Navigate to the Pub/Sub topic list for your project in the Cloud Platform Console.
  2. Create a new topic or select an existing topic for logs export.
  3. Select Permissions.
  4. Enter cloud-logs@system.gserviceaccount.com.
  5. In the Select a role menu, select Pub/Sub Publisher or another role which includes that permission.
  6. Click Add.
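Both grants above can be sketched from the command line with gcloud IAM bindings; the project ID and topic name are placeholders, and the beta command prefix may differ by SDK version:

```shell
# Project-level grant (project ID is a placeholder):
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:cloud-logs@system.gserviceaccount.com \
    --role=roles/pubsub.publisher

# Or a topic-level grant (topic name is a placeholder):
gcloud beta pubsub topics add-iam-policy-binding my-topic \
    --member=serviceAccount:cloud-logs@system.gserviceaccount.com \
    --role=roles/pubsub.publisher
```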

Using exported logs

See Exported Logs for a detailed description of the format of the exported log files, tables, topics, and log entries.

Troubleshooting

This section lists some possible export errors and explains what to do about them.

General problems

Problem: Your new log entries are exported, but your older log entries are not.
Cause: Stackdriver Logging exports only log entries that are received after the export has been set up.
Solution: Use the entries.list API method to retrieve your older log entries, then use the destination service's API to write them to the export destination.
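The manual backfill can be sketched from the command line; the filter, file name, and bucket below are placeholders, and gcloud flags may vary by SDK version:

```shell
# Sketch: pull older entries with the CLI, then copy them to the
# export bucket by hand. Filter, file, and bucket are placeholders.
gcloud beta logging read 'timestamp < "2016-06-01T00:00:00Z"' \
    --format=json > old-entries.json
gsutil cp old-entries.json gs://my-logs-bucket/manual-backfill/
```

Note that entries copied this way will not follow the file and table layout that Stackdriver Logging uses for its own exports; see Exported Logs for that format.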

Errors exporting to Cloud Storage

The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Storage:

Error: Permissions on bucket <your_bucket> do not allow the logs group to create new objects.
Cause: The group cloud-logs@google.com does not have Owner access to the bucket. Access is granted automatically when you configure logs export using the Logs Viewer, but not when you use the command line interface or the Stackdriver Logging API. It is also possible that a project owner revoked the permissions.
Solution: Add the necessary permissions to the named bucket, or reconfigure logs export to use another bucket. See Setting permissions for Cloud Storage.

Error: No bucket with name: <your_bucket>.
Cause: You might have deleted the bucket that was configured to receive your exported logs.
Solution: Either re-create the bucket with the same name, or change the export configuration to use a different bucket.

Error: Bucket <your_bucket> is not in project <your_project>.
Cause: The specified bucket does not belong to the project that is exporting the logs.
Solution: Either delete the bucket and re-create it in the correct project, or change the export configuration to use a different bucket.

Errors exporting to BigQuery

The following table lists the most common errors when you configure Stackdriver Logging to export logs to BigQuery:

Error: Permissions on dataset <your_dataset> do not allow the logs group to create new tables.
Cause: To write logs to a BigQuery dataset, the group cloud-logs@google.com must have the WRITER role on the dataset. Access is normally granted automatically when you configure log exporting to use BigQuery; if you remove this permission, Stackdriver Logging is not able to write logs to that dataset.
Solution: Add the permission to the dataset. See Setting permissions for BigQuery.

Error: No dataset with name: <your_dataset>.
Cause: You might have deleted the dataset that was configured to receive your exported logs.
Solution: Either re-create the dataset using the same name, or change the export configuration to use a different dataset.

Error: Logs streamed to table <your_table> in dataset <your_dataset> do not match the table schema.
Cause: You are trying to export logs that are incompatible with the current table schema.
Solution: Make sure your log entries match the table schema. You can also remove or rename the table and let Stackdriver Logging create it again.

Error: Per-table streaming insert quota has been exceeded for table <your_table> in dataset <your_dataset>.
Cause: You are exporting too much log data too quickly.
Solution: See the BigQuery default quota limits, which apply to logs streaming. If you exceed the quota, contact a sales representative to increase your per-table or per-project quotas.

Errors exporting logs to Cloud Pub/Sub

The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Pub/Sub:

Error: <account> needs edit permission on <project> to publish to <topic>.
Cause: The specified account does not have Can edit permission in your project.
Solution: Add the necessary permissions to your project. See Setting permissions for Cloud Pub/Sub.

Error: Topic <topic> does not exist.
Cause: You might have deleted the topic that was configured to receive your exported logs.
Solution: Either re-create the topic with the same name, or change the export configuration to use a different topic.
