Exporting with the Logs Viewer

This page explains how to export log entries using the GCP Console. You can also export log entries using the Stackdriver Logging API or the gcloud logging command-line tool.

For a conceptual overview of exporting logs in Logging, see Overview of logs exports. In summary, you export logs by creating one or more sinks that include a logs query and an export destination. As Stackdriver Logging receives new log entries, they are compared against each sink. If a log entry matches a sink's query, then a copy of the log entry is written to the export destination.
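The matching behavior can be sketched roughly as follows. This is only an illustration, not the real implementation; the entry format and file paths are assumptions:

```shell
# Simulate sink matching: new entries arrive, each is compared against a
# sink's query, and matching entries are copied to the sink's destination.
# The entry format and file paths here are illustrative placeholders.
printf '%s\n' \
    'severity=ERROR msg=disk-full' \
    'severity=INFO msg=startup-ok' > /tmp/incoming_entries.txt

# A sink whose query matches ERROR entries; grep stands in for the query match.
grep 'severity=ERROR' /tmp/incoming_entries.txt > /tmp/export_destination.txt

cat /tmp/export_destination.txt
```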

Supported destinations for exported log entries are Cloud Storage, BigQuery, and Cloud Pub/Sub.

To learn how exported logs are formatted and organized, as well as how to view your exported logs, go to Using exported logs.

Before you begin

  • Project: You must have a GCP project with logs that you can see in the Logs Viewer.

    You must also have the Owner or the Logging/Logs Configuration Writer Cloud IAM roles in the project to create, delete, or modify a sink. Go to Permissions and roles for more information.

  • Destination service: To export logs, you must sign up for the GCP service to which you will write your logs: Cloud Storage, BigQuery, or Cloud Pub/Sub.

Getting started

  1. Go to Stackdriver Logging > Exports in the GCP Console:

    Go to Logs Exports

  2. Select an existing GCP project at the top of the page.

The following screenshot shows an example of the Exports page, where several log sinks are already configured:

The user interface showing the exports list.

If you have not yet configured any log sinks, the message No Log sinks are configured is displayed.

Logs Exports user interface

The Logs Exports interface contains a table summary of exports. Each table row contains information corresponding to some of the sink properties described in Logs exports:

  • Sink Name: The sink's identifier in the current project.
  • Destination: Where the exported log entries will go.
  • Writer identity: The service account that Logging uses to write log entries to the destination. This service account must have permission to write to the sink's export destination.

Each table row has a menu at its far right with the following options:

  • Edit sink: Opens an Edit Export panel where you can change the sink's parameters.
  • Delete sink: Lets you delete the sink and stop the logs export.
  • View filter: Displays the sink's query. Clicking Edit lets you change the sink's properties or query.

The search-query box above the table lets you query your sinks by text search, or by these sink properties: Sink Name, Destination, and Writer Identity. For example, the following screenshot shows a search on Destination bigquery, with the option to combine other sink properties into the query using OR (AND is the default):

The user interface showing a search query.

In addition, clicking any column name lets you sort the data in ascending or descending order. At the bottom of the table, you can also select the number of rows to display.

Creating sinks

To create an export sink, click Create Export at the top of the Logs Exports page. You can also do this at the top of the Logs Viewer page.

If you don't have permission to create exports for the project, this option isn't available. Review Before you begin above for more information.

The following screenshot shows the Edit Export panel with some fields filled in:

The user interface showing the export editing panel.

To create a sink, fill in the Edit Export panel as follows:

  1. (filter): Enter an advanced logs query. You don't need quotation marks around the query and you can use multiple lines. The initial query is determined by the log entries being displayed when you click Create Export.

    Whenever you edit the query, click Submit query to display the matched log entries. Click Jump to newest logs to fetch the most recent logs.

    If you wish to use the basic viewing interface to select the logs, use the drop-down menu in the search-query box.

  2. Sink name: Enter the identifier you want to assign to the sink.

  3. Sink Service: Select a destination service: Cloud Storage, Cloud Pub/Sub, BigQuery, or Custom Destination.

    A custom export destination still has to be in Cloud Storage, BigQuery, or Cloud Pub/Sub, but allows you to send logs to a sink in a different project.

  4. Sink Destination:

    1. Cloud Storage: Select or create the particular bucket to receive the exported logs.
    2. Cloud Pub/Sub: Select or create the particular topic to receive the exported logs.
    3. BigQuery: Select or create the particular dataset to receive the exported logs. You also have the option to use partitioned tables.
    4. Custom Destination: Add the Cloud Storage, Cloud Pub/Sub, or BigQuery GCP project as a string. For information on project name formatting, read Sink properties.
  5. Click Update Sink to create the sink.

    As part of creating the sink, Logging attempts to grant the sink's writer identity permission to write to your destination. If you are exporting to a destination in a project other than the one owning your logs, then an administrator of the new destination must grant permission. You should send the administrator the sink's writer identity, which is listed with the sink in the Exports page.
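The same sink can also be created from the command line with the gcloud logging tool mentioned earlier. A minimal sketch follows; the project, sink, bucket, dataset, and topic names are placeholder assumptions, and the create command is echoed rather than run:

```shell
# Destination strings use the formats that Logging expects for each service.
# All names below are placeholders.
PROJECT="my-project"
STORAGE_DEST="storage.googleapis.com/my-logs-bucket"
BIGQUERY_DEST="bigquery.googleapis.com/projects/${PROJECT}/datasets/my_dataset"
PUBSUB_DEST="pubsub.googleapis.com/projects/${PROJECT}/topics/my-topic"

# Echo the create command instead of running it, so the sketch is safe to run
# anywhere; drop the leading "echo" to execute it for real.
echo gcloud logging sinks create my-sink "$STORAGE_DEST" \
    --log-filter='severity>=ERROR'
```

On success, the real command prints the sink's writer identity, which you then authorize on the destination as described under Destination permissions.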

New log entries that match your sink start being exported. Log entries going to BigQuery or Cloud Pub/Sub are streamed to those export destinations immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For more information, go to Using exported logs.

If Logging encounters errors when trying to export logs to your export destination, the errors appear in your project's Activity Stream. Select Activity at the top of your project's home page in the GCP Console. To diagnose common errors, go to Troubleshooting below.

Updating sinks

To update a sink, select the Edit sink command from the menu to the right of the sink's name. You can change any of the following parameters:

  • Destination
  • Query

To change other parameters of your sinks, use the projects.sinks.update API method.
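From the command line, updating a sink looks roughly like the following sketch; the sink name, destination, and filter are placeholder assumptions, and the command is built and echoed rather than run:

```shell
# Sketch of updating a sink's destination and filter from the command line.
# All names are placeholders; run the echoed command yourself to apply it.
CMD='gcloud logging sinks update my-sink storage.googleapis.com/my-other-bucket --log-filter=severity>=WARNING'
echo "$CMD"
```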

Deleting sinks

To delete a sink, select the sink in the Exports page and click Delete at the top of the page. Alternatively, select Delete sink from the menu to the right of the sink's name.

Destination permissions

This section describes how you can grant Logging the Cloud Identity and Access Management permissions to write exported logs to your sink's export destination.

When you create a sink, Logging creates a new service account for the sink, called a unique writer identity. You cannot manage this service account directly as it is owned and managed by Stackdriver Logging. The service account is deleted if the sink gets deleted.

Your export destination must permit this service account to write log entries. To set up this permission, follow these steps:

  1. Create the new sink in the GCP Console, the gcloud logging command-line interface, or the Logging API.

  2. If you created your sink in the GCP Console and you have Owner access to the destination, then Stackdriver Logging should have set up the necessary permissions on your behalf. If it did so, you are done. If not, continue.

  3. Obtain the sink's writer identity—an email address—from the new sink:

    • If you are using the GCP Console, the writer identity appears in the sink listing on the Exports page.
    • If you are using the Logging API, you can get the writer identity from the LogSink object.
    • If you are using gcloud logging, the writer identities appear when you list your sinks.
  4. If you have Owner access to the destination, then add the service account to the destination in the following way:

    • For Cloud Storage destinations, add the sink's writer identity to your bucket and give it the Storage Object Creator role.
    • For BigQuery destinations, add the sink's writer identity to your dataset and give it the BigQuery Data Editor role.
    • For Cloud Pub/Sub, add the sink's writer identity to your topic and give it the Pub/Sub Publisher role.

    This completes the authorization.

  5. If you do not have Owner access to the export destination, then send the writer identity service account name to someone who does. That person should then follow the instructions in the previous step to add the writer identity to the export destination.
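A command-line sketch of obtaining the writer identity and granting it access follows. The sink, bucket, and topic names and the service account address are placeholder assumptions, and the commands are echoed rather than run:

```shell
# All names below are placeholders; drop the leading "echo" to execute for real.

# Step 3: obtain the sink's writer identity (a service account email address).
echo gcloud logging sinks describe my-sink --format='value(writerIdentity)'

# Step 4: grant that identity write access on the destination.
SA="serviceAccount:p123456-0001@gcp-sa-logging.iam.gserviceaccount.com"  # placeholder
# Cloud Storage bucket: Storage Object Creator role.
echo gsutil iam ch "${SA}:roles/storage.objectCreator" gs://my-logs-bucket
# Cloud Pub/Sub topic: Pub/Sub Publisher role.
echo gcloud pubsub topics add-iam-policy-binding my-topic \
    --member="$SA" --role="roles/pubsub.publisher"
# BigQuery datasets have no single equivalent command here; add the identity
# as a BigQuery Data Editor on the dataset in the BigQuery UI or via the API.
```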

Authorization delays

If a sink tries to export a log entry but lacks the appropriate Cloud IAM permission to the export destination, then the sink reports an error and skips the log entry. This continues until the permission is granted, at which time the sink begins exporting new log entries.

There is a delay between creating the sink and using the sink's new service account to authorize writing to the export destination. During the first 24 hours after sink creation, you might see permission-related error messages from the sink on your project's Activity page; you can ignore them.

Advantages and limitations

The GCP Console has the following advantages over using the Logging API:

  • The GCP Console shows all of your sinks in one place.
  • The GCP Console shows you which log entries are matched by your sink query before you create a sink.
  • The GCP Console can create and authorize export destinations for your sinks.

However, the GCP Console can only create or view sinks in projects. To create sinks in organizations, folders, or billing accounts, read Aggregated exports.


Troubleshooting

Here are some common problems you might encounter when exporting logs, and what to do about them:

  • Errors from the destination: Check the specification of the export destination in the sink. Use projects.sinks.get to find the writer identity for your sink, and be sure that identity is permitted to write to your export destination.

  • No logs are being exported: Here are some possible reasons:

    • Your query is incorrect. Check your export query to verify that log entries matching your query have recently arrived in Logging; correct any misspellings or formatting errors.

    • No matching log entries have been received since you created or updated your sink; only new log entries are exported.

      There is a delay before you can view your exported logs in the destination. This is especially true of Cloud Storage destinations. For details, go to Exported logs availability.

      You can also check the export system metrics. The export system metrics can tell you how many log entries are exported and how many are dropped due to errors.

Your sink begins exporting logs when any errors are corrected.

To view your sink errors using the Logs Viewer, do the following:

  1. Go to the Activity Stream for the project or other resource where the sink was created:

    Go to Activity Stream

  2. In the Filters panel, select Activity type > Configuration and Resource type > Logging export sink.

  3. Adjust Date/time to view sink errors for the appropriate timeframe.

    Your sink errors appear.

The following sections list some service-specific errors and unexpected results, and explain what to do about them.

Errors exporting to Cloud Storage

The following are the most common errors when exporting logs to Cloud Storage:

  • Error: Permissions on bucket [YOUR_BUCKET] do not allow the logs group to create new objects.
    Cause: The sink's writer identity does not have the correct permissions to the bucket.
    Solution: Add the necessary permissions to the bucket, or update your sink to use a different bucket. See Destination permissions.

  • Error: No bucket with name: [YOUR_BUCKET].
    Cause: There might be an error in the bucket name, or the bucket might have been deleted.
    Solution: Update your sink with the correct bucket destination.

  • Error: Sink successfully created. However, we were unable to grant correct permissions to the destination.
    Cause: The access control model for the bucket was set to Set permissions uniformly at bucket-level (Bucket Policy Only) when the bucket was created. This error message continues to be displayed after the service account is added.
    Solution: Select the access control model Set object-level and bucket-level permissions during bucket creation. For existing buckets, you can change the access control model during the first 90 days after bucket creation by using the Permissions tab.

Errors exporting to BigQuery

The following are the most common errors when exporting logs to BigQuery:

  • Error: Permissions on dataset [YOUR_DATASET] do not allow the logs group to create new tables.
    Cause: The sink's writer identity does not have enough permissions to the dataset.
    Solution: Add the permission to the dataset. See Destination permissions.

  • Error: No dataset with name: [YOUR_DATASET].
    Cause: You might have an error in your sink's destination, or someone might have deleted the dataset.
    Solution: Either re-create the dataset or update the export sink to use a different dataset.

  • Error: Logs streamed to table [YOUR_TABLE] in dataset [YOUR_DATASET] do not match the table schema.
    Cause: You are trying to export logs that are incompatible with the current table's schema.
    Solution: Make sure that your log entries match the table's schema. Common issues include sending log entries with mismatched data types; for example, a field in the log entry is an integer, while the corresponding column in the schema has a string type. The activity stream contains a link to one of the invalid log entries. After you fix the source of the error, you can rename your current table and let Logging create the table again.

  • Error: Per-table streaming insert quota has been exceeded for table [YOUR_TABLE] in dataset [YOUR_DATASET].
    Cause: You are exporting too many log entries too quickly. See the BigQuery default quota limits, which apply to logs streaming.
    Solution: Decrease the amount of log data your sink generates. You can update your sink's query to match fewer log entries or use the sample() function.

  • Error: Logs streamed to partitioned table [YOUR_TABLE] are outside the permitted time boundaries.
    Cause: BigQuery does not accept logs that are too far in the past or future.
    Solution: Logs outside permitted time boundaries cannot be exported with sinks. You can export those logs to Cloud Storage and use a BigQuery load job instead. See the BigQuery documentation for further instructions.

  • Error: Logs cannot be streamed to dataset [YOUR_DATASET] because that operation is prohibited by an organization policy.
    Cause: An organization policy exists that prevents writes to the selected dataset. See the documentation for more details on organization policies.
    Solution: Modify your export sink to use a compliant dataset.
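For the out-of-bounds case above, a sketch of the Cloud Storage plus load-job workaround follows. The bucket, object path, dataset, and table names are placeholder assumptions, and the command is echoed rather than run:

```shell
# Load previously exported log files from Cloud Storage into BigQuery with a
# load job, which accepts timestamps that streaming inserts reject.
# All names below are placeholders; drop the leading "echo" to execute for real.
SRC="gs://my-logs-bucket/syslog/2019/01/01/00:00:00_00:59:59_S0.json"
echo bq load --source_format=NEWLINE_DELIMITED_JSON my_dataset.my_table "$SRC"
```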

Errors exporting logs to Cloud Pub/Sub

The following are the most common errors when exporting logs to Cloud Pub/Sub:

  • Error: [ACCOUNT] needs edit permission on [PROJECT] to publish to [TOPIC].
    Cause: The sink's writer identity does not have the correct permissions to the topic.
    Solution: Add the necessary permissions to your project. See Destination permissions.

  • Error: Topic [TOPIC] does not exist.
    Cause: You might have deleted the topic that was configured to receive your exported logs.
    Solution: Either re-create the topic with the same name, or change the export configuration to use a different topic.

What's next
