If you are not familiar with exporting logs in Stackdriver Logging, see Overview of Logs Export. In summary, you export logs by creating one or more sinks that include a logs filter and a destination. As Stackdriver Logging receives new log entries, they are compared against each sink. If a log entry matches a sink's filter, then a copy of the log entry is written to the destination.
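The dispatch rule above can be sketched in a few lines. This is a conceptual model only, not Stackdriver's actual implementation: the Sink class, the predicate-style filters, and the list destinations are all illustrative stand-ins.

```python
# Illustrative model of sink dispatch: every new log entry is compared
# against every sink, and a copy is written to each destination whose
# filter matches. One entry can match several sinks.
class Sink:
    def __init__(self, name, log_filter, destination):
        self.name = name
        self.filter = log_filter          # predicate over a log entry
        self.destination = destination    # a list standing in for a real destination

    def maybe_export(self, entry):
        if self.filter(entry):
            self.destination.append(entry)  # "write a copy"

bucket, topic = [], []
sinks = [
    Sink("errors-to-storage", lambda e: e["severity"] == "ERROR", bucket),
    Sink("all-to-pubsub", lambda e: True, topic),
]

entry = {"severity": "ERROR", "textPayload": "disk full"}
for sink in sinks:
    sink.maybe_export(entry)
# entry now appears in both bucket and topic
```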
Advantages and limitations
The Logs Viewer has the following advantages over using the Stackdriver Logging API:
- The Logs Viewer shows all of your sinks in one place.
- The Logs Viewer shows you which log entries are matched by your sink filter before you create a sink.
- The Logs Viewer can create and authorize destinations for your sinks.
The Logs Viewer has the following limitations compared to the Stackdriver Logging API:
- You can only create sinks in projects. You cannot create sinks in organizations, folders, or billing accounts.
- You cannot create or edit v1 sinks.
Before you begin
Project: You must have a Google Cloud Platform project with logs that you can see in the Logs Viewer. You must have the Owner role or the Logging/Logs Configuration Writer IAM role in the project to create, delete, or modify a sink. For getting-started information, see the Stackdriver Logging Quickstart and the IAM Quickstart.
Understanding the user interface
Sinks are configured in the Exports page of the Logs Viewer.
The following screenshot shows an example of the Logs Viewer's Exports page. The page shows both v1 and v2 sinks:
V1 sink section
If you have any v1 sinks, they are listed in the v1 section of the Exports page. The information shown consists of some of the sink properties described in Overview of Logs Export, with the following exception:
- Sink Type is the v1 API sink type:
- Log: This sink exports all log entries from a named log.
- Log Service: This sink exports all log entries from a named log service, such as Google Compute Engine.
- Project: This sink exports log entries that match a filter, like v2 sinks. However, the output will be in the v1 log entry format.
Each v1 sink has the following options in its right-side More menu:
View as v2: Displays the Create Export page, filling in the v2 filter to mimic the v1 sink. You can complete the creation of a new v2 sink, or cancel by clicking the Back button on your browser.
Schedule v2 conversion: Initiates an automatic conversion of the v1 API sink to an equivalent v2 API sink. The new sink uses the same destination as the old sink.
You must specify a date for the conversion to take effect; the time is fixed at midnight GMT at the end of that date. Log entries timestamped before the conversion time are exported by your existing v1 sink; log entries timestamped after it are exported by your new v2 sink.
Midnight GMT is also when Stackdriver Logging instructs Cloud Storage and BigQuery to begin using new buckets or tables for exported log entries. That means your v1 and v2 exported log entries will not be mixed in the same bucket or table.
In the case of Cloud Pub/Sub destinations, the v1 and v2 log entries are likely to be mixed in the stream sent to your topic. The stream of log entries is not necessarily in timestamp order. Your topic subscribers should be prepared for the changeover.
Delete sink: Deletes the sink.
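The conversion cutoff rule above can be made concrete with a short sketch. The chosen date below is just an example; "midnight GMT at the end of a date" means 00:00 UTC on the following day.

```python
from datetime import datetime, date, timedelta, timezone

def conversion_cutoff(chosen: date) -> datetime:
    """Midnight GMT at the *end* of the chosen date, i.e. 00:00 UTC the
    next day. Entries timestamped before this instant are exported by the
    v1 sink; entries timestamped after it go to the new v2 sink."""
    return datetime.combine(chosen + timedelta(days=1),
                            datetime.min.time(), tzinfo=timezone.utc)

cutoff = conversion_cutoff(date(2016, 11, 30))
# cutoff is 2016-12-01 00:00:00 UTC
```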
V2 sink section
If you have any v2 sinks, they are listed in the v2 section. The information shown corresponds to some of the sink properties described in Overview of Logs Export:
- Sink Name: The sink's identifier in the current project.
- Destination: Where the exported log entries will go.
- Writer Identity: The service account that Stackdriver Logging uses to write log entries to the destination. This service account must have permission to write to the sink's destination.
Each v2 sink has the following options in its right-side More menu:
- Edit sink: Lets you update the sink's parameters.
- Delete sink: Deletes the sink.
- View filter: Displays the sink's filter. Clicking Edit will let you update the sink.
To create an export sink, click Create Export at the top of the Logs Viewer page. The following screenshot shows the creation panel with some fields filled in:
Fill in the sink creation panel as follows:
(filter): Enter an advanced logs filter. You don't need quotation marks around the filter and you can use multiple lines. The filter in the preceding sample specifies the Nginx error logs from all your Google App Engine applications.
Whenever you edit the filter, press Submit Filter to display the matched log entries in the area at the bottom of the page. Press the Jump to newest logs icon at the top of the page to fetch the most recent logs.
If you wish to use the basic viewing interface to select the logs, use the ▾ menu at the right side of the filter box.
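As an illustration of a multi-line filter, the sketch below builds one resembling the App Engine nginx example described above. The exact logName is an assumption; check the log names that actually appear in your project.

```python
# A hypothetical advanced logs filter for nginx error logs from
# App Engine apps. Lines are implicitly ANDed together, and the
# ":" operator means "contains" in the advanced filter syntax.
log_filter = "\n".join([
    'resource.type="gae_app"',
    'logName:"nginx.error"',   # placeholder; your project's log name may differ
])
print(log_filter)
```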
Sink name: Enter the identifier you want to assign to the sink.
Sink Service: Select a destination service: Cloud Storage, Cloud Pub/Sub, or BigQuery.
Sink Destination: Select or create the particular bucket, topic, or dataset to receive the exported logs.
Click Create Sink to create the sink.
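Behind the scenes, creating a v2 sink corresponds to a projects.sinks.create API call with a LogSink body. The sketch below shows the shape of such a request; the project ID, sink name, filter, and bucket name are all placeholders, not values from this document.

```python
import json

# Hypothetical request body for projects.sinks.create; all names are
# placeholders. Passing uniqueWriterIdentity=true asks Stackdriver
# Logging to mint a dedicated service account for this sink.
sink_body = {
    "name": "my-error-sink",
    "filter": "severity>=ERROR",
    "destination": "storage.googleapis.com/my-logs-bucket",
}
request_path = "v2/projects/{}/sinks?uniqueWriterIdentity=true".format("my-project")
print(request_path)
print(json.dumps(sink_body))
```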
As part of creating the sink, Stackdriver Logging attempts to grant the sink's writer identity permission to write to your destination. If you are exporting to a destination in a project other than the one owning your logs, then an administrator of the new destination must grant permission. You should send the administrator the sink's writer identity, which is listed with the sink in the Exports page.
New log entries that match your sink's filter will start being exported. Log entries going to BigQuery or Cloud Pub/Sub are streamed to those destinations immediately. Log entries going to Cloud Storage are batched and sent out approximately every hour. For more information, see Exported logs availability.
If Stackdriver Logging encounters errors when trying to export logs to your destination, the errors appear in your project's Activity Stream. Select Activity at the top of your project's home page in Google Cloud Platform Console. To diagnose common errors, see Troubleshooting.
To update a v2 sink, select the Edit sink command in the More menu to the right of the sink's name. You can change the parameters shown in the edit panel. To change other parameters of your sinks, use the projects.sinks.update API method.
You cannot update v1 sinks in the Logs Viewer.
To delete a sink, select the sink in the Exports page and press Delete at the top of the page. Alternatively, select Delete sink from the More menu to the right of the sink's name.
Destination authorization

This section describes how you can grant Stackdriver Logging permission to write exported logs to your sink's destination.
When you create a sink, Stackdriver Logging creates a new service account for the sink, called a unique writer identity. Your destination must permit this service account to write log entries. To set up this permission, follow these steps:
Create the new sink in the Logs Viewer, the gcloud logging command-line interface, or the Stackdriver Logging API.
If you created your sink in the Logs Viewer and you have Owner access to the destination, then the Logs Viewer should have set up the needed permission on your behalf. If it did so, you are done; if not, continue.
Obtain the sink's writer identity—an email address—from the new sink:
- If you are using the Logs Viewer, you can see the writer identity in the sink listing on the Exports page.
- If you are using the Stackdriver Logging API, you can get the writer identity from the LogSink object.
- If you are using gcloud logging, you can see the writer identities when you list your sinks.
If you have Owner access to the destination, then add the service account to the destination in the following way:
- For Cloud Storage destinations, add the sink's writer identity to your bucket and give it Owner permission.
- For BigQuery destinations, add the sink's writer identity to your dataset and give it Can edit permission.
- For Cloud Pub/Sub, add the sink's writer identity to your topic and give it the role Pub/Sub Publisher.
This completes the authorization. You are done.
If you do not have Owner access to the destination, then send the writer identity service account name to someone who has Owner access. That person should then follow the instructions in the previous step to add the writer identity to the destination.
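For a Cloud Pub/Sub destination, the grant described above amounts to adding the writer identity to the topic's IAM policy with the Pub/Sub Publisher role. The sketch below models that policy edit; the service-account address is a made-up example, and the real one is listed with your sink.

```python
# Hypothetical writer identity; the real address is shown in the Exports page.
writer_identity = "serviceAccount:my-sink@logging-12345.iam.gserviceaccount.com"

def add_publisher(policy, member):
    """Add member to the roles/pubsub.publisher binding, creating the
    binding if it is absent. Mirrors a read-modify-write IAM policy edit."""
    for binding in policy["bindings"]:
        if binding["role"] == "roles/pubsub.publisher":
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": "roles/pubsub.publisher",
                               "members": [member]})
    return policy

policy = {"bindings": []}
add_publisher(policy, writer_identity)
```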
If a sink tries to export a log entry but lacks the needed permission on the destination, it reports an error and skips the log entry. This continues until the permission is granted, at which point the sink begins exporting new log entries.

There is an unavoidable delay between creating the sink and using the sink's new service account to authorize writing to the destination. You can safely ignore any error messages from the sink during this delay.
Troubleshooting

This section lists some possible errors and unexpected results, and explains what to do about them.
Errors from sinks appear in the Activity Stream for the project or other resource where the sink was created. See the Activity Stream in the resource's home page in Cloud Platform Console.
| Error | Cause | What to do |
|---|---|---|
| Your new log entries are exported, but your older log entries are not. | Stackdriver Logging exports only log entries that are received after the export has been set up. | Use the entries.list API method to retrieve your older log entries, then use the destination service's API to write them to the export destination. |
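To backfill older entries as described above, you would page through entries.list and write the results yourself. The sketch below shows the shape of an entries.list request body; the project ID and timestamp cutoff are placeholders.

```python
# Hypothetical request body for the v2 entries.list method, retrieving
# entries older than a cutoff in ascending timestamp order. Page through
# results by re-sending the request with the returned nextPageToken.
list_body = {
    "resourceNames": ["projects/my-project"],          # placeholder project
    "filter": 'timestamp<"2017-01-01T00:00:00Z"',      # placeholder cutoff
    "orderBy": "timestamp asc",
    "pageSize": 1000,
}
print(list_body)
```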
Errors exporting to Cloud Storage
The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Storage:
| Error | Cause | What to do |
|---|---|---|
| Permissions on bucket [YOUR_BUCKET] do not allow the logs group to create new objects. | The sink's writer identity does not have enough permissions to the bucket. | Add the necessary permissions to the bucket, or update your sink to use a different bucket. See Destination authorization. |
| No bucket with name: [YOUR_BUCKET]. | There might be an error in the bucket name, or the bucket might have been deleted. | Update your sink with the correct bucket destination. |
Errors exporting to BigQuery
The following table lists the most common errors when you configure Stackdriver Logging to export logs to BigQuery:
| Error | Cause | What to do |
|---|---|---|
| Permissions on dataset [YOUR_DATASET] do not allow the logs group to create new tables. | To write logs to a BigQuery dataset, you must grant the sink's writer identity either Can edit permission or the Writer role. | Add the permission to the dataset. See Destination authorization. |
| No dataset with name: [YOUR_DATASET]. | You might have an error in your sink's destination, or someone might have deleted the dataset. | Either re-create the dataset or update the export sink to use a different dataset. |
| Logs streamed to table [YOUR_TABLE] in dataset [YOUR_DATASET] do not match the table schema. | You are trying to export logs that are incompatible with the current table's schema. | Make sure nothing other than log entries is being written to the same table. You can remove or rename the table and let Stackdriver Logging create the table again. |
| Per-table streaming insert quota has been exceeded for table [YOUR_TABLE] in dataset [YOUR_DATASET]. | You are exporting too many log entries too quickly. See the BigQuery default quota limits, which apply to logs streaming. | If you exceed the quota, contact a sales representative to increase your per-table or per-project quotas. |
Errors exporting logs to Cloud Pub/Sub
The following table lists the most common errors when you configure Stackdriver Logging to export logs to Cloud Pub/Sub:
| Error | Cause | What to do |
|---|---|---|
| [ACCOUNT] needs edit permission on [PROJECT] to publish to [TOPIC]. | The specified account does not have Can edit permission in your project. | Add the necessary permissions to your project. See Destination authorization. |
| Topic [TOPIC] does not exist. | You might have deleted the topic that was configured to receive your exported logs. | Either re-create the topic with the same name, or change the export configuration to use a different topic. |