This page provides a conceptual overview of exporting logs using Cloud Logging.
For instructions on how to export your logs, see Export your logs on this page.
You can export some or all of your logs to various sink destinations. You might want to export logs for the following reasons:
- To store logs that are unlikely to be read but that must be retained for compliance purposes.
- To use big-data analysis tools on your logs.
- To stream your logs to other applications, other repositories, or third parties.
For a further exploration of common logging export scenarios, refer to Design patterns for exporting logging data.
How exports work
The following diagram illustrates how Cloud Logging treats exported log entries:
All logs, including audit logs, platform logs, and user logs, are sent to the Cloud Logging API where they pass through the Logs Router. The Logs Router checks each log entry against existing rules to determine which log entries to ingest (store), which log entries to include in exports, and which log entries to discard. For more details, see Logs Router overview.
Exporting involves writing a filter that selects the log entries you want to export, and choosing a destination from the following options:
- Cloud Storage: JSON files stored in Cloud Storage buckets.
- BigQuery: Tables created in BigQuery datasets.
- Pub/Sub: JSON messages delivered to Pub/Sub topics. Supports third-party integrations, such as Splunk, with Logging.
- Another Google Cloud project: Log entries held in Cloud Logging logs buckets.
The filter and destination are held in an object called a sink. Sinks can be created in Google Cloud projects, organizations, folders, and billing accounts.
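As an illustrative sketch, a sink can be thought of as a small configuration object pairing a filter with a destination. The following Python dict mirrors the general shape of a sink as it might be sent to the Logging API; the sink name, filter, bucket, and project are all hypothetical:

```python
# A hypothetical sink configuration: the filter selects Compute Engine
# error logs, and the destination is a Cloud Storage bucket.
sink = {
    "name": "my-error-sink",  # sink identifier (hypothetical)
    "filter": 'resource.type="gce_instance" AND severity>=ERROR',
    "destination": "storage.googleapis.com/my-log-bucket",  # hypothetical bucket
}

# The full resource name combines the parent resource and the sink identifier.
parent = "projects/my-project"  # could also be a folder, billing account, or organization
full_name = f"{parent}/sinks/{sink['name']}"
print(full_name)  # → projects/my-project/sinks/my-error-sink
```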
Sink properties and terminology
Sinks have the following properties:
Sink identifier: A name for the sink.
Parent resource: The resource in which you create the sink. The parent is most often a project, but it can be any of the following:
- "projects/[PROJECT_ID]"
- "folders/[FOLDER_ID]"
- "billingAccounts/[BILLING_ACCOUNT_ID]"
- "organizations/[ORGANIZATION_ID]"
The sink can only export logs that belong to its parent resource. For the one exception to this rule, see the following Aggregated exports property.
The full resource name of a sink includes its parent resource and sink identifier. For example: "projects/[PROJECT_ID]/sinks/[SINK_ID]"
Logs filter: Selects which log entries to export through this sink. For filter examples, go to Sample queries.
Destination: A single place to send the log entries matching your filter. The following are the supported destinations:
Cloud Storage buckets provide inexpensive, long-term storage.
BigQuery datasets provide big data analysis capabilities.
Pub/Sub topics stream your log entries to other applications or repositories.
Logging logs buckets provide storage with customizable retention periods.
In the Logs Explorer, you can also use the Custom destination option when creating an export to send your logs from one project to a destination in another project. For more information, go to Creating sinks.
You can export logs to destinations in any project, so long as the export destination authorizes the sink's service account as a writer.
Writer identity: A service account name. The export destination's owner must give this service account permission to write to the export destination. When exporting logs, Logging adopts this identity for authorization. For increased security, new sinks get a unique service account.
For more information, go to destination permissions.
Aggregated exports: The includeChildren property is described in Aggregated exports. It is only relevant to sinks created for organizations or folders.
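To make the destination formats concrete, the following sketch builds the service-qualified path for each supported destination type. The path shapes follow the conventions used by the Logging API, but the project, dataset, topic, and bucket names here are hypothetical:

```python
# Hypothetical destination paths for each supported sink destination type.
project = "my-project"

destinations = {
    "cloud_storage": "storage.googleapis.com/my-logs-bucket",
    "bigquery": f"bigquery.googleapis.com/projects/{project}/datasets/my_dataset",
    "pubsub": f"pubsub.googleapis.com/projects/{project}/topics/my-topic",
    # Logging logs buckets also carry a location ("global" here).
    "logging_bucket": (
        f"logging.googleapis.com/projects/{project}"
        "/locations/global/buckets/my-log-bucket"
    ),
}

for kind, path in destinations.items():
    print(f"{kind}: {path}")
```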
How sinks work
Every time a log entry arrives in a project, folder, billing account, or organization resource, Logging compares the log entry to the sinks in that resource. Each sink whose filter matches the log entry writes a copy of the log entry to the sink's export destination.
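The matching behavior described above can be sketched as a small simulation. The filters here are plain Python predicates standing in for Logging's filter language, and the sink and destination names are hypothetical:

```python
# Each sink pairs a filter predicate with a destination path.
sinks = [
    {
        "name": "errors",
        "matches": lambda e: e["severity"] == "ERROR",
        "destination": "storage.googleapis.com/error-archive",
    },
    {
        "name": "audit",
        "matches": lambda e: e["log"].endswith("cloudaudit.googleapis.com"),
        "destination": "bigquery.googleapis.com/projects/p/datasets/audit",
    },
]

def route(entry):
    """Return the destinations that receive a copy of this log entry.

    Every sink whose filter matches gets its own copy, so one entry
    can be exported to several destinations at once.
    """
    return [s["destination"] for s in sinks if s["matches"](entry)]

entry = {"severity": "ERROR", "log": "projects/p/logs/syslog"}
print(route(entry))  # only the "errors" sink matches
```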
To create or modify a sink, you must have the Identity and Access Management roles Owner or Logging/Logs Configuration Writer in the sink's parent resource. To view existing sinks, you must have the IAM roles Viewer or Logging/Logs Viewer in the sink's parent resource. For more information, go to Access control.
To export logs to a destination, the sink's writer service account must be permitted to write to the destination. For more information about writer identities, read Sink properties on this page.
Cloud Logging doesn't charge to export logs, but destination charges might apply. For details, review the appropriate product's pricing page:
Note also that if you export your Virtual Private Cloud flow logs and then exclude them from Cloud Logging, VPC flow log generation charges apply in addition to the destination charges.
Export your logs
To learn how to export your logs, review the following pages:
- To use the Logs Router, go to Exporting with the Logs Router.
- To use the Logging API, go to Exporting logs in the API.
- To use the gcloud command-line tool, go to gcloud logging.
Find and use your exported logs
To learn about the format of exported log entries and how the exported logs are organized in destinations, go to Using exported logs.
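As one concrete illustration, logs exported to Cloud Storage are organized under the bucket into folders by log name and date. The sketch below builds such an object prefix; the bucket and log names are hypothetical, and the naming of the individual shard files within each date folder may differ:

```python
from datetime import date

def gcs_export_prefix(bucket, log_name, day):
    # Exported Cloud Storage objects are grouped by log name and date:
    # [BUCKET]/[LOG_NAME]/YYYY/MM/DD/
    return f"{bucket}/{log_name}/{day:%Y/%m/%d}/"

print(gcs_export_prefix("my-logs-bucket", "syslog", date(2020, 5, 17)))
# → my-logs-bucket/syslog/2020/05/17/
```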
Explore Logging export scenarios
The following tutorials describe scenarios for which you might want to export logs. Each tutorial details the requirements, setup, and usage, and shows how to share the exports.
- Scenario – Export for compliance requirements
- Scenario – Export for security and access analytics
- Scenario – Export to Splunk
- Scenario – Export to Elasticsearch