This article introduces a series of articles that explore design patterns and best practices for common logging export scenarios.
Cloud Logging provides an operational datastore for logs as well as rich export capabilities. You might export your logs for several reasons, such as retaining logs in long-term storage (months or years) to meet compliance requirements, or running data analytics against the metrics extracted from the logs. Cloud Logging can export logs to Cloud Storage, BigQuery, and Pub/Sub; through Pub/Sub, logs can also be forwarded to third-party destinations such as Elasticsearch.
Cloud Logging can export all logs for an organization by using aggregated sinks, or for a specific Google Cloud project by using log sinks. Using logging filters, you can include or exclude specific projects or cloud resources. For example, you could export all Compute Engine logs but exclude high-volume logs from Cloud Load Balancing. This approach gives you the flexibility to export all logs or only specific logs.
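As a sketch of the filtering approach, the following hypothetical project-level sink exports everything except Cloud Load Balancing request logs (the sink and bucket names are placeholders; the bucket must already exist):

```shell
# Create a project-level sink to Cloud Storage, excluding high-volume
# load balancer logs with a NOT clause in the sink's filter.
gcloud logging sinks create example-logs-sink \
  storage.googleapis.com/example-logs-bucket \
  --log-filter='NOT resource.type="http_load_balancer"'
```

Inverting the filter to `resource.type="gce_instance"` would instead export only Compute Engine logs.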
Using aggregated sinks, your organization can export logs from all projects or from a single folder. With this functionality, you can enforce logging export policy across all of your organization's projects. You can use organization-level IAM controls to limit which users can modify the logging export configuration.
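A minimal sketch of an organization-level aggregated sink, assuming a placeholder organization ID, project, and dataset; the `--include-children` flag is what makes the sink cover every project under the organization:

```shell
# Aggregated sink: route audit logs from all projects in the
# organization to a single BigQuery dataset.
gcloud logging sinks create org-audit-sink \
  bigquery.googleapis.com/projects/example-project/datasets/org_audit_logs \
  --organization=123456789012 \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```

The same command with `--folder=FOLDER_ID` instead of `--organization` scopes the sink to a single folder.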
As an alternative to aggregated sinks, you can enable logs export per project rather than for the entire organization. Per-project export is otherwise identical to aggregated sinks.
Ways to export
There are three ways to export logs from Cloud Logging:
- To files: JSON files stored in Cloud Storage.
- To BigQuery: logging tables created in a BigQuery dataset.
- To Pub/Sub: JSON messages delivered to a Pub/Sub topic.
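Each of the three destinations has its own sink destination URI format. A hedged sketch with placeholder resource names (the bucket, dataset, and topic must already exist, and each must grant write access to the sink's writer identity):

```shell
# Cloud Storage destination: JSON files written to a bucket.
gcloud logging sinks create storage-sink \
  storage.googleapis.com/example-bucket

# BigQuery destination: tables created in a dataset.
gcloud logging sinks create bq-sink \
  bigquery.googleapis.com/projects/example-project/datasets/example_dataset

# Pub/Sub destination: JSON messages delivered to a topic.
gcloud logging sinks create pubsub-sink \
  pubsub.googleapis.com/projects/example-project/topics/example-topic
```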
What gets exported
To learn about the types of logs available in Cloud Logging, see Available logs.
Depending on the type of log, there are three distinct logging payload formats.
- The contents are represented as a single string (the textPayload field). The logs reported by the Cloud Logging agent (including syslog) and the Cloud SQL logs are both examples of logs that use this format.
- The contents are represented as a protocol buffer (the protoPayload field) and vary depending on the specific content being logged. The Admin Activity and Data Access audit logs are both exported in this format. These logs have different JSON and table structures in BigQuery, based on the exported entry type.
- The contents are represented as a JSON object (the jsonPayload field) and vary depending on the specific content being logged. The activity logs from Compute Engine and the Compute Engine autoscaler are examples that use this format.
The schemas and fields documentation provides detailed information about mapping the log formats to BigQuery table and JSON export file structures. Consider the logging payload format when you write queries against the BigQuery export or when you parse exported files or Pub/Sub JSON messages. The detailed format of each log is listed in the API definition for the LogEntry type.
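In exported JSON, the three formats appear as the textPayload, protoPayload, and jsonPayload fields of a LogEntry. A minimal shell sketch of dispatching on those fields, using hypothetical one-line JSON entries (real exports contain full LogEntry objects, and a substring match like this is naive; a JSON parser is appropriate in production):

```shell
# Classify exported log entries by which payload field they carry.
# The sample entries below are simplified, hypothetical exports.
printf '%s\n' \
  '{"logName":"syslog","textPayload":"kernel: starting"}' \
  '{"logName":"activity","protoPayload":{"methodName":"v1.compute.instances.insert"}}' \
  '{"logName":"autoscaler","jsonPayload":{"state":"scaling"}}' |
while IFS= read -r entry; do
  case "$entry" in
    *'"textPayload"'*)  echo "string payload" ;;
    *'"protoPayload"'*) echo "protocol buffer payload" ;;
    *'"jsonPayload"'*)  echo "JSON object payload" ;;
  esac
done
```

The loop prints one classification per entry, in input order.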
Logging export scenarios
Articles in this series describe scenarios for which you might want to export logs. Each scenario details the requirements, setup, and usage, and shows how to share the exports.
- Scenario – Export for compliance requirements
- Scenario – Export for security and access analytics
- Scenario – Export to Splunk
- Scenario – Export to Elasticsearch
- Scenario – Export to Datadog
What's next
- Learn more about Cloud Logging routing.
- Read about exporting Cloud Logging logs to Elastic Cloud.
- Learn more about Cloud Audit Logs.
- Learn more about BigQuery.
- Explore reference architectures, diagrams, tutorials, and best practices about Google Cloud in the Cloud Architecture Center.