Quotas and limits

This page provides details on the limits that apply to using Cloud Logging.

Logging usage limits

The following limits apply to the usage of Cloud Logging. With the exception of the limits on the number of log buckets and sinks, these limits are fixed; you can't increase or decrease them.

Category Maximum value
Size of a LogEntry 256 KB1
Size of an audit log entry 512 KiB
Number of labels 64 per LogEntry
Length of a LogEntry label key 512 B2
Length of a LogEntry label value 64 KB2
Length of a Logging query language query 20,000 characters
Query fanout4 200 buckets
Number of sinks 200 per Google Cloud project3
Length of a sink inclusion filter 20,000 characters
Length of a sink exclusion filter 20,000 characters
Number of exclusion filters 50 per sink
Number of log buckets 100 per Google Cloud project3,4,5
Number of custom indexed fields 20 per log bucket
Number of log views 30 per log bucket
Oldest timestamp that can be stored in log buckets6 30 days in the past
Future timestamp that can be stored in log buckets6 Up to 1 day in the future
Number of log scopes per resource7 100
Number of log views and projects included in a log scope7 100
Number of projects included in a log scope7 5

1 This approximate limit is based on internal data sizes, not the actual REST API request size.

2 Cloud Logging truncates oversized label keys and values when their associated log entry is written.

3 This limit also applies to billing accounts, folders, and organizations and isn't hierarchical. For example, if you have multiple Google Cloud projects in an organization, then you could configure up to 200 sinks for each Google Cloud project; for that same organization, you could also configure up to 200 sinks at the organization level.

4 This limit is the maximum number of buckets that might contain log entries for a resource. For more information, see Query returns an error.

5 This limit includes buckets that are pending deletion.

6 Log entries with timestamps outside of these boundaries are rejected from log buckets. The Logging API accepts entries with older timestamps, and those entries can be routed to sink destinations but not stored in log storage. The Logging API rejects entries with timestamps more than 1 day in the future and returns an INVALID_ARGUMENT error.

7 Log scopes are in Public Preview.
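To see how several of these per-entry limits interact, the following minimal sketch uses the Python client library for Cloud Logging (google-cloud-logging) to write a structured entry with labels and an explicit timestamp. The project ID, log name, and label values are hypothetical; per the table above, an entry can carry at most 64 labels, oversized label keys and values are truncated when the entry is written, and timestamps more than 30 days in the past or more than 1 day in the future aren't stored in log buckets.

```python
from datetime import datetime, timedelta, timezone

from google.cloud import logging  # pip install google-cloud-logging

client = logging.Client(project="my-project")   # hypothetical project ID
logger = client.logger("quota-demo-log")        # hypothetical log name

# Timestamps must fall within the storage window: no more than 30 days in
# the past and no more than 1 day in the future, or the entry isn't stored.
ts = datetime.now(timezone.utc) - timedelta(days=7)

# Labels: at most 64 per entry; keys over 512 B and values over 64 KB are
# truncated when the entry is written.
labels = {"env": "prod", "service": "checkout"}

logger.log_struct(
    {"message": "order processed", "order_id": "12345"},
    labels=labels,
    timestamp=ts,
    severity="INFO",
)
```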

Logging API quotas and limits

The following limits apply to your usage of the Logging API. You can request changes to your Logging API quotas and limits; for instructions, see Request changes to Cloud Logging API quota on this page.

Category Maximum value
API usage To view your quotas, go to the API dashboard, select an API, and then select Quotas.
Lifetime of API page tokens 24 hours
Number of open live-tailing sessions 10 per Google Cloud project3
Number of live-tailing entries returned 60,000 per minute
Number of restricted fields 20 per bucket
Size of a restricted field 800 B
Size of an entries.write request 10 MB
Number of entries.write requests 120,000 per minute, per Google Cloud project1, 3
Number of entries.list requests 60 per minute, per Google Cloud project2, 3
Number of different resource names in a single entries.write command4 1000
Control requests5 per minute 600
Control requests5 per day 1,000 per Google Cloud project
Number of Google Cloud projects or other resource names in a single entries.list request 100
Number of concurrent copy operations 1 per Google Cloud project3
Rate of exports to a Pub/Sub topic6 60 GB per minute per Google Cloud project

1 Using exclusion filters doesn't reduce this number because logs are excluded after the entries.write request is made.

2 This value is the default setting.

3 This limit also applies to billing accounts, folders, and organizations and isn't hierarchical.

4 The logName field of a log entry specifies the log entry's resource name.

5 The daily control-request quota applies to API requests for creating and updating exclusions and sinks. The per-minute control-request quota covers everything in the daily control-request quota, plus API requests for deleting logs and for managing log-based metrics.

6 If the rate of exports exceeds the quota, then the error is recorded in a log entry: the summary field indicates a sink configuration error, and the error code is listed as topic_over_quota.
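Because each entries.write call counts against the per-minute request quota regardless of exclusion filters (see footnote 1), grouping entries into fewer, larger requests is one way to stay within the 120,000 requests-per-minute limit while keeping each request under 10 MB. The following is a minimal sketch using the Python client library's batching support; the log name and payloads are hypothetical.

```python
from google.cloud import logging  # pip install google-cloud-logging

client = logging.Client()
logger = client.logger("batch-demo-log")  # hypothetical log name

# A Batch buffers entries locally and sends them in a single entries.write
# request when committed, so the 100 entries below consume one request
# against the per-minute quota instead of 100.
batch = logger.batch()
for i in range(100):
    batch.log_struct({"message": "event", "sequence": i}, severity="INFO")
batch.commit()
```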

Request changes to Cloud Logging API quota

You can request higher or lower Logging API limits using the Google Cloud console. For more information, see View and manage quotas.

If you get the error Edit is not allowed for this quota, then contact Support to request changes to the quota. Also note that billing must be enabled on the Google Cloud project before you can select the checkboxes.

Optimize usage of entries.list

The expected usage of entries.list is to search for matching logs. This method isn't intended for high-volume retrieval of log entries. If you're regularly exhausting your entries.list quota, then consider the following options:

  • Ensure that you are using the Cloud Logging API effectively. For more information, see Optimize usage of the API.

  • If you know in advance that the log entries you want to analyze exceed the entries.list quota, then configure a log sink to export your logs to a supported destination; a minimal sketch of creating such a sink follows this list.

  • To analyze log entries outside of Logging, you can retroactively copy log entries that already exist in Logging to Cloud Storage buckets. When you copy logs to a Cloud Storage bucket, you can share log entries with auditors outside of Logging, and run scripts in Cloud Storage.
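As a sketch of the sink option mentioned above, the following uses the Python client library for Cloud Logging to create a sink that routes matching entries to a Cloud Storage bucket. The sink name, filter, and bucket are hypothetical; the bucket must already exist and grant the sink's writer identity permission to write. Per the limits earlier on this page, a sink's inclusion filter can be up to 20,000 characters and each Google Cloud project allows up to 200 sinks.

```python
from google.cloud import logging  # pip install google-cloud-logging

client = logging.Client()

# Route Compute Engine entries at WARNING and above to Cloud Storage.
# Sink name, filter, and destination bucket are hypothetical.
sink = client.sink(
    "warnings-to-gcs",
    filter_='resource.type="gce_instance" AND severity>=WARNING',
    destination="storage.googleapis.com/my-log-archive-bucket",
)

if not sink.exists():
    sink.create()
    print(f"Created sink {sink.name}")
```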

Log-based metrics

The following limits apply to your usage of user-defined log-based metrics. With the exception of the number of metric descriptors, these limits are fixed; you can't increase or decrease them.

Category Maximum value
Number of labels 10 per metric
Length of label value 1,024 B
Length of label description 800 B
Length of a filter 20,000 characters
Length of a metric descriptor 8,000 B
Number of metric descriptors 500 per Google Cloud project2
Number of active time series1 30,000 per metric
Number of histogram buckets 200 per custom distribution metric
Data retention See Cloud Monitoring: Data retention

1 A time series is active if you have written data points to it within the last 24 hours.

2 This limit also applies to billing accounts, folders, and organizations and isn't hierarchical.
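As a minimal sketch of a user-defined log-based metric that stays within these limits, the following uses the Python client library for Cloud Logging; the metric name, description, and filter are hypothetical. Per the table above, the filter can be up to 20,000 characters, a metric can carry at most 10 labels, and a Google Cloud project can hold up to 500 metric descriptors.

```python
from google.cloud import logging  # pip install google-cloud-logging

client = logging.Client()

# Counter metric for 5xx responses; the name, description, and filter
# are hypothetical. The filter may be up to 20,000 characters long.
metric = client.metric(
    "server-error-count",
    filter_='resource.type="gce_instance" AND httpRequest.status>=500',
    description="Count of log entries with 5xx HTTP responses.",
)

if not metric.exists():
    metric.create()
```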

Audit logging

The maximum sizes of audit logs are shown in the following table. These values can help you estimate the space you need in your sink destinations.

Audit log type Maximum size
Admin Activity 512 KiB
Data Access 512 KiB
System Event 512 KiB
Policy Denied 512 KiB

Logs retention periods

The following Cloud Logging retention periods apply to log buckets, regardless of which types of logs are included in the bucket or whether they were copied from another location. The retention information is as follows:

Bucket Default retention period Custom retention
_Required 400 days Not configurable
_Default 30 days Configurable
User-defined 30 days Configurable

For the _Default and user-defined log buckets, you can configure Cloud Logging to retain your logs between 1 day and 3650 days. For information on setting retention rules, see Configure custom retention.
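As a sketch of setting a custom retention period programmatically, the following uses the generated v2 config client from the google-cloud-logging package, assuming that package's standard module layout; the project ID and the 90-day retention value are hypothetical, and the _Required bucket's 400-day retention can't be changed this way.

```python
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogBucket, UpdateBucketRequest
from google.protobuf import field_mask_pb2

config_client = ConfigServiceV2Client()

# Set a 90-day retention period on the _Default bucket; the allowed range
# is 1 to 3650 days. Project ID and retention value are hypothetical.
request = UpdateBucketRequest(
    name="projects/my-project/locations/global/buckets/_Default",
    bucket=LogBucket(retention_days=90),
    update_mask=field_mask_pb2.FieldMask(paths=["retention_days"]),
)
config_client.update_bucket(request=request)
```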

Pricing

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the _Required log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, for defining log scopes, or for queries issued through the Logs Explorer or Log Analytics pages.

For more information, see the following documents: