Quotas and limits

This page provides details on the limits that apply to using Cloud Logging.

For pricing information, see the Logging pricing summary.

Logging usage limits

The following limits apply to the usage of Cloud Logging. These limits are fixed; you can't increase or decrease them.

| Category | Maximum value |
| --- | --- |
| Size of a log entry | 256 KB¹ |
| Size of an audit log entry | 512 KiB |
| Number of labels per log entry | 64 |
| Length of a log entry label key | 512 B² |
| Length of a log entry label value | 64 KB² |
| Length of a query | 20,000 characters |
| Query fanout³ | 200 buckets |
| Number of sinks | 200 per Cloud project, billing account, folder, and organization³ |
| Length of a sink inclusion filter | 20,000 characters |
| Length of a sink exclusion filter | 20,000 characters |
| Number of exclusion filters | 50 per sink |
| Number of log buckets | 100 per Cloud project⁴, ⁵ |
| Number of custom indexed fields | 20 per bucket |
| Number of log views | 30 per bucket |
| Oldest timestamp ingestible to log buckets⁶ | 30 days in the past |
| Future timestamp ingestible to log buckets⁶ | Up to 1 day in the future |

¹ This approximate limit is based on internal data sizes, not the actual REST API request size.

² Cloud Logging truncates oversized label keys and values when their associated log entry is written.

³ The limits for sinks apply to the resource on which they are configured and aren't hierarchical. For example, if you have multiple Cloud projects in an organization, then you could configure up to 200 sinks for each Cloud project; for that same organization, you could also configure up to 200 sinks at the organization level.

⁴ This limit is the maximum number of buckets that might contain log entries for a resource. For more information, see Query returns an error.

⁵ This limit includes buckets that are pending deletion.

⁶ Log entries with timestamps outside these boundaries aren't stored in log buckets. The Logging API accepts entries with timestamps more than 30 days in the past, and those entries can be routed to sink destinations, but they aren't stored in log storage. The Logging API rejects entries with timestamps more than 1 day in the future and returns an INVALID_ARGUMENT error.
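Because oversized labels are silently truncated and out-of-range timestamps affect whether an entry reaches log storage, it can be useful to check entries before writing them. The following is an illustrative client-side sketch, not part of any Cloud Logging client library; the limit constants are transcribed from the tables above and the helper names are invented for this example:

```python
from datetime import datetime, timedelta, timezone

# Limits transcribed from the tables above (pre-check only; the
# service enforces the authoritative values).
MAX_LABELS = 64
MAX_LABEL_KEY_BYTES = 512          # oversized keys are truncated, not rejected
MAX_LABEL_VALUE_BYTES = 64 * 1024  # oversized values are truncated, not rejected
OLDEST_OFFSET = timedelta(days=30)
FUTURE_OFFSET = timedelta(days=1)

def check_labels(labels: dict) -> list:
    """Return warnings for labels that Logging would truncate."""
    warnings = []
    if len(labels) > MAX_LABELS:
        warnings.append(f"{len(labels)} labels exceeds the limit of {MAX_LABELS}")
    for key, value in labels.items():
        if len(key.encode("utf-8")) > MAX_LABEL_KEY_BYTES:
            warnings.append(f"key {key[:20]!r}... would be truncated")
        if len(value.encode("utf-8")) > MAX_LABEL_VALUE_BYTES:
            warnings.append(f"value for {key[:20]!r} would be truncated")
    return warnings

def storable_in_buckets(ts: datetime, now: datetime = None) -> bool:
    """True if a log entry timestamp falls inside the ingestible window."""
    now = now or datetime.now(timezone.utc)
    return now - OLDEST_OFFSET <= ts <= now + FUTURE_OFFSET
```

Note that byte lengths are measured on the UTF-8 encoding, so multi-byte characters count more than once toward the label limits.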

Logging API quotas and limits

The following limits apply to your usage of the Logging API. You can request changes to your Logging API quotas and limits; for instructions, see Requesting changes to Cloud Logging API quota on this page.

| Category | Maximum value |
| --- | --- |
| API usage | To view your quotas, go to the API dashboard. Click an API and select Quotas. |
| Lifetime of API page tokens | 24 hours |
| Number of open live-tailing sessions per Cloud project | 10 |
| Number of live-tailing entries returned per minute | 60,000 |
| Number of restricted fields | 20 per bucket |
| Size of a restricted field | 800 B |
| Size of an entries.write request | 10 MB |
| Number of entries.write API calls | 120,000 per minute, per Cloud project¹ |
| Number of entries.list API calls | 1 per second, per Cloud project² |
| Number of Cloud projects or other resource names in a single entries.list API call | 100 |
| Number of concurrent copy operations | 1 per Cloud project, folder, or organization |

¹ Using log exclusions doesn't reduce this number because logs are excluded after the entries.write call is made.

² This value is the default setting for each Cloud project.
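Because a single entries.write request is capped at 10 MB, high-volume writers typically split entries into batches that stay under the limit. A minimal batching sketch, assuming JSON serialization as a rough size estimate (the helper name is illustrative; real request overhead differs from this estimate):

```python
import json

# Limit transcribed from the table above.
MAX_REQUEST_BYTES = 10 * 1024 * 1024  # entries.write request size limit

def batch_entries(entries, max_bytes=MAX_REQUEST_BYTES):
    """Split entries into batches whose estimated serialized size
    stays under the entries.write request limit."""
    batches, current, current_size = [], [], 0
    for entry in entries:
        size = len(json.dumps(entry).encode("utf-8"))
        if current and current_size + size > max_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(entry)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Fewer, larger requests also make better use of the 120,000-calls-per-minute quota than writing entries one at a time.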

Requesting changes to Cloud Logging API quota

You can request higher or lower Logging API limits using the Google Cloud console:

  1. On the Quotas page, use the checkboxes to select Cloud Logging API, and then click EDIT QUOTAS.

    If you get the error Edit is not allowed for this quota, contact Support to request changes to the quota. Also note that billing must be enabled on the Cloud project before you can select the checkboxes.

  2. In the Quota changes panel, select the service to expand the view and then fill in the New limit and Request description fields. Click Next.

  3. Complete the form in the Contact details panel.

  4. Click Submit request.

For more information, go to Working with quotas.

Optimize usage of entries.list

To efficiently use the entries.list quota, try the following:

  • Set a large pageSize: In the request body, set the pageSize parameter to its maximum of 1,000. This allows Logging to return more entries per query, reducing the number of queries needed to retrieve the full set of entries that you're targeting.

  • Set a large deadline: When a query nears its deadline, Logging terminates the query early and returns the log entries scanned so far. If you set a large deadline, then Logging can retrieve more entries per query.

  • Retry quota errors with exponential backoff: If your use case isn't time-sensitive, then you can wait for the quota to replenish before retrying your query. The pageToken parameter is still valid after a delay.
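The guidance above can be combined into one retry loop. The sketch below is illustrative: `fetch_page` is a stand-in for the real entries.list call, and the backoff parameters are example values, not documented defaults:

```python
import time

def list_all_entries(fetch_page, page_size=1000, max_retries=5, initial_delay=1.0):
    """Drain a paginated entries.list-style API, retrying quota errors
    with exponential backoff.

    `fetch_page` stands in for the real API call: it takes
    (page_size, page_token), returns (entries, next_page_token), and
    raises RuntimeError when the per-project quota is exhausted.
    """
    entries, token, delay = [], None, initial_delay
    while True:
        try:
            page, token = fetch_page(page_size, token)
        except RuntimeError:            # quota exhausted; the token stays valid
            if max_retries <= 0:
                raise
            max_retries -= 1
            time.sleep(delay)
            delay = min(delay * 2, 64)  # exponential backoff, capped
            continue
        entries.extend(page)
        if not token:                   # no next page: the query is complete
            return entries
```

Because page tokens remain valid after a delay, the loop resumes exactly where it left off rather than restarting the query.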

If you need real-time or continuous access to your logs, then configure sinks to send your logs to Pub/Sub or BigQuery.

Logs-based metrics

The following limits apply to your usage of user-defined logs-based metrics. These limits are fixed; you can't increase or decrease them.

| Category | Maximum value |
| --- | --- |
| Number of labels per metric | 10 |
| Length of label value | 1,024 B |
| Length of label description | 800 B |
| Length of a filter | 20,000 characters |
| Length of metric descriptors | 8,000 B |
| Number of metric descriptors per Cloud project | 500 |
| Number of active time series¹ per metric | 30,000 |
| Histogram buckets per custom distribution metric | 200 |

¹ A time series is active if you have written data points to it within the last 24 hours.

Logs retention periods

The following Cloud Logging retention periods apply to log buckets, regardless of which types of logs the bucket contains or whether the logs were copied from another location:

| Log bucket | Default retention period | Custom retention |
| --- | --- | --- |
| _Required | 400 days | Not configurable |
| _Default | 30 days | Configurable |
| User-defined | 30 days | Configurable |

For the _Default and user-defined log buckets, you can configure Cloud Logging to retain your logs between 1 day and 3650 days. For information on setting retention rules, see Configure custom retention.
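The retention rules above can be checked client-side before calling the API. This is an illustrative sketch, not a Cloud Logging API call; the function name is invented and the bounds are transcribed from this section:

```python
# Retention bounds from the section above (validation sketch only).
MIN_RETENTION_DAYS = 1
MAX_RETENTION_DAYS = 3650  # roughly 10 years

def validate_retention(bucket_id: str, days: int) -> int:
    """Reject retention values that aren't allowed for a bucket.

    _Required is fixed at 400 days; _Default and user-defined buckets
    accept 1 to 3650 days.
    """
    if bucket_id == "_Required":
        raise ValueError("_Required retention (400 days) is not configurable")
    if not MIN_RETENTION_DAYS <= days <= MAX_RETENTION_DAYS:
        raise ValueError(
            f"retention must be {MIN_RETENTION_DAYS}-{MAX_RETENTION_DAYS} days"
        )
    return days
```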