Quotas and limits
This document lists the quotas and limits that apply to BigQuery.
A quota restricts how much of a particular shared Google Cloud resource your Google Cloud project can use, including hardware, software, and network components.
Quotas are part of a system that does the following:
- Monitors your use or consumption of Google Cloud products and services.
- Restricts your consumption of those resources for reasons including ensuring fairness and reducing spikes in usage.
- Maintains configurations that automatically enforce prescribed restrictions.
- Provides a means to make or request changes to the quota.
When a quota is exceeded, in most cases, the system immediately blocks access to the relevant Google resource, and the task that you're trying to perform fails. In most cases, quotas apply to each Google Cloud project and are shared across all applications and IP addresses that use that Google Cloud project.
There are also limits on BigQuery resources. These limits are unrelated to the quota system. Limits cannot be changed unless otherwise stated.
By default, BigQuery quotas and limits apply on a per-project basis. Quotas and limits that apply on a different basis are indicated as such; for example, the maximum number of columns per table, or the maximum number of concurrent API requests per user. Specific policies vary depending on resource availability, user profile, Service Usage history, and other factors, and are subject to change without notice.
Quota replenishment
Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate-limiting behaviors. Intermittent refreshes also occur to avoid long disruptions when a quota is exhausted, so more quota is typically made available within minutes rather than being replenished globally once per day.
Request a quota increase
To increase or decrease most quotas, use the Google Cloud console. For more information, see Requesting a higher quota.
For step-by-step guidance through the process of requesting a quota increase in the Google Cloud console, click Guide me.
Cap quota usage
To learn how you can limit usage of a particular resource by specifying a smaller quota than the default, see Capping usage.
Required permissions
To view and update your BigQuery quotas in the Google Cloud console, you need the same permissions as for any Google Cloud quota. For more information, see Google Cloud quota permissions.
Troubleshoot
For information about troubleshooting errors related to quotas and limits, see Troubleshooting BigQuery quota errors.
Jobs
Quotas and limits apply to jobs that BigQuery runs on your behalf, whether they are run by using the Google Cloud console, the bq command-line tool, or programmatically using the REST API or client libraries.
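For example, a query job can be submitted programmatically with the BigQuery client library for Python. The sketch below assumes the google-cloud-bigquery package is installed and that application default credentials and a default project are configured; the public dataset queried is only an illustration.

```python
from google.cloud import bigquery

# Create a client; this uses application default credentials and the
# default project unless you pass project="your-project-id" explicitly.
client = bigquery.Client()

# Submitting a query creates a query job, which counts against the
# job-related quotas and limits described on this page.
sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""
query_job = client.query(sql)  # API request that starts the job

# Wait for the job to finish and iterate over the results.
for row in query_job.result():
    print(f"{row.name}: {row.total}")
```

The same job could equally be started from the console or with the bq command-line tool; all of these paths are subject to the quotas and limits listed in the following sections.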
Query jobs
The following quotas apply to query jobs created automatically by running interactive queries, scheduled queries, and jobs submitted by using the jobs.query and query-type jobs.insert API methods:
Quota | Default | Notes |
---|---|---|
Query usage per day | Unlimited | There is no limit to the number of bytes that can be processed by queries in a project. View quota in Google Cloud console |
Query usage per day per user | Unlimited | There is no limit to the number of bytes that a user's queries can process each day. View quota in Google Cloud console |
Cloud SQL federated query cross-region bytes per day | 1 TB | If the BigQuery query processing location and the Cloud SQL instance location are different, then your query is a cross-region query. Your project can run up to 1 TB in cross-region queries per day. See Cloud SQL federated queries and the example following this table. View quota in Google Cloud console |
Cross-cloud transferred bytes per day | 1 TB | You can transfer up to 1 TB of data per day from an Amazon S3 bucket or from Azure Blob Storage. For more information, see Cross-cloud transfer from Amazon S3 and Azure. View quota in Google Cloud console |
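The cross-region quota above applies to Cloud SQL federated queries, which are written with the EXTERNAL_QUERY function. The following sketch, using the Python client library, is illustrative only: the project, connection ID, and the SQL sent to the Cloud SQL instance are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# EXTERNAL_QUERY pushes the inner SQL down to the Cloud SQL instance that
# the connection points at. If the connection's location differs from the
# BigQuery processing location, the bytes returned count against the
# "Cloud SQL federated query cross-region bytes per day" quota.
# The connection ID and table name below are hypothetical placeholders.
sql = """
    SELECT *
    FROM EXTERNAL_QUERY(
        'my-project.us.my-cloudsql-connection',
        'SELECT customer_id, created_at FROM customers;')
"""
for row in client.query(sql).result():
    print(row)
```

Keeping the BigQuery processing location and the Cloud SQL instance in the same region avoids consuming this cross-region quota altogether.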
The following limits apply to query jobs created automatically by running interactive queries, scheduled queries, and jobs submitted by using the jobs.query and query-type jobs.insert API methods:
Limit | Default | Notes |
---|---|---|
Maximum number of concurrent interactive queries run in a reservation | 100 queries | Your project can run up to 100 concurrent interactive queries in a reservation. Queries with results that are returned from the query cache count against this limit for the duration it takes for BigQuery to determine that it is a cache hit. Dry-run queries don't count against this limit; you can specify a dry-run query by using the --dry_run flag (see the dry-run example after this table). For information about strategies to stay within this limit, see Troubleshooting quota errors. One approach to mitigating these errors is to enable query queues (preview), which provide a dynamic concurrency limit and queuing of up to 1,000 interactive queries beyond those running. |
Maximum number of queued interactive queries | 1,000 queries | If query queues (preview) are enabled, your project can queue up to 1,000 interactive queries. Additional interactive queries that exceed this limit return a quota error. |
Maximum number of concurrent batch queries run in a reservation | 10 queries | Your project can run up to 10 concurrent batch queries in a reservation. |
Maximum number of queued batch queries | 20,000 queries | Your project can queue up to 20,000 batch queries. Additional batch queries that exceed this limit return a quota error. |
Maximum number of concurrent interactive queries against Cloud Bigtable external data sources | 16 queries | Your project can run up to sixteen concurrent queries against a Bigtable external data source. |
Maximum number of concurrent queries that contain remote functions | 10 queries | You can run up to ten concurrent queries with remote functions per project. |
Maximum number of concurrent multi-statement queries | 1,000 multi-statement queries | Your project can run up to 1,000 concurrent multi-statement queries. For other quotas and limits related to multi-statement queries, see Multi-statement queries. |
Maximum number of concurrent legacy SQL queries that contain UDFs | 6 queries | Your project can run up to six concurrent legacy SQL queries with user-defined functions (UDFs). This limit includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent limit for interactive queries. This limit does not apply to GoogleSQL queries. |
Daily query size limit | Unlimited | By default, there is no daily query size limit. However, you can set limits on the amount of data users can query by creating custom quotas to control query usage per day or query usage per day per user. |
Daily destination table update limit | See Maximum number of table operations per day. | Updates to destination tables in a query job count toward the limit on the maximum number of table operations per day for the destination tables. Destination table updates include append and overwrite operations that are performed by queries that you run by using the Google Cloud console, using the bq command-line tool, or calling the jobs.query and query-type jobs.insert API methods. |
Query/multi-statement query execution-time limit | 6 hours | A query or multi-statement query can execute for up to six hours, and then it fails. However, sometimes queries are retried. A query can be tried up to three times, and each attempt can run for up to six hours. As a result, it's possible for a query to have a total runtime of more than six hours. |
Maximum number of resources referenced per query | 1,000 resources | A query can reference up to a total of 1,000 unique tables, unique views, unique user-defined functions (UDFs), and unique table functions after full expansion. This limit includes the following: |
Maximum unresolved legacy SQL query length | 256 KB | An unresolved legacy SQL query can be up to 256 KB long. If your query is longer, you receive the following error: The query is too large. To stay within this limit, consider replacing large arrays or lists with query parameters. |
Maximum unresolved GoogleSQL query length | 1 MB | An unresolved GoogleSQL query can be up to 1 MB long. If your query is longer, you receive the following error: The query is too large. To stay within this limit, consider replacing large arrays or lists with query parameters (see the query parameter example after this table). |
Maximum resolved legacy and GoogleSQL query length | 12 MB | The limit on resolved query length includes the length of all views and wildcard tables referenced by the query. |
Maximum number of GoogleSQL query parameters | 10,000 parameters | A GoogleSQL query can have up to 10,000 parameters. |
Maximum request size | 10 MB | The request size can be up to 10 MB, including additional properties like query parameters. |
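As noted in the rows above, dry-run queries don't count against the concurrent interactive query limit, and query parameters help keep unresolved query text under the length limits. The following is a minimal sketch with the Python client library, assuming the google-cloud-bigquery package; the table name and parameter values are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Pass a large list of values as a single array parameter instead of
# inlining thousands of literals; this keeps the unresolved query text
# well under the 1 MB GoogleSQL limit. The table name is a placeholder.
sql = """
    SELECT id, status
    FROM `my-project.my_dataset.orders`
    WHERE id IN UNNEST(@ids)
"""
job_config = bigquery.QueryJobConfig(
    dry_run=True,           # validate and estimate cost without running
    use_query_cache=False,  # a dry run should not consult the query cache
    query_parameters=[
        bigquery.ArrayQueryParameter("ids", "INT64", list(range(10_000))),
    ],
)

# A dry run returns promptly, doesn't count against the concurrent
# interactive query limit, and reports the bytes the query would process.
job = client.query(sql, job_config=job_config)
print(f"This query would process {job.total_bytes_processed} bytes.")
```

Removing dry_run=True from the job configuration runs the same parameterized query as a normal interactive query job.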