BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.
The lists below outline the current rate limits and quota limits of the system.
- Concurrent rate limit for on-demand, interactive queries — 50 concurrent queries
Queries with results that are returned from the query cache, and dry run queries, do not count against this limit. You can specify a dry run query using the --dry_run flag in the bq command-line tool or by setting the dryRun property in a query job (see the example after this list).
- Concurrent rate limit for queries that contain user-defined functions (UDFs) — 6 concurrent queries
The concurrent rate limit for queries that contain UDFs includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries.
- Daily query size limit — Unlimited by default
You may specify limits on the amount of data users can query by setting custom quotas.
- Daily destination table update limit — 1,000 updates per table per day
Destination tables in a query job are subject to the limit of 1,000 updates per table per day. Destination table updates include append operations and overwrite operations performed by a query using the BigQuery web UI, the bq command-line tool, or by calling the jobs.query and query-type jobs.insert API methods.
- Query execution time limit — 6 hours
- Maximum number of tables referenced per query — 1,000
- Maximum unresolved query length — 256 KB
- Maximum resolved query length — 12 MB
The limit on resolved query length includes the length of all views and wildcard tables referenced by the query.
- Maximum response size — 128 MB compressed¹
¹ Sizes vary depending on compression ratios for the data. The actual response size may be significantly larger than 128 MB.
The maximum response size is unlimited when writing large query results to a destination table.
- Maximum row size — 100 MB²
² The maximum row size limit is approximate, as the limit is based on the internal representation of row data. The maximum row size limit is enforced during certain stages of query job execution.
- Maximum concurrent slots per project for on-demand pricing — 2,000
The default number of slots for on-demand queries is shared among all queries in a single project. As a rule, if you're processing less than 100 GB of queries at once, you're unlikely to be using all 2,000 slots.
To check how many slots you're using, see Monitoring BigQuery Using Stackdriver. If you need more than 2,000 slots, contact your sales representative to discuss whether flat-rate pricing meets your needs.
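A dry run can also be requested through the client libraries. The following is a minimal sketch using the google-cloud-bigquery Python client; the public sample table is illustrative, and the client API may differ across library versions:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A dry run validates the query and reports how many bytes it would process
# without running it; dry run queries do not count against the concurrent
# query limit.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
query_job = client.query(
    "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 10",
    job_config=job_config,
)
print("This query would process {} bytes.".format(query_job.total_bytes_processed))
```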
The following limits apply to jobs created automatically by loading data using the command-line tool or the BigQuery web UI. The limits also apply to load jobs submitted programmatically using the load-type jobs.insert API method.
The following limits apply when you load data into BigQuery.
- Load jobs per table per day — 1,000 (including failures)
- Load jobs per project per day — 50,000 (including failures)
- Row and cell size limits:

| Data format | Max limit |
| --- | --- |
| CSV | 10 MB (row and cell size) |
| JSON | 10 MB (row size) |
| Avro | 16 MB (block size) |
- Maximum columns per table — 10,000
- Maximum file sizes:

| File type | Compressed | Uncompressed |
| --- | --- | --- |
| CSV | 4 GB | 5 TB |
| JSON | 4 GB | 5 TB |
| Avro | Compressed Avro files are not supported, but compressed data blocks are. BigQuery supports the DEFLATE and Snappy codecs. | 5 TB (1 MB for the file header) |
- Maximum size per load job — 15 TB across all input files for CSV, JSON, and Avro
- Maximum number of source URIs in job configuration — 10,000 URIs
- Maximum number of files per load job — 10 million total files, including all files matching all wildcard URIs
- Load job execution time limit — 6 hours
For more information, see Introduction to Loading Data into BigQuery.
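The following is a minimal sketch of submitting a load job programmatically, assuming the google-cloud-bigquery Python client; the bucket, dataset, and table names are placeholders, and the client API may differ across library versions:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder destination table and source URI; each load job may reference up
# to 10,000 URIs, and wildcard URIs may match up to 10 million files in total.
table_id = "my-project.my_dataset.my_table"
source_uri = "gs://my-bucket/data/shard-*.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # Waits for the job to complete (subject to the 6 hour limit).
print("Loaded {} rows.".format(client.get_table(table_id).num_rows))
```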
The following limits apply to copying tables in BigQuery. The limits apply to jobs created automatically by copying data using the command-line tool or the BigQuery web UI. The limits also apply to copy jobs submitted programmatically using the copy-type jobs.insert API method.
- Copy jobs per table per day — 1,000 (including failures)
- Copy jobs per project per day — 10,000 (including failures)
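A minimal sketch of a copy job submitted through the google-cloud-bigquery Python client (table names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder source and destination tables; each copy counts toward the
# 1,000 copy jobs per table per day and 10,000 per project per day limits.
source_table_id = "my-project.my_dataset.source_table"
dest_table_id = "my-project.my_dataset.destination_table"

copy_job = client.copy_table(source_table_id, dest_table_id)
copy_job.result()  # Waits for the copy job to complete.
print("Copied {} to {}.".format(source_table_id, dest_table_id))
```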
The following limits apply to jobs that export data from BigQuery. The limits apply to jobs created automatically by exporting data using the command-line tool or the BigQuery web UI. The limits also apply to export jobs submitted programmatically using the extract-type jobs.insert API method.
- Exports per day — 1,000 exports per project and up to 10 TB per day (the 10 TB data limit is cumulative across all exports)
- Wildcard URIs — 500 wildcard URIs per export
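A minimal sketch of an export (extract) job using the google-cloud-bigquery Python client; the table and Cloud Storage URI are placeholders, and a wildcard URI lets BigQuery shard large output across multiple files:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholders; exports count toward the 1,000 exports and 10 TB per day
# limits, and a single export job may use up to 500 wildcard URIs.
table_id = "my-project.my_dataset.my_table"
destination_uri = "gs://my-bucket/exports/my_table-*.csv"

extract_job = client.extract_table(table_id, destination_uri)
extract_job.result()  # Waits for the export job to complete.
```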
The following limits apply to datasets:
- Number of datasets per project — unrestricted
- The number of datasets per project is not subject to a quota; however, as you approach thousands of datasets in a project, web UI performance begins to degrade, and listing datasets becomes slower.
- Number of tables per dataset — unrestricted
- The number of tables per dataset is also unrestricted, but as you approach 50,000 or more tables in a dataset, enumerating them becomes slower. Enumeration performance suffers whether you use an API call, the web UI, or the __TABLES_SUMMARY__ meta table. To improve web UI performance, you can append the ?minimal parameter to the BigQuery web UI URL to limit the number of tables displayed to 30,000 per project.
- Maximum number of authorized views in a dataset's access control list — 1,000
- You can create an authorized view to restrict access to your source data. An authorized view is created using a SQL query that excludes columns you do not want users to see when they query the view. You can add up to 1,000 authorized views to a dataset's access control list.
- Maximum rate of dataset metadata update operations — 1 operation every 2 seconds per dataset
- The dataset metadata update limit includes all metadata update operations performed using the BigQuery web UI, the bq command-line tool, or by calling the datasets.insert, datasets.patch, or datasets.update API methods.
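A minimal sketch of a dataset metadata update that counts toward this rate limit, using the google-cloud-bigquery Python client (the dataset name is a placeholder):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder dataset; each update like this counts toward the limit of one
# metadata update operation every 2 seconds per dataset.
dataset = client.get_dataset("my-project.my_dataset")
dataset.description = "Raw events, partitioned by ingestion date."
client.update_dataset(dataset, ["description"])
```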
The following limits apply to BigQuery tables.
- Maximum number of table operations per day — 1,000
You are limited to 1,000 operations per table per day, whether the operation appends data to a table, overwrites a table, or uses a DML INSERT statement to write data to a table.
The maximum number of table operations includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table or that use a DML INSERT statement to write data to a table. For example, if you run 500 copy jobs that append data to mytable and 500 query jobs that append data to mytable, you would reach the quota. For an example of a query job that appends to a destination table, see the sketch after this list.
- Maximum rate of table metadata update operations — 1 operation every 2 seconds per table
The table metadata update limit includes all metadata update operations performed using the BigQuery web UI, the bq command-line tool, or by calling the tables.insert, tables.patch, or tables.update API methods. This limit also applies to job output.
- Maximum number of partitions per partitioned table — 2,500
- Maximum number of operations per day per partition — 2,000
You are limited to 2,000 operations per partition per day, whether the operation appends data to a partition, overwrites a partition, or uses a DML INSERT statement to write data to a partition.
The maximum number of partition operations includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a partition or that contain DML INSERT statements that write data to a partition.
- Maximum rate of partition operations — 50 partition operations every 10 seconds
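A minimal sketch of a query job that appends to a destination table, and therefore counts toward the table and partition operation limits above, using the google-cloud-bigquery Python client (table names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder destination table; each query job that appends to or overwrites
# it counts toward the 1,000 table operations per table per day limit.
job_config = bigquery.QueryJobConfig(
    destination="my-project.my_dataset.mytable",
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
query_job = client.query(
    "SELECT user_id, event_ts FROM `my-project.my_dataset.raw_events`",
    job_config=job_config,
)
query_job.result()  # Waits for the query job to finish writing to the table.
```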
Data Manipulation Language statements
The following limits apply to Data Manipulation Language (DML) statements.
- Maximum UPDATE/DELETE statements per day per table — 96
- Maximum UPDATE/DELETE statements per day per project — 10,000
- Maximum INSERT statements per day per table — 1,000
DML statements are subject to lower limits than SELECT statements because processing DML statements requires significantly more resources.
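A minimal sketch of running a DML UPDATE statement as a query job with the google-cloud-bigquery Python client (the table and column names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table; each UPDATE like this counts toward the limit of
# 96 UPDATE/DELETE statements per day per table.
dml = """
    UPDATE `my-project.my_dataset.mytable`
    SET status = 'inactive'
    WHERE last_seen < TIMESTAMP '2018-01-01'
"""
query_job = client.query(dml)  # DML statements run as query jobs.
query_job.result()
print("Modified {} rows.".format(query_job.num_dml_affected_rows))
```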
The following limits apply for streaming data into BigQuery.
- Maximum row size: 1 MB. Exceeding this value will cause invalid errors.
- HTTP request size limit: 10 MB. Exceeding this value will cause invalid errors.
- Maximum rows per second: 100,000 rows per second, per project. Exceeding this amount will cause quotaExceeded errors. The maximum number of rows per second per table is also 100,000. You can use all of this quota on one table or you can divide this quota among several tables in a project.
- Maximum rows per request: 10,000 rows per request. We recommend a maximum of 500 rows. Batching can increase performance and throughput to a point, but at the cost of per-request latency. Too few rows per request and the overhead of each request can make ingestion inefficient. Too many rows per request and the throughput may drop. We recommend using about 500 rows per request, but experimentation with representative data (schema and data sizes) will help you determine the ideal batch size.
- Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause quotaExceeded errors.
If you need more streaming data quota for your project, you can submit a request from the Google Cloud Platform Console page. You can set a custom quota on streaming data in increments of 50,000 rows. You will usually receive a response within 2 to 3 business days.
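A minimal sketch of streaming inserts batched at roughly 500 rows per request, using the google-cloud-bigquery Python client (the table and row schema are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table and rows; batching around 500 rows per request is a
# reasonable starting point, subject to the 10,000 rows per request and
# 10 MB request size limits.
table_id = "my-project.my_dataset.events"
rows = [{"user_id": i, "action": "click"} for i in range(5000)]

batch_size = 500
for start in range(0, len(rows), batch_size):
    batch = rows[start:start + batch_size]
    errors = client.insert_rows_json(table_id, batch)
    if errors:
        print("Encountered errors while inserting rows: {}".format(errors))
```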
The following limits apply to BigQuery API requests:
- API requests per second, per user — 100
- If you make more than 100 requests per second, throttling might occur. This limit does not apply to streaming inserts.
- Concurrent API requests, per user: 300
- If you make more than 300 concurrent requests per user, throttling might occur. This limit does not apply to streaming inserts.
When are quotas refilled?
Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate limiting behaviors. Intermittent refresh is also done to avoid long disruptions when quota is exhausted. More quota is typically made available within minutes rather than globally replenished once daily.
Quota and limit errors return either a 403 or a 400 HTTP response code. See troubleshooting errors for a full list of error codes and troubleshooting steps.
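A minimal sketch of retrying on quota errors with exponential backoff, using the google-cloud-bigquery Python client and google-api-core exception classes (the retry parameters are illustrative):

```python
import time

from google.api_core.exceptions import Forbidden, TooManyRequests
from google.cloud import bigquery

client = bigquery.Client()

def run_query_with_backoff(sql, max_attempts=5):
    """Retries a query with exponential backoff when quota errors occur."""
    for attempt in range(max_attempts):
        try:
            return client.query(sql).result()
        except (Forbidden, TooManyRequests) as exc:
            # 403 rateLimitExceeded and quotaExceeded errors are often
            # transient; back off and retry rather than failing immediately.
            if attempt == max_attempts - 1:
                raise
            time.sleep(2 ** attempt)

rows = run_query_with_backoff("SELECT 1 AS x")
```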