BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.
The lists below outline the current rate limits and quota limits of the system.
Query jobs
The following limits apply to query jobs created automatically by running interactive queries and to jobs submitted programmatically using jobs.query and query-type jobs.insert method calls.
- Concurrent rate limit for interactive queries — 100 concurrent queries
Queries with results that are returned from the query cache count against this limit for the duration it takes for BigQuery to determine that it is a cache hit. Dry run queries do not count against this limit. You can specify a dry run query by using the --dry_run flag or by setting the dryRun property in a query job (a minimal example appears after this list). This limit is applied at the project level. To raise the limit, contact support or contact sales.
- Concurrent rate limit for interactive queries against Cloud Bigtable external data sources — 4 concurrent queries
You are limited to 4 concurrent queries against a Cloud Bigtable external data source.
- Concurrent rate limit for legacy SQL queries that contain user-defined functions (UDFs) — 6 concurrent queries
The concurrent rate limit for legacy SQL queries that contain UDFs includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to standard SQL queries.
- Daily query size limit — Unlimited by default
You may specify limits on the amount of data users can query by setting custom quotas.
- Daily destination table update limit — 1,000 updates per table per day
Destination tables in a query job are subject to the limit of 1,000 updates per table per day. Destination table updates include append operations and overwrite operations performed by a query using the console, the classic BigQuery web UI, the bq command-line tool, or by calling the jobs.query and query-type jobs.insert API methods.
- Query execution time limit — 6 hours
This limit cannot be changed.
- Maximum number of tables referenced per query — 1,000
- Maximum unresolved legacy SQL query length — 256 KB
- Maximum unresolved standard SQL query length — 1 MB
- Maximum resolved legacy and standard SQL query length — 12 MB
The limit on resolved query length includes the length of all views and wildcard tables referenced by the query.
- Maximum number of standard SQL query parameters — 10,000
- Maximum response size — 10 GB compressed¹
¹Sizes vary depending on compression ratios for the data. The actual response size may be significantly larger than 10 GB. The maximum response size is unlimited when writing large query results to a destination table.
- Maximum row size — 100 MB²
²The maximum row size limit is approximate, as the limit is based on the internal representation of row data. The maximum row size limit is enforced during certain stages of query job execution.
- Maximum columns in a table, query result, or view definition — 10,000
- Maximum concurrent slots per project for on-demand pricing — 2,000
BigQuery slots are shared among all queries in a single project. BigQuery might burst beyond this limit to accelerate your queries. To check how many slots you're using, see Monitoring BigQuery using Stackdriver Monitoring.
- Maximum concurrent queries against a Cloud Bigtable external data source — 4
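The following sketch shows how a dry run might look with the google-cloud-bigquery Python client (an assumption; the page itself only references the --dry_run flag and the dryRun job property). The project, credentials, and query are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes default application credentials and project

# dry_run=True sets the dryRun property: the query is validated and a byte
# estimate is returned, but nothing runs, so the request does not count
# against the concurrent interactive query limit.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 10",
    job_config=job_config,
)
print(f"This query would process {job.total_bytes_processed} bytes.")
```

The equivalent bq command-line form is bq query --use_legacy_sql=false --dry_run '<query>'.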
For information on limits that apply to user-defined functions in SQL queries, see UDF limits.
Load jobs
The following limits apply when you load data into BigQuery. They apply to jobs created automatically by loading data using the command-line tool, the Cloud Console, or the classic BigQuery web UI, and to load jobs submitted programmatically by using the load-type jobs.insert API method.
- Load jobs per table per day — 1,000 (including failures)
- Load jobs per project per day — 100,000 (including failures)
- Row and cell size limits:

  | Data format | Max limit |
  | --- | --- |
  | CSV | 100 MB (row and cell size) |
  | JSON | 100 MB (row size) |

- Maximum columns per table — 10,000
- Maximum file sizes:

  | File type | Compressed | Uncompressed |
  | --- | --- | --- |
  | CSV | 4 GB | 5 TB |
  | JSON | 4 GB | 5 TB |
  | Avro | Compressed Avro files are not supported, but compressed data blocks are. BigQuery supports the DEFLATE and Snappy codecs. | 5 TB (1 MB for the file header) |

- Maximum size per load job — 15 TB across all input files for CSV, JSON, Avro, Parquet, and ORC
- Maximum number of source URIs in job configuration — 10,000 URIs
- Maximum number of files per load job — 10 million total files, including all files matching all wildcard URIs
- Load job execution time limit — 6 hours
- With the exception of US-based datasets, you must load data from a Cloud Storage bucket in the same region as the dataset's location (the bucket can be either a multi-regional bucket or a regional bucket in the same region as the dataset). You can load data into a US-based dataset from any region.
For more information, see Introduction to loading data into BigQuery.
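For illustration, here is a minimal load job sketch using the google-cloud-bigquery Python client (an assumption; the same job can be created with the bq tool, the console, or the REST API). The bucket, file, and table names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # hypothetical destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # infer the schema from the file
)

# Each call creates one load job, which counts toward the 1,000-per-table
# and 100,000-per-project daily limits even if the job fails.
load_job = client.load_table_from_uri(
    "gs://my-bucket/data.csv", table_id, job_config=job_config
)
load_job.result()  # blocks until the job finishes (subject to the 6 hour limit)
```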
Copy jobs
The following limits apply to copying tables in BigQuery. The limits apply to jobs created automatically by copying data using the command-line tool, the console, or the classic BigQuery web UI. The limits also apply to copy jobs submitted programmatically by using the copy-type jobs.insert API method.
- Copy jobs per destination table per day — 1,000 (including failures)
- Copy jobs per project per day — 100,000 (including failures)
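As a sketch of a job that consumes this quota, here is a table copy using the google-cloud-bigquery Python client (an assumption; the table names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# One copy-type jobs.insert call per copy_table invocation; failed jobs
# also count toward the daily copy job limits.
copy_job = client.copy_table(
    "my-project.my_dataset.source_table",       # hypothetical source
    "my-project.my_dataset.destination_table",  # hypothetical destination
)
copy_job.result()  # wait for the copy to complete
```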
Export jobs
The following limits apply to jobs that export data from BigQuery, whether they are created automatically by exporting data using the command-line tool, the console, or the classic BigQuery web UI, or submitted programmatically by using the export-type jobs.insert API method.
- Exports per day — 100,000 exports per project and up to 10 TB per day (the 10 TB data limit is cumulative across all exports)
- Wildcard URIs — 500 wildcard URIs per export
To export more than 10 TB of data per day, use the BigQuery Storage API.
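A minimal export sketch with the google-cloud-bigquery Python client (an assumption; bucket and table names are placeholders). It uses a single wildcard URI, well under the 500-wildcard-URI limit:

```python
from google.cloud import bigquery

client = bigquery.Client()

# BigQuery shards the output across numbered files matching the wildcard.
extract_job = client.extract_table(
    "my-project.my_dataset.my_table",         # hypothetical source table
    "gs://my-bucket/exports/my_table-*.csv",  # hypothetical wildcard destination URI
)
extract_job.result()
```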
Dataset limits
The following limits apply to datasets:
- Number of datasets per project — unrestricted
The number of datasets per project is not subject to a quota; however, as you approach thousands of datasets in a project, classic web UI performance begins to degrade, and listing datasets becomes slower.
- Number of tables per dataset — unrestricted
As you approach 50,000 or more tables in a dataset, enumerating them becomes slower. Enumeration performance suffers whether you use an API call or the classic BigQuery web UI. Currently, the BigQuery web UI in the GCP console allows you to display only 50,000 tables per dataset. To improve classic BigQuery web UI performance, you can use the ?minimal parameter to limit the number of tables displayed to 30,000 tables per project. You add the parameter to the classic BigQuery web UI URL in the following format: https://bigquery.cloud.google.com/queries/[PROJECT_NAME]?minimal
- Maximum number of authorized views in a dataset's access control list — 2,500
You can create an authorized view to restrict access to your source data. An authorized view is created using a SQL query that excludes columns you do not want users to see when they query the view. You can add up to 2,500 authorized views to a dataset's access control list.
- Maximum rate of dataset metadata update operations — 5 operations every 10 seconds per dataset
The dataset metadata update limit includes all metadata update operations performed using the console, the classic BigQuery web UI, the bq command-line tool, or by calling the datasets.insert, datasets.patch, or datasets.update API methods.
- Maximum length of a dataset description — 16,384 characters
When you add a description to a dataset, the text can be at most 16,384 characters.
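As a sketch of an operation that counts toward the metadata update limit, the following updates a dataset description with the google-cloud-bigquery Python client (an assumption; the dataset ID is a placeholder):

```python
from google.cloud import bigquery

client = bigquery.Client()

dataset = client.get_dataset("my-project.my_dataset")  # hypothetical dataset
dataset.description = "Daily sales snapshots"          # must be at most 16,384 characters

# Each update_dataset call is one dataset metadata update operation and
# counts toward the limit of 5 operations every 10 seconds per dataset.
client.update_dataset(dataset, ["description"])
```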
Table limits
The following limits apply to BigQuery tables.
All tables
- Maximum length of a column description — 16,384 characters
When you add a description to a column, the text can be at most 16,384 characters.
Standard tables
- Maximum number of table operations per day — 1,000
You are limited to 1,000 operations per table per day whether the operation appends data to a table, overwrites a table, or uses a DML INSERT statement to write data to a table. The maximum number of table operations includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table or that use a DML INSERT statement to write data to a table. For example, if you run 500 copy jobs that append data to mytable and 500 query jobs that append data to mytable, you would reach the quota.
- Maximum rate of table metadata update operations — 5 operations every 10 seconds per table
The table metadata update limit includes all metadata update operations performed by using the Cloud Console, the classic BigQuery web UI, the bq command-line tool, or the client libraries, by calling the tables.insert, tables.patch, or tables.update API methods, or by executing ALTER TABLE DDL statements. This limit also applies to job output.
- Maximum columns in a table, query result, or view definition — 10,000
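A similar sketch for a table metadata update with the google-cloud-bigquery Python client (an assumption; the table ID is a placeholder):

```python
from google.cloud import bigquery

client = bigquery.Client()

table = client.get_table("my-project.my_dataset.my_table")  # hypothetical table
table.description = "Raw click events"  # descriptions are limited to 16,384 characters

# Each update_table call is one table metadata update operation and counts
# toward the limit of 5 operations every 10 seconds per table.
client.update_table(table, ["description"])
```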
Partitioned tables
- Maximum number of partitions per partitioned table — 4,000
- Maximum number of partitions modified by a single job — 4,000
Each job operation (query or load) can affect a maximum of 4,000 partitions. Any query or load job that affects more than 4,000 partitions is rejected by BigQuery.
- Maximum number of partition modifications per ingestion-time partitioned table — 5,000
- Maximum number of partition modifications per column-partitioned table — 30,000
You are limited to a total of 5,000 partition modifications per day for an ingestion-time partitioned table and 30,000 partition modifications per day for a column-partitioned table. A partition can be modified by using an operation that appends to or overwrites data in the partition. Operations that modify partitions include: a load job, a query that writes results to a partition, or a DML statement (INSERT, DELETE, UPDATE, or MERGE) that modifies data in a partition.
More than one partition may be affected by a single job. For example, a DML statement can update data in multiple partitions (for both ingestion-time and column-partitioned tables). Query jobs and load jobs can also write to multiple partitions, but only for column-partitioned tables. BigQuery uses the number of partitions affected by a job when determining how much of the quota the job consumes. Streaming inserts do not affect this quota.
- Maximum rate of partition operations — 50 partition operations every 10 seconds
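To make the partition modification accounting concrete, here is a hedged sketch (google-cloud-bigquery Python client, placeholder table and column names) of a DML statement scoped to a single date partition; a statement that touches N partitions consumes N modifications from the daily quota:

```python
from google.cloud import bigquery

client = bigquery.Client()

# This DELETE touches only the 2020-01-01 partition of a date-partitioned
# table, so it counts as one partition modification for the day. Widening
# the WHERE clause to span more dates consumes proportionally more quota.
sql = """
DELETE FROM `my-project.my_dataset.events`   -- hypothetical partitioned table
WHERE DATE(event_timestamp) = '2020-01-01'
"""
client.query(sql).result()
```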
View limits
- Maximum number of nested view levels — 16
BigQuery supports up to 16 levels of nested views. If there are more than 16 levels, an INVALID_INPUT error is returned.
- Maximum length of a standard SQL query used to define a view — 256 K characters
When you create a view, the text of the standard SQL query can be at most 256 K characters.
- Maximum number of authorized views in a dataset's access control list — 2,500
You can create an authorized view to restrict access to your source data. An authorized view is created using a SQL query that excludes columns you do not want users to see when they query the view. You can add up to 2,500 authorized views to a dataset's access control list.
UDF limits
The following limits apply to temporary and persistent user-defined functions in SQL queries.
- Maximum amount of data that your JavaScript UDF can output when processing a single row — approximately 5 MB
- Concurrent rate limit for legacy SQL queries that contain user-defined functions (UDFs) — 6 concurrent queries
- Maximum number of JavaScript UDF resources, such as inline code blobs or external files, in a query job — 50
- Maximum size of each inline code blob — 32 KB
- Maximum size of each external code resource — 1 MB
The concurrent rate limit for legacy SQL queries that contain UDFs includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to standard SQL queries.
The following limits apply to persistent user-defined functions.
- Maximum length of a function name — 256 characters
- Maximum number of arguments — 256
- Maximum length of an argument name — 128 characters
- Maximum depth of a user-defined function reference chain — 16
- Maximum depth of an argument or output of type STRUCT — 15
- Maximum number of fields in an argument or output of type STRUCT per UDF — 1,024
- Maximum number of unique UDF plus table references per query — 1,000
After full expansion, each UDF can reference up to 1,000 combined unique tables and UDFs.
- Maximum number of JavaScript libraries in a CREATE FUNCTION statement — 50
- Maximum length of included JavaScript library paths — 5,000 characters
- Maximum update rate per UDF — 5 per 10 seconds
After function creation, you can update each function up to 5 times per 10 seconds.
- Each inline code blob is limited to a maximum size of 32 KB.
- Each JavaScript code resource is limited to a maximum size of 1 MB.
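For reference, here is a small standard SQL query with an inline temporary JavaScript UDF, run through the google-cloud-bigquery Python client (an assumption). The inline code blob is far below the 32 KB limit, and because the query is standard SQL, the 6-concurrent-query limit for legacy SQL UDF queries does not apply:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A temporary JavaScript UDF defined inline; a query job may reference at
# most 50 UDF resources (inline code blobs or external files).
sql = """
CREATE TEMP FUNCTION double_it(x FLOAT64)
RETURNS FLOAT64
LANGUAGE js AS '''
  return x * 2;
''';
SELECT double_it(21.0) AS doubled;
"""
for row in client.query(sql).result():
    print(row.doubled)  # 42.0
```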
Data Manipulation Language statements
The following limits apply to Data Manipulation Language (DML) statements.
- Maximum number of combined INSERT, UPDATE, DELETE, and MERGE statements per day per table — 1,000
A MERGE statement is counted as a single DML statement, even if it contains multiple INSERT, UPDATE, or DELETE clauses.
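A hedged sketch (placeholder table names, google-cloud-bigquery Python client) shows why MERGE is attractive under this quota: the statement below both updates and inserts rows yet counts as a single DML statement against the target table's daily limit.

```python
from google.cloud import bigquery

client = bigquery.Client()

# One MERGE = one DML statement against the target table's daily limit,
# regardless of how many WHEN clauses it contains.
sql = """
MERGE `my-project.my_dataset.inventory` T            -- hypothetical target
USING `my-project.my_dataset.inventory_staging` S    -- hypothetical source
ON T.product_id = S.product_id
WHEN MATCHED THEN
  UPDATE SET quantity = S.quantity
WHEN NOT MATCHED THEN
  INSERT (product_id, quantity) VALUES (S.product_id, S.quantity)
"""
client.query(sql).result()
```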
Streaming inserts
The following limits apply for streaming data into BigQuery.
If you do not populate the insertId field when you insert rows, the following quotas apply. Currently, these quotas apply only to the US multi-region location, and you must complete the BigQuery Streaming V2 beta enrollment form in order to use them.
- Maximum rows per second: 1,000,000
If you don't populate the insertId field for each row inserted, you are limited to 1,000,000 rows per second, per project. This quota is cumulative. You can use all of this quota on one table, or you can use this quota to stream data to several tables in a project. Exceeding this amount will cause quotaExceeded errors.
- Maximum bytes per second: 1 GB
If you don't populate the insertId field for each row inserted, you are limited to 1 GB per second, per project. This limit applies at the project level. It does not apply to individual tables. Exceeding this amount will cause quotaExceeded errors.
If you populate the insertId field when you insert rows:
- Maximum rows per second: 100,000
If you populate the insertId field for each row inserted, you are limited to 100,000 rows per second, per project or table. This quota is cumulative. You can use all of this quota on one table, or you can use this quota to stream data to several tables in a project. Exceeding this amount will cause quotaExceeded errors.
- Maximum bytes per second: 100 MB
If you populate the insertId field for each row inserted, you are limited to 100 MB per second, per table. Exceeding this amount will cause quotaExceeded errors.
The following additional streaming quotas apply whether or not you populate the insertId field:
- Maximum row size: 1 MB
Exceeding this value will cause invalid errors.
- HTTP request size limit: 10 MB
Exceeding this value will cause invalid errors.
- Maximum rows per request: 10,000 rows per request
A maximum of 500 rows per request is recommended. Batching can increase performance and throughput to a point, but at the cost of per-request latency. Too few rows per request, and the overhead of each request can make ingestion inefficient. Too many rows per request, and the throughput may drop. Experimentation with representative data (schema and data sizes) will help you determine the ideal batch size.
- insertId field length: 128
Exceeding this value will cause invalid errors.
If you need more streaming quota for your project, you can submit a request from the Google Cloud Console. You can set a custom quota on streaming data in increments of 50,000 rows. Expect a response to your request within 2 to 3 business days.
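The sketch below (google-cloud-bigquery Python client, placeholder table and rows; an assumption, since the page describes only the underlying streaming API behavior) shows the insertId distinction. Passing explicit row_ids populates insertId for best-effort deduplication, which puts the request under the lower per-second quotas:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.events"  # hypothetical table

rows = [
    {"event": "click", "user": "a"},
    {"event": "view", "user": "b"},
]

# Explicit row_ids populate insertId, enabling best-effort deduplication;
# the 100,000 rows/sec and 100 MB/sec quotas then apply. If row_ids is
# omitted, this client generates insertIds automatically, so the higher
# "no insertId" quotas only apply when insertId is genuinely not populated.
errors = client.insert_rows_json(table_id, rows, row_ids=["evt-1", "evt-2"])
if errors:
    print(f"Streaming insert returned errors: {errors}")
```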
API requests
All API requests
The following limits apply to all BigQuery API requests:
- API requests per second, per user — 100
- If you make more than 100 requests per second, throttling might occur. This limit does not apply to streaming inserts.
- Concurrent API requests, per user: 300
- If you make more than 300 concurrent requests per user, throttling might occur. This limit does not apply to streaming inserts.
tabledata.list requests
The tabledata.list method retrieves table data from a specified set of rows. The following limits apply to tabledata.list requests:
- Maximum number of tabledata.list queries per project: 500/second
When you call tabledata.list, you can submit up to 500 requests per second, per project.
- Maximum bytes per second per project returned by calls to tabledata.list: 60 MB/second
When you call tabledata.list, you can return a maximum of 60 MB per second of table row data per project. The limit applies to the project that contains the table being read.
- Maximum rows per second per project returned by calls to tabledata.list: 150,000/second
When you call tabledata.list, you can return a maximum of 150,000 table rows per second per project. The limit applies to the project that contains the table being read.
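For scale planning, note that the Python client's list_rows pages through data with tabledata.list under the hood (an assumption about the client; the table below is a public dataset), so heavy use counts toward these limits:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Each page fetched here is a tabledata.list request that counts toward the
# per-project request, bytes-per-second, and rows-per-second limits.
rows = client.list_rows(
    "bigquery-public-data.usa_names.usa_1910_2013", max_results=100
)
for row in rows:
    print(row["name"])
```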
tables.insert requests
The tables.insert method creates a new, empty table in a dataset. The following limits apply to tables.insert requests:
- Maximum requests per second per project: 10
When you call tables.insert, you can make a maximum of 10 requests per second per project. This limit includes statements that create tables, such as the CREATE TABLE DDL statement, and queries that write results to destination tables.
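A sketch of a call that consumes this quota, using the google-cloud-bigquery Python client (an assumption; the table ID and schema are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField("age", "INTEGER"),
]

# create_table issues a tables.insert request, which counts toward the
# limit of 10 such requests per second per project.
table = bigquery.Table("my-project.my_dataset.new_table", schema=schema)
client.create_table(table)
```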
projects.list requests
The projects.list method lists all projects to which you have been granted access. The following limits apply to projects.list requests:
- Maximum requests per second per project: 2
When you call projects.list, you can make a maximum of 2 requests per second per project.
jobs.get requests
The jobs.get method returns information about a specific job. The following limits apply to jobs.get requests:
- Maximum requests per second per project: 1,000
When you call jobs.get, you can make a maximum of 1,000 requests per second per project.
jobs.query requests
The jobs.query method runs a SQL query synchronously and returns query results if the query completes within a specified timeout.
- Maximum response size: 10 MB
By default, there is no maximum row count for the number of rows of data to return per page of results. However, you are limited to the 10 MB maximum response size. You can alter the number of rows to return by using the maxResults parameter.
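In the Python client (an assumption; the page documents only the raw maxResults parameter), the per-page row count is exposed as page_size, which is passed through as maxResults on the underlying row-fetching requests:

```python
from google.cloud import bigquery

client = bigquery.Client()

job = client.query(
    "SELECT name, number FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 5000"
)

# page_size maps to maxResults on each page request; smaller pages keep each
# response comfortably under the 10 MB response size cap.
for row in job.result(page_size=500):
    pass  # process each row here
```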
BigQuery Storage API requests
The following limits apply to ReadRows calls using the BigQuery Storage API:
- ReadRows calls per minute: 5,000
When you read data using the BigQuery Storage API, you are limited to 5,000 ReadRows calls per minute, per user, per project.
The following limits apply to all other method calls using the BigQuery Storage API:
- API calls per minute: 1,000
You are limited to 1,000 BigQuery Storage API calls per minute, per user, per project.
When are quotas refilled?
Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate limiting behaviors. Intermittent refresh is also done to avoid long disruptions when quota is exhausted. More quota is typically made available within minutes rather than globally replenished once daily.
Error codes
Quota and limit errors return either a 403 or a 400 HTTP response code. See troubleshooting errors for a full list of error codes and troubleshooting steps.