Quotas and limits

This document lists the quotas and limits that apply to BigQuery.

A quota restricts how much of a particular shared Google Cloud resource your Cloud project can use, including hardware, software, and network components.

Quotas are part of a system that does the following:

  • Monitors your use or consumption of Google Cloud products and services.
  • Restricts your consumption of those resources for reasons including ensuring fairness and reducing spikes in usage.
  • Maintains configurations that automatically enforce prescribed restrictions.
  • Provides a means to make or request changes to the quota.

When a quota is exceeded, in most cases, the system immediately blocks access to the relevant Google resource, and the task that you're trying to perform fails. In most cases, quotas apply to each Cloud project and are shared across all applications and IP addresses that use that Cloud project.

Many products and services also have limits that are unrelated to the quota system. These are constraints, such as maximum file sizes or database schema limitations, which generally cannot be increased or decreased, unless otherwise stated.

By default, BigQuery quotas and limits apply on a per-project basis. Quotas and limits that apply on a different basis are indicated as such; for example, the maximum number of columns per table, or the maximum number of concurrent API requests per user. Specific policies vary depending on resource availability, user profile, Service Usage history, and other factors, and are subject to change without notice.

Quota replenishment

Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate-limiting behaviors. Intermittent refreshes also occur to avoid long disruptions when a quota is exhausted. Rather than being replenished globally once a day, more quota typically becomes available within minutes.

Request a quota increase

To increase or decrease most quotas, use the Google Cloud Console. Some quotas can't be increased above their default values.

For more information, see Working with quotas.

Permissions

To view and update your BigQuery quotas in the Cloud Console, you need the same permissions as for any Google Cloud quota. For more information, see Google Cloud quota permissions.

Copy jobs

The following limits apply to BigQuery jobs for copying tables. The limits apply to jobs created by using the bq command-line tool, Cloud Console, or the copy-type jobs.insert API method. All copy jobs count toward this limit, whether they succeed or fail.

Limit Default Notes
Copy jobs per destination table per day See Table operations per day.
Copy jobs per day 100,000 jobs Your project can run up to 100,000 copy jobs per day.
Cross-region copy jobs per destination table per day 100 jobs Your project can run up to 100 cross-region copy jobs for a destination table per day.
Cross-region copy jobs per day 2,000 jobs Your project can run up to 2,000 cross-region copy jobs per day.
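
Copy jobs can also be created through the client libraries; as a rough illustration of a single copy job that counts toward these limits, here is a minimal Python sketch using the google-cloud-bigquery client (the project, dataset, and table names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table IDs; replace with your own project, dataset, and table names.
source_table_id = "my-project.my_dataset.source_table"
destination_table_id = "my-project.my_dataset.destination_table"

# Starts one copy job; the job counts toward the copy-job limits above
# whether it succeeds or fails.
copy_job = client.copy_table(source_table_id, destination_table_id)
copy_job.result()  # Wait for the job to complete.
```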

The following limits apply to copying datasets:

Limit Default Notes
Maximum number of tables in the source dataset 20,000 tables A source dataset can have up to 20,000 tables.
Maximum number of tables that can be copied per run to a destination dataset in the same region 20,000 tables Your project can copy 20,000 tables per run to a destination dataset that is in the same region.
Maximum number of tables that can be copied per run to a destination dataset in a different region 1,000 tables Your project can copy 1,000 tables per run to a destination dataset that is in a different region. For example, if you configure a cross-region copy of a dataset with 8,000 tables in it, then BigQuery Data Transfer Service automatically creates eight runs in a sequential manner. The first run copies 1,000 tables. Twenty-four hours later, the second run copies 1,000 tables. This process continues until all tables in the dataset are copied, up to the maximum of 20,000 tables per dataset.

Data manipulation language (DML) statements

The following limits apply to BigQuery data manipulation language (DML) statements:

Limit Default Notes
DML statements per day Unlimited DML statements count toward the number of table operations per day (or the number of partitioned table operations per day for partitioned tables). However, the number of DML statements your project can run per day is not limited. After the daily limit for table operations (or partitioned table operations) is used up, you get errors for non-DML table operations. But you can continue to execute DML statements without getting errors.
Concurrent mutating DML statements per table 2 statements BigQuery runs up to two concurrent mutating DML statements (UPDATE, DELETE, and MERGE) for each table. Additional mutating DML statements for a table are queued.
Queued mutating DML statements per table 20 statements A table can have up to 20 mutating DML statements in the queue waiting to run. If you submit additional mutating DML statements for the table, then those statements fail.
Maximum time in queue for DML statement 6 hours An interactive priority DML statement can wait in the queue for up to six hours. If the statement has not run after six hours, it fails.

For more information about mutating DML statements, see UPDATE, DELETE, MERGE DML concurrency.

Datasets

The following limits apply to BigQuery datasets:

Limit Default Notes
Maximum number of datasets Unlimited There is no limit on the number of datasets that a project can have.
Number of tables per dataset Unlimited When you use an API call, enumeration performance slows as you approach 50,000 tables in a dataset. The Cloud Console can display up to 50,000 tables for each dataset.
Number of authorized views in a dataset's access control list 2,500 authorized views A dataset's access control list can contain up to 2,500 authorized views.
Number of dataset update operations per dataset per 10 seconds 5 operations Your project can make up to five dataset update operations every 10 seconds. This limit includes all metadata update operations for the dataset, regardless of whether they are performed through the Cloud Console, the bq command-line tool, the client libraries, or the API.
Maximum length of a dataset description 16,384 characters When you add a description to a dataset, the text can be at most 16,384 characters.

Export jobs

The following quota applies to jobs that export data from BigQuery by using the bq command-line tool, the Cloud Console, or the export-type jobs.insert API method.

Quota Default Notes
Maximum number of exported bytes per day 50 TB Your project can export up to 50 terabytes per day. To export more than 50 TB of data per day, use the Storage Read API or the EXPORT DATA statement.

The following limits apply to jobs that export data from BigQuery by using the bq command-line tool, Cloud Console, or the export-type jobs.insert API method.

Limit Default Notes
Maximum number of exports per day 100,000 exports Your project can run up to 100,000 exports per day.
Wildcard URIs per export 500 URIs An export can have up to 500 wildcard URIs.
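
A hedged sketch of an export (extract) job with the Python client library, for context on what counts toward these limits (the table and bucket names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder names; replace with your own table and Cloud Storage bucket.
table_id = "my-project.my_dataset.my_table"
destination_uri = "gs://my-bucket/exports/my_table-*.csv"  # one wildcard URI

# Starts one export (extract) job; each job counts toward the
# 100,000 exports-per-day limit, and the exported bytes count toward
# the 50 TB daily quota.
extract_job = client.extract_table(table_id, destination_uri)
extract_job.result()  # Wait for the job to complete.
```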

Load jobs

The following limits apply when you load data into BigQuery by using the Cloud Console, the bq command-line tool, or the load-type jobs.insert API method.

Limit Default Notes
Load jobs per table per day See Maximum number of table operations per day. Load jobs, including failed load jobs, count toward the limit on the maximum number of table operations per day for the destination table.
Load jobs per day 100,000 jobs Your project can run up to 100,000 load jobs per day. Failed load jobs count toward this limit.
Maximum columns per table 10,000 columns A table can have up to 10,000 columns.
Maximum size per load job 15 TB The total size for all of your CSV, JSON, Avro, Parquet, and ORC input files can be up to 15 TB.
Maximum number of source URIs in job configuration 10,000 URIs A job configuration can have up to 10,000 source URIs.
Maximum number of files per load job 10,000,000 files A load job can have up to 10 million total files, including all files matching all wildcard URIs.
Load job execution-time limit 6 hours A load job fails if it executes for longer than six hours.
Avro: Maximum size for file data blocks 16 MB The size limit for Avro file data blocks is 16 MB.
CSV: Maximum cell size 100 MB CSV cells can be up to 100 MB in size.
CSV: Maximum row size 100 MB CSV rows can be up to 100 MB in size.
CSV: Maximum file size - compressed 4 GB The size limit for a compressed CSV file is 4 GB.
CSV: Maximum file size - uncompressed 5 TB The size limit for an uncompressed CSV file is 5 TB.
JSON: Maximum row size 100 MB JSON rows can be up to 100 MB in size.
JSON: Maximum file size - compressed 4 GB The size limit for a compressed JSON file is 4 GB.
JSON: Maximum file size - uncompressed 5 TB The size limit for an uncompressed JSON file is 5 TB.
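
For reference, a minimal sketch of a single load job with the Python client library, loading CSV files from Cloud Storage (the URIs and configuration are placeholders, not a recommendation):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder values; replace with your own table and source URI.
table_id = "my-project.my_dataset.my_table"
source_uri = "gs://my-bucket/data/*.csv"  # one load job, possibly many files

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # or supply an explicit schema
)

# One load job; it counts toward the per-project and per-table daily
# load-job limits even if it fails.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # Wait for the job to complete.
```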

If you regularly exceed the load job limits due to frequent updates, consider streaming data into BigQuery instead.

Query jobs

The following quotas apply to query jobs created automatically by running interactive queries, scheduled queries, and jobs submitted by using the jobs.query and query-type jobs.insert API methods:

Quota Default Notes
Query usage per day Unlimited Your project can run an unlimited number of queries per day.
Query usage per day per user Unlimited Users can run an unlimited number of queries per day.
Cloud SQL federated query cross-region bytes per day 1 TB If the BigQuery query processing location and the Cloud SQL instance location are different, then your query is a cross-region query. Your project can run up to 1 TB in cross-region queries per day. See Cloud SQL federated queries.

The following limits apply to query jobs created automatically by running interactive queries, scheduled queries, and jobs submitted by using the jobs.query and query-type jobs.insert API methods:

Limit Default Notes
Concurrent rate limit for interactive queries 100 queries Your project can run up to 100 concurrent interactive queries. Queries with results that are returned from the query cache count against this limit for the duration it takes for BigQuery to determine that it is a cache hit. Dry-run queries don't count against this limit. You can specify a dry-run query by using the --dry_run flag. For information about strategies to stay within this limit, see Troubleshooting quota errors.
Concurrent rate limit for interactive queries against Cloud Bigtable external data sources 4 queries Your project can run up to four concurrent queries against a Bigtable external data source.
Daily query size limit Unlimited By default, there is no daily query size limit. However, you can set limits on the amount of data users can query by creating custom quotas.
Concurrent rate limit for legacy SQL queries that contain UDFs 6 queries Your project can run up to six concurrent legacy SQL queries with user-defined functions (UDFs). This limit includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to Standard SQL queries.
Daily destination table update limit See Maximum number of table operations per day. Updates to destination tables in a query job count toward the limit on the maximum number of table operations per day for the destination tables. Destination table updates include append and overwrite operations that are performed by queries that you run by using the Cloud Console, using the bq command-line tool, or calling the jobs.query and query-type jobs.insert API methods.
Query/script execution-time limit 6 hours A query or script can execute for up to six hours, and then it fails. However, sometimes queries are retried. A query can be tried up to three times, and each attempt can run for up to six hours. As a result, it's possible for a query to have a total runtime of more than six hours.
Maximum number of resources referenced per query 1,000 resources A query can reference up to a total of 1,000 unique tables, unique views, unique user-defined functions (UDFs), and unique table functions (Preview) after full expansion. This limit includes the following:
  • Tables, views, UDFs, and table functions directly referenced by the query.
  • Tables, views, UDFs, and table functions referenced by other views/UDFs/table functions referenced in the query.
  • Tables resulting from the expansion of wildcard tables used in the query or the other referenced views/UDFs/table functions.
Maximum unresolved legacy SQL query length 256 KB An unresolved legacy SQL query can be up to 256 KB long. If your query is longer, you receive the following error: The query is too large. To stay within this limit, consider replacing large arrays or lists with query parameters.
Maximum unresolved Standard SQL query length 1 MB An unresolved Standard SQL query can be up to 1 MB long. If your query is longer, you receive the following error: The query is too large. To stay within this limit, consider replacing large arrays or lists with query parameters.
Maximum resolved legacy and Standard SQL query length 12 MB The limit on resolved query length includes the length of all views and wildcard tables referenced by the query.
Maximum number of Standard SQL query parameters 10,000 parameters A Standard SQL query can have up to 10,000 parameters.
Maximum response size 10 GB compressed Sizes vary depending on compression ratios for the data. The actual response size might be significantly larger than 10 GB. The maximum response size is unlimited when writing large query results to a destination table.
Maximum row size 100 MB The maximum row size is approximate, because the limit is based on the internal representation of row data. The maximum row size limit is enforced during certain stages of query job execution.
Maximum columns in a table, query result, or view definition 10,000 columns A table, query result, or view definition can have up to 10,000 columns.
Maximum concurrent slots for on-demand pricing 2,000 slots With on-demand pricing, your project can have up to 2,000 concurrent slots. BigQuery slots are shared among all queries in a single project. BigQuery might burst beyond this limit to accelerate your queries. To check how many slots you're using, see Monitoring BigQuery using Cloud Monitoring.
Maximum CPU usage per scanned data for on-demand pricing 256 CPU seconds per MiB scanned With on-demand pricing, your query can use up to approximately 256 CPU seconds per MiB of scanned data. If your query is too CPU-intensive for the amount of data being processed, the query fails with a billingTierLimitExceeded error. For more information, see billingTierLimitExceeded.
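
Because dry-run queries don't count against the concurrent interactive query limit, they are a cheap way to estimate how many bytes a query would process before running it. A minimal Python sketch, equivalent to the bq --dry_run flag:

```python
from google.cloud import bigquery

client = bigquery.Client()

# dry_run=True validates the query and estimates bytes processed without
# running it; use_query_cache=False avoids a cached estimate.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

query_job = client.query(
    """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    """,
    job_config=job_config,
)

print(f"This query would process {query_job.total_bytes_processed} bytes.")
```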

Although scheduled queries use features of the BigQuery Data Transfer Service, scheduled queries are not transfers, and are not subject to load job limits.

Row-level security

The following limits apply to BigQuery row-level access policies:

Limit Default Notes
Maximum number of row access policies per table 100 policies A table can have up to 100 row access policies.
Maximum number of row access policies per query 100 policies A query can access up to a total of 100 row access policies.
Maximum number of CREATE / DROP DDL statements per policy per 10 seconds 5 statements Your project can make up to five CREATE or DROP statements per row access policy resource every 10 seconds.
DROP ALL ROW ACCESS POLICIES statements per table per 10 seconds 5 statements Your project can make up to five DROP ALL ROW ACCESS POLICIES statements per table every 10 seconds.
Maximum number of rowAccessPolicies.list calls See Limits for all BigQuery APIs.
Maximum number of rowAccessPolicies.getIamPolicy calls See IAM API quotas.
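
Row access policies are created and dropped with DDL statements, which is where the per-10-second statement limits above apply. A hedged sketch that runs the DDL through the Python client library; the table name, grantee, and filter are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder policy: the table, grantee, and filter column are examples only.
ddl = """
CREATE ROW ACCESS POLICY us_filter
ON `my-project.my_dataset.my_table`
GRANT TO ("group:sales-us@example.com")
FILTER USING (region = "US")
"""

# Each CREATE or DROP counts toward the limit of five such statements
# per row access policy every 10 seconds.
client.query(ddl).result()
```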

Streaming inserts

The following quotas and limits apply to streaming data into BigQuery. For information about strategies to stay within these limits, see Troubleshooting quota errors.

Streaming inserts without insertId fields

If you do not populate the insertId field when you insert rows, the following limit applies. For more information, see Disabling best effort de-duplication. Not populating insertId is the recommended way to use BigQuery, because it qualifies for higher streaming ingest quota limits.

Limit Default Notes
Maximum bytes per second 1 GB If you don’t populate the insertId field for each row inserted, your project can stream up to 1 GB per second. Exceeding this limit causes quotaExceeded errors.
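
With the Python client library, one way to omit insertId (and therefore use the higher limit above) is to pass explicit None row IDs; a minimal sketch with a placeholder table and rows:

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.my_table"  # placeholder
rows = [
    {"name": "alice", "score": 10},
    {"name": "bob", "score": 12},
]

# Passing None for every row ID omits the insertId field, which disables
# best-effort de-duplication and applies the higher streaming limits.
errors = client.insert_rows_json(table_id, rows, row_ids=[None] * len(rows))
if errors:
    print(f"Streaming insert errors: {errors}")
```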

Streaming inserts with insertId fields

If you populate the insertId field when you insert rows, the following quotas apply. If you exceed these quotas, you get quotaExceeded errors.

Quota Default Notes
Maximum rows per second per project in the us and eu multi-regions 500,000 rows If you populate the insertId field for each row inserted, you are limited to 500,000 rows per second in the us and eu multi-regions, per project. This quota is cumulative within a given multi-region. In other words, the sum of rows per second streamed to all tables for a given project within a multi-region is limited to 500,000. Each table is additionally limited to 100,000 rows per second. Exceeding either the per-project limit or the per-table limit causes quotaExceeded errors.
Maximum rows per second per project in all other locations 100,000 rows If you populate the insertId field for each row inserted, you are limited to 100,000 rows per second in all locations except the us and eu multi-regions, per project or table. This quota is cumulative within a given region. In other words, the sum of rows per second streamed to all tables for a given project within a region is limited to 100,000. Exceeding this amount causes quotaExceeded errors.
Maximum rows per second per table 100,000 rows If you populate the insertId field for each row inserted, you are limited to 100,000 rows per second per table. Exceeding this amount causes quotaExceeded errors.
Maximum bytes per second 100 MB If you populate the insertId field for each row inserted, you are limited to 100 MB per second per table. Exceeding this amount causes quotaExceeded errors.

All streaming inserts

The following additional streaming limits apply whether or not you populate the insertId field:

Limit Default Notes
Maximum row size 10 MB Exceeding this value causes invalid errors.
HTTP request size limit 10 MB Exceeding this value causes invalid errors. Internally, the request is translated from HTTP JSON into an internal data structure. The translated data structure has its own enforced size limit. It's hard to predict the size of the resulting internal data structure, but if you keep your HTTP requests to 10 MB or less, the chance of hitting the internal limit is low.
Maximum rows per request 50,000 rows A maximum of 500 rows is recommended. Batching can increase performance and throughput to a point, but at the cost of per-request latency. Too few rows per request and the overhead of each request can make ingestion inefficient. Too many rows per request and the throughput can drop. Experiment with representative data (schema and data sizes) to determine the ideal batch size for your data.
insertId field length 128 characters Exceeding this value causes invalid errors.
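
To follow the 500-rows-per-request guidance while staying under the 50,000-row and 10 MB caps, requests can be batched client-side. A small illustrative sketch; the batch size, table name, and row shape are assumptions to tune against your own data:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # placeholder


def stream_in_batches(rows, batch_size=500):
    """Stream rows in batches of roughly 500, the recommended starting point."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            print(f"Errors in batch starting at row {start}: {errors}")


# Example usage with synthetic rows; experiment with batch_size to balance
# throughput against per-request latency for your schema and row sizes.
stream_in_batches([{"name": f"user{i}", "score": i} for i in range(2000)])
```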

If you need more streaming quota for your project, you can disable best effort de-duplication. For additional streaming quota, see Request a quota increase.

Table functions

The following limits apply to BigQuery table functions:

Limit Default Notes
Maximum length of a table function name 256 characters The name of a table function can be up to 256 characters in length.
Maximum length of an argument name 128 characters The name of a table function argument can be up to 128 characters in length.
Maximum number of arguments 256 arguments A table function can have up to 256 arguments.
Maximum depth of a table function reference chain 16 references A table function reference chain can be up to 16 references deep.
Maximum depth of argument or output of type STRUCT 15 levels A STRUCT argument for a table function can be up to 15 levels deep. Similarly, a STRUCT record in a table function's output can be up to 15 levels deep.
Maximum number of fields in argument or return table of type STRUCT per table function 1,024 fields A STRUCT argument for a table function can have up to 1,024 fields. Similarly, a STRUCT record in a table function's output can have up to 1,024 fields.
Maximum number of columns in return table 1,024 columns A table returned by a table function can have up to 1,024 columns.
Maximum length of return table column names 128 characters Column names in returned tables can be up to 128 characters long.
Maximum number of updates per table function per 10 seconds 5 updates Your project can update a table function up to five times every 10 seconds.

Tables

All tables

The following limits apply to all BigQuery tables.

Limit Default Notes
Maximum length of a column description 1,024 characters When you add a description to a column, the text can be at most 1,024 characters.
Maximum depth of nested records 15 levels Columns of type RECORD can contain nested RECORD types, also called child records. The maximum nested depth limit is 15 levels. This limit is independent of whether the records are scalar or array-based (repeated).

External tables

The following limits apply to BigQuery tables with data stored on Cloud Storage in Parquet, ORC, Avro, CSV, or JSON format:

Limit Default Notes
Maximum number of source URIs per external table 10,000 URIs Each external table can have up to 10,000 source URIs.
Maximum number of files per external table 10,000,000 files An external table can have up to 10 million files, including all files matching all wildcard URIs.
Maximum size of stored data on Cloud Storage per external table 600 TB An external table can have up to 600 terabytes across all input files. This limit applies to the file sizes as stored on Cloud Storage; this size is not the same as the size used in the query pricing formula. For externally partitioned tables, the limit is applied after partition pruning.

Partitioned tables

The following limits apply to BigQuery partitioned tables.

Partition limits apply to the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination partition, or that use a DML DELETE, INSERT, MERGE, TRUNCATE TABLE, or UPDATE statement to write data to a table.

DML statements count toward partition limits, but aren't limited by them. In other words, the total number of daily operations that count toward the limit includes DML statements, but DML statements don't fail due to this limit. For example, if you run 500 copy jobs that append data to mytable$20210720 and 1,000 query jobs that append data to mytable$20210720, you reach the daily limit for partition operations.

A single job can affect multiple partitions. For example, a DML statement can update data in multiple partitions (for both ingestion-time and partitioned tables). Query jobs and load jobs can also write to multiple partitions, but only for partitioned tables.

BigQuery uses the number of partitions affected by a job when determining how much of the limit the job consumes. Streaming inserts do not affect this limit.

For information about strategies to stay within the limits for partitioned tables, see Troubleshooting quota errors.

Limit Default Notes
Maximum number of partitions per partitioned table 4,000 partitions Each partitioned table can have up to 4,000 partitions. If you exceed this limit, consider using clustering in addition to, or instead of, partitioning.
Maximum number of partitions modified by a single job 4,000 partitions Each job operation (query or load) can affect up to 4,000 partitions. BigQuery rejects any query or load job that attempts to modify more than 4,000 partitions.
Partition modifications per ingestion-time partitioned table per day 5,000 modifications Your project can make up to 5,000 partition modifications per day to an ingestion-time partitioned table.
Partition modifications per column-partitioned table per day 30,000 modifications Your project can make up to 30,000 partition modifications per day for a column-partitioned table.
Number of partition operations per 10 seconds per table 50 operations Your project can run up to 50 partition operations per partitioned table every 10 seconds.
Number of possible ranges for range partitioning 10,000 ranges A range-partitioned table can have up to 10,000 possible ranges. This limit applies to the partition specification when you create the table. After you create the table, the limit also applies to the actual number of partitions.

Standard tables

The following limits apply to BigQuery standard tables:

Limit Default Notes
Table operations per day 1,500 operations Your project can perform up to 1,500 operations per table per day, whether the operation appends data to the table or truncates the table. This limit includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table or that use a DML DELETE, INSERT, MERGE, TRUNCATE TABLE, or UPDATE statement to write data to a table. DML statements count toward this limit, but aren't limited by it. In other words, the total daily operations that count toward the limit includes DML statements, but DML statements don't fail due to this limit. For example, if you run 500 copy jobs that append data to mytable and 1,000 query jobs that append data to mytable, you reach the limit. For information about table operations per day for partitioned tables, see Partitioned table operations per day.
Maximum rate of table metadata update operations per table 5 operations per 10 seconds Your project can make up to five table metadata update operations per 10 seconds per table. This limit applies to all table metadata update operations, regardless of whether they are performed through the Cloud Console, the bq command-line tool, the client libraries, or the API, and it includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table. This limit doesn't apply to DML operations. If you exceed this limit, you get an error message like Exceeded rate limits: too many table update operations for this table. This error is transient; you can retry with an exponential backoff. To identify the operations that count toward this limit, you can inspect your logs.

Maximum number of columns per table 10,000 columns Each table, query result, or view definition can have up to 10,000 columns.

Table snapshots

The following limits apply to BigQuery table snapshots:

Limit Default Notes
Maximum number of concurrent table snapshot jobs 100 jobs Your project can run up to 100 concurrent table snapshot jobs.
Maximum number of table snapshot jobs per day 50,000 jobs Your project can run up to 50,000 table snapshot jobs per day.
Maximum number of jobs per table snapshot per day 50 jobs Your project can run up to 50 jobs per day per table snapshot.
Maximum number of metadata updates per table snapshot per 10 seconds 5 updates Your project can update a table snapshot's metadata up to five times every 10 seconds.

UDFs

The following limits apply to both temporary and persistent user-defined functions (UDFs) in BigQuery SQL queries.

Limit Default Notes
Maximum output per row 5 MB The maximum amount of data that your JavaScript UDF can output when processing a single row is approximately 5 MB.
Maximum concurrent legacy SQL queries with JavaScript UDFs 6 queries Your project can have up to six concurrent legacy SQL queries that contain JavaScript UDFs. This limit includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to Standard SQL queries.
Maximum JavaScript UDF resources per query 50 resources A query job can have up to 50 JavaScript UDF resources, such as inline code blobs or external files.
Maximum size of inline code blob 32 KB An inline code blob in a UDF can be up to 32 KB in size.
Maximum size of each external code resource 1 MB The maximum size of each JavaScript code resource is one MB.

The following limits apply to persistent UDFs:

Limit Default Notes
Maximum length of a UDF name 256 characters A UDF name can be up to 256 characters long.
Maximum number of arguments 256 arguments A UDF can have up to 256 arguments.
Maximum length of an argument name 128 characters A UDF argument name can be up to 128 characters long.
Maximum depth of a UDF reference chain 16 references A UDF reference chain can be up to 16 references deep.
Maximum depth of a STRUCT type argument or output 15 levels A STRUCT type UDF argument or output can be up to 15 levels deep.
Maximum number of fields in STRUCT type arguments or output per UDF 1,024 fields A UDF can have up to 1024 fields in STRUCT type arguments and output.
Maximum number of JavaScript libraries in a CREATE FUNCTION statement 50 libraries A CREATE FUNCTION statement can have up to 50 JavaScript libraries.
Maximum length of included JavaScript library paths 5,000 characters The path for a JavaScript library included in a UDF can be up to 5,000 characters long.
Maximum update rate per UDF per 10 seconds 5 updates Your project can update a UDF up to five times every 10 seconds.

Views

The following limits apply to BigQuery views:

Limit Default Notes
Maximum number of nested view levels 16 levels BigQuery supports up to 16 levels of nested views. If there are more than 16 levels, an INVALID_INPUT error is returned.
Maximum length of a standard SQL query used to define a view 256 K characters The text of a standard SQL query that defines a view can be up to 256 K characters.
Maximum number of authorized views per dataset 2,500 authorized views A dataset's access control list can have up to 2,500 authorized views.

BigQuery API

This section describes the quotas and limits that apply to all BigQuery API requests, and the quotas and limits that apply to specific types of API requests.

All BigQuery APIs

The following quota applies to all BigQuery API requests:

Quota Default Notes
Requests per day Unlimited Your project can make an unlimited number of BigQuery API requests per day.

The following limits apply to all BigQuery API requests:

Limit Default Notes
Maximum number of API requests per second per user per method 100 requests A user can make up to 100 API requests per second to an API method. If a user makes more than 100 requests per second to a method, then throttling can occur. This limit does not apply to streaming inserts.
Maximum number of concurrent API requests per user 300 requests If a user makes more than 300 concurrent requests, throttling can occur. This limit does not apply to streaming inserts.
Maximum request header size 16 KiB Your BigQuery API request can be up to 16 KiB, including the request URL and all headers. This limit does not apply to the request body, such as in a POST request.

jobs.get requests

The following limit applies to jobs.get API requests:

Limit Default Notes
Maximum jobs.get requests per second 1,000 requests Your project can make up to 1,000 jobs.get requests per second.

jobs.query requests

The following limit applies to jobs.query API requests:

Limit Default Notes
Maximum jobs.query response size 10 MB By default, there is no maximum number of rows of data returned by jobs.query per page of results. However, you are limited to the 10-MB maximum response size. You can alter the number of rows returned by using the maxResults parameter.

projects.list requests

The following limit applies to projects.list API requests:

Limit Default Notes
Maximum projects.list requests per second 2 requests Your project can make up to two projects.list requests per second.

tabledata.list requests

The following quota applies to tabledata.list requests. Other APIs, including jobs.getQueryResults and fetching results from jobs.query and jobs.insert, can also consume this quota.

Quota Default Notes
Tabledata list bytes per minute 3.6 GB Your project can return a maximum of 3.6 GB of table row data per minute. This quota applies to the project that contains the table being read.

The following limits apply to tabledata.list requests.

Limit Default Notes
Maximum number of tabledata.list requests per second 1,000 requests Your project can make up to 1,000 tabledata.list requests per second.
Maximum rows returned by tabledata.list requests per second 150,000 rows Your project can return up to 150,000 rows per second by using tabledata.list requests. This limit applies to the project that contains the table being read.
Maximum rows per tabledata.list response 100,000 rows A tabledata.list call can return up to 100,000 table rows. For more information, see Paging through results using the API.
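
The client libraries page through tabledata.list automatically; setting a page size keeps each underlying request well under the per-response row cap. A hedged sketch with a placeholder table:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # placeholder

# list_rows wraps tabledata.list; page_size caps the rows fetched per
# underlying request, and the iterator requests further pages as you
# consume rows.
rows = client.list_rows(table_id, page_size=1000)
for row in rows:
    pass  # process each row
```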

tables.insert requests

The tables.insert method creates a new, empty table in a dataset. The following limit applies to tables.insert requests. This limit includes SQL statements that create tables, such as CREATE TABLE, and queries that write results to destination tables.

Limit Default Notes
Maximum tables.insert requests per second 10 requests Your project can make up to 10 tables.insert requests per second.

BigQuery Connection API

The following quotas apply to BigQuery Connection API calls:

Quota Default Notes
Read requests per minute 1,000 requests Your project can make up to 1,000 requests per minute to BigQuery Connection API methods that read connection data.
Write requests per minute 100 requests Your project can make up to 100 requests per minute to BigQuery Connection API methods that create or update connections.

BigQuery Reservation API

The following quotas apply to the BigQuery Reservation API:

Quota Default Notes
Requests per minute per region 100 requests Your project can make a total of up to 100 calls to BigQuery Reservation API methods per minute per region.
Number of SearchAllAssignments calls per minute per region 100 requests Your project can make up to 100 calls to the SearchAllAssignments method per minute per region.
Requests for SearchAllAssignments per minute per region per user 10 requests Each user can make up to 10 calls to the SearchAllAssignments method per minute per region.
Total number of slots for each region (except the US and EU multi-regions) 0 slots The maximum number of BigQuery slots that you can purchase in each region by using the Google Cloud Console.
Total number of slots for the EU multi-region 1,000 slots The maximum number of BigQuery slots that you can purchase in the EU multi-region by using the Google Cloud Console.
Total number of slots for the US multi-region 4,000 slots The maximum number of BigQuery slots that you can purchase in the US multi-region by using the Google Cloud Console.

IAM API

The following quotas apply when you use Identity and Access Management functionality in BigQuery to retrieve and set IAM policies, and to test IAM permissions.

Quota Default Notes
IamPolicy requests per minute 3,000 requests Your project can make up to 3,000 IAM requests per minute.
IamPolicy requests per minute per user 1,500 requests Each user can make up to 1,500 IAM requests per minute per project.
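
These quotas cover calls such as getting a table's IAM policy or testing permissions on it. A minimal Python sketch (the table name and permission are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()
table_ref = bigquery.TableReference.from_string("my-project.my_dataset.my_table")

# Each of these calls counts toward the IamPolicy request quotas above.
policy = client.get_iam_policy(table_ref)
print(policy.bindings)

response = client.test_iam_permissions(table_ref, ["bigquery.tables.getData"])
print(response)
```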

Storage Read API

The following quotas apply to BigQuery Storage Read API requests:

Quota Default Notes
Read data plane requests per minute per user 5,000 requests Each user can make up to 5,000 ReadRows calls per minute per project.
Read control plane requests per minute per user 5,000 requests Each user can make up to 5,000 Storage Read API metadata operation calls per minute per project. The metadata calls include the CreateReadSession and SplitReadStream methods.

The following limit applies to BigQuery Storage Read API requests:

Limit Default Notes
Maximum row/filter length 1 MB When you use the Storage Read API CreateReadSession call, you are limited to a maximum length of 1 MB for each row or filter.
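
For context on what these calls look like, a hedged sketch of a read session with the Python Storage Read API client (google-cloud-bigquery-storage); the project and table names are placeholders:

```python
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types

client = bigquery_storage_v1.BigQueryReadClient()

# Placeholder identifiers.
project_id = "my-project"
table = f"projects/{project_id}/datasets/my_dataset/tables/my_table"

# CreateReadSession is a control plane (metadata) call.
requested_session = types.ReadSession(
    table=table,
    data_format=types.DataFormat.AVRO,
)
session = client.create_read_session(
    parent=f"projects/{project_id}",
    read_session=requested_session,
    max_stream_count=1,
)

# ReadRows is a data plane call and counts toward the per-user
# data plane request quota.
reader = client.read_rows(session.streams[0].name)
for row in reader.rows(session):
    pass  # process each row (a dict when using AVRO)
```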

Storage Write API

The following quotas apply to Storage Write API (Preview) requests:

Quota Default Notes
CreateWriteStream requests per minute 100 requests Your project can make up to 100 calls to CreateWriteStream per minute. If you get an error because you have exceeded this limit, retry the operation with exponential backoff. Also, try to space out calls to CreateWriteStream. The default stream is not subject to this quota. If you don't need exactly-once semantics with committed mode, then consider using the default stream.
FlushRows requests per minute 10,000 requests Your project can make up to 10,000 calls to the FlushRows method per minute.
Concurrent connections 10,000 in multi-regions; 1,000 in regions Your project can operate on 10,000 concurrent connections in the us and eu multi-regions, and 1,000 in other regions.

The following limit applies to Storage Write API (Preview) requests:

Limit Default Notes
Request size 10 MB The maximum request size is 10 MB.

Cap quota usage

To learn how you can limit usage of a particular resource by specifying a smaller quota than the default, see Capping usage.

Troubleshoot

For information about troubleshooting errors related to quotas and limits, see Troubleshooting BigQuery quota errors.