Quotas & Limits

BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.

The lists below outline the current rate limits and quota limits of the system.

Queries

The following limits apply to jobs.query and query-type jobs.insert function calls.

  • Concurrent rate limit for interactive queries under on-demand pricing: 50 concurrent queries. Queries that return cached results, or queries configured using the dryRun property, do not count against this limit (see the dry-run sketch after this list).
  • Concurrent rate limit for queries that contain user-defined functions (UDFs): 6 concurrent queries, including both interactive and batch queries. Interactive queries that contain UDFs count toward the concurrent rate limit for interactive queries.
  • Daily query size limit: unlimited by default, but you may specify limits using custom quotas.
  • Daily update limit: 1,000 updates per table per day; applies only to the destination table in a query.
  • Query execution time limit: 6 hours
  • Maximum concurrent slots per BigQuery project for on-demand pricing: 2,000
  • Maximum number of tables referenced per query: 1,000
  • Maximum number of authorized views per dataset: 1,000
  • Maximum unresolved query length: 256 KB
  • Maximum resolved query length: 12 MB including the length of all referenced views and wildcard tables
  • Maximum response size: 128 MB compressed¹ (unlimited when writing large query results to a destination table)
    ¹ Sizes vary depending on compression ratios for the data; the actual response size may be significantly larger than 128 MB.
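
Because dry-run queries do not count against the concurrent limit, you can estimate how many bytes a query would process before actually running it. Below is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # dryRun validates the query and reports bytes processed without
    # executing it, so it does not count against concurrency limits.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

    query_job = client.query(
        "SELECT name FROM `my_project.my_dataset.my_table`",  # hypothetical table
        job_config=job_config,
    )
    print("This query would process {} bytes.".format(query_job.total_bytes_processed))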

The default number of slots for on-demand queries is shared among all queries in a single project. As a rule of thumb, if you're processing less than 100 GB of data at once, you're unlikely to be using all 2,000 slots. To check how many slots you're using, see Monitoring BigQuery Using Stackdriver. If you need more than 2,000 slots, contact your sales representative to discuss whether flat-rate pricing meets your needs.

Dataset and table update operations

The following limits apply to dataset and table update operations:

  • Maximum rate of dataset metadata update operations: 1 operation every 2 seconds (insert, patch, update).
  • Maximum rate of table update operations: 1 operation every 2 seconds (insert, patch, update, and jobs that write output to the table).
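
Scripts that patch many tables in a loop should pace their calls to stay under this rate. A minimal sketch, assuming the google-cloud-bigquery Python client and hypothetical table names:

    import time

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical tables whose metadata we want to patch.
    table_ids = ["my_dataset.table_a", "my_dataset.table_b"]

    for table_id in table_ids:
        table = client.get_table(table_id)
        table.description = "Updated by nightly metadata job"
        client.update_table(table, ["description"])  # a patch-style update
        time.sleep(2)  # conservative pacing: 1 update per 2 seconds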

Data Manipulation Language statements

The following limits apply to Data Manipulation Language (DML).

  • Maximum UPDATE/DELETE statements per day per table: 96
  • Maximum UPDATE/DELETE statements per day per project: 1,000
  • Maximum INSERT statements per day per table: 1,000
  • Maximum INSERT statements per day per project: 1,000

DML statements are significantly more expensive to process than SELECT statements.
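
Because each statement counts against the daily limits regardless of how many rows it touches, batching many row changes into one statement stretches the quota much further than issuing per-row updates. A minimal sketch with the google-cloud-bigquery Python client; the table and columns are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # A single UPDATE that touches many rows counts as one DML statement,
    # so batching changes keeps you well under the 96-per-table daily limit.
    sql = """
    UPDATE `my_project.my_dataset.users`
    SET status = 'inactive'
    WHERE last_seen < TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
    """
    client.query(sql).result()  # wait for the DML job to finish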

Load jobs

The following limits apply for loading data into BigQuery.

  • Daily limit: 1,000 load jobs per table per day (including failures), 50,000 load jobs per project per day (including failures). The limit of 1,000 load jobs per table per day cannot be raised.
  • Row and cell size limits:
    Data format   Max limit
    CSV           10 MB (row and cell size)
    JSON          10 MB (row size)
    Avro          16 MB (block size)
  • Maximum columns per table: 10,000
  • Maximum file sizes:
    • CSV: 4 GB compressed. Uncompressed: 4 GB with quoted new-lines in values, 5 TB without new-lines in values.
    • JSON: 4 GB compressed, 5 TB uncompressed.
    • Avro: compressed Avro files are not supported, but compressed data blocks are (BigQuery supports the DEFLATE and Snappy codecs); 5 TB uncompressed (1 MB for the file header).
  • Maximum size per load job: 15 TB across all input files for CSV, JSON, and Avro
  • Maximum number of source URIs in job configuration: 10,000 URIs
  • Maximum number of files per load job: 10 million total files, including all files matching all wildcard URIs
  • Load job execution time limit: 6 hours

For more information about BigQuery's supported data formats, see Preparing Data for BigQuery.
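
Since the per-table limit counts jobs rather than files, loading many files through a single wildcard job is far cheaper, quota-wise, than submitting one job per file. A minimal sketch with the google-cloud-bigquery Python client; the bucket and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )

    # One wildcard URI loads many files in a single job, consuming one
    # load job from the 1,000-per-table daily quota instead of one per file.
    load_job = client.load_table_from_uri(
        "gs://my-bucket/exports/2018-01-01/*.csv",
        "my_project.my_dataset.events",
        job_config=job_config,
    )
    load_job.result()  # raises on failure; load jobs time out after 6 hours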

Copy jobs

The following limits apply to copying tables in BigQuery.

  • Daily limit: 1,000 copy jobs per destination table per day (including failures), 10,000 copy jobs per project (the project under which the copy job runs) per day (including failures).
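
A sketch of a single copy job with the google-cloud-bigquery Python client (table names are hypothetical); each call below consumes one job from both daily limits:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Counts against the destination table's 1,000-per-day limit and the
    # running project's 10,000-per-day limit, even if the job fails.
    copy_job = client.copy_table(
        "my_project.my_dataset.daily_snapshot",
        "my_project.backups.daily_snapshot",
    )
    copy_job.result()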

Export requests

The following limits apply for exporting data from BigQuery.

  • Daily limit: 1,000 exports per day, with a cumulative limit of 10 TB per day
  • Multiple wildcard URI limit: 500 URIs per export
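
A single wildcard URI counts as one URI and lets BigQuery shard large results across as many files as it needs. A minimal sketch with the google-cloud-bigquery Python client; the table and bucket are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # The "*" is expanded by BigQuery into numbered output shards, so one
    # wildcard handles arbitrarily large tables while counting as one URI.
    extract_job = client.extract_table(
        "my_project.my_dataset.events",
        "gs://my-bucket/exports/events-*.csv",
    )
    extract_job.result()  # each export counts toward the 1,000/day limit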

Partitioned table updates

The following limits apply to partitioned tables.

  • Daily limit: 2,000 partition updates per table, per day
  • Rate limit: 50 partition updates every 10 seconds

Writes to a table partition from query jobs (including DML), load jobs, and copy jobs count toward partition update limits.
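
For example, a load job that targets one partition through a partition decorator counts as a single partition update. A sketch with the google-cloud-bigquery Python client, assuming it passes the $YYYYMMDD decorator through in the destination table ID (bucket and table names are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.CSV)

    # Loading into "logs$20180101" writes only that partition, counting as
    # one update toward the 2,000/day and 50-per-10-seconds limits.
    load_job = client.load_table_from_uri(
        "gs://my-bucket/logs/20180101/*.csv",
        "my_project.my_dataset.logs$20180101",  # assumes decorator support
        job_config=job_config,
    )
    load_job.result()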

Streaming inserts

The following limits apply for streaming data into BigQuery.

  • Maximum row size: 1 MB. Exceeding this value causes invalid errors.
  • HTTP request size limit: 10 MB. Exceeding this value causes invalid errors.
  • Maximum rows per second: 100,000 rows per second, per project. Exceeding this amount causes quotaExceeded errors. The maximum number of rows per second per table is also 100,000; you can spend this quota on one table or divide it among several tables in a project.
  • Maximum rows per request: 10,000 rows per request. Batching increases performance and throughput up to a point, at the cost of per-request latency: with too few rows per request, per-request overhead makes ingestion inefficient; with too many, throughput may drop. We recommend about 500 rows per request as a starting point; experimenting with representative data (schema and data sizes) will help you determine the ideal batch size (see the batching sketch after this list).
  • Maximum bytes per second: 100 MB per second, per table. Exceeding this amount causes quotaExceeded errors.
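
A minimal batching sketch with the google-cloud-bigquery Python client; the table name and rows are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my_project.my_dataset.events")  # hypothetical

    def stream_rows(rows, batch_size=500):
        """Send rows in ~500-row batches, the recommended starting point."""
        for start in range(0, len(rows), batch_size):
            batch = rows[start:start + batch_size]
            errors = client.insert_rows_json(table, batch)
            if errors:
                print("Insert errors:", errors)  # e.g. a row over the 1 MB limit

    stream_rows([{"user": "alice", "event": "login"}] * 1200)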

If you need more streaming data quota, you can use the BigQuery Custom Quota Request form. You will usually receive a response within 2 to 3 business days.

API requests

  • API requests per second, per user: If you make more than 100 requests per second, throttling might occur. This limit does not apply to streaming inserts.
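
When throttling does occur, retrying with exponential backoff is the usual remedy. A minimal sketch, assuming the google-cloud-bigquery Python client; the helper function and dataset name are hypothetical:

    import random
    import time

    from google.api_core.exceptions import Forbidden, TooManyRequests
    from google.cloud import bigquery

    client = bigquery.Client()

    def call_with_backoff(fn, max_retries=5):
        """Retry a throttled BigQuery API call with exponential backoff."""
        for attempt in range(max_retries):
            try:
                return fn()
            except (Forbidden, TooManyRequests):
                # rateLimitExceeded surfaces as HTTP 403; back off and retry.
                time.sleep((2 ** attempt) + random.random())
        return fn()  # final attempt; let any error propagate

    tables = call_with_backoff(lambda: list(client.list_tables("my_dataset")))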

When are quotas refilled?

Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate-limiting behavior. Intermittent refreshes also help avoid long disruptions when quota is exhausted: more quota typically becomes available within minutes rather than being replenished globally once per day.

Error codes

Quota and limit errors return either a 403 or 400 HTTP response code. See Troubleshooting Errors for a full list of error codes and troubleshooting steps.
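
Programmatically, a quota or rate-limit failure can be distinguished from other 403/400 errors by its reason field. A sketch with the google-cloud-bigquery Python client; the query is hypothetical:

    from google.api_core.exceptions import BadRequest, Forbidden
    from google.cloud import bigquery

    client = bigquery.Client()

    try:
        client.query("SELECT * FROM `my_project.my_dataset.big_table`").result()
    except (Forbidden, BadRequest) as exc:  # HTTP 403 or 400
        for err in exc.errors:
            if err.get("reason") in ("quotaExceeded", "rateLimitExceeded"):
                print("Hit a quota or rate limit:", err.get("message"))
            else:
                raise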
