BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.
The lists below outline the current rate limits and quota limits of the system.
- Concurrent rate limit for interactive queries under on-demand pricing: 50 concurrent queries. Queries that return cached results, or queries configured using the dryRun property, do not count against this limit (see the dry-run sketch at the end of this section).
- Concurrent rate limit for queries that contain user-defined functions (UDFs): 6 concurrent queries, including both interactive and batch queries. Interactive queries that contain UDFs count toward the concurrent rate limit for interactive queries.
- Daily query size limit: unlimited by default, but you may specify limits using custom quotas.
- Daily update limit: 1,000 updates per table per day; applies only to the destination table in a query.
- Query execution time limit: 6 hours
- Maximum concurrent slots per BigQuery account for on-demand pricing: 2,000
- Maximum number of tables referenced per query: 1,000
- Maximum number of authorized views per dataset: 1,000
- Maximum query length: 256 KB¹
- Maximum response size: 128 MB compressed² (unlimited when returning large query results)
¹ Approximate size based on compression ratios.
² Sizes vary depending on compression ratios for the data; the actual response size may be significantly larger than 128 MB.
Slots are shared among all queries in a project. As a rule of thumb, if your queries are processing less than 100 GB of data at once, you're unlikely to be using all 2,000 slots. To check how many slots your account uses, see Monitoring BigQuery Using Stackdriver. If you need more than 2,000 slots, contact your sales representative to discuss whether flat-rate pricing meets your needs.
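Because cached results and dry-run queries do not count toward the concurrent query limit, a dry run is a cheap way to validate a query and estimate the bytes it would process. A minimal sketch, assuming the google-cloud-bigquery Python client and a hypothetical public-dataset query:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

# A dry run validates the query and estimates the bytes it would process
# without executing it, so it consumes no slots and does not count
# against the 50-query concurrent limit.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

query_job = client.query(
    "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 10",
    job_config=job_config,
)

print(f"This query would process {query_job.total_bytes_processed} bytes.")
```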
Dataset and table update operations
The following limits apply to dataset and table update operations:
- Maximum rate of dataset metadata update operations: 1 operation every 2 seconds (insert, patch, update).
- Maximum rate of table update operations: 1 operation every 2 seconds (insert, patch, update, and output from jobs).
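A client that updates metadata faster than this should expect rate-limit errors and retry with backoff. A minimal sketch, assuming the google-cloud-bigquery Python client, a hypothetical table, and that the rate-limit error surfaces as an HTTP 403 (Forbidden):

```python
import time

from google.cloud import bigquery
from google.api_core.exceptions import Forbidden

client = bigquery.Client()

def set_table_description(table_id, description, max_attempts=5):
    """Patch a table's description, backing off if the one-update-per-
    two-seconds metadata rate limit is exceeded."""
    table = client.get_table(table_id)  # fetch current metadata
    table.description = description
    for attempt in range(max_attempts):
        try:
            return client.update_table(table, ["description"])
        except Forbidden:
            # Rate limited: wait with exponential backoff, then retry.
            time.sleep(2 ** attempt)
    raise RuntimeError("Table metadata update still rate limited after retries")

set_table_description("my_project.my_dataset.my_table", "Nightly import target")
```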
The following limits apply to loading data into BigQuery.
- Daily limit: 1,000 load jobs per table per day (including failures), 50,000 load jobs per project per day (including failures)
- Row and cell size limits:

  | Data format | Max limit |
  |-------------|-----------|
  | CSV | 2 MB (row and cell size) |
  | JSON | 2 MB (row size) |
  | Avro | 16 MB (block size) |
- Maximum columns per table: 10,000
- Maximum file sizes:

  | File type | Compressed | Uncompressed |
  |-----------|------------|--------------|
  | CSV | 4 GB | 4 GB with quoted new-lines in values; 5 TB without new-lines in values |
  | JSON | 4 GB | 5 TB |
  | Avro | Compressed Avro files are not supported, but compressed data blocks are. BigQuery supports the DEFLATE codec. | 5 TB (1 MB for the file header) |
- Maximum size per load job: 15 TB across all input files for CSV, JSON, and Avro
- Maximum number of files per load job: 10,000 including all files matching a wildcard
- Load job execution time limit: 6 hours
For more information about BigQuery's supported data formats, see preparing data for BigQuery.
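To show how these limits surface in practice, here is a minimal load-job sketch assuming the google-cloud-bigquery Python client; the bucket, wildcard path, and table ID are hypothetical. Every file matched by the wildcard counts toward the 10,000-files-per-job limit, and the job counts toward the per-table and per-project daily limits whether it succeeds or fails.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical GCS wildcard URI and destination table.
uri = "gs://my-bucket/shards/part-*.csv"
table_id = "my_project.my_dataset.my_table"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # infer the schema from the data
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # blocks until the job finishes (6-hour execution limit)

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")
```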
The following limits apply to copying tables in BigQuery.
- Daily limit: 1,000 copy jobs per destination table per day (including failures), and 10,000 copy jobs per day (including failures) for the project under which the copy jobs run.
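A minimal copy-job sketch, assuming the google-cloud-bigquery Python client and hypothetical table IDs; the job counts toward the destination table's daily limit and the running project's daily limit even if it fails:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source and destination tables.
source_table_id = "my_project.my_dataset.events_2017"
dest_table_id = "my_project.my_dataset.events_backup"

copy_job = client.copy_table(source_table_id, dest_table_id)
copy_job.result()  # blocks until the copy completes

print(f"Copied {source_table_id} to {dest_table_id}.")
```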
The following limits apply to exporting data from BigQuery.
- Daily limit: 1,000 exports per day, up to 10 TB
- Multiple wildcard URI limit: 500 URIs per export
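A minimal export sketch, assuming the google-cloud-bigquery Python client; the bucket and table ID are hypothetical. The wildcard in the destination URI lets BigQuery shard the output across multiple files, and each export job counts toward the daily export limit:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source table and wildcard destination URI.
table_id = "my_project.my_dataset.my_table"
destination_uri = "gs://my-bucket/exports/my_table-*.csv"

extract_job = client.extract_table(table_id, destination_uri)
extract_job.result()  # blocks until the export completes

print(f"Exported {table_id} to {destination_uri}.")
```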
The following limits apply to streaming data into BigQuery.
- Maximum row size: 1 MB. Exceeding this value will cause invalid errors.
- HTTP request size limit: 10 MB. Exceeding this value will cause invalid errors.
- Maximum rows per second: 100,000 rows per second, per project. Exceeding this amount will cause quotaExceeded errors. The maximum number of rows per second per table is also 100,000. You can use all of this quota on one table, or you can divide it among several tables in a project.
- Maximum rows per request: There is no hard limit, but we recommend about 500 rows per request. Batching can increase performance and throughput up to a point, at the cost of per-request latency: too few rows per request and the overhead of each request makes ingestion inefficient; too many rows per request and throughput may drop. Experiment with representative data (schema and data sizes) to determine the ideal batch size (see the batching sketch after this list).
- Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause quotaExceeded errors.
- API requests per second, per user: If you make more than 100 requests per second, throttling might occur. This limit does not apply for streaming inserts.
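A minimal batching sketch for streaming inserts, assuming the google-cloud-bigquery Python client with a hypothetical table and rows; it sends roughly 500 rows per request and reports any per-row insert errors (including quota errors) returned by the API:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my_project.my_dataset.my_table"  # hypothetical destination table

def stream_rows(rows, batch_size=500):
    """Stream rows in batches of roughly 500, the size recommended above.
    Each row must stay under 1 MB and each request under 10 MB."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            # Per-row errors, including quota errors, are returned here.
            print(f"Batch starting at row {start} had errors: {errors}")

# Hypothetical rows; in practice these come from your application.
stream_rows([{"name": f"user_{i}", "score": i} for i in range(2_000)])
```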
Quota and limit errors return either a 403 or a 400 HTTP response code. See troubleshooting errors for a full list of error codes and troubleshooting steps.
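As a rough illustration of how these response codes surface in the google-cloud-bigquery Python client (an assumption; other clients map the codes differently), quota and rate-limit errors typically raise Forbidden (403) and invalid requests raise BadRequest (400):

```python
from google.cloud import bigquery
from google.api_core.exceptions import BadRequest, Forbidden

client = bigquery.Client()

try:
    client.query("SELECT 1").result()
except Forbidden as exc:   # 403, e.g. quotaExceeded or rateLimitExceeded
    print(f"Quota or rate-limit error: {exc}")
except BadRequest as exc:  # 400, e.g. an invalid query or request
    print(f"Invalid request: {exc}")
```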