Quota Policy

BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.

The lists below outline the current rate limits and quota limits of the system.

Queries

The following limits apply to jobs.query() and query-type jobs.insert() method calls.

  • Concurrent rate limit for interactive queries: 50 concurrent queries. Queries that return cached results, or queries configured using the dryRun property, do not count against this limit (see the dry-run sketch after this list).
  • Concurrent rate limit for queries that contain user-defined functions (UDFs): 6 concurrent queries, including both interactive and batch queries. Interactive queries that contain UDFs count toward the concurrent rate limit for interactive queries.
  • Daily query size limit: unlimited by default, but you may specify limits using custom quotas.
  • Daily update limit: 1,000 updates per table per day; applies only to the destination table in a query.
  • Maximum number of tables referenced per query: 1,000
  • Maximum query length: 256 KB¹
  • Maximum response size: 128 MB compressed² (unlimited when returning large query results)
    ¹ Approximate size based on compression ratios.
    ² Sizes vary depending on compression ratios for the data; the actual response size may be significantly larger than 128 MB.
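
As an illustration of the dryRun behavior noted above, here is a minimal sketch assuming the google-cloud-bigquery Python client library (this document does not prescribe a client; the sample query runs against a public dataset). A dry-run query validates the statement and reports how many bytes it would process, without executing it or counting against the concurrent rate limit:

    from google.cloud import bigquery

    client = bigquery.Client()

    # dry_run validates the query and estimates bytes processed without
    # running it; disabling the cache makes the estimate representative.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

    query_job = client.query(
        "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` "
        "GROUP BY corpus",
        job_config=job_config,
    )

    print("This query would process {} bytes.".format(
        query_job.total_bytes_processed))

Checking total_bytes_processed this way is also a cheap guard against exceeding a custom daily query size quota.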

Load Jobs

The following limits apply for loading data into BigQuery.

  • Daily limit: 1,000 load jobs per table per day (including failures), 10,000 load jobs per project per day (including failures)
  • Row and cell size limits:
    • CSV: 2 MB (row and cell size)
    • JSON: 2 MB (row size)
    • Avro: 16 MB (block size)
  • Maximum columns per table: 10,000
  • Maximum file sizes:
    • CSV: 4 GB compressed; uncompressed, 4 GB with quoted new-lines in values or 5 TB without new-lines in values
    • JSON: 4 GB compressed; 5 TB uncompressed
    • Avro: compressed Avro files are not supported, but compressed data blocks are (BigQuery supports the DEFLATE codec); 5 TB uncompressed (2 MB for the file header)
  • Maximum size per load job: 12 TB across all input files for CSV and JSON
  • Maximum number of files per load job: 10,000

For more information about BigQuery's supported data formats, see preparing data for BigQuery.
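
To make the daily job limits concrete, here is a minimal load-job sketch, again assuming the google-cloud-bigquery Python client; the bucket, dataset, and table names are placeholders. Each call like this counts as one load job against both daily limits, whether it succeeds or fails:

    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )

    # One call is one load job against both daily quotas, even on failure.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/data/*.csv",
        "example-project.example_dataset.example_table",
        job_config=job_config,
    )
    load_job.result()  # Wait for completion; raises on failure.

A wildcard source URI lets a single job ingest many files, which conserves the daily job quotas while staying under the 10,000-files-per-job cap.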

Export Requests

The following limits apply for exporting data from BigQuery.

  • Daily limit: 1,000 exports per day, up to 10 TB
  • Multiple wildcard URI limit: 500 URIs per export
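
A minimal export sketch under the same client-library assumption (names are placeholders). The wildcard in the destination URI lets BigQuery shard the output across numbered files, and up to 500 wildcard URIs may be supplied per export:

    from google.cloud import bigquery

    client = bigquery.Client()

    # The * is replaced with a zero-padded shard number, so large tables
    # are split across multiple files in Cloud Storage.
    extract_job = client.extract_table(
        "example-project.example_dataset.example_table",
        "gs://example-bucket/exports/shard-*.csv",
    )
    extract_job.result()  # Wait for the export to finish.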

Streaming Inserts

The following limits apply for streaming data into BigQuery.

  • Maximum row size: 1 MB
  • HTTP request size limit: 10 MB
  • Maximum rows per second: 100,000 rows per second per table, and 1,000,000 rows per second per project. Exceeding either limit causes quota_exceeded errors. If your project requires a higher limit, contact support.
  • Maximum rows per request: There is no hard limit, but about 500 rows per request is a good starting point (see the sketch after this list). Batching can increase performance and throughput up to a point, at the cost of per-request latency: too few rows per request and the overhead of each request makes ingestion inefficient; too many and throughput may drop. Experiment with representative data (schema and data sizes) to determine the ideal batch size.
  • Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause quota_exceeded errors.
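
The following sketch applies the batching guidance above, assuming the google-cloud-bigquery Python client; the table ID and row shape are placeholders, and the 500-row batch size is a starting point to tune rather than a hard rule:

    from google.cloud import bigquery

    client = bigquery.Client()
    table_id = "example-project.example_dataset.example_table"

    # Placeholder rows; real schemas and row sizes change the ideal batch.
    rows = [{"name": "row-{}".format(i), "value": i} for i in range(5000)]

    BATCH_SIZE = 500  # Starting point; tune with representative data.

    for start in range(0, len(rows), BATCH_SIZE):
        batch = rows[start:start + BATCH_SIZE]
        # insert_rows_json returns row-level errors rather than raising;
        # request-level failures (for example, quota errors) raise instead.
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            print("Insert errors: {}".format(errors))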

API Requests

  • API requests per second, per user: If you make more than 100 requests per second, throttling might occur. This limit does not apply to streaming inserts.

Error Codes

Quota and limit errors return a 403 HTTP response code. See troubleshooting errors for a full list of error codes and troubleshooting steps.
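
As a sketch of how a client might react to these errors, assuming the google-cloud-bigquery and google-api-core Python libraries: quota and rate-limit failures surface as HTTP 403 (Forbidden in this client), and retrying with exponential backoff is the usual response:

    import time

    from google.api_core.exceptions import Forbidden
    from google.cloud import bigquery

    client = bigquery.Client()

    delay = 1.0
    for attempt in range(5):
        try:
            client.query("SELECT 1").result()
            break
        except Forbidden as exc:  # 403: quota or rate limit exceeded.
            print("Attempt {} hit a 403: {}".format(attempt + 1, exc))
            time.sleep(delay)
            delay *= 2  # Exponential backoff before the next attempt.
    else:
        raise RuntimeError("Query still failing after retries.")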
