Quotas & Limits

BigQuery limits the maximum rate of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.

The lists below outline the current rate limits and quota limits of the system.

Query jobs

The following limits apply to query jobs created automatically by running interactive queries and to jobs submitted programmatically using jobs.query and query-type jobs.insert method calls.

Queries whose results are returned from the query cache and dry-run queries do not count against these limits. You can specify a dry run by using the --dry_run flag or by setting the dryRun property in a query job.
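For example, a dry run can be expressed as a jobs.query request body with the dryRun property set. A minimal sketch (the query string is illustrative; only the request body is built here, nothing is sent):

```python
import json

def dry_run_request(sql):
    """Build a jobs.query request body that validates the query and
    estimates bytes processed without executing it (and without
    counting against the query limits described above)."""
    return {
        "query": sql,
        "useLegacySql": False,
        "dryRun": True,  # validate and estimate only; no job is run
    }

body = dry_run_request("SELECT 1")
print(json.dumps(body, indent=2))
```

The equivalent with the command-line tool is bq query --dry_run 'SELECT 1'.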

  • Concurrent rate limit for legacy SQL queries that contain user-defined functions (UDFs) — 6 concurrent queries

The concurrent rate limit for legacy SQL queries that contain UDFs includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to standard SQL queries.

  • Daily query size limit — Unlimited by default

You may specify limits on the amount of data users can query by setting custom quotas.

  • Daily destination table update limit — 1,000 updates per table per day

Destination tables in a query job are subject to the limit of 1,000 updates per table per day. Destination table updates include append and overwrite operations performed by a query using the BigQuery web UI, the bq command-line tool, or the jobs.query and query-type jobs.insert API methods.

  • Query execution time limit — 6 hours

  • Maximum number of tables referenced per query — 1,000

  • Maximum unresolved query length — 256 KB

  • Maximum resolved query length — 12 MB

The limit on resolved query length includes the length of all views and wildcard tables referenced by the query.

  • Maximum response size — 128 MB compressed¹

¹ Sizes vary depending on compression ratios for the data. The actual response size may be significantly larger than 128 MB.

The maximum response size is unlimited when writing large query results to a destination table.

  • Maximum row size — 100 MB²

² The maximum row size limit is approximate, as it is based on the internal representation of row data. The limit is enforced during certain stages of query job execution.

  • Maximum concurrent slots per project for on-demand pricing — 2,000

The default number of slots for on-demand queries is shared among all queries in a single project. As a rule of thumb, if you're processing less than 100 GB of data at once, you're unlikely to be using all 2,000 slots.

To check how many slots you're using, see Monitoring BigQuery Using Stackdriver. If you need more than 2,000 slots, contact your sales representative to discuss whether flat-rate pricing meets your needs.

  • Maximum concurrent queries against a Cloud Bigtable external data source — 4

For information on limits that apply to user-defined functions in SQL queries, see UDF limits.

Load jobs

The following limits apply when you load data into BigQuery. They cover jobs created automatically by loading data using the command-line tool or the BigQuery web UI, as well as load jobs submitted programmatically using the load-type jobs.insert API method.

  • Load jobs per table per day — 1,000 (including failures)
  • Load jobs per project per day — 50,000 (including failures)
  • The limit of 1,000 load jobs per table per day cannot be raised.
  • Row and cell size limits:
    Data format — Max limit
    CSV — 100 MB (row and cell size)
    JSON — 100 MB (row size)
    Avro — 16 MB (block size)
  • Maximum columns per table — 10,000
  • Maximum file sizes:
    File type — Compressed — Uncompressed
    CSV — 4 GB — 5 TB
    JSON — 4 GB — 5 TB
    Avro — Compressed Avro files are not supported, but compressed data blocks are (BigQuery supports the DEFLATE and Snappy codecs) — 5 TB (1 MB for the file header)
  • Maximum size per load job — 15 TB across all input files for CSV, JSON, and Avro
  • Maximum number of source URIs in job configuration — 10,000 URIs
  • Maximum number of files per load job — 10 million total files, including all files matching all wildcard URIs
  • Load job execution time limit — 6 hours
  • With the exception of US-based datasets, you must load data from a Cloud Storage bucket in the same region as the dataset's location (the bucket can be either a multi-regional bucket or a regional bucket in the same region as the dataset). You can load data into a US-based dataset from any region.
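The per-file size limits can be checked client-side before a load job is submitted. A minimal sketch (hypothetical helper; the limits are hard-coded from the table above, and only CSV and JSON are covered):

```python
GB, TB = 1024**3, 1024**4

# (compressed_max, uncompressed_max) in bytes, per the limits above
LIMITS = {
    "CSV": (4 * GB, 5 * TB),
    "JSON": (4 * GB, 5 * TB),
}

def check_load_file(fmt, size_bytes, compressed):
    """Return True if a single input file fits the per-file size limit."""
    compressed_max, uncompressed_max = LIMITS[fmt]
    limit = compressed_max if compressed else uncompressed_max
    return size_bytes <= limit

# A 3 GB gzipped CSV fits; a 5 GB gzipped CSV exceeds the 4 GB limit.
print(check_load_file("CSV", 3 * GB, compressed=True))
print(check_load_file("CSV", 5 * GB, compressed=True))
```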

For more information, see Introduction to Loading Data into BigQuery.

Copy jobs

The following limits apply to copying tables in BigQuery. The limits apply to jobs created automatically by copying data using the command-line tool or the BigQuery web UI. The limits also apply to copy jobs submitted programmatically using the copy-type jobs.insert API method.

  • Copy jobs per destination table per day — 1,000 (including failures)
  • Copy jobs per project per day — 10,000 (including failures)

Export jobs

The following limits apply to jobs that export data from BigQuery, whether created automatically by exporting data using the command-line tool or the BigQuery web UI, or submitted programmatically using the export-type jobs.insert API method.

  • Exports per day — 50,000 exports per project and up to 10 TB per day (the 10 TB data limit is cumulative across all exports)
  • Wildcard URIs — 500 wildcard URIs per export

Dataset limits

The following limits apply to datasets:

  • Number of datasets per project — unrestricted
    The number of datasets per project is not subject to a quota; however, as you approach thousands of datasets in a project, web UI performance begins to degrade, and listing datasets becomes slower.
  • Number of tables per dataset — unrestricted
    The number of tables per dataset is also unrestricted, but as you approach 50,000 or more tables in a dataset, enumerating them becomes slower. Enumeration performance suffers whether you use an API call, the web UI, or the __TABLES_SUMMARY__ meta table. To improve UI performance, you can use the ?minimal parameter to limit the number of tables displayed to 30,000 per project. You add the parameter to the BigQuery web UI URL in the following format: https://bigquery.cloud.google.com/queries/[PROJECT_NAME]?minimal.
  • Maximum number of authorized views in a dataset's access control list — 1,000
    You can create an authorized view to restrict access to your source data. An authorized view is created using a SQL query that excludes columns you do not want users to see when they query the view. You can add up to 1,000 authorized views to a dataset's access control list.
  • Maximum rate of dataset metadata update operations — 1 operation every 2 seconds per dataset
    The dataset metadata update limit includes all metadata update operations performed using the BigQuery web UI, the bq command-line tool, or by calling the datasets.insert, datasets.patch, or datasets.update API methods.

Table limits

The following limits apply to BigQuery tables.

Standard tables

  • Maximum number of table operations per day — 1,000

You are limited to 1,000 operations per table per day whether the operation appends data to a table, overwrites a table, or uses a DML INSERT statement to write data to a table.

The maximum number of table operations includes the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table or that use a DML INSERT statement to write data to a table.

For example, if you run 500 copy jobs that append data to mytable and 500 query jobs that append data to mytable, you would reach the quota.
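The shared accounting can be sketched as a simple counter (hypothetical helper; BigQuery enforces this server-side):

```python
from collections import Counter

# Load jobs, copy jobs, query jobs that append to or overwrite the table,
# and DML INSERT statements all draw from the same daily limit.
DAILY_TABLE_OP_LIMIT = 1000
ops = Counter()

def record_op(table):
    """Count one table operation; return False once the daily quota is exceeded."""
    ops[table] += 1
    return ops[table] <= DAILY_TABLE_OP_LIMIT

# 500 copy jobs plus 500 query jobs appending to mytable reach the quota:
for _ in range(999):
    record_op("mytable")
assert record_op("mytable") is True   # 1,000th operation: at the limit
assert record_op("mytable") is False  # 1,001st operation: over quota
```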

  • Maximum rate of table metadata update operations — 1 operation every 2 seconds per table

The table metadata update limit includes all metadata update operations performed using the BigQuery web UI, the bq command-line tool, or by calling the tables.insert, tables.patch, or tables.update API methods. This limit also applies to job output.

Partitioned tables

  • Maximum number of partitions per partitioned table — 4,000

  • Maximum number of partitions modified by a single job — 2,000

Each job operation (query or load) can affect a maximum of 2,000 partitions. Any query or load job that affects more than 2,000 partitions is rejected by Google BigQuery.

  • Maximum number of partition modifications per day per table — 5,000

You are limited to a total of 5,000 partition modifications per day for a partitioned table. A partition can be modified by using an operation that appends to or overwrites data in the partition. Operations that modify partitions include: a load job, a query that writes results to a partition, or a DML statement (INSERT, DELETE, UPDATE, or MERGE) that modifies data in a partition.

More than one partition may be affected by a single job. For example, a DML statement can update data in multiple partitions (for both ingestion-time and partitioned tables). Query jobs and load jobs can also write to multiple partitions but only for partitioned tables. Google BigQuery uses the number of partitions affected by a job when determining how much of the quota the job consumes. Streaming inserts do not affect this quota.
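The quota accounting described above can be sketched as follows (hypothetical helper names; BigQuery performs this accounting server-side):

```python
MAX_PARTITIONS_PER_JOB = 2000   # per-job limit; jobs over this are rejected
DAILY_PARTITION_MODS = 5000     # per-table daily modification limit

def job_allowed(partitions_touched, mods_used_today):
    """A job consumes quota equal to the number of partitions it touches,
    against both the per-job and the per-day limits."""
    if partitions_touched > MAX_PARTITIONS_PER_JOB:
        return False  # rejected outright, regardless of remaining daily quota
    return mods_used_today + partitions_touched <= DAILY_PARTITION_MODS

print(job_allowed(2001, 0))    # over the per-job limit
print(job_allowed(50, 4950))   # exactly reaches the daily limit
```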

  • Maximum rate of partition operations — 50 partition operations every 10 seconds

UDF limits

The following limits apply to user-defined functions in SQL queries.

  • The amount of data that your JavaScript UDF outputs when processing a single row — approximately 5 MB or less.
  • Concurrent rate limit for legacy SQL queries that contain user-defined functions (UDFs) — 6 concurrent queries.
  • The concurrent rate limit for legacy SQL queries that contain UDFs includes both interactive and batch queries. Interactive queries that contain UDFs also count toward the concurrent rate limit for interactive queries. This limit does not apply to standard SQL queries.

  • A query job can have a maximum of 50 JavaScript UDF resources (inline code blobs or external files).
  • Each inline code blob is limited to a maximum size of 32 KB.
  • Each external code resource is limited to a maximum size of 1 MB.

Data Manipulation Language statements

The following limits apply to Data Manipulation Language (DML) statements.

  • Maximum number of combined UPDATE, DELETE, and MERGE statements per day per table — 200
  • Maximum number of combined UPDATE, DELETE, and MERGE statements per day per project — 10,000
  • Maximum number of INSERT statements per day per table — 1,000

A MERGE statement is counted as a single DML statement, even if it contains multiple INSERT, UPDATE, or DELETE clauses.

DML statements are subject to tighter limits than SELECT statements because processing DML statements requires significantly more resources.

BigQuery ML limits

The following limits apply to standard SQL query jobs that use BigQuery ML statements and functions.

  • Queries that use the CREATE MODEL statement — 100 queries
    You are limited to 100 CREATE MODEL queries per day per project.

Streaming inserts

The following limits apply for streaming data into BigQuery.

  • Maximum row size: 1 MB. Exceeding this value will cause invalid errors.
  • HTTP request size limit: 10 MB. Exceeding this value will cause invalid errors.
  • Maximum rows per second: 100,000 rows per second, per project. Exceeding this amount will cause quotaExceeded errors. The maximum number of rows per second per table is also 100,000. You can use all of this quota on one table or you can divide this quota among several tables in a project.
  • Maximum rows per request: 10,000 rows per request. We recommend a maximum of 500 rows. Batching can increase performance and throughput up to a point, at the cost of per-request latency: with too few rows per request, the overhead of each request makes ingestion inefficient; with too many, throughput may drop. Experiment with representative data (schema and data sizes) to determine the ideal batch size.
  • Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause quotaExceeded errors.
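The batching guidance above can be sketched as a small helper that splits rows into requests of at most 500 rows (a hypothetical helper, not part of the BigQuery client libraries; byte-size checks against the 1 MB row and 10 MB request limits are omitted for brevity):

```python
RECOMMENDED_BATCH = 500  # recommended rows per insertAll request

def batch_rows(rows, batch_size=RECOMMENDED_BATCH):
    """Yield successive batches of at most batch_size rows."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# 1,200 rows split into batches of 500, 500, and 200:
batches = list(batch_rows([{"id": n} for n in range(1200)]))
print([len(b) for b in batches])
```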

If you need more streaming data quota for your project, you can submit a request from the Google Cloud Platform Console page. You can set a custom quota on streaming data in increments of 50,000 rows. You will usually receive a response within 2 to 3 business days.

API requests

All API requests

The following limits apply to all BigQuery API requests:

  • API requests per second, per user — 100
    If you make more than 100 requests per second, throttling might occur. This limit does not apply to streaming inserts.
  • Concurrent API requests, per user: 300
    If you make more than 300 concurrent requests per user, throttling might occur. This limit does not apply to streaming inserts.
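One way to stay under the per-user request rate is client-side throttling. The sketch below is a minimal token-bucket limiter (an illustrative pattern, not part of any BigQuery API):

```python
import time

class RateLimiter:
    """Token bucket refilled at the allowed rate (default 100 req/s,
    matching the per-user API request limit above)."""

    def __init__(self, rate_per_sec=100):
        self.rate = rate_per_sec
        self.tokens = float(rate_per_sec)
        self.last = time.monotonic()

    def acquire(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the full bucket.
        self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # safe to send the request now
        return False      # caller should back off and retry later
```

A caller would check acquire() before each API request and sleep briefly when it returns False.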

tabledata.list requests

The tabledata.list method retrieves table data from a specified set of rows. The following limits apply to tabledata.list requests:

  • Maximum bytes per second per project returned by calls to tabledata.list: 60 MB/second
    When you call tabledata.list, you can return a maximum of 60 MB per second of table row data per project. The limit applies to the project that contains the table being read.
  • Maximum rows per second per project returned by calls to tabledata.list: 150,000/second
    When you call tabledata.list, you can return a maximum of 150,000 table rows per second per project. The limit applies to the project that contains the table being read.

tables.insert requests

The tables.insert method creates a new, empty table in a dataset. The following limits apply to tables.insert requests:

  • Maximum requests per second per project: 10
    When you call tables.insert, you can make a maximum of 10 requests per second per project. This limit includes statements that create tables, such as the CREATE TABLE DDL statement, and queries that write results to destination tables.

projects.list requests

The projects.list method lists all projects to which you have been granted access. The following limits apply to projects.list requests:

  • Maximum requests per second per project: 2
    When you call projects.list, you can make a maximum of 2 requests per second per project.

jobs.get requests

The jobs.get method returns information about a specific job. The following limits apply to jobs.get requests:

  • Maximum requests per second per project: 1,000
    When you call jobs.get, you can make a maximum of 1,000 requests per second per project.

When are quotas refilled?

Daily quotas are replenished at regular intervals throughout the day, reflecting their intent to guide rate-limiting behavior. Quota is also refreshed intermittently to avoid long disruptions when it is exhausted: more quota typically becomes available within minutes rather than being replenished globally once per day.

Error codes

Quota and limit errors return either a 403 or a 400 HTTP response code. See troubleshooting errors for a full list of error codes and troubleshooting steps.
