Troubleshooting BigQuery quota errors

BigQuery has various quotas that limit the rate and volume of incoming requests. These quotas exist both to protect the backend systems and to help guard against unexpected billing if you submit very large jobs. This document describes how to diagnose and mitigate specific errors that result from quotas. If your exact error message is not listed in this document, refer to Error messages, which has more general error information.


If a BigQuery operation fails because of a quota limit, the API returns the HTTP 403 Forbidden status code. The response body contains more information about the limit that was reached. The response body looks similar to the following:

  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Quota exceeded: ...",
    "reason" : "quotaExceeded"
  } ],
  "message" : "Quota exceeded: ..."

The message field in the payload describes which limit was exceeded. For example, the message field might say Exceeded rate limits: too many table update operations for this table.

In general, quota limits fall into two categories, indicated by the reason field in the response payload.

  • rateLimitExceeded. This value indicates a short-term limit. Usually you can resolve these limits by retrying the operation after a few seconds. Use exponential backoff between retry attempts. That is, exponentially increase the delay between each retry.

  • quotaExceeded. This value indicates a longer-term limit. If you reach a longer-term quota limit, you should wait 10 minutes or longer before trying the operation again. If you consistently reach one of these longer-term quota limits, you should analyze your workload for ways to mitigate the issue. Mitigations can include optimizing your workload or requesting a quota increase.

For quotaExceeded errors, examine the error message to understand which quota limit was exceeded. Then analyze your workload to see if you can avoid reaching the quota, for example by optimizing query performance. In some cases, the quota can be raised by contacting BigQuery support or contacting Google Cloud sales, but we recommend trying the suggestions in this document first.
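The retry guidance for rateLimitExceeded errors can be sketched in code. The following is a minimal, library-agnostic Python sketch; the exception class and function names are illustrative, not part of any BigQuery client library:

```python
import random
import time

class QuotaExceededError(Exception):
    """Stand-in for an HTTP 403 rateLimitExceeded response."""

def call_with_backoff(operation, max_retries=5, base_delay=1.0,
                      max_delay=32.0, sleep=time.sleep):
    """Retry `operation` with exponential backoff plus jitter.

    Only transient rateLimitExceeded-style failures are worth retrying
    this way; longer-term quotaExceeded errors usually need a workload
    change or a quota increase instead.
    """
    for attempt in range(max_retries + 1):
        try:
            return operation()
        except QuotaExceededError:
            if attempt == max_retries:
                raise
            # Double the delay on each attempt (capped), plus random
            # jitter so that many clients don't retry in lockstep.
            delay = min(base_delay * (2 ** attempt), max_delay)
            sleep(delay + random.uniform(0, 1))
```

Production BigQuery client libraries ship with retry policies that implement this pattern for you; the sketch only shows the shape of the behavior.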

You can use INFORMATION_SCHEMA views to analyze the underlying issue. These views contain metadata about your BigQuery resources, including jobs, reservations, and streaming inserts. For example, the following query uses the JOBS_BY_PROJECT view to list all quota-related errors within the past day.

      SELECT
        job_id,
        creation_time,
        error_result
      FROM `region-REGION`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
      WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
        AND error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')

You can also view errors in Cloud Audit Logs. For example, using Logs Viewer, the following query finds errors with either Quota exceeded or limit in the message string:

resource.type = ("bigquery_project" OR "bigquery_dataset")
protoPayload.status.code = "7"
protoPayload.status.message: ("Quota exceeded" OR "limit")

(Status code 7 is PERMISSION_DENIED, which corresponds to the HTTP 403 status code.)

For additional Cloud Audit Logs query samples, see BigQuery queries.

Concurrent queries quota errors

If you receive the error message Exceeded rate limits: too many concurrent queries for this project_and_region, your project exceeded the limit on the number of interactive query jobs that can run at the same time. When this limit is exceeded, new query jobs fail immediately.

By default, queries run in interactive mode. To prevent this error, switch the query to batch mode. A batch query is queued and runs when resources become available. Batch queries don't count toward your concurrent rate limit, which makes it easier to start many queries at once.
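At the API level, switching to batch mode means setting configuration.query.priority to BATCH in the jobs.insert request body. The following is a minimal sketch that builds such a body (the helper function name is illustrative):

```python
def batch_query_request(sql):
    """Build a BigQuery jobs.insert request body that runs `sql` as a
    batch-priority query. Batch jobs queue until resources are free and
    don't count against the concurrent interactive query limit."""
    return {
        "configuration": {
            "query": {
                "query": sql,
                "useLegacySql": False,
                # "BATCH" instead of the default "INTERACTIVE".
                "priority": "BATCH",
            }
        }
    }
```

With the bq command-line tool, the equivalent is the --batch flag, for example: bq query --batch --use_legacy_sql=false 'SELECT ...'.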

You can review the jobs currently running in the project by querying INFORMATION_SCHEMA.JOBS_BY_PROJECT to identify the largest consumers. To avoid affecting other users, ask these large consumers to reduce their concurrent queries, to use batch queries, or to move their workload to a separate project.

You might encounter this error when using a Business Intelligence (BI) tool to create dashboards that query data in BigQuery. We recommend BigQuery BI Engine, which is optimized for this use case.

You can raise the concurrent queries limit for a project. However, increasing this quota can cause more contention, because more queries compete for the same number of slots, which can degrade query performance. Therefore, if you increase the concurrent queries limit, we recommend also increasing the number of slots. For more information about raising this limit, see the Quotas and limits page.

Partitioned table quota errors

When you use partitioned tables, you might reach the BigQuery quotas and limits for partitioned tables.

If you receive the error message Quota exceeded: Your table exceeded quota for Number of partition modifications to a column partitioned table, this quota cannot be increased. To resolve this quota error, reduce the number of jobs that modify partitions, for example by batching multiple updates to the same partition into a single load or query job.

Streaming insert quota errors

This section gives some tips for troubleshooting quota errors related to streaming data into BigQuery.

In certain regions, streaming inserts have a higher quota if you don't populate the insertId field for each row. For more information about quotas for streaming inserts, see Streaming inserts. The quota-related errors for BigQuery streaming depend on the presence or absence of insertId.

If the insertId field is empty, the following quota error is possible:

  • Bytes per second per project. Error message: Your entity with gaia_id: GAIA_ID, project: PROJECT_ID in region: REGION exceeded quota for insert bytes per second.

If the insertId field is populated, the following quota errors are possible:

  • Rows per second per project. Error message: Your project: PROJECT_ID in REGION exceeded quota for streaming insert rows per second.

  • Rows per second per table. Error message: Your table: TABLE_ID exceeded quota for streaming insert rows per second.

  • Bytes per second per table. Error message: Your table: TABLE_ID exceeded quota for streaming insert bytes per second.

The purpose of the insertId field is to deduplicate inserted rows. If multiple inserts with the same insertId arrive within a few minutes' window, BigQuery writes a single version of the record. However, this automatic deduplication is not guaranteed. For maximum streaming throughput, we recommend that you don't include insertId and instead use manual deduplication. For more information, see Ensuring data consistency.


Use the STREAMING_TIMELINE_BY_* views to analyze the streaming traffic. These views aggregate streaming statistics over one-minute intervals, grouped by error code. Quota errors appear in the results with error_code equal to RATE_LIMIT_EXCEEDED or QUOTA_EXCEEDED.

Depending on the specific quota limit that was reached, look at total_rows or total_input_bytes. If the error is a table-level quota, filter by table_id. For example, the following query shows total bytes ingested per minute, and the total number of quota errors.

    SELECT
      start_timestamp,
      SUM(total_requests) AS total_requests,
      SUM(total_input_bytes) AS sum_input_bytes,
      SUM(IF(error_code IN ('QUOTA_EXCEEDED', 'RATE_LIMIT_EXCEEDED'),
          total_requests, 0)) AS quota_error
    FROM `region-REGION`.INFORMATION_SCHEMA.STREAMING_TIMELINE_BY_PROJECT
    WHERE start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
    GROUP BY start_timestamp
    ORDER BY start_timestamp DESC


If you are using the insertId field for deduplication, and your project is in a region that supports the higher streaming quota, we recommend removing the insertId field. This solution may require some additional steps to manually deduplicate the data. For more information, see Manually removing duplicates.
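Manual deduplication can happen client-side or inside BigQuery itself. The following is a minimal client-side Python sketch; it assumes each row carries an application-supplied row_key column, which is a hypothetical name, not a BigQuery field:

```python
def dedupe_rows(rows, key="row_key"):
    """Keep the first row seen for each value of `key`, preserving order.

    Sketch of deduplication when insertId is not used. In practice you
    would more often dedupe inside BigQuery with a query such as
    ROW_NUMBER() OVER (PARTITION BY row_key) and keep only row number 1.
    """
    seen = set()
    out = []
    for row in rows:
        k = row[key]
        if k not in seen:
            seen.add(k)
            out.append(row)
    return out
```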

If you are not using insertId, or if it's not feasible to remove it, monitor your streaming traffic over a 24-hour period and analyze the quota errors:

  • If you see mostly RATE_LIMIT_EXCEEDED errors rather than QUOTA_EXCEEDED errors, and your overall traffic is below 80% of quota, the errors probably indicate temporary spikes. You can handle these errors by retrying the operation, using exponential backoff between retries.

  • If you see QUOTA_EXCEEDED errors or the overall traffic consistently exceeds 80% of quota, submit a request for a quota increase. For more information, see Requesting a higher quota limit.

Loading CSV files quota errors

When you load a large CSV file by using the --allow_quoted_newlines flag, the load job can fail with the following error:

Input CSV files are not splittable and at least one of the files is larger than
the maximum allowed size. Size is: ...

To resolve the issue, either disable the --allow_quoted_newlines flag or split the CSV file into smaller chunks that are each less than 4 GB. For more information about the limits, see Load jobs.
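Splitting such a file naively on newline characters would break rows that contain quoted newlines. A safer approach is to split on row boundaries with a CSV parser. The following is a minimal Python sketch; the 4 GB limit is passed in as max_bytes, character count is used as a size proxy, and it assumes the file has a header row and that no single row exceeds max_bytes (oversized rows are emitted anyway):

```python
import csv
import io

def _new_chunk(header):
    """Start a fresh in-memory chunk that repeats the header row."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(header)
    return buf, writer

def split_csv(path, max_bytes, encoding="utf-8"):
    """Yield CSV chunks smaller than max_bytes, breaking only on row
    boundaries so quoted newlines inside fields stay intact."""
    with open(path, newline="", encoding=encoding) as f:
        reader = csv.reader(f)
        header = next(reader)
        buf, writer = _new_chunk(header)
        rows_in_chunk = 0
        for row in reader:
            mark = buf.tell()          # position before this row
            writer.writerow(row)
            if buf.tell() > max_bytes and rows_in_chunk > 0:
                # Emit everything up to the previous row, then start a
                # new chunk beginning with this row.
                yield buf.getvalue()[:mark]
                buf, writer = _new_chunk(header)
                writer.writerow(row)
                rows_in_chunk = 1
            else:
                rows_in_chunk += 1
        if rows_in_chunk:
            yield buf.getvalue()
```

Each emitted chunk repeats the header, so every chunk can be loaded as an independent CSV file.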