Troubleshoot quota and limit errors

BigQuery has various quotas and limits that restrict the rate and volume of different requests and operations. These quotas exist both to protect the infrastructure and to guard against unexpected customer usage. This document describes how to diagnose and mitigate specific errors resulting from quotas and limits.

If your error message is not listed in this document, refer to the list of error messages, which contains more general error information.


If a BigQuery operation fails because a quota is exceeded, the API returns the HTTP 403 Forbidden status code. The response body contains more information about the quota that was reached and looks similar to the following:

  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Quota exceeded: ...",
    "reason" : "quotaExceeded"
  } ],
  "message" : "Quota exceeded: ..."

The message field in the payload describes which limit was exceeded. For example, the message field might say Exceeded rate limits: too many table update operations for this table.

In general, quota limits fall into two categories, indicated by the reason field in the response payload.

  • rateLimitExceeded. This value indicates a short-term limit. To resolve these limit issues, retry the operation after a few seconds. Use exponential backoff between retry attempts; that is, exponentially increase the delay between each retry.

  • quotaExceeded. This value indicates a longer-term limit. If you reach a longer-term quota limit, you should wait 10 minutes or longer before trying the operation again. If you consistently reach one of these longer-term quota limits, you should analyze your workload for ways to mitigate the issue. Mitigations can include optimizing your workload or requesting a quota increase.
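As a rough illustration of the backoff guidance for rateLimitExceeded errors, a retry loop might look like the following Python sketch. The `operation` callable is a generic placeholder for any BigQuery API call; this helper is not part of the BigQuery client library.

```python
import random
import time

def retry_with_backoff(operation, max_retries=5, base_delay=1.0, max_delay=32.0):
    """Retry a callable, exponentially increasing the delay between attempts.

    Intended for transient rateLimitExceeded-style failures; `operation` is
    any callable that raises an exception on failure.
    """
    for attempt in range(max_retries):
        try:
            return operation()
        except Exception:
            if attempt == max_retries - 1:
                raise  # Give up after the final attempt.
            # Exponential backoff with jitter: ~1s, 2s, 4s, ... capped at max_delay.
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, 1))
```

In production code, catch only the specific quota error you expect rather than every exception.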

For quotaExceeded errors, examine the error message to understand which quota limit was exceeded. Then, analyze your workload to see if you can avoid reaching the quota. For example, optimizing query performance can mitigate quota errors for concurrent queries.

In some cases, the quota can be raised by contacting BigQuery support or contacting Google Cloud sales, but we recommend trying the suggestions in this document first.


To diagnose issues, do the following:

  • Use INFORMATION_SCHEMA views to analyze the underlying issue. These views contain metadata about your BigQuery resources, including jobs, reservations, and streaming inserts.

    For example, the following query uses the INFORMATION_SCHEMA.JOBS view to list all quota-related errors within the past day:

    SELECT
      job_id,
      creation_time,
      error_result
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
      AND error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')

  • View errors in Cloud Audit Logs.

    For example, using Logs Explorer, the following query returns errors with either Quota exceeded or limit in the message string:

    resource.type = ("bigquery_project" OR "bigquery_dataset")
    protoPayload.status.code = "7"
    protoPayload.status.message: ("Quota exceeded" OR "limit")

    In this example, the status code 7 indicates PERMISSION_DENIED, which corresponds to the HTTP 403 status code.

    For additional Cloud Audit Logs query samples, see BigQuery queries.

Concurrent queries quota errors

If a project simultaneously runs more interactive queries than its assigned limit, you might encounter this error.

For more information about this limit, see the Maximum number of concurrent interactive queries limit.

Error message

Exceeded rate limits: too many concurrent queries for this project_and_region

When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.


Diagnosis

If you haven't identified the query jobs that are returning this error, do the following:

  • Check for other queries that are running concurrently with the failed queries.

    For example, if your failed query was submitted on 2021-06-08 12:00:00 UTC in the us region, run the following query against the INFORMATION_SCHEMA.JOBS view in the project where the failed query was submitted:

    DECLARE failed_query_submission_time DEFAULT CAST('2021-06-08 12:00:00' AS TIMESTAMP);

    SELECT
      job_id,
      creation_time,
      user_email
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time >= TIMESTAMP_SUB(failed_query_submission_time, INTERVAL 1 DAY)
      AND job_type = 'QUERY'
      AND priority = 'INTERACTIVE'
      AND start_time <= failed_query_submission_time
      AND (end_time >= failed_query_submission_time OR state != 'DONE')
  • If your query to INFORMATION_SCHEMA.JOBS_BY_PROJECT fails with the same error, then run the bq ls command in the Cloud Shell terminal to list the queries that are running:

    bq ls -j --format=prettyjson -n 200 | jq '.[] | select(.status.state=="RUNNING")| {configuration: .query, id: .id, jobReference: .jobReference, user_email: .user_email}'
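If jq is not available, the same filtering can be done in Python over the JSON that `bq ls -j --format=prettyjson` emits. This is a sketch that mirrors the jq expression above field-for-field; the exact field paths in your output depend on the jobs.list response.

```python
import json

def running_queries(bq_ls_output: str):
    """Mimic the jq filter: keep only jobs whose status.state is RUNNING,
    selecting the same fields the jq expression selects."""
    jobs = json.loads(bq_ls_output)
    return [
        {
            "configuration": job.get("query"),
            "id": job.get("id"),
            "jobReference": job.get("jobReference"),
            "user_email": job.get("user_email"),
        }
        for job in jobs
        if job.get("status", {}).get("state") == "RUNNING"
    ]
```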


Resolution

To resolve this quota error, do the following:

  • Pause the job. If your preceding diagnosis identifies a process or a workflow responsible for an increase in queries, then pause that process or workflow.

  • Use jobs with batch priority. Batch queries don't count towards your concurrent rate limit. Running batch queries can allow you to start many queries at once. Batch queries use the same resources as interactive (on-demand) queries. BigQuery queues each batch query on your behalf, and starts the query when idle resources are available in the BigQuery shared resource pool.

    Alternatively, you can create a separate project to run queries.

  • Distribute queries. Organize and distribute the load across different projects as informed by the nature of your queries and your business needs.

  • Distribute run times. Distribute the load across a larger time frame. If your reporting solution needs to run many queries, try to introduce some randomness for when queries start. For example, don't start all reports at the same time.

  • Use BigQuery BI Engine. If you encounter this error while using a business intelligence (BI) tool to create dashboards that query data in BigQuery, then we recommend using BigQuery BI Engine, which is designed for this use case.

  • Optimize queries and data model. Oftentimes, a query can be rewritten so that it runs more efficiently. For example, if your query contains a common table expression (CTE), defined with a WITH clause, that is referenced in more than one place in the query, then its computation is performed multiple times. It is better to persist the calculations done by the CTE in a temporary table, and then reference that table in the query.

    Multiple joins can also be a source of inefficiency. In this case, consider using nested and repeated columns. Doing so often improves locality of the data, eliminates the need for some joins, and reduces overall resource consumption and query run time.

    Optimizing queries makes them cheaper, so when you use flat-rate pricing, you can run more queries with your slots. For more information, see Introduction to optimizing query performance.

  • Optimize query model. BigQuery is not a relational database, and it is not optimized for an unbounded number of small queries. Running a large number of small queries quickly depletes your quotas, and such queries don't run as efficiently as they do on smaller database products. BigQuery is a large data warehouse, and that is its primary use case: it performs best with analytical queries over large amounts of data.

  • Persist data (Saved tables). Pre-process the data in BigQuery and store it in additional tables. For example, if you execute many similar, computationally intensive queries with different WHERE conditions, then their results are not cached, and such queries consume resources each time they run. You can improve the performance of such queries and decrease their processing time by pre-computing the data and storing it in a table, which SELECT queries can then read. Pre-computation can often be done during ingestion within the ETL process, or by using scheduled queries or materialized views.

  • Use dry run mode. Run queries in dry run mode, which estimates the number of bytes read but does not actually process the query.

  • Preview table data. To experiment with or explore data rather than running queries, preview table data with BigQuery's table preview capability.

  • Use cached query results. All query results, including both interactive and batch queries, are cached in temporary tables for approximately 24 hours with some exceptions. While running a cached query does still count against your concurrent query limit, queries that use cached results are significantly faster than queries that don't use cached results because BigQuery does not need to compute the result set.

  • Increase quota limits. You can also increase the quota limits to resolve this error. To raise the limit, contact support or contact sales. Requesting a quota increase might take several days to process. When you file the request, we recommend that you include the priority of the job, the user running the query, and the affected method.

    Limits are applied at the project level. However, increasing the number of concurrent jobs per project reduces the number of available slots for each concurrently running query, which might reduce performance of individual queries. To improve the performance, we recommend that you increase the number of slots if the concurrent queries limit is increased.

    To learn more about raising this limit, see Quotas and limits. For more information about slots, see slot reservation.
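The suggestion above to distribute run times can be as simple as adding a random delay before each scheduled report starts. The following is a generic Python sketch, not a BigQuery API:

```python
import random

def jittered_start_offsets(num_reports: int, window_seconds: float = 600.0, seed=None):
    """Return sorted random start offsets (in seconds) that spread
    `num_reports` report queries across a window instead of starting
    them all at t=0."""
    rng = random.Random(seed)
    return sorted(rng.uniform(0.0, window_seconds) for _ in range(num_reports))
```

For example, spreading 20 reports over a 10-minute window keeps them from all counting against the concurrent query limit at once.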
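The pre-computation pattern described under Persist data amounts to running a CREATE TABLE ... AS SELECT statement from your ETL job or scheduled query. The sketch below builds such a statement as a string; the table and column names are hypothetical, and in practice you would submit the statement through the BigQuery client or console:

```python
def precompute_table_ddl(target_table: str, source_table: str, where_clause: str) -> str:
    """Build a CREATE OR REPLACE TABLE ... AS SELECT statement that persists
    an expensive aggregation, so later queries read the stored result instead
    of recomputing it. Column names here are illustrative only."""
    return (
        f"CREATE OR REPLACE TABLE `{target_table}` AS\n"
        "SELECT customer_id, SUM(amount) AS total_amount\n"
        f"FROM `{source_table}`\n"
        f"WHERE {where_clause}\n"
        "GROUP BY customer_id"
    )
```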

Number of partition modifications for column-partitioned tables quota errors

BigQuery returns this error when your column-partitioned table reaches the quota of the number of partition modifications permitted per day. Partition modifications include the total of all load jobs, copy jobs, and query jobs that append or overwrite a destination partition, or that use a DML DELETE, INSERT, MERGE, TRUNCATE TABLE, or UPDATE statement to write data to a table.

To see the value of the Number of partition modifications per column-partitioned table per day limit, see Partitioned tables.

Error message

Quota exceeded: Your table exceeded quota for
Number of partition modifications to a column partitioned table


This quota cannot be increased. To resolve this quota error, do the following: