Troubleshoot quota and limit errors

BigQuery enforces various quotas and limits that restrict the rate and volume of different requests and operations. They exist both to protect the infrastructure and to help guard against unexpected customer usage. This document describes how to diagnose and mitigate specific errors resulting from quotas and limits.

If your error message is not listed in this document, then refer to the list of error messages, which has more generic error information.

Overview

If a BigQuery operation fails because a quota is exceeded, the API returns the HTTP 403 Forbidden status code. The response body contains more information about the quota that was reached and looks similar to the following:

{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Quota exceeded: ...",
    "reason" : "quotaExceeded"
  } ],
  "message" : "Quota exceeded: ..."
}

The message field in the payload describes which limit was exceeded. For example, the message field might say Exceeded rate limits: too many table update operations for this table.

In general, quota limits fall into two categories, indicated by the reason field in the response payload.

  • rateLimitExceeded. This value indicates a short-term limit. To resolve these limit issues, retry the operation after a few seconds. Use exponential backoff between retry attempts. That is, exponentially increase the delay between each retry.

  • quotaExceeded. This value indicates a longer-term limit. If you reach a longer-term quota limit, you should wait 10 minutes or longer before trying the operation again. If you consistently reach one of these longer-term quota limits, you should analyze your workload for ways to mitigate the issue. Mitigations can include optimizing your workload or requesting a quota increase.

For quotaExceeded errors, examine the error message to understand which quota limit was exceeded. Then, analyze your workload to see if you can avoid reaching the quota.

In some cases, the quota can be raised by contacting BigQuery support or contacting Google Cloud sales, but we recommend trying the suggestions in this document first.

Diagnosis

To diagnose issues, do the following:

  • Use INFORMATION_SCHEMA views to analyze the underlying issue. These views contain metadata about your BigQuery resources, including jobs, reservations, and streaming inserts.

    For example, the following query uses the INFORMATION_SCHEMA.JOBS view to list all quota-related errors within the past day (an aggregated variant of this query follows this list):

    SELECT
     job_id,
     creation_time,
     error_result
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY) AND
          error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')
    
  • View errors in Cloud Audit Logs.

    For example, using Logs Explorer, the following query returns errors with either Quota exceeded or limit in the message string:

    resource.type = ("bigquery_project" OR "bigquery_dataset")
    protoPayload.status.code ="7"
    protoPayload.status.message: ("Quota exceeded" OR "limit")
    

    In this example, the status code 7 indicates PERMISSION_DENIED, which corresponds to the HTTP 403 status code.

    For additional Cloud Audit Logs query samples, see BigQuery queries.
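
If you want an aggregated view of which limits you reach most often, the following variant of the preceding INFORMATION_SCHEMA.JOBS query counts quota-related errors by reason and message over the past day. This is a minimal sketch; adjust the region qualifier and the time window for your project:

SELECT
 error_result.reason AS reason,
 error_result.message AS message,
 COUNT(*) AS error_count
FROM `region-us`.INFORMATION_SCHEMA.JOBS
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY) AND
      error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')
GROUP BY reason, message
ORDER BY error_count DESC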

Query queue limit errors

If a project attempts to queue more interactive or batch queries than its queue limit allows, you might encounter this error.

Error message

Quota exceeded: Your project and region exceeded quota for
max number of jobs that can be queued per project.

Resolution

To resolve this quota error, do the following:

  • Pause the job. If you identify a process or a workflow responsible for an increase in queries, then pause that process or workflow.

  • Use jobs with batch priority. You can queue more batch queries than interactive queries.

  • Distribute queries. Organize and distribute the load across different projects as informed by the nature of your queries and your business needs.

  • Distribute run times. Distribute the load across a larger time frame. If your reporting solution needs to run many queries, try to introduce some randomness for when queries start. For example, don't start all reports at the same time.

  • Use BigQuery BI Engine. If you have encountered this error while using a business intelligence (BI) tool to create dashboards that query data in BigQuery, then we recommend that you use BigQuery BI Engine, which is optimal for this use case.

  • Optimize queries and data model. Oftentimes, a query can be rewritten so that it runs more efficiently. For example, if your query contains a common table expression (CTE) defined in a WITH clause that is referenced in more than one place in the query, then this computation is done multiple times. It is better to persist the calculation done by the CTE in a temporary table and then reference that table in the query (see the sketch after this list).

    Multiple joins can also be a source of inefficiency. In this case, you might want to consider using nested and repeated columns. Doing so often improves data locality, eliminates the need for some joins, and reduces overall resource consumption and query run time.

    Optimizing queries makes them cheaper, so when you use capacity-based pricing, you can run more queries with your slots. For more information, see Introduction to optimizing query performance.

  • Optimize query model. BigQuery is not a relational database and is not optimized for an unbounded number of small queries. Running a large number of small queries quickly depletes your quotas, and such queries don't run as efficiently as they would on a smaller database product. BigQuery is a large-scale data warehouse, and that is its primary use case: it performs best with analytical queries over large amounts of data.

  • Persist data (Saved tables). Pre-process the data in BigQuery and store it in additional tables. For example, if you execute many similar, computationally intensive queries with different WHERE conditions, then their results are not cached, and they consume resources each time they run. You can improve the performance of such queries and decrease their processing time by pre-computing the data and storing it in a table that your SELECT queries can then read. Pre-computation can often be done during ingestion within the ETL process, or by using scheduled queries or materialized views (a sketch follows this list).

  • Use dry run mode. Run queries in dry run mode, which estimates the number of bytes read but does not actually process the query.

  • Preview table data. To experiment with or explore data rather than running queries, preview table data with BigQuery's table preview capability.

  • Use cached query results. All query results, including both interactive and batch queries, are cached in temporary tables for approximately 24 hours, with some exceptions. While running a cached query still counts against your concurrent query limit, queries that use cached results are significantly faster than queries that don't, because BigQuery does not need to compute the result set again.
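
To illustrate the CTE suggestion above, the following is a minimal sketch; the dataset, table, and column names (my_dataset.events, user_id, event_time) are placeholders for your own schema. Instead of re-evaluating the same WITH clause in several places, persist it once in a temporary table within a multi-statement query:

-- Hypothetical example: daily_totals is referenced twice below, so it is
-- persisted once instead of being recomputed from a WITH clause each time.
CREATE TEMP TABLE daily_totals AS
SELECT user_id, DATE(event_time) AS day, COUNT(*) AS events
FROM my_dataset.events
GROUP BY user_id, day;

-- Both references now read the persisted result.
SELECT day, COUNT(DISTINCT user_id) AS metric_value
FROM daily_totals
GROUP BY day
UNION ALL
SELECT day, SUM(events) AS metric_value
FROM daily_totals
GROUP BY day;

For the "Persist data" suggestion, one option is a materialized view that pre-aggregates the data that many similar queries would otherwise recompute. Again, the names are placeholders:

-- Hypothetical example: pre-aggregated sales data for dashboard queries.
CREATE MATERIALIZED VIEW my_dataset.daily_sales_mv AS
SELECT store_id, order_date, SUM(amount) AS total_amount
FROM my_dataset.orders
GROUP BY store_id, order_date;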

Number of partition modifications for column-partitioned tables quota errors

BigQuery returns this error when your column-partitioned table reaches the quota of the number of partition modifications permitted per day. Partition modifications include the total of all load jobs, copy jobs, and query jobs that append or overwrite a destination partition.

To see the value of the Number of partition modifications per column-partitioned table per day limit, see Partitioned tables.

Error message

Quota exceeded: Your table exceeded quota for
Number of partition modifications to a column partitioned table

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • Change the partitioning on the table so that each partition contains more data, which decreases the total number of partitions. For example, change from partitioning by day to partitioning by month, or change how you partition the table (see the sketch after this list).
  • Use clustering instead of partitioning.
  • If you frequently load data from multiple small files stored in Cloud Storage and use a separate load job per file, then combine multiple load jobs into a single job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*).

    For more information, see Batch loading data.

  • If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions.
  • To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion and has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API; to see the costs of using this API, see BigQuery data ingestion pricing.
  • To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view (a monitoring query follows this list).
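
The following sketches illustrate the suggestions above; all dataset, table, and column names are placeholders. To move from daily to monthly partitions, you can create a new table and backfill it from the existing one:

-- Hypothetical example: repartition by month to reduce the number of
-- partitions that are modified each day.
CREATE TABLE my_dataset.events_monthly
PARTITION BY DATE_TRUNC(event_date, MONTH) AS
SELECT *
FROM my_dataset.events_daily;

To monitor partition modifications, one option is the INFORMATION_SCHEMA.PARTITIONS view, which reports when each partition was last modified:

-- Hypothetical example: find the most recently modified partitions of a table.
SELECT partition_id, total_rows, last_modified_time
FROM my_dataset.INFORMATION_SCHEMA.PARTITIONS
WHERE table_name = 'events_daily'
ORDER BY last_modified_time DESC;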

Streaming insert quota errors

This section gives some tips for troubleshooting quota errors related to streaming data into BigQuery.

In certain regions, streaming inserts have a higher quota if you don't populate the insertId field for each row. For more information about quotas for streaming inserts, see Streaming inserts. The quota-related errors for BigQuery streaming depend on the presence or absence of insertId.

Error message

If the insertId field is empty, the following quota error is possible:

  • Quota limit: Bytes per second per project
    Error message: Your entity with gaia_id: GAIA_ID, project: PROJECT_ID in region: REGION exceeded quota for insert bytes per second.

If the insertId field is populated, the following quota errors are possible:

  • Quota limit: Rows per second per project
    Error message: Your project: PROJECT_ID in REGION exceeded quota for streaming insert rows per second.
  • Quota limit: Rows per second per table
    Error message: Your table: TABLE_ID exceeded quota for streaming insert rows per second.
  • Quota limit: Bytes per second per table
    Error message: Your table: TABLE_ID exceeded quota for streaming insert bytes per second.

The purpose of the insertId field is to deduplicate inserted rows. If multiple inserts with the same insertId arrive within a few minutes' window, BigQuery writes a single version of the record. However, this automatic deduplication is not guaranteed. For maximum streaming throughput, we recommend that you don't include insertId and instead use manual deduplication. For more information, see Ensuring data consistency.

When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.

Diagnosis

Use the STREAMING_TIMELINE_BY_* views to analyze the streaming traffic. These views aggregate streaming statistics over one-minute intervals, grouped by error code. Quota errors appear in the results with error_code equal to RATE_LIMIT_EXCEEDED or QUOTA_EXCEEDED.

Depending on the specific quota limit that was reached, look at total_rows or total_input_bytes. If the error is a table-level quota, filter by table_id, as shown in the per-table variant after the example query.

For example, the following query shows total bytes ingested per minute, and the total number of quota errors:

SELECT
 start_timestamp,
 error_code,
 SUM(total_input_bytes) as sum_input_bytes,
 SUM(IF(error_code IN ('QUOTA_EXCEEDED', 'RATE_LIMIT_EXCEEDED'),
     total_requests, 0)) AS quota_error
FROM
 `region-us`.INFORMATION_SCHEMA.STREAMING_TIMELINE_BY_PROJECT
WHERE
  start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
GROUP BY
 start_timestamp,
 error_code
ORDER BY 1 DESC
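
If the error message points to a table-level quota, a per-table variant of the same query can help identify the affected table. This is a minimal sketch; adjust the region and time window as needed:

SELECT
 table_id,
 SUM(total_rows) AS sum_rows,
 SUM(IF(error_code IN ('QUOTA_EXCEEDED', 'RATE_LIMIT_EXCEEDED'),
     total_requests, 0)) AS quota_error
FROM
 `region-us`.INFORMATION_SCHEMA.STREAMING_TIMELINE_BY_PROJECT
WHERE
  start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
GROUP BY
 table_id
ORDER BY quota_error DESC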

Resolution

To resolve this quota error, do the following:

  • If you are using the insertId field for deduplication, and your project is in a region that supports the higher streaming quota, we recommend removing the insertId field. This solution may require some additional steps to manually deduplicate the data (a deduplication sketch follows this list). For more information, see Manually removing duplicates.

  • If you are not using insertId, or if it's not feasible to remove it, monitor your streaming traffic over a 24-hour period and analyze the quota errors:

    • If you see mostly RATE_LIMIT_EXCEEDED errors rather than QUOTA_EXCEEDED errors, and your overall traffic is below 80% of quota, the errors probably indicate temporary spikes. You can address these errors by retrying the operation using exponential backoff between retries.

    • If you are using a Dataflow job to insert data, consider using load jobs instead of streaming inserts. For more information, see Setting the insertion method. If you are using Dataflow with a custom I/O connector, consider using a built-in I/O connector instead. For more information, see Custom I/O patterns.

    • If you see QUOTA_EXCEEDED errors or the overall traffic consistently exceeds 80% of the quota, submit a request for a quota increase. For more information, see Requesting a higher quota limit.

    • You may also want to consider replacing streaming inserts with the newer Storage Write API, which has higher throughput, a lower price, and many useful features.
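
If you remove insertId and need to deduplicate manually, the general approach in Manually removing duplicates can be sketched as follows. The table and column names (my_dataset.streamed_events, row_key, insert_time) are placeholders; the sketch assumes your rows carry their own identifier and timestamp columns:

-- Hypothetical example: keep one row per row_key, preferring the most
-- recently streamed version.
SELECT * EXCEPT(row_num)
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (PARTITION BY row_key ORDER BY insert_time DESC) AS row_num
  FROM my_dataset.streamed_events
)
WHERE row_num = 1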

Loading CSV files quota errors

If you load a large CSV file using the bq load command with the --allow_quoted_newlines flag, you might encounter this error.

Error message

Input CSV files are not splittable and at least one of the files is larger than
the maximum allowed size. Size is: ...

Resolution

To resolve this quota error, do the following:

  • Set the --allow_quoted_newlines flag to false.
  • Split the CSV file into smaller chunks that are each less than 4 GB.

For more information about limits that apply when you load data into BigQuery, see Load jobs.

Table imports or query appends quota errors

BigQuery returns this error message when your table reaches the limit for table operations per day for Standard tables. Table operations include the combined total of all load jobs, copy jobs, and query jobs that append or overwrite a destination table.

To see the value of the Table operations per day limit, see Standard tables.

Error message

Your table exceeded quota for imports or query appends per table

When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.

Diagnosis

If you have not identified the source of most table operations, do the following:

  1. Make a note of the project, dataset, and table that the failed query, load, or the copy job is writing to.

  2. Use INFORMATION_SCHEMA.JOBS_BY_* tables to learn more about jobs that modify the table.

    The following example finds the hourly count of jobs grouped by job type for a 24-hour period using JOBS_BY_PROJECT. If you expect multiple projects to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

    SELECT
      TIMESTAMP_TRUNC(creation_time, HOUR),
      job_type,
      count(1)
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    #Adjust time
    WHERE creation_time BETWEEN "2021-06-20 00:00:00" AND "2021-06-21 00:00:00"
    AND destination_table.project_id = "my-project-id"
    AND destination_table.dataset_id = "my_dataset"
    AND destination_table.table_id = "my_table"
    GROUP BY 1, 2
    ORDER BY 1 DESC
    

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • If you frequently load data from multiple small files stored in Cloud Storage and use a separate load job per file, then combine multiple load jobs into a single job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*). A sketch of this approach follows this list.

    For more information, see Batch loading data.

  • If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions.
  • To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion and has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API; to see the costs of using this API, see BigQuery data ingestion pricing.
  • To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view.
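
As a sketch of the combined-load suggestion above, a single load job can cover many files by using a wildcard URI, for example with the LOAD DATA statement. The table name, file format, and bucket path are placeholders:

-- Hypothetical example: one load job for all matching files instead of one
-- job per file.
LOAD DATA INTO my_dataset.my_table
FROM FILES (
  format = 'CSV',
  uris = ['gs://my_path/*.csv']
);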

Maximum rate of table metadata update operations limit errors

BigQuery returns this error when your table reaches the limit for maximum rate of table metadata update operations per table for Standard tables. Table operations include the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table or that use a DML DELETE, INSERT, MERGE, TRUNCATE TABLE, or UPDATE to write data to a table.

To see the value of the Maximum rate of table metadata update operations per table limit, see Standard tables.

Error message

Exceeded rate limits: too many table update operations for this table

When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.

Diagnosis

Metadata table updates can originate from API calls that modify a table's metadata or from jobs that modify a table's content. If you have not identified the source of most updates to the table's metadata, do the following:

Identify API calls
  1. Go to the Google Cloud navigation menu and select Logging > Logs Explorer:

    Go to the Logs Explorer

  2. Filter logs to view table operations by running the following query:

    resource.type="bigquery_dataset"
    protoPayload.resourceName="projects/my-project-id/datasets/my_dataset/tables/my_table"
    (protoPayload.methodName="google.cloud.bigquery.v2.TableService.PatchTable" OR
    protoPayload.methodName="google.cloud.bigquery.v2.TableService.UpdateTable" OR
    protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable")
    
Identify jobs

The following query returns a list of jobs that modify the affected table in the project. If you expect multiple projects in an organization to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

SELECT
 job_id,
 user_email,
 query
#Adjust region
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
#Adjust time
WHERE creation_time BETWEEN "2021-06-21 10:00:00" AND "2021-06-21 20:00:00"
AND destination_table.project_id = "my-project-id"
AND destination_table.dataset_id = "my_dataset"
AND destination_table.table_id = "my_table"

For more information, see BigQuery audit logs overview.

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • Reduce the update rate for the table metadata.
  • Add a delay between jobs or table operations to make sure that the update rate is within the limit.
  • For data inserts or modifications, consider using DML operations. DML operations are not affected by the Maximum rate of table metadata update operations per table rate limit.

    DML operations have other limits and quotas. For more information, see Using data manipulation language (DML).

  • If you frequently load data from multiple small files stored in Cloud Storage and use a separate load job per file, then combine multiple load jobs into a single job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*).

    For more information, see Batch loading data.

  • If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions (a batching sketch follows this list).
  • To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion and has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API; to see the costs of using this API, see BigQuery data ingestion pricing.
  • To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view.
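
As a sketch of the batching suggestion above, many single-row append jobs can often be replaced with one multi-row DML statement, which is also not subject to the table metadata update rate limit. The table, columns, and values are placeholders:

-- Hypothetical example: one INSERT carrying several rows instead of several
-- single-row jobs.
INSERT INTO my_dataset.my_table (id, name, created_at)
VALUES
  (1, 'first', CURRENT_TIMESTAMP()),
  (2, 'second', CURRENT_TIMESTAMP()),
  (3, 'third', CURRENT_TIMESTAMP());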

Maximum number of API requests limit errors

BigQuery returns this error when you hit the rate limit for the number of API requests to a BigQuery API per user per method—for example, the tables.get method calls from a service account, or the jobs.insert method calls from a different user email. For more information, see the Maximum number of API requests per second per user per method rate limit in BigQuery API.

Error message

Quota exceeded: Your user_method exceeded quota for concurrent api requests
per user per method.

When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.

Diagnosis

If you have not identified the method that has reached this rate limit, do the following:

For service account

  1. Go to the project that hosts the service account.

  2. In the Google Cloud console, go to the API Dashboard.

    For instructions on how to view the detailed usage information of an API, see Using the API Dashboard.

  3. In the API Dashboard, select BigQuery API.

  4. To view more detailed usage information, select Metrics, and then do the following:

    1. For Select Graphs, select Traffic by API method.

    2. Filter the chart by the service account's credentials. You might see spikes for a method in the time range where you noticed the error.

For API calls

Some API calls log errors in BigQuery audit logs in Cloud Logging. To identify the method that reached the limit, do the following:

  1. In the Google Cloud console, go to the Google Cloud navigation menu and then select Logging > Logs Explorer for your project:

    Go to the Logs Explorer

  2. Filter logs by running the following query:

     resource.type="bigquery_resource"
     protoPayload.authenticationInfo.principalEmail="<user email or service account>"
     "Too many API requests per user per method for this user_method"
     

    In the log entry, you can find the method name under the property protoPayload.method_name.

    For more information, see BigQuery audit logs overview.

Resolution

To resolve this quota error, do the following:

  • Reduce the number of API requests or add a delay between multiple API requests so that the number of requests stays under this limit.

  • If the limit is only exceeded occasionally, you can implement retries on this specific error with exponential backoff.

  • If you frequently insert data, consider using streaming inserts because streaming inserts are not affected by the BigQuery API quota. However, the streaming inserts API has costs associated with it and has its own set of limits and quotas.

    To learn about the cost of streaming inserts, see BigQuery pricing.

  • While loading data to BigQuery using Dataflow with the BigQuery I/O connector, you might encounter this error for the tables.get method. To resolve this issue, do the following:

    • Set the destination table's create disposition to CREATE_NEVER. For more information, see Create disposition.

    • Use the Apache Beam SDK version 2.24.0 or higher. In the previous versions of the SDK, the CREATE_IF_NEEDED disposition calls the tables.get method to check if the table exists.

  • You can request a quota increase by contacting support or sales. For additional quota, see Request a quota increase. Requesting a quota increase might take several days to process. We recommend that your request include the priority of the job, the user running the query, and the affected method.

Your project exceeded quota for free query bytes scanned

BigQuery returns this error when you run a query in the free usage tier and the account reaches the monthly limit of data size that can be queried. For more information about Queries (analysis), see Free usage tier.

Error message

Your project exceeded quota for free query bytes scanned

Resolution

To continue using BigQuery, you need to upgrade the account to a paid Cloud Billing account.

Maximum tabledata.list bytes per second per project quota errors

BigQuery returns this error when the project number mentioned in the error message reaches the maximum size of data that can be read through the tabledata.list API call in a project per second. For more information, see Maximum tabledata.list bytes per minute.

Error message

Your project:[project number] exceeded quota for tabledata.list bytes per second per project

Resolution

To resolve this error, do the following:

  • In general, we recommend trying to stay below this limit, for example by spacing out requests over a longer period with delays. If the error doesn't happen frequently, implementing retries with exponential backoff solves this issue.
  • If the use case expects fast and frequent reading of a large amount of data from a table, we recommend using the BigQuery Storage Read API instead of the tabledata.list API.
  • If the preceding suggestions do not work, you can request a quota increase from Google Cloud console API dashboard by doing the following:

    1. Go to the Google Cloud console API dashboard.
    2. In the dashboard, filter for Quota: Tabledata list bytes per minute (default quota).
    3. Select the quota and follow the instructions in Requesting a higher quota limit.

    It might take several days to review and process the request.

Too many DML statements outstanding against table

This error means that the number of concurrent mutating DML statements (UPDATE, DELETE, MERGE) running against the same table has exceeded the data manipulation language (DML) quota limit. This quota limit is per table, and applies to mutating DML only, which does not include INSERT.

Resolution

Batch the DML jobs by following Best practices for DML statements; a batching sketch follows.
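
For example, instead of issuing one UPDATE per changed row against the target table, you can stage the changes and apply them with a single MERGE. This is a minimal sketch with placeholder table and column names:

-- Hypothetical example: apply a batch of staged changes with one mutating DML
-- statement instead of many concurrent UPDATE statements.
MERGE my_dataset.target AS t
USING my_dataset.staged_changes AS s
ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET t.value = s.value
WHEN NOT MATCHED THEN
  INSERT (id, value) VALUES (s.id, s.value);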

Maximum number of copy jobs per day per project quota errors

BigQuery returns this error when the number of copy jobs running in a project has exceeded the daily limit. To learn more about the limit for Copy jobs per day, see Copy jobs.

Error message

Your project exceeded quota for copies per project

Diagnosis

If you'd like to gather more data about where the copy jobs are coming from, you can try the following:

  • If your copy jobs are located in a single region or only a few regions, you can try querying the INFORMATION_SCHEMA.JOBS table for those specific regions. For example:
    SELECT
      creation_time, job_id, user_email, destination_table.project_id,
      destination_table.dataset_id, destination_table.table_id
    FROM `PROJECT_ID`.`REGION_NAME`.INFORMATION_SCHEMA.JOBS
    WHERE
      creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 2 DAY) AND CURRENT_TIMESTAMP()
      AND job_type = "COPY"
    ORDER BY creation_time DESC
    
    The REGION_NAME part should be replaced with the region name, including the region- prefix, for example, region-us or region-asia-south1. You can also adjust the time interval depending on the time range you're interested in.
  • To see all copy jobs in all regions, you can use the following filter in Cloud Logging:
    resource.type="bigquery_resource"
    protoPayload.methodName="jobservice.insert"
    protoPayload.serviceData.jobInsertRequest.resource.jobConfiguration.tableCopy:*
    

Resolution

  • If the goal of the frequent copy operations is to create a snapshot of data, consider using table snapshots instead. Table snapshots are a cheaper and faster alternative to copying full tables (see the sketch after this list).
  • You can request a quota increase by contacting support or sales. It might take several days to review and process the request. We recommend stating the priority, use case, and the project ID in the request.
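
As a sketch of the table snapshot suggestion above, a snapshot can be created with a single DDL statement. The names and the expiration are placeholders:

-- Hypothetical example: a snapshot instead of a full table copy, expiring
-- automatically after seven days.
CREATE SNAPSHOT TABLE my_dataset.my_table_snapshot
CLONE my_dataset.my_table
OPTIONS (
  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
);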

Maximum number of concurrent queries that contain remote functions

BigQuery returns this error when the number of concurrent queries that contain remote functions exceeds the limit.

To learn more about the limit for Remote functions, see Remote functions.

Error message

Exceeded rate limits: too many concurrent queries with remote functions for
this project

Diagnosis

To see limits for concurrent queries that contain remote functions, see Remote function limits.

Resolution

  • When using remote functions, adhere to best practices for remote functions.
  • You can request a quota increase by contacting support or sales. It might take several days to review and process the request. We recommend stating the priority, use case, and the project ID in the request.

Shuffle size limit errors

BigQuery returns this error when your project exceeds the maximum disk and memory size limit available for shuffle operations.

This quota is computed per-reservation and sliced across projects for the reservations. The quota cannot be modified by Cloud Customer Care. You can learn more about your usage by querying the INFORMATION_SCHEMA.JOBS_TIMELINE view.
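
For example, the following query against the INFORMATION_SCHEMA.JOBS_TIMELINE view shows which jobs consumed the most slot time over the past day, which is often a good starting point for finding shuffle-heavy work. This is a minimal sketch; adjust the region and time window for your project:

SELECT
 job_id,
 SUM(period_slot_ms) AS total_slot_ms
FROM
 `region-us`.INFORMATION_SCHEMA.JOBS_TIMELINE
WHERE
  period_start > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
GROUP BY
 job_id
ORDER BY total_slot_ms DESC
LIMIT 10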

Error message

You receive one of the following error messages:

  • Quota exceeded: Your project exceeded quota for total shuffle size limit.
  • Resources exceeded: Your project or organization exceeded the maximum
    disk and memory limit available for shuffle operations. Consider provisioning
    more slots, reducing query concurrency, or using more efficient logic in this
    job.

Resolution

To resolve this error, do the following:

  • Provision more slots for the reservation.
  • Reduce query concurrency so that fewer shuffle-heavy queries run at the same time.
  • Use more efficient logic in the affected jobs to reduce the amount of data that is shuffled.

Maximum number of CREATE MODEL statements

This error means that you have exceeded the quota for CREATE MODEL statements.

Error message

Quota exceeded: Your project exceeded quota for CREATE MODEL queries per project.

Resolution

If you exceed the quota for CREATE MODEL statements, send an email to bqml-feedback@google.com and request a quota increase.