Quotas and limits

This document lists the current restrictions and usage limits that apply when you use Cloud Data Loss Prevention (DLP). For billing information, see the pricing page.

These limits apply to each Google Cloud project and are shared across all applications and IP addresses that use that project.

You can set lower quotas in the Google Cloud Console.

Request quotas

The current API request quota for Cloud DLP is as follows.

Request quota | Value
Requests per minute | 600
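Because the 600-requests-per-minute quota is enforced server-side, a client that sends bursts of requests can avoid quota errors by throttling itself. The following sliding-window gate is an illustrative sketch, not part of the Cloud DLP client library:

```python
import time
from collections import deque

class MinuteQuotaGate:
    """Tracks request timestamps in a sliding 60-second window so a
    client can stay under a requests-per-minute quota (600 for Cloud DLP).

    Illustrative sketch only; the real quota is enforced by the API.
    """

    def __init__(self, max_per_minute=600):
        self.max_per_minute = max_per_minute
        self._sent = deque()  # send times within the last 60 seconds

    def try_acquire(self, now=None):
        """Return True and record the request if quota allows, else False."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the 60-second window.
        while self._sent and now - self._sent[0] >= 60.0:
            self._sent.popleft()
        if len(self._sent) >= self.max_per_minute:
            return False
        self._sent.append(now)
        return True
```

A caller that gets `False` back can sleep briefly and retry, rather than receiving a quota error from the API.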

Content limits

Cloud DLP enforces the following usage limits:

Content limit | Limit
Requests to projects.image.redact | 4 MB
Default request size limit | 0.5 MB

Larger files should be hosted on Cloud Storage to be inspected.
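A client can check a payload against these size limits before sending it, and route oversized content to Cloud Storage instead. The thresholds below come from the table above (0.5 MB is represented here as 512 KiB, an assumption about the exact byte count); the routing labels are illustrative, not part of the DLP API:

```python
# Size limits from the content limits table above.
DEFAULT_REQUEST_LIMIT = 512 * 1024       # 0.5 MB default request size
IMAGE_REDACT_LIMIT = 4 * 1024 * 1024     # 4 MB for projects.image.redact

def choose_inspection_route(payload: bytes, is_image: bool = False) -> str:
    """Return 'direct' if the payload fits in a single API request,
    otherwise 'cloud-storage' (upload the file and inspect it there)."""
    limit = IMAGE_REDACT_LIMIT if is_image else DEFAULT_REQUEST_LIMIT
    return "direct" if len(payload) <= limit else "cloud-storage"
```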

Resource limits

Cloud DLP enforces the following limits on stored resources per project.

Resource | Limit
Maximum number of templates | 1000
Maximum number of job triggers | 1000
Maximum number of running jobs | 1000
Maximum number of stored infoTypes | 30

Content inspection and redaction limits

Cloud DLP enforces the following usage limits for inspecting and redacting content sent directly to the DLP API as text or images:

Type of limit | Usage limit
Maximum number of regular custom dictionaries per request | 10
Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB
Maximum number of table values | 50,000
Maximum number of transformations per request | 100
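A table larger than the 50,000-value limit can be split into per-request batches. The helper below treats each cell as one value and keeps rows whole; it is an illustrative sketch, not part of the DLP client library:

```python
# Limit from the content inspection table above: one "value" is one cell.
MAX_TABLE_VALUES = 50_000

def batch_rows(rows, num_columns, max_values=MAX_TABLE_VALUES):
    """Yield lists of rows, each containing at most max_values cells,
    so that each batch can be sent in its own inspection request."""
    if num_columns > max_values:
        raise ValueError("a single row already exceeds the value limit")
    rows_per_batch = max_values // num_columns
    for i in range(0, len(rows), rows_per_batch):
        yield rows[i:i + rows_per_batch]
```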

Storage inspection limits

Cloud DLP enforces the following usage limits for inspecting Google Cloud storage repositories:

Type of limit | Usage limit
Maximum total scan size | 2 TB
Maximum size of individual text items | 0.5 MB
Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB

Data profiling limits

Cloud DLP enforces the following usage limits for profiling BigQuery data:

Type of limit | Usage limit
Maximum number of BigQuery tables | 200,000
Maximum sum of columns in all BigQuery tables to be profiled | 20,000,000

These limits apply globally for all data configurations at both organization and project levels.

Example: Organization-level limits

Suppose you have an organization that has two folders, and you create a scan configuration for each folder. The total number of tables to be profiled from both of the configurations must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.

Example: Project-level limits

If you configure data profiling at the project level, then the total number of tables in that project must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.
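Because these limits apply to the sum across all scan configurations, a planned set of configurations can be checked up front. The configuration shape below (pairs of table and column counts) is illustrative:

```python
# Global profiling limits from the table above.
MAX_TABLES = 200_000
MAX_TOTAL_COLUMNS = 20_000_000

def within_profiling_limits(configs):
    """configs: iterable of (num_tables, num_columns) pairs, one pair per
    scan configuration. The limits apply to the sums across all of them."""
    total_tables = sum(t for t, _ in configs)
    total_columns = sum(c for _, c in configs)
    return total_tables <= MAX_TABLES and total_columns <= MAX_TOTAL_COLUMNS
```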

Custom infoType limits

Cloud DLP enforces the following limits for custom infoTypes.

Type of limit | Usage limit
Maximum size of word list passed directly in the request message per regular custom dictionary | 128 KB
Maximum size of word list specified as a file in Cloud Storage per regular custom dictionary | 512 KB
Maximum number of components (continuous sequences containing only letters, only digits, only non-letter characters, or only non-digit characters) per regular custom dictionary phrase | 40
Maximum combined size of all stored custom dictionaries per request | 5 MB
Maximum number of built-in and custom infoTypes per request | 150
Maximum number of detection rules per custom infoType | 5
Maximum number of custom infoTypes per request | 30
Maximum number of regular custom dictionaries per request | 10
Maximum length of regular expressions | 1000
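Two of these limits can be validated client-side before a request is built: the 128 KB inline word-list limit and the 1000-character regular-expression limit. The function names below are illustrative and not part of the DLP client library:

```python
# Limits from the custom infoType table above.
INLINE_WORD_LIST_LIMIT = 128 * 1024  # bytes, for word lists sent inline
REGEX_LENGTH_LIMIT = 1_000           # characters

def validate_inline_word_list(words):
    """Raise if the UTF-8 size of an inline word list exceeds 128 KB."""
    size = sum(len(w.encode("utf-8")) for w in words)
    if size > INLINE_WORD_LIST_LIMIT:
        raise ValueError(
            f"inline word list is {size} bytes; limit is {INLINE_WORD_LIST_LIMIT}")

def validate_regex(pattern):
    """Raise if a custom infoType regex exceeds the length limit."""
    if len(pattern) > REGEX_LENGTH_LIMIT:
        raise ValueError("regular expression exceeds the 1000-character limit")
```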

Stored infoType limits

Cloud DLP enforces the following limits for creating stored infoTypes.

Stored custom dictionaries

The following limits apply to stored custom dictionaries:

Type of limit | Usage limit
Maximum size of a single input file stored in Cloud Storage | 200 MB
Maximum combined size of all input files stored in Cloud Storage | 1 GB
Maximum number of input files stored in Cloud Storage | 100
Maximum size of an input column in BigQuery | 1 GB
Maximum number of input table rows in BigQuery | 5,000,000
Maximum size of output files | 500 MB

Avro scanning limits

In general, Avro files are subject to the same limits in Cloud DLP as in BigQuery. If these limits are exceeded, Cloud DLP falls back to binary scanning. The following limits apply to content inspection requests and storage inspection jobs for Avro files:

Type of limit | Usage limit
Maximum size for a single Avro block | 100 MB
Maximum size for a single Avro file | 1 TB
Maximum number of columns in an Avro file | 10,000
Maximum level of nested fields | 15
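The 15-level nesting limit can be checked against a parsed Avro schema before submitting a job. The walker below operates on the JSON schema representation that Avro uses (`record` types with `fields`, and lists for unions); it is an illustrative sketch, not an official tool:

```python
# Nesting limit from the Avro scanning table above.
MAX_NESTING = 15

def record_depth(schema) -> int:
    """Return the maximum depth of nested record fields in an Avro
    schema, given as the parsed JSON dict representation."""
    if isinstance(schema, dict) and schema.get("type") == "record":
        child = max((record_depth(f["type"]) for f in schema.get("fields", [])),
                    default=0)
        return 1 + child
    if isinstance(schema, list):  # union type: depth of the deepest branch
        return max((record_depth(s) for s in schema), default=0)
    return 0  # primitive or other non-record type

def within_nesting_limit(schema) -> bool:
    return record_depth(schema) <= MAX_NESTING
```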

PDF/Word scanning limits

If these limits are exceeded, binary scanning is used as a fallback.

Type of limit | Usage limit
Maximum size of a single PDF file in Cloud Storage (larger files are binary scanned) | 30 MB
Maximum size of a single Word file in Cloud Storage (larger files are binary scanned) | 30 MB

Quota increases

You can edit your quotas up to their maximum values by selecting Edit Quotas on the Quotas page of the Google Cloud Console. To request a quota increase, edit the quota with your requested value and a justification, and then submit the update. You are notified when your request is received, and you might be contacted for more information. After your request is reviewed, you are notified whether it has been approved or denied.

Quota dependencies

Depending on which features you use, you might also need additional quota.

Learn about service disruptions

Cloud DLP has features that depend on other Google Cloud services. Because of these dependencies, you can expect Cloud DLP to have reliability comparable to that of those services.

Cloud DLP makes a best effort to retry until any recurring errors subside, but you might experience degraded performance if those services have disruptions.

Check the Google Cloud Status Dashboard for all known service disruptions. You can also subscribe to the Google Cloud Status Dashboard updates JSON feed or RSS feed for push updates.