This document describes the current restrictions and usage limits that apply when you use Sensitive Data Protection. For billing information, see the pricing page.
These limits apply to each Google Cloud console project and are shared across all applications and IP addresses using that project.
You can set lower quotas in the Google Cloud console.
Request quotas
The current API request quota for Sensitive Data Protection is as follows.
Request quota | Value |
---|---|
Requests per minute | 600 |
Resource limits
Sensitive Data Protection enforces the following limits on stored resources per project.
Resource | Limit |
---|---|
Maximum number of templates | 1,000 |
Maximum number of job triggers | 1,000 |
Maximum number of running jobs | 1,000 |
Maximum number of stored infoTypes | 30 |
Maximum number of discovery configurations | 100 |
Content inspection and de-identification limits
Sensitive Data Protection enforces the following usage limits for inspecting and de-identifying content sent directly to the DLP API as text or images:
Type of limit | Usage limit |
---|---|
Maximum number of regular custom dictionaries per request | 10 |
Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB |
Maximum number of table values | 50,000 |
Maximum number of transformations per request | 100 |
Maximum number of inspection rules per set | 10 |
Maximum number of findings per request | 3,000 |
Maximum size of each request, except projects.image.redact | 0.5 MB |
Maximum size of each projects.image.redact request | 4 MB |
If you need to inspect files that are larger than these limits, store those files on Cloud Storage and run an inspection job.
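For example, the following is a minimal sketch of such an inspection job, assuming the google-cloud-dlp Python client library; the project ID, bucket path, and infoTypes are hypothetical placeholders.

```python
# A minimal sketch, assuming the google-cloud-dlp Python client library.
# The project ID, bucket path, and infoTypes are hypothetical placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id/locations/global"  # hypothetical project

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {
            # Scan every object under this (hypothetical) bucket prefix.
            "file_set": {"url": "gs://your-bucket/large-files/**"}
        }
    },
    "inspect_config": {
        "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
    },
}

# The job runs asynchronously, so file contents are never sent in the request
# body and the 0.5 MB content-request limit does not apply.
job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print(job.name)
```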
Storage inspection limits
Sensitive Data Protection enforces the following usage limits for inspecting Google Cloud storage repositories:
Type of limit | Usage limit |
---|---|
Maximum total scan size | 2 TB |
Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB |
Storage de-identification limits
Sensitive Data Protection enforces the following usage limit when you de-identify data in storage:
Type of limit | Usage limit |
---|---|
Maximum file size | 60,000 KB |
Data profiling limits
Sensitive Data Protection enforces the following usage limits for profiling data.
These limits apply globally for all data configurations at both organization and project levels.
BigQuery profiling limits
Type of limit | Usage limit |
---|---|
Maximum number of BigQuery tables | 200,000 |
Maximum sum of columns in all BigQuery tables to be profiled | 20,000,000 |
Example: Organization-level limits
Suppose you have an organization that has two folders, and you create a scan configuration for each folder. The total number of tables to be profiled from both of the configurations must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.
Example: Project-level limits
If you configure data profiling at the project level, then the total number of tables in that project must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.
Cloud SQL profiling limits
Type of limit | Usage limit |
---|---|
Maximum number of databases per instance | 1,000 |
Maximum number of tables per database | 20,000 |
Custom infoType limits
Sensitive Data Protection enforces the following limits for custom infoTypes.
Type of limit | Usage limit |
---|---|
Maximum size of word list passed directly in the request message per regular custom dictionary | 128 KB |
Maximum size of word list specified as a file in Cloud Storage per regular custom dictionary | 512 KB |
Maximum number of components (continuous sequences containing only letters, only digits, only non-letter characters, or only non-digit characters) per regular custom dictionary phrase | 40 |
Maximum combined size of all stored custom dictionaries per request | 5 MB |
Maximum number of built-in and custom infoTypes per request | 150 |
Maximum number of detection rules per custom infoType | 5 |
Maximum number of custom infoTypes per request | 30 |
Maximum number of regular custom dictionaries per request | 10 |
Maximum length of regular expressions | 1,000 |
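To illustrate how some of these limits fit together, here is a minimal sketch, assuming the google-cloud-dlp Python client, of an inspect request that defines one regular custom dictionary and one regular-expression custom infoType; the infoType names, word list, and pattern are hypothetical.

```python
# A minimal sketch, assuming the google-cloud-dlp Python client. The infoType
# names, word list, and regex pattern are hypothetical placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id"  # hypothetical project

inspect_config = {
    "custom_info_types": [
        {
            # Regular custom dictionary: the word list travels in the request,
            # so it counts toward the 128 KB per-dictionary limit.
            "info_type": {"name": "INTERNAL_PRODUCT_NAME"},
            "dictionary": {"word_list": {"words": ["widgetron", "gizmotron"]}},
        },
        {
            # Regular-expression custom infoType: the pattern must stay under
            # the 1,000-character length limit.
            "info_type": {"name": "EMPLOYEE_ID"},
            "regex": {"pattern": r"EMP-\d{6}"},
        },
    ],
}

response = dlp.inspect_content(
    request={
        "parent": parent,
        "inspect_config": inspect_config,
        "item": {"value": "Ask about widgetron; reference EMP-123456."},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```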
Stored infoType limits
Sensitive Data Protection enforces the following limits for creating stored infoTypes.
Type of limit | Usage limit |
---|---|
Maximum size of a single input file stored in Cloud Storage | 200 MB |
Maximum combined size of all input files stored in Cloud Storage | 1 GB |
Maximum number of input files stored in Cloud Storage | 100 |
Maximum size of an input column in BigQuery | 1 GB |
Maximum number of input table rows in BigQuery | 5,000,000 |
Maximum size of output files | 500 MB |
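As an illustration, a stored infoType built from word lists in Cloud Storage might be created with a request like the following minimal sketch, assuming the google-cloud-dlp Python client; the bucket paths, display name, and stored infoType ID are hypothetical.

```python
# A minimal sketch, assuming the google-cloud-dlp Python client. The bucket
# paths, display name, and stored infoType ID are hypothetical placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id/locations/global"  # hypothetical project

config = {
    "display_name": "Customer names",
    "large_custom_dictionary": {
        # Input word lists in Cloud Storage: at most 100 files, 200 MB each,
        # 1 GB combined.
        "cloud_storage_file_set": {"url": "gs://your-bucket/dictionaries/*"},
        # Where the generated dictionary files are written (500 MB maximum).
        "output_path": {"path": "gs://your-bucket/dictionary-output/"},
    },
}

stored_info_type = dlp.create_stored_info_type(
    request={
        "parent": parent,
        "config": config,
        "stored_info_type_id": "customer_names",
    }
)
print(stored_info_type.name)
```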
Avro scanning limits
Avro files are generally subject to the same limitations in Sensitive Data Protection as in BigQuery. If these limits are exceeded, Sensitive Data Protection falls back to binary scanning. The following limits apply to inspect content requests and inspect storage jobs for Avro files:
Type of limit | Usage limit |
---|---|
Maximum size for a single Avro block | 100 MB |
Maximum size for a single Avro file | 1 TB |
Maximum number of columns in an Avro file | 10,000 |
Maximum level of nested fields | 15 |
Scanning limits for PDFs and Microsoft products
These limits apply when scanning the following types of files:
- PDF
- Microsoft Word
- Microsoft Excel
- Microsoft PowerPoint
If these limits are exceeded, Sensitive Data Protection falls back to binary scanning.
Type of limit | Usage limit |
---|---|
Maximum size of a single PDF in Cloud Storage. Files exceeding this limit are binary scanned. | 150 MB, up to 10,000 pages |
Maximum size of a single Word file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
Maximum size of a single Excel file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
Maximum size of a single PowerPoint file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
These limits apply to these file types even if you set a maximum byte size per file.
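For example, here is a minimal sketch, assuming the google-cloud-dlp Python client, of a cloud_storage_options fragment that restricts a storage scan to PDF and Word files and caps the bytes scanned per file; the bucket path and byte limit are hypothetical, and the per-format size limits above still decide whether a file is parsed or binary scanned.

```python
# A minimal sketch, assuming the google-cloud-dlp Python client. The bucket
# path and byte limit are hypothetical; this fragment slots into the
# storage_config of the inspection-job sketch shown earlier in this document.
cloud_storage_options = {
    "file_set": {"url": "gs://your-bucket/documents/**"},
    # Scan only parsed document formats.
    "file_types": ["PDF", "WORD"],
    # Cap the bytes scanned per file. Even with this cap, the per-format size
    # limits in the table above determine whether a file is parsed or
    # binary scanned.
    "bytes_limit_per_file": 10 * 1024 * 1024,
}
```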
Quota increases
You can edit your quotas up to their maximum values by selecting Edit Quotas on the Quotas page of the Google Cloud console. To request a quota increase, edit your quota with the requested value and a justification, and then submit your update. You are notified when your request is received, and you might be contacted for more information about it. After your request is reviewed, you are notified whether it has been approved or denied.
Quota dependencies
Depending on which features you use, you might also need additional quota for the following services:
- Pub/Sub: If you use Pub/Sub (for example, to receive job notifications), you might need additional Pub/Sub quota.
- BigQuery:
  - The BigQuery streaming API is used to persist findings from inspect jobs; its quota values and other restrictions apply (see the sketch after this list).
  - google.cloud.bigquery.storage.v1beta1.BigQueryStorage, which lists the contents of rows in a table, is subject to quota limits.
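The sketch below, assuming the google-cloud-dlp Python client, shows the actions fragment of an inspect job that writes findings to BigQuery; the project, dataset, and table IDs are hypothetical. This is the configuration that causes findings to be streamed into BigQuery, where the streaming quota applies.

```python
# A minimal sketch, assuming the google-cloud-dlp Python client. The project,
# dataset, and table IDs are hypothetical placeholders. This "actions" fragment
# slots into the inspect_job dictionary shown in the Cloud Storage inspection
# sketch earlier in this document.
actions = [
    {
        "save_findings": {
            "output_config": {
                "table": {
                    "project_id": "your-project-id",  # hypothetical
                    "dataset_id": "dlp_results",      # hypothetical
                    "table_id": "findings",           # hypothetical
                }
            }
        }
    }
]
```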
Learn about service disruptions
Sensitive Data Protection has features that depend on other Google Cloud services. Because of these dependencies, you can expect Sensitive Data Protection to have reliability comparable to that of those services.
If those services experience disruptions, Sensitive Data Protection makes a best effort to retry until any recurring errors have subsided, but you might see a degraded experience in the meantime.
Check the Google Cloud Status Dashboard for all known service disruptions. You can also subscribe to the Google Cloud Status Dashboard updates JSON feed or RSS feed for push updates.