Sensitive Data Protection documentation
Sensitive Data Protection provides access to a powerful sensitive data inspection, classification, and de-identification platform.
Sensitive Data Protection includes:
- Over 200 built-in information type (or "infoType") detectors.
- The ability to define custom infoType detectors using dictionaries, regular expressions, and contextual elements.
- De-identification techniques including redaction, masking, format-preserving encryption, date-shifting, and more.
- The ability to detect sensitive data within streams of data, structured text, files in storage repositories such as Cloud Storage and BigQuery, and even within images (see the inspection sketch after this list).
- Analysis of structured data to help understand its risk of being re-identified, including computation of metrics like k-anonymity, l-diversity, and more.
- The ability to automatically discover unencrypted secrets and profile data across an organization, folder, or project to identify data assets where high-risk and sensitive data reside.
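To make the inspection capability concrete, here is a minimal sketch that calls the DLP API's inspectContent method through the Python client library to scan a short string for two built-in infoTypes. It assumes the google-cloud-dlp package, Application Default Credentials, and a placeholder project ID; adjust these for your environment.

```python
# Minimal inspection sketch using the google-cloud-dlp Python client.
# Assumes `pip install google-cloud-dlp` and Application Default Credentials;
# "your-project-id" and the sample text are placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id/locations/global"

inspect_config = {
    # Two built-in infoType detectors; Sensitive Data Protection ships many more.
    "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
    "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
    "include_quote": True,
}
item = {"value": "Reach Jane at jane.doe@example.com or (555) 555-0100."}

response = dlp.inspect_content(
    request={"parent": parent, "inspect_config": inspect_config, "item": item}
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood.name, finding.quote)
```

The same request shape extends to custom infoTypes, storage inspection jobs, and image redaction, which are covered in the guides below.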
Documentation resources
Guides
- Quickstart: Using a JSON request
- Quickstart: Inspect sensitive text by using the command line
- Quickstart: Schedule a Sensitive Data Protection inspection scan
- De-identifying sensitive data
- Redacting sensitive data from images
- Inspecting storage and databases for sensitive data
- Inspecting text for sensitive data
- Creating inspection templates
- Creating and scheduling inspection jobs
Related resources
Related videos
New Way Now: Broadcom connects and protects the digital world with the help of AI
Summary: Rich Jardine, VP of Cloud Platform Engineering and Operations at Broadcom, discusses their unique partnership with Google Cloud, and how the global infrastructure technology leader is modernizing its infrastructure and building a scalable
Customer Voices: Next DLP
Next DLP is a user-centric, flexible, cloud-native, AI/ML-powered solution built for today’s threat landscape. Hear from their Engineer, Chung Poon, as he explains why Google Cloud's Kubernetes Engine was the perfect fit for their SaaS platform
What is Automatic DLP?
Get Started with Data Loss Prevention → https://goo.gle/44nqJfH Data breaches keeping you up at night? Google Cloud's got your back with Automatic DLP! This video reveals how to effortlessly discover, classify, and protect your sensitive data across
Secure your cloud infrastructure — network, data, and compute — the Google way
Google secures the largest web apps and services on earth. We’ve learned a few things along the way about how to secure cloud infrastructure - compute, data, and network. In Google Cloud we applied our learnings to enable some infrastructure security
Remediating issues and managing risk with Automatic DLP
Getting Started with DLP → https://goo.gle/3FrRT8R Check out our documentation: Data Profiles for BigQuery Data → https://goo.gle/3yXaeZR With Automatic DLP, gaining visibility across your entire organization and keeping sensitive data secure is now
How to analyze your data in Automatic DLP
View Data Profiles in the Google Cloud Console → https://goo.gle/3n833qK Analyze Data Profiles → https://goo.gle/3NedGTy Sample Data Studio Report → https://goo.gle/3NOnh4B Data is only as good as the insights you can pull from it, but this usually
How to configure Automatic DLP
Need to search through millions of database entries? Automatic DLP has new features to configure scans for all your business needs. In this video, we show how you can configure Automatic DLP to create a data profile for a scan configuration. Watch to
What is Automatic DLP?
Getting Started with Data Loss Prevention → https://goo.gle/3FrRT8R Do you need to monitor and protect large amounts of sensitive data, but you don’t know where that data is or how it’s being managed? In this first episode of Automatic DLP: Data
Google Cloud Security Showcase - DLP
Read More → https://goo.gle/3suqIVE DLP enables customers to automatically discover, classify, and protect their most sensitive data across their organization. It provides rich insights for tables and columns in BigQuery and helps customers
Introduction to Google Cloud
Check out this architecture in our NEW Architecture Diagramming Tool → https://goo.gle/3GUIztk When it comes to building your organization's app in the cloud you have many options. In this video Priyanka gives the complete overview of various Google
Data Loss Prevention in a minute
Cloud DLP → https://goo.gle/38C04zI Cloud Data Loss Prevention, or Cloud DLP, lets you discover, classify, and protect sensitive data from across your systems. In this episode of Cloud Bytes, we show how the Cloud DLP API offers a number of
Understanding BigQuery data governance
Read the blog → https://goo.gle/3Ag6qAA BigQuery Data Governance → https://goo.gle/37nq0yj BigQuery IAM → https://goo.gle/3yygQLm Want everyone in your organization to be able to easily find the data they need, while minimizing overall risk, and
How to classify and redact sensitive data
How can you easily discover, protect, and classify your most sensitive data? In this episode, we show you how to get started with the Cloud Data Loss Prevention (DLP) API, so you can have better observability over your data. Timestamps: 0:00 -
Increase API data security with Apigee and Google Cloud Data Loss Prevention (DLP)
Read more here → https://goo.gle/3eh32O7 Join this webcast with two expert Google Cloud Product Managers to learn about how to leverage Apigee and Cloud Data Loss Prevention (DLP) to classify, mask, tokenize, and transform sensitive elements to help
De-identification and inspection templates with Cloud DLP
Welcome back to Getting Started with Data Loss Prevention! In this video, we give you a demo showcasing the functionality of Cloud DLP. Specifically, we’ll show you how to create an inspection template to find sensitive information, while also
Google Cloud Data Catalog essentials: Identifying PII with Cloud DLP integration
Learn more → https://goo.gle/3mT0WVU Increased regulatory and compliance requirements mean that protecting sensitive data is top of mind for many organizations. Data Catalog’s integration with Cloud Data Loss Prevention or DLP makes it easy to
How to create DLP templates
Did you know that you can reuse your commonly used DLP configurations? In this episode of Getting Started with Data Loss Prevention, we demo and show you how to create inspection and de-identification templates, allowing you to decouple
How to use tokenization with Cloud DLP
KittyCat Walks was able to use Cloud DLP to help remove sensitive values in their data, but when it comes to handling sensitive financial information - such as credit card numbers - how can they preserve referential integrity with their data after
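The referential-integrity question raised above is usually answered with deterministic tokenization: the same input value always maps to the same surrogate token, so joins and aggregations still work on the de-identified data. The sketch below is one hedged way to express that with the DLP API's deidentifyContent method and a CryptoDeterministicConfig transformation; the project ID, raw key, and CARD_TOKEN surrogate name are placeholder assumptions, and a production setup would normally use a Cloud KMS-wrapped key rather than an unwrapped one.

```python
# Hedged sketch: deterministic tokenization of credit card numbers so the
# same card number always produces the same surrogate token.
# "your-project-id", the key bytes, and CARD_TOKEN are illustrative placeholders.
import os
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id/locations/global"

# A random 32-byte key is used here only to keep the sketch short; check the
# current API reference for accepted key sizes and prefer a KMS-wrapped key.
key = os.urandom(32)

deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                "info_types": [{"name": "CREDIT_CARD_NUMBER"}],
                "primitive_transformation": {
                    "crypto_deterministic_config": {
                        "crypto_key": {"unwrapped": {"key": key}},
                        "surrogate_info_type": {"name": "CARD_TOKEN"},
                    }
                },
            }
        ]
    }
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "deidentify_config": deidentify_config,
        "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
        "item": {"value": "Order 1001 paid with card 4111 1111 1111 1111."},
    }
)
print(response.item.value)  # card number replaced by a repeatable surrogate token
```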
How to redact sensitive data using DLP
In the last episode, we touched on what de-identification is and how it can benefit your application. In this episode of Getting Started with Data Loss Prevention, we cover redaction - a basic type of de-identification - showing how it
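As a companion to the redaction episode above, here is a small hedged sketch of one basic redaction style: replacing each finding with the name of the infoType that matched, using deidentifyContent with a ReplaceWithInfoTypeConfig transformation. The project ID and sample text are placeholders.

```python
# Hedged sketch of basic redaction: findings are replaced with the name of the
# infoType that matched, e.g. an email address becomes "[EMAIL_ADDRESS]".
# "your-project-id" and the sample text are illustrative placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id/locations/global"

deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                # No info_types listed here, so the transformation applies to
                # every infoType found by the inspect_config below.
                "primitive_transformation": {"replace_with_info_type_config": {}}
            }
        ]
    }
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "deidentify_config": deidentify_config,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "item": {"value": "Support requests go to help@example.com."},
    }
)
print(response.item.value)  # -> "Support requests go to [EMAIL_ADDRESS]."
```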
Getting started with Data Loss Prevention on Security Command Center
Data Loss Prevention (DLP) is a fully managed service that helps you discover, classify, and protect your most sensitive data. In this episode of Getting Started with Security Command Center, we show you how to configure a DLP scan to send results to
Data Catalog for data discovery and metadata management
Data discovery and metadata management is a common pain point for most enterprise customers. Data Catalog is a managed service that addresses this pain point with a scalable and performant solution. Data Catalog offers integration with DLP for auto
What is Data Loss Prevention (DLP)?
Welcome to Getting Started with Data Loss Prevention! In this series, we’ll explore Data Loss Prevention - a fully managed service that helps you discover, classify, and protect your most sensitive data - and how it can help protect the data for our
Automating Cloud Storage Data Classification: DLP API and Cloud Function
In the last episode of Get Cooking in Cloud, Priyanka Vergadia and Jenny Brown touched on the four-step process for automating data classification. They only got through half of this process by creating Google Cloud Storage buckets and using
Setting up automated data classification
In this episode of Get Cooking in Cloud, Priyanka Vergadia and Jenny Brown go into more depth about data classification. Specifically, they go over how Cloud Storage Buckets, Cloud Pub/Sub, Cloud Functions, and the DLP API play a crucial role in
Overview of automating data classification
In this episode of Get Cooking in Cloud, Priyanka Vergadia and Jenny Brown go over how AppEngine, Google Cloud Storage, Pub/Sub, Cloud Functions, and the Cloud Data Loss Prevention (DLP) API work together to separate classified data from
Sensitive data protections, Cloud architectures, & more!
Here to bring you the latest news in the cloud is Stephanie Wong. Learn more about these announcements → https://goo.gle/2MYlQm6 • Protect Sensitive Data with the new Cloud DLP UI → https://goo.gle/2OVJQcd • Best Practices for Password Management →
Automating Security in G Suite (Cloud Next '19)
This session will focus on how to automate security in G Suite. It will highlight key aspects from G Suite Security Center, Alert Center, and Data Loss Prevention that will help take some of the burden of manual work off admins' and analysts'
Identify and Protect Sensitive Data in the Cloud: Latest Innovations in Cloud DLP (Cloud Next '19)
In this session we'll share the latest advancements made to Google Cloud Data Loss Prevention and demo several different techniques to protect your sensitive data. Identifying and Protecting Cloud Data → http://bit.ly/2TT8g4Y Watch more: Next '19
Women of Cloud: How to Grow our Clout 2.0 (Cloud Next '19)
Join us for round two of the dynamic panel that asks the hard questions and puts diversity, equity, and inclusion at the forefront from a technical perspective. This panel of Google Cloud's senior technical women that you first met last year will
Comprehensive Protection of PII in GCP (Cloud Next '19)
As a major international bank, Scotiabank will discuss its security journey and cloud-native approach to ingesting PII into GCP, constraining access, and carefully and selectively allowing reidentification by bank applications. Comprehensive