From time to time, we release papers, blog posts, and videos related to Cloud Data Loss Prevention (DLP). They are listed here.
Scan for sensitive data in just a few clicks
A deeper look at the Google Cloud Console user interface for Cloud DLP to show how you can start to inspect your enterprise data with just a few clicks.
How tokenization makes data usable without sacrificing privacy
Tokenization, sometimes referred to as pseudonymization or surrogate replacement, is widely used in industries like finance and healthcare to reduce the risk associated with data in use, narrow compliance scope, and minimize the exposure of sensitive data to systems that do not need it. With Cloud DLP, customers can perform tokenization at scale with minimal setup.
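The surrogate-replacement idea behind tokenization can be sketched in a few lines. This is a conceptual illustration using an HMAC, not Cloud DLP's actual implementation; the key handling and token format are simplified for the example:

```python
import hashlib
import hmac

def tokenize(value: str, key: bytes) -> str:
    """Replace a sensitive value with a deterministic surrogate.

    The same input always yields the same token, so joins and
    aggregations still work on the tokenized column, but the
    original value cannot be recovered without the key.
    """
    digest = hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"TOKEN({digest[:16]})"

key = b"example-secret-key"  # illustrative only; real keys belong in a KMS, not in code
print(tokenize("555-12-3456", key))
```

Because the mapping is deterministic, downstream systems can count, group, and join on the token column without ever seeing the underlying value.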
Using Cloud DLP to de-identify and obfuscate sensitive information
The team discusses how to leverage Cloud DLP to protect data by automatically incorporating data obfuscation and minimization techniques into your workflows.
Using Cloud DLP to find and protect PII
Scott Ellis, Cloud DLP Product Manager, discusses how to leverage Cloud DLP to increase your privacy posture.
Scanning BigQuery with Cloud DLP
The team shares how to easily scan BigQuery from the Cloud Console.
De-identification and re-identification of PII in large-scale datasets using Cloud DLP
This solution discusses how to use Cloud DLP to create an automated data transformation pipeline to de-identify sensitive data like personally identifiable information (PII).
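As a rough sketch of what such a pipeline sends to the DLP API, a de-identify request pairs an inspection config (what to look for) with a de-identification config (how to transform it). The infoTypes and sample text below are illustrative:

```python
# Sample item to transform; the text is illustrative.
item = {"value": "My SSN is 123-45-6789 and my email is jane@example.com"}

# What to look for.
inspect_config = {
    "info_types": [
        {"name": "US_SOCIAL_SECURITY_NUMBER"},
        {"name": "EMAIL_ADDRESS"},
    ]
}

# How to transform each finding: replace it with its infoType name,
# e.g. "[US_SOCIAL_SECURITY_NUMBER]".
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                "primitive_transformation": {
                    "replace_with_info_type_config": {}
                }
            }
        ]
    }
}

# With credentials and the google-cloud-dlp client library configured,
# the call would look roughly like:
# from google.cloud import dlp_v2
# client = dlp_v2.DlpServiceClient()
# response = client.deidentify_content(
#     request={
#         "parent": f"projects/{project_id}/locations/global",
#         "inspect_config": inspect_config,
#         "deidentify_config": deidentify_config,
#         "item": item,
#     }
# )
# print(response.item.value)
```

In a pipeline, the same configs are typically stored as reusable DLP templates rather than inlined in code.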
Automating the classification of data uploaded to Cloud Storage
This tutorial shows how to implement an automated data quarantine and classification system using Cloud Storage and other Google Cloud products.
Data Tokenization Using Dataflow/Beam & DLP API
This inspection and migration solution reads structured and unstructured data from storage systems like AWS S3 and Cloud Storage. Data can be automatically de-identified using the DLP API and sent to BigQuery and Cloud Storage.
Example Dataflow template to de-identify stored data
This example template builds a streaming pipeline to transform sensitive data.
Relational database import to BigQuery with Dataflow
This proof-of-concept uses Dataflow and Cloud DLP to securely tokenize and import data from a relational database to BigQuery. The example describes how to run this pipeline against a sample SQL Server database created in Google Kubernetes Engine, using a DLP template to tokenize PII before it's persisted.
Example architecture for using a Cloud DLP proxy to query a database containing sensitive data
This proof-of-concept architecture routes all queries and results through a proxy service that parses and inspects them, and then either logs the findings or de-identifies the results using Cloud DLP before returning the requested data to the user. If the database already stores tokenized data, the same proxy concept can be used to de-tokenize the results before returning them.
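The proxy idea can be sketched as a thin wrapper around query execution. The regex detector and hard-coded sample rows below are stand-ins for a real Cloud DLP inspection call and an actual database connection:

```python
import re

# Stand-in detector; a real proxy would call the Cloud DLP inspect API.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def run_query(query: str) -> list[dict]:
    """Placeholder for the real database call."""
    return [{"name": "Jane Doe", "ssn": "123-45-6789"}]

def proxied_query(query: str) -> list[dict]:
    """Pass query results through an inspection step before returning
    them to the caller, redacting anything the detector flags."""
    rows = run_query(query)
    for row in rows:
        for column, value in row.items():
            if isinstance(value, str) and SSN_PATTERN.search(value):
                row[column] = SSN_PATTERN.sub("[REDACTED]", value)
    return rows

print(proxied_query("SELECT * FROM customers"))
```

Clients query the proxy instead of the database directly, so inspection and de-identification policy lives in one place rather than in every application.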
Cloud OnAir: Protecting sensitive datasets on Google Cloud
Data is one of your company's most valuable assets. Analytics and machine learning can help unlock valuable services for your customers and your business. These datasets can also contain sensitive data that needs protection. In this webinar, you'll learn how Cloud DLP can help you discover, classify, and de-identify sensitive data as part of an overall governance strategy.
Cloud Next 2019: Scotiabank shares their cloud-native approach to ingesting PII into Google Cloud
Scotiabank, a major international bank, discusses its security journey and cloud-native approach to ingesting PII into Google Cloud, constraining access, and carefully and selectively allowing re-identification by bank applications.
Cloud Next 2019: Identify and Protect Sensitive Data in the Cloud
The team shares the latest advancements made to Cloud DLP and demos several different techniques to protect your sensitive data.