
A Cloud Data Loss Prevention action is an operation that Cloud DLP performs after a job completes successfully or, in the case of email notifications, when the operation stops on error. For example, you can save findings to a BigQuery table, publish a notification to a Pub/Sub topic, or send an email when an operation either finishes successfully or stops on error.

Available actions

When you run a Cloud Data Loss Prevention job, a summary of its findings is saved by default within Cloud DLP. You can view this summary using Cloud DLP in the Google Cloud console. For jobs, you can also retrieve the summary information through the DLP API by using the projects.dlpJobs.get method.

Cloud DLP supports different types of actions depending on the type of operation being run. The following are the supported actions.

Save findings to BigQuery

Save the DLP job results to a BigQuery table. Before viewing or analyzing the results, ensure that the job has completed.

Each time a scan runs, Cloud DLP saves scan findings to the BigQuery table you specify. The exported findings contain details about each finding's location and match likelihood. If you want each finding to include the string that matched the infoType detector, enable the Include quote option.

If you don't specify a table ID, BigQuery assigns a default name to a new table the first time the scan runs. If you specify an existing table, Cloud DLP appends scan findings to it.

When data is written to a BigQuery table, the billing and quota usage are applied to the project that contains the destination table.

If you don't save findings to BigQuery, the scan results only contain statistics about the number and infoTypes of the findings.
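As an illustrative sketch, the following shows an inspect-job request body, in the JSON form accepted by the DLP API's projects.dlpJobs.create method, that enables this action. The project, bucket, dataset, and table names here are placeholders.

```python
# Sketch of an inspect-job request body with a "save findings to BigQuery"
# action. All resource names are placeholders.
inspect_job = {
    "inspect_config": {
        "info_types": [{"name": "EMAIL_ADDRESS"}],
        # Include the matched string itself in each exported finding.
        "include_quote": True,
    },
    "storage_config": {
        "cloud_storage_options": {
            "file_set": {"url": "gs://example-bucket/**"}
        }
    },
    "actions": [
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "example-project",
                        "dataset_id": "dlp_results",
                        # Omit table_id to let a default table name be
                        # assigned the first time the scan runs.
                        "table_id": "findings",
                    }
                }
            }
        }
    ],
}
```

Because billing and quota usage are applied to the project that contains the destination table, the `project_id` in the output table does not need to match the project that runs the job.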

Publish to Pub/Sub

Publish a notification that contains the name of the DLP job as an attribute to a Pub/Sub channel. You can specify one or more topics to send the notification message to. Make sure that the Cloud DLP service account running the scan job has publishing access on the topic.
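As a sketch (topic and project names are placeholders): each Pub/Sub action targets a single topic, so notifying more than one topic means adding one action per topic to the job's actions list.

```python
# Hypothetical topic names. Each Pub/Sub action publishes to one topic;
# to notify two topics, attach two actions to the job.
pubsub_actions = [
    {"pub_sub": {"topic": "projects/example-project/topics/dlp-scan-done"}},
    {"pub_sub": {"topic": "projects/example-project/topics/audit-events"}},
]
```

The topic must use the full `projects/<project>/topics/<topic>` form, and the Cloud DLP service account needs publish permission (for example, the roles/pubsub.publisher role) on each topic.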

Publish to Security Command Center

Publish a summary of the job results to Security Command Center. For more information, see Send Cloud DLP scan results to Security Command Center.

Publish to Dataplex

Send job results to Dataplex, Google Cloud's metadata management service.

Notify by email

Send an email when the job completes. The email goes to IAM project owners and technical Essential Contacts.

Publish to Cloud Monitoring

Send inspection results to Cloud Monitoring in Google Cloud's operations suite.

Make a de-identified copy

De-identify any findings in the inspected data, and write the de-identified content to a new file. You can then use the de-identified copy in your business processes, in place of data that contains sensitive information. For more information, see Create a de-identified copy of Cloud Storage data using Cloud DLP in the Google Cloud console.
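A minimal sketch of this action, assuming a pre-created de-identify template; the template name, output bucket, and file types shown here are placeholders, not requirements.

```python
# Hedged sketch of a "deidentify" action that writes a de-identified copy
# of inspected Cloud Storage data to an output bucket. Resource names are
# placeholders.
deidentify_action = {
    "deidentify": {
        "transformation_config": {
            "deidentify_template": (
                "projects/example-project/deidentifyTemplates/example-template"
            )
        },
        "cloud_storage_output": "gs://example-output-bucket/",
        "file_types_to_transform": ["TEXT_FILE", "CSV"],
    }
}
```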

Supported operations

The following table shows the Cloud DLP operations and where each action is available.

| Action | BigQuery inspection | Cloud Storage inspection | Datastore inspection | Hybrid inspection | Risk analysis | Data profiling |
|---|---|---|---|---|---|---|
| Save findings to BigQuery | | | | | | |
| Publish to Pub/Sub | | | | | | |
| Publish to Security Command Center | | | | | | |
| Publish to Dataplex | | | | | | |
| Notify by email | | | | | | |
| Publish to Cloud Monitoring | | | | | | |
| Make a de-identified copy | | | | | | |

Specify actions

You can specify one or more actions when you configure a Cloud DLP job:

  • When you create a new inspection or risk analysis job using Cloud DLP in the Google Cloud console, specify actions in the Add actions section of the job creation workflow.
  • When you configure a new job request to send to the DLP API, specify actions in the Action object.
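Putting the pieces together, the sketch below shows several actions attached to one job in JSON request form; resource names are placeholders. Actions that take no parameters are enabled with an empty object.

```python
# Sketch: multiple actions on one job, in the JSON form used by the
# job's "actions" field. Resource names are placeholders.
actions = [
    {"save_findings": {"output_config": {"table": {
        "project_id": "example-project",
        "dataset_id": "dlp_results",
        "table_id": "findings",
    }}}},
    {"pub_sub": {"topic": "projects/example-project/topics/dlp-scan-done"}},
    {"job_notification_emails": {}},  # email project owners / Essential Contacts
    {"publish_to_stackdriver": {}},   # send inspection metrics to Cloud Monitoring
]
```

The same list applies whether you create the job directly or define it inside a job trigger.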

For more information and sample code in several languages, see:

Example action scenario

You can use Cloud DLP actions to automate processes based on Cloud DLP scan results. Suppose you have a BigQuery table shared with an external partner. You want to ensure both that this table does not contain any sensitive identifiers like US Social Security numbers (the infoType US_SOCIAL_SECURITY_NUMBER), and that, if you find any, access is revoked from the partner. Here is a rough outline of a workflow that would use actions:

  1. Create a Cloud DLP job trigger to run an inspection scan of the BigQuery table every 24 hours.
  2. Set the action of these jobs to publish a Pub/Sub notification to the topic "projects/foo/topics/scan_notifications".
  3. Create a Cloud Function that listens for incoming messages on "projects/foo/topics/scan_notifications". This Cloud Function receives the name of the DLP job every 24 hours, calls Cloud DLP to get summary results from the job, and, if it finds any Social Security numbers, can change settings in BigQuery or Identity and Access Management (IAM) to restrict access to the table.
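The decision step of this workflow can be sketched as a small function. The summary shape below mirrors the per-infoType counts that a finished job's summary reports; the function name and sample data are hypothetical.

```python
# Hedged sketch of the workflow's decision step: given per-infoType counts
# from a finished DLP job's summary, decide whether partner access to the
# table should be revoked. Names and sample data are placeholders.
def should_revoke_access(info_type_stats, sensitive_info_type="US_SOCIAL_SECURITY_NUMBER"):
    """Return True if the job found any instance of the sensitive infoType."""
    for stat in info_type_stats:
        if stat["info_type"]["name"] == sensitive_info_type and stat["count"] > 0:
            return True
    return False

sample_stats = [
    {"info_type": {"name": "EMAIL_ADDRESS"}, "count": 12},
    {"info_type": {"name": "US_SOCIAL_SECURITY_NUMBER"}, "count": 3},
]
print(should_revoke_access(sample_stats))  # True
```

In the Cloud Function, a True result would trigger the BigQuery or IAM changes described in step 3.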

What's next