This guide walks you through using Cloud Data Loss Prevention (DLP) to scan specific Google Cloud resources and send the results to Security Command Center.
Security Command Center enables you to gather data about, identify, and act on security threats before they can cause business damage or loss. With Security Command Center, you can perform several security-related actions from a single, centralized dashboard.
Cloud Data Loss Prevention (DLP) integrates natively with Security Command Center. When you use a Cloud DLP action to scan your Google Cloud storage repositories for sensitive data, it can send results directly to the Security Command Center dashboard so they display next to other security metrics.
The following video shows you how to set up Cloud DLP to send scan results to Security Command Center. The setup steps are also described in more detail later in this guide.
By completing the steps in this guide, you'll do the following:
- Enable Security Command Center and Cloud DLP.
- Set up Cloud DLP to scan a Google Cloud storage repository—either a Cloud Storage bucket, BigQuery table, or Datastore kind.
- Configure a Cloud DLP scan to send scan results to Security Command Center.
For more information about Security Command Center, see the Security Command Center documentation.
Costs
The instructions in this topic use billable components of Google Cloud, including:
- Cloud Data Loss Prevention (DLP)
- Cloud Storage
- BigQuery
- Datastore
Use the Pricing Calculator to generate a cost estimate based on your projected usage.
New Google Cloud users might be eligible for a free trial.
Before you begin
Before you can send Cloud DLP scan results to Security Command Center, you must set up the following components:
- Step 1: Set Google Cloud storage repositories.
- Step 2: Set Cloud Identity and Access Management (Cloud IAM) roles.
- Step 3: Enable Security Command Center.
- Step 4: Enable Cloud DLP.
- Step 5: Enable Cloud DLP as a security source for Security Command Center.
The steps to set up these components are described in the sections below.
Step 1: Set Google Cloud storage repositories
Choose whether you want to scan your own Google Cloud storage repository or an example one. This topic provides instructions for both scenarios.
Scan your own data
If you want to scan your own existing Cloud Storage bucket, BigQuery table, or Datastore kind, first open the project that the repository is in. In subsequent steps, you'll enable both Security Command Center and Cloud DLP for this project and its organization.
After you open the project you want to use, proceed to setting up some Cloud IAM roles.
Scan sample data
If you want to scan a "dummy" or test set of data, first make sure that you have a billing account set up, and then create a new project. To complete this step, you must have the Cloud IAM Project Creator role. Learn more about Cloud IAM roles.
- Set up a billing account if you don't already have one. Learn how to enable billing.
- Go to the New Project page in the Cloud Console.
- On the Organization drop-down list, select the organization that you want to create the project in.
- On the Location drop-down list, select the organization or folder that you want to create the project in.
- On the Billing account drop-down list, select the billing account that the project should be billed to.
Next, download and store the sample data:
- Go to the Cloud Functions tutorials repository on GitHub.
- Click Clone or download, and then click Download ZIP.
- Unzip the zip file that you downloaded.
- Go to the Storage Browser page in the Cloud Console.
- Click Create bucket.
- On the Create a bucket page, give the bucket a unique name, and then click Create.
- On the Bucket details page, click Upload folder.
- Go to the dlp-cloud-functions-tutorials-master folder that you unzipped earlier, open it, and then select the sample_data folder. Click Upload to upload the folder's contents to Cloud Storage.
Note the name that you gave the Cloud Storage bucket for later. After the file upload completes, you're ready to continue.
Step 2: Set Cloud IAM roles
To use Cloud DLP to send scan results to Security Command Center, you need the Security Center Admin and DLP Jobs Editor Cloud IAM roles. This section describes how to add the roles. To complete this section, you must have the Organization Administrator Cloud IAM role.
- Go to the Cloud Console IAM & Admin page.
- Under Member, click Edit next to your username.
- On the Edit permissions panel, click Add another role.
- On the Select a role drop-down list, select Security Command Center > Security Center Admin.
- Next, from the same drop-down list, select Cloud DLP > DLP Jobs Editor, and then click Save.
You now have DLP Jobs Editor and Security Center Admin roles for your organization. These roles will allow you to complete the tasks in the remainder of this topic.
Step 3: Enable Security Command Center
- Go to the Security Command Center page in the Cloud Console.
- On the Organization drop-down list, select the organization for which you want to enable Cloud DLP, and then click Select.
- On the Enable asset discovery page that appears, select All current and future projects, and then click Enable. A message should display that Security Command Center is beginning asset discovery.

After asset discovery is complete, Security Command Center displays your supported Google Cloud assets. Asset discovery might take a few minutes, and you might need to refresh the page to display the assets.
For more information about enabling Security Command Center, see the Security Command Center documentation.
Step 4: Enable Cloud DLP
Enable Cloud DLP for the project you want to scan. The project must be within the same organization for which you've enabled Security Command Center. To enable Cloud DLP using the Cloud Console:
- Go to the Cloud Console page to Register your application for Cloud DLP.
- On the Create a project drop-down list, select the project from Step 1 of this guide. The project must contain the Cloud Storage bucket, BigQuery table, or Datastore kind you want to scan.
- After you select the project you want to use, click Continue.
Cloud DLP is now enabled for your project.
Step 5: Enable Cloud DLP as a security source for Security Command Center
To view Cloud DLP scan findings in the Security Command Center dashboard, enable Cloud DLP as a security source:
- Go to the Security Command Center Security Sources page in the Cloud Console.
- Select the organization for which you want to enable Cloud DLP as a security source.
- Under Enabled, click to enable Cloud DLP Data Discovery.
Findings for Cloud DLP are displayed on the Findings page in the Security Command Center dashboard. For more information about how to manage Security Command Center security sources, see the Security Command Center documentation.
Configure and run a Cloud DLP inspection scan
In this section, you configure and run a Cloud DLP scan job.
The inspection job that you configure here instructs Cloud DLP to scan either the sample data stored in Cloud Storage described in the set storage repositories step earlier on this page, or your own data stored in Cloud Storage, Datastore, or BigQuery. The job configuration that you specify is also where you instruct Cloud DLP to save its scan results to Security Command Center.
Step 1: Note your project identifier
- Go to the Cloud Console.
- Click Select.
- On the Select from drop-down list, select the organization for which you enabled Security Command Center.
- Under ID, copy the project ID for the project that contains the data you want to scan. This is the project described in the set storage repositories step earlier on this page.
- Under Name, click the project to select it.
Step 2: Open APIs Explorer and configure the job
- Go to APIs Explorer on the reference page for the dlpJobs.create method.
- In the parent box, enter the following, where project-id is the project ID you noted earlier in Step 1:

  projects/project-id
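The parent string is simply the project ID prefixed with projects/. As a quick sketch, a hypothetical Python helper (the function name is illustrative, not part of the API):

```python
def make_parent(project_id: str) -> str:
    """Build the `parent` value expected by the dlpJobs.create request."""
    return f"projects/{project_id}"

print(make_parent("my-project"))  # → projects/my-project
```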
Replace the contents of the Request body field with the following JSON for the kind of data you want to use: sample data in a Cloud Storage bucket, or your own data stored in Cloud Storage, Datastore, or BigQuery.
Sample data
If you created a Cloud Storage bucket to store sample data as described in the set storage repositories step earlier on this page, copy the following JSON and then paste it into the Request body field. Replace bucket-name with the name that you gave your Cloud Storage bucket:
{
"inspectJob":{
"storageConfig":{
"cloudStorageOptions":{
"fileSet":{
"url":"gs://bucket-name/**"
}
}
},
"inspectConfig":{
"infoTypes":[
{
"name":"EMAIL_ADDRESS"
},
{
"name":"PERSON_NAME"
},
{
"name": "LOCATION"
},
{
"name":"PHONE_NUMBER"
}
],
"includeQuote":true,
"minLikelihood":"UNLIKELY",
"limits":{
"maxFindingsPerRequest":100
}
},
"actions":[
{
"publishSummaryToCscc":{
}
}
]
}
}
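If you prefer to assemble this request body programmatically rather than hand-editing JSON, the following Python sketch produces the same structure (the function name and bucket name are placeholders, not part of the API):

```python
import json

def build_inspect_job(bucket_name: str) -> dict:
    """Assemble the dlpJobs.create request body for a Cloud Storage scan."""
    return {
        "inspectJob": {
            "storageConfig": {
                "cloudStorageOptions": {
                    # `**` scans the bucket recursively.
                    "fileSet": {"url": f"gs://{bucket_name}/**"}
                }
            },
            "inspectConfig": {
                "infoTypes": [
                    {"name": n}
                    for n in ("EMAIL_ADDRESS", "PERSON_NAME",
                              "LOCATION", "PHONE_NUMBER")
                ],
                "includeQuote": True,
                "minLikelihood": "UNLIKELY",
                "limits": {"maxFindingsPerRequest": 100},
            },
            # This action sends the scan summary to Security Command Center.
            "actions": [{"publishSummaryToCscc": {}}],
        }
    }

print(json.dumps(build_inspect_job("my-sample-bucket"), indent=2))
```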
Cloud Storage data
To scan your own Cloud Storage bucket, copy the following JSON and paste it into the Request body field. Replace path-name with the path to the location that you want to scan. To scan recursively, end the path with two asterisks, for example, gs://path_to_files/**. To scan just a specific directory and no deeper, end the path with one asterisk, for example, gs://path_to_files/*.
{
"inspectJob":{
"storageConfig":{
"cloudStorageOptions":{
"fileSet":{
"url":"gs://path-name"
}
}
},
"inspectConfig":{
"infoTypes":[
{
"name":"EMAIL_ADDRESS"
},
{
"name":"PERSON_NAME"
},
{
"name": "LOCATION"
},
{
"name":"PHONE_NUMBER"
}
],
"includeQuote":true,
"minLikelihood":"UNLIKELY",
"limits":{
"maxFindingsPerRequest":100
}
},
"actions":[
{
"publishSummaryToCscc":{
}
}
]
}
}
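The wildcard convention above can be captured in a small helper. This is an illustrative sketch (the function name is not part of the API); it only formats the URL string that goes into the fileSet:

```python
def storage_scan_url(path: str, recursive: bool = True) -> str:
    """Return a fileSet URL: `**` scans recursively, `*` scans one level only."""
    suffix = "**" if recursive else "*"
    return f"gs://{path.rstrip('/')}/{suffix}"

print(storage_scan_url("path_to_files"))                   # → gs://path_to_files/**
print(storage_scan_url("path_to_files", recursive=False))  # → gs://path_to_files/*
```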
To learn more about the available scan options, see Inspecting storage and databases for sensitive data.
Datastore data
To scan your own data kept in Datastore, copy the following JSON and paste it into the Request body field. Replace datastore-kind with the name of the Datastore kind. You can also replace namespace-id and project-id with the namespace and project identifiers, respectively, or you can remove the "partitionId" object completely if you want.
{
"inspectJob":{
"storageConfig":{
"datastoreOptions":{
"kind":{
"name":"datastore-kind"
},
"partitionId":{
"namespaceId":"namespace-id",
"projectId":"project-id"
}
}
},
"inspectConfig":{
"infoTypes":[
{
"name":"EMAIL_ADDRESS"
},
{
"name":"PERSON_NAME"
},
{
"name": "LOCATION"
},
{
"name":"PHONE_NUMBER"
}
],
"includeQuote":true,
"minLikelihood":"UNLIKELY",
"limits":{
"maxFindingsPerRequest":100
}
},
"actions":[
{
"publishSummaryToCscc":{
}
}
]
}
}
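Because "partitionId" is optional, building the datastoreOptions object programmatically makes it easy to omit it cleanly when you don't need it. A hedged sketch (helper name is illustrative):

```python
def datastore_options(kind: str, namespace_id=None, project_id=None) -> dict:
    """Build datastoreOptions; omit partitionId entirely when no IDs are given."""
    options = {"kind": {"name": kind}}
    partition = {}
    if namespace_id:
        partition["namespaceId"] = namespace_id
    if project_id:
        partition["projectId"] = project_id
    if partition:
        options["partitionId"] = partition
    return options

print(datastore_options("Task"))
print(datastore_options("Task", project_id="my-project"))
```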
To learn more about the available scan options, see Inspecting storage and databases for sensitive data.
BigQuery data
To scan your own BigQuery table, copy the following JSON and paste it into the Request body field. Replace project-id, bigquery-dataset-name, and bigquery-table-name with the project ID and the BigQuery dataset and table names, respectively.
{
"inspectJob":
{
"storageConfig":
{
"bigQueryOptions":
{
"tableReference":
{
"projectId": "project-id",
"datasetId": "bigquery-dataset-name",
"tableId": "bigquery-table-name"
}
}
},
"inspectConfig":
{
"infoTypes":
[
{
"name": "EMAIL_ADDRESS"
},
{
"name": "PERSON_NAME"
},
{
"name": "LOCATION"
},
{
"name": "PHONE_NUMBER"
}
],
"includeQuote": true,
"minLikelihood": "UNLIKELY",
"limits":
{
"maxFindingsPerRequest": 100
}
},
"actions":
[
{
"publishSummaryToCscc":
{
}
}
]
}
}
To learn more about the available scan options, see Inspecting storage and databases for sensitive data.
Step 3: Execute the request to start the scan job
After you configure the job by following the preceding steps, click Execute to send the request. If the request is successful, a response appears below the request with a success code and a JSON object that indicates the status of the Cloud DLP job you just created.
Check the status of the Cloud DLP inspection scan
The response to your scan request includes the job ID of your inspection scan job as the "name" key, and the current state of the inspection scan job as the "state" key. Because you just submitted the request, the job's state at that moment is "PENDING".
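For example, the job ID can be pulled out of the "name" key with ordinary string handling. The response text below is an illustrative shape only; a real response contains additional fields:

```python
import json

# Illustrative response shape (not a complete real response).
response_text = (
    '{"name": "projects/my-project/dlpJobs/i-1234567890123456789",'
    ' "state": "PENDING"}'
)

job = json.loads(response_text)
# The job ID is the last path segment of the "name" value.
job_id = job["name"].rsplit("/", 1)[-1]
print(job_id, job["state"])  # → i-1234567890123456789 PENDING
```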
After you submit the scan request, the scan of your content begins immediately.
To check the status of the inspection scan job:
- Go to APIs Explorer on the reference page for the dlpJobs.get method.
- In the name box, type the name of the job from the JSON response to the scan request in the following form:

  projects/project-id/dlpJobs/job-id

  The job ID is in the form of i-1234567890123456789.
- To submit the request, click Execute.
If the response JSON object's "state" key indicates that the job is "DONE", then the scan job has finished.

To view the rest of the response JSON, scroll down the page. Under "result" > "infoTypeStats", each information type listed should have a corresponding "count". If not, make sure that you entered the JSON accurately, and that the path or location to your data is correct.
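The "result" > "infoTypeStats" check can be scripted. A sketch, assuming the result shape described above (note that int64 counts typically arrive as strings in the JSON API, so they are cast to int here):

```python
def summarize_counts(result: dict) -> dict:
    """Map each infoType name to its finding count from a DONE job's result."""
    return {
        stat["infoType"]["name"]: int(stat["count"])
        for stat in result.get("infoTypeStats", [])
    }

# Illustrative result fragment.
sample_result = {"infoTypeStats": [
    {"infoType": {"name": "EMAIL_ADDRESS"}, "count": "10"},
    {"infoType": {"name": "PHONE_NUMBER"}, "count": "4"},
]}
print(summarize_counts(sample_result))  # → {'EMAIL_ADDRESS': 10, 'PHONE_NUMBER': 4}
```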
After the scan job is done, you can continue to the next section of this guide to view scan results in Security Command Center.
View Cloud DLP scan results in Security Command Center
Because you instructed Cloud DLP to send its inspection scan job results to Security Command Center, you can now view a summary of the scan in the Security Command Center dashboard:
- Go to the Security Command Center page in the Cloud Console.
- Select the organization for which you enabled Security Command Center earlier.

On the Security Command Center main dashboard, Cloud DLP data appears in two cards: Findings Summary and Cloud DLP Data Discovery.
Note that the LOCATION infoType isn't listed. That's because there were no matches in Cloud DLP for the LOCATION infoType.
Cloud DLP findings are totaled and displayed alongside other
figures in the Findings Summary. Findings are also listed by information
type under Cloud DLP Data Discovery.
Findings Summary
If Cloud DLP sent any findings, a row displays with the source Cloud DLP Data Discovery. If that row doesn't appear in Findings Summary, then Cloud DLP didn't send any findings to Security Command Center.
Cloud DLP Data Discovery
Any infoTypes that successfully matched scanned data are listed in the Finding column. The number in the Count column corresponds to the number of scans that found data matching that infoType. If a job scans multiple sources, the count might include each source that was scanned.
To see individual findings for a specific category, click a row, for example, EMAIL_ADDRESS. A list of each scan in which at least one EMAIL_ADDRESS was found displays under the Findings tab.
To display details about a specific finding, click one of the EMAIL_ADDRESS rows to open the finding's detail page.
Cleaning up
To avoid incurring charges to your Google Cloud account for the resources used in this topic:
Deleting the project
The easiest way to eliminate billing is to delete the project you created while following the instructions provided in this topic.
To delete the project:
- In the Cloud Console, go to the Projects page.
- In the project list, select the project you want to delete and click Delete project.
- In the dialog, type the project ID, and then click Shut down to delete the project.

If you delete your project using this method, the Cloud DLP job and Cloud Storage bucket you created are also deleted, and you don't need to follow the instructions in the following sections.
Deleting the Cloud DLP job
If you scanned your own data, you only need to delete the inspection scan job you just created:

- Go to APIs Explorer on the reference page for the dlpJobs.delete method.
- In the name box, type the name of the job from the JSON response to the scan request, which has the following form:

  projects/project-id/dlpJobs/job-id

  The job ID is in the form of i-1234567890123456789.
- To submit the request, click Execute.
If you created additional scan jobs or if you want to make sure you've deleted the job successfully, you can list all of the existing jobs:
- Go to APIs Explorer on the reference page for the dlpJobs.list method.
- In the parent box, type the project identifier in the following form:

  projects/project-id
- Click Execute.
If there are no jobs listed in the response, you've deleted all of the jobs. If jobs are listed in the response, repeat the deletion procedure above for those jobs.
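A quick way to script that verification: pull the job names out of the list response. This is a sketch against an assumed response shape (a dict with an optional "jobs" array); the helper name is illustrative:

```python
def remaining_job_names(list_response: dict) -> list:
    """Extract job names from a dlpJobs.list response.

    A missing or empty `jobs` key means every job has been deleted.
    """
    return [job["name"] for job in list_response.get("jobs", [])]

print(remaining_job_names({"jobs": [{"name": "projects/p/dlpJobs/i-1"}]}))
print(remaining_job_names({}))  # → []
```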
Deleting the Cloud Storage bucket
If you created a new Cloud Storage bucket to hold sample data, delete the bucket:
- Open the Cloud Storage browser.
- In the Cloud Storage browser, select the checkbox next to the name of the bucket you created, and then click Delete.
What's next
- Learn more about the publishSummaryToCscc action in Cloud DLP.
- Learn more about scanning storage repositories for sensitive data using Cloud DLP.
- Learn how to use Security Command Center.
- Learn how to use Security Command Center.