Sending Cloud DLP scan results to Cloud SCC

This guide walks you through using Cloud Data Loss Prevention (Cloud DLP) to scan specific Google Cloud Platform (GCP) resources and send the results to Cloud Security Command Center (Cloud SCC).

Cloud SCC enables you to gather data about, identify, and act on security threats before they can cause business damage or loss. With Cloud SCC, you can perform several security-related actions from a single, centralized dashboard.

Cloud Data Loss Prevention (DLP) integrates natively with Cloud SCC. When you use a Cloud DLP action to scan your GCP storage repositories for sensitive data, it can send results directly to the Cloud SCC dashboard so they display next to other security metrics.

By completing this guide, you'll do the following:

  • Enable Cloud SCC and Cloud DLP.
  • Set up Cloud DLP to scan a GCP storage repository—either a Cloud Storage bucket, BigQuery table, or Cloud Datastore kind.
  • Configure a Cloud DLP scan to send scan results to Cloud SCC.

For more information about Cloud SCC, see the Cloud Security Command Center documentation.

Costs

Following the instructions in this topic uses billable components of GCP, including:

  • Cloud Data Loss Prevention (DLP)
  • Cloud Storage
  • BigQuery
  • Cloud Datastore

Use the Pricing Calculator to generate a cost estimate based on your projected usage.

New GCP users might be eligible for a free trial.

Before you begin

Before you can send Cloud DLP scan results to Cloud SCC, you must set up the following components:

  • Step 1: Set up GCP storage repositories.
  • Step 2: Set Cloud Identity and Access Management (Cloud IAM) roles.
  • Step 3: Enable Cloud SCC.
  • Step 4: Enable Cloud DLP.
  • Step 5: Enable Cloud DLP as a security source for Cloud SCC.

The steps to set up these components are described in the sections below.

Step 1: Set up GCP storage repositories

Choose whether you want to scan your own GCP storage repository or an example one. This topic provides instructions for both scenarios.

Scan your own data

If you want to scan your own existing Cloud Storage bucket, BigQuery table, or Cloud Datastore kind, first open the project that the repository is in. In subsequent steps, you'll enable both Cloud SCC and Cloud DLP for this project and its organization.

After you open the project you want to use, proceed to setting up some Cloud IAM roles.

Scan sample data

If you want to scan a "dummy" or test set of data, first create a new project:

  1. In the GCP Console, click the popup menu at the top of the window.
  2. In the Select from [ORG_NAME] window, click New Project.
  3. On the New Project screen, ensure the Location box lists the correct organization. If not, change it.
  4. Click Create.

Make sure that billing is enabled for your project.

Learn how to enable billing
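If you prefer the command line, the following sketch creates the project and links billing using the gcloud CLI. This is an optional alternative to the console steps above; [PROJECT_ID], [ORG_ID], and [BILLING_ACCOUNT_ID] are placeholders for your own values.

# Create a new project under your organization.
gcloud projects create [PROJECT_ID] --organization=[ORG_ID]

# Link the project to a billing account so billable services can run.
gcloud beta billing projects link [PROJECT_ID] --billing-account=[BILLING_ACCOUNT_ID]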

Next, download and store the sample data:

  1. Go to the Cloud Functions tutorials repository on GitHub.
  2. Click the green Clone or download button, and then click Download ZIP.
  3. Uncompress the downloaded ZIP file.
  4. In the Google Cloud Platform Console, go to Cloud Storage.

    Go to Cloud Storage

    This page allows you to configure Cloud Storage buckets.

  5. In the Cloud Storage browser, click Create bucket.
  6. On the Create a bucket page, give the bucket a unique name, and then click Create.
  7. After you create your bucket, on the Bucket details page, click Upload folder.
  8. Go to the dlp-cloud-functions-tutorials-master folder that you uncompressed earlier, open it, and then select the sample_data folder. Click Upload to upload the folder's contents to Cloud Storage.

    [Screenshot: the new Cloud Storage bucket]

Note the name that you gave the Cloud Storage bucket; you'll need it later. After the file upload completes, you're ready to continue.
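Alternatively, you can stage the sample data from the command line. The following sketch assumes the tutorial repository lives at GoogleCloudPlatform/dlp-cloud-functions-tutorials on GitHub and that [BUCKET_NAME] is a globally unique bucket name of your choosing.

# Download the tutorial repository, which contains the sample_data folder.
git clone https://github.com/GoogleCloudPlatform/dlp-cloud-functions-tutorials.git

# Create the bucket, then upload the sample data in parallel.
gsutil mb gs://[BUCKET_NAME]
gsutil -m cp -r dlp-cloud-functions-tutorials/sample_data gs://[BUCKET_NAME]/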

Step 2: Set Cloud IAM roles

Before you can use Cloud DLP to send scan results to Cloud SCC, you'll need to enable some Cloud IAM roles. This section requires the Organization Administrator Cloud IAM role.

  1. Go to the GCP Console IAM & Admin page.

    Go to the IAM & Admin page

  2. Find your username in the Member column, and then click Edit.

  3. On the Edit permissions panel, click Add another role.
  4. From the Select a role drop-down list, select Security Command Center > Security Center Admin.
  5. Next, from the same drop-down list, select Cloud DLP > DLP Jobs Editor, and then click Save.

You now have the DLP Jobs Editor and Security Center Admin roles for your organization. These roles allow you to complete the tasks in the remainder of this topic.
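If you'd rather grant these roles from the command line, a minimal gcloud sketch looks like the following, assuming [ORG_ID] is your numeric organization ID and [USER_EMAIL] is the account you signed in with.

# Grant the Security Center Admin role at the organization level.
gcloud organizations add-iam-policy-binding [ORG_ID] \
    --member="user:[USER_EMAIL]" --role="roles/securitycenter.admin"

# Grant the DLP Jobs Editor role at the organization level.
gcloud organizations add-iam-policy-binding [ORG_ID] \
    --member="user:[USER_EMAIL]" --role="roles/dlp.jobsEditor"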

Step 3: Enable the Cloud SCC dashboard

  1. Go to the GCP Console Cloud Security Command Center Marketplace page.

    Go to the Cloud Security Command Center Marketplace page

  2. If the organization for which you want to enable Cloud SCC is not selected, select it from the popup menu.
  3. Under All current and future projects, click Enable. A message displays that Cloud SCC is starting asset discovery.

The GCP Console Cloud SCC page loads automatically, and Cloud SCC now displays your supported GCP assets.

For more information about enabling Cloud SCC, see the Cloud SCC documentation.

Step 4: Enable Cloud DLP

Enable Cloud DLP for the project you want to scan. The project must be within the same organization for which you've enabled Cloud SCC. To enable Cloud DLP using the GCP Console:

  1. Go to the GCP Console Manage resources page and select a project.

    Go to Manage resources page

    The project you select or create here must contain the Cloud Storage bucket, BigQuery table, or Cloud Datastore kind you want to scan.

  2. Click the following button to enable Cloud DLP:

    Enable Cloud DLP

When you are prompted to select a project where your application will be registered, find and select the project you want to use in the menu, and then click Continue. Cloud DLP is now enabled.
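You can also enable the Cloud DLP API from the command line, where [PROJECT_ID] is the ID of the project that contains the data you want to scan:

# Enable the Cloud DLP API for the project.
gcloud services enable dlp.googleapis.com --project=[PROJECT_ID]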

Step 5: Enable Cloud DLP as a security source for Cloud SCC

To view Cloud DLP scan findings in the Cloud SCC dashboard, you first need to enable Cloud DLP as a security source.

To add Cloud DLP as a security source, follow the steps below:

  1. Go to the GCP Console Cloud SCC page.

    Go to Cloud Security Command Center

  2. Click Settings in the top right corner of the page.

  3. Select the Security Sources tab.

  4. Under Enabled, click the toggle next to Cloud DLP Data Discovery.

Findings for Cloud DLP will display in the Findings cards on the Cloud SCC dashboard.

Configure and run a Cloud DLP inspection scan

In this section, you configure and run a Cloud DLP scan job.

The inspection job that you configure here instructs Cloud DLP to scan either the sample data stored in Cloud Storage (as described above) or your own data stored in Cloud Storage, Cloud Datastore, or BigQuery. The job configuration that you specify is also where you instruct Cloud DLP to save its scan results to Cloud SCC.

Step 1: Note your project identifier

  1. In the GCP Console, click the popup menu at the top of the window.
  2. In the Select from [ORG_NAME] window, make sure the organization name in the menu at the top of the window is the one for which you enabled Cloud SCC.
  3. Find the project that contains your Cloud Storage bucket with the sample data. Copy the project ID somewhere so that you can refer back to it later.
  4. Click the project's name to open it.
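If you prefer the command line, you can look up the project ID and make it your default instead:

# List the projects you can access; note the PROJECT_ID column.
gcloud projects list

# Optionally set the project as the default for later commands.
gcloud config set project [PROJECT_ID]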

Step 2: Open APIs Explorer and configure the job

  1. Open Google APIs Explorer.
  2. In the search box at the top of the page, enter dlp.projects.dlpJobs.create.
  3. In the Methods list that appears, click dlp.projects.dlpJobs.create.
  4. On the method page that appears, click the Authorize requests using OAuth 2.0 toggle.
  5. On the Select OAuth 2.0 scopes dialog that appears, click Authorize to authorize the default access level to Cloud DLP. If prompted, log in to your account.
  6. In the parent box, enter the following, where [PROJECT_ID] is the project ID you noted in Step 1 above: projects/[PROJECT_ID]
  7. Leave the fields box blank.
  8. Click in the Request body box. On the right side of the box, click the arrow. On the drop-down list that appears, click Freeform editor.

Replace the contents of the Request body field with the following JSON for the kind of data you want to use: sample data in a Cloud Storage bucket, or your own data stored in Cloud Storage, Cloud Datastore, or BigQuery.

Sample data

If you created a Cloud Storage bucket to store sample data as mentioned above, copy the following JSON and paste it into the Request body field. Replace [BUCKET_NAME] with the name you gave your Cloud Storage bucket:

{
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://[BUCKET_NAME]/**"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

Cloud Storage data

To scan your own Cloud Storage bucket, copy the following JSON and paste it into the Request body field.

Replace [PATH_NAME] with the path to the location you want to scan. To scan recursively, end the path with two asterisks (gs://path_to_files/**). To scan just a specific directory and no deeper, end the path with one asterisk (gs://path_to_files/*).

{
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://[PATH_NAME]"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

Datastore data

To scan your own data kept in Cloud Datastore, copy the following JSON and paste it into the Request body field.

Replace [DATASTORE_KIND] with the name of the Cloud Datastore kind. You can also replace [NAMESPACE_ID] and [PROJECT_ID] with the namespace and project identifiers, respectively, or you can remove the "partitionId" object entirely if you don't need it.

{
  "inspectJob":{
    "storageConfig":{
      "datastoreOptions":{
        "kind":{
          "name":"[DATASTORE_KIND]"
        },
        "partitionId":{
          "namespaceId":"[NAMESPACE_ID]",
          "projectId":"[PROJECT_ID]"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

BigQuery data

To scan your own BigQuery table, copy the following JSON and paste it into the Request body field.

Replace [PROJECT_ID], [BIGQUERY-DATASET-NAME], and [BIGQUERY-TABLE-NAME] with the project ID and the BigQuery dataset and table names, respectively.

{
  "inspectJob":
  {
    "storageConfig":
    {
      "bigQueryOptions":
      {
        "tableReference":
        {
          "projectId": "[PROJECT_ID]",
          "datasetId": "[BIGQUERY-DATASET-NAME]",
          "tableId": "[BIGQUERY-TABLE-NAME]"
        }
      }
    },
    "inspectConfig":
    {
      "infoTypes":
      [
        {
          "name": "EMAIL_ADDRESS"
        },
        {
          "name": "PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name": "PHONE_NUMBER"
        }
      ],
      "includeQuote": true,
      "minLikelihood": "UNLIKELY",
      "limits":
      {
        "maxFindingsPerRequest": 100
      }
    },
    "actions":
    [
      {
        "publishSummaryToCscc":
        {
        }
      }
    ]
  }
}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

Step 3: Execute the request to start the scan job

After you've configured the job by following the steps above, click Execute to send the request. If the request is successful, a response appears below the request with a success code and a JSON object that indicates the status of the Cloud DLP job you just created.
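If you'd rather call the REST API directly than use APIs Explorer, the following curl sketch sends the same request. It assumes you saved the JSON request body above to a local file named request.json and that the Cloud SDK is installed for authentication.

# Create the inspection job by POSTing the request body to the DLP API.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/dlpJobs"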

Check the status of the Cloud DLP inspection scan

The response to your scan request includes the job ID of your inspection scan job (the "name" key) and the current state of the inspection scan job (the "state" key). Because you just submitted the request, the job's state at that moment is "PENDING".

After you submit the scan request, the scan of your content begins immediately.

To check the status of the inspection scan job, open another browser tab, and then open Google APIs Explorer again. Enter dlp.projects.dlpJobs.get into the search box, and then click the dlp.projects.dlpJobs.get method. On the method page:

  1. Make sure the Authorize requests using OAuth 2.0 toggle is set to ON. If not, click the toggle to authorize the default access level to Cloud DLP.
  2. In the name box, type the name of the job from the JSON response to the scan request, which has the following form:
    projects/[PROJECT_ID]/dlpJobs/[JOB_ID]
    The job ID is in the form i-1234567890123456789.
  3. To submit the request, click Execute.

If the response JSON object's "state" key indicates that the job is "DONE", the scan job has finished.

To view the rest of the response JSON, scroll down the page. Under "result" > "infoTypeStats", each information type listed should have a corresponding "count". If not, go back and check that you entered the JSON accurately, and that the path or location to your data is correct.
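The same status check works outside APIs Explorer. For example, the following curl sketch fetches the job; re-run it until the "state" key reads "DONE". Replace [PROJECT_ID] and [JOB_ID] with the values from the creation response.

# Fetch the current state of the inspection job.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/dlpJobs/[JOB_ID]"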

After the scan job is done, you can proceed to the Cloud Security Command Center section of this guide.

View Cloud DLP scan results in Cloud SCC

Because you instructed Cloud DLP to send its inspection scan job results to Cloud SCC, you can now view a summary of the scan in the Cloud SCC dashboard:

The following screen shot of the Cloud SCC main dashboard shows the two cards in which you'll see data from Cloud DLP: Findings Summary and Cloud DLP Data Discovery.

[Screenshot: Cloud DLP detail in the Cloud SCC dashboard]

Note that the LOCATION infoType isn't listed. That's because the scan found no matches for the LOCATION infoType. Cloud DLP findings are totaled and displayed alongside other figures in the Findings Summary and listed by information type under Cloud DLP Data Discovery.

Findings Summary

If Cloud DLP sent any findings, a row with the source Cloud DLP Data Discovery displays. If that row doesn't appear in Findings Summary, Cloud DLP didn't send any findings to Cloud SCC.

Cloud DLP Data Discovery

Any infoTypes that successfully matched to scanned data will be listed in the Finding column. The number in the Count column corresponds to the number of scans that found data that matched that infoType. If a job is scanning multiple sources, the count may include each source that was scanned.

To see individual findings for a specific category, click on a row—for example, EMAIL_ADDRESS. A list of each scan in which at least one EMAIL_ADDRESS was found displays under the Findings tab, as shown in the following screen shot:

[Screenshot: Cloud DLP findings list in the Cloud SCC dashboard]

To display detail about a specific finding, click one of the EMAIL_ADDRESS rows to see the detail page, as shown below:

[Screenshot: Cloud DLP finding detail page in the Cloud SCC dashboard]
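If you prefer to inspect findings programmatically rather than in the dashboard, recent Cloud SDK releases include scc commands. The following sketch is illustrative only; it assumes [ORG_ID] is your numeric organization ID and [SOURCE_ID] is the source ID that Cloud SCC assigned to Cloud DLP Data Discovery in your organization.

# Look up the source ID for Cloud DLP Data Discovery.
gcloud scc sources describe [ORG_ID] --source-display-name="Cloud DLP Data Discovery"

# List the findings reported by that source.
gcloud scc findings list [ORG_ID] --source=[SOURCE_ID]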

Cleaning up

To avoid incurring charges to your GCP account for the resources used in this topic:

Deleting the project

The easiest way to eliminate billing is to delete the project you created while following the instructions provided in this topic.

To delete the project:

  1. In the GCP Console, go to the Projects page.

    Go to the Projects page

  2. In the project list, select the checkbox next to the name of the project you want to delete, and then click Delete project.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

If you delete your project using this method, the Cloud DLP job and Cloud Storage bucket you created are also deleted, and you don't need to follow the instructions in the remaining sections.
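Equivalently, you can shut down the project from the command line:

# Shut down the project; its resources are scheduled for deletion.
gcloud projects delete [PROJECT_ID]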

Deleting the Cloud DLP job

If you scanned your own data, you need to delete only the inspection scan job that you created:

  1. Open Google APIs Explorer. Enter dlp.projects.dlpJobs.delete in the search box, and then click the dlp.projects.dlpJobs.delete method.
  2. On the method page, make sure the Authorize requests using OAuth 2.0 toggle is set to ON. If not, click the toggle to authorize the default access level to Cloud DLP.
  3. In the name box, type the name of the job from the JSON response to the scan request, which has the following form:
    projects/[PROJECT_ID]/dlpJobs/[JOB_ID]
    The job ID is in the form i-1234567890123456789. To delete the job, click Execute.

If you created additional scan jobs or just want to make sure you've deleted the job successfully, you can list all the existing jobs. Click the back arrow, and then click the dlp.projects.dlpJobs.list method. In the parent box, type the project identifier in the following form:

projects/[PROJECT_ID]

Click Execute. If there are no jobs listed in the response, you've deleted all of the jobs. If there are, repeat the deletion procedure above for those jobs.
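The same cleanup can be scripted against the REST API; for example:

# Delete a specific inspection job.
curl -X DELETE \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/dlpJobs/[JOB_ID]"

# List any remaining jobs to confirm the deletion.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/dlpJobs"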

Deleting the Cloud Storage bucket

If you created a new Cloud Storage bucket to hold sample data, delete the bucket:

  1. Open the Cloud Storage browser.

    Open Cloud Storage

  2. In the Cloud Storage browser, select the checkbox next to the name of the bucket you created, and then click Delete.
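Or delete the bucket and its contents from the command line:

# Recursively delete the bucket and everything in it.
gsutil rm -r gs://[BUCKET_NAME]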
