Create and schedule Sensitive Data Protection inspection jobs

This topic describes in detail how to create a Sensitive Data Protection inspection job, and how to schedule recurring inspection jobs by creating a job trigger. For a quick walkthrough of how to create a new job trigger using the Sensitive Data Protection UI, see Quickstart: Creating a Sensitive Data Protection job trigger.

About inspection jobs and job triggers

When Sensitive Data Protection performs an inspection scan to identify sensitive data, each scan runs as a job. Sensitive Data Protection creates and runs a job resource whenever you tell it to inspect your Google Cloud storage repositories, including Cloud Storage buckets, BigQuery tables, Datastore kinds, and external data.

You schedule Sensitive Data Protection inspection scan jobs by creating job triggers. A job trigger automates the creation of Sensitive Data Protection jobs on a periodic basis, and can also be run on demand.

To learn more about jobs and job triggers in Sensitive Data Protection, see the Jobs and job triggers conceptual page.

Create a new inspection job

To create a new Sensitive Data Protection inspection job:

Console

In the Sensitive Data Protection section of the Google Cloud console, go to the Create job or job trigger page.

Go to Create job or job trigger

The Create job or job trigger page contains the following sections:

Choose input data

Name

Enter a name for the job. You can use letters, numbers, and hyphens. Naming your job is optional. If you don't enter a name, Sensitive Data Protection gives the job a unique number identifier.

Location

From the Storage type menu, choose the kind of repository that stores the data you want to scan:

  • Cloud Storage: Enter the URL of the bucket you want to scan, or choose Include/exclude from the Location type menu, and then click Browse to navigate to the bucket or subfolder you want to scan. Select the Scan folder recursively checkbox to scan the specified directory and all contained directories. Leave it cleared to scan only the specified directory and no deeper.
  • BigQuery: Enter the identifiers for the project, dataset, and table that you want to scan.
  • Datastore: Enter the identifiers for the project, namespace (optional), and kind that you want to scan.
  • Hybrid: You can add required labels, optional labels, and options for handling tabular data. For more information, see Types of metadata you can provide.

Sampling

Sampling is an optional way to save resources if you have a very large amount of data.

Under Sampling, you can choose whether to scan all the selected data or to sample the data by scanning a certain percentage. Sampling works differently depending on the type of storage repository you're scanning:

  • For BigQuery, you sample a subset of the total selected rows, corresponding to the percentage of rows you specify to include in the scan.
  • For Cloud Storage, if any file exceeds the size specified in Max byte size to scan per file, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.

To turn on sampling, choose one of the following options from the first menu:

  • Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size.
  • Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting applies only to files that exceed the specified maximum size: Sensitive Data Protection scans files under the maximum size in their entirety, and scans files over the maximum size only up to that maximum.

To perform a partial scan, you must also choose the percentage of data you want to scan. Use the slider to set the percentage.

You can also narrow the files or records to scan by date. To learn how, see Schedule, later in this topic.
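These console sampling options correspond to sampling fields on the job's storage configuration in the DLP API. The following Python sketch is illustrative only; the bucket, project, dataset, and table names are placeholders, and it shows one plausible way to express the same sampling choices when creating a job programmatically.

import google.cloud.dlp_v2

dlp = google.cloud.dlp_v2.DlpServiceClient()

# Cloud Storage sampling: scan at most 200 KB of each file, include roughly
# 10% of the files, and start sampling large files at a random offset.
gcs_storage_config = {
    "cloud_storage_options": {
        "file_set": {"url": "gs://your-bucket-name/*"},
        "bytes_limit_per_file": 200 * 1024,
        "files_limit_percent": 10,
        "sample_method": "RANDOM_START",
    }
}

# BigQuery sampling: scan roughly 10% of the rows, starting from the top.
bq_storage_config = {
    "big_query_options": {
        "table_reference": {
            "project_id": "your-project-id",
            "dataset_id": "your_dataset",
            "table_id": "your_table",
        },
        "rows_limit_percent": 10,
        "sample_method": "TOP",
    }
}

# Either storage configuration can be used in the inspect job.
response = dlp.create_dlp_job(
    request={
        "parent": "projects/your-project-id/locations/global",
        "inspect_job": {
            "storage_config": gcs_storage_config,
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        },
    }
)
print(response.name)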

Advanced configuration

When you create a job to scan Cloud Storage buckets or BigQuery tables, you can narrow your search by specifying an advanced configuration. Specifically, you can configure:

  • Files (Cloud Storage only): The file types to scan, which include text, binary, and image files.
  • Identifying fields (BigQuery only): Unique row identifiers within the table.
  • For Cloud Storage, if any file exceeds the size specified in Max byte size to scan per file, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.

To turn on sampling, choose the percentage of data you want to scan. Use the slider to set the percentage. Then, choose one of the following options from the first menu:

  • Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size (see above).
  • Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting applies only to files that exceed the specified maximum size: Sensitive Data Protection scans files under the maximum size in their entirety, and scans files over the maximum size only up to that maximum.

Files

For files stored in Cloud Storage, you can specify under Files the types to include in your scan.

You can choose binary, text, image, CSV, TSV, Microsoft Word, Microsoft Excel, Microsoft PowerPoint, PDF, and Apache Avro files. For an exhaustive list of file extensions that Sensitive Data Protection can scan in Cloud Storage buckets, see FileType. Choosing Binary causes Sensitive Data Protection to scan files of types that it can't recognize.
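The file-type selection corresponds to the file_types field of CloudStorageOptions in the API. As a minimal illustrative sketch (the bucket name is a placeholder), the following configuration limits a scan to CSV, plain-text, and PDF files:

# Restrict a Cloud Storage scan to specific FileType values; file types that
# are not listed (for example, BINARY_FILE or IMAGE) are skipped.
storage_config = {
    "cloud_storage_options": {
        "file_set": {"url": "gs://your-bucket-name/*"},
        "file_types": ["CSV", "TEXT_FILE", "PDF"],
    }
}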

Identifying fields

For BigQuery tables, in the Identifying fields field, you can direct Sensitive Data Protection to include the values of the table's primary key columns in the results. Doing so lets you link the findings back to the table rows that contain them.

Enter the names of the columns that uniquely identify each row in the table. If necessary, use dot notation to specify nested fields. You can add as many fields as you want.

You must also turn on the Save to BigQuery action to export the findings to BigQuery. When the findings are exported to BigQuery, each finding contains the respective values of the identifying fields. For more information, see identifyingFields.
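As a sketch of how these two settings fit together in the API (all project, dataset, table, and column names below are placeholders), the identifying fields are listed in the job's BigQueryOptions and a Save to BigQuery action is added to the same job:

inspect_job = {
    "storage_config": {
        "big_query_options": {
            "table_reference": {
                "project_id": "your-project-id",
                "dataset_id": "your_dataset",
                "table_id": "your_table",
            },
            # Columns that uniquely identify each row; use dot notation
            # (for example "record.id") for nested fields.
            "identifying_fields": [{"name": "row_id"}],
        }
    },
    "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
    "actions": [
        {
            # Required for the identifying field values to appear in the
            # exported findings.
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "your-project-id",
                        "dataset_id": "dlp_results",
                    }
                }
            }
        }
    ],
}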

Configure detection

The Configure detection section is where you specify the types of sensitive data you want to scan for. Completing this section is optional. If you skip this section, Sensitive Data Protection scans your data for a default set of infoTypes.

Template

You can optionally use a Sensitive Data Protection template to reuse configuration information you've specified previously.

If you have already created a template that you want to use, click the Template name field to see a list of existing inspection templates. Choose or type the name of the template that you want to use.

For more information about creating templates, see Creating Sensitive Data Protection inspection templates.

InfoTypes

InfoType detectors find sensitive data of a certain type. For example, the Sensitive Data Protection US_SOCIAL_SECURITY_NUMBER built-in infoType detector finds US Social Security numbers. In addition to the built-in infoType detectors, you can create your own custom infoType detectors.

Under InfoTypes, choose the infoType detectors that correspond to the data types you want to scan for. We don't recommend leaving this section blank. Doing so causes Sensitive Data Protection to scan your data with the default set of infoTypes, which might include infoTypes that you don't need. For more information about each detector, see the InfoType detector reference.

For more information about how to manage built-in and custom infoTypes, see Manage infoTypes through the Google Cloud console.

Inspection rulesets
Confidence threshold

Every time Sensitive Data Protection detects a potential match for sensitive data, it assigns it a likelihood value on a scale from "Very unlikely" to "Very likely." When you set a likelihood value here, you are instructing Sensitive Data Protection to match only on data that corresponds to that likelihood value or higher.

The default value of "Possible" is sufficient for most purposes. If you routinely get matches that are too broad, move the slider up. If you get too few matches, move the slider down.

When you're done, click Continue.

Add actions

In the Add actions step, select one or more actions that you want Sensitive Data Protection to take after the job completes.

You can configure the following actions:

  • Save to BigQuery: Save the Sensitive Data Protection job results to a BigQuery table. Before viewing or analyzing the results, first make sure that the job has completed.

    Each time a scan runs, Sensitive Data Protection saves scan findings to the BigQuery table you specify. The exported findings contain details about each finding's location and match likelihood. If you want each finding to include the string that matched the infoType detector, enable the Include quote option.

    If you don't specify a table ID, BigQuery assigns a default name to a new table the first time the scan runs. If you specify an existing table, Sensitive Data Protection appends scan findings to it.

    If you don't save findings to BigQuery, the scan results contain only statistics about the number and infoTypes of the findings.

    When data is written to a BigQuery table, the billing and quota usage are applied to the project that contains the destination table.

  • Publish to Pub/Sub: Publish a notification that contains the name of the Sensitive Data Protection job as an attribute to a Pub/Sub channel. You can specify one or more topics to send the notification message to. Make sure that the Sensitive Data Protection service account running the scan job has publishing access on the topics.

  • Publish to Security Command Center: Publish a summary of the job results to Security Command Center. For more information, see Send Sensitive Data Protection scan results to Security Command Center.

  • Publish to Dataplex: Send job results to Dataplex, Google Cloud's metadata management service.

  • Notify by email: Send an email when the job completes. The email goes to IAM project owners and technical Essential Contacts.

  • Publish to Cloud Monitoring: Send inspection results to Cloud Monitoring in Google Cloud's operations suite.

  • Make a de-identified copy: De-identify any findings in the inspected data, and write the de-identified content to a new file. You can then use the de-identified copy in your business processes, in place of data that contains sensitive information. For more information, see Create a de-identified copy of Cloud Storage data using Sensitive Data Protection in the Google Cloud console.

For more information, see Actions.
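In the API, each selected action becomes an entry in the job's actions list. The following Python sketch is a non-authoritative example (the project, dataset, and topic names are placeholders) that combines a Save to BigQuery action, a Pub/Sub notification, and a Security Command Center summary:

actions = [
    {
        # Save findings to BigQuery. Omitting table_id lets the service
        # generate a default table name on the first run.
        "save_findings": {
            "output_config": {
                "table": {
                    "project_id": "your-project-id",
                    "dataset_id": "dlp_results",
                }
            }
        }
    },
    {
        # Publish a notification, with the job name as an attribute, to an
        # existing Pub/Sub topic that the DLP service account can publish to.
        "pub_sub": {"topic": "projects/your-project-id/topics/dlp-notifications"}
    },
    {
        # Publish a summary of the results to Security Command Center.
        "publish_summary_to_cscc": {}
    },
]

# The actions list is attached to the inspect job configuration, for example:
# inspect_job = {"storage_config": ..., "inspect_config": ..., "actions": actions}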

When you're done selecting actions, click Continue.

Review

The Review section contains a JSON-formatted summary of the job settings you just specified.

Click Create to create the job (if you didn't specify a schedule) and to run it once. The job's information page appears, which contains its status and other information. If the job is currently running, you can click the Cancel button to stop it. You can also delete the job by clicking Delete.

To return to the main Sensitive Data Protection page, click the Back arrow in the Google Cloud console.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using System;
using System.Linq;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.StorageConfig.Types;

public class JobsCreate
{
    public static DlpJob CreateJob(string projectId, string gcsPath)
    {
        var dlp = DlpServiceClient.Create();

        var storageConfig = new StorageConfig
        {
            CloudStorageOptions = new CloudStorageOptions
            {
                FileSet = new CloudStorageOptions.Types.FileSet()
                {
                    Url = gcsPath
                }
            },
            TimespanConfig = new TimespanConfig
            {
                EnableAutoPopulationOfTimespanConfig = true
            }
        };

        var inspectConfig = new InspectConfig
        {
            InfoTypes = { new[] { "EMAIL_ADDRESS", "CREDIT_CARD_NUMBER" }.Select(it => new InfoType() { Name = it }) },
            IncludeQuote = true,
            MinLikelihood = Likelihood.Unlikely,
            Limits = new InspectConfig.Types.FindingLimits() { MaxFindingsPerItem = 100 }
        };

        var response = dlp.CreateDlpJob(new CreateDlpJobRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
            InspectJob = new InspectJobConfig
            {
                InspectConfig = inspectConfig,
                StorageConfig = storageConfig,
            }
        });

        Console.WriteLine($"Job: {response.Name} status: {response.State}");

        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// createJob creates an inspection job
func createJob(w io.Writer, projectID, gcsPath string, infoTypeNames []string) error {
	// projectID := "my-project-id"
	// gcsPath := "gs://" + "your-bucket-name" + "/path/to/file.txt"
	// infoTypeNames := []string{"EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER"}

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the GCS file to be inspected.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_CloudStorageOptions{
			CloudStorageOptions: &dlppb.CloudStorageOptions{
				FileSet: &dlppb.CloudStorageOptions_FileSet{
					Url: gcsPath,
				},
			},
		},

		// Set autoPopulateTimespan to true to scan only new content.
		TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
			EnableAutoPopulationOfTimespanConfig: true,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types.
	var infoTypes []*dlppb.InfoType
	for _, c := range infoTypeNames {
		infoTypes = append(infoTypes, &dlppb.InfoType{Name: c})
	}

	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:    infoTypes,
		IncludeQuote: true,

		// The minimum likelihood required before returning a match:
		// See: https://cloud.google.com/dlp/docs/likelihood
		MinLikelihood: dlppb.Likelihood_UNLIKELY,

		// The maximum number of findings to report (0 = server maximum)
		Limits: &dlppb.InspectConfig_FindingLimits{
			MaxFindingsPerItem: 100,
		},
	}

	// Create and send the request.
	req := dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: &dlppb.InspectJobConfig{
				InspectConfig: inspectConfig,
				StorageConfig: storageConfig,
			},
		},
	}

	// Send the request.
	response, err := client.CreateDlpJob(ctx, &req)
	if err != nil {
		return err
	}

	// Print the results.
	fmt.Fprintf(w, "Created a Dlp Job %v and Status is: %v", response.Name, response.State)
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.CloudStorageOptions;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.StorageConfig;
import com.google.privacy.dlp.v2.StorageConfig.TimespanConfig;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JobsCreate {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String gcsPath = "gs://" + "your-bucket-name" + "/path/to/file.txt";
    createJobs(projectId, gcsPath);
  }

  // Creates a DLP Job
  public static void createJobs(String projectId, String gcsPath) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Set autoPopulateTimespan to true to scan only new content
      boolean autoPopulateTimespan = true;
      TimespanConfig timespanConfig =
          TimespanConfig.newBuilder()
              .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan)
              .build();

      // Specify the GCS file to be inspected.
      CloudStorageOptions cloudStorageOptions =
          CloudStorageOptions.newBuilder()
              .setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath))
              .build();
      StorageConfig storageConfig =
          StorageConfig.newBuilder()
              .setCloudStorageOptions(cloudStorageOptions)
              .setTimespanConfig(timespanConfig)
              .build();

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());
      // The minimum likelihood required before returning a match:
      // See: https://cloud.google.com/dlp/docs/likelihood
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum)
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob createdDlpJob = dlpServiceClient.createDlpJob(createDlpJobRequest);
      System.out.println("Job created successfully: " + createdDlpJob.getName());
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Initialize google DLP Client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The Cloud Storage URL of the file set to inspect
// const cloudFileUrl = 'gs://YOUR-BUCKET/path/to/file.txt';

async function jobsCreate() {
  // Construct cloud storage configuration
  const cloudStorageConfig = {
    cloudStorageOptions: {
      fileSet: {
        url: cloudFileUrl,
      },
    },
    timespanConfig: {
      enableAutoPopulationOfTimespanConfig: true,
    },
  };

  // Construct inspect job configuration
  const inspectJob = {
    storageConfig: cloudStorageConfig,
  };

  // Construct inspect configuration
  const inspectConfig = {
    infoTypes: [
      {name: 'EMAIL_ADDRESS'},
      {name: 'PERSON_NAME'},
      {name: 'LOCATION'},
      {name: 'PHONE_NUMBER'},
    ],
    includeQuote: true,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.LIKELY,
    excludeInfoTypes: false,
  };

  // Combine configurations into a request for the service.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: inspectJob,
    inspectConfig: inspectConfig,
  };

  // Send the request and receive response from the service
  const [response] = await dlp.createDlpJob(request);
  // Print the results
  console.log(`Job created successfully: ${response.name}`);
}

jobsCreate();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig\TimespanConfig;

/**
 * Creates an inspection job with the Cloud Data Loss Prevention API.
 *
 * @param string $callingProjectId  The project ID to run the API call under.
 * @param string $gcsPath           GCS file to be inspected. Example : gs://GOOGLE_STORAGE_BUCKET_NAME/dlp_sample.csv
 */
function create_job(
    string $callingProjectId,
    string $gcsPath
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Set autoPopulateTimespan to true to scan only new content.
    $timespanConfig = (new TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig(true);

    // Specify the GCS file to be inspected.
    $cloudStorageOptions = (new CloudStorageOptions())
        ->setFileSet((new FileSet())
            ->setUrl($gcsPath));
    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions(($cloudStorageOptions))
        ->setTimespanConfig($timespanConfig);

    // ----- Construct inspection config -----
    $emailAddressInfoType = (new InfoType())
        ->setName('EMAIL_ADDRESS');
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $locationInfoType = (new InfoType())
        ->setName('LOCATION');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$emailAddressInfoType, $personNameInfoType, $locationInfoType, $phoneNumberInfoType];

    // Whether to include the matching string in the response.
    $includeQuote = true;
    // The minimum likelihood required before returning a match.
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // The maximum number of findings to report (0 = server maximum).
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest(100);

    // Create the Inspect configuration object.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote($includeQuote);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Configure the inspection job we want the service to perform.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    // Print results.
    printf($job->getName());
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import google.cloud.dlp

def create_dlp_job(
    project: str,
    bucket: str,
    info_types: list[str],
    job_id: str = None,
    max_findings: int = 100,
    auto_populate_timespan: bool = True,
) -> None:
    """Uses the Data Loss Prevention API to create a DLP job.
    Args:
        project: The project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket.
        info_types: A list of strings representing info types to look for.
            A full list of info type categories can be fetched from the API.
        job_id: The id of the job. If omitted, an id will be randomly generated.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}/*"
    storage_config = {
        "cloud_storage_options": {"file_set": {"url": url}},
        # Time-based configuration for each storage object.
        "timespan_config": {
            # Auto-populate start and end times in order to scan new objects
            # only.
            "enable_auto_population_of_timespan_config": auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {"inspect_config": inspect_config, "storage_config": storage_config}

    # Call the API.
    response = dlp.create_dlp_job(
        request={"parent": parent, "inspect_job": job, "job_id": job_id}
    )

    # Print out the result.
    print(f"Job : {response.name} status: {response.state}")

REST

In the DLP API, a job is represented by the DlpJobs resource. You can create a new job by using the DlpJob resource's projects.dlpJobs.create method.

This sample JSON can be sent in a POST request to the specified Sensitive Data Protection REST endpoint. This example JSON demonstrates how to create a job in Sensitive Data Protection. The job is a BigQuery inspection scan.

To quickly try this out, you can use the API Explorer that's embedded below. Keep in mind that a successful request, even one created in API Explorer, creates a new job. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

JSON input:

{
  "inspectJob": {
    "storageConfig": {
      "bigQueryOptions": {
        "tableReference": {
          "projectId": "bigquery-public-data",
          "datasetId": "san_francisco_sfpd_incidents",
          "tableId": "sfpd_incidents"
        }
      },
      "timespanConfig": {
        "startTime": "2020-01-01T00:00:01Z",
        "endTime": "2020-01-31T23:59:59Z",
        "timestampField": {
          "name": "timestamp"
        }
      }
    },
    "inspectConfig": {
      "infoTypes": [
        {
          "name": "PERSON_NAME"
        },
        {
          "name": "STREET_ADDRESS"
        }
      ],
      "excludeInfoTypes": false,
      "includeQuote": true,
      "minLikelihood": "LIKELY"
    },
    "actions": [
      {
        "saveFindings": {
          "outputConfig": {
            "table": {
              "projectId": "[PROJECT-ID]",
              "datasetId": "[DATASET-ID]"
            }
          }
        }
      }
    ]
  }
}

JSON output:

The following output indicates that the job was successfully created.

{
  "name": "projects/[PROJECT-ID]/dlpJobs/[JOB-ID]",
  "type": "INSPECT_JOB",
  "state": "PENDING",
  "inspectDetails": {
    "requestedOptions": {
      "snapshotInspectTemplate": {},
      "jobConfig": {
        "storageConfig": {
          "bigQueryOptions": {
            "tableReference": {
              "projectId": "bigquery-public-data",
              "datasetId": "san_francisco_sfpd_incidents",
              "tableId": "sfpd_incidents"
            }
          },
          "timespanConfig": {
            "startTime": "2020-01-01T00:00:01Z",
            "endTime": "2020-01-31T23:59:59Z",
            "timestampField": {
              "name": "timestamp"
            }
          }
        },
        "inspectConfig": {
          "infoTypes": [
            {
              "name": "PERSON_NAME"
            },
            {
              "name": "STREET_ADDRESS"
            }
          ],
          "minLikelihood": "LIKELY",
          "limits": {},
          "includeQuote": true
        },
        "actions": [
          {
            "saveFindings": {
              "outputConfig": {
                "table": {
                  "projectId": "[PROJECT-ID]",
                  "datasetId": "[DATASET-ID]",
                  "tableId": "[TABLE-ID]"
                }
              }
            }
          }
        ]
      }
    },
    "result": {}
  },
  "createTime": "2020-07-10T07:26:33.643Z"
}

Create a new job trigger

To create a new Sensitive Data Protection job trigger:

Console

In the Sensitive Data Protection section of the Google Cloud console, go to the Create job or job trigger page.

Go to Create job or job trigger

The Create job or job trigger page contains the following sections:

Choose input data

Name

Enter a name for the job trigger. You can use letters, numbers, and hyphens. Naming your job trigger is optional. If you don't enter a name, Sensitive Data Protection gives the job trigger a unique number identifier.

Location

From the Storage type menu, choose the kind of repository that stores the data you want to scan:

  • Cloud Storage: Enter the URL of the bucket you want to scan, or choose Include/exclude from the Location type menu, and then click Browse to navigate to the bucket or subfolder you want to scan. Select the Scan folder recursively checkbox to scan the specified directory and all contained directories. Leave it cleared to scan only the specified directory and no deeper.
  • BigQuery: Enter the identifiers for the project, dataset, and table that you want to scan.
  • Datastore: Enter the identifiers for the project, namespace (optional), and kind that you want to scan.

Sampling

Sampling is an optional way to save resources if you have a very large amount of data.

Under Sampling, you can choose whether to scan all the selected data or to sample the data by scanning a certain percentage. Sampling works differently depending on the type of storage repository you're scanning:

  • For BigQuery, you sample a subset of the total selected rows, corresponding to the percentage of rows you specify to include in the scan.
  • For Cloud Storage, if any file exceeds the size specified in Max byte size to scan per file, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.

To turn on sampling, choose one of the following options from the first menu:

  • Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size (see above).
  • Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting applies only to files that exceed the specified maximum size: Sensitive Data Protection scans files under the maximum size in their entirety, and scans files over the maximum size only up to that maximum.

To perform a partial scan, you must also choose the percentage of data you want to scan. Use the slider to set the percentage.

Advanced configuration

When you create a job trigger to scan Cloud Storage buckets or BigQuery tables, you can narrow your search by specifying an advanced configuration. Specifically, you can configure:

  • Files (Cloud Storage only): The file types to scan, which include text, binary, and image files.
  • Identifying fields (BigQuery only): Unique row identifiers within the table.
  • For Cloud Storage, if any file exceeds the size specified in Max byte size to scan per file, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.

To turn on sampling, choose the percentage of data you want to scan. Use the slider to set the percentage. Then, choose one of the following options from the first menu:

  • Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size (see above).
  • Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting applies only to files that exceed the specified maximum size: Sensitive Data Protection scans files under the maximum size in their entirety, and scans files over the maximum size only up to that maximum.

Files

For files stored in Cloud Storage, you can specify under Files the types to include in your scan.

You can choose binary, text, image, Microsoft Word, Microsoft Excel, Microsoft PowerPoint, PDF, and Apache Avro files. For an exhaustive list of file extensions that Sensitive Data Protection can scan in Cloud Storage buckets, see FileType. Choosing Binary causes Sensitive Data Protection to scan files of types that it can't recognize.

Identifying fields

For BigQuery tables, in the Identifying fields field, you can direct Sensitive Data Protection to include the values of the table's primary key columns in the results. Doing so lets you link the findings back to the table rows that contain them.

Enter the names of the columns that uniquely identify each row in the table. If necessary, use dot notation to specify nested fields. You can add as many fields as you want.

You must also turn on the Save to BigQuery action to export the findings to BigQuery. When the findings are exported to BigQuery, each finding contains the respective values of the identifying fields. For more information, see identifyingFields.

Configure detection

The Configure detection section is where you specify the types of sensitive data you want to scan for. Completing this section is optional. If you skip this section, Sensitive Data Protection scans your data for a default set of infoTypes.

Template

You can optionally use a Sensitive Data Protection template to reuse configuration information you've specified previously.

If you have already created a template that you want to use, click the Template name field to see a list of existing inspection templates. Choose or type the name of the template that you want to use.

For more information about creating templates, see Creating Sensitive Data Protection inspection templates.

InfoTypes

InfoType detectors find sensitive data of a certain type. For example, the Sensitive Data Protection US_SOCIAL_SECURITY_NUMBER built-in infoType detector finds US Social Security numbers. In addition to the built-in infoType detectors, you can create your own custom infoType detectors.

Under InfoTypes, choose the infoType detectors that correspond to the data types you want to scan for. You can also leave this field blank to scan for all default infoTypes. For more information about each detector, see the InfoType detector reference.

You can also add custom infoType detectors in the Custom infoTypes section, and customize both the built-in and the custom infoType detectors in the Inspection rulesets section.
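As an illustration of how a custom detector is combined with built-in ones in the API, the following Python sketch defines a hypothetical regular-expression infoType named C_EMPLOYEE_ID; the name and pattern are examples only, not part of the built-in detector set:

inspect_config = {
    # Built-in detectors to scan for.
    "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
    # A custom regular-expression detector that matches strings like "E-123456".
    "custom_info_types": [
        {
            "info_type": {"name": "C_EMPLOYEE_ID"},
            "regex": {"pattern": r"E-\d{6}"},
            "likelihood": "POSSIBLE",
        }
    ],
}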

Custom infoTypes
Inspection rulesets
Confidence threshold

Every time Sensitive Data Protection detects a potential match for sensitive data, it assigns it a likelihood value on a scale from "Very unlikely" to "Very likely." When you set a likelihood value here, you are instructing Sensitive Data Protection to match only on data that corresponds to that likelihood value or higher.

The default value of "Possible" is sufficient for most purposes. If you routinely get matches that are too broad, move the slider up. If you get too few matches, move the slider down.

When you're done, click Continue.

Add actions

In the Add actions step, select one or more actions that you want Sensitive Data Protection to take after the job completes.

You can configure the following actions:

  • Save to BigQuery: Save the Sensitive Data Protection job results to a BigQuery table. Before viewing or analyzing the results, first make sure that the job has completed.

    Each time a scan runs, Sensitive Data Protection saves scan findings to the BigQuery table you specify. The exported findings contain details about each finding's location and match likelihood. If you want each finding to include the string that matched the infoType detector, enable the Include quote option.

    If you don't specify a table ID, BigQuery assigns a default name to a new table the first time the scan runs. If you specify an existing table, Sensitive Data Protection appends scan findings to it.

    If you don't save findings to BigQuery, the scan results contain only statistics about the number and infoTypes of the findings.

    When data is written to a BigQuery table, the billing and quota usage are applied to the project that contains the destination table.

  • Publish to Pub/Sub: Publish a notification that contains the name of the Sensitive Data Protection job as an attribute to a Pub/Sub channel. You can specify one or more topics to send the notification message to. Make sure that the Sensitive Data Protection service account running the scan job has publishing access on the topics.

  • Publish to Security Command Center: Publish a summary of the job results to Security Command Center. For more information, see Send Sensitive Data Protection scan results to Security Command Center.

  • Publish to Dataplex: Send job results to Dataplex, Google Cloud's metadata management service.

  • Notify by email: Send an email when the job completes. The email goes to IAM project owners and technical Essential Contacts.

  • Publish to Cloud Monitoring: Send inspection results to Cloud Monitoring in Google Cloud's operations suite.

  • Make a de-identified copy: De-identify any findings in the inspected data, and write the de-identified content to a new file. You can then use the de-identified copy in your business processes, in place of data that contains sensitive information. For more information, see Create a de-identified copy of Cloud Storage data using Sensitive Data Protection in the Google Cloud console.

For more information, see Actions.

When you're done selecting actions, click Continue.

Schedule

In the Schedule section, you can do the following two things:

  • Specify time span: This option limits the files or rows to scan by date. Click Start time to specify the earliest file timestamp to include. Leave this value blank to include all files. Click End time to specify the latest file timestamp to include. Leave this value blank to specify no upper timestamp limit.
  • Create a trigger to run the job on a periodic schedule: This option creates the job trigger and sets it to run the job you've specified on a periodic schedule. The default value, which is also the minimum, is 24 hours. The maximum value is 60 days. If you want Sensitive Data Protection to scan only new files or rows, select the Limit scans only to new content checkbox. A rough API-level sketch of both options follows this list.
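In the API, the time span maps to a timespan_config on the job's storage configuration, and the periodic schedule maps to a recurrence_period_duration on the trigger. The following Python sketch is illustrative only; the dates, column name, and table identifiers are placeholders:

import datetime

# Limit the scan to rows whose "update_time" column falls inside a window.
storage_config = {
    "big_query_options": {
        "table_reference": {
            "project_id": "your-project-id",
            "dataset_id": "your_dataset",
            "table_id": "your_table",
        }
    },
    "timespan_config": {
        "start_time": datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc),
        "end_time": datetime.datetime(2024, 1, 31, 23, 59, 59, tzinfo=datetime.timezone.utc),
        "timestamp_field": {"name": "update_time"},
    },
}

# Run the inspect job every 24 hours, the minimum recurrence period.
job_trigger = {
    "inspect_job": {
        "storage_config": storage_config,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
    },
    "triggers": [
        {"schedule": {"recurrence_period_duration": {"seconds": 24 * 60 * 60}}}
    ],
    "status": "HEALTHY",
}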

Review

The Review section contains a JSON-formatted summary of the job settings you just specified.

Click Create to create the job trigger (if you specified a schedule). The job trigger's information page appears, which contains its status and other information. If a job is currently running, you can click the Cancel button to stop it. You can also delete the job trigger by clicking Delete.

To return to the main Sensitive Data Protection page, click the Back arrow in the Google Cloud console.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using System;
using System.Collections.Generic;
using static Google.Cloud.Dlp.V2.CloudStorageOptions.Types;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;
using static Google.Cloud.Dlp.V2.JobTrigger.Types;
using static Google.Cloud.Dlp.V2.StorageConfig.Types;

public class TriggersCreate
{
    public static JobTrigger Create(
        string projectId,
        string bucketName,
        Likelihood minLikelihood,
        int maxFindings,
        bool autoPopulateTimespan,
        int scanPeriod,
        IEnumerable<InfoType> infoTypes,
        string triggerId,
        string displayName,
        string description)
    {
        var dlp = DlpServiceClient.Create();

        var jobConfig = new InspectJobConfig
        {
            InspectConfig = new InspectConfig
            {
                MinLikelihood = minLikelihood,
                Limits = new FindingLimits
                {
                    MaxFindingsPerRequest = maxFindings
                },
                InfoTypes = { infoTypes }
            },
            StorageConfig = new StorageConfig
            {
                CloudStorageOptions = new CloudStorageOptions
                {
                    FileSet = new FileSet
                    {
                        Url = $"gs://{bucketName}/*"
                    }
                },
                TimespanConfig = new TimespanConfig
                {
                    EnableAutoPopulationOfTimespanConfig = autoPopulateTimespan
                }
            }
        };

        var jobTrigger = new JobTrigger
        {
            Triggers =
            {
                new Trigger
                {
                    Schedule = new Schedule
                    {
                        RecurrencePeriodDuration = new Google.Protobuf.WellKnownTypes.Duration
                        {
                            Seconds = scanPeriod * 60 * 60 * 24
                        }
                    }
                }
            },
            InspectJob = jobConfig,
            Status = Status.Healthy,
            DisplayName = displayName,
            Description = description
        };

        var response = dlp.CreateJobTrigger(
            new CreateJobTriggerRequest
            {
                Parent = new LocationName(projectId, "global").ToString(),
                JobTrigger = jobTrigger,
                TriggerId = triggerId
            });

        Console.WriteLine($"Successfully created trigger {response.Name}");
        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"github.com/golang/protobuf/ptypes/duration"
)

// createTrigger creates a trigger with the given configuration.
func createTrigger(w io.Writer, projectID string, triggerID, displayName, description, bucketName string, infoTypeNames []string) error {
	// projectID := "my-project-id"
	// triggerID := "my-trigger"
	// displayName := "My Trigger"
	// description := "My trigger description"
	// bucketName := "my-bucket"
	// infoTypeNames := []string{"US_SOCIAL_SECURITY_NUMBER"}

	ctx := context.Background()

	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Convert the info type strings to a list of InfoTypes.
	var infoTypes []*dlppb.InfoType
	for _, it := range infoTypeNames {
		infoTypes = append(infoTypes, &dlppb.InfoType{Name: it})
	}

	// Create a configured request.
	req := &dlppb.CreateJobTriggerRequest{
		Parent:    fmt.Sprintf("projects/%s/locations/global", projectID),
		TriggerId: triggerID,
		JobTrigger: &dlppb.JobTrigger{
			DisplayName: displayName,
			Description: description,
			Status:      dlppb.JobTrigger_HEALTHY,
			// Triggers control when the job will start.
			Triggers: []*dlppb.JobTrigger_Trigger{
				{
					Trigger: &dlppb.JobTrigger_Trigger_Schedule{
						Schedule: &dlppb.Schedule{
							Option: &dlppb.Schedule_RecurrencePeriodDuration{
								RecurrencePeriodDuration: &duration.Duration{
									Seconds: 10 * 60 * 60 * 24, // 10 days in seconds.
								},
							},
						},
					},
				},
			},
			// Job configures the job to run when the trigger runs.
			Job: &dlppb.JobTrigger_InspectJob{
				InspectJob: &dlppb.InspectJobConfig{
					InspectConfig: &dlppb.InspectConfig{
						InfoTypes:     infoTypes,
						MinLikelihood: dlppb.Likelihood_POSSIBLE,
						Limits: &dlppb.InspectConfig_FindingLimits{
							MaxFindingsPerRequest: 10,
						},
					},
					StorageConfig: &dlppb.StorageConfig{
						Type: &dlppb.StorageConfig_CloudStorageOptions{
							CloudStorageOptions: &dlppb.CloudStorageOptions{
								FileSet: &dlppb.CloudStorageOptions_FileSet{
									Url: "gs://" + bucketName + "/*",
								},
							},
						},
						// Time-based configuration for each storage object. See more at
						// https://cloud.google.com/dlp/docs/reference/rest/v2/InspectJobConfig#TimespanConfig
						TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
							// Auto-populate start and end times in order to scan new objects only.
							EnableAutoPopulationOfTimespanConfig: true,
						},
					},
				},
			},
		},
	}

	// Send the request.
	resp, err := client.CreateJobTrigger(ctx, req)
	if err != nil {
		return fmt.Errorf("CreateJobTrigger: %w", err)
	}
	fmt.Fprintf(w, "Successfully created trigger: %v", resp.GetName())
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.CloudStorageOptions;
import com.google.privacy.dlp.v2.CreateJobTriggerRequest;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.Schedule;
import com.google.privacy.dlp.v2.StorageConfig;
import com.google.privacy.dlp.v2.StorageConfig.TimespanConfig;
import com.google.protobuf.Duration;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TriggersCreate {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String gcsPath = "gs://" + "your-bucket-name" + "/path/to/file.txt";
    createTrigger(projectId, gcsPath);
  }

  public static void createTrigger(String projectId, String gcsPath) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Set autoPopulateTimespan to true to scan only new content
      boolean autoPopulateTimespan = true;
      TimespanConfig timespanConfig =
          TimespanConfig.newBuilder()
              .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan)
              .build();

      // Specify the GCS file to be inspected.
      CloudStorageOptions cloudStorageOptions =
          CloudStorageOptions.newBuilder()
              .setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath))
              .build();
      StorageConfig storageConfig =
          StorageConfig.newBuilder()
              .setCloudStorageOptions(cloudStorageOptions)
              .setTimespanConfig(timespanConfig)
              .build();

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      List<InfoType> infoTypes =
          Stream.of("PHONE_NUMBER", "EMAIL_ADDRESS", "CREDIT_CARD_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      InspectConfig inspectConfig = InspectConfig.newBuilder().addAllInfoTypes(infoTypes).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .build();

      // Set scanPeriod to the number of days between scans (minimum: 1 day)
      int scanPeriod = 1;

      // Optionally set a display name of max 100 chars and a description of max 250 chars
      String displayName = "Daily Scan";
      String description = "A daily inspection for personally identifiable information.";

      // Schedule scan of GCS bucket every scanPeriod number of days (minimum = 1 day)
      Duration duration = Duration.newBuilder().setSeconds(scanPeriod * 24 * 3600).build();
      Schedule schedule = Schedule.newBuilder().setRecurrencePeriodDuration(duration).build();
      JobTrigger.Trigger trigger = JobTrigger.Trigger.newBuilder().setSchedule(schedule).build();
      JobTrigger jobTrigger =
          JobTrigger.newBuilder()
              .setInspectJob(inspectJobConfig)
              .setDisplayName(displayName)
              .setDescription(description)
              .setStatus(JobTrigger.Status.HEALTHY)
              .addTriggers(trigger)
              .build();

      // Create scan request to be sent by client
      CreateJobTriggerRequest createJobTriggerRequest =
          CreateJobTriggerRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setJobTrigger(jobTrigger)
              .build();

      // Send the scan request and process the response
      JobTrigger createdJobTrigger = dlpServiceClient.createJobTrigger(createJobTriggerRequest);

      System.out.println("Created Trigger: " + createdJobTrigger.getName());
      System.out.println("Display Name: " + createdJobTrigger.getDisplayName());
      System.out.println("Description: " + createdJobTrigger.getDescription());
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// (Optional) The name of the trigger to be created.
// const triggerId = 'my-trigger';

// (Optional) A display name for the trigger to be created
// const displayName = 'My Trigger';

// (Optional) A description for the trigger to be created
// const description = "This is a sample trigger.";

// The name of the bucket to scan.
// const bucketName = 'YOUR-BUCKET';

// Limit scan to new content only.
// const autoPopulateTimespan = true;

// How often to wait between scans, in days (minimum = 1 day)
// const scanPeriod = 1;

// The infoTypes of information to match
// const infoTypes = [{ name: 'PHONE_NUMBER' }, { name: 'EMAIL_ADDRESS' }, { name: 'CREDIT_CARD_NUMBER' }];

// The minimum likelihood required before returning a match
// const minLikelihood = 'LIKELIHOOD_UNSPECIFIED';

// The maximum number of findings to report per request (0 = server maximum)
// const maxFindings = 0;

async function createTrigger() {
  // Get reference to the bucket to be inspected
  const storageItem = {
    cloudStorageOptions: {
      fileSet: {url: `gs://${bucketName}/*`},
    },
    timeSpanConfig: {
      enableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
    },
  };

  // Construct job to be triggered
  const job = {
    inspectConfig: {
      infoTypes: infoTypes,
      minLikelihood: minLikelihood,
      limits: {
        maxFindingsPerRequest: maxFindings,
      },
    },
    storageConfig: storageItem,
  };

  // Construct trigger creation request
  const request = {
    parent: `projects/${projectId}/locations/global`,
    jobTrigger: {
      inspectJob: job,
      displayName: displayName,
      description: description,
      triggers: [
        {
          schedule: {
            recurrencePeriodDuration: {
              seconds: scanPeriod * 60 * 60 * 24, // Trigger the scan daily
            },
          },
        },
      ],
      status: 'HEALTHY',
    },
    triggerId: triggerId,
  };

  // Run trigger creation request
  const [trigger] = await dlp.createJobTrigger(request);
  console.log(`Successfully created trigger ${trigger.name}.`);
}

createTrigger();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;
use Google\Cloud\Dlp\V2\CreateJobTriggerRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\JobTrigger\Status;
use Google\Cloud\Dlp\V2\JobTrigger\Trigger;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\Schedule;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig\TimespanConfig;
use Google\Protobuf\Duration;

/**
 * Create a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId     The project ID to run the API call under
 * @param string $bucketName           The name of the bucket to scan
 * @param string $triggerId            (Optional) The name of the trigger to be created
 * @param string $displayName          (Optional) The human-readable name to give the trigger
 * @param string $description          (Optional) A description for the trigger to be created
 * @param int    $scanPeriod           (Optional) How often to wait between scans, in days (minimum = 1 day)
 * @param bool   $autoPopulateTimespan (Optional) Automatically limit scan to new content only
 * @param int    $maxFindings          (Optional) The maximum number of findings to report per request (0 = server maximum)
 */
function create_trigger(
    string $callingProjectId,
    string $bucketName,
    string $triggerId,
    string $displayName,
    string $description,
    int $scanPeriod,
    bool $autoPopulateTimespan,
    int $maxFindings
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // ----- Construct job config -----
    // The infoTypes of information to match
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$personNameInfoType, $phoneNumberInfoType];

    // The minimum likelihood required before returning a match
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // Specify finding limits
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest($maxFindings);

    // Create the inspectConfig object
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes);

    // Create triggers
    $duration = (new Duration())
        ->setSeconds($scanPeriod * 60 * 60 * 24);

    $schedule = (new Schedule())
        ->setRecurrencePeriodDuration($duration);

    $triggerObject = (new Trigger())
        ->setSchedule($schedule);

    // Create the storageConfig object
    $fileSet = (new FileSet())
        ->setUrl('gs://' . $bucketName . '/*');

    $storageOptions = (new CloudStorageOptions())
        ->setFileSet($fileSet);

    // Auto-populate start and end times in order to scan new objects only.
    $timespanConfig = (new TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig($autoPopulateTimespan);

    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions($storageOptions)
        ->setTimespanConfig($timespanConfig);

    // Construct the jobConfig object
    $jobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig);

    // ----- Construct trigger object -----
    $jobTriggerObject = (new JobTrigger())
        ->setTriggers([$triggerObject])
        ->setInspectJob($jobConfig)
        ->setStatus(Status::HEALTHY)
        ->setDisplayName($displayName)
        ->setDescription($description);

    // Run trigger creation request
    $parent = $dlp->locationName($callingProjectId, 'global');
    $createJobTriggerRequest = (new CreateJobTriggerRequest())
        ->setParent($parent)
        ->setJobTrigger($jobTriggerObject)
        ->setTriggerId($triggerId);
    $trigger = $dlp->createJobTrigger($createJobTriggerRequest);

    // Print results
    printf('Successfully created trigger %s' . PHP_EOL, $trigger->getName());
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

from typing import List, Optional

import google.cloud.dlp

def create_trigger(
    project: str,
    bucket: str,
    scan_period_days: int,
    info_types: List[str],
    trigger_id: Optional[str] = None,
    display_name: Optional[str] = None,
    description: Optional[str] = None,
    min_likelihood: Optional[int] = None,
    max_findings: Optional[int] = None,
    auto_populate_timespan: Optional[bool] = False,
) -> None:
    """Creates a scheduled Data Loss Prevention API inspect_content trigger.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket using a wildcard.
        scan_period_days: How often to repeat the scan, in days.
            The minimum is 1 day.
        info_types: A list of strings representing info types to look for.
            A full list of info type categories can be fetched from the API.
        trigger_id: The id of the trigger. If omitted, an id will be randomly
            generated.
        display_name: The optional display name of the trigger.
        description: The optional description of the trigger.
        min_likelihood: A string representing the minimum likelihood threshold
            that constitutes a match. One of: 'LIKELIHOOD_UNSPECIFIED',
            'VERY_UNLIKELY', 'UNLIKELY', 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": min_likelihood,
        "limits": {"max_findings_per_request": max_findings},
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}/*"
    storage_config = {
        "cloud_storage_options": {"file_set": {"url": url}},
        # Time-based configuration for each storage object.
        "timespan_config": {
            # Auto-populate start and end times in order to scan new objects
            # only.
            "enable_auto_population_of_timespan_config": auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {"inspect_config": inspect_config, "storage_config": storage_config}

    # Construct the schedule definition:
    schedule = {
        "recurrence_period_duration": {"seconds": scan_period_days * 60 * 60 * 24}
    }

    # Construct the trigger definition.
    job_trigger = {
        "inspect_job": job,
        "display_name": display_name,
        "description": description,
        "triggers": [{"schedule": schedule}],
        "status": google.cloud.dlp_v2.JobTrigger.Status.HEALTHY,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_job_trigger(
        request={"parent": parent, "job_trigger": job_trigger, "trigger_id": trigger_id}
    )

    print(f"Successfully created trigger {response.name}")

REST

In the DLP API, a job trigger is represented by the JobTrigger resource. You can create a new job trigger by using the JobTrigger resource's projects.jobTriggers.create method.

This sample JSON can be sent in a POST request to the specified Sensitive Data Protection REST endpoint. This example JSON demonstrates how to create a job trigger in Sensitive Data Protection. The job this trigger kicks off is a Datastore inspection scan. The job trigger that is created runs every 86,400 seconds (that is, every 24 hours).

To quickly try this out, you can use the API Explorer that's embedded below. Keep in mind that a successful request, even one created in API Explorer, creates a new scheduled scan job. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

JSON input:

{
  "jobTrigger":{
    "displayName":"JobTrigger1",
    "description":"Starts an inspection of a Datastore kind",
    "triggers":[
      {
        "schedule":{
          "recurrencePeriodDuration":"86400s"
        }
      }
    ],
    "status":"HEALTHY",
    "inspectJob":{
      "storageConfig":{
        "datastoreOptions":{
          "kind":{
            "name":"Example-Kind"
          },
          "partitionId":{
            "projectId":"[PROJECT_ID]",
            "namespaceId":"[NAMESPACE_ID]"
          }
        }
      },
      "inspectConfig":{
        "infoTypes":[
          {
            "name":"PHONE_NUMBER"
          }
        ],
        "excludeInfoTypes":false,
        "includeQuote":true,
        "minLikelihood":"LIKELY"
      },
      "actions":[
        {
          "saveFindings":{
            "outputConfig":{
              "table":{
                "projectId":"[PROJECT_ID]",
                "datasetId":"[BIGQUERY_DATASET_NAME]",
                "tableId":"[BIGQUERY_TABLE_NAME]"
              }
            }
          }
        }
      ]
    }
  }
}

JSON output:

The following output indicates that the job trigger was successfully created.

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "displayName":"JobTrigger1",
  "description":"Starts an inspection of a Datastore kind",
  "inspectJob":{
    "storageConfig":{
      "datastoreOptions":{
        "partitionId":{
          "projectId":"[PROJECT_ID]",
          "namespaceId":"[NAMESPACE_ID]"
        },
        "kind":{
          "name":"Example-Kind"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "minLikelihood":"LIKELY",
      "limits":{

      },
      "includeQuote":true
    },
    "actions":[
      {
        "saveFindings":{
          "outputConfig":{
            "table":{
              "projectId":"[PROJECT_ID]",
              "datasetId":"[BIGQUERY_DATASET_NAME]",
              "tableId":"[BIGQUERY_TABLE_NAME]"
            }
          }
        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2018-11-30T01:52:41.171857Z",
  "updateTime":"2018-11-30T01:52:41.171857Z",
  "status":"HEALTHY"
}

List all jobs

To list all jobs for the current project:

Console

  1. In the Google Cloud console, go to the Sensitive Data Protection page.

    Go to Sensitive Data Protection

  2. Click the Inspection tab, and then click the Inspect jobs subtab.

The console displays a list of all jobs for the current project, including each job's identifier, state, creation time, and end time. You can get more information about any job, including a summary of its results, by clicking its identifier.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Api.Gax;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;

public class JobsList
{
    public static PagedEnumerable<ListDlpJobsResponse, DlpJob> ListDlpJobs(string projectId, string filter, DlpJobType jobType)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.ListDlpJobs(new ListDlpJobsRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
            Filter = filter,
            Type = jobType
        });

        // Uncomment to print jobs
        // foreach (var job in response)
        // {
        //     Console.WriteLine($"Job: {job.Name} status: {job.State}");
        // }

        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"google.golang.org/api/iterator"
)

// listJobs lists jobs matching the given optional filter and optional jobType.
func listJobs(w io.Writer, projectID, filter, jobType string) error {
	// projectID := "my-project-id"
	// filter := "`state` = FINISHED"
	// jobType := "RISK_ANALYSIS_JOB"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Create a configured request.
	req := &dlppb.ListDlpJobsRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Filter: filter,
		Type:   dlppb.DlpJobType(dlppb.DlpJobType_value[jobType]),
	}
	// Send the request and iterate over the results.
	it := client.ListDlpJobs(ctx, req)
	for {
		j, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return fmt.Errorf("Next: %w", err)
		}
		fmt.Fprintf(w, "Job %v status: %v\n", j.GetName(), j.GetState())
	}
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.DlpJobType;
import com.google.privacy.dlp.v2.ListDlpJobsRequest;
import com.google.privacy.dlp.v2.LocationName;
import java.io.IOException;

public class JobsList {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    listJobs(projectId);
  }

  // Lists DLP jobs
  public static void listJobs(String projectId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Construct the request to be sent by the client.
      // For more info on filters and job types,
      // see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
      ListDlpJobsRequest listDlpJobsRequest =
          ListDlpJobsRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setFilter("state=DONE")
              .setType(DlpJobType.valueOf("INSPECT_JOB"))
              .build();

      // Send the request to list jobs and process the response
      DlpServiceClient.ListDlpJobsPagedResponse response =
          dlpServiceClient.listDlpJobs(listDlpJobsRequest);

      System.out.println("DLP jobs found:");
      for (DlpJob dlpJob : response.getPage().getValues()) {
        System.out.println(dlpJob.getName() + " -- " + dlpJob.getState());
      }
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The filter expression to use
// For more information and filter syntax, see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
// const filter = `state=DONE`;

// The type of job to list (either 'INSPECT_JOB' or 'RISK_ANALYSIS_JOB')
// const jobType = 'INSPECT_JOB';
async function listJobs() {
  // Construct request for listing DLP scan jobs
  const request = {
    parent: `projects/${projectId}/locations/global`,
    filter: filter,
    type: jobType,
  };

  // Run job-listing request
  const [jobs] = await dlp.listDlpJobs(request);
  jobs.forEach(job => {
    console.log(`Job ${job.name} status: ${job.state}`);
  });
}

listJobs();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\DlpJobType;
use Google\Cloud\Dlp\V2\ListDlpJobsRequest;

/**
 * List Data Loss Prevention API jobs corresponding to a given filter.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 * @param string $filter            The filter expression to use
 */
function list_jobs(string $callingProjectId, string $filter): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // The type of job to list (either 'INSPECT_JOB' or 'RISK_ANALYSIS_JOB')
    $jobType = DlpJobType::INSPECT_JOB;

    // Run job-listing request
    // For more information and filter syntax,
    // @see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
    $parent = "projects/$callingProjectId/locations/global";
    $listDlpJobsRequest = (new ListDlpJobsRequest())
        ->setParent($parent)
        ->setFilter($filter)
        ->setType($jobType);
    $response = $dlp->listDlpJobs($listDlpJobsRequest);

    // Print job list
    $jobs = $response->iterateAllElements();
    foreach ($jobs as $job) {
        printf('Job %s status: %s' . PHP_EOL, $job->getName(), $job->getState());
        $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();

        if ($job->getState() == JobState::DONE) {
            if (count($infoTypeStats) > 0) {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of type %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            } else {
                print('  No findings.' . PHP_EOL);
            }
        }
    }
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


from typing import Optional

import google.cloud.dlp

def list_dlp_jobs(
    project: str, filter_string: Optional[str] = None, job_type: Optional[str] = None
) -> None:
    """Uses the Data Loss Prevention API to list DLP jobs that match the
        specified filter in the request.
    Args:
        project: The project id to use as a parent resource.
        filter_string: (Optional) Allows filtering.
            Supported syntax:
            * Filter expressions are made up of one or more restrictions.
            * Restrictions can be combined by 'AND' or 'OR' logical operators.
            A sequence of restrictions implicitly uses 'AND'.
            * A restriction has the form of '<field> <operator> <value>'.
            * Supported fields/values for inspect jobs:
                - `state` - PENDING|RUNNING|CANCELED|FINISHED|FAILED
                - `inspected_storage` - DATASTORE|CLOUD_STORAGE|BIGQUERY
                - `trigger_name` - The resource name of the trigger that
                                   created job.
            * Supported fields for risk analysis jobs:
                - `state` - RUNNING|CANCELED|FINISHED|FAILED
            * The operator must be '=' or '!='.
            Examples:
            * inspected_storage = cloud_storage AND state = done
            * inspected_storage = cloud_storage OR inspected_storage = bigquery
            * inspected_storage = cloud_storage AND
                                  (state = done OR state = canceled)
        job_type: (Optional) The type of job. Defaults to 'INSPECT_JOB'.
            Choices:
            DLP_JOB_TYPE_UNSPECIFIED
            INSPECT_JOB: The job inspected content for sensitive data.
            RISK_ANALYSIS_JOB: The job executed a Risk Analysis computation.

    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Job type dictionary
    job_type_to_int = {
        "DLP_JOB_TYPE_UNSPECIFIED": google.cloud.dlp.DlpJobType.DLP_JOB_TYPE_UNSPECIFIED,
        "INSPECT_JOB": google.cloud.dlp.DlpJobType.INSPECT_JOB,
        "RISK_ANALYSIS_JOB": google.cloud.dlp.DlpJobType.RISK_ANALYSIS_JOB,
    }
    # If job type is specified, convert job type to number through enums.
    if job_type:
        job_type = job_type_to_int[job_type]

    # Call the API to get a list of jobs.
    response = dlp.list_dlp_jobs(
        request={"parent": parent, "filter": filter_string, "type_": job_type}
    )

    # Iterate over results.
    for job in response:
        print(f"Job: {job.name}; status: {job.state.name}")

REST

The DlpJob resource has a projects.dlpJobs.list method, which you can use to list all jobs.

To list all jobs currently defined in your project, send a GET request to the dlpJobs endpoint, as shown here:

URL

GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs?key={YOUR_API_KEY}

The following JSON output lists one of the jobs returned. Note that the structure of the job mirrors that of the DlpJob resource.

JSON output:

{
  "jobs":[
    {
      "name":"projects/[PROJECT-ID]/dlpJobs/i-5270277269264714623",
      "type":"INSPECT_JOB",
      "state":"DONE",
      "inspectDetails":{
        "requestedOptions":{
          "snapshotInspectTemplate":{
          },
          "jobConfig":{
            "storageConfig":{
              "cloudStorageOptions":{
                "fileSet":{
                  "url":"[CLOUD-STORAGE-URL]"
                },
                "fileTypes":[
                  "FILE_TYPE_UNSPECIFIED"
                ],
                "filesLimitPercent":100
              },
              "timespanConfig":{
                "startTime":"2019-09-08T22:43:16.623Z",
                "enableAutoPopulationOfTimespanConfig":true
              }
            },
            "inspectConfig":{
              "infoTypes":[
                {
                  "name":"US_SOCIAL_SECURITY_NUMBER"
                },
                {
                  "name":"CANADA_SOCIAL_INSURANCE_NUMBER"
                }
              ],
              "minLikelihood":"LIKELY",
              "limits":{
              },
              "includeQuote":true
            },
            "actions":[
              {
                "saveFindings":{
                  "outputConfig":{
                    "table":{
                      "projectId":"[PROJECT-ID]",
                      "datasetId":"[DATASET-ID]",
                      "tableId":"[TABLE-ID]"
                    }
                  }
                }
              }
            ]
          }
        },
        "result":{
          ...
        }
      },
      "createTime":"2019-09-09T22:43:16.918Z",
      "startTime":"2019-09-09T22:43:16.918Z",
      "endTime":"2019-09-09T22:43:53.091Z",
      "jobTriggerName":"projects/[PROJECT-ID]/jobTriggers/sample-trigger2"
    },
    ...

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

List all job triggers

To list all job triggers for the current project, do the following:

Console

In the Google Cloud console, go to the Sensitive Data Protection page.

Go to Sensitive Data Protection

On the Inspection tab, on the Job triggers subtab, the console displays a list of all job triggers for the current project.

C#

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Api.Gax;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using System;

public class TriggersList
{
    public static PagedEnumerable<ListJobTriggersResponse, JobTrigger> List(string projectId)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.ListJobTriggers(
            new ListJobTriggersRequest
            {
                Parent = new LocationName(projectId, "global").ToString(),
            });

        foreach (var trigger in response)
        {
            Console.WriteLine($"Name: {trigger.Name}");
            Console.WriteLine($"  Created: {trigger.CreateTime}");
            Console.WriteLine($"  Updated: {trigger.UpdateTime}");
            Console.WriteLine($"  Display Name: {trigger.DisplayName}");
            Console.WriteLine($"  Description: {trigger.Description}");
            Console.WriteLine($"  Status: {trigger.Status}");
            Console.WriteLine($"  Error count: {trigger.Errors.Count}");
        }

        return response;
    }
}

Go

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"
	"time"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"github.com/golang/protobuf/ptypes"
	"google.golang.org/api/iterator"
)

// listTriggers lists the triggers for the given project.
func listTriggers(w io.Writer, projectID string) error {
	// projectID := "my-project-id"

	ctx := context.Background()

	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Create a configured request.
	req := &dlppb.ListJobTriggersRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
	}
	// Send the request and iterate over the results.
	it := client.ListJobTriggers(ctx, req)
	for {
		t, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return fmt.Errorf("Next: %w", err)
		}
		fmt.Fprintf(w, "Trigger %v\n", t.GetName())
		c, err := ptypes.Timestamp(t.GetCreateTime())
		if err != nil {
			return fmt.Errorf("CreateTime Timestamp: %w", err)
		}
		fmt.Fprintf(w, "  Created: %v\n", c.Format(time.RFC1123))
		u, err := ptypes.Timestamp(t.GetUpdateTime())
		if err != nil {
			return fmt.Errorf("UpdateTime Timestamp: %w", err)
		}
		fmt.Fprintf(w, "  Updated: %v\n", u.Format(time.RFC1123))
		fmt.Fprintf(w, "  Display Name: %q\n", t.GetDisplayName())
		fmt.Fprintf(w, "  Description: %q\n", t.GetDescription())
		fmt.Fprintf(w, "  Status: %v\n", t.GetStatus())
		fmt.Fprintf(w, "  Error Count: %v\n", len(t.GetErrors()))
	}

	return nil
}

Java

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.ListJobTriggersRequest;
import com.google.privacy.dlp.v2.LocationName;
import java.io.IOException;

class TriggersList {
  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    listTriggers(projectId);
  }

  public static void listTriggers(String projectId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Build the request to be sent by the client
      ListJobTriggersRequest listJobTriggersRequest =
          ListJobTriggersRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .build();

      // Use the client to send the API request.
      DlpServiceClient.ListJobTriggersPagedResponse response =
          dlpServiceClient.listJobTriggers(listJobTriggersRequest);

      // Parse the response and process the results
      System.out.println("DLP triggers found:");
      for (JobTrigger trigger : response.getPage().getValues()) {
        System.out.println("Trigger: " + trigger.getName());
        System.out.println("\tCreated: " + trigger.getCreateTime());
        System.out.println("\tUpdated: " + trigger.getUpdateTime());
        if (trigger.getDisplayName() != null) {
          System.out.println("\tDisplay name: " + trigger.getDisplayName());
        }
        if (trigger.getDescription() != null) {
          System.out.println("\tDescription: " + trigger.getDescription());
        }
        System.out.println("\tStatus: " + trigger.getStatus());
        System.out.println("\tError count: " + trigger.getErrorsCount());
      }
    }
  }
}

Node.js

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project'

async function listTriggers() {
  // Construct trigger listing request
  const request = {
    parent: `projects/${projectId}/locations/global`,
  };

  // Helper function to pretty-print dates
  const formatDate = date => {
    const msSinceEpoch = parseInt(date.seconds, 10) * 1000;
    return new Date(msSinceEpoch).toLocaleString('en-US');
  };

  // Run trigger listing request
  const [triggers] = await dlp.listJobTriggers(request);
  triggers.forEach(trigger => {
    // Log trigger details
    console.log(`Trigger ${trigger.name}:`);
    console.log(`  Created: ${formatDate(trigger.createTime)}`);
    console.log(`  Updated: ${formatDate(trigger.updateTime)}`);
    if (trigger.displayName) {
      console.log(`  Display Name: ${trigger.displayName}`);
    }
    if (trigger.description) {
      console.log(`  Description: ${trigger.description}`);
    }
    console.log(`  Status: ${trigger.status}`);
    console.log(`  Error count: ${trigger.errors.length}`);
  });
}

listTriggers();

PHP

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\ListJobTriggersRequest;

/**
 * List Data Loss Prevention API job triggers.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 */
function list_triggers(string $callingProjectId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    $parent = "projects/$callingProjectId/locations/global";

    // Run request
    $listJobTriggersRequest = (new ListJobTriggersRequest())
        ->setParent($parent);
    $response = $dlp->listJobTriggers($listJobTriggersRequest);

    // Print results
    $triggers = $response->iterateAllElements();
    foreach ($triggers as $trigger) {
        printf('Trigger %s' . PHP_EOL, $trigger->getName());
        printf('  Created: %s' . PHP_EOL, $trigger->getCreateTime()->getSeconds());
        printf('  Updated: %s' . PHP_EOL, $trigger->getUpdateTime()->getSeconds());
        printf('  Display Name: %s' . PHP_EOL, $trigger->getDisplayName());
        printf('  Description: %s' . PHP_EOL, $trigger->getDescription());
        printf('  Status: %s' . PHP_EOL, $trigger->getStatus());
        printf('  Error count: %s' . PHP_EOL, count($trigger->getErrors()));
        $timespanConfig = $trigger->getInspectJob()->getStorageConfig()->getTimespanConfig();
        printf('  Auto-populates timespan config: %s' . PHP_EOL,
            ($timespanConfig && $timespanConfig->getEnableAutoPopulationOfTimespanConfig() ? 'yes' : 'no'));
    }
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import google.cloud.dlp

def list_triggers(project: str) -> None:
    """Lists all Data Loss Prevention API triggers.
    Args:
        project: The Google Cloud project id to use as a parent resource.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.list_job_triggers(request={"parent": parent})

    for trigger in response:
        print(f"Trigger {trigger.name}:")
        print(f"  Created: {trigger.create_time}")
        print(f"  Updated: {trigger.update_time}")
        if trigger.display_name:
            print(f"  Display Name: {trigger.display_name}")
        if trigger.description:
            print(f"  Description: {trigger.description}")
        print(f"  Status: {trigger.status}")
        print(f"  Error count: {len(trigger.errors)}")

REST

The JobTrigger resource has a projects.jobTriggers.list method, which you can use to list all job triggers.

To list all job triggers currently defined in your project, send a GET request to the jobTriggers endpoint, as shown here:

URL

GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/jobTriggers?key={YOUR_API_KEY}

The following JSON output lists the job trigger created in the previous section. Note that the structure of the job trigger mirrors that of the JobTrigger resource.

JSON output:

{
  "jobTriggers":[
    {
      "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
      "displayName":"JobTrigger1",
      "description":"Starts an inspection of a Datastore kind",
      "inspectJob":{
        "storageConfig":{
          "datastoreOptions":{
            "partitionId":{
              "projectId":"[PROJECT_ID]",
              "namespaceId":"[NAMESPACE_ID]"
            },
            "kind":{
              "name":"Example-Kind"
            }
          }
        },
        "inspectConfig":{
          "infoTypes":[
            {
              "name":"PHONE_NUMBER"
            }
          ],
          "minLikelihood":"LIKELY",
          "limits":{

          },
          "includeQuote":true
        },
        "actions":[
          {
            "saveFindings":{
              "outputConfig":{
                "table":{
                  "projectId":"[PROJECT_ID]",
                  "datasetId":"[BIGQUERY_DATASET_NAME]",
                  "tableId":"[BIGQUERY_TABLE_NAME]"
                }
              }
            }
          }
        ]
      },
      "triggers":[
        {
          "schedule":{
            "recurrencePeriodDuration":"86400s"
          }
        }
      ],
      "createTime":"2018-11-30T01:52:41.171857Z",
      "updateTime":"2018-11-30T01:52:41.171857Z",
      "status":"HEALTHY"
    },

    ...

],
  "nextPageToken":"KkwKCQjivJ2UpPreAgo_Kj1wcm9qZWN0cy92ZWx2ZXR5LXN0dWR5LTE5NjEwMS9qb2JUcmlnZ2Vycy8xNTA5NzEyOTczMDI0MDc1NzY0"
}

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

Delete a job

To delete a job from your project, which includes its results, do the following. Any results saved externally (such as to BigQuery) are unaffected by this operation.

Console

  1. In the Google Cloud console, go to the Sensitive Data Protection page.

    Go to Sensitive Data Protection

  2. Click the Inspection tab, and then click the Inspect jobs subtab. The Google Cloud console displays a list of all jobs for the current project.

  3. In the Actions column for the job you want to delete, click the More actions menu (displayed as three vertically arranged dots), and then click Delete.

Alternatively, from the list of jobs, click the identifier of the job you want to delete. On the job's detail page, click Delete.

C#

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using System;
using Google.Cloud.Dlp.V2;

public class JobsDelete
{
    public static void DeleteJob(string jobName)
    {
        var dlp = DlpServiceClient.Create();

        dlp.DeleteDlpJob(new DeleteDlpJobRequest
        {
            Name = jobName
        });

        Console.WriteLine($"Successfully deleted job {jobName}.");
    }
}

Go

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// deleteJob deletes the job with the given name.
func deleteJob(w io.Writer, jobName string) error {
	// jobName := "job-example"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()
	req := &dlppb.DeleteDlpJobRequest{
		Name: jobName,
	}
	if err = client.DeleteDlpJob(ctx, req); err != nil {
		return fmt.Errorf("DeleteDlpJob: %w", err)
	}
	fmt.Fprintf(w, "Successfully deleted job")
	return nil
}

Java

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DeleteDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJobName;
import java.io.IOException;

public class JobsDelete {
  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String jobId = "your-job-id";
    deleteJobs(projectId, jobId);
  }

  // Deletes a DLP Job with the given jobId
  public static void deleteJobs(String projectId, String jobId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Construct the complete job name from the projectId and jobId
      DlpJobName jobName = DlpJobName.of(projectId, jobId);

      // Construct the job deletion request to be sent by the client.
      DeleteDlpJobRequest deleteDlpJobRequest =
          DeleteDlpJobRequest.newBuilder().setName(jobName.toString()).build();

      // Send the job deletion request
      dlpServiceClient.deleteDlpJob(deleteDlpJobRequest);
      System.out.println("Job deleted successfully.");
    }
  }
}

Node.js

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The name of the job whose results should be deleted
// Parent project ID is automatically extracted from this parameter
// const jobName = 'projects/my-project/dlpJobs/X-#####'

function deleteJob() {
  // Construct job deletion request
  const request = {
    name: jobName,
  };

  // Run job deletion request
  dlp
    .deleteDlpJob(request)
    .then(() => {
      console.log(`Successfully deleted job ${jobName}.`);
    })
    .catch(err => {
      console.log(`Error in deleteJob: ${err.message || err}`);
    });
}

deleteJob();

PHP

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DeleteDlpJobRequest;

/**
 * Delete results of a Data Loss Prevention API job
 *
 * @param string $jobId The name of the job whose results should be deleted
 */
function delete_job(string $jobId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run job-deletion request
    // The Parent project ID is automatically extracted from this parameter
    $deleteDlpJobRequest = (new DeleteDlpJobRequest())
        ->setName($jobId);
    $dlp->deleteDlpJob($deleteDlpJobRequest);

    // Print status
    printf('Successfully deleted job %s' . PHP_EOL, $jobId);
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import google.cloud.dlp

def delete_dlp_job(project: str, job_name: str) -> None:
    """Uses the Data Loss Prevention API to delete a long-running DLP job.
    Args:
        project: The project id to use as a parent resource.
        job_name: The name of the DlpJob resource to be deleted.

    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id and job name into a full resource id.
    name = f"projects/{project}/dlpJobs/{job_name}"

    # Call the API to delete job.
    dlp.delete_dlp_job(request={"name": name})

    print(f"Successfully deleted {job_name}")

REST

To delete a job from the current project, send a DELETE request to the dlpJobs endpoint, as shown here. Replace the [JOB-IDENTIFIER] field with the job's identifier, which starts with i-.

URL

DELETE https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs/[JOB-IDENTIFIER]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response. To verify that the job was successfully deleted, list all jobs.

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

Delete a job trigger

Console

  1. In the Google Cloud console, go to the Sensitive Data Protection page.

    Go to Sensitive Data Protection

    On the Inspection tab, on the Job triggers subtab, the console displays a list of all job triggers for the current project.

  2. In the Actions column for the job trigger you want to delete, click the More actions menu (displayed as three vertically arranged dots), and then click Delete.

Alternatively, from the list of job triggers, click the name of the job trigger you want to delete. On the job trigger's detail page, click Delete.

C#

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Cloud.Dlp.V2;
using System;

public class TriggersDelete
{

    public static void Delete(string triggerName)
    {
        var dlp = DlpServiceClient.Create();

        dlp.DeleteJobTrigger(
            new DeleteJobTriggerRequest
            {
                Name = triggerName
            });

        Console.WriteLine($"Successfully deleted trigger {triggerName}.");
    }
}

Go

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// deleteTrigger deletes the given trigger.
func deleteTrigger(w io.Writer, triggerID string) error {
	// triggerID := "my-trigger"

	ctx := context.Background()

	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	req := &dlppb.DeleteJobTriggerRequest{
		Name: triggerID,
	}

	if err := client.DeleteJobTrigger(ctx, req); err != nil {
		return fmt.Errorf("DeleteJobTrigger: %w", err)
	}
	fmt.Fprintf(w, "Successfully deleted trigger %v", triggerID)
	return nil
}

Java

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DeleteJobTriggerRequest;
import com.google.privacy.dlp.v2.ProjectJobTriggerName;
import java.io.IOException;

class TriggersDelete {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String triggerId = "your-trigger-id";
    deleteTrigger(projectId, triggerId);
  }

  public static void deleteTrigger(String projectId, String triggerId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Get the full trigger name from the given triggerId and ProjectId
      ProjectJobTriggerName triggerName = ProjectJobTriggerName.of(projectId, triggerId);

      // Construct the trigger deletion request to be sent by the client
      DeleteJobTriggerRequest deleteJobTriggerRequest =
          DeleteJobTriggerRequest.newBuilder().setName(triggerName.toString()).build();

      // Send the trigger deletion request
      dlpServiceClient.deleteJobTrigger(deleteJobTriggerRequest);
      System.out.println("Trigger deleted: " + triggerName.toString());
    }
  }
}

Node.js

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project'

// The name of the trigger to be deleted
// Parent project ID is automatically extracted from this parameter
// const triggerId = 'projects/my-project/triggers/my-trigger';

async function deleteTrigger() {
  // Construct trigger deletion request
  const request = {
    name: triggerId,
  };

  // Run trigger deletion request
  await dlp.deleteJobTrigger(request);
  console.log(`Successfully deleted trigger ${triggerId}.`);
}

deleteTrigger();

PHP

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DeleteJobTriggerRequest;

/**
 * Delete a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 * @param string $triggerId         The name of the trigger to be deleted.
 */
function delete_trigger(string $callingProjectId, string $triggerId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run request
    // The Parent project ID is automatically extracted from this parameter
    $triggerName = "projects/$callingProjectId/locations/global/jobTriggers/$triggerId";
    $deleteJobTriggerRequest = (new DeleteJobTriggerRequest())
        ->setName($triggerName);
    $dlp->deleteJobTrigger($deleteJobTriggerRequest);

    // Print the results
    printf('Successfully deleted trigger %s' . PHP_EOL, $triggerName);
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import google.cloud.dlp

def delete_trigger(project: str, trigger_id: str) -> None:
    """Deletes a Data Loss Prevention API trigger.
    Args:
        project: The id of the Google Cloud project which owns the trigger.
        trigger_id: The id of the trigger to delete.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Combine the trigger id with the parent id.
    trigger_resource = f"{parent}/jobTriggers/{trigger_id}"

    # Call the API.
    dlp.delete_job_trigger(request={"name": trigger_resource})

    print(f"Trigger {trigger_resource} successfully deleted.")

REST

To delete a job trigger from the current project, send a DELETE request to the jobTriggers endpoint, as shown here. Replace the [JOB-TRIGGER-NAME] field with the name of the job trigger.

URL

DELETE https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/jobTriggers/[JOB-TRIGGER-NAME]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response. To verify that the job trigger was successfully deleted, list all job triggers.

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

Get a job

To retrieve a job from your project, which includes its results, do the following. Any results saved externally (such as to BigQuery) are unaffected by this operation.

C#

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Cloud.Dlp.V2;
using System;

public class JobsGet
{
    public static DlpJob GetDlpJob(string jobName)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.GetDlpJob(jobName);

        Console.WriteLine($"Job: {response.Name} status: {response.State}");

        return response;
    }
}

Go

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// jobsGet gets an inspection job using jobName
func jobsGet(w io.Writer, projectID string, jobName string) error {
	// projectId := "my-project-id"
	// jobName := "your-job-id"

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Construct the request to be sent by the client.
	req := &dlppb.GetDlpJobRequest{
		Name: jobName,
	}

	// Send the request.
	resp, err := client.GetDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the results.
	fmt.Fprintf(w, "Job Name: %v Job Status: %v", resp.Name, resp.State)
	return nil
}

Java

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DlpJobName;
import com.google.privacy.dlp.v2.GetDlpJobRequest;
import java.io.IOException;

public class JobsGet {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String jobId = "your-job-id";
    getJobs(projectId, jobId);
  }

  // Gets a DLP Job with the given jobId
  public static void getJobs(String projectId, String jobId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Construct the complete job name from the projectId and jobId
      DlpJobName jobName = DlpJobName.of(projectId, jobId);

      // Construct the get job request to be sent by the client.
      GetDlpJobRequest getDlpJobRequest =
          GetDlpJobRequest.newBuilder().setName(jobName.toString()).build();

      // Send the get job request
      dlpServiceClient.getDlpJob(getDlpJobRequest);
      System.out.println("Job retrieved successfully.");
    }
  }
}

Node.js

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// Job name to look for
// const jobName = 'your-job-name';

async function getJob() {
  // Construct request for finding job using job name.
  const request = {
    name: jobName,
  };

  // Send the request and receive response from the service
  const [job] = await dlp.getDlpJob(request);

  // Print results.
  console.log(`Job ${job.name} status: ${job.state}`);
}

getJob();

PHP

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;

/**
 * Get DLP inspection job.
 * @param string $jobName           Dlp job name
 */
function get_job(
    string $jobName
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();
    try {
        // Send the get job request
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($jobName);
        $response = $dlp->getDlpJob($getDlpJobRequest);
        printf('Job %s status: %s' . PHP_EOL, $response->getName(), $response->getState());
    } finally {
        $dlp->close();
    }
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import google.cloud.dlp

def get_dlp_job(project: str, job_name: str) -> None:
    """Uses the Data Loss Prevention API to retrieve a DLP job.
    Args:
        project: The project id to use as a parent resource.
        job_name: The name of the DlpJob resource to be retrieved.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id and job name into a full resource id.
    job_name = f"projects/{project}/locations/global/dlpJobs/{job_name}"

    # Call the API
    response = dlp.get_dlp_job(request={"name": job_name})

    print(f"Job: {response.name} Status: {response.state}")

REST

To get a job from the current project, send a GET request to the dlpJobs endpoint, as shown here. Replace the [JOB-IDENTIFIER] field with the job's identifier, which starts with i-.

URL

GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs/[JOB-IDENTIFIER]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response.

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

Force an immediate run of a job trigger

After you create a job trigger, you can force an immediate run of the trigger for testing by activating it. To do so, run the following command:

curl --request POST \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "X-Goog-User-Project: PROJECT_ID" \
    'https://dlp.googleapis.com/v2/JOB_TRIGGER_NAME:activate'

Replace the following:

  • PROJECT_ID: the ID of the Google Cloud project to bill for the access charges associated with the request.
  • JOB_TRIGGER_NAME: the full resource name of the job trigger, for example projects/my-project/locations/global/jobTriggers/123456789.
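
If you use a client library, you can activate a trigger programmatically in the same way. The following is a minimal Python sketch; it assumes the client's generated activate_job_trigger method, and the project and trigger IDs are placeholders:

import google.cloud.dlp

def activate_trigger(project: str, trigger_id: str) -> None:
    """Forces an immediate run of a job trigger (illustrative sketch)."""
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Build the full resource name of the trigger, for example
    # projects/my-project/locations/global/jobTriggers/123456789.
    name = f"projects/{project}/locations/global/jobTriggers/{trigger_id}"

    # Activating a trigger starts a DlpJob immediately; the started job is returned.
    job = dlp.activate_job_trigger(request={"name": name})
    print(f"Started job: {job.name}")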

Update an existing job trigger

In addition to creating, listing, and deleting job triggers, you can also update an existing job trigger. To change the configuration of an existing job trigger, do the following:

Console

  1. In the Google Cloud console, go to the Sensitive Data Protection page.

    Go to Sensitive Data Protection

  2. Click the Inspection tab, and then click the Job triggers subtab.

    The console displays a list of all job triggers for the current project.

  3. In the Actions column for the job trigger you want to update, click More, and then click View details.

  4. On the job trigger detail page, click Edit.

  5. On the Edit trigger page, you can change the location of the input data; detection details such as templates, infoTypes, or likelihood; any post-scan actions; and the job trigger's schedule. When you're done making changes, click Save.

C#

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


using Google.Cloud.Dlp.V2;
using Google.Protobuf.WellKnownTypes;
using System;
using System.Collections.Generic;

public class TriggersUpdate
{
    public static JobTrigger UpdateJob(
        string projectId,
        string triggerId,
        IEnumerable<InfoType> infoTypes = null,
        Likelihood minLikelihood = Likelihood.Likely)
    {
        // Instantiate the client.
        var dlp = DlpServiceClient.Create();

        // Construct the update job trigger request object by providing the trigger name,
        // job trigger object which will specify the type of info to be inspected and
        // update mask object which specifies the field to be updated.
        // Refer to https://cloud.google.com/dlp/docs/reference/rest/v2/Container for specifying the paths in container object.
        var request = new UpdateJobTriggerRequest
        {
            JobTriggerName = new JobTriggerName(projectId, triggerId),
            JobTrigger = new JobTrigger
            {
                InspectJob = new InspectJobConfig
                {
                    InspectConfig = new InspectConfig
                    {
                        InfoTypes =
                        {
                            infoTypes ?? new InfoType[]
                            {
                                new InfoType { Name = "US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER" }
                            }
                        },
                        MinLikelihood = minLikelihood
                    }
                }
            },
            // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
            // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
            UpdateMask = new FieldMask
            {
                Paths =
                {
                    "inspect_job.inspect_config.info_types",
                    "inspect_job.inspect_config.min_likelihood"
                }
            }
        };

        // Call the API.
        JobTrigger response = dlp.UpdateJobTrigger(request);

        // Inspect the result.
        Console.WriteLine($"Job Trigger Name: {response.Name}");
        Console.WriteLine($"InfoType updated: {response.InspectJob.InspectConfig.InfoTypes[0]}");
        Console.WriteLine($"Likelihood updated: {response.InspectJob.InspectConfig.MinLikelihood}");
        return response;
    }
}

Go

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"google.golang.org/protobuf/types/known/fieldmaskpb"
)

// updateTrigger updates an existing job trigger in Google Cloud Data Loss Prevention (DLP).
// It modifies the configuration of the specified job trigger with the provided updated settings.
func updateTrigger(w io.Writer, jobTriggerName string) error {
	// jobTriggerName := "your-job-trigger-name" (projects/<projectID>/locations/global/jobTriggers/my-trigger)

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
	infoType := &dlppb.InfoType{
		Name: "PERSON_NAME",
	}

	// Specify the inspectConfig that represents the configuration settings for inspecting sensitive data in
	// DLP API. It includes detection types, custom info types, inspection methods, and actions
	// to be taken on detection.
	inspectConfig := &dlppb.InspectConfig{
		InfoTypes: []*dlppb.InfoType{
			infoType,
		},
		MinLikelihood: dlppb.Likelihood_LIKELY,
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		InspectConfig: inspectConfig,
	}

	// Specify the jobTrigger that represents a DLP job trigger configuration.
	// It defines the conditions, actions, and schedule for executing inspections
	// on sensitive data in the specified data storage.
	jobTrigger := &dlppb.JobTrigger{
		Job: &dlppb.JobTrigger_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// fieldMask represents a set of fields to be included in an update operation.
	// It is used to specify which fields of a resource should be updated.
	updateMask := &fieldmaskpb.FieldMask{
		Paths: []string{"inspect_job.inspect_config.info_types", "inspect_job.inspect_config.min_likelihood"},
	}

	// Combine configurations into a request for the service.
	req := &dlppb.UpdateJobTriggerRequest{
		Name:       jobTriggerName,
		JobTrigger: jobTrigger,
		UpdateMask: updateMask,
	}

	// Send the scan request and process the response
	resp, err := client.UpdateJobTrigger(ctx, req)
	if err != nil {
		return err
	}

	// Print the result.
	fmt.Fprintf(w, "Successfully Updated trigger: %v", resp)
	return nil

}

Java

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.JobTriggerName;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.UpdateJobTriggerRequest;
import com.google.protobuf.FieldMask;
import java.io.IOException;

public class TriggersPatch {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.

    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The name of the job trigger to be updated.
    String jobTriggerName = "your-job-trigger-name";
    patchTrigger(projectId, jobTriggerName);
  }

  // Uses the Data Loss Prevention API to update an existing job trigger.
  public static void patchTrigger(String projectId, String jobTriggerName) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      InfoType infoType = InfoType.newBuilder().setName("PERSON_NAME").build();

      InspectConfig inspectConfig = InspectConfig.newBuilder()
              .addInfoTypes(infoType)
              .setMinLikelihood(Likelihood.LIKELY)
              .build();

      InspectJobConfig inspectJobConfig = InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .build();

      JobTrigger jobTrigger = JobTrigger.newBuilder()
              .setInspectJob(inspectJobConfig)
              .build();

      // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
      // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
      FieldMask fieldMask = FieldMask.newBuilder()
              .addPaths("inspect_job.inspect_config.info_types")
              .addPaths("inspect_job.inspect_config.min_likelihood")
              .build();

      // Update the job trigger with the new configuration.
      UpdateJobTriggerRequest updateJobTriggerRequest = UpdateJobTriggerRequest.newBuilder()
              .setName(JobTriggerName.of(projectId, jobTriggerName).toString())
              .setJobTrigger(jobTrigger)
              .setUpdateMask(fieldMask)
              .build();

      // Call the API to update the job trigger.
      JobTrigger updatedJobTrigger = dlpServiceClient.updateJobTrigger(updateJobTriggerRequest);

      System.out.println("Job Trigger Name: " + updatedJobTrigger.getName());
      System.out.println(
          "InfoType updated: "
              + updatedJobTrigger.getInspectJob().getInspectConfig().getInfoTypes(0).getName());
      System.out.println(
          "Likelihood updated: "
              + updatedJobTrigger.getInspectJob().getInspectConfig().getMinLikelihood());
    }
  }
}

Node.js

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlpClient = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The job trigger ID to run the API call under
// const jobTriggerName = 'your-job-trigger-name';

async function updateTrigger() {
  // Construct inspect configuration to match PERSON_NAME infotype
  const inspectConfig = {
    infoTypes: [{name: 'PERSON_NAME'}],
    minLikelihood: 'LIKELY',
  };

  // Configure the job trigger we want to update.
  const jobTrigger = {inspectJob: {inspectConfig}};

  const updateMask = {
    paths: [
      'inspect_job.inspect_config.info_types',
      'inspect_job.inspect_config.min_likelihood',
    ],
  };

  // Combine configurations into a request for the service.
  const request = {
    name: `projects/${projectId}/jobTriggers/${jobTriggerName}`,
    jobTrigger,
    updateMask,
  };

  // Send the request and receive response from the service
  const [updatedJobTrigger] = await dlpClient.updateJobTrigger(request);

  // Print the results
  console.log(`Updated Trigger: ${JSON.stringify(updatedJobTrigger)}`);
}
updateTrigger();

PHP

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\DlpServiceClient;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Protobuf\FieldMask;

/**
 * Update an existing job trigger.
 *
 * @param string $callingProjectId  The Google Cloud Project ID to run the API call under.
 * @param string $jobTriggerName    The job trigger name to update.
 *
 */
function update_trigger(
    string $callingProjectId,
    string $jobTriggerName
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Configure the inspectConfig.
    $inspectConfig = (new InspectConfig())
        ->setInfoTypes([
            (new InfoType())
                ->setName('US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER')
        ])
        ->setMinLikelihood(Likelihood::LIKELY);

    // Configure the Job Trigger we want the service to perform.
    $jobTrigger = (new JobTrigger())
        ->setInspectJob((new InspectJobConfig())
            ->setInspectConfig($inspectConfig));

    // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
    // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
    $fieldMask = (new FieldMask())
        ->setPaths([
            'inspect_job.inspect_config.info_types',
            'inspect_job.inspect_config.min_likelihood'
        ]);

    // Send the update job trigger request and process the response.
    $name = "projects/$callingProjectId/locations/global/jobTriggers/" . $jobTriggerName;

    $response = $dlp->updateJobTrigger($name, [
        'jobTrigger' => $jobTrigger,
        'updateMask' => $fieldMask
    ]);

    // Print results.
    printf('Successfully updated trigger %s' . PHP_EOL, $response->getName());
}

Python

For information about how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

from typing import List

import google.cloud.dlp

def update_trigger(
    project: str,
    info_types: List[str],
    trigger_id: str,
) -> None:
    """Uses the Data Loss Prevention API to update an existing job trigger.
    Args:
        project: The Google Cloud project id to use as a parent resource
        info_types: A list of strings representing infoTypes to update trigger with.
            A full list of infoType categories can be fetched from the API.
        trigger_id: The id of job trigger which needs to be updated.
    """

    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Specify fields of the jobTrigger resource to be updated when the
    # job trigger is modified.
    job_trigger = {
        "inspect_job": {
            "inspect_config": {
                "info_types": info_types,
                "min_likelihood": google.cloud.dlp_v2.Likelihood.LIKELY,
            }
        }
    }

    # Convert the project id into a full resource id.
    trigger_name = f"projects/{project}/jobTriggers/{trigger_id}"

    # Call the API.
    # Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask
    # for constructing the field mask paths.
    response = dlp.update_job_trigger(
        request={
            "name": trigger_name,
            "job_trigger": job_trigger,
            "update_mask": {
                "paths": [
                    "inspect_job.inspect_config.info_types",
                    "inspect_job.inspect_config.min_likelihood",
                ]
            },
        }
    )

    # Print out the result.
    print(f"Successfully updated trigger: {response.name}")
    print(
        f"Updated InfoType: {response.inspect_job.inspect_config.info_types[0].name}"
        f" \nUpdates Likelihood: {response.inspect_job.inspect_config.min_likelihood}\n",
    )

REST

Use the projects.jobTriggers.patch method to send new JobTrigger values to the DLP API to update those values within a specified job trigger.

For example, consider the following simple job trigger. This JSON represents the job trigger, and was returned after a GET request was sent to the current project's job trigger endpoint.

JSON output:

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://dlptesting/*"
        },
        "fileTypes":[
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent":100
      },
      "timespanConfig":{
        "enableAutoPopulationOfTimespanConfig":true
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"US_SOCIAL_SECURITY_NUMBER"
        }
      ],
      "minLikelihood":"POSSIBLE",
      "limits":{

      }
    },
    "actions":[
      {
        "jobNotificationEmails":{

        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2019-03-06T21:19:45.774841Z",
  "updateTime":"2019-03-06T21:19:45.774841Z",
  "status":"HEALTHY"
}

The following JSON, when sent with a PATCH request to the specified endpoint, updates the given job trigger with a new infoType to scan for, as well as a new minimum likelihood. Note that you must also specify the updateMask attribute, and that its value is in FieldMask format.

JSON input:

PATCH https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]?key={YOUR_API_KEY}

{
  "jobTrigger":{
    "inspectJob":{
      "inspectConfig":{
        "infoTypes":[
          {
            "name":"US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
          }
        ],
        "minLikelihood":"LIKELY"
      }
    }
  },
  "updateMask":"inspectJob(inspectConfig(infoTypes,minLikelihood))"
}

When you send this JSON to the specified URL, it returns the following, which represents the updated job trigger. Note that the original infoType and likelihood values have been replaced by the new values.

JSON output:

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://dlptesting/*"
        },
        "fileTypes":[
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent":100
      },
      "timespanConfig":{
        "enableAutoPopulationOfTimespanConfig":true
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
        }
      ],
      "minLikelihood":"LIKELY",
      "limits":{

      }
    },
    "actions":[
      {
        "jobNotificationEmails":{

        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2019-03-06T21:19:45.774841Z",
  "updateTime":"2019-03-06T21:27:01.650183Z",
  "lastRunTime":"1970-01-01T00:00:00Z",
  "status":"HEALTHY"
}

To quickly try this out, you can use the API Explorer embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.

Job latency

There are no guaranteed service level objectives (SLOs) for jobs and job triggers. Latency is affected by several factors, including the amount of data to scan, the storage repository being scanned, the type and number of infoTypes you are scanning for, the region where the job is processed, and the computing resources available in that region. Therefore, the latency of inspection jobs can't be determined in advance.

To help reduce job latency, you can try the following:

  • If sampling is available for your job or job trigger, enable it.
  • Avoid turning on infoTypes that you don't need. Although the following infoTypes are useful in certain scenarios, requests that include them can run much more slowly than requests that don't:

    • PERSON_NAME
    • FEMALE_NAME
    • MALE_NAME
    • FIRST_NAME
    • LAST_NAME
    • DATE_OF_BIRTH
    • LOCATION
    • STREET_ADDRESS
    • ORGANIZATION_NAME
  • Always specify infoTypes explicitly. Don't use an empty infoTypes list. (A sketch illustrating this and the sampling suggestion follows at the end of this section.)

  • If possible, use a different processing region.

If you still have latency issues with jobs after trying these techniques, consider using content.inspect or content.deidentify requests instead of jobs. These methods are covered by the Service Level Agreement. For more information, see the Sensitive Data Protection Service Level Agreement.
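
To illustrate the sampling and explicit-infoType suggestions, here is a minimal Python sketch of an inspect job configuration. The bucket name, infoTypes, and limit values are placeholders for illustration, not recommendations:

import google.cloud.dlp

def create_sampled_inspect_job(project: str, bucket: str) -> None:
    """Creates an inspect job that samples data and lists infoTypes explicitly
    (illustrative sketch)."""
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()
    parent = f"projects/{project}"

    inspect_job = {
        "storage_config": {
            "cloud_storage_options": {
                "file_set": {"url": f"gs://{bucket}/**"},
                # Sampling: scan at most 1 MB of each file and only 10% of files.
                "bytes_limit_per_file": 1048576,
                "files_limit_percent": 10,
            }
        },
        "inspect_config": {
            # An explicit infoType list instead of the (slower) default set.
            "info_types": [
                {"name": "EMAIL_ADDRESS"},
                {"name": "CREDIT_CARD_NUMBER"},
            ],
            "min_likelihood": google.cloud.dlp_v2.Likelihood.POSSIBLE,
        },
    }

    # Start the job.
    job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
    print(f"Created job: {job.name}")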

Scan only new content

You can configure a job trigger to automatically set the timespan date for files stored in Cloud Storage or BigQuery. When you set the TimespanConfig object to auto-populate, Sensitive Data Protection scans only data that was added or modified since the trigger last ran:

...
  timespan_config {
        enable_auto_population_of_timespan_config: true
      }
...
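
With the client libraries, the same behavior comes from setting enable_auto_population_of_timespan_config in the trigger's storage configuration. The following is a minimal Python sketch, with placeholder project, bucket, and infoType values, that creates a daily trigger configured this way:

import google.cloud.dlp

def create_trigger_for_new_content(project: str, bucket: str) -> None:
    """Creates a daily job trigger that scans only new or changed files
    (illustrative sketch)."""
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()
    parent = f"projects/{project}"

    job_trigger = {
        "inspect_job": {
            "storage_config": {
                "cloud_storage_options": {"file_set": {"url": f"gs://{bucket}/**"}},
                # Scan only content added or modified since the last run.
                "timespan_config": {"enable_auto_population_of_timespan_config": True},
            },
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        },
        # Run once every 24 hours (86400 seconds).
        "triggers": [{"schedule": {"recurrence_period_duration": {"seconds": 86400}}}],
        "status": google.cloud.dlp_v2.JobTrigger.Status.HEALTHY,
    }

    trigger = dlp.create_job_trigger(
        request={"parent": parent, "job_trigger": job_trigger}
    )
    print(f"Created trigger: {trigger.name}")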

Trigger jobs on file upload

In addition to the job trigger support that is built into Sensitive Data Protection, Google Cloud also has a variety of other components that you can use to integrate or trigger Sensitive Data Protection jobs. For example, you can use Cloud Functions to trigger a Sensitive Data Protection scan every time a file is uploaded to Cloud Storage.

For information about how to set this up, see Automating the classification of data uploaded to Cloud Storage.

What's next