Creating Cloud DLP inspection jobs and job triggers

This topic describes in detail how to create a new job or job trigger. For a quick walkthrough of how to create a new job trigger using the Cloud DLP UI, see the quickstart for creating a Cloud DLP job trigger.

About jobs and job triggers

A job is an action that Cloud Data Loss Prevention (DLP) runs to scan content for sensitive data. Whenever you tell Cloud DLP to inspect your data, it creates and runs a job resource.

You can schedule when Cloud DLP runs jobs by creating job triggers. A job trigger is an event that automates the creation of Data Loss Prevention (DLP) jobs to scan Google Cloud Platform storage repositories, including Cloud Storage buckets, BigQuery tables, and Cloud Datastore kinds.

For conceptual information about jobs and job triggers in Cloud DLP, see Jobs and job triggers.

Creating a new job or job trigger

To create a new Cloud DLP job or job trigger:

Console

  1. In the GCP Console, open Cloud DLP.

    Go to the Cloud DLP UI beta

  2. From the Create menu, choose Job or job trigger.

    Screenshot of the DLP UI with the Create menu > Jobs or job triggers selected.

    Alternatively, click the following button:

    Create new job trigger

The Create job or job trigger page contains the following sections:

  1. Choose input data
  2. Configure detection
  3. Add actions
  4. Schedule
  5. Review

Choose input data

  1. On the Create job or job trigger page, first enter a name for the job or job trigger. You can use letters, numbers, and hyphens.
  2. Next, from the Storage type menu, choose the kind of repository that stores the data you want to scan: Cloud Storage, BigQuery, or Cloud Datastore.
    • For Cloud Storage, either enter the URL of the bucket you want to scan, or choose Include/exclude from the Location type menu, and then click Browse to navigate to the bucket or subfolder you want to scan. Select the Scan folder recursively checkbox to scan the specified directory and all contained directories. Clear the checkbox to scan only the specified directory and no deeper.
    • For BigQuery, enter the IDs of the project, dataset, and table that you want to scan.
    • For Cloud Datastore, enter the IDs of the project, namespace, and kind that you want to scan.
  3. To enter advanced configuration details, click Show Advanced Configuration, and then continue to the next section. Note that advanced configuration is available only for scans of Cloud Storage buckets and BigQuery tables.

After you've specified the data location and any advanced configuration details, you can either click Continue to keep fine-tuning the job or job trigger, or click Create to create the job right away.
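In the DLP API, the three repository choices map to mutually exclusive fields of the job's storageConfig. The following Python sketch shows the rough shape of each variant as plain request dictionaries; all project, bucket, dataset, table, namespace, and kind names are placeholders, not real resources:

```python
# Sketch of the three mutually exclusive storageConfig variants.
# All resource names below are placeholders.

# Cloud Storage: point at a bucket URL (or subfolder) to scan.
gcs_storage_config = {
    'cloud_storage_options': {
        'file_set': {'url': 'gs://example-bucket/**'},
    },
}

# BigQuery: identify the table by project, dataset, and table IDs.
bq_storage_config = {
    'big_query_options': {
        'table_reference': {
            'project_id': 'example-project',
            'dataset_id': 'example_dataset',
            'table_id': 'example_table',
        },
    },
}

# Cloud Datastore: identify the kind and its partition
# (project ID plus namespace ID).
datastore_storage_config = {
    'datastore_options': {
        'partition_id': {
            'project_id': 'example-project',
            'namespace_id': 'example-namespace',
        },
        'kind': {'name': 'Example-Kind'},
    },
}

# A job uses exactly one of these as its storage_config.
inspect_job = {'storage_config': bq_storage_config}
```

The protocol and client-library samples later in this topic embed dictionaries of exactly this shape into the job trigger's inspectJob.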

Advanced configuration

When you create a job or job trigger for scanning Cloud Storage buckets or BigQuery tables, you can narrow the scope of your search by specifying an advanced configuration. Specifically, you can configure:

  • Files (Cloud Storage only): The file types to scan, which include text, binary, and image files. You can also specify regular expressions (regexes) to include or exclude files that match one or more file patterns.
  • Identifying fields (BigQuery only): Unique row identifiers within the table.
  • Sampling: Whether to use sampling to scan only a subset of rows or files, and what percentage of files within the input location to scan.

Files

For files stored in Cloud Storage, you can specify the types to include in your scan under Files.

Screenshot of the DLP UI with the File types menu open.

You can choose from binary, text, and image files. The file extensions that Cloud DLP can scan in Cloud Storage buckets are listed in detail on the API reference page for FileType. Be aware that choosing Binary causes Cloud DLP to scan files whose types it can't recognize.

Under Include paths, you can specify one or more regular expressions so that only files within the bucket that match the given patterns are scanned. If you don't provide any paths, all files are scanned.

Under Exclude paths, you can specify one or more regular expressions so that files within the bucket that match the given patterns are skipped. If you don't provide any paths, no files are skipped.

To add a regex you want to include or exclude, click Add include regex or Add exclude regex, and then type the regular expression into the field. You can add as many fields as needed. To remove a field, click Delete item (the trash can icon) next to the field you want to delete.
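In the API, these file-type and path controls correspond to fields on CloudStorageOptions, where the include/exclude patterns live on a regex-based file set. A rough Python sketch, with a placeholder bucket name and patterns:

```python
# Sketch: narrowing a Cloud Storage scan with file types and path regexes.
# The bucket name and regex patterns are placeholders.
cloud_storage_options = {
    'file_set': {
        'regex_file_set': {
            'bucket_name': 'example-bucket',
            # Scan only paths matching at least one include pattern...
            'include_regex': [r'reports/.*\.csv'],
            # ...but skip anything matching an exclude pattern.
            'exclude_regex': [r'reports/archive/.*'],
        },
    },
    # Restrict the scan to text and image files.
    'file_types': ['TEXT_FILE', 'IMAGE'],
}
```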

Identifying fields

For tables in BigQuery, you can configure Identifying fields to instruct Cloud DLP to scan only rows that have values in one or more specific fields.

Screenshot of the Identifying fields section in the DLP UI, showing one input field.

To add a field, click Add identifying field. Enter the field name, using dot notation to specify nested fields where necessary.

You can add as many fields as needed. To remove a field, click Delete item (the trash can icon) next to the field you want to delete.
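In the API, this UI section corresponds to an identifying-fields list on BigQueryOptions. A minimal Python sketch with placeholder table and field names (note the dotted name for a nested field):

```python
# Sketch: BigQuery options with identifying fields.
# Table and field names are placeholders.
big_query_options = {
    'table_reference': {
        'project_id': 'example-project',
        'dataset_id': 'example_dataset',
        'table_id': 'users',
    },
    # Unique row identifiers; nested fields use dot notation.
    'identifying_fields': [
        {'name': 'user_id'},
        {'name': 'contact.email'},
    ],
}
```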

Sampling

Under Sampling, you can choose whether to scan all of the selected data or to sample the data by scanning a certain percentage. Sampling works differently depending on the type of storage repository you're scanning:

  • For BigQuery, this setting samples a subset of all selected rows, corresponding to the percentage of rows you specify to include in the scan.

    Screenshot of the Sampling section in the DLP UI for BigQuery.

  • For Cloud Storage, if any file exceeds the size specified in Max byte size to scan per file, Cloud DLP scans it up to that maximum file size and then moves on to the next file.

    Screenshot of the Sampling section in the DLP UI for Cloud Storage.

To turn on sampling, choose one of the following options from the first menu:

  • Start sampling from top: Cloud DLP starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Cloud DLP reaches any specified maximum file size (see above).
  • Start sampling from random start: Cloud DLP starts the partial scan at a random location in the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting applies only to files that exceed any specified maximum size: Cloud DLP scans files under the maximum file size in their entirety, and scans files above the maximum file size up to that maximum.

To perform a partial scan, you must also choose what percentage of the data you want to scan. Use the slider to set the percentage.
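In the API, the sampling menu and slider correspond approximately to sampling fields on the storage options: a sample method plus a rows percentage for BigQuery, or a per-file byte limit for Cloud Storage. A Python sketch with example values:

```python
# Sketch: sampling settings per repository type; the percentage and
# byte limit below are example values.

# BigQuery: scan 10% of rows, starting from a random row.
big_query_sampling = {
    'rows_limit_percent': 10,
    'sample_method': 'RANDOM_START',
}

# Cloud Storage: scan at most 1 MB of each file, from the top;
# larger files are scanned only up to the limit.
cloud_storage_sampling = {
    'bytes_limit_per_file': 1048576,
    'sample_method': 'TOP',
}
```

These keys are merged into the big_query_options or cloud_storage_options dictionaries alongside the table or file-set settings shown earlier.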

Configure detection

The Configure detection section is where you specify the types of sensitive data you want to scan for.

Templates

You can optionally use a template to reuse configuration information you've specified previously.

If you have already created templates, click the Template name field to see a list of existing templates. Choose or type the name of the template you want to use.

For more information about creating templates, see Creating Cloud DLP inspection templates.

InfoTypes

InfoType detectors find sensitive data of a certain type. For example, the Cloud DLP US_SOCIAL_SECURITY_NUMBER infoType detector finds US Social Security numbers.

Screenshot of the InfoTypes section of Create job or job trigger in the DLP UI.

Under InfoTypes, choose the infoType detectors that correspond to the data types you want to scan for. You can also leave this field blank to scan for all default infoTypes. More information about each detector is provided in the InfoType detector reference.

Confidence threshold

Every time Cloud DLP detects a potential match for sensitive data, it assigns it a likelihood value on a scale from "Very unlikely" to "Very likely". When you set a likelihood value here, you are instructing Cloud DLP to match only data that corresponds to that likelihood value or higher.

Screenshot of the Confidence threshold section of Create job or job trigger in the DLP UI.

The default value of Possible is sufficient for most purposes. If you routinely get matches that are too broad, move the slider up. If you get too few matches, move the slider down.

When you're done, click Continue.
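The InfoTypes and Confidence threshold choices above both end up in the job's inspectConfig. A minimal Python sketch, with example detector names:

```python
# Sketch: an inspectConfig selecting two infoType detectors and keeping
# only matches at 'POSSIBLE' likelihood or higher.
inspect_config = {
    'info_types': [
        {'name': 'US_SOCIAL_SECURITY_NUMBER'},
        {'name': 'EMAIL_ADDRESS'},
    ],
    # The confidence-threshold slider maps to minLikelihood.
    'min_likelihood': 'POSSIBLE',
}
```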

Add actions

In the Add actions step, you select one or more actions you want Cloud DLP to take after the job completes.

Screenshot of the Add actions section of the DLP UI.

The available options are:

  • Publish to Pub/Sub: Sends a notification message to Cloud Pub/Sub when the job completes. Click New Topic to specify one or more topic names to which you want to publish the notification.
  • Save to BigQuery: Saves findings to a BigQuery table. The findings stored in BigQuery contain details about each finding's location and match likelihood. If you don't save findings, the completed job contains only statistics about the number of findings and their infoTypes. If you don't specify a table ID, BigQuery assigns a default name to a new table. If you specify an existing table, findings are appended to it. Select the Include quote checkbox to include contextual matched text with each match.
  • Publish to Google Cloud Security Command Center: Publishes a summary of the results to Cloud SCC. For more information, see Sending Cloud DLP scan results to Cloud SCC.
  • Email notification: Instructs Cloud DLP to send an email to project owners and editors when the job completes.
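Each of these options corresponds to an entry in the job's actions list in the API. A Python sketch of all four, with placeholder topic and table names:

```python
# Sketch: the four actions as DLP 'actions' entries.
# The Pub/Sub topic and BigQuery table names are placeholders.
actions = [
    # Publish a notification to a Pub/Sub topic when the job completes.
    {'pub_sub': {'topic': 'projects/example-project/topics/dlp-notifications'}},
    # Save detailed findings to a BigQuery table.
    {'save_findings': {
        'output_config': {
            'table': {
                'project_id': 'example-project',
                'dataset_id': 'dlp_results',
                'table_id': 'findings',
            },
        },
    }},
    # Publish a results summary to Cloud Security Command Center.
    {'publish_summary_to_cscc': {}},
    # Email project owners and editors when the job completes.
    {'job_notification_emails': {}},
]
```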

After you've selected actions, click Continue.

Schedule

In the Schedule section, you can do two things:

  • Specify time span: This option limits the files or rows to scan by date. Click Start time to specify the earliest file timestamp to include. Leave this value blank to include all files. Click End time to specify the latest file timestamp to include. Leave this value blank to specify no upper timestamp limit.
  • Create a trigger to run the job on a periodic schedule: This option creates a job trigger and sets it to run the job you've specified on a periodic schedule. The default value is also the minimum value: 24 hours. The maximum value is 60 days. If you want Cloud DLP to scan only new files or rows, select the Limit scans only to new content checkbox.

Screenshot of the Schedule section of Create a job or job trigger in the DLP UI.
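In the API, the time-span option maps to the storage config's timespanConfig, and the periodic schedule maps to a trigger with a recurrence period duration. A Python sketch with example values (the timestamps are arbitrary placeholders):

```python
# Sketch: schedule-related settings with example values.

# Limit scans by timestamp; omit either bound to leave it open.
# Alternatively, enable auto-population so only new content is scanned.
timespan_config = {
    'start_time': {'seconds': 1543536000},  # placeholder epoch timestamp
    'end_time': {'seconds': 1546128000},
    # 'enable_auto_population_of_timespan_config': True,
}

# Run the triggered job every 30 days
# (the UI minimum is 24 hours; the maximum is 60 days).
trigger = {
    'schedule': {
        'recurrence_period_duration': {'seconds': 30 * 24 * 60 * 60},
    },
}
```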

Review

The Review section contains a JSON-formatted summary of the job settings you just specified.

Click Create to create the job trigger (if you specified a schedule) or the job (if you didn't specify a schedule), and to run the job. The job's or job trigger's information page displays, containing its status and other information. If a job is currently running, you can click the Cancel button to stop it; you can also delete the job or job trigger by clicking Delete.

To return to the main Cloud DLP page, click the back arrow in the GCP Console.

Protocol

A job trigger is represented in the DLP API by the JobTrigger resource. You can create a new job trigger by using the JobTrigger resource's projects.jobTriggers.create method.

You can send the following sample JSON in a POST request to the specified Cloud DLP REST endpoint. This example JSON demonstrates how to create a job trigger in Cloud DLP. The job the trigger starts is a Cloud Datastore inspection scan. The job trigger that is created runs every 86,400 seconds (or 24 hours).

To quickly try this out, you can use the APIs Explorer. Keep in mind that a successful request, even one made in APIs Explorer, will create a new scheduled job trigger. For general information about sending requests to the Cloud DLP API using JSON, see the JSON quickstart.

JSON input:

POST https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers?key={YOUR_API_KEY}

{
  "jobTrigger":{
    "displayName":"JobTrigger1",
    "description":"Starts a DLP scan job of a Datastore kind",
    "triggers":[
      {
        "schedule":{
          "recurrencePeriodDuration":"86400s"
        }
      }
    ],
    "status":"HEALTHY",
    "inspectJob":{
      "storageConfig":{
        "datastoreOptions":{
          "kind":{
            "name":"Example-Kind"
          },
          "partitionId":{
            "projectId":"[PROJECT_ID]",
            "namespaceId":"[NAMESPACE_ID]"
          }
        }
      },
      "inspectConfig":{
        "infoTypes":[
          {
            "name":"PHONE_NUMBER"
          }
        ],
        "excludeInfoTypes":false,
        "includeQuote":true,
        "minLikelihood":"LIKELY"
      },
      "actions":[
        {
          "saveFindings":{
            "outputConfig":{
              "table":{
                "projectId":"[PROJECT_ID]",
                "datasetId":"[BIGQUERY_DATASET_NAME]",
                "tableId":"[BIGQUERY_TABLE_NAME]"
              }
            }
          }
        }
      ]
    }
  }
}

JSON output:

The following output indicates that the job trigger was created successfully.

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "displayName":"JobTrigger1",
  "description":"Starts a DLP scan job of a Datastore kind",
  "inspectJob":{
    "storageConfig":{
      "datastoreOptions":{
        "partitionId":{
          "projectId":"[PROJECT_ID]",
          "namespaceId":"[NAMESPACE_ID]"
        },
        "kind":{
          "name":"Example-Kind"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "minLikelihood":"LIKELY",
      "limits":{

      },
      "includeQuote":true
    },
    "actions":[
      {
        "saveFindings":{
          "outputConfig":{
            "table":{
              "projectId":"[PROJECT_ID]",
              "datasetId":"[BIGQUERY_DATASET_NAME]",
              "tableId":"[BIGQUERY_TABLE_NAME]"
            }
          }
        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2018-11-30T01:52:41.171857Z",
  "updateTime":"2018-11-30T01:52:41.171857Z",
  "status":"HEALTHY"
}

Java

/**
 * Schedule a DLP inspection trigger for a GCS location.
 *
 * @param triggerId (Optional) name of the trigger to be created
 * @param displayName (Optional) display name for the trigger to be created
 * @param description (Optional) description for the trigger to be created
 * @param bucketName The name of the GCS bucket to scan
 * @param fileName The name of the file (or file pattern) in the bucket to scan
 * @param autoPopulateTimespan If true, limits scans to new content only.
 * @param scanPeriod How often to wait between scans, in days (minimum = 1 day)
 * @param infoTypes infoTypes of information to match eg. InfoType.PHONE_NUMBER,
 *     InfoType.EMAIL_ADDRESS
 * @param minLikelihood minimum likelihood required before returning a match
 * @param maxFindings maximum number of findings to report per request (0 = server maximum)
 * @param projectId The project ID to run the API call under
 */
private static void createTrigger(
    String triggerId,
    String displayName,
    String description,
    String bucketName,
    String fileName,
    boolean autoPopulateTimespan,
    int scanPeriod,
    List<InfoType> infoTypes,
    Likelihood minLikelihood,
    int maxFindings,
    String projectId)
    throws Exception {

  // instantiate a client
  DlpServiceClient dlpServiceClient = DlpServiceClient.create();
  try {

    CloudStorageOptions cloudStorageOptions =
        CloudStorageOptions.newBuilder()
            .setFileSet(
                CloudStorageOptions.FileSet.newBuilder()
                    .setUrl("gs://" + bucketName + "/" + fileName))
            .build();

    TimespanConfig timespanConfig = TimespanConfig.newBuilder()
        .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan).build();

    StorageConfig storageConfig =
        StorageConfig.newBuilder().setCloudStorageOptions(cloudStorageOptions)
            .setTimespanConfig(timespanConfig).build();

    InspectConfig.FindingLimits findingLimits =
        InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerRequest(maxFindings).build();

    InspectConfig inspectConfig =
        InspectConfig.newBuilder()
            .addAllInfoTypes(infoTypes)
            .setMinLikelihood(minLikelihood)
            .setLimits(findingLimits)
            .build();

    InspectJobConfig inspectJobConfig =
        InspectJobConfig.newBuilder()
            .setInspectConfig(inspectConfig)
            .setStorageConfig(storageConfig)
            .build();

    // Schedule scan of GCS bucket every scanPeriod number of days (minimum = 1 day)
    Duration duration = Duration.newBuilder().setSeconds(scanPeriod * 24 * 3600).build();
    Schedule schedule = Schedule.newBuilder().setRecurrencePeriodDuration(duration).build();
    JobTrigger.Trigger trigger = JobTrigger.Trigger.newBuilder().setSchedule(schedule).build();
    JobTrigger jobTrigger =
        JobTrigger.newBuilder()
            .setInspectJob(inspectJobConfig)
            .setName(triggerId)
            .setDisplayName(displayName)
            .setDescription(description)
            .setStatus(JobTrigger.Status.HEALTHY)
            .addTriggers(trigger)
            .build();

    // Create scan request
    CreateJobTriggerRequest createJobTriggerRequest =
        CreateJobTriggerRequest.newBuilder()
            .setParent(ProjectName.of(projectId).toString())
            .setJobTrigger(jobTrigger)
            .build();

    JobTrigger createdJobTrigger = dlpServiceClient.createJobTrigger(createJobTriggerRequest);

    System.out.println("Created Trigger: " + createdJobTrigger.getName());
  } catch (Exception e) {
    System.out.println("Error creating trigger: " + e.getMessage());
  }
}

Node.js

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const callingProjectId = process.env.GCLOUD_PROJECT;

// (Optional) The name of the trigger to be created.
// const triggerId = 'my-trigger';

// (Optional) A display name for the trigger to be created
// const displayName = 'My Trigger';

// (Optional) A description for the trigger to be created
// const description = "This is a sample trigger.";

// The name of the bucket to scan.
// const bucketName = 'YOUR-BUCKET';

// Limit scan to new content only.
// const autoPopulateTimespan = true;

// How often to wait between scans, in days (minimum = 1 day)
// const scanPeriod = 1;

// The infoTypes of information to match
// const infoTypes = [{ name: 'PHONE_NUMBER' }, { name: 'EMAIL_ADDRESS' }, { name: 'CREDIT_CARD_NUMBER' }];

// The minimum likelihood required before returning a match
// const minLikelihood = 'LIKELIHOOD_UNSPECIFIED';

// The maximum number of findings to report per request (0 = server maximum)
// const maxFindings = 0;

// Get reference to the bucket to be inspected
const storageItem = {
  cloudStorageOptions: {
    fileSet: {url: `gs://${bucketName}/*`},
  },
  timeSpanConfig: {
    enableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
  },
};

// Construct job to be triggered
const job = {
  inspectConfig: {
    infoTypes: infoTypes,
    minLikelihood: minLikelihood,
    limits: {
      maxFindingsPerRequest: maxFindings,
    },
  },
  storageConfig: storageItem,
};

// Construct trigger creation request
const request = {
  parent: dlp.projectPath(callingProjectId),
  jobTrigger: {
    inspectJob: job,
    displayName: displayName,
    description: description,
    triggers: [
      {
        schedule: {
          recurrencePeriodDuration: {
            seconds: scanPeriod * 60 * 60 * 24, // Trigger the scan every scanPeriod days
          },
        },
      },
    ],
    status: 'HEALTHY',
  },
  triggerId: triggerId,
};

try {
  // Run trigger creation request
  const [trigger] = await dlp.createJobTrigger(request);
  console.log(`Successfully created trigger ${trigger.name}.`);
} catch (err) {
  console.log(`Error in createTrigger: ${err.message || err}`);
}

Python

def create_trigger(project, bucket, scan_period_days, info_types,
                   trigger_id=None, display_name=None, description=None,
                   min_likelihood=None, max_findings=None,
                   auto_populate_timespan=False):
    """Creates a scheduled Data Loss Prevention API inspect_content trigger.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket using a wildcard.
        scan_period_days: How often to repeat the scan, in days.
            The minimum is 1 day.
        info_types: A list of strings representing info types to look for.
            A full list of info type categories can be fetched from the API.
        trigger_id: The id of the trigger. If omitted, an id will be randomly
            generated.
        display_name: The optional display name of the trigger.
        description: The optional description of the trigger.
        min_likelihood: A string representing the minimum likelihood threshold
            that constitutes a match. One of: 'LIKELIHOOD_UNSPECIFIED',
            'VERY_UNLIKELY', 'UNLIKELY', 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the client library
    import google.cloud.dlp

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{'name': info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        'info_types': info_types,
        'min_likelihood': min_likelihood,
        'limits': {'max_findings_per_request': max_findings},
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = 'gs://{}/*'.format(bucket)
    storage_config = {
        'cloud_storage_options': {
            'file_set': {'url': url}
        },
        # Time-based configuration for each storage object.
        'timespan_config': {
            # Auto-populate start and end times in order to scan new objects
            # only.
            'enable_auto_population_of_timespan_config': auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {
        'inspect_config': inspect_config,
        'storage_config': storage_config,
    }

    # Construct the schedule definition:
    schedule = {
        'recurrence_period_duration': {
            'seconds': scan_period_days * 60 * 60 * 24,
        }
    }

    # Construct the trigger definition.
    job_trigger = {
        'inspect_job': job,
        'display_name': display_name,
        'description': description,
        'triggers': [
            {'schedule': schedule}
        ],
        'status': 'HEALTHY'
    }

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Call the API.
    response = dlp.create_job_trigger(
        parent, job_trigger=job_trigger, trigger_id=trigger_id)

    print('Successfully created trigger {}'.format(response.name))

Go

// createTrigger creates a trigger with the given configuration.
func createTrigger(w io.Writer, client *dlp.Client, project string, minLikelihood dlppb.Likelihood, maxFindings int32, triggerID, displayName, description, bucketName string, autoPopulateTimespan bool, scanPeriodDays int64, infoTypes []string) {
	// Convert the info type strings to a list of InfoTypes.
	var i []*dlppb.InfoType
	for _, it := range infoTypes {
		i = append(i, &dlppb.InfoType{Name: it})
	}

	// Create a configured request.
	req := &dlppb.CreateJobTriggerRequest{
		Parent:    "projects/" + project,
		TriggerId: triggerID,
		JobTrigger: &dlppb.JobTrigger{
			DisplayName: displayName,
			Description: description,
			Status:      dlppb.JobTrigger_HEALTHY,
			// Triggers control when the job will start.
			Triggers: []*dlppb.JobTrigger_Trigger{
				{
					Trigger: &dlppb.JobTrigger_Trigger_Schedule{
						Schedule: &dlppb.Schedule{
							Option: &dlppb.Schedule_RecurrencePeriodDuration{
								RecurrencePeriodDuration: &duration.Duration{
									Seconds: scanPeriodDays * 60 * 60 * 24, // Days to seconds.
								},
							},
						},
					},
				},
			},
			// Job configures the job to run when the trigger runs.
			Job: &dlppb.JobTrigger_InspectJob{
				InspectJob: &dlppb.InspectJobConfig{
					InspectConfig: &dlppb.InspectConfig{
						InfoTypes:     i,
						MinLikelihood: minLikelihood,
						Limits: &dlppb.InspectConfig_FindingLimits{
							MaxFindingsPerRequest: maxFindings,
						},
					},
					StorageConfig: &dlppb.StorageConfig{
						Type: &dlppb.StorageConfig_CloudStorageOptions{
							CloudStorageOptions: &dlppb.CloudStorageOptions{
								FileSet: &dlppb.CloudStorageOptions_FileSet{
									Url: "gs://" + bucketName + "/*",
								},
							},
						},
						// Time-based configuration for each storage object. See more at
						// https://cloud.google.com/dlp/docs/reference/rest/v2/InspectJobConfig#TimespanConfig
						TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
							// Auto-populate start and end times in order to scan new objects only.
							EnableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
						},
					},
				},
			},
		},
	}
	// Send the request.
	resp, err := client.CreateJobTrigger(context.Background(), req)
	if err != nil {
		log.Fatalf("error creating job trigger: %v", err)
	}
	fmt.Fprintf(w, "Successfully created trigger: %v", resp.GetName())
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\JobTrigger\Trigger;
use Google\Cloud\Dlp\V2\JobTrigger\Status;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Schedule;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions_FileSet;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig_TimespanConfig;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Protobuf\Duration;

/**
 * Create a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId The project ID to run the API call under
 * @param string $bucketName The name of the bucket to scan
 * @param string $triggerId (Optional) The name of the trigger to be created
 * @param string $displayName (Optional) The human-readable name to give the trigger
 * @param string $description (Optional) A description for the trigger to be created
 * @param int $scanPeriod (Optional) How often to wait between scans, in days (minimum = 1 day)
 * @param int $maxFindings (Optional) The maximum number of findings to report per request (0 = server maximum)
 * @param bool $autoPopulateTimespan (Optional) Automatically limit scan to new content only
 */

function create_trigger(
    $callingProjectId,
    $bucketName,
    $triggerId = '',
    $displayName = '',
    $description = '',
    $scanPeriod = 1,
    $maxFindings = 0,
    $autoPopulateTimespan = false
) {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // ----- Construct job config -----
    // The infoTypes of information to match
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$personNameInfoType, $phoneNumberInfoType];

    // The minimum likelihood required before returning a match
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // Specify finding limits
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest($maxFindings);

    // Create the inspectConfig object
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes);

    // Create triggers
    $duration = (new Duration())
        ->setSeconds($scanPeriod * 60 * 60 * 24);

    $schedule = (new Schedule())
        ->setRecurrencePeriodDuration($duration);

    $triggerObject = (new Trigger())
        ->setSchedule($schedule);

    // Create the storageConfig object
    $fileSet = (new CloudStorageOptions_FileSet())
        ->setUrl('gs://' . $bucketName . '/*');

    $storageOptions = (new CloudStorageOptions())
        ->setFileSet($fileSet);

    // Auto-populate start and end times in order to scan new objects only.
    $timespanConfig = (new StorageConfig_TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig($autoPopulateTimespan);

    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions($storageOptions)
        ->setTimespanConfig($timespanConfig);

    // Construct the jobConfig object
    $jobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig);

    // ----- Construct trigger object -----
    $jobTriggerObject = (new JobTrigger())
        ->setTriggers([$triggerObject])
        ->setInspectJob($jobConfig)
        ->setStatus(Status::HEALTHY)
        ->setDisplayName($displayName)
        ->setDescription($description);

    // Run trigger creation request
    $parent = $dlp->projectName($callingProjectId);
    $dlp->createJobTrigger($parent, [
        'jobTrigger' => $jobTriggerObject,
        'triggerId' => $triggerId
    ]);

    // Print results
    printf('Successfully created trigger %s' . PHP_EOL, $triggerId);
}

C#

public static object CreateJobTrigger(
    string projectId,
    string bucketName,
    string minLikelihood,
    int maxFindings,
    bool autoPopulateTimespan,
    int scanPeriod,
    IEnumerable<InfoType> infoTypes,
    string triggerId,
    string displayName,
    string description)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    var jobConfig = new InspectJobConfig
    {
        InspectConfig = new InspectConfig
        {
            MinLikelihood = (Likelihood)Enum.Parse(
                typeof(Likelihood),
                minLikelihood
            ),
            Limits = new FindingLimits
            {
                MaxFindingsPerRequest = maxFindings
            },
            InfoTypes = { infoTypes }
        },
        StorageConfig = new StorageConfig
        {
            CloudStorageOptions = new CloudStorageOptions
            {
                FileSet = new FileSet
                {
                    Url = $"gs://{bucketName}/*"
                }
            },
            TimespanConfig = new TimespanConfig
            {
                EnableAutoPopulationOfTimespanConfig = autoPopulateTimespan
            }
        }
    };

    var jobTrigger = new JobTrigger
    {
        Triggers =
        {
            new Trigger
            {
                Schedule = new Schedule
                {
                    RecurrencePeriodDuration = new Google.Protobuf.WellKnownTypes.Duration
                    {
                        Seconds = scanPeriod * 60 * 60 * 24
                    }
                }
            }
        },
        InspectJob = jobConfig,
        Status = Status.Healthy,
        DisplayName = displayName,
        Description = description
    };

    JobTrigger response = dlp.CreateJobTrigger(
        new CreateJobTriggerRequest
        {
            ParentAsProjectName = new ProjectName(projectId),
            JobTrigger = jobTrigger,
            TriggerId = triggerId
        });

    Console.WriteLine($"Successfully created trigger {response.Name}");
    return 0;
}

Listing all job triggers

To list all job triggers for the current project:

Console

  1. In the GCP Console, open Cloud DLP.

    Go to the Cloud DLP UI beta

  2. Click the Job triggers tab.

The console displays a list of all job triggers for the current project.

Protocol

The JobTrigger resource has a projects.jobTriggers.list method, with which you can list all job triggers.

To list all job triggers currently defined in your project, send a GET request to the jobTriggers endpoint, as shown here:

URL:

GET https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers?key={YOUR_API_KEY}

The following JSON output lists the job trigger created in the previous section. Note that the structure of the job trigger mirrors that of the JobTrigger resource.

JSON output:

{
  "jobTriggers":[
    {
      "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
      "displayName":"JobTrigger1",
      "description":"Starts a DLP scan job of a Datastore kind",
      "inspectJob":{
        "storageConfig":{
          "datastoreOptions":{
            "partitionId":{
              "projectId":"[PROJECT_ID]",
              "namespaceId":"[NAMESPACE_ID]"
            },
            "kind":{
              "name":"Example-Kind"
            }
          }
        },
        "inspectConfig":{
          "infoTypes":[
            {
              "name":"PHONE_NUMBER"
            }
          ],
          "minLikelihood":"LIKELY",
          "limits":{

          },
          "includeQuote":true
        },
        "actions":[
          {
            "saveFindings":{
              "outputConfig":{
                "table":{
                  "projectId":"[PROJECT_ID]",
                  "datasetId":"[BIGQUERY_DATASET_NAME]",
                  "tableId":"[BIGQUERY_TABLE_NAME]"
                }
              }
            }
          }
        ]
      },
      "triggers":[
        {
          "schedule":{
            "recurrencePeriodDuration":"86400s"
          }
        }
      ],
      "createTime":"2018-11-30T01:52:41.171857Z",
      "updateTime":"2018-11-30T01:52:41.171857Z",
      "status":"HEALTHY"
    },

    ...

],
  "nextPageToken":"KkwKCQjivJ2UpPreAgo_Kj1wcm9qZWN0cy92ZWx2ZXR5LXN0dWR5LTE5NjEwMS9qb2JUcmlnZ2Vycy8xNTA5NzEyOTczMDI0MDc1NzY0"
}

Java

/**
 * List all DLP triggers for a given project.
 *
 * @param projectId The project ID to run the API call under.
 */
private static void listTriggers(String projectId) {
  // Instantiates a client
  try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
    ListJobTriggersRequest listJobTriggersRequest =
        ListJobTriggersRequest.newBuilder()
            .setParent(ProjectName.of(projectId).toString())
            .build();
    DlpServiceClient.ListJobTriggersPagedResponse response =
        dlpServiceClient.listJobTriggers(listJobTriggersRequest);
    response
        .getPage()
        .getValues()
        .forEach(
            trigger -> {
              System.out.println("Trigger: " + trigger.getName());
              System.out.println("\tCreated: " + trigger.getCreateTime());
              System.out.println("\tUpdated: " + trigger.getUpdateTime());
              if (trigger.getDisplayName() != null) {
                System.out.println("\tDisplay name: " + trigger.getDisplayName());
              }
              if (trigger.getDescription() != null) {
                System.out.println("\tDescription: " + trigger.getDescription());
              }
              System.out.println("\tStatus: " + trigger.getStatus());
              System.out.println("\tError count: " + trigger.getErrorsCount());
            });
  } catch (Exception e) {
    System.out.println("Error listing triggers :" + e.getMessage());
  }
}

Node.js

async function listTriggers(callingProjectId) {
  // Imports the Google Cloud Data Loss Prevention library
  const DLP = require('@google-cloud/dlp');

  // Instantiates a client
  const dlp = new DLP.DlpServiceClient();

  // The project ID to run the API call under
  // const callingProjectId = process.env.GCLOUD_PROJECT;

  // Construct trigger listing request
  const request = {
    parent: dlp.projectPath(callingProjectId),
  };

  // Helper function to pretty-print dates
  const formatDate = date => {
    const msSinceEpoch = parseInt(date.seconds, 10) * 1000;
    return new Date(msSinceEpoch).toLocaleString('en-US');
  };

  try {
    // Run trigger listing request
    const [triggers] = await dlp.listJobTriggers(request);
    triggers.forEach(trigger => {
      // Log trigger details
      console.log(`Trigger ${trigger.name}:`);
      console.log(`  Created: ${formatDate(trigger.createTime)}`);
      console.log(`  Updated: ${formatDate(trigger.updateTime)}`);
      if (trigger.displayName) {
        console.log(`  Display Name: ${trigger.displayName}`);
      }
      if (trigger.description) {
        console.log(`  Description: ${trigger.description}`);
      }
      console.log(`  Status: ${trigger.status}`);
      console.log(`  Error count: ${trigger.errors.length}`);
    });
  } catch (err) {
    console.log(`Error in listTriggers: ${err.message || err}`);
  }
}

async function deleteTrigger(triggerId) {
  // Imports the Google Cloud Data Loss Prevention library
  const DLP = require('@google-cloud/dlp');

  // Instantiates a client
  const dlp = new DLP.DlpServiceClient();

  // The name of the trigger to be deleted
  // Parent project ID is automatically extracted from this parameter
  // const triggerId = 'projects/my-project/triggers/my-trigger';

  // Construct trigger deletion request
  const request = {
    name: triggerId,
  };
  try {
    // Run trigger deletion request
    await dlp.deleteJobTrigger(request);
    console.log(`Successfully deleted trigger ${triggerId}.`);
  } catch (err) {
    console.log(`Error in deleteTrigger: ${err.message || err}`);
  }

}

const cli = require(`yargs`) // eslint-disable-line
  .demand(1)
  .command(
    `create <bucketName> <scanPeriod>`,
    `Create a Data Loss Prevention API job trigger.`,
    {
      infoTypes: {
        alias: 't',
        default: ['PHONE_NUMBER', 'EMAIL_ADDRESS', 'CREDIT_CARD_NUMBER'],
        type: 'array',
        global: true,
        coerce: infoTypes =>
          infoTypes.map(type => {
            return {name: type};
          }),
      },
      triggerId: {
        alias: 'n',
        default: '',
        type: 'string',
      },
      displayName: {
        alias: 'd',
        default: '',
        type: 'string',
      },
      description: {
        alias: 's',
        default: '',
        type: 'string',
      },
      autoPopulateTimespan: {
        default: false,
        type: 'boolean',
      },
      minLikelihood: {
        alias: 'm',
        default: 'LIKELIHOOD_UNSPECIFIED',
        type: 'string',
        choices: [
          'LIKELIHOOD_UNSPECIFIED',
          'VERY_UNLIKELY',
          'UNLIKELY',
          'POSSIBLE',
          'LIKELY',
          'VERY_LIKELY',
        ],
        global: true,
      },
      maxFindings: {
        alias: 'f',
        default: 0,
        type: 'number',
        global: true,
      },
    },
    opts =>
      createTrigger(
        opts.callingProjectId,
        opts.triggerId,
        opts.displayName,
        opts.description,
        opts.bucketName,
        opts.autoPopulateTimespan,
        opts.scanPeriod,
        opts.infoTypes,
        opts.minLikelihood,
        opts.maxFindings
      )
  )
  .command(`list`, `List Data Loss Prevention API job triggers.`, {}, opts =>
    listTriggers(opts.callingProjectId)
  )
  .command(
    `delete <triggerId>`,
    `Delete a Data Loss Prevention API job trigger.`,
    {},
    opts => deleteTrigger(opts.triggerId)
  )
  .option('c', {
    type: 'string',
    alias: 'callingProjectId',
    default: process.env.GCLOUD_PROJECT || '',
  })
  .example(`node $0 create my-bucket 1`)
  .example(`node $0 list`)
  .example(`node $0 delete projects/my-project/jobTriggers/my-trigger`)
  .wrap(120)
  .recommendCommands()
  .epilogue(`For more information, see https://cloud.google.com/dlp/docs.`);

if (module === require.main) {
  cli.help().strict().argv; // eslint-disable-line
}

Python

def list_triggers(project):
    """Lists all Data Loss Prevention API triggers.
    Args:
        project: The Google Cloud project id to use as a parent resource.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the client library
    import google.cloud.dlp

    # Import the standard time library, used below to format timestamps.
    import time

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Call the API.
    response = dlp.list_job_triggers(parent)

    # Define a helper function to convert the API's "seconds since the epoch"
    # time format into a human-readable string.
    def human_readable_time(timestamp):
        return str(time.localtime(timestamp.seconds))

    for trigger in response:
        print('Trigger {}:'.format(trigger.name))
        print('  Created: {}'.format(human_readable_time(trigger.create_time)))
        print('  Updated: {}'.format(human_readable_time(trigger.update_time)))
        if trigger.display_name:
            print('  Display Name: {}'.format(trigger.display_name))
        if trigger.description:
            print('  Description: {}'.format(trigger.description))
        print('  Status: {}'.format(trigger.status))
        print('  Error count: {}'.format(len(trigger.errors)))

Go

// listTriggers lists the triggers for the given project.
func listTriggers(w io.Writer, client *dlp.Client, project string) {
	// Create a configured request.
	req := &dlppb.ListJobTriggersRequest{
		Parent: "projects/" + project,
	}
	// Send the request and iterate over the results.
	it := client.ListJobTriggers(context.Background(), req)
	for {
		t, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("error getting jobs: %v", err)
		}
		c := t.GetCreateTime()
		u := t.GetUpdateTime()
		fmt.Fprintf(w, "Trigger %v\n", t.GetName())
		fmt.Fprintf(w, "  Created: %v\n", time.Unix(c.GetSeconds(), int64(c.GetNanos())).Format(time.RFC1123))
		fmt.Fprintf(w, "  Updated: %v\n", time.Unix(u.GetSeconds(), int64(u.GetNanos())).Format(time.RFC1123))
		fmt.Fprintf(w, "  Display Name: %q\n", t.GetDisplayName())
		fmt.Fprintf(w, "  Description: %q\n", t.GetDescription())
		fmt.Fprintf(w, "  Status: %v\n", t.GetStatus())
		fmt.Fprintf(w, "  Error Count: %v\n", len(t.GetErrors()))
	}
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;

/**
 * List Data Loss Prevention API job triggers.
 * @param string $callingProjectId The GCP Project ID to run the API call under
 */
function list_triggers($callingProjectId)
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    $parent = $dlp->projectName($callingProjectId);

    // Run request
    $response = $dlp->listJobTriggers($parent);

    // Print results
    $triggers = $response->iterateAllElements();
    foreach ($triggers as $trigger) {
        printf('Trigger %s' . PHP_EOL, $trigger->getName());
        printf('  Created: %s' . PHP_EOL, $trigger->getCreateTime()->getSeconds());
        printf('  Updated: %s' . PHP_EOL, $trigger->getUpdateTime()->getSeconds());
        printf('  Display Name: %s' . PHP_EOL, $trigger->getDisplayName());
        printf('  Description: %s' . PHP_EOL, $trigger->getDescription());
        printf('  Status: %s' . PHP_EOL, $trigger->getStatus());
        printf('  Error count: %s' . PHP_EOL, count($trigger->getErrors()));
        $timespanConfig = $trigger->getInspectJob()->getStorageConfig()->getTimespanConfig();
        printf('  Auto-populates timespan config: %s' . PHP_EOL,
            ($timespanConfig && $timespanConfig->getEnableAutoPopulationOfTimespanConfig() ? 'yes' : 'no'));
    }
}

C#

public static object ListJobTriggers(string projectId)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    var response = dlp.ListJobTriggers(
        new ListJobTriggersRequest
        {
            ParentAsProjectName = new ProjectName(projectId)
        });

    foreach (var trigger in response)
    {
        Console.WriteLine($"Name: {trigger.Name}");
        Console.WriteLine($"  Created: {trigger.CreateTime.ToString()}");
        Console.WriteLine($"  Updated: {trigger.UpdateTime.ToString()}");
        Console.WriteLine($"  Display Name: {trigger.DisplayName}");
        Console.WriteLine($"  Description: {trigger.Description}");
        Console.WriteLine($"  Status: {trigger.Status}");
        Console.WriteLine($"  Error count: {trigger.Errors.Count}");
    }

    return 0;
}

Delete a job trigger

To delete an existing job trigger:

Console

  1. In the GCP Console, open Cloud DLP.

    Go to the Cloud DLP UI Beta

  2. Click the Job triggers tab. The console displays a list of all job triggers for the current project.

  3. In the Actions column for the job trigger you want to delete, click the three vertical dots, and then click Delete.

    Screenshot of the DLP UI job triggers listing with the Actions menu open.

Alternatively, from the list of job triggers, click the name of the job trigger you want to delete. On the job trigger's detail page, click Delete.

Protocol

To delete a job trigger from the current project, send a DELETE request to the jobTriggers endpoint, as shown here. Replace the [JOB_TRIGGER_NAME] field with the name specified in the "name" field of the original JSON response to the create request.

URL:

DELETE https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]?key={YOUR_API_KEY}

If the request completes successfully, the Cloud DLP API returns a success response. To verify that the job trigger was successfully deleted, list all job triggers.
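The resource name in the DELETE URL always follows the pattern projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]. As an illustrative sketch (the helper function below is ours, not part of the DLP client library), you can assemble the endpoint like this:

```python
# Build the full resource name and REST URL for a job trigger.
# Illustrative helper only; not part of the google-cloud-dlp library.

def job_trigger_url(project_id, trigger_name):
    """Return the REST endpoint for a specific job trigger."""
    resource = 'projects/{}/jobTriggers/{}'.format(project_id, trigger_name)
    return 'https://dlp.googleapis.com/v2/{}'.format(resource)

print(job_trigger_url('my-project', 'my-trigger'))
```

Sending a DELETE request to the printed URL (with valid credentials) deletes that trigger.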

Java

/**
 * Delete a DLP trigger in a project.
 *
 * @param projectId The project ID to run the API call under.
 * @param triggerId Trigger ID
 */
private static void deleteTrigger(String projectId, String triggerId) {

  ProjectJobTriggerName triggerName = ProjectJobTriggerName.of(projectId, triggerId);
  try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
    DeleteJobTriggerRequest deleteJobTriggerRequest =
        DeleteJobTriggerRequest.newBuilder().setName(triggerName.toString()).build();
    dlpServiceClient.deleteJobTrigger(deleteJobTriggerRequest);

    System.out.println("Trigger deleted: " + triggerName.toString());
  } catch (Exception e) {
    System.out.println("Error deleting trigger :" + e.getMessage());
  }
}

Node.js

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

async function deleteTrigger(triggerId) {
  // The name of the trigger to be deleted
  // Parent project ID is automatically extracted from this parameter
  // const triggerId = 'projects/my-project/triggers/my-trigger';

  // Construct trigger deletion request
  const request = {
    name: triggerId,
  };
  try {
    // Run trigger deletion request
    await dlp.deleteJobTrigger(request);
    console.log(`Successfully deleted trigger ${triggerId}.`);
  } catch (err) {
    console.log(`Error in deleteTrigger: ${err.message || err}`);
  }
}

Python

def delete_trigger(project, trigger_id):
    """Deletes a Data Loss Prevention API trigger.
    Args:
        project: The id of the Google Cloud project which owns the trigger.
        trigger_id: The id of the trigger to delete.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the client library
    import google.cloud.dlp

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Combine the trigger id with the parent id.
    trigger_resource = '{}/jobTriggers/{}'.format(parent, trigger_id)

    # Call the API.
    dlp.delete_job_trigger(trigger_resource)

    print('Trigger {} successfully deleted.'.format(trigger_resource))

if __name__ == '__main__':
    import argparse
    import os

    default_project = os.environ.get('GCLOUD_PROJECT')

    parser = argparse.ArgumentParser(description=__doc__)
    subparsers = parser.add_subparsers(
        dest='action', help='Select which action to perform.')
    subparsers.required = True

    parser_create = subparsers.add_parser('create', help='Create a trigger.')
    parser_create.add_argument(
        'bucket', help='The name of the GCS bucket containing the file.')
    parser_create.add_argument(
        'scan_period_days', type=int,
        help='How often to repeat the scan, in days. The minimum is 1 day.')
    parser_create.add_argument(
        '--trigger_id',
        help='The id of the trigger. If omitted, an id will be randomly '
             'generated')
    parser_create.add_argument(
        '--display_name',
        help='The optional display name of the trigger.')
    parser_create.add_argument(
        '--description',
        help='The optional description of the trigger.')
    parser_create.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)
    parser_create.add_argument(
        '--info_types', action='append',
        help='Strings representing info types to look for. A full list of '
             'info categories and types is available from the API. Examples '
             'include "FIRST_NAME", "LAST_NAME", "EMAIL_ADDRESS". '
             'If unspecified, the three above examples will be used.',
        default=['FIRST_NAME', 'LAST_NAME', 'EMAIL_ADDRESS'])
    parser_create.add_argument(
        '--min_likelihood',
        choices=['LIKELIHOOD_UNSPECIFIED', 'VERY_UNLIKELY', 'UNLIKELY',
                 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'],
        help='A string representing the minimum likelihood threshold that '
             'constitutes a match.')
    parser_create.add_argument(
        '--max_findings', type=int,
        help='The maximum number of findings to report; 0 = no maximum.')
    parser_create.add_argument(
        '--auto_populate_timespan', action='store_true',
        help='Limit scan to new content only.')

    parser_list = subparsers.add_parser('list', help='List all triggers.')
    parser_list.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)

    parser_delete = subparsers.add_parser('delete', help='Delete a trigger.')
    parser_delete.add_argument(
        'trigger_id',
        help='The id of the trigger to delete.')
    parser_delete.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)

    args = parser.parse_args()

    if args.action == 'create':
        create_trigger(
            args.project, args.bucket, args.scan_period_days, args.info_types,
            trigger_id=args.trigger_id, display_name=args.display_name,
            description=args.description, min_likelihood=args.min_likelihood,
            max_findings=args.max_findings,
            auto_populate_timespan=args.auto_populate_timespan,
        )
    elif args.action == 'list':
        list_triggers(args.project)
    elif args.action == 'delete':
        delete_trigger(args.project, args.trigger_id)

Go

// deleteTrigger deletes the given trigger.
func deleteTrigger(w io.Writer, client *dlp.Client, triggerID string) {
	req := &dlppb.DeleteJobTriggerRequest{
		Name: triggerID,
	}
	err := client.DeleteJobTrigger(context.Background(), req)
	if err != nil {
		log.Fatalf("error deleting job: %v", err)
	}
	fmt.Fprintf(w, "Successfully deleted trigger %v", triggerID)
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;

/**
 * Delete a Data Loss Prevention API job trigger.
 * @param string $triggerId The name of the trigger to be deleted.
 *        Parent project ID is automatically extracted from this parameter
 */
function delete_trigger($triggerId)
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run request
    $dlp->deleteJobTrigger($triggerId);

    // Print the results
    printf('Successfully deleted trigger %s' . PHP_EOL, $triggerId);
}

C#

public static object DeleteJobTrigger(string triggerName)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    dlp.DeleteJobTrigger(
        new DeleteJobTriggerRequest
        {
            Name = triggerName
        });

    Console.WriteLine($"Successfully deleted trigger {triggerName}.");
    return 0;
}

Update an existing job trigger

In addition to creating, listing, and deleting job triggers, you can also update an existing job trigger. To change the configuration of an existing job trigger:

Console

  1. In the GCP Console, open Cloud DLP.

    Go to the Cloud DLP UI Beta

  2. Click the Job triggers tab. The console displays a list of all job triggers for the current project.

  3. In the Actions column for the job trigger you want to update, click the three vertical dots, and then click View details.

    Screenshot of the DLP UI job triggers listing with the Actions menu open.

  4. On the job trigger detail page, click Edit.

  5. On the Edit trigger page, you can change the location of the input data; detection details such as templates, infoTypes, or likelihood; any post-scan actions; and the job trigger's schedule. When you're done making changes, click Save.

Protocol

Use the projects.jobTriggers.patch method to send new JobTrigger values to the Cloud DLP API to update those values within a specified job trigger.

For example, consider the following simple job trigger. This JSON represents the job trigger, and was returned after sending a GET request to the current project's job trigger endpoint.

JSON output:

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://dlptesting/*"
        },
        "fileTypes":[
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent":100
      },
      "timespanConfig":{
        "enableAutoPopulationOfTimespanConfig":true
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"US_SOCIAL_SECURITY_NUMBER"
        }
      ],
      "minLikelihood":"POSSIBLE",
      "limits":{

      }
    },
    "actions":[
      {
        "jobNotificationEmails":{

        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2019-03-06T21:19:45.774841Z",
  "updateTime":"2019-03-06T21:19:45.774841Z",
  "status":"HEALTHY"
}
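The schedule's recurrencePeriodDuration is expressed as a seconds-suffixed duration string ("86400s" is one day). A small illustrative helper (ours, not part of the DLP client library) converts it to days:

```python
# Convert a recurrencePeriodDuration string such as "86400s" into whole days.
# Illustrative helper only; not part of the google-cloud-dlp library.

def recurrence_days(duration):
    """Parse a seconds-suffixed duration string and return whole days."""
    if not duration.endswith('s'):
        raise ValueError('expected a duration like "86400s"')
    return int(duration[:-1]) // 86400

print(recurrence_days('86400s'))  # the trigger above runs every 1 day
```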

When sent via a PATCH request to the specified endpoint, the following JSON updates the given job trigger with a new infoType to scan for, as well as a new minimum likelihood. Note that you must also specify the updateMask attribute, and that its value is in FieldMask format.

JSON input:

PATCH https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]?key={YOUR_API_KEY}

{
  "jobTrigger":{
    "inspectJob":{
      "inspectConfig":{
        "infoTypes":[
          {
            "name":"US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
          }
        ],
        "minLikelihood":"LIKELY"
      }
    }
  },
  "updateMask":"inspectJob(inspectConfig(infoTypes,minLikelihood))"
}
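The PATCH body above can also be assembled programmatically. A minimal sketch using plain dictionaries (not a client-library call; the field names simply mirror the JSON shown):

```python
# Assemble a jobTriggers.patch request body that updates the infoTypes
# and minimum likelihood, together with the required updateMask.
# Plain dicts are used so the structure mirrors the JSON example above.
import json

def build_update_body(info_types, min_likelihood):
    """Return a jobTriggers.patch request body with an updateMask."""
    return {
        'jobTrigger': {
            'inspectJob': {
                'inspectConfig': {
                    'infoTypes': [{'name': t} for t in info_types],
                    'minLikelihood': min_likelihood,
                }
            }
        },
        'updateMask': 'inspectJob(inspectConfig(infoTypes,minLikelihood))',
    }

body = build_update_body(
    ['US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER'], 'LIKELY')
print(json.dumps(body, indent=2))
```

Serializing this dict to JSON yields a body equivalent to the example above, ready to send in the PATCH request.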

After you send this JSON to the specified URL, it returns the following, which represents the updated job trigger. Note that the original infoType and likelihood values have been replaced by the new values.

JSON output:

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://dlptesting/*"
        },
        "fileTypes":[
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent":100
      },
      "timespanConfig":{
        "enableAutoPopulationOfTimespanConfig":true
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
        }
      ],
      "minLikelihood":"LIKELY",
      "limits":{

      }
    },
    "actions":[
      {
        "jobNotificationEmails":{

        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2019-03-06T21:19:45.774841Z",
  "updateTime":"2019-03-06T21:27:01.650183Z",
  "lastRunTime":"1970-01-01T00:00:00Z",
  "status":"HEALTHY"
}


Using job triggers

This section describes how to use job triggers to scan only new content, and how to trigger a job each time a file is uploaded to Cloud Storage by using Cloud Functions.

Limit scans to only new content

You can also set an option to automatically configure timespan dates for files stored in Cloud Storage or BigQuery. After you configure the TimespanConfig object to auto-populate, Cloud DLP only scans data that was added or changed since the trigger last ran:

...
  timespan_config {
        enable_auto_population_of_timespan_config: true
      }
...
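In a REST or client-library trigger configuration, the same option corresponds to a timespanConfig field on the storage configuration, as in the JSON examples earlier on this page. A minimal sketch of the relevant fragment as a Python dict (the bucket URL is a hypothetical placeholder):

```python
# Fragment of a job trigger's storageConfig that enables auto-population
# of the timespan, so each run only scans content added or changed since
# the previous run. Mirrors the JSON examples earlier on this page.
storage_config = {
    'cloudStorageOptions': {
        'fileSet': {'url': 'gs://my-bucket/*'},  # hypothetical bucket
    },
    'timespanConfig': {
        'enableAutoPopulationOfTimespanConfig': True,
    },
}

print(storage_config['timespanConfig'])
```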

Trigger jobs on file upload

In addition to the job trigger support that is built into Cloud DLP, GCP has a variety of other components that you can use to integrate with or trigger DLP jobs. For example, you can use Cloud Functions to trigger a DLP scan every time a file is uploaded to Cloud Storage.

For step-by-step instructions, see Automating the classification of data uploaded to Cloud Storage.
