Create job triggers

You can schedule Cloud Data Loss Prevention (DLP) scan jobs automatically by creating job triggers. A job trigger is an event that automates the creation of DLP jobs to scan Google Cloud Platform storage repositories. Job triggers also let you schedule scan jobs by setting the interval at which each trigger goes off. They can be configured to look for new findings since the last scan run, which helps you monitor changes or additions to content, or to generate up-to-date findings reports.

For more information about job triggers, see the Job triggers concept topic. The rest of this topic shows how to create job triggers, along with sample code that you can use in your own Cloud DLP projects.

Create a job trigger

A job trigger is represented in Cloud DLP by the JobTrigger object. To create a new job trigger, use the JobTrigger object's projects.jobTriggers.create method.

Code samples

Following is sample JSON and code in several languages that demonstrate how to use Cloud DLP to create a new job trigger.

Protocol

This sample JSON can be sent in a POST request to the specified Cloud DLP REST endpoint. It demonstrates how to create a job trigger in Cloud DLP. The job that this trigger starts is a Datastore inspection scan. The job trigger runs every 86,400 seconds (that is, every 24 hours).

To quickly try this out, you can use the API Explorer on the reference page for projects.jobTriggers.create. Keep in mind that a successful request, even in the API Explorer, creates a new scheduled job trigger. For general information about using JSON to send requests to the Cloud DLP API, see the JSON quickstart.

JSON input:

POST https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers?key={YOUR_API_KEY}

{
  "jobTrigger":{
    "displayName":"JobTrigger1",
    "description":"Starts a DLP scan job of a Datastore kind",
    "triggers":[
      {
        "schedule":{
          "recurrencePeriodDuration":"86400s"
        }
      }
    ],
    "status":"HEALTHY",
    "inspectJob":{
      "storageConfig":{
        "datastoreOptions":{
          "kind":{
            "name":"Example-Kind"
          },
          "partitionId":{
            "projectId":"[PROJECT_ID]",
            "namespaceId":"[NAMESPACE_ID]"
          }
        }
      },
      "inspectConfig":{
        "infoTypes":[
          {
            "name":"PHONE_NUMBER"
          }
        ],
        "excludeInfoTypes":false,
        "includeQuote":true,
        "minLikelihood":"LIKELY"
      },
      "actions":[
        {
          "saveFindings":{
            "outputConfig":{
              "table":{
                "projectId":"[PROJECT_ID]",
                "datasetId":"[BIGQUERY_DATASET_NAME]",
                "tableId":"[BIGQUERY_TABLE_NAME]"
              }
            }
          }
        }
      ]
    }
  }
}

JSON output:

The following output indicates that the job trigger was created successfully.

{
  "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_ID]",
  "displayName":"JobTrigger1",
  "description":"Starts a DLP scan job of a Datastore kind",
  "inspectJob":{
    "storageConfig":{
      "datastoreOptions":{
        "partitionId":{
          "projectId":"[PROJECT_ID]",
          "namespaceId":"[NAMESPACE_ID]"
        },
        "kind":{
          "name":"Example-Kind"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "minLikelihood":"LIKELY",
      "limits":{

      },
      "includeQuote":true
    },
    "actions":[
      {
        "saveFindings":{
          "outputConfig":{
            "table":{
              "projectId":"[PROJECT_ID]",
              "datasetId":"[BIGQUERY_DATASET_NAME]",
              "tableId":"[BIGQUERY_TABLE_NAME]"
            }
          }
        }
      }
    ]
  },
  "triggers":[
    {
      "schedule":{
        "recurrencePeriodDuration":"86400s"
      }
    }
  ],
  "createTime":"2018-11-30T01:52:41.171857Z",
  "updateTime":"2018-11-30T01:52:41.171857Z",
  "status":"HEALTHY"
}

Java

/**
 * Schedule a DLP inspection trigger for a GCS location.
 *
 * @param triggerId (Optional) name of the trigger to be created
 * @param displayName (Optional) display name for the trigger to be created
 * @param description (Optional) description for the trigger to be created
 * @param bucketName Name of the GCS bucket to scan
 * @param fileName Name of the file within the bucket to scan
 * @param autoPopulateTimespan If true, limits scans to new content only.
 * @param scanPeriod How often to wait between scans, in days (minimum = 1 day)
 * @param infoTypes infoTypes of information to match eg. InfoType.PHONE_NUMBER,
 *     InfoType.EMAIL_ADDRESS
 * @param minLikelihood minimum likelihood required before returning a match
 * @param maxFindings maximum number of findings to report per request (0 = server maximum)
 * @param projectId The project ID to run the API call under
 */
private static void createTrigger(
    String triggerId,
    String displayName,
    String description,
    String bucketName,
    String fileName,
    boolean autoPopulateTimespan,
    int scanPeriod,
    List<InfoType> infoTypes,
    Likelihood minLikelihood,
    int maxFindings,
    String projectId)
    throws Exception {

  // instantiate a client
  DlpServiceClient dlpServiceClient = DlpServiceClient.create();
  try {

    CloudStorageOptions cloudStorageOptions =
        CloudStorageOptions.newBuilder()
            .setFileSet(
                CloudStorageOptions.FileSet.newBuilder()
                    .setUrl("gs://" + bucketName + "/" + fileName))
            .build();

    TimespanConfig timespanConfig = TimespanConfig.newBuilder()
        .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan).build();

    StorageConfig storageConfig =
        StorageConfig.newBuilder().setCloudStorageOptions(cloudStorageOptions)
            .setTimespanConfig(timespanConfig).build();

    InspectConfig.FindingLimits findingLimits =
        InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerRequest(maxFindings).build();

    InspectConfig inspectConfig =
        InspectConfig.newBuilder()
            .addAllInfoTypes(infoTypes)
            .setMinLikelihood(minLikelihood)
            .setLimits(findingLimits)
            .build();

    InspectJobConfig inspectJobConfig =
        InspectJobConfig.newBuilder()
            .setInspectConfig(inspectConfig)
            .setStorageConfig(storageConfig)
            .build();

    // Schedule scan of GCS bucket every scanPeriod number of days (minimum = 1 day)
    Duration duration = Duration.newBuilder().setSeconds(scanPeriod * 24 * 3600).build();
    Schedule schedule = Schedule.newBuilder().setRecurrencePeriodDuration(duration).build();
    JobTrigger.Trigger trigger = JobTrigger.Trigger.newBuilder().setSchedule(schedule).build();
    JobTrigger jobTrigger =
        JobTrigger.newBuilder()
            .setInspectJob(inspectJobConfig)
            .setName(triggerId)
            .setDisplayName(displayName)
            .setDescription(description)
            .setStatus(JobTrigger.Status.HEALTHY)
            .addTriggers(trigger)
            .build();

    // Create scan request
    CreateJobTriggerRequest createJobTriggerRequest =
        CreateJobTriggerRequest.newBuilder()
            .setParent(ProjectName.of(projectId).toString())
            .setJobTrigger(jobTrigger)
            .build();

    JobTrigger createdJobTrigger = dlpServiceClient.createJobTrigger(createJobTriggerRequest);

    System.out.println("Created Trigger: " + createdJobTrigger.getName());
  } catch (Exception e) {
    System.out.println("Error creating trigger: " + e.getMessage());
  }
}

Node.js

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const callingProjectId = process.env.GCLOUD_PROJECT;

// (Optional) The name of the trigger to be created.
// const triggerId = 'my-trigger';

// (Optional) A display name for the trigger to be created
// const displayName = 'My Trigger';

// (Optional) A description for the trigger to be created
// const description = "This is a sample trigger.";

// The name of the bucket to scan.
// const bucketName = 'YOUR-BUCKET';

// Limit scan to new content only.
// const autoPopulateTimespan = true;

// How often to wait between scans, in days (minimum = 1 day)
// const scanPeriod = 1;

// The infoTypes of information to match
// const infoTypes = [{ name: 'PHONE_NUMBER' }, { name: 'EMAIL_ADDRESS' }, { name: 'CREDIT_CARD_NUMBER' }];

// The minimum likelihood required before returning a match
// const minLikelihood = 'LIKELIHOOD_UNSPECIFIED';

// The maximum number of findings to report per request (0 = server maximum)
// const maxFindings = 0;

// Get reference to the bucket to be inspected
const storageItem = {
  cloudStorageOptions: {
    fileSet: {url: `gs://${bucketName}/*`},
  },
  timeSpanConfig: {
    enableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
  },
};

// Construct job to be triggered
const job = {
  inspectConfig: {
    infoTypes: infoTypes,
    minLikelihood: minLikelihood,
    limits: {
      maxFindingsPerRequest: maxFindings,
    },
  },
  storageConfig: storageItem,
};

// Construct trigger creation request
const request = {
  parent: dlp.projectPath(callingProjectId),
  jobTrigger: {
    inspectJob: job,
    displayName: displayName,
    description: description,
    triggers: [
      {
        schedule: {
          recurrencePeriodDuration: {
            seconds: scanPeriod * 60 * 60 * 24, // Convert the scan period from days to seconds
          },
        },
      },
    ],
    status: 'HEALTHY',
  },
  triggerId: triggerId,
};

try {
  // Run trigger creation request
  const [trigger] = await dlp.createJobTrigger(request);
  console.log(`Successfully created trigger ${trigger.name}.`);
} catch (err) {
  console.log(`Error in createTrigger: ${err.message || err}`);
}

Python

def create_trigger(project, bucket, scan_period_days, info_types,
                   trigger_id=None, display_name=None, description=None,
                   min_likelihood=None, max_findings=None,
                   auto_populate_timespan=False):
    """Creates a scheduled Data Loss Prevention API inspect_content trigger.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket using a wildcard.
        scan_period_days: How often to repeat the scan, in days.
            The minimum is 1 day.
        info_types: A list of strings representing info types to look for.
            A full list of info type categories can be fetched from the API.
        trigger_id: The id of the trigger. If omitted, an id will be randomly
            generated.
        display_name: The optional display name of the trigger.
        description: The optional description of the trigger.
        min_likelihood: A string representing the minimum likelihood threshold
            that constitutes a match. One of: 'LIKELIHOOD_UNSPECIFIED',
            'VERY_UNLIKELY', 'UNLIKELY', 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the client library
    import google.cloud.dlp

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{'name': info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        'info_types': info_types,
        'min_likelihood': min_likelihood,
        'limits': {'max_findings_per_request': max_findings},
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = 'gs://{}/*'.format(bucket)
    storage_config = {
        'cloud_storage_options': {
            'file_set': {'url': url}
        },
        # Time-based configuration for each storage object.
        'timespan_config': {
            # Auto-populate start and end times in order to scan new objects
            # only.
            'enable_auto_population_of_timespan_config': auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {
        'inspect_config': inspect_config,
        'storage_config': storage_config,
    }

    # Construct the schedule definition:
    schedule = {
        'recurrence_period_duration': {
            'seconds': scan_period_days * 60 * 60 * 24,
        }
    }

    # Construct the trigger definition.
    job_trigger = {
        'inspect_job': job,
        'display_name': display_name,
        'description': description,
        'triggers': [
            {'schedule': schedule}
        ],
        'status': 'HEALTHY'
    }

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Call the API.
    response = dlp.create_job_trigger(
        parent, job_trigger=job_trigger, trigger_id=trigger_id)

    print('Successfully created trigger {}'.format(response.name))

Go

// createTrigger creates a trigger with the given configuration.
func createTrigger(w io.Writer, client *dlp.Client, project string, minLikelihood dlppb.Likelihood, maxFindings int32, triggerID, displayName, description, bucketName string, autoPopulateTimespan bool, scanPeriodDays int64, infoTypes []string) {
	// Convert the info type strings to a list of InfoTypes.
	var i []*dlppb.InfoType
	for _, it := range infoTypes {
		i = append(i, &dlppb.InfoType{Name: it})
	}

	// Create a configured request.
	req := &dlppb.CreateJobTriggerRequest{
		Parent:    "projects/" + project,
		TriggerId: triggerID,
		JobTrigger: &dlppb.JobTrigger{
			DisplayName: displayName,
			Description: description,
			Status:      dlppb.JobTrigger_HEALTHY,
			// Triggers control when the job will start.
			Triggers: []*dlppb.JobTrigger_Trigger{
				{
					Trigger: &dlppb.JobTrigger_Trigger_Schedule{
						Schedule: &dlppb.Schedule{
							Option: &dlppb.Schedule_RecurrencePeriodDuration{
								RecurrencePeriodDuration: &duration.Duration{
									Seconds: scanPeriodDays * 60 * 60 * 24, // Days to seconds.
								},
							},
						},
					},
				},
			},
			// Job configures the job to run when the trigger runs.
			Job: &dlppb.JobTrigger_InspectJob{
				InspectJob: &dlppb.InspectJobConfig{
					InspectConfig: &dlppb.InspectConfig{
						InfoTypes:     i,
						MinLikelihood: minLikelihood,
						Limits: &dlppb.InspectConfig_FindingLimits{
							MaxFindingsPerRequest: maxFindings,
						},
					},
					StorageConfig: &dlppb.StorageConfig{
						Type: &dlppb.StorageConfig_CloudStorageOptions{
							CloudStorageOptions: &dlppb.CloudStorageOptions{
								FileSet: &dlppb.CloudStorageOptions_FileSet{
									Url: "gs://" + bucketName + "/*",
								},
							},
						},
						// Time-based configuration for each storage object. See more at
						// https://cloud.google.com/dlp/docs/reference/rest/v2/InspectJobConfig#TimespanConfig
						TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
							// Auto-populate start and end times in order to scan new objects only.
							EnableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
						},
					},
				},
			},
		},
	}
	// Send the request.
	resp, err := client.CreateJobTrigger(context.Background(), req)
	if err != nil {
		log.Fatalf("error creating job trigger: %v", err)
	}
	fmt.Fprintf(w, "Successfully created trigger: %v", resp.GetName())
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\JobTrigger\Trigger;
use Google\Cloud\Dlp\V2\JobTrigger\Status;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Schedule;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions_FileSet;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig_TimespanConfig;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Protobuf\Duration;

/**
 * Create a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId The project ID to run the API call under
 * @param string $bucketName The name of the bucket to scan
 * @param string $triggerId (Optional) The name of the trigger to be created
 * @param string $displayName (Optional) The human-readable name to give the trigger
 * @param string $description (Optional) A description for the trigger to be created
 * @param int $scanPeriod (Optional) How often to wait between scans, in days (minimum = 1 day)
 * @param int $maxFindings (Optional) The maximum number of findings to report per request (0 = server maximum)
 * @param bool $autoPopulateTimespan (Optional) Automatically limit scan to new content only
 */

function create_trigger(
    $callingProjectId,
    $bucketName,
    $triggerId = '',
    $displayName = '',
    $description = '',
    $scanPeriod = 1,
    $maxFindings = 0,
    $autoPopulateTimespan = false
) {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // ----- Construct job config -----
    // The infoTypes of information to match
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$personNameInfoType, $phoneNumberInfoType];

    // The minimum likelihood required before returning a match
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // Specify finding limits
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest($maxFindings);

    // Create the inspectConfig object
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes);

    // Create triggers
    $duration = (new Duration())
        ->setSeconds($scanPeriod * 60 * 60 * 24);

    $schedule = (new Schedule())
        ->setRecurrencePeriodDuration($duration);

    $triggerObject = (new Trigger())
        ->setSchedule($schedule);

    // Create the storageConfig object
    $fileSet = (new CloudStorageOptions_FileSet())
        ->setUrl('gs://' . $bucketName . '/*');

    $storageOptions = (new CloudStorageOptions())
        ->setFileSet($fileSet);

    // Auto-populate start and end times in order to scan new objects only.
    $timespanConfig = (new StorageConfig_TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig($autoPopulateTimespan);

    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions($storageOptions)
        ->setTimespanConfig($timespanConfig);

    // Construct the jobConfig object
    $jobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig);

    // ----- Construct trigger object -----
    $jobTriggerObject = (new JobTrigger())
        ->setTriggers([$triggerObject])
        ->setInspectJob($jobConfig)
        ->setStatus(Status::HEALTHY)
        ->setDisplayName($displayName)
        ->setDescription($description);

    // Run trigger creation request
    $parent = $dlp->projectName($callingProjectId);
    $dlp->createJobTrigger($parent, [
        'jobTrigger' => $jobTriggerObject,
        'triggerId' => $triggerId
    ]);

    // Print results
    printf('Successfully created trigger %s' . PHP_EOL, $triggerId);
}

C#

public static object CreateJobTrigger(
    string projectId,
    string bucketName,
    string minLikelihood,
    int maxFindings,
    bool autoPopulateTimespan,
    int scanPeriod,
    IEnumerable<InfoType> infoTypes,
    string triggerId,
    string displayName,
    string description)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    var jobConfig = new InspectJobConfig
    {
        InspectConfig = new InspectConfig
        {
            MinLikelihood = (Likelihood)Enum.Parse(
                typeof(Likelihood),
                minLikelihood
            ),
            Limits = new FindingLimits
            {
                MaxFindingsPerRequest = maxFindings
            },
            InfoTypes = { infoTypes }
        },
        StorageConfig = new StorageConfig
        {
            CloudStorageOptions = new CloudStorageOptions
            {
                FileSet = new FileSet
                {
                    Url = $"gs://{bucketName}/*"
                }
            },
            TimespanConfig = new TimespanConfig
            {
                EnableAutoPopulationOfTimespanConfig = autoPopulateTimespan
            }
        }
    };

    var jobTrigger = new JobTrigger
    {
        Triggers =
        {
            new Trigger
            {
                Schedule = new Schedule
                {
                    RecurrencePeriodDuration = new Google.Protobuf.WellKnownTypes.Duration
                    {
                        Seconds = scanPeriod * 60 * 60 * 24
                    }
                }
            }
        },
        InspectJob = jobConfig,
        Status = Status.Healthy,
        DisplayName = displayName,
        Description = description
    };

    JobTrigger response = dlp.CreateJobTrigger(
        new CreateJobTriggerRequest
        {
            ParentAsProjectName = new ProjectName(projectId),
            JobTrigger = jobTrigger,
            TriggerId = triggerId
        });

    Console.WriteLine($"Successfully created trigger {response.Name}");
    return 0;
}

List all job triggers

To list all job triggers, call the JobTrigger object's projects.jobTriggers.list method.

Following is sample JSON and code in several languages that demonstrate how to list all job triggers.

Code samples

Protocol

To list all job triggers currently defined in your project, send a GET request to the jobTriggers endpoint, as shown here:

URL:

GET https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers?key={YOUR_API_KEY}

The following JSON output lists the job trigger that was created in the previous section. Note that the structure of the job trigger mirrors that of the JobTrigger resource.

JSON output:

{
  "jobTriggers":[
    {
      "name":"projects/[PROJECT_ID]/jobTriggers/[JOB_ID]",
      "displayName":"JobTrigger1",
      "description":"Starts a DLP scan job of a Datastore kind",
      "inspectJob":{
        "storageConfig":{
          "datastoreOptions":{
            "partitionId":{
              "projectId":"[PROJECT_ID]",
              "namespaceId":"[NAMESPACE_ID]"
            },
            "kind":{
              "name":"Example-Kind"
            }
          }
        },
        "inspectConfig":{
          "infoTypes":[
            {
              "name":"PHONE_NUMBER"
            }
          ],
          "minLikelihood":"LIKELY",
          "limits":{

          },
          "includeQuote":true
        },
        "actions":[
          {
            "saveFindings":{
              "outputConfig":{
                "table":{
                  "projectId":"[PROJECT_ID]",
                  "datasetId":"[BIGQUERY_DATASET_NAME]",
                  "tableId":"[BIGQUERY_TABLE_NAME]"
                }
              }
            }
          }
        ]
      },
      "triggers":[
        {
          "schedule":{
            "recurrencePeriodDuration":"86400s"
          }
        }
      ],
      "createTime":"2018-11-30T01:52:41.171857Z",
      "updateTime":"2018-11-30T01:52:41.171857Z",
      "status":"HEALTHY"
    },

    ...

  ],
  "nextPageToken":"KkwKCQjivJ2UpPreAgo_Kj1wcm9qZWN0cy92ZWx2ZXR5LXN0dWR5LTE5NjEwMS9qb2JUcmlnZ2Vycy8xNTA5NzEyOTczMDI0MDc1NzY0"
}

Java

/**
 * List all DLP triggers for a given project.
 *
 * @param projectId The project ID to run the API call under.
 */
private static void listTriggers(String projectId) {
  // Instantiates a client
  try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
    ListJobTriggersRequest listJobTriggersRequest =
        ListJobTriggersRequest.newBuilder()
            .setParent(ProjectName.of(projectId).toString())
            .build();
    DlpServiceClient.ListJobTriggersPagedResponse response =
        dlpServiceClient.listJobTriggers(listJobTriggersRequest);
    response
        .getPage()
        .getValues()
        .forEach(
            trigger -> {
              System.out.println("Trigger: " + trigger.getName());
              System.out.println("\tCreated: " + trigger.getCreateTime());
              System.out.println("\tUpdated: " + trigger.getUpdateTime());
              if (trigger.getDisplayName() != null) {
                System.out.println("\tDisplay name: " + trigger.getDisplayName());
              }
              if (trigger.getDescription() != null) {
                System.out.println("\tDescription: " + trigger.getDescription());
              }
              System.out.println("\tStatus: " + trigger.getStatus());
              System.out.println("\tError count: " + trigger.getErrorsCount());
            });
  } catch (Exception e) {
    System.out.println("Error listing triggers :" + e.getMessage());
  }
}

Node.js

async function listTriggers(callingProjectId) {
  // Imports the Google Cloud Data Loss Prevention library
  const DLP = require('@google-cloud/dlp');

  // Instantiates a client
  const dlp = new DLP.DlpServiceClient();

  // The project ID to run the API call under
  // const callingProjectId = process.env.GCLOUD_PROJECT;

  // Construct trigger listing request
  const request = {
    parent: dlp.projectPath(callingProjectId),
  };

  // Helper function to pretty-print dates
  const formatDate = date => {
    const msSinceEpoch = parseInt(date.seconds, 10) * 1000;
    return new Date(msSinceEpoch).toLocaleString('en-US');
  };

  try {
    // Run trigger listing request
    const [triggers] = await dlp.listJobTriggers(request);
    triggers.forEach(trigger => {
      // Log trigger details
      console.log(`Trigger ${trigger.name}:`);
      console.log(`  Created: ${formatDate(trigger.createTime)}`);
      console.log(`  Updated: ${formatDate(trigger.updateTime)}`);
      if (trigger.displayName) {
        console.log(`  Display Name: ${trigger.displayName}`);
      }
      if (trigger.description) {
        console.log(`  Description: ${trigger.description}`);
      }
      console.log(`  Status: ${trigger.status}`);
      console.log(`  Error count: ${trigger.errors.length}`);
    });
  } catch (err) {
    console.log(`Error in listTriggers: ${err.message || err}`);
  }
}

async function deleteTrigger(triggerId) {
  // Imports the Google Cloud Data Loss Prevention library
  const DLP = require('@google-cloud/dlp');

  // Instantiates a client
  const dlp = new DLP.DlpServiceClient();

  // The name of the trigger to be deleted
  // Parent project ID is automatically extracted from this parameter
  // const triggerId = 'projects/my-project/jobTriggers/my-trigger';

  // Construct trigger deletion request
  const request = {
    name: triggerId,
  };
  try {
    // Run trigger deletion request
    await dlp.deleteJobTrigger(request);
    console.log(`Successfully deleted trigger ${triggerId}.`);
  } catch (err) {
    console.log(`Error in deleteTrigger: ${err.message || err}`);
  }

}

const cli = require(`yargs`) // eslint-disable-line
  .demand(1)
  .command(
    `create <bucketName> <scanPeriod>`,
    `Create a Data Loss Prevention API job trigger.`,
    {
      infoTypes: {
        alias: 't',
        default: ['PHONE_NUMBER', 'EMAIL_ADDRESS', 'CREDIT_CARD_NUMBER'],
        type: 'array',
        global: true,
        coerce: infoTypes =>
          infoTypes.map(type => {
            return {name: type};
          }),
      },
      triggerId: {
        alias: 'n',
        default: '',
        type: 'string',
      },
      displayName: {
        alias: 'd',
        default: '',
        type: 'string',
      },
      description: {
        alias: 's',
        default: '',
        type: 'string',
      },
      autoPopulateTimespan: {
        default: false,
        type: 'boolean',
      },
      minLikelihood: {
        alias: 'm',
        default: 'LIKELIHOOD_UNSPECIFIED',
        type: 'string',
        choices: [
          'LIKELIHOOD_UNSPECIFIED',
          'VERY_UNLIKELY',
          'UNLIKELY',
          'POSSIBLE',
          'LIKELY',
          'VERY_LIKELY',
        ],
        global: true,
      },
      maxFindings: {
        alias: 'f',
        default: 0,
        type: 'number',
        global: true,
      },
    },
    opts =>
      createTrigger(
        opts.callingProjectId,
        opts.triggerId,
        opts.displayName,
        opts.description,
        opts.bucketName,
        opts.autoPopulateTimespan,
        opts.scanPeriod,
        opts.infoTypes,
        opts.minLikelihood,
        opts.maxFindings
      )
  )
  .command(`list`, `List Data Loss Prevention API job triggers.`, {}, opts =>
    listTriggers(opts.callingProjectId)
  )
  .command(
    `delete <triggerId>`,
    `Delete a Data Loss Prevention API job trigger.`,
    {},
    opts => deleteTrigger(opts.triggerId)
  )
  .option('c', {
    type: 'string',
    alias: 'callingProjectId',
    default: process.env.GCLOUD_PROJECT || '',
  })
  .example(`node $0 create my-bucket 1`)
  .example(`node $0 list`)
  .example(`node $0 delete projects/my-project/jobTriggers/my-trigger`)
  .wrap(120)
  .recommendCommands()
  .epilogue(`For more information, see https://cloud.google.com/dlp/docs.`);

if (module === require.main) {
  cli.help().strict().argv; // eslint-disable-line
}

Python

def list_triggers(project):
    """Lists all Data Loss Prevention API triggers.
    Args:
        project: The Google Cloud project id to use as a parent resource.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the standard time library (used below to format timestamps)
    # and the client library.
    import time

    import google.cloud.dlp

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Call the API.
    response = dlp.list_job_triggers(parent)

    # Define a helper function to convert the API's "seconds since the epoch"
    # time format into a human-readable string.
    def human_readable_time(timestamp):
        return str(time.localtime(timestamp.seconds))

    for trigger in response:
        print('Trigger {}:'.format(trigger.name))
        print('  Created: {}'.format(human_readable_time(trigger.create_time)))
        print('  Updated: {}'.format(human_readable_time(trigger.update_time)))
        if trigger.display_name:
            print('  Display Name: {}'.format(trigger.display_name))
        if trigger.description:
            print('  Description: {}'.format(trigger.description))
        print('  Status: {}'.format(trigger.status))
        print('  Error count: {}'.format(len(trigger.errors)))

Go

// listTriggers lists the triggers for the given project.
func listTriggers(w io.Writer, client *dlp.Client, project string) {
	// Create a configured request.
	req := &dlppb.ListJobTriggersRequest{
		Parent: "projects/" + project,
	}
	// Send the request and iterate over the results.
	it := client.ListJobTriggers(context.Background(), req)
	for {
		t, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("error getting jobs: %v", err)
		}
		c := t.GetCreateTime()
		u := t.GetUpdateTime()
		fmt.Fprintf(w, "Trigger %v\n", t.GetName())
		fmt.Fprintf(w, "  Created: %v\n", time.Unix(c.GetSeconds(), int64(c.GetNanos())).Format(time.RFC1123))
		fmt.Fprintf(w, "  Updated: %v\n", time.Unix(u.GetSeconds(), int64(u.GetNanos())).Format(time.RFC1123))
		fmt.Fprintf(w, "  Display Name: %q\n", t.GetDisplayName())
		fmt.Fprintf(w, "  Description: %q\n", t.GetDescription())
		fmt.Fprintf(w, "  Status: %v\n", t.GetStatus())
		fmt.Fprintf(w, "  Error Count: %v\n", len(t.GetErrors()))
	}
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;

/**
 * List Data Loss Prevention API job triggers.
 * @param string $callingProjectId The GCP Project ID to run the API call under
 */
function list_triggers($callingProjectId)
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    $parent = $dlp->projectName($callingProjectId);

    // Run request
    $response = $dlp->listJobTriggers($parent);

    // Print results
    $triggers = $response->iterateAllElements();
    foreach ($triggers as $trigger) {
        printf('Trigger %s' . PHP_EOL, $trigger->getName());
        printf('  Created: %s' . PHP_EOL, $trigger->getCreateTime()->getSeconds());
        printf('  Updated: %s' . PHP_EOL, $trigger->getUpdateTime()->getSeconds());
        printf('  Display Name: %s' . PHP_EOL, $trigger->getDisplayName());
        printf('  Description: %s' . PHP_EOL, $trigger->getDescription());
        printf('  Status: %s' . PHP_EOL, $trigger->getStatus());
        printf('  Error count: %s' . PHP_EOL, count($trigger->getErrors()));
        $timespanConfig = $trigger->getInspectJob()->getStorageConfig()->getTimespanConfig();
        printf('  Auto-populates timespan config: %s' . PHP_EOL,
            ($timespanConfig && $timespanConfig->getEnableAutoPopulationOfTimespanConfig() ? 'yes' : 'no'));
    }
}

C#

public static object ListJobTriggers(string projectId)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    var response = dlp.ListJobTriggers(
        new ListJobTriggersRequest
        {
            ParentAsProjectName = new ProjectName(projectId)
        });

    foreach (var trigger in response)
    {
        Console.WriteLine($"Name: {trigger.Name}");
        Console.WriteLine($"  Created: {trigger.CreateTime.ToString()}");
        Console.WriteLine($"  Updated: {trigger.UpdateTime.ToString()}");
        Console.WriteLine($"  Display Name: {trigger.DisplayName}");
        Console.WriteLine($"  Description: {trigger.Description}");
        Console.WriteLine($"  Status: {trigger.Status}");
        Console.WriteLine($"  Error count: {trigger.Errors.Count}");
    }

    return 0;
}

Delete a job trigger

To delete an existing job trigger, call the JobTrigger object's projects.jobTriggers.delete method.

Following is sample JSON and code in several languages that demonstrate how to delete a job trigger.

Code samples

Protocol

To delete a job trigger from your project, send a DELETE request to the jobTriggers endpoint, as shown here. Replace [JOB_ID] with the ID of the job trigger, which is given in the "name" field of the JSON response returned by the original creation request.

URL:

DELETE https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers/[JOB_ID]?key={YOUR_API_KEY}

If the request succeeds, the Cloud DLP API returns a success response. To verify that the job trigger was deleted, list all job triggers.

Java

/**
 * Delete a DLP trigger in a project.
 *
 * @param projectId The project ID to run the API call under.
 * @param triggerId Trigger ID
 */
private static void deleteTrigger(String projectId, String triggerId) {

  ProjectJobTriggerName triggerName = ProjectJobTriggerName.of(projectId, triggerId);
  try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
    DeleteJobTriggerRequest deleteJobTriggerRequest =
        DeleteJobTriggerRequest.newBuilder().setName(triggerName.toString()).build();
    dlpServiceClient.deleteJobTrigger(deleteJobTriggerRequest);

    System.out.println("Trigger deleted: " + triggerName.toString());
  } catch (Exception e) {
    System.out.println("Error deleting trigger :" + e.getMessage());
  }
}

Node.js

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The name of the trigger to be deleted
// Parent project ID is automatically extracted from this parameter
// const triggerId = 'projects/my-project/jobTriggers/my-trigger';

// Construct trigger deletion request
const request = {
  name: triggerId,
};
try {
  // Run trigger deletion request
  await dlp.deleteJobTrigger(request);
  console.log(`Successfully deleted trigger ${triggerId}.`);
} catch (err) {
  console.log(`Error in deleteTrigger: ${err.message || err}`);
}

Python

def delete_trigger(project, trigger_id):
    """Deletes a Data Loss Prevention API trigger.
    Args:
        project: The id of the Google Cloud project which owns the trigger.
        trigger_id: The id of the trigger to delete.
    Returns:
        None; the response from the API is printed to the terminal.
    """

    # Import the client library
    import google.cloud.dlp

    # Instantiate a client.
    dlp = google.cloud.dlp.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = dlp.project_path(project)

    # Combine the trigger id with the parent id.
    trigger_resource = '{}/jobTriggers/{}'.format(parent, trigger_id)

    # Call the API.
    dlp.delete_job_trigger(trigger_resource)

    print('Trigger {} successfully deleted.'.format(trigger_resource))

if __name__ == '__main__':
    # Standard-library imports used by the command-line parser below.
    import argparse
    import os

    default_project = os.environ.get('GCLOUD_PROJECT')

    parser = argparse.ArgumentParser(description=__doc__)
    subparsers = parser.add_subparsers(
        dest='action', help='Select which action to perform.')
    subparsers.required = True

    parser_create = subparsers.add_parser('create', help='Create a trigger.')
    parser_create.add_argument(
        'bucket', help='The name of the GCS bucket containing the file.')
    parser_create.add_argument(
        'scan_period_days', type=int,
        help='How often to repeat the scan, in days. The minimum is 1 day.')
    parser_create.add_argument(
        '--trigger_id',
        help='The id of the trigger. If omitted, an id will be randomly '
             'generated')
    parser_create.add_argument(
        '--display_name',
        help='The optional display name of the trigger.')
    parser_create.add_argument(
        '--description',
        help='The optional description of the trigger.')
    parser_create.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)
    parser_create.add_argument(
        '--info_types', action='append',
        help='Strings representing info types to look for. A full list of '
             'info categories and types is available from the API. Examples '
             'include "FIRST_NAME", "LAST_NAME", "EMAIL_ADDRESS". '
             'If unspecified, the three above examples will be used.',
        default=['FIRST_NAME', 'LAST_NAME', 'EMAIL_ADDRESS'])
    parser_create.add_argument(
        '--min_likelihood',
        choices=['LIKELIHOOD_UNSPECIFIED', 'VERY_UNLIKELY', 'UNLIKELY',
                 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'],
        help='A string representing the minimum likelihood threshold that '
             'constitutes a match.')
    parser_create.add_argument(
        '--max_findings', type=int,
        help='The maximum number of findings to report; 0 = no maximum.')
    parser_create.add_argument(
        '--auto_populate_timespan', type=bool,
        help='Limit scan to new content only.')

    parser_list = subparsers.add_parser('list', help='List all triggers.')
    parser_list.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)

    parser_delete = subparsers.add_parser('delete', help='Delete a trigger.')
    parser_delete.add_argument(
        'trigger_id',
        help='The id of the trigger to delete.')
    parser_delete.add_argument(
        '--project',
        help='The Google Cloud project id to use as a parent resource.',
        default=default_project)

    args = parser.parse_args()

    if args.action == 'create':
        create_trigger(
            args.project, args.bucket, args.scan_period_days, args.info_types,
            trigger_id=args.trigger_id, display_name=args.display_name,
            description=args.description, min_likelihood=args.min_likelihood,
            max_findings=args.max_findings,
            auto_populate_timespan=args.auto_populate_timespan,
        )
    elif args.action == 'list':
        list_triggers(args.project)
    elif args.action == 'delete':
        delete_trigger(args.project, args.trigger_id)

Go

// deleteTrigger deletes the given trigger.
func deleteTrigger(w io.Writer, client *dlp.Client, triggerID string) {
	req := &dlppb.DeleteJobTriggerRequest{
		Name: triggerID,
	}
	err := client.DeleteJobTrigger(context.Background(), req)
	if err != nil {
		log.Fatalf("error deleting job: %v", err)
	}
	fmt.Fprintf(w, "Successfully deleted trigger %v", triggerID)
}

PHP

use Google\Cloud\Dlp\V2\DlpServiceClient;

/**
 * Delete a Data Loss Prevention API job trigger.
 * @param string $triggerId The name of the trigger to be deleted.
 *        Parent project ID is automatically extracted from this parameter
 */
function delete_trigger($triggerId)
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run request
    $response = $dlp->deleteJobTrigger($triggerId);

    // Print the results
    printf('Successfully deleted trigger %s' . PHP_EOL, $triggerId);
}

C#

public static object DeleteJobTrigger(string triggerName)
{
    DlpServiceClient dlp = DlpServiceClient.Create();

    dlp.DeleteJobTrigger(
        new DeleteJobTriggerRequest
        {
            Name = triggerName
        });

    Console.WriteLine($"Successfully deleted trigger {triggerName}.");
    return 0;
}

Other job trigger actions

In addition to creating, listing, and deleting job triggers, the JobTrigger object also has methods for the following actions (a short sketch follows this list):

  • Update an existing job trigger: Use the projects.jobTriggers.patch method to send updated JobTrigger values to the Cloud DLP API and update those values within the specified job trigger.
  • Retrieve an existing job trigger, including its configuration and status: Use the projects.jobTriggers.get method to retrieve the trigger's configuration and status information.
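
The following Python sketch illustrates both actions with the same client library used in the samples above. It is only a sketch: the project ID, trigger ID, and updated field path are hypothetical placeholders, and you should confirm the method names against your installed version of the library.

# Minimal sketch: get and patch an existing job trigger (hypothetical IDs).
import google.cloud.dlp

dlp = google.cloud.dlp.DlpServiceClient()

# Build the trigger's full resource name:
# projects/[PROJECT_ID]/jobTriggers/[TRIGGER_ID]
project = 'my-project'      # hypothetical project ID
trigger_id = 'my-trigger'   # hypothetical trigger ID
trigger_name = '{}/jobTriggers/{}'.format(dlp.project_path(project), trigger_id)

# projects.jobTriggers.get: fetch the trigger's configuration and status.
trigger = dlp.get_job_trigger(trigger_name)
print('Status: {}'.format(trigger.status))

# projects.jobTriggers.patch: change only the fields listed in update_mask.
# Raising the minimum likelihood is used here as an example field path.
updated_values = {
    'inspect_job': {
        'inspect_config': {'min_likelihood': 'VERY_LIKELY'}
    }
}
update_mask = {'paths': ['inspect_job.inspect_config.min_likelihood']}
response = dlp.update_job_trigger(
    trigger_name, job_trigger=updated_values, update_mask=update_mask)
print('Updated trigger {}'.format(response.name))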

Use job triggers

This section describes how to use job triggers to scan only new content, and how to trigger jobs each time a file is uploaded to Cloud Storage by using Cloud Functions.

Limit scans to new content only

You can also set an option that automatically sets the timespan for files stored in Cloud Storage or BigQuery. Once you've configured the TimespanConfig object to auto-populate, Cloud DLP only scans data that was added or changed since the trigger last ran:

...
  timespan_config {
    enable_auto_population_of_timespan_config: true
  }
...

Trigger jobs on file upload

In addition to the job trigger support that is built into Cloud DLP, GCP also has a variety of other components that you can use to integrate or trigger DLP jobs. For example, you can use Cloud Functions to trigger a DLP scan every time a file is uploaded to Cloud Storage.

For step-by-step instructions on how to do this, see Automating the classification of data uploaded to Cloud Storage.
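
As an illustration of that pattern, the following Python sketch shows a background Cloud Function that starts a DLP inspection job whenever an object is finalized in a bucket. It is a sketch only: the function name, the GCP_PROJECT environment variable, and the chosen infoTypes are assumptions; the tutorial linked above is the authoritative walkthrough.

# main.py -- hypothetical Cloud Function deployed with a
# google.storage.object.finalize trigger on the bucket you want to monitor.
import os

import google.cloud.dlp

dlp = google.cloud.dlp.DlpServiceClient()


def inspect_uploaded_object(data, context):
    """Starts a DLP inspection job for the newly uploaded object."""
    # Assumption: the runtime exposes the project ID in this variable.
    project = os.environ.get('GCP_PROJECT')
    bucket = data['bucket']
    object_name = data['name']

    inspect_job = {
        'storage_config': {
            'cloud_storage_options': {
                'file_set': {'url': 'gs://{}/{}'.format(bucket, object_name)},
            },
        },
        'inspect_config': {
            # Example infoTypes; adjust to what you need to detect.
            'info_types': [{'name': 'PHONE_NUMBER'}, {'name': 'EMAIL_ADDRESS'}],
            'min_likelihood': 'LIKELY',
        },
    }

    parent = dlp.project_path(project)
    response = dlp.create_dlp_job(parent, inspect_job=inspect_job)
    print('Started DLP job {} for gs://{}/{}'.format(
        response.name, bucket, object_name))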
