# Deploy a Cloud Function (2nd gen) with an Audit Log trigger using Terraform
A complete Terraform configuration for deploying an event-driven Cloud Function (2nd gen) together with its supporting resources.
Code sample
-----------
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Difficile da capire","hardToUnderstand","thumb-down"],["Informazioni o codice di esempio errati","incorrectInformationOrSampleCode","thumb-down"],["Mancano le informazioni o gli esempi di cui ho bisogno","missingTheInformationSamplesINeed","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Altra","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis Terraform configuration deploys an event-driven Cloud Function (2nd gen) triggered by Google Cloud Audit Logs, specifically monitoring for \u003ccode\u003estorage.objects.create\u003c/code\u003e events on a designated Google Cloud Storage bucket.\u003c/p\u003e\n"],["\u003cp\u003eThe configuration utilizes a service account for both the Cloud Function and Eventarc trigger, granting it necessary roles like \u003ccode\u003erun.invoker\u003c/code\u003e, \u003ccode\u003eeventarc.eventReceiver\u003c/code\u003e, and \u003ccode\u003eartifactregistry.reader\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eA random ID is generated to create a unique prefix for the source and audit log buckets.\u003c/p\u003e\n"],["\u003cp\u003eThe config specifies a filter for resource names using the \u003ccode\u003ematch-path-pattern\u003c/code\u003e operator, enabling the function to be triggered only when \u003ccode\u003e.txt\u003c/code\u003e files are created within the specified bucket and allows path patterns.\u003c/p\u003e\n"],["\u003cp\u003eThe Cloud Function's build configuration is specified, setting the runtime, entry point, and source code location, while also providing service configurations for instance scaling, memory, timeout, environment variables and network security.\u003c/p\u003e\n"]]],[],null,["# Deploy Cloud Function 2nd gen with Audit Log trigger using Terraform\n\nFull terraform config to deploy an event-driven Cloud Function 2nd gen with resources\n\nCode sample\n-----------\n\n### Terraform\n\n\nTo learn how to apply or remove a Terraform configuration, see\n[Basic Terraform commands](/docs/terraform/basic-commands).\n\n\nFor more information, see the\n[Terraform provider reference documentation](https://registry.terraform.io/providers/hashicorp/google/latest/docs).\n\n # This example follows the examples shown in this Google Cloud Community blog post\n # https://medium.com/google-cloud/applying-a-path-pattern-when-filtering-in-eventarc-f06b937b4c34\n # and the docs https://cloud.google.com/eventarc/docs/path-patterns\n\n terraform {\n required_providers {\n google = {\n source = \"hashicorp/google\"\n version = \"\u003e= 4.34.0\"\n }\n }\n }\n\n resource \"random_id\" \"bucket_prefix\" {\n byte_length = 8\n }\n\n resource \"google_storage_bucket\" \"source_bucket\" {\n name = \"${random_id.bucket_prefix.hex}-gcf-source\"\n location = \"US\"\n uniform_bucket_level_access = true\n }\n\n data \"archive_file\" \"default\" {\n type = \"zip\"\n output_path = \"/tmp/function-source.zip\"\n source_dir = \"function-source/\"\n }\n\n resource \"google_storage_bucket_object\" \"default\" {\n name = \"function-source.zip\"\n bucket = google_storage_bucket.source_bucket.name\n source = data.archive_file.default.output_path # Path to the zipped function source code\n }\n\n resource \"google_service_account\" \"default\" {\n account_id = \"test-gcf-sa\"\n display_name = \"Test Service Account - used for both the cloud function and eventarc trigger in the test\"\n }\n\n # Note: The right 
way of listening for Cloud Storage events is to use a Cloud Storage trigger.\n # Here we use Audit Logs to monitor the bucket so path patterns can be used in the example of\n # google_cloudfunctions2_function below (Audit Log events have path pattern support)\n resource \"google_storage_bucket\" \"audit_log_bucket\" {\n name = \"${random_id.bucket_prefix.hex}-gcf-auditlog-bucket\"\n location = \"us-central1\" # The trigger must be in the same location as the bucket\n uniform_bucket_level_access = true\n }\n\n # Permissions on the service account used by the function and Eventarc trigger\n data \"google_project\" \"project\" {\n }\n\n resource \"google_project_iam_member\" \"invoking\" {\n project = data.google_project.project.project_id\n role = \"roles/run.invoker\"\n member = \"serviceAccount:${google_service_account.default.email}\"\n }\n\n resource \"google_project_iam_member\" \"event_receiving\" {\n project = data.google_project.project.project_id\n role = \"roles/eventarc.eventReceiver\"\n member = \"serviceAccount:${google_service_account.default.email}\"\n depends_on = [google_project_iam_member.invoking]\n }\n\n resource \"google_project_iam_member\" \"artifactregistry_reader\" {\n project = data.google_project.project.project_id\n role = \"roles/artifactregistry.reader\"\n member = \"serviceAccount:${google_service_account.default.email}\"\n depends_on = [google_project_iam_member.event_receiving]\n }\n\n resource \"google_cloudfunctions2_function\" \"default\" {\n depends_on = [\n google_project_iam_member.event_receiving,\n google_project_iam_member.artifactregistry_reader,\n ]\n name = \"gcf-function\"\n location = \"us-central1\"\n description = \"a new function\"\n\n build_config {\n runtime = \"nodejs22\"\n entry_point = \"entryPoint\" # Set the entry point in the code\n environment_variables = {\n BUILD_CONFIG_TEST = \"build_test\"\n }\n source {\n storage_source {\n bucket = google_storage_bucket.source_bucket.name\n object = google_storage_bucket_object.default.name\n }\n }\n }\n\n service_config {\n max_instance_count = 3\n min_instance_count = 1\n available_memory = \"256M\"\n timeout_seconds = 60\n environment_variables = {\n SERVICE_CONFIG_TEST = \"config_test\"\n }\n ingress_settings = \"ALLOW_INTERNAL_ONLY\"\n all_traffic_on_latest_revision = true\n service_account_email = google_service_account.default.email\n }\n\n event_trigger {\n trigger_region = \"us-central1\" # The trigger must be in the same location as the bucket\n event_type = \"google.cloud.audit.log.v1.written\"\n retry_policy = \"RETRY_POLICY_RETRY\"\n service_account_email = google_service_account.default.email\n event_filters {\n attribute = \"serviceName\"\n value = \"storage.googleapis.com\"\n }\n event_filters {\n attribute = \"methodName\"\n value = \"storage.objects.create\"\n }\n event_filters {\n attribute = \"resourceName\"\n # Selects all .txt files in the bucket\n value = \"/projects/_/buckets/${google_storage_bucket.audit_log_bucket.name}/objects/*.txt\"\n # Allows path patterns to be used in the value field\n operator = \"match-path-pattern\"\n }\n }\n }\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=functions)."]]
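As the note in the sample explains, the usual way to react to Cloud Storage events is a direct Cloud Storage trigger; Audit Logs are used here only because they support path patterns. For comparison, the sketch below shows how the `event_trigger` block inside `google_cloudfunctions2_function.default` might look with a direct trigger instead. This is a variant sketched for illustration, not part of the original sample: it assumes the `google.cloud.storage.object.v1.finalized` event type with an exact-match `bucket` filter (no path patterns), and direct Cloud Storage events additionally require the Cloud Storage service agent to hold the Pub/Sub Publisher role.

```hcl
# Sketch only: a direct Cloud Storage trigger variant of the event_trigger
# block in google_cloudfunctions2_function.default above. Verify the event
# type and filter attributes against the provider and Eventarc docs.
event_trigger {
  trigger_region        = "us-central1" # Must match the bucket's location
  event_type            = "google.cloud.storage.object.v1.finalized"
  retry_policy          = "RETRY_POLICY_RETRY"
  service_account_email = google_service_account.default.email

  event_filters {
    attribute = "bucket"
    # Exact bucket name only; direct Cloud Storage events do not accept
    # path patterns, which is why the sample uses Audit Logs instead.
    value = google_storage_bucket.audit_log_bucket.name
  }
}
```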
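If you want `terraform apply` to print details of the deployed function, you can add output values. The following is a sketch rather than part of the original sample; the `service_config[0].uri` reference follows the attributes exported by the `google_cloudfunctions2_function` resource, but verify them against the provider version you pin.

```hcl
# Optional additions: surface the deployed function's name and HTTPS endpoint
# after `terraform apply`.
output "function_name" {
  value = google_cloudfunctions2_function.default.name
}

output "function_uri" {
  # URI of the underlying Cloud Run service, exported by the provider for
  # 2nd gen functions.
  value = google_cloudfunctions2_function.default.service_config[0].uri
}
```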
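The sample pins only the `google` provider, while `data "archive_file"` and `resource "random_id"` come from the `hashicorp/archive` and `hashicorp/random` providers, which Terraform resolves implicitly. If you prefer to pin them explicitly, the `terraform` block could be extended as in this optional sketch (replace the existing block rather than adding a second one):

```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.34.0"
    }
    # Optional explicit pins for the providers behind data "archive_file"
    # and resource "random_id"; without them Terraform resolves the
    # hashicorp/* defaults implicitly.
    archive = {
      source = "hashicorp/archive"
    }
    random = {
      source = "hashicorp/random"
    }
  }
}
```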