# Use customer-managed encryption keys
This sample shows how to use customer-managed encryption keys with a Dataflow pipeline.
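Both samples below take two placeholder strings from the command line: an output table spec in the form `<project>:<dataset>.<table>` and a Cloud KMS key given as its full resource name, `projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>`. As a quick sketch of those two formats (the helpers `kms_key_name` and `parse_table_spec` are invented here for illustration and are not part of the sample):

```python
def kms_key_name(project: str, location: str, keyring: str, key: str) -> str:
    """Assemble the full Cloud KMS resource name the pipeline expects."""
    return (
        f"projects/{project}/locations/{location}"
        f"/keyRings/{keyring}/cryptoKeys/{key}"
    )


def parse_table_spec(spec: str) -> tuple:
    """Split a 'PROJECT:DATASET.TABLE' output table spec into its parts."""
    project, rest = spec.split(":", 1)
    dataset, table = rest.split(".", 1)
    return project, dataset, table
```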
Code sample
-----------
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
### Java

To authenticate to Dataflow, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

    // Query from the NASA wildfires public dataset:
    // https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=nasa_wildfire&t=past_week&page=table
    String query =
        "SELECT latitude,longitude,acq_date,acq_time,bright_ti4,confidence "
            + "FROM `bigquery-public-data.nasa_wildfire.past_week` "
            + "LIMIT 10";

    // Schema for the output BigQuery table.
    final TableSchema outputSchema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("latitude").setType("FLOAT"),
        new TableFieldSchema().setName("longitude").setType("FLOAT"),
        new TableFieldSchema().setName("acq_date").setType("DATE"),
        new TableFieldSchema().setName("acq_time").setType("TIME"),
        new TableFieldSchema().setName("bright_ti4").setType("FLOAT"),
        new TableFieldSchema().setName("confidence").setType("STRING")));

    // Create the BigQuery options from the command line arguments.
    BigQueryKmsKeyOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation().as(BigQueryKmsKeyOptions.class);

    // String outputBigQueryTable = "<project>:<dataset>.<table>";
    String outputBigQueryTable = options.getOutputBigQueryTable();

    // String kmsKey =
    //     "projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>";
    String kmsKey = options.getKmsKey();

    // Create and run an Apache Beam pipeline.
    Pipeline pipeline = Pipeline.create(options);
    pipeline
        .apply("Read from BigQuery with KMS key",
            BigQueryIO.readTableRows()
                .fromQuery(query)
                .usingStandardSql()
                .withKmsKey(kmsKey))
        .apply("Write to BigQuery with KMS key",
            BigQueryIO.writeTableRows()
                .to(outputBigQueryTable)
                .withSchema(outputSchema)
                .withWriteDisposition(WriteDisposition.WRITE_TRUNCATE)
                .withKmsKey(kmsKey));
    pipeline.run().waitUntilFinish();

### Python

To authenticate to Dataflow, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

    import apache_beam as beam

    # output_bigquery_table = '<project>:<dataset>.<table>'
    # kms_key = 'projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>'  # noqa
    # beam_args = [
    #     '--project', 'your-project-id',
    #     '--runner', 'DataflowRunner',
    #     '--temp_location', 'gs://your-bucket/samples/dataflow/kms/tmp',
    #     '--region', 'us-central1',
    # ]

    # Query from the NASA wildfires public dataset:
    # https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=nasa_wildfire&t=past_week&page=table
    query = """
    SELECT latitude,longitude,acq_date,acq_time,bright_ti4,confidence
    FROM `bigquery-public-data.nasa_wildfire.past_week`
    LIMIT 10
    """

    # Schema for the output BigQuery table.
    schema = {
        "fields": [
            {"name": "latitude", "type": "FLOAT"},
            {"name": "longitude", "type": "FLOAT"},
            {"name": "acq_date", "type": "DATE"},
            {"name": "acq_time", "type": "TIME"},
            {"name": "bright_ti4", "type": "FLOAT"},
            {"name": "confidence", "type": "STRING"},
        ],
    }

    options = beam.options.pipeline_options.PipelineOptions(beam_args)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read from BigQuery with KMS key"
            >> beam.io.Read(
                beam.io.BigQuerySource(
                    query=query,
                    use_standard_sql=True,
                    kms_key=kms_key,
                )
            )
            | "Write to BigQuery with KMS key"
            >> beam.io.WriteToBigQuery(
                output_bigquery_table,
                schema=schema,
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                kms_key=kms_key,
            )
        )

What's next
-----------

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=dataflow).