# Use customer-managed encryption keys
This sample shows how to use customer-managed encryption keys with a Dataflow pipeline.
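Both samples below pass the key as a fully qualified Cloud KMS resource name of the form `projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>`. As a rough sketch, a small helper (the name `kms_key_name` is ours, not part of the sample) can assemble that string and catch malformed components early:

```python
def kms_key_name(project: str, location: str, key_ring: str, key: str) -> str:
    """Build the fully qualified Cloud KMS key resource name.

    This is the format the samples on this page pass as the KMS key
    to the BigQuery read and write transforms.
    """
    for part in (project, location, key_ring, key):
        # Each component is a single path segment; a slash or an empty
        # value would silently produce a wrong resource name.
        if not part or "/" in part:
            raise ValueError(f"invalid resource name component: {part!r}")
    return (
        f"projects/{project}/locations/{location}"
        f"/keyRings/{key_ring}/cryptoKeys/{key}"
    )
```

For example, `kms_key_name("my-project", "us-central1", "my-ring", "my-key")` returns `projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key`.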
Code sample
-----------

### Java

To authenticate to Dataflow, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
// Query from the NASA wildfires public dataset:
// https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=nasa_wildfire&t=past_week&page=table
String query =
    "SELECT latitude,longitude,acq_date,acq_time,bright_ti4,confidence "
        + "FROM `bigquery-public-data.nasa_wildfire.past_week` "
        + "LIMIT 10";

// Schema for the output BigQuery table.
final TableSchema outputSchema = new TableSchema().setFields(Arrays.asList(
    new TableFieldSchema().setName("latitude").setType("FLOAT"),
    new TableFieldSchema().setName("longitude").setType("FLOAT"),
    new TableFieldSchema().setName("acq_date").setType("DATE"),
    new TableFieldSchema().setName("acq_time").setType("TIME"),
    new TableFieldSchema().setName("bright_ti4").setType("FLOAT"),
    new TableFieldSchema().setName("confidence").setType("STRING")));

// Create the BigQuery options from the command line arguments.
BigQueryKmsKeyOptions options = PipelineOptionsFactory.fromArgs(args)
    .withValidation().as(BigQueryKmsKeyOptions.class);

// String outputBigQueryTable = "<project>:<dataset>.<table>";
String outputBigQueryTable = options.getOutputBigQueryTable();

// String kmsKey =
//     "projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>";
String kmsKey = options.getKmsKey();

// Create and run an Apache Beam pipeline.
Pipeline pipeline = Pipeline.create(options);
pipeline
    .apply("Read from BigQuery with KMS key",
        BigQueryIO.readTableRows()
            .fromQuery(query)
            .usingStandardSql()
            .withKmsKey(kmsKey))
    .apply("Write to BigQuery with KMS key",
        BigQueryIO.writeTableRows()
            .to(outputBigQueryTable)
            .withSchema(outputSchema)
            .withWriteDisposition(WriteDisposition.WRITE_TRUNCATE)
            .withKmsKey(kmsKey));
pipeline.run().waitUntilFinish();
```

### Python

To authenticate to Dataflow, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```python
import apache_beam as beam

# output_bigquery_table = '<project>:<dataset>.<table>'
# kms_key = 'projects/<project>/locations/<kms-location>/keyRings/<kms-keyring>/cryptoKeys/<kms-key>'  # noqa
# beam_args = [
#     '--project', 'your-project-id',
#     '--runner', 'DataflowRunner',
#     '--temp_location', 'gs://your-bucket/samples/dataflow/kms/tmp',
#     '--region', 'us-central1',
# ]

# Query from the NASA wildfires public dataset:
# https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=nasa_wildfire&t=past_week&page=table
query = """
SELECT latitude,longitude,acq_date,acq_time,bright_ti4,confidence
FROM `bigquery-public-data.nasa_wildfire.past_week`
LIMIT 10
"""

# Schema for the output BigQuery table.
schema = {
    "fields": [
        {"name": "latitude", "type": "FLOAT"},
        {"name": "longitude", "type": "FLOAT"},
        {"name": "acq_date", "type": "DATE"},
        {"name": "acq_time", "type": "TIME"},
        {"name": "bright_ti4", "type": "FLOAT"},
        {"name": "confidence", "type": "STRING"},
    ],
}

options = beam.options.pipeline_options.PipelineOptions(beam_args)
with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read from BigQuery with KMS key"
        >> beam.io.Read(
            beam.io.BigQuerySource(
                query=query,
                use_standard_sql=True,
                kms_key=kms_key,
            )
        )
        | "Write to BigQuery with KMS key"
        >> beam.io.WriteToBigQuery(
            output_bigquery_table,
            schema=schema,
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            kms_key=kms_key,
        )
    )
```

What's next
-----------

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=dataflow).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.