BigQuery 데이터 전송용 커넥터
컬렉션을 사용해 정리하기
내 환경설정을 기준으로 콘텐츠를 저장하고 분류하세요.
워크플로 내에서 BigQuery Data Transfer에 액세스하는 데 사용되는 기본 제공 함수를 정의하는 워크플로 커넥터입니다.
더 살펴보기
이 코드 샘플이 포함된 자세한 문서는 다음을 참조하세요.
코드 샘플
달리 명시되지 않는 한 이 페이지의 콘텐츠에는 Creative Commons Attribution 4.0 라이선스에 따라 라이선스가 부여되며, 코드 샘플에는 Apache 2.0 라이선스에 따라 라이선스가 부여됩니다. 자세한 내용은 Google Developers 사이트 정책을 참조하세요. 자바는 Oracle 및/또는 Oracle 계열사의 등록 상표입니다.
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],[],[],[],null,["# Connector for BigQuery Data Transfer\n\nWorkflows connector that defines the built-in function used to access BigQuery Data Transfer within a workflow.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [BigQuery Data Transfer API Connector Overview](/workflows/docs/reference/googleapis/bigquerydatatransfer/Overview)\n\nCode sample\n-----------\n\n### YAML\n\n # This workflow creates a new dataset and a new table inside that dataset, which are required\n # for the BigQuery Data Transfer Job to run. It creates a new TransferJob configuration and starts\n # a manual run of the transfer (30 seconds after the config is created).\n # The transferRun is a blocking LRO.\n # All resources get deleted once the transfer run completes.\n #\n # On success, it returns \"SUCCESS\".\n #\n # Features included in this test:\n # - BigQuery Data Transfer connector\n # - Waiting for long-running transfer run to complete\n #\n # This workflow expects following items to be provided through input argument for execution:\n # - projectID (string)\n # - The user project ID.\n # - datasetID (string)\n # - The dataset name, expected to have an unique value to avoid the\n # instance being referred by multiple tests.\n # - tableID (string)\n # - The table name, expected to have an unique value to avoid the\n # instance being referred by multiple tests.\n # - runConfigDisplayName (string)\n # - The transfer run configuration display name.\n #\n # Expected successful output: \"SUCCESS\"\n main:\n params: [args]\n steps:\n - init:\n assign:\n - project_id: ${args.projectID}\n - destination_dataset: ${args.datasetID}\n - destination_table: ${args.tableID}\n - run_config_display_name: ${args.runConfigDisplayName}\n - run_config_data_source_id: \"google_cloud_storage\"\n - location: \"us\"\n - data_path_template: \"gs://xxxxxx-bucket/xxxxx/xxxx\"\n - create_dataset:\n call: googleapis.bigquery.v2.datasets.insert\n args:\n projectId: ${project_id}\n body:\n datasetReference:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n - create_table:\n call: googleapis.bigquery.v2.tables.insert\n args:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n body:\n tableReference:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n tableId: ${destination_table}\n schema:\n fields:\n - name: \"column1\"\n type: \"STRING\"\n - name: \"column2\"\n type: \"STRING\"\n - list_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.list\n args:\n parent: ${\"projects/\" + project_id + \"/locations/us\"}\n - create_run_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.create\n args:\n parent: ${\"projects/\" + project_id + \"/locations/\" + location}\n body:\n displayName: ${run_config_display_name}\n schedule: \"every day 19:22\"\n scheduleOptions:\n disableAutoScheduling: true\n destinationDatasetId: ${destination_dataset}\n dataSourceId: ${run_config_data_source_id}\n params:\n destination_table_name_template: ${destination_table}\n file_format: \"CSV\"\n data_path_template: ${data_path_template}\n result: config\n - get_time_in_30s:\n assign:\n - now_plus_30s: ${time.format(sys.now() + 30)}\n - start_run:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.startManualRuns\n args:\n parent: ${config.name}\n body:\n requestedRunTime: ${now_plus_30s}\n result: runsResp\n - remove_run_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.delete\n args:\n name: ${config.name}\n - delete_table:\n call: googleapis.bigquery.v2.tables.delete\n args:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n tableId: ${destination_table}\n - delete_dataset:\n call: googleapis.bigquery.v2.datasets.delete\n args:\n projectId: ${project_id}\n datasetId: ${destination_dataset}\n - the_end:\n return: \"SUCCESS\"\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=workflows)."]]