BigQuery Data Transfer 用コネクタ
コレクションでコンテンツを整理
必要に応じて、コンテンツの保存と分類を行います。
ワークフロー内の BigQuery Data Transfer へのアクセスに使用される組み込み関数を定義する Workflows コネクタ。
もっと見る
このコードサンプルを含む詳細なドキュメントについては、以下をご覧ください。
コードサンプル
特に記載のない限り、このページのコンテンツはクリエイティブ・コモンズの表示 4.0 ライセンスにより使用許諾されます。コードサンプルは Apache 2.0 ライセンスにより使用許諾されます。詳しくは、Google Developers サイトのポリシーをご覧ください。Java は Oracle および関連会社の登録商標です。
[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],[],[],[],null,["# Connector for BigQuery Data Transfer\n\nWorkflows connector that defines the built-in function used to access BigQuery Data Transfer within a workflow.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [BigQuery Data Transfer API Connector Overview](/workflows/docs/reference/googleapis/bigquerydatatransfer/Overview)\n\nCode sample\n-----------\n\n### YAML\n\n # This workflow creates a new dataset and a new table inside that dataset, which are required\n # for the BigQuery Data Transfer Job to run. It creates a new TransferJob configuration and starts\n # a manual run of the transfer (30 seconds after the config is created).\n # The transferRun is a blocking LRO.\n # All resources get deleted once the transfer run completes.\n #\n # On success, it returns \"SUCCESS\".\n #\n # Features included in this test:\n # - BigQuery Data Transfer connector\n # - Waiting for long-running transfer run to complete\n #\n # This workflow expects following items to be provided through input argument for execution:\n # - projectID (string)\n # - The user project ID.\n # - datasetID (string)\n # - The dataset name, expected to have an unique value to avoid the\n # instance being referred by multiple tests.\n # - tableID (string)\n # - The table name, expected to have an unique value to avoid the\n # instance being referred by multiple tests.\n # - runConfigDisplayName (string)\n # - The transfer run configuration display name.\n #\n # Expected successful output: \"SUCCESS\"\n main:\n params: [args]\n steps:\n - init:\n assign:\n - project_id: ${args.projectID}\n - destination_dataset: ${args.datasetID}\n - destination_table: ${args.tableID}\n - run_config_display_name: ${args.runConfigDisplayName}\n - run_config_data_source_id: \"google_cloud_storage\"\n - location: \"us\"\n - data_path_template: \"gs://xxxxxx-bucket/xxxxx/xxxx\"\n - create_dataset:\n call: googleapis.bigquery.v2.datasets.insert\n args:\n projectId: ${project_id}\n body:\n datasetReference:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n - create_table:\n call: googleapis.bigquery.v2.tables.insert\n args:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n body:\n tableReference:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n tableId: ${destination_table}\n schema:\n fields:\n - name: \"column1\"\n type: \"STRING\"\n - name: \"column2\"\n type: \"STRING\"\n - list_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.list\n args:\n parent: ${\"projects/\" + project_id + \"/locations/us\"}\n - create_run_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.create\n args:\n parent: ${\"projects/\" + project_id + \"/locations/\" + location}\n body:\n displayName: ${run_config_display_name}\n schedule: \"every day 19:22\"\n scheduleOptions:\n disableAutoScheduling: true\n destinationDatasetId: ${destination_dataset}\n dataSourceId: ${run_config_data_source_id}\n params:\n destination_table_name_template: ${destination_table}\n file_format: \"CSV\"\n data_path_template: ${data_path_template}\n result: config\n - get_time_in_30s:\n assign:\n - now_plus_30s: ${time.format(sys.now() + 30)}\n - start_run:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.startManualRuns\n args:\n parent: ${config.name}\n body:\n requestedRunTime: ${now_plus_30s}\n result: runsResp\n - remove_run_config:\n call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.delete\n args:\n name: ${config.name}\n - delete_table:\n call: googleapis.bigquery.v2.tables.delete\n args:\n datasetId: ${destination_dataset}\n projectId: ${project_id}\n tableId: ${destination_table}\n - delete_dataset:\n call: googleapis.bigquery.v2.datasets.delete\n args:\n projectId: ${project_id}\n datasetId: ${destination_dataset}\n - the_end:\n return: \"SUCCESS\"\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=workflows)."]]