Write to BigQuery using a table schema
Provide a table schema to write from Dataflow to a new or existing BigQuery table.
Explore further
For detailed documentation that includes this code sample, see the following:
Code sample
Java
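The code sample itself did not survive extraction. The following is a hedged reconstruction based on this page's own description of the sample: a `MyData` type with `name` and `age` fields, `BigQueryIO.write()` with a table destination, a format function, a create disposition, an explicit schema, and the Storage Write API method. The project, dataset, and table names are placeholders, and the overall structure is a sketch rather than the original sample.

```java
import java.util.Arrays;
import java.util.List;
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.SerializableCoder;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class BigQueryWriteWithSchema {
  // Simple data type with the two fields the page's summary mentions.
  public static class MyData implements java.io.Serializable {
    public String name;
    public Long age;

    public MyData(String name, Long age) {
      this.name = name;
      this.age = age;
    }
  }

  public static void main(String[] args) {
    // Pipeline options (runner, project, region, ...) come from the command line.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    // BigQuery schema matching MyData: one STRING field and one INT64 field.
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("name").setType("STRING").setMode("REQUIRED"),
        new TableFieldSchema().setName("age").setType("INT64").setMode("REQUIRED")));

    List<MyData> rows =
        Arrays.asList(new MyData("Alice", 40L), new MyData("Bob", 30L));

    pipeline
        .apply(Create.of(rows).withCoder(SerializableCoder.of(MyData.class)))
        .apply(BigQueryIO.<MyData>write()
            // Placeholder destination: replace with your own table.
            .to("my-project:my_dataset.my_table")
            // Convert each MyData element to a BigQuery TableRow.
            .withFormatFunction(
                d -> new TableRow().set("name", d.name).set("age", d.age))
            // Supplying the schema lets BigQuery create the table if needed.
            .withSchema(schema)
            .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
            // The page notes the sample writes via the Storage Write API.
            .withMethod(Method.STORAGE_WRITE_API));

    pipeline.run().waitUntilFinish();
  }
}
```

Because the schema is provided together with `CREATE_IF_NEEDED`, the same pipeline works whether the destination table already exists or not.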
To authenticate to Dataflow, set up Application Default Credentials.
For more information, see Set up authentication for a local development environment.
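For a local development environment, Application Default Credentials are typically set up with the gcloud CLI; the project ID below is a placeholder:

```shell
# Log in and store Application Default Credentials locally
# (opens a browser window for the OAuth flow).
gcloud auth application-default login

# Optionally set the quota project used with these credentials
# ("my-project" is a placeholder).
gcloud auth application-default set-quota-project my-project
```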
This sample demonstrates:

- Writing data from Dataflow to a new or existing BigQuery table by providing a table schema.
- A custom data type, MyData, with name and age fields, and a corresponding BigQuery table schema.
- BigQueryIO.write() with options to specify the table destination, format the data, set the create disposition, and supply the schema.
- Using Application Default Credentials to authenticate to Dataflow, a required step for executing the pipeline.
- Writing to the BigQuery table with the Storage Write API method for improved performance, with pipeline options set via command-line arguments.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.