# Dataflow managed I/O for BigQuery

Last updated (UTC): 2025-08-04.

Key points:

- Managed I/O for BigQuery supports dynamic table creation and dynamic destinations.
- For reads, the connector uses the BigQuery Storage Read API. For writes, it uses the BigQuery Storage Write API for unbounded sources (with exactly-once or at-least-once delivery, depending on the streaming mode) or BigQuery file loads for bounded sources.
- The connector requires the Apache Beam SDK for Java or Python, version 2.61.0 or later.
- Configuration options include the BigQuery `table`, `kms_key`, `fields`, `query`, `row_restriction`, and `triggering_frequency`, depending on the operation.
- Managed I/O for BigQuery does not support automatic upgrades.

Managed I/O supports the following capabilities for BigQuery:

- Dynamic table creation
- [Dynamic destinations](/dataflow/docs/guides/write-to-iceberg#dynamic-destinations)
- For reads, the connector uses the [BigQuery Storage Read API](/bigquery/docs/reference/storage).
- For writes, the connector uses the following BigQuery methods:

  - If the source is unbounded and Dataflow uses [streaming exactly-once processing](/dataflow/docs/guides/streaming-modes), the connector writes to BigQuery by using the [BigQuery Storage Write API](/bigquery/docs/write-api) with exactly-once delivery semantics.
  - If the source is unbounded and Dataflow uses [streaming at-least-once processing](/dataflow/docs/guides/streaming-modes), the connector writes to BigQuery by using the [BigQuery Storage Write API](/bigquery/docs/write-api) with at-least-once delivery semantics.
  - If the source is bounded, the connector uses [BigQuery file loads](/bigquery/docs/batch-loading-data).

Requirements
------------

The following SDKs support managed I/O for BigQuery:

- Apache Beam SDK for Java version 2.61.0 or later
- Apache Beam SDK for Python version 2.61.0 or later

Configuration
-------------

Managed I/O for BigQuery supports the following configuration parameters:

### `BIGQUERY` Read

Read options include the BigQuery `table`, `query`, `fields`, `kms_key`, and `row_restriction`.

### `BIGQUERY` Write

Write options include the BigQuery `table`, `kms_key`, and `triggering_frequency`.

What's next
-----------

For more information and code examples, see the following topics:

- [Read from BigQuery](/dataflow/docs/guides/read-from-bigquery)
- [Write to BigQuery](/dataflow/docs/guides/write-to-bigquery)
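As a sketch of how the configuration options above fit together, the following hypothetical helpers build the config mappings that a pipeline would pass to the managed BigQuery transforms. The helper functions and the exact keys shown are assumptions for illustration, not part of the Apache Beam API; consult the guides above for the authoritative parameter names.

```python
# Sketch: building Managed I/O configuration mappings for BigQuery.
# The helpers below are hypothetical conveniences; the keys mirror the
# options listed in the Configuration section above.

def bigquery_read_config(table=None, query=None, fields=None,
                         row_restriction=None, kms_key=None):
    """Build a config mapping for a managed BigQuery read.

    Exactly one of `table` or `query` should be set. `fields` and
    `row_restriction` push column and row filtering down to the
    BigQuery Storage Read API.
    """
    if (table is None) == (query is None):
        raise ValueError("Specify exactly one of `table` or `query`.")
    config = {"table": table, "query": query, "fields": fields,
              "row_restriction": row_restriction, "kms_key": kms_key}
    # Drop unset options so only explicit choices reach the connector.
    return {k: v for k, v in config.items() if v is not None}


def bigquery_write_config(table, kms_key=None, triggering_frequency=None):
    """Build a config mapping for a managed BigQuery write.

    `triggering_frequency` only applies to streaming writes through
    the BigQuery Storage Write API.
    """
    config = {"table": table, "kms_key": kms_key,
              "triggering_frequency": triggering_frequency}
    return {k: v for k, v in config.items() if v is not None}


# In a pipeline, the mappings would be passed to the managed transforms,
# for example (assuming Apache Beam SDK for Python 2.61.0 or later):
#
#   from apache_beam.transforms import managed
#   rows = p | managed.Read(
#       managed.BIGQUERY,
#       config=bigquery_read_config(
#           table="my-project.my_dataset.my_table",
#           fields=["id", "name"],
#           row_restriction="id > 100"))
```

Dropping unset keys keeps the mapping minimal, so the connector's own defaults apply to anything you leave out.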