Write data using Cloud Dataflow
Write data to Cloud Bigtable with Apache Beam.
Explore further
For detailed documentation that includes this code sample, see the following:

- [Bigtable HBase Beam connector](/bigtable/docs/hbase-dataflow-java)
Code sample
Java

To learn how to install and use the client library for Bigtable, see
[Bigtable client libraries](/bigtable/docs/reference/libraries).

To authenticate to Bigtable, set up Application Default Credentials. For more
information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
import com.google.cloud.bigtable.beam.CloudBigtableIO;
import com.google.cloud.bigtable.beam.CloudBigtableTableConfiguration;
import com.google.cloud.bigtable.hbase.BigtableOptionsFactory;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.hadoop.hbase.client.Mutation;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class HelloWorldWrite {

  public static void main(String[] args) {
    // Parse the custom Bigtable options from the command line.
    BigtableOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(BigtableOptions.class);
    Pipeline p = Pipeline.create(options);

    // Identify the Bigtable table that the pipeline writes to.
    CloudBigtableTableConfiguration bigtableTableConfig =
        new CloudBigtableTableConfiguration.Builder()
            .withProjectId(options.getBigtableProjectId())
            .withInstanceId(options.getBigtableInstanceId())
            .withTableId(options.getBigtableTableId())
            .build();

    p.apply(Create.of("phone#4c410523#20190501", "phone#4c410523#20190502"))
        .apply(
            ParDo.of(
                new DoFn<String, Mutation>() {
                  @ProcessElement
                  public void processElement(
                      @Element String rowkey, OutputReceiver<Mutation> out) {
                    long timestamp = System.currentTimeMillis();
                    // Convert each row key into an HBase Put mutation with one cell.
                    Put row = new Put(Bytes.toBytes(rowkey));

                    row.addColumn(
                        Bytes.toBytes("stats_summary"),
                        Bytes.toBytes("os_build"),
                        timestamp,
                        Bytes.toBytes("android"));
                    out.output(row);
                  }
                }))
        .apply(CloudBigtableIO.writeToTable(bigtableTableConfig));

    p.run().waitUntilFinish();
  }

  public interface BigtableOptions extends DataflowPipelineOptions {

    @Description("The Bigtable project ID, this can be different than your Dataflow project")
    @Default.String("bigtable-project")
    String getBigtableProjectId();

    void setBigtableProjectId(String bigtableProjectId);

    @Description("The Bigtable instance ID")
    @Default.String("bigtable-instance")
    String getBigtableInstanceId();

    void setBigtableInstanceId(String bigtableInstanceId);

    @Description("The Bigtable table ID in the instance.")
    @Default.String("mobile-time-series")
    String getBigtableTableId();

    void setBigtableTableId(String bigtableTableId);
  }

  // Returns a table configuration with bulk-mutation flow control enabled, so
  // writes are automatically rate limited when Bigtable is overloaded.
  public static CloudBigtableTableConfiguration batchWriteFlowControlExample(
      BigtableOptions options) {
    CloudBigtableTableConfiguration bigtableTableConfig =
        new CloudBigtableTableConfiguration.Builder()
            .withProjectId(options.getBigtableProjectId())
            .withInstanceId(options.getBigtableInstanceId())
            .withTableId(options.getBigtableTableId())
            .withConfiguration(
                BigtableOptionsFactory.BIGTABLE_ENABLE_BULK_MUTATION_FLOW_CONTROL, "true")
            .build();
    return bigtableTableConfig;
  }
}
```

What's next

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigtable).
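The `batchWriteFlowControlExample` method in the sample only builds the flow-control-enabled configuration; it is not wired into the pipeline. Below is a minimal sketch of how that configuration could be used, assuming a `PCollection<Mutation>` built with the same `ParDo` as in the sample. The `FlowControlSketch` class and `writeWithFlowControl` method are hypothetical names, not part of the original sample.

```java
import com.google.cloud.bigtable.beam.CloudBigtableIO;
import com.google.cloud.bigtable.beam.CloudBigtableTableConfiguration;
import org.apache.beam.sdk.values.PCollection;
import org.apache.hadoop.hbase.client.Mutation;

// Hypothetical helper class; a sketch, not part of the original sample.
public class FlowControlSketch {

  // Writes an existing PCollection of mutations using the configuration
  // returned by batchWriteFlowControlExample, so bulk mutations are
  // throttled when Bigtable signals backpressure.
  static void writeWithFlowControl(
      PCollection<Mutation> mutations, HelloWorldWrite.BigtableOptions options) {
    CloudBigtableTableConfiguration flowControlConfig =
        HelloWorldWrite.batchWriteFlowControlExample(options);

    // Same write step as the sample, with the flow-control config swapped
    // in for the plain bigtableTableConfig.
    mutations.apply(CloudBigtableIO.writeToTable(flowControlConfig));
  }
}
```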
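Because `BigtableOptions` extends `DataflowPipelineOptions`, Beam derives command-line flag names from its getter/setter pairs (for example, `getBigtableProjectId` becomes `--bigtableProjectId`). Here is a minimal sketch of how the options might be populated; the project and instance IDs are placeholders, and `OptionsSketch` is a hypothetical class, not part of the original sample.

```java
import org.apache.beam.sdk.options.PipelineOptionsFactory;

// Hypothetical class for illustration only.
public class OptionsSketch {

  public static void main(String[] args) {
    // Flag names come from the BigtableOptions getters; the ID values below
    // are placeholders, not values from the sample.
    String[] flags = {
      "--bigtableProjectId=my-project",       // placeholder project ID
      "--bigtableInstanceId=my-instance",     // placeholder instance ID
      "--bigtableTableId=mobile-time-series", // matches the sample's default
      "--runner=DataflowRunner",              // run on Dataflow instead of locally
      "--region=us-central1"                  // standard DataflowPipelineOptions flag
    };

    HelloWorldWrite.BigtableOptions options =
        PipelineOptionsFactory.fromArgs(flags)
            .withValidation()
            .as(HelloWorldWrite.BigtableOptions.class);

    System.out.println(options.getBigtableTableId()); // prints "mobile-time-series"
  }
}
```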