# Read data from Cloud Bigtable with Apache Beam

This code sample demonstrates how to read data from Cloud Bigtable using Apache Beam:

- It uses the `CloudBigtableIO` class to configure and perform the read operation with a specified scan.
- The `Scan` object applies a `FirstKeyOnlyFilter` and disables block caching to optimize data retrieval.
- The pipeline reads the results from Bigtable and processes each row with a `DoFn`, printing the row key to the console.
- The required Bigtable configuration parameters, such as project ID, instance ID, and table ID, are provided through the `BigtableOptions` interface, which extends `DataflowPipelineOptions`.

## Explore further

For detailed documentation that includes this code sample, see the following:

- [Bigtable HBase Beam connector](/bigtable/docs/hbase-dataflow-java)

## Code sample

### Java

To learn how to install and use the client library for Bigtable, see
[Bigtable client libraries](/bigtable/docs/reference/libraries).

To authenticate to Bigtable, set up Application Default Credentials. For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
import com.google.cloud.bigtable.beam.CloudBigtableIO;
import com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.Read;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class HelloWorldRead {
  public static void main(String[] args) {
    BigtableOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(BigtableOptions.class);
    Pipeline p = Pipeline.create(options);

    // Return only the first cell of each row and skip the block cache,
    // since a one-pass scan gets no benefit from caching.
    Scan scan = new Scan();
    scan.setCacheBlocks(false);
    scan.setFilter(new FirstKeyOnlyFilter());

    CloudBigtableScanConfiguration config =
        new CloudBigtableScanConfiguration.Builder()
            .withProjectId(options.getBigtableProjectId())
            .withInstanceId(options.getBigtableInstanceId())
            .withTableId(options.getBigtableTableId())
            .withScan(scan)
            .build();

    // Read each Result from the table and print its row key.
    p.apply(Read.from(CloudBigtableIO.read(config)))
        .apply(
            ParDo.of(
                new DoFn<Result, Void>() {
                  @ProcessElement
                  public void processElement(@Element Result row, OutputReceiver<Void> out) {
                    System.out.println(Bytes.toString(row.getRow()));
                  }
                }));

    p.run().waitUntilFinish();
  }

  public interface BigtableOptions extends DataflowPipelineOptions {
    @Description("The Bigtable project ID, this can be different than your Dataflow project")
    @Default.String("bigtable-project")
    String getBigtableProjectId();

    void setBigtableProjectId(String bigtableProjectId);

    @Description("The Bigtable instance ID")
    @Default.String("bigtable-instance")
    String getBigtableInstanceId();

    void setBigtableInstanceId(String bigtableInstanceId);

    @Description("The Bigtable table ID in the instance.")
    @Default.String("mobile-time-series")
    String getBigtableTableId();

    void setBigtableTableId(String bigtableTableId);
  }
}
```

## What's next

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigtable).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
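Because `BigtableOptions` extends `DataflowPipelineOptions`, each getter in the interface becomes a command-line flag (`getBigtableProjectId()` maps to `--bigtableProjectId`, and so on). As a rough sketch only, assuming a Maven project whose `exec` plugin is configured to run this class (the artifact layout, placeholder IDs, and region below are illustrative assumptions, not part of the sample), a Dataflow launch might look like:

```shell
# Hypothetical invocation: adjust the main class package, project IDs,
# and region to match your own setup.
mvn compile exec:java \
  -Dexec.mainClass=HelloWorldRead \
  -Dexec.args="--runner=DataflowRunner \
    --project=YOUR_DATAFLOW_PROJECT \
    --region=us-central1 \
    --bigtableProjectId=YOUR_BIGTABLE_PROJECT \
    --bigtableInstanceId=YOUR_INSTANCE_ID \
    --bigtableTableId=mobile-time-series"
```

Any flag you omit falls back to the `@Default.String` value declared in `BigtableOptions`.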