Scrivere dati utilizzando Cloud Dataflow
Mantieni tutto organizzato con le raccolte
Salva e classifica i contenuti in base alle tue preferenze.
Scrivi dati in Cloud Bigtable con Apache Beam.
Per saperne di più
Per la documentazione dettagliata che include questo esempio di codice, vedi quanto segue:
Esempio di codice
Salvo quando diversamente specificato, i contenuti di questa pagina sono concessi in base alla licenza Creative Commons Attribution 4.0, mentre gli esempi di codice sono concessi in base alla licenza Apache 2.0. Per ulteriori dettagli, consulta le norme del sito di Google Developers. Java è un marchio registrato di Oracle e/o delle sue consociate.
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Difficile da capire","hardToUnderstand","thumb-down"],["Informazioni o codice di esempio errati","incorrectInformationOrSampleCode","thumb-down"],["Mancano le informazioni o gli esempi di cui ho bisogno","missingTheInformationSamplesINeed","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Altra","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis code sample demonstrates how to write data to Cloud Bigtable using Apache Beam.\u003c/p\u003e\n"],["\u003cp\u003eThe provided Java code utilizes the \u003ccode\u003eCloudBigtableIO\u003c/code\u003e connector to interact with Bigtable, including setting up configurations for project ID, instance ID, and table ID.\u003c/p\u003e\n"],["\u003cp\u003eThe example creates data rows with specific column family and column qualifier values, along with timestamps, and then uses \u003ccode\u003eParDo\u003c/code\u003e to create the necessary mutations to populate the Bigtable Table.\u003c/p\u003e\n"],["\u003cp\u003eThe sample code shows how to set up the authentication to connect to Bigtable using Application Default Credentials and defines the required options for project, instance, and table IDs.\u003c/p\u003e\n"],["\u003cp\u003eIt also describes a method to enable flow control using \u003ccode\u003eBigtableOptionsFactory.BIGTABLE_ENABLE_BULK_MUTATION_FLOW_CONTROL\u003c/code\u003e.\u003c/p\u003e\n"]]],[],null,["# Write data using Cloud Dataflow\n\nWrite data to Cloud Bigtable with Apache Beam.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Bigtable HBase Beam connector](/bigtable/docs/hbase-dataflow-java)\n\nCode sample\n-----------\n\n### Java\n\n\nTo learn how to install and use the client library for Bigtable, see\n[Bigtable client libraries](/bigtable/docs/reference/libraries).\n\n\nTo authenticate to Bigtable, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n\n import com.google.cloud.bigtable.beam.CloudBigtableIO;\n import com.google.cloud.bigtable.beam.CloudBigtableTableConfiguration;\n import com.google.cloud.bigtable.hbase.BigtableOptionsFactory;\n import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;\n import org.apache.beam.sdk.Pipeline;\n import org.apache.beam.sdk.options.Default;\n import org.apache.beam.sdk.options.Description;\n import org.apache.beam.sdk.options.PipelineOptionsFactory;\n import org.apache.beam.sdk.transforms.Create;\n import org.apache.beam.sdk.transforms.DoFn;\n import org.apache.beam.sdk.transforms.ParDo;\n import org.apache.hadoop.hbase.client.Mutation;\n import org.apache.hadoop.hbase.client.Put;\n import org.apache.hadoop.hbase.util.Bytes;\n\n public class HelloWorldWrite {\n\n public static void main(String[] args) {\n BigtableOptions options =\n PipelineOptionsFactory.fromArgs(args).withValidation().as(BigtableOptions.class);\n Pipeline p = Pipeline.create(options);\n\n CloudBigtableTableConfiguration bigtableTableConfig =\n new CloudBigtableTableConfiguration.Builder()\n .withProjectId(options.getBigtableProjectId())\n .withInstanceId(options.getBigtableInstanceId())\n .withTableId(options.getBigtableTableId())\n .build();\n\n p.apply(Create.of(\"phone#4c410523#20190501\", \"phone#4c410523#20190502\"))\n .apply(\n ParDo.of(\n new DoFn\u003cString, Mutation\u003e() {\n @ProcessElement\n public void processElement(@Element String rowkey, OutputReceiver\u003cMutation\u003e out) {\n long timestamp = System.currentTimeMillis();\n Put row = new Put(Bytes.toBytes(rowkey));\n\n row.addColumn(\n Bytes.toBytes(\"stats_summary\"),\n Bytes.toBytes(\"os_build\"),\n timestamp,\n Bytes.toBytes(\"android\"));\n out.output(row);\n }\n }))\n .apply(CloudBigtableIO.writeToTable(bigtableTableConfig));\n\n p.run().waitUntilFinish();\n }\n\n public interface BigtableOptions extends DataflowPipelineOptions {\n\n @Description(\"The Bigtable project ID, this can be different than your Dataflow project\")\n @Default.String(\"bigtable-project\")\n String getBigtableProjectId();\n\n void setBigtableProjectId(String bigtableProjectId);\n\n @Description(\"The Bigtable instance ID\")\n @Default.String(\"bigtable-instance\")\n String getBigtableInstanceId();\n\n void setBigtableInstanceId(String bigtableInstanceId);\n\n @Description(\"The Bigtable table ID in the instance.\")\n @Default.String(\"mobile-time-series\")\n String getBigtableTableId();\n\n void setBigtableTableId(String bigtableTableId);\n }\n\n public static CloudBigtableTableConfiguration batchWriteFlowControlExample(\n BigtableOptions options) {\n CloudBigtableTableConfiguration bigtableTableConfig =\n new CloudBigtableTableConfiguration.Builder()\n .withProjectId(options.getBigtableProjectId())\n .withInstanceId(options.getBigtableInstanceId())\n .withTableId(options.getBigtableTableId())\n .withConfiguration(BigtableOptionsFactory.BIGTABLE_ENABLE_BULK_MUTATION_FLOW_CONTROL,\n \"true\")\n .build();\n return bigtableTableConfig;\n }\n }\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigtable)."]]