Gravar dados usando o Cloud Dataflow
Mantenha tudo organizado com as coleções
Salve e categorize o conteúdo com base nas suas preferências.
Grave dados no Cloud Bigtable com o Apache Beam.
Mais informações
Para ver a documentação detalhada que inclui este exemplo de código, consulte:
Exemplo de código
Exceto em caso de indicação contrária, o conteúdo desta página é licenciado de acordo com a Licença de atribuição 4.0 do Creative Commons, e as amostras de código são licenciadas de acordo com a Licença Apache 2.0. Para mais detalhes, consulte as políticas do site do Google Developers. Java é uma marca registrada da Oracle e/ou afiliadas.
[[["Fácil de entender","easyToUnderstand","thumb-up"],["Meu problema foi resolvido","solvedMyProblem","thumb-up"],["Outro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Informações incorretas ou exemplo de código","incorrectInformationOrSampleCode","thumb-down"],["Não contém as informações/amostras de que eu preciso","missingTheInformationSamplesINeed","thumb-down"],["Problema na tradução","translationIssue","thumb-down"],["Outro","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis code sample demonstrates how to write data to Cloud Bigtable using Apache Beam.\u003c/p\u003e\n"],["\u003cp\u003eThe provided Java code utilizes the \u003ccode\u003eCloudBigtableIO\u003c/code\u003e connector to interact with Bigtable, including setting up configurations for project ID, instance ID, and table ID.\u003c/p\u003e\n"],["\u003cp\u003eThe example creates data rows with specific column family and column qualifier values, along with timestamps, and then uses \u003ccode\u003eParDo\u003c/code\u003e to create the necessary mutations to populate the Bigtable Table.\u003c/p\u003e\n"],["\u003cp\u003eThe sample code shows how to set up the authentication to connect to Bigtable using Application Default Credentials and defines the required options for project, instance, and table IDs.\u003c/p\u003e\n"],["\u003cp\u003eIt also describes a method to enable flow control using \u003ccode\u003eBigtableOptionsFactory.BIGTABLE_ENABLE_BULK_MUTATION_FLOW_CONTROL\u003c/code\u003e.\u003c/p\u003e\n"]]],[],null,["# Write data using Cloud Dataflow\n\nWrite data to Cloud Bigtable with Apache Beam.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Bigtable HBase Beam connector](/bigtable/docs/hbase-dataflow-java)\n\nCode sample\n-----------\n\n### Java\n\n\nTo learn how to install and use the client library for Bigtable, see\n[Bigtable client libraries](/bigtable/docs/reference/libraries).\n\n\nTo authenticate to Bigtable, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n\n import com.google.cloud.bigtable.beam.CloudBigtableIO;\n import com.google.cloud.bigtable.beam.CloudBigtableTableConfiguration;\n import com.google.cloud.bigtable.hbase.BigtableOptionsFactory;\n import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;\n import org.apache.beam.sdk.Pipeline;\n import org.apache.beam.sdk.options.Default;\n import org.apache.beam.sdk.options.Description;\n import org.apache.beam.sdk.options.PipelineOptionsFactory;\n import org.apache.beam.sdk.transforms.Create;\n import org.apache.beam.sdk.transforms.DoFn;\n import org.apache.beam.sdk.transforms.ParDo;\n import org.apache.hadoop.hbase.client.Mutation;\n import org.apache.hadoop.hbase.client.Put;\n import org.apache.hadoop.hbase.util.Bytes;\n\n public class HelloWorldWrite {\n\n public static void main(String[] args) {\n BigtableOptions options =\n PipelineOptionsFactory.fromArgs(args).withValidation().as(BigtableOptions.class);\n Pipeline p = Pipeline.create(options);\n\n CloudBigtableTableConfiguration bigtableTableConfig =\n new CloudBigtableTableConfiguration.Builder()\n .withProjectId(options.getBigtableProjectId())\n .withInstanceId(options.getBigtableInstanceId())\n .withTableId(options.getBigtableTableId())\n .build();\n\n p.apply(Create.of(\"phone#4c410523#20190501\", \"phone#4c410523#20190502\"))\n .apply(\n ParDo.of(\n new DoFn\u003cString, Mutation\u003e() {\n @ProcessElement\n public void processElement(@Element String rowkey, OutputReceiver\u003cMutation\u003e out) {\n long timestamp = System.currentTimeMillis();\n Put row = new Put(Bytes.toBytes(rowkey));\n\n row.addColumn(\n Bytes.toBytes(\"stats_summary\"),\n Bytes.toBytes(\"os_build\"),\n timestamp,\n Bytes.toBytes(\"android\"));\n out.output(row);\n }\n }))\n .apply(CloudBigtableIO.writeToTable(bigtableTableConfig));\n\n p.run().waitUntilFinish();\n }\n\n public interface BigtableOptions extends DataflowPipelineOptions {\n\n @Description(\"The Bigtable project ID, this can be different than your Dataflow project\")\n @Default.String(\"bigtable-project\")\n String getBigtableProjectId();\n\n void setBigtableProjectId(String bigtableProjectId);\n\n @Description(\"The Bigtable instance ID\")\n @Default.String(\"bigtable-instance\")\n String getBigtableInstanceId();\n\n void setBigtableInstanceId(String bigtableInstanceId);\n\n @Description(\"The Bigtable table ID in the instance.\")\n @Default.String(\"mobile-time-series\")\n String getBigtableTableId();\n\n void setBigtableTableId(String bigtableTableId);\n }\n\n public static CloudBigtableTableConfiguration batchWriteFlowControlExample(\n BigtableOptions options) {\n CloudBigtableTableConfiguration bigtableTableConfig =\n new CloudBigtableTableConfiguration.Builder()\n .withProjectId(options.getBigtableProjectId())\n .withInstanceId(options.getBigtableInstanceId())\n .withTableId(options.getBigtableTableId())\n .withConfiguration(BigtableOptionsFactory.BIGTABLE_ENABLE_BULK_MUTATION_FLOW_CONTROL,\n \"true\")\n .build();\n return bigtableTableConfig;\n }\n }\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigtable)."]]