Performing batch write operations (HBase)
Write multiple rows at once. This type of write makes a MutateRows API request.
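The sample on this page stores the numeric value 1 as an 8-byte big-endian array. As a point of reference, here is a minimal, JDK-only sketch of that encoding; the class name `LongCellValue` is hypothetical, and the method mirrors the byte layout that HBase's `Bytes.toBytes(long)` produces:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Hypothetical helper: reproduces the byte layout of HBase's Bytes.toBytes(long)
// using only the JDK, so the binary cell values in the sample are easy to verify.
public class LongCellValue {

    // Encode a long as 8 big-endian bytes (ByteBuffer's default byte order).
    public static byte[] encode(long value) {
        return ByteBuffer.allocate(Long.BYTES).putLong(value).array();
    }

    public static void main(String[] args) {
        byte[] one = encode(1L);
        // Same layout as the literal new byte[]{0, 0, 0, 0, 0, 0, 0, 1} in the sample.
        System.out.println(Arrays.toString(one));
        // → [0, 0, 0, 0, 0, 0, 0, 1]
    }
}
```

Storing counters in this fixed-width big-endian form keeps them sortable and compatible with HBase's increment operations.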
Explore further

For detailed documentation that includes this code sample, see the following:

- [Write examples](/bigtable/docs/writing-data)
Code sample
Java

To learn how to install and use the client library for Bigtable, see
[Bigtable client libraries](/bigtable/docs/reference/libraries).

To authenticate to Bigtable, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class WriteBatch {

  private static final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("stats_summary");

  public static void writeBatch(String projectId, String instanceId, String tableId) {
    // String projectId = "my-project-id";
    // String instanceId = "my-instance-id";
    // String tableId = "mobile-time-series";

    try (Connection connection = BigtableConfiguration.connect(projectId, instanceId)) {
      final Table table = connection.getTable(TableName.valueOf(Bytes.toBytes(tableId)));
      long timestamp = System.currentTimeMillis();
      byte[] one = new byte[] {0, 0, 0, 0, 0, 0, 0, 1};

      List<Put> puts = new ArrayList<Put>();
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190501")));
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190502")));

      puts.get(0).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(0)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12155.0.0-rc1"));

      puts.get(1).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(1)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12145.0.0-rc6"));

      table.put(puts);

      System.out.print("Successfully wrote 2 rows");
    } catch (Exception e) {
      System.out.println("Error during WriteBatch: \n" + e.toString());
    }
  }
}
```

What's next

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigtable).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.