Perform a batch write (HBase)
Write multiple rows at once. This type of write makes a MutateRows API request.
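The rows in the sample below use keys such as `tablet#a0b81f74#20190501`, which follow a `deviceType#deviceId#yyyyMMdd` pattern. As a minimal sketch of how such keys might be assembled (the `rowKey` helper is hypothetical, not part of the Bigtable or HBase client libraries):

```java
public class RowKeyDemo {

  // Hypothetical helper: builds a row key in the same
  // deviceType#deviceId#yyyyMMdd shape used by the sample rows.
  static String rowKey(String deviceType, String deviceId, String date) {
    return deviceType + "#" + deviceId + "#" + date;
  }

  public static void main(String[] args) {
    // Prints: tablet#a0b81f74#20190501
    System.out.println(rowKey("tablet", "a0b81f74", "20190501"));
  }
}
```

Grouping the device identifier and date into one delimited key keeps a device's daily rows adjacent, which is why the batch below can target two consecutive dates for the same tablet.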
Explore further
For detailed documentation that includes this code sample, see the following:

- [Write examples](/bigtable/docs/writing-data)
Code sample
Java

To learn how to install and use the client library for Bigtable, see
[Bigtable client libraries](/bigtable/docs/reference/libraries).

To authenticate to Bigtable, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class WriteBatch {

  private static final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("stats_summary");

  public static void writeBatch(String projectId, String instanceId, String tableId) {
    // String projectId = "my-project-id";
    // String instanceId = "my-instance-id";
    // String tableId = "mobile-time-series";

    try (Connection connection = BigtableConfiguration.connect(projectId, instanceId)) {
      final Table table = connection.getTable(TableName.valueOf(Bytes.toBytes(tableId)));
      long timestamp = System.currentTimeMillis();
      byte[] one = new byte[] {0, 0, 0, 0, 0, 0, 0, 1};

      List<Put> puts = new ArrayList<Put>();
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190501")));
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190502")));

      puts.get(0).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(0)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12155.0.0-rc1"));

      puts.get(1).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(1)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12145.0.0-rc6"));

      table.put(puts);

      System.out.print("Successfully wrote 2 rows");
    } catch (Exception e) {
      System.out.println("Error during WriteBatch: \n" + e.toString());
    }
  }
}
```

What's next

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigtable).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.