Write batches (HBase)
Write multiple rows at once. This type of write makes a MutateRows API request.
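Because every row in the batch travels in a single MutateRows request, batching cuts the number of round trips from one per row to one per batch. A minimal, hypothetical sketch of that difference — `FakeClient` below is a stand-in that only counts requests, not the real Bigtable or HBase client:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration (not the Bigtable client API): a FakeClient that
// counts round trips. Sending N rows in one batch costs one request; sending
// them one at a time costs N requests.
public class BatchSketch {

  static class FakeClient {
    int requestCount = 0;

    // One call = one simulated MutateRows request, regardless of batch size.
    void mutateRows(List<String> rows) {
      requestCount++;
    }
  }

  // All rows submitted together: a single request.
  public static int requestsForBatch(List<String> rows) {
    FakeClient client = new FakeClient();
    client.mutateRows(rows);
    return client.requestCount;
  }

  // Rows submitted individually: one request per row.
  public static int requestsOneByOne(List<String> rows) {
    FakeClient client = new FakeClient();
    for (String row : rows) {
      client.mutateRows(List.of(row));
    }
    return client.requestCount;
  }
}
```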
Explore further
For detailed documentation that includes this code sample, see the following:

- [Write examples](/bigtable/docs/writing-data)
Code sample

Java

To learn how to install and use the client library for Bigtable, see
[Bigtable client libraries](/bigtable/docs/reference/libraries).

To authenticate to Bigtable, set up Application Default Credentials.
For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).
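The sample below stores the numeric value 1 as the literal 8-byte array `{0, 0, 0, 0, 0, 0, 0, 1}`. Bigtable conventionally stores integers as 64-bit big-endian values, which is also the layout HBase's `Bytes.toBytes(long)` produces. A standard-library sketch of that encoding, using `java.nio.ByteBuffer` instead of the HBase helper:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class BigEndianSketch {

  // Encode a long as 8 bytes, most significant byte first (big-endian),
  // matching the byte[] literal used for the value 1 in the sample.
  public static byte[] toBigEndianBytes(long value) {
    // ByteBuffer's default byte order is BIG_ENDIAN.
    return ByteBuffer.allocate(Long.BYTES).putLong(value).array();
  }

  public static void main(String[] args) {
    byte[] one = toBigEndianBytes(1L);
    System.out.println(Arrays.toString(one)); // [0, 0, 0, 0, 0, 0, 0, 1]
  }
}
```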
```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class WriteBatch {

  private static final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("stats_summary");

  public static void writeBatch(String projectId, String instanceId, String tableId) {
    // String projectId = "my-project-id";
    // String instanceId = "my-instance-id";
    // String tableId = "mobile-time-series";

    try (Connection connection = BigtableConfiguration.connect(projectId, instanceId)) {
      final Table table = connection.getTable(TableName.valueOf(Bytes.toBytes(tableId)));

      long timestamp = System.currentTimeMillis();
      // The value 1 encoded as a 64-bit big-endian integer.
      byte[] one = new byte[] {0, 0, 0, 0, 0, 0, 0, 1};

      // Build one Put (row mutation) per row key.
      List<Put> puts = new ArrayList<>();
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190501")));
      puts.add(new Put(Bytes.toBytes("tablet#a0b81f74#20190502")));

      puts.get(0).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(0)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12155.0.0-rc1"));

      puts.get(1).addColumn(COLUMN_FAMILY_NAME, Bytes.toBytes("connected_wifi"), timestamp, one);
      puts.get(1)
          .addColumn(
              COLUMN_FAMILY_NAME,
              Bytes.toBytes("os_build"),
              timestamp,
              Bytes.toBytes("12145.0.0-rc6"));

      // Submit all mutations in a single batch (one MutateRows request).
      table.put(puts);

      System.out.print("Successfully wrote 2 rows");
    } catch (Exception e) {
      System.out.println("Error during WriteBatch: \n" + e.toString());
    }
  }
}
```

What's next

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigtable).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.