BigQueryIO.Write (Google Cloud Dataflow SDK 1.9.1 API)

Google Cloud Dataflow SDK for Java, version 1.9.1

Class BigQueryIO.Write

  • Enclosing class:

    BigQueryIO
    public static class BigQueryIO.Write
    extends Object
    A PTransform that writes a PCollection containing TableRows to a BigQuery table.

    In BigQuery, each table has an enclosing dataset. The dataset being written to must already exist.

    By default, tables will be created if they do not exist, which corresponds to a BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED disposition that matches the default of BigQuery's Jobs API. A schema must be provided (via withSchema(TableSchema)), or else the transform may fail at runtime with an IllegalArgumentException.

    By default, writes require an empty table, which corresponds to a BigQueryIO.Write.WriteDisposition.WRITE_EMPTY disposition that matches the default of BigQuery's Jobs API.
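    As a sketch of how the two dispositions above are configured together (the pipeline variable quotes and the table spec "my-project:my_dataset.my_table" are hypothetical placeholders):

     quotes.apply(BigQueryIO.Write
         .to("my-project:my_dataset.my_table")
         // Required whenever the table may need to be created.
         .withSchema(schema)
         // Both dispositions shown here match the defaults described above,
         // so setting them explicitly only documents intent.
         .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
         .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_EMPTY));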

    Here is a sample transform that produces TableRow values containing "word" and "count" columns:

     static class FormatCountsFn extends DoFn<KV<String, Long>, TableRow> {
       @Override
       public void processElement(ProcessContext c) {
         TableRow row = new TableRow()
             .set("word", c.element().getKey())
             .set("count", c.element().getValue().intValue());
         c.output(row);
       }
     }
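    A sketch of wiring this DoFn into a pipeline and writing the result; the input collection wordCounts, the schema variable, and the table spec are hypothetical placeholders:

     PCollection<TableRow> rows =
         wordCounts.apply(ParDo.of(new FormatCountsFn()));
     rows.apply(BigQueryIO.Write
         .to("my-project:output.word_counts")
         .withSchema(schema));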
