Class Table (2.20.0)

public class Table extends TableInfo

A Google BigQuery Table.

Objects of this class are immutable. Operations that modify the table, such as #update, return a new object. To get a Table object with the most recent information, use #reload. Table adds a layer of service-related functionality over TableInfo.
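A short sketch of this immutability contract, assuming `table` is an existing Table instance obtained from a BigQuery client (the description text is illustrative):

```java
// update() does not mutate `table`: it sends the changed metadata
// and returns a new Table object reflecting the update.
Table updated = table.toBuilder().setDescription("new description").build().update();

// reload() likewise returns a fresh object carrying the latest
// server-side state; the original `table` reference is unchanged by either call.
Table latest = table.reload();
```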

Inheritance

Object > TableInfo > Table

Methods

copy(TableId destinationTable, BigQuery.JobOption[] options)

public Job copy(TableId destinationTable, BigQuery.JobOption[] options)

Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.

Example of copying the table to a destination table.


 String dataset = "my_dataset";
 String tableName = "my_destination_table";
 TableId destinationId = TableId.of(dataset, tableName);
 JobOption options = JobOption.fields(JobField.STATUS, JobField.USER_EMAIL);
 Job job = table.copy(destinationId, options);
 // Wait for the job to complete.
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully.
   } else {
     // Handle error case.
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
destinationTable (TableId): the destination table of the copy job
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

copy(String destinationDataset, String destinationTable, BigQuery.JobOption[] options)

public Job copy(String destinationDataset, String destinationTable, BigQuery.JobOption[] options)

Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.

Example of copying the table to a destination table.


 String datasetName = "my_dataset";
 String tableName = "my_destination_table";
 Job job = table.copy(datasetName, tableName);
 // Wait for the job to complete.
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully
   } else {
     // Handle error case
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
destinationDataset (String): the user-defined id of the destination dataset
destinationTable (String): the user-defined id of the destination table
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

delete()

public boolean delete()

Deletes this table.

Example of deleting the table.


 boolean deleted = table.delete();
 if (deleted) {
   // the table was deleted
 } else {
   // the table was not found
 }
 
Returns
boolean: true if the table was deleted, false if it was not found

equals(Object obj)

public final boolean equals(Object obj)
Parameter
obj (Object)

Returns
boolean
Overrides

exists()

public boolean exists()

Checks if this table exists.

Example of checking if the table exists.


 boolean exists = table.exists();
 if (exists) {
   // the table exists
 } else {
   // the table was not found
 }
 
Returns
boolean: true if this table exists, false otherwise

extract(String format, String destinationUri, BigQuery.JobOption[] options)

public Job extract(String format, String destinationUri, BigQuery.JobOption[] options)

Starts a BigQuery Job to extract the current table to the provided destination URI. Returns the started Job object.

Example of extracting data to a single Google Cloud Storage file.


 String format = "CSV";
 String gcsUrl = "gs://my_bucket/filename.csv";
 Job job = table.extract(format, gcsUrl);
 // Wait for the job to complete
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully
   } else {
     // Handle error case
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
format (String): the format of the extracted data
destinationUri (String): the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) where the extracted table should be written
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

extract(String format, List<String> destinationUris, BigQuery.JobOption[] options)

public Job extract(String format, List<String> destinationUris, BigQuery.JobOption[] options)

Starts a BigQuery Job to extract the current table to the provided destination URIs. Returns the started Job object.

Example of partitioning data to a list of Google Cloud Storage files.


 String format = "CSV";
 String gcsUrl1 = "gs://my_bucket/PartitionA_*.csv";
 String gcsUrl2 = "gs://my_bucket/PartitionB_*.csv";
 List<String> destinationUris = new ArrayList<>();
 destinationUris.add(gcsUrl1);
 destinationUris.add(gcsUrl2);
 Job job = table.extract(format, destinationUris);
 // Wait for the job to complete
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully
   } else {
     // Handle error case
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
format (String): the format of the extracted data
destinationUris (List<String>): the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

getBigQuery()

public BigQuery getBigQuery()

Returns the table's BigQuery object used to issue requests.

Returns
BigQuery
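The returned service object can be reused to issue further requests. A minimal sketch, assuming `table` is an existing Table instance:

```java
BigQuery bigquery = table.getBigQuery();
// Reuse the same service handle, e.g. to look up the dataset that contains this table.
Dataset dataset = bigquery.getDataset(table.getTableId().getDataset());
```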

hashCode()

public final int hashCode()
Returns
int
Overrides

insert(Iterable<InsertAllRequest.RowToInsert> rows)

public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows)

Insert rows into the table.

Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as #list(TableDataListOption...) and #copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.

Example of inserting rows into the table.


 String rowId1 = "rowId1";
 String rowId2 = "rowId2";
 List<RowToInsert> rows = new ArrayList<>();
 Map<String, Object> row1 = new HashMap<>();
 row1.put("stringField", "value1");
 row1.put("booleanField", true);
 Map<String, Object> row2 = new HashMap<>();
 row2.put("stringField", "value2");
 row2.put("booleanField", false);
 rows.add(RowToInsert.of(rowId1, row1));
 rows.add(RowToInsert.of(rowId2, row2));
 InsertAllResponse response = table.insert(rows);
 // do something with response
 
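Note that InsertAllResponse reports per-row failures rather than throwing for them; a sketch of inspecting the response from the example above:

```java
if (response.hasErrors()) {
  // Maps the index of each failed row to the errors reported for it.
  for (Map.Entry<Long, List<BigQueryError>> entry : response.getInsertErrors().entrySet()) {
    System.out.println("Row " + entry.getKey() + " failed: " + entry.getValue());
  }
}
```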
Parameter
rows (Iterable<RowToInsert>): rows to be inserted

Returns
InsertAllResponse

Exceptions
BigQueryException: upon failure

insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues)

public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues)

Insert rows into the table.

Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as #list(TableDataListOption...) and #copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.

Example of inserting rows into the table, ignoring invalid rows.


 String rowId1 = "rowId1";
 String rowId2 = "rowId2";
 List<RowToInsert> rows = new ArrayList<>();
 Map<String, Object> row1 = new HashMap<>();
 row1.put("stringField", 1);
 row1.put("booleanField", true);
 Map<String, Object> row2 = new HashMap<>();
 row2.put("stringField", "value2");
 row2.put("booleanField", false);
 rows.add(RowToInsert.of(rowId1, row1));
 rows.add(RowToInsert.of(rowId2, row2));
 InsertAllResponse response = table.insert(rows, true, true);
 // do something with response
 
Parameters
rows (Iterable<RowToInsert>): rows to be inserted
skipInvalidRows (boolean): whether to insert all valid rows even if invalid rows exist. If not set, the entire insert operation fails when the rows to be inserted contain an invalid row
ignoreUnknownValues (boolean): whether to accept rows that contain values not matching the schema; such values are ignored. If not set, rows with unknown values are considered invalid

Returns
InsertAllResponse

Exceptions
BigQueryException: upon failure

list(BigQuery.TableDataListOption[] options)

public TableResult list(BigQuery.TableDataListOption[] options)

Returns a paginated list of the rows in this table.

Example of listing rows in the table.


 // This example reads the results 100 rows per RPC call. If there is no need to limit the
 // page size, simply omit the option.
 Page<FieldValueList> page = table.list(TableDataListOption.pageSize(100));
 for (FieldValueList row : page.iterateAll()) {
   // do something with the row
 }
 
Parameter
options (TableDataListOption[]): table data list options

Returns
TableResult

Exceptions
BigQueryException: upon failure

list(Schema schema, BigQuery.TableDataListOption[] options)

public TableResult list(Schema schema, BigQuery.TableDataListOption[] options)

Returns a paginated list of the rows in this table.

Example of listing rows in the table given a schema.


 Schema schema = ...;
 String field = "my_field";
 Page<FieldValueList> page = table.list(schema);
 for (FieldValueList row : page.iterateAll()) {
   row.get(field);
 }
 
Parameters
schema (Schema)
options (TableDataListOption[]): table data list options

Returns
TableResult

Exceptions
BigQueryException: upon failure

load(FormatOptions format, String sourceUri, BigQuery.JobOption[] options)

public Job load(FormatOptions format, String sourceUri, BigQuery.JobOption[] options)

Starts a BigQuery Job to load data into the current table from the provided source URI. Returns the started Job object.

Example of loading data from a single Google Cloud Storage file.


 String sourceUri = "gs://my_bucket/filename.csv";
 Job job = table.load(FormatOptions.csv(), sourceUri);
 // Wait for the job to complete
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully
   } else {
     // Handle error case
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
format (FormatOptions): the format of the data to load
sourceUri (String): the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) from which to load the data
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption[] options)

public Job load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption[] options)

Starts a BigQuery Job to load data into the current table from the provided source URIs. Returns the started Job object.

Example of loading data from a list of Google Cloud Storage files.


 String gcsUrl1 = "gs://my_bucket/filename1.csv";
 String gcsUrl2 = "gs://my_bucket/filename2.csv";
 List<String> sourceUris = new ArrayList<>();
 sourceUris.add(gcsUrl1);
 sourceUris.add(gcsUrl2);
 Job job = table.load(FormatOptions.csv(), sourceUris);
 // Wait for the job to complete
 try {
   Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
       RetryOption.totalTimeout(Duration.ofMinutes(3)));
   if (completedJob != null && completedJob.getStatus().getError() == null) {
     // Job completed successfully
   } else {
     // Handle error case
   }
 } catch (InterruptedException e) {
   // Handle interrupted wait
 }
 
Parameters
format (FormatOptions): the format of the data to load
sourceUris (List<String>): the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) from which to load the data
options (JobOption[]): job options

Returns
Job

Exceptions
BigQueryException: upon failure

reload(BigQuery.TableOption[] options)

public Table reload(BigQuery.TableOption[] options)

Fetches the table's latest information. Returns null if the table does not exist.

Example of fetching the table's latest information, specifying particular table fields to get.


 TableField field1 = TableField.LAST_MODIFIED_TIME;
 TableField field2 = TableField.NUM_ROWS;
 Table latestTable = table.reload(TableOption.fields(field1, field2));
 if (latestTable == null) {
   // the table was not found
 }
 
Parameter
options (TableOption[]): table options

Returns
Table: a Table object with the latest information, or null if the table was not found

toBuilder()

public Table.Builder toBuilder()

Returns a builder for the table object.

Returns
Table.Builder
Overrides

update(BigQuery.TableOption[] options)

public Table update(BigQuery.TableOption[] options)

Updates the table's information with this Table's information. The user-defined ids of the dataset and the table cannot be changed. A new Table object is returned.

Example of updating the table's information.


 Table updatedTable = table.toBuilder().setDescription("new description").build().update();
 
Parameter
options (TableOption[]): table options

Returns
Table: a Table object with updated information