public class Table extends TableInfo
A Google BigQuery Table.
Objects of this class are immutable. Operations that modify the table, like #update, return a new object. To get a Table object with the most recent information use #reload. Table adds a layer of service-related functionality over TableInfo.
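A minimal sketch of this pattern, assuming a default service client and placeholder dataset/table identifiers (not names taken from this page):
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
Table table = bigquery.getTable(TableId.of("my_dataset", "my_table"));
// reload() does not modify this object; it returns a new Table with the latest metadata (or null).
Table latest = table.reload();
// update() likewise returns a new, immutable Table object.
Table updated = table.toBuilder().setDescription("new description").build().update();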
Methods
copy(TableId destinationTable, BigQuery.JobOption[] options)
public Job copy(TableId destinationTable, BigQuery.JobOption[] options)
Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.
Example of copying the table to a destination table.
String dataset = "my_dataset";
String tableName = "my_destination_table";
TableId destinationId = TableId.of(dataset, tableName);
JobOption options = JobOption.fields(JobField.STATUS, JobField.USER_EMAIL);
Job job = table.copy(destinationId, options);
// Wait for the job to complete.
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully.
} else {
// Handle error case.
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| destinationTable | TableId | the destination table of the copy job |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
copy(String destinationDataset, String destinationTable, BigQuery.JobOption[] options)
public Job copy(String destinationDataset, String destinationTable, BigQuery.JobOption[] options)
Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.
Example of copying the table to a destination table.
String datasetName = "my_dataset";
String tableName = "my_destination_table";
Job job = table.copy(datasetName, tableName);
// Wait for the job to complete.
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully
} else {
// Handle error case
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| destinationDataset | String | the user-defined id of the destination dataset |
| destinationTable | String | the user-defined id of the destination table |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
delete()
public boolean delete()
Deletes this table.
Example of deleting the table.
boolean deleted = table.delete();
if (deleted) {
// the table was deleted
} else {
// the table was not found
}
Returns

| Type | Description |
|---|---|
| boolean | true if the table was deleted, false if it was not found |
equals(Object obj)
public final boolean equals(Object obj)
Parameter

| Name | Type | Description |
|---|---|---|
| obj | Object |  |

Returns

| Type | Description |
|---|---|
| boolean |  |
exists()
public boolean exists()
Checks if this table exists.
Example of checking if the table exists.
boolean exists = table.exists();
if (exists) {
// the table exists
} else {
// the table was not found
}
Returns

| Type | Description |
|---|---|
| boolean | true if the table exists, false if it was not found |
extract(String format, String destinationUri, BigQuery.JobOption[] options)
public Job extract(String format, String destinationUri, BigQuery.JobOption[] options)
Starts a BigQuery Job to extract the current table to the provided destination URI. Returns the started Job object.
Example of extracting data to a single Google Cloud Storage file.
String format = "CSV";
String gcsUrl = "gs://my_bucket/filename.csv";
Job job = table.extract(format, gcsUrl);
// Wait for the job to complete
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully
} else {
// Handle error case
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| format | String | the format of the extracted data |
| destinationUri | String | the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) where the extracted table should be written |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
extract(String format, List<String> destinationUris, BigQuery.JobOption[] options)
public Job extract(String format, List<String> destinationUris, BigQuery.JobOption[] options)
Starts a BigQuery Job to extract the current table to the provided destination URIs. Returns the started Job object.
Example of partitioning data to a list of Google Cloud Storage files.
String format = "CSV";
String gcsUrl1 = "gs://my_bucket/PartitionA_*.csv";
String gcsUrl2 = "gs://my_bucket/PartitionB_*.csv";
List<String> destinationUris = new ArrayList<>();
destinationUris.add(gcsUrl1);
destinationUris.add(gcsUrl2);
Job job = table.extract(format, destinationUris);
// Wait for the job to complete
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully
} else {
// Handle error case
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| format | String | the format of the exported data |
| destinationUris | List<String> | the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
getBigQuery()
public BigQuery getBigQuery()
Returns the table's BigQuery object used to issue requests.
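As an illustrative, hedged sketch of reusing the returned service handle (looking up the table's dataset is just an assumed usage, not part of this method's contract):
BigQuery bigquery = table.getBigQuery();
// Reuse the same service object to issue another request.
Dataset dataset = bigquery.getDataset(table.getTableId().getDataset());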
Returns

| Type | Description |
|---|---|
| BigQuery | the BigQuery object used to issue requests |
hashCode()
public final int hashCode()
Returns

| Type | Description |
|---|---|
| int |  |
insert(Iterable<InsertAllRequest.RowToInsert> rows)
public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows)
Insert rows into the table.
Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as #list(TableDataListOption...) and #copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.
Example of inserting rows into the table.
String rowId1 = "rowId1";
String rowId2 = "rowId2";
List<RowToInsert> rows = new ArrayList<>();
Map<String, Object> row1 = new HashMap<>();
row1.put("stringField", "value1");
row1.put("booleanField", true);
Map<String, Object> row2 = new HashMap<>();
row2.put("stringField", "value2");
row2.put("booleanField", false);
rows.add(RowToInsert.of(rowId1, row1));
rows.add(RowToInsert.of(rowId2, row2));
InsertAllResponse response = table.insert(rows);
// do something with response
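One hedged way to act on the response, rather than a prescribed pattern, is to check it for per-row insert errors:
if (response.hasErrors()) {
// getInsertErrors() maps the index of each failed row to its errors.
for (Map.Entry<Long, List<BigQueryError>> entry : response.getInsertErrors().entrySet()) {
System.out.println("Row " + entry.getKey() + " failed: " + entry.getValue());
}
}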
Parameter

| Name | Type | Description |
|---|---|---|
| rows | Iterable<RowToInsert> | rows to be inserted |

Returns

| Type | Description |
|---|---|
| InsertAllResponse |  |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues)
public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues)
Insert rows into the table.
Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as #list(TableDataListOption...) and #copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.
Example of inserting rows into the table, ignoring invalid rows.
String rowId1 = "rowId1";
String rowId2 = "rowId2";
List<RowToInsert> rows = new ArrayList<>();
Map<String, Object> row1 = new HashMap<>();
row1.put("stringField", 1);
row1.put("booleanField", true);
Map<String, Object> row2 = new HashMap<>();
row2.put("stringField", "value2");
row2.put("booleanField", false);
rows.add(RowToInsert.of(rowId1, row1));
rows.add(RowToInsert.of(rowId2, row2));
InsertAllResponse response = table.insert(rows, true, true);
// do something with response
Parameters

| Name | Type | Description |
|---|---|---|
| rows | Iterable<RowToInsert> | rows to be inserted |
| skipInvalidRows | boolean | whether to insert all valid rows even if invalid rows exist; if not set, the entire insert operation fails when the rows to be inserted contain an invalid row |
| ignoreUnknownValues | boolean | whether to accept rows containing values that do not match the schema; the unknown values are ignored. If not set, rows with unknown values are considered invalid |

Returns

| Type | Description |
|---|---|
| InsertAllResponse |  |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
list(BigQuery.TableDataListOption[] options)
public TableResult list(BigQuery.TableDataListOption[] options)
Returns the paginated list of rows in this table.
Example of listing rows in the table.
// This example reads the result 100 rows per RPC call. If there's no need to limit the number,
// simply omit the option.
Page<FieldValueList> page = table.list(TableDataListOption.pageSize(100));
for (FieldValueList row : page.iterateAll()) {
// do something with the row
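// For instance (a hedged sketch, assuming positional access to the first column):
FieldValue firstColumn = row.get(0);
System.out.println(firstColumn.getValue());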
}
Parameter

| Name | Type | Description |
|---|---|---|
| options | TableDataListOption[] | table data list options |

Returns

| Type | Description |
|---|---|
| TableResult |  |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
list(Schema schema, BigQuery.TableDataListOption[] options)
public TableResult list(Schema schema, BigQuery.TableDataListOption[] options)
Returns the paginated list of rows in this table.
Example of listing rows in the table given a schema.
Schema schema = ...;
String field = "my_field";
Page<FieldValueList> page = table.list(schema);
for (FieldValueList row : page.iterateAll()) {
row.get(field);
}
Parameters

| Name | Type | Description |
|---|---|---|
| schema | Schema |  |
| options | TableDataListOption[] | table data list options |

Returns

| Type | Description |
|---|---|
| TableResult |  |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
load(FormatOptions format, String sourceUri, BigQuery.JobOption[] options)
public Job load(FormatOptions format, String sourceUri, BigQuery.JobOption[] options)
Starts a BigQuery Job to load data into the current table from the provided source URI. Returns the started Job object.
Example of loading data from a single Google Cloud Storage file.
String sourceUri = "gs://my_bucket/filename.csv";
Job job = table.load(FormatOptions.csv(), sourceUri);
// Wait for the job to complete
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully
} else {
// Handle error case
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| format | FormatOptions | the format of the data to load |
| sourceUri | String | the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) from which to load the data |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption[] options)
public Job load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption[] options)
Starts a BigQuery Job to load data into the current table from the provided source URIs. Returns the started Job object.
Example of loading data from a list of Google Cloud Storage files.
String gcsUrl1 = "gs://my_bucket/filename1.csv";
String gcsUrl2 = "gs://my_bucket/filename2.csv";
List<String> sourceUris = new ArrayList<>();
sourceUris.add(gcsUrl1);
sourceUris.add(gcsUrl2);
Job job = table.load(FormatOptions.csv(), sourceUris);
// Wait for the job to complete
try {
Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
RetryOption.totalTimeout(Duration.ofMinutes(3)));
if (completedJob != null && completedJob.getStatus().getError() == null) {
// Job completed successfully
} else {
// Handle error case
}
} catch (InterruptedException e) {
// Handle interrupted wait
}
Parameters

| Name | Type | Description |
|---|---|---|
| format | FormatOptions | the format of the data to load |
| sourceUris | List<String> | the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) from which to load the data |
| options | JobOption[] | job options |

Returns

| Type | Description |
|---|---|
| Job | the started Job |

Exceptions

| Type | Description |
|---|---|
| BigQueryException | upon failure |
reload(BigQuery.TableOption[] options)
public Table reload(BigQuery.TableOption[] options)
Fetches the current table's latest information. Returns null if the table does not exist.
Example of fetching the table's latest information, specifying particular table fields to get.
TableField field1 = TableField.LAST_MODIFIED_TIME;
TableField field2 = TableField.NUM_ROWS;
Table latestTable = table.reload(TableOption.fields(field1, field2));
if (latestTable == null) {
// the table was not found
}
Parameter

| Name | Type | Description |
|---|---|---|
| options | TableOption[] | table options |

Returns

| Type | Description |
|---|---|
| Table | a Table object with the latest information, or null if the table does not exist |
toBuilder()
public Table.Builder toBuilder()
Returns a builder for the table object.
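A brief sketch of one possible use (the description value is an arbitrary assumption; nothing is sent to the service until update() is called on the built object):
Table.Builder builder = table.toBuilder();
// build() produces a new, local Table object; persist the change with update() if desired.
Table modified = builder.setDescription("a new description").build();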
Returns

| Type | Description |
|---|---|
| Table.Builder | a builder for the table object |
update(BigQuery.TableOption[] options)
public Table update(BigQuery.TableOption[] options)
Updates the table's information with this table's information. The dataset's and table's user-defined ids cannot be changed. A new Table object is returned.
Example of updating the table's information.
Table updatedTable = table.toBuilder().setDescription("new description").build().update();
Parameter

| Name | Type | Description |
|---|---|---|
| options | TableOption[] | table options |

Returns

| Type | Description |
|---|---|
| Table | a new Table object with the updated information |