The interfaces provided are listed below, along with usage samples.
BaseBigQueryReadClient
Service Description: BigQuery Read API.
The Read API can be used to read data from BigQuery.
Sample for BaseBigQueryReadClient:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
try (BaseBigQueryReadClient baseBigQueryReadClient = BaseBigQueryReadClient.create()) {
ProjectName parent = ProjectName.of("[PROJECT]");
ReadSession readSession = ReadSession.newBuilder().build();
int maxStreamCount = 940837515;
ReadSession response =
baseBigQueryReadClient.createReadSession(parent, readSession, maxStreamCount);
}
BigQueryWriteClient
Service Description: BigQuery Write API.
The Write API can be used to write data to BigQuery.
For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
Sample for BigQueryWriteClient:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
try (BigQueryWriteClient bigQueryWriteClient = BigQueryWriteClient.create()) {
TableName parent = TableName.of("[PROJECT]", "[DATASET]", "[TABLE]");
WriteStream writeStream = WriteStream.newBuilder().build();
WriteStream response = bigQueryWriteClient.createWriteStream(parent, writeStream);
}
Classes
AnnotationsProto
AppendRowsRequest
Request message for AppendRows.
Due to the nature of AppendRows being a bidirectional streaming RPC, certain parts of the AppendRowsRequest need only be specified for the first request sent each time the gRPC network connection is opened/reopened.
A single AppendRowsRequest must be less than 10 MB in size. Requests larger than this return an error, typically INVALID_ARGUMENT.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsRequest
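To illustrate the first-request-only fields, here is a minimal sketch of building the initial request on a new connection; streamName, protoSchema, and protoRows are assumed to be defined elsewhere:
// Hedged sketch: only the first AppendRowsRequest on a (re)opened connection needs
// the write stream name and writer schema; subsequent requests may carry rows only.
AppendRowsRequest firstRequest =
    AppendRowsRequest.newBuilder()
        .setWriteStream(streamName) // e.g. a WriteStreamName formatted as a string
        .setProtoRows(
            AppendRowsRequest.ProtoData.newBuilder()
                .setWriterSchema(protoSchema)
                .setRows(protoRows)
                .build())
        .build();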
AppendRowsRequest.Builder
Request message for AppendRows.
Due to the nature of AppendRows being a bidirectional streaming RPC, certain parts of the AppendRowsRequest need only be specified for the first request sent each time the gRPC network connection is opened/reopened.
A single AppendRowsRequest must be less than 10 MB in size. Requests larger than this return an error, typically INVALID_ARGUMENT.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsRequest
AppendRowsRequest.ProtoData
ProtoData contains the data rows and schema when constructing append requests.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsRequest.ProtoData
AppendRowsRequest.ProtoData.Builder
ProtoData contains the data rows and schema when constructing append requests.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsRequest.ProtoData
AppendRowsResponse
Response message for AppendRows.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsResponse
AppendRowsResponse.AppendResult
AppendResult is returned for successful append requests.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsResponse.AppendResult
AppendRowsResponse.AppendResult.Builder
AppendResult is returned for successful append requests.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsResponse.AppendResult
AppendRowsResponse.Builder
Response message for AppendRows.
Protobuf type google.cloud.bigquery.storage.v1.AppendRowsResponse
ArrowProto
ArrowRecordBatch
Arrow RecordBatch.
Protobuf type google.cloud.bigquery.storage.v1.ArrowRecordBatch
ArrowRecordBatch.Builder
Arrow RecordBatch.
Protobuf type google.cloud.bigquery.storage.v1.ArrowRecordBatch
ArrowSchema
Arrow schema as specified in https://arrow.apache.org/docs/python/api/datatypes.html and serialized to bytes using IPC: https://arrow.apache.org/docs/format/Columnar.html#serialization-and-interprocess-communication-ipc. See code samples for how this message can be deserialized.
Protobuf type google.cloud.bigquery.storage.v1.ArrowSchema
ArrowSchema.Builder
Arrow schema as specified in https://arrow.apache.org/docs/python/api/datatypes.html and serialized to bytes using IPC: https://arrow.apache.org/docs/format/Columnar.html#serialization-and-interprocess-communication-ipc. See code samples for how this message can be deserialized.
Protobuf type google.cloud.bigquery.storage.v1.ArrowSchema
ArrowSerializationOptions
Contains options specific to Arrow Serialization.
Protobuf type google.cloud.bigquery.storage.v1.ArrowSerializationOptions
ArrowSerializationOptions.Builder
Contains options specific to Arrow Serialization.
Protobuf type google.cloud.bigquery.storage.v1.ArrowSerializationOptions
AvroProto
AvroRows
Avro rows.
Protobuf type google.cloud.bigquery.storage.v1.AvroRows
AvroRows.Builder
Avro rows.
Protobuf type google.cloud.bigquery.storage.v1.AvroRows
AvroSchema
Avro schema.
Protobuf type google.cloud.bigquery.storage.v1.AvroSchema
AvroSchema.Builder
Avro schema.
Protobuf type google.cloud.bigquery.storage.v1.AvroSchema
AvroSerializationOptions
Contains options specific to Avro Serialization.
Protobuf type google.cloud.bigquery.storage.v1.AvroSerializationOptions
AvroSerializationOptions.Builder
Contains options specific to Avro Serialization.
Protobuf type google.cloud.bigquery.storage.v1.AvroSerializationOptions
BQTableSchemaToProtoDescriptor
Converts a BigQuery table schema to a protobuf descriptor. All field names are converted to lowercase when constructing the protobuf descriptor. The mapping between field types and field modes is shown in the ImmutableMaps below.
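A hedged sketch of the conversion is shown below; the static method name convertBQTableSchemaToProtoDescriptor and the sample column are assumptions based on this class's typical surface:
// Build a minimal TableSchema and convert it to a protobuf Descriptor for proto rows.
TableSchema tableSchema =
    TableSchema.newBuilder()
        .addFields(
            TableFieldSchema.newBuilder()
                .setName("quantity") // hypothetical column
                .setType(TableFieldSchema.Type.INT64)
                .setMode(TableFieldSchema.Mode.NULLABLE)
                .build())
        .build();
Descriptors.Descriptor descriptor =
    BQTableSchemaToProtoDescriptor.convertBQTableSchemaToProtoDescriptor(tableSchema);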
BaseBigQueryReadClient
Service Description: BigQuery Read API.
The Read API can be used to read data from BigQuery.
This class provides the ability to make remote calls to the backing service through method calls that map to API methods. Sample code to get started:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
try (BaseBigQueryReadClient baseBigQueryReadClient = BaseBigQueryReadClient.create()) {
ProjectName parent = ProjectName.of("[PROJECT]");
ReadSession readSession = ReadSession.newBuilder().build();
int maxStreamCount = 940837515;
ReadSession response =
baseBigQueryReadClient.createReadSession(parent, readSession, maxStreamCount);
}
Note: close() needs to be called on the BaseBigQueryReadClient object to clean up resources such as threads. In the example above, try-with-resources is used, which automatically calls close().
The surface of this class includes several types of Java methods for each of the API's methods:
- A "flattened" method. With this type of method, the fields of the request type have been converted into function parameters. It may be the case that not all fields are available as parameters, and not every API method will have a flattened method entry point.
- A "request object" method. This type of method only takes one parameter, a request object, which must be constructed before the call. Not every API method will have a request object method.
- A "callable" method. This type of method takes no parameters and returns an immutable API callable object, which can be used to initiate calls to the service.
See the individual methods for example code.
Many parameters require resource names to be formatted in a particular way. To assist with these names, this class includes a format method for each type of name, and additionally a parse method to extract the individual identifiers contained within names that are returned.
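For instance, a brief sketch using the TableName helper seen in the samples above (identifiers are hypothetical):
// Format a resource name from its identifiers, then parse it back.
TableName name = TableName.of("my-project", "my_dataset", "my_table");
String formatted = name.toString(); // "projects/my-project/datasets/my_dataset/tables/my_table"
TableName parsed = TableName.parse(formatted);
String dataset = parsed.getDataset(); // "my_dataset"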
This class can be customized by passing in a custom instance of BaseBigQueryReadSettings to create(). For example:
To customize credentials:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BaseBigQueryReadSettings baseBigQueryReadSettings =
BaseBigQueryReadSettings.newBuilder()
.setCredentialsProvider(FixedCredentialsProvider.create(myCredentials))
.build();
BaseBigQueryReadClient baseBigQueryReadClient =
BaseBigQueryReadClient.create(baseBigQueryReadSettings);
To customize the endpoint:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BaseBigQueryReadSettings baseBigQueryReadSettings =
BaseBigQueryReadSettings.newBuilder().setEndpoint(myEndpoint).build();
BaseBigQueryReadClient baseBigQueryReadClient =
BaseBigQueryReadClient.create(baseBigQueryReadSettings);
Please refer to the GitHub repository's samples for more quickstart code snippets.
BaseBigQueryReadSettings
Settings class to configure an instance of BaseBigQueryReadClient.
The default instance has everything set to sensible defaults:
- The default service address (bigquerystorage.googleapis.com) and default port (443) are used.
- Credentials are acquired automatically through Application Default Credentials.
- Retries are configured for idempotent methods but not for non-idempotent methods.
The builder of this class is recursive, so contained classes are themselves builders. When build() is called, the tree of builders is called to create the complete settings object.
For example, to set the total timeout of createReadSession to 30 seconds:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BaseBigQueryReadSettings.Builder baseBigQueryReadSettingsBuilder =
BaseBigQueryReadSettings.newBuilder();
baseBigQueryReadSettingsBuilder
.createReadSessionSettings()
.setRetrySettings(
baseBigQueryReadSettingsBuilder
.createReadSessionSettings()
.getRetrySettings()
.toBuilder()
.setTotalTimeout(Duration.ofSeconds(30))
.build());
BaseBigQueryReadSettings baseBigQueryReadSettings = baseBigQueryReadSettingsBuilder.build();
BaseBigQueryReadSettings.Builder
Builder for BaseBigQueryReadSettings.
BatchCommitWriteStreamsRequest
Request message for BatchCommitWriteStreams.
Protobuf type google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequest
BatchCommitWriteStreamsRequest.Builder
Request message for BatchCommitWriteStreams.
Protobuf type google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequest
BatchCommitWriteStreamsResponse
Response message for BatchCommitWriteStreams.
Protobuf type google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsResponse
BatchCommitWriteStreamsResponse.Builder
Response message for BatchCommitWriteStreams.
Protobuf type google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsResponse
BigDecimalByteStringEncoder
BigQueryReadClient
Service Description: BigQuery Read API.
The Read API can be used to read data from BigQuery.
This class provides the ability to make remote calls to the backing service through method calls that map to API methods. Sample code to get started:
try (BigQueryReadClient bigQueryReadClient = BigQueryReadClient.create()) {
String parent = "projects/[PROJECT]";
ReadSession readSession = ReadSession.newBuilder().build();
int maxStreamCount = 0;
ReadSession response = bigQueryReadClient.createReadSession(parent, readSession, maxStreamCount);
}
Note: close() needs to be called on the BigQueryReadClient object to clean up resources such as threads. In the example above, try-with-resources is used, which automatically calls close().
The surface of this class includes several types of Java methods for each of the API's methods:
- A "flattened" method. With this type of method, the fields of the request type have been converted into function parameters. It may be the case that not all fields are available as parameters, and not every API method will have a flattened method entry point.
- A "request object" method. This type of method only takes one parameter, a request object, which must be constructed before the call. Not every API method will have a request object method.
- A "callable" method. This type of method takes no parameters and returns an immutable API callable object, which can be used to initiate calls to the service.
See the individual methods for example code.
Many parameters require resource names to be formatted in a particular way. To assist with these names, this class includes a format method for each type of name, and additionally a parse method to extract the individual identifiers contained within names that are returned.
This class can be customized by passing in a custom instance of BigQueryReadSettings to create(). For example:
To customize credentials:
BigQueryReadSettings bigQueryReadSettings =
BigQueryReadSettings.newBuilder()
.setCredentialsProvider(FixedCredentialsProvider.create(myCredentials))
.build();
BigQueryReadClient bigQueryReadClient =
BigQueryReadClient.create(bigQueryReadSettings);
To customize the endpoint:
BigQueryReadSettings bigQueryReadSettings =
BigQueryReadSettings.newBuilder().setEndpoint(myEndpoint).build();
BigQueryReadClient bigQueryReadClient =
BigQueryReadClient.create(bigQueryReadSettings);
BigQueryReadGrpc
BigQuery Read API. The Read API can be used to read data from BigQuery.
BigQueryReadGrpc.BigQueryReadBlockingStub
BigQuery Read API. The Read API can be used to read data from BigQuery.
BigQueryReadGrpc.BigQueryReadFutureStub
BigQuery Read API. The Read API can be used to read data from BigQuery.
BigQueryReadGrpc.BigQueryReadImplBase
BigQuery Read API. The Read API can be used to read data from BigQuery.
BigQueryReadGrpc.BigQueryReadStub
BigQuery Read API. The Read API can be used to read data from BigQuery.
BigQueryReadSettings
Settings class to configure an instance of BigQueryReadClient.
The default instance has everything set to sensible defaults:
- The default service address (bigquerystorage.googleapis.com) and default port (443) are used.
- Credentials are acquired automatically through Application Default Credentials.
- Retries are configured for idempotent methods but not for non-idempotent methods.
The builder of this class is recursive, so contained classes are themselves builders. When build() is called, the tree of builders is called to create the complete settings object.
For example, to set the total timeout of createReadSession to 30 seconds:
BigQueryReadSettings.Builder bigQueryReadSettingsBuilder =
BigQueryReadSettings.newBuilder();
bigQueryReadSettingsBuilder
.createReadSessionSettings()
.setRetrySettings(
bigQueryReadSettingsBuilder
.createReadSessionSettings()
.getRetrySettings()
.toBuilder()
.setTotalTimeout(Duration.ofSeconds(30))
.build());
BigQueryReadSettings bigQueryReadSettings = bigQueryReadSettingsBuilder.build();
BigQueryReadSettings.Builder
Builder for BigQueryReadSettings.
BigQuerySchemaUtil
BigQueryWriteClient
Service Description: BigQuery Write API.
The Write API can be used to write data to BigQuery.
For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
This class provides the ability to make remote calls to the backing service through method calls that map to API methods. Sample code to get started:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
try (BigQueryWriteClient bigQueryWriteClient = BigQueryWriteClient.create()) {
TableName parent = TableName.of("[PROJECT]", "[DATASET]", "[TABLE]");
WriteStream writeStream = WriteStream.newBuilder().build();
WriteStream response = bigQueryWriteClient.createWriteStream(parent, writeStream);
}
Note: close() needs to be called on the BigQueryWriteClient object to clean up resources such as threads. In the example above, try-with-resources is used, which automatically calls close().
The surface of this class includes several types of Java methods for each of the API's methods:
- A "flattened" method. With this type of method, the fields of the request type have been converted into function parameters. It may be the case that not all fields are available as parameters, and not every API method will have a flattened method entry point.
- A "request object" method. This type of method only takes one parameter, a request object, which must be constructed before the call. Not every API method will have a request object method.
- A "callable" method. This type of method takes no parameters and returns an immutable API callable object, which can be used to initiate calls to the service.
See the individual methods for example code.
Many parameters require resource names to be formatted in a particular way. To assist with these names, this class includes a format method for each type of name, and additionally a parse method to extract the individual identifiers contained within names that are returned.
This class can be customized by passing in a custom instance of BigQueryWriteSettings to create(). For example:
To customize credentials:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BigQueryWriteSettings bigQueryWriteSettings =
BigQueryWriteSettings.newBuilder()
.setCredentialsProvider(FixedCredentialsProvider.create(myCredentials))
.build();
BigQueryWriteClient bigQueryWriteClient = BigQueryWriteClient.create(bigQueryWriteSettings);
To customize the endpoint:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BigQueryWriteSettings bigQueryWriteSettings =
BigQueryWriteSettings.newBuilder().setEndpoint(myEndpoint).build();
BigQueryWriteClient bigQueryWriteClient = BigQueryWriteClient.create(bigQueryWriteSettings);
Please refer to the GitHub repository's samples for more quickstart code snippets.
BigQueryWriteGrpc
BigQuery Write API. The Write API can be used to write data to BigQuery. For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
BigQueryWriteGrpc.BigQueryWriteBlockingStub
BigQuery Write API. The Write API can be used to write data to BigQuery. For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
BigQueryWriteGrpc.BigQueryWriteFutureStub
BigQuery Write API. The Write API can be used to write data to BigQuery. For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
BigQueryWriteGrpc.BigQueryWriteImplBase
BigQuery Write API. The Write API can be used to write data to BigQuery. For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
BigQueryWriteGrpc.BigQueryWriteStub
BigQuery Write API. The Write API can be used to write data to BigQuery. For supplementary information about the Write API, see: https://cloud.google.com/bigquery/docs/write-api
BigQueryWriteSettings
Settings class to configure an instance of BigQueryWriteClient.
The default instance has everything set to sensible defaults:
- The default service address (bigquerystorage.googleapis.com) and default port (443) are used.
- Credentials are acquired automatically through Application Default Credentials.
- Retries are configured for idempotent methods but not for non-idempotent methods.
The builder of this class is recursive, so contained classes are themselves builders. When build() is called, the tree of builders is called to create the complete settings object.
For example, to set the total timeout of createWriteStream to 30 seconds:
// This snippet has been automatically generated and should be regarded as a code template only.
// It will require modifications to work:
// - It may require correct/in-range values for request initialization.
// - It may require specifying regional endpoints when creating the service client as shown in
// https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
BigQueryWriteSettings.Builder bigQueryWriteSettingsBuilder = BigQueryWriteSettings.newBuilder();
bigQueryWriteSettingsBuilder
.createWriteStreamSettings()
.setRetrySettings(
bigQueryWriteSettingsBuilder
.createWriteStreamSettings()
.getRetrySettings()
.toBuilder()
.setTotalTimeout(Duration.ofSeconds(30))
.build());
BigQueryWriteSettings bigQueryWriteSettings = bigQueryWriteSettingsBuilder.build();
BigQueryWriteSettings.Builder
Builder for BigQueryWriteSettings.
CivilTimeEncoder
Ported from the ZetaSQL CivilTimeEncoder. Original code can be found at https://github.com/google/zetasql/blob/master/java/com/google/zetasql/CivilTimeEncoder.java. Encoder for TIME and DATETIME values, according to the civil_time encoding.
The valid range and number of bits required by each date/time field is as follows:
Field | Range | # Bits
--- | --- | ---
Year | [1, 9999] | 14
Month | [1, 12] | 4
Day | [1, 31] | 5
Hour | [0, 23] | 5
Minute | [0, 59] | 6
Second | [0, 59]* | 6
Micros | [0, 999999] | 20
Nanos | [0, 999999999] | 30
* Leap second is not supported.
When encoding a TIME or DATETIME into a bit field, the larger date/time field occupies the more significant bits.
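As an illustration only (not this class's actual API), the micros-precision DATETIME packing described above can be sketched as follows, assuming year through micros are ints already validated against the ranges in the table:
// Illustrative sketch of the civil_time bit layout above:
// year:14 | month:4 | day:5 | hour:5 | minute:6 | second:6 | micros:20
long packed = 0L;
packed = (packed << 14) | year;
packed = (packed << 4) | month;
packed = (packed << 5) | day;
packed = (packed << 5) | hour;
packed = (packed << 6) | minute;
packed = (packed << 6) | second;
packed = (packed << 20) | micros;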
ConnectionWorkerPool
Pool of connections that accepts appends and distributes them across the different connections.
ConnectionWorkerPool.Settings
Settings for connection pool.
ConnectionWorkerPool.Settings.Builder
Builder for the options to configure ConnectionWorkerPool.
CreateReadSessionRequest
Request message for CreateReadSession.
Protobuf type google.cloud.bigquery.storage.v1.CreateReadSessionRequest
CreateReadSessionRequest.Builder
Request message for CreateReadSession.
Protobuf type google.cloud.bigquery.storage.v1.CreateReadSessionRequest
CreateWriteStreamRequest
Request message for CreateWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.CreateWriteStreamRequest
CreateWriteStreamRequest.Builder
Request message for CreateWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.CreateWriteStreamRequest
Exceptions
Exceptions for Storage Client Libraries.
FinalizeWriteStreamRequest
Request message for invoking FinalizeWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.FinalizeWriteStreamRequest
FinalizeWriteStreamRequest.Builder
Request message for invoking FinalizeWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.FinalizeWriteStreamRequest
FinalizeWriteStreamResponse
Response message for FinalizeWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.FinalizeWriteStreamResponse
FinalizeWriteStreamResponse.Builder
Response message for FinalizeWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.FinalizeWriteStreamResponse
FlushRowsRequest
Request message for FlushRows.
Protobuf type google.cloud.bigquery.storage.v1.FlushRowsRequest
FlushRowsRequest.Builder
Request message for FlushRows.
Protobuf type google.cloud.bigquery.storage.v1.FlushRowsRequest
FlushRowsResponse
Response message for FlushRows.
Protobuf type google.cloud.bigquery.storage.v1.FlushRowsResponse
FlushRowsResponse.Builder
Response message for FlushRows.
Protobuf type google.cloud.bigquery.storage.v1.FlushRowsResponse
GetWriteStreamRequest
Request message for GetWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.GetWriteStreamRequest
GetWriteStreamRequest.Builder
Request message for GetWriteStream.
Protobuf type google.cloud.bigquery.storage.v1.GetWriteStreamRequest
JsonStreamWriter
A StreamWriter that can write JSON data (JSONObjects) to BigQuery tables. The JsonStreamWriter is built on top of a StreamWriter: it converts all JSON data to protobuf messages and then calls StreamWriter's append() method to write to BigQuery tables. It retains all StreamWriter functions and also provides an additional feature: schema update support, where if the BigQuery table schema is updated, users will be able to ingest data on the new schema after some time (on the order of minutes).
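A minimal usage sketch, hedged: builder overloads vary by version, "item" is a hypothetical column, and streamName and tableSchema are assumed to exist:
// Append one JSON row through a JsonStreamWriter.
try (JsonStreamWriter writer = JsonStreamWriter.newBuilder(streamName, tableSchema).build()) {
  JSONObject row = new JSONObject();
  row.put("item", "widget"); // hypothetical column name
  JSONArray rows = new JSONArray();
  rows.put(row);
  ApiFuture<AppendRowsResponse> future = writer.append(rows);
  AppendRowsResponse response = future.get(); // blocks; handle exceptions in real code
}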
JsonStreamWriter.Builder
JsonToProtoMessage
Converts JSON data to protocol buffer messages, given the protocol buffer descriptor. The protobuf descriptor must have all fields lowercased.
ProjectName
ProjectName.Builder
Builder for projects/{project}.
ProtoBufProto
ProtoRows
Protobuf type google.cloud.bigquery.storage.v1.ProtoRows
ProtoRows.Builder
Protobuf type google.cloud.bigquery.storage.v1.ProtoRows
ProtoSchema
ProtoSchema describes the schema of the serialized protocol buffer data rows.
Protobuf type google.cloud.bigquery.storage.v1.ProtoSchema
ProtoSchema.Builder
ProtoSchema describes the schema of the serialized protocol buffer data rows.
Protobuf type google.cloud.bigquery.storage.v1.ProtoSchema
ProtoSchemaConverter
ReadRowsRequest
Request message for ReadRows.
Protobuf type google.cloud.bigquery.storage.v1.ReadRowsRequest
ReadRowsRequest.Builder
Request message for ReadRows.
Protobuf type google.cloud.bigquery.storage.v1.ReadRowsRequest
ReadRowsResponse
Response from calling ReadRows may include row data, progress, and throttling information.
Protobuf type google.cloud.bigquery.storage.v1.ReadRowsResponse
ReadRowsResponse.Builder
Response from calling ReadRows may include row data, progress, and throttling information.
Protobuf type google.cloud.bigquery.storage.v1.ReadRowsResponse
ReadSession
Information about the ReadSession.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession
ReadSession.Builder
Information about the ReadSession.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession
ReadSession.TableModifiers
Additional attributes when reading a table.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession.TableModifiers
ReadSession.TableModifiers.Builder
Additional attributes when reading a table.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession.TableModifiers
ReadSession.TableReadOptions
Options dictating how we read a table.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions
ReadSession.TableReadOptions.Builder
Options dictating how we read a table.
Protobuf type google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions
ReadStream
Information about a single stream that gets data out of the storage system.
Most of the information about ReadStream instances is aggregated, making ReadStream lightweight.
Protobuf type google.cloud.bigquery.storage.v1.ReadStream
ReadStream.Builder
Information about a single stream that gets data out of the storage system.
Most of the information about ReadStream instances is aggregated, making ReadStream lightweight.
Protobuf type google.cloud.bigquery.storage.v1.ReadStream
ReadStreamName
ReadStreamName.Builder
Builder for projects/{project}/locations/{location}/sessions/{session}/streams/{stream}.
RowError
The message that presents row-level error info in a request.
Protobuf type google.cloud.bigquery.storage.v1.RowError
RowError.Builder
The message that presents row-level error info in a request.
Protobuf type google.cloud.bigquery.storage.v1.RowError
SplitReadStreamRequest
Request message for SplitReadStream.
Protobuf type google.cloud.bigquery.storage.v1.SplitReadStreamRequest
SplitReadStreamRequest.Builder
Request message for SplitReadStream.
Protobuf type google.cloud.bigquery.storage.v1.SplitReadStreamRequest
SplitReadStreamResponse
Response message for SplitReadStream.
Protobuf type google.cloud.bigquery.storage.v1.SplitReadStreamResponse
SplitReadStreamResponse.Builder
Response message for SplitReadStream.
Protobuf type google.cloud.bigquery.storage.v1.SplitReadStreamResponse
StorageError
Structured custom BigQuery Storage error message. The error can be attached as error details in the returned rpc Status. In particular, the use of error codes allows more structured error handling, and reduces the need to evaluate unstructured error text strings.
Protobuf type google.cloud.bigquery.storage.v1.StorageError
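One hedged way to recover the structured error from a failed RPC, assuming the server attached it as error details and io.grpc.protobuf.StatusProto is on the classpath:
// Unpack a StorageError from a failed call's status details.
com.google.rpc.Status status = io.grpc.protobuf.StatusProto.fromThrowable(throwable);
if (status != null) {
  for (com.google.protobuf.Any detail : status.getDetailsList()) {
    if (detail.is(StorageError.class)) {
      // unpack throws InvalidProtocolBufferException; handle or declare it.
      StorageError storageError = detail.unpack(StorageError.class);
      // Branch on storageError.getCode() instead of parsing message text.
    }
  }
}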
StorageError.Builder
Structured custom BigQuery Storage error message. The error can be attached as error details in the returned rpc Status. In particular, the use of error codes allows more structured error handling, and reduces the need to evaluate unstructured error text strings.
Protobuf type google.cloud.bigquery.storage.v1.StorageError
StorageProto
StreamConnection
StreamConnection is responsible for writing requests to a gRPC bidirectional connection.
StreamWriter creates a connection. Two callback functions are necessary: request_callback and done_callback. The request callback is used for every request, and the done callback is used to notify the user that the connection is closed and no more callbacks will be received from this connection.
The stream writer accepts all requests without flow control and makes the callbacks in the order received.
It is the user's responsibility to do flow control and to maintain the lifetime of the requests.
StreamProto
StreamStats
Estimated stream statistics for a given read Stream.
Protobuf type google.cloud.bigquery.storage.v1.StreamStats
StreamStats.Builder
Estimated stream statistics for a given read Stream.
Protobuf type google.cloud.bigquery.storage.v1.StreamStats
StreamStats.Progress
Protobuf type google.cloud.bigquery.storage.v1.StreamStats.Progress
StreamStats.Progress.Builder
Protobuf type google.cloud.bigquery.storage.v1.StreamStats.Progress
StreamWriter
A BigQuery stream writer that can be used to write data into a BigQuery table.
TODO: Support batching.
StreamWriter.Builder
A builder of StreamWriters.
StreamWriter.SingleConnectionOrConnectionPool
When in single-table mode, appends go directly to the connectionWorker; otherwise, in multiplexing mode, appends go to the connection pool.
TableFieldSchema
TableFieldSchema defines a single field/column within a table schema.
Protobuf type google.cloud.bigquery.storage.v1.TableFieldSchema
TableFieldSchema.Builder
TableFieldSchema defines a single field/column within a table schema.
Protobuf type google.cloud.bigquery.storage.v1.TableFieldSchema
TableName
TableName.Builder
Builder for projects/{project}/datasets/{dataset}/tables/{table}.
TableProto
TableSchema
Schema of a table. This schema is a subset of google.cloud.bigquery.v2.TableSchema, containing the information necessary to generate a valid message to write to BigQuery.
Protobuf type google.cloud.bigquery.storage.v1.TableSchema
TableSchema.Builder
Schema of a table. This schema is a subset of google.cloud.bigquery.v2.TableSchema, containing the information necessary to generate a valid message to write to BigQuery.
Protobuf type google.cloud.bigquery.storage.v1.TableSchema
ThrottleState
Information on whether the current connection is being throttled.
Protobuf type google.cloud.bigquery.storage.v1.ThrottleState
ThrottleState.Builder
Information on whether the current connection is being throttled.
Protobuf type google.cloud.bigquery.storage.v1.ThrottleState
WriteStream
Information about a single stream that gets data into the storage system.
Protobuf type google.cloud.bigquery.storage.v1.WriteStream
WriteStream.Builder
Information about a single stream that gets data into the storage system.
Protobuf type google.cloud.bigquery.storage.v1.WriteStream
WriteStreamName
WriteStreamName.Builder
Builder for projects/{project}/datasets/{dataset}/tables/{table}/streams/{stream}.
Interfaces
AppendRowsRequest.ProtoDataOrBuilder
AppendRowsRequestOrBuilder
AppendRowsResponse.AppendResultOrBuilder
AppendRowsResponseOrBuilder
ArrowRecordBatchOrBuilder
ArrowSchemaOrBuilder
ArrowSerializationOptionsOrBuilder
AvroRowsOrBuilder
AvroSchemaOrBuilder
AvroSerializationOptionsOrBuilder
BatchCommitWriteStreamsRequestOrBuilder
BatchCommitWriteStreamsResponseOrBuilder
BigQueryReadSettings.RetryAttemptListener
CreateReadSessionRequestOrBuilder
CreateWriteStreamRequestOrBuilder
FinalizeWriteStreamRequestOrBuilder
FinalizeWriteStreamResponseOrBuilder
FlushRowsRequestOrBuilder
FlushRowsResponseOrBuilder
GetWriteStreamRequestOrBuilder
ProtoRowsOrBuilder
ProtoSchemaOrBuilder
ReadRowsRequestOrBuilder
ReadRowsResponseOrBuilder
ReadSession.TableModifiersOrBuilder
ReadSession.TableReadOptionsOrBuilder
ReadSessionOrBuilder
ReadStreamOrBuilder
RowErrorOrBuilder
SplitReadStreamRequestOrBuilder
SplitReadStreamResponseOrBuilder
StorageErrorOrBuilder
StreamConnection.DoneCallback
Invoked when the server closes the connection.
StreamConnection.RequestCallback
Invoked when a response is received from the server.
StreamStats.ProgressOrBuilder
StreamStatsOrBuilder
TableFieldSchemaOrBuilder
TableSchemaOrBuilder
ThrottleStateOrBuilder
WriteStreamOrBuilder
Enums
AppendRowsRequest.MissingValueInterpretation
An enum to indicate how to interpret missing values. Missing values are fields present in the user schema but missing in rows. A missing value can represent a NULL or a column default value defined in the BigQuery table schema.
Protobuf enum google.cloud.bigquery.storage.v1.AppendRowsRequest.MissingValueInterpretation
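A hedged sketch of setting it on a request, assuming the generated setter for the default_missing_value_interpretation field:
// Treat any field missing from the rows as the column's BigQuery default value.
AppendRowsRequest request =
    AppendRowsRequest.newBuilder()
        .setDefaultMissingValueInterpretation(
            AppendRowsRequest.MissingValueInterpretation.DEFAULT_VALUE)
        .build();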
AppendRowsRequest.RowsCase
AppendRowsResponse.ResponseCase
ArrowSerializationOptions.CompressionCodec
Compression codecs supported by Arrow.
Protobuf enum google.cloud.bigquery.storage.v1.ArrowSerializationOptions.CompressionCodec
DataFormat
Data format for input or output data.
Protobuf enum google.cloud.bigquery.storage.v1.DataFormat
ReadRowsResponse.RowsCase
ReadRowsResponse.SchemaCase
ReadSession.SchemaCase
ReadSession.TableReadOptions.OutputFormatSerializationOptionsCase
RowError.RowErrorCode
Error code for RowError.
Protobuf enum google.cloud.bigquery.storage.v1.RowError.RowErrorCode
StorageError.StorageErrorCode
Error code for StorageError.
Protobuf enum google.cloud.bigquery.storage.v1.StorageError.StorageErrorCode
StreamWriter.SingleConnectionOrConnectionPool.Kind
Kind of connection operation mode.
TableFieldSchema.Mode
Protobuf enum google.cloud.bigquery.storage.v1.TableFieldSchema.Mode
TableFieldSchema.Type
Protobuf enum google.cloud.bigquery.storage.v1.TableFieldSchema.Type
WriteStream.Type
Type enum of the stream.
Protobuf enum google.cloud.bigquery.storage.v1.WriteStream.Type
WriteStream.WriteMode
Mode enum of the stream.
Protobuf enum google.cloud.bigquery.storage.v1.WriteStream.WriteMode
WriteStreamView
WriteStreamView is a view enum that controls what details about a write stream should be returned.
Protobuf enum google.cloud.bigquery.storage.v1.WriteStreamView
Exceptions
Exceptions.AppendSerializtionError
This exception is thrown from JsonStreamWriter#append() when the client-side JSON-to-proto serialization fails. It can also be thrown by the server in case the rows contain invalid data. The exception contains a Map from the indexes of the faulty rows to their corresponding error messages.
Exceptions.FieldParseError
This exception is used internally to handle field level parsing errors.
Exceptions.InflightBytesLimitExceededException
Exceptions.InflightLimitExceededException
If FlowController.LimitExceededBehavior is set to Block and the inflight limit is exceeded, this exception will be thrown. If it is just a spike, you may retry the request. Otherwise, you can increase the inflight limit or create more StreamWriters to handle your traffic.
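A hedged sketch of raising the limits when building a StreamWriter, assuming the builder exposes these setters as recent versions do; streamName and the limit values are placeholders:
// Configure inflight limits and the behavior when they are exceeded.
StreamWriter writer =
    StreamWriter.newBuilder(streamName)
        .setMaxInflightRequests(10000)
        .setMaxInflightBytes(100 * 1024 * 1024) // 100 MiB
        .setLimitExceededBehavior(FlowController.LimitExceededBehavior.Block)
        .build();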
Exceptions.InflightRequestsLimitExceededException
Exceptions.JsonDataHasUnknownFieldException
The input JSON data has a field unknown to the schema of the JsonStreamWriter. Users can either turn on the IgnoreUnknownFields option on the JsonStreamWriter or, if they don't want the error to be ignored, recreate the JsonStreamWriter with the updated table schema.
Exceptions.OffsetAlreadyExists
Offset already exists. This indicates that the append request attempted to write data to an offset before the current end of the stream. This is an expected exception when exactly-once delivery is enforced. You can safely ignore it and keep appending until there is new data to append.
Exceptions.OffsetOutOfRange
Offset out of range. This indicates that the append request attempted to write data to a point beyond the current end of the stream. To append data successfully, you must either specify the offset corresponding to the current end of the stream or omit the offset from the append request. It usually indicates a bug in your code that introduced a gap in the appends.
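For example, a hedged sketch of appending at an explicit offset and tolerating duplicates under exactly-once semantics; writer, rows, and offset are assumed to exist:
// Append at the expected end-of-stream offset; an OffsetAlreadyExists failure
// means these rows were already written durably and can be safely ignored.
try {
  writer.append(rows, offset).get();
} catch (ExecutionException e) {
  if (!(e.getCause() instanceof Exceptions.OffsetAlreadyExists)) {
    throw e; // OffsetOutOfRange and other causes still require real handling
  }
}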
Exceptions.SchemaMismatchedException
There was a schema mismatch because the BigQuery table has fewer fields than the input message. This can be resolved by updating the table's schema with the message schema.
Exceptions.StorageException
Main Storage Exception. It might contain a map of streams to the errors for those streams.
Exceptions.StreamFinalizedException
The write stream has already been finalized and will not accept further appends or flushes. To send additional requests, you will need to create a new write stream via CreateWriteStream.
Exceptions.StreamNotFound
The stream is not found. Possible causes include incorrectly specifying the stream identifier or attempting to use an old stream identifier that no longer exists. You can invoke CreateWriteStream to create a new stream.
Exceptions.StreamWriterClosedException
This writer instance has either been closed by the user explicitly, or has encountered non-retriable errors.
To continue to write to the same stream, you will need to create a new writer instance.