Package com.google.cloud.bigquery.storage.v1 (3.8.0)

Client Classes

Client classes are the main entry point to using a package. Each client class contains several Java method variations for each of the API's RPC methods.

Client Description
com.google.cloud.bigquery.storage.v1.BaseBigQueryReadClient Service Description: BigQuery Read API.

The Read API can be used to read data from BigQuery.

com.google.cloud.bigquery.storage.v1.BigQueryReadClient Service Description: BigQuery Read API.

The Read API can be used to read data from BigQuery.

com.google.cloud.bigquery.storage.v1.BigQueryWriteClient Service Description: BigQuery Write API.

The Write API can be used to write data to BigQuery.
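
For instance, a client is usually obtained from its static create() method and closed when no longer needed. The sketch below is a minimal example and assumes Application Default Credentials are configured in the environment.

```java
import com.google.cloud.bigquery.storage.v1.BigQueryReadClient;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;

public class ClientQuickstart {
  public static void main(String[] args) throws Exception {
    // Both clients are AutoCloseable; create() picks up Application Default Credentials.
    try (BigQueryReadClient readClient = BigQueryReadClient.create();
        BigQueryWriteClient writeClient = BigQueryWriteClient.create()) {
      // Use readClient to create read sessions and read rows,
      // and writeClient to manage write streams and commit data.
    }
  }
}
```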

Settings Classes

Settings classes can be used to configure credentials, endpoints, and retry settings for a Client.

Settings Description
com.google.cloud.bigquery.storage.v1.BaseBigQueryReadSettings Settings class to configure an instance of BaseBigQueryReadClient.

The default instance has everything set to sensible defaults.

com.google.cloud.bigquery.storage.v1.BigQueryReadSettings Settings class to configure an instance of BigQueryReadClient.

The default instance has everything set to sensible defaults.

com.google.cloud.bigquery.storage.v1.BigQueryWriteSettings Settings class to configure an instance of BigQueryWriteClient.

The default instance has everything set to sensible defaults.

com.google.cloud.bigquery.storage.v1.ConnectionWorkerPool.Settings Settings for connection pool.
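
As a minimal sketch of customizing a settings class, the snippet below sets the endpoint (shown here with the service's default value) and a credentials provider, assuming Application Default Credentials are available, then builds a client from the resulting settings.

```java
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.storage.v1.BigQueryReadClient;
import com.google.cloud.bigquery.storage.v1.BigQueryReadSettings;

public class ReadSettingsExample {
  public static void main(String[] args) throws Exception {
    BigQueryReadSettings settings =
        BigQueryReadSettings.newBuilder()
            // Explicitly set the (default) endpoint and the credentials to use.
            .setEndpoint("bigquerystorage.googleapis.com:443")
            .setCredentialsProvider(
                FixedCredentialsProvider.create(GoogleCredentials.getApplicationDefault()))
            .build();

    try (BigQueryReadClient client = BigQueryReadClient.create(settings)) {
      // The client now uses the customized settings.
    }
  }
}
```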

Classes

Class Description
com.google.cloud.bigquery.storage.v1.AnnotationsProto
com.google.cloud.bigquery.storage.v1.AppendRowsRequest Request message for AppendRows. Because AppendRows is a bidirectional streaming RPC, certain parts of the request need only be specified for the first request before switching table destinations.
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.Builder Request message for AppendRows. Because AppendRows is a bidirectional streaming RPC, certain parts of the request need only be specified for the first request before switching table destinations.
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.ProtoData ProtoData contains the data rows and schema when constructing append requests.
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.ProtoData.Builder ProtoData contains the data rows and schema when constructing append requests.
com.google.cloud.bigquery.storage.v1.AppendRowsResponse Response message for AppendRows.
com.google.cloud.bigquery.storage.v1.AppendRowsResponse.AppendResult AppendResult is returned for successful append requests.
com.google.cloud.bigquery.storage.v1.AppendRowsResponse.AppendResult.Builder AppendResult is returned for successful append requests.
com.google.cloud.bigquery.storage.v1.AppendRowsResponse.Builder Response message for AppendRows.
com.google.cloud.bigquery.storage.v1.ArrowProto
com.google.cloud.bigquery.storage.v1.ArrowRecordBatch Arrow RecordBatch.
com.google.cloud.bigquery.storage.v1.ArrowRecordBatch.Builder Arrow RecordBatch.
com.google.cloud.bigquery.storage.v1.ArrowSchema Arrow schema as specified in https://arrow.apache.org/docs/python/api/datatypes.html and serialized to bytes using Arrow IPC serialization.
com.google.cloud.bigquery.storage.v1.ArrowSchema.Builder Arrow schema as specified in https://arrow.apache.org/docs/python/api/datatypes.html and serialized to bytes using Arrow IPC serialization.
com.google.cloud.bigquery.storage.v1.ArrowSerializationOptions Contains options specific to Arrow Serialization.
com.google.cloud.bigquery.storage.v1.ArrowSerializationOptions.Builder Contains options specific to Arrow Serialization.
com.google.cloud.bigquery.storage.v1.AvroProto
com.google.cloud.bigquery.storage.v1.AvroRows Avro rows.
com.google.cloud.bigquery.storage.v1.AvroRows.Builder Avro rows.
com.google.cloud.bigquery.storage.v1.AvroSchema Avro schema.
com.google.cloud.bigquery.storage.v1.AvroSchema.Builder Avro schema.
com.google.cloud.bigquery.storage.v1.AvroSerializationOptions Contains options specific to Avro Serialization.
com.google.cloud.bigquery.storage.v1.AvroSerializationOptions.Builder Contains options specific to Avro Serialization.
com.google.cloud.bigquery.storage.v1.BQTableSchemaToProtoDescriptor Converts a BigQuery table schema to a protobuf descriptor. All field names are converted to lowercase when constructing the protobuf descriptor. The mapping between field types and field modes is defined in ImmutableMaps within the class.
com.google.cloud.bigquery.storage.v1.BaseBigQueryReadSettings.Builder Builder for BaseBigQueryReadSettings.
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequest Request message for BatchCommitWriteStreams.
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequest.Builder Request message for BatchCommitWriteStreams.
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsResponse Response message for BatchCommitWriteStreams.
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsResponse.Builder Response message for BatchCommitWriteStreams.
com.google.cloud.bigquery.storage.v1.BigDecimalByteStringEncoder
com.google.cloud.bigquery.storage.v1.BigQueryReadGrpc BigQuery Read API. The Read API can be used to read data from BigQuery.
com.google.cloud.bigquery.storage.v1.BigQueryReadGrpc.BigQueryReadImplBase Base class for the server implementation of the service BigQueryRead. BigQuery Read API.
com.google.cloud.bigquery.storage.v1.BigQueryReadSettings.Builder Builder for BigQueryReadSettings.
com.google.cloud.bigquery.storage.v1.BigQuerySchemaUtil
com.google.cloud.bigquery.storage.v1.BigQueryWriteGrpc BigQuery Write API. The Write API can be used to write data to BigQuery. See the BigQuery Write API documentation for supplementary information.
com.google.cloud.bigquery.storage.v1.BigQueryWriteGrpc.BigQueryWriteImplBase Base class for the server implementation of the service BigQueryWrite. BigQuery Write API.
com.google.cloud.bigquery.storage.v1.BigQueryWriteSettings.Builder Builder for BigQueryWriteSettings.
com.google.cloud.bigquery.storage.v1.CivilTimeEncoder Encoder for TIME and DATETIME values, according to civil_time encoding. Ported from the ZetaSQL CivilTimeEncoder; the original code can be found at https://github.com/google/zetasql/blob/master/java/com/google/zetasql/CivilTimeEncoder.java
com.google.cloud.bigquery.storage.v1.ConnectionWorkerPool Pool of connections that accepts appends and distributes them across different connections.
com.google.cloud.bigquery.storage.v1.ConnectionWorkerPool.Settings.Builder Builder for the options that configure ConnectionWorkerPool.
com.google.cloud.bigquery.storage.v1.CreateReadSessionRequest Request message for CreateReadSession.
com.google.cloud.bigquery.storage.v1.CreateReadSessionRequest.Builder Request message for CreateReadSession.
com.google.cloud.bigquery.storage.v1.CreateWriteStreamRequest Request message for CreateWriteStream.
com.google.cloud.bigquery.storage.v1.CreateWriteStreamRequest.Builder Request message for CreateWriteStream.
com.google.cloud.bigquery.storage.v1.Exceptions Exceptions for Storage Client Libraries.
com.google.cloud.bigquery.storage.v1.Exceptions.AppendSerializationError This exception is thrown from SchemaAwareStreamWriter#append(Iterable) when client-side Proto serialization fails. It can also be thrown by the server when rows contain invalid data. The exception contains a Map from the indexes of the faulty rows to the corresponding error messages.
com.google.cloud.bigquery.storage.v1.Exceptions.AppendSerializtionError This class has a typo in its name and will be removed soon. Please use AppendSerializationError instead.
com.google.cloud.bigquery.storage.v1.Exceptions.FieldParseError This exception is used internally to handle field level parsing errors.
com.google.cloud.bigquery.storage.v1.Exceptions.OffsetAlreadyExists Offset already exists. This indicates that the append request attempted to write data to an offset before the current end of the stream. This is an expected exception when exactly-once delivery is enforced. You can safely ignore it and keep appending until there is new data to append.
com.google.cloud.bigquery.storage.v1.Exceptions.OffsetOutOfRange Offset out of range. This indicates that the append request attempted to write data to a point beyond the current end of the stream. To append data successfully, you must either specify the offset corresponding to the current end of the stream or omit the offset from the append request. It usually indicates a bug in your code that introduced a gap in appends.
com.google.cloud.bigquery.storage.v1.Exceptions.StreamNotFound The stream is not found. Possible causes include incorrectly specifying the stream identifier or attempting to use an old stream identifier that no longer exists. You can invoke CreateWriteStream to create a new stream.
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamRequest Request message for invoking FinalizeWriteStream.
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamRequest.Builder Request message for invoking FinalizeWriteStream.
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamResponse Response message for FinalizeWriteStream.
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamResponse.Builder Response message for FinalizeWriteStream.
com.google.cloud.bigquery.storage.v1.FlushRowsRequest Request message for FlushRows.
com.google.cloud.bigquery.storage.v1.FlushRowsRequest.Builder Request message for FlushRows.
com.google.cloud.bigquery.storage.v1.FlushRowsResponse Response message for FlushRows.
com.google.cloud.bigquery.storage.v1.FlushRowsResponse.Builder Response message for FlushRows.
com.google.cloud.bigquery.storage.v1.GetWriteStreamRequest Request message for GetWriteStream.
com.google.cloud.bigquery.storage.v1.GetWriteStreamRequest.Builder Request message for GetWriteStream.
com.google.cloud.bigquery.storage.v1.JsonStreamWriter A StreamWriter that can write JSON data (JSONObjects) to BigQuery tables. The JsonStreamWriter is built on top of a StreamWriter: it converts all JSON data to protobuf messages and then calls StreamWriter's append() method to write to BigQuery tables. It retains all StreamWriter functionality and additionally supports schema updates: if the BigQuery table schema is updated, the writer can ingest data using the new schema after a short delay (see the usage sketch after this class list).
com.google.cloud.bigquery.storage.v1.JsonStreamWriter.Builder
com.google.cloud.bigquery.storage.v1.JsonToProtoMessage Converts JSON data to Protobuf messages given the Protobuf descriptor and BigQuery table schema. The Protobuf descriptor must have all fields lowercased.
com.google.cloud.bigquery.storage.v1.ProjectName
com.google.cloud.bigquery.storage.v1.ProjectName.Builder Builder for projects/{project}.
com.google.cloud.bigquery.storage.v1.ProtoBufProto
com.google.cloud.bigquery.storage.v1.ProtoRows Protobuf type google.cloud.bigquery.storage.v1.ProtoRows
com.google.cloud.bigquery.storage.v1.ProtoRows.Builder Protobuf type google.cloud.bigquery.storage.v1.ProtoRows
com.google.cloud.bigquery.storage.v1.ProtoSchema ProtoSchema describes the schema of the serialized protocol buffer data rows.
com.google.cloud.bigquery.storage.v1.ProtoSchema.Builder ProtoSchema describes the schema of the serialized protocol buffer data rows.
com.google.cloud.bigquery.storage.v1.ProtoSchemaConverter
com.google.cloud.bigquery.storage.v1.ReadRowsRequest Request message for ReadRows.
com.google.cloud.bigquery.storage.v1.ReadRowsRequest.Builder Request message for ReadRows.
com.google.cloud.bigquery.storage.v1.ReadRowsResponse Response from calling ReadRows may include row data, progress and throttling information.
com.google.cloud.bigquery.storage.v1.ReadRowsResponse.Builder Response from calling ReadRows may include row data, progress and throttling information.
com.google.cloud.bigquery.storage.v1.ReadSession Information about the ReadSession.
com.google.cloud.bigquery.storage.v1.ReadSession.Builder Information about the ReadSession.
com.google.cloud.bigquery.storage.v1.ReadSession.TableModifiers Additional attributes when reading a table.
com.google.cloud.bigquery.storage.v1.ReadSession.TableModifiers.Builder Additional attributes when reading a table.
com.google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions Options dictating how we read a table.
com.google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions.Builder Options dictating how we read a table.
com.google.cloud.bigquery.storage.v1.ReadStream Information about a single stream that gets data out of the storage system. Most of the information about ReadStream instances is aggregated, making ReadStream lightweight.
com.google.cloud.bigquery.storage.v1.ReadStream.Builder Information about a single stream that gets data out of the storage system. Most of the information about ReadStream instances is aggregated, making ReadStream lightweight.
com.google.cloud.bigquery.storage.v1.ReadStreamName
com.google.cloud.bigquery.storage.v1.ReadStreamName.Builder Builder for projects/{project}/locations/{location}/sessions/{session}/streams/{stream}.
com.google.cloud.bigquery.storage.v1.RequestProfiler A profiler that periodically generates a report of the slowest requests in the past period, including their latency. This is used for debugging only. A request id is generated for each request at runtime, and certain parts of the code are wrapped with startOperation(...) and endOperation(...) to measure the latency of operations within an individual request.
com.google.cloud.bigquery.storage.v1.RowError The message that presents row level error info in a request.
com.google.cloud.bigquery.storage.v1.RowError.Builder The message that presents row level error info in a request.
com.google.cloud.bigquery.storage.v1.SchemaAwareStreamWriter A StreamWriter that can write data to BigQuery tables. The SchemaAwareStreamWriter is built on top of a StreamWriter: it converts all data to protobuf messages using the provided converter and then calls StreamWriter's append() method to write to BigQuery tables. It retains all StreamWriter functionality and additionally supports schema updates: if the BigQuery table schema is updated, the writer can ingest data using the new schema after a short delay.
com.google.cloud.bigquery.storage.v1.SchemaAwareStreamWriter.Builder
com.google.cloud.bigquery.storage.v1.SplitReadStreamRequest Request message for SplitReadStream.
com.google.cloud.bigquery.storage.v1.SplitReadStreamRequest.Builder Request message for SplitReadStream.
com.google.cloud.bigquery.storage.v1.SplitReadStreamResponse Response message for SplitReadStream.
com.google.cloud.bigquery.storage.v1.SplitReadStreamResponse.Builder Response message for SplitReadStream.
com.google.cloud.bigquery.storage.v1.StorageError Structured custom BigQuery Storage error message. The error can be attached as error details in the returned rpc Status. In particular, the use of error codes allows for more structured error handling and reduces the need to evaluate unstructured error text strings.
com.google.cloud.bigquery.storage.v1.StorageError.Builder Structured custom BigQuery Storage error message. The error can be attached as error details in the returned rpc Status. In particular, the use of error codes allows for more structured error handling and reduces the need to evaluate unstructured error text strings.
com.google.cloud.bigquery.storage.v1.StorageProto
com.google.cloud.bigquery.storage.v1.StreamProto
com.google.cloud.bigquery.storage.v1.StreamStats Estimated stream statistics for a given read Stream.
com.google.cloud.bigquery.storage.v1.StreamStats.Builder Estimated stream statistics for a given read Stream.
com.google.cloud.bigquery.storage.v1.StreamStats.Progress Protobuf type google.cloud.bigquery.storage.v1.StreamStats.Progress
com.google.cloud.bigquery.storage.v1.StreamStats.Progress.Builder Protobuf type google.cloud.bigquery.storage.v1.StreamStats.Progress
com.google.cloud.bigquery.storage.v1.StreamWriter A BigQuery stream writer that can be used to write data into a BigQuery table.

TODO: Support batching.

com.google.cloud.bigquery.storage.v1.StreamWriter.Builder A builder of StreamWriters.
com.google.cloud.bigquery.storage.v1.TableFieldSchema TableFieldSchema defines a single field/column within a table schema.
com.google.cloud.bigquery.storage.v1.TableFieldSchema.Builder TableFieldSchema defines a single field/column within a table schema.
com.google.cloud.bigquery.storage.v1.TableFieldSchema.FieldElementType Represents the type of a field element.
com.google.cloud.bigquery.storage.v1.TableFieldSchema.FieldElementType.Builder Represents the type of a field element.
com.google.cloud.bigquery.storage.v1.TableName
com.google.cloud.bigquery.storage.v1.TableName.Builder Builder for projects/{project}/datasets/{dataset}/tables/{table}.
com.google.cloud.bigquery.storage.v1.TableProto
com.google.cloud.bigquery.storage.v1.TableSchema Schema of a table. This schema is a subset of google.cloud.bigquery.v2.TableSchema containing the information necessary to generate a valid message to write to BigQuery.
com.google.cloud.bigquery.storage.v1.TableSchema.Builder Schema of a table. This schema is a subset of google.cloud.bigquery.v2.TableSchema containing the information necessary to generate a valid message to write to BigQuery.
com.google.cloud.bigquery.storage.v1.ThrottleState Information on whether the current connection is being throttled.
com.google.cloud.bigquery.storage.v1.ThrottleState.Builder Information on whether the current connection is being throttled.
com.google.cloud.bigquery.storage.v1.WriteStream Information about a single stream that gets data into the storage system.
com.google.cloud.bigquery.storage.v1.WriteStream.Builder Information about a single stream that gets data into the storage system.
com.google.cloud.bigquery.storage.v1.WriteStreamName
com.google.cloud.bigquery.storage.v1.WriteStreamName.Builder Builder for projects/{project}/datasets/{dataset}/tables/{table}/streams/{stream}.
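
As referenced in the JsonStreamWriter entry above, the following is a minimal sketch of appending JSON rows to a table's default write stream. The project, dataset, table, and column names are placeholders, and the table is assumed to already exist with a matching schema.

```java
import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.storage.v1.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1.TableName;
import org.json.JSONArray;
import org.json.JSONObject;

public class JsonWriterExample {
  public static void main(String[] args) throws Exception {
    // Placeholder identifiers; replace with your own project, dataset, and table.
    TableName table = TableName.of("my-project", "my_dataset", "my_table");

    try (BigQueryWriteClient client = BigQueryWriteClient.create();
        // The writer fetches the table schema and writes to the table's default stream.
        JsonStreamWriter writer = JsonStreamWriter.newBuilder(table.toString(), client).build()) {

      JSONArray rows = new JSONArray();
      rows.put(new JSONObject().put("column_a", "value")); // "column_a" is a placeholder column.

      ApiFuture<AppendRowsResponse> future = writer.append(rows);
      future.get(); // Blocks until the append is acknowledged by the service.
      System.out.println("Append acknowledged.");
    }
  }
}
```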

Interfaces

Interface Description
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.ProtoDataOrBuilder
com.google.cloud.bigquery.storage.v1.AppendRowsRequestOrBuilder
com.google.cloud.bigquery.storage.v1.AppendRowsResponse.AppendResultOrBuilder
com.google.cloud.bigquery.storage.v1.AppendRowsResponseOrBuilder
com.google.cloud.bigquery.storage.v1.ArrowRecordBatchOrBuilder
com.google.cloud.bigquery.storage.v1.ArrowSchemaOrBuilder
com.google.cloud.bigquery.storage.v1.ArrowSerializationOptionsOrBuilder
com.google.cloud.bigquery.storage.v1.AvroRowsOrBuilder
com.google.cloud.bigquery.storage.v1.AvroSchemaOrBuilder
com.google.cloud.bigquery.storage.v1.AvroSerializationOptionsOrBuilder
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequestOrBuilder
com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsResponseOrBuilder
com.google.cloud.bigquery.storage.v1.BigQueryReadGrpc.AsyncService BigQuery Read API. The Read API can be used to read data from BigQuery.
com.google.cloud.bigquery.storage.v1.BigQueryReadSettings.RetryAttemptListener
com.google.cloud.bigquery.storage.v1.BigQueryWriteGrpc.AsyncService BigQuery Write API. The Write API can be used to write data to BigQuery. See the BigQuery Write API documentation for supplementary information.
com.google.cloud.bigquery.storage.v1.CreateReadSessionRequestOrBuilder
com.google.cloud.bigquery.storage.v1.CreateWriteStreamRequestOrBuilder
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamRequestOrBuilder
com.google.cloud.bigquery.storage.v1.FinalizeWriteStreamResponseOrBuilder
com.google.cloud.bigquery.storage.v1.FlushRowsRequestOrBuilder
com.google.cloud.bigquery.storage.v1.FlushRowsResponseOrBuilder
com.google.cloud.bigquery.storage.v1.GetWriteStreamRequestOrBuilder
com.google.cloud.bigquery.storage.v1.ProtoRowsOrBuilder
com.google.cloud.bigquery.storage.v1.ProtoSchemaOrBuilder
com.google.cloud.bigquery.storage.v1.ReadRowsRequestOrBuilder
com.google.cloud.bigquery.storage.v1.ReadRowsResponseOrBuilder
com.google.cloud.bigquery.storage.v1.ReadSession.TableModifiersOrBuilder
com.google.cloud.bigquery.storage.v1.ReadSession.TableReadOptionsOrBuilder
com.google.cloud.bigquery.storage.v1.ReadSessionOrBuilder
com.google.cloud.bigquery.storage.v1.ReadStreamOrBuilder
com.google.cloud.bigquery.storage.v1.RowErrorOrBuilder
com.google.cloud.bigquery.storage.v1.SplitReadStreamRequestOrBuilder
com.google.cloud.bigquery.storage.v1.SplitReadStreamResponseOrBuilder
com.google.cloud.bigquery.storage.v1.StorageErrorOrBuilder
com.google.cloud.bigquery.storage.v1.StreamStats.ProgressOrBuilder
com.google.cloud.bigquery.storage.v1.StreamStatsOrBuilder
com.google.cloud.bigquery.storage.v1.TableFieldSchema.FieldElementTypeOrBuilder
com.google.cloud.bigquery.storage.v1.TableFieldSchemaOrBuilder
com.google.cloud.bigquery.storage.v1.TableSchemaOrBuilder
com.google.cloud.bigquery.storage.v1.ThrottleStateOrBuilder
com.google.cloud.bigquery.storage.v1.ToProtoConverter
com.google.cloud.bigquery.storage.v1.WriteStreamOrBuilder

Enums

Enum Description
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.MissingValueInterpretation An enum to indicate how to interpret missing values of fields that are present in user schema but missing in rows. A missing value can represent a NULL or a column default value defined in BigQuery table schema.
com.google.cloud.bigquery.storage.v1.AppendRowsRequest.RowsCase
com.google.cloud.bigquery.storage.v1.AppendRowsResponse.ResponseCase
com.google.cloud.bigquery.storage.v1.ArrowSerializationOptions.CompressionCodec Compression codecs supported by Arrow.
com.google.cloud.bigquery.storage.v1.DataFormat Data format for input or output data.
com.google.cloud.bigquery.storage.v1.ReadRowsResponse.RowsCase
com.google.cloud.bigquery.storage.v1.ReadRowsResponse.SchemaCase
com.google.cloud.bigquery.storage.v1.ReadSession.SchemaCase
com.google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions.OutputFormatSerializationOptionsCase
com.google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions.ResponseCompressionCodec Specifies which compression codec to attempt on the entire serialized response payload (either Arrow record batch or Avro rows). This is not to be confused with the Apache Arrow native compression codecs specified in ArrowSerializationOptions.
com.google.cloud.bigquery.storage.v1.RowError.RowErrorCode Error code for RowError.
com.google.cloud.bigquery.storage.v1.StorageError.StorageErrorCode Error code for StorageError.
com.google.cloud.bigquery.storage.v1.TableFieldSchema.Mode Protobuf enum google.cloud.bigquery.storage.v1.TableFieldSchema.Mode
com.google.cloud.bigquery.storage.v1.TableFieldSchema.Type Protobuf enum google.cloud.bigquery.storage.v1.TableFieldSchema.Type
com.google.cloud.bigquery.storage.v1.WriteStream.Type Type enum of the stream.
com.google.cloud.bigquery.storage.v1.WriteStream.WriteMode Mode enum of the stream.
com.google.cloud.bigquery.storage.v1.WriteStreamView WriteStreamView is a view enum that controls what details about a write stream should be returned.
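
These enums are set on request or resource messages. The snippet below is a minimal sketch that creates a read session with DataFormat.ARROW; the project, dataset, and table names are placeholders.

```java
import com.google.cloud.bigquery.storage.v1.BigQueryReadClient;
import com.google.cloud.bigquery.storage.v1.CreateReadSessionRequest;
import com.google.cloud.bigquery.storage.v1.DataFormat;
import com.google.cloud.bigquery.storage.v1.ProjectName;
import com.google.cloud.bigquery.storage.v1.ReadSession;
import com.google.cloud.bigquery.storage.v1.TableName;

public class ReadSessionExample {
  public static void main(String[] args) throws Exception {
    // Placeholder identifiers; replace with your own project, dataset, and table.
    String parent = ProjectName.of("my-project").toString();
    String table = TableName.of("my-project", "my_dataset", "my_table").toString();

    try (BigQueryReadClient client = BigQueryReadClient.create()) {
      ReadSession session =
          client.createReadSession(
              CreateReadSessionRequest.newBuilder()
                  .setParent(parent)
                  .setReadSession(
                      ReadSession.newBuilder()
                          .setTable(table)
                          .setDataFormat(DataFormat.ARROW) // or DataFormat.AVRO
                          .build())
                  .setMaxStreamCount(1)
                  .build());
      System.out.println("Created session: " + session.getName());
    }
  }
}
```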

Exceptions

Exception Description
com.google.cloud.bigquery.storage.v1.Exceptions.DataHasUnknownFieldException The input data object has a field that is unknown to the schema of the SchemaAwareStreamWriter. You can either turn on the IgnoreUnknownFields option on the SchemaAwareStreamWriter, or, if you don't want the error to be ignored, recreate the SchemaAwareStreamWriter with the updated table schema.
com.google.cloud.bigquery.storage.v1.Exceptions.InflightBytesLimitExceededException
com.google.cloud.bigquery.storage.v1.Exceptions.InflightLimitExceededException Thrown when FlowController.LimitExceededBehavior is set to Block and the inflight limit is exceeded. If it is just a spike, you may retry the request. Otherwise, you can increase the inflight limit or create more StreamWriter instances to handle your traffic.
com.google.cloud.bigquery.storage.v1.Exceptions.InflightRequestsLimitExceededException
com.google.cloud.bigquery.storage.v1.Exceptions.JsonDataHasUnknownFieldException This class is replaced by a generic one. It will be removed soon. Please use DataHasUnknownFieldException
com.google.cloud.bigquery.storage.v1.Exceptions.MaximumRequestCallbackWaitTimeExceededException The connection was shut down because a callback was not received within the maximum wait time.
com.google.cloud.bigquery.storage.v1.Exceptions.SchemaMismatchedException There was a schema mismatch because the BigQuery table has fewer fields than the input message. This can be resolved by updating the table's schema with the message schema.
com.google.cloud.bigquery.storage.v1.Exceptions.StorageException Main storage exception. It may contain a map from streams to the errors for each stream.
com.google.cloud.bigquery.storage.v1.Exceptions.StreamFinalizedException The write stream has already been finalized and will not accept further appends or flushes. To send additional requests, you will need to create a new write stream via CreateWriteStream.
com.google.cloud.bigquery.storage.v1.Exceptions.StreamWriterClosedException This writer instance has either been closed by the user explicitly, or has encountered non-retriable errors.

To continue to write to the same stream, you will need to create a new writer instance.
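
The following is a minimal sketch of handling these exceptions around an append call. The writer is assumed to be an already-built JsonStreamWriter (as in the earlier sketch); note that whether a given failure is thrown synchronously or surfaces through the returned future may depend on the error.

```java
import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.storage.v1.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1.Exceptions;
import com.google.cloud.bigquery.storage.v1.JsonStreamWriter;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import org.json.JSONArray;

public class AppendErrorHandling {
  // Appends one batch and reports rows that failed client-side serialization,
  // as well as storage errors surfaced through the append future.
  static void appendBatch(JsonStreamWriter writer, JSONArray rows) throws Exception {
    try {
      ApiFuture<AppendRowsResponse> future = writer.append(rows);
      future.get(); // Server-side failures surface here as an ExecutionException.
    } catch (Exceptions.AppendSerializationError e) {
      // Client-side serialization failed: map of row index -> error message.
      for (Map.Entry<Integer, String> entry : e.getRowIndexToErrorMessage().entrySet()) {
        System.err.println("Row " + entry.getKey() + " failed: " + entry.getValue());
      }
    } catch (ExecutionException e) {
      if (e.getCause() instanceof Exceptions.StorageException) {
        // Includes subclasses such as StreamFinalizedException and OffsetOutOfRange.
        System.err.println("Append failed: " + e.getCause().getMessage());
      } else {
        throw e;
      }
    }
  }
}
```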