Package google.cloud.bigquery.storage.v1beta1

BigQueryStorage

BigQuery storage API.

The BigQuery storage API can be used to read data stored in BigQuery.

BatchCreateReadSessionStreams

rpc BatchCreateReadSessionStreams(BatchCreateReadSessionStreamsRequest) returns (BatchCreateReadSessionStreamsResponse)

Creates additional streams for a ReadSession. This API can be used to dynamically adjust the parallelism of a batch processing task upwards by adding additional workers.
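
For illustration, a minimal sketch of adding streams with the v1beta1 Python client library (google-cloud-bigquery-storage); the session name and stream count below are placeholders:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    # A ReadSession previously returned by CreateReadSession; only the name
    # field needs to be set for this call.
    session = bigquery_storage_v1beta1.types.ReadSession(
        name="projects/my-project/locations/us/sessions/my-session-id"
    )

    response = client.batch_create_read_session_streams(
        session, requested_streams=2
    )

    # The service may add fewer streams than requested.
    for stream in response.streams:
        print(stream.name)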

CreateReadSession

rpc CreateReadSession(CreateReadSessionRequest) returns (ReadSession)

Creates a new read session. A read session divides the contents of a BigQuery table into one or more streams, which can then be used to read data from the table. The read session also specifies properties of the data to be read, such as a list of columns or a push-down filter describing the rows to be returned.

A particular row can be read by at most one stream. When the caller has reached the end of each stream in the session, then all the data in the table has been read.

Read sessions automatically expire 24 hours after they are created and do not require manual clean-up by the caller.
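
For example, a session might be created with the v1beta1 Python client library (google-cloud-bigquery-storage); this is a sketch, and the project, dataset, and table IDs are placeholders:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    # Table to read.
    table_ref = bigquery_storage_v1beta1.types.TableReference(
        project_id="my-project",
        dataset_id="my_dataset",
        table_id="my_table",
    )

    # The parent project is billed for the read.
    session = client.create_read_session(
        table_ref,
        "projects/my-billing-project",
        requested_streams=2,
    )

    print(session.name)          # projects/.../sessions/...
    print(len(session.streams))  # may be fewer than requested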

FinalizeStream

rpc FinalizeStream(FinalizeStreamRequest) returns (Empty)

Triggers the graceful termination of a single stream in a ReadSession. This API can be used to dynamically adjust the parallelism of a batch processing task downwards without losing data.

This API does not delete the stream -- it remains visible in the ReadSession, and any data processed by the stream is not released to other streams. However, no additional data will be assigned to the stream once this call completes. Callers must continue reading data on the stream until the end of the stream is reached so that data which has already been assigned to the stream will be processed.

This method will return an error if there are no other live streams in the Session, or if SplitReadStream() has been called on the given Stream.
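
A sketch of finalizing a stream with the v1beta1 Python client library; the stream name is a placeholder:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    # Stream previously obtained from a ReadSession.
    stream = bigquery_storage_v1beta1.types.Stream(
        name="projects/my-project/locations/us/streams/my-stream-id"
    )

    # After this returns, no new data is assigned to the stream, but data
    # already assigned must still be read until the end of the stream.
    client.finalize_stream(stream)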

ReadRows

rpc ReadRows(ReadRowsRequest) returns (ReadRowsResponse)

Reads rows from the table in the format prescribed by the read session. Each response contains one or more table rows, up to a maximum of 10 MiB per response; read requests which attempt to read individual rows larger than this will fail.

Each request also returns a set of stream statistics reflecting the estimated total number of rows in the read stream. This number is computed based on the total table size and the number of active streams in the read session, and may change as other streams continue to read data.
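
A sketch of reading rows from the first stream of a new session with the v1beta1 Python client library; the table and project names are placeholders:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    table_ref = bigquery_storage_v1beta1.types.TableReference(
        project_id="my-project", dataset_id="my_dataset", table_id="my_table"
    )
    session = client.create_read_session(table_ref, "projects/my-billing-project")

    # Streams must be read starting from offset 0.
    read_position = bigquery_storage_v1beta1.types.StreamPosition(
        stream=session.streams[0], offset=0
    )

    # read_rows yields ReadRowsResponse messages, each carrying a block of
    # Avro-serialized rows plus progress (status) and throttling information.
    for response in client.read_rows(read_position):
        print(response.avro_rows.row_count, response.status.estimated_row_count)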

SplitReadStream

rpc SplitReadStream(SplitReadStreamRequest) returns (SplitReadStreamResponse)

Splits a given read stream into two Streams. These streams are referred to as the primary and the residual of the split. The original stream can still be read from in the same manner as before. Both of the returned streams can also be read from, and the total rows returned by both child streams will be the same as the rows read from the original stream.

Moreover, the two child streams will be allocated back to back in the original Stream. Concretely, it is guaranteed that for streams Original, Primary, and Residual, that Original[0-j] = Primary[0-j] and Original[j-n] = Residual[0-m] once the streams have been read to completion.

This method is guaranteed to be idempotent.
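
A sketch of splitting a stream with the v1beta1 Python client library; the stream name is a placeholder:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    original = bigquery_storage_v1beta1.types.Stream(
        name="projects/my-project/locations/us/streams/my-stream-id"
    )

    response = client.split_read_stream(original)

    # primary_stream covers the beginning portion of the original stream,
    # remainder_stream covers the tail.
    print(response.primary_stream.name)
    print(response.remainder_stream.name)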

AvroRows

Avro rows.

Fields
serialized_binary_rows

bytes

Binary serialized rows in a block.

row_count

int64

The count of rows in the returned block.

AvroSchema

Avro schema.

Fields
schema

string

JSON serialized schema, as described at https://avro.apache.org/docs/1.8.1/spec.html
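
Because the schema is plain JSON and each AvroRows block holds schemaless Avro records, a block can be decoded with a generic Avro library. A sketch assuming the fastavro package, with session and response obtained from create_read_session and read_rows calls as above:

    import io
    import json

    import fastavro

    # Parse the session's JSON schema once.
    avro_schema = fastavro.parse_schema(json.loads(session.avro_schema.schema))

    # Decode each record in one AvroRows block.
    block = io.BytesIO(response.avro_rows.serialized_binary_rows)
    for _ in range(response.avro_rows.row_count):
        record = fastavro.schemaless_reader(block, avro_schema)
        print(record)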

BatchCreateReadSessionStreamsRequest

Information needed to request additional streams for an established read session.

Fields
session

ReadSession

Required. Must be a non-expired session obtained from a call to CreateReadSession. Only the name field needs to be set.

requested_streams

int32

Required. Number of new streams requested. Must be positive. Number of added streams may be less than this, see CreateReadSessionRequest for more information.

BatchCreateReadSessionStreamsResponse

The response from BatchCreateReadSessionStreams returns the stream identifiers for the newly created streams.

Fields
streams[]

Stream

Newly added streams.

CreateReadSessionRequest

Request message for creating a new read session, which may include additional options such as requested parallelism, projection filters, and constraints.

Fields
table_reference

TableReference

Required. Reference to the table to read.

Authorization requires the following Google IAM permission on the specified resource tableReference:

  • bigquery.tables.getData

parent

string

Required. String of the form projects/{project_id} indicating the project this ReadSession is associated with. This is the project that will be billed for usage.

Authorization requires the following Google IAM permission on the specified resource parent:

  • bigquery.readsessions.create

table_modifiers

TableModifiers

Optional. Any modifiers to the Table (e.g. snapshot timestamp).

requested_streams

int32

Optional. Initial number of streams. If unset or 0, the server will provide a number of streams so as to produce reasonable throughput. Must be non-negative. The number of streams may be lower than the requested number, depending on the amount of parallelism that is reasonable for the table and the maximum amount of parallelism allowed by the system.

Streams must be read starting from offset 0.

read_options

TableReadOptions

Optional. Read options for this session (e.g. column selection, filters).

format

DataFormat

Data output format. Currently defaults to Avro.

DataFormat

Data format for input or output data.

Enums
DATA_FORMAT_UNSPECIFIED Data format is unspecified.
AVRO Avro is a standard open source row based file format. See https://avro.apache.org/ for more details.

FinalizeStreamRequest

Request information for invoking FinalizeStream.

Fields
stream

Stream

Stream to finalize.

ReadRowsRequest

A request to read row data via ReadRows must provide Stream position information.

Fields
read_position

StreamPosition

Required. Identifier of the position in the stream to start reading from. The offset requested must be less than the last row read from ReadRows. Requesting a larger offset is undefined.

ReadRowsResponse

Response from calling ReadRows may include row data, progress and throttling information.

Fields
status

StreamStatus

Estimated stream statistics.

throttle_status

ThrottleStatus

Throttling status. If unset, the latest response still describes the current throttling status.

avro_rows

AvroRows

Serialized row data in AVRO format.

ReadSession

Information returned from a CreateReadSession request.

Fields
name

string

Unique identifier for the session, in the form projects/{project_id}/locations/{location}/sessions/{session_id}.

expire_time

Timestamp

Time at which the session becomes invalid. After this time, subsequent requests to read this Session will return errors.

streams[]

Stream

Streams associated with this session.

table_reference

TableReference

Table that this ReadSession is reading from.

table_modifiers

TableModifiers

Any modifiers which are applied when reading from the specified table.

avro_schema

AvroSchema

Avro schema.

SplitReadStreamRequest

Request information for SplitReadStream.

Fields
original_stream

Stream

Stream to split.

SplitReadStreamResponse

Response from SplitReadStream.

Fields
primary_stream

Stream

Primary stream. Will contain the beginning portion of |original_stream|.

remainder_stream

Stream

Remainder stream. Will contain the tail of |original_stream|.

Stream

Information about a single data stream within a read session.

Fields
name

string

Name of the stream, in the form projects/{project_id}/locations/{location}/streams/{stream_id}.

row_count

int64

Rows in the stream.

StreamPosition

Expresses a point within a given stream using an offset position.

Fields
stream

Stream

Identifier for a given Stream.

offset

int64

Position in the stream.

StreamStatus

Progress information for a given Stream.

Fields
estimated_row_count

int64

Number of estimated rows in the current stream. May change over time as different readers in the stream progress at rates which are relatively fast or slow.

TableModifiers

All fields in this message are optional.

Fields
snapshot_time

Timestamp

The snapshot time of the table. If not set, interpreted as now.

TableReadOptions

Options dictating how we read a table.

Fields
selected_fields[]

string

Optional. Names of the fields in the table that should be read. If empty, all fields will be read. If the specified field is a nested field, all the sub-fields in the field will be selected. The output field order is unrelated to the order of fields in selected_fields.

row_restriction

string

Optional. SQL text filtering statement, similar to a WHERE clause in a query. Currently, only a single predicate that is a comparison between a column and a constant value is supported. Aggregates are not supported.

Examples:

  • "int_field > 5"
  • "date_field = CAST('2014-9-27' as DATE)"
  • "nullable_field is not NULL"
  • "st_equals(geo_field, st_geofromtext("POINT(2, 2)"))"
  • "numeric_field BETWEEN 1.0 AND 5.0"
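
A sketch of passing these options when creating a session with the v1beta1 Python client library; the field, table, and project names are placeholders:

    from google.cloud import bigquery_storage_v1beta1

    client = bigquery_storage_v1beta1.BigQueryStorageClient()

    table_ref = bigquery_storage_v1beta1.types.TableReference(
        project_id="my-project", dataset_id="my_dataset", table_id="my_table"
    )

    # Read two columns and filter rows server-side with a single predicate.
    read_options = bigquery_storage_v1beta1.types.TableReadOptions(
        selected_fields=["int_field", "date_field"],
        row_restriction="int_field > 5",
    )

    session = client.create_read_session(
        table_ref, "projects/my-billing-project", read_options=read_options
    )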

TableReference

Table reference that includes just the 3 strings needed to identify a table.

Fields
project_id

string

The assigned project ID of the project.

dataset_id

string

The ID of the dataset in the above project.

table_id

string

The ID of the table in the above dataset.

ThrottleStatus

Information on whether the current connection is being throttled.

Fields
throttle_percent

int32

How much this connection is being throttled. 0 is no throttling, 100 is completely throttled.
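
As an illustration, a reader might surface this value while iterating ReadRowsResponse messages; response here is assumed to come from the read_rows sketch above:

    # Report throttling on the current connection, if any.
    percent = response.throttle_status.throttle_percent
    if percent > 0:
        print("Connection throttled at {}%".format(percent))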
