`BigQuerySource(mapping=None, *, ignore_unknown_fields=False, **kwargs)`
BigQuery source to import data from.
.. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
Attributes

| Name | Description |
| --- | --- |
| `partition_date` (`google.type.date_pb2.Date`) | BigQuery time-partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format. This field is a member of `oneof`_ `partition`. |
| `project_id` (`str`) | The project ID (can be project number or ID) that the BigQuery source is in, with a length limit of 128 characters. If not specified, inherits the project ID from the parent request. |
| `dataset_id` (`str`) | Required. The BigQuery dataset to copy the data from, with a length limit of 1,024 characters. |
| `table_id` (`str`) | Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters. |
| `gcs_staging_dir` (`str`) | Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if one wants the BigQuery export to go to a specific Cloud Storage directory. |
| `data_schema` (`str`) | The schema to use when parsing the data from the source. Supported values for user event imports: `user_event` (default), one `UserEvent` per row. Supported values for document imports: `document` (default), one `Document` per row, where each document must have a valid `Document.id` and one of `Document.json_data` or `Document.struct_data`; `custom`, one custom data record per row in an arbitrary format that conforms to the defined `Schema` of the data store (can only be used by the `GENERIC` data store vertical). |
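For orientation, here is a minimal usage sketch, not an official example. It assumes this class is the one exposed by the `google-cloud-discoveryengine` Python client as `discoveryengine_v1.BigQuerySource`; all project, dataset, table, bucket, and branch names below are hypothetical placeholders.

```python
# Minimal sketch, assuming the google-cloud-discoveryengine client
# library; every resource name below is a placeholder.
from google.cloud import discoveryengine_v1
from google.type import date_pb2

# Describe the BigQuery table to import documents from.
bq_source = discoveryengine_v1.BigQuerySource(
    project_id="my-project",      # optional; inherits from the parent request if omitted
    dataset_id="my_dataset",      # required
    table_id="my_table",          # required
    gcs_staging_dir="gs://my-bucket/staging",  # optional intermediate export directory
    data_schema="document",       # one Document per row (default for document imports)
)

# partition_date is a member of the ``partition`` oneof, so setting it
# targets a single _PARTITIONDATE slice of a time-partitioned table.
partitioned_source = discoveryengine_v1.BigQuerySource(
    dataset_id="my_dataset",
    table_id="my_partitioned_table",
    partition_date=date_pb2.Date(year=2024, month=5, day=1),
)

# The message only describes the source; it takes effect when attached
# to an import request, e.g. as the bigquery_source of an
# ImportDocumentsRequest (branch path shown here is a placeholder).
request = discoveryengine_v1.ImportDocumentsRequest(
    parent=(
        "projects/my-project/locations/global/collections/"
        "default_collection/dataStores/my-store/branches/0"
    ),
    bigquery_source=bq_source,
)
```

Because `partition_date` belongs to the `partition` oneof, members of that oneof are mutually exclusive, as the `oneof`_ link above describes.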