Reference documentation and code samples for the Retail V2 API class Google::Cloud::Retail::V2::BigQuerySource.
BigQuery source to import data from.
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#data_schema
def data_schema() -> ::String
-
(::String) —
The schema to use when parsing the data from the source.
Supported values for product imports:
- product (default): One JSON Product per line. Each product must have a valid Product.id.
- product_merchant_center: See Importing catalog data from Merchant Center.
Supported values for user event imports:
- user_event (default): One JSON UserEvent per line.
- user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
- user_event_ga4: This feature is in private preview. Please contact the support team for importing Google Analytics 4 events. The schema is available here: https://support.google.com/analytics/answer/7029846.
Supported values for auto-completion imports:
- suggestions (default): One JSON completion suggestion per line.
- denylist: One JSON deny suggestion per line.
- allowlist: One JSON allow suggestion per line.
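For the default product schema, each row of the source table holds one JSON Product per line, and each product must carry a Product.id. A minimal sketch of building such a line with Ruby's stdlib; every field besides id is illustrative, not a required part of the schema:

```ruby
require "json"

# One JSON Product per line, as expected by the "product" data schema.
# "id" is the required Product.id; "title" and "priceInfo" are
# illustrative fields, not an exhaustive Product definition.
product = {
  "id" => "sku-12345",
  "title" => "Example sneaker",
  "priceInfo" => { "currencyCode" => "USD", "price" => 59.99 }
}

line = JSON.generate(product)
puts line
```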
#data_schema=
def data_schema=(value) -> ::String
-
value (::String) —
The schema to use when parsing the data from the source.
Supported values for product imports:
- product (default): One JSON Product per line. Each product must have a valid Product.id.
- product_merchant_center: See Importing catalog data from Merchant Center.
Supported values for user event imports:
- user_event (default): One JSON UserEvent per line.
- user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
- user_event_ga4: This feature is in private preview. Please contact the support team for importing Google Analytics 4 events. The schema is available here: https://support.google.com/analytics/answer/7029846.
Supported values for auto-completion imports:
- suggestions (default): One JSON completion suggestion per line.
- denylist: One JSON deny suggestion per line.
- allowlist: One JSON allow suggestion per line.
-
(::String) —
The schema to use when parsing the data from the source.
Supported values for product imports:
- product (default): One JSON Product per line. Each product must have a valid Product.id.
- product_merchant_center: See Importing catalog data from Merchant Center.
Supported values for user event imports:
- user_event (default): One JSON UserEvent per line.
- user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
- user_event_ga4: This feature is in private preview. Please contact the support team for importing Google Analytics 4 events. The schema is available here: https://support.google.com/analytics/answer/7029846.
Supported values for auto-completion imports:
- suggestions (default): One JSON completion suggestion per line.
- denylist: One JSON deny suggestion per line.
- allowlist: One JSON allow suggestion per line.
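Which values are accepted depends on the kind of import. A hypothetical pre-flight helper that mirrors the lists above; the constant and method names are assumptions for illustration, not part of the Retail gem:

```ruby
# Supported data_schema values per import type, taken from the lists above.
SUPPORTED_SCHEMAS = {
  product: %w[product product_merchant_center],
  user_event: %w[user_event user_event_ga360 user_event_ga4],
  completion: %w[suggestions denylist allowlist]
}.freeze

# Hypothetical helper: raises unless the schema is valid for the import type.
def validate_data_schema!(import_type, schema)
  allowed = SUPPORTED_SCHEMAS.fetch(import_type) do
    raise ArgumentError, "unknown import type: #{import_type}"
  end
  unless allowed.include?(schema)
    raise ArgumentError, "#{schema} is not valid for #{import_type} imports"
  end
  schema
end

validate_data_schema!(:product, "product")           # passes
validate_data_schema!(:user_event, "user_event_ga4") # passes
```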
#dataset_id
def dataset_id() -> ::String
- (::String) — Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
#dataset_id=
def dataset_id=(value) -> ::String
- value (::String) — Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
- (::String) — Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
#gcs_staging_dir
def gcs_staging_dir() -> ::String
- (::String) — Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
#gcs_staging_dir=
def gcs_staging_dir=(value) -> ::String
- value (::String) — Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
- (::String) — Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
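A hypothetical sanity check for this field, assuming the usual gs:// Cloud Storage URI convention and the 2,000-character limit stated above; the helper name is made up for illustration:

```ruby
# Hypothetical check: a staging directory should be a Cloud Storage URI
# and, per the documented limit, at most 2,000 characters long.
def valid_gcs_staging_dir?(dir)
  dir.start_with?("gs://") && dir.length <= 2_000
end

puts valid_gcs_staging_dir?("gs://my-bucket/retail-staging") # true
```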
#partition_date
def partition_date() -> ::Google::Type::Date
- (::Google::Type::Date) — BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported when ImportProductsRequest.reconciliation_mode is set to FULL.
#partition_date=
def partition_date=(value) -> ::Google::Type::Date
- value (::Google::Type::Date) — BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported when ImportProductsRequest.reconciliation_mode is set to FULL.
- (::Google::Type::Date) — BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported when ImportProductsRequest.reconciliation_mode is set to FULL.
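_PARTITIONDATE identifies a calendar day, so it maps naturally onto Ruby's stdlib Date. A sketch of producing the YYYY-MM-DD string and the year/month/day parts that a Google::Type::Date carries:

```ruby
require "date"

# _PARTITIONDATE is a calendar date in YYYY-MM-DD form.
partition = Date.new(2023, 11, 5)
puts partition.strftime("%Y-%m-%d") # 2023-11-05

# Google::Type::Date carries the same information as year/month/day fields.
parts = { year: partition.year, month: partition.month, day: partition.day }
p parts
```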
#project_id
def project_id() -> ::String
- (::String) — The project ID or project number that the BigQuery source is in, with a length limit of 128 characters. If not specified, the project ID is inherited from the parent request.
#project_id=
def project_id=(value) -> ::String
- value (::String) — The project ID or project number that the BigQuery source is in, with a length limit of 128 characters. If not specified, the project ID is inherited from the parent request.
- (::String) — The project ID or project number that the BigQuery source is in, with a length limit of 128 characters. If not specified, the project ID is inherited from the parent request.
#table_id
def table_id() -> ::String
- (::String) — Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
#table_id=
def table_id=(value) -> ::String
- value (::String) — Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
- (::String) — Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
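Putting the fields together: a hedged sketch that assembles the message's fields as a plain hash and enforces the documented required fields. The helper name and the dataset, table, and bucket names are made up; with the google-cloud-retail gem installed, the same keyword arguments can be passed to Google::Cloud::Retail::V2::BigQuerySource.new:

```ruby
# Hypothetical helper assembling BigQuerySource fields as a plain hash.
# dataset_id and table_id are required; the other fields are optional.
def big_query_source_fields(dataset_id:, table_id:, project_id: nil,
                            gcs_staging_dir: nil, data_schema: "product")
  raise ArgumentError, "dataset_id is required" if dataset_id.to_s.empty?
  raise ArgumentError, "table_id is required" if table_id.to_s.empty?

  {
    dataset_id: dataset_id,
    table_id: table_id,
    project_id: project_id, # nil: inherited from the parent request
    gcs_staging_dir: gcs_staging_dir,
    data_schema: data_schema
  }.compact
end

fields = big_query_source_fields(
  dataset_id: "retail_catalog",
  table_id: "products",
  gcs_staging_dir: "gs://example-bucket/retail-staging"
)
p fields
```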