Reference documentation and code samples for the BigQuery Data Transfer Service V1 API class Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.
Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account.
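The accessors documented below all describe fields of this one message. As a rough sketch, a configuration can be pictured as a plain Ruby hash standing in for the protobuf message (all values here are hypothetical; in real code you would build a Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig):

```ruby
# Sketch: the main TransferConfig fields as a plain hash.
# All values are illustrative, not defaults.
transfer_config = {
  display_name:             "Nightly GCS load",
  data_source_id:           "google_cloud_storage",
  destination_dataset_id:   "analytics",
  schedule:                 "every 24 hours",
  data_refresh_window_days: 0,     # 0 = use the data source's default
  disabled:                 false,
  params:                   { "file_format" => "CSV" }
}
```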
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#data_refresh_window_days
def data_refresh_window_days() -> ::Integer
- (::Integer) — The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
#data_refresh_window_days=
def data_refresh_window_days=(value) -> ::Integer
- value (::Integer) — The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- (::Integer) — The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
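The [today-10, today-1] window described above can be sketched in plain Ruby to make the semantics concrete (this only illustrates the date arithmetic; the reingestion itself is performed by the service):

```ruby
require "date"

# Sketch: the window implied by data_refresh_window_days = 10.
# Every scheduled run re-fetches data for each day in [today-10, today-1];
# today itself is not part of the window.
data_refresh_window_days = 10
today  = Date.today
window = (today - data_refresh_window_days)..(today - 1)

window_days = (window.last - window.first).to_i + 1 # 10 days in the window
```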
#data_source_id
def data_source_id() -> ::String
- (::String) — Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
#data_source_id=
def data_source_id=(value) -> ::String
- value (::String) — Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- (::String) — Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
#dataset_region
def dataset_region() -> ::String
- (::String) — Output only. Region in which the BigQuery dataset is located.
#destination_dataset_id
def destination_dataset_id() -> ::String
- (::String) — The BigQuery target dataset ID.
#destination_dataset_id=
def destination_dataset_id=(value) -> ::String
- value (::String) — The BigQuery target dataset ID.
- (::String) — The BigQuery target dataset ID.
#disabled
def disabled() -> ::Boolean
- (::Boolean) — Whether this config is disabled. When set to true, no runs are scheduled for this transfer.
#disabled=
def disabled=(value) -> ::Boolean
- value (::Boolean) — Whether this config is disabled. When set to true, no runs are scheduled for this transfer.
- (::Boolean) — Whether this config is disabled. When set to true, no runs are scheduled for this transfer.
#display_name
def display_name() -> ::String
- (::String) — User-specified display name for the data transfer.
#display_name=
def display_name=(value) -> ::String
- value (::String) — User-specified display name for the data transfer.
- (::String) — User-specified display name for the data transfer.
#email_preferences
def email_preferences() -> ::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences
- (::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences) — Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
#email_preferences=
def email_preferences=(value) -> ::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences
- value (::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences) — Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- (::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences) — Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
#encryption_configuration
def encryption_configuration() -> ::Google::Cloud::Bigquery::DataTransfer::V1::EncryptionConfiguration
- (::Google::Cloud::Bigquery::DataTransfer::V1::EncryptionConfiguration) — The encryption configuration. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permission to use the key. Read methods return the key name currently in effect. Write methods apply the key if it is present, and otherwise try to apply the project's default keys.
#encryption_configuration=
def encryption_configuration=(value) -> ::Google::Cloud::Bigquery::DataTransfer::V1::EncryptionConfiguration
- value (::Google::Cloud::Bigquery::DataTransfer::V1::EncryptionConfiguration) — The encryption configuration. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permission to use the key. Read methods return the key name currently in effect. Write methods apply the key if it is present, and otherwise try to apply the project's default keys.
- (::Google::Cloud::Bigquery::DataTransfer::V1::EncryptionConfiguration) — The encryption configuration. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permission to use the key. Read methods return the key name currently in effect. Write methods apply the key if it is present, and otherwise try to apply the project's default keys.
#name
def name() -> ::String
- (::String) — The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
#name=
def name=(value) -> ::String
- value (::String) — The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- (::String) — The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
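The two documented name forms can be captured with a single pattern. A sketch in plain Ruby (the project ID, region, and config ID below are hypothetical values, not real resources):

```ruby
# Sketch: matching the two documented transfer config name forms.
NAME_PATTERN = %r{
  \Aprojects/(?<project_id>[^/]+)
  (?:/locations/(?<region>[^/]+))?
  /transferConfigs/(?<config_id>[^/]+)\z
}x

with_region    = "projects/my-project/locations/us/transferConfigs/1234abcd"
without_region = "projects/my-project/transferConfigs/1234abcd"

m = NAME_PATTERN.match(with_region)
# m[:project_id], m[:region], and m[:config_id] hold the path segments;
# for the short form, m[:region] is nil.
```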
#next_run_time
def next_run_time() -> ::Google::Protobuf::Timestamp
- (::Google::Protobuf::Timestamp) — Output only. The next time the data transfer will run.
#notification_pubsub_topic
def notification_pubsub_topic() -> ::String
- (::String) — Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is: projects/{project}/topics/{topic}
#notification_pubsub_topic=
def notification_pubsub_topic=(value) -> ::String
- value (::String) — Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is: projects/{project}/topics/{topic}
- (::String) — Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is: projects/{project}/topics/{topic}
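Building the topic string in the documented format is plain string interpolation; a sketch ("my-project" and "transfer-runs" are hypothetical names):

```ruby
# Sketch: building the documented Pub/Sub topic string.
project = "my-project"    # hypothetical project ID
topic   = "transfer-runs" # hypothetical topic name
notification_pubsub_topic = "projects/#{project}/topics/#{topic}"
```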
#owner_info
def owner_info() -> ::Google::Cloud::Bigquery::DataTransfer::V1::UserInfo
- (::Google::Cloud::Bigquery::DataTransfer::V1::UserInfo) — Output only. Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. If the user information is not available, this field will not be populated.
#params
def params() -> ::Google::Protobuf::Struct
- (::Google::Protobuf::Struct) — Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
#params=
def params=(value) -> ::Google::Protobuf::Struct
- value (::Google::Protobuf::Struct) — Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- (::Google::Protobuf::Struct) — Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
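Because params is a free-form Struct, its keys depend entirely on the data source. A sketch of what a Cloud Storage transfer's params might look like as a plain hash (the keys and values below are illustrative; the linked per-source documentation is authoritative):

```ruby
# Sketch: illustrative params for a Cloud Storage transfer.
# Keys vary by data source; consult the data source docs for the real list.
params = {
  "data_path_template"              => "gs://my-bucket/*.csv",
  "destination_table_name_template" => "daily_events",
  "file_format"                     => "CSV"
}
```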
#schedule
def schedule() -> ::String
- (::String) — Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
#schedule=
def schedule=(value) -> ::String
- value (::String) — Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- (::String) — Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
#schedule_options
def schedule_options() -> ::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions
- (::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions) — Options customizing the data transfer schedule.
#schedule_options=
def schedule_options=(value) -> ::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions
- value (::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions) — Options customizing the data transfer schedule.
- (::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions) — Options customizing the data transfer schedule.
#state
def state() -> ::Google::Cloud::Bigquery::DataTransfer::V1::TransferState
- (::Google::Cloud::Bigquery::DataTransfer::V1::TransferState) — Output only. State of the most recently updated transfer run.
#update_time
def update_time() -> ::Google::Protobuf::Timestamp
- (::Google::Protobuf::Timestamp) — Output only. Data transfer modification time. Ignored by server on input.
#user_id
def user_id() -> ::Integer
- (::Integer) — Deprecated. Unique ID of the user on whose behalf the transfer is done.
#user_id=
def user_id=(value) -> ::Integer
- value (::Integer) — Deprecated. Unique ID of the user on whose behalf the transfer is done.
- (::Integer) — Deprecated. Unique ID of the user on whose behalf the transfer is done.