TransferConfig(mapping=None, *, ignore_unknown_fields=False, **kwargs)
Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, ``destination_dataset_id`` specifies where data should be stored. When a new transfer configuration is created, the specified ``destination_dataset_id`` is created when needed and shared with the appropriate data source service account.
.. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
Attributes:
    name (str):
        The resource name of the transfer config. Transfer config names have the form ``projects/{project_id}/locations/{region}/transferConfigs/{config_id}`` or ``projects/{project_id}/transferConfigs/{config_id}``, where ``config_id`` is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
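The two documented name forms can be checked with a plain regular expression. This stdlib-only sketch is illustrative and not part of the library's API:

```python
import re

# The two documented resource-name forms (illustrative check only).
_PATTERNS = [
    re.compile(r"projects/[^/]+/locations/[^/]+/transferConfigs/[^/]+"),
    re.compile(r"projects/[^/]+/transferConfigs/[^/]+"),
]

def looks_like_transfer_config_name(name: str) -> bool:
    """Return True if `name` matches either documented form."""
    return any(p.fullmatch(name) for p in _PATTERNS)

print(looks_like_transfer_config_name(
    "projects/my-project/locations/us/transferConfigs/6fa2f43e"))  # True
```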
    destination_dataset_id (str):
        The BigQuery target dataset id.

        This field is a member of `oneof`_ ``destination``.
    display_name (str):
        User specified display name for the data transfer.
    data_source_id (str):
        Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
    params (google.protobuf.struct_pb2.Struct):
        Parameters specific to each data source. For more information see the ``bq`` tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
    schedule (str):
        Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: ``1st,3rd monday of month 15:30``, ``every wed,fri of jan,jun 13:15``, and ``first sunday of quarter 00:00``. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format

        NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
    schedule_options (google.cloud.bigquery_datatransfer_v1.types.ScheduleOptions):
        Options customizing the data transfer schedule.
    data_refresh_window_days (int):
        The number of days to look back to automatically refresh the data. For example, if ``data_refresh_window_days = 10``, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
    disabled (bool):
        Whether this config is disabled. When set to true, no runs will be scheduled for this transfer config.
    update_time (google.protobuf.timestamp_pb2.Timestamp):
        Output only. Data transfer modification time. Ignored by the server on input.
    next_run_time (google.protobuf.timestamp_pb2.Timestamp):
        Output only. Next time when the data transfer will run.
    state (google.cloud.bigquery_datatransfer_v1.types.TransferState):
        Output only. State of the most recently updated transfer run.
    user_id (int):
        Deprecated. Unique ID of the user on whose behalf the transfer is done.
    dataset_region (str):
        Output only. Region in which the BigQuery dataset is located.
    notification_pubsub_topic (str):
        Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is ``projects/{project_id}/topics/{topic_id}``.
    email_preferences (google.cloud.bigquery_datatransfer_v1.types.EmailPreferences):
        Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
    owner_info (google.cloud.bigquery_datatransfer_v1.types.UserInfo):
        Output only. Information about the user whose credentials are used to transfer data. Populated only for ``transferConfigs.get`` requests. If the user information is not available, this field will not be populated.

        This field is a member of `oneof`_ ``_owner_info``.
    encryption_configuration (google.cloud.bigquery_datatransfer_v1.types.EncryptionConfiguration):
        The encryption configuration. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permission to use the key. Read methods will return the key name currently in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.