Class TransferConfig (3.4.1)

TransferConfig(mapping=None, *, ignore_unknown_fields=False, **kwargs)

Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account.
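
A minimal sketch of how this message is typically used, assuming the scheduled_query data source; the project, dataset, query, and display name below are placeholders, and the params keys differ per data source:

    from google.cloud import bigquery_datatransfer_v1

    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Build the transfer configuration message. The name field is omitted
    # because it is ignored on creation (see the name attribute below).
    transfer_config = bigquery_datatransfer_v1.TransferConfig(
        destination_dataset_id="my_dataset",  # placeholder dataset id
        display_name="Example scheduled query",
        data_source_id="scheduled_query",
        params={"query": "SELECT CURRENT_TIMESTAMP() AS ts"},
        schedule="every 24 hours",
    )

    # Create the config; the server assigns the resource name.
    created = client.create_transfer_config(
        parent=client.common_project_path("my-project"),  # placeholder project
        transfer_config=transfer_config,
    )
    print(created.name)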

.. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields

Attributes

name str
The resource name of the transfer config. Transfer config names have the form projects/{project_id}/locations/{region}/transferConfigs/{config_id}, where config_id is usually a UUID, although that is not guaranteed or required. The name is ignored when creating a transfer config.
destination_dataset_id str
The BigQuery target dataset id. This field is a member of oneof_ destination.
display_name str
User specified display name for the data transfer.
data_source_id str
Data source id. Cannot be changed once data transfer is created.
params google.protobuf.struct_pb2.Struct
Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
schedule str
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid schedules: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: the granularity should be at least 8 hours, or less frequent. (A schedule update is sketched after this attribute list.)
schedule_options google.cloud.bigquery_datatransfer_v1.types.ScheduleOptions
Options customizing the data transfer schedule.
data_refresh_window_days int
The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
disabled bool
Whether this config is disabled. When set to true, no runs are scheduled for this transfer config.
update_time google.protobuf.timestamp_pb2.Timestamp
Output only. Data transfer modification time. Ignored by server on input.
next_run_time google.protobuf.timestamp_pb2.Timestamp
Output only. Next time when data transfer will run.
state google.cloud.bigquery_datatransfer_v1.types.TransferState
Output only. State of the most recently updated transfer run.
user_id int
Deprecated. Unique ID of the user on whose behalf transfer is done.
dataset_region str
Output only. Region in which BigQuery dataset is located.
notification_pubsub_topic str
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is: projects/{project}/topics/{topic}
email_preferences google.cloud.bigquery_datatransfer_v1.types.EmailPreferences
Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
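
For illustration, a minimal sketch of changing the schedule of an existing config with update_transfer_config; the resource name below is a placeholder, and only the fields listed in the update mask are modified:

    from google.cloud import bigquery_datatransfer_v1
    from google.protobuf import field_mask_pb2

    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Placeholder resource name; see the name attribute format above.
    config = bigquery_datatransfer_v1.TransferConfig(
        name="projects/my-project/locations/us/transferConfigs/my-config-id",
        schedule="first sunday of quarter 00:00",
    )

    # Only the fields named in the update mask are changed on the server.
    updated = client.update_transfer_config(
        transfer_config=config,
        update_mask=field_mask_pb2.FieldMask(paths=["schedule"]),
    )
    print(updated.schedule, updated.next_run_time)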