DataTransferServiceClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Optional[Union[str, google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.transports.base.DataTransferServiceTransport]] = None, client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
This API allows users to manage their data transfers into BigQuery.
Properties
transport
Returns the transport used by the client instance.
Type | Description |
DataTransferServiceTransport | The transport used by the client instance. |
Methods
DataTransferServiceClient
DataTransferServiceClient(*, credentials: Optional[google.auth.credentials.Credentials] = None, transport: Optional[Union[str, google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.transports.base.DataTransferServiceTransport]] = None, client_options: Optional[google.api_core.client_options.ClientOptions] = None, client_info: google.api_core.gapic_v1.client_info.ClientInfo = <google.api_core.gapic_v1.client_info.ClientInfo object>)
Instantiates the data transfer service client.
Name | Type | Description |
credentials | Optional[google.auth.credentials.Credentials] | The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. |
transport | Union[str, DataTransferServiceTransport] | The transport to use. If set to None, a transport is chosen automatically. |
client_options | google.api_core.client_options.ClientOptions | Custom options for the client. It won't take effect if a |
client_info | google.api_core.gapic_v1.client_info.ClientInfo | The client info used to send a user-agent string along with API requests. If |
Type | Description |
google.auth.exceptions.MutualTLSChannelError | If mutual TLS transport creation failed for any reason. |
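A minimal construction sketch; the explicit endpoint shown below is simply the service default and is included only to illustrate the client_options parameter, not a required value.
from google.api_core.client_options import ClientOptions
from google.cloud import bigquery_datatransfer_v1

# Default construction: credentials are discovered from the environment (ADC).
client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Construction with explicit client options.
options = ClientOptions(api_endpoint="bigquerydatatransfer.googleapis.com")
client_with_options = bigquery_datatransfer_v1.DataTransferServiceClient(
    client_options=options,
)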
__exit__
__exit__(type, value, traceback)
Releases the underlying transport's resources.
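Because the client implements __exit__, it can be used as a context manager; a short sketch in which the parent value and the printed field are placeholders:
from google.cloud import bigquery_datatransfer_v1

with bigquery_datatransfer_v1.DataTransferServiceClient() as client:
    request = bigquery_datatransfer_v1.ListTransferConfigsRequest(
        parent="parent_value",
    )
    for config in client.list_transfer_configs(request=request):
        print(config.display_name)
# On leaving the block, __exit__ releases the underlying transport's resources.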
check_valid_creds
check_valid_creds(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.CheckValidCredsRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns true if valid credentials exist for the given data source and requesting user.
from google.cloud import bigquery_datatransfer_v1

def sample_check_valid_creds():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.CheckValidCredsRequest(
        name="name_value",
    )

    # Make the request
    response = client.check_valid_creds(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.CheckValidCredsRequest, dict] | The request object. A request to determine whether the user has valid credentials. This method is used to limit the number of OAuth popups in the user interface. The user id is inferred from the API call context. If the data source has the Google+ authorization type, this method returns false, as it cannot be determined whether the credentials are already valid merely based on the user id. |
name | str | Required. The data source in the form: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.CheckValidCredsResponse | A response indicating whether the credentials exist and are valid. |
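As an alternative to building the request object, the flattened name argument can be passed directly. A sketch assuming placeholder project and data source ids:
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# data_source_path builds the fully-qualified resource name; the ids are placeholders.
name = client.data_source_path("my-project", "my_data_source")
response = client.check_valid_creds(name=name)
print(response.has_valid_creds)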
common_billing_account_path
common_billing_account_path(billing_account: str)
Returns a fully-qualified billing_account string.
common_folder_path
common_folder_path(folder: str)
Returns a fully-qualified folder string.
common_location_path
common_location_path(project: str, location: str)
Returns a fully-qualified location string.
common_organization_path
common_organization_path(organization: str)
Returns a fully-qualified organization string.
common_project_path
common_project_path(project: str)
Returns a fully-qualified project string.
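The common_* helpers are pure string builders and do not call the API; a short sketch with placeholder ids, where the printed values follow the standard resource-name patterns:
from google.cloud import bigquery_datatransfer_v1

DataTransferServiceClient = bigquery_datatransfer_v1.DataTransferServiceClient

print(DataTransferServiceClient.common_project_path("my-project"))
# projects/my-project
print(DataTransferServiceClient.common_location_path("my-project", "us"))
# projects/my-project/locations/us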
create_transfer_config
create_transfer_config(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.CreateTransferConfigRequest, dict]] = None, *, parent: Optional[str] = None, transfer_config: Optional[google.cloud.bigquery_datatransfer_v1.types.transfer.TransferConfig] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Creates a new data transfer configuration.
from google.cloud import bigquery_datatransfer_v1

def sample_create_transfer_config():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    transfer_config = bigquery_datatransfer_v1.TransferConfig()
    transfer_config.destination_dataset_id = "destination_dataset_id_value"

    request = bigquery_datatransfer_v1.CreateTransferConfigRequest(
        parent="parent_value",
        transfer_config=transfer_config,
    )

    # Make the request
    response = client.create_transfer_config(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.CreateTransferConfigRequest, dict] | The request object. A request to create a data transfer configuration. If new credentials are needed for this transfer configuration, an authorization code must be provided. If an authorization code is provided, the transfer configuration will be associated with the user id corresponding to the authorization code. Otherwise, the transfer configuration will be associated with the calling user. |
parent | str | Required. The BigQuery project id where the transfer configuration should be created. Must be in the format projects/{project_id}/locations/{location_id} or projects/{project_id}. If the specified location and the location of the destination BigQuery dataset do not match, the request will fail. This corresponds to the |
transfer_config | google.cloud.bigquery_datatransfer_v1.types.TransferConfig | Required. Data transfer configuration to create. This corresponds to the |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.TransferConfig | Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account. |
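The same call using the flattened parent and transfer_config arguments; the project id, dataset, query, and schedule below are placeholders, and "scheduled_query" is used only as an illustrative data source id:
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

transfer_config = bigquery_datatransfer_v1.TransferConfig(
    destination_dataset_id="my_dataset",
    display_name="Nightly load",
    data_source_id="scheduled_query",
    params={"query": "SELECT 1"},
    schedule="every 24 hours",
)
response = client.create_transfer_config(
    parent=client.common_project_path("my-project"),
    transfer_config=transfer_config,
)
print(response.name)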
data_source_path
data_source_path(project: str, data_source: str)
Returns a fully-qualified data_source string.
delete_transfer_config
delete_transfer_config(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.DeleteTransferConfigRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Deletes a data transfer configuration, including any associated transfer runs and logs.
from google.cloud import bigquery_datatransfer_v1

def sample_delete_transfer_config():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.DeleteTransferConfigRequest(
        name="name_value",
    )

    # Make the request
    client.delete_transfer_config(request=request)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.DeleteTransferConfigRequest, dict] | The request object. A request to delete data transfer information. All associated transfer runs and log messages will be deleted as well. |
name | str | Required. The field will contain the name of the resource requested, for example: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
delete_transfer_run
delete_transfer_run(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.DeleteTransferRunRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Deletes the specified transfer run.
from google.cloud import bigquery_datatransfer_v1

def sample_delete_transfer_run():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.DeleteTransferRunRequest(
        name="name_value",
    )

    # Make the request
    client.delete_transfer_run(request=request)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.DeleteTransferRunRequest, dict] | The request object. A request to delete data transfer run information. |
name | str | Required. The field will contain the name of the resource requested, for example: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
enroll_data_sources
enroll_data_sources(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.EnrollDataSourcesRequest, dict]] = None, *, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Enrolls data sources in a user project. This allows users to create transfer configurations for these data sources. Enrolled data sources also appear in the ListDataSources RPC and, as such, in the BigQuery UI at https://bigquery.cloud.google.com (documentation is available at https://cloud.google.com/bigquery/bigquery-web-ui and https://cloud.google.com/bigquery/docs/working-with-transfers).
from google.cloud import bigquery_datatransfer_v1

def sample_enroll_data_sources():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.EnrollDataSourcesRequest(
    )

    # Make the request
    client.enroll_data_sources(request=request)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.EnrollDataSourcesRequest, dict] | The request object. A request to enroll a set of data sources so they are visible in the BigQuery UI's |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
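A sketch that enrolls specific data sources by id; the project and data source ids are placeholders, and the name and data_source_ids fields are taken from the EnrollDataSourcesRequest message:
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

request = bigquery_datatransfer_v1.EnrollDataSourcesRequest(
    name=client.common_project_path("my-project"),  # placeholder project
    data_source_ids=["my_data_source"],             # placeholder data source id
)
client.enroll_data_sources(request=request)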
from_service_account_file
from_service_account_file(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
Name | Type | Description |
filename | str | The path to the service account private key json file. |
Type | Description |
DataTransferServiceClient | The constructed client. |
from_service_account_info
from_service_account_info(info: dict, *args, **kwargs)
Creates an instance of this client using the provided credentials info.
Name | Type | Description |
info | dict | The service account private key info. |
Type | Description |
DataTransferServiceClient | The constructed client. |
from_service_account_json
from_service_account_json(filename: str, *args, **kwargs)
Creates an instance of this client using the provided credentials file.
Name | Type | Description |
filename | str | The path to the service account private key json file. |
Type | Description |
DataTransferServiceClient | The constructed client. |
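The from_service_account_* constructors accept either a key file path or an already-parsed dict; a sketch with a placeholder path:
import json

from google.cloud import bigquery_datatransfer_v1

# From a key file on disk ("key.json" is a placeholder path).
client = bigquery_datatransfer_v1.DataTransferServiceClient.from_service_account_file("key.json")

# From a dict that already holds the key material.
with open("key.json") as handle:
    info = json.load(handle)
client_from_info = bigquery_datatransfer_v1.DataTransferServiceClient.from_service_account_info(info)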
get_data_source
get_data_source(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.GetDataSourceRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Retrieves a supported data source and returns its settings.
from google.cloud import bigquery_datatransfer_v1

def sample_get_data_source():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.GetDataSourceRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_data_source(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.GetDataSourceRequest, dict] | The request object. A request to get data source info. |
name | str | Required. The field will contain the name of the resource requested, for example: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.DataSource | Defines the properties and custom parameters for a data source. |
get_mtls_endpoint_and_cert_source
get_mtls_endpoint_and_cert_source(client_options: Optional[google.api_core.client_options.ClientOptions] = None)
Return the API endpoint and client cert source for mutual TLS.
The client cert source is determined in the following order:
(1) if the GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is not "true", the client cert source is None.
(2) if client_options.client_cert_source is provided, use the provided one; if the default client cert source exists, use the default one; otherwise the client cert source is None.
The API endpoint is determined in the following order:
(1) if client_options.api_endpoint is provided, use the provided one.
(2) if the GOOGLE_API_USE_MTLS_ENDPOINT environment variable is "always", use the default mTLS endpoint; if the environment variable is "never", use the default API endpoint; otherwise, if a client cert source exists, use the default mTLS endpoint, otherwise use the default API endpoint.
More details can be found at https://google.aip.dev/auth/4114.
Name | Type | Description |
client_options | google.api_core.client_options.ClientOptions | Custom options for the client. Only the |
Type | Description |
google.auth.exceptions.MutualTLSChannelError | If any errors happen. |
Type | Description |
Tuple[str, Callable[[], Tuple[bytes, bytes]]] | returns the API endpoint and the client cert source to use. |
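A sketch of inspecting the endpoint and cert source the client would pick; with no overrides in client_options, the result depends on the environment variables described above:
from google.api_core.client_options import ClientOptions
from google.cloud import bigquery_datatransfer_v1

options = ClientOptions()  # no explicit api_endpoint or client_cert_source
endpoint, cert_source = (
    bigquery_datatransfer_v1.DataTransferServiceClient.get_mtls_endpoint_and_cert_source(options)
)
print(endpoint)      # the API endpoint the client would use
print(cert_source)   # None unless a client certificate is configured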
get_transfer_config
get_transfer_config(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.GetTransferConfigRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns information about a data transfer config.
from google.cloud import bigquery_datatransfer_v1

def sample_get_transfer_config():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.GetTransferConfigRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_transfer_config(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.GetTransferConfigRequest, dict] | The request object. A request to get data transfer information. |
name | str | Required. The field will contain the name of the resource requested, for example: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.TransferConfig | Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account. |
get_transfer_run
get_transfer_run(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.GetTransferRunRequest, dict]] = None, *, name: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns information about the particular transfer run.
from google.cloud import bigquery_datatransfer_v1

def sample_get_transfer_run():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.GetTransferRunRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_transfer_run(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.GetTransferRunRequest, dict] | The request object. A request to get data transfer run information. |
name | str | Required. The field will contain the name of the resource requested, for example: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.TransferRun | Represents a data transfer run. |
list_data_sources
list_data_sources(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.ListDataSourcesRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Lists supported data sources and returns their settings.
from google.cloud import bigquery_datatransfer_v1

def sample_list_data_sources():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.ListDataSourcesRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_data_sources(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.ListDataSourcesRequest, dict] | The request object. Request to list supported data sources and their data transfer settings. |
parent | str | Required. The BigQuery project id for which data sources should be returned. Must be in the form: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.pagers.ListDataSourcesPager | Returns list of supported data sources and their metadata. Iterating over this object will yield results and resolve additional pages automatically. |
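The returned pager can also be consumed page by page rather than item by item; a sketch with a placeholder project id:
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

request = bigquery_datatransfer_v1.ListDataSourcesRequest(
    parent=client.common_project_path("my-project"),  # placeholder project id
)
pager = client.list_data_sources(request=request)

# Each page is a ListDataSourcesResponse holding a batch of data sources.
for page in pager.pages:
    for data_source in page.data_sources:
        print(data_source.data_source_id)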
list_transfer_configs
list_transfer_configs(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.ListTransferConfigsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns information about all transfer configs owned by a project in the specified location.
from google.cloud import bigquery_datatransfer_v1

def sample_list_transfer_configs():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.ListTransferConfigsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_transfer_configs(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.ListTransferConfigsRequest, dict] | The request object. A request to list data transfers configured for a BigQuery project. |
parent | str | Required. The BigQuery project id for which data sources should be returned: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.pagers.ListTransferConfigsPager | The returned list of pipelines in the project. Iterating over this object will yield results and resolve additional pages automatically. |
list_transfer_logs
list_transfer_logs(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.ListTransferLogsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns log messages for the transfer run.
from google.cloud import bigquery_datatransfer_v1

def sample_list_transfer_logs():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.ListTransferLogsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_transfer_logs(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.ListTransferLogsRequest, dict] | The request object. A request to get user-facing log messages associated with a data transfer run. |
parent | str | Required. Transfer run name in the form: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.pagers.ListTransferLogsPager | The returned list transfer run messages. Iterating over this object will yield results and resolve additional pages automatically. |
list_transfer_runs
list_transfer_runs(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.ListTransferRunsRequest, dict]] = None, *, parent: Optional[str] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Returns information about running and completed transfer runs.
from google.cloud import bigquery_datatransfer_v1

def sample_list_transfer_runs():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.ListTransferRunsRequest(
        parent="parent_value",
    )

    # Make the request
    page_result = client.list_transfer_runs(request=request)

    # Handle the response
    for response in page_result:
        print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.ListTransferRunsRequest, dict] | The request object. A request to list data transfer runs. |
parent | str | Required. Name of transfer configuration for which transfer runs should be retrieved. Format of transfer configuration resource name is: |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.pagers.ListTransferRunsPager | The returned list of pipelines in the project. Iterating over this object will yield results and resolve additional pages automatically. |
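The retry and timeout parameters shared by these methods can be customized per call; a sketch with placeholder resource ids and an assumed retry policy:
from google.api_core import exceptions, retry
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Retry only transient unavailability, for at most 60 seconds overall.
custom_retry = retry.Retry(
    predicate=retry.if_exception_type(exceptions.ServiceUnavailable),
    deadline=60.0,
)

runs = client.list_transfer_runs(
    parent=client.transfer_config_path("my-project", "my-config"),  # placeholder ids
    retry=custom_retry,
    timeout=30.0,
)
for run in runs:
    print(run.state)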
parse_common_billing_account_path
parse_common_billing_account_path(path: str)
Parse a billing_account path into its component segments.
parse_common_folder_path
parse_common_folder_path(path: str)
Parse a folder path into its component segments.
parse_common_location_path
parse_common_location_path(path: str)
Parse a location path into its component segments.
parse_common_organization_path
parse_common_organization_path(path: str)
Parse an organization path into its component segments.
parse_common_project_path
parse_common_project_path(path: str)
Parse a project path into its component segments.
parse_data_source_path
parse_data_source_path(path: str)
Parses a data_source path into its component segments.
parse_run_path
parse_run_path(path: str)
Parses a run path into its component segments.
parse_transfer_config_path
parse_transfer_config_path(path: str)
Parses a transfer_config path into its component segments.
run_path
run_path(project: str, transfer_config: str, run: str)
Returns a fully-qualified run string.
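The run_path and transfer_config_path builders and their parse_* counterparts are inverses of each other; a sketch with placeholder ids, where the expected outputs follow the standard resource-name patterns:
from google.cloud import bigquery_datatransfer_v1

DataTransferServiceClient = bigquery_datatransfer_v1.DataTransferServiceClient

path = DataTransferServiceClient.run_path("my-project", "my-config", "my-run")
print(path)
# projects/my-project/transferConfigs/my-config/runs/my-run

print(DataTransferServiceClient.parse_run_path(path))
# {'project': 'my-project', 'transfer_config': 'my-config', 'run': 'my-run'}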
schedule_transfer_runs
schedule_transfer_runs(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.ScheduleTransferRunsRequest, dict]] = None, *, parent: Optional[str] = None, start_time: Optional[google.protobuf.timestamp_pb2.Timestamp] = None, end_time: Optional[google.protobuf.timestamp_pb2.Timestamp] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Creates transfer runs for a time range [start_time, end_time]. For each date, or whatever granularity the data source supports, in the range, one transfer run is created. Note that runs are created per UTC time in the time range. DEPRECATED: use StartManualTransferRuns instead.
from google.cloud import bigquery_datatransfer_v1

def sample_schedule_transfer_runs():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.ScheduleTransferRunsRequest(
        parent="parent_value",
    )

    # Make the request
    response = client.schedule_transfer_runs(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.ScheduleTransferRunsRequest, dict] | The request object. A request to schedule transfer runs for a time range. |
parent | str | Required. Transfer configuration name in the form: |
start_time | google.protobuf.timestamp_pb2.Timestamp | Required. Start time of the range of transfer runs. For example, |
end_time | google.protobuf.timestamp_pb2.Timestamp | Required. End time of the range of transfer runs. For example, |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.ScheduleTransferRunsResponse | A response to schedule transfer runs for a time range. |
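Although this method is deprecated in favor of start_manual_transfer_runs, a sketch of the flattened form with protobuf Timestamps may still be useful; the resource ids and the three-day range are placeholders:
import datetime

from google.cloud import bigquery_datatransfer_v1
from google.protobuf import timestamp_pb2

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Build protobuf Timestamps covering the last three days (placeholder range).
start = timestamp_pb2.Timestamp()
start.FromDatetime(datetime.datetime.utcnow() - datetime.timedelta(days=3))
end = timestamp_pb2.Timestamp()
end.FromDatetime(datetime.datetime.utcnow())

response = client.schedule_transfer_runs(
    parent=client.transfer_config_path("my-project", "my-config"),  # placeholder ids
    start_time=start,
    end_time=end,
)
for run in response.runs:
    print(run.run_time)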
start_manual_transfer_runs
start_manual_transfer_runs(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.StartManualTransferRunsRequest, dict]] = None, *, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Start manual transfer runs to be executed now with schedule_time equal to current time. The transfer runs can be created for a time range where the run_time is between start_time (inclusive) and end_time (exclusive), or for a specific run_time.
from google.cloud import bigquery_datatransfer_v1

def sample_start_manual_transfer_runs():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    request = bigquery_datatransfer_v1.StartManualTransferRunsRequest(
    )

    # Make the request
    response = client.start_manual_transfer_runs(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest, dict] | The request object. A request to start manual transfer runs. |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsResponse | A response to start manual transfer runs. |
transfer_config_path
transfer_config_path(project: str, transfer_config: str)
Returns a fully-qualified transfer_config string.
update_transfer_config
update_transfer_config(request: Optional[Union[google.cloud.bigquery_datatransfer_v1.types.datatransfer.UpdateTransferConfigRequest, dict]] = None, *, transfer_config: Optional[google.cloud.bigquery_datatransfer_v1.types.transfer.TransferConfig] = None, update_mask: Optional[google.protobuf.field_mask_pb2.FieldMask] = None, retry: Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault] = <_MethodDefault._DEFAULT_VALUE: <object object>>, timeout: Optional[float] = None, metadata: Sequence[Tuple[str, str]] = ())
Updates a data transfer configuration. All fields must be set, even if they are not updated.
from google.cloud import bigquery_datatransfer_v1

def sample_update_transfer_config():
    # Create a client
    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    # Initialize request argument(s)
    transfer_config = bigquery_datatransfer_v1.TransferConfig()
    transfer_config.destination_dataset_id = "destination_dataset_id_value"

    request = bigquery_datatransfer_v1.UpdateTransferConfigRequest(
        transfer_config=transfer_config,
    )

    # Make the request
    response = client.update_transfer_config(request=request)

    # Handle the response
    print(response)
Name | Type | Description |
request | Union[google.cloud.bigquery_datatransfer_v1.types.UpdateTransferConfigRequest, dict] | The request object. A request to update a transfer configuration. To update the user id of the transfer configuration, an authorization code needs to be provided. |
transfer_config | google.cloud.bigquery_datatransfer_v1.types.TransferConfig | Required. Data transfer configuration to update. This corresponds to the |
update_mask | google.protobuf.field_mask_pb2.FieldMask | Required. List of fields to be updated in this request. This corresponds to the |
retry | google.api_core.retry.Retry | Designation of what errors, if any, should be retried. |
timeout | float | The timeout for this request. |
metadata | Sequence[Tuple[str, str]] | Strings which should be sent along with the request as metadata. |
Type | Description |
google.cloud.bigquery_datatransfer_v1.types.TransferConfig | Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account. |
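A sketch of a partial update driven by a FieldMask; the resource ids and the display_name value are placeholders:
from google.cloud import bigquery_datatransfer_v1
from google.protobuf import field_mask_pb2

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Fetch the existing configuration, change one field, and send a mask naming only that field.
transfer_config = client.get_transfer_config(
    name=client.transfer_config_path("my-project", "my-config"),  # placeholder ids
)
transfer_config.display_name = "Renamed transfer"

updated = client.update_transfer_config(
    transfer_config=transfer_config,
    update_mask=field_mask_pb2.FieldMask(paths=["display_name"]),
)
print(updated.display_name)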