Schedule a ServiceNow transfer
The BigQuery Data Transfer Service for ServiceNow connector lets you automatically schedule and manage recurring load jobs from ServiceNow into BigQuery.
Limitations
ServiceNow data transfers are subject to the following limitations:
- Running concurrent data transfers on the same ServiceNow instance isn't recommended.
- The minimum interval time between recurring data transfers is 15 minutes. The default interval for a recurring transfer is 24 hours.
A ServiceNow data transfer loads business-related tables from three applications: Procurement, Product Catalog, and Contract Management. The following tables can be transferred:
- ast_contract
- clm_condition_check
- clm_condition_checker
- clm_contract_history
- clm_m2m_contract_and_terms
- clm_m2m_contract_asset
- clm_m2m_contract_user
- clm_m2m_rate_card_asset
- clm_terms_and_conditions
- pc_hardware_cat_item
- pc_product_cat_item
- pc_software_cat_item
- pc_vendor_cat_item
- proc_po
- proc_po_item
- proc_rec_slip
- proc_rec_slip_item
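After a transfer run completes, you can spot-check one of these tables in the destination dataset. The following query is a minimal sketch, assuming a destination dataset named mydataset in your default project:

# "mydataset" is an assumed dataset name; replace it with your destination dataset.
bq query --use_legacy_sql=false \
    'SELECT COUNT(*) AS row_count FROM mydataset.ast_contract'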
Before you begin
Before you create a ServiceNow data transfer, complete the following configuration for ServiceNow and BigQuery.
ServiceNow prerequisites
- To access ServiceNow APIs, create OAuth credentials.
- The following ServiceNow applications must all be enabled in the ServiceNow instance:
  - Procurement
  - Product Catalog
  - Contract Management
BigQuery prerequisites
- Complete all actions required to enable the BigQuery Data Transfer Service.
- Create a BigQuery dataset for storing the data, as shown in the sketch after this list.
- If you intend to set up transfer run notifications for Pub/Sub, ensure that you have the pubsub.topics.setIamPolicy Identity and Access Management (IAM) permission. If you only set up email notifications, Pub/Sub permissions aren't required. For more information, see BigQuery Data Transfer Service run notifications.
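For example, you can create the dataset with the bq command-line tool. The following is a minimal sketch; myproject, mydataset, and the US location are placeholder values:

# Placeholder values: replace myproject, mydataset, and the location with your own.
bq --location=US mk --dataset myproject:mydataset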
Required BigQuery roles
To get the permissions that you need to create a transfer, ask your administrator to grant you the BigQuery Admin (roles/bigquery.admin) IAM role. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to create a transfer. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to create a transfer:
- bigquery.transfers.update on the user
- bigquery.datasets.get on the target dataset
- bigquery.datasets.update on the target dataset
You might also be able to get these permissions with custom roles or other predefined roles.
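For example, an administrator can grant the BigQuery Admin role with the following gcloud command. This is a sketch; PROJECT_ID and USER_EMAIL are placeholders for your project and user:

# PROJECT_ID and USER_EMAIL are placeholders.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/bigquery.admin"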
Set up a ServiceNow data transfer
You can create a ServiceNow data transfer in the Google Cloud console, with the bq command-line tool, or through the API.
Console
Go to the Data transfers page in the Google Cloud console.
Click Create transfer.
In the Source type section, for Source, select ServiceNow.
In the Data source details section, do the following:
- For Instance ID, enter the ServiceNow instance ID. You can get this from your ServiceNow URL, for example, https://INSTANCE_ID.service-now.com.
- For Username, enter the ServiceNow username to use for the connection.
- For Password, enter the ServiceNow password.
- For Client ID, enter the client ID from your OAuth credentials. To generate credentials, see Create OAuth Credentials.
- For Client secret, enter the client secret from your OAuth credentials.
- For Value type, choose one of the following:
- To transfer the values stored in the database, choose Actual.
- To transfer the display values of the columns, choose Display.
In the Destination settings section, for Dataset, select the dataset you created to store your data.
In the Transfer config name section, for Display name, enter a name for the data transfer.
In the Schedule options section, do the following:
In the Repeat frequency list, select an option to specify how often this data transfer runs. To specify a custom repeat frequency, select Custom. If you select On-demand, then this data transfer runs when you manually trigger the transfer.
If applicable, select either Start now or Start at set time and provide a start date and run time.
In the Service Account menu, select a service account from the service accounts associated with your Google Cloud project. The selected service account must have the required roles to run this data transfer.
If you signed in with a federated identity, then a service account is required to create a data transfer. If you signed in with a Google Account, then a service account for the transfer is optional.
For more information about using service accounts with data transfers, see Use service accounts.
Optional: In the Notification options section, do the following:
- To enable email notifications, click the Email notification toggle. When you enable this option, the transfer administrator receives an email notification when a transfer run fails.
- To enable Pub/Sub transfer run notifications for this data transfer, click the Pub/Sub notifications toggle. You can select your topic name, or you can click Create a topic to create one.
Click Save.
bq
Enter the bq mk command and supply the transfer creation flag, --transfer_config:
bq mk \
--transfer_config \
--project_id=PROJECT_ID \
--data_source=DATA_SOURCE \
--display_name=DISPLAY_NAME \
--target_dataset=DATASET \
--params='PARAMETERS'
Replace the following:

- PROJECT_ID (optional): your Google Cloud project ID. If a project ID isn't specified, the default project is used.
- DATA_SOURCE: the data source (for example, servicenow).
- DISPLAY_NAME: the display name for the transfer configuration. The data transfer name can be any value that lets you identify the transfer if you need to modify it later.
- DATASET: the target dataset for the transfer configuration.
- PARAMETERS: the parameters for the created transfer configuration in JSON format, for example: --params='{"param":"param_value"}'. The following are the parameters for a ServiceNow data transfer:

| ServiceNow parameter | Required or optional | Description |
| --- | --- | --- |
| connector.instanceId | Required | Instance ID of the ServiceNow instance |
| connector.authentication.username | Required | Username of the credentials |
| connector.authentication.password | Required | Password of the credentials |
| connector.authentication.oauth.clientId | Required | Client ID of the generated OAuth credentials |
| connector.authentication.oauth.clientSecret | Required | Client secret of the generated OAuth credentials |
| connector.valueType | Optional | Actual or Display (default: Actual) |

For example, the following command creates a ServiceNow data transfer in the default project with all the required parameters:
bq mk \
    --transfer_config \
    --target_dataset=mydataset \
    --data_source=servicenow \
    --display_name='My Transfer' \
    --params='{"connector.authentication.oauth.clientId": "1234567890",
      "connector.authentication.oauth.clientSecret":"ABC12345",
      "connector.authentication.username":"user1",
      "connector.authentication.password":"abcdef1234",
      "connector.instanceId":"dev-instance"}'
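To set the recurrence described in the Limitations section, you can also pass the --schedule flag when you create the transfer configuration. The following sketch assumes the same PARAMETERS value as the previous example:

# Sketch: PARAMETERS stands in for the same JSON parameters used in the previous example.
bq mk \
    --transfer_config \
    --target_dataset=mydataset \
    --data_source=servicenow \
    --display_name='My Transfer' \
    --schedule='every 24 hours' \
    --params='PARAMETERS'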
API
Use the projects.locations.transferConfigs.create method and supply an instance of the TransferConfig resource.
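For example, you can call the REST method directly with curl. The following is a minimal sketch that assumes the us location; PROJECT_ID and all parameter values are placeholders to replace with your own:

# Sketch: PROJECT_ID, the us location, and all parameter values are placeholders.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
          "destinationDatasetId": "mydataset",
          "displayName": "My Transfer",
          "dataSourceId": "servicenow",
          "params": {
            "connector.instanceId": "dev-instance",
            "connector.authentication.username": "user1",
            "connector.authentication.password": "abcdef1234",
            "connector.authentication.oauth.clientId": "1234567890",
            "connector.authentication.oauth.clientSecret": "ABC12345"
          }
        }' \
    "https://bigquerydatatransfer.googleapis.com/v1/projects/PROJECT_ID/locations/us/transferConfigs"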
Troubleshoot transfer issues
For more information, see Troubleshoot transfer configurations.
Transfer fails due to ServiceNow enablement
Data transfers fail when the Procurement, Product Catalog, or Contract Management applications aren't enabled in ServiceNow. To fix this issue, enable all three applications. For example, activate Procurement in your ServiceNow instance.
Issue occurs during transfer run
A configuration issue can cause the transfer run to fail or not be created as intended. To resolve the issue, do the following:
- Check that the ServiceNow account credentials, such as Username, Password, Client ID, and Client secret values, are valid.
- Check that the Instance ID is the valid ID of your ServiceNow instance.
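While troubleshooting, it can also help to inspect the stored transfer configuration with the bq tool. The following commands are a sketch; the us transfer location, PROJECT_ID, and CONFIG_ID are placeholders:

# List transfer configurations in a location (placeholder: us).
bq ls --transfer_config --transfer_location=us

# Show details for one configuration (placeholders: PROJECT_ID, CONFIG_ID).
bq show --transfer_config \
    projects/PROJECT_ID/locations/us/transferConfigs/CONFIG_ID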
Pricing
There is no cost to transfer ServiceNow data into BigQuery while this feature is in Preview.
What's next
- For an overview of BigQuery Data Transfer Service, see Introduction to BigQuery Data Transfer Service.
- For information on using transfers including getting information about a transfer configuration, listing transfer configurations, and viewing a transfer's run history, see Working with transfers.
- Learn how to load data with cross-cloud operations.