Schedule a Facebook Ads transfer
The BigQuery Data Transfer Service for Facebook Ads connector lets you automatically schedule and manage recurring load jobs from Facebook Ads into BigQuery.
Limitations
Facebook Ads data transfers are subject to the following limitations:
- The minimum interval between recurring Facebook Ads data transfers is 24 hours; this is also the default interval.
- The BigQuery Data Transfer Service for Facebook Ads only supports a fixed set of tables. Custom reports aren't supported.
- Facebook Ads data transfers have a maximum duration of six hours. A transfer fails if it takes longer than this maximum duration.
- Incremental transfers aren't supported for the `AdInsights` and `AdInsightsActions` tables. When you create a data transfer that includes the `AdInsights` and `AdInsightsActions` tables and you specify a date in Schedule options, all data that is available for that date is transferred.
- The BigQuery Data Transfer Service supports a refresh window of one day for the `AdInsights` and `AdInsightsActions` tables. The refresh window is the number of days of source data that a data transfer retrieves. When you run a data transfer for the first time, it retrieves all source data available within the refresh window.
- The long-lived user access token that is required for Facebook Ads transfers expires after 60 days. If your long-lived user access token has expired, you can obtain a new one by navigating to your data transfer details and clicking Edit. On the edit transfer page, follow the steps in Facebook Ads prerequisites to generate a new long-lived user access token.
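Because an expired token silently breaks scheduled runs, it helps to track the 60-day window ahead of time. A minimal sketch, assuming you record the issuance date when you generate the token (the dates below are placeholders):

```python
from datetime import date, timedelta

TOKEN_LIFETIME_DAYS = 60  # long-lived user access tokens expire after 60 days

def token_expiry(issued_on: date) -> date:
    """Return the date the long-lived user access token expires."""
    return issued_on + timedelta(days=TOKEN_LIFETIME_DAYS)

def needs_refresh(issued_on: date, today: date, warn_days: int = 7) -> bool:
    """True when the token is expired or within `warn_days` of expiring."""
    return today >= token_expiry(issued_on) - timedelta(days=warn_days)

# Example: a token issued on 2024-01-01 expires on 2024-03-01.
print(token_expiry(date(2024, 1, 1)))  # 2024-03-01
```

Alerting a few days before expiry gives you time to regenerate the token from the transfer's Edit page before runs start failing.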
Data ingestion from Facebook Ads transfers
When you transfer data from Facebook Ads into BigQuery, the data is loaded into BigQuery tables that are partitioned by date. If you schedule multiple transfers for the same date, the BigQuery Data Transfer Service overwrites the partition for that date with the latest data. Multiple transfers in the same day or running backfills don't result in duplicate data, and partitions for other dates are not affected.

For the `AdInsights` and `AdInsightsActions` tables, the table partition that the data is loaded into corresponds to the date from the data source.

For the `AdAccounts` table, snapshots are taken once a day and stored in the partition of the last transfer run date. The refresh window does not apply to the `AdAccounts` table.
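The overwrite-per-partition behavior means re-running a transfer or backfill for a date is idempotent. A toy model of that semantics (not the actual service implementation, just the contract it provides):

```python
# Toy model of date-partitioned loads: each transfer run replaces exactly
# one partition, so re-running the same date never duplicates rows.
table: dict[str, list[dict]] = {}  # partition date -> rows

def load_partition(partition_date: str, rows: list[dict]) -> None:
    table[partition_date] = rows  # overwrite, never append

load_partition("2024-05-01", [{"ad_id": 1, "clicks": 10}])
load_partition("2024-05-02", [{"ad_id": 1, "clicks": 12}])
# A backfill for 2024-05-01 replaces that partition only:
load_partition("2024-05-01", [{"ad_id": 1, "clicks": 11}])
print(table["2024-05-01"])  # [{'ad_id': 1, 'clicks': 11}]
```

This is why you can safely re-run a failed day: the 2024-05-02 partition above is untouched by the 2024-05-01 backfill.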
Before you begin
The following sections describe the steps that you need to take before you create a Facebook Ads data transfer.
Facebook Ads prerequisites
Ensure that you have the following Facebook Ads information when creating a Facebook Ads data transfer.
| Facebook Ads parameters | Description |
|---|---|
| `clientID` | The app ID for the OAuth 2.0 client. |
| `clientSecret` | The app secret for the OAuth 2.0 client. |
| `refreshToken` | The long-lived user access token, also known as a refresh token. |
To obtain a `clientID` and `clientSecret`, perform the following steps:

1. Create a Facebook developer app with the app type Business.
2. In the Facebook App dashboard, click App Settings > Basic and find the app ID and app secret that correspond to the app.
To obtain a long-lived user access token, also known as a refresh token, perform the following steps:
1. In the Google Cloud console, proceed with the steps to create a Facebook Ads transfer.
2. In the Data Source Details section, copy the redirect URI listed after the Refresh Token field.
3. Go to the Facebook App dashboard, then click Set up in the Facebook Login for Business section.
4. On the Settings page, enter the redirect URI in the Valid OAuth Redirect URIs field and click Save.
5. Return to the Google Cloud console. In the Data Source Details section, click Authorize. You are redirected to a Facebook authentication page.
6. Select the Facebook developer app to authorize the account that connects with the BigQuery Data Transfer Service.
7. Once complete, click Got it to return to the Google Cloud console. The long-lived user access token is now populated in the transfer configuration.
Long-lived user access tokens expire after 60 days. For information on how to obtain a new long-lived user access token, see Limitations.
Refresh token alternatives
Alternatively, you can provide a refresh token when you create a data transfer if you have obtained one using one of the following methods:
- Generate a long-lived user access token using the Graph API. The `ads_management`, `ads_read`, and `business_management` permissions are required for a valid token for the data transfer.
- Generate a system user token. A system user token lets you manually add assets, such as ad accounts, to be included in the data transfer. If a system user token expires, you must manually update the transfer configuration with new credentials. You can also create a token that doesn't expire when you create a system user token. For more information, see Supported access tokens.
BigQuery prerequisites
- Verify that you have completed all actions required to enable the BigQuery Data Transfer Service.
- Create a BigQuery dataset to store your data.
- If you intend to set up transfer run notifications for Pub/Sub, ensure that you have the `pubsub.topics.setIamPolicy` Identity and Access Management (IAM) permission. If you only set up email notifications, Pub/Sub permissions aren't required. For more information, see BigQuery Data Transfer Service run notifications.
Required BigQuery roles
To get the permissions that you need to create a transfer, ask your administrator to grant you the BigQuery Admin (`roles/bigquery.admin`) IAM role.
For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to create a transfer. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to create a transfer:
- `bigquery.transfers.update` on the user
- `bigquery.datasets.get` on the target dataset
- `bigquery.datasets.update` on the target dataset
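If you grant a custom role rather than BigQuery Admin, a quick way to sanity-check that it covers the list above is a set difference. A toy sketch (not an IAM API call; the granted set is a placeholder for your role's actual permissions):

```python
REQUIRED_PERMISSIONS = {
    "bigquery.transfers.update",  # on the user
    "bigquery.datasets.get",      # on the target dataset
    "bigquery.datasets.update",   # on the target dataset
}

def missing_permissions(granted: set[str]) -> set[str]:
    """Return the required permissions absent from `granted`."""
    return REQUIRED_PERMISSIONS - granted

# Example: a custom role lacking the transfer permission.
print(missing_permissions({"bigquery.datasets.get", "bigquery.datasets.update"}))
```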
You might also be able to get these permissions with custom roles or other predefined roles.
Create a Facebook Ads data transfer
Select one of the following options:
Console
Go to the Data transfers page in the Google Cloud console.
Click Create transfer.

In the Source type section, for Source, select Facebook Ads.
In the Data source details section, do the following:
- For Client ID, enter the app ID.
- For Client secret, enter the app secret.
- For Refresh token, click Authorize to populate the long-lived user access token. Alternatively, if you already have a refresh token or a system user token, you can enter it directly in this field. For information about retrieving a long-lived user access token, see Facebook Ads prerequisites.
In the Destination settings section, for Dataset, select the dataset that you created to store your data.
In the Transfer config name section, for Display name, enter a name for the data transfer.
In the Schedule options section, do the following:
- In the Repeat frequency list, select an option to specify how often this data transfer runs. To specify a custom repeat frequency, select Custom. If you select On-demand, then this transfer runs when you manually trigger the transfer.
- If applicable, select either Start now or Start at set time, and provide a start date and run time.
Optional: In the Service Account menu, select a service account from the service accounts that are associated with your Google Cloud project. The selected service account must have the required roles to run this data transfer.
If you signed in with a federated identity, then a service account is required to create a data transfer. If you signed in with a Google Account, then a service account for the data transfer is optional. For more information about using service accounts with data transfers, see Use service accounts.
Optional: In the Notification options section, do the following:
- To enable email notifications, click the Email notification toggle. When you enable this option, the transfer administrator receives an email notification when a transfer run fails.
- To enable Pub/Sub transfer run notifications for this data transfer, click the Pub/Sub notifications toggle. You can select your topic name, or you can click Create a topic to create one.
Click Save.
When this data transfer runs, the BigQuery Data Transfer Service automatically populates the following tables.
| Table name | Description |
|---|---|
| `AdAccounts` | The ad accounts available for a user. |
| `AdInsights` | Ad insights report for all ad accounts. |
| `AdInsightsActions` | Ad insights actions report for all ad accounts. |
bq
Enter the `bq mk` command and supply the transfer creation flag `--transfer_config`:

```
bq mk \
--transfer_config \
--project_id=PROJECT_ID \
--data_source=DATA_SOURCE \
--display_name=DISPLAY_NAME \
--target_dataset=DATASET \
--params='PARAMETERS'
```
Where:
- PROJECT_ID (optional): your Google Cloud project ID. If `--project_id` isn't supplied to specify a particular project, the default project is used.
- DATA_SOURCE: the data source (for example, `facebook_ads`).
- DISPLAY_NAME: the display name for the data transfer configuration. The transfer name can be any value that lets you identify the transfer if you need to modify it later.
- DATASET: the target dataset for the data transfer configuration.
- PARAMETERS: the parameters for the created data transfer configuration in JSON format. For example: `--params='{"param":"param_value"}'`. The following are the parameters for a Facebook Ads transfer:
  - `connector.authentication.oauth.clientId`: The app ID for the OAuth 2.0 client.
  - `connector.authentication.oauth.clientSecret`: The app secret for the OAuth 2.0 client.
  - `connector.authentication.oauth.refreshToken`: The long-lived user access token.
For example, the following command creates a Facebook Ads data transfer in the default project with all the required parameters:
```
bq mk \
--transfer_config \
--target_dataset=mydataset \
--data_source=facebook_ads \
--display_name='My Transfer' \
--params='{"connector.authentication.oauth.clientId": "1650000000", "connector.authentication.oauth.clientSecret":"TBA99550", "connector.authentication.oauth.refreshToken":"abcdef"}'
```
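Because `--params` takes a JSON string, shell quoting mistakes are a common failure mode. One way to avoid them is to generate the string with `json.dumps` and paste the result into the command; the credential values below are placeholders:

```python
import json

def facebook_ads_params(client_id: str, client_secret: str,
                        refresh_token: str) -> str:
    """Build the JSON string passed to the bq mk --params flag."""
    return json.dumps({
        "connector.authentication.oauth.clientId": client_id,
        "connector.authentication.oauth.clientSecret": client_secret,
        "connector.authentication.oauth.refreshToken": refresh_token,
    })

params = facebook_ads_params("1650000000", "TBA99550", "abcdef")
# Wrap the result in single quotes on the command line:
print(f"--params='{params}'")
```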
API
Use the `projects.locations.transferConfigs.create` method and supply an instance of the `TransferConfig` resource.
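A stdlib-only sketch of what that REST call looks like: it builds the URL and JSON body for `projects.locations.transferConfigs.create`. The project, location, dataset, and display name are placeholders, and an actual request additionally needs an `Authorization: Bearer` header with a valid OAuth token:

```python
import json

API_ROOT = "https://bigquerydatatransfer.googleapis.com/v1"

def create_transfer_config_request(project: str, location: str,
                                   dataset: str, params: dict) -> tuple[str, bytes]:
    """Build the URL and JSON body for a transferConfigs.create POST."""
    url = f"{API_ROOT}/projects/{project}/locations/{location}/transferConfigs"
    body = json.dumps({
        "destinationDatasetId": dataset,
        "displayName": "My Transfer",
        "dataSourceId": "facebook_ads",
        "params": params,
    }).encode("utf-8")
    return url, body  # POST body; add header Authorization: Bearer <token>

url, body = create_transfer_config_request(
    "my-project", "us", "mydataset",
    {"connector.authentication.oauth.clientId": "1650000000"})
print(url)
```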
Troubleshoot transfer configuration
If you are having issues setting up a Facebook Ads data transfer, try the following troubleshooting steps:
- Check whether your user access token has expired by using the Facebook Access Token Debugger. Long-lived user access tokens expire after 60 days. If your long-lived user access token has expired, navigate to your transfer details, then click Edit to modify your transfer configuration. On the edit transfer page, follow the steps in Facebook Ads prerequisites to generate a new one.
- Check that the long-lived user access token was generated with the required permissions: `ads_management`, `ads_read`, and `business_management`. If not, follow the steps in Facebook Ads prerequisites to generate a new long-lived user access token.
- Check the Required Actions tab on the Facebook App dashboard for any items that require attention.
You might encounter the following error messages related to Meta API rate limit errors:
- Error: `There have been too many calls from this ad-account. Wait a bit and try again.`

  Resolution: Check that there are no parallel workflows using the same apps or credentials. If these errors persist, try upgrading your permissions to Advanced Access to get more rate limiting quota. For more information, see Marketing API Rate Limiting.
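For your own workflows that call the Marketing API alongside the transfer, exponential backoff is the usual way to ride out these rate limits. A minimal sketch; `RateLimitError` is a stand-in for whatever "too many calls" error your client surfaces:

```python
import time

class RateLimitError(Exception):
    """Stand-in for a 'too many calls' error from the Meta API."""

def call_with_backoff(fn, max_attempts: int = 5, base_delay: float = 1.0,
                      sleep=time.sleep):
    """Retry fn with exponential backoff: wait 1s, 2s, 4s, ... between
    attempts, and re-raise if the last attempt still hits the limit."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise
            sleep(base_delay * 2 ** attempt)
```

The `sleep` parameter is injected so the delay policy can be swapped out or silenced in tests.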
Common monitoring metrics messages
You can also check the BigQuery Data Transfer Service monitoring metrics to determine the cause of a data transfer failure. The following table lists some common `ERROR_CODE` messages for Facebook Ads data transfers.
| Error | Description |
|---|---|
| `INVALID_ARGUMENT` | The supplied configuration is invalid. |
| `PERMISSION_DENIED` | The credentials are invalid. |
| `UNAUTHENTICATED` | Authentication is required. |
| `SERVICE_UNAVAILABLE` | The service is temporarily unable to handle this data transfer. |
| `DEADLINE_EXCEEDED` | The data transfer did not finish within the maximum duration of six hours. |
| `NOT_FOUND` | A requested resource was not found. |
| `INTERNAL` | Something else caused the connector to fail. |
| `RESOURCE_EXHAUSTED` | A data source quota or limit was exhausted. |
Pricing
There is no cost to transfer Facebook Ads data into BigQuery while this feature is in Preview.
What's next
- Learn more about the BigQuery Data Transfer Service.
- Learn more about working with transfers, such as viewing configurations and run history.
- Learn how to load data with cross-cloud operations.