Troubleshoot transfer configurations
This document is intended to help you troubleshoot the most common issues encountered when setting up a BigQuery Data Transfer Service transfer. This document does not encompass all possible error messages or issues.
If you are experiencing issues that are not covered in this document, you can request support.
Before contacting Cloud Customer Care, capture transfer configuration and transfer run details. For information on how to get these details, see Get information about a transfer and View transfer run details and log messages.
Examine errors
If your initial transfer run fails, you can examine the details in the run history. Errors listed in the run history can help you identify an appropriate resolution using this document.
You can also view error messages for a specific transfer job using the Logs Explorer. The following Logs Explorer filter returns information about a specific transfer configuration job, along with any error messages:
resource.type="bigquery_dts_config"
labels.run_id="RUN_ID"
resource.labels.config_id="CONFIG_ID"
Replace the following:
RUN_ID
: the ID number of a specific job run
CONFIG_ID
: the ID number of a transfer configuration job
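If you prefer the command line, the same filter can be run with the gcloud CLI. The following is a sketch: the filter string mirrors the Logs Explorer query above, `RUN_ID` and `CONFIG_ID` are placeholders, and the added `severity>=ERROR` clause (an addition, not part of the original filter) narrows the output to error entries:

```shell
# Read transfer-run log entries from the command line (sketch).
# Replace RUN_ID and CONFIG_ID with values from your transfer.
gcloud logging read '
  resource.type="bigquery_dts_config"
  labels.run_id="RUN_ID"
  resource.labels.config_id="CONFIG_ID"
  severity>=ERROR
' --limit=50 --format=json
```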
Before contacting Customer Care, capture any relevant information from the run history or Logs Explorer including any error messages.
General issues
When diagnosing general transfer issues, verify the following:
- You have completed all the steps in the "Before you begin" section of the documentation page for your transfer type.
- The transfer configuration properties are correct.
- The user account used to create the transfer has access to the underlying resources.
If your transfer configuration is correct, and the appropriate permissions are granted, refer to the following for solutions to commonly encountered issues.
- Error:
An unexpected issue was encountered. If this issue persists, please contact customer support.
- Resolution: This error typically indicates a temporary outage or an issue within BigQuery. Wait approximately 2 hours for the issue to be resolved. If the problem persists, request support.
- Error:
Quota Exceeded.
Resolution: Transfers are subject to BigQuery quotas on load jobs. If you need to increase your quota, contact your Google Cloud sales representative. For more information, see Quotas and limits.
If you are loading Cloud Billing exports to BigQuery, you can encounter the
Quota Exceeded
error. Both the Cloud Billing export tables and the destination BigQuery tables created by the BigQuery Data Transfer Service are partitioned. Choosing the overwrite option when setting up such transfer jobs can cause DML quota errors, depending on how much data is exported and how old the billing accounts are. For information about troubleshooting quotas, see Troubleshoot quota and limit errors.
- Error:
The caller does not have permission.
Resolution: Confirm that the signed-in account in the Google Cloud console is the same as the account you select for BigQuery Data Transfer Service when creating the transfer. Compare the account shown in the Google Cloud console with the account shown in the Choose an account to continue to BigQuery Data Transfer Service dialog.
- Error:
Access Denied: ... Permission bigquery.tables.get denied on table ...
Resolution: Confirm that the BigQuery Data Transfer Service service agent is granted the
bigquery.dataEditor
role on the target dataset. This grant is automatically applied when creating and updating the transfer, but it's possible that the access policy was modified manually afterwards. To regrant the permission, see Grant access to a dataset.
- Error:
region violates constraint constraints/gcp.resourceLocations on the resource projects/project_id
Resolution: This error occurs when a user tries to create a transfer configuration in a restricted location, as specified in the location restriction organization policy. You can resolve this issue by changing the organization policy to allow for the region, or by changing the transfer configuration to a destination dataset located in a region unrestricted by the organization policy.
- Error:
Please look into the errors[] collection for more details.
Resolution: This error can occur when a data transfer fails. For more information about why the data transfer failed, you can use Cloud Logging to view your logs. You can find logs for a specific run by searching using the transfer
run_id
.
Authorization and permission issues
The following are some common permission errors that you can encounter when you transfer data from different data sources:
- Error:
BigQuery Data Transfer Service is not enabled for <project_id>
- Error:
BigQuery Data Transfer Service has not been used in project <project_id> before or it is disabled ...
Resolution: Verify that the service agent role is granted with the following steps:
In the Google Cloud console, go to the IAM & Admin page.
Select the Include Google-provided role grants checkbox.
Verify that the service account with the name
service-<project_number>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com
is shown and that it has been granted the BigQuery Data Transfer Service Agent role.
If the service account is not shown, or it does not have the BigQuery Data Transfer Service service agent role granted, grant the predefined role in the Google Cloud console or by running the following Google Cloud CLI command:
gcloud projects add-iam-policy-binding PROJECT_NUMBER \
  --member serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com \
  --role roles/bigquerydatatransfer.serviceAgent
Replace
PROJECT_NUMBER
with the project number associated with this service account.
- Error:
There was an error loading this table. Check that the table exists and that you have the correct permissions.
Resolution:
In the Google Cloud console, go to the BigQuery page.
Click the destination dataset used in the transfer.
Click the Sharing menu, and then click Permissions.
Expand the BigQuery Data Editor role.
Verify that the BigQuery Data Transfer Service service agent is added to this role. If not, grant the BigQuery Data Editor (
roles/bigquery.dataEditor
) role to the BigQuery Data Transfer Service service agent.
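The same grant can be applied from the command line with the bq tool by editing the dataset's access policy. This is a sketch; `PROJECT_ID`, `PROJECT_NUMBER`, and `mydataset` are placeholders, and the dataset-level `WRITER` access entry corresponds to the BigQuery Data Editor role:

```shell
# Sketch: grant dataset-level write access to the DTS service agent.
# PROJECT_ID, PROJECT_NUMBER, and mydataset are placeholders.
bq show --format=prettyjson PROJECT_ID:mydataset > dataset.json

# Edit dataset.json and append an entry like the following to the
# "access" array, then save the file:
#   {
#     "role": "WRITER",
#     "userByEmail": "service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com"
#   }

bq update --source dataset.json PROJECT_ID:mydataset
```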
- Error:
A permission denied error was encountered: PERMISSION_DENIED. Please ensure that the user account setting up the transfer config has the necessary permissions, and that the configuration settings are correct
Resolution:
In the Google Cloud console, go to the Data Transfers page.
Click the failed transfer, then select the Configuration tab.
Verify that the transfer owner listed in the User field has all the required permissions for the data source.
If the transfer owner does not have all the required permissions, grant the required permissions by updating their credentials. You can also change the transfer owner to another user with the required permissions.
- Error:
Authentication failure: User Id not found. Error code: INVALID_USERID
Resolution: The transfer owner has an invalid user ID. Change the transfer owner to a different user by updating their credentials. If you are using a service account, you should also verify that the accounts running the data transfer have all the required permissions to use a service account.
- Error:
The user does not have permission
Resolution: Verify that the transfer owner is a service account, and that the service account has all the required permissions. Another possibility is that the service account was created in a different project than the project used to create this transfer. To resolve cross-project permission issues, see the following resources:
- Enable service accounts to be attached across projects
- Cross-project Service Account Authorization (for granting the necessary permissions)
- Error:
HttpError 403 when requesting returned "The caller does not have permission"
googleapiclient.errors.HttpError: <HttpError 403 when requesting returned "The caller does not have permission". Details: "The caller does not have permission">
This error might appear when you attempt to set up a scheduled query with a service account.
Resolution: Ensure that the service account has all the permissions required to schedule or modify a scheduled query, and ensure that the user setting up the scheduled query has access to the service account.
If the correct permissions are all assigned but you still encounter the error, check if the Disable Cross-Project Service Account Usage policy is enforced on the project by default. You can check for the policy in the Google Cloud console by navigating to IAM & Admin > Organization Policies and searching for the policy.
If the Disable Cross-Project Service Account Usage policy is enforced, you can disable the policy by doing the following:
- Identify the service accounts associated with the project using the Google Cloud console, by navigating to IAM & Admin > Service Accounts. This view displays all service accounts for the current project.
- Disable the policy in the project where the service accounts are located using the following command. To disable this policy, the user must be an Organization Policy Administrator. Only the Organization Administrator can grant a user this role.
gcloud resource-manager org-policies disable-enforce iam.disableCrossProjectServiceAccountUsage --project=[PROJECT-ID]
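Before disabling enforcement, you can check whether the constraint is actually enforced on the project. The following is a sketch assuming the gcloud CLI; `PROJECT_ID` is a placeholder:

```shell
# Sketch: inspect the effective policy for the constraint on a project.
gcloud resource-manager org-policies describe \
  iam.disableCrossProjectServiceAccountUsage \
  --project=PROJECT_ID --effective
```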
Amazon S3 transfer issues
The following are common errors encountered when creating an Amazon S3 transfer.
Amazon S3 PERMISSION_DENIED errors
- Error:
The AWS Access Key Id you provided does not exist in our records.
- Resolution: Verify that the access key exists and the ID is correct.
- Error:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
- Resolution: Verify that the transfer configuration has the correct corresponding Secret Access Key.
- Error:
Failed to obtain the location of the source S3 bucket. Additional details: Access Denied
- Error:
Failed to obtain the location of the source S3 bucket. Additional details: HTTP/1.1 403 Forbidden
- Error:
Access Denied
(S3 error message)
- Resolution: Ensure that the AWS IAM user has permission to perform the following:
- List the Amazon S3 bucket.
- Get the location of the bucket.
- Read the objects in the bucket.
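These three capabilities correspond to the `s3:ListBucket`, `s3:GetBucketLocation`, and `s3:GetObject` actions. As a hedged sketch (the bucket name, user name, and policy name are placeholders), an inline policy granting them could be attached with the AWS CLI as follows:

```shell
# Sketch: attach an inline policy granting the minimum S3 actions the
# transfer needs. my-bucket, transfer-user, and bq-dts-s3-read are placeholders.
aws iam put-user-policy \
  --user-name transfer-user \
  --policy-name bq-dts-s3-read \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-bucket"
    }, {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }]
  }'
```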
- Error:
Server unable to initialize object upload.; InvalidObjectState: The operation is not valid for the object's storage class
- Error:
Failed to obtain the location of the source S3 bucket. Additional details: All access to this object has been disabled
- Resolution: Restore any objects that are archived to Amazon Glacier. Objects in Amazon S3 that are archived to Amazon Glacier are not accessible until they are restored.
- Error:
All access to this object has been disabled
- Resolution: Confirm that the Amazon S3 URI in the transfer configuration is correct.
Amazon S3 transfer limit errors
- Error:
Number of files in transfer exceeds limit of 10,000.
- Resolution: Evaluate if the number of wildcards in the Amazon S3 URI can be reduced to just one. If this is possible, retry with a new transfer configuration, as the maximum number of files per transfer run will be higher. You can also evaluate if the transfer configuration can be split into multiple transfer configurations, each transferring a portion of the source data.
- Error:
Size of files in transfer exceeds limit of 16492674416640 bytes.
- Resolution: Evaluate if the transfer configuration can be split into multiple transfer configurations, each transferring a portion of the source data.
General Amazon S3 issues
- Issue: Files are transferred from Amazon S3 but not loaded into BigQuery.
The transfer logs may look similar to the following:
Moving data from Amazon S3 to Google Cloud complete: Moved N object(s). No new files found matching Amazon_S3_URI.
Resolution: Confirm that the Amazon S3 URI in the transfer configuration is correct. If the transfer configuration was meant to load all files with a common prefix, ensure that the Amazon S3 URI ends with a wildcard. For example, to load all files in
s3://my-bucket/my-folder/
, the Amazon S3 URI in the transfer configuration must be
s3://my-bucket/my-folder/*
, not just
s3://my-bucket/my-folder/
.
Azure Blob Storage transfer issues
The following are common errors encountered when creating a Blob Storage transfer.
- Error:
Number of files in transfer exceeds the limit of 10,000.
- Resolution: Reduce the number of wildcards in the Blob Storage data path to 0 or 1, which raises the file limit to 10,000,000 per transfer run. You can also split the transfer into multiple transfer configurations, each transferring a portion of the source data.
- Error:
Size of files in transfer exceeds the limit of 15 TB.
- Resolution: Split into multiple transfer configurations, each transferring a portion of the source data.
- Error:
Provided Azure SAS Token does not have required permissions.
- Resolution: Verify that the Azure SAS token in the transfer configuration is correct. For more information, see Shared access signature (SAS).
- Error:
Transfer encountered error, status:PERMISSION_DENIED, details:[This request is not authorized to perform this operation.]
- Resolution: Verify that the IP ranges used by BigQuery Data Transfer Service workers are added to your list of allowed IPs. For more information, see IP restrictions.
- Issue: Files are transferred from Blob Storage but not loaded into BigQuery.
The transfer logs might look similar to the following:
Moving data to Google Cloud complete: Moved N object(s). No new files found matching Blob Storage data path.
Resolution: Verify that the Blob Storage data path in the transfer configuration is correct.
Campaign Manager transfer issues
The following are common errors encountered when creating a Campaign Manager transfer.
- Error:
Import failed - no data was available for import. Please verify that data existence was expected.
- Error:
No data available for the requested date. Please try an earlier run date or verify that data existence was expected.
Resolution: Verify you are using the correct ID for the transfer. If you are using the correct ID, verify the Campaign Manager Cloud Storage bucket contains Data Transfer V2.0 files for the specified date range. If the files exist, schedule a backfill for the affected date range. For more information on creating a Campaign Manager backfill request, see set up a backfill.
You can verify whether the files existed when the transfer run was scheduled by checking the created time of the files in the Cloud Storage bucket. In some cases, the first transfer run of the day may be scheduled prior to the generation of the first batch of Campaign Manager Data Transfer files. Subsequent runs on the same day and the following day will load all the files generated by Campaign Manager.
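A backfill can also be scheduled from the command line with the bq tool. The following is a sketch; the transfer configuration resource name, location, and timestamps are placeholders you replace with your own values:

```shell
# Sketch: schedule a backfill for a date range on an existing transfer
# configuration. The resource name and timestamps are placeholders.
bq mk --transfer_run \
  --start_time='2024-05-01T00:00:00Z' \
  --end_time='2024-05-03T00:00:00Z' \
  projects/PROJECT_NUMBER/locations/us/transferConfigs/CONFIG_ID
```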
- Error:
A permission denied error was encountered: PERMISSION_DENIED. Please ensure that the user account setting up the transfer config has the necessary permissions, and that the configuration settings are correct.
Resolution: The user creating the Campaign Manager transfer must have read access to the Cloud Storage bucket containing the Data Transfer V2.0 files. You can obtain information about the Cloud Storage bucket and request access from your Campaign Manager administrator.
Google Ads transfer issues
The following are common errors encountered when creating a Google Ads transfer.
- Error:
Import failed - no data was available for import. Please verify that data existence was expected.
- Error:
No data available for the requested date. Please try an earlier run date or verify that data existence was expected.
- Resolution: If you receive this error when you are creating a Google Ads transfer, request support and include a screen capture of the error message.
- Error:
AuthenticationError.NOT_ADS_USER.
- Resolution: The user setting up the Google Ads transfer must have a Google Ads account/login.
- Error:
ERROR_GETTING_RESPONSE_FROM_BACKEND.
- Resolution: If a Google Ads transfer run fails and returns
ERROR_GETTING_RESPONSE_FROM_BACKEND
, enable the Exclude Removed/Disabled Items option in the transfer configuration and set up a backfill to attempt to retrieve data for the days impacted by the failed transfer run.
- Warning:
Data for the report ClickStats was not available for the specified date.
- Error:
INVALID_DATE_RANGE_FOR_REPORT.
- Resolution: This is expected when backfilling Click Performance Report data more than 90 days in the past. In this case, you see the preceding warning or error, and the
ClickStats
table is not updated for the specified date.
- Error:
Error while processing report for table table_name for account id account_id. Http(400) Bad Request;
- Error:
AuthorizationError.TWO_STEP_VERIFICATION_NOT_ENROLLED
- Resolution: If the user account associated with this transfer does not have 2-step verification (or multi-factor authentication) enabled, enable 2-step verification for this account and then rerun the failed transfer job. Service accounts are exempted from the 2-step verification requirement.
- Error:
Quota exceeded: Your project exceeded quota for imports per project
- Resolution: Transfers are subject to BigQuery quotas on load jobs. If you hit the load job quota, reduce unnecessary loads by using table_filter, deleting unused transfer configurations, or reducing the refresh window. If you need to increase your quota, contact your Google Cloud sales representative. For more information, see Quotas and limits.
Google Ad Manager transfer issues
The following are common errors encountered when creating a Google Ad Manager transfer.
- Error:
No data available for the requested date. Please try an earlier run date or verify that data existence was expected.
- Error:
Import failed - no data was available for import. Please verify that data existence was expected.
Resolution: Verify the Google Ad Manager Cloud Storage bucket contains data transfer files for the specified date range. Your Google Ad Manager administrator manages the Cloud Storage bucket containing your Data Transfer files. Users creating Google Ad Manager transfers must be members of the Google Group with read access to the bucket.
You can verify Cloud Storage permissions by attempting to read files in the Google Ad Manager Data Transfer bucket. For more information on Google Ad Manager Cloud Storage buckets, see Access Google Ad Manager storage buckets.
You can verify whether the files existed when the transfer run was scheduled by checking the created time of the files in the Cloud Storage bucket. In some cases, the first transfer run of the day may be scheduled prior to the generation of the first batch of Google Ad Manager Data Transfer files. Subsequent runs on the same day and the following day will load all the files generated by Google Ad Manager.
If the files exist in the Data Transfer bucket and you have read permissions, schedule a backfill for the affected date range. For more information on creating a Google Ad Manager backfill request, see Set up a backfill.
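You can check both read access and the files' creation times in one step by listing the bucket with timestamps. This sketch assumes the gcloud CLI; the bucket path is a placeholder that your Google Ad Manager administrator can confirm:

```shell
# Sketch: verify read access and inspect creation times of files in the
# Ad Manager Data Transfer bucket. The bucket path is a placeholder.
gcloud storage ls --long gs://gdfp-NETWORK_CODE/*
```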
- Error:
AuthenticationError: NO_NETWORKS_TO_ACCESS.
Resolution: Ensure you have read access to the Google Ad Manager network. If you need assistance determining network access, contact Google Ad Manager support.
- Error:
Error code 9 : Field field_name?field_name?field_name?RefererURL is unknown.; Table: table_name
Resolution: Ensure you are not using the thorn (þ) delimiter. The thorn delimiter is currently unsupported. Use of the thorn is indicated by the ? in the error message.
- Error:
Incompatible table partitioning specification. Destination table exists with partitioning specification interval(type:Day,field:) clustering
Resolution: The Google Ad Manager connector does not support transferring data to a dataset that uses clustering. Use a dataset without clustering instead.
Google Merchant Center transfer issues
The following are common errors encountered when creating a Google Merchant Center transfer.
- Error:
No data to transfer found for the Merchant account. If you have just created this transfer - you may need to wait for up to 90 minutes before the data of your Merchant account are prepared and available for the transfer.
- Resolution: You receive this error if you set up a transfer using the default starting date and time in the Schedule section. If you use the default scheduling values, the first transfer run starts immediately after the transfer is created, but it fails because your Merchant account data must be prepared before it can be transferred. Wait 90 minutes and then set up a backfill for today, or you can wait until tomorrow for the next scheduled run.
- Error:
No data to transfer found for Merchant account. This can be because your account currently doesn't have any products.
- Resolution: This error indicates that your Merchant account has no products. The transfer will begin working once you add products to your Merchant account.
- Error:
Transfer user doesn't have access to the Merchant account. Please verify access in the Users section of the Google Merchant Center.
- Resolution: This error indicates that the user who set up the transfer doesn't have access to the Merchant account used by the transfer. To resolve the issue, verify and grant missing account access in the Google Merchant Center.
- Error:
Transfer user doesn't have user roles that allows access to the product data of the Merchant account. Please verify access and roles in the Users section of the Google Merchant Center.
- Resolution: This error indicates that the user who set up the transfer doesn't have access to the product data of the Merchant account used by the transfer. To resolve the issue, verify and grant missing user roles in the Google Merchant Center.
- Error:
Historical backfills are currently not supported.
- Resolution: You receive this error if you set up a backfill for previous days. This error is expected. Historical backfills are not currently supported. You can only set up a backfill for today to refresh today's data after the scheduled daily run has already finished.
Google Play transfer issues
The following are common errors encountered when creating a Google Play transfer.
- Error:
No jobs to start for run
- Resolution: Verify that the user has sufficient permissions to initiate a Google Play transfer, then verify if the user has specified the correct Cloud Storage bucket when setting up the transfer. If the user is still experiencing the error, you can request support.
- Issue: Revenue and financial reports did not load into BigQuery
- Resolution: To access financial Google Play reports, users must have the
View financial data
permission. To manage developer account permissions, see Add developer account users and manage permissions.
Oracle transfer issues
The following are common issues you might encounter when creating an Oracle transfer.
- Error:
PERMISSION_DENIED. ORA-01017: invalid username/password; logon denied
- Resolution: Check that the provided Oracle credentials are valid.
- Error:
PERMISSION_DENIED. ORA-01045: user lacks CREATE SESSION privilege; logon denied
- Resolution: Grant the
CREATE SESSION
system privilege to the Oracle user. For more information about granting Oracle privileges, see GRANT.
- Error:
SERVICE_UNAVAILABLE. ORA-12541: Cannot connect. No listener at host HOSTNAME port PORT
or
SERVICE_UNAVAILABLE. Connection failed: IO Error. The Network Adapter could not establish the connection
- Resolution: Check that the provided Hostname and Port details are correct, and that the network attachment is set up correctly.
- Error:
NOT_FOUND. ORA-00942: table or view does not exist
Resolution: Check each of the following:
- The spelling of the table or view name is correct.
- The referenced table or view name exists.
- The synonym points to an existing table or view. If the table or view does exist, ensure that the correct access privileges are granted to the database user requiring access to the table. Otherwise, create the table.
- If you are attempting to access a table or view in another schema, make sure that the correct schema is referenced and that access to the object is granted.
- Error:
NOT_FOUND. Schema schema does not exist.
- Resolution: Verify that the specified schema name is spelled correctly and that the schema exists in the source database.
- Error:
DEADLINE_EXCEEDED
- Resolution: The transfer run did not finish within the six hour maximum duration deadline. Reduce your transfer run times by splitting up large transfers into multiple smaller ones.
- Error:
INTERNAL
- Resolution: Something else caused the transfer to fail. Contact Cloud Customer Care for help resolving this issue.
- Error:
SQL Error [1950] [42000]: ORA-01950: no privileges on tablespace 'TablespaceName'
- Resolution: Assign the default tablespace to the user. For more information, see Assigning a Default Tablespace.
- Error:
403 PERMISSION_DENIED. Required 'compute.subnetworks.use' permission for project
Resolution: This error can occur if your network attachment is located in a different project than the transfer configuration. To remedy this issue, you must grant the service account (for example,
service-customer_project_number@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com
) the following permissions in the project where the network attachment is located:
- compute.networkAttachments.get
- compute.networkAttachments.update
- compute.subnetworks.use
- compute.regionOperations.get
This error can also occur if the network attachment is trying to connect to a Virtual Private Cloud (VPC) that is located in a different project, such as a Shared VPC. In this case, you must grant the service account (for example,
service-customer_project_number@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com
) the
compute.subnetworks.use
permission on the host project of the Shared VPC.
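Because no predefined role bundles exactly these permissions, one hedged approach is to create a custom role in the network attachment's project and grant it to the service account. The role ID and project placeholders below are illustrative, not fixed names:

```shell
# Sketch: create a custom role with the required permissions and grant it
# to the DTS service account in the network attachment's project.
# ATTACHMENT_PROJECT_ID, CUSTOMER_PROJECT_NUMBER, and the role ID are placeholders.
gcloud iam roles create dtsNetworkAttachmentUser \
  --project=ATTACHMENT_PROJECT_ID \
  --title="DTS network attachment user" \
  --permissions=compute.networkAttachments.get,compute.networkAttachments.update,compute.subnetworks.use,compute.regionOperations.get

gcloud projects add-iam-policy-binding ATTACHMENT_PROJECT_ID \
  --member=serviceAccount:service-CUSTOMER_PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com \
  --role=projects/ATTACHMENT_PROJECT_ID/roles/dtsNetworkAttachmentUser
```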
Salesforce transfer issues
The following are common errors encountered when creating a Salesforce transfer.
- Error:
Permission Denied: invalid_client. invalid client credentials
- Resolution: Verify that the ClientSecret provided is valid.
- Error:
Permission Denied: invalid_client. client identifier invalid
- Resolution: Verify that the ClientId provided is valid.
- Error:
Permission Denied: Error encountered while establishing connection
- Resolution: Check if the Salesforce MyDomain Name provided is correct.
- Error:
NOT_FOUND. asset type asset_name is not supported. If you are attempting to use a custom object, be sure to append the "__c" after the entity name. Please reference your WSDL or use the describe call for the appropriate names.
- Resolution: Follow the guidance in the error code, and verify that the provided asset name is correct.
- Error:
SERVICE_UNAVAILABLE
- Resolution: The service is temporarily unable to handle the request. Wait a few minutes and try the operation again.
- Error:
DEADLINE_EXCEEDED
- Resolution: The transfer run did not finish within the six hour maximum duration deadline. Minimize your transfer run times by splitting up large transfers into multiple smaller ones.
- Error:
Failed to create recordReader to read partition : Batch failed. BatchId='batch_id', Reason='FeatureNotEnabled : Binary field not supported'
- Resolution: The connector doesn't support sObject data structures that contain binary fields. Remove sObject data structures that contain binary fields from your transfer jobs. For more information, see Error 'Batch failed: FeatureNotEnabled: Binary field not supported' when you export related object in the Salesforce documentation.
- Error:
RESOURCE_EXHAUSTED: PrepareQuery failed : ExceededQuota : ApiBatchItems Limit exceeded
- Resolution: This error appears when you have exceeded the daily
ApiBatchItems
API limit for job runs. Salesforce has a daily API limit that is reset every 24 hours. To resolve this error, we recommend splitting up and scheduling your transfer runs so that they don't exceed the daily batch API limit. You can also contact Salesforce support to increase your daily limit.
Teradata transfer issues
The following are common issues you might encounter when creating a Teradata transfer.
- Error:
Skipping extraction since table does not have change tracking column.
Resolution: The preceding message might appear when you attempt to perform a Teradata transfer on an already migrated table using an existing on-demand transfer config. If you want to start a new transfer on an already migrated table, create a new transfer config with the On-demand settings applied.
When you repeat a transfer using an on-demand transfer config, the BigQuery Data Transfer Service attempts to run it as an incremental transfer, but skips the table since the transfer config does not have the correct incremental settings applied. For more information about the different types of transfers, see On-demand or incremental transfers.
- Issue: Transfer of
CHAR(N)
data types adds spaces up to N characters for shorter strings.
Resolution: Convert
CHAR
data to
VARCHAR
and remove the extra spaces at the source. This issue occurs because
CHAR
is a fixed-length string type, whereas
VARCHAR
should be used for variable-length strings. You can also remove the spaces in BigQuery after the migration by using the
RTRIM
function. The query to use the
RTRIM
function is similar to the following example:
UPDATE migrated_table SET migrated_char_column = RTRIM(migrated_char_column) WHERE true;
YouTube transfer issues
The following are common errors encountered when creating a YouTube transfer.
- Error:
Import failed - no data was available for import. Please verify that data existence was expected.
- Error:
No data available for requested date. Please try an earlier run date or verify that data existence was expected.
Resolution: If you have not previously created YouTube reporting jobs, allow at least 2 days for YouTube to generate the reports after the BigQuery Data Transfer Service creates the reporting jobs on your behalf. No additional action is required. Transfers fail for the first 2 days and should succeed on day 3. If you have previously created YouTube reporting jobs, confirm that the user creating the transfers has read access to the reports.
Also, verify the transfer was set up for the correct account. In the OAuth dialog, you must select the channel for which you would like to load data.
- Error:
No reports for reporting job with name name.
Resolution: This is not an error. It is a warning that indicates no data was found for the specified report. You can ignore this warning. Future transfers will continue to run.
- Issue: The resulting tables created by the transfer are incomplete, or the results are unexpected.
- Resolution: If you have multiple accounts, you must choose the correct account when you receive the YouTube permissions dialog.
- Issue: Data doesn't match between YouTube Analytics and BigQuery YouTube Transfers.
Background: BigQuery YouTube transfers use the YouTube Reporting API to ingest data directly into a BigQuery dataset. The YouTube Analytics dashboard, on the other hand, pulls data by using the YouTube Analytics API. Treat the numbers produced by the Reporting API as final, and the numbers visible in the YouTube Analytics dashboard and API as estimates. Some degree of discrepancy between the two APIs is expected.
Resolution: If the reported numbers are incorrect, both YouTube's system and the BigQuery Data Transfer Service YouTube transfer are set up to backfill missing numbers and make them available in newly generated reports for the backfilled days. Because a BigQuery Data Transfer Service YouTube configuration loads all available reports created by the YouTube Reporting API, future transfer runs automatically pick up the newly generated and updated data and ingest it into the correct date-partitioned table.
YouTube permissions issues
For YouTube Content Manager reports, the user setting up the transfer must have
CMS user
permissions (at a minimum).
CMS user
permissions must be granted to each content manager for whom you are creating a transfer.
Quota issues
- Error:
Quota exceeded: Your project exceeded quota for imports per project.
- Resolution: Verify you have not scheduled too many transfers in your project. For information on calculating the number of load jobs initiated by a transfer, see Quotas and limits.