
Use service accounts

Some data sources support data transfer authentication with a service account through the API or the bq command-line tool. A service account is a Google Account associated with your Google Cloud project. A service account can run jobs, such as scheduled queries or batch processing pipelines, by authenticating with the service account's credentials rather than a user's credentials.

You can update an existing data transfer with the credentials of a service account. For more information, see Update data transfer credentials.

The following situations require updating credentials:

  • Your transfer failed to authorize the user's access to the data source:

    Error code 401 : Request is missing required authentication credential. UNAUTHENTICATED

  • You receive an INVALID_USER error when you attempt to run the transfer:

    Error code 5 : Authentication failure: User Id not found. Error code: INVALID_USERID
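
If you encounter either error, you can inspect the transfer configuration, including the credentials it currently runs under, with the bq command-line tool. The resource name below is a placeholder; substitute your own project number and config ID:

```shell
# List transfer configurations in a location to find the resource name.
bq ls --transfer_config --transfer_location=us

# Show a transfer configuration in full, including its owner and state.
# PROJECT_NUMBER and CONFIG_ID are placeholders.
bq show --format=prettyjson \
    --transfer_config \
    projects/PROJECT_NUMBER/locations/us/transferConfigs/CONFIG_ID
```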

To learn more about authenticating with service accounts, see Introduction to authentication.

Data sources with service account support

BigQuery Data Transfer Service can use service account credentials for transfers with the following:

Before you begin

  • Verify that you have completed all actions required in Enabling BigQuery Data Transfer Service.
  • Grant Identity and Access Management (IAM) roles that give users the necessary permissions to perform each task in this document.

Required permissions

Ensure that the person updating the transfer has the following required permissions:

  • BigQuery:

    • The bigquery.transfers.update permission to modify the transfer.

    The predefined roles/bigquery.admin IAM role includes the permissions that you need to modify a data transfer.

  • Service Account:

    • To update a data transfer to be run by a service account, you must have access to that service account. For more information on granting users the Service Account User role, see Service account user role.
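
As one way to grant that access, you can bind the Service Account User role on the service account itself with gcloud. The service account and user emails below are placeholders:

```shell
# Allow a user to attach this service account to resources such as
# transfer configurations. Both email addresses are placeholders.
gcloud iam service-accounts add-iam-policy-binding \
    abcdef-test-sa@abcdef-test.iam.gserviceaccount.com \
    --member="user:alice@example.com" \
    --role="roles/iam.serviceAccountUser"
```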

Ensure that the service account chosen to run the transfer has the following required permissions:

  • BigQuery:

    • Both bigquery.datasets.get and bigquery.datasets.update permissions on the target dataset.

    The bigquery.admin predefined IAM role includes bigquery.datasets.update and bigquery.datasets.get permissions. For more information on IAM roles in BigQuery Data Transfer Service, see Access control.

  • Data sources:

    • The service account you choose to run the data transfer requires access to the configured transfer data source. For the permissions that each data source requires, see Data sources with service account support.
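
One way to satisfy the BigQuery permissions above is to grant the service account the predefined BigQuery Admin role on the project. This is broader than the two dataset-level permissions strictly required, so a narrower, dataset-scoped grant also works. The project ID and service account email are placeholders:

```shell
# Grant the BigQuery Admin role to the service account at the project
# level. Project ID and service account email are placeholders.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:abcdef-test-sa@abcdef-test.iam.gserviceaccount.com" \
    --role="roles/bigquery.admin"
```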

Update data transfer credentials

bq

To update the credentials of a data transfer with the bq command-line tool, use the bq update command with the --transfer_config, --update_credentials, and --service_account_name flags.

For example, the following command updates a data transfer configuration to authenticate as a service account instead of your individual user account:

bq update \
--transfer_config \
--update_credentials \
--service_account_name=abcdef-test-sa@abcdef-test.iam.gserviceaccount.com \
projects/862514376110/locations/us/transferConfigs/5dd12f26-0000-262f-bc38-089e0820fe38

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

import com.google.api.gax.rpc.ApiException;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.cloud.bigquery.datatransfer.v1.UpdateTransferConfigRequest;
import com.google.protobuf.FieldMask;
import com.google.protobuf.util.FieldMaskUtil;
import java.io.IOException;

// Sample to update credentials in transfer config.
public class UpdateCredentials {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String configId = "MY_CONFIG_ID";
    String serviceAccount = "MY_SERVICE_ACCOUNT";
    TransferConfig transferConfig = TransferConfig.newBuilder().setName(configId).build();
    FieldMask updateMask = FieldMaskUtil.fromString("service_account_name");
    updateCredentials(transferConfig, serviceAccount, updateMask);
  }

  public static void updateCredentials(
      TransferConfig transferConfig, String serviceAccount, FieldMask updateMask)
      throws IOException {
    try (DataTransferServiceClient dataTransferServiceClient = DataTransferServiceClient.create()) {
      UpdateTransferConfigRequest request =
          UpdateTransferConfigRequest.newBuilder()
              .setTransferConfig(transferConfig)
              .setUpdateMask(updateMask)
              .setServiceAccountName(serviceAccount)
              .build();
      dataTransferServiceClient.updateTransferConfig(request);
      System.out.println("Credentials updated successfully");
    } catch (ApiException ex) {
      System.out.println("Credentials were not updated: " + ex.toString());
    }
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

service_account_name = "abcdef-test-sa@abcdef-test.iam.gserviceaccount.com"
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"

transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name)

transfer_config = transfer_client.update_transfer_config(
    {
        "transfer_config": transfer_config,
        "update_mask": field_mask_pb2.FieldMask(paths=["service_account_name"]),
        "service_account_name": service_account_name,
    }
)

print("Updated config: '{}'".format(transfer_config.name))