This page explains how to set up access to the data source and data sink for a data transfer using Storage Transfer Service.
Storage Transfer Service uses a Google-managed service account to move your data. If you create a transfer from the Google Cloud Console and have permission to update IAM policies for Cloud Storage resources, the console automatically grants that service account the permissions the transfer requires.
Access to non-Google Cloud data sources and transfers created using the Storage Transfer Service API requires additional setup.
Service account permissions are granted at the bucket level. You must have the ability to grant these permissions, such as having the Storage Admin role. For more information, see Identity and Access Management.
If you plan to use Pub/Sub for transfers, grant the service account the required IAM role for the desired Pub/Sub topic. There may be a delay of several seconds between assigning the role and having it applied to your service account. If you grant this permission programmatically, wait 30 seconds before configuring Storage Transfer Service.
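The programmatic grant described above follows the usual read-modify-write IAM pattern. The sketch below shows the policy-modification step on a plain Python dict shaped like the policy returned by the Pub/Sub `topics.getIamPolicy` method; the role name, project number, and service-account address are placeholders, and the final `setIamPolicy` call and 30-second wait are only indicated in comments:

```python
def grant_topic_role(policy: dict, member: str, role: str) -> dict:
    """Add an IAM binding for `member` to a topic policy dict
    (same shape as the policy returned by topics.getIamPolicy)."""
    binding = next((b for b in policy.setdefault("bindings", [])
                    if b["role"] == role), None)
    if binding is None:
        binding = {"role": role, "members": []}
        policy["bindings"].append(binding)
    if member not in binding["members"]:
        binding["members"].append(member)
    return policy

# Placeholder values: use the address from the accountEmail field and the
# role your topic actually requires.
policy = {"bindings": []}
grant_topic_role(
    policy,
    "serviceAccount:project-123@storage-transfer-service.iam.gserviceaccount.com",
    "roles/pubsub.publisher",
)
# After writing this policy back with topics.setIamPolicy, wait about
# 30 seconds (for example, time.sleep(30)) before configuring
# Storage Transfer Service.
```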
If Cloud Key Management Service is enabled on your Cloud Storage source or destination buckets, check that the quotas listed for Cloud KMS in your project's Quotas page are compatible with Storage Transfer Service's Read quotas and Write quotas. If they aren't, request a quota increase from your project's Quotas page.
Setting up access to the data source
To set up Storage Transfer Service to use Cloud Storage as a data source, assign the following roles, or equivalent permissions, to the Google-managed service account listed in the accountEmail field for the source bucket:
| Role | What it does | Notes |
| --- | --- | --- |
| Storage Object Viewer (`roles/storage.objectViewer`) | Enables the service account to read the bucket's contents, and read object data and metadata. | |
| Storage Legacy Bucket Reader (`roles/storage.legacyBucketReader`) | Enables the service account to read a bucket's contents and its metadata, and read object metadata. | Assign this role if you don't intend to delete source objects from Cloud Storage. |
| Storage Legacy Bucket Writer (`roles/storage.legacyBucketWriter`) | Enables the service account to create, overwrite, and delete objects; list objects in a bucket; read object metadata when listing; and read bucket metadata, excluding IAM policies. | Assign this role if you intend to delete source objects from Cloud Storage. |
For advanced data transfers, see IAM permissions for Storage Transfer Service.
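A small helper can make the role choice above explicit in automation. This is a sketch assuming the standard Cloud Storage role names; `required_source_roles` is a hypothetical function, not part of any Google API:

```python
def required_source_roles(delete_source_objects: bool) -> list[str]:
    """Roles to grant the Google-managed service account on the
    source bucket, depending on whether source objects are deleted
    after the transfer."""
    roles = ["roles/storage.objectViewer"]
    if delete_source_objects:
        # Deleting source objects requires write access to the bucket.
        roles.append("roles/storage.legacyBucketWriter")
    else:
        # Read-only transfers only need to list the bucket.
        roles.append("roles/storage.legacyBucketReader")
    return roles

print(required_source_roles(delete_source_objects=False))
```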
Amazon S3
Follow these steps to set up access to an Amazon S3 bucket:
Create an AWS Identity and Access Management (AWS IAM) user with a name that you can easily recognize, such as transfer-user. Ensure that the name follows the AWS IAM user name guidelines (see Limitations on IAM Entities and Objects).
Give the AWS IAM user the ability to do the following:
- List the Amazon S3 bucket.
- Get the location of the bucket.
- Read the objects in the bucket.
- If you plan to delete objects from the source after they are transferred, grant the user Delete objects permissions.
- Restore any objects that are archived to Amazon Glacier. Objects in Amazon S3 that are archived to Amazon Glacier are not accessible until they are restored. For more information, see the Migrating to Cloud Storage From Amazon Glacier White Paper.
Create at least one access/secret key pair for the transfer job that you plan to set up. You can also create a separate access/secret key pair for each transfer job.
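The S3 permissions listed above can be expressed as an AWS IAM policy document. The sketch below builds one in Python; the bucket name is a placeholder, and the `s3:*` action names are the standard AWS actions for listing a bucket, getting its location, and reading or deleting objects:

```python
import json

bucket = "my-source-bucket"  # placeholder bucket name

# Actions matching the steps above: list the bucket, get its
# location, and read objects.
actions = ["s3:ListBucket", "s3:GetBucketLocation", "s3:GetObject"]

delete_source_objects = True  # set if objects are deleted after transfer
if delete_source_objects:
    actions.append("s3:DeleteObject")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": actions,
        # Bucket-level actions apply to the bucket ARN; object-level
        # actions apply to objects under it.
        "Resource": [f"arn:aws:s3:::{bucket}",
                     f"arn:aws:s3:::{bucket}/*"],
    }],
}
print(json.dumps(policy, indent=2))
```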
Microsoft Azure Blob Storage
Follow these steps to configure access to a Microsoft Azure Storage container:
- Create or use an existing Microsoft Azure Storage user to access the storage account for your Microsoft Azure Storage Blob container.
Create a SAS token at the container level. See Grant limited access to Azure Storage resources using shared access signatures.
The default expiration time for SAS tokens is 8 hours. When you create your SAS token, set an expiration time that gives your transfer enough time to complete.
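Azure SAS tokens carry their expiration in the `se` (signed expiry) query parameter, so you can sanity-check a token's remaining lifetime before starting a transfer. A minimal sketch, with a made-up token whose signature is redacted:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import parse_qs

def sas_expiry(sas_token: str) -> datetime:
    """Parse the signed-expiry (`se`) field of an Azure SAS token."""
    se = parse_qs(sas_token.lstrip("?"))["se"][0]
    return datetime.fromisoformat(se.replace("Z", "+00:00"))

# Placeholder token: real tokens include a valid `sig` signature.
token = "sv=2020-08-04&ss=b&srt=co&sp=rl&se=2031-01-01T00:00:00Z&sig=REDACTED"

remaining = sas_expiry(token) - datetime.now(timezone.utc)
# Require more headroom than the 8-hour default before starting.
assert remaining > timedelta(hours=8), "token may expire before the transfer finishes"
```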
If your data source is a URL list, ensure that each object on the URL list is publicly accessible.
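A URL list is a TSV file beginning with the header `TsvHttpData-1.0`, where each subsequent line gives a publicly accessible URL plus metadata for the object. The sketch below generates one; the URLs, sizes, and base64 MD5 values are placeholders:

```python
# Placeholder entries: (public URL, size in bytes, base64 MD5 digest).
entries = [
    ("https://example.com/data/file1.csv", 1357, "wHENa08V36iPYAsOa2JAdw=="),
    ("https://example.com/data/file2.csv", 2468, "aGVsbG8gd29ybGQhISEhIQ=="),
]

# TsvHttpData-1.0 is the required first line of a URL list file.
lines = ["TsvHttpData-1.0"]
for url, size, md5 in entries:
    lines.append(f"{url}\t{size}\t{md5}")
url_list = "\n".join(lines) + "\n"
print(url_list)
```

The resulting text would typically be uploaded somewhere publicly readable (for example, a public Cloud Storage object) and referenced from the transfer configuration.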
Setting up access to the data sink
Storage Transfer Service uses a Google-managed service account to move your data. The Google-managed service account is listed in the accountEmail field. The data sink for your data transfer is always a Cloud Storage bucket.
To set up Storage Transfer Service to use Cloud Storage as a data sink, assign the following roles, or equivalent permissions, to the Google-managed service account listed in the accountEmail field for the destination bucket:
| Role | What it does |
| --- | --- |
| Storage Legacy Bucket Writer (`roles/storage.legacyBucketWriter`) | Enables the Google-managed service account to create, overwrite, and delete objects; list objects in the destination bucket; and read bucket metadata. |
| Storage Object Viewer (`roles/storage.objectViewer`) | Enables the Google-managed service account to list and get objects in the destination bucket. |
For more information about required permissions, see IAM permissions for Storage Transfer Service.
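Once source and sink access is in place, a transfer is created by posting a transferJobs resource to the Storage Transfer Service API. The sketch below builds such a request body for a Cloud Storage-to-Cloud Storage transfer; the project and bucket names are placeholders:

```python
import json

# Placeholder project and bucket names.
transfer_job = {
    "description": "Nightly copy from source to sink",
    "status": "ENABLED",
    "projectId": "my-project",
    "transferSpec": {
        "gcsDataSource": {"bucketName": "source-bucket"},
        "gcsDataSink": {"bucketName": "sink-bucket"},
        # Only enable this if the service account has the roles
        # required to delete source objects.
        "transferOptions": {"deleteObjectsFromSourceAfterTransfer": False},
    },
    "schedule": {
        "scheduleStartDate": {"year": 2024, "month": 1, "day": 1},
        "startTimeOfDay": {"hours": 2, "minutes": 0},
    },
}
print(json.dumps(transfer_job, indent=2))
```

This body would be sent to the `transferJobs.create` method; the console performs the same configuration for transfers created in the UI.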
What's next
- Create a URL list of objects you want to transfer.
- Set up a transfer with the Google Cloud Console.
- Learn how to set up transfers with the Storage Transfer Service API.