Configuring Access to Data Sources and Sinks

This page explains how to set up access to the data source and data sink for a transfer that uses Storage Transfer Service.

Setting up access to the data source

To set up access to your transfer data source:

Google Cloud Storage

To use a Google Cloud Storage bucket as a data source, you must give the service account associated with the Storage Transfer Service permission to view objects in the bucket. If you want the transfer to delete objects from the source bucket, you must also grant the service account permission to delete them.

  1. Obtain the email address used for the service account.

    1. Use the Try this API section of the googleServiceAccounts.get method page.

    2. In the projectId field, enter the ID of the project that your data source bucket resides in.

    3. Click the Execute button.

    4. In the response that appears, find and copy the value for accountEmail.

      The email value has a form that looks like: storage-transfer-123456789@partnercontent.gserviceaccount.com

  2. Grant this service account the roles required to access the data:

    1. Storage Object Viewer allows the service account to read the objects to be transferred.

    2. (Optional) Storage Legacy Bucket Writer additionally allows the transfer to remove objects from the source bucket once they've been transferred.

    For a step-by-step guide to granting roles for buckets, see Adding a member to a bucket-level policy.
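
If you prefer to script these two steps, the sketch below first looks up the service account email through the googleServiceAccounts.get method and then adds the roles to the source bucket's IAM policy. It is a minimal sketch, assuming Application Default Credentials and the google-api-python-client and google-cloud-storage packages; the project ID and bucket name are hypothetical placeholders.

    # Minimal sketch: look up the Storage Transfer Service account for a
    # project, then grant it read (and, optionally, delete) access on the
    # source bucket. Project and bucket names are placeholders.
    from google.cloud import storage
    from googleapiclient import discovery

    PROJECT_ID = 'my-source-project'   # hypothetical project ID
    BUCKET_NAME = 'my-source-bucket'   # hypothetical bucket name

    # Step 1: obtain the service account email (googleServiceAccounts.get).
    transfer = discovery.build('storagetransfer', 'v1')
    account_email = transfer.googleServiceAccounts().get(
        projectId=PROJECT_ID).execute()['accountEmail']
    member = 'serviceAccount:' + account_email

    # Step 2: grant the roles on the source bucket.
    bucket = storage.Client(project=PROJECT_ID).bucket(BUCKET_NAME)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {'role': 'roles/storage.objectViewer', 'members': {member}})
    # Optional: allow the transfer to delete objects from the source bucket.
    policy.bindings.append(
        {'role': 'roles/storage.legacyBucketWriter', 'members': {member}})
    bucket.set_iam_policy(policy)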

Amazon S3

Follow these steps to set up access to an Amazon S3 bucket:

  1. Create an AWS Identity and Access Management (AWS IAM) user with a name that you can easily recognize, such as transfer-user. Ensure the name follows the IAM user name guidelines (see Limitations on IAM Entities and Objects).

  2. Give the AWS IAM user the ability to do the following:

    • List the Amazon S3 bucket.
    • Get the location of the bucket.
    • Read the objects in the bucket.

  3. Create at least one access/secret key pair for each group of transfers that you plan to set up. You can also create a separate access/secret key pair for each transfer (see the sketch after this list).

  4. Restore any objects that are archived to Amazon Glacier. Objects in Amazon S3 that are archived to Amazon Glacier are not accessible until they are restored. For more information, see the Migrating to Google Cloud Storage from Amazon Glacier white paper.
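
As one way to script the steps above, the hedged sketch below uses boto3 to create the IAM user, attach an inline policy covering the three permissions, create an access/secret key pair, and request a restore of a single Glacier-archived object. The user, bucket, and object names are hypothetical placeholders.

    # Hedged sketch of the Amazon S3 setup steps, assuming AWS
    # credentials with IAM and S3 administrative access are configured.
    import json

    import boto3

    iam = boto3.client('iam')
    s3 = boto3.client('s3')

    BUCKET = 'my-transfer-bucket'  # hypothetical bucket name

    read_policy = {
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Action': [
                's3:ListBucket',         # list the bucket
                's3:GetBucketLocation',  # get the location of the bucket
                's3:GetObject',          # read the objects in the bucket
            ],
            'Resource': [
                'arn:aws:s3:::' + BUCKET,
                'arn:aws:s3:::' + BUCKET + '/*',
            ],
        }],
    }

    # Steps 1 and 2: create the user and grant the three permissions.
    iam.create_user(UserName='transfer-user')
    iam.put_user_policy(
        UserName='transfer-user',
        PolicyName='storage-transfer-read',
        PolicyDocument=json.dumps(read_policy))

    # Step 3: create an access/secret key pair. Store the secret securely;
    # it cannot be retrieved again later.
    key = iam.create_access_key(UserName='transfer-user')['AccessKey']
    print(key['AccessKeyId'], key['SecretAccessKey'])

    # Step 4: request a restore for one archived object (repeat per object;
    # the restore completes asynchronously).
    s3.restore_object(
        Bucket=BUCKET,
        Key='archived/object.dat',  # hypothetical object key
        RestoreRequest={'Days': 7})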

URL list

If your data source is a URL list, ensure that each object in the list is publicly accessible.
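
One simple way to check this is to issue unauthenticated HEAD requests against each URL before creating the transfer, as in the minimal Python sketch below; the URLs are hypothetical placeholders.

    # Sketch: verify that each object in a URL list is publicly reachable
    # without credentials.
    import urllib.error
    import urllib.request

    urls = [
        'https://example.com/data/file1.csv',  # hypothetical URLs
        'https://example.com/data/file2.csv',
    ]

    for url in urls:
        request = urllib.request.Request(url, method='HEAD')
        try:
            with urllib.request.urlopen(request) as response:
                print(url, 'OK', response.status)
        except urllib.error.HTTPError as err:
            print(url, 'not accessible:', err.code)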

Setting up access to the data sink

The data sink for your transfer is always a Google Cloud Storage bucket. To use a bucket as a data sink, you must give the service account associated with the Storage Transfer Service permission to create, delete, and list objects in the bucket.

  1. Obtain the email address used for the service account.

    1. Use the Try this API section of the googleServiceAccounts.get method page.

    2. In the projectId field, enter the ID of the project that your data sink bucket resides in.

    3. Click the Execute button.

    4. In the response that appears, find and copy the value for accountEmail.

      The email value has a form that looks like: storage-transfer-123456789@partnercontent.gserviceaccount.com

  2. Grant this service account the Storage Legacy Bucket Writer role on the bucket that is your data sink, as shown in the sketch after these steps.

    For a step-by-step guide to granting roles for buckets, see Adding a member to a bucket-level policy.
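
This grant can be scripted in the same way as the source-bucket grants. Below is a minimal sketch, assuming the google-cloud-storage package and Application Default Credentials; the sink bucket name and service account email are hypothetical placeholders.

    # Sketch: grant the transfer service account Storage Legacy Bucket
    # Writer on the data sink bucket.
    from google.cloud import storage

    MEMBER = ('serviceAccount:storage-transfer-123456789'
              '@partnercontent.gserviceaccount.com')  # placeholder email

    bucket = storage.Client().bucket('my-sink-bucket')  # placeholder name
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {'role': 'roles/storage.legacyBucketWriter', 'members': {MEMBER}})
    bucket.set_iam_policy(policy)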
