Creating and Managing Data Transfers with the Console

This page shows you how to use the Google Cloud Platform Console to set up and manage transfer jobs. To work with Storage Transfer Service programmatically, see Creating a Storage Transfer Service Client and Creating and Managing Transfers Programmatically.
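
If you work with the service programmatically, the sketches that accompany the steps on this page assume a Storage Transfer API client built with the google-api-python-client library, roughly as follows (a minimal sketch; authentication setup varies by environment):

    import googleapiclient.discovery

    # Build a Storage Transfer API client. Without explicit credentials,
    # the library falls back to Application Default Credentials.
    client = googleapiclient.discovery.build('storagetransfer', 'v1')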

Before you start

Before you can set up transfer jobs in the GCP Console, make sure you have the necessary access:

  • Cloud Storage access: You must be the Owner or Editor of the project that manages the data transfer. This project does not have to be associated with either the source or sink.

    For step-by-step guides to adding and viewing project-level permissions, see Using IAM permissions with projects.

  • Source and sink access: Storage Transfer Service uses a service account to perform transfers. This service account must have permission to access both the data source and the data sink.

    For step-by-step guides to setting up access to data sources and sinks, see Configuring Access to Data Sources and Sinks.

Setting up a transfer job

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Click Create transfer job.

    Your next steps depend on whether your source is a Cloud Storage bucket, an Amazon S3 bucket, or a URL list. Follow the section below that matches your source.

Google Cloud Storage

  1. Under Select source, select Google Cloud Storage bucket.

  2. In the Cloud Storage bucket text box, enter the source bucket name (without the prefix gs://) or browse to the bucket and select it.

  3. To specify a subset of files in your source, click Specify file filters beneath the bucket field. You can include or exclude files based on file name prefix and file age. For more information, see Selecting source objects to transfer.

  4. Under Select destination, choose a sink bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Browse and then click the New bucket icon.

  5. Enable overwrite/delete options if needed.

    By default, Storage Transfer Service overwrites an object only when the source version differs from the sink version; no other objects are overwritten or deleted. You can enable additional overwrite/delete options under Transfer options. For more information, see TransferOptions in the API reference. An API sketch of an equivalent job, including these options, follows these steps.

  6. Under Configure transfer, schedule your transfer job to Run now (one time) or Run daily at the local time you specify.

  7. [Optional] Edit the transfer job name under Description. Use a unique, descriptive name to help identify your transfer job later.

  8. Click Create.
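
For reference, here is a sketch of an equivalent job created through the API with the client built earlier. The project ID, bucket names, description, and dates are placeholder values:

    # A daily bucket-to-bucket transfer job. Setting a scheduleEndDate
    # equal to scheduleStartDate makes it a one-time transfer instead.
    transfer_job = {
        'description': 'Example daily transfer',      # placeholder
        'status': 'ENABLED',
        'projectId': 'my-project-id',                 # placeholder
        'schedule': {
            'scheduleStartDate': {'year': 2018, 'month': 1, 'day': 1},
            'startTimeOfDay': {'hours': 2, 'minutes': 0}
        },
        'transferSpec': {
            'gcsDataSource': {'bucketName': 'source-bucket'},
            'gcsDataSink': {'bucketName': 'sink-bucket'},
            # Counterpart of the console's overwrite/delete options.
            'transferOptions': {
                'overwriteObjectsAlreadyExistingInSink': False
            }
        }
    }
    result = client.transferJobs().create(body=transfer_job).execute()
    print('Created job:', result['name'])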

Amazon S3

  1. Under Select source, select Amazon S3 bucket.

  2. In the Amazon S3 bucket text box, specify the source Amazon S3 bucket name.

    The bucket name is the name as it appears in the AWS Management Console.

  3. In the respective text boxes, enter the Access key ID and Secret key associated with the Amazon S3 bucket.

  4. To specify a subset of files in your source, click Specify file filters beneath the bucket field. You can include or exclude files based on file name prefix and file age. For more information, see Selecting source objects to transfer.

  5. Under Select destination, choose a sink bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Browse and then click the New bucket icon.

  6. Enable overwrite/delete options if needed.

    By default, your transfer job overwrites an object only when the source version differs from the sink version; no other objects are overwritten or deleted. You can enable additional overwrite/delete options under Transfer options. For more information, see TransferOptions in the API reference. An API sketch of the source configuration follows these steps.

  7. Under Configure transfer, schedule your transfer job to Run now (one time) or Run daily at the local time you specify.

  8. [Optional] Edit the transfer job name under Description. Use a unique, descriptive name to help identify your transfer job later.

  9. Click Create.
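
In the API, an Amazon S3 source differs from the Cloud Storage example above only in its transfer specification, which carries the bucket name and the AWS credentials. A sketch with placeholder values:

    # Drop-in replacement for 'transferSpec' in the earlier job body.
    transfer_spec = {
        'awsS3DataSource': {
            'bucketName': 'my-aws-bucket',                   # placeholder
            'awsAccessKey': {
                'accessKeyId': 'AWS_ACCESS_KEY_ID',          # placeholder
                'secretAccessKey': 'AWS_SECRET_ACCESS_KEY'   # placeholder
            }
        },
        'gcsDataSink': {'bucketName': 'sink-bucket'}
    }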

URL list

  1. Under Select source, select List of object URLs.

  2. Under URL of TSV file, provide the URL to a tab-separated values (TSV) file, then click Continue. See Creating a URL List for details about how to create the TSV file.

  3. Under Select destination, choose a sink bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Browse and then click the New bucket icon.

  4. Enable overwrite/delete options if needed.

    By default, your transfer job overwrites an object only when the source version differs from the sink version; no other objects are overwritten or deleted. You can enable additional overwrite/delete options under Transfer options. For more information, see TransferOptions in the API reference. An API sketch of the source configuration follows these steps.

  5. Under Configure transfer, schedule your transfer job to Run now (one time) or Run daily at the local time you specify.

  6. [Optional] Edit the transfer job name under Description. Use a unique, descriptive name to help identify your transfer job later.

  7. Click Create.
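
In the API, a URL-list source is expressed as an httpDataSource pointing at the TSV file. A sketch with placeholder values:

    # Drop-in replacement for 'transferSpec' in the earlier job body.
    transfer_spec = {
        'httpDataSource': {
            'listUrl': 'https://example.com/url-list.tsv'  # placeholder
        },
        'gcsDataSink': {'bucketName': 'sink-bucket'}
    }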

Local Data

Storage Transfer Service transfers online data into Cloud Storage; it does not transfer data from your local machine. To sync a Cloud Storage bucket with local data, use gsutil rsync, as shown below.
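
For example, assuming a bucket named example-bucket and a local directory /local/data, the synchronization can run in either direction:

    # Recursively copy new or changed files from the bucket to the
    # local directory.
    gsutil rsync -r gs://example-bucket /local/data

    # Or the reverse: push local changes up to the bucket.
    gsutil rsync -r /local/data gs://example-bucket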

Editing a transfer job

You can edit the configuration of a transfer job only if it is:

  • A recurring transfer job that is not stopped
  • A one-time transfer job that has not yet started executing

If either condition is true, edit the configuration as follows:

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Click a transfer job name to get its details.

  3. Select Edit configuration and follow the instructions described in Setting up a transfer job.

    Changes made to a recurring transfer job that is currently running take effect the next time the transfer job is scheduled to run.
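
Under the same conditions, a job's configuration can be patched through the API. A sketch with placeholder names; only the fields listed in the field mask are modified:

    update_request = {
        'projectId': 'my-project-id',                 # placeholder
        'transferJob': {
            'description': 'Updated description'      # placeholder
        },
        # Only fields named here are changed on the job.
        'updateTransferJobFieldMask': 'description'
    }
    client.transferJobs().patch(
        jobName='transferJobs/1234567890',            # placeholder
        body=update_request).execute()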

Viewing the history of a transfer job

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Click a transfer job name to get its details.

  3. Select the Operations tab.

  4. In the list of entries, click the start date of a transfer operation to view details, including any errors that might have occurred.

    One-time transfer jobs run only once, so there will only be one entry in the transfer operation history.
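
The same history is available from the API by listing transfer operations with a JSON-encoded filter. A sketch with placeholder names:

    import json

    # The filter selects operations for one project and, optionally,
    # for specific jobs.
    filter_string = json.dumps({
        'project_id': 'my-project-id',                # placeholder
        'job_names': ['transferJobs/1234567890']      # placeholder
    })
    response = client.transferOperations().list(
        name='transferOperations', filter=filter_string).execute()
    for operation in response.get('operations', []):
        print(operation['name'], operation['metadata'].get('status'))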

Pausing or canceling an active transfer operation

To pause or cancel a transfer operation that is underway:

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Click a transfer job name to get its details.

  3. Select the Operations tab.

    In the list of entries, active transfer operations have pause (||) and cancel (X) buttons to the right of the status column.

  4. Click the pause button to temporarily pause the transfer operation, or the cancel button to permanently cancel it.

  5. If you are pausing the transfer operation, click the resume button (►) to continue the transfer operation.

    If you are canceling the transfer operation, a confirmation window appears. Click Cancel transfer to confirm the cancellation.
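
Operations can also be paused, resumed, or canceled through the API, using the operation name shown in the history listing. A sketch with a placeholder name:

    operation_name = 'transferOperations/OPERATION_ID'  # placeholder

    # Temporarily pause a running operation.
    client.transferOperations().pause(name=operation_name).execute()

    # Continue a paused operation.
    client.transferOperations().resume(name=operation_name).execute()

    # Permanently cancel the operation.
    client.transferOperations().cancel(name=operation_name).execute()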

Deleting a scheduled or recurring transfer job

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Select the checkbox of the transfer job you wish to remove.

  3. Click Delete, located above the list of transfer jobs.

  4. Click OK in the confirmation window that appears.
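
Programmatically, a transfer job is removed by patching its status to DELETED rather than through a separate delete call. A sketch with placeholder names:

    delete_request = {
        'projectId': 'my-project-id',                 # placeholder
        'transferJob': {'status': 'DELETED'},
        'updateTransferJobFieldMask': 'status'
    }
    client.transferJobs().patch(
        jobName='transferJobs/1234567890',            # placeholder
        body=delete_request).execute()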

Selecting source objects to transfer

Storage Transfer Service supports include and exclude prefixes that you can use to select which files are transferred from the data source. In general, think of the prefixes as narrowing down the set of objects that are transferred. You can use include prefixes only, exclude prefixes only, or both. The following guidance applies to both Amazon Simple Storage Service (Amazon S3) and Cloud Storage data sources.

  • Do not include the leading slash in a prefix. For example, to include the object s3://my-aws-bucket/logs/y=2015/requests.gz in a transfer, specify the include prefix as logs/y=2015/requests.gz.

  • If you use include prefixes and exclude prefixes together, then exclude prefixes must start with the value of one of the include prefixes. For example, if you specify a as an include prefix, valid exclude prefixes are a/b, aaa, and abc.

  • If you use just exclude prefixes, there are no restrictions on the prefixes you can use.

  • If you do not specify any prefixes, then all objects in the bucket are transferred.

  • Do not provide a path name for the data source or sink bucket names. For example, s3://my-aws-bucket and gs://example-bucket are valid, but s3://my-aws-bucket/subfolder or gs://example-bucket/files are not. To include paths, use include and exclude prefixes.

  • Storage Transfer Service does not support remapping; that is, you cannot copy the path files/2015 in the data source to files/2016 in the data sink.

For more specifics about working with include and exclude prefixes, see the includePrefixes and excludePrefixes field descriptions in the API reference.
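
Continuing the earlier requests.gz example, the prefixes map onto the objectConditions field of a transfer specification. A sketch:

    # Drop-in 'objectConditions' for the 'transferSpec' in a job body.
    object_conditions = {
        # Transfer only objects whose names start with this prefix...
        'includePrefixes': ['logs/y=2015/'],
        # ...except this object. Note that the exclude prefix starts
        # with the include prefix, as required.
        'excludePrefixes': ['logs/y=2015/requests.gz']
    }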

For more general information about prefixes, see Listing Keys Hierarchically Using a Prefix and Delimiter in the Amazon S3 documentation or the Objects list method for Cloud Storage.

Creating an Amazon S3 IAM user

These steps give an overview of the process of creating Amazon S3 credentials that can be used in data transfers from an Amazon S3 bucket to a Cloud Storage bucket. For detailed information, see Creating an IAM User in Your AWS Account and Bucket Policy Examples.

  1. Create a new user in the AWS Identity and Access Management console.

  2. Note the access credentials or download them.

    The downloaded credentials contain the user name, access key ID, and secret access key. When you configure the transfer job in Cloud Storage, you only need the access key ID and secret access key.

  3. Attach a managed policy to the IAM user that contains the permissions needed to complete a transfer.

    Attach the AmazonS3FullAccess policy if your transfer job is configured to delete source objects; otherwise, attach the AmazonS3ReadOnlyAccess policy. For example, the AmazonS3FullAccess managed policy attached to a user through the IAM console is:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "s3:*",
          "Resource": "*"
        }
      ]
    }
    
  4. Optionally, create a policy that is more restrictive than the managed policies.

    For example, you can create a policy that limits access to just the Amazon S3 bucket, as sketched below. For more information, see Bucket Policy Examples.
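
As a sketch, a read-only policy scoped to a single bucket (my-aws-bucket is a placeholder) might look like the following. Add delete permissions only if the transfer job is configured to delete source objects:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "s3:GetObject",
            "s3:GetBucketLocation",
            "s3:ListBucket"
          ],
          "Resource": [
            "arn:aws:s3:::my-aws-bucket",
            "arn:aws:s3:::my-aws-bucket/*"
          ]
        }
      ]
    }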
