Creating and Managing Transfers with the Console

This page shows you how to use the Google Cloud Platform Console to set up and manage transfers. To work with Cloud Storage Transfer Service programmatically, see Creating a Storage Transfer Service Client and Creating and Managing Transfers Programmatically.

Before you start

Before you can set up transfers in the Cloud Platform Console, make sure you have the necessary access:

  • Cloud Storage access: You must have access to Cloud Storage. You can access Cloud Storage if you have been added as a project team member to an existing Cloud Platform Console project.
  • Source and destination access: You must be a project owner and an owner of the destination bucket, and you need read access to the data source. This is because when you set up a transfer, you agree to grant Cloud Storage Transfer Service access to read data from a source and transfer it to your destination. To learn more about access, see Setting up access to the data source.

Setting up a transfer

  1. Open the Transfer page in the Google Cloud Platform Console.

    Open the Transfer page

  2. Click Create transfer.

Select a tab below for setup instructions. Your steps depend on whether your source is a Cloud Storage bucket, Amazon S3 bucket, or URL list.

Source: Cloud Storage bucket

  1. Under Select source, select Google Cloud Storage bucket.
  2. Under Specify source details, enter the source bucket name (without the prefix gs://) or browse to the bucket and select it.
  3. To specify a subset of files in your source, click Specify file filters beneath the bucket field. You can include or exclude files based on file name prefix and file age. For more information, see Selecting source objects to transfer.
  4. Under Select destination, choose a destination bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Create bucket.

  5. Enable overwrite/delete options if needed.

    By default, your transfer only overwrites an object when the source version is different from the destination version. No other objects are overwritten or deleted. Enable additional overwrite/delete options under Transfer options. For more information on your options, see TransferOptions in the API reference.

  6. Under Configure transfer, schedule your transfer to Run now (one time) or Run daily at a time in your local timezone. (Note that when you configure or edit transfers programmatically with the Storage Transfer API, the time must be in UTC.) For more information on specifying the schedule of a transfer, see Schedule.

  7. [Optional] Edit the transfer name under Name. Use a unique, descriptive name to help identify your transfer later.

  8. Click Create.
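
The console fields in these steps correspond to fields on the Storage Transfer API's TransferJob resource. The following Python sketch (using the Google API Client Library with Application Default Credentials) creates a comparable daily Cloud Storage-to-Cloud Storage transfer; the project ID, bucket names, dates, and prefix are placeholders, and the start time is given in UTC because the API requires it.

    import googleapiclient.discovery

    # Build a Storage Transfer API client using Application Default Credentials.
    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Placeholder project and bucket names -- substitute your own.
    transfer_job = {
        'description': 'Daily transfer of logs',            # corresponds to the transfer's name
        'status': 'ENABLED',
        'projectId': 'my-project-id',
        'schedule': {
            'scheduleStartDate': {'year': 2017, 'month': 6, 'day': 1},
            'startTimeOfDay': {'hours': 3, 'minutes': 0},    # UTC, unlike the console's local time
        },
        'transferSpec': {
            'gcsDataSource': {'bucketName': 'example-source-bucket'},
            'gcsDataSink': {'bucketName': 'example-destination-bucket'},
            # Equivalent of the console's file filters (optional).
            'objectConditions': {'includePrefixes': ['logs/y=2015/']},
            # Equivalent of the console's overwrite/delete options (optional).
            'transferOptions': {
                'overwriteObjectsAlreadyExistingInSink': False,
                'deleteObjectsUniqueInSink': False,
            },
        },
    }

    result = client.transferJobs().create(body=transfer_job).execute()
    print('Created transfer job: {}'.format(result['name']))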

Source: Amazon S3 bucket

  1. Under Select source, select Amazon S3 bucket. Next, under Specify source details, specify the source Amazon S3 bucket name and the access and secret keys.

    The bucket name is the name as it appears in the AWS Management Console.

    To specify a subset of files in your source, click Specify file filters beneath the bucket field. You can include or exclude files based on file name prefix and file age. For more information, see Selecting source objects to transfer.

  2. Under Select destination, choose a destination bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Create bucket.

  3. Enable overwrite/delete options if needed.

    By default, your transfer only overwrites an object when the source version is different from the destination version. No other objects are overwritten or deleted. Enable additional overwrite/delete options under Transfer options. For more information on your options, see TransferOptions in the API reference.

  4. Under Configure transfer, schedule your transfer to Run now (one time) or Run daily at the local time you specify.

  5. [Optional] Edit the transfer name under Name. Use a unique, descriptive name to help identify your transfer later.

  6. Click Create.
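
For an Amazon S3 source, the bucket name and keys you enter in the console correspond to the API's awsS3DataSource field. A minimal Python sketch follows, with placeholder project ID, bucket names, and AWS credentials.

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    transfer_job = {
        'description': 'Daily transfer from Amazon S3',
        'status': 'ENABLED',
        'projectId': 'my-project-id',                        # placeholder
        'schedule': {
            'scheduleStartDate': {'year': 2017, 'month': 6, 'day': 1},
            'startTimeOfDay': {'hours': 3, 'minutes': 0},    # UTC
        },
        'transferSpec': {
            'awsS3DataSource': {
                'bucketName': 'my-aws-bucket',               # name as it appears in the AWS Management Console
                'awsAccessKey': {
                    'accessKeyId': 'AWS_ACCESS_KEY_ID',      # placeholder credentials
                    'secretAccessKey': 'AWS_SECRET_ACCESS_KEY',
                },
            },
            'gcsDataSink': {'bucketName': 'example-destination-bucket'},
        },
    }

    result = client.transferJobs().create(body=transfer_job).execute()
    print('Created transfer job: {}'.format(result['name']))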

Source: URL list

  1. Under Select source, select List of object URLs. Next, under Specify source details, provide the URL to a tab-separated values (TSV) file, then click Continue. See Creating a URL List for details about how to create the TSV file.

  2. Under Select destination, choose a destination bucket or create a new one.

    To choose an existing bucket, enter the name of the bucket (without the prefix gs://), or click Browse and browse to it.

    To transfer files to a new bucket, click Create bucket.

  3. Enable overwrite/delete options if needed.

    By default, your transfer only overwrites an object when the source version is different from the destination version. No other objects are overwritten or deleted. Enable additional overwrite/delete options under Transfer options. For more information on your options, see TransferOptions in the API reference.

  4. Under Configure transfer, schedule your transfer to Run now (one time) or Run daily at the local time you specify.

  5. [Optional] Edit the transfer name under Name. Use a unique, descriptive name to help identify your transfer later.

  6. Click Create.
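
For a URL list source, the TSV location corresponds to the API's httpDataSource.listUrl field. A minimal Python sketch follows, with a placeholder project ID, TSV URL, and destination bucket; setting the schedule end date equal to the start date makes the job run once.

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    transfer_job = {
        'description': 'One-time transfer from a URL list',
        'status': 'ENABLED',
        'projectId': 'my-project-id',                        # placeholder
        # An end date equal to the start date runs the transfer once.
        'schedule': {
            'scheduleStartDate': {'year': 2017, 'month': 6, 'day': 1},
            'scheduleEndDate': {'year': 2017, 'month': 6, 'day': 1},
        },
        'transferSpec': {
            'httpDataSource': {'listUrl': 'https://example.com/url-list.tsv'},  # placeholder TSV URL
            'gcsDataSink': {'bucketName': 'example-destination-bucket'},
        },
    }

    result = client.transferJobs().create(body=transfer_job).execute()
    print('Created transfer job: {}'.format(result['name']))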

Source: Local Data

Cloud Storage Transfer Service transfers online data into Cloud Storage; it does not transfer data from your local machine. To sync a Cloud Storage bucket with local data, use gsutil rsync.

Editing a transfer

You can edit the configuration of a transfer only if it is:

  • A recurring transfer that is not stopped
  • A one-time transfer that has not yet started executing

If one of these conditions is true, edit the configuration as follows:

  1. Open the Transfer page in the Google Cloud Platform Console.

    Open the Transfer page

  2. Click a transfer name to get its details.

  3. Select Edit configuration and follow the instructions described in Setting up a transfer.

    Changes made to a recurring transfer that is currently running take effect the next time the transfer is scheduled to run.
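
Programmatically, the counterpart of Edit configuration is the transferJobs.patch method, which updates only the fields named in a field mask. A minimal sketch, assuming a placeholder project ID and an existing job named transferJobs/123456789 whose description you want to change:

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Placeholder project ID and job name.
    update_request = {
        'projectId': 'my-project-id',
        'transferJob': {'description': 'A more descriptive transfer name'},
        # Only the fields named in the mask are changed.
        'updateTransferJobFieldMask': 'description',
    }

    result = client.transferJobs().patch(
        jobName='transferJobs/123456789', body=update_request).execute()
    print('Updated transfer job: {}'.format(result['name']))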

Viewing the history of a transfer

  1. Open the Transfer page in the Google Cloud Platform Console.

    Open the Transfer page

  2. Click a transfer name to get its details.

  3. Select the History tab.

  4. In the list of entries, click the start date of a transfer to view details, including any errors that might have occurred.

    One-time transfers run only once, so there will only be one entry in the transfer history.
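
The entries on the History tab are transfer operations, which you can also list through the API. A minimal sketch follows, with a placeholder project ID and job name; the filter is a JSON string.

    import json

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # The filter is a JSON string; project_id is required, job_names is optional.
    op_filter = json.dumps({
        'project_id': 'my-project-id',            # placeholder
        'job_names': ['transferJobs/123456789'],  # placeholder
    })

    response = client.transferOperations().list(
        name='transferOperations', filter=op_filter).execute()

    for operation in response.get('operations', []):
        # Each operation's metadata includes its status and transfer counters.
        print(operation['name'], operation['metadata']['status'])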

Pause or cancel an active transfer

To pause or cancel a transfer that is underway:

  1. Open the Transfer page in the Google Cloud Platform Console.

    Open the Transfer page

  2. Click a transfer name to get its details.

  3. Select the History tab.

    In the list of entries, active transfers have pause (||) and cancel (X) buttons to the right of the status column.

  4. Click the pause button to temporarily pause the transfer or the cancel button to permanently cancel the transfer.

  5. If you are pausing the transfer, click the resume button (►) to continue the transfer.

    If you are cancelling the transfer, a confirmation window appears. Click Cancel transfer to confirm the cancel request.
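
The pause, resume, and cancel buttons correspond to the transferOperations.pause, transferOperations.resume, and transferOperations.cancel methods. A minimal sketch, using a placeholder operation name such as one returned by the listing above:

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Placeholder operation name; real names look like 'transferOperations/...'.
    operation_name = 'transferOperations/OPERATION_ID'

    # Temporarily pause the active transfer.
    client.transferOperations().pause(name=operation_name).execute()

    # Continue a paused transfer.
    client.transferOperations().resume(name=operation_name).execute()

    # Or permanently cancel the transfer instead.
    client.transferOperations().cancel(name=operation_name).execute()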

Delete a scheduled or recurring transfer

  1. Open the Transfer page in the Google Cloud Platform Console.

    Open the Transfer page

  2. Select the checkbox of the transfer you wish to remove.

  3. Click Delete, located above the list of transfers.

  4. Click OK in the confirmation window that appears.
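
One programmatic counterpart of deleting a transfer is to patch the job's status to DELETED. A minimal sketch, with a placeholder project ID and job name:

    import googleapiclient.discovery

    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Placeholder project ID and job name.
    update_request = {
        'projectId': 'my-project-id',
        'transferJob': {'status': 'DELETED'},
        'updateTransferJobFieldMask': 'status',
    }

    result = client.transferJobs().patch(
        jobName='transferJobs/123456789', body=update_request).execute()
    print('Job status is now: {}'.format(result['status']))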

Selecting source objects to transfer

Cloud Storage Transfer Service provides include and exclude prefixes that you can use to select which files to transfer from the data source. In general, you can think of the prefixes as narrowing down the set of objects that gets transferred. You can use only include prefixes, only exclude prefixes, or both. The following guidance applies to both Amazon Simple Storage Service (Amazon S3) and Google Cloud Storage data sources.

  • Do not include the leading slash in a prefix. For example, to include the requests.gz object from the bucket path s3://my-aws-bucket/logs/y=2015/requests.gz in a transfer, specify the include prefix logs/y=2015/requests.gz.

  • If you use include prefixes and exclude prefixes together, then exclude prefixes must start with the value of one of the include prefixes. For example, if you specify a as an include prefix, valid exclude prefixes are a/b, aaa, and abc.

  • If you use just exclude prefixes, there are no restrictions on the prefixes you can use.

  • If you do not specify any prefixes, then all objects in the bucket are transferred.

  • Do not include a path in the data source or data sink bucket name. For example, s3://my-aws-bucket and gs://example-bucket are valid, but s3://my-aws-bucket/subfolder and gs://example-bucket/files are not. To include paths, use include and exclude prefixes.

  • Cloud Storage Transfer Service does not support remapping; that is, you cannot copy the path files/2015 in the data source to files/2016 in the data sink.

For more specifics about working with include and exclude prefixes, see the includePrefixes and excludePrefixes field descriptions in the API.

For more general information about prefixes, see Listing Keys Hierarchically Using a Prefix and Delimiter in the Amazon S3 documentation or the Objects list method for Google Cloud Storage.
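
In the API, these include and exclude prefixes, together with the console's file-age filters, are expressed through the objectConditions field of the transfer spec. The following Python fragment is illustrative only, reusing the earlier example in which a is an include prefix and a/b an exclude prefix; the duration values are placeholders.

    # objectConditions fragment for a transferSpec; values are illustrative.
    object_conditions = {
        'includePrefixes': ['a'],        # transfer only objects whose names start with 'a'
        'excludePrefixes': ['a/b'],      # ...but skip those whose names start with 'a/b'
        # Durations are strings in seconds; these express the console's file-age filters.
        'minTimeElapsedSinceLastModification': '3600s',     # at least 1 hour old
        'maxTimeElapsedSinceLastModification': '2592000s',  # at most 30 days old
    }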

Creating an Amazon S3 IAM user

These steps give an overview of the process of creating Amazon S3 credentials that can be used in transfers from an Amazon S3 bucket to a Google Cloud Storage bucket. For detailed information, see Creating an IAM User in Your AWS Account and Bucket Policy Examples.

  1. Create a new user in the AWS Identity and Access Management console.

  2. Note the access credentials or download them.

    The downloaded credentials contain the user name, access key ID, and secret access key. When you configure the transfer in the Cloud Platform Console, you need only the access key ID and secret access key.

  3. Attach a managed policy to the IAM user.

    Attach the AmazonS3FullAccess policy if your transfer is configured to delete source objects; otherwise, attach the AmazonS3ReadOnlyAccess policy. For example, the AmazonS3FullAccess managed policy attached to a user through the IAM console is:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "s3:*",
          "Resource": "*"
        }
      ]
    }
    
  4. Optionally, create a policy that is more restrictive than the managed policies.

    For example, you can create a policy that limits access to just the Amazon S3 bucket. For more information, see Bucket Policy Examples.
