This page shows you how to use Storage Transfer Service directly through the REST API, and programmatically with Java and Python in two common scenarios. To create a transfer job using the Google Cloud Console, see Creating and Managing Transfers with the Console.
When you configure or edit transfer jobs programmatically using the Storage Transfer API, the time must be in UTC. For more information on specifying the schedule of a transfer job, see Schedule.
Before you start
Do the following before creating a transfer job:
Verify that you have Storage Transfer Service access by checking that you are assigned one of the following roles:
- roles/owner
- roles/editor
- roles/storagetransfer.admin
- roles/storagetransfer.user
- A custom role that includes, at a minimum, the roles/storagetransfer.user permissions.
For more information about adding and viewing project-level permissions, see Using IAM permissions with projects.
If you have trouble getting access, see Troubleshooting access.
For more information about IAM roles and permissions in Storage Transfer Service, see Access control using IAM roles and permissions.
Transfer from Amazon S3 to Cloud Storage
In this example, you'll learn how to move files from Amazon S3 to a Cloud Storage bucket. Be sure to review Configuring Access and Pricing to understand the implications of moving data from Amazon S3 to Cloud Storage.
To create the transfer job
When creating transfer jobs, do not include the s3:// prefix in the bucketName value for Amazon S3 bucket sources.
REST
Request using transferJobs create:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
    "scheduleStartDate": { "day": 1, "month": 1, "year": 2015 },
    "scheduleEndDate": { "day": 1, "month": 1, "year": 2015 },
    "startTimeOfDay": { "hours": 1, "minutes": 1 }
  },
  "transferSpec": {
    "awsS3DataSource": {
      "bucketName": "AWS_SOURCE_NAME",
      "awsAccessKey": {
        "accessKeyId": "AWS_ACCESS_KEY_ID",
        "secretAccessKey": "AWS_SECRET_ACCESS_KEY"
      }
    },
    "gcsDataSink": { "bucketName": "GCS_SINK_NAME" }
  }
}

Response:
200 OK { "transferJob": [ { "creationTime": "2015-01-01T01:01:00.000000000Z", "description": "YOUR DESCRIPTION", "name": "transferJobs/JOB_ID", "status": "ENABLED", "lastModificationTime": "2015-01-01T01:01:00.000000000Z", "projectId": "PROJECT_ID", "schedule": { "scheduleStartDate": { "day": 1, "month": 1, "year": 2015 }, "scheduleEndDate": { "day": 1, "month": 1, "year": 2015 }, "startTimeOfDay": { "hours": 1, "minutes": 1 } }, "transferSpec": { "awsS3DataSource": { "bucketName": "AWS_SOURCE_NAME" }, "gcsDataSink": { "bucketName": "GCS_SINK_NAME" }, "objectConditions": {}, "transferOptions": {} } } ] }
Java
For information about creating a Storage Transfer Service client, see Creating a Client for a Google APIs Library.
Python
For information about creating a Storage Transfer Service client, see Creating a Client for a Google APIs Library.
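As a hedged sketch of what that client code can look like, the following example uses the discovery-based Python client (google-api-python-client) to create the same Amazon S3 transfer job as the REST request above. The function name and placeholder values are illustrative, and the call typically relies on Application Default Credentials for authorization.

import googleapiclient.discovery

def create_aws_transfer_job():
    """Sketch: create the Amazon S3 transfer job shown in the REST example."""
    client = googleapiclient.discovery.build('storagetransfer', 'v1')
    transfer_job = {
        'description': 'YOUR DESCRIPTION',
        'status': 'ENABLED',
        'projectId': 'PROJECT_ID',
        'schedule': {
            'scheduleStartDate': {'day': 1, 'month': 1, 'year': 2015},
            'scheduleEndDate': {'day': 1, 'month': 1, 'year': 2015},
            'startTimeOfDay': {'hours': 1, 'minutes': 1},
        },
        'transferSpec': {
            'awsS3DataSource': {
                'bucketName': 'AWS_SOURCE_NAME',
                'awsAccessKey': {
                    'accessKeyId': 'AWS_ACCESS_KEY_ID',
                    'secretAccessKey': 'AWS_SECRET_ACCESS_KEY',
                },
            },
            'gcsDataSink': {'bucketName': 'GCS_SINK_NAME'},
        },
    }
    # Returns the created TransferJob resource, including its assigned name.
    result = client.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(result))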
Transfer between Microsoft Azure Blob Storage and Cloud Storage
In this example, you'll learn how to move files from Microsoft Azure Storage to a Cloud Storage bucket. Be sure to review Configuring Access and Pricing to understand the implications of moving data from Microsoft Azure Storage to Cloud Storage.
REST
Request using transferJobs create:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
    "scheduleStartDate": { "day": 14, "month": 2, "year": 2020 },
    "scheduleEndDate": { "day": 14, "month": 2, "year": 2020 },
    "startTimeOfDay": { "hours": 1, "minutes": 1 }
  },
  "transferSpec": {
    "azureBlobStorageDataSource": {
      "storageAccount": "AZURE_SOURCE_NAME",
      "azureCredentials": { "sasToken": "AZURE_SAS_TOKEN" },
      "container": "AZURE_CONTAINER"
    },
    "gcsDataSink": { "bucketName": "GCS_SINK_NAME" }
  }
}

Response:
200 OK
{
  "transferJob": [
    {
      "creationTime": "2020-02-14T01:01:00.000000000Z",
      "description": "YOUR DESCRIPTION",
      "name": "transferJobs/JOB_ID",
      "status": "ENABLED",
      "lastModificationTime": "2020-02-14T01:01:00.000000000Z",
      "projectId": "PROJECT_ID",
      "schedule": {
        "scheduleStartDate": { "day": 14, "month": 2, "year": 2020 },
        "scheduleEndDate": { "day": 14, "month": 2, "year": 2020 },
        "startTimeOfDay": { "hours": 1, "minutes": 1 }
      },
      "transferSpec": {
        "azureBlobStorageDataSource": {
          "storageAccount": "AZURE_SOURCE_NAME",
          "azureCredentials": { "sasToken": "AZURE_SAS_TOKEN" },
          "container": "AZURE_CONTAINER"
        },
        "objectConditions": {},
        "transferOptions": {}
      }
    }
  ]
}
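This scenario can also be set up programmatically. The following sketch uses the discovery-based Python client (google-api-python-client) to create the same Microsoft Azure Blob Storage transfer job as the REST request above; the placeholder values and the function name are illustrative.

import googleapiclient.discovery

def create_azure_transfer_job():
    """Sketch: create the Azure Blob Storage transfer job shown in the REST example."""
    client = googleapiclient.discovery.build('storagetransfer', 'v1')
    transfer_job = {
        'description': 'YOUR DESCRIPTION',
        'status': 'ENABLED',
        'projectId': 'PROJECT_ID',
        'schedule': {
            'scheduleStartDate': {'day': 14, 'month': 2, 'year': 2020},
            'scheduleEndDate': {'day': 14, 'month': 2, 'year': 2020},
            'startTimeOfDay': {'hours': 1, 'minutes': 1},
        },
        'transferSpec': {
            'azureBlobStorageDataSource': {
                'storageAccount': 'AZURE_SOURCE_NAME',
                'azureCredentials': {'sasToken': 'AZURE_SAS_TOKEN'},
                'container': 'AZURE_CONTAINER',
            },
            'gcsDataSink': {'bucketName': 'GCS_SINK_NAME'},
        },
    }
    result = client.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(result))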
Transfer between Cloud Storage buckets
In this example, you'll learn how to move files from one Cloud Storage bucket to another. For example, you can replicate data to a bucket in another location.
To create the transfer job
REST
Request using transferJobs create:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
    "scheduleStartDate": { "day": 1, "month": 1, "year": 2015 },
    "startTimeOfDay": { "hours": 1, "minutes": 1 }
  },
  "transferSpec": {
    "gcsDataSource": { "bucketName": "GCS_SOURCE_NAME" },
    "gcsDataSink": { "bucketName": "GCS_NEARLINE_SINK_NAME" },
    "objectConditions": {
      "minTimeElapsedSinceLastModification": "2592000s"
    },
    "transferOptions": {
      "deleteObjectsFromSourceAfterTransfer": true
    }
  }
}

Response:
200 OK
{
  "transferJob": [
    {
      "creationTime": "2015-01-01T01:01:00.000000000Z",
      "description": "YOUR DESCRIPTION",
      "name": "transferJobs/JOB_ID",
      "status": "ENABLED",
      "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
      "projectId": "PROJECT_ID",
      "schedule": {
        "scheduleStartDate": { "day": 1, "month": 1, "year": 2015 },
        "startTimeOfDay": { "hours": 1, "minutes": 1 }
      },
      "transferSpec": {
        "gcsDataSource": { "bucketName": "GCS_SOURCE_NAME" },
        "gcsDataSink": { "bucketName": "GCS_NEARLINE_SINK_NAME" },
        "objectConditions": {
          "minTimeElapsedSinceLastModification": "2592000.000s"
        },
        "transferOptions": {
          "deleteObjectsFromSourceAfterTransfer": true
        }
      }
    }
  ]
}
Java
For information about creating a Storage Transfer Service client, see Creating a Client for a Google APIs Library.
Python
For information about creating a Storage Transfer Service client, see Creating a Client for a Google APIs Library.
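The following sketch shows the same job created with the discovery-based Python client (google-api-python-client). It mirrors the REST request above: objects in GCS_SOURCE_NAME that have not been modified for 30 days (2,592,000 seconds) are copied to GCS_NEARLINE_SINK_NAME and then deleted from the source. The function name is illustrative.

import googleapiclient.discovery

def create_gcs_transfer_job():
    """Sketch: copy 30-day-old objects between Cloud Storage buckets, then delete them from the source."""
    client = googleapiclient.discovery.build('storagetransfer', 'v1')
    transfer_job = {
        'description': 'YOUR DESCRIPTION',
        'status': 'ENABLED',
        'projectId': 'PROJECT_ID',
        'schedule': {
            'scheduleStartDate': {'day': 1, 'month': 1, 'year': 2015},
            'startTimeOfDay': {'hours': 1, 'minutes': 1},
        },
        'transferSpec': {
            'gcsDataSource': {'bucketName': 'GCS_SOURCE_NAME'},
            'gcsDataSink': {'bucketName': 'GCS_NEARLINE_SINK_NAME'},
            'objectConditions': {
                # Only transfer objects untouched for 30 days (in seconds).
                'minTimeElapsedSinceLastModification': '2592000s',
            },
            'transferOptions': {
                'deleteObjectsFromSourceAfterTransfer': True,
            },
        },
    }
    result = client.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(result))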
Checking transfer operation status
A TransferJob resource is returned when you use transferJobs.create.

You can check the transfer's status after creating the job by using transferJobs.get. If the transfer job's operation has started, this returns a TransferJob with a populated latestOperationName field. If the operation hasn't started yet, the latestOperationName field is empty.
To check a transfer job's status
REST
Request using transferJobs get:

GET https://storagetransfer.googleapis.com/v1/{jobName="name"}
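The same check can be made programmatically. The sketch below assumes the discovery-based Python client (google-api-python-client); job_name is the transferJobs/JOB_ID name returned by transferJobs.create, and the project ID is passed alongside it as a query parameter.

import googleapiclient.discovery

def check_latest_operation(job_name, project_id):
    """Sketch: look up a transfer job and report its latest operation, if any."""
    client = googleapiclient.discovery.build('storagetransfer', 'v1')
    job = client.transferJobs().get(
        jobName=job_name, projectId=project_id).execute()
    # latestOperationName is only populated once the job's first
    # transfer operation has started.
    print(job.get('latestOperationName', 'No transfer operation started yet'))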
Cancel transfer operations
To cancel a single transfer operation, use the transferOperations cancel method. To delete an entire transfer job, including transfer operations that are scheduled for it in the future, set the transfer job's status to DELETED using the transferJobs patch method. Updating a job's status doesn't affect transfer operations that are currently running; to stop an in-progress operation, cancel it with transferOperations cancel.
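Both calls are available through the client libraries as well. The sketch below assumes the discovery-based Python client (google-api-python-client); operation_name is a transferOperations/... name (for example, taken from a job's latestOperationName field), and job_name is the transferJobs/JOB_ID name.

import googleapiclient.discovery

def cancel_operation_and_delete_job(operation_name, job_name, project_id):
    """Sketch: cancel one running operation, then mark the whole job DELETED."""
    client = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Cancel a single in-progress transfer operation.
    client.transferOperations().cancel(name=operation_name).execute()

    # Delete the transfer job itself, including future scheduled operations,
    # by patching its status to DELETED.
    update_request = {
        'projectId': project_id,
        'transferJob': {'status': 'DELETED'},
        'updateTransferJobFieldMask': 'status',
    }
    client.transferJobs().patch(jobName=job_name, body=update_request).execute()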
What's next
Learn how to work with Cloud Storage.