Interoperability with other storage providers

Cloud Storage is compatible with some other object storage platforms, so you can seamlessly integrate data from different sources. This page describes Cloud Storage tools you can use to manage your cross-platform object data.

XML API

The Cloud Storage XML API is interoperable with some tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3). To use these tools and libraries with Cloud Storage, change the request endpoint that the tool or library uses to the Cloud Storage URI https://storage.googleapis.com, and then configure the tool or library to use your Cloud Storage HMAC keys. See Simple migration from Amazon Simple Storage Service (Amazon S3) for detailed instructions on getting started.
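
For example, the following sketch points boto3, a common Amazon S3 client library, at Cloud Storage; the bucket name and HMAC key values are placeholders:

import boto3

# Point the S3 client at the Cloud Storage XML API endpoint and
# authenticate with Cloud Storage HMAC keys instead of AWS keys.
client = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="HMAC_ACCESS_ID",        # your Cloud Storage HMAC access ID
    aws_secret_access_key="HMAC_SECRET",       # your Cloud Storage HMAC secret
)

# List the objects in a Cloud Storage bucket through the S3-compatible API.
for obj in client.list_objects(Bucket="example-bucket").get("Contents", []):
    print(obj["Key"])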

Authenticate with the V4 signing process

The V4 signing process allows you to create signed requests to the Cloud Storage XML API. When you perform the V4 signing process, you create a signature that can be used in a request header to authenticate. You can perform the signing process by using an RSA signature or your Amazon S3 workflow and HMAC credentials. For more details about authenticating requests, see Signatures.
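
For example, the Cloud Storage client libraries can perform the V4 signing process with an RSA signature on your behalf. The following sketch uses the Python client library to generate a V4-signed URL (here the signature is carried in query parameters rather than a header) and assumes the client is authenticated with a service account key; the bucket and object names are placeholders:

from datetime import timedelta

from google.cloud import storage

# The client must be authenticated with a service account private key
# so that it can create an RSA signature.
client = storage.Client()
blob = client.bucket("example-bucket").blob("data.txt")

# Generate a V4-signed URL that grants GET access for 15 minutes.
url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="GET",
)
print(url)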

Google Cloud CLI

The gcloud CLI is the preferred command-line tool for accessing Cloud Storage. It also lets you access and work with other cloud storage services that use HMAC authentication, like Amazon S3. After you add your Amazon S3 credentials to ~/.aws/credentials (a sketch of this file follows the examples below), you can start using gcloud storage commands to manage objects in your Amazon S3 buckets. For example:

  • The following command lists the objects in the Amazon S3 bucket my-aws-bucket:

    gcloud storage ls s3://my-aws-bucket
  • The following command synchronizes data between an Amazon S3 bucket and a Cloud Storage bucket:

    gcloud storage rsync s3://my-aws-bucket gs://example-bucket --delete-unmatched-destination-objects --recursive
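
Both commands read the standard shared AWS credentials file. A minimal sketch of ~/.aws/credentials, with placeholder key values:

[default]
aws_access_key_id = AWS_ACCESS_KEY_ID
aws_secret_access_key = AWS_SECRET_ACCESS_KEY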

For more information, including details on how to optimize this synchronization, see the gcloud storage rsync documentation.

Invalid certificate from Amazon S3 bucket names containing dots

If you attempt to use the gcloud CLI to access an Amazon S3 bucket whose name contains a dot, you might receive an invalid certificate error. This is because Amazon S3 does not support virtual-hosted-style URLs for buckets with dots in their names. When working with Amazon S3 resources, you can configure the gcloud CLI to use path-style bucket URLs instead by setting the storage/s3_endpoint_url property to the following:

https://s3.REGION_CODE.amazonaws.com

Where REGION_CODE is the region containing the bucket you are requesting. For example, us-east-2.

You can modify the storage/s3_endpoint_url property by running the gcloud config set command or by setting the corresponding environment variable (CLOUDSDK_STORAGE_S3_ENDPOINT_URL).
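
For example, to set the property for your active gcloud CLI configuration, using the us-east-2 region from the example above:

gcloud config set storage/s3_endpoint_url https://s3.us-east-2.amazonaws.com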

gsutil command line

gsutil is a legacy tool for accessing Cloud Storage from the command line. It also lets you access and work with other cloud storage services that use HMAC authentication, like Amazon S3. After you add your Amazon S3 credentials to ~/.aws/credentials (or to your .boto configuration file, as shown after the examples below), you can start using gsutil to manage objects in your Amazon S3 buckets. For example:

  • The following command lists the objects in the Amazon S3 bucket my-aws-bucket:

    gsutil ls s3://my-aws-bucket
  • The following command synchronizes data between an Amazon S3 bucket and a Cloud Storage bucket:

    gsutil rsync -d -r s3://my-aws-bucket gs://example-bucket
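
As an alternative to ~/.aws/credentials, gsutil (through the underlying boto library) can also read these keys from the [Credentials] section of your .boto configuration file; a sketch with placeholder values:

[Credentials]
aws_access_key_id = AWS_ACCESS_KEY_ID
aws_secret_access_key = AWS_SECRET_ACCESS_KEY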

For more information, including details on how to set up gsutil to optimize this synchronization, see the gsutil rsync documentation.

Invalid certificate from Amazon S3 bucket names containing dots

If you attempt to use gsutil to access an Amazon S3 bucket whose name contains a dot, you might receive an invalid certificate error. This is because Amazon S3 does not support virtual-hosted-style URLs for buckets with dots in their names. When working with Amazon S3 resources, you can configure gsutil to use path-style bucket URLs instead by adding the following entry to your .boto configuration file:

[s3]
calling_format = boto.s3.connection.OrdinaryCallingFormat

Importing data with Storage Transfer Service

Storage Transfer Service lets you import large amounts of online data into Cloud Storage from Amazon S3 buckets, Microsoft Azure Blob Storage containers, and general HTTP/HTTPS locations. You can use Storage Transfer Service to schedule recurring transfers, delete source objects, and select which objects are transferred.
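
For example, the following sketch creates a one-time transfer job from an Amazon S3 bucket using the Storage Transfer Service Python client library (google-cloud-storage-transfer); the project ID, bucket names, and AWS key values are placeholders:

from datetime import datetime, timezone

from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

# Identical start and end dates make this a one-time transfer;
# omit schedule_end_date to run the transfer daily instead.
today = datetime.now(timezone.utc)
run_date = {"year": today.year, "month": today.month, "day": today.day}

job = client.create_transfer_job(
    {
        "transfer_job": {
            "project_id": "my-project",
            "description": "One-time transfer from Amazon S3",
            "status": storage_transfer.TransferJob.Status.ENABLED,
            "schedule": {
                "schedule_start_date": run_date,
                "schedule_end_date": run_date,
            },
            "transfer_spec": {
                "aws_s3_data_source": {
                    "bucket_name": "my-aws-bucket",
                    "aws_access_key": {
                        "access_key_id": "AWS_ACCESS_KEY_ID",
                        "secret_access_key": "AWS_SECRET_ACCESS_KEY",
                    },
                },
                "gcs_data_sink": {"bucket_name": "example-bucket"},
            },
        }
    }
)
print(f"Created transfer job: {job.name}")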

Additionally, if you use Amazon S3 Event Notifications, you can set up Storage Transfer Service event-driven transfers that listen for these notifications and automatically keep a Cloud Storage bucket in sync with an Amazon S3 source.

Amazon Simple Storage Service and Amazon S3 are trademarks of Amazon.com, Inc. or its affiliates in the United States and/or other countries.