Moving and Renaming Buckets

When you create a bucket, you permanently define its name, its geographic location, and the project it is part of. However, you can effectively move or rename your bucket:

  • If there is no data in your old bucket, simply delete the bucket and create another bucket with a new name, in a new location, or in a new project.

  • If you have data in your old bucket, create a new bucket with the desired name, location, and/or project, copy data from the old bucket to the new bucket, and delete the old bucket and its contents. The steps below describe this process.

    Note that if you want your new bucket to have the same name as your old bucket, you must move your data twice: an intermediary bucket temporarily holds your data so that you can delete the original bucket and free up the bucket name for the final bucket.
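    If you script the move with a client library rather than the Console or gsutil, the double move described above is just the same copy-and-delete routine run twice. The following is a minimal sketch using the Python google-cloud-storage package; the bucket names are hypothetical, and it assumes the buckets hold few enough objects to copy one by one in a single run.

    from google.cloud import storage

    def move_all_objects(client, source_name, destination_name):
        """Copy every object from source to destination, then delete the emptied source bucket."""
        source = client.get_bucket(source_name)
        destination = client.get_bucket(destination_name)
        for blob in client.list_blobs(source_name):
            source.copy_blob(blob, destination, blob.name)
            blob.delete()
        source.delete()

    client = storage.Client()

    # Reusing the original name requires an intermediary hop:
    client.create_bucket('my-temp-bucket')                    # hypothetical name
    move_all_objects(client, 'my-bucket', 'my-temp-bucket')   # frees up 'my-bucket'
    client.create_bucket('my-bucket')                         # recreate with the settings you want
    move_all_objects(client, 'my-temp-bucket', 'my-bucket')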

To move your data from one bucket to another:

Step 1) Create a new bucket

Console

  1. Open the Cloud Storage browser in the Google Cloud Platform Console.
  2. Click Create bucket.
  3. Specify a Name, subject to the bucket name requirements.
  4. Select a Default storage class for the bucket. The default storage class is assigned to all objects uploaded to the bucket.

    Note: Click Compare storage classes to compare storage classes and monthly cost estimates.

  5. Select a Location where the bucket data will be stored.
  6. Click Create.

gsutil

Use the gsutil mb command, replacing [VALUES_IN_BRACKETS] with the appropriate values:

gsutil mb gs://[BUCKET_NAME]/

Set the following optional flags to have greater control over the creation of your bucket:

  • -p: specify the project with which your bucket will be associated.
  • -c: specify the default storage class of your bucket.
  • -l: specify the location of your bucket.

For example:

  gsutil mb -p [PROJECT_NAME] -c [STORAGE_CLASS] -l [BUCKET_LOCATION] gs://[BUCKET_NAME]/

Code samples

C++

For more information, see the Cloud Storage C++ API reference documentation.

namespace gcs = google::cloud::storage;
[](gcs::Client client, std::string bucket_name) {
  gcs::BucketMetadata meta =
      client.CreateBucket(bucket_name, gcs::BucketMetadata());
  std::cout << "Bucket created.  The metadata is " << meta << std::endl;
}

C#

For more information, see the Cloud Storage C# API reference documentation.

private void CreateBucket(string bucketName)
{
    var storage = StorageClient.Create();
    storage.CreateBucket(s_projectId, bucketName);
    Console.WriteLine($"Created {bucketName}.");
}

Go

For more information, see the Cloud Storage Go API reference documentation.

bucket := client.Bucket(bucketName)
if err := bucket.Create(ctx, projectID, &storage.BucketAttrs{
	StorageClass: "COLDLINE",
	Location:     "asia",
}); err != nil {
	return err
}

Java

For more information, see the Cloud Storage Java API reference documentation.

Bucket bucket =
    storage.create(
        BucketInfo.newBuilder(bucketName)
            // See here for possible values: http://g.co/cloud/storage/docs/storage-classes
            .setStorageClass(StorageClass.COLDLINE)
            // Possible values: http://g.co/cloud/storage/docs/bucket-locations#location-mr
            .setLocation("asia")
            .build());

Node.js

For more information, see the Cloud Storage Node.js API reference documentation.

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following line before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';

// Creates a new bucket
await storage.createBucket(bucketName, {
  location: 'ASIA',
  storageClass: 'COLDLINE',
});

console.log(`Bucket ${bucketName} created.`);

PHP

For more information, see the Cloud Storage PHP API reference documentation.

use Google\Cloud\Storage\StorageClient;

/**
 * Create a Cloud Storage Bucket.
 *
 * @param string $bucketName name of the bucket to create.
 * @param string $options options for the new bucket.
 *
 * @return Google\Cloud\Storage\Bucket the newly created bucket.
 */
function create_bucket($bucketName, $options = [])
{
    $storage = new StorageClient();
    $bucket = $storage->createBucket($bucketName, $options);
    printf('Bucket created: %s' . PHP_EOL, $bucket->name());
}

Python

For more information, see the Cloud Storage Python API reference documentation.

from google.cloud import storage

def create_bucket(bucket_name):
    """Creates a new bucket."""
    storage_client = storage.Client()
    bucket = storage_client.create_bucket(bucket_name)
    print('Bucket {} created'.format(bucket.name))
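
Unlike the other language samples above, the Python snippet does not set a default storage class or location. If you want parity, one possible approach (assuming a reasonably recent google-cloud-storage release whose create_bucket accepts a location argument) is a sketch like this:

from google.cloud import storage

def create_bucket_class_location(bucket_name):
    """Creates a bucket with the COLDLINE storage class in the asia location."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    bucket.storage_class = 'COLDLINE'
    new_bucket = storage_client.create_bucket(bucket, location='asia')
    print('Bucket {} created in {} with storage class {}'.format(
        new_bucket.name, new_bucket.location, new_bucket.storage_class))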

Ruby

For more information, see the Cloud Storage Ruby API reference documentation.

# project_id    = "Your Google Cloud project ID"
# bucket_name   = "Name of Google Cloud Storage bucket to create"
# location      = "Location of where to create Cloud Storage bucket"
# storage_class = "Storage class of Cloud Storage bucket"

require "google/cloud/storage"

storage = Google::Cloud::Storage.new project_id: project_id
bucket  = storage.create_bucket bucket_name,
                                location:      location,
                                storage_class: storage_class

puts "Created bucket #{bucket.name} in #{location}" +
     " with #{storage_class} class"

REST APIs

JSON API

For information on available storage classes to set as the default for your bucket, see Storage Classes.

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials.
  2. Create a .json file that contains the following information, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     {
       "name": "[BUCKET_NAME]",
       "location": "[BUCKET_LOCATION]",
       "storageClass": "[STORAGE_CLASS]"
     }

  3. Use cURL to call the JSON API, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     curl -X POST --data-binary @[JSON_FILE_NAME].json \
          -H "Authorization: Bearer [OAUTH2_TOKEN]" \
          -H "Content-Type: application/json" \
          "https://www.googleapis.com/storage/v1/b?project=[PROJECT_ID]"

XML API

For information on available storage classes to set as the default for your bucket, see Storage Classes.

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials.
  2. Create a .xml file that contains the following information, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     <CreateBucketConfiguration>
       <LocationConstraint>[BUCKET_LOCATION]</LocationConstraint>
       <StorageClass>[STORAGE_CLASS]</StorageClass>
     </CreateBucketConfiguration>

  3. Use cURL to call the XML API, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     curl -X PUT --data-binary @[XML_FILE_NAME].xml \
          -H "Authorization: Bearer [OAUTH2_TOKEN]" \
          -H "x-goog-project-id: [PROJECT_ID]" \
          "https://storage.googleapis.com/[BUCKET_NAME]"

Step 2) Copy files from your old bucket to your new bucket

Console

Use the Cloud Storage Transfer Service from within the Google Cloud Platform Console to copy data from one Cloud Storage bucket to another:

  1. Open the Transfer page in the Google Cloud Platform Console.

  2. Click Create transfer job.
  3. Follow the step-by-step walkthrough, clicking Continue as you complete each step:

    • Select Source: Use Google Cloud Storage Bucket as your selected source, and click Browse to find and select the bucket you want to move your files out of.

    • Select Destination: Click Browse to find and select the bucket you want to move your files into.

      Additionally, select the checkbox Delete source objects after the transfer completes.

    • Configure Transfer: You can ignore this section.

  4. After you complete the step-by-step walkthrough, click Create.

    This begins the process of copying files from your old bucket into your new one. This process may take some time; however, after you click Create, you can navigate away from the Google Cloud Platform Console.

To view the transfer's progress, open the Transfer page in the Google Cloud Platform Console.

gsutil

  1. Make sure you have at least gsutil 4.12 installed.
  2. Use the gsutil cp command, with the -r option, to recursively copy all your files from the source bucket to the destination bucket. Replace [VALUES_IN_BRACKETS] with the appropriate values:

    gsutil cp -r gs://[SOURCE_BUCKET]/* gs://[DESTINATION_BUCKET]

REST APIs

JSON API

Use the JSON API's rewrite method to copy data in size-limited chunks over multiple requests. When doing so, you must call the rewrite method in a loop until all of the data is moved:

  1. Use cURL and the JSON API rewrite method to copy data from a source bucket to a destination bucket, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     curl -X POST -H "Authorization: Bearer [OAUTH2_TOKEN]" \
          -H "Content-Length: 0" \
          "https://www.googleapis.com/storage/v1/b/[SOURCE_BUCKET]/o/[OBJECT_NAME]/rewriteTo/b/[DESTINATION_BUCKET]/o/[OBJECT_NAME]"

    If the object is, for example, 10 GB in size, the response to this request looks similar to the following example:

    {
      "kind": "storage#rewriteResponse",
      "totalBytesRewritten": 1048576,
      "objectSize": 10000000000,
      "done": false,
      "rewriteToken": [TOKEN_VALUE]
    }
    
  2. Use the rewriteToken in a subsequent request to continue copying data, replacing [VALUES_IN_BRACKETS] with the appropriate values:

     curl -X POST -H "Authorization: Bearer [OAUTH2_TOKEN]" \
          -H "Content-Length: 0" \
          -d '{"rewriteToken": "[TOKEN_VALUE]"}' \
          "https://www.googleapis.com/storage/v1/b/[SOURCE_BUCKET]/o/[OBJECT_NAME]/rewriteTo/b/[DESTINATION_BUCKET]/o/[OBJECT_NAME]"

    When all of the data is copied, the last response has a done property equal to true, there is no rewriteToken property, and the metadata of the copied-to object is included in the resource property.

    {
      "kind": "storage#rewriteResponse",
      "totalBytesRewritten": 10000000000,
      "objectSize": 10000000000,
      "done": true,
      "resource": objects Resource
    }
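
    The rewrite loop above can also be written with the Python client library, whose Blob.rewrite method wraps this JSON API call; the sketch below (bucket and object names are placeholders) is one way to do it:

    from google.cloud import storage

    def copy_object_with_rewrite(source_bucket_name, destination_bucket_name, blob_name):
        """Copies one object between buckets, looping on the rewrite token until done."""
        client = storage.Client()
        source_blob = client.bucket(source_bucket_name).blob(blob_name)
        destination_blob = client.bucket(destination_bucket_name).blob(blob_name)

        token, bytes_rewritten, total_bytes = destination_blob.rewrite(source_blob)
        while token is not None:
            # Each call copies another chunk; the token resumes where the previous call stopped.
            token, bytes_rewritten, total_bytes = destination_blob.rewrite(source_blob, token=token)
        print('Copied {} of {} bytes.'.format(bytes_rewritten, total_bytes))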
    

Step 3) Delete the files from your old bucket

Console

You don't need to do anything to delete the files from your old bucket: as part of copying files using the Transfer Service, old files are deleted automatically (this assumes you selected the "Delete source objects after the transfer completes" checkbox).

gsutil

Use the gsutil rm command, with the -r option, to recursively delete all your files from the source bucket, as well as the source bucket itself. Replace [VALUES_IN_BRACKETS] with the appropriate values:

gsutil rm -r gs://[SOURCE_BUCKET]

Or, to delete the files but keep the source bucket itself (the -a flag also removes any noncurrent object versions):

gsutil rm -a gs://[SOURCE_BUCKET]/**

REST APIs

JSON API

  1. Use cURL and the JSON API delete method to remove the original copies of your data, one object per request, replacing [VALUES_IN_BRACKETS] with the appropriate values:
    curl -X DELETE -H "Authorization: Bearer [OAUTH2_TOKEN]" \
    https://www.googleapis.com/storage/v1/b/[SOURCE_BUCKET]/o/[OBJECT_NAME]

If successful, the method returns an empty response.
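
If you are scripting this step with a client library instead of cURL, a short Python sketch along these lines (the bucket name is a placeholder) deletes the remaining objects and then the now-empty source bucket:

from google.cloud import storage

def delete_source_bucket(source_bucket_name):
    """Deletes every object in the source bucket, then the bucket itself."""
    client = storage.Client()
    bucket = client.get_bucket(source_bucket_name)
    for blob in client.list_blobs(source_bucket_name):
        blob.delete()
    bucket.delete()  # fails unless the bucket is empty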
