Use dual-region storage



This page describes how to use dual-region storage.

Prerequisites

Prerequisites can vary based on the tool used:

Console

In order to complete this guide using the Google Cloud console, you must have the proper IAM permissions. If you are working in a project that you did not create, you might need the project owner to give you a role that contains the necessary permissions.

For a list of permissions required for specific actions, see IAM permissions for the Google Cloud console.

For a list of relevant roles, see Cloud Storage roles. Alternatively, you can create a custom role that has specific, limited permissions.

Command line

In order to complete this guide using a command-line utility, you must have the proper IAM permissions. If you are working in a project that you did not create, you might need the project owner to give you a role that contains the necessary permissions.

For a list of permissions required for specific actions, see IAM permissions for gsutil commands.

For a list of relevant roles, see Cloud Storage roles. Alternatively, you can create a custom role that has specific, limited permissions.

Code samples

In order to complete this guide using the Cloud Storage client libraries, you must have the proper IAM permissions. If you are working in a project that you did not create, you might need the project owner to give you a role that contains the necessary permissions. Unless otherwise noted, client library requests are made through the JSON API.

For a list of permissions required for specific actions, see IAM permissions for JSON methods.

For a list of relevant roles, see Cloud Storage roles. Alternatively, you can create a custom role that has specific, limited permissions.

REST APIs

JSON API

In order to complete this guide using the JSON API, you must have the proper IAM permissions. If you are working in a project that you did not create, you might need the project owner to give you a role that contains the necessary permissions.

For a list of permissions required for specific actions, see IAM permissions for JSON methods.

For a list of relevant roles, see Cloud Storage roles. Alternatively, you can create a custom role that has specific, limited permissions.

Create a dual-region bucket

Complete the following steps to create a dual-region bucket:

Console

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    Go to Buckets

  2. Click Create.

  3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.

    1. For Name your bucket, enter a name that meets the bucket naming requirements.

    2. For Choose where to store your data, next to Location type, choose Dual-region. Optional: To use turbo replication with the bucket, select the Add turbo replication checkbox.

    3. For Location, select the multi-region and the underlying regions you want to use.

    4. For Choose a default storage class for your data, select a storage class for the bucket. This default storage class is assigned to all objects uploaded to the bucket.

    5. For Choose how to control access to objects, select the public access prevention and access control options you want to use.

    6. For Choose how to protect object data, select the protection tools you want to use, such as object versioning, a retention policy, and an encryption method.

  4. Click Create.

    To learn how to get detailed error information about failed Cloud Storage operations in the Google Cloud console, see Troubleshooting.

Command line

Use the gsutil mb command with the -l and --placement flags:

gsutil mb -l MULTI-REGION --placement REGION_1,REGION_2 gs://BUCKET_NAME/

Where:

  • MULTI-REGION specifies the multi-region associated with the underlying regions. For example, if choosing the regions ASIA-EAST1 and ASIA-SOUTHEAST1, use ASIA.

  • REGION_1 specifies the geographic location of a region for your bucket. For example, ASIA-EAST1.

  • REGION_2 specifies the geographic location of a second region for your bucket. For example, ASIA-SOUTHEAST1.

  • BUCKET_NAME is the name of the relevant bucket. For example, my-bucket.

    If the request is successful, the command returns the following message:

    Creating gs://BUCKET_NAME/...

    If you specify an invalid region combination, such as unsupported regions or regions on different continents, the command returns an error.
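
For example, using the placeholder bucket name my-bucket and the ASIA pairing described above, the command might look like the following:

gsutil mb -l ASIA --placement ASIA-EAST1,ASIA-SOUTHEAST1 gs://my-bucket/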

Code samples

C++

For more information, see the Cloud Storage C++ API reference documentation.

namespace gcs = ::google::cloud::storage;
using ::google::cloud::StatusOr;
[](gcs::Client client, std::string const& bucket_name,
   std::string const& region_a, std::string const& region_b) {
  auto metadata = client.CreateBucket(
      bucket_name,
      gcs::BucketMetadata().set_custom_placement_config(
          gcs::BucketCustomPlacementConfig{{region_a, region_b}}));
  if (!metadata) throw std::runtime_error(metadata.status().message());

  std::cout << "Bucket " << metadata->name() << " created."
            << "\nFull Metadata: " << *metadata << "\n";
}

C#

For more information, see the Cloud Storage C# API reference documentation.


using Google.Apis.Storage.v1.Data;
using Google.Cloud.Storage.V1;
using System;

public class CreateDualRegionBucketSample
{
    public Bucket CreateDualRegionBucket(
        string projectId = "your-project-id",
        string bucketName = "your-unique-bucket-name",
        string location = "your-location",
        string region1 = "your-region1-name",
        string region2 = "your-region2-name")
    {
        var client = StorageClient.Create();

        var bucket = new Bucket
        {
            Name = bucketName,
            Location = location,
            CustomPlacementConfig = new Bucket.CustomPlacementConfigData
            {
                DataLocations = new[] { region1, region2 }
            }
        };

        var storageBucket = client.CreateBucket(projectId, bucket);

        Console.WriteLine($"Created storage bucket {storageBucket.Name}" +
            $" in {storageBucket.Location}" +
            $" with location-type {storageBucket.LocationType} and" +
            $" dataLocations {string.Join(",", storageBucket.CustomPlacementConfig.DataLocations)}.");

        return storageBucket;
    }

}

Go

For more information, see the Cloud Storage Go API reference documentation.

import (
	"context"
	"fmt"
	"io"
	"time"

	"cloud.google.com/go/storage"
)

// createBucketDualRegion creates a new dual-region bucket in the project in the
// provided location and regions.
// See https://cloud.google.com/storage/docs/locations#location-dr for more information.
func createBucketDualRegion(w io.Writer, projectID, bucketName string) error {
	// projectID := "my-project-id"
	// bucketName := "bucket-name"
	location := "US"
	region1 := "US-EAST1"
	region2 := "US-WEST1"

	ctx := context.Background()

	client, err := storage.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("storage.NewClient: %v", err)
	}
	defer client.Close()

	ctx, cancel := context.WithTimeout(ctx, time.Second*30)
	defer cancel()

	storageDualRegion := &storage.BucketAttrs{
		Location: location,
		CustomPlacementConfig: &storage.CustomPlacementConfig{
			DataLocations: []string{region1, region2},
		},
	}
	bucket := client.Bucket(bucketName)
	if err := bucket.Create(ctx, projectID, storageDualRegion); err != nil {
		return fmt.Errorf("Bucket(%q).Create: %v", bucketName, err)
	}

	attrs, err := bucket.Attrs(ctx)
	if err != nil {
		return fmt.Errorf("Bucket(%q).Attrs: %v", bucketName, err)
	}
	fmt.Fprintf(w, "Created bucket %v", bucketName)
	fmt.Fprintf(w, " - location: %v", attrs.Location)
	fmt.Fprintf(w, " - locationType: %v", attrs.LocationType)
	fmt.Fprintf(w, " - customPlacementConfig.dataLocations: %v", attrs.CustomPlacementConfig.DataLocations)
	return nil
}

Java

For more information, see the Cloud Storage Java API reference documentation.


import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.BucketInfo.CustomPlacementConfig;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.util.Arrays;

public class CreateBucketDualRegion {

  public static void createBucketDualRegion(
      String projectId,
      String bucketName,
      String location,
      String firstRegion,
      String secondRegion) {
    // The ID of your GCP project.
    // String projectId = "your-project-id";

    // The ID to give your GCS bucket.
    // String bucketName = "your-unique-bucket-name";

    // The location your dual regions will be located in.
    // String location = "US";

    // One of the regions the dual region bucket is to be created in.
    // String firstRegion = "US-EAST1";

    // The second region the dual region bucket is to be created in.
    // String secondRegion = "US-WEST1";

    // See this documentation for other valid locations and regions:
    // https://cloud.google.com/storage/docs/locations

    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();

    CustomPlacementConfig config =
        CustomPlacementConfig.newBuilder()
            .setDataLocations(Arrays.asList(firstRegion, secondRegion))
            .build();

    BucketInfo bucketInfo =
        BucketInfo.newBuilder(bucketName)
            .setLocation(location)
            .setCustomPlacementConfig(config)
            .build();

    Bucket bucket = storage.create(bucketInfo);

    System.out.println(
        "Created bucket "
            + bucket.getName()
            + " in location "
            + bucket.getLocation()
            + " with regions "
            + bucket.getCustomPlacementConfig().getDataLocations().toString());
  }
}

Node.js

For more information, see the Cloud Storage Node.js API reference documentation.

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The bucket's pair of regions. Case-insensitive.
// See this documentation for other valid locations:
// https://cloud.google.com/storage/docs/locations
// const location = 'US';
// const region1 = 'US-EAST1';
// const region2 = 'US-WEST1';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
// The bucket in the sample below will be created in the project associated with this client.
// For more information, please see https://cloud.google.com/docs/authentication/production or https://googleapis.dev/nodejs/storage/latest/Storage.html
const storage = new Storage();

async function createDualRegionBucket() {
  // For regions supporting dual-regions see: https://cloud.google.com/storage/docs/locations
  const [bucket] = await storage.createBucket(bucketName, {
    location,
    customPlacementConfig: {
      dataLocations: [region1, region2],
    },
  });

  console.log(`Created '${bucket.name}'`);
  console.log(`- location: '${bucket.metadata.location}'`);
  console.log(`- locationType: '${bucket.metadata.locationType}'`);
  console.log(
    `- customPlacementConfig: '${JSON.stringify(
      bucket.metadata.customPlacementConfig
    )}'`
  );
}

createDualRegionBucket().catch(console.error);

PHP

For more information, see the Cloud Storage PHP API reference documentation.

use Google\Cloud\Storage\StorageClient;

/**
 * Create a new dual-region bucket with a custom location and pair of regions.
 *
 * @param string $bucketName The name of your Cloud Storage bucket.
 * @param string $location Location for the bucket's regions. Case-insensitive.
 * @param string $region1 First region for the bucket's regions. Case-insensitive.
 * @param string $region2 Second region for the bucket's regions. Case-insensitive.
 */
function create_bucket_dual_region(string $bucketName, string $location, string $region1, string $region2): void
{
    // $bucketName = 'my-bucket';
    // $location = 'US';
    // $region1 = 'US-EAST1';
    // $region2 = 'US-WEST1';

    $storage = new StorageClient();
    $bucket = $storage->createBucket($bucketName, [
        'location' => $location,
        'customPlacementConfig' => [
            'dataLocations' => [$region1, $region2],
        ],
    ]);

    $info = $bucket->info();

    printf("Created '%s':", $bucket->name());
    printf("- location: '%s'", $info['location']);
    printf("- locationType: '%s'", $info['locationType']);
    printf("- customPlacementConfig: '%s'" . PHP_EOL, print_r($info['customPlacementConfig'], true));
}

Python

For more information, see the Cloud Storage Python API reference documentation.

from google.cloud import storage


def create_bucket_dual_region(bucket_name, location, region_1, region_2):
    """Creates a Dual-Region Bucket with provided location and regions.."""
    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The bucket's pair of regions. Case-insensitive.
    # See this documentation for other valid locations:
    # https://cloud.google.com/storage/docs/locations
    # region_1 = "US-EAST1"
    # region_2 = "US-WEST1"
    # location = "US"

    storage_client = storage.Client()
    bucket = storage_client.create_bucket(
        bucket_name, location=location, data_locations=[region_1, region_2]
    )

    print(f"Created bucket {bucket_name}")
    print(f" - location: {bucket.location}")
    print(f" - location_type: {bucket.location_type}")
    print(f" - customPlacementConfig data_locations: {bucket.data_locations}")

Ruby

For more information, see the Cloud Storage Ruby API reference documentation.

# The ID of your GCS bucket
# bucket_name = "your-bucket-name"

# The bucket's pair of regions. Case-insensitive.
# See this documentation for other valid locations:
# https://cloud.google.com/storage/docs/locations
# region_1 = "US-EAST1"
# region_2 = "US-WEST1"

require "google/cloud/storage"

storage = Google::Cloud::Storage.new
bucket  = storage.create_bucket bucket_name,
                                custom_placement_config: { data_locations: [region_1, region_2] }

puts "Bucket #{bucket.name} created:"
puts "- location: #{bucket.location}"
puts "- location_type: #{bucket.location_type}"
puts "- custom_placement_config:"
puts "  - data_locations: #{bucket.data_locations}"

REST APIs

JSON API

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials. For instructions, see API authentication.
  2. Create a .json file that contains the settings for the bucket, which must include a name and location. See the Buckets:Insert documentation for a complete list of settings. The following are common settings to include:

    {
      "name": "BUCKET_NAME",
      "location": "MULTI-REGION",
      "customPlacementConfig": {
        "dataLocations": ["REGION_1", "REGION_2"]
      },
      "storageClass": "STORAGE_CLASS"
    }

    Where:

    • BUCKET_NAME is the name you want to give your bucket, subject to naming requirements. For example, my-bucket.
    • MULTI-REGION specifies the multi-region associated with the underlying regions. For example, if choosing the regions ASIA-EAST1 and ASIA-SOUTHEAST1, use ASIA.
    • REGION_1 and REGION_2 are the regions where you want to store your bucket's object data. For example, ASIA-EAST1 and ASIA-SOUTHEAST1.
    • STORAGE_CLASS is the storage class of your bucket. For example, STANDARD.

  3. Use cURL to call the JSON API:

    curl -X POST --data-binary @JSON_FILE_NAME.json \
     -H "Authorization: Bearer OAUTH2_TOKEN" \
     -H "Content-Type: application/json" \
     "https://storage.googleapis.com/storage/v1/b?project=PROJECT_ID"

    Where:

    • JSON_FILE_NAME is the name of the JSON file you created in Step 2.
    • OAUTH2_TOKEN is the access token you generated in Step 1.
    • PROJECT_ID is the ID of the project with which your bucket will be associated. For example, my-project.
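
    As a concrete sketch of these steps, suppose the file is named bucket.json and uses the example values from this page (my-bucket, my-project, and the ASIA dual-region pairing). The file would contain:

    {
      "name": "my-bucket",
      "location": "ASIA",
      "customPlacementConfig": {
        "dataLocations": ["ASIA-EAST1", "ASIA-SOUTHEAST1"]
      },
      "storageClass": "STANDARD"
    }

    The corresponding request, with OAUTH2_TOKEN left as a placeholder for the token from Step 1, would be:

    curl -X POST --data-binary @bucket.json \
     -H "Authorization: Bearer OAUTH2_TOKEN" \
     -H "Content-Type: application/json" \
     "https://storage.googleapis.com/storage/v1/b?project=my-project"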

XML API

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials. For instructions, see API authentication.
  2. Create an .xml file that contains the following information:

      <CreateBucketConfiguration>
         <LocationConstraint>MULTI-REGION</LocationConstraint>
         <CustomPlacementConfig>
            <DataLocations>
               <DataLocation>REGION_1</DataLocation>
               <DataLocation>REGION_2</DataLocation>
            </DataLocations>
         </CustomPlacementConfig>
         <StorageClass>STORAGE_CLASS</StorageClass>
      </CreateBucketConfiguration>
     

    Where:

    • MULTI-REGION specifies the multi-region associated with the underlying regions. For example, if choosing the regions ASIA-EAST1 and ASIA-SOUTHEAST1, use ASIA.
    • REGION_1 and REGION_2 are the regions where you want to store your bucket's object data. For example, ASIA-EAST1 and ASIA-SOUTHEAST1.
    • STORAGE_CLASS is the default storage class of your bucket. For example, STANDARD.

  3. Use cURL to call the XML API:

      curl -X PUT --data-binary @XML_FILE_NAME.xml \
      -H "Authorization: Bearer OAUTH2_TOKEN" \
      -H "x-goog-project-id: PROJECT_ID" \
      "https://storage.googleapis.com/BUCKET_NAME"
    

    Where:

    • XML_FILE_NAME is the name of the XML file you created in Step 2.
    • OAUTH2_TOKEN is the access token you generated in Step 1.
    • PROJECT_ID is the ID of the project with which your bucket will be associated. For example, my-project.
    • BUCKET_NAME is the name you want to give your bucket, subject to bucket naming requirements. For example, my-bucket.

    If the request includes unsupported regions, an error message is returned. If the request is successful, no response body is returned.
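
    As a concrete sketch using the same example values (my-bucket, my-project, and the ASIA dual-region pairing), with the configuration saved as bucket.xml and OAUTH2_TOKEN left as a placeholder:

      <CreateBucketConfiguration>
         <LocationConstraint>ASIA</LocationConstraint>
         <CustomPlacementConfig>
            <DataLocations>
               <DataLocation>ASIA-EAST1</DataLocation>
               <DataLocation>ASIA-SOUTHEAST1</DataLocation>
            </DataLocations>
         </CustomPlacementConfig>
         <StorageClass>STANDARD</StorageClass>
      </CreateBucketConfiguration>

      curl -X PUT --data-binary @bucket.xml \
      -H "Authorization: Bearer OAUTH2_TOKEN" \
      -H "x-goog-project-id: my-project" \
      "https://storage.googleapis.com/my-bucket"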

List the regions used for a dual-region bucket

Complete the following steps to list the bucket details, including the regions used for storage:

Console

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    Go to Buckets

  2. In the bucket list, click the name of the bucket you want to verify.

  3. Click the Configuration tab to view bucket details such as the included regions, storage class, permissions, and replication type.

Command line

Use the gsutil ls -Lb command:

gsutil ls -Lb gs://BUCKET_NAME

Where:

  • BUCKET_NAME is the name of the relevant bucket. For example, my-bucket.

    If the request is successful, the command returns a response similar to the following:

gs://my-bucket/ :
  Storage class:              STANDARD
  Location type:              dual-region
  Location constraint:        US
  Placement locations:        ["US-CENTRAL1", "US-WEST1"]
  ...
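
If you only need the placement information, you can optionally filter the output with a standard tool such as grep, for example:

gsutil ls -Lb gs://my-bucket | grep "Placement locations"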

Code samples

C++

For more information, see the Cloud Storage C++ API reference documentation.

namespace gcs = ::google::cloud::storage;
using ::google::cloud::StatusOr;
[](gcs::Client client, std::string const& bucket_name) {
  StatusOr<gcs::BucketMetadata> bucket_metadata =
      client.GetBucketMetadata(bucket_name);

  if (!bucket_metadata) {
    throw std::runtime_error(bucket_metadata.status().message());
  }

  std::cout << "The metadata for bucket " << bucket_metadata->name() << " is "
            << *bucket_metadata << "\n";
}

C#

For more information, see the Cloud Storage C# API reference documentation.


using Google.Apis.Storage.v1.Data;
using Google.Cloud.Storage.V1;
using System;

public class GetBucketMetadataSample
{
    public Bucket GetBucketMetadata(string bucketName = "your-unique-bucket-name")
    {
        var storage = StorageClient.Create();
        var bucket = storage.GetBucket(bucketName, new GetBucketOptions { Projection = Projection.Full });
        Console.WriteLine($"Bucket:\t{bucket.Name}");
        Console.WriteLine($"Acl:\t{bucket.Acl}");
        Console.WriteLine($"Billing:\t{bucket.Billing}");
        Console.WriteLine($"Cors:\t{bucket.Cors}");
        Console.WriteLine($"DefaultEventBasedHold:\t{bucket.DefaultEventBasedHold}");
        Console.WriteLine($"DefaultObjectAcl:\t{bucket.DefaultObjectAcl}");
        Console.WriteLine($"Encryption:\t{bucket.Encryption}");
        if (bucket.Encryption != null)
        {
            Console.WriteLine($"KmsKeyName:\t{bucket.Encryption.DefaultKmsKeyName}");
        }
        Console.WriteLine($"Id:\t{bucket.Id}");
        Console.WriteLine($"Kind:\t{bucket.Kind}");
        Console.WriteLine($"Lifecycle:\t{bucket.Lifecycle}");
        Console.WriteLine($"Location:\t{bucket.Location}");
        Console.WriteLine($"LocationType:\t{bucket.LocationType}");
        Console.WriteLine($"Logging:\t{bucket.Logging}");
        Console.WriteLine($"Metageneration:\t{bucket.Metageneration}");
        Console.WriteLine($"Owner:\t{bucket.Owner}");
        Console.WriteLine($"ProjectNumber:\t{bucket.ProjectNumber}");
        Console.WriteLine($"RetentionPolicy:\t{bucket.RetentionPolicy}");
        Console.WriteLine($"SelfLink:\t{bucket.SelfLink}");
        Console.WriteLine($"StorageClass:\t{bucket.StorageClass}");
        Console.WriteLine($"TimeCreated:\t{bucket.TimeCreated}");
        Console.WriteLine($"Updated:\t{bucket.Updated}");
        Console.WriteLine($"Versioning:\t{bucket.Versioning}");
        Console.WriteLine($"Website:\t{bucket.Website}");
        Console.WriteLine($"TurboReplication:\t{bucket.Rpo}");
        if (bucket.Labels != null)
        {
            Console.WriteLine("Labels:");
            foreach (var label in bucket.Labels)
            {
                Console.WriteLine($"{label.Key}:\t{label.Value}");
            }
        }
        return bucket;
    }
}

Go

For more information, see the Cloud Storage Go API reference documentation.

import (
	"context"
	"fmt"
	"io"
	"time"

	"cloud.google.com/go/storage"
)

// getBucketMetadata gets the bucket metadata.
func getBucketMetadata(w io.Writer, bucketName string) (*storage.BucketAttrs, error) {
	// bucketName := "bucket-name"
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		return nil, fmt.Errorf("storage.NewClient: %v", err)
	}
	defer client.Close()

	ctx, cancel := context.WithTimeout(ctx, time.Second*10)
	defer cancel()
	attrs, err := client.Bucket(bucketName).Attrs(ctx)
	if err != nil {
		return nil, fmt.Errorf("Bucket(%q).Attrs: %v", bucketName, err)
	}
	fmt.Fprintf(w, "BucketName: %v\n", attrs.Name)
	fmt.Fprintf(w, "Location: %v\n", attrs.Location)
	fmt.Fprintf(w, "LocationType: %v\n", attrs.LocationType)
	fmt.Fprintf(w, "StorageClass: %v\n", attrs.StorageClass)
	fmt.Fprintf(w, "Turbo replication (RPO): %v\n", attrs.RPO)
	fmt.Fprintf(w, "TimeCreated: %v\n", attrs.Created)
	fmt.Fprintf(w, "Metageneration: %v\n", attrs.MetaGeneration)
	fmt.Fprintf(w, "PredefinedACL: %v\n", attrs.PredefinedACL)
	if attrs.Encryption != nil {
		fmt.Fprintf(w, "DefaultKmsKeyName: %v\n", attrs.Encryption.DefaultKMSKeyName)
	}
	if attrs.Website != nil {
		fmt.Fprintf(w, "IndexPage: %v\n", attrs.Website.MainPageSuffix)
		fmt.Fprintf(w, "NotFoundPage: %v\n", attrs.Website.NotFoundPage)
	}
	fmt.Fprintf(w, "DefaultEventBasedHold: %v\n", attrs.DefaultEventBasedHold)
	if attrs.RetentionPolicy != nil {
		fmt.Fprintf(w, "RetentionEffectiveTime: %v\n", attrs.RetentionPolicy.EffectiveTime)
		fmt.Fprintf(w, "RetentionPeriod: %v\n", attrs.RetentionPolicy.RetentionPeriod)
		fmt.Fprintf(w, "RetentionPolicyIsLocked: %v\n", attrs.RetentionPolicy.IsLocked)
	}
	fmt.Fprintf(w, "RequesterPays: %v\n", attrs.RequesterPays)
	fmt.Fprintf(w, "VersioningEnabled: %v\n", attrs.VersioningEnabled)
	if attrs.Logging != nil {
		fmt.Fprintf(w, "LogBucket: %v\n", attrs.Logging.LogBucket)
		fmt.Fprintf(w, "LogObjectPrefix: %v\n", attrs.Logging.LogObjectPrefix)
	}
	if attrs.CORS != nil {
		fmt.Fprintln(w, "CORS:")
		for _, v := range attrs.CORS {
			fmt.Fprintf(w, "\tMaxAge: %v\n", v.MaxAge)
			fmt.Fprintf(w, "\tMethods: %v\n", v.Methods)
			fmt.Fprintf(w, "\tOrigins: %v\n", v.Origins)
			fmt.Fprintf(w, "\tResponseHeaders: %v\n", v.ResponseHeaders)
		}
	}
	if attrs.Labels != nil {
		fmt.Fprintf(w, "\n\n\nLabels:")
		for key, value := range attrs.Labels {
			fmt.Fprintf(w, "\t%v = %v\n", key, value)
		}
	}
	return attrs, nil
}

Java

For more information, see the Cloud Storage Java API reference documentation.


import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.util.Map;

public class GetBucketMetadata {
  public static void getBucketMetadata(String projectId, String bucketName) {
    // The ID of your GCP project
    // String projectId = "your-project-id";

    // The ID of your GCS bucket
    // String bucketName = "your-unique-bucket-name";

    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();

    // Select all fields. Fields can be selected individually e.g. Storage.BucketField.NAME
    Bucket bucket =
        storage.get(bucketName, Storage.BucketGetOption.fields(Storage.BucketField.values()));

    // Print bucket metadata
    System.out.println("BucketName: " + bucket.getName());
    System.out.println("DefaultEventBasedHold: " + bucket.getDefaultEventBasedHold());
    System.out.println("DefaultKmsKeyName: " + bucket.getDefaultKmsKeyName());
    System.out.println("Id: " + bucket.getGeneratedId());
    System.out.println("IndexPage: " + bucket.getIndexPage());
    System.out.println("Location: " + bucket.getLocation());
    System.out.println("LocationType: " + bucket.getLocationType());
    System.out.println("Metageneration: " + bucket.getMetageneration());
    System.out.println("NotFoundPage: " + bucket.getNotFoundPage());
    System.out.println("RetentionEffectiveTime: " + bucket.getRetentionEffectiveTime());
    System.out.println("RetentionPeriod: " + bucket.getRetentionPeriod());
    System.out.println("RetentionPolicyIsLocked: " + bucket.retentionPolicyIsLocked());
    System.out.println("RequesterPays: " + bucket.requesterPays());
    System.out.println("SelfLink: " + bucket.getSelfLink());
    System.out.println("StorageClass: " + bucket.getStorageClass().name());
    System.out.println("TimeCreated: " + bucket.getCreateTime());
    System.out.println("VersioningEnabled: " + bucket.versioningEnabled());
    if (bucket.getLabels() != null) {
      System.out.println("\n\n\nLabels:");
      for (Map.Entry<String, String> label : bucket.getLabels().entrySet()) {
        System.out.println(label.getKey() + "=" + label.getValue());
      }
    }
    if (bucket.getLifecycleRules() != null) {
      System.out.println("\n\n\nLifecycle Rules:");
      for (BucketInfo.LifecycleRule rule : bucket.getLifecycleRules()) {
        System.out.println(rule);
      }
    }
  }
}

Node.js

For more information, see the Cloud Storage Node.js API reference documentation.

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function getBucketMetadata() {
  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // The ID of your GCS bucket
  // const bucketName = 'your-unique-bucket-name';

  // Get Bucket Metadata
  const [metadata] = await storage.bucket(bucketName).getMetadata();

  console.log(JSON.stringify(metadata, null, 2));
}

PHP

For more information, see the Cloud Storage PHP API reference documentation.

use Google\Cloud\Storage\StorageClient;

/**
 * Get bucket metadata.
 *
 * @param string $bucketName The name of your Cloud Storage bucket.
 */
function get_bucket_metadata(string $bucketName): void
{
    // $bucketName = 'my-bucket';

    $storage = new StorageClient();
    $bucket = $storage->bucket($bucketName);
    $info = $bucket->info();

    printf('Bucket Metadata: %s' . PHP_EOL, print_r($info, true));
}

Python

For more information, see the Cloud Storage Python API reference documentation.


from google.cloud import storage


def bucket_metadata(bucket_name):
    """Prints out a bucket's metadata."""
    # bucket_name = 'your-bucket-name'

    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)

    print(f"ID: {bucket.id}")
    print(f"Name: {bucket.name}")
    print(f"Storage Class: {bucket.storage_class}")
    print(f"Location: {bucket.location}")
    print(f"Location Type: {bucket.location_type}")
    print(f"Cors: {bucket.cors}")
    print(f"Default Event Based Hold: {bucket.default_event_based_hold}")
    print(f"Default KMS Key Name: {bucket.default_kms_key_name}")
    print(f"Metageneration: {bucket.metageneration}")
    print(
        f"Public Access Prevention: {bucket.iam_configuration.public_access_prevention}"
    )
    print(f"Retention Effective Time: {bucket.retention_policy_effective_time}")
    print(f"Retention Period: {bucket.retention_period}")
    print(f"Retention Policy Locked: {bucket.retention_policy_locked}")
    print(f"Requester Pays: {bucket.requester_pays}")
    print(f"Self Link: {bucket.self_link}")
    print(f"Time Created: {bucket.time_created}")
    print(f"Versioning Enabled: {bucket.versioning_enabled}")
    print(f"Labels: {bucket.labels}")

Ruby

For more information, see the Cloud Storage Ruby API reference documentation.

def get_bucket_metadata bucket_name:
  # The ID of your GCS bucket
  # bucket_name = "your-unique-bucket-name"

  require "google/cloud/storage"

  storage = Google::Cloud::Storage.new
  bucket  = storage.bucket bucket_name

  puts "ID:                       #{bucket.id}"
  puts "Name:                     #{bucket.name}"
  puts "Storage Class:            #{bucket.storage_class}"
  puts "Location:                 #{bucket.location}"
  puts "Location Type:            #{bucket.location_type}"
  puts "Cors:                     #{bucket.cors}"
  puts "Default Event Based Hold: #{bucket.default_event_based_hold?}"
  puts "Default KMS Key Name:     #{bucket.default_kms_key}"
  puts "Logging Bucket:           #{bucket.logging_bucket}"
  puts "Logging Prefix:           #{bucket.logging_prefix}"
  puts "Metageneration:           #{bucket.metageneration}"
  puts "Retention Effective Time: #{bucket.retention_effective_at}"
  puts "Retention Period:         #{bucket.retention_period}"
  puts "Retention Policy Locked:  #{bucket.retention_policy_locked?}"
  puts "Requester Pays:           #{bucket.requester_pays}"
  puts "Self Link:                #{bucket.api_url}"
  puts "Time Created:             #{bucket.created_at}"
  puts "Versioning Enabled:       #{bucket.versioning?}"
  puts "Index Page:               #{bucket.website_main}"
  puts "Not Found Page:           #{bucket.website_404}"
  puts "Labels:"
  bucket.labels.each do |key, value|
    puts " - #{key} = #{value}"
  end
  puts "Lifecycle Rules:"
  bucket.lifecycle.each do |rule|
    puts "#{rule.action} - #{rule.storage_class} - #{rule.age} - #{rule.matches_storage_class}"
  end
end

REST APIs

JSON API

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials. For instructions, see API authentication.
  2. Use cURL to call the JSON API with a GET Bucket request:

    curl -X GET \
      -H "Authorization: Bearer OAUTH2_TOKEN" \
      "https://storage.googleapis.com/storage/v1/b/BUCKET_NAME"

    Where:

    • OAUTH2_TOKEN is the access token you generated in Step 1.
    • BUCKET_NAME is the name of the relevant bucket. For example, my-bucket.

The response looks similar to the following example:

 {
  "name": "my-bucket",
  ...
  "location": "ASIA",
  "locationType": "dual-region",
  "customPlacementConfig": {
    "dataLocations": ["ASIA-EAST1", "ASIA-SOUTHEAST1"]
  }
 }
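
Optionally, the JSON API also supports partial responses through the standard fields query parameter, which you can use to limit the response to the location-related fields. For example, assuming the bucket my-bucket:

curl -X GET \
  -H "Authorization: Bearer OAUTH2_TOKEN" \
  "https://storage.googleapis.com/storage/v1/b/my-bucket?fields=location,locationType,customPlacementConfig"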

XML API

  1. Get an authorization access token from the OAuth 2.0 Playground. Configure the playground to use your own OAuth credentials. For instructions, see API authentication.
  2. Use cURL to call the XML API with a GET Bucket request. Include the customPlacementConfig query parameter to list the regions associated with the bucket. You can only use one query parameter at a time with the XML API:

    curl -X GET \
    -H "Authorization: Bearer OAUTH2_TOKEN" \
    "https://storage.googleapis.com/BUCKET_NAME?customPlacementConfig"
    

    Where:

    • OAUTH2_TOKEN is the access token you generated in Step 1.
    • BUCKET_NAME is the name of the relevant bucket. For example, my-bucket.

    The response looks similar to the following example:

    <CustomPlacementConfig>
      <DataLocations>
        <DataLocation>US-CENTRAL1</DataLocation>
        <DataLocation>US-WEST1</DataLocation>
      </DataLocations>
    </CustomPlacementConfig>
    

What's next