Create a shared reservation


This document explains how to create shared reservations, which are reservations shared across multiple projects, and manage which projects in your organization can consume the shared reservations.

A shared reservation can be used by the project that hosts the reservation (owner project) and by the projects the reservation is shared with (consumer projects). Use shared reservations if your organization has multiple projects that need virtual machine (VM) instances with the same properties reserved. By using shared reservations, you can improve the utilization of your reservations and reduce the number of reservations that you need to create and manage. To learn more about reservations, see Reservations of Compute Engine zonal resources.

For other ways to create reservations, see the following documents instead:

  • If you have any 1-year or 3-year commitments in the current project, then your reserved resources automatically receive any applicable committed use discounts (CUDs). You can also create and attach a reservation to a commitment when you purchase the commitment. To learn more, see Attach reservations to commitments.

  • To create a reservation that can only be used by a single project, see Create a reservation for a single project.

Before you begin

  • Review the requirements and restrictions for reservations.
  • Review the quota requirements and restrictions for shared reservations.
  • Make sure the project you use to create shared reservations has been added to the allowlist for the Shared Reservations Owner Projects (compute.sharedReservationsOwnerProjects) organization policy constraint by an organization policy administrator. This allowlist is empty by default, so you can't create shared reservations until your organization grants this permission to one or more projects. For more details on viewing and editing the organization policy constraint, see Allowing and restricting projects from creating and modifying shared reservations in this document.
  • If you haven't already, then set up authentication. Authentication is the process by which your identity is verified for access to Google Cloud services and APIs. To run code or samples from a local development environment, you can authenticate to Compute Engine by selecting one of the following options:

    Select the tab for how you plan to use the samples on this page:

    Console

    When you use the Google Cloud console to access Google Cloud services and APIs, you don't need to set up authentication.

    gcloud

    1. Install the Google Cloud CLI, then initialize it by running the following command:

      gcloud init
    2. Set a default region and zone.

    Terraform

    To use the Terraform samples on this page in a local development environment, install and initialize the gcloud CLI, and then set up Application Default Credentials with your user credentials.

    1. Install the Google Cloud CLI.
    2. To initialize the gcloud CLI, run the following command:

      gcloud init
    3. If you're using a local shell, then create local authentication credentials for your user account:

      gcloud auth application-default login

      You don't need to do this if you're using Cloud Shell.

    For more information, see Set up authentication for a local development environment.

    Go

    To use the Go samples on this page in a local development environment, install and initialize the gcloud CLI, and then set up Application Default Credentials with your user credentials.

    1. Install the Google Cloud CLI.
    2. To initialize the gcloud CLI, run the following command:

      gcloud init
    3. If you're using a local shell, then create local authentication credentials for your user account:

      gcloud auth application-default login

      You don't need to do this if you're using Cloud Shell.

    For more information, see Set up authentication for a local development environment.

    Java

    To use the Java samples on this page in a local development environment, install and initialize the gcloud CLI, and then set up Application Default Credentials with your user credentials.

    1. Install the Google Cloud CLI.
    2. To initialize the gcloud CLI, run the following command:

      gcloud init
    3. If you're using a local shell, then create local authentication credentials for your user account:

      gcloud auth application-default login

      You don't need to do this if you're using Cloud Shell.

    For more information, see Set up authentication for a local development environment.

    Node.js

    To use the Node.js samples on this page in a local development environment, install and initialize the gcloud CLI, and then set up Application Default Credentials with your user credentials.

    1. Install the Google Cloud CLI.
    2. To initialize the gcloud CLI, run the following command:

      gcloud init
    3. If you're using a local shell, then create local authentication credentials for your user account:

      gcloud auth application-default login

      You don't need to do this if you're using Cloud Shell.

    For more information, see Set up authentication for a local development environment.

    Python

    To use the Python samples on this page in a local development environment, install and initialize the gcloud CLI, and then set up Application Default Credentials with your user credentials.

    1. Install the Google Cloud CLI.
    2. To initialize the gcloud CLI, run the following command:

      gcloud init
    3. If you're using a local shell, then create local authentication credentials for your user account:

      gcloud auth application-default login

      You don't need to do this if you're using Cloud Shell.

    For more information, see Set up authentication for a local development environment.

    REST

    To use the REST API samples on this page in a local development environment, you use the credentials you provide to the gcloud CLI.

      Install the Google Cloud CLI, then initialize it by running the following command:

      gcloud init

    For more information, see Authenticate for using REST in the Google Cloud authentication documentation.

Required roles

To get the permissions that you need to create shared reservations, ask your administrator to grant you the following IAM roles:

  • Compute Admin (roles/compute.admin) on the project
  • To view or edit the organization policy constraint: Organization Policy Administrator (roles/orgpolicy.policyAdmin) on the organization

For more information about granting roles, see Manage access to projects, folders, and organizations.

These predefined roles contain the permissions required to create shared reservations. The exact permissions that are required are listed in the following Required permissions section:

Required permissions

The following permissions are required to create shared reservations:

  • compute.reservations.create on the project
  • To view organization policies: orgpolicy.policy.get on the organization
  • To edit organization policies: orgpolicy.policy.set on the organization
  • To specify an instance template: compute.instanceTemplates.useReadOnly on the instance template

You might also be able to get these permissions with custom roles or other predefined roles.
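
If you want to check which of the project-level permissions you already hold, you can call the Resource Manager testIamPermissions API. The following is a minimal sketch that assumes the google-cloud-resource-manager Python library; the project ID is a placeholder.

from google.cloud import resourcemanager_v3

# Minimal sketch: check which of the required project-level permissions the
# caller holds. "example-owner-project" is a placeholder project ID.
client = resourcemanager_v3.ProjectsClient()
response = client.test_iam_permissions(
    request={
        "resource": "projects/example-owner-project",
        "permissions": [
            "compute.reservations.create",
            "compute.instanceTemplates.useReadOnly",
        ],
    }
)
# The response contains only the permissions that the caller is granted.
print("Granted permissions:", list(response.permissions))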

Best practices

When you create shared reservations, follow these best practices to help optimize the manageability and utilization of shared reservations in your organization.

  • Have the owner project create the shared reservation but not consume it.
    • To spread spending across projects, have only consumer projects consume a shared reservation. Use the owner project only to create the shared reservation.
    • When creating a shared reservation, the owner project must have sufficient quota for the total resources to reserve. Then, after the reservation is created, the owner project must have quota for any reserved resources that it wants to consume. For more information, see Additional quota requirements for shared reservations.
  • Minimize the number of projects in your organization that you allow to create shared reservations. You can control this through the Shared Reservations Owner Projects (compute.sharedReservationsOwnerProjects) organization policy constraint.
    • You can only list the reservations created by each project. Shared reservations appear only in the owner project; you can't list the reservations shared with a project or list all shared reservations in your organization. Having only a few owner projects makes it easier to monitor and manage your shared reservations.
    • Share each reservation with only a few projects so that you can manage the quota of your reserved resources more easily.
    • For more information, see Allow and restrict projects from creating and modifying shared reservations.
  • Minimize the number of separate shared reservations with identical VM properties.
    • An organization can have up to 100 shared reservations for each unique combination of VM properties. Minimizing the number of shared reservations with identical VM properties that you create helps you stay within this limit.
    • Having fewer shared reservations improves manageability.
  • Only share reservations between projects with the same Cloud Billing account.
    • Limit each shared reservation to only be shared with consumer projects that have the same Cloud Billing account as the owner project. This makes it easier for you to see if a reservation was consumed and how it was billed.
    • If you enabled CUD sharing and you're eligible to receive CUDs at the Cloud Billing account level, then, to maximize the CUDs you receive for your consumed reservations, limit your shared reservations to that commitment's Cloud Billing account. Doing this lets you maintain consistent billing across projects that create and consume shared reservations.
  • For future reservation requests, carefully review the total count of VMs that you request.
    • If you are creating a future reservation request, ensure that you request a total count of VMs that accounts for all of the following:
      • All matching reserved VMs that will already exist at the future date.
      • All matching unreserved VMs that will already exist at the future date.
      • Any matching unused on-demand reservations that will already exist at the future date.
      • The increase in usage that you want to reserve at the future date.

      For example, suppose you need 10 additional VMs at the future date and you'll already have the following resources at the future date:

      • 40 matching reserved VMs
      • 50 matching unreserved VMs

      or

      • 40 matching reserved VMs
      • 50 matching unused on-demand reservations

      Because your existing usage at the future date already adds up to 90 matching VMs and reservations, and you need 10 additional VMs, you must specify a total count of 100 in your future reservation request (see the sketch after this list).

      For more information, see Count and provision reserved resources.
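
The following minimal sketch shows the total-count arithmetic from the preceding example; it only illustrates how the value 100 is derived.

# Resources that will already exist at the future date (from the example above).
reserved_vms = 40              # matching reserved VMs
unreserved_or_unused = 50      # matching unreserved VMs or unused on-demand reservations

# The increase in usage that you want to reserve at the future date.
additional_vms = 10

# The total count to specify in the future reservation request.
total_count = reserved_vms + unreserved_or_unused + additional_vms
print(total_count)  # 100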

Allow and restrict projects from creating and modifying shared reservations

By default, no projects are allowed to create or modify shared reservations in an organization. Add projects to the Shared Reservations Owner Projects (compute.sharedReservationsOwnerProjects) organization policy constraint to allow them to create and modify shared reservations. For more information about organization policy constraints, see Introduction to the Organization Policy Service.

Use the following steps to view and edit the Shared Reservations Owner Projects (compute.sharedReservationsOwnerProjects) organization policy constraint.

View the shared reservations organization policy constraint

To see which projects are allowed to create and modify shared reservations, use the Google Cloud console or gcloud CLI.

Console

Follow the steps for Viewing organization policies using the Shared Reservations Owner Projects constraint.

gcloud

To see which projects the compute.sharedReservationsOwnerProjects constraint allows to create and modify shared reservations:

  1. Download the policy for your organization as a file named policy.yaml, using the gcloud resource-manager org-policies describe command:

    gcloud resource-manager org-policies describe compute.sharedReservationsOwnerProjects --organization=ORGANIZATION_ID > policy.yaml
    

    Replace ORGANIZATION_ID with the organization ID of your organization.

  2. Use a text editor to open the policy.yaml file and view the compute.sharedReservationsOwnerProjects constraint. The projects that are allowed to create and modify shared reservations are listed under its allowedValues (for one way to list these values programmatically, see the sketch after these steps):

    ...
    constraint: constraints/compute.sharedReservationsOwnerProjects
    listPolicy:
      allowedValues:
      - projects/PROJECT_NUMBER_1
      - projects/PROJECT_NUMBER_2
      - projects/PROJECT_NUMBER_3
    ...
    

    where PROJECT_NUMBER_1, PROJECT_NUMBER_2, and PROJECT_NUMBER_3 are the project numbers of the only projects in your organization that are allowed to create shared reservations.

  3. Optional: Delete the policy.yaml file.

    • If you are using a Linux or macOS terminal, use the following command:

      rm policy.yaml
      
    • If you are using a Windows terminal, use the following command:

      del policy.yaml
      

Edit the shared reservations organization policy constraint

To edit which projects are allowed to create and modify shared reservations, use the Google Cloud console or gcloud CLI.

Console

Follow the steps for Customizing policies for list constraints using the Shared Reservations Owner Projects constraint.

gcloud

To edit which projects the compute.sharedReservationsOwnerProjects constraint allows to create and modify shared reservations, use one of the following methods:

  • To grant permission to a single project to create and modify shared reservations, use the gcloud resource-manager org-policies allow command. You can repeat this command for each project that you want to grant this permission to.

    gcloud resource-manager org-policies allow compute.sharedReservationsOwnerProjects projects/PROJECT_NUMBER \
        --organization=ORGANIZATION_ID
    

    Replace the following:

    • PROJECT_NUMBER: the project number (not project ID) of a project in your organization that you want to allow to create and modify shared reservations.
    • ORGANIZATION_ID: the organization ID of your organization.
  • To grant or revoke the permissions for multiple projects to create and modify shared reservations, replace the organization policy constraint:

    1. To download the policy for your organization as a file named policy.yaml, use the gcloud resource-manager org-policies describe command:

      gcloud resource-manager org-policies describe compute.sharedReservationsOwnerProjects --organization=ORGANIZATION_ID > policy.yaml
      

      Replace ORGANIZATION_ID with the organization ID of your organization.

    2. Use a text editor to modify the policy.yaml file so that the compute.sharedReservationsOwnerProjects constraint lists all of the projects that you want to be allowed to create and modify shared reservations under its allowedValues.

      • For each project that you want to grant the permission to create and modify shared reservations, add the project in a new line under allowedValues.
      • For each project that you want to revoke the permission to create and modify shared reservations, delete the line for that project.

      When you are finished, make sure the policy.yaml file looks similar to the following:

      ...
      constraint: constraints/compute.sharedReservationsOwnerProjects
      listPolicy:
        allowedValues:
        - projects/PROJECT_NUMBER_1
        - projects/PROJECT_NUMBER_2
        - projects/PROJECT_NUMBER_3
      ...
      

      where PROJECT_NUMBER_1, PROJECT_NUMBER_2, and PROJECT_NUMBER_3 are the project numbers (not project IDs) of all of the projects in your organization that you want to be allowed to create and modify shared reservations.

    3. Save the policy.yaml file and close the text editor.

    4. To update the policy for your organization with your changes, use the gcloud resource-manager org-policies set-policy command:

      gcloud resource-manager org-policies set-policy --organization=ORGANIZATION_ID policy.yaml
      

      Replace ORGANIZATION_ID with the organization ID of your organization.

    5. Optional: Delete the policy.yaml file.

      • If you are using a Linux or macOS terminal, use the following command:

        rm policy.yaml
        
      • If you are using a Windows terminal, use the following command:

        del policy.yaml
        

You might need to wait a few minutes for the edit to take effect.

Create a shared reservation

This section explains how to create shared reservations. After you create a shared reservation, it can be modified only by the owner project, but the resources for a shared reservation can be consumed by the owner project or any consumer projects.

To consume a reservation, a VM must have properties that exactly match that reservation. To specify the properties of the VMs that you want to reserve, select one of the following sections in this document:

  • Recommended: Specify an instance template

    This section explains how to use an instance template to define the properties of a shared reservation. By using an instance template, you can define the properties of a reservation and the VMs that can consume the reservation in the same place. However, because templates are project-specific, you can't use the same template to create VMs that can consume the reservation outside of the project that created the reservation. For the projects the reservation is shared with, you must create similar templates in those projects or create VMs by specifying properties directly.

  • Specify an existing VM

    This section explains how to use an existing VM to define the properties of a reservation. By using the properties of an existing VM, you can consume the reservation by creating VMs with properties that match the reference VM.

  • Specify properties directly

    This section explains how to directly define the properties of a shared reservation. This method requires you to manually ensure that the properties of your VMs and reservations match exactly—any mismatched properties prevent consumption.

By default, a reservation can be automatically consumed by any VMs with properties that match it. If you want to control reservation consumption, you can require that the reservation is consumed only by VMs that specifically target it by name. You set this requirement when you create the reservation, as described in the following sections, and target the reservation when you create the VMs, as shown in the sketch below.
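
To illustrate what targeting a shared reservation by name looks like from a consumer project, the following is a minimal sketch that assumes the google-cloud-compute Python library; the project, zone, image, and resource names are placeholders, and the reservation is referenced through the owner project.

from google.cloud import compute_v1

# Minimal sketch: create a VM in a consumer project that targets a shared
# reservation by name. All names below are placeholders.
instance = compute_v1.Instance(
    name="example-vm",
    machine_type="zones/us-central1-a/machineTypes/n2-standard-4",
    reservation_affinity=compute_v1.ReservationAffinity(
        consume_reservation_type="SPECIFIC_RESERVATION",
        key="compute.googleapis.com/reservation-name",
        # For a shared reservation, reference the reservation in the owner project.
        values=["projects/example-owner-project/reservations/my-reservation"],
    ),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-12"
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

operation = compute_v1.InstancesClient().insert(
    project="example-consumer-project",
    zone="us-central1-a",
    instance_resource=instance,
)
operation.result()  # Wait for the VM creation to complete.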

Specify an instance template

Before creating a reservation by specifying an instance template, make sure of the following:

  • An instance template contains project-specific settings, so you can only access and use an instance template within the same project. If you create a shared reservation by specifying an instance template, then you can't use the same template to create VMs that can consume the reservation outside of the project that created the reservation.

  • Create your reservation in the same region and zone as the resources within the instance template. Any regional or zonal resources specified in an instance template—such as a machine type or a Persistent Disk volume—restrict the use of the template to the locations where those resources exist. For example, if your instance template specifies an existing Persistent Disk volume in zone us-central1-a, then you can only create your reservation in the same zone. To check whether an existing template specifies any resources that bind the template to a specific region or zone, view the details of the instance template and look for references to regional or zonal resources in it, as shown in the sketch that follows.
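
The following is a minimal sketch of such a check for a global instance template, assuming the google-cloud-compute Python library; the project and template names are placeholders.

from google.cloud import compute_v1

# Minimal sketch: inspect a global instance template for references to
# existing zonal resources. The names below are placeholders.
client = compute_v1.InstanceTemplatesClient()
template = client.get(
    project="example-project", instance_template="example-instance-template"
)

print("Machine type:", template.properties.machine_type)
for disk in template.properties.disks:
    # A non-empty disk source means the template references an existing disk;
    # the disk URL includes the zone that the template is bound to.
    if disk.source:
        print("Existing disk reference:", disk.source)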

To create a shared reservation by specifying an instance template, select one of the following options:

Console

  1. In the Google Cloud console, go to the Reservations page.

    Go to Reservations

    The Reservations page appears.

  2. Click Create reservation.

    The Create a reservation page appears.

  3. For Name, enter a name for your reservation.

  4. For Region and Zone, select where you want to reserve resources.

  5. In the Share type section, do the following:

    1. To specify a shared reservation, select Shared.

    2. Click Add projects, and then select the projects from the current project's organization that you want to share the reservation with. You can select up to 100 consumer projects.

  6. Optional: To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, in the Google Cloud services section, select Share reservation.

  7. In the Use with VM instance section, select one of the following options:

    • To allow matching VM instances to automatically use this reservation, select Use reservation automatically if it's not already selected.

    • To consume this reservation's resources only when creating matching VMs that specifically target this reservation by name, select Select specific reservation.

  8. For Number of VM instances, enter the number of VMs that you want to reserve.

  9. In the Machine configuration section, select Use instance template, and then select the instance template of your choice. If you select a regional instance template, then you can only reserve resources within the same region as the template's region.

  10. In the Auto-delete section, you can enable the auto-delete option to let Compute Engine automatically delete the reservation at a specific date and time. Automatically deleting reservations can be useful to avoid unnecessary charges when you stop consuming the reservation.

  11. To create the reservation, click Create.

    The Reservations page opens. Creating the shared reservation might take up to a minute to complete.

gcloud

To create a shared reservation, use the gcloud compute reservations create command with the --share-setting=projects and --share-with flags.

To create a shared reservation by specifying an instance template and without including any optional flags, run the following command:

gcloud compute reservations create RESERVATION_NAME \
    --share-setting=projects \
    --share-with=CONSUMER_PROJECT_IDS \
    --source-instance-template=projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME \
    --vm-count=NUMBER_OF_VMS \
    --zone=ZONE

Replace the following:

  • RESERVATION_NAME: the name of the reservation to create.

  • PROJECT_ID: the ID of the project where you want to reserve resources and where the instance template exists.

  • CONSUMER_PROJECT_IDS: a comma-separated list of IDs of projects that can consume this reservation—for example, project-1,project-2. You can include up to 100 consumer projects. These projects must be in the same organization as the owner project. Don't include the owner project. By default, it's already allowed to consume the reservation.

  • LOCATION: the location of the instance template. Specify one of the following values:

    • For a global instance template: global.

    • For a regional instance template: regions/REGION. Replace REGION with the region where the instance template is located. If you specify a regional instance template, then you can only reserve VMs within the same region as the template's region.

  • INSTANCE_TEMPLATE_NAME: the name of an existing instance template. If the instance template specifies an A3 machine type, then you must include the --require-specific-reservation flag. This indicates that only VMs that specifically target the reservation can consume it. For more information, see Consume VMs from a specific reservation.

  • NUMBER_OF_VMS: the number of VMs to reserve.

  • ZONE: the zone in which to reserve resources.

For example, to create a reservation by specifying a global instance template in zone us-central1-a, share the reservation with projects project-1 and project-2, and reserve ten VMs that each use an N2 predefined machine type with 4 vCPUs, run the following command:

gcloud compute reservations create my-reservation \
    --share-setting=projects \
    --share-with=project-1,project-2 \
    --source-instance-template=projects/example-project/global/instanceTemplates/example-instance-template \
    --vm-count=10 \
    --zone=us-central1-a

Optionally, you can do one or more of the following:

  • To specify that only VMs that specifically target this reservation can consume it, include the --require-specific-reservation flag.

    gcloud compute reservations create RESERVATION_NAME \
        --require-specific-reservation \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --source-instance-template=projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    
  • To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, use the gcloud beta compute reservations create command with the --reservation-sharing-policy=ALLOW_ALL flag.

    gcloud beta compute reservations create RESERVATION_NAME \
        --reservation-sharing-policy=ALLOW_ALL \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --source-instance-template=projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    
  • To enable Compute Engine to automatically delete the reservation, select one of the following methods:

    • To delete the reservation at a specific date and time, use the gcloud beta compute reservations create command with the --delete-at-time flag.

      gcloud beta compute reservations create RESERVATION_NAME \
          --delete-at-time=DELETE_AT_TIME \
          --share-setting=projects \
          --share-with=CONSUMER_PROJECT_IDS \
          --source-instance-template=projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME \
          --vm-count=NUMBER_OF_VMS \
          --zone=ZONE
      

      Replace DELETE_AT_TIME with a date and time formatted as an RFC 3339 timestamp, which must be as follows (for one way to generate such a value, see the sketch after this list):

      YYYY-MM-DDTHH:MM:SSOFFSET
      

      Replace the following:

      • YYYY-MM-DD: a date formatted as a 4-digit year, 2-digit month, and a 2-digit day of the month, separated by hyphens (-).

      • HH:MM:SS: a time formatted as a 2-digit hour using a 24-hour time, 2-digit minutes, and 2-digit seconds, separated by colons (:).

      • OFFSET: the time zone formatted as an offset of Coordinated Universal Time (UTC). For example, to use the Pacific Standard Time (PST), specify -08:00. Alternatively, to use no offset, specify Z.

    • To delete the reservation after a specific duration, use the gcloud beta compute reservations create command with the --delete-after-duration flag.

      gcloud beta compute reservations create RESERVATION_NAME \
          --delete-after-duration=DELETE_AFTER_DURATION \
          --share-setting=projects \
          --share-with=CONSUMER_PROJECT_IDS \
          --source-instance-template=projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME \
          --vm-count=NUMBER_OF_VMS \
          --zone=ZONE
      

      Replace DELETE_AFTER_DURATION with a duration in days, hours, minutes, or seconds. For example, specify 30m for 30 minutes, or 1d2h3m4s for 1 day, 2 hours, 3 minutes, and 4 seconds.
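
As referenced earlier in this list, you can produce a DELETE_AT_TIME value with any tool that generates RFC 3339 timestamps. The following is a minimal Python sketch; the 7-day offset is only an example.

from datetime import datetime, timedelta, timezone

# Minimal sketch: produce an RFC 3339 timestamp for 7 days from now, in UTC.
delete_at = datetime.now(timezone.utc) + timedelta(days=7)

# isoformat() on a timezone-aware datetime yields YYYY-MM-DDTHH:MM:SS+00:00,
# which matches the YYYY-MM-DDTHH:MM:SSOFFSET format described above.
print(delete_at.replace(microsecond=0).isoformat())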

Go

import (
	"context"
	"fmt"
	"io"

	computepb "cloud.google.com/go/compute/apiv1/computepb"
	"google.golang.org/protobuf/proto"
)

// createSharedReservation creates a shared reservation in the given zone from the given instance template.
func createSharedReservation(w io.Writer, client ClientInterface, projectID, baseProjectId, zone, reservationName, sourceTemplate string) error {
	// client, err := compute.NewReservationsRESTClient(ctx)
	// projectID := "your_project_id". The consumer project that the reservation is shared with.
	// baseProjectId := "your_project_id2". The owner project where the reservation is created.
	// zone := "us-west3-a"
	// reservationName := "your_reservation_name"
	// sourceTemplate: existing template path. Following formats are allowed:
	//  	- projects/{project_id}/global/instanceTemplates/{template_name}
	//  	- projects/{project_id}/regions/{region}/instanceTemplates/{template_name}
	//  	- https://www.googleapis.com/compute/v1/projects/{project_id}/global/instanceTemplates/instanceTemplate
	//  	- https://www.googleapis.com/compute/v1/projects/{project_id}/regions/{region}/instanceTemplates/instanceTemplate

	ctx := context.Background()

	shareSettings := map[string]*computepb.ShareSettingsProjectConfig{
		projectID: {ProjectId: proto.String(projectID)},
	}

	req := &computepb.InsertReservationRequest{
		Project: baseProjectId,
		ReservationResource: &computepb.Reservation{
			Name: proto.String(reservationName),
			Zone: proto.String(zone),
			SpecificReservation: &computepb.AllocationSpecificSKUReservation{
				Count:                  proto.Int64(2),
				SourceInstanceTemplate: proto.String(sourceTemplate),
			},
			ShareSettings: &computepb.ShareSettings{
				ProjectMap: shareSettings,
				ShareType:  proto.String("SPECIFIC_PROJECTS"),
			},
		},
		Zone: zone,
	}

	op, err := client.Insert(ctx, req)
	if err != nil {
		return fmt.Errorf("unable to create reservation: %w", err)
	}

	if op != nil {
		if err = op.Wait(ctx); err != nil {
			return fmt.Errorf("unable to wait for the operation: %w", err)
		}
	}

	fmt.Fprintf(w, "Reservation created\n")

	return nil
}

Java

import com.google.cloud.compute.v1.AllocationSpecificSKUReservation;
import com.google.cloud.compute.v1.Operation;
import com.google.cloud.compute.v1.Reservation;
import com.google.cloud.compute.v1.ReservationsClient;
import com.google.cloud.compute.v1.ShareSettings;
import com.google.cloud.compute.v1.ShareSettingsProjectConfig;
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class CreateSharedReservation {
  private final ReservationsClient reservationsClient;

  // Constructor to inject the ReservationsClient
  public CreateSharedReservation(ReservationsClient reservationsClient) {
    this.reservationsClient = reservationsClient;
  }

  public static void main(String[] args)
      throws IOException, ExecutionException, InterruptedException, TimeoutException {
    // TODO(developer): Replace these variables before running the sample.
    // The ID of the project where you want to reserve resources
    // and where the instance template exists.
    // By default, no projects are allowed to create or modify shared reservations
    // in an organization. Add projects to the Shared Reservations Owner Projects
    // (compute.sharedReservationsOwnerProjects) organization policy constraint
    // to allow them to create and modify shared reservations.
    // For more information visit this page:
    // https://cloud.google.com/compute/docs/instances/reservations-shared#shared_reservation_constraint
    String projectId = "YOUR_PROJECT_ID";
    // Zone in which the reservation resides.
    String zone = "us-central1-a";
    // Name of the reservation to be created.
    String reservationName = "YOUR_RESERVATION_NAME";
    // The URI of the global instance template to be used for creating the reservation.
    String instanceTemplateUri = String.format(
        "projects/%s/global/instanceTemplates/YOUR_INSTANCE_TEMPLATE_NAME", projectId);
    // Number of instances for which capacity needs to be reserved.
    int vmCount = 3;
    // In your main method, create ReservationsClient
    ReservationsClient client = ReservationsClient.create();
    // Create an instance of your class, passing in the client
    CreateSharedReservation creator = new CreateSharedReservation(client);

    creator.createSharedReservation(projectId, zone, reservationName, instanceTemplateUri, vmCount);
  }

  // Creates a shared reservation with the given name in the given zone.
  public void createSharedReservation(
      String projectId, String zone,
      String reservationName, String instanceTemplateUri, int vmCount)
      throws ExecutionException, InterruptedException, TimeoutException {

    ShareSettings shareSettings = ShareSettings.newBuilder()
        .setShareType(String.valueOf(ShareSettings.ShareType.SPECIFIC_PROJECTS))
        // The IDs of projects that can consume this reservation. You can include up to 100
        // consumer projects. These projects must be in the same organization as
        // the owner project. Don't include the owner project. By default, it is already allowed
        // to consume the reservation.
        .putProjectMap("CONSUMER_PROJECT_ID_1", ShareSettingsProjectConfig.newBuilder().build())
        .putProjectMap("CONSUMER_PROJECT_ID_2", ShareSettingsProjectConfig.newBuilder().build())
        .build();

    // Create the reservation.
    Reservation reservation =
        Reservation.newBuilder()
            .setName(reservationName)
            .setZone(zone)
            .setSpecificReservationRequired(true)
            .setShareSettings(shareSettings)
            .setSpecificReservation(
                AllocationSpecificSKUReservation.newBuilder()
                    .setCount(vmCount)
                    .setSourceInstanceTemplate(instanceTemplateUri)
                    .build())
            .build();

    // Wait for the create reservation operation to complete.
    Operation response =
        this.reservationsClient.insertAsync(projectId, zone, reservation).get(3, TimeUnit.MINUTES);

    if (response.hasError()) {
      System.out.println("Reservation creation failed!" + response);
      return;
    }
    System.out.println("Reservation created. Operation Status: " + response.getStatus());
  }
}

Node.js

// Import the Compute library
const computeLib = require('@google-cloud/compute');
const compute = computeLib.protos.google.cloud.compute.v1;

/**
 * TODO(developer): Uncomment reservationsClient and zoneOperationsClient before running the sample.
 */
// Instantiate a reservationsClient
// reservationsClient = new computeLib.ReservationsClient();
// Instantiate a zoneOperationsClient
// zoneOperationsClient = new computeLib.ZoneOperationsClient();

/**
 * TODO(developer): Update these variables before running the sample.
 */
// The ID of the project where you want to reserve resources and where the instance template exists.
const projectId = await reservationsClient.getProjectId();
// The zone in which to reserve resources.
const zone = 'us-central1-a';
// The name of the reservation to create.
const reservationName = 'reservation-01';
// The number of VMs to reserve.
const vmsNumber = 3;
// The name of an existing instance template.
const instanceTemplateName = 'global-instance-template-name';
// The location of the instance template.
const location = 'global';

async function callCreateComputeSharedReservation() {
  // Create a reservation for 3 VMs in zone us-central1-a by specifying an instance template.
  const specificReservation = new compute.AllocationSpecificSKUReservation({
    count: vmsNumber,
    sourceInstanceTemplate: `projects/${projectId}/${location}/instanceTemplates/${instanceTemplateName}`,
  });

  // Create share settings. Share reservation with one customer project.
  const shareSettings = new compute.ShareSettings({
    shareType: 'SPECIFIC_PROJECTS',
    projectMap: {
      // The IDs of projects that can consume this reservation. You can include up to 100 consumer projects.
      // These projects must be in the same organization as the owner project.
      // Don't include the owner project. By default, it is already allowed to consume the reservation.
      consumer_project_id: {
        projectId: 'consumer_project_id',
      },
    },
  });

  // Create a reservation.
  const reservation = new compute.Reservation({
    name: reservationName,
    specificReservation,
    specificReservationRequired: true,
    shareSettings,
  });

  const [response] = await reservationsClient.insert({
    project: projectId,
    reservationResource: reservation,
    zone,
  });

  let operation = response.latestResponse;

  // Wait for the create reservation operation to complete.
  while (operation.status !== 'DONE') {
    [operation] = await zoneOperationsClient.wait({
      operation: operation.name,
      project: projectId,
      zone: operation.zone.split('/').pop(),
    });
  }

  console.log(`Reservation: ${reservationName} created.`);
  return response;
}

await callCreateComputeSharedReservation();

Python

from __future__ import annotations

import sys
from typing import Any

from google.api_core.extended_operation import ExtendedOperation
from google.cloud import compute_v1


def wait_for_extended_operation(
    operation: ExtendedOperation, verbose_name: str = "operation", timeout: int = 300
) -> Any:
    """
    Waits for the extended (long-running) operation to complete.

    If the operation is successful, it will return its result.
    If the operation ends with an error, an exception will be raised.
    If there were any warnings during the execution of the operation
    they will be printed to sys.stderr.

    Args:
        operation: a long-running operation you want to wait on.
        verbose_name: (optional) a more verbose name of the operation,
            used only during error and warning reporting.
        timeout: how long (in seconds) to wait for operation to finish.
            If None, wait indefinitely.

    Returns:
        Whatever the operation.result() returns.

    Raises:
        This method will raise the exception received from `operation.exception()`
        or RuntimeError if there is no exception set, but there is an `error_code`
        set for the `operation`.

        In case of an operation taking longer than `timeout` seconds to complete,
        a `concurrent.futures.TimeoutError` will be raised.
    """
    result = operation.result(timeout=timeout)

    if operation.error_code:
        print(
            f"Error during {verbose_name}: [Code: {operation.error_code}]: {operation.error_message}",
            file=sys.stderr,
            flush=True,
        )
        print(f"Operation ID: {operation.name}", file=sys.stderr, flush=True)
        raise operation.exception() or RuntimeError(operation.error_message)

    if operation.warnings:
        print(f"Warnings during {verbose_name}:\n", file=sys.stderr, flush=True)
        for warning in operation.warnings:
            print(f" - {warning.code}: {warning.message}", file=sys.stderr, flush=True)

    return result


def create_compute_shared_reservation(
    project_id: str,
    zone: str = "us-central1-a",
    reservation_name="your-reservation-name",
    shared_project_id: str = "shared-project-id",
) -> compute_v1.Reservation:
    """Creates a compute reservation in GCP.
    Args:
        project_id (str): The ID of the Google Cloud project.
        zone (str): The zone to create the reservation.
        reservation_name (str): The name of the reservation to create.
        shared_project_id (str): The ID of the project that the reservation is shared with.
    Returns:
        Reservation object that represents the new reservation.
    """

    instance_properties = compute_v1.AllocationSpecificSKUAllocationReservedInstanceProperties(
        machine_type="n1-standard-1",
        # Optional. Specifies amount of local ssd to reserve with each instance.
        local_ssds=[
            compute_v1.AllocationSpecificSKUAllocationAllocatedInstancePropertiesReservedDisk(
                disk_size_gb=375, interface="NVME"
            ),
        ],
    )

    reservation = compute_v1.Reservation(
        name=reservation_name,
        specific_reservation=compute_v1.AllocationSpecificSKUReservation(
            count=3,  # Number of resources that are allocated.
            # If you use source_instance_template, you must exclude the instance_properties field.
            # It can be a full or partial URL.
            # source_instance_template="projects/[PROJECT_ID]/global/instanceTemplates/my-instance-template",
            instance_properties=instance_properties,
        ),
        share_settings=compute_v1.ShareSettings(
            share_type="SPECIFIC_PROJECTS",
            project_map={
                shared_project_id: compute_v1.ShareSettingsProjectConfig(
                    project_id=shared_project_id
                )
            },
        ),
    )

    # Create a client
    client = compute_v1.ReservationsClient()

    operation = client.insert(
        project=project_id,
        zone=zone,
        reservation_resource=reservation,
    )
    wait_for_extended_operation(operation, "Reservation creation")

    reservation = client.get(
        project=project_id, zone=zone, reservation=reservation_name
    )
    shared_project = next(iter(reservation.share_settings.project_map.values()))

    print("Name: ", reservation.name)
    print("STATUS: ", reservation.status)
    print("SHARED PROJECT: ", shared_project)
    # Example response:
    # Name:  your-reservation-name
    # STATUS:  READY
    # SHARED PROJECT:  project_id: "123456789012"

    return reservation

REST

To create a shared reservation, make a POST request to the reservations.insert method. In the request body, include the following:

  • The projectMap field.

  • The shareType field set to SPECIFIC_PROJECTS.

For example, to create a shared reservation by specifying an instance template without including any optional fields, and share the reservation with two consumer projects, make the following POST request:

POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations

{
  "name": "RESERVATION_NAME",
  "shareSettings": {
    "shareType": "SPECIFIC_PROJECTS",
    "projectMap": {
      "CONSUMER_PROJECT_ID_1": {
        "projectId": "CONSUMER_PROJECT_ID_1"
      },
      "CONSUMER_PROJECT_ID_2": {
        "projectId": "CONSUMER_PROJECT_ID_2"
      }
    }
  },
  "specificReservation": {
    "count": "NUMBER_OF_VMS",
    "sourceInstanceTemplate": "projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME"
  }
}

Replace the following:

  • PROJECT_ID: the ID of the project where you want to reserve resources and where the instance template exists.

  • ZONE: the zone in which to reserve resources.

  • RESERVATION_NAME: the name of the reservation to create.

  • CONSUMER_PROJECT_ID_1 and CONSUMER_PROJECT_ID_2: the IDs of projects that can consume this reservation. You can include up to 100 consumer projects. These projects must be in the same organization as the owner project. Don't include the owner project. By default, it is already allowed to consume the reservation.

  • NUMBER_OF_VMS: the number of VMs to reserve.

  • LOCATION: the location of the instance template. Specify one of the following values:

    • For a global instance template: global.

    • For a regional instance template: regions/REGION. Replace REGION with the region where the instance template is located. If you specify a regional instance template, then you can only reserve VMs within the same region as the template's region.

  • INSTANCE_TEMPLATE_NAME: the name of an existing instance template. If the instance template specifies an A3 machine type, you must include the specificReservationRequired field in the request body, and set the field to true. This indicates that only VMs that specifically target this reservation can consume it. For more information, see Consume VMs from a specific reservation.

For example, to create a reservation for ten VMs in zone us-central1-a by specifying a global instance template, and share the reservation with projects project-1 and project-2, make the following POST request:

POST https://compute.googleapis.com/compute/v1/projects/example-project/zones/us-central1-a/reservations

{
  "name": "my-reservation",
  "shareSettings": {
    "shareType": "SPECIFIC_PROJECTS",
    "projectMap": {
      "project-1": {
        "projectId": "project-1"
      },
      "project-2": {
        "projectId": "project-2"
      }
    }
  },
  "specificReservation": {
    "count": "10",
    "sourceInstanceTemplate": "projects/example-project/global/instanceTemplates/example-instance-template"
  }
}

Optionally, you can do one or more of the following:

  • To specify that only VMs that specifically target this reservation can consume it, include the specificReservationRequired field in the request body, and set the field to true.

    For example, to create a specific reservation by specifying an instance template, and share the reservation with two consumer projects, make a request as follows:

    POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "sourceInstanceTemplate": "projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME"
      },
      "specificReservationRequired": true
    }
    
  • To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, make a POST request to the beta.reservations.insert method. In the request body, include the serviceShareType field and set it to ALLOW_ALL.

    POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "reservationSharingPolicy": {
        "serviceShareType": "ALLOW_ALL"
      },
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "sourceInstanceTemplate": "projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME"
      }
    }
    
  • To enable Compute Engine to automatically delete the reservation, select one of the following methods:

    • To delete the reservation at a specific date and time, make a POST request to the beta.reservations.insert method. In the request body, include the deleteAtTime field.

      For example, to create a reservation by specifying an instance template, auto delete the reservation at a specific date and time, and share the reservation with two consumer projects, make a request as follows:

      POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
      
      {
        "deleteAtTime": "DELETE_AT_TIME",
        "name": "RESERVATION_NAME",
        "shareSettings": {
          "shareType": "SPECIFIC_PROJECTS",
          "projectMap": {
            "CONSUMER_PROJECT_ID_1": {
              "projectId": "CONSUMER_PROJECT_ID_1"
            },
            "CONSUMER_PROJECT_ID_2": {
              "projectId": "CONSUMER_PROJECT_ID_2"
            }
          }
        },
        "specificReservation": {
          "count": "NUMBER_OF_VMS",
          "sourceInstanceTemplate": "projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME"
        }
      }
      

      Replace DELETE_AT_TIME with a date and time formatted as an RFC 3339 timestamp, which must be as follows:

      YYYY-MM-DDTHH:MM:SSOFFSET
      

      Replace the following:

      • YYYY-MM-DD: a date formatted as a 4-digit year, 2-digit month, and a 2-digit day of the month, separated by hyphens (-).

      • HH:MM:SS: a time formatted as a 2-digit hour using a 24-hour time, 2-digit minutes, and 2-digit seconds, separated by colons (:).

      • OFFSET: the time zone formatted as an offset of Coordinated Universal Time (UTC). For example, to use the Pacific Standard Time (PST), specify -08:00. Alternatively, to use no offset, specify Z.

    • To delete the reservation after a specific duration, make a POST request to the beta.reservations.insert method. In the request body, include the deleteAfterDuration field.

      For example, to create a reservation by specifying an instance template, auto delete the reservation after a specific duration, and share the reservation with two consumer projects, make a request as follows:

      POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
      
      {
        "deleteAfterDuration": {
          "seconds": "DELETE_AFTER_DURATION"
        },
        "name": "RESERVATION_NAME",
        "shareSettings": {
          "shareType": "SPECIFIC_PROJECTS",
          "projectMap": {
            "CONSUMER_PROJECT_ID_1": {
              "projectId": "CONSUMER_PROJECT_ID_1"
            },
            "CONSUMER_PROJECT_ID_2": {
              "projectId": "CONSUMER_PROJECT_ID_2"
            }
          }
        },
        "specificReservation": {
          "count": "NUMBER_OF_VMS",
          "sourceInstanceTemplate": "projects/PROJECT_ID/LOCATION/instanceTemplates/INSTANCE_TEMPLATE_NAME"
        }
      }
      

      Replace DELETE_AFTER_DURATION with a duration in seconds. For example, specify 86400 for 86,400 seconds (1 day). For one way to compute this value, see the sketch that follows.
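
The following is a minimal Python sketch that converts a duration into the seconds value that deleteAfterDuration expects; the 1-day duration is only an example.

from datetime import timedelta

# Minimal sketch: compute a DELETE_AFTER_DURATION value in seconds.
duration = timedelta(days=1)
print(int(duration.total_seconds()))  # 86400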

Specify an existing VM

You can create a shared reservation based on an existing VM only in the same project and zone as that VM.

After creating the reservation, you can consume it by creating VMs with properties that match the reference VM, in one of the following ways:

  • Create and use an instance template as follows:

    1. Create an instance template based on the reference VM without overriding the reference VM's properties.

    2. Create VMs in the owner project by using the newly created template. For the consumer projects, either create similar templates in those projects and use them to create VMs, or create VMs by directly specifying properties that match the reference VM.

  • Create a VM with properties that exactly match the reference VM as follows:

    • In the owner project, create a VM based on the reference VM without changing the properties of the VM that you're creating.

    • In the consumer projects, create a VM while manually ensuring that its properties and the reference VM's properties match. The sketch after this list shows how to look up the reference VM's properties.
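
To look up the properties of the reference VM that a matching VM must mirror, you can retrieve the VM and inspect fields such as its machine type, minimum CPU platform, attached GPUs, and Local SSD disks. The following is a minimal sketch that assumes the google-cloud-compute Python library; the project, zone, and VM names are placeholders.

from google.cloud import compute_v1

# Minimal sketch: read the properties of the reference VM that matching VMs
# (and the reservation) must mirror. All names below are placeholders.
client = compute_v1.InstancesClient()
vm = client.get(project="example-project", zone="us-central1-a", instance="reference-vm")

print("Machine type:", vm.machine_type.rsplit("/", 1)[-1])
print("Minimum CPU platform:", vm.min_cpu_platform)
for accelerator in vm.guest_accelerators:
    print(
        "GPU:",
        accelerator.accelerator_type.rsplit("/", 1)[-1],
        "x",
        accelerator.accelerator_count,
    )
print("Local SSD disks:", sum(1 for disk in vm.disks if disk.type_ == "SCRATCH"))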

To create a shared reservation that uses the properties of an existing VM, do the following:

  1. In the Google Cloud console, go to the Reservations page.

    Go to Reservations

  2. On the On-demand reservations tab (default), click Create reservation.

    The Create a reservation page opens.

  3. For Name, enter a name for your reservation.

  4. For Region and Zone, select where you want to reserve resources.

  5. In the Share type section, do the following:

    1. To specify a shared reservation, select Shared.

    2. Click Add projects, and then select the projects from the current project's organization that you want to share the reservation with. You can select up to 100 consumer projects.

  6. In the Use with VM instance section, select one of the following options:

    • To allow matching VMs to automatically consume this reservation, select Use reservation automatically if it's not already selected.

    • To consume this reservation's resources only when creating matching VMs that specifically target this reservation by name, select Select specific reservation.

  7. For Number of VM instances, enter the number of VMs that you want to reserve.

  8. In the Machine configuration section, do the following:

    1. Select Use existing VM.

    2. For Existing VM, select the VM whose properties you want to use to create the reservation.

  9. In the Auto-delete section, you can enable the auto-delete option to let Compute Engine automatically delete the reservation at a specific date and time. Automatically deleting reservations can be useful to avoid unnecessary charges when you stop consuming the reservation.

  10. To create the reservation, click Create.

    The Reservations page opens. Creating the reservation might take up to a minute to complete.

Specify properties directly

To create a shared reservation by specifying properties directly, select one of the following options:

Console

  1. In the Google Cloud console, go to the Reservations page.

    Go to Reservations

  2. Ensure that you're on the On-demand reservations tab, which is selected by default.

  3. Click Create reservation.

    The Create a reservation page appears.

  4. For Name, enter a name for your reservation.

  5. For Region and Zone, select where you want to reserve resources.

  6. In the Share type section, do the following:

    1. To specify a shared reservation, select Shared.

    2. Click Add projects, and then select the projects from the current project's organization that you want to share the reservation with. You can select up to 100 consumer projects.

  7. Optional: To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, in the Google Cloud services section, select Share reservation.

  8. In the Use with VM instance section, select one of the following options:

    • To allow matching VMs to automatically consume this reservation, select Use reservation automatically (default).

    • To consume this reservation's resources only when creating matching VMs that specifically target this reservation by name, select Select specific reservation.

  9. For Number of VM instances, enter the number of VMs that you want to reserve.

  10. In the Machine configuration section, select Specify machine type, and then specify the following:

    1. For Machine family, Series, and Machine type, select a machine family, series, and machine type.

    2. Optional: To specify a minimum CPU platform or attach GPUs to N1 VMs, do the following:

      1. To expand the CPU Platform and GPU section, click the expander arrow.

      2. Optional: To specify a minimum CPU platform, for CPU Platform, select an option.

      3. Optional: To attach GPUs to N1 VMs, click Add GPU. Then, for GPU type and Number of GPUs, select the type and number of GPUs to attach to each N1 VM.

    3. Optional: To add Local SSD disks, do the following:

      1. For Number of disks, select the number of Local SSD disks for each VM.

      2. For Interface type, select the interface for the Local SSD disks.

  11. In the Auto-delete section, you can enable the auto-delete option to let Compute Engine automatically delete the reservation at a specific date and time. Automatically deleting reservations can be useful to avoid unnecessary charges when you stop consuming the reservation.

  12. To create the reservation, click Create.

    The Reservations page opens. Creating the shared reservation might take up to a minute to complete.

gcloud

To create a shared reservation, use the gcloud compute reservations create command with the --share-setting=projects and --share-with flags.

To create a shared reservation by specifying properties directly and without including any optional flags, run the following command:

gcloud compute reservations create RESERVATION_NAME \
    --machine-type=MACHINE_TYPE \
    --share-setting=projects \
    --share-with=CONSUMER_PROJECT_IDS \
    --vm-count=NUMBER_OF_VMS \
    --zone=ZONE

Replace the following:

  • RESERVATION_NAME: the name of the reservation to create.

  • MACHINE_TYPE: a machine type to use for each VM. If you specify an A3 machine type, then you must include the --require-specific-reservation flag. This indicates that only VMs that specifically target the reservation can consume it. For more information, see Consume VMs from a specific reservation.

  • CONSUMER_PROJECT_IDS: a comma-separated list of IDs of projects that can consume this reservation—for example, project-1,project-2. You can include up to 100 consumer projects. These projects must be in the same organization as the owner project. Don't include the owner project. By default, it is already allowed to consume the reservation.

  • NUMBER_OF_VMS: the number of VMs to reserve.

  • ZONE: the zone in which to reserve resources.

For example, to create a reservation in zone us-central1-a for ten VMs that each use an N2 predefined machine type with 4 vCPUs, and share the reservation with projects project-1 and project-2, run the following command:

gcloud compute reservations create my-reservation \
    --machine-type=n2-standard-4 \
    --share-setting=projects \
    --share-with=project-1,project-2 \
    --vm-count=10 \
    --zone=us-central1-a

Optionally, you can do one or more of the following:

  • To attach GPUs to your reserved N1 VMs, include the --accelerator flag.

    gcloud compute reservations create RESERVATION_NAME \
        --accelerator=count=NUMBER_OF_ACCELERATORS,type=ACCELERATOR_TYPE \
        --machine-type=MACHINE_TYPE \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    

    Replace the following:

    • NUMBER_OF_ACCELERATORS: the number of GPUs to attach to each reserved VM.

    • ACCELERATOR_TYPE: the type of GPUs to attach to each reserved VM.

  • To add one or more Local SSD disks to each reserved VM, include one or more --local-ssd flags. You can specify up to 24 Local SSD disks. Each Local SSD disk is 375 GB.

    For example, to specify two Local SSD disks when creating a shared reservation, include two --local-ssd flags as follows:

    gcloud compute reservations create RESERVATION_NAME \
        --local-ssd=size=375,interface=INTERFACE_1 \
        --local-ssd=size=375,interface=INTERFACE_2 \
        --machine-type=MACHINE_TYPE \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    

    Replace INTERFACE_1 and INTERFACE_2 with the type of interface you want each Local SSD disk to use. Specify one of the following values:

    • NVME disk interfaces: nvme

    • SCSI disk interfaces: scsi

    Make sure that the machine type you specify for the reserved VMs supports the chosen disk interfaces. Otherwise, creating the reservation fails. For more information, see how to choose a disk interface.
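
    For example, N1 and N2 machine types support both the nvme and scsi interfaces, whereas some newer machine series support only nvme.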

  • To have the reserved VMs use a specific minimum CPU platform instead of the zone's default CPU platform, include the --min-cpu-platform flag.

    gcloud compute reservations create RESERVATION_NAME \
        --machine-type=MACHINE_TYPE \
        --min-cpu-platform="MIN_CPU_PLATFORM" \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    

    Replace MIN_CPU_PLATFORM with a minimum CPU platform. To make sure that a CPU platform is available in the zone where you're reserving resources, view the available CPU platforms by zone.
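
    For example, to require Intel Cascade Lake or a later CPU platform for the reserved VMs, specify --min-cpu-platform="Intel Cascade Lake".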

  • To specify that only VMs that specifically target this reservation can consume it, include the --require-specific-reservation flag.

    gcloud compute reservations create RESERVATION_NAME \
        --machine-type=MACHINE_TYPE \
        --require-specific-reservation \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    
  • To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, use the gcloud beta compute reservations create command with the --reservation-sharing-policy=ALLOW_ALL flag.

    gcloud beta compute reservations create RESERVATION_NAME \
        --machine-type=MACHINE_TYPE \
        --reservation-sharing-policy=ALLOW_ALL \
        --share-setting=projects \
        --share-with=CONSUMER_PROJECT_IDS \
        --vm-count=NUMBER_OF_VMS \
        --zone=ZONE
    
  • To enable Compute Engine to automatically delete the reservation, select one of the following methods:

    • To delete the reservation at a specific date and time, use the gcloud beta compute reservations create command with the --delete-at-time flag.

      gcloud beta compute reservations create RESERVATION_NAME \
          --delete-at-time=DELETE_AT_TIME \
          --machine-type=MACHINE_TYPE \
          --share-setting=projects \
          --share-with=CONSUMER_PROJECT_IDS \
          --vm-count=NUMBER_OF_VMS \
          --zone=ZONE
      

      Replace DELETE_AT_TIME with a date and time formatted as an RFC 3339 timestamp, which uses the following format:

      YYYY-MM-DDTHH:MM:SSOFFSET
      

      Replace the following:

      • YYYY-MM-DD: a date formatted as a 4-digit year, 2-digit month, and a 2-digit day of the month, separated by hyphens (-).

      • HH:MM:SS: a time formatted as a 2-digit hour using a 24-hour time, 2-digit minutes, and 2-digit seconds, separated by colons (:).

      • OFFSET: the time zone formatted as an offset from Coordinated Universal Time (UTC). For example, to use Pacific Standard Time (PST), specify -08:00. To use UTC, specify Z.
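
      For example, 2027-01-15T17:00:00Z specifies 5:00 PM UTC on January 15, 2027, and 2027-01-15T17:00:00-08:00 specifies 5:00 PM PST on the same date.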

    • To delete the reservation after a specific duration, use the gcloud beta compute reservations create command with the --delete-after-duration flag.

      gcloud beta compute reservations create RESERVATION_NAME \
          --delete-after-duration=DELETE_AFTER_DURATION \
          --machine-type=MACHINE_TYPE \
          --share-setting=projects \
          --share-with=CONSUMER_PROJECT_IDS \
          --vm-count=NUMBER_OF_VMS \
          --zone=ZONE
      

      Replace DELETE_AFTER_DURATION with a duration in days, hours, minutes, or seconds. For example, specify 30m for 30 minutes, or 1d2h3m4s for 1 day, 2 hours, 3 minutes, and 4 seconds.

Terraform

To create a reservation, use the google_compute_reservation Terraform resource. To specify a shared reservation, define the share_settings block:

  • Set the share_type field to SPECIFIC_PROJECTS.
  • In the project_map block, specify the project IDs of the projects that you want to share this reservation with.
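
For example, the following configuration is a minimal sketch of a shared reservation for ten n2-standard-4 VMs in us-central1-a, shared with the consumer projects project-1 and project-2. All values shown are illustrative placeholders; adjust them for your environment, and make sure that the Google provider is configured to use the owner project.

resource "google_compute_reservation" "shared_reservation" {
  name = "my-shared-reservation"
  zone = "us-central1-a"

  # Reserve ten VMs that each use the n2-standard-4 machine type.
  specific_reservation {
    count = 10
    instance_properties {
      machine_type = "n2-standard-4"
    }
  }

  # Share the reservation with specific consumer projects.
  share_settings {
    share_type = "SPECIFIC_PROJECTS"
    project_map {
      id         = "project-1"
      project_id = "project-1"
    }
    project_map {
      id         = "project-2"
      project_id = "project-2"
    }
  }
}

After you save the configuration, run terraform init and then terraform apply to create the reservation.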

For more information about how to use Terraform, see Using Terraform with Google Cloud.

REST

To create a shared reservation, make a POST request to the reservations.insert method. In the request body, include the shareSettings field with the following:

  • The shareType field set to SPECIFIC_PROJECTS.

  • The projectMap field, which lists the consumer projects that you want to share the reservation with.

For example, to create a shared reservation without including any optional fields and share the reservation with two consumer projects, make the following POST request:

POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations

{
  "name": "RESERVATION_NAME",
  "shareSettings": {
    "shareType": "SPECIFIC_PROJECTS",
    "projectMap": {
      "CONSUMER_PROJECT_ID_1": {
        "projectId": "CONSUMER_PROJECT_ID_1"
      },
      "CONSUMER_PROJECT_ID_2": {
        "projectId": "CONSUMER_PROJECT_ID_2"
      }
    }
  },
  "specificReservation": {
    "count": "NUMBER_OF_VMS",
    "instanceProperties": {
      "machineType": "MACHINE_TYPE"
    }
  }
}

Replace the following:

  • PROJECT_ID: the ID of the project where you want to reserve resources.

  • ZONE: the zone in which to reserve resources.

  • RESERVATION_NAME: the name of the reservation to create.

  • CONSUMER_PROJECT_ID_1 and CONSUMER_PROJECT_ID_2: the IDs of the projects that can consume this reservation. You can include up to 100 consumer projects, and they must be in the same organization as the owner project. Don't include the owner project; it can already consume the reservation by default.

  • NUMBER_OF_VMS: the number of VMs to reserve.

  • MACHINE_TYPE: a machine type to use for each VM. If you specify an A3 machine type, then you must include the specificReservationRequired field in the request body, and set the field to true. This indicates that only VMs that specifically target the reservation can consume it.

For example, to create a reservation in zone us-central1-a, share the reservation with projects project-1 and project-2, and reserve ten VMs that each use an N2 predefined machine type with 4 vCPUs, make the following POST request:

POST https://compute.googleapis.com/compute/v1/projects/example-project/zones/us-central1-a/reservations

{
  "name": "my-reservation",
  "shareSettings": {
    "shareType": "SPECIFIC_PROJECTS",
    "projectMap": {
      "project-1": {
        "projectId": "project-1"
      },
      "project-2": {
        "projectId": "project-2"
      }
    }
  },
  "specificReservation": {
    "count": "10",
    "instanceProperties": {
      "machineType": "n2-standard-4",
    }
  }
}

Optionally, you can do one or more of the following:

  • To attach GPUs to your reserved N1 VMs, include the guestAccelerators field in the request body.

    For example, to create a reservation shared with two consumer projects, and attach GPUs to any reserved N1 VMs, make a request as follows:

    POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "instanceProperties": {
          "guestAccelerators": [
            {
              "acceleratorCount": NUMBER_OF_ACCELERATORS,
              "acceleratorType": "ACCELERATOR_TYPE"
            }
          ],
          "machineType": "MACHINE_TYPE"
        }
      }
    }
    

    Replace the following:

    • NUMBER_OF_ACCELERATORS: the number of GPUs to attach to each reserved VM.

    • ACCELERATOR_TYPE: the type of GPUs to attach to each reserved VM.

  • To add one or more Local SSD disks to each reserved VM, include the localSsds field in the request body. You can specify up to 24 Local SSD disks. Each Local SSD disk is 375 GB.

    For example, to create a shared reservation while specifying two Local SSD disks and two consumer projects, make a request as follows:

    POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "instanceProperties": {
          "localSsds": [
            {
              "diskSizeGb": "375",
              "interface": "INTERFACE_1"
            },
            {
              "diskSizeGb": "375",
              "interface": "INTERFACE_2"
            }
          ],
          "machineType": "MACHINE_TYPE"
        }
      }
    }
    

    Replace INTERFACE_1 and INTERFACE_2 with the type of interface you want each Local SSD disk to use. Specify one of the following values:

    • NVME disk interfaces: NVME

    • SCSI disk interfaces: SCSI

    Make sure that the machine type you specify for the reserved VMs supports the chosen disk interfaces. Otherwise, creating the reservation fails. For more information, see how to choose a disk interface.

  • To have the reserved VMs use a specific minimum CPU platform instead of the zone's default CPU platform, include the minCpuPlatform field in the request body.

    For example, to create a shared reservation while specifying a minimum CPU platform and two consumer projects, make a request as follows:

    POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "instanceProperties": {
          "machineType": "MACHINE_TYPE",
          "minCpuPlatform": "MIN_CPU_PLATFORM"
        }
      }
    }
    

    Replace MIN_CPU_PLATFORM with a minimum CPU platform. To make sure that a CPU platform is available in the zone where you're reserving resources, view the available CPU platforms by zone.

  • To specify that only VMs that specifically target this reservation can consume it, include the specificReservationRequired field in the request body, and set the field to true.

    For example, to create a reservation that can only be consumed by VMs that specifically target it, and share the reservation with two consumer projects, make a request as follows:

    POST https://compute.googleapis.com/compute/v1/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "instanceProperties": {
          "machineType": "MACHINE_TYPE"
        }
      },
      "specificReservationRequired": true
    }
    
  • To allow a reservation of GPU VMs to be consumed by custom training jobs or prediction jobs in Vertex AI, make a POST request to the beta.reservations.insert method. In the request body, include the reservationSharingPolicy field with its serviceShareType field set to ALLOW_ALL.

    POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
    
    {
      "name": "RESERVATION_NAME",
      "reservationSharingPolicy": {
        "serviceShareType": "ALLOW_ALL"
      },
      "shareSettings": {
        "shareType": "SPECIFIC_PROJECTS",
        "projectMap": {
          "CONSUMER_PROJECT_ID_1": {
            "projectId": "CONSUMER_PROJECT_ID_1"
          },
          "CONSUMER_PROJECT_ID_2": {
            "projectId": "CONSUMER_PROJECT_ID_2"
          }
        }
      },
      "specificReservation": {
        "count": "NUMBER_OF_VMS",
        "instanceProperties": {
          "machineType": "MACHINE_TYPE"
        }
      }
    }
    
  • To enable Compute Engine to automatically delete the reservation, select one of the following methods:

    • To delete the reservation at a specific date and time, make a POST request to the beta.reservations.insert method. In the request body, include the deleteAtTime field.

      For example, to create a reservation while specifying a date and time to delete a reservation, and share the reservation with two consumer projects, make a request as follows:

      POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
      
      {
        "deleteAtTime": "DELETE_AT_TIME",
        "name": "RESERVATION_NAME",
        "shareSettings": {
          "shareType": "SPECIFIC_PROJECTS",
          "projectMap": {
            "CONSUMER_PROJECT_ID_1": {
              "projectId": "CONSUMER_PROJECT_ID_1"
            },
            "CONSUMER_PROJECT_ID_2": {
              "projectId": "CONSUMER_PROJECT_ID_2"
            }
          }
        },
        "specificReservation": {
          "count": "NUMBER_OF_VMS",
          "instanceProperties": {
            "machineType": "MACHINE_TYPE"
          }
        }
      }
      

      Replace DELETE_AT_TIME with a date and time formatted as an RFC 3339 timestamp, which uses the following format:

      YYYY-MM-DDTHH:MM:SSOFFSET
      

      Replace the following:

      • YYYY-MM-DD: a date formatted as a 4-digit year, 2-digit month, and a 2-digit day of the month, separated by hyphens (-).

      • HH:MM:SS: a time formatted as a 2-digit hour using a 24-hour time, 2-digit minutes, and 2-digit seconds, separated by colons (:).

      • OFFSET: the time zone formatted as an offset from Coordinated Universal Time (UTC). For example, to use Pacific Standard Time (PST), specify -08:00. To use UTC, specify Z.
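
      For example, 2027-06-30T09:30:00Z specifies 9:30 AM UTC on June 30, 2027.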

    • To delete the reservation after a specific duration, make a POST request to the beta.reservations.insert method. In the request body, include the deleteAfterDuration field.

      For example, to create a reservation that Compute Engine deletes after a specific duration, and share the reservation with two consumer projects, make a request as follows:

      POST https://compute.googleapis.com/compute/beta/projects/PROJECT_ID/zones/ZONE/reservations
      
      {
        "deleteAfterDuration": {
          "seconds": "DELETE_AFTER_DURATION"
        },
        "name": "RESERVATION_NAME",
        "shareSettings": {
          "shareType": "SPECIFIC_PROJECTS",
          "projectMap": {
            "CONSUMER_PROJECT_ID_1": {
              "projectId": "CONSUMER_PROJECT_ID_1"
            },
            "CONSUMER_PROJECT_ID_2": {
              "projectId": "CONSUMER_PROJECT_ID_2"
            }
          }
        },
        "specificReservation": {
          "count": "NUMBER_OF_VMS",
          "instanceProperties": {
            "machineType": "MACHINE_TYPE"
          }
        }
      }
      

      Replace DELETE_AFTER_DURATION with a duration in seconds. For example, specify 86400 for 86,400 seconds (1 day).

Troubleshooting

Learn how to troubleshoot reservation creation.

What's next