Getting Started with gRPC on Compute Engine

This page shows you how to deploy a simple example gRPC service with the Google Cloud Endpoints Extensible Service Proxy (ESP) in a Docker container on Google Compute Engine.

For an overview of Cloud Endpoints, see About Cloud Endpoints and Cloud Endpoints Architecture.

Task List

Use the following high-level task list as you work through the tutorial. All tasks are required to successfully send requests to the API.

  1. Set up a Cloud Platform project, and download required software. See Before you begin.
  2. Create a Compute Engine VM instance. See Creating a Compute Engine instance.
  3. Copy and configure files from the bookstore-grpc sample. See Configuring Endpoints.
  4. Deploy the Endpoints configuration to create a Cloud Endpoints service. See Deploying the Endpoints configuration.
  5. Deploy the API and Extensible Service Proxy on the Compute Engine VM. See Deploying the API backend.
  6. Send a request to the API. See Sending a request to the API.
  7. Avoid incurring charges to your Google Cloud Platform account. See Clean up.

Before you begin

  1. Sign in to your Google account.

    If you don't already have one, sign up for a new account.

  2. Select or create a Cloud Platform project.

    Go to the Manage resources page

  3. Enable billing for your project.

    Enable billing

  4. Note the project ID, because you'll need it later.
  5. Install and initialize the Cloud SDK.
  6. Update the Cloud SDK and install the Endpoints components:
    gcloud components update
  7. Make sure that Cloud SDK (gcloud) is authorized to access your data and services on Google Cloud Platform:
    gcloud auth login
    A new browser tab opens and you are prompted to choose an account.
  8. Set the default project to your project ID.
    gcloud config set project [YOUR_PROJECT_ID]

    Replace [YOUR_PROJECT_ID] with your project ID. Do not include the square brackets. If you have other Cloud Platform projects, and you want to use gcloud to manage them, see Managing Cloud SDK Configurations.

  9. Install protoc, the protocol buffers compiler.
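Before moving on, it can help to confirm that the command-line tools the tutorial relies on are actually installed. The sketch below is a minimal, hypothetical check (not part of the official tutorial) that looks for gcloud and protoc on your PATH:

```python
import shutil

# Tools used later in this tutorial. This list is an assumption based on
# the steps above, not an exhaustive requirements check.
REQUIRED_TOOLS = ["gcloud", "protoc"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the subset of `tools` that cannot be found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

if __name__ == "__main__":
    missing = missing_tools()
    if missing:
        print("Missing tools:", ", ".join(missing))
    else:
        print("All required tools found.")
```

If anything is reported missing, revisit the corresponding installation step above before continuing.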

Creating a Compute Engine instance

  1. In the Cloud Platform Console, go to the VM Instances page.

    Go to the VM Instances page

  2. Click the Create instance button.
  3. In the Boot disk section, click Change to begin configuring your boot disk.
  4. In the OS images tab, choose the Debian 8 image.
  5. Click Select.
  6. In the Firewall section, select Allow HTTP traffic and Allow HTTPS traffic.
  7. Click the Create button to create the instance.

    Allow a short time for the instance to start up. Once ready, it will be listed on the VM Instances page with a green status icon.

  8. Make sure that you can connect to your VM instance.
    1. In the list of virtual machine instances, click SSH in the row of the instance that you want to connect to.
  9. You can now use the terminal to run Linux commands on your Debian 8 instance. When you are done, enter exit to disconnect from the instance.
  10. Note the instance Name, Zone, and External IP address because you'll need them later.

Configuring Endpoints

The bookstore-grpc sample contains all the files that you need to copy and configure locally.

To configure Endpoints:

  1. Create a self-contained protobuf descriptor file from your service .proto file:
    1. Save a copy of bookstore.proto from our example repo. This file defines the Bookstore service's API.
    2. Create the descriptor file using the protocol buffers compiler, protoc:

      protoc --include_imports --include_source_info bookstore.proto --descriptor_set_out out.pb

      You may need to provide additional flags (-I) for the root directories for the imported .proto files.

  2. Create a gRPC API Configuration YAML file:
    1. Save a copy of api_config.yaml. This file defines the gRPC API configuration for the Bookstore service.
    2. Replace <MY_PROJECT_ID> in your api_config.yaml file with your Cloud Platform Console project ID.

      Note that the apis: name value in this file must exactly match the fully-qualified API name from the .proto file; otherwise, deployment won't work. The Bookstore service is defined in bookstore.proto inside package endpoints.examples.bookstore. Its fully-qualified API name is endpoints.examples.bookstore.Bookstore, just as it appears in api_config.yaml.

        - name: endpoints.examples.bookstore.Bookstore
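Because a mismatch here silently breaks deployment, it can be worth checking mechanically. The sketch below is a hypothetical cross-check (not part of the tutorial) that extracts the fully-qualified API name from .proto source and compares it against the apis: name entries in the YAML config; the embedded strings are trimmed stand-ins for the real files:

```python
import re

# Trimmed stand-in for bookstore.proto.
PROTO_SRC = """
package endpoints.examples.bookstore;
service Bookstore {}
"""

# Trimmed stand-in for api_config.yaml.
API_CONFIG_SRC = """
apis:
  - name: endpoints.examples.bookstore.Bookstore
"""

def proto_api_name(proto_text):
    """Build '<package>.<service>' from .proto source text."""
    package = re.search(r"^package\s+([\w.]+);", proto_text, re.M).group(1)
    service = re.search(r"^service\s+(\w+)", proto_text, re.M).group(1)
    return f"{package}.{service}"

def config_api_names(yaml_text):
    """Collect the `name:` values listed under apis: in the config."""
    return re.findall(r"-\s*name:\s*([\w.]+)", yaml_text)

# The deployment requirement described above, as an assertion.
assert proto_api_name(PROTO_SRC) in config_api_names(API_CONFIG_SRC)
```

Running the same comparison against your real files before deploying catches the most common configuration mistake in this step.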

Deploying the Endpoints configuration

To deploy the Endpoints configuration, you use Google Service Management, an infrastructure service of Google Cloud Platform that manages other APIs and services, including services created using Cloud Endpoints.

  1. Deploy the proto descriptor file and the configuration file using the gcloud command-line tool:
    gcloud service-management deploy out.pb api_config.yaml
  2. Wait for the operation to complete. After it is done, make a note of the service name and the service config ID in the command output; you'll need them when deploying your backend:
    Service Configuration [SERVICE_CONFIG_ID] uploaded for service [SERVICE_NAME]

    When you copy the service name and service configuration ID, do not include the square brackets.
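If you script your deployment, you can pull both values out of that output line programmatically instead of copying them by hand. A minimal sketch, assuming the output format shown above (the config ID and service name below are made-up placeholders):

```python
import re

# Example of the output line shown above; both bracketed values are
# placeholders, not real deployment output.
DEPLOY_LINE = (
    "Service Configuration [2017-02-13r0] "
    "uploaded for service [bookstore.endpoints.example-project.cloud.goog]"
)

def parse_deploy_output(line):
    """Return (service_config_id, service_name) with the brackets stripped,
    since the brackets are output delimiters, not part of the values."""
    m = re.search(
        r"Service Configuration \[([^\]]+)\] uploaded for service \[([^\]]+)\]",
        line,
    )
    return m.group(1), m.group(2)

config_id, service_name = parse_deploy_output(DEPLOY_LINE)
```

The parsed values can then be exported as the SERVICE_CONFIG_ID and SERVICE_NAME variables used later in this tutorial.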

Deploying the API backend

So far you have deployed the API configuration to Service Management, but you have not yet deployed the code that will serve the API backend. This section walks you through getting Docker set up on your VM instance and running the API backend code and the Extensible Service Proxy in a Docker container.

Installing Docker on the VM instance

To install Docker on the VM instance:

  1. Set the zone for your project by invoking the command:
    gcloud config set compute/zone [YOUR_INSTANCE_ZONE]

    Replace [YOUR_INSTANCE_ZONE] with the zone where your instance is running. Do not include the square brackets.

  2. Connect to your instance using the following command:
    gcloud compute ssh [INSTANCE_NAME]

    Replace [INSTANCE_NAME] with your VM instance name. Do not include the square brackets.

  3. Follow the steps in the Docker documentation to set up the Docker repository for amd64 and install Docker CE on the instance.

    The Docker docs list several sudo add-apt-repository commands. Make sure you run the one for amd64:

    $ sudo add-apt-repository \
        "deb [arch=amd64] https://download.docker.com/linux/$(. /etc/os-release; echo "$ID") \
        $(lsb_release -cs) \
        stable"

Running the sample API and ESP in a Docker container

To run the sample gRPC service with ESP in a Docker container so that clients can use it:

  1. Create your own container network called esp_net:
    sudo docker network create --driver bridge esp_net
  2. Run the sample Bookstore server that serves the sample API:
    sudo docker run --detach --name=bookstore \
        --net=esp_net \
        gcr.io/endpointsv2/python-grpc-bookstore-server:1
  3. Run the pre-packaged Extensible Service Proxy Docker container:
    sudo docker run --detach --name=esp \
        -p 80:9000 \
        --net=esp_net \
        gcr.io/endpoints-release/endpoints-runtime:1 \
        -s ${SERVICE_NAME} \
        -v ${SERVICE_CONFIG_ID} \
        -P 9000 \
        -a grpc://bookstore:8000
    The following arguments specify how you want to run the Extensible Service Proxy container:
    • -s (or --service): specifies the name of your Endpoints service
    • -v (or --version): specifies the service config ID of the Endpoints service
    • -P (or --http2_port): specifies the port that accepts HTTP2 connections
    • -a (or --backend): specifies the application backend to which the ESP proxies requests. In this example, the grpc:// prefix indicates that the backend accepts gRPC traffic.

If you have Transcoding enabled, make sure to configure a port for HTTP1.1 or SSL traffic.
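To make the flag/value pairing in that docker run invocation explicit, here is a sketch that assembles the same command as an argument list. The service name, config ID, and image tag below are placeholders for your own deployment values:

```python
# Placeholder values; substitute your own service name and config ID
# from the deployment step.
SERVICE_NAME = "bookstore.endpoints.example-project.cloud.goog"
SERVICE_CONFIG_ID = "2017-02-13r0"
ESP_IMAGE = "gcr.io/endpoints-release/endpoints-runtime:1"

def esp_run_command(service, config_id, image=ESP_IMAGE):
    """Build the argv list for running ESP in front of the gRPC backend."""
    return [
        "docker", "run", "--detach", "--name=esp",
        "-p", "80:9000",                # expose ESP's HTTP2 port on host port 80
        "--net=esp_net",                # same bridge network as the backend
        image,
        "-s", service,                  # Endpoints service name
        "-v", config_id,                # service config ID
        "-P", "9000",                   # port accepting HTTP2 connections
        "-a", "grpc://bookstore:8000",  # gRPC backend address
    ]
```

Everything before the image name is an argument to docker run itself; everything after it is passed to ESP.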

Sending a request to the API

To send requests to the sample API, you can use a sample gRPC client written in Python.

  1. Clone the git repo where the gRPC client code is hosted:

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
  2. Change your working directory:

    cd python-docs-samples/endpoints/bookstore-grpc/
  3. Install dependencies:

    virtualenv env
    source env/bin/activate
    python -m pip install -r requirements.txt
  4. Send a request to the sample API:

    python bookstore_client.py --host $SERVER_IP --port 80
  5. Look at the activity graphs for your API in the Endpoints page.
    View Endpoints activity graphs
    It may take a few moments for the request to be reflected in the graphs.

  6. Look at the request logs for your API in the Logs Viewer page.
    View Endpoints request logs

If you’re sending the request from the same instance in which the Docker containers are running, you can replace $SERVER_IP with localhost. Otherwise replace $SERVER_IP with the external IP of the instance. The external IP can be found by executing

gcloud compute instances list
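The host-selection rule above can be summarized in a couple of helper functions. This is a hypothetical sketch, assuming the client script is named bookstore_client.py as in the sample repo:

```python
def server_host(on_same_instance, external_ip):
    """Pick the --host value: localhost when the client runs on the same
    instance as the Docker containers, otherwise the external IP."""
    return "localhost" if on_same_instance else external_ip

def client_command(server_ip, port=80):
    """Build the argv list for invoking the sample client."""
    return ["python", "bookstore_client.py",
            "--host", server_ip, "--port", str(port)]
```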

You just deployed and tested an API in Cloud Endpoints!

Clean up

To avoid incurring charges to your Google Cloud Platform account for the resources used in this quickstart:

  1. Delete the API:
    gcloud service-management delete [SERVICE_NAME]

    Replace [SERVICE_NAME] with the name of your service.

  2. In the Cloud Platform Console, go to the VM Instances page.

    Go to the VM Instances page

  3. Click the checkbox next to the instance you want to delete.
  4. Click the Delete button at the top of the page to delete the instance.
