Getting started with Cloud Endpoints gRPC for Managed Instance Group with ESPv2


This tutorial shows you how to deploy a simple example gRPC service with the Extensible Service Proxy V2 (ESPv2) in a Managed Instance Group.

This tutorial uses the Python version of the bookstore-grpc sample. See the What's next section for gRPC samples in other languages.

For an overview of Cloud Endpoints, see About Endpoints and Endpoints architecture.

Objectives

Use the following high-level task list as you work through the tutorial. All tasks are required to successfully send requests to the API.

  1. Set up a Google Cloud project, and download required software. See Before you begin.
  2. Copy and configure files from the bookstore-grpc sample. See Configuring Endpoints.
  3. Deploy the Endpoints configuration to create an Endpoints service. See Deploying the Endpoints configuration.
  4. Deploy the API and ESPv2 on the Managed Instance Group backend. See Deploying the API backend.
  5. Send a request to the API. See Sending a request to the API.
  6. Avoid incurring charges to your Google Cloud account. See Clean up.

Costs

In this document, you use the following billable components of Google Cloud:

  • Compute Engine
  • Cloud Load Balancing

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Make a note of the project ID because it's needed later.
  5. Install and initialize the Google Cloud CLI.
  6. Update the gcloud CLI and install the Endpoints components:
    gcloud components update
  7. Make sure that the Google Cloud CLI (gcloud) is authorized to access your data and services on Google Cloud:
    gcloud auth login
    In the new browser tab that opens, select an account.
  8. Set the default project to your project ID.
    gcloud config set project YOUR_PROJECT_ID

    Replace YOUR_PROJECT_ID with your project ID. If you have other Google Cloud projects, and you want to use gcloud to manage them, see Managing gcloud CLI Configurations.

  9. Follow the steps in the gRPC Python quickstart to install gRPC and the gRPC tools.


Configuring Endpoints

Clone the bookstore-grpc sample repository from GitHub.

To configure Endpoints:

  1. Create a self-contained protobuf descriptor file from your service .proto file:
    1. Save a copy of bookstore.proto from the example repository. This file defines the Bookstore service's API.
    2. Create the following directory to hold the generated Python files: mkdir generated_pb2
    3. Create the descriptor file, api_descriptor.pb, by using the protoc protocol buffers compiler. Run the following command in the directory where you saved bookstore.proto:
      python -m grpc_tools.protoc \
          --include_imports \
          --include_source_info \
          --proto_path=. \
          --descriptor_set_out=api_descriptor.pb \
          --python_out=generated_pb2 \
          --grpc_python_out=generated_pb2 \
          bookstore.proto

      In the preceding command, --proto_path is set to the current working directory. In your gRPC build environment, if you use a different directory for .proto input files, change --proto_path so the compiler searches the directory where you saved bookstore.proto.

  2. Create a gRPC API configuration YAML file:
    1. Save a copy of the api_config.yaml file. This file defines the gRPC API configuration for the Bookstore service.
    2. Replace MY_PROJECT_ID in your api_config.yaml file with your Google Cloud project ID. For example:
      #
      # Name of the service configuration.
      #
      name: bookstore.endpoints.example-project-12345.cloud.goog
      

      Note that the apis.name field value in this file must exactly match the fully-qualified API name from the .proto file; otherwise, deployment doesn't work. The Bookstore service is defined in bookstore.proto inside the package endpoints.examples.bookstore, so its fully-qualified API name is endpoints.examples.bookstore.Bookstore, exactly as it appears in the api_config.yaml file. (The Python sketch after these steps shows one way to confirm this name directly from api_descriptor.pb.)

      apis:
        - name: endpoints.examples.bookstore.Bookstore
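
If you want to double-check the fully-qualified service name before deploying, you can parse api_descriptor.pb with the protobuf runtime that the gRPC tools installed. The following is a minimal sketch, not part of the sample; it only assumes that the file was produced by the protoc command above and that you run it from the same directory:

# inspect_descriptor.py: print the fully-qualified service names in api_descriptor.pb
from google.protobuf import descriptor_pb2

file_set = descriptor_pb2.FileDescriptorSet()
with open("api_descriptor.pb", "rb") as f:
    file_set.ParseFromString(f.read())

# Each entry is a .proto file captured by --include_imports; print its services.
for proto_file in file_set.file:
    for service in proto_file.service:
        print(f"{proto_file.package}.{service.name}")

The output should include endpoints.examples.bookstore.Bookstore, the value that the apis.name field in api_config.yaml must match.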

See Configuring Endpoints for more information.

Deploying the Endpoints configuration

To deploy the Endpoints configuration, you use the gcloud endpoints services deploy command. This command uses Service Management to create a managed service.

  1. Make sure you are in the directory where the api_descriptor.pb and api_config.yaml files are located.
  2. Confirm that the default project that the gcloud command-line tool is currently using is the Google Cloud project that you want to deploy the Endpoints configuration to. Validate the project ID returned from the following command to make sure that the service doesn't get created in the wrong project.
    gcloud config list project
    

    If you need to change the default project, run the following command:

    gcloud config set project YOUR_PROJECT_ID
    
  3. Deploy the proto descriptor file and the configuration file by using the Google Cloud CLI:
    gcloud endpoints services deploy api_descriptor.pb api_config.yaml
    

    As it is creating and configuring the service, Service Management outputs information to the terminal. When the deployment completes, a message similar to the following is displayed:

    Service Configuration [CONFIG_ID] uploaded for service [bookstore.endpoints.example-project.cloud.goog]

    CONFIG_ID is the unique Endpoints service configuration ID created by the deployment. For example:

    Service Configuration [2017-02-13r0] uploaded for service [bookstore.endpoints.example-project.cloud.goog]
    

    In the previous example, 2017-02-13r0 is the service configuration ID and bookstore.endpoints.example-project.cloud.goog is the service name. The service configuration ID consists of a date stamp followed by a revision number. If you deploy the Endpoints configuration again on the same day, the revision number is incremented in the service configuration ID.
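
If you script this deployment, you can capture the service configuration ID from that confirmation line. The following is a rough sketch rather than official tooling; it assumes the gcloud CLI is on your PATH and prints the "Service Configuration [...] uploaded" message shown above (gcloud usually writes status messages to stderr, so both streams are searched):

# capture_config_id.py: run the deployment and extract the service configuration ID (sketch)
import re
import subprocess

result = subprocess.run(
    ["gcloud", "endpoints", "services", "deploy", "api_descriptor.pb", "api_config.yaml"],
    capture_output=True, text=True, check=True,
)

# gcloud typically writes progress and confirmation lines to stderr.
match = re.search(
    r"Service Configuration \[(\S+)\] uploaded for service \[(\S+)\]",
    result.stdout + result.stderr,
)
if match:
    config_id, service_name = match.groups()
    print(f"Deployed {service_name} with configuration {config_id}")
else:
    print("Deployment finished, but no confirmation line was found.")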

Checking required services

At a minimum, Endpoints and ESPv2 require the following Google services to be enabled:

Name                               Title
servicemanagement.googleapis.com   Service Management API
servicecontrol.googleapis.com      Service Control API

In most cases, the gcloud endpoints services deploy command enables these required services. However, the command can complete successfully without enabling them in the following circumstances:

  • You used a third-party application such as Terraform, and you didn't include these services.

  • You deployed the Endpoints configuration to an existing Google Cloud project in which these services were explicitly disabled.

Use the following command to confirm that the required services are enabled:

gcloud services list

If you do not see the required services listed, enable them:

gcloud services enable servicemanagement.googleapis.com
gcloud services enable servicecontrol.googleapis.com

Also enable your Endpoints service:

gcloud services enable ENDPOINTS_SERVICE_NAME

To determine the ENDPOINTS_SERVICE_NAME you can either:

  • After deploying the Endpoints configuration, go to the Endpoints page in the Cloud console. The possible values for ENDPOINTS_SERVICE_NAME are shown under the Service name column.

  • For OpenAPI, the ENDPOINTS_SERVICE_NAME is what you specified in the host field of your OpenAPI spec. For gRPC, the ENDPOINTS_SERVICE_NAME is what you specified in the name field of your gRPC Endpoints configuration.

For more information about the gcloud commands, see gcloud services.
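
If you prefer to verify this from a script, one option is to wrap the same gcloud command. This is only a sketch; it assumes gcloud is on your PATH and simply checks whether the required service names appear anywhere in the command's output:

# check_required_services.py: confirm the services Endpoints and ESPv2 need are enabled (sketch)
import subprocess

REQUIRED_SERVICES = [
    "servicemanagement.googleapis.com",
    "servicecontrol.googleapis.com",
]

# "gcloud services list" prints the services enabled for the current project.
output = subprocess.run(
    ["gcloud", "services", "list"], capture_output=True, text=True, check=True
).stdout

missing = [name for name in REQUIRED_SERVICES if name not in output]
if missing:
    print("Enable these services before continuing:", ", ".join(missing))
else:
    print("All required services are enabled.")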

If you get an error message, see Troubleshooting Endpoints configuration deployment. See Deploying the Endpoints configuration for additional information.

Deploying the API backend

So far you have deployed the API configuration to Service Management, but you haven't yet deployed the code that serves the API backend. This section walks you through running the API backend code and ESPv2 in Docker containers on the instances of a Managed Instance Group.

Create an instance template

Create a template that you will use to create a group of VM instances. Each instance created from the template launches an ESPv2 and a backend application server.

  1. In the Google Cloud console, go to the Instance templates page.

    Go to Instance templates

  2. Click Create instance template.

  3. Under Name, enter load-balancing-espv2-template.

  4. Under Machine configuration, set the Machine type to e2-micro.

  5. Under Boot disk, set the Image to the Container-Optimized OS stable version.

  6. Under Firewall, select Allow HTTP traffic.

  7. Click Management, security, disks, networking, sole tenancy to reveal the advanced settings.

  8. Click the Management tab. Under Automation, enter the following Startup script. Remember to update ENDPOINTS_SERVICE_NAME.

    sudo docker network create --driver bridge esp_net
    sudo docker run \
      --detach \
      --name=bookstore \
      --net=esp_net \
      gcr.io/endpointsv2/python-grpc-bookstore-server:1
    sudo docker run \
      --detach \
      --name=esp \
      --publish=80:9000 \
      --net=esp_net \
      gcr.io/endpoints-release/endpoints-runtime:2 \
      --service=ENDPOINTS_SERVICE_NAME \
      --rollout_strategy=managed \
      --listener_port=9000 \
      --healthz=/healthz \
      --backend=grpc://bookstore:8000
    

    The script pulls and starts two Docker containers at instance startup: the Bookstore gRPC sample server and the ESPv2 proxy server. (The probe sketch after the instance group procedure below shows one way to verify that ESPv2 is serving.)

  9. Click Create.

Wait until the template has been created before continuing.

Create a regional managed instance group

To run the application, use the instance template to create a regional managed instance group:

  1. In the Google Cloud console, go to the Instance groups page.

    Go to Instance groups

  2. Click Create instance group.

  3. Under Name, enter load-balancing-espv2-group.

  4. Under Location, select Multiple zones.

  5. Under Region, select us-central1.

  6. Click the Configure zones drop-down menu to reveal Zones. Select the following zones:

    • us-central1-b
    • us-central1-c
    • us-central1-f
  7. Under Instance template, select load-balancing-espv2-template.

  8. Under Autoscaling, select Don't autoscale.

  9. Set Number of instances to 3.

  10. Under Instance redistribution, select On.

  11. Under Autohealing and Health check, select No health check.

  12. Click Create. This redirects you back to the Instance groups page.
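
Once the instances in load-balancing-espv2-group are running, you can optionally confirm that the startup script worked before you create the load balancer. The following is a rough sketch, not a required step: it takes the external IP address of one instance (shown by gcloud compute instances list) and probes the /healthz path that the startup script configured ESPv2 to serve on port 80:

# probe_espv2.py: check the ESPv2 health endpoint on a single instance (sketch)
import sys
import urllib.request

# Pass the external IP of an instance created from load-balancing-espv2-template.
instance_ip = sys.argv[1] if len(sys.argv) > 1 else "INSTANCE_EXTERNAL_IP"

# The startup script publishes ESPv2 on port 80 and enables --healthz=/healthz.
url = f"http://{instance_ip}/healthz"
with urllib.request.urlopen(url, timeout=5) as response:
    print(f"{url} -> HTTP {response.status}")

A 200 response indicates that ESPv2 started and is listening; if the request times out, check the instance's startup script logs and the Allow HTTP traffic firewall setting.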

Create a load balancer

This section explains the steps required to create a regional load balancer that directs TCP traffic to your instance group.

  1. In the Google Cloud console, go to the Create a load balancer page.

    Go to Create a load balancer

  2. Under TCP Load Balancing, click Start configuration.

  3. Under Internet facing or internal only, select From Internet to my VMs.

  4. Under Multiple regions or single region, select Single region only.

  5. Under Backend type, select Backend Service.

  6. Click Continue.

  7. Under Name, enter espv2-load-balancer.

  8. Under Backend configuration, select region us-central1.

  9. Select instance group load-balancing-espv2-group.

  10. Under Health check, create a new health check.

    • Under Name, enter espv2-load-balancer-check.
    • Confirm Protocol is TCP, Port is 80.
  11. Under Frontend configuration, enter port number 80.

  12. Under Review and finalize, verify

    • The Instance group is load-balancing-espv2-group.
    • The Region is us-central1.
    • The Protocol is TCP.
    • The IP:Port is EPHEMERAL:80.
  13. After the load balancer is created, find the IP address from the Load Balancer page.

    Go to Load Balancer

Sending a request to the API

To send requests through the load balancer, replace SERVER_IP in the following steps with the external IP address of the load balancer, which you found on the Load Balancer page. You can also test a single backend instance directly: if you send the request from the same instance in which the Docker containers are running, replace SERVER_IP with localhost; otherwise, use that instance's external IP address.

You can find an instance's external IP address by running:

gcloud compute instances list

To send requests to the sample API, you can use a sample gRPC client written in Python.

  1. Clone the git repo where the gRPC client code is hosted:

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
       

  2. Change your working directory:

    cd python-docs-samples/endpoints/bookstore-grpc/
      

  3. Install dependencies:

    pip install virtualenv
    virtualenv env
    source env/bin/activate
    python -m pip install -r requirements.txt

  4. Send a request to the sample API:

    python bookstore_client.py --host SERVER_IP --port 80
    

If you don't get a successful response, see Troubleshooting response errors.
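
For reference, bookstore_client.py boils down to a few gRPC calls. The following simplified sketch assumes the generated bookstore_pb2_grpc module from the sample repository and a plaintext channel on port 80, matching how ESPv2 is configured in this tutorial; the sample client adds the command-line options such as --host and --port shown above:

# minimal_bookstore_client.py: call the Bookstore API through ESPv2 (sketch)
import grpc
from google.protobuf import empty_pb2

import bookstore_pb2_grpc  # generated from bookstore.proto in the sample repository

SERVER_IP = "SERVER_IP"  # external IP of the load balancer (or an instance)

# ESPv2 accepts plaintext gRPC on port 80 in this tutorial, so use an insecure channel.
channel = grpc.insecure_channel(f"{SERVER_IP}:80")
stub = bookstore_pb2_grpc.BookstoreStub(channel)

# ListShelves takes an empty request and returns the shelves in the bookstore.
response = stub.ListShelves(empty_pb2.Empty())
print(response)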

You just deployed and tested an API in Endpoints!

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

  1. Make sure that the gcloud CLI (gcloud) is authorized to access your data and services on Google Cloud:

    gcloud auth login
    
  2. Enter the following to display the project IDs for your Google Cloud projects:

    gcloud projects list
    
  3. Using the applicable project ID from the previous step, set the default Google Cloud project to the one that your application is in:

    gcloud config set project [YOUR_PROJECT_ID]
    
  4. Obtain the name of all managed services in your Google Cloud project:

    gcloud endpoints services list
    
  5. Delete the service from Service Management. Replace SERVICE_NAME with the name of the service you want to remove.

    gcloud endpoints services delete SERVICE_NAME
    

    Running gcloud endpoints services delete doesn't immediately delete the managed service. Service Management disables the managed service for 30 days, which allows you time to restore it if you need to. After 30 days, Service Management permanently deletes the managed service.

  6. Go to the Load Balancer page.

    Go to Load Balancer

    Delete load balancer espv2-load-balancer with health check espv2-load-balancer-check.

  7. Go to the Instance Groups page.

    Go to Instance Groups

    Delete load-balancing-espv2-group.

  8. Go to the Instance Template page.

    Go to Instance Templates

    Delete load-balancing-espv2-template.

What's next