This tutorial shows you how to deploy a simple example gRPC service with the Extensible Service Proxy V2 (ESPv2) in a Managed Instance Group.
This tutorial uses the Python version of the bookstore-grpc sample. See the What's next section for gRPC samples in other languages.
For an overview of Cloud Endpoints, see About Endpoints and Endpoints architecture.
Objectives
Use the following high-level task list as you work through the tutorial. All tasks are required to successfully send requests to the API.
- Set up a Google Cloud project, and download required software. See Before you begin.
- Copy and configure files from the bookstore-grpc sample. See Configuring Endpoints.
- Deploy the Endpoints configuration to create an Endpoints service. See Deploying the Endpoints configuration.
- Deploy the API and ESPv2 on the Managed Instance Group backend. See Deploying the API backend.
- Send a request to the API. See Sending a request to the API.
- Avoid incurring charges to your Google Cloud account. See Clean up.
Costs
In this document, you use the following billable components of Google Cloud:
- Compute Engine
- Cloud Load Balancing
To generate a cost estimate based on your projected usage, use the pricing calculator.
When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Make a note of the project ID because it's needed later.
- Install and initialize the Google Cloud CLI.
- Update the gcloud CLI and install the Endpoints components:
  gcloud components update
- Make sure that the Google Cloud CLI (gcloud) is authorized to access your data and services on Google Cloud:
  gcloud auth login
  In the new browser tab that opens, select an account.
- Set the default project to your project ID:
  gcloud config set project YOUR_PROJECT_ID
  Replace YOUR_PROJECT_ID with your project ID. If you have other Google Cloud projects and you want to use gcloud to manage them, see Managing gcloud CLI Configurations.
- Follow the steps in the gRPC Python quickstart to install gRPC and the gRPC tools (a minimal install sketch follows this list).
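If you only need the tools for this tutorial, the quickstart's install steps amount to installing the grpcio and grpcio-tools packages with pip (optionally inside a virtual environment), for example:
python -m pip install --upgrade pip
python -m pip install grpcio grpcio-tools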
Configuring Endpoints
Clone the bookstore-grpc sample repository from GitHub.
To configure Endpoints:
- Create a self-contained protobuf descriptor file from your service .proto file:
  - Save a copy of bookstore.proto from the example repository. This file defines the Bookstore service's API.
  - Create the following directory:
    mkdir generated_pb2
  - Create the descriptor file, api_descriptor.pb, by using the protoc protocol buffers compiler. Run the following command in the directory where you saved bookstore.proto:
    python -m grpc_tools.protoc \
        --include_imports \
        --include_source_info \
        --proto_path=. \
        --descriptor_set_out=api_descriptor.pb \
        --python_out=generated_pb2 \
        --grpc_python_out=generated_pb2 \
        bookstore.proto
    In the preceding command, --proto_path is set to the current working directory. In your gRPC build environment, if you use a different directory for .proto input files, change --proto_path so the compiler searches the directory where you saved bookstore.proto.
- Create a gRPC API configuration YAML file:
  - Save a copy of the api_config.yaml file. This file defines the gRPC API configuration for the Bookstore service.
  - Replace MY_PROJECT_ID in your api_config.yaml file with your Google Cloud project ID. For example:
    #
    # Name of the service configuration.
    #
    name: bookstore.endpoints.example-project-12345.cloud.goog
    Note that the apis.name field value in this file must exactly match the fully-qualified API name from the .proto file; otherwise deployment won't work. The Bookstore service is defined in bookstore.proto inside the package endpoints.examples.bookstore. Its fully-qualified API name is endpoints.examples.bookstore.Bookstore, just as it appears in the api_config.yaml file:
    apis:
      - name: endpoints.examples.bookstore.Bookstore
See Configuring Endpoints for more information.
Deploying the Endpoints configuration
To deploy the Endpoints configuration, you use the gcloud endpoints services deploy command. This command uses Service Management to create a managed service.
- Make sure you are in the directory where the api_descriptor.pb and api_config.yaml files are located.
- Confirm that the default project that the gcloud command-line tool is currently using is the Google Cloud project that you want to deploy the Endpoints configuration to. Validate the project ID returned from the following command to make sure that the service doesn't get created in the wrong project:
  gcloud config list project
If you need to change the default project, run the following command:
gcloud config set project YOUR_PROJECT_ID
- Deploy the proto descriptor file and the configuration file by using the Google Cloud CLI:
  gcloud endpoints services deploy api_descriptor.pb api_config.yaml
As it is creating and configuring the service, Service Management outputs information to the terminal. When the deployment completes, a message similar to the following is displayed:
Service Configuration [CONFIG_ID] uploaded for service [bookstore.endpoints.example-project.cloud.goog]
CONFIG_ID is the unique Endpoints service configuration ID created by the deployment. For example:
Service Configuration [2017-02-13r0] uploaded for service [bookstore.endpoints.example-project.cloud.goog]
In the previous example, 2017-02-13r0 is the service configuration ID and bookstore.endpoints.example-project.cloud.goog is the service name. The service configuration ID consists of a date stamp followed by a revision number. If you deploy the Endpoints configuration again on the same day, the revision number is incremented in the service configuration ID.
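If you want to confirm what was deployed, you can optionally list the configurations that Service Management now has for your service with the gcloud CLI. The service name below is the example from this tutorial, so substitute your own:
gcloud endpoints configs list \
    --service=bookstore.endpoints.example-project.cloud.goog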
Checking required services
At a minimum, Endpoints and ESP require the following Google services to be enabled:

Name | Title
---|---
servicemanagement.googleapis.com | Service Management API
servicecontrol.googleapis.com | Service Control API
In most cases, the gcloud endpoints services deploy command enables these required services. However, the gcloud command completes successfully but doesn't enable the required services in the following circumstances:

- You used a third-party application such as Terraform, and you didn't include these services.
- You deployed the Endpoints configuration to an existing Google Cloud project in which these services were explicitly disabled.
Use the following command to confirm that the required services are enabled:
gcloud services list
If you do not see the required services listed, enable them:
gcloud services enable servicemanagement.googleapis.com
gcloud services enable servicecontrol.googleapis.com
Also enable your Endpoints service:
gcloud services enable ENDPOINTS_SERVICE_NAME
To determine the ENDPOINTS_SERVICE_NAME you can either:

- After deploying the Endpoints configuration, go to the Endpoints page in the Cloud console. The list of possible ENDPOINTS_SERVICE_NAME values is shown under the Service name column.
- For OpenAPI, the ENDPOINTS_SERVICE_NAME is what you specified in the host field of your OpenAPI spec. For gRPC, the ENDPOINTS_SERVICE_NAME is what you specified in the name field of your gRPC Endpoints configuration.
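Because gcloud services enable accepts multiple services in one invocation, you can also enable all three services in a single step once you know your Endpoints service name. The service name below is the example used in this tutorial, so substitute your own:
gcloud services enable \
    servicemanagement.googleapis.com \
    servicecontrol.googleapis.com \
    bookstore.endpoints.example-project.cloud.goog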
For more information about the gcloud commands, see gcloud services.
If you get an error message, see Troubleshooting Endpoints configuration deployment. See Deploying the Endpoints configuration for additional information.
Deploying the API backend
So far you have deployed the API configuration to Service Management, but you haven't yet deployed the code that serves the API backend. This section walks you through getting Docker set up on your Managed Instance Group and running the API backend code and the ESPv2 in a Docker container.
Create an instance template
Create a template that you will use to create a group of VM instances. Each instance created from the template launches an ESPv2 and a backend application server.
In the Google Cloud console, go to the Instance templates page.
Click Create instance template.
Under Name, enter load-balancing-espv2-template.
Under Machine configuration, set the Machine type to e2-micro.
Under Boot disk, set the Image to Container-Optimized OS stable version.
Under Firewall, select Allow HTTP traffic.
Click Management, security, disks, networking, sole tenancy to reveal the advanced settings.
Click the Management tab. Under Automation, enter the following Startup script. Remember to update ENDPOINTS_SERVICE_NAME.
sudo docker network create --driver bridge esp_net
sudo docker run \
    --detach \
    --name=bookstore \
    --net=esp_net \
    gcr.io/endpointsv2/python-grpc-bookstore-server:1
sudo docker run \
    --detach \
    --name=esp \
    --publish=80:9000 \
    --net=esp_net \
    gcr.io/endpoints-release/endpoints-runtime:2 \
    --service=ENDPOINTS_SERVICE_NAME \
    --rollout_strategy=managed \
    --listener_port=9000 \
    --healthz=/healthz \
    --backend=grpc://bookstore:8000
At instance startup, the script pulls and runs the Bookstore gRPC server container and the ESPv2 proxy container.
Click Create.
Wait until the template has been created before continuing.
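If you prefer to script this step rather than use the console, the template can also be created with the gcloud CLI. The following is a sketch that mirrors the settings above; the local file name startup-script.sh is an assumption for a file containing the startup script from the previous step, and the http-server tag corresponds to the console's Allow HTTP traffic checkbox:
gcloud compute instance-templates create load-balancing-espv2-template \
    --machine-type=e2-micro \
    --image-family=cos-stable \
    --image-project=cos-cloud \
    --tags=http-server \
    --metadata-from-file=startup-script=startup-script.sh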
Create a regional managed instance group
To run the application, use the instance template to create a regional managed instance group:
In the Google Cloud console, go to the Instance groups page.
Click Create instance group.
Under Name, enter load-balancing-espv2-group.
Under Location, select Multiple zones.
Under Region, select us-central1.
Click the Configure zones drop-down menu to reveal Zones. Select the following zones:
- us-central1-b
- us-central1-c
- us-central1-f
Under Instance template, select load-balancing-espv2-template.
Under Autoscaling, select Don't autoscale.
Set Number of instances to 3.
Under Instance redistribution, select On.
Under Autohealing and Health check, select No health check.
Click Create. This redirects you back to the Instance groups page.
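As with the template, the managed instance group can also be created from the command line. A sketch that mirrors the console settings above (the region is inferred from the zones):
gcloud compute instance-groups managed create load-balancing-espv2-group \
    --template=load-balancing-espv2-template \
    --size=3 \
    --zones=us-central1-b,us-central1-c,us-central1-f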
Create a load balancer
This section explains the steps required to create a regional load balancer that directs TCP traffic to your instance group.
In the Google Cloud console, go to the Create a load balancer page.
Under TCP Load Balancing, click Start configuration.
Under Internet facing or internal only, select From Internet to my VMs.
Under Multiple regions or single region, select Single region only.
Under Backend type, select Backend Service.
Click Continue.
Under Name, enter espv2-load-balancer.
Under Backend configuration, select region us-central1.
Select instance group load-balancing-espv2-group.
Under Health check, create a new health check.
- Under Name, enter espv2-load-balancer-check.
- Confirm that Protocol is TCP and Port is 80.
Under Frontend configuration, enter port number 80.
Under Review and finalize, verify that:
- The Instance group is load-balancing-espv2-group.
- The Region is us-central1.
- The Protocol is TCP.
- The IP:Port is EPHEMERAL:80.
After the load balancer is created, find the IP address from the Load Balancer page.
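For a scripted setup, a regional TCP load balancer with roughly the same shape can be assembled with the gcloud CLI. The following sketch reuses the names from this section; the forwarding rule name espv2-load-balancer-rule is an arbitrary choice, and you may need to adjust flags for your project:
gcloud compute health-checks create tcp espv2-load-balancer-check \
    --region=us-central1 \
    --port=80
gcloud compute backend-services create espv2-load-balancer \
    --load-balancing-scheme=EXTERNAL \
    --protocol=TCP \
    --region=us-central1 \
    --health-checks=espv2-load-balancer-check \
    --health-checks-region=us-central1
gcloud compute backend-services add-backend espv2-load-balancer \
    --region=us-central1 \
    --instance-group=load-balancing-espv2-group \
    --instance-group-region=us-central1
gcloud compute forwarding-rules create espv2-load-balancer-rule \
    --load-balancing-scheme=EXTERNAL \
    --region=us-central1 \
    --ports=80 \
    --backend-service=espv2-load-balancer
Either way, gcloud compute forwarding-rules list also shows the load balancer's external IP address.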
Sending a request to the API
If you are sending the request from one of the instances in which the Docker containers are running, you can replace SERVER_IP with localhost. Otherwise, replace SERVER_IP with the IP address of the load balancer from the previous section, or with the external IP address of one of the instances.
You can find the instances' external IP addresses by running:
gcloud compute instances list
To send requests to the sample API, you can use a sample gRPC client written in Python.
Clone the git repo where the gRPC client code is hosted:
git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
Change your working directory:
cd python-docs-samples/endpoints/bookstore-grpc/
Install dependencies:
pip install virtualenv
virtualenv env
source env/bin/activate
python -m pip install -r requirements.txt
Send a request to the sample API:
python bookstore_client.py --host SERVER_IP --port 80
Look at the activity graphs for your API in the Endpoints > Services page.
Go to the Endpoints Services page
It may take a few moments for the request to be reflected in the graphs.
Look at the request logs for your API in the Logs Explorer page.
If you don't get a successful response, see Troubleshooting response errors.
You just deployed and tested an API in Endpoints!
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.
Make sure that the gcloud CLI (gcloud) is authorized to access your data and services on Google Cloud:
gcloud auth login
Enter the following to display the project IDs for your Google Cloud projects:
gcloud projects list
Using the applicable project ID from the previous step, set the default Google Cloud project to the one that your application is in:
gcloud config set project YOUR_PROJECT_ID
Obtain the name of all managed services in your Google Cloud project:
gcloud endpoints services list
Delete the service from Service Management. Replace SERVICE_NAME with the name of the service you want to remove:
gcloud endpoints services delete SERVICE_NAME
Running gcloud endpoints services delete doesn't immediately delete the managed service. Service Management disables the managed service for 30 days, which allows you time to restore it if you need to. After 30 days, Service Management permanently deletes the managed service.
Go to the Load Balancer page.
Delete load balancer espv2-load-balancer with health check espv2-load-balancer-check.
Go to the Instance Groups page.
Delete load-balancing-espv2-group.
Go to the Instance Templates page.
Delete load-balancing-espv2-template.
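If you created the Compute Engine resources with the gcloud sketches above, or simply prefer the CLI, the same resources can be removed with commands along these lines; espv2-load-balancer-rule is the forwarding rule name assumed in the earlier load balancer sketch:
gcloud compute forwarding-rules delete espv2-load-balancer-rule --region=us-central1
gcloud compute backend-services delete espv2-load-balancer --region=us-central1
gcloud compute health-checks delete espv2-load-balancer-check --region=us-central1
gcloud compute instance-groups managed delete load-balancing-espv2-group --region=us-central1
gcloud compute instance-templates delete load-balancing-espv2-template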