Running a Local Extensible Service Proxy

These instructions explain how to configure an instance of the Extensible Service Proxy on a local machine or on another cloud provider.

Hosting a local instance of the proxy lets you verify that security settings are configured and working properly, and that metrics and logs appear as expected.

Before you begin

  1. Create or use a project that has enabled the Cloud Endpoints API.

  2. Install Docker as follows:

    sudo apt-get update
    sudo apt-get install docker-engine

    If your Linux distribution does not have a Docker package, follow the general Docker installation instructions.
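
Before continuing, it can help to confirm that Docker actually ended up on your PATH. The `require_cmd` helper below is our own convenience sketch, not part of the official steps:

```shell
#!/usr/bin/env bash
# Hypothetical helper -- not part of the official steps. Checks that a
# required command exists on the PATH before you rely on it.
require_cmd() {
  command -v "$1" >/dev/null 2>&1 ||
    { echo "error: '$1' is not installed or not on PATH" >&2; return 1; }
}

# Example: confirm docker is available before pulling the proxy image.
require_cmd docker || echo "Install Docker first, then re-run."
```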

Creating a service account

The Extensible Service Proxy uses a service account for authorization when checking with Google Service Control to see if a request is authorized or to log the results.

You need a project that has the Cloud Endpoints API enabled; an existing project works if the API is already enabled on it.

  1. In the Google Cloud Console, select or create a project.

  2. Open the Create service account page in the console.

  3. Under Service Account, select New service account.

  4. Enter a Service account name.

  5. Set the account Role:

    • Project -> Editor

    or select all of the following roles:

    • Project -> Viewer

    • Service Management -> Service Controller

    • Cloud Trace -> Cloud Trace Agent

  6. Click Create.

  7. Create a key for the newly created service account:

    • Click the options (three-dot) menu next to the newly created service account

    • Click Create key

This creates the account and downloads its private key. These instructions assume the key file is stored in ~/Downloads/serviceaccount.json. To follow along, you may want to copy and rename your key file to match.
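
Before wiring the key into the proxy, a quick sanity check on the downloaded file can save debugging time later. This `check_key_file` helper is our own sketch; a valid service account key is JSON containing, among other fields, "private_key" and "client_email":

```shell
#!/usr/bin/env bash
# Hypothetical sanity check -- not part of the official steps. A service
# account key file is JSON with "private_key" and "client_email" fields.
check_key_file() {
  local key_file="$1"
  [ -r "$key_file" ] || { echo "cannot read $key_file" >&2; return 1; }
  grep -q '"private_key"' "$key_file" && grep -q '"client_email"' "$key_file" ||
    { echo "$key_file does not look like a service account key" >&2; return 1; }
}

# Example: check the key saved from the console.
check_key_file ~/Downloads/serviceaccount.json && echo "key looks OK" || echo "key check failed"
```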

Starting your local server

For this tutorial, we use the Python Endpoints example from the GitHub repository, but you can use any language and any local server you prefer for testing.

  1. Clone the sample app repository to your local machine:

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples

  2. Change to the directory that contains the sample code:

    cd python-docs-samples/endpoints/getting-started

  3. Create a virtual environment and install the sample's dependencies:

    sudo easy_install virtualenv
    virtualenv .
    ./bin/pip install -r requirements.txt

  4. Edit openapi.yaml and replace YOUR-PROJECT-ID with your own project ID. This must be the same project in which the service account was created.

  5. Start the server:

    ./bin/python main.py

Deploying your OpenAPI configuration

To deploy your OpenAPI configuration:

  1. Deploy your configuration:

    gcloud service-management deploy openapi.yaml

  2. Display the configuration ID with the following command. Replace [YOUR-PROJECT-ID] with your project ID. Do not include the square brackets.

    gcloud service-management configs list --service=echo-api.endpoints.[YOUR-PROJECT-ID]
  3. Save the service name and the configuration ID in environment variables, so you can use them to set up the proxy. Replace [YOUR-PROJECT-ID] with your project ID and use the configuration ID from the previous step, for example:

    export SERVICE_NAME=echo-api.endpoints.[YOUR-PROJECT-ID]
    export CONFIG_ID=2016-12-14r1

You have now deployed the OpenAPI configuration. To use it, you will need to deploy the Extensible Service Proxy Docker container.
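
If you prefer not to copy IDs by hand, both values can be derived in the shell. This is a sketch: it assumes the `configs list` output is a table with a header row and the config ID in the first column, which may differ across gcloud versions, so treat the parsing as illustrative:

```shell
#!/usr/bin/env bash
# Sketch: derive SERVICE_NAME from the project ID, and pull the newest
# config ID out of tabular `configs list` output. The output format
# assumed here (header row, then "CONFIG_ID SERVICE_NAME" rows) may
# differ in your gcloud version.
PROJECT_ID="your-project-id"    # replace with your real project ID
export SERVICE_NAME="echo-api.endpoints.${PROJECT_ID}"

latest_config_id() {
  # Skip the header row, sort the date-based IDs, take the newest.
  awk 'NR > 1 { print $1 }' | sort | tail -n 1
}

# Example with captured output instead of a live gcloud call:
sample_output='CONFIG_ID       SERVICE_NAME
2016-12-13r2    echo-api.endpoints.your-project-id
2016-12-14r1    echo-api.endpoints.your-project-id'

export CONFIG_ID="$(printf '%s\n' "$sample_output" | latest_config_id)"
echo "$SERVICE_NAME $CONFIG_ID"
```

In practice you would pipe the real `gcloud service-management configs list` output into `latest_config_id` instead of the captured sample.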

Starting the proxy Docker container

The Extensible Service Proxy Docker container can be started using docker:

sudo docker run -d --name="esp" --net="host" -v ~/Downloads:/esp gcr.io/endpoints-release/endpoints-runtime:1 -s $SERVICE_NAME -v $CONFIG_ID -p 8082 -a localhost:8080 -k /esp/serviceaccount.json

There are several points to note about this docker run command:

  • The SERVICE_NAME and CONFIG_ID arguments refer to the environment variables set above; this allows the proxy to retrieve the API configuration.

  • -d starts the container in detached mode, so it runs in the background.

  • --name="esp" provides an easy to access name for the container. For instance, you could run docker logs esp to see logs from the container.

  • -v ~/Downloads:/esp maps your local ~/Downloads directory to the /esp directory in the container. This mapping is used by the -k argument, explained below. (This docker-level -v option is unrelated to the proxy's own -v $CONFIG_ID argument.)

  • -p 8082 sets the proxy to receive requests on port 8082.

  • --net="host" indicates that the Docker container should use the same network configuration as the host machine, allowing it to make calls to localhost on the host machine.

  • -a localhost:8080 indicates that your backend server receives requests at localhost on port 8080.

  • -k /esp/serviceaccount.json tells the proxy where to find the private key file. The /esp directory matches the mapping from the -v argument above, and the name of the serviceaccount.json file should match the name of the private key file you downloaded.
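
Because the docker options, the image name, and the proxy's own flags all share one command line, it can help to assemble the command as an array and print it before executing anything. A dry-run sketch (the values below are placeholders, and gcr.io/endpoints-release/endpoints-runtime:1 stands in for whichever ESP image version you use):

```shell
#!/usr/bin/env bash
# Dry-run sketch: build the docker command as an array so each flag stays
# a separate argument, then print it for inspection before executing.
SERVICE_NAME="echo-api.endpoints.your-project-id"   # placeholder
CONFIG_ID="2016-12-14r1"                            # placeholder

esp_cmd=(
  docker run -d --name=esp --net=host
  -v "$HOME/Downloads:/esp"                      # host key dir -> /esp in container
  gcr.io/endpoints-release/endpoints-runtime:1   # ESP image; version may differ
  -s "$SERVICE_NAME"                             # service name (proxy flag)
  -v "$CONFIG_ID"                                # config ID (proxy's -v, not docker's)
  -p 8082                                        # proxy listens here
  -a localhost:8080                              # backend address
  -k /esp/serviceaccount.json                    # key path inside the container
)

# Print instead of executing; run with sudo once it looks right.
printf '%s ' "${esp_cmd[@]}"; echo
```

Everything before the image name is interpreted by docker; everything after it is passed to the proxy, which is why -v legitimately appears twice with different meanings.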

Sending requests

We have configured the proxy Docker container to receive requests on port 8082. If you send requests directly to the server at http://localhost:8080, skipping the proxy, you will see that the authentication check performed by the proxy is bypassed:

curl -d '{"message":"hello world"}' -H "content-type:application/json" http://localhost:8080/echo

  {"message": "hello world"}

To test the API with an API Key:

  1. In the Google Cloud Console, create an API key.

    If you want to use an API Key from a different project, see Sharing an API.

  2. Save the API key to an environment variable, for example:

      export KEY=AIza...
  3. Send a request with the key:

      curl -d '{"message":"hello world"}' -H "content-type:application/json" "http://localhost:8082/echo?key=${KEY}"
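
The two requests above differ only in the port and the key, so they can share one small wrapper. A sketch — `build_echo_url` and `echo_request` are our own helper names, not part of the sample; quoting the URL keeps the shell from treating the `?` in `?key=...` as a glob pattern:

```shell
#!/usr/bin/env bash
# Hypothetical helpers -- our own names, not part of the sample app.
build_echo_url() {
  local port="$1" key="$2"
  local url="http://localhost:${port}/echo"
  [ -n "$key" ] && url="${url}?key=${key}"
  printf '%s\n' "$url"
}

echo_request() {
  # -s silences progress output; -d and -H match the curl calls above.
  curl -s -d '{"message":"hello world"}' \
       -H "content-type:application/json" \
       "$(build_echo_url "$1" "$2")"
}

# Usage (with the backend and proxy from the steps above running):
#   echo_request 8080          # direct to the backend, bypasses the key check
#   echo_request 8082 "$KEY"   # through the proxy, with the API key
```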

Cleaning up

Shut down and remove the esp Docker container using the docker tool:

sudo docker stop esp
sudo docker rm esp

If you want to clean up the deployed service configuration, see Deleting an API and API Instances.
