Authentication in HTTP Cloud Functions

This tutorial describes a method of authenticating the origin of an HTTP request for triggering HTTP Cloud Functions. It shows how to implement a Cloud Function in which the originator supplies an access token for an account that has been authorized to execute code protected by authentication.

A Cloud Function can be configured to be triggered in many ways, for example:

  • In response to messages published on a Cloud Pub/Sub topic.
  • In response to a GET, POST, PUT, DELETE, or OPTIONS HTTP request. Such a function is said to have an HTTP trigger, and is called an HTTP Cloud Function.

A fully qualified URL is automatically generated and assigned to an HTTP Cloud Function. Anyone can make an HTTP(S) request to this URL to trigger your function without needing any authorization. This is a great way to have your Cloud Functions respond to events originating from third-party systems like GitHub, Slack, Stripe, or any other system that can send HTTP(S) requests to a URL.

However, in other cases you might want to lock the function so that only authorized originators can use its functionality through its URL. A common way to lock an endpoint is to send authorization information for the originator in an HTTP request header field.

Granting and revoking authorization

In this tutorial, you use a specially designated bucket that you create as a proxy for managing the authorization to use a Cloud Function's protected functionality. You authorize an account to use a function's protected functionality by granting the account storage.buckets.get permission to the bucket. Similarly, you revoke authorization by removing the permission.

You trigger the function by making an HTTP GET request to its URL, and pass an access token corresponding to the account in the Authorization request header. When the function executes, it checks to see whether the account associated with the supplied access token has storage.buckets.get permission on the designated bucket (see Figure 1). If the account has permission, the function executes; otherwise, the function fails and returns a 403 HTTP status code.
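In other words, the function's decision reduces to a membership check on the permissions returned by the test: if storage.buckets.get comes back, the caller is authorized; otherwise it gets a 403. A minimal sketch of that check, where isAuthorized is an illustrative name rather than part of the tutorial's code, and the response shape mirrors a testIamPermissions result:

```javascript
// Decide whether a testIamPermissions-style response authorizes the caller.
// `response` is expected to look like {permissions: ['storage.buckets.get', ...]}.
function isAuthorized(response, permission) {
    return Boolean(response &&
                   response.permissions &&
                   response.permissions.includes(permission));
}

// An account holding the permission is authorized ...
console.log(isAuthorized({permissions: ['storage.buckets.get']}, 'storage.buckets.get')); // true
// ... while a missing or empty permissions field means a 403.
console.log(isAuthorized({}, 'storage.buckets.get'));   // false
console.log(isAuthorized(null, 'storage.buckets.get')); // false
```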

Figure 1: Authorization scheme.

Granting permissions to a resource in Cloud Storage is a strongly consistent operation. In other words, as soon as a service account is granted permissions to the bucket, you can execute the function using its credentials. In contrast, revoking permissions is an eventually consistent operation and often takes time, typically a minute, to take effect. For more details, see Cloud Storage operation consistency guarantees.
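Because revocation takes time to propagate, a client or test that has just revoked a permission may need to poll the function until it observes the 403. A minimal sketch of such a wait, assuming a hypothetical async checkAccess probe (for example, a wrapper around an HTTP request to the function's URL) that resolves to whether access is currently granted:

```javascript
// Poll `checkAccess()` until it reports the expected access state or the
// deadline passes. Returns true if the expected state was observed in time.
// `checkAccess` is a hypothetical async probe supplied by the caller.
async function waitForAccessState(checkAccess, expected, timeoutMs, intervalMs) {
    const deadline = Date.now() + timeoutMs;
    while (Date.now() < deadline) {
        if (await checkAccess() === expected) {
            return true;
        }
        await new Promise(resolve => setTimeout(resolve, intervalMs));
    }
    return false;
}
```

For example, after revoking a permission, `await waitForAccessState(probe, false, 90000, 5000)` keeps probing every five seconds for up to 90 seconds before concluding the revocation has not yet taken effect.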

While this authorization method will work with any kind of account, in this tutorial, you use two service accounts for testing authorization with the Cloud Function. A service account is a Google Account that's associated with your GCP project, as opposed to a specific user. For more information, see the overview of authentication.

You grant bucket read permission to one of the accounts, and confirm that the function executes when its access token is included in the triggering HTTP request. You then confirm that the function returns a 403 HTTP status code when the access token of the other service account, which lacks read permission on the bucket, is passed.

Objectives

  • Configure the development environment.
  • Create two service accounts and a Cloud Storage bucket for testing authorization.
  • Write and deploy an HTTP Cloud Function that can accept and authorize the access token received in an HTTP header.
  • Run the Cloud Function and observe its behavior corresponding to the bucket permissions.

Costs

This tutorial uses the following billable components of Google Cloud Platform:

  • Cloud Functions

You can use the pricing calculator to generate a cost estimate based on your projected usage. New GCP users might be eligible for a free trial.

Before you begin

  1. Select or create a GCP project.

    Go to the Manage resources page

  2. Make sure that billing is enabled for your project.

    Learn how to enable billing

  3. Enable the Cloud Functions API.

    Enable the API

  4. Install and initialize the Cloud SDK.
  5. Prepare your environment for Node.js development.

Configure your environment

To configure your environment, follow these steps from a Terminal window:

  1. Set up environment variables.

    You will be using the bucket name and the project identifier in many commands, so it is convenient to define environment variables in your shell. In the following code, replace [PROJECT_NAME] with your project identifier. Replace [BUCKET_NAME] with the name of the bucket (without the gs:// prefix) that you will use for managing permissions for service accounts. The bucket does not need to exist before you define these variables.

    export PROJECT_NAME=[PROJECT_NAME]
    export BUCKET_NAME=[BUCKET_NAME]

  2. Set the project as the default project.

    Many gcloud commands require that you specify a project. You can save time by setting a project as the default in your active gcloud configuration. In the following code, replace [PROJECT_NAME] with the Cloud Platform project identifier that you want to use for this tutorial.

    gcloud config set core/project [PROJECT_NAME]

Creating service accounts

You need to create two service accounts. In later steps, you grant one of the accounts read permission to the designated bucket and observe the operation of the Cloud Function.

  1. Create a service account named alpha-account:

    gcloud iam service-accounts create alpha-account --display-name "Alpha account"
  2. Create a second service account named beta-account:

    gcloud iam service-accounts create beta-account --display-name "Beta account"

Creating the bucket

Enter the following command to create the bucket, replacing [BUCKET_NAME] with the name of your bucket:

gsutil mb gs://${BUCKET_NAME}

If a bucket with this name exists on GCP, the command fails. In that case, choose a name that's not in use.

Writing the Cloud Function

  1. Create a directory on your local workstation for the Cloud Function code, and move into it:

    mkdir ~/gcf_auth
    cd ~/gcf_auth
  2. Create a package.json file. The default package manager for Node.js, npm, uses this JSON file to install the JavaScript modules required by a Node.js application. The Cloud Function you write uses the googleapis module, a Node.js client library for accessing Google APIs. Accordingly, create a package.json file in the gcf_auth directory with the following contents:

    {
      "dependencies": {
        "googleapis": "21.2"
      }
    }
  3. Create an index.js file. A Node.js-based Cloud Function is expected to be a Node.js module that can be loaded using a require() call. A common way to define a Node.js module is to export its entry point in an index.js file, which is how you implement the Cloud Function in this tutorial.

    The following shows secureFunction, which is the entry point into the Cloud Function. This function and all of its dependencies, also shown later, are part of index.js.

    const Google = require('googleapis');

    const BUCKET = '[BUCKET_NAME]'; // Replace with name of your bucket

    /**
     * Cloud Function.
     *
     * @param {Object} req Cloud Function request context.
     * @param {Object} res Cloud Function response context.
     */
    exports.secureFunction = function secureFunction(req, res) {
        var accessToken = getAccessToken(req.get('Authorization'));
        var oauth = new Google.auth.OAuth2();
        oauth.setCredentials({access_token: accessToken});

        var permission = 'storage.buckets.get';
        var gcs = Google.storage('v1');
        gcs.buckets.testIamPermissions(
            {bucket: BUCKET, permissions: [permission], auth: oauth}, {},
            function (err, response) {
                if (response && response['permissions'] &&
                        response['permissions'].includes(permission)) {
                    authorized(res);
                } else {
                    res.status(403).send("The request is forbidden.");
                }
            }
        );
    };
The entry point function begins by extracting the access token passed in the Authorization request header using the getAccessToken helper function, and uses it to create an OAuth2 client for Google APIs. The OAuth2 client is used for authenticating the testIamPermissions request on Cloud Storage. The request, which expects a bucket name and a set of permissions to test, responds with the tested permissions that the account has on the bucket.

In this case, the function checks for only one permission, storage.buckets.get. The response includes a permissions field containing the permission only if the account has that permission on the bucket. If the account has that permission, the HTTP request that triggered the Cloud Function is deemed to be authorized, and the authorized helper function is called. You can include the code that you want to run on a successful authorization in this helper function.

In all other cases, for example, if the access token was invalid or not passed, or if the account did not have storage.buckets.get permission on the bucket, the response will not have the permissions field with storage.buckets.get, and the originating HTTP request is deemed to have failed authorization. The function returns a 403 HTTP status code.

The following code shows the helper function for extracting an access token from the HTTP request that triggered the Cloud Function:

function getAccessToken(header) {
    if (header) {
        var match = header.match(/^Bearer\s+([^\s]+)$/);
        if (match) {
            return match[1];
        }
    }
    return null;
}
This function expects the access token to be present in the Authorization HTTP request header field as Bearer <access token>. If that field is present, the function returns the access token; otherwise, it returns null.
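For instance, a header of Bearer abc123 yields the token abc123, while a different scheme or a missing header yields null. The following self-contained snippet, which repeats the helper so it can run on its own, demonstrates that behavior:

```javascript
// Same extraction logic as the getAccessToken helper above,
// repeated here so the example is self-contained.
function getAccessToken(header) {
    if (header) {
        var match = header.match(/^Bearer\s+([^\s]+)$/);
        if (match) {
            return match[1];
        }
    }
    return null;
}

console.log(getAccessToken('Bearer abc123')); // 'abc123'
console.log(getAccessToken('Basic abc123'));  // null (wrong scheme)
console.log(getAccessToken(undefined));       // null (header absent)
```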

The helper function authorized is called when an access token is deemed to have been authorized to execute the function. This is where you add the code to be executed in that event. An example implementation is shown here:

// The code to be executed on successful authorization goes here.
function authorized(res) {
    res.send("The request was successfully authorized.");
}
Deploying the function

  1. Deploy the function by running the following command:

    gcloud beta functions deploy secureFunction --trigger-http

    The --trigger-http option generates and assigns a URL endpoint to the function so that it can be triggered with an HTTP request to that endpoint.

    It might take a few minutes for the command to finish running. After deployment is finished, you will see something like the following:

    Deploying function (may take a while - up to 2 minutes)...done.
    availableMemoryMb: 256
    entryPoint: secureFunction
    httpsTrigger:
      url: https://us-central1-gcf-secure.cloudfunctions.net/secureFunction
    latestOperation: operations/Z2NmLXNlY3VyZS91cy1jZW50cmFsMS9zZWN1cmUvT0FYclM0N2ttRmc
    name: projects/gcf-secure/locations/us-central1/functions/secureFunction

    Note the url value under httpsTrigger. This is the URL that you will use to trigger the Cloud Function. Add it to the shell as an environment variable to be used in later commands, replacing [FUNCTION_URL] with that value:

    export URL=[FUNCTION_URL]
  2. Verify the status of the deployment:

    gcloud beta functions describe secureFunction

    This command describes the configuration and status of the named Cloud Function, in this case, secureFunction. The command output looks something like this:

    status: READY
    timeout: 60s

A READY status indicates that the function has been deployed successfully and is ready to be invoked.

Obtaining access tokens

The Cloud Function requires an access token for an account to perform authorization. You supply the access token in the Authorization request header when invoking the Cloud Function via its URL. Therefore, you will need to generate access tokens for the service accounts you are using in this tutorial. The following steps show how to do that:

  1. Download credentials and generate an access token for the alpha-account service account:

    1. Download credentials:

      gcloud iam service-accounts keys create --iam-account alpha-account@${PROJECT_NAME} ./alpha-account.json
    2. Generate an access token and save it in an environment variable ALPHA_ACCOUNT_TOKEN for later use:

      export ALPHA_ACCOUNT_TOKEN=$(GOOGLE_APPLICATION_CREDENTIALS=./alpha-account.json gcloud auth application-default print-access-token)
  2. Download credentials for the beta-account service account:

    1. Download credentials:

      gcloud iam service-accounts keys create --iam-account beta-account@${PROJECT_NAME} ./beta-account.json
    2. Generate an access token and save it in an environment variable BETA_ACCOUNT_TOKEN for later use:

      export BETA_ACCOUNT_TOKEN=$(GOOGLE_APPLICATION_CREDENTIALS=./beta-account.json gcloud auth application-default print-access-token)

Running the function

  1. Verify that neither account has any permissions on the bucket. Run the following command to list all existing permissions for all members on the bucket:

    gsutil acl get gs://${BUCKET_NAME}

    Inspect the output and ensure neither the alpha-account nor beta-account name appears. If neither appears, it means that neither account has permissions related to the bucket.

    If either account appears in the output and you need to revoke permissions for that account, use the following command to revoke its permissions. The following command revokes permissions for alpha-account:

    gsutil acl ch -d alpha-account@${PROJECT_NAME} gs://${BUCKET_NAME}
  2. Verify that the Cloud Function reports the alpha-account token as forbidden. Make a GET request to the Cloud Function URL with the alpha-account access token in the Authorization header:

    curl ${URL} -H "Authorization: Bearer ${ALPHA_ACCOUNT_TOKEN}"

    This command returns the following message:

    The request is forbidden.

  3. Make a GET request to the same URL using the beta-account access token.

    curl ${URL} -H "Authorization: Bearer ${BETA_ACCOUNT_TOKEN}"

    This command returns the same message:

    The request is forbidden.

  4. Add read permission to the bucket for alpha-account:

    gsutil acl ch -u alpha-account@${PROJECT_NAME} gs://${BUCKET_NAME}
  5. Verify that the function accepted the access token for alpha-account by making another GET request to the Cloud Function URL:

    curl ${URL} -H "Authorization: Bearer ${ALPHA_ACCOUNT_TOKEN}"

    Here's the response:

    The request was successfully authorized.

    However, the access token for beta-account will continue to be denied.

  6. Remove the read permission on the bucket that you previously granted to alpha-account:

    gsutil acl ch -d alpha-account@${PROJECT_NAME} gs://${BUCKET_NAME}
  7. Verify that the credentials for alpha-account are no longer accepted.

    curl ${URL} -H "Authorization: Bearer ${ALPHA_ACCOUNT_TOKEN}"

    You'll see the following response:

    The request is forbidden.

    Recall that revoking permissions from a Cloud Storage resource is only eventually consistent. It might take a minute or so to take effect.

Also remember that the access tokens generated in this way expire after an hour. When that happens, all requests using that access token fail, regardless of whether the corresponding account has read permission on the bucket.

You can run the following command to get information about an access token (this assumes the OAuth2 tokeninfo endpoint, which matches the fields shown in the output below):

curl "https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=${ALPHA_ACCOUNT_TOKEN}"

The output of this command looks something like this:

{
  "azp": "104552263505891956924",
  "aud": "104552263505891956924",
  "scope": "",
  "exp": "1503901859",
  "expires_in": "2947",
  "access_type": "offline"
}

The expires_in value shows the number of seconds remaining before the access token expires. If a token expires, you can regenerate it in the same way you generated it the first time.
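If you script these checks, you can decide programmatically whether a token is still worth reusing. A minimal sketch, assuming the tokeninfo response has already been parsed into an object; tokenStillUsable is an illustrative name, not part of the tutorial:

```javascript
// Given a parsed tokeninfo response, report whether the token still has at
// least `marginSeconds` of life left; if not, it should be regenerated.
function tokenStillUsable(tokenInfo, marginSeconds) {
    // tokeninfo returns expires_in as a string of seconds remaining.
    const remaining = parseInt(tokenInfo.expires_in, 10);
    return Number.isFinite(remaining) && remaining > marginSeconds;
}

console.log(tokenStillUsable({expires_in: '2947'}, 60)); // true: about 49 minutes left
console.log(tokenStillUsable({expires_in: '12'}, 60));   // false: about to expire
```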

Cleaning up

To avoid incurring charges to your Google Cloud Platform account for the resources used in this tutorial, clean up the resources you created after you finish. The easiest way to eliminate billing is to delete the project you created for the tutorial.

  1. In the GCP Console, go to the Projects page.

    Go to the Projects page

  2. In the project list, select the checkbox next to the project that you want to delete, and then click Delete project.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next

Try out other Google Cloud Platform features for yourself. Have a look at our tutorials.
