HTTP Functions

HTTP Functions are used when you want to directly invoke your function via HTTP(S). To allow for HTTP semantics, HTTP function signatures accept HTTP-specific arguments.

Sample Usage

The example below shows how to process an HTTP POST request containing a name parameter:

Node.js

const escapeHtml = require('escape-html');

/**
 * HTTP Cloud Function.
 *
 * @param {Object} req Cloud Function request context.
 *                     More info: https://expressjs.com/en/api.html#req
 * @param {Object} res Cloud Function response context.
 *                     More info: https://expressjs.com/en/api.html#res
 */
exports.helloHttp = (req, res) => {
  res.send(`Hello ${escapeHtml(req.body.name || 'World')}!`);
};

Python (Beta)

from flask import escape

def hello_http(request):
    """HTTP Cloud Function.
    Args:
        request (flask.Request): The request object.
        <http://flask.pocoo.org/docs/0.12/api/#flask.Request>
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """
    request_json = request.get_json()
    if request_json and 'name' in request_json:
        name = escape(request_json['name'])
    else:
        name = 'World'
    return 'Hello, {}!'.format(name)

The following command shows how to call the function and pass it a parameter using curl:

curl -X POST HTTP_TRIGGER_ENDPOINT -H "Content-Type: application/json" -d '{"name":"Jane"}'

where HTTP_TRIGGER_ENDPOINT is the URL for the function, obtained when the function is deployed. For more information, see HTTP Triggers.

Parsing HTTP requests

The body of the request is automatically parsed based on the content-type header and made available via your HTTP function's arguments.

The example below shows how to read HTTP requests in various formats:

Node.js

const escapeHtml = require('escape-html');

/**
 * Responds to an HTTP request using data from the request body parsed according
 * to the "content-type" header.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloContent = (req, res) => {
  let name;

  switch (req.get('content-type')) {
    // '{"name":"John"}'
    case 'application/json':
      name = req.body.name;
      break;

    // 'John', stored in a Buffer
    case 'application/octet-stream':
      name = req.body.toString(); // Convert buffer to a string
      break;

    // 'John'
    case 'text/plain':
      name = req.body;
      break;

    // 'name=John' in the body of a POST request (not the URL)
    case 'application/x-www-form-urlencoded':
      name = req.body.name;
      break;
  }

  res.status(200).send(`Hello ${escapeHtml(name || 'World')}!`);
};

Python (Beta)

from flask import escape

def hello_content(request):
    """ Responds to an HTTP request using data from the request body parsed
    according to the "content-type" header.
    Args:
        request (flask.Request): The request object.
        <http://flask.pocoo.org/docs/0.12/api/#flask.Request>
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """
    content_type = request.headers['content-type']
    if content_type == 'application/json':
        name = request.json.get('name')
    elif content_type == 'application/octet-stream':
        name = request.data.decode('utf-8')  # Decode raw bytes to a string
    elif content_type == 'text/plain':
        name = request.data.decode('utf-8')  # Decode raw bytes to a string
    elif content_type == 'application/x-www-form-urlencoded':
        name = request.form.get('name')
    else:
        raise ValueError("Unknown content type: {}".format(content_type))
    return 'Hello, {}!'.format(escape(name))

Handling CORS requests

Cross-Origin Resource Sharing (CORS) is a way to let applications running on one domain access content from another domain, for example, letting yourdomain.com make requests to region-project.cloudfunctions.net/yourfunction.

If CORS isn't set up properly, you're likely to get errors that look like this:

XMLHttpRequest cannot load https://region-project.cloudfunctions.net/function.
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://yourdomain.com' is therefore not allowed access.

CORS consists of two requests:

  • A preflight OPTIONS request.
  • A main request that follows the OPTIONS request.

The preflight request contains headers indicating which method (Access-Control-Request-Method) and which additional headers (Access-Control-Request-Headers) will be sent in the main request, as well as the origin of the main request (Origin).

To handle a preflight request, you must set the appropriate Access-Control-Allow-* headers to match the requests you want to accept:

Node.js

/**
 * HTTP function that supports CORS requests.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.corsEnabledFunction = (req, res) => {
  // Set CORS headers for preflight requests
  // Allows GETs from any origin with the Content-Type header
  // and caches preflight response for 3600s

  res.set('Access-Control-Allow-Origin', '*');

  if (req.method === 'OPTIONS') {
    // Send response to OPTIONS requests
    res.set('Access-Control-Allow-Methods', 'GET');
    res.set('Access-Control-Allow-Headers', 'Content-Type');
    res.set('Access-Control-Max-Age', '3600');
    res.status(204).send('');
  } else {
    // Set CORS headers for the main request
    res.set('Access-Control-Allow-Origin', '*');
    res.send('Hello World!');
  }
};

Python (Beta)

def cors_enabled_function(request):
    # For more information about CORS and CORS preflight requests, see
    # https://developer.mozilla.org/en-US/docs/Glossary/Preflight_request

    # Set CORS headers for the preflight request
    if request.method == 'OPTIONS':
        # Allows GET requests from any origin with the Content-Type
        # header and caches preflight responses for 3600s
        headers = {
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'GET',
            'Access-Control-Allow-Headers': 'Content-Type',
            'Access-Control-Max-Age': '3600'
        }

        return ('', 204, headers)

    # Set CORS headers for the main request
    headers = {
        'Access-Control-Allow-Origin': '*'
    }

    return ('Hello World!', 200, headers)

Alternatively, you can use a library like cors to handle CORS for you.
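
The sketch below shows one way to do this in Node.js. It is a minimal sketch that assumes the cors package from npm is declared as a dependency in your function's package.json; origin: true reflects the request's origin back in the response, and you can pass a specific origin instead.

const cors = require('cors')({origin: true});

/**
 * HTTP function that delegates CORS handling (including preflight
 * OPTIONS requests) to the cors middleware.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.corsMiddlewareFunction = (req, res) => {
  // The middleware sets the Access-Control-* headers, answers OPTIONS
  // preflight requests itself, and invokes the callback for main requests.
  cors(req, res, () => {
    res.send('Hello World!');
  });
};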

Authentication and CORS

If you plan to send a request with an Authorization header, you must:

  1. Add the Authorization header to Access-Control-Allow-Headers.
  2. Set the Access-Control-Allow-Credentials header to true.
  3. Set a specific origin in Access-Control-Allow-Origin (wildcards are not accepted).

Node.js

/**
 * HTTP function that supports CORS requests with credentials.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.corsEnabledFunctionAuth = (req, res) => {
  // Set CORS headers for preflight requests
  // Allows GETs from origin https://mydomain.com with Authorization header

  res.set('Access-Control-Allow-Origin', 'https://mydomain.com');
  res.set('Access-Control-Allow-Credentials', 'true');

  if (req.method === 'OPTIONS') {
    // Send response to OPTIONS requests
    res.set('Access-Control-Allow-Methods', 'GET');
    res.set('Access-Control-Allow-Headers', 'Authorization');
    res.set('Access-Control-Max-Age', '3600');
    res.status(204).send('');
  } else {
    res.send('Hello World!');
  }
};

Python (Beta)

def cors_enabled_function_auth(request):
    # For more information about CORS and CORS preflight requests, see
    # https://developer.mozilla.org/en-US/docs/Glossary/Preflight_request

    # Set CORS headers for preflight requests
    if request.method == 'OPTIONS':
        # Allows GET requests from origin https://mydomain.com with
        # Authorization header
        headers = {
            'Access-Control-Allow-Origin': 'https://mydomain.com',
            'Access-Control-Allow-Methods': 'GET',
            'Access-Control-Allow-Headers': 'Authorization',
            'Access-Control-Max-Age': '3600',
            'Access-Control-Allow-Credentials': 'true'
        }
        return ('', 204, headers)

    # Set CORS headers for main requests
    headers = {
        'Access-Control-Allow-Origin': 'https://mydomain.com',
        'Access-Control-Allow-Credentials': 'true'
    }

    return ('Hello World!', 200, headers)

Hosting on the same domain

Instead of implementing CORS, you can host your website and your functions on the same domain. Since requests then come from the same origin, CORS is not enforced. This simplifies your code considerably.

The easiest way to do this is to integrate Firebase Hosting with Google Cloud Functions.
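
As a minimal sketch, a Firebase Hosting rewrite in firebase.json can route a path on your Hosting site to a function. The public directory, the /api/** path, and the helloHttp function name below are placeholder assumptions:

{
  "hosting": {
    "public": "public",
    "rewrites": [
      {
        "source": "/api/**",
        "function": "helloHttp"
      }
    ]
  }
}

With this configuration, a browser request to /api/... on your Hosting site is served by helloHttp from the same origin as the page, so no CORS headers are needed.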

Handling HTTP Methods

The following sample shows how to handle multiple HTTP methods (for example, GET and PUT):

Node.js

function handleGET (req, res) {
  // Do something with the GET request
  res.status(200).send('Hello World!');
}

function handlePUT (req, res) {
  // Do something with the PUT request
  res.status(403).send('Forbidden!');
}

/**
 * Responds to a GET request with "Hello World!". Forbids a PUT request.
 *
 * @example
 * gcloud alpha functions call helloHttp
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloHttp = (req, res) => {
  switch (req.method) {
    case 'GET':
      handleGET(req, res);
      break;
    case 'PUT':
      handlePUT(req, res);
      break;
    default:
      res.status(405).send({ error: 'Something blew up!' });
      break;
  }
};

Python (Beta)

def hello_method(request):
    """ Responds to a GET request with "Hello world!". Forbids a PUT request.
    Args:
        request (flask.Request): The request object.
        <http://flask.pocoo.org/docs/0.12/api/#flask.Request>
    Returns:
        The response text, or any set of values that can be turned into a
         Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """
    from flask import abort

    if request.method == 'GET':
        return 'Hello, World!'
    elif request.method == 'PUT':
        return abort(403)
    else:
        return abort(405)

Handling Content Types

Cloud Functions parses request body content types of application/json and application/x-www-form-urlencoded as shown above. Plain text content types (text/plain) are passed through as strings using UTF-8 as a default encoding (or a custom encoding provided in the content-type header).

Other content types can be accessed by inspecting your HTTP function's argument. Methods for doing this vary by language.

The example below handles a request with a content type of text/xml:

Node.js

The rawBody property contains the unparsed bytes of the request body.

/**
 * Parses a document of type 'text/xml'
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.parseXML = (req, res) => {
  // Convert the request to a Buffer and a string
  // Use whichever one is accepted by your XML parser
  let data = req.rawBody;
  let xmlData = data.toString();

  const parseString = require('xml2js').parseString;

  parseString(xmlData, (err, result) => {
    if (err) {
      console.error(err);
      res.status(500).end();
      return;
    }
    res.send(result);
  });
};

Python (Beta)

import json
import xmltodict

def parse_xml(request):
    """ Parses a document of type 'text/xml'
    Args:
        request (flask.Request): The request object.
    Returns:
        The response text, or any set of values that can be turned into a
         Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """
    data = xmltodict.parse(request.data)
    return json.dumps(data, indent=2)

Multipart Data and File Uploads

To process data with a multipart/form-data content type, use a parsing library as shown in the following example:

Node.js

/**
 * Parses a 'multipart/form-data' upload request
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
const path = require('path');
const os = require('os');
const fs = require('fs');

// Node.js doesn't have a built-in multipart/form-data parsing library.
// Instead, we can use the 'busboy' library from NPM to parse these requests.
const Busboy = require('busboy');

exports.uploadFile = (req, res) => {
  if (req.method === 'POST') {
    const busboy = new Busboy({ headers: req.headers });
    const tmpdir = os.tmpdir();

    // This object will accumulate all the fields, keyed by their name
    const fields = {};

    // This object will accumulate all the uploaded files, keyed by their name.
    const uploads = {};

    // This code will process each non-file field in the form.
    busboy.on('field', (fieldname, val) => {
      // TODO(developer): Process submitted field values here
      console.log(`Processed field ${fieldname}: ${val}.`);
      fields[fieldname] = val;
    });

    let fileWrites = [];

    // This code will process each file uploaded.
    busboy.on('file', (fieldname, file, filename) => {
      // Note: os.tmpdir() points to an in-memory file system on GCF
      // Thus, any files in it must fit in the instance's memory.
      console.log(`Processed file ${filename}`);
      const filepath = path.join(tmpdir, filename);
      uploads[fieldname] = filepath;

      const writeStream = fs.createWriteStream(filepath);
      file.pipe(writeStream);

      // File was processed by Busboy; wait for it to be written to disk.
      const promise = new Promise((resolve, reject) => {
        file.on('end', () => {
          writeStream.end();
        });
        writeStream.on('finish', resolve);
        writeStream.on('error', reject);
      });
      fileWrites.push(promise);
    });

    // Triggered once all uploaded files are processed by Busboy.
    // We still need to wait for the disk writes (saves) to complete.
    busboy.on('finish', () => {
      Promise.all(fileWrites)
        .then(() => {
          // TODO(developer): Process saved files here
          for (const name in uploads) {
            const file = uploads[name];
            fs.unlinkSync(file);
          }
          res.send();
        });
    });

    busboy.end(req.rawBody);
  } else {
    // Return a "method not allowed" error
    res.status(405).end();
  }
};

Python (Beta)

import os
import tempfile
from werkzeug.utils import secure_filename

# Helper function that computes the filepath to save files to
def get_file_path(filename):
    # Note: tempfile.gettempdir() points to an in-memory file system
    # on GCF. Thus, any files in it must fit in the instance's memory.
    file_name = secure_filename(filename)
    return os.path.join(tempfile.gettempdir(), file_name)


def parse_multipart(request):
    """ Parses a 'multipart/form-data' upload request
    Args:
        request (flask.Request): The request object.
    Returns:
        The response text, or any set of values that can be turned into a
         Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """

    # This code will process each non-file field in the form
    fields = {}
    data = request.form.to_dict()
    for field in data:
        fields[field] = data[field]
        print('Processed field: %s' % field)

    # This code will process each file uploaded
    files = request.files.to_dict()
    for file_name, file in files.items():
        file.save(get_file_path(file_name))
        print('Processed file: %s' % file_name)

    # Clear temporary directory
    for file_name in files:
        file_path = get_file_path(file_name)
        os.remove(file_path)

    return "Done!"

Alternatively, the raw request content can be accessed as shown in the Handling Content Types section.

File Uploads using Cloud Storage

For larger files, or files that require persistent storage beyond the scope of a single request, you can use Cloud Storage as the entry point for your file uploads. To allow this, generate a signed URL, which grants temporary write access to a Cloud Storage bucket.

If you're using Cloud Functions directly, generate a signed URL using the appropriate Cloud Storage client library.

Uploading files to a Cloud Function using Cloud Storage is a three step process:

  1. Clients call a Cloud Function directly to retrieve a signed URL.

  2. Clients then send file data to the signed URL via an HTTP PUT request (see the curl sketch after this list).

  3. A second Cloud Function is triggered by the mutation in the storage bucket to further process the file.
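
As a sketch of step 2, the client can upload directly to the signed URL with curl. This assumes the URL was signed for a write (PUT) with a matching Content-Type; SIGNED_URL and data.txt below are placeholders for the URL returned by your function and a local file:

curl -X PUT "SIGNED_URL" -H "Content-Type: text/plain" --upload-file data.txt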

You can see an example below of using the Cloud Storage client library to generate a signed URL.

Signing URLs requires the iam.serviceAccounts.signBlob permission, which the default application credentials used by Cloud Functions typically do not include. To allow signing, you first need to make sure that your function's service account has the appropriate role. You can grant it using either the GCP Console or the gcloud command-line tool:

Console

To make sure that your function's service account has the appropriate role, you can directly modify the IAM roles for an account:

  1. Go to the Google Cloud Platform Console.

  2. Select the appropriate account, and choose Edit > Service Accounts > Service Account Token Creator.

gcloud

To make sure that your function's service account has the appropriate role, run the following command. The predefined Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) includes the iam.serviceAccounts.signBlob permission you need:

gcloud projects add-iam-policy-binding [YOUR_PROJECT] \
    --member serviceAccount:[YOUR_SERVICE_ACCOUNT] --role roles/iam.serviceAccountTokenCreator

You can determine the service account used by your functions using either the GCP Console or the gcloud command-line tool:

Console

To determine the service account used by your functions using GCP Console:

  1. Go to the Google Cloud Platform Console.

  2. Select the function you want to inspect from the list.

You can see the service account on the details page for the function.

gcloud

To determine the service account used by your functions, run the following command and look for the serviceAccountEmail property:

gcloud beta functions describe [YOUR_FUNCTION_NAME]

Here's an example of generating a signed URL that assumes the client will send the desired filename in the request body:

Node.js

const storage = require('@google-cloud/storage')();

/**
 * HTTP function that generates a signed URL
 * The signed URL can be used to upload files to Google Cloud Storage (GCS)
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.getSignedUrl = (req, res) => {
  if (req.method === 'POST') {
    // TODO(developer) check that the user is authorized to upload

    // Get a reference to the destination file in GCS
    const file = storage.bucket('my-bucket').file(req.body.filename);

    // Create a temporary upload URL
    const expiresAtMs = Date.now() + 300000; // Link expires in 5 minutes
    const config = {
      action: 'write',
      expires: expiresAtMs,
      contentType: req.body.contentType
    };

    file.getSignedUrl(config, function (err, url) {
      if (err) {
        console.error(err);
        res.status(500).end();
        return;
      }
      res.send(url);
    });
  } else {
    // Return a "method not allowed" error
    res.status(405).end();
  }
};

Python (Beta)

from datetime import datetime, timedelta
from flask import abort
from google.cloud import storage
storage_client = storage.Client()


def get_signed_url(request):
    if request.method != 'POST':
        return abort(405)

    request_json = request.get_json()

    # Get a reference to the destination file in GCS
    file_name = request_json['filename']
    file = storage_client.bucket('my-bucket').blob(file_name)

    # Create a temporary upload URL that expires in 30 seconds
    expires_at = datetime.now() + timedelta(seconds=30)
    url = file.generate_signed_url(expires_at,
                                   content_type=request_json['contentType'])

    return url

When the client uploads a file to the signed URL, you can trigger a second function from this mutation if you want to take further action on the upload. See the Cloud Storage Tutorial for more information on triggering a Cloud Function from a Cloud Storage bucket.
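
As a minimal sketch of that second function (step 3 above), assuming a Node.js 8 background function deployed with a Cloud Storage trigger such as --trigger-resource my-bucket --trigger-event google.storage.object.finalize (processUpload and my-bucket are hypothetical names):

/**
 * Background Cloud Function triggered when an object is finalized
 * (i.e. an upload completes) in the Cloud Storage bucket.
 *
 * @param {Object} data The Cloud Storage object event payload.
 * @param {Object} context Event metadata.
 */
exports.processUpload = (data, context) => {
  // data.bucket and data.name identify the uploaded object.
  console.log(`Processing upload gs://${data.bucket}/${data.name}`);
  // TODO(developer): Read or transform the uploaded file here,
  // for example with the Cloud Storage client library.
};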
