HTTP Functions

HTTP Functions are used when you want to directly invoke your function via HTTP(S). To allow for specific HTTP semantics, the signature of an HTTP function takes HTTP-specific arguments: request and response.

The request parameter represents the HTTP request that was sent to the function, and the response parameter represents the HTTP response that will be returned to the caller. These parameters have properties of ExpressJS Request and Response objects, which should be used to extract and return data.

Sample Usage

The example below shows how to process an HTTP POST request containing a message parameter.

Node.js

/**
 * Responds to any HTTP request that can provide a "message" field in the body.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloWorld = function helloWorld (req, res) {
  if (req.body.message === undefined) {
    // This is an error case, as "message" is required
    res.status(400).send('No message defined!');
  } else {
    // Everything is ok
    console.log(req.body.message);
    res.status(200).end();
  }
};

The following command shows how to call the function and pass it a simple message using curl:

curl -X POST -H "Content-Type:application/json" -d '{"message":"hello world!"}' YOUR_HTTP_TRIGGER_ENDPOINT

Parsing HTTP requests

The body of the request is automatically parsed based on the content-type header and made available via the body property of the request object. The following table lists some common scenarios:

Content Type                        Request Body         Behavior
application/json                    '{"name":"John"}'    request.body.name equals 'John'
application/octet-stream            'my text'            request.body equals the raw bytes of the request in a Buffer ('6d792074657874' in hex; see the Node.js Buffer documentation)
text/plain                          'my text'            request.body equals 'my text'
application/x-www-form-urlencoded   'name=John'          request.body.name equals 'John'

This parsing is handled automatically by built-in body-parsing middleware before your function runs.

The example below shows how to read request bodies in each of these formats.

Node.js

/**
 * Responds to an HTTP request using data from the request body parsed according
 * to the "content-type" header.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloContent = function helloContent (req, res) {
  let name;

  switch (req.get('content-type')) {
    // '{"name":"John"}'
    case 'application/json':
      name = req.body.name;
      break;

    // 'John', stored in a Buffer
    case 'application/octet-stream':
      name = req.body.toString(); // Convert buffer to a string
      break;

    // 'John'
    case 'text/plain':
      name = req.body;
      break;

    // 'name=John'
    case 'application/x-www-form-urlencoded':
      name = req.body.name;
      break;
  }

  res.status(200).send(`Hello ${name || 'World'}!`);
};

Suppose your function is called with the following request:

curl -X POST -H "Content-Type:application/json" -H "X-MyHeader: 123" YOUR_HTTP_TRIGGER_ENDPOINT?foo=baz -d '{"text":"something"}'

The data sent would then be available as follows:

Property/Method             Value
request.method              "POST"
request.get('x-myheader')   "123"
request.query.foo           "baz"
request.body.text           "something"
request.rawBody             The raw (unparsed) bytes of the request

Handling HTTP Methods

If a function needs to handle multiple HTTP methods (for example, GET and PUT), check the method property of the request as shown in the following example.

Node.js

function handleGET (req, res) {
  // Do something with the GET request
  res.status(200).send('Hello World!');
}

function handlePUT (req, res) {
  // Do something with the PUT request
  res.status(403).send('Forbidden!');
}

/**
 * Responds to a GET request with "Hello World!". Forbids a PUT request.
 *
 * @example
 * gcloud alpha functions call helloHttp
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloHttp = function helloHttp (req, res) {
  switch (req.method) {
    case 'GET':
      handleGET(req, res);
      break;
    case 'PUT':
      handlePUT(req, res);
      break;
    default:
      res.status(500).send({ error: 'Something blew up!' });
      break;
  }
};

Handling HTTP Requests & File Uploads

Cloud Functions parses request body content types of application/json and application/x-www-form-urlencoded according to the rules described above. Plain text content types (text/plain) are passed through as strings using UTF-8 as a default encoding (or a custom encoding provided in the content-type header).
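If you need to honor the charset yourself (for example, when decoding rawBody for a content type the runtime does not parse), you can extract it from the content-type header and fall back to UTF-8. This is a sketch; the decodeBody helper is illustrative, not part of the Cloud Functions API, and only charsets that Node's Buffer supports (utf8, latin1, etc.) will decode natively.

```javascript
// Sketch: decode a raw request body using the charset from the content-type
// header, falling back to UTF-8. decodeBody is an illustrative helper.
function decodeBody (rawBody, contentType) {
  const match = /charset=([^;\s]+)/i.exec(contentType || '');
  const charset = match ? match[1].toLowerCase() : 'utf8';
  // Buffer only supports a limited set of encodings; fall back to UTF-8.
  return rawBody.toString(Buffer.isEncoding(charset) ? charset : 'utf8');
}

// 'café' encoded as latin1 bytes (0xE9 for 'é')
const raw = Buffer.from('caf\xe9', 'latin1');
console.log(decodeBody(raw, 'text/plain; charset=latin1'));
```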

For other content types, the rawBody property contains the unparsed bytes of the request body as a Buffer object.

The example below handles a request with a content type of text/xml:

exports.parseXML = (req, res) => {
  // If your XML parser handles parsing a Buffer, just use rawBody.
  let data = req.rawBody;

  // Or if you need a String...
  let xmlData = data.toString();

  const parseString = require('xml2js').parseString;

  parseString(xmlData, (err, result) => {
    if (err) {
      console.error(err);
      res.status(500).end();
      return;
    }
    res.send(result);
  });
};

Handling Multipart Form Uploads

Simple form data sent with an application/x-www-form-urlencoded content type is automatically parsed into a JavaScript object. However, if you want your Cloud Function to process multipart/form-data, use the rawBody property of the request.

The following example uses busboy, but you can use any valid multipart/form-data parsing module that takes a Buffer as input:

const Busboy = require('busboy');

exports.multipartToJson = (req, res) => {
  if (req.method === 'POST') {
    const busboy = new Busboy({ headers: req.headers });
    let formData = {};

    busboy.on('field', (fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) => {
      // We're just going to capture the form data in a JSON document.
      formData[fieldname] = val;
    });

    busboy.on('finish', () => {
      res.send(formData);
    });

    // The raw bytes of the upload will be in req.rawBody.
    busboy.end(req.rawBody);
  } else {
    // Only support POST.
    res.status(405).end();
  }
};

Handling File Uploads

File uploads that are transported over HTTP as multipart/form-data are accessible from a Cloud Function via the rawBody property of the request. For small files (up to a few MB), clients can send file data directly to the function. For large files or files that persist beyond the scope of a single request, we recommend you use Cloud Storage as the primary entry point for the file upload.

Direct Handling of Small Files

Small files can be handled directly by a Cloud Function using a multipart/form-data parser operating on the rawBody property of the request. The following example uses busboy, but you can use any multipart/form-data parsing module that takes a Buffer as input:

const path = require('path');
const os = require('os');
const fs = require('fs');
const Busboy = require('busboy');

exports.uploadFile = (req, res) => {
  if (req.method === 'POST') {
    const busboy = new Busboy({ headers: req.headers });
    // This object will accumulate all the uploaded files, keyed by their name.
    const uploads = {};
    const tmpdir = os.tmpdir();

    // This callback will be invoked for each file uploaded.
    busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
      // Note that os.tmpdir() is an in-memory file system, so should
      // only be used for files small enough to fit in memory.
      const filepath = path.join(tmpdir, filename);
      uploads[fieldname] = filepath;
      file.pipe(fs.createWriteStream(filepath));
    });

    // This callback will be invoked after all uploaded files are saved.
    busboy.on('finish', () => {
      // *** Process uploaded files here ***

      for (const name in uploads) {
        const file = uploads[name];
        fs.unlinkSync(file);
      }
      res.end();
    });

    // The raw bytes of the upload will be in req.rawBody. Send it to
    // busboy, and get a callback when it's finished.
    busboy.end(req.rawBody);
  } else {
    // Client error - only support POST.
    res.status(405).end();
  }
};

File Uploads using Cloud Storage

For larger files or those that require persistent storage beyond the scope of a single request, you can use Cloud Storage as an entry point for your file uploads. The mechanism to allow this behavior is to generate a Signed URL, which provides temporary write access to a Cloud Storage bucket.

If you're using Firebase, you can achieve this using the Firebase SDK on your client. See Firebase Documentation for details.

If you're using Cloud Functions directly, generate a signed URL using the @google-cloud/storage Node module.

Uploading files to a Cloud Function using Cloud Storage is a three-step process:

  1. Clients call a Cloud Function directly to retrieve a signed URL.

  2. Clients then send file data to the signed URL via an HTTP PUT request.

  3. A second Cloud Function is triggered by the mutation in the storage bucket to further process the file.

You can see an example below of using the @google-cloud/storage Node module to generate a signed URL.

In addition, you must enable the IAM API for your project in the GCP Console, since signing is performed via the IAM API.

Cloud Functions run with a default application credential that typically does not include the iam.serviceAccounts.signBlob permission. To allow signing, you'll first need to make sure that your function's service account has the appropriate role. You can do this using either the GCP Console or the gcloud command-line tool:

console

To make sure that your function's service account has the appropriate role, you can directly modify the IAM roles for an account:

  1. Go to the Google Cloud Platform Console:

    Go to Google Cloud Platform Console

  2. Select the appropriate account, and choose Editor > Service Accounts > Service Account Token Creator.

gcloud

To make sure that your function's service account has the appropriate role, run the following command. The pre-defined serviceAccountTokenCreator role has the iam.serviceAccounts.signBlob permission you need:

gcloud projects add-iam-policy-binding [YOUR_PROJECT] \
    --member serviceAccount:[YOUR_SERVICE_ACCOUNT] --role roles/iam.serviceAccountTokenCreator

You can determine the service account used by your functions using either the GCP Console or the gcloud command-line tool:

console

To determine the service account used by your functions using GCP Console:

  1. Go to the Google Cloud Platform Console:

    Go to Google Cloud Platform Console

  2. Select the function you want to inspect from the list.

You can see the service account on the details page for the function.

gcloud

To determine the service account used by your functions, run the following command and look for the serviceAccountEmail property:

gcloud beta functions describe [YOUR_FUNCTION_NAME]

Here's an example of generating a signed URL that assumes the client will send the desired filename in the request body:

const storage = require('@google-cloud/storage')();

exports.getSignedUrl = (req, res) => {
  if (req.method === 'POST') {
    // Perform any authorization checks here to assert
    // that the end user is authorized to upload.

    const myBucket = storage.bucket('my-bucket');
    const myFile = myBucket.file(req.body.filename);
    const contentType = req.body.contentType;

    // This link should only last 5 minutes
    const expiresAtMs = Date.now() + 300000;
    const config = {
      action: 'write',
      expires: expiresAtMs,
      contentType: contentType
    };

    myFile.getSignedUrl(config, function (err, url) {
      if (err) {
        console.error(err);
        res.status(500).end();
        return;
      }
      res.send(url);
    });
  } else {
    res.status(405).end();
  }
};

When the client uploads a file to the signed URL, you can trigger a second function from this mutation if you want to take further action on the upload. See the Cloud Storage Tutorial for more information on triggering a Cloud Function from a Cloud Storage bucket.

Using Middleware

Cloud Functions provides request and response objects that are compatible with ExpressJS, making it simple to consume HTTP requests. Cloud Functions automatically reads the request body, so you will always receive it regardless of content type. This means HTTP requests should be considered fully read by the time your code executes. Keep this caveat in mind when nesting ExpressJS apps: in particular, middleware that expects the body of a request to be unread may not behave as expected.
