Testing and CI/CD

This guide describes best practices for testing and deploying Cloud Functions, discusses the types of tests you should use, and walks through example testing scenarios. It also explains how to automatically run your tests, and optionally redeploy your functions, using a Continuous Integration and Deployment (CI/CD) platform such as Cloud Build.
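As a sketch of what such a pipeline might look like, the following is a hypothetical cloudbuild.yaml; the function name, runtime, and test script are assumptions, not values from this guide:

```yaml
# Hypothetical Cloud Build pipeline: install dependencies, run the
# tests, and redeploy the function only if the tests pass.
steps:
  - name: 'gcr.io/cloud-builders/npm'
    args: ['install']
  - name: 'gcr.io/cloud-builders/npm'
    args: ['test']
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['functions', 'deploy', 'helloHttp', '--runtime', 'nodejs8']
```

Because each step runs only if the previous one succeeds, a failing test stops the build before the deploy step runs.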

Before you begin

Before starting this guide, set up your environment.

Node.js

  1. Go to the Node.js setup guide
  2. Optionally, install the Node.js Emulator.

Python (Beta)

  1. Go to the Python setup guide
  2. Install pytest.
    pip install --upgrade pytest
    

Frameworks

Node.js

The examples in this guide use AVA as a test framework to run tests and Sinon as a mocking framework to mock external dependencies.

An external dependency is a dependency that your function relies on that isn't a part of your function's code. Common examples of external dependencies are other Google Cloud Platform services and libraries installed using package managers such as npm.

Python (Beta)

The examples in this guide use Pytest as a test framework to run tests and the unittest.mock module from Python's standard library to mock external dependencies.

An external dependency is a dependency that your function relies on that isn't a part of your function's code. Common examples of external dependencies are other Google Cloud Platform services and libraries installed using package managers such as pip.

Types of tests

There are three types of tests that you can use when working with Cloud Functions, each of which tests a different aspect of your code. They are listed below, from least to most thorough:

  • Unit tests
  • Integration tests
  • System tests

In general, more thorough tests take more time to complete. This document discusses these test types in detail, as well as how to strike a balance between speed and thoroughness.

Unit tests

Unit tests are narrowly scoped tests for small, specific parts of your code. These tests are good for quickly verifying assumptions made during development, such as edge-case handling and input validation.

By design, unit tests do not test integration with any external dependencies, such as Cloud Functions itself or other Google Cloud Platform components. You can use your mocking framework to create mock versions of external dependencies.

For HTTP functions, tests should mock the wrapping HTTP framework. Confirm the function's behavior by combining testing and mocking frameworks and comparing your function's results to expected values.

Unit tests cannot detect changes in external dependencies. If these dependencies change, both the tested code and your mocks must be updated.
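To make this concrete, here is a minimal sketch (not one of this guide's samples) of mocking an external dependency with Python's unittest.mock; the greet function and its client.lookup call are hypothetical:

```python
from unittest.mock import Mock

# Hypothetical function under test: it depends on an external client,
# which the unit test replaces with a mock so no real service is called.
def greet(client, user_id):
    return 'Hello, {}!'.format(client.lookup(user_id))

# The mock stands in for the external dependency
mock_client = Mock(lookup=Mock(return_value='Alice'))

assert greet(mock_client, 'id-123') == 'Hello, Alice!'
mock_client.lookup.assert_called_once_with('id-123')
```

If the real client's interface changes, this test still passes, which is exactly why unit tests alone cannot detect changes in external dependencies.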

Integration tests

Integration tests validate interaction between parts of your code, and typically take a moderate amount of time to complete. For example, in Cloud Functions, integration tests can be used to test a function's usage of other GCP services such as Cloud Datastore or Cloud Vision.

The primary difference between unit tests and integration tests for Cloud Functions is that integration tests involve less mocking than unit tests. Integration tests should trigger and respond to Cloud events such as HTTP requests, Cloud Pub/Sub messages, or Storage object changes.

You can run integration tests locally using a shim. Validate function behavior by confirming the expected result, given specific inputs.

System tests

System tests are more complex tests that validate the behavior of your Cloud Function across multiple GCP components in an isolated test environment.

Deploy your Cloud Function to a test environment and test its functionality using HTTP requests, Pub/Sub messages, or Cloud Storage changes. Validate your function by reading the logs or checking for the desired behavior.

You should isolate your development, testing, and production environments. One way to achieve this is to use a separate GCP project for each step.

You should also assign system test resources globally unique names to prevent concurrent tests from interfering with each other. You can do this by programmatically creating and deleting the required resources before and after the test run.
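One way to sketch this uniqueness requirement (using Python's uuid module; the prefixes here are arbitrary examples):

```python
import uuid

# Suffix each test resource with a UUID so concurrent test runs never
# collide on the same topic, subscription, or bucket name.
def unique_name(prefix):
    return '{}-{}'.format(prefix, uuid.uuid4())

topic_name = unique_name('test-topic')
bucket_name = unique_name('test-bucket')

# Two runs of the same test get different resource names
assert unique_name('test-topic') != unique_name('test-topic')
```

Create the uniquely named resources in test setup and delete them in teardown so each run starts from a clean environment.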

Example testing scenarios

The scenarios below cover the different ways of triggering Cloud Functions. The structure of a function's tests depends heavily on which GCP resources a function uses. In turn, a function's resource use depends on how that function is triggered.

HTTP-triggered functions

Unlike tests for other types of functions, system tests and integration tests for HTTP-triggered functions are nearly identical in structure. Most unit tests for HTTP functions, however, are structured differently.

This overlap between system and integration tests is shown in the following example of an HTTP-triggered function:

Node.js

/**
 * HTTP Cloud Function.
 *
 * @param {Object} req Cloud Function request context.
 *                     More info: https://expressjs.com/en/api.html#req
 * @param {Object} res Cloud Function response context.
 *                     More info: https://expressjs.com/en/api.html#res
 */
exports.helloHttp = (req, res) => {
  res.send(`Hello ${req.body.name || 'World'}!`);
};

Python (Beta)

def hello_http(request):
    """HTTP Cloud Function.
    Args:
        request (flask.Request): The request object.
        <http://flask.pocoo.org/docs/0.12/api/#flask.Request>
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`
        <http://flask.pocoo.org/docs/0.12/api/#flask.Flask.make_response>.
    """
    request_json = request.get_json()
    if request_json and 'name' in request_json:
        name = request_json['name']
    else:
        name = 'World'
    return 'Hello, {}!'.format(name)

Unit tests

These tests act as unit tests for the HTTP-triggered function above.

Node.js

Express is mocked using Sinon.
const test = require(`ava`);
const sinon = require(`sinon`);
const uuid = require(`uuid`);

const helloHttp = require(`..`).helloHttp;

test(`helloHttp: should print a name`, t => {
  // Mock ExpressJS 'req' and 'res' parameters
  const name = uuid.v4();
  const req = {
    body: {
      name: name
    }
  };
  const res = { send: sinon.stub() };

  // Call tested function
  helloHttp(req, res);

  // Verify behavior of tested function
  t.true(res.send.calledOnce);
  t.deepEqual(res.send.firstCall.args, [`Hello ${name}!`]);
});

test(`helloHttp: should print hello world`, t => {
  // Mock ExpressJS 'req' and 'res' parameters
  const req = {
    body: {}
  };
  const res = { send: sinon.stub() };

  // Call tested function
  helloHttp(req, res);

  // Verify behavior of tested function
  t.true(res.send.calledOnce);
  t.deepEqual(res.send.firstCall.args, [`Hello World!`]);
});

Python (Beta)

Flask is mocked using unittest.mock.
from unittest.mock import Mock

import main


def test_print_name():
    name = 'test'
    req = Mock(get_json=Mock(return_value={'name': name}))

    # Call tested function
    assert main.hello_http(req) == 'Hello, {}!'.format(name)


def test_print_hello_world():
    req = Mock(get_json=Mock(return_value={}))

    # Call tested function
    assert main.hello_http(req) == 'Hello, World!'

Use the following command to run the unit tests:

Node.js

ava test/sample.unit.http.test.js

Python (Beta)

pytest sample_http_test.py

Integration tests

These tests act as integration tests for the function above:

Node.js

const test = require(`ava`);
const Supertest = require(`supertest`);
const supertest = Supertest(process.env.BASE_URL);

test.cb(`helloHttp: should print a name`, (t) => {
  supertest
    .post(`/helloHttp`)
    .send({ name: 'John' })
    .expect(200)
    .expect((response) => {
      t.is(response.text, 'Hello John!');
    })
    .end(t.end);
});

test.cb(`helloHttp: should print hello world`, (t) => {
  supertest
    .get(`/helloHttp`)
    .expect(200)
    .expect((response) => {
      t.is(response.text, `Hello World!`);
    })
    .end(t.end);
});

Use the following file as a shim:

Node.js

// Import dependencies
const gcfCode = require('./index.js');
const express = require('express');

// Use the PORT environment variable if it's set, or a default port otherwise
const PORT = process.env.PORT || 3000;

// Start local HTTP server
const app = express();
const server = require(`http`).createServer(app);
server.on('connection', socket => socket.unref());
server.listen(PORT);

// Register HTTP handlers
Object.keys(gcfCode).forEach(gcfFn => {
  // Handle a single HTTP request
  const handler = (req, res) => {
    gcfCode[gcfFn](req, res);
    server.close();
  };

  app.get(`/${gcfFn}`, handler);
  app.post(`/${gcfFn}`, handler);
});

To run integration tests for HTTP functions, use the following command:

Node.js

export PORT=8010
export BASE_URL=http://localhost:8010/YOUR_GCP_PROJECT_ID/YOUR_GCF_REGION
ava test/sample.integration.http.test.js

where:

  • YOUR_GCP_PROJECT_ID is your GCP project ID.
  • YOUR_GCF_REGION is your Cloud Functions region.
  • BASE_URL is an environment variable that specifies the URL where the function can be reached. Environment variables let you specify values available only in your local test environment. This allows you to avoid hardcoding these values into your code.
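A minimal sketch of this pattern in Python (the variable name mirrors the commands above; the fallback URL is an arbitrary example):

```python
import os

# Read the function's base URL from the environment rather than
# hardcoding it, falling back to a local default for emulator runs.
BASE_URL = os.environ.get('BASE_URL', 'http://localhost:8010')

assert BASE_URL.startswith('http')
```

The same test file can then target a local emulator or a deployed function simply by exporting a different BASE_URL.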

System tests

These tests act as system tests for the function above:

Node.js

Note that the system tests are identical to the function's integration tests.
const test = require(`ava`);
const Supertest = require(`supertest`);
const supertest = Supertest(process.env.BASE_URL);

test.cb(`helloHttp: should print a name`, (t) => {
  supertest
    .post(`/helloHttp`)
    .send({ name: 'John' })
    .expect(200)
    .expect((response) => {
      t.is(response.text, 'Hello John!');
    })
    .end(t.end);
});

test.cb(`helloHttp: should print hello world`, (t) => {
  supertest
    .get(`/helloHttp`)
    .expect(200)
    .expect((response) => {
      t.is(response.text, `Hello World!`);
    })
    .end(t.end);
});

Python (Beta)

import os
import uuid

import requests


def test_no_args():
    BASE_URL = os.getenv('BASE_URL')
    assert BASE_URL is not None

    res = requests.get('{}/hello_http'.format(BASE_URL))
    assert res.text == 'Hello, World!'


def test_args():
    BASE_URL = os.getenv('BASE_URL')
    assert BASE_URL is not None

    name = str(uuid.uuid4())
    res = requests.post(
      '{}/hello_http'.format(BASE_URL),
      json={'name': name}
    )
    assert res.text == 'Hello, {}!'.format(name)

To run system tests for HTTP functions, deploy your functions with the following command:

Node.js 6

gcloud functions deploy helloHttp --runtime nodejs6 

Node.js 8 (Beta)

gcloud functions deploy helloHttp --runtime nodejs8 

Python (Beta)

gcloud functions deploy hello_http --runtime python37 

Use the following commands to test your deployed HTTP function:

Node.js

Note that the primary difference between system tests and integration tests for HTTP Cloud Functions is the URL where the function can be reached.

export BASE_URL=https://YOUR_GCF_REGION-YOUR_GCP_PROJECT_ID.cloudfunctions.net/
ava test/sample.system.http.test.js

Python (Beta)

export BASE_URL=https://YOUR_GCF_REGION-YOUR_GCP_PROJECT_ID.cloudfunctions.net/
pytest sample_http_test_system.py

where:

  • YOUR_GCF_REGION is your Cloud Functions region.
  • YOUR_GCP_PROJECT_ID is your GCP project ID.

Pub/Sub-triggered functions

Pub/Sub-triggered function tests are structured differently depending on where the tested function resides.

Here is an example of a Pub/Sub-triggered function:

Node.js 6

/**
 * Background Cloud Function to be triggered by Pub/Sub.
 * This function is exported by index.js, and executed when
 * the trigger topic receives a message.
 *
 * @param {object} event The Cloud Functions event.
 * @param {function} callback The callback function.
 */
exports.helloPubSub = (event, callback) => {
  const pubsubMessage = event.data;
  const name = pubsubMessage.data ? Buffer.from(pubsubMessage.data, 'base64').toString() : 'World';

  console.log(`Hello, ${name}!`);

  callback();
};

Node.js 8 (Beta)

/**
 * Background Cloud Function to be triggered by Pub/Sub.
 * This function is exported by index.js, and executed when
 * the trigger topic receives a message.
 *
 * @param {object} data The event payload.
 * @param {object} context The event metadata.
 */
exports.helloPubSub = (data, context) => {
  const pubSubMessage = data;
  const name = pubSubMessage.data ? Buffer.from(pubSubMessage.data, 'base64').toString() : 'World';

  console.log(`Hello, ${name}!`);
};

Python (Beta)

def hello_pubsub(data, context):
    """Background Cloud Function to be triggered by Pub/Sub.
    Args:
         data (dict): The dictionary with data specific to this type of event.
         context (google.cloud.functions.Context): The Cloud Functions event
         metadata.
    """
    import base64

    if 'data' in data:
        name = base64.b64decode(data['data']).decode('utf-8')
    else:
        name = 'World'
    print('Hello, {}!'.format(name))

Unit tests

These tests act as unit tests for the Pub/Sub-triggered function above:

Node.js 6

const test = require(`ava`);
const uuid = require(`uuid`);
const sinon = require(`sinon`);

const helloPubSub = require(`..`).helloPubSub;
const consoleLog = sinon.stub(console, 'log');

test.cb(`helloPubSub: should print a name`, t => {
  t.plan(1);

  // Initialize mocks
  const name = uuid.v4();
  const event = {
    data: {
      data: Buffer.from(name).toString(`base64`)
    }
  };

  // Call tested function and verify its behavior
  helloPubSub(event, () => {
    t.true(consoleLog.calledWith(`Hello, ${name}!`));
    t.end();
  });
});

test.cb(`helloPubSub: should print hello world`, t => {
  t.plan(1);

  // Initialize mocks
  const event = {
    data: {}
  };

  // Call tested function and verify its behavior
  helloPubSub(event, () => {
    t.true(consoleLog.calledWith(`Hello, World!`));
    t.end();
  });
});

Node.js 8 (Beta)

const test = require(`ava`);
const uuid = require(`uuid`);
const sinon = require(`sinon`);

const helloPubSub = require(`..`).helloPubSub;
const consoleLog = sinon.stub(console, 'log');

test(`helloPubSub: should print a name`, async t => {
  // Initialize mocks
  const name = uuid.v4();
  const event = {
    data: Buffer.from(name).toString(`base64`)
  };

  // Call tested function and verify its behavior
  await helloPubSub(event);
  t.true(consoleLog.calledWith(`Hello, ${name}!`));
});

test(`helloPubSub: should print hello world`, async t => {
  // Initialize mocks
  const event = {};

  // Call tested function and verify its behavior
  await helloPubSub(event);
  t.true(consoleLog.calledWith(`Hello, World!`));
});

Python (Beta)

import base64

import main


def test_print_hello_world(capsys):
    data = {}

    # Call tested function
    main.hello_pubsub(data, None)
    out, err = capsys.readouterr()
    assert out == 'Hello, World!\n'


def test_print_name(capsys):
    name = 'test'
    data = {'data': base64.b64encode(name.encode())}

    # Call tested function
    main.hello_pubsub(data, None)
    out, err = capsys.readouterr()
    assert out == 'Hello, {}!\n'.format(name)

Use the following command to run the unit tests:

Node.js

ava test/sample.unit.pubsub.test.js

Python (Beta)

pytest sample_pubsub_test.py

Integration tests

These tests act as integration tests for the Pub/Sub-triggered function above:

Node.js

const childProcess = require(`child_process`);
const test = require(`ava`);
const uuid = require(`uuid`);

test.serial(`helloPubSub: should print a name`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();
  const name = uuid.v4();

  // Mock Pub/Sub call, as the emulator doesn't listen to Pub/Sub topics
  const encodedName = Buffer.from(name).toString(`base64`);
  const data = JSON.stringify({ data: encodedName });
  childProcess.execSync(`functions call helloPubSub --data '${data}'`);

  // Check the emulator's logs
  const logs = childProcess.execSync(`functions logs read helloPubSub --start-time ${startTime}`).toString();
  t.true(logs.includes(`Hello, ${name}!`));
});

test.serial(`helloPubSub: should print hello world`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();

  // Mock Pub/Sub call, as the emulator doesn't listen to Pub/Sub topics
  childProcess.execSync(`functions call helloPubSub --data {}`);

  // Check the emulator's logs
  const logs = childProcess.execSync(`functions logs read helloPubSub --start-time ${startTime}`).toString();
  t.true(logs.includes(`Hello, World!`));
});

Use the following file as a shim:

Node.js

// Import dependencies
const Pubsub = require('@google-cloud/pubsub');
const pubsub = Pubsub();

// TODO(developer): specify a function to test
// const gcfCode = require('./index.js');
// const gcfFn = gcfCode.YOUR_FUNCTION;

// TODO(developer): specify an existing topic and subscription to use
// const topicName = process.env.TOPIC || 'YOUR_TOPIC';
// const subscriptionName = process.env.SUBSCRIPTION || 'YOUR_SUBSCRIPTION';

// Subscribe to Pub/Sub topic
const subscription = pubsub.topic(topicName).subscription(subscriptionName);

// Handle a single Pub/Sub message
const messageHandler = (msg) => {
  gcfFn({ data: msg }, () => {
    msg.ack();
    subscription.removeListener(`message`, messageHandler);
  });
};
subscription.on(`message`, messageHandler);

To run the integration tests for this function, complete the following steps:

Node.js

  1. Optionally, if you haven't set the topic and subscription in your shim file, set the following environment variables:

    export TOPIC=YOUR_TOPIC
    export SUBSCRIPTION=YOUR_SUBSCRIPTION
    

    where:

    • YOUR_TOPIC is the name of the Cloud Pub/Sub topic you want your functions to subscribe to.
    • YOUR_SUBSCRIPTION is your Cloud Pub/Sub subscription.

  2. To run the test, use the following command:

    ava test/sample.integration.pubsub.test.js
    

System tests

These tests act as system tests for this function:

Node.js

const childProcess = require(`child_process`);
const test = require(`ava`);
const uuid = require(`uuid`);
const Pubsub = require(`@google-cloud/pubsub`);
const pubsub = Pubsub();

const topicName = process.env.FUNCTIONS_TOPIC;
const baseCmd = `gcloud functions`;

test(`helloPubSub: should print a name`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();
  const name = uuid.v4();

  // Publish to pub/sub topic
  const topic = pubsub.topic(topicName);
  const publisher = topic.publisher();
  await publisher.publish(Buffer.from(name));

  // Wait for logs to become consistent
  await new Promise(resolve => setTimeout(resolve, 15000));

  // Check logs after a delay
  const logs = childProcess.execSync(`${baseCmd} logs read helloPubSub --start-time ${startTime}`).toString();
  t.true(logs.includes(`Hello, ${name}!`));
});

test(`helloPubSub: should print hello world`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();

  // Publish to pub/sub topic
  const topic = pubsub.topic(topicName);
  const publisher = topic.publisher();
  await publisher.publish(Buffer.from(''), { a: 'b' });

  // Wait for logs to become consistent
  await new Promise(resolve => setTimeout(resolve, 15000));

  // Check logs after a delay
  const logs = childProcess.execSync(`${baseCmd} logs read helloPubSub --start-time ${startTime}`).toString();
  t.true(logs.includes('Hello, World!'));
});

Python (Beta)

from datetime import datetime
from os import getenv
import subprocess
import time
import uuid

from google.cloud import pubsub_v1
import pytest

PROJECT = getenv('GCP_PROJECT')
TOPIC = getenv('TOPIC')

assert PROJECT is not None
assert TOPIC is not None


@pytest.fixture(scope='module')
def publisher_client():
    yield pubsub_v1.PublisherClient()


def test_print_name(publisher_client):
    start_time = datetime.utcnow().isoformat()
    topic_path = publisher_client.topic_path(PROJECT, TOPIC)

    # Publish the message
    name = uuid.uuid4()
    data = str(name).encode('utf-8')
    publisher_client.publish(topic_path, data=data).result()

    # Wait for logs to become consistent
    time.sleep(15)

    # Check logs after a delay
    log_process = subprocess.Popen([
        'gcloud',
        'alpha',
        'functions',
        'logs',
        'read',
        'hello_pubsub',
        '--start-time',
        start_time
    ], stdout=subprocess.PIPE)
    logs = str(log_process.communicate()[0])
    assert 'Hello, {}!'.format(name) in logs

To run the system tests:

  1. In your GCP project, select a Cloud Pub/Sub topic to subscribe to. If you provide the name of a Cloud Pub/Sub topic that does not exist, it is created automatically.

  2. Next, deploy your functions using the following command:

    Node.js 6

    gcloud functions deploy helloPubSub --runtime nodejs6 --trigger-topic YOUR_PUBSUB_TOPIC

    Node.js 8 (Beta)

    gcloud functions deploy helloPubSub --runtime nodejs8 --trigger-topic YOUR_PUBSUB_TOPIC

    Python (Beta)

    gcloud functions deploy hello_pubsub --runtime python37 --trigger-topic YOUR_PUBSUB_TOPIC

    where YOUR_PUBSUB_TOPIC is the name of the Cloud Pub/Sub topic you want your functions to subscribe to.

  3. Run the system tests with the following command:

    Node.js

    export FUNCTIONS_TOPIC=YOUR_PUBSUB_TOPIC
    ava test/sample.system.pubsub.test.js
    

    Python (Beta)

    export FUNCTIONS_TOPIC=YOUR_PUBSUB_TOPIC
    pytest sample_pubsub_test_system.py
    

    where YOUR_PUBSUB_TOPIC is the name of the Cloud Pub/Sub topic you want your functions to subscribe to.

Storage-triggered functions

Tests for storage-triggered functions are similar in structure to their Cloud Pub/Sub-triggered counterparts. Like Cloud Pub/Sub-triggered function tests, storage-triggered function tests are structured differently depending on where the tested function is hosted.

Here is an example of a storage-triggered function:

Node.js 6

/**
 * Background Cloud Function to be triggered by Cloud Storage.
 *
 * @param {object} event The Cloud Functions event.
 * @param {function} callback The callback function.
 */
exports.helloGCS = (event, callback) => {
  const file = event.data;

  if (file.resourceState === 'not_exists') {
    console.log(`File ${file.name} deleted.`);
  } else if (file.metageneration === '1') {
    // metageneration attribute is updated on metadata changes.
    // on create value is 1
    console.log(`File ${file.name} uploaded.`);
  } else {
    console.log(`File ${file.name} metadata updated.`);
  }

  callback();
};

Node.js 8 (Beta)

/**
 * Background Cloud Function to be triggered by Cloud Storage.
 *
 * @param {object} data The event payload.
 * @param {object} context The event metadata.
 */
exports.helloGCS = (data, context) => {
  const file = data;
  if (file.resourceState === 'not_exists') {
    console.log(`File ${file.name} deleted.`);
  } else if (file.metageneration === '1') {
    // metageneration attribute is updated on metadata changes.
    // on create value is 1
    console.log(`File ${file.name} uploaded.`);
  } else {
    console.log(`File ${file.name} metadata updated.`);
  }
};

Python (Beta)

def hello_gcs(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.
    Args:
         data (dict): The dictionary with data specific to this type of event.
         context (google.cloud.functions.Context): The Cloud Functions
         event metadata.
    """
    print("File: {}.".format(data['objectId']))

Unit tests

These are the unit tests for the storage-triggered function above:

Node.js 6

const test = require(`ava`);
const uuid = require(`uuid`);
const sinon = require(`sinon`);

const helloGCS = require(`..`).helloGCS;
const consoleLog = sinon.stub(console, 'log');

test.cb(`helloGCS: should print uploaded message`, t => {
  t.plan(1);

  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    data: {
      name: filename,
      resourceState: 'exists',
      metageneration: '1'
    }
  };

  // Call tested function and verify its behavior
  helloGCS(event, () => {
    t.true(consoleLog.calledWith(`File ${filename} uploaded.`));
    t.end();
  });
});

test.cb(`helloGCS: should print metadata updated message`, t => {
  t.plan(1);

  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    data: {
      name: filename,
      resourceState: 'exists',
      metageneration: '2'
    }
  };

  // Call tested function and verify its behavior
  helloGCS(event, () => {
    t.true(consoleLog.calledWith(`File ${filename} metadata updated.`));
    t.end();
  });
});

test.cb(`helloGCS: should print deleted message`, t => {
  t.plan(1);

  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    data: {
      name: filename,
      resourceState: 'not_exists',
      metageneration: '3'
    }
  };

  // Call tested function and verify its behavior
  helloGCS(event, () => {
    t.true(consoleLog.calledWith(`File ${filename} deleted.`));
    t.end();
  });
});

Node.js 8 (Beta)

const test = require(`ava`);
const uuid = require(`uuid`);
const sinon = require(`sinon`);

const helloGCS = require(`..`).helloGCS;
const consoleLog = sinon.stub(console, 'log');

test(`helloGCS: should print uploaded message`, async t => {
  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    name: filename,
    resourceState: 'exists',
    metageneration: '1'
  };

  // Call tested function and verify its behavior
  await helloGCS(event);
  t.true(consoleLog.calledWith(`File ${filename} uploaded.`));
});

test(`helloGCS: should print metadata updated message`, async t => {
  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    name: filename,
    resourceState: 'exists',
    metageneration: '2'
  };

  // Call tested function and verify its behavior
  await helloGCS(event);
  t.true(consoleLog.calledWith(`File ${filename} metadata updated.`));
});

test(`helloGCS: should print deleted message`, async t => {
  // Initialize mocks
  const filename = uuid.v4();
  const event = {
    name: filename,
    resourceState: 'not_exists',
    metageneration: '3'
  };

  // Call tested function and verify its behavior
  await helloGCS(event);
  t.true(consoleLog.calledWith(`File ${filename} deleted.`));
});

Python (Beta)

import main


def test_print(capsys):
    name = 'test'
    data = {'objectId': name}

    # Call tested function
    main.hello_gcs(data, None)
    out, err = capsys.readouterr()
    assert out == 'File: {}.\n'.format(name)

Use the following command to run the unit tests:

Node.js

ava test/sample.unit.storage.test.js

Python (Beta)

pytest sample_storage_test.py

Integration tests

These are the integration tests for the storage-triggered function above:

Node.js

const childProcess = require(`child_process`);
const test = require(`ava`);
const uuid = require(`uuid`);

test.serial(`helloGCS: should print uploaded message`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();
  const filename = uuid.v4(); // Use a unique filename to avoid conflicts

  // Mock GCS call, as the emulator doesn't listen to GCS buckets
  const data = JSON.stringify({
    name: filename,
    resourceState: 'exists',
    metageneration: '1'
  });

  childProcess.execSync(`functions-emulator call helloGCS --data '${data}'`);

  // Check the emulator's logs
  const logs = childProcess.execSync(`functions-emulator logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${filename} uploaded.`));
});

test.serial(`helloGCS: should print metadata updated message`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();
  const filename = uuid.v4(); // Use a unique filename to avoid conflicts

  // Mock GCS call, as the emulator doesn't listen to GCS buckets
  const data = JSON.stringify({
    name: filename,
    resourceState: 'exists',
    metageneration: '2'
  });

  childProcess.execSync(`functions-emulator call helloGCS --data '${data}'`);

  // Check the emulator's logs
  const logs = childProcess.execSync(`functions-emulator logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${filename} metadata updated.`));
});

test.serial(`helloGCS: should print deleted message`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();
  const filename = uuid.v4(); // Use a unique filename to avoid conflicts

  // Mock GCS call, as the emulator doesn't listen to GCS buckets
  const data = JSON.stringify({
    name: filename,
    resourceState: 'not_exists',
    metageneration: '3'
  });

  childProcess.execSync(`functions-emulator call helloGCS --data '${data}'`);

  // Check the emulator's logs
  const logs = childProcess.execSync(`functions-emulator logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${filename} deleted.`));
});

Use the following file as a shim:

Node.js

// Import dependencies
const Pubsub = require('@google-cloud/pubsub');
const Storage = require(`@google-cloud/storage`);
const pubsub = Pubsub();
const storage = Storage();

// TODO(developer): specify a function to test
// const gcfCode = require('./index.js');
// const gcfFn = gcfCode.YOUR_FUNCTION;

// TODO(developer): specify a Cloud Storage bucket to monitor
// const bucketName = 'YOUR_GCS_BUCKET'

// TODO(developer): specify an existing topic and subscription to use
// const topicName = process.env.TOPIC || 'YOUR_TOPIC';
// const subscriptionName = process.env.SUBSCRIPTION || 'YOUR_SUBSCRIPTION';

// Create notification on target bucket
// Further info: https://cloud.google.com/storage/docs/reporting-changes
const bucket = storage.bucket(bucketName);
return bucket.createNotification(topicName)
  .then(data => data[0])
  .then((notification) => new Promise(resolve => {
    // Subscribe to Pub/Sub topic
    const subscription = pubsub
      .topic(topicName)
      .subscription(subscriptionName);

    // Handle a single Pub/Sub message
    const messageHandler = (msg) => {
      const data = JSON.parse(Buffer.from(msg.data, 'base64').toString());
      gcfFn({ data: data }, () => {
        msg.ack();
        subscription.removeListener(`message`, messageHandler);
        resolve(notification);
      });
    };
    subscription.on(`message`, messageHandler);
  }))
  .then(notification => notification.delete()); // Delete notification

Use the following command to run the integration tests:

Node.js

  1. Optionally, if you haven't set the topic and subscription in your shim file, set the following environment variables:

    export TOPIC=YOUR_TOPIC
    export SUBSCRIPTION=YOUR_SUBSCRIPTION
    

    where:

    • YOUR_TOPIC is the name of the Cloud Pub/Sub topic you want your functions to subscribe to.
    • YOUR_SUBSCRIPTION is your Cloud Pub/Sub subscription.

  2. To run the test, use the following command:

    ava test/sample.integration.storage.test.js
    

System tests

These are the system tests for the storage-triggered function above:

Node.js

const Storage = require(`@google-cloud/storage`);
const storage = Storage();
const uuid = require(`uuid`);
const test = require(`ava`);
const path = require(`path`);
const childProcess = require(`child_process`);
const localFileName = `test.txt`;

// Use unique GCS filename to avoid conflicts between concurrent test runs
const gcsFileName = `test-${uuid.v4()}.txt`;

const bucketName = process.env.FUNCTIONS_BUCKET;
const bucket = storage.bucket(bucketName);
const baseCmd = `gcloud functions`;

test.serial(`helloGCS: should print uploaded message`, async (t) => {
  t.plan(1);
  const startTime = new Date(Date.now()).toISOString();

  // Upload file
  const filepath = path.join(__dirname, localFileName);
  await bucket.upload(filepath, {
    destination: gcsFileName
  });

  // Wait for consistency
  await new Promise(resolve => setTimeout(resolve, 15000));

  // Check logs
  const logs = childProcess.execSync(`${baseCmd} logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${gcsFileName} uploaded`));
});

test.serial(`helloGCS: should print metadata updated message`, async (t) => {
  t.plan(1);
  const startTime = new Date().toISOString();

  // Update file metadata
  const file = bucket.file(gcsFileName);
  await file.setMetadata({ metadata: { foo: `bar` } });

  // Wait for consistency
  await new Promise(resolve => setTimeout(resolve, 15000));

  // Check logs
  const logs = childProcess.execSync(`${baseCmd} logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${gcsFileName} metadata updated`));
});

test.serial(`helloGCS: should print deleted message`, async (t) => {
  t.plan(1);
  const startTime = new Date().toISOString();

  // Delete the test file
  await bucket.file(gcsFileName).delete();

  // Wait for consistency
  await new Promise(resolve => setTimeout(resolve, 15000));

  // Check logs
  const logs = childProcess.execSync(`${baseCmd} logs read helloGCS --start-time ${startTime}`).toString();
  t.true(logs.includes(`File ${gcsFileName} deleted`));
});

Python (Beta)

from datetime import datetime
from os import getenv, path
import subprocess
import time
import uuid

from google.cloud import storage
import pytest

PROJECT = getenv('GCP_PROJECT')
BUCKET = getenv('BUCKET')

assert PROJECT is not None
assert BUCKET is not None


@pytest.fixture(scope='module')
def storage_client():
    yield storage.Client()


@pytest.fixture(scope='module')
def bucket_object(storage_client):
    bucket_object = storage_client.get_bucket(BUCKET)
    yield bucket_object


@pytest.fixture(scope='module')
def uploaded_file(bucket_object):
    name = 'test-{}.txt'.format(str(uuid.uuid4()))
    blob = bucket_object.blob(name)

    test_dir = path.dirname(path.abspath(__file__))
    blob.upload_from_filename(path.join(test_dir, 'test.txt'))
    yield name
    blob.delete()


def test_hello_gcs(uploaded_file):
    start_time = datetime.utcnow().isoformat()
    time.sleep(10)  # Wait for logs to become consistent

    log_process = subprocess.Popen([
        'gcloud',
        'alpha',
        'functions',
        'logs',
        'read',
        'hello_gcs',
        '--start-time',
        start_time
    ], stdout=subprocess.PIPE)
    logs = str(log_process.communicate()[0])
    assert uploaded_file in logs
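The fixed sleeps used above to wait for log consistency can make these tests flaky when logs take longer to appear. One alternative is a small polling helper that retries until the expected text shows up or a deadline passes. The sketch below is illustrative: `wait_for_log` is a hypothetical name, and `fetch_logs` stands in for any zero-argument callable, such as one wrapping the `gcloud functions logs read` subprocess call shown above.

```python
import time


def wait_for_log(fetch_logs, expected, timeout=60, interval=5):
    """Poll fetch_logs() until `expected` appears in its output.

    fetch_logs: zero-argument callable returning the current log text
    expected: substring to look for in the log text
    Returns the matching log text, or raises TimeoutError on deadline.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        logs = fetch_logs()
        if expected in logs:
            return logs
        time.sleep(interval)
    raise TimeoutError(
        '"{}" not found in logs after {}s'.format(expected, timeout))
```

In `test_hello_gcs`, the `time.sleep(10)` call could then be replaced with a `wait_for_log(...)` call that keeps rechecking the logs until `uploaded_file` appears, rather than waiting a fixed interval.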

Use the following command to deploy your function:

Node.js 6

gcloud functions deploy helloGCS --runtime nodejs6 --trigger-bucket YOUR_GCS_BUCKET_NAME

Node.js 8 (Beta)

gcloud functions deploy helloGCS --runtime nodejs8 --trigger-bucket YOUR_GCS_BUCKET_NAME

Python (Beta)

gcloud functions deploy hello_gcs --runtime python37 --trigger-bucket YOUR_GCS_BUCKET_NAME

where YOUR_GCS_BUCKET_NAME is the name of the Cloud Storage bucket you want to monitor. Note that this must reference a bucket that exists in the same GCP project that the function is deployed to.

Use the following commands to run the system tests:

Node.js

export FUNCTIONS_BUCKET=YOUR_GCS_BUCKET_NAME
ava test/sample.system.storage.test.js

Python (Beta)

export GCP_PROJECT=YOUR_GCP_PROJECT_ID
export BUCKET=YOUR_GCS_BUCKET_NAME
pytest sample_storage_test_system.py

Continuous testing and deployment

As you develop your function, you can run unit tests, integration tests, and system tests to ensure that your functions work both locally and in a test environment on GCP.

Once you finish developing locally, you can configure a continuous integration and deployment (CI/CD) platform such as Cloud Build to run your existing Cloud Functions tests on an ongoing basis. Continuous testing helps ensure that your code continues to work as intended and that your dependencies remain up-to-date. As Cloud Functions are not updated automatically, you can also configure continuous integration platforms (including Cloud Build) to automatically test and redeploy your functions from a source repository such as GitHub, Bitbucket, or Cloud Source Repositories.

Follow the instructions in the Automating Builds using Build Triggers guide, using the cloudbuild.yaml build config file below, to configure Cloud Build to automatically test and deploy your function. Replace [YOUR_FUNCTION_NAME] with the name of your Cloud Function and [YOUR_FUNCTION_TRIGGER] with the appropriate trigger value.

Node.js

steps:
- name: 'gcr.io/cloud-builders/yarn'
  args: ['install']
  dir: 'functions/autodeploy'
- name: 'gcr.io/cloud-builders/npm'
  args: ['test']
  dir: 'functions/autodeploy'
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['beta', 'functions', 'deploy', '[YOUR_FUNCTION_NAME]', '[YOUR_FUNCTION_TRIGGER]']
  dir: 'functions/autodeploy'
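The config above tests and deploys a Node.js function. For the Python function in this guide, a similar config could run pytest before deploying. The following is a sketch only: the directory name, requirements file, and trigger placeholder are assumptions, and the first step uses a Docker Hub Python image to install and run the test dependencies.

```yaml
steps:
# Install test dependencies and run the unit tests (hypothetical layout)
- name: 'python:3.7'
  entrypoint: 'bash'
  args: ['-c', 'pip install -r requirements.txt pytest && pytest']
  dir: 'functions/autodeploy'
# Redeploy the function if the tests pass
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['beta', 'functions', 'deploy', '[YOUR_FUNCTION_NAME]', '--runtime', 'python37', '[YOUR_FUNCTION_TRIGGER]']
  dir: 'functions/autodeploy'
```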

If Cloud Build doesn't suit your needs, see this list of continuous integration platforms.

Granting permissions to run builds and deployments

If you are using Cloud Build, you need to grant permissions to the Cloud Build service account.

For example:

  • To deploy Cloud Functions, you might want to assign the Cloud Functions Developer role to the Cloud Build service account (PROJECT_NUMBER@cloudbuild.gserviceaccount.com).
  • If you use the Cloud Functions Developer role, you also need to grant the Cloud Functions Runtime service account (PROJECT_ID@appspot.gserviceaccount.com) the IAM Service Account User role.
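As a sketch, both grants can be made with gcloud, where PROJECT_ID and PROJECT_NUMBER are placeholders for your project's ID and number:

```shell
# Allow the Cloud Build service account to deploy Cloud Functions
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com \
  --role roles/cloudfunctions.developer

# Allow the Cloud Build service account to act as the runtime service account
gcloud iam service-accounts add-iam-policy-binding PROJECT_ID@appspot.gserviceaccount.com \
  --member serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com \
  --role roles/iam.serviceAccountUser
```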