SendGrid Tutorial

This tutorial demonstrates using Cloud Functions to send emails through the SendGrid platform, receive SendGrid analytics data via webhooks, and load the analytics data into Google BigQuery for analysis.

Objectives

  • Create a SendGrid account.
  • Write and deploy two HTTP Cloud Functions.
  • Write and deploy one Background Cloud Function.
  • Send an email from the deployed function via SendGrid.
  • Receive analytics data from SendGrid via webhooks.
  • Load SendGrid analytics data into BigQuery for analysis.

Costs

This tutorial uses billable components of Cloud Platform, including:

  • Google Cloud Functions
  • Google BigQuery
  • Google Cloud Storage

Use the Pricing Calculator to generate a cost estimate based on your projected usage.

New Cloud Platform users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google account.

    If you don't already have one, sign up for a new account.

  2. Select or create a Cloud Platform project.

    Go to the Projects page

  3. Enable billing for your project.

    Enable billing

  4. Enable the Cloud Functions, Cloud Storage, and Google BigQuery APIs.

    Enable the APIs

  5. Install and initialize the Cloud SDK.
  6. Update and install gcloud components:

    gcloud components update &&
    gcloud components install beta

Visualizing the flow of data

The flow of data in the SendGrid tutorial application involves several steps:

  1. An email is sent through SendGrid by triggering a Cloud Function via HTTP.
  2. SendGrid sends analytics data to another Cloud Function via HTTP.
  3. Analytics data is saved to Cloud Storage as newline-delimited JSON.
  4. A third Cloud Function is triggered by new JSON files, and queues the JSON files to be loaded into BigQuery where the data can be analyzed.

Preparing the application

  1. Create a SendGrid account. You can either do this manually via the SendGrid website, or you can use the Google Cloud Launcher, which will create an account for you and integrate billing.

    See Creating a SendGrid account using Cloud Launcher.

  2. Create a SendGrid API key:

    1. Log in to your SendGrid account at https://app.sendgrid.com.
    2. Navigate to Settings > API Keys.
    3. Create a new "General API Key".
    4. Ensure you select (at least) the "Mail Send" permission when you create the API key.
    5. Copy the API key when it is displayed. You will only see it once, so save it somewhere safe; you will need it to call your Cloud Function at the end of this tutorial.
  3. Create a SendGrid Event Notification:

    1. Log in to your SendGrid account at https://app.sendgrid.com.
    2. Navigate to Settings > Mail Settings.
    3. Open the Event Notification tab.
    4. Click Edit and input the following into the HTTP POST URL field:

      http://[YOUR_USERNAME]:[YOUR_PASSWORD]@[YOUR_REGION].[YOUR_PROJECT_ID].cloudfunctions.net/sendgridWebhook

      where

      • [YOUR_USERNAME] and [YOUR_PASSWORD] are a username and password of your choice. You will enter these two variables into a config.json file in a subsequent step.
      • [YOUR_PROJECT_ID] is your Cloud project ID. This is visible in your terminal when your functions finish deploying.
      • [YOUR_REGION] is the region where your functions will be deployed. This is visible in your terminal when your functions finish deploying.
    5. Finally, set the event notification to On.

  4. Create a BigQuery dataset:

    bq mk [YOUR_DATASET_NAME]

    where [YOUR_DATASET_NAME] is the name of your new BigQuery dataset. You can also create a dataset from the BigQuery console.

  5. Create a Cloud Storage bucket to stage your Cloud Functions files, where [YOUR_STAGING_BUCKET_NAME] is a globally-unique bucket name:

    gsutil mb gs://[YOUR_STAGING_BUCKET_NAME]

  6. Create a Cloud Storage bucket to save the JSON files, where [YOUR_EVENT_BUCKET_NAME] is a globally-unique bucket name:

    gsutil mb gs://[YOUR_EVENT_BUCKET_NAME]

  7. Create a directory on your local system for the application code:

    • Linux or Mac OS X:

      mkdir ~/gcf_sendgrid
      cd ~/gcf_sendgrid
      
    • Windows

      mkdir %HOMEPATH%\gcf_sendgrid
      cd %HOMEPATH%\gcf_sendgrid
      
  8. Download the index.js file from the Cloud Functions sample project on GitHub and save it to the gcf_sendgrid directory.

  9. Create a config.json file in the gcf_sendgrid directory with the following contents:

    {
      "EVENT_BUCKET": "[YOUR_EVENT_BUCKET_NAME]",
      "DATASET": "[YOUR_DATASET_NAME]",
      "TABLE": "events",
      "USERNAME": "[YOUR_USERNAME]",
      "PASSWORD": "[YOUR_PASSWORD]"
    }
    
    • Replace [YOUR_EVENT_BUCKET_NAME] with a bucket name used for saving JSON files.
    • Replace [YOUR_DATASET_NAME] with a BigQuery dataset name.
    • Replace [YOUR_USERNAME] with a username of your choice, which will be used to verify data coming from SendGrid.
    • Replace [YOUR_PASSWORD] with a password of your choice, which will be used to verify data coming from SendGrid.
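
Before moving on, you can optionally confirm that the dataset and buckets from the previous steps exist. These commands only list resources and are not required by the tutorial:

bq ls
gsutil ls gs://[YOUR_STAGING_BUCKET_NAME]
gsutil ls gs://[YOUR_EVENT_BUCKET_NAME]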

Understanding the code

Importing dependencies

The application must import several dependencies in order to communicate with Google Cloud Platform services:

Node.js

// SendGrid Node.js client library
const sendgrid = require('sendgrid');
// Runtime configuration: bucket, dataset, table, and webhook credentials
const config = require('./config.json');
// Used to generate unique filenames for the uploaded JSON files
const uuid = require('uuid');

// Get a reference to the Cloud Storage component
const storage = require('@google-cloud/storage')();
// Get a reference to the BigQuery component
const bigquery = require('@google-cloud/bigquery')();

Sending emails

The following function creates a SendGrid client for sending emails:

Node.js

/**
 * Returns a configured SendGrid client.
 *
 * @param {string} key Your SendGrid API key.
 * @returns {object} SendGrid client.
 */
function getClient (key) {
  if (!key) {
    const error = new Error('SendGrid API key not provided. Make sure you have a "sg_key" property in your request querystring');
    error.code = 401;
    throw error;
  }

  // Using SendGrid's Node.js Library https://github.com/sendgrid/sendgrid-nodejs
  return sendgrid(key);
}

The following function uses the SendGrid client to send an email:

Node.js

/**
 * Send an email using SendGrid.
 *
 * Trigger this function by making a POST request with a payload to:
 * https://[YOUR_REGION].[YOUR_PROJECT_ID].cloudfunctions.net/sendgridEmail?sg_key=[YOUR_API_KEY]
 *
 * @example
 * curl -X POST "https://us-central1.your-project-id.cloudfunctions.net/sendgridEmail?sg_key=your_api_key" --data '{"to":"bob@email.com","from":"alice@email.com","subject":"Hello from SendGrid!","body":"Hello World!"}' --header "Content-Type: application/json"
 *
 * @param {object} req Cloud Function request context.
 * @param {object} req.query The parsed querystring.
 * @param {string} req.query.sg_key Your SendGrid API key.
 * @param {object} req.body The request payload.
 * @param {string} req.body.to Email address of the recipient.
 * @param {string} req.body.from Email address of the sender.
 * @param {string} req.body.subject Email subject line.
 * @param {string} req.body.body Body of the email message.
 * @param {object} res Cloud Function response context.
 */
exports.sendgridEmail = function sendgridEmail (req, res) {
  return Promise.resolve()
    .then(() => {
      if (req.method !== 'POST') {
        const error = new Error('Only POST requests are accepted');
        error.code = 405;
        throw error;
      }

      // Get a SendGrid client
      const client = getClient(req.query.sg_key);

      // Build the SendGrid request to send email
      const request = client.emptyRequest({
        method: 'POST',
        path: '/v3/mail/send',
        body: getPayload(req.body)
      });

      // Make the request to SendGrid's API
      console.log(`Sending email to: ${req.body.to}`);
      return client.API(request);
    })
    .then((response) => {
      if (response.statusCode < 200 || response.statusCode >= 400) {
        const error = Error(response.body);
        error.code = response.statusCode;
        throw error;
      }

      console.log(`Email sent to: ${req.body.to}`);

      // Forward the response back to the requester
      res.status(response.statusCode);
      if (response.headers['content-type']) {
        res.set('content-type', response.headers['content-type']);
      }
      if (response.headers['content-length']) {
        res.set('content-length', response.headers['content-length']);
      }
      if (response.body) {
        res.send(response.body);
      } else {
        res.end();
      }
    })
    .catch((err) => {
      console.error(err);
      const code = err.code || (err.response ? err.response.statusCode : 500) || 500;
      res.status(code).send(err);
      return Promise.reject(err);
    });
};

The sendgridEmail function is exported by the module and is executed when you make an HTTP POST request to the function's endpoint.
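
The sendgridEmail function relies on a getPayload helper, which is defined in index.js but not shown above. It converts the simple request body into the payload format expected by SendGrid's v3 Mail Send API. A minimal sketch of such a helper, with validation omitted, might look like the following (the sample's actual implementation may differ):

Node.js

/**
 * Builds a SendGrid v3 /mail/send payload from the incoming request body.
 *
 * @param {object} requestBody Request payload with to, from, subject, and body.
 * @returns {object} Payload for the SendGrid Mail Send API.
 */
function getPayload (requestBody) {
  return {
    personalizations: [
      {
        to: [{ email: requestBody.to }],
        subject: requestBody.subject
      }
    ],
    from: { email: requestBody.from },
    content: [
      {
        type: 'text/plain',
        value: requestBody.body
      }
    ]
  };
}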

Receiving analytics data

The following function authenticates the incoming SendGrid request by checking for your configured username and password:

Node.js

/**
 * Verify that the webhook request came from sendgrid.
 *
 * @param {string} authorization The authorization header of the request, e.g. "Basic ZmdvOhJhcg=="
 */
function verifyWebhook (authorization) {
  const basicAuth = new Buffer(authorization.replace('Basic ', ''), 'base64').toString();
  const parts = basicAuth.split(':');
  if (parts[0] !== config.USERNAME || parts[1] !== config.PASSWORD) {
    const error = new Error('Invalid credentials');
    error.code = 401;
    throw error;
  }
}
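
For reference, the username and password that you embed in the webhook URL arrive as a standard HTTP Basic Authorization header, which is what verifyWebhook decodes and compares against the values in config.json. For example, with the hypothetical credentials alice and s3cret:

Node.js

// Hypothetical credentials, for illustration only
const expected = 'Basic ' + Buffer.from('alice:s3cret').toString('base64');
console.log(expected); // "Basic YWxpY2U6czNjcmV0"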

The following function receives analytics data from SendGrid and saves the data as newline-delimited JSON to Cloud Storage:

Node.js

/**
 * Receive a webhook from SendGrid.
 *
 * See https://sendgrid.com/docs/API_Reference/Webhooks/event.html
 *
 * @param {object} req Cloud Function request context.
 * @param {object} res Cloud Function response context.
 */
exports.sendgridWebhook = function sendgridWebhook (req, res) {
  return Promise.resolve()
    .then(() => {
      if (req.method !== 'POST') {
        const error = new Error('Only POST requests are accepted');
        error.code = 405;
        throw error;
      }

      verifyWebhook(req.get('authorization') || '');

      const events = req.body || [];

      // Make sure property names in the data meet BigQuery standards
      fixNames(events);

      // Generate newline-delimited JSON
      // See https://cloud.google.com/bigquery/data-formats#json_format
      const json = events.map((event) => JSON.stringify(event)).join('\n');

      // Upload a new file to Cloud Storage if we have events to save
      if (json.length) {
        const bucketName = config.EVENT_BUCKET;
        const unixTimestamp = new Date().getTime() * 1000;
        const filename = `${unixTimestamp}-${uuid.v4()}.json`;
        const file = storage.bucket(bucketName).file(filename);

        console.log(`Saving events to ${filename} in bucket ${bucketName}`);

        return file.save(json).then(() => {
          console.log(`JSON written to ${filename}`);
        });
      }
    })
    .then(() => res.status(200).end())
    .catch((err) => {
      console.error(err);
      res.status(err.code || 500).send(err);
      return Promise.reject(err);
    });
};

The sendgridWebhook function is exported by the module and is executed when SendGrid makes an HTTP POST request to the function's endpoint.
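
After you deploy the functions in a later step, you can also exercise this webhook manually. The following curl command posts a single illustrative event in the shape SendGrid sends (an array of event objects); the fields of real SendGrid events vary by event type:

curl -X POST "https://[YOUR_REGION].[YOUR_PROJECT_ID].cloudfunctions.net/sendgridWebhook" -u [YOUR_USERNAME]:[YOUR_PASSWORD] --header "Content-Type: application/json" --data '[{"email":"bob@email.com","timestamp":1510000000,"event":"delivered"}]'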

Importing data into BigQuery

Finally, the following function imports the newline-delimited JSON data into BigQuery:

Node.js

/**
 * Cloud Function triggered by Cloud Storage when a file is uploaded.
 *
 * @param {object} event The Cloud Functions event.
 * @param {object} event.data A Cloud Storage file object.
 * @param {string} event.data.bucket Name of the Cloud Storage bucket.
 * @param {string} event.data.name Name of the file.
 * @param {string} [event.data.timeDeleted] Time the file was deleted if this is a deletion event.
 * @see https://cloud.google.com/storage/docs/json_api/v1/objects#resource
 */
exports.sendgridLoad = function sendgridLoad (event) {
  const file = event.data;

  if (file.resourceState === 'not_exists') {
    // This was a deletion event, we don't want to process this
    return;
  }

  return Promise.resolve()
    .then(() => {
      if (!file.bucket) {
        throw new Error('Bucket not provided. Make sure you have a "bucket" property in your request');
      } else if (!file.name) {
        throw new Error('Filename not provided. Make sure you have a "name" property in your request');
      }

      return getTable();
    })
    .then(([table]) => {
      const fileObj = storage.bucket(file.bucket).file(file.name);
      console.log(`Starting job for ${file.name}`);
      const metadata = {
        autodetect: true,
        sourceFormat: 'NEWLINE_DELIMITED_JSON'
      };
      return table.import(fileObj, metadata);
    })
    .then(([job]) => job.promise())
    .then(() => console.log(`Job complete for ${file.name}`))
    .catch((err) => {
      console.log(`Job failed for ${file.name}`);
      return Promise.reject(err);
    });
};

The sendgridLoad function is exported by the module and is executed when a new JSON file is saved to Cloud Storage.
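
The sendgridLoad function calls a getTable helper, which is defined in index.js but not shown above. It looks up, and if necessary creates, the dataset and table named in config.json. A minimal sketch of such a helper, assuming the bigquery and config objects imported at the top of the file, might look like this (the sample's actual implementation may differ):

Node.js

/**
 * Gets (and creates, if necessary) the configured BigQuery dataset and table.
 *
 * @returns {Promise} Resolves with an array whose first element is the table.
 */
function getTable () {
  const dataset = bigquery.dataset(config.DATASET);

  // autoCreate tells the client to create the dataset/table if it does not exist yet
  return dataset.get({ autoCreate: true })
    .then(([dataset]) => dataset.table(config.TABLE).get({ autoCreate: true }));
}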

Deploying the functions

  1. To deploy the sendgridEmail function with an HTTP trigger, run the following command in the gcf_sendgrid directory:

    gcloud beta functions deploy sendgridEmail --stage-bucket [YOUR_STAGING_BUCKET_NAME] --trigger-http
    

    where

    • [YOUR_STAGING_BUCKET_NAME] is the name of your staging Cloud Storage Bucket.
  2. To deploy the sendgridWebhook function with an HTTP trigger, run the following command in the gcf_sendgrid directory:

    gcloud beta functions deploy sendgridWebhook --stage-bucket [YOUR_STAGING_BUCKET_NAME] --trigger-http
    

    where

    • [YOUR_STAGING_BUCKET_NAME] is the name of your staging Cloud Storage Bucket.
  3. To deploy the sendgridLoad function with a storage trigger, run the following command in the gcf_sendgrid directory:

    gcloud beta functions deploy sendgridLoad --stage-bucket [YOUR_STAGING_BUCKET_NAME] --trigger-bucket [YOUR_EVENT_BUCKET_NAME]
    

    where

    • [YOUR_STAGING_BUCKET_NAME] is the name of your staging Cloud Storage Bucket.
    • [YOUR_EVENT_BUCKET_NAME] is the name of the Cloud Storage bucket for saving JSON files.
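
When each deployment finishes, the terminal output includes the function's region, project, and (for HTTP functions) its trigger URL. If you need to look these up again later, you can describe a deployed function, for example:

gcloud beta functions describe sendgridEmail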

Sending an email

  1. Send an email:

    curl -X POST "https://[YOUR_REGION].[YOUR_PROJECT_ID].cloudfunctions.net/sendgridEmail?sg_key=[YOUR_SENDGRID_KEY]" --data '{"to":"[YOUR_SENDER_ADDR]","from":"[YOUR_RECIPIENT_ADDR]","subject":"Hello from Sendgrid!","body":"Hello World!"}' --header "Content-Type: application/json"

    where

    • [YOUR_REGION] is the region where your function is deployed. This is visible in your terminal when your function finishes deploying.
    • [YOUR_PROJECT_ID] is your Cloud project ID. This is visible in your terminal when your function finishes deploying.
    • [YOUR_SENDGRID_KEY] is your SendGrid API key.
    • [YOUR_SENDER_ADDR] is your SendGrid account's email address.
    • [YOUR_RECIPIENT_ADDR] is the recipient's email address.
  2. Watch the logs to be sure the executions have completed:

    gcloud beta functions logs read --limit 100
    
  3. You can view the saved JSON files in the Cloud Storage bucket specified by the EVENT_BUCKET value in the config.json file.

  4. You can view the imported analytics data in BigQuery at the following URL:

    https://bigquery.cloud.google.com/table/[YOUR_PROJECT_ID]:[YOUR_DATASET_NAME].events

    where

    • [YOUR_PROJECT_ID] is your Google Cloud project ID.
    • [YOUR_DATASET_NAME] is the BigQuery dataset name you configured in the config.json file.
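
You can also query the loaded events from the command line. For example, the following illustrative query counts events by type, assuming the default events table name from the config.json file:

bq query --use_legacy_sql=false "SELECT event, COUNT(*) AS total FROM [YOUR_DATASET_NAME].events GROUP BY event"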

Cleaning up

To avoid incurring charges to your Google Cloud Platform account for the resources used in this tutorial:

Deleting the project

The easiest way to eliminate billing is to delete the project you created for the tutorial.

To delete the project:

  1. In the Cloud Platform Console, go to the Projects page.

    Go to the Projects page

  2. In the project list, select the checkbox next to the project you want to delete, and then click Delete project.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Deleting the Cloud Functions

Deleting the Cloud Functions does not remove the resources stored in Cloud Storage or BigQuery; you must delete those separately.

Delete a Cloud Function:

gcloud beta functions delete [NAME_OF_FUNCTION]

You can also delete Cloud Functions from the Google Cloud Platform Console.

Delete a BigQuery dataset and all of its tables:

bq rm -r -f [YOUR_DATASET_NAME]
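
To also remove the Cloud Storage buckets created for this tutorial, delete each bucket and its contents (this permanently deletes the staged function files and the saved event files):

gsutil rm -r gs://[YOUR_STAGING_BUCKET_NAME]
gsutil rm -r gs://[YOUR_EVENT_BUCKET_NAME]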
