Smartening Up Support Tickets with Serverless Machine Learning: Tutorial

This tutorial shows how to augment a typical helpdesk scenario by enriching ticket data with Cloud AI using an event-based serverless architecture. The tutorial shows how to build the architecture outlined in the accompanying articles, Architecture of a Serverless Machine Learning Model and Building a Serverless Machine Learning Model.

With this architecture, you implement a workflow allowing your customers to log support tickets through a custom-built form. Each ticket submission triggers a machine learning (ML) step that enriches the ticket data. The ticket is then saved into a real-time database that functions as a customer frontend data store. At the same time, ticket submission triggers creation of a Salesforce.com case that mirrors the data in your helpdesk app. Subsequently, your support agents use Salesforce to manage the support tickets and leverage the extra intelligence provided by ML. The result can lead to faster and better support decisions.

This tutorial is part of a broader solution that contains the following components:

  • A simple frontend HTML page to log and list support tickets.
  • Two notebooks that detail how you can:

    • Build a custom regression model using TensorFlow.
    • Build a custom classification model using TensorFlow.
    • Deploy the models on AI Platform so that Cloud Functions can use them.
  • A Node.js file that contains the deployed Cloud Functions, including:

    • Four functions to enrich the ticket.
    • One function to write to Salesforce.
    • One function that Salesforce calls to update the Firebase database.

When you finish this tutorial, you'll have:

  • A simple UI that allows your customers to log tickets.
  • Serverless ML ticket enrichment involving:

    • Autotagging to retain words in the description with the highest salience.
    • Sentiment analysis on the ticket description.
    • Predictions about how long it will take to resolve the ticket.
    • Predictions about the likely assigned ticket priority.
  • A way to update your Salesforce cases.

Objectives

  • Implement a simple Firebase web client.
  • Create Cloud Functions triggered by a write to the Firebase real-time database.
  • Build and serve an AI Platform classification model.
  • Build and serve an AI Platform regression model.
  • Call two types of ML models from Cloud Functions: the pretrained Natural Language API and your custom AI Platform models.
  • Leverage machine learning to add context to tickets, allowing support agents to make faster and better decisions.
  • Leverage Cloud Functions to synchronize between your system and third-party tools in a serverless way—for example, between your system and Salesforce, as in this solution, or between your system and Jira, using a similar process.

Costs

This tutorial uses the following billable components of Google Cloud Platform:

  • Natural Language API
  • AI Platform Training and Prediction API
  • Firebase

To generate a cost estimate based on your projected usage, use the pricing calculator. New GCP users might be eligible for a free trial.

Before you begin

Before you get started, you need to set up your environment.

  1. In your terminal, clone the serverless ML enrichment repository:

    git clone https://github.com/GoogleCloudPlatform/ml-functions-helpdesk
  2. Open the Firebase console to create a Firebase project, if you don't already have one.

  3. In the Firebase console, click Add project.

  4. Click Create a project.

    This process also creates a GCP project accessible by the GCP Console.

    Remember the project ID because you'll need it later. You can also import an existing GCP project using the Import button from the Firebase console.

  5. In your terminal window, install the Firebase client with the following command:

    npm install -g firebase-tools
  6. In the cloned repository, open the customer_site/public/index.html file and update the values related to your Firebase instance:

    var config = {
      apiKey: "<API_KEY>",
      authDomain: "<PROJECT_ID>.firebaseapp.com",
      databaseURL: "https://<DATABASE_NAME>.firebaseio.com",
      storageBucket: "<BUCKET>.appspot.com",
      messagingSenderId: "<SENDER_ID>",
    };

    where:

    • <API_KEY> is found under Overview > Project settings > General > Web API Key.
    • <PROJECT_ID> is the project ID that you noted earlier.
    • <DATABASE_NAME> is found under Database > Data.
    • <BUCKET> is found under Storage > Files (ignore the 'gs://' prefix).
    • <SENDER_ID> is found under Overview > Project settings > Cloud Messaging > Sender ID.

  7. Set the security rules. For the purposes of this tutorial, in the Firebase console under Database > Rules, set all rules to true.

    For more details on how to secure your application, refer to What's next.
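
    For reference, fully open rules look like the following (suitable only for this tutorial, not for production):

    {
      "rules": {
        ".read": true,
        ".write": true
      }
    }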

  8. Open the associated GCP project, substituting your Firebase project ID for [PROJECT_ID]:

    https://console.cloud.google.com/project/[PROJECT_ID]
  9. Enable the Natural Language API and the AI Platform Training and Prediction API.

Create service accounts

Service accounts are used to authenticate calls between GCP services. Make sure that the Cloud Functions have access to all the required GCP APIs and services.

Make sure that your default Firebase service account is a project Editor or Owner. The AI Platform scope is wide and requires this type of broad permission. If you haven't changed anything in your project, it's likely you don't need to take any action.
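
If you do need to grant the role, the following sketch shows the command, assuming the default App Engine service account that Firebase projects use; verify the exact account name on the IAM & admin page first:

gcloud projects add-iam-policy-binding [PROJECT_ID] \
    --member serviceAccount:[PROJECT_ID]@appspot.gserviceaccount.com \
    --role roles/editor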

Configure Salesforce

This tutorial assumes you already use a customer relationship management (CRM) system such as Salesforce and know how to configure and operate it.

If needed, you can create a free developer account, which does not expire, on the Salesforce developer signup page. You should also be familiar with JSforce, the JavaScript library that the Cloud Functions use to create Salesforce cases.

Machine learning

As discussed in Building a Serverless Machine Learning Model, two different types of ML models are available:

  • Cloud Natural Language API

    The Natural Language API is a REST API that uses natural language processing based on an ML model previously trained by Google.

  • Custom model

    You train and deploy a custom ML model as a REST API using AI Platform.

Calling the Natural Language API

You can call a REST API to do auto-tagging and sentiment analysis using the Natural Language API. The deployed Cloud Functions access the Natural Language API directly.

Natural Language API authentication is handled automatically from within Cloud Functions using the @google-cloud/language client library.
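
As an illustration, the following is a minimal sketch of such a call using the @google-cloud/language client. The helper name and the salience cutoff are assumptions for this sketch; the authoritative implementations live in functions/index.js:

const language = require('@google-cloud/language');
const client = new language.LanguageServiceClient();

// Returns a sentiment score and a comma-separated tag list for a ticket
// description, using the pretrained Natural Language API.
function analyzeDescription(description) {
  const document = {content: description, type: 'PLAIN_TEXT'};
  return Promise.all([
    client.analyzeSentiment({document: document}),
    client.analyzeEntities({document: document})
  ]).then(([[sentimentResult], [entityResult]]) => {
    return {
      sentiment: sentimentResult.documentSentiment.score,
      // Keep only the most salient entities as tags; the 0.01 cutoff is
      // an arbitrary choice for this sketch.
      tags: entityResult.entities
        .filter(entity => entity.salience > 0.01)
        .map(entity => entity.name)
        .join(',')
    };
  });
}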

For custom training and prediction of resolution time and priority, you need to use AI Platform. The following sections outline this process.

Setting up your AI Platform models

To make your AI Platform models accessible through an API, you complete the following steps:

  1. Gather your data.
  2. Organize your data.
  3. Design your model.
  4. Train and generate your model.
  5. Deploy your model.

The easiest way to work through these steps is to use Cloud Datalab. Cloud Datalab gives you access to all the GCP tools you will need to build, test, and deploy a custom ML model.

You use the following two notebooks to build and deploy your models.

  • Resolution Time Regression notebook
  • Priority Classification notebook

These two notebooks are similar and differ at only a few points:

  • The output column
  • The model that is called
  • The location of the deployed model

After running those notebooks, you will have two ML models deployed on AI Platform. Both models are available through REST API calls.
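
Optionally, you can sanity-check a deployed model from the command line before calling it from Cloud Functions. For example, assuming an instances.json file that contains one JSON instance per line matching your model's input schema:

gcloud ml-engine predict --model mdl_helpdesk_priority --json-instances instances.json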

Running the notebooks

  1. Open Cloud Shell.

  2. After the shell opens, launch a Cloud Datalab instance, substituting your instance name for [YOUR_DATALAB_INSTANCE_NAME]:

    datalab create [YOUR_DATALAB_INSTANCE_NAME] --zone us-central1-f

    If all goes well, you see the following message:

    The connection to Datalab is now open and will remain until this command is
    killed. Click on the *Web Preview* (up-arrow button at top-left), select
    *port 8081*, and start using Datalab.
  3. Access your Cloud Datalab instance by clicking the Web Preview button at the top of Cloud Shell and using the Change port menu to select port 8081. A Cloud Datalab window appears.

  4. Download the notebooks from GitHub by clicking the notebook icon at the top right, and use the following URL for the repository:

    https://github.com/GoogleCloudPlatform/ml-functions-helpdesk
  5. In the top right, click Fetch from origin.

  6. Click Checkout.

    Verify that you see the Cloud Datalab UI updated with all folders from GitHub.

  7. Run through both notebooks. Detailed explanations are available in the notebooks, including a step-by-step guide that describes how to create and deploy your ML models.

  8. After you complete both notebooks, you can see both models in the GCP Console under Machine Learning > Models. Clicking either model name displays its version.

Now you have both models deployed in AI Platform. You are ready to call them using Cloud Functions.

Enriching your models with Cloud Functions

The next step is to update a few parameters in your Cloud Functions script before deploying the functions.

Updating model parameters

In order for Cloud Functions to call your models, you must update the Cloud Functions script parameters to match your models' specifications. Define the following model constants, along with your Salesforce connection settings, in the functions/index.js script:

MDL_PROJECT_NAME
The name of the project hosting your models.
RESOLUTION_TIME_MODEL_NAME
The name of the model that predicts the ticket resolution time. It is set when the model is deployed from Cloud Datalab.
PRIORITY_MODEL_NAME
The name of the model that predicts the ticket priority. It is set when the model is deployed from Cloud Datalab.

const MDL_PROJECT_NAME = '<YOUR_PROJECT_HOSTING_MODELS>';
const RESOLUTION_TIME_MODEL_NAME = 'mdl_helpdesk_resolution_time'; // Matches the notebook
const PRIORITY_MODEL_NAME = 'mdl_helpdesk_priority'; // Matches the notebook
const SFDC_URL = '<YOUR_SFDC_URL>';
const SFDC_LOGIN = '<YOUR_SFDC_LOGIN>';
const SFDC_PASSWORD = '<YOUR_SFDC_PASSWORD>';
const SFDC_TOKEN = '<YOUR_SFDC_TOKEN>';

Updating authentication

AI Platform authentication is more involved than Natural Language API authentication. Because GCP does not provide a client library wrapper for AI Platform, your code must authenticate using Application Default Credentials. AI Platform also requires the full cloud-platform scope.

Refer to the following functions/index.js code snippet for an authentication example:

if (authClient.createScopedRequired && authClient.createScopedRequired()) {
  // https://developers.google.com/identity/protocols/googlescopes#mlv1
  authClient = authClient.createScoped([
    'https://www.googleapis.com/auth/cloud-platform'
  ]);
}

// Create an authenticated ML Engine client.
var ml = google.ml({
  version: 'v1',
  auth: authClient
});
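
With the authenticated client in place, a prediction is a single REST call. The following sketch shows its general shape; the ticket variable and the instance fields are placeholders that must match the input schema defined in your notebooks:

// Hypothetical prediction request against the deployed priority model.
ml.projects.predict({
  name: 'projects/' + MDL_PROJECT_NAME + '/models/' + PRIORITY_MODEL_NAME,
  resource: {
    instances: [{
      description: ticket.description,  // placeholder input fields
      category: ticket.category
    }]
  }
}, function(err, result) {
  if (err) {
    console.error('Prediction failed:', err);
  } else {
    console.log('Predicted priority:', result.predictions);
  }
});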

Deploying your Cloud Functions

In your terminal, run the Firebase deploy command from the root folder of the repository to deploy the Cloud Functions.

firebase deploy --only functions --project [YOUR-PROJECT-ID]

To deploy only one function, run the following command instead. Replace [FUNCTION_NAME] with any of the pre-created functions defined in the functions/index.js file: priority, sentiment, tags, updateSFDC, or resolutiontime.

firebase deploy --only functions:[FUNCTION_NAME] --project [YOUR-PROJECT-ID]

Your first Cloud Functions deployment might take a while because multiple APIs are activated automatically, and activation takes time.

Also note the function's trigger defined directly in the code:

exports.[FUNCTION_NAME] = functions.database.ref('/tickets/{ticketID}').onWrite(event => {
  // Handle the new write to the Firebase database
});
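
For illustration, the following is a hypothetical sketch of one enrichment function built on this trigger. It reuses the analyzeDescription helper sketched earlier; the guard against retriggering is an assumption, not code from the repository:

exports.sentiment = functions.database.ref('/tickets/{ticketID}')
  .onWrite(event => {
    const ticket = event.data.val();
    // Skip deletes and already-enriched tickets so that writing the
    // result back does not retrigger this function indefinitely.
    if (!ticket || ticket.sentiment !== undefined) {
      return null;
    }
    return analyzeDescription(ticket.description).then(result =>
      event.data.adminRef.update({sentiment: result.sentiment}));
  });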

You can check the deployed functions from within the Firebase console under Functions.

Alternatively, you can view them in the GCP Console under Cloud Functions.

Synchronization with a third-party helpdesk

This tutorial shows you how to write an enriched support ticket to a third-party tool, in this implementation Salesforce.com.

For a complete deployment, updates need to flow in both directions:

  • When the user updates a ticket, the ticket is also updated within Salesforce.
  • When a ticket is updated in Salesforce, it is also updated on your client.

Synchronizing the client with Salesforce

Use the following tools to synchronize your client with Salesforce.

Cloud Functions
An event-based Cloud Function is triggered when a ticket has been enriched. In production, you modify the Cloud Function so that any update to Firebase triggers a ticket update.
JSforce
JSforce, a Node.js wrapper around the Salesforce API, enables writes and updates to the Salesforce database.
Salesforce Case interface
Your Salesforce Case interface displays ticket information, mirroring the enriched ticket that was created in your custom application.

Copying fields

To synchronize the client with Salesforce, you need to match Salesforce fields with client fields. For simplicity, this tutorial copies over only a few field values to matching fields, but you can duplicate the process for all your fields. Defined fields include the following (see the example record after this list):

  • SuppliedEmail with 'user@example.com'.
  • Description with a description.
  • Type with type ['Issue', 'Request'].
  • Reason with category ['Technical', 'Performance', 'Authentication', 'Billing'].
  • Priority with the predicted priority.
  • ResolutionTime with the predicted resolution time.
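
For illustration, an enriched ticket record in Firebase might look like the following. This is a hypothetical example; the field names match the Firebase fields that the updateSFDC function reads:

{
  "description": "My VM fails to boot and the console shows a black screen.",
  "type": "Issue",
  "category": "Technical",
  "priority": "P2",
  "t_resolution": 6.79
}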

In the Salesforce UI, under SETUP > OBJECT MANAGER, set up your field mappings:

Salesforce field mappings

Customize your Salesforce page to display the appropriate fields, making it easier for you to synchronize between the two databases. Note that Salesforce has default values for certain fields, such as Case origin and Status, which you can leave as is.

Running the updateSFDC function

In functions/index.js, the updateSFDC function creates a Salesforce ticket using ML-enriched data. The function is triggered when something is written to the Firebase real-time database. The sequence is as follows:

  1. The updateSFDC function first authenticates to your Salesforce instance:

    var conn = new jsforce.Connection({
      loginUrl: SFDC_URL
    });
    conn.login(SFDC_LOGIN, SFDC_PASSWORD + SFDC_TOKEN, function(err, res) {
  2. The function then creates a ticket in Salesforce based on the enriched data:

    conn.sobject("Case").create({
      SuppliedEmail: 'user@example.com',
      Description: ticket.description,
      Type: ticket.type,
      Reason: ticket.category,
      Priority: ticket.priority,
      ResolutionTime__c: ticket.t_resolution
    }, function(err, ret) {
  3. If everything runs properly, a new case containing the enriched data appears in your Salesforce UI.

Synchronizing Salesforce with the client

This tutorial does not explain in detail how to implement this synchronization, because most of the work happens on the Salesforce side, but the main steps are outlined here. Synchronizing from Salesforce to your client requires the following:

  • An HTTPS-based Cloud Function that you call as a webhook from Salesforce.
  • An event triggered by Salesforce as soon as the agent updates a ticket.

To deploy an HTTPS-based function, you first set up a staging Cloud Storage bucket:

gsutil mb gs://[YOUR-PROJECT-ID]-staging-functions

Then you deploy the function:

gcloud functions deploy fromSFDCtoFirebase --local-path=extras/functions/fromSFDCtoFirebase --stage-bucket=gs://[YOUR-PROJECT-ID]-staging-functions --trigger-http
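
For reference, the following is a hypothetical sketch of what such an HTTP handler could look like; the actual implementation lives in extras/functions/fromSFDCtoFirebase, and the webhook payload shape shown here is an assumption:

const admin = require('firebase-admin');

// Authenticate with Application Default Credentials; the databaseURL is
// the one you configured earlier in customer_site/public/index.html.
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://<DATABASE_NAME>.firebaseio.com'
});

// Assumed payload from the Salesforce trigger:
// {"ticketID": "...", "status": "Closed"}
exports.fromSFDCtoFirebase = (req, res) => {
  const update = req.body;
  admin.database().ref('/tickets/' + update.ticketID)
    .update({status: update.status})
    .then(() => res.status(200).send('OK'))
    .catch(err => res.status(500).send(err.toString()));
};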

Calling an HTTPS-triggered Cloud Function is not covered in this tutorial because the details depend on your third-party helpdesk tool. In Salesforce, you could use the Trigger feature: with Apex scripting, you can call the HTTPS endpoint that the Cloud Function exposes. Refer to the Salesforce documentation for more information on how to use Apex to call HTTP endpoints.

Try it

In the folder that you cloned, there is a simple HTML file that you can run locally to test your setup:

  1. Double-click customer_site/public/index.html to open it in your browser.
  2. Fill in all form fields.
  3. Click Create.

If everything works fine, a new ticket appears, and after a while, the fields under the --Predictions-- header get updated with values similar to the following:

--Predictions--
sentiment: -0.6000000238418579
tags: reasons,image,instance,screen
priority: P2
t_resolution: 6.791423797607422

The design of the frontend is not part of this tutorial, but you could use CSS and AngularJS, for example, to build something more polished.

Cleaning up

Delete the resources

  1. Delete your Cloud Datalab instance, replacing [YOUR_DATALAB_INSTANCE_NAME] with your Cloud Datalab instance name:

    datalab delete [YOUR_DATALAB_INSTANCE_NAME] --zone us-central1-f
  2. Remove the deployed models, replacing the variables with the corresponding model names:

    gcloud ml-engine models delete [PRIORITY_MODEL_NAME]
    gcloud ml-engine models delete [RESOLUTION_TIME_MODEL_NAME]
  3. Remove Firebase database data, replacing [PROJECT_ID] with the project ID:

    firebase database:remove / --project [PROJECT_ID]
  4. Remove the deployed Cloud Functions:

    gcloud functions delete priority
    gcloud functions delete resolutiontime
    gcloud functions delete tags
    gcloud functions delete sentiment
    gcloud functions delete updateSFDC

Delete the project

Alternatively, you can delete the project entirely.

  1. In the GCP Console, go to the Manage resources page.

  2. In the project list, select the project you want to delete and click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next

  • This tutorial's frontend is simple. To build a richer UI, consider using AngularJS integration with Firebase through AngularFire.
  • To host the customer frontend in a managed environment, you can leverage the Firebase Hosting tutorial.
  • To protect your application, make sure to use the proper security rules such as required authentication and authorization.
  • This tutorial uses Salesforce.com as an example, but you can use other options such as Jira or any other CRM/helpdesk tool that you might be using and that offers a REST API.
  • Try out other Google Cloud Platform features for yourself. Have a look at our tutorials.