System testing Cloud Functions using Cloud Build and Terraform

This tutorial describes how to automate end-to-end testing of an app built with Cloud Functions. Cloud Build runs the testing pipeline, and HashiCorp Terraform sets up and tears down the Google Cloud Platform (GCP) resources required to run the tests. A Cloud Build trigger initiates the pipeline after each code commit.

Testability is a key consideration when you design apps and evaluate architectural choices. Creating and regularly running a comprehensive set of tests (including automated unit, integration, and end-to-end system tests) is essential to validate that your app behaves as expected. For details about how to approach each category of tests for different Cloud Functions scenarios, see the testing best practices guide.

Creating and running unit tests is typically straightforward, because these tests are isolated and independent of the execution environment. However, integration and system tests are more complex, particularly in a cloud environment. The need for end-to-end system tests is especially relevant for apps that use serverless technologies such as Cloud Functions. These apps are often event-driven and loosely coupled, and they might be independently deployed. Comprehensive end-to-end tests are essential to validate that the functions are correctly responding to events within the GCP execution environment.

Architecture

The following architectural diagram shows the components you use in this tutorial.

Architectural diagram of build and test projects.

The architecture has the following components:

  • A build project that hosts and runs the Cloud Build pipeline.
  • A test project that hosts GCP resources for the sample app under test.
    • The app described in the Serverless web performance monitoring tutorial is used as the sample app.
    • The GCP resources for the sample app are created and destroyed for each build iteration. The Cloud Firestore database is an exception. It's created once and reused by all subsequent builds.

Objectives

  • Create a Cloud Build pipeline to run unit and end-to-end tests for a sample app built with Cloud Functions.
  • Use Terraform from within the build to set up and destroy the GCP resources that the app requires.
  • Use a dedicated GCP testing project to keep the test environment isolated.
  • Create a Git repository in Cloud Source Repositories, and add a Cloud Build trigger to run the end-to-end build after a commit.

Costs

This tutorial uses the following billable components of Google Cloud Platform:

To generate a cost estimate based on your projected usage, use the pricing calculator. New GCP users might be eligible for a free trial.

When you finish this tutorial, you can avoid continued billing by deleting the resources you created. For more information, see Cleaning up.

Before you begin

  1. Select or create a GCP project. This is the test project that hosts the sample app.

    Go to the Project selector page

  2. Make a note of the GCP project ID for the test project. You need this ID in the next section on setting up your environment.
  3. Enable the Cloud Build, Cloud Functions, and Cloud Source Repositories APIs for that project.

    Enable the APIs

  4. In the GCP Console, go to the Firestore page.

    Go to the Firestore page

  5. Create a Cloud Firestore database.

    Learn how to create a Cloud Firestore database

  6. Select or create another GCP project. This is the build project that hosts the Cloud Build pipeline.

    Go to the Manage Resources page

  7. Make sure that billing is enabled for your GCP projects.

    Learn how to enable billing

Setting up your environment

In this tutorial, you run commands in Cloud Shell. Cloud Shell is a shell environment with the Cloud SDK already installed, including the gcloud command-line tool, and with values already set for your current project. Cloud Shell can take several minutes to initialize.

  1. In the GCP Console for the build project, open Cloud Shell.

    Open Cloud Shell

  2. Set a variable for the test GCP project ID that you copied earlier:

    export TEST_PROJECT=your-test-project-id
    

    Replace the following:

    • your-test-project-id: The ID of your test GCP project.
  3. Set the project ID and project number of the current build GCP project as variables:

    export BUILD_PROJECT=$(gcloud config get-value core/project)
    export BUILD_PROJECT_NUM=$(gcloud projects list \
        --filter="$BUILD_PROJECT" --format="value(PROJECT_NUMBER)")
    
  4. Set a variable for the deployment region:

    export REGION=us-central1
    

    Although this tutorial uses the us-central1 region, you can change this to any region where Cloud Functions is available.

  5. Clone the repository containing the code for the sample app used in this tutorial:

    git clone \
        https://github.com/GoogleCloudPlatform/solutions-serverless-web-monitoring.git
    
  6. Go to the project directory:

    cd solutions-serverless-web-monitoring
    

Creating test infrastructure with Terraform

This tutorial uses Terraform to automatically create and destroy GCP resources within the test project. Creating independent resources for each build helps keep tests isolated from each other. When you isolate tests, builds can occur concurrently, and test assertions can be made against specific resources. Destroying the resources at the end of each build helps to minimize costs.
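The per-build isolation works by appending a unique suffix to every resource name. A minimal Python sketch of the idea (the helper and resource names here are illustrative, not part of the sample app):

```python
def resource_names_for_build(suffix):
    """Illustrative helper (not from the sample app): derive per-build
    resource names from a unique suffix, such as the Cloud Build BUILD_ID."""
    return {
        "tracer_function": f"tracer-{suffix}",
        "metrics_bucket": f"metrics-{suffix}",
    }

# Two concurrent builds use disjoint resource sets, so their tests
# cannot interfere with each other:
build_a = resource_names_for_build("build-a")
build_b = resource_names_for_build("build-b")
assert set(build_a.values()).isdisjoint(set(build_b.values()))
```

Because each build asserts only against resources carrying its own suffix, builds can run concurrently without cross-talk.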

This tutorial deploys the app described in the serverless web monitoring tutorial. The app consists of a set of Cloud Functions, Cloud Storage buckets, Cloud Pub/Sub resources, and a Cloud Firestore database. The Terraform configuration defines the steps required to create these resources. The Cloud Firestore database isn't deployed by Terraform; the database is created once and reused by all tests.

The following code sample from the Terraform configuration file main.tf shows the steps required to deploy the trace Cloud Function. Refer to the full file for the complete configuration.

data "archive_file" "local_tracer_source" {
  type        = "zip"
  source_dir  = "./functions/tracer"
  output_path = "${var.local_output_path}/tracer.zip"
}

resource "google_storage_bucket_object" "gcs_tracer_source" {
  name   = "tracer.zip"
  bucket = "${google_storage_bucket.bucket_source_archives.name}"
  source = "${data.archive_file.local_tracer_source.output_path}"
}

resource "google_cloudfunctions_function" "function_tracer" {
  name = "tracer-${var.suffix}"
  project = "${var.project_id}"
  region = "${var.region}"
  available_memory_mb = "1024"
  entry_point = "trace"
  runtime = "nodejs8"
  trigger_http = "true"
  source_archive_bucket = "${google_storage_bucket.bucket_source_archives.name}"
  source_archive_object = "${google_storage_bucket_object.gcs_tracer_source.name}"
  environment_variables = {
    BUCKET_METRICS = "${google_storage_bucket.bucket_metrics.name}"
  }
}

In this step, you run the Terraform configuration to deploy the test resources. In a later step, Cloud Build deploys the resources automatically.

  1. In Cloud Shell, initialize Terraform:

    docker run -v $(pwd):/app -w /app hashicorp/terraform:0.12.0 init
    

    You use the public Terraform Docker image. Docker is already installed in Cloud Shell. The current working directory is mounted as a volume so the Docker container can read the Terraform configuration file.

  2. Create the resources using the Terraform apply command:

    docker run -v $(pwd):/app -w /app hashicorp/terraform:0.12.0 apply \
        --auto-approve \
        -var "project_id=$TEST_PROJECT" \
        -var "region=$REGION" \
        -var "suffix=tf-manual"
    

    The command includes variables that specify the GCP project and region where you want the test resources created. It also includes a suffix that is appended to the names of the resources created in this step. In a later step, Cloud Build automatically supplies a suitable suffix.

    It takes a few minutes for the operation to complete.

  3. Confirm that resources were created in the test project:

    gcloud functions list --project $TEST_PROJECT
    

    The output displays three Cloud Functions whose names end with the suffix supplied earlier.

Running end-to-end tests

In this section, you run end-to-end tests against the test infrastructure that you deployed in the preceding section.

The following code snippet shows the tests, which validate both the success and the failure scenarios. The test pipeline can be summarized as follows:

  • First, the test calls the trace function. This call initiates a flow of events through the app that triggers other functions.
  • Then, the test verifies the behavior of each function and confirms that objects are written to Cloud Storage, results are persisted to Cloud Firestore, and Cloud Pub/Sub alerts are generated upon failures.

def test_e2e_pass():
  run_pipeline('http://www.example.com/', True)


def test_e2e_fail():
  run_pipeline('https://cloud.google.com/docs/tutorials', False)


def run_pipeline(url, should_pass):
  """Triggers the web analysis pipeline and verifies outputs of each stage.

  Args:
    url (str): The page to analyze.
    should_pass (bool): Whether the page should load within the threshold time.
  """
  trace_response = call_tracer(url)
  filename = assert_tracer_response(trace_response)
  assert_gcs_objects(filename)
  assert_firestore_doc(filename, should_pass)
  assert_pubsub_message(should_pass)

  # clean up
  delete_gcs_objects(filename)
  delete_firestore_doc(filename)
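Because the pipeline is event-driven, outputs such as Cloud Storage objects, Firestore documents, and Pub/Sub messages appear asynchronously, so assertion helpers like the ones above typically poll with a timeout rather than checking once. A minimal polling sketch (the `wait_for` helper is illustrative and not taken from the sample code):

```python
import time

def wait_for(condition, timeout=60, interval=2):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    Returns the truthy result, or raises TimeoutError. This pattern is useful
    when asserting on resources that are created asynchronously after the
    pipeline is triggered.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout} seconds")

# Example usage with a hypothetical lookup function:
# doc = wait_for(lambda: fetch_firestore_doc(filename), timeout=120)
```

Choosing a generous timeout keeps the tests stable; a cold-started Cloud Function can add tens of seconds of latency to the first event.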

To run the end-to-end tests, complete the following steps:

  1. In Cloud Shell, create a new virtualenv environment. The virtualenv utility is already installed in Cloud Shell.

    virtualenv venv
    
  2. Activate the virtualenv environment:

    source venv/bin/activate
    
  3. Install the required Python libraries:

    pip install -r requirements.txt
    
  4. Run the end-to-end tests:

    python -m pytest e2e/ --tfstate terraform.tfstate
    

    You pass the Terraform state file, which contains details of the test resources created in the preceding section.

    The tests can take a few minutes to complete. A message indicating that two tests passed is displayed. You can ignore any warnings.

  5. Tear down the test resources by using the Terraform destroy command:

    docker run -v $(pwd):/app -w /app hashicorp/terraform:0.12.0 destroy \
        --auto-approve \
        -var "project_id=$TEST_PROJECT" \
        -var "region=$REGION" \
        -var "suffix=tf-manual"
    
  6. Confirm that the resources are destroyed:

    gcloud functions list --project $TEST_PROJECT
    

    There are no longer any Cloud Functions with names ending with the suffix supplied earlier.
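The terraform.tfstate file passed to pytest is plain JSON, so the tests can look up the generated resource names from it. A minimal sketch of that lookup, using an inline example that mirrors the Terraform 0.12 state layout (the exact contents of the real file will differ):

```python
import json

def function_names_from_state(state):
    """Return the names of all Cloud Functions recorded in a parsed tfstate."""
    names = []
    for res in state.get("resources", []):
        if res.get("type") == "google_cloudfunctions_function":
            for inst in res.get("instances", []):
                names.append(inst["attributes"]["name"])
    return names

# Inline example mirroring the tfstate structure (not the real file):
example_state = json.loads("""
{
  "resources": [
    {
      "type": "google_cloudfunctions_function",
      "name": "function_tracer",
      "instances": [{"attributes": {"name": "tracer-tf-manual"}}]
    }
  ]
}
""")
print(function_names_from_state(example_state))  # ['tracer-tf-manual']
```

Reading names from the state file, rather than hard-coding them, is what lets the same tests run against resources created with any suffix.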

Submitting the Cloud Build pipeline

In this section, you use Cloud Build to automate the testing pipeline.

Set Cloud Build permissions

You run Cloud Build using a Cloud Build service account. The system tests executed by the build create and interact with Cloud Functions, Cloud Storage buckets, Cloud Pub/Sub resources, and Cloud Firestore documents. To do these things, Cloud Build requires the following:

  • Appropriate Cloud IAM roles in the test project.
  • The ability to act as the Cloud Functions runtime service account. By default, Cloud Functions uses the App Engine service account (your-test-project-id@appspot.gserviceaccount.com) as its runtime service account, where your-test-project-id is the ID of your test GCP project.

In this procedure, you grant the appropriate roles to the Cloud Build service account, and then allow it to act as the App Engine service account.

  1. In Cloud Shell, add the appropriate Cloud IAM roles to the default Cloud Build service account:

    for role in cloudfunctions.developer pubsub.editor storage.admin datastore.user; do \
        gcloud projects add-iam-policy-binding $TEST_PROJECT \
        --member="serviceAccount:$BUILD_PROJECT_NUM@cloudbuild.gserviceaccount.com" \
        --role="roles/$role"; \
        done
    
  2. Add the Cloud Build service account as a serviceAccountUser of the App Engine service account within the test project:

    gcloud iam service-accounts add-iam-policy-binding \
        $TEST_PROJECT@appspot.gserviceaccount.com \
        --member="serviceAccount:$BUILD_PROJECT_NUM@cloudbuild.gserviceaccount.com" \
        --role=roles/iam.serviceAccountUser \
        --project $TEST_PROJECT
    

Submit a manual build

The build performs four logical tasks:

  • Running unit tests
  • Deploying the sample app
  • Running end-to-end tests
  • Destroying the sample app

Review the following code snippet from the cloudbuild.yaml file. The snippet illustrates the individual Cloud Build steps that deploy the sample app using Terraform and run the end-to-end tests.

# setup Terraform using public terraform Docker image
- id: terraform-init
  name: hashicorp/terraform:0.12.0
  args: ['init']


# deploy the required GCP resources
- id: terraform-apply
  name: hashicorp/terraform:0.12.0
  args: ['apply', '-auto-approve']
  env:
    - 'TF_VAR_project_id=$_TEST_PROJECT_ID'
    - 'TF_VAR_region=$_REGION'
    - 'TF_VAR_suffix=$BUILD_ID'


# run end-to-end tests to verify live interactions
- id: end-to-end-tests
  name: 'python:3.7-slim'
  entrypoint: /bin/sh
  args:
    - -c
    - 'pip install -r requirements.txt && python -m pytest e2e --tfstate terraform.tfstate'

To submit a manual build to Cloud Build and run end-to-end tests, do the following:

  • In Cloud Shell, enter the following:

    gcloud builds submit --config cloudbuild.yaml \
        --substitutions=_REGION=$REGION,_TEST_PROJECT_ID=$TEST_PROJECT
    

    The build takes several minutes to run. Note the following:

    • Cloud Build uses substitutions to supply variables that specify the GCP project and region for the test resources you create.

    • Cloud Build runs the build in the build GCP project. The test resources are created in the separate test project.

    The build logs are streamed to Cloud Shell so that you can follow the build progress. The log stream terminates upon build completion. Messages are displayed that indicate the final terraform-destroy build step is successful and the build is done.

Automating test execution

A key tenet of continuous integration (CI) is to regularly run a set of comprehensive automated tests. Typically, the build-test pipeline runs for each commit to a shared code repository. This setup helps to confirm that each commit to the shared repository is tested and validated, allowing your team to detect problems early.

In the next sections, you perform the following actions:

  • Create a Git repository in Cloud Source Repositories.
  • Add a Cloud Build trigger to run the end-to-end build upon every commit.
  • Push code to the repository to trigger the build.

Create a Cloud Source Repository and a Cloud Build trigger

  1. In Cloud Shell, create a new Cloud Source Repository:

    gcloud source repos create serverless-web-monitoring
    
  2. In the GCP Console, open the Cloud Build Triggers page.

    Go to the Triggers page

  3. Click Create Trigger.

  4. On the Create trigger page, select Cloud Source Repositories as source, and then click Continue.

  5. In the Repository list, select the new serverless-web-monitoring repository, and then click Continue.

  6. On the Triggers settings page, fill out the following options:

    • In the Description field, type end-to-end-tests.
    • In the Trigger type list, select Branch, and in the Branch (regex) field, type master.
    • In the Build configuration list, select Cloud Build configuration file.
    • In the Cloud Build configuration file location field, type cloudbuild.yaml.
    • To add a variable substitution to specify the GCP region where the test resources will be created, click Add Item:

      • Variable: _REGION
      • Value: your-test-region

        Replace the following:

        • your-test-region: The value of the $REGION variable in Cloud Shell.
    • To add another variable substitution to specify the ID of the project that will host the test resources, click Add Item:

      • Variable: _TEST_PROJECT_ID
      • Value: your-test-project

        Replace the following:

        • your-test-project: The value of the $TEST_PROJECT variable in Cloud Shell.
  7. Click Create trigger.

Trigger the build

  1. In Cloud Shell, add the repository as a new remote in your git config:

    git remote add csr \
        https://source.developers.google.com/p/$BUILD_PROJECT/r/serverless-web-monitoring
    
  2. To trigger the build, push the code to the repository:

    git push csr master
    
  3. List the most recent builds:

    gcloud builds list --limit 3
    

    The output displays a build in WORKING status, indicating that the build triggered as expected.

  4. Copy the ID of the WORKING build for the next step.

  5. Stream the build logs to the GCP Console:

    gcloud builds log --stream build-id
    

    Replace the following:

    • build-id: The ID of the WORKING build you copied in the preceding step.

    The log stream terminates upon build completion. Messages are displayed that indicate the final terraform-destroy build step is successful and that the build is done.

Cleaning up

To avoid incurring charges to your Google Cloud Platform account for the resources used in this tutorial:

Delete the project

  1. In the GCP Console, go to the Manage resources page.

    Go to the Manage resources page

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next
