Automatic serverless deployments with Cloud Source Repositories and Container Builder
Chris Broadfoot
Developer Relations
There are many reasons to automate your deployments: consistency, safety, and timeliness. These increase in value as your software becomes more critical to your business. In this post, I'll demonstrate how easy it is to start automating deployments with Google Cloud Platform (GCP) tools, and refer you to additional resources to help make your deployment process more robust.
Suppose you have an application built on Google Cloud Functions, Firebase, or Google App Engine. Today, you probably deploy your function or app via gcloud commands from your local workstation. Let's look at a lightweight workflow that takes advantage of two Google Cloud products: Cloud Source Repositories and Cloud Container Builder.
This simple pipeline uses build triggers in Cloud Container Builder to deploy a function to Cloud Functions when source code is pushed to a "prod" branch.
The first step is to get your code under revision control. If you're already using a provider like GitHub or Bitbucket, it's trivial to mirror your code to a Cloud Source Repository. Cloud Source Repositories is offered at no charge for up to five project-users, so it's perfect for small teams.
The commands you need are captured below, but you can find more detailed guides in the documentation.
Create and clone your repository:
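For example, assuming the gcloud CLI is installed and authenticated (the repository name here is a placeholder, and exact command syntax may vary by gcloud version):

```shell
# Create a new Cloud Source Repository and clone it locally.
gcloud source repos create my-function-repo
gcloud source repos clone my-function-repo
cd my-function-repo
```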
Now, create a simple function (include a package.json if you have third-party dependencies):
Then, create a Container Builder build definition:
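A deploy.yaml along these lines should work; the function name, staging bucket, and trigger type are placeholders to replace with your own, and the step schema may have changed since this was written:

```yaml
# deploy.yaml — a one-step Container Builder build that deploys the function.
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args:
  - beta
  - functions
  - deploy
  - my-function
  - --trigger-http
  - --stage-bucket=my-staging-bucket
```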
This is equivalent to running the command:
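That is, roughly the following, run from the repository root (flags vary by gcloud version; the function name and bucket are placeholders):

```shell
gcloud beta functions deploy my-function \
    --trigger-http \
    --stage-bucket=my-staging-bucket
```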
Before you start your first build, set up your project for Container Builder. First, enable two APIs: Container Builder API and Cloud Functions API. To allow Container Builder to deploy, you need to give it access to your project. The build process uses the credentials of a service account associated with those builds. The address for that service account is {numerical-project-id}@cloudbuild.gserviceaccount.com.
You'll need to add an IAM role to that service account: Project Editor. If you use this process to deploy other resources, you might need to add other IAM roles.
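One way to grant that role from the command line (the project ID and the numerical project ID in the service account address are placeholders):

```shell
# Grant the Container Builder service account the Project Editor role.
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:123456789012@cloudbuild.gserviceaccount.com \
    --role=roles/editor
```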
Now, test your deployment configuration and permissions by running:
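For example, with the build definition saved as deploy.yaml:

```shell
# Submit a build to Container Builder using the deploy config.
gcloud container builds submit --config deploy.yaml .
```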
Your function is now being deployed via Cloud Container Builder.
Creating a build trigger is easy: choose your repository, the trigger condition (in this case, pushing to the "prod" branch), and the build to run (in this case, the build specified in "deploy.yaml").
Now bring the "prod" branch up to date with "master", push it to Cloud Source Repositories, and your function will be deployed!
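Concretely, that looks something like this (assuming your Cloud Source Repository remote is named origin):

```shell
# Fast-forward prod to match master, then push to trigger the build.
git checkout prod
git merge master
git push origin prod
```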
If the deployment fails, it shows up as a failed build in the build history screen; check the logs to investigate what went wrong. You can also configure email or other notifications using Pub/Sub and Cloud Functions.
This is a simplified deployment pipeline—just enough to demonstrate the power of deployment automation. At some point, you'll probably find that this process doesn't meet your needs. For example, you might want to get a manual approval before you update production. If that happens, check out Spinnaker, an open-source deployment automation system that can handle more complex workflows.
And that’s just the beginning! As you get further down the road toward automating your deployments, here are some other tools and techniques for you to try:
- Add a test step (e.g., "npm test")
- Deploy App Engine apps instead of Cloud Functions (simply replace the gcloud command)
- Use variable substitutions to alter the function name based on the branch name (e.g., "my-function-dev")
- Use IDE plugins and editor support to push-to-deploy from your IDE or editor; check out some examples of common build configurations.
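For the variable-substitution idea, builds started by a trigger receive built-in substitutions such as $BRANCH_NAME, so a build step can derive the function name from the branch. A sketch (placeholder names, and the substitution behavior may differ across Container Builder versions):

```yaml
# deploy.yaml — deploy a per-branch function, e.g. "my-function-dev"
# when the trigger fires on the "dev" branch.
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args:
  - beta
  - functions
  - deploy
  - my-function-$BRANCH_NAME
  - --trigger-http
  - --stage-bucket=my-staging-bucket
```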