A developer’s guide to Google Kubernetes Engine, or GKE
Drew Bradstock
Sr. Director of Product Management, Cloud Runtimes
When people think about whether to deploy on a container management platform like Kubernetes, the decision often comes down to operational benefits: better resource efficiency, higher scalability, stronger resiliency, and improved security. But Kubernetes also benefits the software development side of the house. Whether it’s improved portability for your code or better productivity, Kubernetes is a win for developers, not just operators.
For one thing, as we argued in Re-architecting to cloud native: an evolutionary approach to increasing developer productivity at scale, Kubernetes makes it easier to adopt modern cloud-native software development patterns like microservices, which can give you:
Increased developer productivity - Keep developers productive even as you increase your team sizes.
Faster time-to-market - Add new features and fix defects more quickly.
Higher availability - Increase the uptime of your software, reduce the rate of deployment failures, and reduce time-to-restore in the event of incidents.
Improved security - Reduce the attack surface area of your applications, and make it easier to detect and respond rapidly to attacks and newly discovered vulnerabilities.
Better scalability - Cloud-native platforms and applications make it easy to scale horizontally where necessary—and to scale down too.
Reduced costs - A streamlined software delivery process reduces the costs of delivering new features, and effective use of cloud platforms substantially reduces the operating costs of your services.
Google of course invented Kubernetes, which Google Cloud offers as the fully managed service, Google Kubernetes Engine (GKE). But did you know that Google Cloud also offers a full complement of developer tools that are tightly integrated with GKE? Today, in honor of KubeCon, we’re revisiting a few blogs that will show you how to develop apps destined for GKE, how to deploy them safely and efficiently, and how to monitor and debug them once they’re in production.
Developing for GKE: It all starts with you
Even the most enterprise-y applications get their start in life on a developer’s laptop. The same goes for applications running on GKE. To make that possible, there are a variety of tools you can use to integrate your local development environment with GKE.
Developers are known for tricking out their laptops with lots of compute resources; using Minikube, you can take advantage of GPUs, for example. There are also local development tools that help you containerize and deploy Java apps: Jib and Skaffold. Jib containerizes your Java apps without requiring you to install Docker, run a Docker daemon, or even write a Dockerfile, and is available as a plugin for Maven or Gradle. Skaffold then deploys those containerized apps to a Kubernetes cluster whenever it detects a change, and can even inject a new version of a file into a running container! Read about this in depth at Livin’ la vida local: Easier Kubernetes development from your laptop.
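To make that concrete, here’s a rough sketch of what a minimal skaffold.yaml using the Jib builder might look like. The image name, manifest paths, and schema version are placeholders you’d adapt to your own project and Skaffold release:

```yaml
# A minimal, hypothetical skaffold.yaml; image name and manifest paths are placeholders.
apiVersion: skaffold/v2beta29    # match the schema version of your Skaffold release
kind: Config
build:
  artifacts:
    - image: gcr.io/my-project/my-java-app
      jib: {}                    # build with Jib: no Dockerfile or Docker daemon needed
      sync:
        auto: true               # let Skaffold sync changed files into the running container
deploy:
  kubectl:
    manifests:
      - k8s/*.yaml               # your Kubernetes Deployment and Service manifests
```

With a config like this in place, running skaffold dev rebuilds and redeploys the app whenever Skaffold detects a change to your source.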
Another popular tool among GKE developers is Cloud Code, which provides plugins for the popular Visual Studio Code and IntelliJ integrated development environments (IDEs) to simplify developing for GKE. For example, we recently updated Cloud Code with much more robust support for Kubernetes YAML and Custom Resource Definitions (CRDs). Read more at Cloud Code makes YAML easy for hundreds of popular Kubernetes CRDs.
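As a quick illustration, if the CRDs for a tool like cert-manager are installed in your cluster and their schemas are available to Cloud Code, you get completion and inline validation while authoring a resource like the hypothetical one below. Every name here is a placeholder:

```yaml
# A hypothetical cert-manager Certificate; all names below are placeholders.
apiVersion: cert-manager.io/v1
kind: Certificate
metadata:
  name: my-app-tls
  namespace: default
spec:
  secretName: my-app-tls         # Secret where the issued certificate will be stored
  dnsNames:
    - my-app.example.com
  issuerRef:
    name: letsencrypt-prod       # an Issuer or ClusterIssuer you have created separately
    kind: ClusterIssuer
```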
Have a quick and dirty development task to do? Check out Cloud Shell Editor, which launches a full-featured, but self-contained, container development environment in your browser. Read more at New Cloud Shell Editor: Get your first cloud-native app running in minutes.
Get in the (pipe)line
Eventually, you’ll be ready to push the apps you developed on your laptop to production. Along the way, you’ll probably want to make sure that the code has been properly tested and that it passes the requisite security and compliance checks. Google Cloud offers a variety of tools to help you push that code through the pipeline.
- Setting up an automated deployment pipeline to GKE doesn’t have to be hard. In Create deployment pipelines for your GKE workloads in a few clicks, learn how to use Cloud Build to create a pipeline from scratch, including selecting your source, build configuration, and Kubernetes YAML files. (A minimal build configuration is sketched after this list.)
- But before you do, make sure that the image you’re deploying is secure. Binary Authorization provides a policy enforcement chokepoint to ensure only signed and authorized images are deployed in your environment; a sample policy is also sketched after this list. You can read more about it in Deploy only what you trust: introducing Binary Authorization for GKE.
- Even better, Artifact Registry has built-in vulnerability scanning. Once enabled, all container images built using Cloud Build are automatically scanned for OS package vulnerabilities as they’re pushed to Artifact Registry. Read more at Turbocharge your software supply chain with Artifact Registry.
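To tie those steps together, here’s a hedged sketch of a cloudbuild.yaml that builds an image, pushes it to Artifact Registry (where vulnerability scanning, if enabled, runs automatically on push), and rolls it out to a GKE cluster. The repository, cluster, and region values are all placeholders:

```yaml
# Hypothetical cloudbuild.yaml; repository, cluster, and region names are placeholders.
steps:
  # Build the container image.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:$SHORT_SHA', '.']
  # Push to Artifact Registry; if vulnerability scanning is enabled, it runs on push.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:$SHORT_SHA']
  # Roll the new image out to the GKE cluster using the checked-in Kubernetes manifests.
  - name: 'gcr.io/cloud-builders/gke-deploy'
    args:
      - run
      - --filename=k8s/
      - --image=us-central1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:$SHORT_SHA
      - --cluster=my-cluster
      - --location=us-central1
images:
  - 'us-central1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:$SHORT_SHA'
```

And if you enable Binary Authorization on the cluster, a policy along these lines, imported with gcloud container binauthz policy import policy.yaml, is what ensures only attested images are admitted. The attestor name is a placeholder for one you’d create yourself:

```yaml
# Hypothetical Binary Authorization policy (policy.yaml); the attestor name is a placeholder.
globalPolicyEvaluationMode: ENABLE
defaultAdmissionRule:
  evaluationMode: REQUIRE_ATTESTATION            # only admit images with a valid attestation
  enforcementMode: ENFORCED_BLOCK_AND_AUDIT_LOG  # block and log anything that doesn't comply
  requireAttestationsBy:
    - projects/my-project/attestors/built-by-cloud-build
```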
Monitor, Debug, Repeat: Remote development for GKE apps
Now that your app is in production on a GKE cluster, your work is done, right? Wrong. For developers, getting an app to production is still just the beginning of the software lifecycle. Chances are, you have ideas about how to improve your app, and you’ll definitely want to monitor it for signs of trouble. GKE is tightly integrated with several monitoring, debugging, and performance management tools that can help ensure the health of your GKE apps, so you can make them even better!
When there’s a problem in your production environment, one of the first places you’ll want to look is your logs. You can do that with Cloud Logging and Cloud Monitoring, both enabled by default when you create a GKE cluster. To learn more about how to use Cloud Logging for GKE logs, including use cases and best practices, check out Using logging for your apps running on Kubernetes Engine.
From there, find out how you can use Cloud Logging and Cloud Monitoring to debug your applications and track down the culprit.
We’re developers too
As long-standing leaders in the open source community, including the Cloud Native Computing Foundation (CNCF) and the Open Container Initiative (OCI), we’re always thinking about how industry developments affect your day-to-day work as a GKE developer. For example, Docker’s recent announcement of new rate limits on image pulls prompted us to write this post on how to manage those restrictions in a GKE environment. In addition to making GKE the most scalable and robust container management platform, we’re deeply committed to making it the easiest to use and develop on. New to Kubernetes and GKE? Learn more with this free, hands-on training. And if you’re participating in KubeCon this week, be sure to stop by our (virtual) booth to meet an expert.