Kubernetes Engine Overview

Google Kubernetes Engine provides a managed environment for deploying, managing, and scaling your containerized applications using Google infrastructure. The environment Kubernetes Engine provides consists of multiple machines (specifically, Google Compute Engine instances) grouped together to form a cluster.
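Because the cluster is a standard Kubernetes cluster, you can inspect its nodes with any Kubernetes client once credentials are configured. Below is a minimal sketch using the official Kubernetes Python client; it assumes the `kubernetes` package is installed and that local kubeconfig credentials for the cluster already exist (for example, created with `gcloud container clusters get-credentials`).

```python
# Minimal sketch: list the Compute Engine VMs that serve as this cluster's nodes.
# Assumes `pip install kubernetes` and existing kubeconfig credentials.
from kubernetes import client, config

config.load_kube_config()          # read credentials from ~/.kube/config
core = client.CoreV1Api()

for node in core.list_node().items:
    print(node.metadata.name, node.status.node_info.kubelet_version)
```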

Cluster orchestration with Kubernetes Engine

Kubernetes Engine clusters are powered by the Kubernetes open source cluster management system. Kubernetes provides the mechanisms through which you interact with your cluster. You use Kubernetes commands and resources to deploy and manage your applications, perform administration tasks and set policies, and monitor the health of your deployed workloads.
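For example, deploying an application means creating a Kubernetes Deployment resource. The sketch below does this with the Kubernetes Python client; the name `hello-web`, the image path, and the replica count are illustrative placeholders, not values from this page.

```python
# Illustrative only: create a Deployment with the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="hello-web"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "hello-web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "hello-web"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="hello-web",
                    image="gcr.io/my-project/hello-app:1.0",  # placeholder image
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)
```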

Kubernetes draws on the same design principles that run popular Google services and provides the same benefits: automatic management, monitoring and liveness probes for application containers, automatic scaling, rolling updates, and more. When you run your applications on a cluster, you're using technology based on Google's 10+ years of experience running production workloads in containers.
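Two of the mechanisms mentioned above, liveness probes and rolling updates, are configured directly on your workload objects. The sketch below shows what those settings look like with the Kubernetes Python client; the health-check path, port, and timing values are assumptions chosen for illustration.

```python
# Sketch of a liveness probe and a rolling-update strategy; values are illustrative.
from kubernetes import client

container = client.V1Container(
    name="hello-web",
    image="gcr.io/my-project/hello-app:1.0",   # placeholder image
    liveness_probe=client.V1Probe(
        http_get=client.V1HTTPGetAction(path="/healthz", port=8080),
        initial_delay_seconds=5,   # wait before the first check
        period_seconds=10,         # probe every 10 seconds
        failure_threshold=3,       # restart the container after 3 failed checks
    ),
)

# Rolling updates replace Pods gradually instead of all at once.
strategy = client.V1DeploymentStrategy(
    type="RollingUpdate",
    rolling_update=client.V1RollingUpdateDeployment(
        max_unavailable=1,   # at most one Pod down during the rollout
        max_surge=1,         # at most one extra Pod created during the rollout
    ),
)
```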

Kubernetes on Google Cloud Platform

When you run a Kubernetes Engine cluster, you also gain the benefit of advanced cluster management features that Google Cloud Platform provides. These include:

- Google Cloud Platform's load balancing for Compute Engine instances
- Node pools to designate subsets of nodes within a cluster for additional flexibility
- Automatic scaling of your cluster's node instance count
- Automatic upgrades for your cluster's node software
- Node auto-repair to maintain node health and availability
- Logging and monitoring with Stackdriver for visibility into your cluster

Kubernetes versions and features

Kubernetes Engine cluster masters are automatically upgraded to run new versions of Kubernetes as those versions become stable, so you can take advantage of newer features from the open source Kubernetes project.
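To see which Kubernetes version your cluster master is currently running, you can query the API server. A short sketch with the Kubernetes Python client, again assuming kubeconfig credentials are already set up:

```python
# Report the Kubernetes version served by the cluster master.
from kubernetes import client, config

config.load_kube_config()
version = client.VersionApi().get_code()
print(version.git_version)   # e.g. a "v1.x.y" string reported by the API server
```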

New features in Kubernetes are listed as Alpha, Beta, or Stable, depending upon their status in development. In most cases, Kubernetes features that are listed as Beta or Stable are included with Kubernetes Engine. Kubernetes Alpha features are available in special Kubernetes Engine alpha clusters.

Kubernetes Engine Workloads

Kubernetes Engine works with containerized applications: applications packaged into hardware-independent, isolated user-space instances, for example by using Docker. In Kubernetes Engine and Kubernetes, these containers, whether for applications or batch jobs, are collectively called workloads. Before you deploy a workload on a Kubernetes Engine cluster, you must first package the workload into a container.
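A batch workload, for instance, is typically deployed as a Kubernetes Job rather than a Deployment. The sketch below shows the pattern with the Kubernetes Python client; the Job name and image path are placeholders.

```python
# Sketch: a batch workload deployed as a Kubernetes Job (names and image are placeholders).
from kubernetes import client, config

config.load_kube_config()
batch = client.BatchV1Api()

job = client.V1Job(
    metadata=client.V1ObjectMeta(name="nightly-report"),
    spec=client.V1JobSpec(
        backoff_limit=2,    # retry the container at most twice on failure
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[client.V1Container(
                    name="report",
                    image="gcr.io/my-project/report-job:1.0",   # placeholder image
                )],
            ),
        ),
    ),
)

batch.create_namespaced_job(namespace="default", body=job)
```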

Google Cloud Platform provides continuous integration and continuous delivery tools to help you build and serve application containers. You can use Google Container Builder to build container images (such as Docker images) from a variety of source code repositories, and Google Container Registry to store and serve your container images.
