Google Kubernetes Engine

Diagram: a GKE container cluster in Google Cloud. The user drives the cluster with kubectl over HTTP/gRPC or the Google Cloud UI. A regional control plane (API server, scheduler, resource controllers, storage) is provisioned, maintained, and operated by GKE; nodes running user pods and their containers are provisioned by GKE and optionally maintained and operated by the user. Connected Google Cloud services include Persistent Disk, Load Balancer, VPC Networking, and Cloud Monitoring.
Run advanced apps on a secured and managed Kubernetes service

GKE is an enterprise-grade platform for containerized applications, including stateful and stateless workloads, AI and ML, Linux and Windows, complex and simple web apps, APIs, and backend services. Leverage industry-first features like four-way auto-scaling and no-stress management. Optimize GPU and TPU provisioning, use integrated developer tools, and get multi-cluster support from SREs.

  • Start quickly with single-click clusters
  • Leverage a high-availability control plane including multi-zonal and regional clusters
  • Eliminate operational overhead with auto-repair, auto-upgrade, and release channels
  • Secure by default, including vulnerability scanning of container images and data encryption
  • Integrated Cloud Monitoring with infrastructure, application, and Kubernetes-specific views

Speed up app development without sacrificing security

Develop a wide variety of apps with support for stateful, serverless, and application accelerators. Use Kubernetes-native CI/CD tooling to secure and speed up each stage of the build-and-deploy life cycle.

Streamline operations with release channels

Choose the channel that fits your business needs. The rapid, regular, and stable release channels have different node-upgrade cadences and offer support levels aligned with each channel's nature.
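Enrolling a cluster in a channel is a single flag at cluster creation; a minimal sketch with gcloud, where the cluster name and region are placeholders:

```shell
# Create a cluster enrolled in the "regular" release channel;
# its nodes then auto-upgrade on that channel's cadence.
gcloud container clusters create my-cluster \
    --region us-central1 \
    --release-channel regular
```

Existing clusters can likewise be moved between channels with `gcloud container clusters update`.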

Let Google SREs manage the infrastructure

Get back time to focus on your applications with help from Google Site Reliability Engineers (SREs). Our SREs constantly monitor your cluster and its computing, networking, and storage resources.

Key features

For Developers

Kubernetes applications

Enterprise-ready containerized solutions with prebuilt deployment templates, featuring portability, simplified licensing, and consolidated billing. These are not just container images, but open source, Google-built, and commercial applications that increase developer productivity, available now on Google Cloud Marketplace.

For Platform operators

Pod and cluster autoscaling

Horizontal pod autoscaling based on CPU utilization or custom metrics; cluster autoscaling that works on a per-node-pool basis; and vertical pod autoscaling that continuously analyzes the CPU and memory usage of pods and dynamically adjusts their CPU and memory requests in response. GKE automatically scales nodes and clusters across multiple node pools as workload requirements change.
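The horizontal pod autoscaling described above is configured with the standard Kubernetes HorizontalPodAutoscaler object; a minimal sketch, assuming a Deployment named web (the names and thresholds here are placeholders):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web              # placeholder Deployment name
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 60   # scale out above 60% average CPU
```

Cluster autoscaling is configured separately, per node pool, and adds or removes nodes when pods cannot be scheduled or nodes sit underutilized.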

For Security administrators

Workload and network security

GKE Sandbox provides a second layer of defense between containerized workloads on GKE for enhanced workload security. GKE clusters natively support Kubernetes Network Policy to restrict traffic with pod-level firewall rules. Private clusters in GKE can be restricted to a private endpoint or a public endpoint that only certain address ranges can access.
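The pod-level firewall rules mentioned above are expressed as standard Kubernetes NetworkPolicy objects; a minimal sketch that admits ingress to backend pods only from frontend pods (both labels are placeholders):

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-frontend-only
spec:
  podSelector:
    matchLabels:
      app: backend         # pods this policy protects
  policyTypes: ["Ingress"]
  ingress:
  - from:
    - podSelector:
        matchLabels:
          app: frontend    # only these pods may connect
```

Once any policy selects a pod, all other ingress to that pod is denied by default.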

View all features

Customer stories


  • Reduces average API response time to requests by 10x

  • Doubles the number of Kubernetes nodes in seconds

  • Identifies chokepoints to improve time-consuming processes



View more customers

What’s new


GKE quickstart

Shows how to deploy a containerized application with GKE. Follow the steps on your own or try the included training lab.

Best practice
Best practices for operating containers

Learn best practices for operating containers in GKE.

GKE tutorial: Deploying a containerized web application

This tutorial shows you how to package a web application in a Docker container image and run that container image on a GKE cluster.

Gamified lab

Deploy a containerized application with GKE in less than 30 minutes. Score points and earn badges along the way.

Best practice
Preparing a GKE Environment for Production

Follow the guidance and methodology for onboarding your workloads more securely, reliably, and cost-effectively to GKE.

Common use cases

Modern application development

GKE enables rapid application development and iteration by making it easy to deploy, update, and manage your applications and services.

Continuous delivery pipeline

Continuous delivery pipeline using GKE, Cloud Source Repositories, Cloud Build, and Spinnaker for Google Cloud. Configure these services to automatically build, test, and deploy an app. When the app code is modified, the changes trigger the continuous delivery pipeline to automatically rebuild, retest, and redeploy the new version.
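The build-and-push stages of such a pipeline are driven by a Cloud Build configuration; a minimal sketch, where the image name is a placeholder and Spinnaker handles deployment from the pushed image onward:

```yaml
# cloudbuild.yaml — image name and tag scheme are placeholders
steps:
- name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
images:
# Listing the image here makes Cloud Build push it after the build.
- 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA'
```

A trigger on the Cloud Source Repositories branch runs this configuration on every commit.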

Architecture diagram: the continuous delivery pipeline. Developers push code to Cloud Source Repositories, which triggers Cloud Build; build artifacts are stored in Cloud Storage, and Spinnaker deploys to the my-app-canary and my-app-prod environments on Google Kubernetes Engine, backed by Redis on Cloud Memorystore. Users reach the application through Cloud Load Balancing.

VM migration to containers   

Use Migrate for Anthos to move and convert workloads directly into containers in GKE. Target workloads can include physical servers and VMs running on-premises, in Compute Engine, or in other clouds, giving you the flexibility to transform your existing infrastructure with ease. Best of all, Migrate for Anthos is available at no additional cost and it does not require an Anthos subscription.

Migrating a two-tier application to GKE

Migrate a two-tiered LAMP stack application, with both application and database VMs, from VMware to GKE. Improve security by making the database accessible from the application container only and not from outside the cluster. Replace SSH access with authenticated shell access through kubectl. See container system logs through automatic Cloud Logging integration.
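The kubectl-based shell access that replaces SSH looks like this in practice (the pod name is a placeholder):

```shell
# Open an authenticated shell in the migrated database container,
# instead of SSH-ing into the old VM:
kubectl exec -it database-0 -- /bin/bash

# Stream the container's system logs, which GKE also forwards
# to Cloud Logging automatically:
kubectl logs -f database-0
```

Access is governed by the cluster's IAM and RBAC policies rather than SSH keys.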

Architecture diagram: migrating a two-tier application to GKE. An on-premises subnet (domain controller, corporate DNS, users) connects over VPN/Interconnect to a VPC subnet in Google Cloud. In the GKE cluster, an app service and a database service each run as user-space-only workloads (systemd plus the app or database container), with a network policy in front of the database service and a cloud extension linking back to on-premises.

All features

Identity and access management Control access in the cluster with your Google accounts and role permissions.
Hybrid networking Reserve an IP address range for your cluster, allowing your cluster IPs to coexist with private network IPs via Google Cloud VPN.
Security and compliance GKE is backed by a Google security team of over 750 experts and is both HIPAA and PCI DSS compliant.
Integrated logging and monitoring Enable Cloud Logging and Cloud Monitoring with simple checkbox configurations, making it easy to gain insight into how your application is running.
Cluster options Choose clusters tailored to the availability, version stability, isolation, and pod traffic requirements of your workloads.
Auto scale Automatically scale your application deployment up and down based on resource utilization (CPU, memory).
Auto upgrade Automatically keep your cluster up to date with the latest release version of Kubernetes. Kubernetes release updates are quickly made available within GKE.
Auto repair When auto repair is enabled, if a node fails a health check, GKE initiates a repair process for that node.
Resource limits Kubernetes allows you to specify how much CPU and memory (RAM) each container needs, which is used to better organize workloads within your cluster.
Container isolation Use GKE Sandbox for a second layer of defense between containerized workloads on GKE for enhanced workload security.
Stateful application support GKE isn't just for 12-factor apps. You can attach persistent storage to containers, and even host complete databases.
Docker image support GKE supports the common Docker container format.
Fully managed GKE clusters are fully managed by Google Site Reliability Engineers (SREs), ensuring your cluster is available and up-to-date.
OS built for containers GKE runs on Container-Optimized OS, a hardened OS built and managed by Google.
Private container registry Integrating with Google Container Registry makes it easy to store and access your private Docker images.
Fast consistent builds Use Cloud Build to reliably deploy your containers on GKE without needing to set up authentication.
Workload portability, on-premises and cloud GKE runs Certified Kubernetes, enabling workload portability to other Kubernetes platforms across clouds and on-premises.
GPU and TPU support GKE supports GPUs and TPUs and makes it easy to run ML, GPGPU, HPC, and other workloads that benefit from specialized hardware accelerators.
Built-in dashboard Cloud Console offers useful dashboards for your project's clusters and their resources. You can use these dashboards to view, inspect, manage, and delete resources in your clusters.
Preemptible VMs Low-cost, short-term instances designed to run batch jobs and fault-tolerant workloads. Preemptible VMs provide significant savings of up to 80% while still getting the same performance and capabilities as regular VMs.
Persistent disks support Durable, high-performance block storage for container instances. Data is stored redundantly for integrity, flexibility to resize storage without interruption, and automatic encryption. You can create persistent disks in HDD or SSD formats. You can also take snapshots of your persistent disk and create new persistent disks from that snapshot.
Local SSD support GKE offers always-encrypted local solid-state drive (SSD) block storage. Local SSDs are physically attached to the server that hosts the virtual machine instance for very high input/output operations per second (IOPS) and very low latency compared to persistent disks.
Global load balancing Global load-balancing technology helps you distribute incoming requests across pools of instances across multiple regions, so you can achieve maximum performance, throughput, and availability at low cost.
Linux and Windows support Fully supported for both Linux and Windows workloads, GKE can run both Windows Server and Linux nodes.
Hybrid and multi-cloud support Take advantage of Kubernetes and cloud technology in your own data center. Get the GKE experience with quick, managed, and simple installs as well as upgrades validated by Google through Anthos GKE.
Serverless containers Run stateless serverless containers with Cloud Run, abstracting away all infrastructure management and scaling them automatically.
Usage metering Fine-grained visibility to your Kubernetes clusters. See your GKE clusters' resource usage broken down by namespaces and labels, and attribute it to meaningful entities.
Release channels Release channels provide more control over which automatic updates a given cluster receives, based on the stability requirements of the cluster and its workloads. You can choose rapid, regular, or stable. Each has a different release cadence and targets different types of workloads.
Software supply chain security Verify, enforce, and improve security of infrastructure components and packages used for container images with Container Analysis.
Per-second billing Google bills in second-level increments. You pay only for the compute time that you use.
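Several of the features above, resource limits and autoscaling in particular, hinge on per-container requests and limits; a minimal sketch of how they are declared (the pod and image names are arbitrary):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: limits-demo        # placeholder pod name
spec:
  containers:
  - name: app
    image: nginx:1.25      # any container image works here
    resources:
      requests:
        cpu: "250m"        # scheduler reserves a quarter of a CPU
        memory: "128Mi"
      limits:
        cpu: "500m"        # container is throttled above half a CPU
        memory: "256Mi"    # container is OOM-killed above this
```

Requests drive scheduling and autoscaling decisions; limits cap what a container may actually consume.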


GKE provides one zonal cluster per billing account for free.

GKE charges a cluster management fee of $0.10 per cluster per hour and each worker node in your cluster is charged according to Compute Engine instance pricing, until a cluster is deleted.
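As a back-of-the-envelope check of the management fee (per-node Compute Engine charges vary by machine type and are excluded here):

```shell
# Management fee for one cluster over a 730-hour billing month,
# excluding worker-node Compute Engine charges:
fee_per_hour=0.10
hours_per_month=730
awk -v f="$fee_per_hour" -v h="$hours_per_month" \
    'BEGIN { printf "$%.2f/month\n", f * h }'   # prints "$73.00/month"
```
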

GKE cluster management fees do not apply to Anthos GKE clusters.

View pricing details

Take the next step

Get $300 in free credits to learn and build on Google Cloud for up to 12 months.

Need help getting started?
Work with a trusted partner
Continue browsing