Containers & Kubernetes

Build your future with GKE

March 11, 2021
Pali Bhat

Vice President of Product & Design

American poet Maya Angelou said, “If you don’t know where you’ve come from, you don’t know where you’re going.” We agree. Today, as we kick off the Build with Google Kubernetes Engine event, and fresh off our GKE Autopilot launch, we wanted to take a step back and reflect on just how far GKE has come. In just six short years, GKE has become one of the most widely adopted services for running modern cloud-native applications, used by startups and Fortune 500 companies alike. This enthusiasm inspires us to push the limits of what’s possible with Kubernetes, making it easier for you to focus on creating great services for your users, while we take care of your Kubernetes clusters.

So let’s take a look at where we’ve been with Kubernetes and where we are today—so we can build the future together.   

Sustained innovation

A lot has changed in the container orchestration space since we created Kubernetes and opened it up to the world more than six years ago. It’s a little hard to remember, but back when we first designed Kubernetes, there was no industry standard for managing fleets of containerized applications at scale. Because we had already developed so many technical innovations for containers (e.g., Container-Optimized OS), it was only natural for us to propose a new approach to managing containers—one based on our experience at the time, launching billions of containers every week for our internal needs.

In 2015, we co-founded the Cloud Native Computing Foundation (CNCF) as a vendor-neutral home for the Kubernetes project. Since then, a diverse, global community of developers has contributed to—and benefited from—the project. Last year alone, developers from 500+ companies contributed to Kubernetes, and all the major cloud providers have followed in our footsteps by offering a managed Kubernetes service. This broad industry support for the technology we developed helps us deliver on our vision: giving customers the choice to run their workloads where and when they want, without being locked into a legacy cloud provider’s proprietary APIs.

Community leadership

Since Kubernetes’ inception as an internal Google project, we’ve only continued to invest in it. Under the auspices of the CNCF, we’ve made over 680,000 additional contributions to the project, including over 123,000 contributions in 2020. That’s more than all the other cloud providers combined. When you truly want to take advantage of Kubernetes, there’s no match for Google’s expertise—or GKE.

We also actively support the CNCF with credits to host Kubernetes on Google Cloud, enabling 100 million container downloads every day and over 400,000 integration tests per month, totaling over 300,000 core hours on GKE and Google Cloud. (Yes, you read that right: the Kubernetes project itself is built and served from GKE and Google Cloud.)

Customer outcomes

As the creators of Kubernetes, and with all this continued investment, it’s not surprising that we have a great managed Kubernetes service; in fact, I think we can credibly claim it’s the best one in the market. 

Enterprises flock to GKE to solve for speed, scale, security and availability. Among the Fortune 500, five of the top 10 telecommunications, media and gaming companies, six of the top 10 healthcare and life sciences companies, and seven of the top 10 retail and consumer packaged goods companies use GKE. Leading technology companies are also embracing GKE; for example, Databricks is enabling customers to leverage a Google Kubernetes Engine-based Databricks service on Google Cloud.

When it comes to scale, GKE is second to none. After all, Google itself operates numerous globally available services like YouTube, Gmail and Drive, so we know a thing or two about deploying workloads at scale. We bring this expertise to Kubernetes in a way that only Google can. For example, Bayer Crop Science used GKE to seamlessly scale their research workloads over 200x on 15,000-node clusters.

GKE offers native security capabilities such as network policy logging, a hardened sandbox environment, vulnerability scanning, shielded nodes (which use cryptographically verifiable checks of node identity and integrity) and confidential nodes—all designed to simplify implementing a defense-in-depth approach to security, so you can operate safely at scale. Customers like Shopify trust GKE to help them handle tremendous scale with no interruptions. Over the most recent Black Friday/Cyber Monday period, Shopify processed over $5B in transactions!
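
To make that list a bit more concrete, here is a minimal, hypothetical gcloud sketch for turning on a few of these protections at cluster and node-pool creation time. The cluster name, pool name and zone are placeholders, and exact flag availability may vary by gcloud and GKE version:

  # Create a cluster with Shielded GKE Nodes and network policy enforcement.
  gcloud container clusters create hardened-cluster \
      --zone us-central1-a \
      --enable-shielded-nodes \
      --enable-network-policy

  # Add a node pool whose Pods run inside the GKE Sandbox (gVisor).
  gcloud container node-pools create sandboxed-pool \
      --cluster hardened-cluster \
      --zone us-central1-a \
      --image-type cos_containerd \
      --sandbox type=gvisor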

It also offers a series of industry-first capabilities such as release channels, multi-cluster support and four-way auto-scaling, along with node auto-repair to help improve availability. And that's just the feature set: GKE also helps optimize costs with efficient bin packing and auto-scaling. Customers like OpenX are saving up to 45% using GKE.
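
As a rough illustration of how several of these capabilities come together at cluster creation, here is a hypothetical gcloud sketch; the cluster name and resource limits are placeholders, and flags may differ slightly across gcloud and GKE versions:

  # Standard cluster on the Regular release channel with node auto-repair,
  # cluster autoscaling, vertical Pod autoscaling and node auto-provisioning.
  gcloud container clusters create demo-cluster \
      --zone us-central1-a \
      --release-channel regular \
      --enable-autorepair \
      --num-nodes 3 \
      --enable-autoscaling --min-nodes 1 --max-nodes 10 \
      --enable-vertical-pod-autoscaling \
      --enable-autoprovisioning --max-cpu 64 --max-memory 256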

GKE Autopilot momentum

This leads us to GKE Autopilot, a new mode of operation for GKE that helps reduce the operational cost of managing clusters, optimize your clusters for production and deliver higher workload availability. In the month since its launch, customers like Strabag and Via Transportation report dramatic improvements in the performance, security and resilience of their Kubernetes environments, all while spending less time managing their clusters.
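
Getting started with Autopilot is deliberately simple. As a minimal sketch (the cluster name and region below are placeholders), a single command creates a cluster whose nodes, scaling, upgrades and security settings are managed by Google:

  # Create an Autopilot cluster; nodes are provisioned and managed automatically,
  # and you are billed for Pod resource requests rather than for nodes.
  gcloud container clusters create-auto my-autopilot-cluster \
      --region us-central1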

In short, we’ve worked hard to deliver the most configurable, secure, scalable, and automated Kubernetes service on the market today. And we're just getting started. With 5+ years of ongoing investment in Kubernetes, you can be confident that GKE will be there to support your success and growth—today and into the future.

Interested in showing just how much you love GKE? Join the Google Cloud {Code_Love_Hack} hackathon to show us how you use GKE, containers, and Cloud Code to spread the love of coding! Registration is open, and we can’t wait to see all the great projects you’ll build with GKE!
