Continuous Integration and Delivery

Deploy code faster: CI/CD and Kubernetes

Continuous integration and delivery (CI/CD) is a development strategy that gets application updates to your customers in a fast, automated way. Using Kubernetes and Kubernetes Engine, you can solve the following problems:

Long release cycles – Manual testing and deployment processes delay getting your code to production. Infrequent releases make merge conflicts more likely and leave customers waiting longer for patches and updates.

Outages – When you manage your infrastructure manually, someone has to carry a pager. Whether it’s a lightning strike on a datacenter or a traffic spike that exceeds capacity, someone on the team is getting woken up at 3am. While your app is down, you’re losing money and customers.

Inefficient server utilization – If you’re not managing your apps to ensure they’re packed efficiently onto servers, you’re probably paying too much for capacity, whether it’s in the cloud or on-premises.

Containerize the code

Running your apps in containers ensures they have the resources and libraries they need, while preventing conflicts between library versions and application components. This makes your app portable between environments, easy to replicate, and scalable.

You can use Google Container Builder to run your container-image builds in a fast, consistent, and reliable environment on Google Cloud Platform.
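For example, assuming your source directory contains a Dockerfile, a single command builds the image on Container Builder and pushes it to your project’s registry. The project ID and image tag below are placeholders, so adjust them for your own project:

    # Build the container image remotely and push it to gcr.io.
    # "my-project" and "my-app:v1" are illustrative placeholders.
    gcloud container builds submit --tag gcr.io/my-project/my-app:v1 .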

Orchestrate deployment with Kubernetes

Once your apps are running in containers, they still need to be managed. Someone has to deploy them, monitor their health, and scale them to meet demand. You could do that manually, or you can let Kubernetes orchestrate this work for you.

With Kubernetes, you specify the desired deployment state in a .yaml file. Kubernetes then continuously monitors the environment to maintain that state: deploying and scaling your app to meet demand, bin-packing containers efficiently onto servers, and terminating rogue processes.
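As a minimal sketch, a desired-state manifest for a simple web app might look like the following; the app name, image, replica count, and port are illustrative placeholders:

    # deployment.yaml - declares the desired state: which image to run and
    # how many replicas to keep alive (all names here are illustrative).
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: hello-app
    spec:
      replicas: 3                  # Kubernetes keeps three copies running
      selector:
        matchLabels:
          app: hello-app
      template:
        metadata:
          labels:
            app: hello-app
        spec:
          containers:
          - name: hello-app
            image: gcr.io/my-project/my-app:v1
            ports:
            - containerPort: 8080

Applying the file with kubectl apply -f deployment.yaml hands that desired state to the cluster; Kubernetes then creates the three replicas and keeps them running, rescheduling containers if a node fails.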

Kubernetes is open source and backed by a large community of developers working together to improve it. Because it isn’t tied to a single vendor, you can run Kubernetes anywhere: on the cloud provider of your choice, or even in your own datacenter. You’re not locked into any one platform.

Run Kubernetes on Google Cloud Infrastructure

The open-source Kubernetes project grew out of Google internal technologies, which were built to solve exactly these kinds of continuous integration and deployment problems. Google has contributed to the Kubernetes project since the beginning and has deep knowledge of the technology.

Using Kubernetes Engine, you can abstract away the final step of CI/CD: managing the infrastructure that Kubernetes itself runs on. Running your deployment on Kubernetes Engine ensures that you always have as many, or as few, servers as you need to keep your app running optimally and efficiently.
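For example, you can let the cluster itself grow and shrink with demand. The sketch below assumes an existing cluster and its default node pool; the cluster name, zone, and node bounds are illustrative:

    # Let Kubernetes Engine add or remove VMs for the default node pool,
    # keeping between 1 and 5 nodes depending on load (values are illustrative).
    gcloud container clusters update my-cluster \
        --zone us-central1-a \
        --node-pool default-pool \
        --enable-autoscaling --min-nodes 1 --max-nodes 5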

Running your workloads on Google Cloud infrastructure means that you don’t have to worry about datacenter management or outages, and it gives you access to powerful and innovative technology, such as Google’s private, blazingly fast optical fiber network.

You can put that pager away and get your weekend back.

Deploy a Kubernetes Cluster and Update Production Code in Seconds

Now it's your turn. Type commands into the following terminal emulator and learn how to create a Kubernetes cluster on Kubernetes Engine.

Create your first cluster

Now that you know the basics, you’re ready to launch your first Kubernetes Engine cluster on GCP.
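If you prefer to work from your own shell, the following is a rough end-to-end sketch of the same flow using the gcloud and kubectl command-line tools; the cluster name, zone, and image tags are placeholders:

    # 1. Create a small cluster and fetch credentials so kubectl can talk to it.
    gcloud container clusters create my-cluster --zone us-central1-a --num-nodes 3
    gcloud container clusters get-credentials my-cluster --zone us-central1-a

    # 2. Apply the deployment manifest from earlier and expose it behind a load balancer.
    kubectl apply -f deployment.yaml
    kubectl expose deployment hello-app --type LoadBalancer --port 80 --target-port 8080

    # 3. Ship new production code by pointing the deployment at a freshly built image tag.
    kubectl set image deployment/hello-app hello-app=gcr.io/my-project/my-app:v2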
