Fully managed workflow orchestration
Cloud Composer's managed nature and Apache Airflow compatibility allow you to focus on authoring, scheduling, and monitoring your workflows rather than provisioning resources.
Integrates with other Google Cloud products
End-to-end integration with Google Cloud products including BigQuery, Dataflow, Dataproc, Datastore, Cloud Storage, Pub/Sub, and AI Platform gives users the ability to orchestrate their pipelines from end to end.
Supports hybrid and multi-cloud
Author, schedule, and monitor your workflows through a single orchestration tool—whether your pipeline lives on-premises, in multiple clouds, or fully within Google Cloud.
Hybrid and multi-cloud
Ease your transition to the cloud or maintain a hybrid data environment by orchestrating workflows that cross between on-premises and the public cloud. Create workflows that connect data, processing, and services across clouds to give you a unified data environment.
Cloud Composer is built on Apache Airflow, an open source project to which Google contributes. This gives customers freedom from lock-in and portability, along with integration with a broad number of platforms that will only expand as the Airflow community grows.
Cloud Composer pipelines are configured as directed acyclic graphs (DAGs) using Python, making them approachable for anyone with Python experience. One-click deployment yields instant access to a rich library of connectors and multiple graphical representations of your workflow in action, making troubleshooting easy. Automatic synchronization of your DAGs ensures your jobs stay on schedule.
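As a minimal sketch of what such a DAG file looks like (the DAG ID, schedule, task names, and commands below are illustrative placeholders, not a prescribed pipeline; this assumes a standard Apache Airflow 2.x installation):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def summarize():
    # Hypothetical final step; replace with your own logic.
    print("pipeline finished")


# Files like this are synced to the environment's Cloud Storage bucket,
# and Cloud Composer picks them up automatically.
with DAG(
    dag_id="example_daily_pipeline",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    report = PythonOperator(task_id="report", python_callable=summarize)

    # The >> operator declares the dependency edge of the graph:
    # "extract" must complete before "report" runs.
    extract >> report
```

Because the workflow is plain Python, tasks and dependencies can be generated dynamically, for example in a loop over a list of tables.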
Learn from customers using Cloud Composer
Overview of Cloud Composer
Find an overview of a Cloud Composer environment and the Google Cloud products used for an Apache Airflow deployment.
Automating infrastructure with Cloud Composer
Learn how to schedule automated backups of Compute Engine virtual machine (VM) instances.
Set up a CI/CD pipeline for your data-processing workflow
Discover how to set up a continuous integration/continuous deployment (CI/CD) pipeline for processing data with managed products on Google Cloud.
Private IP Cloud Composer environment
Find information on using a private IP Cloud Composer environment.
Writing DAGs (workflows)
Find out how to write an Apache Airflow directed acyclic graph (DAG) that runs in a Cloud Composer environment.
Qwiklab: Data engineering on Google Cloud
This four-day, instructor-led class gives participants a hands-on introduction to designing and building data pipelines on Google Cloud.
| Feature | Description |
| --- | --- |
| Multi-cloud | Create workflows that connect data, processing, and services across clouds, giving you a unified data environment. |
| Open source | Cloud Composer is built upon Apache Airflow, giving users freedom from lock-in and portability. |
| Hybrid | Ease your transition to the cloud or maintain a hybrid data environment by orchestrating workflows that cross between on-premises and the public cloud. |
| Integrated | Built-in integration with BigQuery, Dataflow, Dataproc, Datastore, Cloud Storage, Pub/Sub, AI Platform, and more, giving you the ability to orchestrate end-to-end Google Cloud workloads. |
| Python programming language | Leverage existing Python skills to dynamically author and schedule workflows within Cloud Composer. |
| Reliability | Increase reliability of your workflows through easy-to-use charts for monitoring and troubleshooting the root cause of an issue. |
| Fully managed | Cloud Composer's managed nature allows you to focus on authoring, scheduling, and monitoring your workflows as opposed to provisioning resources. |
| Networking and security | During environment creation, Cloud Composer provides the following configuration options: an environment with a route-based GKE cluster (default); a Private IP environment; an environment with a VPC-native GKE cluster using alias IP addresses; and Shared VPC. |
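As an illustrative sketch of selecting one of these options at creation time (the environment name and location below are placeholders; available flags vary by Composer version, so check the `gcloud composer environments create` reference for your version):

```shell
# Create a Cloud Composer environment whose GKE nodes have only
# private IP addresses. Private IP environments require a
# VPC-native cluster, hence the alias IP flag.
gcloud composer environments create example-private-env \
    --location us-central1 \
    --enable-private-environment \
    --enable-ip-alias
```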