Create environments
Configure networking
-
Configure private IP networking
Configure your project networking for private IP Cloud Composer environments.
-
Configure shared VPC networking
Configure host and service project networking for Cloud Composer environments.
-
Configure VPC Service Controls
Create environments in a VPC Service Controls perimeter.
-
Configure large-scale networks for Cloud Composer environments
Organize large-scale networks for Cloud Composer environments.
-
Configure privately used public IP ranges
Use privately used public IP ranges in your environments.
-
Enable the IP Masquerade agent
Use IP masquerading with your environment.
Configure environments
-
Set environment variables
Set environment variables that are available to the Apache Airflow scheduler, worker, and web server processes.
-
Override Airflow configuration options
Override Airflow configuration options to adjust the Airflow instance to your needs and requirements.
-
Scale environments
Change scale and performance parameters of your environment.
-
Manage Airflow connections
Store the connection information that Airflow uses to communicate with other APIs, such as Google Cloud projects, other cloud providers, or third-party services.
-
Enable and disable DAG serialization
Configure the Airflow scheduler to process DAG files before they are sent to the web server.
-
Specify maintenance windows
Configure time windows when Cloud Composer can perform maintenance for your environment.
-
Configure email notifications
Configure SMTP services for your environment.
-
Manage environment labels and break down environment costs
Assign labels to your environments and then break down billing costs based on these labels.
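Several of the configuration tasks above (setting environment variables, overriding Airflow configuration options, and managing labels) can be performed with the gcloud CLI. A minimal sketch, assuming an existing environment; `example-environment`, `us-central1`, and all key-value pairs are placeholder values:

```shell
# Set or update environment variables available to Airflow processes.
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-env-variables=BUCKET_NAME=my-data-bucket,ENV_TIER=staging

# Override an Airflow configuration option ([core] section shown here).
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-airflow-configs=core-dags_are_paused_at_creation=True

# Assign labels for cost breakdown in billing reports.
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-labels=team=data-platform,env=staging
```

Each `update` command triggers an environment update operation, so changes can take several minutes to apply.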
Configure security and access control
-
Access control with IAM
Assign roles and permissions for user and service accounts that work with Cloud Composer.
-
Airflow UI Access Control
Use Airflow UI Access Control (Airflow Role-Based Access Control, or Airflow RBAC) to assign permissions for actions available within the Airflow UI.
-
Use customer-managed encryption keys
Encrypt data in your environment with customer-managed encryption keys (CMEK).
-
Configure resource location restrictions
Configure resource location restrictions so that Cloud Composer stores your data within the locations you specify.
-
Configure Secret Manager
Store sensitive data in Secret Manager and access it from your environment.
Update, upgrade, and delete environments
-
Update environments
Change the configuration of your environment.
-
Upgrade environments
Upgrade your environment to a later version of Cloud Composer and Airflow.
-
Delete environments
Delete your environment.
-
Save and load environment snapshots
Save and load the state of your environment using environment snapshots.
-
Clean up the Airflow database
Remove old entries from the Airflow database to reduce its size.
-
Migrate environments to Airflow 2
Transfer DAGs, data, and configuration from your existing Airflow 1.10.* environments to Airflow 2 environments.
-
Migrate environments to Cloud Composer 2 (from Airflow 2)
Transfer DAGs, data, and configuration from your existing Cloud Composer 1 environment with Airflow 2 to a Cloud Composer 2 environment.
-
Migrate environments to Cloud Composer 2 (from Airflow 2) using snapshots
Transfer DAGs, data, and configuration from your existing Cloud Composer 1 environment with Airflow 2 to a Cloud Composer 2 environment using environment snapshots.
-
Migrate environments to Cloud Composer 2 (from Airflow 1)
Transfer DAGs, data, and configuration from your existing Cloud Composer 1 environment with Airflow 1 to a Cloud Composer 2 environment.
-
Migrate environments to Cloud Composer 2 (from Airflow 1) using snapshots
Transfer DAGs, data, and configuration from your existing Cloud Composer 1 environment with Airflow 1 to a Cloud Composer 2 environment using environment snapshots.
Access environments
Manage DAGs
-
Write DAGs
Write DAG definition files.
-
Add and update DAGs
Upload DAGs to your environment's bucket.
-
Trigger DAGs
Trigger DAGs on schedule, manually, and using other methods.
-
Import operators from backport provider packages
Use newer versions of operators in your Airflow 1.10.* environments by importing them from backport operator packages.
-
Use KubernetesPodOperator
Run pods in your environment's cluster using KubernetesPodOperator.
-
Use the GKE operators
Manage clusters and run pods in them using GKE operators.
-
Group tasks inside DAGs
Group tasks together in your DAGs using different methods provided by Airflow.
-
Trigger DAGs with Cloud Functions
Trigger DAGs in your environment using Cloud Functions.
-
Install custom plugins
Install custom plugins into your Cloud Composer environment.
-
Install Python dependencies
Install Python packages in your environment.
-
Test DAGs
Check your DAGs for errors, and update and test deployed DAGs.
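As a quick illustration of the "Trigger DAGs" entry above, a DAG can be triggered manually through the environment's Airflow CLI wrapper. A sketch assuming Airflow 2 and placeholder names (`example-environment`, `example_dag`):

```shell
# Run the Airflow CLI "dags trigger" command inside the environment
# (arguments after "--" are passed to the Airflow CLI).
gcloud composer environments run example-environment \
    --location us-central1 \
    dags trigger -- example_dag
```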
Monitor environments
-
View Airflow logs
Access and view the Airflow logs for your environment.
-
View audit logs
Enable and view audit logs for Cloud Composer.
-
Monitoring environments
Access and use the monitoring dashboard of your environment.
-
Monitor environments with Cloud Monitoring
View metrics of your environment in Cloud Monitoring and access environment logs in Cloud Logging.