Use the BashOperator to run a Python 2 script
Run a Python script that is stored in the environment's data folder so that Airflow does not attempt to parse the script as a DAG.
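A minimal sketch of this pattern, assuming Airflow 2 import paths and a hypothetical script name; on Cloud Composer workers the environment's data/ folder is mounted at /home/airflow/gcs/data:

```python
import datetime

from airflow import models
from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator in Airflow 1

with models.DAG(
    'run_python_via_bashoperator',
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=datetime.timedelta(days=1),
) as dag:
    # The script lives in the data/ folder (mounted at /home/airflow/gcs/data),
    # so the DAG processor never tries to parse it as a DAG file.
    run_script = BashOperator(
        task_id='run_script',
        bash_command='python /home/airflow/gcs/data/my_script.py',  # hypothetical script name
    )
```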
Email notification on failure
Configure an Apache Airflow DAG to notify by using email on failure in Cloud Composer.
View in documentation
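A minimal sketch, assuming the environment is already configured to send email (for example through SendGrid); the recipient address and DAG name are placeholders:

```python
import datetime

from airflow import models
from airflow.operators.bash import BashOperator

default_args = {
    'email': ['oncall@example.com'],  # placeholder recipient
    'email_on_failure': True,         # send a notification when a task fails
    'email_on_retry': False,
}

with models.DAG(
    'notify_on_failure',
    default_args=default_args,
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=datetime.timedelta(days=1),
) as dag:
    # A task that always fails, to demonstrate the notification.
    always_fails = BashOperator(task_id='always_fails', bash_command='exit 1')
```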
Explicitly using the default Airflow connection
Run a task while explicitly configuring Composer to use the default Airflow connection.
View in documentation
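A sketch of naming the default connection explicitly, assuming the Google provider package is installed; BigQueryInsertJobOperator is one operator that accepts a gcp_conn_id argument:

```python
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Inside a DAG definition: explicitly select the connection Composer creates by default.
query_default_conn = BigQueryInsertJobOperator(
    task_id='query_with_default_connection',
    gcp_conn_id='google_cloud_default',
    configuration={
        'query': {
            'query': 'SELECT 1',
            'useLegacySql': False,
        }
    },
)
```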
Get Cloud Storage location for DAGs in your environment
Retrieve the Cloud Storage bucket and directory for DAGs in your Cloud Composer environment.
View in documentation
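A sketch using the Cloud Composer client library (google-cloud-orchestration-airflow); the project, region, and environment names below are placeholders:

```python
from google.cloud.orchestration.airflow import service_v1


def get_dags_prefix(project_id: str, region: str, environment: str) -> str:
    """Return the gs://<bucket>/dags prefix of a Composer environment."""
    client = service_v1.EnvironmentsClient()
    name = f'projects/{project_id}/locations/{region}/environments/{environment}'
    env = client.get_environment(name=name)
    return env.config.dag_gcs_prefix


print(get_dags_prefix('my-project', 'us-central1', 'my-environment'))  # placeholders
```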
Get environment client ID
Get the client ID of a Composer environment's Identity-Aware Proxy.
View in documentation
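A sketch of the usual approach: request the Airflow web UI without credentials and read the OAuth client ID from the Identity-Aware Proxy redirect; airflow_uri is the environment's web server URL:

```python
from urllib.parse import parse_qs, urlparse

import requests


def get_client_id(airflow_uri: str) -> str:
    # IAP redirects unauthenticated requests to accounts.google.com with
    # the OAuth client_id in the query string of the Location header.
    response = requests.get(airflow_uri, allow_redirects=False)
    redirect_location = response.headers['Location']
    query = parse_qs(urlparse(redirect_location).query)
    return query['client_id'][0]
```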
Implicitly using the default Airflow connection
Run a task without setting the Airflow connection. The default connection will be used.
View in documentation
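The same query as in the explicit example above, but with no connection argument at all; the operator falls back to google_cloud_default (a sketch assuming the Google provider package):

```python
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Inside a DAG definition: no gcp_conn_id is given, so the default
# google_cloud_default connection is used implicitly.
query_implicit_conn = BigQueryInsertJobOperator(
    task_id='query_with_implicit_connection',
    configuration={
        'query': {
            'query': 'SELECT 1',
            'useLegacySql': False,
        }
    },
)
```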
Set Python version of a single DAG
Use the PythonVirtualenvOperator to select an explicit Python version for a DAG.
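A sketch, assuming Airflow 2 import paths and that the requested interpreter is available on the workers; the version string is a placeholder:

```python
from airflow.operators.python import PythonVirtualenvOperator


def print_python_version():
    # Runs inside the virtualenv, so this reports the selected interpreter.
    import sys
    print(sys.version)


# Inside a DAG definition:
run_in_venv = PythonVirtualenvOperator(
    task_id='run_in_specific_python',
    python_callable=print_python_version,
    python_version='3.8',  # placeholder: the Python version to run the callable under
    requirements=[],       # extra pip packages for the virtualenv, if any
)
```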
Use local dependencies
Use local dependencies in an Apache Airflow DAG running on Cloud Composer.
View in documentation
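A sketch, assuming a hypothetical helper module uploaded next to the DAG (for example dags/dependencies/my_helpers.py); because the dags/ folder is on the Python path, the package can be imported directly:

```python
import datetime

from airflow import models
from airflow.operators.python import PythonOperator

# Hypothetical local module stored at dags/dependencies/my_helpers.py.
from dependencies import my_helpers

with models.DAG(
    'use_local_dependencies',
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=datetime.timedelta(days=1),
) as dag:
    call_helper = PythonOperator(
        task_id='call_helper',
        python_callable=my_helpers.do_work,  # hypothetical helper function
    )
```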
Use the BashOperator to run the BigQuery bq command
Use the BashOperator in an Apache Airflow DAG to call the BigQuery bq command.
View in documentation
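A sketch of the task; the bq CLI is preinstalled on Composer workers, and the query here targets a BigQuery public dataset:

```python
from airflow.operators.bash import BashOperator

# Inside a DAG definition: shell out to the bq CLI.
run_bq_query = BashOperator(
    task_id='run_bq_query',
    bash_command=(
        'bq query --use_legacy_sql=false '
        "'SELECT COUNT(*) FROM `bigquery-public-data.samples.shakespeare`'"
    ),
)
```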
Use the BigQueryOperator
Use the BigQueryOperator in an Apache Airflow DAG to run a BigQuery query from Cloud Composer.
View in documentation
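A sketch using the older contrib import path that matches the operator name in this sample's title (newer provider releases rename the operator); the destination dataset and table are placeholders:

```python
from airflow.contrib.operators import bigquery_operator  # Airflow 1 path for BigQueryOperator

# Inside a DAG definition: run a standard-SQL query and write the result
# to a destination table.
run_query = bigquery_operator.BigQueryOperator(
    task_id='run_bigquery_query',
    sql=(
        'SELECT name, SUM(number) AS total '
        'FROM `bigquery-public-data.usa_names.usa_1910_current` '
        'GROUP BY name'
    ),
    use_legacy_sql=False,
    destination_dataset_table='my_dataset.name_totals',  # placeholder destination
)
```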
Use the EmailOperator
Use the EmailOperator in an Apache Airflow DAG to send an email confirmation from Cloud Composer.
View in documentation
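A sketch, assuming Airflow 2 import paths and an environment that can already send email; the recipient and wording are placeholders:

```python
from airflow.operators.email import EmailOperator  # airflow.operators.email_operator in Airflow 1

# Inside a DAG definition, typically as the final task:
send_confirmation = EmailOperator(
    task_id='send_confirmation_email',
    to='recipient@example.com',  # placeholder recipient
    subject='Composer DAG run complete',
    html_content='All upstream tasks finished successfully.',
)
```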
Using a custom Airflow connection
Run a task using a custom Airflow connection that you have previously created.
View in documentation
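A sketch mirroring the default-connection example above, but naming a connection you created yourself; my_gcp_connection is a placeholder connection ID:

```python
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Inside a DAG definition: reference a previously created connection by its ID.
query_custom_conn = BigQueryInsertJobOperator(
    task_id='query_with_custom_connection',
    gcp_conn_id='my_gcp_connection',  # placeholder: your custom connection ID
    configuration={
        'query': {
            'query': 'SELECT 1',
            'useLegacySql': False,
        }
    },
)
```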