Cloud Composer 1 is in post-maintenance mode. Google does not release any further updates to Cloud Composer 1, including new Airflow versions, bug fixes, and security updates. We recommend planning your migration to Cloud Composer 2.
This page explains how to connect to a Cloud SQL instance that runs
the Airflow database of your Cloud Composer
environment and run SQL queries.
For example, you might want to run queries directly on the Airflow database,
make database backups, gather statistics based on the database content, or
retrieve any other custom information from the database.
Before you begin
Directly accessing the Airflow database is discouraged; where possible, use the Airflow REST API or Airflow CLI commands instead. Avoid adding custom tables to the Airflow database or modifying its schema.
Run a SQL query on the Airflow database
To connect to the Airflow database:
Create a DAG with one or more PostgresOperator operators. To get started,
you can use the example DAG later in this section.
In the sql parameter of the operator, specify your SQL query.
Trigger the DAG. For example, you can trigger it manually or wait until it
runs on a schedule.
Example DAG:
import datetime
import os

import airflow
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Name of the Airflow database, provided by Cloud Composer.
SQL_DATABASE = os.environ["SQL_DATABASE"]

with airflow.DAG(
    "airflow_db_connection_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:

    PostgresOperator(
        task_id="run_airflow_db_query",
        dag=dag,
        # airflow_db is the preconfigured connection to the Airflow database.
        postgres_conn_id="airflow_db",
        database=SQL_DATABASE,
        sql="SELECT * FROM dag LIMIT 10;",
    )
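If you want to fetch the query results and process them in your own task code, for example to gather statistics, you can query the same airflow_db connection through PostgresHook. The following is a minimal sketch, not part of the official example: the DAG ID airflow_db_statistics_example and the statistics query are illustrative, and the hook argument used to select the database may be named schema or database depending on your postgres provider version.

import datetime
import os

import airflow
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

# Name of the Airflow database, provided by Cloud Composer.
SQL_DATABASE = os.environ["SQL_DATABASE"]


def count_dag_runs_by_state():
    # Use the same preconfigured airflow_db connection as the operator example.
    # Depending on the postgres provider version, the database may need to be
    # passed as database= instead of schema= (assumption).
    hook = PostgresHook(postgres_conn_id="airflow_db", schema=SQL_DATABASE)
    # dag_run is a standard table in the Airflow database.
    rows = hook.get_records("SELECT state, COUNT(*) FROM dag_run GROUP BY state;")
    for state, count in rows:
        # Output goes to the task logs.
        print(f"DAG runs in state {state}: {count}")


with airflow.DAG(
    "airflow_db_statistics_example",  # hypothetical DAG ID
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="count_dag_runs_by_state",
        python_callable=count_dag_runs_by_state,
    )

As in the example DAG, keep such queries read-only and do not modify the Airflow database schema.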
Dump database contents and transfer them to a bucket
To back up your environment's data, use environment snapshots instead of dumping the Airflow database.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-03-07 UTC."],[[["This page details how to connect to and run SQL queries on the Airflow database of your Cloud Composer 1 environment."],["Directly accessing the Airflow database is discouraged; the Airflow REST API or Airflow CLI commands are the recommended alternatives."],["Connecting to the Airflow database involves creating and uploading a DAG that utilizes the `PostgresOperator` to specify and run the SQL query."],["Avoid adding custom tables or modifying the schema of the existing Airflow database to prevent complications."],["Backing up the environment's data should be done with snapshots instead of dumping the database."]]],[]]