Cloud Composer 1 is in post-maintenance mode. Google does not release any further updates to Cloud Composer 1, including new Airflow versions, bug fixes, and security updates. We recommend planning your migration to Cloud Composer 2.
This page explains how to connect to the Cloud SQL instance that runs
the Airflow database of your Cloud Composer
environment and how to run SQL queries against it.
For example, you might want to run queries directly on the Airflow database,
make database backups, gather statistics based on the database content, or
retrieve any other custom information from the database.
Before you begin
Run a SQL query on the Airflow database
To connect to the Airflow database:
Create a DAG with one or more PostgresOperator operators. To get started,
you can use the example DAG.
In the sql parameter of the operator, specify your SQL query.
Trigger the DAG. For example, you can trigger it
manually or wait until it runs on a schedule.
Example DAG:
import datetime
import os

import airflow
from airflow.providers.postgres.operators.postgres import PostgresOperator

SQL_DATABASE = os.environ["SQL_DATABASE"]

with airflow.DAG(
    "airflow_db_connection_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:

    PostgresOperator(
        task_id="run_airflow_db_query",
        dag=dag,
        postgres_conn_id="airflow_db",
        database=SQL_DATABASE,
        sql="SELECT * FROM dag LIMIT 10;",
    )
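If you want to process the query results inside the DAG instead of only running a statement, you can read rows through the same airflow_db connection with PostgresHook. The following is a minimal sketch, not part of the original example: the DAG id, task id, and the query over the dag_run table are illustrative only.

import datetime

import airflow
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def count_dag_runs_per_dag():
    # The "airflow_db" connection is preconfigured in Cloud Composer
    # environments and points at the Airflow database.
    hook = PostgresHook(postgres_conn_id="airflow_db")
    # dag_run is a standard table in the Airflow metadata database.
    rows = hook.get_records("SELECT dag_id, COUNT(*) FROM dag_run GROUP BY dag_id;")
    for dag_id, run_count in rows:
        print(f"{dag_id}: {run_count} runs")


with airflow.DAG(
    "airflow_db_hook_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="count_dag_runs_per_dag",
        python_callable=count_dag_runs_per_dag,
    )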
Dump database contents and transfer them to a bucket
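Saving environment snapshots is the recommended way to back up your environment's data, including the Airflow database. If you only need to export individual tables to a Cloud Storage bucket, one possible approach is shown in the sketch below; it uses PostgresToGCSOperator from the Google provider package, and the bucket name and object path are placeholders rather than values from this page.

import datetime

import airflow
from airflow.providers.google.cloud.transfers.postgres_to_gcs import (
    PostgresToGCSOperator,
)

with airflow.DAG(
    "airflow_db_dump_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Export the contents of the dag_run table as a CSV object in a
    # Cloud Storage bucket. Replace the bucket name with your own bucket.
    PostgresToGCSOperator(
        task_id="dump_dag_run_table",
        postgres_conn_id="airflow_db",
        sql="SELECT * FROM dag_run;",
        bucket="your-backup-bucket",  # placeholder bucket name
        filename="airflow-db-dump/dag_run.csv",
        export_format="csv",
    )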
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-03-07 UTC."],[[["This page explains how to connect to and run SQL queries on the Cloud SQL instance that hosts the Airflow database for your Cloud Composer environment."],["While direct access to the Airflow database is possible, it's advised to use alternative methods like the Airflow REST API or CLI commands whenever feasible."],["To connect, create a DAG with `PostgresOperator` operators, specifying the SQL query in the `sql` parameter, and uploading/triggering it."],["It is not recommended to add custom tables or change the schema of the airflow database."],["Backing up the environment's data, including the Airflow database, should be done using snapshots instead of the deprecated database dumping method."]]],[]]