# Access the Airflow database

On September 15, 2026, all Cloud Composer 1 and all Cloud Composer 2 version 2.0.x environments will reach their planned end of life and can no longer be used. We recommend planning your migration to Cloud Composer 3.
This page explains how to connect to a Cloud SQL instance that runs the [Airflow database](/composer/docs/composer-3/environment-architecture#airflow-database) of your Cloud Composer environment and how to run SQL queries.
For example, you might want to run queries directly on the Airflow database, make database backups, gather statistics based on the database content, or retrieve any other custom information from the database.

**Important:** We recommend avoiding direct access to the Airflow database whenever it is possible to use other approaches such as the [Airflow REST API](/composer/docs/composer-3/access-airflow-api) or [Airflow CLI commands](/composer/docs/composer-3/access-airflow-cli) instead.
Before you begin
----------------

**Warning:** Don't add your own custom tables to the Airflow database and don't change its schema. Don't add users or databases to the Cloud SQL instance that hosts the Airflow database.
Run a SQL query on the Airflow database
---------------------------------------
To connect to the Airflow database:
1. Create a DAG with one or more `SQLExecuteQueryOperator` operators. To get started, you can use the example DAG shown after these steps.

   **Caution:** Your SQL query might run more than once because of the DAG schedule and catchup. If you want to run the SQL query only once, set `schedule_interval` to `None`, set `catchup` to `False`, and then [trigger the DAG manually](/composer/docs/composer-3/schedule-and-trigger-dags#manually).
2. In the `sql` parameter of the operator, specify your SQL query.
3. [Upload](/composer/docs/composer-3/manage-dags#add) this DAG to your environment.

4. Trigger the DAG. For example, you can trigger it [manually](/composer/docs/composer-3/schedule-and-trigger-dags#manually) or wait until it runs on a schedule.
Example DAG:
    import datetime
    import os

    import airflow
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

    # Name of the Airflow database, read from an environment variable.
    SQL_DATABASE = os.environ["SQL_DATABASE"]

    with airflow.DAG(
        "airflow_db_connection_example",
        start_date=datetime.datetime(2025, 1, 1),
        schedule_interval=None,
        catchup=False) as dag:

        # Runs the query over the airflow_db connection, which points to
        # the Airflow database of the environment.
        SQLExecuteQueryOperator(
            task_id="run_airflow_db_query",
            dag=dag,
            conn_id="airflow_db",
            database=SQL_DATABASE,
            sql="SELECT * FROM dag LIMIT 10;",
        )
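By default, the operator pushes the rows it fetches to XCom, so a downstream task can consume the query results. As a minimal sketch of the statistics use case mentioned earlier, the following tasks could extend the example DAG above; the `count_dag_runs_by_state` task and the `print_stats` callable are hypothetical names used for illustration, and the query assumes the standard `dag_run` table of the Airflow metadata schema:

    from airflow.operators.python import PythonOperator

    # Hypothetical tasks, meant to live inside the `with airflow.DAG(...)`
    # block of the example above.
    count_runs = SQLExecuteQueryOperator(
        task_id="count_dag_runs_by_state",
        dag=dag,
        conn_id="airflow_db",
        database=SQL_DATABASE,
        # dag_run is a standard table of the Airflow metadata schema.
        sql="SELECT state, COUNT(*) FROM dag_run GROUP BY state;",
    )

    def print_stats(ti):
        # SQLExecuteQueryOperator pushes the fetched rows to XCom
        # (do_xcom_push defaults to True), so pull them here.
        for state, count in ti.xcom_pull(task_ids="count_dag_runs_by_state"):
            print(f"{state}: {count}")

    count_runs >> PythonOperator(
        task_id="print_dag_run_stats",
        dag=dag,
        python_callable=print_stats,
    )

As with the main example, keep such queries small: the same database serves the Airflow scheduler, so heavy queries can affect your environment.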
[[["Leicht verständlich","easyToUnderstand","thumb-up"],["Mein Problem wurde gelöst","solvedMyProblem","thumb-up"],["Sonstiges","otherUp","thumb-up"]],[["Schwer verständlich","hardToUnderstand","thumb-down"],["Informationen oder Beispielcode falsch","incorrectInformationOrSampleCode","thumb-down"],["Benötigte Informationen/Beispiele nicht gefunden","missingTheInformationSamplesINeed","thumb-down"],["Problem mit der Übersetzung","translationIssue","thumb-down"],["Sonstiges","otherDown","thumb-down"]],["Zuletzt aktualisiert: 2025-08-29 (UTC)."],[[["\u003cp\u003eThis page explains how to connect to and run SQL queries on the Cloud SQL instance that hosts the Airflow database for your Cloud Composer environment.\u003c/p\u003e\n"],["\u003cp\u003eWhile direct access to the Airflow database is possible, it's advised to use alternative methods like the Airflow REST API or CLI commands whenever feasible.\u003c/p\u003e\n"],["\u003cp\u003eTo connect, create a DAG with \u003ccode\u003ePostgresOperator\u003c/code\u003e operators, specifying the SQL query in the \u003ccode\u003esql\u003c/code\u003e parameter, and uploading/triggering it.\u003c/p\u003e\n"],["\u003cp\u003eIt is not recommended to add custom tables or change the schema of the airflow database.\u003c/p\u003e\n"],["\u003cp\u003eBacking up the environment's data, including the Airflow database, should be done using snapshots instead of the deprecated database dumping method.\u003c/p\u003e\n"]]],[],null,["# Access the Airflow database\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n**Cloud Composer 3** \\| [Cloud Composer 2](/composer/docs/composer-2/access-airflow-database \"View this page for Cloud Composer 2\") \\| [Cloud Composer 1](/composer/docs/composer-1/access-airflow-database \"View this page for Cloud Composer 1\")\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page explains how to connect to a Cloud SQL instance that runs\nthe [Airflow database](/composer/docs/composer-3/environment-architecture#airflow-database) of your Cloud Composer\nenvironment and run SQL queries.\n\nFor example, you might want to run queries directly on the Airflow database,\nmake database backups, gather statistics based on the database content, or\nretrieve any other custom information from the database.\n| **Important:** We recommend to avoid directly accessing the Airflow database, if it is possible to use other approaches such as [Airflow REST API](/composer/docs/composer-3/access-airflow-api) or [Airflow CLI commands](/composer/docs/composer-3/access-airflow-cli) instead.\n\nBefore you begin\n----------------\n\n| **Warning:** Don't add your own custom tables to the Airflow database and don't change the schema of the Airflow database. Don't add users or databases to the Cloud SQL instance that hosts the Airflow database.\n\nRun a SQL query on the Airflow database\n---------------------------------------\n\nTo connect to the Airflow database:\n\n1. Create a DAG with one or more SQLExecuteQueryOperator operators. To get\n started, you can use the example DAG.\n\n | **Caution:** Your **SQL query might run more than once** because of the DAG schedule and catchup. If you want to run the SQL query only once, set `schedule_interval` to `None`, `catchup` to `False`, and then [trigger the DAG manually](/composer/docs/composer-3/schedule-and-trigger-dags#manually).\n2. In the `sql` parameter of the operator, specify your SQL query.\n\n3. 
For more information about using the `SQLExecuteQueryOperator`, see the [How-to Guide for Postgres using SQLExecuteQueryOperator](https://airflow.apache.org/docs/apache-airflow-providers-postgres/stable/operators.html) in the Airflow documentation.

Dump database contents and transfer them to a bucket
----------------------------------------------------

**Deprecated:** This approach is deprecated. Instead, use [snapshots](/composer/docs/composer-3/save-load-snapshots) to back up the environment's data, including the Airflow database contents.

What's next
-----------

- [Access Airflow REST API](/composer/docs/composer-3/access-airflow-api)
- [Run Airflow CLI commands](/composer/docs/composer-3/access-airflow-cli)