[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-08-29。"],[[["\u003cp\u003eAirflow connections store credentials and connection details, allowing DAGs to interact with Google Cloud and other services, with most operators using connections instead of direct credentials.\u003c/p\u003e\n"],["\u003cp\u003eCloud Composer generates a unique fernet key for securing connection extras, and users are advised to store connections in Secret Manager for enhanced security instead of directly in Airflow or DAGs.\u003c/p\u003e\n"],["\u003cp\u003eAirflow supports various connection types to interface with specific services, and new types can be added by installing relevant PyPI packages, with some common packages already preinstalled.\u003c/p\u003e\n"],["\u003cp\u003eCloud Composer preconfigures default connections like \u003ccode\u003egoogle_cloud_default\u003c/code\u003e, \u003ccode\u003ebigquery_default\u003c/code\u003e, and others, for immediate access to resources within the project.\u003c/p\u003e\n"],["\u003cp\u003eConnections can be stored in Secret Manager using JSON or URI formats, or they can be added to Airflow directly using the Airflow CLI or UI, with connections in Secret Manager taking priority.\u003c/p\u003e\n"]]],[],null,["\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n**Cloud Composer 3** \\| [Cloud Composer 2](/composer/docs/composer-2/manage-airflow-connections \"View this page for Cloud Composer 2\") \\| [Cloud Composer 1](/composer/docs/composer-1/manage-airflow-connections \"View this page for Cloud Composer 1\")\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page describes how to manage [Airflow connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html)\nin your environment and access them from your DAGs.\n\nAbout Airflow connections\n\nAiflow connections store credentials and other connection information, such as\nuser names, connections strings, and passwords. Your DAGs use connections to\ncommunicate and access resources in Google Cloud and other services\nfrom your DAGs.\n\nAirflow operators in your DAGs either use a default connection for the\noperator, or you specify a custom connection name.\n\nAbout connection security\n\nMost Airflow operators do not accept credentials directly. Instead, they use\nAirflow connections.\n\nWhen you create a new environment, Cloud Composer generates a\nunique, permanent fernet key for the environment and secures connection extras\nby default. 
## About connection security

Most Airflow operators do not accept credentials directly. Instead, they use Airflow connections.

When you create a new environment, Cloud Composer generates a unique, permanent fernet key for the environment and secures connection extras by default. You can view the `fernet_key` on the **Configuration** page in the [Airflow UI](/composer/docs/composer-3/access-airflow-web-interface).

For more information about how connections and passwords are secured in Airflow, see [Securing Connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#securing-connections) and [Masking sensitive data](https://airflow.apache.org/docs/apache-airflow/stable/security/secrets/mask-sensitive-values.html) in the Airflow documentation.

> **Important:** In Cloud Composer, we recommend that you **store your connections in Secret Manager** instead of storing them in Airflow or directly in your DAGs.

## About connection types

Airflow uses connections of different types to connect to specific services. For example, the [Google Cloud connection type](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/connections/gcp.html) connects to other services in Google Cloud, and the S3 connection type connects to an Amazon S3 bucket.

To add a connection type to Airflow, [install a PyPI package](/composer/docs/composer-3/install-python-dependencies) that provides that connection type. Some packages are preinstalled in your environment. For example, you can use connections from the `apache-airflow-providers-google` package without installing custom PyPI packages.

## Preconfigured connections

Cloud Composer configures the following default connections in your environment. You can use these connections to access resources in your project without configuring them:

- `google_cloud_default`
- `bigquery_default`
- `google_cloud_datastore_default`
- `google_cloud_storage_default`

## Add a connection in Secret Manager

You can store a connection in Secret Manager without adding it to Airflow. We recommend this approach for storing credentials and other sensitive information.

> **Important:** Your DAGs prioritize connections stored in Secret Manager over connections stored in Airflow.

To add a connection in Secret Manager:

1. [Configure Secret Manager for your environment](/composer/docs/composer-3/configure-secret-manager).

2. [Add a secret](/secret-manager/docs/creating-and-accessing-secrets) with a name that matches the pattern for connections (a sketch of creating such a secret with the gcloud CLI follows these steps).

    For example: `airflow-connections-example_connection`. In your DAGs, use the connection name without the prefix: `example_connection`.

3. Add parameters for the connection in one of the following formats:

    **JSON format**

    Add the JSON representation of your connection as the value of the secret. For example:

    ```json
    {
        "conn_type": "mysql",
        "host": "example.com",
        "login": "login",
        "password": "password",
        "port": 9000
    }
    ```

    For more information about the JSON connection format, see the [Airflow documentation](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html).

    **URI format**

    Add the URI representation of your connection as the value of the secret:

    - The secret must store a [URI representation](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#generating-connection-uri) of the connection. For example: `mysql://login:password@example.com:9000`.

    - The URI must be URL-encoded. For example, a password that contains a space must be URL-encoded as follows: `mysql://login:secret%20password@example.com:9000`.

    Airflow has a [convenience method](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#generating-connection-uri) for generating connection URIs (see the sketch after these steps). An example of how to encode a complex URL with JSON extras is available in the [Airflow documentation](https://airflow.apache.org/docs/apache-airflow-providers-mysql/stable/connections/mysql.html).

4. Check that all connection parameters are [correctly read from Secret Manager](#check).
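As a minimal sketch of step 2, you might create the secret with the Google Cloud CLI. The file name `connection.json` is illustrative and is assumed to contain the JSON shown in step 3:

```bash
# Create a secret whose name follows the airflow-connections- pattern
# and store the connection definition as its first version.
gcloud secrets create airflow-connections-example_connection \
    --data-file="connection.json"
```

For the URI format, the convenience method mentioned in step 3 is `Connection.get_uri()`. A minimal sketch, with illustrative values:

```python
from airflow.models.connection import Connection

# get_uri() URL-encodes special characters, such as the space
# in this password, when generating the URI representation.
connection = Connection(
    conn_type="mysql",
    host="example.com",
    login="login",
    password="secret password",
    port=9000,
)
print(connection.get_uri())
# mysql://login:secret%20password@example.com:9000
```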
## Add a connection in Airflow

As an alternative to storing your connections in Secret Manager, you can store them in Airflow.

To add a connection in Airflow:

### Airflow CLI

Run the [`connections add`](https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#connections) Airflow CLI command with Google Cloud CLI. For example:

```bash
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    connections add -- \
    --conn-type "mysql" \
    --conn-host "example.com" \
    --conn-port "9000" \
    --conn-login "login" \
    --conn-password "password" \
    example_connection
```

You can also use the `--conn-uri` argument:

```bash
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    connections add -- \
    --conn-uri "mysql://login:password@example.com:9000" \
    example_connection
```

Replace the following:

- `ENVIRONMENT_NAME`: the name of your environment.
- `LOCATION`: the region where the environment is located.

### Airflow UI

Follow the [Airflow documentation on creating connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#creating-a-connection-with-the-ui).

## Check that Airflow correctly reads a connection

You can run the `connections get` Airflow CLI command through Google Cloud CLI to check that a connection is read correctly. For example, if you store a connection in Secret Manager, this provides a way to check whether Airflow reads all parameters of the connection from the secret.

> **Warning:** **Airflow prints connection passwords in the command's output.** This happens both for connections stored in Airflow and in Secret Manager.

```bash
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    connections get \
    -- CONNECTION_NAME
```

Replace the following:

- `ENVIRONMENT_NAME`: the name of your environment.
- `LOCATION`: the region where the environment is located.
- `CONNECTION_NAME`: the name of the connection. If your connection is stored in Secret Manager, use the connection name without the connection prefix. For example, specify `example_connection` instead of `airflow-connections-example_connection`.

Example:

```bash
gcloud composer environments run example-environment \
    --location us-central1 \
    connections get \
    -- example_connection -o json
```
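As a complement to the CLI check, you can also inspect a connection from Python code running in your environment. `BaseHook.get_connection` is standard Airflow API; `example_connection` is the illustrative name used above:

```python
from airflow.hooks.base import BaseHook

# Airflow resolves the connection through configured secrets backends
# first (for example, Secret Manager), then the Airflow database.
connection = BaseHook.get_connection("example_connection")

# Print non-sensitive fields only; avoid logging the password.
print(connection.conn_type, connection.host, connection.port)
```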
## Use Airflow connections in your DAGs

This section shows how to access your connections from a DAG.

### Use a Secret Manager connection

Use the name of the connection without the prefix. For example, if your secret is named `airflow-connections-aws_s3`, specify `aws_s3`:

```python
transfer_dir_from_s3 = S3ToGCSOperator(
    task_id='transfer_dir_from_s3',
    aws_conn_id='aws_s3',
    prefix='data-for-gcs',
    bucket='example-s3-bucket-transfer-operators',
    dest_gcs='gs://us-central1-example-environ-361f4221-bucket/data/from-s3/')
```

If you store a default connection in Secret Manager, you can omit the connection name. See the Airflow documentation for a specific operator to get the default connection name that the operator uses. For example, the `S3ToGCSOperator` Airflow operator uses the `aws_default` connection by default. You can store this default connection in a secret named `airflow-connections-aws_default`.

### Use a connection stored in Airflow

Use the name of the connection as it is defined in Airflow:

```python
transfer_dir_from_s3 = S3ToGCSOperator(
    task_id='transfer_dir_from_s3',
    aws_conn_id='aws_s3',
    prefix='data-for-gcs',
    bucket='example-s3-bucket-transfer-operators',
    dest_gcs='gs://us-central1-example-environ-361f4221-bucket/data/from-s3/')
```

To use the default connection for an operator, omit the connection name. See the Airflow documentation for a specific operator to get the default connection name that the operator uses. For example, the `S3ToGCSOperator` Airflow operator uses the `aws_default` connection by default.

## Troubleshooting

If your environment cannot access the secret stored in Secret Manager:

1. Make sure that Secret Manager is configured in your environment.

2. Check that the connection name in Secret Manager corresponds to the connection used by Airflow. For example, for a connection named `example_connection`, the secret name is `airflow-connections-example_connection`.

3. Check that Airflow [correctly reads the connection](#check).

## What's next

- [Configure Secret Manager](/composer/docs/composer-3/configure-secret-manager)
- [Configure email notifications](/composer/docs/composer-3/configure-email)
- [Write DAGs](/composer/docs/composer-3/write-dags)