[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-03。"],[[["\u003cp\u003eBefore deploying DAGs to production, use Airflow CLI sub-commands to parse and test DAG code within the same execution context.\u003c/p\u003e\n"],["\u003cp\u003eCreate a dedicated test directory (e.g., \u003ccode\u003e/data/test\u003c/code\u003e) in your environment's bucket to store and test DAGs separately from production DAGs.\u003c/p\u003e\n"],["\u003cp\u003eUtilize the \u003ccode\u003egcloud composer environments run\u003c/code\u003e command with the appropriate subcommands (\u003ccode\u003edags list\u003c/code\u003e or \u003ccode\u003etasks test\u003c/code\u003e) to check for syntax and task-specific errors in your DAGs.\u003c/p\u003e\n"],["\u003cp\u003eMaintain distinct production and test environments, as Airflow does not offer strong DAG isolation, to prevent interference between test and production DAGs.\u003c/p\u003e\n"],["\u003cp\u003eWhen collaborating, each DAG contributor can have their subdirectory in the \u003ccode\u003edata/\u003c/code\u003e folder for development, using the \u003ccode\u003e--subdir\u003c/code\u003e flag with the test commands to run DAGs in their respective directories.\u003c/p\u003e\n"]]],[],null,["\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/test-dags \"View this page for Cloud Composer 3\") \\| [Cloud Composer 2](/composer/docs/composer-2/test-dags \"View this page for Cloud Composer 2\") \\| **Cloud Composer 1**\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nBefore deploying DAGs to production, you can\n[execute Airflow CLI sub-commands](https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html)\nto parse DAG code in the same context under which the DAG is executed.\n| **Note:** Because Apache Airflow does not provide strong DAG isolation, we recommend that you maintain separate production and test environments to prevent DAG interference.\n\nTesting during DAG creation\n\nYou can run a single task instance locally and view the log output.\nViewing the output enables you to check for syntax and task errors.\nTesting locally does not check dependencies or communicate status\nto the database.\n\nWe recommend that you put the DAGs in a `data/test` folder\nin your test environment.\n\nCreate a test directory\n\nIn your environment's bucket, create a test directory and copy your DAGs to it. 
## Create a test directory

In your environment's bucket, create a test directory and copy your DAGs to it:

    gcloud storage cp BUCKET_NAME/dags \
      BUCKET_NAME/data/test --recursive

Replace the following:

- `BUCKET_NAME`: the name of the bucket associated with your Cloud Composer environment.

Example:

    gcloud storage cp gs://us-central1-example-environment-a12bc345-bucket/dags \
      gs://us-central1-example-environment-a12bc345-bucket/data/test --recursive

For more information about uploading DAGs, see
[Add and update DAGs](/composer/docs/composer-1/manage-dags).

## Check for syntax errors

To check for syntax errors in DAGs that you uploaded to the `/data/test`
folder, enter the following `gcloud` command:

**Airflow 2**

    gcloud composer environments run \
      ENVIRONMENT_NAME \
      --location ENVIRONMENT_LOCATION \
      dags list -- --subdir /home/airflow/gcs/data/test

**Airflow 1**

    gcloud composer environments run \
      ENVIRONMENT_NAME \
      --location ENVIRONMENT_LOCATION \
      list_dags -- -sd /home/airflow/gcs/data/test

Replace the following:

- `ENVIRONMENT_NAME`: the name of the environment.
- `ENVIRONMENT_LOCATION`: the region where the environment is located.

## Check for task errors

To check for task-specific errors in DAGs that you uploaded to the `/data/test`
folder, run the following `gcloud` command:

**Airflow 2**

    gcloud composer environments run \
      ENVIRONMENT_NAME \
      --location ENVIRONMENT_LOCATION \
      tasks test -- --subdir /home/airflow/gcs/data/test \
      DAG_ID TASK_ID \
      DAG_EXECUTION_DATE

**Airflow 1**

    gcloud composer environments run \
      ENVIRONMENT_NAME \
      --location ENVIRONMENT_LOCATION \
      test -- -sd /home/airflow/gcs/data/test DAG_ID \
      TASK_ID DAG_EXECUTION_DATE

Replace the following:

- `ENVIRONMENT_NAME`: the name of the environment.
- `ENVIRONMENT_LOCATION`: the region where the environment is located.
- `DAG_ID`: the ID of the DAG.
- `TASK_ID`: the ID of the task.
- `DAG_EXECUTION_DATE`: the execution date of the DAG. This date is used for templating purposes. Regardless of the date you specify here, the DAG runs immediately.

Example:

**Airflow 2**

    gcloud composer environments run \
      example-environment \
      --location us-central1 \
      tasks test -- --subdir /home/airflow/gcs/data/test \
      hello_world print_date 2021-04-22

**Airflow 1**

    gcloud composer environments run example-environment \
      --location us-central1 \
      test -- -sd /home/airflow/gcs/data/test \
      hello_world print_date 2021-04-22
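The preceding commands parse and run your DAGs inside the environment. If you also want to catch import errors before you upload files to the bucket, one option (not specific to Cloud Composer) is to parse the DAG files locally with Airflow's Python API. The following is a minimal sketch, assuming that a matching version of Apache Airflow is installed in a local virtual environment and that the DAG files to check are in a local `dags/` directory:

```python
# check_dag_imports.py - parse local DAG files and report import errors.
# Assumes `pip install apache-airflow` (matching your environment's version)
# and that the DAG files to check are in ./dags.
from airflow.models import DagBag

dag_bag = DagBag(dag_folder="dags", include_examples=False)

if dag_bag.import_errors:
    # import_errors maps each failing file path to its parse error.
    for file_path, error in dag_bag.import_errors.items():
        print(f"{file_path}: {error}")
    raise SystemExit(1)

print(f"Parsed {len(dag_bag.dags)} DAGs without import errors.")
```

A local parse only confirms that the files import cleanly; it does not replace running the `tasks test` command against the environment before you deploy.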
## Updating and testing a deployed DAG

To test updates to your DAGs in your test environment:

1. Copy the deployed DAG that you want to update to `data/test`.
2. Update the DAG.
3. Test the DAG.
   1. [Check for syntax errors](#syntax).
   2. [Check for task-specific errors](#task-error).
4. Make sure the DAG runs successfully.
5. Turn off the DAG in your test environment.
   1. Go to the Airflow UI > DAGs page.
   2. If the DAG you're modifying runs constantly, turn off the DAG.
   3. To expedite outstanding tasks, click the task and **Mark Success**.
6. Deploy the DAG to your production environment.
   1. Turn off the DAG in your production environment.
   2. [Upload the updated DAG](/composer/docs/composer-1/manage-dags#adding) to the `dags/` folder in your production environment.

## FAQs for testing DAGs

### How do I isolate DAG runs in my production and test environments?

Airflow has a global repository of source code in the `dags/` folder that all
DAG runs share. You might want to update source code in production or test
without interfering with running DAGs.

Airflow does not provide strong DAG isolation. We recommend that you maintain
separate production and test Cloud Composer environments to prevent your test
DAGs from interfering with your production DAGs.

### How do I avoid DAG interference when I run integration tests from different GitHub branches?

Use unique task names to prevent interference. For example, you can prefix
your task IDs with the branch name.

### What is a best practice for integration testing with Airflow?

We recommend that you use a dedicated environment for integration testing with
Airflow. One way to signal that a DAG run succeeded is to write to a file in a
Cloud Storage folder and then check the file's content in your own integration
test cases.
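As a sketch of that pattern, the last task of the DAG under test could write a marker object to Cloud Storage, and the integration test could read it back. The bucket name `example-test-bucket` and marker path `integration/run_success.txt` below are hypothetical placeholders, and the test assumes the `google-cloud-storage` and `pytest` packages are installed:

```python
# test_dag_marker.py - minimal integration check for a success marker in Cloud Storage.
# Assumes the DAG under test writes the marker object when it finishes successfully.
from google.cloud import storage

# Hypothetical names; replace with your own bucket and marker path.
BUCKET_NAME = "example-test-bucket"
MARKER_PATH = "integration/run_success.txt"


def test_dag_wrote_success_marker():
    client = storage.Client()
    blob = client.bucket(BUCKET_NAME).blob(MARKER_PATH)

    # The DAG is expected to have written the marker before this test runs.
    assert blob.exists(), f"Marker {MARKER_PATH} not found in {BUCKET_NAME}"
    assert "success" in blob.download_as_text().lower()
```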
### How do I collaborate efficiently with other DAG contributors?

Each contributor can have a subdirectory in the `data/` folder for development.

DAGs added to the `data/` folder are not picked up automatically by the
Airflow scheduler or web server.

DAG contributors can run their DAGs manually by using
the `gcloud composer environments run` command and the `test` sub-command
with the `--subdir` flag to specify the contributor's development directory.

For example:

**Airflow 2**

    gcloud composer environments run test-environment-name \
      tasks test -- dag-id task-id execution-date \
      --subdir /home/airflow/gcs/data/alice_dev

**Airflow 1**

    gcloud composer environments run test-environment-name \
      test -- dag-id task-id execution-date \
      --subdir /home/airflow/gcs/data/alice_dev

### How do I keep my development and production environments in sync?

To manage access:

- For authentication, use
  [service accounts](/composer/docs/composer-1/access-control#service-account).
- For access control, use Identity and Access Management and Cloud Composer
  [roles and permissions](/composer/docs/composer-1/access-control#user-account).

To deploy from development to production:

- Ensure consistent configuration, such as environment variables and PyPI
  packages.
- Ensure consistent DAG arguments. To avoid hard-coding, we recommend that you
  use Airflow macros and variables.

  For example:

  **Airflow 2**

        gcloud composer environments run test-environment-name \
          variables set -- DATA_ENDPOINT_KEY DATA_ENDPOINT_VALUE

  **Airflow 1**

        gcloud composer environments run test-environment-name \
          variables -- --set DATA_ENDPOINT_KEY DATA_ENDPOINT_VALUE

## What's next

- [Troubleshooting DAGs](/composer/docs/composer-1/troubleshooting-dags)
- [Adding and Updating DAGs](/composer/docs/composer-1/manage-dags)
- [Test, synchronize, and deploy your DAGs using version control](/composer/docs/composer-1/dag-cicd-github)