# Schedule Airflow DAGs

This document describes how to schedule [Airflow directed acyclic graphs (DAGs)](/composer/docs/composer-3/composer-overview#about-airflow) from [Cloud Composer 3](/composer/docs/composer-3/composer-overview) on the **Scheduling** page in BigQuery, including how to trigger DAGs manually and how to view the history and logs of past DAG runs.

**Preview:** This feature is in the Pre-GA (Preview) stage and is available "as is" with limited support.

## About managing Airflow DAGs in BigQuery

The **Scheduling** page in BigQuery provides tools to schedule Airflow DAGs that run in your Cloud Composer 3 environments.

Airflow DAGs that you schedule in BigQuery are executed in one or more Cloud Composer environments in your project. The **Scheduling** page in BigQuery combines information for all Airflow DAGs in your project.

During a DAG run, Airflow schedules and executes the individual tasks that make up a DAG, in the sequence defined by the DAG. On the **Scheduling** page in BigQuery, you can view the statuses of past DAG runs, explore detailed logs of all DAG runs and their tasks, and view details about DAGs.

**Note:** You can't manage Cloud Composer environments in BigQuery. To manage environments, for example, to create an environment, install dependencies for your DAG files, or upload, delete, or change individual DAGs, use Cloud Composer.

To learn more about Airflow's core concepts, such as Airflow DAGs, DAG runs, tasks, or operators, see the [Core Concepts](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/index.html) page in the Airflow documentation.

To learn more about Cloud Composer environments, see the [Cloud Composer 3 overview](/composer/docs/composer-3/composer-overview) page in the Cloud Composer documentation.
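For orientation, the following is a minimal sketch of the kind of DAG file that can appear on the **Scheduling** page. The DAG ID, schedule, and task commands are placeholders, not values from this product; adapt them to your own workflow.

```python
# A minimal Airflow DAG sketch. The DAG ID, schedule, and task commands
# are placeholder values; Airflow runs the two tasks in the order defined below.
import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_scheduled_dag",            # DAG ID that identifies this DAG in Airflow
    schedule="0 6 * * *",                      # run daily at 06:00 UTC
    start_date=datetime.datetime(2025, 1, 1),
    catchup=False,
) as dag:
    # Individual tasks that Airflow schedules and executes during a DAG run.
    extract = BashOperator(task_id="extract", bash_command="echo 'extract data'")
    load = BashOperator(task_id="load", bash_command="echo 'load data'")

    # Define the task sequence: extract runs before load.
    extract >> load
```

Each execution of a DAG like this one appears as a separate DAG run, with logs available per task.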
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-08-26(UTC)"],[[["\u003cp\u003eThe Scheduling page in BigQuery allows users to manage and schedule Airflow DAGs running in Cloud Composer 3 environments within their Google Cloud project.\u003c/p\u003e\n"],["\u003cp\u003eUsers can manually trigger DAGs, view past run statuses and detailed logs, and examine DAG details, including visualizations and source code, all from the Scheduling page.\u003c/p\u003e\n"],["\u003cp\u003eBefore using the Scheduling features, users must enable the Cloud Composer API and ensure they have at least one Cloud Composer 3 environment with an uploaded DAG.\u003c/p\u003e\n"],["\u003cp\u003eSpecific IAM roles are required to view, trigger, and pause Airflow DAGs, such as Environment and Storage Object Viewer and Environment and Storage Object User, which can be granted by an administrator.\u003c/p\u003e\n"],["\u003cp\u003eThis feature is in "Pre-GA" (Preview) stage, meaning it's available "as is" with limited support, and users can provide feedback or request assistance via email.\u003c/p\u003e\n"]]],[],null,["Schedule Airflow DAGs\n\nThis document describes how to schedule\n[Airflow directed acyclic graphs (DAGs)](/composer/docs/composer-3/composer-overview#about-airflow)\nfrom\n[Cloud Composer 3](/composer/docs/composer-3/composer-overview) on the\n**Scheduling** page in BigQuery, including how to trigger DAGs\nmanually, and how to view the history and logs of past DAG runs.\n\nAbout managing Airflow DAGs in BigQuery\n\nThe **Scheduling** page in BigQuery provides tools to\nschedule Airflow DAGs that run in your Cloud Composer 3 environments.\n\nAirflow DAGs that you schedule in BigQuery are executed in\none or more Cloud Composer environments in your project. The\n**Scheduling** page in BigQuery combines information for\nall Airflow DAGs in your project.\n\nDuring a DAG run, Airflow schedules and executes individual tasks that make up\na DAG in a sequence defined by the DAG. On the **Scheduling** page in\nBigQuery, you can view statuses of past DAG runs, explore\ndetailed logs of all DAG runs and all tasks from these DAG runs, and view\ndetails about DAGs.\n| **Note:** You can't manage Cloud Composer environments in BigQuery. To manage environments, for example, to create an environment, install dependencies for your DAG files, upload, delete, or change individual DAGs, you use Cloud Composer.\n\nTo learn more about Airflow's core concepts such as Airflow DAGs, DAG runs,\ntasks, or operators, see the\n[Core Concepts](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/index.html)\npage in the Airflow documentation.\n\nTo learn more about Cloud Composer environments, see the\n[Cloud Composer 3 overview](/composer/docs/composer-3/composer-overview) page\nin the Cloud Composer documentation.\n\nBefore you begin\n\n1.\n\n\n Enable the Cloud Composer API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=composer.googleapis.com)\n2. 
## Required permissions

To get the permissions that you need to schedule Airflow DAGs, ask your administrator to grant you the following IAM roles on the project:

- To view Airflow DAGs and their details: [Environment and Storage Object Viewer](/iam/docs/roles-permissions/composer#composer.environmentAndStorageObjectViewer) (`roles/composer.environmentAndStorageObjectViewer`)
- To trigger and pause Airflow DAGs: [Environment and Storage Object User](/iam/docs/roles-permissions/composer#composer.environmentAndStorageObjectUser) (`roles/composer.environmentAndStorageObjectUser`)

For more information about granting roles, see [Manage access to projects, folders, and organizations](/iam/docs/granting-changing-revoking-access).

These predefined roles contain the permissions required to schedule Airflow DAGs:

- To view Airflow DAGs and their details: `composer.dags.list`, `composer.environments.list`
- To trigger and pause Airflow DAGs: `composer.dags.list`, `composer.environments.list`, `composer.dags.execute`

You might also be able to get these permissions with [custom roles](/iam/docs/creating-custom-roles) or other [predefined roles](/iam/docs/roles-overview#predefined).

For more information about Cloud Composer 3 IAM, see [Access control with IAM](/composer/docs/composer-3/access-control) in the Cloud Composer documentation.

## Manually trigger an Airflow DAG

When you manually trigger an Airflow DAG, Airflow runs the DAG once, independently from the schedule specified for the DAG.

To manually trigger a selected Airflow DAG, follow these steps:

1. In the Google Cloud console, go to the **Scheduling** page.

   [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)

2. Do either of the following:

   - Click the name of the selected DAG, and then on the **DAG details** page, click **Trigger DAG**.
   - In the row that contains the selected DAG, click **View actions** in the **Actions** column, and then click **Trigger DAG**.
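One possible programmatic alternative to the console flow above uses the Airflow 2 stable REST API. This is a sketch that assumes your environment's Airflow web server URL is reachable with Application Default Credentials; the URL and DAG ID are placeholders, and the supported way to access the Airflow REST API for your environment is described in the Cloud Composer documentation.

```python
# Sketch: trigger a DAG run through the Airflow 2 stable REST API.
# The web server URL and DAG ID below are placeholders; authentication assumes
# Application Default Credentials with access to the environment.
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholder values for illustration only.
WEB_SERVER_URL = "https://example-airflow-web-server-url"
DAG_ID = "example_scheduled_dag"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# POST /api/v1/dags/{dag_id}/dagRuns creates a new DAG run, independent of the
# DAG's schedule, similar to clicking "Trigger DAG" on the Scheduling page.
response = session.post(
    f"{WEB_SERVER_URL}/api/v1/dags/{DAG_ID}/dagRuns",
    json={"conf": {}},
)
response.raise_for_status()
print(response.json())
```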
## View Airflow DAG run logs and details

To view details of a selected Airflow DAG, follow these steps:

1. In the Google Cloud console, go to the **Scheduling** page.

   [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)

2. Click the name of the selected DAG.

3. On the **DAG details** page, select the **Details** tab.

4. To view past DAG runs, select the **Runs** tab.

   1. Optional: The **Runs** tab displays DAG runs from the last 10 days by default. To filter DAG runs by a different time range, in the **10 days** drop-down menu, select a time range, and then click **OK**.
   2. Optional: To display additional columns with DAG run details in the list of all DAG runs, click **Column display options**, select columns, and then click **OK**.
   3. To view details and logs for a selected DAG run, select a DAG run.

5. To view a visualization of the DAG with task dependencies, select the **Diagram** tab.

   1. To view task details, select a task on the diagram.

6. To view the source code of the DAG, select the **Code** tab.

7. Optional: To refresh the displayed data, click **Refresh**.

## View all Airflow DAGs

To view Airflow DAGs from all Cloud Composer 3 environments in your Google Cloud project, follow these steps:

1. In the Google Cloud console, go to the **Scheduling** page.

   [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)

2. Optional: To display additional columns with DAG details, click **Column display options**, select columns, and then click **OK**.

## Pause an Airflow DAG

To pause a selected Airflow DAG, follow these steps:

1. In the Google Cloud console, go to the **Scheduling** page.

   [Go to the **Scheduling** page](https://console.cloud.google.com/bigquery/orchestration)

2. Do either of the following:

   - Click the name of the selected DAG, and then on the **DAG details** page, click **Pause DAG**.
   - In the row that contains the selected DAG, click **View actions** in the **Actions** column, and then click **Pause DAG**.

## Troubleshooting

For instructions to troubleshoot Airflow DAGs, see [Troubleshooting Airflow DAGs](/composer/docs/composer-3/troubleshooting-dags) in the Cloud Composer documentation.

## What's next

- Learn more about [writing Airflow DAGs](/composer/docs/composer-3/write-dags).
- Learn more about [Airflow in Cloud Composer 3](/composer/docs/composer-3/composer-overview#about-airflow).