Each pipeline has a corresponding Dataform repository ID. Each BigQuery pipeline run is logged in [Cloud Logging](/logging/docs) using the corresponding Dataform repository ID. You can use Cloud Monitoring to observe trends in Cloud Logging logs for BigQuery pipeline runs and to receive notifications when conditions that you specify occur.

To receive alerts when a BigQuery pipeline run fails, you can create a log-based alerting policy for the corresponding Dataform repository ID. For instructions, see [Configure alerts for failed workflow invocations](/dataform/docs/monitor-runs#configure-alerts-failed-workflow-invocations).
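A log-based alerting policy is keyed on a Logging filter. As a minimal sketch, the helper below composes such a filter for ERROR-level entries from a single repository; the `resource.type` value and the `repository_id` label name are assumptions for illustration only, not the documented log schema, so verify them against the entries your project actually receives (the linked Dataform guide gives the authoritative filter).

```python
# Sketch: build a Cloud Logging filter that matches failed runs of one
# pipeline, keyed by its Dataform repository ID.
# NOTE: resource.type and the label name below are hypothetical examples,
# not the documented schema -- check your project's actual log entries.

def failed_run_filter(repository_id: str) -> str:
    """Return a Logging query for ERROR-level entries from one repository."""
    return " AND ".join([
        'resource.type="dataform.googleapis.com/Repository"',   # assumed resource type
        f'resource.labels.repository_id="{repository_id}"',     # assumed label name
        "severity>=ERROR",
    ])

print(failed_run_filter("my-pipeline-repo"))
```

The resulting string can be tried in the Logs Explorer query box and, once it matches the entries you expect, used as the condition of a log-based alerting policy.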
To find the Dataform repository ID of your pipeline, do the following:

1. In the Google Cloud console, go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)
2. In the **Explorer** pane, expand your project and the **Pipelines** folder, and then select a pipeline.
3. Click **Settings**.

   The Dataform repository ID of your pipeline is displayed at the bottom of the **Settings** tab.

Delete a pipeline
-----------------

To permanently delete a pipeline, follow these steps:

1. In the Google Cloud console, go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)
2. In the **Explorer** pane, expand your project and the **Pipelines** folder. Find the pipeline that you want to delete.
3. Click more_vert **View actions** next to the pipeline, and then click **Delete**.
4. Click **Delete**.

Manage metadata in Dataplex Universal Catalog
---------------------------------------------

Dataplex Universal Catalog lets you store and manage metadata for pipelines. Pipelines are available in Dataplex Universal Catalog by default, without additional configuration.

You can use Dataplex Universal Catalog to manage pipelines in all [pipeline locations](/bigquery/docs/locations). Managing pipelines in Dataplex Universal Catalog is subject to [Dataplex Universal Catalog quotas and limits](/dataplex/docs/quotas) and [Dataplex Universal Catalog pricing](/dataplex/pricing).

Dataplex Universal Catalog automatically retrieves the following metadata from pipelines:

- Data asset name
- Data asset parent
- Data asset location
- Data asset type
- Corresponding Google Cloud project

Dataplex Universal Catalog logs pipelines as [entries](/dataplex/docs/ingest-custom-sources#entries) with the following entry values:
System entry group
: The [system entry group](/dataplex/docs/ingest-custom-sources#entry-groups) for pipelines is `@dataform`. To view details of pipeline entries in Dataplex Universal Catalog, you need to view the `dataform` system entry group. For instructions about how to view a list of all entries in an entry group, see [View details of an entry group](/dataplex/docs/ingest-custom-sources#entry-group-details) in the Dataplex Universal Catalog documentation.

System entry type
: The [system entry type](/dataplex/docs/ingest-custom-sources#entry-types) for pipelines is `dataform-code-asset`. To view details of pipelines, you need to view the `dataform-code-asset` system entry type, filter the results with an aspect-based filter, and [set the `type` field inside the `dataform-code-asset` aspect to `WORKFLOW`](/dataplex/docs/search-syntax#aspect-search). Then, select an entry of the selected pipeline. For instructions about how to view details of a selected entry type, see [View details of an entry type](/dataplex/docs/ingest-custom-sources#entry-type-details) in the Dataplex Universal Catalog documentation. For instructions about how to view details of a selected entry, see [View details of an entry](/dataplex/docs/search-assets#view-entry-details) in the Dataplex Universal Catalog documentation.

System aspect type
: The [system aspect type](/dataplex/docs/enrich-entries-metadata#aspect-types) for pipelines is `dataform-code-asset`. To provide additional context for pipelines in Dataplex Universal Catalog by annotating data pipeline entries with [aspects](/dataplex/docs/enrich-entries-metadata#aspects), view the `dataform-code-asset` aspect type, filter the results with an aspect-based filter, and [set the `type` field inside the `dataform-code-asset` aspect to `WORKFLOW`](/dataplex/docs/search-syntax#aspect-search). For instructions about how to annotate entries with aspects, see [Manage aspects and enrich metadata](/dataplex/docs/enrich-entries-metadata) in the Dataplex Universal Catalog documentation.

Type
: The type for pipelines is `WORKFLOW`. This type lets you filter pipelines in the `dataform-code-asset` system entry type and the `dataform-code-asset` aspect type by using the `aspect:dataplex-types.global.dataform-code-asset.type=WORKFLOW` query in an [aspect-based filter](/dataplex/docs/search-syntax#aspect-search).
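The entry values above combine into a single aspect-based search query. As a small sketch, the helper below composes that query string; the query itself is taken verbatim from this section, while combining it with a free-text term (such as a pipeline name) is an assumption about how you might narrow results.

```python
# Sketch: compose the aspect-based filter from this section to find
# Dataform pipeline entries in Dataplex Universal Catalog search.
# The aspect filter string is from the doc; the optional free-text
# prefix is an illustrative assumption.

ASPECT_FILTER = "aspect:dataplex-types.global.dataform-code-asset.type=WORKFLOW"

def pipeline_search_query(name_term: str = "") -> str:
    """Return a Dataplex search query matching Dataform pipeline entries."""
    return f"{name_term} {ASPECT_FILTER}".strip() if name_term else ASPECT_FILTER

print(pipeline_search_query("daily_sales"))
# -> daily_sales aspect:dataplex-types.global.dataform-code-asset.type=WORKFLOW
```

The resulting string can be entered in the Dataplex Universal Catalog search box described in the search documentation linked below.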
For instructions about how to search for assets in Dataplex Universal Catalog, see [Search for data assets in Dataplex Universal Catalog](/dataplex/docs/search-assets) in the Dataplex Universal Catalog documentation.

What's next
-----------

- Learn more about [BigQuery pipelines](/bigquery/docs/pipelines-introduction).
- Learn how to [create pipelines](/bigquery/docs/create-pipelines).
- Learn how to [schedule pipelines](/bigquery/docs/schedule-pipelines).

Last updated: 2025-08-26 (UTC)