[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-08-17 (世界標準時間)。"],[],[],null,["Introduction to BigQuery pipelines\n\nYou can use BigQuery pipelines to automate and streamline your\nBigQuery data processes. With pipelines, you can schedule and\nexecute code assets in sequence to improve efficiency and reduce manual effort.\n\nOverview\n\nPipelines are powered by [Dataform](/dataform/docs/overview).\n\nA pipeline consists of one or more of the following code assets:\n\n- [Notebooks](/bigquery/docs/notebooks-introduction)\n- [SQL queries](/bigquery/docs/reference/standard-sql/query-syntax)\n- [Data preparations](/bigquery/docs/data-prep-introduction)\n\nYou can use pipelines to schedule the execution of code assets. For example,\nyou can schedule a SQL query to run daily and update a table with the most\nrecent source data, which can then power a dashboard.\n\nIn a pipeline with multiple code assets, you define the execution sequence.\nFor example, to train a machine learning model, you can create a workflow in\nwhich a SQL query prepares data, and then a subsequent notebook trains the\nmodel using that data.\n\nCapabilities\n\nYou can do the following in a pipeline:\n\n- [Create new or import existing](/bigquery/docs/create-pipelines#add_a_pipeline_task) SQL queries or notebooks into a pipeline.\n- [Schedule a pipeline](/bigquery/docs/schedule-pipelines) to automatically run at a specified time and frequency.\n- [Share a pipeline](/bigquery/docs/create-pipelines#share_a_pipeline) with users or groups you specify.\n- [Share a link to a pipeline](/bigquery/docs/create-pipelines#share_a_link_to_a_pipeline).\n\nLimitations\n\nPipelines are subject to the following limitations:\n\n- Pipelines are available only in the Google Cloud console.\n- You can't change the region for storing a pipeline after it is created.\n- You can grant users or groups access to a selected pipeline, but you can't grant them access to individual tasks within the pipeline.\n\nSet the default region for code assets\n\nIf this is the first time you are creating a code asset, you should set the\ndefault region for code assets. You can't change the region for a code asset\nafter it is created.\n| **Note:** If you create a pipeline and choose a different default region than the one you have been using for code assets---for example, choosing `us-west1` when you have been using `us-central1`---then that pipeline and all code assets you create afterwards use that new region by default. Existing code assets continue to use the region they were assigned when they were created.\n\nAll code assets in BigQuery Studio use the same default region.\nTo set the default region for code assets, follow these steps:\n\n1. Go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. In the **Explorer** pane, find the project in which you have enabled code\n assets.\n\n3. Click more_vert\n **View actions** next to the project, and then click\n **Change my default code region**.\n\n4. For **Region**, select the region that you want to use for code assets.\n\n5. 
## Capabilities

You can do the following in a pipeline:

- [Create new or import existing](/bigquery/docs/create-pipelines#add_a_pipeline_task) SQL queries or notebooks into a pipeline.
- [Schedule a pipeline](/bigquery/docs/schedule-pipelines) to automatically run at a specified time and frequency.
- [Share a pipeline](/bigquery/docs/create-pipelines#share_a_pipeline) with users or groups you specify.
- [Share a link to a pipeline](/bigquery/docs/create-pipelines#share_a_link_to_a_pipeline).

## Limitations

Pipelines are subject to the following limitations:

- Pipelines are available only in the Google Cloud console.
- You can't change the region for storing a pipeline after it is created.
- You can grant users or groups access to a selected pipeline, but you can't grant them access to individual tasks within the pipeline.

## Set the default region for code assets

If this is the first time you are creating a code asset, you should set the
default region for code assets. You can't change the region for a code asset
after it is created.

> **Note:** If you create a pipeline and choose a default region that is
> different from the one you have been using for code assets (for example,
> choosing `us-west1` when you have been using `us-central1`), that pipeline
> and all code assets you create afterward use the new region by default.
> Existing code assets continue to use the region they were assigned when
> they were created.

All code assets in BigQuery Studio use the same default region.
To set the default region for code assets, follow these steps:

1. Go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)

2. In the **Explorer** pane, find the project in which you have enabled code
   assets.

3. Click **View actions** next to the project, and then click
   **Change my default code region**.

4. For **Region**, select the region that you want to use for code assets.

5. Click **Select**.

For a list of supported regions, see [BigQuery Studio locations](/bigquery/docs/locations#bqstudio-loc).

## Supported regions

All code assets are stored in your
[default region for code assets](/bigquery/docs/enable-assets#set_the_default_region_for_code_assets).
Updating the default region changes the region for all code assets created
after that point.

The following table lists the regions where pipelines are available:

| | Region description | Region name | Details |
|---|--------------------|---------------------------|----------------------------------------------------------|
| **Africa** | | | |
| | Johannesburg | `africa-south1` | |
| **Americas** | | | |
| | Columbus | `us-east5` | |
| | Dallas | `us-south1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Iowa | `us-central1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Las Vegas | `us-west4` | |
| | Los Angeles | `us-west2` | |
| | Montréal | `northamerica-northeast1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | N. Virginia | `us-east4` | |
| | Oregon | `us-west1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | São Paulo | `southamerica-east1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | South Carolina | `us-east1` | |
| **Asia Pacific** | | | |
| | Hong Kong | `asia-east2` | |
| | Jakarta | `asia-southeast2` | |
| | Mumbai | `asia-south1` | |
| | Seoul | `asia-northeast3` | |
| | Singapore | `asia-southeast1` | |
| | Sydney | `australia-southeast1` | |
| | Taiwan | `asia-east1` | |
| | Tokyo | `asia-northeast1` | |
| **Europe** | | | |
| | Belgium | `europe-west1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Frankfurt | `europe-west3` | |
| | London | `europe-west2` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Madrid | `europe-southwest1` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Netherlands | `europe-west4` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| | Turin | `europe-west12` | |
| | Zürich | `europe-west6` | [Low CO~2~](/sustainability/region-carbon#region-picker) |
| **Middle East** | | | |
| | Dammam | `me-central2` | |
| | Doha | `me-central1` | |

## Quotas and limits

BigQuery pipelines are subject to
[Dataform quotas and limits](/dataform/docs/quotas).

## Pricing

The execution of BigQuery pipeline tasks incurs compute and storage
charges in BigQuery. For more information, see
[BigQuery pricing](/bigquery/pricing).

Pipelines containing notebooks incur Colab Enterprise runtime charges
based on the
[default machine type](/colab/docs/runtimes#default_runtime_specifications).
For pricing details, see [Colab Enterprise pricing](/colab/pricing).

Each BigQuery pipeline run is logged using
[Cloud Logging](/logging/docs). Logging is automatically
enabled for BigQuery pipeline runs, which can incur
Cloud Logging billing charges. For more information, see
[Cloud Logging pricing](/logging/pricing).

## What's next

- Learn how to [create pipelines](/bigquery/docs/create-pipelines).
- Learn how to [manage pipelines](/bigquery/docs/manage-pipelines).
- Learn how to [schedule pipelines](/bigquery/docs/schedule-pipelines).