# Export query results to Blob Storage

This document describes how to export the result of a query that runs against a [BigLake table](/bigquery/docs/biglake-intro) to your Azure Blob Storage.

For information about how data flows between BigQuery and Azure Blob Storage, see [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).
Limitations
-----------
For a full list of limitations that apply to BigLake tables based on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).
Before you begin
----------------
Ensure that you have the following resources:
- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection). Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission (see the sketch after this list).
- A [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).
- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).
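If you manage Azure roles with the Azure CLI, the following is a minimal sketch of a custom role that carries this permission. The role name, description, and subscription scope are placeholders, not values from this guide:

```bash
# Minimal sketch: a custom Azure role that grants container write access.
# "BigQueryOmniExportWriter" and the subscription ID are placeholders.
az role definition create --role-definition '{
  "Name": "BigQueryOmniExportWriter",
  "IsCustom": true,
  "Description": "Write access for BigQuery Omni export targets",
  "Actions": ["Microsoft.Storage/storageAccounts/blobServices/containers/write"],
  "NotActions": [],
  "AssignableScopes": ["/subscriptions/00000000-0000-0000-0000-000000000000"]
}'
```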
Export query results
--------------------

BigQuery Omni writes to the specified Blob Storage location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Blob Storage container.
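One way to confirm that the target container is empty is to list its blobs with the Azure CLI; an empty listing means the export cannot overwrite anything. The account and container names below are the same placeholders used later in this guide:

```bash
# List blobs in the target container; an empty result means nothing
# there can be overwritten or mixed with the exported files.
az storage blob list \
  --auth-mode login \
  --account-name AZURE_STORAGE_ACCOUNT_NAME \
  --container-name CONTAINER_NAME \
  --output table
```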
1. In the Google Cloud console, go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)

2. In the **Query editor** field, enter a GoogleSQL export query:

   ```sql
   EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
   OPTIONS(
     uri="azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
     format="FORMAT"
   )
   AS QUERY
   ```

   Replace the following:

   - `CONNECTION_REGION`: the region where the connection was created.
   - `CONNECTION_NAME`: the connection name that you created with the necessary permission to write to the container.
   - `AZURE_STORAGE_ACCOUNT_NAME`: the name of the Blob Storage account to which you want to write the query result.
   - `CONTAINER_NAME`: the name of the container to which you want to write the query result.
   - `FILE_PATH`: the path where you want to write the exported files. The leaf directory of the path string must contain exactly one wildcard (`*`), for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.
   - `FORMAT`: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.
   - `QUERY`: the query to analyze the data that is stored in a BigLake table.

   **Note:** To override the default project, use the `--project_id=PROJECT_ID` parameter. Replace `PROJECT_ID` with the ID of your Google Cloud project.
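For concreteness, here is what a filled-in export might look like when run through the bq CLI instead of the console. All names in this sketch (the connection, storage account, container, dataset, and table) are hypothetical:

```bash
# Hypothetical example: export Parquet files from a BigLake table to Blob Storage.
# The connection, storage account, container, dataset, and table names are made up.
bq query --nouse_legacy_sql '
EXPORT DATA WITH CONNECTION `azure-eastus2.my_azure_connection`
OPTIONS(
  uri="azure://mystorageaccount.blob.core.windows.net/my-container/exports/result-*",
  format="PARQUET"
)
AS SELECT customer_id, order_total
   FROM mydataset.my_biglake_table
   WHERE order_date >= "2024-01-01"
'
```

If BigQuery splits this result into two files, the container would then hold `exports/result-000000000000` and `exports/result-000000000001`.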
Troubleshooting
---------------

If you get an error related to `quota failure`, check whether you have reserved capacity for your queries. For more information about slot reservations, see [Before you begin](#before_you_begin) in this document.
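As a sketch, reserved capacity can be created and assigned with the bq CLI; the project ID, Omni region, reservation name, and slot count below are illustrative assumptions, not values from this guide:

```bash
# Illustrative: create a 100-slot reservation in an Omni region and
# assign the project to it. All names and sizes are placeholders.
bq mk --reservation \
  --project_id=my-project \
  --location=azure-eastus2 \
  --slots=100 \
  my_reservation

bq mk --reservation_assignment \
  --reservation_id=my-project:azure-eastus2.my_reservation \
  --job_type=QUERY \
  --assignee_type=PROJECT \
  --assignee_id=my-project
```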
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-09-04(UTC)"],[[["\u003cp\u003eThis guide outlines the process of exporting query results from a BigLake table to Azure Blob Storage.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, you must establish a connection to your Blob Storage with appropriate write permissions and have a BigLake table in place.\u003c/p\u003e\n"],["\u003cp\u003eThe export process involves using a specific GoogleSQL \u003ccode\u003eEXPORT DATA\u003c/code\u003e query with connection details, target URI, desired format, and the source query.\u003c/p\u003e\n"],["\u003cp\u003eIt is recommended to export query results to an empty Blob Storage container, because the export query can overwrite or mix with existing data.\u003c/p\u003e\n"],["\u003cp\u003eIf you are receiving \u003ccode\u003equota failure\u003c/code\u003e error, check if you have reserved capacity for your queries.\u003c/p\u003e\n"]]],[],null,["# Export query results to Blob Storage\n====================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAzure Blob Storage.\n\nFor information about how data flows between BigQuery and\nAzure Blob Storage,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection). Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission.\n- An [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified Blob Storage location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. We recommend that you export the query result to an empty\nBlob Storage container.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. 
In the **Query editor** field, enter a GoogleSQL export query:\n\n ```bash\n EXPORT DATA WITH CONNECTION \\`CONNECTION_REGION.CONNECTION_NAME\\`\n OPTIONS(\n uri=\"azure://\u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e.blob.core.windows.net/\u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e/*\",\n format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\"\n )\n AS QUERY\n ```\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the container.\n - \u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e: the name of the Blob Storage account to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e: the name of the container to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table.\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Blob Storage](/bigquery/docs/query-azure-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]