For a full list of the limitations that apply to BigLake tables based on Amazon S3 and Blob Storage, see Limitations.
Before you begin
Ensure that you have the following resources:

- A connection to access your Blob Storage. Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the Microsoft.Storage/storageAccounts/blobServices/containers/write permission.
- A Blob Storage BigLake table.
- If you are on the capacity-based pricing model, make sure that you have enabled the BigQuery Reservation API for your project.
Export query results

BigQuery Omni writes to the specified Blob Storage location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Blob Storage container.
In the Google Cloud console, go to the BigQuery page.
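In the Query editor field, enter a GoogleSQL export query:

```sql
EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
OPTIONS(
  uri = "azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
  format = "FORMAT"
)
AS QUERY
```

Replace the following: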
- CONNECTION_REGION: the region where the connection was created.
- CONNECTION_NAME: the name of the connection that you created with the necessary permission to write to the container.
- AZURE_STORAGE_ACCOUNT_NAME: the name of the Blob Storage account to which you want to write the query result.
- CONTAINER_NAME: the name of the container to which you want to write the query result.
- FILE_PATH: the path where you want to write the exported file. It must contain exactly one wildcard * anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then * in the first file's name is replaced by 000000000000, and * in the second file's name is replaced by 000000000001. See the worked example after this list.
- FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
- QUERY: the query to analyze the data that is stored in a BigLake table.
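As a worked example, the following query sketches what a filled-in export might look like; the connection, storage account, container, and table names are hypothetical, not values from this guide:

```sql
EXPORT DATA WITH CONNECTION `azure-eastus2.my-connection`
OPTIONS(
  -- Hypothetical account, container, and path; the * wildcard sits in the leaf directory.
  uri = "azure://myaccount.blob.core.windows.net/my-container/exports/result-*.parquet",
  format = "PARQUET"
)
AS SELECT id, name FROM mydataset.my_biglake_table;
```

If BigQuery splits this output into two files, they are named result-000000000000.parquet and result-000000000001.parquet.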
Troubleshooting
If you get an error related to quota failure, check whether you have reserved capacity for your queries. For more information about slot reservations, see the Before you begin section of this document.
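As a minimal sketch of reserving capacity, BigQuery's reservation DDL can create a reservation and route a project's queries to it; the project names, location, slot count, and edition below are illustrative assumptions, not values from this guide:

```sql
-- Create a reservation (hypothetical admin project, location, slot count, and edition).
CREATE RESERVATION `admin-project.region-us.my-reservation`
OPTIONS (slot_capacity = 100, edition = 'ENTERPRISE');

-- Assign a project's query jobs to that reservation (hypothetical project name).
CREATE ASSIGNMENT `admin-project.region-us.my-reservation.my-assignment`
OPTIONS (assignee = 'projects/my-project', job_type = 'QUERY');
```

For BigQuery Omni, create the reservation in the Omni region where your data lives, such as azure-eastus2.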
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Difficile da capire","hardToUnderstand","thumb-down"],["Informazioni o codice di esempio errati","incorrectInformationOrSampleCode","thumb-down"],["Mancano le informazioni o gli esempi di cui ho bisogno","missingTheInformationSamplesINeed","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Altra","otherDown","thumb-down"]],["Ultimo aggiornamento 2025-09-04 UTC."],[[["\u003cp\u003eThis guide outlines the process of exporting query results from a BigLake table to Azure Blob Storage.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, you must establish a connection to your Blob Storage with appropriate write permissions and have a BigLake table in place.\u003c/p\u003e\n"],["\u003cp\u003eThe export process involves using a specific GoogleSQL \u003ccode\u003eEXPORT DATA\u003c/code\u003e query with connection details, target URI, desired format, and the source query.\u003c/p\u003e\n"],["\u003cp\u003eIt is recommended to export query results to an empty Blob Storage container, because the export query can overwrite or mix with existing data.\u003c/p\u003e\n"],["\u003cp\u003eIf you are receiving \u003ccode\u003equota failure\u003c/code\u003e error, check if you have reserved capacity for your queries.\u003c/p\u003e\n"]]],[],null,["# Export query results to Blob Storage\n====================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAzure Blob Storage.\n\nFor information about how data flows between BigQuery and\nAzure Blob Storage,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection). Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission.\n- An [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified Blob Storage location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. We recommend that you export the query result to an empty\nBlob Storage container.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. 
In the **Query editor** field, enter a GoogleSQL export query:\n\n ```bash\n EXPORT DATA WITH CONNECTION \\`CONNECTION_REGION.CONNECTION_NAME\\`\n OPTIONS(\n uri=\"azure://\u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e.blob.core.windows.net/\u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e/*\",\n format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\"\n )\n AS QUERY\n ```\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the container.\n - \u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e: the name of the Blob Storage account to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e: the name of the container to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table.\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Blob Storage](/bigquery/docs/query-azure-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]