
Export query results to Azure Storage

This document describes how to export the result of a query that runs against a BigLake table to your Azure Blob Storage.

Before you begin

  • Ensure that you have a connection that can access your Azure Blob Storage. Within the connection, you must create a policy for the Azure Storage container path that you want to export to. Then, within that policy, create a role that has the Microsoft.Storage/storageAccounts/blobServices/containers/write permission. For an illustrative role definition, see the sketch that follows.

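If you define that role as an Azure custom role, the following sketch shows roughly what the definition could look like. It is illustrative only: the role name is hypothetical, and SUBSCRIPTION_ID, RESOURCE_GROUP, and AZURE_STORAGE_ACCOUNT_NAME are placeholders for your own values. You could create such a role with the az role definition create command, or grant the permission through an existing role that already includes it.

    {
      "Name": "BigQuery Omni Export Writer",
      "IsCustom": true,
      "Description": "Illustrative custom role with the container write permission required for export.",
      "Actions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/write"
      ],
      "NotActions": [],
      "AssignableScopes": [
        "/subscriptions/SUBSCRIPTION_ID/resourceGroups/RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/AZURE_STORAGE_ACCOUNT_NAME"
      ]
    }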
Export query results

BigQuery Omni writes to the specified Azure location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Azure Storage container.

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the Query editor field, enter a Google Standard SQL export query:

    EXPORT DATA WITH CONNECTION CONNECTION_REGION.CONNECTION_NAME
    OPTIONS(
      uri="azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
      format="FORMAT"
    )
    AS QUERY
    

    Replace the following:

    • CONNECTION_REGION: the region where the connection was created.
    • CONNECTION_NAME: the connection name that you created with the necessary permission to write to the container.
    • AZURE_STORAGE_ACCOUNT_NAME: the name of the Azure Storage account to which you want to write the query result.
    • CONTAINER_NAME: the name of the container to which you want to write the query result.
    • FILE_PATH: the path to which you want to write the exported files. It must contain exactly one wildcard * anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then * in the first file's filename is replaced by 000000000000, and * in the second file's filename is replaced by 000000000001.
    • FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
    • QUERY: the query to analyze the data that is stored in your BigLake table. For a filled-in example, see the sketch after this list.

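For illustration, here is a minimal filled-in sketch of the statement above. All names are placeholders: it assumes a connection named my-azure-connection in the azure-eastus2 region, a storage account named exportaccount with a container named exportdata, and a BigLake table mydataset.sales; substitute your own values.

    -- Sketch only: connection, storage account, container, path, and table are placeholders.
    EXPORT DATA WITH CONNECTION `azure-eastus2.my-azure-connection`
    OPTIONS(
      -- The wildcard expands to 000000000000, 000000000001, ... for each exported file.
      uri="azure://exportaccount.blob.core.windows.net/exportdata/sales/*",
      format="PARQUET"
    )
    AS SELECT order_id, order_date, total_amount
    FROM mydataset.sales
    WHERE order_date >= "2024-01-01";

Pointing the wildcard at a dedicated, empty prefix such as sales/ keeps the numbered output files from mixing with data from earlier exports.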
Troubleshooting

If you get an error related to quota failure, check whether you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
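As a starting point, you could list the reservations in the administration project for the region that you export from. This is a hedged sketch: it assumes that you run it in the reservation administration project and that region-REGION is replaced with your region qualifier (for example, region-azure-eastus2); availability of the view can depend on your setup.

    -- Sketch: list reservations visible in this administration project for one region.
    -- Replace region-REGION with your region qualifier.
    SELECT *
    FROM `region-REGION`.INFORMATION_SCHEMA.RESERVATIONS;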

What's next