Dataset locations
This page explains the concept of data location and the different locations where you can create datasets. To learn how to set the location for your dataset, see Creating datasets.
For information on regional pricing for BigQuery, see the Pricing page.
Key concepts
Location types
BigQuery uses two types of locations:
A region is a specific geographic place, such as London.
A multi-region is a large geographic area, such as the United States, that contains two or more geographic places.
Dataset location
You specify a location for storing your BigQuery data when you create a dataset. After you create the dataset, the location cannot be changed, but you can copy the dataset to a different location, or manually move (recreate) the dataset in a different location.
BigQuery processes queries in the same location as the dataset that contains the tables you're querying.
BigQuery stores your data in the selected location in accordance with the Service Specific Terms.
Supported regions
BigQuery datasets can be stored in the following regions and multi-regions. For more information about regions and zones, see Geography and regions.
Regions
The following table lists the regions in the Americas where BigQuery is available.

| Region description | Region name | Details |
| --- | --- | --- |
| Iowa | us-central1 | |
| Las Vegas | us-west4 | |
| Los Angeles | us-west2 | |
| Montréal | northamerica-northeast1 | |
| Northern Virginia | us-east4 | |
| Oregon | us-west1 | |
| Salt Lake City | us-west3 | |
| São Paulo | southamerica-east1 | |
| Santiago | southamerica-west1 | |
| South Carolina | us-east1 | |
| Toronto | northamerica-northeast2 | |
The following table lists the regions in Asia Pacific where BigQuery is available.

| Region description | Region name | Details |
| --- | --- | --- |
| Delhi | asia-south2 | |
| Hong Kong | asia-east2 | |
| Jakarta | asia-southeast2 | |
| Melbourne | australia-southeast2 | |
| Mumbai | asia-south1 | |
| Osaka | asia-northeast2 | |
| Seoul | asia-northeast3 | |
| Singapore | asia-southeast1 | |
| Sydney | australia-southeast1 | |
| Taiwan | asia-east1 | |
| Tokyo | asia-northeast1 | |
The following table lists the regions in Europe where BigQuery is available.

| Region description | Region name | Details |
| --- | --- | --- |
| Belgium | europe-west1 | |
| Finland | europe-north1 | |
| Frankfurt | europe-west3 | |
| London | europe-west2 | |
| Netherlands | europe-west4 | |
| Warsaw | europe-central2 | |
| Zürich | europe-west6 | |
Multi-regions
The following table lists the multi-regions where BigQuery is available.

| Multi-region description | Multi-region name |
| --- | --- |
| Data centers within member states of the European Union¹ | EU |
| Data centers in the United States | US |

¹ Data located in the EU multi-region is not stored in the europe-west2 (London) or europe-west6 (Zürich) data centers.
Specify locations
When loading data, querying data, or exporting data, BigQuery determines the location to run the job based on the datasets referenced in the request. For example, if a query references a table in a dataset stored in the `asia-northeast1` region, the query job runs in that region. If a query does not reference any tables or other resources contained within datasets, and no destination table is provided, the query job runs in the `US` multi-region.

If the project has a flat-rate reservation in a region other than the US and the query does not reference any tables or other resources contained within datasets, then you must explicitly specify the location of the flat-rate reservation when submitting the job.
You can specify the location to run a job explicitly in the following ways:
- When you query data using the console, click More > Query settings, and for Processing Location, click Auto-select and choose your data's location.
- When you use the `bq` command-line tool, supply the `--location` global flag and set the value to your location.
- When you use the API, specify your region in the `location` property in the `jobReference` section of the job resource.
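As a sketch of the API approach, the job resource passed to the REST API's `jobs.insert` method can carry the processing location in `jobReference.location`. The project ID and query below are placeholders, not values from this page:

```python
# Sketch of a query job resource for the BigQuery REST API, with the
# processing location pinned in jobReference.location. The project ID
# and query are placeholders.
job_resource = {
    "jobReference": {
        "projectId": "my-project",      # placeholder project ID
        "location": "asia-northeast1",  # run the job in the Tokyo region
    },
    "configuration": {
        "query": {
            "query": "SELECT 1",
            "useLegacySql": False,
        }
    },
}

print(job_resource["jobReference"]["location"])  # asia-northeast1
```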
BigQuery returns an error if the specified location does not match the location of the datasets in the request. The location of every dataset involved in the request, including those read from and those written to, must match the location of the job as inferred or specified.
Single-region locations do not match multi-region locations, even when the single-region location is associated with the multi-region location. Therefore, a job always fails if the set of associated locations includes both a single-region location and a multi-region location. For example, if a job's location is set to `US`, the job fails if it references a dataset in `us-central1`. Likewise, a job that references one dataset in `US` and another dataset in `us-central1` fails.
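The matching rule amounts to exact string equality between the job location and every dataset location. A hypothetical helper (not part of any BigQuery API) that applies it:

```python
# Hypothetical helper illustrating the location-matching rule: the job
# location must equal every dataset location exactly. A multi-region
# such as "US" never matches a region it contains, such as "us-central1".
def job_can_run(job_location: str, dataset_locations: list[str]) -> bool:
    return all(loc == job_location for loc in dataset_locations)

print(job_can_run("US", ["US"]))           # True
print(job_can_run("US", ["us-central1"]))  # False: region vs. multi-region
print(job_can_run("asia-northeast1",
                  ["asia-northeast1", "us-central1"]))  # False: mixed locations
```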
Location considerations
When you choose a location for your data, consider the following:
- Colocate your BigQuery dataset when using external data sources.
- Cloud Storage: When you query data in Cloud Storage through a BigQuery external table, the data you query must be colocated with your BigQuery dataset. For example:
- Single region: If your BigQuery dataset is in the Warsaw (`europe-central2`) region, the corresponding Cloud Storage bucket must also be in the Warsaw region because there is currently no Cloud Storage dual-region that includes Warsaw.
- Dual-region: If your BigQuery dataset is in the Tokyo (`asia-northeast1`) region, the corresponding Cloud Storage bucket must be a bucket in the Tokyo region or in the `ASIA1` dual-region (which includes Tokyo).
- Multi-region: Because external query performance depends on minimal latency and optimal network bandwidth, using multi-region dataset locations with multi-region Cloud Storage buckets is not recommended for external tables.
- Cloud Bigtable: When you query data in Cloud Bigtable through a BigQuery external table, your Cloud Bigtable instance must be in the same location as your BigQuery dataset.
- Single region: If your BigQuery dataset is in the Belgium (`europe-west1`) regional location, the corresponding Cloud Bigtable instance must be in the Belgium region.
- Multi-region: Because external query performance depends on minimal latency and optimal network bandwidth, using multi-region dataset locations is not recommended for external tables on Cloud Bigtable.
- Google Drive: Location considerations do not apply to Google Drive external data sources.
- Colocate your BigQuery dataset with your analysis tools.
- Dataproc: When you query BigQuery datasets using a BigQuery connector, your BigQuery dataset should be colocated with your Dataproc cluster. Dataproc is supported in all Compute Engine locations.
- Vertex AI Workbench: When you query BigQuery datasets using Jupyter notebooks in Vertex AI Workbench, your BigQuery dataset should be colocated with your Vertex AI notebook instance. View the supported Vertex AI Workbench locations.
- Colocate your Cloud Storage buckets for loading data.
- If your BigQuery dataset is in a multi-region, the Cloud Storage bucket containing the data you're loading must be in the same multi-region or in a location that is contained within the multi-region. For example, if your BigQuery dataset is in the `EU` multi-region, the Cloud Storage bucket can be located in the `europe-west1` Belgium region, which is within the EU.
- If your dataset is in a region, your Cloud Storage bucket must be in the same region. For example, if your dataset is in the `asia-northeast1` Tokyo region, your Cloud Storage bucket cannot be in the `ASIA` multi-region.
- Exception: If your dataset is in the `US` multi-region, you can load data from a Cloud Storage bucket in any location.
- Colocate your Cloud Storage buckets for exporting data.
- If your BigQuery dataset is in a multi-region, the Cloud Storage bucket containing the data you're exporting must be in the same multi-region or in a location that is contained within the multi-region. For example, if your BigQuery dataset is in the `EU` multi-region, the Cloud Storage bucket can be located in the `europe-west1` Belgium region, which is within the EU.
- If your dataset is in a region, your Cloud Storage bucket must be in the same region. For example, if your dataset is in the `asia-northeast1` Tokyo region, your Cloud Storage bucket cannot be in the `ASIA` multi-region.
- Exception: If your dataset is in the `US` multi-region, you can export data into a Cloud Storage bucket in any location.
- Develop a data management plan.
- If you choose a regional storage resource such as a BigQuery dataset or a Cloud Storage bucket, develop a plan for geographically managing your data.
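The load and export colocation rules above can be sketched as a small check. The multi-region membership map below is illustrative only, drawn from the region tables on this page, and is not a complete listing:

```python
# Sketch of the Cloud Storage colocation rules for loading and exporting,
# assuming an illustrative (incomplete) map of which regions each
# multi-region contains.
CONTAINED_REGIONS = {
    "EU": {"europe-west1", "europe-north1", "europe-west3",
           "europe-west4", "europe-central2"},
    "US": {"us-central1", "us-east1", "us-east4",
           "us-west1", "us-west2", "us-west3", "us-west4"},
}

def bucket_colocated(dataset_location: str, bucket_location: str) -> bool:
    if dataset_location == "US":
        # Exception: a US multi-region dataset can load from or export to
        # a bucket in any location.
        return True
    if dataset_location in CONTAINED_REGIONS:
        # Multi-region dataset: the bucket must be in the same multi-region
        # or in a region contained within it.
        return (bucket_location == dataset_location
                or bucket_location in CONTAINED_REGIONS[dataset_location])
    # Regional dataset: the bucket must be in the same region.
    return bucket_location == dataset_location

print(bucket_colocated("EU", "europe-west1"))       # True: within the EU
print(bucket_colocated("asia-northeast1", "ASIA"))  # False: region vs. multi-region
print(bucket_colocated("US", "asia-east1"))         # True: US exception
```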
For more information on Cloud Storage locations, see Bucket locations in the Cloud Storage documentation.
Restrict locations
You can restrict the locations in which your datasets can be created by using the Organization Policy Service. For more information, see Restricting resource locations and Resource locations supported services.
Dataset security
To control access to datasets in BigQuery, see Controlling access to datasets. For information about data encryption, see Encryption at rest.
Next steps
- Learn how to create datasets.
- Learn about loading data into BigQuery.
- Learn about BigQuery pricing.
- View all the Google Cloud services available in locations worldwide.
- Explore additional location-based concepts, such as zones, that apply to other Google Cloud services.