Pricing

BigQuery offers scalable, flexible pricing options to help fit your project and budget. BigQuery charges for data storage, streaming inserts, and for querying data, but loading and exporting data are free of charge.

BigQuery provides cost control mechanisms that enable you to cap your daily costs to an amount that you choose. For more information, see Cost Controls.

When charging in local currency, Google will convert the prices listed into applicable local currency pursuant to the conversion rates published by leading financial institutions.

The following table summarizes BigQuery pricing. BigQuery's quota policy applies for these operations.

Action | Cost | Notes
Storage | $0.02 per GB, per month | See Storage pricing.
Long Term Storage | $0.01 per GB, per month | See Long term storage pricing.
Streaming Inserts | $0.01 per 200 MB | See Storage pricing.
Queries | $5 per TB | First 1 TB per month is free, subject to query pricing details.
Loading data | Free | See Loading data into BigQuery.
Copying data | Free | See Copying an existing table.
Exporting data | Free | See Exporting data from BigQuery.
Metadata operations | Free | List, get, patch, update and delete calls.

Free operations

The following table shows BigQuery operations that are free of charge. BigQuery's quota policy applies for these operations.

Action | Examples
Loading data | Loading data into BigQuery
Copying data | Copying an existing table
Exporting data | Exporting data from BigQuery
Metadata operations | List, get, patch, update and delete calls

Storage pricing

Loading data into BigQuery is free, with the exception of a small charge for streamed data. Storage pricing is based on the amount of data stored in your tables, which we calculate based on the types of data you store. For a detailed explanation of how we calculate your data size, see data size calculation.

Storage action | Cost
Storage | $0.02 per GB, per month.
Streaming Inserts | $0.01 per 200 MB, with individual rows calculated using a 1 KB minimum size.
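As a rough, unofficial illustration of the 1 KB per-row minimum, the following Python sketch estimates a streaming insert charge for a few hypothetical row sizes (the row sizes, and the use of binary megabytes, are assumptions made for the example):

# Sketch of the streaming insert charge, assuming $0.01 per 200 MB and a
# 1 KB minimum per row, and treating 1 MB as 2**20 bytes for this example.
ROW_MINIMUM_BYTES = 1024
PRICE_PER_200_MB = 0.01  # dollars

row_sizes_bytes = [300, 2048, 750]  # hypothetical row sizes
billable_bytes = sum(max(size, ROW_MINIMUM_BYTES) for size in row_sizes_bytes)
cost = billable_bytes / (200 * 1024 ** 2) * PRICE_PER_200_MB
print(cost)  # a tiny fraction of a cent for three small rows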

Storage pricing is prorated per MB, per second. For example, if you store:

  • 100 MB for half a month, you pay $0.001 (a tenth of a cent)
  • 500 GB for half a month, you pay $5
  • 1 TB for a full month, you pay $20
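The arithmetic behind these examples can be sketched in Python as follows (an unofficial illustration that assumes decimal units, 1 TB = 1,000 GB and 1 GB = 1,000 MB, which reproduces the rounded figures above):

# Minimal sketch of prorated storage cost at $0.02 per GB per month.
STORAGE_PRICE_PER_GB_MONTH = 0.02

def prorated_storage_cost(size_gb, fraction_of_month):
    """Cost of storing size_gb gigabytes for a fraction of a month."""
    return size_gb * STORAGE_PRICE_PER_GB_MONTH * fraction_of_month

print(prorated_storage_cost(0.1, 0.5))   # 100 MB for half a month -> 0.001
print(prorated_storage_cost(500, 0.5))   # 500 GB for half a month -> 5.0
print(prorated_storage_cost(1000, 1.0))  # 1 TB for a full month   -> 20.0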

Long term storage pricing

If a table is not edited for 90 consecutive days, the price of storage for that table automatically drops by 50 percent to $0.01 per GB per month.

There is no degradation of performance, durability, availability, or any other functionality when a table is considered long term storage.

If the table is edited, the price reverts back to the regular storage pricing of $0.02 per GB per month, and the 90-day timer starts counting from zero.

Any action that modifies the data in a table, such as a load job, a copy job that writes to the table, or a query job that writes to it as a destination table, resets the timer:

Action | Notes
append | Any job that has a destination table and uses write disposition of WRITE_APPEND.
overwrite | Any job that has a destination table and uses write disposition of WRITE_TRUNCATE.
streaming | Ingesting data using the Tabledata.insertAll() API call.

All other actions do NOT reset the timer, including:

  • Query from
  • Create view
  • Export
  • Copy from
  • Patch

For tables that reach the 90-day threshold during a billing cycle, the price is prorated accordingly.
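As a rough, unofficial sketch of that proration, the blended charge for a month in which a table crosses the 90-day threshold can be computed by weighting each rate by the days it applies (the table size and day counts below are hypothetical):

# Sketch: a 1,000 GB table crosses the 90-day threshold 10 days into a
# 30-day billing cycle, so 10 days bill at $0.02/GB/month and the remaining
# 20 days bill at the long term rate of $0.01/GB/month.
SIZE_GB = 1000
ACTIVE_RATE = 0.02     # dollars per GB per month
LONG_TERM_RATE = 0.01  # dollars per GB per month
DAYS_IN_CYCLE = 30
DAYS_AT_ACTIVE_RATE = 10

cost = SIZE_GB * (
    ACTIVE_RATE * DAYS_AT_ACTIVE_RATE / DAYS_IN_CYCLE
    + LONG_TERM_RATE * (DAYS_IN_CYCLE - DAYS_AT_ACTIVE_RATE) / DAYS_IN_CYCLE
)
print(cost)  # about 13.33 dollars for that cycle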

Long term storage pricing applies only to BigQuery storage, not to federated data sources.

Query pricing

Query pricing refers to the cost of running your SQL commands and user-defined functions. BigQuery charges for queries by using one metric: the number of bytes processed.

Free Tier Pricing

The first 1 TB of data processed per month is free of charge (per billing account).

Query Pricing Details

Beyond your first 1 TB of data processed in a month, you are charged as follows:

Resource | Pricing
Queries | $5 per TB

  • You aren't charged for queries that return an error, or for cached queries.
  • Charges are rounded to the nearest MB, with a minimum of 10 MB of data processed per table referenced by the query, and a minimum of 10 MB of data processed per query (see the sketch after this list).
  • Cancelling a running query job may incur charges up to the full cost of the query as if it had been allowed to run to completion.
  • BigQuery uses a columnar data structure. You're charged according to the total data processed in the columns you select, and the total data per column is calculated based on the types of data in the column. For more information about how we calculate your data size, see data size calculation.
  • You can opt in to a higher pricing tier by enabling high-compute queries, which is necessary for fewer than 1% of queries. For more information, see High-Compute queries.
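The following Python sketch illustrates these rounding rules under the stated rates ($5 per TB, first 1 TB per month free, nearest-MB rounding, 10 MB minimums). It assumes decimal units and that the per-table minimums are summed before the per-query minimum is applied; treat it as an illustration, not an official billing calculator:

# Unofficial sketch of on-demand query charges.
PRICE_PER_TB = 5.0
FREE_TB_PER_MONTH = 1.0
MB = 10 ** 6
TB = 10 ** 12

def billable_mb(bytes_per_table):
    """Round each referenced table's bytes to the nearest MB with a 10 MB
    floor per table, then apply a 10 MB floor to the query as a whole."""
    per_table = [max(round(b / MB), 10) for b in bytes_per_table]
    return max(sum(per_table), 10)

def query_cost(bytes_per_table, free_tb_remaining=FREE_TB_PER_MONTH):
    tb = billable_mb(bytes_per_table) * MB / TB
    return max(tb - free_tb_remaining, 0) * PRICE_PER_TB

# A query referencing two tables (2.5 TB and 3 MB processed), with the
# monthly free tier already used up:
print(query_cost([2.5 * TB, 3 * MB], free_tb_remaining=0))  # 12.50005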

High-Compute queries

High-Compute queries are queries that consume extraordinarily large computing resources relative to the number of bytes processed. Typically, such queries contain a very large number of JOIN or CROSS JOIN clauses or complex user-defined functions. To run these queries, you must explicitly opt in.

How to opt in

If your query is too compute intensive for BigQuery to complete at the standard $5 per TB pricing tier, BigQuery returns a billingTierLimitExceeded error and an estimate of how much the query would cost.

To run the query at a higher pricing tier, pass a new value for maximumBillingTier as part of the query request. The maximumBillingTier is a positive integer that serves as a multiplier of the basic price of $5 per TB. For example, if you set maximumBillingTier to 2, the maximum cost for that query will be $10 per TB.

API
Use the BigQuery API to run a query at a higher pricing tier by setting a value for maximumBillingTier. For example, to run a query with a maximum billing tier of 2, specify the following in your job configuration:
"query": {
  "query": "select count(*) from publicdata:samples.shakespeare",
  ...
  "maximumBillingTier": "2"
},
Command-line
You can run a query at a higher tier with the BigQuery command-line tool by using the --maximum_billing_tier option to pass an integer value with the bq query command. For example, the following command runs a query with a maximum billing tier of 2:
bq query --maximum_billing_tier 2 'select count(*) from publicdata:samples.shakespeare'
Web UI

You can't set a specific maximum billing tier with the BigQuery web UI, but you can disable the maximum billing tier so that a query can execute at an unlimited billing tier. To disable tier limits for a particular query in the web UI:

  1. Open the BigQuery web UI.
  2. Click COMPOSE QUERY.
  3. Click Show Options.
  4. Find Billing Tier and check Allow unlimited.

The Allow unlimited setting lasts only for the duration of the current COMPOSE QUERY session. For example, if you close the query editor and click COMPOSE QUERY to re-open the query editor, Allow unlimited will be unchecked.

Exercise caution when setting maximumBillingTier because there is no upper limit to the value that you can set. Keeping the value low helps control costs: highly compute-intensive queries fail with an error instead of completing at a higher price. To further help you control your costs, BigQuery offers custom quotas, which can help mitigate the risk of runaway expenses. For more information, see Cost Controls.

Alternatively, you can request a project-wide default value by submitting the BigQuery High-Compute queries form. You can also use the form to disable the per-query override for your project.

Timing

All projects created before January 1, 2016 can use High-Compute queries at no extra cost until January 1, 2017. For these projects, all queries are charged $5 per TB regardless of the billing tier. On January 1, 2017, these projects revert to a maximumBillingTier value of 1 unless you have changed the value by using the BigQuery High-Compute queries form. If you have a custom value in place, that setting remains in effect at the prevailing prices. Project owners will be reminded about this change prior to January 1, 2017.

All projects created on January 1, 2016 or later start with a maximumBillingTier value of 1, but you can opt in to a higher tier at any time by setting maximumBillingTier for a single query or by submitting the BigQuery High-Compute queries form to set a project-wide default.

Data size calculation

When you load data into BigQuery or query the data, you're charged according to the data size. We calculate your data size based on the size of each data type.

Data type | Size
STRING | 2 bytes + the UTF-8 encoded string size
INTEGER | 8 bytes
FLOAT | 8 bytes
BOOLEAN | 1 byte
TIMESTAMP | 8 bytes
RECORD | 0 bytes + the size of the contained fields

Null values for any data type are calculated as 0 bytes. Repeated fields are calculated per entry. For example, a repeated INTEGER with 4 entries counts as 32 bytes.
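As an unofficial illustration, the following Python sketch applies these per-type sizes to a hypothetical row (the field names and values are made up for the example):

# Sketch of the data size rules above: NULL values count as 0 bytes and
# repeated fields are counted once per entry.
def string_size(value):
    return 0 if value is None else 2 + len(value.encode("utf-8"))

def integer_size(value):
    return 0 if value is None else 8

# Hypothetical row: a STRING, an INTEGER, and a repeated INTEGER field.
name = "hamlet"        # 2 + 6 = 8 bytes
user_id = 42           # 8 bytes
scores = [1, 2, 3, 4]  # 4 entries * 8 bytes = 32 bytes

row_bytes = string_size(name) + integer_size(user_id) + sum(integer_size(s) for s in scores)
print(row_bytes)  # 48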

Sample query costs

When you run a query, you're charged according to the total data processed in the columns you select, even if you set an explicit LIMIT on the results. The total bytes per column is calculated based on the types of data in the column. For more information about how we calculate your data size, see data size calculation.

The following examples show several sample queries and describe how many bytes are processed for each query.

Sample query:

SELECT
  corpus,
  word
FROM
  publicdata:samples.shakespeare
LIMIT 1;

Bytes processed: total size of the corpus and word columns.

Sample query:

SELECT
  corpus
FROM
  (SELECT
     *
   FROM
     publicdata:samples.shakespeare);

Bytes processed: total size of the corpus column.

Sample query:

SELECT
  COUNT(*)
FROM
  publicdata:samples.shakespeare;

Bytes processed: none.

Sample query:

SELECT
  COUNT(corpus)
FROM
  publicdata:samples.shakespeare;

Bytes processed: total size of the corpus column.

Sample query:

SELECT
  COUNT(*)
FROM
  publicdata:samples.shakespeare
WHERE
  corpus = 'hamlet';

Bytes processed: total size of the corpus column.

Sample query:

SELECT
  shakes.corpus,
  wiki.language
FROM
  publicdata:samples.shakespeare AS shakes
  JOIN EACH
  publicdata:samples.wikipedia AS wiki
  ON shakes.corpus = wiki.title;

Bytes processed: total size of the shakes.corpus, wiki.language and wiki.title columns.
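If you want to check the bytes-processed figure for a query before running it, the BigQuery API supports a dry run that reports bytes processed without executing the query or incurring charges. The following Python sketch uses the google-cloud-bigquery client library, which is not covered on this page; treat the library calls and the legacy SQL setting as assumptions made for the example:

# Unofficial sketch: estimate bytes processed with a dry run using the
# google-cloud-bigquery client library. A dry run validates the query and
# reports statistics without running it.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job_config.use_legacy_sql = True  # the sample queries above use legacy SQL syntax

query_job = client.query(
    "SELECT corpus, word FROM publicdata:samples.shakespeare LIMIT 1",
    job_config=job_config,
)
print("Bytes that would be processed:", query_job.total_bytes_processed)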
