Google BigQuery

Pricing

BigQuery offers scalable, flexible pricing options to fit your project and budget. BigQuery charges for data storage and for querying data, but loading and exporting data are free of charge. For querying data, we offer two pricing options: on-demand for a pay-as-you-go model, or reserved capacity for larger, more consistent workloads.

When charging in local currency, Google will convert the prices listed into applicable local currency pursuant to the conversion rates published by leading financial institutions.

Free operations

The following table shows BigQuery operations that are free of charge. BigQuery's quota policy applies for these operations.

Action          | Examples
Loading data    | Loading data into BigQuery
Exporting data  | Exporting data from BigQuery
Table reads     | Browsing through table data
Table copies    | Copying an existing table

Storage pricing

Loading data into BigQuery is free, with the exception of a small charge for streamed data. Storage pricing is based on the amount of data stored in your tables, which we calculate based on the types of data you store. For a detailed explanation of how we calculate your data size, see data size calculation.

Storage pricing is prorated per MB, per second. For example, if you store 1 TB for half a month, you pay $10.

Storage action     | Cost
Storage            | $0.020 per GB, per month
Streaming inserts  | $0.01 per 100,000 rows
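The proration above can be sketched as a short calculation. This is an illustrative sketch, not billing code: it assumes the $0.020 per GB, per month rate from the table and decimal units (1 TB = 1,000 GB), and expresses the storage period as a fraction of a month rather than per-second proration.

```python
# Illustrative sketch of prorated storage cost (assumptions: decimal
# units, period expressed as a fraction of a month).
GB_PER_TB = 1000            # assumed decimal units
RATE_PER_GB_MONTH = 0.020   # storage rate from the table above

def storage_cost(gb_stored: float, fraction_of_month: float) -> float:
    """Prorated monthly storage cost in USD."""
    return gb_stored * RATE_PER_GB_MONTH * fraction_of_month

# 1 TB stored for half a month:
print(storage_cost(1 * GB_PER_TB, 0.5))  # → 10.0
```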

Query pricing

BigQuery offers two pricing options for queries: on-demand, and reserved capacity. All queries are subject to BigQuery's quota policy.

On-demand pricing

On-demand queries run on a shared pool of resources, unlike queries under reserved capacity pricing. The first 1 TB of data processed each month is free of charge. You aren't charged for queries that return an error, or for queries served from the cache.

BigQuery uses a columnar data structure. You're charged according to the total data processed in the columns you select, and the total data per column is calculated based on the types of data in the column. For more information about how we calculate your data size, see data size calculation.

Charges are rounded to the nearest MB, with a minimum 10 MB data processed per table referenced by the query.

Resource | Pricing
Queries  | $5 per TB
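The rounding rules above can be sketched as follows. This is a hypothetical illustration, assuming decimal units (1 MB = 10^6 bytes, 1 TB = 10^12 bytes) and ignoring the free 1 TB monthly tier for simplicity: each referenced table's bytes are rounded to the nearest MB with a 10 MB minimum, then billed at $5 per TB.

```python
# Hypothetical sketch of on-demand query billing (assumptions: decimal
# units; free tier ignored).
MB = 1000 ** 2
TB = 1000 ** 4
PRICE_PER_TB = 5.0  # on-demand rate from the table above

def billed_mb(bytes_per_table):
    """Round each referenced table's bytes to the nearest MB, 10 MB minimum."""
    return sum(max(10, round(b / MB)) for b in bytes_per_table)

def query_cost(bytes_per_table):
    """Cost in USD for the billed megabytes at $5/TB."""
    return billed_mb(bytes_per_table) * MB / TB * PRICE_PER_TB

# A query touching 3 MB in one table and 500 MB in another is billed
# as 10 MB + 500 MB = 510 MB.
print(billed_mb([3 * MB, 500 * MB]))  # → 510
```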

Reserved capacity pricing

For larger, more consistent workloads, reserved capacity pricing can save as much as 70% off on-demand pricing. Reserved capacity pricing gives you the ability to reserve a certain amount of throughput each month for your queries for a set cost.

Throughput       | Pricing
5 GB per second  | $20,000 per month

You can purchase one or more increments for each reservation, up to 50 GB per second.
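The increment rule above can be sketched as a small calculation, assuming 5 GB per second increments at $20,000 per month each, capped at 50 GB per second (i.e., ten increments). This is an illustration of the arithmetic, not a purchasing API.

```python
# Illustrative sketch of reserved capacity cost (assumptions: 5 GB/s
# increments, $20,000/month each, 50 GB/s cap from the text above).
INCREMENT_GBPS = 5
INCREMENT_PRICE = 20_000
MAX_GBPS = 50

def reserved_monthly_cost(throughput_gbps: int) -> int:
    """Monthly cost in USD for a reservation of the given throughput."""
    if not (0 < throughput_gbps <= MAX_GBPS) or throughput_gbps % INCREMENT_GBPS:
        raise ValueError("throughput must be a multiple of 5 GB/s, up to 50 GB/s")
    return (throughput_gbps // INCREMENT_GBPS) * INCREMENT_PRICE

print(reserved_monthly_cost(15))  # three increments → 60000
```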

To sign up for reserved capacity pricing, contact a sales representative.

Data size calculation

When you load data into BigQuery or query the data, you're charged according to the data size. We calculate your data size based on the size of each data type.

Data type  | Size
STRING     | 2 bytes + the UTF-8 encoded string size
INTEGER    | 8 bytes
FLOAT      | 8 bytes
BOOLEAN    | 1 byte
TIMESTAMP  | 8 bytes
RECORD     | 0 bytes + the size of the contained fields

Null values for any data type are calculated as 0 bytes. Repeated fields are calculated per entry. For example, a repeated INTEGER with 4 entries counts as 32 bytes.
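The sizing rules above can be sketched as a small recursive function. This is an illustrative model using the per-type sizes from the table, with NULLs counted as 0 bytes and repeated fields counted per entry; the `RECORD` representation as a list of `(type, value)` pairs is an assumption for illustration.

```python
# Illustrative sketch of BigQuery data size calculation, using the
# per-type sizes from the table above.
FIXED_SIZE = {"INTEGER": 8, "FLOAT": 8, "BOOLEAN": 1, "TIMESTAMP": 8}

def value_size(type_name, value):
    """Billed size in bytes of one value of the given type."""
    if value is None:
        return 0                         # NULLs count as 0 bytes
    if isinstance(value, list) and type_name != "RECORD":
        # Repeated field: calculated per entry.
        return sum(value_size(type_name, v) for v in value)
    if type_name == "STRING":
        return 2 + len(value.encode("utf-8"))
    if type_name == "RECORD":
        # Assumed representation: list of (type, value) pairs.
        return sum(value_size(t, v) for t, v in value)
    return FIXED_SIZE[type_name]

# A repeated INTEGER with 4 entries:
print(value_size("INTEGER", [1, 2, 3, 4]))  # → 32
```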

Sample query costs

When you run a query, you're charged according to the total data processed in the columns you select, even if you set an explicit LIMIT on the results. The total bytes per column are calculated based on the types of data in the column. For more information about how we calculate your data size, see data size calculation.

The following table shows several sample queries and a description of how many bytes are processed for each query.

Sample query | Bytes processed
SELECT corpus, word FROM publicdata:samples.shakespeare LIMIT 1 | Total size of the corpus and word columns
SELECT corpus FROM (SELECT * FROM publicdata:samples.shakespeare) | Total size of the corpus column
SELECT COUNT(*) FROM publicdata:samples.shakespeare | No bytes processed
SELECT COUNT(corpus) FROM publicdata:samples.shakespeare | Total size of the corpus column
SELECT COUNT(*) FROM publicdata:samples.shakespeare WHERE corpus = 'hamlet' | Total size of the corpus column
SELECT shakes.corpus, wiki.language FROM publicdata:samples.shakespeare AS shakes JOIN publicdata:samples.wikipedia AS wiki ON shakes.corpus = wiki.title | Total size of the shakes.corpus, wiki.language, and wiki.title columns
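The pattern in these samples is that billing follows every column a query reads, whether it appears in the SELECT list, a WHERE clause, or a JOIN condition. A minimal sketch of that rule, using made-up column sizes (the byte counts below are assumptions for illustration, not the real sizes of the sample tables):

```python
# Hypothetical sketch: bytes processed is the total stored size of every
# column a query touches. Column sizes below are made up for illustration.
column_bytes = {"corpus": 1_000_000, "word": 2_500_000}

def bytes_processed(columns_touched):
    """Sum the full stored size of each column the query reads."""
    return sum(column_bytes[c] for c in columns_touched)

# SELECT corpus, word ... LIMIT 1 still scans both full columns:
print(bytes_processed(["corpus", "word"]))  # → 3500000
# SELECT COUNT(*) touches no columns:
print(bytes_processed([]))                  # → 0
```

Note that a LIMIT clause does not reduce the bytes billed: the full column is scanned regardless of how many rows are returned.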